Since 2015, I have been using Eric Mazur’s Peer Instruction method to increase the amount of time that students spend talking about physics in my classes. In Peer Instruction, students answer conceptual questions individually and show their responses. Instruction then proceeds depending on the percentage of students with a correct response (Figure 1). If more than 80% of students have the correct answer, the teacher or one of the correct respondents explains the answer and instruction progresses. If fewer than 20% of the class have the correct answer, the teacher steps in and offers additional information to help clarify the theory behind the question. Peer Instruction becomes most useful when the percentage of correct responses lies between 20% and 80% – students discuss their answers and try to explain their choice to their peers. These end up being some of the best moments of student engagement in my classes, with conversations sometimes becoming more heated than one would expect when discussing conceptual physics questions!
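For readers who like to see the decision flow spelled out, the branching above can be sketched in a few lines of Python. This is just an illustration of the 20%/80% guidelines described here – the function name and the wording of the returned actions are mine, not part of Mazur's method:

```python
def next_step(correct_fraction):
    """Suggest the next instructional move in a Peer Instruction cycle,
    given the fraction of students (0.0 to 1.0) who answered correctly.
    Thresholds follow the 20%/80% guidelines; labels are illustrative."""
    if correct_fraction > 0.80:
        # Most students got it: explain briefly and move on.
        return "explain briefly and move on"
    elif correct_fraction < 0.20:
        # Very few got it: the teacher reteaches the underlying concept.
        return "reteach the underlying concept, then re-poll"
    else:
        # The productive middle ground: students convince each other.
        return "peer discussion, then re-poll"
```

In practice, of course, the thresholds are soft guidelines rather than hard cut-offs, and the judgment call stays with the teacher.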
In the Peer Instruction model, conceptual multiple-choice questions are referred to as ConcepTests. There is a bank of these questions in Eric Mazur’s original book, but I usually find questions through a Google search of “whatever-topic-I’m-looking-for ConcepTests” – usually ending up on the ones produced by Giancoli many years ago. I also use questions from old provincial exams and test banks – basically wherever I can find them.
When teaching in an in-person environment, I have students respond to questions by holding up individual 4” × 6” whiteboards. These are easy to make, and distributing them causes little disruption in class. More advanced systems accomplish the same thing (e.g. iClickers) but are obviously more costly, since each student needs a remote. A middle ground between these options is Plickers, where students hold up a four-sided QR code and the teacher records responses using a smartphone, with statistics compiled in real time.
Since adding this to my instruction, students have performed better on the conceptual parts of tests and generally seem to better understand the physics behind their calculations. It has also provided a nice change of pace from typical numerical work and lab activities. One of my concerns when I first started using Peer Instruction was that the class time it consumed would take away from students’ problem-solving skills. After some data collection and analysis, it turned out that there were no negative effects on problem solving, and there were substantial gains in conceptual understanding.
When COVID-19 forced the switch to remote teaching, I prioritised problem-solving-heavy instruction, primarily through videos and problem sets that my students could complete asynchronously. I soon found that I was missing the live interaction with students. Since we had to hold a live weekly check-in, I decided to use that time for a remote version of Peer Instruction.
A key part of making this work remotely was being able to extend my desktop onto a second screen so that I could monitor student responses while presenting and discussing the questions. Students responded in whatever way made sense to them (see Figure 2): some wrote ABCD on a piece of paper and moved it so that their chosen letter was visible; others used Snapchat to write a letter on their phone and hold it up to their camera; and those who didn’t want to turn on their camera wrote answers in the chat.
I asked students not to respond until I prompted them, and they then displayed their answers in whatever form they had chosen. In face-to-face instruction, I would ask them to discuss with their neighbour when the percentage of correct responses fell in the appropriate range, but that wasn’t easily accomplished in an online setting. Instead, for every question, I asked for volunteers to explain their chosen answers and tried to draw a discussion out of the students about the content. We rarely got to the point of re-polling questions, as re-polling didn’t fit the flow of the remote format.
Students were also glad that we returned to using Peer Instruction during remote teaching: 34 of 36 reported in their end-of-term survey that the remote Peer Instruction sessions helped increase their understanding. One student reported that these blocks were “very effective, and it established the same one-on-one feel that we have in class if you ask questions”. Another stated that they were happy to have the Peer Instruction blocks because “it was interactive and really reinforced aspects of the video lessons”. One student did respond that they missed the “turn to your partner” aspect of Peer Instruction, which was something they found very helpful in face-to-face instruction. Others commented that holding answers up to the screen meant that they could see others’ answers, which in turn influenced their own. This effectively removed the independent first response that is an important part of Peer Instruction.
The two student concerns above are important drawbacks of using Peer Instruction remotely. The issue of seeing others’ responses can be addressed through software that collects student responses without revealing them. Giving students Plickers cards to use at home and hold up to the screen could accomplish this. Plickers is also beta-testing a remote option where students answer on their personal devices, or via the website, and the data becomes available to the teacher. Other polling software could likely solve this problem as well. Giving students the chance to discuss with each other is a very important part of Peer Instruction, and it could be added to a remote implementation through breakout rooms. This wasn’t an option for me in the spring, but it will be something I use if I end up teaching remotely this fall.
In the end, the use of Peer Instruction in remote teaching accomplished what I hoped it would – a return to focusing on conceptual discussions with students and a change of pace, with our live instruction time used differently from video lessons and practice problems. It wasn’t quite as dynamic as in-person instruction, but it was definitely worthwhile.