
Center for Excellence in Teaching and Learning

Peer Paired Problems: Large-Class Active Learning & Metacognition Activity

Mon, May 15, 2023 at 7:30 AM

Audience response systems (ARSs), such as iClicker (cost involved) and Mentimeter (free), have become quite popular in education, especially in large lecture-format classes (Mayer et al., 2009). While working with my colleague, Dr. Sarah Hosch, on a complete redesign of our BIO 1200 courses, one of our observations was that faculty frequently used ARS-based questions only to target and reinforce the lower Bloom's levels of "remember" and "understand" (Caldwell, 2007). We wished to develop a straightforward (read: simple) process that would allow us, as instructors, to ask more complex questions covering the upper Bloom's levels of "apply," "analyze," and "evaluate," AND we wanted a process that could be used in large classes (50-150+ students) to provide real-time data that the instructor could immediately analyze and share with the class.

The Peer Paired Problems Method

The Peer Paired Problems (PPP) technique was originally piloted in 2 large sections of BIO 1200 Biology I during the fall semesters of 2017 and 2018. For analysis and comparison, our data were compared with 9 other BIO 1200 sections from the same terms (controls) that were not using PPPs and were not being redesigned.

Here is an overview of the PPP process:

Once or twice during the lectures on a chapter, students were presented with a series of 4 questions to answer using a classroom response system. These questions are written to address the upper levels of Bloom's taxonomy (Bloom et al., 1956). We used the iClicker audience response system (Macmillan Learning), embedded the 4 questions directly into our lecture slides, and identified a "good" spot within the class period to conduct a PPP activity.

The first two questions are answered initially by individual students (Individual Questions); the same two questions are then asked again, but this time students answer after consulting and discussing with fellow classmates (Paired Questions).

Individual Questions

1. The PPP question is presented to the class for the first time, for up to 2 minutes. Students must answer the question on their own. Students may or may not be allowed to use their notes, text, lecture slides, Google, etc., to answer the question, depending on the instructor's intent and preferences. All students submit an answer within the 2 minutes.

2. Next, students are presented with a second question that asks them to report their confidence in their answer to the first question (high, medium, or low confidence). All students submit a confidence vote; about 20 seconds is enough time.

Paired Questions

3. Next, students are shown the original question a second time. This time, however, students are instructed to work in pairs or small groups to determine a consensus or best answer. Allow students another 2 minutes to answer.

4. Following the submission of answers from the second attempt, students are again asked to judge their confidence in having answered the question correctly after working as part of a group, again scored as high, medium, or low. The instructor then reveals the answer to the entire class. (A minimal sketch of the data one PPP cycle produces appears below.)
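For instructors who want to log these four steps for later review, here is a minimal sketch, in Python, of how the responses from one PPP cycle might be represented. The names below (PPPResponse and its fields) are hypothetical and are not part of the iClicker software; real ARS exports vary by vendor.

```python
# Hypothetical record of one student's responses across one PPP cycle.
# These names are illustrative only; actual ARS exports differ by vendor.
from dataclasses import dataclass

@dataclass
class PPPResponse:
    student_id: str
    answer_individual: str       # step 1: answer submitted alone
    confidence_individual: str   # step 2: "high", "medium", or "low"
    answer_paired: str           # step 3: answer after group discussion
    confidence_paired: str       # step 4: confidence after group discussion

# One PPP activity: a single content question asked twice, plus two confidence polls.
correct_answer = "C"
cycle = [
    PPPResponse("s01", "B", "low",    "C", "high"),
    PPPResponse("s02", "C", "medium", "C", "high"),
    PPPResponse("s03", "A", "low",    "B", "medium"),
]
```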

Real-time Analysis

By using an ARS that can record and track student data, the instructor can quickly visualize the class data as a bar graph and/or percentages for each question of the PPP and report to the class the number of correct responses and the class's confidence levels.

Most importantly, the instructor can easily show the class the data and comment on any significant "shifts" in the number of students answering correctly between the second, group attempt and the first, individual attempt. Any shifts in confidence between the first and second attempts can likewise be reviewed and discussed. This immediate feedback can be used to direct the material the instructor covers or reviews in class moving forward.
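As a rough illustration of what this real-time summary involves, the sketch below computes the percent correct on each attempt and the shift between them from a small, made-up set of responses. The field names are hypothetical and would need to be matched to whatever your ARS actually exports.

```python
# Made-up responses: one record per student, with individual and paired answers.
responses = [
    {"individual": "B", "paired": "C"},
    {"individual": "C", "paired": "C"},
    {"individual": "A", "paired": "C"},
    {"individual": "C", "paired": "B"},
]
correct = "C"
n = len(responses)

pct_individual = 100 * sum(r["individual"] == correct for r in responses) / n
pct_paired = 100 * sum(r["paired"] == correct for r in responses) / n

print(f"Correct, individual attempt: {pct_individual:.0f}%")
print(f"Correct, paired attempt:     {pct_paired:.0f}%")
print(f"Shift after pairing:         {pct_paired - pct_individual:+.0f} percentage points")
```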

Conclusions

We wished to develop an evidence-based method for asking complex Bloom's-level questions in a format amenable to larger lecture classes. Data from our pilot studies indicated that students improved their performance on questions after pairing, that this improvement carried over to upper-level and critical-thinking questions on exams, and that, using the confidence scores as a measure of student metacognition, 25% more students reported feeling highly confident after pairing and about 20% fewer students reported low confidence after pairing.
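To make the confidence measure concrete, here is a small sketch of how the two confidence polls could be summarized as the share of students at each level before and after pairing. The vote lists are invented for the example; they are not our pilot data.

```python
# Invented confidence votes for one PPP cycle (not actual pilot data).
from collections import Counter

conf_individual = ["low", "low", "medium", "medium", "high", "low"]
conf_paired     = ["medium", "high", "high", "high", "high", "medium"]

def shares(votes):
    """Percentage of students at each confidence level."""
    counts = Counter(votes)
    return {level: 100 * counts.get(level, 0) / len(votes)
            for level in ("high", "medium", "low")}

before, after = shares(conf_individual), shares(conf_paired)
for level in ("high", "medium", "low"):
    print(f"{level:>6}: {before[level]:5.1f}% -> {after[level]:5.1f}% "
          f"({after[level] - before[level]:+.1f} pts)")
```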

The PPPs aim to provide “early and often feedback.” With this information, faculty can encourage student success in several ways. 

First, students will be given feedback on their mastery of content and application of that content in critical thinking exercises IN class. 

Second, instructors can use PPP data to identify students who are struggling and, using emails and Faculty Feedback, can better direct those students to advisers and resources on campus.

Third, students have repeated opportunities to develop their metacognitive skills by being prompted to rate their confidence levels. If students can more accurately gauge their knowledge and skill level, they may be better able to adjust their learning activities to fill the gaps in their knowledge.

A final observation from our BIO 1200 redesign was that, using demographic data from OIRADA (institutional research office), we were able to compare the DFWI rates of underrepresented minority (URM) students in our 2 sections with those of URM students in all of the other, non-redesigned BIO 1200 sections. Interestingly, we found that the DFWI rate for URM students in redesigned sections decreased 11.4 percentage points, from 40.7% (non-redesigned) to 29.3% (redesigned), while the redesign had no effect on the DFWI rates of non-URM students (18.8% in both redesigned and non-redesigned sections of BIO 1200). We are currently following up on this unexpected outcome from our BIO 1200 redesign data and hope to tease out which specific activities or changes (if any) contributed to this improved performance of URM students in BIO 1200.
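For readers who want to run a similar comparison, the sketch below shows one way to compute DFWI rates by section type and URM status from a de-identified roster. The column names and grades are invented for illustration; they are not the OIRADA extract we used.

```python
# Invented de-identified roster; column names are hypothetical.
import pandas as pd

roster = pd.DataFrame({
    "section_type": ["redesigned", "redesigned", "control", "control", "control", "control"],
    "urm":          [True, False, True, True, False, False],
    "final_grade":  ["C", "B", "D", "W", "A", "B"],
})

# A DFWI outcome is a final grade of D, F, W (withdrawal), or I (incomplete).
roster["dfwi"] = roster["final_grade"].isin({"D", "F", "W", "I"})

# DFWI rate (%) for each section type x URM combination.
rates = (roster.groupby(["section_type", "urm"])["dfwi"].mean() * 100).round(1)
print(rates)
```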

References and Resources 

Mayer RE, Stull A, DeLeeuw K, et al. Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemp Educ Psychol. 2009;34(1):51-57.

Caldwell JE. Clickers in the large classroom: Current research and best-practice tips. CBE Life Sciences Education. 2007;6(1):9-20. http://www.ncbi.nlm.nih.gov/pubmed/17339389. doi: 10.1187/cbe.06-12-0205.

Bloom, B. S.; Engelhart, M. D.; Furst, E. J.; Hill, W. H.; Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Vol. Handbook I: Cognitive domain. New York: David McKay Company.

Levy MG, Scott BM. Metacognition: Examining the components of a fuzzy concept. Educational Research eJournal. 2013;2(2):120-131. http://dialnet.unirioja.es/servlet/oaiart?codigo=4459208.

Teaching and Learning Academy, John N. Gardner Institute, February 25, 2017.

Chick, N. (2013). Metacognition. Vanderbilt University Center for Teaching. Retrieved from https://cft.vanderbilt.edu/guides-sub-pages/metacognition/.

Bransford, John D., Brown, Ann L., and Cocking, Rodney R. (2000). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Save and adapt a Google Doc version of this teaching tip.

About the Author

Jonathan Yates is a Special Instructor in the Department of Biological Sciences. His area of interest is the scholarship of teaching and learning with a focus on student success in large lecture class environments. Dr. Yates is also the course coordinator for BIO 1200 and BIO 1300. Outside of the classroom, Dr. Yates can often be found playing golf at the OU courses, weather permitting!

Others may share and adapt under Creative Commons License CC BY-NC


Tags:
active learning, large classes, ou-authored, stem