Art Bettis, Ph.D., professor of earth & environmental sciences
Use peer review of writing to develop students’ critical thinking skills regardless of the size of the class.
Art Bettis uses student peer review of writing in his large lecture course, Introduction to Environmental Science, to promote analysis, synthesis, and evaluation of course concepts and to engage students in the process of scientific knowledge creation.
Professor Bettis designs writing assignments that require students to engage with key course questions.
As a result, students sharpen their thinking and practice many of the higher-order thinking skills identified in Bloom’s Taxonomy, particularly analysis, synthesis, and evaluation.
To facilitate peer review in his large (150-200 students) lecture course, Professor Bettis uses Calibrated Peer Review (CPR), web-based software that organizes anonymous peer review and also calculates a grade based upon the students’ reliability as peer reviewers as well as the students’ reflections on their own and on others’ writing. The process is similar to the process of peer review outlined above. Each paper is given a final grade based upon a combination of:
- peer reviewers’ ratings of the student’s own paper, weighted by each reviewer’s reliability;
- the student’s performance as a reviewer; and
- the student’s self-assessment of his or her own writing.
The instructor can review scores flagged for potential problems and override them as needed. However, a University of Iowa study has found that CPR scores closely match those determined by TAs. Since CPR is an online tool, it also avoids the time limitations of an in-class review, allowing for deeper thinking and reflection.
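To make the idea of a reliability-weighted score concrete, here is a minimal sketch of one possible weighting scheme. The weights, function names, and structure below are assumptions chosen purely for illustration; they are not CPR’s actual algorithm, which is configured by the instructor and documented on the CPR webpage.

```python
# Hypothetical illustration of a reliability-weighted peer-review grade.
# Weights and function names are assumptions for explanation only;
# this is NOT Calibrated Peer Review's actual scoring algorithm.

def weighted_text_score(peer_ratings, reviewer_reliabilities):
    """Average the peer ratings of a paper, weighting each rating by the
    reliability its reviewer earned during calibration."""
    total_weight = sum(reviewer_reliabilities)
    if total_weight == 0:
        return 0.0
    weighted_sum = sum(rating * weight
                       for rating, weight in zip(peer_ratings, reviewer_reliabilities))
    return weighted_sum / total_weight

def final_grade(peer_ratings, reviewer_reliabilities,
                reviewing_performance, self_assessment):
    """Combine the three components named above with illustrative weights:
    60% peer-rated writing quality, 25% performance as a reviewer,
    15% self-assessment."""
    text_score = weighted_text_score(peer_ratings, reviewer_reliabilities)
    return 0.60 * text_score + 0.25 * reviewing_performance + 0.15 * self_assessment

# Example: three peers rate a paper 8, 6, and 9 (out of 10); the first two
# reviewers calibrated well (reliability 1.0), the third less so (0.5).
print(final_grade([8, 6, 9], [1.0, 1.0, 0.5],
                  reviewing_performance=7.5, self_assessment=8.0))
```

The key point of the sketch is simply that ratings from less reliable reviewers count for less, and that a meaningful share of each student’s grade comes from the quality of his or her own reviewing and self-assessment rather than from peers’ ratings alone.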
Whether or not it relies on CPR as a tool, peer review of writing has the potential to sharpen students’ thinking, promote analysis, synthesis, and evaluation of course concepts, and engage students in the process of scientific knowledge creation.
Students in certain disciplines may not expect to write at all, or at least not in a large-lecture course. Professor Bettis and other researchers and practitioners strongly recommend taking time to help students understand how writing and peer review will help them achieve course learning objectives.
Students will learn the most from the peer review exercise when both the writing assignment guidelines and the peer review rubric are deliberately designed to assess and promote achievement of course learning objectives.
Peer reviewing is a skill, and most students will benefit from guidance about how to do it appropriately. A thoughtful rubric and a calibration or “mock peer review” of papers written or provided by the instructor can help train students to identify key issues and provide specific, constructive feedback.
Peer review is most effective when students have an opportunity to use the feedback, whether through a reflection on their own writing or through rewrites or additional papers.
Peer review can be anonymous or signed, and peer-review groupings may be either random or engineered by the instructor.
Class size, environment, course content, and other considerations may influence how peer review assignments are set up.
When possible, it is helpful to guide peer review with open-ended (rather than yes/no) questions that avoid asking students to make emotional judgments or offer unsubstantiated opinions. Rather than asking whether a paper is well argued, for example, a feedback form can ask reviewers to underline the sentence they think is the thesis statement and to identify the types of supporting evidence the author uses. For a useful sample list of items for a peer feedback form, see Linda B. Nilson’s 2003 article in College Teaching.
At first, students in courses that use Calibrated Peer Review may be skeptical about the idea of other students “grading” them. Student buy-in can be increased when instructors spend ample time explaining how the score is largely determined by the student’s own performance as a reviewer (which measures applied understanding of course concepts) and by the student’s own self-assessment. Moreover, the portion of the grade not determined by the individual student is carefully shaped by instructor-provided rubrics, adjustments for reviewer reliability scores, and the software’s ability to let instructors override problematic cases. As noted above, CPR grades closely match grades provided by TAs.
Nilson, L. B. (2003). "Improving Student Peer Feedback." College Teaching, 51(1), 34-38.
The Regents of the University of California. (2015) Calibrated Peer Review: Publications. University of California. Retrieved from http://cpr.molsci.ucla.edu/Publications.aspx
Russell, J., Van Horne, S., Ward, A., Bettis, A., Sipola, M., Colombo, M., & Rocheford, M. (2016). "Large lecture transformation: Adopting evidence-based practices to increase student engagement and performance in an introductory science course." Journal of Geoscience Education, 64.
The Teaching Center. (2016) Planning and Guiding In-Class Peer Review. Washington University in St. Louis. Retrieved from https://teachingcenter.wustl.edu/resources/incorporating-writing/planning-and-guiding-in-class-peer-review/
EDUCAUSE Learning Initiative. (2013) 7 Things You Should Know About Calibrated Peer Review. EDUCAUSE Learning Initiative. Retrieved from https://net.educause.edu/ir/library/pdf/eli7101.pdf
Professor Bettis, together with Adam Ward, former University of Iowa assistant professor of earth and environmental sciences, worked with OTLT Center for Teaching staff member Jane Russell through the Large Lecture Transformation Project, with assistance from the Center for Teaching SITA program, to improve teaching and learning in Introduction to Environmental Science. This transformation included the development of class sessions in the TILE active-learning classrooms and the introduction of Calibrated Peer Review. System requirements, assignment libraries, downloads, a tour of how CPR works, and other support are available on the CPR webpage.
Instructors may also wish to explore PeerMark, an online tool that organizes peer review based upon carefully designed instructor rubrics.
Rubrics -- Discover how rubrics can help turn assessment into a learning opportunity while also making grading more efficient.
Writing Instruction -- Explore how to use writing to promote course goals, and how to effectively comment on high-stakes writing assignments.