“It makes me feel better… just because they said I had a solid argument:” characterization of student interaction with peer feedback

Mary Tess Urbanek a, Danny Vinton b and Alena Moon *a
aDepartment of Chemistry, University of North Texas, Denton, TX, USA. E-mail: Alena.Moon@unt.edu
bDepartment of Chemistry, University of Nebraska – Lincoln, Lincoln, NE, USA

Received 11th April 2025, Accepted 9th October 2025

First published on 9th October 2025


Abstract

Peer review activities have been shown to be beneficial for chemistry students and can promote both their conceptual and science practice competencies. Previous work has focused on identifying what peer review features prompt students to revise their work, where a higher degree of revision is typically correlated with more learning benefits for the student. More recently, this research has begun to identify what characteristics of the feedback recipient influence this feedback uptake. However, in order to best implement these types of activities in the classroom, we must understand how these characteristics and features influence students’ engagement with peer feedback. In this study, we utilized semi-structured interviews to simulate a peer review activity for general chemistry II students. During these interviews, we asked students to respond to a series of hypothetical peer review comments, reflect on how their confidence changed, and explain whether they would like to revise their work. Using a phenomenographic approach, we identified three distinct framings that the students adopted based on their confidence about their initial drafts. Students who experienced low confidence viewed the peer review activity as offering them a mechanism to manage their uncertainty. Meanwhile, students who felt confident about their initial draft either looked to the peer review for confirmation that they had gotten the correct answer or looked for feedback on how to improve their work. These frames shaped the way the students interpreted the feedback message, which ultimately directed their revision choices. This work offers valuable insights for instructors about how to best frame peer review activities to support student learning.


Introduction

Providing students with feedback about their learning is a crucial aspect of education (Hattie and Timperley, 2007; Topping, 2009; Goetz et al., 2018; Lipnevich and Smith, 2022). In post-secondary chemistry classrooms, instructors are typically responsible for giving this feedback to students. However, the high student-to-teacher ratios can make it difficult for instructors to give meaningful and specific feedback to each student. Additionally, this feedback typically comes after the students have submitted the assignment, meaning there is no way for an instructor to ensure that their feedback was processed meaningfully by the student (Nicol, 2010, 2021).

Peer review activities can be utilized to address these issues when incorporated with other feedback mechanisms in the classroom (Cho et al., 2006; Cho and Schunn, 2007; Cho and MacArthur, 2010). It has been well-documented in the literature that peer review is a beneficial activity for students, as students are tasked with acting as both a reviewer and a reviewee, thus gaining learning benefits from both roles (Nicol et al., 2014; Wu and Schunn, 2021b; Zong et al., 2021; Gao et al., 2023; Watts et al., 2024). Previously, peer review research has focused on identifying what types of peer review comments influence a student's revision rate, but the results have been mixed (Nelson and Schunn, 2009; Patchan and Schunn, 2016; Patchan et al., 2016; Wu and Schunn, 2020a, 2021a). The variability in these findings suggests that it is more than the content of these messages that shapes revision behavior. In fact, Lipnevich and Smith (2022) offer a model for feedback, which illustrates how several factors, including the individual characteristics of the student, can influence feedback uptake.

This study builds on this vein by qualitatively exploring how a student's individual characteristics, namely their confidence, shape their engagement in peer review activities. Confidence has emerged as a recurring part of students’ engagement with various tasks, both in peer review and in learning more generally (Efklides, 2006, 2011; Berg and Moon, 2022). Although confidence is acknowledged to impact the ways that students learn, the specific ways that a student's initial confidence shapes their engagement with peer feedback have yet to be explored. By understanding how confidence shapes students’ engagement with peer review, we can improve scaffolding and framing for these activities to maximize their impact. Therefore, in this study, we explored confidence directly by simulating peer review feedback during semi-structured interviews.

Background

Importance of feedback in learning

A substantial amount of research has shown that providing feedback to students can strongly impact their learning by influencing their understanding of the content, their emotions, and their behaviors (Hattie and Timperley, 2007; Topping, 2009; Goetz et al., 2018; Lipnevich and Smith, 2022). Hattie and Timperley (2007) have conceptualized feedback as “information provided by an agent regarding specific aspects of one's performance or understanding” (p. 81). This information can come in a variety of forms, such as grades or comments, and from a variety of sources, such as instructors, peers, or oneself. Previously, research on feedback has focused on identifying what comment qualities, specifically what types of feedback messages, are most conducive to student learning (Cho et al., 2006; Hattie and Timperley, 2007; Nelson and Schunn, 2009; Cho and MacArthur, 2010). However, in more recent years this research has expanded to consider the participatory role that students play in the feedback process (Boud and Molloy, 2013; Nicol et al., 2014; Hattie et al., 2016; Winstone et al., 2017a; Brooks et al., 2019; Lipnevich and Smith, 2022). Specifically, this work highlights that there are a variety of factors (e.g., individual affect, class norms, assignment features) that shape how an individual interacts with feedback (Nicol, 2010, 2021; Lipnevich and Smith, 2022). Because of this, these factors influence the impact the feedback has when it is processed and acted upon by the student (Hattie et al., 2016; Winstone et al., 2017a; Nicol, 2021; Wu and Schunn, 2021a).

In higher education, providing students with feedback is challenging, especially for high-enrollment courses such as general and organic chemistry. These courses typically have very high student-to-teacher ratios, making it unrealistic for a single instructor to provide students with individualized and specific feedback. Additionally, when students are provided with feedback about their understanding, it typically occurs after the assignment is completed, such as a grade after an exam. While this feedback is still valuable for students, it does limit the opportunity for students to directly apply it to their learning and for instructors to ensure the feedback was beneficial and processed by the student (Nicol, 2010, 2021). Therefore, we explore how peer review can offer feedback in a way that accommodates these challenges.

Using peer review to provide feedback

Implementing peer review activities can offer students the opportunity to take a more active role in the feedback process (Nicol et al., 2014; Winstone et al., 2017a; Carless and Boud, 2018). Typically, these activities consist of students producing an initial draft of an assignment, which is then anonymously evaluated by peers. Concurrently, they evaluate their classmates’ work. After this evaluation process is over, students can use the feedback to revise and resubmit their work. Previous research has identified numerous benefits associated with engaging in these activities, including cognitive, metacognitive, and affective impacts (Liu and Hansen, 2002; Topping, 2009; Tsai and Chuang, 2013; Cheng et al., 2015). In these activities, a student acts as both a reviewer and reviewee, and both roles have implications for learning (Nicol et al., 2014; Wu and Schunn, 2021b; Zong et al., 2021; Gao et al., 2023; Watts et al., 2024). Receiving feedback can help students identify new weaknesses in their work they may not have identified on their own (Nicol et al., 2014), while providing feedback helps students become more familiar with the assessment criteria and can prompt deep reflection on their work (Nicol et al., 2014; Berg and Moon, 2022). This engagement can support the development of students’ feedback literacy, or their ability to make sense of information and use it to improve their work or learning (Carless and Boud, 2018; Dong et al., 2023; Zhang et al., 2024). It also enhances their evaluative judgement competency, or the ability to assess the quality of work (Ajjawi and Boud, 2018; Tai et al., 2018; Ibarra-Sáiz et al., 2020; Nicol, 2021). Furthermore, the structure of these activities ensures that students receive multiple pieces of feedback, which can lead to greater improvements in writing than just receiving feedback from a single instructor (Cho et al., 2006; Cho and Schunn, 2007; Cho and MacArthur, 2010).

In chemistry contexts, peer review activities are often integrated into writing-to-learn pedagogies (Moon et al., 2018; Finkenstaedt-Quinn et al., 2019, 2024a, 2024b; Watts et al., 2024) and laboratory contexts (Gragson and Hagen, 2010; Basso, 2020; Piccinno et al., 2023). Studies have shown that students generally perceive the peer review activity as helpful for their learning (Finkenstaedt-Quinn et al., 2024a; Madden et al., 2024), and instructors often see improvement in their students’ writing between drafts (Gragson and Hagen, 2010; Zwicky and Hands, 2016). Additionally, engaging in peer review activities has been linked to increased conceptual understanding of chemical phenomena, such as light–matter interactions (Moon et al., 2018) and Lewis structures (Finkenstaedt-Quinn et al., 2019). It also promotes students’ engagement in science practices, as shown by work from Berg and Moon (2022) that illustrated how giving feedback to peers during a data analysis task prompted students to generate critical internal feedback about their own work.

Factors influencing the effectiveness of peer review

While several benefits associated with peer review activities have been identified, questions remain regarding the variation across these findings. For example, some studies suggest that providing feedback is a more beneficial process for students than receiving feedback (Ion et al., 2019; Yu and Schunn, 2023; Cui and Schunn, 2024). These studies propose that giving feedback is a more active and constructive process than receiving feedback (Wu and Schunn, 2023). However, other studies have found that both giving and receiving peer review are beneficial for students (Nicol et al., 2014; Zong et al., 2021; Gao et al., 2023; Finkenstaedt-Quinn et al., 2024a). Additionally, the role that feedback features play in student revisions has also yielded mixed findings. Work from Nelson and Schunn (2009) found that peer review comments that included suggestions for improvements were positively related to student revisions, while work from Wu and Schunn (2020a, 2020b) and Patchan et al. (2016) suggests no statistically significant relationship. This variability extends to other peer review comment features, including those that offer praise, identify a problem in the student's work (i.e. problem statements), and those that explain why and how a student should improve their draft (i.e. explanations and solutions) (Nelson and Schunn, 2009; Patchan and Schunn, 2016; Patchan et al., 2016; Wu and Schunn, 2020b, 2021a).

The variability in these findings suggests that there are likely individual student characteristics that impact how these peer review comments are processed and ultimately used in these activities. A host of these factors have been identified across the literature, including characteristics such as openness and receptivity, and volition (Carless, 2006; Winstone et al., 2017b; Carless and Boud, 2018; Wu and Schunn, 2021a, 2021b; Lipnevich and Smith, 2022).

Included in these factors is the student's affect, specifically their confidence and uncertainty towards their work. These have emerged as related components of students’ experiences with peer review, both in our own work on peer review and in work on feedback uptake more broadly. In Berg and Moon (2022), hypothetical social comparisons, grounded in social comparison theory, were used to prompt the process of reviewing peers. Social comparison theory asserts that when individuals encounter situations for which they are uncertain about how to act or behave, they will compare themselves to others to extract criteria for how to act in order to reduce the uncertainty (Michinov and Michinov, 2001). Confidence can be conceptualized as the inverse of this uncertainty: “a state of being certain about the success of a behavioral act” (Stankov et al., 2012). Importantly, confidence is an element of metacognitive experiences (Efklides, 2006), where one's feelings of confidence are hypothesized to shape how one self-regulates during a task (Efklides, 2011). In Berg and Moon (2022), participants explicitly articulated their confidence with phrases like “I was feeling pretty good” and their uncertainty with phrases like “I don’t know.” A key finding of this study was that these expressions of confidence and uncertainty corresponded to how students responded to reading their peers’ work (Berg and Moon, 2022), echoing Efklides's (2011) assertions about the connection between metacognition and affect. In a similar vein, Lipnevich and Smith (2022) identified students’ confidence as a characteristic that could interact with the nature of the peer review comment: they described how students experiencing low confidence in their work may have these feelings intensified by negative feedback, and vice versa. In the study presented herein, we anticipated that participants’ feelings of confidence and uncertainty from engaging in the task would shape how they interacted with the feedback they received.

While previous studies have identified students’ confidence as playing an influential role in how they engage with feedback, questions still remain regarding how this confidence actually shapes that engagement. Recognizing confidence as an important feature to attend to helps us understand what students may be experiencing when completing these activities, but it is only one piece. Understanding how a student's confidence manifests and affects their interaction with feedback is critical for a number of reasons. It can help provide clarity as to why certain peer review comment types are helpful in some contexts but not others (Nelson and Schunn, 2009; Patchan et al., 2016; Wu and Schunn, 2020a, 2020b). Additionally, it can help us identify best practices for designing and scaffolding these types of activities, which has the potential to make them as beneficial as possible for students.

Therefore, we used a qualitative approach to simulate peer review and investigate how a student's confidence changes in light of peer feedback and how it frames their engagement with the feedback comments. We then explored how this engagement drives students’ revision behaviors. This study investigates the following research questions:

(1) How does a student's confidence in their initial draft shape their motivations for engaging with peer feedback?

(2) How do these motivations shape how students engage with peer feedback and revise their initial arguments?

Methods

Overview

We administered a previously designed argumentation task in a General Chemistry II course as a homework assignment. Following the administration of this assignment, we recruited students to participate in semi-structured interviews. During these interviews, we simulated peer feedback, where each student was shown four hypothetical peer reviews about their response. At the end of the interview, we gave students the opportunity to revise their work and reflect on how the peer feedback influenced their decision. Interview transcripts were analyzed utilizing phenomenography to identify how the students’ initial confidence influenced their engagement with the peer review comments. Each portion of the data collection and analysis is explained below.

Positionality and reflexivity

As qualitative researchers, we recognize that our identities played a role in the nature and quality of the data that we collected. For this interview context, because the topic is not sensitive, we argue that the most prominent and influential facet of our identity is our perceived age (1st and 3rd author). Specifically, it is likely that our youthful and informal appearance aided in establishing rapport with an undergraduate-aged population. Additionally, though these were not particularly sensitive interviews, we employed best practices to establish rapport and respect. These included, but were not limited to, making ‘small talk’ at the beginning, explicitly articulating our commitment to not assessing participants and to only valuing and seeking their thoughts, and being mindful and present. In this way, we specifically attended to the visible comfort of the participant and relied on the 3rd author's interviewing experience to pivot when necessary to increase comfort.

Data collection

We began data collection for this study by administering an argumentation task as a homework assignment in a large, General Chemistry II course at a research-intensive university in the Midwest. This assignment asked students to select conditions (i.e. a thickness, temperature, and treatment) to dehydrate onion slices based on empirical data and provide an argument as to why those were the best conditions. To determine the best condition set, the students must consider three onion qualities: how dehydrated the slices become, how quickly the slices spoil, and how fast the flavor is lost. This task differs from typical assignments in that there is no correct answer; rather, multiple answers could be supported with the data included in the assignment. A full copy of the task and description of student responses to the task can be found in a previous publication (Urbanek et al., 2023).

When submitting their assignments, students were asked if they would be interested in participating in a follow-up interview about the assignment. In total, 78 students indicated they were interested. From these interested students we selected nineteen to participate in interviews based on their indicated availability during open interview times.

An overview of the interview process can be found in Fig. 1. These interviews began by showing the students a copy of their initial responses to the assignment and asking them to explain how they had originally arrived at their response. After this, we showed the students a series of peer review comments about their work that we constructed based on common features identified in previous literature (Nelson and Schunn, 2009). We also used peer review comments from previous work in our group to mimic student language. The peer review comments utilized for the interviews can be found in Table 1.


Fig. 1 Overview of data collection processes.
Table 1 Examples of peer review given to each student during their interview. The specific option the student was shown was determined by what they had written in their original draft. Each student interacted with one of the options from each category.

Praise and stylistic solution
Option One: Overall, the decision seems to follow the data, good job! Just try and avoid using “I” and other pronouns to keep the answer more professional. Very nice and solid arguments otherwise though.
Option Two: Overall, the decision seems to follow the data, good job! I would fluff the parameters sentence just to make it more professional. Very nice and solid arguments otherwise though.

Praise and high-prose solution
Your choice for temperature, thickness, and treatment look right to me. I think your second response could do a better job of including evidence from the data and explaining why that data supports your answer. Like just add how temperature, thickness, and treatment better preserve the onion.

Counterargument
Option One: While I do understand your choice in thickness, I disagree with the temperature you selected. If you chose 50/60 degrees, then your moisture content is going to be very high, which is going to be bad for the onions. You need to go back and reconsider Fig. 2 and how your conditions will affect the moisture content.
Option Two: Your choice in thickness makes sense to me. However, I don’t agree with your choice of temperature. If you chose 70 degrees, you’re going to have much higher rates of browning and flavor loss. This will make for a bad onion. I would recommend looking at Tables 1 and 2 again to reconsider your browning and flavor loss.

Vague criticism
This is a little confusing. It's hard to tell how you got to your answer.


Each student interacted with four types of peer review comments. The participant first viewed the praise and stylistic solution peer review. This type of peer review offered encouragement to the student by saying that the reviewer completely agreed with the student's condition choice but offered a grammatical solution that the student could implement to improve their response. Once the student had read the peer review, they were prompted to reflect on how it was influencing what they thought of their original answer, as well as how confident they felt. Then, students viewed the second type of peer review, which was a praise and high-prose solution comment. This review also offered encouragement by agreeing with the student's initial response but prompted the student to include additional evidence and reasoning in a revision. Students were prompted to reflect on what they were taking away from the feedback, as well as how confident they felt. The third peer review comment offered a counterargument to the selected condition set. This peer review pointed out specific problems with the student's initial temperature choice and recommended the student change this in their revision. This review also offered an explanation as to why such a change was necessary by pointing out weaknesses in the student's initial claim. Again, after seeing this review, students were prompted to share their thoughts on the feedback message and share how it was affecting their confidence. Finally, the students engaged with a vague criticism peer review. This peer review did not point out specific problems nor offer any solutions for the students to implement, but implied that the response needed improvement and was confusing. The students also reflected on their takeaways and confidence levels after interacting with this review.

We drafted different options of the same peer review type to help ensure that the peer review was consistent but still relevant to the individual receiving it. For example, students who had personal pronouns in their initial draft were shown the first option for the praise/stylistic solution peer review. Otherwise, students were shown an alternative peer review that recommended they pad their initial response. Two options were also utilized for the counterargument peer review, where students were shown the option that included their initial temperature in it. Both options are included in Table 1.

After going through each peer review comment, the students were given an opportunity to revise their work. If the student opted to revise their work, we asked them to explain why they made that revision. If the students did not revise their work, we asked them to explain why not. Interviews ranged from 45 minutes to approximately 90 minutes. All students who participated in the interviews were compensated with a $20 gift card. All data collection procedures followed proper IRB protocol, and a full copy of the interview guide is included in the SI.

Data analysis

Verbatim transcripts for each participant's interview were generated using Otter AI. Each transcript was checked for accuracy by listening to the audio while simultaneously reading the written account and correcting any discrepancies between the two.

Data were analyzed according to phenomenography procedures outlined by Marton (1986). According to this methodological perspective, there are qualitatively distinct ways of experiencing a phenomenon, and these distinct ways are limited in number (Marton, 1986). The purpose of phenomenography in this study is to identify and describe the qualitatively different ways that peer feedback is experienced. We first generated summaries of each student's interview. These summaries detailed the students’ initial response to the task, their initial confidence, their takeaways from the peer review, how their confidence changed based on the feedback they received, what revisions the students made, and why the students made those revisions. Through this step, we familiarized ourselves with the data and identified the critical aspects of the interviews that could be utilized to differentiate the student groups (Marton, 1986). We condensed these summaries into a table that provided a brief description of each participant.

This table included information about each student's confidence level going into the peer review, how their confidence changed in response to each peer review, and a brief description of each revision (e.g. the student incorporated more evidence into a second draft). Condensing the information from the interviews in this way enabled us to observe similarities across participants. These similarities were then utilized to define the categories (Marton, 1986). We initially sorted students based on the confidence they had in their initial draft. Two categories emerged from this sorting: students who exhibited low confidence and students who exhibited some higher degree of confidence.

We then began sorting student responses within these categories to further differentiate student responses. From these groupings, we built descriptions that captured the differences between the groups. These descriptions were then utilized to sort the students again and further refine the categorical descriptions. We continued this iterative process until these descriptions reached stability and required no further refinement (Marton, 1986). A detailed description of each student's response to the peer review comments, their confidence levels, their revision decisions, and how they were sorted can be found in the SI.

Trustworthiness

To ensure the robustness of our analysis, we engaged in trustworthiness processes with an outside researcher who also conducts peer review research (Lincoln and Guba, 1985; Creswell and Poth, 2018). We aimed to ensure that the categories we identified during our analysis could be recognized by other researchers with reasonable agreement (Marton, 1986). The first author trained the external researcher on the analysis by explaining the descriptions of the categories and discussing examples that illustrated these descriptions. Then, we each independently sorted a subset of four transcripts using the category descriptions and compared our results. From this sorting, we identified a need to clarify the boundaries between certain categories, specifically between students who sought feedback about their initial claim from the peer review and students who expected the peer review to confirm their initial claim was correct. Through discussion, we clarified that the main difference between these categories was the framing the students had for engaging in peer review. We revised our descriptions for these categories accordingly. After trustworthiness was established, the first author reviewed their coding for the remaining transcripts to ensure each transcript was categorized appropriately.

Results

Overview

We identified three distinct ways that students’ confidence framed the way they engaged with the peer review. Specifically, the students’ initial confidence shaped what they focused on in each peer review message, which ultimately affected their revision choices. An overview of the students’ initial confidence, their responses to each peer review type, and their final revision choices is outlined in Fig. 2. In the results below, we qualitatively describe how the students’ confidence shaped what they attended to in each peer feedback comment and illustrate how it ultimately informed their final revision choices.
Fig. 2 Overview of students’ engagement with the peer review comments. The students’ confidence in their initial draft shaped what they attended to and took up from each peer review comment they interacted with, which ultimately drove their final revision decisions.

Peer review helps resolve uncertainty

The first group included students who entered the peer review with low confidence in their initial response. These students all noted that they were feeling uncertain, either about their data interpretation or about how they used evidence to support their claim. For example, Philodendron felt uncertain about how to coordinate the evidence into a cohesive claim since the data conflicted about which condition set was best.

“I don’t know… Just because the results of these two aren’t the same. Like, in the water content one, it's much worse to have the fifty degrees Celsius. But the fifty degrees Celsius was the best one in these. So there's like a conflict there.” (Philodendron).

Meanwhile, Lilly mentioned being unsure about their understanding of the rate constants in the task.

“I think if it had a lower k value and that's what's responsible for taste then it… Wait. Do I have that backwards? Maybe… I’m not sure how I got that relationship on this one to be honest…” (Lilly).

For these students, engaging with the peer reviews provided them with an opportunity to get feedback on whether this uncertainty was warranted and how to manage it more productively. This desire to resolve their uncertainty framed what aspects of the peer review comments the students attended to. For example, in response to the praise and stylistic solution peer review, some students, like Philodendron, experienced a decrease in their confidence levels.

“It honestly did not help my confidence that much… Just because I still don’t know. There's that conflicting data. And I would just like… want someone to explain with the conflicting data how I can tell which one to choose. Because I didn’t really know, I kind of just had to make sort of an educated guess.” (Philodendron).

This illustrates that Philodendron was explicitly looking for the peer review to comment on how they had managed the conflicting data in their initial draft. However, because the peer review was more focused on stylistic improvements, it was unable to resolve Philodendron's uncertainty, and their confidence remained low.

Other students, like Lilly, focused on the praise aspect of the feedback and experienced an increase in confidence.

“It makes me feel better… Just because they said I had a solid argument. I guess it just makes me think that some people might have viewed the graphs the same way that I did.” (Lilly).

For Lilly, having others agree with them suggested that their initial interpretation of the rate constants had been logical, prompting them to feel more confident since this peer review resolved their initial uncertainty.

When considering the praise and high-prose solution peer review, Philodendron was still unable to resolve their uncertainty, and their confidence remained low.

“I’m not very confident… I think that it just goes back to the way that I’m reading it, there seems to be a conflict. And so I wasn’t super confident to begin with when I chose. And then from this peer review, it kind of just seems like I didn’t validate or explain my choices either.” (Philodendron).

Again, we see that Philodendron's initial low confidence was framing what they were taking away from the feedback message. Specifically, they interpreted this peer review as indicating that their first draft was lacking evidence and reasoning to support it. Because they already felt uncertain about this aspect before receiving the peer review, the comment ultimately heightened their initial uncertainty and left their confidence low.

In contrast to Philodendron, Lilly focused on trying to get information to help resolve their uncertainty.

“I guess I would just need to explain more. It could have been better. And then they would have understood where I got it from… It makes me feel good about it. I still don’t know if I’m right. But it makes me feel more confident that this peer review could see where I got it from.” (Lilly).

Lilly again felt more confident, noting that the praise from the feedback suggested their original response was logical, which ultimately reduced their uncertainty. However, unlike the previous peer review, Lilly also attended to the solution aspect of the feedback message and identified new ways they could revise their work to improve their response.

All students in this group experienced a decrease in their confidence when they encountered the counterargument review. Since these students were all focused on getting feedback on how to better resolve their uncertainty, engaging with a review that disagreed with their response prompted deep reflection on their initial reasoning. Some students, such as Philodendron, were prompted to consider changing their condition choice to better manage their uncertainty about their response.

“I do realize that there's—that obviously at 50 degrees Celsius, you’re gonna have some problems here… [I’m] not super confident. Just because I realize with the onion I chose, under the conditions I chose, it's going to have some problems. And if I choose it the way they said, I’m still not super sure because we would have browning…” (Philodendron).

Other students, such as Lilly, were prompted to revisit their data interpretation.

“It's making me go back to my data and compare why you would have higher rates of browning and flavor loss. I guess it's because I still don’t fully understand the k relationship. But it makes me go back and rethink why… [My confidence] is probably like a 5 out of 10. Just because… not knowing the k makes me feel less comfortable with what I answered.” (Lilly).

For both students, the counterargument peer review exacerbated their initial uncertainty in their response and resulted in a decrease in their confidence levels.

These students continued to express their uncertainty when engaging with the vague criticism peer review. Specifically, these students interpreted the peer review as indicating that there was a problem in their work that required a revision.

“I would probably agree with that. Just from my answers there, I definitely did not explain my thought process really at all… [I’m] not very confident… I already know there is a problem. So somebody saying that this doesn’t make sense makes me think that maybe it's the other way.” (Philodendron).

To Philodendron, the reviewer's confusion stemmed from the same source as their own uncertainty, which ultimately amplified their initial concerns. Again, Philodendron's need to resolve their uncertainty framed how they interpreted the meaning of the feedback message and ultimately led to a decrease in their confidence.

Based on these excerpts, the students’ engagement with the peer review was framed by their low confidence in their first draft. For these students, changes in their confidence levels were determined by whether the peer review indicated they managed their uncertainty well (i.e. Lilly) or not (i.e. Philodendron). All students in this group opted to make revisions to their work. Philodendron ended up substantially changing their response by adopting a new claim altogether.

“Okay, so I was just thinking about which one would be the best. And I was trying to… Because the last time when I was doing it, I did notice there was a conflict between the two. But I didn’t come up with a good reason why I chose one over the other… And so I was just trying to think about that more… And in doing so, I changed my mind and decided it’d be better to have lower water content… So I just tried and look at which one would maybe affect [the onion] more. And that's when I thought that maybe I should look at how much it would change with the temperature. And that's when I saw it would be 1 kilogram, which I thought was substantial, but with the rate constant it was only a 0.1 difference.” (Philodendron).

Philodendron's engagement with the peer review helped them identify new ways of answering the question that would help address their initial uncertainty. In this case, by reconsidering a certain onion quality (i.e. the water content), Philodendron was able to engage in deeper data analysis and come up with a new argument that resolved this initial uncertainty. While their confidence remained low throughout the interview, they were able to identify a new argument that they could adopt to help resolve some of their concerns with their first draft.

On the other hand, Lilly revised their work by incorporating more evidence and reasoning into their response.

“I guess in the first section it said that a higher k value meant they browned more quickly. And if they have a higher k value they don’t hold on to taste as well. Which was what I had written. And because two people agreed with me, I’m just gonna stick with my answer. But the peer review said it would like more of an explanation of where I got my conclusion from, so I just wanted to re-emphasize what I was saying.” (Lilly).

Lilly initially entered the peer review feeling uncertain about their interpretation of the rate constant values. This uncertainty framed what they were attending to and taking away from each feedback message. For example, Lilly felt more confident when attending to the praise components of the feedback, as they interpreted this as meaning they had productively managed their uncertainty. They also experienced lower confidence when considering the feedback that enhanced their uncertainty, such as the counterargument review. By engaging with the peer review, Lilly was able to gather information on how well they had managed this uncertainty, which ultimately helped them decide how to better support their claim through revision.

Peer review can provide feedback for improvement

The second group consisted of students who felt confident about their initial draft, but acknowledged there was still room for improvement. Tulip highlighted this sentiment when reflecting on their initial draft before seeing any peer review.

“Right now, I still overall agree with my final conclusion. I definitely think that I could have expanded more on specific data values in my final response. But other than that, no major comments.” (Tulip).

While the students in this group felt confident about their initial choices, they still attended to aspects of the feedback messages that indicated they should revise their work. For example, Tulip was focused on how the reviews critiqued their use of evidence and reasoning in their work. Tulip reflected on this when responding to the praise and stylistic solution peer review.

“I mean, they said that my decision follows the data… Nice, solid arguments. I still feel confident about what I produced. But it is good to hear that I can work on being a little bit more professional and almost overly clear.” (Tulip).

Tulip's high confidence stemmed from the reviewer praising their use of data to support their claim, illustrating how much Tulip valued grounding their claim in the data. Even though Tulip felt confident in their first draft, their openness to feedback helped them attend to the more critical aspects of the review.

Tulip approached the praise and high-prose solution peer review in a similar way; however, after this review they experienced a decrease in their confidence. Specifically, they felt less confident because the peer review indicated that they needed to better support their claim in a revision.

“I didn’t cite any major specific examples from the data, any specific water values, any k values. So I definitely need to use that to be more clear and not just assume that the reader has access to all this data… I’m probably a bit less confident. My final conclusions, I still feel good about that. But as for the explanation sides of things, probably a bit less. I definitely could have cited more specific examples.” (Tulip).

Tulip's receptiveness to the peer review comments helped them identify what aspects of their argument were working (i.e., their claim) and which may need more work (i.e., their explanation). While they may have felt less confident about how they supported their initial claim, they still felt confident in what they proposed for their claim.

When considering the counterargument peer review, Tulip again felt a slight decrease in their confidence levels.

“Yeah, so this is kind of disagreeing with the 50 degrees decision. And this is probably just a matter of putting different weights of different aspects… I still overall hold on to my idea. But I think that's just a consumer bias situation, where I would want stuff that tastes good… But I can see how I kind of chose an extreme value. So I’m probably just a little bit less [confident], just hearing a different conclusion.” (Tulip).

Tulip acknowledged that this peer review helped them realize that their temperature choice may be justified by personal biases rather than the data. They indicated that it was possible they may have selected a more “extreme” answer, though they still believed that their initial idea was viable. Again, Tulip's takeaway from this review was framed by their initial confidence level. They went into this peer review knowing that there likely was room to improve their response, but overall felt confident in what they said, which was ultimately confirmed by the peer review.

When considering the vague criticism peer review, Tulip again focused on what the reviewer noted about their reasoning.

“I should have cited specific numbers… So I guess just being more conscious of that, citing specific data and where it came from is definitely gonna be important. Because I explained why the trends are the way they are, but I didn’t explain how I interpreted the trends to reach my final conclusion… Definitely get why they’re confused.” (Tulip).

In Tulip's reflection on this peer review, their main focus was to collect feedback on how to improve their response. Even though this peer review did not offer any specific suggestions on how to revise their work, Tulip's focus on identifying ways to improve their response prompted them to engage in deep reflection to identify aspects of their argument that needed revision.

By this point in the interview, Tulip was compelled to reconsider what evidence and reasoning they had initially used. Like the other students in this group, Tulip opted to make a revision by incorporating new evidence and reasoning into their argument. Specifically, they wanted to ensure that their second draft was clearer and cited specific evidence to support their claim. After revising their work, Tulip reflected on how the peer review helped them improve their response.

“So I guess as a whole, comparing my revision to my initial, it's just being a little bit more specific and clear about where I’m drawing the values or explanations from. Explaining the discrepancies that occurred in both extremes rather than just saying ‘these were extreme values’… And so as a whole, I feel like my response is definitely a little bit more clear.” (Tulip).

To summarize, students in this group began the peer review activity feeling confident in their work while still recognizing that there may be changes that would help improve their response. This openness to feedback helped students identify which critiques offered by the peer review would help improve their response. Tulip, for example, was especially focused on feedback about how well they supported their claim with specific evidence and reasoning. By attending to this information, Tulip was able to take up the feedback needed to improve their response in a revision.

Peer review provides confirmation

The final group in this study included students who framed the peer review activity as being confirmatory. These students all came into the interviews feeling very confident in their initial draft and were focused on taking up feedback that confirmed that their initial response was correct. Often, this was done by focusing solely on whether the peer review agreed with their final answer or not. For example, Poppy experienced an increase in confidence after reading the praise and stylistic solution peer review.

“Um, it makes me feel more likely to stand by it if somebody else was to come in and be like, hey I think you’re wrong. I’d be like, well actually somebody said that I follow the data.” (Poppy).

While this peer review did offer a solution for Poppy to consider during their revision, they focused only on whether the peer review indicated that they got the correct answer or not. To Poppy, having the peer review agree with their response increased their confidence because it meant they followed the data and did not need to make a revision.

Poppy maintained this focus when interacting with the praise and high-prose solution peer review. Here, Poppy maintained high confidence but was more reflective on the suggestion offered by the peer review.

“I still feel like this is telling me I did something correctly. But… that makes me want to go back and make sure that it's very clear cut what I said… It makes me want to like, really make sure that I understand it in order to relay my answer to somebody else. [I’m still confident] because the first sentence says ‘Looks right to me,’ so it means I still did somewhat of a good job.” (Poppy).

As in their previous interactions with the peer review, Poppy was focused on making sure that the reviewer agreed with their initial response and used that as a metric for how well they did. While Poppy identified ways to check their work (i.e., make sure their argument is clear), they ultimately attended to the peer review agreeing with them. This illustrates that Poppy's focus when receiving feedback was to ensure others agreed with their response.

The counterargument was the only peer review these students interacted with that decreased their confidence. Unlike the previous peer reviews, the counterargument pointed out a specific aspect of the argument that should be changed in a second draft.

“This one is telling me that they disagree with my temperature choice the onions were dehydrated at. And while I agree that the moisture content will be high, this review doesn’t really talk about the other condition. So I’m not super keen to accept this at face value… There's a part of me that's like, ‘Pretty confident!’ But there's this part of me that would probably throw me for a loop if I hadn’t had any outside context regarding any of the data. Like if I hadn’t seen the other ones, I probably would be like, ‘Oh, okay. I guess I will go back and look.’ So not as confident, if we’re just taking this into context.” (Poppy).

While this peer review pointed out a weakness in Poppy's work, Poppy opted to stand by their initial choice. Specifically, they noted that they thought this peer review was limited in terms of how much data it considered. Poppy still experienced a decrease in confidence because the peer review disagreed with them; however, they maintained that their initial draft was still correct. Specifically, Poppy noted if they had not already seen peer reviews that agreed with them, they likely would have changed their response based on this review. This again highlights that Poppy was focused on ensuring that others agreed with their work.

When students in this group were shown the vague criticism peer review, they once again maintained high confidence in their initial draft, specifically citing that this review was too vague to act on.

“I don’t know how [my response] is confusing… I don’t think it really is impacting my argument. I want to see if this person is responding that way because they didn’t read it, or they genuinely didn’t understand it.” (Poppy).

This peer review had limited influence on how Poppy was thinking about their initial draft as it did not identify why the response was confusing. Furthermore, Poppy questioned the credibility of the reviewer by specifically noting that the confusion could be from the reviewer not reading the response rather than their work being genuinely confusing. Because of this, Poppy felt justified in ignoring this feedback and did not incorporate it into their final revision. Poppy's response to both peer review comments that disagreed with their initial reasoning (i.e. the counterargument peer review and the vague criticism peer review) was to identify weaknesses in these reviews. By doing so, Poppy was able to discount the reviews and maintain the idea that their initial response was correct.

All students in this group opted not to make a revision. Specifically, these students noted that most of the peer review had agreed with their claim, which suggested that there was no reason to revise their work. Despite not making a revision, these students did acknowledge that the peer review pointed out specific places they could have improved their argument. However, they viewed these suggestions as important to incorporate for future work, but not necessarily for this assignment.

“When I was looking at it, I felt like most of the comments—at least the first two were mostly about formatting or stuff that I could have done better. Like not using ‘I’ or just giving a better response with more data and evidence why I said that. It doesn’t say anything about the response itself not being right necessarily. So this is good to consider if I were to do it again, but maybe not for my final response.” (Poppy).

To summarize, students in this group entered the peer review feeling confident in their initial response. When engaging with the peer review, they were focused on ensuring that others agreed with the answer they had selected to confirm they had done the assignment correctly. Poppy's quotes illustrate that this group was more focused on confirming their initial draft than incorporating any suggestions for improvements that were offered. Since no peer review was able to convince the students that their initial response was incorrect, these students opted to make no revision in their final draft.

Discussion and implications

In this study, we linked students’ confidence in their initial draft to what they focused on when engaging with each peer review comment. Specifically, we identified a range of confidence levels that differentially shaped how students took up feedback. For example, students who expressed low confidence about their first draft were focused on collecting feedback on how to resolve their uncertainty, so they attended to aspects of the peer review comments that gave them insight into this. On the other hand, students who felt confident in their initial draft but were focused on identifying ways to improve it attended to aspects of the feedback that helped them decide how to revise. Finally, students who experienced high confidence levels and believed their initial draft was correct focused on aspects of the peer review comments that confirmed their initial response.

These foci ultimately explained why the students made different choices when it came time to revise their work. Regarding our students who experienced uncertainty about their first draft, we highlight that peer review can provide a mechanism for these students to resolve this uncertainty and improve their response. For students who were confident in their initial response but were open to making changes in a second draft, peer review provided an avenue to identify new ways to improve their response. For students who had a high initial confidence and were focused on confirming they had gotten the correct answer, certain suggestions made by the peer review comments may not have been taken up by the student. This ultimately resulted in a lack of revision for these students.

The link between confidence and what students attend to in a feedback message has been identified in previous work (Lipnevich and Smith, 2022), and our findings echo this previous work. For example, previous work has noted that low confidence can be enhanced by negative feedback (Lipnevich and Smith, 2022). This was demonstrated by uncertain students (i.e. Lilly and Philodendron), who felt decreases in their confidence levels after attending to more critical peer review comments, such as the counterargument or vague criticism reviews. On the other hand, previous literature has also noted that positive feedback can enhance high confidence levels (Lipnevich and Smith, 2022). This also aligns with our findings, where we saw students who felt extremely confident in their first draft (i.e. Poppy) maintain this confidence level after attending to the praise components of some of the peer review comments.

However, the findings from our work can help explain some of the variability shown in prior research on the relationships between comment features and implementation rate (Nelson and Schunn, 2009; Patchan et al., 2011, 2016; Patchan and Schunn, 2016; Wu and Schunn, 2021a, 2021b). Specifically, our work considers how a student's confidence shapes how they process these different features, which ultimately influences their feedback uptake. For example, students who experienced low confidence viewed the counterargument as a sign that they needed to revise their work. On the other hand, students who had high confidence responded by pushing back or ignoring the comment. Future work could explore this on a larger scale; that is, quantitatively model how confidence moderates the relationship between feedback features and revision practices (e.g., Wu and Schunn, 2021a). This could be accomplished by prompting students to rate their confidence prior to peer review. Scaling up peer review activities also introduces the complexity of engaging students in both giving and receiving feedback. While an exploratory study did demonstrate ways that reviewing others shaped the reviewer's confidence (Berg and Moon, 2022), more qualitative work is needed to determine how an individual's confidence impacts the construction of peer feedback comments.

These findings also serve to unpack the individual characteristics umbrella that has emerged from research on feedback uptake and peer review. Specifically, our work expands on how confidence has been previously conceptualized by Lipnevich and Smith (2022), highlighting how low confidence can act as an asset and motivate students to seek information from their peers while high confidence may act as a barrier. This adds a bit of nuance, as it suggests that confidence may be less influenced by praise or negativity, and instead more impacted by the degree to which the feedback potentially resolves the underlying uncertainty. Practically, this undergirds an instructional need to treat uncertainty as a productive construct and to motivate both student and instructor attention to feedback that is substantive (Ha et al., 2024; Starrett et al., 2024). Research in chemistry education has only just begun to explore how uncertainty can be a productive tool for chemistry learning and these findings suggest this remains an important direction for research.

Because an individual's initial framing plays such a key role in shaping how they navigate an activity like peer review, there is room for instructors to impact students’ goals and thus support better engagement in the activity. Instructors can directly influence the framing of these activities by messaging to students that their peers' ideas can be productive (Scherr and Hammer, 2009; Criswell, 2011; Berland and Hammer, 2012). For example, messaging that peers are a valuable resource and can help identify new ways to improve their response could increase openness to feedback even for students who are confident and seeking confirmation. Messaging that it is okay to be confused and that there may not be a “correct” way to resolve this confusion could support low-confidence students in persisting through the discomfort of uncertainty and confusion to extract useful information from their peers. Admittedly, this kind of framing stands in contrast to dominant messages currently sent in practice, which often tend towards defining correct and incorrect in terms of alignment with a key (Schwarz et al., 2024). However, for activities like peer review that involve data analysis, argumentation, reflection, and critique (and for which there is no key), alternative messages are necessary for these kinds of activities to be impactful. Further research will be needed to investigate how shifting this framing impacts the nature of students’ engagement in activities, like peer review, that engage learners in such cognitively demanding practices (Duschl, 2007; Kuhn and Zillmer, 2015).

Limitations

The design of this qualitative study both afforded and limited the kinds of claims that can be made from these data. The study simulated peer feedback and thus does not mimic an authentic peer review activity, in which review is bidirectional and there is little to no external prompting for reflection beyond the feedback itself. This raises concerns regarding the transferability of these findings to more authentic settings. However, this work was exploratory, with the aim of elucidating how students engage with feedback in order to inform how such activities are scaffolded and facilitated.

These findings were limited to exploring the role that confidence plays in peer review activities. Certainly, confidence is not the only factor that influences engagement in peer review; there are likely a multitude of factors that shape students’ peer review behaviours (e.g., a student's goal orientation for the course) (Lewis, 2018). Accounting for more of these factors may offer a more nuanced understanding of the groups identified in our study. For example, it is entirely possible that high-confidence students did not know how to use the feedback they received from their peers, which is why they opted not to revise their work (Carless and Boud, 2018; Dong et al., 2023; Zhang et al., 2024). Future work should include additional factors to explain how students engage with these feedback comments.

Furthermore, these interviews explored only one direction of peer review: receiving feedback. This was critical for precisely modelling how learners process the feedback they receive. However, peer review involves both giving and receiving feedback, and these two roles very likely interact with one another (Nicol et al., 2014). For this reason, findings from this study offer only a partial understanding of peer review until further studies, as proposed above, qualitatively explore the process of giving feedback and quantitatively model the role of confidence.

Ethical considerations

All data collection procedures followed an approved Institutional Review Board (IRB) protocol.

Author contributions

MTU: conceptualization, interview design, data collection, data analysis, and writing. DV: data analysis. AM: conceptualization, oversight of data collection and analysis, writing.

Conflicts of interest

There are no conflicts to declare.

Data availability

Data collected from human participants, described in the methods, are not available for confidentiality reasons.

Supplementary information (SI) includes a table describing all participants’ engagement with peer feedback, which was used in data analysis. See DOI: https://doi.org/10.1039/d5rp00118h.

Acknowledgements

We would like to thank our student participants for their thoughtfulness and time. We thank John Zhou for helping to establish trustworthiness for this project and for helping to write the peer review comments. We also thank Stephanie Berg for her insight and help, especially when drafting the peer review comments. Finally, we thank the reviewers for their thoughtful feedback on this manuscript.

References

  1. Ajjawi R. and Boud D., (2018), Examining the nature and effects of feedback dialogue, Assess. Eval. High. Educ., 43(7), 1106–1119 DOI:10.1080/02602938.2018.1434128.
  2. Basso A., (2020), Results of a Peer Review Activity in an Organic Chemistry Laboratory Course for Undergraduates, J. Chem. Educ., 97(11), 4073–4077 DOI:10.1021/acs.jchemed.0c00373.
  3. Berg S. A. and Moon A., (2022), Prompting hypothetical social comparisons to support chemistry students’ data analysis and interpretations, Chem. Educ. Res. Pract., 23(1), 124–136 DOI:10.1039/d1rp00213a.
  4. Berland L. K. and Hammer D., (2012), Framing for scientific argumentation, J. Res. Sci. Teach., 49(1), 68–94 DOI:10.1002/tea.20446.
  5. Boud D. and Molloy E., (2013), Rethinking models of feedback for learning: the challenge of design, Assess. Eval. High. Educ., 38(6), 698–712 DOI:10.1080/02602938.2012.691462.
  6. Brooks C., Huang Y., Hattie J., Carroll A. and Burton R., (2019), What Is My Next Step? School Students’ Perceptions of Feedback, Front. Educ., 4(96), 1–14 DOI:10.3389/feduc.2019.00096.
  7. Carless D., (2006), Differing perceptions in the feedback process, Stud. High. Educ., 31(2), 219–233 DOI:10.1080/03075070600572132.
  8. Carless D. and Boud D., (2018), The development of student feedback literacy: enabling uptake of feedback, Assess. Eval. High. Educ., 43(8), 1315–1325 DOI:10.1080/02602938.2018.1463354.
  9. Cheng K. H., Liang J. C. and Tsai C. C., (2015), Examining the role of feedback messages in undergraduate students’ writing performance during an online peer assessment activity, Internet High. Educ., 25, 78–84 DOI:10.1016/j.iheduc.2015.02.001.
  10. Cho K. and MacArthur C., (2010), Student revision with peer and expert reviewing, Learn. Instr., 20(4), 328–338 DOI:10.1016/j.learninstruc.2009.08.006.
  11. Cho K. and Schunn C. D., (2007), Scaffolded writing and rewriting in the discipline: a web-based reciprocal peer review system, Comput. Educ., 48(3), 409–426 DOI:10.1016/j.compedu.2005.02.004.
  12. Cho K., Schunn C. D. and Charney D., (2006), Commenting on writing: typology and perceived helpfulness of comments from novice peer reviewers and subject matter experts, Writ. Commun., 23(3), 260–294 DOI:10.1177/0741088306289261.
  13. Creswell J. W. and Poth C. N., (2018), Qualitative Inquiry and Research Design: Choosing among Five Approaches, 4th edn, Thousand Oaks: SAGE Publications, Inc.
  14. Criswell B., (2011), Framing Inquiry in High School Chemistry: Helping Students See the Bigger Picture, J. Chem. Educ., 89(2), 199–205 DOI:10.1021/ed101197w.
  15. Cui Y. and Schunn C. D., (2024), Peer feedback that consistently supports learning to write and read: providing comments on meaning-level issues, Assess. Eval. High. Educ., 49(8), 1168–1181 DOI:10.1080/02602938.2024.2364025.
  16. Dong Z., Gao Y. and Schunn C. D., (2023), Assessing students’ peer feedback literacy in writing: scale development and validation, Assess. Eval. High. Educ., 48(8), 1103–1118 DOI:10.1080/02602938.2023.2175781.
  17. Duschl R., (2007), Quality argumentation and epistemic criteria, Argum. Sci. Educ., 35, 159–175 DOI:10.1007/978-1-4020-6670-2_8.
  18. Efklides A., (2006), Metacognition and Affect: What Can Metacognitive Experiences Tell Us About the Learning Process? Educ. Res. Rev., 1(1), 3–14 DOI:10.1016/j.edurev.2005.11.001.
  19. Efklides A., (2011), Interactions of Metacognition With Motivation and Affect in Self-Regulated Learning: The MASRL Model, Educ. Psychol., 46(1), 6–25 DOI:10.1080/00461520.2011.538645.
  20. Finkenstaedt-Quinn S. A., Milne S. L., Petterson M. N., Chen J. and Shultz G. V., (2024a), Student Experiences With Peer Review and Revision for Writing-to-Learn in a Chemistry Course Context, Writ. Commun., 41(4), 632–663 DOI:10.1177/07410883241263542.
  21. Finkenstaedt-Quinn S. A., Watts F. M. and Shultz G. V., (2024b), Reading, receiving, revising: a case study on the relationship between peer review and revision in writing-to-learn, Assess. Writ., 59, 100808 DOI:10.1016/j.asw.2024.100808.
  22. Finkenstaedt-Quinn S. A., Snyder-White E. P., Connor M. C., Gere A. R. and Shultz G. V., (2019), Characterizing Peer Review Comments and Revision from a Writing-to-Learn Assignment Focused on Lewis Structures, J. Chem. Educ., 96(2), 227–237 DOI:10.1021/acs.jchemed.8b00711.
  23. Gao Y., An Q. and Schunn C. D., (2023), The bilateral benefits of providing and receiving peer feedback in academic writing across varying L2 proficiency, Stud. Educ. Eval., 77, 101252 DOI:10.1016/j.stueduc.2023.101252.
  24. Goetz T., Lipnevich A. A., Krannich M. and Gogol K., (2018), Performance feedback and emotions, In Lipnevich A. A. and Smith J. K. (ed.), The Cambridge handbook of instructional feedback, Cambridge University Press, pp. 554–574 DOI:10.1017/9781316832134.027.
  25. Gragson D. E. and Hagen J. P., (2010), Developing technical writing skills in the physical chemistry laboratory: a progressive approach employing peer review, J. Chem. Educ., 87(1), 62–65 DOI:10.1021/ed800015t.
  26. Ha H., Chen Y. C. and Park J., (2024), Teacher strategies to support student navigation of uncertainty: considering the dynamic nature of scientific uncertainty throughout phases of sensemaking, Sci. Educ., 108(3), 890–928 DOI:10.1002/sce.21857.
  27. Hattie J., Gan M. and Brooks C., (2016), Instruction Based on Feedback.
  28. Hattie J. and Timperley H., (2007), The power of feedback, Rev. Educ. Res., 77(1), 81–112 DOI:10.3102/003465430298487.
  29. Ibarra-Sáiz M. S., Rodríguez-Gómez G. and Boud D., (2020), Developing student competence through peer assessment: the role of feedback, self-regulation and evaluative judgement, High. Educ., 80(1), 137–156 DOI:10.1007/s10734-019-00469-2.
  30. Ion G., Sánchez Martí A. and Agud Morell I., (2019), Giving or receiving feedback: which is more beneficial to students’ learning? Assess. Eval. High. Educ., 44(1), 124–138 DOI:10.1080/02602938.2018.1484881.
  31. Kuhn D. and Zillmer N., (2015), Developing Norms of Discourse, in Resnick L. B., Asterhan C. S. C. and Clarke S. N. (ed.), Socializing Intelligence Through Academic Talk and Dialogue, American Educational Research Association, pp. 77–86, https://www.jstor.org/stable/j.ctt1s474m1.9.
  32. Lewis S., (2018), Goal orientations of general chemistry students via the achievement goal framework, Chem. Educ. Res. Pract., 19, 199–212 DOI:10.1039/c7rp00148g.
  33. Lincoln Y. S. and Guba E. G., (1985), Naturalistic Inquiry, Beverly Hills, California: Sage Publications.
  34. Lipnevich A. A. and Smith J. K., (2022), Student – Feedback Interaction Model: Revised, Stud. Educ. Eval., 75, 101208 DOI:10.1016/j.stueduc.2022.101208.
  35. Liu J. and Hansen J. G., (2002), Peer Response in Second Language Writing Classroom, Michigan: University of Michigan Press DOI:10.3998/mpub.9361097.
  36. Madden B., Murphy A., Seery M. K. and Ryan B. J., (2024), You get what you give – Reciprocal Peer Review in a Chemistry Assignment using a VLE to promote insightful feedback, J. Chem. Educ., 101(4), 1766–1770 DOI:10.1021/acs.jchemed.3c01101.
  37. Marton F., (1986), Phenomenography – a research approach to investigating different understandings of reality, J. Thought, 21(3), 28–49.
  38. Michinov E. and Michinov N., (2001), The similarity hypothesis: a test of the moderating role of social comparison orientation, Eur. J. Soc. Psychol., 31(5), 549–555 DOI:10.1002/ejsp.78.
  39. Moon A., Zotos E., Finkenstaedt-Quinn S., Gere A. R. and Shultz G., (2018), Investigation of the role of writing-to-learn in promoting student understanding of light–matter interactions, Chem. Educ. Res. Pract., 19(3), 807–818 DOI:10.1039/C8RP00090E.
  40. Nelson M. M. and Schunn C. D., (2009), The nature of feedback: how different types of peer feedback affect writing performance, Instr. Sci., 37(4), 375–401 DOI:10.1007/s11251-008-9053-x.
  41. Nicol D., (2010), From monologue to dialogue: improving written feedback processes in mass higher education, Assess. Eval. High. Educ., 35(5), 501–517 DOI:10.1080/02602931003786559.
  42. Nicol D., (2021), The power of internal feedback: exploiting natural comparison processes, Assess. Eval. High. Educ., 46(5), 756–778 DOI:10.1080/02602938.2020.1823314.
  43. Nicol D., Thomson A. and Breslin C., (2014), Rethinking feedback practices in higher education: a peer review perspective, Assess. Eval. High. Educ., 39(1), 102–122 DOI:10.1080/02602938.2013.795518.
  44. Patchan M. M. and Schunn C. D., (2016), Understanding the effects of receiving peer feedback for text revision: relations between author and reviewer ability, J. Writ. Res., 8(2), 227–265 DOI:10.17239/jowr-2016.08.02.03.
  45. Patchan M. M., Schunn C. D. and Clark R. J., (2011), Writing in natural sciences: understanding the effects of different types of reviewers on the writing process, J. Writ. Res., 2(3), 365–393 DOI:10.17239/jowr-2011.02.03.4.
  46. Patchan M. M., Schunn C. D. and Correnti R. J., (2016), The nature of feedback: how peer feedback features affect students’ implementation rate and quality of revisions, J. Educ. Psychol., 108(8), 1098–1120 DOI:10.1037/edu0000103.
  47. Piccinno T. F., Basso A. and Bracco F., (2023), Results of a Peer Review Activity Carried out Alternatively on a Compulsory or Voluntary Basis, J. Chem. Educ., 100(2), 489–495 DOI:10.1021/acs.jchemed.2c00229.
  48. Scherr R. E. and Hammer D., (2009), Student behavior and epistemological framing: examples from collaborative active-learning activities in physics, Cogn. Instruct., 27(2), 147–174 DOI:10.1080/07370000902797379.
  49. Schwarz C. E., DeGlopper K. S., Greco N. C., Russ R. S. and Stowe R. L., (2024), Modeling Student Negotiation of Assessment-Related Epistemological Messages in a College Science Course, Sci. Educ., 109, 429–447 DOI:10.1002/sce.21914.
  50. Stankov L., Lee J., Luo W. and Hogan D. J., (2012), Confidence: a better predictor of academic achievement than self-efficacy, self-concept and anxiety? Learn. Individ. Differ., 22(6), 747–758 DOI:10.1016/j.lindif.2012.05.013.
  51. Starrett E., Jordan M., Chen Y. C., Park J. and Meza-Torres C., (2024), Desirable uncertainty in science teaching: exploring teachers’ perceptions and practice of using student scientific uncertainty as a pedagogical resource, Teach. Teach. Educ., 140, 104456 DOI:10.1016/j.tate.2023.104456.
  52. Tai J., Ajjawi R., Boud D., Dawson P. and Panadero E., (2018), Developing evaluative judgement: enabling students to make decisions about the quality of work, High. Educ., 76(3), 467–481 DOI:10.1007/s10734-017-0220-3.
  53. Topping K. J., (2009), Peer assessment, Theory Pract., 48(1), 20–27 DOI:10.1080/00405840802577569.
  54. Tsai Y. C. and Chuang M. T., (2013), Fostering revision of argumentative writing through structured peer assessment, Percept. Mot. Skills., 116(1), 210–221 DOI:10.2466/10.23.PMS.116.1.210-221.
  55. Urbanek M. T., Moritz B. and Moon A., (2023), Exploring students’ dominant approaches to handling epistemic uncertainty when engaging in argument from evidence, Chem. Educ. Res. Pract., 24(4), 1142–1152 DOI:10.1039/D3RP00035D.
  56. Watts F. M., Finkenstaedt-Quinn S. A. and Shultz G. V., (2024), Examining the role of assignment design and peer review on student responses and revisions to an organic chemistry writing-to-learn assignment, Chem. Educ. Res. Pract., 25(3), 721–741 DOI:10.1039/d4rp00024b.
  57. Winstone N. E., Nash R. A., Parker M. and Rowntree J., (2017a), Supporting Learners’ Agentic Engagement With Feedback: A Systematic Review and a Taxonomy of Recipience Processes, Educ. Psychol., 52(1), 17–37 DOI:10.1080/00461520.2016.1207538.
  58. Winstone N. E., Nash R. A., Rowntree J. and Parker M., (2017b), ‘It’d be useful, but I wouldn’t use it’: barriers to university students’ feedback seeking and recipience, Stud. High. Educ., 42(11), 2026–2041 DOI:10.1080/03075079.2015.1130032.
  59. Wu Y. and Schunn C. D., (2020a), From feedback to revisions: effects of feedback features and perceptions, Contemp. Educ. Psychol., 60, 101826 DOI:10.1016/j.cedpsych.2019.101826.
  60. Wu Y. and Schunn C. D., (2020b), When peers agree, do students listen? The central role of feedback quality and feedback frequency in determining uptake of feedback, Contemp. Educ. Psychol., 62, 101897 DOI:10.1016/j.cedpsych.2020.101897.
  61. Wu Y. and Schunn C. D., (2021a), From plans to actions: a process model for why feedback features influence feedback implementation, Instr. Sci., 49(3), 365–394 DOI:10.1007/s11251-021-09546-5.
  62. Wu Y. and Schunn C. D., (2021b), The Effects of Providing and Receiving Peer Feedback on Writing Performance and Learning of Secondary School Students, Am. Educ. Res. J., 58(3), 492–526 DOI:10.3102/0002831220945266.
  63. Wu Y. and Schunn C. D., (2023), Passive, active and constructive engagement with peer feedback: a revised model of learning from peer feedback, Contemp. Educ. Psychol., 73, 102160 DOI:10.1016/j.cedpsych.2023.102160.
  64. Yu Q. and Schunn C. D., (2023), Understanding the what and when of peer feedback benefits for performance and transfer, Comput. Hum. Behav., 147, 107857 DOI:10.1016/j.chb.2023.107857.
  65. Zhang Y., Schunn C. D. and Wu Y., (2024), Interconnecting peer feedback literacy: exploring the relationship between providing and acting on peer feedback, Stud. Educ. Eval., 83, 101411 DOI:10.1016/j.stueduc.2024.101411.
  66. Zong Z., Schunn C. D. and Wang Y., (2021), What aspects of online peer feedback robustly predict growth in students’ task performance? Comput. Hum. Behav., 124, 106924 DOI:10.1016/j.chb.2021.106924.
  67. Zwicky D. A. and Hands M. D., (2016), The Effect of Peer Review on Information Literacy Outcomes in a Chemical Literature Course, J. Chem. Educ., 93(3), 477–481 DOI:10.1021/acs.jchemed.5b00413.
