Mary Tess Urbanek,a Benjamin Moritzb and Alena Moon*a
aUniversity of Nebraska-Lincoln, Department of Chemistry, Lincoln, NE, USA. E-mail: amoon3@unl.edu
bBenedictine College, Department of Chemistry, Atchison, KS, USA
First published on 24th May 2023
While uncertainty is inherent to doing science, it is often excluded from science instruction, especially postsecondary chemistry instruction. There are a variety of barriers to infusing uncertainty into the postsecondary chemistry classroom, including ensuring productive struggle with uncertainty, evaluating student engagement with uncertainty, and facilitating engagement in a way that fits within the postsecondary chemistry context. In this study, we aimed to address these difficulties by designing an argumentation task that enables the direct observation of students interacting with epistemic uncertainty. This task was administered as a written assignment to a large-enrollment, second-semester general chemistry course. Student responses were analyzed to generate a rubric that captures the varied ways students grapple with epistemic uncertainty. In accordance with previous literature, we observed students not engaging with the uncertainty (e.g., generating vague, incomprehensible arguments) and selectively engaging with the uncertainty (e.g., using data selectively to avoid uncertainty). However, we also observed qualitatively distinct approaches that students utilized to productively manage epistemic uncertainty. Importantly, we believe that these ways of productively handling uncertainty translate to the kinds of scientific reasoning, personal decision making, and socioscientific reasoning that these learners will continue to engage in. Therefore, this work has implications for supporting students’ scientific argumentation by offering instructors a practical way to engage their students with uncertainty and a model to interpret and respond to their students.
To provide students with the opportunity to engage authentically with science, uncertainty must be integrated into the classroom and curriculum (Buck et al., 2014; Lee et al., 2014). Giving students the opportunity to engage with uncertainty in our classrooms exposes them to “the forms of uncertainty that scientists experience, and that in turn drive practices such as argumentation, explanation, and investigation in scientific activity” (Manz and Suárez, 2018). The National Research Council has explicitly outlined expectations regarding students’ ability to handle uncertainty and conflicting information in the classroom. They assert that students should be able to (1) recognize when data conflicts with initially held ideas or expectations, (2) recognize limitations of conclusions drawn from data, and (3) revise their initial understanding to incorporate new information (National Research Council, 2012). These standards set the expectation that as students progress through their science education, they should be developing skills to manage uncertainty (National Research Council, 2012).
Despite the explicit statement that students should be progressing in their understanding of managing uncertainty, science instruction in the classroom has continued to treat science as an entity that is final and completely certain (Osborne, 2014). Treating science as a final answer can be detrimental to perceptions of science, as it prevents students from viewing science as an approach to knowledge generation. Therefore, it is key for instructors to expose students to this uncertainty and help them productively grapple with it while engaging in science practices (Manz, 2018; Chen and Qiao, 2020; Chen and Techawitthayachinda, 2021; Chen, 2022; Watkins and Manz, 2022).
Not only does this uncertainty exist within the science educational context, it also permeates beyond the classroom. Socio-scientific issues (SSI) often carry a sense of uncertainty, as they involve both scientific and personal concerns (Evagorou et al., 2012; Alred and Dauer, 2020; Kelp et al., 2022). These issues often require students to evaluate data that conflicts with other pieces of evidence, which can further escalate their level of uncertainty (Evagorou et al., 2012; Novak and Treagust, 2018; Bolger et al., 2021). It thus becomes crucial for students to manage uncertainty not only within the classroom, but also in the public space. For example, during the COVID-19 pandemic, citizens had to assess the credibility of data sources, consider ethical and personal concerns, and evaluate scientific knowledge to make decisions about wearing masks and getting vaccinated (Kelp et al., 2022; Puig and Evagorou, 2023). Generating responsible citizens requires science education to help students foster an understanding of uncertainty and ways to manage it when generating an argument (Osborne, 2013, 2014; Manz, 2018).
There are several types of uncertainty that students may encounter while engaging in their science courses. One type of uncertainty that students must learn to grapple with is epistemic uncertainty. Epistemic uncertainty is defined as, “learners' subjective awareness and knowing of incomplete skills or abilities about how to explain a phenomenon, derive trends from muddled data, interpret and represent raw data as evidence, and put forth scientific arguments” (Chen and Qiao, 2020). Learning to productively manage epistemic uncertainty is a key competency for science students to develop, as without it they are limited in their ability to engage in science practices and make informed decisions about the world around them (Manz, 2015, 2018; Manz and Suárez, 2018; Chen et al., 2019; Chen and Qiao, 2020; Chen and Techawitthayachinda, 2021; Chen, 2022; Watkins and Manz, 2022).
Other literature has shown that students tend to rely on personal reasoning to navigate uncertainty rather than scientific knowledge (Chinn and Brewer, 1998; Evagorou et al., 2012; Mehl et al., 2020; Puig and Evagorou, 2023). For example, Puig and Evagorou (2023) found that pre-service teachers often relied on their own personal experiences to evaluate the credibility of a news headline about COVID-19 instead of the scientific knowledge they were learning in the course. These findings were consistent with Evagorou et al.'s (2012) investigation of how 12 to 13 year-old students in England generated claims about a particular SSI involving squirrel populations. They found that a student's previous experiences and personal beliefs offered a lens through which they interpreted data.
Chinn and Brewer (1993) have extensively studied how students interact with anomalous data in science education. They identified seven unique ways that students respond to anomalies, only one of which leads students to revise their theory to fully incorporate the new evidence. They found that students can ignore the data, reinterpret the data in such a way that it aligns with their current understanding, or provide an explanation that asserts the conflicting evidence is not relevant to their theory. In most of these responses, students avoid engaging with the conflicting evidence to maintain their pre-instructional theory.
There are several factors that influence the way in which a student interacts with conflicting evidence (Chinn and Brewer, 1993, 1998; Sundstrom et al., 2020; Phillips et al., 2021). Students’ understanding of the nature of science and uncertainty plays a role in determining their actions. For example, Phillips and colleagues (2018) found that students in a physics lab who viewed the purpose of lab as a confirmatory experience (i.e., that evidence collected in the lab was meant to confirm what they had been taught in lecture) typically avoided engaging with conflicting data. Additionally, students who hold epistemic ideas that learning science involves memorizing facts rather than making sense of the world will often avoid engaging with conflicting evidence when presented with it (Chinn and Brewer, 1993, 1998). Other influencing factors include how strongly a student holds onto an initial theory, what prior experience and knowledge a student has, and how the student views the data. For conflicting evidence to be meaningful to students, they must view it as valid and not explainable by their current understanding (Nussbaum and Novick, 1982; Chinn and Brewer, 1998). Additionally, when a new explanation is presented to the student, they must view it as an accessible and plausible explanation for the phenomenon (Nussbaum and Novick, 1982).
Argument-Driven Inquiry labs (ADI) have also given insight into how students engage with uncertainty in the post-secondary context (Walker et al., 2019). In these labs, students work in small groups to collect data and construct arguments about a particular concept. Once lab groups have constructed an argument, they then share their findings, evidence, and justifications with other groups. After presenting their own argument and hearing counterarguments, the students work together to revise their arguments. However, Walker et al. (2019) found that students struggle to revise their arguments during these labs. They completed semi-structured interviews with students in an ADI lab course to gain insight into students’ perceptions of argumentation competencies. These interviews showed that students had unique ways of interpreting what it meant to change their claim. For example, some students believed that changing a claim required new evidence to be collected, or that existing evidence had to be “manipulated” to align with a new claim. Very few students believed that changing a claim could be justified by the reinterpretation of existing evidence.
Previous research has shown that managing uncertainty is a challenging process for students across both K-12 and post-secondary contexts (Evagorou et al., 2012; Novak and Treagust, 2018; Chen et al., 2019; Walker et al., 2019; Bolger et al., 2021; Phillips et al., 2021). Across these studies, many students were unable to meaningfully reckon with uncertainty while engaging in science practices and instruction (Walker et al., 2019; Bolger et al., 2021; Phillips et al., 2021). Given the stress that recent educational efforts have placed on students authentically engaging in science in the classroom, we must continue to provide students with the opportunity to engage with uncertainty in argumentation (National Research Council, 2012; Buck et al., 2014; Lee et al., 2014). Additionally, we must provide students with instruction that equips them to deal with this uncertainty. To do so, we need to know the ways in which students can productively engage with uncertainty in a science practice context. This guides our research question for this study:
What are the dominant approaches students use to handle uncertainty when engaging in argumentation?
The goal of the task was to select optimal conditions to dehydrate onion slices based on three different onion qualities. There were three conditions the students considered: the temperature of dehydration, the thickness of the onion slice, and whether the onion slices should be treated with salt before dehydration. The three onion qualities—moisture content, browning, and flavor loss—were all measured with a data set that characterized the quality across the conditions (Mitra et al., 2014). Moisture content data showed the mass of water per mass of dry solid for the varying conditions. The browning and flavor loss data showed rate constants for the reactions responsible for the phenomenon across all possible conditions.
After reviewing each set of data, students are prompted to discuss the information they extracted and to provide reasoning for why the trends may be occurring. Once the students have reviewed the data for each of these onion qualities, they construct an argument explaining what they believe are the best conditions at which to dehydrate the onions and why. A figure depicting the task components is shown in Fig. 1.
The uncertainty for the students lies in the data itself. The different onion qualities indicate different conditions as being optimal. For example, the moisture content of the onion is optimal at high temperatures and thin thicknesses, but the other qualities of the onion are optimized at low temperatures and thick thicknesses. Students must reconcile this conflict when crafting their final argument. A full copy of the task can be found in the ESI,† Part B.
The task was administered in the course electronically as a regular homework assignment and points were awarded for completion. Because the task was administered as a homework assignment, it was open for a week's time. IRB approval was obtained and students submitted a consent form prior to completing the assignment. In total, 487 students consented to having their responses utilized for research purposes. During the period in which the assignment was open, the researcher held office hours for students to attend if they had any questions regarding the assignment. Additionally, students were given contact information for the researcher to ask any questions they had. Once the assignment concluded, six students were recruited for response process interviews. Recruitment for the interviews was completed via the homework assignment, where students volunteered to be contacted to schedule an interview. Participants were compensated with a $20 gift card for participating in the interview.
Once we had open-coded several responses, we met to compare our summaries and look for patterns within our notes that could help differentiate student reasoning patterns. To do this, we grouped students who omitted similar evidence, had similar misinterpretations, or prioritized similar variables together. Once the groups were sorted, we built descriptions that characterized the unique reasoning features present in the students' arguments. From these descriptions, we organized a preliminary rubric. To ensure that this preliminary rubric was appropriate, we pulled a new set of data that had the same conditions and tried applying the rubric to it. We then iterated through testing the rubric by coding new sets of data, discussing our coding, and refining the rubric until it reached stability.
A full breakdown of the IRR process, as well as each iteration of the rubric, can be found in the ESI,† Part C. We first began by independently coding fifty responses that neither researcher had seen before. Once these had been coded, we determined that constraining the code options based on the condition set the student selected did not always capture the type of reasoning the student employed. Therefore, we combined all the codes into one rubric.
We then pulled a new set of fifty student responses and coded them independently. For this round of coding, we calculated a Cohen's Kappa value of 0.558, which indicated a weak level of agreement (Watts and Finkenstaedt-Quinn, 2021). We discussed each disagreement until consensus was reached. We saw that the main reason for the disagreement came from confusion about what role the student's condition choice played in assigning a reasoning code.
Once this source of disagreement had been discussed, a new set of fifty responses was coded. This time, we calculated a Cohen's Kappa value of 0.631, which indicated a weak to moderate agreement (Watts and Finkenstaedt-Quinn, 2021). As we discussed our disagreements, we decided to make two changes to the rubric. First, we combined all the separate misinterpretation codes into one code, for two reasons: parsing out the misunderstandings did not give us any additional insight into students' reasoning, and it was difficult to determine the source of a student's misinterpretation(s). Second, we removed the “No Justification of Thickness or Temperature” code, as it described only omission rather than reasoning students were employing to grapple with uncertainty.
Once the rubric had been changed, a new set of fifty responses were coded by each researcher. We calculated a Cohen's Kappa value of 0.799 for this set of codes, which indicated a strong level of agreement (Watts and Finkenstaedt-Quinn, 2021). Once we had established our IRR, all remaining student responses were coded with the finalized rubric.
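The agreement statistic used throughout this IRR process can be reproduced with a short script. The sketch below is illustrative only: the coder labels and values are hypothetical, not the study's actual data. It shows how Cohen's kappa compares the observed agreement between two coders against the agreement expected by chance from each coder's marginal code frequencies:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters assigning categorical codes to the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labeled identically
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: product of each coder's marginal proportions, summed over codes
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical codes from two researchers for ten student responses
a = ["middle", "2of3", "vague", "2of3", "middle", "misint", "2of3", "vague", "middle", "2of3"]
b = ["middle", "2of3", "vague", "middle", "middle", "misint", "2of3", "vague", "2of3", "2of3"]
print(round(cohens_kappa(a, b), 3))  # 0.714
```

With these made-up labels the coders agree on 8 of 10 items, but kappa (0.714) is lower than the raw 0.8 because some of that agreement is expected by chance given how often each coder uses each code.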
“When considering the MC [moisture content, ESI,† Part B, Fig. 2] graph, the content in the 3 mm onions is only one and a half units higher for the 50- and 60-degrees onions than the 70 degrees onions. However, the rate of browning for the 70-degree onions is significantly higher than that of the 50-degree 3 mm onions. The 50-degree onions take the cake with their incredibly low rate of thiosulfinate loss, a solid 0.6 on the table for 3 mm treated onions. Overall, I believe that the following parameters will make for the best quality onions with the longest shelf life.” (Leek's written response)
In Leek's written response, we can see that they are selecting a temperature that would give good values for the browning and flavoring of the onion. However, it is not completely clear how they have decided on their thickness selection. During Leek's interview, they were asked to explain how they decided their condition set:
“For the one millimeter thick onions, I noticed that the rate constants were extremely high. At this point, I was very sure that the one millimeter onions weren’t going to be a viable solution to drying the onions because it failed in the browning charts [ESI,† Part B, Table 1]. And it also failed over here [ESI,† Part B, Table 2]. So, I started looking back and forth between the three and five millimeters slices. And at this point, I think I kind of like, made a guess to go with a certain middle ground… like three millimeters… because for treated samples it was lower for the flavor and for the browning constants.” (Leek's response process interview)
Here we can see that Leek further elaborates on their reasoning for selecting 3 mm for their thickness. They expressed worry that selecting the 1 mm thickness, which is the most favorable for moisture content, would create issues for the browning and flavor loss rates. Therefore, they decided that selecting an average value was the best way to address the conflict. Leek's interview helped us to realize that a student could utilize multiple types of reasoning to construct their argument for the best conditions and allowed us to further clarify the patterns we were seeing in the student responses.
“I chose these conditions because at 70 [°C] there was less water left and the smaller it was the faster it got dehydrated, I said not to add salt because salt preserves food and we don’t want to preserve the onions.” (Chive)
The other common misinterpretation pertained to the meaning of the data. This included misinterpreting the moisture content bar graph, such as saying the higher the values were, the lower the moisture content was. It also included when a student misinterpreted the meaning of the rate constant, such as when a response said that a higher rate constant meant a lower overall rate. This is shown in Pearl's response, who selected 70 degrees, 1 mm, untreated onions.
“I think this because it had the highest rates for both browning and moisture content. This means that it will brown slower and lose [flavoring] slower. The moisture content was also low but not the lowest. The lowest is the 1 mm treated at 70 degrees. I still picked these parameters because it had a big difference in rates for browning and flavoring.” (Pearl)
Because Pearl believed that a higher rate constant meant a slower rate, they believed they were picking a set of conditions that benefitted all three onion qualities. This removed the conflict in the data, thus removing the uncertainty that they would have had to engage with otherwise.
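For reference, a rate constant scales the rate of the underlying reaction, so a larger k means faster browning or flavor loss, not slower. Assuming simple first-order kinetics purely for illustration (the task's tables report rate constants without our specifying a rate law here):

```latex
\frac{d[\mathrm{A}]}{dt} = -k\,[\mathrm{A}]
\quad\Longrightarrow\quad
[\mathrm{A}](t) = [\mathrm{A}]_0\, e^{-kt}
```

Under such a model, the quality compound A decays more steeply for larger k, which is why inverting the meaning of the rate constant, as Pearl did, dissolves the conflict in the data.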
Another example of no engagement occurred when a student's reasoning was vague. A student was assigned a vague code when the student selected a variable (temperature, thickness, or treatment) to dehydrate the onion instead of a condition set (such as 70 degrees, 1 mm, treated). The vague reasoning code was also used to capture responses that did not offer any insight into the student's reasoning for selecting a condition set. In these situations, students typically just restated the goal of the assignment and did not cite any evidence to support their final conclusion.
“These parameters are the best because the rate of browning was the second lowest at 0.74. The rate of thiosulphinate loss was the lowest with all of these parameters at 0.2.” (Cipollini)
The condition set that Cipollini chose had one of the highest moisture contents for the onion, which was not ideal. However, we do not have evidence to suggest that Cipollini recognized this as a limitation for their condition set.
The second way students can exhibit selective engagement is by taking a compartmentalized approach in their argument. This type of reasoning occurs when a student considers how a particular condition will affect an onion quality favorably but does not discuss any of the negative aspects of selecting a particular condition. This reasoning differs from the neglecting codes in that the student will discuss all three qualities of the onion in their response. Shallot employed this reasoning to justify the conditions 70 °C, 5 mm, treated onions.
“Higher temp = lower moisture content, Thicker cut = less browning, [treatment] = less thiosulphinate [flavor] loss.” (Shallot)
Here we can see that Shallot does not discuss any of the drawbacks of selecting the high temperature or high thickness slices. Additionally, they are thinking about how a particular condition will affect only one quality, instead of considering the effect each condition will have on each of the onion's qualities. By doing this, Shallot effectively avoided discussing the conflicting data in their response.
“I believe these are the best parameters to dehydrate the onions because treated 1 mm slices at 70 °C retain only approximately 2 kg water per kg dry solid, which allows for the onions to have a longer shelf life and preserve a higher quality. Even though it has a 0.346 rate constant for browning and a 7.0 rate of thiosulphinate loss, I would prioritize a longer shelf life since it creates more time to sell the units.” (Scallion)
The data the students considered in the task indicated very similar conditions as being ideal for the browning and flavoring of the onion. Therefore, students who selected conditions to prioritize those qualities at the expense of moisture content received a 2 out of 3 code. This was the type of reasoning that Allium utilized in their response.
“Going off the data, it checks off 2 out of the 3 boxes that we are looking for in terms of highest quality. For moisture content, it does not provide the best moisture and is in fact one of the lowest. However, when examining the browning constant rate, it has a rate of 0.074, which is among the lowest and one of the longest to brown. Finally, for the factor that gives onions their flavor, it has a loss rate of 0.2, which is the best of any data presented on the table.” (Allium)
Some students tried to handle the conflict by selecting conditions that made up for where another condition fell short. For example, some students selected a higher thickness, which benefitted browning and flavoring, to make up for the negative effects that a higher temperature had on those qualities. Responses that utilized this type of reasoning were coded with the balancing positive and negative code. This is the type of reasoning that Hagyma used.
“While the high temperature does increase the rate of browning and thiosulphinate loss, it also removes significantly more moisture than lower temperatures, making the final product more shelf stable. Using a thicker slice and salt treatment will help to counteract the browning and thiosulphinate loss.” (Hagyma)
Some students will take this a step further and consider how much a particular condition choice affects one of the onion's qualities. For example, the moisture content figure the students considered shows a large decrease in moisture between 60 and 70 °C (ESI,† Part B). Some students used this drastic decrease in moisture content between the two temperatures as evidence to further justify their condition selection. When students considered this difference between data points in their reasoning, they earned a variance code. Basila considered the differences between data points as justification in their argument.
“Based on Fig. 2 [ESI,† Part B, Fig. 2], going from just 60 degrees to 70 makes a large difference in the moisture content for all groups in every thickness and treatment. Because a lower moisture content is better for the shelf life and quality of product, it is very important that we choose 70 degrees. While Tables 1 and 2 [ESI,† Part B, Tables 1 and 2] do show a higher rate of browning and thiosulphinate loss for higher temperatures of dehydrating, the difference of moisture content in Fig. 2 [ESI,† Part B, Fig. 2] just shows way too big of a difference to use 50 or 60 degrees for dehydration.” (Basila)
Another way students approached addressing the conflicting data was to select conditions that gave an average value for all three onion qualities. In these situations, students opted to not favor a particular quality over another, thus receiving a middle ground code. Vidalia took this approach to decide on a condition set.
“I believe that the best way [is] to do a 3 mm treated onion slice for the dried onions because it is a happy medium of flavor and dehydration. If we went with the 1 mm sample it wouldn’t be as flavorful while having slightly less dehydration than 3 [mm]. If we went with 5 mm, the flavor would be very good but it would also have a shorter browning time than 3 mm.” (Vidalia)
Some students used a combination of the previously outlined reasoning codes to get to their final response. In these situations, students would use one type of reasoning to decide on the onion's temperature but another to decide on its thickness. The first combination present in this data set involved students who both prioritized moisture content and took a middle ground approach. We can see this in Pyaaz's reasoning for selecting treated onion slices that were dehydrated at 70 °C and were 3 mm thick.
“The onions should be dehydrated at 70 degrees Celsius due to these onions significantly scoring better on the moisture test, whilst lagging slightly further behind in the other quality parameters. The onions should be cut to be 3 mm thick, because of the significant improvement in dehydration, [but] with thinner onion slices the rates of spoilage and flavor degradation are higher.” (Pyaaz)
The other combination of reasoning patterns students used was combining 2 out of 3 reasoning with middle ground reasoning. This code is similar to the prioritizing MC/middle ground code in that the students prioritize one quality with one choice and take a middle ground for another. The difference between the two codes lies in what qualities the student decides to prioritize more. We can see this reasoning play out in Bulb's response below.
“I believe 60 °C is the best temperature because it is a moderate temperature and at too high of a temperature, the browning and thiosulphinate loss rates are too high, and at too low of a temperature, the moisture content is too high. In addition, it is best to treat the onions because for all three quality parameters, the treated onions were better than the untreated. Lastly, 5 mm is the best thickness because, although it was the worst thickness based on moisture content, it was the best thickness based on browning and thiosulphinate loss rates, which are more direct predictors of spoilage rate than moisture content.” (Bulb)
A breakdown of the frequency of each reasoning pattern can be seen in Fig. 2. The most common reasoning pattern came from students who had misinterpretations within their analysis. The second most common pattern came from students who prioritized the browning and flavoring, which was captured by the two out of three code.
Fig. 3 shows the number of students who did not engage, selectively engaged, or fully engaged with the uncertainty. We can see that a majority of students fully engaged with the uncertainty in the task and demonstrated understanding of the limitations and benefits of selecting their particular condition set. The next most frequent level was no engagement, with many of these students holding a misinterpretation that made it difficult to engage with the data. Finally, nearly 100 students omitted discussion of the conflict, as seen in the frequency of selective engagement in Fig. 3.
Fig. 3 Frequency of the engagement levels students utilized in their responses. The colors in this figure correspond to the codes in Fig. 2.
Fig. 4 Sankey diagram depicting condition choices and the various reasoning patterns students utilized to justify their decisions. Condition sets that fewer than ten students selected are omitted from the diagram. The colors in this diagram correspond to the colors in Fig. 3.
Additionally, we can see that students engaged with the uncertainty to varying degrees to arrive at their condition selection. For example, some students who selected 50 °C, 5 mm, treated slices did not engage with the uncertainty because they held misinterpretations about the data or task. Other students in this condition set selectively engaged with the uncertainty by omitting discussion of the onion's moisture content, and others productively engaged with the uncertainty to earn a two out of three code.
We can see that there are common reasoning patterns within each condition set. For example, students who opted for 60 °C, 3 mm, and treated slices typically arrived at that answer by taking a middle ground approach. Another condition set that has common reasoning patterns is the 70 °C, 5 mm, treated slices. Here we see that students typically arrived at this answer by taking a compartmentalized approach to the data, trying to strike a balance between the onion qualities, or considering the variance between data points.
We can also see that students may decide on different condition sets even when they utilize the same reasoning patterns. For example, students who used 2 out of 3 and middle ground reasoning often came to two different conclusions about the best conditions: some selected 60 °C, 5 mm, treated slices, while others selected 50 °C, 3 mm, treated slices. The difference in condition set depended on where the student applied the middle ground reasoning and where they applied the 2 out of 3 reasoning. Students who selected 60 °C, 5 mm, treated slices typically took a middle ground approach to justify the temperature and a 2 out of 3 approach to justify the thickness. Students who selected 50 °C, 3 mm, treated slices typically took the 2 out of 3 approach for the temperature and the middle ground approach for the thickness. While either condition set can be reached with the same types of reasoning, the way in which a student employs the reasoning to come to a final decision may differ.
There was a substantial subset of strategies that were rather unproductive, in which students misunderstood components of the task or generated vague reasoning. In this study, we viewed these forms of reasoning as barriers to meaningful participation in the argumentation and data analysis. This allows us to consider how instructors could offer feedback that empowers the learner to engage more meaningfully. For misunderstandings, this feedback can simply and directly correct the specific misunderstanding or prompt the learner to revisit the data. Rather than interacting with vague reasoning as though it is strongly held or representative of misunderstandings, we believe that simply prompting the learner to try again, by messaging that their reasoning did not make sense, could guide the learner to more productive reasoning. Further, rather than messaging that the student's answer was wrong or incorrect, which are not useful epistemic criteria for argumentation, this approach offers an epistemic criterion for argumentation as the reason for revision (i.e., the reasoning did not make sense).
The primary novel contribution of these findings is the variety of ways students grappled with the conflict and resulting uncertainty. Critically, these students recognized the conflict and reckoned with it. This contrasts with findings about conflicting data in which students can dismiss, ignore, and rationalize away conflict (Evagorou et al., 2012; Novak and Treagust, 2018; Walker et al., 2019; Bolger et al., 2021; Phillips et al., 2021). We view some of these kinds of reasoning as transferrable to socioscientific argumentation (e.g., 2 out of 3, middle ground). For example, a reasoner could prioritize lowering global temperatures and ocean health over energy demands to argue for limiting CO2 emissions using 2 out of 3 reasoning (Chen et al., 2019). We further want to highlight ‘variance’ reasoning because these learners demonstrated deeper analysis than their peers. Where their peers considered all three trends in the data, these students attended to the magnitude of the trends. Specifically, they identified differences between data points to make comparisons between data sets. We view this level of analysis as a target mode to guide learners to. In follow-up interviews, we saw learners move towards this deeper analysis with relatively simple prompts: “what is your justification for prioritizing these two qualities?”
The ways of reasoning revealed in this study can enable tailored and specific feedback that will support learners in developing data analysis skills. Further, this task utilizes a relatively neutral context that can be implemented in a variety of chemistry classes to engage students with conflict and uncertainty. We have administered this task as a stand-alone assignment and paired with peer review (Berg and Moon, 2022). Both forms of administration have their advantages. Preliminary evidence from our research program is revealing that peer review can facilitate the prompting to deeper analysis that we have highlighted as a major implication for practice above. Finally, we believe that administering this task can offer epistemological messaging that represents scientific reasoning as using evidence that all people are entitled to engage in, rather than representing scientific reasoning as arriving at the correct answer and only accessible by those who do so (Chen et al., 2019).
We believe this task can fit in a wide variety of chemistry coursework, but it is important to recognize that it is a reading- and writing-heavy task. The task was administered at an institution where most students speak English as their first language, so there remains an open question about the accessibility of this task for learners whose native language is not English. Future work and future users of the task should consider their unique students and what modifications to the task may best meet their students’ needs.
Footnote
† Electronic supplementary information (ESI) available. See DOI: https://doi.org/10.1039/d3rp00035d
This journal is © The Royal Society of Chemistry 2023