A characterization of chemistry learners’ engagement in data analysis and interpretation

Stephanie A. Berg and Alena Moon *
Department of Chemistry, University of Nebraska, Lincoln, Nebraska, USA. E-mail: amoon3@unl.edu

Received 27th May 2022, Accepted 12th August 2022

First published on 17th August 2022


Abstract

Both graph comprehension and data analysis and interpretation are influenced by one's prior knowledge and experiences. To understand how one's prior knowledge and experiences interact with their analysis of a graph, we conducted think-aloud interviews with general chemistry students as they interpreted a graph to determine optimal conditions for an experiment. Afterwards, students engaged in a simulated peer review by reviewing three sample responses, which further revealed their reasoning. We deconstructed students’ analyses using Data-Frame Theory to identify the prior knowledge and experiences that informed and guided their analysis, as well as to characterize moments in which their analysis was influenced by different sources of information. Using template analysis, we present and discuss four themes: establishing the frame, observing and interacting with the data, data-frame interactions, and when frames change. From these findings, we discuss implications for utilizing students’ prior knowledge and experiences to aid in their data analysis and interpretation, as well as identify opportunities for future research.


Introduction

Reforms in science and chemistry education have emphasized the need for STEM students to engage in science and engineering practices (Talanquer and Pollard, 2010; National Research Council, 2012; Cooper and Klymkowsky, 2013). At the K-12 level, the Next Generation Science Standards in the United States have effectively outlined competency across eight science practices for various grade bands (National Research Council, 2012). While it is critical for undergraduate students to develop competency in these practices to be scientifically literate and prepared for future careers in STEM (Cooper et al., 2015), research is required to define competency and outline how it develops. As part of this research, we aim to characterize how undergraduate chemistry students engage in the science practice of data analysis and interpretation.

Studies across science education, educational psychology, and chemistry education have shown that prior knowledge and experiences influence students’ data-based reasoning. Accessing (or not accessing) certain prior knowledge and experiences can affect the features and patterns that one notices in a data set (Pinker and Feedle, 1990; Shah and Carpenter, 1995; Friel et al., 2001; Shah and Hoeffner, 2002; Jeong et al., 2007; Heisterkamp and Talanquer, 2015). Relevant prior knowledge is also necessary to tie patterns found within the data back to the phenomenon under study (Shah and Carpenter, 1995; Latour, 1999; Shah and Hoeffner, 2002; Lai et al., 2016). In the absence of the necessary content knowledge, many students will rely on heuristics and intuition or neglect to use reasoning entirely (Heisterkamp and Talanquer, 2015; Becker et al., 2017; Masnick and Morris, 2022). For more advanced scientists, this prior knowledge is key because with it, the scientist will contextualize their interpretations in the broader scientific context; that is, they consider hypotheses, theory, experimental design, and implications to draw conclusions (Angra and Gardner, 2017). To support the development of this integration between relevant prior knowledge and science practice engagement for younger scientists, we need to understand how these two domains interact.

To that end, this study seeks to investigate how general chemistry students use data to determine optimal conditions for an experiment. We use Data-Frame Theory to model the dynamic interactions between a student's prior knowledge and experiences and their analysis of a graph as they engage in a data analysis and interpretation task. Our study is specifically guided by this research question:

How do general chemistry students’ prior knowledge and experiences interact with their graph analysis during data analysis and interpretation?

Background

Data analysis and interpretation

When encountering data, an individual must first encode the visual features and identify the features that are important (Carpenter and Shah, 1998; Shah and Hoeffner, 2002; Glazer, 2011; Zagallo et al., 2016). Once the important visual features are encoded, any relevant patterns or relationships must be identified within the data representation (Ratwani et al., 2008; Zagallo et al., 2016). Not being able to differentiate between relevant and irrelevant information is one of the many challenges for data-based reasoning (Kanari and Millar, 2004; Jeong et al., 2007). In addition to this, students may use select data to form conclusions or predictions, often explaining away the data that contradicts their theory (Chinn and Brewer, 1993, 2001; Meister and Upmeier Zu Belzen, 2021). After the important patterns have been identified, they must be connected back to the phenomena or concepts modelled in the representation (Latour, 1999; Shah and Hoeffner, 2002; Glazer, 2011). Both identifying relevant information and connecting the information back to a phenomenon are processes that are largely influenced by an individual's prior knowledge and experiences with the type of data representation and the phenomenon being considered. Graph schemas aid in identifying the kinds of relationships represented in the graph (Pinker and Feedle, 1990). Prior knowledge of the phenomenon modelled in the graph can also “unlock” more expert-like reasoning and comprehension (Roth and Bowen, 2000). Not having ready access to the content knowledge of the phenomenon could affect what visual features and patterns are identified within the graph (Shah and Hoeffner, 2002). This is also true of analysis with other forms of empirical data in representations such as data tables (Jeong et al., 2007; Masnick and Morris, 2022).

Many of the challenges identified for students’ data-based reasoning in chemistry education echo those of science education and psychology. In a study investigating students’ use of initial kinetic rates data in constructing rate laws, Becker and colleagues found that some students neglected using some of the empirical data provided or even neglected to use the data entirely (Becker et al., 2017). They also found that many students used unproductive reasoning and heuristics when forming their model. We posit that the use of this reasoning could potentially come from students’ lack of relevant content knowledge of the phenomena they were modelling. In another data analysis and interpretation study, Heisterkamp and Talanquer (2015) used a case study approach to explore a participant's analysis of boiling point data and ionization energies. The authors identified the use of “hybridized” reasoning wherein the participant used a mix of intuitive ideas and chemical content knowledge to form explanations for trends in the data. The participant also relied on explicit surface features of data to form explanations. For example, the participant tried to explain the increasing boiling points of substances by counting the number of atoms in each molecule and calculating the differences in masses of compounds. In both studies, chemistry students’ reasoning seems to be influenced by the prior knowledge and experiences that they use to analyse the data. Additional work in biology education and science education has further supported this (Jeong et al., 2007; Angra and Gardner, 2016, 2017).

The work in graph comprehension, data analysis and interpretation, and chemistry education has continually found that prior knowledge and experiences affect students’ analyses with data representations (Pinker and Feedle, 1990; Carpenter and Shah, 1998; Roth and Bowen, 2000; Jeong et al., 2007; Heisterkamp and Talanquer, 2015; Masnick and Morris, 2022). However, little work has considered how prior knowledge and experiences interact with one's data-based reasoning. Therefore, we sought to bridge this gap in the literature by characterizing the processes by which chemistry students engaged in data analysis and interpretation using real data. We specifically aimed to account for their prior knowledge and experiences that they used to make sense of the data.

Data-frame theory

The science practice of data analysis and interpretation can be viewed as a sensemaking process (Chen and Terada, 2021). Raw empirical data must be manipulated, organized, and interpreted by the scientist to generate meaning (National Research Council, 2012). One kind of data representation that all scientists encounter, regardless of what field they are in, is the graph. Numerous studies have characterized the graph comprehension process (Shah and Carpenter, 1995; Carpenter and Shah, 1998; Friel et al., 2001; Ratwani et al., 2008), but it is typically detached from science practice. These studies offer little insight into how an analyst's prior science knowledge and science practice competency interact when they analyse and interpret data. With the specific aim of understanding this interaction, we use the Data-Frame Theory of sensemaking (Klein and Moon, 2006; Klein et al., 2007).

Data-Frame Theory asserts that analysts concurrently construct data (observations) and frame (reasoning), and that the data and frame inform one another. The frame is an explanatory structure that helps to account for pieces of data by describing their relationships to other data in the environment. In this way, the frame serves as a lens for making observations and assigning meaning to those observations. Frames can take the structures of stories, scripts, maps, or plans, and are synonymous with schemas (Hammer et al., 2004; Klein et al., 2007; Gouvea et al., 2019).

The other component of Data-Frame Theory is data. Data is the information extracted from the environment. Within chemistry, data would be considered the empirical data collected from experiments, but we also consider other sources of information, such as experimental schemas, directions, molecular structures, or events, to be data.

Data and frames tend to interact in a cyclic pattern. A person begins by encountering data in an environment. Certain points of data or key features within the data serve as anchors that help to elicit a frame. This frame can then guide how the analyst makes sense of the data. The frame is influenced by the analyst's prior knowledge and experience with similar data. The frame can also be influenced by whatever goals the person may have associated with the data (e.g., a hypothesis). Once the frame is established, a person can begin to search for more information. During this search the frame “filters” incoming information to help a person seek more relevant information to aid in the sensemaking. To account for the incoming information, the frame can be elaborated and extended to fill in whatever gaps were not originally accounted for, as illustrated in Fig. 1.


Fig. 1 Data-frame theory model of sensemaking modified from Klein and Moon (2006).

During the sensemaking process, it is likely that an individual will encounter data that is inconsistent with their frame. When this happens one can begin to question their frame and note where the data violates expectations generated from the frame. After these gaps are exposed, the frame can be maintained through two processes: preserving the frame or elaborating the frame. To preserve the frame, the anomalous data can be explained away or disregarded. This aligns with many other studies within science education and discipline-based education research, which have found that many learners will discount data that contradicts their mental model (Chinn and Brewer, 1993, 2001; Meister and Upmeier Zu Belzen, 2021). If the contradictory data is accepted, one can engage in elaborating their frame. During this process, the frame is expanded and extended to account for the new information. Although the frame might be undergoing changes, the frame's integrity is still intact and its key anchors are maintained.

If the anomalous data is accepted and cannot be integrated into the frame, a person may decide to disengage from their current frame and construct a new frame. In this reframing process, new anchors will be searched for to establish the new frame. Data that was previously overlooked or discarded may now be considered and interpreted from a new perspective as well.
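To make the branching in this cycle concrete, the sketch below schematizes it as a small Python function. It is our illustrative paraphrase of the model described above, not a formalism provided by Klein and colleagues; all function and parameter names (elicit_frame, is_consistent, and so on) are hypothetical placeholders.

```python
# Illustrative sketch of the data-frame cycle described above; the names and
# structure are our own paraphrase, not part of Klein and colleagues' work.

def sensemaking_cycle(observations, elicit_frame, is_consistent,
                      prefers_to_preserve, elaborate, reframe):
    """Maintain a frame while working through a stream of observations."""
    observations = iter(observations)
    frame = elicit_frame(next(observations))   # key features act as anchors for an initial frame
    for obs in observations:
        if is_consistent(frame, obs):
            continue                           # the frame filters and absorbs the data
        if prefers_to_preserve(frame, obs):
            continue                           # preserve: explain the anomaly away or disregard it
        updated = elaborate(frame, obs)        # elaborate: extend the frame, keeping its anchors
        frame = updated if updated is not None else reframe(obs)  # reframe: search for new anchors
    return frame
```

The three branches inside the loop correspond to the preserving, elaborating, and reframing responses to anomalous data described above.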

At times an individual might compare or even consider multiple frames to make sense of the data. Klein and colleagues (2007) estimate that a person can use up to three frames simultaneously. We liken using multiple frames in sensemaking to observing the data from multiple perspectives. Each frame has its own unique anchors, allowing the individual to pick up on different aspects of the data specific to the perspective they are considering. Students can also compare the frame that they are using to another frame they encounter. This can potentially cause students to disengage from their initial frame and adopt the other if they find the new frame compelling enough.

We posit that for an individual to consider and compare a frame against their own, they likely need to engage in decentring from a central perspective. Decentring involves recognizing alternative perceptions of and reasoning about the same problem or situation (Piaget, 1955). To consider and compare frames, an individual must recognize alternative perspectives of the same data, which means they are recognizing alternative perceptions of the same problem. Distinguishing between perspectives in this way has been shown to support more productive argumentation (Moon et al., 2017) and support student–teacher interactions (Teuscher et al., 2016).

Data-Frame Theory also asserts that experts and novices engage in data-based reasoning in similar ways. The primary difference between an expert's analysis and a novice's analysis is what mental models and prior knowledge the individual can access. Experts possess richer collections of conceptual knowledge and more experiences that can inform their frame when engaging in sensemaking. By having richer collections of knowledge and experiences, experts can access more frames to use when engaging in sensemaking. This also allows experts to find deeper meaning in data that is presented to them, whereas novices tend to produce more shallow conclusions. Data-Frame Theory helps to identify what pieces of knowledge and experiences inform an expert's frame, leading to more productive sensemaking and sophisticated conclusions.

We propose using Klein's Data-Frame Theory because it enables us to deconstruct an analysis, as well as document interactions between frames and data. Other studies have shown the utility of using a frame theory to explore how one's frame can affect the ways in which one reasons about a problem (Hammer et al., 2004; Slominski et al., 2020). Similarly, Data-Frame Theory can help to identify the different prior knowledge and experiences a student uses to guide their analysis of a data set. It can also identify the data features that students used to reach a conclusion after consulting the data set. Data-Frame Theory can also characterize more complex interactions between data and one's frame. Inconsistent and anomalous data or information that challenge a scientist's initial ideas and hypotheses can be explored and characterized. Additionally, data analysis and interpretation, like other science practices, is an iterative and dynamic process. Scientists’ previous experiences with types of data and knowledge of the content they are studying greatly influence how they interact with their own data. Much of their data analysis process is informed and guided by the scientists’ expertise. When they encounter anomalous data, scientists must consult their prior knowledge and decide how they plan to address it. In this work, we are particularly interested in accounting for these interactions to characterize the different processes that a chemistry student engages in while using data from a graph to reach a conclusion.

Data collection and analysis

Interview protocol

The semi-structured interviews used for this study consisted of two stages. In the first stage, students determined the optimal reaction conditions for an experiment using a graph. The task is a scaffolded version of the experimental decision made by Doidge and colleagues, who determined which concentration of HCl would best isolate gold in a dual-phase extraction from waste electronic equipment (Doidge et al., 2016). Students were specifically tasked with using the graph from Doidge and colleagues, shown in Fig. 2, to choose which concentration of HCl would extract a maximal amount of gold and a minimal amount of tin and iron. They chose between 0 M, 2 M, and 4 M. Throughout this part of the interview, students were encouraged to annotate the graph using the Zoom annotation feature to help explain their reasoning, especially when referring to specific parts of the graph. At the end of this interview phase, students wrote a summary of their verbal analysis, and were told to include whatever amount of detail and evidence they deemed necessary to convince someone else of their choice. The written summaries were typed into the chat function of the Zoom call to be referenced later in the interview.
Fig. 2 Screen capture of Fernald's graph.

In the second stage of the interview, students engaged in a simulated peer review, in which they gave feedback on three sample responses and compared them to their own (Berg and Moon, 2022). Each sample response was pre-constructed to support one of the three choices for the task. The responses also had different argumentative flaws embedded for students to identify and critique. During the simulated peer review, students compared their own analysis to that of the sample response and gave feedback meant to improve the response. This comparison helped to indirectly probe the student's own analysis as well, especially if they mentioned that the analysis of a sample response was similar to their own. During the simulated peer review, several students expressed feeling less confident in their selection of which concentration of HCl to use for the task. To remedy this, students were given the opportunity to make changes to their work. If students made these changes during their interview, they were asked how their response had changed from before the peer review.


Fig. 3 Screen capture of Gregor's graph.

Sampling

This study took place at a large, Midwestern university in fall 2020. Institutional IRB approval was obtained before recruiting for interviews. Participating students (N = 18) were from general chemistry I and II courses for both majors and non-majors. Students submitted consent forms electronically before their interviews. After the interview, students were compensated with a $20 gift card for their participation.

Data collection

All interviews were conducted remotely via Zoom. Each interview was recorded within the application, and the recordings were used to generate transcripts of the interview. The video recordings of the interview were kept and used in the analysis for visual reference, especially when students annotated the graph in reference to something. All transcripts and videos have been deidentified and pseudonyms have been given to participants. Any stills from the video recordings consist of only the graph and the student's drawing; any reference to the student's name has been cropped out of the image when used in analysis.

Analysis

We analysed transcripts from both stages of the interview. Particular attention was paid to the second stage if students changed their answer to a different concentration of HCl. We believe that this was evidence of students engaging in the task with a new or elaborated frame.

Because our research question focuses on the processes that students engage in when analysing a graph, we used both process coding and open coding (Miles et al., 2014). Process codes helped to identify different actions that the students took at different times during the task. The codes were developed to describe actions that the student consciously described doing as well as actions that they may not have been aware of. The open codes developed at this stage were used to capture the different features of the graph that students were using as well as the different conceptual and everyday ideas that they used to help with the task.

After open coding, we employed template analysis, a type of thematic analysis, to organize codes (Brooks et al., 2015). We began by sorting open codes into two a priori categories: the data category and the frame category. Codes that had anything to do with information that could be found in the graph were sorted into the data category. These codes included instances in which students pointed out peaks within the graph or comparisons students may have made between different parts of the graph. Codes that focused on reasoning with information external to the graph or used information from the prompt were sorted into the frame category. Many of these codes captured some set of goals or objectives that students mentioned wanting to accomplish to fulfil the task. These are assumed to be parts of a student's frame that helped guide their analysis.

In the second round of template analysis, we noticed that some of the codes sorted into the data category also had elements of a frame to them. These codes did have elements of a graph feature included in them, but there was also some sort of opinion or evaluation being made with the data. For example, in his comparison between the increases of gold and other metals between 2 M and 4 M, Fernald said, “And this increase [in other metals] is not compensated for the increase in gold accumulated.” This part of his analysis does include directly observable information from the graph (the increases), but there is also an element of value or meaning ascribed in the comparison that is not directly observable (the compensation of one increase over the other). We infer that assertions and evaluations like Fernald's showcase an interaction between the student's frame and the data they are encountering; thus, we constructed a new coding category to capture them.

After using template analysis to organize a student's analysis, we developed a coding scheme in which the analysis is followed and deconstructed according to three categories: the frame, the data, and the data-frame interactions. To identify the student's frame, we identified key anchors that seemed to guide their analysis. We identified these anchors by considering three things: how a student, explicitly or implicitly, defined minimal or maximal for the task; what concentrations of HCl were being considered (i.e., if a student constrained their focus to one area of the graph or used data from beyond the designated choices to inform their choice); and what metals were being considered. We also tried to identify any outside knowledge, whether conceptual ideas or previous experiences, that seemed to inform or guide parts of the frame. Next, we identified the data being used by the student during the think-aloud interview. This meant identifying which points or areas of the graph the student deemed important, identifying what comparisons students made in the graph, and how such comparisons were made. Finally, we identified the data-frame interactions that occurred during the student's analysis. These interactions were moments in which the frame was used to inform decisions with the data and moments in which the data had some sort of effect on the student's frame. Each student's interview was deconstructed following this method of analysis.
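As a purely hypothetical illustration of what this deconstruction captured for each interview, the record below sketches the three coding categories as a data structure; the field names are ours and do not correspond to any software or instrument used in the study.

```python
# Hypothetical record structure illustrating the three coding categories;
# an illustration of the scheme, not software used in the study.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    minimal_maximal_definition: str       # how the student defined "minimal"/"maximal" for the task
    concentrations_considered: List[str]  # e.g., ["0 M", "2 M", "4 M"], or areas beyond the given choices
    metals_considered: List[str]          # e.g., ["Au", "Sn", "Fe"]
    outside_knowledge: List[str] = field(default_factory=list)  # conceptual ideas or prior experiences invoked

@dataclass
class DeconstructedAnalysis:
    student: str                          # pseudonym
    frame: Frame
    data_used: List[str] = field(default_factory=list)               # points, areas, and comparisons in the graph
    data_frame_interactions: List[str] = field(default_factory=list) # e.g., data-based evaluations, frame changes
```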

We recognize that in some interviews, students’ analyses were not as linear or direct as others. Many students changed answers at some point during the interview. We infer that the change in answer indicated that something had changed in the student's frame. To distinguish between a student's different frames, we identified the anchors that seemed to inform the student's answer before and after the change. For this part of the analysis, we identified either an entirely new set of anchors for a student's frame or new anchors that might have been added onto the pre-existing set. After this, we identified the circumstances under which the frames changed as well as pinpointed the actions students undertook when their analysis deviated or changed somehow.

Trustworthiness

An outside researcher who was also trained in Data-Frame Theory as a theoretical framework was recruited to establish trustworthiness for the modelling of students’ data analyses. The first author and the outside researcher collaboratively deconstructed three students’ interviews to identify the students’ frames for analysis, the data used, and the data-frame interactions that took place to help a student reach their final answer. Any disagreement or discrepancies between researchers were discussed and changes were made to the coding to reflect the discussion.

To begin, the outside researcher was trained on the anchors that the first author had identified as relevant for the task. The first author and outside researcher then constructed frames for each student by identifying anchors that seemed to guide the student through their analysis. There was an initial discrepancy between researchers concerning whether one student had formed a frame and immediately changed to another frame or if the student had been simultaneously using two frames to approach the task before settling on one. The researchers discussed and decided that because the student seemed to weigh both frames equally when speculating, he likely was considering multiple frames at once rather than changing frames. This was noted and applied to similar interviews in which students considered multiple answers until eventually settling on one. The researchers also worked to describe new frames that students had after changing answers in the interview.

Once frames had been constructed for each student, the researchers identified data that each student considered in their analysis, as well as what students did with the data. Researchers went through the interview transcripts line by line to identify the graph features students spoke about or marked on their screen. Researchers then identified what students did with the graph features to help further their analysis.

After considering the frames and the data that students used within their analysis, the researchers identified data-frame interactions for each interview. Much of the discussion between researchers focused on two of the interviews in which students changed their answer at some point. The researchers inferred that a change in answer meant that students experienced some sort of change in their frame. The researchers further inferred that there was data to prompt the change. Both students were exposed to new information before they changed their answer: one student reread the prompt, and the other engaged in the peer review; both changed their perspectives as a result. Therefore, in the instance of a frame change, the researchers decided to classify it as a data-frame interaction, and to expand the definition of data to include different sources of information such as the simulated peer review and rereading of the prompt. Following this, other interviews in which students experienced a change in frame were reanalysed to identify the information that caused such a change.

Once consensus was reached for all three interviews, the first author coded the remaining interviews.

Results

Overview

We first explore the theme of forming a frame. Chemistry students began the task by establishing some sort of frame to help make sense of the data. This often involved first drawing upon relevant prior knowledge and experiences that were activated by the task. This in turn would help students establish goals and objectives to guide their analysis and completion of the task. Although all students did establish a frame to work through the task, many of the students began the task using multiple frames in their approach before deciding on one.

Next, we investigate the theme of observing and interacting with data. Having established and decided on a frame, students then began searching for graph data that was relevant to their frame. This information included different points within the graph such as the peaks for different lines and low points for others. Once students had identified the relevant information, they would engage in interacting with the graphical data. For this task, these interactions mainly consisted of comparisons of different graphical data.

Following the themes of frame and data, we introduce a third theme that examines the interactions between the two. After exploring the data to find relevant information and engaging with important graphical data, students engaged in various data-frame interactions. The most common data-frame interaction involved students using graphical data to make evaluations of the different HCl concentration options. These evaluations in turn were used to help students gauge which option best fulfilled the task according to their frame. For some students, the data within the graph activated another data-frame interaction in which students deviated from the task's prompt and incorporated extra or different objectives in their frame that were not observed in other students’ analyses.

The final theme we explore is when students’ frames changed during their analysis. We consider this to be another form of a data-frame interaction; however, the interaction resulted from incoming information outside of the data in the graph. One source of information that prompted such a change was the rereading of the prompt. Students began their interviews with one answer informed by one frame, but immediately after rereading the directions, they changed their answer, now informed by a different frame. Another type of incoming information that led to many students changing their answer was considering an alternative perspective. Typically, this occurred during the simulated peer review, in which students were exposed to an alternative frame that seemed to prompt a change in perspective.

These findings are outlined in Table 1 and presented in further detail below with quotes from students and images of their graph analyses.

Table 1 Overview of themes and subthemes in students' analyses; underlined themes and subthemes denote a dedicated written section in the results

Establishing the frame
  Prior knowledge and experiences: Students draw upon prior knowledge and experiences that are activated by data within the graph and prompt. These can help activate and guide the student's frame.
  Anchors: Key pieces of data and the student's objectives help them form a frame to guide their analysis.
  Multiple frames: Some students approach the task seeing multiple perspectives or interpretations of the same prompt.
Observing and interacting with data
  Relevance of data: Students’ frames help “filter” incoming information from the graph, helping them identify what graphical data is important and what is irrelevant to their analysis.
  Comparisons: Students make different kinds of comparisons of graphical data to gather information for their decision-making.
Data-frame interactions
  Data-based evaluations: Students use objectives and anchors from their frame to evaluate their choices for the task.
  Data affecting frame: Students consider information outside of the task prompt and embed the new information into their frame, producing a small change in frame.
When frames change
  Students’ frames changed in one of two ways: by considering new information outside of the graph and by considering alternative perspectives.


Establishing the frame

To begin, students called on prior knowledge and experiences that could help construct a frame for them to engage in the task. Many of these experiences involved some sort of work in a laboratory. Throughout his analysis, Evander often mentioned wanting a pure gold product to fulfil the task. When asked about this, he specifically recalled his summer work in a research lab:

“I worked in a research lab where drug purity is paramount because you don’t want stuff messing with drug delivery. So I guess that was kind of in my mind.” (Evander)

Even though he did not reference this research lab experience right away in his interview, we infer that he was unconsciously drawing upon the knowledge he gained working in that environment to help form a frame to complete the task.

Not all students drew upon prior knowledge and experiences that were directly related to the task at hand. Many students brought up everyday connections that helped them make sense of the task. In addition to this, some students attempted to use conceptual knowledge that was unrelated to the task. For example, Hector attempted to use his prior knowledge of kinetics to make sense of the task:

“One thing I’m like trying to incorporate would be like rates of reactions… I feel like if there were more materials other than the HCl, the gold, and the PA, then it could probably slow the reaction down.” (Hector)

Here, Hector is attempting to use the concept of kinetics to help inform his analysis. In the interview, he had mentioned that it was a subject covered in his chemistry lecture, but that he had not understood it as well as other concepts. We found this particularly interesting as there was no mention of reaction rates in the task's prompt, nor was there any data provided that related to reaction rates; his ideas came solely from his prior knowledge and experiences in class. Hector's focus on the molecular level of the extraction phenomenon demonstrates a relatively sophisticated level of reasoning for the task, even though the task did not ask him to consider reaction rates in his response. Although the concept of rate of reactions helped Hector initially to begin activating a frame and guide him to an answer, he later changed his response during the peer review portion of the interview. We infer that this was because the prior knowledge Hector had activated was limited in what it could help him make sense of.

Accessing prior knowledge and experiences for the task helped students to activate anchors for their frame and guide the frame overall. The anchors helped to define the goals or objectives that students set out to accomplish during their task, as shown by Gregor:

“Oh, I saw some keywords in the, in the question…you know, ‘maximal amount of Au with minimal amounts of waste.’…So when I see things like that, I'm thinking that the problem that we're trying to solve here is one of there's a specific name for it, but it's, it's getting the most of what you want and the minimum of what you don't. Optimization, I believe.” (Gregor)

Here, Gregor is focusing on keywords in the prompt and connecting them to optimization problems he has encountered before. The combination of the “maximal” and “minimal” terms seemed to activate his previous experiences with optimization. This allowed him to begin to establish anchors that helped guide what data he needed to consider. Gregor began to hint that one anchor for him involved searching for an area of the graph that satisfied having a maximum of gold while still maintaining a relative minimum of tin and iron.

Overall, anchors helped to define the goals or objectives that students set out to accomplish through their analysis. They gave direction for the student's frame to follow. Because anchors were so embedded in a student's analysis, they were often difficult to elicit directly from a student's interview. However, some students verbalized this aspect of their frame, like Ben did:

“So the first [step] is find the most Au that I can get. The second one is find the least Sn and Fe that I can get. And then the third one would be like now find the point where you can achieve both of the two steps the best way you can.” (Ben)

The steps that Ben lists out serve as guides for the remainder of his analysis. Having these concrete objectives for his frame helped him to anticipate what graphical information he would need to complete the task. Even students who did not directly verbalize their objectives as Ben did had some sort of frame that helped direct their analysis of the graph.

However, not every student formed a guiding frame immediately when they began the task. Some students began by acknowledging that different concentrations of acid would fulfil different interpretations of the task, which seemed to show they were approaching the task from multiple perspectives. We infer that these students were tracking multiple frames, one for each perspective they considered. Take Bruce:

“The biggest one, I kept going back and forth between either two or four molarity, just because I don't know how important it is to get rid of waste. Like if the main goal is to extract gold or the main goal is to minimize waste, I think that could depend on, whatever the goal is would change if you want to use two or four molar hydrochloric acid.” (Bruce)

Bruce recognized that there could be multiple interpretations for the task, with each having its own set of objectives to fulfil. The “main goal” for one frame was to extract as much gold as possible, making 4 M the ideal choice, while the goal of the other frame he tracked was to avoid extracting other metals with the gold, making 2 M the ideal choice. Although Bruce saw the merit in both interpretations of the task, he ultimately chose one to pursue for the remainder of his analysis.

Observing and interacting with data

Once students had a frame, they began to use it to navigate the different information in the graph. The frame played a key role during this part of their data analysis, as it allowed them to filter incoming information from the graph, as Gregor explains when describing his analysis:

“I'm looking around and I'm trying to figure out, okay, so what of this information is relevant? And that's why I went and I found our items we're actually looking for and try to ignore the rest of this to some degree.” (Gregor)

Here Gregor points out that there is key information that is relevant for his analysis. His frame allowed him to focus solely on the gold, iron, and tin curves during his analysis, as those were the metals of interest for his interpretation of the task. The rest of the data displayed on the graph was filtered out and ignored, as it was not part of his frame when engaging in the task. Gregor's frame also cued him to specific spaces of the graph (Fig. 3):

“It simplifies the problem solving if I realized that all I really have to do is figure out where this is maximum *circles maximum point on gold curve* and then look and see what's going on with my other sort of secondary criteria. And then if I look at that and I go, okay, well they're pretty high here *circles tin and iron points at 4 M*. Then I go, well, where are they low? And I see, well, how much did we really relatively change?” (Gregor)

Here, Gregor is cued by his frame to narrow down his search of the graph to the space in which gold has reached a maximum. Once this has been located, he mentions his second objective to fulfil, which for Gregor's frame meant finding a spot with relatively low extractions of tin and iron. He then navigated to the bottom space of the graph to search for the tin and iron lines where he knew the extraction would be lower.

Following the identification of relevant data and important areas of the graph, students began to make various types of comparisons. Many of these comparisons were between a specific metal or set of metals at two different points in the graph, as Jemma demonstrates when comparing the gold extraction at 0 M and 2 M: “The amount of gold extracted [at 2 M] is like 25% higher than at zero where it's only 65.”

Another type of comparison made by some students was gauging the relative differences of all the metals’ extractions at a given concentration of HCl. Some students did this by creating a “ratio” of the extraction of gold to the extraction of tin and iron, such as Colette did:

Interviewer: “I wanna go back to a point you mentioned earlier when you were analysing the graph, you talked about this idea of ratios. Can you explain that a little bit more for me?”

Colette: “I guess the ratio you want as much gold as you can get with having as little of the iron and tin as you can get, so like the farther apart they are. To me, if you have a 90:10 ratio, that's better than a 70:30 ratio. If you're only getting 70% of your gold, but you're getting 30% of the other metals.”

The ratio Colette constructed was helpful in comparing multiple data points at once. Students that constructed ratios for comparisons were condensing the gold data and the iron and tin data into a single piece of information for a given concentration of HCl. By constructing a ratio for each possibility of HCl, students could then compare each ratio and evaluate how well each choice fit their frame's objectives via a data-frame interaction.
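A minimal sketch of the kind of condensation Colette describes is shown below. The extraction percentages are hypothetical, loosely echoing values students mention in their quotes (gold near 65% at 0 M and roughly 90% at 2 M); they are not read from the actual graph, and the ratio is expressed as a quotient for simplicity.

```python
# Hypothetical extraction percentages, NOT taken from the actual graph; they
# only illustrate how students condensed several data points into one ratio.
extractions = {
    "0 M": {"Au": 65, "Sn": 0,  "Fe": 0},
    "2 M": {"Au": 90, "Sn": 2,  "Fe": 0},
    "4 M": {"Au": 95, "Sn": 40, "Fe": 35},
}

def gold_to_waste(metals):
    """Condense one concentration's data into a single gold-to-waste figure."""
    waste = metals["Sn"] + metals["Fe"]
    return metals["Au"] / waste if waste else float("inf")

for conc, metals in extractions.items():
    print(conc, gold_to_waste(metals))
```

How the three options rank under such a condensation depends on how a student's frame weighs zero waste against maximal gold, which is consistent with the different conclusions students reached.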

Data-frame interactions

There were two main ways in which graphical data and students’ frames interacted within this task: data-based evaluations against a frame and data affecting a frame.

In a data-based evaluation, students weighed the different options of HCl outlined in the task by first obtaining some sort of relevant data from the graph. This data could be singular in nature, wherein students used one point on the graph, such as a peak, or it could be more complex and involve comparisons of multiple points at once. Students then used this data to help assess how well an HCl concentration fit the objectives of their frame. If the concentration of HCl fulfilled their objective(s), they qualified it as an option for their final choice. If the concentration of HCl did not fulfil an objective or violated some aspect of their frame, it was disqualified and no longer considered an option for the final choice. To illustrate this, consider an excerpt from Gregor's analysis:

“The first thing that comes up is 4 M. 4 M comes up because that's where the gold maximum is…And then when I think about it and I decided that maybe we don't want all of this waste right here, I start looking for the minimum of the tin and iron and that's at 2 [M].” (Gregor)

Gregor begins by noting that the peak of gold is at 4 M. The peak signifies the largest extraction of gold, which aligns with his objective of achieving a relative maximum amount of gold. This qualifies 4 M as an option for Gregor to further consider against the other objectives of his frame. The next objective he considers is achieving a minimum extraction of tin and iron. Still considering 4 M as an option, Gregor then checks the tin and iron points at 4 M to see if they fit his objective. After noting that the tin and iron extraction is much higher at 4 M compared to the other options, Gregor decides to disqualify 4 M as an option for the task because of its failure to meet one of his frame's objectives. Gregor then continues his analysis to examine 2 M HCl as a potential option, as it seems to better fit the objective of a minimum extraction of iron and tin.
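One way to schematize this qualify/disqualify pattern is sketched below; the objectives, thresholds, and extraction values are invented stand-ins for a frame like Gregor's, not numbers any student stated or that appear in the graph.

```python
# Hypothetical sketch of a "data-based evaluation": each frame objective is a
# predicate over invented extraction data; any option that fails an objective
# is disqualified, mirroring the pattern in Gregor's analysis above.
extractions = {
    "0 M": {"Au": 65, "Sn": 0,  "Fe": 0},
    "2 M": {"Au": 90, "Sn": 2,  "Fe": 0},
    "4 M": {"Au": 95, "Sn": 40, "Fe": 35},
}

objectives = [
    lambda m: m["Au"] >= 85,            # "achieve a relative maximum of gold"
    lambda m: m["Sn"] + m["Fe"] <= 10,  # "keep the tin and iron extraction low"
]

qualified = [conc for conc, metals in extractions.items()
             if all(obj(metals) for obj in objectives)]
print(qualified)  # with these invented numbers and thresholds, only "2 M" survives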

The other type of data-frame interaction involved data within the graph affecting a student's frame. This was a far less common occurrence, and students’ frames experienced relatively small changes from the data. The students whose frames were affected by the data in the graph seemed to consider information that was not outlined in the task prompt and to embed the new information into their frame. For example, Fernald's analysis considered a metal that was not outlined in the task:

“Well, the ideal molarity would be around the relative maximum for the gold, and it would also be around the relative minimum for all of these other metals, especially with Sb.” (Fernald)

We posit that Fernald experienced a very small frame change upon beginning to explore the graph. As mentioned in the Methods section, the prompt only asked students to consider gold, iron, and tin. These were also the only lines in the graph that were displayed in colour to help students more easily identify them. Because of this purposeful design of the prompt and graph, we infer that Fernald began the task only considering these metals. However, in exploring the graph during his interview warm-up questions, he noticed the line for antimony and used it for the remainder of his analysis. We believe that Fernald may have embedded antimony into his frame as it was a relatively visible line on the graph, being lighter in colour and situated above the other greyed-out metals in the graph. Despite frequent reminders from the interviewer, Fernald continually referenced antimony throughout his analysis, leading us to believe that his frame considered this metal in addition to the others. Specifically, we infer that he incorporated achieving a relative minimum extraction of antimony into his original frame objective of achieving a relative minimum extraction of tin and iron. Even though including antimony in his frame did not cause his analysis to differ drastically from other students in the end (he chose 2 M as his final answer, as did most other students), it illustrates an instance in which a student's frame and data interact within their analysis.

When frames change

Many students changed their answer at some point during their analysis. We suspect the change in their answer reflected some sort of change in the students’ frames. Students often seemed to have additional or entirely new objectives for their analysis to follow, which resulted in new answers for the task. We observed these changes in answers in two different scenarios: considering new information outside the graph and considering alternative perspectives.

Frame changes that resulted from considering new information outside the graph occurred when students reread the prompt at some point. These interviews began with students either not fully reading the prompt or not remembering it fully before starting their analysis. Students began by choosing 4 M and justifying their choice by stating that the gold or all metals were highest at that point, as Kit illustrates: “I guess in all, I would say 4 M, because it seems like based off the 4 M in general, there's more percent metal extracted than from the other ones.” Because the students justified their answer without mentioning any other aspect of the prompt, the researcher would reread the prompt to ensure the student had fully understood the directions. Immediately after rereading the prompt, students switched their answer to 2 M, often saying they had misunderstood the question being asked before or that they understood the task better after rereading the prompt. The rereading of the prompt helped students to form new frame objectives that guided their analysis of the graph, as demonstrated by Kit:

Interviewer: “So what new pieces of information do you think that you used in order to reach your conclusion?”

Kit: “Oh, like based off of like the new, like reading the paragraph again? I think that the key things for me was like the maximal amount of Au and the minimal amount of waste. I think that originally I had thought this [maximal amount of Au], but I wasn't considering the amount of waste.”

Originally, Kit's frame only contained the objective to find as much gold as possible within the range of concentrations given. This objective guided her to seek out the highest point for the gold line, which happened to be at 4 M, qualifying 4 M as an option to fulfil the task. Upon rereading the prompt, Kit identified an additional objective for her frame to use: achieving a minimal amount of waste. Having an additional objective in her frame guided Kit to new data within the graph that she had not considered as relevant before. This new information then allowed her to consider a new concentration of HCl, which moved her analysis toward a new answer for the task.

In addition to experiencing frame changes from considering new information outside of the graph, some students also experienced frame changes by considering alternative perspectives. Most students who experienced a frame change in this way did so during the simulated peer review portion of the interview. During the peer review, students compared their own work to that of a hypothetical peer. After this comparison, some students stated that they had low confidence in their answer or were feeling less confident overall. A few students even admitted that they thought their answer was wrong. To alleviate this, the interviewer gave the student an opportunity to revise their answer, which gave them the chance to choose a different concentration of HCl to fulfil the task. After students had changed their answer and explained what made that answer more appropriate, the interviewer asked the student what had motivated their change in answer. Some of these students explained that the peer review offered a new perspective that seemed better than their own. For example, consider Violet after reviewing the 0 M sample response:

“This one's pretty convincing, different after reading the first student and comparing it to mine. It's kind of like seeing it from someone else's perspective. [It] just kind of puts it into a better perspective. And I feel like that makes a lot more sense than my answer…It makes a lot of sense that there's actually zero waste at 0 M and you still get a pretty good amount of the gold and no waste at all.” (Violet)

The peer review prompted Violet to compare her analysis against the 0 M sample response. The comparison exposed Violet to a frame different from her own that she found more appealing. This new frame had slightly different objectives from her original frame: the extraction had to achieve an absolute minimum of tin and iron rather than a relative minimum. This disqualified her original answer of 2 M, which could be interpreted as having a relative minimum extraction of tin and iron (tin being slightly above zero and iron at zero) but not an absolute minimum. The only HCl concentration that achieved an absolute minimum of iron and tin was 0 M, so it qualified as the best option for Violet's frame.

Although the majority of frame changes from considering alternative perspectives arose from the simulated peer review, one student, Ariel, experienced a frame change by solely thinking of another person's perspective. Ariel began her interview with 4 M HCl as her answer. When asked to explain what had made this the most appropriate answer, Ariel described her definition of minimal for the task: “So that's kind of how, I guess I would think of it where you can't count something as minimal until it is actually present.” Using this definition, the only concentration that qualified to fit her frame was 4 M, as both iron and tin had extractions above zero.

Intrigued by this answer, the interviewer asked Ariel to consider how the scientists behind the experiment would approach the same task. Ariel began to express ideas surrounding reducing the error of the experiment, and when asked to specifically describe what the best choice would be for the scientists, she said, “I would say when you get the most amount of gold extracted, and then when you probably just have as little possible of tin and iron.” Here, Ariel is beginning to recognize a frame different from her own, the main difference being each frame's objective concerning the iron and tin extraction. Ariel's frame defined a minimal extraction as an extraction in which both metals had a presence, whereas the new “scientist” frame defined a minimal extraction as having relatively little to no iron or tin present. Following this moment in the interview, the interviewer asked Ariel what perspective she wanted to take for the task. She admitted to feeling indecisive at this point but decided to choose 2 M HCl as her answer. After going through her analysis with the interviewer, Ariel brought up an experience from her general chemistry laboratory course that reminded her of the interview task, describing it as follows:

“Well, we do like one where we mix like caffeine with HCl and like water or something…I think it's kind of similar to this one where it has like different layers…So I was just kind of thinking about that, like visually when I was doing this, cause it was kind of similar.” (Ariel)

Ariel's frame change was unique in that she did not need to read another written analysis to consider this alternative perspective. Upon being asked to consider a more science-based perspective, she seemed to activate a new frame that could be used to complete the same set of directions in the task. This new perspective also seemed to help Ariel establish connections between her previous experience with an experiment she had performed in the laboratory and the metal extraction task at hand. This in turn helped her to visualize and better make sense of the phenomenon, leading to more productive sensemaking for the task.

Discussion and implications for research and practice

We outlined the ways in which a student's prior knowledge and experiences interact with their data analysis. We specifically demonstrated that a student's prior knowledge and experience played a part in activating and developing a frame that could be used to analyse the graph. Upon approaching the task, students could draw upon relevant knowledge and experiences that helped guide them through the task. For example, one student, Evander, specifically brought up his experience working in a drug research lab as something that he used to help him navigate the task. Although his lab work did not directly mirror the task, it was similar enough for him to use as a reference. Evander used the idea of purity from his research to help identify priorities for his frame's objectives and to support his reasoning. Slominski and colleagues (2020) interviewed several biology, physics, and engineering faculty using a fluid dynamics task in two different contexts: blood vessels and water pipes. Some biology faculty used vocabulary and conceptual knowledge reminiscent of cardiovascular systems when answering questions for the water pipe task. Even though water pipes do not directly relate to cardiovascular systems, they do share the same underlying concept of fluid dynamics. Like Evander, the biology faculty used their prior experiences and prior knowledge as a reference and frame to help navigate new problems.

Not all prior knowledge and experiences were helpful for students. Consider Hector's interview; even though he was taking a rather sophisticated approach, thinking of how the rate of the extraction could be affected by what metals were being extracted at a given concentration of HCl, he did end up changing answers and drawing upon different sources of knowledge to inform a changed frame later in the interview. We infer that something in the interview questions or task prompt activated his conceptual knowledge of kinetics from class, and that he decided to use it to inform his decision-making. We further infer that Hector struggled to connect the concept any further to the task, so he sought other knowledge and experiences to inform his frame. This shows an example of a less productive piece of prior knowledge. Hector's knowledge was conceptually sound, but it had limited use for the context of the task. Hector saw the lack of utility in the conceptual knowledge, so he drew upon other ideas to inform his reasoning. We can see experts do this in their reasoning as well. Slominski and colleagues highlight one biology expert's attempt at using relatively sophisticated biological conceptual knowledge to explain a phenomenon. The expert was trying to use capillary physics as a conceptual resource but was unable to connect it to the task. Recognizing this, the expert then drew upon their previous experiences with syringes and used this to inform and shape their frame.

We also observed students draw upon their everyday knowledge and experiences to help complete the task. In one example, Ariel began her interview by defining “minimal” as the smallest amount that was not zero. This definition informed one of the objectives for her frame, which in turn helped her choose her initial answer of 4 M. Other studies have shown this to occur during chemistry students’ data analysis. Heisterkamp and Talanquer (2015) observed their participant using examples from his everyday life to explain differences in chemical phenomena. Experts also use everyday knowledge and experiences to help them reason in unfamiliar contexts. In their dual-context fluid dynamics task, Slominski and colleagues found that some experts used a frame informed by everyday knowledge to help reason through the task situated in an unfamiliar context. For example, one biologist drew on an outdoor water hose in their reasoning for the water pipe context; their experience with putting a finger over the end of a hose gave them a reference for completing the task. In all these examples, participants accessed prior knowledge from their everyday experiences that helped inform their frame for the task. There is likely some aspect of the task that activates this everyday knowledge or experience, which then serves as a frame to help the participant make sense of the task.

Our work provides additional evidence that novices and experts undergo similar sensemaking processes (Klein et al., 2007). Both students and experts access some form of prior knowledge or experience to help establish a frame for the sensemaking process. The key element that seems to vary is access to relevant prior knowledge and experiences: experts often have richer mental models and a larger knowledge base. Those who have access to prior knowledge and experiences that resemble the phenomenon or target similar concepts can engage in more sophisticated and productive reasoning.

To ensure that students are equipped to engage in sophisticated and productive data analysis in the classroom, instructors should consider how they are activating the knowledge and experiences that could help students productively engage in data-based reasoning. Situating tasks in a variety of contexts could expand learners’ knowledge base, thereby increasing the chances of activating prior knowledge and experiences that productively frame their data analysis. Additionally, scaffolding that explicitly cues students’ relevant prior knowledge and experiences can support productive engagement. This sensitivity to scaffolding was evidenced by Ariel, who modified her response simply by being prompted to consider the perspective of the scientists conducting the experiment.

Overall, our participants were proficient in navigating the graph. With the scaffolding we employed, students did not seem to have any problems comprehending the graph and its features. Students could easily identify local maxima and minima, compare points and slopes, and interact with the surface features of the graph. Students’ competency in comprehending these surface-level graph features aligns with much of the science education literature on graph comprehension in science courses (Potgieter et al., 2008; Ivanjek et al., 2016). The ease with which students navigated the graph and its features suggests that they have rich graph schemas that help them synthesize and comprehend information from a graph (Pinker and Feedle, 1990). These schemas can serve to automate graph reading, which helps to explain why we observed little variation in students’ graph comprehension and primarily variation in what information students used from the graph.

In addition to considering how one's prior knowledge and experiences interact with graphs, we also observed different ways in which a frame changed during our participants’ data analysis. In all three ways, students noticed something that then altered their frame. The degree of these frame changes varied depending on how the new information aligned with a student's current frame. New information that produced smaller frame changes generally fit with the student's original frame; smaller changes in analysis occurred when the new information did not alter any objectives within the frame. Consider Fernald, whose frame changed very slightly early in his analysis. Initially, we infer that his frame had him consider only the gold, tin, and iron data in the graph, as they were the only metals outlined in the task. After viewing the graph, though, Fernald incorporated antimony into his frame by expanding upon his minimal extraction objective. This is an example of elaborating one's frame in Data-Frame Theory, in which the core tenets of the frame remain intact but the frame is adjusted to accommodate new data that was not explained or considered beforehand (Klein et al., 2007).

To produce more significant changes in a student's frame, the frame needed to be challenged in some way. One way in which we saw this occur was by engaging students in a simulated peer review to have them consider an alternative perspective. In previous work, we used social comparison theory to model how students engaged in the simulated peer review (Berg and Moon, 2022). Here, we argue that the simulated peer review offered students an opportunity to compare their frame to that of the sample response they were reviewing. From this comparison, students evaluated how well their frame fulfilled the task. If students were not satisfied with how their frame completed the task, they could adapt it or use a different frame entirely.

Throughout the simulated peer reviews, students demonstrated proficiency in comparing their frames to those of the sample responses. This was especially prominent for students who changed their frame after reviewing a sample response. We argue that to change their frame in this manner, students decentred from their original frame to consider another. Decentring is the process of recognizing and understanding perspectives different from one's own (Piaget, 1955). Some students directly verbalized that reading the sample responses offered new perspectives they had not considered, as Violet did when reviewing the 0 M sample response: “It's kind of like seeing it from someone else's perspective. [It] just kind of puts it into a better perspective.” Here, Violet recognizes that the 0 M sample response offers a perspective different from her own and then acknowledges that its frame fulfils the task better than her own. By recognizing the alternative perspective and then adopting it as her own, Violet shows evidence of decentring from her original frame.

Our findings provide further evidence that not only empirical data but also other sources of information, such as socially obtained information, shape a student's sensemaking during data analysis and interpretation. Previous work in physics education has found that frames can be influenced by social cues (Gouvea et al., 2019). In the context of our study, social cues and social information were accessed by engaging in the simulated peer review. Through the peer review, students were exposed to alternative perspectives that they could consider and potentially adopt as their own for the task. Further work should consider how a student's frame influences how they engage in peer-based classroom activities. Our work is based in peer review, but similar dynamics likely occur in other peer-based classroom activities such as peer-led team learning. We consider this an opportunity for further research to (a) uncover how peers prompt modifications to a learner's frame and (b) elucidate how scaffolding and intangible cues may serve to activate specific frame components, or even prompt reflection on and aid in productive modification of frames.

We encourage classroom practitioners to consider how they attempt to elicit sensemaking from students in tasks. Instructors should account for how the information in the task, such as empirical data and prompt instructions, may activate specific frames with which students make sense of the data. When designing an activity, instructors can also consider how to leverage prior knowledge and prior experiences that relate to the task. Both can help guide students in forming an appropriate frame to make sense of the task in a productive manner. Finally, instructors can consider how other sources of information, such as the alternative perspectives of peers, may influence students’ frames in their sensemaking process and overall engagement in a task. Other sources of information, like reviewing a peer's work, can challenge students’ less productive or less relevant ideas.

Limitations

Our characterizations of students’ data analyses are highly contextualized in the design of the task we constructed and the research methods we used.

First, we utilized a data representation familiar to students. Undergraduate students are extremely likely to have encountered and used graphs before engaging in this task. Familiarity with analysing and using graphs affords robust graph schemas, which made navigating the graph relatively effortless for students. Throughout the interviews, students had no difficulty navigating the graph to find information, making comparisons, or comprehending the graph more generally. Other kinds of data representations, both domain-specific (such as NMR spectra) and domain-general (such as data tables), are very likely to produce different sensemaking from students.

We also acknowledge that the design required students to consider only one set of data. We simplified the task so that students only needed to consider a graph of extraction values, and we narrowed their focus to three specific metals. We designed the task to be simple so that our analysis could focus on both characterizing students’ sensemaking processes and identifying the prior knowledge and previous experiences students used in their sensemaking. Tasks that use multiple sets of data will likely produce more complex sensemaking and potentially more complex frames, especially when the data sets contradict one another in some way.

The content knowledge required to navigate the task also poses a limitation. The task was purposefully scaffolded to be accessible to students enrolled in first- and second-semester general chemistry courses. Much of the underlying chemistry content was removed so that students could engage in the task with relative ease. Tasks that require more content knowledge will likely produce more complex sensemaking from participants, and probably greater variation in that sensemaking as well.

Finally, our use of interviews limits generalizability to broader contexts; however, interviews enabled us to probe students’ thought processes during their analyses and to identify the prior knowledge and experiences that influenced them. During data collection, we deemed that saturation was reached at eighteen interviews, as students in the later interviews used no reasoning or perspectives that had not appeared before (Nelson, 2017).

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

We would like to thank the members of the Moon group for offering feedback and helping establish trustworthiness.

References

  1. Angra A. and Gardner S. M., (2016), Development of a framework for graph choice and construction, Adv. Physiol. Educ., 40(1), 123–128 DOI:10.1152/advan.00152.2015.
  2. Angra A. and Gardner S. M., (2017), Reflecting on graphs: Attributes of graph choice and construction practices in biology, CBE Life Sci. Educ., 16(3), 1–15 DOI:10.1187/cbe.16-08-0245.
  3. Becker N. M., Rupp C. A. and Brandriet A., (2017), Engaging students in analyzing and interpreting data to construct mathematical models: an analysis of students’ reasoning in a method of initial rates task, Chem. Educ. Res. Pract., 18(4), 798–810 DOI:10.1039/C6RP00205F.
  4. Berg S. A. and Moon A., (2022), Prompting hypothetical social comparisons to support chemistry students’ data analysis and interpretations, Chem. Educ. Res. Pract., 23(1), 124–136 DOI:10.1039/d1rp00213a.
  5. Brooks J. et al., (2015), The Utility of Template Analysis in Qualitative Psychology Research, Qual. Res. Psychol., 12(2), 202–222 DOI:10.1080/14780887.2014.955224.
  6. Carpenter P. A. and Shah P., (1998), A model of the perceptual and conceptual processes in graph comprehension, J. Exp. Psychol.: Appl., 4(2), 75–100 DOI:10.1037/1076-898X.4.2.75.
  7. Chen Y.-C. and Terada T., (2021), Development and validation of an observation-based protocol to measure the eight scientific practices of the next generation science standards in K-12 science classrooms, J. Res. Sci. Teach., 58(10), 1489–1526 DOI:10.1002/tea.21716.
  8. Chinn C. A. and Brewer W. F., (2001), Models of Data: A Theory of How People Evaluate Data, Cognit. Instr., 19, 323–393 DOI:10.1207/S1532690XCI1903_3.
  9. Chinn C. A. and Brewer W. F., (1993), The Role of Anomalous Data in Knowledge Acquisition: A Theoretical Framework and Implications for Science Instruction, Rev. Educ. Res., 63(1), 1–49 DOI:10.3102/00346543063001001.
  10. Cooper M. and Klymkowsky M., (2013), Chemistry, Life, the Universe, and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform, J. Chem. Educ., 90(9), 1116–1122 DOI:10.1021/ed300456y.
  11. Cooper M. et al., (2015), Challenge faculty to transform STEM learning, Science, 350(6258), 281–282 DOI:10.1126/science.aab0933.
  12. Council N. R., (2012), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, Washington, DC: The National Academies Press DOI:10.17226/13165.
  13. Doidge E. D. et al., (2016), A Simple Primary Amide for the Selective Recovery of Gold from Secondary Resources, Angew. Chem., Int. Ed., 55(40), 12436–12439 DOI:10.1002/anie.201606113.
  14. Friel S. N., Curcio F. R. and Bright G. W., (2001), Making sense of graphs: Critical factors influencing comprehension and instructional implications, J. Res. Math. Educ., 32(2), 124–158 DOI:10.2307/749671.
  15. Glazer N., (2011), Challenges with graph interpretation: A review of the literature, Stud. Sci. Educ., 47(2), 183–210 DOI:10.1080/03057267.2011.605307.
  16. Gouvea J., Sawtelle V. and Nair A., (2019), Epistemological progress in physics and its impact on biology, Phys. Rev. Phys. Educ. Res., 15(1), 10107 DOI:10.1103/PhysRevPhysEducRes.15.010107.
  17. Hammer D. et al., (2004), Resources, framing, and transfer, (Rec 0087519), pp. 1–26.
  18. Heisterkamp K. and Talanquer V., (2015), Interpreting Data: The Hybrid Mind, J. Chem. Educ., 92(12), 1988–1995 DOI:10.1021/acs.jchemed.5b00589.
  19. Ivanjek L. et al., (2016), Student reasoning about graphs in different contexts, Phys. Rev. Phys. Educ. Res., 12(1), 1–13 DOI:10.1103/PhysRevPhysEducRes.12.010106.
  20. Jeong H., Songer N. B. and Lee S. Y., (2007), Evidentiary competence: Sixth graders’ understanding for gathering and interpreting evidence in scientific investigations, Res. Sci. Educ., 37(1), 75–97 DOI:10.1007/s11165-006-9014-9.
  21. Kanari Z. and Millar R., (2004), Reasoning from data: How students collect and interpret data in science investigations, J. Res. Sci. Teach., 41(7), 748–769 DOI:10.1002/tea.20020.
  22. Klein G. and Moon B., (2006), Making sense of sensemaking 2: A macrocognitive model, IEEE Intelligent Syst., 21(5), 88–92 DOI:10.1109/MIS.2006.100.
  23. Klein G. et al., (2007), A Data-Frame Theory of Sensemaking, Expertise out of context, pp. 113–155.
  24. Lai K. et al., (2016), Measuring Graph Comprehension, Critique, and Construction in Science, J. Sci. Educ. Technol., 25(4), 665–681 DOI:10.1007/s10956-016-9621-9.
  25. Latour B., (1999), Pandora's hope: essays on the reality of science studies, Harvard University Press.
  26. Masnick A. M. and Morris B. J., (2022), A Model of Scientific Data Reasoning, Educ. Sci., 12(2), 1–19 DOI:10.3390/educsci12020071.
  27. Meister S. and Upmeier Zu Belzen A., (2021), Analysis of data-based scientific reasoning from a product-based and a process-based perspective, Educ. Sci., 11(10) DOI:10.3390/educsci11100639.
  28. Miles M. B., Michael Huberman A. and Saldaña J., (2014), Qualitative Data Analysis: A Methods Sourcebook, 3rd edn, Los Angeles, CA: SAGE.
  29. Moon A. et al., (2017), Decentering: A Characteristic of Effective Student-Student Discourse in Inquiry-Oriented Physical Chemistry Classrooms, J. Chem. Educ., 94(7), 829–836 DOI:10.1021/acs.jchemed.6b00856.
  30. National Research Council, (2012), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, Washington, DC: The National Academies Press DOI:10.17226/13165.
  31. Nelson J., (2017), Using conceptual depth criteria: addressing the challenge of reaching saturation in qualitative research, Qual. Res., 17(5), 554–570 DOI:10.1177/1468794116679873.
  32. Piaget J., (1955), The Language and Thought of the Child, Cleveland, OH: Meridian Books.
  33. Pinker S. and Feedle R., (1990), A theory of graph comprehension, Artificial Intelligence and the Future of Testing, pp. 73–126.
  34. Potgieter M., Harding A. and Engelbrecht J., (2008), Transfer of Algebraic and Graphical Thinking between Mathematics and Chemistry, J. Res. Sci. Teach., 45(2), 197–218 DOI:10.1002/tea.
  35. Ratwani R. M., Trafton J. G. and Boehm-Davis D. A., (2008), Thinking Graphically: Connecting Vision and Cognition During Graph Comprehension, J. Exp. Psychol.: Appl., 14(1), 36–49 DOI:10.1037/1076-898X.14.1.36.
  36. Roth W. and Bowen G. M., (2000), Learning Difficulties Related to Graphing: A Hermeneutic, Res. Sci. Educ., 30(1), 123–139.
  37. Shah P. and Carpenter P. A., (1995), Conceptual limitations in comprehending line graphs, J. Exp. Psychol.: Gen., 43–61 DOI:10.1037/0096-3445.124.1.43.
  38. Shah P. and Hoeffner J., (2002), Review of Graph Comprehension Research: Implications for Instruction, Educ. Psychol. Rev., 14(1), 47–69. Available at: http://www.springerlink.com/content/v2581778612k5432/?MUD=MP.
  39. Slominski T. et al., (2020), Using framing as a lens to understand context effects on expert reasoning, CBE Life Sci. Educ., 19(3), 1–15 DOI:10.1187/cbe.19-11-0230.
  40. Talanquer V. and Pollard J., (2010), Let’s teach how we think instead of what we know, Chem. Educ. Res. Pract., 11(2), 74–83.
  41. Teuscher D., Moore K. C. and Carlson M. P., (2016), Decentering: A construct to analyze and explain teacher actions as they relate to student thinking, J. Math. Teach. Educ., 19(5), 433–456 DOI:10.1007/s10857-015-9304-0.
  42. Zagallo P., Meddleton S. and Bolger M. S., (2016), Teaching real data interpretation with models (TRIM): Analysis of student dialogue in a large-enrollment cell and developmental biology course, CBE Life Sci. Educ., 15(2), 1–18 DOI:10.1187/cbe.15-11-0239.
