Jon-Marc G. Rodriguez a, Kinsey Bain b, Marcy H. Towns *a, Maja Elmgren c and Felix M. Ho *c
aDepartment of Chemistry, Purdue University, West Lafayette, Indiana 47907, USA. E-mail: mtowns@purdue.edu
bDepartment of Chemistry, Michigan State University, East Lansing, Michigan 48824, USA
cDepartment of Chemistry – Ångström Laboratory, Uppsala University, 751 20 Uppsala, Sweden. E-mail: felix.ho@kemi.uu.se
First published on 22nd August 2018
Graphical representations are an important tool used to model abstract processes in fields such as chemistry. Successful interpretation of a graph involves a combination of mathematical expertise and discipline-specific content to reason about the relationship between the variables and to describe the phenomena represented. In this work, we studied students’ graphical reasoning as they responded to a chemical kinetics prompt. Qualitative data was collected and analyzed for a sample of 70 students through the use of an assessment involving short-answer test items administered in a first-year, non-majors chemistry course at a Swedish university. The student responses were translated from Swedish to English and subsequently coded to analyze the chemical and mathematical ideas students attributed to the graph. Mathematical reasoning and ideas related to covariation were analyzed using graphical forms and the shape thinking perspective of graphical reasoning. Student responses were further analyzed by focusing on the extent to which they integrated chemistry and mathematics. This was accomplished by conceptualizing modeling as discussing mathematical narratives, characterizing how students described the “story” communicated by the graph. Analysis provided insight into students’ understanding of mathematical models of chemical processes.
Given this backdrop, it is not surprising students have difficulty using and applying calculus in other contexts, such as modeling physical systems (Becker and Towns, 2012). The act of modeling, which often requires processes to be translated into mathematical formalisms, is a common practice in the sciences, and it has been identified as a foundational scientific practice that students should engage in at all levels of education (National Research Council, 2012; Bruce, 2013; Posthuma-Adams, 2014; Edwards and Head, 2016). Modeling in the physical sciences can be particularly challenging because it often requires the integration of (scientific) knowledge with a student's mathematical knowledge, a problem that is further compounded when considering that chemistry requires students to think abstractly at the particulate-level, which is not readily observable or accessible (Kozma and Russell, 1997; Becker and Towns, 2012; Bain et al., 2018). Nevertheless, researchers agree that making connections across different domains of knowledge through modeling is necessary to promote a deeper understanding of chemistry (Talanquer, 2011; Taber, 2013; Sjostrom and Talanquer, 2014; Cooper et al., 2015; Laverty et al., 2016; Becker et al., 2017).
Previous work has investigated student understanding of mathematical expressions and their relationship to chemical phenomena, with many studies placed in the context of chemical kinetics (Jasien and Oberem, 2002; Justi, 2002; Greenbowe and Meltzer, 2003; Hadfield and Wieman, 2010; Becker and Towns, 2012; Bain and Towns, 2016; Becker et al., 2017). In their review paper, Bain and Towns (2016) comment on the highly quantitative nature of chemical kinetics, which heavily relies on the use of graphs and representations, making it an excellent context for investigating graphical reasoning. In addition, Bain and Towns (2016) echo the call of the National Research Council for more discipline-based education research (DBER) that focuses on studies at the undergraduate level and emphasizes interdisciplinary work, such as collaborations between chemistry and mathematics communities (Singer et al., 2012). This study seeks to contribute to the body of knowledge related to graphical reasoning in the physical sciences by bridging the gap between research done in chemistry and mathematics. To this end, our guiding research question is the following: In what ways do students use mathematics in combination with their knowledge of chemistry and chemical kinetics to interpret concentration versus time graphs?
More generally, studies that involved prompting students to reason about physical science graphs along with parallel, decontextualized (math-only) graphs have discussed the role of mathematical ability (Potgieter et al., 2007) and the increased complexity associated with contextualized graphs (Planinic et al., 2013; Phage et al., 2017), in which different contexts may cue students to utilize less-productive problem-solving strategies to reason about the graphs (Ivanjek et al., 2016). Collectively, the body of literature on graphical reasoning in the physical sciences provides insight regarding the complex factors that interact when students reason about graphs. In the section that follows, we build on this body of literature by focusing more explicitly on the role of mathematical reasoning, particularly covariational reasoning, in understanding the information represented in a graph.
Using covariational reasoning as a resource to interpret graphs is highlighted in the process–object distinction, which is framed in the mathematics community as involving complementary perspectives regarding how a mathematical function can be viewed (Even, 1990; Schwartz and Yerushalmy, 1992; Sfard, 1992; Moschkovich et al., 1993; Potgieter et al., 2007). According to Moschkovich et al. (1993) the process perspective of a graph emphasizes the underlying covariation and relationship between the variables (viewing the graph as a mapping of all possible input and output values), whereas the object perspective of a graph draws attention to the graph as a whole (viewing it as an entity with properties). This characterization of considering a graph as a process or an object is at the core of Moore and Thompson's (2015) conceptualization of shape thinking. Within the shape thinking framework, reasoning is characterized as emergent, which more explicitly considers the relationship between the varying quantities (framing the graph as a process), and static, which focuses on the overall shape of the curve (framing the graph as an object). We would like to emphasize that static reasoning or viewing the graph as an object is not inherently unproductive; both conceptualizations of a graph are resources that may be useful for solving problems in different contexts, and students should be able to reason using both perspectives (Even, 1990; Schwartz and Yerushalmy, 1992; Sfard, 1992; Potgieter et al., 2007).
In the context of chemical kinetics, we are not simply mapping all possible inputs to outputs in the sense of a function with respect to x; rather, there are physical meanings attached. In particular, the independent variable time t is directional, so the individual points on the graph are not independent of all others, but rather there is a “history”. The story “unfolds”, and, unlike in pure mathematical settings, students need to be able to imagine and follow this “unfolding” with time. Therefore, covariation takes on an interesting new meaning from the perspective of an expert reasoning with chemistry concepts, in which a point in the Cartesian coordinate system becomes an event and a group of meaningfully connected points becomes a story. The relationship between the variables that define an event and the coordination of what Saldanha and Thompson (1998) refer to as “simultaneous continuous variation” affords the ability to make inferences about the process, which serves as the basis for a descriptive account informed by chemistry ideas. For this reason, we frame modeling as engaging in mathematical narratives, which involves discussing the “story” communicated by a graph, or engaging in “narratives that fuse aspects of events and situations with properties of symbols and notations” (Nemirovsky, 1996). This affords us the language to think about how students integrated chemistry ideas to interpret graphs.
As discussed by Becker et al. (2017), resources can be characterized as procedural, epistemological, or conceptual in nature. In this work we focus on students’ conceptual resources related to chemistry and mathematics. In the case of mathematical resources, we build on Sherin's (2001) symbolic forms framework. Symbolic forms are mathematical resources that involve attributing intuitive mathematical ideas to a pattern in an equation (Sherin, 2001). Although the symbolic forms framework was originally developed to characterize how physics students think about equations during quantitative problem solving, it has proven useful for investigating mathematical reasoning across discipline-based education fields, including chemistry (Sherin, 2001; Izsák, 2004; Becker and Towns, 2012; Hu and Rebello, 2013; Jones, 2013, 2015a, 2015b; Von Korff and Rebello, 2014; Dorko and Speer, 2015). In our forthcoming paper, we describe how symbolic forms can be adapted for the analysis of graphical reasoning, in which we frame reasoning involving “graphical forms” as attributing mathematical ideas (conceptual schema) to a registration (region in the graph); for example, the graphical form steepness as rate involves associating the steepness of a graphical region with ideas related to rate (Rodriguez et al., 2018). A summary of graphical forms described in Rodriguez et al. (2018) is provided in Table 1. It is worth drawing attention to the dynamic nature of graphical forms, in which multiple conceptual schemas may be associated with a single registration. This can be illustrated by considering the distinction between a straight line with a zero gradient (which indicates constant concentration in a concentration vs. time curve) and a straight line with a non-zero slope (which indicates a constant rate of change in the concentration values in a concentration vs. time curve).
Table 1 Summary of graphical forms described in Rodriguez et al. (2018)

Graphical form | Registration and conceptual schema
---|---
Steepness as rate | Varying levels of steepness in a graph correspond to different rates
Straight means constant | A straight line indicates a lack of change/constant rate
Curve means change | A curve indicates continuous change/changing rate
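As a brief worked illustration of the zero versus non-zero gradient distinction noted above (a sketch of our own; these particular equations do not appear in the original prompt or in the student responses), consider a concentration vs. time curve for a species A:

$$\frac{d[\mathrm{A}]}{dt} = 0 \quad\Rightarrow\quad \text{a horizontal line: } [\mathrm{A}] \text{ is constant and no net change is occurring,}$$

$$\frac{d[\mathrm{A}]}{dt} = c \neq 0 \quad\Rightarrow\quad \text{a straight, sloped line: } [\mathrm{A}] \text{ changes at the constant rate } c.$$

Both registrations are “straight”, but they map onto different conceptual schemas (constant concentration versus constant rate of change), which is why a single registration can be associated with more than one conceptual schema.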
For the study discussed herein, we used graphical forms to characterize and analyze the intuitive mathematical ideas students used to reason about graphs. Furthermore, our approach to analyzing students’ mathematical reasoning involved characterizing graphical forms using the emergent and static designations from the shape thinking framework, meaning we categorized students’ intuitive mathematical ideas about graphs based on whether they focused more on the graph's process or object nature (Moore and Thompson, 2015). Our use of multiple frameworks during data analysis was an effort to address the themes that emerged from the data, which led us to consider literature from different communities. The graphical forms framework afforded the ability to characterize mathematical resources activated by students, the shape thinking perspective provided insight regarding students’ engagement in covariational reasoning, and framing our discussion around mathematical narratives allowed us to consider the integration of mathematics and chemistry ideas. Thus, our combination of frameworks allowed us to focus on the mathematical resources students used and draw conclusions regarding how students combined mathematical reasoning with chemistry ideas to explain the process modeled.
The final sample that we report comprises 70 students (more information regarding our final sample size is provided below). To provide context, the prerequisites for students in this course involved senior high school (gymnasium) chemistry and a mathematics series that encompassed differential and integral calculus, including concepts such as the limit definition of the derivative and its relationship to the slope of a tangent line. The prompt given to the students provided a concentration vs. time graph, along with three short-answer questions related to the graph (provided in Fig. 1). Content validity of the assessment items was established by discussing and co-developing this prompt among a group of four researchers, and the wording in the prompt was refined after initially being piloted (in both English and Swedish) with a group of participants that included three professors, a postdoctoral researcher, and two PhD students. One of the learning objectives for the kinetics unit was for students to be able to extract information about what is happening at the molecular level from a graphical representation of a reaction. This is reflected in the design of the prompt, which focuses on conceptual understanding and requires students to integrate chemical and mathematical knowledge. Furthermore, the prompt supports students in reasoning conceptually by scaffolding them to first consider what is happening and then consider why this is happening. The first two items in the prompt ask the students to think about (a) what is being modeled and (b) the rate at different points in the graph (what); the last item (c) asks the students to explain why the rate changed (why). This is in line with the notion that in order to elicit a deeper level of thinking, instructors must ask what (descriptive, surface-level questions) before they ask why (mechanistic, explanatory questions) (Cooper, 2015).
This emphasis on conceptual understanding led the researchers to consider how students apply knowledge to unfamiliar situations, which has been identified as a key component of conceptual understanding (Holme et al., 2015). Within the resources framework, in order for students to be able to use knowledge in novel situations, resources related to the task need to be coherently organized in such a way that they are not dependent on a single context (Hammer et al., 2005). The prompt was designed, in part, to evaluate the extent to which students are able to use the appropriate knowledge in a different context. The graph in the prompt did not reflect the concentration vs. time graphs normally depicted in textbooks, and although chemically possible, it exhibited deviations from empirical results one would observe in typical laboratory work done in an introductory general chemistry course (see Fig. 2). In addition to representing a somewhat unfamiliar problem-solving scenario, the last item in the prompt reflects what is described as an “ill-defined” problem, in which the question is open-ended and does not have one correct answer (Singer et al., 2012). For this problem, students are prompted to suggest a plausible explanation for the observed graph shape, which could encompass a myriad of possible justifications. This requires students to attend to features in the graph, draw conclusions using intuitive mathematical ideas, and subsequently connect their mathematical reasoning to chemistry concepts.
Moreover, we would like to note that the prompt we developed incorporated considerations discussed in the National Research Council's (2012) conceptualization of “three-dimensional learning”, which encompasses: science practices (combination of skill and knowledge utilized by scientists); crosscutting concepts (unifying ideas that have applicability across science fields); and core ideas (fundamental principles relevant to a discipline). Using the criteria provided in the Three-Dimensional Learning Assessment Protocol (3D-LAP; Laverty et al., 2016) to evaluate whether our prompt elicited student engagement in these three dimensions, we concluded that our prompt was “three-dimensional”. It required students to engage in developing and using models (science practice), reason about patterns/cause and effect (crosscutting concepts), and consider ideas related to stability in chemical systems (core idea).
The first stage involved developing and refining the coding scheme for the students’ exam responses (J. G. R. and K. B.). Prior to coding, the exams were translated from Swedish into English by one of the authors (F. M. H.) and a random sample of the exams was subsequently back-translated by a different author (M. E.) to confirm the accuracy and quality of the translations. After analyzing responses for 50 (of the 109) students, we judged that data saturation had been reached because all student responses could be classified using existing codes. To confirm data saturation, 20 additional exams were coded, bringing the final sample that we report to 70 students. Initial analysis of the 70 exams was done through the process of constant comparison, with two researchers (J. G. R. and K. B.) coding in tandem and requiring 100% agreement for assignment of codes (Strauss and Corbin, 1990; Campbell et al., 2013). Following the coding of the translated dataset, two different researchers (F. M. H. and M. E.) applied the coding scheme to five students’ original (untranslated) exams and discussed their code assignments with the other authors (J. G. R., K. B., and M. H. T.). Based on this discussion, both teams (J. G. R./K. B. and F. M. H./M. E.) re-coded the same five exams (English and Swedish, respectively) using the refined coding scheme. Once again, all the authors met and discussed the code assignments; subsequently both teams re-coded the same five exams, as well as another five exams, which yielded a Cohen's Kappa of 0.92. Both teams then coded an additional four exams to reach a final Cohen's Kappa of 0.95 for 20% of the data corpus (14 out of 70 exams). This iterative process of both teams coding and discussing helped to refine and modify the coding scheme and the assignment of codes, which was ultimately used in the final stage of data analysis to re-code the entire dataset (J. G. R. and K. B.).
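For readers unfamiliar with the agreement statistic reported above, Cohen's Kappa compares observed coder agreement with the agreement expected by chance (this is the standard definition; the underlying agreement tables are not reproduced here):

$$\kappa = \frac{p_o - p_e}{1 - p_e},$$

where $p_o$ is the proportion of coding decisions on which the two teams agreed and $p_e$ is the proportion of agreement expected by chance given each team's code frequencies. Values of 0.92 and 0.95 therefore indicate agreement well beyond what chance alone would produce.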
The coding scheme developed into a multi-tier categorization system that was used to characterize student responses, as shown in Fig. 4. Although not depicted in the coding scheme, we also coded student responses based on whether they were correct or incorrect, that is, whether students would have received credit for their response to the item on the exam. To be clear, our primary focus involved elucidating the resources activated by students in this context. Our use of an additional layer of coding involving the correct/incorrect designation was not to catalog students’ misconceptions; instead, it served as a metric to gauge how productive student responses were and provide a way to frame the resources that were inappropriately activated or applied in this context. We were attempting to address the practical consideration regarding grading, in which responses characterized as more productive would earn students more credit on the exam and would simultaneously reflect a more normative understanding of chemistry ideas.
As shown in Fig. 4, we characterized the student responses based on the discipline-specific (chemistry vs. mathematics) resources, such as the content and reasoning the students used. This was developed through a combination of inductive and deductive analysis; the chemistry categories developed as a result of the observed student responses and the mathematical reasoning categories were modeled after the graphical forms framework and the delineation of emergent vs. static reasoning. For the chemistry categories, an “Unproductive Application” sub-category was created to encompass instances in which students activated or utilized resources in a manner that was less useful for problem-solving in this context, and a “Productive Application” sub-category was created to encompass ideas and reasoning that more appropriately addressed the prompt. It is also important to note that the chemical and mathematical reasoning categories were not mutually exclusive; responses involving both types of reasoning could qualify as engaging in mathematical narratives (the dotted line in Fig. 4). However, for the purpose of this study, our characterization of mathematical narratives focused primarily on instances in which student descriptions involved a more normative understanding of chemical kinetics. Thus, our discussion of mathematical narratives in the next section encompasses the criterion that students provided a chemically plausible explanation of the graph provided. This was an effort to narrow the scope of analysis, emphasizing reasoning that more closely reflected that of experts, in the sense of the ability to productively combine different domains of knowledge (Hull et al., 2013; Bain et al., 2018).
Benjamin: “A product, since the concentration rises. This suggests that more molecules of the substance is formed, contrary to a reactant where the substance is consumed.”
One of the limitations of our sample is that we were not able to ask probing follow-up questions to clarify students’ reasoning. However, based on how students such as Benjamin responded to item (a) we can draw some inferences about their mathematical reasoning. In order to be able to connect the graph with the idea that the amount of product increases during a reaction, it is implied that the students are able to elicit the general trend or directionality of the graph, namely, that the graph is increasing. This is a basic reasoning skill that is foundational for interpreting graphs, and we characterize this intuitive mathematical idea as the graphical form trend from shape directionality, in which the students looked at the graph as a whole and reached a conclusion regarding the graph's overall tendency to increase or decrease. In terms of classifying graphical forms as static or emergent using the shape thinking framework conceptualized by Moore and Thompson (2015), we view trend from shape directionality as an example of static reasoning. In this case, the entire graph is the registration attended to by the students, where they focused more on the general shape of the overall graph and assigned ideas to this “object”, as opposed to using emergent reasoning and conceptualizing the graph as a process or mapping of all possible input and output values. We would like to emphasize that static reasoning is not inherently unproductive, and in this context students used it effectively as a resource to solve the problem.
Given that Benjamin reasoned using the graphical form trend from shape directionality, we assert that his approach to answering the question began with intuitive mathematical ideas about the graph, followed by the assignment of chemistry ideas that are consistent with conclusions reached using mathematical reasoning. Thus, Benjamin, who represents a typical student response for item (a), described the general story communicated by the graph by integrating mathematics and chemistry ideas. This type of reasoning that seemed to involve mathematics as a starting point for engaging in modeling (i.e., discussing mathematical narratives) is consistent with what was previously observed when students engage in quantitative problem-solving (Bain et al., 2018). We argue that anchoring reasoning in mathematical ideas is particularly important for interpreting graphs, because it is through the use of intuitive mathematical reasoning that we can determine trends in the data—e.g., how rate changes in item (b)—and subsequently attribute a chemically plausible explanation for these observed trends—e.g., a particulate-level explanation of the graph's shape in item (c).
Rosalynn: “The reaction rate is the greatest after 1 minute. The slope of the curve is the steepest here and the slope shows the reaction rate. The reaction rate is the lowest after 5 minutes. The curve is flat here, and the flatter the slope the slower the rate.”
In this example, Rosalynn equated the slope of the graph to the reaction rate, with her description utilizing general terms that emphasized the general shape of specific registrations (e.g., “steepness” and “flat”). Our conceptualization of static reasoning encompasses student responses that consider the graph (or a region of the graph) as a whole by drawing attention to its shape or framing it as an object with descriptive features. This is in contrast to how others, such as Hillary, reasoned using the steepness as rate graphical form:
Hillary: “At t = 1 min the curve has the highest slope, in other words there is the greatest difference in conc. per time, the derivative gives the exact answer. Quickest. At t = 5 min the rate was the lowest since the derivative is 0, in other words time passes but the conc. does not increase, no reaction occurs.”
Similar to Rosalynn's response, Hillary used the steepness of the graph to reason about rate; however, rather than simply attending to the general shape of regions in the graph, Hillary's discussion of rate made use of reasoning that incorporated ideas related to the derivative. In this sense, we consider Hillary's reasoning to be emergent, which more explicitly considers the relationship between both variables, “conc. per time”. Comparing the static and emergent characterizations of steepness as rate, aspects of covariational reasoning are more prominent when thinking about the derivative or the formal definition of slope, because they inherently involve drawing a connection between both variables. However, the presence of words such as “derivative” in a student's response does not necessarily indicate their reasoning is emergent; rather, the defining feature of emergent reasoning is the incorporation of ideas related to covariation or the process described by correlated variables.
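Hillary's appeal to the derivative can be unpacked with the standard connection between reaction rate and the slope of a tangent line (a sketch of the underlying mathematics, assuming the graphed species is a product P and taking “rate” as the slope the students referred to; the formalism below is ours, not quoted from any student):

$$\text{rate} = \frac{d[\mathrm{P}]}{dt} = \lim_{\Delta t \to 0} \frac{[\mathrm{P}](t+\Delta t) - [\mathrm{P}](t)}{\Delta t}.$$

At t = 1 min the tangent line is steepest, so this quotient is largest; at t = 5 min the curve is flat and d[P]/dt ≈ 0, which is precisely Hillary's claim that “the derivative is 0” and that “time passes but the conc. does not increase”.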
Since the second item dealt with thinking about the rate of change, it readily lent itself to reasoning using the derivative, which led to our observation of emergent reasoning. However, in comparison to emergent reasoning, twice as many students engaged in static reasoning to address item (b). As mentioned previously, static reasoning is not necessarily unproductive; on the contrary, reasoning that focused on the shape of different regions of the curve to determine the relative slope was sufficient for responding to this question.
Fig. 5 Student responses were characterized as extraneous, relevant, and chemically plausible depending on how productive the statement was for addressing item (c), adapted from Toulmin's (1958) conceptualization of argumentation.
Bess: “The solution is a diprotic acid which means that protolysis occurs in two steps: the reaction rate has two ‘spikes’ A diprotic acid, e.g. carbonic acid donates 2 protons after complete protolysis. Because protolysis occurs when carbonic acid and its conjugate base has a solution the pKa value of which is pKa1: 6.35 and pKa2: 10.33.”
Other students even explicitly used the word “titration” in their description of the graph. This was likely based on students associating the shape of the concentration vs. time graph with a titration curve's sigmoidal shape. In this case, the prompt activated resources that may be useful for problem solving in other contexts, but were not useful for thinking about this graph. The discussion of titrations and related content can be thought of as an extreme example of viewing the graph as an object, in which students simply associated chemistry ideas with the shape.
James: “R = k[A]^a[B]^b – at the beginning there was a lot of reactants that could form products – high reaction rate. The fact that it later increased again could be due to another reaction. The products themselves became reactants and formed new product.”
Initially, James provided a reasonable explanation that the initial rate of a reaction was fast because there were more reactants available (productively applying the resource more reactant, higher rate), but when describing the point in the graph where the rate increases again (t = 10), he stated that it was the result of a second reaction. If a second reaction had occurred, it would not be communicated by the graph provided, because the y-axis has not changed. The graph shows how a specific product changes over time, and a different reaction that forms a different product would not be represented on the graph (if the product did start to react, the concentration shown in the graph would begin to drop). Although James activated a chemistry resource relevant for this problem (the resource complex reaction mechanism), its application was less productive and led to a result that was inconsistent with the graph provided. As shown in James’ response, the extraneous, relevant, and chemically plausible designations are not mutually exclusive, and different portions of a student's response could be classified into different categories (i.e., the first sentence was chemically plausible, but the remainder of the passage was only relevant).
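James's opening rate law also illustrates why “more reactant, higher rate” is productive early in the reaction but cannot account for the later increase in rate (a brief sketch assuming positive reaction orders; the inferences drawn here are ours, not James's):

$$R = k[\mathrm{A}]^{a}[\mathrm{B}]^{b}, \qquad a, b > 0.$$

Because the reactant concentrations only decrease as the reaction proceeds, the rate predicted by this expression can only decrease over time; an increase in rate near t = 10 min therefore requires an explanation beyond a single rate law of this form.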
Zachary: “What could have happened is that at t = 5 the reaction reached equilibrium but more reactant was added and according to Le Chatelier's principle the reaction forms more products to reduce the stress in the system. At t = 10, product is constantly being removed which results in a constant and even formation of the products.”
In the example provided, Zachary connected the flat region in the graph to equilibrium, discussing how Le Chatelier's principle provided insight into what could have resulted in the observed shape of the graph. From the perspective of an expert, a simpler explanation could account for the constant rate of the reaction at t = 10; nevertheless, Zachary's description is not outside the realm of possibility, with his reasoning reflecting a deeper, more detailed level of thinking that attempted to account for the distinct features of the graph.
Fig. 6 Student engagement in discussing mathematical narratives (for clarity, the graph shown above has been modified from the original version that was provided in the exam).
We also acknowledge the complex role of language in our study, which prompted us to analyze the untranslated data in addition to the translated data. First-year chemistry courses are taught around the globe and international collaborations require devising methods to handle translation of language and attentiveness to the general nature of language. Future work involves further investigating the relationship between our results and language by continuing to work through our data in its original language.
Moreover, although both static and emergent conceptualizations can be productive in some situations, this does not mean they are equally productive in all situations. In particular, there are situations where a lack of emergent reasoning would be highly limiting in terms of understanding dynamic processes. This was exemplified in our dataset when students only viewed the graph as an object, focusing primarily on the representation as a static image that had associated properties (e.g., “titration curve”). If we were to consider the example of a student who can only see (or memorize) an exponential decay (concentration vs. time) curve as one entity representing “consumption” or “first order reaction”, then the entire graph is just one static object and the dynamic and continuous processes embedded in even such a simple graph are unavailable. From an instructional perspective, we need to help students think about the whole process, “walk along the curve”, and ask themselves what is happening and how it is changing.
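To make this concrete, even a simple first-order decay curve (our own illustrative equations; the exam graph itself was more complex than a single exponential decay) encodes a continuous process rather than just a memorizable shape:

$$[\mathrm{A}](t) = [\mathrm{A}]_0\, e^{-kt}, \qquad \frac{d[\mathrm{A}]}{dt} = -k[\mathrm{A}](t).$$

Reading the curve emergently means tracking how the instantaneous rate, −k[A](t), itself shrinks as [A] is consumed, rather than labelling the whole curve “consumption” or “first order reaction” and stopping there.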
Additionally, we would like to emphasize the value in getting students to critique statements and reflect on the plausibility of responses, since this may help support students in making connections and reasoning mathematically (Becker et al., 2017). For example, following an exam (such as the one discussed herein), the instructor could provide some examples of students’ “less productive” reasoning and prompt the students to critique the responses and reason about why they may or may not be chemically plausible. As we observed in our dataset, students frequently discussed ideas that are relevant to chemical kinetics, but in some cases it seemed as though students may simply be using the correct “vocabulary” associated with this context; a closer reading of their responses indicated an incomplete understanding of related concepts, suggesting that students need more support in using the resources they have more productively, so that their reasoning better reflects an expert's understanding of phenomena.
Moreover, student engagement in discussing mathematical narratives is dependent on their ability to reason mathematically about graphs, which, as mentioned previously, is largely influenced by their ability to use covariational reasoning. In their work, Carlson et al. (2002) provided a framework describing behaviors that, when exhibited, allow a learner to be classified into different developmental stages (in a Piagetian sense) in terms of their ability to consider the different facets of covariation in the context of graphical reasoning. At the lower levels of the covariation progression is the ability to think more generally about the relationship between variables and the associated trends, and at the higher levels of the progression is the ability to apply calculus concepts such as instantaneous rate-of-change to reason about graphs. Considering the progressive levels of sophistication proposed by Carlson et al. (2002) regarding the ability to use different mathematical tools such as calculus to reason covariationally, our working definitions of static and emergent reasoning suggest that static reasoning may be more accessible to a larger group of students who are less proficient in mathematics; however, additional work is needed to better establish a relationship between static/emergent conceptualizations and mathematical proficiency. Although we reaffirm that more expert-like graphical reasoning encompasses both the process and the object perspective of viewing a graph, it is important to be cognizant of the mathematical resources available to students and to understand that students may not be able to move fluidly between both perspectives.
The results presented herein illustrate the use of our approach toward investigating students’ understanding of a graphical representation of a dynamic process. The graphical forms framework afforded a way to characterize students’ reasoning and consider the extent to which the students engaged in covariational reasoning, providing insight regarding the mathematical resources utilized by students. We encourage the community to continue to add to the literature on graphical reasoning and investigate how we can support students in providing deeper, mechanistic explanations of phenomena. Based on the extensive review of the literature done by the National Research Council, we assert that studies investigating graphical reasoning should be situated in contexts that are currently under-represented, such as advanced topics in biochemistry or physical chemistry, which likely involve the use of distinct combinations of the resources described herein (Singer et al., 2012).
Our investigation of graphical reasoning involved synthesizing constructs and frameworks from educational research done in physics (graphical forms, mathematical narratives) and mathematics (covariation, process–object reasoning, shape thinking). We find tremendous value in moving outside the limits of our discipline and considering how other fields approach thinking about the many facets of education and learning. We advocate for work that moves not just beyond our disciplinary fields, but also crosses international boundaries to expand our collective knowledge.