Supporting submicroscopic reasoning in students’ explanations of absorption phenomena using a simulation-based activity

Natalia Spitha ab, Yujian Zhang a, Samuel Pazicni *a, Sarah A. Fullington a, Carla Morais c, Amanda Rae Buchberger a and Pamela S. Doolittle a
aDepartment of Chemistry, University of Wisconsin-Madison, 1101 University Avenue, 53706, Madison, Wisconsin, USA. E-mail: sam.pazicni@chem.wisc.edu
bHumboldt-Universität zu Berlin, Institute of Chemistry, Chemistry Education Department, Brook-Taylor Str. 2, 12489 Berlin, Germany
cCIQUP, Institute of Molecular Sciences (IMS), Unidade de Ensino das Ciências, Departamento de Química e Bioquímica, Faculdade de Ciências da Universidade do Porto, Rua do Campo Alegre, P-4169-007 Porto, Portugal

Received 20th June 2023, Accepted 8th October 2023

First published on 9th October 2023


Abstract

The Beer–Lambert law is a fundamental relationship in chemistry that helps connect macroscopic experimental observations (i.e., the amount of light exiting a solution sample) to a symbolic model composed of system-level parameters (e.g., concentration values). Despite the wide use of the Beer–Lambert law in the undergraduate chemistry curriculum and its applicability to analytical techniques, students’ use of the model is not commonly investigated. Specifically, no previous work has explored how students connect the Beer–Lambert law to absorption phenomena using submicroscopic-level reasoning, which is important for understanding light absorption at the particle level. The incorporation of visual-conceptual tools (such as animations and simulations) into instruction has been shown to be effective in conveying key points about particle-level reasoning and facilitating connections among the macroscopic, submicroscopic, and symbolic domains. This study evaluates the extent to which a previously reported simulation-based virtual laboratory activity (BLSim) is associated with students’ use of particle-level models when explaining absorption phenomena. Two groups of analytical chemistry students completed a series of tasks that prompted them to construct explanations of absorption phenomena, with one group having completed the simulation-based activity prior to the assessment tasks. Student responses were coded using Johnstone's triad. When comparing work from the two student groups, chi-square tests revealed statistically significant associations (with approximately medium to large effect sizes) between students using the simulation and employing particle-level reasoning. That said, submicroscopic-level reasoning did not always provide more explanatory power to students’ answers. Additionally, we observed the productive use of a variety of submicroscopic light–matter interaction models. We conjecture that engaging with BLSim provided new submicroscopic-level resources for students to leverage in explanations and predictions of absorption phenomena.


Introduction

Light–matter interactions are central to scientific research and teaching. From the role of light in photosynthesis and other biological processes, to the study of the wave- and particle-like properties of light, to the use of spectroscopy to quantify and characterize molecules, chemistry students encounter light–matter interaction phenomena regularly during their undergraduate studies. Given the chemical nature of these phenomena, light–matter interactions can be navigated at different “levels” of Johnstone's triad (Johnstone, 1991). For example, students may consider what the color of a solution tells us about the wavelengths and amount of light it absorbs (macroscopic), how light of different frequencies can excite electrons or cause molecules to vibrate and rotate (submicroscopic), or how we can use a model like the Beer–Lambert law to relate the transmittance of light through a solution to its concentration (symbolic). Given several documented challenges in understanding specific aspects of light–matter interaction (Minter, 2019; Balabanoff et al., 2020, 2022), as well as in drawing connections among the macroscopic, submicroscopic, and symbolic levels of chemistry in general (Johnstone, 1991), it is important to investigate how students reason about such phenomena and how we can best support them in building scientifically sound explanations of absorption phenomena.

Recent chemistry education studies have identified some patterns and challenges in students’ reasoning about light–matter interaction phenomena. In addition, and to a lesser extent, previous work has explored curricular interventions that could promote desired reasoning patterns among students. A recent body of work has focused on understanding the models used by chemistry and physics students when reasoning about light–matter interactions at a quantum-mechanical level, such as the wave-particle duality of light (Ayene et al., 2011; Körhasan and Miller, 2019; Balabanoff et al., 2022), the photoelectric effect (McKagan et al., 2009; Özcan, 2015; Supurwoko et al., 2017; Balabanoff et al., 2020), and the quantization of atomic energy levels and spectra (Stefani and Tsaparlis, 2009; Dangur et al., 2014; Didiş Körhasan and Wang, 2016; Moon et al., 2018; Minter, 2019). However, less research has emphasized students’ understanding of the Beer–Lambert law (eqn (1)), a widely applicable spectroscopic model that relates the transmittance of light through a solution to parameters like molar concentration:

 
A = −log10(I/I0) = εbc   (1)
Here, A is absorbance, I and I0 represent the intensities of light exiting and entering a solution, respectively, ε is the molar absorptivity of the solute, b is the path length of the sample chamber, and c is the molar concentration.

Analytical chemistry instructors regard the Beer–Lambert law as the second most important topic in their curriculum (Kovarik et al., 2022). The Beer–Lambert law is central to UV-Visible spectroscopy, a technique which students commonly employ in the laboratory to quantify analytes in solution. The model is also fundamental to the operation of other analytical instruments, such as detectors of chromatography systems, as well as applications like imaging and photovoltaics, where the exponential profile of light intensity across a material is important to consider. Despite the broad applicability of the Beer–Lambert law, little is known about how students engage with the model, or if they interpret its symbols as a model at all. It is possible, for example, that students use the Beer–Lambert model algorithmically as an equation, solving for the different variables involved, as has been observed in other chemistry contexts (Tasker, 2014). In fact, evidence (Ricci et al., 1994; Bare, 2000; Spitha et al., 2021) suggests that the exponential intensity decay of light as it passes through a solution is often missed by students, who think that absorption occurs linearly throughout a solution (possibly due to the linear-like appearance of the Beer–Lambert law). Another challenge lies with a lack of emphasis on the submicroscopic level when discussing the basis of the Beer–Lambert law, as its derivation is discussed predominantly in mathematical terms (Spitha et al., 2021). Research has shown that students, even when skilled in mathematics, struggle to connect physical models to equations; in turn, this lack of connection hinders students from using mathematical equations in the appropriate contexts (Stefani and Tsaparlis, 2009; Moon et al., 2018; Lazenby et al., 2019).
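To make the exponential (rather than linear) character of the decay concrete, the following minimal Python sketch evaluates eqn (1) at successive positions inside a cuvette. The parameter values (ε, c, I0) are arbitrary, hypothetical choices for illustration only, not values from this study.

```python
import numpy as np

# Hypothetical parameter values for illustration only:
# epsilon in L mol^-1 cm^-1, c in mol L^-1, positions x in cm.
epsilon, c, I0 = 1000.0, 1.0e-3, 1.0
x = np.linspace(0.0, 1.0, 11)      # 0.1 cm steps through a 1 cm cuvette

I = I0 * 10 ** (-epsilon * c * x)  # intensity profile from eqn (1)

# Each 0.1 cm slice transmits the SAME fraction of the light entering it,
# so the absolute loss per slice shrinks: exponential, not linear, decay.
print(np.round(I, 3))
print(np.round(I[1:] / I[:-1], 3))  # constant slice-to-slice ratio (~0.794)
```

Running the sketch shows a constant slice-to-slice transmission ratio, the signature of exponential decay that students often miss when they expect a linear intensity drop.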

In previous work, a simulation tool (Beer's Law Simulation) and an accompanying learning activity (BLSim) were introduced for deriving the Beer–Lambert law starting from a simplified submicroscopic representation of photons (spheres) and molecules (cylinders) (Spitha et al., 2021). In the simulation, a large number of small spheres simultaneously fall through an array of cylinders and can be either transmitted through to a “detector” or captured by a cylinder along their path (ESI, Section S1). In the activity, students make predictions and evaluate them through virtual experiments in the simulation to explore how the number of photons “absorbed” depends on the size of cylinders (“absorptivity”), the number of cylinders in a “layer” of solution (“concentration”), and the number of layers of cylinders (“path length”). Subsequently, students are guided through translating these observations to mathematical models, leading to the derivation of the Beer–Lambert law. The design of the BLSim activity was, on one hand, motivated by the demonstrated potential of simulation-based activities (along with appropriate prompting) to foster connections between the submicroscopic, symbolic, and macroscopic levels (Johnstone, 1991; Williamson and Abraham, 1995; Russell et al., 1997; Tasker and Dalton, 2006; Lancaster et al., 2013; Kelly, 2014; Schwedler and Kaldewey, 2020; Kaldaras and Wieman, 2023). On the other hand, the interactivity of the simulation and the inclusion of prediction/test-type prompts aimed to support students in (re)constructing explanations around absorption phenomena, a process which has been shown to be strengthened through metacognition (Homer and Plass, 2014; Kelly, 2014; Kapon, 2017; Kelly et al., 2017). In this work, using a newly developed spectroscopy assessment, we employ a quasi-experimental design to examine how engagement with this simulation-based activity influenced students’ reasoning about absorption phenomena.

Theoretical framework

Johnstone's triangle

Johnstone's triangle (Johnstone, 1991) is a widely used framework in science education, particularly in the teaching of chemistry. This framework (Fig. 1) posits that chemistry concepts can be learned and represented in three different domains, each with its own definition and properties (Johnstone, 2000). The macroscopic domain deals with the tangible and observable properties of matter that can be seen, touched, and smelled. Examples of macroscopic properties include color, texture, and melting point. The submicroscopic domain deals with the invisible particles and processes that underlie the macroscopic properties of matter, including atoms, molecules, ions, electrons, chemical reactions, and bond formation. Finally, the symbolic domain deals with the abstract symbols and models used to represent and communicate chemical concepts. Examples include chemical formulas, equations, and graphs.
Fig. 1 Johnstone's triangle, representing the three domains of chemistry reasoning, as applied to the context of this study: macroscopic (experimental observation of light being absorbed by a solution), submicroscopic (particle-level understanding of light exciting molecules), and symbolic (the Beer–Lambert law and a light intensity profile).

According to Johnstone, while well-trained chemists are able to consider chemical phenomena using all vertices of this triangle, novice learners encounter challenges when expected to process information from all three domains at once (Johnstone, 2000). Since many introductory chemistry concepts can be examined from the perspectives of all three vertices, students can often be required to navigate all three apexes in a single lesson, which presents a significant challenge for novices (Johnstone, 2006).

Within the framework of Johnstone's triangle, the Beer–Lambert law primarily connects the macroscopic and symbolic apexes through the model of eqn (1). However, focusing solely on those two apexes, without engaging with the particulate-level models that provide an explanation for the Beer–Lambert law, can lead to students using the equation algorithmically to solve problems (Lazenby et al., 2019). The incorporation of visual-conceptual tools (such as animations and simulations) has been shown to be effective in conveying key points about the submicroscopic domain, and thus has the potential to facilitate linkages between the three vertices of Johnstone's triangle (Tasker and Dalton, 2006; Dangur et al., 2014). As a result, we expect students who have engaged with the BLSim activity to incorporate elements of submicroscopic reasoning when prompted to explain macroscopic observations and/or mathematical relationships.

Resources framework

We used the resources framework introduced by Hammer to guide our analysis of students’ explanations about light absorption phenomena (Hammer, 2004). The resources framework is rooted in the broader knowledge-in-pieces (KiP) epistemological perspective, which views knowledge as a dynamic constellation of fine-grained elements that evolve or become reorganized as someone learns (diSessa, 1993; diSessa and Sherin, 1998; Hammer et al., 2005). The resources framework further builds on KiP by suggesting that those knowledge elements (which Hammer characterizes as conceptual and epistemological “resources”) are diverse in complexity (“grain size”) and are activated in context-dependent ways. Both KiP and the resources framework emphasize the dynamic nature of cognitive structures, which challenges unitary views of the ontology of cognitive structure (such as the notion of “misconceptions” as stable concepts that can be replaced by “expert views”) (diSessa and Sherin, 1998; Hammer et al., 2005; Harlow and Bianchini, 2020). Furthermore, resources, while not intrinsically correct or incorrect, can be activated productively or unproductively for a given task: Clement et al. (1989) have highlighted that even preconceptions that could be perceived as “barriers” to learning can be leveraged productively by students as anchoring conceptions towards a specific task or learning objective. In other words, as has been proposed recently, the evaluation of the scientific accuracy of a resource activation and the characterization of its usage as “productive” or “unproductive” lie along distinct, in-principle independent, axes (Crandell and Pazicni, 2023).

The definition of “productivity” of an activated resource depends on the task with which the student engages. For example, if two students were asked to explain how the transmitted light intensity through a solution changes with an increased concentration, one student may use an energy level diagram to show how photons excite electrons to a higher energy level, while another student may draw a picture of solute particles “blocking” rays of light. Depending on the way the two students use these representations in their explanations, the second student may be able to more effectively demonstrate how the molecules act as an ensemble to reduce the intensity of light, even though the excitation-based model of the first student is more scientifically accurate. Thus, the productivity of a resource is related to the success with which a student can use it to explain or predict phenomena, even if that resource may be scientifically non-normative.

Through the lens of the resources framework, learners are viewed as engaging with a vast array of cognitive and epistemological resources, while educators aim to scaffold the presentation of content and frame questions to activate intended resources. In this study, we present students with a series of explanation- and prediction-based prompts that center on the different parameters of the Beer–Lambert law. As students engage with these prompts, which go beyond the typical types of assessments they have encountered in their course, we expect a variety of resources to be activated. For example, when asked to explain a light intensity vs. path length graph, an analytical chemistry student may have a variety of resources to draw from and use in their explanation, such as excitation and emission, energy-wavelength relationships, or a UV-Vis spectrometer. Resource activation can also affect the features of the light intensity vs. path length graph the student chooses to attend to: if a resource of “exponential decay” is activated by the prompt and graph, the student will likely attempt to explain the shape of the graph; if not, the student may explain only the overall decrease in intensity. Lastly, epistemological resources will also play a role in a student's responses. For example, a student's view of what constitutes an explanation vs. a description can influence the nature of their responses (Cooper et al., 2016). Our study explores which resources are activated when students engage with novel assessment prompts on spectroscopy, and to what extent completing the BLSim-based activity affects this resource activation.

Research questions

Drawing from Johnstone's triangle and the cognitive resources framework, the following research questions guide this work:

1. Which domains of Johnstone's triangle characterize activated cognitive resources when students are prompted to explain absorption phenomena?

2. To what extent does participation in the BLSim activity affect the distribution of resources across the domains of Johnstone's triangle in students' explanations of absorption phenomena?

In answering these research questions, our analysis of student responses also revealed a diverse range of light–matter-interaction resources that were activated when students were prompted about the Beer–Lambert law; students' use of these resources is also discussed in the present manuscript.

Methods

Setting and participants

The BLSim lab and the accompanying spectroscopy assessment discussed in this study were developed for the analytical chemistry curriculum at a large research university in the Midwestern United States. The activity and assessment were implemented in the lab curriculum of two analytical chemistry courses during the Fall 2020 semester. The two courses differed in the typical professional interests of the enrolled students (“Course 1”, targeted at health and biological science majors, and “Course 2”, targeted at engineering and chemistry majors) but had an identical (hybrid) lecture and lab curriculum leading up to the time when the BLSim activity and spectroscopy assessment were completed. Prior to engaging with these materials, all students had received formal instruction on the Beer–Lambert law (including its calculus-based derivation) and had completed one fluorescence-related laboratory experiment where an analogous linear relationship was used to determine molar concentration. The course participants were divided into two groups with a slightly different order of lab activities (Fig. 2). Students in the comparison group completed the spectroscopy assessment before participating in the BLSim laboratory activity, while students in the simulation group completed the simulation-based lab activity (as a synchronous online lab) before the assessment. The time period from the introduction of the Beer–Lambert law in lecture to the completion of the BLSim activity and the spectroscopy assessment was approximately three weeks for both groups, with the spectroscopy assessment and BLSim being administered within one week of each other (ESI, Section S2). Due to the varying number of student sections completing a lab at a given time, the two groups had an unequal number of students, with the comparison group (n = 71) having more students than the simulation group (n = 53). The representation of the two courses was similar in the comparison group (65% of group enrolled in Course 1) and the simulation group (57% of group enrolled in Course 1). In addition, a subsequent preliminary evaluation of the code counts in the comparison group (see Data analysis and ESI, Section S2) suggested that the reasoning demonstrated by students from the two courses could be classified similarly across the domains of Johnstone's triangle. As a result, the comparison and simulation groups were each considered as a whole with no distinction made between the two courses.
Fig. 2 Quasi-experimental design for this study: the spectroscopy assessment was submitted as an assignment either five days preceding (comparison group) or five days following (simulation group) a laboratory activity that incorporated BLSim.

The BLSim activity

The BLSim activity (ESI, Section S1), first piloted as a virtual lab activity in Summer 2020, is described in detail in a previous publication (Spitha et al., 2021). The activity consists of an online simulation (cylinder-molecules and falling sphere-photons, Fig. S1, ESI) and a series of guiding prompts (ESI, Section S1.2) that ask students to vary the arrangement and/or size of the “molecules,” predict and then measure the number of transmitted “photons,” and construct three simple relationships that can be combined and manipulated to derive the Beer–Lambert law. The three relationships resulting from the predict/test prompts each implicitly correspond to one of the three variables in the Beer–Lambert law. First, students vary the number of cylinders in one layer to determine that this variable is proportional to the number of photons absorbed. Second, students vary the radius of the cylinders to determine that the number of absorbed photons is proportional to the square of this variable (i.e., the cross-sectional area), which is related to the probability of a photon “seeing” a cylinder on its path. Third, students successively increase the number of layers in which cylinders are arranged to find that each molecular layer absorbs a constant fraction (but a decreasing absolute number) of photons. Based on an analysis of students’ responses from this first implementation, some prompts were revised for Fall 2020 to elicit more precise predictions and reflection on the parameters involved in the simulation (Spitha, 2021).
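The published simulation code is not reproduced here, but the layer-by-layer logic described above can be sketched as a simple Monte Carlo toy model in Python. All function names, parameter names, and values below are hypothetical illustrations, assuming a layer captures a photon with probability proportional to the cylinder cross-section it presents.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmitted_profile(n_photons, n_layers, n_cylinders, radius,
                        layer_area=100.0):
    """Photons surviving each successive layer of absorbing cylinders.

    Toy assumption: a layer captures a photon with probability equal to
    the fraction of its area covered by cylinder cross-sections.
    """
    p_capture = min(1.0, n_cylinders * np.pi * radius ** 2 / layer_area)
    alive, profile = n_photons, [n_photons]
    for _ in range(n_layers):
        # Each layer removes a constant FRACTION (but a shrinking absolute
        # number) of the photons reaching it -- the exponential profile.
        alive = rng.binomial(alive, 1.0 - p_capture)
        profile.append(alive)
    return profile

print(transmitted_profile(n_photons=10_000, n_layers=5,
                          n_cylinders=20, radius=0.5))
```

Under this toy model, doubling the number of cylinders doubles the fraction absorbed per layer, absorption scales with the square of the radius through the cross-sectional area, and each successive layer removes a constant fraction of the incident photons, mirroring the three relationships students construct in the activity.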

The prediction/test-type prompts were supplemented by additional prompts that specifically aimed to foster connections between the submicroscopic elements of the simulation and the mathematical aspects of the Beer–Lambert law (equation and exponential intensity curve). Prior to the virtual lab activity, students had been asked to ensure that they were familiar with the Beer–Lambert law and the definitions of its variables, as well as with converting between percent transmittance and absorbance. As a “pre-lab” exercise (which students had the option of revisiting and correcting at the end of the lab), students were also asked to familiarize themselves with the simulation tool and state how they thought each variable of the Beer–Lambert law was represented in the simulation. Finally, after conducting the experimental trials where the number of cylinder layers was varied, students were asked to sketch the number of photons transmitted through each layer and to describe the type of curve that arose.

Besides the derivation of the Beer–Lambert law (which was not the focus of the assessment conducted in this study), the learning objectives of the BLSim activity can be summarized as (i) describe how the intensity profile of a beam of light changes as it passes through a solution of absorptive particles; (ii) explain the changes in light intensity through a solution in terms of microscopic interactions between light and absorbing centers; and (iii) explain the individual effects of concentration (i.e., more particles for light to interact with), absorptivity (i.e., higher probability of absorption per particle), and path length (i.e., successive layers of particles receive less and less incident light) on absorbance.

Spectroscopy assessment

The spectroscopy assessment consisted of three multi-part prompts that focused on each of the three parameters affecting absorbance according to the Beer–Lambert law. The assessment was an embedded component of the analytical chemistry course with the instructional goals of (i) engaging students in authentic science practices of explanation and prediction and (ii) evaluating the BLSim learning objectives. With respect to addressing the research questions above in the phenomenographic tradition (and through the lens of the aforementioned theoretical frameworks), the spectroscopy assessment also elucidated the range of light–matter-interaction resources activated in students by prompts covering material for which no previous formal instruction had been given. The prompts were developed using evidence-centered design (Mislevy et al., 2003; Mislevy and Haertel, 2006) and evaluated in accordance with the 3D-Learning Assessment Protocol (3D-LAP) (Laverty et al., 2016). For these prompts, students were provided with different scenarios of light hitting a solution and were asked to use a model to explain and predict changes in graphs of light intensity vs. path length (Fig. 3). The prompts were prefaced by a written statement that specifically placed them in the context of the Beer–Lambert law. The design of this assessment and its 3D-LAP evaluation are provided in full in the online ESI, Section S3. Our analysis of student responses focused on four specific sub-tasks that were expected to elicit model-based explanations from the students, namely:
Fig. 3 Light intensity profiles for different explanation prompts: (a) light intensity profile whose shape is to be explained in prompt Q1. (b) Provided light intensity profile for a 0.01 M solution (solid green) and expected profile to be predicted by students for a 0.02 M solution (dashed black) in prompt Q2. (c) Provided intensity profiles for “equimolar” solutions of crystal violet (purple) and methyl orange (orange) to be explained in prompt Q3. (d) Provided intensity profile through a 1 cm cuvette (solid blue) and expected profile to be drawn by students for a 0.5 cm cuvette (dashed black) in prompt Q4.

• Q1 (explain graph shape): draw a picture […] that depicts how you think light interacts with the iron complex solution. Use that illustration to explain why the intensity profile follows the shape shown above […].

• Q2 (doubled concentration): explain or show how your illustration in [Q1] changes when a 0.02 M iron complex solution is instead [illuminated]. Then, […] predict and sketch the intensity profile through the 0.02 M solution.

• Q3 (varied solute): How is it possible that [the equimolar solutions of Crystal Violet and Methyl Orange] exhibit different intensity profiles under the same excitation conditions? Provide […] an explanation based on the molecular behavior of CV and MO, [by drawing an illustration of the light-solution interaction].

• Q4 (halved path length): [on an existing graph from an experiment with path length 1 cm], sketch the intensity profile through an identical solution placed instead in a cuvette with length b = 0.5 cm. […] Explain your reasoning for the curve you drew above.

Data collection

After initial review by our institution's Institutional Review Board (IRB), this study was considered to constitute course material evaluation (rather than human subjects research) and was therefore exempt from IRB review and oversight. Following formal instruction on the Beer–Lambert law in the lecture component of the course, all students submitted scanned copies of their responses to the spectroscopy assessment as part of a post-lab assignment for an experiment involving the spectrophotometric determination of iron. An instructor of the course (ARB) removed any identifying information from students’ submissions and shared those documents with the researchers who conducted the analysis (NS and YZ). To minimize bias, all documents were renamed prior to qualitative analysis, such that the metadata indicating whether a document was associated with the simulation or comparison group was not immediately accessible. An initial scan through student work was performed to ensure that all submissions contained responses to all the prompts. From the initial dataset of 132 submissions, 124 met this criterion for analysis.

Data analysis

Fundamentally, the goal of this study was to provide insight regarding how students reason about absorption phenomena. Following this goal, analysis of data generated by the spectroscopy assessment focused on identifying resources students used when explaining and predicting, and characterizing those resources using the domains of Johnstone's triangle. While some a priori expectations existed about which elements may be present in student responses (given the intended learning goals of the BLSim activity as well as course material covered by that time point), the coding approach followed was largely inductive. Two coders (NS and YZ) holistically examined students’ responses and inscriptions to identify singular ideas which (i) appeared to be germane to the student's explanation or prediction; and (ii) the coders considered to be evidence for inferring the activation of a particular resource. In addition, each student response to prompts Q1–Q4 was classified using mutually exclusive categories (e.g., submicroscopic, non-submicroscopic, and other) based on Johnstone's framework. Here, we emphasize that, being unable to access students’ real-time resource activation process, the coders applied their own understanding of the subject matter, knowledge of the course material, and theoretical frameworks of interest to infer and label the types of resources activated in students, using student responses as evidence. It follows that the resources reported here are not “contained” in student responses, nor are they necessarily generalizable to other prompts or phenomena. Rather, what constitutes a “resource” is highly specific to the context and lens of our study. Similarly, since the assessment prompts were newly developed, the criteria for distinguishing between submicroscopic, non-submicroscopic, or other student reasoning were developed through data-driven and prompt-specific discussions.

For prompts Q1–Q3, NS and YZ each initially analyzed a subset of submissions, of which ∼30% were coded by both. Any coding differences in the double-coded subset were discussed and the coding criteria refined until 100% agreement was achieved. Each coder then re-coded their entire set under the new agreed-upon criteria. The same process was followed for the remaining submissions, for a total of 42/124 submissions double coded. For prompt Q4, due to practical limitations, one coder first coded the entire data set, while the second coder reviewed all of the first coder's assignments and addressed any disagreements by discussing and refining the coding criteria until 100% agreement was again achieved. A third coder (CM) was invited to code 26 randomly chosen files, and the results were compared to the original coding. Discussions with the third coder indicated that flowchart tools were needed to capture the coding process for prompts Q2 and Q4, because students’ responses to neighboring prompts had to be accounted for to accurately categorize students’ reasoning in those prompts. The two flowcharts were tested on all 124 files by NS and YZ, and any disagreements in coding assignment were discussed to reach 100% agreement. As part of related iterative discussions, the prompts not requiring flowcharts were also ultimately reviewed by both coders for all 124 files and negotiated to 100% agreement.

Finally, to ensure the reliability of the coding scheme, a fourth person (SAF) coded a subset (26) of the submissions for all four prompts (five coding schemes, as prompt Q4 involved evaluating both the correctness and explanation). In determining the interrater reliability, the first set of coders (NS and YZ) were treated as one coder due to their 100% agreement. The initial (un-negotiated) agreement was moderate to high for the coding assignments in prompts Q1 and Q4 (Krippendorff's α values ranging from 0.70 to 0.92), but lower for prompts Q2 and Q3 (0.36 and 0.31, respectively). The primary source of initial disagreement for prompt Q2 (doubled concentration) was the subtlety of some of the steps in the flowchart where the drawing of the previous question had to be considered; often, single-word or small drawing choices of the student could influence the binary decisions made in the flowchart and lead to different conclusions. For prompt Q3 (varied solute), the most common source of disagreement was the collective interpretation of textual and drawn elements to determine if an answer can be classified as submicroscopic or non-submicroscopic. After the initial coding by the final coder, the three coders discussed all disagreements. For each coding assignment, a decision was made between negotiated agreement (one or both of the coders change their coding assignment) and a disagreement (the coders “agree to disagree”) (Garrison et al., 2006). The final negotiated agreement was excellent for all five prompts (Krippendorff's α values ranging from 0.91 to 1.00). Interrater reliability results before and after negotiation are summarized in the ESI, Section S5. The codes that classified the type of reasoning in students’ responses to each of the prompts Q1–Q4 are provided in the Results, while the entire codebook with examples, flowcharts, and transcriptions is summarized in the ESI, Section S4. In the code counts presented hereafter, the coding decisions by the original coders were kept for the two cases where no negotiated agreement was reached.
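For readers wishing to reproduce this style of reliability analysis, the calculation can be sketched with the third-party krippendorff Python package. The coder labels and ratings below are hypothetical placeholders, not data from this study.

```python
import numpy as np
import krippendorff  # third-party package: pip install krippendorff

# Hypothetical nominal codes (0 = submicro, 1 = nonSubmicro, 2 = other)
# from two raters over ten submissions; np.nan marks an uncoded item.
ratings = np.array([
    [0, 0, 1, 2, 0, 1, 1, 0, 2, 0],       # coders NS/YZ (treated as one)
    [0, 0, 1, 2, 0, 1, 0, 0, 2, np.nan],  # coder SAF
], dtype=float)

alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha = {alpha:.2f}")
```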

Chi-square tests of independence were performed to examine the associations between the codes assigned to student responses and completing the BLSim activity. Prior to performing inferential tests, we inspected our code count data to ensure they met the six assumptions of a chi-square test of independence (McHugh, 2013) (ESI, Section S6). All chi-square tests, associated effect sizes, and standardized residuals were computed using SPSS (version 29.0). Given that the degrees of freedom for each chi-square test were greater than one, Cramér's V was used as a measure of effect size; Cramér's V values were interpreted following Cohen's guidelines (Cohen, 1988). Standardized residuals were used to aid in the interpretation of the chi-square tests, as they provide information about which cells of the contingency table contribute to a significant chi-square: a positive standardized residual indicates that more counts are observed than would be expected by chance, while a negative standardized residual indicates fewer observed counts than expected by chance.
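As an illustration of this analysis pipeline (performed in SPSS for this study), an equivalent computation in Python might look as follows. The contingency table contains hypothetical placeholder counts, not the study's observed cell counts, and the residuals shown are the simple standardized residuals (O − E)/√E described above.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical placeholder counts (NOT the study's cell counts).
# Rows: comparison, simulation groups;
# columns: submicro, nonSubmicro, other codes.
observed = np.array([[27, 24, 20],
                     [38,  8,  7]])

chi2, p, dof, expected = chi2_contingency(observed)

n = observed.sum()
cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))  # Cramér's V

# Standardized residuals, (O - E)/sqrt(E): positive cells hold more
# counts than expected by chance and drive a significant chi-square.
std_resid = (observed - expected) / np.sqrt(expected)

print(f"chi2({dof}, n={n}) = {chi2:.2f}, p = {p:.4f}, V = {cramers_v:.2f}")
print(np.round(std_resid, 1))
```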

Results

Prompt Q1: explaining the graph shape

Prompt Q1 (explain graph shape) asked students to draw a picture of the light-solution interaction that would help explain the shape of a provided graph of light intensity versus path length. The two features of the graph that could theoretically warrant an explanation, although not prompted for explicitly, were (i) the overall decrease of the light intensity through the solution; and (ii) the shape of the light intensity profile. The graphical feature of overall decrease was the focus of a majority of students’ explanations, with a minority (∼27%) of students providing an explanation for the decreasing magnitude of the graph's slope. In terms of the resources inferred from each explanation, students’ responses could be classified generally as submicroscopic (light rays hitting molecules), non-submicroscopic (light entering and exiting the solution or mathematical reasoning combined with mentioning the phenomenon), or other, depending on the depicted model of the light-solution interaction or the absence thereof (Fig. 4). These types of reasoning were coded as submicro, nonSubmicro, and other, respectively (Table 1). Resources were coded as submicro or nonSubmicro only if they were used within an explanation, meaning that responses that did not address the phenomenon (i.e., absorption) or indicate how light intensity was represented in the drawing were categorized as other. The scientific accuracy of the resources activated did not influence this categorization of responses.
Fig. 4 Examples of (a) submicroscopic, (b) non-submicroscopic, and (c) “other” reasoning demonstrated in Q1. Transcriptions of student answers and the detailed rationale for these coding assignments are provided in the ESI, Section S4.2.
Table 1 Codes pertaining to Johnstone's triangle domains observed in student responses to prompt Q1
Q1 code Type of reasoning represented
submicro Submicroscopic reasoning such as depiction of light rays hitting particulate representations of molecules as they pass through the cuvette (rays hitting individual molecules should be shown)
nonSubmicro Macroscopic depiction of light entering/exiting the solution without showing an interaction between light rays and molecules/particles, or purely mathematical reasoning combined with mentioning of a phenomenon (e.g., absorption)
other Reasoning without a drawing of light and solution, or reasoning in neither of the above categories, or no reasoning (even in the presence of a drawing)


Fig. 5 compares the evidence for submicroscopic reasoning between the students in the comparison and simulation groups for prompt Q1 (explain graph shape). The majority of students (38 out of 53) in the simulation group used submicroscopic-level resources in their reasoning, while in the comparison group, resources were more evenly spread over the domains of Johnstone's triangle. A chi-square test was performed to test the null hypothesis that there is no association between the distribution of codes and engaging with BLSim. The results of that chi-square test, χ2 (2, n = 124) = 11.77, p = 0.003, Cramér's V = 0.31, indicated that our sample provided sufficient evidence to reject this null hypothesis. That is, there was a statistically significant association with a medium effect size between the Q1 code distribution and engagement with BLSim. Moreover, a standardized residual of +1.7 indicated that students in the simulation group were more likely to provide submicroscopic reasoning than would be expected by chance, while negative standardized residuals (−1.6, −1.0) indicated that students in the simulation group were less likely to provide non-submicroscopic or other reasoning than would be expected by chance.


Fig. 5 Distribution of student answers for prompt Q1 (explain graph shape). Left: Number of students demonstrating submicroscopic, non-submicroscopic, or other reasoning in the comparison and simulation groups. Right: Contingency table for the χ2 test examining the association between completing BLSim activity and reasoning code distribution. Standardized residuals (SR), expected counts, and observed counts are reported in each cell.

Prompt Q2: doubling the concentration

In prompt Q2 (doubled concentration), students were asked to predict and sketch a new light intensity profile for a solution of double the Q1 concentration, as well as explain or show how they would change their Q1 illustration to rationalize their new light intensity profile. Two features were expected in each response: (i) a new light intensity profile that is either more curved or less curved compared to the one provided in prompt Q1; and (ii) an explanation that used submicroscopic and/or non-submicroscopic resources. Table 2 summarizes the four types of reasoning observed in student responses to prompt Q2. While the majority of students recognized that the light intensity profile for doubled concentration should be more curved (i.e., decreasing with a steeper slope, as in Fig. 3b) than the profile provided in Q1, their responses varied in how they explained the difference caused by the change of the solution concentration. In both the comparison and simulation groups, a large proportion of students described adding more molecules to their previous submicroscopic-level illustration, or drew a new illustration with more molecules interacting with light rays (Fig. 6a). Such answers were categorized as “clearly submicroscopic” (clearSub). In contrast, some students approached this task by using the Beer–Lambert law and discussing percentage transmittance, where their illustration only depicted the incoming and outgoing light rays with no light–matter interaction shown (Fig. 6b). These responses were categorized as clearly non-submicroscopic (abbreviated as clearNonSub). Due to the nature of prompt Q2, which gave students the option of either providing a new drawing or describing how they would alter the existing one, some responses could not be clearly categorized as submicroscopic or non-submicroscopic; in some cases, submicroscopic drawings in prompt Q1 were not followed by explicitly particle-level reasoning in prompt Q2 (compare Fig. 4a and 6c from the same student), while other students mentioned “more molecules” being present in Q2 even though their previous drawing was macroscopic. For cases where it was impossible to determine if the response was approached using submicroscopic resources, we assigned the category unclear. Lastly, there were several answers that only described the shape and trend of light intensity profiles without explanation, which were assigned the noReas (no reasoning) code (Fig. 6d).
Table 2 Codes pertaining to Johnstone's triangle domains observed in student responses to prompt Q2
Q2 code Type of reasoning represented
clearSub Clearly submicroscopic reasoning: drawing more molecules in a picture with rays hitting molecules, or connecting molecule-related wording to previous submicroscopic-level picture
unclear Mentions increased interaction between the light and more-concentrated solution, without providing an illustration that shows whether the student approached the question from a submicroscopic or non-submicroscopic point of view
clearNonSub Clearly macroscopic or mathematical: algorithmic reasoning in terms of Beer's law, % transmittance, concentration, etc.
noReas Only shows illustration without explanation, or no explanation that goes beyond a description of the graph (no phenomena addressed)



Fig. 6 Examples of (a) clearSub, (b) clearNonSub, (c) unclear, and (d) noReas reasoning demonstrated in Q2. The answer shown in (c) was from the same student submission as Fig. 4a and could not be categorized as clearSub or clearNonSub. The transcriptions of each answer and the detailed rationale for these coding assignments are provided in the ESI, Section S4.2.

Fig. 7 summarizes the classification of students’ reasoning in their answers to prompt Q2. Overall, 68% of students provided clearly classifiable (clearNonSub or clearSub) drawings and explanations for the comparison between the light intensity profiles of the original and doubled concentration solution; 16% of students provided explanations with unclear classification, while the remaining 16% provided no reasoning. Regarding the simulation and comparison groups, 31 out of 53 students in the simulation group approached prompt Q2 using submicroscopic resources, while only 21 out of 71 students in the comparison group did so. Another difference between these two groups was the number of students who leveraged clearly non-submicroscopic resources, with 26 such students in the comparison group and only 6 in the simulation group. Overall, a statistically significant association with a large effect size between the Q2 code distribution and engagement with BLSim was observed: χ2 (3, n = 124) = 13.06, p = 0.005, Cramér's V = 0.33. Standardized residual data indicated that students who completed the BLSim activity were more likely to employ submicroscopic reasoning and less likely to employ non-submicroscopic or unclearly classified reasoning than would be expected by chance, when prompted to explain how a change in concentration affects a light intensity profile.


Fig. 7 Distribution of student responses for prompt Q2 (doubled concentration). Left: Number of students demonstrating clear submicroscopic, clear macroscopic, unclear, or no reasoning in comparison and simulation group. Right: Contingency table for the χ2 test examining the association between completing BLSim activity and students using submicroscopic reasoning. Standardized residuals (SR), expected counts and observed counts are reported in each cell.

Prompt Q3: varying the solute (absorptivity)

In prompt Q3 (varied solute), students were provided with intensity profile graphs for crystal violet (CV) and methyl orange (MO) and asked to explain why the intensity profiles look so different even though the solutions are equimolar. The prompt preceding Q3 in the assessment (see ESI, Section S3) asked students for a mathematical explanation (where the concept of absorptivity was to be addressed), while prompt Q3 itself asked for an explanation that included an illustration of “how they thought each solution interacted with light” and an explanation “based on the molecular behavior of CV and MO”. This scaffolding was intended to keep students from conflating the types of resources activated in each step of the assessment. Three categories were generated to describe the resources used by students in responding to prompt Q3: submicroscopic, non-submicroscopic, and other (Table 3). For responses coded as submicroscopic (submicro), many students sketched an illustration of an ensemble of molecules interacting with light inside the cuvette (Fig. 8a), with molecules of CV absorbing more light rays per molecule (e.g., due to their depicted particle size). Additionally, a small portion of students illustrated this phenomenon using a single-molecule depiction of the light–matter interaction, with CV having less transmittance (Fig. 8b), which was also categorized as submicroscopic reasoning. In comparison, for responses coded as nonSubmicro, light–matter interactions were not addressed at the particulate level, with typical responses depicting only a representation of the initial and final light intensity, along with statements of CV “absorbing more light” (Fig. 8c). Some students approached this task by leveraging the direct proportionality between absorbance and molar absorptivity in the Beer–Lambert law, which was surprising given the intentional scaffolding prompting students to differentiate between mathematical and particulate-level reasoning. Answers with no visualization shown, no explanation provided, or reasoning in neither category were coded as other (Fig. 8d).
Table 3 Codes pertaining to Johnstone's triangle domains observed in student responses to prompt Q3
Q3 code Type of reasoning represented
submicro Light is shown interacting either with a single molecule or an ensemble of molecules in each solution OR, if not, the explanation addresses how much light CV and MO absorb “per molecule” (e.g., “CV absorbs more per molecule than MO”)
nonSubmicro The different final intensities of light for the two solutions are explained/depicted, without light-molecule interaction shown
other No visualization is used in the explanation, or reasoning in neither of the above categories, or no reasoning (even in the presence of a drawing)



Fig. 8 Examples of (a) submicro using an ensemble of molecules; (b) submicro using a single molecule; (c) nonSubmicro; and (d) other reasoning demonstrated in Q3. Transcriptions and the detailed rationale for these coding assignments are provided in the ESI, Section S4.2.

Fig. 9 compares the counts of submicroscopic vs. other types of resources observed in prompt Q3 between the comparison and simulation groups of students. There was an apparent difference in the amount of non-submicroscopic reasoning (nonSubmicro and other) between the two groups of students: 39 out of 71 students in the comparison group employed nonSubmicro/other reasoning, while 17 out of 53 students in the simulation group did so. In addition, higher counts of submicroscopic resources were observed in the simulation group than in the comparison group, despite the latter having more participants, indicating that submicroscopic reasoning was more prevalent in the simulation group than in the comparison group. Overall, a statistically significant association with a medium effect size between the Q3 code distribution and engagement with BLSim was observed: χ2 (2, n = 124) = 13.13, p = 0.001, Cramér's V = 0.33. Standardized residual data indicated that students who completed the BLSim activity were more likely to employ submicroscopic resources and less likely to employ non-submicroscopic resources (as compared to what would be expected by chance) when prompted to explain how a change in solute affects a light intensity profile.


Fig. 9 Distribution of student responses for prompt Q3 (varied solute). Left: Number of students in the comparison and simulation groups demonstrating submicroscopic, non-submicroscopic, or “other” reasoning. Right: Contingency table for the χ2 test examining the association between completing BLSim activity and reasoning code distribution. Standardized residuals (SR), expected counts and observed counts are reported in each cell.

Prompt Q4: halving the total path length

In prompt Q4 (halved path length), students were provided a light intensity versus path length profile for a solution in a 1.0 cm cuvette and prompted to sketch the intensity profile for the same solution when placed in a 0.5 cm cuvette. Students were then prompted to explain their reasoning behind the curve they drew, without explicit prompting to sketch light–matter interactions. As may be expected from a lack of explicit prompting, students exclusively used words (as opposed to illustrations) to explain their reasoning. Furthermore, only five responses from the entire dataset (4 from the simulation group and 1 from the comparison group) employed resources that we would classify as submicroscopic using the criteria from prompts Q1–Q3 (e.g., “Because the path length decrease[s], the light passes [through a] shorter length of molecular layers and exit[s] the sample. The ability of molecule[s] to absorb/the amount of molecule[s] per layer does not change, so the slope is the same.”).

Despite this limited occurrence of submicroscopic-level reasoning, we considered it meaningful to further analyze the responses to prompt Q4, as we observed that student answers could be differentiated by the extent to which they focused on changes in light intensity within the solution, and not just on the difference in the final intensity of light exiting the cuvettes in the two scenarios. While all students recognized that the final intensity of light exiting the 0.5 cm cuvette would be higher, student answers differed in how they envisioned the light intensity changing within the solution. We observed that some students did not accurately depict the two intensity profiles as coinciding for the first 0.5 cm of the length traversed (Fig. 10b). These students also tended to provide explanations based on a smaller overall “amount” of analyte leading to less light absorption, without considering that the degree of absorption of light by a given “slice” of solution only depends on the concentration and absorptivity of the analyte. This line of reasoning is more consistent with a macroscopic or algorithmic understanding of the Beer–Lambert law, whereby the solution is a “black box” whose output is determined by certain parameters.
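The slice-wise logic can be illustrated numerically: evaluating eqn (1) for identical solutions in the two cuvettes shows that the profiles coincide exactly over the shared 0–0.5 cm, with the short-cuvette curve simply terminating earlier. This is a minimal Python sketch with hypothetical parameter values, not data from the assessment.

```python
import numpy as np

# Hypothetical parameter values for illustration only.
epsilon, c, I0 = 1000.0, 1.0e-3, 1.0

def intensity(x_cm):
    """Light intensity after traversing x_cm of solution, per eqn (1)."""
    return I0 * 10 ** (-epsilon * c * np.asarray(x_cm))

x_long = np.linspace(0.0, 1.0, 5)    # positions in the 1.0 cm cuvette
x_short = np.linspace(0.0, 0.5, 3)   # positions in the 0.5 cm cuvette

# Identical solutions attenuate identically per unit length, so the two
# profiles coincide over the shared 0-0.5 cm; the short-cuvette curve
# simply stops decreasing once the light exits the solution.
print(np.round(intensity(x_long), 3))   # [1.    0.562 0.316 0.178 0.1  ]
print(np.round(intensity(x_short), 3))  # [1.    0.562 0.316]
```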


Fig. 10 Examples of accurate (a) and inaccurate (b) profiles of light intensity vs. path length traversed, for two identical solutions placed in cuvettes of 0.5 cm and 1.0 cm. In some cases (c), a student's representation was unclear regarding whether they were interpreting the intensity profile accurately or inaccurately.

In contrast, the majority of the students in both the simulation and comparison groups were able to recognize that the intensity profiles for the two identical solutions completely coincided, with the light intensity in the shorter cuvette ceasing to decrease after exiting the solution (Fig. 10a). However, when asked to justify their sketched graphs, most students did not provide explicit reasoning for why the first portions of the graph were the same. Many responses attended only to the difference in the final intensities exiting the two cuvettes (onlyFinal), whereas several responses simply described the shapes of the two graphs without a physical explanation (noReas). Students who provided explicit reasoning for the graph shapes typically did so by stating that the interaction between light and the solution is initially exactly the same, or by making a connection between the “sameness” of the solutions and the rate at which light is absorbed (sameCurve). These code definitions are summarized in Table 4. Due to the absence of drawings in students' responses to prompt Q4, it is difficult for us to make a claim about students’ submicroscopic vs. non-submicroscopic reasoning for this task. However, there is a notable distinction between responses that only focused on what goes “in and out” of the solution versus responses that also addressed the rate of light extinction within the solution.

Table 4 Codes pertaining to Johnstone's triangle domains observed in student responses to prompt Q4
Q4 code Type of reasoning represented
sameCurve Addresses the unchanged interaction between light and the solution, OR an explicit connection is made between “same solution” and “same shape”
onlyFinal The different final light intensities of the two solutions are addressed in terms of phenomena or algorithmic reasoning (i.e., mentioning of absorption or transmittance), without further discussion of light-solution interaction
noReas Accurate graph with no explanation that goes beyond a description of the graph (mentioning only intensity or path length constitutes a description of the graph)


Fig. 11a summarizes the accuracy of the intensity profiles sketched by students across the simulation and comparison groups in response to prompt Q4. A detailed description of the criteria for accuracy is given in the codebook (ESI, Section S4.1). Fig. 11b further distinguishes the types of explanation provided for the accurately drawn graphs. Responses where the accuracy of the graph was unclear (such as the example in Fig. 10c) were also included in the explanation breakdown, as the students’ written explanations were sufficiently detailed to be categorized in the same manner as the explanations for “clearly” accurate graphs. Overall, there was insufficient evidence in our sample to reject the null hypotheses that no association exists between engaging with the BLSim activity and either (i) the distribution of prompt Q4 accuracy codes, χ2 (2, n = 124) = 1.32, p = 0.52, Cramér's V = 0.10, or (ii) the distribution of prompt Q4 explanation codes, χ2 (2, n = 80) = 1.48, p = 0.48, Cramér's V = 0.14.


Fig. 11 Distribution of student responses to prompt Q4 (halved path length): (a) accurate, unclear, and inaccurate comparison of the intensity profile within two cuvettes of varying length containing identical solutions. (b) Classification of students’ explanations provided for the accurately and “unclearly” compared graphs.

Range of activated resources observed across prompts

As described in the previous sections, students’ reasoning could be classified as submicroscopic (rays hitting molecules, light interacting with solutes), non-submicroscopic (light entering and exiting the solution or purely mathematical reasoning combined with mentioning of a phenomenon), or other (reasoning without a sketch of the phenomenon, reasoning outside the defined categorizations, or no reasoning). However, within the former categorizations based on Johnstone's triangle, evidence for a variety of resources relating to light-solute interactions was observed in student reasoning. Some of the resources activated in students were scientifically normative, while some of them were not. For example, when students were asked to explain the shape of the intensity profile using an illustration, the majority of students described light being “absorbed” by molecules, either by including relevant phrasing in their explanations or by labelling portions of their inscriptions. However, resources such as “blocking” (Fig. 12a) and “deflection” (or “scattering”) (Fig. 12b) of light rays by molecules were also activated in students. The scattering resource was inferred from students' writing or by students illustrating arrows changing in direction (or new arrows being drawn facing in different directions). Even though scattering is only a minor contributor to decreasing the light intensity exiting the opposite end of the cuvette, students nonetheless employed this resource productively to explain the phenomenon at hand. Another example of a different resource activation was the use of “particle size” as a stand-in for absorptivity in prompt Q3 (the comparison of the CV and MO molecules). Evidence for this resource consisted of students stating that CV is a larger molecule than MO and therefore absorbs more light (Fig. 12c). Additional resources that were encountered in student responses were “excitation” (Fig. 12d) (where energy transfer from light to the molecule was discussed), which likely comes from students’ previous experiences in chemistry or physics courses; “probability” or “possibility” (Fig. 12a) (e.g., the probability that light encounters a molecule being higher, lower, or the same across the solution), and the inclusion of spectrometer components (Fig. 12d) (light source, grating, and detector), which is likely a resource originating from how the Beer–Lambert law is introduced in the analytical chemistry course. In prompt Q3, when students were asked to explain the intensity profile graphs for crystal violet (CV) and methyl orange (MO), resources relating to wavelength such as a spectrum (Fig. 12e) or a discussion of color (Fig. 12f) were also observed in students’ responses. Although we observed many unique resources across responses to prompts Q1–Q4, resources other than “absorption” were individually observed very infrequently and could not be reliably quantified. Thus, we are unable to infer any association between the completion of the BLSim activity and the activation or suppression of these “non-absorption” resources.
Fig. 12 Examples of varied resources that were activated in students’ answers: (a) light “blocking” and “possibility” resources, (b) “deflection” or “scattering” resource, (c) “molecular size” resource, (d) “spectrometer set-up” and “excitation” resources, (e) “wavelength” resource, (f) “color” resource. The transcriptions of each answer are provided in the ESI, Section S4.5.

Discussion

This study analyzed how a simulation-based activity affects undergraduate analytical chemistry students’ reasoning about light absorption phenomena. In all four tasks, students’ explanations were not evaluated for accuracy of reasoning; instead, they were categorized and analyzed for the resources that the prompts might have activated and that students used in their reasoning. Specifically, we investigated the prevalence of macroscopic, symbolic, and submicroscopic reasoning elements in students’ explanations.

Statistically significant and meaningful associations between using BLSim and employing particle-level reasoning

As can be seen from the code frequency results from prompts Q1–Q3 (Fig. 5, 7 and 9), the proportion of students using submicroscopic reasoning (rather than non-submicroscopic or other types of reasoning) was larger in the simulation group of students than in the comparison group. Chi-square tests for these three prompts revealed statistically significant associations (with medium to large effect sizes) between completion of the simulation activity and reasoning code distributions, with more submicroscopic codes in the simulation group being a major contributor to the overall chi-square values.

These observed associations can plausibly be attributed to the simulation activity, which included a particle-based model of light–solution interaction (photon-spheres interacting with molecule-cylinders). Student responses and the subsequent inferential statistics suggest that the BLSim activity provided a submicroscopic model consisting of light interacting with solute particles (e.g., responses shown in Fig. 4a, 6a and 8a) for students to use when completing prompts Q1–Q3. Indeed, submicroscopic models with elements more directly represented in the BLSim simulation were observed in a few student responses, as shown in Fig. 13. In this example, a student from the simulation group depicted the CV and MO molecules as circles of different sizes, with CV showing a larger particle size and a lower final exiting intensity (as represented by the number of arrows). The student stated that CV “absorbs more photons than MO per molecule” and drew an explicit parallel to varying the “radius of the disks” in “last week's law” (likely meaning the BLSim activity). This response parallels a task in the BLSim activity that prompted students to vary the radius of the cylinders and measure the number of photons transmitted and absorbed. Other student responses also incorporated visual elements that resembled BLSim's simulation tool or used wording similar to the associated activity prompts. In the absence of a pre-post assessment design, and given the diversity of information that students had received about this topic in lecture, the laboratory, and previous courses, our study is unable to assess the extent and mechanism(s) by which students took up these resources from the BLSim activity, or the BLSim components critical for said uptake. However, because all other instructional factors were equivalent for the comparison and simulation groups, we can conclude that a clear association exists between completing the BLSim activity and engaging in submicroscopic reasoning in the post-assessment. Our findings are consistent with previous studies discussing the effectiveness of animations and simulations in supporting students’ mental models (Williamson and Abraham, 1995; Dalton, 2003; Tasker and Dalton, 2006; Kelly, 2014), explanations (Kelly et al., 2017), and depth of understanding (Homer and Plass, 2014) surrounding particulate-level phenomena.
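To make the particle-level mechanism concrete, the sketch below mimics the general logic of such a photon–molecule model: photons traverse thin layers of solution and each surviving photon is absorbed with a fixed per-layer probability. This is an illustrative reconstruction under our own assumptions (all parameter names and values are hypothetical), not BLSim's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative parameters (hypothetical, NOT taken from BLSim): the point is
# only to sketch the particle-level mechanism of photons being absorbed by
# solute molecules as they cross the cuvette.
n_photons = 100_000        # photons entering the cuvette
n_layers = 200             # thin slices of solution along the light path
p_absorb = 0.02            # chance a surviving photon is absorbed in one
                           # slice; in a particle model this scales with the
                           # concentration and the molecules' effective
                           # cross-section (the "disk radius" in BLSim terms)

alive = np.ones(n_photons, dtype=bool)
profile = []               # photons still travelling after each slice
for _ in range(n_layers):
    # Each surviving photon is absorbed independently with a fixed
    # probability; a constant per-layer loss fraction is exactly what
    # produces an exponentially decaying intensity profile.
    absorbed = rng.random(n_photons) < p_absorb
    alive &= ~absorbed
    profile.append(int(alive.sum()))

transmittance = profile[-1] / n_photons
analytic = (1 - p_absorb) ** n_layers
print(f"simulated T = {transmittance:.3f}, analytic T = {analytic:.3f}")
```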


Fig. 13 Example of a simulation group student's “particle size” resource activation in prompt Q3 (varying the solute) (a) and comparison with the task of varying cylinder radius in the BLSim activity (b) and (c). A transcript of the student's response is provided in the ESI, Section S4.5.

The use of non-submicroscopic (i.e., macroscopic or symbolic) resources was also inferred from student responses in both the simulation and comparison groups, with a higher fraction of students within the comparison group engaging in non-submicroscopic reasoning. Non-submicroscopic reasoning (Fig. 4b, 6b and 8c) often focused only on phenomena occurring outside of the cuvette (incoming and outgoing light intensity) or used the Beer–Lambert law algorithmically (directly/inversely proportional relationships between parameters). For students in the comparison group, it is possible that the absence of the BLSim activity before completing the spectroscopy assessments is related to the prevalence of non-submicroscopic reasoning. For students in the simulation group who engaged in non-submicroscopic reasoning, the reliance on macroscopic phenomena or mathematical relationships (instead of particle-level models) suggests that the submicroscopic-to-macroscopic and submicroscopic-to-symbolic linkages of Johnstone's triangle were either not successfully formed during the BLSim activity or not activated by the assessment prompts. To form such linkages in an explanation, instructors must provide not only sufficient time and scaffolding (Taber, 2013), but also opportunities to construct and evaluate models (Lazenby et al., 2019) or to reconstruct and reiterate explanations (Kapon, 2017). It is possible that the BLSim activity did not sufficiently frame submicroscopic representations in a way that students would consider useful for building their explanations. However, such a conclusion cannot be deduced from our data: given the influence of prompting on resource activation (see discussion below), the students in question may well have “possessed” a particle-level explanatory model for the Beer–Lambert law that the assessment prompts simply did not activate.

Productivity and scientific normativity of activated resources

The variety of resources whose activation we inferred from our data corpus included both scientifically normative and scientifically non-normative ideas. While our study and research questions did not aim to systematically characterize the accuracy of students’ ideas or the soundness of their written explanations, viewing the responses from a resources perspective highlighted an important distinction between the scientific accuracy of activated resources and their productivity within an explanation or prediction prompt. For example, as shown in Fig. 12, some students described phenomena such as “blocking” or “scattering” as the predominant interaction between the light and the solution or molecules. Scattering does not accurately represent how light interacts with a dilute solution, while blocking is an oversimplification of the effective interaction between photons and molecules. However, many students nonetheless used these resources productively to explain why the intensity of light decreases as it passes through the solution. For example, the student in Fig. 12a used the blocking idea to discuss the probability of light encountering a particle and to explain not only that the intensity of light decreases overall, but also that it decreases at a decreasing rate (“the possibility to encounter a particle stays the same, but the overall light intensity is decreasing, so that the decreasing rate of the intensity [in the graph] is decreasing”). Another example of a scientifically inaccurate but often productively used resource was “molecular size” as a stand-in for absorptivity in the comparison of two different solutes (inscriptions in Fig. 8a and 13a and wording in Fig. 12c), which can support the argument that CV molecules are more likely to absorb light than MO molecules and, in turn, explain why the intensity curve of CV decreases more steeply. While all three answers (inaccurately) described or depicted the CV molecule as bigger than MO, they differed in how productively the students used the “particle size” resource in their explanations. The response in Fig. 12c used the resource less productively, since the two cuvettes were drawn with the same exiting light intensity (number of arrows) despite the size difference between the molecules. In contrast, the particle size resource was used more productively in Fig. 8a and 13a, where the depicted size difference of CV and MO molecules was connected to both the likelihood of absorption per molecule and the final light intensity. The student in Fig. 13a also acknowledged the depicted particle size as an “analogy” for the absorption ability per molecule (as opposed to a literal representation of molecular size), writing that “this is like […] when we increased the radius of the disks” in the BLSim activity.
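Notably, the “constant probability per encounter” idea in Fig. 12a can be formalized: if each thin layer of solution removes a fixed fraction of the light reaching it, exponential attenuation, and hence the Beer–Lambert law, follows. The sketch below is our restatement in standard notation, not the student's own symbols.

```latex
% A constant chance of absorption per thin layer of solution implies that
% each layer removes a fixed *fraction* of the light reaching it:
\[
  \frac{\mathrm{d}I}{\mathrm{d}x} = -\kappa c\, I(x)
  \quad\Longrightarrow\quad
  I(x) = I_0\, e^{-\kappa c x},
\]
% so the intensity decreases at a decreasing rate, and converting to base-10
% logarithms over the full path length b recovers the Beer--Lambert law:
\[
  A = -\log_{10}\frac{I(b)}{I_0} = \varepsilon b c .
\]
```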

A “reverse” pattern (high scientific normativity, low productivity) was also observed in some student responses. For example, Fig. 12d displays a highly accurate representation of photon absorption in response to prompt Q1: an energy level diagram showing an electron being excited to a higher energy level. However, the energy level diagram and the accompanying wording, “the absorption of light increases energy (excitation),” were not further used to explain why the intensity of light decreases, or what the effect of excitation would look like for an ensemble of molecules. In a different example (Fig. 12e, prompt Q3), the student drew the absorption spectra for crystal violet and methyl orange, accurately demonstrating the complex wavelength dependence of absorption. However, in the context of the task (where monochromatic light is used), this scientifically accurate resource was not used productively to explain the differences in the intensity profiles of the two solutes. In general, our observations suggest that the resources activated by the assessment prompts were manifold and diverse (Hammer, 2004) and that a variety of resources could be applied to our intended task (e.g., explaining a graph shape). Additionally, the distinction between productivity and accuracy observed in our dataset is in line with the principles of the resources framework (the correctness of a resource is not inherent, but rather context-dependent) as well as with recent literature. Crandell and Pazicni (2023), investigating students’ activation of symmetry resources, pointed out that students may leverage resources that are productive in helping them solve problems even though the resources themselves are not necessarily scientifically normative. Our results suggest a similar scenario, in which certain scientifically non-normative resources activated by the assessment prompts were nonetheless used productively by students to explain absorption phenomena.

Influence of prompting on submicroscopic-level resource use

When comparing the assessment prompts after data collection, we noted a potential scaffolding difference. Prompts Q1–Q3 specifically asked for a picture or illustration to support students’ explanations, whereas prompt Q4 only asked students to explain the reasoning behind the curve they drew, without explicitly asking for a picture or illustration of the light–solution interaction. The prevalence of model-based reasoning in all prompts except Q4 suggests that the absence of explicit prompting contributed to the scarcity of model-based reasoning in Q4 responses. However, it could also be argued that the more complex relationship between light intensity and the path length variable (where one must consider the interaction of light with multiple layers of molecules) may have been more challenging for students to extract from the BLSim activity and depict at the particle level, even if prompted; the relationship is written out after this paragraph. A revision of prompt Q4 to directly elicit a drawing would allow us to distinguish between these two interpretations. In addition to elucidating students’ understanding of the intensity–path length relationship, incorporating a related drawing prompt (either in the BLSim activity itself or in a subsequent assessment or activity) could support students in developing mental models from the information gained from BLSim. A previous study by Dalton (2003) suggested that using student drawings as a learning strategy before and after working with animations allows students to rehearse what they have seen in the animations and encourages them to express their own understanding of them. In addition, a recent contribution by Bruce and coworkers (2022) emphasized the importance of creating representations in forming connections between the macroscopic and submicroscopic domains.
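For reference, the relationship that prompt Q4 targets can be stated explicitly in standard Beer–Lambert notation; this is our restatement of the task's underlying mathematics, not a student response.

```latex
% Halving the path length does not halve the exiting intensity; it takes
% the square root of the transmittance:
\[
  \frac{I(b)}{I_0} = 10^{-\varepsilon b c}
  \quad\Longrightarrow\quad
  \frac{I(b/2)}{I_0} = 10^{-\varepsilon (b/2) c}
  = \sqrt{\frac{I(b)}{I_0}},
\]
% i.e., the profile in the shorter cuvette coincides with the first half of
% the profile in the longer one, since the solutions are identical.
```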

Limitations

Despite the promise demonstrated here of the BLSim tool in promoting submicroscopic-level reasoning in chemistry students, it is important to consider the following limitations. First, the observation that students used more submicroscopic-level resources in their reasoning does not necessarily imply an increase in their ability to connect submicroscopic-level models to the symbolic level. As shown in the ESI, Section S7, there was no significant difference between the simulation and comparison groups in students’ tendency to connect their drawings in prompt Q1 to the graph shape. Many students from both groups limited their explanation to justifying why light intensity decreases (at all) throughout the solution, without addressing the changing slope (“flattening out”) of the light intensity profile. To explain why light intensity decreases, a submicroscopic model does not necessarily provide more nuance or meaning than a macroscopic model or algorithmic reasoning. Additional work is needed to design both the simulation and the assessment prompts to support students not just in constructing particle-level models of the absorption phenomenon, but also in using such models to construct meaningful explanations.

A second limitation of this study pertains to the degree of independence that we can assume among the student submissions we analyzed, as well as the subjectivity involved in classifying their reasoning (see Methods). Our claims about students’ resource activation stem from analyzing their written answers and drawings; these data may not have been fully representative of students’ reasoning processes while responding to the assessment prompts. Furthermore, during the semester when the data were collected, the laboratory component of the course was virtual, so we cannot know whether, and which, students worked on the assignments individually or collaboratively (as is common in other labs). Although the data were diverse enough to suggest that each student produced a distinct, individual response, students may have had the opportunity to share and discuss the BLSim tool and the spectroscopy assessment questions outside of class, thus influencing each other's answers.

Lastly, the design and implementation of the spectroscopy assessment in the iteration reported here limited the scope of conclusions that we can draw from our dataset. As discussed above, owing to the lack of explicit prompting to “draw a picture” in prompt Q4 (halved path length), no students provided sketches of submicroscopic models of how the path length affects light absorption, except for a few cases where a student addressed light–molecule interactions in writing. Consequently, our study was unable to evaluate whether the BLSim activity helps students understand the effect (or lack thereof) of the path length variable on absorbance at the submicroscopic level. We saw that the simulation-based activity promoted submicroscopic-level reasoning in the contexts of varied concentration (c) and absorptivity (ε), but its effect on students’ understanding of the third variable (b) of the Beer–Lambert law remains unexplored. Furthermore, conducting this assessment by comparing two different groups of students, instead of administering “pre” and “post” versions of the assessment before and after the BLSim activity, provides only an indirect picture of students’ learning gains while interacting with the activity. While the association between completing the activity and demonstrating submicroscopic reasoning is clearly demonstrated, the mechanism through which the activity makes new resources available to students, as well as the value of the simulation component compared with alternative forms of visual representation, remains to be determined. Finally, given the short timeframe between the administration of the activity and the assessment (one week), the present study cannot evaluate the long-term impact of participating in the activity on resource activation.

Conclusions and implications

Summary

The work presented here assessed the effectiveness of the previously published BLSim activity (Spitha et al., 2021) in engaging students with particle-level reasoning about absorption phenomena. In doing so, it has begun to address the research gap concerning students’ understanding of a widely used spectroscopic relationship, the Beer–Lambert law. In this study, we explored which resources were activated by novel spectroscopy assessment prompts and to what extent completing the BLSim activity affected this resource activation. We expected students who had engaged with the BLSim activity to incorporate elements of submicroscopic reasoning when prompted to explain how and why the light intensity vs. path length profile would change when changing the three parameters associated with the Beer–Lambert law: ε, b, and c. Our quantitative analysis of codes derived from Johnstone's triangle revealed a statistically significant association between completion of the BLSim activity and students’ use of submicroscopic-level resources in their explanations. We infer from this result that the BLSim activity likely provides new resources, particularly submicroscopic models consisting of light and solute particles, for students to use in their reasoning. Our qualitative analysis provided additional insight into students’ resource activation, revealing a variety of resources used in students’ reasoning. While many of the resources activated by the assessment prompts may not be considered scientifically normative by instructors, they were nonetheless often used productively by students to address the assessment tasks. Overall, this work has demonstrated a clear association between the BLSim activity and the activation of submicroscopic resources when reasoning about the concentration and absorptivity aspects of the Beer–Lambert law, and it has shed light on the diversity of resources students use. Further work is needed to investigate how students can be supported (by simulations and other aspects of the curriculum) in drawing connections among the submicroscopic, macroscopic, and symbolic domains and thus in building more meaningful explanations.

Implications for teaching

Simulations can be powerful tools for chemistry instructors to incorporate into their teaching (Tasker and Dalton, 2006; Tasker, 2014; Kelly et al., 2017, 2021). In our study, we showed that completing a simulation-based activity centered on the derivation of the Beer–Lambert law from a particle-level representation was associated with a higher tendency of students to employ submicroscopic-level reasoning when asked to explain or predict an intensity vs. path length graph. However, the present study cannot discern which elements of the activity are most responsible for the observed increase in submicroscopic-level reasoning, or whether the incorporation of an interactive simulation (as opposed to images or animations) is a necessary step. From the iterative design of the BLSim interface and prompts, the analysis of students’ engagement with the tool via their written responses (Spitha, 2021), and the observed effect of our assessment prompts’ wording on eliciting model-based reasoning, a more important implication becomes evident: prompting and framing are crucial both to how students engage with simulations and to how instructors can evaluate the effectiveness of curricular interventions.

Regarding the teaching of spectroscopic phenomena and the Beer–Lambert law, the findings of this study also highlight some of the diverse (and perhaps unexpected) resources that are activated in students when they are asked to explain largely unfamiliar phenomena (a particle-level explanation of exponential intensity profiles is not expected of a typical analytical chemistry student). Having provided students with a set of exercises explicitly framed in the context of the Beer–Lambert law, an instructor may expect students to primarily use absorption-related resources to explain how light interacts with the solution. However, a substantial number of students in our study appeared to consider scattering the primary phenomenon occurring, which could stem either from students’ everyday experiences or from in-lecture coverage of scattering as a source of error in absorption spectroscopy. This mis-activation of the “scattering” resource could potentially be mitigated by incorporating tasks in the assessment (or other instructional activities) that highlight the contexts (e.g., ranges of molar concentration) in which scattering and absorption predominate. In general, engaging students with unfamiliar prompts prior to instruction (to gauge the range of resources initially activated in students) or in the middle of an instructional unit (to understand how resources based on “new” material are or are not activated in certain contexts) could serve as a useful formative assessment strategy for instructors.

Author contributions

YZ and NS conducted data analysis and collaborated on the writing of this manuscript under the supervision of SP. NS developed the spectroscopy assessment in collaboration with ARB and PSD, who conducted the data collection (administering activities and assignments). ARB also contributed to the data curation (anonymization). CM and SAF contributed to establishing reliability of our coding schemes as indicated in the Methods.

Conflicts of interest

There are no conflicts of interest to declare.

Acknowledgements

The authors would like to acknowledge Drs. Olivia Crandell, Morgan Howe, and Jenna Tashiro for their constructive input throughout the data analysis portion of this project.

References

  1. Ayene M., Kriek J. and Damtie B., (2011), Wave-particle duality and uncertainty principle: Phenomenographic categories of description of tertiary physics students’ depictions, Phys. Rev. Spec. Top. – Phys. Educ. Res., 7(2), 020113.
  2. Balabanoff M. E., Al Fulaiti H., Bhusal S., Harrold A. and Moon A. C., (2020), An exploration of chemistry students’ conceptions of light and light–matter interactions in the context of the photoelectric effect, Int. J. Sci. Educ., 42(6), 861–881.
  3. Balabanoff M., Kaur S., Barbera J. and Moon A., (2022), A construct modelling approach to characterize chemistry students’ understanding of the nature of light, Int. J. Sci. Educ., 44(6), 873–895.
  4. Bare W. D., (2000), A More Pedagogically Sound Treatment of Beer's Law: A Derivation Based on a Corpuscular-Probability Model, J. Chem. Educ., 77(7), 929.
  5. Bruce M. R. M., Bruce A. E. and Walter J., (2022), Creating Representation in Support of Chemical Reasoning to Connect Macroscopic and Submicroscopic Domains of Knowledge, J. Chem. Educ., 99(4), 1734–1746.
  6. Clement J., Brown D. E. and Zietsman A., (1989), Not all preconceptions are misconceptions: finding ‘anchoring conceptions’ for grounding instruction on students’ intuitions, Int. J. Sci. Educ., 11(5), 554–565.
  7. Cohen J., (1988), Statistical power analysis for the behavioral sciences, 2nd edn, L. Erlbaum Associates.
  8. Cooper M. M., Kouyoumdjian H. and Underwood S. M., (2016), Investigating Students’ Reasoning about Acid–Base Reactions, J. Chem. Educ., 93(10), 1703–1712.
  9. Crandell O. M. and Pazicni S., (2023), Leveraging cognitive resources to investigate the impact of molecular orientation on students’ activation of symmetry resources, Chem. Educ. Res. Pract., 24(1), 353–368.
  10. Dalton R. M., (2003), The development of students’ mental models of chemical substances and processes at the molecular level.
  11. Dangur V., Avargil S., Peskin U. and Judy Dori Y., (2014), Learning quantum chemistry via a visual-conceptual approach: students’ bidirectional textual and visual understanding, Chem. Educ. Res. Pract., 15(3), 297–310.
  12. Didiş Körhasan N. and Wang L., (2016), Students’ mental models of atomic spectra, Chem. Educ. Res. Pract., 17(4), 743–755.
  13. diSessa A. A., (1993), Toward an Epistemology of Physics, Cogn. Instr., 10(2–3), 105–225.
  14. diSessa A. A. and Sherin B. L., (1998), What changes in conceptual change? Int. J. Sci. Educ., 20(10), 1155–1191.
  15. Garrison D. R., Cleveland-Innes M., Koole M. and Kappelman J., (2006), Revisiting methodological issues in transcript analysis: Negotiated coding and reliability, Internet High. Educ., 9(1), 1–8.
  16. Hammer D., (2004), The Variability of Student Reasoning, Lecture 3: Manifold Cognitive Resources, Research on Physics Education, Proceedings of the International School of Physics “Enrico Fermi.”, pp. 321–340.
  17. Hammer D., Elby A., Scherr R. E. and Redish E. F., (2005), Resources, framing, and transfer, in Transfer of Learning from a Modern Multidisciplinary Perspective, Current Perspectives on Cognition, Learning, and Instruction, Information Age Publishing Inc., pp. 89–119.
  18. Harlow D. B. and Bianchini J. A., (2020), Knowledge-in-Pieces—Andrea A. diSessa, David Hammer, in Akpan B. and Kennedy T. J. (ed.), Science Education in Theory and Practice: An Introductory Guide to Learning Theory, Springer Texts in Education, Springer International Publishing, pp. 389–401.
  19. Homer B. D. and Plass J. L., (2014), Level of interactivity and executive functions as predictors of learning in computer-based chemistry simulations, Comput. Hum. Behav., 36, 365–375.
  20. Johnstone A. H., (1991), Why is science difficult to learn? Things are seldom what they seem, J. Comput. Assist. Learn., 7(2), 75–83.
  21. Johnstone A. H., (2000), Teaching Of Chemistry - Logical Or Psychological? Chem. Educ. Res. Pract., 1(1), 9–15.
  22. Johnstone A. H., (2006), Chemical education research in Glasgow in perspective, Chem. Educ. Res. Pract., 7(2), 49–63.
  23. Kaldaras L. and Wieman C., (2023), Cognitive framework for blended mathematical sensemaking in science, Int. J. STEM Educ., 10(1), 18.
  24. Kapon S., (2017), Unpacking Sensemaking, Sci. Educ., 101(1), 165–198.
  25. Kelly R. M., (2014), Using Variation Theory with Metacognitive Monitoring To Develop Insights into How Students Learn from Molecular Visualizations, J. Chem. Educ., 91(8), 1152–1161.
  26. Kelly R. M., Akaygun S., Hansen S. J. R. and Villalta-Cerdas A., (2017), The effect that comparing molecular animations of varying accuracy has on students’ submicroscopic explanations, Chem. Educ. Res. Pract., 18(4), 582–600.
  27. Kelly R. M., Akaygun S., Hansen S. J. R., Villalta-Cerdas A. and Adam J., (2021), Examining learning of atomic level ideas about precipitation reactions with a resources framework, Chem. Educ. Res. Pract., 22(4), 886–904.
  28. Körhasan N. D. and Miller K., (2019), Students’ mental models of wave–particle duality, Can. J. Phys., 98(3), 266–273.
  29. Kovarik M. L., Galarreta B. C., Mahon P. J., McCurry D. A., Gerdon A. E., Collier S. M. and Squires M. E., (2022), Survey of the Undergraduate Analytical Chemistry Curriculum, J. Chem. Educ., 99(6), 2317–2326.
  30. Lancaster K., Moore E. B., Parson R. and Perkins K. K., (2013), Insights from Using PhET's Design Principles for Interactive Chemistry Simulations, Pedagogic Roles of Animations and Simulations in Chemistry Courses, ACS Symposium Series, American Chemical Society, pp. 97–126.
  31. Laverty J. T., Underwood S. M., Matz R. L., Posey L. A., Carmel J. H., Caballero M. D., et al., (2016), Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol, PLoS One, 11(9), e0162333.
  32. Lazenby K., Rupp C. A., Brandriet A., Mauger-Sonnek K. and Becker N. M., (2019), Undergraduate Chemistry Students’ Conceptualization of Models in General Chemistry, J. Chem. Educ., 96(3), 455–468.
  33. McHugh M. L., (2013), The Chi-square test of independence, Biochem. Med., 23(2), 143–149.
  34. McKagan S. B., Handley W., Perkins K. K. and Wieman C. E., (2009), A research-based curriculum for teaching the photoelectric effect, Am. J. Phys., 77(1), 87–94.
  35. Minter C. J., (2019), Characterization of Students’ Reasoning about Atomic Emission Spectra: A Design-Based Research Study to Improve Students’ Understanding of Light-Matter Interactions.
  36. Mislevy R. J. and Haertel G. D., (2006), Implications of Evidence-Centered Design for Educational Testing, Educ. Meas. Issues Pract., 25(4), 6–20.
  37. Mislevy R. J., Steinberg L. S. and Almond R. G., (2003), On the structure of educational assessments, Meas. Interdiscip. Res. Perspect., 1, 3–62.
  38. Moon A., Zotos E., Finkenstaedt-Quinn S., Gere A. R. and Shultz G., (2018), Investigation of the role of writing-to-learn in promoting student understanding of light–matter interactions, Chem. Educ. Res. Pract., 19(3), 807–818.
  39. Özcan Ö., (2015), Investigating students’ mental models about the nature of light in different contexts, Eur. J. Phys., 36(6), 065042.
  40. Ricci R. W., Ditzler M. and Nestor L. P., (1994), Discovering the Beer-Lambert Law, J. Chem. Educ., 71(11), 983.
  41. Russell J. W., Kozma R. B., Jones T., Wykoff J., Marx N. and Davis J., (1997), Use of Simultaneous-Synchronized Macroscopic, Microscopic, and Symbolic Representations To Enhance the Teaching and Learning of Chemical Concepts, J. Chem. Educ., 74(3), 330.
  42. Schwedler S. and Kaldewey M., (2020), Linking the submicroscopic and symbolic level in physical chemistry: how voluntary simulation-based learning activities foster first-year university students’ conceptual understanding, Chem. Educ. Res. Pract., 21(4), 1132–1147.
  43. Spitha N., (2021), Simulations as Epistemic Glue between Differential Equations and Photophysics: Layered Perovskite Carrier Dynamics and the Origins of the Beer–Lambert Law.
  44. Spitha N., Doolittle P. S., Buchberger A. R. and Pazicni S., (2021), Simulation-Based Guided Inquiry Activity for Deriving the Beer–Lambert Law, J. Chem. Educ., 98(5), 1705–1711.
  45. Stefani C. and Tsaparlis G., (2009), Students’ levels of explanations, models, and misconceptions in basic quantum chemistry: A phenomenographic study, J. Res. Sci. Teach., 46(5), 520–536.
  46. Supurwoko S., Cari C., Sarwanto S., Sukarmin S. and Suparmi S., (2017), The effect of Phet Simulation media for physics teacher candidate understanding on photoelectric effect concept, Int. J. Sci. Appl. Sci. Conf. Ser., 1(1), 33–39.
  47. Taber K. S., (2013), Revisiting the chemistry triplet: drawing upon the nature of chemical knowledge and the psychology of learning to inform chemistry education, Chem. Educ. Res. Pract., 14(2), 156–168.
  48. Tasker R., (2014), Research into Practice: Visualising the Molecular World for a Deep Understanding of Chemistry, Teach. Sci., 60(2), 16–27.
  49. Tasker R. and Dalton R., (2006), Research into practice: visualisation of the molecular world using animations, Chem. Educ. Res. Pract., 7(2), 141–159.
  50. Williamson V. M. and Abraham M. R., (1995), The effects of computer animation on the particulate mental models of college chemistry students, J. Res. Sci. Teach., 32(5), 521–534.

Footnotes

Electronic supplementary information (ESI) available: Overview of simulation tool, codebook and flowcharts with example transcriptions, complete spectroscopy assessment and 3D-LAP evaluation, interrater reliability, and a discussion of the explanation of graph shape. See DOI: https://doi.org/10.1039/d3rp00153a
Both authors contributed equally to this work.
