Using feedback loops from computational simulations as resources for sensemaking: a case study from physical chemistry

Andreas Haraldsrud *ab and Tor Ole B. Odden a
aCenter for Computing in Science Education, University of Oslo, 0371 Oslo, Norway. E-mail: a.d.haraldsrud@kjemi.uio.no
bDepartment of Chemistry, University of Oslo, 0371 Oslo, Norway

Received 17th January 2024 , Accepted 19th March 2024

First published on 25th March 2024


Abstract

Sensemaking is an important way of learning and engaging in science. Research has shown that sensemaking activities, such as questioning, hypothesizing, and model building, are pivotal in developing critical thinking and problem-solving skills in science education. This paper investigates the role of computational simulations in facilitating sensemaking in chemistry education, specifically examining how these simulations can sustain the sensemaking process. Through a detailed case study in a physical chemistry course, we explore the interplay between students' real-world experiences, theoretical knowledge, and computational simulations. This analysis reveals that computational simulations, by providing interactive and visual representations of chemical phenomena, can create a conducive environment for sensemaking, where students actively engage in exploring and resolving conceptual uncertainties. Based on these results, we argue that computational tools, when effectively integrated into educational settings, can facilitate sensemaking in science education.


Introduction

One of the most important aspects of learning science is constructing new knowledge from established knowledge. When we teach science, we try to help our students learn certain principles and facts and then use these principles to make sense of the many other disparate pieces of knowledge they have acquired.

However, sensemaking is not necessarily an easy process (Odden and Russ, 2018). It requires time, effort, and an ongoing dialogue (Ford, 2012). It is also a highly individual process, as different students will often use very different explanations to make sense of ideas. Thus, teachers who aim to support their students' sensemaking often go to great lengths to provide rich, multifaceted learning environments. One common element of such rich learning environments is the use of interactive computational simulations (Plass et al., 2012; Easley, 2020; Hunter et al., 2021). Such simulations have been explored as a tool for building conceptual understanding and supporting sensemaking in various disciplines, including physics, biology, and chemistry education (Papert, 1980; Wilensky and Reisman, 2006; Plass et al., 2012; Sand et al., 2019). For example, Hunter et al. applied the sensemaking framework proposed by Odden and Russ to analyze a collaborative gas law activity in general chemistry (Hunter et al., 2021), demonstrating how the activity led to productive episodes of sensemaking.

In this article, we will build upon these studies to examine how computational simulations can support sensemaking in chemistry education. This will be explored through a case study of students working with activities on particle interactions and equilibrium in an authentic learning situation in physical chemistry. In this study, we are interested in addressing the following research question: through what mechanisms can sensemaking be facilitated and sustained with computational simulations?

Literature review and background

Computational simulations have been valuable tools for science educators for over 60 years (Papert, 1980). Smetana and Bell (2012) have compiled a thorough review of the literature on computational simulations in science education before 2012. Here, we will first summarize some of the main arguments from this review, along with newer research in chemistry education relevant to our research question, in order to later theoretically situate sensemaking within this body of literature.

We focus on three factors related to learning in science: conceptual understanding, science process skills, and modeling. We will later draw on what these three concepts share with sensemaking to support our analysis.

Computational simulations can enhance conceptual understanding

Conceptual understanding is often contrasted with procedural understanding: conceptual understanding involves an understanding of relationships, principles and representations, while procedural knowledge involves an understanding of algorithmic and step-by-step solution strategies (Braithwaite and Sprague, 2021). Numerous studies have demonstrated that computational simulations can facilitate the development of students' conceptual understanding of science (Windschitl and Andre, 1998; Soderberg and Price, 2003; Trundle and Bell, 2010; Cannady et al., 2019; Zendler and Greiner, 2020). By providing a virtual environment where students can manipulate variables and observe the resulting outcomes, computational simulations help students visualize abstract concepts and make connections between theory and real-world phenomena. For example, a study by Özmen et al. (2009) found that students who used computational simulations to explore chemical bonding developed a better conceptual understanding of chemical bonds compared to those who received traditional instruction.

Another example of how conceptual understanding can be improved is given by Wu et al. (2001). They show how the visual nature of a computational simulation can improve students' understanding of chemical structure by making it possible to convert between multiple representations of molecules. In addition, the dynamic aspect of computational simulations is explored by Stieff and Wilensky (2003), who show how computational simulations can enhance students' conceptual understanding of chemical equilibrium. They also report an increased degree of logical reasoning when students work with computational simulations.

The research on how computational simulations affect student learning of chemistry is quite diverse. Interactive simulation tools like PhET, developed in 2002, have been explored in different contexts and at different levels of education (Lancaster et al., 2013; Moore et al., 2014). In a study from a general chemistry course, Clark and Chamberlain (2014) show how computational simulations can be used to support student modeling competence. This work is built upon and expanded by Salame and Makki (2021), who show a positive impact of simulations on student affect and attitudes towards chemistry.

Correia et al. (2019) demonstrated the benefits of using PhET simulations to understand particulate level gas behavior in secondary school. In this study, the value of using multiple representations of particulate level phenomena is emphasized. Computational simulations are shown to be valuable tools for this purpose.

A study by Ganasen and Shamuganathan (2017) shows a statistically significant improvement in students' conceptual understanding of chemical equilibrium when using computational simulations. Further, a recent study by Casa-Coila et al. (2023) shows statistically significant improvements in the learning of a range of important chemical concepts at both the microscopic and macroscopic levels.

Computational simulations promote science process skills

Science process skills are skills associated with scientific methods and procedures, such as observation, inference, experimentation, reasoning and critical thinking (National Research Council, 2006). Research on students' development of these skills shows that students who use computational simulations in science achieve results at least as good as, and often better than, those of students in traditional learning environments (Faryniarz and Lockwood, 1992; Geban et al., 1992; Lin and Lehman, 1999).

Computational simulations may also provide a mistake-friendly environment where students can experiment, make errors, and learn from them without real-world consequences. This freedom to explore and learn from mistakes promotes a growth mindset and encourages students to take risks in their learning. The interactive nature of computational simulations can captivate students' attention and promote active participation, which in turn may foster their development of science process skills (Lee, 1999; Hargrave and Kenton, 2000; Berlin and White, 2010).

Pedagogical considerations for computational simulations

As we have seen, there is a solid research foundation showing that computational simulations can be used to promote conceptual understanding and science process skills. However, Smetana and Bell (2012) also outline some key pedagogical considerations that can help ensure computational simulations are used effectively.

One important finding is that we should use computational simulations as a supplement to traditional methods, not as a replacement. This is not surprising, since researchers have argued that different methods and tools have different modalities – that is, they describe certain aspects of reality in specific ways. For instance, it is more effective to use animation than static images when trying to understand the dynamic aspect of protein synthesis or how a galvanic cell works. Conversely, when trying to understand the composition of a peptide, different static images may convey this information better than an animation. A well-written text is also often necessary to understand arguments and abstract phenomena, as well as to explain and supplement images and figures.

Some studies suggest that computational simulations are most effective when used before, not after, traditional classroom instruction (Brant et al., 1991; Winberg and Berg, 2007). One possible explanation behind this may be that students attend less to the computational simulation when it serves as a review (Smetana and Bell, 2012). Also, when using a computational simulation before traditional instruction, students may be more inclined towards exploration and inquiry-based learning. The understanding gained from computational simulations may also decrease the students’ cognitive load during instruction, thereby priming them for traditional instruction (Winberg and Berg, 2007).

There is also some evidence that the order in which students use computational simulations does not matter (Liu et al., 2008), suggesting that the importance of ordering depends on the content and the design of the activity. For example, using a simulation before a laboratory exercise could possibly make the laboratory exercise less motivating and thus less fruitful for learning. Also, some evidence suggests that students might be more motivated by working with data from traditional experiments than with data from simulations (Corter et al., 2011).

Another central finding is that students need proper support and scaffolding to learn from computational simulations effectively (Smetana and Bell, 2012). Poorly constructed animations and non-intuitive controls can lead to misunderstanding and frustration. Technical issues and difficulties may also shift the focus from learning content to handling the tool. It is therefore important to design computational simulation tools in such a way that we reduce students' cognitive load (Chandler and Sweller, 1991; Lee et al., 2004). There is also a need for proper support and guidance from the teacher in using the computational simulation effectively in a proper learning context (Roth, 1995; Eylon et al., 1996; Ardac and Sezen, 2002; Limniou et al., 2009).

Theoretical perspectives

Sensemaking

Although science education researchers have been interested in the question of how people make sense of science for many years (Dewey, 1910), in recent years this interest has generated a growing body of research on the cognitive, epistemological, and discursive elements of sensemaking (Kapon, 2017; Odden and Russ, 2018; Eichenlaub, 2019; Sand et al., 2019; Hunter et al., 2021; Odden, 2021; Wu and Yezierski, 2022; Kapon and Berland, 2023). Based on previous literature and their own empirical findings, Odden and Russ (2018) have proposed a definition of sensemaking as “a dynamic process of building or revising an explanation in order to ‘figure something out’—to ascertain the mechanism underlying a phenomenon in order to resolve a gap or inconsistency in one's understanding”. This sensemaking process has been explored in different settings and is considered useful for supporting science content learning across different contexts (Odden and Russ, 2019).

When students are engaged in sensemaking, they often experience a disquieting feeling of uncertainty or frustration. This uncertainty frequently centers on a vexing question: a challenging or puzzling gap in knowledge that stimulates students' sensemaking processes. Such questions can often arise from scenarios where intuitive or everyday understandings clash with the principles of science, thus requiring a deeper engagement and exploration to resolve the inconsistencies and develop a coherent understanding. These vexing questions may, in turn, be leveraged by educators to motivate active thinking, discussion, and exploration among students.

Sensemaking in chemistry

Identifying possible mechanisms behind sensemaking can be valuable for better supporting students' deeper understanding of complex chemical concepts, for instance the connection between micro- and macro-level phenomena. Different frameworks have been developed to understand and support sensemaking processes in chemistry education, for example the framework that Hunter et al. (2021) modified or the PedChemSense framework developed by Wu and Yezierski (2022). Frameworks have also been developed across various scientific disciplines, such as the Sci-Math Sensemaking Framework, which is designed to comprehend and facilitate students' sensemaking processes when interpreting equations within a scientific context (Zhao and Schuchardt, 2021).

The interaction between students and teachers is also important for sensemaking. In a recent study, Hamnell-Pamment (2024) demonstrated how different mechanisms of student–teacher interaction can sustain sensemaking in chemistry. The teacher's role in making connections to theory, real-world experience and different chemistry knowledge domains was shown to be important. It was also shown that different representations, such as equations or mathematical symbols, can mediate progression in the sensemaking process. Equations contain elements of abstraction, which in turn can lead to the emergence of vexing questions. We can hypothesize that simulations might provide some of the same elements of abstraction, as well as other affordances, that might support sensemaking.

Active participation is necessary for sensemaking to happen. Thus, active learning pedagogies like POGIL (Process Oriented Guided Inquiry Learning) can be a valuable strategy for sensemaking. Elements of POGIL include the construction of ideas through an iterative process, using inquiry and exploration of models as a basis for discussion (Rodriguez et al., 2020b). The POGIL approach thus shares many features with sensemaking, and we can hypothesize that POGIL might sustain student engagement in sensemaking. Hunter et al. (2021) investigated how the POGIL framework supports sensemaking, which we will elaborate on later. They also encouraged more research on applying the sensemaking framework to “investigate student's reasoning across other concepts in chemistry and across science” (p. 341), which we will follow up on.

Simulations and sensemaking

In the literature review section, we presented several examples of research on how learning-centered computational simulations can improve conceptual understanding in different scientific domains, as well as how they can promote more general science process skills. We will draw on these results to hypothesize how computational simulations can facilitate sensemaking.

Sand et al. (2019) present a case study showing how a student engaged in sensemaking in physics through a computational activity. They argue that the computational activity allowed the student to realize a gap in her understanding and to easily implement, test and critique her ideas to bridge this gap. We argue that there is one additional reason behind this engagement in sensemaking that was not addressed explicitly: the computational approach allowed the student to work hands-on with models, and to test these models continuously. Scientific modelling can contribute to sensemaking (Sands, 2021). Modelling involves the formulation, analysis and critique of representations of reality. The modelling process involves moving back and forth between different descriptions of a phenomenon, critiquing and building an explanation through the process. These are some of the same attributes associated with sensemaking (Odden and Russ, 2018; Kaldaras and Wieman, 2023).

Modelling also shares many features with science process skills, which have been shown to improve with the use of computational simulations. Kluge (2019) uses clinical interviews to show that computational simulations can be used to “negotiate a meeting point between theory, previous experience and knowledge, and be instrumental in conceptual sense-making”. This resonates well with the arguments for how computational simulations can promote science process skills, which we addressed earlier.

Different modelling practices have been explored in chemistry education. For example, authentic modelling practices were shown to support student motivation and knowledge about models and modelling in chemistry (Prins et al., 2009). Model-based reasoning has also been identified as a problem-solving strategy in chemistry, where students use and refer to models to solve complex problems (Rodriguez et al., 2020a). Thus, both reasoning about models (metamodeling) and reasoning with models have been studied in chemistry learning; the literature classifies these as distinct features of model-based learning (Lazenby et al., 2020; Passmore et al., 2014).

Sensemaking is connected to conceptual understanding, science process skills and modelling

To summarize, studies have shown that computational simulations can facilitate sensemaking and related processes like conceptual understanding, modelling and the development of science process skills. Research describing conceptual understanding, modelling, science process skills and active learning approaches like POGIL is valuable for understanding sensemaking, as these share many features with one another. We will now briefly consider these joint features, while also pointing out their distinct features.

Both sensemaking and conceptual understanding require the combination of conceptual knowledge and an active negotiation of meaning. However, when we describe conceptual understanding in science education, we often mean correct conceptual understanding (as opposed to misunderstanding). While this of course is a desired outcome of a learning activity, a correct understanding of a phenomenon is not required to engage in sensemaking. Instead, in sensemaking, conceptual resources are used to make sense of a phenomenon, whether right or wrong. Thus, sensemaking is a process for learning and conceptual understanding is a product of a learning process.

Science process skills and sensemaking share a foundational relationship in scientific inquiry, both emphasizing the understanding and application of scientific concepts and principles through active engagement, critical thinking and iterative problem-solving. That is, science process skills involve critical thinking, processing information and solving problems (Darmaji et al., 2019). These are all dynamic skills, iteratively drawing upon different resources to form conclusions. Iterative inquiry, where students move back and forth between arguments, is also an important aspect of sensemaking. However, while science process skills are focused on the methodology and the “how” of science, sensemaking is more about the “why” of science, and is not bound to any specific methodology.

The last factor we consider here is modelling. Modelling is an iterative process, requiring the evaluation and (re-)formulation of models by comparing them with data from simulations, experiments or observations. This iterative process is also a very important feature of sensemaking. Sensemaking can involve modelling but is more broadly defined as a process where the integration and interpretation of meaning is more important than the construction of models.

We will build upon these results to further examine the mechanisms behind how computational simulations facilitate and sustain sensemaking.

Sensemaking vs. answer-making

Sensemaking requires persistence in this iterative process of trying to “figure something out,” which stands in contrast to answer-making, a process wherein students try to reproduce knowledge to get an answer as quickly as possible (Chen et al., 2013). The two processes also have different goals: the end goal of sensemaking is explanation and understanding, while the end goal of answer-making is an answer or solution to a problem.

The general difference between sensemaking and answer-making is summarized in Fig. 1, adapted from Odden and Russ (2018) and Chen et al. (2013). First, students engage in sensemaking when they identify and address gaps in their knowledge. To resolve this issue, they choose to generate an explanation. This explanation building is often iterative and goes on over time. In this process, they might draw upon different cognitive resources from their discipline or across disciplines (Kaldaras and Wieman, 2023).


Fig. 1 The difference between sensemaking and answer-making, adapted from Odden and Russ (2018) and Chen et al. (2013), simplified by Hunter et al. (2021). Further simplified by the authors of this article.

Hunter et al. (2021) operationalized this difference, arguing that there were certain distinguishing characteristics of student context and discussion which can be used to differentiate sensemaking and answer-making. These characteristics are:

(1) A feature of the question context that allows real-world experiences to be used as part of the negotiation of meaning. For example, students might use their experiences with falling objects to reason about the sedimentation of particles in a gravitational field. These experiences might also be drawn from laboratory exercises, e.g. “The substances might separate into different phases as we observed in the organic lab”.

(2) Explanation building with collaborative construction and critique. Here, the process of going back and forth in a discussion through different arguments can be contrasted with the more straightforward answer-making process, where arguments aren’t weighed, combined or contrasted to construct new meaning. In this process, we often see the emergence of vexing questions that guide and sustain the explanation building.

(3) The quality of explanation: for example, sensemaking tends to feature stronger and more coherent claims, evidence and reasoning than answer-making (McNeill et al., 2006; Hunter et al., 2021).

Methods

Research setting

The data for this research was gathered during collaborative discussion sessions within a physical chemistry course held at the University of Oslo (UiO), Norway, during the fall semester of 2021. The course comprised a diverse group of participants, including students specializing in chemistry, material science, and chemistry education. A total of 63 students were registered for the course. The physical chemistry course consisted of two 90-minute lectures and one 90-minute discussion session per week; during discussion sessions, students primarily tackled problem sets aligned with the week's lecture content. Neither lectures nor discussion sessions were mandatory to complete the course. The students were divided into two groups for the discussion sessions, and approximately 20 students participated in each session. Sessions were facilitated by chemistry bachelor students serving as learning assistants (LAs).

Based on our study objective of understanding how sensemaking can occur in authentic educational settings, data was collected in situ, i.e., without specific intervention from the research team. Thus, video recordings were made of regular group sessions within the physical chemistry course. The organization into groups of 4–5 students was facilitated by the LAs.

The LAs had completed a Learning Assistant Program before teaching this course. Thus, they had been trained to scaffold student learning by asking follow-up questions and by giving small hints instead of the solution when students were stuck. They kept this supporting role throughout the whole session, facilitating discussions and encouraging the students to reason about their answers.

The students had already worked with problems related to gases (ideal, real and the kinetic model of gases) and enthalpy in previous group sessions. They had been exposed to particulate-level thinking through the kinetic model of gases in the present course, as well as some particulate-level models in kinetics from general chemistry the previous year. The exercise they worked on in the group session studied here built upon this general understanding of particles but extended it into the context of the physical chemistry course.

The study PI maintained a non-intrusive stance during data collection, while the learning assistants retained their customary role of aiding the students. Exercises used in the sessions also originated from the teacher responsible for the group activities, rather than the research team.

Description of the computational simulation activity

The activity featured in this study aims to help students develop conceptual understanding of particle dynamics and equilibrium, and is based on the Python module BubbleBox (v.0.1.4), which is a free educational module in the Hylleraas Software Platform, developed at the University of Oslo (“Hylleraas Software Platform—Hylleraas,” n.d.). BubbleBox is a module for creating molecular dynamics simulations, and contains algorithms for setting up systems, interactions and forces, advancing the system in time, and analyzing the system. The system can be set up manually by giving masses, positions and velocities to all particles, or one can explore a set of pre-defined systems, like ideal gases, attractive gases, FCC lattices, semi-permeable membranes and harmonic chain systems. Most systems can be set up and visualized in both two and three dimensions. All interactions with the module are done through Python commands.

Since students can either use pre-defined commands or write commands and calculations themselves, BubbleBox can be adjusted to different levels of mathematics and programming experience. For instance, to advance the system in time, one can either write one's own numerical integration code or call upon a pre-defined Python function. As such, BubbleBox has an adjustable degree of “black box” coding. In the exercise featured in this study, BubbleBox was used as a more-or-less black-box simulator, although students had the option to examine the code behind the different functions.
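To illustrate what "writing one's own numerical integration code" might look like at the non-black-box end of this spectrum, the following is a minimal sketch of a velocity Verlet time step in plain NumPy. It is a generic illustration under our own assumptions; it does not reproduce BubbleBox's actual functions or signatures.

```python
import numpy as np

def velocity_verlet_step(pos, vel, force_fn, mass, dt):
    """Advance an N-particle system one time step with velocity Verlet.

    pos, vel : arrays of shape (N, dims)
    force_fn : function mapping positions to forces of shape (N, dims)
    mass     : scalar particle mass (for simplicity)
    """
    f = force_fn(pos)
    # Update positions using current velocities and accelerations.
    pos_new = pos + vel * dt + 0.5 * (f / mass) * dt**2
    # Update velocities using the average of old and new forces.
    f_new = force_fn(pos_new)
    vel_new = vel + 0.5 * ((f + f_new) / mass) * dt
    return pos_new, vel_new
```

A student at the other end of the spectrum would instead call a single pre-defined function and treat this logic as a black box.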

BubbleBox is built to facilitate exploration by being simultaneously flexible and easy to use. It is also designed to help users bridge their understanding of the microscopic and macroscopic world, and to make it possible to visually explore the effects of mathematical models. These affordances make this tool potentially useful for sensemaking in ways we will address through our analysis.

In the present case, the students used pre-defined functions to simulate the dynamics of a two-dimensional fluid in which intermolecular interactions are modeled by the Lennard-Jones potential. The activity was framed as a numerical experiment, with analogies drawn to traditional wet labs.
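For readers unfamiliar with this model: the Lennard-Jones potential, V(r) = 4ε[(σ/r)^12 − (σ/r)^6], is strongly repulsive at short range and weakly attractive at longer range. The following minimal Python sketch (our illustration, in reduced units with parameter values not taken from the exercise) shows the form of the potential:

```python
import numpy as np

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential in reduced units."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6**2 - sr6)

# The potential minimum lies at r = 2**(1/6) * sigma; at shorter
# separations the repulsive term dominates. This is why simulated
# particles appear to turn around before they visibly "touch" --
# the observation that sparks the students' discussion below.
r = np.linspace(0.9, 3.0, 200)
v = lennard_jones(r)
```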

First, students were asked to set up a system and visualize its state. They did this by running a code cell with existing code in their Jupyter Notebook. The resulting output is shown in Fig. 2.


Fig. 2 The state of a two-dimensional system in BubbleBox is shown as particles with vector arrows denoting their velocity.

After setup, the students ran their simulations by running another pre-defined cell of code and observed the system as it evolved in time. As such, the activity was framed as an open inquiry task where students ran pre-made code and used it to discuss properties of the system. These simulations then formed the basis for a series of questions related to the dynamics of the system. The focus of the first activities was on how the particles interacted with each other and the walls of the container. The first discussion questions were designed to give the students room to interpret the system and build a mental model of what it could represent:

1. What happens when the molecules collide?

2. How do the molecules interact with the walls?

3. Do the molecules move in straight or curved lines?

4. Can you find pairs or groups of molecules clustered together?

5. In what direction does the “arrow of time” point? Where does it start? Discuss by using the system in the simulation.

The numerical experiment was designed to facilitate inquiry and discussion, rather than teaching the students a specific concept. The exercises were followed by more traditional lectures and exercises on these themes to help the students consolidate their knowledge.

The questions and code were distributed through a Jupyter Notebook environment as a single document where students could read about relevant theory, write, discuss, and run the computational simulations.

Data collection

The primary data for this study consisted of video recordings of one of the group sessions. Out of a total of five groups, three were video recorded simultaneously, and the other two were recorded later in the same week. Notably, two of these recordings were excluded from subsequent analysis due to excessive background noise, which rendered transcription challenging. Thus, the data corpus consisted of three recordings, encompassing a total of n = 11 students (two groups of four and one group of three students). The recording and data analysis were approved by the Norwegian Centre for Research Data (SIKT), and every participant signed a consent form agreeing to be video recorded.

We will present a case based on one of the groups in the sessions to give a fine-grained description and analysis of the ways computational simulations affect sensemaking. Since there is a limited amount of science education research on how computational simulations can facilitate sensemaking, a case study is a suitable method for exploring and describing possible relationships and mechanisms in detail (Bassey, 1999; Grauer, 2012). Case studies are not intended to be generalizable, but they can provide important insight into the complexity of educational contexts (Hamilton and Corbett-Whittier, 2013). The episode presented below was chosen both due to the high prevalence of sensemaking behavior and because the students address a fundamental concept that is not necessarily covered in traditional teaching: the question of “What is a collision between particles?”

Analysis

Prior to analysis, the data was transcribed, and all student names were replaced with pseudonyms to render it anonymous. The original data is in Norwegian, so all excerpts in this article have additionally been translated into English. Translation validity was checked by the second author, who is a native English speaker.

The analysis proceeded in three phases: first, we used a coding scheme based on the sensemaking epistemic game provided by Odden and Russ (2018), as adapted by Hunter et al., to identify possible episodes of sensemaking. Second, we evaluated each candidate episode of sensemaking according to the criteria previously discussed. This was done to further clarify the distinction between sensemaking and answer-making. The more criteria an episode met, the stronger the degree of sensemaking identified in the episode. Third, we identified the resources responsible for initializing or sustaining the sensemaking process. These resources could, for example, be a question in the exercise, an output or input of a computational simulation, students' theoretical knowledge, or prompts from a learning assistant.

The coding scheme was discussed in the research group before coding was done by the first author. The applied codes were then discussed and refined by the first and second authors. Only minor changes were made to the applied codes after the discussion, which suggests general agreement on the application of codes. Most of the changes were due to imprecise use of the codes “evidence” and “reasoning”, but after clarifying the distinction, both researchers agreed on their use. After the discussion, the explanations in the coding scheme were further clarified, but no codes were added or removed. The final coding scheme is summarized in Table 1:

Table 1 Coding scheme for identifying sensemaking

Sensemaking codes
Entry condition: Initial voicing of a gap in understanding or a question leading to discussion.
Explanation building: Students work together to iteratively build, compare and critique arguments to resolve the gap in understanding. They can also use different resources as part of their argument, e.g. figures, computational simulations or theory.
Resolution: Explicit voicing of a conclusion to the entry condition, backed up by previous arguments.

Quality of explanation codes
Claim: A statement of a possible fact or observation.
Evidence: Information and arguments supporting a specific claim.
Reasoning: Explicit connections and voiced reasoning between claim and evidence.

Initiator codes
Discussion resource: The external resource used to initialize or sustain a sensemaking process, for example a simulation, text or figure.


Codes were applied to sentences or groups of sentences. The sensemaking codes did not overlap with each other, but the initiator and quality of explanation codes were applied on top of the sensemaking codes to provide more depth in the analysis. An example of how the coding was done in MaxQDA 2022 is provided in Fig. 3. Code names and colors are marked on the right side of the text, and coded segments are colored. Applying several codes to a segment results in mixed colors in the text.


Fig. 3 Example of applied codes in MaxQDA 2022.

About 20% of the data was coded by an additional researcher (the second author), including the case presented here. To make sure that the same segments were coded by both researchers, coded segments were marked without showing the codes of the first researcher. Thus, the names of codes were removed, and all colors were changed to the same grey color. Cohen's kappa was calculated with MaxQDA Pro Analytics 2022, giving a kappa value of κ = 0.86, indicating strong agreement (McHugh, 2012).
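For reference, Cohen's kappa is computed as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed proportion of agreement between coders and p_e is the proportion of agreement expected by chance; values above roughly 0.80 are conventionally interpreted as strong agreement (McHugh, 2012).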

Results

The presented case shows how a computational simulation is used in combination with real-world experience and theory to build a mental model of a system and sustain a sensemaking process featuring different questions and phenomena. Although evidence of sensemaking was present in each group's discussion, the episode we describe features an important vexing question that seems to sustain sensemaking across several discussions throughout the first part of the problem-solving activity. By showing how these discussions are leveraged, we aim to illuminate the role of computational simulations in supporting sensemaking.

When first introduced to the exercise, the students read the introduction thoroughly, requested help with some technical issues from the learning assistant, and started discussing the prompts. We call the students Lisa, Karen, John and Alex.

After running the computational simulation, the students were asked to discuss what happens when the molecules in the simulation collide. Alex tried to answer the question, but Lisa changed the focus of the discussion based on what they had seen in the simulation:

19 Alex : If we see what happens when they collide, they just go their separate ways. Change direction.

20 Lisa : But I feel like they never collide.

21 Alex : Yeah, but there's a collision [points to screen].

22 Lisa : Okay, is that a collision?

23 Alex : Yes, there, there! [points to screen]

Here, we see that the visualizations produced by the computational simulation lead the students to question whether molecules do, in fact, actually collide. Since the simulation shows particles interacting through weak van der Waals forces, the particles never touch: repulsion pushes them apart when they get too close. When Lisa says “But I feel like they never collide.” (line 20), she draws upon the recent observation from the computational simulation, where the particles do not touch, and experience from real-world observations. In the macroscopic world, we do not observe any space between colliding objects. Since the exercise explicitly claims that the particles collide, the simulation introduces a gap in Lisa's understanding of what a collision is. This acts as a vexing question based on real-world experiences that come into conflict with the particulate-level simulation. However, this question is presumably quickly resolved by Alex pointing at the screen and showing Lisa that there are indeed collisions in the computational simulation. Thus, the question is seemingly not a vexing question that initiates sensemaking. But, as we shall see, although it was not immediately pursued, it set the students up for sensemaking in a subsequent episode.

After apparently solving the issue of whether there were collisions or not, the students returned to the discussion prompt:

24 Lisa : Okay, what happens is that they…

25 Alex : Changing direction. Do they change speed too?

26 Karen : Yes, they change…

27 Alex : How do you see that?

28 Lisa : Do you see them coming towards each other, and then quickly moving apart? And then the speed increases when they are moving apart. Or they go slower.

29 Alex : Yes, it depends.

30 Lisa : Yeah, so they […] sort of. So, they change speed. [points to screen]

31 Karen : Yes, we are at [exercise] 1? That they change speed and direction?

32 Alex : Yes.

33 Karen : Next question.

Lisa presents arguments through the visual output of the simulation (line 28). Alex seems hesitant and gives a vague response to this argument (line 29), and the students seemingly agree about what happens when the particles collide (lines 30–33). The discussion could have ended here, leading us to conclude that the students engaged in answer-making, but as we shall see, the vexing question posed by Lisa is sustained by the simulation and leads to further discussions.

Immediately following the quick conclusion on the first question, Alex read the next question: “How do the molecules react to the wall?”. He then tried to answer, which prompted Karen and Lisa to pitch in:

36 Alex : How do the molecules react to the wall? Yes, how do they react to the wall? They repel? Yes, so they collide, and…

37 Karen : Collide and… Okay, let's look at the speed, then. If it changes. They do not change speed. Do they?

38 Lisa : No, but. It doesn’t look like they change speed, but they change…

[…]

40 John : There is also the fact that they come at different speeds. If someone comes at 10 km h −1 and someone comes at 50, then they will go differently [clashes palms together]. While there they just hit the wall, and then [shows collision with his hand, illustrating the particle, and his palm on the other hand, illustrating the wall, and demonstrates that the particle has the same speed before and after the collision]

Here, Karen and Lisa started by evaluating the output from the computational simulation (lines 37–38). John followed up with an analogy, drawing on both kinaesthetic resources and real-world experience of collisions (line 40). They do not address the vexing question about collisions here, but instead turn to the question of how the molecules react with the wall, iteratively drawing on real-world experiences (e.g. John's argument, line 40) and the simulation (e.g. “Okay, let's look at the speed”, line 37) as resources for sensemaking. When interacting with the walls, the particles in the simulation touch the walls and bounce back elastically in the opposite direction. Since there is no visible space between the walls and the particles, the students do not discuss whether there is a collision with the walls, as they did with the particles.

Building on this line of reasoning, Karen used theoretical considerations about ideal gases to evaluate the system:

41 Karen : But we see here. Or something like that. I’ve read a bit [laughs]. But if you think of an ideal gas, with collisions and such, then the kinetic energy must be conserved. And then the speed must be maintained.

42 Alex : But is this an ideal gas?

[…]

45 Karen : But the kinetic energy is the mass times the speed squared. [everyone nods]. When they collide with each other, it's something else. But with the wall, it shouldn’t influence them. Or when it is an ideal gas, then. Then one assumes elastic collisions.

46 Alex : Yes, so that they are unaffected by the collision.

47 Karen : Yes.

Here, Karen drew on existing knowledge, which she tried to fit into the group's interpretation of the computational simulation (line 41). They had previously read that this was a simulation of a Lennard-Jones fluid, but their strong theoretical knowledge about ideal gases inspired them to change the premise of the model. Their discussion thus drew on several different resources—theory, real-world knowledge, and the output from the simulation—in a conflicting negotiation of meaning. Although the simulation does not show an ideal gas, their theoretical knowledge helps them contrast different arguments related to the simulation. This type of discussion is typical of sensemaking, as we have previously discussed.

Next, John revisited the question that Lisa posed earlier: do the molecules collide at all? This vexing question is reintroduced by John (line 48) due to conflicts between his perception of a collision and the collisions in the simulation, much as it was introduced by Lisa earlier.

48 John : But they don’t meet each other here. Just almost.

49 Alex : It's probably […] the electron cloud and.

50 Lisa : Yeah, I was wondering if we should just think of it as a collision or if we should think of it as not hitting each other.

51 John : We think of it as a collision.

52 Lisa : Yes, we think of it as a collision.

53 Karen : Yes, because first of all, they don’t hit each other 100%.

54 Lisa : No. But I don’t know.

55 John : No, but they collide anyway.

56 Alex : But should we assume they do?

57 Lisa : We don’t know if it's the model that's made in such a way that they shouldn’t [shows collision with hands].

Here we see that even Alex, who had previously tried to convince Lisa that the particles collide, was now negotiating meaning with less certainty than before (line 56). This uncertainty is introduced by the simulation and is essential in creating gaps or inconsistencies in students' knowledge, which is required for students to engage in sensemaking. Since they are once more discussing whether the molecules collide at all, the other conclusions they have made about the collisions (e.g. that the particles change speed and direction when they collide) are now part of the discussion again.

The students also introduced a new perspective to their discussion by critically evaluating the models underlying the simulation (line 57). This discussion of limitations and affordances of the model, in turn, affected the cognitive resources that the students used in their negotiation of meaning.

We have now seen how the students’ real-world knowledge of macroscopic collisions is incompatible with the output from the simulation, and the simulation is also in conflict with the students’ understanding of ideal gases. Both these conflicts lead the students into a sensemaking process where they negotiate meaning iteratively with the goal of addressing a vexing question (“do the molecules collide?”) and related questions.

Next, the students introduced an additional resource into their discussion: the concept of forces.

58 Karen : Yes, but when you read about the system, it's a two-dimensional Lennard Jones fluid. But then it's repulsive and yes…

59 John : But wouldn’t it go without saying that they collide when they change direction? It is a collision, even if there are attractive and repulsive forces.

60 Karen : Yes. [thinks a long time].

61 Lisa : We can write that, then. That there are attractive and repulsive forces. Interactions.

62 Alex : But shouldn’t it be an ideal gas?

Here, John posited that despite the particles not physically “touching”, they still collide due to forces (line 59). Again, the vexing question about collisions leads the students into sensemaking by introducing uncertainty and stimulating them to come up with additional evidence. Upon reflection, Karen and Lisa agreed (lines 60–61). However, Alex reintroduced the theory of ideal gases as a counterpoint (line 62). The infusion of theoretical insights aided their sensemaking, deterring premature conclusions and thereby sustaining the process.

Having reached a consensus on the occurrence of collisions, the students now delved into the underlying assumptions of the model and their implications for the behavior of the particles in the simulation.

63 Karen : No, not necessarily. It was something that I guessed. But.

64 Alex : Maybe we’re meant to find out.

65 Karen : In an ideal gas, there are no repulsive or attractive forces between the molecules. Theoretically.

66 John : They move in straight lines, then.

67 Lisa : Yes.

68 Karen : Yes, that is…

69 Alex : And if they only move in straight lines, then they cannot be affected by repulsive and attractive forces.

70 Karen : No, that's true. It's just the collisions we see.

71 Alex : Yes, and when they move. There are no repulsive… Or, no. [everyone writes]

Here, the students utilized their understanding of the behavior of ideal gases and the dynamics of such a system to interpret their observations of the computational simulation (lines 65–71). This engagement with the simulation served to focus their attention on the limitations of the ideal gas law, contrasting the implications of that model with the behavior they were observing. The simulation shows particles moving in lines with a slight curve when they approach other particles. Even though they do not interpret this interaction correctly, they still use different resources to try and make sense of what happens.

In the final part of the episode, Lisa sought further clarity, leading to a resolution to the question of what it means for particles to collide:

72 Lisa : Ok, it's probably a stupid question, something I didn’t get, but.

73 Karen : There are no stupid questions! [smiles]

74 Lisa : But why can’t they be affected by repulsive and attractive interactions when they go in straight lines?

75 Alex : [points to the screen] If you have one molecule that's here, and it repels a molecule that's passing by, then it's going to affect the path or…

76 John : So, then it will kind of come, then they will. Watch now [shows palm and fist]. When it's just a straight line, then it's [denotes collision]. But with attractive and repulsive, it will be more tsjiuuuum [sound language. Also shows with hands a parabolic path as the particle approaches another].

77 Lisa : Oh, yeah. There you go!

78 John : [Lisa and Karen mimic the sounds John made and laughs] So some particles will be attractive, so then it will come against, also it will niiiiiiii in [sound language and hands showing two particles “sticking” to each other]. While the repulsive ones will then just push them away.

79 Lisa : Yes, exactly. Exactly.

80 John : And give more curved lines.

81 Lisa : Ah, well explained!

Here, we see that in response to Lisa's questioning (line 74), Alex anchored his explanation in the computational simulation (line 75), after which John iterated on this explanation using illustrative hand gestures (line 76). Thus, the simulation again provided a productive input to the students’ discussion and argumentation, which served to sustain their sensemaking process. This, in turn, prompted John to use embodied knowledge (Crowder, 1996) to explain how systems with and without attractive and repulsive forces will act. Immediately after this explanation, everyone expressed agreement that they had reached a satisfactory conclusion—a successful resolution to the sensemaking process.

To review: we are arguing that this episode illustrates a complete and successful cycle of sensemaking, as defined above. That is, it fulfils all three of the criteria for sensemaking: the students at various points drew on their real-world experience as a resource in the discussion (along with scientific theory and visualization from the computational model); they iteratively built their explanation through multiple cycles of construction and critique; and the students produced a cogent argument that had a clear claim (there are collisions between particles), evidence (the computational simulation output), and reasoning (John's explanation of why the trajectory of the particles is characteristic of collisions). Furthermore, there is a clear, recurring, vexing question about the nature of collisions that acts as an entry condition into the sensemaking process and a unifying thread throughout the subsequent discussion.

To make the complete cycle of sensemaking easier to follow, we have visualized the discussion in Fig. 4. In this diagram, we indicate movement between arguments and questions with arrows. Resources from the computational simulation and the computational model are marked with blue circles.


Fig. 4 A model of the discussion in episode two, where conflicts between real-world knowledge, theory and visual output from the computational simulation act to sustain the sensemaking process and lead the discussion further, maintaining a level of uncertainty that prevents the students from doing simple answer-making.

From this perspective, we see that at first, the sensemaking discussion is quite linear, ending in a premature resolution. Karen and Lisa start to examine whether the particles change speed or not through the simulation output, and John follows up with arguments based on real-world experiences of collisions. This is followed by the vexing question, where a gap in their understanding of collisions is introduced. Here, the group uses the computational simulation and real-world experiences interchangeably to explore and build arguments. This is an iterative process where the concept of ideal gases is revisited several times, and results from the computational simulation act as a way of sustaining this process by providing constant feedback.

This representation also allows us to disentangle the role of the computational simulation in the process, which we argue is to provide the students with resources for their discussion. These resources are different representations of the phenomena, including the visual output and the underlying computational model, which the students then pick up and use in their argumentation. Importantly, the use of these computational resources often inspires further sensemaking; for example, when the students start to wonder which model is used in the computational simulation, their knowledge of ideal and real gases is drawn in as evidence in the arguments (e.g. “In an ideal gas, there are no repulsive or attractive forces between the molecules. Theoretically.”), but also as the foundation of new questions (e.g. “But shouldn't it be an ideal gas?”). Most importantly, the discussion about the existence of collisions between particles acts as a vexing question throughout the whole episode. This question likely emerged because of the difference between microscopic collisions in the simulation (space between particles) and the perception of macroscopic collisions in the real world (no space between particles). Thus, we argue that computational simulations can productively aid and sustain sensemaking by conveying information in different representational formats.

Discussion

From our data, we see that the interactions between computational simulation, theory and real-world explanations provide new meaning and a basis for evidence in the students' discussion. This resonates well with the literature claiming that computational simulations should be used in combination with other forms of teaching and learning (Smetana and Bell, 2012), since they can act as a meeting point between theory and previous experience (Kluge, 2019). We also see the value of the representations produced by such simulations; when used as an alternative representational form, computational simulations can give students a more thorough understanding of important and fundamental concepts, which aligns with results from Sand et al. (2019). Within chemistry education specifically, computational particle simulations provide students with a way of exploring physical and chemical concepts at the micro level, possibly bridging a gap between macro-level observations and micro-level explanations. Introducing computational simulations as part of a curriculum can thus provide students with tools to engage in sensemaking and thereby build a more solid conceptual understanding, both within and across different disciplines.

Returning to our research question, the goal of this study was to examine mechanisms behind how computational simulations could sustain sensemaking. Based on the evidence in our analysis, we propose four key affordances of computational simulations for sustaining sensemaking:

1. The explorative factor: The interactive element of a computational simulation encourages students to experiment and explore a problem instead of solving a problem and then looking up a solution.

2. The feedback factor: Computational simulations provide students with immediate feedback, which can serve to keep students engaged in a task for a longer amount of time.

3. The representational factor: A computational simulation combines conceptual chemistry with mathematical concepts and algorithms to produce visual output of a model.

4. The uncertainty factor: Simulations can produce results that come into conflict with previous experience. This can introduce a higher level of uncertainty for the students, which is important to sustain the sensemaking process.

Through these mechanisms, computational simulations can be used to facilitate persistence and the exploration of ideas. Viewing our data through these potential mechanisms, we note that in the sensemaking episode above we saw mostly mechanisms two, three and four, and less of students rerunning the computational simulation to explore different scenarios. Although the students kept rerunning the computational simulation to produce visual aids for their argumentation and questions, they did not change the models or parameters of the numerical experiment. This probably has to do with the design of the exercise, where we did not explicitly try to lead the students into exploring different outcomes.

We also saw more of the feedback factor than the representational factor. Although the students discussed the underlying models of the computational simulation, they could have examined the mathematical models by examining the code in the BubbleBox modules. However, as with the explorative factor, this was not explicitly stated in the exercise. We can therefore infer that careful design of the computational simulation exercise is crucial for leading students into sensemaking that is sustained by the representational and explorative factors. One factor that stands out as an evident mechanism for sustaining the sensemaking process is the uncertainty factor. In the episode presented above, the simulation provided the students with output that conflicted with their knowledge and real-world experiences—that is, it kept them in a state of uncertainty, where they needed to engage in repeated discussion in order to resolve their gap in knowledge. This uncertainty is a key difference between sensemaking and answer-making: answer-making requires certainty that one has come to a right answer, whereas sensemaking is sustained by uncertainty. Thus, tools, activities, questions, or visualizations that allow students to become aware of their own uncertainty can potentially act as resources for sensemaking. Computational simulations provide one such tool, and our research shows promising results in students using such explorative simulations to sustain sensemaking about particulate-level models in chemistry.

Limitations

While this study provides insights into how computational simulations can facilitate sensemaking in chemistry education, it operates under certain assumptions and limitations that warrant consideration. Firstly, the assumption that engagement with simulations enhances sensemaking may oversimplify the role of external factors such as prior knowledge and familiarity with computational tools. Had these students encountered the simulations without any prior knowledge of Python programming or the Jupyter Notebook environment, technical difficulties might have limited their engagement in sensemaking.

Furthermore, the generalizability of our findings is constrained by the context-specific nature of a case study, whose results do not transfer directly to other educational settings or disciplines. We have presented a study that shows possible mechanisms behind how sensemaking can be initiated and sustained through computational simulations, but the effect we have described clearly depends on the design of the activity and the context in which it is introduced. For instance, students need to be comfortable sharing ideas for a collective sensemaking process to occur. The social aspect of sensemaking is an important factor that is not explicitly addressed in this study.

It is also important to note that we set out to investigate the sensemaking process, not the conceptual understanding resulting from it. We therefore chose an episode in which the learning assistant (LA) does not intervene, because we wanted to see how the students themselves interacted with the simulation, without corrections or external facilitation from the LA. We also chose to study an authentic learning situation with exercises designed by the instructor, not by chemistry education researchers for a specific research purpose. Thus, the students were led into sensemaking by themselves and the simulation, even though some of the conclusions they drew were incorrect. Consequently, this study does not address conceptual change or understanding of a particular chemical phenomenon, only the sensemaking process that can lead to such learning.

While computational simulations are valuable tools for enhancing sensemaking in chemistry education, the design of the simulation activity and the integration of the tool within a broader pedagogical framework are crucial for realizing its full potential.

Implications for research and practice

We recommend that future research examine how different exercise designs affect the sensemaking process with respect to our proposed mechanisms. Also, since we did not explicitly address the social aspect of sensemaking, we call for further research comparing sensemaking with simulations in a collaborative setting against a setting without collaboration.

Chemistry educators can use our four proposed mechanisms as a framework for designing classroom activities that facilitate and support sensemaking. For example, instructors can write prompts asking students to explore the effect of different factors on a chemical system. The students then receive feedback from the simulation in the form of different representations, such as numbers, graphs, or graphics, and can be prompted to discuss the meaning of these representations. By posing questions that draw on students’ real-world knowledge, instructors can introduce an uncertainty that drives the students into a negotiation of meaning with themselves and each other.
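As a concrete illustration, the following is a minimal sketch, in the same Python/Jupyter setting the students worked in, of an activity built around these mechanisms. The prompt, parameter values, and reduced units are hypothetical, and a particle-simulation package such as BubbleBox could take the place of this hand-rolled sampling.

```python
import numpy as np
import matplotlib.pyplot as plt

def particle_speeds(temperature, n_particles=2000, mass=1.0, k_B=1.0):
    # Equilibrium (Maxwell-Boltzmann) speeds in two dimensions: each
    # velocity component is normally distributed with variance k_B*T/m.
    std = np.sqrt(k_B * temperature / mass)
    vx, vy = np.random.normal(0.0, std, size=(2, n_particles))
    return np.hypot(vx, vy)

# Prompt: "How does temperature affect the spread of particle speeds,
# and what might this mean for how quickly a puddle evaporates?"
for T in (100, 300, 900):
    plt.hist(particle_speeds(T), bins=50, density=True,
             alpha=0.5, label=f"T = {T}")
plt.xlabel("speed (reduced units)")
plt.ylabel("probability density")
plt.legend()
plt.show()
```

Here the histograms supply the feedback and the representation; the question about evaporation is what connects them to real-world experience and, potentially, to productive uncertainty.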

Keep in mind that if conceptual change or learning of a specific concept is the goal of the activity, it could be useful to include some control mechanisms so that the uncertainty factor does not lead to wrong conclusions in the end. Explicit control questions, partial solutions provided in advance, or teacher-student interaction might serve this purpose.

In addition, we encourage the use of tools like BubbleBox as more open-box simulations, in which students can explore how different factors affect a specific system; a sketch of this pattern follows below. This requires a basic knowledge of programming but is a potential way to engage students in modeling and sensemaking through exploration. Such activities could also become the subject of research that would be a valuable addition to the field.
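One way to realize such an open-box design, sketched here under our own assumptions (this is not the BubbleBox API), is to let the simulation accept the interaction model as an argument, so students can write and swap in their own potentials rather than treating the tool as a black box.

```python
import numpy as np

def soft_repulsion(r, k=10.0, cutoff=1.0):
    # A student-written pair potential: harmonic repulsion when two
    # particles overlap, no interaction beyond the cutoff distance.
    return np.where(r < cutoff, 0.5 * k * (cutoff - r) ** 2, 0.0)

def total_energy(positions, pair_potential):
    # Sum the chosen pair potential over all unique particle pairs.
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(positions[i] - positions[j])
            energy += pair_potential(r)
    return energy

rng = np.random.default_rng(seed=1)
positions = rng.uniform(0.0, 5.0, size=(20, 2))  # 20 particles in a 5 x 5 box
print(total_energy(positions, soft_repulsion))   # swap in other potentials here
```

Because the interaction model is just a function, changing the chemistry of the system becomes a one-line edit, which lowers the threshold for the explorative factor discussed above.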

Conclusion

Through this case study we have seen how computational simulations can provide students with productive resources for sensemaking, allowing them to make observations that conflict with their previous conceptions and thus become aware of gaps in their understanding of a phenomenon. Such gaps can both initiate and sustain a sensemaking process. In our data, we saw how a perceived conflict between the students’ real-world experience and their interpretation of a visual representation of molecular interactions led them from answer-making to sensemaking. We identified four possible mechanisms through which computational simulations can facilitate and sustain sensemaking: exploration, feedback, representation, and uncertainty. These factors can play an important part in supporting sensemaking in chemistry education.

Conflicts of interest

There are no conflicts to declare.

References

  1. Ardac D. and Sezen A. H., (2002), Effectiveness of Computer-Based Chemistry Instruction in Enhancing the Learning of Content and Variable Control Under Guided Versus Unguided Conditions, J. Sci. Educ. Technol., 11, 39–48 DOI:10.1023/A:1013995314094.
  2. Bassey M., (1999), Case study research in educational settings, Doing qualitative research in educational settings, Buckingham: Open University Press.
  3. Berlin D. and White A., (1986), Computer Simulations and the Transition from Concrete Manipulation of Objects to Abstract Thinking in Elementary School Mathematics, School Sci. Math., 86, 468–479 DOI:10.1111/j.1949-8594.1986.tb11643.x.
  4. Braithwaite D. W. and Sprague L., (2021), Conceptual Knowledge, Procedural Knowledge, and Metacognition in Routine and Nonroutine Problem Solving, Cognitive Sci., 45, e13048 DOI:10.1111/cogs.13048.
  5. Brant G., Hooper E. and Sugrue B., (1991), Which Comes First, the Simulation or the Lecture? J. Educ. Comput. Res., 7, 469–481.
  6. Cannady M. A., Vincent-Ruz P., Chung J. M. and Schunn C. D., (2019), Scientific sensemaking supports science content learning across disciplines and instructional contexts, Contemp. Educ. Psychol., 59, 101802 DOI:10.1016/j.cedpsych.2019.101802.
  7. Casa-Coila M. D., Mamani-Vilca P. S., Tisnado-Mamani L. M., Pari-Achata D. and Vilca-Apaza H. M., (2023), Model Chemlab and Phet Simulator: A Didactic Resource for Chemistry Learning in Undergraduate Students, IJMST, 10, 59–75 DOI:10.15379/ijmst.v10i5.2420.
  8. Chandler P. and Sweller J., (1991), Cognitive Load Theory and the Format of Instruction, Cogn. Instruct., 8, 293–332 DOI:10.1207/s1532690xci0804_2.
  9. Chen Y., Irving P. W. and Sayre E. C., (2013), Epistemic game for answer making in learning about hydrostatics, Phys. Rev. ST Phys. Educ. Res., 9, 010108 DOI:10.1103/PhysRevSTPER.9.010108.
  10. Clark T. M. and Chamberlain J. M., (2014), Use of a PhET Interactive Simulation in General Chemistry Laboratory: Models of the Hydrogen Atom, J. Chem. Educ., 91, 1198–1202 DOI:10.1021/ed400454p.
  11. Correia A.-P., Koehler N., Thompson A. and Phye G., (2019), The application of PhET simulation to teach gas behavior on the submicroscopic level: secondary school students’ perceptions, Res. Sci. Technol. Educ., 37, 193–217 DOI:10.1080/02635143.2018.1487834.
  12. Corter J. E., Esche S. K., Chassapis C., Ma J. and Nickerson J. V., (2011), Process and learning outcomes from remotely-operated, simulated, and hands-on student laboratories, Comput. Educ., 57, 2054–2067 DOI:10.1016/j.compedu.2011.04.009.
  13. Crowder E. M., (1996), Gestures at Work in Sense-Making Science Talk, J. Learn. Sci., 5, 173–208 DOI:10.1207/s15327809jls0503_2.
  14. Darmaji D., Kurniawan D. A. and Irdianti I., (2019), Physics education students’ science process skills, IJERE, 8, 293 DOI:10.11591/ijere.v8i2.16401.
  15. Dewey J., (1910), How we think, D.C. Heath & Co.
  16. Easley K., (2020), Simulations and Sensemaking in Elementary Project-Based Science (Thesis).
  17. Eichenlaub M., (2019), Mathematical sensemaking via epistemic games, Dissertation Abstracts International: Section B: The Sciences and Engineering, University of Maryland.
  18. Eylon B.-S., Ronen M. and Ganiel U., (1996), Computer simulations as tools for teaching and learning: using a simulation environment in optics, J. Sci. Educ. Technol., 5, 93–110 DOI:10.1007/BF01575150.
  19. Faryniarz J. V. and Lockwood L. G., (1992), Effectiveness of microcomputer simulations in stimulating environmental problem solving by community college students, J. Res. Sci. Teach., 29, 453–470 DOI:10.1002/tea.3660290503.
  20. Ford M. J., (2012), A Dialogic Account of Sense-Making in Scientific Argumentation and Reasoning, Cogn. Instruct., 30, 207–245 DOI:10.1080/07370008.2012.689383.
  21. Ganasen S. and Shamuganathan S., (2017), The Effectiveness of Physics Education Technology (PhET) Interactive Simulations in Enhancing Matriculation Students’ Understanding of Chemical Equilibrium and Remediating Their Misconceptions, in Karpudewan M., Md Zain A. N. and Chandrasegaran A. L. (ed.), Overcoming Students’ Misconceptions in Science: Strategies and Perspectives from Malaysia, Singapore: Springer, pp. 157–178 DOI:10.1007/978-981-10-3437-4_9.
  22. Geban Ö., Askar P. and Özkan Ï., (1992), Effects of Computer Simulations and Problem-Solving Approaches on High School Students, J. Educ. Res., 86, 5–10.
  23. Grauer K., (2012), A Case for Case Study Research in Education, in Klein S. R. (ed.), Action Research Methods: Plain and Simple, New York: Palgrave Macmillan US, pp. 69–79 DOI:10.1057/9781137046635_4.
  24. Hamilton L. and Corbett-Whittier C., (2013), Using Case Study in Education Research, London DOI:10.4135/9781473913851.
  25. Hamnell-Pamment Y., (2024), Making sense of chemical equilibrium: productive teacher–student dialogues as a balancing act between sensemaking and managing tension, Chem. Educ. Res. Pract., 25, 171–192 DOI:10.1039/D3RP00249G.
  26. Hargrave C. P. and Kenton J. M., (2000), Preinstructional Simulations: Implications for Science Classroom Teaching, J. Comput. Math. Sci. Teach., 19, 47–58.
  27. Hunter K. H., Rodriguez J.-M. G. and Becker N. M., (2021), Making sense of sensemaking: using the sensemaking epistemic game to investigate student discourse during a collaborative gas law activity, Chem. Educ. Res. Pract., 22, 328–346 DOI:10.1039/D0RP00290A.
  28. Hylleraas Software Platform, (n.d.), https://hylleraas.readthedocs.io/en/latest/ (accessed 28 April 2023).
  29. Kaldaras L. and Wieman C., (2023), Cognitive framework for blended mathematical sensemaking in science, Int. J. STEM Educ., 10, 18 DOI:10.1186/s40594-023-00409-8.
  30. Kapon S., (2017), Unpacking Sensemaking, Sci. Educ., 101, 165–198 DOI:10.1002/sce.21248.
  31. Kapon S. and Berland L., (2023), Epistemic Models of Sensemaking and Reasoning, in Taşar M. F. and Heron P. R. L. (ed.), The International Handbook of Physics Education Research: Learning Physics, AIP Publishing LLC DOI:10.1063/9780735425477_012.
  32. Kluge A., (2019), Learning science with an interactive simulator: negotiating the practice-theory barrier, Int. J. Sci. Educ., 41, 1071–1095 DOI:10.1080/09500693.2019.1590881.
  33. Lancaster K., Moore E. B., Parson R. and Perkins K. K., (2013), Insights from Using PhET's Design Principles for Interactive Chemistry Simulations, in Suits J. P. and Sanger M. J. (ed.), ACS Symposium Series, Washington, DC: American Chemical Society, pp. 97–126 DOI:10.1021/bk-2013-1142.ch005.
  34. Lazenby K., Stricker A., Brandriet A., Rupp C. A. and Becker N. M., (2020), Undergraduate Chemistry Students’ Epistemic Criteria for Scientific Models, J. Chem. Educ., 97, 16–26 DOI:10.1021/acs.jchemed.9b00505.
  35. Lee J., (1999), Effectiveness of Computer-Based Instructional Simulation: A Meta Analysis, Int. J. Instruct. Media, 26, 71–85.
  36. Lee K. M., Nicoll G. and Brooks D. W., (2004), A Comparison of Inquiry and Worked Example Web-Based Instruction Using Physlets, J. Sci. Educ. Technol., 13, 81–88 DOI:10.1023/B:JOST.0000019640.07432.2b.
  37. Limniou M., Papadopoulos N. and Whitehead C., (2009), Integration of simulation into pre-laboratory chemical course: computer cluster versus WebCT, Comput. Educ., 52, 45–52 DOI:10.1016/j.compedu.2008.06.006.
  38. Lin X. and Lehman J. D., (1999), Supporting learning of variable control in a computer-based biology environment: effects of prompting college students to reflect on their own thinking, J. Res. Sci. Teach., 36, 837–858 DOI:10.1002/(SICI)1098-2736(199909)36:7<837::AID-TEA6>3.0.CO;2-U.
  39. Liu H.-C., Andre T. and Greenbowe T., (2008), The Impact of Learner's Prior Knowledge on Their Use of Chemistry Computer Simulations: A Case Study, J. Sci. Educ. Technol., 17, 466–482 DOI:10.1007/s10956-008-9115-5.
  40. McHugh M. L., (2012), Interrater reliability: the kappa statistic, Biochem. Med. (Zagreb), 22, 276–282.
  41. McNeill K. L., Lizotte D. J., Krajcik J. and Marx R. W., (2006), Supporting Students’ Construction of Scientific Explanations by Fading Scaffolds in Instructional Materials, J. Learn. Sci., 15, 153–191 DOI:10.1207/s15327809jls1502_1.
  42. Moore E. B., Chamberlain J. M., Parson R. and Perkins K. K., (2014), PhET Interactive Simulations: Transformative Tools for Teaching Chemistry, J. Chem. Educ., 91, 1191–1197 DOI:10.1021/ed4005084.
  43. National Research Council, (2006), America's Lab Report: Investigations in High School Science, Washington, DC: National Academies Press DOI:10.17226/11311.
  44. Odden T. O. B., (2021), How conceptual blends support sensemaking: a case study from introductory physics, Sci. Educ., 105, 989–1012 DOI:10.1002/sce.21674.
  45. Odden T. O. B. and Russ R. S., (2018), Defining sensemaking: bringing clarity to a fragmented theoretical construct, Sci. Educ., 103, 187–205 DOI:10.1002/sce.21452.
  46. Özmen H., Demircioğlu H. and Demircioğlu G., (2009), The effects of conceptual change texts accompanied with animations on overcoming 11th grade students’ alternative conceptions of chemical bonding, Comput. Educ., 52, 681–695 DOI:10.1016/j.compedu.2008.11.017.
  47. Papert S., (1980), Mindstorms: children, computers, and powerful ideas, New York: Basic Books.
  48. Passmore C., Gouvea J. S. and Giere R., (2014), Models in Science and in Learning Science: Focusing Scientific Practice on Sense-making, in Matthews M. R. (ed.), International Handbook of Research in History, Philosophy and Science Teaching, Dordrecht: Springer Netherlands, pp. 1171–1202 DOI:10.1007/978-94-007-7654-8_36.
  49. Plass J. L., Milne C., Homer B. D., Schwartz R. N., Hayward E. O., Jordan T., Verkuilen J., Ng F., Wang Y. and Barrientos J., (2012), Investigating the effectiveness of computer simulations for chemistry learning, J. Res. Sci. Teach., 49, 394–419 DOI:10.1002/tea.21008.
  50. Prins G. T., Bulte A. M. W., Van Driel J. H. and Pilot A., (2009), Students’ Involvement in Authentic Modelling Practices as Contexts in Chemistry Education, Res. Sci. Educ., 39, 681–700 DOI:10.1007/s11165-008-9099-4.
  51. Rodriguez J.-M. G., Bain K. and Towns M. H., (2020a), The Role of Epistemology and Epistemic Games in Mediating the Use of Mathematics in Chemistry: Implications for Mathematics Instruction and Research on Undergraduate Mathematics Education, Int. J. Res. Undergrad. Math. Ed., 6, 279–301 DOI:10.1007/s40753-019-00110-8.
  52. Rodriguez J.-M. G., Hunter K. H., Scharlott L. J. and Becker N. M., (2020b), A Review of Research on Process Oriented Guided Inquiry Learning: Implications for Research and Practice, J. Chem. Educ., 97, 3506–3520 DOI:10.1021/acs.jchemed.0c00355.
  53. Roth W.-M., (1995), Affordances of computers in teacher-student interactions: the case of interactive physicsTM, J. Res. Sci. Teach., 32, 329–347 DOI:10.1002/tea.3660320404.
  54. Salame I. I. and Makki J., (2021), Examining the Use of PhET Simulations on Students' Attitudes and Learning in General Chemistry II, Int. J. Environ. Sci. Ed., 17, e2247 DOI:10.21601/ijese/10966.
  55. Sand O., Odden T. O., Lindstrom C. and Caballero M., (2019), How computation can facilitate sensemaking about physics: a case study, Paper presented at Physics Education Research Conference 2018, Washington, DC DOI:10.1119/perc.2018.pr.Sand.
  56. Sands D., (2021), Modeling as sensemaking: towards a theory of modelling in physics education, Eur. J. Phys., 42, 064001 DOI:10.1088/1361-6404/abcc80.
  57. Smetana L. K. and Bell R. L., (2012), Computer Simulations to Support Science Instruction and Learning: a critical review of the literature, Int. J. Sci. Educ., 34, 1337–1370 DOI:10.1080/09500693.2011.605182.
  58. Soderberg P. and Price F., (2003), An examination of problem-based teaching and learning in population genetics and evolution using EVOLVE, a computer simulation, Int. J. Sci. Educ., 25, 35–55 DOI:10.1080/09500690110095285.
  59. Stieff M. and Wilensky U., (2003), Connected Chemistry—Incorporating Interactive Simulations into the Chemistry Classroom, J. Sci. Educ. Technol., 12, 285–302 DOI:10.1023/A:1025085023936.
  60. Trundle K. C. and Bell R. L., (2010), The use of a computer simulation to promote conceptual change: a quasi-experimental study, Comput. Educ., 54, 1078–1088 DOI:10.1016/j.compedu.2009.10.012.
  61. Wilensky U. and Reisman K., (2006), Thinking Like a Wolf, a Sheep, or a Firefly: Learning Biology Through Constructing and Testing Computational Theories—An Embodied Modeling Approach, Cogn. Instruct., 24, 171–209 DOI:10.1207/s1532690xci2402_1.
  62. Winberg T. M. and Berg C. A. R., (2007), Students’ cognitive focus during a chemistry laboratory exercise: effects of a computer-simulated prelab. J. Res. Sci. Teach., 44, 1108–1133 DOI:10.1002/tea.20217.
  63. Windschitl M. and Andre T., (1998), Using computer simulations to enhance conceptual change: the roles of constructivist instruction and student epistemological beliefs, J. Res. Sci. Teach., 35, 145–160 DOI:10.1002/(SICI)1098-2736(199802)35:2<145::AID-TEA5>3.0.CO;2-S.
  64. Wu H.-K., Krajcik J. S. and Soloway E., (2001), Promoting understanding of chemical representations: students’ use of a visualization tool in the classroom, J. Res. Sci. Teach., 38, 821–842 DOI:10.1002/tea.1033.
  65. Wu M.-Y. M. and Yezierski E. J., (2022), Pedagogical chemistry sensemaking: a novel conceptual framework to facilitate pedagogical sensemaking in model-based lesson planning, Chem. Educ. Res. Pract., 23, 287–299 DOI:10.1039/D1RP00282A.
  66. Zendler A. and Greiner H., (2020), The effect of two instructional methods on learning outcome in chemistry education: the experiment method and computer simulation, Educ. Chem. Eng., 30, 9–19 DOI:10.1016/j.ece.2019.09.001.
  67. Zhao F. and Schuchardt A., (2021), Development of the Sci-math Sensemaking Framework: categorizing sensemaking of mathematical equations in science, Int. J. STEM Educ., 8, 10 DOI:10.1186/s40594-020-00264-x.
