Andreas Haraldsrud*ab and Tor Ole B. Oddena

aCenter for Computing in Science Education, University of Oslo, 0371 Oslo, Norway. E-mail: a.d.haraldsrud@kjemi.uio.no
bDepartment of Chemistry, University of Oslo, 0371 Oslo, Norway
First published on 25th March 2024
Sensemaking is an important way of learning and engaging in science. Research has shown that sensemaking activities, such as questioning, hypothesizing, and model building, are pivotal in developing critical thinking and problem-solving skills in science education. This paper investigates the role of computational simulations in facilitating sensemaking in chemistry education, specifically examining how these simulations can sustain the sensemaking process. Through a detailed case study in a physical chemistry course, we explore the interplay between students' real-world experiences, theoretical knowledge, and computational simulations. This analysis reveals that computational simulations, by providing interactive and visual representations of chemical phenomena, can create a conducive environment for sensemaking, where students actively engage in exploring and resolving conceptual uncertainties. Based on these results, we argue that computational tools, when effectively integrated into educational settings, can facilitate sensemaking in science education.
However, sensemaking is not necessarily an easy process (Odden and Russ, 2018). It requires time, effort, and an ongoing dialogue (Ford, 2012). It is also a highly individual process, as different students will often use very different explanations to make sense of ideas. Thus, teachers who aim to support their students' sensemaking often go to great lengths to provide rich, multifaceted learning environments. One common element of such rich learning environments is the use of interactive computational simulations (Plass et al., 2012; Easley, 2020; Hunter et al., 2021). Such simulations have been explored as a tool for building conceptual understanding and supporting sensemaking in various disciplines, including physics, biology, and chemistry education (Papert, 1980; Wilensky and Reisman, 2006; Plass et al., 2012; Sand et al., 2019). For example, Hunter et al. applied the sensemaking framework proposed by Odden and Russ to analyze a collaborative gas law activity in general chemistry (Hunter et al., 2021), demonstrating how the activity led to productive episodes of sensemaking.
In this article, we will build upon these studies to examine how computational simulations can support sensemaking in chemistry education. This will be explored through a case study of students working with activities on particle interactions and equilibrium in an authentic learning situation in physical chemistry. In this study, we are interested in addressing the following research question: through what mechanisms can sensemaking be facilitated and sustained with computational simulations?
We consider three factors related to learning in science: conceptual understanding, science process skills, and modelling. Later, we will use what these three concepts have in common with sensemaking to support our analysis.
Another example of how conceptual understanding can be improved is given by Wu et al. (2001). They show how the visual nature of a computational simulation can improve students’ understanding of chemical structure by making it possible to convert between multiple representations of molecules. In addition, the dynamic aspect of computational simulations is explored by Stieff and Wilensky (2003), who show how computational simulations can enhance students’ conceptual understanding of chemical equilibrium. They also report an increased degree of logical reasoning when students worked with computational simulations.
The research on how computational simulations affect student learning of chemistry is quite diverse. Interactive simulation tools like PhET, developed in 2002, have been explored in different contexts and levels of education (Lancaster et al., 2013; Moore et al., 2014). In a study from a general chemistry course, Clark and Chamberlain (2014) show how computational simulations can be used to support student modelling competence. This is further expanded upon by Salame and Makki (2021), who show a positive impact of simulations on student affect and attitudes towards chemistry.
Correia et al. (2019) demonstrated the benefits of using PhET simulations to understand particulate-level gas behavior in secondary school, emphasizing the value of using multiple representations of particulate-level phenomena. Computational simulations are shown to be valuable tools for this purpose.
A study by Ganasen and Shamuganathan (2017) shows a statistically significant improvement in students' conceptual understanding of chemical equilibrium when using computational simulations. Further, Casa-Coila et al. (2023) recently conducted a study showing statistically significant improvements in learning a range of important chemical concepts at both the microscopic and macroscopic levels.
Computational simulations may also provide a mistake-friendly environment where students can experiment, make errors, and learn from them without real-world consequences. This freedom to explore and learn from mistakes promotes a growth mindset and encourages students to take risks in their learning. The interactive nature of computational simulations can captivate students' attention and promote active participation, which in turn may foster their development of science process skills (Lee, 1999; Hargrave and Kenton, 2000; Berlin and White, 2010).
One important finding is that we should use computational simulations as a supplement to traditional methods, not as a replacement. This is not surprising, since researchers have argued that different methods and tools have different modalities – that is, they describe certain aspects of reality in specific ways. For instance, it is more effective to use animation than static images when trying to understand the dynamic aspect of protein synthesis or how a galvanic cell works. Conversely, when trying to understand the composition of a peptide, different static images may convey this information better than an animation. A well-written text is also often necessary to understand arguments and abstract phenomena, as well as to explain and supplement images and figures.
Some studies suggest that computational simulations are most effective when used before, not after, traditional classroom instruction (Brant et al., 1991; Winberg and Berg, 2007). One possible explanation behind this may be that students attend less to the computational simulation when it serves as a review (Smetana and Bell, 2012). Also, when using a computational simulation before traditional instruction, students may be more inclined towards exploration and inquiry-based learning. The understanding gained from computational simulations may also decrease the students’ cognitive load during instruction, thereby priming them for traditional instruction (Winberg and Berg, 2007).
There is also some evidence that the order in which students use computational simulations does not matter (Liu et al., 2008), suggesting that the importance of the order depends on the content and the design of the activity. For example, using a simulation before a laboratory exercise could make the laboratory exercise less motivating and thus less fruitful for learning. Also, some evidence suggests that students might be more motivated by working with data from traditional experiments than with data from simulations (Corter et al., 2011).
Another central finding is that students need proper support and scaffolding to learn from computational simulations effectively (Smetana and Bell, 2012). Poorly constructed animations and non-intuitive controls can lead to misunderstanding and frustration. Technical issues and difficulties may also shift the focus from learning content to handling the tool. It is therefore important to design computational simulation tools in such a way that we reduce the cognitive load of students (Chandler and Sweller, 1991; Lee et al., 2004). There is also a need for proper support and guidance from the teacher in using the computational simulation effectively in a proper learning context (Roth, 1995; Eylon et al., 1996; Ardac and Sezen, 2002; Limniou et al., 2009).
When students are engaged in sensemaking, they often experience a disquieting feeling of uncertainty or frustration. This uncertainty frequently centers on a vexing question: a challenging or puzzling gap in knowledge that stimulates students' sensemaking processes. Such questions can often arise from scenarios where intuitive or everyday understandings clash with the principles of science, thus requiring a deeper engagement and exploration to resolve the inconsistencies and develop a coherent understanding. These vexing questions may, in turn, be leveraged by educators to motivate active thinking, discussion, and exploration among students.
The interaction between students and teachers is also important for sensemaking. In a recent study, Hamnell-Pamment (2024) demonstrated how different mechanisms of student–teacher interactions can sustain sensemaking in chemistry. The teacher's role in making connections to theory, real-world experience, and different chemistry knowledge domains was shown to be of importance. It was also shown that different representations, such as equations or mathematical symbols, can mediate progression in the sensemaking process. Equations contain elements of abstraction, which in turn can lead to the emergence of vexing questions. We can hypothesize that simulations might provide some of the same elements of abstraction, as well as other affordances, that might support sensemaking.
Active participation is necessary for sensemaking to happen. Thus, active learning pedagogies like POGIL (Process Oriented Guided Inquiry Learning) can be a valuable strategy for sensemaking. Elements of POGIL include the construction of ideas through an iterative process, using inquiry and exploration of models as a basis for discussion (Rodriguez et al., 2020b). The POGIL approach therefore shares many features with sensemaking, and we can hypothesize that POGIL might sustain student engagement in sensemaking. Hunter et al. (2021) investigated how the POGIL framework supports sensemaking, which we will elaborate on later. They also encouraged more research on applying the sensemaking framework to “investigate student's reasoning across other concepts in chemistry and across science” (p. 341), which we will follow up on.
Sand et al. (2019) present a case study showing how a student engaged in sensemaking in physics through a computational activity. They argue that the reason behind this is that the computational activity allowed the student to realize a gap in her understanding and to easily implement, test, and critique her ideas to bridge this gap. We argue that there is one additional reason behind the engagement in sensemaking that was not addressed explicitly: the computational approach allowed the student to work hands-on with models and to test these models continuously. Scientific modelling can contribute to sensemaking (Sands, 2021). Modelling involves the formulation, analysis, and critique of representations of reality, moving back and forth between different descriptions of a phenomenon while critiquing and building an explanation through the process. These are some of the same attributes associated with sensemaking (Odden and Russ, 2018; Kaldaras and Wieman, 2023).
Modelling also shares many features with science process skills, which have been shown to improve when students use computational simulations. Kluge (2019) uses clinical interviews to show that computational simulations can be used to “negotiate a meeting point between theory, previous experience and knowledge, and be instrumental in conceptual sense-making”. This resonates well with the arguments for how computational simulations can promote science process skills, which we addressed earlier.
Different modelling practices have been explored in chemistry education. For example, authentic modelling practices were shown to support student motivation and knowledge about models and modelling in chemistry (Prins et al., 2009). Model-based reasoning has also been identified as a problem-solving strategy in chemistry, where students use and refer to models to solve complex problems (Rodriguez et al., 2020a). Thus, both reasoning about models (metamodelling) and reasoning with models have been studied in chemistry learning; these are classified as distinct features of model-based learning in the literature (Lazenby et al., 2020; Passmore et al., 2014).
Both sensemaking and conceptual understanding require the combination of conceptual knowledge and an active negotiation of meaning. However, when we describe conceptual understanding in science education, we often mean correct conceptual understanding (as opposed to misunderstanding). While this of course is a desired outcome of a learning activity, a correct understanding of a phenomenon is not required to engage in sensemaking. Instead, in sensemaking, conceptual resources are used to make sense of a phenomenon, whether right or wrong. Thus, sensemaking is a process for learning and conceptual understanding is a product of a learning process.
Science process skills and sensemaking share a foundational relationship in scientific inquiry, both emphasizing the understanding and application of scientific concepts and principles through active engagement, critical thinking, and iterative problem-solving. That is, science process skills involve critical thinking, processing information, and solving problems (Darmaji et al., 2019). These are all dynamic skills, iteratively drawing upon different resources to form conclusions. Iterative inquiry, where students move back and forth between arguments, is also an important aspect of sensemaking. However, while science process skills are focused on the methodology and the “how” of science, sensemaking is more about the “why” of science, and is not bound to any specific methodology.
The last factor we consider here is modelling. Modelling is an iterative process, requiring the evaluation and (re-)formulation of models by comparing them with data from simulations, experiments, or observations. This iterative process is also a very important feature of sensemaking. Sensemaking can involve modelling but is more broadly defined as a process where the integration and interpretation of meaning is more important than the construction of models.
We will build upon these results to further examine the mechanisms behind how computational simulations facilitate and sustain sensemaking.
The general difference between sensemaking and answer-making is summarized in Fig. 1, adapted from Odden and Russ (2018) and Chen et al. (2013). First, students engage in sensemaking when they identify and address gaps in their knowledge. To resolve this gap, they choose to generate an explanation. This explanation building is often iterative and extends over time. In this process, students might draw upon different cognitive resources from their discipline or across disciplines (Kaldaras and Wieman, 2023).
Fig. 1 The difference between sensemaking and answer-making, adapted from Odden and Russ (2018) and Chen et al. (2013), simplified by Hunter et al. (2021) and further simplified by the authors of this article.
Hunter et al. (2021) operationalized this difference, arguing that there were certain distinguishing characteristics of student context and discussion which can be used to differentiate sensemaking and answer-making. These characteristics are:
(1) A feature of the question context that allows for real-world experiences to be used as part of the negotiation of meaning. For example, students might use their experiences with falling objects to reason about the sedimentation of particles in a gravitational field. These experiences might also be drawn from laboratory exercises, e.g. “The substances might separate into different phases as we observed in the organic lab”.
(2) Explanation building with collaborative construction and critique. Here, the process of going back and forth in a discussion through different arguments can be contrasted with the more straightforward answer-making process, where arguments are not weighed, combined, or contrasted to construct new meaning. In this process, we often see the emergence of vexing questions that guide and sustain the explanation building.
(3) The quality of explanation: for example, sensemaking tends to feature stronger and more coherent claims, evidence and reasoning than answer-making (McNeill et al., 2006; Hunter et al., 2021).
Based on our study objective of understanding how sensemaking can occur in authentic educational settings, data was collected in situ, i.e., without specific intervention from the research team. Thus, video recordings were made of regular group sessions within the physical chemistry course. The organization into groups of 4–5 students was facilitated by the learning assistants (LAs).
The LAs had completed a Learning Assistant Program before teaching this course. Thus, they had been trained to scaffold student learning by asking follow-up questions and by giving small hints instead of the solution when students were stuck. They kept this supporting role throughout the whole session, facilitating discussions and supporting the students in reasoning about their answers.
The students had already worked with problems related to gases (ideal, real, and the kinetic model of gases) and enthalpy in previous group sessions. They had been exposed to particulate-level thinking through the kinetic model of gases in the present course, as well as some particulate-level models in kinetics from general chemistry the previous year. The exercise they worked on in the group session that was studied built upon this general understanding of particles but extended it into the context of the physical chemistry course.
The study PI maintained a non-intrusive stance during data collection, while the learning assistants retained their customary role of aiding the students. Exercises used in the sessions also originated from the teacher responsible for the group activities, rather than the research team.
Since students can either use pre-defined commands or write commands and calculations themselves, BubbleBox can be adjusted to different levels of mathematics and programming experience. For instance, to advance the system in time, one can either write one's own numerical integration code or call upon a pre-defined Python function. As such, BubbleBox has an adjustable degree of “black box” coding. In the exercise featured in this study, BubbleBox was used as a more-or-less black-box simulator, although students had the option to examine the code behind the different functions.
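To make the adjustable “black box” degree concrete, the following sketch illustrates the kind of time-integration code a student could write instead of calling a pre-defined function. It is a minimal illustration using the standard Lennard-Jones force and a simple Euler step; it is not BubbleBox's actual API, and the pre-defined call named in the final comment is hypothetical.

```python
# Minimal sketch (not BubbleBox's actual API) of student-written integration
# code for a 2D Lennard-Jones system.
import numpy as np

def lj_forces(pos, epsilon=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces for an (N, 2) array of particle positions."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r = np.linalg.norm(r_vec)
            # F(r) = 24*eps/r * (2*(sigma/r)**12 - (sigma/r)**6), along r_vec
            f = 24 * epsilon / r * (2 * (sigma / r) ** 12 - (sigma / r) ** 6)
            forces[i] += f * r_vec / r
            forces[j] -= f * r_vec / r
    return forces

def euler_step(pos, vel, dt=0.001, mass=1.0):
    """Advance positions and velocities by one explicit Euler step."""
    acc = lj_forces(pos) / mass
    return pos + vel * dt, vel + acc * dt

# At the other end of the "black box" scale, a student could instead call a
# single pre-defined function, e.g. system.advance(dt) (hypothetical name).
```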
BubbleBox is built to facilitate exploration by being simultaneously flexible and easy to use. It is also designed to help users bridge their understanding of the microscopic and macroscopic world, and to make it possible to visually explore the effects of mathematical models. These affordances make this tool potentially useful for sensemaking in ways we will address through our analysis.
In the present case, the students used pre-defined functions to simulate the dynamics of a two-dimensional fluid in which intermolecular interactions are described by the Lennard-Jones potential. The activity was framed as a numerical experiment, with analogies drawn to traditional wet labs.
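For reference, the Lennard-Jones potential for two particles at distance $r$ has the standard form (a general fact about the model, not specific to BubbleBox's implementation)

$$V(r) = 4\varepsilon\left[\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right],$$

where $\varepsilon$ is the depth of the potential well and $\sigma$ is the distance at which the potential crosses zero. The $r^{-12}$ term gives short-range repulsion and the $r^{-6}$ term long-range attraction, which is why the simulated particles deflect each other before visibly “touching”.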
First, students were asked to set up a system and visualize its state. They did this by running a code cell with existing code in their Jupyter Notebook. The resulting output is shown in Fig. 2.
Fig. 2 The state of a two-dimensional system in BubbleBox is shown as particles with vector arrows denoting their velocity.
After setup, the students ran their simulations by running another pre-defined cell of code and observed the system as it evolved in time. As such, the activity was framed as an open inquiry task where students ran pre-made code and used it to discuss properties of the system. These simulations then formed the basis for a series of questions related to the dynamics of the system. The focus of the first activities was on how the particles interacted with each other and the walls of the container. The first discussion questions were asked to give the students room to interpret the system and form a mental model of what it could represent:
1. What happens when the molecules collide?
2. How do the molecules interact with the walls?
3. Do the molecules move in straight or curved lines?
4. Can you find pairs or groups of molecules clustered together?
5. In what direction does the “arrow of time” point? Where does it start? Discuss by using the system in the simulation.
The numerical experiment was designed to facilitate inquiry and discussion, rather than teaching the students a specific concept. The exercises were followed by more traditional lectures and exercises on these themes to help the students consolidate their knowledge.
The questions and code were distributed through a Jupyter Notebook environment as a single document where students could read about relevant theory, write, discuss, and run the computational simulations.
We will present a case based on one of the groups in the sessions to give a fine-grained description and analysis of the ways computational simulations affect sensemaking. Since there is a limited amount of science education research on how computational simulations can facilitate sensemaking, a case study is a suitable method to explore and describe possible relationships and mechanisms in detail (Bassey, 1999; Grauer, 2012). Case studies are not intended to be generalizable, but they can provide important insight into the complexity of educational contexts (Hamilton and Corbett-Whittier, 2013). The episode presented below was chosen both due to the high prevalence of sensemaking behavior and because the students address a fundamental concept that is not necessarily covered in traditional teaching: the question of “What is a collision between particles?”
The analysis proceeded in three phases: first, we used a coding scheme based on the sensemaking epistemic game proposed by Odden and Russ (2018) and adapted by Hunter et al. to identify possible episodes of sensemaking. Second, we evaluated each candidate episode of sensemaking according to the criteria previously discussed. This was done to further clarify the distinction between sensemaking and answer-making. The more criteria an episode met, the stronger the degree of sensemaking identified in the episode. Third, we identified the resources responsible for initializing or sustaining the sensemaking process. These resources could, for example, be a question in the exercise, an output or input in a computational simulation, students' theoretical knowledge, or prompts from a learning assistant.
The coding scheme was discussed in the research group before coding was done by the first author. The applied codes were then discussed and refined by the first and second author. Only minor changes were made to the applied codes after the discussion, which suggests general agreement on the application of the codes. Most of the changes were due to imprecise use of the codes “evidence” and “reasoning”, but after clarifying the distinction, both researchers agreed on their use. After the discussion, the explanations in the coding scheme were further clarified, but no codes were added or removed. The final coding scheme is summarized in Table 1:
Code | Definition |
---|---
Sensemaking codes | |
Entry condition | Initial voicing of a gap in understanding or a question leading to discussion. |
Explanation building | Students work together to iteratively build, compare and critique arguments to resolve the gap in understanding. They can also use different resources as part of their argument, e.g. figures, computational simulations or theory. |
Resolution | Explicit voicing of a conclusion to the entry condition, backed up by previous arguments. |
Quality of explanation codes | |
Claim | A statement of a possible fact or observation. |
Evidence | Information and arguments supporting a specific claim. |
Reasoning | Explicit connections and voiced reasoning between claim and evidence. |
Initiator codes | |
Discussion resource | The external resource used to initialize or sustain a sensemaking process, for example a simulation, text or figure. |
Codes were applied to sentences or groups of sentences. The sensemaking codes did not overlap with each other, but the initiator and quality of explanation codes were applied on top of the sensemaking codes to provide more depth in the analysis. An example of how the coding was done in MaxQDA 2022 is provided in Fig. 3. Code names and colors are marked on the right side of the text, and coded segments are colored. Applying several codes to a segment results in mixed colors in the text.
About 20% of the data was coded by an additional researcher (the second author), including the case presented here. To make sure that the same segments were coded by both researchers, coded segments were marked without showing the codes of the first researcher. Thus, the names of codes were removed, and all colors were changed to the same grey color. Cohen's kappa was calculated with MaxQDA Pro Analytics 2022, giving a kappa value of κ = 0.86, indicating strong agreement (McHugh, 2012).
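For reference, Cohen's kappa corrects the observed proportion of agreement for the agreement expected by chance:

$$\kappa = \frac{p_o - p_e}{1 - p_e},$$

where $p_o$ is the observed agreement between the two coders and $p_e$ is the agreement expected by chance; values between 0.80 and 0.90 are conventionally interpreted as strong agreement (McHugh, 2012).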
When first introduced to the exercise, the students read the introduction thoroughly, requested help with some technical issues from the learning assistant, and started discussing the prompts. We call the students Lisa, Karen, John and Alex.
After running the computational simulation, the students were asked to discuss what happens when the molecules in the simulation collide. Alex tried to answer the question, but Lisa changed the focus of the discussion based on what they had seen in the simulation:
19 Alex : If we see what happens when they collide, they just go their separate ways. Change direction.
20 Lisa : But I feel like they never collide.
21 Alex : Yeah, but there's a collision [points to screen].
22 Lisa : Okay, is that a collision?
23 Alex : Yes, there, there! [points to screen]
Here, we see that the visualizations produced by the computational simulation lead the students to question whether molecules do, in fact, actually collide. Since the simulation shows particles interacting through weak van der Waals forces, the particles repel each other and never touch when they get too close. When Lisa says “But I feel like they never collide.” (line 20), she draws upon the recent observation from the computational simulation, where the particles do not touch, and experience from real-world observations. In the macroscopic world, we do not observe any space between colliding objects. Since the exercise explicitly claims that the particles collide, the simulation introduces a gap in Lisa's understanding of what a collision is. This acts as a vexing question based on real-world experiences that come into conflict with the particulate-level simulation. However, this question is presumably quickly resolved by Alex pointing at the screen and showing Lisa that there are indeed collisions in the computational simulation. Thus, the question is seemingly not a vexing question that initiates sensemaking. But, as we shall see, although it was not immediately pursued, it set the students up for sensemaking in a subsequent episode.
After apparently solving the issue of whether there were collisions or not, the students returned to the discussion prompt:
24 Lisa : Okay, what happens is that they…
25 Alex : Changing direction. Do they change speed too?
26 Karen : Yes, they change…
27 Alex : How do you see that?
28 Lisa : Do you see them coming towards each other, and then quickly moving apart? And then the speed increases when they are moving apart. Or they go slower.
29 Alex : Yes, it depends.
30 Lisa : Yeah, so they […] sort of. So, they change speed. [points to screen]
31 Karen : Yes, we are at [exercise] 1? That they change speed and direction?
32 Alex : Yes.
33 Karen : Next question.
Lisa presents arguments through the visual output of the simulation (line 28). Alex seems to be hesitating and gives a vague response to this argument (line 29) and the students seemingly agree about what happens when the particles collide (lines 30–33). The discussion could have ended here, leading us to conclude that the students engaged in answer-making, but as we shall see, the vexing question posed by Lisa is sustained by the simulation and leads to further discussions.
Immediately following the quick conclusion on the first question, Alex reads the next question: “How do the molecules react to the wall?”. He then tried to answer, prompting Karen and Lisa to pitch in:
36 Alex : How do the molecules react to the wall? Yes, how do they react to the wall? They repel? Yes, so they collide, and…
37 Karen : Collide and… Okay, let's look at the speed, then. If it changes. They do not change speed. Do they?
38 Lisa : No, but. It doesn’t look like they change speed, but they change…
[…]
40 John : There is also the fact that they come at different speeds. If someone comes at 10 km h−1 and someone comes at 50, then they will go differently [clashes palms together]. While there they just hit the wall, and then [shows collision with his hand, illustrating the particle, and his palm on the other hand, illustrating the wall, and demonstrates that the particle has the same speed before and after the collision]
Here, Karen and Lisa started by evaluating the output from the computational simulation (lines 37–38). John followed up with an analogy, drawing on both kinaesthetic resources and real-world experience about collisions (line 40). They do not address the vexing question about collisions here, but instead turn to the question about how the molecules react with the wall, iteratively drawing on real-world experiences (e.g. John's argument, line 40) and the simulation (e.g. “Okay, let's look at the speed”, line 37) as resources for sensemaking. When interacting with the walls, the particles in the simulation touch the walls and bounce back elastically in the opposite direction. Since there is no visible space between the walls and the particles, the students do not discuss whether there is a collision with the walls, as they did with the particles.
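The students' observation matches the standard idealization of an elastic wall collision, in which only the velocity component perpendicular to the wall is reversed:

$$v_\perp' = -v_\perp, \qquad v_\parallel' = v_\parallel, \qquad \text{so} \quad |\vec{v}\,'| = |\vec{v}|,$$

leaving the particle's speed unchanged, which is what John demonstrates with his hands in line 40.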
Building on this line of reasoning, Karen used theoretical considerations about ideal gases to evaluate the system:
41 Karen : But we see here. Or something like that. I’ve read a bit [laughs]. But if you think of an ideal gas, with collisions and such, then the kinetic energy must be conserved. And then the speed must be maintained.
42 Alex : But is this an ideal gas?
[…]
45 Karen : But the kinetic energy is the mass times the speed squared. [everyone nods]. When they collide with each other, it's something else. But with the wall, it shouldn’t influence them. Or when it is an ideal gas, then. Then one assumes elastic collisions.
46 Alex : Yes, so that they are unaffected by the collision.
47 Karen : Yes.
Here, Karen drew on existing knowledge which she tried to fit into the group's interpretation of the computational simulation (line 41). They had previously read that this was a simulation of a Lennard-Jones fluid, but their strong theoretical knowledge about ideal gases inspired them to change the premise of the model. Their discussion thus drew on several different resources—theory, real-world knowledge, and the output from the simulation—in a conflicting negotiation about meaning. Although the simulation does not show an ideal gas, their theoretical knowledge helps them contrast different arguments related to the simulation. This type of discussion is typical of sensemaking, as we have previously discussed.
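For reference, the theoretical background the students invoke here is the ideal gas assumption of elastic collisions, under which the total kinetic energy

$$E_k = \sum_i \tfrac{1}{2} m_i v_i^2$$

is conserved; a collision can redistribute speed between particles, but a particle bouncing elastically off a wall keeps its speed, as Karen and Alex conclude (lines 45–47).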
Next, John revisited the question that Lisa posed earlier: do the molecules collide at all? This vexing question is reintroduced by John (line 48) due to conflicts between his perception of a collision and the collision in the simulation, much like how it was introduced by Lisa earlier.
48 John : But they don’t meet each other here. Just almost.
49 Alex : It's probably […] the electron cloud and.
50 Lisa : Yeah, I was wondering if we should just think of it as a collision or if we should think of it as not hitting each other.
51 John : We think of it as a collision.
52 Lisa : Yes, we think of it as a collision.
53 Karen : Yes, because first of all, they don’t hit each other 100%.
54 Lisa : No. But I don’t know.
55 John : No, but they collide anyway.
56 Alex : But should we assume they do?
57 Lisa : We don’t know if it's the model that's made in such a way that they shouldn’t [shows collision with hands].
Here we see that even Alex, who previously tried to convince Lisa that the particles collide, was now negotiating meaning with less certainty than before (line 56). This uncertainty is introduced by the simulation and is essential in creating gaps or inconsistencies in students’ knowledge, which is required for students to engage in sensemaking. Since they are once more discussing whether the molecules collide at all, the other conclusions they have made about the collisions (e.g. that the particles change speed and direction when they collide) are now part of the discussion again.
The students also introduced a new perspective to their discussion by critically evaluating the models underlying the simulation (line 57). This discussion of limitations and affordances of the model, in turn, affected the cognitive resources that the students used in their negotiation of meaning.
We have now seen how the students’ real-world knowledge of macroscopic collisions is incompatible with the output from the simulation, and the simulation is also in conflict with the students’ understanding of ideal gases. Both these conflicts lead the students into a sensemaking process where they negotiate meaning iteratively with the goal of addressing a vexing question (“do the molecules collide?”) and related questions.
Next, the students introduced an additional resource into their discussion: the concept of forces.
58 Karen : Yes, but when you read about the system, it's a two-dimensional Lennard Jones fluid. But then it's repulsive and yes…
59 John : But wouldn’t it go without saying that they collide when they change direction? It is a collision, even if there are attractive and repulsive forces.
60 Karen : Yes. [thinks a long time].
61 Lisa : We can write that, then. That there are attractive and repulsive forces. Interactions.
62 Alex : But shouldn’t it be an ideal gas?
Here, John posited that despite the particles not physically “touching”, they still collide due to forces (line 59). Again, the vexing question about collisions leads the students into sensemaking by introducing uncertainty and stimulating them to come up with additional evidence. Upon reflection, Karen and Lisa agreed (lines 60–61). However, Alex reintroduced the theory of ideal gases as a counterpoint (line 62). The infusion of theoretical insights aided their sensemaking, deterring premature conclusions and thereby sustaining the process.
Having reached a consensus on the occurrence of collisions, the students now delved into the underlying assumptions of the model and their implications for the behavior of the particles in the simulation.
63 Karen : No, not necessarily. It was something that I guessed. But.
64 Alex : Maybe we’re meant to find out.
65 Karen : In an ideal gas, there are no repulsive or attractive forces between the molecules. Theoretically.
66 John : They move in straight lines, then.
67 Lisa : Yes.
68 Karen : Yes, that is…
69 Alex : And if they only move in straight lines, then they cannot be affected by repulsive and attractive forces.
70 Karen : No, that's true. It's just the collisions we see.
71 Alex : Yes, and when they move. There are no repulsive… Or, no. [everyone writes]
Here, the students utilized their understanding of the behavior of ideal gases and the dynamics of such a system to interpret their observations of the computational simulation (lines 65–71). This engagement with the simulation served to focus their attention on the limitations of the ideal gas law, contrasting the implications of that model with the behavior they were observing. The simulation shows particles moving in lines with a slight curve when they approach other particles. Even though they do not interpret this interaction correctly, they still use different resources to try and make sense of what happens.
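The slightly curved paths the students observe follow from the force implied by the Lennard-Jones potential given earlier:

$$F(r) = -\frac{\mathrm{d}V}{\mathrm{d}r} = \frac{24\varepsilon}{r}\left[2\left(\frac{\sigma}{r}\right)^{12} - \left(\frac{\sigma}{r}\right)^{6}\right],$$

which is repulsive at short range and attractive beyond $r = 2^{1/6}\sigma$. In an ideal gas this force is zero everywhere, so particles would move in straight lines between collisions, which is exactly the distinction the students are negotiating.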
In the final part of the episode, Lisa sought further clarity, leading to a resolution to the question of what it means for particles to collide:
72 Lisa : Ok, it's probably a stupid question, something I didn’t get, but.
73 Karen : There are no stupid questions! [smiles]
74 Lisa : But why can’t they be affected by repulsive and attractive interactions when they go in straight lines?
75 Alex : [points to the screen] If you have one molecule that's here, and it repels a molecule that's passing by, then it's going to affect the path or…
76 John : So, then it will kind of come, then they will. Watch now [shows palm and fist]. When it's just a straight line, then it's [denotes collision]. But with attractive and repulsive, it will be more tsjiuuuum [sound language. Also shows with hands a parabolic path as the particle approaches another].
77 Lisa : Oh, yeah. There you go!
78 John : [Lisa and Karen mimic the sounds John made and laughs] So some particles will be attractive, so then it will come against, also it will niiiiiiii in [sound language and hands showing two particles “sticking” to each other]. While the repulsive ones will then just push them away.
79 Lisa : Yes, exactly. Exactly.
80 John : And give more curved lines.
81 Lisa : Ah, well explained!
Here, we see that in response to Lisa's questioning (line 74), Alex anchored his explanation in the computational simulation (line 75), after which John iterated on this explanation using illustrative hand gestures (line 76). Thus, the simulation again provided a productive input to the students’ discussion and argumentation, which served to sustain their sensemaking process. This, in turn, prompted John to use embodied knowledge (Crowder, 1996) to explain how systems with and without attractive and repulsive forces will act. Immediately after this explanation, everyone expressed agreement that they had reached a satisfactory conclusion—a successful resolution to the sensemaking process.
To review: we are arguing that this episode illustrates a complete and successful cycle of sensemaking, as defined above. That is, it fulfils all three of the criteria for sensemaking: the students at various points drew on their real-world experience as a resource in the discussion (along with scientific theory and visualization from the computational model); they iteratively built their explanation through multiple cycles of construction and critique; and the students produced a cogent argument that had a clear claim (there are collisions between particles), evidence (the computational simulation output), and reasoning (John's explanation of why the trajectory of the particles is characteristic of collisions). Furthermore, there is a clear, recurring, vexing question about the nature of collisions that acts as an entry condition into the sensemaking process and a unifying thread throughout the subsequent discussion.
To make the complete cycle of sensemaking easier to follow, we have visualized the discussion in Fig. 4. In this diagram, we indicate movement between arguments and questions with arrows. Resources from the computational simulation and the computational model are marked with blue circles.
From this perspective, we see that at first, the sensemaking discussion is quite linear, ending in a premature resolution. Karen and Lisa start to examine whether the particles change speed or not through the simulation output, and John follows up with arguments based on real-world experiences of collisions. This is followed by the vexing question, where a gap in their understanding of collisions is introduced. Here, the group uses the computational simulation and real-world experiences interchangeably to explore and build arguments. This is an iterative process where the concept of ideal gases is revisited several times, and results from the computational simulation act as a way of sustaining this process by providing constant feedback.
This representation also allows us to disentangle the role of the computational simulation in the process, which we argue is to provide the students with resources for their discussion. These resources are different representations of the phenomena, including the visual output and the underlying computational model, which the students then pick up and use in their argumentation. Importantly, the use of these computational resources often inspires further sensemaking; for example, when students start to wonder which model is used in the computational simulation, their knowledge of ideal and real gases is drawn in as evidence in their arguments (e.g. “In an ideal gas, there are no repulsive or attractive forces between the molecules. Theoretically.”), but also as the foundation of new questions (e.g. “But shouldn’t it be an ideal gas?”). Most importantly, the discussion about the existence of collisions between particles acts as a vexing question throughout the whole episode. This question likely emerged because of the difference between microscopic collisions in the simulation (space between particles) and the perception of macroscopic collisions in the real world (no space between particles). Thus, we argue that computational simulations can productively aid and sustain sensemaking by conveying information in different representational formats.
Returning to our research question, the goal of this study was to examine mechanisms behind how computational simulations could sustain sensemaking. Based on the evidence in our analysis, we propose four key affordances of computational simulations for sustaining sensemaking:
1. The explorative factor: The interactive element of a computational simulation encourages students to experiment and explore a problem instead of solving a problem and then looking up a solution.
2. The feedback factor: Computational simulations provide students with immediate feedback, which can serve to keep students engaged in a task for a longer period of time.
3. The representational factor: A computational simulation combines conceptual chemistry with mathematical concepts and algorithms to produce visual output of a model.
4. The uncertainty factor: Simulations can produce results that come into conflict with previous experience. This can introduce a higher level of uncertainty for the students, which is important to sustain the sensemaking process.
Through these mechanisms, computational simulations can be used to facilitate persistence and the exploration of ideas. Viewing our data through these potential mechanisms, we note that in the sensemaking episode above we saw mostly mechanisms two, three, and four, and less of students rerunning the computational simulation to explore different scenarios. Although the students kept rerunning the computational simulation to produce visual aids for their argumentation and questions, they did not change the models or parameters of the numerical experiment. This probably has to do with the design of the exercise, where we did not explicitly try to lead the students into exploring different outcomes.
We also saw more of the feedback factor than the representational factor. Although the students discussed the underlying models of the computational simulation, they could have examined the mathematical models by examining the code in the BubbleBox modules. However, as with the explorative factor, this was not explicitly prompted in the exercise. We can therefore infer that careful design of the computational simulation exercise is crucial for leading students into sensemaking that is sustained by the representational and explorative factors. One factor that stands out as an evident mechanism for sustaining the sensemaking process is the uncertainty factor. In the episode presented above, the simulation provided the students with output that conflicted with their knowledge and real-world experiences—that is, it kept them in a state of uncertainty, where they needed to engage in repeated discussion in order to resolve their gap in knowledge. This uncertainty is a key difference between sensemaking and answer-making: answer-making requires certainty that one has come to a right answer, whereas sensemaking is sustained by uncertainty. Thus, tools, activities, questions, or visualizations that allow students to become aware of their own uncertainty can potentially act as resources for sensemaking. Computational simulations provide one such tool, and our research shows promising results in students using such explorative simulations to sustain sensemaking about particulate-level models in chemistry.
Furthermore, the generalizability of our findings is constrained by the context-specific nature of a case study, which is not fully applicable to different educational settings or disciplines. We have presented a study that shows possible mechanisms behind how sensemaking can be initialized and sustained through computational simulations, but the effect we have described clearly depends on the design of the activity and the context in which it is introduced. For instance, students need to be comfortable with sharing ideas for a collective sensemaking process to occur. The social aspect of sensemaking is an important factor not addressed explicitly in this study.
It is important to note that we wanted to investigate the sensemaking process, not the resulting conceptual understanding from this process. Therefore, we chose an episode where the learning assistant does not intervene, because we wanted to see how the students themselves interacted with the simulation, without corrections or external facilitation from the LA. We also chose to study an authentic learning situation with exercises made by the instructor, not by chemistry education researchers for a specific purpose. Thus, the students were led into sensemaking by themselves and the simulation, even though some of the conclusions they made were incorrect. Consequently, this study does not address conceptual change or understanding of a particular chemical phenomenon, merely the sensemaking process that can lead to such learning.
While computational simulations are a valuable tool for enhancing sensemaking in chemistry education, the design of the simulation activity and the integration of this tool within a broader pedagogical framework are crucial for realizing its full potential.
Chemistry educators can use our four proposed mechanisms as a framework for designing classroom activities that facilitate and support sensemaking. For example, educators can create prompts for students to explore the effect of different factors on a chemical system, as sketched below. Students can then get feedback from the simulation in the form of different representations, like numbers, graphs, or graphics, which they can be prompted to discuss the meaning of. By posing questions that take into account students' real-world knowledge, we can introduce an uncertainty that drives the students into a negotiation of meaning with themselves and each other.
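As a minimal sketch of such a prompt (assuming only numpy and matplotlib; this example is illustrative and not tied to BubbleBox), students could be asked to vary the Lennard-Jones well depth and discuss what the resulting plots mean for how particles interact:

```python
# Sketch of an exploration prompt: vary the well depth and discuss the plot.
import numpy as np
import matplotlib.pyplot as plt

def lj_potential(r, epsilon, sigma=1.0):
    """Lennard-Jones potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    return 4 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

r = np.linspace(0.9, 3.0, 300)
for epsilon in (0.5, 1.0, 2.0):  # prompt: "What does a deeper well mean physically?"
    plt.plot(r, lj_potential(r, epsilon), label=f"epsilon = {epsilon}")

plt.axhline(0, color="grey", linewidth=0.5)
plt.xlabel("interparticle distance r")
plt.ylabel("V(r)")
plt.legend()
plt.show()
```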
Keep in mind that if conceptual change or learning of a specific concept is the goal of the activity, it could be useful to include some control mechanisms so that the uncertainty factor does not lead to wrong conclusions in the end. Explicit control questions, provided partial solutions, or teacher–student interaction might serve this purpose.
In addition, we encourage the use of tools like BubbleBox as more open-box simulations, where students can explore how different factors affect a specific system. This requires a basic knowledge of programming but is a potential way to engage students in modelling and sensemaking through exploration. Such activities could also be the subject of future research, which would be a valuable addition to the field.