Failing forward in chemistry laboratory courses: the impact of engagement and mistakes during pre-lab activities on students' situational engagement

Reetta Kyynäräinen *a, Lars-Erik Malmberg b, Elisa Vilhunen c, Mikko-Jussi Laakso d and Veli-Matti Vesterinen a
aDepartment of Chemistry, University of Turku, Turku, Finland. E-mail: reetta.kyynarainen@utu.fi
bDepartment of Education, University of Oxford, Oxford, UK
cDepartment of Education, University of Helsinki, Helsinki, Finland
dTurku Research Institute for Learning Analytics, University of Turku, Turku, Finland

Received 27th June 2025 , Accepted 9th September 2025

First published on 10th September 2025


Abstract

This study investigates the role of mistakes and affective experiences during online pre-lab activities in predicting students’ situational engagement (conceptualized here as a simultaneous experience of interest, skill, and challenge, i.e. optimal learning moments) in subsequent laboratory sessions in an undergraduate chemistry laboratory course (n = 256). The data collection followed an ecological momentary assessment design. We specified multilevel structural equation models (MSEMs), including two- and three-level models, to examine how mistakes impacted students’ situational engagement during pre-lab activities and subsequent laboratory sessions. The findings indicate that mistakes in pre-lab tasks were associated with lower perceived skill and higher experience of challenge during that task, but did not predict students’ interest, skill, challenge, or situational engagement in the subsequent laboratory session. Autoregressive effects from pre-lab activities on students’ situational engagement during lab sessions were observed across all elements of engagement, while skill and challenge during pre-lab activities also predicted higher interest in the subsequent laboratory session. Based on our findings, we propose that while mistakes in the pre-lab activities do not play a significant role in predicting students’ engagement upon entering the laboratory, affective experiences during pre-lab activities do, and that laboratory engagement could be enhanced by providing students with sufficiently challenging pre-lab activities.


Introduction

Although performance and learning are not necessarily indicative of one another, learning is often evaluated based on one’s performance during instruction (Soderstrom and Bjork, 2015). Lower performance during acquisition, indicated by, for instance, mistakes, might not hinder learning and can sometimes have the opposite effect (see e.g. Loibl and Leuders, 2019). Evidence suggests that especially long-term motor learning, which is one of the most significant learning objectives of laboratory instruction (cf. Kirschner, 1992), benefits from distributed practice, which can momentarily lead to suboptimal performance, including more mistakes (Keith and Frese, 2008; Soderstrom and Bjork, 2015).

In particular, disciplines encompassing a trial-and-error nature, such as chemistry, foster environments where mistakes inevitably occur (Schmid et al., 2022; Agustian et al., 2024; Kyynäräinen et al., 2024). Nonetheless, there is still a lack of research on students’ perceptions of and affective responses to making mistakes, particularly across varying contexts and learning environments, such as chemistry laboratory courses (cf. Steuer et al., 2025). Understanding students’ reactions to their errors is crucial for supporting the learning that stems from these integral events. Beyond this, understanding the previously overlooked lagged effects of making mistakes can help educators intervene early and provide additional support to students in need (cf. Kyynäräinen et al., 2024).

In addition, the affective domain in general has often been overlooked in chemistry laboratory-based studies (see e.g. Galloway et al., 2016; Agustian and Seery, 2017; Agustian, 2022). Previous research suggests that it is the primary driver of undergraduate students’ learning goals in chemistry laboratory courses (DeKorver and Towns, 2015) and that students’ expectations determine their perceived and enacted experiences (Kirschner et al., 1993; Galloway and Bretz, 2015). Based on these findings, students’ prior affective experiences could significantly predict their goal-setting and learning processes in the chemistry laboratory. This might arise from students’ attempts at affective forecasting, that is, trying to foresee future emotional experiences based on their expectations of success or failure (Wilson and Gilbert, 2003; Pilin, 2021). Affective forecasting might plausibly lead to assimilation, where students shift their actual emotional experiences to meet their forecast, increasing the significance of anticipations of future success or failure (Wilson and Gilbert, 2003).

In order to expand knowledge about how students’ prior experiences influence their learning during laboratory instruction, we focus on the role of pre-lab activities and their affective outcomes and investigate how students’ engagement and mistakes in pre-lab activities predict their engagement in subsequent laboratory sessions. We go beyond previous studies by investigating the lagged effects of students’ experiences during pre-lab exercises on their affective experiences in the laboratory. This study deepens the understanding of students’ affective responses to making mistakes in a new context and provides insight into the role of pre-lab activities in promoting high situational engagement during subsequent laboratory instruction.

The research data were collected using an ecological momentary assessment (EMA) design, in which brief questionnaires measure students’ in situ experiences repeatedly during or immediately after learning tasks in authentic learning settings (see Sinatra et al., 2015). In this study, students responded to a total of 20 EMA questionnaires while completing the digital pre-lab activities and during the laboratory sessions. We view mistakes as situation-specific instances that influence learning engagement (cf. Kyynäräinen et al., 2024). Considering the dynamic nature of engagement (e.g. Greene, 2015; Schneider et al., 2016; Schmidt et al., 2018; Upadyaya et al., 2021; Heikkinen et al., 2025; Tang et al., 2025), which the EMA design is well-suited to capture, we utilize situational engagement, also referred to as optimal learning moments (Schneider et al., 2016), as a theoretical lens to study undergraduate students’ momentary experiences over a chemistry laboratory course. According to this conceptualization, students experience high levels of situational engagement when high situational interest, high perceived skill, and sufficiently high task difficulty meet; that is, when they are challenged just the right amount. The EMA questionnaires contained items on these optimal learning elements – interest, skill, and challenge. The design enables us to capture how situational engagement and its elements change over time and which factors, such as making mistakes or prior engagement, might contribute to these changes.

Chemistry laboratory courses

Laboratory courses are distinctive, ever-present parts of chemistry education, particularly at the higher education level (Agustian, 2022; Seery et al., 2024). Recently, teaching laboratories have been established as places to learn to do science, creating complex, embodied learning environments that differ greatly from classic classrooms or lecture halls (Seery, 2020; Agustian, 2022; Seery et al., 2024). These complex learning environments are characterized by integrating knowledge, skills, and attitudes, requiring the transfer of the learned to real settings, and involving the coordination of qualitatively different elements (Agustian and Seery, 2017; van Merrienboer et al., 2003). Laboratory education aims at a broad variety of outcomes, including improving students’ experimental skills, disciplinary learning, higher-order thinking skills, transversal competencies, and affective outcomes (Seery et al., 2024).

Chemistry laboratory courses typically consist of preparative pre-lab activities followed by an instructional laboratory experiment. Most traditionally, pre-lab activities aim to introduce chemical concepts through, for instance, a pre-laboratory lecture, quizzes, or discussion (Agustian and Seery, 2017). Other possible objectives of pre-lab activities are introducing laboratory techniques with, for example, simulations, technique videos or safety information, as well as preparing students for the affective aspects of laboratory work by, e.g. increasing their confidence or motivation or reducing anxiety (Agustian and Seery, 2017; Chu, 2017; Spagnoli et al., 2017; Rayment et al., 2023). Pre-lab activities are beneficial for learning as they provide opportunities to stimulate and increase students’ knowledge of the key chemical concepts prior to the laboratory experiment itself – facilitating deeper engagement with the content (Chu, 2017; Seery et al., 2024). According to Seery and colleagues (2024), these activities could also help students identify the intended goals of the laboratory experiment, promoting constructive alignment and allowing them to engage in a meaningful way.

Based on previous research, students’ situational engagement during laboratory instruction is highly prone to external or structural triggers, particularly being given choice in, for example, framing one’s own work (Schmidt et al., 2018). In addition, Galloway and Bretz (2015) found that for most students, their learning experience in the laboratory is intensely framed by their expectations – what they anticipate thinking and feeling becomes an enacted reality. Previous research also suggests that prior engagement can significantly predict contemporary engagement (Wylie and Hodgen, 2012). Considering these notions, students’ antecedent experiences during the pre-lab activities might impact the level of their engagement in the laboratory (cf. Agustian and Seery, 2017).

Situational engagement

Engagement can be studied through various lenses, but here we use the situational engagement paradigm, conceptualizing high situational engagement as optimal learning moments (OLMs) that encompass a simultaneous experience of high interest, skill, and challenge (Schneider et al., 2016; Tang et al., 2025). The OLM framework unites interest, competence, and task difficulty, building on the idea that learning should be difficult enough, but not overwhelmingly hard. It emphasizes the importance of positive pre- and post-dispositional affect, sufficient challenges, and sometimes even a moderate amount of stress to foster effective learning (cf. Tuominen-Soini and Salmela-Aro, 2014; Schneider et al., 2016; Tang et al., 2025; de Anda et al., 2000).

The optimal learning theory is based on the concept of flow (Csikszentmihalyi, 2000; Schmidt, 2010) and situational interest (Hidi and Renninger, 2006; Krapp and Prenzel, 2011). Csikszentmihalyi (2000) defines flow as a situation-specific instance so deeply engaging that human needs are momentarily suspended and time loses its temporal boundaries, characterized by above-average levels of challenge and skill. Situational interest, on the other hand, is described as a spontaneous motivating psychological state of a short duration, triggered typically by external features and sustained through meaningfulness of tasks and/or personal involvement (Hidi and Renninger, 2006). Situational interest is perceived as an important precondition for learning, as it enables the student to focus on the tasks at hand and motivates them to put in effort (Schraw and Lehman, 2001; Hidi and Renninger, 2006). Thus, optimal learning theory links situational interest to the higher-than-average levels of skill and challenge during flow (Schneider et al., 2016; Tang et al., 2025).

OLMs are assumed to promote learning (Schneider et al., 2016; Tang et al., 2025), although the linkage between experiencing OLMs, i.e. high situational engagement, and heightened learning outcomes has not been empirically confirmed. However, previous research indicates that the relationship between situational engagement and learning could be mediated by students’ self-regulatory processes (cf. Heikkinen et al., 2025). Namely, experiencing high situational engagement can promote students’ self-regulatory processes and the use of more efficient learning strategies (e.g. Lee et al., 2014; Schneider et al., 2016), which, in turn, are associated with heightened learning outcomes. Beyond influencing students’ immediate learning outcomes, situational engagement can also shape broader affective factors, such as experiences of concentration, enjoyment, efficacy, and success (Schneider et al., 2016; Tang et al., 2025). Hidi and Renninger (2006) describe how individual interest can be developed and deepened through situational interest. Similarly, situational engagement could develop into sustained engagement in students (cf. Hidi and Renninger, 2006).

According to previous findings, students’ momentary, task-specific perceptions of utility and attainment values are reflected in their situational engagement, i.e. OLMs (Schmidt et al., 2018; Eccles and Wigfield, 2020; Salmela-Aro et al., 2021; Tang et al., 2025). Previous research also indicates that the type of instructional activity affects students’ situational engagement (Schmidt et al., 2018; Inkinen et al., 2019; Inkinen et al., 2020; Vilhunen et al., 2025). Inkinen and colleagues (2019) conducted a study in Finland and the USA, which revealed differences in the proportion of time students in secondary school science classrooms experienced OLMs on average during lab work (21.7% in Finland and 15.1% in the USA), calculating (38.1% in Finland and 18.1% in the USA), and computer work (25.2% in Finland and 14.9% in the USA). Compared to other types of activities, Finnish students were twice as likely to experience high situational engagement when they were calculating. Pre-lab activities often contain calculations, suggesting a possibly engaging experience for students (Inkinen et al., 2019). Despite the growing body of research on students’ affect in the laboratory, little is known about the lagged effects of students’ engagement from pre-lab activities on the subsequent laboratory sessions (Agustian and Seery, 2017).

The role of mistakes

Making mistakes can significantly influence learning. Not only encountering mistakes but also the way one perceives them (also referred to as error beliefs or error learning orientation, i.e. valuing mistakes as learning opportunities) can impact learning processes by shaping one’s engagement, emotions and motivation (see Tulis and Fulmer, 2013; Leighton et al., 2018; Tulis et al., 2018; Kyynäräinen et al., 2024; Schmid et al., 2025). In general, making mistakes is associated with increased negative emotions and hindered perceptions of skill (Tulis and Ainley, 2011; Allchin, 2012; Agustian et al., 2024; Kyynäräinen et al., 2024; Sharabi and Roth, 2025). However, making mistakes could promote situational engagement, i.e. OLMs, through increased challenge and agency (Galloway et al., 2016; Kyynäräinen et al., 2024), particularly if the context is conducive to learning (cf. Ames and Archer, 1988; Ames, 1992). In such a context, the teacher has created a classroom climate in which mistakes, errors and challenges are seen as natural parts of learning and can provide task-contingent feedback (Hattie and Timperley, 2007; Leighton et al., 2018; Soncini et al., 2022).

Error-centred learning interventions (Chu and Leighton, 2019; Lee, 2020), like productive failure activities, in which students’ misconceptions are utilized as prompts for conceptual change, can enhance cognitive learning (Loibl and Leuders, 2019; Vilhunen et al., 2023). These interventions promote adaptive, action-related responses to mistakes, which have been associated with further acquisition of knowledge, enhancing students’ learning stemming from mistakes (Spear et al., 2024). Although detecting and discussing individuals’ mistakes in depth can be effective in promoting students’ learning, it can be challenging to implement this in real classroom or laboratory settings alongside other instructional activities (Narciss and Alemdag, 2025). An alternative approach could be erroneous examples or solutions presented by a teacher, allowing students to detect the mistake, followed by a classroom discussion of the mistake at hand (Safadi and Yerushalmi, 2014). This approach was also found to elicit more naïve ideas and to support learning transfer, especially for students with lower prior knowledge.

Nevertheless, previous research (Schmid et al., 2025; Tulis and Dresel, 2025) suggests that brief interventions aimed at improving students’ error beliefs and learning tend to be somewhat ineffective, as internalizing the (meta-)cognitive and (meta-)affective strategies for dealing with mistakes can take time (Narciss and Alemdag, 2025). Instead, for example, encouraging dialogue about overcoming mistakes, accompanied by emotion recognition and concrete action plans, has been found to reduce students’ fear of failure (Peterson et al., 2025). Although these error beliefs seem to be rather stable, in situ responses to mistakes seem to be more malleable (Soncini et al., 2025).

Accordingly, students’ affective responses to mistakes have been found to differ in, for example, situations where the mistake is resolved with an instructor or with one’s peers (Kyynäräinen et al., 2024). In general, integrative emotion regulation (Sharabi and Roth, 2025), as well as positive framing of errors and the absence of framing mistakes negatively or as deleterious (Leighton et al., 2018; Käfer et al., 2019; Tulis and Dresel, 2025), might also contribute to students’ adaptive coping with making mistakes, plausibly promoting learning. This could mean encouraging students not to fear challenge or frustration, but to be attentive to them and to identify them not as ego-threatening but as excellent indicators of learning possibilities, i.e. meta-affective learning (DeKorver and Towns, 2015; Radoff et al., 2019; Thomas et al., 2022; Sharabi and Roth, 2025). Meta-affective learning could promote behavioural shifts, that is, actions towards learning from challenges or mistakes. For example, a student might fail a pre-lab activity and experience disappointment or frustration, but this could lead them to recognize the need to revise before entering the laboratory and to view the laboratory experiment as a good opportunity to fill in their learning gaps (cf. Chu, 2017).

The present study

Previous research has indicated that some students consider avoiding mistakes an important goal for laboratory courses (DeKorver and Towns, 2015). There is a need for research on students’ perceptions of their own mistakes, particularly across different contexts and environments (cf. Steuer et al., 2025). This study takes up this need, investigating undergraduate students’ affective responses to mistakes in an online learning environment during pre-lab tasks and later in subsequent laboratory sessions. Additionally, Agustian and Seery (2017) highlighted the need for further research into the role of pre-lab activities in shaping affective dimensions of practical chemistry. In this study, we address this research gap by focusing on the lagged effects of students’ affective experiences and mistakes during the pre-lab activities. We deploy the OLM framework to study dynamic short-term engagement within a student with due respect to the specific context (e.g. Sinatra et al., 2015; Schneider et al., 2016; Salmela-Aro et al., 2021; Atabek-Yigit and Senoz, 2023; Kyynäräinen et al., 2024) – providing valuable perspectives on students’ momentary and lagged affective responses to mistakes. In order to better understand the patterns of situational engagement in the laboratory, we also investigate how it changes during instruction.

The research questions of this study are:

1. To what extent is making mistakes associated with students’ situational engagement in the pre-lab activities?

2. To what extent do students’ situational engagement and making mistakes in the pre-lab activities predict their situational engagement during the subsequent laboratory session?

3. How does students’ situational engagement change within a laboratory session and across the laboratory course?

Methods

The research took place at a Finnish university, and the data were gathered from the first undergraduate chemistry laboratory course during the autumn semesters of 2023 and 2024. The research design, identical in both years, included a background questionnaire in addition to a total of 20 ecological momentary assessment (EMA) questionnaires administered over the course. In EMA designs, self-reports are collected through brief repeated questionnaires administered in real time and in authentic settings. In these questionnaires, individuals report their momentary experiences at fixed measurement points over a given period (Kitterød and Lyngstad, 2005). The strength of EMA questionnaires is that they provide highly situational questionnaire data with minimal recall bias (Kitterød and Lyngstad, 2005; Hektner et al., 2007). Additionally, they are highly contextual, because participants reflect on their experiences at a given moment in an authentic setting, providing a so-called person-in-context approach to studying engagement (Greene, 2015; Sinatra et al., 2015).

Research environment and participants

The research context was the first chemistry laboratory course for undergraduate students, with the learning objectives of being able to safely carry out laboratory experiments under instruction and mastering some key separation and analysis methods. This course consisted of seven laboratory sessions with different topics and traditional undergraduate laboratory tasks: (1) preparing standard solutions and measuring concentration, (2) finding the equilibrium constant spectrophotometrically, (3) distillation, (4) liquid–liquid extraction, (5) pH-titration, (6) complexometric titration, and (7) buffer solutions.

As a preparatory task before each lab session, students were asked to complete mandatory learning exercises on an online learning platform. The pre-lab tasks were to be completed before entering the laboratory, but no earlier than one week in advance. Completing the tasks took a little under an hour. After completing the exercise and a brief EMA questionnaire, students automatically received feedback showing which tasks they had answered correctly or only partially correctly – that is, whether they had made any mistakes. These pre-lab tasks were embedded into the overall laboratory learning process (cf. Agustian and Seery, 2017), as they were related to the topic of each subsequent laboratory session. The aim was to familiarize students with the concepts and the laboratory instructions of the upcoming experiment. The preparative exercises included, for instance, calculations, questions about the key analysis methods or procedures, and simulations related to the topic of the laboratory work.

In the laboratory, students performed the weekly 3- to 4-hour experiments in pairs, led by the same teacher each week. Students were required to keep a laboratory diary containing all the calculations required to complete the experiment. After finishing the experiment, students presented their lab diary to the teacher and discussed the work. Each student began the course by carrying out experiment 1; however, the rest of the experiments were performed in one of three alternative orders (1, 2, 3, 5, 7, 4, 6 or 1, 3, 2, 4, 6, 5, 7 or 1, 2, 5, 3, 4, 6, 7).

In total, 256 students took part in the study (n = 155 in 2023 and n = 101 in 2024). The participants were mostly first-year students (n = 188, 73.4%), while 33 (12.9%) were second-year students and 35 (13.7%) were in their third year or above. Females were overrepresented in this study, as 185 (72.3%) of the participants identified as female and 65 (25.4%) identified as male. The participants majored in biochemistry (n = 78, 30.9%), biotechnology (n = 62, 24.2%), chemistry (n = 47, 18.4%), physics (n = 20, 7.8%), biology (n = 11, 4.3%), mathematics (n = 8, 3.1%), geology (n = 8, 3.1%), materials technology (n = 6, 2.3%), and other subjects (n = 16, 6.25%).

Data collection

At the beginning of the laboratory course, students responded to a background questionnaire. Ecological momentary assessment (EMA) design was implemented to collect situational data after the pre-lab activities and during the laboratory sessions. Before we started collecting the situational data from the laboratory, all students had completed the first laboratory experiment. Thus, they were already familiar with the standard practices and the chemical equipment used in the teaching laboratory.

We collected the EMA data from laboratory experiments 4 (liquid–liquid extraction), 5 (pH-titration), 6 (complexometric titration), and 7 (buffer solutions). During these four laboratory sessions, students responded to a total of 13 EMA questionnaires, with 3 or 4 questionnaires administered within each session. The links to the EMA questionnaires were embedded in the work instructions, and students responded at predetermined points of the laboratory experiment using their phones. Thus, the data collection followed students’ progress in the experiments, allowing us to study different phases of work with the respective EMA questionnaires. This also ensured that responding to the EMA questionnaires interfered with students’ workflow and engagement as little as possible. EMA questionnaires were administered approximately an hour apart, and the last EMA questionnaire of each experiment was completed after finalizing the experiment, but before discussing it with the teacher.

Altogether, we received 1586 situational responses from the laboratory, with a compliance rate of 47.7%, and the mean number of EMA questionnaire responses per person was 7.36 (SD = 3.56, range = [1,13]). In addition, we received 1006 situational responses from the four preparative learning exercises preceding these four laboratory sessions. The EMA questionnaire items automatically appeared after the students had completed the pre-lab learning exercises. Therefore, the compliance rate in these EMA questionnaires was considerably higher than in the lab, namely 98.2%. The EMA questionnaires were quick to respond to, taking only approximately 1 to 3 minutes.

All EMA questionnaires contained the same single-item indicators of the levels of situational interest, perceived skill, and task difficulty (Schneider et al., 2016), originally adapted from Csikszentmihalyi and Schneider’s (2000) as well as Shernoff and colleagues’ (2003) studies. In general, previous research suggests that single-item indicators offer a psychometrically sound alternative to longer scales in measuring motivational-affective constructs, particularly across contexts with repeated measures, such as EMA designs (see Hoeppner et al., 2011; Gogol et al., 2014; Fisher et al., 2016). Although the internal consistency of the measures cannot be assessed with only one item, there is evidence for the stability of self-competence items (cf. skill) within students across time (Moneta et al., 2014) as well as significant cross-study consistency for the OLM items used in this study across various contexts and age groups, including undergraduate students in chemistry laboratory courses (see e.g. Schneider et al., 2016; Upadyaya et al., 2021; Atabek-Yigit and Senoz, 2023). The questionnaires were implemented in Finnish, and the phrasing aligned with previous research (e.g. Schneider et al., 2016; Inkinen et al., 2019; Inkinen et al., 2020; Tang et al., 2025), including the following items: ‘Were you interested in what you were doing?’, ‘Did you feel skilled at what you were doing?’, and ‘Did you feel challenged by what you were doing?’. Participants reported their level of agreement on a 5-point Likert scale with response categories ranging from ‘1 = Not at all’ to ‘5 = Very much’.

Analytical approach

Concerning the pre-lab activities, we coded mistakes in cases where, based on the students’ pre-lab quiz responses, they had made mistakes in one or more of the subtasks. All tasks were automatically assessed, and thus, mistakes were indicated by task scores falling short of the maximum score. This resulted in 643 (63.9%) cases that included mistakes. Typical mistakes in the pre-lab tasks included calculation errors and misunderstanding the laboratory instructions for the related experiment. Thus, many of these mistakes were presumably caused by students’ lack of the skills needed to successfully carry out the laboratory experiment in the subsequent lab session.

Across all measurement points in the data, the means and standard deviations for the optimal learning elements were as follows: interest (M = 3.43, SD = 0.95), skill (M = 3.20, SD = 1.09), and challenge (M = 2.84, SD = 1.12). Accordingly, optimal learning moments (Schneider et al., 2016), representing high situational engagement, were coded as 1 when interest, skill, and challenge were simultaneously above the population average (4 or 5 for interest and skill, and 3, 4, or 5 for challenge), and 0 in all other cases. Finally, we coded progress within the session to range from 0 (pre-lab activities) to 1 (final EMA questionnaire during the session). Similarly, progress across the entire course ranged from 0 (before the first experiment) to 1 (the final experiment).
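
To make these coding rules concrete, the following is a minimal sketch in Python, assuming a long-format pandas DataFrame with hypothetical column names (score, max_score, interest, skill, challenge, ema_index, n_ema, experiment_index, n_experiments); the actual data preparation was not necessarily carried out this way.

    import pandas as pd

    def code_situational_variables(df: pd.DataFrame) -> pd.DataFrame:
        """Derive the binary and scaled variables described above (column names are hypothetical)."""
        out = df.copy()
        # Mistake in a pre-lab activity: the automatically assessed score falls short of the maximum.
        out["mistake"] = (out["score"] < out["max_score"]).astype(int)
        # Optimal learning moment: interest and skill of 4 or 5 together with challenge of 3, 4, or 5.
        out["olm"] = ((out["interest"] >= 4) & (out["skill"] >= 4) & (out["challenge"] >= 3)).astype(int)
        # Progress within a session: 0 for the pre-lab activity, 1 for the final EMA questionnaire.
        out["progress_session"] = out["ema_index"] / out["n_ema"]
        # Progress across the course: 0 before the first experiment, 1 at the final experiment.
        out["progress_course"] = out["experiment_index"] / out["n_experiments"]
        return out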

The statistical analysis methods were selected considering the hierarchical structure of the data. That is, situational responses are nested within lessons, which are again nested within individuals. Namely, each student (level 3, individual level) has not only participated in multiple lessons (level 2, lesson level) but also responded to multiple EMA questionnaires within one lesson (level 1, situational level). Therefore, a three-level hierarchy is present in the data, and our modelling approach included three-level MSEM models. In these models, individuals and lessons were treated as clusters.

The intraclass correlations (ICCs) of interest, skill, and challenge indicated that a three-level approach was reasonable. Generally, a multilevel approach is suggested when ICCs are above 0.10 (Irimata and Wilson, 2018). Here, at the lesson level, ICCs were 0.08 for OLMs, 0.21 for interest, 0.11 for skill, and 0.07 for challenge, and at the individual level, 0.17 for OLMs, 0.31 for interest, 0.19 for skill, and 0.12 for challenge. This indicates that a significant share of the variation in the variables is explained by both the lesson and the individual, but most of the variance was found at the situational level.
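
For reference, in a three-level random-intercept decomposition these ICCs correspond to the share of the total variance located at the lesson and individual levels; a standard formulation (our notation, not reproduced from the article’s models) is

    \mathrm{ICC}_{\text{lesson}} = \frac{\sigma^2_{\text{lesson}}}{\sigma^2_{\text{situation}} + \sigma^2_{\text{lesson}} + \sigma^2_{\text{individual}}},
    \qquad
    \mathrm{ICC}_{\text{individual}} = \frac{\sigma^2_{\text{individual}}}{\sigma^2_{\text{situation}} + \sigma^2_{\text{lesson}} + \sigma^2_{\text{individual}}}

where σ²_situation, σ²_lesson, and σ²_individual denote the variance components at the situational, lesson, and individual levels, respectively.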

Situational level predictors included progress within a session as well as progress across the entire course. Lesson-level predictors, on the other hand, included students’ experiences during pre-lab activities, their mistakes in these activities, as well as the topic of the experiment at hand. At the individual level, optimal learning elements were allowed to correlate with one another; however, no individual-level predictors were included in the models. Results concerning the individual-level effects on situational engagement can be found in our previous article (Kyynäräinen et al., 2024).

Four multilevel models were created to answer the research questions in turn. Model 1 (see Fig. 1) and Model 2 (see Fig. 2) aimed to answer RQ1. They were simple two-level regression models, including only the individual level and the lesson level (ICCs for OLMs = 0.14, interest = 0.32, skill = 0.14, and challenge = 0.14). This was due to the lack of variance at the situational level, as these models focused on students’ experiences only during pre-lab activities. Model 1 was a two-level, logistic regression-based structural equation model evaluating the relationship between making mistakes and OLMs at the lesson level, including the associations between the experiment at hand (i.e. laboratory session covariates) and the probability of making mistakes. The individual-level means and variances of experiencing OLMs were also considered in the model. Model 2 extended Model 1 by investigating the elements of optimal learning (interest, skill, and challenge) separately.


Fig. 1 Model 1, a two-level logistic regression model aimed at answering RQ1.

Fig. 2 Model 2, a two-level regression model aimed at answering RQ1.
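
To illustrate the structure of Model 2 (this is not the authors’ Mplus specification), a roughly analogous two-level random-intercept regression for one of the outcomes could be sketched in Python with statsmodels. The column names (skill_prelab, mistake, experiment, student_id) and the file name are hypothetical, and the sketch omits MSEM-specific features such as estimating the three outcomes simultaneously with freely covarying residuals.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student x pre-lab activity, with hypothetical columns:
    # skill_prelab (1-5), mistake (0/1), experiment (categorical), student_id.
    df = pd.read_csv("prelab_responses.csv")  # hypothetical file

    # Two-level random-intercept model: pre-lab skill regressed on mistakes and
    # experiment dummies, with a random intercept for each student.
    model = smf.mixedlm("skill_prelab ~ mistake + C(experiment)",
                        data=df, groups=df["student_id"])
    result = model.fit(reml=True)
    print(result.summary())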

Model 3 (see Fig. 3) and Model 4 (see Fig. 4) were designed to address RQ2 and RQ3. They were regression-based three-level MSEMs resembling cross-lagged panel models, first investigating OLMs as a whole (Model 3) and then the elements of optimal learning separately (Model 4). In these models, we investigated the lagged effects of situational engagement and mistakes in pre-lab activities on students’ situational engagement during the entire subsequent laboratory session, rather than only at the first measurement point, as is typically done in cross-lagged panel models.


Fig. 3 Model 3, a three-level SEM aimed at answering RQ2 and RQ3.

Fig. 4 Model 4, a three-level SEM aimed at answering RQ2 and RQ3.

In more detail, Model 3 (see Fig. 3) examined the relationships between progress within the session and OLMs at the situational level and between progress across the course, experiment (i.e. laboratory session covariates), OLMs, and making mistakes during pre-lab activities, as well as OLMs during laboratory sessions at the lesson level. Also, means and variances of OLMs were considered at the individual level. Model 4 (see Fig. 4) was similar to the third one, but instead of OLMs, it included interest, skill, and challenge as the dependent variables, and the correlations between these were considered at the individual level.
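
To make the lesson-level part of Model 4 concrete, the regression for one outcome (interest during the lab session in lesson j for student k) can be sketched as follows; this is our schematic notation of the structure described above, not an equation taken from the model output, and analogous equations hold for skill and challenge with residuals allowed to covary:

    \text{Interest}^{\text{lab}}_{jk} = \beta_0 + \beta_1\,\text{Interest}^{\text{pre}}_{jk} + \beta_2\,\text{Skill}^{\text{pre}}_{jk} + \beta_3\,\text{Challenge}^{\text{pre}}_{jk} + \beta_4\,\text{Mistake}_{jk} + \beta_5\,\text{Progress}_{jk} + \textstyle\sum_m \gamma_m\,\text{Experiment}_{mjk} + u_k + \varepsilon_{jk}

where u_k is the individual-level random intercept, ε_jk is the lesson-level residual, and Progress_jk denotes progress across the course.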

The analyses were conducted using Mplus version 8.11 (Muthén and Muthén, 1998–2017). Maximum likelihood estimation with robust standard errors (MLR) was applied, and the missing data, which were missing completely at random (Little’s MCAR test, χ2 = 5.31, p = 0.505), were handled using the full information maximum likelihood (FIML) method. As none of the models included latent variables and the residuals were allowed to covary freely, the models were saturated, with zero degrees of freedom. This resulted in perfect, non-informative model fit indices. As such, model fit statistics (including RMSEA, CFI, TLI, and SRMR) are not reported for the models, as they do not provide meaningful information about model adequacy in this context.

Results

Descriptive statistics and correlations

Descriptive statistics of OLMs and elements of optimal learning (interest, skill, and challenge) in pre-lab activities and the laboratory are presented in Table 1. Optimal learning elements were related to one another on all three levels: situational, lesson, and individual. Correlations on each level are shown in Table 2.
Table 1 Descriptive statistics of the measured variables
  n Compliance rate (%) M S.D.
Interest in pre-lab activity 990 96.7 3.11 0.96
Skill in pre-lab activity 991 96.8 2.71 1.02
Challenge in pre-lab activity 989 96.6 3.43 0.95
OLM in pre-lab activity 984 96.1 0.06 0.24
Mistake in pre-lab activity 1006 98.2 0.639 0.48
Interest in lab 1611 48.5 3.62 0.88
Skill in lab 1613 48.5 3.51 1.02
Challenge in lab 1613 48.3 2.48 1.06
OLM in lab 1609 48.4 0.12 0.32


Table 2 Correlations (r) between interest, skill, and challenge during laboratory sessions at the situational, lesson, and individual levels

              Situational level       Lesson level          Individual level
              Interest    Skill       Interest    Skill     Interest    Skill
Skill         0.14***                 0.06**                0.14***
Challenge     −0.12***    −0.45***    −0.01       −0.01     −0.02       −0.06*

Note: * = p < 0.05, ** = p < 0.01, *** = p < 0.001.


RQ1: To what extent is making mistakes associated with students’ situational engagement in the pre-lab activities?

In Model 1 (see Fig. 1), experiencing OLMs was not associated with making mistakes in the pre-lab activities at a statistically significant level (β = −0.02, S.E. = 0.02, and p = 0.198). The relationships between making mistakes and the elements of optimal learning (perceived interest, skill, and challenge) were therefore examined further. In Model 2, making mistakes in the pre-lab activities was associated with students’ levels of skill and challenge (see Fig. 2). The results (see Table 3) imply that students experienced lower skill and higher challenge during the preparative learning exercises if they also made mistakes in these exercises. Based on the results, there were also some statistically significant differences in the probability of making mistakes between the pre-lab activities.
Table 3 Lesson-level effects of mistakes during pre-lab activities and the experiment on students’ interest, skill, and challenge during the tasks, analysed using MSEM
  Interest in pre-lab activities Skill in pre-lab activities Challenge in pre-lab activities Mistakes in pre-lab activities
β S.E. p β S.E. p β S.E. p β S.E. p
Note: presented estimate values (β), standard errors (S.E.) and p-values (p). Statistically significant results (p < 0.05) are highlighted in bold.
Mistakes in pre-lab activities −0.06 0.06 0.282 −0.19 0.06 0.003 0.12 0.06 0.042
 
Reference: pH-titration
Liquid–liquid extraction −0.51 0.08 <0.001 0.33 0.09 <0.001 0.08 0.08 0.302 0.27 0.04 <0.001
Buffer solutions 0.27 0.06 <0.001 −0.48 0.07 <0.001 0.48 0.07 <0.001 0.05 0.04 0.147
Complexometric titration 0.10 0.07 0.137 0.25 0.07 0.001 −0.47 0.07 <0.001 −0.01 0.04 0.732
 
Reference: liquid–liquid extraction
Buffer solutions 0.24 0.07 0.001 −0.81 0.09 <0.001 0.40 0.07 <0.001 −0.21 0.04 <0.001
Complexometric titration 0.61 0.08 <0.001 −0.08 0.09 0.417 −0.55 0.08 <0.001 −0.28 0.04 <0.001
 
Reference: buffer solutions
Complexometric titration 0.36 0.06 <0.001 0.73 0.07 <0.001 −0.95 0.07 <0.001 −0.07 0.04 0.117


RQ2: To what extent do students’ situational engagement and making mistakes in the pre-lab activities predict their situational engagement during the subsequent laboratory session?

According to the results of Model 3 (see Fig. 3), presented in Table 4, experiencing OLMs in the pre-lab activities positively predicts OLMs in the subsequent laboratory session (β = 0.12, S.E. = 0.05, and p = 0.028). Making mistakes in the pre-lab activities did not predict OLMs in the subsequent laboratory session at a statistically significant level.
Table 4 Lesson-level effects of situational engagement and mistakes during pre-lab activities, as well as the experiment on students’ situational engagement during the subsequent laboratory session, analysed using three-level SEM
  OLM in the subsequent laboratory session
β S.E. p
Note: presented estimate values (β), standard errors (S.E.) and p-values (p). Statistically significant results (p < 0.05) are highlighted in bold.
OLM in pre-lab activities 0.12 0.05 0.028
Mistakes in pre-lab activities 0.01 0.02 0.540
Progress across the course −0.10 0.07 0.160
 
Reference: pH-titration
Liquid–liquid extraction 0.01 0.03 0.984
Buffer solutions −0.01 0.03 0.839
Complexometric titration −0.00 0.03 0.968
 
Reference: liquid–liquid extraction
Buffer solutions −0.01 0.04 0.893
Complexometric titration 0.00 0.03 0.988
 
Reference: buffer solutions
Complexometric titration 0.00 0.02 0.839


For a more comprehensive analysis, the elements of engagement were then studied separately (see Fig. 4). The results (see Table 5) suggest that students’ interest, skill, and challenge during pre-lab activities play a significant role in predicting their situational engagement, encompassing interest, skill, and challenge, in the subsequent laboratory session. The results show that interest in pre-lab activities predicts higher interest in the lab, while feeling skilled at the pre-lab activities predicts not only higher skill but also higher interest in the lab. Similarly, perceived challenge in the pre-lab activities predicts higher challenge in the lab, while it also predicts higher interest.

Table 5 Lesson-level effects of pre-lab interest, skill, challenge, mistakes, and progress across the course, on students’ interest, skill and challenge during subsequent lab sessions, analysed using three-level SEM
  Interest in the subsequent laboratory session Skill in the subsequent laboratory session Challenge in the subsequent laboratory session
β S.E. p β S.E. p β S.E. p
Note: presented estimate values (β), standard errors (S.E.) and p-values (p). Statistically significant results (p < 0.05) are highlighted in bold.
Interest in pre-lab activities 0.20 0.04 <0.001 0.01 0.03 0.885 0.00 0.03 1.000
Skill in pre-lab activities 0.07 0.04 0.040 0.20 0.04 <0.001 −0.01 0.03 0.738
Challenge in pre-lab activities 0.08 0.04 0.041 0.03 0.04 0.373 0.10 0.04 0.010
Mistakes in pre-lab activities 0.03 0.05 0.642 0.04 0.06 0.500 −0.01 0.05 0.842
Progress across the course −0.00 0.14 0.980 0.37 0.15 0.015 −0.47 0.18 0.010
 
Reference: pH-titration
Liquid–liquid extraction 0.16 0.06 0.010 0.10 0.07 0.158 −0.37 0.08 <0.001
Buffer solutions −0.23 0.07 0.001 −0.17 0.08 0.033 0.06 0.07 0.366
Complexometric titration 0.09 0.07 0.169 0.13 0.07 0.050 −0.33 0.07 <0.001
 
Reference: Liquid–liquid extraction
Buffer solutions −0.38 0.07 <0.001 −0.26 0.08 0.001 0.42 0.09 <0.001
Complexometric titration −0.06 0.07 0.382 0.03 0.07 0.649 0.02 0.10 0.883
 
Reference: buffer solutions
Complexometric titration 0.32 0.06 <0.001 0.29 0.07 <0.001 −0.41 0.07 <0.001


Based on the results, students’ mistakes in the pre-lab activities did not seem to carry over to the laboratory session, as they did not predict any of the elements of OLMs, i.e. interest, skill, or challenge, in the subsequent laboratory session.

The model also implies that the topic of the laboratory experiment played a crucial role in predicting the levels of students’ interest, skill, and challenge. Students seemed to be the most interested during the pH-titration experiment, the most skilled during the complexometric titration experiment, and the most challenged by the buffer solution experiment.

RQ3: How does students’ situational engagement change within a laboratory session and across the laboratory course?

The results of Models 3 and 4 indicate that the progress within a laboratory session impacts students’ OLMs, interest, skill, and challenge. At the end of the laboratory session, students were more likely to experience OLMs (β = 0.06, S.E. = 0.03, and p = 0.036), and they felt less interested (β = −0.34, S.E. = 0.06, and p < 0.001) and skilled (β = −0.67, S.E. = 0.09, and p < 0.001), and significantly more challenged (β = 1.03, S.E. = 0.10, and p < 0.001). As the course progressed over time, students felt more skilled (β = 0.37, S.E. = 0.15, and p = 0.015) and less challenged (β = −0.47, S.E. = 0.18, p = 0.010).

Discussion

This study aimed to deepen the understanding of the role of pre-lab activities and their affective outcomes by investigating how students’ engagement and mistakes in pre-lab activities predict their engagement in the subsequent laboratory sessions. We will discuss our findings, interpreted within the framework of optimal learning (Schneider et al., 2016; Tang et al., 2025), addressing each research question in turn. We address some limitations of the study, and finally, we provide our insights into practical implications and directions for future research.

RQ1: To what extent is making mistakes associated with students’ situational engagement in the pre-lab activities?

In line with previous findings from the laboratory setting (cf. Kyynäräinen et al., 2024), students experienced lower skill and higher challenge in the pre-lab activity situations where they had made mistakes. However, here, making mistakes was not associated with lower interest at a significant level. Thus, we suggest that making mistakes in an online learning environment could lead to different kinds of affective responses compared to making mistakes in the lab. Additionally, based on the descriptive statistics, students were, on average, less interested in the pre-lab activities than they were in the lab. Perhaps because the level of interest in the pre-lab activities was lower in general, it might not be as exposed to external interference.

Additionally, in this setting, students did not know for certain whether they had made mistakes in the pre-lab tasks before responding to the EMA questionnaires, as they only saw their scores after completing the EMA questionnaire. Thus, students’ perceptions of their interest, skill, and challenge in the pre-lab activities were mostly based on their self-reflections of success (cf. Chu, 2017). Having received failure feedback, compared to being uncertain of one’s success, might also alter students’ emotional experiences (cf. Tulis and Ainley, 2011; Chu, 2017; Soncini et al., 2025; Tulis and Dresel, 2025), and therefore, students’ perceptions might still have shifted after filling in the EMA questionnaire. Previous research suggests that students’ adaptive error reactions and affective-motivational responses could be supported by digital learning environments giving encouraging feedback after mistakes (Narciss and Alemdag, 2025; Soncini et al., 2025; Tulis and Dresel, 2025), mitigating the plausible maladaptive reactions to receiving failure feedback.

In addition, the model results indicate that the experiment at hand has a significant impact on students’ levels of interest, skill, and challenge during the pre-lab activities, as well as on their probability of making mistakes, supporting the role of pre-lab activities in addressing the affective domain (cf. Agustian and Seery, 2017). It seems that students made significantly more mistakes in the preparative learning exercises concerning the pH-titration experiment. These exercises incorporated, for instance, pH calculations, which are often perceived as difficult (see e.g. Tümay, 2016; Sheppard, 2006). However, based on the data, students perceived the learning exercises concerning buffer solutions as even more difficult, yet this did not result in a significantly higher probability of making mistakes in those exercises. Therefore, a higher challenge level alone does not seem to determine the probability of erring; factors such as students’ misconceptions or the type of task could also have an impact (cf. Tulis, 2013; Käfer et al., 2019; Loibl and Leuders, 2019).

RQ2: To what extent do students’ situational engagement and making mistakes in the pre-lab activities predict their situational engagement during the subsequent laboratory session?

Even though mistakes in the pre-lab activities predicted students’ concurrent skill and challenge, they did not show any lagged effects on their situational engagement, i.e. OLMs, or its elements. This suggests that the affective impacts of mistakes are plausibly rather short in duration. Thus, upon entering the laboratory, students’ levels of interest, skill, and challenge appear unaffected, even though mistakes in the pre-labs might reflect gaps in the skills and knowledge required to successfully conduct the laboratory experiments. It is also possible that the impact of mistakes does not strongly transfer between different learning environments. Accordingly, the pre-lab mistakes made in an online learning environment might not play such an important role in predicting students’ engagement in the in-person learning environment of the laboratory.

In addition, factors like positive error beliefs, i.e. seeing errors as valuable for learning (Schmid et al., 2025; Tulis and Dresel, 2025), resilience, i.e. the ability to bounce back (Smith et al., 2008; McMillan and Moore, 2020), or mastery orientation, i.e. the drive to grow through challenge (Tulis and Ainley, 2011; Steuer et al., 2013; Tulis et al., 2018), could moderate these lagged effects of mistakes. Students’ reactions to mistakes could also be associated with their ability to regulate meta-affective learning (cf. Radoff et al., 2019; Sharabi and Roth, 2025). Students with high meta-affective skills (see Thomas et al., 2022) could be more likely to perceive mistakes in pre-lab activities as indications of excellent learning opportunities and look forward to the laboratory experiments to fill in their learning gaps.

Previous research also states that people tend to overestimate the duration of their emotional reactions to setbacks and challenges (Wilson and Gilbert, 2003). Therefore, if students were asked to anticipate the impact of previous mistakes on forthcoming experiences, they might forecast more intense and longer-lasting effects. Thus, it may be less surprising than it first seems that our results do not support long-term lagged effects of mistakes.

While mistakes did not demonstrate any lagged effects in the laboratory, students’ affective experiences did. This finding contributes to the previous understanding that pre-lab activities play a significant role in shaping students’ affective experiences (Cann, 2016; Galloway et al., 2016; Agustian and Seery, 2017), while it also aligns with the results of DeKorver and Towns (2015), suggesting that the affective learning domain is a significant driver of students’ learning experiences in the laboratory. Our empirical results reveal that engagement predicted engagement, interest predicted interest, skill predicted skill, and challenge predicted challenge – expectations based on previous experiences predicted enacted prospective experiences (cf. Wylie and Hodgen, 2012; Galloway and Bretz, 2015).

In addition to these consequent, autoregressive effects, both experiences of skill and challenge during pre-lab activities predicted higher interest in the subsequent laboratory session. The result indicates that situational interest in the laboratory could also be triggered by prior experiences of skill and challenge in the pre-lab activities (cf. Hidi and Renninger, 2006). This emphasizes the importance of pre-lab tasks that are challenging enough but still retain students’ experiences of competence, enabling a meaningful experience in the laboratory – having to exert oneself at an adequate level can promote affective learning (Schneider et al., 2016).

RQ3: How does students’ situational engagement change within a laboratory session and across the laboratory course?

In addition to the laboratory experiment itself (cf. Pontigon and Talanquer, 2025), the phase of work plays a key role in predicting students’ affective experiences. These laboratory experiments tended to begin with practical phases, where students performed, for example, a given procedure, while the sessions typically ended with interpreting data, calculating results, or drawing conclusions, which are phases that integrate theory and practice. Previous research indicates that students experience high situational engagement more often when using models and constructing explanations (Inkinen et al., 2020) and particularly when laboratory work requires them to combine theory and practice (Kyynäräinen et al., 2024).

Here, students were less interested, significantly less skilled and more challenged, but also more likely to be situationally engaged at the end of the laboratory experiment. This suggests that, although students’ levels of situational interest and perceived skill decreased, they remained high enough for them to experience OLMs. The result aligns with the previous understanding that too low a challenge level is typically the barrier to experiencing OLMs in the laboratory (cf. Inkinen et al., 2019; Atabek-Yigit and Senoz, 2023; Kyynäräinen et al., 2024). Therefore, an increase in the challenge level at the end of the laboratory session seems to contribute to experiencing a higher level of engagement.

Finally, the results reveal that at the end of the laboratory course, students perceived themselves as more skilled and the activities as less challenging. This suggests that students had improved their skills and knowledge during the laboratory course, plausibly perceiving themselves as more competent.

Limitations

The first limitation that should be noted is that, as the models were saturated, they cannot be falsified in the way conventional, over-identified structural equation models can. Therefore, the results should be interpreted with caution. Additionally, in the laboratory, students had plenty of other things to take into consideration besides the data collection, leading to another significant limitation of this study: the relatively low compliance rate. As approximately half of the data points from the laboratory were missing, this could impact the observed over-time effects. Nonetheless, the data were missing completely at random, and, for instance, general performance did not predict the missingness.

Another significant consideration is that the EMA questionnaire responses in the pre-lab activities do not perfectly align with the lab session EMA responses, as there was only one EMA questionnaire per pre-lab activity, in contrast to several questionnaires per lab session. Therefore, students may have considered the pre-lab more holistically when responding to the EMA questionnaire. However, as the laboratory EMAs were administered approximately one hour apart, students had completed several lab activities within that timeframe, presumably again reflecting on the preceding phase rather holistically.

Additionally, another factor that should be considered is that students’ experiences of interest, skill, and challenge in the laboratory might encompass very different things from those in the pre-lab tasks. We suggest that several factors, such as the task type, the online versus in-person learning environment, and working independently versus with one’s peers, might have an impact. Here, for example, students completed the pre-lab exercises, containing, for example, simulations and calculations, individually in an online learning environment, whereas the laboratory experiments, involving, for instance, measuring chemicals, performing chemical analyses, and calculating the results, were carried out in person in pairs. This does not affect the interpretation of our results, as we treat interest, skill, and challenge in the pre-lab activities and during the laboratory sessions as separate variables, but additional research on the impact of these factors could provide a deeper understanding.

We identified students’ mistakes in the pre-lab activities solely based on their automatically assessed scores. Therefore, some of the mistakes may have remained unobserved if the student received a perfect score regardless of their mistake, for example, by guessing. However, in these cases, students might not have realized their mistake before entering the lab, impacting their expectations for the work. We would like to acknowledge that the variable “mistakes in pre-lab activities” is not a perfect measure of students’ mistakes, and it is in many ways similar to students’ academic performance in the pre-lab tasks. Nevertheless, in this study, each pre-lab activity was scored with a maximum of 21 points, even though the number of subtasks in the pre-labs varied between 4 and 7. Accordingly, some subtasks carried more weight than others, and the maximum number of mistakes varied between pre-labs. Thus, in this context, we perceive the “mistakes in pre-lab activities” variable as more standardized and interpretable than, for instance, the pre-lab task score reflecting performance in the pre-lab activities. Finally, students performed the experiments in one of three alternative orders. On the one hand, this made it more difficult to follow students’ learning based on acquired skills or knowledge of chemical concepts. On the other hand, this randomization reduced noise in the over-time effects of progress across the course.

Implications for research

In this study, we utilized optimal learning theory to examine shifts in students’ engagement (Schneider et al., 2016). The framework provides opportunities to make interpretations about dynamic over-time effects and trajectories in engagement. Another key component of this study was making mistakes, which is still a relatively under-researched topic (cf. Steuer et al., 2025), particularly in chemistry education and specifically beyond students’ misconceptions. To obtain a more comprehensive understanding of the role of mistakes in chemistry education, we suggest that future research should also approach students’ mistakes and their linkage to affective experiences using various methodologies, for example, students’ self-reports, interviews or observation. This could also provide further information on, for instance, how students’ responses to careless errors might differ from their responses to more fundamental mistakes resulting from, for example, knowledge gaps. Additional research in the chemistry education field is needed to investigate how receiving failure feedback in pre-lab activities impacts students’ affect and how encouraging feedback could induce meta-affective learning in students (cf. Peterson et al., 2025; Tulis and Dresel, 2025).

Finally, future research is needed to examine which factors contribute to how students overcome their mistakes. We suggest that personal factors, such as resilience, meta-affective skills, error beliefs, and mastery orientation, as well as environmental factors, such as error climate, learning environment, and teacher support, may play key roles. Furthermore, we propose that students’ reactions to making mistakes can vary between digital environments and in-person contexts, and these differences in the dynamics of encountering mistakes across contexts should be investigated further (cf. Soncini et al., 2025).

Implications for practice

Our results (RQ1) indicate that students experience lower perceived skill and heightened challenge when they have made mistakes in pre-lab activities. However, no lagged effects of pre-lab mistakes on students’ situational engagement in the lab were observed, and we propose that students overcome their negative affective responses to making mistakes in the pre-lab activities rather quickly. Instructors should bear in mind that mistakes are an intrinsic part of learning chemistry and need not be avoided.

Instead, our results (RQ2) suggest that when students feel challenged by the pre-lab activities, they express higher interest in the experiment. Perhaps, if the pre-lab activities feel too easy, students sense that there is little new learning to look forward to, leading to decreased interest and engagement. Therefore, educators should pay attention to how challenging the pre-lab activities are, ensuring they include sufficiently difficult tasks to promote interest and engagement. However, feeling skilled in the pre-lab activities also predicted higher interest in the lab. Thus, fostering a sense of competence and confidence through pre-lab activities can likewise trigger students’ interest in the laboratory.

Based on the results (RQ3), students’ perceived skill increased while perceived challenge decreased over the laboratory course, indicating that students improved as the course progressed. This highlights the need to provide students with activities of an adequate challenge level so that they can operate within their zone of proximal development (Vygotsky, 1978). Thus, we propose that a steady increase in the challenge level of learning tasks may be called for to optimally support students’ continued growth during laboratory courses.

Conclusions

In conclusion, we aimed to investigate the relationship between students’ mistakes in pre-lab activities and their situational engagement (i.e. OLMs) in an undergraduate chemistry laboratory course. We studied how these factors predict students’ situational engagement in the subsequent laboratory session. To answer our research questions, we used the MSEM framework, particularly two-level structural equation models and longitudinal three-level structural equation models. The results suggest that mistakes in online pre-lab tasks were associated with lower levels of skill and higher challenge during the task, but they did not predict students’ interest, skill, challenge, or OLMs in the subsequent laboratory session. We observed significant autoregressive effects from pre-lab activities to students’ situational engagement during laboratory sessions across all measured elements, while skill and challenge during pre-lab activities also predicted higher interest in the subsequent laboratory session. Based on the results, we suggest that mistakes in the pre-lab activities trigger relatively short-term affective responses in students and that engagement in the laboratory could be supported by fostering a sense of competence and confidence in students and ensuring that the pre-lab activities are sufficiently challenging.
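To illustrate the kind of lagged relationship these models capture, the following is a minimal sketch of the within-student part of a two-level model for one engagement element; the notation is ours, deliberately simplified, and omits the latent OLM factors and additional paths of the full MSEMs:

$$y^{\text{lab}}_{ti} = \beta_{0i} + \beta_{1}\, y^{\text{pre}}_{ti} + \beta_{2}\, m_{ti} + \varepsilon_{ti}, \qquad \beta_{0i} = \gamma_{00} + u_{0i},$$

where $y^{\text{lab}}_{ti}$ and $y^{\text{pre}}_{ti}$ denote the element (e.g., interest) reported by student $i$ at measurement occasion $t$ during the laboratory session and the preceding pre-lab activity, respectively, $m_{ti}$ is the number of mistakes in that pre-lab, $u_{0i}$ is a student-level random intercept, and $\varepsilon_{ti}$ is a within-student residual. In this simplified notation, the pattern summarized above corresponds to a positive autoregressive path $\beta_{1}$ together with a $\beta_{2}$ that does not differ significantly from zero for the laboratory outcomes.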

Ethics statement

This study was carried out in accordance with the ethics requirements for research involving human subjects, following the ethical guidelines of the Finnish National Board on Research Integrity TENK. Study participation did not pose any risks and was not associated with high physical or emotional stress. All participants were informed that participation was entirely voluntary, as well as about the study objective, the protection of data privacy, and the no-risk character of the study. The participants were provided with contact information for any questions or problems.

Conflicts of interest

There are no conflicts of interest to declare.

Data availability

The data supporting the findings of this investigation are not publicly available due to data privacy laws protecting the personal information of the participants. However, anonymized data can be obtained from Veli-Matti Vesterinen upon reasonable request.

Acknowledgements

This research is funded by the Research Council of Finland (EDUCA Flagship #358924 and #358947) and the Ministry of Education and Culture (Doctoral school pilot #VN/3137/2024-OKM-4).

References

  1. Agustian H. Y., (2022), Considering the hexad of learning domains in the laboratory to address the overlooked aspects of chemistry education and fragmentary approach to assessment of student learning, Chem. Educ. Res. Pract., 23, 518–530 10.1039/D1RP00271F.
  2. Agustian H. Y., Gammelgaard B., Rangkuti M. A. and Niemann J., (2024), “I Feel Like a Real Chemist Right Now”: Epistemic Affect as a Fundamental Driver of Inquiry in the Chemistry Laboratory, Sci. Educ., 109, 722–744 DOI:10.1002/sce.21933.
  3. Agustian H. Y. and Seery M. K., (2017), Reasserting the role of pre-laboratory activities in chemistry education: a proposed framework for their design, Chem. Educ. Res. Pract., 18, 518–532 10.1039/C7RP00140A.
  4. Allchin D., (2012), Teaching the nature of science through scientific errors, Sci. Educ., 96, 904–926 DOI:10.1002/sce.21019.
  5. Ames C., (1992), Classrooms: Goals, structures, and student motivation, J. Educ. Psychol., 84(3), 261.
  6. Ames C. and Archer J., (1988), Achievement goals in the classroom: students' learning strategies and motivation processes, J. Educ. Psychol., 80(3), 260–267.
  7. Atabek-Yigit E. and Senoz A. B., (2023), Optimal learning moments in an undergraduate chemistry lab, Res. Sci. Technol. Educ., 43(1), 21–37 DOI:10.1080/02635143.2023.2224241.
  8. Cann A. J., (2016), Increasing Student Engagement with Practical Classes Through Online Pre-Lab Quizzes, J. Biol. Educ., 50, 101–112 DOI:10.1080/00219266.2014.986182.
  9. Chu M.-W., (2017), Using Computer Simulated Science Laboratories: A Test of Pre-Laboratory Activities with the Learning Error and Formative Feedback Model, [Doctoral dissertation, University of Alberta], Available at: https://ualberta.scholaris.ca/server/api/core/bitstreams/e93b3212-a446-4256-ba0a-19c3c79670c4/content.
  10. Chu M.-W. and Leighton J. P., (2019), Enhancing Digital Simulated Laboratory Assessments: A Test of Pre-Laboratory Activities with the Learning Error and Formative Feedback Model, J. Sci. Educ. Technol., 28(3), 251–264 DOI:10.1007/s10956-018-9763-z.
  11. Csikszentmihalyi M., (2000), Flow, in Kazdin A. E., (ed.), Encyclopedia of Psychology, vol. 3, Oxford: Oxford University Press, pp. 381–382.
  12. Csikszentmihalyi M. and Schneider B., (2000), Becoming adult, New York: Basic Books.
  13. de Anda D., Baroni S., Boskin L., Buchwald L., Morgan J., Ow J. and Weiss R., (2000), Stress, stressors, and coping among high school students, Child. Youth Services Rev., 22(6), 441–463.
  14. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92, 2031–2037 DOI:10.1021/acs.jchemed.5b00463.
  15. Eccles J. S. and Wigfield A., (2020), From expectancy-value theory to situated expectancy-value theory: a developmental, social cognitive, and sociocultural perspective on motivation, Contemp. Educ. Psychol., 61, 101859 DOI:10.1016/j.cedpsych.2020.101859.
  16. Fisher G. G., Matthews R. A. and Gibbons A. M., (2016), Developing and investigating the use of single-item measures in organizational research, J. Occup. Health Psychol., 21(1), 3–23 DOI:10.1037/a0039139.
  17. Galloway K. R. and Bretz S. L., (2015), Using cluster analysis to characterize meaningful learning in a first-year university chemistry laboratory course, Chem. Educ. Res. Pract., 16, 879–892 10.1039/C5RP00077G.
  18. Galloway K. R., Malakpa Z. and Bretz S. L., (2016), Investigating Affective Experiences in the Undergraduate Chemistry Laboratory: Students’ Perceptions of Control and Responsibility, J. Chem. Educ., 93, 227–238 DOI:10.1021/acs.jchemed.5b00737.
  19. Gogol K., Brunner M., Goetz T., Martin R., Ugen S., Keller U., Fischbach A. and Preckel F., (2014), “My Questionnaire is Too Long!” The assessments of motivational-affective constructs with three-item and single-item measures, Contemp. Educ. Psychol., 39, 188–205 DOI:10.1016/j.cedpsych.2014.04.002.
  20. Greene B. A., (2015), Measuring Cognitive Engagement With Self-Report Scales: Reflections From Over 20 Years of Research, Educ. Psychol., 50, 14–30 DOI:10.1080/00461520.2014.989230.
  21. Hattie J. and Timperley H., (2007), The power of feedback, Rev. Educ. Res., 77(1), 81–112.
  22. Heikkinen S., Saqr M., Malmberg J. and Tedre M., (2025), A longitudinal study of interplay between student engagement and self-regulation, Int. J. Educ. Technol. Higher Educ., 22(1), 21 DOI:10.1186/s41239-025-00523-3.
  23. Hektner J. M., Schmidt J. A. and Csikszentmihalyi M., (2007), Experience sampling method: Measuring the quality of everyday life, Thousand Oaks, CA: Sage Publications.
  24. Hidi S. and Renninger K. A., (2006), The Four-Phase Model of Interest Development, Educ. Psychol., 41, 111–127 DOI:10.1207/s15326985ep4102_4.
  25. Hoeppner B. B., Kelly J. F., Urbanoski K. A. and Slaymaker V., (2011), Comparative utility of a single-item versus multiple-item measure of self-efficacy in predicting relapse among young adults, J. Substance Abuse Treatment, 41, 305–312 DOI:10.1016/j.jsat.2011.04.005.
  26. Inkinen J., Klager C., Juuti K., Schneider B., Salmela-Aro K., Krajcik J. and Lavonen J., (2020), High school students’ situational engagement associated with scientific practices in designed science learning situations, Sci. Educ., 104, 667–692 DOI:10.1002/sce.21570.
  27. Inkinen J., Klager C., Schneider B., Juuti K., Krajick J., Lavonen J. and Salmela-Aro K., (2019), Science classroom activities and student situational engagement, Int. J. Sci. Educ., 41, 316–329 DOI:10.1080/09500693.2018.1549372.
  28. Irimata K. M. and Wilson J. R., (2018), Identifying intraclass correlations necessitating hierarchical modeling, J. Appl. Stat., 45, 626–641 DOI:10.1080/02664763.2017.1288203.
  29. Käfer J., Kuger S., Klieme E. and Kunter M., (2019), The significance of dealing with mistakes for student achievement and motivation: results of doubly latent multilevel analyses, Eur. J. Psychol. Educ., 34, 731–753 DOI:10.1007/s10212-018-0408-7.
  30. Keith N. and Frese M., (2008), Effectiveness of error management training: a meta-analysis, J. Appl. Psychol., 93, 59–69 DOI:10.1037/0021-9010.93.1.59.
  31. Kirschner P. A., (1992), Epistemology, practical work and Academic skills in science education, Sci. Educ., 1, 273–299 DOI:10.1007/BF00430277.
  32. Kirschner P., Meester M., Middelbeek E. and Hermans H., (1993), Agreement between student expectations, experiences, and actual objectives of practicals in the natural sciences at the Open University of The Netherlands, Int. J. Sci. Educ., 15(2), 175–197.
  33. Kitterød R. H. and Lyngstad T. H., (2005), Diary versus questionnaire information on time spent on housework – The case of Norway, eIJTUR, 2, 13–32 DOI:10.13085/eIJTUR.2.1.13-32.
  34. Krapp A. and Prenzel M., (2011), Research on Interest in Science: theories, methods, and findings, Int. J. Sci. Educ., 33, 27–50 DOI:10.1080/09500693.2010.518645.
  35. Kyynäräinen R., Vilhunen E. and Vesterinen V.-M., (2024), How making mistakes shapes students’ situational engagement in chemistry laboratory? Int. J. Sci. Educ. DOI:10.1080/09500693.2024.2439142.
  36. Lee W. S., (2020), An Experimental Investigation Into the Application of a Learning-From-Mistakes Approach Among Freshmen Students, Sage Open, 10, 2158244020931938 DOI:10.1177/2158244020931938.
  37. Lee W., Lee M.-J. and Bong M., (2014), Testing interest and self-efficacy as predictors of academic self-regulation and achievement, Contemp. Educ. Psychol., 39(2), 86–99 DOI:10.1016/j.cedpsych.2014.02.002.
  38. Leighton J. P., Tang W. and Guo Q., (2018), Undergraduate students’ attitudes towards mistakes in learning and academic achievement, Assess. Eval. Higher Educ., 43(4), 612–628 DOI:10.1080/02602938.2017.1387230.
  39. Loibl K. and Leuders T., (2019), How to make failure productive: fostering learning from errors through elaboration prompts, Learn. Instruct., 62, 1–10 DOI:10.1016/j.learninstruc.2019.03.002.
  40. McMillan J. H. and Moore S., (2020), Better Being Wrong (Sometimes): Classroom Assessment that Enhances Student Learning and Motivation, Clearing House, 93, 85–92 DOI:10.1080/00098655.2020.1721414.
  41. Moneta G. B., Schneider B. and Csikszentmihalyi M., (2014), A Longitudinal Study of the Self-Concepts and Experiential Components of Self-Worth and Affect Across Adolescence, Applications of Flow in Human Development and Education, Dordrecht: Springer DOI:10.1007/978-94-017-9094-9_21.
  42. Narciss S. and Alemdag E., (2025), Learning from errors and failure in educational contexts: new insights and future directions for research and practice, Brit. J. Educ. Psychol., 95, 197–218 DOI:10.1111/bjep.12716.
  43. Peterson E. R., Sharma T., Bird A., Henderson A. M. E., Ramgopal V., Reese E. and Morton S. M. B., (2025), How mothers talk to their children about failure, mistakes and setbacks is related to their children's fear of failure, Brit. J. Educ. Psychol., 95, 124–142 DOI:10.1111/bjep.12685.
  44. Pilin M. A., (2021), The past of predicting the future: a review of the multidisciplinary history of affective forecasting, History Human Sci., 34(3–4), 290–306 DOI:10.1177/0952695120976330.
  45. Pontigon D. and Talanquer V., (2025), Examining student engagement in the organic chemistry laboratory, Chem. Educ. Res. Pract., 26, 780–793 10.1039/D5RP00063G.
  46. Radoff J., Jaber L. Z. and Hammer D., (2019), “It's Scary but It's Also Exciting”: Evidence of Meta-Affective Learning in Science, Cognit. Instruct., 37, 73–92 DOI:10.1080/07370008.2018.1539737.
  47. Rayment S. J., Evans J., Moss K., Coffey M., Kirk S. H. and Sivasubramaniam S. D., (2023), Using lessons from a comparative study of chemistry & bioscience pre-lab activities to design effective pre-lab interventions: a case study, J. Biol. Educ., 57(5), 1092–1111 DOI:10.1080/00219266.2021.2011771.
  48. Safadi R. and Yerushalmi E., (2014), Problem solving vs. troubleshooting tasks: the case of sixth-grade students studying simple electric circuits, Int. J. Sci. Math. Educ., 12(6), 1341–1366 DOI:10.1007/s10763-013-9461-5.
  49. Salmela-Aro K., Upadyaya K., Cumsille P., Lavonen J., Avalos B. and Eccles J., (2021), Momentary TASK-VALUES and expectations predict engagement in science among Finnish and Chilean secondary school students, Int. J. Psychol., 56, 415–424 DOI:10.1002/ijop.12719.
  50. Schmid R., Robin N., Smit R. and Strahl A., (2022), The Influence of Error Learning Orientation on Intrinsic Motivation for Visual Programming in STEM Education, Eur. J. STEM Educ., 7, 05 DOI:10.20897/ejsteme/12477.
  51. Schmid R., Smit R., Robin N. and Strahl A., (2025), The role of momentary emotions in promoting error learning orientation among lower secondary school students: an intervention study embedded in a short visual programming course, Brit. J. Educ. Psychol., 95(1), 107–123 DOI:10.1111/bjep.12681.
  52. Schmidt J. A., (2010), Flow in Education, International Encyclopedia of Education, Elsevier, pp. 605–611 DOI:10.1016/B978-0-08-044894-7.00608-4.
  53. Schmidt J. A., Rosenberg J. M. and Beymer P. N., (2018), A person-in-context approach to student engagement in science: examining learning activities and choice, J. Res. Sci. Teach., 55, 19–43 DOI:10.1002/tea.21409.
  54. Schneider B., Krajcik J., Lavonen J., Salmela-Aro K., Broda M., Spicer J., Bruner J., Moeller J., Linnansaari J., Juuti K. and Viljaranta J., (2016), Investigating optimal learning moments in U.S. and Finnish science classes, J. Res. Sci. Teach., 53, 400–421 DOI:10.1002/tea.21306.
  55. Schraw G. and Lehman S., (2001), Situational interest: a review of the literature and directions for future research, Educ. Psychol. Rev., 13 (1), 23–52 DOI:10.1023/A:1009004801455.
  56. Seery M. K., (2020), Establishing the Laboratory as the Place to Learn How to Do Chemistry, J. Chem. Educ., 97, 1511–1514 DOI:10.1021/acs.jchemed.9b00764.
  57. Seery M. K., Agustian H. Y., Christiansen F. V., Gammelgaard B. and Malm R. H., (2024), 10 Guiding principles for learning in the laboratory, Chem. Educ. Res. Pract., 25, 383–402 10.1039/D3RP00245D.
  58. Sharabi Y. and Roth G., (2025), Emotion regulation styles and the tendency to learn from academic failures, Brit. J. Educ. Psychol., 95, 162–179 DOI:10.1111/bjep.12696.
  59. Sheppard K., (2006), High school students’ understanding of titrations and related acid-base phenomena, Chem. Educ. Res. Pract., 7, 32–45 10.1039/B5RP90014J.
  60. Shernoff D., Csikszentmihalyi M., Schneider B. and Shernoff E., (2003), Student engagement in high school classrooms from the perspective of flow theory, Sch. Psychol. Q., 18(2), 158–176.
  61. Sinatra G. M., Heddy B. C. and Lombardi D., (2015), The Challenges of Defining and Measuring Student Engagement in Science, Educ. Psychol., 50, 1–13 DOI:10.1080/00461520.2014.1002924.
  62. Smith B. W., Dalen J., Wiggins K., Tooley E., Christopher P. and Bernard J., (2008), The brief resilience scale: assessing the ability to bounce back, Int. J. Behav. Med., 15, 194–200 DOI:10.1080/10705500802222972.
  63. Soderstrom N. C. and Bjork R. A., (2015), Learning Versus Performance: An Integrative Review, Perspect. Psychol. Sci., 10(2), 176–199 DOI:10.1177/1745691615569000.
  64. Soncini A., Matteucci M. C., Tomasetto C. and Butera F., (2025), Supportive error feedback fosters students’ adaptive reactions towards errors: Evidence from a targeted online intervention with Italian middle school students, Brit. J. Educ. Psychol., 95, 92–106 DOI:10.1111/bjep.12679.
  65. Soncini A., Visintin E. P., Matteucci M. C., Tomasetto C. and Butera F., (2022), Positive error climate promotes learning outcomes through students’ adaptive reactions towards errors, Learn. Instruct., 80, 101627 DOI:10.1016/j.learninstruc.2022.101627.
  66. Spagnoli D., Wong L., Maisey S. and Clemons T. D., (2017), Prepare, Do, Review: a model used to reduce the negative feelings towards laboratory classes in an introductory chemistry undergraduate unit, Chem. Educ. Res. Pract., 18(1), 26–44.
  67. Spear J., Tulis M. and Dresel M., (2024), Knowledge effects of action-related reactions to errors, Educ. Psychol., 44(9–10), 1092–1105 DOI:10.1080/01443410.2024.2426549.
  68. Steuer G., Rosentritt-Brunn G. and Dresel M., (2013), Dealing with errors in mathematics classrooms: structure and relevance of perceived error climate, Contemp. Educ. Psychol., 38, 196–210 DOI:10.1016/j.cedpsych.2013.03.002.
  69. Steuer G., Tulis M. and Peterson E. R., (2025), Learning from errors and failure in educational contexts, Brit. J. Educ. Psychol., 95(1), 1–10 DOI:10.1111/bjep.12723.
  70. Tang X., Chen I.-C., Lavonen J., Schneider B., Krajcik J. and Salmela-Aro K., (2025), Optimal Learning Moments in Finnish and US Science Classrooms: A Psychological Network Analysis Approach, FLR, 13, 10–26 DOI:10.14786/flr.v13i2.1313.
  71. Thomas A. K., Wulff A. N., Landinez D. and Bulevich J. B., (2022), Thinking about thinking about thinking … & feeling: A model for metacognitive and meta-affective processes in task engagement, WIRES Cognit. Sci., 13, e1618 DOI:10.1002/wcs.1618.
  72. Tulis M., (2013), Error management behavior in classrooms: Teachers’ responses to student mistakes, Teach. Teacher Educ., 33, 56–68 DOI:10.1016/j.tate.2013.02.003.
  73. Tulis M. and Ainley M., (2011), Interest, enjoyment and pride after failure experiences? Predictors of students’ state-emotions after success and failure during learning in mathematics, Educ. Psychol., 31, 779–807 DOI:10.1080/01443410.2011.608524.
  74. Tulis M. and Dresel M., (2025), Effects on and consequences of responses to errors: Results from two experimental studies, Brit. J. Educ. Psychol., 95, 143–161 DOI:10.1111/bjep.12686.
  75. Tulis M. and Fulmer S. M., (2013), Students’ motivational and emotional experiences and their relationship to persistence during academic challenge in mathematics and reading, Learn. Ind. Diff., 27, 35–46 DOI:10.1016/j.lindif.2013.06.003.
  76. Tulis M., Steuer G. and Dresel M., (2018), Positive beliefs about errors as an important element of adaptive individual dealing with errors during academic learning, Educ. Psychol., 38, 139–158 DOI:10.1080/01443410.2017.1384536.
  77. Tümay H., (2016), Emergence, Learning Difficulties, and Misconceptions in Chemistry Undergraduate Students’ Conceptualizations of Acid Strength, Sci. Educ., 25, 21–46 DOI:10.1007/s11191-015-9799-x.
  78. Tuominen-Soini H. and Salmela-Aro K., (2014), Schoolwork engagement and burnout among Finnish high school students and young adults: profiles, progressions, and educational outcomes, Dev. Psychol., 50, 649–662 DOI:10.1037/a0033898.
  79. Upadyaya K., Cumsille P., Avalos B., Araneda S., Lavonen J. and Salmela-Aro K., (2021), Patterns of situational engagement and task values in science lessons, J. Educ. Res., 114(4), 394–403 DOI:10.1080/00220671.2021.1955651.
  80. Van Merrienboer J. J. G., Kirschner P. A. and Kester L., (2003), Taking the Load Off a Learner's Mind: Instructional Design for Complex Learning, Educ. Psychol., 38(1), 5–13 DOI:10.1207/S15326985EP3801_2.
  81. Vilhunen E., Chiu M.-H., Salmela-Aro K., Lavonen J. and Juuti K., (2023), Epistemic Emotions and Observations Are Intertwined in Scientific Sensemaking: A Study among Upper Secondary Physics Students, Int. J. Sci. Math. Educ., 21, 1545–1566 DOI:10.1007/s10763-022-10310-5.
  82. Vilhunen E., Vesterinen V.-M., Äijälä M., Salovaara J., Siponen J., Lavonen J., Salmela-Aro K. and Riuttanen L., (2025), Promoting university students’ situational engagement in online learning for climate education, Int. Higher Educ., 65, 100987 DOI:10.1016/j.iheduc.2024.100987.
  83. Vygotsky L. S., (1978), Mind in society: The development of higher psychological processes, Cambridge, MA: Harvard University Press DOI:10.2307/j.ctvjf9vz4.
  84. Wilson T. D. and Gilbert D. T., (2003), Affective forecasting, Adv. Exp. Soc. Psychol., 35, 345–411 DOI:10.1016/S0065-2601(03)01006-2.
  85. Wylie C. and Hodgen E., (2012), Trajectories and patterns of student engagement: Evidence from a longitudinal study, Handbook of Research on Student Engagement, Springer, pp. 585–599 DOI:10.1007/978-1-4614-2018-7_28.

This journal is © The Royal Society of Chemistry 2026