Note-taking moderates the relationship between invested mental effort and solving chirality tasks

Katrin Schuessler *a, Michael Giese b and Maik Walpuski a
aUniversity of Duisburg-Essen – Chemistry Education, Schützenbahn 70, Essen 45127, Germany. E-mail: katrin.schuessler@uni-due.de
bUniversity of Duisburg-Essen – Institute of Organic Chemistry, Universitätsstr. 7, Essen 45141, Germany

Received 11th July 2025, Accepted 25th September 2025

First published on 15th October 2025


Abstract

When visual representations of molecules (e.g., skeletal formulas) must be decoded to process a task (e.g., determining the absolute configuration of a molecule) and the corresponding schemas are not yet sufficiently automated, paper–pencil format notes may help select relevant information, organize it appropriately, and integrate knowledge without exceeding the working memory capacity (encoding and external storage). This article examines the extent to which task difficulty and invested mental effort differ for digital and paper–pencil-based tasks on the topic of chirality (RQ1) and the extent to which note-taking impacts students’ working memory load when working on paper–pencil-based chirality tasks (RQ2). The dataset is based on the responses of 80 students from Germany who completed 19 chirality task tandems (each consisting of one digital and one paper–pencil-based task) and rated their invested mental effort for each task. Item response theory analyses, group comparisons, and moderation analyses were conducted. Paper–pencil-based chirality tasks were found to be significantly easier than digital chirality tasks, and students invested significantly less mental effort in completing the paper–pencil-based chirality tasks (RQ1). Students who took notes in the paper–pencil format were found to be more capable of solving chirality tasks in both formats. Both groups invested a comparable amount of mental effort. A moderation analysis revealed that when note-taking was low, the relationship between invested mental effort and the probability of solving the task was strongest. For the note-takers, the relationship between the invested mental effort and the probability of solving the task decreased as the number of notes increased (RQ2). The results indicate that notes as external storage are relevant for processing tasks that require handling representations. 
As the digital format does not offer comparable options for taking notes, notes represent a subject-specific format difference.


Introduction

The role of representations in organic chemistry

Visual representations are fundamental to communication in organic chemistry (Kozma et al., 2000; Goodwin, 2010). Therefore, to participate in subject-internal communication, students must learn to read and interpret the representations used (Fig. 1).
Fig. 1 Communication through representations in organic chemistry.

Perceptual fluency (for an overview, see Kellman and Garrigan, 2009) enables experts to extract relevant information from representations unconsciously and without cognitive effort. Accordingly, experts can derive all the necessary information from visual representations of molecules and convert it into other visual (Rau, 2018) or verbal (Kozma and Russell, 1997) representations. Kozma and Russell (1997) defined the ability to recognize “different surface features as all representing the same principle, concept, or chemical situation” and transform them into another form of representation as representational competence. Expert-novice comparisons show that experts are significantly more proficient in recognizing commonalities regardless of representation and are more proficient in translating representations into text than novices (Kozma and Russell, 1997). Novices struggle to translate between verbal and visual representations (Kozma and Russell, 1997; Cooper et al., 2010; Bodé et al., 2016; Rau, 2018), as they struggle to comprehend visual representations, assign meaning to them (Anzovino and Lowery Bretz, 2016; Graulich and Bhattacharyya, 2017; Asmussen et al., 2023; Dood and Watts, 2023), and derive information from them (Asmussen et al., 2023). Students unable to assign meaning to representations struggle with organic chemistry (Anzovino and Lowery Bretz, 2016; Graulich and Bhattacharyya, 2017), usually rely only on memorization (Grove and Lowery Bretz, 2012), and fail with this strategy (Anderson and Bodner, 2008).

The close relationship between content knowledge and representational competence is problematic for learning: students must work with representations to learn concepts but require concepts to interpret representations (representational dilemma; Rau, 2017). Accordingly, learners with higher content knowledge have higher representational competencies (Sim and Daniel, 2014; Rho et al., 2022) and representational competence mediates between prior and content knowledge in organic chemistry (Dickmann et al., 2019).

For organic chemistry, the International Union of Pure and Applied Chemistry (IUPAC) requires the use of the skeletal formula (Fig. 2) as the standard form of representation (Brecher, 2008). Thus, organic chemistry textbooks (e.g., Clayden et al., 2001; Grossman, 2020) and molecular editors (e.g., ChemDraw) use the skeletal formula. The skeletal formula represents molecules in reduced form: carbon atoms and the hydrogen atoms attached to them are not drawn explicitly but are implied by the C–C bonds. For students, the degree of abstraction of the skeletal formula is a challenge to be mastered as part of their professionalization (Dood and Watts, 2023).


Fig. 2 Butanol molecule depicted using structural and skeletal formulas.

No explicit introduction of the skeletal formula at the university level is currently planned in Germany (Society of German Chemists, Gesellschaft Deutscher Chemiker, GDCh, 2021), and organic chemistry textbooks offer insufficient opportunities for practice (Gurung et al., 2022). However, working with representations of individual molecules is a crucial basis for more complex tasks in further learning (Stowe and Esselman, 2023). Without explicit promotion of representational competence, the duration and course level of chemistry lessons at school remain significant predictors of representational competence after several semesters of study (Taskin et al., 2017).

In summary, successful handling of representations such as the skeletal formula is crucial for processing tasks in organic chemistry. Students must learn to read skeletal formulas to consider implicit information, identify relevant structures, and relate them to scientific concepts. Provided these processes are insufficiently automated, they represent a challenge for working memory load.

Cognitive load and note-taking

Intrinsic cognitive load arises from the amount of information to be processed and the relationships between information elements (element interactivity, Sweller, 2010). Knowledge is organized in schemas that load the working memory as only one element (Sweller, 1988; Sweller et al., 1998; Paas et al., 2003a,b, 2004; van Merriënboer and Sweller, 2005; Kalyuga, 2014; Sweller et al., 2019). Novices must construct these schemas and automate them through sufficient practice. Without appropriate schemas, working memory is burdened by high element interactivity when processing complex tasks (Leahy and Sweller, 2004, 2005, 2008; Sweller, 2010; Leahy and Sweller, 2016).

Study results on cognitive load when working with molecules of different sizes show a correlation between the level of cognitive load and the degree of expertise when drawing comparatively small molecules (Tiettmeyer et al., 2017). The probability of solving tasks decreases with increasing element interactivity: while 80% of molecules with six or fewer atoms are correctly translated from the condensed structural formula into a Lewis structure, this applies to only 30% of molecules with seven or more atoms (Cooper et al., 2010). This indicates cognitive overload due to a lack of sufficient schema automation (Sweller, 1988).

In summary, for novices, working with relatively small molecules is already associated with a high cognitive load because many elements and their relationships with each other (element interactivity) must be processed.

When schema automation is insufficient and element interactivity is high, markings and notes can help reduce the load on working memory by noting partial solutions (rather than holding them in working memory). Notes thus help in selecting, organizing, and integrating relevant information (Mayer, 2012, 2014). A note-taking review describes note-taking when reading texts, listening to lectures, or watching videos as a five-step process (Jansen et al., 2017): (1) the material presented must be understood, (2) key points or aspects must be identified, (3) the material must be related to prior knowledge and previous notes, (4) new information must be paraphrased or summarized, and (5) the result must be transformed into a written form. Therefore, taking notes is more cognitively demanding than merely listening; however, if sufficient cognitive capacity is available, it can lead to better learning outcomes because information is processed more deeply (encoding effect, Di Vesta and Gray, 1972). A second function of note-taking is known as the external storage effect (Rickards and Friedman, 1978; Kiewra et al., 1991). Notes support learning by presenting selected relevant information that can be studied. If visual representations of molecules must be decoded to process a task and the corresponding schemas are not yet sufficiently automated, notes may help in encoding and providing external storage to select relevant information, organize it appropriately, and integrate knowledge without exceeding the working memory capacity.

While notes in paper format can be used easily and flexibly, digital molecular editors offer significantly fewer and less flexible options for working with notes (Schuessler et al., 2024a,b). Therefore, the format (paper–pencil-based or digital) is a relevant factor when investigating the role of notes in processing tasks in chemistry.

Research questions

Derivation of the research questions

When asked to determine the absolute configuration of a molecule, an expert with sufficient fluency immediately “sees” where chirality centres are located, their configuration, and which substituents must be prioritized when assigning position numbers. Accordingly, rather than many interacting elements, only a few fully automated schemas are processed and working memory is not loaded. Conversely, to solve the same task, a novice will first check whether each carbon atom has four different bonding partners and thus represents a chirality centre (Fig. 3a). The working memory can be relieved by adding hydrogen atoms that are not explicitly represented (Fig. 3b) and marking identified chirality centres (Fig. 3c). While determining the priority of the bonding partners according to the Cahn–Ingold–Prelog (CIP) rules, the priorities can be noted to relieve the working memory (Fig. 3d). Moreover, when determining the configuration, the direction of rotation can be drawn as shown in Fig. 3e and the configuration (R or S) can be noted (Fig. 3f) to relieve the working memory. When determining the absolute configuration, the carbon atoms of the molecule can be numbered according to the IUPAC rules to relieve the working memory (Fig. 3g). Without notes, the element interactivity would probably exceed the working memory capacity of novices, and the task would not be solved successfully.
Fig. 3 Step-by-step determination of the absolute configuration of a molecule based on notes; (a–g) show the individual solution steps.

Hence, if the absolute configuration of a molecule represented by the skeletal formula must be determined, this representation must be used to (1) extract contained information (e.g., non-depicted hydrogen atoms, Fig. 3b), (2) identify relevant structures (e.g., carbon atoms with four different substituents, Fig. 3c), (3) use prior knowledge (e.g., CIP rules to determine the priority of substituents, Fig. 3d), and (4) integrate information to generate partial solutions (e.g., the absolute configuration of a chirality centre, Fig. 3e). These partial solutions can (5) be noted to save them temporarily (e.g., note “S” after determining the absolute configuration of the first chirality centre, Fig. 3f, to free up working memory capacity for determining the configuration of the second chirality centre). In the first four steps, notes function more in the sense of encoding; in the fifth step, the external storage function predominates.
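The first of these steps, checking whether a carbon atom carries four different bonding partners, can be illustrated with a deliberately simplified sketch. This is not part of the study materials: substituents are reduced to plain strings, so the code captures only the counting logic, not a real CIP substructure comparison.

```python
# Toy sketch (illustration only): a carbon atom is a candidate chirality
# centre if its four bonding partners are pairwise distinct. Substituents
# are simplified to strings; real CIP priority assignment would require
# traversing the full substituent substructures.

def is_chirality_centre(substituents):
    """Return True if the carbon has exactly four pairwise distinct substituents."""
    if len(substituents) != 4:
        return False          # a stereocentre candidate needs four bonding partners
    return len(set(substituents)) == 4

# C2 of butan-2-ol: -OH, -CH3, -C2H5 and the implicit H (cf. Fig. 3b)
print(is_chirality_centre(["OH", "CH3", "C2H5", "H"]))   # True
# C1 of butan-2-ol carries two identical H substituents
print(is_chirality_centre(["H", "H", "CH3", "OH"]))      # False
```

In the same spirit, the noted partial solutions (Fig. 3d–f) would correspond to storing intermediate results in variables instead of "working memory".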

Traditionally, research on note-taking concerns learning from lectures, texts, or videos (Rickards and Friedman, 1978; Kiewra et al., 1991; Jansen et al., 2017). The role of note-taking in working with chemical representations appears to have been minimally studied. Recent studies in chemistry investigating the handling of molecule representations have instead focused on the advantages and disadvantages of two- and three-dimensional representations (Habig, 2020; Keller et al., 2021). As molecular editors do not offer note-taking options comparable to the paper format, digital tasks must either be solved without notes or with an analogue scribble sheet (which can be inconvenient because the molecules must first be drawn). Thus, the format (digital or paper–pencil-based) could be relevant when examining the role of notes in working with chemical representations.

Research questions

Therefore, this article examines the following research questions (RQs):

To what extent do task difficulty and invested mental effort differ for digital and paper–pencil-based tasks on the topic of chirality? (RQ1)

To what extent does note-taking impact students’ working memory load when working on paper–pencil-based chirality tasks? (RQ2)

Based on the literature, it is expected that digital tasks are solved correctly less often than paper–pencil-based tasks (H1.1). The mental effort invested may be higher for digital tasks because more information must be held in working memory (external storage, H1.2a) or lower because less deep processing occurs without taking notes (encoding, H1.2b). Two hypotheses can also be formulated regarding RQ2: students who take notes report a lower cognitive load because they relieve their working memory (external storage, H2.1a). Alternatively, students who take notes may also report a higher cognitive load because they process information more intensively (encoding, H2.1b).

Methods

Samples

The dataset is based on the responses of 80 students who completed chirality tasks and invested mental effort ratings in three consecutive semesters, about one-third of the way through each course: 19 students in the summer semester of 2023 (BSc chemistry and water sciences; course taken in the 2nd of 6 standard semesters), 21 students in the winter semester of 2023/24 (chemistry teacher training; course taken in the 3rd of 6 standard semesters), and 40 students in the summer semester of 2024 (BSc chemistry and water sciences; course taken in the 2nd of 6 standard semesters).

Three additional students from the winter semester of 2023/24 participated in the data collection. As they only completed part of the survey, they were excluded from the data analysis.

The data collection occurred as part of the organic chemistry exercise the students attended. Students in all three degree programmes attended an introduction to organic chemistry in which they learned about the structure of simple organic compounds, functional groups, and the associated basic reaction types (radical substitution, nucleophilic substitution, elimination, and electrophilic addition). The exercise was supervised by two lecturers (Authors 1 and 2). Author 1 is an expert in chemistry education, and Author 2 is an expert in organic chemistry. The exercise accompanied an organic chemistry lecture. As part of the exercise, students worked on digital exercises at home every week. A molecular editor (Kekule.js) was implemented in the e-learning and e-assessment tool JACK to enable the digital implementation of organic chemistry tasks (Schuessler et al., 2024b), such as the chirality tasks, enabling molecules to be drawn using the skeletal formula. The system translated each student drawing into an InChI code (Heller et al., 2015) that uniquely and stereospecifically described the molecule. InChI codes generated from student input were compared with the stored sample solution, enabling automatic evaluation. The system can also evaluate multiple-choice tasks.
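The automatic evaluation step can be sketched as follows. This is a minimal illustration, not the actual JACK implementation: the InChI strings are written out by hand here and are assumed to have been generated by the molecular editor from the drawings.

```python
# Sketch of the automatic grading step: the InChI code generated from a
# student's drawing is compared with the stored sample solution. Because
# InChI is a canonical identifier, string equality also distinguishes
# stereoisomers (they differ in the /t, /m, /s stereo layers).

def grade_drawing(student_inchi: str, solution_inchi: str) -> int:
    """Code 1 for a correct drawing, 0 otherwise (as in the coding scheme)."""
    return int(student_inchi.strip() == solution_inchi.strip())

# Illustrative InChI strings for the two butan-2-ol enantiomers; they
# differ only in the /m stereo sublayer.
inchi_a = "InChI=1S/C4H10O/c1-3-4(2)5/h4-5H,3H2,1-2H3/t4-/m1/s1"
inchi_b = "InChI=1S/C4H10O/c1-3-4(2)5/h4-5H,3H2,1-2H3/t4-/m0/s1"

print(grade_drawing(inchi_a, inchi_a))  # 1: same molecule
print(grade_drawing(inchi_b, inchi_a))  # 0: wrong enantiomer
```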

In class, exercise sessions were used to work together on further tasks and to discuss questions and problems from the lecture or the digital exercises. The students completing the exercises of this study were given feedback on their current performance level (regardless of whether they had agreed to the data analysis for scientific purposes). In the summer semester of 2024, students could earn bonus points, which improved their grade if they passed the exam, by actively participating on a specific date in the exercise in which this study was conducted (consent to use the data for scientific purposes was independent of the awarding of bonus points). The introduction of bonus points had an impact on the number of students who participated in the exercise.

It was not (individually) checked whether and how often the students had practiced with the digital tasks as part of the exercise. The only condition for completing the digital chirality exercises was that an introductory set of digital exercises from the first exercise session had been completed.

Design and procedure

The study pursues a media comparison approach (Clark and Feldon, 2014; Buchner and Kerres, 2023) and uses a within-subject design for this purpose. Accordingly, all students completed the digital and paper–pencil-based chirality tasks and assessed the mental effort invested in each of these tasks. The students’ informed consent to participate in scientific surveys and data analysis was obtained at the beginning of the semester. All students were asked in advance to bring their own digital devices for digital task processing. If this was not possible, students were provided with an iPad from the faculty. Brief information on the procedure and purpose of the study was provided before collecting the data. In the summer semester of 2023 and the winter semester of 2023/24, the students first completed the digital tasks and the associated invested mental effort ratings before completing the paper–pencil-based tasks and the associated ratings. In the summer semester of 2024, half of the students began with the paper–pencil-based tasks while the other half began with the digital tasks; each group then completed the tasks in the other format. In both formats, students rated the invested mental effort for each task immediately after completing it. A maximum of 30 minutes was available to complete the chirality tasks. The time limit did not need to be enforced in the paper format; in the digital format, the system automatically ended some students’ work after 30 minutes.

Occasionally, students asked whether they could use a scribble sheet during digital data collection. This was permitted. The scribbles were not collected and were therefore not considered for the data analysis.

Chirality tasks

The test instrument originally comprised 20 tandem tasks, each consisting of a paper–pencil-based task and a similar digital task. Fig. 4 shows an example task. As one task (C15d) led to issues with the data collection in one semester (when asked by a student, the test administrator said the task contained an error), this task and the corresponding paper–pencil-based partner task were excluded from the data analysis. Therefore, the evaluated dataset comprised 19 tandem tasks. In both formats, five tasks required the students to indicate, via four multiple-choice single-select answer options, whether the depicted molecule had a chirality centre and, if so, at which carbon atom. Five further tasks required the students to use a dual-choice single-select answer format to determine whether the two molecules shown were identical molecules or enantiomers. Four further tasks required students to draw the molecule corresponding to a given IUPAC name with at least one stereocentre (open format). Finally, five tasks asked the students to state the absolute configuration of a given molecule using a multiple-choice single-select format with four answer options.
Fig. 4 Example task (translation of the original German text).

Invested mental effort

The mental effort invested in each task was recorded using Paas’ labelled nine-point one-item rating scale (Paas, 1992). Immediate ratings (Xie and Salvendy, 2000; Paas et al., 2003a,b; van Gog et al., 2012; Leppink and van Merriënboer, 2015; Schmeck et al., 2015) were used to record the amount of invested mental effort on a task-specific basis.

Coding

The first author evaluated student responses in paper format. Correct solutions were coded as 1, and incorrect solutions or missing answers were coded as 0. The student answers in digital format were automatically corrected by the e-learning tool based on the given sample solutions. Again, correct answers were coded with 1 and incorrect answers or missing solutions with 0.

For the paper–pencil-based task processing, the first author also coded whether markings or notes were used during task processing. All processing traces in the paper-based test booklets that could be clearly assigned to a task were considered. If a task had traces of processing, it was coded as 1. Tasks without recognizable traces of processing were coded as 0. Fig. 5 provides an overview of the processing traces. Fig. 5a shows markings and notes that were also used by the lecturers teaching the exercise. Fig. 5b shows students’ markings and notes that were not used by the lecturers teaching the exercise.


Fig. 5 Example traces of processing. Red highlighting subsequently added by the authors. (a) Markings and notes used in the exercise; (b) alternative markings and notes.

Data analyses

Item response theory (IRT) was used to calculate the task difficulty for the individual tasks based on the student responses to the chirality tasks. Winsteps was used for the IRT analyses (Boone et al., 2014). Further data analysis was conducted at the task level using the relative task difficulties (Rasch parameters). The student responses to the invested mental effort rating scale were also IRT scaled, and these Rasch parameters were likewise used for further data analysis at the task level. The reliability of the chirality tasks (person reliability: 0.82; item reliability: 0.90) and the invested mental effort ratings (person reliability: 0.93; item reliability: 0.96) for the overall sample (Nstudents = 80, Nchirality tasks = 38, Ninvested mental effort ratings = 38) was satisfactory.

The task-based dataset examined comprised 19 tasks for which the following variables were available: relative task difficulty in paper format, relative task difficulty in digital format, invested mental effort in paper format, and invested mental effort in digital format. Easier chirality tasks (solved correctly by many students) are associated with a lower Rasch value, while more difficult tasks are associated with a higher Rasch value. Tasks in which many students have invested much mental effort are associated with a lower Rasch value. Tasks in which many students have only invested a small amount of mental effort are associated with a higher Rasch value.
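The relationship between a Rasch difficulty parameter and the probability of a correct response can be illustrated with the dichotomous Rasch model: the solution probability depends on the difference between person ability and task difficulty. A minimal sketch with illustrative parameter values:

```python
import math

# Dichotomous Rasch model: probability that a student with ability theta
# solves a task with difficulty b. The difficulty values below are taken
# from Table 2 for illustration; theta is an invented ability value.

def p_solve(theta: float, b: float) -> float:
    return 1.0 / (1.0 + math.exp(-(theta - b)))

theta = 0.5                       # a moderately able student (illustrative)
easy, hard = -0.61, 1.15          # paper-format difficulties of C04 and C18

print(round(p_solve(theta, easy), 2))   # ≈ 0.75: easier task, lower Rasch value
print(round(p_solve(theta, hard), 2))   # ≈ 0.34: harder task, higher Rasch value
```

The sketch makes the sign convention concrete: a lower Rasch value (easier task) yields a higher solution probability for the same student.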

RQ1 was answered by using t-tests to compare the relative task difficulty and the invested mental effort between the formats (paper vs. digital). As the students worked on both task types and tandem tasks were designed to address similar content, t-tests for paired values were used and a d for repeated measures (dRM) was reported. Additionally, correlation analyses were conducted. The t-tests and correlation analyses were calculated using SPSS (version 29). The d value for repeated measurements was calculated using the Psychometrica website (Lenhard and Lenhard, 2017).

RQ2 was answered by labelling students with at least one processing trace coded for at least one task as note-takers. Processing traces comprised all types of notes or markings that appeared to have been made in connection with the task processing (examples are shown in Fig. 5). Winsteps was used to examine the extent to which the person abilities of students who left traces of processing differed from those of the other students (differential item functioning [DIF] analyses; an effect size d was reported for groups of different sizes). Winsteps was also used to calculate the solution probabilities (observational averages) for the digital and paper–pencil-based chirality tasks as well as for the invested mental effort in both formats separately for both groups (note-takers vs. students whose test booklets had no processing traces). This resulted in eight new variables for each task: observational averages for the paper–pencil-based chirality tasks, the digital chirality tasks, the paper–pencil-based invested mental effort ratings, and the digital invested mental effort ratings – once each for the note-takers and once each for the students whose test booklets showed no traces of processing (Table 1).

Table 1 Structure of the dataset (one row per task)
Note-takers: observational average for the paper-based chirality tasks; observational average for the digital chirality tasks; observational average for the paper-based invested mental effort ratings; observational average for the digital invested mental effort ratings; observational average for notes (paper format).
Students whose test booklets show no traces of processing: observational average for the paper-based chirality tasks; observational average for the digital chirality tasks; observational average for the paper-based invested mental effort ratings; observational average for the digital invested mental effort ratings.


Additionally, the observational average for notes was calculated for each paper format task (based on the number of notes taken by the note-takers) and added as an additional variable for each task. The observational averages were imported into SPSS, which was used for further analyses. Paired t-tests were used to compare the solution probabilities for the chirality tasks and the invested mental effort between note-takers and non-note-takers within each format (digital and paper). The observational average for the invested mental effort in the paper format for the note-takers was centred using the group mean for further analyses. Moderation analysis was used to examine the extent to which an interaction term comprising note-taking and invested mental effort explains the probability of solving the paper–pencil-based drawing tasks better than either variable alone.

The SPSS PROCESS tool (version 4.2) was used to analyse the influence of note-taking on the relationship between invested mental effort and solution probability (PROCESS Model #1). The results section reports R2 and changes in R2 for the model. Additionally, it reports t-ratios, regression coefficients (B), and standard errors (SEs) for the model parameters as well as the results based on Johnson–Neyman output. For all analyses, 95% bias-corrected and accelerated confidence intervals, based on 1000 bootstrap samples (95% CI), are reported in parentheses.
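The core of such a moderation model, stripped of PROCESS's bootstrapping and Johnson–Neyman output, is an ordinary regression with an interaction term. A sketch on simulated data (all values invented for illustration):

```python
import random

# Moderation sketch: regress solution probability on mean-centred invested
# mental effort (x), note-taking (w), and their interaction (x * w).
# Plain OLS via the normal equations; PROCESS Model 1 additionally supplies
# bootstrap CIs and Johnson-Neyman regions. Data are simulated.

def ols(X, y):
    """Solve (X'X) b = X'y by Gaussian elimination with partial pivoting."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    v = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):                         # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):               # back substitution
        b[r] = (v[r] - sum(A[r][c] * b[c] for c in range(r + 1, k))) / A[r][r]
    return b

random.seed(1)
rows, y = [], []
for _ in range(200):
    x = random.gauss(0.0, 1.0)                   # centred invested mental effort
    w = random.gauss(0.5, 0.2)                   # note-taking probability
    rows.append([1.0, x, w, x * w])
    # simulated truth: the effort-solution link weakens as note-taking rises
    y.append(0.5 + 0.30 * x + 0.10 * w - 0.25 * x * w + random.gauss(0.0, 0.05))

b0, b_x, b_w, b_xw = ols(rows, y)
print(round(b_xw, 2))                            # negative interaction expected
```

A negative interaction coefficient corresponds to the pattern reported below: the more notes are taken, the weaker the relationship between invested mental effort and the solution probability.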

Table 2 provides an overview of the descriptive statistics before and after the Rasch analysis.

Table 2 Descriptive statistics
Columns per item: paper–pencil format – average score (solution), Rasch value (solution), average score (invested mental effort), Rasch value (invested mental effort), number of people who have taken notes; digital format – average score (solution), Rasch value (solution), average score (invested mental effort), Rasch value (invested mental effort).
Chirality centre C01 0.91 −2.29 3.05 1.04 5 0.89 −1.99 3.54 0.77
C02 0.59 −0.02 3.33 0.87 15 0.71 −0.68 3.63 0.72
C03 0.60 −0.08 3.84 0.60 11 0.66 −0.40 4.31 0.34
C04 0.70 −0.61 3.71 0.67 10 0.69 −0.54 4.56 0.21
C06 0.78 −1.13 3.94 0.55 9 0.75 −0.90 4.63 0.13
Identical or enantiomers C07 0.63 −0.20 4.73 0.09 9 0.70 −0.61 5.89 −0.57
C08 0.69 −0.54 5.06 −0.10 11 0.63 −0.20 4.93 0.00
C10 0.68 −0.47 5.24 −0.20 9 0.40 0.90 5.60 −0.43
C11 0.54 0.23 4.61 0.16 6 0.48 0.53 5.20 −0.19
C12 0.73 −0.75 4.74 0.09 3 0.66 −0.40 5.30 −0.25
Draw C13 0.60 −0.08 4.44 0.24 13 0.50 0.39 4.84 −0.08
C14 0.60 −0.10 4.40 0.14 11 0.36 1.06 5.08 −0.33
C16 0.49 0.47 5.29 −0.32 11 0.26 1.59 5.80 −1.13
C18 0.35 1.15 5.85 −0.71 10 0.28 1.41 5.43 −1.01
Absolute configuration C19 0.55 0.17 4.39 0.27 21 0.50 0.01 3.80 0.19
C20 0.60 −0.08 4.69 0.07 27 0.49 0.01 3.95 −0.04
C21 0.43 0.78 4.68 0.00 31 0.31 0.98 4.53 −0.46
C22 0.63 −0.20 5.13 −0.21 22 0.15 2.15 4.43 −0.33
C23 0.55 0.16 5.20 −0.30 31 0.43 0.27 4.48 −0.50


Results

To what extent do task difficulty and invested mental effort differ for digital and paper–pencil-based tasks on the topic of chirality?

For the chirality tasks, the relative task difficulty for the total sample of students, N = 80, differed between paper–pencil-based tasks, M = −0.19, SE = 0.17, and digital tasks, M = 0.19, SE = 0.23 (Fig. 6). The difference, ΔM = −0.38, 95% CI [−0.72, −0.04], was significant, t(18) = 2.35, p = 0.031, dRM = 0.69. Thus, the paper–pencil-based chirality tasks were significantly easier than the digital chirality tasks.
Fig. 6 Boxplot diagram for relative task difficulty for the paper–pencil and digital formats. The circle labelled 6 marks an outlier: the example task shown above (Fig. 4).

For the invested mental effort ratings, the relative task difficulty for the total sample of students, N = 80, differed between paper–pencil-based tasks, M = 0.16, SE = 0.10, and digital tasks, M = −0.16, SE = 0.11. The difference, ΔM = 0.31, 95% CI [0.21, 0.41], was significant, t(18) = 6.42, p ≤ 0.001, dRM = 1.68 (Fig. 7). Thus, students invested significantly less mental effort in completing the paper–pencil-based chirality tasks than in completing the digital chirality tasks.


Fig. 7 Boxplot diagram for Rasch values for invested mental effort. Note that the wording of the item “When completing the task, my overall mental effort was …” and the partial credit model mean that tasks in which less mental effort was invested (e.g., students chose 1 “very, very low”) have a high Rasch value. Tasks in which a lot of mental effort was invested (e.g., students chose 9 “very, very high”) have a low Rasch value. The interpretation of these values is therefore inverse to the Rasch values of the task processing.

In both formats, a significant correlation was found between relative task difficulty and invested mental effort. The correlation was slightly weaker for the paper format, r = −0.65, p = 0.002, 95% CI [−0.898, 0.024], than for the digital format, r = −0.75, p ≤ 0.001, 95% CI [−0.913, −0.506]. As lower Rasch values on the effort scale indicate more invested mental effort, the negative correlations mean that tasks with higher task difficulty involved a higher investment of mental effort.

To what extent does note-taking impact students’ working memory load when working on paper–pencil-based chirality tasks?

A total of 261 processing traces were coded. For 32 students, no processing traces were coded; for 48 students, processing traces were coded for at least one task, with a maximum of 19 per person. On average, the note-takers left processing traces for 5.5 tasks. The most processing traces were found in the task block requiring students to determine the absolute configuration of a molecule (33% of the answers with processing traces), followed by the task block requiring molecules to be drawn (14%) and the task block requiring students to determine whether a molecule had a chirality centre (12%). The fewest processing traces were made in the task block requiring students to determine whether molecules were identical or enantiomers (10%).

Students who took notes, M = 0.72, SE = 0.13, showed higher person abilities in the chirality tasks than students whose paper-based test booklets showed no processing traces (DIF analyses), M = −0.03, SE = 0.16, ΔM = 0.75, SE = 0.20, F(1,78) = 15.82, p ≤ 0.001, d = 1.73. Hence, students who took notes were better able to solve chirality tasks than those whose paper-based test booklets showed no processing traces.

Students who took notes, M = −0.16, SE = 0.07, showed invested mental effort equivalent to that of students whose paper-based test booklets showed no processing traces (DIF analyses), M = −0.14, SE = 0.10, ΔM = 0.01, SE = 0.13, F(1,78) = 0.01, p = 0.06. Hence, students who took notes invested a comparable amount of mental effort to students whose paper-based test booklets showed no evidence of processing.

Since the two groups of students (note-takers and students whose paper-based test booklets showed no traces of processing) differed in their ability to solve the chirality tasks (DIF analyses), further analyses used the observational averages (solution probabilities) calculated separately by Winsteps for both groups for the paper–pencil-based and digital chirality tasks, as well as the mental effort invested in both formats. Additionally, the number of notes was used to calculate the probability that the note-takers would take notes for a given task. It was then determined to what extent the probability of solving the chirality tasks and the invested mental effort differed between the two groups in both formats (paper vs. digital).

For the paper–pencil-based chirality tasks, the solution probability differed between the note-takers, M = 0.67, SE = 0.03, and the non-note-takers, M = 0.53, SE = 0.04. The difference, ΔM = −0.15, 95% CI [−0.23, −0.07], was significant, t(18) = 3.63, p = 0.003, dRM = 0.71. For the digital chirality tasks, the solution probability also differed between the note-takers, M = 0.62, SE = 0.04, and the non-note-takers, M = 0.44, SE = 0.05. The difference, ΔM = −0.18, 95% CI [−0.26, −0.09], was again significant, t(18) = 4.31, p = 0.002, dRM = 0.98. Therefore, the higher person ability of the note-takers was not solely due to the paper format; instead, note-takers performed significantly better in both formats.

The invested mental effort did not differ significantly between note-takers and non-note-takers in either the paper–pencil-based format (note-takers: M = 3.63, SE = 0.23; non-note-takers: M = 3.72, SE = 0.12; t(18) = 0.71, p = 0.470) or the digital format (note-takers: M = 4.23, SE = 0.24; non-note-takers: M = 4.11, SE = 0.14; t(18) = 0.89, p = 0.362). Hence, the effort invested by the two groups did not differ in either format.

For the note-takers, a moderation analysis was conducted to better explain the higher probability of solving the paper–pencil-based chirality tasks with comparable mental effort. For this purpose, the invested mental effort of the note-takers for the paper–pencil-based tasks was first mean-centered. Fig. 8 illustrates the relationship between the probability of solving the task and the invested mental effort for two extreme groups of items: items with few notes versus items with many notes. For items for which many notes were made, the probability of solution was relatively constant, regardless of the amount of mental effort invested. For items for which few notes were made, the probability of solution varied more widely: items with a high probability of being solved tended to involve below-average mental effort, while items with a low probability of being solved tended to involve above-average mental effort.


image file: d5rp00256g-f8.tif
Fig. 8 Interaction effect. Scatterplot visualizing the moderated regression of invested mental effort on note-taking for tasks with few and many notes. Note: the tasks were divided into two approximately equally sized groups only to create the figure; this grouping was not used in the statistical calculation, which treated the moderator as continuous.

The results indicate that invested mental effort, B = −0.18, SE = 0.03, t = 5.63, p < 0.001, 95% CI [−0.25, −0.11], and note-taking, B = −1.37, SE = 0.50, t = 2.74, p = 0.015, 95% CI [−2.44, −0.31], were significant predictors of the probability of solving the task. The interaction term (invested mental effort × note-taking) was a further significant predictor, B = 0.33, SE = 0.12, t = 2.72, p = 0.016, 95% CI [0.07, 0.58]. Overall, the model accounted for 80% of the variance in the probability of solving the task, with the interaction term explaining 10% of the variance beyond the main-effect terms.

The interaction indicated that when note-taking was low (1 SD below the mean), there was a significant negative relationship between the invested mental effort and the probability of solving the task, B = −0.14, SE = 0.02, t = 6.75, p < 0.001, 95% CI [−0.19, −0.10]. When note-taking was at the mean, the relationship was weaker, B = −0.09, SE = 0.02, t = 4.90, p < 0.001, 95% CI [−0.12, −0.05]. When note-taking was high (1 SD above the mean), the relationship was no longer significant, B = −0.03, SE = 0.03, t = 0.81, p = 0.431, 95% CI [−0.10, 0.04]. Hence, the correlation between the invested mental effort and the probability of solving the task decreased as the number of notes increased.
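The simple-slopes probing described above can be sketched with ordinary least squares: fit the interaction model with a mean-centered focal predictor, then evaluate the effort slope at −1 SD, the mean, and +1 SD of the moderator, using the coefficient covariance matrix for the standard errors. The data and coefficients below are simulated assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 19                                     # one observation per task

# Simulated data (illustration only): mental effort and note-taking.
effort = rng.normal(0, 1, n)
notes = rng.normal(0, 1, n)
# Outcome: a negative effort effect that weakens as note-taking increases
# (positive interaction), mirroring the pattern reported above.
y = (0.6 - 0.15 * effort + 0.05 * notes + 0.08 * effort * notes
     + rng.normal(0, 0.05, n))

# Mean-center the predictors before forming the interaction term.
effort_c = effort - effort.mean()
notes_c = notes - notes.mean()
X = np.column_stack([np.ones(n), effort_c, notes_c, effort_c * notes_c])

beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)      # coefficient covariance matrix

def simple_slope(m):
    """Slope of effort at moderator value m, with its standard error."""
    slope = beta[1] + beta[3] * m
    se = np.sqrt(cov[1, 1] + m**2 * cov[3, 3] + 2 * m * cov[1, 3])
    return slope, se

sd = notes_c.std(ddof=1)
for label, m in [("-1 SD", -sd), ("mean", 0.0), ("+1 SD", sd)]:
    b, se = simple_slope(m)
    print(f"notes at {label}: slope = {b:.3f}, SE = {se:.3f}")
```

With a positive interaction coefficient, the effort slope estimated at +1 SD of the moderator lies closer to zero than the slope at −1 SD, which is the pattern the moderation analysis reports.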

Discussion

To what extent do task difficulty and invested mental effort differ for digital and paper–pencil-based tasks on the topic of chirality?

The results show that compared to digital chirality tasks, the paper–pencil-based chirality tasks involved lower task difficulty and less invested mental effort. The first finding is consistent with hypothesis H1.1 (it is expected that digital tasks are solved correctly less often than paper–pencil-based tasks) and previous research comparing paper–pencil-based and digital tasks in organic chemistry (Schuessler et al., 2024a,b).

The second finding is consistent with hypothesis H1.2a (the mental effort invested is higher for digital tasks because more information must be held in working memory; external storage). Consequently, hypothesis H1.2b (the mental effort invested is lower for digital tasks because less deep processing occurs without taking notes; encoding) must be rejected. Students invest mental effort to solve digital chirality tasks, but without external storage, they must invest more effort and are less successful. This interpretation is supported by the correlation analyses: for both formats, a significant negative correlation was found between relative task difficulty and the Rasch parameter for invested mental effort, meaning that more difficult tasks involved a higher investment of mental effort. Solving digital chirality tasks therefore appears to be impaired. From an instructional efficiency perspective (Paas and van Merriënboer, 1993; van Gog and Paas, 2008), processing paper–pencil-based chirality tasks appears more favourable than processing digital chirality tasks: when working on digital tasks, an additional load appears to be placed on working memory, which is reflected in the greater mental effort invested. The inability to relieve working memory by taking notes is a possible cause, pointing to a potential cognitive overload in the absence of external storage (Rickards and Friedman, 1978; Kiewra et al., 1991).

To what extent does note-taking impact students’ working memory load when working on paper–pencil-based chirality tasks?

Students who took notes were significantly better able to solve chirality tasks than those whose paper-based test booklet showed no traces of processing. This higher ability was found for both formats (digital and paper–pencil). Invested mental effort did not differ between students who took notes and those whose paper-based test booklet showed no traces of processing. The latter result could indicate that taking notes helps relieve the working memory (external storage function of notes) and achieve better results with a comparable amount of mental effort. Invested mental effort was higher for both groups for the digital format. Fig. 9 provides an overview of the findings.
image file: d5rp00256g-f9.tif
Fig. 9 Overview of the findings.

In view of the results, neither the external storage nor the encoding hypothesis is supported: students who take notes neither indicate a lower cognitive load (external storage, H2.1a) nor report a higher cognitive load (encoding, H2.1b). From an instructional efficiency perspective (Paas and van Merriënboer, 1993; van Gog and Paas, 2008), note-taking appears more efficient, as these students achieve better results with the same amount of effort. However, since these students also perform better in the digital format, where they cannot take notes, they may simply have a higher level of expertise and therefore perform better with the same amount of mental effort. One possible explanation is the relativity of cognitive load (Brünken et al., 2012): since no objective criterion exists for the amount of mental effort invested, two people can associate different things with, for example, a medium amount of mental effort (Krieglstein et al., 2025).

A moderation analysis was conducted to predict the observational average for the chirality tasks for note-takers using invested mental effort and note-taking as predictors. It indicated that invested mental effort, note-taking, and their interaction (invested mental effort × note-taking) are significant predictors of the probability of solving the task. Overall, the model accounted for 80% of the variance in the probability of solving the task, with the interaction term explaining 10% of the variance beyond the main-effect terms. When note-taking was low, the relationship between invested mental effort and the probability of solving the task was strongest; when note-taking was high, this relationship was no longer significant. Hence, the correlation between the invested mental effort and the probability of solving the task decreased as the number of notes increased. However, it remains unclear why note-takers also performed better in the digital format. In principle, it could be expected that the note-takers working in the digital format (where they could not take notes) would achieve results similar to those of the students whose paper-based test booklets showed no processing traces (or that their mental effort would increase disproportionately).

Limitations and future research

The results were limited by the notes being taken spontaneously. Firstly, this reduced the number of students available for the moderation analysis. Secondly, students and tasks differed considerably in the number of notes taken. Furthermore, it is unclear why some students took notes spontaneously and others did not; the former may regulate their working memory load better, or they may simply have learned the strategy of note-taking. Systematically practising note-taking while working on tasks might therefore be a means of supporting all students in solving them. Additionally, the results are limited by the note-takers being more able to complete the chirality tasks than the students whose paper-based test booklets showed no traces of processing. Another limitation is the coding of the processing traces. Firstly, it was difficult to recognize whether processing traces were present, especially when only dots were made. Secondly, without further information, the processing traces appeared to make varying degrees of sense. In the future, qualitative data or a more differentiated coding of larger datasets could provide further insights.

A further limitation concerns the scribbles that individual students made in the digital format, as these were not included in the analysis; our analysis relates exclusively to notes made in the paper format. We did not consider the individual notes made in the digital format for two reasons. Firstly, we do not consider these notes to be of equal value, as they do not primarily support helpful processing steps but instead require the molecule to be copied from the screen, which is actually unnecessary; it is therefore very likely that they are associated with a high degree of split attention and extraneous cognitive load. Secondly, only a few students used this option, so a quantitative evaluation would not have been possible, not least because these scribbles can hardly be assigned unambiguously to the associated tasks.

The decreasing correlation between the mental effort invested and the probability of solving the task as the number of notes increases indicates that taking notes is a means to manage a high working memory load, reduce it, and thus also solve difficult tasks (external storage). However, future studies should aim to replicate the results with clearer group comparisons (students who must take notes for each task and students who are not allowed to take notes), perhaps combined with previous note-taking training for the note-taking group. Additionally, the level of task complexity (e.g., determining the absolute configuration of molecules with one, two, or three chirality centres) at which students depend on notes when solving tasks should be investigated.

Studies on the importance of note-taking predominantly stem from learning situations, whereas the data described here are from a test situation. We assume that our test subjects were still novices at the time of testing, whose corresponding schemas were not yet sufficiently automated. In this sense, the test situation could also be seen as a further learning opportunity. The relationship between the impact of notes and developing expertise (full schema automation) should be investigated further in the future.

Additional data on student motivation, extraneous cognitive load in the digital format, and working memory resources could be helpful in investigating whether more effort is invested at some point due to a lack of motivation, the extent to which extraneous cognitive load increases the mental effort required in the digital format, and the extent to which tasks may not be solved due to a lack of working memory resources. Furthermore, the study was conducted in Germany, where there are no specific requirements for the introduction of the skeletal formula. We observe that it tends not to be used at school, whereas it is standard (and often not explicitly introduced) in organic chemistry at the university level. To gain better insights, it would be useful to collect more systematic information on how teachers at school and at university introduce and use the skeletal formula. Finally, we examined students from three degree programs at one university. We did not investigate the extent to which there are differences between students on the different degree programs. A replication of the results with a different sample (university or degree program) would therefore appear desirable.

Conclusion

This paper compares the task difficulty and mental effort invested in digital and paper–pencil-based chirality tasks. Both task formats require the use of representations (molecular representations using the skeletal formula): students must read the information provided with the representation and link it to prior knowledge to achieve partial solutions. Notes appear relevant both as support for processing information (encoding) and as external storage to avoid overloading the working memory. The results show that the paper–pencil-based chirality tasks involved lower task difficulty and less invested mental effort, indicating that notes help relieve the burden on working memory (external storage). Furthermore, students who took notes were significantly better able to solve chirality tasks than those whose paper-based test booklets showed no traces of processing, while investing a comparable amount of mental effort. Interestingly, this result was observed for both paper–pencil-based and digital tasks (where no notes could be taken). Further research is needed to investigate the importance of note-taking in tasks requiring the use of representations, and to determine whether all notes are equally helpful or whether certain notes are more helpful than others.

The results of a moderation analysis showed that the correlation between the invested mental effort and the probability of solving the task decreased as the number of notes increased. This indicates that notes, as external storage, are relevant for processing tasks that require handling representations. As the digital format does not offer comparable options for taking notes, note-taking represents a subject-specific format difference.

Author contributions

Conceptualization: Katrin Schuessler; data curation: Katrin Schuessler; formal analysis: Katrin Schuessler; funding acquisition: Michael Giese, Maik Walpuski; investigation: Katrin Schuessler; methodology: Katrin Schuessler, Maik Walpuski; project administration: Michael Giese, Maik Walpuski; resources: Michael Giese, Maik Walpuski; supervision: Maik Walpuski; validation: Katrin Schuessler; writing – original draft: Katrin Schuessler; writing – review & editing: Michael Giese, Maik Walpuski.

Conflicts of interest

There are no conflicts to declare.

Data availability

The dataset is available here: https://doi.org/10.17185/duepublico/83725.

Acknowledgements

We thank all the students who participated in the study and the Foundation for Innovation in Higher Education for funding the PITCH project within whose framework the study was conducted.

Notes and references

  1. Anderson T. L. and Bodner G. M., (2008), What can we do about ‘Parker’? A case study of a good student who didn’t ‘get’ organic chemistry, Chem. Educ. Res. Pract., 9(2), 93–101.
  2. Anzovino M. E. and Lowery Bretz S., (2016), Organic chemistry students’ fragmented ideas about the structure and function of nucleophiles and electrophiles: a concept map analysis, Chem. Educ. Res. Pract., 17(4), 1019–1029.
  3. Asmussen G., Rodemer M. and Bernholt S., (2023), Blooming student difficulties in dealing with organic reaction mechanisms – an attempt at systemization, Chem. Educ. Res. Pract., 24(3), 1035–1054.
  4. Bodé N. E., Caron J. and Flynn A. B., (2016), Evaluating students’ learning gains and experiences from using nomenclature101.com, Chem. Educ. Res. Pract., 17(4), 1156–1173.
  5. Boone W. J., Staver J. R. and Yale M. S., (2014), Rasch analysis in the human sciences, Dordrecht: Springer Netherlands.
  6. Brecher J., (2008), Graphical representation standards for chemical structure diagrams (IUPAC Recommendations 2008), Pure Appl. Chem., 80(2), 277–410.
  7. Brünken R., Seufert T. and Paas F., (2012), in Plass J. L., Moreno R. and Brünken R. (ed.), Cognitive load theory, Cambridge University Press, pp. 181–202.
  8. Buchner J. and Kerres M., (2023), Media comparison studies dominate comparative research on augmented reality in education, Comput. Educ., 195, 104711.
  9. Clark R. E. and Feldon D. F., (2014), Ten common but questionable principles of multimedia learning, in Mayer R. E. (ed.), The Cambridge handbook of multimedia learning, Cambridge University Press, pp. 151–173.
  10. Clayden J., Greeves N., Warren S. and Wothers P., (2001), Organic chemistry, Oxford: Oxford University Press.
  11. Cooper M. M., Grove N., Underwood S. M. and Klymkowsky M. W., (2010), Lost in Lewis structures: an investigation of student difficulties in developing representational competence, J. Chem. Educ., 87(8), 869–874.
  12. Dickmann T., Opfermann M., Dammann E., Lang M. and Rumann S., (2019), What you see is what you learn? The role of visual model comprehension for academic success in chemistry, Chem. Educ. Res. Pract., 20(4), 804–820.
  13. Di Vesta F. J. and Gray G. S., (1972), Listening and note taking, J. Educ. Psychol., 63(1), 8–14.
  14. Dood A. J. and Watts F. M., (2023), Students’ strategies, struggles, and successes with mechanism problem solving in organic chemistry: a scoping review of the research literature, J. Chem. Educ., 100(1), 53–68.
  15. GDCh, (2021), Empfehlungen der GDCh-Studienkommission zum Bachelorstudium Chemie an Universitäten [Recommendations of the GDCh Study Commission on the Bachelor's Programme in Chemistry at Universities], https://www.gdch.de/fileadmin/downloads/Service_und_Informationen/Downloads/Schule_Studium/PDF/2021_GDCh_Studienkommission_Druckversion.pdf, accessed 3/14, 2024.
  16. Goodwin W., (2010), How do structural formulas embody the theory of organic chemistry? Br. J. Philos. Sci., 61(3), 621–633.
  17. Graulich N. and Bhattacharyya G., (2017), Investigating students’ similarity judgments in organic chemistry, Chem. Educ. Res. Pract., 18(4), 774–784.
  18. Grossman R. B., (2020), Art of writing reasonable organic reaction mechanisms, Springer.
  19. Grove N. P. and Lowery Bretz S., (2012), A continuum of learning: from rote memorization to meaningful learning in organic chemistry, Chem. Educ. Res. Pract., 13(3), 201–208.
  20. Gurung E., Jacob R., Bunch Z., Thompson B. and Popova M., (2022), Evaluating the effectiveness of organic chemistry textbooks for promoting representational competence, J. Chem. Educ., 99(5), 2044–2054.
  21. Habig S., (2020), Who can benefit from augmented reality in chemistry? Sex differences in solving stereochemistry problems using augmented reality, Br. J. Educ. Technol., 51(3), 629–644.
  22. Heller S. R., McNaught A., Pletnev I., Stein S. and Tchekhovskoi D., (2015), InChI, the IUPAC international chemical identifier, J. Cheminf., 7, 23.
  23. Jansen R. S., Lakens D. and IJsselsteijn W. A., (2017), An integrative review of the cognitive costs and benefits of note-taking, Educ. Res. Rev., 22, 223–233.
  24. Kalyuga S., (2014), in Mayer R. E. (ed.), The Cambridge handbook of multimedia learning, Cambridge University Press, pp. 576–597.
  25. Keller S., Rumann S. and Habig S., (2021), Cognitive load implications for augmented reality supported chemistry learning, Information, 12(3), 96.
  26. Kellman P. J. and Garrigan P., (2009), Perceptual learning and human expertise, Phys. Life Rev., 6(2), 53–84.
  27. Kiewra K. A., DuBois N. F., Christian D., McShane A., Meyerhoffer M. and Roskelley D., (1991), Note-taking functions and techniques, J. Educ. Psychol., 83(2), 240–245.
  28. Kozma R., Chin E., Russell J. and Marx N., (2000), The roles of representations and tools in the chemistry laboratory and their implications for chemistry learning, J. Learn. Sci., 9(2), 105–143.
  29. Kozma R. and Russell J., (1997), Multimedia and understanding: expert and novice responses to different representations of chemical phenomena, J. Res. Sci. Teach., 34(9), 949–968.
  30. Krieglstein F., Schmitz M., Wesenberg L., Spitzer M. W. H. and Rey G. D., (2025), How a first impression biases cognitive load assessments: Anchoring effects in problem-solving tasks of varying element interactivity, Mem. Cogn. DOI:10.3758/s13421-025-01764-3.
  31. Leahy W. and Sweller J., (2004), Cognitive load and the imagination effect, Appl. Cognit. Psychol., 18(7), 857–875.
  32. Leahy W. and Sweller J., (2005), Interactions among the imagination, expertise reversal, and element interactivity effects, J. Exp. Psychol. Appl., 11(4), 266–276.
  33. Leahy W. and Sweller J., (2008), The imagination effect increases with an increased intrinsic cognitive load, Appl. Cognit. Psychol., 22(2), 273–283.
  34. Leahy W. and Sweller J., (2016), Cognitive load theory and the effects of transient information on the modality effect, Instr. Sci., 44(1), 107–123.
  35. Lenhard W. and Lenhard A., (2017), Psychometrica: Computation of Effect Sizes, https://www.psychometrica.de/effektstaerke.html, accessed 2025, June 27th.
  36. Leppink J. and van Merriënboer J. J. G., (2015), The beast of aggregating cognitive load measures in technology-based learning, Educ. Technol. Soc., 18(4), 230–245.
  37. Mayer R. E., (2012), Multimedia learning, Cambridge University Press.
  38. Mayer R. E., (2014), Cognitive theory of multimedia learning, in Mayer R. E. (ed.), The Cambridge handbook of multimedia learning, Cambridge University Press, pp. 43–71.
  39. Paas F., (1992), Training strategies for attaining transfer of problem-solving skill in statistics: a cognitive-load approach, J. Educ. Psychol., 84(4), 429–434.
  40. Paas F., Renkl A. and Sweller J., (2003a), Cognitive load theory and instructional design: recent developments, Educ. Psychol., 38(1), 1–4.
  41. Paas F., Tuovinen J. E., Tabbers H. and van Gerven P. W. M., (2003b), Cognitive load measurement as a means to advance cognitive load theory, Educ. Psychol., 38(1), 63–71.
  42. Paas F., Renkl A. and Sweller J., (2004), Cognitive load theory: instructional implications of the interaction between information structures and cognitive architecture, Instr. Sci., 32(1/2), 1–8.
  43. Paas F. and van Merriënboer J. J. G., (1993), The efficiency of instructional conditions: an approach to combine mental effort and performance measures, Hum. Factors, 35(4), 737–743.
  44. Rau M. A., (2017), Conditions for the effectiveness of multiple visual representations in enhancing STEM learning, Educ. Psychol. Rev., 29(4), 717–761.
  45. Rau M. A., (2018), Making connections among multiple visual representations: how do sense-making skills and perceptual fluency relate to learning of chemistry knowledge? Instr. Sci., 46(2), 209–243.
  46. Rho J., Rau M. A. and Vanveen B., (2022), Investigating growth of representational competencies by knowledge-component model, Zenodo.
  47. Rickards J. P. and Friedman F., (1978), The encoding versus the external storage hypothesis in note taking, Contemp. Educ. Psychol., 3(2), 136–143.
  48. Schmeck A., Opfermann M., van Gog T., Paas F. and Leutner D., (2015), Measuring cognitive load with subjective rating scales during problem solving: differences between immediate and delayed ratings, Instr. Sci., 43(1), 93–114.
  49. Schuessler K., Rodemer M., Giese M. and Walpuski M., (2024a), Organic chemistry and the challenge of representations: student difficulties with different representation forms when switching from paper–pencil to digital format, J. Chem. Educ., 101(11), 4566–4579.
  50. Schuessler K., Striewe M., Pueschner D., Luetzen A., Goedicke M., Giese M. and Walpuski M., (2024b), Developing and evaluating an e-learning and e-assessment tool for organic chemistry in higher education, Front. Educ., 9, 1355078.
  51. Sim J. H. and Daniel E. G. S., (2014), Representational competence in chemistry: a comparison between students with different levels of understanding of basic chemical concepts and chemical representations, Cogent. Educ., 1(1), 991180.
  52. Stowe R. L. and Esselman B. J., (2023), The picture is not the point: toward using representations as models for making sense of phenomena, J. Chem. Educ., 100(1), 15–21.
  53. Sweller J., (1988), Cognitive load during problem solving: effects on learning, Cogn. Sci., 12(2), 257–285.
  54. Sweller J., (2010), Element interactivity and intrinsic, extraneous, and germane cognitive load, Educ. Psychol. Rev., 22(2), 123–138.
  55. Sweller J., van Merrienboer J. J. G. and Paas F. G. W. C., (1998), Cognitive architecture and instructional design, Educ. Psychol. Rev., 10(3), 251–296.
  56. Sweller J., van Merriënboer J. J. G. and Paas F., (2019), Cognitive architecture and instructional design: 20 years later, Educ. Psychol. Rev., 31(2), 261–292.
  57. Taskin V., Bernholt S. and Parchmann I., (2017), Student teachers’ knowledge about chemical representations, Int. J. Sci. Math. Educ., 15(1), 39–55.
  58. Tiettmeyer J. M., Coleman A. F., Balok R. S., Gampp T. W., Duffy P. L., Mazzarone K. M. and Grove N. P., (2017), Unraveling the complexities: an investigation of the factors that induce load in chemistry students constructing Lewis structures, J. Chem. Educ., 94(3), 282–288.
  59. van Gog T., Kirschner F., Kester L. and Paas F., (2012), Timing and frequency of mental effort measurement: evidence in favour of repeated measures, Appl. Cognit. Psychol., 26(6), 833–839.
  60. van Gog T. and Paas F., (2008), Instructional Efficiency: Revisiting the Original Construct in Educational Research, Educ. Psychol., 43(1), 16–26.
  61. van Merriënboer J. J. G. and Sweller J., (2005), Cognitive load theory and complex learning: recent developments and future directions, Educ. Psychol. Rev., 17(2), 147–177.
  62. Xie B. and Salvendy G., (2000), Prediction of mental workload in single and multiple tasks environments, Int. J. Cogn. Ergonom., 4(3), 213–242.

This journal is © The Royal Society of Chemistry 2026