Science practices in the general chemistry laboratory: hits, misses, and near misses

Norda Simone Stephenson *ad, Caroline Lund Dahlberg b, Trà Huỳnh cd, Erin Duffy ad, Katherine Hunter a, Luka Spring b, Elayna Worline b, Jessica Weaver b and Kaitlyn Bolland d
aDepartment of Chemistry, Western Washington University, Bellingham, WA 98225, USA. E-mail: stephen2@wwu.edu
bBiology Department, Western Washington University, Bellingham, WA 98225, USA
cDepartment of Physics & Astronomy, Western Washington University, Bellingham, WA 98225, USA
dScience, Math, and Technology Education, Western Washington University, Bellingham, WA 98225, USA

Received 16th July 2025, Accepted 18th November 2025

First published on 19th November 2025


Abstract

An emphasis on science practices in university science education provides a logical route to student development of scientific literacy and critical thinking. Unlike scientific literacy and critical thinking, which have been the subjects of much debate and controversy, the science practices are consensus-based and well-defined. As the disaggregated components of critical thinking, science practices are well-positioned to support and systematically assess student development of critical thinking. In this paper, we report on our examination of general chemistry laboratory curricula for opportunities to engage students in science practices using the Three-Dimensional Learning Assessment Protocol (3D-LAP). Our findings suggest that the laboratory curricula provide opportunities for students to engage in some practices, while others are consistently underrepresented. Additionally, there is a disproportionate emphasis on mathematical thinking, which not only has equity implications, especially for marginalized students, but could also have the unintended consequence of students developing a distorted view of the intellectual work of chemistry. The findings of our study indicate a need for greater balance with respect to the practices in the general chemistry laboratory curricula, and have the potential to inform the design, development, or adaptation of lab curricula to better support scientific literacy and critical thinking. We couple accountable disciplinary knowledge and constructive alignment to discuss these and other findings and implications of our work for teaching, learning, and research.


Introduction

The development of students with strong critical thinking and scientific literacy skills remains an important priority for university science education. Students proficient in these skills will not only have a better understanding of the natural world, but will also develop and apply the requisite skills to real-world issues in ways that are both cognitively and dispositionally sound (Facione, 1990; Gormally et al., 2012). However, both critical thinking and scientific literacy have been the subjects of much debate and controversy for decades, and have suffered from a lack of consensus within the academic community about what constitutes each. While, or perhaps because, we also highly value critical thinking and scientific literacy, rather than adding to the ever-growing number of definitions for these two constructs, we propose that focusing on student engagement in science practices can help us achieve the goal of developing both students’ critical thinking and scientific literacy skills. Throughout this manuscript we use the term science practices. We have chosen this term rather than science and engineering practices because our focus is on chemistry, a physical science, and our team's expertise does not extend to engineering and engineering practices.

Science practices are the behaviors and activities that scientists engage in frequently, and can be considered the disaggregated components of critical thinking (National Research Council, 2012a; Cooper, 2016; Stowe and Cooper, 2017). Moreover, science practices are not just skills; they combine content knowledge with skills, and require that students demonstrate what they are able to do with that knowledge (National Research Council, 2012a; Reed et al., 2017). In this way, the science practices provide evidence of what students know and are able to do with content knowledge, and therefore embody the goals and intents of both critical thinking and scientific literacy. However, unlike critical thinking and scientific literacy, which generally lack consensus definitions, the science practices are now well-defined, providing not only common ground for discussion, but also allowing for their systematic assessment.

Student engagement in science practices can and should take place in all teaching and learning environments. Undergraduate chemistry laboratory classes, in particular, provide an environment that is conducive to student engagement in science practices due to their longer duration, more relaxed atmosphere, and lower student–instructor ratio (compared to lecture classes). However, while the laboratory is particularly well-suited for student engagement in science practices, actual engagement in practices is unlikely to take place without the appropriate materials and resources, including laboratory materials (e.g. laboratory experiments) that emphasize the practices.

The work described in this manuscript is part of a larger project focused on exploring how science practices are incorporated across introductory biology, chemistry and physics laboratory courses. Phase 1 of the project examines laboratory artifacts to determine the extent to which they have the potential to elicit evidence of student engagement in science practices. This paper reports on how the science practices are situated within general chemistry laboratory curricula, with emphasis on which practices are targeted, the degree to which these practices are targeted, and in what part of the lab (in lab or out of lab) these targeted practices are present. Additionally, we explore opportunities for promoting greater student engagement in science practices within the laboratory curricula. The findings of this study have the potential to inform the design, development or adaptation of lab curricula to better support scientific literacy and critical thinking development. We discuss this and other implications of this work for teaching, learning, and research.

Science practices and the three-dimensional learning assessment protocol (3D-LAP)

Over the last thirteen years, our understanding of science practices has been significantly shaped by two major publications. First, the clear descriptions of the science practices in the Framework for K-12 Science Education (National Research Council, 2012a) and the Next Generation Science Standards (NGSS) (National Research Council, 2013) represented a critical turning point for discussing and assessing science practices within the science education community, providing “enhanced professional language for communicating meaning” (Osborne, 2014). Second, the Three-Dimensional Learning Assessment Protocol, or 3D-LAP (Laverty et al., 2016), provided a set of criteria for characterizing the potential of college science assessment tasks across biology, chemistry and physics to elicit evidence of student engagement in core ideas, crosscutting concepts, and science practices. Together, these two developments have increased clarity of understanding and the potential for systematic assessment of science practices within the science education community. Although the Framework and NGSS were developed for use in K-12 contexts, a burgeoning body of evidence supports their application to K-20 and beyond (Laverty et al., 2016; Carmel et al., 2017, 2019; Underwood et al., 2018; Rodriguez and Towns, 2018; Stephenson et al., 2020, 2023; Van Wyk et al., 2025). Similarly, while the 3D-LAP was not specifically developed for use in laboratory assessment, it has been used successfully to characterize laboratory curricula (Carmel et al., 2019; Stephenson et al., 2023). The original 3D-LAP criteria for Science Practices, used for non-laboratory college science assessments, outline eight practices (Laverty et al., 2016).
To characterize science practices in laboratory environments, two additional practices, considered critical to what students are expected to do in the laboratory, were added to the 3D-LAP criteria, bringing the number of practices in this expanded version of the 3D-LAP to ten (Carmel et al., 2019). This expanded version of the 3D-LAP criteria for Science Practices was used in this study. These practices are shown in Table 1, and the 3D-LAP criteria for developing and using models (P2) are shown in Box 1.

Box 1. 3D-LAP Criteria for Developing and Using Models (Laverty et al., 2016).


1. Question gives an event, observation, or phenomenon for the student to explain or make a prediction about.

2. Question gives a representation or asks student to construct a representation.

3. Question asks student to explain or make a prediction about the event, observation, or phenomenon.

4. Question asks student to provide the reasoning that links the representation to their explanation or prediction.


Table 1 Science practices of the expanded 3D-LAP (Laverty et al., 2016; Carmel et al., 2019)
P1 Asking Questions
P2 Developing and Using Models
P3 Planning Investigations
P4 Analyzing and Interpreting Data
P5 Using Math and Computational Thinking
P6 Constructing Explanations
P7 Engaging in Argument from Evidence
P8 Evaluating Information
P9 Communicating Information
P10 Defining Problems and Designing Solutions


While science practices are by no means new to science education research, this study focuses primarily on chemistry education research involving the science practices since the publication of the well-defined practices in the Framework and NGSS. In general chemistry courses, engagement in science practices has been investigated through student reasoning about a range of phenomena. For example, Becker and colleagues (2017) gained insight into how students engaged in interpreting data and mathematical thinking by examining their responses to an initial rates task. Rodriguez et al. (2019) characterized the mathematical reasoning skills of students in a first-year non-majors chemistry course at a Swedish university through their responses to a graph-based task in chemical kinetics. Similarly, Urbanek and colleagues (2023) examined students’ responses to epistemic uncertainty as they engaged with an argumentation task. Other studies at this level have focused on providing students with opportunities to engage in other science practices, including constructing explanations (CE), engaging in argument from evidence (EE), and developing and using models (DUM), among others (Williams et al., 2015; Cooper et al., 2016; Brandriet et al., 2018; Noyes and Cooper, 2019; Shiroda et al., 2024). These studies reveal areas where student engagement in particular practices can be strengthened and are valuable tools for instructors and curriculum developers.

Perhaps unsurprisingly, science practices-related research in general chemistry has a strong laboratory emphasis. A number of studies have made the case that some laboratory pedagogies and instructional models better support student development in science practices than more traditional pedagogies. These include argument-driven inquiry (ADI) (Walker et al., 2016; Polk and Santos, 2025), the Science Writing Heuristic (Hike and Hughes-Phelan, 2020; Stephenson et al., 2023), project-based learning (Carmel et al., 2017, 2019), and general inquiry-type initiatives (Hosbein and Walker, 2022), all of which have been shown to provide students with more opportunities to engage with science practices than more traditional environments. Other work at the general chemistry level has focused on the development, modification, and implementation of resources to promote deeper and more meaningful student engagement with science practices. Initiatives include new experiments or parts of experiments (e.g. prelabs, postlabs, guiding questions) (Carmel et al., 2017; Rodriguez and Towns, 2018; Gao et al., 2021).

A small but significant body of research into science practices has centered on the development and/or use of instruments for characterizing various instructional environments and materials for their potential to engage students in science practices. Science practices assessment tasks that provide students with opportunities to engage in science practices were developed by Stephenson and colleagues (2020) using evidence-centered design, while Hosbein and Walker (2022) developed the Investigation Design, Explanation, and Argument Assessments for the first semester of General Chemistry Laboratory (IDEAA-GC1) to assess students’ development in the practices of planning and carrying out investigations and generating scientific arguments in an inquiry-type general chemistry laboratory. The IONIC protocol (ICAP to Measure by Observation NGSS Science Practice Implementation in the Classroom; Interactive-Constructive-Active-Passive (ICAP)) (Chen and Terada, 2021) has been used to compare student engagement in science practices in ADI-modified and traditional laboratory environments in general chemistry (Polk and Santos, 2025). The Three-Dimensional Learning Observation Protocol, or 3D-LOP (Bain et al., 2020), a near cousin of the previously mentioned 3D-LAP, has been used to characterize pedagogical approaches in instructional environments and assess student engagement in three dimensions, including science practices.

The aforementioned 3D-LAP (Laverty et al., 2016) has been used in a number of studies. The set of protocols, originally developed to characterize assessment items across biology, chemistry and physics, has also been used for the characterization of ACS exam items that address general chemistry concepts (Reed et al., 2017). Additionally, the protocol has been used to illustrate how existing general chemistry and organic chemistry assessments can be modified to meet the criteria for three-dimensionality, as well as to characterize assessment items in organic chemistry (Stowe and Cooper, 2017; Underwood et al., 2018). We are aware of two studies that have applied the 3D-LAP criteria for Science Practices to characterize laboratory curricula at the general chemistry level. Carmel and colleagues (2019) examined the project-based laboratory curriculum at a large research university in the Midwest of the United States and found that it provided more opportunities for students to engage in science practices than the traditional laboratory curriculum. A similar study by Stephenson et al. (2023), in an international context, revealed that the Science Writing Heuristic laboratory curriculum also provided students with more opportunities to engage in science practices than the traditional laboratory curriculum.

This paper reports on the examination of the general chemistry curricula at a medium-sized, primarily undergraduate, public institution in the Pacific Northwest of the United States. The work reported here explores opportunities for students to engage in science practices across the general chemistry series of courses, and if/how those opportunities change as students progress from one course to the next, responding to the call for more cross-sectional studies (National Research Council, 2012b). Moreover, we couple the theoretical lenses of constructive alignment and accountable disciplinary knowledge to explain our findings and discuss the implications of our work. We view this examination of our lab curricula as a first step in responding to the increasing calls for evidence of student learning from labs (Rodriguez and Towns, 2018; Lowery Bretz, 2019; Seery, 2020). Given the consensus definitions of the practices and the practices criteria from the 3D-LAP, we can gather evidence of student engagement in science practices as a proxy for what students learn from labs. This work was guided by two main research questions (RQs):

1. How are science practices situated within the general chemistry laboratory curricula? (Which practices are students given opportunities to engage with? To what extent do students have opportunities to engage in these practices? In what part of the lab (in lab or out of lab) are opportunities to engage in practices available to students?)

2. What opportunities (if any), exist for promoting greater student engagement in science practices within the general chemistry curricula?

Theoretical framework

Constructive Alignment and Accountable Disciplinary Knowledge are the two main theories that guide our work in this study. Constructive alignment derives from constructivism and posits that learners construct deeper and more meaningful understandings when there is congruence between learning outcomes, teaching and learning tasks, and assessment (Biggs, 2014; Loughlin et al., 2020). Constructive alignment has been used to guide the redesign of a first-year laboratory course at the University of Bristol (Adams, 2020), leading to more learning opportunities and increased engagement and enjoyment of the course. The approach has also been used to support organic chemistry students’ development of argumentation skills (Deng et al., 2022), and the development of “graduate attributes,” such as critical thinking, in inorganic chemistry students (Damoyi and Makhathini, 2023). Recently, Van Wyk et al. (2025) applied constructive alignment in examining laboratory experiments for their potential to engage students in inquiry and science practices in an analytical chemistry course. In this study, we use constructive alignment to guide discussion of our characterization of science practices in the general chemistry laboratory curricula. We consider the intended outcomes to be engagement in the science practices as described in the Framework and the NGSS, and the criteria for science practices in the 3D-LAP (Laverty et al., 2016; Carmel et al., 2019). The tasks are the prompts embedded within the laboratory activities and artifacts that students are asked to engage with, and the assessments are the products that students are held accountable for that are associated with the laboratory activities.

Accountable Disciplinary Knowledge (ADK) “describes what is taken as disciplinary knowledge in a particular context”, and may take different forms for different persons, in different situations, and at different times (Stevens et al., 2008; Dziallas and Fincher, 2019). Although the concept of ADK was developed within engineering contexts, its utility spans the disciplines. For example, ADK has been used in a physics context to understand students' ideas about what counts as doing physics, and to characterize the kinds of activities in which students are engaged as they participate in a community of practice (Irving and Sayre, 2014). We adopt accountable disciplinary knowledge to examine student engagement in science practices in the context of the general chemistry laboratory. Student engagement in science practices is a commonly articulated goal within chemistry, especially chemistry laboratories, making it valued knowledge. In examining chemistry laboratory curricula for opportunities for students to engage in science practices through an ADK lens, we gain insight into “what counts as doing chemistry” within general chemistry lab spaces, and the extent to which we assess what we value in these spaces. Adopting ADK helps us identify any gaps between what is counted as laboratory chemistry knowledge (what students are held accountable for) and what we actually value as chemistry knowledge in the general chemistry laboratory. By coupling constructive alignment with accountable disciplinary knowledge in this study, we are able to not only identify contexts where the curricula provide opportunities for students to engage in science practices, but also pinpoint, and target for further development, any discrepancies between the intended and enacted curricula.

Methods

Study context

This study examined the general chemistry laboratory curricula for their potential to elicit evidence of student engagement in science practices using the 3D-LAP. This is part of a larger research project exploring science practices in laboratory curricula across introductory biology, chemistry and physics at a medium-sized, primarily undergraduate, public institution. General chemistry consists of a series of three courses over one academic year (one course each quarter), and is a requirement for most STEM majors. General Chemistry I (CHEM I), the first course in the series, includes a focus on Matter and Energy, Atomic Structure, Bonding and Stoichiometry, while General Chemistry II (CHEM II) emphasizes Solutions, Gases, Thermodynamics and Kinetics. The final course in the series, General Chemistry III (CHEM III), focuses on Equilibria and Electrochemistry. Each course consists of large lectures with smaller laboratory sections. CHEM I and II lab sections meet once each week for two hours. Each section is supervised by one instructor/graduate teaching assistant and accommodates up to 24 students. CHEM III lab sections also meet once weekly, but for three hours, supported by an instructor/graduate teaching assistant and a lab assistant. CHEM III sections accommodate up to 32 students each. This paper focuses on the laboratory component of general chemistry. The laboratory curricula used across the general chemistry series of courses are institutionally developed and enacted within an instructor-directed laboratory structure model where experimental goals, procedures and data analysis are prescribed. This laboratory model is widely used in science labs, especially in large enrollment laboratory courses where it allows for greater control of the laboratory space and increased safety. Given the relative popularity of this model, exploring opportunities for students to engage in science practices in contexts such as ours is important.

Characterizing laboratory materials for science practices

We examined the most recent laboratory manuals, pre-lab worksheets and post-lab assignments for each course in the general chemistry sequence. Our process for characterizing these materials as having the potential to elicit evidence of student engagement with the practices followed the method described by Carmel et al. (2019). Carmel and colleagues developed an extension to the Three-Dimensional Learning Assessment Protocol (3D-LAP), originally used to assess non-laboratory college science assessments, for characterizing the types of activities students encounter within the laboratory curricula. In modifying the 3D-LAP for use in laboratory settings, Carmel and colleagues added two practices, extending the original number from 8 to 10. The 10 practices in the expanded 3D-LAP criteria for science practices are: asking questions (P1), developing and using models (P2), planning investigations (P3), analyzing and interpreting data (P4), using math and computational thinking (P5), constructing explanations (P6), engaging in argument from evidence (P7), evaluating information (P8), communicating information (P9), and defining problems and designing solutions (P10). This expanded version of the 3D-LAP for science practices was used in this study. Each experimental activity (or lab), along with its pre- and post-lab assignments, was coded as a single unit. For a prompt or set of prompts to be coded as having the potential to engage students in one or more science practices, it needed to meet all the criteria that define participation in those practices. Such instances were identified as “hits”.  Prompts or activities that only partially met the criteria for any practice were not coded as having the potential for student engagement in that practice.
In addition to documenting instances where students were prompted to engage in science practices, we noted where those prompts occurred (in lab or out of lab), as well as instances where only one criterion for student engagement in a science practice was not met. These instances were identified as “near misses”. Prompts or activities that had the potential to elicit student engagement in more than one practice were coded as including multiple practices. A single practice could show up multiple times within a single experimental activity/lab. Fig. 1 and 2 provide examples of a hit and a near miss, respectively. We will revisit the near miss example in the Discussion section.
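The hit/near-miss coding rule can be expressed as a simple decision over per-criterion judgments. The sketch below is illustrative only (the function name and example data are ours, not part of the 3D-LAP), but it captures the logic: a hit requires every criterion to be met, while a near miss leaves exactly one criterion unmet.

```python
# Illustrative classifier for the coding rule described above.
# A "hit" requires ALL criteria for a practice to be met; a "near miss"
# leaves exactly one criterion unmet; anything else is not coded.
def classify(criteria_met: list[bool]) -> str:
    unmet = criteria_met.count(False)
    if unmet == 0:
        return "hit"
    if unmet == 1:
        return "near miss"
    return "not coded"

# Example using the four 3D-LAP criteria for developing and using
# models (P2): a prompt meeting criteria 1-3 but not criterion 4
# would be coded as a near miss.
print(classify([True, True, True, True]))    # hit
print(classify([True, True, True, False]))   # near miss
print(classify([True, False, True, False]))  # not coded
```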
Fig. 1 Activity coded as hit for developing and using models showing specific prompts mapped to 3D-LAP criteria.

Fig. 2 Activity coded as near miss for developing and using models showing specific descriptions mapped to 3D-LAP criteria.

The coding process

Multiple authors independently examined and coded all laboratory materials using the expanded 3D-LAP criteria for science practices. Before full-scale coding and prior to discussion of differences, inter-rater agreement was calculated. Coding of the general chemistry laboratory materials took place over about 24 months. Individual coders first read and analyzed materials for each laboratory session prior to meeting in pairs or trios. Meetings in pairs (or trios) were followed by whole team meetings to share and discuss codes, resolving any discrepancies to reach full consensus (100% agreement) on final science practices assignments.
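As a concrete illustration of the agreement check, a minimal agreement calculation between two coders might look like the following. The paper does not specify which agreement statistic was used, so simple percent agreement is shown here as one common choice, and the example codes are hypothetical.

```python
# Percent agreement between two independent coders over the same prompts.
# The code labels below are invented examples, not data from this study.
def percent_agreement(coder_a: list[str], coder_b: list[str]) -> float:
    assert len(coder_a) == len(coder_b), "coders must rate the same prompts"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

a = ["P4", "P5", "none", "P2", "P4"]
b = ["P4", "P5", "P6",   "P2", "P4"]
print(percent_agreement(a, b))  # 80.0
```

Disagreements (like the third prompt above) would then be resolved in pair/trio and whole-team meetings until full consensus was reached.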

Findings

RQ 1: How are science practices situated within the general chemistry laboratory curricula?

To address this broad question, we divided it into sub-questions that focused on particular elements of the general chemistry laboratory. We present the findings using these sub-questions.
Which practices are students given opportunities to engage with? We examined all 24 laboratory experiments across the general chemistry series for evidence of potential to engage students in the science practices using the constructed response version of the 3D-LAP criteria for science practices. In Fig. 3, colored areas show where opportunities for engaging in science practices were identified for the 9 experiments in CHEM I, the first course in the series. (Heat maps showing opportunities for students to engage in practices in the next two courses in the series can be found in the SI.) Experiments were coded as providing an opportunity to engage students in a science practice (hits) if the practice showed up at least once, that is, if all the criteria for that practice described in the 3D-LAP were found in a single prompt or set of related prompts. We found evidence that 18 of 24 (75%) experiments in the general chemistry laboratory curricula provided students with opportunities to engage in at least one practice. Collectively, students had opportunities to engage in 5 practices – math and computational thinking (P5), analyzing and interpreting data (P4), constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2) – over the entire curricula. Math and computational thinking (P5) and analyzing and interpreting data (P4) were the most strongly emphasized throughout the curricula, with potential to engage students in these practices identified in 14/24 experiments. Developing and using models (P2), constructing explanations (P6), and engaging in argument from evidence (P7) each appeared in 4/24 experiments. Our analyses also revealed two consistent pairings: math and computational thinking (P5) almost always appeared alongside analyzing and interpreting data (P4), and constructing explanations (P6) almost always appeared alongside engaging in argument from evidence (P7).
This is unsurprising, given the significant overlap between the criteria for these pairs of practices. No experiment met all the criteria for having the potential to engage students in asking questions (P1), evaluating information (P8), communicating information (P9), defining problems and designing solutions (P10), or planning investigations (P3). The maximum, average, and modal numbers of practices that students had opportunities to engage with during a laboratory session were 5, 1.6, and 2, respectively.
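The per-experiment summary statistics reported here (maximum, mean, and mode of practices per lab) can be reproduced from a binary experiments-by-practices matrix. The matrix below is a small invented example to show the computation, not the study's actual coding data.

```python
from statistics import mean, mode

# Rows = experiments, columns = the ten 3D-LAP practices (P1-P10).
# A 1 means the experiment was coded as a hit for that practice.
# These rows are invented for illustration only.
coded = [
    [0, 1, 0, 1, 1, 0, 0, 0, 0, 0],  # hits for P2, P4, P5
    [0, 0, 0, 1, 1, 0, 0, 0, 0, 0],  # hits for P4, P5
    [0, 0, 0, 0, 0, 0, 0, 0, 0, 0],  # no practices identified
    [0, 0, 0, 0, 1, 1, 0, 0, 0, 0],  # hits for P5, P6
]
counts = [sum(row) for row in coded]  # practices per experiment
print(max(counts), round(mean(counts), 1), mode(counts))  # 3 1.8 2
```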
Fig. 3 Heat map showing presence (and absence) of science practices in CHEM 1.

With respect to opportunities presented within each course for students to engage in science practices, 7 of the 9 experiments examined for CHEM I, the first course in the series, showed potential to engage students in at least one practice. We found that the laboratory materials had the potential to engage students in developing and using models (P2), analyzing and interpreting data (P4), math and computational thinking (P5), constructing explanations (P6) and engaging in argument from evidence (P7), with analyzing and interpreting data (P4) and mathematical thinking (P5) present in the greatest number of experiments. The maximum number of practices that students had opportunities to engage with during an experiment in CHEM I was 3, with a mean of 1.4 and a mode of 2. CHEM II, the second in the series of general chemistry courses, comprises eight experiments. Overall, the experiments in this course demonstrated potential to engage students in 5 practices—analyzing and interpreting data (P4), mathematical thinking (P5), constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2). As with CHEM I, opportunities to engage in analyzing and interpreting data (P4) and mathematical thinking (P5) were identified in most experiments. The maximum number of practices that students had opportunities to engage with during a single experiment in CHEM II was 5, with a mean of 2.4, and a mode of 3 practices. Seven experiments comprise the CHEM III laboratory curriculum. Collectively, the CHEM III curriculum showed potential to engage students in only two practices – analyzing and interpreting data (P4) and mathematical thinking (P5) – with three experiments characterized as not having potential to engage students in any practice. The maximum number of practices that students had opportunities to engage in per experiment was 2, with an average of 1, and a mode of 2.

To what extent do students have opportunities to engage in the practices identified in the general chemistry curricula? We also explored whether students had multiple opportunities to engage in the 5 practices identified during an experiment. The intensity of the color highlights in Fig. 3 is an indication of the frequency with which students had opportunities to engage in each practice. The lightest shade indicates a single instance or opportunity, while increasing intensities show an increasing number of instances, up to 5 or more. Our examination revealed that students sometimes had multiple opportunities to engage in the same practice within a laboratory session (in 11/24 experiments). Our data show that most multiple opportunities were associated with analyzing and interpreting data (P4) and math and computational thinking (P5). In CHEM I, students had a single opportunity each to engage in constructing explanations (P6) and engaging in argument from evidence (P7); that is, there were no multiple opportunities to engage in these practices in CHEM I. No opportunities were identified for engaging in constructing explanations (P6), engaging in argument from evidence (P7), or developing and using models (P2) in CHEM III. Only one experiment in the entire course series was identified as providing multiple opportunities for constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2). These practices therefore showed up fewer times and with lower frequencies, suggesting that the lab curricula did not provide students with many opportunities to develop proficiency with them.
How do opportunities for engaging in practices change as we go across the series of courses? We found that students had opportunities to engage in the same five science practices in both CHEM I and CHEM II (that is, math and computational thinking (P5), analyzing and interpreting data (P4), constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2)). However, while there was no change in the diversity of practices students had opportunities to engage in as they went from CHEM I to II, the number of opportunities to engage in constructing explanations (P6) and engaging in argument from evidence (P7), although limited, increased. The number of opportunities to engage in math and computational thinking (P5) also increased from CHEM I to II, while opportunities for analyzing and interpreting data (P4) remained fairly constant. The number of experiments that provided opportunities for students to engage in developing and using models (P2) remained the same from CHEM I to II, but there was a decline in the frequency of opportunities. Interestingly, CHEM III is where students had the fewest opportunities to engage in science practices, with only two practices – analyzing and interpreting data (P4) and math and computational thinking (P5) – identified in three experiments.
In what part of the lab (in lab or out of lab) are opportunities to engage in practices available to students? Considering the presence of practices in each experiment and the number of times each practice showed up, students had 72 opportunities to engage in science practices across the general chemistry laboratory curricula. Forty of the 72 opportunities showed up in lab (that is, during the lab session), while the other 32 presented out of lab (in prelab or postlab activities). In CHEM I, students had 28 opportunities to engage in science practices, 23 of which were in lab. In CHEM II, there were 33 opportunities, with 12 in lab and 21 out of lab. In CHEM III, students had a total of 11 opportunities, 5 of which showed up in lab. Overall, 56% of all opportunities to engage in practices showed up in lab, while 44% showed up in activities to be done out of lab. However, the percentages of opportunities in and out of lab varied widely from course to course. Fig. 4 is a heat map showing in lab and out of lab opportunities for CHEM I. Similar heat maps for CHEM II and CHEM III are in the SI.
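The in-lab and out-of-lab tallies above can be reproduced with a short consistency-check script. The per-course totals and in-lab counts are those stated in the text; the out-of-lab counts for CHEM I and CHEM III are obtained by subtraction:

```python
# Opportunities to engage in science practices, as reported in the text.
# Format: course -> (total opportunities, opportunities in lab);
# out-of-lab counts follow by subtraction.
reported = {
    "CHEM I": (28, 23),
    "CHEM II": (33, 12),
    "CHEM III": (11, 5),
}

total = sum(t for t, _ in reported.values())   # 28 + 33 + 11 = 72
in_lab = sum(i for _, i in reported.values())  # 23 + 12 + 5 = 40
out_of_lab = total - in_lab                    # 72 - 40 = 32

print(f"total = {total}, in lab = {in_lab}, out of lab = {out_of_lab}")
# Shares round to the 56% / 44% split reported in the text.
print(f"in lab share = {in_lab / total:.0%}, out of lab share = {out_of_lab / total:.0%}")
```

Running this confirms that the per-course counts sum to the reported totals and rounded percentages.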
image file: d5rp00272a-f4.tif
Fig. 4 Heat map showing in lab and out of lab opportunities to engage in science practices in CHEM I.

RQ 2: What opportunities (if any), exist for promoting greater student engagement in science practices within the general chemistry curricula?

We were intrigued by instances where a prompt or set of related prompts almost had the potential to lead to engagement in a science practice, that is, the prompts met all but one criterion – near misses. We therefore examined and documented the number of near misses, which practices were associated with near miss opportunities, and where these near misses showed up in the general chemistry laboratory curricula. We identified 105 near miss opportunities in 5 science practices (analyzing and interpreting data (P4), mathematical thinking (P5), constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2)) across the general chemistry curricula. Interestingly, near miss opportunities were identified only for science practices that were already present, and therefore did not increase the diversity of practices that students could potentially engage in. In terms of the distribution of near miss opportunities among the 5 practices identified, approximately 25% each were in analyzing and interpreting data (P4) and mathematical thinking (P5), while constructing explanations (P6) and engaging in argument from evidence (P7) each accounted for 18% of near miss opportunities. Developing and using models (P2) accounted for 14% of near miss opportunities. More than 67% of near miss opportunities were identified in CHEM I, 15% in CHEM II, and 17% in CHEM III. Near miss opportunities showed up more frequently in lab, accounting for 73% of all opportunities, while out of lab near misses accounted for the remaining 27%. Fig. 5 shows the near miss opportunities and frequencies identified for CHEM I. Heat maps showing near miss opportunities for CHEM II and III, as well as where near miss opportunities show up in all the courses in the series, can be found in the SI.
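The per-practice shares reported above are rounded. As an illustration only (these per-practice counts are our hypothetical reconstruction, not data from the study), one set of counts that sums to the 105 identified near misses and reproduces the rounded percentages is:

```python
# Hypothetical per-practice near-miss counts (illustrative reconstruction only):
# chosen so that they sum to 105 and round to the percentages reported in the text.
near_misses = {
    "P4 analyzing and interpreting data": 26,  # rounds to ~25%
    "P5 mathematical thinking": 26,            # rounds to ~25%
    "P6 constructing explanations": 19,        # rounds to ~18%
    "P7 argument from evidence": 19,           # rounds to ~18%
    "P2 developing and using models": 15,      # rounds to ~14%
}

total = sum(near_misses.values())  # 26 + 26 + 19 + 19 + 15 = 105
print(f"total near misses = {total}")
for practice, n in near_misses.items():
    print(f"{practice}: {n}/{total} = {n / total:.0%}")
```

This kind of sanity check is useful when reported percentages are rounded, since independently rounded shares need not sum to exactly 100%.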
image file: d5rp00272a-f5.tif
Fig. 5 Heat map showing near miss opportunities in CHEM I.

Discussion

Hits and misses in the general chemistry lab

Our examination of how science practices are situated within the general chemistry laboratory curricula revealed that of the 10 science practices described in the expanded 3D-LAP criteria, students had opportunities to engage in 5 practices; these opportunities appeared in 75% of experiments, with opportunities to engage in about 2 practices per experiment, on average, over the three-course series. The 5 practices are analyzing and interpreting data (P4), math and computational thinking (P5), constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2), and are considered hits since we found evidence that all the criteria for these practices were met in the opportunities presented. Considering that the laboratory curricula examined were not developed using an explicit practice-focused design, the level of potential to engage students in practices already identified in the curricula represents a good foundation on which more practice-focused curricula can be built. While we are not suggesting that students should engage in all science practices in a single experiment, given the importance within science education of student development of critical thinking (of which the science practices can be considered to be disaggregated components), students should have opportunities to engage in all practices over an entire laboratory curriculum/sequence. Engagement in science practices in laboratory spaces requires that students demonstrate what they know and are able to do with their knowledge (National Research Council, 2012a). Practices-focused curricula provide an avenue for this demonstration, and can provide responses to calls for evidence of what, if anything, students are learning from labs (Rodriguez and Towns, 2018; Lowery Bretz, 2019; Seery, 2020).

We recognize that the science practices that students did not have opportunities to engage with in our examination (misses) are those for which meeting the criteria may require more effort, that is, asking questions (P1), defining problems and designing solutions (P10), planning investigations (P3), communicating information (P9), and evaluating information (P8). Asking questions is considered to be the foundation of all science. It is therefore important for students to have opportunities to define problems, develop testable questions about aspects of their problem, plan investigations to answer their questions, communicate their findings to a non-expert audience, and provide critiques of the work of their peers or others. The complex problems that our society and world face need solutions that are based in true critical thinking, which means engaging with all the practices. While we are cognizant of the challenges associated with large classes such as general chemistry, which make it harder to support all students in engaging in these practices, more intentional planning to incorporate missed practices could include the adoption of reformed-type labs such as project-based, problem-based, Science Writing Heuristic (SWH), and Argument-Driven Inquiry (ADI) labs (Walker et al., 2016; Carmel et al., 2017, 2019; Hike and Hughes-Phelan, 2020; Hosbein and Walker, 2022; Stephenson et al., 2023; Polk and Santos, 2025), among others. Creating these opportunities may necessitate reducing some opportunities for practices that are overrepresented to allow room for those that are not represented at all. While the adoption of reformed labs should go a long way towards more practices-focused curricula, this must be part of a deliberate effort in curricular development and design at the institutional level. Providing students with opportunities to engage in the practices that are currently absent from the curricula, and to do science in the ways that scientists do, better prepares them to find sustainable solutions to complex problems.

It is particularly noteworthy that in CHEM III (the last course in the series) students had opportunities to engage with only 2 practices (math and computational thinking (P5) and analyzing and interpreting data (P4)), and with low frequencies over the life of the course. Three (3) of the 7 experiments in this course did not provide opportunities for students to engage in any practice. The absence of a variety of practices for students to engage in could be attributed to an emphasis on the development of particular skills in this course (e.g. technical skills), as well as intentional removal of more explicit prompting/scaffolding with the assumption/expectation that students are at the point in their academic careers where they will engage in other practices in the absence of explicit prompting. While it is possible that some students may engage in deeper exploration that may lead to engagement in practices not explicitly prompted for, many students, especially at the highly competitive level of general chemistry, are likely to resort to the use of Fatima's rules such as focusing on only what is required to obtain a high score, rather than meaningful learning (Larson, 1995). We believe that it is possible to simultaneously emphasize both practices and skills in all courses as necessary, but this will only happen with intentional efforts. The final course in the series has much potential for showcasing student lab learning as this is the point where students have acquired the most experience with general chemistry and have begun to make connections between concepts, and may therefore provide the most natural place for them to demonstrate what they know through engagement in more science practices. However, students need explicit prompts to look for, make, and describe these connections. 
Therefore, rather than a decline in the number and frequency of practices and explicit prompting as students engage in the last course in the series, we suggest that students can be challenged and held accountable for learning chemistry through engagement in more practices, and with greater frequency.

Not only do students need opportunities to engage in all science practices, they also need opportunities to practice with the practices and develop proficiency. Practicing with the practices necessitates that students have multiple opportunities to engage with them, including grade-free opportunities, to reduce anxiety and increase comfort and learning (Shepherd and Garrett-Roe, 2024; Yik et al., 2024). Our examination revealed that while students often had many opportunities to engage in mathematical thinking (P5) and analyzing and interpreting data (P4), opportunities to engage in the other 3 practices identified were less common.

Mathematical thinking in the general chemistry lab

Our examination also revealed a disproportionate emphasis on math and computational thinking (P5) and analyzing and interpreting data (P4) across the three courses. This is consistent with findings that more traditional curricula tend to emphasize these two practices more than others (Carmel et al., 2017; Stephenson et al., 2023). Interestingly, our analysis revealed that many of the characterizations for analyzing and interpreting data (P4) were arrived at through a math gateway, that is, through analyzing and interpreting mathematical data, further supporting our finding that the chemistry laboratory curricula are math-laden. Undoubtedly, math and computational thinking (P5) is critical for students’ success in chemistry as they engage in sensemaking about chemical data and the mathematical models needed for understanding chemistry concepts. However, because most students who enroll in general chemistry have broad interests and majors, an overemphasis on math can prevent many students from advancing in their chosen field or career; that is, math proficiency becomes a gatekeeper in chemistry courses. The reputation of general chemistry as a gatekeeper course is well documented in the science education literature. This is particularly concerning, and becomes an equity issue, for marginalized students who often enter general chemistry with fewer experiences with math than their peers, and are therefore starting at a disadvantage mathematically (Ralph et al., 2022). In math-laden chemistry curricula where students are held accountable for a lot of math, students who struggle with math are likely to expend their mental energy trying to understand or memorize the math, leaving little capacity to focus on learning chemistry. While math is integral to learning some chemistry concepts, it is important to remember that chemistry is not math, and by making math the focus of chemistry, we may be “misrepresent[ing] the intellectual work of chemistry” (Ralph et al., 2022). Moreover, a laser focus on math and analysis, to the neglect of the other practices, leaves students short on other competencies, thereby doing them a disservice. As we evaluate chemistry curricula, it is important that we hold students accountable for what is central and most valued in learning chemistry, lest what we assess become the things that get valued (Resnick, cited in Ebert-May and Emery, 2017). Clear messaging to students about what really matters in learning chemistry is an important pillar of equity.

In lab and out of lab: locating opportunities for practice

Considering where students are presented with opportunities to engage in science practices (that is, whether in lab or out of lab), overall students had marginally more opportunities to engage with practices in lab than out of lab, with the number of opportunities varying widely from quarter to quarter. During the laboratory session (in lab) students work with their peers and TAs and therefore have a supportive community where their engagement in science practices can be encouraged and scaffolded. Peers and near peers, especially in collaborative environments where learners can work together to construct understandings, have been shown to be effective in helping students learn (Freeman et al., 2014; Theobald et al., 2020; Clements et al., 2022; Mataka et al., 2023). Such environments support active learning and provide scaffolding that can lead to deeper and more meaningful learning. We therefore suggest that in lab opportunities to engage in science practices should be capitalized on, with deliberate emphasis on providing more opportunities for students to engage in science practices in lab. We also believe that greater intentionality around the number and frequency of opportunities that students have to engage in practices in lab within each course during curricular design and development would be beneficial for students.

Near miss opportunities in the general chemistry lab

Examining near miss opportunities in the laboratory curricula provides a glimpse of what the curricula could look like without major overhaul or modification. While major modification will be needed to incorporate practices for which the curricula do not currently provide opportunities (misses), near misses represent the “low-hanging fruit”, changes that can be made less painfully and in the shorter term, to turn almost-opportunities to engage in practices into actual opportunities. While the near miss opportunities identified during our examination did not reveal opportunities to diversify the practices in which students engage, they represent opportunities to significantly increase engagement in constructing explanations (P6), engaging in argument from evidence (P7), and developing and using models (P2). Most near miss opportunities were missing the last criterion for each practice, usually associated with providing reasoning, interpretation, or consequence. This criterion can be considered the linchpin of a task, demanding that students demonstrate their understanding through meaningful learning rather than superficial responses, rote memorization, and regurgitation (Cooper et al., 2012, 2017). Because more near miss opportunities appear in lab than out of lab, near misses also represent potential opportunities for students to engage in practices in a supportive community of their peers and instructors/graduate teaching assistants. Until the curricula are modified, lab instructors and teaching assistants may be able to facilitate student engagement in science practices through near misses by explicitly posing and discussing prompts that fulfill the reasoning/interpretation/consequence criterion. However, to hold students accountable for learning chemistry in such cases, it may be necessary to ask students to submit their responses to the prompts.
Writing appropriate prompts to turn near misses and misses into hits, and incorporating them into the curricula, will no doubt require time and effort. Developing the prompts will require continuous effort, as writing effective prompts is usually an iterative process. Moreover, we recognize that tasks and activities that do not ask students to provide reasoning/interpretation/consequence are easier and less time-intensive to grade, and anything that requires more time on grading may strain an already overburdened system. We therefore believe that a strategic approach focused on deep and meaningful learning of chemistry, which might include a better balance between opportunities for math and computational thinking (P5) and analyzing and interpreting data (P4) and opportunities for other practices, might be needed. It may also be possible to coach and train students to write succinctly, so that their responses become easier to grade over time. As opportunities for students to engage in and develop proficiency in practices increase, the quality of students’ responses should also increase, making them easier to grade. Moreover, if near miss opportunities are turned into hits, more opportunities to hold students accountable for learning chemistry would be created. Below (Fig. 6 and 7) we share two examples of near misses from our lab curricula and show how these could be scaffolded/modified to become hits (that is, to meet all the criteria for a practice). While the experimental information presented in Fig. 6 and 7 has been condensed for brevity, this does not alter our characterization in any way. By giving careful attention to alignment between the criteria for each practice (as set out in the expanded 3D-LAP) and the tasks or prompts we want students to respond to, near misses can be transformed into hits.
image file: d5rp00272a-f6.tif
Fig. 6 Scaffolding of a near miss to meet the 3D-LAP criteria for developing and using models.

image file: d5rp00272a-f7.tif
Fig. 7 Scaffolding of a near miss to meet the 3D-LAP criteria for constructing explanations and engaging in argument from evidence.

Limitations

The work reported in this paper focused on the general chemistry curricula at a single medium-sized, primarily white, public institution in the Pacific Northwest of the United States. As such, our findings and insights are not generalizable to all chemistry courses, or to institutions with different demographics. We have provided detailed descriptions of our work to allow readers to determine the extent to which the findings and implications of this study may be transferable to their contexts.

This study focused on characterizing opportunities within the general chemistry laboratory curricula for students to engage in science practices. Opportunities to engage in science practices do not guarantee that students will actually engage in science practices. However, Stephenson et al. (2023) found that when students were explicitly prompted to engage in practices, they were more likely to do so.

Our characterization of opportunities to engage with science practices in the general chemistry laboratory curricula was limited to the laboratory artifacts we examined and the descriptions provided in the 3D-LAP criteria. It is possible that other opportunities to engage in science practices in the laboratory were created by teaching assistants and instructors. In addition, the lab artifacts may have included science practices other than those described in the Framework for K-12 Science Education or included in the expanded 3D-LAP for science practices. Any such additional opportunities or science practices are not captured in this study.

Implications for teaching and learning and research

Our work shows that the expanded 3D-LAP criteria for science practices provide an effective tool for characterizing chemistry laboratory curricula. Beyond identifying which science practices are present (hits) and which are not (misses), the 3D-LAP can help to pinpoint where near miss opportunities appear in the curricula. This may be important for curricular design and development decisions, especially where resources are limited. Near miss opportunities represent changes to the curricula that may be addressed within the short term, and perhaps without a great investment of resources. Our characterization work also provides additional insights into the laboratory curricula, and so we believe this is an important first step for curriculum designers and developers considering lab reform. For example, we were able to identify practices that are overrepresented and underrepresented in our curricula. This is important because the distribution of practices within and across the curricula may have equity implications. Beyond characterization, the 3D-LAP is also effective in guiding the modification or scaffolding of tasks, especially near misses. These applications of the 3D-LAP underscore the importance of alignment between learning outcomes, tasks, and assessments. In these ways, the 3D-LAP facilitates movement toward more practice-focused curricula.

Addressing misses in the general chemistry laboratory curricula is likely to require a more intentional commitment of resources over time. Reformed-type labs such as project-based, problem-based, Science Writing Heuristic (SWH), and Argument-Driven Inquiry (ADI) labs available in the public domain could provide alternatives for consideration, reducing the resource and time commitments associated with the reform process at the institutional level. We are aware that some institutions use laboratory instructional materials that are produced by academic publishers. We suggest that instructors and academic publishers work even more closely to ensure that laboratory instructional materials reflect current best practices in science education (such as an emphasis on science practices).

If teaching assistants and lab instructors are to be effective in helping general chemistry students provide evidence of what they know through the practices in laboratory spaces while we work at curriculum reform, more intentionality and preparation will be needed. As teaching assistants and instructors prepare for lab sessions, it might be necessary to prepare additional prompts that deliberately require students to think about reasoning, interpretations, and consequences. Training in how to write effective prompts that meet the requisite criteria will be needed, and this could facilitate more interdisciplinary, interdepartmental, and inter-institutional collaborations to make the best use of resources. The way teaching assistants are trained will require a shift, but this should not necessarily add time to training.

Research is needed to determine the extent to which students take up opportunities to engage in science practices. Our research is currently exploring the relationship between opportunities created to engage in science practices by the curricula and what students actually do in the laboratory, but more work is needed in this area. Exploring whether students who are exposed to practices-focused curricula adopt this as a way of thinking and practicing when they are no longer prompted to engage in practices also represents another important avenue for study.

Conclusion

We examined our general chemistry laboratory curricula through a science practices lens and applied constructive alignment and accountable disciplinary knowledge in interpreting our findings. Combining these two frameworks is novel in chemistry education research, allowing us to center science practices as valued chemistry knowledge and to identify strengths, gaps, and opportunities in that knowledge. Our work reveals that while there is some emphasis on science practices in the laboratory curricula (hits), there are also significant gaps (misses) and opportunities (near misses). In conceptualizing practices that were almost met as near misses, this work provides novel and useful framing that can help instructors, curriculum designers, and developers see these near misses as more accessible pathways to hits. We also identified an inordinate concentration on math and computational thinking (P5) and analyzing and interpreting data (P4), which may disproportionately impact marginalized students. Considering that the practices are the disaggregated components of critical thinking, students need engagement in all practices, not just some, in order to develop proficiency in critical thinking and scientific literacy. The 3D-LAP demonstrated great utility in helping us identify successes, gaps, and opportunities with respect to science practices in our laboratory curricula. This work therefore makes a significant contribution to the small body of work using the 3D-LAP to characterize laboratory assessment, helping to further establish the 3D-LAP as an effective tool for this purpose. Science practices-focused laboratory curricula can help us identify areas of strength, as well as areas for improvement, in the curricula, and better discern what students know and are able to do, as well as what they don’t know and are not able to do. This insight would go a long way in responding to calls for evidence of what, if anything, students learn from labs.
While our work emphasizes the practices, we are by no means minimizing the importance of content. Rather, we agree wholeheartedly with Willingham (2007) that engagement in critical thinking requires something for students to think critically about (that is, content). Indeed, our understanding of the practices takes into consideration the necessity of content knowledge as the practices provide evidence of what students know (content) and what they are able to do (demonstration/evidence) with their knowledge.

Conflicts of interest

There are no conflicts to declare.

Data availability

Data supporting our analyses and findings are included as part of our supplementary information (SI). Supplementary information is available. See DOI: https://doi.org/10.1039/d5rp00272a.

Acknowledgements

We would like to acknowledge Luke Ghallahorne's assistance with data visualization. This work was supported by National Science Foundation DUE 2427871 (2044432).

References

  1. Adams C. J., (2020), A Constructively Aligned First-Year Laboratory Course, J. Chem. Educ., 97 (7), 1863–1873.
  2. Bain K., Bender L., Bergeron P., Caballero M. D., Carmel J. H., Duffy E. M., Ebert-May D., Fata-Hartley C. L., Herrington D. G., Laverty J. T., Matz R. L., Nelson P. C., Posey L. A., Stoltzfus J. R., Stowe R. L., Sweeder R. D., Tessmer S. H., Underwood S. M., Urban-Lurain M. and Cooper M. M., (2020), Characterizing College Science Instruction: The Three-Dimensional Learning Observation Protocol, PLoS One, 15(6), e0234640.
  3. Becker N. M., Rupp C. A. and Brandriet A. R., (2017), Engaging students in analyzing and interpreting data to construct mathematical models: an analysis of students’ reasoning in a method of initial rates task, Chem. Educ. Res. Pract., 18, 798–810.
  4. Biggs J., (2014), Constructive alignment in university teaching, HERDSA Rev. High. Educ., 36(3), 5–22.
  5. Brandriet A., Rupp C. A., Lazenby K. and Becker N. M., (2018), Evaluating students' abilities to construct mathematical models from data using latent class analysis, Chem. Educ. Res. Pract., 19, 375–391.
  6. Carmel J. H., Herrington D. G., Posey L. A., Ward J. S., Pollock A. M. and Cooper M. M., (2019), Helping Students to “Do Science”: Characterizing Scientific Practices in General Chemistry Laboratory Curricula, J. Chem. Educ., 96(3), 423–434.
  7. Carmel J. H., Ward J. S. and Cooper M. M., (2017), A Glowing Recommendation: A Project-Based Cooperative Laboratory Activity to Promote Use of the Scientific and Engineering Practices, J. Chem. Educ., 94(5), 626–631.
  8. Chen C. and Terada T., (2021), Development and validation of an observation-based protocol to measure the eight scientific practices of the next generation science standards in K-12 science classrooms, J. Res. Sci. Teach., 58(10), 1489–1526.
  9. Clements T. P., Friedman K. L., Johnson H. J., Meier C. J., Watkins J., Brockman A. J. and Brame C. J., (2022), “It made me feel like a bigger part of the STEM community”: Incorporation of Learning Assistants Enhances Students’ Sense of Belonging in a Large Introductory Biology Course, CBE Life Sci. Educ., 21(2), ar26.
  10. Cooper M. M., (2016), It Is Time to Say What We Mean, J. Chem. Educ., 93(5), 799–800.
  11. Cooper M. M., Kouyoumdjian H. and Underwood S. M., (2016), Investigating Students’ Reasoning about Acid–Base Reactions, J. Chem. Educ., 93(10), 1703–1712.
  12. Cooper M. M., Posey L. A. and Underwood S. M., (2017), Core Ideas and Topics: Building Up or Drilling Down? J. Chem. Educ., 94(5), 541–548.
  13. Cooper M. M., Underwood S. M., Hilley C. Z. and Klymkowsky M. W., (2012), Development and Assessment of a Molecular Structure and Properties Learning Progression, J. Chem. Educ., 89(11), 1351–1357.
  14. Damoyi N. and Makhathini T., (2023), Advances in Social Science, Education, and Humanities Research, in M. Makua and M. Akinlolu et al. (ed.), Development of some graduate attributes through constructive alignment for formative assessments of practicals: A case of chemistry students, Paris: Atlantis Press, pp. 299–319.
  15. Deng J. M., Carle M. S. and Flynn A. B., (2022), Student Reasoning in Organic Chemistry, in Graulich N. and Schultz G. (ed.) Students’ Reasoning in Chemistry Arguments and Designing Resources Using Constructive Alignment, UK: Royal Society of Chemistry, pp. 74–89.
  16. Dziallas S. and Fincher S., (2019), Accountable Disciplinary Knowledge in Computing Education: A Case-Comparative Approach, Proc. 2019 ACM Conf. Int. Comput. Educ. Res., 1–9.
  17. Ebert-May D. and Emery N., (2017), Teaching like a scientist: Assessing your assessments, Front. Ecol. Environ., 15(5), 227.
  18. Facione P., (1990), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, Millbrae, CA: California Academic Press.
  19. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111 (23), 8410–8415.
  20. Gao R., Lloyd J., Emenike B. U., Quarless D., Kim Y. and Emenike M. E., (2021), Using Guiding Questions to Promote Scientific Practices in Undergraduate Chemistry Laboratories, J. Chem. Educ., 98(12), 3731–3738.
  21. Gormally C., Brickman P. and Lutz M., (2012), Developing a test of scientific literacy skills (TOSLS): Measuring undergraduates' evaluation of scientific information and arguments, CBE Life Sci. Educ., 11(4), 364–377.
  22. Hike N. and Hughes-Phelan S. J., (2020), Using the Science Writing Heuristic to Support NGSS-Aligned Instruction, J. Chem. Educ., 97(2), 358–367.
  23. Hosbein K. and Walker J., (2022), Assessment of Scientific Practice Proficiency and Content Understanding Following an Inquiry-Based Laboratory Course, J. Chem. Educ., 99(12), 3833–3841.
  24. Irving P. W. and Sayre E. C., (2014), Conditions for building a community of practice in an advanced physics laboratory, Phys. Rev. ST Phys. Educ. Res., 10(1), 1–16.
  25. Larson J. O., (1995), Fatima's rules and other elements of an unintended chemistry curriculum, Paper presented to the American Educational Research Association Annual Meeting, San Francisco, April.
  26. Laverty J. T., Underwood S. M., Matz R. L., Posey L. A., Carmel J. H., Caballero M. D., Fata-Hartley C. L., Ebert-May D., Jardeleza S. E. and Cooper M. M., (2016), Characterizing College Science Assessments: The Three-Dimensional Learning Assessment Protocol, PLoS One, 11(9), e0162333.
  27. Loughlin C., Lygo-Baker S. and Lindberg-Sand Å., (2020), Reclaiming constructive alignment, Eur. J. High. Educ., 11(2), 119–136.
  28. Lowery Bretz S., (2019), Evidence for the Importance of Laboratory Courses, J. Chem. Educ., 96(2), 193–195.
  29. Mataka L. M., Saderholm J. C. and Hodge T., (2023), Developing Undergraduate Learning Assistants' Skills in Guiding Science Learning, Sci. Educ. Int., 34(2), 142–150.
  30. National Research Council, (2012a), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, Washington, DC: The National Academies Press.
  31. National Research Council, (2012b), Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: The National Academies Press.
  32. National Research Council, (2013), Next Generation Science Standards: For States, By States, Washington, DC: The National Academies Press.
  33. Noyes K. and Cooper M. M., (2019), Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study, J. Chem. Educ., 96(9), 1821–1832.
  34. Osborne J., (2014), Teaching Scientific Practices: Meeting the Challenge of Change, J. Sci. Teach. Educ., 25, 177–196.
  35. Polk M. E. and Santos D. L., (2025), Science Practices in Action: Group Engagement with Different Degrees of Inquiry in General Chemistry Laboratory, J. Chem. Educ., 102(4), 1380–1388.
  36. Ralph V., Scharlott L. J., Schafer A. G. L., Deshaye M. Y., Becker N. M. and Stowe R. L., (2022), Advancing Equity in STEM: The Impact Assessment Design Has on Who Succeeds in Undergraduate Introductory Chemistry, JACS Au, 2(8), 1869–1880.
  37. Reed J. J., Brandriet A. R. and Holme T. A., (2017), Analyzing the Role of Science Practices in ACS Exam Items, J. Chem. Educ., 94(1), 3–10.
  38. Rodriguez J. G., Bain K., Towns M. H., Elmgren M. and Ho F. M., (2019), Covariational reasoning and mathematical narratives: investigating students’ understanding of graphs in chemical kinetics, Chem. Educ. Res. Pract., 20, 107–119.
  39. Rodriguez J. G. and Towns M. H., (2018), Modifying Laboratory Experiments to Promote Engagement in Critical Thinking by Reframing Prelab and Postlab Questions, J. Chem. Educ., 95(12), 2141–2147.
  40. Seery M. K., (2020), Establishing the Laboratory as the Place to Learn How to Do Chemistry, J. Chem. Educ., 97(6), 1511–1514.
  41. Shepherd T. D. and Garrett-Roe S., (2024), Low-Stakes, Growth-Oriented Testing in Large-Enrollment General Chemistry 1: Formulation, Implementation, and Statistical Analysis, J. Chem. Educ., 101(8), 3097–3106.
  42. Shiroda M., Franovic C. G.-C., de Lima J., Noyes K., Babi D., Beltran-Flores E., Kesh J., McKay R. L., Persson-Gordon E., Cooper M. M., Long T. M., Schwarz C. V. and Stoltzfus J. R., (2024), Examining and Supporting Mechanistic Explanations Across Chemistry and Biology Courses, CBE Life Sci. Educ., 23(3), ar38.
  43. Stephenson N., Duffy E. M., Day E. L., Padilla K., Herrington D. G., Cooper M. M. and Carmel J. H., (2020), Development and Validation of Scientific Practices Assessment Tasks for the General Chemistry Laboratory, J. Chem. Educ., 97(4), 884–893.
  44. Stephenson N., Facey P. and Sadler-McKnight N., (2023), Searching for Evidence of Science Practices in Chemistry Laboratory Curricula, in Nelson D. (ed.), Chemical Education Research during COVID: Lessons Learned during the Pandemic, pp. 37–50. DOI: 10.1021/bk-2023-1448.
  45. Stevens R., O’Connor K., Garrison L., Jocuns A. and Amos D. M., (2008), Becoming an Engineer: Toward a Three-Dimensional View of Engineering Learning, J. Eng. Educ., 97(3), 355–368.
  46. Stowe R. L. and Cooper M. M., (2017), Practicing What We Preach: Assessing “Critical Thinking” in Organic Chemistry, J. Chem. Educ., 94(12), 1852–1859.
  47. Theobald E. J., Hill M. J., Tran E., Agrawal S., Arroyo E. N., Behling S., Chambwe N., Cintrón D. L., Cooper J. D., Dunster G., Grummer J. A., Hennessey K., Hsiao J., Iranon N., Jones L., Jordt H., Keller M., Lacey M. E., Littlefield C. E., Lowe A., Newman S., Okolo V., Olroyd S., Peecook B. R., Pickett S. B., Slager D. L., Caviedes-Solis I. W., Stanchak K. E., Sundaravardan V., Valdebenito C., Williams C. R., Zinsli K. and Freeman S., (2020), Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math, Proc. Natl. Acad. Sci. U. S. A., 117(12), 6476–6483.
  48. Underwood S. M., Posey L. A., Herrington D. G., Carmel J. H. and Cooper M. M., (2018), Adapting Assessment Tasks to Support Three-Dimensional Learning, J. Chem. Educ., 95(2), 207–217.
  49. Urbanek M. T., Moritz B. and Moon A., (2023), Exploring students’ dominant approaches to handling epistemic uncertainty when engaging in argument from evidence, Chem. Educ. Res. Pract., 24, 1142–1152.
  50. Van Wyk A. L., Bhinu A., Frederick K. A., Lieberman M. and Cole R. S., (2025), Bridging the Science Practices Gap: Analyzing Laboratory Materials for Their Opportunities for Engagement in Science Practices, J. Chem. Educ., 102(3), 970–983.
  51. Walker J. P., Sampson V., Southerland S. and Enderle P. J., (2016), Using the laboratory to engage all students in science practices, Chem. Educ. Res. Pract., 17, 1098–1113.
  52. Williams L. C., Underwood S. M., Klymkowsky M. W. and Cooper M. M., (2015), Are Noncovalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches, J. Chem. Educ., 92(12), 1979–1987.
  53. Willingham D., (2007), The Difficulty of Teaching Critical Thinking: Cognitive Insights, American Federation of Teachers.
  54. Yik B., Machost H., Streifer A., Palmer M., Morkowchuk L. and Stains M., (2024), Students’ Perceptions of Specifications Grading: Development and Evaluation of the Perceptions of Grading Schemes (PGS) Instrument, J. Chem. Educ., 101(9), 3723–3738.

Footnote

Current Institution: Solebury School, 6832 Phillips Mill Rd, New Hope, PA 18938.

This journal is © The Royal Society of Chemistry 2026