
Lowering the activation barrier to create a constructively aligned undergraduate chemistry laboratory experience – a review of innovations in assessments and course design

Chun-wan Timothy Lo and Sharonna Greenberg*
Department of Chemistry and Chemical Biology, McMaster University, Hamilton, ON L8S 4M1, Canada. E-mail: greenbsh@mcmaster.ca

Received 13th February 2025, Accepted 16th July 2025

First published on 15th August 2025


Abstract

Reimagining laboratory education in chemistry can help address demands to revitalize the undergraduate chemistry curriculum. In doing so, we can help students think like scientists and connect chemistry to other disciplines. Historically, undergraduate laboratories were taught through expository experiments coupled with traditional lab reports. However, these practices do not allow for constructive alignment of the curriculum, because the assessments target the cognitive domain of learning while the learning outcomes and class activities target the psychomotor domain. This lack of alignment also limits meaningful learning in the laboratory, which lies at the intersection of the cognitive, affective, and psychomotor domains. This review summarizes recent innovations in course design and assessments for undergraduate-level laboratory courses. Overall, we aspire to lower the activation energy barrier for educators to find and implement curricular reforms in laboratory education that are constructively aligned within their course. We structure this review under the major learning outcomes of laboratory instruction, as defined by Reid and Shah: (1) linking the cognitive and psychomotor domains; (2) developing practical skills; (3) designing experiments; and (4) improving transferrable skills, which are further separated into scientific writing, oral communication, and peer learning.


Introduction

There has been a surge of calls for chemistry educators to revitalize the chemistry curriculum (Adam, 2002; Matlin et al., 2016; Nagarajan and Overton, 2019), with a focus on laboratory education, as this is the place where students “learn how to do chemistry” (Seery et al., 2019). For example, Matlin et al. suggest that learners should be taught through problem-solving approaches that encourage and test understanding and deductive reasoning, and that connect chemistry to other disciplines and to society, rather than through a collection of facts that require rote learning. However, rote learning is commonly found in the laboratory, where students often carry out expository “cookbook” style experiments and are assessed through traditional lab reports, with little context on utilizing scientific methods and solving real-world problems.

We believe that fundamental issues with the laboratory curriculum stem from a lack of constructive alignment between the learning outcomes, class activities, and assessments (Biggs, 1996, 2014). Others have expressed similar views (Berns, 2019; Veale et al., 2020; Seery et al., 2024; Seen, 2025). Hounsell and Hounsell (2007) further extend constructive alignment to include contextual influences such as student backgrounds and aspirations, learner support, and course organization and management.

The first step in creating a constructively aligned curriculum is establishing a well-developed set of learning outcomes, such as that of Reid and Shah (2007). These learning outcomes should incorporate all domains of learning – cognitive (thinking), psychomotor (doing), and affective (feeling) – to allow for meaningful learning to occur (Novak and Gowin, 1984; Novak, 2009; Bretz, 2019). Agustian (2022) further proposes an integrative approach to assessment in the laboratory that includes not just these domains, but also the social and epistemic domains.

After developing a set of learning outcomes, the curriculum should be designed intentionally to allow for constructive alignment. A major problem with typical laboratory curricula is the misalignment between learning outcomes, class activities, and assessments. For example, learning outcomes may focus on all domains of learning (cognitive, psychomotor, and affective), but class activities often involve expository “cookbook” style experiments (psychomotor domain) while assessments typically involve traditional lab reports (cognitive domain). Neither class activities nor assessments typically target the affective domain, despite its importance in the lab, and the social and epistemic domains are similarly overlooked (Agustian, 2022).

In this integrative review, we summarize innovative ideas for class activities and assessments in the laboratory that are constructively aligned with the learning outcomes. We begin with the following well-established set of learning outcomes proposed by Reid and Shah (2007).

(1) Students should link the cognitive and psychomotor domains by exposing theoretical ideas to empirical testing.

(2) Students should perform practical skills related to apparatus and chemical handling.

(3) Students should develop experimental design skills to improve their scientific observations, analyses and interpretations.

(4) Students should improve their transferrable skills such as scientific writing, oral communication, and working with their peers.

These learning outcomes are echoed in the literature (for example, see Seery et al., 2019; Agustian et al., 2022; and, for engineering disciplines, Nikolic et al., 2024) and in accreditation processes worldwide (González and Wagenaar, 2003; Japan University Accreditation Association, 2018; Canadian Society of Chemistry, 2019; Pyke et al., 2021; ACS Committee on Professional Training, 2023; Find degrees accredited by the RSC, 2024).

Based on these learning outcomes, we then describe a few creative class activities and assessment methods in the laboratory. We have grouped these activities and assessment methods into the four distinct categories of learning outcomes indicated above, although some methods may span more than one category. For each activity or assessment method, we have also included subsections entitled “expected outcomes and advantages”, largely from the students’ point of view, and “challenges and suggestions”, largely from the instructors’ point of view. In these subsections, we highlight aspects mentioned in the primary literature, in addition to offering our own opinions based on our experiences in laboratory instruction. Most of the articles chosen are recent, published in the last 15 years, and are specific to chemistry. However, we hope that readers from other STEM fields can apply these ideas to their own disciplines.

Other prior reviews have approached laboratory education from different angles. For example, Agustian et al. (2022) conducted a comprehensive review to characterize laboratory learning across STEM disciplines. Studies were coded based on key competencies and grouped into five overarching themes: experimental competencies, disciplinary learning, higher-order thinking skills and epistemic learning, transversal competences, and affective outcomes. The authors focused on the qualities of the study design, the results of the study, and their relevance and applicability at the university level. A similar comprehensive review (Gericke et al., 2023) focuses on secondary school rather than university education. In another recent review, Seery et al. (2024) frame recent advances in laboratory education under 10 guiding principles related to constructive alignment. The authors suggest actionable items aligned with each guiding principle, and point the reader to recent literature examples to achieve those actionable items. We also note that tackling actionable items, like those suggested by Seery et al. (2024), is challenging, even when we know that other educational practices are substantially better than our current practices (Henderson et al., 2018). Henderson et al. point out that the biggest barrier is figuring out how to implement new methods at scale. For new ideas to be sustainable, the authors advocate for promoting social connections within the university to encourage collaboration and sharing of ideas, rather than focusing new ideas on an individual instructor. We direct interested readers to the excellent work of Henderson and coworkers on how to implement sustainable change.

Our review differs from these prior reviews in that our focus is on how to practically apply these innovative, constructively aligned strategies, and what challenges may arise. Our hope is that our readers will critically examine their own laboratory courses for constructive alignment, and become inspired to implement new classroom activities and assessments described herein (or in other related reviews) that are better aligned with their learning outcomes. By pointing out the advantages and challenges, as well as the logistics of implementing each strategy, we hope to lower the activation energy barrier for instructors to try something new. Table 1 below summarizes each innovation along with its pros and cons. Typically, the advantages of each method focus on the students' perspective and the overall learning experience. Meanwhile, the disadvantages of each method are from the instructors' perspective and often include demands on time and resources.

Table 1 Summary of recent, innovative course designs and assessment techniques in laboratory instruction. In this table, pros are largely for students while cons are largely for instructors, unless otherwise noted by the asterisk
Method Pros Cons and suggestions Ref.
Learning objective 1: linking the cognitive and psychomotor domains
Motivation:
– Facilitating the connections between theory and experiment
Written lab wrapper and oral exit interviews – Encouraging preparation and self-reflection – Investing substantial time to complete oral exit interviews (Crawford and Kloepper, 2019)
– Focusing on overall goals of the lab – Consider peer review, clear rubrics, and competency- or specifications-based grading.
– Boosting oral communication skills  
– Receiving immediate feedback  
– Inspiring positive affect  
 
Learning objective 2: developing practical skills
Motivation:
– Aligning assessments constructively with intended learning outcomes
– Prioritizing practical skills development
Rubric for evaluation of laboratory performance – Receiving immediate feedback – Developing a detailed grading rubric (Veale et al., 2020)
– Emphasizing the psychomotor domain by alignment of the grading scheme with the desired skillset – Training TAs to grade fairly
  – Consider peer-evaluation, self-reflection, and competency- or specifications-based grading
 
Practical laboratory examinations – Focus on the process (skills development) rather than the outcome of the experiment – Investing substantial time and resources (Kirton et al., 2014)
  – Feeling stressed (students)*
  – Consider flagging important skills and creating laboratory video demonstrations
 
Learning objective 3: designing experiments
Motivation:
– Thinking like a scientist
– Cultivating critical thinking, creative thinking, independence, confidence, and sense of belonging
– Gaining deeper understanding of experiment and instrumentation
– Recognizing the limits of one's knowledge
– Mimicking a research environment
– Arousing interest in seeking research opportunities
Course-based undergraduate research experiences (CUREs) – Easing the transition between expository labs and research experiences at the lower-year undergraduate level – Designing open-ended yet self-contained problems to investigate (Clark et al., 2016)
  – Training TAs to be facilitators rather than knowledge providers
  – Accessing and setting up diverse materials and equipment
  – Consider asking students to design experiments using an inventory of chemicals and equipment
  – Consider using peer tutors to support inquiry skills
  – Consider student self-reflection in assessment methods
 
Problem-based learning (PBL) mini-projects – Taking control of experiments (student-centred) – Designing self-contained problems to solve (McDonnell et al., 2007)
  – Extensive facilitating during the initial planning stages
  – Devoting laboratory time to experimental planning
  – Preparing large libraries of chemicals and equipment
  – Consider similar suggestions as listed above in course-based undergraduate research experiences (CUREs)
 
Learning objective 4A: transferrable skills – scientific writing
Motivation:
– Developing written communication, information literacy, and higher-order thinking skills
Scaffolded laboratory report writing – Internalizing the organization of laboratory reports – Providing timely feedback with high student volume (Deiner et al., 2012)
– Reducing structural errors in report submissions – Consider mentorship from peer tutors and peer review by classmates
 
Integrating information literacy with scientific writing – Improving information literacy skills – Generating interest and buy-in from students* (Borchardt et al., 2019)
– Implementing in lectures or tutorials without using lab time* – Developing appropriate rubrics for information literacy and using them consistently
  – Consider mentorship from peer tutors and consulting with a librarian as an expert in information literacy
 
Learning objective 4B: transferrable skills – oral scientific communication
Motivation:
– Promoting deeper learning
– Developing oral communication skills
Oral laboratory reports – Immediate verbal feedback helps focus on the learning rather than the grade – Feeling stressed (students)* (Burrows et al., 2021)
– Scaffolding questions to target individual learner needs – Investing substantial time to hold oral interviews
– Promoting better preparation and knowledge retention – Consider clear rubrics and competency- or specifications-based grading
  – Consider a mix of oral and written assessments
 
Video-recorded assignments – Implementing in large-enrolment courses – Developing a detailed grading rubric and answer key (Berns, 2019)
– Focus on organizing thoughts and presenting information clearly – Training TAs regarding expectations and grading consistency
– Low-stakes with multiple retakes as needed – Designing specific experiments that can be done at home
– Incorporating artistic freedom and universal design for learning  
 
Learning objective 4C: transferrable skills – peer learning
Motivation:
– Promoting student–student engagement
– Promoting the affective and social domains of learning
– Developing a sense of community and belonging, particularly for under-represented minorities
Team contracts and collaboration rubric – Tracking cohorts of students through the curriculum – Tackling social loafing (Mertz et al., 2023)
– Intervening in dysfunctional groups* – Giving and receiving feedback in a constructive way
– Targeting the affective domain through self-reflection – Protecting anonymity when providing feedback
  – Consider a specifications-based grading scheme
  – Consider guidance on how to give and receive feedback and providing aggregated or class-wide feedback
 
Reciprocal peer teaching (RPT) – Deepening understanding of content – Accepting that peers can be knowledge providers* (Zewail-Foote and Gonzalez, 2023)
– Enhancing metacognitive skills – Standardizing assessment practices
– Targeting the affective and conative domains through intrinsic motivation – Consider self-reflection, self- and peer-evaluation, and a competency-based grading scheme
 
Argument-driven inquiry (ADI) – Developing critical thinking skills – Accepting that peers can be knowledge providers* (Walker et al., 2011)
– Simulating a more realistic research environment – Training TAs as facilitators
– Developing rich, convincing arguments, and accepting different viewpoints – Assessing reasoning and argumentation skills
– Targeting the epistemic domain of learning – Consider an observation protocol to assess argumentation skills
  – Consider a competency- or specifications-based grading scheme
  – Consider self-reflection that focuses on the growth mindset
  – Consider peer tutors as facilitators


Discussion

1. Linking the cognitive and psychomotor domains

According to Reid and Shah (2007), laboratory work exposes theoretical ideas to empirical testing. In this way, students are encouraged to connect what they have learned in lectures (the cognitive domain) to the experimental work they carry out in the lab (the psychomotor domain).
Motivation. To encourage student self-reflection and have a greater impact on learning, the cognitive and psychomotor domains should be linked immediately. While lab reports can be effective assessments of laboratory learning, they are often due a week later, rather than immediately after the lab. Instructional methods across education similarly value immediate assessment and feedback (The CARL framework of reflection, 2018; Creating a CARD Lesson Plan, 2020; Hu et al., 2022).

Post-lab assessments that occur immediately after the lab encourage better student preparation, likely because students would not have any other time to prepare for the assessment (Crawford and Kloepper, 2019). In addition, low-stakes assessments are useful to target the affective domain of learning: a lesser impact on the student's grade will alleviate student anxiety, while encouraging them to do the work.

Herein, we present an innovative technique that requires students to formulate their ideas and thoughts directly following experimentation in the laboratory. This technique is worth very little in terms of grades, and it supports student preparation, alleviates anxiety, and reinforces understanding of the overall laboratory goals.


Method 1: combination of written lab wrapper and oral exit interviews. Crawford and Kloepper (2019) report the use of written lab wrappers and oral exit interviews. A lab wrapper is a short reflection in which students are asked to write about high-level ideas, such as the overall goals of the experiment and what new skills they learned. This helps them synthesize their thoughts to prepare for the oral exit interview. In the oral exit interview, students are asked questions similar to those on the lab wrapper, as well as whether the lab helped improve their understanding of lecture material. Other questions in the oral exit interview cover material specific to the lab and help with the lab report that students will later write.

This use of both written and oral communication was applied in a second-year chemistry course, in a 24-student lab section. This is a low-stakes assessment: only the exit interview was graded, contributing a total of 0.45–1.5% of the student's grade in the course.


Expected outcomes and advantages. Brief assessments used immediately after the lab activity encouraged better preparation before the lab: during the semester, the authors observed an increase in students meeting with lab partners prior to lab. In addition, these assessments help students reflect on their learning experiences, looking at the skills they learned and the big picture of the lab, instead of focusing on individual lab procedures. In this way, students are explicitly shown the link between the cognitive and psychomotor domains. The post-semester questionnaire recorded a 21% increase in the overall number of students mentioning higher-order learning activities for lab preparation, such as attempting to understand the lab, visualizing lab steps, or looking up unfamiliar concepts. Over 80% of students agreed that exit interviews reinforced lab material and helped them make connections between course material and lab.

As the semester progressed, students were less nervous and more comfortable with the exit interviews and generally acclimatized to this alternate form of assessment. The low-stakes assessment may improve students’ resilience by providing an opportunity for students to make mistakes on the exit interview and improve for the lab report. This is a great example of how the affective domain can also influence learning alongside the cognitive and psychomotor domains.

Analyses of grades on exit interviews show an additional advantage: a general improvement in oral communication. Oral communication will be discussed further in Section 4.B.


Challenges and suggestions. Exit interviews are 3–5 minutes in length. Considering a 24-student section, a total of 1–2 h would be required for the instructor(s) to finish all the interviews after the lab section. The authors did not specify the logistics, specifically the extra time needed to finish all interviews. However, in one iteration of exit interviews, the authors note that students were paired up to reduce the time needed to finish all interviews.

One strategy to alleviate time demands on instructors is to institute peer feedback. This strategy might be especially useful later in the semester, after norms have been established with the first few exit interviews conducted by faculty. We refer readers to Section 4.C of this article regarding the use of peer feedback in the curriculum. Other strategies to alleviate time demands on instructors may involve students being randomly chosen for an interview, or teaching assistants administering and grading the interviews. To facilitate this, teaching assistants should receive training, and rubrics should be developed to ensure the fairness of the assessment. While the authors did not discuss their rubric, the rubric is likely based on the interview questions, and students generally earned half or full credit for the exit interviews. An alternative to rubrics is to grade the exit interviews using a competency-based (Townsley and Schmid, 2020) or specifications-based grading scheme (Pascal et al., 2020; Nilson, 2023). In a specifications-based grading scheme, which falls under the umbrella term of competency-based grading, educators provide clear specifications of what is required to earn a given grade in the class. Tasks and assignments are graded on a pass/fail basis, and the specifications usually refer to the number of tasks passed.
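To make this concrete, here is a minimal sketch of a specifications-based scheme, assuming a hypothetical course with ten pass/fail exit interviews; the grade cutoffs and task counts below are our own invented placeholders, not taken from any of the cited studies.

```python
# Minimal sketch of a specifications-based grading scheme.
# Hypothetical: ten pass/fail tasks (e.g., exit interviews), with
# invented cutoffs mapping the number of passed tasks to a grade.

def specs_grade(tasks_passed: int) -> str:
    """Map the number of passed pass/fail tasks to a letter grade."""
    cutoffs = [(10, "A"), (8, "B"), (6, "C"), (4, "D")]  # invented specifications
    for minimum, grade in cutoffs:
        if tasks_passed >= minimum:
            return grade
    return "F"

# A student who passes 9 of the 10 tasks would earn a B under this scheme.
print(specs_grade(9))  # -> B
```

The appeal of such a scheme is transparency: students know in advance exactly how many tasks they must pass, and each individual task stays low-stakes.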

2. Developing practical skills

As discussed in Section 1, one of the main learning outcomes in the lab is to link the cognitive domain of learning with the psychomotor domain. However, developing the psychomotor domain is itself a distinct learning outcome, as the skills gained in the psychomotor domain are distinct from the cognitive domain (Kirton et al., 2014). Traditional expository labs are effective at achieving the learning outcome of developing practical skills. However, typical assessments (lab reports) in these expository labs are misaligned with this learning outcome because the execution of the protocol itself is rarely evaluated (Seery, 2020). Seen (2025) further develops this idea beyond course-level learning outcomes to program-level learning outcomes, through a framework to evaluate the assessment of practical skills. Seen points out that laboratory skills are not typically assessed directly through observation, but rather indirectly through lab reports, lab notes, or other post-lab assessments. The methods below address some alternative assessments to directly evaluate hands-on skills in the lab.
Motivation. Because the psychomotor domain of learning is distinct from the cognitive domain, it can be helpful to evaluate practical skills separately. Otherwise, instructors run the risk of focusing exclusively on the cognitive domain, which is easier and more familiar to assess. Assigning grades to practical skills ensures that both instructors and students prioritize this learning outcome. For example, Kirton et al. (2014) observe that students did not consider the actual work being conducted and the hands-on skills they were learning in the lab, and were reluctant to invest time in laboratory techniques/skills without reward (marks). By properly assigning the reward (marks) for practical skills, students may place greater value and emphasis on these skills. The result is a constructively aligned course, in which the learning outcomes and assessments are mutually reinforcing.
Method 1: development of a rubric for the evaluation of laboratory performance. Veale and colleagues developed a detailed scoring rubric to evaluate experimental techniques in a 3rd year synthetic organic chemistry lab (Veale et al., 2020). The authors focused on assessment as a tool for skill development; therefore, the rubric specifically tackled several different experimental techniques. These criteria were each graded on a 5-point scale and together worth 20 points, while the written lab report was worth 10 points, for a total of 30 points. Instructors trained TAs to use the rubric to evaluate students on select aspects of the experimental protocol. These aspects were intended to be indicative of overall performance, and it was more efficient and scalable to choose select aspects than to witness the entire experiment. For example, in grading students' workup during an extraction, TAs do not have to watch the entire drying and filtering step but only look at the amount of MgSO4 added to the organic layer as an indication of good workup skills. Immediately after the lab, students obtain the lab performance grade and individualized feedback from the TA.
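As a rough illustration of how such a weighted tally might work, the sketch below computes a lab grade from four technique criteria, each rated on a 5-point scale, plus a report mark; the criterion names and ratings are hypothetical, and only the 20-point/10-point split reflects the scheme described by Veale et al.

```python
# Sketch of a 20 + 10 = 30 point performance-plus-report tally.
# The four technique criteria and ratings are hypothetical placeholders;
# only the 20/10 split mirrors the scheme described by Veale et al. (2020).

technique_ratings = {            # each criterion rated on a 5-point scale
    "reaction setup": 4,
    "extraction and workup": 5,  # e.g., judged by the amount of MgSO4 added
    "purification": 3,
    "yield and characterization": 4,
}

performance = sum(technique_ratings.values())   # lab performance, out of 20
report = 8                                      # written report, out of 10

print(f"Performance: {performance}/20, report: {report}/10, "
      f"total: {performance + report}/30")
```

Weighting the hands-on criteria at twice the report's value signals to students that the psychomotor skills, not just the write-up, carry the marks.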
Expected outcomes and advantages. Students improved in their practical skills throughout the course, as shown by a general increase in the lab performance grades. This may be a result of a general improvement over time, but it may also be attributable to immediate feedback and/or to the grading scheme. With respect to immediate feedback, experienced academic staff could use the rubric to effectively assess practical skills and provide formative feedback. With respect to the grading scheme, most marks (20 out of 30 points) were related to experimental techniques, which better aligns with expectations: students should invest more time and place greater value on the experimental techniques if they are worth more marks.
Challenges and suggestions. The rubrics were developed by experienced instructors, which is a time-consuming task, given that each rubric is specific to a particular experiment. The rubrics were also used by experienced TAs to evaluate students; in future iterations, newer TAs could be trained to alleviate workload and reduce the time and resources required for real-time observation. A similar rubric was developed by Chen et al. (2013), specifically for instructors/TAs with little experience, in which one TA was responsible for no more than two students. Evaluation using such a rubric is time-consuming, and likely not realistic in courses with a considerable number of students.

Some low-stakes peer-feedback or self-reflection may also be useful to help students improve their practical skills. Peer-evaluation or self-reflection may be reasonable towards the end of the semester, once norms for practical skills have been established. This is advantageous in terms of instructors’ time and resources; however, peer feedback can bring potential setbacks. We refer readers to Section 4.C of this article regarding the use of peer feedback in the curriculum. Competency- or specifications-based grading can also offer an alternative to more detailed rubrics.


Method 2: assessment using practical laboratory examinations. Kirton et al. developed a practical laboratory examination, which they termed Structured Chemistry Examinations (SChemEs) (Kirton et al., 2014). The practical exam is conducted at the end of the term, is worth 15% of the student's grade, and features a multiple-station-style assessment, with each station focusing on a different skill area. The practical exam resembles the objective structured clinical examinations (OSCEs) developed in the 1970s and widely used in the clinical disciplines, as well as the objective structured practical/laboratory examinations (OSPEs/OSLEs) used in anatomy and physiology (Chitra et al., 2022).

Kirton et al.'s pilot study was implemented in a compulsory first-year chemistry module for pharmacy and life sciences programs, with a focus on organic chemistry. A passing grade on the practical exam was not required to pass the course. Students were assessed on five main skill areas: basic techniques, information management, interpretative exercises, apparatus handling, and numeracy. Each skill contributes one-sixth of the overall mark, with the remaining one-sixth based on students' level of preparedness.


Expected outcomes and advantages. The design of the practical exam changes the focus of the assessment from the outcomes achieved to the processes involved. This assessment method is constructively aligned with the intended learning outcomes for Year 1 students, namely the practical skills that students are meant to develop. This is also a highly flexible method, allowing instructors to implement this assessment in courses with different learning outcomes and disciplines.
Challenges and suggestions. This method requires a substantial time commitment for instructors to design and implement. In addition, students may find this type of practical exam stressful. For example, students suggested that there should be practice sessions prior to the actual assessment, even though tasks given in the practical exam were similar to tasks they had seen previously in the laboratory classes. However, financial constraints and lack of time typically preclude this idea. The authors suggested instead to signpost the important skills in the student practical schedule, and to direct students to video demonstrations of examinable skills and techniques.

3. Designing experiments

Another general purpose of laboratory learning is to teach students how to do science (Seery et al., 2019). This idea is relevant to the epistemic domain of learning, which Agustian and Matthews argue serves as the foundation for learning in the laboratory (Matthews, 2018; Agustian, 2022). Specifically, as Thomas et al. (2015) suggest, obtaining a degree in chemistry means learning to become a scientist and contributing to the world of chemical sciences. Thus, students should work like chemists, write like chemists, and think like chemists upon graduation. To achieve this, experimental design skills should be explicitly taught so that students learn to independently plan and execute new and innovative experiments related to a problem. Frameworks to teach experimental design will be discussed under the umbrella of inquiry. Inquiry-based methods, according to Lazonder and Harmsen (2016), enable students to learn about a topic through self-directed investigations.

Domin (1999) characterized laboratory instructional styles according to the outcome, approach, and procedure involved. Meanwhile, Buck and coworkers have proposed several levels of inquiry, corresponding to the level of open-endedness of the experiment (Buck et al., 2008). For example, confirmation (level 0) is equivalent to expository style laboratories, in which all parts of the investigation are mapped out for students. The levels then increase (½, 1, 2, and 3); at the last level, authentic inquiry, students generate their own scientific question about an open-ended problem, and determine how to conduct the experiment and report the results. Here, we summarize different methods of inquiry-style laboratory learning using the framework from Buck et al., including course-based undergraduate research experiences (CUREs), problem-based learning (PBL), and argument-driven inquiry (ADI).

Motivation. Level 0 (expository) labs give little emphasis to critical thinking and conceptual learning, and no opportunity for experimental design. Moreover, level 0 labs often involve confirming theory taught in lectures, which is an inefficient use of the laboratory (Kirschner and Meester, 1988). Students in expository laboratories are often unclear about the aims and applications of the laboratory work, and struggle to interpret experimental results. To promote students' ability in experimental design, laboratory instruction at higher inquiry levels (towards authentic laboratory experiences) is needed. By increasing the open-endedness of experiments, students not only learn science content, but also experimental processes and design. They also gain independence in carrying out scientific work.

Increasing the level of inquiry also aligns with Perry's scheme of intellectual development (Finster, 1991; Winberg and Berg, 2007) and with undergraduate degree level expectations (see, for example, Appendix 2: OCAV's Undergraduate and Graduate Degree Level Expectations, Ontario Universities Council on Quality Assurance, 2024). For example, students coming out of high school tend to characterize knowledge into right and wrong statements. Their undergraduate degree helps them learn to recognize the limits of their knowledge and determine ways to uncover new ideas. The process of inquiry is vital to this aspect of growth.

In the three methods described below, encouraging experimental design serves to release a certain degree of control to students, which then positively affects their attitudes, stimulating curiosity and lifelong learning. Students gain improved independence and a sense of belonging in STEM disciplines (Balster et al., 2010; Estrada et al., 2011; Provost, 2022). Mimicking a more realistic research environment also encourages student interest in seeking research opportunities. This is a great example of using the affective domain of learning to facilitate student learning.


Method 1: structured course-based undergraduate research experiences (CUREs). There exists a vast amount of literature related to CUREs in the chemistry undergraduate curriculum. The method below shows how CUREs can be implemented in a first-year setting. We also direct interested readers to an article of faculty perspectives about CUREs (Connor et al., 2022) and a guide for starting or implementing CUREs (Provost, 2022).

Clark et al. (2016) developed and implemented a transition from expository labs to CUREs to ease students' anxiety and promote the process of uncovering new knowledge. The CURE was implemented in a large, first-year introductory chemistry course (2300 students) in the second semester. The course is divided into lab sections with 25 students each, led by a graduate TA. The laboratory component starts with ten traditional laboratory experiments, followed by a three-week-long group research experience. The project is an example of “structured inquiry”, level ½ according to Buck et al.'s (2008) designation. The outcomes are unknown to students, and TAs are explicitly advised to leave students' questions unresolved, as they will be investigated in the next experiment. There are three important aspects to this experience: (1) signposting the research question to examine the significance of the big picture; (2) using specific in-lab questions to guide students through the inquiry process; and (3) using an inductive approach to generalize experimental results and address research claims.


Expected outcomes and advantages. In this method, instructors are not necessarily distributors of knowledge. As such, this method is built to challenge students’ traditional thinking and encourage students to find answers themselves. As a result, students are expected to gain an improved understanding of research, so that they can seek additional research opportunities and persist in a STEM discipline.

From an end-of-course survey, a student described a traditional lab setting as a “scary place where no one talks to each other for the whole lab”, and contrasted it with the CURE, which “is definitely not like that.” CUREs also challenge students to experience a more realistic research environment, such as the “messiness of real-world data”. Students also recognize they must become more responsible for their own learning in a CURE course. This encouraged them to better understand the laboratory work they are performing and the related concepts, and shows that this method is also effective in linking the cognitive and psychomotor domains (see Section 1 above).


Challenges and suggestions. Compared to expository labs, CUREs are more challenging for all stakeholders: instructors, technicians, TAs, and students.

It is difficult and potentially anxiety-provoking for both students and teaching assistants to adopt a more open-ended inquiry-style approach (for student perspectives, see Chopra et al., 2017; for TA perspectives, see Sandi-Urena et al., 2011; Sandi-Urena and Gatlin, 2012; Wheeler et al., 2015). From the student perspective, Chopra and coworkers point out that some students favour the expository style as it is more familiar, while others embrace the new method of inquiry labs (Chopra et al., 2017). Students are given a great deal of freedom in designing their experiment, which can feel overwhelming. We propose addressing this through student self-reflections to track their progress, which has the added advantage of addressing the affective domain of learning.

From the TA perspective, the expository style was initially deemed “easy to teach”, while the inquiry style lacked the required support and tools for TAs to handle the lab and accomplish their instructional goals. However, TA frustration with inquiry style labs diminished as they became more acquainted with the instructional methodology and were provided continuous support by senior TAs and the faculty coordinator (Sandi-Urena et al., 2011). Sandi-Urena and Gatlin advocated the need to provide training to TAs in order to have a successful education reform in the teaching laboratories (Sandi-Urena and Gatlin, 2012), and Wheeler et al. (2015) have discussed TA perceptions of TA training.

Similarly, Gericke et al. (2023) have summarized some challenges of implementing science practice in laboratory work in line with guided or open inquiry in secondary schools. The authors emphasize the need to explicitly teach questioning skills. Therefore, TAs generally require more training as facilitators, instead of as traditional knowledge providers. In Clark et al.'s study, for example, TAs were required to first attend these labs as learners and debrief with the instructor. Such training is advantageous and necessary if educators are collectively shifting the paradigm of laboratory instruction to inquiry-based methods. One way to alleviate the training requirement for TAs is to involve upper-year peer tutors, once the course has been established for a few years. Peer tutors are 2–3 years ahead of the students in the laboratory course, and would have already gained experience in inquiry methods from completing the course themselves. Peer tutors are “paid” through course credit rather than TA wages, with no expectation of grading duties (Ding and Harskamp, 2011).

Another general challenge in CUREs relates to the workload for lab technicians in setting up the protocols that students propose with limited materials and equipment. To address this, students could be directed to choose from an inventory of chemicals and equipment to work with, as in the study by Clark and coworkers. In addition, instructors can design a self-contained problem to solve and frame the activity carefully. This way, students are ultimately carrying out a standardized synthesis, which limits room for troubleshooting and error, but with the perception of self-directed inquiry. Framing and setting up such a self-contained problem depends heavily on the level of inquiry.


Method 2: problem-based learning (PBL) mini-projects. Problem-based learning (PBL) was introduced in the field of medical education in the 1960s (Lohfeld et al., 2005; Neville et al., 2019). Problem-based learning, along with project-based learning (see Section 4.C, method 2 in this article), is identified as a student-centred learning strategy that develops and solidifies students' systems thinking skills (Nagarajan and Overton, 2019).

PBL methods often require an entire semester. However, McDonnell et al. implemented the following PBL mini-project in a second-year chemistry course, in the last five 3-hour laboratory sessions of the semester (McDonnell et al., 2007). Students were divided into groups of 3–4, and each group was assigned a member of academic staff as project supervisor. Groups were presented with a project title, such as “Can the lipids in cheese be extracted and analysed?”. Students were also given a pre-project talk covering the learning outcomes and project objectives. In the first session, students developed the experimental plan that they would carry out in the next four sessions. The assessment of the PBL mini-project included the project plan, a presentation of the project upon completion, individual project diaries (background reading, experiences/results in the laboratory), and a reflective project statement. We categorize this method as open inquiry (level 2 according to Buck et al.'s (2008) designation). For more examples of PBL with authentic inquiry, refer to Quattrucci (2018).


Expected outcomes and advantages. PBL labs encourage productive success (Kapur, 2016), in which students attempt to generate solutions before receiving direct instruction. Students are expected to think about the experiment in the context of an overall problem-solving scenario. While initially reluctant, students began taking control of their project mid-way through, emphasizing critical thinking, experimental design and independence as a scientist.

Analyses of student feedback showed that students found the projects “fun/interesting” and gained “confidence in the laboratory/use of new instruments”, to a greater extent than by completing traditional laboratory sessions. Staff also reported that students gained a deeper understanding of the principles and procedures for using instruments or carrying out experiments, although the experiments carried out were similar to the ones they would normally do in traditional laboratory sessions, which is an example of improving practical skills (see Section 2 above) through PBL. Staff also observed a greater enthusiasm and enhanced engagement among the class after initiation of the PBL mini-projects.


Challenges and suggestions. The authors acknowledge the extra burden and time required to supervise the groups. The bulk of the work occurred in the initial planning stages, including project outlines, initial general guidance, and assessing the project plan. The authors also reduced resource requirements by having one laboratory supervisor engage with all projects. We suggest that in PBL courses, as in CUREs, peer tutors could alleviate some of this burden by facilitating learning and supporting students' inquiry process.

PBL, like other inquiry methods, requires more laboratory time than expository laboratories, due to students’ planning and trial and error. McDonnell and coworkers believe that improvements in students’ lab learning experiences outweigh these costs.

Similar to the structured CURE, the availability of equipment and chemicals is likely another challenge. Students should consider this factor in the planning stages, which involves guidance from academic staff.

4. Improving transferrable skills

Nägele and Stalder emphasized the importance of transferrable skills in enhancing the competence and employability of graduates moving from school-based education to work (Nägele and Stalder, 2017). In chemistry education, the laboratory environment is ripe for fostering transferrable skills such as collaboration and communication, both oral and written. These transferrable skills are required for accreditation and degree-level expectations (González and Wagenaar, 2003; Kuh, 2008; Japan University Accreditation Association, 2018; Canadian Society of Chemistry, 2019; Seery et al., 2019; Pyke et al., 2021; Agustian et al., 2022; ACS Committee on Professional Training, 2023; Find degrees accredited by the RSC, 2024; Nikolic et al., 2024). For example, Kuh's report published under Liberal Education and America's Promise (LEAP) of the American Association of Colleges and Universities (AAC&U) lists critical thinking and analysis, writing and oral communication, quantitative and information literacy, and teamwork and problem solving among the intellectual and practical skills that are part of the essential learning outcomes in college studies.
4.A. Scientific writing. Scientific writing is identified as one of the essential learning outcomes by the AAC&U (National Leadership Council for Liberal Education and America's Promise, 2011) and is required for accreditation worldwide, as referenced above in Section 4. To help students with scientific writing, such as formal lab reports, we describe below two approaches: scaffolding the writing process for formal lab reports (Deiner et al., 2012) and integrating information literacy with scientific writing (Borchardt et al., 2019).

An important point here is the impact that generative artificial intelligence may have on scientific writing. This is beyond the scope of this review, but interested readers may consult (Yuriev et al., 2023; Rojas, 2024; Ruff et al., 2024). For example, Ruff et al. believe that although generative AI should be introduced and utilized in the curriculum, it should not replace human writing in chemistry education or professional work. Original report writing remains one of the best means for instructors to assess students’ capacity for organized thought and coherent reasoning (Ruff et al., 2024). Yuriev et al. advocate for analysis of the uses and impacts of generative artificial intelligence in chemistry education (Yuriev et al., 2023), and Rojas has investigated ChatGPT's application for writing assignments (Rojas, 2024).


Motivation. Developing scientific writing and information literacy (IL) skills is a common learning objective at the university and college levels. Indeed, these skills enhance student engagement with concepts and higher-order thinking processes (Berry and Fawkes, 2010; Deiner et al., 2012).

Often, educators are motivated to implement direct instruction in scientific writing and IL due to the varying quality and depth of lab report submissions. Some have also observed that students may be penalized on their lab reports due to omissions rather than incorrect statements, which can be mitigated with direct instruction.

Despite the importance of these skills, faculty in sciences often struggle with the best way to incorporate them into the curriculum (Gullikson, 2006). The two examples below show how these skills can be implemented at the first-year university level, where students have a broad range of linguistic backgrounds, writing experiences and competencies.


Method 1: scaffolded laboratory report writing. Deiner et al. reported a scaffolded approach to teach laboratory report writing (Deiner et al., 2012). Scaffolding a new technique or skill involves breaking it down into smaller segments. With respect to lab report writing, scaffolding was achieved by providing students with guiding questions to write particular sections in a laboratory report.

This assessment was designed and implemented for a first-year introductory chemistry program. In the first semester, students wrote laboratory reports in the form of short-answer worksheets; the scaffolded lab report writing then began in the second semester, with only certain sections due at a time. Throughout the semester, more lab report sections are required, until a full report is submitted as the last lab report. The authors reported supporting strategies such as online modules and short (∼3 min), informal one-on-one meetings when returning graded student work. Berry and Fawkes (2010) have also reported a similar scaffolding approach, combining piecemeal submission with peer review and an option to submit report drafts for instructor feedback.


Expected outcomes and advantages. Overall, this method requires few resources and can achieve large gains in students' scientific writing, conceptual thinking, and interpretation skills.

Students learn to internalize the scaffolded questions and use them as an organizational tool to write laboratory reports. In addition, the course setup allows students to apply the feedback they received when writing the same report section at a later date.

The authors observed that students' reports had fewer structural errors after introducing the scaffold, and that fewer students handed in lab reports that were longer or shorter than necessary. This is likely the result of clearer goals being communicated to the students.


Challenges and suggestions. One limitation of this method is student and instructor workload. Instructors must provide timely feedback on each lab report section, such that students can implement this feedback before their next writing attempt. This might not be possible for institutions with high student volumes in first-year general chemistry courses. Mentorship from peer tutors (as suggested in Section 3) or peer review and feedback from classmates (with appropriate training) may be helpful in tackling these time-related challenges. We refer readers to Berry and Fawkes (2010) for the peer review aspects of report writing.
Method 2: integrating information literacy with scientific writing. Borchardt and colleagues integrated IL skills with scientific writing in a first-year biology course (Borchardt et al., 2019).

Towards the beginning of the course, students are introduced to IL, including different types of scientific information, how to categorize information, and how to evaluate the quality of discovered information. The authors also included a set of learning objectives, corresponding assessments, and rubrics for this information literacy lab session.

At the end of the course, students engage in experimental design via PBL (see Section 3 above) and analyze their own results. Drawing on the IL skills that they developed earlier in the course, their final project involves completing a journal-style research paper based on their inquiry project. The grading rubric for this project addresses criteria in experimental design, scientific writing, and the integration of information literacy. For example, students are evaluated on their ability to integrate relevant and appropriate resources and proper citations into their scientific writing.

In related literature, other authors have taken a complementary approach, using problem-based learning (PBL) to develop IL skills, but without explicit instruction in IL (Shultz and Li, 2016; Shultz and Zemke, 2019).


Expected outcomes and advantages. Statistical analyses of student grades on the research paper show improvements in four specific rubric categories: (i) well-defined focus of purpose, experiment and research; (ii) clear presentation of collected information and strong connection to a broad topic of research; (iii) structured hypothesis and conclusion supported by sources; and (iv) relevance and quality of sources. These categories might have been influenced by improved information literacy skills or by the experimental design feature of the course.

This method is versatile and can be implemented in a lecture course with or without a lab component, or as a small online module for students, delivered either synchronously or asynchronously.


Challenges and suggestions. Anonymous end-of-semester evaluations indicate that students did not self-report gains in information literacy. Furthermore, many students commented that the information literacy lab was boring. Based on this feedback, in the following year, the authors incorporated more problem-based learning concepts into this IL lab session by having students research the answer to a question of their choosing, rather than conduct assigned topic-based searches.

The authors observed that some students believed they were already information-literate. Combatting these self-perceptions involves some delicate signposting to encourage students to further develop their IL skills. We suggest strategies such as peer evaluation, mentorship from upper-year peer tutors, or a method to benchmark students' IL skills at the beginning and end of the course so students can see their progress over time.

Developing a rubric for IL is challenging. To address this, the American Association of Colleges and Universities (AAC&U) has published VALUE rubrics on information literacy, which are open educational resources (OER) (VALUE Rubrics, 2007). Similar rubrics can be developed and tailored to specific coursework. Applying the rubric consistently is another challenge. The authors suggest having a single scorer for all students in a small class, and relying on TA training to ensure grading consistency in large classes. In our experience, grading several assignments together with TAs and instructors, followed by a group discussion about any points of difference, should be effective in reducing variance among scorers. Similarly, Borchardt et al. suggested a sample round of scoring and discussion with all scorers, and the active involvement of a librarian with advanced information literacy experience and associated assessment measures.
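One way to make the calibration step concrete is to compare each scorer's average against the group mean after everyone grades the same sample assignments. The sketch below is our own illustration (not from Borchardt et al.); the scorer names, marks, and tolerance are hypothetical.

```python
# Illustrative scorer-calibration check (our own sketch, not from the
# cited studies): all scorers grade the same sample reports, and scorers
# whose mean drifts from the group mean beyond a tolerance are flagged
# for discussion before live grading begins.

from statistics import mean

# Hypothetical marks given by three TAs to the same three sample reports.
scores = {
    "TA 1": [78, 85, 90],
    "TA 2": [70, 80, 84],
    "TA 3": [82, 88, 95],
}

group_mean = mean(m for marks in scores.values() for m in marks)

for scorer, marks in scores.items():
    drift = mean(marks) - group_mean
    flag = "  <- discuss at calibration meeting" if abs(drift) > 3 else ""
    print(f"{scorer}: mean {mean(marks):.1f} (drift {drift:+.1f}){flag}")
```

Even this crude check gives the group a starting point for discussing why one scorer reads the rubric more strictly than another.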

4.B. Oral scientific communication. Along with scientific writing, oral communication skills were identified as one of the essential learning outcomes by the American Association of Colleges and Universities (AAC&U) (National Leadership Council for Liberal Education and America's Promise, 2011). Typical activities that incorporate oral communication skills include poster presentations (Kennedy, 1985; Sisak, 1997; Marino et al., 2000; Logan et al., 2015; Widanski et al., 2020) and oral presentations (Meyer, 2003; Applebee et al., 2018). Here, we present two alternatives that hone oral scientific communication: oral interviews (Burrows et al., 2021) and video-style assignments (Berns, 2019).
Motivation. Traditional laboratory reports do not help students develop oral communication skills, nor do they provide immediate feedback. They also fall short in capturing students' true understanding (Crawford and Kloepper, 2019) and in stimulating scientific discussion.

In contrast, oral examinations provide an opportunity to thoroughly probe student understanding through leading questions to facilitate student–faculty discussion (Ramella, 2019). This type of assessment is common in graduate school programs and in some undergraduate programs in European nations (Ramlo et al., 2024), but not as common in undergraduate courses in North America (Dicks et al., 2012).

Scientific communication can also be promoted through video-style assignments described below. This idea is also found in health education (Wallace and VanderMolen, 2019), and a framework to create digital video as a personalized and active learning assignment has been described (Campbell and Cox, 2018).


Method 1: oral laboratory reports. Burrows et al. reported using oral assessments, called “lab interviews”, to replace written lab reports in a 3rd year biochemistry laboratory course (Burrows et al., 2021). The oral lab reports resemble typical written lab reports, in which students are questioned on experimental techniques and data analysis. Readers are encouraged to consult other examples of oral lab reports referenced therein (Roecker, 2007; Dicks et al., 2012; Goodman, 2020; Gardner and Giordano, 2023; Salmon et al., 2024).

In Burrows and coworkers' example, the course starts with four weeks introducing fundamental laboratory techniques and practices. Subsequently, eight weeks are spent on a mini-project, followed by a final practical lab focused on technique assessment. No written lab reports are required from students. Instead, a total of eight 30-minute oral lab reports are scheduled with the professor throughout the semester. There are typically 2 lab sections, each with 8–12 students enrolled in the chemistry major. The oral lab report questions target both lower-order and higher-order cognitive skills, and they are evaluated using a rubric.


Expected outcomes and advantages. Face-to-face examinations can motivate students to process material at an advanced level. In comparison to written lab reports, students in an oral lab report would not be able to “Google” or refer to textbooks for answers. Similarly, oral examinations would minimize academic dishonesty issues due to the use of AI in written communications. One student pointed out that writing lab reports required students to “just answer questions, look up on the internet, or read articles to try and find the answers rather than think”. In addition, when written lab reports are returned, students admit that they focus on the grade rather than the feedback. In comparison, receiving instant feedback in the oral lab report helped students retain the knowledge better because they could focus on the feedback rather than the grade.

Oral lab reports also allow questions to be scaffolded to suit individual learner levels, helping students move from novice to expert problem solvers. Students indicated that the informal format and instant feedback were crucial to their knowledge development. As a result, the structure of the oral lab report promotes deeper learning and better preparation for both the lab and the oral interview. Like the exit interviews described in Section 1, oral lab reports should be effective at linking the cognitive and psychomotor domains of learning.


Challenges and suggestions. Students initially felt stressed and intimidated about revealing their knowledge gaps to the professor/instructor. However, with several opportunities to practice oral scientific communication throughout the semester, students’ anxiety levels should decrease over time, as noted in Section 1 where we discussed exit interviews (Crawford and Kloepper, 2019).

Administering oral lab reports demands substantial instructor time: 2 sections × (8–12 students) × 30 min = 8–12 hours each week. The authors noted that this style of assessment works with small class sizes, while implementing similar assessments in large laboratory sections with high student-to-teacher ratios is challenging.

Developing a rubric for the oral exams can also be challenging. The authors used a criterion-based format with an analytical rubric to evaluate student responses during the interview; a sample rubric is provided in the supporting information of their article. As an alternative to rubrics, we suggest instructors consider a competency-based or specifications-based grading scheme.

In addition, the authors advocate for a mix of oral and written assessments, as both skill sets are important in scientific communication. In Burrows et al.'s example, no written lab reports were required, but students had already practiced written communication in prior courses.


Method 2: video-recorded lab assignments. Berns reported a video assignment in a first-year general chemistry course (Berns, 2019). Students recorded 3 × 3 min video diaries of a crystal growth experiment, and graders reviewed the videos using a detailed rubric and answer key.

Notably, this experiment, which is largely done at home, was developed and carried out pre-COVID rather than as a response to the pandemic.

The framework of three video installments mirrors a lab report: students are asked to discuss the synthesis, product analysis, and experimental errors. In the last video, students are asked to independently read a document about X-ray diffraction and analyze a provided data set for a crystal sample.


Expected outcomes and advantages. Compared to lab report writing, this assessment technique does not require formal writing; demanding formal written scientific communication within the same task has been suggested to hinder scientific thinking (Holliday et al., 1994; Berns, 2019). In this assessment, students therefore need not stress about formatting and can focus on organizing their thoughts and analyzing their data. Often, students do not reach this level until their 3rd or 4th year of undergraduate studies. Students should also be able to link the cognitive and psychomotor domains of learning more effectively with this method (see Section 1 above).

In addition, recording a video allows students to do multiple re-takes as necessary, which may alleviate their anxiety compared to an in-person oral lab report. Moreover, the low-stakes, informal setting allows students to express themselves with greater freedom. For example, while not required, some students incorporated their other talents into the otherwise purely scientific presentation, blurring the line between science and art: singing, rapping, or turning their responses into a detective story. This flexibility and creativity in assessment is an excellent example of universal design for learning, incorporating interdisciplinary approaches from the arts and emphasizing the affective domain of learning.

In comparison to the oral assessment methods above, this is likely the least resource-intensive alternative from the instructors’ perspective: it can be graded by TAs and carried out in large-enrolment first-year introductory courses.


Challenges and suggestions. Effectively grading these video assignments requires a detailed grading rubric and answer key, which in turn relies on discussion between instructors and TAs, especially given the freedom students have in the video assignment. Sharing the rubric with students in advance will help them understand the expectations for video content, but might also limit their creativity in expressing their ideas. Instructors and TAs may find the videos hard to grade because they must process visual and oral information at the same time. The author noted that TAs and instructors held weekly meetings to review expectations for each experiment, outline the key information expected in student responses, and raise grading issues, further standardizing the process. The author also recommended providing grading examples to TAs to improve grading consistency.

This assessment format works well for long-term experiments in which very little can go wrong and the experiment can be done at home. Developing other experiments with a similar structure would be challenging for most chemistry labs (Berns, 2019); implementation requires careful selection of the experiments and their related content.

4.C. Peer learning. Peer learning is built upon social constructivist theories of learning, in which learners construct knowledge and meaning based on their experiences, and student learning is influenced by social interactions (Novak, 1993; Scott et al., 2007; Walker et al., 2011). Agustian's framework (Agustian, 2022) in fact includes social learning as a domain encompassing the cognitive, affective, and psychomotor domains.

According to the essential learning outcomes (ELOs) from the American Association of Colleges and Universities (AAC&U) (Essential Learning Outcomes, 2007), peer learning may be categorized under “teamwork and problem solving” or “teamwork skills in diverse groups” (National Leadership Council for Liberal Education and America's Promise, 2011). In addition to the AAC&U, the Committee on Professional Training of the American Chemical Society (ACS) also includes team skills in its guidelines for undergraduate program approval (ACS Committee on Professional Training, 2023; Mertz et al., 2023).

Here, we present three interesting methods to promote peer learning: team contracts and collaboration rubrics (Mertz et al., 2023), reciprocal peer teaching (Zewail-Foote and Gonzalez, 2023), and argument-driven inquiry (Walker et al., 2011; Walker and Sampson, 2013). In addition, readers are encouraged to consider other models typically applied outside of the laboratory, such as peer-led team learning (PLTL) (Woodward et al., 1993; Wilson and Varma-Nelson, 2016), process-oriented guided inquiry learning (POGIL) (Lewis and Lewis, 2005; Eberlein et al., 2008), and two-stage testing (Gilley and Clarkston, 2014; Kloepper, 2015). In particular, the PLTL model (also called the workshop model) is reported to be uniquely effective for female students, under-represented minorities (URM), and under-prepared students, as a result of URM peer leaders serving as role models (Wilson and Varma-Nelson, 2016).


Motivation. Peer learning is a less common learning objective in the laboratory, with little emphasis in the literature on how to explicitly teach and assess collaborative skills. This speaks to the misalignment of the lab curriculum: students often work in groups in the lab but are typically assessed individually. The AAC&U has created VALUE rubrics for teamwork and critical thinking, and rubrics are also provided in the supporting information of the argument-driven inquiry paper discussed below (Walker et al., 2011).

In the examples of peer learning below, researchers emphasized the impact of peer learning on the affective domain of learning. For example, peer learning helps engender a sense of community and belonging in sciences.


Method 1: team contracts and collaboration rubric. Mertz et al. explicitly teach collaboration skills using collaboration rubrics and student learning contracts (Mertz et al., 2023). Collaboration skills are scaffolded through the undergraduate biochemistry curriculum in four courses: the second semester of first-year general chemistry, physical chemistry I, biochemistry I, and biochemistry II. Of these, only general chemistry and physical chemistry I have a lab component and will be discussed here.

In the two lab courses, students completed several multi-week experiments in teams of two to three. Groups were assigned randomly for each experiment, and each student had a specific role within their group. Students completed pre- and post-lab collaboration forms: the pre-lab form prompted students to plan the weekly lab activities and group communication, while the post-lab form prompted them to reflect on their experiences with a growth mindset approach. Students assessed their own and their partners’ skills across the following categories: quality of technical work, commitment, leadership, communication, independence in data analysis, and independence in the lab space.

A specifications grading rubric (Pascal et al., 2020; Nilson, 2023) was used, whereby a minimum collaboration score is required to achieve a particular laboratory grade. The rubric scores and feedback were not shared with the students, but were used as indicators of whether faculty intervention was necessary. This collaboration grade made up 10% of the overall laboratory grade.
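To make the gating logic of such a specifications scheme concrete, consider the minimal sketch below (in Python). The grade bands and numeric thresholds are hypothetical illustrations, not values taken from Mertz et al.'s rubric; the point is only that a strong technical grade can be capped by a weak collaboration score.

# Minimal sketch of a specifications-style gate on collaboration.
# All thresholds below are hypothetical, for illustration only.

def laboratory_grade(lab_percent: float, collab_score: int) -> str:
    # Each band requires both a minimum lab percentage and a minimum
    # collaboration score (letter, min lab %, min collaboration score).
    bands = [("A", 85.0, 4), ("B", 70.0, 3), ("C", 55.0, 2), ("D", 40.0, 1)]
    for letter, min_percent, min_collab in bands:
        if lab_percent >= min_percent and collab_score >= min_collab:
            return letter
    return "F"

print(laboratory_grade(88.0, 4))  # "A": strong work and strong collaboration
print(laboratory_grade(88.0, 2))  # "C": strong work capped by weak collaboration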


Expected outcomes and advantages. The collaboration forms include free-response sections, giving students agency and voice in group work scenarios. This kind of self- and peer-evaluation alleviates time demands on instructors, while still allowing instructors to intervene in group projects when problems arise.

The scaffolded curriculum allows instructors to track collaboration skills in specific cohorts of students longitudinally across many courses, enabling educators to evaluate the overall curriculum in terms of collaboration skills, although the authors had not published such findings as of the submission of this article.

Another advantage is that this method provides several opportunities for self- and peer-evaluation throughout the term, allowing students to attend to their own growth and that of their peers over time. This shifts the focus away from grading and towards personal development, targeting the affective and social domains of learning.


Challenges and suggestions. One challenge in group work is social loafing (Karau and Williams, 1993), in which a group member does not complete, or make meaningful attempts at completing, their share of the work. The authors attributed this to a lack of proper incentives for being a good collaborator. While the rubric itself does not address this issue, the specifications grading system does, because a minimum collaboration score is required to pass the laboratory component.

Another issue in group work is how to give and receive feedback constructively. If feedback is not provided, students cannot reflect on their performance (as perceived by their peers) and improve in the future. If feedback is not constructive, it may adversely affect the recipient's self-confidence and self-efficacy. In this study, students initially struggled to properly evaluate other group members and to articulate a clear rationale for their scores. To address this, tutorials on how to give and receive feedback constructively can be created (Stone and Heen, 2014).

Another issue when giving feedback is protecting student anonymity. We suggest mitigating this by releasing aggregated feedback to each student after several group projects. In addition, instructors can provide generalized class-wide feedback, which is common in music performance groups (Bonshor, 2017; Emerson et al., 2019).


Method 2: reciprocal peer teaching. Zewail-Foote and Gonzalez (Zewail-Foote and Gonzalez, 2023) created a lab experience involving both inquiry (see Section 3) and peer learning. The authors utilized reciprocal peer teaching, where students had the opportunity to teach their peers.

This pilot course, led by two professors and two upper-year peer tutors, met weekly to work on two interrelated projects over the course of the semester. Seven first-year students were enrolled. For the first half of the course, students were divided into two teams, each assigned to one of the two projects. Faculty members guided students in designing and conducting their experiments and analyzing their results, while peer tutors helped facilitate lab skills and offered academic and social support. Midway through the semester, the two teams reported their progress to the class, including experimental protocols, accomplishments, and future directions. The teams then swapped projects for the second half of the semester, during which students were fully in charge of instructing the new team about their prior projects, with less guidance from the peer tutors and faculty members.

This method uses two types of peer learning. The first kind of peer learning is mentorship, in which upper-year students (peer tutors) mentored the first-year students in the laboratory; related literature refers to this as peer-led team learning (PLTL), where learning is led by a near-peer (Wilson and Varma-Nelson, 2016). The second kind of peer learning is reciprocal peer teaching (RPT) (Gazula et al., 2017), in which first-year students mentored each other in the second half of the course. RPT is an example of the learning-by-teaching paradigm (Roscoe and Chi, 2007; Duran, 2017).


Expected outcomes and advantages. The authors include a list of expected learning outcomes for the course (Zewail-Foote and Gonzalez, 2023). Below, we focus on the outcomes and advantages attributable to the peer learning aspects.

In the first half of the course, an experienced peer tutor facilitated and guided student learning. This mentorship approach differs from a typical lecture in that the peer tutor is responsible for the guidance, but the teaching is still one-directional and hierarchical in nature. The learning environment provided by this mentorship approach is especially beneficial to students who identify as URM, who realize larger gains than majority students (Frye et al., 2021). To further create awareness of URM representation and a sense of belonging for students, the two URM faculty members who led the course also integrated culturally relevant content by sharing their own lived experiences.

The second half of the course utilizes RPT, in which two equally positioned groups of classmates assume the roles of both teacher and learner. RPT promotes student–student engagement, peer-led teaching, and group learning; its goal is to deepen students’ understanding of the content and promote metacognitive skills. Teaching others also helps link the cognitive and psychomotor domains (see Section 1 above).

The authors also speculate that RPT enhances constructivist learning and learner autonomy, stimulating intrinsic motivation: RPT builds in student investment and accountability, reinforces collaboration between teams, and promotes a sense of community around a common goal of scientific inquiry. In other words, RPT is effective in promoting the affective and conative domains (Agustian, 2022). This was evident in the authors' observation that students were curious and excited to see how far the other team had progressed on their projects.


Challenges and suggestions. This pilot study involved seven students in two groups, each completing one of two projects. In classes with more students, it might be difficult to scale the group sizes or introduce additional interrelated projects for groups to rotate through.

Like other peer learning strategies, RPT requires a paradigm shift for students to consider themselves and their peers as knowledge providers (see Section 3 on designing experiments).

Another foreseeable challenge is grading consistency. The authors implemented a competency-based grading system centered on the development of research skills; students were also evaluated on their research progress, communication skills, and collaborative performance. The article does not detail how these items were evaluated, but we suggest self-reflection alongside self- and peer-evaluation to mitigate time demands on instructors.


Method 3: argument-driven inquiry (ADI). An argument-driven inquiry (ADI) instructional model (Walker et al., 2011; Walker and Sampson, 2013) was implemented in a small-enrollment, 1st year general chemistry lab. The ADI model is a multi-step process in which students design and conduct an experiment, formulate their own ideas from the experiment, engage in active argumentation with other students, and participate in a peer review process (Walker and Sampson, 2013). This example is readily applicable to Section 3 above (designing experiments), but we include it here to focus on the aspects related to peer learning.

We also encourage readers to refer to a similar method, the science writing and workshop template (SWWT) (Stephenson et al., 2019), in which students attend small workshop sessions in conjunction with laboratory sessions, discuss and debate with their peers, and reflect on and refine their ideas.


Expected outcomes and advantages. Qualitative and quantitative analyses of student performance show that ADI develops students' critical thinking skills. Students gain an increased ability to form scientific arguments, which requires them to analyze and evaluate data and then rationalize its use as evidence for a claim. This aligns well with teaching students “how” to do science, as opposed to just carrying out procedures (Seery et al., 2019). ADI also simulates a more realistic research environment by implementing multiple cycles of oral and written feedback from different sources, including peers and instructors. Furthermore, this method introduces students to the concept of moving beyond right and wrong to a reasoned explanation. All of these advantages relate to the epistemic domain of learning (Agustian, 2022). The authors also noted that students' arguments improved over time, in both their use of reasoning and their willingness to change claims.

Although this method involved both oral and written components, the authors found that students' oral argumentation was more robust and complex than their written arguments, because students can engage in a back-and-forth discussion with peers in real time. Similar advantages are discussed in Sections 1 and 4.B above. The presence of an audience during an argumentation session may also encourage students to develop rich, convincing arguments in response to the questions and critiques of their peers.


Challenges and suggestions. This learning model requires robust instructor/TA training to shift their role from traditional, hierarchical knowledge provider to facilitator. As with other peer learning methods, students must also come to acknowledge their peers as knowledge providers. These challenges are discussed above (see Section 3 on experimental design).

Another foreseeable challenge is the time needed to grade the reasoning/argumentation sessions. These sessions were assessed by scoring transcriptions of the argumentation using the Assessment of Scientific Argumentation in the Classroom (ASAC) observation protocol (Sampson et al., 2012), which is divided into three sections: conceptual and cognitive, epistemic, and social aspects. With proper training, TAs may alleviate the grading pressure on instructors, but the same issue of instructor/TA-to-student ratio remains.

As alternatives to formal grading and its inherent demands on instructors and TAs, we suggest competency-based or specifications-based grading, student self-reflection focused on growth over time, and upper-year peer tutors serving as facilitators.

Conclusion

Traditional expository experiments coupled with lab reports do not target all the learning outcomes for laboratory instruction described by Reid and Shah. Using these learning outcomes as a framework, this integrative review summarizes recent, innovative techniques for undergraduate laboratory course design and assessment.

Table 1 above summarizes each method discussed herein, along with its advantages and disadvantages. To address these challenges, instructors are encouraged to consider strategies that reduce demands on instructor time, including alternative grading schemes such as specifications- or competency-based grading, student self-reflection, peer evaluation, and upper-year peer tutors acting as mentors. These strategies have the added benefit of targeting the affective and social domains of learning.

We hope this work helps motivate chemistry educators and lowers the activation energy barrier to redesigning and revitalizing undergraduate laboratory instruction. We are excited to see future reports of innovative laboratory activities and assessments!

Conflicts of interest

There are no conflicts to declare.

Data availability

As this submission is an integrative review, it includes no primary research results, software, or code, and no new data were generated or analysed. Analyses are qualitative and are described in the article.

Acknowledgements

We are grateful to the McCall MacBain Postdoctoral Fellows Teaching and Leadership Program. We would also like to offer our gratitude to Drs Veronica Rodriguez Moncalvo and Jennifer Nash for useful discussions and feedback on this work.

References

  1. ACS Committee on Professional Training, (2023), ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs (Guidelines), United States of America.
  2. Adam D., (2002), Chemistry caught in crisis catalysed by student apathy, Nature, 416, 777–777 DOI:10.1038/416777b.
  3. Agustian H. Y., (2022), Considering the hexad of learning domains in the laboratory to address the overlooked aspects of chemistry education and fragmentary approach to assessment of student learning, Chem. Educ. Res. Pract., 23, 518–530 10.1039/D1RP00271F.
  4. Agustian H. Y., Finne L. T., Jørgensen J. T., Pedersen M. I., Christiansen F. V., Gammelgaard B. and Nielsen J. A., (2022), Learning outcomes of university chemistry teaching in laboratories: a systematic review of empirical literature, Rev. Educ., 10, e3360 DOI:10.1002/rev3.3360.
  5. Appendix 2: OCAV's Undergraduate and Graduate Degree Level Expectations—Ontario Universities Council on Quality Assurance [WWW Document], (2024), URL https://oucqa.ca/framework/appendix-2/ (accessed 9.18.24).
  6. Applebee M. S., Johanson A. P., Lawler-Sagarin K. A., Losey E. N. and Munro-Leighton C., (2018), The Three-Minute Slide as an Effective Tool for Developing Oral Communication Skills, J. Chem. Educ., 95, 1419–1422 DOI:10.1021/acs.jchemed.7b00649.
  7. Balster N., Pfund C., Rediske R. and Branchaw J., (2010), Entering Research: A Course That Creates Community and Structure for Beginning Undergraduate Researchers in the STEM Disciplines, CBE—Life Sci. Educ., 9, 108–118 DOI:10.1187/cbe.09-10-0073.
  8. Berns V. M., (2019), Oral Alternatives to Traditional Written Lab Reports, Communication in Chemistry, ACS Symposium Series, American Chemical Society, pp. 111–117 DOI:10.1021/bk-2019-1327.ch008.
  9. Berry D. E. and Fawkes K. L., (2010), Constructing the Components of a Lab Report Using Peer Review, J. Chem. Educ., 87, 57–61 DOI:10.1021/ed8000107.
  10. Biggs J., (1996), Enhancing teaching through constructive alignment, High. Educ., 32, 347–364 DOI:10.1007/BF00138871.
  11. Biggs J., (2014), Constructive alignment in university teaching, in Peter K. (ed.), HERDSA Review of Higher Education, Australia, pp. 5–22.
  12. Bonshor M., (2017), Conductor feedback and the amateur singer: the role of criticism and praise in building choral confidence, Res. Stud. Music Educ., 39, 139–160 DOI:10.1177/1321103X17709630.
  13. Borchardt R., Salcedo T. and Bentley M., (2019), Little intervention, big results: intentional integration of information literacy into an introductory-level biology lab course, J. Biol. Educ., 53, 450–462 DOI:10.1080/00219266.2018.1494029.
  14. Bretz S. L., (2019), Evidence for the Importance of Laboratory Courses, J. Chem. Educ., 96, 193–195 DOI:10.1021/acs.jchemed.8b00874.
  15. Buck L. B., Bretz S. L. and Towns M. H., (2008), Characterizing the level of inquiry in the undergraduate laboratory, J. Coll. Sci. Teach. 38, 52–58.
  16. Burrows N. L., Ouellet J., Joji J. and Man J., (2021), Alternative Assessment to Lab Reports: A Phenomenology Study of Undergraduate Biochemistry Students’ Perceptions of Interview Assessment, J. Chem. Educ., 98, 1518–1528 DOI:10.1021/acs.jchemed.1c00150.
  17. Campbell L. O. and Cox T. D., (2018), Digital Video as a Personalized Learning Assignment: A Qualitative Study of Student Authored Video Using the ICSDR Model, J. Scholarsh. Teach. Learn., 18, 11–24.
  18. Canadian Society of Chemistry, (2019), CSC Accreditation Guidelines (Guidelines), Canada: Chemical Institute of Canada.
  19. Chen H.-J., She J.-L., Chou C.-C., Tsai Y.-M. and Chiu M.-H., (2013), Development and Application of a Scoring Rubric for Evaluating Students’ Experimental Skills in Organic Chemistry: An Instructional Guide for Teaching Assistants, J. Chem. Educ., 90, 1296–1302 DOI:10.1021/ed101111g.
  20. Chitra E., Ramamurthy S., Mohamed S. M. and Nadarajah V. D., (2022), Study of the impact of objective structured laboratory examination to evaluate students’ practical competencies, J. Biol. Educ. 56, 560–569 DOI:10.1080/00219266.2020.1858931.
  21. Chopra I., O’Connor J., Pancho R., Chrzanowski M. and Sandi-Urena S., (2017), Reform in a general chemistry laboratory: how do students experience change in the instructional approach? Chem. Educ. Res. Pract., 18, 113–126 10.1039/C6RP00082G.
  22. Clark T. M., Ricciardo R. and Weaver T., (2016), Transitioning from Expository Laboratory Experiments to Course-Based Undergraduate Research in General Chemistry, J. Chem. Educ., 93, 56–63 DOI:10.1021/acs.jchemed.5b00371.
  23. Connor M. C., Pratt J. M. and Raker J. R., (2022), Goals for the Undergraduate Instructional Inorganic Chemistry Laboratory When Course-Based Undergraduate Research Experiences Are Implemented: A National Survey, J. Chem. Educ., 99, 4068–4078 DOI:10.1021/acs.jchemed.2c00267.
  24. Crawford G. L. and Kloepper K. D., (2019), Exit Interviews: Laboratory Assessment Incorporating Written and Oral Communication, J. Chem. Educ., 96, 880–887 DOI:10.1021/acs.jchemed.8b00950.
  25. Creating a CARD Lesson Plan, (2020), Fac. Learn. Hub. URL https://tlconestoga.ca/creating-a-card-lesson-plan/ (accessed 9.9.24).
  26. Deiner L. J., Newsome D. and Samaroo D., (2012), Directed Self-Inquiry: A Scaffold for Teaching Laboratory Report Writing, J. Chem. Educ., 89, 1511–1514 DOI:10.1021/ed300169g.
  27. Dicks A. P., Lautens M., Koroluk K. J. and Skonieczny S., (2012), Undergraduate Oral Examinations in a University Organic Chemistry Curriculum, J. Chem. Educ., 89, 1506–1510 DOI:10.1021/ed200782c.
  28. Ding N. and Harskamp E. G., (2011), Collaboration and Peer Tutoring in Chemistry Laboratory Education, Int. J. Sci. Educ., 33, 839–863 DOI:10.1080/09500693.2010.498842.
  29. Domin D. S., (1999), A Review of Laboratory Instruction Styles, J. Chem. Educ., 76, 543 DOI:10.1021/ed076p543.
  30. Duran D., (2017), Learning-by-teaching. Evidence and implications as a pedagogical mechanism, Innov. Educ. Teach. Int., 54, 476–484 DOI:10.1080/14703297.2016.1156011.
  31. Eberlein T., Kampmeier J., Minderhout V., Moog R. S., Platt T., Varma-Nelson P. and White H. B., (2008), Pedagogies of engagement in science: a comparison of PBL, POGIL, and PLTL, Biochem. Mol. Biol. Educ., 36, 262–273 DOI:10.1002/bmb.20204.
  32. Emerson K., Williamson V. and Wilkinson R., (2019), Once more, with feeling: Conductors’ use of assessments and directives to provide feedback in choir rehearsals, Music. Sci., 23, 362–382 DOI:10.1177/1029864919844810.
  33. Essential Learning Outcomes [WWW Document], (2007), AAC&U. URL https://www.aacu.org/trending-topics/essential-learning-outcomes (accessed 7.9.24).
  34. Estrada M., Woodcock A., Hernandez P. R. and Schultz P. W., (2011), Toward a model of social influence that explains minority student integration into the scientific community, J. Educ. Psychol., 103, 206–222 DOI:10.1037/a0020743.
  35. Find degrees accredited by the RSC [WWW Document], (2024), R. Soc. Chem. URL https://www.rsc.org/membership-and-community/degree-accreditation/find-accredited-courses/ (accessed 8.6.24).
  36. Finster D. C., (1991), Developmental instruction: Part II. Application of the Perry model to general chemistry, J. Chem. Educ., 68, 752 DOI:10.1021/ed068p752.
  37. Frye R. D., Barone M. C., Hammond N. B., Eloi-Evans S., Trenshaw K. F. and Raucci M., (2021), Incentives and Barriers to Participation in PLTL Workshop Spaces: an Exploration of Underrepresented Students’ Experiences, J. Women Minor. Sci. Eng., 27, 1–31 DOI:10.1615/JWomenMinorScienEng.2021029908.
  38. Gardner D. E. and Giordano A. N., (2023), The Challenges and Value of Undergraduate Oral Exams in the Physical Chemistry Classroom: A Useful Tool in the Assessment Toolbox, J. Chem. Educ., 100, 1705–1709 DOI:10.1021/acs.jchemed.3c00011.
  39. Gazula S., McKenna L., Cooper S. and Paliadelis P., (2017), A Systematic Review of Reciprocal Peer Tutoring within Tertiary Health Profession Educational Programs, Health Prof. Educ., 3, 64–78 DOI:10.1016/j.hpe.2016.12.001.
  40. Gericke N., Högström P. and Wallin J., (2023), A systematic review of research on laboratory work in secondary school, Stud. Sci. Educ., 59, 245–285 DOI:10.1080/03057267.2022.2090125.
  41. Gilley B. H. and Clarkston B., (2014), Collaborative Testing: Evidence of Learning in a Controlled In-Class Study of Undergraduate Students, J. Coll. Sci. Teach., 43, 83–91 DOI:10.2505/4/jcst14_043_03_83.
  42. González J. and Wagenaar R., (2003), Tuning Educational Structures in Europe, Socrates Programme, European Commission.
  43. Goodman A. L., (2020), Can Group Oral Exams and Team Assignments Help Create a Supportive Student Community in a Biochemistry Course for Nonmajors? J. Chem. Educ., 97, 3441–3445 DOI:10.1021/acs.jchemed.0c00815.
  44. Gullikson S., (2006), Faculty Perceptions of ACRL's Information Literacy Competency Standards for Higher Education, J. Acad. Librariansh., 32, 583–592 DOI:10.1016/j.acalib.2006.06.001.
  45. Henderson C., Fisher K. Q. and Beach A., (2018), Change in Higher Education, Researching and Enacting Change in Postsecondary Education, Routledge.
  46. Holliday W. G., Yore L. D. and Alvermann D. E., (1994), The reading–science learning–writing connection: breakthroughs, barriers, and promises, J. Res. Sci. Teach., 31, 877–893 DOI:10.1002/tea.3660310905.
  47. Hounsell D. and Hounsell J., (2007), Teaching-learning environments in contemporary mass higher education, Variations in Student Learning and Perceptions of Academic Quality, BJEP Monograph Series II. British Psychological Society, pp. 91–111.
  48. Hu K., Ma R.-J., Ma C., Zheng Q.-K. and Sun Z.-G., (2022), Comparison of the BOPPPS model and traditional instructional approaches in thoracic surgery education, BMC Med. Educ., 22, 447 DOI:10.1186/s12909-022-03526-0.
  49. Japan University Accreditation Association, (2018), University Accreditation Handbook, Japan: Japan University Accreditation Association.
  50. Kapur M., (2016), Examining Productive Failure, Productive Success, Unproductive Failure, and Unproductive Success in Learning, Educ. Psychol., 51, 289–299 DOI:10.1080/00461520.2016.1155457.
  51. Karau S. J. and Williams K. D., (1993), Social loafing: a meta-analytic review and theoretical integration, J. Pers. Soc. Psychol., 65, 681–706 DOI:10.1037/0022-3514.65.4.681.
  52. Kennedy J. H., (1985), Poster presentations for evaluating laboratory coursework, J. Chem. Educ., 62, 1104 DOI:10.1021/ed062p1104.
  53. Kirschner P. A. and Meester M. A. M., (1988), The laboratory in higher science education: problems, premises and objectives, High. Educ., 17, 81–98 DOI:10.1007/BF00130901.
  54. Kirton S. B., Al-Ahmad A. and Fergus S., (2014), Using Structured Chemistry Examinations (SChemEs) As an Assessment Method To Improve Undergraduate Students’ Generic, Practical, and Laboratory-Based Skills, J. Chem. Educ., 91, 648–654 DOI:10.1021/ed300491c.
  55. Kloepper K. D., (2015), Stoplight Quizzes: A Multilevel Assessment Strategy for Lecture and Laboratory Courses, J. Chem. Educ., 92, 509–511 DOI:10.1021/ed500512u.
  56. Seery M. K., Agustian H. Y., Christiansen F. V., Gammelgaard B. and Malm R. H., (2024), 10 Guiding principles for learning in the laboratory, Chem. Educ. Res. Pract., 25, 383–402 10.1039/D3RP00245D.
  57. Kuh G. D., (2008), High-Impact Educational Practices – What they are, who has access to them, and why they matter, Association of American Colleges and Universities.
  58. Lazonder A. W. and Harmsen R., (2016), Meta-Analysis of Inquiry-Based Learning: Effects of Guidance, Rev. Educ. Res., 86, 681–718 DOI:10.3102/0034654315627366.
  59. Lewis S. E. and Lewis J. E., (2005), Departing from Lectures: An Evaluation of a Peer-Led Guided Inquiry Alternative, J. Chem. Educ., 82, 135 DOI:10.1021/ed082p135.
  60. Logan J. L., Quiñones R., Sunderland D. P., (2015), Poster Presentations: Turning a Lab of the Week into a Culminating Experience, J. Chem. Educ., 92, 96–101 DOI:10.1021/ed400695x.
  61. Lohfeld L., Neville A. and Norman G., (2005), PBL in Undergraduate Medical Education: A Qualitative Study of the Views of Canadian Residents, Adv. Health Sci. Educ., 10, 189–214 DOI:10.1007/s10459-005-1293-9.
  62. Marino R., Clarkson S., Mills P. A., Sweeney W. V. and DeMeo S., (2000), Using Poster Sessions as an Alternative to Written Examination—The Poster Exam, J. Chem. Educ., 77, 1158 DOI:10.1021/ed077p1158.
  63. Matlin S. A., Mehta G., Hopf H. and Krief A., (2016), One-world chemistry and systems thinking, Nat. Chem., 8, 393–398 DOI:10.1038/nchem.2498.
  64. Matthews M. R., (2018), History, Philosophy and Science Teaching – New Perspectives, Science: Philosophy, History and Education, Switzerland: Springer.
  65. McDonnell C., O’Connor C. and Seery M. K., (2007), Developing practical chemistry skills by means of student-driven problem based learning mini-projects, Chem. Educ. Res. Pract., 8, 130–139 10.1039/B6RP90026G.
  66. Mertz P. S., Sherrer S. M. and Bowers G. M., (2023), Teaching and assessing undergraduate collaboration skills scaffolded through the biochemistry curriculum using collaboration rubrics and student learning contracts, Biochem. Mol. Biol. Educ., 51, 499–507 DOI:10.1002/bmb.21760.
  67. Meyer G. M., (2003), Scientific Communication for Chemistry Majors: A New Course, J. Chem. Educ., 80, 1174 DOI:10.1021/ed080p1174.
  68. Nagarajan S. and Overton T., (2019), Promoting Systems Thinking Using Project- and Problem-Based Learning, J. Chem. Educ., 96, 2901–2909 DOI:10.1021/acs.jchemed.9b00358.
  69. Nägele C. and Stalder B. E., (2017), Competence and the Need for Transferable Skills, in Mulder M. (ed.), Competence-Based Vocational and Professional Education: Bridging the Worlds of Work and Education, Cham: Springer International Publishing, pp. 739–753 DOI:10.1007/978-3-319-41713-4_34.
  70. National Leadership Council for Liberal Education and America's Promise, (2011), The LEAP Vision for Learning: Outcomes, Practices, Impact, and Employers’ Views, Washington, DC: Association of American Colleges and Universities.
  71. Neville A., Norman G. and White R., (2019), McMaster at 50: lessons learned from five decades of PBL, Adv. Health Sci. Educ., 24, 853–863 DOI:10.1007/s10459-019-09908-2.
  72. Nikolic S., Suesse T. F., Grundy S., Haque R., Lyden S., Hassan G. M., Daniel S., Belkina M. and Lal S., (2024), Laboratory learning objectives: ranking objectives across the cognitive, psychomotor and affective domains within engineering, Eur. J. Eng. Educ., 49, 454–473 DOI:10.1080/03043797.2023.2248042.
  73. Nilson L. B., (2023), Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time, Routledge, New York DOI:10.4324/9781003447061.
  74. Novak J. D., (1993), Human constructivism: a unification of psychological and epistemological phenomena in meaning making, Int. J. Pers. Constr. Psychol., 6, 167–193 DOI:10.1080/08936039308404338.
  75. Novak J. D., (2009), Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and Corporations, 2nd edn, New York: Routledge DOI:10.4324/9780203862001.
  76. Novak J. D. and Gowin D. B., (1984), Learning How to Learn, Cambridge: Cambridge University Press DOI:10.1017/CBO9781139173469.
  77. Pascal J., Vogel T. J. and Wagstrom K., (2020), Grading by Competency and Specifications: Giving Better Feedback and Saving Time. Presented at the 2020 ASEE Virtual Annual Conference Content Access.
  78. Provost J. J., (2022), Developing Course Undergraduate Research Experiences (CUREs) in Chemistry, J. Chem. Educ., 99, 3842–3848 DOI:10.1021/acs.jchemed.2c00390.
  79. Pyke S., O’Brien G., Yates B. and Buntine M., (2021), Chemistry Academic Standards Statement, 2nd edn, Melbourne, Australia: The Royal Australian Chemical Institute.
  80. Quattrucci J. G., (2018), Problem-Based Approach to Teaching Advanced Chemistry Laboratories and Developing Students’ Critical Thinking Skills, J. Chem. Educ., 95, 259–266 DOI:10.1021/acs.jchemed.7b00558.
  81. Ramella D., (2019), Oral Exams: A Deeply Neglected Tool for Formative Assessment in Chemistry, Active Learning in General Chemistry: Specific Interventions, ACS Symposium Series, American Chemical Society, pp. 79–89 DOI:10.1021/bk-2019-1340.ch006.
  82. Ramlo S., Salmon C. and Xue Y., (2024), Chemistry Students’ Views of Taking an Oral Exam, J. Coll. Sci. Teach., 1–7 DOI:10.1080/0047231X.2024.2398387.
  83. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8, 172–185 10.1039/B5RP90026C.
  84. Roecker L., (2007), Using Oral Examination as a Technique To Assess Student Understanding and Teaching Effectiveness, J. Chem. Educ., 84, 1663 DOI:10.1021/ed084p1663.
  85. Rojas A. J., (2024), An Investigation into ChatGPT's Application for a Scientific Writing Assignment, J. Chem. Educ., 101, 1959–1965 DOI:10.1021/acs.jchemed.4c00034.
  86. Roscoe R. D. and Chi M. T. H., (2007), Understanding Tutor Learning: Knowledge-Building and Knowledge-Telling in Peer Tutors’ Explanations and Questions, Rev. Educ. Res., 77, 534–574 DOI:10.3102/0034654307309920.
  87. Ruff E. F., Engen M. A., Franz J. L., Mauser J. F., West J. K. and Zemke J. M. O., (2024), ChatGPT Writing Assistance and Evaluation Assignments Across the Chemistry Curriculum, J. Chem. Educ., 101, 2483–2492 DOI:10.1021/acs.jchemed.4c00248.
  88. Salmon C. R., Bonvallet P. A., Xue Y. and Ramlo S. E., (2024), Oral Examinations of Structure and Function in General and Organic Chemistry Courses, J. Chem. Educ., 101, 921–929 DOI:10.1021/acs.jchemed.3c00749.
  89. Sampson V., Enderle P. J. and Walker J. P., (2012), The Development and Validation of the Assessment of Scientific Argumentation in the Classroom (ASAC) Observation Protocol: A Tool for Evaluating How Students Participate in Scientific Argumentation, in Khine M. S. (ed.), Perspectives on Scientific Argumentation: Theory, Practice and Research, Springer Netherlands, Dordrecht, pp. 235–264 DOI:10.1007/978-94-007-2470-9_12.
  90. Sandi-Urena S., Cooper M. M. and Gatlin T. A., (2011), Graduate teaching assistants’ epistemological and metacognitive development, Chem. Educ. Res. Pract., 12, 92–100 10.1039/C1RP90012A.
  91. Sandi-Urena S. and Gatlin T. A., (2012), Experimental Chemistry Teaching: Understanding Teaching Assistants’ Experience in the Academic Laboratory, Educ. Quím., 23, 141–148.
  92. Scott P., Asoko H. and Leach J., (2007), Student Conceptions and Conceptual Learning in Science, in Abell S. K. and Lederman N. (ed.), Handbook of Research on Science Education, New York: Taylor & Francis Group, pp. 31–56.
  93. Seen A. J., (2025), Ensuring We Assess What We Value in Laboratory Work, J. Chem. Educ., 102, 2172–2176 DOI:10.1021/acs.jchemed.4c01407.
  94. Seery M. K., (2020), Establishing the Laboratory as the Place to Learn How to Do Chemistry, J. Chem. Educ., 97, 1511–1514 DOI:10.1021/acs.jchemed.9b00764.
  95. Seery M. K., Agustian H. Y. and Zhang X., (2019), A Framework for Learning in the Chemistry Laboratory, Isr. J. Chem., 59, 546–553 DOI:10.1002/ijch.201800093.
  96. Shultz G. V. and Li Y., (2016), Student Development of Information Literacy Skills during Problem-Based Organic Chemistry Laboratory Experiments, J. Chem. Educ., 93, 413–422 DOI:10.1021/acs.jchemed.5b00523.
  97. Shultz G. V. and Zemke J. M., (2019), “I Wanna Just Google It and Find the Answer”: Student Information Searching in a Problem-Based Inorganic Chemistry Laboratory Experiment, J. Chem. Educ., 96, 618–628 DOI:10.1021/acs.jchemed.8b00821.
  98. Sisak M. E., (1997), Poster Sessions as a Learning Technique, J. Chem. Educ., 74, 1065 DOI:10.1021/ed074p1065.2.
  99. Stephenson N. S., Miller I. R. and Sadler-McKnight N. P., (2019), Impact of Peer-Led Team Learning and the Science Writing and Workshop Template on the Critical Thinking Skills of First-Year Chemistry Students, J. Chem. Educ., 96, 841–849 DOI:10.1021/acs.jchemed.8b00836.
  100. Stone D. and Heen S., (2014), Thanks for the Feedback: The Science and Art of Receiving Feedback Well, Penguin Books Limited.
  101. The CARL framework of reflection [WWW Document], (2018), Univ. Edinb. URL https://www.ed.ac.uk/reflection/reflectors-toolkit/reflecting-on-experience/carl (accessed 9.9.24).
  102. Thomas A. C., Boucher M. A. and Pulliam C. R., (2015), Qualitative to Quantitative and Spectrum to Report: An Instrument-Focused Research Methods Course for First-Year Students, J. Chem. Educ., 92, 439–443 DOI:10.1021/ed5007019.
  103. Townsley M. and Schmid D., (2020), Alternative grading practices: an entry point for faculty in competency-based education, J. Comp. Educ., 5, e01219 DOI:10.1002/cbe2.1219.
  104. VALUE Rubrics [WWW Document], (2007), AAC&U. URL https://www.aacu.org/initiatives/value-initiative/value-rubrics (accessed 9.20.24).
  105. Veale C. G. L., Jeena V. and Sithebe S., (2020), Prioritizing the Development of Experimental Skills and Scientific Reasoning: A Model for Authentic Evaluation of Laboratory Performance in Large Organic Chemistry Classes, J. Chem. Educ., 97, 675–680 DOI:10.1021/acs.jchemed.9b00703.
  106. Walker J. P. and Sampson V., (2013), Learning to Argue and Arguing to Learn: Argument-Driven Inquiry as a Way to Help Undergraduate Chemistry Students Learn How to Construct Arguments and Engage in Argumentation During a Laboratory Course, J. Res. Sci. Teach., 50, 561–596 DOI:10.1002/tea.21082.
  107. Walker J. P., Sampson V. and Zimmerman C. O., (2011), Argument-Driven Inquiry: An Introduction to a New Instructional Model for Use in Undergraduate Chemistry Labs, J. Chem. Educ., 88, 1048–1056 DOI:10.1021/ed100622h.
  108. Wallace H. and VanderMolen J., (2019), Teaching Health Education Through the Development of Student Centered Video Assignment, Front. Public Health, 7, 312 DOI:10.3389/fpubh.2019.00312.
  109. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2015), Teaching assistants’ perceptions of a training to support an inquiry-based general chemistry laboratory course, Chem. Educ. Res. Pract., 16, 824–842 10.1039/C5RP00104H.
  110. Widanski B., Thompson J. A., Foran-Mulcahy K., (2020), Improving Students’ Oral Scientific Communication Skills through Targeted Instruction in Organic Chemistry Lab, J. Chem. Educ., 97, 3603–3608 DOI:10.1021/acs.jchemed.9b01190.
  111. Wilson S. B. and Varma-Nelson P., (2016), Small Groups, Significant Impact: A Review of Peer-Led Team Learning Research with Implications for STEM Education Researchers and Faculty, J. Chem. Educ., 93, 1686–1702 DOI:10.1021/acs.jchemed.5b00862.
  112. Winberg T. M. and Berg C. A. R., (2007), Students’ cognitive focus during a chemistry laboratory exercise: effects of a computer-simulated prelab, J. Res. Sci. Teach., 44, 1108–1133 DOI:10.1002/tea.20217.
  113. Woodward A. E., Weiner M. and Gosser D., (1993), Problem solving workshops in general chemistry, J. Chem. Educ., 70, 651 DOI:10.1021/ed070p651.1.
  114. Yuriev E., Wink D. J. and Holme T. A., (2023), Virtual Special Issue Call for Papers: Investigating the Uses and Impacts of Generative Artificial Intelligence in Chemistry Education, J. Chem. Educ., 100, 3168–3170 DOI:10.1021/acs.jchemed.3c00829.
  115. Zewail-Foote M. and Gonzalez M., (2023), Crisscrossing Learning Experiences in an Undergraduate Research-Based Laboratory Course to Promote Reciprocal Peer Learning, J. Chem. Educ., 100, 1092–1099 DOI:10.1021/acs.jchemed.2c00341.

This journal is © The Royal Society of Chemistry 2025