Revisiting faculty members’ goals for the undergraduate chemistry laboratory

Megan C. Connor a, Guizella A. Rocabado b and Jeffrey R. Raker *a
aDepartment of Chemistry, University of South Florida, 4202 East Fowler Avenue, CHE 205, Tampa, Florida 33620, USA. E-mail: jraker@usf.edu
bDepartment of Physical Science, Southern Utah University, 351 West University Boulevard, SC013B, Cedar City, Utah 84720, USA

Received 19th July 2022 , Accepted 7th October 2022

First published on 18th October 2022


Abstract

Over a decade has passed since faculty members’ goals for the undergraduate chemistry instructional laboratory were first investigated on a large, national scale in the United States. This study revisits these goals, using data from a 2022 national survey of chemistry faculty members in the United States (n = 521) to investigate current goals, including how they vary with course, institutional context, and receipt of funding for improving undergraduate chemistry courses. A modified version of the Faculty Goals for Undergraduate Chemistry Laboratory Instrument was used to measure goals, with psychometric evidence providing support for its use across the studied contexts, with the exception of the general chemistry laboratory. Goals were associated with course and receipt of funding but not institution type, both with regard to institutions’ highest chemistry degree awarded and approval from the American Chemical Society to award certified bachelor's chemistry degrees. Results suggest that faculty members may adopt, in the general chemistry laboratory, a distinct set of goals not immediately associated with the practice of chemistry. Further, goals increasingly focus on providing research experience and cultivating disciplinary knowledge and skills with progression through the chemistry curriculum; this focus increases more abruptly when moving from large-enrollment lower-level courses to small-enrollment upper-level courses. Findings imply a need for increased efforts focused on (1) evaluating goals for the general chemistry laboratory, including whether those goals contribute to overarching curricular objectives, (2) promoting adoption of evidence-based pedagogies in large-enrollment contexts to better align instruction with the practice of chemistry, (3) supporting faculty members in procuring funding to improve courses, and (4) refining professional societies’ evaluation criteria for undergraduate chemistry programs.


Introduction

The instructional laboratory features prominently in undergraduate chemistry education, with leading professional societies requiring students in approved or accredited programs to have several hundred hours of laboratory course work (Committee on Professional Training, 2015; Royal Society of Chemistry, 2022). Chemistry educators have long considered the instructional laboratory an indispensable component of the curriculum, deeming hands-on laboratory experiences necessary for learning within the discipline (Bretz, 2019). However, there is minimal research investigating the effects of traditional laboratory instruction on student learning (Hofstein and Mamlok-Naaman, 2007). Further, laboratory pedagogies that provide evidence of student learning have not been widely adopted by chemistry educators (Bretz, 2019; Connor et al., 2022; Raker et al., 2022). The instructional laboratory nonetheless has the potential to promote learning, in particular through fostering an understanding of disciplinary practices (Lave and Wenger, 1991; National Research Council, 2012). However, limited adoption of evidence-based instructional laboratory curricula, combined with the labor-intensive and costly nature of such courses, threatens continued use of this instructional format (Sansom and Walker, 2019). In response, educators have called for evidence of students’ learning in laboratory courses (Bretz, 2019) and the adoption of evidence-based laboratory pedagogies (Sansom and Walker, 2019) across the community.

Faculty members’ goals for the instructional laboratory are diverse, historically involving the cultivation of students’ abilities across four broad categories: practical skills (e.g., collecting data via techniques used by practicing chemists), scientific skills (e.g., designing experiments to investigate chemical phenomena), general skills (e.g., working in a team), and skills associated with learning chemistry (e.g., illustrating concepts) (Reid and Shah, 2007). However, there are numerous challenges to achieving these goals, all of which point toward a possible need to reform undergraduate chemistry laboratory education (Seery, 2020). As Seery (2020) notes, the goal of teaching students practical skills is often undermined by a curriculum that uses laboratory reports as the primary mode of assessment. Further, teaching students scientific skills is made difficult by the limited adoption of evidence-based pedagogies (Seery, 2020). General skills and skills associated with learning chemistry are then not specific to the instructional laboratory (Seery, 2020), with the latter potentially being taught less efficiently in this format (Woolnough and Allsop, 1985). These challenges are summarized in Seery's (2020) call for the undergraduate chemistry laboratory to adopt one ultimate pedagogical goal: to serve as “the place where students learn how to do chemistry.”

Emerging trends in laboratory instruction are starting to address calls for the adoption of evidence-based instructional laboratory curricula that support students’ learning of disciplinary practices. For instance, digital badges and other competency-based models are being used to assess students’ proficiency with laboratory techniques (Seery et al., 2017; Pullen et al., 2018). The Journal of Chemical Education has also announced an upcoming special issue focused on inquiry-based teaching in the chemistry laboratory (Grushow et al., 2021). This special issue reflects the growing number of educators committed to incorporating inquiry-based curricula into their instructional laboratory courses (Walker et al., 2011; Pagano et al., 2018; Christianson and Waters, 2021; Hanson and Stultz, 2021; Rattanakit, 2021; Roller et al., 2021). While degrees of inquiry vary among these curricula, all involve engaging students in the scientific discovery process (Weaver et al., 2008). Process-oriented guided-inquiry learning (POGIL), for example, serves as one prominent inquiry-based curriculum being incorporated into the chemistry instructional laboratory curriculum (Hunnicutt et al., 2014). Chemistry educators are also increasingly using research-based laboratory curricula, an extension of inquiry-based curricula (Weaver et al., 2008). Course-based undergraduate research experiences (CUREs) are one such pedagogy, where students engage in experimental design, collection and analysis of novel data, and production of broadly relevant results within the instructional laboratory (Pagano et al., 2018; Waterman and Heemstra, 2018; Allen et al., 2021b; Hanson and Stultz, 2021; Smythers et al., 2021; Connor et al., 2022). Importantly, however, while evidence-based laboratory curricula are becoming increasingly prevalent, their adoption is not yet widespread (Bretz, 2019; Connor et al., 2022; Raker et al., 2022).

Supporting widespread adoption of evidence-based instructional laboratory curricula requires an understanding of chemistry faculty members’ current goals for the instructional laboratory. Awareness of such goals is essential for developers and disseminators of evidence-based curricula. While the four broad categories of goals identified by Reid and Shah (2007) provide useful insight into the aims adopted by faculty members, emerging trends in laboratory instruction suggest the intended role of the laboratory may be evolving. Insight into current goals for the instructional laboratory, in particular how they vary across course and institutional contexts, allows for more targeted dissemination efforts in which faculty members are supported in adopting the evidence-based curriculum that best aligns with their distinct goals. Understanding current goals is also necessary to support faculty members in making the instructional laboratory “the place where students learn how to do chemistry” (Seery, 2020), as educators may need to abandon goals which do not contribute to this overarching aim.

A survey in 2009 focused on goals for the instructional laboratory among chemistry faculty members within the United States, providing a foundation for investigating their current goals (Bruck and Towns, 2013). The investigation involved developing and administering an instrument to measure faculty members’ goals for the instructional laboratory with regard to research experience, transferable scientific skills, and discipline-specific skills. The study afforded insight into goals associated with particular laboratory courses in the curriculum, with findings suggesting that organic and upper-level chemistry courses aimed to provide students with research experiences and discipline-specific skills, though instructors of general chemistry did not espouse such goals. Receipt of funding to improve undergraduate chemistry laboratory courses was also found to be associated with greater emphasis on goals relating to research experience and discipline-specific skills (Bruck and Towns, 2013), suggesting that funding may support the alignment of laboratory courses with the practice of chemistry. Further, goals were unassociated with institution type, suggesting the instructional laboratory may play a uniform role across institutional contexts. However, since this work took place more than a decade ago (i.e., in 2009), the persistence of these associations (or lack thereof) is uncertain.

Undergraduate chemistry laboratory education in context

Leading chemistry professional societies provide guidelines for undergraduate programs. These guidelines, in turn, provide a context for understanding both overarching objectives of the instructional laboratory curriculum and the specific activities which contribute to these objectives. Collectively, guidelines suggest that the key objectives of undergraduate chemistry education are to equip students with the knowledge and skills necessary for being effective, practicing chemists (Committee on Professional Training, 2015; Royal Society of Chemistry, 2022). They further emphasize that the instructional laboratory serves as a primary context in which many necessary skills are cultivated.

The American Chemical Society (ACS) requires that approved programs “emphasize the laboratory experience and the development of professional skills needed to be an effective chemist” (Committee on Professional Training, 2015). The Society further describes the instructional laboratory as an ideal context in which students cultivate skills such as using chemical instrumentation to solve problems, communicating technical information both orally and through writing, and solving problems in multidisciplinary teams, all of which ACS identifies as necessary abilities for functioning as a practicing chemist. The Society identifies numerous essential skills that necessitate hands-on experience in the instructional laboratory in order to develop proficiency (Committee on Professional Training, 2020); these skills include working with glassware and balances, working with real samples, handling solutions (e.g., preparing, pipetting, and diluting), making calibration curves, using chemical instrumentation, conducting biochemical assays, synthesizing and preparing different classes of molecules, conducting air-free manipulations, purifying and preparing samples for measurement via instrumentation, and safely handling chemicals, among others.

Similarly, the Royal Society of Chemistry (RSC) requires that their accredited undergraduate programs provide students with “essential skills for applying chemical knowledge and solving complex scientific problems with a strong emphasis on laboratory skills in synthesis and analysis” (Royal Society of Chemistry, 2022). Students in accredited programs must use a variety of synthetic and measurement techniques (Royal Society of Chemistry, 2022). For instance, students must design and carry out synthesis, purification, isolation, and characterization strategies, as well as conduct qualitative and quantitative sample preparation, purification, chemical measurement, and associated analyses (The Quality Assurance Agency for Higher Education, 2022). Further, students must also successfully collaborate in groups and communicate effectively (The Quality Assurance Agency for Higher Education, 2022).

The necessary laboratory activities and learning outcomes identified in these professional guidelines directly inform the specific goals that chemistry faculty members must adopt to accomplish overarching curricular objectives. Understanding how faculty members’ goals for the instructional laboratory vary across courses is necessary for ensuring that courses work together in a coherent fashion to accomplish these aims. Further, understanding how goals vary across factors, such as course and institution type, is necessary for identifying avenues to best support adoption of evidence-based instructional laboratory curricula. The Faculty Goals for Undergraduate Chemistry Laboratory Instrument serves as a measure of these goals (Bruck and Towns, 2013), thus facilitating such an investigation.

Research goal and questions

This study aimed to psychometrically evaluate the Faculty Goals for Undergraduate Chemistry Laboratory Instrument used in the previous survey of chemistry faculty members’ goals (Bruck and Towns, 2013), and in turn, collect validity and reliability evidence to support interpreting scores as a measure of faculty members’ goals for the undergraduate chemistry laboratory across course, institution type, and receipt of funding to improve undergraduate chemistry courses. Work to accomplish this goal allowed for an investigation of two guiding research questions:

(1) How do faculty goals for the undergraduate chemistry instructional laboratory differ by course, institutions’ highest chemistry degree awarded, professional societies’ approval for institutions to award certified undergraduate chemistry degrees, and receipt of funding to improve undergraduate chemistry courses?

(2) How do differences in goals for the undergraduate chemistry instructional laboratory compare to differences in goals observed in a similar study conducted by Bruck and Towns (2013) in 2009?

Methods

We addressed our research aim and questions through a national survey of postsecondary chemistry faculty members within the United States in Spring 2022. The survey included the 28-item Faculty Goals for Undergraduate Chemistry Laboratory Instrument developed by Bruck and Towns (2013), as well as items targeting additional variables (e.g., receipt of funding for improving undergraduate chemistry courses). Data were analyzed using confirmatory factor analysis (CFA), measurement invariance testing, descriptive statistics, and inferential statistics. The study was approved by the University of South Florida's Institutional Review Board on October 6, 2021 (Application STUDY003351).

Population

The survey was distributed to chemistry faculty members at four-year institutions in the United States awarding one or more bachelor's degrees in chemistry between 2014 and 2019. The Integrated Postsecondary Education Data System (National Center for Education Statistics, 2021) was used to identify a total of 1143 institutions meeting these criteria, and publicly available faculty lists from these institutions were used to construct a list of faculty members. Faculty member lists at 105 institutions were unavailable. The resulting study population included N = 13 897 faculty members from N = 1038 institutions (see Table 1). Institutions represented a range of characteristics: institution control (i.e., public or private), highest chemistry degree awarded (i.e., bachelor's or graduate), and approval by ACS to award certified bachelor's chemistry degrees (see Table 1).
Table 1 Study population (N) definition, including type of institution control, highest chemistry degree awarded, ACS-certified undergraduate chemistry degree, and response rates
Institutional control Highest chem. degree awarded Number of institutions Number of institutions with ACS approval Number of faculty members Number of respondents Response rate (%)
Public Bachelor's 228 148 2671 252 9.4
Public Graduate 216 211 5843 467 8.0
Private Bachelor's 509 208 3499 364 10.4
Private Graduate 85 77 1884 106 5.6
Totals 1038 644 13 897 1189 8.6


Institutions in the United States that grant graduate degrees in chemistry (i.e., master's or doctorate) have higher research activity compared to institutions that only grant bachelor's degrees. Highly research-active institutions also tend to have larger course sizes and faculty members who must balance research and teaching obligations (Cox et al., 2011). Further, ACS is a leading professional society in the United States analogous to the RSC in the United Kingdom and the Royal Australian Chemical Institute (RACI) in Australia. The Committee on Professional Training within ACS establishes evaluation criteria for undergraduate chemistry degree programs (Committee on Professional Training, 2015). Programs meeting these criteria are granted approval from the ACS to award certified bachelor's chemistry degrees. The approval process with the ACS is similar to the accreditation process with the RSC and RACI, whereby the ACS grants institutions approval to certify undergraduate chemistry degrees, while the RSC and RACI accredit the degrees themselves.

Instructional laboratory courses within the United States can be associated with a lecture course (e.g., General Chemistry Laboratory and General Chemistry Lecture, Organic Chemistry I Laboratory and Organic Chemistry I Lecture, etc.); however, some upper-level laboratory courses may be independent of lecture (e.g., senior capstone research project courses) (Connor and Raker, 2022). “Traditional” laboratory courses are often organized around sets of experiments that students complete, with such experiments typically involving the verification of results using an established protocol (Weaver et al., 2008).

Data collection and study sample

Data were collected online via Qualtrics in Spring 2022, and faculty members (N = 13 897) were invited to participate via email. The survey was available for three weeks, and reminder emails were sent to faculty members who had yet to respond following the first and second week.

A total of 1189 faculty members consented to participate and began the survey (i.e., 8.6% overall response rate; see Table 1). A total of n = 607 faculty members from n = 422 institutions indicated that, within the last three years (i.e., Spring 2019 to Fall 2021), they had been the primary instructor of an undergraduate chemistry laboratory course they considered “not impacted by the COVID-19 pandemic.” If more than one course met this criterion, respondents referenced the course over which they had the most influence. Respondents provided the name of this course, course level, and the types of majors enrolled (e.g., pre-professional, non-STEM, etc.). A total of n = 521 faculty members from n = 379 institutions provided complete responses to the Faculty Goals for Undergraduate Chemistry Laboratory Instrument, yielding a response rate of 3.7%; these respondents form the sample for analysis (see Table 2).

Table 2 Study sample (n) definition, including type of institution control, highest chemistry degree awarded, ACS-certified undergraduate chemistry degree, and response rates
Institutional control Highest chem. degree awarded Number of institutions represented in sample Number of institutions with ACS approval represented in sample Number of faculty members in sample Response rate (%)
Public Bachelor's 102 75 145 5.4
Public Graduate 87 85 121 2.1
Private Bachelor's 167 102 220 6.3
Private Graduate 23 23 35 1.9
Totals 379 285 521 3.7


Authors M. C. C. and J. R. R. used course names, levels, and types of majors enrolled to categorize courses as general, organic, or upper-level chemistry until 100% agreement was obtained; a similar categorization scheme was used by Bruck and Towns (2013) and thus results can be directly compared.

Psychometric evaluation of the faculty goals for undergraduate chemistry laboratory instrument

Confirmatory factor analysis (CFA) was used to evaluate the internal structure of the Faculty Goals for Undergraduate Chemistry Laboratory Instrument and to make modifications that improve its functioning. McDonald's omega (ω) was then used to evaluate reliability. McDonald's ω is conceptually similar to Cronbach's alpha, though it is the appropriate reliability coefficient when item loadings are unequal (Komperda et al., 2018). Since McDonald's ω is only appropriate for use with a single-factor model, a reliability value was determined for each subscale (Komperda et al., 2018). Following this evaluation, measurement invariance was tested to support comparisons between groups identified in Research Question 1.

The 28-item Faculty Goals for Undergraduate Chemistry Laboratory Instrument measures goals across seven factors: Research Experience; Group Work and Broader Communication Skills; Error Analysis, Data Collection and Analysis; Connection between Lab and Lecture; Transferable Skills (Lab-Specific); Transferable Skills (Not Lab-Specific); and Laboratory Writing. Twenty-six items use a six-point Likert scale ranging from strongly disagree to strongly agree, and two items use a five-point frequency scale ranging from 0% to 76–100% (see Table 3).
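For reference, the factor–item structure described above can be laid out programmatically. This is an illustrative summary only (item labels as in Table 3), not part of the instrument itself:

```python
# Factor -> item labels for the 28-item instrument (labels from Table 3).
INSTRUMENT_FACTORS = {
    "Research Experience": ["R1", "R2", "R3", "R4", "R5", "R6"],
    "Group Work and Broader Communication Skills": ["G1", "G2", "G3", "G4", "G5"],
    "Error Analysis, Data Collection and Analysis": ["E1", "E2", "E3", "E4", "E5"],
    "Connection between Lab and Lecture": ["C1", "C2", "C3"],
    "Transferable Skills (Lab-Specific)": ["TS1", "TS2", "TS3"],
    "Transferable Skills (Not Lab-Specific)": ["TN1", "TN2", "TN3"],
    "Laboratory Writing": ["W1", "W2", "W3"],
}

# The two items using the five-point frequency scale (0% to 76-100%);
# all remaining items use the six-point Likert scale.
FREQUENCY_ITEMS = {"R6", "E5"}

total_items = sum(len(items) for items in INSTRUMENT_FACTORS.values())
likert_items = total_items - len(FREQUENCY_ITEMS)
```

The counts recover the structure stated above: seven factors, 28 items in total, 26 of which are Likert-scale items.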

Table 3 Factors and items from the faculty goals for undergraduate chemistry laboratory instrument developed by Bruck and Towns. Participants responded to 26 items using a six-point Likert scale ranging from “strongly disagree” to “strongly agree”. Descriptive statistics are included for each item. Factor loadings were obtained using the final revised model and entire dataset (n = 521)
Factor Items Label Mean SD Skewness Kurtosis Loading
a Frequency items using a five-point scale ranging from 0% to 76–100%.
Research experience • Laboratory techniques used by professional chemists are used in the teaching laboratory. R1 4.72 1.19 −1.09 4.10 0.62
• Preparing students for research experiences is a goal for the laboratory. R2 4.31 1.29 −0.68 2.98 0.71
• The laboratory gives students an idea of how chemistry is performed in the real world. R3 4.56 1.18 −0.80 3.50 0.73
• The laboratory is designed to encourage the development of scientific reasoning skills. R4 5.11 0.89 −1.06 4.60 0.66
• Understanding the usefulness of specific laboratory techniques is a goal for the course. R5 4.82 1.13 −1.06 3.95 0.51
• How often are students conducting experiments that mimic research experiences?a R6 2.58 1.16 0.50 2.42
Group work and broader communication skills • Students need to learn to work together in laboratory to succeed in future courses. G1 4.66 1.14 −0.83 3.43 0.70
• Students need to learn to work together in laboratory to succeed in their future careers. G2 4.70 1.18 −1.01 3.95 0.79
• Group work in laboratory encourages students to use their peers as information sources. G3 4.63 1.07 −0.76 3.78 0.72
• This laboratory course is designed to develop oral communication skills. G4 3.28 1.49 0.09 2.00 0.49
• The laboratory is designed to have students present data in multiple formats such as PowerPoint, posters, laboratory reports, etc. G5 3.28 1.56 0.20 1.96 0.47
Error analysis, data collection and analysis • Error analysis is necessary to understand the limitations of measurement in the laboratory. E1 4.48 1.24 −0.83 3.35 0.57
• Teaching students about uncertainty in measurement procedures is important. E2 4.79 1.09 −1.07 4.41 0.59
• Laboratory is a place for students to learn to analyze data. E3 5.21 0.85 −1.49 7.08 0.70
• Understanding the need for proper data collection techniques is a goal for laboratory. E4 4.84 1.06 −1.07 4.69 0.78
• How often are students required to carry out an error analysis?a E5 2.89 1.40 0.24 1.74
Connection between lab and lecture • Making laboratories relevant to lecture content is an aspect of our laboratories. C1 4.80 1.15 −1.12 4.32 0.87
• There is a strong connection between the lecture and the laboratory. C2 4.64 1.25 −0.92 3.45 0.92
• The goal for laboratory instruction is to reinforce lecture content. C3 4.37 1.25 −0.66 3.19 0.73
Transferable skills (lab-specific) • Laboratory activities and experiments selected for this course are designed to develop students’ mastery of laboratory techniques. TS1 5.16 0.98 −1.52 6.18 0.62
• Laboratory activities and experiments selected for this course are designed to focus on skills that are transferable to research-oriented laboratories. TS2 4.99 1.07 −1.36 5.25 0.86
• Laboratory activities and experiments selected for this course are designed to develop skills that students can apply to future science courses. TS3 5.12 0.96 −1.37 5.73 0.61
Transferable skills (not lab-specific) • Laboratory activities and experiments selected for this course are designed to teach students to build logical arguments based on their data. TN1 5.11 1.05 −1.40 5.20 0.67
• Laboratory activities and experiments selected for this course are designed to foster an appreciation for science in students. TN2 4.91 0.98 −1.04 4.72 0.55
• Laboratory activities and experiments selected for this course are designed to generalize to multiple disciplines. TN3 3.91 1.27 −0.50 2.58 0.33
Laboratory writing • Teaching students how to write scientific reports is a goal for laboratory. W1 4.56 1.40 −0.87 3.04 0.83
• Writing laboratory reports helps students to communicate what they know about chemistry. W2 4.93 1.16 −1.25 4.48 0.82
• Learning to keep a proper laboratory notebook is a vital skill for students to acquire. W3 4.97 1.16 −1.46 5.23 0.50


The internal structure of the instrument was psychometrically evaluated via exploratory factor analysis during its development (Bruck and Towns, 2013), and it has since been evaluated using CFA (Connor et al., 2022). Further, the reliability of data obtained using this instrument has been evaluated using both Cronbach's alpha (α) and McDonald's ω (Bruck and Towns, 2013; Connor et al., 2022). These analyses have generated acceptable evidence of validity and reliability. However, collinearity of multiple factors (i.e., Transferable Skills (Lab-Specific), Transferable Skills (Not Lab-Specific), and Research Experience) was observed in a previous investigation (Connor et al., 2022), suggesting that these factors may be measuring a single, broader construct. Thus, the internal structure may require modification to provide a more valid measure of goals. Further, the degree to which the instrument functions similarly across instructional and institutional demographics is uncertain, and evidence of such measurement invariance is necessary to compare goals of different faculty member groups (Rocabado et al., 2020).

Initial evaluation of factor structure using CFA. CFA was conducted in RStudio using the “lavaan” package and the instrument's published factor structure (Rosseel, 2012; Bruck and Towns, 2013). Data were randomly split into halves, with the first half (n = 261) used for initial evaluation of model fit to our data. The robust maximum likelihood (MLR) estimator was used to estimate model parameters, as it accommodates our non-normally distributed data (Li, 2016). Model fit was evaluated using the χ2 statistic, comparative fit index (CFI), standardized root-mean square residual (SRMR), and root mean square error of approximation (RMSEA). For acceptable model fit, the χ2 statistic should be nonsignificant, the CFI should equal or exceed 0.90, the SRMR should be below 0.08, and the RMSEA should be below 0.06 (Hu and Bentler, 1999; Diemer et al., 2017).
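The cutoff checks above, together with the conventional RMSEA point estimate, can be sketched as follows. This is an illustrative Python sketch, not the study's R/lavaan code; the formula shown is the classic (non-robust) RMSEA, whereas the reported values were obtained with the MLR estimator and may therefore differ slightly:

```python
import math

def rmsea(chi2, df, n):
    """Classic RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    lavaan's robust (MLR-scaled) RMSEA can differ from this naive value."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def meets_cutoffs(cfi, srmr, rmsea_val):
    """Cutoff criteria used in the study (Hu & Bentler, 1999):
    CFI >= 0.90, SRMR < 0.08, RMSEA < 0.06."""
    return {
        "CFI >= 0.90": cfi >= 0.90,
        "SRMR < 0.08": srmr < 0.08,
        "RMSEA < 0.06": rmsea_val < 0.06,
    }
```

For example, the initial model's χ² = 808.8 with df = 329 and n = 261 gives a naive RMSEA of roughly 0.075, in the neighborhood of the reported 0.08.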

Results from CFA indicate that the published factor structure exhibits unacceptable fit to our data, with all fit statistics failing to meet acceptable cutoff criteria: χ2 (n = 261, df = 329, p < 0.001) = 808.8, CFI = 0.84, SRMR = 0.08, and RMSEA = 0.08. Modification indices were evaluated to identify possible changes to the published model that would improve fit and provide a more valid measure of goals.

Modification and re-evaluation of factor structure using CFA. Several of the highest modification indices involve the only two items utilizing a five-point frequency scale (i.e., items R6 and E5), unlike the other items in the instrument, which use a six-point Likert scale. Combining multiple scales of various lengths may result in a difference in the loading pattern of items; these items were thus excluded from the model during modification of the factor structure, resulting in some improvement in fit. However, model fit statistics still did not meet acceptable cutoff criteria: χ2 (n = 261, df = 278, p < 0.001) = 679.8, CFI = 0.85, SRMR = 0.08, and RMSEA = 0.07.

Modification indices were thus evaluated for the revised model, with the top three indices suggesting correlated error variances between items E1 and E2, G4 and G5, and G1 and G2. Each of these item pairs is conceptually similar; items E1 and E2 relate to distinct aspects of measurement error, G4 and G5 relate to specific communication skills, and G1 and G2 relate to distinct rationales for learning to work in groups (see Table 3). Further, while items within pairs are conceptually similar, they still measure unique facets of the broader concept to which they relate. This lack of redundancy between items suggests they should not be excluded from the model. The next largest modification indices suggest correlated error variances between items on different factors, which lack a clear conceptual justification; the final revised model thus only includes correlated errors between the top three item pairs. With these errors correlated and items R6 and E5 excluded, the final revised model has acceptable fit for the first half of data: χ2 (n = 261, df = 275, p < 0.001) = 500.3, CFI = 0.92, SRMR = 0.06, and RMSEA = 0.06. The final revised model has borderline acceptable fit with the second half of data: χ2 (n = 260, df = 275, p < 0.001) = 589.8, CFI = 0.88, SRMR = 0.07, and RMSEA = 0.07, and acceptable fit for the complete dataset: χ2 (n = 521, df = 275, p < 0.001) = 780.5, CFI = 0.90, SRMR = 0.06, and RMSEA = 0.06. Item loadings for the complete dataset are provided in Table 3. Factor scores for the complete dataset were generated with “lavaan” using the final revised model (Rosseel, 2012).
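A lavaan-style model specification consistent with the revision just described (items R6 and E5 removed; correlated errors added for the three item pairs) might look as follows. This is an illustrative sketch, not the authors' code; in R the string would be passed to lavaan's cfa() function, and it is held in a Python string here only to keep the document's examples in one language:

```python
# Sketch of the revised 26-item, seven-factor specification in lavaan
# syntax ("=~" defines a factor; "~~" specifies a residual covariance).
# Factor names are shorthand chosen here for illustration.
REVISED_MODEL = """
research  =~ R1 + R2 + R3 + R4 + R5
groupwork =~ G1 + G2 + G3 + G4 + G5
error     =~ E1 + E2 + E3 + E4
connect   =~ C1 + C2 + C3
ts_lab    =~ TS1 + TS2 + TS3
ts_nonlab =~ TN1 + TN2 + TN3
writing   =~ W1 + W2 + W3
# correlated error variances for the three conceptually similar pairs
E1 ~~ E2
G4 ~~ G5
G1 ~~ G2
"""

# Sanity checks on the specification's structure.
n_factors = sum("=~" in line for line in REVISED_MODEL.splitlines())
n_corr_errors = sum("~~" in line for line in REVISED_MODEL.splitlines())
n_items = sum(
    len(line.split("=~")[1].split("+"))
    for line in REVISED_MODEL.splitlines() if "=~" in line
)
```

The structure matches the final revised model described above: seven factors, 26 items (R6 and E5 excluded), and three correlated error terms.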

Evaluation of reliability using McDonald's ω. McDonald's ω coefficients for each factor exceed or are near the recommended cutoff criterion of 0.70 (Komperda et al., 2018), providing evidence of reliability for measures obtained from the entire study sample (n = 521) and the 26-item factor model (see Table 4). McDonald's ω for the Transferable Skills (Not Lab-Specific) factor likely did not exceed the 0.70 cutoff due to the loose inter-relatedness of items measuring this broad construct (Tavakol and Dennick, 2011). McDonald's ω was calculated using the “psych” package in RStudio (Revelle, 2018).
Table 4 McDonald's ω coefficients for factors in the faculty goals for the undergraduate chemistry laboratory instrument. All values exceed or are near the recommended minimum value of 0.70
Factor ω
Research experience 0.82
Group work and broader communication skills 0.87
Error analysis, data collection and analysis 0.86
Connection between lab and lecture 0.88
Transferable skills (lab-specific) 0.76
Transferable skills (not lab-specific) 0.58
Laboratory writing 0.78
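The ω coefficient reported in Table 4 can be illustrated with a simplified calculation from standardized loadings. This Python sketch assumes uncorrelated errors, so it will not exactly reproduce the reported values, which were obtained with the "psych" package using the full revised model (including the correlated error terms):

```python
def mcdonald_omega(loadings):
    """McDonald's omega from standardized factor loadings, assuming
    uncorrelated errors: (sum b)^2 / ((sum b)^2 + sum(1 - b^2))."""
    s = sum(loadings)
    unique = sum(1.0 - b * b for b in loadings)  # residual (error) variances
    return s * s / (s * s + unique)

# Standardized loadings for the Research Experience factor (Table 3,
# excluding the frequency item R6, which was dropped from the model).
research_loadings = [0.62, 0.71, 0.73, 0.66, 0.51]
omega_research = mcdonald_omega(research_loadings)
```

For these loadings the simplified formula gives roughly 0.78, in the same neighborhood as the reported 0.82 for Research Experience.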


Evaluation of measurement invariance across course, highest chemistry degree awarded, ACS-approval to award certified bachelor's degrees, and receipt of funding. After finalizing the revised model, measurement invariance testing was conducted for the configural, metric, and scalar models comparing faculty member groups by course (i.e., general, organic, or upper-level chemistry), institutions’ highest chemistry degree awarded (i.e., bachelor's or graduate), institutions’ approval from ACS to award certified bachelor's chemistry degrees (i.e., approved or nonapproved), and receipt of any funding to improve undergraduate chemistry courses (i.e., funding or no funding). Testing was conducted in RStudio using the “lavaan” package and procedure outlined in Rocabado et al. (2020). The MLR estimator was used to estimate model parameters, and significance was set at α = 0.05.

Measurement invariance was also tested for the strict model for the benefit of researchers wishing to use the instrument, and results are presented within respective tables; all groups, aside from those by course, exhibited strict invariance. However, this study relied on factor scores for statistical testing, as they are a more precise measure in comparison to composite scores and thus more appropriate for our research purposes (Rocabado et al., 2020). Therefore, only scalar invariance is discussed herein.

Metric, but not scalar, invariance was established for faculty member groups by course, as indicated by the significant change in χ2 and the worsening of other fit statistics when moving from the metric to the scalar model: Δχ2(df = 38, p < 0.001) = 200.8; ΔCFI = −0.03; ΔSRMR = 0.01; ΔRMSEA = 0.01; see Table 5. Metric invariance implies that groups interpret the factors similarly, though the data are not sufficiently represented by the same factor model to facilitate comparisons of factor scores (Rocabado et al., 2020). Factor scores between these groups should thus only be compared holistically, and differences interpreted with caution.
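The decision at each invariance step can be summarized with the widely used ΔCFI heuristic (often attributed to Cheung and Rensvold), sketched below against the Table 5 values. This is an illustration only; the study weighs Δχ2 and several fit indices together rather than relying on a single cutoff:

```python
def supports_invariance(delta_cfi: float, threshold: float = 0.01) -> bool:
    """Heuristic check: a drop in CFI larger than ~0.01 when equality
    constraints are added suggests the stricter invariance level does not hold.
    """
    return abs(delta_cfi) <= threshold

# Applied to Table 5 (groups by course):
print(supports_invariance(0.00))   # configural -> metric step: holds (True)
print(supports_invariance(-0.03))  # metric -> scalar step: fails (False)
```

The same rule applied to Tables 6–8 (ΔCFI = 0.00 at each scalar step) is consistent with the scalar invariance reported for those groups.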

Table 5 Measurement invariance testing for the faculty goals for undergraduate chemistry laboratory instrument comparing faculty members who teach general (n = 172), organic (n = 123), or upper-level chemistry laboratory courses (n = 226)
Testing level χ 2 df p-Value CFI SRMR RMSEA Δχ2 Δdf p-Value ΔCFI ΔSRMR ΔRMSEA
Baseline – general 543.8 275 <0.001 0.87 0.08 0.08
Baseline – organic 407.5 275 <0.001 0.90 0.07 0.06
Baseline – upper-level 573.3 275 <0.001 0.87 0.07 0.07
Configural 1527.3 825 <0.001 0.87 0.07 0.07
Metric 1559.3 863 <0.001 0.87 0.08 0.07 32.0 38 0.742 0.00 0.01 0.00
Scalar 1760.1 901 <0.001 0.84 0.09 0.08 200.8 38 <0.001 −0.03 0.01 0.01
Strict 1926.1 953 <0.001 0.80 0.09 0.08 166.0 52 <0.001 −0.04 0.00 0.00


Scalar invariance was established for faculty member groups by highest chemistry degree awarded, approval from ACS to award certified bachelor's chemistry degrees, and receipt of any funding to improve undergraduate chemistry courses, meaning that data from different groups are sufficiently represented by the same factor model to facilitate meaningful comparisons of factor scores (Rocabado et al., 2020). No change in most fit statistics (ΔCFI = 0.00, ΔSRMR = 0.00, and ΔRMSEA = 0.00) when moving from the metric to scalar model constitutes evidence of scalar invariance for faculty member groups by highest chemistry degree awarded (see Table 6).

Table 6 Measurement invariance testing for the faculty goals for undergraduate chemistry laboratory instrument comparing faculty members at institutions granting only bachelor's chemistry degrees (n = 365) versus those also granting graduate chemistry degrees (n = 156)
Testing level χ 2 df p-Value CFI SRMR RMSEA Δχ2 Δdf p-Value ΔCFI ΔSRMR ΔRMSEA
Baseline – bachelor's 653.6 275 <0.001 0.90 0.06 0.07
Baseline – graduate 486.6 275 <0.001 0.88 0.07 0.07
Configural 1143.4 550 <0.001 0.89 0.06 0.07
Metric 1154.8 569 <0.001 0.89 0.07 0.07 11.4 19 0.910 0.00 0.01 0.00
Scalar 1190.8 588 <0.001 0.89 0.07 0.07 36.0 19 0.011 0.00 0.00 0.00
Strict 1229.3 614 <0.001 0.89 0.07 0.07 38.5 26 0.054 0.00 0.00 0.00


Further, a nonsignificant change in χ2 and no change in other fit statistics when moving from the metric to the scalar model provide evidence of scalar invariance for faculty member groups by receipt of funding (Δχ2(df = 19, p = 0.444) = 19.2; ΔCFI = 0.00; ΔSRMR = 0.00; ΔRMSEA = 0.00; see Table 7). Likewise, a nonsignificant change in χ2 and no change in most fit statistics provide evidence of scalar invariance for faculty member groups by ACS approval status: Δχ2(df = 19, p = 0.310) = 21.5; ΔCFI = 0.00; ΔSRMR = 0.00; see Table 8.

Table 7 Measurement invariance testing for the faculty goals for undergraduate chemistry laboratory instrument comparing faculty members at institutions having received funding to improve undergraduate chemistry courses (n = 235) versus those at institutions that have not received such funding (n = 286)
Testing level χ 2 df p-Value CFI SRMR RMSEA Δχ2 Δdf p-Value ΔCFI ΔSRMR ΔRMSEA
Baseline – no funding 602.4 275 <0.001 0.89 0.06 0.07
Baseline – funding 595.9 275 <0.001 0.88 0.07 0.07
Configural 1198.3 550 <0.001 0.88 0.06 0.07
Metric 1204.0 569 <0.001 0.88 0.07 0.07 5.7 19 0.999 0.00 0.01 0.00
Scalar 1223.2 588 <0.001 0.88 0.07 0.07 19.2 19 0.444 0.00 0.00 0.00
Strict 1244.9 614 <0.001 0.88 0.07 0.07 21.7 26 0.705 0.00 0.00 0.00


Table 8 Measurement invariance testing for the faculty goals for undergraduate chemistry laboratory instrument comparing faculty members at institutions with ACS-approval to award certified bachelor's degrees (n = 409) versus those at institutions without approval (n = 112)
Testing level χ 2 df p-Value CFI SRMR RMSEA Δχ2 Δdf p-Value ΔCFI ΔSRMR ΔRMSEA
Baseline – approval 685.6 275 <0.001 0.90 0.06 0.07
Baseline – no approval 447.1 275 <0.001 0.87 0.08 0.07
Configural 1152.3 550 <0.001 0.89 0.06 0.07
Metric 1144.9 569 <0.001 0.90 0.07 0.07 −7.4 19 0.992 0.01 0.01 0.00
Scalar 1166.4 588 <0.001 0.90 0.07 0.06 21.5 19 0.310 0.00 0.00 −0.01
Strict 1184.3 614 <0.001 0.90 0.07 0.06 17.9 26 0.879 0.00 0.00 0.00


Data analysis

After establishing measurement invariance, the first research question was addressed using inferential statistics. Kruskal–Wallis and post hoc Tukey Honestly Significant Difference (HSD) tests were used to evaluate differences in factor scores on each of the seven factors for faculty member groups by course (i.e., general, organic, and upper-level chemistry). Mann–Whitney U tests were used to evaluate differences in factor scores for faculty member groups by institutions’ highest chemistry degree awarded (i.e., bachelor's versus graduate), receipt of funding for improving undergraduate chemistry courses (i.e., receipt of any funding versus no funding), and ACS approval to award certified bachelor's chemistry degrees (i.e., approved versus nonapproved). The Kruskal–Wallis test and post hoc Tukey HSD tests are used when comparing three or more groups, whereas the Mann–Whitney U test is used when comparing two groups (Sheskin, 2011). Bonferroni-adjusted significance was set at 0.007 (α = 0.05/7) for these analyses to account for testing across each of the seven factors. Effect sizes (r) were calculated post hoc, with small, medium, and large effects corresponding to values of 0.10, 0.30, and 0.50, respectively (Cohen, 1992). All inferential statistics were computed with Stata 17.0 (StataCorp, 2021).
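To illustrate this analysis pipeline, the sketch below runs the same nonparametric tests in Python with SciPy on hypothetical factor scores (the study itself used Stata 17.0, and the group means and seed here are invented for illustration). The effect size r is computed from the normal approximation to the Mann–Whitney U statistic as r = |z|/√N:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical factor scores for three course groups (illustration only;
# sample sizes mirror the study: general 172, organic 123, upper-level 226)
general = rng.normal(-0.1, 1.0, 172)
organic = rng.normal(0.0, 1.0, 123)
upper = rng.normal(0.3, 1.0, 226)

# Kruskal-Wallis test across the three course groups
h_stat, p_kw = stats.kruskal(general, organic, upper)

# Mann-Whitney U test for a two-group comparison, with effect size
# r = |z| / sqrt(N) via the normal approximation to U
u_stat, p_mw = stats.mannwhitneyu(general, upper, alternative="two-sided")
n1, n2 = len(general), len(upper)
z = (u_stat - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
r = abs(z) / np.sqrt(n1 + n2)

# Bonferroni-adjusted significance level across the seven factors
alpha = 0.05 / 7  # ~0.007

print(p_kw < alpha, round(r, 2))
```

Values of r near 0.10, 0.30, and 0.50 would then be read as small, medium, and large effects, respectively (Cohen, 1992).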

The second research question was addressed via a descriptive comparison of our Research Question 1 results and findings from Bruck and Towns’ (2013) investigation of faculty members’ goals for the undergraduate chemistry laboratory.

Results and discussion

Inferential statistics are presented for the Faculty Goals for Undergraduate Chemistry Laboratory Instrument by course, institutions’ highest chemistry degree awarded, ACS-approval to award certified bachelor's chemistry degrees, and receipt of funding to improve undergraduate chemistry courses (Research Question 1). To address Research Question 2, results are compared to findings from the survey of faculty goals for the undergraduate chemistry laboratory conducted in 2009 (Bruck and Towns, 2013).

RQ1: How do faculty goals for the undergraduate chemistry instructional laboratory differ by course, institution type, and receipt of funding to improve undergraduate chemistry courses?

Kruskal–Wallis and post hoc Tukey HSD tests indicate several differences in faculty goals by course. Further, Mann–Whitney U tests indicate differences in goals between faculty member groups by receipt of funding but not by institutions’ highest chemistry degree awarded or ACS approval to award certified bachelor's chemistry degrees. Results thus suggest that faculty goals for the undergraduate chemistry laboratory are associated with course and receipt of funding, with no evidence of an association between goals and highest chemistry degree awarded or ACS approval status.

Kruskal–Wallis tests indicated differences in factor scores on all seven factors for general, organic, and upper-level chemistry instructional laboratory courses (see Table 9).

Table 9 Kruskal–Wallis test comparisons of factor scores across general (n = 172), organic (n = 123), and upper-level (n = 226) chemistry instructional laboratory courses
Factor H p value Median (general) Median (organic) Median (upper-level)
a Corresponds to significance at the 0.007 level. Bonferroni adjusted significance was set at 0.007.
Research experience 69.7 <0.001a −0.25 0.02 0.36
Group work and broader communication skills 30.1 <0.001a −0.12 −0.10 0.22
Error analysis, data collection and analysis 62.1 <0.001a −0.07 −0.15 0.25
Connection between lab and lecture 10.9 0.004a 0.24 −0.18 0.17
Transferable skills (lab-specific) 44.9 <0.001a −0.10 0.06 0.22
Transferable skills (not lab-specific) 48.0 <0.001a −0.05 0.01 0.14
Laboratory writing 71.0 <0.001a −0.11 −0.02 0.31


While only metric invariance was observed for faculty member groups by course, factors still share a similar, generalized meaning for each group. This shared meaning, in combination with differences in factor scores (as indicated by medium to large effect sizes), provides support for a holistic comparison of differences in goals for general, organic, and upper-level chemistry laboratory courses. However, results relating to course should be interpreted with caution, in particular those with small effect sizes.

Faculty members may place greater emphasis on goals relating to Research Experience, Transferable Skills (Lab-Specific), and Laboratory Writing in the organic chemistry instructional laboratory than in the general chemistry instructional laboratory

Post hoc Tukey HSD tests indicated differences in factor scores between organic and general chemistry instructional laboratory courses for the following factors: Research Experience, Transferable Skills (Lab-Specific), and Laboratory Writing (see Table 10). Effect sizes were small (r = 0.17–0.18; see Table 10); thus, it is possible that differences in scores are partly attributable to instrument functioning rather than genuinely differing goals.
Table 10 Post hoc Tukey HSD pairwise comparisons of factor scores by course
Factor Organic vs. general Upper vs. general Upper vs. organic
p value r p value r p value r
a Corresponds to significance at the 0.007 level. Bonferroni adjusted significance was set at 0.007.
Research experience 0.001a 0.18 <0.001a 0.41 <0.001a 0.25
Group work and broader communication skills 0.887 0.03 <0.001a 0.25 <0.001a 0.22
Error analysis, data collection and analysis 0.953 0.03 <0.001a 0.32 <0.001a 0.36
Connection between lab and lecture 0.015 0.18 0.709 0.04 0.066 0.14
Transferable skills (lab-specific) 0.005a 0.18 <0.001a 0.33 0.019 0.16
Transferable skills (not lab-specific) 0.054 0.11 <0.001a 0.33 <0.001a 0.23
Laboratory writing 0.001a 0.17 <0.001a 0.41 <0.001a 0.25


Faculty members placed greater emphasis on goals relating to Research Experience and Transferable Skills (Lab-Specific) in the organic chemistry laboratory when compared to the general chemistry laboratory. These differences may be due to a variety of factors, including smaller course sizes and a somewhat more homogeneous student population when moving from general to organic chemistry (Gibbons et al., 2017; Reed et al., 2017). Smaller course sizes may facilitate the incorporation of laboratory techniques used by professional chemists, as purchasing and maintaining multiple instruments for large-enrollment general chemistry laboratory courses is costly and time intensive. Further, general chemistry courses often serve a broad student population, including but not limited to nursing, engineering, pre-professional, and non-STEM majors (Reed et al., 2017). While the organic chemistry instructional laboratory is still regarded as a large-enrollment course serving a range of majors, the range is relatively narrower (Reed et al., 2017). As the proportion of chemistry and closely related STEM majors increases when progressing from general to organic chemistry, instructors may increase their focus on preparing students for careers in research laboratories and industry. A previous investigation also found the organic chemistry instructional laboratory to be highly technique-oriented (Bruck et al., 2010), further suggesting a greater focus on preparing students for professional work in the discipline.

Previous research has shown that inorganic chemistry faculty members who implement CUREs report a similar emphasis on goals relating to Research Experience and Transferable Skills (Lab-Specific) (Connor et al., 2022). CUREs may therefore be a promising evidence-based pedagogy for helping organic chemistry instructors accomplish these goals. CUREs have been incorporated into the organic chemistry laboratory to some degree, as described in the research literature (Allen et al., 2021a; Watts et al., 2021), though additional efforts to support adoption in large-enrollment contexts are needed (Cruz et al., 2020).

Furthermore, faculty members placed greater emphasis on goals relating to Laboratory Writing in the organic chemistry laboratory when compared to the general chemistry laboratory. This difference may be due to increased incorporation of formal laboratory reports at the organic chemistry course stage in the curriculum or an increased focus on cultivating disciplinary writing skills in upper-level courses. However, additional research is needed to understand the nature of this writing, as it is possible that instructors may need support in adopting assessments beyond laboratory reports (Seery, 2020).

Faculty members place greater emphasis on goals relating to Research Experience, Group Work and Broader Communication Skills, Error Analysis, Data Collection and Analysis, Transferable Skills (Lab-Specific), Transferable Skills (Not Lab-Specific), and Laboratory Writing in the upper-level chemistry instructional laboratory than in the general chemistry instructional laboratory

Tukey HSD tests indicate differences in scores between upper-level and general chemistry instructional laboratory courses for six of seven factors: Research Experience; Group Work and Broader Communication Skills; Error Analysis, Data Collection and Analysis; Transferable Skills (Lab-Specific); Transferable Skills (Not Lab-Specific); and Laboratory Writing (see Table 10). Effect sizes range from medium to large (r = 0.25–0.41; see Table 10), suggesting a substantive difference in factor scores and, in turn, faculty goals.

For all factors exhibiting a difference in scores, faculty members placed greater emphasis on related goals in the upper-level chemistry laboratory. These differences again suggest an increasing focus on developing students’ disciplinary knowledge and skills as they progress in the curriculum. Greater emphasis on these goals further aligns with the majority of objectives reported by inorganic faculty members implementing CUREs (Connor et al., 2022), suggesting CUREs may also be a promising pedagogy for broad adoption in upper-level laboratory courses.

Compared to upper-level courses, scores for general chemistry are systematically lower on all factors except Connection between Lab and Lecture, the only factor unrelated to developing knowledge and skills for participating in the discipline. The substantive difference in goals across most factors suggests that these goals may not be of high priority in the general chemistry laboratory curriculum. These results further suggest that faculty members may have alternate goals for the general chemistry instructional laboratory not immediately associated with the practice of chemistry. Alternate goals may entail providing the broad general chemistry student population with a fundamental understanding of chemical concepts; however, additional research is needed to identify these goals and determine whether they support the overarching objective of making instructional laboratories “the place where students learn how to do chemistry” (Seery, 2020).

The potential mismatch in goals further suggests that the Faculty Goals for Undergraduate Chemistry Laboratory Instrument may be ill-suited for use in the general chemistry context. Measurement invariance at just the metric level for groups by course provides further support for the notion that instructors adopt a distinct set of goals for the general chemistry laboratory and that the instrument is inappropriate for use in this context.

Faculty members place greater emphasis on goals relating to Research Experience, Group Work and Broader Communication Skills, Error Analysis, Data Collection and Analysis, Transferable Skills (Not Lab-Specific), and Laboratory Writing in the upper-level chemistry instructional laboratory than in the organic chemistry instructional laboratory

Tukey HSD tests indicated differences in scores between upper-level and organic chemistry instructional laboratory courses for these factors: Research Experience; Group Work and Broader Communication Skills; Error Analysis, Data Collection and Analysis; Transferable Skills (Not Lab-Specific); and Laboratory Writing (see Table 10). Effect sizes correspond to medium effects (r = 0.22–0.36; see Table 10), providing support for a meaningful comparison of faculty goals.

For all factors exhibiting a difference in scores, faculty members again placed greater emphasis on related goals in the upper-level chemistry laboratory, further pointing toward an increasing instructional focus on cultivating students’ disciplinary knowledge and skills as they progress in the curriculum. Effect sizes (r = 0.22–0.36) suggest this increase from the organic to upper-level chemistry laboratory is more substantive than the increase from the general to organic chemistry laboratory, where effect sizes were small (r = 0.17–0.18).

This more substantive increase may be due to the shift away from large-enrollment courses when moving from the organic to upper-level chemistry laboratory. Large-enrollment contexts may limit the disciplinary knowledge and skills that instructors believe they can cultivate with the large number of students and corresponding limited resources (e.g., time, instrumentation, etc.), in turn causing these faculty members to adopt alternate goals. Additional efforts focused on supporting faculty members’ implementation of evidence-based pedagogies in large-enrollment contexts may be needed. It is also possible that chemistry faculty in large-enrollment contexts balance cultivating students’ disciplinary knowledge and skills with providing a foundational understanding of chemistry and its relevance to other disciplines to their broad student population, the latter of which may be taught more efficiently in the lecture format (Woolnough and Allsop, 1985). Additional research is thus needed to identify alternate goals and factors contributing to their adoption. Such insight will be necessary for understanding whether faculty members need support in aligning their laboratory courses with the practice of chemistry, including the most productive avenues for providing such support (if needed).

Faculty goals for the instructional laboratory are not associated with an institution's highest chemistry degree awarded

No evidence of an association was found between faculty goals and institutions’ highest chemistry degree awarded (i.e., bachelor's or graduate). Mann–Whitney U tests resulted in non-significant p-values (i.e., p = 0.144–0.973) across the seven factors. Corresponding effect sizes were negligible (i.e., r = 0.00–0.06). Complete analyses, including p-values and effect sizes, are provided in the Appendix (see Table 12 in Appendix).

These results suggest that faculty members have similar goals for the instructional laboratory across these institutional contexts. A department's highest degree awarded serves as an indicator for its amount of focus on research versus teaching (Cox et al., 2011), with departments that grant graduate degrees presumably placing greater emphasis on research compared to those only granting bachelor's degrees. However, our finding mirrors that of a recent investigation in which instrumentation use in the undergraduate instructional laboratory was also unassociated with institutions’ highest chemistry degree awarded (Connor and Raker, 2022). Further, variables such as flipped classroom use, percentage of time lecturing, and inorganic chemistry laboratory requirements are also shown to be unassociated with institutions’ highest chemistry degree awarded (Reisner et al., 2015; Srinivasan et al., 2018; Yik et al., 2022), suggesting that institutional context may not impact undergraduate chemistry education to the extent presumed.

Faculty goals for the instructional laboratory are not associated with an institution's approval from ACS to award certified bachelor's chemistry degrees

No evidence of an association was found between faculty goals and institutions’ approval from ACS to award certified bachelor's chemistry degrees. Mann–Whitney U tests resulted in nonsignificant p-values relative to the Bonferroni-adjusted α of 0.007 (p = 0.039–0.821), and effect sizes indicated negligible to small effects (r = 0.01–0.09). Complete analyses are provided in the Appendix (see Table 13 in Appendix).

Results suggest that faculty members have similar goals for the instructional laboratory regardless of ACS-approval status. This finding also aligns with results from Connor and Raker (2022), in which no evidence of differences in instrumentation use was observed by approval status from the ACS to award certified bachelor's degrees. While less research has investigated associations between characteristics of undergraduate chemistry instruction and this institutional variable, alignment of these findings further suggests that institutional context may not be associated with the undergraduate chemistry curriculum. However, we do note one study in the research literature that reported an association between inorganic chemistry laboratory requirements and ACS-approval status (Reisner et al., 2015); further research is needed before an association, or lack thereof, can be presumed.

Faculty goals for the instructional laboratory are associated with receipt of funding to improve undergraduate chemistry courses

Mann–Whitney U tests indicate that receipt of funding to improve undergraduate chemistry courses was associated with greater emphasis on goals relating to: Research Experience; Group Work and Broader Communication Skills; Error Analysis, Data Collection and Analysis; Transferable Skills (Lab-Specific); and Transferable Skills (Not Lab-Specific; see Table 11). Faculty members were asked to report receipt of any (i.e., internal or external) funding to improve undergraduate chemistry laboratory or non-laboratory courses (e.g., institutional grants; Course, Curriculum, and Laboratory Improvement (CCLI) grants; Transforming Undergraduate Education in Science (TUES) grants; and Improving Undergraduate STEM Education (IUSE) grants). Effect sizes are small for these differences (r = 0.12–0.19; see Table 11).
Table 11 Mann–Whitney U test comparisons of factors scores between faculty members at institutions that have received funding to improve undergraduate chemistry courses (n = 235) versus those at institutions that have not received such funding (n = 286)
Factor U z-Score p value r Median (no funding) Median (funding)
a Corresponds to significance at the 0.007 level. Bonferroni adjusted significance was set at 0.007.
Research experience 26,975 3.87 <0.001a 0.17 −0.01 0.21
Group work and broader communication skills 29,794 2.23 0.026 0.10 −0.02 0.15
Error analysis, data collection and analysis 28,743 2.84 0.004a 0.12 0.01 0.16
Connection between lab and lecture 30,752 1.67 0.095 0.07 0.24 0.05
Transferable skills (lab-specific) 26,287 4.28 <0.001a 0.19 −0.01 0.17
Transferable skills (not lab-specific) 27,230 3.73 <0.001a 0.16 0.01 0.10
Laboratory writing 30,337 1.91 0.056 0.08 0.07 0.14


These results suggest that faculty members who procure funding to improve undergraduate chemistry courses aim to use the instructional laboratory to provide students with research experiences, along with the opportunity to cultivate transferable science skills (e.g., the ability to investigate phenomena in groups, communicate scientific findings, and construct logical arguments from data) and discipline-specific skills (e.g., the ability to collect data via techniques used by practicing chemists). Receipt of such funding may therefore support the alignment of instructional laboratory courses with the practice of chemistry.

This finding contrasts with that of Connor and Raker (2022), in which instrumentation use was unassociated with receipt of funding. Collectively, these findings suggest that faculty members incorporate instrumentation into their courses regardless of whether they have received funding, though receipt of funding may influence how they are able to incorporate instruments into their respective courses. For instance, most analytical chemistry faculty members may incorporate HPLC into their instructional laboratory course, but receipt of funding may facilitate a multi-week experiment focused on optimizing a new separation method to address a broader research question, as such an experiment would require additional resources (e.g., reagents, staff support, etc.). Courses without funds to procure such resources may be limited to single-week instructional laboratory exercises in which students separate a known mixture of compounds using a predetermined optimal separation method.

RQ2: How do differences in goals for the undergraduate chemistry instructional laboratory compare to differences in goals observed in a similar study conducted by Bruck and Towns (2013) in 2009?

Faculty goals for the undergraduate chemistry instructional laboratory varied in both similar and different ways when compared to differences in goals observed via the national survey of postsecondary chemistry faculty members in 2009 (Bruck and Towns, 2013). These results provide an overview of current objectives for the undergraduate chemistry instructional laboratory and insight into how objectives may have changed in the last decade.

Faculty members again placed less emphasis on goals relating to Research Experience and Laboratory Writing in the general chemistry instructional laboratory when compared to other courses

Results from both the 2022 and 2009 national surveys of faculty goals indicate lower scores on the factors Research Experience and Laboratory Writing for general chemistry laboratory courses in comparison to organic and upper-level chemistry laboratory courses. This finding suggests that the general chemistry instructional laboratory was and remains a context in which cultivating knowledge and skills immediately associated with the practice of chemistry may not be a top priority. The finding further underscores the need for an investigation of general chemistry instructors’ potential alternate goals, and how alternate goals contribute to curricular objectives, as this trend appears to be longstanding rather than recent.

Faculty members again placed greater emphasis on goals relating to Group Work and Broader Communication Skills and Error Analysis, Data Collection and Analysis in the upper-level chemistry instructional laboratory when compared to the organic chemistry instructional laboratory, though they did not again place similar emphasis on goals relating to Research Experience, Transferable Skills (Not Lab-Specific), and Laboratory Writing

Factor scores on the Group Work and Error Analysis factors were higher for upper-level chemistry courses when compared to organic chemistry courses for both the 2022 and 2009 national surveys. Further, scores on the factors Research Experience, Transferable Skills (Not Lab-Specific), and Laboratory Writing were higher for upper-level chemistry than organic chemistry in the more recent survey.

Higher scores on a larger number of factors for upper-level chemistry courses in the more recent survey suggest that upper-level and organic chemistry courses are currently more dissimilar in their objectives than a decade prior. These scores currently suggest a more abrupt increase in focus on cultivating disciplinary knowledge and skills when moving from large-enrollment organic chemistry courses to upper-level chemistry courses. This sharper increase may be due to current goals in the organic chemistry instructional laboratory being more focused on serving a broad student population. This recent trend highlights the growing need for efforts focused on supporting faculty members in adopting evidence-based laboratory curricula in large-enrollment contexts and potentially abandoning goals better accomplished in the lecture courses (Woolnough and Allsop, 1985).

Faculty members did not again place similar emphasis on goals relating to Transferable Skills across courses

Scores on the Transferable Skills factors differed across courses in the more recent survey, unlike in the prior survey. Differences included higher scores on the Transferable Skills (Lab-Specific) factor for organic and upper-level chemistry courses in comparison to general chemistry courses, in addition to higher scores on the Transferable Skills (Not Lab-Specific) factor for upper-level courses in comparison to general and organic chemistry courses.

When comparing these differences, scores on both factors are lower for general chemistry and higher for upper-level chemistry, whereas scores for organic chemistry are lower on one factor (i.e., Not Lab-Specific) and higher on the other factor (i.e., Lab-Specific). This comparison provides additional evidence of an increasing instructional focus on cultivating disciplinary knowledge and skills when progressing from general, to organic, to upper-level chemistry laboratory courses. The combination of high and low scores for organic chemistry further suggests that this large-enrollment laboratory course currently acts as a steppingstone in the curriculum, with faculty members aiming to serve a broad student population while still preparing students for upper-level chemistry laboratory courses. This finding further underscores the need for an investigation of potential alternate goals and their alignment with the overarching pedagogical goal of making the laboratory “the place where students learn how to do chemistry” (Seery, 2020).

Faculty goals relating to Research Experience and Error Analysis, Data Collection and Analysis were again associated with receipt of funding

Research Experience and Error Analysis, Data Collection and Analysis goals were again associated with receipt of funding to improve undergraduate chemistry courses. Results suggest both a past and current focus on providing students with research experience and disciplinary knowledge and skills (e.g., the ability to collect data via techniques used by practicing chemists and conduct associated error analysis) among faculty members procuring funding to improve undergraduate chemistry courses. These results suggest that procuring funding was and is a means of aligning one's laboratory course with the practice of chemistry.

Limitations

The validity of data on faculty members’ goals for the general chemistry laboratory was ambiguous, meaning current objectives for this instructional context are unclear. However, this limitation constitutes an important finding; systematically lower scores on six of the seven factors suggest that faculty members adopt a distinct set of instructional goals in the general chemistry laboratory that are not directly associated with the practice of chemistry. This possibility is further supported by the failure to establish scalar invariance for groups by course, as measurement invariance at the metric level implies that while factors share a general meaning between groups, responses are skewed for any number of underlying reasons. Skewness may in part be due to acquiescence bias, in which survey participants provide positive or negative responses despite having no position on given statements (Rocabado et al., 2020); the possible mismatch between goals identified in the Faculty Goals for Undergraduate Chemistry Laboratory Instrument and those espoused by instructors of general chemistry would result in a lack of positionality and, in turn, acquiescence bias among this group of survey participants. Additional research is therefore needed to understand goals for the general chemistry laboratory and its role in the curriculum.

Further, faculty members taking part in this study were exclusively from the United States. While guidelines for postsecondary chemistry programs in the United States share similarities with those of professional societies in other countries, the degree to which findings apply to chemistry instructional laboratories elsewhere is uncertain. Additional research is needed to administer the Faculty Goals for Undergraduate Chemistry Laboratory Instrument in other national contexts and to establish measurement invariance for groups defined by national curriculum.

Implications for instructors

Less emphasis on Research Experience, Transferable Skills (Lab-Specific), and Laboratory Writing goals in general chemistry relative to other courses suggests that faculty members in this context do not necessarily espouse instructional goals immediately associated with the practice of chemistry. This may be due to a variety of factors, including logistical barriers posed by large-enrollment course sizes and the challenge of serving a broad student population (Gibbons et al., 2017). If instructors wish to create a general chemistry environment aligned with the practice of chemistry while still serving that broad population, the chemistry education community may need to re-envision the laboratory experiments and activities provided in this instructional context. Alternatively, instructors and department leaders may need to reconsider whether the general chemistry instructional laboratory is needed at all, as it may not contribute to the overarching objectives of undergraduate chemistry education.

Emerging pedagogies may inform how instructors realign the general chemistry instructional laboratory with the practice of chemistry. For instance, course-based undergraduate research experiences (CUREs) have successfully embedded research into general chemistry instructional laboratories (Clark et al., 2016; May et al., 2018; Sommers et al., 2021), making the numerous benefits of undergraduate research (e.g., increased retention in STEM) available to a larger, broader population of students (Nagda et al., 1998; Bangera and Brownell, 2014). The CURE pedagogy further allows instructors to cultivate discipline-specific skills while serving students enrolled in various majors. Notably, however, the use of CUREs in large-enrollment contexts is limited (Cruz et al., 2020). Other forms of inquiry-based laboratory instruction that require smaller-scale curricular transformations may provide a more productive means of incorporating disciplinary activities into large-enrollment contexts (Winkelmann et al., 2015).

Instructors of organic and upper-level chemistry laboratory courses may also consider CURE adoption, as they place emphasis on goals relating to Research Experience, Group Work and Broader Communication Skills, Error Analysis, Data Collection and Analysis, and Transferable Skills similar to that of inorganic chemistry faculty members who implement CUREs in their laboratory courses (Connor et al., 2022). Adoption of this evidence-based pedagogy may help ensure students’ learning while allowing instructors to maintain their own distinct set of instructional goals. Lastly, instructors of organic and upper-level laboratory courses should evaluate whether their greater emphasis on Laboratory Writing goals is driven by a desire to cultivate disciplinary writing skills or by their use of formal laboratory reports as the primary mode of assessment. If the latter, instructors may need to consider incorporating competency-based assessments to evaluate and encourage students’ cultivation of practical skills (Seery et al., 2017; Pullen et al., 2018).

Implications for department and institution leaders

Evidence of an association between goals relating to most factors and receipt of funding (i.e., internal or external) suggests that faculty with such funding aim to use the instructional laboratory to provide students with research experience, disciplinary knowledge and skills, and transferable science skills. It is possible that receipt of funding provides instructors with a means of accomplishing such goals, which in turn results in their adoption. It is further possible that instructors who successfully procure funding already espouse these goals and effectively communicate them to funding agencies. Yet regardless of causality, department and institution leaders should encourage and support faculty members in procuring funding to improve their undergraduate chemistry courses, as findings suggest that providing such resources leads to positive change.

Further, lack of evidence for an association between faculty goals and institutions’ highest chemistry degree awarded suggests that having a graduate-level chemistry research program does not necessarily influence instructors’ goals relating to Research Experience. Leaders of departments with such programs should therefore not assume that their undergraduate instructional laboratories afford such learning opportunities, but rather explicitly encourage instructors to incorporate these experiences.

In addition to explicitly encouraging instructors to incorporate research experiences into their courses, departments and institutions should also support instructors in embedding undergraduate research into large-enrollment courses such as general chemistry. This support could entail offering faculty development opportunities focused on designing and implementing course-based undergraduate research experiences or providing additional instrumentation (or access to additional instrumentation) for use in instructional laboratories. Undergraduate research is a high-impact practice with numerous associated benefits, and its incorporation in large-enrollment contexts would support realignment of these courses with the practice of chemistry.

Implications for policy

No evidence of an association was found between faculty goals and approval from ACS to award certified bachelor's chemistry degrees, suggesting that faculty members have similar goals for the undergraduate instructional laboratory regardless of ACS-approval status. The ACS's Committee on Professional Training may therefore consider more explicit evaluation criteria regarding goals for instructional laboratory courses in order to differentiate chemistry bachelor's degree programs effectively. Guidelines on these explicit evaluation criteria would allow departments to adopt a set of coherent, distinct goals aimed at further improving the undergraduate chemistry instructional laboratory. Other professional societies, such as the Royal Society of Chemistry, may also wish to reflect on their evaluation criteria with regard to goals for instructional laboratory courses based on our findings, with a focus on ensuring their explicitness.

Implications for researchers

Lower scores on six of seven factors for the general chemistry laboratory when compared to the upper-level chemistry laboratory suggest that instructors in general chemistry may adopt an alternate set of instructional goals not immediately associated with the practice of chemistry. The Faculty Goals for Undergraduate Chemistry Laboratory Instrument may therefore be inappropriate for use in the context of general chemistry. Additional research is needed to understand the potentially distinct set of instructional goals for the general chemistry laboratory separate from other chemistry courses. Understanding these goals will be essential for evaluating the role of the general chemistry instructional laboratory in the broader undergraduate chemistry curriculum and potentially deciding whether such a course fits with broader curricular objectives.

Further, instructors may require support in aligning their goals for the general chemistry laboratory with the practice of chemistry. Insight into factors that contribute to their adoption of alternate goals (e.g., course size or limited laboratory staff support) may be necessary to effectively support alignment, as instructors will need assistance in surmounting these barriers. Knowledge of these factors may also inform how to best support the adoption of inquiry-based and evidence-based pedagogies in large-enrollment contexts, as such factors may also serve as barriers to their implementation. Research focused on supporting adoption of evidence-based instructional laboratory pedagogies across large-enrollment contexts (i.e., general and organic chemistry) is needed.

Research focused on Laboratory Writing goals in organic and upper-level courses is needed. It is unclear whether instructors’ greater emphasis on these goals reflects a desire to cultivate disciplinary writing skills or the use of formal laboratory reports as a primary mode of assessment. If the latter is true, these instructors may need support in adopting competency-based assessments (Pullen et al., 2018).

For researchers seeking to further improve the functionality of the instrument in organic and upper-level courses, modifications to the Transferable Skills (Not Lab-Specific) factor may be a promising first step. As noted, McDonald's ω for this factor likely did not exceed the 0.70 cutoff due to the loose inter-relatedness of items measuring this broad construct (Tavakol and Dennick, 2011). The loading of item TN3 is particularly low in comparison to other items (i.e., 0.33; see Table 3). While the construct is broad, replacing item TN3 with a new item more closely related to items TN1 and TN2 may improve the reliability of the measure. More items may also be needed to provide a valid measure of such a broad construct. Researchers should further note the collinearity of the Transferable Skills (Lab-Specific), Transferable Skills (Not Lab-Specific), and Research Experience factors observed in Connor et al. (2022), as it suggests these factors measure a single, broader construct. With the rise of inquiry-based and research-based laboratory curricula, additional work is needed to understand this broader construct.
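The reliability argument above can be made concrete: for a single factor, McDonald's ω is computable directly from standardized loadings. In the Python sketch below, only TN3's loading of 0.33 comes from Table 3; the TN1 and TN2 loadings and the 0.60 replacement loading are hypothetical values chosen for illustration, so the resulting ω values demonstrate the mechanism rather than re-estimate the instrument.

```python
# McDonald's omega from standardized factor loadings: a minimal sketch.
# TN3's loading of 0.33 is reported in Table 3; the TN1 and TN2 loadings
# and the 0.60 replacement loading are hypothetical values for illustration.

def mcdonalds_omega(lambdas):
    """omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where each standardized item's error variance is 1 - loading^2."""
    s = sum(lambdas)
    error = sum(1 - lam ** 2 for lam in lambdas)
    return s ** 2 / (s ** 2 + error)

loadings = [0.72, 0.68, 0.33]                        # TN1, TN2 (assumed), TN3 (reported)
omega_with_tn3 = mcdonalds_omega(loadings)           # ~0.61, below the 0.70 cutoff
omega_revised = mcdonalds_omega([0.72, 0.68, 0.60])  # ~0.71 with a stronger item
```

Under these assumed loadings, the low-loading item holds ω below 0.70, while a replacement item loading at 0.60 lifts ω just past the cutoff, illustrating why revising TN3 may improve the measure's reliability.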

Conclusions

Results from a national survey of chemistry faculty members in the United States (n = 521) provide an overview of current goals for the undergraduate chemistry instructional laboratory, including insight into how goals compare to those from a decade prior (Bruck and Towns, 2013). Measures of faculty members’ goals were obtained using a modified version of the Faculty Goals for Undergraduate Chemistry Laboratory Instrument (Bruck and Towns, 2013), with evidence of validity, reliability, and measurement invariance providing general support for its use across course, institution type, and funding status. One exception may be the general chemistry instructional laboratory, as instructors of this course exhibited lower scores on nearly all factors, suggesting that they may adopt alternate goals not immediately associated with the practice of chemistry. Measurement invariance established at just the metric level further supports this possibility. This finding highlights the need for an instrument appropriate for use in the general chemistry instructional laboratory, as insight into these goals will be necessary for understanding the role of this course in the larger curriculum and, in turn, evaluating whether the course is necessary for accomplishing curricular objectives.

Goals for the undergraduate chemistry instructional laboratory were found to be associated with course and receipt of funding for improving undergraduate chemistry courses, similar to findings of the 2009 national survey of faculty goals for the undergraduate chemistry laboratory (Bruck and Towns, 2013). In addition, goals were found to be unassociated with institution type, both with regard to institutions’ highest chemistry degree awarded and institutions’ approval from a leading professional society to award certified bachelor's chemistry degrees. Departments should thus provide instructors with guidance in adopting goals associated with the practice of chemistry regardless of their program's research activity. Professional societies may also consider adopting more explicit evaluation criteria to better differentiate institutions and positively guide their instructional efforts. Results further indicate that faculty goals increasingly focus on providing research experience and disciplinary knowledge and skills with progression from general, to organic, to upper-level chemistry laboratory courses. This focus increases substantively with movement from large-enrollment to upper-level courses, suggesting that these contexts may limit the goals that instructors can reasonably achieve. Increased focus on supporting the adoption of inquiry-based and research-based pedagogies may constitute one means of realigning these courses with the practice of chemistry while still serving their broad student populations.

Author contributions

M. C. C. and J. R. R. conceived the project. M. C. C. and J. R. R. collected the data. M. C. C. conducted the psychometric evaluations and statistical analyses. M. C. C., G. A. R., and J. R. R. discussed and interpreted study results. M. C. C. authored the paper. All authors reviewed and edited the manuscript.

Conflicts of interest

There are no conflicts to declare.

Appendix

Table 12 Mann–Whitney U test comparisons of factor scores between faculty members at institutions granting only bachelor's degrees (n = 365) versus those at institutions granting bachelor's and graduate chemistry degrees (n = 156)
Factor | U | z-score | p value | r | Median (bachelor) | Median (graduate)
Research experience | 27 774 | 0.44 | 0.659 | 0.02 | 0.12 | 0.03
Group work and broader communication skills | 27 818 | 0.42 | 0.679 | 0.02 | 0.06 | 0.10
Error analysis, data collection and analysis | 28 417 | 0.03 | 0.973 | 0.00 | 0.09 | 0.05
Connection between lab and lecture | 26 169 | 1.46 | 0.144 | 0.06 | 0.18 | 0.15
Transferable skills (lab-specific) | 27 286 | 0.75 | 0.452 | 0.03 | 0.07 | 0.09
Transferable skills (not lab-specific) | 27 633 | 0.53 | 0.600 | 0.02 | 0.04 | 0.06
Laboratory writing | 27 479 | 0.63 | 0.530 | 0.03 | 0.09 | 0.12


Table 13 Mann–Whitney U test comparisons of factor scores between faculty members at institutions with approval from ACS to award certified bachelor's chemistry degrees (n = 409) versus those at institutions without approval (n = 112)
Factor | U | z-score | p value | r | Median (approved) | Median (nonapproved)
Research experience | 20 581 | 1.64 | 0.100 | 0.07 | 0.12 | 0.07
Group work and broader communication skills | 22 346 | 0.40 | 0.693 | 0.02 | 0.07 | 0.05
Error analysis, data collection and analysis | 22 584 | 0.23 | 0.821 | 0.01 | 0.05 | 0.10
Connection between lab and lecture | 19 987 | 2.07 | 0.039 | 0.09 | 0.16 | 0.26
Transferable skills (lab-specific) | 20 242 | 1.89 | 0.059 | 0.08 | 0.08 | 0.02
Transferable skills (not lab-specific) | 20 505 | 1.70 | 0.089 | 0.07 | 0.07 | 0.01
Laboratory writing | 21 355 | 1.10 | 0.273 | 0.05 | 0.11 | 0.07
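The statistics reported in Tables 12 and 13 (U, z, and the effect size r = |z|/√N) can be reproduced for any pair of groups with a short script. The sketch below uses synthetic factor scores, not study data, and the normal approximation for z without a tie correction:

```python
import math

# Synthetic factor scores for two hypothetical instructor groups (not study data).
group_a = [0.12, 0.18, 0.05, 0.22, 0.09, 0.15, 0.11, 0.20]
group_b = [0.03, 0.10, 0.07, 0.01, 0.12, 0.06, 0.04, 0.08]

def mann_whitney_u_r(x, y):
    """Mann-Whitney U for group x, with effect size r = |z| / sqrt(N)
    from the normal approximation (no tie correction)."""
    combined = sorted(x + y)
    ranks = {}  # value -> midrank (tied values share their average rank)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    rank_sum_x = sum(ranks[v] for v in x)
    n1, n2 = len(x), len(y)
    u = rank_sum_x - n1 * (n1 + 1) / 2
    mean_u = n1 * n2 / 2
    sd_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mean_u) / sd_u
    return u, abs(z) / math.sqrt(n1 + n2)

u, r = mann_whitney_u_r(group_a, group_b)
```

Cohen's (1992) benchmarks (r ≈ 0.1 small, 0.3 medium, 0.5 large) are the usual yardstick for interpreting the r values reported in the tables.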


Acknowledgements

We thank the 521 faculty members who gave of their time to complete the survey, describe their instructional laboratory courses, and report their goals for those courses.

References

  1. Allen W. E., Hosbein K. N., Kennedy A. M., Whiting B. and Walker J. P., (2021a), Design and Implementation of an Organic to Analytical CURE Sequence. J. Chem. Educ., 98(7), 2199–2208.
  2. Allen W. E., Hosbein K. N., Kennedy A. M., Whiting B. and Walker J. P., (2021b), Embedding research directly into the chemistry curriculum with an organic to analytical sequence. J. Chem. Educ., 98(7), 2188–2198.
  3. Bangera G. and Brownell S. E., (2014), Course-Based Undergraduate Research Experiences Can Make Scientific Research More Inclusive. CBE Life Sci. Educ., 13(4), 602–606.
  4. Bretz S. L., (2019), Evidence for the Importance of Laboratory Courses. J. Chem. Educ., 96(2), 193–195.
  5. Bruck A. D. and Towns M., (2013), Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory. J. Chem. Educ., 90, 685–693.
  6. Bruck L. B., Towns M. and Bretz S. L., (2010), Faculty perspectives of undergraduate chemistry laboratory: Goals and obstacles to success. J. Chem. Educ., 87(12), 1416–1424.
  7. Christianson A. M. and Waters C. A., (2021), Silver Chloride Waste Recycling as a Guided-Inquiry Experiment for the Instrumental Analysis Laboratory. J. Chem. Educ., 99(2), 1014–1020.
  8. Clark T. M., Ricciardo R. and Weaver T., (2016), Transitioning from Expository Laboratory Experiments to Course-Based Undergraduate Research in General Chemistry. J. Chem. Educ., 93(1), 56–63.
  9. Cohen J., (1992), A power primer. Psychol. Bull., 112(1), 155–159.
  10. Committee on Professional Training, (2015), Undergraduate professional education in chemistry: ACS guidelines and evaluation procedures for bachelor's degree programs, American Chemical Society.
  11. Committee on Professional Training, (2020), Laboratory Experiences that Require Hands-on Experience: A response to COVID-19, American Chemical Society.
  12. Connor M. C., Pratt J. M. and Raker J. R., (2022), Goals for the Undergraduate Instructional Inorganic Chemistry Laboratory When Course-Based Undergraduate Research Experiences Are Implemented: A National Survey, J. Chem. Educ. DOI:10.1021/acs.jchemed.2c00267.
  13. Connor M. C. and Raker J. R., (2022), Instrumentation Use in Postsecondary Instructional Chemistry Laboratory Courses: Results from a National Survey, J. Chem. Educ., 99(9), 3143–3154.
  14. Cox B. E., Mcintosh K. L., Reason R. D. and Terenzini P. T., (2011), A Culture of Teaching: Policy, Perception, and Practice in Higher Education. Res. High. Educ., 52, 808–829.
  15. Cruz C. L., Holmberg-Douglas N., Onuska N. P. R., McManus J. B., MacKenzie I. A., Hutson B. L., et al., (2020), Development of a Large-Enrollment Course-Based Research Experience in an Undergraduate Organic Chemistry Laboratory: Structure-Function Relationships in Pyrylium Photoredox Catalysts. J. Chem. Educ., 97(6), 1572–1578.
  16. Diemer M. A., Rapa L. J., Park C. J. and Perry J. C., (2017), Development and Validation of the Critical Consciousness Scale. Youth Soc., 49(4), 461–483.
  17. Gibbons R. E., Laga E. E., Leon J., Villafañ S. M., Stains M., Murphy K., et al., (2017), Chasm Crossed? Clicker Use in Postsecondary Chemistry Education. J. Chem. Educ., 94(5), 549–557.
  18. Grushow A., Hunnicutt S., Muñiz M., Reisner B. A., Schaertel S. and Whitnell R., (2021), Journal of Chemical Education Call for Papers: Special Issue on New Visions for Teaching Chemistry Laboratory. J. Chem. Educ., 98(11), 3409–3411.
  19. Hanson P. K. and Stultz L. K., (2021), Linking Chemistry and Biology through Course-Based Undergraduate Research on Anticancer Ruthenium Complexes.
  20. Hofstein A. and Mamlok-Naaman R., (2007), The laboratory in science education: the state of the art.
  21. Hu L. T. and Bentler P. M., (1999), Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. Multidiscip. J., 6(1), 1–55.
  22. Hunnicutt S. S., Grushow A. and Whitnell R., (2014), Guided-Inquiry Experiments for Physical Chemistry: The POGIL-PCL Model. J. Chem. Educ., 92(2), 262–268.
  23. Komperda R., Pentecost T. C. and Barbera J., (2018), Moving beyond Alpha: A Primer on Alternative Sources of Single-Administration Reliability Evidence for Quantitative Chemistry Education Research. J. Chem. Educ., 95(9), 1477–1491.
  24. Lave J. and Wenger E., (1991), Situated Learning, Cambridge University Press.
  25. Li C., (2016), Confirmatory factor analysis with ordinal data: Comparing robust maximum likelihood and diagonally weighted least squares. Behav. Res. Methods, 48, 936–949.
  26. May N. W., McNamara S. M., Wang S., Kolesar K. R., Vernon J., Wolfe J. P., et al., (2018), Polar Plunge: Semester-long snow chemistry research in the general chemistry laboratory. J. Chem. Educ., 95(4), 543–552.
  27. Nagda B. A., Gregerman S. R., Jonides J., Von Hippel W. and Lerner J. S., (1998), Undergraduate Student-Faculty Research Partnerships Affect Student Retention, Rev. High. Educ., 22, 55–72.
  28. National Center for Education Statistics, (2021), Integrated Postsecondary Education Data System.
  29. National Research Council, (2012), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas.
  30. Pagano J. K., Jaworski L., Lopatto D. and Waterman R., (2018), An Inorganic Chemistry Laboratory Course as Research. J. Chem. Educ., 95(9), 1520–1525.
  31. Pullen R., Thickett S. C. and Bissember A. C., (2018), Investigating the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate chemistry. Chem. Educ. Res. Pract., 19, 629.
  32. Raker J., Pratt J., Connor M., Smith S., Stewart J., Reisner B., et al., (2022), The Postsecondary Inorganic Chemistry Instructional Laboratory Curriculum: Results from a National Survey. J. Chem. Educ., 99, 1971–1981.
  33. Rattanakit P., (2021), Open Inquiry-Based Laboratory Project on Plant-Mediated Green Synthesis of Metal Nanoparticles and Their Potential Applications. J. Chem. Educ., 98(12), 3984–3991.
  34. Reed J. J., Villafañe S. M., Raker J. R., Holme T. A. and Murphy K. L., (2017), What We Don’t Test: What an Analysis of Unreleased ACS Exam Items Reveals about Content Coverage in General Chemistry Assessments. J. Chem. Educ., 94(4), 418–428.
  35. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry. Chem. Educ. Res. Pract., 8(2), 172–185.
  36. Reisner B. A., Smith S. R., Stewart J. L., Raker J. R., Crane J. L., Sobel S. G. and Pesterfield L. L., (2015), Great Expectations: Using an Analysis of Current Practices To Propose a Framework for the Undergraduate Inorganic Curriculum. Inorg. Chem., 54, 8859–8868.
  37. Revelle W., (2018), psych: Procedures for Personality and Psychological Research.
  38. Rocabado G. A., Komperda R., Lewis J. E. and Barbera J., (2020), Addressing diversity and inclusion through group comparisons: a primer on measurement invariance testing. Chem. Educ. Res. Pract., 21(3), 969–988.
  39. Roller R. M., Sumantakul S., Tran M., Van Wyk A., Zinna J., Donelson D. A., et al., (2021), Inquiry-Based Laboratories Using Paper Microfluidic Devices. J. Chem. Educ., 98(6), 1946–1953.
  40. Rosseel Y., (2012), lavaan: An R Package for Structural Equation Modeling.
  41. Royal Society of Chemistry, (2022), Accreditation of Degree Programmes.
  42. Sansom R. and Walker J. P., (2019), Investing in Laboratory Courses. J. Chem. Educ., 96(2), 193–195.
  43. Seery M. K., (2020), Establishing the Laboratory as the Place to Learn How to Do Chemistry. J. Chem. Educ., 97(6), 1511–1514.
  44. Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O’Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges. Chem. Educ. Res. Pract., 18(3), 403–419.
  45. Sheskin D. J., (2011), Handbook of parametric and nonparametric statistical procedures, 5th edn, CRC Press.
  46. Smythers A. L., Ford M. M., Hawkins D. G., Connor M. C., Lawrence K. C., Stanton C. R., et al., (2021), Modernizing the Analytical Chemistry Laboratory: The Design and Implementation of a Modular Protein-Centered Course. J. Chem. Educ., 98(5), 1645–1652.
  47. Sommers A. S., Miller A. W., Gift A. D., Richter-Egger D. L., Darr J. P. and Cutucache C. E., (2021), CURE Disrupted! Takeaways from a CURE without a Wet-Lab Experience. J. Chem. Educ., 98(2), 357–367.
  48. Srinivasan S., Gibbons R. E., Murphy K. L. and Raker J., (2018), Flipped classroom use in chemistry education: Results from a survey of postsecondary faculty members. Chem. Educ. Res. Pract., 19(4), 1307–1318.
  49. StataCorp, (2021), Stata Statistical Software: Release 17.
  50. Tavakol M. and Dennick R., (2011), Making sense of Cronbach's alpha. Int. J. Med. Educ., 2, 53–55.
  51. The Quality Assurance Agency for Higher Education, (2022), Subject Benchmark Statement: Chemistry, 5th edn, Southgate House.
  52. Walker J. P., Sampson V. and Zimmerman C. O., (2011), Argument-Driven Inquiry: An Introduction to a New Instructional Model for Use in Undergraduate Chemistry Labs. J. Chem. Educ., 88, 1048–1056.
  53. Waterman R. and Heemstra J., (2018), Expanding the CURE Model: Course-based Undergraduate Research Experience, Research Corporation for Science Advancement.
  54. Watts F. M., Spencer J. L. and Shultz G. V., (2021), Writing Assignments to Support the Learning Goals of a CURE. J. Chem. Educ., 98(2), 510–514.
  55. Weaver G. C., Russell C. B. and Wink D. J., (2008), Inquiry-based and research-based laboratory pedagogies in undergraduate science, Nat. Chem. Biol., 4(10), 577–580.
  56. Winkelmann K., Baloga M., Marcinkowski T., Giannoulis C., Anquandah G. and Cohen P., (2015), Improving students’ inquiry skills and self-efficacy through research-inspired modules in the general chemistry laboratory. J. Chem. Educ., 92(2), 247–255.
  57. Woolnough B. E. and Allsop T., (1985), Practical Work in Science, Cambridge University Press.
  58. Yik B. J., Raker J. R., Apkarian N., Stains M., Henderson C., Dancy M. H. and Johnson E., (2022), Evaluating the impact of malleable factors on percent time lecturing in gateway chemistry, mathematics, and physics courses. Int. J. STEM Educ., 9(1), 1–23.

This journal is © The Royal Society of Chemistry 2023