Using the EPIC (exposure, persuasion, identification, and commitment) to develop a measure of student buy-in to laboratory learning goals

Elizabeth B. Vaughan, A. Montoya-Cowan and Jack Barbera*
Department of Chemistry, Portland State University, Portland, Oregon 97207-0751, USA. E-mail: jack.barbera@pdx.edu

Received 7th June 2025 , Accepted 14th August 2025

First published on 14th August 2025


Abstract

Buy-in has been described as the series of judgements students make when deciding to engage with a pedagogical practice. Buy-in to pedagogical practices is known to have a meaningful impact on students’ engagement in learning environments, making it a construct of interest for researchers and practitioners. This study develops a measure of student buy-in to faculty defined laboratory learning goals by adapting an existing measure which operationalizes the construct in terms of Exposure, Persuasion, Identification, and Commitment (EPIC). The adapted measure presented in this study, termed the EPIC-LaG (Laboratory Goals), was developed using learning goals for general and organic chemistry laboratory courses and psychometrically evaluated. Evidence related to response process indicated that students were interpreting and responding to EPIC-LaG items as intended. Structural validity evidence provided support for the unidimensional constructs of Exposure, Persuasion, Identification, and Commitment, as well as for the structural model relating the constructs. Single administration reliability evidence provided support for the internal consistency of the items. Finally, evidence of scalar measurement invariance was found for each group in the study, demonstrating the generalizability of the structural model across groups, which provides support for comparisons made between them. When comparisons were investigated, differences in buy-in pathways were identified between general and organic chemistry laboratory courses, and between ‘cookbook’ and argument-driven inquiry style general chemistry laboratory courses. This study provides psychometric evidence to support the interpretation of EPIC-LaG data and serves as a foundation for others interested in adapting the EPIC-LaG to investigate students’ buy-in to their laboratory learning goals.


Introduction

Undergraduate laboratory courses are considered by many education researchers and practitioners to be an integral aspect of students’ training in the field of chemistry (Bowers, 1924; Nersessian, 1989; Tobin, 1990; Reid and Shah, 2007). While there is a general consensus that hands-on laboratory experiences are important for students, limited evidence exists for the impact of laboratory courses upon student learning (Bretz, 2019). This conundrum has spurred many chemistry education researchers toward investigating and attempting to improve students’ learning experiences within undergraduate chemistry laboratory courses (Bretz, 2019; Grushow et al., 2021). As a part of this research trend, many chemistry education researchers and practitioners have begun to replace cookbook style laboratory courses (i.e., laboratory courses with predetermined procedures and expected outcomes) with inquiry-based and/or research-based laboratory courses (e.g., Weaver et al., 2008; Grushow et al., 2021). As pedagogical updates are being made in the laboratory environment, it is necessary for researchers and practitioners to investigate the impact that these changes have on student learning experiences.

It is well known across education literature that students’ active engagement in the learning process has the potential to positively impact course outcomes, including test performance, grades, and persistence (Lester, 2013). Despite its benefits, student engagement is not a guarantee. When provided with an opportunity to engage in the learning process, “students may decide to participate based on a series of judgments, including whether the activities are deemed as valuable to the learning process, are enjoyable, or allow for meaningful interaction with others” (Cavanagh et al., 2016). This series of judgements is better known in the education research community as ‘buy-in’.

Buy-in

A systematic definition of the term ‘buy-in’ does not currently exist in the literature. That said, researchers and theorists have posited that buy-in reflects “elements of participation, support, and a sense of commitment to change, as well as a belief that such changes will have a positive impact on student learning” (Cavanagh et al., 2016; see also Levin, 2000; Brazeal et al., 2016). Although a high level of student buy-in is desirable across classroom settings, the judgements that may lead to student buy-in (or lack thereof) to a particular educational experience may be influenced by a variety of factors, including instructor support, classroom climate, and prior learning experiences (Freeman et al., 2007; Micari and Pazos, 2012; Tanner, 2013; Zumbrunn et al., 2014; Cavanagh et al., 2016). Because buy-in can be impacted by personal, interpersonal, and environmental factors, tools for measuring this construct likely need to be aligned with a specific learning environment.

Measuring buy-in

While quantitative measures of buy-in exist across education literature (Brazeal et al., 2016; Cavanagh et al., 2016; Aragón et al., 2017; Wang et al., 2021; Donis et al., 2024), none have been explicitly developed to measure student buy-in in lower division undergraduate chemistry laboratory courses. That said, there is a measure of buy-in that has been adapted for use in STEM lecture courses. In 2017 Aragón and colleagues published a measure that assessed faculty buy-in to the adoption of inclusive teaching practices (Aragón et al., 2017). Because there is not currently a literature consensus on the definition and structure of buy-in, the measure developed by Aragón and colleagues operationalizes the concept through a series of self-report items intended to capture four key elements represented by the acronym EPIC: Exposure, Persuasion, Identification, and Commitment. These elements allow instructors and researchers to answer a series of questions (noted below), which may help in furthering understanding of participant buy-in to and engagement in educational practices.

Has the participant…

(1) …been exposed to a particular experience or practice? (Exposure);

(2) …been persuaded that the experience or practice is good? (Persuasion);

(3) …identified that the experience or practice is good for them personally and/or within their unique context? (Identification);

(4) …committed to participating in or using the experience or practice? (Commitment)

Aragón and colleagues used a logic-based argument to hypothesize that “each step … could predict attrition (and adoption) beginning with exposure, through persuasion, identification, and commitment, then onto implementation” (Aragón et al., 2017). In the context of inclusive teaching practices, the researchers argued that in order for a faculty member to adopt an inclusive teaching practice, the faculty member must first know that the practice exists (Exposure). Once the faculty member is aware of the practice, they must be convinced that the practice is generally a ‘good idea’ (Persuasion). Once the faculty member believes that the practice is generally good, they must also believe that the practice is good for themselves personally (e.g., compatible with their teaching approach) (Identification). Finally, the faculty member may choose to commit to the inclusive teaching practice (Commitment).

In 2016, the EPIC was adapted by science education researchers to assess student buy-in to active learning in a college-level science course (Cavanagh et al., 2016). More recently, in 2021, the EPIC was once again adapted to investigate college students’ buy-in to evidence-based teaching practices in STEM as part of a larger study (Wang et al., 2021). In this model-based study, Wang and colleagues investigated the paths among the buy-in steps as well as the paths that connect the buy-in steps to other variables of interest to the researchers (Wang et al., 2021). The current study is part of a larger project exploring the relations among students’ expectations, buy-in, and engagement in lower-division undergraduate chemistry laboratory courses. Therefore, the model proposed by Wang et al. (2021) will be utilized when evaluating the relations among buy-in elements in this study.

Faculty learning goals

In any undergraduate course, there are a variety of pedagogical structures and practices that students may choose to buy in to. This study is focused on investigating students’ buy-in to laboratory learning goals. Learning goals are generally defined as the intended purposes and desired learning objectives of a particular course, which identify the knowledge and/or skills that a student in that course should achieve (Sheridan Center for Teaching and Learning, Brown University, 2024). Learning goals for undergraduate chemistry laboratory courses are typically determined by the faculty members and/or chemistry department that developed each course, and can vary widely by instructor, course level, and institution (Bruck et al., 2010; Bretz et al., 2013; Bruck and Towns, 2013). Because of this variability, meaningfully investigating students’ buy-in to faculty defined learning goals in a particular environment requires developing a measure that is specific to that environment. Laboratory learning goals for the environments of interest in this study have previously been investigated by the authors of this manuscript (Vaughan et al., 2024).

Validity and reliability evidence

When an existing measure, such as the EPIC, is adapted to a new environment, evidence supporting the validity and reliability of the data collected with the measure must be gathered. The types of validity and reliability evidence that are of relevance to this study include: response process validity, internal structure validity, measurement invariance, and single administration reliability. Collecting evidence in support of response process validity suggests that participants are interpreting and responding to items in a manner that aligns with the expectations of the researchers (Collins, 2003; Arjoon et al., 2013; Deng et al., 2021). Evidence supporting internal structure validity indicates that relations between survey items and latent constructs match the hypothetical structure of the construct that is being studied (Worthington and Whittaker, 2006; Arjoon et al., 2013). Measurement invariance evaluates the similarities in the internal structure of data collected from multiple groups. Evidence of measurement invariance provides support for making comparisons between responses from the groups (Rocabado et al., 2020). Finally, evidence supporting single administration reliability suggests that participants’ responses are consistent among items measuring the same construct (Kuder and Richardson, 1937; Komperda et al., 2018; Taber, 2018).

Purpose of this study

As part of a larger project exploring the relations among students’ expectations, buy-in, and engagement in lower-division undergraduate chemistry laboratory courses, this study focused on adapting and psychometrically evaluating a measure to investigate students’ buy-in to faculty defined learning goals in lower-division chemistry laboratory courses at Portland State University. The research questions guiding this study are as follows:

1. What psychometric evidence supports the validity and reliability of EPIC-LaG data collected in the general and organic chemistry laboratories at Portland State University?

2. If sufficient evidence of validity and reliability is found to support the interpretation of data collected with the measure, how does student buy-in differ between courses and pedagogical styles?

Methods

Adapting the EPIC

Participants’ buy-in to a particular practice is assessed through a four-part item set that determines their levels of exposure, persuasion, identification, and commitment. At the beginning of each item set, an individual practice is presented to the participant, who is prompted to indicate whether or not they have been exposed to this practice through a yes or no response. For example, when the EPIC was used to investigate student buy-in to evidence-based teaching practices in STEM, items that were presented to students included: “I completed in-class activities in groups of two or more,” and “I received feedback on my progress towards course objectives throughout the semester” (Wang et al., 2021). Participants who indicate that they have been exposed to the practice (i.e., selected ‘yes’) are then prompted with the remaining three statements in the set (Table 1) and respond using the same yes/no answer format. These follow-up prompts remain consistent between item sets and are used to investigate participants’ levels of persuasion, identification, and commitment. When the EPIC was used to investigate student buy-in to evidence-based teaching practices in STEM, the follow-up statements included “I believe this advances my learning” (persuasion); “I like to learn this way” (identification); and “I want to do this in future courses” (commitment) (Wang et al., 2021). Table 1 provides an example of a complete item set for a single EPIC item.
Table 1 Example of an EPIC item set investigating student buy-in to evidence-based teaching practices in STEM (Wang et al., 2021)
Prompt for exposure (yes/no response)
I completed in-class activities in groups of two or more.
If the response to the prompt for exposure is yes, then the prompts for persuasion, identification, and commitment are presented simultaneously.

Prompt for persuasion (yes/no response): I believe this advances my learning.
Prompt for identification (yes/no response): I like to learn this way.
Prompt for commitment (yes/no response): I want to do this in future courses.
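The branching logic of a single item set can be sketched as follows. This is a minimal illustration only, not the authors' survey software; the function and field names are hypothetical, while the prompt wording is taken from Table 1.

```python
# Minimal sketch of one EPIC item set's branching logic: the persuasion,
# identification, and commitment prompts are presented only when the
# participant answers 'yes' to the exposure prompt. Names are hypothetical.

FOLLOW_UPS = {
    "persuasion": "I believe this advances my learning.",
    "identification": "I like to learn this way.",
    "commitment": "I want to do this in future courses.",
}

def administer_item_set(exposure_prompt, respond):
    """respond(prompt) -> True for a 'yes' response, False for 'no'."""
    responses = {"exposure": respond(exposure_prompt)}
    if responses["exposure"]:  # 'yes' unlocks the remaining three prompts
        for element, prompt in FOLLOW_UPS.items():
            responses[element] = respond(prompt)
    return responses

# A participant who has not been exposed sees only the exposure prompt
result = administer_item_set(
    "I completed in-class activities in groups of two or more.",
    lambda prompt: False,
)
```

Because a ‘no’ to exposure ends the item set, exposure acts as a gate: the remaining buy-in elements are only defined for exposed participants, which mirrors the structural model described later.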


EPIC items

This study builds on a previous qualitative study investigating the learning goals and expectations of laboratory coordinators, graduate teaching assistants, and students enrolled in general and organic chemistry laboratory courses at Portland State University (Vaughan et al., 2024). The Vaughan et al. (2024) study was particularly focused on investigating the overarching goals and expectations for each of the laboratory courses of interest (as opposed to specific weekly or topic focused goals). This focus was addressed by conducting semi-structured interviews with the laboratory coordinators who manage and oversee the general and organic chemistry laboratory courses at Portland State University. Interested readers are referred to the original publication for additional details related to data collection, analysis, and results. In order to convert the faculty defined learning goals into exposure prompts for the EPIC, each learning goal was converted into the past tense and a universal item stem, “In my [general/organic] chemistry laboratory course this term I…”, was added. The original EPIC prompts for persuasion, identification, and commitment were also updated to better reflect the laboratory learning environment. The new prompts for persuasion, identification, and commitment were as follows: “I believe that this was an important thing to learn in my laboratory course” (Persuasion), “I liked learning about this in my laboratory course” (Identification), and “I want to learn more about this in the future” (Commitment). Each of the prompts for exposure, persuasion, identification, and commitment included in the adapted version of the EPIC, termed the EPIC-LaG (Laboratory Goals), can be found in Table 2.
Table 2 Exposure, persuasion, identification, and commitment prompts for the EPIC-LaG
Prompts for exposure (yes/no response)
a Prompts for persuasion, identification, and commitment are consistent for every item (i.e., laboratory learning goal).
Item stem: in my [general/organic] chemistry laboratory course this term…
I learned how to collect and accurately record data.
I learned how to describe the properties of substances qualitatively.
I learned how to describe the properties of substances quantitatively.
I learned about error analysis.
I learned about lab safety.
I learned about written communication skills.
I learned about hands on laboratory skills.
I applied chemistry concepts that were learned in the lecture.
I learned how to behave professionally in a laboratory setting.
I learned about how chemistry concepts apply in real life.
I learned about the differences between theoretical concepts and reality.

Prompt for persuasion (yes/no response)a: I believe that this was an important thing to learn in my laboratory course.
Prompt for identification (yes/no response)a: I liked learning about this in my laboratory course.
Prompt for commitment (yes/no response)a: I want to learn more about this in the future.
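The conversion described above amounts to simple string assembly: each past-tense learning goal is appended to the universal item stem. A sketch, using goal text from Table 2 (the helper name is hypothetical):

```python
# Sketch of how each faculty-defined learning goal (Table 2) becomes an
# exposure prompt: the past-tense goal is appended to the universal item
# stem. The helper name is hypothetical.

ITEM_STEM = "In my {course} chemistry laboratory course this term "

def exposure_prompt(goal, course):
    return ITEM_STEM.format(course=course) + goal

prompt = exposure_prompt("I learned about lab safety.", course="general")
```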


The Vaughan et al. (2024) qualitative study identified both faculty defined laboratory learning goals and faculty defined affective expectations for the laboratory courses of interest. While affective expectations are an important aspect of laboratory courses, this study is focused on investigating students’ buy-in to faculty defined learning goals. Therefore, the two affective expectations identified (i.e., “Feel excited about/interested in laboratory activities.” and “Gain confidence in the laboratory.”) were not included as exposure prompts. Additionally, the laboratory goal to “Learn about a weekly concept/theory” was excluded because it was deemed by the researchers to be too vague to yield meaningful responses from students.

Data collection

All data included in this study was collected at Portland State University (PSU), an urban, public, four-year, doctoral-granting institution in the Pacific Northwest. As of fall 2024, PSU has a total enrollment of approximately 20,000 students, just under 16,000 of whom are undergraduates. According to institutional data, 40% of PSU students are enrolled part time, 64% of students are from the Portland metro area, and 43% of students are first generation (Portland State University, 2025). Two types of data related to the EPIC-LaG were collected. First, qualitative response process data was collected in the form of interviews and free-response items. This data was used to assess whether participants were interpreting and responding to the newly developed EPIC-LaG items in a manner that aligned with the expectations of the researchers. The second type of data collected in this study was quantitative survey data. Quantitative data was used to assess evidence of structural validity, measurement invariance, and single administration reliability of the data generated by the EPIC-LaG with the populations of interest.

Ethical considerations

This study was conducted under Human Subjects Research (IRB) protocol #158 at Portland State University. All ethical research guidelines were followed, including the collection of informed consent from participants. None of the researchers involved in this project acted as laboratory instructors for the courses described in this study at the time of data collection.
Qualitative data collection. Evidence of response process validity was collected for the EPIC-LaG data through student interviews and free-response survey items. Interview participants included students who were enrolled in a first-term general or organic chemistry laboratory course at Portland State University in the fall of 2020. Interview participants were not offered any incentive for participation. All interviews took place via Zoom at the end of the fall term and beginning of the winter term in the 2020/2021 academic year. Each interview lasted approximately 45 minutes, during which participants were given a copy of the EPIC-LaG and asked to respond to each of the prompts included in the measure. Upon completion of the survey, interview participants were asked to read each of the exposure prompts (Table 2, representing the faculty defined learning goals) aloud, state which response they selected (yes/no), and explain why they selected that response. Follow-up questions were asked as needed in order to better understand participants’ interpretation of each prompt and/or the reasoning behind their selected response. Additionally, interview participants were asked to restate each of the prompts for exposure, persuasion, and identification in their own words, in order to ensure that students were correctly interpreting and responding to these prompts within each item set.

In addition to interviews, response process data was also collected through open-ended survey items. Survey responses were collected in the winter of 2021 from students taking a second-term general or organic chemistry laboratory course at Portland State University. Surveys were distributed via Qualtrics in the last two weeks of the laboratory course. Survey participants were recruited via an in-class video announcement that was pre-recorded by the first author (E. B. V.) and presented by the graduate teaching assistants in each laboratory section. The video was then posted to each laboratory section's learning management site, along with written recruitment materials and a link to the Qualtrics survey. Students were offered a nominal amount of extra credit (<2% of the total points in each course) for accessing the survey. Survey participants were randomly presented three out of the eleven possible item sets and asked to respond to each. After each item set, students were asked to restate the exposure prompt (the faculty defined learning goal) in their own words. For each of the 11 item sets, 4 interview responses and approximately 50 open-ended written responses were collected.

Quantitative data collection. All of the quantitative survey data analyzed in this study were collected from students enrolled in a first-term general or organic chemistry laboratory course at Portland State University (PSU) in the 2021/2022 and 2022/2023 academic years (Table 3). The general chemistry laboratory course taught in the 2021/2022 academic year employed cookbook style laboratory activities, while the general chemistry laboratory course taught in the 2022/2023 academic year employed argument-driven inquiry (ADI) style laboratory activities (Walker et al., 2011). All organic chemistry courses included in this study employed cookbook style laboratory activities. As described for the qualitative data collection, study participants were recruited via an in-class video announcement which directed them to the learning management site for their course, where they would find a link to the Qualtrics survey. With the support of the laboratory coordinators, the survey was treated as a course assignment. While each survey was graded for completion, students were able to choose whether or not their data could be used for research purposes. The EPIC-LaG was distributed to students during the final two weeks of each laboratory course as part of a larger survey that included multiple measures as well as a check item, which asked students to select a specific response option on a 1–100 scale. This item allowed for the removal of responses from students who were not reading the items. Before analyzing any quantitative data, responses were cleaned by removing any duplicate participant responses or responses from participants who incorrectly responded to the check item.
Table 3 Cleaned student response totals for the EPIC-LaG by year. The pedagogical style of each course is denoted as either CB (Cookbook) or ADI (Argument-Driven Inquiry)
Course Student responses, n (response ratea)
a Response rates represent the percentage of complete consenting responses out of the total course enrollment at the beginning of each term.
2021/2022
General Chemistry (CB) 148 (41%)
Organic Chemistry (CB) 93 (47%)
2022/2023
General Chemistry (ADI) 226 (57%)
Organic Chemistry (CB) 108 (68%)
Total 575
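The cleaning steps described above reduce to two filters: drop responses that fail the check item, then keep only one response per participant. A minimal sketch, in which the field names and the check item's expected value (50) are hypothetical:

```python
# Sketch of the response-cleaning steps described above: drop responses
# that failed the check item, then keep only the first response per
# student. Field names and the expected check value are hypothetical.

def clean_responses(rows, expected_check=50):
    seen_ids, cleaned = set(), []
    for row in rows:
        if row["check_item"] != expected_check:
            continue  # participant was likely not reading the items
        if row["student_id"] in seen_ids:
            continue  # duplicate response from the same participant
        seen_ids.add(row["student_id"])
        cleaned.append(row)
    return cleaned

raw = [
    {"student_id": 1, "check_item": 50},
    {"student_id": 1, "check_item": 50},  # duplicate
    {"student_id": 2, "check_item": 7},   # failed check item
    {"student_id": 3, "check_item": 50},
]
cleaned = clean_responses(raw)
```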


Data analysis

Qualitative data

Students’ interview transcripts and open-ended survey responses were analyzed to assess evidence of response process validity for the EPIC-LaG data. This analysis was conducted by the first author (E. B. V.) in collaboration with a secondary coder (A. M. C.). Each member of the analysis team individually read through each of the interview transcripts and written responses. Because interview participants were asked to respond to each item and describe the reason for their response, items were flagged from the interview transcripts if: (1) the participant's reasoning did not match their response (yes/no), (2) the participant expressed confusion or did not seem to understand an item, and/or (3) the participant's explanation indicated that their interpretation of an item was different from its intended meaning. In contrast, free-response participants were asked to respond to each prompt and restate the prompt in their own words. Therefore, items were flagged based on written responses if: (1) the participant expressed confusion or did not seem to understand an item, or (2) the participant's explanation indicated that their interpretation of an item was different from its intended meaning. Once all of the qualitative data had been analyzed by both members of the research team, the two researchers discussed the clarity and relevance of any flagged items in order to come to a consensus.
Quantitative data. Quantitative data collected in this study were investigated for evidence of various sources of validity and reliability including: internal structure validity, single administration reliability, and measurement invariance.
Investigating evidence of internal structure validity. If a student does not feel that they have been exposed to a particular learning goal, then they do not have the opportunity to buy in to that learning goal. However, it is proposed that once a student has been exposed to a laboratory learning goal, they have the opportunity to be persuaded of the value of the learning goal (persuasion), to believe the goal will ultimately provide a benefit to them as a learner (identification), and to commit to continuing to learn about the topic in future courses (commitment) (Cavanagh et al., 2016; Aragón et al., 2017; Wang et al., 2021). When surveying students about their buy-in, the prompts are not sequentially presented. Instead, once the exposure prompt is endorsed, a student is prompted with all three remaining statements and can endorse any or none of them. Therefore, the theoretical structure of the EPIC data proposed by Wang et al. (2021) includes the hypothesized path (i.e., persuasion to identification to commitment) as well as a direct path from persuasion to commitment (Fig. 1). This theoretical structure for buy-in data collected with the EPIC has previously been tested for a version of the measure intended to investigate college students’ buy-in to evidence-based teaching practices in STEM, and evidence of good data-model fit was found (Wang et al., 2021).
Fig. 1 Structural model relating persuasion, identification, and commitment. Individual indicator items for the latent variables are omitted for clarity. Students must respond positively to an exposure item in order to be presented with the other items (persuasion, identification, and commitment). Therefore, exposure (represented to the left of the dashed line) is a precursor to buy-in but is not included in the model (right of the dashed line) itself.

Evidence in support of the structural validity of the data collected with the EPIC-LaG was explored via confirmatory factor analysis (CFA) and structural equation modeling (SEM). All analyses were completed using the statistical program R (version 4.2.0 (2022-04-22)) with the package lavaan (version 0.6-11). The structure of each of the four individual constructs (exposure, persuasion, identification, commitment) was investigated via CFA. The mediation model (Fig. 1) was investigated via SEM. Because the EPIC employs a dichotomous (yes/no) response type, a robust diagonally weighted least squares estimator (WLSMV) was employed for all analyses (Finney and DiStefano, 2013). Additionally, fit statistics were calculated and interpreted to evaluate goodness of data-model fit. Recommended cut-off values for use with the WLSMV estimator are CFI ≥ 0.95, TLI ≥ 0.95, and RMSEA ≤ 0.05 (Yu, 2002; Beauducel and Herzberg, 2006; DiStefano and Morgan, 2014; Rocabado et al., 2020). The commonly reported SRMR fit index is not recommended for use with the WLSMV estimator, and therefore is not included in this study (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).
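The cut-off rule above reduces to a simple three-part check. A sketch follows; note that the fit indices themselves come from the R/lavaan output in the actual study, and the helper name is an assumption:

```python
# Sketch of the data-model fit decision rule described above for the
# WLSMV estimator: CFI >= 0.95, TLI >= 0.95, and RMSEA <= 0.05. In the
# study the index values come from R/lavaan; this helper only encodes
# the published cutoffs. The function name is hypothetical.

def acceptable_fit(cfi, tli, rmsea):
    return cfi >= 0.95 and tli >= 0.95 and rmsea <= 0.05

ok = acceptable_fit(cfi=0.98, tli=0.97, rmsea=0.03)
bad = acceptable_fit(cfi=0.98, tli=0.97, rmsea=0.08)  # RMSEA too high
```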

Investigating evidence of single administration reliability. Kuder–Richardson reliability (KR-20) was used to investigate the single-administration reliability of each unidimensional construct included in the EPIC-LaG. The KR-20 was selected due to the dichotomous nature of the EPIC-LaG items (i.e., the yes/no response type). Values for KR-20 range from 0 to 1, where a value of 1 indicates that all of the observed variance can be attributed to the latent construct. Therefore, larger KR-20 values provide evidence in support of the internal consistency of the items (Kuder and Richardson, 1937; Cortina, 1993).
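Concretely, for k dichotomous items, KR-20 = k/(k − 1) · (1 − Σpᵢqᵢ/σ²), where pᵢ is the proportion of ‘yes’ responses to item i, qᵢ = 1 − pᵢ, and σ² is the variance of participants’ total scores. A minimal sketch (not the authors’ analysis code, which was run in R; population variance is used here, and some implementations use the sample variance instead):

```python
# Minimal sketch of the KR-20 formula for dichotomous (0/1) items.
# responses: one list of 0/1 item scores per participant.

def kr20(responses):
    n, k = len(responses), len(responses[0])
    # p_i: proportion of 'yes' (1) responses to item i; q_i = 1 - p_i
    p = [sum(person[i] for person in responses) / n for i in range(k)]
    pq_sum = sum(pi * (1 - pi) for pi in p)
    # variance of total scores (population variance; conventions vary)
    totals = [sum(person) for person in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq_sum / var_t)

# Four participants, three items, in a highly consistent (Guttman) pattern
example = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
reliability = kr20(example)
```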
Investigating evidence of measurement invariance. Before the EPIC-LaG can be used to make comparisons between the scores of different respondent groups, measurement invariance must be established to evaluate the similarities in the internal structure of data collected from multiple groups. In this study, evidence of measurement invariance was assessed for the mediation model (Fig. 1) comparing course level (general chemistry vs. organic chemistry) and course structure (cookbook vs. ADI). Establishing measurement invariance requires investigating a sequence of increasingly constrained steps, with sufficient evidence of data-model fit being established at each step (Rocabado et al., 2020). The first step of measurement invariance is configural invariance, which requires simultaneously testing the unconstrained factor model for each group. If evidence of configural invariance is found, then metric invariance is assessed by constraining the factor loadings across groups. Next, scalar invariance is evaluated by constraining both the factor loadings and the item intercepts across groups. Finally, strict (conservative) invariance is evaluated by constraining factor loadings, item intercepts, and error variances across groups. Each step of measurement invariance is evaluated using the recommended cutoffs for changes in data-model fit between subsequent levels: ΔCFI ≤ 0.010 and ΔRMSEA ≤ 0.015 (Chen, 2007). It should also be noted that because data collected with the EPIC-LaG employ a dichotomous response type, the scaling of the underlying latent normal distribution for each set of item responses must be decided before analysis can take place (Rocabado et al., 2020). In this case, theta scaling was employed, setting the variance of the residual term to 1. Literature suggests that theta scaling is appropriate for invariance research (Millsap and Yun-Tein, 2004).
Typically, the final step of measurement invariance (strict/conservative) requires residuals (i.e., error variances) to be constrained to be equal across groups. Because the selection of theta scaling for dichotomous response types sets the variance of the residual term to 1, the final step of measurement invariance (i.e., constraining residuals across groups) is not necessary (Millsap and Yun-Tein, 2004; Finney and DiStefano, 2013; Rocabado et al., 2020).
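The step-to-step comparison described above reduces to checking Chen's (2007) change-in-fit cutoffs between a less constrained and a more constrained model. A sketch (the helper name and the example index values are hypothetical):

```python
# Sketch of the invariance decision rule described above: when moving to
# a more constrained step (e.g., configural -> metric), invariance is
# retained if CFI drops by no more than 0.010 and RMSEA rises by no more
# than 0.015 (Chen, 2007). Helper name and values are hypothetical.

def step_invariant(cfi_prev, rmsea_prev, cfi_next, rmsea_next):
    delta_cfi = cfi_prev - cfi_next        # drop in CFI under added constraints
    delta_rmsea = rmsea_next - rmsea_prev  # rise in RMSEA under added constraints
    return delta_cfi <= 0.010 and delta_rmsea <= 0.015

metric_ok = step_invariant(cfi_prev=0.970, rmsea_prev=0.040,
                           cfi_next=0.965, rmsea_next=0.048)
scalar_ok = step_invariant(cfi_prev=0.965, rmsea_prev=0.048,
                           cfi_next=0.940, rmsea_next=0.070)  # exceeds cutoffs
```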

Results and discussion

Evidence in support of response process validity

Before any quantitative EPIC-LaG data was analyzed, qualitative response process data was investigated to ensure that each of the items was functioning properly. Analysis of interview and free-response data revealed that participants were interpreting and responding to the items as intended. As such, no modifications were made to the survey items. Examples of student response data that provide evidence in support of response process validity for data collected with the EPIC-LaG can be found in Table 4. Each of the EPIC prompts for exposure (Table 2) seemed to function well for both general chemistry and organic chemistry students, regardless of the course for which the learning goal was originally identified (Vaughan et al., 2024). For example, when general chemistry students were asked to restate the item “…I learned how to collect and accurately record data.” in their own words, one student responded, “…I learned how to accurately record data that I had observed/gathered during the procedures of each experiment.” When organic chemistry students were given the same prompt, one student responded, “I was able to learn and practice the accurate collection of data as well as how to organize this data.” Because this item contains more than one idea (i.e., collecting data and recording data), response process data were used to ensure that students were understanding and interpreting both parts of the item.
Table 4 General and organic chemistry student responses which provide evidence of response process validity for data collected with the EPIC-LaG
Survey item Example student response from general chemistry Example student response from organic chemistry
Item stem: In my [general/organic] chemistry laboratory course this term… Prompt: please restate the item in your own words.
I learned how to collect and accurately record data. “…I learned how to accurately record data that I had observed/gathered during the procedures of each experiment.” “I was able to learn and practice the accurate collection of data as well as how to organize this data.”
I learned how to describe the properties of substances qualitatively. “…I learned how to explain the different characteristics and features of substances.” “I learned how to describe substances based on how they look and react.”
I learned how to describe the properties of substances quantitatively. “I learned how to describe properties of substances using numbers and formulas.” “I learned various methods of weighing and measuring substances.”
I learned about error analysis. “I learned how to calculate the error of an experiment's results to see how far off from the theoretical it was.” “Having to include a portion about possible sources of error in all my lab reports, I was able to see a pattern in where much of the error occurs in an experiment thus strengthening my knowledge on error analysis.”
I learned about lab safety. “…I learned the safe way of handling chemicals, correct dress code, and the safe way to complete the lab.” “Over the course of the lab for o-chem, lab safety was stressed and thoroughly taught.”
I learned about written communication skills. “…I learned how to write formal reports and take formal lab notes.” “From writing out procedures, data analyses, and conclusions, I believe I strengthened my scientific writing skills in a way that clearly and effectively communicates to the reader the steps I took and the outcome that came out of it.”
I learned about hands on laboratory skills. “I learned how to use laboratory equipment and related conduct in a lab.” “During my organic chemistry laboratory course this semester, I learned about different physical skills useful in the laboratory setting”
I applied chemistry concepts that were learned in the lecture. “This term, in my general chemistry laboratory course, I utilized chemical ideas gained in lectures.” “I would learn a topic in lecture and be able to apply my knowledge of it to the experiment in lab and in the written lab report.”
I learned how to behave professionally in a laboratory setting. “I learned how to treat lab equipment appropriately, focused on going through procedures with communication towards my lab partner, and made sure to abide by laboratory safety rules through the whole session.” “In the organic chemistry lab, we learned how to behave safely and professionally in a lab setting to prevent any hazards.”
I learned about how chemistry concepts apply in real life. “Throughout chemistry lab this term, there was a connection between real life and why it is needed in the real life world.” “Concepts of organic chemistry that showcase how substances such as pharmaceuticals and drugs function with the body, for example.”
I learned about the differences between theoretical concepts and reality. “I learned that just because the reaction looks right on paper, doesn't mean you know how the actual reaction will occur.” “I learned how real world examples and experiments can differ from what we believe is happening in theory and reasons why this is the case.”


Analysis of quantitative data

The initial quantitative data analysis conducted in this study utilized an aggregate data set that included students enrolled in both general chemistry and organic chemistry courses. Table 5 provides the totals and proportions (percentages) of students from the aggregate data set who responded ‘yes’ to each of the EPIC-LaG prompts. For example, 473 students (82% of the aggregated data set) reported that they had learned how to collect and accurately record data. Of those 473 students, 442, 283, and 274 selected ‘yes’ for the persuasion, identification, and commitment prompts, respectively. Students who did not respond ‘yes’ to the exposure item were not presented with the additional prompts.
Table 5 Student reported buy-in to laboratory learning goals as represented by EPIC-LaG responses for the aggregate data set (n = 575)
Learning goal   Students reporting exposure who also endorsed:
Item stem: in my [general/organic] chemistry laboratory course this term… Exposure Persuasion Identification Commitment
I learned how to collect and accurately record data. 473 (82%) 442 (93%) 283 (60%) 274 (58%)
I learned how to describe the properties of substances qualitatively. 429 (75%) 376 (88%) 251 (59%) 243 (57%)
I learned how to describe the properties of substances quantitatively. 438 (76%) 371 (85%) 252 (58%) 233 (53%)
I learned about error analysis. 445 (77%) 380 (85%) 229 (51%) 242 (54%)
I learned about lab safety. 454 (79%) 428 (94%) 246 (54%) 204 (45%)
I learned about written communication skills. 378 (66%) 292 (77%) 280 (74%) 255 (67%)
I learned about hands on laboratory skills. 426 (74%) 349 (82%) 264 (62%) 243 (57%)
I applied chemistry concepts that were learned in the lecture. 479 (83%) 461 (96%) 255 (53%) 221 (46%)
I learned how to behave professionally in a laboratory setting. 428 (74%) 384 (90%) 204 (48%) 230 (54%)
I learned about how chemistry concepts apply in real life. 468 (81%) 413 (88%) 350 (75%) 306 (65%)
I learned about the differences between theoretical concepts and reality. 444 (77%) 374 (84%) 285 (66%) 242 (55%)
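Note that the percentages in Table 5 use two different denominators: exposure is reported against the full sample, while persuasion, identification, and commitment are reported against the exposed subgroup. A quick arithmetic check using the counts from the first row of Table 5:

```python
n_total = 575      # aggregate data set
exposure = 473     # responded 'yes' to the exposure prompt
persuasion = 442   # of the exposed students, responded 'yes' to persuasion

# Exposure percentage is relative to the full sample...
exposure_pct = round(100 * exposure / n_total)
# ...while the remaining prompts are relative to students reporting exposure.
persuasion_pct = round(100 * persuasion / exposure)
print(exposure_pct, persuasion_pct)  # 82 93
```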


Evidence in support of internal structure validity and single administration reliability

CFA and SEM were used to assess evidence in support of structural validity for data collected with the EPIC-LaG. First, the structure of each of the individual constructs (exposure, persuasion, identification, and commitment) was assessed via CFA using the aggregated data set (i.e., responses from all participants regardless of course level or pedagogical style). Evidence of good data-model fit was found for each of the unidimensional constructs (Table 6). Before analyzing the full mediation model (Fig. 1), evidence of single administration reliability was also assessed for each of the unidimensional constructs. When KR-20 was calculated for each factor, values of 0.76 or greater were found for each of the unidimensional constructs, providing evidence to support the internal consistency of the items (Table 6).
Table 6 Data-model fit statistics for each of the unidimensional constructs included in the EPIC-LaG (n = 575)
Factor χ2 (df) p-value CFI TLI RMSEA [90% CI] KR-20
Bold values indicate the results met the suggested cutoff criteria for good fit (CFI ≥ 0.95, TLI ≥ 0.95 and RMSEA ≤ 0.05) (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).
Exposure 114.092 (44) <0.001 0.972 0.964 0.053 [0.041–0.065] 0.76
Persuasion 61.052 (44) <0.001 0.996 0.995 0.026 [0.004–0.041] 0.83
Identification 83.961 (44) <0.001 0.996 0.995 0.040 [0.027–0.053] 0.88
Commitment 58.291 (44) 0.073 0.999 0.999 0.024 [0.000–0.039] 0.90
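The KR-20 values reported in Table 6 can be computed directly from a matrix of dichotomous (0/1) item responses. A minimal sketch, assuming responses are coded 0/1 and using the population-variance convention so the item variances (p·q) and the total-score variance are on the same scale (the function name is illustrative):

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item responses.

    responses: array-like of shape (n_respondents, k_items).
    """
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]
    p = X.mean(axis=0)                      # proportion endorsing each item
    item_var_sum = (p * (1.0 - p)).sum()    # sum of item variances, p*q
    total_var = X.sum(axis=1).var(ddof=0)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)
```

When items covary perfectly the statistic reaches 1.0, and it falls toward 0 as item responses become unrelated, which is why the 0.76–0.90 range in Table 6 is read as evidence of internal consistency.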


Once appropriate evidence of data-model fit was found for each of the unidimensional constructs, the relations among persuasion, identification, and commitment were explored via structural equation modeling (Fig. 2). When analyzing the aggregated data set, evidence of adequate-to-good data-model fit was found for the full mediation model (Table 7). This finding aligns with the data structure seen in previously reported uses of the EPIC, which assessed students’ buy-in to evidence-based teaching practices (Wang et al., 2021) and active learning (Cavanagh et al., 2016). Persuasion was found to be positively related to identification (coefficient = 0.538) and commitment (coefficient = 0.296). This finding suggests that students who are persuaded that a particular learning goal is important for them to learn in their laboratory course are also more likely to identify that they liked learning about that learning goal and to express commitment to learning more about it in the future. Additionally, identification was found to be positively related to commitment (coefficient = 0.557), suggesting that students who identify that they liked learning about a particular learning goal are more likely to express commitment to learning more about it in the future.

In addition to the direct effects defined in the mediation model, the indirect effect from persuasion to commitment through identification was investigated and found to be significant for the aggregated data set. The standardized estimate of the statistically significant indirect effect was 0.296, indicating that even when persuaded, students are less likely to commit to a learning goal unless they also identify with that goal. A similar result was also found when evaluating this path in a prior study (Wang et al., 2021).
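In a mediation SEM of this kind, the standardized indirect effect is the product of the coefficients along the mediated route (persuasion → identification → commitment). Multiplying the rounded Fig. 2 values recovers the reported estimate up to rounding:

```python
# Standardized path coefficients displayed in Fig. 2 (aggregated data set)
persuasion_to_identification = 0.538
identification_to_commitment = 0.557

# Indirect effect of persuasion on commitment through identification
indirect = persuasion_to_identification * identification_to_commitment
print(round(indirect, 2))  # ~0.30; the reported 0.296 was estimated before
                           # the displayed coefficients were rounded
```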


Fig. 2 Results of structural analysis with the aggregated data set. Standardized regression coefficients are included for all significant paths (p < 0.05).
Table 7 Data-model fit statistics for the EPIC-LaG structural model (n = 575)
χ2 (df) p-value CFI TLI RMSEA [90% CI]
Bold values indicate the results met the suggested cutoff criteria for good fit (CFI ≥ 0.95, TLI ≥ 0.95 and RMSEA ≤ 0.05) (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).
1256.606 (492) <0.001 0.982 0.981 0.048 [0.045–0.052]


Evidence of measurement invariance. Before the EPIC-LaG can be used to investigate comparisons between student groups, evidence supporting the internal structure of the data for each group must be assessed through measurement invariance testing. This study focused on investigating measurement invariance of the model (Fig. 2) by course level (general chemistry vs. organic chemistry) and pedagogical style (cookbook vs. ADI). Evidence of scalar invariance was found for both comparisons, demonstrating the generalizability of the mediation model across groups and supporting comparisons of latent means between groups (Thompson and Green, 2013; Lee, 2018). Data-model fit statistics and their differences among each level of measurement invariance testing can be found in Table 8 (course level comparisons) and Table 9 (pedagogical style comparisons).
Table 8 Data-model fit statistics for measurement invariance by course for the EPIC-LaG
a Bold values indicate goodness-of-fit values that met the suggested cutoff criteria for good fit (CFI ≥ 0.95, RMSEA ≤ 0.05) (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).b Bold values indicate data-model fit changes between subsequent levels that met the suggested cutoff criteria (ΔCFI ≤ 0.010, ΔRMSEA ≤ 0.015) (Chen, 2007).
Baseline data-model fit by groupa
Group χ2 (df) p-value CFI RMSEA
General chemistry (n = 374) 928.766 (492) <0.001 0.949 0.049
Organic chemistry (n = 201) 722.977 (492) <0.001 0.951 0.048

Invariance data-model fit by levelb
Testing level χ2 (df) p-value CFI RMSEA ΔCFI ΔRMSEA
Configural 1826.212 (984) <0.001 0.981 0.048
Metric 1986.451 (1014) <0.001 0.978 0.046 −0.003 −0.002
Scalar 1856.397 (1044) <0.001 0.981 0.047 0.003 0.001


Table 9 Data-model fit statistics for measurement invariance by pedagogical style for the EPIC-LaG
a Bold values indicate goodness-of-fit values that met the suggested cutoff criteria for good fit (CFI ≥ 0.95, RMSEA ≤ 0.05) (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).b Bold values indicate data-model fit changes between subsequent levels that met the suggested cutoff criteria (ΔCFI ≤ 0.010, ΔRMSEA ≤ 0.015) (Chen, 2007).
Baseline data-model fit by groupa
Group χ2 (df) p-value CFI RMSEA
Cookbook style general chemistry (n = 148) 579.295 (492) <0.001 0.973 0.035
ADI style general chemistry (n = 226) 671.007 (492) <0.001 0.965 0.040

Invariance data-model fit by levelb
Testing level χ2 (df) p-value CFI RMSEA ΔCFI ΔRMSEA
Configural 1125.845 (984) 0.001 0.995 0.028
Metric 1141.322 (1014) 0.003 0.996 0.026 0.001 −0.002
Scalar 1128.591 (1044) 0.006 0.996 0.025 0.000 −0.001


Group comparisons. Potential differences in buy-in pathways were explored between general chemistry students and organic chemistry students by analyzing disaggregated data using the mediation model (Fig. 3). Data-model fit statistics for each of the individual group analyses can be found in the Appendix (Table 10). Similar to the aggregated analysis, the pathways between persuasion and identification, and identification and commitment, were found to be much stronger for each group than the pathway between persuasion and commitment. One major difference between the general chemistry and organic chemistry results was found (Fig. 3). For general chemistry students, the direct path between persuasion and commitment was significant, but for organic chemistry students this pathway was not significant.
Fig. 3 Results of structural analysis for general chemistry (top values, n = 374) and organic chemistry (bottom values, n = 201). Standardized regression coefficients are included for all significant paths (p < 0.05). NS = non-significant path.

These results may be related to students’ level of interest in chemistry. Within educational psychology, interest is defined as a “psychological state of engaging or the predisposition to reengage with particular classes of objects, events, or ideas over time” (Hidi and Renninger, 2006). The persuasion prompt (I believe that this was an important thing to learn in my laboratory course.) and the identification prompt (I liked learning about this in my laboratory course.), focus on the current laboratory course that a student is enrolled in, and therefore may be related to students’ triggered situational interest, which “emerges spontaneously in response to features of an environment” (Hidi and Renninger, 2006; Flaherty, 2020). In this way, triggered situational interest may, in part, explain the significant relation between persuasion and identification. The commitment prompt (I want to learn more about this in the future.), on the other hand, focuses on a future learning experience, and therefore may be related to a ‘more developed’ phase of interest, such as maintained situational interest or individual interest (Hidi and Renninger, 2006). Identification may have a stronger relation to commitment than persuasion due to the particularly affective nature of the identification prompt (i.e., I liked…). This is because interest development is strongly related to “attention and positive feelings … in terms of affect or liking” (Hidi and Renninger, 2006).

The difference in the significance of the relation between persuasion and commitment between general and organic chemistry students may also be related to students’ interest levels. General chemistry courses tend to serve a broad population of majors, who likely have widely varying levels of interest in chemistry content. Organic chemistry courses, on the other hand, primarily serve students who have decided to major in the chemical, biological, and/or health sciences, and who therefore likely have a stronger (or more developed) interest in chemistry as a whole. This reasoning is supported by a 2015 study of general chemistry students, which found that chemistry majors reported higher levels of interest than non-science majors (Ferrell and Barbera, 2015). Additional support for this reasoning is provided by the higher path values from persuasion to identification and from identification to commitment for organic chemistry students found in the current study. Therefore, future studies may be interested in explicitly exploring group differences in buy-in by major.

In this case, it is possible that the difference in significance observed in the path between persuasion and commitment for general and organic chemistry students occurs because the belief that a particular concept is important to learn (persuasion) relates more strongly to a desire to learn more about that topic in the future (commitment) for students who may have a less developed interest in that content (in this case, general chemistry students). While this is one possible explanation for the observed difference, further qualitative studies are needed to better understand the reasons for general and organic chemistry students’ buy-in levels.

Potential differences in buy-in pathways were also explored between general chemistry students who participated in a cookbook style laboratory course and general chemistry students who participated in an ADI style laboratory course. Data-model fit statistics for each of the individual group analyses can be found in the Appendix (Table 10). As previously seen in the comparison between general and organic chemistry students, for both cookbook and ADI style general chemistry students, the pathways between persuasion and identification, and identification and commitment, were found to be much stronger than the pathway between persuasion and commitment (Fig. 4). One major difference was identified between the cookbook style general chemistry responses and the ADI style general chemistry responses. For general chemistry students who experienced cookbook-style labs, the direct path between persuasion and commitment was not significant, while for the ADI style general chemistry students this pathway was significant.


Fig. 4 Results of structural analysis for cookbook style general chemistry (top values, n = 148) and ADI style general chemistry (bottom values, n = 226). Standardized regression coefficients are included for all significant paths (p < 0.05). NS = non-significant path.

As described for the difference in buy-in between students enrolled in general and organic chemistry laboratory courses, this result may be related to interest. An important component of interest development is attention (Hidi and Renninger, 2006). ADI style laboratory courses are intentionally designed to “provide students with an opportunity to develop their own method to generate data, to carry out investigations, use data to answer research questions, write, and be more reflective as they work” (Walker et al., 2011). In this case, the structure of an ADI style laboratory course may encourage students to be more attentive to chemistry content than a cookbook style course. Because of this increased attention, the belief that a particular concept is important to learn (persuasion) may relate more strongly to a desire to learn more about that topic in the future (commitment) and therefore be less dependent on one's identification with the topic.

The identification of a non-significant direct path between persuasion and commitment for one of the two groups in each model (Fig. 3 and 4) is a finding that has not been previously reported for other applications of the EPIC in STEM courses (Cavanagh et al., 2016; Wang et al., 2021) and therefore requires further investigation to fully understand. However, this novel result reveals that, when measuring buy-in to laboratory learning goals with the EPIC-LaG, the direct relation between students’ levels of persuasion and commitment may not be as salient as the pathway from persuasion to commitment that includes identification. Because of this, laboratory coordinators and instructors who are interested in increasing student buy-in to learning goals may want to pay particular attention to the identification component and explore avenues to increase students’ appreciation of, or improve their affective feelings related to, laboratory learning goals.

Conclusions

The aim of this study was to adapt the EPIC in order to investigate students’ buy-in to faculty defined learning goals in lower division chemistry laboratory courses at Portland State University. In addition to developing new items to assess buy-in in the environments of interest, this aim included investigating psychometric evidence for the data collected with the newly adapted instrument.

This study produced the Exposure, Persuasion, Identification, and Commitment to Laboratory Goals (EPIC-LaG): a measure of student buy-in to laboratory learning goals that includes 11 item sets. Qualitative response process data provided evidence that each of the items developed in this study functioned well for both general chemistry and organic chemistry students. Quantitative data provided additional evidence of validity and reliability for data collected with the EPIC-LaG. Evidence in support of structural validity was found for each of the individual constructs (via CFA), as well as for the mediation model relating the constructs (via SEM). Evidence in support of single administration reliability, calculated through KR-20, was found for each of the individual constructs included in the measure (exposure, persuasion, identification, and commitment). Finally, evidence of scalar invariance was found for each of the group comparisons in this study (general chemistry vs. organic chemistry and cookbook vs. ADI), demonstrating the generalizability of the mediation model across groups. These invariance analyses provide evidence in support of consequential validity for the comparison of latent means by group in future studies using the EPIC-LaG. Collectively, the validity and reliability evidence presented in this study provides support for the reporting and analysis of EPIC-LaG scores in the environments of interest.

When potential differences in buy-in pathways were explored between groups (e.g., general chemistry vs. organic chemistry and cookbook vs. ADI), a non-significant direct path between persuasion and commitment was detected for one of the two groups in each model. While the EPIC has not previously been used to explore group differences in a discipline-based education research context, the findings reported in this study suggest that the measure is capable of detecting differences in the relations among students’ persuasion, identification, and commitment to laboratory learning goals. In order to better understand the circumstances leading to these differences, future studies using the EPIC-LaG would benefit from the collection of qualitative data related to students’ perceptions of the relationship between persuasion and commitment.

While the EPIC-LaG items described in this study are specific to general and organic chemistry laboratory courses at Portland State University, this study provides a framework for researchers and practitioners who are interested in investigating students’ buy-in to laboratory learning goals at other institutions. Just as the previous version of the EPIC was adapted to develop the EPIC-LaG in this study, interested researchers and practitioners at other institutions may choose to modify the exposure items included in the EPIC-LaG to align with the learning goals defined for their courses.

Limitations

The buy-in measure developed in this study was intended to investigate students’ buy-in to laboratory learning goals at a single institution of interest. Therefore, the results of this study may not be broadly generalizable to other institutions. Additionally, it should be noted that the qualitative response process data included in this study was collected from students enrolled in virtual laboratory courses (during the Covid-19 shutdown). While this delivery method is atypical for the laboratory courses of interest to this study, the qualitative data collection was focused on students’ interpretation and understanding of survey items related to buy-in and faculty defined learning goals, not their reported experiences. Therefore, this data collection still provided useful insight for this study, despite the atypical course delivery method.

Implications for research

This study addresses a gap in chemistry education literature by developing a measure to assess students’ buy-in to faculty defined learning goals in lower division undergraduate chemistry laboratory courses. Researchers who are interested in using the EPIC-LaG at new institutions are encouraged to adapt the exposure prompts to align with the laboratory learning goals that have been defined for courses they are interested in studying.

As is, the EPIC-LaG items described in this study will be used in future studies conducted at Portland State University to investigate the relations among buy-in (as measured by the EPIC-LaG) and other variables. A previous study conducted by Cavanagh and colleagues found that “student buy-in to active learning [as measured by one version of the EPIC] was positively associated with engagement in self-regulated learning and students’ course performance” (Cavanagh et al., 2016). Additionally, another version of the EPIC published by Wang and colleagues has been used to investigate a proposed framework relating students’ trust in instructors, growth mindset, buy-in toward evidence based learning practices, student engagement, and desired student outcomes (persistence and course grades) (Wang et al., 2021). These studies suggest that meaningful relations likely exist between buy-in (as measured by the EPIC-LaG), students’ laboratory perspectives, and engagement. Future studies from the authors of this manuscript (E. B. V. and J. B.) will investigate the relations among student expectations, buy-in, and engagement in the general and organic chemistry laboratory courses at Portland State University.

The study reported here provides a starting point for future work by the authors of this manuscript by adapting the EPIC to a new environment and investigating previously reported relations among persuasion, identification, and commitment (Wang et al., 2021). While the structural model published by Wang and colleagues has been used to inform this study, we acknowledge that there is not a single consensus on the theoretical structure of buy-in. Therefore, future studies investigating the relations among student expectations, buy-in, and engagement will explore multiple possible models of the relations among constructs.

Implications for practice

In a synthesis of 800 meta-analyses (that included more than 50 000 studies) focused on the topics of teaching and learning, educational researcher John Hattie found that faculty “having clear intentions and success criteria (i.e., learning goals)” for a course is one of the key strategies for improving student achievement in higher education regardless of discipline (Hattie, 2011). For practitioners who are interested in adapting the EPIC-LaG to their own unique laboratory environments, the process of adapting the instrument may facilitate meaningful reflection on their existing laboratory learning goals. While adapting the EPIC-LaG is not the only method by which reflection can occur, the structure of the instrument encourages practitioners to specifically consider a variety of important components, for example: Do their students recognize the learning goals for their chemistry laboratory courses? How are their students interpreting the laboratory learning goals? Do their students believe they are actually learning the topics addressed in the learning goals? This investigation could provide practitioners with meaningful insight related to their students’ perceptions and learning experiences.

While adapting the EPIC-LaG to investigate student buy-in in new laboratory environments could provide substantial insights to practitioners, those who are interested in adapting the instrument should ensure that their unique version of the EPIC-LaG produces valid and reliable data before interpreting any results. We acknowledge that not every practitioner has the time or interest required to adapt the EPIC-LaG, administer the instrument, and/or analyze the generated data. Therefore, the environmental specificity of the EPIC-LaG highlights an opportunity for collaboration between discipline-based education researchers and practitioners who are interested in examining their laboratory learning goals. If the EPIC-LaG is appropriately adapted for new laboratory environments, data collected with the measure may be used by teams of practitioners and researchers to make informed adjustments to laboratory activities, which may increase the likelihood of students’ buying-in to the learning goals for the laboratory course.

Conflicts of interest

There are no conflicts to declare.

Data availability

The data are not publicly available as approval for this study did not include permission for sharing data publicly.

Appendix

Table 10 Data-model fit statistics for each of the EPIC-LaG structural models
Group χ2 (df) p-value CFI TLI RMSEA [90% CI]
Bold values indicate the results met the suggested cutoff criteria for good fit (CFI ≥ 0.95, TLI ≥ 0.95 and RMSEA ≤ 0.05) (Yu, 2002; Beauducel and Herzberg, 2006; Finney and DiStefano, 2013; Rocabado et al., 2020).
Aggregated general chemistry (n = 374) 928.766 (492) <0.001 0.949 0.946 0.049 [0.044–0.054]
Aggregated organic chemistry (n = 201) 722.977 (492) <0.001 0.951 0.947 0.048 [0.041–0.056]
Cookbook style general chemistry (n = 148) 579.295 (492) <0.001 0.973 0.971 0.035 [0.021–0.046]
ADI style general chemistry (n = 226) 671.007 (492) <0.001 0.965 0.963 0.040 [0.032–0.048]


References

  1. Aragón O. R., Dovidio J. F. and Graham M. J., (2017), Colorblind and multicultural ideologies are associated with faculty adoption of inclusive teaching practices, J. Diversity Higher Educ., 10(3), 201–215 DOI:10.1037/dhe0000026.
  2. Arjoon J. A., Xu X. and Lewis J. E., (2013), Understanding the state of the art for measurement in chemistry education research: Examining the psychometric evidence, J. Chem. Educ., 90(5), 536–545 DOI:10.1021/ed3002013.
  3. Beauducel A. and Herzberg P. Y., (2006), On the Performance of Maximum Likelihood Versus Means and Variance Adjusted Weighted Least Squares Estimation in CFA, Struct. Equation Model.: A Multidiscip. J., 13(2), 186–203 DOI:10.1207/s15328007sem1302_2.
  4. Bowers W. G., (1924), The advantages of laboratory work in the study of elementary chemistry, J. Chem. Educ., 1(11), 205 DOI:10.1021/ed001p205.
  5. Brazeal K. R., Brown T. L. and Couch B. A., (2016), Characterizing Student Perceptions of and Buy-In toward Common Formative Assessment Techniques, CBE Life Sci. Educ., 15(4), ar73 DOI:10.1187/cbe.16-03-0133.
  6. Bretz S. L., (2019), Evidence for the Importance of Laboratory Courses, J. Chem. Educ., 96(2), 193–195 DOI:10.1021/acs.jchemed.8b00874.
  7. Bretz S. L., Fay M., Bruck L. B. and Towns M. H., (2013), What Faculty Interviews Reveal about Meaningful Learning in the Undergraduate Chemistry Laboratory, J. Chem. Educ., 90(3), 281–288 DOI:10.1021/ed300384r.
  8. Bruck A. D. and Towns M., (2013), Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory, J. Chem. Educ., 90(6), 685–693 DOI:10.1021/ed300371n.
  9. Bruck L. B., Towns M. and Bretz S. L., (2010), Faculty Perspectives of Undergraduate Chemistry Laboratory: Goals and Obstacles to Success, J. Chem. Educ., 87(12), 1416–1424 DOI:10.1021/ed900002d.
  10. Cavanagh A. J., Aragón O. R., Chen X., Couch B. A., Durham M. F. and Bobrownicki A., et al., (2016), Student Buy-In to Active Learning in a College Science Course, CBE—Life Sci. Educ., 15(4), ar76 DOI:10.1187/cbe.16-07-0212.
  11. Chen F. F., (2007), Sensitivity of Goodness of Fit Indexes to Lack of Measurement Invariance, Struct. Equation Model.: A Multidiscip. J., 14(3), 464–504 DOI:10.1080/10705510701301834.
  12. Collins D., (2003), Pretesting survey instruments: An overview of cognitive methods, Qual. Life Res., 12(3), 229–238 DOI:10.1023/A:1023254226592.
  13. Cortina J. M., (1993), What is coefficient alpha? An examination of theory and applications, J. Appl. Psychol., 78(1), 98–104 DOI:10.1037/0021-9010.78.1.98.
  14. Deng J. M., Streja N. and Flynn A. B., (2021), Response Process Validity Evidence in Chemistry Education Research, J. Chem. Educ., 98(12), 3656–3666 DOI:10.1021/acs.jchemed.1c00749.
  15. DiStefano C. and Morgan G. B., (2014), A Comparison of Diagonal Weighted Least Squares Robust Estimation Techniques for Ordinal Data, Struct. Equation Model.: A Multidiscip. J., 21(3), 425–438 DOI:10.1080/10705511.2014.915373.
  16. Donis K., Aikens M. L., Swamy U., Delgado M., Gillespie M. and Graves P., et al., (2024), Learning Assistants and Instructors Provide Social Support That Influences Student Engagement Differently in Undergraduate Active Learning Chemistry Courses, J. Chem. Educ., 101(8), 2989–3002 DOI:10.1021/acs.jchemed.3c01137.
  17. Ferrell B. and Barbera J., (2015), Analysis of students’ self-efficacy, interest, and effort beliefs in general chemistry, Chem. Educ. Res. Pract., 16(2), 318–337 DOI:10.1039/C4RP00152D.
  18. Finney S. J. and DiStefano C., (2013), Nonnormal and categorical data in structural equation modeling, in Structural equation modeling: A second course, 2nd ed., IAP Information Age Publishing, pp. 439–492.
  19. Flaherty A. A., (2020), A review of affective chemistry education research and its implications for future research, Chem. Educ. Res. Pract., 21(3), 698–713 DOI:10.1039/C9RP00200F.
  20. Freeman S., O’Connor E., Parks J. W., Cunningham M., Hurley D., Haak D. and Dirks C., et al., (2007), Prescribed Active Learning Increases Performance in Introductory Biology, CBE—Life Sci. Educ., 6(2), 132–139 DOI:10.1187/cbe.06-09-0194.
  21. Grushow A., Hunnicutt S., Muñiz M., Reisner B. A., Schaertel S. and Whitnell R., (2021), Journal of Chemical Education Call for Papers: Special Issue on New Visions for Teaching Chemistry Laboratory, J. Chem. Educ., 98(11), 3409–3411 DOI:10.1021/acs.jchemed.1c01000.
  22. Hattie J., (2011), Which strategies best enhance teaching and learning in higher education? Empirical research in teaching and learning: Contributions from social psychology, Wiley Blackwell, pp. 130–142.  DOI:10.1002/9781444395341.ch8.
  23. Hidi S. and Renninger K. A., (2006), The Four-Phase Model of Interest Development, Educ. Psychologist, 41(2), 111–127 DOI:10.1207/s15326985ep4102_4.
  24. Komperda R., Pentecost T. C. and Barbera J., (2018), Moving beyond Alpha: A Primer on Alternative Sources of Single-Administration Reliability Evidence for Quantitative Chemistry Education Research, J. Chem. Educ., 95(9), 1477–1491 DOI:10.1021/acs.jchemed.8b00220.
  25. Kuder G. F. and Richardson M. W., (1937), The theory of the estimation of test reliability, Psychometrika, 2(3), 151–160 DOI:10.1007/BF02288391.
  26. Lee S. T. H., (2018), Testing for measurement invariance: Does your measure mean the same thing for different participants? APS Obs., 31, 32–33.
  27. Lester D., (2013), A Review of the Student Engagement Literature, Focus on Colleges, Universities, and Schools, 7(1).
  28. Levin B., (2000), Putting Students at the Centre in Education Reform, J. Educ. Change, 1(2), 155–172 DOI:10.1023/A:1010024225888.
  29. Micari M. and Pazos P., (2012), Connecting to the Professor: Impact of the Student–Faculty Relationship in a Highly Challenging Course, Coll. Teach., 60(2), 41–47 DOI:10.1080/87567555.2011.627576.
  30. Millsap R. E. and Yun-Tein J., (2004), Assessing Factorial Invariance in Ordered-Categorical Measures, Multivariate Behav. Res., 39(3), 479–515 DOI:10.1207/S15327906MBR3903_4.
  31. Nersessian N. J., (1989), Conceptual change in science and in science education, Synthese, 80(1), 163–183 DOI:10.1007/BF00869953.
  32. Portland State University, (2025), Facts: PSU By the Numbers | Portland State University. https://www.pdx.edu/portland-state-university-facts.
  33. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185 DOI:10.1039/B5RP90026C.
  34. Rocabado G. A., Komperda R., Lewis J. E. and Barbera J., (2020), Addressing diversity and inclusion through group comparisons: A primer on measurement invariance testing, Chem. Educ. Res. Pract., 21(3), 969–988 DOI:10.1039/D0RP00025F.
  35. Sheridan Center for Teaching and Learning, Brown University, (2024), Establishing Learning Goals | Sheridan Center for Teaching and Learning | Brown University. https://sheridan.brown.edu/resources/course-design/establishing-learning-goals.
  36. Taber K. S., (2018), The Use of Cronbach's Alpha When Developing and Reporting Research Instruments in Science Education, Res. Sci. Educ., 48(6), 1273–1296 DOI:10.1007/s11165-016-9602-2.
  37. Tanner K. D., (2013), Structure Matters: Twenty-One Teaching Strategies to Promote Student Engagement and Cultivate Classroom Equity, CBE—Life Sci. Educ., 12(3), 322–331 DOI:10.1187/cbe.13-06-0115.
  38. Thompson M. S. and Green S. B., (2013), Evaluating between-group differences in latent variable means, in Structural equation modeling: A second course, 2nd ed., IAP Information Age Publishing, pp. 163–218.
  39. Tobin K., (1990), Research on Science Laboratory Activities: In Pursuit of Better Questions and Answers to Improve Learning, Sch. Sci. Math., 90(5), 403–418 DOI:10.1111/j.1949-8594.1990.tb17229.x.
  40. Vaughan E. B., Montoya-Cowan A., Kim C., Stephens A., Hamilton O. and Barbera J., (2024), Investigating the Learning Goals and Expectations of Laboratory Coordinators, Graduate Teaching Assistants, and Students in General and Organic Chemistry Laboratory Courses, J. Chem. Educ., 101(12), 5173–5182 DOI:10.1021/acs.jchemed.4c00569.
  41. Walker J. P., Sampson V. and Zimmerman C. O., (2011), Argument-Driven Inquiry: An Introduction to a New Instructional Model for Use in Undergraduate Chemistry Labs, J. Chem. Educ., 88(8), 1048–1056 DOI:10.1021/ed100622h.
  42. Wang C., Cavanagh A. J., Bauer M., Reeves P. M., Gill J. C. and Chen X., et al., (2021), A Framework of College Student Buy-in to Evidence-Based Teaching Practices in STEM: The Roles of Trust and Growth Mindset, CBE—Life Sci. Educ., 20(4), ar54 DOI:10.1187/cbe.20-08-0185.
  43. Weaver G. C., Russell C. B. and Wink D. J., (2008), Inquiry-based and research-based laboratory pedagogies in undergraduate science, Nat. Chem. Biol., 4(10), 577–580 DOI:10.1038/nchembio1008-577.
  44. Worthington R. L. and Whittaker T. A., (2006), Scale Development Research: A Content Analysis and Recommendations for Best Practices, The Counseling Psychologist, 34(6), 806–838 DOI:10.1177/0011000006288127.
  45. Yu C.-Y., (2002), Evaluating Cutoff Criteria of Model Fit Indices for Latent Variable Models with Binary and Continuous Outcomes, University of California, Los Angeles.
  46. Zumbrunn S., McKim C., Buhs E. and Hawley L. R., (2014), Support, belonging, motivation, and engagement in the college classroom: A mixed method study, Instr. Sci., 42(5), 661–684 DOI:10.1007/s11251-014-9310-0.

This journal is © The Royal Society of Chemistry 2025