
Factors associated with chemistry faculty members’ cooperative adoption of evidence-based instructional practices: results from a national survey

Megan C. Connor *a and Jeffrey R. Raker *b
aDepartment of Chemistry and Biochemistry, Samford University, Birmingham, Alabama 35229, USA. E-mail: mconnor@samford.edu
bDepartment of Chemistry, University of South Florida, Tampa, Florida 33620, USA. E-mail: jraker@usf.edu

Received 26th July 2023, Accepted 26th January 2024

First published on 7th February 2024


Abstract

Despite institutional reform efforts to increase use of evidence-based instructional practices (EBIPs) in undergraduate chemistry and STEM courses, didactic lecture remains the predominant mode of instruction. Research to inform these initiatives routinely focuses on drivers and barriers to EBIP adoption, with recent work investigating factors associated with faculty members’ cooperative adoption of EBIPs from five STEM disciplines including chemistry. To understand the role of these specific factors within undergraduate chemistry education across a broad set of institutions, we conducted a national survey of chemistry faculty members (n = 1105) from the United States in Spring 2023. The survey targeted constructs that may underlie the cooperative adoption of EBIPs, including faculty members’ perception of (1) using EBIPs as mutually beneficial, (2) having their success and failure intertwined, and (3) institutional climate around teaching. The survey also included items targeting teaching-specific social interactions, another potential aspect of cooperative adoption. Results from multilevel modeling suggest that EBIP adoption is associated with chemistry faculty members’ perception of using EBIPs as mutually beneficial, aligning with prior findings on STEM faculty members’ cooperative adoption of these practices. However, there is no evidence of an association between EBIP adoption and chemistry faculty members’ perception of campus climate around teaching, where prior findings indicate an inverse association among STEM faculty members. Results further indicate that EBIP adoption is associated with the number of people with whom one specifically discusses pedagogy, instruction, and assessment. Collectively, our results demonstrate that differences exist between STEM disciplines and point toward the chemistry education research community's responsibility to further explore EBIP adoption from a disciplinary lens. Our investigation also provides insight into factors associated with the cooperative adoption of EBIPs among chemistry faculty members on a national level; we identify several implications for how chemistry faculty member change agents (e.g., course coordinators, department leaders) may effectively promote EBIP adoption across the undergraduate chemistry curriculum.


Introduction

Evidence-based instructional practices (EBIPs) are teaching innovations and techniques that educational research demonstrates as effective in supporting student learning (e.g., Missett and Foster, 2015; Stains and Vickrey, 2017); some EBIPs have emerged out of the postsecondary chemistry context (e.g., Process-Oriented Guided-Inquiry Learning; Moog et al., 2009), while others are more general to STEM education (e.g., flipped classroom strategies; Herreid and Schiller, 2013). Due to the demonstrated positive impact of EBIPs on students’ academic performance and affective outcomes, many chemistry educators are now using these instructional practices (e.g., Srinivasan et al., 2018b; Raker et al., 2021b; Connor et al., 2022). However, EBIPs are still not the predominant mode of instruction in undergraduate chemistry courses (Stains et al., 2018); rather, instruction is largely characterized by passive lecture-based approaches, which research demonstrates as less effective than active instructional techniques (e.g., Freeman et al., 2014; Stains et al., 2018; Deslauriers et al., 2019; Theobald et al., 2020).

Cooperative adoption of EBIPs across STEM disciplines

Limited adoption of EBIPs among STEM faculty members, including those in chemistry, has resulted in institutional reform efforts to increase use of these techniques (e.g., Kezar, 2016; Mestre et al., 2019). It has further led to research studies on institutional change initiatives that seek to promote EBIP adoption in undergraduate STEM courses (inclusive of chemistry departments and chemistry course contexts). Collectively, this research literature suggests that STEM faculty members’ adoption of EBIPs does not occur in isolation (e.g., Andrews et al., 2016; Ma et al., 2018; Lane et al., 2019; Mestre et al., 2019; McConnell et al., 2020; Middleton et al., 2022); rather, EBIP adoption takes place in the context of social interactions with one's peers, all of whom are embedded within a social network (e.g., Kezar, 2016). For instance, chemistry and biology faculty members’ teaching-specific interactions with their colleagues (e.g., discussion) have been found to impact their use of EBIPs (Lane et al., 2019). In addition, research suggests that life sciences faculty who are discipline-based education researchers (DBERs) more effectively promote and support instructional reform than other departmental colleagues, namely through co-teaching, providing ready access to findings from education research, and mentoring (Andrews et al., 2016).

Research further suggests that, across various STEM disciplines (e.g., chemistry, biology, physics, geology, environmental engineering, etc.), faculty members who are formal mentors play an essential role in the spread of knowledge surrounding EBIPs, serving as catalysts that help communities of practice dedicated to EBIP adoption grow and connect with one another (Ma et al., 2018; Mestre et al., 2019). Further, chemistry, physics, mathematics, and engineering faculty members who have more teaching-specific interactions tend to use student-centered (i.e., peer-led team learning, PLTL) instructional approaches, in contrast to those with fewer interactions who tend to use instructor-centered approaches (e.g., lecturing; Middleton et al., 2022). These faculty members also report greater experience with assessing learning, an essential aspect of evidence-based instruction (McConnell et al., 2020). However, those who use EBIPs tend to preferentially interact with other faculty members who use EBIPs, suggesting knowledge of EBIPs may remain with existing users (Lane et al., 2020).

EBIP adoption can, thus, be a cooperative rather than independent endeavor (McAlpin et al., 2022). In a recent study, McAlpin et al. (2022) investigated potential factors associated with STEM faculty members’ cooperative adoption of EBIPs, including their teaching-specific interactions; their investigation focused on faculty members across five STEM disciplines (i.e., biological sciences, chemistry, earth sciences, mathematics, and physics) at three large public research universities. Findings from this study suggest that faculty members are more likely to adopt EBIPs when they perceive such practices as mutually beneficial to themselves and their colleagues (McAlpin et al., 2022). Faculty members who are more often the target of teaching discussions, or those who serve as opinion leaders, also tend to be EBIP adopters. Additional research demonstrates that faculty members may serve as opinion leaders on teaching regardless of their academic rank (Lane et al., 2019). Training faculty members of any rank or tenure status within a department to use EBIPs may, thus, be one approach to supporting EBIP adoption. Moreover, faculty members who more strongly perceive their institutional climate as supportive of teaching innovation are also those who tend not to adopt EBIPs (McAlpin et al., 2022); while the nature of this relationship is unclear, it is possible that a population of STEM faculty members views institutional readiness for teaching innovation as, instead, institutional pressure to use EBIPs (McAlpin et al., 2022). Results of the investigation imply a need for activities designed to convince faculty of the mutual benefit of adopting EBIPs, which are distinct from activities that focus on teaching faculty how to implement EBIPs (McAlpin et al., 2022). The authors also call for additional studies focused on the factors underlying cooperative adoption, as these factors may vary with context (McAlpin et al., 2022).

Cooperative adoption of EBIPs among chemistry faculty members

The investigation by McAlpin et al. (2022) provides an essential foundation for understanding the cooperative adoption of EBIPs; however, it is possible that these associations are unrepresentative of chemistry faculty members. Few studies of STEM transformations disaggregate data by discipline or focus on the uniqueness of each STEM discipline; when such disaggregated or focused studies are conducted, key differences are observed. For instance, results from a national survey of chemistry faculty members suggest that those who perceive their departmental climate as supportive of teaching innovation tend to adopt EBIPs (Connor and Raker, 2023); however, other studies investigating the relationship between departmental climate and EBIP adoption across various STEM disciplines have found no evidence of an association (Emery et al., 2021; Shi and Stains, 2021). Therefore, while institutional and departmental climate are distinct, it is possible that the association between institutional climate and EBIP adoption found by McAlpin et al. (2022) does not broadly extend to chemistry faculty members. McAlpin et al. (2022) also called for additional investigations of factors underlying cooperative adoption within specific populations of interest, noting that every change initiative is distinct. Moreover, most investigations of chemistry faculty members’ instructional practices do not account for potential aspects of cooperative adoption, instead focusing on the relationship between use of evidence-based practices and individual-level instructor variables (e.g., instructional goals and beliefs about teaching), course variables (e.g., course level), and institution variables (e.g., public or private institutional control; Gibbons et al., 2017, 2018; Srinivasan et al., 2018a; Raker et al., 2021a; Connor et al., 2023; Zammit et al., 2023). The investigation herein aims to identify factors underlying the cooperative adoption of EBIPs among chemistry faculty members, as understanding these associations will be essential for promoting and supporting widespread instructional reform in undergraduate chemistry courses.

Theoretical framework

Modeling the cooperative adoption of EBIPs using a model of organizational change

The CACAO model developed by Dormant (2011) provides a theoretical foundation for this investigation. Its name derives from what the model identifies as key components of a change initiative: Change, Adopters, Change Agents, and Organization. Change refers to new systems, processes, or behaviors that are desired in lieu of the status quo. Adopters are those who implement change, while change agents promote and support change among adopters. Change, adopters, and change agents all exist within an organization, i.e., the context for the initiative (Dormant, 2011). Several studies on instructional innovation in postsecondary STEM education have utilized the CACAO model to understand the adoption of EBIPs (e.g., Landrum et al., 2017; McAlpin et al., 2022; Viskupic et al., 2022; Yik et al., 2022a). In these studies, use of EBIPs in postsecondary STEM education constitutes the desired outcome (i.e., change), where faculty members (i.e., adopters) implement these instructional practices. Institution leaders, department leaders, pedagogical developers, and education researchers are typically among those who promote and support EBIP use (i.e., change agents). These individuals’ college or university (i.e., organization) serves as the context for the change initiative. In addition to identifying the key components of a change initiative, the CACAO model also outlines discrete, progressive steps taken by those adopting change: (1) awareness, (2) curiosity, (3) mental tryout, (4) hands-on tryout, and (5) adoption (Dormant, 2011). These steps have since been used to develop an instrument that measures an individual faculty member's stage of EBIP adoption in postsecondary education (Landrum et al., 2017). The instrument relies on faculty members’ self-reported knowledge and use of EBIPs, and it has been productively used in multiple investigations of instructional change (McAlpin et al., 2022; Yik et al., 2022a).

Further, the CACAO model prioritizes understanding potential adopters, as this insight enables change agents to effectively promote and support change (Dormant, 2011). The model identifies several aspects of an ideal change initiative from the perspective of potential adopters. These include the relative advantage and social impact of adopting a change, as well as the compatibility, simplicity, and adaptability of the change. Changes that afford additional benefits relative to the status quo are conducive to widespread adoption, whereas changes that offer little relative advantage are less likely to be widely adopted. Likewise, changes that positively impact the social relations of adopters are conducive to widespread adoption, whereas changes that would negatively impact their social relations are less likely to be adopted. The compatibility of the change with an organization's current systems and practices, its simplicity to implement, and its adaptability to adopters’ unique context further support broad adoption. Conversely, change that conflicts with current systems and practices, is complex or complicated to implement, or cannot be tailored to one's context is less likely to be widely adopted (Dormant, 2011). When considering aspects of an ideal change initiative in the context of chemistry instructional reform, adoption of EBIPs would be most widespread (1) when chemistry faculty members perceive such practices as advantageous compared to other forms of instruction, (2) when they perceive EBIP adoption as having a positive (or at least neutral) impact on relationships with others in their department or institution, and (3) when they perceive EBIPs as compatible with current instructional approaches at their institution, easy to implement, and adaptable to their own classroom, chemistry content, etc. (McAlpin et al., 2022).

Potential factors underlying the cooperative adoption of EBIPs

Using the CACAO model, McAlpin et al. (2022) investigated the association of three aspects of an ideal change initiative (i.e., relative advantage, social impact, and compatibility) with EBIP adoption among STEM faculty members. Each aspect was operationalized using a measurable construct that serves as an element of the broader aspect, all of which account for faculty members’ embeddedness in a social network. Using these conceptualizations, McAlpin et al. (2022) developed the Cooperative Adoption Factors Instrument (CAFI) to measure each of the constructs as they relate to faculty members’ adoption of EBIPs across five STEM disciplines at three institutions. The two additional aspects outlined in the CACAO model (i.e., simplicity and adaptability) were not investigated given they are specific to particular instructional practices, and the authors aimed to develop an instrument for a wide array of contexts.
Strategic complements. McAlpin et al. (2022) operationalized relative advantage as strategic complements, or the perception of EBIP adoption as mutually beneficial and reinforcing. Strategic complements were first conceptualized in economics and describe decisions that are mutually beneficial and reinforcing (e.g., Bulow et al., 1985; Jackson and Zenou, 2015). In the context of postsecondary instructional reform, strategic complementarity relates to a faculty member's perceived advantage of adopting EBIPs based on adoption by other faculty members (McAlpin et al., 2022). When a faculty member perceives that EBIP adoption will have greater benefits when more faculty members are using them, the decision to adopt EBIPs is a strategic complement (McAlpin et al., 2022). These greater benefits also imply a relative advantage to using EBIPs. Some perceived mutual benefits may include increased collaboration and sharing of instructional resources, professional development experiences related to teaching and learning, and opportunities for education research, all of which research identifies as drivers of EBIP adoption (Shadle et al., 2017).
Interdependence. Social impact is operationalized as interdependence (McAlpin et al., 2022), or the perception that faculty members’ success or failure is intertwined (Boon and Holmes, 1991). Interdependence is a widely used measure of interpersonal trust (e.g., Boon and Holmes, 1991; Aktipis et al., 2018), and it can more accurately predict helping behaviors when compared to other related measures (e.g., kinship; Aktipis et al., 2018). McAlpin et al. (2022) considered trust as an essential metric for evaluating readiness for institutional change, as faculty tend to make trust-based decisions during such initiatives (Kezar, 2016). Knowledge of EBIPs is essential for adoption, and faculty members may acquire this knowledge through initial social interactions with change agents (e.g., department leaders) or other potential adopters (i.e., other faculty members) (McAlpin et al., 2022). Further, the expansion of this social network is essential for sustained change, as it provides faculty members with a continual source of knowledge and support (López et al., 2022). However, these interactions likely depend upon the level of trust between colleagues, in particular that interactions surrounding EBIPs will not negatively impact professional outcomes (e.g., promotion) or create department divides that disrupt the social structure, both of which research identifies as barriers to EBIP adoption (Shadle et al., 2017).
Climate. Compatibility is operationalized as climate, or the perception of institutional readiness for change. When contemplating adoption, faculty members may consider whether EBIPs are compatible with their institution's climate around teaching, which they deduce from interactions with their peers (Walter et al., 2021). A positive climate that supports teaching innovation may be conducive to EBIP adoption, as faculty members may be encouraged to experiment with new teaching practices, including EBIPs, without penalty. For instance, chemistry faculty members who adopt EBIPs tend to be those who perceive a departmental climate in which departmental policies, practices, and expectations reflect a commitment to the continuous improvement of teaching (Connor and Raker, 2023). STEM faculty members have also identified an unsupportive departmental climate around teaching as a barrier to EBIP adoption in numerous studies (e.g., Henderson and Dancy, 2007; Landrum et al., 2017; Sturtevant and Wheeler, 2019). In such a climate, the department may not support pedagogical exploration, experimentation, or deviation from traditional lecture (Shadle et al., 2017).

Previous results obtained from the CAFI provide initial insight into factors associated with STEM faculty members’ cooperative adoption of EBIPs (McAlpin et al., 2022), providing implications for how change agents may effectively promote and support their use. Namely, results suggest that perceiving EBIP adoption as a strategic complement is associated with EBIP adoption, whereas perceiving institutional readiness for change is inversely associated with adoption (McAlpin et al., 2022). The former association suggests that when faculty members believe that more people using EBIPs results in greater benefits, they are more likely to adopt EBIPs. The latter association suggests that faculty members who do not use EBIPs may perceive institutional readiness for change (and possibly institutional pressure to use EBIPs) more acutely than faculty members who already use EBIPs in their courses, though the nature of this relationship is unclear (McAlpin et al., 2022). However, the degree to which these associations broadly persist among postsecondary chemistry faculty members across institutions is unresolved. By using the CAFI to investigate the association of these factors with chemistry faculty members’ EBIP adoption on a national scale, researchers can obtain insight into the extent to which associations observed by McAlpin et al. (2022) extend to this instructor population. Such results will help change agents across institutions (e.g., chemistry department leaders) develop effective strategies for promoting and supporting reform across the undergraduate chemistry curriculum.

Social interactions. Faculty members’ perceptions of strategic complements, interdependence, and climate are likely influenced by their network of peers. Research also suggests that social interactions influence instructional practices (e.g., Kezar, 2016; Lane et al., 2019, 2020). Therefore, in addition to investigating aspects of an ideal change initiative, McAlpin et al. (2022) also investigated the association between faculty members’ social interactions and EBIP adoption. Specifically, they focused on the degree to which faculty members serve as targets for teaching discussion, a proxy measure of their opinion leadership. Notably, McAlpin et al. (2022) found their proxy measure of opinion leadership (i.e., indegree from a social network analysis) to be associated with EBIP adoption, suggesting faculty who are more often the target of teaching discussions are more likely to use EBIPs. This finding aligns with that of another investigation in which opinion leaders tended to be discipline-based education researchers, whom their colleagues perceived as having distinct professional expertise in education (Andrews et al., 2016).

This finding adds to the growing body of research on how social interactions may influence EBIP adoption. However, such relationships are complex, and many aspects of their nature remain unresolved. These include, but are not limited to, the nature of teaching discussions (e.g., which aspects of teaching are discussed) and engagement in teaching evaluations with one's discussion partner. Results from one investigation across three science departments (biology, chemistry, and geoscience) suggest that both the context and content of teaching conversations may be important for promoting EBIP adoption (Lane et al., 2022). While data were not disaggregated by discipline, office location and course overlap were found to connect faculty members and to support the sharing of innovative teaching knowledge. Conversations that participants considered as influencing their use of EBIPs focused on a range of topics, including course delivery, content coverage, teaching strategies, and the degree of course synchronization, among others (Lane et al., 2022). Moreover, it is possible that, while all discussions surrounding teaching are useful, only discussions about particular aspects of teaching drive EBIP adoption. Such insight would help opinion leaders focus their discussions on topics that will effectively promote reform. It is further possible that EBIP adoption is associated with having a faculty discussion partner who has observed one's teaching, or vice versa, providing departments with a formal structure for promoting EBIP use. Research demonstrates that faculty mentors may catalyze instructional change (e.g., Mestre et al., 2019); insight into the association between having a discussion partner who has observed one's teaching and EBIP adoption would thus provide a nuanced perspective into how mentors effectively promote reform.

Research questions

This study addresses two main research questions:

1. To what extent are chemistry faculty members’ perceptions of strategic complements, interdependence, and campus climate around teaching associated with EBIP adoption in their postsecondary courses?

2. To what extent are chemistry faculty members’ social interactions relating to teaching associated with EBIP adoption in their postsecondary courses?

Methods

To address these research questions, this study employs a national survey of postsecondary chemistry faculty members at institutions within the United States awarding four-year degrees in chemistry. The survey took place in Spring 2023 and included the CAFI instrument as developed by McAlpin et al. (2022), items targeting faculty members’ social interactions surrounding aspects of teaching and learning, and the EBIP Adoption Scale developed by Landrum et al. (2017). Data are first evaluated for evidence of validity and reliability using confirmatory factor analysis (CFA) and two reliability measures. After obtaining sufficient evidence to support the interpretation of data as measures of intended constructs, data are further analyzed via multilevel binary logistic regression. The study was approved by the University of South Florida's Institutional Review Board on February 27, 2023 (Application STUDY005219).

Population and survey development

The survey targeted chemistry faculty members at four-year institutions in the United States that awarded at least one bachelor's degree in chemistry from 2015 to 2020. Institutions meeting these criteria were identified using the Integrated Postsecondary Education Data System (IPEDS; National Center for Education Statistics, 2021), and publicly available faculty lists on institution websites were used to construct a list of faculty members (i.e., the study population). The study population included N = 14,116 chemistry faculty members from N = 1137 institutions (see Table 1). Institutions represented a range of characteristics, including institutional control (i.e., public or private), highest chemistry degree awarded (i.e., bachelor's or graduate), and approval from the American Chemical Society (ACS) to award certified bachelor's degrees (see Table 1). Institutional control and highest chemistry degree awarded were determined using IPEDS, and ACS approval was determined using the current listing of approved programs on the ACS Committee on Professional Training website.
Table 1 Definition of the study population (N), including response rates. Institutions represented a range of characteristics, including institutional control, highest chemistry degree awarded, and ACS approval
Institutional control Highest chem. degree awarded Number of institutions Number of institutions with ACS approval Number of faculty members Number of respondents Response rate (%)
Public Bachelor's 250 165 2664 334 12.5
Public Graduate 223 217 5982 548 9.2
Private Bachelor's 575 233 3558 488 13.7
Private Graduate 89 78 1912 145 7.6
Totals 1137 693 14,116 1515 10.7


Institutions in the United States awarding graduate degrees (i.e., master's or doctorate) typically have higher research activity compared to institutions awarding only bachelor's degrees. These institutions tend to have larger course sizes and faculty members with time-consuming research obligations (Cox et al., 2011), both of which can act as barriers to EBIP adoption (Lund and Stains, 2015; Shadle et al., 2017). Further, ACS is a professional society in the United States, and the Committee on Professional Training within ACS establishes evaluation criteria for undergraduate chemistry degree programs (Committee on Professional Training, 2023). Degree programs meeting evaluation criteria are granted approval from the ACS to award certified bachelor's chemistry degrees. ACS is analogous to the Royal Society of Chemistry (RSC) in the United Kingdom and the Royal Australian Chemical Institute (RACI) in Australia, where the ACS approval process is similar to the RSC and RACI accreditation processes. Approval from ACS to award certified bachelor's degrees has been found to be associated with greater use of evidence-based pedagogies (e.g., Connor et al., 2022).

The survey included previously published versions of the CAFI (see Appendix 1; McAlpin et al., 2022) and EBIP Adoption Scale (see Appendix 1; Landrum et al., 2017). Items targeting faculty members’ social interactions surrounding aspects of teaching and learning were developed using recommended items for collecting network data in the social sciences (Burt, 1984). Mirroring these recommended items (Burt, 1984), participants were first asked to list up to five people with whom they discussed teaching and learning over the past year (see Appendix 1). Subsequent items then probed different aspects of their interactions with each listed person (e.g., whether the listed person has observed their teaching, or vice versa; whether the listed person engages in discipline-based education research; etc.; see Appendix 1). Literature on collecting network data recommends that survey respondents list and respond in reference to a maximum of five people; this maximum accounts for the average number of people that respondents typically list, time considerations for completing the survey, the increased likelihood of survey fatigue when responding in reference to more than five individuals, and evidence on the amount of information individuals can comfortably retain at once in their memory (Miller, 1956; Simon, 1974; Burt, 1984).

Further, social interaction items were designed to gain general rather than detailed insight into the nature of social interactions. For example, participants were simply asked to list the people with whom they discussed teaching and learning over the past year, with the anticipation that some participants would restrict their responses to people with whom they had frequent or focused interactions while others would not. This decision was informed by the study's aim of investigating various factors underlying the cooperative adoption of EBIPs in addition to social interaction variables. Numerous items targeting specific aspects of respondents’ social interactions would contribute to survey fatigue given the already large number of probed variables; to reduce survey fatigue while still investigating a range of factors underlying cooperative adoption, items thus targeted only general aspects of social interactions. By identifying general social interaction variables associated with the cooperative adoption of EBIPs, this study aims to provide a foundation through which future studies can more deeply probe specific aspects. Relevant survey items are provided in Appendix 1 in the order in which they were presented to participants.

Data collection and study sample

Data were collected via Qualtrics in Spring 2023. Faculty members (N = 14,116) were invited via email to complete the online survey. The survey was available for three weeks; two reminder emails were sent to faculty members with incomplete responses at the end of the first and second week. A total of n = 1515 individuals consented to participate and began the survey (see Table 1), and n = 1105 of these individuals (1) indicated they had been the primary instructor of an undergraduate chemistry lecture course within the last three years and (2) provided complete responses to questions targeting cooperative adoption factors (see Table 2). Demographics of the study sample are summarized in Appendix 2.
Table 2 Definition of the study sample (n), including response rates
Institutional control Highest chem. degree awarded Number of institutions in sample Number of institutions with ACS approval in sample Number of faculty members in sample Response rate (%)
Public Bachelor's 122 102 240 9.0
Public Graduate 170 168 382 6.4
Private Bachelor's 234 136 375 10.5
Private Graduate 61 55 108 5.6
Totals 587 461 1105 7.8


Psychometric evaluation of the cooperative adoption factors instrument (CAFI)

The CAFI is a 17-item instrument designed to measure three constructs underlying faculty members’ cooperative adoption of EBIPs (i.e., strategic complements, interdependence, and climate) using 7-point Likert and semantic differential scales (McAlpin et al., 2022). Data from the CAFI have been psychometrically evaluated using exploratory factor analysis, CFA, and various reliability coefficients. For the study herein, CFA is used to evaluate the known internal structure of the CAFI; Cronbach's alpha (α) and McDonald's omega (ω) coefficients are used to evaluate internal consistency. Results from these analyses provide evidence of structural validity and reliability which, in turn, provide support for interpreting scores as measuring perceptions of strategic complements, interdependence, and campus climate toward teaching (i.e., constructs underlying cooperative adoption of EBIPs).
Evaluation of internal structure using CFA. CFA was conducted using the instrument's published factor structure (McAlpin et al., 2022) and the weighted least squares–mean and variance adjusted (WLSMV) estimator to accommodate the ordinal nature of the data (Li, 2016). Negatively worded items were reverse coded prior to analysis. The fit of a model is evaluated through various statistics in CFA, including the comparative fit index (CFI), Tucker-Lewis index (TLI), root mean square error of approximation (RMSEA), standardized root mean square residual (SRMR), and χ2 statistic (Worthington and Whittaker, 2006). For acceptable model fit, CFI ≥ 0.90, TLI ≥ 0.90, RMSEA < 0.05, and SRMR < 0.10 (Hu and Bentler, 1999; Worthington and Whittaker, 2006). Ideally, χ2 should be nonsignificant, though this is unexpected given the sample size and model complexity for the study herein. The model exhibited good fit, where χ2 (n = 1105, df = 110, p < 0.001) = 395.6, CFI = 0.97, TLI = 0.96, RMSEA = 0.05, and SRMR = 0.05. Standardized item loadings ranged from 0.41 to 0.82, and covariance coefficients ranged from 0.12 to 0.21 (see Appendix 3). Factor scores were generated using the CFA model for use in subsequent multilevel modeling. The analysis was completed using the “lavaan” package in RStudio (Rosseel, 2012).
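For readers who wish to replicate this analysis, a minimal sketch in R is given below. The item names and the per-factor item split are hypothetical placeholders (the CAFI's 17 items and their factor assignments are given in Appendix 1 and McAlpin et al., 2022); only the lavaan calls mirror the analysis described above.

```r
# Sketch of the CAFI confirmatory factor analysis; item names (sc*, in*, cl*)
# and the 6/5/6 split are hypothetical placeholders. Negatively worded items
# are assumed to be reverse coded before fitting.
library(lavaan)

cafi_model <- '
  strategic_complements =~ sc1 + sc2 + sc3 + sc4 + sc5 + sc6
  interdependence       =~ in1 + in2 + in3 + in4 + in5
  climate               =~ cl1 + cl2 + cl3 + cl4 + cl5 + cl6
'

# Items are declared ordered to treat the 7-point scales as ordinal;
# WLSMV provides mean- and variance-adjusted test statistics
fit <- cfa(cafi_model, data = cafi_data, estimator = "WLSMV", ordered = TRUE)

fitMeasures(fit, c("chisq", "df", "pvalue", "cfi", "tli", "rmsea", "srmr"))
standardizedSolution(fit)   # standardized loadings and factor covariances
scores <- lavPredict(fit)   # factor scores for the multilevel models
```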
Evaluation of internal consistency using reliability coefficients. Cronbach's α and McDonald's ω coefficients were calculated for each factor using the “psych” package in RStudio (Revelle, 2018). For both coefficients, higher values approaching one are preferred and interpreted as being good (Guttman, 1945; Cronbach, 1951; McDonald, 1999). For the study herein, coefficients approach one for all factors (see Appendix 3), providing evidence of reliability.
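The corresponding reliability calculation, again assuming the hypothetical item names above, might look like the following sketch:

```r
# Internal consistency for one CAFI factor (repeated per factor), assuming
# the hypothetical item names above and reverse-coded data
library(psych)

sc_items <- cafi_data[, c("sc1", "sc2", "sc3", "sc4", "sc5", "sc6")]
psych::alpha(sc_items)                 # Cronbach's alpha
psych::omega(sc_items, nfactors = 1)   # McDonald's omega (one-factor model)
```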

Psychometric evaluation of the EBIP adoption scale

The EBIP Adoption Scale is a 6-item instrument designed to measure instructors’ EBIP adoption stage outlined in the CACAO model (see Table 3; Landrum et al., 2017). The instrument uses a Guttman scale and dichotomous “yes/no” responses, where items are ordered to reflect progressive stages of EBIP adoption. Respondents’ EBIP adoption stage is marked by a change in response pattern from “yes” to “no.”
Table 3 EBIP Adoption Scale developed by Landrum et al. (2017). Items map onto a CACAO adoption stage (Dormant, 2011) and a modified CACAO adoption stage used by Yik et al. (2022a). Participants’ scores were calculated using their total number of “yes” responses
EBIP adoption scale item Score (number of “yes” responses) CACAO adoption stage Modified CACAO adoption stage
(No “yes” responses) 0 Awareness Awareness
Prior to this survey, I already knew about evidence-based instructional practices (EBIPs). 1 Awareness Awareness
I have thought about how to implement EBIPs in my courses. 2 Mental tryout Tryout
I have spent time learning about EBIPs (e.g., attended workshops, experimented in class, read education literature), and I am prepared to use EBIPs. 3 Hands-on tryout Tryout
I consistently use EBIPs in my courses. 4 Adoption Adoption
I consistently use EBIPs, and I continue to learn about and experiment with new EBIPs. 5 Adoption Adoption
I have evidence that my teaching has improved since I started using EBIPs. 6 Adoption Adoption


Various statistics are used to evaluate the reliability and unidimensionality of measures obtained via Guttman scales, including the coefficient of reproducibility (CR), coefficient of scalability (CS), minimal marginal reproducibility (MMR), and percent improvement (PI; McIver and Carmines, 1981). Ideally, CR values should be greater than 0.90 and CS values greater than 0.60 for evidence of reliability and unidimensionality (Abdi, 2010). Lower MMR values and higher PI values are also preferred. For measures obtained using the EBIP Adoption Scale for the study herein, CR = 0.96 and CS = 0.85. These values exceed the cutoff criteria, providing evidence of reliability and unidimensionality. Further, MMR = 0.74 and PI = 0.22. The MMR is higher, and the PI lower, than desired, though neither value is prohibitive. With evidence of reliability and unidimensionality, participants’ scores on the EBIP Adoption Scale were then determined using their total number of “yes” responses (see Table 3). Items from the EBIP Adoption Scale and respondents’ total scores were further mapped onto modified CACAO adoption stages as done by Yik et al. (2022a) (see Table 3) to facilitate multilevel modeling. In these modified stages, Mental Tryout and Hands-on Tryout are condensed into one stage, i.e., Tryout. Participants were thus categorized into one of three EBIP adoption stages: Awareness, Tryout, or Adoption.
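These four statistics can be computed directly from the matrix of dichotomous responses. The sketch below assumes a hypothetical n × 6 matrix `resp` of 0/1 responses with items in scale order and uses the common Goodenough-Edwards convention of counting errors as deviations from the ideal response pattern implied by each respondent's total score; note that the reported values are internally consistent, e.g., PI = CR − MMR = 0.96 − 0.74 = 0.22 and CS = PI/(1 − MMR) = 0.22/0.26 ≈ 0.85.

```r
# Guttman scale statistics for the EBIP Adoption Scale; `resp` is a
# hypothetical n x 6 matrix of 0/1 ("no"/"yes") responses, with items
# ordered from the first to the last scale item
guttman_stats <- function(resp) {
  n <- nrow(resp); k <- ncol(resp)
  scores <- rowSums(resp)
  # Ideal Guttman pattern implied by each total score: all "yes" up to the
  # score, then all "no" (Goodenough-Edwards error counting)
  ideal <- t(sapply(scores, function(s) as.integer(seq_len(k) <= s)))
  CR  <- 1 - sum(resp != ideal) / (n * k)  # coefficient of reproducibility
  p   <- colMeans(resp)
  MMR <- mean(pmax(p, 1 - p))              # minimal marginal reproducibility
  PI  <- CR - MMR                          # percent improvement
  CS  <- PI / (1 - MMR)                    # coefficient of scalability
  c(CR = CR, MMR = MMR, PI = PI, CS = CS)
}
```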

Multilevel modeling

To address Research Question 1, two multilevel binary logistic regression models (i.e., Tryout – CAFI and Adoption – CAFI) are used to investigate the extent to which faculty members’ perceptions of strategic complements, interdependence, and climate are associated with EBIP adoption. The Tryout – CAFI model is used to distinguish between awareness and tryout, and the Adoption – CAFI model is used to distinguish between tryout and adoption. Two multilevel binary logistic regression models are used rather than a single multinomial logistic regression model, as the latter would yield two regression coefficients, each relative to a single reference stage (i.e., adoption), rather than comparisons between adjacent stages. A single multilevel ordinal logistic regression model could also be used; however, such a model assumes proportional odds across adoption stages, and research demonstrates that unique factors are associated with each adoption stage (e.g., Yik et al., 2022a). To obtain the most parsimonious and interpretable regression coefficients, two different models are thus used to model two different outcomes (i.e., tryout and adoption).

To address Research Question 2, two additional multilevel binary logistic regression models (i.e., Tryout – Social and Adoption – Social) are used to investigate the extent to which faculty members’ social interactions relating to teaching and learning are associated with EBIP adoption when accounting for perceptions of strategic complements, interdependence, and climate. The Tryout – Social model is used to distinguish between awareness and tryout, and the Adoption – Social model is used to distinguish between tryout and adoption. Two multilevel binary logistic regression models are again used to model two different outcomes (i.e., tryout and adoption) to obtain the most parsimonious and interpretable regression coefficients.

A multilevel approach is used to account for the nesting of faculty members within departments, a violation of the assumption that data represent independent observations (see Raudenbush and Bryk, 2002). All models included two levels, with faculty members at the first level and departments at the second level. The number of departments corresponds to the number of institutions in the study sample (see Table 2), as each institution has a single chemistry department. The Tryout (CAFI and Social) models included n = 322 participants and n = 290 groups (i.e., departments). The Adoption (CAFI and Social) models included n = 997 participants and n = 550 groups. Mean and variance adaptive Gauss–Hermite quadrature (mvaghermite) integration with seven integration points was used for modeling with the “melogit” command in StataSE Version 17 (StataCorp, 2021).
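The models themselves were fit with Stata's melogit command; for illustration only, an analogous (not identical) specification in R with lme4 is sketched below, using adaptive Gauss–Hermite quadrature with seven quadrature points to mirror the integration settings above. Variable names are hypothetical.

```r
# Analogous multilevel binary logistic regression in R (the study used
# Stata's -melogit-); variable names are hypothetical. The Adoption - CAFI
# model distinguishes Adoption (adoption = 1) from Tryout (adoption = 0),
# with a random intercept for department (level 2).
library(lme4)

adoption_cafi <- glmer(
  adoption ~ acs_approval + bachelors_highest +
    strategic_complements + interdependence + climate + (1 | department),
  data   = subset(model_data, stage %in% c("Tryout", "Adoption")),
  family = binomial("logit"),
  nAGQ   = 7   # adaptive Gauss-Hermite quadrature with seven points
)

exp(fixef(adoption_cafi))                     # odds ratios
exp(confint(adoption_cafi, method = "Wald"))  # Wald CIs on the OR scale
```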

Defining outcome variables. Outcome variables include three mutually exclusive EBIP adoption stages (i.e., Awareness, Tryout, or Adoption) determined using participants’ EBIP Adoption Scale score (see Table 3). The Tryout – CAFI and Tryout – Social models are used to distinguish between the Awareness and Tryout stages of EBIP adoption, and the Adoption – CAFI and Adoption – Social models are used to distinguish between the Tryout and Adoption stages. Participants’ EBIP adoption stages are summarized in Table 4.
Table 4 Distribution of participants’ EBIP adoption stages. Stages were determined using participants’ total number of “yes” responses on the EBIP Adoption Scale (Landrum et al., 2017) and the modified CACAO adoption stages used by Yik et al. (2022a)
Adoption stage Number of study participants (n = 1105) Percentage of study participants (%)
Awareness 108 9.8
Tryout 258 23.3
Adoption 739 66.9


Within this distribution, 66.9% of study participants are in the Adoption stage (see Table 4). This percentage is substantial and potentially an anomaly, as research suggests lecture-based approaches remain the dominant mode of instruction in undergraduate STEM courses (Stains et al., 2018). Given that only faculty members who provided complete responses to items targeting factors underlying the cooperative adoption of EBIPs were included in the study sample, it is possible that study participants are committed to teaching and are, in turn, more likely to be in the Adoption stage. However, a sufficient distribution of stages for investigating factors associated with cooperative adoption was obtained (see Table 4), so this potential response bias does not preclude the investigation.

Defining predictor variables. Participants’ CAFI factor scores generated using the CFA model were included as predictor variables across all conditional models. While CAFI factor scores correspond to latent constructs measured by the instrument, the regression models used herein include the factor scores as observed predictor variables rather than modeling the latent variables directly, yielding more parsimonious models (Raykov and Marcoulides, 1999). Departmental demographic variables that prior research has found to be associated with EBIP adoption (Lund and Stains, 2015; Connor et al., 2022) were also included as predictor variables across conditional models to further account for the nested nature of the data (see Table 5). Variables that capture faculty members’ social interactions relating to teaching and learning were included as predictor variables in the Tryout – Social and Adoption – Social models (see Table 5). Missing values for all variables were imputed as zero. Results from the evaluation of unconditional models are provided in Appendix 4.
Table 5 Summary of predictor variables included in regression models
Predictor variable Definition Coding
a DBER investigates learning and teaching in a discipline from a perspective reflecting disciplinary knowledge and practices, and it is informed by research on learning and cognition (National Research Council, 2012). SoTL investigates learning and teaching with an emphasis on developing reflective practice and using classroom-based evidence (National Research Council, 2012), though it does not emerge from education theory or the learning sciences (Coppola and Krajcik, 2013).
Department demographic variables
ACS approval Approval from ACS to award certified bachelor's chemistry degrees 0 = no, 1 = yes
Highest chemistry degree awarded – bachelor's A chemistry bachelor's degree is the highest degree awarded by a department 0 = no, 1 = yes
CAFI variables
Strategic complements CAFI factor scores from CFA
Interdependence CAFI factor scores from CFA
Climate CAFI factor scores from CFA
Social interaction variables
Total people Total number of people with whom the faculty member discussed teaching and learning over the past year 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Department Of the people with whom the faculty member discussed teaching and learning over the past year, the number that were from their department 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
College Of the people with whom the faculty member discussed teaching and learning over the past year, the number that were from their college or university 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
External Of the people with whom the faculty member discussed teaching and learning over the past year, the number that were from outside their college or university 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Observed by Of the people with whom the faculty member discussed teaching and learning over the past year, the number that observed the faculty member's teaching during that time 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Observed Of the people with whom the faculty member discussed teaching and learning over the past year, the number the faculty member observed teaching during that time 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
SoTL or DBER Of the people with whom the faculty member discussed teaching and learning over the past year, the number engaged in Scholarship of Teaching and Learning (SoTL) or Discipline-Based Education Research (DBER)a 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Textbook Of the people with whom the faculty member discussed teaching and learning over the past year, the number they discussed the textbook with 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Content Of the people with whom the faculty member discussed teaching and learning over the past year, the number they discussed course content with 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Pedagogy and instruction Of the people with whom the faculty member discussed teaching and learning over the past year, the number they discussed pedagogy and instruction with 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Assessment Of the people with whom the faculty member discussed teaching and learning over the past year, the number they discussed assessment with 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
Academic dishonesty and integrity Of the people with whom the faculty member discussed teaching and learning over the past year, the number they discussed academic dishonesty and integrity with 0 = no people, 1 = 1 person, 2 = 2 people, 3 = 3 people, 4 = 4 people, 5 = 5 people
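As an illustration of how the survey's roster design maps onto the count variables in Table 5, the sketch below derives per-respondent counts from a hypothetical long-format roster (one row per listed person, up to five per respondent, with logical flags for each follow-up item); the column names are placeholders rather than the survey's actual variable names.

```r
# Deriving Table 5 count variables from hypothetical roster data; `roster`
# has one row per person a respondent listed, with logical follow-up flags
library(dplyr)

social_vars <- roster %>%
  group_by(respondent_id) %>%
  summarise(
    total_people             = n(),   # capped at five by survey design
    department               = sum(same_department),
    pedagogy_and_instruction = sum(discussed_pedagogy),
    assessment               = sum(discussed_assessment)
    # ...remaining Table 5 counts follow the same pattern
  )
```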


Results

Results from a national survey of factors underlying the cooperative adoption of EBIPs in postsecondary chemistry courses are presented. Specifically, results from the Tryout – CAFI and Adoption – CAFI regression models provide insight into the extent to which chemistry faculty members’ perceptions of strategic complements, interdependence, and campus climate around teaching are associated with EBIP adoption (i.e., Research Question 1). Results from the Tryout – Social and Adoption – Social models provide further insight into the extent to which chemistry faculty members’ social interactions relating to teaching and learning are associated with EBIP adoption (i.e., Research Question 2). Odds ratios (ORs) are used to measure the strength of association between outcome and predictor variables. An OR > 1 represents the increase in the odds of an event taking place (e.g., being in the Adoption versus Tryout stage) for every one-unit increase in the predictor variable (e.g., strategic complements) when all other predictor variables are held constant. Likewise, an OR < 1 represents the decrease in the odds of an event taking place for every one-unit increase in the predictor variable when all other predictor variables are held constant. An OR = 1 indicates no relationship between the outcome and predictor variables.
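As a worked example of this interpretation (with an assumed baseline), an OR of 1.53 moves a faculty member whose baseline odds are 1:1 (probability 0.50) to odds of 1.53:1 (probability of about 0.60) per one-unit increase in the predictor:

```r
# Worked example: converting an OR into a change in predicted probability,
# assuming a hypothetical baseline probability of 0.50
or    <- 1.53                  # odds ratio per one-unit predictor increase
p0    <- 0.50                  # assumed baseline probability
odds1 <- (p0 / (1 - p0)) * or  # odds after a one-unit increase: 1.53
p1    <- odds1 / (1 + odds1)   # ~0.605, the corresponding probability
```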

RQ1: To what extent are chemistry faculty members’ perceptions of strategic complements, interdependence, and campus climate around teaching associated with EBIP adoption in their postsecondary courses?

ORs for the Tryout – CAFI model range from 1.02 to 1.78, and those for the Adoption – CAFI model range from 0.88 to 2.82 (see Table 6). Strategic complements is a significant predictor variable in both models. Its OR = 1.53 in the Tryout – CAFI model, meaning that, for each one standard deviation increase in a faculty member's perception of EBIP adoption as a strategic complement, the odds of being in the Tryout versus Awareness stage increase by a factor of 1.53. Its OR = 2.82 in the Adoption – CAFI model, indicating that the odds of being in the Adoption versus Tryout stage increase by a factor of 2.82 per standard deviation increase in this perception. The other CAFI variables (i.e., interdependence and climate) are nonsignificant predictors with ORs approaching one across models; there is, thus, no evidence of an association between these variables and EBIP adoption.

Table 6 Perceptions of strategic complements, interdependence, and climate associated with EBIP adoption in postsecondary chemistry courses
Variable EBIPs tryout EBIPs adoption
OR SE p OR SE p
*Significance at 0.05 level, ** significance at 0.01 level, and *** significance at <0.001 level.
Departmental demographic variables
ACS approval 1.62 0.58 0.182 1.06 0.27 0.822
Highest degree awarded – bachelor's 1.78 0.45 0.025* 1.19 0.22 0.351
CAFI variables
Strategic complements 1.53 0.18 <0.001*** 2.82 0.30 <0.001***
Interdependence 1.26 0.19 0.136 0.88 0.10 0.275
Climate 1.02 0.12 0.858 1.03 0.09 0.651


RQ2: To what extent are chemistry faculty members’ social interactions relating to teaching associated with EBIP adoption in their postsecondary courses?

For the Tryout – Social model, ORs range from 0.84 to 1.75 (see Table 7). Strategic complements is again a significant predictor in this more comprehensive model, with an OR = 1.61 that is similar to the corresponding OR in the Tryout – CAFI model (1.53). No social interaction variables are significant predictors, with p-values greater than or equal to 0.115 and ORs near one. There is, thus, no evidence that faculty members’ social interactions relating to teaching and learning are associated with being in the Tryout versus Awareness stage.

Table 7 Social interaction variables associated with EBIP adoption in postsecondary chemistry courses when accounting for perceptions of strategic complements, interdependence, and climate
Variable EBIPs tryout EBIPs adoption
OR SE p OR SE p
*Significance at 0.05 level, ** significance at 0.01 level, and *** significance at <0.001 level.
Departmental demographic variables
ACS approval 1.75 0.71 0.170 1.05 0.28 0.839
Highest degree awarded – bachelor's 1.47 0.43 0.185 1.06 0.21 0.776
CAFI variables
Strategic complements 1.61 0.24 0.002** 2.80 0.30 <0.001***
Interdependence 1.10 0.20 0.579 0.83 0.10 0.125
Climate 1.00 0.13 0.971 1.02 0.09 0.809
Social interaction variables
Total people 1.23 0.24 0.284 1.06 0.12 0.628
Department 0.95 0.21 0.805 0.79 0.11 0.088
College 1.29 0.39 0.396 0.95 0.16 0.782
External 0.86 0.30 0.664 0.84 0.17 0.409
Observed by 0.97 0.20 0.867 1.16 0.14 0.206
Observed 1.01 0.17 0.961 0.86 0.09 0.160
SOTL or DBER 0.94 0.10 0.576 1.04 0.07 0.617
Textbook 1.04 0.13 0.779 0.88 0.07 0.133
Content 0.97 0.14 0.804 1.16 0.10 0.077
Pedagogy and instruction 1.21 0.15 0.124 1.23 0.10 0.014*
Assessment 0.84 0.12 0.239 1.18 0.10 0.044*
Academic dishonesty and integrity 1.21 0.15 0.115 0.94 0.06 0.353


ORs for the Adoption – Social model range from 0.79 to 2.80 (see Table 7). The largest OR (i.e., 2.80) corresponds to strategic complements, nearly identical to the OR for this variable in the Adoption – CAFI model (2.82). Two social interaction variables are significant predictors of being in the Adoption versus Tryout stage: the number of people with whom a chemistry faculty member discusses pedagogy and instruction (OR = 1.23) and the number of people with whom they discuss assessment (OR = 1.18). The odds of a chemistry faculty member being in the Adoption versus Tryout stage thus increase by factors of 1.23 and 1.18 for each additional person with whom they discuss pedagogy/instruction and assessment, respectively. Notably, the total number of people with whom a faculty member discusses teaching and learning in general is not a significant predictor of being in the Adoption versus Tryout stage. Other social interaction variables are nonsignificant predictors in this model, with ORs approaching one.

Discussion and implications

This study evaluated factors associated with chemistry faculty members’ cooperative adoption of EBIPs in their postsecondary courses. Among the evaluated factors were strategic complements (i.e., the perception of using EBIPs as mutually beneficial to oneself and one's colleagues), interdependence (i.e., the perception that faculty members’ success or failure is intertwined), and campus climate around teaching (i.e., the perception of institutional readiness for instructional reform), as well as a range of social interaction variables. Our discussion focuses on factors found to be significant predictors of EBIP tryout and adoption, including implications for effectively promoting EBIP adoption across the undergraduate chemistry curriculum; we note where our results from chemistry contexts differ from those of other STEM disciplines and from studies encompassing (but not disaggregating by) multiple STEM disciplines.

Association of strategic complements with EBIP tryout and adoption

Results suggest that chemistry faculty members’ perception of using EBIPs as mutually beneficial to oneself and one's colleagues (i.e., strategic complements) is associated with both the tryout and adoption of EBIPs in their postsecondary courses. These results further suggest that EBIP adoption among chemistry faculty members may be rooted in developing and having a mindset that the more colleagues who use EBIPs, the greater the associated benefits. Mutual benefits resulting from more departmental colleagues using these techniques may include, but are not limited to, increased shared knowledge and resources for implementing EBIPs, increased support in abandoning traditional instructor-centered approaches, a departmental climate more focused on the continuous improvement of teaching, and improved student learning outcomes across the chemistry curriculum that, in turn, allow more faculty members to meet their instructional goals. To promote EBIP use, department leaders could facilitate peer mentoring or collaboration opportunities to increase the transparency of any potential mutual benefits. The CAFI could subsequently be used as a tool for professional development facilitators or department leaders to measure the impact of such experiences.

To further understand EBIP adoption across the undergraduate chemistry curriculum, additional research is needed to identify specific mutual benefits that act as drivers of cooperative adoption; for example, what mutual benefits does cohort-based professional development involving multiple members of the same chemistry department afford, and what role do these benefits play in supporting EBIP adoption? Chemistry faculty members in the United States also report pressure from various chemistry communities (e.g., textbook authors, ACS exam authors, and department colleagues) to cover a breadth of chemistry content rather than adopting a depth approach better aligned with how people learn (Kraft et al., 2023). These content coverage expectations also function as a barrier to EBIP adoption (Shadle et al., 2017). Thus, research could also investigate whether more chemistry department colleagues using EBIPs functions to reduce this pressure (i.e., a mutual benefit) and, in turn, further drive EBIP use.

The association between EBIP adoption and chemistry faculty members’ perception of using EBIPs as mutually beneficial aligns with prior findings on STEM faculty members’ cooperative adoption of EBIPs (McAlpin et al., 2022). However, there is no evidence of an association between chemistry faculty members’ perception of campus climate around teaching and EBIP adoption, where McAlpin et al. (2022) observed an inverse association between STEM faculty members’ perception of campus climate and adoption. Conversely, prior research demonstrates that chemistry faculty members’ perception of departmental climate around teaching is associated with EBIP adoption (Connor and Raker, 2023), while no evidence of an association is found for STEM faculty members’ perception of departmental climate and adoption (Emery et al., 2021; Shi and Stains, 2021). These differences suggest further exploration is needed to understand the role of departmental versus campus climate in the adoption of EBIPs among chemistry faculty members, including the possibility that their instructional practices are distinctly influenced by their perception of departmental rather than campus climate when compared to faculty members in other STEM disciplines.

Association of social interaction variables with EBIP adoption

The number of people with whom a chemistry faculty member discusses pedagogy/instruction and assessment is also associated with the adoption of EBIPs. However, there is no evidence of an association between EBIP adoption and the number of people with whom a chemistry faculty member discusses the course textbook, content, or academic dishonesty/integrity. This finding differs from that of Lane et al. (2022), in which faculty members from multiple STEM disciplines (including chemistry, though data were not disaggregated by discipline) report that their conversations about course content influence their use of EBIPs. Given that chemistry faculty members report pressure to cover a breadth of content from multiple communities (Kraft et al., 2023), including their department colleagues, it is possible that chemistry faculty members choose to avoid this topic in conversations with colleagues or that conversations surrounding content coverage are not productive. Future research should, thus, aim to investigate the role of content-focused conversations in chemistry faculty members’ adoption of EBIPs, as absent or unproductive conversations may be impeding adoption. Outcomes could then inform strategies for supporting productive, content-focused conversations among chemistry faculty members.

Further, we found no evidence of an association between the total number of people with whom a faculty member discusses teaching regardless of topic and EBIP adoption. Collectively, these results suggest that while talking to others about teaching is important, it is the topics of the conversations that are associated with sustained adoption among chemistry faculty members. Additional studies focused on the nature of these topic-specific discussions, including frequencies, contexts, and objectives, will thus be essential for further promoting EBIP adoption in postsecondary chemistry courses. Future research on EBIP adoption in postsecondary chemistry courses should also routinely consider social interactions, as these findings suggest that focusing only on individual chemistry faculty members and their enacted teaching practices is insufficient for exploring how and why EBIPs are adopted.

Results also suggest that, in addition to the mindset associated with trying out EBIPs, chemistry faculty members need communities in which these topic-specific conversations can occur for sustained EBIP adoption. There is no evidence of an association between EBIP adoption and the number of people in one's department, college, or university with whom one discusses teaching and learning. This result differs from that of another study focused on faculty members from multiple STEM disciplines (including chemistry, though data were not disaggregated by discipline), which found that EBIP adoption is associated with having both deeper and more extensive social connections within and across departments (Middleton et al., 2022). For chemistry faculty members, the composition of these communities may, therefore, be less important than the topics of their conversations. Chemistry faculty learning communities (Houseknecht et al., 2020), communities of practice (CoPs) (Raker et al., 2020), and departmental curriculum committees could, thus, provide the space to initiate and catalyze such discussions. A number of chemistry-specific CoPs have already formed to provide such spaces and, in turn, support EBIP adoption across institutions. For instance, OrganicERs is a CoP designed to introduce active-learning techniques to organic chemistry faculty members across the United States (Leontyev et al., 2020). Stone et al. (2020) also report the formation of a CoP to support chemistry faculty members in implementing course-based undergraduate research experiences, an evidence-based pedagogy for the instructional laboratory, across institutions. Further, the Interactive Online Network of Inorganic Chemists (IONiC) is a CoP committed to sharing instructional content and evidence-based teaching practices among inorganic chemistry faculty members internationally (Watson et al., 2020). Moreover, organically emerging discussions about these topics may be just as important, though additional research is needed to understand specific aspects of these conversations.

When conceptualizing the work of EBIP developers, evaluators, and disseminators, it is important to distinguish between building awareness, trying out EBIPs, and ultimately adopting EBIPs (Landrum et al., 2017). The work herein corroborates prior findings that there may not be a single approach to catalyzing awareness, tryout, and adoption, as these groups are not the same (Viskupic et al., 2022; Yik et al., 2022a). Specifically, results suggest that chemistry faculty members’ mindset about the mutual benefits of using EBIPs may be an initial, important step on the path to adoption. Communities in which chemistry faculty members can discuss instruction/pedagogy and assessment may then be essential for moving individuals beyond the tryout stage.

Further, a number of our findings are distinct from those of studies focused on faculty members from other STEM disciplines or from multiple (though not disaggregated) STEM disciplines. These differences suggest that effective approaches to catalyzing awareness, tryout, and adoption are unique not only to adoption stage but also to discipline. These key differences are highlighted and reemphasized below:

• We found no evidence of an association between chemistry faculty members’ perception of campus climate around teaching and EBIP adoption, though McAlpin et al. (2022) observed an inverse association between STEM faculty members’ perception of campus climate and adoption. When combined with prior research demonstrating that chemistry faculty members’ perception of departmental climate around teaching is associated with EBIP adoption (Connor and Raker, 2023), this finding suggests that chemistry faculty members’ instructional practices are distinctly influenced by their perception of departmental rather than campus climate when compared to faculty members in other STEM disciplines. Influencing chemistry faculty members’ perceptions of departmental rather than institutional climate may, thus, be a more productive approach to increase EBIP use.

• We found no evidence of an association between the number of people with whom a chemistry faculty member discusses course content and EBIP adoption, though in an investigation by Lane et al. (2022), faculty members from multiple STEM disciplines report that their conversations about course content influence their use of EBIPs. Content coverage expectations are a barrier to EBIP adoption (Shadle et al., 2017), and chemistry faculty members report pressure from multiple chemistry communities to cover a breadth of content (Kraft et al., 2023). This finding raises the possibility that chemistry faculty members choose to avoid this topic in conversations with colleagues or that conversations surrounding content coverage are not productive. This issue merits further research, as absent or futile conversations surrounding content coverage may be impeding EBIP adoption in undergraduate chemistry education.

• We found no evidence of an association between chemistry faculty members’ EBIP adoption and the number of people in their department, college, or university with whom they discuss teaching and learning, though in another study targeting faculty members from multiple STEM disciplines (Middleton et al., 2022), EBIP adoption is associated with having both deeper and more extensive social connections within and across departments. This difference suggests that while chemistry faculty members need communities in which topic-specific conversations about teaching can occur for sustained EBIP adoption, unlike in other STEM disciplines, the composition of these communities may not be essential. Chemistry faculty members may, therefore, uniquely benefit from participating in existing chemistry-specific faculty learning communities and CoPs that aim to support EBIP use across institutions. It is further possible that chemistry faculty members have yet to develop extensive, teaching-focused social networks within their departments and institutions when compared to faculty members from other STEM disciplines, potentially due to insufficient reform efforts targeting these individuals. Future research should address underlying causes for this possible lack of association, as expanding these social networks among chemistry faculty members could effectively promote EBIP adoption in undergraduate chemistry education.

Limitations

Three key limitations are important to discuss in the context of the work herein.

First, the social dimensions explored could be strengthened by a social network analysis approach, as has been used in other studies of faculty members’ teaching-specific interactions (Kezar, 2016; Lane et al., 2020; McConnell et al., 2020). In a context where a complete social network can be constructed, a more robust analysis of social influences on outcome measures can be conducted. For example, McAlpin et al. (2022) used indegree from a social network analysis as a measure of opinion leadership when exploring the relationship between cooperative adoption factors and EBIP adoption. However, such a study is limited to smaller sample sizes and fewer departmental/institutional sites; McAlpin et al. (2022) studied only three institutions and five STEM disciplines within each institution. This limits generalizability and the ability to estimate the relationships of interest at a national level, which was the goal of the study herein. An ego network approach offers one possibility for doing such work (Arnaboldi et al., 2012); however, given the broad focus of the study herein, there is a limit to the number of questions a respondent can be expected to answer in a reasonable time period. Results of the study herein suggest points of interest that should be considered in designing and executing a rigorous ego network analysis investigation.
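To illustrate the opinion-leadership measure referenced above, the following is a minimal sketch of computing indegree in a directed teaching-advice network; the roster and edges are hypothetical, not data from McAlpin et al. (2022) or from this study.

```python
# Minimal sketch: indegree as a proxy for opinion leadership in a
# directed teaching-advice network. Each edge points from the advice
# seeker to the colleague consulted; all names and edges are hypothetical.
from collections import Counter

edges = [
    ("A", "B"), ("C", "B"), ("D", "B"),  # B is consulted by three colleagues
    ("B", "E"), ("D", "E"),
    ("E", "A"),
]

# Indegree = number of colleagues who name a person as a discussion partner;
# higher indegree is read as greater opinion leadership.
indegree = Counter(consulted for _seeker, consulted in edges)
nodes = {person for edge in edges for person in edge}
ranking = sorted(nodes, key=lambda person: indegree[person], reverse=True)
print([(person, indegree[person]) for person in ranking])  # B (3) and E (2) rank highest
```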

Second, the work herein overlaps with multiple survey research studies in chemistry and STEM through which a growing number of factors associated with EBIP adoption have been identified (Gibbons et al., 2017; Raker et al., 2021a; Yik et al., 2022a, 2022b). Conducting a single study with sufficient statistical power to evaluate all possible associations with EBIP adoption is infeasible. Still, if the focus of such work is to identify levers for change, it is prudent for future studies to incorporate previously identified factors. For example, several studies show that classroom setup (i.e., classroom space conducive to small group work) is associated with decreased time lecturing and sustained EBIP adoption (Yik et al., 2022a, 2022b). It is, thus, important to account for this potentially confounding factor in studies evaluating the effectiveness of professional development interventions on EBIP adoption.

Third, the distribution of study participants’ EBIP adoption stages may not reflect that of the larger study population, as research suggests lecture-based approaches remain the dominant mode of instruction in undergraduate STEM courses (Stains et al., 2018). In a representative sample, the majority of participants would, thus, likely be in the Awareness or Tryout stages. The relatively small percentage of study participants in the Awareness stage may mean that our data are not representative of all chemistry faculty members in this adoption stage. The smaller number of participants in this stage may have also limited the statistical power and, in turn, the ability to identify factors associated with the tryout of EBIPs using the Tryout – CAFI and Tryout – Social logistic regression models. Additional factors may, therefore, be associated with the tryout of EBIPs, though this investigation is unable to provide evidence of these associations. However, the large number of study participants in the Adoption stage, combined with the moderate number in the Tryout stage, allowed for the identification of multiple factors associated with adopting EBIPs.

Conclusions

Results from a national survey of chemistry faculty members (n = 1105) provide insight into various factors associated with the cooperative adoption of EBIPs in postsecondary chemistry courses. This insight is necessary for change agents (e.g., course coordinators and department leaders) to effectively promote postsecondary chemistry instructional reform, as findings extend beyond those on individual chemistry faculty members and their enacted instructional practices to account for the social context in which change occurs. Results indicate that perceiving the use of EBIPs as mutually beneficial for oneself and one's colleagues is associated with both trying out and adopting EBIPs in postsecondary chemistry courses. Further, discussing pedagogy/instruction and assessment with others is associated with adopting EBIPs, meaning that having formal opportunities for faculty members to specifically discuss these aspects (e.g., in communities of practice) may be important for sustaining change. This study adds to research on factors underlying the cooperative adoption of EBIPs (McAlpin et al., 2022), providing a foundation through which the chemistry education community can more effectively design and support change initiatives; it also helps delineate factors that are significant in the chemistry education context but not in other STEM disciplines or in STEM as a whole, and vice versa.

Author contributions

M. C. C. and J. R. R. conceived the project. M. C. C. and J. R. R. collected the data. M. C. C. conducted the statistical analyses. M. C. C. wrote the paper. M. C. C. and J. R. R. read, edited, and approved the final manuscript.

Conflicts of interest

There are no conflicts to declare.

Appendices

Appendix 1: survey items

EBIP adoption scale (Landrum et al., 2017). What is an EBIP? It is an evidence-based instructional practice or approach that has a demonstrated record of success. That is, there is reliable, valid empirical evidence to suggest that when faculty use EBIPs, student learning is supported, and it is implied that EBIPs are more effective than standard traditional lecture and discussion methods. Active learning techniques are often EBIPs, such as just-in-time teaching, process oriented guided inquiry learning, think-pair-share, cooperative learning, peer instruction, service learning, and many others.
(Response options for each item: Yes / No)

1. Prior to this survey, I already knew about evidence-based instructional practices (EBIPs).
2. I have thought about how to implement EBIPs in my courses.
3. I've spent time learning about EBIPs (e.g., attended workshop, experimented in class, read education literature) and I am prepared to use them.
4. I consistently use EBIPs in my courses.
5. I consistently use EBIPs and I continue to learn about and experiment with new EBIPs.
6. I have evidence that my teaching has improved since I started using EBIPs.

The cooperative adoption factors instrument (McAlpin et al., 2022)

Please rate your level of agreement with these statements. (Response scale for each item: Strongly disagree, Disagree, Somewhat disagree, Neither agree nor disagree, Somewhat agree, Agree, Strongly agree)

1. My department is more effective when all faculty are committed to using EBIPs.

2. My students are likely to be more receptive to an EBIP in my course, if they have had a similar EBIP in another departmental course.

3. I do not use EBIPs because I want students to be exposed to lecture-based teaching.

4. Departments best foster student learning if EBIPs are adopted by all faculty members.

5. It's easier for me to adopt EBIPs if someone else in my department is adopting them because my colleagues serve as a handy resource for me.

6. I am convinced that EBIPs are of little value for student learning.

We are now going to ask you 6 questions about how you perceive yourself in relation to your closest departmental colleague.

1. When something good happens to a close colleague in my department, that is:

2. When something bad happens to a close colleague in my department, that is:

(Response scale for items 1–2: Very bad for me, Bad for me, Somewhat bad for me, Neither good nor bad for me, Somewhat good for me, Good for me, Very good for me)

3. When a close colleague in my department succeeds, I feel:

4. When a close colleague in my department fails, I feel:

(Response scale for items 3–4: Very bad, Bad, Somewhat bad, Neither good nor bad, Somewhat good, Good, Very good)

5. My close departmental colleague's gain is:

6. My close departmental colleague's loss is:

(Response scale for items 5–6: My loss, Neither my gain or my loss, My gain)

For each item, please select the scale point that best represents your opinion. Each statement begins with “I believe that… ” and is anchored by the following opposing statements:

• the campus culture is generally supportive of teaching. ↔ the campus culture is generally unsupportive of teaching.
• the campus culture is shaped by leaders who are not supportive of my teaching. ↔ the campus culture is shaped by leaders who are supportive of my teaching.
• the campus culture breeds divisiveness in teaching discussions. ↔ the campus culture breeds collaboration in teaching discussions.
• the campus culture does not value teaching. ↔ the campus culture values teaching.
• the campus culture connects me with other teachers. ↔ the campus culture isolates me from other teachers.
Items targeting teaching-specific social interactions. This section aims to gather information about the persons with whom you have discussed teaching and learning over the last year. Please create a list (either written down or in your mind) of those persons. If your list has more than five (5) people, please limit your list to the five (5) people with whom you most discuss teaching and learning. In the next questions you will be asked about those people. Please do not provide any identifying information about those people (e.g., their name). We will ask you to provide numbers, letters, or pseudonyms (i.e., a name that is not the person's real name) to represent the people on your list to assist in data collection. The ‘identifier’ you provide is not important to our analyses and will be deleted after data collection has been completed.

1. How many people have you discussed teaching and learning with over the last year?

(Response options: 0, 1, 2, 3, 4, 5, Prefer not to disclose)

2. Please provide a number (e.g., “101”), letter (e.g., “AG”), or pseudonym (e.g., “Susan”; please do not provide their actual name) for each of your people.

3. For each of your people, what is their affiliation?

(Response options, selected for each listed person/identifier: In your department; In your college or university; External to your college or university)

4. Of your people, who has observed your teaching in the last year? (select all that apply)

5. Of your people, whose teaching have you observed in the last year? (select all that apply)

6. To the best of your knowledge, which of your people have been or are currently engaged in STEM education research or the scholarship of teaching and learning? (select all that apply)

7. For each of your people, what topics related to teaching and learning have you discussed in the last year? (select all that apply)

(Topic options, selected for each listed person/identifier: Textbook; Course content; Pedagogy and instruction (e.g., group work, classroom response systems); Assessment (e.g., examinations); Academic dishonesty and cheating)

Appendix 2: demographics of the study sample

Table 8 Summary of the gender and racial distribution of the study sample
Demographic variables Number of faculty members in sample (n = 1105) Percentage of faculty members in sample (%)
Gender
Man 517 46.8
Woman 425 38.5
Nonbinary 12 1.1
Transgender 7 0.6
Prefer not to disclose 54 4.9
No response 90 8.1
Race
American Indian or Alaska Native only 0 0.0
Asian only 56 5.1
Black or African American only 31 2.8
Native Hawaiian or Other Pacific Islander only 0 0.0
White only 822 74.4
Some other race, ethnicity, or origin only 7 0.6
More than one race 29 2.6
Prefer to self-describe 15 1.4
Prefer not to disclose 55 5.0
No response 90 8.1


Appendix 3: additional results from psychometric evaluation of survey responses

Table 9 Factors and items from the CAFI developed by McAlpin et al. (2022), including descriptive statistics, item loadings, and covariance coefficients. Participants responded to items using seven-point Likert and semantic differential scales
Factor Items Mean SD Skewness Kurtosis Loading
^a Item was reverse coded prior to analysis. *p < 0.01, **p < 0.001.
Complements comp1 4.52 1.50 −0.34 2.73 0.81
comp2 5.09 1.24 −0.73 3.76 0.54
comp3^a 5.67 1.45 −1.05 3.36 0.53
comp4 4.45 1.52 −0.39 2.60 0.74
comp5 4.71 1.56 −0.64 2.81 0.41
comp6^a 5.69 1.40 −1.05 3.57 0.65
Interdependence inter1 5.37 1.05 −0.25 2.64 0.82
inter2^a 5.20 0.99 −0.07 2.43 0.50
inter3 6.08 0.85 −1.01 4.69 0.66
inter4^a 5.86 0.83 −0.69 4.26 0.47
inter5 5.48 1.14 −0.16 2.16 0.73
inter6^a 5.44 1.11 −0.15 2.32 0.57
Climate clim2^a 5.09 1.75 −0.76 2.49 0.67
clim3 4.62 1.72 −0.47 2.27 0.74
clim4 4.74 1.58 −0.57 2.67 0.60
clim5 5.23 1.74 −0.89 2.79 0.75
clim6^a 4.49 1.65 −0.33 2.22 0.52
Covariance coefficient
Complements and Interdependence 0.17**
Complements and Climate 0.12*
Interdependence and Climate 0.21**


Table 10 Reliability coefficients for factors in the CAFI. Cronbach's α and McDonald's ω values range from 0.79 to 0.93; values approaching one can be interpreted as good
Factor α ω
Complements 0.79 0.89
Interdependence 0.85 0.93
Climate 0.79 0.83
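
To make the reported coefficients concrete, the following is a minimal sketch of how Cronbach's α (Cronbach, 1951) and McDonald's ω (McDonald, 1999) can be computed. The item scores and factor loadings below are simulated/hypothetical, and the study's reported values were estimated from the fitted factor solution, so the outputs here are illustrative only.

```python
# Minimal sketch: Cronbach's alpha from an item-score matrix and
# McDonald's omega from standardized factor loadings (one factor,
# uncorrelated errors assumed). All data below are simulated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def mcdonald_omega(loadings: np.ndarray) -> float:
    """loadings: standardized loadings on a single factor."""
    common = loadings.sum() ** 2          # variance due to the common factor
    error = (1.0 - loadings ** 2).sum()   # unique/error variance per item
    return common / (common + error)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))                      # one underlying factor
scores = latent + rng.normal(scale=0.8, size=(200, 6))  # six items sharing it
print(round(cronbach_alpha(scores), 2))   # roughly 0.9 for these simulated data

lam = np.array([0.80, 0.70, 0.60, 0.75, 0.50, 0.65])    # hypothetical loadings
print(round(mcdonald_omega(lam), 2))      # roughly 0.83 under these assumptions
```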


Appendix 4: evaluation of unconditional multilevel models

Unconditional models, in which only outcome variables are modeled and not predictor variables, exhibit intraclass correlation coefficients (ICCs) < 0.10. These values indicate that less than 10% of the variation in the outcome variable is accounted for by nesting faculty members within departments and that the data may be accurately modeled at a single level (see Huang and Cornell, 2016). However, it is possible that the ICCs are small given the large number of departments and the comparatively small number of faculty members per department (see Table 11). Two-level models were, thus, specified for the study herein given this possibility. Two-level models are also conceptually appropriate and are used in other investigations of EBIP adoption (e.g., Yik et al., 2022a). ICCs were determined using the procedure in Liu (2016).
Table 11 Summary of unconditional models. The large number of groups and the comparatively small number of faculty members per group potentially resulted in ICCs < 0.10. Two-level models were thus specified for the study herein
Models Number of participants Number of groups (i.e., departments) Minimum number of participants per group Maximum number of participants per group ICC
Tryout (CAFI and Social) 366 290 1 9 <0.01
Adoption (CAFI and Social) 997 550 1 12 0.05
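
To make the procedure concrete, the following is a minimal sketch of one common latent-variable ICC calculation for a two-level logistic model, in which the level-1 residual variance is fixed at π²/3 (consistent with the kind of procedure described in Liu, 2016). The random-intercept variance used here is hypothetical, not the study's estimate.

```python
# Minimal sketch: ICC for a two-level (binary/ordinal) logistic model
# via the latent-variable method. The department-level random-intercept
# variance below is hypothetical, not the study's estimate.
import math

def icc_logistic(random_intercept_var: float) -> float:
    """Share of latent outcome variance attributable to departments."""
    residual_var = math.pi ** 2 / 3  # fixed level-1 variance of the logistic distribution
    return random_intercept_var / (random_intercept_var + residual_var)

print(round(icc_logistic(0.17), 3))  # e.g., a variance of 0.17 yields an ICC of ~0.05
```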


Acknowledgements

The authors would like to thank the faculty members who participated in this study.

References

  1. Abdi H., (2010), Guttman Scaling, in Salkind N. J. (ed.), Encyclopedia of research design, pp. 559–560.
  2. Aktipis A., Cronk L., Alcock J., Ayers J. D., Baciu C., Balliet D., et al., (2018), Understanding cooperation through fitness interdependence, Nat. Hum. Behav., 2(7), 429–431.
  3. Andrews T. C., Conaway E. P., Zhao J. and Dolan E. L., (2016), Colleagues as change agents: How department networks and opinion leaders influence teaching at a single research university, CBE Life Sci. Educ., 15(2), 1–17.
  4. Arnaboldi V., Conti M., Passarella A. and Pezzoni F., (2012), Analysis of ego network structure in online social networks, Proc. 2012 ASE/IEEE Int. Conf. on Privacy, Security, Risk and Trust and 2012 ASE/IEEE Int. Conf. on Social Computing, pp. 31–40.
  5. Boon S. and Holmes J., (1991), The dynamics of interpersonal trust: Resolving uncertainty in the face of risk, in Hinde R. A. and Groebel J. (ed.), Cooperation and prosocial behavior, Cambridge University Press, pp. 190–211.
  6. Bulow J. I., Geanakoplos J. D. and Klemperer P. D., (1985), Multimarket oligopoly: Strategic substitutes and complements, J. Polit. Econ., 93(3), 488–511.
  7. Burt R. S., (1984), Network items and the general social survey, Soc. Netw., 6(4), 293–339.
  8. Committee on Professional Training, (2023), ACS guidelines for undergraduate chemistry programs, American Chemical Society, https://drive.google.com/file/d/1xy8PWWCsrK4ZOUsc9bE6iq9hWS1I5Y1E/view?usp=share_link.
  9. Connor M. C. and Raker J. R., (2023), Measuring the Association of Departmental Climate around Teaching with Adoption of Evidence-Based Instructional Practices: A National Survey of Chemistry Faculty Members, J. Chem. Educ., 100(9), 3462–3476.
  10. Connor M. C., Pratt J. M. and Raker J. R., (2022), Goals for the Undergraduate Instructional Inorganic Chemistry Laboratory When Course-Based Undergraduate Research Experiences Are Implemented: A National Survey, J. Chem. Educ., 99(12), 4068–4078.
  11. Connor M. C., Rocabado G. A. and Raker J. R., (2023), Revisiting faculty members’ goals for the undergraduate chemistry laboratory, Chem. Educ. Res. Pract., 24, 217–233.
  12. Coppola B. P. and Krajcik J. S., (2013), Discipline-centered post-secondary science education research: Understanding university level science learning, J. Res. Sci. Teach., 50(6), 627–638.
  13. Cox B. E., Mcintosh K. L., Reason R. D. and Terenzini P. T., (2011), A Culture of Teaching: Policy, Perception, and Practice in Higher Education, Res. High. Educ., 52, 808–829.
  14. Cronbach L. J., (1951), Coefficient Alpha and the Internal Structure of Tests, Psychometrika, 16(3), 297–334.
  15. Deslauriers L., McCarty L. S., Miller K., Callaghan K. and Kestin G., (2019), Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom, Proc. Natl. Acad. Sci. U. S. A., 1–7.
  16. Dormant D., (2011), The chocolate model of change, Author.
  17. Emery N., Maher J. M. and Ebert-May D., (2021), Environmental influences and individual characteristics that affect learner-centered teaching practices, PLoS One, 16(4), 1–25.
  18. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111(23), 8410–8415.
  19. Gibbons R. E., Laga E. E., Leon J., Villafañe S. M., Stains M., Murphy K., et al., (2017), Chasm Crossed? Clicker Use in Postsecondary Chemistry Education, J. Chem. Educ., 94(5), 549–557.
  20. Gibbons R. E., Villafañe S. M., Stains M., Murphy K. L. and Raker J. R., (2018), Beliefs about learning and enacted instructional practices: An investigation in postsecondary chemistry education, J. Res. Sci. Teach., 55(8), 1111–1133.
  21. Guttman L. A., (1945), Basis for Analyzing Test-Retest Reliability, Psychometrika, 10(4), 255–282.
  22. Henderson C. and Dancy M. H., (2007), Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics, Phys. Rev. Spec. Top. – Phys. Educ. Res., 3(2), 020102.
  23. Herreid C. F. and Schiller N. A., (2013), Case Studies and the Flipped Classroom, J. Coll. Sci. Teach., 42(5), 62–66.
  24. Houseknecht J. B., Bachinski G. J., Miller M. H., White S. A. and Andrews D. M., (2020), Effectiveness of the active learning in organic chemistry faculty development workshops, Chem. Educ. Res. Pract., 21(1), 387–398.
  25. Hu L. T. and Bentler P. M., (1999), Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives, Struct. Equ. Model. A Multidiscip. J., 6(1), 1–55.
  26. Huang F. L. and Cornell D. G., (2016), Using Multilevel Factor Analysis With Clustered Data, J. Psychoeduc. Assess., 34(1), 3–14.
  27. Jackson M. O. and Zenou Y., (2015), Games on networks, Handb. Game Theory with Econ. Appl., 4(1), 95–163.
  28. Kezar A., (2016), Higher Education Change and Social Networks: A Review of Research, J. Higher Educ., 85(1), 91–125 DOI:10.1080/00221546.2014.11777320.
  29. Kraft A., Popova M., Erdmann R. M., Harshman J. and Stains M., (2023), Tensions between depth and breadth: an exploratory investigation of chemistry assistant professors’ perspectives on content coverage, Chem. Educ. Res. Pract., 24(2), 567–576.
  30. Landrum R. E., Viskupic K., Shadle S. E. and Bullock D., (2017), Assessing the STEM landscape: the current instructional climate survey and the evidence-based instructional practices adoption scale, Int. J. STEM Educ., 4(1), 1–10.
  31. Lane A. K., Skvoretz J., Ziker J. P., Couch B. A., Earl B., Lewis J. E., et al., (2019), Investigating how faculty social networks and peer influence relate to knowledge and use of evidence-based teaching practices, Int. J. STEM Educ., 6(1), 1–14.
  32. Lane A. K., McAlpin J. D., Earl B., Feola S., Lewis J. E., Mertens K., et al., (2020), Innovative teaching knowledge stays with users, Proc. Natl. Acad. Sci. U. S. A., 117(37), 22665–22667.
  33. Lane A. K., Earl B., Feola S., Lewis J. E., McAlpin J. D., Mertens K., et al., (2022), Context and content of teaching conversations: exploring how to promote sharing of innovative teaching knowledge between science faculty, Int. J. STEM Educ., 9(1), 1–16.
  34. Leontyev A., Houseknecht J. B., Maloney V., Muzyka J. L., Rossi R., Welder C. O. and Winfield L., (2020), OrganicERs: Building a Community of Practice for Organic Chemistry Instructors through Workshops and Web-Based Resources, J. Chem. Educ., 97(1), 106–111.
  35. Li C. H., (2016), Confirmatory factor analysis with ordinal data: Comparing robust maximum likelihood and diagonally weighted least squares, Behav. Res. Methods, 48(3), 936–949.
  36. Liu X., (2016), Applied Ordinal Logistic Regression Using Stata, SAGE Publications.
  37. López N., Morgan D. L., Hutchings Q. R. and Davis K., (2022), Revisiting critical STEM interventions: a literature review of STEM organizational learning, Int. J. STEM Educ., 9(1), 1–14.
  38. Lund T. J. and Stains M., (2015), The importance of context: an exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty, Int. J. STEM Educ., 2(1), 1–21.
  39. Ma S., Herman G. L., Tomkin J. H., Mestre J. P. and West M., (2018), Spreading Teaching Innovations in Social Networks: the Bridging Role of Mentors, J. STEM Educ. Res., 1(1–2), 60–84.
  40. McAlpin J. D., Ziker J. P., Skvoretz J., Couch B. A., Earl B., Feola S., et al., (2022), Development of the Cooperative Adoption Factors Instrument to measure factors associated with instructional practice in the context of institutional change, Int. J. STEM Educ., 9, 48.
  41. McConnell M., Montplaisir L. and Offerdahl E., (2020), Meeting the Conditions for Diffusion of Teaching Innovations in a University STEM Department, J. STEM Educ. Res., 3(1), 43–68.
  42. McDonald R. P., (1999), Test Theory: A Unified Treatment, Lawrence Erlbaum Associates.
  43. McIver J. P. and Carmines E. G., (1981), An introduction to Guttman scaling, Unidimensional Scaling, Sage Publications, pp. 41–66.
  44. Mestre J. P., Herman G. L., Tomkin J. H. and West M., (2019), Keep Your Friends Close and Your Colleagues Nearby: The Hidden Ties that Improve STEM Education, The Magazine of Higher Learning, 51(1), 42–49 DOI:10.1080/00091383.2019.1547081.
  45. Middleton J. A., Krause S., Judson E., Ross L., Culbertson R., Hjelmstad K. D., et al., (2022), A Social Network Analysis of Engineering Faculty Connections: Their Impact on Faculty Student-Centered Attitudes and Practices, Educ. Sci., 12(2), 108.
  46. Miller G. A., (1956), The magical number seven, plus or minus two: some limits on our capacity for processing information, Psychol. Rev., 63(2), 81–97.
  47. Missett T. C. and Foster L. H., (2015), Searching for Evidence-Based Practice: A Survey of Empirical Studies on Curricular Interventions Measuring and Reporting Fidelity of Implementation, J. Adv. Acad., 26(2), 96–111.
  48. Moog R. S., Creegan J. F., Hanson M. D., Spencer N. J., Straumanis A. and Bunce M. D., (2009), POGIL: Process-oriented guided-inquiry learning, in Pienta N., Cooper M. M. and Greenbowe T. J. (ed.), Chemists’ Guide To Effective Teaching, Prentice Hall, pp. 90–101.
  49. National Center for Education Statistics, (2021), Integrated Postsecondary Education Data System.
  50. National Research Council, (2012), in Singer S., Nielsen N. and Schweingruber H. (ed.), Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, National Academies Press.
  51. Raker J. R., Pratt J. M. and Watson L. A., (2020), Building Community: A Reflection on the Interactive Online Network of Inorganic Chemists, ACS Symp. Ser., 1370, 131–139.
  52. Raker J. R., Dood A. J., Srinivasan S. and Murphy K. L., (2021a), Pedagogies of engagement use in postsecondary chemistry education in the United States: results from a national survey, Chem. Educ. Res. Pract., 22(1), 30–42.
  53. Raker J. R., Dood A. J., Srinivasan S. and Murphy K. L., (2021b), Pedagogies of engagement use in postsecondary chemistry education in the United States: Results from a national survey, Chem. Educ. Res. Pract., 22(1), 30–42.
  54. Raudenbush S. W. and Bryk A. S., (2002), Hierarchical linear models: Applications and data analysis methods, Sage Publications.
  55. Raykov T. and Marcoulides G. A., (1999), On desirability of parsimony in structural equation model selection, Struct. Equ. Model. A Multidiscip. J., 6(3), 292–300.
  56. Revelle W., (2018), psych: Procedures for Personality and Psychological Research.
  57. Rosseel Y., (2012), lavaan: An R Package for Structural Equation Modeling.
  58. Shadle S. E., Marker A. and Earl B., (2017), Faculty drivers and barriers: laying the groundwork for undergraduate STEM education reform in academic departments, Int. J. STEM Educ., 4(1), 1–13.
  59. Shi L. and Stains M., (2021), Development of the Departmental Climate around Teaching (DCaT) survey: neither psychological collective climate nor departmental collective climate predicts STEM faculty's instructional practices, Int. J. STEM Educ., 8(1), 1–20.
  60. Simon H. A., (1974), How Big Is a Chunk? Science, 183(4124), 482–488.
  61. Srinivasan S., Gibbons R. E., Murphy K. L. and Raker J., (2018a), Flipped classroom use in chemistry education: results from a survey of postsecondary faculty members, Chem. Educ. Res. Pract., 19(4), 1307–1318.
  62. Srinivasan S., Gibbons R. E., Murphy K. L. and Raker J., (2018b), Flipped classroom use in chemistry education: Results from a survey of postsecondary faculty members, Chem. Educ. Res. Pract., 19(4), 1307–1318.
  63. Stains M. and Vickrey T., (2017), Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices, CBE Life Sci. Educ., 16(1), 1–11.
  64. Stains M., Harshman J., Barker M. K., Chasteen S. V., Cole R., DeChenne-Peters S. E., et al., (2018), Anatomy of STEM teaching in North American universities, Science, 359(6383), 1468–1470.
  65. StataCorp, (2021), Stata Statistical Software: Release 17.
  66. Stone K. L., Kissel D. S., Shaner S. E., Grice K. A. and Van Opstal M. T., (2020), Forming a Community of Practice to Support Faculty in Implementing Course-Based Undergraduate Research Experiences, ACS Symp. Ser., 1371, 35–55.
  67. Sturtevant H. and Wheeler L., (2019), The STEM Faculty Instructional Barriers and Identity Survey (FIBIS): development and exploratory results, Int. J. STEM Educ., 6(1), 1–22.
  68. Theobald E. J., Hill M. J., Tran E., Agrawal S., Arroyo E. N., Behling S., et al., (2020), Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math, Proc. Natl. Acad. Sci. U. S. A., 117(12), 6476–6483.
  69. Viskupic K., Earl B. and Shadle S. E., (2022), Adapting the CACAO model to support higher education STEM teaching reform, Int. J. STEM Educ., 9(1), 1–20.
  70. Walter E. M., Beach A. L., Henderson C., Williams C. T. and Ceballos-Madrigal I., (2021), Understanding Conditions for Teaching Innovation in Postsecondary Education: Development and Validation of the Survey of Climate for Instructional Improvement (SCII), Int. J. Technol. Educ., 4(2), 166–199.
  71. Watson L. A., Bentley A. K., Eppley H. J. and Lin S., (2020), Building an Online Community of Practice for the Evolution of Effective, Evidence-Based Teaching Practices: 15 Years of Improving Inorganic Chemistry Education, ACS Symp. Ser., 1371, 127–142.
  72. Worthington R. L. and Whittaker T. A., (2006), Scale Development Research: A Content Analysis and Recommendations for Best Practices, Couns. Psychol., 34(6), 806–838.
  73. Yik B. J., Raker J. R., Apkarian N., Stains M., Henderson C., Dancy M. H. and Johnson E., (2022a), Association of malleable factors with adoption of research-based instructional strategies in introductory chemistry, mathematics, and physics, Front. Educ., 7, 1–21.
  74. Yik B. J., Raker J. R., Apkarian N., Stains M., Henderson C., Dancy M. H. and Johnson E., (2022b), Evaluating the impact of malleable factors on percent time lecturing in gateway chemistry, mathematics, and physics courses, Int. J. STEM Educ., 9(1), 1–23.
  75. Zammit K. M., Connor M. C. and Raker J. R., (2023), Evaluating the level of inquiry in postsecondary instructional laboratory experiments: results of a national survey, Chem. Educ. Res. Pract.

This journal is © The Royal Society of Chemistry 2024