Effectiveness of the Active Learning in Organic Chemistry faculty development workshops

Justin B. Houseknecht *a, Garrin J. Bachinski a, Madelyn H. Miller b, Sarah A. White c and Douglas M. Andrews d
aDepartment of Chemistry, Wittenberg University, 200 W Ward St, Springfield, OH 45504, USA. E-mail: jhouseknecht@wittenberg.edu
bImmunity and Pathogenesis Division, Burnett School of Biomedical Sciences, University of Central Florida, Orlando, Florida 32827, USA
cHealth Policy and Management Department, Johns Hopkins School of Public Health, 615 N. Wolfe Street, Baltimore, MD 21205, USA
dDepartment of Math and Computer Science, Wittenberg University, 200 W Ward St, Springfield, OH 45504, USA

Received 18th June 2019, Accepted 4th November 2019

First published on 6th November 2019


Abstract

Active learning has been shown to improve student outcomes and learning, yet organic chemistry instructors have been slow to adopt these pedagogies. The Chemistry Collaborations, Workshops, and Communities of Scholars (cCWCS) Active Learning in Organic Chemistry (ALOC) workshops have sought to facilitate the adoption of active learning methods by helping participants define active learning and understand best practices, persuading them to incorporate these practices into their teaching, and supporting their implementation efforts through an online community, Organic Education Resources (http://OrganicERs.org). The effectiveness of the workshops was measured over a two-year period using teaching self-efficacy and teaching practices instruments. Comparison to pre-workshop self-efficacy surveys found significant and sustained gains in knowledge about and belief in the efficacy of active learning methods (d = 1.18 compared to pre-workshop responses) and in confidence in intention to implement (d = 0.60). Participants' belief that they were implementing more active learning in their classrooms (d = 0.85) was corroborated by the teaching practices survey and the survey of class time allocation, which also showed statistically significant (p < 0.001) and sustained growth in student-centered teaching (d = 1.00), formative assessment (d = 1.04), student–student interactions (d = 0.96), and the amount of class time spent with students working in groups (d = 0.68). Gains for participants in the 3 hour Active Learning in Organic Chemistry workshops at the 2016 Biennial Conference on Chemical Education (BCCE) were smaller than those in the 4 day ALOC workshops, but still meaningful. These results indicate that the 2015 and 2016 Active Learning in Organic Chemistry faculty development workshops effectively increased participants’ knowledge about, belief in the efficacy of, and implementation of active learning methods.


Introduction

It has become increasingly clear in the last several years that in the STEM disciplines generally, and chemistry specifically, active learning methods produce student outcomes superior to traditional lecture: higher exam scores (Freeman et al., 2014; Warfa, 2016; Wilson and Varma-Nelson, 2016; Apugliese and Lewis, 2017; Crimmins and Midkiff, 2017; Rau et al., 2017), improved performance on concept inventories (Freeman et al., 2014), higher success rates (Paulson, 1999; Freeman et al., 2014; Mooring et al., 2016; Warfa, 2016; Wilson and Varma-Nelson, 2016; Crimmins and Midkiff, 2017), and often improved attitudes toward the discipline (Mooring et al., 2016; Cam and Omer, 2017; Rau et al., 2017; Vishnumolakala et al., 2017), though alternative explanations (e.g., selection bias) have been proposed for some results (Chan and Bauer, 2015). Active learning is a term that has been used to describe a broad range of evidence-based teaching methods in which students participate in and contribute to class sessions rather than merely observing and taking notes. Most methods are grounded in social constructivism, which posits that students develop meaning best through interaction with peers (Palincsar, 1998). Common active learning methods include clearly defined methods such as Process Oriented Guided Inquiry Learning (POGIL; Moog and Spencer, 2008), Peer Instruction (PI; Mazur, 1997), Problem-Based Learning (PBL; Duch et al., 2001), and Peer-Led Team Learning (PLTL; Gosser et al., 2000) as well as more generic approaches such as the flipped classroom, the use of classroom response systems, and a plethora of collaborative learning techniques (Barkley et al., 2005).

Choice overload from the sheer variety of methods that have been shown to be effective may be part of the reason that so few chemists, outside of chemical education researchers, have implemented active learning methods in their classrooms. One recent survey of more than 800 chemistry faculty found that only 12.9% self-report using a flipped approach (“primary content delivery mode occurs outside of the classroom and the application of content occurs inside the classroom”) at least weekly in their courses (Srinivasan et al., 2018). This result is consistent with a cross-disciplinary study of more than 2000 classes, which found that chemistry lags significantly behind the STEM average of 18% of “instructors who incorporate student-centered strategies into large portions of their classes” (Stains et al., 2018). It is also true that the vast majority of chemistry instructors have little experience with active learning – either as students or instructors – and have already invested substantial time and energy becoming proficient at delivering content through lecture. Resistance, particularly from senior colleagues, may be lessening but remains an impediment to change for many chemistry instructors.

Theoretical framework

The Teacher-Centered Systematic Reform (TCSR) Model recognizes that instructional transformation is facilitated by three interrelated factors: instructor knowledge and beliefs about teaching, personal factors including experience, and contextual factors (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Gibbons et al. (2018) have recently demonstrated the impact of instructor beliefs and self-efficacy on teaching practice in a survey of over 1200 chemistry instructors at a broad range of US institutions. Instructor experience, particularly professional development, is a crucial factor for instructional change in the TCSR Model (Fullan and Stiegelbauer, 1991). Contextual factors include: student and classroom characteristics, departmental and disciplinary support, textbook and technology availability, and administrative support (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). The TCSR Model is a helpful framework for understanding the factors required for instructional transformation and their interdependence, but does not describe the process by which transformation occurs.

Rogers’ diffusion of innovations theory describes the innovation decision making process as comprising five stages: knowledge, persuasion, decision, implementation, and finally confirmation (Rogers, 2003). The foundational role of knowledge and persuasion in Rogers’ theory is consistent with the key factors of the TCSR Model, particularly the importance of instructor beliefs and self-efficacy (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Indeed, educational reformers working within the TCSR Model framework have demonstrated that the change process begins with growth in instructor knowledge and beliefs about teaching (Windschitl and Sahl, 2002; Bauer et al., 2013). Henderson et al. (2012) found that presentations and workshops, particularly the Physics and Astronomy New Faculty Workshop, effectively moved physics faculty through the first three stages of the decision-making process (knowledge, persuasion, and decision), but needed to do more to support faculty during the implementation and confirmation stages. This is consistent with their finding that effective educational change strategies involve two or more of the following: extended contact (a month or more), performance evaluation and feedback, and a deliberate focus on changing faculty attitudes and beliefs (Henderson et al., 2011). These findings suggest that typical faculty development workshops struggle to promote the desired long-term change in teaching practice because facilitators do not have meaningful contact with participants – contact that supports implementation and confirmation – after the conclusion of the workshop. A recent report by Manduca et al. (2017) in geosciences education suggests that online communities of practice can provide effective support for the implementation of student-engaged teaching practices, which leads to positive confirmation and sustained use of the desired teaching practices.

The theoretical framework for this quantitative study uses the TCSR Model to understand the factors that facilitate instructional transformation and their interconnectivity. Rogers’ diffusion of innovations theory is used to understand the process of instructional transformation, particularly the role that faculty development workshops can play in moving instructors through the process and supporting their post-workshop activities.

Workshop assessment in chemistry

Discipline- and sub-discipline-specific faculty development workshops on active learning have been available in chemistry since at least the early 2000s. Despite the prevalence of these workshops, there are relatively few reports concerning their effectiveness that go beyond characterizing the workshops, their participants, and their participants’ opinions of the workshop. Notable exceptions include the Core Collaborators Workshops (CCW) for biochemistry (Murray et al., 2011), POGIL-PCL workshops for physical chemistry laboratory (Stegall et al., 2016), and the Cottrell Scholars Collaborative New Faculty Workshop (CSC-NFW) for chemistry faculty at R1 institutions (Baker et al., 2014; Stains et al., 2015). The CCW and POGIL-PCL workshops focused on POGIL methods whereas the CSC-NFW introduced participants to a variety of evidence-based instructional practices (EBIPs). These studies used primarily self-reported survey results to measure confidence and rates of implementation in the year following workshop attendance, despite evidence that self-reported data are not the best measure of actual teaching practice (Kane et al., 2002; D’Eon et al., 2008; Ebert-May et al., 2011) and that a significant fraction of faculty discontinue use of evidence-based instructional practices after a few uses (Henderson et al., 2012). Most surveys were developed in-house, though the CSC-NFW also used validated surveys to measure teaching beliefs and teaching efficacy. The CSC-NFW studies are also noteworthy for inclusion of a control group and an attempt to assess changes in teaching practice using COPUS (Smith et al., 2013) analysis of video-recorded class sessions the semester after and two years after workshop attendance. Unfortunately, with only 22 of 81 workshop participants submitting the first recording and only 3 of those submitting the second, it is difficult to argue that the COPUS results provide a meaningful analysis of workshop effectiveness. More recent reports that observation of multiple class periods is required for accurate classification of teaching style further challenge the direct observation approach (Stains et al., 2018).

Active learning in organic chemistry workshops

Organic Education Resources (OrganicERs) was founded in 2012 to promote the use of evidence-based instructional practices in the teaching of organic chemistry. Members of the Leadership Board, with support from Chemistry Collaborations, Workshops, and Communities of Scholars (cCWCS), developed http://OrganicERs.org in 2013 (OrganicERs, 2013). This website, now in conjunction with a private Facebook group (OrganicERs, 2015), continues to host the online community of practice where pedagogical and curricular best practices are shared and discussed. With the support of cCWCS, OrganicERs also hosted 3 day, 4 day, and 3 hour Active Learning in Organic Chemistry workshops from 2013 to 2018. The community of practice and workshops have been described in greater detail elsewhere (Leontyev et al., 2019). The purpose of this project is to assess the effectiveness of the two 4 day (ALOC) workshops and two 3 hour (BCCE) workshops held in 2015 and 2016.

Twenty-four participants attended the 2015 4 day (ALOC) workshop in Washington, DC and 22 attended the 2016 ALOC workshop in Cincinnati, OH. Most participants taught at 4 year institutions (62%), though community college instructors (20%) and faculty at schools with graduate programs (18%) were also well represented. The workshop facilitators (5 each year) used active learning methods extensively throughout the workshop such that participants gained experience with the methods as students and as instructors. The workshops began with introductions (see Appendix 1) and the acknowledgement that personal factors and experience do impact our ability to transform instructional practice (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). The remainder of the first evening was used to provide an overview of the active learning methods introduced in the workshop, the theoretical and empirical evidence supporting their efficacy, and the concept of backward design (Wiggins and McTighe, 2006). These sessions were designed to provide a compelling rationale for implementation of active learning methods. Each of the following days began with workshop facilitators describing how they make use of particular EBIPs (Fig. 1) in their own teaching. These sessions were designed to inform participants of the underlying theory and best practices for each method as well as to let them experience the methods as students. Afternoon sessions were more practical, with introductions to technology (Fig. 1), opportunities to interact with it, and time for participants to design and receive feedback on instructional modules. Workshop facilitators had extensive experience with the EBIPs and technologies introduced and were therefore able to emphasize their compatibility with organic chemistry instruction. The workshops also included a session on summative assessment to help participants understand best practices and to prepare them for the confirmation stage of Rogers’ diffusion of innovations theory.


Fig. 1 Primary evidence-based instructional practices and technologies introduced at the 2015 and 2016 4-day ALOC workshops.

A total of 40 participants attended two 3 hour workshops at the 2016 Biennial Conference on Chemical Education (BCCE) at the University of Northern Colorado. These workshops used Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering to provide an overview of evidence-based instructional practices (Kober, 2014). The philosophy of these workshops was similar to that of the 4 day workshops, but the schedule (Appendix 2) was highly compressed. Participants at the BCCE workshops taught primarily at 4 year institutions (57%), but there were more faculty from schools with graduate programs (24%) than at the ALOC workshops. Five percent of participants at the BCCE workshops taught at high schools and the remaining 19% taught at community colleges.

Research questions

Effectiveness was defined as increasing participants’ knowledge about and belief in the efficacy of active learning, their proficiency with active learning methods, and their implementation of active learning methods in their classrooms. Our research questions are:

RQ1. Will participants’ knowledge about and belief in the efficacy of active learning increase significantly after the workshop and, if so, will those changes be sustained?

RQ2. Will participants’ confidence with active learning methods increase significantly after the workshop and, if so, will those changes be sustained?

RQ3. Will participants’ implementation of active learning methods increase significantly after the workshop and, if so, will those changes be sustained?

RQ4. Will the 4 day (ALOC) workshops be significantly more effective than the 3 hour (BCCE) workshops?

Effectiveness was measured by comparison of pre- and post-workshop results on self-efficacy and teaching practices surveys, all self-reported by workshop participants. Self-efficacy refers to “an individual's belief in his or her capacity to execute behaviors necessary to produce specific performance attainments” (Bandura, 1999). The self-efficacy survey used in this study was adapted from validated teaching self-efficacy surveys (Prieto-Navarro, 2005; Chang et al., 2010). The teaching practices survey was the Postsecondary Instructional Practices Survey (PIPS), scored using the authors’ five-factor model (Walter et al., 2016). This instrument, too, has been shown to provide valid and reliable measures of postsecondary teaching practice, particularly for introductory-level courses in the sciences. Limited resources and time between workshop enrollment and participation precluded alternative measures of active learning implementation such as student evaluations of teaching and classroom observation of teaching.

Methods

All the instruments administered to workshop participants were created and distributed through SurveyMonkey (SurveyMonkey, 2019). Administration of all instruments was approved by the Wittenberg University IRB (electronic communication, IRB 082-201516, IRB 107-201617, IRB 080-201718).

The self-efficacy items (Fig. 2) were adapted for organic chemistry from the previously validated Faculty Teaching Efficacy survey (Prieto-Navarro, 2005) and the College Teaching Self-Efficacy Scale (Chang et al., 2010). All responses were on a five-point Likert scale from “1 – unsure of what this is” to “5 – very confident”. Participants in the 3 hour workshops (BCCE) received an abbreviated survey that did not include questions about classroom technologies (Q10–18) as these topics were not discussed in the BCCE workshops. The self-efficacy survey was administered to workshop participants one week prior to the workshop, one week after the workshop, and every six months thereafter through 24 months. It was also administered to a control group composed of participants’ departmental colleagues twice, two weeks apart, then 6 months and 12 months thereafter. The Postsecondary Instructional Practices Survey (PIPS) was administered to workshop participants one week after the workshop and every six months thereafter through 24 months (Walter et al., 2016). It was discovered late in the data analysis that item 10, “I structure class so that students explore or discuss their understanding of new concepts before formal instruction”, was accidentally omitted from early administrations of PIPS. This item therefore could not be included in the analysis. Participants who did not respond to the initial request to complete a survey were reminded 2–3 additional times with personal e-mails.


image file: c9rp00137a-f2.tif
Fig. 2 Items on the self-efficacy survey. Response options were: (1) unsure of what this is, (2) not confident at all, (3) not very confident, (4) somewhat confident, (5) very confident.

Because there are distinct cohorts of participants (ALOC and BCCE) and each participant gave a quantitative response to each of several items/questions at each of several time periods, the data have a “repeated measures” structure with one between-blocks factor and two crossed within-blocks factors. The participants themselves are the blocks. The lone between-blocks factor is the cohort, and it has two levels, corresponding to the two workshop types. One of the within-blocks factors is the item/question, corresponding to the measurements recorded on each participant at each period. The other within-blocks factor is the time period, and it has six levels – though the latter five periods were occasionally aggregated to facilitate overall pre-workshop vs. post-workshop comparisons.
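For concreteness, the sketch below shows how data with this structure can be organized in long format in R, the language used for all analyses in this study. The variable names and values are illustrative only, not the actual columns of our data files.

# Hypothetical long-format layout for the repeated measures design:
# one row per participant x item x period. Participants are the blocks;
# cohort is the between-blocks factor; period and item are within-blocks.
responses <- data.frame(
  participant = factor(c("P01", "P01", "P01", "P02", "P02", "P02")),
  cohort      = factor(c("ALOC", "ALOC", "ALOC", "BCCE", "BCCE", "BCCE")),
  period      = factor(c("pre", "1wk", "6mo", "pre", "1wk", "6mo"),
                       levels = c("pre", "1wk", "6mo", "12mo", "18mo", "24mo")),
  item        = factor("Q1"),
  response    = c(3, 4, 4, 2, 4, 5)   # five-point Likert responses
)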

There are a total of 44 items/questions. The 20 self-efficacy questions are separated into five groups: Knowledge, OrganicERs, Intent, Proficiency, and Implementation. The 24 items on the PIPS instrument were divided into five subscales and Time Allocation (four options). These 14 groups of items/questions were analyzed separately, mostly in a series of 14 repeated measures analyses as described above.

Exploratory data analysis included visualizations of the main effects and interaction effects of the cohort, period, and item/question factors on the participants’ responses. Formal repeated measures analysis of variance (ANOVA) modeling and inferential testing then showed which combination of these factors was most relevant in explaining the responses. The formal analysis was followed up with pairwise comparisons (using Tukey's adjustment to minimize the risk of false positive differences) to determine which cohort, periods, and items/questions had different responses, and by how much. The effect sizes were expressed both on the original scale and on a standardized scale using Cohen's d (Cohen, 1988), and were expressed both with single-number estimates and with confidence intervals. All analysis was performed using R (R Core Team, 2018) via RStudio (RStudio Team, 2018).
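As an illustration, one of these repeated measures analyses could be specified in R roughly as follows. This is a minimal sketch assuming the long-format data frame above (with the full dataset; the toy frame is too small to fit) and the emmeans package for the Tukey-adjusted pairwise comparisons; it is not a verbatim excerpt of our scripts.

# Repeated measures ANOVA: participant is the blocking (Error) unit,
# cohort is between-blocks, period and item are within-blocks.
fit <- aov(response ~ cohort * period * item + Error(participant),
           data = responses)
summary(fit)

# Tukey-adjusted pairwise comparisons among periods.
library(emmeans)
pairs(emmeans(fit, ~ period), adjust = "tukey")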

Effect sizes in this report were calculated using Cohen's d, which can be understood as the number of standard deviations by which a measurement changed (Cohen, 1988). Changes of 20% of a standard deviation (d = 0.2) are generally considered small, changes of d = 0.5 are considered moderate, and changes greater than d = 0.8 are generally considered large. Effect sizes can also be calculated from analysis of variance (ANOVA) studies by determining the percentage of variance explained by group membership. These are often reported as eta-squared values; values less than 1% (η2 < 0.01) are generally considered small, values near η2 = 0.06 are considered moderate, and values of η2 > 0.14 are considered large (Cohen, 1988).
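Both effect size measures can be computed directly from their standard formulas; the short R functions below are illustrative sketches, not our analysis code.

# Cohen's d: mean difference in units of the pooled standard deviation.
cohens_d <- function(x, y) {
  nx <- length(x); ny <- length(y)
  s_pooled <- sqrt(((nx - 1) * var(x) + (ny - 1) * var(y)) / (nx + ny - 2))
  (mean(y) - mean(x)) / s_pooled
}

# Eta-squared from a one-way ANOVA: SS_between / SS_total, i.e. the
# proportion of total variance explained by group membership.
eta_squared <- function(response, group) {
  ss <- summary(aov(response ~ group))[[1]][["Sum Sq"]]
  ss[1] / sum(ss)
}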

Results and discussion

The validity of the self-efficacy survey is supported by several factors. First, the structure of the survey and each item was adapted from previously validated instruments (Prieto-Navarro, 2005; Chang et al., 2010). Second, content validity is supported because each item was derived from the workshop objectives written by the facilitators. Test–retest reliability of the self-efficacy survey was assessed using a control group that did not attend the workshops. Participants in the 4 day ALOC workshops recruited chemistry colleagues who completed the self-efficacy survey twice, two weeks apart. Comparison of the first two administrations for 26 control subjects provided a Pearson correlation coefficient of 0.85, a standard error of measurement of 0.58, and a paired t-test that detected no significant difference between administrations (p = 0.86). The average corrected effect size was small (g = 0.20) and the largest was medium (g = 0.47, Q9, Fig. 2) using Cohen's guidelines (Cohen, 1988). Standard errors of measurement for individual items ranged from 0.3 to 0.9 with an average of 0.54 (±0.14). These results indicate that the self-efficacy survey produced reliable data and valid inferences for use with chemistry faculty despite being adapted from self-efficacy instruments originally designed for use across the university with non-native English speakers.
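A minimal sketch of these test–retest calculations in R, assuming vectors t1 and t2 hold each control subject's scores on the first and second administrations (the names are illustrative):

# Pearson correlation between the two administrations.
r <- cor(t1, t2)

# Standard error of measurement, using the correlation as the
# reliability estimate: SEM = SD * sqrt(1 - reliability).
sem <- sd(t1) * sqrt(1 - r)

# Paired t-test for a systematic shift between administrations.
t.test(t1, t2, paired = TRUE)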

Coefficient α-values were calculated for each of the five factors on the PIPS (Table 1) to determine its internal consistency for the ALOC and BCCE cohorts (a minimal sketch of this calculation follows Table 1). All coefficient α-values were comparable to, and generally greater than, those reported by Walter et al. (2016), with the exception of the content delivery factor for the ALOC cohort, which was marginally lower. Walter et al. (2016) established the validity of the PIPS instrument, in part, with both exploratory and confirmatory factor analysis. We were not able to perform confirmatory factor analysis of the PIPS instrument with our population due to small sample size.

Table 1 PIPS 5-factor reliability scores
Factor # Items ALOC α BCCE α Lit αa
a Walter et al. (2016). b Item 10 omitted.
Content delivery 4 0.617 0.728 0.644
Student centered 5 0.854 0.925 0.606
Formative assessment 5 0.760 0.864 0.641
Student–student interactions 6 0.869b 0.971b 0.825
Summative assessment 4 0.457 0.509 0.447
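As promised above, a minimal sketch of the coefficient α calculation for one PIPS factor, assuming the psych package and a data frame pips_items whose columns are the item responses loading on that factor (both names are illustrative):

library(psych)
# Coefficient (Cronbach's) alpha for the items in one factor.
alpha(pips_items)$total$raw_alpha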


Response rates for the self-efficacy and instructional practices surveys for the participants in the 4 day ALOC workshops were above 50% for all administrations, with an average of 74% (Table 2). The 3 hour BCCE workshop participant response rates were lower, with a low of 38% and an average of 52%. Response rates tended to be lowest for the 18 month surveys, which were administered during the academic year, and decreased slightly over time. These response rates are similar to those reported for the CSC-NFW (59%, Baker et al., 2014) and POGIL-PCL (69%, Stegall et al., 2016) workshops. The CCW had remarkably higher response rates (90% of attendees at their second workshop), though only 18 of 24 total participants responded (Murray et al., 2011). It is not unexpected that the highest response rates are for cohorts that attended multiple multi-day workshops (CCW) and the lowest for those that attended only a 3 hour workshop (BCCE).

Table 2 Survey response rates (%)
Period Control (N = 26) 2015 ALOC (N = 24) 2016 ALOC (N = 22) 2016 BCCE (N = 40)
Pre-workshop 100 100 100 83
1 week 100 88 96 53
6 months 27 58 73 60
12 months 31 63 64 63
18 months – 63 55 38
24 months – 71 64 48


Knowledge, beliefs, and intent

Rogers’ diffusion of innovations theory recognizes that before instructors implement new innovations they must understand the innovation, be persuaded that it could be beneficial, and make the decision to implement it (Rogers, 2003). The Teacher-Centered Systematic Reform Model also recognizes that contextual factors, such as disciplinary support (e.g., OrganicERs), are important influences on the success of implementation (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Several items on the self-efficacy survey probed workshop participants’ confidence in their understanding of active learning (Q1), belief that it is effective (Q2), the ability of OrganicERs to support their implementation efforts (Q19 and Q20), and confidence in their intent to implement more active learning methods in their teaching (Q4). Fig. 3 shows how average responses to Questions 1 and 2 (Knowledge), Questions 19 and 20 (OrganicERs), and Question 4 (Intent) changed over time for the 4 day ALOC and 3 hour BCCE workshop participants.
Fig. 3 Average responses to self-efficacy questions by both ALOC and BCCE participants. Similar items were condensed into Knowledge (Q1, Q2), OrganicERs (Q19, Q20), Intent (Q4), and Implementation (Q3, Q5–8).

Participants at both ALOC and BCCE workshops reported significant and sustained knowledge gains after attendance. Repeated measures ANOVA modeling showed that period (F(5,592) = 35.2, p < 0.001) and workshop type (F(1,592) = 8.7, p = 0.003) were important factors. Self-reported knowledge differences among the post-workshop responses were not statistically significant (p > 0.25), but the difference between the average pre-workshop response and the average post-workshop response (+0.59) was significant (p < 0.001) with a very large effect size of more than a standard deviation (d = 1.18), as summarized in Table 3. Participants at the 4 day ALOC workshops did report somewhat greater knowledge gains (+0.12, d = 0.23) than participants at the 3 hour BCCE workshops, due partially to a lower average pre-workshop knowledge and partially to a higher post-workshop average. Stains et al. (2015) also studied the knowledge and beliefs of the CSC-NFW participants before and after their workshop. They found no significant change in the number of evidence-based instructional practices (EBIPs) their control group knew, but a large change (from 8 to 14 EBIPs) for workshop participants immediately after the workshop. This change was sustained at 12 months, when participants reported knowing 15 EBIPs. Scores on the student-centered scale of the Approaches to Teaching Inventory (ATI) also increased slightly (effect size, r = 0.23) immediately after the CSC-NFW, but returned to pre-workshop levels one year later. It is not surprising that college faculty retain knowledge about teaching practices for a year (CSC-NFW) or two (ALOC and BCCE). The large and sustained increases we observed in ALOC and BCCE participants’ belief that active learning methods are effective (Q2), however, are surprising. It is possible that this belief was sustained due to adequate support for implementation within the OrganicERs community and/or confirmation of effectiveness due to high levels of implementation.

Table 3 Effect of period and workshop type on self-reported gains
Period (all post-workshop vs. pre-workshop) Workshop (ALOC vs. BCCE)
Cohen's d p Cohen's d p
a Comparison of only the 1 week post-workshop to the pre-workshop responses.
Knowledge 1.18 <0.001 0.23 0.003
OrganicERs 1.02 <0.001 0.74 <0.001
Intent 0.60a <0.001a 0.32 <0.001
Proficiency 0.78 <0.001 N/A N/A
Implementation 0.85 <0.001 0.15 <0.001
Student-centered 1.00 <0.001 0.59 <0.001
Formative assessment 1.04 <0.001 0.60 <0.001
Interactive 0.96 <0.001 0.38 0.007
% Lecture 0.77 <0.001 0.28 0.038
% Small Groups 0.68 <0.001 0.23 0.090


Disciplinary support can be an important factor in whether faculty successfully implement teaching innovations (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Questions 19 and 20 on the self-efficacy survey (Fig. 2) assessed whether ALOC and BCCE workshop participants recognize OrganicERs as a resource (Q19) from which they feel comfortable seeking help (Q20). Repeated measures ANOVA modeling showed that period (F(5,595) = 31.2, p < 0.001), workshop type (F(1,595) = 90.1, p < 0.001), and question (F(1,595) = 11.6, p = 0.001) were important factors. Initial gains were significant (p < 0.001, d = 1.42). Participants’ confidence in OrganicERs decreased at the 6-month survey (p = 0.002, d = −0.51), particularly for those that attended the BCCE workshops (Fig. 3). Even with this decline, the average responses on these items after the workshops were significantly greater than on the pre-workshop survey (p < 0.001, d = 1.02). ALOC participants did express significantly more confidence in OrganicERs than BCCE participants (+0.56, p < 0.001, d = 0.74), as expected given their greater length of interaction. The high and, particularly for ALOC participants, sustained average responses to these items are an indication that workshop participants were confident OrganicERs could be a source of disciplinary support as they implement instructional change.

The hope and expectation of the workshops was that increasing participants’ knowledge about and belief in the efficacy of active learning methods and connecting them with disciplinary support would increase their intention (Q4, Fig. 2) to implement these methods in their courses after the workshop. Fig. 3 illustrates that participants’ confidence in their intention to implement active learning methods did increase significantly immediately after the workshops (p < 0.001, d = 0.60). Responses varied over the following 2 years, with particularly low average responses 18 months after the workshops. This is not unexpected, as intent should diminish once methods have been implemented, and our experience is that few instructors undertake major instructional change in the midst of an academic year. ANOVA modeling showed that in addition to period (F(5,249) = 5.0, p < 0.001), workshop type was also an important factor (F(1,249) = 8.3, p = 0.004). Participants at the 4 day ALOC workshops reported stronger confidence in their intent to implement more active learning methods throughout the duration of the study (+0.17, p = 0.004, d = 0.32).

Participant responses to the self-efficacy survey suggest that the ALOC and BCCE workshops effectively increased participants’ knowledge about and belief in the efficacy of active learning methods and that these increases were sustained for at least two years after workshop attendance (RQ1). The survey also found that participants’ confidence in their intent to incorporate more active learning methods in their teaching increased after the workshops.

Confidence with active learning methods

The TCSR Model suggests that confidence with reformed teaching methods is a prerequisite for their implementation (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Questions 3 and 5–9 assess whether participants had confidence implementing the evidence-based methods discussed in the workshops (Fig. 2 and 3). Repeated measures ANOVA modeling found that period (F(5,1947) = 54.8, p < 0.001), workshop type (F(1,1947) = 11.3, p = 0.001), and question (F(5,1947) = 43.6, p < 0.001) were all important contributors to the observed change in responses. Average responses increased immediately following the workshops and this increase grew further over the following two years (p < 0.001, d = 0.85). The difference between ALOC and BCCE workshops was small (p < 0.001, d = 0.15).

Questions 10–18 on the self-efficacy survey (Fig. 2) measured 4 day ALOC workshop participants’ confidence using particular teaching methods. A stated goal of the workshops was for participants to become proficient with at least two of the nine methods that were discussed at the workshops. As shown in Fig. 4, approximately one third of respondents rated themselves as proficient (4 or 5 on the Likert scale) with two or more methods prior to the workshop. This percentage increased three-fold (to >90%) after the workshops and was sustained over the two years that participants were queried. The percentage of respondents confident in their ability to use five or more methods increased by 70% during the workshop and remained constant at about 50% for the remainder of the study. Repeated measures ANOVA modeling found that period (F(5,230) = 14.5, p < 0.001) and participant (F(46,230) = 3.5, p < 0.001) explained 51% of the variation in responses. Respondents, on average, became confident or very confident in two additional methods (p < 0.001, d = 0.78) after ALOC workshop attendance. These results indicate that the ALOC workshops were highly effective at improving participants’ confidence using active learning methods and that these gains remained steady from 6–24 months after the workshops (RQ2). These results are similar to those seen with participants in the CCW, where prior to the workshop only 28% of participants were “confident” or “very confident” in their ability to implement POGIL techniques, but this increased to 78% eight months after the workshop (Murray et al., 2011).


Fig. 4 Number of methods in which participants at the 4 day ALOC workshops self-report they are confident (4) or very confident (5) on questions 10–18 of the self-efficacy instrument.

Implementation of evidence-based methods

Belief in the efficacy of active learning methods and confidence in one's ability to implement them are necessary for effective implementation, but they do not guarantee that implementation will occur, as personal and contextual factors can remain significant obstacles (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Items from the self-efficacy survey, analysis of results from the Postsecondary Instructional Practices Survey (PIPS), and self-reported allocation of class time were used to determine whether workshop participants believed that they had implemented the evidence-based, active learning methods discussed in the workshops.

The CCW and POGIL-PCL workshops conducted surveys 7–8 months post-workshop to determine the extent to which participants implemented workshop-specific modules (Murray et al., 2011; Stegall et al., 2016). Most CCW and POGIL-PCL participants, 90% and 77% respectively, reported implementing at least some workshop material within this timeframe, though only 42% of the POGIL-PCL participants reported using the modules regularly. The CSC-NFW study also found statistically significant increases in the number and frequency of evidence-based methods participants reported implementing (Stains et al., 2015). Their control group and workshop participants both reported using more EBIPs one year after the workshop, but the increase was greater for the participants (p = 0.044, η2 = 0.062). The CSC-NFW participants also reported using EBIPs more frequently one year after the workshop, particularly group work (p = 0.018, η2 = 0.086), whole class discussion (p = 0.047, η2 = 0.061), and “move into class” (p = 0.015, η2 = 0.091). These increases were statistically significant compared to the control group, which reported decreased frequency of use.

Self-efficacy survey

The majority of questions on the self-efficacy survey provided insight into whether participants had implemented active learning methods. The increase in respondent confidence in implementation of active learning methods (Q3 and 5–8, Fig. 3) and the large percentage (90%) of ALOC respondents that remained confident in their ability to use at least two active learning methods after two years (Fig. 4) suggest that they are using these methods in their teaching.

PIPS subscale analysis

The Postsecondary Instructional Practices Survey (PIPS) was used, in addition to items on the self-efficacy survey, as a measure of whether workshop participants believe that they have implemented active learning methods. Our hypothesis was that scores on the student-centered teaching, formative assessment, and student–student interactions subscales would increase for workshop participants who incorporate active learning methods. Fig. 5 shows that scores on these scales increased, as expected from the large (d = 0.85) reported increase in confidence using active learning methods (above), particularly for the ALOC workshops. Scores on the student-centered teaching (F(1,173) = 45.2, p < 0.001, d = 1.00), formative assessment (F(1,172) = 48.8, p < 0.001, d = 1.04), and student–student interactions (F(1,172) = 41.9, p < 0.001, d = 0.96) scales all increased by approximately one standard deviation from pre-workshop levels. These are large improvements that, particularly in conjunction with the self-efficacy and time allocation data, provide strong evidence that the 4 day ALOC and 3 hour BCCE workshops effectively increased the implementation of active learning, evidence-based instructional practices in participants’ courses. Approximately two-thirds of ALOC and one-half of BCCE participants reported increases of at least half a standard deviation on the student-centered teaching, formative assessment, and student–student interaction scales after two years. These proportions are roughly comparable to the 77% of participants that implemented some material and 42% that reported using the modules regularly within 8 months of the POGIL-PCL workshop (Stegall et al., 2016). Participants at the ALOC workshops reported greater gains on the student-centered teaching, formative assessment, and student–student interaction scales than participants at the BCCE workshops (Table 3). It is possible that the larger effect sizes observed with the PIPS instrument are due to its greater sensitivity relative to the other measures of implementation we used. Scores on the content delivery scale decreased after the workshops (F(1,173) = 13.4, p < 0.001, d = 0.54), while the decrease on the summative assessment scale was not statistically significant (F(1,172) = 1.1, p = 0.309, d = 0.15).
Fig. 5 Workshop participant scores on the Content Delivery, Student-Centered, Formative Assessment, Student–Student Interactions, and Summative Assessment subscales of the PIPS instrument.

Allocation of class time

The PIPS instrument also asked respondents to estimate the percentage of class time spent on lecture, individual work, small group work, and other pedagogies. Though lacking detail, this does provide another measure of whether workshop participants believe they altered their teaching practice after workshop attendance. Fig. 6 illustrates that workshop participants, particularly the ALOC cohort, reported a reduction in the percentage of time spent lecturing to the whole class in the four semesters following workshop attendance (−11%, F(4,165) = 7.7, p < 0.001, d = 0.77). Workshop type was also an important factor in the ANOVA modeling (F(1,165) = 4.4, p = 0.038, d = 0.28).

The reduction in self-reported time spent lecturing corresponded to an increase in the percentage of time spent with students working in small groups (+11%, p < 0.001, d = 0.68). This medium effect size is comparable to that reported for CSC-NFW participant use of small group work (p = 0.018, η2 = 0.086) one year after their workshop (Stains et al., 2015). Changes to the amount of time spent with students working individually or on “other” activities were negligible. Repeated measures ANOVA modeling also found that participants in the ALOC workshops reported higher levels of small group work than BCCE participants (+4%, p = 0.095, d = 0.23) despite reporting less use of small group work in the semester before workshop attendance.

The self-efficacy survey, analysis of the PIPS instrument subscales, and the survey of time allocation in class meetings all suggest that ALOC and BCCE workshop participants believed that they increased their implementation of evidence-based, active learning methods after workshop attendance. The large magnitude and consistency of these increases (Table 3) provide great confidence that the ALOC and BCCE workshops were effective in their goal to increase implementation of evidence-based, active learning methods in the organic chemistry courses of participants (RQ3).

Efficacy of 4 day ALOC versus 3 hour BCCE workshops

The final research question (RQ4) is whether the 4 day ALOC workshops were significantly more effective than the 3 hour workshops facilitated at the 2016 BCCE. Review of Fig. 3, 5 and 6 as well as Table 3 shows that both types of workshop effectively changed participants’ knowledge, beliefs, and teaching practices. The effect sizes for these changes were in the medium to large range. Pre-workshop survey responses largely indicate that the ALOC and BCCE participants entered the workshop with similar profiles. Exceptions include higher responses for the ALOC participants on knowledge of OrganicERs (Fig. 3) and time spent on whole-class lecture (Fig. 6) as well as lower average responses regarding their knowledge about active learning methods and belief in their efficacy (Fig. 3) and use of student–student interactions (Fig. 5). These differences suggest that, on average, participants at the ALOC workshops were slightly less likely to be using active learning practices before the workshops.
Fig. 6 Percent of class time workshop participants reported using for whole class lecture, individual work, small group work, and other.

Participants in the 4 day ALOC workshops reported, on average, greater change than participants in the 3 hour BCCE workshops such that two years after the workshops ALOC participants were more likely to be using active learning practices by every measure in this study. Repeated measures ANOVA modeling found that the effect of workshop type was significant but small (d = 0.15 to 0.38) for each measure other than confidence in OrganicERs as a resource from which participants feel comfortable seeking help (Q19 and Q20, d = 0.74), the PIPS student-centered scale (d = 0.59), and the PIPS formative assessment scale (d = 0.60, Table 3). The large effect for confidence in OrganicERs suggests that the 4 day ALOC workshops did a much better job incorporating workshop participants into our community of practice than the shorter, 3 hour BCCE workshops. Even with the medium to large effect sizes for the PIPS student-centered and formative assessment scales, participants at the 3 hour BCCE workshops reported significant gains. The other, smaller effect sizes suggest to us that the shorter workshops, though less effective than the 4 day workshops, are valuable experiences that have measurable impacts upon teaching knowledge, beliefs, and practice.

Limitations

It is important to note several limitations of this study. First, all data in this study are self-reported and therefore subject to errors of self-perception (Kane et al., 2002; D’Eon et al., 2008; Ebert-May et al., 2011). It would have been preferable to corroborate these data with direct observation, but the short amount of time between workshop acceptance and attendance, along with resource limitations, made direct observation impossible. Second, the self-efficacy instrument was adapted from instruments that were validated with university instructors from across the disciplines, most of whom were not native English speakers (Prieto-Navarro, 2005; Chang et al., 2010). Attempts were made to validate the self-efficacy instrument for English-speaking chemistry faculty, but it would have been preferable to use more than 26 subjects for this effort. Finally, it is not possible to know whether workshop participants would have increased their use of active learning methods regardless of workshop attendance. Ideally, participants’ gains could be compared to those of a control group of chemistry instructors who neither attended a workshop nor interacted with anyone who had, but finding such a group of faculty with sufficient motivation to respond to 44 questions five times over a period of two years was not possible.

Conclusions

Self-efficacy and instructional practice surveys were used to measure the effectiveness of the 4 day ALOC and 3 hour BCCE workshops. The surveys were administered one week before the workshop (self-efficacy only), one week after the workshop, and every six months thereafter for two years. This extended observation period was important to differentiate between participants who abandoned EBIPs after one or two trials and those who, after initial implementation, received positive confirmation of success and committed to sustained use of the EBIPs (Rogers, 2003; Henderson et al., 2012). Approximately two-thirds of ALOC participants and half of BCCE participants provided responses to these surveys.

The self-efficacy survey, analysis of the PIPS subscales, and changes in the use of class time all suggest that the workshops were highly effective. Participants’ knowledge about and belief in the efficacy of active learning methods increased by more than a standard deviation (d = 1.18) after the workshops, and this change was sustained over the two years that participants were surveyed (Fig. 3). The 4 day ALOC workshops also had a large effect on participants’ belief in the OrganicERs community of practice (Fig. 3) and proficiency with active learning methods (Fig. 4). Immediately after the workshops, participants reported greater intention to implement active learning methods (d = 0.60, Fig. 3), and all indications are that they did so. Participants reported higher levels of confidence in implementation (d = 0.85, Fig. 3); scored higher on the student-centered (d = 1.00), formative assessment (d = 1.04), and student–student interactions (d = 0.96) scales of the PIPS (Fig. 5); reported lower use of whole class lecture (d = 0.77, Fig. 6); and reported higher use of small group work (d = 0.68, Fig. 6). The large and consistent magnitude of these changes provides good reason to believe that the 4 day ALOC and 3 hour BCCE workshops effectively disseminated evidence-based instructional practices amongst organic chemistry instructors, who then implemented these methods in their own courses and confirmed their effectiveness (Rogers, 2003).

The significant and sustained changes in teaching knowledge, beliefs, and practice reported by participants at the 4 day ALOC and 3 hour BCCE workshops are similar to or greater than those reported for the Core Collaborators Workshops (CCW) for biochemistry (Murray et al., 2011), the POGIL-PCL workshops for physical chemistry laboratory (Stegall et al., 2016), and the Cottrell Scholars Collaborative New Faculty Workshop (CSC-NFW) for chemistry faculty at R1 institutions (Baker et al., 2014; Stains et al., 2015). This report extends that of Stains et al., who found that participants in the CSC-NFW retained the knowledge gained at their workshop for a year; participants in the ALOC and BCCE workshops retained the knowledge they gained for two years. Our findings, however, differ from the CSC-NFW in that the change in beliefs about the efficacy of active learning methods (Q2, Fig. 2) was also sustained over the two-year course of the study; Stains et al. (2015) found that beliefs in specific student-centered instructional practices on the Approaches to Teaching Inventory (ATI) reverted to pre-workshop levels a year after workshop attendance. The methods used to evaluate the CCW and POGIL-PCL workshops, likewise, make it unclear whether participants continued to implement POGIL modules beyond the academic year following workshop attendance.

The success of the ALOC and BCCE workshops is undoubtedly due to a variety of factors. We believe that four of these are worth noting. First, the workshops were conceived and designed as a means of incorporating participants into a community of practice focused on evidence-based instructional practices in organic chemistry. We have intentionally sought to incorporate diverse pedagogical and institutional perspectives into this community. Second, workshop participants learn about each evidence-based teaching practice, see it modeled while experiencing it as a learner, and develop learning artifacts that address key learning challenges faced by their organic chemistry students during the workshops. Third, the workshops were led by organic chemistry instructors with years of experience using active, evidence-based instructional practices in their courses. Fourth, assessment has been an integral component of our attempt to continually improve the workshops. The process of developing the clear and concise delineation of workshop objectives required for assessment provided clarity to facilitators and a cohesive experience for participants. We have also been able to use preliminary survey results to eliminate discussion of some technologies that participants did not implement and to incorporate others. Finally, the surprisingly positive results from the 2016 BCCE workshops encouraged us to continue offering 3 hour workshops at BCCE despite our perception that three hours is too little time to accomplish lasting change.

Conflicts of interest

The corresponding author (JBH) of this report was also a facilitator of each workshop and is on the Leadership Board of OrganicERs.

Appendix 1: sample 4 day ALOC schedule

Monday evening
6–7 pm Welcome and Introductions
7–8 pm Overview of Active Learning Pedagogies
8–9 pm Backward Design
Tuesday
8:30–9:15 The Flipped Classroom at IPFW
9:15–10:00 The Flipped Classroom at RCGC
10:15–12:00 Using the Tools (Livescribe and SnagIt)
1:00–1:10 Group Photo
1:10–1:30 cCWCS and OrganicERs.org
1:30–2:45 Clicker technology and use
3:15–4:00 Hands-on activities with clickers
4:00–4:30 EasyOChem
4:30–5:30 The Flipped Lab at Dartmouth
5:30–6:00 Reflection
Wednesday
8:30–9:00 Just-in-Time Teaching at Wittenberg
9:00–9:30 Just-in-Time Teaching at Centre
10:00–11:30 Learning Objectives and Reading Prompts
12:30–1:45 Using the Tools (Doceri and Explain Everything)
1:45–2:30 Assessments
3:00–5:30 Concept Inventory Development
5:30–6:00 Reflection
Thursday
8:30–9:15 Discussion with previous participant
9:15–10:00 Introducing active learning to others
10:30–11:45 Brainstorming – how will your teaching change?
11:45–12:00 Evaluations

Appendix 2: sample 3 hour BCCE schedule

10 min Introductions
20 min Chapters 1 and 2 – Thinking about learning and teaching as a researcher would
Evidence for effectiveness of active learning
Learning goals
30 min Chapter 3 – Using insights about learning to inform teaching
Constructivism
Metacognition
60 min Chapter 4 – Designing instruction
Think-Pair-Share
Peer Instruction
Just-in-Time Teaching
Interactive exercises
Cooperative/collaborative learning
30 min Chapter 5 – Assessing and adapting
30 min Chapter 6 – Overcoming challenges

Acknowledgements

The authors wish to acknowledge cCWCS for funding the 4 day ALOC workshops (NSF #1022895), the Wittenberg University Department of Chemistry and the Virginia Ellis Franta Fund for financial support, and Alexey Leontyev for guidance.

Notes and references

  1. Angelo T. A. and Cross K. P., (1993), Classroom Assessment Techniques: A Handbook for College Teachers, San Francisco, CA: Jossey-Bass.
  2. Apugliese A. and Lewis S. E., (2017), Impact of instructional decisions on the effectiveness of cooperative learning in chemistry through meta-analysis, Chem. Educ. Res. Pract., 18(1), 271–278.
  3. Baker L. A., Chakraverty D., Columbus L., Feig A. L., Jenks W. S., Pilarz M., Stains M., Waterman R. and Wesemann J. L., (2014), Cottrell scholars collaborative new faculty workshop: professional development for new chemistry faculty and initial assessment of its efficacy, J. Chem. Educ., 91, 1874–1881.
  4. Bandura A., (1999), Self-efficacy: The Exercise of Control, New York, NY: W.H. Freeman.
  5. Barkley E. F., Cross K. P. and Major C. H., (2005), Collaborative Learning Techniques, 1st edn, San Francisco: Wiley.
  6. Bauer C., Libby R. D., Scharberg M. and Reider D., (2013), Transformative research-based pedagogy workshops for chemistry graduate students and postdocs, J. Coll. Sci. Teach., 43, 36–43.
  7. Cam A. and Omer G., (2017), Effectiveness of case-based learning instruction on pre-service teachers’ chemistry motivation and attitudes toward chemistry, Res. Sci. Technol. Educ., 35(1), 74–87.
  8. Chan J. Y. K. and Bauer C. F., (2015), Effect of Peer-Led Team Learning (PLTL) on Student Achievement, Attitude, and Self-Concept in College General Chemistry in Randomized and Quasi Experimental Designs, J. Res. Sci. Teach., 52, 319–346.
  9. Chang T., McKeachie W. and Lin Y., (2010), Faculty Perceptions of Teaching Support and Teaching Efficacy in Taiwan, J. High. Educ., 59, 207–220.
  10. Cohen J., (1988), Statistical Power Analysis for the Behavioral Sciences, New York, NY: Routledge Academic.
  11. Crimmins M. T. and Midkiff B., (2017), High structure active learning pedagogy for the teaching of organic chemistry: assessing the impact on academic outcomes, J. Chem. Educ., 94(4), 429–438.
  12. D’Eon M., Sadownik L., Harrison A. and Nation J., (2008), Using self-assessments to detect workshop success – Do they work? Am. J. Eval., 29(1), 92–98.
  13. Duch B. J., Groh S. E. and Allen D. E. (ed.), (2001), The power of problem-based learning, Sterling, VA: Stylus.
  14. Ebert-May D., Derting T. L., Hodder J., Momsen J. L., Long T. M. and Jardeleza S. E., (2011), What we say is not what we do: effective evaluation of faculty professional development programs, BioScience, 61(7), 550–558.
  15. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111(23), 8410–8415.
  16. Fullan M. G. and Stiegelbauer S., (1991), The new meaning of educational change, New York: Teachers College Press.
  17. Gess-Newsome J., Southerland S. A., Johnston A. and Woodbury S., (2003), Educational reform, personal practical theories, and dissatisfaction: the anatomy of change in college science teaching, Am. Educ. Res. J., 40, 731–767.
  18. Gibbons R. E., Villafañe S. M., Stains M., Murphy K. L., Raker J. R., (2018), Beliefs about learning and enacted instructional practices: an investigation in postsecondary chemistry education, J. Res. Sci. Teach., 55, 1–23.
  19. Gosser D. K., Cracolice M. S., Kampmeier J. A., Roth V., Strozak V. S. and Varma-Nelson P., (2000), Peer-led team learning: A guidebook, Upper Saddle River, NJ: Prentice Hall.
  20. Henderson C., Beach A. and Finkelstein N., (2011), Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature, J. Res. Sci. Teach., 48, 952–984.
  21. Henderson C., Dancy M. and Niewiadomska- Bugaj M., (2012), Use of research-based instructional strategies in introductory physics: Where do faculty leave the innovation-decision process? Phys. Rev. ST Phys. Educ. Res., 8, 020104.
  22. Kane R., Sandretto S. and Heath C., (2002), Telling half the story: a critical review of research on the teaching beliefs and practices of university academics, Rev. Educ. Res., 72(2), 177–228.
  23. Kober L., (2014), Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering, Washington DC: The National Academies Press.
  24. Leontyev A., Houseknecht J. B., Maloney V., Muzyka J., Rossi R., Welder C. and Winfield L., (2019), OrganicERs: Building a Community of Practice of Organic Chemistry Instructors through Workshops and Web-based Resources, J. Chem. Educ.,  DOI:10.1021/acs.jchemed.9b00104.
  25. Manduca C. A., Iverson E. R., Luxenberg M., Macdonald R. H., McConnell, D. A., Mogk D. W., Tewksbury B. J., (2017), Improving undergraduate STEM education: the efficacy of discipline-based professional development, Sci. Adv., 3, 1–16.
  26. Mazur E., (1997), Peer instruction: a user's manual, Upper Saddle River, NJ: Prentice Hall.
  27. Moog R. S. and Spencer J. N. (ed.), (2008), POGIL: Process oriented guided inquiry learning, ACS Symposium Series 994, Washington, DC: American Chemical Society.
  28. Mooring S. R., Mitchell C. E. and Burrows N. L., (2016), Evaluation of a flipped, large-enrollment organic chemistry course on student attitude and achievement, J. Chem. Educ., 93(12), 1972–1983.
  29. Murray T. A., Higgins P., Minderhout V. and Loertscher J., (2011), Sustaining the development and implementation of student-centered teaching nationally: the importance of a community of practice, Biochem. Mol. Biol. Educ., 39, 405–411.
  30. Novak G. M., Patterson E. T., Gavrin A. D. and Christian W., (1999), Just-in-Time Teaching: Blending Active Learning with Web Technology, Upper Saddle River, NJ: Prentice Hall.
  31. OrganicERs, (2013), 2 June, Organic Education Resources: a cCWCS community of scholars, viewed 6 June 2019, https://www.organicers.org.
  32. OrganicERs: Active Learning in Organic Chemistry, (2015), 7 July. Available at https://www.facebook.com, accessed 6 June 2019.
  33. Palincsar A. S., (1998), Social constructivist perspectives on teaching and learning, Annu. Rev. Psychol., 49, 345–375.
  34. Paulson D. R., (1999), Active learning and cooperative learning in the organic chemistry lecture class, J. Chem. Educ., 76(8), 1136–1140.
  35. Prieto-Navarro L., (2005), Las creencias de autoeficacia docente del profesorado universitario, Madrid: Universidad Pontificia Comillas.
  36. Rau M. A., Kennedy K., Oxtoby L., Bollom M. and Moore J. W., (2017), Unpacking “active learning”: a combination of flipped classroom and collaboration support is more effective but collaboration support alone is not, J. Chem. Educ., 94(10), 1406–1414.
  37. Rogers E. M., (2003), Diffusion of Innovations, New York: Free Press.
  38. R Core Team, (2018), R: A Language and Environment for Statistical Computing (3.5.2) [Computer program], Vienna, Austria: R Foundation for Statistical Computing.
  39. RStudio Team, (2018), RStudio: Integrated Development for R (1.1.463) [Computer program], RStudio, Inc., Boston, MA.
  40. Smith M. K., Jones F. H. M., Gilbert S. L. and Wieman C. E., (2013), The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices, CBE – Life Sci. Ed., 12, 618–627.
  41. Srinivasan S., Gibbons R. E., Murphy K. L. and Raker J., (2018), Flipped classroom use in chemistry education: results from a survey of postsecondary faculty members, Chem. Educ. Res. Pract., 19(4), 1307–1318.
  42. Stains M., Pilarz M. and Chakraverty D., (2015), Short- and long-term impacts of the Cottrell Scholars Collaborative New Faculty Workshop, J. Chem. Educ., 92, 1466–1476.
  43. Stains M., Harshman J., Barker M. K., Chasteen S. V., Cole R., DeChenne-Peters S. E., Eagan Jr. M. K., Esson J. M., Knight J. K., Laski F. A., Levis-Fitzgerald M., Lee C. J., Lo S. M., McDonnell L. M., McKay T. A., Michelotti N., Musgrove A., Palmer M. S., Plank K. M., Rodela T. M., Sanders E. R., Schimpf N. G., Schulte P. M., Smith M. K., Stetzer M., Van Valkenburgh B., Vinson E., Weir L. K., Wendel P. J., Wheeler L. B. and Young A. M., (2018), Anatomy of STEM teaching in North American universities, Science, 359(6383), 1468–1470.
  44. Stegall S. L., Grushow A., Whitnell R. and Hunnicutt S. S., (2016), Evaluating the Effectiveness of POGIL-PCL Workshops, Chem. Educ. Res. Pract., 17, 407–416.
  45. SurveyMonkey Inc., (2019), Main site [Online], available at: https://www.surveymonkey.com, accessed: 8 March 2019.
  46. Vishnumolakala V. R., Southam D. C., Treagust D. F., Mocerino M. and Qureshi S., (2017), Students' attitudes, self-efficacy and experiences in a modified process-oriented guided inquiry learning undergraduate chemistry classroom, Chem. Educ. Res. Pract., 18(2), 340–352.
  47. Walter E. M., Henderson C. R., Beach A. L., Williams C. T., (2016), Introducing the Postsecondary Instructional Practices Survey (PIPS): a concise, interdisciplinary, and easy-to-score survey, CBE – Life Sci. Educ., 15(4), 1–11.
  48. Warfa A.-R. M., (2016), Using cooperative learning to teach chemistry: a meta-analytic review, J. Chem. Educ., 93(2), 248–255.
  49. Wiggins G. and McTighe J., (2006), Understanding by Design, 2nd edn, New Jersey: Pearson.
  50. Wilson S. B. and Varma-Nelson P., (2016), Small groups, significant impact: a review of peer-led team learning research with implications for STEM education researchers and faculty, J. Chem. Educ., 93(10), 1686–1702.
  51. Windschitl M. and Sahl K., (2002), Tracing teachers’ use of technology in a laptop computer school: the interplay of teacher beliefs, social dynamics, and institutional culture, Am. Educ. Res. J., 39, 165–205.
  52. Woodbury S. and Gess-Newsome J., (2002), Overcoming the paradox of change without difference: a model of change in the arena of fundamental school reform, Educ. Policy, 16, 763–782.
