Justin B. Houseknecht,*a Garrin J. Bachinski,a Madelyn H. Miller,b Sarah A. White c and Douglas M. Andrews d
aDepartment of Chemistry, Wittenberg University, 200 W Ward St, Springfield, OH 45504, USA. E-mail: jhouseknecht@wittenberg.edu
bImmunity and Pathogenesis Division, Burnett School of Biomedical Sciences, University of Central Florida, Orlando, Florida 32827, USA
cHealth Policy and Management Department, Johns Hopkins School of Public Health, 615 N. Wolfe Street, Baltimore, MD 21205, USA
dDepartment of Math and Computer Science, Wittenberg University, 200 W Ward St, Springfield, OH 45504, USA
First published on 6th November 2019
Active learning has been shown to improve student outcomes and learning, yet organic chemistry instructors have been slow to adopt these pedagogies. The Chemistry Collaborations, Workshops, and Communities of Scholars (cCWCS) Active Learning in Organic Chemistry (ALOC) workshops have sought to facilitate the adoption of active learning methods by helping participants define active learning and understand best practices, persuading them to incorporate these practices into their teaching, and supporting their implementation efforts through an online community, Organic Educational Resources (http://OrganicERs.org). The effectiveness of the workshops was measured over a two-year period using teaching self-efficacy and teaching practices instruments. Comparison to pre-workshop self-efficacy surveys found significant and sustained gains for knowledge about and belief in the efficacy of active learning methods (d = 1.18 compared to pre-workshop responses) and confidence in intention to implement (d = 0.60). Belief that they were implementing more active learning in their classrooms (d = 0.85) was corroborated by the teaching practices survey and survey of class time allocation, which also showed statistically significant (p < 0.001) and sustained growth in student-centered teaching (d = 1.00), formative assessment (d = 1.04), student–student interactions (d = 0.96), and the amount of class time spent with students working in groups (d = 0.68) for the workshop participants. Gains for participants in the 3 hour Active Learning in Organic Chemistry workshops at the 2016 Biennial Conference on Chemical Education (BCCE) were smaller than those in the 4 day ALOC workshops, but still meaningful. These results indicate that the 2015 and 2016 Active Learning in Organic Chemistry faculty development workshops effectively increased participants’ knowledge about, belief in the efficacy of, and implementation of active learning methods.
Choice overload from the sheer variety of methods that have been shown to be effective may be part of the reason that so few chemists, outside of chemical education researchers, have implemented active learning methods in their classrooms. One recent survey of more than 800 chemistry faculty found that only 12.9% self-report using a flipped approach (“primary content delivery mode occurs outside of the classroom and the application of content occurs inside the classroom”) at least weekly in their courses (Srinivasan et al., 2018). This result is consistent with a cross-disciplinary study of more than 2000 classes which found that chemistry lags significantly behind the STEM average of 18% of “instructors who incorporate student-centered strategies into large portions of their classes” (Stains et al., 2018). It is also true that the vast majority of chemistry instructors have little experience with active learning – either as students or instructors – and have already invested substantial time and energy becoming proficient at delivering content through lecture. Resistance, particularly from senior colleagues, may be lessening but remains an impediment to change for many chemistry instructors.
Rogers’ diffusion of innovations theory describes the innovation decision-making process as comprising five stages: knowledge, persuasion, decision, implementation, and finally confirmation (Rogers, 2003). The foundational role of knowledge and persuasion in Rogers’ theory is consistent with the key factors of the TCSR Model, particularly the importance of instructor beliefs and self-efficacy (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Indeed, educational reformers working within the TCSR Model framework have demonstrated that the change process begins with growth in instructor knowledge and beliefs about teaching (Windschitl and Sahl, 2002; Bauer et al., 2013). Henderson et al. (2012) found that presentations and workshops, particularly the Physics and Astronomy New Faculty Workshop, effectively moved physics faculty through the first three stages of the decision-making process (knowledge, persuasion, and decision), but needed to do more to support faculty during the implementation and confirmation stages. This is consistent with their finding that effective educational change strategies involve two or more of the following: extended contact (a month or more), performance evaluation and feedback, and a deliberate focus on changing faculty attitudes and beliefs (Henderson et al., 2011). These findings suggest that typical faculty development workshops struggle to promote the desired long-term change in teaching practice because facilitators do not have meaningful contact (that supports implementation and confirmation) with participants after the conclusion of the workshop. A recent report by Manduca et al. (2017) in geosciences education suggests that online communities of practice can provide effective support for the implementation of student-engaged teaching practices, which leads to positive confirmation and sustained use of the desired teaching practices.
The theoretical framework for this quantitative study uses the TCSR Model to understand the factors that facilitate instructional transformation and their interconnectivity. Rogers’ diffusion of innovations theory was used to understand the process of instructional transformation, particularly the role that faculty development workshops can play in moving instructors through the process and supporting their post-workshop activities.
Twenty-four participants attended the 4 day 2015 ALOC workshop in Washington, DC, and 22 attended the 2016 ALOC workshop in Cincinnati, OH. Most participants taught at 4 year institutions (62%), though community college instructors (20%) and faculty at schools with graduate programs (18%) were also well represented. The workshop facilitators (5 each year) used active learning methods extensively throughout the workshop such that participants gained experience with the methods as students and as instructors. The workshops began with introductions (see Appendix 1) and the acknowledgement that personal factors and experience do impact our ability to transform instructional practice (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). The remainder of the first evening was used to provide an overview of the active learning methods introduced in the workshop, the theoretical and empirical evidence supporting their efficacy, and the concept of backward design (Wiggins and McTighe, 2006). These sessions were designed to provide a compelling rationale for implementation of active learning methods. Each of the following days began with workshop facilitators describing how they make use of particular evidence-based instructional practices (EBIPs, Fig. 1) in their own teaching. These sessions were designed to inform participants of the underlying theory and best practices for each method and to let participants experience the methods as students. Afternoon sessions were more practical, with introductions to technology (Fig. 1), opportunities to interact with it, and time for participants to design and receive feedback on instructional modules. Workshop facilitators had extensive experience with the EBIPs and technologies introduced and were therefore able to emphasize their compatibility with organic chemistry instruction.
The workshops also included a session on summative assessment to help participants understand best practices and to prepare them for the confirmation stage of Rogers’ diffusion of innovations theory.
Fig. 1 Primary evidence-based instructional practices and technologies introduced at the 2015 and 2016 4 day ALOC workshops.
A total of 40 participants attended two 3 hour workshops at the 2016 Biennial Conference on Chemical Education (BCCE) at the University of Northern Colorado. These workshops used Reaching Students: What Research Says About Effective Instruction in Undergraduate Science and Engineering to provide an overview of evidence-based instructional practices (Kober, 2014). The philosophy of these workshops was similar to the 4 day workshops, but the schedule (Appendix 2) was highly compressed. Participants at the BCCE workshops taught primarily at 4 year institutions (57%), but there were more faculty from schools with graduate programs (24%) than at the ALOC workshops. Five percent of participants at the BCCE workshops taught at high schools and the remaining 19% taught at community colleges.
RQ1. Will participants’ knowledge about and belief in the efficacy of active learning increase significantly after the workshop and, if so, will those changes be sustained?
RQ2. Will participants’ confidence with active learning methods increase significantly after the workshop and, if so, will those changes be sustained?
RQ3. Will participants’ implementation of active learning methods increase significantly after the workshop and, if so, will those changes be sustained?
RQ4. Will the 4 day (ALOC) workshops be significantly more effective than the 3 hour (BCCE) workshops?
Effectiveness was measured by comparison of pre- and post-workshop results on self-efficacy and teaching practices surveys, all self-reported by workshop participants. Self-efficacy refers to “an individual's belief in his or her capacity to execute behaviors necessary to produce specific performance attainments” (Bandura, 1999). The self-efficacy survey used in this study was adapted from validated teaching self-efficacy surveys (Prieto-Navarro, 2005; Chang et al., 2010). The teaching practices survey used was the Postsecondary Instructional Practices Survey (PIPS) using the authors’ five-factor model (Walter et al., 2016). This instrument, too, has been shown to provide valid and reliable measures of postsecondary teaching practice, particularly for introductory-level courses in the sciences. Limited resources and time between workshop enrollment and participation precluded alternative measures of active learning implementation such as student evaluations of teaching and classroom observation of teaching.
The self-efficacy items (Fig. 2) were adapted for organic chemistry from the previously validated Faculty Teaching Efficacy survey (Prieto-Navarro, 2005) and the College Teaching Self-Efficacy Scale (Chang et al., 2010). All responses were on a five-point Likert scale from “1 – unsure of what this is” to “5 – very confident”. Participants in the 3 hour workshops (BCCE) received an abbreviated survey that did not include questions about classroom technologies (Q10–18), as these topics were not discussed in the BCCE workshops. The self-efficacy survey was administered to workshop participants one week prior to the workshop, one week after the workshop, and every six months thereafter through 24 months. It was also administered to a control group composed of participants’ departmental colleagues two weeks apart, then 6 months and 12 months thereafter. The Postsecondary Instructional Practices Survey (PIPS) was administered to workshop participants one week after the workshop and every six months thereafter through 24 months (Walter et al., 2016). It was discovered late in the data analysis that item 10, “I structure class so that students explore or discuss their understanding of new concepts before formal instruction”, was accidentally omitted from early administrations of PIPS. This item therefore could not be included in the analysis. Participants who did not respond to the initial request to complete a survey were reminded 2–3 additional times with personal e-mails.
Fig. 2 Items on the self-efficacy survey. Response options were: (1) unsure of what this is, (2) not confident at all, (3) not very confident, (4) somewhat confident, (5) very confident.
Because there are distinct cohorts of participants (ALOC and BCCE) and each participant gave a quantitative response to each of several items/questions at each of several time periods, the data have a “repeated measures” structure with one between-blocks factor and two crossed within-blocks factors. The participants themselves are the blocks. The lone between-blocks factor is the cohort, and it has two levels, corresponding to the two workshop types. One of the within-blocks factors is the item/question, corresponding to the measurements recorded on each participant at each period. The other within-blocks factor is the time period, and it has six levels – though the latter five periods were occasionally aggregated to facilitate overall pre-workshop vs. post-workshop comparisons.
There are a total of 44 items/questions. The 20 self-efficacy questions are separated into five groups: Knowledge, OrganicERs, Intent, Proficiency, and Implementation. The 24 items on the PIPS instrument were divided into five subscales and Time Allocation (four options). These 14 groups of items/questions were analyzed separately, mostly in a series of 14 repeated measures analyses as described above.
Exploratory data analysis included visualizations of the main effects and interaction effects of the cohort, period, and item/question factors on the participants’ responses. Formal repeated measures analysis of variance (ANOVA) modeling and inferential testing then showed which combination of these factors was most relevant in explaining the responses. The formal analysis was followed up with pairwise comparisons (using Tukey's adjustment to minimize the risk of false positive differences) to determine which cohort, periods, and items/questions had different responses, and by how much. The effect sizes were expressed both on the original scale and on a standardized scale using Cohen's d (Cohen, 1988), and were expressed both with single-number estimates and with confidence intervals. All analysis was performed using R (R Core Team, 2018) via RStudio (RStudio Team, 2018).
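The variance-partitioning logic behind the ANOVA modeling can be illustrated with a simplified sketch. The study's actual repeated measures models were fit in R and additionally block on participant; the plain-Python toy below uses hypothetical Likert responses in a one-way layout (no blocking) purely to show how an F statistic compares between-group to within-group variance.

```python
import statistics

def one_way_anova_f(groups):
    """F = (between-group mean square) / (within-group mean square).

    Simplified one-way layout; the study's actual models were repeated
    measures analyses that also account for participant-to-participant
    variation.
    """
    k = len(groups)                                  # number of groups
    n = sum(len(g) for g in groups)                  # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Variation of group means around the grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # Variation of observations around their own group mean
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical responses at three survey periods
periods = [[1, 2, 3], [2, 3, 4], [3, 4, 5]]
print(round(one_way_anova_f(periods), 2))  # → 3.0 on F(2, 6)
```

Large F values indicate that the factor (here, survey period) explains more variation than chance would; the reported F statistics such as F(5,592) = 35.2 follow the same logic with the repeated measures structure included.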
Effect sizes in this report were calculated using Cohen's d which can be understood as how much of a standard deviation a measurement changed (Cohen, 1988). Changes of 20% of a standard deviation (d value of 0.2) are generally considered small, changes of 0.5 d are considered moderate, and changes greater than 0.8 d are generally considered large. Effect sizes can also be calculated from analysis of variance (ANOVA) studies by determining the percentage of variance explained by group membership. These are often reported as eta-squared values; values less than 1% (η2 < 0.01) are generally considered small, values near η2 = 0.06 are considered moderate, and values of η2 > 0.14 are considered large (Cohen, 1988).
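As a concrete sketch of these two effect-size measures (with made-up Likert responses; the authors' actual computations were done in R), Cohen's d divides the mean difference by the pooled standard deviation, and eta-squared is the fraction of total variance attributable to group membership:

```python
import statistics

def cohens_d(pre, post):
    """Standardized mean difference: (post mean - pre mean) / pooled SD."""
    n1, n2 = len(pre), len(post)
    pooled_var = ((n1 - 1) * statistics.variance(pre) +
                  (n2 - 1) * statistics.variance(post)) / (n1 + n2 - 2)
    return (statistics.mean(post) - statistics.mean(pre)) / pooled_var ** 0.5

def eta_squared(groups):
    """Proportion of total variance explained by group membership."""
    pooled = [x for g in groups for x in g]
    grand_mean = statistics.mean(pooled)
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    ss_total = sum((x - grand_mean) ** 2 for x in pooled)
    return ss_between / ss_total

pre = [3, 3, 4, 2, 3, 3]    # hypothetical pre-workshop responses
post = [4, 4, 5, 3, 4, 4]   # hypothetical post-workshop responses
print(round(cohens_d(pre, post), 2))        # → 1.58, "large" by Cohen's benchmarks
print(round(eta_squared([pre, post]), 2))   # → 0.43, also "large" (η² > 0.14)
```

In this toy example a one-point average gain against modest response spread produces a d above 1, mirroring the magnitude of the knowledge gains reported below.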
Coefficient α-values were calculated for each of the five factors on the PIPS (Table 1) to determine their internal consistency for the ALOC and BCCE cohorts. All coefficient α-values were similar to, but greater than, those reported by Walter et al., with the exception of the content delivery factor for the ALOC cohort, which was marginally lower (Walter et al., 2016). Walter et al. (2016) established the validity of the PIPS instrument, in part, with both exploratory and confirmatory factor analysis (CFA). We were not able to perform CFA of the PIPS instrument with our population due to small sample size.
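Coefficient α (Cronbach's alpha) for a factor with k items is α = k/(k−1) × (1 − Σ item variances / variance of total scores). A minimal sketch with hypothetical responses (the study's Table 1 values came from the actual PIPS data, not this example):

```python
import statistics

def cronbach_alpha(items):
    """Internal consistency of a multi-item factor.

    items: one list of respondent scores per scale item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total
    sum_item_var = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Hypothetical 3-item factor answered by five respondents
items = [[4, 3, 5, 2, 4],
         [5, 3, 4, 2, 4],
         [4, 2, 5, 3, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.9
```

When items track each other closely across respondents (as here), the total-score variance dominates the summed item variances and α approaches 1; values near or above 0.7 are conventionally taken as adequate internal consistency.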
Response rates for the self-efficacy and instructional practices surveys for the participants in the 4 day ALOC workshops were above 50% for all administrations, with an average of 74% (Table 2). The 3 hour BCCE workshop participant response rates were lower, with a low of 38% and an average of 52%. Response rates tended to be lowest for the 18 month surveys, which were administered during the academic year, and decreased slightly over time. These response rates are similar to those reported for the CSC-NFW (59%, Baker et al., 2014) and POGIL-PCL (69%, Stegall et al., 2016) workshops. The CCW had remarkably higher response rates (90% of attendees at their second workshop), though only 18 of 24 total participants responded (Murray et al., 2011). It is not unexpected that the highest response rates are for cohorts that attended multiple multi-day workshops (CCW) and the lowest for those that attended only a 3 hour workshop (BCCE).
Table 2 Survey response rates (%)

| | Control (N = 26) | 2015 ALOC (N = 24) | 2016 ALOC (N = 22) | 2016 BCCE (N = 40) |
|---|---|---|---|---|
| Pre-workshop | 100 | 100 | 100 | 83 |
| 1 week | 100 | 88 | 96 | 53 |
| 6 months | 27 | 58 | 73 | 60 |
| 12 months | 31 | 63 | 64 | 63 |
| 18 months | — | 63 | 55 | 38 |
| 24 months | — | 71 | 64 | 48 |
Participants at both ALOC and BCCE workshops reported significant and sustained knowledge gains after attendance. Repeated measures ANOVA modeling showed that period (F(5,592) = 35.2, p < 0.001) and workshop type (F(1,592) = 8.7, p = 0.003) were important factors. Self-reported knowledge differences among the post-workshop responses were not statistically significant (p > 0.25), but the difference between average pre-workshop response and average post-workshop response (+0.59) was significant (p < 0.001) with a very large effect size of more than a standard deviation (d = 1.18) as summarized in Table 3. Participants at the 4 day ALOC workshops did report moderately greater knowledge gains (+0.12, d = 0.23) than participants at the 3 hour BCCE workshops due partially to starting at a lower average knowledge and partially to a higher average after the workshops. Stains et al. (2015) also studied the knowledge and beliefs of the CSC-NFW participants before and after their workshop. They found no significant change in the number of evidence-based instructional practices (EBIPs) their control group knew, but a large change (from 8 to 14 EBIPs) for workshop participants immediately after the workshop. This change was sustained after 12 months with 15 EBIPs. Scores on the student-centered scale of the Approaches to Teaching Inventory (ATI) also increased slightly (effect size, r = 0.23) immediately after the CSC-NFW, but returned to pre-workshop levels one year later. It is not surprising that college faculty retain knowledge about teaching practices for a year (CSC-NFW) or two (ALOC and BCCE). The large and sustained increases we observed in ALOC and BCCE participants’ belief that active learning methods are effective (Q2), however, are surprising. It is possible that this belief was sustained due to adequate support for implementation within the OrganicERs community and/or confirmation of effectiveness due to high levels of implementation.
Table 3 Effect sizes (Cohen's d) and p values for the period (all post-workshop vs. pre-workshop) and workshop type (ALOC vs. BCCE) comparisons

| | Period (all post-workshop vs. pre-workshop) | | Workshop (ALOC vs. BCCE) | |
|---|---|---|---|---|
| | Cohen's d | p | Cohen's d | p |
| Knowledge | 1.18 | <0.001 | 0.23 | 0.003 |
| OrganicERs | 1.02 | <0.001 | 0.74 | <0.001 |
| Intent | 0.60^a | <0.001^a | 0.32 | <0.001 |
| Proficiency | 0.78 | <0.001 | N/A | N/A |
| Implementation | 0.85 | <0.001 | 0.15 | <0.001 |
| Student-centered | 1.00 | <0.001 | 0.59 | <0.001 |
| Formative assessment | 1.04 | <0.001 | 0.60 | <0.001 |
| Interactive | 0.96 | <0.001 | 0.38 | 0.007 |
| % Lecture | 0.77 | <0.001 | 0.28 | 0.038 |
| % Small groups | 0.68 | <0.001 | 0.23 | 0.090 |

^a Comparison of only the 1 week post-workshop to the pre-workshop responses.
Disciplinary support can be an important factor in whether faculty successfully implement teaching innovations (Woodbury and Gess-Newsome, 2002; Gess-Newsome et al., 2003). Questions 19 and 20 on the self-efficacy survey (Fig. 2) assessed whether ALOC and BCCE workshop participants recognize OrganicERs as a resource (Q19) from which they feel comfortable seeking help (Q20). Repeated measures ANOVA modeling showed that period (F(5,595) = 31.2, p < 0.001), workshop type (F(1,595) = 90.1, p < 0.001), and question (F(1,595) = 11.6, p = 0.001) were important factors. Initial gains were significant (p < 0.001, d = 1.42). Participants’ confidence in OrganicERs decreased at the 6-month survey (p = 0.002, d = −0.51), particularly among those that attended the BCCE workshops (Fig. 3). Even with this decline, the average responses on these items after the workshops were significantly greater than on the pre-workshop survey (p < 0.001, d = 1.02). Responses from the ALOC participants did express significantly more confidence in OrganicERs than those of the BCCE participants (+0.56, p < 0.001, d = 0.74), as expected due to the greater length of interaction. The high and, particularly for ALOC participants, sustained average responses to these items are an indication that workshop participants were confident OrganicERs could be a source of disciplinary support as they implement instructional change.
The hope and expectation of the workshops was that increasing participants’ knowledge about and belief in the efficacy of active learning methods and connecting them with disciplinary support would increase their intention (Q4, Fig. 2) to implement these methods in their courses after the workshop. Fig. 3 illustrates that participants’ confidence in their intention to implement active learning methods did increase significantly immediately after the workshops (p < 0.001, d = 0.60). Responses varied over the following 2 years with particularly low average responses 18 months after the workshops. This is not unexpected as intent should diminish once methods have already been implemented and our experience is that few instructors implement major instructional change in the midst of an academic year. ANOVA modeling showed that in addition to period (F(5,249) = 5.0, p < 0.001), workshop type attended was also an important factor (F(1,249) = 8.3, p = 0.004). Participants at the 4 day ALOC workshops reported stronger confidence in their intent to implement more active learning methods throughout the duration of the study (+0.17, p = 0.004, d = 0.32).
Participant responses to the self-efficacy survey suggest that they believe that the ALOC and BCCE workshops did effectively increase their knowledge about and belief in the efficacy of active learning methods and these increases were sustained for at least two years after workshop attendance (RQ1). The survey also found that participants’ confidence in their intent to incorporate more active learning methods in their teaching increased after the workshops.
Questions 10–18 on the self-efficacy survey (Fig. 2) measured 4 day ALOC workshop participants’ confidence using particular teaching methods. A stated goal of the workshops was for participants to become proficient with the use of at least two of the nine methods that were discussed at the workshops. As shown in Fig. 4, approximately one third of respondents rated themselves as proficient (4 or 5 on the Likert scale) with two or more methods prior to the workshop. This percentage increased three-fold (to >90%) after the workshops and was sustained over the two years that participants were queried. The percentage of respondents confident in their ability to use five or more methods increased by 70% during the workshop and remained constant at about 50% for the remainder of the study. Repeated measures ANOVA modeling found that period (F(5,230) = 14.5, p < 0.001) and participant (F(46,230) = 3.5, p < 0.001) explained 51% of variation in the responses. Respondents, on average, became confident or very confident on two additional methods (p < 0.001, d = 0.78) after ALOC workshop attendance. These results indicate that the ALOC workshops were highly effective at improving participants’ confidence using active learning methods and that these gains remained steady from 6–24 months after the workshops (RQ2). These results are similar to those seen with participants in the CCW where prior to the workshop only 28% of participants were “confident” or “very confident” in their ability to implement POGIL techniques, but this increased to 78% eight months after the workshop (Murray et al., 2011).
Fig. 4 Number of methods in which participants at the 4 day ALOC workshops self-report they are confident (4) or very confident (5) on questions 10–18 of the self-efficacy instrument.
The CCW and POGIL-PCL workshop conducted surveys 7–8 months post-workshop to determine the extent to which participants implemented workshop-specific modules (Murray et al., 2011; Stegall et al., 2016). Most CCW and POGIL-PCL participants, 90% and 77% respectively, reported implementing at least some workshop material within this timeframe, though only 42% of the POGIL-PCL participants reported using the modules regularly. The CSC-NFW study also found statistically significant increases in the number and frequency of evidence-based methods participants reported implementing (Stains et al., 2015). Their control group and workshop participants both reported using more EBIPs one year after the workshop, but the increase was greater for the participants (p = 0.044, η2 = 0.062). The CSC-NFW participants also reported using EBIPs more frequently one year after the workshop, particularly group work (p = 0.018, η2 = 0.086), whole class discussion (p = 0.047, η2 = 0.061), and move into class (p = 0.015, η2 = 0.091). These increases were statistically significant compared to the control group which reported decreased frequency of use.
Fig. 5 Workshop participant scores on the Content, Student-Centered, Formative Assessment, Student–Student Interactions, and Summative Assessment subscales of the PIPS instrument.
The reduction in self-reported time spent lecturing corresponded to an increase in the percent of time spent with students working in small groups (+11%, p < 0.001, d = 0.68). This medium effect size is comparable to that reported for CSC-NFW participant use of small group work (p = 0.018, η2 = 0.086) one year after their workshop (Stains et al., 2015). Changes to the amount of time spent with students working individually or “other” were negligible. Repeated measures ANOVA modeling also found that participants in the ALOC workshop reported higher levels of small group work than BCCE participants (+4%, p = 0.095, d = 0.23) despite reporting less use of small group work in the semester before workshop attendance.
The self-efficacy survey, analysis of the PIPS instrument subscales, and the survey of time allocation in class meetings all suggest that ALOC and BCCE workshop participants believed that they increased their implementation of evidence-based, active learning methods after workshop attendance. The large magnitude and consistency of these increases (Table 3) provide great confidence that the ALOC and BCCE workshops were effective in their goal to increase implementation of evidence-based, active learning methods in the organic chemistry courses of participants (RQ3).
Fig. 6 Percent of class time workshop participants reported using for whole class lecture, individual work, small group work, and other.
Participants in the 4 day ALOC workshops reported, on average, greater change than participants in the 3 hour BCCE workshops such that two years after the workshops ALOC participants were more likely to be using active learning practices by every measure in this study. Repeated measures ANOVA modeling found that the effect of workshop type was significant, but small (d = 0.15 to 0.38) for each measure other than recognition of OrganicERs as a resource (Q19) from which they feel comfortable seeking help (Q20) (d = 0.74), the PIPS student-centered scale (d = 0.59), and the PIPS formative assessment scale (d = 0.60, Table 3). The large effect for confidence in OrganicERs suggests that the 4 day ALOC workshops did a much better job incorporating workshop participants into our community of practice than the shorter, 3 hour BCCE workshops. Even with the medium to large effect sizes for the PIPS student-centered and formative assessment scales, participants at the 3 hour BCCE workshops reported significant gain. The other, smaller effect sizes suggest to us that the shorter workshops, though less effective than the 4 day workshops, are valuable experiences that have measurable impacts upon teaching knowledge, beliefs, and practice.
The self-efficacy survey, analysis of the PIPS subscales, and changes in the use of class time all suggest that the workshops were highly effective. Participants’ knowledge about and belief in the efficacy of active learning methods increased by more than a standard deviation (d = 1.18) after the workshops and this change was sustained over the two years that participants were surveyed (Fig. 3). The 4 day ALOC workshops also had a large effect on participants’ beliefs in the OrganicERs community of practice (Fig. 3) and proficiency with active learning methods (Fig. 4). Immediately after the workshops, participants had a greater intention to implement active learning methods (d = 0.60, Fig. 3) and all indications are that they did so. Participants: reported higher levels of confidence in implementation (d = 0.85, Fig. 3); scored higher on the student-centered (d = 1.00), formative assessment (d = 1.04), and student–student interactions (d = 0.96) scales of the PIPS (Fig. 5); reported lower use of whole class lecture (d = 0.77, Fig. 6); and reported higher use of small group work (d = 0.68, Fig. 6). The large and consistent magnitude of these changes provide good reason to believe that the 4 day ALOC and 3 hour BCCE workshops effectively disseminated evidence-based instruction practices amongst organic chemistry instructors who then implemented these methods in their own courses and confirmed their effectiveness (Rogers, 2003).
The significant and sustained changes in teaching knowledge, beliefs, and practice reported by participants at the 4 day ALOC and 3 hour BCCE workshops are similar to or greater than those reported for the Core Collaborators Workshops (CCW) for biochemistry (Murray et al., 2011), POGIL-PCL workshops for physical chemistry laboratory (Stegall et al., 2016), and Cottrell Scholars Collaborative New Faculty Workshop (CSC-NFW) for chemistry faculty at R1 institutions (Baker et al., 2014; Stains et al., 2015). This report extends that of Stains et al. who found that participants in the CSC-NFW retained the knowledge gained at their workshop for a year; participants in the ALOC and BCCE workshops retained their knowledge gained for two years. Our findings, however, differ from the CSC-NFW in that the change in beliefs about the efficacy of active learning methods (Q4, Fig. 2) was also sustained over the two-year course of the study; Stains et al. (2015) found that beliefs in specific student-centered instructional practices on the Approaches to Teaching Inventory (ATI) reverted to pre-workshop levels a year after workshop attendance. The methods used to evaluate the CCW and POGIL-PCL workshops, likewise, make it unclear whether participants continued to implement POGIL modules beyond the academic year following workshop attendance.
The success of the ALOC and BCCE workshops is undoubtedly due to a variety of factors. We believe that four of these are worth noting. First, the workshops were conceived and designed as a means of incorporating participants into a community of practice focused on evidence-based instructional practices in organic chemistry. We have intentionally sought to incorporate diverse pedagogical and institutional perspectives into this community. Second, workshop participants learn about each evidence-based teaching practice, see it modeled while experiencing it as a learner, and develop learning artifacts that address key learning challenges faced by their organic chemistry students during the workshops. Third, workshops were led by organic chemistry instructors with years of experience using active, evidence-based instructional practices in their courses. Fourth, assessment has been an integral component of our attempt to continually improve the workshops. The process of developing the clear and concise delineation of workshop objectives required for assessment provided clarity to facilitators and a cohesive experience for participants. We have also been able to use preliminary survey results to eliminate discussion of some technologies that participants did not implement and incorporate others. Finally, the surprisingly positive results from the 2016 BCCE workshops encouraged us to continue offering 3 hour workshops at BCCE despite our perception that three hours is too little time to accomplish lasting change.
Appendix 1. Schedule for the 4 day ALOC workshops

| Time | Session |
|---|---|
| Monday evening | |
| 6–7 pm | Welcome and Introductions |
| 7–8 pm | Overview of Active Learning Pedagogies |
| 8–9 pm | Backward Design |
| Tuesday | |
| 8:30–9:15 | The Flipped Classroom at IPFW |
| 9:15–10:00 | The Flipped Classroom at RCGC |
| 10:15–12:00 | Using the Tools (Livescribe and SnagIt) |
| 1:00–1:10 | Group Photo |
| 1:10–1:30 | cCWCS and OrganicERs.org |
| 1:30–2:45 | Clicker technology and use |
| 3:15–4:00 | Hands-on activities with clickers |
| 4:00–4:30 | EasyOChem |
| 4:30–5:30 | The Flipped Lab at Dartmouth |
| 5:30–6:00 | Reflection |
| Wednesday | |
| 8:30–9:00 | Just-in-Time Teaching at Wittenberg |
| 9:00–9:30 | Just-in-Time Teaching at Centre |
| 10:00–11:30 | Learning Objectives and Reading Prompts |
| 12:30–1:45 | Using the Tools (Doceri and Explain Everything) |
| 1:45–2:30 | Assessments |
| 3:00–5:30 | Concept Inventory Development |
| 5:30–6:00 | Reflection |
| Thursday | |
| 8:30–9:15 | Discussion with previous participant |
| 9:15–10:00 | Introducing active learning to others |
| 10:30–11:45 | Brainstorming – how will your teaching change? |
| 11:45–12:00 | Evaluations |
Appendix 2. Schedule for the 3 hour BCCE workshops

| Time | Topic |
|---|---|
| 10 min | Introductions |
| 20 min | Chapters 1 and 2 – Thinking about learning and teaching as a researcher would |
| | Evidence for effectiveness of active learning |
| | Learning goals |
| 30 min | Chapter 3 – Using insights about learning to inform teaching |
| | Constructivism |
| | Metacognition |
| 60 min | Chapter 4 – Designing instruction |
| | Think-Pair-Share |
| | Peer Instruction |
| | Just-in-Time Teaching |
| | Interactive exercises |
| | Cooperative/collaborative learning |
| 30 min | Chapter 5 – Assessing and adapting |
| 30 min | Chapter 6 – Overcoming challenges |
This journal is © The Royal Society of Chemistry 2020