Structure and evaluation of flipped chemistry courses: organic & spectroscopy, large and small, first to third year, English and French

Alison B. Flynn
Department of Chemistry, University of Ottawa, 10 Marie Curie Private, Ottawa, ON K1N 6N5, Canada. E-mail: alison.flynn@uottawa.ca

Received 22nd October 2014 , Accepted 4th December 2014

First published on 5th December 2014


Abstract

Organic chemistry is a traditionally difficult subject with high failure and withdrawal rates and many areas of conceptual difficulty for students. To promote student learning and success, four undergraduate organic chemistry and spectroscopy courses at the first- to third-year level (17–420 students) were “flipped” in 2013–2014. In the flipped course, content traditionally delivered in lectures is moved online; class time is dedicated to focused learning activities. The three large courses were taught in English, the small one in French. To structure the courses, each course's intended learning outcomes (ILOs) were analyzed to decide which course components would be delivered online and which would be addressed in class. Short (2–15 min), specific videos were created to replace lectures. Online and in-class learning activities were created in alignment with the ILOs; assessment was also aligned with the ILOs. A learning evaluation was undertaken, using Guskey's evaluation model, to determine the impact of the new course structure. Students' grades, withdrawal rates, and failure rates were compared between courses that used the flipped model and courses taught in previous years in a lecture format. The results showed a statistically significant improvement in students' grades and decreased withdrawal and failure rates, although a causal link to the new flipped class format cannot be concluded. Student surveys and course evaluations revealed high student satisfaction; this author also had a very positive experience teaching in the new model. The courses' overall design and evaluation method could readily be adapted to other chemistry, science, and other courses, including the use of learning outcomes, the weekly course structure, the online learning management system design, and instructional strategies for large and small classes.


Introduction

Organic chemistry is a traditionally difficult subject with high failure and withdrawal rates and many areas of conceptual difficulty for students (Grove et al., 2008). The author had been teaching large chemistry courses in a lecture format, in which clickers, online homework, and demonstrations were used to create opportunities for active learning (Flynn, 2011, 2012a, 2012b). Even so, students were left on their own to learn the more difficult concepts that required higher order thinking (Krathwohl, 2002a), having learned (at best) the most basic concepts during the lecture. In the flipped classroom, the transmission of information that would have been conveyed during a lecture is moved online, either via (ideally short) videos or via text (Fig. 1). Class time is used for interactive learning activities—of the sort that might traditionally be left out of class—and thus creates opportunities for increased student engagement, more faculty–student contact, and deeper learning (Jarvis et al., 2014). Another possible benefit of the flipped classroom is the reduction in cognitive load during classes (Sirhan et al., 1999; Seery and Donnelly, 2012).
Fig. 1 Comparison of features between a traditional lecture and flipped classroom.

Related pedagogies include peer instruction (Mazur, 1997, 2004, 2009), team-based learning (Team-Based Learning Collaborative, 2013), just-in-time teaching (Novak et al., 1999), and process-oriented guided inquiry learning (“POGIL: Process-Oriented Guided-Inquiry Learning,” 2009; POGIL, 2011). While the “flipped classroom” is not a new pedagogy, the term conjures a defined image of how a flipped course might be structured and where the content might go.

Many reports of the flipped classroom involve suggestions for implementing this model (Pearson, 2012a; Sams and Bergmann, 2013; Lasry et al., 2014; Slezak, 2014; Vaughan, 2014). Some studies have investigated the value of the flipped classroom model, although the evidence is still coming in (Goodwin and Miller, 2013). For example, a number of studies have reported positive student feedback (Pearson, 2012b; Enfield, 2013; Love et al., 2013; McGivney-Burelle and Xue, 2013; Smith, 2013; Wilson, 2013). Others have reported increased student engagement (Seery, 2014) and effects on the classroom environment (Strayer, 2012). Academic (grade) improvements in small classes have been reported at the high school level (Fulton, 2012), in undergraduate math (Love et al., 2013) and chemistry (Trogden, 2014) courses, and at the graduate level (Tune et al., 2013).

Given the existing literature suggesting improved learning outcomes for the flipped course model (including non-academic ones) and the opportunity to optimize precious face-to-face time with students, organic chemistry and spectroscopy courses were converted to this format. Herein, the following are described: (1) the flipped course structures and the conversion process for one small and three large chemistry courses and (2) the results of a multi-level evaluation of the large courses that was conducted, using Guskey's evaluation framework (Guskey, 2002), to determine the impact of the flipped model on students' academic success.

Theoretical framework for the courses

The theoretical framework used when designing and teaching this course was constructivism, specifically von Glasersfeld's position of radical constructivism (Bodner, 1986; Glasersfeld, 1989). According to this framework, learners actively construct their own knowledge by building upon prior experiences and conceptions. Knowledge is not transferred intact (Bodner et al., 2001), and new knowledge must fit satisfactorily within the context in which it arises. To achieve meaningful learning, Cooper and coworkers (2010) summarized Novak's description (2010) as follows: “students first must possess relevant prior knowledge upon which to anchor new knowledge. Second, this new knowledge must be perceived by the student as relevant to other knowledge. Finally, the learner must consciously and deliberately choose to relate new knowledge to knowledge the learner already knows in some nontrivial way” (p. 869). While the learner constructs his or her own knowledge, social interactions are also important. Bodner (2006) pointed out that: “Learning is a complex process that occurs within a social context, as the social constructivists point out, but it is ultimately the individual who does the learning.” (p. 13)

In the courses described here, students were guided through the learning process. The course environment involved many different types of individual and social learning activities, thus providing opportunities for students to construct their own knowledge. They were also confronted with many situations in which they had to question the match between experimental evidence and their existing knowledge. These exercises required students to consider common errors and misconceptions, as will be described below.

Courses

The courses included Organic Chemistry I (CHM 1321, ∼400 students, winter 2014), Organic Chemistry II (CHM 2120, ∼400 students, fall 2013), Applications of Spectroscopy in chemistry (CHM 3122, ∼140 students, fall 2013), and Applications de la spectroscopie en chimie (CHM 3522—the French version of 3122, 17 students, fall 2013). Classes were held in large, theatre style auditoriums, with the exception of CHM 3522, which was held in the active learning classroom pictured in Fig. 2 (uOttawa: Teaching and Learning Support Service, 2013; Abraham, 2014).
Fig. 2 uOttawa's active learning classroom.

The breakdown of marks for each course is shown in Table 1. TopHat (TopHat, 2014) was used as the Classroom Response System (CRS), which was accorded a 5% participation grade. The pre-class tests were worth 5% of the final grade and were delivered through Sapling Learning (Sapling Learning, 2014) in the organic courses and through Blackboard Learn (“Blackboard Learn”, 2013)—the learning management system (LMS)—in the spectroscopy courses. The assignments were worth 10% and 0% of the final grade in organic chemistry and spectroscopy, respectively. They were delivered with Sapling Learning in the organic chemistry courses and as pdf files via the LMS in the spectroscopy courses. Organic Chemistry I additionally had a laboratory component. For assessments with a weighting range, the weighting that gave each student the best final grade was used.

Table 1 The weighting of assessments in each course
| Course | CRS (%) | Pre-class tests (%) | Assignments (%) | Lab (%) | Midterm 1 (%) | Midterm 2 (%) | Final exam (%) |
|---|---|---|---|---|---|---|---|
| Organic Chemistry I | 5 | 5 | 10 | 15 | 10–20 | 10–20 | 25–45 |
| Organic Chemistry II | 5 | 5 | 10 | — | 10–20 | 10–20 | 40–60 |
| Applications of spectroscopy (EN/FR) | 5 | 5 | Not graded | — | 20–30 | 20–30 | 30–50 |
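The "best weighting" rule can be computed with a simple greedy allocation: within each component's allowed range, give the most weight to the components on which the student scored highest. The paper does not specify how the weighting was chosen, so the sketch below is illustrative only; the student scores and the assumptions that weights vary continuously within the stated ranges and must sum to a fixed total are hypothetical.

```python
def best_weighting(components, total):
    """Maximize a student's weighted grade over per-component weight ranges.

    components: list of (score, min_weight, max_weight) tuples.
    total: combined weight that the ranged components must sum to.
    Because the objective is linear, a greedy allocation is optimal:
    start every component at its minimum weight, then pour the remaining
    weight into components in descending order of score.
    """
    weights = [lo for _, lo, _ in components]
    remaining = total - sum(weights)
    # Visit components from highest to lowest score.
    for i in sorted(range(len(components)), key=lambda i: -components[i][0]):
        _, lo, hi = components[i]
        add = min(hi - lo, remaining)
        weights[i] += add
        remaining -= add
    grade = sum(w * c[0] for w, c in zip(weights, components)) / 100
    return weights, grade

# Hypothetical Organic Chemistry II student: 80% on midterm 1 (weight 10-20),
# 60% on midterm 2 (10-20), 70% on the final exam (40-60); the ranged
# components must sum to 80% of the final grade (the other 20% is fixed).
weights, grade = best_weighting([(80, 10, 20), (60, 10, 20), (70, 40, 60)], 80)
```

Here the best outcome weights midterm 1 at its maximum (20) and fills the rest with the final exam, rather than maximizing the final-exam weight alone.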


Course structure

The weekly course structure is summarized in Fig. 3. Each week began (from the students' point of view) with reading the ILOs, followed by watching a video or reading the appropriate section in the textbook. Students completed a pre-class test before coming to class. Class time was dedicated to interactive learning activities. The weekly cycle ended with an online assignment (optional in the spectroscopy courses). The assignment from one week and the pre-class test for the following week were due on the same day and time, so that students had only one weekly deadline. Extra learning supports were available outside of class time, including tutorials, office hours, and a discussion forum. All the course components were designed to guide students toward achieving the intended learning outcomes of each module (Collis and Biggs, 1986; Krathwohl, 2002b).
Fig. 3 The weekly class structure.

The structure of the course, expectations, and reasons for the choice of this format were clearly communicated to students in the syllabus, in an introductory video, and in the first class of the year. The structure remained consistent and predictable throughout the course.

The online component of a course risks becoming an exhaustive list of information, links, and resources. This can be overwhelming and make it difficult for the student to know how to navigate and prioritize the resources. To avoid this data dump, the course's learning management system (LMS)—Blackboard Learn (“Blackboard Learn,” 2013)—guided students' progress through the main course content (Fig. 4). The “Modules” link in the left menu bar brought students to the suggested order to follow. The system also provided quick access to frequently accessed items, such as class notes and online homework, and extra resources including past exams, the discussion forum, and one of the optional course textbooks (Smith, 2011; Klein, 2012; Wade, 2013).


Fig. 4 The course content was organized in the learning management system by presenting the content and activities in the recommended order and by giving quick links.

Learning outcomes tied the course together

These courses took a learning outcome-based approach to focus on what the student demonstrably knows and can do after instruction, rather than what the instructor teaches (Biggs and Tang, 2007). The intended learning outcomes (ILOs)— what the instructor wants students to be able to do by the end of the course—were constructed based on the Structure of Observed Learning Outcomes taxonomy (SOLO) (Biggs and Tang, 2007) and the cognitive domain of the modified Bloom taxonomy (Krathwohl, 2002a). Learning outcomes can be identified at the program level, course level, or in an area within a course (Stoyanovich et al., in press; Towns, 2009).

For these courses, the ILOs were developed for each module (further described in Appendix I) and then analyzed to decide which would be taught out of class and which in class, with many being addressed in both. In general, pre-class activities were dedicated to introductory and basic concepts—lower SOLO and Bloom levels, such as definitions and general mechanisms. In-class activities were used for deeper learning—higher SOLO and Bloom levels. The assessments (e.g., assignments, midterms, and exams) were aligned with the learning activities and the ILOs. Appendix II provides an example of one learning module in which the ILOs were aligned with the learning activities and assessments. Below, the general structure of each course component is described.

Before class

The intended learning outcomes, video notes, videos, and class notes were posted for students at the beginning of each section of the course. The video notes outlined the concepts for the video, as well as content that would be difficult or time-consuming to copy by hand such as spectra and complex molecules. Students could annotate them as they watched the videos, just as they would if they were taking notes during a lecture in person.

The videos were recorded and edited using Camtasia (“Camtasia: Screen Recording and Video Editing for Anyone,” 2014). The program's screen capture function was used to capture handwritten notes, animations (Deslongchamps, 2007), and to show other data (e.g., pKa tables). The camera was used for demonstrating three dimensional analysis using Darling Molecular Models (“Darling Molecular Models,” 2010) and for manipulating sticky notes for spectral analysis (Flynn, 2012b). A Bamboo tablet (“Bamboo”, 2014) and Notability (“Notability”, 2014) were used to create the handwritten notes.

The videos were approximately ten minutes long, on average, with the longest being approximately twenty minutes; ideally, the videos would be kept to five to ten minutes in length (Table 2). Designing and creating the videos was the most time-consuming part of moving to the new course structure; creating a video required approximately ten times the video's length. The total number of video hours may seem very short compared to the lecture hours that were removed, but the lectures were condensed (e.g., by replaying handwritten sections at increased speed) and focused on the absolutely essential material (e.g., additional examples and links to real life were built into the in-class questions).

Table 2 Number and duration of the videos created for the courses
| Course | Number of videos used in the course | Total video time (h) | Average video length (min) | Maximum video length (min) | Minimum video length (min) | Percentage created by the author (%) |
|---|---|---|---|---|---|---|
| Organic Chemistry I (CHM 1321) | 28 | 6.9 | 9.11 | 16.35 | 1.55 | 93 |
| Organic Chemistry II (CHM 2120) | 24 | 3.6 | 9.04 | 15.73 | 3.25 | 88 |
| Applications of spectroscopy (CHM 3122 and 3522) | 17 | 3.2 | 11.31 | 21.35 | 2.62 | 100 |


After watching the pre-class videos or reading the appropriate sections in the textbook, students completed pre-class tests using Sapling Learning (Sapling Learning, 2014) in the organic chemistry courses or using the LMS for the spectroscopy courses. These tests were posted for students by the Thursday of one week and were due two hours before the first course of the following week (e.g., due at 8 am on a Monday for a 10 am class). The online homework system (Sapling Learning, 2014) was selected for organic chemistry because there were many questions for which students could draw molecular structures and mechanisms and receive immediate feedback for their answers. Students' answers were reviewed for questions that had the lowest success rates as determined by the program. This process required ten to fifteen minutes per assignment (Flynn, 2012a) and provided a starting point for creating in-class activities.

This “before class” phase started students on the path of learning new knowledge (Cooper et al., 2010) and provided evidence (in the form of pre-class test results) of their knowledge and abilities before they came to class.

In class

In-class time was devoted to problem-solving activities designed to help students achieve the ILOs. The class notes were posted at least twenty-four hours before each class. These notes were essentially an outline of the activities for the class and contained material that was time-consuming to copy by hand (e.g., spectral data, large molecules, and definitions). Providing these data before the class freed up even more class time for learning activities.

In class, a SMART Podium (“SMART Podium™ 500 Series”, 2014)—essentially an electronic whiteboard—was used to record notes. A document camera served to project documents, drawings, and molecular models. ECHO360 was used to record the classes, and links to these recordings were posted on Blackboard. TopHat (TopHat, 2014)—a classroom response system (CRS)—was used to capture students' responses to questions, providing the students and professor with immediate feedback. Other resources included Organic Chemistry Flashware (Deslongchamps, 2007) and YouTube videos (“YouTube,” 2014). In 2014, an iPad (“iPad,” 2014) was incorporated, allowing the professor to move wirelessly through the classroom while retaining access to the projector. On average, 175 questions were asked per course (∼eight questions per eighty-minute class). All the activities involved formative feedback mechanisms and most included social components.

Students' results on the pre-class tests informed the class activities (Flynn, 2012a). For example, a mechanism question that students answered poorly on Sapling could be brought into class as a multiple-choice question. The question shown in Fig. 5 was created using the most prevalent answers to a pre-class test question that the majority of students answered incorrectly. In it, students were asked to identify the first step in the reaction mechanism between cyclohexene and bromine.


Fig. 5 Students' incorrect (A–C) and correct (D) answers to a pre-class test question were transformed into an in-class question.

There were many other types of in-class activities, such as think-pair-share and predict-observe-explain. Questions related to reaction mechanisms were asked via the CRS using a numeric answer method described by Ruder and Straumanis (2009). In another question type, students worked in groups to prepare written answers to questions (molecular structures or explanations). A few of those answers were (anonymously) projected to the class and the class voted on the best answers.

For longer questions of the type commonly encountered in the spectroscopy course, CRS questions were asked periodically to monitor students' progress. These might ask students to identify a signal that should stand out to them, based on the data provided. For the example in Fig. 6, the third year students were asked to assign all the signals in the proton NMR spectrum of codeine (“Codeine NMR problem,” n.d.) (Fig. 6a); they were also provided with the 13C, DEPT135, COSY, and HMQC spectra. In the first question, students were asked to identify the 1H NMR signal of the hydroxyl proton. The majority of students (82%) incorrectly answered “G” (Fig. 6b). They justified their answer by saying that hydroxyl protons give broad, rounded signals as in signal “G.” This particular question relating to an acidic proton also served to address a likely misconception: that acidic protons are always broad singlets. Students were reminded to make sure their answer reflected the data. After a second vote, 60% of students had the correct answer, “K” (Fig. 6c). Students explained that according to the HMQC data (spectrum not shown), proton “G” was bound to a carbon while only proton “K” was not.


Fig. 6 Students were asked to assign the signal for the hydroxyl proton in codeine. (a) 1H NMR spectrum, (b) distribution of responses the first time students answered, (c) distribution of responses the second time students answered (answer: K). Note: only the four most prevalent responses are shown.

These in-class questions, which were graded on participation only, provided a regular feedback mechanism to and from students with respect to their achievement of various learning outcomes. A few more examples of in-class questions are provided in Appendix II and elsewhere (Flynn, 2011, 2012a, 2012b). Through the in-class portion of the course, students built on their prior knowledge and explicitly made connections with that knowledge (Cooper et al., 2010). They also had a social context in which to learn (Bodner, 2006).

Assignments

Assignments were used to close the loop on the learning from the week and were more challenging than the pre-class tests. By answering assignment questions and checking their answers, students could see whether they had achieved the intended learning outcomes for that module. Ample practice was provided to help them achieve those ILOs and construct their own knowledge (Glasersfeld, 1989).

The students were asked to think more deeply through questions that came up throughout the week (i.e., mid to high SOLO and Bloom levels). For example, students were asked in one case to draw the product that would result from the electron-pushing arrows drawn in Fig. 5A. As with the pre-class tests, assignment questions that were not well-answered by a majority of students were brought into the following class as learning activities (Flynn, 2012a).

Assessment

The midterm and final exams were aligned with the intended learning outcomes. The questions were targeted to the mid to upper Bloom and SOLO levels and they closely resembled the types found in class, assignments, and extra problem sets. To avoid asking low level Bloom and SOLO questions (e.g., memorization and isolated knowledge), the questions from tests, assignments, and the CRS were never directly copy/pasted into midterms and exams. Students therefore had to move beyond rote memorization in order to succeed in the course, and they were given many opportunities to learn to do so.

Impact of the flipped courses on student learning

A number of components of the organic chemistry courses were analyzed to estimate the impact on student achievement. The framework used to evaluate the new flipped structure was Guskey's evaluation framework (Guskey, 2002, 2010). Guskey's framework—originally developed to measure teachers' professional development—is very similar to Kirkpatrick's evaluation model (Kirkpatrick, 1996), but it additionally addresses organizational support and change. Because the structures of the courses were changed significantly, organizational support was a particularly important aspect to address. The CIPP (Context–Inputs–Process–Products) evaluation model (Stufflebeam, 1983) was also considered, but was deemed too broad for this initial study, as its multiple components involve many studies whose results must be integrated and evaluated over a longer time period.

In Guskey's framework, level 1 focuses on students' satisfaction with the learning activities and experience; for example, whether they felt that the activities were useful, helpful, and what types of issues arose (e.g., technical difficulties or understanding the instructions). Level 2 focuses on measuring aspects such as the knowledge, skills, and attitudes gained, based on the attainment of specific learning goals. Level 3 analyzes how changes are supported (or not) by the organization (e.g., university or professional community). Change could be supported by encouraging development, making resources available (including time, money, and expertise), and sharing successes. This level is the main difference from the Kirkpatrick evaluation model (Kirkpatrick, 1996). A lack of organizational support and change can undermine and even halt development, making this level of evaluation essential. Level 4 focuses on students' use of new knowledge and skills, such as whether any behaviour changes (e.g., problem-solving strategy) occurred after the learning experience. Finally, level 5 addresses student learning outcomes, or the “bottom line,” such as whether students' achievement, confidence, or attendance has improved, or whether dropouts have decreased. The student learning outcomes can be analyzed at the cognitive, affective, and psychomotor levels.

The research questions (RQs) targeted in this study are shown in Table 3. The university's Office of Research Ethics and Integrity was consulted and ethics approval was deemed unnecessary because of the type of study and confidentiality and anonymity of all student data (Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, and Social Sciences and Humanities Research Council of Canada, 2010).

Table 3 Guskey evaluation levels and associated research questions for this study
| Evaluation level | Research questions (RQs) |
|---|---|
| 1. Reactions | 1. What were students' reactions to the flipped format? |
| 2. Learning | 2. Did participants acquire the intended knowledge and skills? |
| 3. Organization support & change | 3. How was implementation advocated, facilitated, and supported, if at all? |
| | 4. What resources were made available, if any? |
| | 5. How were successes recognized and shared, if at all? |
| 4. Use of new knowledge and skills | Not addressed |
| 5. Learning outcomes | 6. How did the change affect student performance or achievement? |
| | 7. How did the change affect the withdrawal rate? |


RQ 1: What were students' reactions to the flipped format?

At the first Guskey level (Table 3), course evaluations were used to quantify and qualify students' reactions to the new format. Twenty minutes at the beginning of one class period were set aside for students to fill out anonymous, standardized course evaluations. One component of the evaluations consisted of statements answered using a Likert scale; the second component was a space for students' comments and suggestions. A weighted average out of five, with five being high, was calculated based on students' ratings for each statement. While this author's course evaluations had already been above the university's averages (4.57, 4.17, and 4.16 for the three statements, respectively), the courses taught in the flipped format were above the author's average (Fig. 7).


Fig. 7 Results from the three key statements on anonymous, annual course evaluations. Legend: course name (enrollment, response rate). Answer options for the first two statements: almost always/often/sometimes/rarely/almost never. Answer options for the third statement: excellent/good/acceptable/poor/very poor.

Students' comments on the second part of the course evaluation were extremely positive. The recurring positive comments included:

• “The fact that we do problems in class better prepare us for the assignments and exams”

• “Top Hat, although sometimes cumbersome, enhances learning and problem solving, while giving the prof real-time evaluation of comprehension”

• “Love pre-class tests and assignments. Keeps us on top of the game”

• “The Sapling practice opportunities, PRE-CLASS VIDEOS [sic], and DGDs were all amazing tools to build a concrete foundation of learning… the way you teach helped me learn so much more”

Criticisms and suggestions for improvement were few, the main ones being: (i) that the desks were small and cramped (a comment made only by the students in the lecture auditoriums, not in the active learning classroom); (ii) that the video quality could be improved (another program was used for the first few videos, which resulted in lower sound quality; this issue was resolved by switching to Camtasia, which also offered other editing advantages); and (iii) that the second midterm was too long in the spectroscopy course.

Although it was expected in the flipped model that students would “push back” against the course format and ask to be “just [taught] what I need to know [i.e., lecture]” (Colautti, 2014), students provided only very positive comments about the format. Woods (2006) described an analogy to a grieving process frequently experienced by students who are confronted with a major change from an accustomed learning format to a new one, such as problem-based learning (PBL). This was likely experienced by students in an upper year laboratory course that was converted to a PBL format (Flynn and Biggs, 2011). It is still possible that some students went through a similar process but that they bounced back from it quickly.

The class environment was also affected by the type of room. The small (seventeen-student) French spectroscopy course (CHM 3522) was initially taught in a small lecture classroom until an active learning classroom (uOttawa: Teaching and Learning Support Service, 2013; Abraham, 2014) became available (Fig. 2). Although the course had the flipped format, students seemed hesitant to ask questions, volunteer explanations, and work in groups; this was perhaps partly because of the sound quality of the room (sounds echoed). When relocating to the active learning classroom became an option, students voted unanimously to do so and the entire class environment changed. The environment became animated and the students worked together at their tables (Fig. 2) on the questions. They frequently debated answers (in a respectful fashion) and volunteers from each table regularly answered questions. Furthermore, students worked through the class problems—which had a gradient of difficulty—at their own pace.

There are many reasons why students might have enjoyed the flipped course format, although this has not yet been studied in detail for these courses. As described by Smith (2013), these reasons could include: the flexibility of when to watch the pre-class videos and the option to re-watch them, the predictable class structure with clear expectations, the ability for students to learn at their own pace (by spending more/less time on harder/easier concepts), the active class environment, the ability to check their own understanding, etc.

RQ 2: Did participants acquire the intended knowledge and skills?

To determine whether students had learned more in the flipped model than in the original course model, final exam grades were compared between two Organic Chemistry II courses taught by the author (2011 versus 2013). The course in 2011 was taught in an active lecture format, in which short lecture segments were punctuated with questions using a CRS. The final exams were identical and the students had not seen any of the questions before. The average grade on the final exam was higher in 2013 (M = 65%, SD = 18%) than in 2011 (M = 63%, SD = 19%). A one-tailed t-test for independent samples revealed a statistically significant difference between the data, t(786) = 1.92, p = 0.03. The effect size was small (Cohen's d = 0.11). The higher exam grades in the flipped format suggested that students had learned more in the flipped course (2013) than in the active lecture course (2011). This effect needs to be studied in greater detail using an instrument—such as a concept inventory—to determine the extent to which specific learning goals were achieved.
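The statistics above can be recomputed from summary data alone. The sketch below derives the pooled-variance t statistic and Cohen's d from the reported means and standard deviations; the group sizes (n1 = 409, n2 = 379) are assumptions chosen only so that the degrees of freedom match the reported t(786), since the actual numbers of exam takers are not stated, and the rounded means and SDs will not exactly reproduce the reported t = 1.92.

```python
import math

def pooled_t_and_d(m1, s1, n1, m2, s2, n2):
    """Independent-samples t statistic (pooled variance) and Cohen's d
    from group means (m), standard deviations (s), and sizes (n)."""
    df = n1 + n2 - 2
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df)
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m1 - m2) / sp
    return t, d, df

# Reported summary statistics: 2013 flipped (M=65, SD=18) vs 2011 active
# lecture (M=63, SD=19); group sizes are hypothetical (chosen so df = 786).
t, d, df = pooled_t_and_d(65, 18, 409, 63, 19, 379)
```

With these rounded inputs the effect size reproduces the reported d = 0.11, while t comes out near 1.5 rather than 1.92, which is consistent with the means having been rounded to whole percentages.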

RQs 3–5: Was implementation advocated, facilitated, and supported? Were sufficient resources made available? Were successes recognized and shared?

These questions have not been studied in detail, but to date, the organization (i.e., uOttawa) has been supportive of this initiative. The author was part of a team of professors who teach organic chemistry courses at uOttawa who worked to modernize the organic curriculum, which now has a mechanistic structure (Flynn and Ogilvie, submitted). However, each professor chose the pedagogical approach taken within that structure. Thus, this author was able to develop the flipped course for her own classes. The Teaching and Learning Support Service (TLSS) (“uOttawa Teaching and Learning Support Service,” 2014) at the university provided essential support from each of its four units: members of the Centre for eLearning were available to discuss best practice for designing the online aspect of the course; the Multimedia Distribution Service was available by phone or in person during and outside of class time to assist with any technical difficulties (and they were fast and technically proficient); the Centre for Mediated Teaching and Learning provided training in using all the options in the active learning classroom (Fig. 2) as well as technical support when required; members of the Centre for University Teaching were always available for pedagogical discussions. The author was invited to make a presentation to the university's Board of Governors about the flipped format in the active learning classroom and her use of this room was promoted in other areas (Abraham, 2014; Smith, 2014). The format and experiences discussed here have been used by the TLSS as an example of one way to structure online and in-class components of a non-traditional course in its Blended Course Design Institute (“Blended Course Design Institute,” 2014).

RQs 6 & 7: How did the change affect student performance or achievement? How did the change affect the withdrawal rate?

Organic Chemistry I and II results were analyzed because the author has taught those courses for many years, so historical data were available. Students' grades, withdrawal rates (i.e., dropouts), and failure rates were used as measures of student performance and achievement. The flipped courses were compared to courses from previous years that were taught with the same course content.

First, chi-square tests of independence were performed to compare the withdrawal rates in the flipped course format with those from previous years (Table 4). The analyses revealed statistically significant reductions in withdrawal rates in both Organic Chemistry I and Organic Chemistry II taught in the flipped course format, χ2 (1, n = 4) > 3.84, p < 0.05, with two exceptions: Organic Chemistry I in 2010, χ2 (1, n = 4) = 0.87, p = 0.35, and Organic Chemistry II in 2011, χ2 (1, n = 4) = 2.84, p = 0.09. On average, the flipped format was associated with absolute reductions in withdrawal risk of 3.1% (Organic Chemistry I) and 4.2% (Organic Chemistry II).
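A chi-square test of independence on a 2×2 enrolment-by-withdrawal table can be sketched with the standard library alone, since the df = 1 survival function reduces to the complementary error function. The withdrawal counts below are reconstructed from the rounded rates in Table 4 (9% of 792 for Organic II in 2012; 3% of 409 for the flipped 2013 course), so the statistic only approximately reproduces the published value.

```python
from math import sqrt, erfc

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test of independence for the 2x2 table
    [[a, b], [c, d]]; returns (chi2, p) with df = 1."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p = erfc(sqrt(chi2 / 2))   # chi-square survival function for df = 1
    return chi2, p

# Organic II, 2012 vs. flipped 2013 (Table 4); counts reconstructed
# from the rounded withdrawal rates, so values are approximate.
withdrew_2012, stayed_2012 = 71, 792 - 71
withdrew_flip, stayed_flip = 12, 409 - 12

chi2, p = chi2_2x2(withdrew_2012, stayed_2012, withdrew_flip, stayed_flip)
arr = withdrew_2012 / 792 - withdrew_flip / 409   # absolute risk reduction
print(f"chi2(1) = {chi2:.2f}, p = {p:.2g}, ARR = {arr:.3f}")
```

The same calculation applies to the failure-rate comparisons in Table 5, with failures in place of withdrawals.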

Table 4 Comparison of withdrawal rates between the flipped courses and historical data
| Course | Year | Original enrolment | Withdrawal rate (%) | df | χ2 | p | Absolute risk of withdrawal reduction |
|---|---|---|---|---|---|---|---|
| Organic I | 2010 | 1096 | 3 | 1 | 0.87 | 0.352 | 0.010 |
| | 2011 | 1048 | 5 | 1 | 4.55 | 0.033 | 0.027 |
| | 2012 | 1152 | 5 | 1 | 3.85 | 0.050 | 0.024 |
| | 2013 | 1226 | 9 | 1 | 15.94 | <0.001 | 0.062 |
| | Average (2010–2013) | 1131 | 6 | 1 | 5.90 | 0.015 | 0.031 |
| | Flipped (2014) | 364 | 2 | | | | |
| Organic II | 2009 | 707 | 7 | 1 | 10.59 | 0.001 | 0.047 |
| | 2010 | 801 | 7 | 1 | 10.00 | 0.002 | 0.044 |
| | 2011 | 786 | 5 | 1 | 2.84 | 0.092 | 0.020 |
| | 2012 | 792 | 9 | 1 | 15.26 | <0.001 | 0.059 |
| | Average (2009–2012) | 772 | 7 | 1 | 9.33 | 0.002 | 0.042 |
| | Flipped (2013) | 409 | 3 | | | | |

The df, χ2, p, and absolute risk reduction columns compare each year with the flipped course.


Chi-square tests of independence were also performed to compare the failure rates in the flipped course format with those from previous years (Table 5). The analyses revealed statistically significant reductions in failure rates in both Organic Chemistry I and Organic Chemistry II taught in the flipped course format compared to all previous years; for all comparisons, χ2 (1, n = 4) > 3.84, p < 0.001. On average, the flipped format was associated with absolute reductions in failure risk of 14.3% (Organic Chemistry I) and 10.4% (Organic Chemistry II).

Table 5 Comparison of failure rates between the flipped courses and historical data
| Course | Year | Enrolment | Failure rate (%) | df | χ2 | p | Absolute risk of failure reduction |
|---|---|---|---|---|---|---|---|
| Organic I | 2010 | 1058 | 24 | 1 | 55.26 | <0.001 | 0.179 |
| | 2011 | 994 | 20 | 1 | 38.99 | <0.001 | 0.142 |
| | 2012 | 1096 | 13 | 1 | 13.10 | <0.001 | 0.069 |
| | 2013 | 1120 | 24 | 1 | 57.83 | <0.001 | 0.184 |
| | Average (2010–2013) | 1067 | 20 | 1 | 39.92 | <0.001 | 0.143 |
| | Flipped (2014) | 355 | 6 | | | | |
| Organic II | 2009 | 655 | 15 | 1 | 13.14 | <0.001 | 0.072 |
| | 2010 | 744 | 20 | 1 | 33.80 | <0.001 | 0.130 |
| | 2011 | 749 | 14 | 1 | 14.59 | <0.001 | 0.076 |
| | 2012 | 724 | 19 | 1 | 31.40 | <0.001 | 0.124 |
| | Average (2009–2012) | 718 | 17 | 1 | 23.69 | <0.001 | 0.104 |
| | Flipped (2013) | 398 | 7 | | | | |

The df, χ2, p, and absolute risk reduction columns compare each year with the flipped course.


Finally, the students' grades in the flipped course were compared to those in previous years. The descriptive statistics are shown in Table 6. The median and first and third quartiles were included to describe the grades because the data were not normally distributed.

Table 6 Descriptive statistics of students' grades^a

| Course | Year | 1st quartile | Median | Mean | 3rd quartile | W | p | AUC^b |
|---|---|---|---|---|---|---|---|---|
| Organic I | 2010 | 2 | 5 | 4.77 | 8 | 226 883 | <0.001 | 0.63 |
| | 2011 | 2 | 5 | 5.06 | 8 | 204 655 | <0.001 | 0.61 |
| | 2012 | 3 | 5 | 5.32 | 8 | 219 470 | <0.001 | 0.65 |
| | 2013 | 2 | 4 | 4.56 | 7 | 246 834 | <0.001 | 0.65 |
| | Average (2010–2013) | 2 | 5 | 4.92 | 8 | 897 841 | <0.001 | 0.62 |
| | Flipped (2014) | 4 | 7 | 6.26 | 9 | | | |
| Organic II | 2009 | 3 | 6 | 5.32 | 8 | 142 097 | 0.005 | 0.55 |
| | 2010 | 2 | 5 | 4.91 | 8 | 172 636 | <0.001 | 0.59 |
| | 2011 | 3 | 5 | 5.22 | 8 | 165 804 | <0.001 | 0.56 |
| | 2012 | 2 | 5 | 4.69 | 7 | 174 874 | <0.001 | 0.61 |
| | Average (2009–2012) | 2 | 5 | 5.03 | 8 | 655 409 | <0.001 | 0.58 |
| | Flipped (2013) | 4 | 6 | 5.91 | 8 | | | |

W, p, and AUC are from the Wilcoxon–Mann–Whitney test comparing each year with the flipped course. ^a Grade values: A+ = 10 (90–100%), A = 9 (85–89%), A− = 8 (80–84%), B+ = 7 (75–79%), B = 6 (70–74%), C+ = 5 (65–69%), C = 4 (60–64%), D+ = 3 (55–59%), D = 2 (50–54%), E = 1 (40–49%), F = 0 (<40%). ^b AUC = area under the receiver operating characteristic (ROC) curve.
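The grade conversion in footnote a of Table 6 can be expressed as a small lookup, a minimal sketch assuming percentages are rounded to whole numbers before conversion (the band edges are exactly those in the footnote).

```python
# The 0-to-10 grade-point scale from the footnote to Table 6
# (uOttawa letter grades), expressed as an ordered band lookup.
GRADE_BANDS = [  # (minimum %, letter grade, grade points)
    (90, "A+", 10), (85, "A", 9), (80, "A-", 8), (75, "B+", 7),
    (70, "B", 6), (65, "C+", 5), (60, "C", 4), (55, "D+", 3),
    (50, "D", 2), (40, "E", 1), (0, "F", 0),
]

def to_grade_point(percent):
    """Convert a percentage grade to (letter, 0-10 grade point)."""
    for cutoff, letter, points in GRADE_BANDS:
        if percent >= cutoff:
            return letter, points
    return "F", 0   # defensive: covers any value below 0

print(to_grade_point(87))   # → ('A', 9)
print(to_grade_point(63))   # → ('C', 4)
```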


The flipped courses were compared to each of the previous years using the Wilcoxon–Mann–Whitney rank sum test. The unadjusted p values were adjusted for multiple testing with the Bonferroni–Holm correction. The grade distributions for both flipped organic chemistry courses were significantly different from each prior year's distribution (p < 0.01 and AUC ≥ 0.55 for all comparisons).
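The rank-sum comparison, Holm adjustment, and AUC computation can be sketched as below. The grade vectors are illustrative (the real analysis used the full course cohorts), the normal approximation omits the tie correction in the variance, and AUC is recovered as U divided by the product of the sample sizes, i.e., the probability that a randomly chosen flipped-course grade exceeds a randomly chosen prior-year grade.

```python
from math import sqrt, erfc

def rank_sum_test(x, y):
    """Wilcoxon-Mann-Whitney test via the large-sample normal
    approximation, with average ranks for ties (no tie correction
    in the variance). Returns (U, two-sided p, AUC = U / (nx * ny))."""
    nx, ny = len(x), len(y)
    pooled = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
    ranks = [0.0] * (nx + ny)
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j][0] == pooled[i][0]:
            j += 1
        for k in range(i, j):            # tied block gets the average rank
            ranks[k] = (i + j + 1) / 2
        i = j
    r_x = sum(r for r, (v, grp) in zip(ranks, pooled) if grp == 0)
    u = r_x - nx * (nx + 1) / 2          # Mann-Whitney U for sample x
    mu, sigma = nx * ny / 2, sqrt(nx * ny * (nx + ny + 1) / 12)
    z = (u - mu) / sigma
    return u, erfc(abs(z) / sqrt(2)), u / (nx * ny)

def holm_adjust(pvals):
    """Bonferroni-Holm step-down adjustment of a list of p values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted, running = [0.0] * m, 0.0
    for step, i in enumerate(order):
        running = max(running, (m - step) * pvals[i])
        adjusted[i] = min(1.0, running)
    return adjusted

# Illustrative grade-point vectors (0-10 scale), not the actual cohorts.
flipped = [4, 7, 6, 9, 7, 8, 5, 6, 10, 7]
previous = [2, 5, 4, 8, 5, 6, 3, 5, 7, 4]
u, p, auc = rank_sum_test(flipped, previous)
print(f"U = {u}, p = {p:.3f}, AUC = {auc:.2f}")
print("Holm-adjusted:", holm_adjust([0.01, 0.04, 0.03]))
```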

Thus, student achievement increased in both levels of organic chemistry courses in the most recent teaching year, as evidenced by higher grades and decreased failure rates; withdrawal rates also decreased compared to previous years. These were the same courses in which the flipped course model was incorporated. While it could not be concluded that the flipped classroom model caused the improvements in withdrawal rates, failure rates, and final grades, the evidence suggested at least a correlation with the flipped classroom model. Further investigation and exploration of the flipped classroom model in chemistry are certainly warranted.

Conclusions

The conversion of large and small chemistry courses (organic & spectroscopy) to flipped course models at the first to third year undergraduate level was described.

The most challenging and time-consuming aspects of the conversion to a flipped format were planning how to structure the in- and out-of-class components and preparing the videos. Moving forward, small, iterative improvements will be made to the courses, such as improving the quality of the videos. Improvements to course assessment will also be explored, including aligning the assessments with the social nature of the class environment. For example, a team-based component to a midterm (Gilley and Clarkston, 2014; Rieger and Heiner, 2014) was piloted with a small class in the fall of 2014.

Many factors seemed to contribute to the success of this endeavour, including: (1) a structured course format that kept the students' responsibilities predictable (e.g., with consistent deadlines) while communicating high expectations; (2) facile access to technical support: although not often needed, the rapid technical support from the Teaching and Learning Support Service was invaluable (“uOttawa Teaching and Learning Support Service,” 2014); (3) teaching assistants who reviewed assignments and communicated areas of student difficulty; (4) this author's previous experience in classroom management: having previously taught lectures that were frequently punctuated by active learning opportunities using CRS questions facilitated the transition to a fully flipped format; and (5) students' openness to working in a new classroom format.

The metrics used to measure the success of the course conversion in large and small classes suggest a positive effect of the flipped classroom model, even though a causal relationship could not be concluded. Only a very small part of a complex puzzle has been studied here. In the future, other factors that might have caused the positive effects observed should also be considered, including social, emotional, experiential, and cultural factors. Other potential outcomes of the new classroom model could also be investigated, such as its impact on students' argumentation skills (Kulatunga et al., 2013), conceptual change (Duit and Treagust, 2003), and metacognitive ability (Sandi-Urena et al., 2011). Regardless of the reasons for the apparent success with the flipped class model, it will be used again in future years with the goal of improving student learning.

Appendix I. Writing learning outcomes using the SOLO and Bloom taxonomies, and SMART goal-setting principles

The SOLO taxonomy (Table 7) describes “how a learner's performance grows in complexity when mastering many academic tasks” (Biggs and Tang, 2007). At the prestructural level, SOLO 1, there is little evidence of learning. At the unistructural level, SOLO 2, the student learns quantitative information (e.g., discrete facts and theories), deals with declarative knowledge such as terminology, and uses one single aspect without making connections. At the multistructural level, SOLO 3, the student continues learning quantitative information and declarative knowledge and can deal with several aspects, but does not make connections between them. At the relational level, SOLO 4, the student's competences have increased and become qualitative as well as quantitative: the student can make connections between several aspects or concepts and demonstrate how they fit together. At the extended abstract level, SOLO 5, the student goes beyond the information and explanations that were explicitly provided; the student can analyze concepts from different perspectives, generalize, create, and transfer ideas to new areas.
Table 7 Outline of the SOLO taxonomy and verbs commonly associated with each level (Biggs and Tang, 2007)
| SOLO level | 1 Prestructural | 2 Unistructural | 3 Multistructural | 4 Relational | 5 Extended abstract |
|---|---|---|---|---|---|
| At this level, the student: | Shows little evidence of learning | Deals with terminology; uses one single aspect without making connections | Deals with several aspects, but does not make connections between them | Makes connections between several aspects and how they fit together | Goes beyond what was given and transfers ideas to new areas |
| Associated verbs: | None (uses irrelevant information, misses the point, avoids the question) | Identify, define, recall, name, follow simple procedure | Enumerate, describe, list, combine, do algorithms | Compare/contrast, argue, solve, explain causes, analyze, relate, apply | Theorize, generalize, hypothesize, create, reflect |
| Example question at each level: | — | Decide whether the following molecule is chiral | Circle the aromatic rings, underline the anti-aromatic ring, and do nothing to the non-aromatic ring below | Propose a mechanism for the following reaction [ester + NaOH] and justify the form of the final product [carboxylate]^a | Propose a synthesis of the following molecule or propose a mechanism for a previously unseen reaction^b |

^a Requires knowledge of nucleophile/electrophile mechanisms, leaving group ability, and acid–base chemistry, hence making connections between several aspects. ^b Provided the students have not been asked or shown the answer to the same question previously.


As emphasized in multiple resources for writing learning objectives or outcomes (Collis and Biggs, 1986; Krathwohl, 2002b; Biggs and Tang, 2007; Brabrand and Dahl, 2009; Towns, 2009), the verb used in each ILO should describe an action that is outwardly visible, or demonstrable. For example, we can see the result of a student's drawing, but we cannot directly measure whether they understand or appreciate a concept. The ILOs should also be specific, measurable, achievable, relevant, and time-bound, i.e., “SMART”, an acronym that has been used in sport (“Setting SMART goals,” 2013), business (Drucker, 2012), and education (Conzemius and O'Neill, 2006; Towns, 2009) to promote the development of useful goals.

Appendix II

Table 8 All learning activities and assessments in the flipped class were aligned with the intended learning outcomes
| Intended learning outcomes (ILOs) | Pre-class videos (lower SOLO & Bloom levels) | Pre-class test (lower SOLO & Bloom levels) | In class (upper SOLO & Bloom levels) | Assignment (lower SOLO & Bloom levels) | Assessment, e.g., midterm (all levels) |
|---|---|---|---|---|---|
| ILO 1: draw the mechanism (including electron-pushing arrows) for the reaction of a π bond nucleophile with a halogen, in the presence of various solvents and other functional groups | Generic mechanisms (alkene + X2; alkene + X2 + alcohol solvent; alkene bearing a nucleophilic functional group + X2) | Basic questions related to exactly what was shown in the videos (ILOs 1–5, lower Bloom) | Mechanism questions (ILO 1) | More questions like the ones seen in class (ILOs 1–6) | Questions pertaining to all ILOs, with a range of questions (varying SOLO and Bloom levels) |
| ILO 2: decide which nucleophile is most likely to react, when there is more than one choice | Definitions (e.g., intramolecular and intermolecular) | | Deciding on the best choice of nucleophile (ILO 2); demonstration by students: intramolecular versus intermolecular reactions (ILO 2) | | |
| ILO 3: justify the stereo- and regiochemical outcomes of the reaction | The stereochemical and regiochemical outcomes of the reaction are explained | | Draw the product, given the starting materials and taking stereochemistry and regiochemistry into account (ILOs 1–3) | | |
| ILO 4: draw the molecular orbitals involved in the reaction | The molecular orbitals involved in the reaction are explained | | Animation (Flashchem): mechanism and orbitals involved in the reaction, with associated questions (ILOs 1, 3–5) | | |
| ILO 5: draw the reaction coordinate diagram for a mechanism | Reaction coordinate diagram for one of the mechanisms | | | | |
| ILO 6: analyze a product retrosynthetically: given a product, draw the reactants | | | Given the product, draw the starting materials (and other retrosynthetic analysis questions) (ILO 6) | | |


Appendix III. Examples of common types of in-class questions

To ask a mechanism question with the classroom response system, the atoms and bonds in the reactants were numbered (e.g., Fig. 8). To make the structure easier to read, electrons and bonds were coloured blue and atoms were coloured red. If students wanted to represent the C–Cl bond breaking and that bond's electrons going to chlorine (i.e., the correct answer), they would type “21”. Once approximately 80% of students had answered the question, they were given a 10–20 second warning and the results were examined. If the majority of students answered correctly, based on the histogram of results, the next activity was presented. If not, students were given time to discuss the answer and try to convince each other of the correct one (Mazur, 1997). The same question was then asked again, or a follow-up question was created, to ensure students had learned the concept. With Top Hat, new questions can be created and added quickly, even from a screenshot.
Fig. 8 Mechanism question asked with Top Hat, the classroom response system used in the course (answer: 21).

In another activity type, students were asked to draw the products of a reaction, such as the one shown in Fig. 9. Approximately eight students were randomly given a sticky note and were asked to draw their answer on it. They did not have to write their name on it and they could work with the students around them. The answers were labeled A, B, C, etc. and the sticky notes were projected to the screen using a document camera. Students then voted on the best answer and explained their choices to each other.


Fig. 9 Students who were selected at random drew their answers on sticky notes. They could work with their classmates and did not put their names on their answers (answer: C).

Students also submitted writing samples using this strategy. For example, they could be asked to decide why one species was a stronger base than another, and to justify their answer on a half sheet of paper. Some of these answers would be collected at random and students would vote first on the best answer and then on the best-structured answer. These activities generated a lot of excitement in the classroom.

A predict-observe-explain format was used frequently in the courses. For example (Fig. 10) in the spectroscopy courses, students (i) predicted the bond that would have the highest IR stretching frequency (by Top Hat vote), (ii) were shown the data, (iii) brainstormed reasons for the observed trend (written down without passing any judgment), (iv) voted for the best choice (B), and finally (v) explained their reasons to each other.


Fig. 10 Students (i) predicted the bond that would have the highest IR stretching frequency (by Top Hat vote), (ii) were shown the data, (iii) brainstormed reasons for the observed trend, (iv) voted for the best choice (B), and finally (v) explained their reasons to each other (answer: B).

Questions were created using the document camera to show specific views or conformations of molecules; demonstrations were used to convey ideas such as the relative rates of intra- versus intermolecular reactions; and Organic Chemistry Flashware was used to demonstrate acid/base concepts, reaction mechanisms, and molecular orbitals (Deslongchamps, 2007).

References

  1. Abraham S., (2014, February 12), The new active learning classroom, Gazette. Ottawa. Retrieved from http://www.gazette.uottawa.ca/en/2014/02/the-new-active-learning-classroom/.
  2. Bamboo, (2014), Bamboo. Retrieved June 2014, from http://wtc.wacom.com/bamboo/index.php.
  3. Biggs J. B. and Tang C., (2007), Teaching for Quality Learning at University: What the Student Does, 3rd edn, Maidenhead: Society for Research into Higher Education and Open University Press.
  4. Blackboard Learn, (2013), Blackboard Learn. Blackboard. Retrieved July 2013, from http://www.blackboard.com/.
  5. Blended Course Design Institute, (2014), Blended Course Design Institute. Retrieved June 2014, from http://www.saea.uottawa.ca/institut/.
  6. Bodner G. M., (1986). Constructivism: a theory of knowledge, J. Chem. Educ., 63, 873.
  7. Bodner G. M., (2006), Theoretical Frameworks for Research in Chemistry/Science Education, Upper Saddle River, NJ: Pearson/Prentice Hall, pp. 2–26.
  8. Bodner G. M., Klobuchar M. and Geelan D., (2001), The many forms of constructivism, J. Chem. Educ., 78, 1107.
  9. Brabrand C. and Dahl B., (2009), Using the SOLO taxonomy to analyze competence progression of university science curricula, J. Higher Educ., 58, 531–549.
  10. Camtasia: Screen Recording and Video Editing for Anyone, (2014), Camtasia: Screen Recording and Video Editing for Anyone. Retrieved May 2014, from http://www.techsmith.com/camtasia.html.
  11. Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, and Social Sciences and Humanities Research Council of Canada, (2010, December), TCPS 2 – Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. Government of Canada. Retrieved from http://www.pre.ethics.gc.ca/eng/index/.
  12. Codeine NMR problem, (n.d.), Codeine NMR problem. Retrieved June 2014, from http://www.chem.queensu.ca/facilities/NMR/nmr/chem806/example/problem-Cod.pdf.
  13. Colautti J., (2014, April 10), Hybrid learning lacks engagement, The Fulcrum.
  14. Collis K. and Biggs J. B., (1986), Using The SOLO Taxonomy, Set: Research Information for Teachers, 2, 4.
  15. Conzemius A. and O'Neill J., (2006), The Power of SMART Goals: Using Goals to Improve Student Learning, Solution Tree.
  16. Cooper M. M., Grove N. P., Underwood S. M. and Klymkowsky M. W., (2010), Lost in Lewis Structures: An Investigation of Student Difficulties in Developing Representational Competence, J. Chem. Educ., 87, 869–874.
  17. Darling Molecular Models, (2010), Darling Molecular Models. Retrieved November 2014, from http://www.molecularvisions.com/molecular-model-kits/cat_1.html.
  18. Deslongchamps G., (2007), Organic Chemistry Flashware. Nelson Education Ltd. Retrieved May 2014, from http://flashchem.nelson.com.
  19. Drucker P. F., (2012), The Practice of Management, Routledge, DOI: 10.4324/9780080942360.
  20. Duit R. and Treagust D. F., (2003), Conceptual change: a powerful framework for improving science teaching and learning, Int. J. Sci. Educ., 25, 671–688.
  21. Enfield J., (2013), Looking at the Impact of the Flipped Classroom Model of Instruction on Undergraduate Multimedia Students at CSUN, TechTrends: Linking Research and Practice to Improve Learning, 57, 14–27.
  22. Flynn A. B., (2011), Developing Problem-Solving Skills through Retrosynthetic Analysis and Clickers in Organic Chemistry, J. Chem. Educ., 88, 1496–1500.
  23. Flynn A. B., (2012a), Development of an Online, Postclass Question Method and Its Integration with Teaching Strategies, J. Chem. Educ., 89, 456–464.
  24. Flynn A. B., (2012b), NMR Interpretation: Getting from Spectrum to Structure, J. Chem. Educ., 89, 1210–1212.
  25. Flynn A. B. and Biggs R., (2011), The Development and Implementation of a Problem-Based Learning Format in a Fourth-Year Undergraduate Synthetic Organic and Medicinal Chemistry Laboratory Course, J. Chem. Educ., 89, 52–57.
  26. Flynn A. B. and Ogilvie W. W., (submitted), Mechanisms Before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow, J. Chem. Educ..
  27. Fulton K. P., (2012), 10 Reasons to Flip, Phi Delta Kappan, 94, 20–24.
  28. Gilley B. H. and Clarkston B., (2014), Collaborative Testing: Evidence of Learning in a Controlled In-Class Study of Undergraduate Students, Res. Teach., 43, 83–91.
  29. von Glasersfeld E., (1989), Cognition, construction of knowledge, and teaching, Synthese, 80, 121–140.
  30. Goodwin B. and Miller K., (2013), Evidence on flipped classrooms is still coming in, Educ. Leadership, 70, 78–80.
  31. Grove N. P., Hershberger J. W. and Bretz S. L., (2008), Impact of a spiral organic curriculum on student attrition and learning, Chem. Educ. Res. Pract., 9, 157–162.
  32. Guskey T. R., (2002), Does It Make a Difference? Evaluating Professional Development, Educ. Leadership, 59, 45–51.
  33. Guskey T. R., (2010), Professional Development and Teacher Change, Teach. Teach. Theory Pract., 8, 381–391.
  34. iPad, (2014), iPad. Retrieved November 2014, from http://www.apple.com/ca/ipad/.
  35. Jarvis W., Halvorson W., Sadeque S. and Johnston S., (2014), A Large Class Engagement (Lce) Model Based on Service-Dominant Logic (Sdl) and Flipped Classrooms, Educ. Res. Perspectives, 41, 1–24.
  36. Kirkpatrick D., (1996), Great ideas revisited, Train. Dev., 54–59.
  37. Klein D., (2012), Organic Chemistry, 1st edn, Hoboken, NJ: John Wiley & Sons, Inc.
  38. Krathwohl D. R., (2002a), A revision of Bloom's taxonomy: an overview, Theory Into Practice, 41, 212–218.
  39. Krathwohl D. R., (2002b), A taxonomy for learning, teaching, and assessing: an overview, Theory Into Practice, 41, 212–218.
  40. Kulatunga U., Moog R. S. and Lewis J. E., (2013), Argumentation and participation patterns in general chemistry peer-led sessions, J. Res. Sci. Teach., 50, 1207–1231.
  41. Lasry N., Dugdale M. and Charles E., (2014), Just in Time to Flip Your Classroom, Phys. Teach., 52, 34–36.
  42. Love B., Hodge A., Grandgenett N. and Swift A. W., (2013), Student learning and perceptions in a flipped linear algebra course, Int. J. Math. Educ. Sci. Technol., 45, 317–324.
  43. Mazur E., (1997), Peer Instruction: A User's Manual, New Jersey: Prentice Hall.
  44. Mazur E., (2004), Interactive teaching: promoting better learning using peer instruction and just-in-time teaching, New Jersey: Pearson Prentice Hall.
  45. Mazur E., (2009), Farewell, Lecture? Science, 323, 50–51.
  46. McGivney-Burelle J. and Xue F., (2013), Flipping Calculus, PRIMUS, 23, 477–486.
  47. Notability, (2014), Notability. Retrieved November 2014, from http://www.gingerlabs.com.
  48. Novak, J. D., (2010), Learning, Creating, and Using Knowledge, 2nd edn, New York: Routledge.
  49. Novak G., Gavrin A., Christian W. and Patterson E., (1999), Just-In-Time Teaching: Blending Active Learning with Web Technology, Upper Saddle River, NJ: Prentice Hall.
  50. Pearson G., (2012a), Biology Teacher's Flipped Classroom: ‘A Simple Thing, But It's so Powerful', Educ. Can. Retrieved November 2014, from http://www.cea-ace.ca/education-canada/article/biology-teacher%E2%80%99s-flipped-classroom-%E2%80%98-simple-thing-it%E2%80%99s-so-powerful%E2%80%99.
  51. Pearson G., (2012b), Students, Parents Give Thumbs-Up to Flipped Classroom, Educ. Can., 52. Retrieved November 2014, from http://www.cea-ace.ca/education-canada/article/students-parents-give-thumbs-flipped-classroom.
  52. POGIL, (2011), POGIL – Process Oriented Guided Inquiry Learning, POGIL Process Oriented Guided Inquiry Learning. Retrieved from http://www.pogil.org/.
  53. POGIL: Process-Oriented Guided-Inquiry Learning, (2009), POGIL: Process-Oriented Guided-Inquiry Learning, Upper Saddle River, NJ: Pearson Prentice Hall, vol. II, pp. 90–105.
  54. Rieger G. and Heiner C. E., (2014), Examinations That Support Collaborative Learning: The Students' Perspective, J. Coll. Sci. Teach., 43, 41–47.
  55. Sams A. and Bergmann J., (2013), Flip Your Students' Learning, Educ. Leadership, 70, 16–20.
  56. Sandi-Urena S., Cooper M. M. and Stevens R. H., (2011), Enhancement of Metacognition Use and Awareness by Means of a Collaborative Intervention, Int. J. Sci. Educ., 33, 323–340.
  57. Sapling Learning, (2014), Sapling Learning. Sapling Learning. Retrieved May 2014, from http://www2.saplinglearning.com/.
  58. Seery M. K., (2014), Student Engagement with Flipped Chemistry Lectures. Presented at the Spring ConfChem Flipped Classroom. Retrieved from http://confchem.ccce.divched.org/sites/confchem.ccce.divched.org/files/2014SpringConfChemP1.pdf.
  59. Seery M. K. and Donnelly R., (2012), The implementation of pre-lecture resources to reduce in-class cognitive load: a case study for higher education chemistry, Br. J. Educ. Technol., 43, 667–677.
  60. Setting SMART goals, (2013), Setting SMART goals. Olympic.org. Retrieved October 2014, from http://www.olympic.org/content/olympic-athletes/athletes-space/tips/setting-smart-goals/.
  61. Sirhan G., Gray D., Johnstone A. H. and Reid N., (1999), Preparing the Mind of the Learner, Univ. Chem. Educ., 3, 43–46.
  62. Slezak S., (2014), Flipping a Class, the Learn by Doing Method. Presented at the Spring ConfChem Flipped Classroom. Retrieved from http://confchem.ccce.divched.org/sites/confchem.ccce.divched.org/files/2014SpringConfChemP6.pdf.
  63. SMART PodiumTM 500 Series, (2014), SMART PodiumTM 500 Series. Retrieved November 2014, from http://education.smarttech.com/en/products/podium.
  64. Smith J. G., (2011), Organic Chemistry, 4th edn, New York, NY: McGraw-Hill.
  65. Smith D., (2013), Student attitudes toward flipping the general chemistry classroom, Chem. Educ. Res. Prac., 14, 607–614.
  66. Smith V., (2014, November 5), What is Blended Learning? University Affairs.
  67. Straumanis A. R. and Ruder S. M., (2009), A Method for Writing Open-Ended Curved Arrow Notation Questions for Multiple-Choice Exams and Electronic-Response Systems, J. Chem. Educ., 86, 1392.
  68. Strayer J. F., (2012), How Learning in an Inverted Classroom Influences Cooperation, Innovation and Task Orientation, Learn. Environ. Res., 15, 171–193.
  69. Stoyanovich C., Gandhi A. and Flynn A. B., (in press), Acid–Base Learning Outcomes for Students in an Introductory Organic Chemistry Course, J. Chem. Educ., DOI: 10.1021/ed5003338.
  70. Stufflebeam D. L., (1983), The CIPP [Context, Input, Process, Product] model for program evaluation, in Evaluation Models, Boston.
  71. Team-Based Learning Collaborative, (2013), Retrieved November 2014 from http://www.teambasedlearning.org/.
  72. Top Hat, (2014), Top Hat. Retrieved June 2014, from https://tophat.com/.
  73. Towns M. H., (2009), Developing Learning Objectives and Assessment Plans at a Variety of Institutions: Examples and Case Studies, J. Chem. Educ., 87, 91–96.
  74. Trogden B. G., (2014), Reclaiming face time: how an organic chemistry flipped classroom provided access to increased guided engagement. Presented at the Spring ConfChem Flipped Classroom. Retrieved from http://confchem.ccce.divched.org/sites/confchem.ccce.divched.org/files/2014SpringConfChemP3.pdf.
  75. Tune J. D., Sturek M. and Basile D. P., (2013), Flipped Classroom Model Improves Graduate Student Performance in Cardiovascular, Respiratory, and Renal Physiology, Adv. Phys. Educ., 37, 316–320.
  76. uOttawa: Teaching and Learning Support Service, (2014), uOttawa Teaching and Learning Support Service. Retrieved June 2014, from http://www.saea.uottawa.ca/index.php?option=com_content&view=article&id=68&Itemid=27&lang=en.
  77. uOttawa: Teaching and Learning Support Service, (2013), New Active Learning Classroom. Retrieved November 2014, from http://www.saea.uottawa.ca/ceam/index.php?option=com_k2&view=itemlist&task=category&id=12:apprentissage-actif-collaboratif&lang=en.
  78. Vaughan M., (2014), Flipping the Learning: An Investigation into the use of the Flipped Classroom Model in an Introductory Teaching Course. Retrieved from http://www.erpjournal.net/wp-content/uploads/2014/05/ERPV41_Vaughn_2014_Flipping_the_learning.pdf.
  79. Wade L. G., Jr., (2013), Organic Chemistry, 8th edn, Upper Saddle River, NJ: Pearson Prentice Hall.
  80. Wilson S. G., (2013), The Flipped Class A Method to Address the Challenges of an Undergraduate Statistics Course, Teach. Psychol., 40, 193–199.
  81. Woods D. R., (2006), Preparing for PBL, 3rd edn, Hamilton, ON. Retrieved from http://chemeng.mcmaster.ca/sites/default/files/media/Woods-Preparing-for-PBL.pdf.
  82. YouTube, (2014), YouTube. Retrieved June 2014, from http://youtube.com.

This journal is © The Royal Society of Chemistry 2015