Cooperative learning in organic chemistry increases student assessment of learning gains in key transferable skills

Dorian A. Canelas *a, Jennifer L. Hill b and Andrea Novicki c
aDepartment of Chemistry, Duke University, Box 90346, Durham, NC 27708-0346, USA. E-mail: dorian.canelas@duke.edu
bTrinity College Office of Assessment, Duke University, Box 104819, Durham, NC 27708, USA
cCenter for Instructional Technology, Duke University, Durham, NC 27708-0198, USA

Received 11th January 2017, Accepted 10th February 2017

First published on 10th February 2017


Abstract

Science and engineering educators and employers agree that students should graduate from college with expertise in their major subject area as well as the skills and competencies necessary for productive participation in diverse work environments. These competencies include problem-solving, communication, leadership, and collaboration, among others. Using a quasi-experimental design, and employing a variety of data from exam scores, course evaluations, and student assessment of learning gains (SALG) surveys of key competencies, we compared the development of both chemistry content knowledge and transferable or generic skills among students enrolled in two types of large classes: a lecture-based format versus an interactive, constructive, cooperative learning (flipped classroom) format. Controlling for instructor, as well as laboratory and recitation content, students enrolled in the cooperative learning format reported higher learning gains than the control group in essential transferable skills and competency areas at the end of the term, and more growth in these areas over the course of the term. As a result of their work in the class, the two groups of students reported the most significant differences in their gains in the following areas: “interacting productively to solve problems with a diverse group of classmates,” “behaving as an effective leader,” “behaving as an effective teammate,” and “comfort level working with complex ideas.” Our findings clearly show that cooperative learning course designs allow students to practice and develop the transferable skills valued by employers.


Introduction

Many of the essential skills cited by employers (Carnevale et al., 1990) are best characterized as soft, generic, or transferable skills, which are often not explicitly addressed or assessed in introductory science or engineering courses. One study's conclusions emphasized the “importance of the combined technical-soft skill set” (Bailey and Mitchell, 2007). This finding arose from analyses of interviews with top-level supervisors, senior managers, and CEOs, revealing that “of the 23 skills identified as the most critical by 75 percent or more of respondents, 57 percent were technical and 43 percent were soft skills” (Bailey and Mitchell, 2007). Examples of these highly valued skills include oral communication and listening: both are clearly key aspects of effective performance in most professional job settings. Other desirable employee characteristics, such as cultural competency, teamwork, and leadership, enhance the ability of individual contributors to function as part of a diverse group. When compared to the specific chemical content knowledge often explicitly targeted in course learning goals, these are examples of “skills which are more holistic, and which need to operate across wide ranges of contexts” (Taber, 2016). Without a doubt, “soft skills predict success in life” (Heckman and Kautz, 2012), and we therefore have a duty as leaders of the technical professions to ensure that students are given ample opportunities to develop these skills as part of their formal coursework.

Calls for colleges and universities to ensure that science and engineering students develop technical content knowledge in parallel with generic core competencies for the non-academic workplace have resounded for more than a decade (Kerr and Runquist, 2005; Martin et al., 2005; Shuman et al., 2005). Graduates reported feeling well-prepared for the technical aspects of their jobs – such as problem solving and readiness for life-long learning – but they also lamented weaknesses in formal training in terms of their readiness to participate effectively on teams or to lead and manage interdisciplinary teams (Martin et al., 2005). Although this gap in transferable skills has been well documented for many years, researchers find that disparities persist between the student competencies targeted in the classroom environment and the full set of skills actually needed in technical professions. As an illustration, the authors of a study focused on outcomes for undergraduate chemistry students note that “Some employers find chemistry graduates lacking written- and oral-communication skills, critical-thinking skills, group-work skills, as well as the ability to efficiently analyze data and retrieve chemical information” (Ashraf et al., 2011). Self-ratings of proficiencies reveal that the “ability to effectively work in a team” is a poorly rated skill in the college setting among students who have not yet started their careers (Baytiyeh, 2012). This skill is highly valued by both employers and students, but recent college graduates express greater confidence in their skills than employers report is warranted (Hart Research Associates, 2015). Perhaps most importantly, this same skill has the highest rating for necessity on the job (higher than all of the technical skill indicators) when ranked by respondents who have been working for some time as technical professionals (Baytiyeh, 2012). Similarly, recent chemistry graduates in the UK gave transferable skills higher “usefulness” rankings than chemical knowledge and skills, yet when the same graduates were asked about the development of these transferable skills in their degree programs, they provided relatively low ratings (Hanson and Overton, 2010). This misalignment between the skills developed in the classroom setting and the interactive nature of true scientific discourse outside of it is supported by a study that classified industrial chemists as transitional in their problem-solving skills: practicing chemists in industry occupy the process-skills space between novice student learners and seasoned academics well versed in open-ended problem solving and intellectual discourse (Randles and Overton, 2015). A question then arises: can chemical educators, through revision of the curriculum or classroom environment, improve the development of the generic skills that industrial chemists need for high-level functioning in the multidisciplinary teams often associated with the most complicated problem solving?

The importance of generic or transferable skills has led to their codification in the accreditation requirements for degree programs by organizations such as the Royal Society of Chemistry (RSC), the American Chemical Society (ACS), and the Accreditation Board for Engineering and Technology (ABET) (ACS, 2015; ABET, 2016; RSC, 2016). As an illustration, for institutions to maintain accreditation of their bachelor's degree programs, the RSC's key requirements list transferable skills for undergraduates such as “oral communication” and “an ability to interact with other people” (RSC, 2016). Similarly, the ACS Committee on Professional Training's guidelines for bachelor's degree programs note that institutions should provide “opportunities for students to learn to interact effectively in a group to solve scientific problems and work productively with a diverse group of peers” (ACS, 2015). ABET expects programs to meet student learning criteria in three categories: technical, personal, and interpersonal skills (Baytiyeh, 2012). For the latter category, expected student outcomes include “An ability to communicate effectively with a range of audiences” and “An ability to function effectively on teams that establish goals, plan tasks, meet deadlines, and analyze risk and uncertainty” (ABET, 2016).

Given the mismatch between students’ perceptions of their own levels of competence with generic skills and employers’ reports (Hart Research Associates, 2015), what should undergraduate programs do to improve student outcomes in these areas? Some institutions endeavor to answer this question by developing new courses specifically focused on generic or transferable skills rather than on chemical content knowledge (Ashraf et al., 2011). Most chemistry departments, on the other hand, prefer to keep all courses that count towards the major focused primarily on technical skills and content knowledge. With this in mind, researchers have demonstrated that undergraduate science students can increase their proficiency in communication and teamwork skills through active learning pedagogies and curriculum innovation (Pontin et al., 1993; Bailey et al., 2012; Mittendorf and Cox, 2013; Whittington et al., 2014). Many science professors will point to the laboratory portions of courses as a place where skills such as collaboration and communication are practiced (Lowery Bretz et al., 2013). Indeed, laboratory experiences can be, and often are, used to satisfy accreditation requirements for transferable skills. Moreover, laboratory classes have been explicitly shown to enhance the development of soft skills for science students, especially when the instructor regularly and actively rotates the roles exercised by students in groups (Lategan, 2016). This is certainly a step in the right direction, but given the data reported by graduates of science and engineering programs, are laboratory experiences alone truly sufficient for developing students’ generic skills?

We aimed to aid and assess the development of both chemistry concept knowledge and students’ generic or transferable skills during their first year of college. We did this by employing inquiry-based, small group learning activities in an intentionally interactive, constructive, cooperative learning classroom format, and then comparing students’ learning outcomes in concept knowledge and generic skills to those of students enrolled in a lecture-based, or more passive, course. In this way, we could evaluate which types of knowledge and skills are most enhanced by the different types of classrooms.

This work is grounded in the ICAP theory (Chi, 2009; Chi and Wylie, 2014), which employs direct, overt observations of student behaviour during learning opportunities as a proxy for varying levels of cognitive engagement. Observed behaviours are categorized into modes: Interactive, Constructive, Active, and/or Passive (ICAP). According to Chi and Wylie, “The ICAP hypothesis predicts that as students become more engaged with the learning materials, from passive to active to constructive to interactive, their learning will increase” (Chi and Wylie, 2014). Indeed, a large-scale, double-blind, randomized study supported the theory that student-centered, active-learning approaches improve science content learning when compared to teacher-centered approaches such as lecturing (Granger et al., 2012). While some recent studies have shown that organic chemistry content knowledge improves for undergraduate learners in active learning classrooms as opposed to lectures (Hein, 2012; Conway, 2014), other studies have shown no differences in measured content knowledge outcomes (Dinan and Frydrychowski, 1995; Bradley et al., 2002; Chase et al., 2013; Rein and Brookes, 2015). We hypothesize that moving along the ICAP continuum toward the more interactive and constructive modes of learning in the classroom space will preserve acquisition of content learning while simultaneously improving learner confidence with important transferable skills such as teamwork, leadership, and collegiality.

To avoid confusion with terminology, and recognizing that frequently used words sometimes mean different things to different scholars, our use of language for describing the two types of classrooms in this study must be careful and deliberate. When describing our work, we will use the terms “control” and “lecture” fairly interchangeably. We will also use the word “experimental” and the phrase “cooperative learning” fairly interchangeably. In the latter case, the consistent language we have chosen here will be employed throughout the text to encompass the classroom format and/or students enrolled in the classroom format that involves the kinds of learning and activities that scholars in the field might refer to using many other words and phrases, such as student-centered, flipped, constructive, active learning, activity based, process oriented, inquiry guided, small group or team learning. We choose to clarify terminology here because we drew from a variety of traditions in developing the cooperative learning organic chemistry classroom format.

We present results from a case study that involves using the ICAP strategy for increasing student development of generic, transferable skills during the first semester of introductory organic chemistry in a large enrolment environment. Outcomes for students in the two cooperative learning sections were compared to outcomes for students in two traditional lecture sections of approximately the same size. Both formats were taught by the same instructor. Assessments of these student populations were conducted in three areas: psychological affect as a function of time (previously published) (Barger et al., 2015, 2017), formal summative assessments via assigned work and exams worth points in the course, and students’ self-reports of learning gains. For the latter purpose, which is the focus of this contribution, a Student Assessment of Learning Gains (SALG) survey instrument (Seymour et al., 2000; Jordan, 2012; Vishnumolakala et al., 2016) was developed and employed for comparison of the outcomes for students in the two types of classroom environments. Results for analyses of these surveys, student course evaluations through the established college system, and comparisons of results on graded course tests and assignments are presented herein.

Objectives and description

The primary learning objective was to teach students to routinely use inquiry, analysis, critical thinking, and the basic principles of scientific reasoning via the content of organic chemistry. In the cooperative learning sections of the course, this objective was developed through small-group constructivist, interactive group problem solving sessions during class. The instructor clearly conveyed expectations that the students would individually complete the assigned pre-class work on a regular basis as preparation for higher level problem solving with peers during class (flipped classroom). We intended to assess their learning and affect in comparison with that of students in a more traditional lecture setting. In the context of organic chemistry, we aimed to increase student comfort with constructing their own knowledge, which is an important skill for all professionals. Because teaching methods in organic chemistry and other gateway science courses are frequently cited by students who decide to leave STEM (Seymour, 1995; Seymour and Hewitt, 2000; Daempfle, 2004) or pre-health tracks (Barr, 2010; Barr et al., 2010), the study also intended to explore the way innovations in class format and learning activities might influence students’ perceptions of science instruction.

Experimental methods

Description of course sections, pedagogy observations

Four sections, two lecture-based and two activity-based, were taught in the spring semester at a private, mid-size university in the southeastern United States. All sections were defined as large classes by the university (≥60 students); enrolments for sections included in this study ranged from 132 to 158 students. All sections met in the same classroom: a sloped, theatre-style lecture hall with central and side aisles and forward-facing seats. The instructor presentation space included one large central presentation screen, two periodic tables on side walls near the front, chalk boards, and a bench top suitable for chemical demonstrations at the front of the classroom (Fig. 1).
Fig. 1 Photograph taken from near the centre of the classroom.
Course registration process. During the initial registration period prior to the beginning of classes, students were not alerted that the sections would be taught in different formats. However, announcements about class formats were made during the first week of class as learners in the active-learning classes were assigned to their initial groups. All sections had open seats in the registrar's system during the add-drop period; per university policy, individuals had the option to move freely from one section to another during the first week of the term without instructor permission (add-drop period). The small incidence (<5%) of student movement between sections prior to the administration of the baseline SALG survey was not actively tracked by the researchers. No students requested instructor permission to change sections during the second week of classes; at the end of the second week, rosters were frozen and students could no longer change their class schedules without a formal course withdrawal (incurring a “W” for the term).
Class time and resources. All sections met for an equivalent amount of class time per week with the instructor of record (150 minutes), and all students had identical labs and discussion (recitation) meetings, which were staffed by teaching assistants. The same instructor led all sections, and identical problem sets and final exams were given to both the control and experimental groups. Students completed graded homework problems online via Sapling Learning, and they were also encouraged to complete ungraded textbook exercises on their own to enhance concept understanding. In addition, both course formats used the same course webpage so that all of the supportive resources were identical for all students. In summary, every student had access to the same exercises, worksheets, resources, and a comprehensive set of video lectures broken into short segments. For example, students in the control group attended traditional live lectures during class time, but they also had access to all of the in-class worksheets completed by the experimental group in real time if they wanted to complete those on their own or with a study group. On the other hand, students in the experimental group completed worksheet-based activities in small groups during class time, but they also had access to both PowerPoint slides from the control group lectures and short video lectures if they wanted to watch lectures outside of class. All sections used a personal response system (clickers) so that students could submit individual answers in real time to multiple choice questions and review examples during class.

All sections worked through the same amount of material: concepts covered in the first thirteen chapters of the required textbook for the course (Loudon, 2009). The only difference was how class time with the instructor of record was employed. More details about this difference are described in the following subsections. The overall structure of the gateway chemistry curriculum at the institution in this study has been previously published (Hall et al., 2014; Canelas, 2015; Goldwasser et al., 2016).

Numerous resources were available to students during their study time. Besides the instructor of record office hours, each graduate student teaching assistant was assigned to work two hours per week in the “Chemistry Resource Room.” The schedule for the Chemistry Resource Room was posted near the beginning of the term, and any student could drop in during any of the open times. The university also supported nine hours of evening and weekend “Walk-in Tutoring” which was staffed by undergraduates who had previously earned an A grade in the class. No appointments were needed for any of these resources, and attendance at these times for learning was not tracked by name on an individual student basis.

Assessments and grading. Every section was assessed by three in-class tests (mid-term exams), spaced roughly evenly throughout the semester. The mid-term exams differed between sections, eliminating the chance that students testing later could receive unauthorized information about the specific problems, but similar material and concepts were covered. All students took a uniform, comprehensive common final exam at the end of the term. The 300-point final exam consisted of twenty multiple-choice questions (worth 100 points) and a free-response section with nine questions that involved writing mechanisms, predicting products, explaining concepts via essay, interpreting spectral data to deduce unknown structures, and devising retrosynthetic analysis schemes (worth 200 points). A more widely used, standardized exam, such as one from the ACS Exams Institute, was not used due to a previously observed ceiling effect with this type of exam in our courses. These mid-term tests and the final exam accounted for the majority (67.5%) of the available points towards the overall course letter grade. A further 25% of the available points were earned through completion of the laboratory portion of the course, which was part of the overall course grade rather than a separate grade. The remaining 7.5% of the points came from the assigned Sapling online homework for the control group, and were split between the assigned Sapling online homework and graded in-class group activities for the experimental group.
Description of control classroom. Lecture classes were taught in a “traditional” lecture format using a combination of oral presentation supported by PowerPoint slides and the instructor writing mechanisms or working examples on the chalk board. In addition to the interpolated clicker questions to check understanding, the instructor employed cold-calling. This involved randomly drawing names from a cup in real time so that a few students provided oral responses each class. On some occasions, such as when clicker responses showed that many students did not select the correct answer on their first try, students were asked to talk to neighbouring classmates prior to re-polling, but formal groups were never assigned.
Description of experimental classroom. Cooperative learning classes (experimental group) relied heavily on students working through concepts and application problems in class in small, cooperative peer teams (4–6 students per group). As previously described (Barger et al., 2017), during the active learning sessions, students worked in small groups on activities drawn from the Process Oriented Guided Inquiry Learning (POGIL) (Moog and Spencer, 2008; Moog et al., 2009), Student Centered Activities for Large Enrollment Undergraduate Programs (SCALE-UP) (Oliver-Hoyo and Beichner, 2004; Oliver-Hoyo, 2011), and problem manipulation (Siburt et al., 2011) traditions while the professor and teaching assistants circulated to engage in the discussions. Teams were initially assigned randomly using decks of cards – a process that was transparent to the students. Then, after each in-class test, teams were reassigned based upon the test scores so that each team included a mix of relatively high, medium, and low performers. This method of group assignment has known advantages and disadvantages. Advantages included balancing teams in terms of class performance and content knowledge as well as exposing students to the challenge of working with a larger, more diverse group of peers over the course of the semester. Disadvantages included logistics (for example, some class time was needed for students to find and meet their new teammates after each test) and overall group dynamics, as long-term teams are well known to be the highest functioning. Most of the in-class activity worksheets for the cooperative learning class were drawn from material either published at the time (Straumanis, 2009) or then unpublished (Ruder, 2015) but used with the author's permission in a beta-test environment. Some of the more advanced application activity worksheets were composed by the instructor (see Barger et al., 2017 for an example).
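The text describes only the goal of the reassignment (each team mixing relatively high, medium, and low performers), not the exact procedure. As one hedged illustration of how such balancing can be automated, the sketch below deals rank-ordered students into teams in serpentine order; the student identifiers, team size, and algorithm itself are assumptions for demonstration, not the authors' method.

```python
import random

def reassign_teams(scores, team_size=5):
    """Deal students into performance-balanced teams.

    Rank students by their most recent test score, then assign them to
    teams in serpentine (back-and-forth) order so that each team
    receives a mix of high, medium, and low performers. Illustrative
    only: the study states the balancing goal, not this algorithm.
    """
    ranked = sorted(scores, key=scores.get, reverse=True)
    n_teams = max(1, len(ranked) // team_size)
    teams = [[] for _ in range(n_teams)]
    for i, student in enumerate(ranked):
        rnd, offset = divmod(i, n_teams)
        # Reverse direction on alternate passes through the teams so the
        # strongest and weakest students spread evenly across all teams.
        idx = offset if rnd % 2 == 0 else n_teams - 1 - offset
        teams[idx].append(student)
    return teams

# Example: 137 students (a plausible section size) into teams of 5-6
scores = {f"student_{i}": random.randint(40, 100) for i in range(137)}
teams = reassign_teams(scores)
print(len(teams), "teams; sizes:", sorted({len(t) for t in teams}))
```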

Various types of active, cooperative, small-group learning classroom environments for undergraduate organic chemistry have been previously reported in the literature (Dinan and Frydrychowski, 1995; Bradley et al., 2002; Tien et al., 2002; Hein, 2012; Chase et al., 2013; Conway, 2014; Fautch, 2015), so this work drew from the successes and lessons learned in that prior work. The instructor of record for the sections in this study had been teaching general chemistry classes in a cooperative learning classroom format with small, interactive student teams for several years, so basic challenges related to logistics and handling student teams during class time had been thoroughly addressed. In addition, a small section (36 students) using the cooperative learning format and group activities in this specific organic chemistry course was piloted by the instructor in the year before the experimental sections studied herein, so this was not the maiden voyage of the activities or classroom format for the course.

Classroom observations. Each semester, the amount of active learning in a classroom was assessed by direct observation via unannounced classroom visits at least four (and usually five) times per semester. The first week of classes and exam periods were avoided, but the remaining dates were selected with a random number generator. The instructor was unaware of the visit date before class, and the observer took an unobtrusive seat to avoid calling any attention to their presence.

The observer sat at the back of the classroom and noted every 5 minutes what the instructor and students were doing. These observations were converted to percentages of time spent in active learning. For the purposes of this study, “active learning” during class is defined as anything course-related that all students in a class session are instructed to do other than simply watching, listening and taking notes (Felder and Brent, 2009). For the current study, a time point was scored as active learning only when all students were engaged in a learning activity. Active learning results are expressed as a percent of the total observations.
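In concrete terms, the conversion from interval codes to a percentage is simple. A minimal sketch follows; the observation codes are hypothetical placeholders, not the observers' actual coding scheme.

```python
# Hypothetical 5-minute interval codes from one class session; the
# study's actual coding categories are not given in this section.
codes = ["lecture", "lecture", "group_problem", "group_problem",
         "clicker_all", "lecture", "group_problem", "lecture",
         "group_problem", "group_problem"]

# Intervals count as active learning only when ALL students are
# engaged in a course-related activity beyond watching/listening.
ACTIVE = {"group_problem", "clicker_all"}

percent_active = 100 * sum(c in ACTIVE for c in codes) / len(codes)
print(f"{percent_active:.0f}% of observed intervals active")  # 60%
```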

Student course evaluations. Students in all sections had the opportunity to complete formal, end-of-term course evaluations via a link located on the Student Information System website. This questionnaire evaluated overall course experience, course dynamics, and Likert-scaled estimations of progress toward College-level learning objectives. It also provided spaces for free-response comments. Students were assured that these responses remained completely anonymous to the instructor and were not available until after all final grades were officially recorded by the University. These course evaluations were collected and compiled by the College's Office of Assessment.

Survey procedure and methodology

Following the SENCER-SALG standard procedure (Jordan, 2012), surveys were provided to students electronically outside of class time through a link sent by email and included in announcements on the course webpage. Students were informed that the instructor would not see any results until after grades were submitted, and that their names would not in any way be associated with their responses. A SALG baseline (pre-survey) was collected at the beginning of the term, right after the add-drop period ended (week 3). In the last week of the term, students rated their own gains in competencies and perceptions of learning on a SALG instrument (post-survey). Students in every section were offered the opportunity to complete these online surveys at the two time points during the semester. Response rates ranged from 78.3% to 97.5% across the sections. Students were provided a very small amount of extra credit (3 bonus points out of 1000 total points possible in the course) for completing each survey, and this surely contributed to the high response rates. To ensure that participants felt comfortable responding frankly to survey items, individual survey responses were kept confidential (not available to the instructor during the term), and students were informed in advance that this would be the case. In addition, each participant had the option to skip survey items he or she did not want to answer for any reason, so n values vary somewhat across the survey items.

For each survey item, the differences in the responses of the control and experimental groups were assessed by Student's t-tests; a p value of <0.05 was considered statistically significant. Reported numerical error values are the standard error of the mean (σM).
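As a concrete illustration of this procedure (not the authors' analysis code), an independent two-sample Student's t-test and σM can be computed with SciPy; the response vectors below are fabricated for demonstration only.

```python
import numpy as np
from scipy import stats

# Fabricated 5-point Likert responses for two groups (demo only)
control = np.array([3, 4, 3, 5, 2, 4, 3, 4])
experimental = np.array([4, 5, 4, 5, 3, 4, 5, 4])

# Independent two-sample Student's t-test (equal variances assumed)
t_stat, p_value = stats.ttest_ind(control, experimental)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # significant if p < 0.05

# Standard error of the mean (sigma_M) for each group
print("sigma_M:", stats.sem(control), stats.sem(experimental))
```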

Description of sample

All participants were full time undergraduates who enrolled in the first semester of a two semester course sequence of introductory organic chemistry during the spring semester (please see the previous section for content and pedagogy details). From the total population of students enrolled in the class sections in this study (N = 567), approximately half were in the control group (traditional lecture sections, n = 270) and the remainder of the students were in the experimental group (activity-based learning sections, n = 297).
Demographics. The two groups of students were similar in terms of gender composition and other key demographics (Table 1). Students who self-identified as Asian/Pacific Islander were slightly over-represented in the lecture sections (control group), as were women. Caucasians and men were slightly over-represented in the cooperative learning sections (experimental group).
Table 1 Description of the sample: demographics
Characteristic | Control group n | Control group % | Experimental group n | Experimental group %
Female | 169 | 63 | 176 | 59
Male | 101 | 37 | 120 | 40
Unknown gender | 0 | 0 | 1 | 0
African-American | 34 | 13 | 39 | 13
American-Indian/Alaskan native | 3 | 1 | 1 | 0
Asian/Pacific Islander | 106 | 39 | 103 | 35
Caucasian | 100 | 37 | 121 | 41
Hispanic/Latino | 20 | 7 | 28 | 9
Native Hawaiian | 1 | 0 | 1 | 0
Other race/ethnicity | 6 | 2 | 4 | 1
First generation | 24 | 9 | 31 | 10
Non-first generation | 246 | 91 | 266 | 90


Student predictions of future majors and careers. On both of the surveys described in the previous section, students responded to a small number of items relating to college major plans and career goals (Table 2). Because the institution does not allow students to formally declare their majors and minors during their first year of study, we were happy to learn that 16 ± 3% of the students enrolled in the course, across all sections, planned to become chemistry majors. Undergraduates planning chemistry majors were slightly over-represented in the experimental group. We were also happy to discover that the percentage of students predicting a chemistry major declaration did not decline during the semester, regardless of the format of the course.
Table 2 Future career and potential college major interests reported by students in the baseline SALG survey
Affirmative (yes) responses to the following items | Control group (% students) | Experimental group (% students)

Baseline (pre) survey (a)
I plan to major in chemistry (n = 479) | 13 | 18
I plan to attend a health-related professional school (medical, dental, vet, pharmacy, etc.) (n = 485) | 76 | 85
I plan to attend graduate school in the natural sciences (n = 476) | 37 | 42
I plan to gain employment as an engineer (n = 475) | 8 | 8

Post survey (b)
I plan to major in chemistry (n = 446) | 13 | 18
I plan to attend a health-related professional school (medical, dental, vet, pharmacy, etc.) (n = 450) | 76 | 80
I plan to attend graduate school in the natural sciences (n = 441) | 34 | 37
I plan to gain employment as an engineer (n = 442) | 10 | 9

(a) n ranges, pre-survey: control 219–224; experimental 254–261. (b) n ranges, post-survey: control 196–203; experimental 244–247.


The majority (76–85%) of the participants in both types of sections self-reported to be planning a pre-health course of study, indicating that they hoped to attend medical, dental, veterinary, or pharmacy school in the future. In these professions, which often involve extensive patient contact and require working in diverse teams of professionals, interpersonal skills are deemed especially important, and practice with these skills is considered an essential part of curricula (Makoul et al., 1998; Noble et al., 2007; Murdoch-Eaton and Whittle, 2012).

Approximately one third of the students answered “yes” to the item “I plan to attend graduate school in the natural sciences”; presumably, some of the students planned a future MD/PhD. The pre/post surveys showed small declines (3–5 percentage points) in students’ plans to attend health professional school or graduate school over the course of the semester. Eight to ten percent of the enrolled students planned to gain employment as an engineer.

Factor analysis and statistical methods

Researchers, most notably Vishnumolakala et al. (2016), have established the SALG as a psychometrically sound measure of core learning objectives in undergraduate STEM education. Factor analysis is often used as a technique to identify the central constructs underlying a complex dataset, around which survey items cluster (Fabrigar et al., 1999). However, in addition to its base questionnaire, the SALG permits local customization of supplemental question sets, thus complicating replication of, or comparison with, the findings of similar studies. Like Vishnumolakala et al., this study deployed an exploratory factor analysis (EFA) to detect latent commonalities among the survey items under review; however, our SALG administration included 97 items, compared to Vishnumolakala et al.'s original 62. EFA allowed us to draw “bigger picture” conclusions about domains of student learning across individual questionnaire items.

Since both studies involved introductory organic chemistry courses, one might reasonably expect to observe the same four factors as Vishnumolakala et al. However, EFA was conducted for all of the questionnaire items concurrently, regardless of their likely categorization, thus producing a factor structure that is an authentic representation of the item associations and groupings that may be unique to this analytic sample. The present factor analysis was conducted using SAS software, version 9.4, specifying the promax option for oblique rotations (Cureton and D'Agostino, 1983; Gorsuch, 1983; Fabrigar et al., 1999), and it revealed three of the four constructs previously reported by Vishnumolakala et al.: process skills, concept learning, and active learning. We have elected to describe the latter factor herein using the term “interactive learning” (rather than active learning), since that phrase fits our items better. We observed a fourth construct as well, but the questions loading onto it generally do not represent “resources”, as observed by Vishnumolakala et al.; rather, the pattern of factor loadings suggests that the fourth factor in this study represents “chemical modeling.” The labeling or categorization of factors requires interpretive discretion by the analyst, and given that many SALG questions could be attributed to multiple learning objectives, these deviations from Vishnumolakala et al.'s findings are not unexpected. Table 3 displays the alignment of SALG items with each of the four observed factors, including the factor loadings (representing the strength of the association between each item and its factor).

Table 3 Factor loadings for the SALG post-tests (n = 154)
Factor (SALG item numbers) | Factor loadings, in item-number order
Interactive learning (items 1–7) | 0.88, 0.87, 0.86, 0.82, 0.79, 0.72, 0.65
Concept learning (items 8–25) | 0.71, 0.71, 0.71, 0.72, 0.73, 0.76, 0.76, 0.77, 0.77, 0.77, 0.78, 0.79, 0.80, 0.82, 0.82, 0.83, 0.85, 0.86
Chemical modeling (items 26–30) | 0.71, 0.78, 0.78, 0.82, 0.83
Process skills (items 31–57) | 0.57, 0.62, 0.64, 0.66, 0.67, 0.67, 0.69, 0.69, 0.70, 0.70, 0.71, 0.71, 0.71, 0.71, 0.71, 0.71, 0.72, 0.72, 0.74, 0.74, 0.75, 0.75, 0.76, 0.77, 0.77, 0.78, 0.79

Each item loaded onto exactly one of the four factors.
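The extraction above was performed in SAS 9.4. For readers without SAS access, a broadly comparable analysis can be sketched with the open-source factor_analyzer package; this is not the authors' code, and the input file name and DataFrame layout (rows = respondents, columns = the 57 SALG items) are hypothetical.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical input: one row per respondent, one column per SALG item
responses = pd.read_csv("salg_post_items.csv")

# Four factors with an oblique (promax) rotation, mirroring the
# settings described in the text
fa = FactorAnalyzer(n_factors=4, rotation="promax")
fa.fit(responses)

# Rotated loadings (item x factor), analogous to Table 3
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(2))

# Variance explained per factor, analogous to Table 5
var, prop_var, cum_var = fa.get_factor_variance()
print("Percent variance:", (100 * prop_var).round(2))
```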


To illustrate face validity (Nevo, 1985), Table 4 provides examples of the SALG items attributed to each of the factors/constructs. The questionnaire items loading onto each of the detected factors/constructs are reasonable and intuitive.

Table 4 Factor analysis constructs and sample items
Factor | Examples of SALG items in that construct
Interactive learning | “How much did interacting with the instructor during class help your learning?” “How much did working with peers during class help your learning?”
Concept learning | “Presently, I understand how to plan workable multi-step sequences of reactions that convert simple starting compounds into complex organic products using my knowledge of functional group reactions.” “I can list several common synthesis methods for, as well as several common reactions of: alkenes.”
Process skills | “Presently, I can interact productively to solve problems with a diverse group of classmates.” “Presently, I am in the habit of using systematic reasoning in my approach to problems.”
Chemical modeling | “Presently, I understand how to predict 3-D shapes and polarity of molecules.” “Presently, I understand orbitals, including atomic, hybrid and molecular.”



Table 5 lists summary statistics that endorse the overall factor structure. The percent of variance estimates the proportion of variance in the 57 survey items captured by the four selected factors. Cumulatively, the four constructs represented by these factors account for 77.5% of the variance. This high proportion is likely due to the large number of SALG questions loading onto the factors representing the constructs “concept learning” and “process skills” (18 and 27 questions, respectively). Eigenvalues are often used to determine how many factors should be retained in the EFA. Because an eigenvalue of 1.0 means that the factor has the same explanatory power as a single variable in the analysis, higher eigenvalues represent greater explanatory power for that construct (Kaiser, 1960; Kim and Mueller, 1978). The factor representing the construct “process skills” has the most explanatory power, followed by “concept learning”. Cronbach's alpha (α) is a frequently employed and well-regarded estimate of internal reliability: the internal consistency among the questionnaire items attributed to the composite construct (Cronbach, 1951). Low alpha coefficients indicate high error; however, the alpha coefficients in this study all exceed 0.70, the commonly accepted threshold for construct reliability (Nunnally, 1978). Like Vishnumolakala et al., this study confirms that the SALG instrument measures essential, core learning objectives in undergraduate STEM education, and does so with a high degree of internal reliability and demonstrable construct validity.

Table 5 Summary results of the exploratory factor analysis and associated check of internal reliability
Statistic | Interactive learning | Concept learning | Chemical modeling | Process skills
Percent variance | 12.55 | 24.65 | 9.88 | 30.42
Eigenvalue | 5.78 | 7.59 | 3.68 | 35.25
Cronbach's alpha (α) | 0.9521 | 0.9573 | 0.8806 | 0.9670
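For reference, Cronbach's α can be reproduced from raw item responses with a few lines of code using the standard formula α = k/(k − 1) · (1 − Σσᵢ²/σₜ²), where k is the number of items, σᵢ² the variance of item i, and σₜ² the variance of the summed scale. The data below are fabricated purely for illustration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a scale.

    items: 2-D array with one row per respondent and one column per
    questionnaire item belonging to the construct.
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated 5-point responses: 5 respondents x 3 items (demo only)
demo = np.array([[4, 5, 4],
                 [3, 3, 4],
                 [5, 5, 5],
                 [2, 3, 2],
                 [4, 4, 5]])
print(round(cronbach_alpha(demo), 3))  # 0.918 for this toy data
```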


Results and discussion

Classroom format and content knowledge

Analysis of the data from the direct observations and recordings of overtly passive versus overtly active learning revealed large differences between the classroom environments. For the control group, an average of 18% of class time was spent on active learning tasks using the criteria described in the methods section; the remaining time was spent on passive tasks such as listening and taking notes. In contrast, for the experimental group, an average of 66% of class time was spent with all students engaged in active learning tasks.

In spite of these large differences in the amount of time spent on passive versus active learning, students in both classroom formats performed similarly in terms of demonstrated application of content knowledge in chemical problem solving on the final exam. On a final exam with a maximum score of 300 points, the control group's mean score was 192.6 ± 2.8, while the experimental group's mean score was 191.8 ± 2.8 (p = 0.846; for students who took the final exam: control group n = 258, experimental group n = 287). Indeed, no statistically significant difference was found when the overall final exam performance of the two groups was compared, and there were also no statistically significant differences in performance on the multiple-choice portion of the exam. This finding is not surprising, given that the exam focused on individual problem solving and content knowledge and that the same content was provided to both groups. In addition, even the “lecture” format course employed a personal response system (clickers) and contained some shorter periods of intentionally interactive learning, which could have contributed to the lack of difference in our summative assessment results.
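The reported p-value can be checked directly from these summary statistics. A short sketch using SciPy's t-test-from-summary-statistics helper follows, with each σM converted back to a standard deviation before the call; this is a reader-side verification, not the authors' analysis code.

```python
import math
from scipy import stats

# Summary statistics as reported in the text
n_c, mean_c, sem_c = 258, 192.6, 2.8   # control
n_e, mean_e, sem_e = 287, 191.8, 2.8   # experimental

# ttest_ind_from_stats expects standard deviations: std = sem * sqrt(n)
t, p = stats.ttest_ind_from_stats(
    mean_c, sem_c * math.sqrt(n_c), n_c,
    mean_e, sem_e * math.sqrt(n_e), n_e,
)
print(f"t = {t:.2f}, p = {p:.3f}")  # p ~ 0.84, consistent with 0.846
```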

These results confirm similar findings by several research groups who reported that changes in organic chemistry classroom formats did not significantly affect outcomes on summative assessments (Dinan and Frydrychowski, 1995; Bradley et al., 2002; Chase et al., 2013; Rein and Brookes, 2015). Interestingly, a pre- and post-test study of flipped versus lecture general chemistry classes (Reid, 2016) revealed that the “exam performance in the two sections is statistically different only for the bottom third, as measured by pretest score or percentile rank” (Ryan and Reid, 2016). Relatively weaker students were also found to benefit more than their stronger peers in terms of outcomes on tests in a flipped organic chemistry context (Fautch, 2015). On the other hand, our results offer a counterpoint to most other reports on college learning, which show a general trend of superior performance by students in flipped, inquiry-based, and/or other types of cooperative learning classroom environments (Freeman et al., 2014; Weaver and Sturtevant, 2015; Hibbard et al., 2016; Warfa, 2016). As an illustration, two recent studies specifically in organic chemistry revealed test performance benefits for learners in student-centered, active-learning organic chemistry classrooms (Hein, 2012; Conway, 2014).

Our study did not reveal a difference in exam performance between the two conditions, perhaps because there were no real differences in chemistry content knowledge learning, or perhaps because confounding variables obscured any differences. On the latter point, even our passive, lecture classroom included some active learning, which may have affected the student outcomes for this condition on the summative assessment. In addition, our student population is from a university that is highly selective in terms of undergraduate admissions standards, similar to that reported by Bradley and coworkers (Bradley et al., 2002), who also did not see significant effects of classroom format on test scores. The interplay between students’ personal epistemologies and classroom formats may also have played a role, since students with authority-based justifications of knowledge tend to prefer, and perform well in, lecture format classes (Barger et al., 2015, 2017). Finally, the difference in format may have had a differential impact on a segment of our student population that was not revealed by our analysis. We did not examine the differential impact of active learning on different student populations (by demographics, previous grades, or pre-matriculation standardized test scores, for example), which might have revealed useful information, as in previous studies that accounted for the distribution of student preparation in their analyses (Fautch, 2015; Reid, 2016; Ryan and Reid, 2016).

Results from the SALG survey items on course content knowledge gains are mixed. For each of the four factors (multi-item constructs), t-tests were used to explore possible differences in ratings between the groups, as illustrated in Table 6. SALG items have a range of 1 to 5, where 5 is high. The analysis finds no statistically significant overall differences in student self-evaluation of learning on items associated with the constructs chemical modeling and interactive learning, but does show moderate, statistically significant differences in the constructs concept learning and process skills (Table 6). Given the variety of learning outcomes included in the concept learning construct, and noting that not all questionnaire items associated with that construct show statistically significant differences across course formats, it is possible that the broad inclusivity of the SALG's concept learning construct masks specific, item-level learning outcomes. Item-level results presented in Table 7 suggest no statistically significant differences for a number of items attributed to the concept learning construct (such as how to draw the stereoisomers of chiral compounds, and how to identify the stereochemical relationship between structures), which may be better analytic matches for the performance tasks required by the course final exam. The analytic utility of Tables 6 and 7 is described in greater detail in the following section.

Table 6 Comparison of construct indexes for each of the two classroom formats: means, standard errors, and t-test results. Construct indexes have a 5-point range, where 5 is high
Construct | Control n | Control mean | Control σM | Experimental n | Experimental mean | Experimental σM | Difference
Interactive learning | 152 | 3.34 | 0.10 | 226 | 3.56 | 0.10 | 0.2206
Concept learning | 205 | 4.25 | 0.06 | 263 | 4.38 | 0.06 | 0.1339*
Chemical modeling | 205 | 3.65 | 0.05 | 263 | 3.58 | 0.04 | −0.065
Process skills | 205 | 3.43 | 0.06 | 258 | 3.68 | 0.05 | 0.2527**

** Pr |t| < 0.01. * Pr |t| < 0.05.


Table 7 SALG post survey mean values, standard errors of the mean, and t-test results: comparing control and experimental group construct index item responses for interactive learning, chemical modelling, and concept learning. SALG items have a 5-point range, where 5 is high
Interactive learning

How much did each of the following aspects of the class help your learning? | Control mean | Control σM | Experimental mean | Experimental σM | Difference
Interacting with the instructor during class | 3.18 | 0.14 | 3.61 | 0.11 | 0.434*
Interacting with the instructor during office hours | 3.25 | 0.15 | 3.48 | 0.12 | 0.233
Working with teaching assistants during class | 2.66 | 0.20 | 3.20 | 0.15 | 0.543*
Working with teaching assistants outside of class | 2.83 | 0.22 | 3.26 | 0.17 | 0.436
Working with peers during class | 2.72 | 0.25 | 3.10 | 0.20 | 0.383
Working with peers outside of class | 2.50 | 0.24 | 3.11 | 0.19 | 0.610*
Walk-in tutoring | 3.37 | 0.13 | 3.69 | 0.10 | 0.318*

Chemical modeling

As a result of your work in this class, what gains did you make in your understanding of each of the following? | Control mean | Control σM | Experimental mean | Experimental σM | Difference
How to use resonance, when it is present, to predict the reactivity of a given compound | 3.67 | 0.08 | 3.63 | 0.07 | −0.041
How to draw Lewis dot structures | 3.56 | 0.09 | 3.42 | 0.08 | −0.142
How to assign formal charge | 3.83 | 0.08 | 3.65 | 0.08 | −0.177
How to predict 3-D shapes and polarity of molecules | 3.73 | 0.08 | 3.87 | 0.07 | 0.139
Orbitals, including atomic, hybrid and molecular | 3.45 | 0.08 | 3.35 | 0.07 | −0.101

Concept learning

As a result of your work in this class, what gains did you make in your understanding of each of the following? | Control mean | Control σM | Experimental mean | Experimental σM | Difference
How to recognize the difference between conformations, constitutional isomers, and configurational isomers, including enantiomers | 4.27 | 0.06 | 4.38 | 0.05 | 0.106
How to identify chiral molecules | 4.44 | 0.05 | 4.52 | 0.04 | 0.088
How to draw the stereoisomers of chiral compounds | 4.43 | 0.05 | 4.45 | 0.05 | 0.013
How to identify the stereochemical relationship between structures | 4.32 | 0.05 | 4.34 | 0.05 | 0.016
How to explain the mechanisms of uni- and bimolecular substitutions and eliminations | 4.30 | 0.06 | 4.27 | 0.05 | −0.027
How to identify nucleophiles and electrophiles and predict reactions between them | 4.22 | 0.06 | 4.20 | 0.05 | −0.023
How to write a proper reaction mechanism for a given reaction with known products | 4.35 | 0.06 | 4.28 | 0.06 | −0.069
How to write mechanisms to predict outcomes for a half dozen types of addition reactions | 4.37 | 0.06 | 4.31 | 0.06 | −0.063
How to determine whether or not a reactive intermediate will rearrange its internal structure prior to further reaction | 4.29 | 0.06 | 4.25 | 0.06 | −0.043
How to identify all the functional groups present in a given organic chemical | 4.34 | 0.06 | 4.38 | 0.05 | 0.042
How to propose and predict the outcome of acid–base reactions | 4.17 | 0.06 | 4.04 | 0.06 | −0.134
How IR, NMR, and mass spectroscopy work & how I can use each kind of spectrum to identify structural facets of an unknown molecule | 4.12 | 0.07 | 4.32 | 0.05 | 0.196*
How to plan workable multi-step sequences of reactions that convert simple starting compounds into complex organic products using my knowledge of functional group reactions | 4.01 | 0.07 | 4.31 | 0.05 | 0.304**

(I can) list several common synthesis methods for, as well as several common reactions of:
Alkyl halides | 4.43 | 0.06 | 4.65 | 0.05 | 0.226**
Alkenes | 4.44 | 0.07 | 4.75 | 0.05 | 0.307***
Alcohols | 4.38 | 0.07 | 4.58 | 0.06 | 0.206*
Ketones and aldehydes | 4.08 | 0.07 | 4.29 | 0.06 | 0.203*
Carboxylic acids and their derivatives | 3.99 | 0.08 | 4.30 | 0.06 | 0.304***

**** Pr |t| < 0.0001. *** Pr |t| < 0.001. ** Pr |t| < 0.01. * Pr |t| < 0.05. σM = standard error of the mean.


Transferable skills development

The most compelling results in this study were the large differences found between the groups in the students’ perceptions of their development of transferable skills (Table 8). Reviewed in concert with the t-test results for individual SALG items in Table 7, these data provide compelling evidence that students in the experimental group felt that they achieved higher levels of interpersonal and group learning skill development (soft, generic, or transferable skills) than their peers in the control group.

The biggest overall differences between the groups on the SALG post survey appeared in the student responses to the following three items, all of which were prompted with “As a result of your work in this class, what gains did you make in the following skills?”

• Behaving as an effective team member

• Behaving as an effective leader

• Interacting productively to solve problems with a diverse group of classmates

All three of these items showed highly statistically significant differences between the groups (p < 0.0001 in all cases), with the experimental group reporting greater gains than the control group (Fig. 2 and Table 8). In contrast, skills unrelated to the difference in classroom formats, such as skills related to the laboratory work, revealed no differences between the groups. For example, no statistically significant difference was found in participants’ responses to the item “As a result of your work in this class, what gains did you make in the following skills: finding published data I need for lab work and report writing” (p = 0.6525). The absence of a statistically significant difference on this item makes sense, because all students completed the laboratory in the same format. In fact, student assignment to a lab section was independent of assignment to a specific classroom section of the course, so students from both groups were frequently in the same lab sections.


Fig. 2 Student assessment of learning gains in key transferable skills: control versus experimental groups. Skill 1: finding published data I need for lab work & report writing. Skill 2: critically evaluating debates in the media about issues related to science involving organic chemistry. Skill 3: interacting productively to solve problems with a diverse group of classmates. Skill 4: behaving as an effective leader. Skill 5: behaving as an effective teammate.
Table 8 SALG post survey mean values, standard errors of the mean, and t-test results: comparing control and experimental group construct index item responses for process skills. SALG items have a 5-point range, where 5 is high

Process skills

As a result of your work in this class, what gains did you make in the following? | Control mean | Control σM | Experimental mean | Experimental σM | Difference
Connecting key ideas and reasoning skills that I learn in my classes with other areas of my life | 3.13 | 0.09 | 3.46 | 0.08 | 0.324
Using systematic reasoning in my approach to problems | 3.73 | 0.08 | 3.97 | 0.07 | 0.244
Determining how each piece of new information or knowledge fits into the pattern of my existing knowledge | 3.83 | 0.08 | 4.09 | 0.06 | 0.261*
Critically analyzing data and arguments before I make an action plan | 3.61 | 0.08 | 3.85 | 0.07 | 0.243*
Developing testable hypotheses | 2.86 | 0.09 | 3.14 | 0.08 | 0.282*
Designing and executing experiments | 3.10 | 0.10 | 3.26 | 0.08 | 0.157
Finding published data I need for lab work and report writing | 2.78 | 0.09 | 2.84 | 0.08 | 0.055
Critically evaluating debates in the media about issues related to science involving organic chemistry | 2.41 | 0.09 | 2.61 | 0.09 | 0.204
Identifying patterns in data | 3.30 | 0.08 | 3.25 | 0.08 | −0.048
Recognizing a sound argument and appropriate use of evidence | 2.89 | 0.09 | 3.05 | 0.08 | 0.159
Writing documents in the style and format appropriate for chemists | 3.07 | 0.09 | 3.27 | 0.08 | 0.202
Behaving as an effective team member | 3.22 | 0.09 | 3.94 | 0.07 | 0.713****
Behaving as an effective leader | 2.97 | 0.10 | 3.74 | 0.07 | 0.776****
Interacting productively to solve problems with a diverse group of classmates | 3.11 | 0.09 | 4.02 | 0.07 | 0.903****
Analyzing chemical models and drawing appropriate conclusions | 3.83 | 0.08 | 4.08 | 0.06 | 0.246*
Enthusiasm about learning organic chemistry | 3.55 | 0.09 | 3.97 | 0.07 | 0.413***
Your comfort level working with complex ideas | 3.66 | 0.08 | 4.08 | 0.06 | 0.410****

How much did each of the following aspects of the class help your learning? | Control mean | Control σM | Experimental mean | Experimental σM | Difference
Working on example problems given on the blackboard | 4.31 | 0.07 | 4.33 | 0.06 | 0.024
The number and spacing of tests | 3.78 | 0.08 | 3.84 | 0.08 | 0.052
The fit between class content and tests | 3.78 | 0.08 | 3.94 | 0.07 | 0.157
The mental stretch required by tests | 3.54 | 0.09 | 3.68 | 0.08 | 0.135
The feedback on my work received after tests or assignments | 3.41 | 0.09 | 3.68 | 0.08 | 0.276*
Explanation of how the class activities, reading and assignments related to each other | 3.66 | 0.08 | 3.85 | 0.07 | 0.184
Explanation of why the class focused on the topics presented | 3.47 | 0.09 | 3.64 | 0.08 | 0.171
The instructional approach taken in this class | 3.95 | 0.08 | 3.70 | 0.09 | −0.242*
How the class topics, activities, reading and assignments fit together | 3.85 | 0.08 | 4.02 | 0.07 | 0.168
The pace of the class | 3.18 | 0.09 | 3.33 | 0.09 | 0.143

**** Pr |t| < 0.0001. *** Pr |t| < 0.001. ** Pr |t| < 0.01. * Pr |t| < 0.05. σM = standard error of the mean.


Students in the experimental group expressed high levels of awareness that their interactions with peers, teaching assistants, and the instructor aided their learning (Table 7, Interactive Learning construct). Interestingly, interactions with peers outside of class were given higher ratings by students in the experimental sections, and this result was statistically significant. Perhaps gaining comfort with classmates during in-class activities led to improved student perception of transferable skills and richer out-of-class studying opportunities with peers. Dialog about scientific concepts and problem solving with peers and mentors serves as important practice for the process of scientific discourse and is crucial to the transition from novice to expert in a field, because discourse with peers allows learners to progress in the resolution of misconceptions (Kulatunga et al., 2013). This dialogical process is analogous to the professional practice of debating new ideas and explanations among practicing scientists (Osborne, 2010). In addition, peer discussions during complex problem solving both inside and outside of class provide important practice in the collegiality and teamwork needed in professional environments. Along these lines, Repice and coworkers observed small groups working to solve chemistry problems and note, “students engage in joint decision-making by taking turns, questioning and explaining, and building on one another's ideas” (Repice et al., 2016). Clearly, students in our experimental group could see the value of practicing professional skills with peers in enhancing their learning.

When compared to the control group, students in the experimental group also rated working with teaching assistants during class as more helpful to their learning. Recall from the description of the courses that all students (both control and experimental groups) had one discussion section per week led by a teaching assistant. Students in the experimental section gained additional classroom exposure to all of the teaching assistants, not just their own discussion section leader, because the assistants circulated during the active learning activities. It therefore makes intuitive sense that the experimental group felt they learned more from the teaching assistants during class. Students in the experimental group also reported greater learning gains from the walk-in tutoring resource that was available to both groups. One explanation could be that exposure to a wider range of learning mentors during class time fostered greater comfort with teaching assistants and peers, which in turn led to greater help-seeking behaviour by the experimental group. Alternatively, the experimental group may simply have recognized more regularly that they did not yet fully understand all of the material in each class. We do not know, however, whether students in the experimental group were more likely to attend the Chemistry Resource Room, walk-in tutoring, or instructor office hours.

End-of-term student course evaluations

To explore the two groups' perceptions of general course dynamics and learning, we conducted t-tests comparing aggregate end-of-term, College-administered course evaluation data. Students in the experimental group rated their overall experience lower than their peers in the control group on key indicators of quality of instruction and course dynamics, such as clarity of requirements, encouragement of participation, and instructor enthusiasm and accessibility. Since these types of items are used by departments and upper-level administrators in decisions about faculty promotion and tenure, faculty may be justifiably concerned about the potential negative impact of such student course ratings. For example, students in the experimental group rated the overall quality of instruction as lower and the instructor as less enthusiastic, and these differences were highly statistically significant (Table 9). Paradoxically, however, students in the experimental group also rated the course as less difficult, even though the outside-of-class homework and graded assessments were identical for the two groups, as was the availability of instructor open office hours. Moreover, one common concern that instructors of flipped or student-centred classrooms hear from college students is the perception of excessive time requirements for outside-of-class preparation. These data, however, suggest that students in each course type perceived that they spent about the same amount of time engaging with course materials and studying for tests outside of class meetings. For most items there is no statistically significant difference in students' reports of progress toward core college-level learning objectives. An important exception is that students in the experimental group reported greater progress toward two key transferable skills: oral and written communication (Table 9). These also happen to be among the professional skills emphasized in the Royal Society of Chemistry accreditation requirements (RSC, 2016). Such results support the study's original hypothesis that primary content learning can be preserved whilst promoting confidence in, and mastery of, lifelong transferable skills.
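As an illustration of how the comparisons reported in Tables 9 and 10 can be read, the short sketch below reconstructs a two-sample (Welch's) t-test from the published summary statistics, together with the asterisk convention used in the table footnotes. This is a minimal illustration rather than the authors' analysis code, and the group sizes are hypothetical placeholders because per-item sample sizes are not reported in the tables.

    # Minimal sketch (not the authors' analysis code): Welch's t-test
    # reconstructed from published group means and standard errors of the
    # mean (sigma_M). Group sizes n1 and n2 are hypothetical placeholders.
    import math
    from scipy import stats

    def welch_t_from_summary(mean1, sem1, n1, mean2, sem2, n2):
        """Return (t, df, p) for the difference mean2 - mean1."""
        se_diff = math.sqrt(sem1 ** 2 + sem2 ** 2)  # SE of the difference
        t = (mean2 - mean1) / se_diff
        # Welch-Satterthwaite approximation to the degrees of freedom
        df = se_diff ** 4 / (sem1 ** 4 / (n1 - 1) + sem2 ** 4 / (n2 - 1))
        p = 2 * stats.t.sf(abs(t), df)              # two-sided Pr > |t|
        return t, df, p

    def stars(p):
        """Asterisk convention used in the footnotes of Tables 9 and 10."""
        for cutoff, mark in [(0.0001, "****"), (0.001, "***"),
                             (0.01, "**"), (0.05, "*")]:
            if p < cutoff:
                return mark
        return ""

    # "Oral expression skills" row of Table 9: control 2.28 (sigma_M 0.06),
    # experimental 2.94 (sigma_M 0.09); n = 150 per group is assumed.
    t, df, p = welch_t_from_summary(2.28, 0.06, 150, 2.94, 0.09, 150)
    print(f"difference = {2.94 - 2.28:.2f}, t = {t:.2f}, flag = {stars(p)}")

With these assumed group sizes, the 0.66-point difference in oral expression skills is flagged ****, matching the Pr |t| < 0.0001 entry in Table 9; because the t statistic is so large, the qualitative conclusion is insensitive to the exact class sizes chosen.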
Table 9 t-Test results comparing students’ ratings of the overall course experience, reported on end-of-term evaluations. Ratings on a 5-point scale, where 5 is high
Control group Experimental group Difference
Mean σM Mean σM
**** Pr |t| < 0.0001. *** Pr |t| < 0.001. ** Pr |t| < 0.01. * Pr |t| < 0.05.
Quality of course 4.24 0.03 4.13 0.05 −0.109
Quality of instruction 4.49 0.04 4.07 0.06 −0.419****
Effort/work 4.58 0.04 4.44 0.04 −0.140*
Difficulty of subject 4.41 0.04 4.08 0.05 −0.326****
Intellectual stimulation 4.47 0.04 4.45 0.04 −0.017
Instructor enthusiasm 4.88 0.02 4.50 0.05 −0.385****
Instructor access 4.43 0.03 4.25 0.06 −0.179**
Participation encouraged 4.37 0.04 4.15 0.07 −0.224**
Clarity of requirements 4.58 0.04 3.93 0.08 −0.651****
Quality of feedback 4.08 0.04 4.10 0.07 0.022
Fairness of grading 4.07 0.05 4.15 0.06 0.089
Progress toward
Factual knowledge 4.40 0.04 4.37 0.05 −0.024
Fundamental concepts 4.41 0.04 4.40 0.05 −0.011
Applying concepts 4.33 0.04 4.32 0.05 −0.016
Analyzing ideas and points of view 3.30 0.06 3.27 0.09 −0.030
Synthesizing knowledge 4.33 0.04 4.22 0.06 −0.104
Conducting inquiry w/ methods of the field 3.87 0.05 3.69 0.08 −0.179*
Evaluate merit of ideas 3.11 0.06 3.00 0.09 −0.105
Oral expression skills 2.28 0.06 2.94 0.09 0.656****
Writing skills 2.52 0.06 2.92 0.08 0.400****
Time spent outside of class 3.53 0.03 3.55 0.04 0.017


Judging from the written comments on the course evaluations, very few students in the experimental group appeared to be fully aware that they were learning both organic chemistry and useful career skills. In fact, numerous students expressed hostility towards the cooperative learning classroom structure in their written comments, and these students appeared to believe that they could only learn by lecture. This could be because the students are at an early stage in their college careers, but it might also arise from individual students holding a personal epistemology with an authority-based justification of knowledge (Barger et al., 2015, 2017). Such comments corroborate the numerical differences in perceptions of course quality, quality of instruction, instructor enthusiasm, and instructor accessibility. For example, one student commented:

It is extremely hard especially when you are in the flipped section and not the lecture section. There is never any actual instruction.

However, other free-response comments on the course evaluations illustrated that some students in the experimental group did recognize that they were learning the course content at a deeper level, as well as gaining significant practice in important skills. For example, two different students commented:

Organic chemistry involved a lot more reasoning and critical thinking than I expected. This is a very logical course rather than just sheer memorization that most science courses (bio/general chemistry) tend to be.

The “backwards” classroom style definitely suited the course material. It promoted cooperation and problem-solving as opposed to memorization.

To evaluate whether there were notable differences in students' perceptions of their skills, competencies, and attitudes at the start of the term, which might influence our interpretations of the post-test data, we conducted t-tests on data collected via the baseline/pre-course SALG (Table 10). With respect to the questions the EFA designates as interactive learning or process skills, there were no a priori differences in students' self-reported competencies. The only statistically significant differences in the baseline survey data occur among the concept learning questions (see ESI). Example items include:

• how to recognize the difference between conformations, constitutional isomers, and configurational isomers, including enantiomers

• how to identify the stereochemical relationship between structures

Table 10 t-Test results comparing students’ results for the SALG baseline or pre-course administration. Ratings on a 5-point scale, where 5 is high
Control group Experimental group Difference
Mean σM Mean σM
**** Pr |t| < 0.0001. *** Pr |t| < 0.001. ** Pr |t| < 0.01. * Pr |t| < 0.05. σM = standard error of the mean.
Chemical modeling
How to use resonance, when it is present, to predict the reactivity of a given compound 4.58 0.04 4.55 0.04 −0.04
How to draw Lewis dot structures 4.51 0.04 4.44 0.04 −0.07
How to assign formal charge 4.52 0.04 4.45 0.04 −0.07
How to predict 3-D shapes and polarity of molecules 3.83 0.06 3.78 0.05 −0.05
Orbitals, including atomic, hybrid and molecular 3.54 0.06 3.55 0.06 0.01
Concept learning
How to recognize the difference between conformations, constitutional isomers, and configurational isomers, including enantiomers 2.46 0.07 2.26 0.06 −0.19*
How to identify chiral molecules 1.40 0.06 1.42 0.06 0.01
How to draw the stereoisomers of chiral compounds 1.42 0.06 1.37 0.06 −0.04
How to identify the stereochemical relationship between structures 1.70 0.07 1.43 0.06 −0.28**
How to explain the mechanisms of uni- and bimolecular substitutions and eliminations 1.60 0.07 1.52 0.06 −0.07
How to identify nucleophiles and electrophiles and predict reactions between them 1.81 0.07 1.52 0.06 −0.29**
How to write a proper reaction mechanism for a given reaction with known products 2.38 0.08 2.02 0.07 −0.36***
How to write mechanisms to predict outcomes for a half dozen types of addition reactions 1.79 0.07 1.63 0.07 −0.17
How to identify all the functional groups present in a given organic chemical 2.36 0.07 2.31 0.07 −0.06
How to propose and predict the outcome of acid–base reactions 2.54 0.07 2.43 0.06 −0.11
How IR, NMR, and mass spectrometry work and how I can use each kind of spectrum to identify structural facets of an unknown molecule 1.47 0.06 1.56 0.06 0.09
How to plan workable multi-step sequences of reactions that convert simple starting compounds into complex organic products using my knowledge of functional group reactions 1.57 0.06 1.48 0.06 −0.09
(I can) list several common synthesis methods for, as well as several common reactions of:
Alkyl halides 1.73 0.07 1.42 0.06 −0.31***
Alkenes 1.90 0.08 1.57 0.06 −0.32***
Alcohols 1.53 0.06 1.48 0.06 −0.05
Ketones and aldehydes 1.30 0.05 1.30 0.05 0.00
Carboxylic acids and their derivatives 1.38 0.06 1.36 0.05 −0.02
Process skills
Develop testable hypotheses 3.36 0.06 3.46 0.06 0.10
Design and execute experiments 3.18 0.07 3.35 0.06 0.17
Find published data I need for lab work and report writing 3.22 0.07 3.28 0.07 0.06
Critically evaluate debates in the media about issues related to science involving organic chemistry 2.57 0.07 2.60 0.07 0.03
Identify patterns in data 3.48 0.06 3.43 0.06 −0.05
Recognize a sound argument and appropriate use of evidence 3.53 0.07 3.55 0.06 0.02
Write documents in the style and format appropriate for chemists 2.94 0.07 3.01 0.07 0.08
Behave as an effective team member 4.16 0.05 4.18 0.05 0.02
Behave as an effective leader 3.90 0.06 4.05 0.05 0.15
Interact productively to solve problems with a diverse group of classmates 3.95 0.06 4.03 0.05 0.08
Analyze chemical models and draw appropriate conclusions 3.00 0.06 3.12 0.06 0.13
Enthusiastic about learning organic chemistry 3.50 0.07 3.50 0.07 0.00
Comfortable working with complex ideas 3.23 0.06 3.38 0.06 0.15


For these items, students in the experimental sections had lower mean self-ratings of concept mastery at the beginning of the course. Given this initial deficit, the finding that end-of-course differences in content learning were neutral (course final exams) or positive (select SALG items) suggests that the experimental pedagogy mitigated these baseline differences and improved content learning over time. Likewise, the parity between course formats in interactive learning and perceived process skills at the beginning of the term, contrasted with the clearly positive perceptions of learning outcomes expressed by students in the experimental group in the SALG post-course data, suggests that cooperative learning is a viable innovation in undergraduate STEM education for reasons beyond gains in technical skills or content knowledge.
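For concreteness, the same arithmetic can be applied to one of these baseline concept-learning items. The sketch below, again using a hypothetical group size, checks the Table 10 row "How to write a proper reaction mechanism for a given reaction with known products" (difference −0.36, flagged ***).

    # Consistency check on one Table 10 row (illustrative only; n = 150
    # per group is an assumed value, not taken from the paper).
    import math
    from scipy import stats

    sem_diff = math.sqrt(0.08 ** 2 + 0.07 ** 2)   # SE of the difference
    t = (2.02 - 2.38) / sem_diff                  # about -3.4
    df = sem_diff ** 4 / (0.08 ** 4 / 149 + 0.07 ** 4 / 149)
    p = 2 * stats.t.sf(abs(t), df)                # about 7e-4, i.e. Pr |t| < 0.001
    print(f"t = {t:.2f}, p = {p:.1g}")            # consistent with the *** flag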

Conclusions

Direct classroom observations led to the conclusion that we successfully implemented two different classroom environments in terms of the amount of time spent on active learning versus passively listening to lecture. Data from the SALG survey administrations indicate that classroom structures promoting cooperative learning are associated with greater perceived development of concept mastery; generic skills, such as the ability to work collaboratively in a team and demonstrate leadership in a problem-solving task; and transferable process skills, such as the ability to work with complex ideas, recognize valid sources of data, and draw conclusions on the basis of evidence. Course evaluation data reveal no difference in the reported amount of time required outside of class, suggesting that the cooperative learning format is a sustainable practice even in large class sections, at least in terms of students' workload. Both the SALG surveys and the course evaluations provide indirect evidence of student gains and behaviour; future research efforts would benefit from the collection of direct evidence demonstrating acquired transferable skill use outside of the classroom setting. In addition, adding a journal or portfolio assignment to the active learning setting could be a key future innovation, allowing students to reflect upon and articulate their own transferable skills gains over the course of the semester.

One may argue that the process skills evaluated by the SALG instrument represent essential 21st century learning objectives, including critical thinking (Stein et al., 2007), information literacy, inquiry and analysis, and problem-solving, among others (Maki, 2015). The SALG instrument also probes the transferable skills required for degree program accreditation by organizations such as the Royal Society of Chemistry (RSC), the American Chemical Society (ACS), and the Accreditation Board for Engineering and Technology (ABET) (ACS, 2015; ABET, 2016; RSC, 2016). Innovations in pedagogy and curriculum design, as well as companion evaluative methodologies, are an essential response to these learning mandates, and the present study provides evidence of a workable approach to interactive, collaborative learning in an introductory science course. Without a doubt, flipped classrooms and other cooperative learning formats constitute viable innovations in undergraduate STEM education for important reasons beyond simply gains in technical skills or content knowledge.

Limitations

Like all education research experiments, this study has limitations that should be considered. Most importantly, whenever indirect data such as self-reports of learning are employed in an analysis, one must bear in mind the possibility of reporting bias. The SALG instrument design and informed consent procedures employed in this study sought to minimize reporting bias by assuring students that their responses would remain anonymous. As Jordan notes, "the [SALG] system provides an environment in which students can express their genuine opinions without any apprehension of negative consequences" (Jordan, 2012). Previous work has demonstrated that the SALG instrument is both valid and reliable for measuring students' perceptions of their learning in active learning chemistry classrooms (Vishnumolakala et al., 2016). We assessed skill development by asking both groups of students whether they had improved as a result of the course; this is significantly different from directly measuring the skills, which we did not do. As noted above, students are reportedly overconfident in assessing their own transferable skills, and we have assumed that students in both groups would be overconfident by similar amounts. Since our results are based on student reports (an indirect measure of transferable skills), further studies are needed to measure transferable skills in cooperative learning classrooms more directly. In general, the indirect measurement of transferable skills development is a limitation of this study.

This study was relatively small in scope and sample size and was limited to a single institution in an American educational context. The student population studied was fairly homogeneous in terms of both age and enrolment status: more than 90% of enrolled undergraduate students at the institution are of traditional college age (under 24 years old) and carry a full-time course load. In addition, because the study was conducted at an institution that is highly selective in its undergraduate admissions, the students in this study may not constitute a representative sample of the global population of students planning to pursue STEM or health professions. Most students in this study neither worked full-time during the academic year nor served as the primary caregiver to others. Given the characteristics of the populations studied herein, caution must be exercised in extrapolating the findings to populations with more non-traditional students who are older, work full-time, and/or serve as the primary caregiver to children or elderly parents. To protect individual confidentiality, and due to the modest number of participants, the results of this study cannot be disaggregated by gender, race, ethnicity, or other demographic factors.

Research ethics

The work reported herein was conducted following the guidelines set forth by Duke University's Institutional Review Board (Approved IRB Protocol number B0487: Learning in Organic Chemistry).

Acknowledgements

The first author thanks Professor Suzanne Ruder for allowing the use of her organic chemistry POGIL activities prior to their publication. Molly Goldwasser and Kim Manturuk are thanked for coding and administering the survey via Qualtrics software as well as tracking student completion lists for the instructor. Richard MacPhail is thanked for helpful comments on the draft. We are grateful for seed monies from the Paletz Innovative Teaching grant program and the Dean of Arts and Sciences minigrant program. A portion of the work described herein was supported by grants from the Duke University Arts and Sciences Faculty Assessment Committee and the Duke University endowment.

References

  1. Accreditation Board for Engineering and Technology (ABET), Accreditation Policy and Procedure Manual (APPM), 2016–2017, http://www.abet.org/accreditation/accreditation-criteria/accreditation-policy-and-procedure-manual-appm-2016-2017/, accessed September 25, 2016.
  2. American Chemical Society (ACS), Undergraduate professional education in chemistry: ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs, https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf, accessed September 15, 2016.
  3. Ashraf S. S., Marzouk S. A. M., Shehadi I. A. and Murphy B. M., (2011), An integrated professional and transferable skills course for undergraduate chemistry students, J. Chem. Educ., 88(1), 44–48.
  4. Bailey J. and Mitchell R. B., (2007), Industry perceptions of the competencies needed by computer programmers: technical, business, and soft skills, J. Comput. Inf. Syst., 47(2), 28–33.
  5. Bailey C. P., Minderhout V. and Loertscher J., (2012), Learning transferable skills in large lecture halls: implementing a POGIL approach to biochemistry, Biochem. Mol. Biol. Educ., 40(1), 1–7.
  6. Barger M. M., Perez A., Canelas D. A. and Linnenbrink-Garcia L., (2015), Constructivism and personal epistemology development, paper presented at the American Educational Research Association (AERA) National Meeting, Chicago, IL, April 2015.
  7. Barger M. M., Perez A., Canelas D. A. and Linnenbrink-Garcia L., (2017), Constructivism and personal epistemology development in undergraduate chemistry students, Sci. Educ., in revision.
  8. Barr D. A., (2010), Questioning the Premedical Paradigm: Enhancing Diversity in the Medical Profession a Century After the Flexner Report, Baltimore, MD: The Johns Hopkins University Press, p. 240.
  9. Barr D. A., Matsui J., Wanat S. F. and Gonzalez M. E., (2010), Chemistry courses as a turning point for premedical students, Adv. Health Sci. Educ., 15, 45–54.
  10. Baytiyeh H., (2012), Disparity between college preparation and career demands for graduating engineers, Int. J. Eng. Educ., 28(5), 1221–1231.
  11. Bradley A. Z., Ulrich S. M., Jones M., Jr. and Jones S. M., (2002), Teaching the sophomore organic course without a lecture. Are you crazy? J. Chem. Educ., 79, 514.
  12. Canelas D. A., (2015), Teaching college chemistry to the edges rather than to the average: implications for less experienced science students, in Daus K. and Rigsbee R. (ed.), The Promise of Chemical Education: Addressing our Students' Needs, American Chemical Society Symposium Series 1193, Oxford, UK: Oxford University Press, pp. 11–28.
  13. Carnevale A. P., Gainer L. J. and Meltzer A. S., (1990), Workplace basics: the essential skills employers want, San Francisco: Jossey-Bass.
  14. Chase A., Pakhira D. and Stains M., (2013), Implementing Process-Oriented, Guided-Inquiry Learning for the first time: Adaptations and Short-term Impacts on Students' Attitude and Performance, J. Chem. Educ., 90(4), 409–416.
  15. Chi M. T. H., (2009), Active-constructive-interactive: a conceptual framework for differentiating learning activities, Top. Cognit. Sci., 1(1), 73–105.
  16. Chi M. T. H. and Wylie R., (2014), The ICAP framework: linking cognitive engagement to active learning outcomes, Educ. Psychol., 49(4), 219–243.
  17. Conway C. J., (2014), Effects of guided inquiry versus lecture instruction on final grade distribution in a one-semester organic and biochemistry course, J. Chem. Educ., 91, 480–483.
  18. Cronbach L. J., (1951), Coefficient alpha and the internal structure of tests, Psychometrika, 16(3), 297–334.
  19. Cureton E. E. and D'Agostino R. B., (1983), Factor Analysis: An Applied Approach, Hillsdale, NJ: Erlbaum.
  20. Daempfle P. A., (2004), An analysis of the high attrition rates among first year college science, math, and engineering majors, J. Coll. Stud. Ret., 5, 37.
  21. Dinan F. J. and Frydrychowski V. A., (1995), A team learning method for organic chemistry, J. Chem. Educ., 72, 429–431.
  22. Fabrigar L. R., Wegener D. T., MacCallum R. C. and Strahan E. J., (1999), Evaluating the use of exploratory factor analysis in psychological research, Psychol. Meth., 4(3), 272–299.
  23. Fautch J. M., (2015), The flipped classroom for teaching organic chemistry in small classes: is it effective? Chem. Educ. Res. Pract., 16, 179–186.
  24. Felder R. M. and Brent R., (2009), Active Learning: An Introduction. ASQ Higher Education Brief 2, no. 4, http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Papers/ALpaper%28ASQ%29.pdf, accessed September 22, 2016.
  25. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111(23), 8410–8415.
  26. Goldwasser M., Mosley P. L. and Canelas D. A., (2016), Implementation of online lecture videos in introductory chemistry, in Sorensen P. (ed.), Online Course Development and the Effects on the On-Campus Classroom, American Chemical Society Symposium Series, pp. 63–73.
  27. Gorsuch R. L., (1983), Factor Analysis, 2nd edn, Hillsdale, NJ: Erlbaum.
  28. Granger E. M., Bevis T. H., Saka Y., Southerland S. A., Sampson V. and Tate R. L., (2012), The efficacy of student-centered instruction in supporting science learning, Science, 338, 105–108.
  29. Hall D. M., Curtin-Soydan A. J. and Canelas D. A., (2014), The science advancement through group engagement program: leveling the playing field and increasing retention in science, J. Chem. Educ., 91(1), 37–47.
  30. Hanson S. and Overton T., (2010), Skills Required by New Chemistry Graduates and Their Development in Degree Programmes, Hull, UK: Higher Education Academy UK Physical Sciences Centre, http://www.rsc.org/learn-chemistry/resources/business-skills-and-commercial-awareness-for-chemists/docs/skillsdoc1.pdf, accessed January 26, 2017.
  31. Hart Research Associates, (2015), Falling Short? College Learning and Career Success, Washington, DC: Association of American Colleges & Universities, https://aacu.org/leap/public-opinion-research/2015-survey-falling-short, accessed January 7, 2017.
  32. Heckman J. J. and Kautz T., (2012), Hard evidence on soft skills, Lab. Econ., 19, 451–464.
  33. Hein S. M., (2012), Positive impacts using POGIL in organic chemistry, J. Chem. Educ., 89, 860.
  34. Hibbard L., Sung S. and Wells B., (2016), Examining the effectiveness of a semi-self-paced flipped learning format in a college general chemistry sequence, J. Chem. Educ., 93, 24–30.
  35. Jordan T., (2012), Using the SENCER-SALG to reveal student learning in a large-scale environmental chemistry course for non-majors, in Sheardy R. D. and Burns W. D. (ed.) Science Education and Civic Engagement: The Next Level, ACS Symposium Series, vol. 1121, pp. 179–215.
  36. Kaiser H. F., (1960), The application of electronic computers to factor analysis, Educ. Psychol. Meas., 20(1), 141–151.
  37. Kerr S. and Runquist O., (2005), Are We Serious about Preparing Chemists for the 21st Century Workplace or Are We Just Teaching Chemistry? J. Chem. Educ., 82(2), 231–233.
  38. Kim J.-O. and Mueller C. W., (1978), Factor analysis: statistical methods and practical issues, Newbury Park, CA: Sage.
  39. Kulatunga U., Moog R. S. and Lewis J. E., (2013), Argumentation and participation patterns in general chemistry peer-led sessions, J. Res. Sci. Teach., 50, 1207–1231.
  40. Lategan M. J., (2016), Enhancing employability through group work, Microbiol. Aust., 37(2), 88–89.
  41. Loudon G. M., (2009), Organic Chemistry, 5th edn, Greenwood Village, CO: Roberts & Company, p. 1353.
  42. Lowery Bretz S., Fay M., Bruck L. B. and Towns M. H., (2013), What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory, J. Chem. Educ., 90, 281–288.
  43. Maki P., (2015), Assessment that works: a national call, a twenty-first century response, Washington, DC: Association of American Colleges and Universities.
  44. Makoul G., Curry R. H. and Novack D. H., (1998), The future of medical school courses in professional skills and perspectives, Acad. Med., 73, 48–51.
  45. Martin R., Maytham B., Case J. and Fraser D., (2005), Engineering graduates’ perceptions of how well they were prepared for work in industry, Eur. J. Eng. Educ., 30(2), 167–180.
  46. Mittendorf I. and Cox J. R., (2013), Around the beta-turn: an activity to improve the communication and listening skills of biochemistry students, J. Chem. Educ., 90(11), 1476–1478.
  47. Moog R. S. and Spencer J. N. (ed.), (2008), POGIL: Process Oriented Guided Inquiry Learning, New York, NY: Oxford University Press.
  48. Moog R. S., Creegan J. S., Hanson M. D., Spencer N. J., Straumanis A. and Bunce D. M., (2009), POGIL: Process-oriented guided-inquiry learning, in Pienta N., Cooper M. M. and Greenbowe T. J. (ed.), Chemists’ Guide to Effective Teaching, Upper Saddle River, NJ: Prentice Hall, vol. 2, pp. 90–101.
  49. Murdoch-Eaton D. and Whittle S., (2012), Generic skills in medical education: developing the tools for successful lifelong learning, Med. Educ., 46, 120–128.
  50. Nevo B., (1985), Face validity revisited, J. Educ. Meas., 22(4), 287–293.
  51. Noble L. M., Kubacki A., Martin J. and Lloyd M., (2007), The effect of professional skills training on patient-centredness and confidence in communicating with patients, Med. Educ., 41, 432–440.
  52. Nunnally J., (1978), Psychometric Theory, New York, NY: McGraw-Hill.
  53. Oliver-Hoyo M. T., (2011), Content coverage in a lecture format versus activity-based instruction, in Bunce D. M. (ed.), Investigating Classroom Myths through Research on Teaching and Learning, Washington, DC: ACS, pp. 33–50.
  54. Oliver-Hoyo M. T. and Beichner R. J., (2004), SCALE-UP: bringing inquiry-guided learning to large enrollment courses, in Lee V. S. (ed.), Teaching and Learning Through Inquiry, Sterling, VA: Stylus, pp. 51–79.
  55. Osborne J., (2010), Arguing to learn in science: the role of collaborative, critical discourse, Science, 328, 463–466.
  56. Pontin J. A., Arico E., Pitoscio Filho J., Tiedemann P. W., Isuyama R. and Fettis G. C., (1993), Interactive chemistry teaching units developed with the help of the local chemical industry: applying classroom principles to the real needs of local companies to help students develop skill in teamwork, communications, and problem solving, J. Chem. Educ., 70(3), 223. DOI: 10.1021/ed070p223.
  57. Randles C. A. and Overton T. A., (2015), Expert vs. novice: approaches used by chemists when solving open-ended problems, Chem. Educ. Res. Pract., 16, 811–823.
  58. Reid S. A., (2016), A flipped classroom redesign in general chemistry, Chem. Educ. Res. Pract., 17, 914–922.
  59. Rein K. S. and Brookes D. T., (2015), Student response to a partial inversion of an organic chemistry course for non-chemistry majors, J. Chem. Educ., 92, 797–802.
  60. Repice M. D., Sawyer R. K., Hogrebe M. C., Brown P. L., Luesse S. B., Gealy D. J. and Frey R. F., (2016), Talking through the problems: a study of discourse in peer-led small groups, Chem. Educ. Res. Pract., 17, 555–568.
  61. Royal Society of Chemistry, Accreditation of Degree Programmes, http://www.rsc.org/images/accreditation-degree-programme_tcm18-151306.pdf, accessed September 15, 2016.
  62. Ruder S., (2015), Organic Chemistry: A Guided Inquiry, New York, NY: Wiley, p. 306.
  63. Ryan M. D. and Reid S. A., (2016), Impact of the Flipped Classroom on Student Performance and Retention: A Parallel Controlled Study in General Chemistry, J. Chem. Educ., 93, 13–23.
  64. Seymour E., (1995), Why undergraduates leave the sciences, Am. J. Phys., 63, 199.
  65. Seymour E. and Hewitt N. M., (2000), Talking about Leaving: Why Undergraduates Leave the Sciences, Boulder, CO: Westview Press, p. 444.
  66. Seymour E., Wiese D. J., Hunter A.-B. and Daffinrud S. M., (2000), Creating a better mousetrap: On-line student assessment of their learning gains, paper at the National Meeting of the American Chemical Society, San Francisco, CA, http://t.salgsite.org/docs/SALGPaperPresentationAtACS.pdf, accessed September 15, 2016.
  67. Shuman L. J., Besterfield-Sacre M. and McGourty J., (2005), The ABET “professional skills” – Can they be taught? Can they be assessed? J. Eng. Educ., 94(1), 41–55.
  68. Siburt C. J. P., Bissell A. N. and MacPhail R. A., (2011), Developing metacognitive and problem-solving skills through problem manipulation, J. Chem. Educ., 88, 1489–1495.
  69. Stein B., Haynes A., Redding M., Ennis T. and Cecil M., (2007), Assessing critical thinking in STEM and beyond, Innovations in e-learning, instruction technology, assessment, and engineering education, Netherlands: Springer, pp. 79–82.
  70. Straumanis A., (2009), Organic Chemistry: A Guided Inquiry, 2nd edn, Boston, MA: Houghton Mifflin Harcourt, p. 528.
  71. Taber K. S., (2016), Learning generic skills through chemistry education, Chem. Educ. Res. Pract., 17, 225–228.
  72. Tien L. T., Roth V. and Kampmeier J. A., (2002), Implementation of a peer-led team learning instructional approach in an undergraduate organic chemistry course, J. Res. Sci. Teach., 39(7), 606–632.
  73. Vishnumolakala V. R., Southam D. C., Treagust D. F. and Mocerino M., (2016), Latent constructs of the students’ assessment of their learning gains instrument following instruction in stereochemistry, Chem. Educ. Res. Pract., 17(2), 309–319.
  74. Warfa A.-R. M., (2016), Using cooperative learning to teach chemistry: a meta-analytic review, J. Chem. Educ., 93, 248–255.
  75. Weaver G. C. and Sturtevant H. G., (2015), Design, implementation, and evaluation of a flipped format general chemistry course, J. Chem. Educ., 92, 1437–1448.
  76. Whittington C. P., Pellock S. J., Cunningham R. L. and Cox J. R., (2014), Combining content and elements of communication into an upper-level biochemistry course, Biochem. Mol. Biol. Educ., 42(2), 165–173.

Footnote

Electronic supplementary information (ESI) available: Appendix 1: SALG baseline survey. Appendix 2: SALG post survey. See DOI: 10.1039/c7rp00014f

This journal is © The Royal Society of Chemistry 2017