Comparison of student attitudes and performance in an online and a face-to-face inorganic chemistry course

Hannah T. Nennig a, Katrina L. Idárraga a, Luke D. Salzer a, April Bleske-Rechek b and Roslyn M. Theisen *a
aDepartment of Chemistry, University of Wisconsin–Eau Claire, Eau Claire, Wisconsin 54702, USA. E-mail: theiserm@uwec.edu
bDepartment of Psychology, University of Wisconsin–Eau Claire, Eau Claire, Wisconsin 54702, USA

Received 2nd May 2019, Accepted 4th August 2019

First published on 13th August 2019


Despite recent interest in online learning, systematic comparisons of online learning environments with traditional classroom environments are limited, particularly in the sciences. Here, we report on a systematic comparison of an online and face-to-face classroom for a sophomore-level, lecture-only introductory inorganic chemistry course that is designed for students pursuing a chemistry major or minor. The online group consisted of three sections of students enrolled in Introduction to Inorganic Chemistry during two 8-week summer terms and one 4-week winter term. The face-to-face group consisted of two sections of students enrolled in Introduction to Inorganic Chemistry during two 15-week semesters. Both groups of students completed ten common exam questions and a validated, reliable measure of their attitudes toward chemistry (Attitude toward the Subject of Chemistry Inventory Version 2: ASCIv2). Students in the online course and face-to-face course did not differ in their performance on the common exam questions, course grade distribution, or attitudes toward chemistry. Although few studies have compared online and face-to-face learning environments in the physical sciences, our results are consistent with the idea that students who complete an online course fare just as well as those who attend face-to-face classes.


Introduction

Discussion of online education at the undergraduate level has increased dramatically (Mayadas et al., 2009). The concept of online education is not new, as it is fundamentally a version of distance education, which has persisted within higher education for decades (Bernard et al., 2004). From an institutional point of view, some faculty and universities feel pressure to expand online offerings due to capacity constraints or other budgetary demands (Lack, 2013; Nguyen, 2015). Still, one of the most cited reasons for the increase in online offerings is their potential to provide more access to content and instruction at any time and place, especially for students who cannot or choose not to attend traditional academic settings (Means et al., 2010). Some online learning environments enable students to work at their own pace and provide automatic or immediate feedback through online learning management systems, so students can identify points of confusion sooner. Further, online education operates in a period of expansive new technologies and applications that support synchronous and asynchronous online learning, including threaded discussion boards, blogs, webcasting, learning analytics, and adaptive learning websites (Swan, 2003; Driscoll et al., 2012; Nguyen, 2015). Thus, online learning offerings remain popular with researchers and educators for two reasons. Online courses can be designed to enhance the quality of learning experiences and outcomes by giving learners control of their interactions with media (Means et al., 2010). They can also modernize courses at the undergraduate level, overcoming the limitations of the traditionally large and passive lecture format by supporting the learning process through self-pacing and interactive problem solving (Evans and Leinhardt, 2008; Bernard et al., 2017).
However, these benefits are not always realized, especially if the instructor does not incorporate the available tools that differentiate online pedagogical methods from traditional ones. Additionally, faculty and academic staff interest in developing online courses may be hindered by uncertainty about how to use online learning tools, concerns about increased workload, and worries about academic integrity or how to maintain the ‘rigor’, coherence or viability of content in online learning environments (Seery, 2012; Seery, 2015). Studies have also shown that low-income and underprepared students drop out of online classes at a higher rate than observed in face-to-face settings, which complicates the stated goal of improving access to education through online coursework (Cooper and Stowe, 2018).

A central question about online learning is its effectiveness compared to traditional (face-to-face) instruction. Ideally, courses taught online should provide at least the same skill development and content as the face-to-face experiences do. Several published meta-analyses, research reports and reviews of online learning studies in the last fifteen years have reported either no significant difference in terms of effectiveness when compared to the traditional classroom learning (Schoenfeld-Tacher et al., 2001; Bernard et al., 2004; Tallent-Runnels et al., 2006; Driscoll et al., 2012) or reported that students in online conditions performed modestly better, on average, than those learning the same material through face-to-face instruction (Means et al., 2010). In 2013, a review of the research on online postsecondary education stated that there is “little evidence to support broad claims that online or hybrid learning is significantly more effective or significantly less effective than courses taught in a face-to-face format” and encouraged further studies on this topic (Lack, 2013). Still, in all the reports to date, few studies on the efficacy of online learning environments have been reported for introductory or advanced courses in the physical sciences (Gulacar et al., 2013). This lack of research could be a result of the relatively small number of online courses offered in the physical sciences (Online courses and virtual laboratories survey results, 2019) or the difficulty in creating fully online courses that contain a mandatory laboratory component, common to many physical science courses. Such a limited number of studies reveals the necessity for more research comparing online and face-to-face course formats, specifically within the sciences – for majors as well as for non-majors, and for lecture plus lab courses as well as lecture-only courses.

This study and the accompanying literature review focus entirely on online environments in undergraduate courses in chemistry. Only a few studies have compared online versus face-to-face formats in undergraduate chemistry courses. Of these, none systematically examined both student performance in chemistry and student attitudes toward chemistry. In 2009, Weaver, Green, Epp and Rahman reported an exploratory study comparing communication in a traditional face-to-face recitation with an online office hour for a general chemistry course at two different universities, and concluded that students’ and instructors’ modes of communication were impacted by the format (Weaver et al., 2009). In 2012, Seery described the creation of an online module for a computers-for-chemistry course, reporting higher average class grades for the online groups, responses to a student perception survey and lessons learned for the practitioner (Seery, 2012). In 2013, Gulacar, Damkaci and Bowman reported a comparative study of an online and a traditional introductory chemistry course for non-science majors, which revealed that students performed equally well on exam questions that required lower-order thinking skills. However, students enrolled in the traditional (face-to-face) course performed better than students enrolled in the online course on the exam questions that required higher-order thinking skills (Gulacar et al., 2013). In that study, student attitudes towards chemistry were not examined. In 2018, Faulconer, Griffith, Wood, Acharyya and Roberts compared student pass and withdrawal rates and grade distributions for an online and a traditional-format chemistry lecture/lab course: they found no significant difference between the two course formats in student pass or withdrawal rates. Although the final course grades were not standardized, students taking the online course were more likely to receive a higher grade than traditional students (Faulconer et al., 2018).
These studies relied on evidence such as student satisfaction and course grades, highlighting the need for more data comparing online and traditional course formats in undergraduate chemistry courses.

With only a few examples of fully online chemistry courses available, we also reviewed studies in two other related areas: Massive Open Online Courses (MOOCs) and chemistry courses in a flipped format. Since 2012, several chemistry MOOCs have been reviewed and compared (Leontyev and Baranov, 2013). Although these popular courses tend to elicit positive student satisfaction (Shapiro et al., 2017), quantitative data that supports the effectiveness of this learning environment compared to traditional classrooms remains absent. Quantitative studies are slowly emerging from the other related area, chemistry courses in the flipped format, in which online platforms are used to deliver content outside of class time, while in-class time is reserved for more high-impact, active learning exercises and activities (Bergmann and Sams, 2012). A number of recent peer-reviewed reports indicate a high level of student satisfaction, positive outcomes for underprepared and at-risk students, and the potential to reduce high withdrawal and failure rates; however, in most cases, overall student performance was not significantly different when a flipped course format was used as opposed to a non-flipped course (Bergmann and Sams, 2012; Smith, 2013; Schultz et al., 2014; Fautch, 2015; Weaver and Sturtevant, 2015; Eichler and Peeples, 2016; González-Gómez et al., 2016; Hibbard et al., 2016; Reid, 2016; Ryan and Reid, 2016). While these large-scale and blended approaches to instruction have many potential benefits, our focus is on undergraduate courses in chemistry in a fully online format.

While the effectiveness of online courses by evaluation of student performance is one important factor, so too is student attitude. Learners actively construct their own knowledge by building upon prior experiences and conceptions (Bodner, 1986). Attitude is an important mental construct that may influence, and be influenced by, student learning behaviours (Brandriet et al., 2011). Recent research has explored the relationship between chemistry students’ attitudes toward chemistry and content knowledge, achievement, other affective characteristics, and motivation (Xu et al., 2013; Chan and Bauer, 2014). Often learning formats are compared using researcher-developed student perception or satisfaction surveys that capture student opinions about the methodology or curricular changes (Smith, 2013; Schultz et al., 2014; Weaver and Sturtevant, 2015; González-Gómez et al., 2016; Hibbard et al., 2016). These often rely on measures (e.g., student comments and instructor evaluations) which lack a strong statistical foundation and are sometimes neither valid nor reliable. Among the small number of studies of online classrooms versus traditional classrooms in an undergraduate chemistry course, none, to our knowledge, have looked at both student learning (performance on exam questions and overall grades) and student attitudes towards chemistry (using a validated and reliable survey instrument). We seek to fill this gap.

The theoretical framework used in this research study was constructivism. Because prior experiences in the sciences have most likely occurred in a face-to-face environment, it is difficult to know whether changing the learning environment format will affect the construction of knowledge. Our study explores this idea by measuring both student performance and attitude. Attitude is a variable intended to capture the lived experience of completing the course by integrating content and knowledge with students’ perception of the experience. The online format for this study was built on the idea that prior knowledge must be activated. In the online course, the learning activities (video lectures, quizzes and exams, homework and discussion activities) were designed to provide students an opportunity to construct their own knowledge by building on prior understanding, which may influence their attitude towards chemistry.

Similarly, in the face-to-face course, types of learning activities (didactic lecture, quizzes and exams, homework and face-to-face discussion activities) gave students opportunities to construct their own knowledge by building on prior understanding. Students in either format used learning activities and materials to construct knowledge. Furthermore, students in both online and face-to-face groups had the opportunity to construct their knowledge in the “zone of proximal development,” as Vygotskian theorists suggest, where a learner interacts with a more capable individual (the instructor and peers in face-to-face discussions or on discussion boards) to bridge the gap between what the learner knows and what they are capable of knowing with instructional help (Vygotsky, 1978). It is unclear if the learning environment format will impact connections made between instructor and learner as well as students’ attitudes.

Research questions

In undertaking this work, we identified two specific questions:

(1) Does students’ learning of concepts in inorganic chemistry, as indexed by performance on common exam questions and course grade, differ in the online course compared to the face-to-face course?

(2) Do students’ attitudes toward chemistry, as indexed by responses to the ASCIv2, differ in the online course compared to the face-to-face course?

Methods

Courses studied

The online group consisted of students enrolled in three sections of Introduction to Inorganic Chemistry in two different 8-week summer terms and one 4-week winter term during the same two academic years. The face-to-face group consisted of students enrolled in two sections of Introduction to Inorganic Chemistry taken in two 15-week semesters during two different academic years. To avoid unequal sharing of strategies for successful learning within each format (the potential for cross-talk), the courses were offered during different terms. No laboratory topics are discussed because Introduction to Inorganic Chemistry is a lecture-only course at our institution.

Populations studied

The study was conducted at a public institution in the Midwestern United States with an undergraduate enrolment of 10,022 students. The chemistry course of interest in this study was Introduction to Inorganic Chemistry, a three-credit lecture-only foundational inorganic chemistry course, which is mainly taken by chemistry majors or minors. The enrolment of this course varies from 20 to 45 students for the face-to-face course and from 10 to 20 students for the online course each term. Prerequisites for this course include Honours General Chemistry or General Chemistry II, but many students take this course as a sophomore, junior or senior. Therefore, the number and level of chemistry courses students take prior to completing this course varies. General characteristics of students in the online and face-to-face groups are described in Table 1 and include a mix of first-year, sophomore, junior and senior students of mixed genders. For this study, only two gender categories were used, even though it is known that not everyone fits into binary categories. A greater percentage of female students than male students enrolled in the online group; however, no differences between groups were statistically significant. While self-selection could not be avoided, the student demographics of the groups are very similar. One predictor of success in chemistry courses is the grade a student received in a previous chemistry course or in high school chemistry. Table 1 displays students’ average course grade in Honours General Chemistry or General Chemistry II and their average overall college GPA coming into the course: these values show that students in the online and face-to-face groups are similar.
Table 1 Summary of demographic information and predictors of success for online and face-to-face-groups
Item Online group Face-to-face group
Total students in this study (Ntotal) 42 63
Class standing 4% first-year 3% first-year
31% sophomore 34% sophomore
33% junior 31% junior
31% senior 32% senior
Gender: F 60% 44%
Gender: M 40% 56%
Average course grade in General Chemistry II or Honours General Chemistry 3.0 = B 3.3 = B+
Average overall college GPA coming into the course 3.17 3.19


Instructor differences

Instructor A taught both face-to-face sections and Instructor B taught all the online sections. Instructor A has been teaching at the undergraduate level for 10+ years and Instructor B has been teaching at the undergraduate level for 20+ years. Both instructors have similar training and backgrounds (PhD in Inorganic Chemistry with an emphasis in bioinorganic chemistry) and both instructors consistently have comparable student evaluations (above the departmental average). The instructor of the online sections (Instructor B) had not taught an online course before but received training on best practices for online courses from our institution's Teaching and Learning Centre and had an interest in creating an online course. Due to scheduling constraints, it was not possible to use the same instructor for the two course formats, but the two instructors used the same textbook, shared some curricular materials, and together chose the material covered in the ten common exam questions. However, each instructor independently chose the stimuli presented to students for the discussion portion of the course, all exam questions besides the ten common exam questions, homework questions and quiz questions. We list course learning objectives for each of the groups in Appendix 1 (ESI). Both instructors used an absolute grade scale and did not curve grades. Because the courses were not offered during the same term, the students in the online course could not access the course materials via the course management system given to students in the face-to-face course and vice versa.

Description of the face-to-face course structure

Students in the face-to-face group met in a conventional 50-seat university classroom for three 50-minute lecture periods per week for 15 weeks. The face-to-face course format incorporated weekly active learning strategies (including minute paper assignments and reflection notecards), but class time mainly consisted of lecturing from PowerPoint slides and writing out lecture notes with a document camera. Attendance was not required; however, points were awarded for completing in-class problem-solving activities. Weekly homework sets were assigned and graded, and weekly online quizzes were given using an online learning management system. No other learning analytics, adaptive learning websites, or video lectures were integrated into this course. Students in the face-to-face group completed three midterm exams and a cumulative final exam in the classroom. All exams involved a mix of multiple choice and open response/short answer questions. The two sections of the face-to-face course were equivalent in terms of weekly time in lecture and time spent with the face-to-face instructor; types of homework assignments, exams and quizzes; and time allowed for student-to-student discussion. Only small variations existed between face-to-face sections (i.e., class size and semester taken).

Description of the online course structure

Using the conceptual framework for online learning described elsewhere (Means et al., 2010), the online course in this study is an example of an expository, asynchronous, face-to-face alternative online learning environment. Students in the online groups completed the entire course, including examinations, at a distance and did not participate in any on-campus instructional activities. The weekly course content was a combination of one-way video lectures and web resources. Two or three videos were typically assigned each week and made available only on the course management system; each was paired with an assignment. Points were awarded for posting questions to the online discussion board on the course management system (one question or response to a question per week) three times during the term. Students in the online groups were given three exams, six online quizzes and six homework assignments with deadlines distributed evenly throughout the term. All exams were a mix of multiple choice and open response/short answer questions. Exams and quizzes were taken at fixed points in the term rather than when each student reached the material; that is, the course was not self-paced. All three sections of the online course were equivalent in that the same learning management system was used; the same video lectures were offered; the same types of homework assignments, exams and quizzes were assigned; and the same amount of feedback was provided. The only small variation among the online sections was the amount of time reserved for student-to-student discussion in one section.

Measures

This study examined two dependent variables. The first dependent variable was student learning, as indicated by student scores on ten common exam questions, overall course performance and course grades as assigned at the end of the semester or term. Specifically, we examined the percentage of A, B and C grades of students in the two groups. We also compared the percentage of failing grades in the online group to that in the face-to-face group. Failure is defined as a D or below in the course, because students who earn a D must re-take the course in order for it to count towards their major or minor degree.

The second dependent variable was student attitudes toward chemistry, as indicated by student responses to the ASCIv2. The original Attitude toward the Subject of Chemistry Inventory (ASCI) was developed by Bauer and was then modified by Xu and Lewis to a shortened version known as the ASCIv2. The modified instrument, ASCIv2, is an eight-item survey which uses a semantic differential scale that allows students to express their feelings toward chemistry on a seven-point scale anchored by two polar adjectives (Bauer, 2008; Xu and Lewis, 2011). This inventory was created and modified to study curricular reforms and was proposed as a pre- and post-test for treatment and comparison groups. It has been shown to provide valid, reliable and readily interpretable scores (Brandriet et al., 2011; Xu and Lewis, 2011). In the current study, students completed the attitude inventory at the beginning and end of the course, which allowed us to investigate whether attitudes would change over time and vary with course format. Among the small but growing number of studies that have used the ASCIv2, ours is, to our knowledge, the first to use it to measure the effect of the course experience on attitudinal change in online versus face-to-face groups. The ASCIv2 has, however, been used to detect changes in attitude between different groups of students, both in a study of a flipped versus traditional organic chemistry course and in a general chemistry course, both of which showed a positive relationship between students’ attitude and chemistry course success (Brandriet et al., 2011; Mooring et al., 2016).

The first administration of the ASCIv2 was during the first week of the semester or term (pre-), and the second administration of the survey occurred during final exam week or the last week of the term (post-). The survey was hosted on Qualtrics, a web-based survey software. The study was reviewed by the Institutional Review Board at our institution and received exempt status. All of the participating students provided informed consent. Students were offered up to 1% toward their course grade for participating in the pre- and post-surveys and the total time to complete the surveys, on average, was less than ten minutes. Of the 109 students in all sections studied, 77 completed either the pre- or the post-survey and 53 completed both surveys. The attitude data reported below are from the 53 students (16 from the online group and 37 from the face-to-face group) who completed both surveys.

Results and discussion

Research question 1

To address the first research question about students’ learning of core concepts in Introduction to Inorganic Chemistry as a function of course format, we assessed student performance on ten exam questions that were taken by both the online and face-to-face students. These ten exam questions were chosen by the two instructors because they represent three core concepts taught in the inorganic curriculum as identified from a published national survey of inorganic chemistry faculty (Raker et al., 2015; Reisner et al., 2015): covalent molecular substances, solid state materials, and transition elements and coordination chemistry. The multiple-choice questions were written by Instructor B (instructor of the online course) and then reviewed by Instructor A (instructor of the face-to-face course); two additional members of the chemistry department reviewed the exam questions and perceived them as having content and face validity. We provide an example exam question given to both groups in Appendix 2 (ESI). Out of a possible 10 correct, the online group had a mean score of 7.67, 95% CI [7.17, 8.17], and the face-to-face group had a mean score of 7.67, 95% CI [7.33, 8.01] (see Table 2). The difference between these means (Mdiff = 0.00, 95% CI [−0.57, 0.57]) was negligible in size (Cohen's d = 0.00) and not statistically significant, t(103) = −0.00, p = 0.999. Because the distribution of scores on the 10 exam questions was negatively skewed as a whole (skewness = −0.49), we conducted chi-square tests of association to determine whether the students in the online and face-to-face groups differed in their likelihood of answering each individual exam question correctly. The results of these analyses are displayed in Table 2.
Table 2 Performance on the ten common exam questions, by course format
  Percent of students with correct answer X² p Cramer's V
Online (%) Face-to-face (%)
Percentages in bold are favoured by that group.
Q1 43 52 0.92 0.339 0.09
Q2 83 37 22.35 <0.001 0.46
Q3 79 97 8.95 0.003 0.29
Q4 79 87 1.42 0.234 0.12
Q5 86 97 4.42 0.036 0.21
Q6 57 79 5.98 0.014 0.24
Q7 95 79 5.19 0.023 0.22
Q8 88 54 13.40 <0.001 0.36
Q9 95 94 0.12 0.731 0.03
Q10 62 91 12.42 <0.001 0.34


As shown in Table 2, there was no consistent advantage for one course format over the other. The online and face-to-face groups were similarly likely to answer Questions 1, 4, and 9 correctly; the online students were more likely than the face-to-face students to answer Questions 2, 7, and 8 correctly; and the face-to-face students were more likely than the online students to answer Questions 3, 5, 6, and 10 correctly. Frequency analyses showed that 62% of students in the online sections and 57% of students in the face-to-face sections answered eight or more questions correctly. In both formats, no student answered fewer than four questions correctly, and about 10% of students answered all ten questions correctly. Although the overall scores for the online group do not suggest that students used additional aids when taking their exams, Question 2 shows a large difference in performance in favour of the online group. Further study is needed to understand why the online group outperformed the face-to-face group on this question in particular.
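As a check on the per-question comparisons, each row of Table 2 can be reproduced from the underlying 2 × 2 counts. The sketch below back-calculates the Q2 counts from the reported percentages (83% of 42 online students ≈ 35 correct; 37% of 63 face-to-face students ≈ 23 correct), a reconstruction rather than the raw data, and recovers the reported X² and Cramer's V using scipy without Yates' continuity correction:

```python
import math

from scipy.stats import chi2_contingency

# 2x2 table for Q2, back-calculated from the reported percentages
# (83% of 42 online students ~ 35 correct; 37% of 63 face-to-face
# students ~ 23 correct). These counts are a reconstruction, not raw data.
table = [[35, 23],   # correct answers: online, face-to-face
         [7, 40]]    # incorrect answers: online, face-to-face

# Pearson chi-square without Yates' continuity correction.
chi2, p, dof, expected = chi2_contingency(table, correction=False)

n = 42 + 63
cramers_v = math.sqrt(chi2 / n)   # V = sqrt(X^2 / N) for a 2x2 table

print(round(chi2, 2), round(cramers_v, 2))   # chi2 = 22.35, V = 0.46
```

The same computation can be applied to any other row of Table 2 once its counts are reconstructed.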

To further address Research question 1 regarding student learning in the online versus face-to-face course, we compared the two groups’ overall course performance. As shown in Table 3, the online and face-to-face courses were designed to be as similar as possible in the degree to which final grades were a function of exams and quizzes, discussions, and homework assignments. On a 0 to 100% scale (% of total points earned out of total points possible), the online group who completed the course achieved a mean course percentage of 81% (95% CI [77, 85]) and the face-to-face group achieved a mean course percentage of 83% (95% CI [80, 86]). The difference between these means (Mdiff = −2.41, 95% CI [−7.08, 2.25]) was small (Cohen's d = 0.20) and not statistically significant, t(104) = −1.03, p = 0.307.
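The overall-percentage comparison can be approximated from the reported summary statistics alone. The standard deviations below are back-calculated from the 95% CI half-widths (half-width = t-critical × SD/√n), so they are rough reconstructions rather than the authors' raw data, and the resulting t statistic only lands near the reported t(104) = −1.03:

```python
import math

from scipy.stats import t as t_dist
from scipy.stats import ttest_ind_from_stats

# Reported summary statistics: mean overall course percentage and the
# half-width of its 95% CI for each group (means rounded as published).
n1, mean1, half1 = 42, 81.0, 4.0    # online
n2, mean2, half2 = 64, 83.0, 3.0    # face-to-face

# Back-calculate approximate SDs from the CI half-widths:
# half-width = t_crit * SD / sqrt(n), so SD = half-width / t_crit * sqrt(n).
sd1 = half1 / t_dist.ppf(0.975, n1 - 1) * math.sqrt(n1)
sd2 = half2 / t_dist.ppf(0.975, n2 - 1) * math.sqrt(n2)

# Independent-samples t-test from summary statistics; the value lands in
# the neighbourhood of the reported t(104) = -1.03 and is nonsignificant.
t_stat, p = ttest_ind_from_stats(mean1, sd1, n1, mean2, sd2, n2)
print(round(t_stat, 2), round(p, 3))
```

Because the published means are rounded, the sketch does not reproduce the reported values exactly; it illustrates the computation, not the authors' analysis.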

Table 3 Grading scheme of online and face-to-face sections of introduction to inorganic chemistry
Item Online group (%) Face-to-face group (%)
Exams and quizzes 76 70
Discussion activities 8 11
Assigned homework 16 19


Because overall course percentages were negatively skewed, we looked at the percentage of students in each learning format who earned an A (A+, A, A−); B (B+, B, B−); C (C+, C, C−); or D (D+, D, D−) or F, as their final grade. As shown in Table 4 and Fig. 1, the grade distribution for the online course was similar to that for the face-to-face course, χ2(3, n = 106) = 0.80, p = 0.849; in both learning formats, most students earned an A or B. The number of students in each group who withdrew is described in Appendix 4 (ESI).

Table 4 Number of students who received grades of A, B, C or D and F in the online and face-to-face courses
Group n A (%) B (%) C (%) D and F (%)
Online 42 20 (47.6%) 14 (33.3%) 6 (14.3%) 2 (4.8%)
Face-to-face 64 26 (40.6%) 26 (40.6%) 8 (12.5%) 4 (6.3%)



Fig. 1 Stacked bar graph display of the grade distribution for the online and face-to-face groups.
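The grade-distribution test reported above can be reproduced directly from the counts in Table 4:

```python
from scipy.stats import chi2_contingency

# Grade counts taken directly from Table 4 (columns: A, B, C, D-and-F).
grades = [[20, 14, 6, 2],    # online (n = 42)
          [26, 26, 8, 4]]    # face-to-face (n = 64)

# Pearson chi-square test of association for the 2x4 table.
chi2, p, dof, expected = chi2_contingency(grades)
print(round(chi2, 2), dof, round(p, 3))   # chi2 ~ 0.80, dof = 3, p ~ 0.849
```

This matches the reported χ2(3, n = 106) = 0.80, p = 0.849.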

Research question 2

Our second research question asked whether students’ attitudes toward chemistry, as indexed by responses to the ASCIv2, would differ in the online course compared to the face-to-face course. Items 1, 4, 5, and 7 needed to be re-coded in order for higher scores to represent positive aspects of students’ attitudes. The mean and standard deviations for each item for the students in two course formats are shown in Table 5.
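On a seven-point semantic differential scale, reverse-coding maps a response x to 8 − x, so that 1 becomes 7, 2 becomes 6, and so on. A minimal sketch of the recoding step, with the reversed item numbers taken from the text (the function name is ours):

```python
# Items recoded in this study so that higher scores represent the
# positive pole of each semantic differential pair.
REVERSED_ITEMS = {1, 4, 5, 7}

def recode(item_number, response):
    """Reverse-code a 1-7 response when the item is negatively keyed."""
    if not 1 <= response <= 7:
        raise ValueError("responses must lie on the 1-7 scale")
    return 8 - response if item_number in REVERSED_ITEMS else response

# A raw 7 on item 4 becomes 1; a raw 7 on item 2 is left unchanged.
print(recode(4, 7), recode(2, 7))   # prints: 1 7
```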
Table 5 Item descriptive statistics for pre and post administration of the ASCIv2 (n = 53 with pre and post data)
  Online (n = 16) Face-to-face (n = 37)
Pre Post Pre Post
Mean (SD) Mean (SD) Mean (SD) Mean (SD)
a These items were recoded before statistical analyses.
Easy–harda 3.44 (0.96) 3.56 (0.89) 3.95 (1.27) 3.89 (1.24)
Complicated–simple 3.13 (1.46) 3.25 (1.00) 3.32 (1.29) 3.57 (1.35)
Confusing–clear 4.38 (0.81) 3.94 (1.24) 4.51 (1.10) 4.57 (1.07)
Comfortable–uncomfortablea 5.13 (0.96) 4.81 (1.05) 5.30 (1.22) 5.49 (1.07)
Satisfying–frustratinga 5.38 (1.26) 5.50 (1.41) 5.30 (1.53) 5.61 (1.29)
Challenging–not challenging 2.50 (1.32) 2.44 (1.09) 2.49 (1.26) 2.46 (1.22)
Pleasant–unpleasanta 5.44 (1.03) 5.50 (1.21) 5.24 (1.19) 5.33 (1.07)
Chaotic–organized 4.63 (1.54) 4.56 (1.09) 4.92 (1.64) 4.81 (1.39)
Intellectual accessibility sum (4 to 28) 13.44 (3.31) 13.19 (2.83) 14.27 (3.40) 14.49 (3.55)
Emotional satisfaction sum (4 to 28) 20.56 (3.56) 20.38 (4.05) 20.76 (4.20) 20.95 (2.96)


For the online group, the item means ranged from 2.50 to 5.44 for the pre-administration and 2.44 to 5.50 for the post-administration. The face-to-face group item means ranged from 2.49 to 5.30 for the pre-administration and 2.46 to 5.61 for the post-administration. The mean scores on the ASCIv2, as displayed in Table 5, were comparable to those documented in other studies (Bauer, 2008; Brandriet et al., 2011; Xu and Lewis, 2011; Mooring et al., 2016). The ASCIv2 has two subscales, intellectual accessibility (IA) and emotional satisfaction (ES). Items 1, 2, 3 and 6 load on the IA subscale and items 4, 5, 7 and 8 load on the ES subscale. Table 6 shows the reliability of the two subscales on the ASCIv2 as indicated by Cronbach's alpha. Internal reliability coefficients were comparable to those reported in previous publications (Brandriet et al., 2011; Xu and Lewis, 2011; Kahveci, 2015; Mooring et al., 2016). Hence, items 1, 2, 3 and 6 were summed to represent Intellectual Accessibility (possible range 4 to 28), and items 4, 5, 7 and 8 were summed to represent Emotional Satisfaction (possible range 4 to 28).

Table 6 Internal consistency (Cronbach's alpha) of the ASCIv2 subscales (n = 53 with pre and post data)

| Subscale | Pre | Post |
| --- | --- | --- |
| Intellectual accessibility (items 1, 2, 3, and 6) | 0.64 | 0.68 |
| Emotional satisfaction (items 4, 5, 7, and 8) | 0.73 | 0.56^a |

^a To maintain consistency with other published reports using this measure, we did not omit Q8 of the ES subscale despite its relatively low (<0.3) item-total correlation. With Q8 omitted, Cronbach's alpha = 0.75 for the post-ES.
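The internal-consistency coefficients in Table 6 can be computed directly from the item-score matrix. Below is a minimal sketch in Python; the responses are synthetic 7-point semantic-differential data generated for illustration only (they are not the study's data), and the reverse-coding of negatively keyed items is assumed to have been applied beforehand, as noted in Table 5.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of sum scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Synthetic 7-point responses for four correlated items (n = 53 respondents,
# matching the paired-data sample size reported in Table 6).
rng = np.random.default_rng(0)
base = rng.integers(1, 8, size=(53, 1))                   # shared trait score
items = np.clip(base + rng.integers(-1, 2, size=(53, 4)), 1, 7)
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

Because the four synthetic items share a common component, alpha comes out high here; the moderate values in Table 6 (0.56–0.73) reflect weaker inter-item correlations in the real responses.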


To investigate whether students’ attitudes toward chemistry differed as a function of course format, we conducted a mixed analysis of variance, with pre and post survey attitude responses as the within-subjects variable and course format (online vs. face-to-face) as the between-subjects variable. We conducted this analysis first for IA scores and then for ES scores; the results for both analyses are displayed in Fig. 2 (descriptive statistics are in Table 5). As shown in the left panel of Fig. 2, IA scores did not differ from pre to post, F(1, 51) = 0.00, p = 0.969, partial η2 = 0.00; the online group did not respond differently from the face-to-face group on the IA subscale overall, F(1, 51) = 1.38, p = 0.245, partial η2 = 0.03; and the pre–post change in attitude did not differ as a function of course format, F(1, 51) = 1.21, p = 0.595, partial η2 = 0.01. As shown in the right panel of Fig. 2, ES scores did not differ from pre to post, F(1, 51) = 0.00, p = 0.999, partial η2 = 0.00; the online group did not respond differently from the face-to-face group on the ES subscale overall, F(1, 51) = 0.16, p = 0.695, partial η2 = 0.00; and the pre–post change in attitude (which was negligible) did not differ as a function of course format, F(1, 51) = 0.13, p = 0.722, partial η2 = 0.00. These findings suggest that, in our sample, students’ attitudes toward chemistry neither changed over the course nor differed by course format.
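The key interaction test in a two-group pre/post mixed ANOVA can be sketched numerically: the time × group interaction F(1, N − 2) is algebraically equivalent to the square of a pooled independent-samples t-test on the gain (post − pre) scores. The sketch below illustrates this equivalence on synthetic data drawn with roughly the IA means and SDs from Table 5; the data are simulated for illustration and are not the study's responses (the real analysis also uses paired pre/post scores per student, which simulation cannot reproduce).

```python
import numpy as np
from scipy import stats

def interaction_from_gains(pre_a, post_a, pre_b, post_b):
    """Time-by-group interaction for a two-group pre/post design.

    The mixed-ANOVA interaction F(1, N - 2) equals the square of the
    pooled-variance independent-samples t on the gain scores.
    """
    gains_a = np.asarray(post_a) - np.asarray(pre_a)
    gains_b = np.asarray(post_b) - np.asarray(pre_b)
    t, p = stats.ttest_ind(gains_a, gains_b)  # pooled variance by default
    return t ** 2, p                          # F = t^2, identical p-value

# Synthetic IA sum scores (4-28 scale): online n = 16, face-to-face n = 37.
rng = np.random.default_rng(1)
pre_on, post_on = rng.normal(13.4, 3.3, 16), rng.normal(13.2, 2.8, 16)
pre_f2f, post_f2f = rng.normal(14.3, 3.4, 37), rng.normal(14.5, 3.6, 37)
F, p = interaction_from_gains(pre_on, post_on, pre_f2f, post_f2f)
print(f"F(1, 51) = {F:.2f}, p = {p:.3f}")
```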


Fig. 2 Pre and post mean intellectual accessibility sum scores for online and face-to-face groups (left) and pre and post mean emotional satisfaction sum scores for online and face-to-face groups (right).

As shown in Table 5 and in a comparison of the left and right panels of Fig. 2, students in both groups scored higher on Emotional Satisfaction than on Intellectual Accessibility, a pattern consistent with earlier research on students’ attitudes toward chemistry (Mooring et al., 2016). Students who rated chemistry as higher in emotional satisfaction (at either time point) tended to perform better in the course, as indicated by final percent, rpre(52) = 0.33, p = 0.017, and rpost(52) = 0.33, p = 0.018. In contrast, how students responded to the intellectual accessibility items was not correlated with their performance in the course, rpre(52) = 0.09, p = 0.511, and rpost(52) = 0.15, p = 0.299.
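The attitude–performance correlations above are Pearson r values with degrees of freedom n − 2 reported in parentheses. A minimal sketch of the calculation, on synthetic data constructed only to illustrate a positive ES–grade association (not the study's data), is:

```python
import numpy as np
from scipy import stats

# Synthetic emotional-satisfaction sums (4-28 scale) and final course
# percentages with a built-in positive association, for illustration only.
rng = np.random.default_rng(2)
es_pre = rng.normal(20.7, 4.0, 54)
final_pct = 60 + 1.0 * (es_pre - 20.7) + rng.normal(0, 8, 54)

r, p = stats.pearsonr(es_pre, final_pct)
print(f"r({len(es_pre) - 2}) = {r:.2f}, p = {p:.3f}")  # df = n - 2
```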

Summary of results

When comparing online and face-to-face instruction, student scores on the ten exam questions in the online group were similar to those in the face-to-face group, and students in the online group earned, on average, an overall course percentage similar to that of students in the face-to-face group. These findings suggest that students in the online course had suitable access to course content and developed problem-solving skills comparable to those of students in the face-to-face course. It is therefore unlikely that format greatly influences classroom learning or the connections made between instructor and learner: students can construct knowledge similarly in the online and face-to-face formats.

Further, students’ attitudes toward chemistry on both the intellectual accessibility and emotional satisfaction subscales were comparable between the groups at the beginning and at the end of the course. Given the mean sum scores measured in this study, students found chemistry both intellectually accessible and emotionally satisfying, although both groups found chemistry more emotionally satisfying than intellectually accessible. Future work could examine ways to increase the intellectual accessibility of this course. Overall, the p values were consistently above 0.05, indicating that the small group differences we observed are plausibly attributable to chance alone under the null hypothesis (that the online and face-to-face groups do not differ).

Study limitations

The data were collected from a relatively homogeneous population of Introduction to Inorganic Chemistry students over two academic years at a single institution. A larger sample would have captured more variability in student performance and attitude, and findings from this study may not be representative of other institutions and student populations; the study should therefore be replicated with more students at multiple institutions for broader generalizability. Self-selection could not be avoided because of enrollment constraints at our institution, so it may not be possible to determine the true impact of the online and face-to-face formats when students self-select into groups: pre-existing preferences for a particular learning format could influence the results. Preventing and detecting cheating and plagiarism are also more difficult in an online environment. For this study, efforts to curb and deter cheating in both groups included clearly stating our institution's policies and consequences for academic misconduct in the syllabus. Additionally, for the online group, an enforced time limit on exams and quizzes limited retrieval of outside information and unauthorized collaboration, and the answers to exam questions were not revealed until every student in the course had completed all the questions.

Although the two groups of participants were comparable in many ways, there were some differences. Course length differed between the groups: a 15 week semester for the face-to-face group versus a 4 or 8 week term for the online group. It has been suggested that shorter courses may offer some advantages in terms of knowledge retention (Gulacar et al., 2013); it would therefore be interesting to include a 15 week online course to compare the effect of instruction length on student performance and attitude. The face-to-face group answered all ten exam questions at the time of the final exam, whereas the online group answered 3–4 of the questions at the end of each of the three modules (after either 1 or 2 weeks, depending on the term). Administering a pre-test or other baseline assessment before the course, or using a standardized exam as the performance measure, would further improve the integrity of the study. Finally, the online and face-to-face groups were taught by different instructors; future studies could account for differences in instructor performance using student evaluations or other measures.

Implications for teaching and further research

Our results are consistent with the recent statement that students who complete an online course fare just as well as those who attend face-to-face classes (Cooper and Stowe, 2018), even though few studies have examined undergraduate chemistry courses. Students in both groups actively constructed new knowledge by building upon prior experiences and conceptions, as indicated by comparable course performance and grade distribution in each group.

Attitude toward chemistry, as measured by a reliable and validated instrument, had not previously been examined for students enrolled in online versus face-to-face course formats, but our results demonstrate that intellectual accessibility and emotional satisfaction are comparable between the formats. One frequent concern about online course delivery is the notion that students are not satisfied with, or do not like, online courses. This study reveals that students in the two formats reported comparable intellectual accessibility and emotional satisfaction toward chemistry, and that attitude toward chemistry did not decrease after completion of an online (or a face-to-face) course in introductory inorganic chemistry.

Online learning environments do offer advantages. Specifically, they provide more flexibility for students with work or family commitments by allowing distance learning, additional scheduling options for students and instructors, and, from an institutional point of view, relief from capacity constraints and other budgetary demands. Determining what students gain from an online versus a face-to-face learning environment will help determine whether similar online course offerings should be integrated into other STEM disciplines. Tracking students’ attitudes and performance in prior or subsequent chemistry courses, whether online or face-to-face, would also be informative for learning more about students’ conceptual understanding and persistence in chemistry.

Conclusions

We report here that a lecture-only, sophomore-level inorganic chemistry course delivered completely online can have outcomes comparable to those of a course delivered face-to-face in terms of student scores on ten exam questions, course grade distribution, and student attitudes toward chemistry on both the intellectual accessibility and emotional satisfaction measures. Our first research question addressed the impact of online versus face-to-face lecture formats on students’ scores across ten common topics of a foundational inorganic chemistry course. Our data indicate comparable performance on the ten exam questions for students enrolled in the online and the face-to-face courses: more than half of the students in both the face-to-face and the online sections answered eight or more questions correctly. Our data also indicate that the grades (A, B, C, D and F) earned by students in the two formats were comparable; in both formats, most students earned an A or B. These results are consistent with meta-analyses, research reports and reviews of online learning environments over the last fifteen years, which have reported no difference in effectiveness compared with traditional classroom learning (Schoenfeld-Tacher et al., 2001; Bernard et al., 2004; Tallent-Runnels et al., 2006; Driscoll et al., 2012). Students in either format used learning activities and materials to connect what was known with what was to be learned; format therefore did not appear to affect learners’ ability to construct new knowledge. Future work that includes cluster analysis of performance measures may uncover differences between the online and face-to-face groups that were missed in this study.

The second research question sought to understand how student attitude changed, if at all, from the beginning to the end of the course, using a validated and reliable measure of attitude toward chemistry. Our data indicate that attitudes toward chemistry for students enrolled in the online or the face-to-face course were comparable at the beginning and at the end of the course in both intellectual accessibility and emotional satisfaction. Among the limited, but growing, number of studies that measure students’ attitudes toward chemistry using the ASCIv2 instrument, this is the only study to measure the intellectual accessibility and emotional satisfaction components of attitude in an online versus a face-to-face undergraduate inorganic chemistry course. Future research is needed to examine the positive and negative aspects of each course format from students’ and instructors’ perspectives.

Beyond student performance and attitude measures, what other differences might exist across formats? In this study, both the online and the face-to-face courses were primarily lecture environments with limited opportunities for student discussion and interaction; one consideration when comparing course formats is therefore the amount of student-to-instructor and student-to-student interaction. It has been suggested that student-to-student discussion promotes critical thinking and pushes students to engage with course material at higher levels of learning (Driscoll et al., 2012). Student-to-instructor discussion is also important, and may be utilized to varying degrees in an online or traditional learning environment (Swan, 2003). To incorporate the student-to-student and instructor-to-student interactions that occur in face-to-face courses, instructors of online courses have used several methods to embed discussion, including discussion boards and virtual office hours (i.e. synchronous instructor-led chat sessions) (Schoenfeld-Tacher et al., 2001). However, more research is required to identify meaningful student-to-student interactions in chemistry that can be implemented and measured in an online format. The authors suggest choosing a discussion model that promotes student metacognition, which has been shown in other disciplines to bring students to the forefront of the learning process by intentionally providing opportunities for students to reflect on their own learning (Tanner, 2012). Future work will include implementation and assessment of a structured student-to-student discussion model in both online and face-to-face courses.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

The authors would like to thank Jason Halfen for teaching some of the courses discussed and for his consultation throughout the project; Laura C. Ley for insightful discussions, and Angie Stombaugh and Jonathan Rylander for assistance with the manuscript. The authors acknowledge financial support from the University of Wisconsin–Eau Claire Office of Research and Sponsored Programs through Blugold Commitment funding and the University of Wisconsin – System Office of Professional and Instructional Development.

References

  1. Bauer C. F., (2008), Attitude toward Chemistry: A Semantic Differential Instrument for Assessing Curriculum Impacts, J. Chem. Educ., 85(10), 1440–1445.
  2. Bergmann J. and Sams A., (2012), Flip Your Classroom: Reach Every Student in Every Class Every Day, 1st edn, Alexandria, VA: International Society for Technology in Education.
  3. Bernard R. M., Abrami P. C., Lou Y., Borokhovski E., Wade A., Wozney L., et al., (2004), How Does Distance Education Compare With Classroom Instruction? A Meta-Analysis of the Empirical Literature, Rev. Educ. Res., 74(3), 379–439.
  4. Bernard P., Bros P. and Migdal-Mikuli A., (2017), Influence of blended learning on outcomes of students attending a general chemistry course: summary of a five-year-long study, Chem. Educ. Res. Pract., 18(4), 682–690.
  5. Bodner G. M., (1986), Constructivism: a theory of knowledge, J. Chem. Educ., 63(10), 873–878.
  6. Brandriet A. R., Xu X., Bretz S. L. and Lewis J. E., (2011), Diagnosing changes in attitude in first-year college chemistry students with a shortened version of Bauer's semantic differential, Chem. Educ. Res. Pract., 12(2), 271–278.
  7. Chan J. Y. K. and Bauer C. F., (2014), Identifying At-Risk Students in General Chemistry via Cluster Analysis of Affective Characteristics, J. Chem. Educ., 91(9), 1417–1425.
  8. Cooper M. M. and Stowe R. L., (2018), Chemistry Education Research—From Personal Empiricism to Evidence, Theory, and Informed Practice, Chem. Rev., 118(12), 6053–6087.
  9. Driscoll A., Jicha K., Hunt A. N., Tichavsky L. and Thompson G., (2012), Can Online Courses Deliver In-class Results? Teach. Sociol., 40(4), 312–331.
  10. Eichler J. F. and Peeples J., (2016), Flipped classroom modules for large enrollment general chemistry courses: a low barrier approach to increase active learning and improve student grades, Chem. Educ. Res. Pract., 17(1), 197–208.
  11. Evans K. L. and Leinhardt G., (2008), A Cognitive Framework for the Analysis of Online Chemistry Courses, J. Sci. Educ. Technol., 17(1), 100–120.
  12. Faulconer E. K., Griffith J. C., Wood B. L., Acharyya S. and Roberts D. L., (2018), A comparison of online and traditional chemistry lecture and lab, Chem. Educ. Res. Pract., 19(1), 392–397.
  13. Fautch J. M., (2015), The flipped classroom for teaching organic chemistry in small classes: is it effective? Chem. Educ. Res. Pract., 16(1), 179–186.
  14. González-Gómez D., Jeong J. S., Airado Rodríguez D. and Cañada-Cañada F., (2016), Performance and Perception in the Flipped Learning Model: An Initial Approach to Evaluate the Effectiveness of a New Teaching Methodology in a General Science Classroom, J. Sci. Educ. Technol., 25(3), 450–459.
  15. Gulacar O., Damkaci F. and Bowman C., (2013), A Comparative Study of an Online and a Face-to-Face Chemistry Course, J. Interact. Online Learn., 12(1), 27–40.
  16. Hibbard L., Sung S. and Wells B., (2016), Examining the Effectiveness of a Semi-Self-Paced Flipped Learning Format in a College General Chemistry Sequence, J. Chem. Educ., 93(1), 24–30.
  17. Kahveci A., (2015), Assessing high school students’ attitudes toward chemistry with a shortened semantic differential, Chem. Educ. Res. Pract., 16(2), 283–292.
  18. Lack K. A., (2013), Current Status of Research on Online Learning in Postsecondary Education, Ithaka S+R.
  19. Leontyev A. and Baranov D., (2013), Massive Open Online Courses in Chemistry: A Comparative Overview of Platforms and Features, J. Chem. Educ., 90(11), 1533–1539.
  20. Mayadas A. F., Bourne J. and Bacsich P., (2009), Online Education Today, Science, 323(5910), 85.
  21. Means B., Toyama Y., Murphy R., Bakia M. and Jones K., (2010), Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies, Washington, D.C.: U.S. Department of Education.
  22. Mooring S. R., Mitchell C. E. and Burrows N. L., (2016), Evaluation of a Flipped, Large-Enrollment Organic Chemistry Course on Student Attitude and Achievement, J. Chem. Educ., 93(12), 1972–1983.
  23. Nguyen T., (2015), The effectiveness of online learning: beyond no significant difference and future horizons, MERLOT J. Online Learn. Teach., 11(2), 309–319.
  24. Online courses and virtual laboratories survey results, https://www.acs.org/content/acs/en/about/governance/committees/training/cptnewsletter.html, (accessed August 2019).
  25. Raker J. R., Reisner B. A., Smith S. R., Stewart J. L., Crane J. L., Pesterfield L. and Sobel S. G., (2015), Foundation Coursework in Undergraduate Inorganic Chemistry: Results from a National Survey of Inorganic Chemistry Faculty, J. Chem. Educ., 92(6), 973–979.
  26. Reid S. A., (2016), A flipped classroom redesign in general chemistry, Chem. Educ. Res. Pract., 17(4), 914–922.
  27. Reisner B. A., Smith S. R., Stewart J. L., Raker J. R., Crane J. L., Sobel S. G. and Pesterfield L. L., (2015), Great Expectations: Using an Analysis of Current Practices To Propose a Framework for the Undergraduate Inorganic Curriculum, Inorg. Chem., 54(18), 8859–8868.
  28. Ryan M. D. and Reid S. A., (2016), Impact of the Flipped Classroom on Student Performance and Retention: A Parallel Controlled Study in General Chemistry, J. Chem. Educ., 93(1), 13–23.
  29. Schoenfeld-Tacher R., McConnell S. and Graham M., (2001), Do No Harm—A Comparison of the Effects of On-Line vs. Traditional Delivery Media on a Science Course, J. Sci. Educ. Technol., 10(3), 257–265.
  30. Schultz D., Duffield S., Rasmussen S. C. and Wageman J., (2014), Effects of the Flipped Classroom Model on Student Performance for Advanced Placement High School Chemistry Students, J. Chem. Educ., 91(9), 1334–1339.
  31. Seery M. K., (2012), Moving an in-class module online: a case study for chemistry, Chem. Educ. Res. Pract., 13(1), 39–46.
  32. Seery M. K., (2015), Flipped learning in higher education chemistry: emerging trends and potential directions, Chem. Educ. Res. Pract., 16(4), 758–768.
  33. Shapiro H. B., Lee C. H., Wyman Roth N. E., Li K., Çetinkaya-Rundel M. and Canelas D. A., (2017), Understanding the massive open online course (MOOC) student experience: an examination of attitudes, motivations, and barriers, Comput. Educ., 110(Supplement C), 35–50.
  34. Smith J. D., (2013), Student attitudes toward flipping the general chemistry classroom, Chem. Educ. Res. Pract., 14(4), 607–614.
  35. Swan K., (2003), Learning effectiveness online: what the research tells us, Elem. Qual. Online Educ. Pr. Dir., 4, 13–47.
  36. Tallent-Runnels M. K., Thomas J. A., Lan W. Y., Cooper S., Ahern T. C., Shaw S. M. and Liu X., (2006), Teaching Courses Online: A Review of the Research, Rev. Educ. Res., 76(1), 93–135.
  37. Tanner K. D., (2012), Promoting Student Metacognition, CBE-Life Sci. Educ., 11(2), 113–120.
  38. Vygotsky L. S., (1978), Mind in Society, Cambridge, MA: Harvard University Press.
  39. Weaver G. C. and Sturtevant H. G., (2015), Design, Implementation, and Evaluation of a Flipped Format General Chemistry Course, J. Chem. Educ., 92(9), 1437–1448.
  40. Weaver G. C., Green K., Rahman A. and Epp E., (2009), An Investigation of Online and Face-to-Face Communication in General Chemistry, Int. J. Scholarsh. Teach. Learn., 3(1), 18.
  41. Xu X. and Lewis J. E., (2011), Refinement of a Chemistry Attitude Measure for College Students, J. Chem. Educ., 88(5), 561–568.
  42. Xu X., Villafane S. M. and Lewis J. E., (2013), College students’ attitudes toward chemistry, conceptual knowledge and achievement: structural equation model analysis, Chem. Educ. Res. Pract., 14(2), 188–200.

Footnotes

Electronic supplementary information (ESI) available. See DOI: 10.1039/c9rp00112c
Present address: Lawrence University, Appleton, Wisconsin, USA. E-mail: theisenr@lawrence.edu

This journal is © The Royal Society of Chemistry 2020