Hannah T. Nennig,a Katrina L. Idárraga,a Luke D. Salzer,a April Bleske-Rechekb and Roslyn M. Theisen‡*a
aDepartment of Chemistry, University of Wisconsin–Eau Claire, Eau Claire, Wisconsin 54702, USA. E-mail: theiserm@uwec.edu
bDepartment of Psychology, University of Wisconsin–Eau Claire, Eau Claire, Wisconsin 54702, USA
First published on 13th August 2019
Despite recent interest in online learning, systematic comparisons of online learning environments with traditional classroom environments are limited, particularly in the sciences. Here, we report on a systematic comparison of an online and face-to-face classroom for a sophomore-level, lecture-only introductory inorganic chemistry course that is designed for students pursuing a chemistry major or minor. The online group consisted of three sections of students enrolled in Introduction to Inorganic Chemistry during two 8-week summer terms and one 4-week winter term. The face-to-face group consisted of two sections of students enrolled in Introduction to Inorganic Chemistry during two 15-week semesters. Both groups of students completed ten common exam questions and a validated and reliable measure of their attitudes toward chemistry (Attitude toward the Subject of Chemistry Inventory Version 2: ASCIv2). Students in the online course and face-to-face course did not differ in their performance on the common exam questions, course grade distribution, or attitudes toward chemistry. Although few studies have compared online and face-to-face learning environments in the physical sciences, our results are consistent with the idea that students who complete an online course fare just as well as those who attend face-to-face classes.
A central question about online learning is its effectiveness compared to traditional (face-to-face) instruction. Ideally, courses taught online should provide at least the same skill development and content as the face-to-face experiences do. Several published meta-analyses, research reports and reviews of online learning studies in the last fifteen years have reported either no significant difference in effectiveness compared with traditional classroom learning (Schoenfeld-Tacher et al., 2001; Bernard et al., 2004; Tallent-Runnels et al., 2006; Driscoll et al., 2012) or that students in online conditions performed modestly better, on average, than those learning the same material through face-to-face instruction (Means et al., 2010). In 2013, a review of the research on online postsecondary education stated that there is “little evidence to support broad claims that online or hybrid learning is significantly more effective or significantly less effective than courses taught in a face-to-face format” and encouraged further studies on this topic (Lack, 2013). Still, across the reports to date, few studies on the efficacy of online learning environments have examined introductory or advanced courses in the physical sciences (Gulacar et al., 2013). This lack of research could be a result of the relatively small number of online courses offered in the physical sciences (Online courses and virtual laboratories survey results, 2019) or the difficulty of creating fully online courses that contain a mandatory laboratory component, common to many physical science courses. Such a limited number of studies reveals the necessity for more research comparing online and face-to-face course formats, specifically within the sciences – for majors as well as for non-majors, and for lecture-plus-lab courses as well as lecture-only courses.
This study and its literature review focus entirely on online environments in undergraduate chemistry courses. Only a few studies have compared online versus face-to-face formats in undergraduate chemistry courses, and of these, none systematically examined both student performance in chemistry and student attitudes toward chemistry. In 2009, Weaver, Green, Epp and Rahman reported an exploratory study comparing communication in a traditional face-to-face recitation with an online office hour for a general chemistry course at two different universities, and concluded that students’ and instructors’ modes of communication were affected by the format (Weaver et al., 2009). In 2012, Seery described the creation of an online module for a computers-for-chemistry course, reporting higher average class grades for the online groups, responses to a student perception survey, and lessons learned for the practitioner (Seery, 2012). In 2013, Gulacar, Damkaci and Bowman reported a comparative study of an online and a traditional introductory chemistry course for non-science majors, which revealed that students performed equally well on exam questions that required lower-order thinking skills. However, students enrolled in the traditional (face-to-face) course performed better than students enrolled in the online course on the exam questions that required higher-order thinking skills (Gulacar et al., 2013). In that study, student attitudes towards chemistry were not examined. In 2018, Faulconer, Griffith, Wood, Acharyya and Roberts compared student pass and withdrawal rates and grade distributions for an online and a traditional-format chemistry lecture/lab course: they found no significant difference between the two course formats in student pass or withdrawal rates. Although the final course grades were not standardized, students taking the online course were more likely to receive a higher grade than traditional students (Faulconer et al., 2018).
These studies relied on evidence such as student satisfaction and course grades, highlighting the need for more data comparing online and traditional course formats in undergraduate chemistry courses.
With only a few examples of fully online chemistry courses available, we also reviewed studies in two other related areas: Massive Open Online Courses (MOOCs) and chemistry courses in a flipped format. Since 2012, several chemistry MOOCs have been reviewed and compared (Leontyev and Baranov, 2013). Although these popular courses tend to elicit positive student satisfaction (Shapiro et al., 2017), quantitative data that supports the effectiveness of this learning environment compared to traditional classrooms remains absent. Quantitative studies are slowly emerging from the other related area, chemistry courses in the flipped format, in which online platforms are used to deliver content outside of class time, while in-class time is reserved for more high-impact, active learning exercises and activities (Bergmann and Sams, 2012). A number of recent peer-reviewed reports indicate a high level of student satisfaction, positive outcomes for underprepared and at-risk students, and the potential to reduce high withdrawal and failure rates; however, in most cases, overall student performance was not significantly different when a flipped course format was used as opposed to a non-flipped course (Bergmann and Sams, 2012; Smith, 2013; Schultz et al., 2014; Fautch, 2015; Weaver and Sturtevant, 2015; Eichler and Peeples, 2016; González-Gómez et al., 2016; Hibbard et al., 2016; Reid, 2016; Ryan and Reid, 2016). While these large-scale and blended approaches to instruction have many potential benefits, our focus is on undergraduate courses in chemistry in a fully online format.
While the effectiveness of online courses, as evaluated by student performance, is one important factor, so too is student attitude. Learners actively construct their own knowledge by building upon prior experiences and conceptions (Bodner, 1986). Attitude is an important mental construct that may influence, and be influenced by, student learning behaviours (Brandriet et al., 2011). Recent research has explored the relationship between chemistry students’ attitudes toward chemistry and content knowledge, achievement, other affective characteristics, and motivation (Xu et al., 2013; Chan and Bauer, 2014). Often, learning formats are compared using researcher-developed student perception or satisfaction surveys that capture student opinions about the methodology or curricular changes (Smith, 2013; Schultz et al., 2014; Weaver and Sturtevant, 2015; González-Gómez et al., 2016; Hibbard et al., 2016). These surveys often rely on measures (e.g., student comments and instructor evaluations) that lack a strong statistical foundation and are sometimes neither valid nor reliable. Among the small number of studies of online versus traditional classrooms in undergraduate chemistry courses, none, to our knowledge, have examined both student learning (performance on exam questions and overall grades) and student attitudes towards chemistry (using a validated and reliable survey instrument). We seek to fill this gap.
The theoretical framework used in this research study was constructivism. Because prior experiences in the sciences have most likely occurred in a face-to-face environment, it is difficult to know whether changing the learning environment format will affect the construction of knowledge. Our study explores this idea by measuring both student performance and attitude. Attitude is a variable intended to capture the lived experience of completing the course by integrating content knowledge with students’ perceptions of the experience. The online format for this study was built around the idea of activating prior knowledge. In the online course, the learning activities (video lectures, quizzes and exams, homework and discussion activities) were designed to provide students with an opportunity to construct their own knowledge by building on prior understanding, which may influence their attitude towards chemistry.
Similarly, in the face-to-face course, types of learning activities (didactic lecture, quizzes and exams, homework and face-to-face discussion activities) gave students opportunities to construct their own knowledge by building on prior understanding. Students in either format used learning activities and materials to construct knowledge. Furthermore, students in both online and face-to-face groups had the opportunity to construct their knowledge in the “zone of proximal development,” as Vygotskian theorists suggest, where a learner interacts with a more capable individual (the instructor and peers in face-to-face discussions or on discussion boards) to bridge the gap between what the learner knows and what they are capable of knowing with instructional help (Vygotsky, 1978). It is unclear if the learning environment format will impact connections made between instructor and learner as well as students’ attitudes.
(1) Does students’ learning of concepts in inorganic chemistry, as indexed by performance on common exam questions and course grade, differ in the online course compared to the face-to-face course?
(2) Do students’ attitudes toward chemistry, as indexed by responses to the ASCIv2, differ in the online course compared to the face-to-face course?
Item | Online group | Face-to-face group
---|---|---
Total students in this study (Ntotal) | 42 | 63
Class standing | 4% first-year | 3% first-year
 | 31% sophomore | 34% sophomore
 | 33% junior | 31% junior
 | 31% senior | 32% senior
Gender: F | 60% | 44%
Gender: M | 40% | 56%
Average course grade in General Chemistry II or Honours General Chemistry | 3.0 = B | 3.3 = B+
Average overall college GPA coming into the course | 3.17 | 3.19
The second dependent variable was student attitudes toward chemistry, as indicated by student responses to the ASCIv2. The original Attitude toward the Subject of Chemistry Inventory (ASCI) was developed by Bauer and was later shortened by Xu and Lewis into a version known as the ASCIv2. The modified instrument, ASCIv2, is an eight-item survey that uses a semantic differential scale, allowing students to express their feelings toward chemistry on a seven-point scale anchored by two polar adjectives (Bauer, 2008; Xu and Lewis, 2011). This inventory was created and modified to study curricular reforms and was proposed as a pre- and post-test for treatment and comparison groups. It has been shown to provide valid, reliable and readily interpretable scores (Brandriet et al., 2011; Xu and Lewis, 2011). In the current study, students completed the attitude inventory at the beginning and end of the course, which allowed us to investigate whether attitudes would change over time and vary with course format. Among the small but growing number of studies that have used the ASCIv2, this is, to our knowledge, the first to use ASCIv2 data to measure the effect of the course experience on attitudinal changes of students in online versus face-to-face groups. The ASCIv2 has, however, been used to detect changes in attitude between different groups of students in a flipped versus traditional organic chemistry course and in a general chemistry course, both of which showed a positive relationship between students’ attitudes and chemistry course success (Brandriet et al., 2011; Mooring et al., 2016).
The first administration of the ASCIv2 was during the first week of the semester or term (pre-), and the second administration of the survey occurred during final exam week or the last week of the term (post-). The survey was hosted on Qualtrics, a web-based survey software. The study was reviewed by the Institutional Review Board at our institution and received exempt status. All of the participating students provided informed consent. Students were offered up to 1% toward their course grade for participating in the pre- and post-surveys and the total time to complete the surveys, on average, was less than ten minutes. Of the 109 students in all sections studied, 77 completed either the pre- or the post-survey and 53 completed both surveys. The attitude data reported below are from the 53 students (16 from the online group and 37 from the face-to-face group) who completed both surveys.
Question | Online (%) | Face-to-face (%) | χ2 | p | Cramer's V
---|---|---|---|---|---
Percentages in bold are favoured by that group. | | | | |
Q1 | 43 | 52 | 0.92 | 0.339 | 0.09 |
Q2 | 83 | 37 | 22.35 | <0.001 | 0.46 |
Q3 | 79 | 97 | 8.95 | 0.003 | 0.29 |
Q4 | 79 | 87 | 1.42 | 0.234 | 0.12 |
Q5 | 86 | 97 | 4.42 | 0.036 | 0.21 |
Q6 | 57 | 79 | 5.98 | 0.014 | 0.24 |
Q7 | 95 | 79 | 5.19 | 0.023 | 0.22 |
Q8 | 88 | 54 | 13.40 | <0.001 | 0.36 |
Q9 | 95 | 94 | 0.12 | 0.731 | 0.03 |
Q10 | 62 | 91 | 12.42 | <0.001 | 0.34 |
As shown in Table 2, there was no consistent advantage for one course format over the other. The online and face-to-face groups were similarly likely to answer Questions 1, 4, and 9 correctly; the online students were more likely than the face-to-face students to answer Questions 2, 7, and 8 correctly; and the face-to-face students were more likely than the online students to answer Questions 3, 5, 6, and 10 correctly. Frequency analyses showed that 62% of students in the online sections and 57% of students in the face-to-face sections answered eight or more questions correctly. In both formats, no student answered fewer than four questions correctly, and in both formats about 10% of students answered all ten questions correctly. Although the overall pattern of scores does not suggest that students in the online group used additional aids when taking their exams, Question 2 shows a large difference in performance in favour of the online group. Further study is needed to understand why the online group outperformed the face-to-face group on this question in particular.
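The per-question comparisons in Table 2 are Pearson chi-square tests on 2 × 2 contingency tables with Cramer's V as the effect size. A minimal sketch of that analysis for Question 2, using correct/incorrect counts reconstructed from the reported percentages (the exact counts are an assumption, not the authors' raw data: 83% of 42 online students ≈ 35 correct; 37% of 63 face-to-face students ≈ 23 correct):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: online, face-to-face; columns: correct, incorrect.
# Counts reconstructed from the reported percentages for Question 2
# (an illustrative assumption, not the study's raw data).
table = np.array([[35, 7],    # online: 83% of 42 correct
                  [23, 40]])  # face-to-face: 37% of 63 correct

# Pearson chi-square without Yates continuity correction.
chi2, p, dof, expected = chi2_contingency(table, correction=False)

# Cramer's V for an r x c table: sqrt(chi2 / (n * (min(r, c) - 1))).
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, Cramer's V = {v:.2f}")
```

With these reconstructed counts the script reproduces the values reported for Question 2 in Table 2 (χ2 = 22.35, p < 0.001, V = 0.46), which suggests the table's statistics were computed without a continuity correction.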
To further address Research question 1 regarding student learning in the online versus face-to-face course, we compared the two groups’ overall course performance. As shown in Table 3, the online and face-to-face courses were designed to be as similar as possible in the degree to which final grades were a function of exams and quizzes, discussions, and homework assignments. On a 0 to 100% scale (% of total points earned out of total points possible), the online group who completed the course achieved a mean course percentage of 81% (95% CI [77, 85]) and the face-to-face group achieved a mean course percentage of 83% (95% CI [80, 86]). The difference between these means (Mdiff = −2.41, 95% CI [−7.08, 2.25]) was small (Cohen's d = 0.20) and not statistically significant, t(104) = −1.03, p = 0.307.
Item | Online group (%) | Face-to-face group (%) |
---|---|---|
Exams and quizzes | 76 | 70 |
Discussion activities | 8 | 11 |
Assigned homework | 16 | 19 |
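The overall-grade comparison above is an independent-samples t-test with Cohen's d (pooled standard deviation) as the effect size. A minimal sketch of that analysis, using small made-up course-percentage vectors in place of the actual study data (which are not public), might look like:

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of the two samples."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

# Hypothetical final course percentages (illustrative only, not the study data).
online = [74, 81, 86, 79, 83, 77, 88]
face_to_face = [80, 84, 78, 85, 82, 86, 81, 83]

t, p = stats.ttest_ind(online, face_to_face)  # pooled-variance t-test
d = cohens_d(online, face_to_face)
print(f"t({len(online) + len(face_to_face) - 2}) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```

Note that the sign of d follows the sign of the mean difference, which is why a mean difference of −2.41 in the study corresponds to a small effect of magnitude 0.20.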
Because overall course percentages were negatively skewed, we looked at the percentage of students in each learning format who earned an A (A+, A, A−); B (B+, B, B−); C (C+, C, C−); or D (D+, D, D−) or F, as their final grade. As shown in Table 4 and Fig. 1, the grade distribution for the online course was similar to that for the face-to-face course, χ2(3, n = 106) = 0.80, p = 0.849; in both learning formats, most students earned an A or B. The number of students in each group who withdrew is described in Appendix 4 (ESI†).
Group | n | A, n (%) | B, n (%) | C, n (%) | D and F, n (%)
---|---|---|---|---|---
Online | 42 | 20 (47.6%) | 14 (33.3%) | 6 (14.3%) | 2 (4.8%) |
Face-to-face | 64 | 26 (40.6%) | 26 (40.6%) | 8 (12.5%) | 4 (6.3%) |
Item | Online pre (n = 16) | Online post (n = 16) | Face-to-face pre (n = 37) | Face-to-face post (n = 37)
---|---|---|---|---
Values are mean (SD). a These items were recoded before statistical analyses. | | | |
Easy–harda | 3.44 (0.96) | 3.56 (0.89) | 3.95 (1.27) | 3.89 (1.24) |
Complicated–simple | 3.13 (1.46) | 3.25 (1.00) | 3.32 (1.29) | 3.57 (1.35) |
Confusing–clear | 4.38 (0.81) | 3.94 (1.24) | 4.51 (1.10) | 4.57 (1.07) |
Comfortable–uncomfortablea | 5.13 (0.96) | 4.81 (1.05) | 5.30 (1.22) | 5.49 (1.07) |
Satisfying–frustratinga | 5.38 (1.26) | 5.50 (1.41) | 5.30 (1.53) | 5.61 (1.29) |
Challenging–not challenging | 2.50 (1.32) | 2.44 (1.09) | 2.49 (1.26) | 2.46 (1.22) |
Pleasant–unpleasanta | 5.44 (1.03) | 5.50 (1.21) | 5.24 (1.19) | 5.33 (1.07) |
Chaotic–organized | 4.63 (1.54) | 4.56 (1.09) | 4.92 (1.64) | 4.81 (1.39) |
Intellectual accessibility sum (4 to 28) | 13.44 (3.31) | 13.19 (2.83) | 14.27 (3.40) | 14.49 (3.55) |
Emotional satisfaction sum (4 to 28) | 20.56 (3.56) | 20.38 (4.05) | 20.76 (4.20) | 20.95 (2.96) |
For the online group, the item means ranged from 2.50 to 5.44 for the pre-administration and from 2.44 to 5.50 for the post-administration. The face-to-face group item means ranged from 2.49 to 5.30 for the pre-administration and from 2.46 to 5.61 for the post-administration. The mean scores on the ASCIv2, as displayed in Table 5, were comparable to those documented in other studies (Bauer, 2008; Brandriet et al., 2011; Xu and Lewis, 2011; Mooring et al., 2016). The ASCIv2 has two subscales, intellectual accessibility (IA) and emotional satisfaction (ES): items 1, 2, 3 and 6 load on the IA subscale, and items 4, 5, 7 and 8 load on the ES subscale. Table 6 shows the reliability of the two subscales on the ASCIv2 as indicated by Cronbach's alpha. Internal reliability coefficients were comparable to those reported in previous publications (Brandriet et al., 2011; Xu and Lewis, 2011; Kahveci, 2015; Mooring et al., 2016). Hence, items 1, 2, 3 and 6 were summed to represent Intellectual Accessibility (possible range 4 to 28), and items 4, 5, 7 and 8 were summed to represent Emotional Satisfaction (possible range 4 to 28).
Subscale | Cronbach's α (pre) | Cronbach's α (post)
---|---|---
a To maintain consistency with other published reports using this measure, we did not omit Q8 of the ES subscale even though it had a relatively low (<0.3) item-total correlation. With Q8 omitted, Cronbach's alpha = 0.75 for the post-ES. | |
Intellectual accessibility (items 1, 2, 3, and 6) | 0.64 | 0.68 |
Emotional satisfaction (items 4, 5, 7, and 8) | 0.73 | 0.56a |
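The recoding and reliability computations above follow standard formulas: a negatively keyed 7-point semantic-differential item is recoded as 8 − x, and Cronbach's alpha is k/(k − 1) · (1 − Σ item variances / variance of the summed scale). A sketch with made-up item responses (not the study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to the four IA items (rows = students, 1-7 scale).
ia = np.array([[3, 4, 3, 2],
               [5, 5, 4, 4],
               [2, 3, 3, 2],
               [4, 4, 5, 3],
               [3, 2, 4, 3]])

# Negatively keyed 7-point items (e.g. easy-hard) are recoded as 8 - x
# before analysis, matching the footnote to Table 5.
recoded = 8 - ia[:, 0]

print(f"alpha = {cronbach_alpha(ia):.2f}")
```

As a sanity check on the formula, a scale built from perfectly parallel items yields alpha = 1, and alpha falls as the items become less intercorrelated, which is why dropping the weakly correlated Q8 raises the post-ES alpha from 0.56 to 0.75.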
To investigate whether students’ attitudes toward chemistry differed as a function of course format, we conducted a mixed analysis of variance, with pre and post survey attitude responses as the within-subjects variable and course format (online vs. face-to-face) as the between-subjects variable. We conducted this analysis first for IA scores and then for ES scores; the results for both sets of analyses are displayed in Fig. 2 (descriptive statistics are in Table 5). As shown in the panel on the left of Fig. 2, IA scores did not differ from pre to post, F(1, 51) = 0.00, p = 0.969, partial η2 = 0.00; the online group did not respond differently from the face-to-face group on the IA subscale overall, F(1, 51) = 1.38, p = 0.245, partial η2 = 0.03; and the pre–post change in attitude did not differ as a function of course format, F(1, 51) = 1.21, p = 0.595, partial η2 = 0.01. As shown in the panel on the right of Fig. 2, ES scores did not differ from pre to post, F(1, 51) = 0.00, p = 0.999, partial η2 = 0.00; the online group did not respond differently from the face-to-face group on the ES subscale overall, F(1, 51) = 0.16, p = 0.695, partial η2 = 0.00; and the pre–post change in attitude (which was negligible) did not differ as a function of course format, F(1, 51) = 0.13, p = 0.722, partial η2 = 0.00. These findings suggest that, in our sample, students’ attitudes toward chemistry neither changed over the course nor differed by course format.
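For a 2 (time: pre/post, within) × 2 (format, between) design like this one, two of the mixed-ANOVA effects reduce exactly to independent-samples t-tests (with F = t²): the between-subjects format effect is a t-test on each student's pre/post mean, and the time × format interaction is a t-test on each student's change score. This reduction holds only for two-level factors; the sketch below illustrates it with made-up subscale sums, not the study data:

```python
import numpy as np
from scipy import stats

def mixed_2x2(pre_a, post_a, pre_b, post_b):
    """2 x 2 mixed ANOVA via its exact t-test reductions (two-level factors).

    Returns {"group": (F, p), "interaction": (F, p)} for the
    between-subjects effect and the time x group interaction."""
    pre_a, post_a = np.asarray(pre_a, float), np.asarray(post_a, float)
    pre_b, post_b = np.asarray(pre_b, float), np.asarray(post_b, float)
    # Group effect: compare each subject's pre/post mean across groups.
    t_g, p_g = stats.ttest_ind((pre_a + post_a) / 2, (pre_b + post_b) / 2)
    # Interaction: compare pre-to-post change scores across groups.
    t_i, p_i = stats.ttest_ind(post_a - pre_a, post_b - pre_b)
    return {"group": (t_g**2, p_g), "interaction": (t_i**2, p_i)}

# Hypothetical IA sums (illustrative only, not the study data).
res = mixed_2x2(pre_a=[13, 15, 12, 14], post_a=[14, 14, 13, 15],
                pre_b=[14, 16, 13, 15, 14], post_b=[15, 15, 14, 16, 15])
print(res)
```

The study presumably used standard statistics software for the full mixed ANOVA; this reduction is shown only to make the logic of the interaction test (change scores compared across formats) concrete.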
As shown in Table 5 and in a comparison of the left and right panels in Fig. 2, students in both groups scored higher on Emotional Satisfaction than on Intellectual Accessibility, which is consistent with earlier research on students’ attitudes toward chemistry (Mooring et al., 2016). Students who rated chemistry as higher in emotional satisfaction (at either time point) tended to perform better in the course, as indexed by final course percentage, rpre(52) = 0.33, p = 0.017; rpost(52) = 0.33, p = 0.018. In contrast, students’ responses to the intellectual accessibility items were not correlated with their performance in the course, rpre(52) = 0.09, p = 0.511; rpost(52) = 0.15, p = 0.299.
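The attitude-performance relationships reported above are Pearson correlations between subscale sums (at pre or post) and final course percentage. A minimal sketch with made-up values (illustrative only, not the study data):

```python
from scipy.stats import pearsonr

# Hypothetical emotional-satisfaction sums and final course percentages
# (illustrative only, not the study data).
es_scores = [18, 22, 20, 25, 17, 23, 21, 24]
final_pct = [74, 85, 80, 90, 72, 84, 79, 88]

r, p = pearsonr(es_scores, final_pct)
print(f"r = {r:.2f}, p = {p:.3f}")
```

An r of 0.33, as reported for the ES subscale, corresponds to a moderate positive association: students with higher emotional satisfaction tended, on average, to earn higher final percentages.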
Further, students’ attitudes toward chemistry on both the intellectual accessibility and emotional satisfaction subscales were comparable between the groups both at the beginning and at the end of the course. Given the overall mean sum scores measured in this study, students found chemistry both intellectually accessible and emotionally satisfying, although both groups found chemistry more emotionally satisfying than intellectually accessible. Future work could look at increasing the intellectual accessibility of chemistry in this course. Overall, the p values are consistently above 0.05, indicating that the group differences we observed could plausibly have arisen by chance alone under the null hypothesis (that the online and face-to-face groups do not differ).
Although the two groups of participants in the study were comparable in many ways, there were some differences. The course length differed between the two groups: a 15-week semester for the face-to-face group compared with a 4- or 8-week course for the online group. It has been suggested that there may be some advantages to shorter courses in terms of knowledge retention (Gulacar et al., 2013); therefore, it would be interesting to include a 15-week online course to compare the effect of instruction length on student performance or attitude. The face-to-face group answered all ten common exam questions on the final exam, whereas the online group answered 3–4 of the questions at the end of each of the three modules (after either 1 week or 2 weeks, depending on the term). Using a pre-test or other sampling of the students prior to the course, or using a standardized exam as the assessment tool to measure student performance, would further improve the integrity of the study. Finally, the online and face-to-face groups were taught by different instructors. Future studies could account for differences in instructor performance using student evaluations or other measures.
Attitude toward chemistry, as measured by a reliable and validated instrument, had not previously been examined for students enrolled in online versus face-to-face course formats; our results demonstrate that students’ intellectual accessibility and emotional satisfaction are comparable between the course formats. One frequent concern about online course delivery is the notion that students are not satisfied with, or do not like, online courses. This study reveals that students in the two formats report comparable intellectual accessibility and emotional satisfaction toward chemistry. Moreover, attitude toward chemistry did not decrease after the completion of an online (or a face-to-face) course in introduction to inorganic chemistry.
Online learning environments do offer advantages. Specifically, they offer more flexibility for students with work or family commitments by allowing for distance learning, they provide additional course scheduling options for students and instructors, and, from an institutional point of view, they may help alleviate capacity constraints or other budgetary demands. Determining what students gain from an online versus a face-to-face learning environment will help determine whether similar types of online course offerings should be integrated into other STEM disciplines. Tracking students’ attitudes and performance in prior or later chemistry courses, whether online or face-to-face, would be interesting in terms of learning more about students’ conceptual understanding and persistence in chemistry.
The second research question sought to understand how student attitude was affected, if at all, from the beginning to the end of the course, using a validated and reliable measure of attitude towards chemistry. Our data indicate that attitudes towards chemistry for students enrolled in the online or the face-to-face course were comparable at the beginning and at the end of the course in both intellectual accessibility and emotional satisfaction. Among the limited, but growing, number of studies that measure students’ attitudes toward chemistry using the ASCIv2 instrument, this is, to our knowledge, the first study to measure the intellectual accessibility and emotional satisfaction components of attitude in an online versus a face-to-face undergraduate inorganic chemistry course. Future research is needed to examine the positive and negative aspects of each course format from the students’ and instructors’ perspectives.
Beyond student performance and attitude measures, what other differences might exist across formats? In this study, both the online and the face-to-face courses were primarily lecture environments with limited opportunities for student discussion and interaction; therefore, one consideration when comparing the course formats is the amount of student-to-instructor and student-to-student interaction. It has been suggested that student-to-student discussion promotes critical thinking and forces students to engage with the course material at higher levels of learning (Driscoll et al., 2012). Student-to-instructor discussion is also important, and this type of discussion may be utilized to various degrees in an online or traditional learning environment (Swan, 2003). In an attempt to incorporate the student-to-student and instructor-to-student interactions that can occur in face-to-face courses, instructors of online courses have used several methods to embed discussion into online courses, including discussion boards and virtual office hours (i.e. synchronous instructor-led chat sessions) (Schoenfeld-Tacher et al., 2001). However, more research is required to identify meaningful student-to-student interactions in chemistry that can be implemented and measured in an online format. The authors suggest choosing a discussion model that promotes student metacognition; metacognition has been shown in other disciplines to bring students to the forefront of the learning process by intentionally providing opportunities for students to reflect on their own learning (Tanner, 2012). Future work will include implementation and assessment of a structured student-to-student discussion model in both online and face-to-face courses.
Footnotes
† Electronic supplementary information (ESI) available. See DOI: 10.1039/c9rp00112c |
‡ Present address: Lawrence University, Appleton, Wisconsin, USA. E-mail: theisenr@lawrence.edu |
This journal is © The Royal Society of Chemistry 2020 |