How do general chemistry students’ impressions, attitudes, perceived learning, and course performance vary with the arrangement of homework questions and E-text?

Vickie M. Williamson *a and Caitlin J. Zumalt b
aTexas A&M University, Department of Chemistry, College Station, TX 77843-3255, USA. E-mail: williamson@tamu.edu
bUniversity of South Florida, Department of Chemistry, Tampa, FL 33620, USA

Received 22nd March 2017, Accepted 20th June 2017

First published on 20th June 2017


Abstract

Two large sections of first-semester general chemistry were assigned to use different homework systems. One section used MindTap, a Cengage Learning product, which presents short sections of the textbook with embedded homework questions, so that students could read a textbook section and then answer one or more questions on the same screen. The other section used Online Web Learning (OWL version 2), also from Cengage Learning, which presents homework questions that contain links to open the textbook in a separate window. Findings showed no difference between the groups in any course grades, with both groups strongly indicating that they learned from their system. During a second-semester chemistry course taught by the same instructor, all students used OWLv2. At the end of the second semester, students who had used MindTap during the first semester were given a delayed survey containing Likert-scaled and open-response questions about their perceived learning and perceived level of understanding with each system, how easy each system was to use, and the advantages and disadvantages of each system. In addition, students were asked to compare the two systems and give their homework preference. Students were strongly positive towards the MindTap system. Further data were collected, using the same survey, to compare students who used MindTap for the first semester and OWL for the second semester with those who used the systems in reverse order. Results showed that students indicated significantly higher perceived learning with MindTap and better attitudes and opinions of MindTap, with its single-window arrangement, often citing that they read more with MindTap.


Literature/theoretical framework/background

Research has shown that online homework increases student learning (e.g., Walberg et al., 1985; Malik et al., 2014; Revell, 2014). Eichler and Peeples (2013) investigated the effect of online homework on grades in a study that allowed students to opt out of the online homework at any time. They found that students who completed any online homework did significantly better on their final exams than students who did not complete any of the online homework. Arasasingham et al. (2011) found in their study of over 3800 students that online homework led to significantly higher final exam scores, even when controlling for differences in students’ incoming placement scores. Behmke and Atwood (2013) found that students scored higher on subsequent test questions after using an electronic homework system containing questions written to minimize short-term memory load. Ye et al. (2015) found that students who reviewed the textbook in addition to completing online homework performed the best on their final exam. Richards-Babb et al. (2015) found no gender gap in student success when online homework was used, but they did find a significant gap, favoring females, when students did not use online homework.

A number of studies have found that online homework is more beneficial than paper homework (e.g., Arasasingham et al., 2005; Malik et al., 2014), while other studies have found that students who did online homework outperformed those who did not (e.g., Eichler and Peeples, 2013). One study with a sample of over 1000 students (Smithrud and Pinhas, 2015) found that using both types of homework (paper and online) gave the best performance in organic chemistry. The authors suggested that an improved version of online homework, which uses a stylus and allows students to hand-draw molecular structures and mechanisms, might be a good solution to this dual-homework issue. Their finding conflicts with that of Malik et al. (2014), who found that online homework was superior to paper homework in organic chemistry. Other studies have found that online homework is just as effective as paper homework (e.g., Bonham et al., 2003; Cole and Todd, 2003; Fynewever, 2008).

The immediate feedback that electronic homework provides is seen as one of its main benefits (e.g., Cole and Todd, 2003; Arasasingham et al., 2005; Parker and Loudon, 2013). Researchers have found that immediate, detailed feedback is beneficial to students’ understanding of the concepts (Arasasingham et al., 2005), helps students retain content understanding (Cole and Todd, 2003; Gebru et al., 2012), aids retention of students in the course (Richards-Babb and Jackson, 2011), and increases students’ perceived learning (Hall et al., 2001; Charlesworth and Vician, 2003). A number of studies have reported combinations of these benefits; for example, Burch and Kuo (2010) reported increased exam scores and course retention. Richards-Babb et al. (2015) proposed that the benefits of online homework might be due to improvement in students’ study habits and attitudes. Their organic chemistry students indicated that they were more consistent in their study with online homework and expressed positive attitudes toward its use.

Other studies have reported positive student perceptions of online homework systems even when the students did not show an improvement in grades (e.g., Riffell and Sibley, 2003; Demirci, 2007). Leong and Alexander (2014) found that students with lower and average achievement held more positive attitudes towards online homework than did the high-achieving students. The authors concluded that this was probably due to the immediate feedback. These findings were similar to those reported earlier by Cole and Todd (2003), who found that students of lower ability held more positive attitudes towards online homework. The literature shows that positive perceptions and attitudes enhance student learning (e.g., Mills, 1960; Marzano, 1992; Wlodkowski, 2008). The effect of positive attitudes on learning was concisely stated by Marzano (1992, p. 18): “Without positive attitudes and perceptions, students have little chance of learning proficiently, if at all.” Could the improvement in grades with online homework shown by some studies really have been fueled by the positive attitudes elicited by online homework use?

Online homework does have some disadvantages, such as homework systems that do not provide feedback on the student's exact error, that experience server or connection problems, that allow students simply to select answers with unlimited tries until the correct one is chosen, or that students find frustrating and tedious (Cole and Todd, 2003; Allain and Williams, 2006; Kuhn et al., 2014). Although Richards-Babb et al. (2015) reported a significant correlation between final course grades in organic chemistry and online homework performance, they also reported that 39% of their students admitted to random guessing at times when using online homework. Bowman et al. (2014) found that the number of attempts in online homework should be limited to encourage students to spend more time on each attempt and to discourage guessing.

There is some evidence that the modes of student learning are expanding. The literature on video-based learning suggests that using videos as learning tools increases learning outcomes as well as improving teaching methods in courses (Calandra et al., 2006; Santagata, 2009; Kersting et al., 2012; Kuter et al., 2012). A study of master's students comparing video-based learning with traditional text-based learning reported a significant increase in motivation with the video-based learning; these students also reported that the information presented in the video was more memorable (Choi and Johnson, 2005). Video learning is also employed by popular websites, where many students go for answers.

The literature on questions embedded in text shows that students learn more when using text with embedded questions than when using plain text (Hamilton, 1985; Hamaker, 1986). Students who read text containing examples and embedded questions were better able to assess their comprehension level than students reading plain text (Walczyk and Hall, 1989). Students with low comprehension levels showed improved test performance when studying readings with embedded questions (Callender and McDaniel, 2007). Could these ideas be incorporated into online homework?

The literature concerning learning with computers shows that there is a difference between separated or pop-up displays and single screens that contain integrated material. Integrated single screens are better for learning than separated, multiple, or pop-up screens (Betrancourt and Bisseret, 1998; Erhel and Jamet, 2006). Macedo-Rouet et al. (2003) reported that the rationale for this preference for integrated single screens might have to do with cognitive load or disorientation issues in online reading. Physically integrating related elements can address an overload of working memory by reducing the search-and-match process between separated elements (Kalyuga et al., 2000). Mayer (2009) advocated in his 12 Principles of Multimedia Learning that pictures and words presented together were best for learning. Could these findings also affect the results for students using homework with questions embedded in the text on a single screen versus students using homework with separate screens for text and questions?

Some of the advantages and disadvantages of online homework found by others could be linked to the characteristics of the specific homework systems. Our study sought to compare student learning and attitudes towards two homework systems with different characteristics. The study has its basis in constructivism, a theoretical framework built on the work of a number of cognitive scientists (e.g., Piaget, 1977; Osborne and Wittrock, 1983; Bodner, 1986; Vygotsky, 1986; von Glasersfeld, 1995). Williamson (2008) summarized this body of literature into four main points:

Constructivism is the belief that:

(a) knowledge is constructed from interactions with people and materials, not transmitted,

(b) prior knowledge impacts learning,

(c) learning, especially initial understanding, is context specific,

(d) purposeful learning activities are required to facilitate the construction or modification of knowledge structures (p. 68).

This study uses a constructivist theoretical framework as it investigates homework, which can be a purposeful learning activity, in two electronic systems (MindTap and OWL). These two systems have qualitative questions, quantitative questions, and simulations of the laboratory experience and of particle action. The systems, which are both products of Cengage Learning, create different learning activities, with different characteristics and some similarities as described next.

The MindTap homework system

MindTap is an electronic homework system that embeds the homework questions into an e-book. Short subsections of a text chapter are followed by homework questions. These embedded questions can be interactive figures with simulations of experiments or of particle behavior, tutored qualitative or quantitative problems, or interactive video tutorials. Fig. 1 shows an example of the subsection text, which is followed by an example problem with written and video solutions and a tutored problem, all on the same page. Fig. 2 shows the tutored problem in detail, after the tutor-problem link has been opened. The page displays the text, the example problem with solutions, and the tutor problem in one screen. When the full section of the book is completed, students are presented with a mastery assignment to help them gauge their understanding of the section. For the mastery assignments, students are given groups of three questions, of which they must get two correct to show mastery of the concept. These mastery questions can be conceptual or mathematical. General Chemistry by Vining et al. (2014) is currently the only e-book that uses this MindTap homework interface and will simply be referred to as MindTap. It should be noted that most of the mastery questions are the same in MindTap and in OWL. Students must complete the subsections of text with their questions and the mastery assignments for a grade. The system comes set with five attempts for the text/question subsections and 10 attempts for a mastery assignment, but for each attempt a different problem is given, with numbers, drawings, and chemicals changed. The program allows an error warning before submission; the instructor can choose to allow up to five error warnings or none. Questions can be short answer, multiple choice, matching, or require a drawing. For each chapter, there are also non-graded assignments, including a review assignment that allows students to self-test, a study guide that describes the main concepts of the chapter and allows students to generate quizzes over the chapter, and a challenge assignment that contains harder integrative questions.
Fig. 1 MindTap subsection of text.

Fig. 2 MindTap tutored practice problem.

The OWL homework system

The OWL system (Online Web Learning) is an electronic homework system that presents mastery assignments to students. The mastery assignments are composed of groups of three questions, of which students must correctly answer two to show mastery. These mastery questions can be conceptual or mathematical and are mostly the same as those in MindTap. There are also multimedia assignments, which contain visualizations (videos or particulate animations followed by questions), interactive examples that are more qualitative questions, exercises that are a series of qualitative questions, and tutored problems that consist of a single problem with a list of tutor steps leading the student to the solution. In order to see the book, students must choose a link that brings up the e-book in a separate tab of the browser window. Fig. 3 shows an example tutored problem; the link to the textbook has been opened, and the textbook appears in the tab to the far right. That tab is also labeled ‘MindTap’, but it contains only a textbook, which for this study was Chemistry: An Atoms First Approach, 2nd edition by Zumdahl and Zumdahl (2015). Fig. 4 shows this textbook as it appears in the tab. Cengage uses the term ‘MindTap’ for its text interfaces, but for the Vining et al. book, MindTap includes both the text and the embedded homework. For this study, both the mastery and multimedia assignments were graded in order to keep the total number of problems similar to that in the MindTap system. The OWL system is preset with 10 tries for a graded assignment, but for each try a different problem is given, with numbers, drawings, and chemicals changed. The instructor can choose to allow an error warning on each question, which gives the student another chance to reconsider the answer, or can choose to have no error warnings. Questions can be short answer, multiple choice, matching, or require a drawing. For each chapter, there are non-graded assignments, including an adaptive plan that covers the main concepts of the chapter and allows students to generate a quiz for self-study, and end-of-chapter questions selected from those in the textbook.
Fig. 3 OWL with a tutored problem and the text in the last tab.

Fig. 4 Zumdahl and Zumdahl text in a separate window for OWL.

Previous work

In their pilot study, Zumalt and Williamson (2016) found that first-semester general chemistry students who had used MindTap for a 3-week assignment and OWL for a total of 4 weeks perceived that they learned more with MindTap and preferred its arrangement. MindTap's embedded arrangement of text and homework questions was preferred over the linked arrangement in OWL. Specifically, the 274 students in the study indicated that, compared with OWL, they learned more with MindTap, would recommend it to other classes, were more comfortable using it, and understood concepts better with it. At the end of the second semester of general chemistry, the preference for MindTap still persisted even though students had been using OWL for the full second semester. The delayed survey showed that students still believed that MindTap helped them understand concepts better than OWL did. The current study seeks to build on this pilot study by conducting a controlled investigation of a semester-long use of MindTap vs. OWL with two sections of general chemistry.

Research questions

The goals of this study were to determine students’ attitudes and perceptions of two different homework systems, one with questions embedded in the text (MindTap) and one with the questions separate from the text (OWL), and to determine whether the homework system used makes a difference in students’ grades. These goals were investigated over three semesters.

Fall 14

The questions addressed by this study are:

(1) What are students’ attitudes and perceptions of two homework systems? (embedded vs. linked)

(2) Is there a difference in course grades for students using two different homework systems? (embedded vs. linked)

Spring 15

The question addressed by this study is:

(1) What are students’ attitudes and perceptions of two homework systems? (embedded vs. linked)

Spring 16

It was hypothesized that there may be a bias for the system that students used first or a bias for the system that students used last. The question addressed by this study is:

(1) Does the order of use of two homework systems make a difference in student preferences? (embedded vs. linked)

Methodology

Fall 14

Participants in this study were enrolled in two sections of a large first-semester general chemistry course at a large southwestern university. The students completed seven homework assignments throughout the semester, four exams, and a final exam. The sections were randomly assigned to one of two homework systems by a coin flip. One section used MindTap throughout the whole semester to complete the seven homework assignments. The second section used OWL version 2 during the semester to complete the seven homework assignments. Both treatment groups had the same instructor, met on the same days of the week (MWF) a few hours apart, and used an atoms first approach in the course.

The Test of Logical Thinking (TOLT) (Tobin and Capie, 1981) was administered to both groups of students to ensure the groups were equivalent. This is a ten-item, standard test of reasoning abilities, with two items each measuring proportional reasoning, probabilistic reasoning, controlling variables, correlational reasoning, and combinatorial reasoning. Students’ course grades in the MindTap group were compared to those of students in the OWL version 2 group. Grades for exams 1–4, final exam grades, homework grades, and overall course point totals were compared between the two groups using t-tests. Means and standard deviations were calculated for each grade for the MindTap and OWL groups. These grades were collected as a regular part of the course. At the beginning of the semester, students were asked to complete a pre-survey about homework prior to completing any of the homework assignments. At the end of the semester, after completing the homework assignments, students were asked to complete a post-survey about homework. Both of these surveys contained Likert-scaled items, with a few open-ended questions. The open responses were read and tabulated, and the common responses are given in the analysis section.
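As an illustration of the between-groups grade comparisons described above, the following minimal sketch runs an independent-samples t-test in Python. The scores are simulated to mimic the reported group means and standard deviations; they are not the actual course records, and the specific t-test variant used in the study is not stated, so a standard two-sample test is assumed.

```python
# Hypothetical sketch of the between-groups grade comparison (independent-samples t-test).
# Simulated scores only mimic the reported means/SDs; they are not the real data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
mindtap_scores = rng.normal(loc=549.98, scale=77.26, size=183)  # overall course points, MindTap group
owl_scores = rng.normal(loc=552.01, scale=77.38, size=172)      # overall course points, OWL group

t_stat, p_value = stats.ttest_ind(mindtap_scores, owl_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.2f}")  # p > 0.05 would indicate no significant difference
```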

Out of a pool of 563 students enrolled in the two sections, 504 students agreed to participate in the study by giving their permission according to the Institutional Review Board's requirements at the university. Of these, 149 were excluded due to missing data, such as not completing the TOLT, taking a makeup exam instead of the normal class exam, or missing some other study component, with exclusions split evenly between the groups. The analysis was completed with 355 subjects. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study. There were 183 students in the group that used MindTap and 172 students in the group that used OWL version 2. There were 40 males (22%) and 143 females (78%) in the MindTap group and 46 males (27%) and 126 females (73%) in the OWL group. The number of males relative to females is low because many male students at the university are enrolled as engineering majors, who take a different introductory chemistry course specific to their major.

Spring 15

Participants in this study were enrolled in a large second-semester general chemistry course at a large southwestern university. Students used OWL version 2 to complete seven homework assignments. At the end of the semester, students were given a seven-question survey about the OWL homework system and were asked to indicate their gender. Students in this course who had used MindTap in their first-semester class were also asked seven questions about MindTap and five questions comparing the two systems. The questions asked about students’ preferences, the ease of use of the systems, and the systems’ helpfulness in learning. The survey had a mix of Likert-scaled questions and open-ended questions to explain some of the Likert-scaled responses. The same instructor taught both of these courses. There were 100 students who had used MindTap in the first-semester course and who gave their permission to participate in the study according to the Institutional Review Board's requirements at the university. There were 23 (23%) males and 77 (77%) females who participated in this study. Again, these are typical percentages for this university.

Spring 16

Participants in this study were enrolled in a large second-semester general chemistry course at a large southwestern university. Students used MindTap to complete seven homework assignments. At the end of the semester, students were given a seven-question survey about the MindTap homework system and were asked to indicate their gender. Students in this course who had used OWL in their first-semester class were also asked seven questions about OWL and five questions comparing the two systems. The five comparison questions were the same as those used in spring 15. The questions asked about students’ preferences, the ease of use of the systems, and the systems’ helpfulness in learning. The survey had a mix of Likert-scaled questions and open-ended questions to explain some of the Likert-scaled responses. A different instructor taught the first-semester course for these students, using an atoms first approach. During the second semester, this group had the same instructor as the fall 14 and spring 15 participants. There were 63 students who agreed to participate in the study by giving their permission according to the Institutional Review Board's requirements at the university. There were 14 (22%) males and 49 (78%) females who participated in this study.

Results and discussion

Fall 14

The Test of Logical Thinking (TOLT) was administered to both the group that used MindTap and the group that used OWL version 2. The mean for the MindTap group was 7.10 (SD 2.25) and the mean for the OWL group was 7.41 (SD 2.30). The p-value between the two groups was 0.20, so the two groups were not significantly different. Equivalent groups were desirable to investigate the differences between the homework systems. The mean overall course point total was 549.98 (SD 77.26) for the MindTap group and 552.01 (SD 77.38) for the OWL group. The p-value was 0.80, so the two groups were not significantly different. Since the overall course points for students in the two groups were not significantly different, we looked at the mean grades for each exam (100 points each) and the final exam (165 points) to see if there were any significant differences. The mean exam grades can be seen in Table 1.
Table 1 Exam scores
Exam Group Mean (SD) p-Value
Exam 1 MindTap 73.55 (13.72) 0.19
OWL 75.46 (13.60)
Exam 2 MindTap 75.78 (14.37) 0.32
OWL 77.30 (14.57)
Exam 3 MindTap 74.25 (16.65) 0.38
OWL 72.69 (16.30)
Exam 4 MindTap 75.21 (19.36) 0.69
OWL 74.35 (21.06)
Final exam MindTap 127.39 (28.07) 0.77
OWL 126.52 (26.94)


The p-values for each exam show that the two groups were not significantly different. The two homework systems seemed to prepare students equally for the exams. Since the mean exam grades for the two groups were not significantly different, the overall homework score and the mean score on each homework assignment were examined for each group. Each homework assignment was given course points based on the amount completed: if 95% of the assigned activities were completed, then 10 course points were awarded; if 90% was done, then nine points were given; if 80–89% was done, then eight course points were given, and so on. With seven homework assignments, a total of 70 points was possible. The mean overall homework score was 64.37 (SD 7.98) for the MindTap group and 65.36 (SD 6.71) for the OWL group. The p-value for the overall homework score was 0.21, so the two groups were not significantly different. There was no effect of homework system on the mean overall homework score. The mean score for each of the individual homework assignments for both groups is shown in Fig. 5.
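The completion-to-points scheme just described can be written as a short helper function. This is a sketch only: the bands below 80% are an assumed extrapolation of the "etc." in the text, and the function name is ours, not part of either homework system.

```python
def homework_points(percent_complete: float) -> int:
    """Convert the percent of assigned activities completed into course points.

    Follows the scheme described above: 95% or more earns 10 points, 90-94%
    earns 9, 80-89% earns 8, and (assumed extrapolation) each lower 10% band
    earns one point fewer.
    """
    if percent_complete >= 95:
        return 10
    if percent_complete >= 90:
        return 9
    return int(percent_complete // 10)  # 80-89% -> 8, 70-79% -> 7, ...

# With seven assignments, the course maximum is 7 * 10 = 70 points.
assert homework_points(97) == 10
assert homework_points(91) == 9
assert homework_points(84) == 8
```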


Fig. 5 Mean scores for homework assignments 1–7. * Statistically significant at the p < 0.05 level.

The mean score for homework assignment 1 was the highest, at 9.92 (SD 0.35) and 9.84 (SD 0.46) for the MindTap and OWL groups, respectively. Homework assignment 1 dealt with a review of math and beginning concepts of mixtures and pure substances. The lowest mean score was for homework assignment 5, at 8.37 (SD 2.50) and 8.71 (SD 2.06) for the MindTap group and the OWL group, respectively. This homework assignment covered stoichiometry and equation types.

The mean scores for the homework assignments were not significantly different between the groups for assignments 1, 3, 4, 5, 6, and 7. The mean score for homework assignment 2 was significantly different: the MindTap group had a mean of 9.32 (SD 1.35), while the OWL group had a mean of 9.81 (SD 0.68). The p-value for homework assignment 2 was 0.00002, so the two groups were significantly different on this assignment. Homework assignment 2 dealt with atomic structure.

The survey given at the end of fall 14 was analyzed. Regardless of the homework system, between 75% and 85% of the students reported that the system they used helped them understand the course material, helped them learn content they were struggling with, and helped them understand new concepts. They further believed that their system was easy to use, helped them learn, would be recommended to other students, and contained questions of moderate difficulty. Both groups indicated that they spent the same amount of time on homework (2–6 hours per week). In open responses they cited these important features: automatic scoring/immediate feedback, help with identifying relevant material, tracking of progress and performance, and the ability to study anytime and anywhere. There were no differences by homework system, but neither group had a basis for comparison, since each had used only one homework system.

The MindTap group and the OWL group were not significantly different on most performance measures (overall course points, exam 1–4 grades, overall homework grades, homework 1 grades, or homework 3–7 grades). The two groups differed significantly only on the homework assignment 2 score, with the OWL group scoring higher. Positive attitudes towards, and perceived learning with, the system they used were found for both groups, with no measurable differences between the treatment groups. Both groups also had similar numbers of withdrawals and similar letter grade distributions.

Spring 15

In the following semester (spring 15), the OWL homework system was used. The end-of-semester survey contained seven questions each on OWL and on MindTap (for those who had used it in the fall semester). These students had the same professor for first- and second-semester general chemistry and had used MindTap one semester and OWL the second semester. The first question asked students about their usage of the homework system. Students who indicated that they did all or most of the OWL homework in the spring semester and all or most of the MindTap homework in the fall semester were included in the analysis of the remaining six questions dealing with each system individually.

Table 2 shows the percent of students who chose a particular response for each system and the p-value for the most common response compared to the expected percent. The expected values are 33.33% when there were three response choices and 50% when there were only two. When asked about each system separately, students reported that it was easy to learn with both systems, that they were comfortable using both systems, that both systems helped them improve their concept understanding, that both helped them prepare to answer or work problems, and that both systems helped them learn. Students liked the arrangement of text and questions in both systems when asked about this arrangement individually. It should be noted that two students did not choose a response for this item but wrote in an explanation.

Table 2 Survey data for spring 15 students who used MindTap then OWL
n MindTap n OWL
a Significantly higher than the other response choices.
How easy was the system to use? 100 Easy 91%a 100 Easy 78%a
Undecided 5% Undecided 10%
Not easy 4% Not easy 12%
p < 0.0001 p < 0.0001
Between treatments p > 0.05
Were you comfortable using the system? 100 Yes 96%a 100 Yes 88%a
No 4% No 12%
p < 0.0001 p < 0.0001
Between treatments p > 0.05
Did you like the arrangement of the text and questions? 100 Yes 90%a 98 Yes 65%a
No 10% No 35%
p < 0.0001 p < 0.0001
Between treatments p < 0.0001
How helpful was the system in improving your understanding of the concepts? 100 Helpful 89%a 100 Helpful 78%a
Undecided 7% Undecided 13%
Not helpful 4% Not helpful 9%
p < 0.0001 p < 0.0001
Between treatments p > 0.05
How helpful was the system in preparing you to answer/work problems? 100 Helpful 90%a 100 Helpful 71%a
Undecided 4% Undecided 16%
Not helpful 6% Not helpful 13%
p < 0.0001 p < 0.0001
Between treatments p > 0.05
Did the system help you learn? 100 Yes 96%a 100 Yes 88%a
No 4% No 12%
p < 0.0001 p < 0.0001
Between treatments p > 0.05


To determine whether the responses for the two systems were significantly different from each other, a McNemar's test was performed. The McNemar's test is a statistical test used on paired nominal data in a 2 × 2 contingency table. For the questions with three response categories, the “undecided” response was omitted, and the analysis compared “helpful” and “not helpful” for each system. The only question that showed a significant difference between systems was the question concerning the arrangement of the text and questions. For this question the responses for MindTap were significantly different from those for OWL (the “yes” response was significantly higher and the “no” response was lower for MindTap). Students preferred the arrangement of the text and questions in MindTap.
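As an illustration of this paired comparison, the sketch below runs McNemar's test with statsmodels on a hypothetical 2 × 2 table of paired yes/no responses. The cell counts are invented for the example (only the marginal totals loosely echo the arrangement question in Table 2) and are not the study's actual pairings.

```python
# Hypothetical sketch of McNemar's test on paired yes/no responses.
from statsmodels.stats.contingency_tables import mcnemar

#                       OWL: yes   OWL: no
table = [[62, 28],    # MindTap: yes
         [3,   7]]    # MindTap: no

result = mcnemar(table, exact=True)  # exact binomial test on the discordant pairs (28 vs. 3)
print(f"statistic = {result.statistic}, p = {result.pvalue:.4f}")
```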

Five questions on the survey asked the students to choose between the systems and asked for an explanation as an open response. The results of this survey are in the first columns of Table 3. Using a chi-squared analysis, significant differences were found. This indicates that the percentages found were significantly different from the expected value of 50%, which would be predicted if students treated the systems the same. Students reported that with MindTap they learned more, understood concepts in a better way, were more comfortable, and would recommend it for future classes. Students reported that OWL had more disadvantages. The response to the first question seemed the most dramatic: by about a 2:1 ratio (66% to 30%), students said that MindTap helped them learn more. This was surprising since the students had used OWL more recently, and it had been a full semester since they had used MindTap. Could it be that students felt that they retained the information better with MindTap, so they indicated that they had learned more with that system? More research is needed to investigate this. The question asking which system helped them understand concepts in a better way also offered the option that the systems were the same; 24% of students chose this option, which is why the values in the table for this question add up to only 75%.
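The chi-squared comparison against the expected 50/50 split can be sketched as a goodness-of-fit test; the same approach applies to the observed-vs-expected checks behind Tables 2 and 4 (with three categories and an expected 33.33% each). The counts below come from the first row of Table 3 (66 vs. 30 of the 96 students who chose a system); the exact test variant the authors used is assumed.

```python
# Chi-squared goodness-of-fit sketch: observed split vs. the 50/50 split expected
# if students treated the systems the same (counts from the first row of Table 3).
from scipy import stats

observed = [66, 30]            # chose MindTap, chose OWL
n = sum(observed)
expected = [n / 2, n / 2]      # 50% each under "no preference"

chi2, p = stats.chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")  # p < 0.001, consistent with Table 3
```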

Table 3 Survey results from spring 15 students who used MindTap then OWL
MindTap (%) OWL (%) p-Value
a Statistically significant at the p < 0.05 level.
With which did you learn more? 66 30 <0.001
Which helped you understand concepts in a better way? 63 12 <0.001
Which were you more comfortable using? 69 30 <0.001
Which system had more disadvantages? 31 69 <0.001
Which would you recommend other classes in the future? 67 32 <0.001


Students were asked to explain their responses in Table 3. The explanations were tabulated and analyzed, based on the number of students giving an explanation. Common explanations are discussed here. Because students could give more than one explanation or fail to give an explanation, the percentages will not add to 100%. When asked why they indicated that they learned more with MindTap, 25.0% explained that MindTap had better explanations, while 30.3% said that MindTap was better organized.

When asked to explain their response about which system helped them understand concepts in a better way, 44.8% indicated it was the embedded arrangement of the text and homework in MindTap, while 13.8% said MindTap was easier to use. It should be noted here that while 4% failed to choose one system over another in Table 3, 13.8% of the students writing explanations indicated that they learned with both systems and declined to choose one over the other. When explaining why they were more comfortable with a system, 49.0% said that MindTap was easier to use, while 21.6% said that MindTap gave better explanations for the problems. For those who said they were more comfortable with OWL, the most common explanation was that OWL was more straightforward (15.7%).

In explaining their response about which system had more disadvantages, 33.8% indicated that OWL had the text separate from the questions, 16.2% said that MindTap was easier to use, and 13.2% said that MindTap had better explanations. Finally, when explaining why they would recommend a system to other classes, 35.9% indicated that MindTap was better organized and easier to use and 35.9% said they learned more with MindTap. Although both systems have tutored problems, 11.5% said they would recommend OWL because it had tutor steps.

In the explanations for all five of the questions directly comparing the systems, the common responses pointed to the ease of use, better explanations, and better organization of MindTap, with some students overtly indicating that what made the difference was the arrangement of the text and homework questions. These data were collected a semester after students had used the MindTap system, a semester in which they used OWL. In fact, of the students who indicated that they learned more with MindTap than with OWL and explained this by saying that MindTap had better explanations, 68% specifically said that the better explanations were due to the ability to read the text.

Spring 16

There was a question about bias for the order in which the systems were used. Was the preference for MindTap in the spring 15 data simply a bias for the first system the students used? To investigate the effect of order of use, a survey similar to that of spring 15 was given to students who had used OWL in first-semester general chemistry with another instructor and then used MindTap during the second-semester course. The end-of-semester survey contained seven questions each on MindTap and on OWL (for those who had used it in the fall semester). Students who indicated that they did all or most of the MindTap homework in the spring semester and all or most of the OWL homework in the fall semester were included in the analysis of the remaining six questions dealing with each system individually. Table 4 shows the percent of students who chose a particular response for each system and the p-value for the most common response compared to the expected percent. The expected values are 33.33% when there were three response choices and 50% when there were only two.
Table 4 Survey data for spring 16 students who used OWL then MindTap
n MindTap n OWL
a Significantly higher than the other response choices.
How easy was the system to use? 62 Easy 87%a 62 Easy 61%a
Undecided 10% Undecided 11%
Not easy 3% Not easy 27%
p < 0.0001 p < 0.0001
Between treatments p < 0.0001
Were you comfortable using the system? 62 Yes 98%a 63 Yes 73%a
No 2% No 27%
p < 0.0001 p < 0.0001
Between treatments p < 0.0001
Did you like the arrangement of the text and questions? 63 Yes 95%a 61 Yes 41%
No 5% No 59%
p < 0.0001 p > 0.05
Between treatments p < 0.0001
How helpful was the system in improving your understanding of the concepts? 62 Helpful 87%a 63 Helpful 32%
Undecided 13% Undecided 30%
Not helpful 0% Not helpful 38%
p < 0.0001 p > 0.05
Between treatments p < 0.0001
How helpful was the system in preparing you to answer/work problems? 62 Helpful 81%a 62 Helpful 37%
Undecided 15% Undecided 27%
Not helpful 5% Not helpful 35%
p < 0.0001 p > 0.05
Between treatments p < 0.0001
Did the system help you learn? 63 Yes 94%a 63 Yes 57%
No 6% No 43%
p < 0.0001 p > 0.05
Between treatments p < 0.0001


When asked about each system separately, students reported (Table 4) that it was easy to learn with both systems and that they were comfortable using both systems. Students liked the arrangement of the text and questions in MindTap, thought MindTap was helpful in improving concept understanding, thought MindTap helped them prepare to answer or work problems, and thought MindTap helped them learn. It is important to point out that the number of students indicating yes when asked if OWL helped them learn was not significantly different from the expected value of 50%. For some of the questions, fewer than 63 students responded because some students failed to choose a response but wrote in their own explanation. The preference for MindTap existed for those who used OWL first, then MindTap.

To compare responses between the systems, the McNemar's test was performed as previously described. Significant differences between the systems were found for all of the items in Table 4. Significantly more of the students believed that MindTap was easier to use, that they were more comfortable with MindTap, that they liked the arrangement of text and questions in MindTap, that MindTap was helpful in improving their understanding of the concepts, that MindTap was helpful in preparing them to answer/work problems, and that MindTap helped them learn.

Five questions on the survey asked the students to choose between OWL and MindTap and asked for an explanation as an open response. These students had different professors for first- and second-semester general chemistry and had used OWL one semester and MindTap the second semester. In Table 5, the results of the spring 16 survey are in the last three columns. Significant differences between systems were found using a chi-squared analysis. This indicates that the percentages found were significantly different from the expected value of 50%, which would be predicted if students treated the systems the same. Students reported that with MindTap they learned more, understood concepts in a better way, were more comfortable, and would recommend it for future classes. Students reported that OWL had more disadvantages. These findings from spring 16 reflect those from spring 15.

Table 5 Survey results from students who used both systems
Spring 15 Spring 16
MindTap then OWL OWL then MindTap
MindTap (%) OWL (%) p-Value MindTap (%) OWL (%) p-Value
a Statistically significant at the p < 0.05 level.
With which did you learn more? 66 30 <0.001 84 16 <0.001
Which helped you understand concepts in a better way? 63 12 <0.001 87 5 <0.001
Which were you more comfortable using? 69 30 <0.001 85 18 <0.001
Which system had more disadvantages? 31 69 <0.001 17 81 <0.001
Which would you recommend other classes in the future? 67 32 <0.001 89 11 <0.001


Student explanations for each of the five questions, which asked them to choose between the two systems, were analyzed by tabulating the most common responses as they were in spring 15. In spring 16, the majority of the students responded in the same way (81–89%), such that there were no common explanations for the minority views. Students explained that MindTap helped them learn more due to better explanations (42.6%) or due to better examples (32.7%). Students explained that the reason MindTap helped them understand concepts in a better way was due to the arrangement of the text and questions (72.7%). When explaining why they were more comfortable with MindTap, students listed that MindTap was easier to use (28.2%) and that MindTap helped them understand concepts better than OWL (33.3%). The reason they gave for why OWL had more disadvantages was that OWL had the text separate (65.3%). Finally, students responded that they would recommend MindTap to other classes because MindTap helped them understand concepts better (60.0%).

Spring 15 vs. spring 16

Table 5 gives the results for both spring 15, when students used MindTap then OWL, and spring 16, when students used OWL then MindTap. The five questions compared are those where students had to choose between the systems. In both semesters, regardless of the order in which the homework systems were used, students indicated the same preferences: with MindTap they learned more, understood concepts in a better way, were more comfortable, and would recommend it for future classes, while OWL had more disadvantages. The spring 16 students, who had used OWL then MindTap, were more positive towards MindTap than the spring 15 students, who used MindTap then OWL. The findings for the common open-response explanations were also similar between spring 15 and spring 16.

If there were a bias for the last system used, then the spring 15 results, which still favored MindTap, would be all the more surprising. Likewise, no bias for the first system used was found. There might have been an instructor difference: the spring 16 group did have a different professor for the first-semester course, but this instructor was an experienced OWL user. The professor for the spring 16 MindTap course was an experienced MindTap user, although this was the first semester that MindTap had been used with the second semester of general chemistry. It is unclear how the instructors could have affected the more positive responses in spring 16.

Summary and limitations

We found that when students were exposed to both the OWL and MindTap homework systems, in either order, they had better impressions of and attitudes towards MindTap. These impressions and attitudes persisted when measured at the end of the next semester. Students reported that they were more comfortable with MindTap and would recommend MindTap to future classes. Students felt that OWL had more disadvantages, often attributing this to the arrangement of the text and homework questions. Further, students believed that they learned more and understood concepts better with MindTap than with OWL. This is important because researchers have found that students learn best when their perceptions and attitudes towards an activity are positive (e.g., Marzano, 1992). While there was a difference in perceived learning favoring MindTap, our findings show no differences in learning between students using MindTap and those using OWL, based on the performance measures in the course. This difference in perceived learning was found regardless of the order in which students used the homework systems. Our findings concerning perceived learning are similar to those of other researchers (e.g., Hall et al., 2001; Charlesworth and Vician, 2003). The two systems studied here are very similar, which might explain why there was no difference in retention, while other researchers did find such a difference (e.g., Burch and Kuo, 2010; Richards-Babb and Jackson, 2011). Further, no differences were found in course performance, unlike the differences found by other researchers (e.g., Behmke and Atwood, 2013). The similarity in course performance could be due either to the similarity of the systems or to the nature of the examination questions used.

While conducted over a number of semesters, the study is limited by the fact that subjects were from one institution, whose students are among the top in the region. The TOLT scores (7.10 and 7.41) were higher than the 4.4 reported by Tobin and Capie (1981) for a sample of 247 college-level students. A further limitation is that only two homework systems were used. It should be noted that neither of the authors has a financial interest in either of the online homework systems.

Implications

The implications for further research are varied. This study was limited to the comparison of two similar homework systems (MindTap and OWL), for which many of the mastery assignments were the same. It might be that the systems are too similar to produce a measurable difference. Further studies should investigate whether there would be a difference in students’ grades, attitudes, and preferences if MindTap were compared to a very different system such as ALEKS (Assessment and Learning in Knowledge Spaces), Mastering Chemistry, or Sapling Learning.

Also, quite a number of students mentioned that they read more with MindTap. Is the difference in perceived learning simply due to the availability of the text pages, to students actually reading the text, or to the reduction of cognitive load when the text and questions are embedded in one screen? Is it possible that the linked webpages of homework and text in OWL are simply too taxing and result in cognitive overload? Other researchers have found better student performance when short-term memory load was minimized (Behmke and Atwood, 2013). These questions all lead to further work. In comparing the two similar homework systems in this study, the main difference was the screen display for the homework (questions embedded in the text on one screen or text linked to the homework in a separate screen). How do other student attributes play into their attitudes towards screen displays? Do differences in spatial ability or other cognitive measures affect student learning or attitudes with a homework system?

The instructors who participated in this study felt that a prudent implication of this research for their teaching practice was to follow students' perceived learning. With no difference in actual performance between the groups on the course measures (exams), but with a significant difference in perceived learning favoring MindTap, MindTap has been the homework system used since the conclusion of the study. It is possible that students did have learning gains with MindTap, as they believed, but that these were not detected by the course measures (exams 1–4 or the final). The teaching implications of this research include the ideas that, until more studies can be done, instructors should consider the issue of cognitive overload with separate screens for text and homework as they choose a homework system, and that one way to encourage students to read the text is to have it in the same screen as the homework or readily available.

References

  1. Allain R. and Williams T., (2006), The effectiveness of online homework in an introductory science class, J. Coll. Sci. Teach., 35(6), 28–30.
  2. Arasasingham R. D., Taagepera M., Potter F., Martorell I. and Lonjers S., (2005), Assessing the effect of web-based learning tools on student understanding of stoichiometry using knowledge space theory, J. Chem. Educ., 82(8), 1251–1262.
  3. Arasasingham R. D., Martorell I. and McIntire T. M., (2011), Online homework and student achievement in a large enrollment introductory science course, J. Coll. Sci. Teach., 40(6), 70–79.
  4. Behmke D. A. and Atwood C. H., (2013), Implementation and assessment of Cognitive Load Theory based questions in an electronic homework and testing system, Chem. Educ. Res. Pract., 14, 247–256.
  5. Betrancourt M. and Bisseret A., (1998), Integrating textual and pictorial information via pop-up windows: an experimental study, Behav. Inform. Technol., 17(5), 263–273.
  6. Bodner G. M., (1986), Constructivism: a theory of knowledge, J. Chem. Educ., 63, 873–878.
  7. Bonham S. W., Deardorf D. L. and Beichner R. J., (2003), Comparison of student performance using web and paper based homework in college-level physics, J. Res. Sci. Teach., 40(10), 1050–1071.
  8. Bowman C. R., Gulacar O. and King D. B., (2014), Predicting student success via online homework usage, J. Learn. Des., 7(2), 47–61.
  9. Burch K. J. and Kuo Y. J., (2010), Traditional vs. online homework in college algebra, Math. Comput. Educ., 44, 53–63.
  10. Calandra B., Brantley-Dias L. and Dias M., (2006), Using digital video for professional development in urban schools: a preservice teacher's experience with reflection, J. Comput. Teacher Educ., 22(4), 137–145.
  11. Callender A. A. and McDaniel M. A., (2007), The benefits of embedded question adjuncts for low and high structure builders, J. Educ. Psychol., 99(2), 339.
  12. Charlesworth P. and Vician C., (2003), Leveraging technology for chemical science education: an early assessment of WebCT usage in first year chemistry courses, J. Chem. Educ., 80(11), 1333–1337.
  13. Choi H. J. and Johnson S. D., (2005), The effect of context-based video instruction on learning and motivation in online courses, Am. J. Dist. Educ., 19(4), 215–227.
  14. Cole R. S. and Todd J. B., (2003), Effects of web-based multimedia homework with immediate rich feedback on student learning in general chemistry, J. Chem. Educ., 80(11), 1338–1347.
  15. Demirci N., (2007), University students’ perceptions of web-based vs. paper-based homework in a general physics course, Eur. J. Math. Sci. Technol. Educ., 3(1), 29–34.
  16. Eichler J. F. and Peeples J., (2013), Online homework put to the test: a report on the impact of two online learning systems on student performance in general chemistry, J. Chem. Educ., 90(9), 1137–1143.
  17. Erhel S. and Jamet E., (2006), Using pop-up windows to improve multimedia learning, J. Comput. Assist. Learn., 22, 137–147.
  18. Fynewever H., (2008), A comparison of the effectiveness of web-based and paper-based homework for general chemistry, Chem. Educ., 13(4), 264–269.
  19. Gebru M. T., Phelps A. J. and Wulfsberg G., (2012), Effect of clickers versus online homework on students’ long-term retention of general chemistry course material, Chem. Educ. Res. Pract., 13, 325–329.
  20. Hall R. W., Butler L. G., McGuire S. Y., McGlynn S. P., Lyon G. L., Reese R. L. and Limbach P. A., (2001), Automated, web-based, second-chance homework, J. Chem. Educ., 78(12), 1704–1708.
  21. Hamaker C., (1986), The effects of adjunct questions on prose learning, Rev. Educ. Res., 56(2), 212–242.
  22. Hamilton R. J., (1985), A framework for the evaluation of the effectiveness of adjunct questions and objectives, Rev. Educ. Res., 55(1), 47–85.
  23. Kalyuga S., Chandler P. and Sweller J., (2000), Incorporating learner experience into the design of multimedia instruction, J. Educ. Psychol., 92(1), 126–136.
  24. Kersting N. B., Givvin K. B., Thompson B. J., Santagata R. and Stigler J. W., (2012), Measuring usable knowledge: teachers’ analyses of mathematics classroom videos predict teaching quality and student learning, Am. Educ. Res. J., 49(3), 568–589.
  25. Kuhn S. W., Watson S. W. and Walters T. J., (2014), Online homework and correlated success in university mathematics courses: a longitudinal study, in Kyei-Blankson L. and Ntuli E. (ed.), Practical Applications and Experiences in K-20 Blended Learning Environments, Hershey, PA: IGI Global, pp. 307–329.
  26. Kuter S., Gazi Z. A. and Aksal F. A., (2012), Examination of co-construction of knowledge in videotaped simulated instruction, Educ. Technol. Soc., 15(1), 174–184.
  27. Leong K. E. and Alexander N., (2014), College students’ attitude and mathematics achievement using web based homework, Eur. J. Math. Sci. Technol. Educ., 10(6), 609–615.
  28. Macedo-Rouet M., Rouet J., Epstein I. and Fayard P., (2003), Effects of online reading on popular science comprehension, Sci. Commun., 25(2), 99–103.
  29. Malik K., Martinez N., Romero J., Schubel S. and Janowicz P. A., (2014), Mixed-methods study of online and written organic chemistry homework, J. Chem. Educ., 91(11), 1804–1809.
  30. Marzano R. J., (1992), A Different Kind of Classroom: Teaching with Dimensions of Learning, Alexandria, VA: ASCD (Association for Supervision and Curriculum Development).
  31. Mayer R. E., (2009), Multimedia Learning, 2nd edn, New York, NY: Cambridge University Press.
  32. Mills C., (1960), Attitudes affect pupils’ learning, Educ. Leadership, 17(4), 212–216.
  33. Osborne R. J. and Wittrock M. C., (1983), Learning science: a generative process, Sci. Educ., 67, 489–508.
  34. Parker L. L. and Loudon G. M., (2013), Case study using online homework in undergraduate organic chemistry: results and student attitudes, J. Chem. Educ., 90(1), 37–44.
  35. Piaget J., (1977), The Development of Thought: Equilibration of Cognitive Structures, New York, NY: Viking.
  36. Revell K. D., (2014), A comparison of the usage of tablet PC, lecture capture, and online homework in an introductory chemistry course, J. Chem. Educ., 91, 48–51.
  37. Richards-Babb M. and Jackson J. K., (2011), Gendered responses to online homework use in general chemistry, Chem. Educ. Res. Pract., 12, 409–419.
  38. Richards-Babb M., Curtis R., Georgieva Z. and Penn J. H., (2015), Student perceptions of online homework use for formative assessment of learning in organic chemistry, J. Chem. Educ., 92(11), 1813–1819.
  39. Riffell S. K. and Sibley D. H., (2003), Learning online: student perceptions of hybrid learning format, J. Coll. Sci. Teach., 32(6), 394–399.
  40. Santagata R., (2009), Designing video-based professional development for mathematics teachers in low-performing schools, J. Teacher Educ., 60(1), 38–51.
  41. Smithrud D. B. and Pinhas A. R., (2015), Pencil-paper learning should be combined with online homework software, J. Chem. Educ., 92, 1965–1970.
  42. Tobin K. G. and Capie W., (1981), The development and validation of a group test of logical thinking, Educ. Psychol. Meas., 41(2), 413–423.
  43. Vining W. J., Young S. M., Day R. and Botch B., (2014), General Chemistry, Belmont, CA: Cengage Learning.
  44. von Glasersfeld E., (1995), Radical Constructivism, Washington, DC: Falmer.
  45. Vygotsky L.S., (1986), Thought and Language, Cambridge MA: The MIT Press.
  46. Walberg H. J., Paschal R. A. and Weinstein T., (1985), Homework's powerful effects on learning, Educ. Leadership, 42(7), 76–79.
  47. Walczyk J. J. and Hall V. C., (1989), Effects of examples and embedded questions on the accuracy of comprehension self-assessments, J. Educ. Psychol., 81(3), 435–437.
  48. Williamson V. M., (2008), The particulate nature of matter: an example of how theory-based research can impact the field, in Bunce D. and Cole R., (ed.) Nuts and Bolts of Chemical Education Research, Washington, DC: American Chemical Society, pp. 67–78.
  49. Wlodkowski R., (2008), Enhancing adult motivation to learn: a comprehensive guide for teaching all adults, 3rd edn, San Francisco, CA: Jossey-Bass.
  50. Ye L., Oueini R., Dickerson A. P. and Lewis S., (2015), Learning beyond the classroom: using text messages to measure general chemistry students’ study habits, Chem. Educ. Res. Pract., 16, 869–878.
  51. Zumalt C. J. and Williamson V. M., (2016), Does the arrangement of embedded text versus linked text in homework systems make a difference in students’ impressions, attitudes, and perceived learning, J. Sci. Educ. Technol., 25, 704–714.
  52. Zumdahl S. and Zumdahl S., (2015), Chemistry: An Atoms First Approach, 2nd edn, Boston, MA: Cengage Learning.

This journal is © The Royal Society of Chemistry 2017