Vickie M. Williamson*a and Caitlin J. Zumaltb
aTexas A&M University, Department of Chemistry, College Station, TX 77843-3255, USA. E-mail: williamson@tamu.edu
bUniversity of South Florida, Department of Chemistry, Tampa, FL 33620, USA
First published on 20th June 2017
Two large sections of first-semester general chemistry were assigned to use different homework systems. One section used MindTap, a Cengage Learning product that presents short sections of the textbook with embedded homework questions, so that students could read a textbook section and then answer one or more questions on the same screen. The other section used Online Web Learning (OWL version 2), also from Cengage Learning, which presents homework questions containing links that open the textbook in a separate window. Findings showed no difference between the groups in any course grades, with both groups strongly indicating that they learned from their system. During a second-semester chemistry course taught by the same instructor, all students used OWLv2. At the end of the second semester, students who had used MindTap during the first semester were given a delayed survey containing Likert-scaled and open-response questions dealing with students’ perceived learning and level of understanding with each system, how easy each system was to use, and the advantages and disadvantages of each system. In addition, students were asked to compare the two systems and give their homework preference. Students were heavily positive towards the MindTap system. Further data were collected, using the same survey, to compare students who used MindTap for the first semester and OWL for the second semester with those who used the systems in reverse order. Results showed that students indicated significantly higher perceived learning with MindTap and better attitudes and opinions of MindTap, with its single-window arrangement, often citing that they read more with MindTap.
A number of studies have found that online homework is more beneficial than paper homework (e.g., Arasasingham et al., 2005; Malik et al., 2014), while other studies have found that students who did online homework outperformed those who did not (e.g., Eichler and Peeples, 2013). One study (Smithrud and Pinhas, 2015), which used a sample size of over 1000 students, found that using both types of homework (paper and online) gave the best performance in organic chemistry. The authors suggested that an improved version of online homework, which uses a stylus and allows students to hand-draw molecular structures and mechanisms, might be a good solution to this dual-homework issue. Their finding conflicts with that of Malik et al. (2014), who found that online homework was superior to paper homework in organic chemistry. Other studies have found that online homework is just as effective as paper homework (e.g., Bonham et al., 2003; Cole and Todd, 2003; Fynewever, 2008).
The immediate feedback that electronic homework provides is seen as one of its main benefits (e.g., Cole and Todd, 2003; Arasasingham et al., 2005; Parker and Loudon, 2013). Researchers have found that immediate, detailed feedback is beneficial to students’ understanding of the concepts (Arasasingham et al., 2005), helps students retain content understanding (Cole and Todd, 2003; Gebru et al., 2012), aids with retention of students in the course (Richards-Babb and Jackson, 2011), and provides an increase in students’ perceived learning (Hall et al., 2001; Charlesworth and Vician, 2003). A number of studies have reported combinations of these benefits; for example, Burch and Kuo (2010) reported increased exam scores and course retention. Richards-Babb et al. (2015) proposed that the benefits of online homework might be due to improvement in the students’ study habits and attitudes. Their organic chemistry students indicated that they were more consistent in their study with online homework and expressed positive attitudes toward the use of online homework.
Other studies have reported positive student perceptions of online homework systems, even if the students did not show an improvement in grades (e.g., Riffell and Sibley, 2003; Demirci, 2007). Leong and Alexander (2014) found that students with lower and average achievement held more positive attitudes towards online homework than did the high achieving students. The authors concluded that this was probably due to the immediate feedback. These findings were similar to those found earlier by Cole and Todd (2003), that students of lower abilities held more positive attitudes towards the online homework. The literature shows that with positive perceptions and attitudes, student learning is enhanced (e.g., Mills, 1960; Marzano, 1992; Wlodkowski, 2008). The effect of positive attitudes on learning was concisely stated by Marzano (1992, p. 18): “Without positive attitudes and perceptions, students have little chance of learning proficiently, if at all.” Could it be that an improvement in grades with online homework that was shown by some studies was really fueled by positive attitudes elicited by the online homework use?
Online homework does have some disadvantages, such as homework systems that do not provide feedback on the student's exact error, that experience server or connection problems, that allow students to simply select answers with unlimited tries until the correct one is chosen, or that are frustrating and tedious to students (Cole and Todd, 2003; Allain and Williams, 2006; Kuhn et al., 2014). Although Richards-Babb et al. (2015) reported a significant correlation between course final grades in organic chemistry and online homework performance, they also reported that 39% of their students admitted to random guessing at times when using online homework. Bowman et al. (2014) found that the number of attempts in online homework should be limited to encourage students to spend more time on each attempt and to discourage guessing.
There is some evidence that the modes of student learning are expanding. The literature on video-based learning suggests that using videos as learning tools increases learning outcomes as well as improving teaching methods in courses (Calandra et al., 2006; Santagata, 2009; Kersting et al., 2012; Kuter et al., 2012). A study of masters students comparing video-based learning with traditional text-based learning reported a significant increase in motivation with the video-based learning; these students also reported that the information presented in the video was more memorable (Choi and Johnson, 2005). Video learning is employed by popular websites, where many students go for answers.
The literature on questions embedded into text shows that students learn more when using text with embedded questions than when using plain text (Hamilton, 1985; Hamaker, 1986). Students who read text that contained examples and embedded questions were able to better assess their comprehension level than students reading plain text (Walczyk and Hall, 1989). Students with low comprehension levels had improved test performance when studying using readings that had embedded questions (Callender and McDaniel, 2007). Could these ideas be incorporated into online homework?
The literature concerning learning with computers shows that there is a difference between separated or pop-up displays and single screens that contain integrated material. Integrated single screens are better for learning than separated, multiple, or pop-up screens (Betrancourt and Bisseret, 1998; Erhel and Jamet, 2006). Macedo-Rouet et al. (2003) reported that the rationale for this preference for integrated single screens might have to do with cognitive load or disorientation issues with online reading. A physical integration of related elements can address an overload of working memory by reducing the search-and-match process of separated elements (Kalyuga et al., 2000). Mayer (2009) advocated in his 12 Principles of Multimedia Learning that pictures and words together were best for learning. Could these principles affect the results for students using homework with questions embedded in the text on a single screen compared with students using homework with separate screens for text and questions?
Some of the advantages and disadvantages of online homework found by others could be linked to the characteristics of the specific homework systems. Our study sought to compare the student learning and attitudes towards two different homework systems with different characteristics. Our study has its basis in constructivism. Constructivism is a theoretical framework based on the work of a number of cognitive scientists (e.g., Piaget, 1977; Osborne and Wittrock, 1983; Bodner, 1986; Vygotsky, 1986; von Glasersfeld, 1995). Williamson (2008) summarized this body of literature into four main points:
Constructivism is the belief that:
(a) knowledge is constructed from interactions with people and materials, not transmitted,
(b) prior knowledge impacts learning,
(c) learning, especially initial understanding, is context specific,
(d) purposeful learning activities are required to facilitate the construction or modification of knowledge structures (p. 68).
This study uses a constructivist theoretical framework as it investigates homework, which can be a purposeful learning activity, in two electronic systems (MindTap and OWL). These two systems have qualitative questions, quantitative questions, and simulations of the laboratory experience and of particle action. The systems, which are both products of Cengage Learning, create different learning activities, with different characteristics and some similarities as described next.
(1) What are students’ attitudes and perceptions of two homework systems? (embedded vs. linked)
(2) Is there a difference in course grades for students using two different homework systems? (embedded vs. linked)
(1) What are students’ attitudes and perceptions of two homework systems? (embedded vs. linked)
(1) Does the order of use of two homework systems make a difference in student preferences? (embedded vs. linked)
The Test of Logical Thinking (TOLT) (Tobin and Capie, 1981) was administered to both groups of students to ensure the groups were equivalent. This is a standard ten-item test of reasoning abilities, with two items each measuring proportional reasoning, probabilistic reasoning, controlling of variables, correlational reasoning, and combinatorial reasoning. Students’ course grades in the MindTap group were compared to the students’ grades in the OWL version 2 group. Grades for exams 1–4, final exam grades, homework grades, and overall course point total were compared between the two groups using t-tests. Means and standard deviations were calculated for each grade for the MindTap and OWL groups. These grades were collected as a regular part of the course. At the beginning of the semester, students were asked to complete a pre-survey about homework prior to completing any of the homework assignments. At the end of the semester, after completing the homework assignments, students were asked to complete a post-survey about homework. Both of these surveys contained Likert-scaled items, with a few open-ended questions. The open responses were read and tabulated, and the common responses are given in the analysis section.
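The between-group t-test comparisons described above can be sketched in a few lines of code. This is a minimal Welch two-sample t-test with a normal approximation for the two-sided p-value (reasonable at the group sizes in this study); the grade lists below are illustrative, not the study's data.

```python
import math

def welch_t_p(a, b):
    """Two-sample Welch t statistic, with a two-sided p-value from the
    normal approximation (adequate for large groups)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    t = (ma - mb) / math.sqrt(va / na + vb / nb)
    p = 1 - math.erf(abs(t) / math.sqrt(2))  # equals 2 * (1 - Phi(|t|))
    return t, p

# Hypothetical exam-grade lists, not the study's data:
t, p = welch_t_p([73, 75, 71, 78, 80], [76, 74, 77, 79, 81])
```

A p-value above 0.05 from such a comparison is what the study reports for every exam in Table 1.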
Out of a pool of 563 students enrolled in the two sections, 504 students agreed to participate in the study by giving their permission according to the Institutional Review Board's requirements at the university. Of these, 149 were excluded due to missing data, such as not completing the TOLT, taking a makeup exam instead of the normal class exam, or missing some other study component, with exclusions split evenly between the groups. The analysis was completed with 355 subjects. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Informed consent was obtained from all individual participants included in the study. There were 183 students in the group that used MindTap and 172 students in the group that used OWL version 2. There were 40 males (22%) and 143 females (78%) in the MindTap group and 46 males (27%) and 126 females (73%) in the OWL group. The number of males in comparison to females is low because at the university many male students are enrolled as engineering majors, for which there is a different introductory chemistry course specific to their major.
| | Group | Mean (SD) | p-Value |
|---|---|---|---|
| Exam 1 | MindTap | 73.55 (13.72) | 0.19 |
| | OWL | 75.46 (13.60) | |
| Exam 2 | MindTap | 75.78 (14.37) | 0.32 |
| | OWL | 77.30 (14.57) | |
| Exam 3 | MindTap | 74.25 (16.65) | 0.38 |
| | OWL | 72.69 (16.30) | |
| Exam 4 | MindTap | 75.21 (19.36) | 0.69 |
| | OWL | 74.35 (21.06) | |
| Final exam | MindTap | 127.39 (28.07) | 0.77 |
| | OWL | 126.52 (26.94) | |
The p-values for each exam show that the two groups were not significantly different; the two homework systems seemed to prepare students equally for the exams. Since the mean exam grades for the two groups were not significantly different, the overall homework score and the mean score on each homework assignment were examined for each group. Each homework assignment was awarded course points based on the amount completed: if 95% or more of the assigned activities were completed, 10 course points were awarded; if 90–94% was done, nine points were given; if 80–89% was done, eight course points were given; and so on. With seven homework assignments, a total of 70 points was possible. The mean overall homework score for the MindTap group was 64.37 with a standard deviation of 7.98, and for the OWL group it was 65.36 with a standard deviation of 6.71. The p-value for the overall homework score was 0.21, so the two groups were not significantly different; there was no effect of homework system on the mean overall homework score. The mean score for each of the individual homework assignments for both groups is shown in Fig. 5.
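The completion-to-points scale described above can be made concrete with a small function. This is a hypothetical helper written only for illustration (neither homework system exposes such a function), and the boundaries of the 9- and 10-point bands reflect our reading of the description above.

```python
def homework_points(percent_complete):
    """Convert the percent of assigned activities completed into course
    points, following the scale described in the text: 95% or more earns
    the full 10 points, 90-94% earns 9, and each 10-point band below
    that earns one point fewer."""
    if percent_complete >= 95:
        return 10
    if percent_complete >= 90:
        return 9
    # 80-89% -> 8, 70-79% -> 7, ..., 0-9% -> 0
    return max(0, percent_complete // 10)

print(homework_points(96), homework_points(85))  # 10 8
```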
The mean score for homework assignment 1 was the highest, at 9.92 (SD 0.35) and 9.84 (SD 0.46) for the MindTap and OWL groups, respectively. Homework assignment 1 dealt with a review of math and beginning concepts of mixtures and pure substances. The lowest mean score was for homework assignment 5, at 8.37 (SD 2.50) and 8.71 (SD 2.06) for the MindTap group and the OWL group, respectively. This homework assignment concerned stoichiometry and equation types.
The mean scores for the homework assignments were not significantly different between the groups for assignments 1, 3, 4, 5, 6, and 7. The mean score for homework assignment 2, which dealt with atomic structure, was significantly different: the MindTap group had a mean of 9.32 with a standard deviation of 1.35, while the OWL group had a mean of 9.81 with a standard deviation of 0.68. The p-value for homework assignment 2 was 0.00002, so the two groups were significantly different on this assignment.
The survey given at the end of fall 14 was analyzed. Regardless of the homework system, 75–85% of the students reported that the system they used helped them understand the course material, helped them learn content they were struggling with, and helped them understand new concepts. They further believed that their system was easy to use, helped them learn, would be recommended to other students, and contained questions of moderate difficulty. Both groups indicated that they spent the same amount of time on homework (2–6 hours per week). In open responses they cited these important features: automatic scoring and immediate feedback, help with identifying relevant material, tracking of progress and performance, and the ability to study anytime and anywhere. There were no differences by homework system, though neither group had a basis for comparison, since each had used only one homework system.
The MindTap group and the OWL group were not significantly different in most performance measures (overall course points, exam 1–4 grades, overall homework grades, homework 1 grades, or homework 3–7 grades). The two groups were significantly different only on the homework assignment 2 score, with the OWL group scoring higher. Positive attitudes and perceived learning towards the system they used were found for both groups, with no measurable differences between the treatment groups. Both groups also had similar numbers of withdrawals and letter grade distributions.
Table 2 shows the percent of students that chose a particular response for each system and the p-value for the most common response compared to the expected percent. The expected values are 33.33% when there were three response choices and 50% when there were only two. When asked about each system separately, students reported that it was easy to learn with both systems, that they were comfortable using both systems, that both systems helped them improve their concept understanding, that both helped them prepare to answer or work problems, and that both systems helped them learn. Students liked the arrangement of text to questions in both systems when asked about this arrangement individually. It should be noted that two students did not choose a response for this item, but wrote in an explanation.
| Question | n | MindTap | n | OWL |
|---|---|---|---|---|
| How easy was the system to use? | 100 | Easy 91%ᵃ | 100 | Easy 78%ᵃ |
| | | Undecided 5% | | Undecided 10% |
| | | Not easy 4% | | Not easy 12% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p > 0.05 | | |
| Were you comfortable using the system? | 100 | Yes 96%ᵃ | 100 | Yes 88%ᵃ |
| | | No 4% | | No 12% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p > 0.05 | | |
| Did you like the arrangement of the text and questions? | 100 | Yes 90%ᵃ | 98 | Yes 65%ᵃ |
| | | No 10% | | No 35% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p < 0.0001 | | |
| How helpful was the system in improving your understanding of the concepts? | 100 | Helpful 89%ᵃ | 100 | Helpful 78%ᵃ |
| | | Undecided 7% | | Undecided 13% |
| | | Not helpful 4% | | Not helpful 9% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p > 0.05 | | |
| How helpful was the system in preparing you to answer/work problems? | 100 | Helpful 90%ᵃ | 100 | Helpful 71%ᵃ |
| | | Undecided 4% | | Undecided 16% |
| | | Not helpful 6% | | Not helpful 13% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p > 0.05 | | |
| Did the system help you learn? | 100 | Yes 96%ᵃ | 100 | Yes 88%ᵃ |
| | | No 4% | | No 12% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p > 0.05 | | |

ᵃ Significantly higher than the other response choices.
To determine whether the responses for the two systems were significantly different from each other, a McNemar's test was performed. The McNemar's test is a statistical test used on paired nominal data in a 2 × 2 contingency table. For the questions with three response categories, the “undecided” response was omitted, and the analysis compared “helpful” and “not helpful” for each system. The only question that showed a significant difference between systems was the one concerning the arrangement of the text and questions: the responses for MindTap were significantly different from those for OWL (the “yes” response was significantly higher and the “no” response lower for MindTap). Students preferred the arrangement of the text and questions in MindTap.
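A McNemar's test of this kind can be sketched as follows, using only the discordant pairs and the χ²(1) approximation; the pair counts in the example are illustrative, not the study's data.

```python
import math

def mcnemar_p(b, c):
    """McNemar's chi-square test for paired nominal responses.

    b = number of students answering "yes" for one system but "no" for
    the other; c = the reverse. Only these discordant pairs inform the
    test (assumes b + c > 0). Returns the p-value from the
    chi-square(df = 1) approximation.
    """
    stat = (b - c) ** 2 / (b + c)
    # chi-square(1) tail probability: P(X > stat) = 1 - erf(sqrt(stat / 2))
    return 1 - math.erf(math.sqrt(stat / 2))

# e.g. 30 students flipped one way and only 5 the other:
p = mcnemar_p(30, 5)  # well below 0.05
```

When the discordant pairs are balanced (b ≈ c), the statistic is near zero and the systems are judged equivalent on that item.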
Five questions on the survey asked the students to choose between the systems and asked for an explanation using open response. The results of this survey are in the first columns of Table 3. Using a chi-squared analysis, significant differences were found, indicating that the percentages were significantly different from the expected value of 50%, which would be predicted if students treated the systems the same. Students reported that with MindTap they learned more, understood concepts in a better way, were more comfortable, and would recommend it for future classes. Students reported that OWL had more disadvantages. The response to the first question seemed the most dramatic: by about a 2:1 ratio (66% to 30%), students said that MindTap helped them learn more. This was surprising, since the students had used OWL more recently and it had been a full semester since they had used MindTap. Could it be that students felt that they retained the information better with MindTap, so they indicated that they had learned more with that system? More research is needed to investigate this. The question asking which system helped them understand concepts in a better way included the option that the systems were the same, which 24% of students chose; because of this, the values in the table for this question add up to only 75%.
| | MindTap (%) | OWL (%) | p-Value |
|---|---|---|---|
| With which did you learn more? | 66 | 30 | <0.001ᵃ |
| Which helped you understand concepts in a better way? | 63 | 12 | <0.001ᵃ |
| Which were you more comfortable using? | 69 | 30 | <0.001ᵃ |
| Which system had more disadvantages? | 31 | 69 | <0.001ᵃ |
| Which would you recommend to other classes in the future? | 67 | 32 | <0.001ᵃ |

ᵃ Statistically significant at the p < 0.05 level.
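The chi-squared comparison against the expected 50/50 split can be sketched like this. It is a one-degree-of-freedom goodness-of-fit test; the example treats the reported percentages of roughly 100 respondents as counts, and the helper itself is illustrative rather than the study's actual analysis script.

```python
import math

def chi2_vs_even_split(count_a, count_b):
    """Goodness-of-fit test of two observed counts against an even
    50/50 split; returns (chi-square statistic, p-value), df = 1."""
    expected = (count_a + count_b) / 2
    stat = ((count_a - expected) ** 2 + (count_b - expected) ** 2) / expected
    p = 1 - math.erf(math.sqrt(stat / 2))  # chi-square(1) tail probability
    return stat, p

# First row of the table, treating percentages as approximate counts:
stat, p = chi2_vs_even_split(66, 30)  # p < 0.001, matching the table
```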
Students were asked to explain their responses in Table 3. The explanations were tabulated and analyzed based on the number of students giving an explanation; common explanations are discussed here. Because students could give more than one explanation or fail to give one, the percentages do not sum to 100%. When asked why they indicated that they learned more with MindTap, 25.0% explained that MindTap had better explanations, while 30.3% said that MindTap was better organized.
When asked to explain their response about which system helped them understand concepts in a better way, 44.8% indicated it was the embedded arrangement of the text and homework in MindTap, while 13.8% said MindTap was easier to use. It should be noted that while only 4% failed to choose one system over the other in Table 3, 13.8% of the students writing explanations indicated that they learned with both systems and declined to choose one over the other. When explaining why they were more comfortable with a system, 49.0% said that MindTap was easier to use, while 21.6% said that MindTap gave better explanations for the problems. For those who said they were more comfortable with OWL, the most common explanation was that OWL was more straightforward (15.7%).
In explaining their response about which system had more disadvantages, 33.8% indicated that OWL had the text separate from the questions, 16.2% said that MindTap was easier to use, and 13.2% said that MindTap had better explanations. Finally, when explaining why they would recommend a system to other classes, 35.9% indicated that MindTap was better organized and easier to use and 35.9% said they learned more with MindTap. Although both systems have tutored problems, 11.5% said they would recommend OWL because it had tutor steps.
In the explanations for all five of the questions directly comparing the systems, the common responses were students pointing to the ease of use, better explanations, and better organization with MindTap, with some overtly indicating that what made the difference was the arrangement of the text and homework questions. These data were collected a semester after students had used the MindTap system, a semester in which they used OWL. In fact, 68% of the students who indicated that they learned more with MindTap than with OWL and explained this response by saying that MindTap had better explanations specifically said that the better explanations were due to the ability to read the text.
| Question | n | MindTap | n | OWL |
|---|---|---|---|---|
| How easy was the system to use? | 62 | Easy 87%ᵃ | 62 | Easy 61%ᵃ |
| | | Undecided 10% | | Undecided 11% |
| | | Not easy 3% | | Not easy 27% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p < 0.0001 | | |
| Were you comfortable using the system? | 62 | Yes 98%ᵃ | 63 | Yes 73%ᵃ |
| | | No 2% | | No 27% |
| | | p < 0.0001 | | p < 0.0001 |
| | | Between treatments p < 0.0001 | | |
| Did you like the arrangement of the text and questions? | 63 | Yes 95%ᵃ | 61 | Yes 41% |
| | | No 5% | | No 59% |
| | | p < 0.0001 | | p > 0.05 |
| | | Between treatments p < 0.0001 | | |
| How helpful was the system in improving your understanding of the concepts? | 62 | Helpful 87%ᵃ | 63 | Helpful 32% |
| | | Undecided 13% | | Undecided 30% |
| | | Not helpful 0% | | Not helpful 38% |
| | | p < 0.0001 | | p > 0.05 |
| | | Between treatments p < 0.0001 | | |
| How helpful was the system in preparing you to answer/work problems? | 62 | Helpful 81%ᵃ | 62 | Helpful 37% |
| | | Undecided 15% | | Undecided 27% |
| | | Not helpful 5% | | Not helpful 35% |
| | | p < 0.0001 | | p > 0.05 |
| | | Between treatments p < 0.0001 | | |
| Did the system help you learn? | 63 | Yes 94%ᵃ | 63 | Yes 57% |
| | | No 6% | | No 43% |
| | | p < 0.0001 | | p > 0.05 |
| | | Between treatments p < 0.0001 | | |

ᵃ Significantly higher than the other response choices.
When asked about each system separately (Table 4), students reported that it was easy to learn with both systems and that they were comfortable using both systems. Students liked the arrangement of the text and questions in MindTap, thought MindTap was helpful in improving concept understanding, that MindTap helped them prepare to answer or work problems, and that MindTap helped them learn. It is important to point out that the number of students indicating yes when asked whether OWL helped them learn was not significantly different from the expected value of 50%. For some of the questions, fewer than 63 students responded because some failed to choose a response but wrote in their own explanation. The preference for MindTap existed for those who used OWL first, then MindTap.
To compare responses between the systems, the McNemar's test was performed as previously described. Significant differences were found between the systems for all of the items in Table 4. Significantly more of the students believed that MindTap was easier to use, that they were more comfortable with MindTap, that they liked the arrangement of text to questions in MindTap, that MindTap was helpful in improving their understanding of the concepts, that MindTap was helpful in preparing them to answer/work problems, and that they learned with MindTap.
Five questions on the survey asked the students to choose between OWL and MindTap and asked for an explanation using open response. These students had a different professor for first- and second-semester general chemistry and had used OWL one semester and MindTap the second semester. The results of the spring 16 survey are in the last three columns of Table 5. Significant differences between systems were found using a chi-squared analysis, indicating that the percentages were significantly different from the expected value of 50%, which would be predicted if students treated the systems the same. Students reported that with MindTap they learned more, understood concepts in a better way, were more comfortable, and would recommend it for future classes. Students reported that OWL had more disadvantages. These findings from spring 16 reflect those from spring 15.
| | Spring 15: MindTap (%) | Spring 15: OWL (%) | p-Value | Spring 16: MindTap (%) | Spring 16: OWL (%) | p-Value |
|---|---|---|---|---|---|---|
| With which did you learn more? | 66 | 30 | <0.001ᵃ | 84 | 16 | <0.001ᵃ |
| Which helped you understand concepts in a better way? | 63 | 12 | <0.001ᵃ | 87 | 5 | <0.001ᵃ |
| Which were you more comfortable using? | 69 | 30 | <0.001ᵃ | 85 | 18 | <0.001ᵃ |
| Which system had more disadvantages? | 31 | 69 | <0.001ᵃ | 17 | 81 | <0.001ᵃ |
| Which would you recommend to other classes in the future? | 67 | 32 | <0.001ᵃ | 89 | 11 | <0.001ᵃ |

Spring 15 students used MindTap then OWL; spring 16 students used OWL then MindTap.
ᵃ Statistically significant at the p < 0.05 level.
Student explanations for each of the five questions, which asked them to choose between the two systems, were analyzed by tabulating the most common responses as they were in spring 15. In spring 16, the majority of the students responded in the same way (81–89%), such that there were no common explanations for the minority views. Students explained that MindTap helped them learn more due to better explanations (42.6%) or due to better examples (32.7%). Students explained that the reason MindTap helped them understand concepts in a better way was due to the arrangement of the text and questions (72.7%). When explaining why they were more comfortable with MindTap, students listed that MindTap was easier to use (28.2%) and that MindTap helped them understand concepts better than OWL (33.3%). The reason they gave for why OWL had more disadvantages was that OWL had the text separate (65.3%). Finally, students responded that they would recommend MindTap to other classes because MindTap helped them understand concepts better (60.0%).
If there were a bias toward the most recently used system, the spring 15 results, which still favored MindTap, would be all the more surprising. Likewise, no bias toward the first system used was found. There might have been an instructor difference: the spring 16 group did have a different professor for the first-semester course, but this instructor was an experienced OWL user. The professor for the spring 16 MindTap course was an experienced MindTap user, although this was the first semester MindTap was used with second-semester general chemistry. It is unclear how the instructors could have affected the higher percentages for spring 16.
While conducted over a number of semesters, the study is limited by the fact that subjects were from one institution, whose students are among the top from the region: the TOLT scores (7.10 and 7.31) were higher than the 4.4 reported by Tobin and Capie (1981) for a sample of 247 college-level students. A further limitation is that only two homework systems were used. It should be noted that neither of the authors has a financial interest in either of the online homework systems.
Also, quite a number of students mentioned that they read more with MindTap. Is the difference in perceived learning simply due to the availability of the text pages, to students actually reading the text, or to the reduction of cognitive load when the text and questions are embedded in one screen? Is it possible that the linked webpages of homework and text in OWL are just too taxing and result in cognitive overload? Other researchers have found better student performance when short-term memory load was minimized (Behmke and Atwood, 2013). All of these questions point to further work. In comparing the two similar homework systems in this study, the main difference was the screen display for the homework (questions embedded in the text on one screen, or text linked to the homework in a separate screen). How do other student attributes play into their attitudes towards screen displays? Do differences in spatial ability or other cognitive measures affect student learning or attitudes with a homework system?
Those who participated in this study felt that a prudent implication of this research for their teaching practice was to follow students' perceived learning. With no difference in actual performance between the groups on the course measures (exams), but with a significant difference in perceived learning favoring MindTap, MindTap has been the homework system used since the conclusion of the study. It is possible that students did have learning gains with MindTap, as they believed, but that these were not detected by the course measures (exams 1–4 or the final). The teaching implications of this research include the ideas that, until more studies can be done, instructors should consider the issue of cognitive overload with separate screens for text and homework as they choose a homework system, and that one way to encourage students to read the text is to have it in the same screen as the homework or readily available.
This journal is © The Royal Society of Chemistry 2017