Students' study behaviors as a predictor of performance in general chemistry I

Lorraine Laguerre Van Sickle and Regina F. Frey *
Department of Chemistry, University of Utah, Salt Lake City, UT 84112, USA. E-mail: gina.frey@utah.edu

Received 2nd August 2023, Accepted 8th November 2024

First published on 11th November 2024


Abstract

General chemistry is often the first course taken by students interested in careers in STEM and health fields and is therefore considered an essential course for the success and retention of students in these fields. Prior studies have shown study habits and skills to be related to student performance in college-level courses, including STEM courses. Previous chemistry studies have focused on deep versus surface approaches to studying, how affective variables (e.g., self-efficacy) affect study habits, and how students study week to week. The literature has also shown that students’ management of their general study time can affect their performance, with distraction while studying becoming an increasing challenge for students. This study examined first-semester general-chemistry students' study behaviors (both their explicit learning strategies and their study-time management practices), focusing on exam preparation, and the relationship of these behaviors to exam performance when controlling for prior knowledge and class attendance. Key findings include: (1) students, on average, employed two active strategies for exam preparation, dedicated half of their study time to active strategies, and were distracted 26% of the time. (2) While active strategies positively influenced exam performance and passive strategies had a negative impact, not all active strategies were equally effective. (3) The percentage of study time spent on active strategies correlated positively with performance, whereas higher distraction levels during exam preparation negatively affected outcomes. Understanding students' exam-study behaviors and their effects on exam performance can help instructors support students more effectively by teaching study strategies that are effective for their courses.


Introduction

While there has been an increase in the number of individuals enrolling in college, only about half of enrolled students complete their bachelor's degrees (Conley, 2007), with high attrition rates observed in STEM fields (Rask, 2010). Several factors contribute to students leaving during their undergraduate years, including academic performance in their early college years. Previous literature has shown that course grades are an important determinant of whether a student decides to take another course in that major (Sabot and Wakeman-Linn, 1991). STEM majors typically earn the lowest GPAs in college (Sabot and Wakeman-Linn, 1991), and students can be “pulled away” by their high grades in non-STEM courses and “pushed out” by their low grades in their STEM major (Ost, 2010). This situation can be especially critical for first-year students. Previous studies have shown that student performance in STEM courses can be affected by academic background (e.g., pedagogy of high-school courses) (Tai et al., 2005; Seery, 2009), affective factors (e.g., motivation or social belonging) (Lewis et al., 2009; Xu et al., 2013; Fink et al., 2020; Edwards et al., 2021), and cognitive factors (e.g., study habits) (Li et al., 2013; Sinapuelas and Stacy, 2015; Ye et al., 2015; Atieh et al., 2020; Rowell et al., 2021; Walck-Shannon et al., 2021). In addition, high-school study habits were shown to be unrelated to academic achievement in the first year of college, which suggests that first-year college students must acquire new study habits to succeed academically (Matt et al., 1991). For example, college courses require students to self-regulate their learning; i.e., students must recognize effective learning strategies, manage their study time, and reach out to resources like peers, instructors, peer leaders, and teaching assistants when they encounter issues or have questions during studying (Conley, 2007; Fanetti et al., 2010). This differs from most high-school courses, where detailed rubrics or specific prescribed tasks do not require or encourage self-regulation (Fanetti et al., 2010). Research has also found discrepancies between the faculty-recommended learning strategies (e.g., elaboration, organization, critical thinking, peer helping, and seeking help) and the strategies students use to study (e.g., re-reading class material or re-watching lecture videos) (Lynch, 2007). In addition, a study by Cherif and Wideen (1992) found that first-year STEM students felt unprepared due to the emphasis on facts in high school versus the emphasis on understanding theories and concepts in college-science courses.

Our study focuses on a General Chemistry 1 (GC1) course because it is an introductory course required for most STEM and health-field careers. It is also usually one of the first STEM courses students take in their first year of college after graduating from high school. Prior studies have shown study habits and skills to be related to student performance in college-level courses (Credé and Kuncel, 2008; Rowell et al., 2021), including STEM courses (Ebele and Olofu, 2017; Walck-Shannon et al., 2021). Previous studies in chemistry (Szu et al., 2011; Li et al., 2013; Zhao et al., 2014; Sinapuelas and Stacy, 2015; Ye et al., 2015; Chan and Bauer, 2016; Bunce et al., 2017; Atieh et al., 2020; Muteti et al., 2021) have focused on deep versus surface approaches to studying, how affective variables like self-efficacy influence study habits, and how students study week to week. The literature has also shown that students’ management of their general study time can affect their performance (Credé and Kuncel, 2008), including distraction while studying (Rosen et al., 2013; May and Elder, 2018; Dontre, 2021; Sunday et al., 2021). In this study, we examined students' study behaviors (both their explicit learning strategies and study-time management practices), focusing specifically on exam preparation in a GC1 course, and how these behaviors relate to exam performance when controlling for prior knowledge and class attendance. We aim to add to the existing literature on student study behavior in STEM courses and to provide instructors additional knowledge about the specific learning strategies and study-time management behaviors students use when preparing for GC1 exams, to help instructors better support students in their courses.

Generative learning framework

This study is based on the generative learning theory (Wittrock, 1991; Fiorella and Mayer, 2016), which states that as learners actively integrate ideas, these new ideas are linked to previously learned ideas, resulting in cognitive reorganization that enables learners to apply these ideas to new situations. This generative-learning process transforms new information into usable knowledge (e.g., mental models or schemas). Therefore, generative-learning theory relies both on instructional methods (e.g., Merrill, 2002; Sweller et al., 2019; Lombardi et al., 2021) and on student learning (study) strategies.

This framework has generated much research (McDaniel et al., 2007; Dunlosky et al., 2013; Chi and Wylie, 2014; Fiorella and Mayer, 2016; Bjork, 2018) focusing on which learning strategies promote generative learning (often called active strategies) and which target rote learning (often called passive strategies). In the generative-learning process (Wittrock, 1991; Fiorella and Mayer, 2016), learners must first select the information to retain in working memory for further conscious processing. They then organize this information by creating internal connections, forming a new mental representation. Last, the learner integrates this new mental model with existing knowledge and beliefs from memory to facilitate long-term recall, creating connections and pathways between old and new information (Bjork, 1994). Consequently, the number of pathways through which information can be retrieved increases, which improves retrieval effectiveness and enhances the ability to apply the acquired knowledge and skills to novel problems. This approach also promotes easier access to information from long-term memory at a later time (Bjork, 1994; Roediger III and Karpicke, 2006; McDaniel et al., 2007). Using this framework and supporting literature, we categorized the learning strategies in our study as active, passive, or mixed.

A complementary framework often used in chemistry is student approaches to learning (Pintrich, 1991; Biggs, 1993; Biggs et al., 2001; Entwistle and McCune, 2004; Bunce et al., 2017), specifically the concept of a deep approach to learning (Marton and Saljo, 1976), which states that students have different intentions when approaching a task such as studying. The generative-learning framework used in this study differs from the deep-and-surface framework because the latter combines both the strategies the student uses and their intent while performing the task.

Which constructs of learning and study behaviors relate to student performance?

In this study, we are determining the self-directed study behaviors students engage in when studying for exams, which includes the learning strategies they use and their study-time management habits.

In a meta-analysis conducted by Credé and Kuncel (2008), the following constructs related to learning and study behaviors were examined in relation to student performance in college (course grades and college GPA): study skills, study habits, study attitudes, metacognitive skills, study motivation, depth of processing, and study anxiety. The authors found that study habits, study skills, study attitudes, and study motivation were strong predictors of college academic performance, while depth of processing and study anxiety were weak predictors (Credé and Kuncel, 2008). Additionally, of the four top constructs, study habits and study skills were the top two predictors of college academic performance. In our study, we focused on the effects that study habits and study skills have on student performance in general chemistry.

One component of the study-skills construct is the student's capability to efficiently allocate time and other resources to meet academic requirements (Credé and Kuncel, 2008). In our study, we assessed students' study-time management habits by asking about their ability to manage distractions while studying for exams, the number of days and hours they spent studying for the exam, and the percentage of exam-study time they spent on each learning strategy.

The term “study habits” used by Credé and Kuncel (2008) refers to the consistent study practices or routines that students engage in, such as regularly reviewing material, self-testing, or rehearsing learned material. In our study, we asked students to report which specific learning strategies they used when studying for their exams. Although study behaviors include a wide variety of constructs, in this study we focused on self-directed study behaviors, specifically the learning strategies and study-time management habits students used during exam preparation.

Which self-directed study behaviors help students succeed?

We describe the key generative strategies that are commonly reported among students and have been evaluated as improving course performance (or learning outcomes in laboratory studies) over a broad range of disciplines, types of problems, and student age (e.g., see the review by Dunlosky et al., 2013). Four key generative learning strategies are elaboration, self-explanation, practice testing, and paraphrasing. Elaboration entails explaining and clarifying concepts by providing intricate details and forming associations between the information you are trying to learn and your personal experiences, memories, everyday life, and existing knowledge (Dunlosky et al., 2013). Learners are encouraged to ask themselves questions about the underlying principles and reasons behind the subject matter they are studying. Elaboration is described as being an effective technique because it facilitates the assimilation of new knowledge with prior knowledge, creating multiple pathways for recalling or retaining information (Dunlosky et al., 2013). In self-explanation, students vocalize the reasoning behind their problem-solving steps or processes. Similar to elaboration, it enhances learning by facilitating the integration of new information with pre-existing knowledge, promoting a deeper level of understanding of the process, which helps in transfer (Dunlosky et al., 2013). In practice testing, students engage in either low- or no-stakes practice exercises without having the solutions or notes available. These exercises may involve activities such as answering practice questions, solving practice problems, or using flashcards. Retrieving target information from memory activates related information in long-term memory and therefore creates multiple pathways to access the information at a later time (Dunlosky et al., 2013). Hence, one is more likely to remember the information long-term and more likely to use and apply it in new situations (Karpicke et al., 2009; Dunlosky et al., 2013). Last, in paraphrasing, students identify and rephrase the main ideas and supporting details from a text or notes using their own words (Lee and Von Colln, 2003; Hagaman et al., 2012; Stevens et al., 2020). This approach helps students break down and organize information, reducing the cognitive load on working memory, and allows them to focus on comprehending the text. Additionally, paraphrasing enables students to establish connections between new ideas and previously assimilated knowledge, facilitating a holistic understanding of the text (Stevens et al., 2020). Finally, by actively engaging in paraphrasing, students become active participants in the reading process rather than passive recipients of information (Hagaman et al., 2012).

In addition to using learning strategies that have been shown to improve performance (or learning outcomes), study-time management habits can also affect performance. The desirable difficulties framework (Bjork and Bjork, 2011) discusses the need to distribute study time across multiple sessions to avoid “cramming” near the exam; however, studies report that students make limited use of planned study schedules (Kornell and Bjork, 2007; Cepeda et al., 2008), and many students report waiting until the last day before an exam to study (Kornell and Bjork, 2007; Cepeda et al., 2008; Hora and Oleson, 2017). Hence, in our study, we included the amount of study time (both in days studied and hours studied). In both the generative-learning and the student approaches to learning frameworks, active-learning techniques used during studying lead to improved performance; therefore, we also examined the amount of exam-study time students spent using active strategies. For example, a previous study by Walck-Shannon et al. (2021) showed that the proportion of study time using active strategies correlated with higher exam performance in a biology course. We hypothesize that this trend arises because these students spend more time on strategies that promote long-term retention, transfer, and flexible use of the information learned, and less time on strategies that rely on passive absorption of information, which can support immediate retrieval but not long-term recall or deep comprehension.

Which self-directed study behaviors do not help students succeed?

Similarly, students often rely on behaviors that do not support their success. Three commonly used learning strategies are re-reading, re-watching videos, and re-writing class notes word for word. Re-reading, one of the most widely used study techniques among students (Karpicke et al., 2009; Rowell et al., 2021; Walck-Shannon et al., 2021), involves reviewing the material being learned, such as textbooks, notes, and lecture slides (Dunlosky et al., 2013). While re-reading can enhance memory retention for the material, its benefits are limited compared to other techniques, such as self-testing (or practice testing) (Szpunar et al., 2013) and self-explanation (Moss et al., 2013). Re-reading may lead to rigid representations of the material, which are resistant to change (Fritz et al., 2000), and can result in students skimming through the material on subsequent readings, leading to shallower reading strategies (Rawson, 2012) and increased mind wandering (Phillips et al., 2016). Importantly, students may experience a greater sense of fluency during subsequent readings, mistaking this fluency for comprehension (Wiley et al., 2005). As a result, students may mistakenly believe that they have a better understanding of the material. In fact, re-reading has been found not to improve performance (i.e., it was an ineffective study strategy) when used to prepare for certain summative assessments, such as multiple-choice questions, short-answer questions, and text summaries (Callender and McDaniel, 2009). Similarly, an increasingly popular technique among students today is re-watching lecture videos, which shares the same limitation as re-reading, as it simply involves re-exposure to the same material without any new perspectives or insights (Martin et al., 2018). According to Martin et al. (2018), re-watching lecture material did not improve memory retention but did increase rates of mind wandering. Last, re-writing class notes word for word is a common learning strategy, but students take a passive approach when engaging with it (Kiewra, 1987; DeZure et al., 2001; Özçakmak, 2019; Schmidt, 2019). Note-taking is a complex task that requires effort and skill, which many students find challenging, since they must comprehend and select information, produce written notes, and understand new information simultaneously. This puts a significant strain on the central-executive and other working-memory components (Schmidt, 2019). Due to the high demands of note-taking, students' notes tend to be incomplete, with important ideas missing. Additionally, student notes are often inaccurate and rarely corrected once written (Kiewra, 1987; DeZure et al., 2001; Özçakmak, 2019; Schmidt, 2019). Therefore, when students re-write their notes word for word, they may be relying on incomplete or incorrect information from the lecture. This strategy also falls at the lowest level of Bloom's Taxonomy of Educational Objectives, which is “remembering.” At this level, while the learner may be able to recall specific pieces of information, such as facts or definitions, this strategy does not facilitate a deeper understanding of the underlying concepts (Bloom et al., 1956).

An important aspect of study-time management concerns distractions during independent study time, which are becoming more prevalent (Rosen et al., 2013; Patterson, 2017; May and Elder, 2018; Sunday et al., 2021). Previous research has shown that students who are distracted (predominantly via multitasking) during lecture or while studying have lower academic performance (Kraushaar and Novak, 2010; Junco and Cotten, 2012; Calderwood et al., 2016; Walck-Shannon et al., 2021). A few studies have shown a negative effect of distraction while studying on exam performance (Patterson, 2017; Walck-Shannon et al., 2021). This is due to the brain's limited capacity to multitask or process multiple inputs of information (Junco and Cotten, 2012). Multitasking, or switching back and forth between different tasks, increases the number of errors and the processing time required to learn topics because of the time and effort required to refocus after switching from one task to another (Kraushaar and Novak, 2010). Multitasking is prevalent among students due to access to digital content, such as Facebook, instant messaging, and Instagram (Kraushaar and Novak, 2010; Junco, 2012). Cognitive-science research has found that multitasking while learning can result in the acquisition of less flexible knowledge that cannot be easily recalled or applied in new situations (Kraushaar and Novak, 2010).

In summary, existing research suggests that engaging in active, effortful learning strategies when studying for exams leads to better performance than relying on passive strategies. Additionally, distributing exam-study sessions over time tends to result in higher performance than cramming, and maintaining focused study sessions leads to higher performance than allowing distractions during exam studying. However, which study behaviors students actually use while studying for an exam is a separate question.

Study behaviors college students use to study for their courses

Without instruction on study behaviors, college students, including STEM majors, often prefer passive study strategies, such as re-reading, when studying for their courses (Carrier, 2003; Karpicke et al., 2009; Kraushaar and Novak, 2010; Yan et al., 2014; Hora and Oleson, 2017; Muteti et al., 2021; Rowell et al., 2021; Walck-Shannon et al., 2021). In addition, most college students reported spending less than 10 hours a week studying (Hora and Oleson, 2017), and distraction (e.g., multitasking) during studying is becoming more prominent (Rosen et al., 2013; Patterson, 2017). In STEM courses, Rowell et al. (2021) found that psychology majors, when studying for exams, spent approximately 50% of their study time re-reading class material. Furthermore, in response to a Likert-scale question, 53–60% of those students reported being distracted “never” or “rarely,” while 32–39% reported being distracted “about half the time” during their exam study time. Walck-Shannon et al. (2021) found that while most introductory biology students (94%) used re-reading lecture slides or class notes as their primary study method when preparing for exams, many also reported answering old problem-set questions (84%); on average, students spent half of their study time using active strategies and were distracted 20% of the time when studying for exams.

Several studies have examined the factors associated with students' studying and learning behaviors while they engage in their chemistry coursework. A number of studies examined students’ approaches to learning (Li et al., 2013; Sinapuelas and Stacy, 2015; Bunce et al., 2017; Atieh et al., 2020). Li et al. (2013) examined college chemistry majors' conceptions of learning chemistry (COLC) and approaches to learning chemistry (ALC) and how these related to students' study habits. They found that students who demonstrated lower levels of COLC (e.g., learning chemistry by memorizing) tended to use surface approaches, while students with higher levels of COLC tended to use deep approaches (i.e., they wanted to achieve a better personal understanding of new ideas and information) to learning chemistry. Research conducted by Atieh et al. and Bunce et al. investigated the correlation between students’ approach to learning and general chemistry performance (Bunce et al., 2017; Atieh et al., 2020). Atieh's study revealed that students who exhibited below-average scores in the surface approach but above-average scores in the deep approach tended to perform exceptionally well in the course. Similarly, Bunce's findings echo this trend, indicating that students with varying levels of academic achievement adopt distinct studying approaches; those with A/B grades predominantly employed deep approaches, while students with D/F grades leaned toward surface-level approaches. Additionally, Sinapuelas and Stacy (2015) observed a positive association between learning approach and performance, with students using a learning approach of applying ideas obtaining higher exam performance than students who used a fact-gathering approach.

In a study by Ye et al. (2015), text messages were used to investigate the study habits of general chemistry students. Students received two text messages per week asking whether they had studied for the General Chemistry 1 course in the past 48 hours and how they had studied. The study found that the majority of students reviewed their notes, PowerPoint slides, or textbook when studying. Additionally, Ye et al. discovered that students who relied on reviewing the textbook, notes, and PowerPoint slides as their primary study strategies performed better than those who reported not studying for the course. In organic chemistry, Szu et al. (2011) performed a case study of 20 Organic Chemistry 1 students in which students recorded their study activities in a journal, created concept maps of 5 topics, and conducted think-alouds on selected textbook problems. They found that prior GPA and level of performance on the concept maps and think-aloud problems correlated with final course grade. In addition, they examined whether the number of study sessions per week and the number of hours studied per week correlated with final course grade.

Metacognitive interventions have been developed to enhance the study skills of general chemistry students (Cook et al., 2013; Mutambuki et al., 2020; Muteti et al., 2021). In these studies, metacognitive training sessions were given at the beginning of the semester, each lasting 50 minutes, and discussed the study cycle, metacognitive knowledge and self-regulation prompts, and the use of Bloom's taxonomy during studying. Cook et al. discovered that students who attended the metacognitive lecture achieved final grades, on average, one full letter grade higher than those who did not attend. In addition to the initial training, Muteti et al. and Mutambuki et al. prompted students to continually reflect on the study strategies they employed throughout the semester. The Mutambuki et al. study revealed that the treatment group significantly outperformed the comparison group on both the third midterm exam and the cumulative final exam. The qualitative study by Muteti et al. found that students reported initially relying on rote memorization (or lower-order study strategies) rather than higher-order study strategies, but reported an increased willingness to adopt higher-order study strategies immediately after the metacognitive instruction.

We add to this prior research by focusing on the study behaviors (both explicit learning strategies and study-time management habits) students use when studying for their general chemistry exams, and by asking how these behaviors affect exam performance when controlling for prior knowledge and class attendance. Immediately after each midterm exam, given a list of the class resources and assessments, we asked students which resources and strategies they used and what their exam study-time management habits were (i.e., the number of days and hours per day they studied, the number of active (generative) learning strategies used and the percent of time spent using them, and the percent of time distracted while studying). We then correlated these behaviors with exam performance, accounting for background knowledge and attendance (i.e., class days missed per exam period). To gain a nuanced understanding of the exam study behaviors of general chemistry students, we focused on determining exactly which learning strategies (based on the course resources and assessments available) students in general chemistry used in the absence of formal study training and how each learning strategy related to their exam performance. The learning strategies listed were tailored to align with the curriculum of the general chemistry course, ensuring they were recognizable to students, and students could also add their own strategies. Students use different strategies depending on the course and the discipline (Hora and Oleson, 2017), and study-time management habits (especially distraction while studying) are increasingly seen as areas where instructors can help students improve their performance (Rosen et al., 2013; Patterson, 2017). Having this in-depth knowledge about what students do as they study for their general chemistry exams will give instructors a better understanding of how to instruct general chemistry students in studying effectively for their course.

Research questions

Previous literature has shown that exam performance is vital for the retention of students (Rask, 2010) and that first-year students’ study habits are seldom adequate for college courses (Carrier, 2003; Karpicke et al., 2009; Sinapuelas and Stacy, 2015; Ye et al., 2015; Bunce et al., 2017; Hora and Oleson, 2017; Atieh et al., 2020; Mutambuki et al., 2020; Rowell et al., 2021; Walck-Shannon et al., 2021). Hence, in this study, we examined which self-directed study behaviors first-semester general-chemistry students use when studying for exams and how these behaviors affect exam performance. We addressed two research questions in the GC1 course.

Research Question 1: What types of study behaviors (learning strategies and study-time management habits) do students enrolled in GC1 use when studying for their exams? Do they differ by exam-grade group?

Research Question 2: What is the relationship between study behaviors (learning strategies and study-time management habits) and students' exam performance in GC1, after accounting for academic preparation, self-reported class absences, and self-reported total study time?

Methods

Study setting

University. The study took place at a large (∼35 000 undergraduate students), public, research-intensive university in the Mountain West region of the United States during the fall 2021 semester.
General chemistry 1. The General Chemistry 1 (GC1) course, offered in fall 2021, covered topics such as atomic and molecular structure, chemical reactions, thermochemistry, and gas properties. An accompanying lab is offered as an independent course in which many students typically concurrently enroll; however, data were collected only from the lecture component of GC1. The course typically enrolls approximately 1000 students and is the first of a two-course sequence required for many science, engineering, and pre-health students. In fall 2021, the GC1 course was taught in person, and each week comprised three 50-minute lectures taught by course instructors and one 50-minute discussion session facilitated by graduate teaching assistants (TAs) and assisted by undergraduate TAs and learning assistants (LAs). At this institution, LAs are paid undergraduates who facilitate small-group active-learning activities. LAs have previously taken the GC1 course and receive extensive pedagogy training through a formal, semester-long pedagogy course. During the discussion sessions, students worked in groups to solve instructor-selected problems and additional illustrative problems using student-response systems, and course credit was given based on student participation. Three instructors with chemistry teaching experience ranging from 5 to 25 years (one man and two women) taught the GC1 course sections, worked as an instructor team, and were included in this study. All course instructors, teaching assistants, and learning assistants held multiple weekly in-person and synchronous online office hours, and students could attend any or multiple of these office hours regardless of the section in which they were enrolled. Students from all three sections had the same assessments and were graded following the same rubrics and grading scheme.

Course grades were based on online homework assignments (2 per week, plus completion of surveys and the pre-assessment; 23% of total grade), online quizzes (1 per week, 6 total, with the lowest score dropped; 10% of total grade), student-response-system participation (41 lecture classes with the lowest 3 dropped, 24% of total grade; 14 discussion classes with the lowest 2 dropped, 8% of total grade), an introductory quiz (2% of total grade), and exams (2 midterms plus the final; 33% of total grade, 11% each). Students were given a 19-hour window each Friday to complete that week's quiz, with a 25-minute time limit after opening the assignment. All exams (including the final) were administered on paper and in person. The two midterm exams were one hour each and were given on September 28 and November 2. The final exam, administered during the university's final-exam testing period, was two hours long, noncumulative, and structured and weighted the same as the midterm exams.

Participants. Participants were recruited from GC1 during fall 2021. The GC1 sections comprised ∼39% first-years, ∼34% sophomores, and ∼27% juniors/seniors/second-baccalaureate students. The study was approved by the university's Institutional Review Board, and students were not compensated for participating in the study. However, course points were awarded to all students, regardless of their participation in the study, for completing the related surveys and pre-assessment. Of the 910 students enrolled in the course, 625 consented to participate in the study. The sample size used for each analysis varied because not all consenting students completed all procedures and academic records were not available for all students. Hence, the sample size for each analysis is reported with the respective results. Through chi-square analyses, we found a significant difference between all students enrolled in the course and the consenting students in academic year (p < 0.001) and in the distribution of final course grades (p ≤ 0.001). However, there was no discernible difference in gender between the two groups (p = 0.1681). To investigate the disparity in mean course grades between the consenting students (3.22, B) and the whole class (3.06, B), a t-test was performed. The analysis revealed a statistically significant difference (t = 4.701, df = 1479, p < 0.001), although the observed difference lies within the same letter grade. For a more comprehensive overview of our consenting sample, which may serve as a reference for future studies aiming to make comparisons, see Appendix I, Tables 6 and 7 and Fig. 4.

Variables

Academic preparation. Students' chemistry preparation was evaluated by their performance on a multiple-choice chemistry-content pre-assessment (average 13 ± 5 out of 25 possible points), which was created by experienced GC1 instructors at the university and has been given for over 10 years, but began being used for data analysis in 2020 (Edwards et al., 2021). The pre-assessment, which tested knowledge of topics students may have studied in their high-school chemistry classes, was delivered online as part of the beginning-of-semester survey with a 40-minute time limit; on average, students completed it in approximately 20 minutes. As indicated in the Results section, the pre-assessment predicted students' performance on their course exams, as has been seen in prior studies at this institution (Edwards et al., 2021).
Study behaviors reflection. The exam study-behaviors reflections were given after the two midterm exams. In these reflections, students were asked to report which strategies they used, the percentage of their study time spent on each strategy, and the percent of time they were distracted overall when studying for the exam (the reflection is in Appendix II). The list of study strategies was adapted from a biology study-habits paper (Walck-Shannon et al., 2021) in collaboration with the GC1 instructors. Some students selected “other” (Exam 1: N = 153; Exam 2: N = 125) and wrote a text description; all of these responses could be recoded into existing categories. To ensure that students accurately recalled their study habits, the exercise was made available online for 5 days immediately after each exam. Additionally, only consenting students who completed the survey before the release of their exam grades were included in the statistical analysis. As a validation check that students did not simply select all strategies, we used in our analyses only those strategies for which students also reported percent usage time. That is, if a student selected a strategy but indicated 0% usage in the follow-up reflection question (Q8), that strategy was excluded from their list of strategies used in the analysis.
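As a rough illustration of this validation rule, the sketch below filters selected strategies by their reported usage time. The data frame and column names are hypothetical, not the study's actual variable names, and pandas is only one of many tools that could implement this step (the authors' analyses were performed in Stata).

```python
import pandas as pd

# Hypothetical reflection responses: one row per student, a selection flag
# and a Q8 percent-of-study-time value per strategy (names are illustrative).
df = pd.DataFrame({
    "used_reread_notes": [1, 1, 0],
    "pct_reread_notes":  [40, 0, 0],
    "used_old_exams":    [1, 0, 1],
    "pct_old_exams":     [60, 0, 100],
})

strategies = ["reread_notes", "old_exams"]

# Validation rule from the text: a selected strategy counts in the analysis
# only if the student also reported a nonzero percentage of time using it.
for s in strategies:
    df[f"counted_{s}"] = (df[f"used_{s}"] == 1) & (df[f"pct_{s}"] > 0)

print(df.filter(like="counted_"))
```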
Midterm exam grades. The two midterm-exam scores are represented as the variables Exams 1 and 2 score (%). In fall 2021, students in all GC1 sections took the same exams during the same testing periods and grading followed the same rubric, so exam scores were expressed in the analyses as raw percentages.
Groups 1 and 2. From the sample, two groups were created to compare study habits and exam performance. Group 1(a, b) consisted of students whose scores on each midterm exam (1(a) and 2(b)) were at or above the 75th percentile. Group 2(a, b) comprised students whose scores on each midterm exam (1(a) and 2(b)) were at or below the 25th percentile. Note that the populations of these groups differ for the two exams.
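A minimal sketch of this quartile grouping, assuming simulated scores; the actual thresholds were computed separately for each exam from the real score distributions.

```python
import numpy as np
import pandas as pd

# Simulated Exam 1 scores (illustrative only; N matches the Exam 1 sample).
rng = np.random.default_rng(0)
scores = pd.Series(rng.normal(72, 15, 547).clip(0, 100), name="exam1")

# Group 1a: at or above the 75th percentile; Group 2a: at or below the 25th.
q75, q25 = scores.quantile(0.75), scores.quantile(0.25)
group = np.select(
    [scores >= q75, scores <= q25],
    ["Group 1a", "Group 2a"],
    default="middle 50%",
)
print(pd.Series(group).value_counts())
```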

Statistical analysis

Calculated indices. In this section, we describe the variables that were calculated from the student-reported data in the reflection. Variables taken directly from student inputs, such as class absences or the percentage of study time spent distracted when studying for exams, are not described below.
Total exam study time. The students reported the number of hours they spent studying for each exam in their reflection assignments. This included the hours they studied daily in the week leading up to the exam and any additional hours spent studying more than a week ahead. The total exam study time was calculated by summing all reported study hours.
Number of active strategies used. To determine the number of active strategies used, we first defined which strategies were active. We used previous literature concerning study habits and skills related to student performance in college-level courses (Credé and Kuncel, 2008; Walck-Shannon et al., 2021) as a template to code the already categorized strategies. The authors then reviewed literature about desirable difficulties (Bjork and Bjork, 2011) and study strategies (Dunlosky et al., 2013) to code the learning strategies that were added to our study and were not contained in the previous study by Walck-Shannon et al. (2021). The authors met to discuss the coding of the strategies until agreement was reached. Since students in the review session could have engaged with the material in different ways – either actively (e.g., solved or discussed the selected problems by themselves or with peers) or passively (e.g., waited until the TA solved the selected problems and then copied the solutions or downloaded the review slides after the session) – we did not code this strategy as active. Instead, we coded it as a mixed strategy. After the coding was in place, we summed the number of active strategies that each student reported, which yielded the number of active strategies variable. For higher reporting accuracy, we only counted the active strategies where students estimated the percentage of time they spent using that strategy.
Percentage of study time using active strategies. We asked students which strategies they used, and we asked them to estimate the percentage of their study time they spent using each strategy. The proportion of study time spent on active strategies was calculated by summing the percentages of time spent on each active strategy.
Number of days studied in week leading up to the exam. To assess the consistency of studying, for exams 1 and 2, we calculated the number of days each student reported studying in the week prior to each exam. The calculation involved adding up the number of days in which the student reported studying for more than 0 hours, giving us the total number of days studied in the seven days before each exam.
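The sketch below shows one plausible way to compute these four indices from per-student reflection records. The field names and the active-strategy set are assumptions for illustration; the paper's actual coding of strategies as active, passive, or mixed is described above.

```python
import pandas as pd

# Illustrative per-student records (field names are assumptions, not the
# authors' actual variable names).
df = pd.DataFrame({
    # hours studied on each of the 7 days before the exam, plus earlier hours
    "hours_by_day": [[0, 0, 1, 2, 0, 3, 4], [1, 1, 1, 1, 1, 1, 1]],
    "hours_week_before": [2, 0],
    # percent of study time per strategy, keyed by strategy name
    "pct_by_strategy": [
        {"explained_concepts": 50, "reread_notes": 50},
        {"answered_old_exams": 30, "rewatched_videos": 70},
    ],
})

ACTIVE = {"explained_concepts", "answered_old_exams"}  # assumed coding

# Total exam study time: sum of all reported hours.
df["total_hours"] = df["hours_by_day"].apply(sum) + df["hours_week_before"]

# Days studied in the week before the exam: days with > 0 reported hours.
df["days_studied"] = df["hours_by_day"].apply(lambda h: sum(x > 0 for x in h))

# Number of active strategies (only those with a reported nonzero % time)
# and percentage of study time spent on active strategies.
df["n_active"] = df["pct_by_strategy"].apply(
    lambda d: sum(1 for s, p in d.items() if s in ACTIVE and p > 0))
df["pct_active"] = df["pct_by_strategy"].apply(
    lambda d: sum(p for s, p in d.items() if s in ACTIVE))

print(df[["total_hours", "days_studied", "n_active", "pct_active"]])
```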

Multiple linear regression

Multiple linear regression (MLR) was used to examine the impact of student study behaviors on exam scores. The absolute or standardized effect sizes were measured by regression coefficients obtained using the ordinary least squares (OLS) method. All data analyses were performed using the base version of Stata Statistical Software, Release 16 (StataCorp, 2019). The variables of academic preparation (chemistry pre-assessment score) and self-reported class days missed were included as predictors in all models.

The significance level was set at α = 0.05 for this study; hence, the null hypothesis states that the regression coefficient is equal to zero at a 95% confidence level. “Statistically insignificant” in this paper refers to a lack of evidence from the sample data to support that the regression coefficient is different from 0 in the population under the specified model at the α = 0.05 level; this should not be taken as a statement of “no effect”. Our models showed evidence of heteroscedasticity (i.e., nonconstant residual variances) and skewness, based on the Breusch–Pagan test (Breusch and Pagan, 1979) and the Cameron and Trivedi decomposition of the White test for heteroscedasticity and non-normality (White, 1980; Cameron and Trivedi, 1990) (Appendix III, Tables 8–10). To address the heteroscedasticity, the 95% confidence intervals for the reported regression parameters were calculated using the Huber–White sandwich heteroscedasticity-robust standard-error correction (White, 1980), which estimates the standard errors of the regression coefficients from the individual residuals rather than assuming a constant residual variance. Regarding the non-normality of residuals, as noted by Tranmer and Elliot (2008), MLR is a robust statistical method that can tolerate moderate deviations from normality. Fig. 5–7 in Appendix III show a slight deviation from normality across all our models. Due to the minimal impact of these deviations, no transformations were applied to the independent variables.
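For readers who want to reproduce this general workflow, the sketch below shows an analogous OLS fit with a Breusch–Pagan diagnostic and Huber–White (HC) robust standard errors in Python's statsmodels. The data are simulated and the variable names are assumptions; the paper's analyses were run in Stata, so this is a translation of the approach, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan

# Simulated data loosely shaped like the study's variables (illustrative).
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "pre_assessment": rng.normal(13, 5, n),
    "days_missed": rng.poisson(1.5, n),
    "pct_active": rng.uniform(0, 100, n),
    "pct_distracted": rng.uniform(0, 80, n),
})
df["exam_score"] = (70 + 1.9 * (df.pre_assessment - 13) - 1.2 * df.days_missed
                    + 0.07 * df.pct_active - 0.10 * df.pct_distracted
                    + rng.normal(0, 10, n))

model = smf.ols(
    "exam_score ~ pre_assessment + days_missed + pct_active + pct_distracted",
    data=df,
)

# Breusch-Pagan test for heteroscedasticity on the plain OLS residuals.
ols_fit = model.fit()
bp_stat, bp_p, _, _ = het_breuschpagan(ols_fit.resid, ols_fit.model.exog)
print(f"Breusch-Pagan p = {bp_p:.3g}")

# Refit with Huber-White (sandwich) heteroscedasticity-robust standard errors.
robust_fit = model.fit(cov_type="HC1")
print(robust_fit.summary())
```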

Fisher's exact test

Fisher's exact test was used to compare the learning-strategy proportions between Groups 1 and 2. This test is used for categorical data, specifically for a 2 × 2 table in which one of the cell frequencies is 5 or less. The null hypothesis of this test is that there is no association between the rows and columns of the 2 × 2 table, such that the probability of a subject being in a particular row (outcome) is not influenced by being in a particular column (group) (Upton, 1992). In other words, the probability of having a certain outcome is not influenced by being in a certain group. In our study, we wanted to determine whether there is an association between the learning strategies chosen to study for the exams and being in Group 1 or 2. Fisher's test is exact because it calculates the probability of obtaining such a combination of cell frequencies directly, rather than approximating it from a distribution (Upton, 1992). Significance was evaluated at α = 0.05, meaning we would reject the null hypothesis if the probability was lower than 0.05; rejection would signify that being in Group 1 or 2 is associated with the learning strategies students use when studying for the exams. To account for multiple comparisons, Bonferroni corrections were applied to significant Fisher tests (Wright, 1992).

To better understand how much influence the choice of learning strategies has on being in a certain group, we conducted an effect-size analysis using the odds-ratio method. For this method, the effect-size magnitudes are: large (odds ratio of 4.0 or higher), medium (2.5 to below 4.0), and small (1.5 to below 2.5) (Maher et al., 2013). For ease of interpretation, we chose the group with the higher proportion of students using a learning strategy as the numerator in the odds-ratio calculation, odds ratio = [p₁/(1 − p₁)]/[p₂/(1 − p₂)], where p₁ and p₂ are the proportions of students using the strategy in the higher- and lower-proportion groups, respectively. This ensures that the odds ratio is equal to or greater than 1 in all cases.
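A minimal sketch of this test and effect-size calculation using scipy; the 2 × 2 counts and the number of comparisons are illustrative assumptions, not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one strategy: rows = used / did not use,
# columns = Group 1 (top quartile) / Group 2 (bottom quartile).
used_g1, not_used_g1 = 56, 58
used_g2, not_used_g2 = 34, 73

table = [[used_g1, used_g2],
         [not_used_g1, not_used_g2]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")

# Orient the ratio so it is >= 1, as in the paper: put the group with the
# higher usage proportion in the numerator.
if odds_ratio < 1:
    odds_ratio = 1 / odds_ratio

# Bonferroni correction: divide alpha by the number of strategies tested
# (n_tests here is an assumed count, for illustration only).
n_tests = 6
alpha_adjusted = 0.05 / n_tests
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}, "
      f"significant after correction: {p_value < alpha_adjusted}")
```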

Results

The exam learning strategies that students selected, the percentage of time spent using each strategy, and the level of distraction they reported while studying are described below. We determined the frequencies with which certain study variables were reported and correlated those study variables with Exam 1 and Exam 2 scores. For all performance analyses described in the Results section, we first controlled for a base model.

Base model

We used a base model to control potential confounding variables that might impact exam performance by incorporating variables known to influence performance based on previous research. First, we included the chemistry pre-assessment score as an indicator of academic preparation, based on a meta-analysis (Westrick et al., 2015) and our previous study (Edwards et al., 2021). Second, we included the number of self-reported class days missed for each exam because prior studies showed correlations with exam performance (Gump, 2005; Lin and Chen, 2006; Walck-Shannon et al., 2021). Last, we controlled for the total number of hours spent studying for the exam (Credé and Kuncel, 2008; Walck-Shannon et al., 2021), although it was not a significant predictor of GC1 performance in our study (Appendix II, Table 6), consistent with findings in an organic chemistry study (Szu et al., 2011).

In the base model, the chemistry pre-assessment and the number of class days missed in an exam period reached significance at α = 0.05. The chemistry pre-assessment variable predicted both exams (Exam 1: b̂ᵢ = 1.9 ± 0.1; Exam 2: b̂ᵢ = 1.9 ± 0.2); i.e., for both exams, the base model predicts that a 1.0-point increase in the pre-assessment score results in a 1.9% increase in the corresponding exam score. The class-days-missed variable also predicted both exams (Exam 1: b̂ᵢ = −1.5 ± 0.4; Exam 2: b̂ᵢ = −1.0 ± 0.4); that is, the base model predicts that each class day missed in the exam period results in a 1.5% decrease in the Exam 1 score and a 1.0% decrease in the Exam 2 score. The final base model (see Table 1) was a significant predictor for both exams in all analyses and accounted for a substantial proportion of variance in both exams (Exam 1: R² = 0.3035, F(2, 538) = 124.88, p < 0.0001; Exam 2: R² = 0.2891, F(2, 470) = 98.31, p < 0.0001). For each analysis below, we included all consenting individuals who responded to the relevant reflection questions for that model; thus, the sample size and variable values varied slightly across analyses. For means and standard deviations (SDs) of all continuous variables in this study, see Appendix III, Table 11. In summary, our base model accounted for a substantial proportion (Exam 1: 30%; Exam 2: 29%) of the variance, which allowed us to interpret the relationship between particular study habits and exam performance more directly.

Table 1 Study behaviors multiple regression model (base model)

Predictor | Exam 1 (%): b̂ᵢ ± SE^a,b (P^c) | Exam 2 (%): b̂ᵢ ± SE^a,b (P^c)
Chemistry pre-assessment^d | 1.9 ± 0.1 (<0.001) | 1.9 ± 0.2 (<0.001)
Class days missed in exam period | −1.5 ± 0.4 (<0.001) | −1.0 ± 0.4 (0.007)
Intercept | 70.7 ± 0.9 (<0.001) | 75 ± 1 (<0.001)
Model fit | R² = 0.3035 (F(2, 538) = 124.88, p < 0.001) | R² = 0.2891 (F(2, 470) = 98.31, p < 0.001)

^a b̂ᵢ = mean unstandardized estimated regression coefficient of the ith predictor. ^b SE = standard error of the coefficient, computed with ordinary least squares. ^c Two-sided significance level. ^d Numerical variables were centered at their mean. Ntotal(Exam 1) = 541 and Ntotal(Exam 2) = 470.
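As a worked example of reading Table 1, the sketch below turns the Exam 1 coefficients into a predicted score. Both predictors were centered at their means (footnote d), so the inputs are deviations from the sample means; the coefficients are copied from the table, and the helper function is ours, for illustration only.

```python
# Exam 1 column of Table 1 (coefficients copied from the table).
intercept = 70.7   # predicted Exam 1 score (%) for an average student
b_pre     = 1.9    # change in score per pre-assessment point above the mean
b_missed  = -1.5   # change in score per class day missed above the mean

def predict_exam1(pre_dev: float, missed_dev: float) -> float:
    """Predicted Exam 1 score; inputs are deviations from sample means."""
    return intercept + b_pre * pre_dev + b_missed * missed_dev

# A student 2 points above the mean pre-assessment who missed one more
# class day than average: 70.7 + 1.9*2 - 1.5*1 = 73.0
print(predict_exam1(pre_dev=2, missed_dev=1))
```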


What were the top four learning strategies students used to study for GC1 Exams 1 and 2?

The frequencies with which specific learning strategies were employed are reported in Table 2. The most prevalent strategies students used to study for the exams (in order of prevalence) were re-working graded homework problems, re-reading lecture slides or class notes, explaining concepts to themselves or others, and attending the optional review session (led by TAs). Additionally, students used, on average, two active strategies when studying for each of Exams 1 and 2 (Fig. 1). The four most commonly reported study strategies were the same for both exams.
Table 2 Learning strategies used in Exam 1 (N = 547) and Exam 2 (N = 472)
Note: the top four strategies students reported are the first four rows.

Study strategy | Type | Exam 1: N (%) | Exam 2: N (%)
Re-worked graded homework problems | Active | 371 (67.8) | 301 (63.7)
Re-read lecture slides or class notes | Passive | 331 (60.5) | 299 (63.3)
Explained concepts to myself or others | Active | 256 (46.8) | 222 (47.0)
Attended the optional review session | Mixed | 234 (42.8) | 212 (44.9)
Paraphrased or outlined class notes (includes creating a study guide or writing) | Active | 197 (36.0) | 122 (25.8)
Answered old exam questions | Active | 169 (30.9) | 85 (18.0)
Re-worked online quizzes | Active | 168 (30.7) | 131 (27.8)
Re-read textbook | Passive | 146 (26.7) | 127 (26.9)
Re-watched lecture videos | Passive | 142 (25.9) | 163 (34.5)
Made my own diagrams or comparison tables from lecture notes | Active | 94 (17.2) | 94 (19.9)
Re-wrote your class notes word for word | Passive | 63 (11.5) | 41 (8.7)



Fig. 1 Distributions for the number of active strategies that each student used for Exam 1 (N = 547) and Exam 2 (N = 472).

What learning strategies affected student performance on exams 1 and 2?

We examined whether the strategies students reported using were related to exam performance. For these exploratory regression analyses, we added whether a student used a specific strategy (0 or 1) into the model after controlling for the exam-specific base model reported in Table 1 (which remained significant in these later regressions). The model presented in Table 3 was exploratory, as we wanted to understand the predictive power of each learning strategy while controlling for the rest of the learning strategies. To assess multicollinearity among the learning strategies, correlation tables were generated; they showed that the correlations between strategies were low (Appendix III, Tables 13 and 14). As seen in Table 3, after accounting for preparation and class days missed, we found that, on average, the following strategies were distinctly predictive for Exam 1 only: re-wrote your class notes word for word ((−8.8 ± 2.4)%) and re-worked online quizzes ((3.5 ± 1.6)%). The following strategy was distinctly predictive for Exam 2 only: attended the optional review session ((6.7 ± 1.6)%). Explained concepts to myself or others (Exam 1: (5.1 ± 1.5)%; Exam 2: (3.0 ± 1.5)%) was distinctly predictive for both exams. In other words, a student who used the “explained concepts to myself or others” strategy is predicted to score 5.1% (Exam 1) or 3.0% (Exam 2) above a student who did not use this strategy. In addition, the portions of exam-score variance explained by these four strategies were non-overlapping. For example, if a student reported using the two statistically significant strategies on Exam 2 (i.e., attending the optional review session and explaining concepts to myself or others), the model predicts a net score 9.7% above that of a student who used neither strategy. This regression, which includes the base model plus the learning strategies, accounted for a substantial proportion (Exam 1: 36%; Exam 2: 35%) of the variance in exam performance. These results also suggested that active strategies tended to be positively related to exam performance, and passive strategies tended to be negatively related to exam performance.
Table 3 Exploratory regression model relating specific study-strategy use to performance on Exam 1 (N = 541) and Exam 2 (N = 470) when controlling for preparation and class absences (i.e., base model)

Study strategy | Type | Exam 1 (%): b̂ᵢ (SE^b, p^c) | Exam 2 (%): b̂ᵢ (SE^b, p^c)
Re-worked graded homework problems | Active | −0.2 (1.6, 0.899) | 0.3 (1.7, 0.863)
Re-read lecture slides or class notes | Passive | −1.4 (1.6, 0.381) | −2.4 (1.6, 0.133)
Explained concepts to myself or others | Active | 5.1 (1.5, 0.001) | 3.0 (1.5, 0.048)
Attended the optional review session | Mixed | 2.7 (1.6, 0.090) | 6.7 (1.6, <0.001)
Paraphrased or outlined class notes (includes creating a study guide or writing) | Active | −2.2 (1.6, 0.160) | −1.1 (1.7, 0.540)
Answered old exam questions | Active | 2.0 (1.6, 0.226) | −1.1 (2.1, 0.596)
Re-worked online quizzes | Active | 3.5 (1.6, 0.031) | 1.4 (1.7, 0.393)
Re-read textbook | Passive | −1.5 (1.7, 0.377) | −2.7 (1.9, 0.143)
Re-watched lecture videos | Passive | −0.1 (1.7, 0.949) | −1.7 (1.7, 0.302)
Made my own diagrams or comparison tables from lecture notes | Active | −0.7 (2.0, 0.730) | 3.2 (1.9, 0.095)
Re-wrote your class notes word for word | Passive | −8.8 (2.4, <0.001) | −4.4 (2.7, 0.098)
Base model | | Chemistry assessment (2.0 ± 0.1, p < 0.001), class days missed (−1.2 ± 0.4, p = 0.002) | Chemistry assessment (2.0 ± 0.1, p < 0.001), class days missed (−1.2 ± 0.4, p = 0.002)
Model fit | | R² = 0.3552 (F(13, 526) = 24.28, p < 0.001) | R² = 0.3477 (F(13, 456) = 19.55, p < 0.001)

^a b̂ᵢ = mean unstandardized estimated regression coefficient of the ith predictor. ^b SE = standard error of the coefficient, computed with ordinary least squares. ^c Two-sided significance level. Ntotal(Exam 1) = 540 and Ntotal(Exam 2) = 470.


After conducting the exploratory regression (see Table 3), to mitigate the impact of multicollinearity in our regression models, multiple linear regressions were performed considering only the strategies that demonstrated significance in the initial exploratory regression analyses for Exams 1 and 2, plus the base model. For both exams, the strategies that were significant in the exploratory regression analyses remained significant in this second set of analyses containing the smaller number of strategies (see Appendix III, Table 15), with small changes in the coefficients and p-values [for Exam 1 only: re-wrote your class notes word for word ((−8.7 ± 2.4)%) and re-worked online quizzes ((4.0 ± 1.5)%); for Exam 2 only: attended the optional review session ((7.0 ± 1.6)%); for both exams: explained concepts to myself or others (Exam 1: (4.8 ± 1.5)%; Exam 2: (3.5 ± 1.5)%)].
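The sketch below illustrates the structure of this kind of strategy-dummy regression: each strategy enters as a 0/1 indicator on top of the base-model controls, with robust standard errors. The data are simulated so the fitted coefficients echo the reduced Exam 1 model only by construction; variable names are illustrative, and the authors' analyses were run in Stata, not statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data shaped like the reduced Exam 1 model (illustrative only).
rng = np.random.default_rng(1)
n = 540
df = pd.DataFrame({
    "pre_assessment": rng.normal(0, 5, n),        # centered at its mean
    "days_missed": rng.poisson(1.5, n),
    "explained_concepts": rng.integers(0, 2, n),  # used strategy? 0/1
    "reworked_quizzes": rng.integers(0, 2, n),
    "rewrote_notes": rng.integers(0, 2, n),
})
df["exam1"] = (70 + 2.0 * df.pre_assessment - 1.2 * df.days_missed
               + 4.8 * df.explained_concepts + 4.0 * df.reworked_quizzes
               - 8.7 * df.rewrote_notes + rng.normal(0, 10, n))

# Base-model controls plus the significant strategy dummies, with
# Huber-White robust standard errors as in the paper's regressions.
fit = smf.ols(
    "exam1 ~ pre_assessment + days_missed"
    " + explained_concepts + reworked_quizzes + rewrote_notes",
    data=df,
).fit(cov_type="HC1")
print(fit.params.round(2))
```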

Did the percentage of time spent using active strategies positively affect exam score?

To further understand how active strategies related to performance, we investigated the percentage of study time that students spent using active strategies. To calculate this variable, we summed the student-reported percentages of time spent on the strategies we coded as active. On average, students spent about half of their study time using active strategies for Exam 1 (M = 57%, SD = 38%) and Exam 2 (M = 50%, SD = 30%), though values varied from 0 to 100% (Fig. 2). Importantly, as seen in Table 4, students who spent a larger proportion of their study time on active strategies tended to perform better on both exams. When controlling for academic preparation and class days missed, the model (Exam 1: b̂ᵢ = 0.07 ± 0.02; Exam 2: b̂ᵢ = 0.07 ± 0.03) predicted that students who spend all of their study time on active strategies would score 7% higher on each exam than students who spend none of their study time on active strategies (Table 4). The total variance in performance explained by this regression, which incorporates both the base model and the time-management variables, was 34% for Exam 1 and 31% for Exam 2.
Fig. 2 Distributions for the percentage of time that students devoted to active study for Exam 1 (N = 547) and Exam 2 (N = 472).
Table 4 Time-management multiple regression model
Predictor Exam 1 (%) Exam 2 (%)
[b with combining circumflex] i ± SEab P [b with combining circumflex] i ± SEab P
a [b with combining circumflex] i = mean unstandardized estimated regression coefficient of the ith predictor. b image file: d3rp00207a-t4.tif. Standard errors computed with ordinary least-squares. c Two-sided significance level. d Numerical variables were centered at their mean. Ntotal (Exam 1) = 537 Ntotal and (Exam 2) = 468.
Chemistry pre-assessmentd 1.8 ± 0.1 <0.001 1.8 ± 0.2 <0.001
Class days missed −1.2 ± 0.4 0.004 −1.0 ± 0.4 0.008
% time distracted studying for exams −0.10 ± 0.03 0.005 −0.10 ± 0.04 0.019
% exam study time using active strategies 0.07 ± 0.02 0.003 0.07 ± 0.03 0.004
Number of days studied for exams −1.1 ± 0.3 <0.001 0.4 ± 0.3 0.221
Intercept 75 ± 2 <0.001 76 ± 4 <0.001
Model fit R2 = 0.3352 (F(5, 531) = 55.87, p < 0.001) R2 = 0.3115 (F(5, 462) = 42.83, p < 0.001)


Did the percentage of time being distracted while studying correlate with exam performance?

Another factor that contextualizes the study strategies is how focused students were during their exam study sessions. In the study-behaviors reflections, we asked students what percent of the time they were distracted while studying for the exam. On average, students reported being distracted 24–29% of their Exam 1 and Exam 2 study time (Exam 1: M = 29%, SD = 21; Exam 2: M = 24%, SD = 20) (Fig. 3). When holding academic preparation and class absences equal, the model predicted that students who were distracted 50% of their study time would score approximately 5 percent lower on the exams (Exam 1: (−0.10 ± 0.03)% per percentage point; Exam 2: (−0.10 ± 0.04)% per percentage point) than students who were distracted 0% of their study time (Table 4). These results suggest that not only was it common for students to be distracted while studying, but their distraction was also negatively related to exam performance.
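The 5-point figure follows directly from the reported coefficient; as a worked check using the Exam 1 value from Table 4:

\[
\Delta\widehat{\text{score}} = \hat{b}_{\text{distracted}}\,\Delta x = (-0.10\ \text{points per }\%) \times (50\% - 0\%) = -5\ \text{points}.
\]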
Fig. 3 Distributions for the percent of time students reported being distracted while studying for Exam 1 (N = 547) and Exam 2 (N = 472).

Is there a difference in study behaviors between Group 1(a, b) (75th percentile) and Group 2(a, b) (25th percentile)?

To further understand which study behaviors helped students perform better in a general chemistry course, we looked in more detail at the exam study behaviors of students in the 75th percentile of exam scores (denoted Group 1(a, b)) and students in the 25th percentile of exam scores (denoted Group 2(a, b)), and examined whether any study behaviors differed significantly between the two groups. For Exam 1, both groups (Groups 1(a) and 2(a)) reported using the same top four strategies: re-working graded homework problems, re-reading lecture slides or class notes, explaining concepts to themselves and others, and attending the optional review session. For Exam 2, Group 1(b) reported using the same top four strategies that Group 1(a) used for Exam 1. While the top two strategies were again the same for both groups (re-working graded homework problems and re-reading lecture slides or class notes), Group 2(b) students replaced attending the optional review session (a mixed strategy) with re-watching lecture videos (a passive strategy) in their top four strategies.

We conducted Fisher's exact tests to determine whether choosing certain learning strategies during exam study was related to students being in Group 1(a, b) or Group 2(a, b). As shown in Table 5, we examined the strategies students reported using the most on each exam, the strategies that either significantly correlated or trended with performance in the course, and strategies that qualitatively showed a large difference in proportions between the groups. (These strategies were included to ensure that no potential differences in usage between Group 1 and Group 2 students were overlooked.) For Exam 1, the Fisher's exact tests indicated significant associations for the following strategies: explained concepts to myself or others (G1(a): 50.7%, G2(a): 37.7%; p = 0.038, odds ratio = 1.70, small effect size) and re-wrote your class notes word for word (G1(a): 7.7%, G2(a): 18.5%; p = 0.011, odds ratio = 2.67, medium effect size). However, under a Bonferroni correction with an adjusted alpha of 0.008, no associations remained significant for Exam 1. For Exam 2, the Fisher's exact tests indicated significant associations for attending the optional review session (G1(b): 49.1%, G2(b): 31.8%; p = 0.010, odds ratio = 2.07, small effect size), re-watched lecture videos (G1(b): 26.3%, G2(b): 44.9%; p = 0.005, odds ratio = 2.28, small effect size), and re-wrote your class notes word for word (G1(b): 4.4%, G2(b): 13.1%; p = 0.029, odds ratio = 3.28, medium effect size). With a Bonferroni correction (adjusted alpha of 0.007), only re-watching lecture videos remained significantly associated with group membership; that is, a student who chose to re-watch lecture videos had higher odds of being in Group 2(b) (25th percentile). Because Bonferroni is considered a conservative correction (Rothman, 1990; Perneger, 1998; Nakagawa, 2004), we report the Fisher results both with and without the correction.

From Table 5, we observed qualitatively that a higher proportion of Group 1(a, b) students (75th percentile) used active strategies to study for the exams, while a higher proportion of Group 2(a, b) students (25th percentile) used passive strategies. We also observed that a higher proportion of Group 1(a, b) students used strategies that were positively correlated with performance in the course, while a higher proportion of Group 2(a, b) students used strategies that were negatively correlated with performance.
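As an illustration of the test itself, the sketch below reruns Fisher's exact test for the re-watched-lecture-videos strategy on Exam 2, using the counts from Table 5; only the scipy call and the Bonferroni arithmetic are shown, and the reproduced odds ratio and p-value should be read as approximate.

```python
# Fisher's exact test for "re-watched lecture videos" on Exam 2, built from
# the Table 5 counts (Group 1: 30 of 114 used it; Group 2: 48 of 107).
from scipy.stats import fisher_exact

table = [[30, 114 - 30],   # Group 1 (75th percentile): used / did not use
         [48, 107 - 48]]   # Group 2 (25th percentile): used / did not use
odds_ratio, p_value = fisher_exact(table)
print(odds_ratio)       # ~0.44; its reciprocal ~2.28 matches the reported OR
print(p_value)          # ~0.005, as reported in the text

alpha_adjusted = 0.05 / 7        # Bonferroni over 7 Exam 2 strategies (~0.007)
print(p_value < alpha_adjusted)  # True: the association survives correction
```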
Table 5 Proportion of students in Group 1 and 2 that used the following strategies to study for exams 1 and 2
Exam 1 Group 1: 75th percentile (N = 142, (a)) Group 2: 25th percentile (N = 130, (a)) Effect size (odds ratio)a
Study strategy Type N % N %
a Note: *small effect size, **medium effect size, ***large effect size.
Re-worked graded homework problems Active 96 67.6 81 62.3
Re-read lecture slides or class notes Passive 76 53.5 80 61.5
Explained concepts to myself or others Active 72 50.7 49 37.7 1.70*
Attended the optional review session Mixed 56 39.4 57 43.8
Re-worked online quizzes Active 48 33.8 33 25.4
Re-wrote your class notes word for word Passive 11 7.7 24 18.5 2.67**
Exam 2 (N = 114, (b)) (N = 107, (b))
Re-worked graded homework problems Active 71 62.3 67 62.6
Re-read lecture slides or class notes Passive 70 61.4 72 67.3
Explained concepts to myself or others Active 51 44.7 40 37.3
Attended the optional review session Mixed 56 49.1 34 31.8 2.07*
Re-watched lecture videos Passive 30 26.3 48 44.9 2.28*
Re-wrote your class notes word for word Passive 5 4.4 14 13.1 3.28**
Made my own diagrams or comparison tables from lecture notes Active 22 19.3 19 17.8


Because we were also interested in study-time management behavior, we conducted independent-samples t-tests at a 95% confidence level to examine the difference between Group 1(a, b) and Group 2(a, b) in the percentage of study time students reported using active strategies for exams 1 and 2. After a Bonferroni correction, the adjusted alpha is 0.013. For Exam 1, the t-test revealed a significant difference between the groups, t(268) = −3.45, p = 0.0006, with a small effect size (Hedges' g = 0.4). The magnitude ranges for Hedges' g are: large effect (0.8 or higher), medium effect (0.5 to 0.8), and small effect (0.2 to 0.5) (Maher et al., 2013). On average for Exam 1, Group 1(a) students spent about 13% more of their study time using active strategies. For Exam 2, the t-test again revealed a significant difference between the groups, t(218) = 4.2, p < 0.001, Hedges' g = 0.6; that is, Group 1(b) students spent approximately 17% more of their study time using active strategies, on average. In addition, descriptively (see Appendix III, Fig. 8 and 9), Group 1(a, b) had a higher proportion of students who spent more than 50% of their study time using active strategies, and Group 2(a, b) had a higher proportion of students who spent less than 50% of their study time using active strategies.
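A minimal sketch of this comparison is given below, using synthetic data in place of the real per-student percentages (the group means and sizes only loosely echo the reported values); the Hedges' g helper implements the standard small-sample correction.

```python
# Independent-samples t-test with Hedges' g, mirroring the group comparison
# of % study time spent on active strategies. The data here are synthetic
# stand-ins whose means and group sizes loosely echo the reported values.
import numpy as np
from scipy.stats import ttest_ind

def hedges_g(x, y):
    """Pooled-SD standardized mean difference with Hedges' small-sample
    bias correction J = 1 - 3/(4*df - 1), df = n_x + n_y - 2."""
    nx, ny = len(x), len(y)
    df = nx + ny - 2
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1)
                         + (ny - 1) * np.var(y, ddof=1)) / df)
    d = (np.mean(x) - np.mean(y)) / pooled_sd
    return d * (1 - 3 / (4 * df - 1))

rng = np.random.default_rng(0)
group1 = rng.normal(60, 30, size=142).clip(0, 100)  # 75th-percentile group
group2 = rng.normal(47, 30, size=130).clip(0, 100)  # 25th-percentile group

t_stat, p_val = ttest_ind(group1, group2)  # Student's t, equal variances
print(t_stat, p_val, hedges_g(group1, group2))
```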

Additionally, we conducted independent-samples t-tests at a 95% confidence level to examine the difference between Groups 1(a, b) and 2(a, b) in the percentage of time students reported being distracted when studying for exams 1 and 2. After a Bonferroni correction, the adjusted alpha is 0.0125. For Exam 1, the t-test revealed a significant difference between the groups, t(270) = −2.75, p = 0.0065, with a small effect size (Hedges' g = 0.3); Group 2(a) students reported being distracted 7.7% more of their study time than Group 1(a) students, on average. For Exam 2, the t-test again revealed a significant difference between the groups, t(219) = −4.29, p < 0.001, Hedges' g = 0.6; Group 2(b) students reported being distracted 11% more of their study time than Group 1(b) students, on average. The distribution histograms are found in Appendix III, Fig. 10 and 11. Hence, students in Groups 1(a, b) and 2(a, b) had different study-time management behaviors, with students in Group 1(a, b) (75th percentile) spending more of their study time using active strategies and being less distracted during their exam study time, for both exams.

Discussion

We investigated the study behaviors employed by students when studying for exams in a first-semester general chemistry course (GC1) and how these behaviors related to their exam performance. We found that students utilized both active and passive learning strategies, and that the four most commonly used strategies for studying for exams 1 and 2 were the same. Although the instructors suggested learning strategies for studying on the course Canvas page, students were not explicitly taught to modify their approach, so it is unlikely that students would change their exam learning strategies during the semester. When academic preparation and self-reported class attendance were controlled, we observed that students who spent a higher percentage of their exam-preparation study time using active strategies achieved higher exam scores. Additionally, certain learning strategies were linked to better performance on GC1 exams. Finally, the study revealed that distraction had a negative impact on exam performance, with higher levels of distraction being associated with lower scores.

What types of study behaviors do students enrolled in a GC1 course have when studying for their exams?

In our study, students reported using two active strategies to study for exams, on average. Of the learning strategies students reported using, the four most often used were: re-working graded homework problems (65%), re-reading lecture slides or class notes (62%), explaining concepts to themselves and others (47%), and attending the optional review session (43%). All other strategies were reported as being used by less than 30% of the students. Generally, undergraduate students report that re-reading class material is their main learning strategy (Karpicke et al., 2009; Hora and Oleson, 2017). In more course-specific studies (psychology and biology), approximately 90% of undergraduate students use this passive strategy (re-reading class material) when studying for exams (Rowell et al., 2021; Walck-Shannon et al., 2021). However, the class content in those courses is more textual than in chemistry, which is based more on solving mathematical or graphical problems, so we might expect students to engage more with problem-solving strategies (e.g., re-working homework problems) in chemistry.

Similar to the qualitative General Chemistry 1 study by Muteti et al. (2021), re-reading slides (or other passive strategies) was one of the main strategies used; e.g., in their open-ended prompts before the metacognitive intervention, 40% of the students described using strategies coded as passive (or lower-order study strategies), whereas 62% of students in our study selected re-reading lecture slides or class notes. In addition, 65% of students in our study reported re-working graded homework problems (an active strategy), whereas in Muteti et al., only 25% of the students' responses were coded as active (or higher-order study strategies) before the metacognitive intervention. We hypothesize that this higher usage of active strategies may be because students were highly encouraged by the course instructors to study using problems, and because students had multiple opportunities to practice problems (individually or in groups) during the lecture and discussion sessions. It might also be that students in the Muteti et al. study did not write down all the study strategies they used, so the percent of students using active strategies might be underreported.

In terms of study-time management habits, students in our study reported spending approximately half of their study time using active strategies and being distracted 26% of the time when studying for exams, on average. This percent of exam-study time using active strategies is very similar to what was seen in an introductory biology course (Walck-Shannon et al., 2021). While the reported percent of time being distracted during exam studying was also similar to the introductory biology course, other studies report a much higher percent of time students are distracted during studying (e.g., Rosen et al., 2013; May and Elder, 2018). Hence, without study-strategy interventions, the type of material being studied, which can differ by discipline and course, affects the strategies students use and the percent of study time they dedicate to active-study techniques when studying for exams. In addition, independent of discipline, students report being distracted for at least a quarter of their exam-study time.

Study behaviors that relate to exam performance in the GC1 course

In addition to finding specific strategies to be unique predictors of performance in GC1, we found the proportion of active study time to be predictive of performance after accounting for academic preparation and class days missed. We also found that spending a higher percentage of study time using active strategies is a better indicator of performance than the total number of active strategies used; for example, students could report using several active strategies to study for exams but spend only 20% of their time on them. When holding academic preparation and class days missed equal, students who spent most of their exam-study time using active strategies performed better than those who spent less time using active strategies. This statement is supported by the observation that Group 1 students (75th percentile or higher in exam grades) spent a higher percentage of their exam-study time engaging with active strategies. Walck-Shannon et al. (2021) found similar results, where a higher proportion of exam-study time spent on active strategies correlated with higher exam performance in an introductory biology course. While not directly examining the percent of study time spent using active strategies, several studies in general chemistry have also shown a relationship between students' use of deep-learning approaches or higher-order study strategies and exam performance or course grade. For example, Sinapuelas and Stacy (2015) showed that students' learning-approach level correlated with their exam performance; i.e., students using higher levels (deeper learning approaches) had higher exam scores. Similarly, Bunce et al. (2017) showed that higher-performing (A/B) students used more deep-learning approaches than lower-performing (D/F) students. In addition, Muteti et al. (2021) showed descriptively that A/B students used more higher-order than lower-order study strategies. Hence, in addition to using active strategies to study for the course, students should spend most of their study time on these active strategies.

Students reported using various self-directed study behaviors to prepare for GC1 exams, but not all of these behaviors were predictive of their performance in the course. From our analyses (see Table 3), explaining concepts to myself and others, re-working online quizzes, and attending the optional review sessions were positive predictors of performance. Explaining concepts to oneself and others enables the student to create multiple pathways from which they can recall the information learned (Dunlosky et al., 2013). Self-testing, such as re-working homework without the answers provided, enhances recall by requiring students to retrieve previously learned information from memory to solve problems (Dunlosky et al., 2013). These strategies encourage students to engage in the type of studying that promotes long-term retention and transfer of knowledge (Bjork, 1994; Roediger III and Karpicke, 2006; McDaniel et al., 2007). To conduct a more comprehensive analysis of the relationship between study behaviors and course performance, we explored the learning strategies of students who scored in the 75th percentile or the 25th percentile on the exams. Our findings revealed that students scoring in the 75th percentile or higher were more likely to employ study strategies positively associated with exam grades (i.e., active strategies) than students in the 25th-percentile group. Conversely, students in the 25th-percentile group were twice as likely to utilize passive strategies that exhibited a negative correlation with exam performance, such as transcribing class notes verbatim. This finding matches previous general chemistry studies: for example, Bunce et al. (2017) found that students with A/B letter grades tend to use deeper-level learning approaches, and the qualitative study by Muteti et al. (2021) found that students with A/B grades described using more higher-order study strategies.

Attending the optional review session, even though it was coded as a mixed learning strategy, was positively correlated with performance. One possible reason for the observed effect is that students attending the optional review session are more motivated to do well in the course than students who do not attend. However, we also hypothesize that this effect could arise because students may use this session for multiple functions. For example, students could use this time to clarify doubts and misconceptions about chemistry concepts or mistakes in their problem-solving of class practice problems. They could also engage in group work (explaining concepts to themselves and others) when solving the review problems, in addition to the act of solving review problems during the session (self-testing). Furthermore, the session could help students organize and determine the important topics on which to focus. Hence, if instructors (or peer leaders/teaching assistants) give optional review sessions, they should strive to include activities that model the study strategies shown to improve performance and learning.

While used by only 11% of the students in this study, re-writing your notes word for word had a negative relationship with exam performance, as one might expect. First, literature indicates that this learning strategy sits at the knowledge or remembering level of Bloom's taxonomy, which does not encourage deeper conceptual understanding. Second, taking useful and detailed notes is a complex activity that strains the central executive and working memory, often leading to incomplete and inaccurate notes; as a result, students may be relying on notes that are missing information or contain errors from the lecture, which could hinder their performance. Last, re-writing notes takes time (Stevens et al., 2020), and therefore less of the student's exam-study time is available for active strategies that have been shown to improve conceptual knowledge and problem solving. Hence, while students may believe they are learning when re-writing their notes, instructors may want to teach and encourage students to use elaboration or self-explanation with their notes instead.

A key study-time management habit that has not been as well studied in STEM is distraction during exam-study time and its effect on exam performance. Previous studies have shown a negative relationship between distraction (predominantly multitasking) during lecture or studying and a student's academic performance (Kraushaar and Novak, 2010; Junco and Cotten, 2012; Calderwood et al., 2016; May and Elder, 2018). Our result concurs with studies (one in introductory biology and one covering many college courses) that found a negative correlation between being distracted while studying for an exam and exam performance (Patterson, 2017; Walck-Shannon et al., 2021). Our study showed that students who were more distracted while studying performed worse on the exams than students who reported being less distracted. We also observed that students in Group 1 reported being less distracted when studying for exams than students in Group 2 (25th percentile or lower in exam grades). Literature indicates that humans cannot process multiple streams of information at the same time; in fact, trying to learn while multitasking can lead to acquiring inflexible knowledge that cannot be easily recalled or applied in novel situations (Kraushaar and Novak, 2010; Junco, 2012). Additionally, switching between tasks requires time and effort to refocus, which can further impede learning (Kraushaar and Novak, 2010). There is a need to educate students about the impact of multitasking while studying on academic performance, to help foster self-awareness and self-regulation of multitasking habits.

Active study behaviors that showed no relationship to performance on exams in the GC1 course

Among the six active strategies included in the survey about studying for the GC1 exams, only two demonstrated a statistically significant positive correlation with performance: explaining concepts to oneself and others, and re-working online quizzes. The other four active strategies (paraphrased or outlined class notes, re-worked graded homework problems, made my own diagrams or comparison tables, and answered old exam questions) showed no significant relationship with GC1 exam performance. All six active strategies have literature evidence suggesting they should aid learning and performance on a test or in a course (Bjork, 1994; Karpicke et al., 2009; Dunlosky et al., 2013; Rowell et al., 2021; Walck-Shannon et al., 2021), so why did they not predict performance in this GC1 course? Below are some possible reasons.

(1) The content and types of problems on an exam may affect which active strategies are useful in improving exam performance. In our study, the active strategy of making my own diagrams or comparison tables from lecture notes showed no significant correlation with performance on Exam 1, but there was a notable trend suggesting a correlation with performance on Exam 2 (p = 0.06). The topics covered in Exam 1 focused on unit analysis, atomic mass, atomic structure, and periodic trends, while Exam 2 emphasized more visually oriented topics such as Lewis structures, VSEPR theory, valence bond theory, and molecular orbitals. This shift in content could explain the emerging correlation with Exam 2 performance. We hypothesize that the absence of a statistically significant correlation on Exam 2 could be attributed to the inclusion of topics like stoichiometry, limiting reactants, and nomenclature, which require practice testing to improve learning. It is important to note that students' choices of learning strategies can be influenced by course content and the type of assessment. Previous research by Hora and Oleson (2017) and Walck-Shannon et al. (2021) supports this notion, as students have expressed the need for different approaches to studying depending on the course content; for example, math courses may require a focus on problem-solving, while biology classes may necessitate strategies for extensive reading and vocabulary memorization. Additionally, Hora and Oleson's (2017) study revealed that students adapt their learning strategies based on the type of assessment given: for multiple-choice exams, students prefer memorizing important details over deep conceptual understanding, and for departmental common exams, they tend to work through previous common exams as a study strategy. Szu et al. (2011) found that for an organic chemistry course, in which strong visualization and organization of concepts is key, predictors of performance included making concept maps (considered one way to practice dual coding), outlining the material, and doing practice problems. Considering these findings, it seems possible that the content and problem types on an exam may affect how different active strategies influence exam performance.

(2) Misguided implementation of active strategies when studying for exams. In our study, we were surprised to find that the practice-testing or self-quizzing strategies (re-worked graded homework problems and answered old exam questions) did not predict performance on the exams. This finding does not align with the existing literature, which suggests that self-testing and practice testing strongly benefit long-term retention and comprehension of material (Karpicke et al., 2009; Dunlosky et al., 2013); previous course-based studies have also shown a positive relationship between these strategies and course performance (Walck-Shannon et al., 2021). One reason for our finding may be that students had access to the solutions to problems assigned in the class, such as homework problems and old exam questions. This convenience could have led some students to simply read the solutions without attempting to solve the problems themselves, or to rely on the solutions while re-working the problems. In addition, a large number of problems were available; hence, students may have struggled to select which ones were important to review, leading to potential gaps in their preparation. Time constraints may also have caused some students to skim the solutions instead of fully engaging with the problems.

Additionally, we expected that paraphrasing or outlining class notes would show a positive correlation with performance. Previous research by Hagaman et al. (2012) and Stevens et al. (2020) suggests that these activities help students break down and organize information, as well as establish connections between new ideas and previously acquired knowledge, leading to a more comprehensive understanding of the subject matter. However, in our study, students were provided with a study guide or outline by the instructors or teaching assistants (TAs/LAs). Our hypothesis is that instead of actively constructing their own study guides through methods like paraphrasing or outlining their class notes, students may have approached this strategy passively, simply reading the provided resource without actively processing and reorganizing the information. We believe this passive approach could be the reason for the lack of correlation between this strategy and exam performance.

Implications

Instructors play a crucial role in persuading students of the importance of adopting study behaviors that can improve performance on course assessments and foster self-regulated learning. One approach is to provide students with empirical evidence showcasing the benefits of these strategies in undergraduate STEM courses, as well as demonstrating how successful students employ these study behaviors in their own studying practices. In addition, instructors can encourage students not only to use active strategies to study for the course but also to increase the quality of their studying by spending the majority of each study session actively engaging with the class material rather than passively reading or re-writing notes word for word. Beyond encouraging students to use study behaviors that result in improved performance and learning, students may benefit from explicit instruction on how to use and incorporate these behaviors into their studying routine. This could involve integrating learning strategies into lectures or discussion sessions, e.g., discussing the underlying reasoning behind problem-solving steps (self-explanation). Consistent with the findings of this paper, instructors may want to tailor their course resources to facilitate the proper use of these learning strategies; for instance, refraining from providing solutions to problem sets can encourage students to engage in meaningful self-testing, though instructors should keep in mind that students could resist this approach.

Additionally, we found that the level of distraction during study sessions significantly impacted exam performance. To enhance concentration and mitigate distractions, instructors may want to teach students techniques for lowering their distraction while studying. One possible method is the Pomodoro technique, which involves working in focused intervals with short breaks interspersed between them (Cirillo, 2018; Costales et al., 2021); this structured approach can help maintain focus and prevent mental fatigue (Cirillo, 2018; Costales et al., 2021). Another useful technique is time blocking, where students allocate dedicated blocks of time to specific tasks or subjects (Chase et al., 2013; Rampton, 2019); this creates a structured study schedule and can help students focus their attention on each task (Chase et al., 2013; Rampton, 2019). Also, instructors can encourage students to attend class sessions regularly, or award points for class attendance, as these sessions may contain valuable information that is not available in the textbook or online resources (e.g., the instructor indicating which problems to focus on during class), and instructors can note that missing class negatively correlates with course performance.

Encouraging and instructing students on appropriate study behaviors should be an ongoing endeavour by instructors (and TAs/LAs) throughout the semester, as previous studies have indicated that students are often resistant to changing their study habits (Gezer-Templeton et al., 2017; Sebesta and Bray Speth, 2017; Walck-Shannon et al., 2021). A key time may be right after an exam, when students directly witness the impact of their study efforts on their performance and may be more open to change.

Limitations

One limitation to acknowledge in this study is the notable contrast in final course grades between the subgroup of consenting students and the entire class; on average, consenting students achieved a higher final course grade. However, we view this finding with minimal concern, as the difference in average grade is confined within a single letter grade (B). Moreover, our study primarily concentrates on performance on midterm exams rather than final course grades, which are influenced by additional factors such as homework, quizzes, and participation. A second limitation is that the data were self-reported. Self-reporting is susceptible to a number of potential biases, such as participants' misunderstanding of the survey questions and difficulty recalling relevant events (Podsakoff et al., 2003). These biases can also be influenced by participants' individual dispositions, including acquiescence or social desirability (Podsakoff et al., 2003). To mitigate these limitations, we took steps to ensure the anonymity of participants' responses and ensured that no instructor had access to the survey data. Additionally, we aimed to aid participants' recall of their study behaviors by administering the survey immediately after they completed their exams and closing it five days thereafter. A third limitation is the possible nonnormality of the regression models. However, the nonnormality is small, and as noted by Tranmer and Elliot (2008), multiple linear regression (MLR) is a robust statistical method that can tolerate moderate deviations from normality. A fourth limitation arises from our lack of insight into how students actually employed the learning strategies. For instance, a student's response of re-working Mastering Chemistry problems may not accurately reflect their actual engagement, as they could have simply read through the solution without first attempting to solve the problem independently. Nevertheless, our study adds to the STEM literature by showing that the study behaviors students employ while preparing for exams are significantly correlated with their performance on those exams.
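The residual diagnostics behind the third limitation (see Appendix III, Tables 8–10) were produced in Stata; a rough Python analogue, assuming the same hypothetical data frame as in the earlier sketches, would pair the Breusch and Pagan (1979) test for heteroskedasticity with moment tests on the residuals:

```python
# Rough analogue of the Appendix III residual diagnostics: Breusch-Pagan
# for heteroskedasticity plus skewness/kurtosis tests on the residuals.
# (The paper used Stata; file and column names here are hypothetical.)
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan
from scipy.stats import skewtest, kurtosistest

df = pd.read_csv("study_behaviors_exam1.csv")  # hypothetical data file
fit = smf.ols("exam1_pct ~ pre_assessment + days_missed", data=df).fit()

lm_stat, lm_p, _, _ = het_breuschpagan(fit.resid, fit.model.exog)
print("Breusch-Pagan:", lm_stat, lm_p)
print("Skewness:", skewtest(fit.resid))
print("Kurtosis:", kurtosistest(fit.resid))
```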

Conclusions

The present study investigated the study behaviors of general chemistry students when preparing for exams and their relationship to exam performance. Results showed that students, on average, utilized two active strategies for exam preparation, with the most common strategies being re-working graded homework problems (65%), re-reading lecture slides or class notes (62%), explaining concepts to themselves and others (47%), and attending the optional review sessions (43%). Students spent roughly half of their study time employing active strategies, while experiencing distractions around 26% of the time. Active strategies positively predicted exam performance, whereas passive strategies had a negative impact; however, not all active strategies were equally impactful, potentially because of the specific exam content and problem types, as well as misguided use of active strategies when studying. The study also revealed that the amount of exam-study time spent on active strategies positively predicted performance, and that higher levels of distraction when studying had a detrimental effect. Furthermore, students in the 75th-percentile group of exam scores used study behaviors that positively predicted performance, in contrast to the 25th-percentile group, who engaged in behaviors that negatively impacted performance. Knowing students' study behaviors and the effects these behaviors have on exam performance may give instructors insight into how to support all students and teach them study behaviors that are impactful for their course.

Data availability

The data are not publicly available as approval for this study did not include permission for sharing data publicly.

Conflicts of interest

There are no conflicts to declare.

Appendices

Appendix I

Tables 6 and 7 and Fig. 4
Table 6 Descriptive statistics of the consenting sample
Gender First-gen status Race/ethnicity
Female 276 First-gen 127 BIPOC 105
Male 310 Non-first gen 471 Asian 85
White 392


Table 7 Descriptive statistics of the whole general chemistry population
Gender
Female 396
Male 515



Fig. 4 Distributions of exam scores for Exam 1 and Exam 2 of the consenting sample.

Appendix II

*The study behaviors reflections were the same for Exam 1 and 2, but the dates changed accordingly.
Study behaviors reflection.
Time management. 1. What is the number of class days that you missed between the start of the semester and Exam 1? There was a total of 16 class days.

2. About what percent of the time were you distracted during class? (0–100%).

3. About what percent of the time were you distracted while studying? (0–100%).

4. In a typical non-exam week, how much time did you spend on this class (total number of hours per week)? Do NOT include in-class hours.

5. Please indicate the approximate number of hours you spent studying for Exam 1 on each of the following days. Use a 0 if you did not study on a particular day. If you started studying earlier than Monday, September 20th, there will be another question about that next.

[Have 9 days with blanks next to them that students fill in.]

6. If you started studying for the exam before Monday, Sept 20th, please estimate the total number of hours that you spent studying BEFORE that day. Please indicate zero if this does not apply to you (Figure below followed this question in the survey).

[Image shown to students in the survey.]


Learning strategies. 7. Which of the following did you do to prepare for Exam 1? (check all that apply)

a. Re-watched lecture videos

b. Re-read lecture slides or class notes

c. Re-wrote your class notes word for word

d. Paraphrased or outlined class notes (includes creating a study guide or writing out answers to the provided study guides)

e. Re-read textbook

f. Re-worked mastering chemistry problem questions (Re-worked graded homework problems)

g. Answered old exam questions

h. Made my own diagrams or comparison tables from lecture notes

i. Re-worked weekly graded homework sets (Re-worked online quizzes)

j. Attended the Optional Review Session

k. Explained concepts to myself or others

l. Other

If you selected “Other” on the previous question, please explain: _________.

8. For the strategies that you selected on the previous question, fill in the percent of your total study time that you spent on each one. Altogether it should add up to 100%.

9. For your two top strategies, please explain why you use those strategies. (Strategy 1 and Strategy 2)

10. The learning strategies I used to study for the Chem 1210 Exam 1 helped me to perform well on this exam.

∘ Strongly agree

∘ Agree

∘ Somewhat agree

∘ Somewhat disagree

∘ Disagree

∘ Strongly Disagree

11. Did you ask your instructor or one of the TAs/LAs for help (can include office hours, drop-in sessions, or study hall)?

∘ Yes

∘ No

12. Imagine you are using problem sets to study for the upcoming Chem 1210 exam. Choose the strategy you use the most.


∘ I typically try to solve the problems without looking at the solutions or briefly looking at the solutions.

∘ I typically read the solutions to the problems before trying them first.

∘ I typically read the solutions to the problem sets and do not re-work the problems.

∘ I typically do not use problem sets when studying for the exam.

Open-ended question: please explain the reasoning behind the answer that you chose for the previous question. (Imagine you are using problem sets to study for the upcoming Chem 1210 exam. Choose the strategy you use the most.)

End of Study Habit survey

Appendix III

Normality of multiple linear regression models.

I. Base models (Exam 1 and Exam 2).

Tables 8–15 and Fig. 5–11.

Table 8 Residual statistics for the base models
Source Exam 1 Exam 2
Chi2 df p Chi2 df p
Heteroskedasticity 28.46 5 <0.001 25.47 9 0.4207
Skewness 21.08 2 <0.001 11.12 3 0.0068
Kurtosis 0.05 1 0.8156 2.44 1 0.1185
Total 49.59 8 <0.001 39.02 13 0.002


Table 9 Residual statistics for the learning-strategies model
Source Exam 1 Exam 2
Chi2 df p Chi2 df p
Heteroskedasticity 95.08 93 0.4207 112.63 93 0.4207
Skewness 28.89 13 0.0068 22.25 13 0.0068
Kurtosis 0.18 1 0.6744 2.19 1 0.1185
Total 124.14 107 0.1231 137.07 107 0.0265


Table 10 Residual statistics for time-management models
Source Exam 1 Exam 2
Chi2 df p Chi2 df p
Heteroskedasticity 33.52 20 0.0295 34.49 20 0.0230
Skewness 21.63 5 0.0006 11.20 5 0.0475
Kurtosis 0.42 1 0.5194 1.93 1 0.1645
Total 55.57 26 0.0006 47.62 26 0.0060


Table 11 Base model including total exam study time
Predictor Exam 1 (%) Exam 2 (%)
b̂i ± SEa,b Pc b̂i ± SEa,b Pc
a b̂i = mean unstandardized estimated regression coefficient of the ith predictor. b SE = standard error of the regression coefficient, computed with ordinary least squares. c Two-sided significance level. d Numerical variables were centered at their mean. Ntotal (Exam 1) = 541 and Ntotal (Exam 2) = 470.
Chemistry pre-assessmentd 1.9 ± 0.1 <0.001 2.0 ± 0.2 <0.001
Class days missed −1.4 ± 0.4 0.001 −1.0 ± 0.4 0.008
Total exam study time −0.2 ± 0.1 0.056 0.1 ± 0.1 0.280
Intercept 72.4 ± 1.2 <0.001 75 ± 1 <0.001
Model fit R2 = 0.3092 (F(3, 537) = 84.14, p < 0.001) R2 = 0.2909 (F(3, 466) = 65.70, p < 0.001)


Table 12 Means and standard deviations of continuous variables
Variable Observations Mean Std. Dev Min Max
Exam 1 (%) 603 68.2 21.1 0 100
Exam 2 (%) 603 71.1 22.5 0 100
Chemistry pre-assessment 603 12.4 5.4 0 25
Class days missed (Exam 1) 547 1.4 2.1 0 16
Class days missed (Exam 2) 472 1.6 2.1 0 13
Total exam study time (Exam 1) (hours) 547 8.6 7.6 0 62.5
Total exam study time (Exam 2) (hours) 472 9.3 7.8 0 66
% time distracted studying for Exam 1 547 29.1 21.2 0 100
% time distracted studying for Exam 2 472 24.9 20.2 0 100
% Exam 1 study time using active strategies 547 57.6 38.2 0 480
% Exam 2 study time using active strategies 472 50.2 30.1 0 104
Number of days studied for Exam 1 547 5.2 2.6 0 9
Number of days studied for Exam 2 472 5.1 2.5 0 9


Table 13 Correlations between the learning strategies for Exam 1
Re-watched lecture videos Re-read lecture slides or class notes Re-wrote your class notes word for word Paraphrased or outlined class notes Re-read textbook Re-worked graded homework problems Made my own diagrams or comparison tables from lecture notes Re-worked online quizzes Explained concepts to myself or others Answered old exam questions Attended the optional review session
Re-watched lecture videos 1.000
Re-read lecture slides or class notes 0.1115**
Re-wrote your class notes word for word 0.0868* 0.0220
Paraphrased or outlined class notes −0.0186 0.0841* 0.0276
Re-read textbook 0.0669 0.1239** 0.0412 −0.0911*
Re-worked graded homework problems 0.0151 0.0520 0.0156 0.0113 0.0706
Made my own diagrams or comparison tables from lecture notes 0.0398 0.0111 0.0482 0.2135*** 0.0209 0.0337
Re-worked online quizzes −0.0146 0.0109 0.0453 0.0207 0.0193 0.2974*** 0.0854*
Explained concepts to myself or others −0.0707 0.0531 0.0403 0.1435*** 0.0304 0.1284** 0.1069* 0.1062*
Answered old exam questions −0.0718 0.0579 0.0186 0.0002 −0.1181** 0.0863* 0.0934* 0.0773 0.0457
Attended the optional review session 0.0780 0.1769*** 0.0700 0.0826 0.0463 0.1289** 0.0861* 0.1292** 0.0702 0.0926*


Table 14 Correlations between the learning strategies for Exam 2
Re-watched lecture videos Re-read lecture slides or class notes Re-wrote your class notes word for word Paraphrased or outlined class notes Re-read textbook Re-worked graded homework problems Made my own diagrams or comparison tables from lecture notes Re-worked online quizzes Explained concepts to myself or others Answered old exam questions Attended the optional review session
Re-watched lecture videos 1.000
Re-read lecture slides or class notes 0.0809
Re-wrote your class notes word for word 0.0133 0.0317
Paraphrased or outlined class notes −0.0013 0.0072 0.1787***
Re-read textbook 0.0718 −0.0441 0.0503 0.0019
Re-worked graded homework problems 0.0005 0.0853 0.0603 0.0221 0.0697
Made my own diagrams or comparison tables from lecture notes 0.0618 0.0710 −0.0031 0.2509*** 0.0444 0.0227
Re-worked online quizzes 0.0175 0.0394 −0.0064 0.0664 0.0507 0.2900*** −0.0011
Explained concepts to myself or others −0.0238 0.1090* −0.0495 0.0739 −0.0836 0.0833 0.1466*** 0.0511
Answered old exam questions 0.0191 −0.0326 −0.0271 −0.0122 0.0016 0.0665 0.0562 0.1159* 0.0334
Attended the optional review session 0.0339 0.1211** −0.0365 0.0214 −0.0100 0.0337 0.1576*** 0.0586 0.0878* −0.0020


Table 15 Regression model relating specific study strategy use to performance on Exam 1 (N = 541) and Exam 2 (N = 470) when controlling for academic preparation and class absences (i.e., base model)
Predictors Type Exam 1 (%) Exam 2 (%)
Study strategy b̂ia SEb p-valuec b̂ia SEb p-valuec
a b̂i = mean unstandardized estimated regression coefficient of the ith predictor. b SE = standard error of the regression coefficient, computed with ordinary least squares. c Two-sided significance level. Ntotal (Exam 1) = 541 and Ntotal (Exam 2) = 470.
Explained concepts to myself or others Active 4.8 1.5 0.001 3.5 1.5 0.020
Attended the optional review session Mixed 7.0 1.6 <0.001
Re-worked online quizzes Active 4.0 1.5 0.010
Re-wrote your class notes word for word Passive −8.7 2.4 <0.001
Base model Chemistry assessment (2.0 ± 0.1, p < 0.001), class days missed (−1.2 ± 0.4, p = 0.001) Chemistry assessment (2.0 ± 0.2, p < 0.001), class days missed (−0.8 ± 0.4, p = 0.044)
Model fit R2 = 0.3438 (F(5, 535) = 60.44, p < 0.001) R2 = 0.3302 (F(4, 465) = 58.97, p < 0.001)



Fig. 5 Normal distribution plots for the residuals obtained from the base models for Exam 1 and Exam 2.

Fig. 6 Normal distribution plots for the residuals obtained from the learning strategies models for Exam 1 and Exam 2.

Fig. 7 Normal distribution plots for the residuals obtained from the time management models for Exam 1 and Exam 2.

Fig. 8 Distributions for the percent of time students reported using active strategies while studying for Exam 1 (G1: N = 142) (G2: N = 130).

Fig. 9 Distributions for the percent of time students reported using active strategies while studying for Exam 2 (G1: N = 114) (G2: N = 107).

Fig. 10 Distribution of the percent of the time students reported being distracted while studying for Exam 1 (Group 1 N = 142) (Group 2 N = 130).

Fig. 11 Distribution of the percent of the time students reported being distracted while studying for Exam 2 (Group 1 N = 114) (Group 2 N = 107).

Acknowledgements

We would like to thank the instructors of the general chemistry class for allowing us to conduct the study with the students enrolled in their classes. We would also like to thank the students who willingly participated in the study.

References

  1. Atieh E. L., York D. M. and Muñiz M. N., (2020), Beneath the surface: an investigation of general chemistry students’ study skills to predict course outcomes, J. Chem. Educ., 98, 281–292.
  2. Biggs J., (1993), What do inventories of students' learning processes really measure? A theoretical review and clarification, Br. J. Educ. Psychol., 63, 3–19.
  3. Biggs J., Kember D. and Leung D. Y., (2001), The revised two-factor study process questionnaire: R-SPQ-2F, Br. J. Educ. Psychol., 71, 133–149.
  4. Bjork R. A., (1994), Memory and metamemory considerations in the training of human beings, in Metcalfe J. and Shimamura A. P. (ed.), Metacognition: Knowing About Knowing, The MIT Press, pp. 185–205.
  5. Bjork R. A., (2018), Being suspicious of the sense of ease and undeterred by the sense of difficulty: looking back at Schmidt and Bjork (1992), Perspect. Psychol. Sci., 13, 146–148.
  6. Bjork E. L. and Bjork R. A., (2011), Making things hard on yourself, but in a good way: creating desirable difficulties to enhance learning, Psychol. Real world: Essays Illustrating fundamental Contributions Soc., 2, 59–62.
  7. Bloom B. S., Engelhart M. D., Furst E. J., Hill W. H. and Krathwohl D. R., (1956), Taxonomy of educational objectives: the classification of educational goals: handbook I: cognitive domain, New York, US: D. McKay.
  8. Breusch T. S. and Pagan A. R., (1979), A simple test for heteroscedasticity and random coefficient variation, Econometrica, 1287–1294.
  9. Bunce D. M., Komperda R., Schroeder M. J., Dillner D. K., Lin S., Teichert M. A. and Hartman J. R., (2017), Differential use of study approaches by students of different achievement levels, J. Chem. Educ., 94, 1415–1424.
  10. Calderwood C., Green J. D., Joy-Gaba J. A. and Moloney J. M., (2016), Forecasting errors in student media multitasking during homework completion, Comput. Educ., 94, 37–48.
  11. Callender A. A. and McDaniel M. A., (2009), The limited benefits of rereading educational texts, Contemp. Educ. Psychol., 34, 30–41.
  12. Cameron A. C. and Trivedi P. K., (1990), The information matrix test and its applied alternative hypothesis. Working paper 372, University of California-Davis, Institute of Governmental Affairs.
  13. Carrier L. M., (2003), College students' choices of study strategies, Perceptual Motor Skills, 96, 54–56.
  14. Cepeda N. J., Vul E., Rohrer D., Wixted J. T. and Pashler H., (2008), Spacing effects in learning: a temporal ridgeline of optimal retention, Psychol. Sci., 19, 1095–1102.
  15. Chan J. Y. and Bauer C. F., (2016), Learning and studying strategies used by general chemistry students with different affective characteristics, Chem. Educ. Res. Pract., 17, 675–684.
  16. Chase J.-A. D., Topp R., Smith C. E., Cohen M. Z., Fahrenwald N., Zerwic J. J., Benefield L. E., Anderson C. M. and Conn V. S., (2013), Time management strategies for research productivity, Western J. Nurs. Res., 35, 155–176.
  17. Cherif A. and Wideen M., (1992), The problems of the transition from high school to university science, BC Catalyst, 36(1), 10–18.
  18. Chi M. T. and Wylie R., (2014), The ICAP framework: linking cognitive engagement to active learning outcomes, Educ. Psychol., 49, 219–243.
  19. Cirillo F., (2018), The Pomodoro technique: The acclaimed time-management system that has transformed how we work, Currency.
  20. Conley D., (2007), The challenge of college readiness, Educ. Leadership, 64, 1–6.
  21. Cook E., Kennedy E. and McGuire S. Y., (2013), Effect of teaching metacognitive learning strategies on performance in general chemistry courses, J. Chem. Educ., 90, 961–967.
  22. Costales J., Abellana J., Garcia J. and Devaraj M., (2021), A learning assessment applying Pomodoro techniques as a productivity tool for online learning, 13th International Conference on Education Technology and Computers, 164–167.
  23. Credé M. and Kuncel N. R., (2008), Study habits, skills, and attitudes: the third pillar supporting collegiate academic performance, Perspect. Psychol. Sci., 3, 425–453.
  24. DeZure D., Kaplan M. and Deerman M. A., (2001), Research on student notetaking: implications for faculty and graduate student instructors, CRLT Occasional Papers, 16, 1–7.
  25. Dontre A. J., (2021), The influence of technology on academic distraction: a review. Human Behav. Emerg. Technol., 3, 379–390.
  26. Dunlosky J., Rawson K. A., Marsh E. J., Nathan M. J. and Willingham D. T., (2013), Improving students’ learning with effective learning techniques: promising directions from cognitive and educational psychology, Psychol. Sci. Public Interest, 14, 4–58.
  27. Ebele U. F. and Olofu P. A., (2017), Study Habit and Its Impact on Secondary School Students' Academic Performance in Biology in the Federal Capital Territory, Abuja, Educ. Res. Rev., 12, 583–588.
  28. Edwards J. D., Barthelemy R. S. and Frey R. F., (2021), Relationship between course-level social belonging (sense of belonging and belonging uncertainty) and academic performance in general chemistry 1, J. Chem. Educ., 99, 71–82.
  29. Entwistle N. and McCune V., (2004), The conceptual bases of study strategy inventories, Educ. Psychol. Rev., 16, 325–345.
  30. Fanetti S., Bushrow K. M. and DeWeese D. L., (2010), Closing the gap between high school writing instruction and college writing expectations, English J., 77–83.
  31. Fink A., Frey R. F. and Solomon E. D., (2020), Belonging in general chemistry predicts first-year undergraduates’ performance and attrition, Chem. Educ. Res. Pract., 21, 1042–1062.
  32. Fiorella L. and Mayer R. E., (2016), Eight ways to promote generative learning, Educ. Psychol. Rev., 28, 717–741.
  33. Fritz C. O., Morris P. E., Bjork R. A., Gelman R. and Wickens T. D., (2000), When further learning fails: stability and change following repeated presentation of text, Br. J. Psychol., 91, 493–511.
  34. Gezer-Templeton P. G., Mayhew E. J., Korte D. S. and Schmidt S. J., (2017), Use of exam wrappers to enhance students’ metacognitive skills in a large introductory food science and human nutrition course, J. Food Sci. Educ., 16, 28–36.
  35. Gump S. E., (2005), The cost of cutting class: attendance as a predictor of success, College Teach., 53, 21–26.
  36. Hagaman J. L., Casey K. J. and Reid R., (2012), The effects of the paraphrasing strategy on the reading comprehension of young students, Remedial Special Educ., 33, 110–123.
  37. Hora M. T. and Oleson A. K., (2017), Examining study habits in undergraduate STEM courses from a situative perspective, Int. J. STEM Educ., 4, 1–19.
  38. Junco R., (2012), Too much face and not enough books: the relationship between multiple indices of Facebook use and academic performance, Comput. Human Behav., 28, 187–198.
  39. Junco R. and Cotten S. R., (2012), No A 4 U: the relationship between multitasking and academic performance, Comput. Educ., 59, 505–514.
  40. Karpicke J. D., Butler A. C. and Roediger III H. L., (2009), Metacognitive strategies in student learning: do students practise retrieval when they study on their own? Memory, 17, 471–479.
  41. Kiewra K. A., (1987), Notetaking and review: the research and its implications, Instruct. Sci., 16, 233–249.
  42. Kornell N. and Bjork R. A., (2007), The promise and perils of self-regulated study, Psychon. Bull. Rev., 14, 219–224.
  43. Kraushaar J. M. and Novak D. C., (2010), Examining the affects of student multitasking with laptops during the lecture, J. Inform. Syst. Educ., 21, 241–252.
  44. Lee S. W. and Von Colln T., (2003), The Effect of Instruction in the Paraphrasing Strategy on Reading Fluency and Comprehension.
  45. Lewis S. E., Shaw J. L., Heitz J. O. and Webster G. H., (2009), Attitude counts: self-concept and success in general chemistry, J. Chem. Educ., 86, 744.
  46. Li W.-T., Liang J.-C. and Tsai C.-C., (2013), Relational analysis of college chemistry-major students' conceptions of and approaches to learning chemistry, Chem. Educ. Res. Pract., 14, 555–565.
  47. Lin T.-F. and Chen J., (2006), Cumulative class attendance and exam performance, Appl. Econ. Lett., 13, 937–942.
  48. Lombardi D., Shipley T. F., Astronomy Team, Biology Team, Chemistry Team, Engineering Team, Geography Team, Geoscience Team and Physics Team, (2021), The curious construct of active learning, Psychol. Sci. Public Interest, 22, 8–43.
  49. Lynch D. J., (2007), “I've studied so hard for this course, but don't get it!” Differences between student and faculty perceptions, College Student J., 41, 22–25.
  50. Maher J. M., Markey J. C. and Ebert-May D., (2013), The other half of the story: effect size analysis in quantitative research, CBE—Life Sci. Educ., 12, 345–351.
  51. Martin L., Mills C., D’Mello S. K. and Risko E. F., (2018), Re-watching lectures as a study strategy and its effect on mind wandering, Exp. Psychol., 65, 297–306.
  52. Marton F. and Säljö R., (1976), On qualitative differences in learning: I—Outcome and process, Br. J. Educ. Psychol., 46, 4–11.
  53. Matt G. E., Pechersky B. and Cervantes C., (1991), High school study habits and early college achievement, Psychol. Rep., 69, 91–96.
  54. May K. E. and Elder A. D., (2018), Efficient, helpful, or distracting? A literature review of media multitasking in relation to academic performance, Int. J. Educ. Technol. Higher Educ., 15, 1–17.
  55. McDaniel M. A., Roediger H. L. and McDermott K. B., (2007), Generalizing test-enhanced learning from the laboratory to the classroom, Psychon. Bull. Rev., 14, 200–206.
  56. Merrill M. D., (2002). First principles of instruction. Educ. Technol. Res. Dev., 50, 43–59.
  57. Moss J., Schunn C. D., Schneider W. and McNamara D. S., (2013), The nature of mind wandering during reading varies with the cognitive control demands of the reading strategy, Brain Res., 1539, 48–60.
  58. Mutambuki J. M., Mwavita M., Muteti C. Z., Jacob B. I. and Mohanty S., (2020), Metacognition and active learning combination reveals better performance on cognitively demanding general chemistry concepts than active learning alone, J. Chem. Educ., 97, 1832–1840.
  59. Muteti C. Z., Zarraga C., Jacob B. I., Mwarumba T. M., Nkhata D. B., Mwavita M., Mohanty S. and Mutambuki J. M., (2021), I realized what I was doing was not working: the influence of explicit teaching of metacognition on students’ study strategies in a general chemistry I course, Chem. Educ. Res. Pract., 22, 122–135.
  60. Nakagawa S., (2004), A farewell to Bonferroni: the problems of low statistical power and publication bias, Behav. Ecol., 15, 1044–1045.
  61. Ost B., (2010), The role of peers and grades in determining major persistence in the sciences, Econ. Educ. Rev., 29, 923–934.
  62. Özçakmak H., (2019), Impact of Note Taking during Reading and during Listening on Comprehension, Educ. Res. Rev., 14, 580–589.
  63. Patterson M. C., (2017), A naturalistic investigation of media multitasking while studying and the effects on exam performance, Teach. Psychol., 44, 51–57.
  64. Perneger T. V., (1998), What's wrong with Bonferroni adjustments, BMJ, 316, 1236–1238.
  65. Phillips N. E., Mills C., D'Mello S. and Risko E. F., (2016), On the influence of re-reading on mind wandering, Q. J. Exp. Psychol., 69, 2338–2357.
  66. Pintrich P. R., (1991), A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ).
  67. Podsakoff P. M., MacKenzie S. B., Lee J.-Y. and Podsakoff N. P., (2003), Common method biases in behavioral research: a critical review of the literature and recommended remedies, J. Appl. Psychol., 88, 879–903.
  68. Rampton J., (2019), Time blocking tips top experts and scientists use to increase productivity, Inc.
  69. Rask K., (2010), Attrition in STEM fields at a liberal arts college: the importance of grades and pre-collegiate preferences, Econ. Educ. Rev., 29, 892–900.
  70. Rawson K. A., (2012), Why do rereading lag effects depend on test delay? J. Mem. Lang., 66, 870–884.
  71. Roediger III H. L. and Karpicke J. D., (2006), The power of testing memory: basic research and implications for educational practice, Perspect. Psychol. Sci., 1, 181–210.
  72. Rosen L. D., Carrier L. M. and Cheever N. A., (2013), Facebook and texting made me do it: media-induced task-switching while studying, Comput. Human Behav., 29, 948–958.
  73. Rothman K. J., (1990), No adjustments are needed for multiple comparisons, Epidemiology, 1, 43–46.
  74. Rowell S. F., Frey R. F. and Walck-Shannon E. M., (2021), Intended and actual changes in study behaviors in an introductory and upper-level psychology course, Teach. Psychol., 48, 165–174.
  75. Sabot R. and Wakeman-Linn J., (1991), Grade inflation and course choice, J. Econ. Perspect., 5, 159–170.
  76. Schmidt S. J., (2019), Taking notes: there's a lot more to it than meets the eye, J. Food Sci. Educ., 18, 54–58.
  77. Sebesta A. J. and Bray Speth E., (2017), How should I study for the exam? Self-regulated learning strategies and achievement in introductory biology, CBE—Life Sci. Educ., 16, ar30.
  78. Seery M. K., (2009), The role of prior knowledge and student aptitude in undergraduate performance in chemistry: a correlation–prediction study, Chem. Educ. Res. Pract., 10, 227–232.
  79. Sinapuelas M. L. and Stacy A. M., (2015), The relationship between student success in introductory university chemistry and approaches to learning outside of the classroom, J. Res. Sci. Teach., 52, 790–815.
  80. StataCorp LLC, (2019), Stata Statistical Software: Release 16 [Software], College Station, TX.
  81. Stevens E. A., Vaughn S., House L. and Stillman-Spisak S., (2020), The effects of a paraphrasing and text structure intervention on the main idea generation and reading comprehension of students with reading disabilities in grades 4 and 5, Sci. Stud. Read., 24, 365–379.
  82. Sunday O. J., Adesope O. O. and Maarhuis P. L., (2021), The effects of smartphone addiction on learning: a meta-analysis, Comput. Human Behav. Rep., 4, 100114.
  83. Sweller J., Van Merriënboer J. J. and Paas F., (2019), Cognitive architecture and instructional design: 20 years later, Educ. Psychol. Rev., 31, 261–292.
  84. Szpunar K. K., Khan N. Y. and Schacter D. L., (2013), Interpolated memory tests reduce mind wandering and improve learning of online lectures, Proc. Natl. Acad. Sci. U. S. A., 110, 6313–6317.
  85. Szu E., Nandagopal K., Shavelson R. J., Lopez E. J., Penn J. H., Scharberg M. and Hill G. W., (2011), Understanding academic performance in organic chemistry, J. Chem. Educ., 88, 1238–1242.
  86. Tai R. H., Sadler P. M. and Loehr J. F., (2005), Factors influencing success in introductory college chemistry, J. Res. Sci. Teach., 42, 987–1012.
  87. Tranmer M. and Elliot M., (2008), Multiple linear regression, The Cathie Marsh Centre for Census and Survey Research (CCSR), 5, 1–5.
  88. Upton G. J., (1992), Fisher's exact test, J. R. Stat. Soc. Ser. A, 155, 395–402.
  89. Walck-Shannon E. M., Rowell S. F. and Frey R. F., (2021), To what extent do study habits relate to performance? CBE—Life Sci. Educ., 20, ar6.
  90. Westrick P. A., Le H., Robbins S. B., Radunzel J. M. and Schmidt F. L., (2015), College performance and retention: a meta-analysis of the predictive validities of ACT® scores, high school grades, and SES, Educ. Assess., 20, 23–45.
  91. White H., (1980), A heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity, Econometrica, 48, 817–838.
  92. Wiley J., Griffin T. D. and Thiede K. W., (2005), Putting the comprehension in metacomprehension, J. Gen. Psychol., 132, 408–428.
  93. Wittrock M. C., (1991), Generative teaching of comprehension, Elem. School J., 92, 169–184.
  94. Wright S. P., (1992), Adjusted p-values for simultaneous inference, Biometrics, 48, 1005–1013.
  95. Xu X., Villafane S. M. and Lewis J. E., (2013), College students’ attitudes toward chemistry, conceptual knowledge and achievement: structural equation model analysis, Chem. Educ. Res. Pract., 14, 188–200.
  96. Yan V. X., Thai K.-P. and Bjork R. A., (2014), Habits and beliefs that guide self-regulated learning: Do they vary with mindset? J. Appl. Res. Memory Cogn., 3, 140–152.
  97. Ye L., Oueini R., Dickerson A. P. and Lewis S. E., (2015), Learning beyond the classroom: using text messages to measure general chemistry students' study habits, Chem. Educ. Res. Pract., 16, 869–878.
  98. Zhao N., Wardeska J. G., McGuire S. Y. and Cook E., (2014), Metacognition: an effective tool to promote success in college science learning, J. College Sci. Teach., 43, 48–54.

This journal is © The Royal Society of Chemistry 2025