Post-pandemic challenges in higher education: learning preferences, performance and dropout in a first-semester chemistry course

Nicolás Pérez, Lorena Martínez, Natalia Alvarez, Lucía Otero, Nicolás Veiga* and Julia Torres*
Área Química Inorgánica, DEC, Facultad de Química, Universidad de la República, General Flores 2124, Montevideo, Uruguay. E-mail: nveiga@fq.edu.uy; jtorres@fq.edu.uy

Received 31st July 2025, Accepted 18th November 2025

First published on 19th November 2025


Abstract

After the COVID-19 pandemic, new digital resources were maintained alongside reinstated in-person activities, leading to a blended learning environment that provides higher education students with a variety of learning alternatives. This study provides a detailed analysis of students’ choices among these alternatives and their associations with academic performance and dropout rates within a first-year General Chemistry course at a public, open-enrollment university. The evolution of students’ preferences for a range of learning activities and resources—spanning virtual and in-person formats, as well as active and passive modes—was examined. Both student characteristics and resource attributes were analyzed as potential factors influencing these preferences. The results show that access to virtual resources surged during the crisis and, although it steadily declined afterwards, the most commonly used and valued resources remain those delivered virtually, particularly those closely related to the course content assessed in the final tests. On the other hand, activities involving in-person student–instructor interaction, such as theory lectures or tutoring sessions, are less valued than watching selected parts of the corresponding recorded videos or using the electronic forum, respectively. Materials focusing on content not directly assessed in tests are also perceived as less useful. Overall, the results indicate a shift towards more self-paced, time-saving learning. However, in-person tutoring session attendance correlates with better final marks, while over-reliance on the electronic forum may signal academic struggles leading to lower performance and dropout. These findings emphasize the need to balance time-saving virtual learning with in-person support.


1. Introduction

Educational systems faced unprecedented challenges during the COVID-19 pandemic, which caused widespread physical closures of schools, colleges, and universities in 2020–2021. In response to the crisis, a myriad of strategies were immediately implemented to ensure the continuity of curriculum-based studies (Dietrich et al., 2020; Rapanta et al., 2020; UNESCO Education Sector, 2020; Asare et al., 2021; García-Morales et al., 2021; Neuwirth et al., 2021; Stevanović et al., 2021; Su and Guo, 2021; Zhang et al., 2023). Higher education underwent a radical and sudden adaptation to fully remote digital strategies (Carolan et al., 2020; Dwivedi et al., 2020; García-Morales et al., 2021). Since most universities did not have enough time to prepare new resources, they commonly resorted to synchronous meetings or emergency recorded classes (Manfuso, 2020). In some cases, although pre-existing technology-based tools provided an excellent opportunity for a smooth transition to virtual learning (Garrison and Kanuka, 2004; O’Flaherty and Phillips, 2015), these strategies were not always fully integrated into the instructional alternatives implemented during the pandemic emergency (Rapanta et al., 2021).

The variety of strategies chosen by the different institutions during university campus closures did not share goals, design, instructional delivery mode or even definition, but most of them used virtual tools for computer-assisted learning (Rapanta et al., 2021; Xie et al., 2021). The implemented remote learning approaches tried to keep what worked, while striving to ensure universal access (Nguyen et al., 2020; Rapanta et al., 2021). Ideally, this implied ensuring students' technological skills, availability of learning materials fully covering the curriculum content, virtual pedagogical support and continuous tracking and assessment of learning outcomes, with special attention to students with weaker self-regulation and self-organization skills (Rapanta et al., 2020; UNESCO Education Sector, 2020). The key to effective remote learning was establishing reliable communication channels that could bridge not only the physical distance but also, whenever possible, temporal constraints (Stöhr et al., 2020). Intentionally designed blended guided approaches proved to be effective (Neuwirth et al., 2021), offering both synchronous meetings and asynchronous self-paced materials with various instructional modes covering diverse individualities (Daniel, 2020; Dhawan, 2020; Dietrich et al., 2020; Rapanta et al., 2020; Adedoyin and Soykan, 2023). Computer-assisted learning was perceived by students as more useful, flexible in scheduling and time-saving, although some reports showed that, in other cases, personal interaction was preferred (Rodríguez-Rodríguez et al., 2020; Al-Kumaim et al., 2021; Stevanović et al., 2021). Interestingly, public positive opinions on distance learning significantly increased during the pandemic, although this topic also saw a notable rise in negative views, suggesting a polarization of perspectives (Asare et al., 2021).

A general loss of knowledge and skill acquisition was observed during the pandemic (Di Pietro et al., 2020; Sievertsen and Burgess, 2020; Di Pietro, 2023). However, a more active learning process may have also emerged, as the materials provided during this period took on roles traditionally fulfilled by educators (The Council of the European Union, 2018; Nordmann et al., 2020; Rodríguez-Triana et al., 2020). The flexibility provided by digital materials may have fostered student self-regulation among more active learners (Dietrich et al., 2020). In fact, according to expert opinion, students might have modified their preferences, expectations and practices as the pandemic went by (Rapanta et al., 2021). The promotion of self-regulated learning was indeed observed, although it was less evident among first-year students (Stevanović et al., 2021). These students face a particular challenge, as they encounter additional risk factors compared with more advanced students (Stevanović et al., 2021). Recent initiatives, such as supplemental co-class models, have highlighted the importance of enhancing first-year student engagement and learning outcomes in general chemistry, particularly for those transitioning from disrupted high school experiences (Kumari et al., 2025). Further studies are vital for developing and implementing effective strategies that take into account students’ diversity (García-Morales et al., 2021; Puriwat et al., 2021; Stecuła and Wolniak, 2022).

In this context, a significant gap remains in understanding the medium-term evolution of student learning preferences and their specific impact on academic performance and dropout rates in core first-year STEM courses—particularly chemistry, where foundational knowledge is critical. Furthermore, there is a need for studies that not only assess preferences, but also disentangle the influence of diverse student characteristics and specific instructional design features of learning activities and resources on academic outcomes. This study addresses these gaps by examining students’ preferences for a diverse range of learning activities and resources in a first-year university chemistry course. It analyses how these patterns evolved before, during, and after the pandemic, while also considering some relevant student characteristics within the selected context, as well as different aspects of the offered activities and resources. This helps to understand how students’ preferences were reshaped by the pandemic-related disruption. Additionally, the study investigates how students’ choices relate to academic performance and dropout rates. By examining the interplay among student preferences, activities and resource characteristics, academic outcomes and dropout, this research provides empirical evidence to guide the strategic reshaping of higher education practices, with a particular focus on addressing the unique needs of first-year students in a post-pandemic landscape.

1.1. Theoretical framework

The variety of strategies adopted by universities during campus closures offered greater flexibility on one hand but also required students to transition from a collaborative classroom environment to studying independently. To better understand this shift, two complementary yet potentially contrasting theoretical foundations must be considered: flexible learning and social constructivism.

Flexible learning is an educational approach that prioritizes learners’ needs by providing choices in how, when, where, and at what pace they learn. It often relies on technology to promote student-centered, self-paced, and independent learning experiences. A key feature of this model is the creation of a multifaceted, self-regulated learning environment, with flexibility across core dimensions such as time, content, assessment, and delivery (Müller and Mildenberger, 2021; Müller et al., 2023).

In contrast, Vygotsky's social constructivism posits that knowledge is actively developed only through social interaction and shared experiences. Learning is viewed as a social process in which individuals construct understanding collaboratively within a cultural context. This framework postulates a gap (the zone of proximal development) between what a learner can achieve independently and what they can accomplish with guidance from others. In this context, personal communication and language facilitate the transmission and internalization of concepts (Amineh and Asl, 2015).

Taking into account the described framework, the possible shift in students’ preferences toward computer-assisted materials could increase the relevance of flexible learning, which emphasizes student autonomy regarding pace, place, and mode of delivery, fostering more active and efficient learning. However, this could also weaken engagement with peers and instructors, an essential component of social constructivist learning. Therefore, current challenges in higher education revolve around balancing social constructivist interaction with flexibility in learning (Amineh and Asl, 2015; Müller and Mildenberger, 2021; Müller et al., 2023), within an inclusive and efficient blended-learning framework (Guppy et al., 2022; Tilak and Kumar, 2022). The goal is not to choose between two extremes (fully in-person or fully computer-assisted self-learning) but rather to identify the most effective blended combinations that support each student's engagement and performance (Roy, 2020; Anderson, 2021). Moreover, given the potential shift in students’ learning preferences (Rapanta et al., 2021) and the changes in study routines brought about by the pandemic (Kerres and Buchner, 2022; Jereb et al., 2023), student–content interaction may be a particularly relevant factor influencing current student satisfaction and academic outcomes (Su and Guo, 2021). There is therefore an urgent need for research-based insights into student preferences for educational materials and their impact on academic outcomes.

1.2. Definition of the problem and research questions

General Chemistry I, a core first-year course, was selected as a case study. The pre-existence of self-paced virtual resources provided a unique opportunity to compare nearly identical resources before, during, and after the pandemic. In fact, in a previous study we reported the assessment of the available self-paced interactive materials in a pre-pandemic scenario, focusing especially on use, perceived usefulness and influence on academic performance and dropout rates (Veiga and Torres, 2022). Building on those results, the central research questions of this work are:

• What are the current students’ preferences towards offered learning activities and resources and how did they evolve across the pandemic disruption?

• How do students’ characteristics in terms of gender, course history (freshman/repeater), selected course modality (in-person/online) and academic performance modulate the mentioned preferences?

• How do the characteristics of activities and resources in terms of mode of student–instructor interaction (in-person/virtual), targeted learning style (active/passive) and academic content (test-related or not) influence the mentioned preferences?

• What is the association between students’ preferences/characteristics and their academic performance and dropout rates?

2. Context, materials, and methods

2.1. Description of the general context

First of all, it is worth mentioning that since 2007, Uruguay has implemented a national plan to ensure universal access to technology in public primary and secondary education (Pittaluga and Revoir, 2012). This initiative fostered the widespread development of technological resources and digital skills by the time students enter university, thereby reducing potential barriers to the adoption of virtual tools. Indeed, in 2020, during the semester of the first COVID-19 outbreak, a survey by Universidad de la República of over 99 000 students across all programs revealed that 85% attended at least one complete virtual course—the sole option available that year—and 92% passed at least one course. Surprisingly, these percentages were even higher for freshman students. In the same survey, students were also asked about positive and negative aspects of the fully virtual modality implemented that year. Time flexibility and spared transport associated with distance learning were the main advantages mentioned, whereas overload and emotional distress arose as the most negative aspects (Udelar, 2020).

Given the socioeconomic status of Uruguayan university students, first-semester learners represent a relatively uniform group of full-time students aged 18 to 20, predominantly from medium to high socioeconomic backgrounds (Perera, 2018; Torello and Casacuberta, 2020). Enrollment is free of charge, and no admission test or quota applies. In line with this free-enrollment framework, simultaneous enrollment in multiple degree programs is common and, as a result, students display a wide range of professional interests, prior experiences, academic performance levels, and learning skills.

General Chemistry I is a theory course located in the first semester of all chemistry degrees at Universidad de la República, Uruguay. The main specific learning goals are the basic concepts of atomic structure and chemical bonding, as well as the development of basic calculation and prediction skills on those subjects. The course activities and resources available are detailed in Table 1, which also provides a brief overview of the modifications implemented during and after the pandemic. For each topic, the traditional pre-pandemic approach included an in-person, purely didactic lecture introducing the theory content (T, 1.5 hours per week). This was followed by an in-person practice session with comprehensive worked examples (P, 2.5 hours per week). Attendance at practice sessions was mandatory: student presence was systematically recorded, and a minimum attendance rate of 80% was required to pass the course. Additionally, students were provided with self-paced virtual interactive materials to reinforce their understanding. These consist of twelve individual files containing the worked examples assigned for each week's practice session, along with numerous additional exercises and problems, compiled as downloadable PDFs that can be used entirely offline—anytime, anywhere, and as often as learners need. Interactive tools are embedded within these materials to supplement the limited personal guidance available from instructors in the context of a large-scale course. Most of these tools are activated on demand: clues (CL, solving hints of varying difficulty levels for all available exercises and problems), feedback (FK, multiple-choice feedback offering either explanations for incorrect answers or additional details for the correct ones), “know more” sections (KM, further reading material that delves into theory concepts beyond the course's learning objectives) and “a bit of history” sections (Hy, historical anecdotes or curiosities highlighting notable scientific contributions related to the subject matter). Furthermore, an always-present tool is also included: the “know what” section (KW, real-life, attention-grabbing information presented as brief texts and eye-catching images). Examples of these tools are depicted in Fig. 1.

Table 1 Description of the course activities and resources available throughout the course, before, during and after COVID-19 closures. Abbreviations used in this work are also included (code: italic font for virtual, bold font for in-person)

| Activity | Pre-pandemic editions (2017–2019) | Pandemic (2020–2021) | Post-pandemic editions (2022–2023) |
|---|---|---|---|
| Theory | 14 in-person theory lectures (T) | 14 recorded theory lectures (TV) and complementary animated videos (AV) | 14 in-person/recorded theory lectures and complementary animated videos (T, TV, AV) |
| Theory (all periods) | recommended virtual reading material (RM) | | |
| Practice | 12 in-person practice sessions: worked examples solved (P) | 12 recorded practice sessions: worked examples solved (PV) | 12 in-person/recorded practice sessions: worked examples solved (P, PV) |
| Practice (all periods) | 12 out-of-class self-paced virtual interactive practicing activities containing a daily-life curiosities section named know-what (KW) and the following interactive tools: clues (CL), feedback (FK), know-more (KM) and a bit of history (Hy) | | |
| Theory and practice in-depth discussion | 12 in-person tutoring sessions: instructor–student interaction (TS) | 12 live video-conference/chat tutoring sessions: instructor–student interaction (TS) | 12 in-person tutoring sessions: instructor–student interaction (TS) |
| Theory and practice in-depth discussion (all periods) | daily moderated electronic forum: instructor–student and student–student interaction (EF) | | |



Fig. 1 Examples of the available tools embedded in the self-paced interactive materials; image reproduced from Veiga and Torres (2022).

The integration of theory and practice in the course, as well as the resolution and in-depth discussion of exercises present in the interactive materials, was facilitated through both in-person tutoring sessions (TS, 0.5 hours per week) and a monitored electronic forum (EF) with daily responses. Both formats provided individualized support, addressing specific student challenges and helping students build and reinforce key concepts. Students attending tutoring sessions are expected to raise and discuss their specific doubts or questions with the instructor. Analogously, in the electronic forum, they can post such questions. In both activities, students can find specific personal guidance on different aspects of theory and practice, when needed. However, while tutoring sessions involve direct, in-person dialogue between the instructor and an individual student (or occasionally a small group), discussions in the electronic forum are visible to all students, allowing anyone to post new questions, read, and contribute to ongoing threads.

The University suspended all in-person activities just days before the start of the first semester in March 2020. As a result, in-person theory lectures and practice sessions were quickly adapted into recorded formats: theory videos (TV) and practice videos (PV), respectively. Additionally, animated videos (AV) covering selected complementary theory concepts were developed. All these resources were made available on the course's Virtual Learning Environment (VLE). Previous tutoring sessions were transformed into live Zoom video conferences with the possibility of interaction by chat or audio for direct communication (80% mandatory attendance was required), while the rest of the course activities remained practically unchanged (Table 1). This general scheme was maintained in 2021. Since, in the traditional strategy, the student learning pace was regulated by a pre-planned in-person class schedule, the new virtual resources followed an identical schedule: access was enabled sequentially in the course VLE according to the subject addressed each week.

From 2022 on, in-person activities were reinstated, and live Zoom video conferences were suspended. All other activities and resources, including the recorded theory lectures and practice sessions as well as the complementary animated videos, remained available in the course VLE in a pre-scheduled weekly delivery mode, following the progression of course topics (Table 1).

Furthermore, automatically corrected multiple-choice mock tests (M) were implemented in 2022.

It is also worth mentioning that even before the pandemic, students enrolled in the course could choose between two modalities: in-person (requiring 80% attendance to practice sessions) or fully online (ON). The in-person modality followed the previously described structure. In contrast, in the fully online modality (ON), students did not attend pre-scheduled in-person classes. Instead, they received weekly virtual materials, specifically designed to cover the practice session content, including written worked examples to ensure equivalent learning opportunities. Moreover, mandatory in-person attendance at practice sessions was replaced by weekly assignments for students enrolled in the fully online modality. During the pandemic, when attendance at live Zoom tutoring sessions became mandatory for the in-person modality, the online modality retained its structure—students did not attend the scheduled Zoom sessions but continued to receive the same virtual materials designed for this modality.

All students within a given cohort undertake the same assessments, based on multiple-choice tests. Each year, tests were composed of multiple-choice questions (five options, one correct), randomly drawn from a validated question bank. The questions cover both theory concepts and applied problem-solving, ranging from straightforward to more challenging items. The selection process maintains fixed proportions between theory and practice, as well as between levels of difficulty and estimated time to answer, based on parameters obtained from previous evaluations. All questions are reviewed and validated annually by the teaching staff to ensure consistency across editions. The final score is calculated by subtracting 20% from the percentage of correct answers—reflecting the probability of random guessing—and then computing the percentage relative to the maximum achievable score (i.e., all correct answers minus that same 20%). Only slight changes in assessment practices were triggered by the pandemic; they are comparatively summarized in Table S1. Before the pandemic, evaluation consisted of two in-person multiple-choice tests of equal weight. During the early pandemic (2020), the first evaluation was online, but the in-person format was maintained for the final evaluation, as restrictions were lifted by the end of the semester. In the late pandemic period (2021), both tests were conducted online. For online tests, students were organized into small Zoom groups under instructor supervision, to reproduce the conditions of in-person testing as closely as possible. In the post-pandemic period, the pre-pandemic in-person format was reinstated. No additional graded activities contributed to the final mark, and passing the course required achieving at least 50% of the total possible score from the two tests.
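Expressed compactly (our restatement of the scoring rule above, with c denoting the percentage of correct answers and 20% the expected score from random guessing among five options):

```latex
\[
\text{final score (\%)} = \frac{c - 20}{100 - 20} \times 100
\]
```

For instance, a student answering 60% of the questions correctly obtains (60 - 20)/80 x 100 = 50%.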

2.2. Data acquisition and variables

This study aimed to investigate students’ choices among available activities and resources, trace their evolution throughout the pandemic, and examine their correlation with academic performance and dropout rates. Given the potential changes brought about by the COVID-19 pandemic, the data collection period was extended from 2017 to 2023, depending on the availability of data for each variable. Table S2 provides an overview of the variables measured in this study, along with their descriptive statistics and the periods for which data were available. In addition, Scheme 1 presents a glossary of abbreviations to facilitate straightforward reference throughout the work.
Scheme 1 Glossary of abbreviations used in this work, including a brief description of each item.

The analysis is based on independent yearly cohorts of first-year students (2017–2023) enrolled in the same General Chemistry I course. The final dataset included 4511 students who completed at least one assessment (midterm or final test). Students who enrolled but did not participate in any evaluation were excluded from the analysis. Key variables included general student descriptors such as self-reported gender (GDR), course repetition (RP), chosen in-person/online modality (ON), final course grade percentage (GCI), and dropout status (DR). For the latter variable, dropout students were defined as those who took the first test but did not take either the second test or the final exam during the first scheduled period. Additionally, the percentage score of the first evaluation (G1) was included as an indicator of initial academic performance, to account for potential variations in prior knowledge among students entering the university in each cohort.
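A minimal sketch of how this dropout indicator can be operationalized in R (hypothetical data frame and column names, not the authors' actual pipeline):

```r
# Hypothetical per-student assessment records; column names are illustrative.
students <- data.frame(
  id         = 1:5,
  took_test1 = c(TRUE, TRUE, TRUE, FALSE, TRUE),
  took_test2 = c(TRUE, FALSE, FALSE, FALSE, TRUE),
  took_exam1 = c(FALSE, FALSE, TRUE, FALSE, TRUE)  # first scheduled exam period
)

# Keep only students who completed at least one assessment, as in the study.
students <- subset(students, took_test1 | took_test2)

# DR = 1: took the first test but neither the second test nor the first-period exam.
students$DR <- as.integer(students$took_test1 &
                            !(students$took_test2 | students$took_exam1))
```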

Data on students' access to resources as well as on frequency of use and perceived usefulness of each activity or resource were collected from VLE registers and students' surveys. This information is reflected in the variables ending with _A, _U, and _Uf, respectively.

Regarding the _A variables, access registered within the Moodle-based VLE was tracked. For each type of resource, the access variable (_A) represents the percentage of the available materials that each student accessed, calculated relative to the total number of materials available within each resource type (Table 1; e.g., the 14 recorded lectures comprising the resource category ‘theory videos’, TV). Access to the electronic forum deserves a special comment, since it is not expected to be used on a weekly basis but rather serves as an on-demand resource to help students address specific challenges encountered during practice. Consequently, the corresponding dichotomous variable (EF_A, Yes/No) indicates whether students accessed the forum at least once during the semester. To gain deeper insights into the number of students accessing each video and to analyze the average time spent watching them, YouTube analytics tools were also employed. It is worth mentioning that average view time does not differentiate playback speeds and may therefore underestimate the actual proportion of video content watched by students who adjusted the speed.
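As an illustration, the _A percentages and the dichotomous EF_A indicator could be derived from an access-log export roughly as follows (a sketch assuming a hypothetical log layout; the actual Moodle export format will differ):

```r
# Hypothetical access log: one row per (student, resource) access event.
logs <- data.frame(
  student  = c(1, 1, 1, 2, 2, 3),
  type     = c("TV", "TV", "EF", "TV", "EF", "TV"),
  resource = c("TV01", "TV02", "forum", "TV01", "forum", "TV05")
)

# TV_A: percentage of the 14 theory videos each student accessed at least once.
tv   <- subset(logs, type == "TV")
n_tv <- tapply(tv$resource, tv$student, function(x) length(unique(x)))
TV_A <- 100 * n_tv / 14   # 14 = number of available theory videos (Table 1)

# EF_A: whether each student accessed the electronic forum at least once.
ids  <- sort(unique(logs$student))
EF_A <- as.integer(ids %in% logs$student[logs$type == "EF"])
```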

The post-pandemic frequency of use (_U variables) and perceived usefulness (_Uf variables) of resources were assessed through a general survey conducted in 2022 (391 students). Students were asked to rate the frequency with which they engaged with activities, resources or tools using a 4-point Likert scale (4: always, 3: very often, 2: seldom, 1: never). Similarly, perceived usefulness was evaluated by asking students to agree or disagree with statements regarding the tools' usefulness for studying, using a 5-point Likert scale (5: I totally agree, 4: I agree, 3: I neither agree nor disagree, 2: I disagree, and 1: I totally disagree). A portion of this survey, specifically focused on the perceived usefulness of self-paced interactive materials, was also administered in 2017 (pre-pandemic results, published in Veiga and Torres, 2022) and again in 2020 (pandemic results).

Lastly, a survey was conducted among nine instructors involved in the course to further evaluate their perceptions of the activities, resources, and tools provided. The objective was to gauge the perceived effectiveness of these elements in promoting course success and encouraging active student engagement. Instructors were asked to rate statements regarding the usefulness of each activity or tool for passing tests or promoting active learning, using the already mentioned 5-point Likert scale (5: I totally agree, 4: I agree, 3: I neither agree nor disagree, 2: I disagree, and 1: I totally disagree).

The study was reviewed and approved by the institutional Ethical Committee for human subject research of Universidad de la República. The surveyed students were adequately informed of the main objectives and expected outcomes. They were also assured that their participation entailed no risk or benefit for them.

2.3. Statistical analysis

Statistical analysis was conducted on the variables summarized in Table S2, using pairwise deletion to handle missing data. Data processing was performed using STATA 18.0 (StataCorp, 2023). To examine temporal variations, four distinct periods were defined: pre-pandemic (2017–2019), early pandemic (2020), late pandemic (2021), and post-pandemic (2022–2023). As each year (or period) corresponds to a distinct cohort of students, all analyses were performed assuming independent samples, consistent with the cross-sectional design of the study. Average changes and their corresponding 95% confidence intervals (CI) were calculated using statistical methods appropriate to the nature of each variable, fully aligned with procedures recommended for independent-sample comparisons (Agresti, 2018). Specifically, two-sample z-tests with Bonferroni correction were applied to binary outcomes, Kendall's tau-b (τ) to ordinal outcomes, and the Games–Howell test to continuous variables. For associations between categorical variables, Pearson's χ2 test of independence was additionally performed. Regarding access to digital resources—namely, the number of views and average view time of theory videos (TV) and practice videos (PV)—temporal trends were analyzed with a two-factor ANOVA (factors: year [2020–2023] and week [12 weeks]), followed by Tukey's post hoc tests for multiple comparisons (Agresti, 2018). In all analyses, results were reported alongside p-values to assess statistical significance and relevant effect sizes (Cramer's V, Cohen's d) to quantify the strength of the associations.
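For concreteness, the following R sketch reproduces the general shape of these comparisons on simulated data. It is illustrative only: this stage of the study was performed in STATA, and the Games–Howell call shown in the comments (rstatix::games_howell_test) is an assumed tooling choice, not the authors' workflow.

```r
set.seed(1)

# Simulated binary outcome (e.g., dropout) for two independent cohorts.
pre  <- rbinom(1500, 1, 0.17)   # pre-pandemic
post <- rbinom(1500, 1, 0.19)   # post-pandemic

# Two-sample test for proportions; with k pairwise period comparisons,
# a Bonferroni correction means testing at alpha = 0.05 / k.
prop.test(c(sum(pre), sum(post)), c(length(pre), length(post)), correct = FALSE)

# Pearson's chi-squared test of independence plus Cramer's V effect size.
tab <- table(period = rep(c("pre", "post"), each = 1500), y = c(pre, post))
chi <- chisq.test(tab, correct = FALSE)
V   <- sqrt(unname(chi$statistic) / (sum(tab) * (min(dim(tab)) - 1)))

# Kendall's tau (tau-b in the presence of ties) for ordinal outcomes.
likert <- sample(1:5, 3000, replace = TRUE)
cor.test(c(pre, post), likert, method = "kendall")

# Games-Howell post hoc test for continuous outcomes across the four periods
# (assumed tooling; uncomment if the rstatix package is installed):
# rstatix::games_howell_test(data.frame(g = factor(rep(1:4, each = 100)),
#                                       y = rnorm(400)), y ~ g)

# Two-factor additive ANOVA for the video metrics (year x week),
# followed by Tukey's post hoc comparisons on the year factor.
d <- expand.grid(year = factor(2020:2023), week = factor(1:12))
d$views <- rnorm(nrow(d), mean = 1, sd = 0.2)
fit <- aov(views ~ year + week, data = d)
summary(fit)
TukeyHSD(fit, which = "year")
```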

To evaluate the association of each student descriptor (xi) with the selected response variable (y), while controlling for the other variables, multiple regression models were employed as implemented in the R software (R Foundation for Statistical Computing, 2020). The first-test mark (G1) was included as a covariate in the regression models to account for initial academic preparedness, thereby controlling for baseline performance when examining the associations between explanatory variables and the outcomes. For continuous response variables, multiple linear regression was applied, using the leaps package (Lumley, 1997) to select the subset of descriptors xi that maximizes the adjusted R2. For binary response outcomes, logistic regression was employed, with model selection performed using AIC-based stepwise selection via the stepAIC function within the MASS library (Venables and Ripley, 2002). For ordinal response variables, ordered logistic regression models were used via the polr command from the MASS package. The proportional odds assumption was tested using Brant's test, and when this assumption was violated, a partial proportional odds model was fitted using the VGAM package (Yee, 2010). Model optimization was done manually by minimizing the AIC while ensuring that coefficients did not suffer from the Hauck–Donner effect. Predictive accuracy for logistic and ordinal models was evaluated using the overall accuracy coefficient (ACC) (Agresti, 2018).
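A compact sketch of this model-selection workflow on simulated data (variable names follow the paper; the simulated effects and the brant/VGAM calls noted in comments are illustrative assumptions):

```r
library(leaps)   # regsubsets: best-subset linear regression (Lumley, 1997)
library(MASS)    # stepAIC, polr (Venables and Ripley, 2002)

set.seed(2)
n  <- 400
df <- data.frame(
  GDR = rbinom(n, 1, 0.32),   # gender
  RP  = rbinom(n, 1, 0.24),   # repeater
  ON  = rbinom(n, 1, 0.08),   # online modality
  G1  = rnorm(n, 50, 15)      # first-test mark (baseline covariate)
)
df$GCI <- 10 + 0.6 * df$G1 - 4 * df$RP + rnorm(n, 0, 10)         # final mark
df$DR  <- rbinom(n, 1, plogis(-1 - 0.03 * df$G1 + 0.5 * df$RP))  # dropout

# Continuous outcome: choose the descriptor subset maximizing adjusted R^2.
ss   <- summary(regsubsets(GCI ~ GDR + RP + ON + G1, data = df))
best <- which.max(ss$adjr2)
ss$which[best, ]   # descriptors retained in the best model

# Binary outcome: logistic regression with AIC-based stepwise selection.
m0 <- glm(DR ~ GDR + RP + ON + G1, data = df, family = binomial)
m1 <- stepAIC(m0, trace = FALSE)

# Ordinal outcome (e.g., a 4-point frequency-of-use item): proportional odds model.
df$use <- factor(sample(1:4, n, replace = TRUE), ordered = TRUE)
m2 <- polr(use ~ G1 + RP, data = df, Hess = TRUE)
# The proportional odds assumption can be checked with brant::brant(m2); if it
# is violated, a partial proportional odds model can be fitted with VGAM::vglm.
```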

To further strengthen the statistical investigation into student dropout, a survival analysis was conducted using the R package survival (Therneau, 2015; Therneau and Grambsch, 2000). For this analysis, we considered data from 571 students who attended in-person practice sessions in 2022. Attendance was considered to have ceased when a student was consecutively absent through the final week of classes (week 12). To account for the policy of allowing up to two absences during the course, students who did not drop out (DR = 0) were censored at week 12, even if they stopped attending between weeks 10 and 12. This approach ensures that the analysis reflects the maximum permitted number of absences and accurately captures the dropout patterns for the course. Only general student descriptors were included in the Cox model. Variables related to course resources were excluded, since the 2022 survey was administered to students at the end of the term; there are therefore no data to analyze their relationship with the disengagement hazard over time. The Cox proportional hazards model (Harrell, 2001) was then employed to evaluate the effect of various covariates on the risk of dropout. The proportional hazards assumption was assessed using a proportional hazards (PH) test, yielding a statistically non-significant result (p = 0.76).
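The corresponding R workflow can be sketched as follows (simulated data; the covariates and censoring rule mirror the description above):

```r
library(survival)  # Surv, coxph, cox.zph (Therneau, 2015)

set.seed(3)
n <- 571
dat <- data.frame(
  GDR = rbinom(n, 1, 0.32),
  RP  = rbinom(n, 1, 0.24),
  G1  = rnorm(n, 50, 15)
)

# DR: dropout indicator; last_week: last week of recorded attendance.
dat$DR        <- rbinom(n, 1, 0.2)
dat$last_week <- ifelse(dat$DR == 1, sample(1:11, n, replace = TRUE), 12)
# Non-dropouts are censored at week 12 even if their attendance stopped in
# weeks 10-12, reflecting the two-permitted-absences policy described above.

fit <- coxph(Surv(last_week, DR) ~ GDR + RP + G1, data = dat)
summary(fit)   # hazard ratios for each student descriptor
cox.zph(fit)   # proportional hazards (PH) assumption test
```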

3. Results and discussion

3.1. Students’ general descriptors: the impact of the pandemic

Fig. 2 depicts the evolution of students’ general descriptors (gender, GDR; repeater students, RP; online modality, ON; dropout, DR; and final marks, GCI) across the following operationally defined periods: pre-pandemic (2017–2019), pandemic 2020–2021 (early pandemic, 2020; late pandemic, 2021), and post-pandemic (2022–2023). Tables S3 and S4 summarize the key metrics for straightforward comparison, and a more detailed discussion is available in Section SS1, SI.
Fig. 2 Temporal evolution of students’ descriptors. Mean values of dichotomous variables are expressed as proportions: GDR = gender (Female = 0; Male = 1), RP = repeater students (NO = 0; YES = 1), ON = online students (NO = 0; YES = 1), DR = dropping-out students (NO = 0; YES = 1). Final marks, GCI, were normalized to fit the scale.

As previously mentioned, Universidad de la República is a free, open-enrollment public university that attracts a relatively homogeneous population of full-time students aged 18–20, largely from similar socioeconomic backgrounds (Perera, 2018; Torello and Casacuberta, 2020). This homogeneity, together with the large sample sizes and standardized course assessments, reduces the likelihood that the observed temporal differences in performance or dropout arise from pre-existing disparities among cohorts. The average values shown in Fig. 2 and Table S3 therefore reflect population-level trends rather than individual variability.

Results show that the gender distribution (mean values for the general sample are 32% male and 68% female) has not changed substantially over time. Pearson's chi-square tests confirmed no significant association between gender and period (e.g., χ2(1) = 1.44, p = 0.230, V = 0.022 for pre- vs. post-pandemic).

The overall repetition rate (RP, i.e. students repeating the course) is 23.5% for the general sample, with most repeater students having dropped out during the previous edition of the course. With statistical significance at the 95% level, the pandemic led to a 6.1% rise in repeater students enrolled in 2020 (95% CI [1.6, 10.7]; χ2(1, N = 2413) = 11.24, p = 0.001, V = 0.068), with this increment persisting as a 5.5% rise from the pre-pandemic to post-pandemic periods (95% CI [1.8, 9.3]; χ2(1, N = 3054) = 13.43, p < 0.001, V = 0.066). Even though many different personal situations could have led to the observed trend during the crisis, the persisting increment in the post-pandemic period may be due to the existence of the new flexible resources. These might have attracted students who had previously taken earlier editions of the course by improving their experience in terms of novelty and/or time flexibility, as previously observed (Campbell and Blankenship, 2020).

Overall, 7.9% of the general sample opted for the distance learning option (ON). A statistically significant average increase of 4.9% in enrollment in the online modality was observed during 2020 (95% CI [2.1, 7.7]; χ2(1, N = 2413) = 18.59, p < 0.001, V = 0.088). At that time, in-person practice sessions were not possible and were thus substituted by recorded practice sessions (PV) and Zoom video tutoring sessions with mandatory attendance for students enrolled in the in-person modality. In that scenario, a higher proportion of students chose the online modality, in which they received specific virtual materials covering the content of the practice sessions instead of attending the pre-scheduled Zoom video conferences. Interestingly, despite a statistically significant 3.7% initial reduction in online enrollment from early to late pandemic, i.e. 2020 to 2021 (95% CI [−7.3, 0.0]; χ2(1, N = 1457) = 6.29, p = 0.012, V = 0.066), the increased preference for this modality remained nearly intact at 4.7% after the return to in-person practice sessions in the post-pandemic period (95% CI [2.3, 7.1]; χ2(1, N = 3054) = 23.47, p < 0.001, V = 0.088). Therefore, the pandemic seems to have encouraged a lasting shift towards a virtual self-paced learning format. This aligns with findings from other recent studies, in which the general public (Asare et al., 2021) and also undergraduate students (Stevanović et al., 2021) express more positive attitudes towards distance learning, particularly because of the flexibility it affords (Müller and Mildenberger, 2021; Müller et al., 2023). Such flexibility has been shown to enhance self-regulated learning behaviors, allowing students to manage their time and learning processes more autonomously (Demir, 2024). However, despite the observed increase, post-pandemic online enrollment still remains low (under 11%), indicating a general reluctance to fully embrace distance learning. This trend has also been observed in other settings, often linked to external factors such as students' living conditions, a perceived lack of institutional support, and dissatisfaction with teaching quality and engagement in online environments (Li et al., 2023; Steyn et al., 2024).

Regarding dropout rates (general sample mean value of 19%), the pandemic in 2020 generated a statistically significant 19.8% increase in student disengagement from the course (95% CI [15.4, 24.3]; χ2(1) = 123.31, p < 0.001, V = 0.226). This phenomenon occurred in conjunction with a 9.1% drop in academic performance (95% CI [−11.8, −6.4]; Games–Howell q(1268) = −12.20, p < 0.001). Both trends may reflect the effects of isolation and uncertainty, which contributed to emotional stress and anxiety, as well as possible disruptions in socioeconomic status with the consequent need to balance studies with work, among other influencing factors (Udelar, 2020; Del Savio et al., 2022; Sanz and López-Iñesta, 2022; Martínez-Líbano and Yeomans-Cabrera, 2023; Tang and He, 2023). Remarkably, the data reveal a partial reversal of this trend from 2020 to 2021, with a statistically significant 14.0% reduction in the dropout rate (95% CI [−19.8, −8.1]; χ2(1) = 35.97, p < 0.001, V = 0.157) and a 7.5% increase in final course marks (95% CI [4.3, 10.6]; q(1439) = 8.64, p < 0.001). This suggests another shift in student behavior, potentially reflecting resilience recovery, though it is also possible that the newly designed flexible resources played a role in this context by fostering greater engagement in the course (Campbell and Blankenship, 2020). Finally, there is no statistically significant change in student dropout rates between the pre- and post-pandemic periods (χ2(1) = 2.15, p = 0.143, V = 0.027). Similarly, even though the average final marks, GCI, dropped by 3.6% from 2017–2019 to 2022–2023 (95% CI [−5.8, −1.6]; q(2920) = −6.32, p < 0.001), the variation between 2019 and 2022–2023 is not statistically significant (−0.7%; 95% CI [−2.9, +1.5]; q(1071) = −0.86, p = 0.544). As a result, dropout rates and academic performance appear to have returned to levels comparable to those seen just before the pandemic.

The overall mean GCI was 29.1%, substantially below the 51% threshold required to pass, and only a modest proportion of students achieved a passing grade (19.6%), fluctuating between 16.2% and 23.0% during the 2017–2023 period (Table S3). However, it is worth mentioning that students who achieve at least 30% in the overall course grade can take the final exam in the first examination period. Many of them do so successfully, reaching the required 51% and continuing to progress in the degree program. Although the exam takes place just two weeks after the second test, students’ increased familiarity with university routines—and likely greater effort in this final opportunity—results in a significant proportion of them passing. Nevertheless, on average, 25% of students are unable to continue in the program during the second semester, as they do not achieve a passing grade either during the course or in the final exam.

Regarding the consistently low academic performance observed, it is worth recalling that enrollment is free of charge and that no admission test or quota applies. This accounts for a high occurrence of multiple first-year enrollments and dropout. Furthermore, the observed low performance may also reflect the well-documented challenges associated with large introductory science courses in the first year of university studies. These courses are characterized by high student–teacher ratios and wide variability in students’ prior learning experiences, which together can hinder the acquisition of fundamental chemical concepts and limit opportunities for individualized feedback (Flaherty et al., 2015). In addition, affective factors such as students’ motivation, academic identity, and sense of belonging have been shown to significantly influence success in introductory chemistry, often contributing to performance disparities across demographic groups (Chestnut and Johnson, 2025). The transition from secondary to tertiary education further compounds these difficulties, as mismatches between students’ and educators’ perceived preparedness—especially in mathematics and chemistry—can leave many students struggling to meet university-level expectations (Leong et al., 2021). In the Uruguayan context, these global trends are likely intensified by heterogeneous secondary school preparation, the abrupt transition to a large, highly demanding academic environment, and the cognitive load imposed by abstract chemical concepts.

3.2. Students' access to virtual resources and their evolution across the pandemic

To gain insight into the students' learning preferences, access to the various virtual resources (_A variables, tracked via the registered access within Moodle) was analyzed. This included access to materials for theory understanding (theory videos, TV_A; complementary animated videos, AV_A; and recommended reading material, RM_A), materials for practice training (practice videos, PV_A), and student–instructor interaction (via the electronic forum, EF_A). Access to the self-paced interactive materials (PDF files) was excluded from this assessment, as this access is required to follow each practice session. Moreover, accessing these materials does not necessarily imply the use of the embedded interactive tools (Fig. 1), whose usage cannot be tracked, as they are not operated online.

Results are depicted in the boxplots of Fig. 3, where the students' access behavior is described by plotting the proportion of materials accessed by students from 2020 to 2023 relative to the total number of materials available for each resource (data prior to this period are unavailable due to the scheduled automatic updates to the University servers). The results depicted in Fig. 3a show that the theory videos (TV_A) and the recommended reading materials (RM_A) are the resources with the highest levels of access among students. Videos of practice sessions and especially complementary animated videos show comparatively lower access percentages (PV_A and AV_A, respectively). This trend aligns in principle with the general design of the course. Since attending practice sessions is mandatory in the in-person modality, practice session videos are probably resorted to only by students in the online modality—who do not attend practice sessions—and by students in the in-person modality who need to revisit or clarify ideas or solving strategies within the worked examples. On the other hand, animated videos are designed merely as a complement for especially complex theory concepts, so the corresponding access variable is expected to be low.


Fig. 3 Temporal evolution of the students’ access to the available resources. (a) Boxplots of the average percentage access to theory videos (TV_A), complementary animated videos (AV_A), practice videos (PV_A) and recommended reading material (RM_A). The mean is represented with a cross. (b) Proportion of students that accessed the electronic forum (EF_A) in the semester.

To evaluate the statistical significance of temporal changes, Table S5 presents the average variations and confidence intervals of access across the different periods. From 2020 to 2021, the data reveal a statistically significant increase in access to virtual materials, with a rise exceeding 20% across all resources: theory videos, TV_A = 23.8% (95% CI [19.6, 28.0]; q(1427) = 20.5, p < 0.001), complementary animated videos, AV_A = 22.2% (95% CI [19.0, 25.4]; q(1239) = 25.4, p < 0.001), practice videos, PV_A = 20.7% (95% CI [16.6, 24.9]; q(1365) = 18.1, p < 0.001), and recommended reading materials, RM_A = 22.1% (95% CI [17.5, 26.8]; q(1445) = 17.3, p < 0.001). This upward trend is likely linked to students' increased familiarity with virtual resources as the pandemic went by (Rodríguez and Pulido-Montes, 2022; Stecuła and Wolniak, 2022). After the return to in-person activities in 2022, a decline in access to resources was observed for the sample, with the decrease being statistically significant for complementary animated videos, AV_A = −9.0% (95% CI [−12.7, −5.2]; q(1373) = −8.7, p < 0.001), practice videos, PV_A = −18.1% (95% CI [−22.5, −13.7]; q(1368) = −14.9, p < 0.001), and recommended reading materials, RM_A = −5.7% (95% CI [−10.7, −0.7]; q(1349) = −4.1, p = 0.019), but not for theory videos, TV_A = −4.1% (95% CI [−8.7, 0.4]; q(1360) = −3.3, p = 0.094). The sharpest decrease is observed for videos of practice sessions, which changed in 2022 from Zoom video conferences to in-person meetings (both on a pre-scheduled, mandatory-attendance basis). This suggests that students adopted a blended learning approach, combining watching theory videos with in-person practice session attendance, a trend that has been observed previously (Cobo-Rendón et al., 2022; Anthony Angwaomaodoko, 2024). In the evaluated post-pandemic scenario (2022–2023), the most frequently accessed resources are still the videos of theory lectures (TV_A) and the recommended reading material (RM_A), whereas the animated (AV_A) and practice session videos (PV_A), which provide complementary support, show a comparatively lower average access rate (<30%).

To deepen the analysis of students’ video-watching behavior, data were collected for two key indicators of the most accessed and content-related audiovisual resources: theory videos (TV) and practice videos (PV). The indicators included (i) the relative number of views (proportion of the total number of views relative to enrolled students) and (ii) the total view time (average percentage of the video's total duration watched by viewers). Each week, one theory and one practice video were made available to students in the virtual learning environment (VLE). The indicators described were computed separately for each theory and practice video on a weekly and yearly basis throughout the course (14 weeks for TV and 12 weeks for PV). The relative number of views, spanning the 2020–2023 period, should be considered approximate in terms of cohort-level view counts, as some views may correspond to students from previous years who retained access to the VLE after completing the course. Despite this limitation, it is notable that both types of recorded materials exhibit interesting trends.

The results are summarized in Fig. S1, and a more detailed discussion, including a two-factor ANOVA centered on the observed variations, is available in Section SS2, SI. As a general trend, the number of views per video diminishes as the course progresses (Fig. S1a and c), regardless of the year considered. This is, in principle, expected behavior: at the beginning of term students probably sample different resources to decide which they prefer and, as the course goes by, the total number of students engaged in each kind of activity or resource declines. For theory lecture views (TV), this trend was confirmed by a significant effect of week, F(13, 39) = 5.01, p = 4.3 × 10−5, partial η2 = 0.63, with a total reduction of 121%. Similarly, for practice videos (PV), a significant effect of week was also observed, F(11, 33) = 11.21, p = 3.0 × 10−8, partial η2 = 0.79, corresponding to a 71% reduction in the number of views.

On the other hand, a statistically significant difference in the total number of views per enrolled student was observed for both resources across the years. For theory videos, TV, the effect of year was significant, F(3, 39) = 23.43, p = 7.7 × 10−9, partial η2 = 0.64, with post hoc tests showing that data from 2023 (M = 0.59) had markedly fewer views than data from 2020 (M = 1.43, p < 0.001, Cohen's d = 2.57), 2021 (M = 1.42, p < 0.001, d = 2.53), and 2022 (M = 1.46, p < 0.001, d = 2.66), but no significant differences among 2020–2022 (all p > 0.98) were observed. For practice videos, PV, the year effect was also significant, F(3, 33) = 12.15, p = 1.6 × 10−5, partial η2 = 0.52, with post hoc results showing that views in 2022 (M = 0.81) and 2023 (M = 0.72) were both significantly lower than in 2020 (M = 1.10; p = 0.010, d = 1.37; p < 0.001, d = 1.79, respectively) and 2021 (M = 1.15; p = 0.002, d = 1.63; p < 0.001, d = 2.05, respectively), but not different from each other (p = 0.73). In this case, however, it must be taken into account that attending in-person practice sessions was again mandatory during the 2022–2023 period for students enrolled in the in-person modality. Taken together, these results indicate a declining preference for this resource, suggesting a shift toward a more balanced strategy that integrates in-person and online resources in the post-pandemic period (Fig. S1a and c).

The percentage of view time for both theory (TV) and practice (PV) videos is notably low and declines across enrollment years. In the last registered year, 2023, students watched recorded lectures and practice sessions for an average of 23% and 27% of the total video duration, respectively. Moreover, recorded lectures and practice sessions were watched for longer in 2020, with view times exceeding those of later years by 11.1–13.5 and 3.8–9.0 percentage points of the video duration, respectively. These results are statistically significant. For theory video (TV) view time, ANOVA results showed a significant effect of year, F(3, 39) = 58.59, p = 1.6 × 10−14, partial η2 = 0.82, with view times significantly longer in 2020 (M = 35.44%) compared to 2021 (M = 24.34%, p < 0.001, d = 3.60), 2022 (M = 21.96%, p < 0.001, d = 4.38), and 2023 (M = 22.70%, p < 0.001, d = 4.14), but no differences among the last three years (all p > 0.18). For practice video (PV) view time, the year effect was also significant, F(3, 33) = 43.75, p = 1.3 × 10−11, partial η2 = 0.80, with 2020 (M = 32.33%) showing higher engagement than 2021 (M = 28.49%, p < 0.001, d = 1.98), 2022 (M = 23.35%, p < 0.001, d = 4.63), and 2023 (M = 27.15%, p < 0.001, d = 2.67). Furthermore, view time in 2022 was significantly lower than in 2023 (p < 0.001, d = 1.96).

While the possibility of accelerated playback cannot be ruled out, such low figures suggest that students select specific parts of the videos to focus on particular content or explanations. This observation aligns with previous research reporting that most students self-identify as having short attention spans and therefore tend to prefer short videos (Patterson et al., 2020), which are often perceived as more engaging and have been linked to improved academic performance (Manasrah et al., 2021). On the other hand, although the results reveal a statistically significant difference across weeks for both resources (theory videos, TV: F(13, 39) = 3.00, p = 0.004, partial η2 = 0.50; practice videos, PV: F(11, 33) = 12.68, p = 6.6 × 10−9, partial η2 = 0.81), the variation in total view time is minimal, ranging from 4.9% to 5.3% of the video duration (Fig. S1b and d). Again, this phenomenon may suggest that, after an initial embrace of both virtual resources, students progressively shifted toward a blended approach combining in-person and online participation.

Electronic forum access percentages and their temporal evolution (Fig. 3b and Table S5) deserve particular consideration. The electronic forum is an optional resource designed to facilitate student–instructor interaction through a personalized approach that addresses individual learning challenges and provides guidance on specific aspects such as problem-solving strategies and calculations. Since students do not always require this specific type of interaction—and can also seek similar support during practice or tutoring sessions—they are not expected to access the electronic forum once a week, but on demand, depending on their needs and preferences. In general, the results show that a very high percentage of students do access the electronic forum at least once during the semester. Focusing on the evolution across the pandemic, access to the electronic forum (EF_A) grew by 8.6% from 2020 to 2021 (95% CI [5.1, 12.1]; χ2(1, N = 1457) = 37.07, p < 0.001, V = 0.16). Afterwards, it showed a continuous and statistically significant decline: −4.8% from 2021 to 2022 (95% CI [−8.0, −1.6]; χ2(1, N = 1376) = 14.37, p < 0.001, V = 0.10), and −34.5% from 2022 to 2023 (95% CI [−40.5, −28.4]; χ2(1, N = 1359) = 204.92, p < 0.001, V = 0.39), in line with the behavior of the rest of the analyzed virtual resources.

3.3. Preferences, performance and dropout: a post-pandemic picture

3.3.1. Instructors’ perceived usefulness of available activities and resources. The activities and resources offered in the course (see Table 1) encompass diverse and redundant approaches that students can select according to their individual learning styles, schedules, and other preferences, while also following the course's recommendations. Based on this, a post-pandemic survey was conducted to explore instructors’ perceptions of the available activities and resources, focusing on their perceived usefulness for passing tests and/or promoting active learning. Beyond the initial pedagogical design and the existing course recommendations for each activity or resource, instructors’ perceptions are expected to provide a more realistic point of view on how effectively each strategy addresses the concepts and competencies evaluated in the tests, as well as on its potential to promote active learning—based on direct student feedback received during in-person practice sessions.

Fig. S2 shows the general results of the survey using a 5-point Likert scale (5: I totally agree, 4: I agree, 3: I neither agree nor disagree, 2: I disagree, and 1: I totally disagree). Regarding the perceived usefulness for passing tests, recommended reading materials (RM), practice sessions (either in-person, P, or recorded, PV), student–instructor interaction (either by attending tutoring sessions, TS, or participating in the electronic forum, EF), completing the mock tests implemented in 2022 (M), and employing the embedded clues (CL) or feedback (FK) have a median value equal to or higher than 4 (4 = I agree, 5 = I totally agree). This indicates that more than half of the instructors either agreed or strongly agreed on the usefulness of these activities and resources in helping students succeed in their tests. Notably, the most highly valued activities and resources in terms of test preparation are those that focus on the core content of the course, which is more likely to be assessed in examinations.

In addition, more than half of the course instructors disagreed about the usefulness of attending the purely didactic theory lectures (T), watching complementary animated videos (AV), or using embedded tools that focus on daily-life curiosities (KW) or higher-level content sections (KM, Hy) for succeeding in tests. Except for theory lectures, these strategies are less directly aligned with the core content assessed in the tests, which likely explains their perceived lower relevance in this context. It is particularly interesting to note that instructors disagree with the usefulness of attending educator-centered traditional theory lectures (T), but not of watching the recorded videos (TV), to succeed in tests. This perception is likely grounded in the possibility of pausing and selecting specific segments of the videos, thus transforming them into a more active, self-paced resource. On the other hand, the more student-centered in-person activities, such as practice sessions (P) and tutoring sessions (TS), are rated just as high as watching the corresponding practice videos (PV) or participating in the electronic forum (EF), respectively. This suggests that, in the instructors’ view, both traditional and virtual alternatives provide comparable value for passing tests.

In terms of promoting active learning (Fig. S2), the survey's results align closely with the intended design of each activity or resource. Attending theory lectures (T) or watching videos (TV, AV, PV) are generally not perceived by instructors as fostering active learning. Conversely, student-centered activities such as practice sessions (P) or tutoring sessions (TS), participation in the electronic forum (EF), completing mock tests (M), and especially using interactive tools like clues (CL) or feedback (FK) are consistently recognized for promoting active learning.

3.3.2. Students' choices for available activities and resources: frequency of use and perceived usefulness. To provide a post-pandemic overview of students' preferences, a survey of 391 students was conducted in 2022, focusing on the frequency of use and perceived usefulness of the available resources (variables ending in _U and _Uf in Table S2, respectively). The mock tests, M, implemented in 2022 were also included (their use was evaluated as a yes/no dichotomous variable instead of the frequency scale used for the other resources). The results of this survey are depicted in Fig. 4.
Fig. 4 Students' responses from the 2022 survey regarding the frequency of use and perceived usefulness of the available activities or resources. (a) Percentage of students who reported frequently (always or very often) using the course resources or finding them useful or very useful. (b) Students' perceived usefulness of the tools embedded in the self-paced interactive materials (Fig. 1).

Interestingly, according to Spearman's rank analysis, there is a weak to moderate correlation between the access (_A) and declared frequency of use (_U) of each resource (all ρ < 0.5; Table S6). The strongest association is observed for theory videos (TV; ρ = 0.49, p < 0.001), followed by practice videos (PV; ρ = 0.45, p < 0.001), recommended reading materials (RM; ρ = 0.40, p < 0.001), and complementary animated videos (AV; ρ = 0.25, p < 0.001). In contrast, the correlation for electronic forum use is very weak and not statistically significant (EF; ρ = 0.10, p = 0.06). Although there is some association between the degree of access to the resources and their frequency of use, these variables capture different aspects of student engagement. Resource access reflects students' interest in engaging with and acknowledging a given resource, but it may be biased by factors such as visibility, the emphasis and promotion provided by instructors, and the possibility of access during practice sessions. Frequency of use, on the other hand, reflects the actual use students make of the resource. Going beyond virtual resources, the frequency of use and perceived usefulness of other optional activities, such as attending in-person theory lectures or tutoring sessions (T_U, T_Uf, TS_U and TS_Uf in Fig. 4a), were also assessed in the survey. Attending practice sessions was not included, since this activity is mandatory for students enrolled in the traditional modality and not applicable to online students.

In general terms, the frequency of use and perceived usefulness profiles are similar for each activity or resource (Fig. 4a). As expected, the activities and resources that students use most frequently are also the ones they consider most useful for learning. In fact, there is a moderate to high correlation between the reported frequency of use (_U) and the perceived usefulness (_Uf) of the available resources (most ρ > 0.5; Table S7). Specifically, the strongest associations are observed for practice videos (PV; ρ = 0.74, p < 0.001), complementary animated videos (AV; ρ = 0.73, p < 0.001), recommended reading materials (RM; ρ = 0.71, p < 0.001), and theory videos (TV; ρ = 0.60, p < 0.001). Correlations of comparable magnitude are found for the electronic forum (EF; ρ = 0.60, p < 0.001) and tutoring sessions (TS; ρ = 0.56, p < 0.001), whereas mock tests (M) show a weaker, though still significant, association (ρ = 0.47, p < 0.001). These results are consistent with the widely accepted view that individuals adopt new technologies through a reasoned process influenced by perceived ease of use, perceived usefulness, attitude toward use, and behavioral intention (Mortenson and Vidgen, 2016). Previous findings have identified perceived usefulness as a key factor in students’ intention to accept and use e-learning in higher education (Elkaseh et al., 2016).
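
Both of these rank-correlation checks (access vs. declared use, Table S6; declared use vs. perceived usefulness, Table S7) reduce to the same R call, sketched below on simulated vectors; the variable names mirror the TV_A/TV_U/TV_Uf labels of Table S2, and exact = FALSE is used because exact p-values are unavailable with tied ranks.

set.seed(2)
n <- 391
TV_A  <- rpois(n, 6)                                   # platform access count (simulated)
TV_U  <- pmin(5, pmax(1, round(TV_A / 2 + rnorm(n))))  # declared frequency of use, 1-5
TV_Uf <- pmin(5, pmax(1, TV_U + sample(-1:1, n, replace = TRUE)))  # perceived usefulness, 1-5
cor.test(TV_A, TV_U,  method = "spearman", exact = FALSE)  # access vs. declared use (_A vs. _U)
cor.test(TV_U, TV_Uf, method = "spearman", exact = FALSE)  # use vs. usefulness (_U vs. _Uf)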

Among the most frequently used and useful resources are the theory videos, TV, the recommended theory reading materials, RM, and the mock tests, M. The recorded practice sessions, PV, as well as attendance at theory lectures, T, show intermediate use and usefulness, while the complementary animated videos, AV, the tutoring sessions, TS, and the electronic forum, EF, are the least used and least useful resources. It is worth noting that although most students access the electronic forum at least once (Fig. 3b), its actual use and perceived usefulness are comparatively lower than those of other resources.

As stated before, the self-paced practice materials are expected to be accessed by all students, since they are essential for following each practice session. However, to better understand students’ perceived usefulness of the embedded interactive tools (Fig. 1), part of the 2022 survey specifically addressed their evaluation of each tool. The results, depicted in Fig. 4b, indicate that solving clues (CL) and feedback answers (FK) are the tools students find most useful. This is likely because these tools are designed to help students actively develop the practical skills required to solve the problems and exercises included in the interactive materials, which closely resemble those later assessed in the tests.

The results depicted in Fig. 4 provide valuable insight into students’ preferences for in-person versus remote activities and resources. It is particularly interesting to compare the results for the theory-focused activities and resources: attending in-person lectures (T_Uf), watching recorded lectures (TV_Uf), watching complementary animated videos (AV_Uf), and consulting recommended reading materials (RM_Uf). For content-equivalent in-person and recorded lectures, 79% of students favored the flexibility of watching videos, possibly appreciating advantages such as the ability to control the pace, pause for notetaking, or skip to relevant sections. In contrast, only 45% found scheduled traditional lectures useful, revealing a clear preference for more flexible, self-directed learning formats, even at the expense of losing real-time interaction and opportunities for questioning and discussion. This result may have multiple causes, but it can be partly rationalized by the large enrolment of the course, which limits the amount of individual interaction time each student can receive from the instructors. Moreover, virtual tools offer students greater flexibility in scheduling, as well as significant savings in travel and time, factors that are highly valued and may contribute to the overall lower perceived usefulness of in-person student–instructor interaction. Beyond the specific context of this work, although it was initially assumed that university students would return to in-person classes as soon as possible, attendance after reopening has generally remained lower than expected (Mehta et al., 2024). Time-saving learning is indeed highly valued by students in general (Rodríguez-Rodríguez et al., 2020; Stevanović et al., 2021) and at the university under study (Udelar, 2020). Regarding current students’ preferences for in-person vs. remote modalities, previous studies report differing results, probably modulated by the quality of the modalities offered. Moreover, student behavior does not always align with their declared modality preference, especially for those indicating a preference for online instruction (Larson et al., 2023).

Interestingly, students’ preferences regarding in-person vs. remote modalities align to some extent with the instructors' survey results, which reflect disagreement on the usefulness of attending purely didactic in-person theory lectures but a more neutral stance on the value of watching the corresponding recorded lecture videos to succeed in tests (Fig. S2). Notably, the same trend is observed for practice-focused activities: the perceived usefulness of practice videos, PV_Uf, shows 63% agreement vs. 47% for in-person attendance at practice sessions, P_Uf. The very high perceived usefulness of recorded practice sessions may be linked to the possibility of revisiting specific concepts or resolution steps of the worked example, which aligns with the short mean view time observed for these resources (Fig. S1). Instructors rated both recorded and in-person practice sessions as equally useful for succeeding in tests (Fig. S2). This perception is likely influenced by their higher appreciation of in-person interactions among students and educators—an aspect shaped by social constructivist conceptions (Amineh and Asl, 2015) and by their own teaching experiences. Furthermore, the perceived usefulness of attending tutoring sessions (TS_Uf, 40% agreement) is similar to that of using the electronic forum (EF_Uf, 42%). The latter stands out as less valued by students compared to other available virtual options, although the instructors’ survey indicated a higher perceived usefulness of forum participation for succeeding in tests (Fig. S2). In addition, the self-paced tools of solving clues and feedback answers (CL_Uf, FK_Uf), as well as doing mock tests (M_Uf), are perceived as more useful than the in-person approach of tutoring sessions, TS, both by students and instructors. The remaining tools embedded in the interactive materials are comparatively valued as less useful (KW_Uf, KM_Uf, Hy_Uf) (Fig. S2).

From another point of view, the available activities and resources can be compared in terms of how closely they relate to the content usually assessed in tests. Students’ perceived usefulness results show that resources or tools that go beyond the learning objectives, such as the complementary animated videos (AV) and the “know what” (KW), “know more” (KM) and “a bit of history” (Hy) tools, are positively valued by few students (<35%). Accordingly, instructors’ survey responses did not indicate agreement on the usefulness of these activities or resources for passing the tests. On the other hand, the activities and resources that most enhanced students' positive perceptions were mainly those directly related to following the course content assessed in tests: attending theory lectures, T, watching recorded theory lectures, TV, watching recorded practice sessions, PV, reading recommended reading materials, RM, using interactive solving clues, CL, using interactive feedback answers, FK, and doing mock tests, M. This trend has already been observed in other scenarios and reflects a marked preference for time-saving strategies that lead to better test performance (Coffin Murray et al., 2012). Here too, students’ views match the instructors’ survey results (Fig. S2).

The available activities can also be compared on their effectiveness in fostering active learning, according to the instructors' survey (Fig. S2). Activities promoting student–instructor interaction, such as in-person tutoring sessions (TS) and participation in the electronic forum (EF), show low perceived usefulness among students. On the other hand, students attribute high usefulness to watching recorded theory lectures (TV), an activity that is not expected to significantly promote active learning (in line with instructors’ responses, Fig. S2). However, the short average view time observed for recorded lectures may indicate that students select short parts of the available videos to revisit concepts, thus employing a time-saving and more active approach. Previous findings suggest that lecture videos are often used to complement rather than replace in-person interaction, enabling students to engage in more active, self-directed study by pausing, reviewing, or rewatching content (Topale, 2016). They are indeed used for exam preparation, revision, and clarifying lecture content. The flexibility offered by theory recordings is valued by students, since recordings provide accessibility and support for diverse student needs, complementing in-person instruction (Nkomo and Daniel, 2021). In line with this, instructors disagreed on the effectiveness of attending theory lectures but acknowledged the value of watching the corresponding videos for succeeding in tests. Lastly, regarding embedded tools, those most oriented toward active learning, solving clues (CL) and feedback answers (FK), were again rated as the most useful, both by students and instructors.

To establish how the perceived usefulness of the tools varied with time, responses were compared with those of similar questionnaires administered before the pandemic in 2017 (Veiga and Torres, 2022) and during it in 2020. The findings, detailed in Table S8, reveal that perceived usefulness varied over time. On average, solving clues, CL (τ = 0.12, p = 0.0013), feedback answers, FK (τ = 0.17, p < 0.001), and “know more” sections, KM (τ = 0.17, p < 0.001), saw a statistically significant increase in perceived usefulness during the pandemic, whereas “know what” sections, KW (τ = −0.04, p = 0.278), and “a bit of history” sections, Hy (τ = 0.05, p = 0.150), did not. As in-person activities resumed (2022), the perceived usefulness of solving clues, CL (τ = 0.04, p = 0.233), and feedback answers, FK (τ = −0.03, p = 0.343), remained relatively stable with respect to 2017, while the usefulness of the other tools experienced a statistically significant decline (KW: τ = −0.29, p < 0.001; KM: τ = −0.21, p < 0.001; Hy: τ = −0.24, p < 0.001).
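
A sketch of this kind of trend test is shown below, assuming the two survey waves are pooled and Kendall's τ is computed between the wave indicator and the 1–5 rating; the responses and wave sizes are simulated, so the τ reported by the snippet is illustrative only.

set.seed(3)
wave   <- rep(c(2017, 2020), times = c(150, 180))   # hypothetical wave sizes
rating <- c(sample(1:5, 150, replace = TRUE, prob = c(.15, .20, .25, .25, .15)),
            sample(1:5, 180, replace = TRUE, prob = c(.10, .15, .25, .30, .20)))
cor.test(wave, rating, method = "kendall", exact = FALSE)  # tau > 0: ratings shift upward in the later wave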

Overall, these results highlight that the tools which significantly enhanced students' positive perceptions were virtual and test-content related (watching theory and practice videos, TV and PV; reading digital recommended materials, RM), including those requiring active but self-regulated problem-solving effort (solving clues, CL; feedback answers, FK; mock tests, M). Conversely, tools associated with direct student–instructor interaction (electronic forum, EF, and tutoring sessions, TS) and those related to topics less frequently assessed in tests (“know-what” sections, KW; “know-more” sections, KM; “a bit of history” sections, Hy; and watching complementary animated videos, AV) were perceived as less useful. It can thus be concluded that students' preferences have shifted towards more flexible and time-saving learning strategies, implying a more individual and efficient way of succeeding in tests.

3.3.3. Association of student descriptors with perceived usefulness of activities and resources. To assess the association between student descriptors and their perceived usefulness of each resource (see Table S2), ordered logistic regression models were employed. This allowed us to evaluate the individual impact of each descriptor while holding the others constant. To simplify interpretation, the Likert response variable was recoded: values 1 and 2 were grouped as 0 (not useful), value 3 as 1 (neutral), and values 4 and 5 as 2 (useful). Table 2 presents the results for the response variables for which statistically significant regression models could be fitted. Data from 379 respondents (2022 survey) were analyzed, with the models demonstrating satisfactory accuracy, correctly classifying over 46% of observations (compared to 33% for random classification). For better visualization, Fig. S3–S11 display the model predictions, showing the probability that a student falls into each of the three usefulness perception categories (0, 1, 2) based on their descriptors (gender, GDR; year of enrolment, Y; repeater or freshman, RP; in-person or online modality, ON; and initial performance, G1; Table S2).
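
A minimal sketch of this modelling step, on simulated data with descriptor names taken from Table S2, is given below; MASS::polr fits the proportional-odds model, and where that assumption does not hold a partial proportional odds model can be fitted instead (e.g. with VGAM::vglm and family = cumulative(parallel = FALSE)).

library(MASS)
set.seed(4)
n <- 380
d <- data.frame(EF_Uf = sample(1:5, n, replace = TRUE),   # raw 1-5 Likert response (simulated)
                GDR = factor(sample(c("F", "M"), n, replace = TRUE)),
                RP  = factor(sample(0:1, n, replace = TRUE)),
                ON  = factor(sample(0:1, n, replace = TRUE)),
                G1  = runif(n, 0, 100))
# Recode: 1-2 -> "0" (not useful), 3 -> "1" (neutral), 4-5 -> "2" (useful)
d$EF_Uf3 <- cut(d$EF_Uf, breaks = c(0, 2, 3, 5), labels = c("0", "1", "2"),
                ordered_result = TRUE)
fit <- polr(EF_Uf3 ~ GDR + RP + ON + G1, data = d, Hess = TRUE)
exp(coef(fit))                    # odds ratios, as reported in Table 2
mean(predict(fit) == d$EF_Uf3)    # fraction correctly classified (overall accuracy)
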
Table 2 Association of students’ descriptors with their perceived usefulness of the resources available in 2022
Response^a (y) AV_Uf RM_Uf PV_Uf CL_Uf FK_Uf KW_Uf TS_Uf EF_Uf M_Uf
a Variables for which a statistically significant regression model could be fitted. To simplify the interpretation of the model, the response variable was recoded as follows: values 1 and 2 were grouped as 0, value 3 as 1, and values 4 and 5 as 2.
b Percentage of correctly classified observations.
c Ordered logistic regression analysis.
d Partial proportional odds model.
e Statistical significance: p < 0.001 (***), p < 0.01 (**), p < 0.05 (*), p < 0.1 (†). For the partial proportional odds model, the two odds ratios listed for some of the descriptors correspond to the transition from level 0 to 1 and from level 1 to 2 in the response variable and are separated by a slash (/).
Number of students 381 380 381 379 380 379 380 380 383
Overall accuracy^b 50%^c 78%^d 63%^c 74%^d 73%^d 46%^c 47%^c 52%^c 89%^c
Variables Odds ratio^e
Gender (GDR) 0.61* 0.55* 0.51** 0.65† 0.73
Course repetition (RP) 1.54† 2.22/0.66 1.52† 2.68*** 0.35**
Online coursing (ON) 3.90** 5.19** 0.49/2.14 0.51† 2.40* 0.38†
First test mark (G1) 1.01† 0.997/1.01* 1.01* 0.99* 1.02*


Before discussing the results for individual activities and resources, it is worth recalling that students enrolled in the online modality have chosen not to attend regular practice sessions and instead receive extra reading materials containing worked practice examples for each weekly-delivered subject. The virtual resources are therefore expected to be their main way of following the course content, and these students are expected to show higher preferences for them. Table 2 bears this out, showing markedly higher odds of these students finding the virtual resources useful.

For theory-related activities and resources, the perceived usefulness of theory lecture attendance or theory video watching shows no evidence of a statistically significant association with any student descriptor. Considering the statistically significant results, complementary animated videos, AV, are preferred by female students and by those who chose the online course modality (Table 2 and Fig. S3). Specifically, the odds of considering these videos useful are 1.6 times higher for female students (p = 0.020) and 3.9 times higher for online students (p = 0.002). Regarding the perceived usefulness of the recommended reading material, RM, a statistically significant association with gender is again observed (p = 0.023): the odds of female students considering it useful are 1.8 times those of their male counterparts (Fig. S4). Additionally, though with less statistical significance (p = 0.060), students who performed better on the first test tend to find the digital recommended reading material, RM, more useful, likely because the reading materials comprehensively cover the content required for tests, aligning with their academic goals.

Regarding the recorded practice sessions, PV, female and online students have 2.0 (p = 0.002) and 5.2 (p = 0.009) times the odds, respectively, of finding them useful (Fig. S5). This suggests that this resource is particularly appealing to female and online students, as it offers opportunities to complement in-person activities with more independent, self-regulated learning strategies, as previously observed (Nakata, 2020; Rohman et al., 2020; Refika, 2023). Additionally, although with less statistical support, repeater students have 1.5 times the odds of finding both the recorded practice sessions and the complementary animated videos useful.

Focusing on the tools embedded in the interactive materials, the regression models indicate that solving clues, CL, and feedback answers, FK, are preferred by students with higher first-test marks (see Table 2 and Fig. S6, S7; CL: p = 0.045, FK: p = 0.048). This is likely because these tools are directly aligned with the content evaluated in tests while also promoting active engagement in the learning process, a behavior characteristic of high-performing students (Coffin Murray et al., 2012; Foong et al., 2021). There is also weaker statistical evidence suggesting that solving clues are more favored by female students (p = 0.09). Conversely, the “know-what” sections, KW, which focus on daily-life curiosities, tend to appeal more to students with lower initial academic performance (Fig. S8; p = 0.017). This may be because these students are less engaged with the course's disciplinary content and more drawn to tools that relate to less abstract, more context-applied topics (Mann and Enderson, 2017; Maya et al., 2021).

In the case of activities related to student–instructor interaction, the regression results again reveal a clear association between perceived usefulness and the chosen course modality (Table 2). Students enrolled in the in-person modality have twice the odds of finding tutoring sessions, TS, useful (p = 0.076; Fig. S9), indicating a clear preference for face-to-face interaction. Conversely, the odds of online students considering the electronic forum, EF, useful are 2.4 times higher (p = 0.036; Fig. S10), suggesting again that online learners favor web-based, on-demand communication over in-person tutoring, as previously observed (Marks, 2011). This dichotomy highlights how online students lean towards remote asynchronous interactions, valuing the flexibility of the electronic forum for solving queries at their own pace over personal engagement with peers and instructors.

Notably, there is strong statistical evidence indicating that repeating students place significant value on the electronic forum. The odds of these students finding the forum useful are 2.68 times those of freshman students (p = 3 × 10−5), likely because, having already exhausted the in-person activities in earlier editions of the course, they now rely more heavily on the electronic forum to fill knowledge gaps from previous attempts. Nevertheless, attending tutoring sessions and using the electronic forum are not necessarily equivalent forms of engagement. In tutoring sessions, students are expected to participate actively by asking specific questions and demonstrating their understanding through their work. In contrast, the electronic forum allows a more passive approach, as students can simply read existing posts and access shared information without necessarily interacting socially or contributing. Future research should examine the different types of engagement within the forum—such as reading, posting questions, or providing answers—to better understand how these interaction patterns influence student learning and engagement.

Finally, the ordinal regression analysis for the perceived usefulness of the mock tests reveals notable associations with the descriptors RP, ON, and G1 (Table 2 and Fig. S11). Freshman students are significantly more likely to favor this resource (p = 0.002), as are students with higher academic performance in the first test (p = 0.034) and those who attend the course in person (p = 0.060). This preference can be attributed to the fact that the mocks provide targeted preparation for the tests and are especially promoted by instructors, making the resource especially appealing to in-person students aiming for higher marks, as well as to those taking the course for the first time, who may be less familiar with the test format.

3.3.4. Academic performance. In the selected General Chemistry I course, there was a 3.6% decline in students’ average final academic performance between the pre- and post-pandemic periods. By 2023, the average final mark had dropped to 27.9% (95% CI [26.3, 29.6]), far below the passing threshold of 51%. In light of this, we aimed to identify key determinants, related to the available activities and resources, that could be influencing student academic performance (GCI) and the observed trend. For this purpose, multiple linear regression models were applied to the 2022 dataset. The results for the entire 2022 cohort are shown in Table 3. Additionally, Table S9 summarizes the findings for repeater students and breaks the analysis down by tertiles of G1 grades, providing deeper insights into specific subgroups.
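
The sketch below illustrates this step on simulated data; the variable names follow Table 3, but the generating coefficients are constructed for the example and are not the study's estimates.

set.seed(5)
n <- 376
d22 <- data.frame(G1   = runif(n, 0, 100),               # first test mark
                  DR   = rbinom(n, 1, 0.15),             # 1 = dropped out
                  RP   = rbinom(n, 1, 0.3),              # 1 = repeater
                  TS_U = sample(1:5, n, replace = TRUE), # tutoring session use
                  EF_U = sample(1:5, n, replace = TRUE)) # electronic forum use
d22$GCI <- with(d22, 0.9 * G1 - 8.9 * DR - 1.6 * RP + 1.0 * TS_U - 1.25 * EF_U +
                  rnorm(n, sd = 8))                      # final mark (%), constructed
fit_gci <- lm(GCI ~ RP + G1 + DR + TS_U + EF_U, data = d22)
summary(fit_gci)   # R-squared = explained variance; coefficients as in Table 3
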
Table 3 Association of students’ descriptors with their performance (y = %GCI) and course dropout (y = DR) in 2022
  Multiple linear regression (y = %GCI) Logistic regression (y = DR) Cox regression (y = DR)
a Statistical significance: p < 0.001 (***), p < 0.01 (**), p < 0.05 (*), p < 0.1 (†).
Number of students 376 364 571
Model fit Explained variance: 89% Overall accuracy: 95% Concordance: 76%
Variables Regression coefficients^a Odds ratios^a
Course repetition (RP) −1.59† 1.72*
First test mark (G1) 0.91*** 0.92*** 0.94***
Course dropout (DR) −8.87***
T_U −0.48 0.64  
TV_U 1.70  
TS_U 0.98*  
EF_U −1.25**  
M_U −1.78  
FK_Uf 0.71  
KW_Uf 2.81*  
Hy_Uf 0.49†


The regression model using the whole 2022 sample fits the data satisfactorily, explaining 89% of the variability in GCI (Table 3). The same holds for the models calculated on the partial samples (Table S9), with explained variance above 59%. In general terms, there is strong statistical evidence supporting the association between final marks, GCI, and both first-test marks, G1 (p < 0.001), and dropout, DR (p < 0.001). As expected, students with higher performance on the first test, G1, and those who did not drop out (DR = NO, and therefore completed the second test) achieve higher final course marks. Specifically, for each additional point in the first test (G1), students gain an average of 0.91 points on the second test, reinforcing the importance of early academic success in determining overall course performance. For the highest-performing students (top G1 tertile), both online coursing (ON = YES) and the use of mock tests (M_U) are positively correlated with final course marks, with average increases of 7.8% (p = 0.007) and 12.5% (p = 0.037) in GCI, respectively. Additionally, these students perceive feedback answers as more useful, FK_Uf (p = 0.004). These findings align with the previous discussion, suggesting that mock tests and feedback answers, which are directly related to test content, likely benefit high-achieving students by fostering active engagement and self-paced learning. Indeed, high-performing university students have been reported to employ more self-regulated learning strategies to achieve academic success (Foong et al., 2021).

The regression model in Table 3 also reveals that frequent attendance at tutoring sessions is positively associated with final course marks (p = 0.017), corresponding to an average increase of 0.98 percentage points in GCI. This is in line with previous experiences in which individual tutoring was used to improve the academic performance of university students (Bloom, 1984; Guerra-Martín et al., 2017), even though other studies indicate that student–instructor interaction has no significant effect on students’ satisfaction or outcomes (Su and Guo, 2021). It is worth recalling that, from the students’ point of view, the frequency of use and perceived usefulness of tutoring sessions are low (Fig. 4).

Conversely, frequent use of the electronic forum is negatively associated with final course performance (p = 0.008). This is notable, as the use of online discussion forums has been reported to have a positive or no correlation with final course grades (Davies and Graff, 2005; Cheng et al., 2011). In the studied context, this could indicate that while tutoring sessions provide a framework for direct, socially constructive support and clarification of course material, excessive reliance on the electronic forum may reflect compensatory use by students who struggle with the material, leading to lower performance. It is also worth recalling that using the electronic forum does not necessarily involve posting or actively interacting with peers or instructors, since all students can simply read the posted messages. Moreover, frequent use of electronic forums may coincide with heavier recreational internet use, which has been identified as a detrimental factor for academic performance (Kubey et al., 2001; Jacobsen and Forste, 2011). Both trends, for tutoring sessions, TS, and the electronic forum, EF, are particularly pronounced among students with medium prior academic performance (middle G1 tertile) and repeater students (see Table S9).

With moderate statistical support (p = 0.083), repeater students score, on average, 1.6 marks lower in their final course grades (Table 3). Indeed, in other scenarios, pass rates of first-year university students have been found to be significantly lower on the second attempt than for freshman students (Snead et al., 2022). Interestingly, the present findings indicate that the GCI of repeater students is positively influenced by attending tutoring sessions (p = 0.007). In-person support has previously proven important for better performance of repeater students (Hood and Girshner, 2023) and aligns with the dependence on social interaction for engagement and learning (Amineh and Asl, 2015). Additionally, their overall performance benefits from the use of recommended reading materials (p = 0.015) and the “know-what” sections (p = 0.004), suggesting that connecting course content to real-life curiosities and applications helps these students, who gravitate toward less abstract material that aids comprehension, as previously described (Sheard and Hagan, 1998). Conversely, frequent use of the electronic forum is negatively correlated with repeater students' performance (p = 0.028), as is their reliance on recorded practice-session worked examples (p = 0.017) and on the history sections of the self-paced materials (p = 0.0007). These findings suggest that, among these students, those who adopt a more passive approach of watching recorded materials or frequently resort to the electronic forum for help tend to underperform.

3.3.5. Dropout rates. Finally, as previously mentioned, the course under study has free-of-charge enrolment with no admission test or quota, which can partly account for its high dropout. Nevertheless, factors related to the available activities and resources that might influence dropout rates (DR) were targeted. This issue is not only a critical concern across higher education (Del Savio et al., 2022; Martínez-Líbano and Yeomans-Cabrera, 2023), but is also particularly relevant in the context of the course under study. Although the rise in General Chemistry I dropout rates during the pandemic was somewhat mitigated in the following years, dropout persists at concerning levels (Table S3). Indeed, in 2023 it reached 18.6% (95% CI [15.7, 21.4]; 134 students).

To tackle the dropout problem, a logistic regression model was applied, using several student descriptors as explanatory variables (Table 3). The model showed an excellent fit, with an overall accuracy of 95%. To facilitate interpretation, Fig. S12 illustrates the probability of dropping out as a function of the statistically significant descriptors. As previously discussed, first-test marks (G1) are negatively correlated with dropout (p = 0.0003): for each additional point in G1, the odds of dropping out decrease by 8%, emphasizing the crucial role early academic success plays in preventing dropout. This aligns with other reports in higher education indicating that the risk of dropping out was increased by impaired performance during the pandemic (Martínez-Líbano and Yeomans-Cabrera, 2023). Interestingly, students who drop out tend to find the “know-what” (KW) sections of the self-paced materials useful (p = 0.015) but show less interest in the “a bit of history” (Hy) sections (p = 0.085). This suggests that dropout students are less engaged with the disciplinary content of the course and gravitate toward tools presenting more concise and less abstract topics: they prefer resources focused on everyday curiosities or applications, which demand little abstraction, over those explaining how the different concepts are built, which demand much more. This agrees with other reports showing that difficulties in abstraction skills and increased workload elevate the dropout risk among university students (Hoed et al., 2018; Karimi-Haghighi et al., 2022).
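
The dropout model is a standard binomial GLM; a sketch on simulated data is given below, including the odds-ratio reading used in the text (exp(coef) ≈ 0.92 for G1 corresponds to an ≈8% drop in the odds per additional point).

set.seed(6)
n <- 364
g1 <- runif(n, 0, 100)                                  # first test mark (simulated)
kw <- sample(1:5, n, replace = TRUE)                    # KW_Uf rating (simulated)
dr <- rbinom(n, 1, plogis(1.0 - 0.08 * g1 + 0.3 * kw))  # 1 = dropped out (constructed model)
fit_dr <- glm(dr ~ g1 + kw, family = binomial)
exp(cbind(OR = coef(fit_dr), confint(fit_dr)))          # odds ratios with 95% CIs
mean((fitted(fit_dr) > 0.5) == dr)                      # overall classification accuracy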

Lastly, to complement the previous dropout analysis with other relevant concomitant variables, a survival analysis was conducted. Details are available in supplementary section SS3, and the results are summarized in Table 3 and Fig. 5. Disengagement probability increases notably during the second half of the course (Fig. 5a, week 6 onwards). As already mentioned, higher test scores are associated with a lower probability of dropping out. Specifically, for each additional point in the first test, G1, the hazard of dropping out decreases by approximately 6% (p = 7 × 10−8). This is illustrated in Fig. 5b, where the dropout profiles for students in different G1 tertiles diverge significantly throughout the term. Students with the lowest first-test scores tend to disengage even before the midpoint of the course, whereas those with the highest scores do not disengage at all. Interestingly, repeaters are 72% more likely to drop out than freshman students when all other variables are held constant (p = 0.02). The disparity in dropout rates between repeater and freshman students becomes more pronounced in the second half of the semester (Fig. 5c). This could be attributed to repeaters experiencing greater academic frustration, diminished motivation, or external pressures, making it increasingly difficult for them to remain engaged in the course as it progresses.
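
A sketch of this survival analysis with the R survival package is shown below, assuming each student contributes the week of their last recorded activity and a dropout indicator; the data, the 15-week horizon and the generating model are hypothetical.

library(survival)
set.seed(7)
n <- 571
d_surv <- data.frame(G1 = runif(n, 0, 100), RP = rbinom(n, 1, 0.3))
d_surv$week <- pmin(15, 1 + rgeom(n, plogis(-3 + 0.02 * (50 - d_surv$G1))))  # week of last activity
d_surv$drop <- as.integer(d_surv$week < 15)     # 1 = dropped out, 0 = censored at course end
fit_cox <- coxph(Surv(week, drop) ~ G1 + RP, data = d_surv)
summary(fit_cox)$conf.int                       # hazard ratios, cf. Table 3
km <- survfit(Surv(week, drop) ~ RP, data = d_surv)
plot(km, lty = 1:2, xlab = "Week", ylab = "Fraction still enrolled")  # Kaplan-Meier, as in Fig. 5c
survdiff(Surv(week, drop) ~ RP, data = d_surv)  # log-rank test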


Fig. 5 Kaplan–Meier plot of the data, showing the temporal variation in the percentage of students who remain enrolled in the course. (a) All data. (b) Data by G1 tertile. TERG1 = 1 (lowest third of first test scores); TERG1 = 2 (middle third of first test scores); TERG1 = 3 (highest third of first test scores). (c) Data by repetition status. RP = 0 (students taking the course for the first time); RP = 1 (students repeating the course). The value of p for the log-rank test is also shown for (b) and (c).

3.4. Overview of the main findings

Taken together, the results show that the COVID-19 pandemic significantly reshaped students’ learning behaviors, leading to a marked shift toward virtual learning modalities. This shift may also have attracted students who had previously taken the course, likely due to the redesigned resources, which offered a more flexible learning environment and incorporated new approaches appealing to those already enrolled in earlier editions. During the pandemic, access to resources like recorded theory lectures, recommended digital reading materials and the electronic forum surged. This shift, occurring in the exceptional and unfamiliar circumstances of the pandemic, may have multiple causes. However, after the return to in-person activities, the preference for on-demand virtual resources has been retained. In the post-pandemic scenario, the most frequently used and valued resources are the recorded lectures and practice sessions, the digital recommended reading materials, interactive tools such as solving clues and feedback answers, and the mock tests. All of these are flexible, computer-assisted, self-paced materials that align closely with test content. In contrast, activities involving student–instructor interaction (either in-person or virtual), as well as those covering non-assessed topics, were seen by students as less useful. This shift indicates a preference for more flexible, time-saving learning strategies.

Interestingly, a closer examination of post-pandemic student preferences reveals distinct patterns based on academic performance, gender, course modality, and repeater status. For instance, female students, who make up ca. two thirds of the sample, show a higher preference for recorded practice sessions, recommended reading material and complementary animated videos than their male counterparts. In addition, higher-performing students especially favor resources that facilitate computer-assisted self-paced learning and are closely aligned with test content, such as solving clues, feedback answers, and mock tests.

Although students perceive in-person tutoring sessions as less useful than reading the electronic forum, attending these sessions significantly improves final course marks, especially for students with moderate performance. This result emphasizes the relevance of social constructivist learning (Amineh and Asl, 2015) for low- or moderate-performance learners. For the highest-performance tertile, by contrast, online coursing is associated with higher performance. On the other hand, repeater students relying on the electronic forum and recorded practice sessions tend to underperform. This reliance could indicate a need for additional support, without which these students may struggle with the material and find it difficult to keep up with the demands of the course. This is probably related to their higher dropout rates, as these tools may not sufficiently foster the active learning required for sustained engagement and success in the course. Indeed, among students repeating the course, the use of digital recommended reading materials and “know-what” sections, along with face-to-face tutoring session attendance, is strongly associated with higher final course marks. This suggests that alternative connections of course content to daily-life subjects, as well as in-person support, are particularly effective for these students. On the other hand, students who dropped out (and those with lower academic performance) found the “know-what” sections of the self-paced materials particularly useful, but this was not enough to sustain engagement, suggesting that these students were less engaged with the core disciplinary content and gravitated toward more accessible, general-interest topics.

4. Concluding remarks

This study explored current students' choices of learning activities and resources in a General Chemistry I course as a case study, and how these preferences were influenced by the pandemic. By analyzing various student features and the characteristics of the available resources, we sought to pinpoint some of the factors driving students’ preferences. The present findings shed light on the specific activities and resources students use and find useful, and on how these preferences are linked to their performance and engagement with the course, addressing important questions about the role of activities and resources in current educational outcomes.

The results go beyond previous research on student preferences for in-person versus virtual learning. Their added value lies in the fact that preferences are not evaluated as a whole, but rather analyzed in conjunction with specific student characteristics and the particular features of each activity. This approach provides a more nuanced and practical understanding of student engagement. Furthermore, preferences are also compared with academic performance and potential dropout risk. These insights are essential for informing future strategies, highlighting the types of learning activities that best balance student appeal with meaningful learning outcomes and sustained motivation within the educational process.

5. Implications for teaching and research

Understanding students’ preferences, as defined by learning needs and styles, as well as their evolution over time and their impact on academic outcomes and dropout, provides valuable insight into how tailored resource design and student support strategies could improve both performance and retention in future editions of the course. For example, given the short view times registered for the available videos, partially replacing long theory videos with, or adding, shorter focused videos on the key concepts assessed in tests could enhance student engagement. However, this strategy should be applied cautiously, as recent findings indicate that the overuse of short videos may negatively affect academic outcomes by promoting fragmented attention or failing to cover sufficient content (Manasrah et al., 2021; Asif Saniya Kazi, 2024).

Based on our findings, which reveal both the diversity of students’ preferences for learning materials and the persistence of these preferences over time, it becomes clear that post-pandemic instructional design should adopt a blended approach. The intentional integration of in-person and computer-assisted strategies should be promoted, combining the strengths of face-to-face interaction, which fosters social constructivist learning, with the flexibility and autonomy of digital resources. In this context, where a wide range of activities and resources are offered simultaneously, the rapid adaptation and selection of preferred learning modes by first-year students becomes essential. To support this process, structured learning aids such as study guides and schematic overviews of course activities and resources are needed. In particular, providing clear descriptions of each available activity or resource could help students identify learning pathways aligned with their individual study styles, thereby promoting faster adaptation and more effective learning.

Another significant point derived from the results of this work is that the currently offered in-person activities, although not particularly valued by students, are indeed beneficial for academic performance and for resilience against dropout. Therefore, it is essential to communicate to students that in-person activities play an effective role in learning, as they foster meaningful exchanges with peers and instructors that enhance understanding and retention. At the same time, once students attend face-to-face sessions, these experiences must be redesigned to provide a more engaging and rewarding environment—one that offers a distinctive value surpassing the comfort and flexibility of computer-assisted activities.

6. Limitations

This work was carried out in a university first-year setting of free and unrestricted enrollment, in which dropout is especially high and the average student academic performance is far below passing marks. This implies that the present results can, in principle, only be extrapolated to similar settings. Notably, since first-year students behave differently from older students (Stevanović et al., 2021), extrapolation of the present findings to other courses, even in similar scenarios, may not be straightforward.

Correlation data are based on students’ self-reported use rather than their actual “brains-on” engagement, which raises further questions about accuracy. It is also worth noting that one limiting factor of the study lies in the assumption that all potential confounding variables have been controlled—a condition that is not fully attainable in an observational design, particularly since students were allowed (rather than randomly assigned) to choose among optional activities and resources. Furthermore, the emphasis placed by instructors on certain materials may have influenced students’ choices to some extent. Nevertheless, we made a conscious effort to include all available predictors that could affect the analysis and to avoid overgeneralizing conclusions based on the sample.

Author contributions

Nicolás Pérez: data curation and investigation. Lorena Martínez: investigation, writing – review and editing. Natalia Alvarez: investigation, writing – review and editing. Lucía Otero: conceptualization, writing – review and editing. Nicolás Veiga: conceptualization, formal analysis, writing – original draft, review and editing. Julia Torres: conceptualization, writing – original draft, review and editing.

Conflicts of interest

The authors declare no conflicts of interest.

Data availability

Analyzed data supporting this article have been partly included as part of the supplementary information (SI). However, data collected from human participants are not available for confidentiality reasons. Supplementary information is available. See DOI: https://doi.org/10.1039/d5rp00296f.

Acknowledgements

This work was supported by Comisión Sectorial de Enseñanza, Universidad de la República, Uruguay. The authors also acknowledge the revision of the students’ survey carried out by Unidad Académica de Enseñanza, Facultad de Química, Universidad de la República, Uruguay.

References

  1. Adedoyin O. B. and Soykan E., (2023), Covid-19 pandemic and online learning: The challenges and opportunities, Interact. Learn. Environ., 31(2), 863–875 DOI:10.1080/10494820.2020.1813180.
  2. Agresti A., (2018), Statistical methods for the social sciences, 5th edition, Pearson.
  3. Al-Kumaim N. H., Mohammed F., Gazem N. A., Fazea Y., Alhazmi A. K. and Dakkak O., (2021), Exploring the Impact of Transformation to Fully Online Learning During COVID-19 on Malaysian University Students’ Academic Life and Performance, Int. J. Int. Mobile Technol., 15(05), 140–168 DOI:10.3991/ijim.v15i05.20203.
  4. Amineh R. J. and Asl H. D., (2015), Review of Constructivism and Social Constructivism, J. Soc. Sci., Literature Languages, 1(1), 9–16.
  5. Anderson W., (2021), The model crisis, or how to have critical promiscuity in the time of Covid-19, Soc. Stud. Sci., 51(2), 167–188 DOI:10.1177/0306312721996053.
  6. Anthony Angwaomaodoko E., (2024), A Review of Blended Learning after the COVID-19 Pandemic, Int. Res. Educ., 12(1), 86 DOI:10.5296/ire.v12i1.21849.
  7. Asare A. O., Yap R., Truong N. and Sarpong E. O., (2021), The pandemic semesters: Examining public opinion regarding online learning amidst COVID-19, J. Comput. Assisted Learn., 37(6), 1591–1605 DOI:10.1111/jcal.12574.
  8. Asif Saniya Kazi M., (2024), Examining the Influence of Short Videos on Attention Span and its Relationship with Academic Performance, Int. J. Sci. Res., 13(4), 1877–1883 DOI:10.21275/SR24428105200.
  9. Bloom B. S., (1984), The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring, Educ. Res., 13(6), 4 DOI:10.2307/1175554.
  10. Campbell R. and Blankenship B. B., (2020), 20210331, To Improve the Academy, 39(2), 51–74 DOI:10.3998/tia.17063888.0039.203.
  11. Carolan C., Davies C. L., Crookes P., McGhee S. and Roxburgh M., (2020), COVID 19: Disruptive impacts and transformative opportunities in undergraduate nurse education, Nurse Educ. Pract., 46, 102807 DOI:10.1016/j.nepr.2020.102807.
  12. Cheng C. K., Paré D. E., Collimore L.-M. and Joordens S., (2011), Assessing the effectiveness of a voluntary online discussion forum on improving students’ course performance, Comput. Educ., 56(1), 253–261 DOI:10.1016/j.compedu.2010.07.024.
  13. Chestnut J. and Johnson C. C., (2025), Factors Influencing Students’ Academic Success in Introductory Chemistry: A Systematic Literature Review, Educ. Sci., 15(4), 413 DOI:10.3390/educsci15040413.
  14. Cobo-Rendón R., Bruna Jofre C., Lobos K., Cisternas San Martin N. and Guzman E., (2022), Return to University Classrooms With Blended Learning: A Possible Post-pandemic COVID-19 Scenario, Front. Educ., 7, 957175 DOI:10.3389/feduc.2022.957175.
  15. Coffin Murray M., Pérez J., Geist D. and Hedrick A., (2012), Student Interaction with Online Course Content: Build It and They Might Come, J. Inf. Technol. Educ.: Res., 11, 125–140 DOI:10.28945/1592.
  16. Daniel S. J., (2020), Education and the COVID-19 pandemic, PROSPECTS, 49(1–2), 91–96 DOI:10.1007/s11125-020-09464-3.
  17. Davies J. and Graff M., (2005), Performance in e-learning: Online participation and student grades, Br. J. Educ. Technol., 36(4), 657–663 DOI:10.1111/j.1467-8535.2005.00542.x.
  18. Del Savio A. A., Galantini K. and Pachas A., (2022), Exploring the relationship between mental health-related problems and undergraduate student dropout: a case study within a civil engineering program, Heliyon, 8(5), e09504 DOI:10.1016/j.heliyon.2022.e09504.
  19. Demir K., (2024), Future of Undergraduate Education for Sustainable Development Goals: Impact of Perceived Flexibility and Attitudes on Self-Regulated Online Learning, Sustainability, 16(15), 6444 DOI:10.3390/su16156444.
  20. Dhawan S., (2020), Online Learning: A Panacea in the Time of COVID-19 Crisis, J. Educ. Technol. Syst., 49(1), 5–22 DOI:10.1177/0047239520934018.
  21. Dietrich N., Kentheswaran K., Ahmadi A., Teychené J., Bessière Y., Alfenore S., Laborie S., Bastoul D., Loubière K., Guigui C., Sperandio M., Barna L., Paul E., Cabassud C., Liné A. and Hébrard G., (2020), Attempts, Successes, and Failures of Distance Learning in the Time of COVID-19, J. Chem. Educ., 97(9), 2448–2457 DOI:10.1021/acs.jchemed.0c00717.
  22. Di Pietro G., (2023), The impact of Covid-19 on student achievement: Evidence from a recent meta-analysis, Educ. Res. Rev., 39, 100530 DOI:10.1016/j.edurev.2023.100530.
  23. Di Pietro G., Biagi F., Costa P., Karpiński Z., Mazza J. and European Commission: Joint Research Centre, (2020), The likely impact of COVID-19 on education – Reflections based on the existing literature and recent international datasets, Publications Office DOI:10.2760/126686.
  24. Dwivedi Y. K., Hughes D. L., Coombs C., Constantiou I., Duan Y., Edwards J. S., Gupta B., Lal B., Misra S., Prashant P., Raman R., Rana N. P., Sharma S. K. and Upadhyay N., (2020), Impact of COVID-19 pandemic on information management research and practice: transforming education, work and life, Int. J. Inf. Manage., 55, 102211 DOI:10.1016/j.ijinfomgt.2020.102211.
  25. Elkaseh A. M., Wong K. W. and Fung C. C., (2016), Perceived Ease of Use and Perceived Usefulness of Social Media for e-Learning in Libyan Higher Education: A Structural Equation Modeling Analysis, Int. J. Inf. Educ. Technol., 6(3), 192–199 DOI:10.7763/IJIET.2016.V6.683.
  26. Flaherty A., O’Dwyer A., Leahy J. and Richardson O., (2015), Teaching large undergraduate chemistry classes: a challenge or an opportunity, Lumat: Int. J. Math, Sci. Technol. Educ., 3(3), 353–370 DOI:10.31129/lumat.v3i3.1034.
  27. Foong C. C., Bashir Ghouse N. L., Lye A. J., Khairul Anhar Holder N. A., Pallath V., Hong W.-H., Sim J. H. and Vadivelu J., (2021), A qualitative study on self-regulated learning among high performing medical students, BMC Med. Educ., 21(1), 320 DOI:10.1186/s12909-021-02712-w.
  28. García-Morales V. J., Garrido-Moreno A. and Martín-Rojas R., (2021), The Transformation of Higher Education After the COVID Disruption: Emerging Challenges in an Online Learning Scenario, Front. Psychol., 12, 616059 DOI:10.3389/fpsyg.2021.616059.
  29. Garrison D. R. and Kanuka H., (2004), Blended learning: Uncovering its transformative potential in higher education, Internet Higher Educ., 7(2), 95–105 DOI:10.1016/j.iheduc.2004.02.001.
  30. Guerra-Martín M. D., Lima-Serrano M. and Lima-Rodríguez J. S., (2017), Effectiveness of Tutoring to Improve Academic Performance in Nursing Students at the University of Seville, J. New Approaches Educ. Res., 6(2), 93–102 DOI:10.7821/naer.2017.7.201.
  31. Guppy N., Verpoorten D., Boud D., Lin L., Tai J. and Bartolic S., (2022), The post-COVID-19 future of digital learning in higher education: views from educators, students, and other professionals in six countries, Br. J. Educ. Technol., 53(6), 1750–1765 DOI:10.1111/bjet.13212.
  32. Harrell F. E., (2001), Cox Proportional Hazards Regression Model, in Harrell F. E., Regression Modeling Strategies, Springer, New York, pp. 465–507 DOI:10.1007/978-1-4757-3462-1_19.
  33. Hoed R. M., Ladeira M. and Leite L. L., (2018), Influence of algorithmic abstraction and mathematical knowledge on rates of dropout from Computing degree courses, J. Br. Comput. Soc., 24(1), 10 DOI:10.1186/s13173-018-0074-2.
  34. Hood S. and Girshner J., (2023), The Impact of Frequent Student-Faculty Interactions on Repeater Students, TFSC Publications and Presentations, Retrieved from https://scholarworks.uark.edu/wctfscpub/60.
  35. Jacobsen W. C. and Forste R., (2011), The Wired Generation: Academic and Social Outcomes of Electronic Media Use Among University Students, Cyberpsychology, Behavior, Soc. Networking, 14(5), 275–280 DOI:10.1089/cyber.2010.0135.
  36. Jereb E., Jerebic J. and Urh M., (2023), Studying Habits in Higher Education Before and After the Outbreak of the COVID-19 Pandemic, Athens J. Educ., 10(1), 67–84 DOI:10.30958/aje.10-1-4.
  37. Karimi-Haghighi M., Castillo C. and Hernández-Leo D., (2022), A Causal Inference Study on the Effects of First Year Workload on the Dropout Rate of Undergraduates, in Rodrigo M. M., Matsuda N., Cristea A. I. and Dimitrova V. (ed.), Artificial Intelligence in Education, Springer International Publishing, vol. 13355, pp. 15–27 DOI:10.1007/978-3-031-11644-5_2.
  38. Kerres M. and Buchner J., (2022), Education after the Pandemic: What We Have (Not) Learned about Learning, Educ. Sci., 12(5), 315 DOI:10.3390/educsci12050315.
  39. Kubey R. W., Lavin M. J. and Barrows J. R., (2001), Internet Use and Collegiate Academic Performance Decrements: Early Findings, J. Commun., 51(2), 366–382 DOI:10.1111/j.1460-2466.2001.tb02885.x.
  40. Kumari A., Ortiz-Rodríguez J. C., Mers-Noble S. L., Enderle B. and Velázquez J. M., (2025), Enhancing Student Engagement and Success in General Chemistry Through Supplemental Co-Class Models, J. Chem. Educ., 102(7), 2997–3003 DOI:10.1021/acs.jchemed.4c01432.
  41. Larson M., Davies R., Steadman A. and Cheng W. M., (2023), Student's Choice: In-Person, Online, or on Demand? A Comparison of Instructional Modality Preference and Effectiveness, Educ. Sci., 13(9), 877 DOI:10.3390/educsci13090877.
  42. Leong E., Mercer A., Danczak S. M., Kyne S. H. and Thompson C. D., (2021), The transition to first year chemistry: student, secondary and tertiary educator's perceptions of student preparedness, Chem. Educ. Res. Pract., 22(4), 923–947 DOI:10.1039/D1RP00068C.
  43. Li X., Odhiambo F. A. and Ocansey D. K. W., (2023), The effect of students’ online learning experience on their satisfaction during the COVID-19 pandemic: the mediating role of preference, Front. Psychol., 14, 1095073 DOI:10.3389/fpsyg.2023.1095073.
  44. Lumley T., (1997), leaps: Regression Subset Selection, R package version 3.2 DOI:10.32614/CRAN.package.leaps.
  45. Manasrah A., Masoud M. and Jaradat Y., (2021), Short Videos, or Long Videos? A Study on the Ideal Video Length in Online Learning. 2021 International Conference on Information Technology (ICIT), 366–370 DOI:10.1109/ICIT52682.2021.9491115.
  46. Manfuso L., (2020), From Emergency Remote Teaching to Rigorous Online Learning, EdTech Magazine, https://edtechmagazine.com/higher/article/2020/05/emergency-remote-teaching-rigorous-online-learning-perfcon.
  47. Mann M. and Enderson M. C., (2017), Give Me a Formula Not the Concept! Student Preference to Mathematical Problem Solving, J. Adv. Marketing Educ., 25(Special Issue on Teaching Innovations in Retailing Education), 15–24.
  48. Marks A., (2011), Electronic Group Collaboration in higher education, Proceedings of the 2011 15th International Conference on Computer Supported Cooperative Work in Design (CSCWD), 742–747 DOI:10.1109/CSCWD.2011.5960201.
  49. Martínez-Líbano J. and Yeomans-Cabrera M.-M., (2023), Post-pandemic psychosocial variables affecting academic dropout in a sample of Chilean higher-education students, Front. Educ., 8, 1293259 DOI:10.3389/feduc.2023.1293259.
  50. Maya J., Luesia J. F. and Pérez-Padilla J., (2021), The Relationship between Learning Styles and Academic Performance: Consistency among Multiple Assessment Methods in Psychology and Education Students, Sustainability, 13(6), 3341 DOI:10.3390/su13063341.
  51. Mehta K. J., Aula-Blasco J. and Mantaj J., (2024), University students’ preferences of learning modes post COVID-19-associated lockdowns: In-person, online, and blended, PLoS One, 19(7), e0296670 DOI:10.1371/journal.pone.0296670.
  52. Mortenson M. J. and Vidgen R., (2016), A computational literature review of the technology acceptance model, Int. J. Inf. Manage., 36(6), 1248–1259 DOI:10.1016/j.ijinfomgt.2016.07.007.
  53. Müller C. and Mildenberger T., (2021), Facilitating flexible learning by replacing classroom time with an online learning environment: A systematic review of blended learning in higher education, Educ. Res. Rev., 34, 100394 DOI:10.1016/j.edurev.2021.100394.
  54. Müller C., Mildenberger T. and Steingruber D., (2023), Learning effectiveness of a flexible learning study programme in a blended learning design: Why are some courses more effective than others? Int. J. Educ. Technol. Higher Educ., 20(1), 10 DOI:10.1186/s41239-022-00379-x.
  55. Nakata Y., (2020), Promoting self-regulation in low achievers: a case study of Japanese university EFL students, Int. J. Appl. Linguistics, 30(1), 110–126 DOI:10.1111/ijal.12269.
  56. Neuwirth L. S., Jović S. and Mukherji B. R., (2021), Reimagining higher education during and post-COVID-19: Challenges and opportunities, J. Adult Continuing Educ., 27(2), 141–156 DOI:10.1177/1477971420947738.
  57. Nguyen C. K., DeNeve D. R., Nguyen L. T. and Limbocker R., (2020), Impact of COVID-19 on General Chemistry Education at the United States Military Academy, J. Chem. Educ., 97(9), 2922–2927 DOI:10.1021/acs.jchemed.0c00771.
  58. Nkomo L. M. and Daniel B. K., (2021), Providing Students with Flexible and Adaptive Learning Opportunities using Lecture Recordings, J. Open, Flexible Distance Learn., 25(1) DOI:10.61468/jofdl.v25i1.437.
  59. Nordmann E., Horlin C., Hutchison J., Murray J.-A., Robson L., Seery M. K. and MacKay J. R. D., (2020), Ten simple rules for supporting a temporary online pivot in higher education, PLoS Comput. Biol., 16(10), e1008242 DOI:10.1371/journal.pcbi.1008242.
  60. O’Flaherty J. and Phillips C., (2015), The use of flipped classrooms in higher education: a scoping review, Internet Higher Educ., 25, 85–95 DOI:10.1016/j.iheduc.2015.02.002.
  61. Patterson N., Schultz M., Wood-Bradley G., Lanham E. and Adachi C., (2020), Going digital to enhance the learning of undergraduate students, J. Univ. Teach. Learn. Pract., 17(3) DOI:10.53761/1.17.3.6.
  62. Perera M., (2018), Índice de Nivel Socioeconómico. Propuesta de actualización. Estudio realizado para la Cámara de Empresas de Investigación Social y de Mercado del Uruguay (CEISMU) [Socioeconomic level index: an updating proposal. Study conducted for the Uruguayan Chamber of Social and Market Research Companies (CEISMU)], Centro de Investigaciones Económicas, https://portal.factum.uy/pdf/INSE_informe_2018.pdf.
  63. Pittaluga L. and Rivoir A., (2012), One Laptop per Child and Bridging the Digital Divide: The Case of Plan CEIBAL in Uruguay, Inf. Technol. Int. Dev., 8(4), 145–159.
  64. Puriwat W. and Tripopsakul S., (2021), The Impact of e-Learning Quality on Student Satisfaction and Continuance Usage Intentions during COVID-19, Int. J. Inf. Educ. Technol., 11(8), 368–374 DOI:10.18178/ijiet.2021.11.8.1536.
  65. Rapanta C., Botturi L., Goodyear P., Guàrdia L. and Koole M., (2020), Online University Teaching During and After the Covid-19 Crisis: Refocusing Teacher Presence and Learning Activity, Postdigital Sci. Educ., 2(3), 923–945 DOI:10.1007/s42438-020-00155-y.
  66. Rapanta C., Botturi L., Goodyear P., Guàrdia L. and Koole M., (2021), Balancing Technology, Pedagogy and the New Normal: Post-pandemic Challenges for Higher Education, Postdigital Sci. Educ., 3(3), 715–742 DOI:10.1007/s42438-021-00249-1.
  67. Refika C., (2023), Differences in self-regulated learning among students in terms of gender, Trend: Int. J. Trends Global Psychol. Sci. Educ., 1(1), 1 DOI:10.62260/intrend.v1i1.88.
  68. R Foundation for Statistical Computing, (2020), R: A language and environment for statistical computing [Software], https://www.R-project.org/.
  69. Rodríguez M. L. and Pulido-Montes C., (2022), Use of Digital Resources in Higher Education during COVID-19: A Literature Review, Educ. Sci., 12(9), 612 DOI:10.3390/educsci12090612.
  70. Rodríguez-Rodríguez E., Sánchez-Paniagua M., Sanz-Landaluze J. and Moreno-Guzmán M., (2020), Analytical Chemistry Teaching Adaptation in the COVID-19 Period: Experiences and Students’ Opinion, J. Chem. Educ., 97(9), 2556–2564 DOI:10.1021/acs.jchemed.0c00923.
  71. Rodríguez-Triana M. J., Prieto L. P., Ley T., De Jong T. and Gillet D., (2020), Social practices in teacher knowledge creation and innovation adoption: a large-scale study in an online instructional design community for inquiry learning, Int. J. Comput.-Supported Collaborative Learn., 15(4), 445–467 DOI:10.1007/s11412-020-09331-5.
  72. Rohman F. M. A., Riyadi R. and Indriati D., (2020), Gender differences on students’ self-regulated learning in mathematics, J. Phys.: Conf. Ser., 1613(1), 1 DOI:10.1088/1742-6596/1613/1/012053.
  73. Roy A., (2020), The pandemic is a portal, Financial Times, https://www.ft.com/content/10d8f5e8-74eb-11ea-95fe-fcd274e920ca.
  74. Sanz M. T. and López-Iñesta E., (2022), Impact of extracurricular factors on the academic performance of university students during the COVID-19 pandemic, Front. Educ., 7, 991276 DOI:10.3389/feduc.2022.991276.
  75. Sheard J. and Hagan D., (1998), Our failing students: a study of a repeat group, ACM SIGCSE Bull., 30(3), 223–227 DOI:10.1145/290320.283550.
  76. Sievertsen H. H. and Burgess S., (2020), Schools, skills, and learning: The impact of COVID-19 on education, VoxEU column, https://cepr.org/voxeu/columns/schools-skills-and-learning-impact-covid-19-education.
  77. Snead S. L., Walker L. and Loch B., (2022), Are we failing the repeating students? Characteristics associated with students who repeat first-year university mathematics, Int. J. Math. Educ. Sci. Technol., 53(1), 227–239 DOI:10.1080/0020739X.2021.1961899.
  78. StataCorp, (2023), Stata Statistical Software: Release 18 [Software], StataCorp LLC.
  79. Stecuła K. and Wolniak R., (2022), Advantages and Disadvantages of E-Learning Innovations during COVID-19 Pandemic in Higher Education in Poland, J. Open Innovation: Technol., Market, Complexity, 8(3), 159 DOI:10.3390/joitmc8030159.
  80. Stevanović A., Božić R. and Radović S., (2021), Higher education students’ experiences and opinion about distance learning during the Covid-19 pandemic, J. Comput. Assisted Learn., 37(6), 1682–1693 DOI:10.1111/jcal.12613.
  81. Steyn A. A., Van Slyke C., Dick G., Twinomurinzi H. and Amusa L. B., (2024), Student intentions to continue with distance learning post-COVID: an empirical analysis, PLoS One, 19(1), e0293065 DOI:10.1371/journal.pone.0293065.
  82. Stöhr C., Demazière C. and Adawi T., (2020), The polarizing effect of the online flipped classroom, Comput. Educ., 147, 103789 DOI:10.1016/j.compedu.2019.103789.
  83. Su C. and Guo Y., (2021), Factors impacting university students’ online learning experiences during the COVID-19 epidemic, J. Comput. Assisted Learn., 37(6), 1578–1590 DOI:10.1111/jcal.12555.
  84. Tang Y. and He W., (2023), Meta-analysis of the relationship between university students’ anxiety and academic performance during the coronavirus disease 2019 pandemic, Front. Psychol., 14, 1018558 DOI:10.3389/fpsyg.2023.1018558.
  85. The Council of the European Union, (2018), Council recommendation on key competences for lifelong learning, Official J. Eur. Union, 61, 1.
  86. Therneau T. M., (2015), A package for survival analysis in S (version 3.7-0) [Software] DOI:10.32614/CRAN.package.survival.
  87. Therneau T. M. and Grambsch P. M., (2000), Modeling Survival Data: Extending the Cox Model, Springer, New York DOI:10.1007/978-1-4757-3294-8.
  88. Tilak J. B. G. and Kumar A. G., (2022), Policy Changes in Global Higher Education: What Lessons Do We Learn from the COVID-19 Pandemic? Higher Educ. Policy, 35(3), 610–628 DOI:10.1057/s41307-022-00266-0.
  89. Topale L., (2016), The strategic use of lecture recordings to facilitate an active and self-directed learning approach, BMC Med. Educ., 16(1), 201 DOI:10.1186/s12909-016-0723-0.
  90. Torello M. and Casacuberta C., (2020), Las características socio-económicas de la matrícula universitaria [The socio-economic characteristics of university enrolment], Documentos de Trabajo del Rectorado, Universidad de la República.
  91. Udelar, (2020), Principales resultados de la Encuesta a estudiantes de la Udelar para la evaluación de la propuesta educativa en la modalidad virtual del primer semestre 2020 [Main results of the survey of Udelar students assessing the virtual-mode educational programme of the first semester of 2020], Dirección General de Planeamiento, Universidad de la República, https://udelar.edu.uy/portal/wp-content/uploads/sites/48/2020/07/Resumen_Difusi%C3%B3n_Informe_encuesta-estudiantes.pdf.
  92. UNESCO Education Sector, (2020), Distance learning strategies in response to COVID-19 school closures (UNESCO Education Sector issue notes No. 2.1; UNESCO COVID-19 Education Response), https://unesdoc.unesco.org/ark:/48223/pf0000373305.locale=es.
  93. Veiga N. and Torres J., (2022), Interactive Tools for First-Semester Undergraduate Chemistry Course in Uruguay: Student Choices and Impact on Student Performance and Dropout, J. Chem. Educ., 99(2), 851–863 DOI:10.1021/acs.jchemed.1c00750.
  94. Venables W. N. and Ripley B. D., (2002), Modern applied statistics with S, 4th edn, Springer, https://www.stats.ox.ac.uk/pub/MASS4/.
  95. Xie J., Gulinna A. and Rice M. F., (2021), Instructional designers’ roles in emergency remote teaching during COVID-19, Distance Educ., 42(1), 70–87 DOI:10.1080/01587919.2020.1869526.
  96. Yee T. W., (2010), The VGAM Package for Categorical Data Analysis, J. Stat. Softw., 32(10), 1–34 DOI:10.18637/jss.v032.i10.
  97. Zhang R., Bi N. C. and Mercado T., (2023), Do zoom meetings really help? A comparative analysis of synchronous and asynchronous online learning during Covid-19 pandemic, J. Comput. Assisted Learn., 39(1), 210–217 DOI:10.1111/jcal.12740.
