Student perceptions of a flipped second-semester postsecondary organic chemistry course through the lens of the community of inquiry framework
Received 2nd December 2025, Accepted 22nd January 2026
First published on 30th January 2026
Abstract
Organic chemistry courses are often viewed as “weed-out” courses, with success traditionally measured by examination performance. However, learning extends beyond cognition as it is also influenced by students’ interactions with their peers and instructors. Affective and social dimensions of the learning environment should also be considered for supporting meaningful engagement with organic chemistry content. In this study, a flipped, peer-led team-learning (PLTL) pedagogical strategy was implemented in the second semester of a yearlong postsecondary organic chemistry course. Using the Community of Inquiry (CoI) framework, we explored how students’ perceptions of their social, cognitive, and teaching presence in the course varied across the semester, by course grade, and by admit type (first-time-in-college and transfer). Confirmatory factor analysis and measurement invariance testing supported the validity of the data collected by the CoI instrument, and nonparametric analyses were used to assess group-level differences. Results indicate that perceptions of all CoI components increased across the organic chemistry course. Students who earned higher course grades reported stronger perceptions of social and cognitive presence, and transfer students reported perceptions of the CoI components comparable to those of their first-time-in-college peers. Overall, the flipped, PLTL pedagogical strategy cultivated a supportive and cohesive environment for learning organic chemistry over time and across student groups. These results underscore the value of intentionally structured, collaborative environments in challenging gateway chemistry courses and the need to measure affective and social dimensions of learning alongside cognitive outcomes to more fully capture the mechanisms by which course structures influence the learning experience.
Introduction
The introductory organic chemistry course sequence has long been recognized as a pivotal and challenging course within the postsecondary STEM curriculum (Bradley et al., 2002; Anderson and Bodner, 2008; Grove et al., 2008, 2012; Widanski and McCarthy, 2009; Kraft et al., 2010; Schmidt-McCormack et al., 2019; Asmussen et al., 2023), often serving as a gateway to advanced coursework and professional opportunities in the chemical and health sciences (Grove et al., 2008; Horowitz et al., 2013; Mutanyatta-Comar and Mooring, 2019). Success in these courses requires students to coordinate multiple modes of chemical reasoning while developing sophisticated conceptual and representational skills (such as molecular visualization and representational competence), demands that are uniquely intensive relative to earlier introductory chemistry coursework. In response to these challenges, recent instructional reforms such as flipped and peer-led team-learning (PLTL) models have sought to make these cognitive processes more explicit and to promote deeper engagement with chemical thinking by positioning students as active participants in the construction of their chemical understanding.
Much of the organic chemistry education literature has focused on students’ cognitive processes, including how they reason about reaction mechanisms and representations (e.g., Kozma and Russell, 2005; Graulich, 2015, 2025; Cooper et al., 2016; Dood and Watts, 2022, 2023; Talanquer, 2022; Ward et al., 2022; Frost et al., 2023; Yik et al., 2023; Crowder and Raker, 2024; Crowder et al., 2024, 2025). While such work provides insight into how students make sense of content, dimensions of learning go beyond cognition (Dewey, 1904; Ausubel, 1963; Novak, 1977; Novak and Gowin, 1984; Bretz, 2001; Immordino-Yang and Damasio, 2007). Student success in the complex, high-stakes learning environment of organic chemistry is additionally influenced by affective and social dimensions, including motivation, identity, sense of belonging, and perceptions of the learning environment (e.g., Lynch and Trujillo, 2011; Galloway et al., 2016; Villafañe et al., 2016; Liu et al., 2018; Gibbons and Raker, 2019). As chemistry education continues to emphasize more student-centered teaching practices, there is a growing recognition that understanding students’ experiences in the learning environment is essential for the improvement of learning and retention. Thus, a more complete understanding of the organic chemistry learning experience must account for the affective and social dimensions of students’ experiences (Flaherty, 2020b).
This work explores students’ perceptions of these different dimensions of learning within the second semester of a yearlong postsecondary organic chemistry course sequence that uses a flipped, peer-led team-learning (PLTL) pedagogical strategy. To frame these perceptions, this study adopts the Community of Inquiry (CoI) framework, which conceptualizes learning as the interaction of three interrelated components: social presence, cognitive presence, and teaching presence (Garrison et al., 1999). In the context of this work, social presence reflects students’ engagement with peers and their sense of belonging; cognitive presence reflects students’ engagement with meaningful problem-solving; and teaching presence reflects how instructors structure and guide learning activities. Using this framework, we quantitatively examine students’ perceptions of these three CoI components across the semester, by course grade, and by admit type.
Student perceptions in organic chemistry
Student perceptions provide a lens to capture broader dimensions of the learning experience. Perceptions influence how students interpret instructional practices, how they engage with peers, and how they evaluate their own persistence and belonging in STEM (Bauer, 2005, 2008; Galloway et al., 2016). In the context of the organic chemistry course sequence, a sequence that often carries a negative connotation, these perceptions may be particularly consequential. Positive experiences of collaboration and support can foster motivation, whereas negative experiences, such as perceiving the course culture as competitive or unsupportive, can compound existing challenges and influence students’ decisions to continue in STEM (Seymour and Hewitt, 1997; Thiry et al., 2019).
Student perceptions are shaped by the diverse experiences and backgrounds that they bring into the classroom. Among these, transfer students represent a particularly relevant group for studying student perceptions in organic chemistry (Whitfield, 2005; Frost et al., 2024b). Unlike their first-time-in-college (FTIC) peers, transfer students often enter the organic chemistry course sequence after completing prerequisite coursework elsewhere (both at two-year and four-year institutions). This means transfer students may differ in their chemistry preparation, expectations, and familiarity with instructional practices. Research has also shown that transfer students often experience lower academic performance upon transfer (i.e., transfer shock), both broadly and within organic chemistry specifically, highlighting the potential impact of this transition period on student outcomes (Hills, 1965; Whitfield, 2005; Elliott and Lakin, 2021; Smith et al., 2022; Frost et al., 2024b). Moreover, transfer students must simultaneously adapt to a new institutional culture, build peer networks, and develop a sense of belonging. This transition often coincides with enrollment in foundational, yet highly demanding, gateway courses like organic chemistry, which serve as critical sites for integration into the new institution and for building relationships that support persistence and success (Whitfield, 2005). At universities with historically high transfer rates, such as those where our study is conducted (Jenkins and Fink, 2016), understanding how transfer and FTIC students perceive their learning environments is particularly important since a growing number of students are beginning STEM degrees at two-year institutions (Wang, 2015; Gray et al., 2022). More broadly, examining variations in student perceptions across groups can provide insight into how different students navigate and engage with the organic chemistry course sequence.
In our work, we examine how students’ perceptions of the second semester of a yearlong postsecondary organic chemistry course differ over time and by different grouping variables (i.e., course grade and admit type). However, student perceptions are not only shaped by students’ prior experiences and backgrounds, but also by the instructional environments they encounter. Work exploring student perceptions in chemistry education contexts has provided insights beyond performance metrics, such as those related to course goals, climate, and challenges (Flaherty, 2020a; Irby et al., 2020; Ramachandran and Rodriguez, 2020; Bowen et al., 2022). These perceptions are useful for evaluating course transformations, informing instructional practices, and exploring the alignment between student expectations and instructor goals (Bowen et al., 2022). To more fully understand student perceptions, it is also necessary to consider the pedagogical principles underlying course design and facilitation as these shape student engagement, peer and instructor relationships, and the construction of understanding.
Constructivist approaches in chemistry classrooms
Constructivist learning theories posit that students build knowledge actively through engagement with concepts, authentic tasks, and their peers rather than passively absorbing information from their instructor (Dewey, 1938; Piaget, 1952; Freire and Ramos, 1970; Cole et al., 1978; Bodner, 1986; DeVries, 2000). In practice, a constructivist classroom is characterized by (1) knowledge shared between instructors and students, (2) instructors and students sharing authority, (3) the instructor serving as a facilitator or guide, and (4) learning occurring in small groups of students (Chung, 1991). Instructional approaches grounded in these principles emphasize student engagement, collaboration, and problem-solving as central to learning. Such approaches have been found to promote chemistry students’ higher-order thinking, conceptual understanding, sense of belonging, and persistence (Freeman et al., 2014; Seery, 2015; Crimmins and Midkiff, 2017). By fostering inclusive learning environments that value students’ experiences, active learning can help promote more equitable learning outcomes across student populations (Stanich et al., 2018; Theobald et al., 2020), and contribute to increased retention and graduation rates for transfer students (Wang et al., 2017; Riedl et al., 2021).
These approaches can take a variety of forms, including, but not limited to, just-in-time teaching (Simkins and Maier, 2010), studio format classrooms (Sorensen et al., 2006), flipped classrooms (Seery, 2015), process-oriented guided-inquiry learning (POGIL; Moog et al., 2009), think-pair-share (Mazur, 1997), and peer-led team-learning (PLTL; Gosser et al., 1996). In the present study, the second semester of a yearlong postsecondary organic chemistry course incorporated both flipped instruction and PLTL, with the goal of providing students with multiple opportunities to actively engage with challenging content, collaborate with their peers, and receive guided support from both their instructor and trained peer leaders.
In a flipped classroom, students engage with the course content before class through readings, videos, or online modules (Seery, 2015), which frees class time for productive problem-solving, discussion, and application of concepts (Liu et al., 2018; Mutanyatta-Comar and Mooring, 2019; Eichler, 2022). For example, students may watch a video and complete a module on electrophilic aromatic substitution (EAS) before arriving to class so they can complete a worksheet with their peers on EAS reactions. The flipped classroom approach shifts the instructor's role from primarily delivering content via lecture to facilitating deeper engagement by guiding students as they work through challenging problems and build understanding. In an organic chemistry context, this might involve having students discuss challenges they encountered during the EAS video lecture in the first few minutes of class. The instructor can walk around the class to gauge the knowledge students currently have and then direct students to questions to solve together, such as practicing drawing EAS intermediates and their resonance structures. Prior chemical education work has reported that flipped organic chemistry classrooms positively impact student attitudes across underrepresented student groups and allow students to set the pace of their learning (Christiansen, 2014; Fautch, 2015; Flynn, 2015; Rossi, 2015; Mooring et al., 2016; Crimmins and Midkiff, 2017; Rocabado et al., 2019; Reimer et al., 2021).
PLTL is rooted in the theory of social constructivism (Cole et al., 1978) and emphasizes the importance of collaboration in the process of knowledge construction (Raker et al., 2021). In the PLTL model, small groups of students (e.g., 6–10 per group) meet weekly to work through problems that build upon content covered in lecture (Gosser et al., 1996). An undergraduate peer leader who has completed the course, generally with an “A” or higher, undergoes training throughout the semester in content and pedagogy. In the organic chemistry context, peer leaders might help students draw the mechanisms of EAS reactions by guiding their reasoning through concepts, such as directing effects, and addressing common struggles (e.g., confusing activating vs. deactivating groups, missing resonance structures). Prior work has suggested that PLTL positively impacts student learning in both general chemistry (Lewis and Lewis, 2005, 2008; Hockings et al., 2008; Lewis, 2011; Mitchell et al., 2012; Shields et al., 2012; Chan and Bauer, 2015) and organic chemistry (Tien et al., 2002; Lyle and Robinson, 2003; Wamser, 2006; Rein and Brookes, 2015; Wilson and Varma-Nelson, 2019, 2021) in multiple facets, including positive affect, content understanding, problem-solving, and retention. Taken together, the PLTL model is not only a means to promote students’ conceptual understanding, but also a structure to foster the social and affective dimensions of learning that contribute to persistence and success in challenging courses like organic chemistry.
Community of inquiry (CoI)
The community of inquiry (CoI) framework serves as a model for understanding how learning occurs through collaboration within higher education (Garrison et al., 1999). This framework reflects the idea that inquiry is a social activity, where collaborative engagement encourages students to take responsibility for constructing and confirming their understanding (Dewey, 1910). The CoI framework has been applied in a variety of learning environments; originally developed for online courses, it has since been applied in hybrid and face-to-face learning environments (Garrison and Kanuka, 2004; Vaughan et al., 2013). The CoI framework emphasizes that meaningful learning develops in communities (e.g., a course environment) where students engage in sustained dialogue, reflection, and shared inquiry. Meaningful learning emerges from the dynamic interplay of three components: social presence, cognitive presence, and teaching presence (see Fig. 1).
Fig. 1 Description of CoI components.
Social presence
Social presence is the most studied component of CoI (e.g., Walther, 1992; Rourke et al., 2001; Richardson and Swan, 2003; Richardson et al., 2017; Oh et al., 2018); it is described as the degree to which students feel that they are able to project themselves socially and emotionally within a course so that they are perceived as authentic participants (Gunawardena and Zittle, 1997). Aspects of social presence include affective/emotional expression, open communication, and group cohesion (Garrison et al., 1999). Together, these highlight that social presence is more than the personal bonds formed within the context of a course; rather, it reflects relationships that are both personal and purposeful within the community of inquiry.
Social presence facilitates classroom relationships and open communication among instructors and students to support collaboration rather than individual information acquisition (Picciano, 2019). Within online course environments, this might take place in discussion forums in course management systems (e.g., Canvas) or communication platforms (e.g., GroupMe or Discord) where students feel comfortable expressing their thoughts; in face-to-face courses, this may occur in collaborative group activities. Previous literature suggests that collaborative activities support increased social presence and sense of community (Rovai, 2002; Richardson and Swan, 2003), which results in increased satisfaction with the learning process (Benbunan-Fich and Hiltz, 2003). A high perception of social presence suggests that students feel comfortable interacting, collaborating, and expressing themselves in the course environment. This is reflective of students building connections with peers, feeling a sense of belonging, and engaging in collaborative learning environments (Rogers and Lea, 2005; Shen et al., 2010). Without sufficient social presence, course interactions risk feeling superficial or transactional.
Cognitive presence
Cognitive presence has long been considered a defining characteristic of higher education, emphasizing critical thinking as both a process and outcome (Dewey, 1910; Garrison et al., 2001). Within the CoI framework, cognitive presence is described as the extent to which students construct and confirm meaning through sustained reflection and discourse (Garrison et al., 1999). This emphasis on meaning making aligns with Ausubel's theory of meaningful learning, which posits that learning is most effective when new information is actively integrated with learners’ existing cognitive structures rather than simply memorized in isolation (Ausubel, 1968). The CoI framework conceptualizes cognitive presence as an inquiry process that occurs through four phases: (1) a triggering event, (2) exploration, (3) integration, and (4) resolution. These phases reflect the process of critical thinking and the means by which cognitive presence develops. Progressing through these phases of inquiry requires thoughtfully designed learning activities, as well as purposeful facilitation and guidance.
Cognitive presence reflects the extent to which students are able to engage in critical thinking and meaning making through reflection and dialogue with their peers. When learning activities are limited to rote memorization or passive information transfer (e.g., traditional lecturing style), opportunities to develop cognitive presence and higher-order thinking skills are reduced (Arbaugh, 2013). Within the classroom, cognitive presence is fostered through collaborative problem-solving activities via class discussion or discussion posts, group projects, or inquiry-based experiments that push students beyond surface-level engagement. Although cognitive presence is often associated with academic performance measures, it does not directly measure content knowledge; rather, it reflects how students perceive their engagement with course content and collaborative knowledge construction (e.g., class discussions). Thus, a high perception of cognitive presence suggests that students perceive themselves as not only engaging with course content on a surface-level, but also as constructing and applying their knowledge in more meaningful ways.
Teaching presence
Teaching presence is the most recently conceptualized component of the CoI framework, described as how the instructor designs and facilitates the educational experience with the goal of supporting social and cognitive presence to help students achieve the learning outcomes and goals (Garrison et al., 1999). To this end, teaching presence is considered to be a significant determinant of student satisfaction, their perceived learning, and sense of community. Teaching presence is itself conceptualized as having three elements: (1) instructional design and organization, (2) facilitating discourse (or building understanding), and (3) direct instruction (Anderson et al., 2019). In practice, teaching presence may be reflected in how an instructor structures group work, offers feedback, addresses misconceptions, or poses guided questions during peer-led activities. A high perception of teaching presence suggests that students perceive the instructor as providing clear communication of expectations, feedback, and guidance that make the learning process feel structured and intentional.
Overall, teaching presence provides design and direction of the educational experience, social presence supports the development of the personal and purposeful relationships needed to foster a collaborative environment, and cognitive presence reflects the depth of higher-order thinking skills encouraged by the environment. These three components are interconnected and collectively shape students’ perceptions of the learning environment.
CoI in chemistry education research
While there is not widespread use of the CoI framework within STEM applications (such studies mainly exist within engineering and medical education disciplines; Sadaf et al., 2021), there is a growing body of work within chemistry education that uses this framework to explore how instructional contexts influence student learning and engagement, particularly following the transition to online learning during the COVID-19 pandemic (Lawrie, 2021). Some studies have used CoI as a framework for course (re)design of general, organic, and elective chemistry courses; redesign initiatives mainly focus on adapting the course for an online format (Lasker et al., 2019), but also include the implementation of structured peer teams (Flener-Lovitt et al., 2020) and online communication tools (Ng et al., 2022). CoI has also been used as a lens to investigate student and instructor experiences within these nontraditional course contexts (Ang and Ng, 2022; Reyes et al., 2024), e.g., investigating their perceptions of asynchronous lectures (Ang and Ng, 2024) and help-seeking behaviors of students underrepresented in chemistry (Williams-Dobosz et al., 2021). The CoI instrument has also been adapted for chemistry-specific contexts, used specifically within introductory chemistry courses in a face-to-face context (Komperda, 2016).
While the CoI framework has been used and adapted for chemistry courses, its application to pedagogies such as the flipped, PLTL organic chemistry course in which our study is situated, provides a unique opportunity to examine students’ perceptions of the learning experience. Social presence, cognitive presence, and teaching presence provide a lens to evaluate the design of the learning environment and understand the ways students perceive themselves engaging in the community and content of the course. In this sense, the CoI framework can assist in illuminating how students perceive the learning environment. Exploring these perceptions over the course of a semester across different performance levels and student backgrounds allows for a more nuanced understanding of student engagement in a constructivist learning environment.
Research question
Guided by the CoI framework, this study aims to quantitatively examine differences in students’ perceptions of social, cognitive, and teaching presence in the second semester of a yearlong postsecondary organic chemistry course sequence, using flipped, PLTL methods. Specifically, we ask: Do students’ perceptions of the CoI components differ (1) across the semester, (2) by course grade, and (3) by admit type?
Methods
This work was conducted under application Pro#00028802, “Comprehensive evaluation of the University of South Florida's undergraduate and graduate chemistry curricula”, as reviewed by the University of South Florida's Institutional Review Board on December 13, 2016.
Data collection & course context
Data were collected at the University of South Florida: a large, public, research-intensive university and emerging Hispanic-Serving Institution in the southeastern United States. Data were collected from students enrolled in the second semester of a yearlong postsecondary organic chemistry course sequence across four semesters (i.e., Spring/Fall 2023 and Spring/Fall 2024) and taught by author K. B.-R. The course used Klein's Organic Chemistry, 4th edn. textbook (Klein, 2021). Each course section enrolled up to 150 students, organized into groups of five for in-class collaborative work. The second semester organic chemistry course included a separate laboratory course that was offered as a recommended co-requisite but was not required for enrollment in the lecture component. The laboratory was conducted in a traditional, instructor (or teaching assistant)-led format and was administratively and pedagogically separate from the flipped, PLTL lecture course. Student enrollment in the laboratory course was not controlled for in the present study.
The lecture course was taught in a flipped, PLTL format, which the instructor (author K. B.-R.) implemented with the goal of fostering students’ self-regulated learning, problem-solving, teamwork, and communication skills, in addition to mastery of organic chemistry content. The instructor positioned themselves as a facilitator, mentor, and manager of the learning environment by guiding students through content, coordinating course logistics, and supporting students and peer leaders in developing transferable skills for their future careers.
Prior to each lecture, students engaged with course content through video lectures and textbook readings. Within each class period, approximately 45 minutes were devoted to instructor-led discussion of major topics and common misconceptions, followed by 20 minutes of worksheet practice problems in small groups of 5 students and 10 minutes of class-wide clicker questions (a class period is 75 minutes). A team of 45 trained peer leaders supported these sessions, each responsible for two groups of five students. Peer leaders facilitated in-class group work, guided discussion, helped students reflect on problem-solving processes, and served as peer mentors. Peer leaders were undergraduate students who had previously completed the course with an “A” and participated in a weekly peer leadership course covering a review of content to be learned in the course that week as well as facilitation strategies, cognitive psychology, growth mindset, and the creation of supportive learning communities. Students also had multiple opportunities for interaction outside of class, including small-group GroupMe chats, a course Discord server (for virtual office hours and peer assistance), in-person office hours, and a weekly 50-minute recitation session with a graduate teaching assistant. These opportunities were made available to further foster peer collaboration, social connection, and engagement with course material.
CoI instrument. The CoI framework was used to inform the development of the original instrument (Arbaugh et al., 2008). Like the framework itself, the instrument was originally developed for online contexts but has since been adapted and applied across face-to-face and hybrid environments, as well as across several disciplines (e.g., E-learning, education, languages and literature, medicine and health sciences, computer science; Stenbom, 2018). The instrument has largely been used to capture how larger groups of students perceive the three components of the learning environment, to compare features or interventions, and to investigate the relationship between the components and other variables (e.g., satisfaction; Garrison et al., 2010; Redstone et al., 2018; Stenbom, 2018; Castellanos-Reyes, 2020; Sadaf et al., 2021).

Students’ perceptions of teaching, social, and cognitive presence were measured twice during the semester using a revised CoI instrument for chemistry (Komperda, 2016). The instrument comprises 37 items: thirteen items for teaching presence, nine items for social presence, fourteen items for cognitive presence, and one attention-check item. Example items are shown in Fig. 2. Responses were collected on a five-point Likert-type scale with the options Strongly Disagree, Disagree, Neutral, Agree, and Strongly Agree (scored 1–5, respectively).
Fig. 2 Example CoI items.
The CoI instrument was administered via Qualtrics to all students enrolled (N = 1611; including students who withdrew from the course) at two timepoints approximately eight weeks apart: after Exam 1 and before the cumulative Final Exam (see Fig. 3). Students received bonus points towards their final examination score for completing the CoI instrument. Only responses from students who completed the CoI instrument (i.e., no more than five missing responses and a passed attention-check item) at both timepoints and received a passing final grade (i.e., A–C) were included in our analyses (N = 1052; response rate of 65.3%). A median imputation was conducted for remaining items missing a response.
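The screening and imputation rules above can be sketched as follows. This is a minimal Python illustration only; the study's analyses were conducted in R, and the function, parameter, and column names here are ours.

```python
import numpy as np
import pandas as pd

def screen_and_impute(responses: pd.DataFrame,
                      attention_ok: pd.Series,
                      max_missing: int = 5) -> pd.DataFrame:
    """Keep students with at most `max_missing` skipped items who passed
    the attention check, then median-impute any remaining missing items.

    `responses` holds one row per student and one column per CoI item
    (1-5 Likert scores; NaN marks a missing response). `attention_ok`
    flags students who passed the attention-check item.
    """
    # Screening: few enough missing responses AND a passed attention check.
    keep = (responses.isna().sum(axis=1) <= max_missing) & attention_ok
    kept = responses.loc[keep].copy()
    # Median imputation: fill each remaining gap with that item's median
    # computed over the retained students.
    return kept.fillna(kept.median())
```

A full replication would also require matching responses across the two timepoints and filtering on final course grade, which depends on registrar data not shown here.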
Fig. 3 Timeline of course content coverage, examinations, and CoI administrations.
Course grade. Course grade was determined by performance on clicker questions, group work, Discussion section engagement, homework, a science literature project, four in-term examinations, and a cumulative final examination. Letter grades were determined with the following ranges: “A” 85.00–100.00%, “B” 70.00–84.99%, and “C” 60.00–69.99%. The DFW rate for this course ranged from 5–8% over the four semesters CoI data were collected; these observations were excluded from our analyses due to incomplete information (e.g., withdrawals) and the small size of the subgroup. Within our data, 489 students received an “A” (46.48%), 418 received a “B” (39.73%), and 145 received a “C” (13.79%) for their final course letter grade.
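As a concrete illustration of the grade bands above, the mapping from course percentage to letter grade can be expressed as a simple threshold function. This Python sketch is ours; the "DFW" label for sub-60% outcomes is an assumption for illustration and not part of the course's grading scheme.

```python
def letter_grade(pct: float) -> str:
    """Map a course percentage to the letter-grade bands reported above."""
    if pct >= 85.0:
        return "A"   # 85.00-100.00%
    if pct >= 70.0:
        return "B"   # 70.00-84.99%
    if pct >= 60.0:
        return "C"   # 60.00-69.99%
    return "DFW"     # below 60%; these observations were excluded
```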
Admit type. Admit type distinguishes first-time-in-college (FTIC) and transfer (TS) students. FTIC students are defined as having “no prior postsecondary experience attending any institution for the first time at the undergraduate level” (National Center for Education Statistics, 2025); the TS admit type is inclusive of students who transferred from a two-year or four-year institution with 12 or more credit hours. During the timeframe data were collected, the University of South Florida enrolled an average of 3730 new FTIC students and 3263 TS students per year (University of South Florida, 2023, 2024, 2025). Within our data, 842 students are FTIC (80.04%) and 210 students are TS (19.96%). In line with prior work (Frost et al., 2024b), a significant difference in final course grade was found between admit types (see Appendix 1, Table 9 for descriptive statistics) using a t-test: t(325) = 8.0321, p < 0.001, d = 0.62 (medium).
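The comparison above pairs a t-test (the fractional degrees of freedom are consistent with a Welch-style test for unequal group sizes) with a pooled-SD Cohen's d. A minimal Python sketch of that computation follows; the study's analyses were run in R, and the function name and example data here are illustrative only.

```python
import numpy as np
from scipy import stats

def welch_t_and_cohens_d(a: np.ndarray, b: np.ndarray):
    """Welch's t-test plus Cohen's d (pooled-SD form) for two groups,
    e.g., numerically coded course grades for FTIC vs. transfer students."""
    # Welch's t-test: does not assume equal group variances.
    t, p = stats.ttest_ind(a, b, equal_var=False)
    # Cohen's d using the pooled standard deviation.
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    d = (a.mean() - b.mean()) / pooled_sd
    return t, p, d
```

By convention, |d| around 0.2 is considered small, 0.5 medium, and 0.8 large, which is why the reported d = 0.62 is labeled medium.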
Data analysis
Analyses were conducted using R (version 4.5.0; R Core Team, 2025) via RStudio (version 2025.05.0 + 496; RStudio Team, 2025). Descriptive statistics were determined using the ‘psych’ package (Revelle, 2024). Confirmatory factor analyses were conducted using the ‘lavaan’ package (Rosseel, 2012), Cronbach's alpha (α) and McDonald's omega (ω) were calculated using the ‘semTools’ package (Jorgenson et al., 2022), and dynamic fit indices were calculated using the ‘dynamic’ package (Wolf and McNeish, 2023). Nonparametric analyses were conducted using the ‘npmv’ (Burchett et al., 2017), ‘rcompanion’ (Mangiafico, 2025), ‘rstatix’ (Kassambara, 2025), and ‘FSA’ (Ogle, 2017) packages.
Psychometric analysis
To ensure that the CoI measurement model captured the intended constructs of teaching presence (TP), social presence (SP), and cognitive presence (CP), evidence for validity and reliability was gathered using confirmatory factor analysis and measurement invariance testing.
Confirmatory factor analysis. To investigate the internal structure of our data, confirmatory factor analyses were conducted to test a three-factor model. Due to the ordinal nature of the data collected (i.e., Likert scale responses), the model was estimated using the weighted least squares means and variance adjusted (WLSMV) method. The comparative fit index (CFI), Tucker–Lewis index (TLI), and root-mean-square error of approximation (RMSEA) were used to evaluate model fit. Robust values for each are reported for all relevant analyses (Brosseau-Liard and Savalei, 2014). General recommendations for good model fit are: CFI > 0.90, TLI > 0.90, RMSEA < 0.08 (Hu and Bentler, 1999).

Cronbach's α and McDonald's ω are reported for each of the three CoI components as measures of internal consistency. As prior work with CoI reports Cronbach's α, this value is reported herein for comparative purposes. Due to the congeneric nature of the proposed model, CoI scores were evaluated using McDonald's ω. Cronbach's α and McDonald's ω are interpreted similarly, with general recommendations of a value of 0.7 or higher suggesting acceptable internal consistency (Cortina, 1993; Komperda et al., 2018).
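The distinction between the two reliability coefficients can be made concrete with a short sketch. The paper's analyses used the ‘semTools’ R package; the Python below is only an illustration, and the item scores and loadings are hypothetical: α is computed from raw item scores, while ω is computed from standardized factor loadings of a congeneric model.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha from raw scores; `items` is a list of columns,
    one list of respondent scores per item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    return (k / (k - 1)) * (1 - sum(variance(col) for col in items) / variance(totals))

def mcdonald_omega(loadings):
    """McDonald's omega from standardized factor loadings; each item's
    error variance is taken as 1 - loading**2."""
    lam = sum(loadings)
    theta = sum(1 - l ** 2 for l in loadings)
    return lam ** 2 / (lam ** 2 + theta)

# Hypothetical data: three Likert items answered by four students.
items = [[4, 5, 3, 4], [4, 4, 3, 5], [5, 5, 2, 4]]
print(round(cronbach_alpha(items), 3))            # 0.818
print(round(mcdonald_omega([0.8, 0.7, 0.6]), 3))  # 0.745
```

Because ω weights items by their loadings rather than assuming equal contributions, it is the more defensible summary for a congeneric model, which is why both coefficients are reported above.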
The internal structure of the CoI data was evaluated with a three-factor model estimated on the combined Timepoint 1 and Timepoint 2 dataset (see Appendix 1, Table 10 for all item loadings); this pooled model was used to establish the baseline factor structure prior to conducting measurement invariance analyses. Fit statistics for the model and internal consistency measures showed similar patterns to those previously reported (Komperda, 2016). To evaluate whether each presence scale could also be treated as a unidimensional construct, we estimated separate single-factor models for TP, SP, and CP. These models demonstrated acceptable fit and yielded consistent reliability estimates (see Appendix 1, Table 11), supporting the interpretation of each presence as a coherent scale-level factor for subsequent latent mean comparisons. For the three-factor model, acceptable data model fit was obtained: χ2 (591, N = 2104) = 6298.703, p < 0.001, CFI = 0.895, TLI = 0.888, RMSEA = 0.068. Although the CFI and TLI values fall slightly below commonly cited cutoff criteria (e.g., 0.90), rigid adherence to universal cutoff values is not recommended; model evaluation should also consider sample size, model complexity, and theoretical plausibility (Marsh et al., 2004). In large-sample CFA models with many indicators, incremental fit indices are known to be particularly sensitive to minor model misspecifications. In this context, the RMSEA value falls within the general acceptable range, and the overall pattern of fit indices suggests reasonable model fit. Moreover, these fit statistics are consistent with prior work with the revised CoI instrument for chemistry (Komperda, 2016). Acceptable internal consistency was also observed for each component: αTP = 0.936, αSP = 0.892, αCP = 0.910; ωTP = 0.942, ωSP = 0.900, ωCP = 0.919.
Measurement invariance. Longitudinal measurement invariance and group measurement invariance testing were conducted to ensure that the CoI data represent the same constructs across time and subgroups (i.e., course letter grade and admit type). Three models were evaluated: configural, metric, and scalar. The configural model serves as the baseline model for the entire dataset, meaning factor loadings and thresholds are freely estimated for each timepoint or group. The metric model constrains the factor loadings to be equal across timepoints or groups. The scalar model further constrains factor loadings and thresholds to be equal across timepoints or groups. Invariance is desired and is exemplified by comparable fit between successive models (e.g., configural to metric, metric to scalar), indicated by a nonsignificant χ2, ΔCFI ≤ 0.01, ΔTLI ≤ 0.01, and ΔRMSEA ≤ 0.015 (Chen, 2007). When the most constrained model (e.g., scalar) demonstrates acceptable fit, this can also be used as evidence to support invariance (Putnick and Bornstein, 2016).

Results of the longitudinal measurement invariance testing (see Appendix 1, Table 12) suggest that the CoI instrument functioned consistently across both timepoints. Configural, metric, and scalar invariance were each supported (e.g., ΔCFI, ΔTLI, and ΔRMSEA ≤ 0.003), providing evidence to suggest that students interpreted the constructs similarly at the beginning and end of the semester. In other words, observed differences across timepoints can be interpreted as meaningful changes in student perceptions rather than shifts in understanding of the survey items.
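The change-in-fit decision rule can be expressed as a small helper. This is an illustrative Python sketch using the cutoffs recommended by Chen (2007), not the procedure of any particular package, and the fit values shown are hypothetical.

```python
def invariance_supported(fit_less, fit_more, d_cfi=0.01, d_tli=0.01, d_rmsea=0.015):
    """Change-in-fit decision rule for two nested invariance models
    (e.g., configural vs. metric), in the spirit of Chen (2007).
    Each argument is a dict with 'cfi', 'tli', and 'rmsea' keys."""
    return (abs(fit_less['cfi'] - fit_more['cfi']) <= d_cfi
            and abs(fit_less['tli'] - fit_more['tli']) <= d_tli
            and abs(fit_less['rmsea'] - fit_more['rmsea']) <= d_rmsea)

# Hypothetical fit indices for a configural and a metric model.
configural = {'cfi': 0.902, 'tli': 0.896, 'rmsea': 0.066}
metric = {'cfi': 0.900, 'tli': 0.895, 'rmsea': 0.067}
print(invariance_supported(configural, metric))  # True
```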
Group measurement invariance testing was also conducted by course grade (“A”, “B”, “C”; see Appendix 1, Table 13) and admit type (FTIC vs. TS; see Appendix 1, Table 14) at both timepoints. Individual CFA models generally reflected the fit seen in the original CFA; however, a few groups showed poorer fit indices (e.g., course grade “C” at both timepoints). Dynamic fit index analyses (using the ‘dynamic’ package) suggested these deviations were likely due to sample size rather than substantive model misspecification (see Appendix 1, Table 15; McNeish and Wolf, 2023a; Wolf and McNeish, 2023, 2024; McNeish, 2024). Across both comparisons, configural, metric, and scalar invariance were supported (for course grade: ΔCFI, ΔTLI, and ΔRMSEA ≤ 0.007; for admit type: ΔCFI, ΔTLI, and ΔRMSEA ≤ 0.004), suggesting that the CoI instrument measures the same constructs regardless of course grade or admit type. Differences observed across student groups therefore reflect differences in their perceptions rather than differences in interpretation of the survey.
Group comparisons
Following measurement invariance testing, we sought to investigate differences across various groups through analysis-of-variance-type tests and structured means modeling. Because the individual CoI survey items are ordinal (i.e., Likert scale) and preliminary analyses indicated that the assumption of multivariate normality was violated, parametric tests (e.g., multivariate analysis of variance) were inappropriate. Thus, nonparametric alternatives were used to examine if and where differences occurred between our groups.
Nonparametric multivariate analysis of variance (MANOVA). A nonparametric MANOVA was used to determine the differences between students’ perceptions of CoI components and other variables (e.g., timepoint; Wilks, 1932). The ‘nonpartest’ and ‘ssnonpartest’ functions in the ‘npmv’ package were used to calculate global nonparametric test statistics, permutation-based analogs, and nonparametric relative effect sizes (Burchett et al., 2017). For our data, Wilks’ lambda was used as the preferred test statistic. The reported relative effect sizes indicate the direction in which groups differ by quantifying the probability that a randomly chosen observation from one group is higher than a randomly chosen observation from the overall population. Relative effects range from 0 to 1, with values above 0.5 indicating that observations in that group tend to be higher than the overall population. For significant nonparametric MANOVA results, a Mann–Whitney U or Kruskal–Wallis test was conducted based on the number of independent groups.
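The relative effect described above can be sketched in a few lines. The paper computed these with the ‘npmv’ R package; the Python below is an illustration with hypothetical Likert-style scores, counting ties as one half.

```python
def relative_effect(group, overall):
    """P(random observation from `group` > random observation from the
    pooled `overall` sample), counting ties as one half."""
    wins = sum((g > o) + 0.5 * (g == o) for g in group for o in overall)
    return wins / (len(group) * len(overall))

# The group's own observations are included in the pooled sample,
# as in the description above.
overall = [1, 2, 3, 3, 4, 5]
print(relative_effect([3, 4, 5], overall))  # ~0.722: this group tends to score higher
```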
Mann–Whitney U test. The Mann–Whitney U test is a nonparametric alternative to the independent t-test and is used to investigate whether two independent groups differ (e.g., CoI perceptions at Timepoint 1 vs. Timepoint 2; Mann and Whitney, 1947). Rank-biserial correlations are reported as effect sizes and can be interpreted similarly to Pearson correlation coefficients: 0.10 (small), 0.30 (medium), and 0.50 (large; Cureton, 1956; Cohen, 1988). The ‘rcompanion’ (Mangiafico, 2025) package was used for these analyses.
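As an illustration of the statistic and its effect size (the paper used R; this pure-Python sketch handles ties with midranks and uses the common rank-sum formulation of U for the first group):

```python
def midranks(values):
    """Ranks with ties given their average (mid) rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        i = j + 1
    return r

def mann_whitney_u(x, y):
    """U statistic for group x via the rank-sum formulation."""
    r = midranks(list(x) + list(y))
    return sum(r[:len(x)]) - len(x) * (len(x) + 1) / 2

def rank_biserial(x, y):
    """Rank-biserial correlation: r = 2U/(n1*n2) - 1, ranging over [-1, 1]."""
    return 2 * mann_whitney_u(x, y) / (len(x) * len(y)) - 1

print(rank_biserial([5, 6, 7], [1, 2, 3]))  # 1.0: every x exceeds every y
```

A value of 0 indicates complete overlap between the groups; the small negative values reported later in this paper indicate a modest tendency for one group's scores to exceed the other's.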
Kruskal–Wallis test. The Kruskal–Wallis test is a nonparametric alternative to the one-way ANOVA and is used to investigate whether differences are present among three or more independent groups (e.g., course letter grade; Kruskal and Wallis, 1952). Eta-squared (η2) values are reported as effect sizes and can be interpreted as follows: 0.01 (small), 0.06 (medium), and ≥0.14 (strong; Cohen, 1988). For significant Kruskal–Wallis results, Dunn's post-hoc test was conducted to evaluate pairwise comparisons and investigate where differences lie (Dunn, 1964). A Bonferroni correction was used to control for Type I error (Holm, 1979). The ‘rstatix’ (Kassambara, 2025) and ‘FSA’ (Ogle, 2017) packages were used for these analyses.
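A minimal sketch of the H statistic, its η² effect size, and the Dunn z statistic follows. The paper's analyses used the ‘rstatix’ and ‘FSA’ R packages; the Python below is an illustration with hypothetical data, and tie corrections are omitted for brevity.

```python
from math import sqrt

def midranks(values):
    """Ranks with ties given their average (mid) rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def kruskal_wallis_h(groups):
    """H statistic (tie correction omitted)."""
    pooled = [v for g in groups for v in g]
    r = midranks(pooled)
    n = len(pooled)
    h, start = 0.0, 0
    for g in groups:
        rank_sum = sum(r[start:start + len(g)])
        h += rank_sum ** 2 / len(g)
        start += len(g)
    return 12 / (n * (n + 1)) * h - 3 * (n + 1)

def eta_squared(h, groups):
    """Eta-squared for Kruskal-Wallis: (H - k + 1) / (n - k)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    return (h - k + 1) / (n - k)

def dunn_z(groups, i, j):
    """Dunn's z statistic for the pairwise comparison of groups i and j
    (tie correction omitted); |z| is compared against a Bonferroni-adjusted
    critical value."""
    pooled = [v for g in groups for v in g]
    r = midranks(pooled)
    n = len(pooled)
    mean_ranks, start = [], 0
    for g in groups:
        mean_ranks.append(sum(r[start:start + len(g)]) / len(g))
        start += len(g)
    se = sqrt(n * (n + 1) / 12 * (1 / len(groups[i]) + 1 / len(groups[j])))
    return (mean_ranks[i] - mean_ranks[j]) / se

groups = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]  # hypothetical scores for three groups
h = kruskal_wallis_h(groups)
print(round(h, 2), round(eta_squared(h, groups), 3))  # 7.2 0.867
```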
Structured means modeling. Comparisons across groups are often conducted using traditional ANOVA or MANOVA techniques (and nonparametric alternatives) on composite scale scores; these approaches assume that all items contribute equally to a scale and contain the same amount of measurement error. In reality, items may vary in how strongly they reflect an underlying construct, and each item has unique variance representing measurement error. Ignoring these differences can lead to biased estimates of group differences.

Structured means modeling (SMM) provides an alternative, latent variable approach to ANOVA-type comparisons (Hancock, 1997; Steinmetz, 2013). SMM is based on confirmatory factor analysis, where items are modeled as indicators of underlying latent structures; by explicitly modeling the mean structure of the factors, SMM allows for comparison of error-free latent factor means across groups rather than composite scale scores (Sörbom, 1974). This is accomplished by establishing scalar invariance, ensuring that the latent factors are measured equivalently so that observed differences reflect differences in the underlying construct rather than differences in item functioning. In practice, the mean of each factor is set relative to a reference group (typically zero), and differences between groups are interpreted as deviations from this reference.
SMM is particularly useful in cases where (1) the instrument has a defined factor structure and items differ in their loadings, (2) measurement error is present, or (3) accurate estimation of group differences on underlying constructs is desired. Statistical significance of the latent mean differences can be tested using Wald tests in the ‘lavaan’ package. Effect sizes of these differences (reported as ‘Effect size SMM’) are calculated as the absolute value of the factor mean difference divided by the pooled standard deviation of the factors. Although calculated similarly to Cohen's d, these effect sizes should not be interpreted on the same scale of magnitude; latent factors are free from measurement error, thus these effect sizes tend to be larger than those computed from composite scale scores (Hernán and Robins, 2006).
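The effect size computation itself is a one-liner. The pooling below (square root of the average of the two factor variances) and the unit-variance example are our illustrative assumptions, not necessarily the exact pooling used with a given fitted model.

```python
from math import sqrt

def smm_effect_size(mean_diff, factor_var_ref, factor_var_other):
    """Standardized latent mean difference: |mean difference| divided by
    the square root of the average of the two factor variances (one
    plausible pooling; assumed here for illustration)."""
    return abs(mean_diff) / sqrt((factor_var_ref + factor_var_other) / 2)

# With unit factor variances, a latent mean difference of 0.159
# yields a standardized effect size of 0.16.
print(round(smm_effect_size(0.159, 1.0, 1.0), 2))  # 0.16
```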
By explicitly modeling the latent structure, SMM provides more precise and interpretable comparisons across groups, complementing traditional techniques, and may reveal patterns obscured by measurement error.
Results & discussion
Results of our study suggest:
1. Across the semester – Student perceptions of CoI components are found to differ, with evidence to suggest that perceptions of teaching presence, social presence, and cognitive presence are higher at the end of the semester.
2. By course grade – Student perceptions of social presence and cognitive presence are found to differ; students that received an “A” in the course have higher perceptions of these components than other students in the course.
3. By admit type – Student perceptions of teaching presence are found to differ; transfer students have higher perceptions of this component than their FTIC peers.
Research question 1: Do students’ perceptions of CoI components differ across the semester?
A nonparametric MANOVA revealed significant differences in students’ perceptions of the CoI components across the semester: Λ(3, 2100) = 6.258, p < 0.001. The median predicted factor scores reported in Table 1 represent the central tendency of students’ standardized latent perceptions, where values above zero indicate perceptions above the sample mean; increases in median values from Timepoint 1 to Timepoint 2 reflect upward shifts in students’ perceived teaching, social, and cognitive presence across the semester. Relative effects indicate that students were more likely to report higher levels of teaching, social, and cognitive presence at Timepoint 2 compared to Timepoint 1, as evidenced by values above 0.5 across all components at Timepoint 2; this suggests a consistent shift towards more positive perceptions at the end of the semester.
Table 1 Descriptive statistics for predicted factor scores of CoI components and relative effects (N = 1052)^a

| | Median | SD | Relative effects |
|---|---|---|---|
| TP1 | 0.16 | 0.92 | 0.451 |
| TP2 | 0.25 | 0.91 | 0.549 |
| SP1 | −0.09 | 1.03 | 0.454 |
| SP2 | 0.03 | 1.04 | 0.546 |
| CP1 | −0.08 | 0.96 | 0.462 |
| CP2 | 0.06 | 1.00 | 0.538 |
| CG | 83.06 | 10.86 | — |

^a Raw item statistics by CoI component are reported in Appendix 1, Tables 16–18.
To further examine these differences, Mann–Whitney U tests (with rank-biserial correlations) were conducted for each component. Results confirmed that perceptions of teaching presence (W = 499,104, p < 0.001, rrb = −0.098), social presence (W = 502,537, p < 0.01, rrb = −0.092), and cognitive presence (W = 511,760, p < 0.01, rrb = −0.075) were all significantly higher at Timepoint 2, though with small effect sizes. Latent mean comparisons were also investigated via SMM, as shown in Table 2; a positive mean difference indicates that the comparison group (i.e., Timepoint 2) has a higher latent mean than the reference group (i.e., Timepoint 1), with standardized mean difference measures (i.e., Effect size SMM) indicating the magnitude of change. For example, a mean difference of 0.159 for teaching presence suggests that its latent mean at Timepoint 2 is 0.159 units higher than at Timepoint 1. These findings align with the Mann–Whitney U test results, suggesting that students’ perceptions of CoI components increased significantly over time, although the effect sizes were small.
Table 2 Latent mean comparisons of CoI components by timepoint^a

| Component | Mean difference | Effect size SMM |
|---|---|---|
| TP | 0.159* | 0.16 |
| SP | 0.135** | 0.13 |
| CP | 0.117** | 0.11 |

^a Mean differences are reported for Timepoint 2 relative to Timepoint 1. *p < 0.001; **p < 0.01.
Timepoint 1 was administered after Exam 1, meaning it captured students’ initial exposure to the flipped, PLTL course context and a measure of their course performance. At Timepoint 2, students had completed the majority of the semester. Consistent increases over time for each component suggest that extended exposure to the flipped, PLTL course structure likely supported meaningful improvements in students’ perceptions of the learning environment.
The features of the flipped, PLTL structure (e.g., pre-class preparation, peer collaboration, peer-led teamwork, instructor guidance) likely helped students to feel more connected, engaged, and supported in the course. As this increase in student perception was measured after students had experienced the course structure, it is reasonable to interpret the course format as a contributing factor to these observed changes. While overall increases in perception were small, students’ prior experiences may have shaped the size of these effects, indicating that the course may have influenced students differently depending on their background. For example, at this institution, PLTL formats are frequently used in the year-long general chemistry course sequence; having engaged in similar active-learning formats, students may have started the semester with positive perceptions of the learning environment, leaving less room for substantive change. Thus, even small increases in perception suggest that continued exposure to the flipped, PLTL format in Organic Chemistry 2 reinforced students’ positive experiences.
Research question 2: Do students’ perceptions of CoI components differ by their course grade?
Nonparametric MANOVA tests conducted at each timepoint reveal significant differences in students’ perceptions of CoI components by course letter grade: Timepoint 1 Λ(6, 2094) = 19.848, p < 0.001 and Timepoint 2 Λ(6, 2094) = 19.148, p < 0.001. The median predicted factor scores and relative effects (Table 3) suggest that students who earned an “A” consistently reported higher perceptions of social and cognitive presence compared to those who earned a “B” or “C”; students who earned a “C” consistently reported higher perceptions of teaching presence.
Table 3 Descriptive statistics for predicted factor scores of CoI components and relative effects by course letter grade: A (n = 489), B (n = 418), C (n = 145)

| | A: Median | SD | Relative effects | B: Median | SD | Relative effects | C: Median | SD | Relative effects |
|---|---|---|---|---|---|---|---|---|---|
| TP1 | 0.13 | 0.98 | 0.490 | 0.18 | 0.88 | 0.498 | 0.18 | 0.84 | 0.539 |
| TP2 | 0.26 | 0.93 | 0.498 | 0.25 | 0.89 | 0.495 | 0.32 | 0.87 | 0.520 |
| SP1 | −0.01 | 0.95 | 0.543 | −0.18 | 1.06 | 0.464 | −0.23 | 1.14 | 0.460 |
| SP2 | 0.20 | 0.93 | 0.543 | −0.04 | 1.09 | 0.466 | −0.10 | 1.16 | 0.452 |
| CP1 | 0.07 | 0.97 | 0.541 | −0.17 | 0.89 | 0.474 | −0.41 | 1.07 | 0.436 |
| CP2 | 0.20 | 0.96 | 0.550 | −0.08 | 0.98 | 0.461 | −0.18 | 1.11 | 0.441 |
| CG^a | 92.89 | 4.66 | — | 77.35 | 4.12 | — | 66.05 | 2.83 | — |

^a Course grade.
To further investigate these differences, Kruskal–Wallis tests were conducted for each CoI component at both timepoints, followed by Dunn's post hoc test when appropriate (Table 4). Results indicate that differences primarily lie in perceptions of social and cognitive presence, with students earning an “A” grade perceiving these components significantly higher than students earning a “B” or “C”. Teaching presence did not differ significantly across course letter grade groups at either timepoint. Latent means were compared via SMM; the results, shown in Table 5, confirm these findings.
Table 4 Kruskal–Wallis tests and Dunn's test results for course letter grade^a

| Component | Timepoint 1: Kruskal–Wallis | η² | Dunn's test | Timepoint 2: Kruskal–Wallis | η² | Dunn's test |
|---|---|---|---|---|---|---|
| TP | 3.18 | 0.001 | — | 0.80 | −0.001 | — |
| SP | 20.02* | 0.017 | A-B,* A-C** | 20.55* | 0.018 | A-B,* A-C** |
| CP | 20.73* | 0.017 | A-B,** A-C* | 28.32* | 0.025 | A-B,** A-C** |

^a All p values are adjusted with the Bonferroni correction. *p < 0.001; **p < 0.01.
Table 5 Latent mean comparisons of CoI components by course letter grade^a

| Component | Comparison | Timepoint 1: Mean difference | Effect size SMM | Timepoint 2: Mean difference | Effect size SMM |
|---|---|---|---|---|---|
| TP | A vs. B | −0.043 | 0.05 | 0.005 | 0.01 |
| TP | A vs. C | −0.196 | 0.21 | −0.082 | 0.09 |
| TP | B vs. C | −0.152 | 0.17 | −0.087 | 0.10 |
| SP | A vs. B | 0.285* | 0.27 | 0.309* | 0.29 |
| SP | A vs. C | 0.326** | 0.30 | 0.364** | 0.33 |
| SP | B vs. C | 0.042 | 0.04 | 0.055 | 0.05 |
| CP | A vs. B | 0.192** | 0.20 | 0.298* | 0.30 |
| CP | A vs. C | 0.303** | 0.29 | 0.384* | 0.36 |
| CP | B vs. C | 0.111 | 0.11 | 0.086 | 0.08 |

^a The reference group is the first group listed in each comparison (e.g., mean differences are reported for students earning a “B” relative to students earning an “A”). *p < 0.001; **p < 0.01.
Ideally, we would observe negligible differences in perceptions of these components by course grade, suggesting that students at all performance levels perceived the course similarly in terms of these three components. However, our results suggest that the observed differences reflect areas where this course structure may have been experienced differently depending on student performance. The small differences in social presence perception suggest that PLTL activities generally support engagement for most students, but that high-performing students experienced slightly greater interaction. The small differences in perceptions of cognitive presence at the end of the semester suggest that higher-performing students felt more cognitively engaged; these students may have felt that they were able to better leverage the active learning and problem-solving components of the course. In contrast, teaching presence remained consistent across grade groups, indicating that students generally perceived clear guidance, feedback, and support from the instructor, regardless of their performance in the course.
Overall, these patterns suggest that performance in the course was related to how students experienced the learning environment: high-performing students in the course tended to report stronger social and cognitive presence than the rest of their peers, while perceptions of teaching presence remained relatively consistent regardless of grade. These indicate opportunities for helping all students perceive the learning environment similarly.
Research question 3: Do students’ perceptions of CoI components differ by their admit type?
Nonparametric MANOVA tests, conducted for admit type (i.e., FTIC or TS) at each timepoint, reveal significant differences in students’ perceptions of CoI components: Timepoint 1 Λ(3, 1072) = 10.773, p < 0.001 and Timepoint 2 Λ(3, 1072) = 4.712, p < 0.01. The median predicted factor scores along with the relative effects (Table 6) suggest that FTIC students have higher perceptions of social presence, and transfer students have higher perceptions of teaching presence and cognitive presence.
Table 6 Descriptive statistics for predicted factor scores of CoI components and relative effects by admit type: FTIC (n = 842) and TS (n = 210)

| | FTIC: Median | SD | Relative effects | TS: Median | SD | Relative effects |
|---|---|---|---|---|---|---|
| TP1 | 0.12 | 0.95 | 0.429 | 0.27 | 0.80 | 0.571 |
| TP2 | 0.22 | 0.92 | 0.455 | 0.39 | 0.85 | 0.545 |
| SP1 | −0.08 | 1.03 | 0.527 | −0.14 | 1.03 | 0.473 |
| SP2 | 0.05 | 1.05 | 0.520 | −0.01 | 1.03 | 0.480 |
| CP1 | −0.08 | 0.96 | 0.473 | −0.07 | 0.93 | 0.527 |
| CP2 | 0.09 | 0.99 | 0.489 | 0.10 | 1.03 | 0.511 |
| CG | 82.90 | 10.88 | — | 84.74 | 10.75 | — |
To further investigate these differences, Mann–Whitney U tests (with rank-biserial correlations) were conducted for each CoI component by admit type at each timepoint (Table 7). Rank-biserial correlations (rrb) were calculated as effect size estimates to quantify the magnitude and direction of differences in predicted factor scores between FTIC and TS groups; negative values indicate higher predicted factor scores for TS compared to FTIC students. As with investigating differences by course letter grade, in an ideal scenario we would observe no differences between students in these groups, as this would reflect a course environment that supports all students similarly. Our results suggest that differences between admit types exist only for student perceptions of teaching presence, at both timepoints. Specifically, transfer students have higher perceptions of teaching presence than FTIC students, though the effect sizes are small. Latent mean comparisons via SMM further confirm these results (Table 8).
Table 7 Mann–Whitney U test results comparing predicted factor scores for CoI components between FTIC and TS groups at each timepoint^a

| Component | Timepoint 1: Mann–Whitney | rrb^b | Timepoint 2: Mann–Whitney | rrb^b |
|---|---|---|---|---|
| TP | 81,251* | −0.143 | 86,274** | −0.090 |
| SP | 100,000 | 0.055 | 98,604 | 0.040 |
| CP | 89,586 | −0.055 | 92,617 | −0.023 |

^a All p values are adjusted with the Bonferroni correction. *p < 0.001; **p < 0.05. ^b Rank-biserial correlation.
Table 8 Latent mean comparisons for CoI components between FTIC and TS groups at each timepoint^a

| Component | Timepoint 1: Mean difference | Effect size SMM | Timepoint 2: Mean difference | Effect size SMM |
|---|---|---|---|---|
| TP | 0.244* | 0.26 | 0.133 | 0.14 |
| SP | −0.089 | 0.09 | −0.066 | 0.06 |
| CP | 0.100 | 0.10 | 0.017 | 0.02 |

^a Mean differences are reported for the TS group relative to the FTIC group. *p < 0.001.
For our data, significant differences by admit type were observed only for teaching presence, with transfer students reporting slightly higher perceptions than their FTIC peers. This higher teaching presence perception by transfer students may reflect the value of structured guidance and instructor support as they navigate a new academic environment. Furthermore, transfer students coming from community colleges may be more familiar with smaller class sizes, where active-learning environments are often easier to implement; thus, students may feel more connected to their instructor in this environment compared to the traditional large-lecture courses more commonly found at four-year institutions.
No differences were observed in terms of how different admit student groups perceived social or cognitive presence; this result suggests that both FTIC and transfer students perceived similar levels of peer interaction, collaboration, and engagement with the course materials. The consistency in social presence may indicate that the PLTL learning structure effectively fostered a comparable sense of community and connectedness for FTIC and transfer students. Similarly, for cognitive presence the active-learning and problem-solving opportunities were perceived as accessible and meaningful to both groups.
Interestingly, despite these comparable perceptions, transfer students performed significantly worse than their FTIC peers in overall course grades (as discussed in the Methods section). This decoupling of perception and performance highlights that positive perceptions of the learning environment do not necessarily equate to higher academic outcomes. Transfer students may feel supported and engaged yet still face challenges with the organic chemistry content. Many transfer students are non-traditional students with additional commitments or obligations, such as full-time employment or family responsibilities, that may constrain the time and energy available to engage and practice with the complex organic chemistry content. Differences in prior chemistry preparation or institutional background could further influence transfer students’ performance in the course.
Overall, these findings indicate that the flipped, PLTL structure may provide an avenue for equitable learning experiences for transfer students; however, differences in performance between transfer and FTIC students still exist.
Limitations & implications for researchers
The results of this work highlight key limitations and implications for chemistry education researchers regarding (1) context & study design, (2) performance and course outcomes, and (3) measurement & instrument use.
Context & study design
A key contextual limitation of this study is that FTIC students at this institution have prior experience with active learning formats in the yearlong general chemistry course sequence. Though the facilitation of active learning techniques in the flipped, PLTL second semester organic chemistry course differs from that of the general chemistry courses, students’ prior exposure to these environments may have influenced their initial perceptions and responsiveness to the course design. As a result, the patterns observed in this study may differ at institutions where FTIC students have less prior exposure. Investigating similar instructional models at institutions that differ in their use of active learning could help to clarify how students’ prior experiences and institutional context shape their perceptions and engagement with these learning environments.
Additionally, we were unable to account for enrollment in the recommended co-requisite second semester organic chemistry laboratory course, which may have provided additional opportunities for peer interaction and small-group engagement, particularly for transfer students, and could therefore influence perceptions of CoI components (particularly social presence). We attempted to minimize this confounding effect by including explicit instructions in the CoI survey asking students to rate only the lecture portion of the course. Further, students were informed that the bonus points received for responding to the survey would be applied to their lecture grade, reinforcing focus on the lecture experience rather than the laboratory course.
This work also used a one-group, pre- and post-test design (Shadish et al., 2002) to examine student perceptions of social, cognitive, and teaching presence in a flipped, PLTL second semester organic chemistry course. While this design allowed us to observe changes over time and differences across student groups, we are limited in our ability to make strong causal claims about the effect the course format had on student perceptions. Nonetheless, such results are valuable for identifying potential pathways through which course structure and pedagogy may shape students’ affective and social experiences while engaging with challenging organic chemistry concepts. Future work could examine how students’ perceptions of CoI components might differ in high-quality traditional lecture-based courses (e.g., those with an engaging instructor who cares about students and their motivation, creates a structured course with clear expectations, provides consistent feedback, and offers opportunities for growth) compared to flipped, PLTL courses (or other active learning formats). Prior work by Liu et al. (2018) found that even a high-quality traditional lecture-based approach to an organic chemistry course may not prevent a decline in student motivation compared to a course using an active-learning format. Given these findings, in comparison to flipped, PLTL courses, high-quality traditional courses might support cognitive and teaching presence to some extent, as clear structure and active instructor engagement can facilitate understanding and guidance; however, this course format may be less effective at fostering social presence, which relies on structured peer interactions and collaboration. Further, conducting such a study would be challenging, as differences in instructor style and student cohorts could confound results, making it difficult to isolate the effects of instructional format alone.
Despite this, systematically investigating these comparisons could clarify how different instructional formats uniquely impact social, cognitive, and teaching presence, helping to guide course design decisions that maximize student engagement and learning outcomes.
As organic chemistry courses require students to coordinate multiple representational levels, reason through mechanisms, and use spectroscopic evidence to predict features of a molecule, students’ perceptions of community and instructional support are likely to play a critical role in how they approach and persist in the course. Further insight into these pathways would benefit from incorporating multiple forms of evidence beyond self-reported perceptions. Although survey data provide a useful snapshot of students’ subjective experiences, they may not fully capture behavioral engagement or interactional dynamics that contribute to learning, such as how students justify their reasoning about a mechanistic pathway, build coherence among ideas, or enact epistemic agency during peer-led discussions. For example, Eckhard et al. (2026) demonstrated that students’ opportunities to enact epistemic agency and construct coherent reasoning in the moment of classroom discourse depend strongly on problem design and instructor facilitation. If we had collected additional sources of data, such as participation metrics (e.g., clicker responses), observational data, or qualitative interviews with students, it would be possible to more clearly link students’ reported perceptions to their actual engagement within the course and investigate how affective and social dimensions relate to collaborative knowledge-building in organic chemistry.
Further, pairing the CoI instrument with other measures, such as questions about student satisfaction, the Academic Motivation Scale-Chemistry (AMS-Chemistry; Liu et al., 2017), or the Academic Emotions Questionnaire-Organic Chemistry (AEQ-OCHEM; Raker et al., 2019), could provide a more comprehensive picture of students’ perceptions and experiences in organic chemistry. Using multiple instruments would allow researchers to capture different facets of the learning environment and inform the evaluation of specific pedagogical strategies and their effectiveness. Triangulating such evidence could clarify the mechanisms underlying observed changes in social, cognitive, and teaching presence, offering a fuller account of how the flipped, PLTL structure of the course relates to organic chemistry students’ connections, motivation, and persistence across a variety of backgrounds (American Educational Research Association et al., 2014; Pekrun, 2020).
Performance & course outcomes
Further, only a small number of students did not pass this course (i.e., DFW), which restricted analyses comparing passing and non-passing students. Although a high pass rate reflects positively on student success, it limits our ability to draw conclusions about how students who struggle in the course perceive social, cognitive, and teaching presence. Perspectives from these students are valuable, as they may engage with the chemical content and collaborative aspects of the course in distinct ways that reveal opportunities to enhance instructional design to better support students. For example, students who find it challenging to integrate multiple representations or reason through reaction mechanisms may also experience heightened feelings of shame, anxiety, or embarrassment, particularly when the course environment emphasizes peer interaction (Cooper et al., 2018), which could affect their perception of social presence in the course. Expanding the sample to include a broader range of course outcomes would enable a more comprehensive understanding of how students across achievement levels experience the learning environment and inform strategies to promote engagement and belonging for all students.
While this study examined differences by final course grade, it does not capture how students’ perceptions of social, cognitive, and teaching presence may interact with performance on individual examinations throughout the semester. Investigating reciprocal relationships between CoI components and examination performance could be particularly informative in organic chemistry, as the course is characterized by cumulative, conceptually challenging content. As students’ understanding builds across topics, early perceptions of teaching and social presence may influence performance on in-term examinations; in turn, success or struggles on these assessments may shape students’ perceptions of the learning environment. Modeling these bidirectional relationships could help to identify which components of the learning environment most strongly relate to conceptual growth and how the CoI components interact with each other over time. Such work could offer a nuanced perspective on the dynamic interplay between affective and cognitive dimensions of learning within the traditionally high-stakes, content-intensive organic chemistry course.
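One common way to specify such bidirectional relationships is a cross-lagged panel model; the two-wave sketch below uses generic, illustrative notation (not a model fit in this study), with latent CoI-component scores and examination scores at adjacent timepoints:

```latex
\begin{aligned}
\text{CoI}_{t+1}  &= \alpha_1 + \beta_1\,\text{CoI}_{t} + \gamma_1\,\text{Exam}_{t} + \varepsilon_{1,\,t+1},\\
\text{Exam}_{t+1} &= \alpha_2 + \beta_2\,\text{Exam}_{t} + \gamma_2\,\text{CoI}_{t} + \varepsilon_{2,\,t+1}.
\end{aligned}
```

Here the autoregressive paths ($\beta_1$, $\beta_2$) capture the stability of each construct, while the cross-lagged paths ($\gamma_1$, $\gamma_2$) indicate whether earlier perceptions predict later examination performance and vice versa.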
Measurement & instrument considerations
As the chemistry education research community increasingly investigates affective aspects of learning, rigorous approaches to modeling such constructs are needed. The use of dynamic fit indices for measurement model evaluation and SMM for comparing latent means across groups represent two such opportunities. Traditional fit cutoffs (e.g., Hu and Bentler, 1999) use fixed thresholds that may not generalize across models or sample sizes; the dynamic fit index approach instead generates context-specific cutoffs through simulation, providing a more accurate and flexible assessment of model fit, though at a greater computational cost (McNeish and Wolf, 2023b). SMM allows researchers to compare latent variable means across groups while accounting for measurement error, providing a more accurate assessment of differences than observed scores (Hancock, 1997). These techniques, while more established in broader educational psychology fields, are less common in chemistry education research and offer a pathway for more accurate comparisons across groups, revealing differences that conventional methods might obscure.
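To make two of these quantities concrete, the following minimal Python sketch (illustrative only, not the analysis code used in this study) computes McDonald's omega from the standardized teaching presence loadings reported in Table 10 and applies a conventional change-in-CFI heuristic for judging measurement invariance (e.g., a decrease of no more than 0.010; Chen, 2007) to the longitudinal CFI values in Table 12. The omega value differs slightly from the 0.942 reported in Table 11 because this simple formula ignores estimator details and residual covariances.

```python
# Illustrative sketch only (not the authors' analysis code): two calculations
# that underlie the reliability and invariance reporting in CFA-based studies.

def mcdonalds_omega(loadings):
    """Omega for a single-factor model from standardized loadings."""
    lam_sum = sum(loadings)
    unique_var = sum(1 - l ** 2 for l in loadings)  # residual (unique) variances
    return lam_sum ** 2 / (lam_sum ** 2 + unique_var)

# Teaching presence (TP) standardized loadings as reported in Table 10.
tp_loadings = [0.814, 0.793, 0.777, 0.699, 0.863, 0.860, 0.807,
               0.835, 0.731, 0.777, 0.843, 0.771, 0.698]
omega_tp = mcdonalds_omega(tp_loadings)
print(f"omega(TP) = {omega_tp:.3f}")  # ~0.956; near the reported 0.942

def invariance_supported(cfi_less_constrained, cfi_more_constrained, tol=0.010):
    """Chen's (2007) heuristic: invariance holds if CFI worsens by <= tol."""
    return (cfi_less_constrained - cfi_more_constrained) <= tol

# Longitudinal CFIs from Table 12: configural 0.914, metric 0.914, scalar 0.912.
print(invariance_supported(0.914, 0.914))  # metric vs configural: True
print(invariance_supported(0.914, 0.912))  # scalar vs metric: True
```

In practice these quantities come from fitted structural equation models (e.g., via lavaan in R) rather than hand calculation, but the arithmetic above shows what the reported statistics summarize.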
The findings of this work also highlight considerations for using the CoI framework as an evaluative tool in chemistry education research. In this work, we used the chemistry-specific adaptation of the CoI instrument (Komperda, 2016); however, multiple versions of the instrument exist, including the original CoI instrument (Arbaugh et al., 2008) and versions that add a fourth component, learning presence (Shea and Bidjerano, 2010). Further, researchers may choose different ways to model these components. Prior work has demonstrated considerable variation in how the CoI instrument has been operationalized, with constructs modeled beyond the three-factor model; such models include dividing teaching presence into pre-instruction planning and in-class facilitation as well as adapting cognitive presence to include inquiry or resolution phases and social presence to include affective expression (Arbaugh, 2007; Arbaugh et al., 2008; Yang and Su, 2021; Bai et al., 2023). These variations underscore the importance of carefully considering which version of the CoI instrument, and which modeling approach, best aligns with the context and research questions of a study.
Implications for instructors
The results of this work point toward four key implications for organic chemistry instructors regarding (1) sustaining student engagement, (2) recognizing variation in students’ experiences of learning organic chemistry, (3) supporting transfer students’ sense of belonging and community, and (4) reflective course design.
Sustaining organic chemistry student engagement
Findings of this work indicate that students’ perceptions of social, cognitive, and teaching presence increased, though weakly, across the semester, suggesting that the flipped, PLTL structure of the organic chemistry course may have helped maintain and even strengthen students’ sense of connection and engagement with the learning environment over time. These trends are particularly noteworthy given the challenging nature of the organic chemistry curriculum, where maintaining student motivation and engagement can be difficult as course content becomes more complex. Prior work has shown that in traditional, lecture-based organic chemistry courses, students’ attitudes, motivation, and confidence often decline over time, which can negatively impact persistence and achievement (Liu et al., 2018; Collini et al., 2024). In contrast, active learning approaches have been associated with more positive affective outcomes in sustaining students’ motivation throughout the course (Liu et al., 2018). The results of this study align with these findings, suggesting that course structures that support collaborative problem solving may help students to remain engaged throughout the semester. For organic chemistry instructors, this highlights the value of designing learning environments that not only promote conceptual understanding but also foster meaningful participation and collaboration, factors that appear to contribute to sustaining students’ perceptions of the learning environment across the course.
Recognizing variation in students’ experiences of learning organic chemistry
This study revealed that students’ perceptions of social and cognitive presence varied by course grade, with “A” students reporting stronger perceptions of these components than their peers. This is not altogether surprising, as students who perform at higher levels may feel more confident in their ability to engage in peer discussions and in using critical thinking skills with organic chemistry content. However, this does not reflect a deficiency of non-“A” students; rather, it highlights that students at different performance levels may experience the same classroom environment and activities differently. In the context of organic chemistry, students who have already developed fluency in working with representations or interpreting spectroscopic data may feel more comfortable contributing to collaborative discussions and articulating their reasoning to their peers. Conversely, students who are still developing these skills may be less confident in sharing ideas, which can affect how supported or connected they feel in the learning environment (Skagen et al., 2018).
Despite these differences, students across course grade levels reported similar perceptions of teaching presence, suggesting that the flipped, PLTL structure and the instructor's consistent facilitation effectively communicated organization and guidance to all students. This highlights the value of deliberate instructional design in organic chemistry. When course structure, materials, and instructor communication are well orchestrated, students may feel supported even as their individual confidence or performance varies. Instructors can continue to build on this strength by pairing structured, well-facilitated learning environments with scaffolds that specifically address areas where students’ experiences diverge. For example, breaking down complex synthetic pathways into stepwise discussions that allow students to model their reasoning could help more students participate meaningfully (social presence) and reinforce their understanding (cognitive presence). Overall, instructors should recognize that students engage with and perceive the organic chemistry classroom differently depending on their course grade; we encourage instructors to reflect on how they structure both the cognitive and social dimensions of learning organic chemistry.
Supporting transfer students’ sense of belonging and community in organic chemistry
Findings from this study indicate that transfer students perceive the flipped, PLTL organic chemistry course comparably to their FTIC peers in terms of all CoI components. This suggests that this course structure provided a learning environment in which transfer students felt a sense of belonging with their peers, felt supported by their instructor, and felt able to exercise critical thinking skills. Such perceptions are noteworthy given that transfer students often face challenges when transitioning into larger institutions (e.g., adjusting to larger class sizes). Within chemistry (and other STEM fields) specifically, transfer students also have to learn how to navigate the “hidden curriculum,” including the importance of participating in research and receiving mentoring by near-peers or faculty (Reeves et al., 2023). For many transfer students, the organic chemistry sequence is their first chemistry experience at a new institution, and thus it plays an important role in shaping transfer students’ sense of belonging within the discipline.
For organic chemistry instructors, these findings emphasize the value of intentionally designed course structures that help transfer students acclimate both academically and socially. Transfer students may enter with fewer established peer networks and less familiarity with institutional or departmental norms; incorporating structured group work, peer-led sessions, and clear communication of available resources can help students make this transition. Practices that emphasize teaching presence (e.g., clear organization, supportive feedback, flexible office hours) and social presence (e.g., structured and intentional peer collaboration) can further ease this transition. Such approaches are particularly valuable in organic chemistry, as the content is conceptually demanding and often carries negative perceptions; building meaningful connections with peers (or near-peers) and their instructor can help students manage these challenges (Jardine and Friedman, 2017; Reeves et al., 2023). Creating these types of supportive, connected learning environments not only helps transfer students develop a sense of belonging, but also fosters a community that benefits all students in organic chemistry.
Reflective course design in organic chemistry
While the findings of our work indicate that the flipped, PLTL organic chemistry course structure supports transfer students in perceiving the learning environment comparably to their FTIC peers, it is crucial to recognize that these outcomes may not be universally experienced across all student groups. As Bodner, among others, has reflected, it is not important to know how many unique student experiences exist (whether in terms of reasoning or affect); rather, we need simply recognize that a wide range of these experiences is present in our classrooms (Bodner, 1986; Frost et al., 2024a). Students from different backgrounds, such as first-generation college students, may encounter unique challenges that influence their perceptions and experiences in the classroom. Thus, it is important to recognize that there is no “one-size-fits-all” course structure that is inherently ideal; instructors must consider how and for whom interventions and classroom structures work (Schweingruber et al., 2012; Eddy and Hogan, 2014; Cooper and Brownell, 2016). Effective classroom design requires awareness of the specific students in a course, the goals of both the students and instructor, and reflection on which strategies best support learning in that context (Eddy and Hogan, 2014). Instructors should be mindful of the many ways that students perceive the learning environment and reflective about which approaches best foster engagement, motivation, and mastery within their specific course.
For context, the flipped, PLTL second-semester organic chemistry course in this study was designed and refined over several years by the instructor, partly in response to challenges presented by COVID-19 and partly out of a sustained interest in active learning pedagogies. Decisions about course structure, content coverage, and the skills emphasized (e.g., teamwork and collaboration) were shaped by what was feasible given available institutional resources, what had worked for students in previous semesters, and the flexibility afforded in a second-semester organic chemistry course (compared to the first-semester course). This iterative, reflective process highlights the importance of grounding instructional design in both context and experience, and of continuously assessing how course structures and pedagogical choices shape students’ perceptions of learning organic chemistry.
Conclusions
This study examined students’ perceptions of CoI components (i.e., social, cognitive, and teaching presence) in the second semester of a yearlong postsecondary organic chemistry course sequence using a flipped, PLTL pedagogical strategy, revealing nuanced differences over time and across groups of students. Results indicate that students’ perceptions of the learning environment increased over the semester for all CoI components, suggesting that the structured peer interactions and instructor guidance of this course design can help sustain engagement in conceptually demanding organic chemistry courses. While students with higher course letter grades generally reported stronger perceptions of social and cognitive presence, teaching presence was perceived consistently regardless of course letter grade, highlighting that students in this study experienced instructor support similarly. Finally, transfer students reported similar perceptions of all CoI components to their FTIC peers, suggesting that this course was effective at creating an equitable learning environment for these students in the context of a conceptually demanding organic chemistry course. For researchers, this work demonstrates how the CoI framework can be used to examine student perceptions of the learning environment and highlights the potential to investigate relationships between CoI components and other variables (e.g., motivation, satisfaction, examination performance) to gain a more holistic understanding of the learning environment and effectiveness of pedagogical strategies in organic chemistry. For instructors, these results highlight the value of deliberate instructional design within the context of organic chemistry and of being mindful of the many backgrounds and experiences students bring with them into organic chemistry courses.
Author contributions
J. R. R., C. J. C., and R. K. conceptualized the project. J. R. R., R. K., and K. B.-R. collected the data. C. J. C. ran all data analyses. C. J. C., J. R. R., and R. K. discussed the interpretation of the results and implications. C. J. C. authored the paper. All authors read, edited, and approved the final manuscript.
Conflicts of interest
There are no conflicts to declare.
Data availability
The data supporting this study (i.e., CoI survey responses, course grades, and admissions information) are not publicly available. Ethics approval for this study did not permit public data sharing, and participants did not consent to their data being shared; releasing the data could compromise participant privacy.
Appendix
Appendix 1 Course grade descriptive statistics, three-factor CFA item loadings, single-factor CFA fit statistics, measurement invariance fit statistics, dynamic fit indices, and item statistics by CoI component
Tables 9–18.
Table 9 Descriptive statistics for final course grade
| Group   | N    | Average (%) | SD    |
|---------|------|-------------|-------|
| FTIC    | 842  | 84.35       | 10.59 |
| TS      | 210  | 77.88       | 10.40 |
| Overall | 1052 | 83.06       | 10.86 |
Table 10 Three-factor CFA item loadings from the combined Timepoint 1 and Timepoint 2 data
| TP item | Loading | SP item | Loading | CP item | Loading |
|---------|---------|---------|---------|---------|---------|
| TP1  | 0.814 | SP1 | 0.736 | CP1  | 0.706 |
| TP2  | 0.793 | SP2 | 0.657 | CP2  | 0.735 |
| TP3  | 0.777 | SP3 | 0.729 | CP3  | 0.739 |
| TP4  | 0.699 | SP4 | 0.801 | CP4  | 0.563 |
| TP5  | 0.863 | SP5 | 0.821 | CP5  | 0.689 |
| TP6  | 0.860 | SP6 | 0.787 | CP6  | 0.682 |
| TP7  | 0.807 | SP7 | 0.655 | CP7  | 0.736 |
| TP8  | 0.835 | SP8 | 0.763 | CP8  | 0.753 |
| TP9  | 0.731 | SP9 | 0.834 | CP9  | 0.798 |
| TP10 | 0.777 |     |       | CP10 | 0.792 |
| TP11 | 0.843 |     |       | CP11 | 0.751 |
| TP12 | 0.771 |     |       | CP12 | 0.721 |
| TP13 | 0.698 |     |       | CP13 | 0.694 |
|      |       |     |       | CP14 | 0.610 |
Table 11 Single-factor CFA fit statistics and internal consistency for each CoI component from the combined Timepoint 1 and Timepoint 2 data
| Model | χ²ᵃ    | df | CFI   | RMSEA | α     | ω     |
|-------|--------|----|-------|-------|-------|-------|
| TP    | 50 726 | 78 | 0.936 | 0.102 | 0.937 | 0.942 |
| SP    | 22 753 | 36 | 0.962 | 0.087 | 0.893 | 0.899 |
| CP    | 1275   | 77 | 0.937 | 0.080 | 0.914 | 0.920 |

ᵃ All p < 0.001.
Table 12 Longitudinal measurement invariance testing for the CoI instrument
| Model       | χ²   | df   | CFI   | TLI   | RMSEA | Δχ² | Δdf | ΔCFI  | ΔTLI  | ΔRMSEA |
|-------------|------|------|-------|-------|-------|-----|-----|-------|-------|--------|
| Timepoint 1 | 3083 | 591  | 0.890 | 0.882 | 0.069 | —   | —   | —     | —     | —      |
| Timepoint 2 | 3413 | 591  | 0.885 | 0.877 | 0.074 | —   | —   | —     | —     | —      |
| Configural  | 2549 | 1182 | 0.914 | 0.908 | 0.054 | —   | —   | —     | —     | —      |
| Metric      | 2581 | 1215 | 0.914 | 0.911 | 0.053 | 32* | 33  | 0.000 | 0.003 | 0.001  |
| Scalar      | 2653 | 1248 | 0.912 | 0.911 | 0.053 | 72  | 33  | 0.002 | 0.000 | 0.000  |

*p > 0.05.
Table 13 Group measurement invariance testing by course letter grade
| Model          | χ²   | df   | CFI   | TLI   | RMSEA | Δχ²  | Δdf | ΔCFI  | ΔTLI  | ΔRMSEA |
|----------------|------|------|-------|-------|-------|------|-----|-------|-------|--------|
| A, Timepoint 1 | 1854 | 591  | 0.859 | 0.850 | 0.081 | —    | —   | —     | —     | —      |
| A, Timepoint 2 | 1963 | 591  | 0.861 | 0.851 | 0.084 | —    | —   | —     | —     | —      |
| B, Timepoint 1 | 1330 | 591  | 0.865 | 0.856 | 0.075 | —    | —   | —     | —     | —      |
| B, Timepoint 2 | 1471 | 591  | 0.867 | 0.858 | 0.080 | —    | —   | —     | —     | —      |
| C, Timepoint 1 | 929  | 591  | 0.756 | 0.740 | 0.123 | —    | —   | —     | —     | —      |
| C, Timepoint 2 | 957  | 591  | 0.715 | 0.696 | 0.143 | —    | —   | —     | —     | —      |
| Configural     | 4437 | 3546 | 0.909 | 0.903 | 0.055 | —    | —   | —     | —     | —      |
| Metric         | 4603 | 3711 | 0.908 | 0.906 | 0.054 | 166* | 165 | 0.001 | 0.003 | 0.001  |
| Scalar         | 4852 | 3876 | 0.901 | 0.904 | 0.055 | 249  | 165 | 0.007 | 0.002 | 0.001  |

*p > 0.05.
Table 14 Group measurement invariance testing by admit type
| Model             | χ²   | df   | CFI   | TLI   | RMSEA | Δχ²  | Δdf | ΔCFI  | ΔTLI  | ΔRMSEA |
|-------------------|------|------|-------|-------|-------|------|-----|-------|-------|--------|
| FTIC, Timepoint 1 | 2659 | 591  | 0.886 | 0.879 | 0.070 | —    | —   | —     | —     | —      |
| FTIC, Timepoint 2 | 2896 | 591  | 0.881 | 0.873 | 0.076 | —    | —   | —     | —     | —      |
| TS, Timepoint 1   | 988  | 591  | 0.813 | 0.801 | 0.094 | —    | —   | —     | —     | —      |
| TS, Timepoint 2   | 1121 | 591  | 0.785 | 0.771 | 0.115 | —    | —   | —     | —     | —      |
| Configural        | 3295 | 2364 | 0.912 | 0.907 | 0.054 | —    | —   | —     | —     | —      |
| Metric            | 3353 | 2463 | 0.913 | 0.911 | 0.053 | 58*  | 99  | 0.001 | 0.004 | 0.001  |
| Scalar            | 3486 | 2562 | 0.911 | 0.912 | 0.053 | 133* | 99  | 0.002 | 0.001 | 0.000  |

*p > 0.05.
Table 15 Dynamic fit indices by group
| Model             | χ²*  | df  | CFI (empirical) | RMSEA (empirical) | CFI (cutoff)ᵃ | RMSEA (cutoff)ᵃ |
|-------------------|------|-----|-----------------|-------------------|---------------|-----------------|
| Timepoint 1       | 2560 | 591 | 0.937           | 0.063             | 0.988         | 0.032           |
| Timepoint 2       | 2938 | 591 | 0.942           | 0.067             | 0.987         | 0.035           |
| A, Timepoint 1    | 1621 | 591 | 0.933           | 0.066             | 0.992         | 0.028           |
| A, Timepoint 2    | 1710 | 591 | 0.936           | 0.069             | 0.992         | 0.029           |
| B, Timepoint 1    | 1216 | 591 | 0.958           | 0.055             | 0.993         | 0.025           |
| B, Timepoint 2    | 1413 | 591 | 0.956           | 0.060             | 0.990         | 0.031           |
| C, Timepoint 1    | 821  | 591 | 0.948           | 0.063             | 0.993         | 0.024           |
| C, Timepoint 2    | 907  | 591 | 0.961           | 0.066             | 0.994         | 0.023           |
| FTIC, Timepoint 1 | 2247 | 591 | 0.937           | 0.064             | 0.991         | 0.029           |
| FTIC, Timepoint 2 | 2518 | 591 | 0.941           | 0.068             | 0.990         | 0.032           |
| TS, Timepoint 1   | 900  | 591 | 0.956           | 0.055             | 0.992         | 0.026           |
| TS, Timepoint 2   | 1082 | 591 | 0.954           | 0.064             | 0.991         | 0.034           |

*All p < 0.001. ᵃ Simulated cutoff recommendations; Level 2 cutoffs reflect fit indices expected under moderate simulated misspecification.
Table 16 Item statistics for social presence
| Item | Mean | Median | SD   | SK    | Kurt |
|------|------|--------|------|-------|------|
| SP1  | 3.85 | 4      | 1.08 | −0.85 | 0.14 |
| SP2  | 3.89 | 4      | 0.91 | −0.71 | 0.43 |
| SP3  | 4.34 | 4      | 0.79 | −1.36 | 2.35 |
| SP4  | 4.08 | 4      | 0.95 | −1.12 | 1.13 |
| SP5  | 3.92 | 4      | 0.96 | −0.92 | 0.71 |
| SP6  | 4.12 | 4      | 0.93 | −1.15 | 1.21 |
| SP7  | 3.83 | 4      | 0.94 | −0.80 | 0.61 |
| SP8  | 3.94 | 4      | 0.89 | −0.68 | 0.35 |
| SP9  | 3.84 | 4      | 1.07 | −0.82 | 0.07 |
Table 17 Item statistics for cognitive presence
| Item | Mean | Median | SD   | SK    | Kurt  |
|------|------|--------|------|-------|-------|
| CP1  | 3.31 | 3      | 1.11 | −0.23 | −0.66 |
| CP2  | 3.35 | 3      | 1.08 | −0.31 | −0.53 |
| CP3  | 3.39 | 4      | 1.11 | −0.32 | −0.65 |
| CP4  | 3.96 | 4      | 0.93 | −0.83 | 0.42  |
| CP5  | 3.81 | 4      | 0.96 | −0.70 | 0.16  |
| CP6  | 3.92 | 4      | 0.86 | −0.80 | 0.77  |
| CP7  | 3.80 | 4      | 1.05 | −0.80 | 0.12  |
| CP8  | 3.87 | 4      | 0.87 | −0.82 | 0.92  |
| CP9  | 3.92 | 4      | 0.92 | −0.94 | 0.87  |
| CP10 | 3.94 | 4      | 0.90 | −0.85 | 0.74  |
| CP11 | 3.81 | 4      | 0.98 | −0.79 | 0.30  |
| CP12 | 3.74 | 4      | 0.96 | −0.70 | 0.28  |
| CP13 | 3.73 | 4      | 0.93 | −0.71 | 0.39  |
| CP14 | 3.37 | 4      | 1.16 | −0.40 | −0.65 |
Table 18 Item statistics for teaching presence
| Item | Mean | Median | SD   | SK    | Kurt  |
|------|------|--------|------|-------|-------|
| TP1  | 4.05 | 4      | 1.00 | −1.10 | 0.87  |
| TP2  | 4.22 | 4      | 0.86 | −1.22 | 1.60  |
| TP3  | 4.22 | 4      | 0.83 | −1.22 | 1.80  |
| TP4  | 4.34 | 5      | 0.87 | −1.52 | 2.42  |
| TP5  | 3.88 | 4      | 1.02 | −0.81 | 0.14  |
| TP6  | 3.85 | 4      | 1.08 | −0.86 | 0.11  |
| TP7  | 3.96 | 4      | 1.02 | −0.97 | 0.48  |
| TP8  | 3.79 | 4      | 1.07 | −0.78 | 0.01  |
| TP9  | 4.08 | 4      | 0.88 | −0.96 | 0.95  |
| TP10 | 4.12 | 4      | 0.89 | −1.08 | 1.28  |
| TP11 | 3.75 | 4      | 1.04 | −0.70 | −0.05 |
| TP12 | 3.58 | 4      | 1.08 | −0.47 | −0.45 |
| TP13 | 3.93 | 4      | 1.00 | −0.84 | 0.36  |
Acknowledgements
We would like to thank the Organic Chemistry 2 (CHM 2211) students at the University of South Florida who completed the CoI survey. We would also like to thank Dr Stephanie J. H. Frost for thoughtful conversations about the work.
References
- American Educational Research Association, American Psychological Association and National Council on Measurement in Education, (2014), Standards for Educational and Psychological Testing, Washington, DC, US: American Educational Research Association, pp. 11–31.
- Anderson T. L. and Bodner G. M., (2008), What can we do about ‘Parker’? A case study of a good student who didn't ‘get’ organic chemistry, Chem. Educ. Res. Pract., 9, 93–101 10.1039/b806223b.
- Anderson T., Rourke L., Garrison R. and Archer W., (2019), Assessing teaching presence in a computer conferencing context, Online Learn. J., 5(2), 1–17 DOI:10.24059/olj.v5i2.187.
- Ang J. W. J. and Ng Y. N., (2022), Effect of Research-Based Blended Learning with Scrum Methodology on Learners’ Perception and Motivation in a Laboratory Course, J. Chem. Educ., 99, 4102–4108 DOI:10.1021/acs.jchemed.2c00002.
- Ang J. W. J. and Ng Y. N., (2024), Students’ Perceptions of Asynchronous Lectures via the Community of Inquiry Framework and Its Relationship with Learning Performance, J. Chem. Educ., 101, 661–668 DOI:10.1021/acs.jchemed.3c00697.
- Arbaugh J. B., (2007), An empirical verification of the community of inquiry framework, JALN, 11, 73–85.
- Arbaugh J. B., (2013), Does academic discipline moderate CoI-course outcomes relationships in online MBA courses?, Internet High. Educ., 17, 16–28 DOI:10.1016/j.iheduc.2012.10.002.
- Arbaugh J. B., Cleveland-Innes M., Diaz S. R., Garrison D. R., Ice P., Richardson J. C. and Swan K. P., (2008), Developing a community of inquiry instrument: testing a measure of the community of inquiry framework using a multi-institutional sample, Internet High. Educ., 11, 133–136.
- Asmussen G., Rodemer M. and Bernholt S., (2023), Blooming student difficulties in dealing with organic reaction mechanisms – an attempt at systemization, Chem. Educ. Res. Pract., 24, 1035–1054 10.1039/D2RP00204C.
- Ausubel D. P., (1963), The psychology of meaningful verbal learning, Oxford, England: Grune & Stratton.
- Ausubel D. P., (1968), Educational psychology: a cognitive view, Holt, Rinehart and Winston: New York.
- Bai X., Gu X. and Guo R., (2023), More factors, better understanding: model verification and construct validity study on the community of inquiry in MOOC, Educ. Inf. Technol., 28, 10483–10506 DOI:10.1007/s10639-023-11604-z.
- Bauer C. F., (2005), Beyond “Student Attitudes”: Chemistry Self-Concept Inventory for Assessment of the Affective Component of Student Learning, J. Chem. Educ., 82, 1864 DOI:10.1021/ed082p1864.
- Bauer C. F., (2008), Attitude toward Chemistry: A Semantic Differential Instrument for Assessing Curriculum Impacts, J. Chem. Educ., 85, 1440 DOI:10.1021/ed085p1440.
- Benbunan-Fich R. and Hiltz S. R., (2003), Mediators of the effectiveness of online courses, IEEE Trans. Prof. Commun., 46, 298–312.
- Bodner G. M., (1986), Constructivism: a theory of knowledge, J. Chem. Educ., 63, 873 DOI:10.1021/ed063p873.
- Bowen R. S., Flaherty A. A. and Cooper M. M., (2022), Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses, Chem. Educ. Res. Pract., 23, 560–581 10.1039/D2RP00010E.
- Bradley A. Z., Ulrich S. M., Jones, Jr. M. and Jones S. M., (2002), Teaching the Sophomore Organic Course without a Lecture. Are You Crazy?, J. Chem. Educ., 79, 514 DOI:10.1021/ed079p514.
- Bretz S. L., (2001), Novak's Theory of Education: Human Constructivism and Meaningful Learning, J. Chem. Educ., 78, 1107 DOI:10.1021/ed078p1107.
- Brosseau-Liard P. E. and Savalei V., (2014), Adjusting Incremental Fit Indices for Nonnormality, Multivar. Behav. Res., 49, 460–470 DOI:10.1080/00273171.2014.933697.
- Burchett W. W., Ellis A. R., Harrar S. W. and Bathke A. C., (2017), Nonparametric Inference for Multivariate Data: The R Package npmv, J. Stat. Softw., 76, 1–18 DOI:10.18637/jss.v076.i04.
- Castellanos-Reyes D., (2020), 20 Years of the Community of Inquiry Framework, TechTrends, 64, 557–560 DOI:10.1007/s11528-020-00491-7.
- Chan J. Y. K. and Bauer C. F., (2015), Effect of peer-led team learning (PLTL) on student achievement, attitude, and self-concept in college general chemistry in randomized and quasi experimental designs, J. Res. Sci. Teach., 52, 319–346 DOI:10.1002/tea.21197.
- Chen F. F., (2007), Sensitivity of goodness of fit indexes to lack of measurement invariance, Struct. Equ. Modeling., 14, 464–504.
- Christiansen M. A., (2014), Inverted Teaching: Applying a New Pedagogy to a University Organic Chemistry Class, J. Chem. Educ., 91, 1845–1850 DOI:10.1021/ed400530z.
- Chung J., (1991), Collaborative Learning Strategies: The Design of Instructional Environments for the Emerging New School, Educ. Technol., 31, 15–22.
- Cohen J., (1988), Statistical power analysis for the behavioral sciences, New York: L. Erlbaum Associates DOI:10.4324/9780203771587.
- Cole M., John-Steiner V., Scribner S. and Souberman E., (1978), Mind in Society: The development of higher psychological processes, Cambridge, Mass.: Harvard University Press.
- Collini M. A., Miguel K., Weber R. and Atkinson M. B., (2024), Investigating changes in students’ attitudes towards organic chemistry: a longitudinal study, Chem. Educ. Res. Pract., 25, 613–624 10.1039/D3RP00228D.
- Cooper K. M. and Brownell S. E., (2016), Coming Out in Class: Challenges and Benefits of Active Learning in a Biology Classroom for LGBTQIA Students, CBE Life Sci. Educ., 15(3), ar37 DOI:10.1187/cbe.16-01-0074.
- Cooper K. M., Downing V. R. and Brownell S. E., (2018), The influence of active learning practices on student anxiety in large-enrollment college science classrooms, Int. J. STEM Educ., 5, 23 DOI:10.1186/s40594-018-0123-6.
- Cooper M. M., Kouyoumdjian H. and Underwood S. M., (2016), Investigating Students’ Reasoning about Acid–Base Reactions, J. Chem. Educ., 93, 1703–1712 DOI:10.1021/acs.jchemed.6b00417.
- Cortina J. M., (1993), What is coefficient alpha? An examination of theory and applications, J. Appl. Psychol., 78, 98–104 DOI:10.1037/0021-9010.78.1.98.
- Crimmins M. T. and Midkiff B., (2017), High Structure Active Learning Pedagogy for the Teaching of Organic Chemistry: Assessing the Impact on Academic Outcomes, J. Chem. Educ., 94, 429–438 DOI:10.1021/acs.jchemed.6b00663.
- Crowder C. J. and Raker J. R., (2024), Patterns in Explanations of Organic Chemistry Reaction Mechanisms: A Text Analysis by Level of Explanation Sophistication, J. Chem. Educ., 101, 5203–5220 DOI:10.1021/acs.jchemed.4c01042.
- Crowder C. J., Ward L. W., Popova M., Komperda R., Rotich F. and Raker J. R., (2025), Testing a Reciprocal Causation Model between the Organic Chemistry Representational Competence Assessment and Examination Performance in Postsecondary Organic Chemistry, J. Chem. Educ., 102(7), 2609–2622 DOI:10.1021/acs.jchemed.5c00169.
- Crowder C. J., Yik B. J., Frost S. J. H., Cruz-Ramírez de Arellano D. and Raker J. R., (2024), Impact of Prompt Cueing on Level of Explanation Sophistication for Organic Reaction Mechanisms, J. Chem. Educ., 101, 398–410 DOI:10.1021/acs.jchemed.3c00710.
- Cureton E. E., (1956), Rank-biserial correlation, Psychometrika, 21, 287–290.
- DeVries R., (2000), Vygotsky, Piaget, and Education: a reciprocal assimilation of theories and educational practices, New Ideas Psychol., 18, 187–213 DOI:10.1016/S0732-118X(00)00008-8.
- Dewey J., (1904), The Educational Situation, University of Chicago Press.
- Dewey J., (1910), How we think, Lexington, MA: D.C. Heath and Company DOI:10.1037/10903-000.
- Dewey J., (1938), Logic: the theory of inquiry, Oxford, England: Holt.
- Dood A. J. and Watts F. M., (2022), Mechanistic Reasoning in Organic Chemistry: A Scoping Review of How Students Describe and Explain Mechanisms in the Chemistry Education Research Literature, J. Chem. Educ., 99, 2864–2876 DOI:10.1021/acs.jchemed.2c00313.
- Dood A. J. and Watts F. M., (2023), Students’ Strategies, Struggles, and Successes with Mechanism Problem Solving in Organic Chemistry: A Scoping Review of the Research Literature, J. Chem. Educ., 100, 53–68 DOI:10.1021/acs.jchemed.2c00572.
- Dunn O. J., (1964), Multiple Comparisons Using Rank Sums, Technometrics, 6, 241–252.
- Eckhard J., Scheck R. A. and Caspari-Gnann I., (2026), Enhancing students’ agency and coherence in organic chemistry through transformed problem design, Chem. Educ. Res. Pract. 10.1039/D5RP00268K.
- Eddy S. L. and Hogan K. A., (2014), Getting Under the Hood: How and for Whom Does Increasing Course Structure Work?, CBE Life Sci. Educ., 13, 453–468 DOI:10.1187/cbe.14-03-0050.
- Eichler J. F., (2022), Future of the Flipped Classroom in Chemistry Education: Recognizing the Value of Independent Preclass Learning and Promoting Deeper Understanding of Chemical Ways of Thinking During In-Person Instruction, J. Chem. Educ., 99, 1503–1508 DOI:10.1021/acs.jchemed.1c01115.
- Elliott D. C. and Lakin J. M., (2021), Unparallel Pathways: Exploring How Divergent Academic Norms Contribute to the Transfer Shock of STEM Students, Comm. Coll. J. Res. Pract., 45, 802–815 DOI:10.1080/10668926.2020.1806145.
- Fautch J. M., (2015), The flipped classroom for teaching organic chemistry in small classes: is it effective?, Chem. Educ. Res. Pract., 16, 179–186 10.1039/C4RP00230J.
- Flaherty A. A., (2020a), Investigating perceptions of the structure and development of scientific knowledge in the context of a transformed organic chemistry lecture course, Chem. Educ. Res. Pract., 21, 570–581 10.1039/C9RP00201D.
- Flaherty A. A., (2020b), A review of affective chemistry education research and its implications for future research, Chem. Educ. Res. Pract., 21, 698–713 10.1039/C9RP00200F.
- Flener-Lovitt C., Bailey K. and Han R., (2020), Using Structured Teams to Develop Social Presence in Asynchronous Chemistry Courses, J. Chem. Educ., 97, 2519–2525 DOI:10.1021/acs.jchemed.0c00765.
- Flynn A. B., (2015), Structure and evaluation of flipped chemistry courses: organic & spectroscopy, large and small, first to third year, English and French, Chem. Educ. Res. Pract., 16, 198–211 10.1039/C4RP00224E.
- Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111, 8410–8415 DOI:10.1073/pnas.1319030111.
- Freire P. and Ramos M. B., (1970), Pedagogy of the Oppressed, Seabury Press.
- Frost S. J. H., Pratt J. M., Cruz-Ramírez de Arellano D., Bliss-Roche K. and Raker J. R., (2024a), Feelings of Shame in a First Semester Organic Chemistry Course: Associations between Shame and Examination Performance for Multiple Learner Groups, J. Chem. Educ., 101, 4136–4148 DOI:10.1021/acs.jchemed.4c00754.
- Frost S. J. H., Rocabado G. A., Pratt J. M., de Arellano D. C.-R., Fields K. B. and Raker J. R., (2024b), Motivation Differences in First-Semester Organic Chemistry: A Comparison between First-Time-in-College Students and Transfer Students, J. Chem. Educ., 101, 354–363 DOI:10.1021/acs.jchemed.3c00579.
- Frost S. J. H., Yik B. J., Dood A. J., Cruz-Ramírez de Arellano D., Fields K. B. and Raker J. R., (2023), Evaluating electrophile and nucleophile understanding: a large-scale study of learners’ explanations of reaction mechanisms, Chem. Educ. Res. Pract., 24, 706–722 10.1039/d2rp00327a.
- Galloway K. R., Malakpa Z. and Bretz S. L., (2016), Investigating Affective Experiences in the Undergraduate Chemistry Laboratory: Students’ Perceptions of Control and Responsibility, J. Chem. Educ., 93, 227–238 DOI:10.1021/acs.jchemed.5b00737.
- Garrison D. R., Anderson T. and Archer W., (1999), Critical inquiry in a text-based environment: Computer conferencing in higher education, Internet High. Educ., 2, 87–105.
- Garrison D. R., Anderson T. and Archer W., (2001), Critical thinking, cognitive presence, and computer conferencing in distance education, AJDE, 15, 7–23 DOI:10.1080/08923640109527071.
- Garrison D. R., Anderson T. and Archer W., (2010), The first decade of the community of inquiry framework: A retrospective, Internet High. Educ., 13, 5–9 DOI:10.1016/j.iheduc.2009.10.003.
- Garrison D. R. and Kanuka H., (2004), Blended learning: Uncovering its transformative potential in higher education, Internet High. Educ., 7, 95–105.
- Gibbons R. E. and Raker J. R., (2019), Self-beliefs in organic chemistry: Evaluation of a reciprocal causation, cross-lagged model, J. Res. Sci. Teach., 56, 598–618 DOI:10.1002/tea.21515.
- Gosser D., Roth V., Gafney L., Kampmeier J., Strozak V., Varma-Nelson P., Radel S. and Weiner M., (1996), Workshop Chemistry: Overcoming the Barriers to Student Success, Chem. Educ., 1, 1–17 DOI:10.1007/s00897960002a.
- Graulich N., (2015), The tip of the iceberg in organic chemistry classes: how do students deal with the invisible?, Chem. Educ. Res. Pract., 16, 9–21 10.1039/C4RP00165F.
- Graulich N., (2025), The tip of the iceberg in organic chemistry – revisited, Chem. Educ. Res. Pract., 26, 359–376 10.1039/D4RP00345D.
- Gray M. J., Gunarathne S. A., Nguyen N. N. and Shortlidge E. E., (2022), Thriving or Simply Surviving? A Qualitative Exploration of STEM Community College Students’ Transition to a Four-Year University, CBE Life Sci. Educ., 21(3), ar57 DOI:10.1187/cbe.21-09-0261.
- Grove N. P., Cooper M. M. and Cox E. L., (2012), Does Mechanistic Thinking Improve Student Success in Organic Chemistry?, J. Chem. Educ., 89, 850–853 DOI:10.1021/ed200394d.
- Grove N. P., Hershberger J. W. and Bretz S. L., (2008), Impact of a spiral organic curriculum on student attrition and learning, Chem. Educ. Res. Pract., 9, 157–162 10.1039/B806232N.
- Gunawardena C. N. and Zittle F. J., (1997), Social presence as a predictor of satisfaction within a computer-mediated conferencing environment, AJDE, 11, 8–26.
- Hancock G. R., (1997), Structural Equation Modeling Methods of Hypothesis Testing of Latent Variable Means, Meas. Eval. Couns. Dev., 30, 91–105 DOI:10.1080/07481756.1997.12068926.
- Hernán M. A. and Robins J. M., (2006), Instruments for Causal Inference: An Epidemiologist's Dream?, Epidemiol., 17, 360–372 DOI:10.1097/01.ede.0000222409.00878.37.
- Hills J. R., (1965), Transfer Shock, J. Exp. Educ., 33, 201–215 DOI:10.1080/00220973.1965.11010875.
- Hockings S. C., DeAngelis K. J. and Frey R. F., (2008), Peer-Led Team Learning in General Chemistry: Implementation and Evaluation, J. Chem. Educ., 85, 990 DOI:10.1021/ed085p990.
- Holm S., (1979), A Simple Sequentially Rejective Multiple Test Procedure, Scand. J. Stat., 6, 65–70.
- Horowitz G., Rabin L. A. and Brodale D. L., (2013), Improving student performance in organic chemistry: Help seeking behaviors and prior chemistry aptitude, J. Scholarsh. Teach. Learn., 13, 120–133.
- Hu L.-T. and Bentler P. M., (1999), Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives, Struct. Equ. Model., 6, 1–55 DOI:10.1080/10705519909540118.
- Immordino-Yang M. H. and Damasio A., (2007), We Feel, Therefore We Learn: The Relevance of Affective and Social Neuroscience to Education, Mind Brain Educ., 1, 3–10 DOI:10.1111/j.1751-228X.2007.00004.x.
- Irby S. M., Pelaez N. J. and Anderson T. R., (2020), Student Perceptions of Their Gains in Course-Based Undergraduate Research Abilities Identified as the Anticipated Learning Outcomes for a Biochemistry CURE, J. Chem. Educ., 97, 56–65 DOI:10.1021/acs.jchemed.9b00440.
- Jardine H. E. and Friedman L. A., (2017), Using Undergraduate Facilitators for Active Learning in Organic Chemistry: A Preparation Course and Outcomes of the Experience, J. Chem. Educ., 94, 703–709 DOI:10.1021/acs.jchemed.6b00636.
- Jenkins D. and Fink J., (2016), Tracking Transfer: New Measures of Institutional and State Effectiveness in Helping Community College Students Attain Bachelor's Degrees, Teachers College, Columbia University, Community College Research Center.
- Jorgensen T. D., Pornprasertmanit S., Schoemann A. and Rosseel Y., (2022), semTools: Useful tools for structural equation modeling.
- Kassambara A., (2025), rstatix: Pipe-friendly Framework for Basic Statistical Tests (R package version 0.7.3) [Software] DOI:10.32614/CRAN.package.rstatix.
- Klein D. R., (2021), Organic Chemistry, John Wiley & Sons, Inc.
- Komperda R., (2016), Deconstructing Constructivism: Modeling Causal Relationships Among Constructivist Learning Environment Factors and Student Outcomes in Introductory Chemistry, PhD, The Catholic University of America.
- Komperda R., Pentecost T. C. and Barbera J., (2018), Moving beyond Alpha: A Primer on Alternative Sources of Single-Administration Reliability Evidence for Quantitative Chemistry Education Research, J. Chem. Educ., 95, 1477–1491 DOI:10.1021/acs.jchemed.8b00220.
- Kozma R. and Russell J., (2005), in Visualization in Science Education, Gilbert J. K. (ed.), Dordrecht: Springer Netherlands, pp. 121–145 DOI:10.1007/1-4020-3613-2_8.
- Kraft A., Strickland A. M. and Bhattacharyya G., (2010), Reasonable reasoning: multi-variate problem-solving in organic chemistry, Chem. Educ. Res. Pract., 11, 281–292 10.1039/c0rp90003f.
- Kruskal W. H. and Wallis W. A., (1952), Use of Ranks in One-Criterion Variance Analysis, J. Am. Stat. Assoc., 47, 583–621 DOI:10.2307/2280779.
- Lasker G. A., Mellor K. E. and Simcox N. J., (2019), Green chemistry & chemical stewardship certificate program: a novel, interdisciplinary approach to green chemistry and environmental health education, Green Chem. Lett. Rev., 12, 178–186 DOI:10.1080/17518253.2019.1609601.
- Lawrie G., (2021), Chemistry education research and practice in diverse online learning environments: resilience, complexity and opportunity!, Chem. Educ. Res. Pract., 22, 7–11 10.1039/D0RP90013C.
- Lewis S. E., (2011), Retention and Reform: An Evaluation of Peer-Led Team Learning, J. Chem. Educ., 88, 703–707 DOI:10.1021/ed100689m.
- Lewis S. E. and Lewis J. E., (2005), Departing from Lectures: An Evaluation of a Peer-Led Guided Inquiry Alternative, J. Chem. Educ., 82, 135 DOI:10.1021/ed082p135.
- Lewis S. E. and Lewis J. E., (2008), Seeking effectiveness and equity in a large college chemistry course: an HLM investigation of Peer-Led Guided Inquiry, J. Res. Sci. Teach., 45, 794–811 DOI:10.1002/tea.20254.
- Liu Y., Ferrell B., Barbera J. and Lewis J. E., (2017), Development and evaluation of a chemistry-specific version of the academic motivation scale (AMS-Chemistry), Chem. Educ. Res. Pract., 18, 191–213 10.1039/C6RP00200E.
- Liu Y., Raker J. R. and Lewis J. E., (2018), Evaluating student motivation in organic chemistry courses: moving from a lecture-based to a flipped approach with peer-led team learning, Chem. Educ. Res. Pract., 19, 251–264 10.1039/C7RP00153C.
- Lyle K. S. and Robinson W. R., (2003), A Statistical Evaluation: Peer-led Team Learning in an Organic Chemistry Course, J. Chem. Educ., 80, 132 DOI:10.1021/ed080p132.
- Lynch D. J. and Trujillo H., (2011), Motivational Beliefs and Learning Strategies in Organic Chemistry, Int. J. Sci. Math. Educ., 9, 1351–1365 DOI:10.1007/s10763-010-9264-x.
- Mangiafico S., (2025), rcompanion: Functions to Support Extension Education Program Evaluation.
- Mann H. B. and Whitney D. R., (1947), On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other, Ann. Math. Stat., 18, 50–60 DOI:10.1214/aoms/1177730491.
- Marsh H. W., Hau K.-T. and Wen Z., (2004), In Search of Golden Rules: Comment on Hypothesis-Testing Approaches to Setting Cutoff Values for Fit Indexes and Dangers in Overgeneralizing Hu and Bentler's (1999) Findings, Struct. Equ. Model., 11, 320–341 DOI:10.1207/s15328007sem1103_2.
- Mazur E., (1997), Peer Instruction: A User's Manual, Prentice Hall.
- McNeish D., (2024), Dynamic fit index cutoffs for treating likert items as continuous, Psychol. Methods DOI:10.1037/met0000683.
- McNeish D. and Wolf M. G., (2023a), Dynamic fit index cutoffs for confirmatory factor analysis models, Psychol. Methods, 28, 61–88 DOI:10.1037/met0000425.
- McNeish D. and Wolf M. G., (2023b), Dynamic fit index cutoffs for one-factor models, Behav. Res. Methods, 55, 1157–1174 DOI:10.3758/s13428-022-01847-y.
- Mitchell Y. D., Ippolito J. and Lewis S. E., (2012), Evaluating Peer-Led Team Learning across the two semester General Chemistry sequence, Chem. Educ. Res. Pract., 13, 378–383 10.1039/C2RP20028G.
- Moog R. S., Creegan F. J., Hanson D. M., Spencer J. N., Straumanis A., Bunce D. M. and Wolfskill T., (2009), in Chemists’ Guide to Effective Teaching, Upper Saddle River, NJ: Prentice Hall, vol. 2, pp. 90–107.
- Mooring S. R., Mitchell C. E. and Burrows N. L., (2016), Evaluation of a Flipped, Large-Enrollment Organic Chemistry Course on Student Attitude and Achievement, J. Chem. Educ., 93, 1972–1983 DOI:10.1021/acs.jchemed.6b00367.
- Mutanyatta-Comar J. and Mooring S. R., (2019), From General to Organic Chemistry: Courses and Curricula to Enhance Student Retention, American Chemical Society, vol. 1341, ch. 11, pp. 145–157 DOI:10.1021/bk-2019-1341.ch011.
- National Center for Education Statistics, (2025), IPEDS glossary: First-time student (undergraduate), https://surveys.nces.ed.gov/ipeds/public/glossary.
- Ng B. J. M., Han J. Y., Kim Y., Togo K. A., Chew J. Y., Lam Y. and Fung F. M., (2022), Supporting Social and Learning Presence in the Revised Community of Inquiry Framework for Hybrid Learning, J. Chem. Educ., 99, 708–714 DOI:10.1021/acs.jchemed.1c00842.
- Novak J. D., (1977), A theory of education, Cornell University Press.
- Novak J. D. and Gowin D. B., (1984), Learning How to Learn, Cambridge: Cambridge University Press DOI:10.1017/CBO9781139173469.
- Ogle D. H., (2017), FSA: Fisheries Stock Analysis.
- Oh C. S., Bailenson J. N. and Welch G. F., (2018), A systematic review of social presence: definition, antecedents, and implications, Front. Robot. AI, 5, 114.
- Pekrun R., (2020), Self-Report is Indispensable to Assess Students’ Learning, FLR, 8, 185–193 DOI:10.14786/flr.v8i3.637.
- Piaget J., (1952), The Origins of Intelligence in Children, International Universities Press.
- Picciano A. G., (2019), Beyond Student Perceptions: Issues of Interaction, Presence, and Performance in an Online Course, Online Learn. J., 6(1) DOI:10.24059/olj.v6i1.1870.
- Putnick D. L. and Bornstein M. H., (2016), Measurement Invariance Conventions and Reporting: The State of the Art and Future Directions for Psychological Research, Dev. Rev., 41, 71–90 DOI:10.1016/j.dr.2016.06.004.
- Raker J. R., Dood A. J., Srinivasan S. and Murphy K. L., (2021), Pedagogies of engagement use in postsecondary chemistry education in the United States: results from a national survey, Chem. Educ. Res. Pract., 22, 30–42 10.1039/D0RP00125B.
- Raker J. R., Gibbons R. E. and Cruz-Ramírez de Arellano D., (2019), Development and evaluation of the organic chemistry-specific achievement emotions questionnaire (AEQ-OCHEM), J. Res. Sci. Teach., 56, 163–183 DOI:10.1002/tea.21474.
- Ramachandran R. and Rodriguez M. C., (2020), Student Perspectives on Remote Learning in a Large Organic Chemistry Lecture Course, J. Chem. Educ., 97, 2565–2572 DOI:10.1021/acs.jchemed.0c00572.
- R Core Team, (2025), R: A language and environment for statistical computing.
- Redstone A. E., Stefaniak J. E. and Luo T., (2018), Measuring presence: a review of research using the community of inquiry instrument, Q. Rev. Distance Educ., 19(2), 27–36.
- Reeves A. G., Bischoff A. J., Yates B., Brauer D. D. and Baranger A. M., (2023), A Pilot Graduate Student-Led Near-Peer Mentorship Program for Transfer Students Provides a Supportive Network at an R1 Institution, J. Chem. Educ., 100, 134–142 DOI:10.1021/acs.jchemed.2c00427.
- Reimer L. C., Denaro K., He W. and Link R. D., (2021), Getting Students Back on Track: Persistent Effects of Flipping Accelerated Organic Chemistry on Student Achievement, Study Strategies, and Perceptions of Instruction, J. Chem. Educ., 98, 1088–1098 DOI:10.1021/acs.jchemed.0c00092.
- Rein K. S. and Brookes D. T., (2015), Student Response to a Partial Inversion of an Organic Chemistry Course for Non-Chemistry Majors, J. Chem. Educ., 92, 797–802 DOI:10.1021/ed500537b.
- Revelle W., (2024), psych: Procedures for Psychological, Psychometric, and Personality Research.
- Reyes C. T., Thompson C. D., Lawrie G. A. and Kyne S. H., (2024), Insights into a Community of Inquiry that emerged during academics’ emergency remote university teaching of chemistry in response to concern for students, Res. Sci. Technol. Educ., 42, 1042–1068 DOI:10.1080/02635143.2023.2202387.
- Richardson J. C., Maeda Y., Lv J. and Caskurlu S., (2017), Social presence in relation to students' satisfaction and learning in the online environment: a meta-analysis, Comput. Hum. Behav., 71, 402–417.
- Richardson J. C. and Swan K., (2003), Examining Social Presence in Online Courses in Relation to Students’ Perceived Learning and Satisfaction, Online Learn. J., 7(1) DOI:10.24059/olj.v7i1.1864.
- Riedl A., Yeung F. and Burke T., (2021), Implementation of a Flipped Active-Learning Approach in a Community College General Biology Course Improves Student Performance in Subsequent Biology Courses and Increases Graduation Rate, CBE Life Sci. Educ., 20, ar30 DOI:10.1187/cbe.20-07-0156.
- Rocabado G. A., Kilpatrick N. A., Mooring S. R. and Lewis J. E., (2019), Can We Compare Attitude Scores among Diverse Populations? An Exploration of Measurement Invariance Testing to Support Valid Comparisons between Black Female Students and Their Peers in an Organic Chemistry Course, J. Chem. Educ., 96, 2371–2382 DOI:10.1021/acs.jchemed.9b00516.
- Rogers P. and Lea M., (2005), Social presence in distributed group environments: The role of social identity, Behav. Inf. Technol., 24, 151–158 DOI:10.1080/01449290410001723472.
- Rosseel Y., (2012), lavaan: An R Package for Structural Equation Modeling, J. Stat. Softw., 48, 1–36 DOI:10.18637/jss.v048.i02.
- Rossi R. D., (2015), ConfChem Conference on Flipped Classroom: Improving Student Engagement in Organic Chemistry Using the Inverted Classroom Model, J. Chem. Educ., 92, 1577–1579 DOI:10.1021/ed500899e.
- Rourke L., Anderson T., Garrison D. R. and Archer W., (2001), Methodological issues in the content analysis of computer conference transcripts, Int. J. Artif. Intell. Educ., 12, 8–22.
- Rovai A. P., (2002), Development of an instrument to measure classroom community, Internet High. Educ., 5, 197–211.
- RStudio Team, (2025), RStudio: Integrated Development Environment for R.
- Sadaf A., Wu T. and Martin F., (2021), Cognitive Presence in Online Learning: A Systematic Review of Empirical Research from 2000 to 2019, CAEO, 2, 100050 DOI:10.1016/j.caeo.2021.100050.
- Schmidt-McCormack J. A., Judge J. A., Spahr K., Yang E., Pugh R., Karlin A., Sattar A., Thompson B. C., Gere A. R. and Shultz G. V., (2019), Analysis of the role of a writing-to-learn assignment in student understanding of organic acid–base concepts, Chem. Educ. Res. Pract., 20, 383–398 10.1039/C8RP00260F.
- Schweingruber H. A., Nielsen N. R. and Singer S. R., (2012), Discipline-based education research: Understanding and improving learning in undergraduate science and engineering, National Academies Press.
- Seery M. K., (2015), Flipped learning in higher education chemistry: emerging trends and potential directions, Chem. Educ. Res. Pract., 16, 758–768 10.1039/C5RP00136F.
- Seymour E. and Hewitt N. M., (1997), Talking about leaving: Why undergraduates leave the sciences, Boulder, CO: Westview Press.
- Shadish W. R., Cook T. D. and Campbell D. T., (2002), Experimental and quasi-experimental designs for generalized causal inference, Boston, MA, US: Houghton, Mifflin and Company.
- Shea P. and Bidjerano T., (2010), Learning presence: Towards a theory of self-efficacy, self-regulation, and the development of a communities of inquiry in online and blended learning environments, Comput. Educ., 55, 1721–1731 DOI:10.1016/j.compedu.2010.07.017.
- Shen K. N., Yu A. Y. and Khalifa M., (2010), Knowledge contribution in virtual communities: accounting for multiple dimensions of social presence through social identity, Behav. Inf. Technol., 29, 337–348 DOI:10.1080/01449290903156622.
- Shields S. P., Hogrebe M. C., Spees W. M., Handlin L. B., Noelken G. P., Riley J. M. and Frey R. F., (2012), A Transition Program for Underprepared Students in General Chemistry: Diagnosis, Implementation, and Evaluation, J. Chem. Educ., 89, 995–1000 DOI:10.1021/ed100410j.
- Simkins S. and Maier M., (2010), Just-in-time Teaching: Across the Disciplines, Across the Academy, Stylus.
- Skagen D., McCollum B., Morsch L. and Shokoples B., (2018), Developing communication confidence and professional identity in chemistry through international online collaborative learning, Chem. Educ. Res. Pract., 19, 567–582 10.1039/C7RP00220C.
- Smith N. L., Grohs J. R. and Van Aken E. M., (2022), Comparison of transfer shock and graduation rates across engineering transfer student populations, J. Eng. Educ., 111, 65–81 DOI:10.1002/jee.20434.
- Sörbom D., (1974), A General Method for Studying Differences in Factor Means and Factor Structures Between Groups, Br. J. Math. Stat. Psychol., 27, 229–239 DOI:10.1111/j.2044-8317.1974.tb00543.x.
- Sorensen C. M., Churukian A. D., Maleki S. and Zollman D. A., (2006), The New Studio format for instruction of introductory physics, Am. J. Phys., 74, 1077–1082 DOI:10.1119/1.2358999.
- Stanich C. A., Pelch M. A., Theobald E. J. and Freeman S., (2018), A new approach to supplementary instruction narrows achievement and affect gaps for underrepresented minorities, first-generation students, and women, Chem. Educ. Res. Pract., 19, 846–866 10.1039/C8RP00044A.
- Steinmetz H., (2013), Analyzing Observed Composite Differences Across Groups, Methodology, 9, 1–12 DOI:10.1027/1614-2241/a000049.
- Stenbom S., (2018), A systematic review of the Community of Inquiry survey, Internet High. Educ., 39, 22–32 DOI:10.1016/j.iheduc.2018.06.001.
- Talanquer V., (2022), The Complexity of Reasoning about and with Chemical Representations, JACS Au, 2, 2658–2669 DOI:10.1021/jacsau.2c00498.
- Theobald E. J., Hill M. J., Tran E., Agrawal S., Arroyo E. N., Behling S., Chambwe N., Cintrón D. L., Cooper J. D., Dunster G., Grummer J. A., Hennessey K., Hsiao J., Iranon N., Jones L., Jordt H., Keller M., Lacey M. E., Littlefield C. E., Lowe A., Newman S., Okolo V., Olroyd S., Peecook B. R., Pickett S. B., Slager D. L., Caviedes-Solis I. W., Stanchak K. E., Sundaravardan V., Valdebenito C., Williams C. R., Zinsli K. and Freeman S., (2020), Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math, Proc. Natl. Acad. Sci. U. S. A., 117, 6476–6483 DOI:10.1073/pnas.1916903117.
- Thiry H., Weston T. J., Harper R. P., Holland D. G., Koch A. K., Drake B. M., Hunter A. and Seymour E., (2019), Talking about Leaving Revisited, Springer DOI:10.1007/978-3-030-25304-2.
- Tien L. T., Roth V. and Kampmeier J. A., (2002), Implementation of a peer-led team learning instructional approach in an undergraduate organic chemistry course, J. Res. Sci. Teach., 39, 606–632 DOI:10.1002/tea.10038.
- University of South Florida, (2023), USF Fact Book 2022–2023.
- University of South Florida, (2024) USF Fact Book 2023–2024.
- University of South Florida, (2025), USF Fact Book 2024–2025.
- Vaughan N. D., Cleveland-Innes M. and Garrison D. R., (2013), Teaching in blended learning environments: Creating and sustaining communities of inquiry, Athabasca University Press.
- Villafañe S. M., Xu X. and Raker J. R., (2016), Self-efficacy and academic performance in first-semester organic chemistry: testing a model of reciprocal causation, Chem. Educ. Res. Pract., 17, 973–984 10.1039/C6RP00119J.
- Walther J. B., (1992), Interpersonal effects in computer-mediated interaction: a relational perspective, Commun. Res., 19, 52–90.
- Wamser C. C., (2006), Peer-Led Team Learning in Organic Chemistry: Effects on Student Performance, Success, and Persistence in the Course, J. Chem. Educ., 83, 1562 DOI:10.1021/ed083p1562.
- Wang X., (2015), Pathway to a Baccalaureate in STEM Fields: Are Community Colleges a Viable Route and Does Early STEM Momentum Matter?, Educ. Eval. Policy Anal., 37, 376–393 DOI:10.3102/0162373714552561.
- Wang X., Sun N., Lee S. Y. and Wagner B., (2017), Does Active Learning Contribute to Transfer Intent Among 2-Year College Students Beginning in STEM?, J. High. Educ., 88, 593–618 DOI:10.1080/00221546.2016.1272090.
- Ward L. W., Rotich F., Hoang J. and Popova M., (2022), in Student Reasoning in Organic Chemistry, Graulich N. and Shultz G. (ed.), The Royal Society of Chemistry, ch. 3, pp. 36–56 10.1039/9781839167782-00036.
- Whitfield M., (2005), Transfer-Student Performance in Upper-Division Chemistry Courses: Implications for Curricular Reform and Alignment, Comm. Coll. J. Res. Pract., 29, 531–545 DOI:10.1080/10668920590953999.
- Widanski B. B. and McCarthy W. C., (2009), Assessment of Chemistry Anxiety in a Two-Year College, J. Chem. Educ., 86, 1447 DOI:10.1021/ed086p1447.
- Wilks S. S., (1932), Certain Generalizations in the Analysis of Variance, Biometrika, 24, 471–494 DOI:10.1093/biomet/24.3-4.471.
- Williams-Dobosz D., Jeng A., Azevedo R. F. L., Bosch N., Ray C. and Perry M., (2021), Ask for Help: Online Help-Seeking and Help-Giving as Indicators of Cognitive and Social Presence for Students Underrepresented in Chemistry, J. Chem. Educ., 98, 3693–3703 DOI:10.1021/acs.jchemed.1c00839.
- Wilson S. B. and Varma-Nelson P., (2019), Characterization of First-Semester Organic Chemistry Peer-Led Team Learning and Cyber Peer-Led Team Learning Students’ Use and Explanation of Electron-Pushing Formalism, J. Chem. Educ., 96, 25–34 DOI:10.1021/acs.jchemed.8b00387.
- Wilson S. B. and Varma-Nelson P., (2021), Implementing Peer-Led Team Learning and Cyber Peer-Led Team Learning in an Organic Chemistry Course, J. Coll. Sci. Teach., 50, 44–50 DOI:10.1080/0047231x.2021.12290507.
- Wolf M. G. and McNeish D., (2023), dynamic: An R package for deriving dynamic fit index cutoffs for factor analysis, Multivar. Behav. Res., 58, 189–194 DOI:10.1080/00273171.2022.2163476.
- Wolf M. G. and McNeish D., (2024), dynamic: DFI cutoffs for latent variables models.
- Yang H. and Su J., (2021), A Construct Revalidation of the Community of Inquiry Survey: Empirical Evidence for a General Factor Under a Bifactor Structure, IRRODL, 22, 22–40 DOI:10.19173/irrodl.v22i4.5587.
- Yik B. J., Dood A. J., Frost S. J. H., Cruz-Ramírez de Arellano D., Fields K. B. and Raker J. R., (2023), Generalized rubric for level of explanation sophistication for nucleophiles in organic chemistry reaction mechanisms, Chem. Educ. Res. Pract., 24, 263–282 10.1039/d2rp00184e.
This journal is © The Royal Society of Chemistry 2026.