Measuring student engagement in the undergraduate general chemistry laboratory

K. Christopher Smith * and Valeria Alonso
Department of Chemistry, University of Texas Rio Grande Valley, 1201 W. University Drive, Edinburg, TX 78539, USA. E-mail: kenneth.smith@utrgv.edu

Received 6th July 2018, Accepted 5th November 2019

First published on 6th November 2019


Abstract

In this study a survey was developed to investigate students’ engagement during general chemistry laboratory sessions. Aspects of engagement surveyed included cognitive, behavioral, and emotional engagement, and the survey items were focused on activities during the pre-laboratory introduction, laboratory procedures, and data collection. Exploratory factor analysis of the results was conducted to determine the various underlying factors in the survey, and the scores of the general chemistry laboratory students along these underlying factors were compared. The findings supported the various dimensions of engagement reported in the literature.


Introduction

The teaching laboratory, in which students conduct laboratory experiments, is a standard and expected component of science courses. Various authors have reviewed the literature and reported on educational goals associated with students’ engaging in laboratory activities (Hofstein and Lunetta, 1982; White, 1996; Johnstone and Al-Shuaili, 2001; Hofstein and Lunetta, 2004; Reid and Shah, 2007). The goals reported generally fall within categories outlined by Hofstein and Lunetta (2004): enhancing students’ understanding of science, understanding of the nature of science, scientific habits of mind, practical skills, problem solving abilities, interest, and motivation.

While such educational goals are indeed worth striving for, Hofstein and Lunetta (1982) noted that there was significant overlap between objectives stated for laboratory experiences and objectives outlined for science courses themselves; one could conclude from this overlap that the student outcomes associated with laboratory experiences might be achieved through the science courses themselves. As such, Hofstein and Lunetta (1982) highlighted the need for defined goals for laboratory work that take into account the opportunities potentially afforded by laboratory instruction.

Another important point raised by Hofstein and Lunetta (1982) was that researchers had not yet convincingly or comprehensively investigated the relationships between the teaching laboratory experience and student learning, especially in contrast to other forms of instruction. As such, it could be difficult to defend the effectiveness and the role of the teaching laboratory in science curricula. Some twenty years later, Hofstein and Lunetta (2004) noted that these concerns were still valid, despite the centrality of the teaching laboratory in science education.

Additional calls have been made for researchers to investigate the effects of the laboratory experience on student learning, most recently by Bretz (2019). She noted that undergraduate chemistry teaching laboratories are labor intensive and costly, and that such resources could come under scrutiny from university administrators facing increasingly strained budgets. Bretz (2019) highlighted the need to have arguments and data available to justify the presence of the standard chemistry teaching laboratory. Over the past fifteen years, researchers have been making more concerted and consistent efforts towards investigating student learning in the chemistry teaching laboratory. In the introduction to a themed collection on student learning in the chemistry teaching laboratory in this journal, Hofstein and Mamlok-Naaman (2007) expressed the hope that the collection would provide insight into such learning. In addition, a number of research groups have been investigating student learning, and in particular meaningful learning, in the chemistry teaching laboratory.

Meaningful learning in the undergraduate general chemistry laboratory

Nearly twenty years ago, Bretz (2001) published an article focused on Novak's theory of education (Novak, 1977), as a potential theoretical framework for classroom teaching and conducting chemistry education research. Bretz (2001) noted that Novak's work drew strongly from the work of Ausubel, and that according to Ausubel, meaningful learning occurred when new knowledge was nonarbitrarily connected to a learner's existing knowledge (Ausubel, 1968). In order for meaningful learning to take place, a learner must bring some relevant prior knowledge, the new material must be meaningful and relevant to the prior knowledge, and the learner must consciously choose to integrate the new material into the prior knowledge (Ausubel, 1963). Pragmatically, Bretz (2001) noted that only one of these three conditions – presenting the material in a meaningful and relevant manner – was controlled by the educator, as learners brought their own prior knowledge and made their own decisions about consciously engaging in integrating material. As such, Bretz (2001) drew on Novak's theory as a guide for educators to potentially positively affect students’ learning, within the context of educators presenting new material in a meaningful and relevant manner. Specifically, Bretz (2001) noted that Novak's theory indicated that students would experience meaningful learning only when they engaged in educational experiences which required them to connect and integrate cognitive (thinking), affective (feeling), and psychomotor (doing) domains. This framework for meaningful learning has proven to be very useful when considering learning in the undergraduate general chemistry laboratory.

Goals for the undergraduate general chemistry laboratory

Hofstein and Lunetta (1982) highlighted the need for defined goals for laboratory work, and recent research has addressed goals for undergraduate general chemistry laboratory work, some of it within the context of meaningful learning. A qualitative study on faculty goals for chemistry laboratory work revealed several goals for the general chemistry laboratory, including having students engage in doing science, mastering laboratory techniques and skills, developing critical thinking skills, connecting lecture content to laboratory content, and collaborative group work (Bruck et al., 2010). There were, however, differing emphases on some of the goals among faculty who had successfully obtained federal funding geared towards laboratory curricular improvement, compared to faculty who had not obtained such funding: faculty with such funding emphasized connecting lecture with laboratory and developing critical thinking skills more, while faculty without such funding emphasized teamwork skills more. In a follow-up quantitative study, Bruck and Towns (2013) developed a survey instrument to examine faculty goals for chemistry laboratory work. They also reported that faculty goals for the general chemistry laboratory included a focus on laboratory techniques and skills, and on connecting lecture material to laboratory material; in addition, they reported laboratory safety as a goal. Interestingly, Bruck and Towns (2013) noted that general chemistry faculty placed less emphasis on goals related to research experience and laboratory writing, compared to faculty across the rest of the undergraduate chemistry curriculum.

Faculty goals for chemistry laboratory work have also been examined within the context of meaningful learning, with a focus on cognitive, affective, and psychomotor domains. Bretz et al. (2013) conducted a qualitative study on faculty goals for chemistry laboratory work and coded the faculty-identified goals as cognitive, affective, or psychomotor. They reported six cognitive goals: connecting the lecture to the laboratory, connecting the laboratory to everyday life, emphasizing conceptual understanding in the laboratory, connecting laboratory content to other science and mathematics fields, engaging in critical analysis, and communicating with the scientific community. Affective goals included laboratory connections to the real world, gaining independence, and engaging in collaboration. Lastly, psychomotor goals included using laboratory equipment and learning laboratory techniques. There were, however, differences in emphases on some of the general chemistry laboratory goals among faculty in departments which had successfully obtained federal funding geared towards laboratory curricular improvement, compared to faculty in departments which had not obtained such funding, similar to the findings reported by Bruck et al. (2010). Faculty in departments with such funding emphasized critical thinking and connecting lecture to laboratory more, while it was suggested that faculty in departments without such funding emphasized collaboration more.

Students are a critical component of the educational enterprise, and it is important to consider students’ goals for laboratory work, especially since educators and students may hold mismatched goals (Hofstein and Lunetta, 2004). Fittingly, student goals for general chemistry laboratory work have also been investigated within the context of meaningful learning. DeKorver and Towns (2015) collected general chemistry student-identified goals pertaining to specific general chemistry experiments students conducted, and coded the goals among the cognitive, affective, and psychomotor domains. They reported that students’ most prevalent goals focused on the affective domain, in terms of finishing quickly (positive), finishing late (negative), earning good grades (positive), and making mistakes (negative). DeKorver and Towns (2015) also reported that some students stated learning techniques as a goal (psychomotor), but did not take advantage of opportunities to practice or try techniques, while some students would follow the steps in the procedure without thought or understanding, with the intent to understand the material at a later time. This last result was also reported by Galloway and Bretz (2016), and by Galloway et al. (2016), who reported that some students were influenced to make this choice because they perceived a lack of control in the structure of their laboratory course, so they disengaged cognitively and focused only on carrying out the procedures. DeKorver and Towns (2015) noted that these actions led to a separation of psychomotor tasks from the cognitive domain, that the affective goal of finishing quickly was in conflict with cognitive and psychomotor goals, and that there were indeed areas of misalignment between student goals and faculty goals. In a related vein, Flaherty et al. (2017) have also reported on the misalignment between the perceptions of general chemistry students and laboratory demonstrators (similar to graduate teaching assistants) on laboratory demonstrators’ responsibilities for addressing aspects of meaningful learning in the general chemistry laboratory. Flaherty et al. (2017) wrote that graduate students (whether graduate teaching assistants or laboratory demonstrators) are integral to the enculturation of undergraduate students into the scientific community, as similarly noted by Smith and Sepulveda (2018); as such, it is important to focus on goals of students, faculty, and teaching assistants, in relation to laboratory work.

These studies have certainly helped to define educator and student goals for general chemistry laboratory work, and within the context of meaningful learning. The results can help inform educators on which domains of meaningful learning are being more or less attended to by educators and students, and can help guide the development of general chemistry laboratory curricula.

Integrating aspects of cognitive, affective, and psychomotor domains in the undergraduate general chemistry laboratory

Galloway and Bretz recently published a series of articles reporting their research on meaningful learning in the undergraduate chemistry laboratory, with some of the focus being on the undergraduate general chemistry laboratory (Galloway and Bretz, 2015a, 2015b, 2015c, 2015d). The first article in this series of articles reported on the development of an instrument, the Meaningful Learning in the Laboratory Instrument (MLLI), which focused on measuring students’ expectations and experiences along cognitive and affective dimensions in undergraduate chemistry laboratory courses (Galloway and Bretz, 2015a). They used a common stem for each item in the instrument, with the stem setting the context of performing the laboratory experiments; as such, the psychomotor domain of meaningful learning was incorporated within the stem of each MLLI item, and each item was coded as cognitive, affective, or cognitive/affective, which spoke to the cognitive and affective domains of meaningful learning. The MLLI was administered at the beginning and end of a semester, and based on the survey results and analysis, Galloway and Bretz (2015a) reported that few students in general chemistry and organic chemistry engaged both their thoughts (cognitive domain) and feelings (affective domain) while performing their laboratory experiments. They also suggested that the cognitive domain and affective domain were not integrated in students’ minds, and that students were not aware of their thinking and feeling as they performed their laboratory experiments, or that students did not connect their thinking and feeling with their performing of their laboratory experiments. 
Interestingly, exploratory factor analysis of MLLI results yielded a two-factor solution, with one factor composed of positively worded items (in cognitive and affective domains) contributing to meaningful learning, and the other factor composed of negatively worded items (in cognitive and affective domains) hindering meaningful learning; this type of two-factor solution, with a positive and a negative factor, was not unprecedented (Galloway and Bretz, 2015a). A different study, focused on the development of a chemistry laboratory self-efficacy beliefs instrument, also yielded a similar two-factor solution from exploratory factor analysis (Alkan, 2016). One factor was composed of positively worded items, all in the psychomotor domain, while the other factor was composed of negatively worded items, all in the cognitive domain. Both of these studies had students consider their holistic chemistry laboratory experience, as opposed to focusing on particular laboratory experiments.

Galloway and Bretz also reported on the results of the MLLI administered to general chemistry laboratory students in a local study (Galloway and Bretz, 2015b), as well as to undergraduate chemistry laboratory students, including general chemistry laboratory students, in a national study (Galloway and Bretz, 2015c). They reported that their results might indicate that students intrinsically hold the affective domain to be more important than the cognitive domain within the context of learning in the laboratory (Galloway and Bretz, 2015b), and that even if students report an overall positive learning experience in the laboratory, it is possible that they may have also had some negative affective experiences, which could hinder the quality of their meaningful learning (Galloway and Bretz, 2015c). Galloway and Bretz called for chemistry laboratory curricula to attend more to the incorporation of the affective domain, beyond only a focus on group work and real world connections (Galloway and Bretz, 2015b, 2015c); in addition, it has been noted that the development of chemistry laboratory curricula should focus on various affective dimensions, and not focus only on student interest (Galloway et al., 2016). These studies have contributed greatly to shedding light on the presence and integration of cognitive, affective, and psychomotor domains in the general chemistry laboratory.

Other aspects of cognitive, affective, and psychomotor domains in science curricula

Within the context of chemistry coursework in general, a number of articles focused on aspects of cognitive, affective, and psychomotor domains have appeared in the literature, much of it focused on the affective domain. Bauer (2005) developed a chemistry self-concept inventory (CSCI) and found five self-concept factors: mathematics self-concept, chemistry self-concept, academic self-concept, academic enjoyment self-concept, and creativity self-concept. Bauer (2008) also developed a semantic differential instrument on attitudes towards chemistry (the Attitude toward the Subject of Chemistry Inventory, ASCI) with factors and items focused on interest and utility, anxiety, intellectual accessibility, fear, and emotional satisfaction. Xu and Lewis (2011) and Brandriet et al. (2011) reported on a shortened version of the ASCI, focused on only intellectual accessibility and emotional satisfaction. Grove and Bretz (2007) developed an instrument to evaluate chemistry students’ cognitive expectations, the CHEMX instrument, with survey items divided in seven clusters: effort, concepts, math link, reality link, outcome, laboratory, and visualization. Barbera et al. (2008) modified and validated the Colorado Learning Attitudes about Science Survey (CLASS) for use in chemistry specifically, and developed the CLASS-Chem survey, with survey responses falling in several categories: personal interest, real world connection, problem solving: general, problem solving: confidence, problem solving: sophistication, sense making/effort, conceptual connections, conceptual learning, and atomic-molecular perspective of chemistry.

Within the context of science laboratory work specifically, few articles focused on exploring aspects of cognitive, affective, and psychomotor domains have appeared in the literature. Bowen (1999) developed the Chemistry Laboratory Anxiety Instrument (CLAI), which examined student anxiety in college freshman and sophomore chemistry laboratory courses. He identified five dimensions of anxiety: working with chemicals, using equipment and procedures, collecting data, working with other students, and having adequate time. Barrie et al. (2015) developed the Advancing Science by Enhancing Learning in the Laboratory (ASELL) Student Laboratory Experience (ASLE) survey instrument, which focused on students’ experiences, including aspects of relevance and interest, in mostly college freshman science laboratory courses. The ASLE instrument was meant to be administered to students engaged in various laboratory experiments to provide feedback to laboratory instructors to improve the laboratory experiments. They found three dimensions of the students’ experiences: motivators, assessment, and resources. In addition, Sadler et al. (2011) developed the Laboratory Instructional Practices Inventory (LIPI), an observation protocol focused on student engagement and discourse in college biology, chemistry, and physics laboratory courses. The LIPI protocol was also meant to capture data from various laboratory experiments to provide feedback for the improvement of the laboratory experiments. Sadler et al. defined various types of student engagement being observed, including disengagement (students doing something other than the laboratory work), passive engagement (students receiving information), task-oriented engagement (students working on laboratory tasks or protocols), and epistemic engagement (students being cognitively engaged). These studies have all contributed to the knowledge base on aspects of cognitive, affective, and psychomotor domains in science curricula.

Measuring engagement in science laboratory curricula

The literature articles cited thus far have focused on studies of aspects of cognitive, affective, and psychomotor domains in science curricula, and especially in general chemistry laboratory courses, within the context of meaningful learning. Many of the studies have involved a focus on outcomes associated with students performing laboratory experiments, so undoubtedly the students have been engaged in the laboratory activities. What is generally missing from the research literature, however, is a consideration of the extent of student engagement during the laboratory activities.

A useful framework for considering student engagement is found in the seminal review article written by Fredricks et al. (2004), in which they thoroughly discussed the concept of engagement in school settings. They defined three types of engagement: behavioral engagement, emotional engagement, and cognitive engagement. While they presented various definitions of each of these forms of engagement, each type of engagement had a definition that was most relevant to chemistry laboratory work, which is the focus of this article. The most relevant definition of behavioral engagement centered on students’ involvement in academic tasks and learning, and was characterized by behaviors such as effort, persistence, and attention. Similarly, the most relevant definition of emotional engagement focused on students’ affective reactions in the academic environment, such as interest, boredom, and anxiety. Likewise, the most relevant definition of cognitive engagement involved students’ psychological investment in academic tasks, as well as their psychological effort geared towards academic tasks, such as learning, understanding, and mastering. It is important to note that these dimensions of engagement align with the domains of meaningful learning; cognitive engagement aligns with the cognitive domain, emotional engagement with the affective domain, and behavioral engagement with the psychomotor domain. Furthermore, the cognitive and behavioral aspects of engagement focus on students’ efforts along those dimensions, which aligns with one of the stipulations for meaningful learning to occur, namely that the learner must consciously choose to integrate the new material into the prior knowledge (Ausubel, 1963). In contrast, the emotional aspect of engagement focuses on students’ affective reactions.

Within this context of student engagement, several published studies do provide for a measure of student engagement in science laboratory activities, largely in the dimension of emotional engagement. The Meaningful Learning in the Laboratory Instrument (MLLI) developed by Galloway and Bretz (2015a) measured the extent of affective reactions. The MLLI also measured cognitive aspects, but more in terms of the degree of cognitive aspects present, as opposed to cognitive effort. The Chemistry Laboratory Anxiety Instrument (CLAI) developed by Bowen (1999) also measured the extent of affective reactions. The Laboratory Instructional Practices Inventory (LIPI) developed by Sadler et al. (2011) did capture aspects of engagement observed, but it did not measure the effort associated with behavioral or cognitive engagement, and it did not measure the extent of affective reactions.

Several other published studies do include reports of positive student engagement associated with chemistry laboratory activities, but without the use of instruments such as the MLLI (Galloway and Bretz, 2015a) or the CLAI (Bowen, 1999). Laredo (2013) reported on the implementation of problem-based undergraduate general chemistry laboratories. She reported that students were engaged, and more so than when conducting traditional general chemistry laboratory experiments. These conclusions about engagement were based on student feedback which indicated that the laboratory activities were enjoyable and made them think. Irby et al. (2018) reported on the use of a virtual general chemistry laboratory module to investigate students’ engagement with macroscopic, submicroscopic, and symbolic representations of chemistry, in comparison to the wet general chemistry laboratory. They measured engagement by comparing students’ dialogue relevant to macroscopic, submicroscopic, and symbolic representations of chemistry, as students worked through the two different laboratory formats. In addition, a number of published laboratory experiments have indicated positive student engagement outcomes. For example, Smith and colleagues have reported that student feedback indicated that students enjoyed chemistry laboratory experiments involving cranberry-apple juice (Edionwe et al., 2011) and acetaminophen in gel capsules (Smith and Cedillo, 2014), and that students found an inquiry-based lab on acid conductivity to be engaging (Smith and Garza, 2015). These articles do report on aspects of engagement, but generally without the use of an instrument or an observation protocol, and without a focus on all the dimensions (behavioral, emotional, and cognitive) of engagement.

In a recent article, Nyutu et al. (2019) investigated student engagement in individual undergraduate microbiology laboratories. They cited the framework of engagement described by Fredricks et al. (2004) but opted to use the Laboratory Instructional Practices Inventory (LIPI) developed by Sadler et al. (2011), since it was specific to science. We were also interested in examining student engagement in individual laboratory experiments, within the context of the general chemistry laboratory specifically, and with a consideration of all three types of engagement presented by Fredricks et al. (2004). No instrument currently exists to measure this student engagement in general chemistry laboratory experiments, so we saw the need to develop such an instrument. In addition, during general chemistry laboratories there are typically several different types of activities: a pre-laboratory talk by the instructor or teaching assistant, engaging in the laboratory procedures, and collecting data and observations. We were interested in examining the various types of student engagement during these different types of activities during the laboratory. Furthermore, we were interested in measuring and comparing students’ engagement in various laboratory experiments. In light of these various considerations, our research question was: How do students’ behavioral, emotional, and cognitive engagement vary and compare across various laboratory experiments in university general chemistry laboratory courses?

Theoretical foundation

The concept of engagement as defined by Fredricks et al. (2004) was the foundation used for this research. They provided various definitions of three forms of engagement: behavioral engagement, emotional engagement, and cognitive engagement. Fredricks et al. discussed these types of engagement in the context of a classroom school setting, but certain definitions they presented resonated strongly with the context of a general chemistry laboratory setting. First, there was behavioral engagement centering on students’ involvement in academic tasks and learning (with the tasks and learning being associated with the general chemistry laboratory), and characterized by behaviors including effort, persistence, and attention. Next, there was emotional engagement focusing on students’ affective reactions in the academic environment (the general chemistry laboratory), such as interest, boredom, and anxiety. Finally, there was cognitive engagement involving students’ psychological investment in academic tasks (associated with the general chemistry laboratory), as well as their psychological effort geared towards academic tasks, such as learning, understanding, and mastering.

Methods

Data collection

The site of this research was the University of Texas Rio Grande Valley, a large public university in the southwestern United States, with human subjects research approval from the university's Institutional Review Board. The approved protocol was numbered 2017-286-12 and titled “Examining Student Engagement in Freshman General Chemistry Labs”. We developed a preliminary survey instrument to examine student engagement in general chemistry laboratory courses. Neither author of this article was an instructor of any general chemistry laboratory courses during the semesters in which the survey was administered. In addition, neither author was directly involved in administering the survey to students; instead, the surveys were administered by the individual general chemistry laboratory course instructors, after being supplied with the Qualtrics (2017) link to the survey. The survey instrument began with a demographics section including questions on age, gender, student classification, major, minor, and current chemistry laboratory course enrollment. The survey instrument then provided a written prompt to students to choose one of their two most recent general chemistry laboratory experiments, and to think about that particular experiment while they were responding to the survey items. This prompt was included for two main reasons. First, it was decided that having more laboratory experiments represented in the survey would lead to a more robust survey, especially since the survey was being developed to measure student engagement in individual laboratory experiments. Second, we wanted to ensure that students would be likely to remember the laboratory experiments they had recently completed, so it was decided to limit their choices to their two most recently completed laboratory experiments.

The next section of the survey included forty-six Likert-scale items with response options of Strongly Disagree, Disagree, Agree, and Strongly Agree. Nadler et al. (2015) reported that the strongest validity and reliability are exhibited by Likert-scale items with 4- to 7-point scales, and we chose to use a 4-point scale with our Likert-scale items. Furthermore, we did not include a midpoint because we wanted participants to select a stance instead of simply selecting the midpoint (Nadler et al., 2015). In addition, we opted to use a Likert scale as opposed to a slider scale because slider scales have labels only at the ends of the scales, potentially leading to participants having various interpretations of response categories, whereas the Likert scale has each response category labeled, resulting in common interpretations of the labels by participants (Weijters et al., 2010). The Likert-scale items focused on the various parts of a typical general chemistry laboratory experiment: a pre-laboratory talk by the instructor or teaching assistant, engaging in the laboratory procedures, and collecting data and observations. In addition, there were items involving behavioral, emotional, and cognitive engagement associated with each different part of a typical general chemistry laboratory experiment, as well as several items on general engagement during laboratory experiments.
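For analysis, responses on such a 4-point scale are typically coded numerically. The following Python sketch is purely illustrative and not part of the study's instrument or analysis; the mapping and function names are our own:

```python
# Illustrative coding of a 4-point Likert scale with no midpoint;
# the dictionary and function names below are hypothetical.
LIKERT_SCORES = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Agree": 3,
    "Strongly Agree": 4,
}

def score_responses(labels):
    """Map a sequence of response labels to numeric scores."""
    return [LIKERT_SCORES[label] for label in labels]

print(score_responses(["Agree", "Strongly Agree", "Disagree"]))  # [3, 4, 2]
```

With no neutral midpoint, every label maps to a score on one side of the scale, which is consistent with the design rationale of forcing participants to select a stance.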

After the survey was constructed, as a measure of face validity, the survey was shared with several undergraduate students who had recently taken general chemistry courses, and there were no major suggested changes to the survey items. The survey was administered to students enrolled in General Chemistry for Engineers Laboratory courses, as well as the first semester (General Chemistry Laboratory I) and second semester (General Chemistry Laboratory II) freshman general chemistry laboratory courses, via Qualtrics (2017). It should be noted that these chemistry laboratory courses all offered traditional laboratory experiments, which would typically be described as Level 0: Confirmation, following Bruck et al. (2008). The laboratory experiments occupy a two-hour-and-forty-minute time slot, with a teaching assistant or instructor assigned to each laboratory classroom with a maximum of 22 students. During the laboratory experiments, students typically work in pairs to collect data and fill in data tables, while the laboratory report has students follow prescribed calculations and data analysis. During the semester in which survey data were collected, there were 9 sections of the general chemistry for engineers course (with 193 students total), 33 sections of the general chemistry laboratory I course (with 685 students total), and 9 sections of the general chemistry laboratory II course (with 182 students total).

Participation in the survey by students was voluntary, and students received a 5-point bonus credit on their final exam for completing the survey. The survey was administered just once, several weeks before the end of the semester. In total, 651 students accessed the survey, while 616 of these students completed the survey. As one check on the quality of the data, we tallied the number of students who completed the survey and who had responded in exactly the same way (either Strongly Disagree, Disagree, Agree, or Strongly Agree) to all of the Likert-scale items, a possible indication that they were not really paying attention to the items and were simply entering responses. Two students responded Agree, and three students responded Strongly Agree, to all of the Likert-scale survey items; these data were excluded from the analysis, and as such, our preliminary survey data analysis was based on the data of the remaining 611 students. At the end of the survey, students had an opportunity to give open-ended feedback on the survey, with this feedback used as a further measure of face validity of the survey. A total of 123 students indicated that they had no feedback to give, while 18 students indicated that it was a good survey, based on it being thorough, detailed, well-organized, answered easily, and reflective of the various aspects of the chemistry laboratory experiment experience. Two students indicated that they found the survey questions somewhat repetitive, while four students indicated that they would have preferred a neutral option in the Likert-scale response options. In addition, there was no student feedback indicating doubt or confusion about the meaning of any of the survey items.

Data analysis (described in the following sections) of the results of the preliminary survey led to eighteen of the forty-six survey items being dropped, yielding a final survey consisting of twenty-eight items. As a measure of the response process validity associated with the final survey, eleven general chemistry students were interviewed individually to gauge their understanding of the various survey items. Each student was interviewed on either nine or ten of the survey items, with each survey item being addressed by either three or four students. In each interview, each survey item was printed on a slip of paper and presented to the student one at a time. Each student was asked to read each survey item aloud and to describe in their own words what the survey item meant to them. Students’ interpretations of the survey items were well aligned with the intended meanings of the survey items. In addition, as a result of the interviews, several small changes were made to the wording of some of the survey items. One such change involved the words “excited” and “intrigued” in some of the survey items; students indicated that such words were too enthusiastic, and that the word “interested” would be more fitting, which reinforced the importance of such item wording (Galloway and Bretz, 2015a; Galloway et al., 2016).

As a measure of the content validity associated with the final survey, three instructors who regularly teach general chemistry laboratory courses (two of them with doctoral degrees in chemistry and one of them with a master's degree in chemistry) individually reviewed the final survey items. In reviewing the survey items they were asked to indicate the type of engagement (behavioral, emotional, or cognitive) associated with each survey item. There was total agreement on the type of engagement on twenty-two of the survey items, while there was partial agreement (two out of three of the reviewers) on five of the survey items, and no agreement amongst the reviewers on one survey item. The reviewers were also asked to indicate the extent to which they perceived each individual item to be representative of the type of engagement that they selected for each item, according to the following scale: 1 = not representative, 2 = minimally representative, 3 = moderately representative, and 4 = strongly representative. All reviewers rated the representativeness of all items as either 3 (moderately representative, 22.6% of the total items) or 4 (strongly representative, 77.4% of the total items). Finally, the reviewers were asked to comment on the clarity of the items, and they all indicated that the items were clear.

The final survey was administered in a subsequent semester via Qualtrics (2017) to students enrolled in General Chemistry for Engineers Laboratory courses, as well as the first semester (General Chemistry Laboratory I) and second semester (General Chemistry Laboratory II) freshman general chemistry laboratory courses. During the semester in which survey data were collected, there were 6 sections of the general chemistry for engineers course (with 143 students total), 26 sections of the general chemistry laboratory I course (with 482 students total), and 17 sections of the general chemistry laboratory II course (with 375 students total). Participation in the survey by students was voluntary, and students received a 5-point bonus credit on their midterm exam for completing the survey. The survey was administered just once, near the middle of the semester. In total, 296 students accessed the survey, while 269 of these students completed the survey. As one check on the quality of the data, we tallied the number of students who completed the survey and who had responded in exactly the same way (either Strongly Disagree, Disagree, Agree, or Strongly Agree) to all of the Likert-scale items, a possible indication that they were not really paying attention to the items and were simply entering responses. One student responded Agree, and one student responded Strongly Agree, to all of the Likert-scale survey items; these data were excluded from the analysis. In addition, there were 4 students who responded Agree or Strongly Agree to all the items, even though some of the items were reverse worded; these data were also excluded from the analysis. As such, our final survey data analysis was based on the data of the remaining 263 students.

The average age of the students taking the final survey was 20.4 years. The distribution of students by gender, course, classification, and major is shown in Table 1; one student did not indicate their laboratory course.

Table 1 Distribution of students by gender, course, classification, and major
Number of students % of students
Gender Female 165 62.7
Male 98 37.3
Course Gen. Chem. Eng. Lab. 13 4.9
Gen. Chem. Lab. I 154 58.6
Gen. Chem. Lab. II 95 36.1
Classification Freshman 82 31.2
Sophomore 78 29.7
Junior 70 26.6
Senior 33 12.5
Major Biology 127 48.3
Nursing 38 14.4
Chemistry 16 6.1
Mechanical Engineering 10 3.8
Civil Engineering 7 2.7
Computer Engineering 5 1.9
Psychology 5 1.9
Pre-Professional 2 0.8
Other 53 20.2


Data analysis

Once preliminary and final survey data collection was complete, the results were exported into Microsoft Excel (2016). The response option Strongly Disagree was converted to a numerical value of 1, Disagree to 2, Agree to 3, and Strongly Agree to 4. There were some reverse worded items, so care was taken to reverse code these items. These data were then exported into IBM SPSS Statistics 23 (2015) for further analysis.
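The scoring and reverse coding described above can be sketched as follows; this is an illustrative Python sketch of the conversion step (the names are ours, not part of the study's SPSS workflow):

```python
# Map Likert labels to numeric scores; reverse-worded items are recoded as 5 - x,
# so that higher scores always indicate greater engagement.
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Agree": 3, "Strongly Agree": 4}

def score_response(label: str, reverse_worded: bool = False) -> int:
    """Convert one Likert response to its numeric score, reverse coding if needed."""
    value = LIKERT[label]
    return 5 - value if reverse_worded else value
```

For example, a Strongly Disagree response to a reverse-worded item scores 4.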

We then conducted exploratory factor analysis on the data in IBM SPSS Statistics 23 (2015). We also conducted reliability analyses of the various factors, and comparisons of means of the various groups on the various factors as well as the factors combined. Exploratory factor analysis is a statistical technique used to reduce a large number of variables in a set of data into a smaller set of factors, with each of the factors being related to a latent construct (Taherdoost et al., 2014), which cannot be directly measured. We chose to conduct exploratory factor analysis on our survey data to determine which survey items loaded onto various factors, and to determine the nature of the various factors. Also, exploratory factor analysis provides evidence of construct validity (Thompson and Daniel, 1996), allowing us to provide evidence of construct validity for our survey, which is a Likert-scale self-reporting survey. Exploratory factor analysis was appropriate for our preliminary survey data as our number of survey participants surpassed the generally recommended number of 300 survey participants for exploratory factor analysis (Taherdoost et al., 2014); we also conducted exploratory factor analysis on our final survey data, as the number of our survey participants was approaching 300. In addition, our ratio of preliminary survey participants to survey items was close to 13.3:1, which surpassed the minimum recommended ratio of 10:1 (Costello and Osborne, 2005); our ratio of final survey participants to survey items was 9.4:1, which was approaching the ratio of 10:1. We followed the best practices outlined by Costello and Osborne (2005) in conducting exploratory factor analysis of our data.

To determine the reliabilities of the various factors, internal consistencies were measured by Cronbach's α, with values greater than 0.7 considered acceptable (Nunnally, 1978). To explore differences in survey scores among the different groups (gender, course, classification), analysis of variance (ANOVA) tests were conducted.
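For reference, Cronbach's α can be computed from an (n respondents × k items) score matrix using its standard formula; this is a minimal NumPy sketch of the statistic, not the SPSS implementation used in the study:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each respondent's summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Values above the 0.7 benchmark indicate acceptable internal consistency of the items within a factor.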

Results and discussion

Exploratory factor analysis

Exploratory factor analysis was conducted in IBM SPSS Statistics 23 (2015). Principal axis factoring was used as the factor extraction method, which allows the data to have a non-normal distribution (Costello and Osborne, 2005). This extraction method was selected because Shapiro–Wilk tests of the data in IBM SPSS Statistics 23 (2015) showed that the data from the preliminary and final survey were non-normally distributed. The rotation method used was an oblique rotation method, promax rotation, with the standard kappa value of 4. This rotation method allows factors to be correlated, which is not uncommon in social sciences research (Costello and Osborne, 2005). After the initial exploratory factor analysis was conducted on the preliminary survey data, low-loading items (items with loadings of less than 0.32 on a single factor) were dropped, and cross-loading items (items with loadings of 0.32 or greater on more than one factor) were also dropped (Costello and Osborne, 2005). This procedure was repeated until there were no low-loading or cross-loading items remaining, and each factor consisted of three or more items, which indicated strong and stable factors (Costello and Osborne, 2005). Finally, there were seven additional items which were dropped due to having low communalities of less than 0.40 (Costello and Osborne, 2005). Of the items that were dropped, three items cross-loaded on factors associated with emotional engagement and behavioral engagement, four items cross-loaded on factors associated with emotional engagement in the pre-lab, lab procedures, and/or data collection, and two items cross-loaded on factors associated with cognitive engagement in the pre-lab, lab procedures, and/or data collection; these cross-loading outcomes highlight the potential difficulty in constructing survey items for various parts of the laboratory experience along various dimensions of engagement. 
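The item-screening rule described above can be sketched as follows, using the 0.32 loading cutoff of Costello and Osborne (2005); this is an illustrative Python sketch, with function and variable names of our own choosing:

```python
import numpy as np

def flag_items(loadings: np.ndarray, cut: float = 0.32):
    """Flag low-loading items (below the cutoff on every factor) and
    cross-loading items (at or above the cutoff on more than one factor)."""
    strong = np.abs(loadings) >= cut   # which loadings pass the cutoff
    n_strong = strong.sum(axis=1)      # per item: number of factors passed
    low = np.where(n_strong == 0)[0].tolist()
    cross = np.where(n_strong > 1)[0].tolist()
    return low, cross
```

In practice this screening is repeated, re-running the factor analysis after each round of dropped items, until no low-loading or cross-loading items remain.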
Furthermore, the original forty-six item preliminary survey had twelve items related to cognitive, emotional, and behavioral engagement during the pre-lab introduction given by the lab instructor. The final survey had twenty-eight items, with none of the items related to engagement during the pre-lab introduction. In the open-ended feedback at the end of the preliminary survey, thirteen students indicated they had a good lab instructor while eight students indicated they had a bad lab instructor. These results indicated that students had a range of perceptions of their lab instructors, which may have affected their responses on their levels of engagement during the pre-lab introduction given by their lab instructors, ultimately leading to a lack of cohesion among these survey items. In total twenty-eight of the original forty-six preliminary survey items were retained, while eighteen of the original survey items were dropped. The final survey, comprised of the twenty-eight items remaining after exploratory factor analysis of the preliminary survey, was administered to students in a subsequent semester.

After the initial exploratory factor analysis was conducted on the final survey data, one item was dropped as a result of cross-loading on factors associated with behavioral engagement and emotional engagement in data collection. Apart from this item, there were only two items remaining associated with behavioral engagement in data collection, and they were both low-loading items, so they were dropped.

The Kaiser–Meyer–Olkin Measure of Sampling Adequacy (KMO), which evaluates the degree to which items are correlated, generally has a minimum value of 0.7 considered as adequate, with higher values approaching the maximum value of 1 being preferred (Taherdoost et al., 2014). Our final survey data value for the Kaiser–Meyer–Olkin Measure of Sampling Adequacy (KMO) was 0.894. In addition, our p-value for Bartlett's test of sphericity, which tests the hypothesis that the correlation matrix is an identity matrix, was <0.001; this result indicated that the correlation matrix was not an identity matrix and that the items were indeed correlated.
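Bartlett's test of sphericity can be sketched directly from its standard formula, χ² = −(n − 1 − (2p + 5)/6) ln|R| with p(p − 1)/2 degrees of freedom; this is an illustrative Python sketch rather than the SPSS routine used in the study:

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data: np.ndarray):
    """Test whether the item correlation matrix R is an identity matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)          # p x p correlation matrix
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return chi2, stats.chi2.sf(chi2, df)            # statistic and p-value
```

A p-value below the chosen significance level indicates that the items are correlated, so factor analysis is appropriate.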

The number of final survey factors retained from the exploratory factor analysis was determined using the scree test (Costello and Osborne, 2005). This test involves examining the plot of eigenvalue versus factor number, and looking for a break or bend in the plot where the curve flattens out. The number of factors retained would be the number of data points above the break or bend. Fig. 1 shows the scree plot.


Fig. 1 Scree plot for the final survey.

As a further gauge for the number of factors to retain from the exploratory factor analysis, we determined the scree test acceleration factors, which approximate the second derivative of the scree curve and emphasize the point on the scree plot at which the slope changes abruptly (Raîche et al., 2013). The scree test acceleration factor plot graphs the scree test acceleration factor as a function of factor number from the second to the penultimate eigenvalue. This graph allows for an examination of where the break or bend in the scree plot occurs, enabling us to make a decision on the number of factors to be retained. Fig. 2 shows the scree test acceleration factor plot.


Fig. 2 Scree test acceleration factor plot for the final survey.
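As we understand it, the acceleration factor of Raîche et al. (2013) is the numerical second derivative of the scree curve, af(i) = e(i+1) − 2e(i) + e(i−1), evaluated from the second to the penultimate eigenvalue; a minimal sketch:

```python
import numpy as np

def acceleration_factors(eigenvalues) -> np.ndarray:
    """Second difference of successive eigenvalues, from the second to the
    penultimate eigenvalue; the largest value marks the sharpest bend."""
    e = np.asarray(eigenvalues, dtype=float)
    return e[2:] - 2.0 * e[1:-1] + e[:-2]
```

The eigenvalue at which the acceleration factor peaks locates the break in the scree plot, which in turn suggests the number of factors to retain.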

Based on the scree plot and the scree test acceleration factor plot, as well as a consideration of Kaiser's criterion which recommends that all factors with eigenvalues greater than one be retained, six factors were retained. The six factors and corresponding survey items are shown in Table 2.

Table 2 Student engagement in the general chemistry laboratory final survey factors, items, and factor loadings
Factors and items Factor loadings
F1 F2 F3 F4 F5 F6
Factor 1 (F1): Cognitive Engagement in Data and Overall
I put a lot of effort into understanding and evaluating the data/observations as I was collecting them during the lab 0.59 −0.01 −0.01 0.10 0.09 −0.03
I put a lot of effort into thinking about techniques to help me accurately record the data from the instruments, equipment, glassware, and my observations 0.72 −0.06 −0.01 0.01 0.08 0.02
I put a lot of effort into planning how I would collect all the data/observations before the end of the lab 0.74 0.03 −0.02 0.05 0.01 0.04
I tried hard to connect the lab experiment to the theory from lecture 0.76 0.00 0.05 −0.04 −0.12 0.00
I put a lot of effort into thinking about real world applications and connections to the lab experiment 0.59 0.12 0.04 −0.09 0.11 −0.06
I tried hard to understand the lab concepts rather than just memorizing them 0.72 −0.06 −0.02 −0.03 −0.01 0.11
Factor 2 (F2): Negative Emotional Engagement in Lab Procedures
I felt nervous about using the glassware during the lab procedures 0.03 0.75 −0.05 0.01 −0.11 −0.12
I felt unsure when following the lab procedures 0.00 0.66 0.08 −0.01 0.07 0.14
I felt insecure about using the lab equipment and instruments during the lab procedures −0.07 0.71 0.11 −0.05 0.01 0.14
I felt nervous when using and/or preparing the chemicals during the lab procedures 0.03 0.87 −0.12 0.07 0.03 −0.04
Factor 3 (F3): Positive Emotional Engagement in Lab Procedures
I felt interested when using the lab equipment and instruments during the lab procedures −0.04 0.06 0.82 0.06 −0.05 −0.01
I found it interesting to use and/or prepare the chemicals during the lab procedures −0.01 −0.14 0.75 0.12 0.01 0.08
I found it interesting to use the glassware during the lab procedures −0.01 −0.02 0.87 −0.08 0.07 0.00
I felt interested when following the laboratory procedures 0.14 0.06 0.74 −0.01 −0.06 −0.10
Factor 4 (F4): Behavioral Engagement in Lab Procedures
I participated fully when using the lab equipment and instruments during the lab procedures −0.05 −0.03 −0.10 0.73 0.07 0.09
I put a lot of effort into following the lab procedures properly −0.03 0.01 0.17 0.56 0.06 0.00
I participated fully in using and/or preparing the chemicals during the lab procedures 0.02 −0.10 0.09 0.76 −0.05 0.02
I participated fully when using the glassware during the lab procedures 0.04 0.16 −0.03 0.79 −0.06 −0.11
Factor 5 (F5): Cognitive Engagement in Lab Procedures
I put a lot of effort into understanding the design of the lab procedures in terms of how different parts of the lab procedures were related to one another 0.10 −0.04 −0.03 0.13 0.63 0.08
I tried hard to understand why specific lab equipment and instruments were used during the lab procedures 0.08 −0.04 −0.09 0.01 0.81 0.00
I put a lot of effort into understanding why specific chemicals were used and/or prepared during the lab procedures 0.04 0.01 0.02 −0.03 0.87 −0.03
I tried hard to understand why specific glassware was used during the lab procedures 0.16 0.03 0.12 −0.07 0.64 −0.12
Factor 6 (F6): Negative Emotional Engagement in Data Collection
I felt insecure about the data/observations I was collecting during the lab 0.07 −0.05 −0.06 0.02 −0.09 0.86
I felt unsure about accurately recording the data from the instruments, equipment, glassware, and my observations −0.02 0.07 −0.02 −0.02 0.05 0.80
I felt worried about collecting all the data/observations before the end of the lab 0.00 0.21 0.09 −0.01 0.01 0.56


Factors 1 and 5 dealt with cognitive engagement, Factors 2, 3, and 6 dealt with emotional engagement, and Factor 4 dealt with behavioral engagement. These types of engagement aligned with the various types of engagement described by Fredricks et al. (2004), but within the context of the general chemistry laboratory.

Factor 1, focused on cognitive engagement, included items associated with collecting and understanding data, and understanding the lab overall. Factor 5, also focused on cognitive engagement, included items associated with the laboratory procedures. These results indicated that students experienced cognitive engagement in laboratory procedures differently than cognitive engagement in data collection.

Factor 2 dealt with negative emotional engagement in the laboratory procedures, Factor 3 dealt with positive emotional engagement in the laboratory procedures, and Factor 6 dealt with negative emotional engagement in data collection. These results indicated that students experienced negative emotional engagement in laboratory procedures differently than negative emotional engagement in data collection. These results also showed that negative emotional engagement in laboratory procedures and positive emotional engagement in laboratory procedures are different factors, which supported the work of Galloway and Bretz (2015a, 2015b, 2015c, 2015d), who reported students experiencing either contributions to meaningful learning (positively worded items) or inhibitions against meaningful learning (negatively worded items).

Factor 4 focused on behavioral engagement in laboratory procedures. Overall, the laboratory procedures were the aspect of the laboratory experience most represented among the various dimensions of engagement, with factors involving negative emotional, positive emotional, behavioral, and cognitive engagement in laboratory procedures.

Factor reliabilities

Internal consistencies of the various factors were measured by Cronbach's α, with values greater than 0.7 being considered acceptable (Nunnally, 1978). We measured Cronbach's α values to obtain an indication of the consistency of measurement between the survey items within each of the factors resulting from the exploratory factor analysis. Taber (2018) argued for the logic of demonstrating high Cronbach's α values in surveys such as ours, in which each factor is associated with a particular construct, with the items in each factor feeding into the associated construct. The Cronbach's α values of the various factors are shown in Table 3.
Table 3 Internal consistencies of the various factors
Survey items Cronbach's α
Factor 1 (Cognitive Engagement in Data and Overall) 0.85
Factor 2 (Negative Emotional Engagement in Lab Procedures) 0.84
Factor 3 (Positive Emotional Engagement in Lab Procedures) 0.88
Factor 4 (Behavioral Engagement in Lab Procedures) 0.82
Factor 5 (Cognitive Engagement in Lab Procedures) 0.88
Factor 6 (Negative Emotional Engagement in Data Collection) 0.83


These results indicated that the internal consistencies of the various factors all surpassed the benchmark Cronbach's α value of 0.7.

Differences in average survey scores on the different factors

Differences in average survey scores on the various factors were determined using tests of analysis of variance (ANOVA) and a Tukey's HSD post hoc test. The scores were obtained by converting the response option Strongly Disagree to a numerical value of 1, Disagree to 2, Agree to 3, and Strongly Agree to 4, with reverse worded items (items in Factor 2 and Factor 6) being reverse coded. The minimum average survey score would be 1, and the maximum average survey score would be 4. The results are shown in Table 4.
Table 4 Differences in average survey scores on the different factors
Survey items Scores
a The average survey score was significantly different at the p = 0.05 level. b The items in Factors 2 and 6 were reverse scored.
Factor 1 (Cognitive Engagement in Data and Overall) 3.19
Factor 2 (Negative Emotional Engagement in Lab Procedures)b 3.01
Factor 3 (Positive Emotional Engagement in Lab Procedures) 3.55a
Factor 4 (Behavioral Engagement in Lab Procedures) 3.76a
Factor 5 (Cognitive Engagement in Lab Procedures) 3.23
Factor 6 (Negative Emotional Engagement in Data Collection)b 3.07
Entire survey 3.30
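The group comparison described above can be sketched with a one-way ANOVA; the data below are made up for illustration and do not come from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated average factor scores (on the 1-4 scale) for three hypothetical groups.
group_a = rng.normal(loc=3.2, scale=0.4, size=80)
group_b = rng.normal(loc=3.2, scale=0.4, size=80)
group_c = rng.normal(loc=3.7, scale=0.4, size=80)  # a higher-scoring group

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
```

A significant F test would then be followed by a post hoc test such as Tukey's HSD (available as scipy.stats.tukey_hsd in SciPy 1.8 and later) to identify which pairs of groups differ.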


The average survey scores on Factors 3 and 4 were significantly different from all average survey scores at the p = 0.05 level. In addition, the average scores on Factors 2 and 6 were not significantly different, the average scores on Factors 1 and 6 were not significantly different, and the average scores on Factors 1, 5, and the entire survey were not significantly different, all at the p = 0.05 level.

These results indicated that overall, Factor 4 (Behavioral Engagement in Laboratory Procedures) had the highest average score. The behavioral engagement dimension is the dimension most associated with “doing” the laboratory experiment, and these results indicated that the highest level of engagement was associated with “doing” the laboratory work.

Factor 3 (Positive Emotional Engagement in Laboratory Procedures) had a significantly higher average score than Factor 2 (Negative Emotional Engagement in Laboratory Procedures). This result indicated that, on average, students' agreement with the positive emotional engagement items was stronger than their disagreement with the negative emotional engagement items (which were reverse coded).

Differences in average survey scores among the different laboratory experiments

As the students completed the survey, they were asked to choose one of their two most recent general chemistry laboratory experiments, and to think about that particular experiment while they were responding to the survey items. In General Chemistry for Engineers Laboratory the recent laboratory experiments involved acids and bases, and volumetric analysis; in General Chemistry Laboratory I the recent laboratory experiments involved the chemistry of copper and limiting reactants; in General Chemistry Laboratory II the recent laboratory experiments involved rate laws and factors affecting reaction rates. Differences in average survey scores among the different laboratory experiments were determined using tests of analysis of variance (ANOVA); a Tukey's HSD post hoc test was conducted to examine the differences in survey scores among the various laboratory experiments. The results are shown in Table 5; one student did not select a laboratory experiment.
Table 5 Differences in average survey scores on the different laboratory experiments
Survey items Laboratory experiment
Gen. Chem. Eng. Lab. acids (N = 40) Gen. Chem. Eng. Lab. volumetric (N = 8) Gen. Chem. Lab. I limiting (N = 25) Gen. Chem. Lab. I copper (N = 102) Gen. Chem. Lab. II factors (N = 29) Gen. Chem. Lab. II law (N = 58)
a The average survey score was significantly different at the p = 0.05 level.
Factor 1 3.30 3.13 3.06 3.22 3.10 3.18
Factor 2 3.05 3.03 2.95 2.93 3.09 3.11
Factor 3 3.69 3.31 3.49 3.62 3.58 3.40
Factor 4 3.79a 3.78 3.72 3.83a 3.90a 3.57a
Factor 5 3.33 2.97 3.14 3.29 3.10 3.20
Factor 6 3.03 2.88 3.16 3.02 3.22 3.13
Entire survey 3.37 3.19 3.24 3.32 3.32 3.26


The results indicated that there were only a few significant differences between the average survey scores on any of the six laboratory experiments at the p = 0.05 level. On Factor 4 (Behavioral Engagement in Lab Procedures), students had a significantly lower average score in the general chemistry II rate law laboratory experiment compared to several other laboratory experiments, indicating that students were less engaged on the behavioral dimension in that particular laboratory experiment.

For further statistical context, the statistical power and effect size of the various ANOVA tests are presented in Table 6; the low statistical power observed for several of the factors may have contributed to the lack of significance found.

Table 6 Statistical power and effect size of the ANOVA test determining differences in average survey scores on the different laboratory experiments
Survey items Laboratory experiments
Statistical power (%) Effect size, η2
Factor 1 30.9 0.02
Factor 2 28.1 0.02
Factor 3 77.7 0.05
Factor 4 98.9 0.10
Factor 5 40.0 0.02
Factor 6 29.5 0.02
Entire survey 26.2 0.01
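The effect size reported above, η², is the between-groups sum of squares divided by the total sum of squares; a minimal Python sketch (the function name is ours):

```python
import numpy as np

def eta_squared(groups) -> float:
    """Effect size eta^2 = SS_between / SS_total for a one-way design."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    return ss_between / ss_total
```

η² ranges from 0 (group membership explains none of the variance in scores) to 1 (group membership explains all of it).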


Implications for teaching

Overall, Factor 4 (focused on behavioral engagement in laboratory procedures) had the highest average score. This factor was the one most associated with the hands-on aspect of “doing” the laboratory experiments, and this result reinforces the commonly held idea that students really value the hands-on aspect of laboratory experiments. This result also supports the validity of the survey in that laboratory experiments might be expected to have the highest level of engagement associated with the aspect of “doing” the laboratory experiment, a point noted by Hofstein and Lunetta in their statement “…studies had shown that often the students and the teacher are preoccupied with technical and manipulative details that consume most of their time and energy.” (2004, p. 31).

Laboratory experiments in real world contexts or inquiry contexts are suggested to be engaging and motivating for students (Barrie et al., 2015), with real world contexts pertaining particularly to the affective (emotional) domain (Bretz et al., 2013; Galloway and Bretz, 2015b). The survey instrument reported in this study can be used to examine the differences in cognitive, emotional, and behavioral engagement between laboratory experiments embedded in real world contexts and inquiry-based contexts, and laboratory experiments not embedded in such contexts.

Limitations

Each school or institution and its student population exists in a unique context. When considering the use of this survey instrument with another student population, care must be taken to evaluate the appropriateness of using the survey; confirmatory factor analysis studies are suggested for this purpose. In addition, since students were prompted to think about one of their two most recent laboratory experiments, it is possible they may have been biased and may have chosen the (in their view) more favorable laboratory experiment, so the results should be considered in that context. Furthermore, it is possible that students may have selected survey responses which overstated their levels of engagement in an effort to look good, thereby engaging in socially desirable responding (Steenkamp et al., 2010); the results should also be considered in this context.

Conclusions

The survey reported here focused on student engagement in general chemistry laboratories along the dimensions of cognitive, behavioral, and emotional engagement. We provided good measures and evidence for the reliability and validity of the survey. Overall, the survey results followed the dimensions of engagement established in the literature, within the context of the general chemistry laboratory. The survey results showed only a few significant differences among the laboratory experiments that were part of the survey.

Conflicts of interest

There are no conflicts of interest to declare.

Appendix

Student engagement in the general chemistry laboratory survey instrument

1. I participated fully when using the lab equipment and instruments during the lab procedures.

2. I put a lot of effort into following the lab procedures properly.

3. I participated fully in using and/or preparing the chemicals during the lab procedures.

4. I participated fully when using the glassware during the lab procedures.

5. I felt interested when using the lab equipment and instruments during the lab procedures.

6. I found it interesting to use and/or prepare the chemicals during the lab procedures.

7. I found it interesting to use the glassware during the lab procedures.

8. I felt interested when following the laboratory procedures.

9. I felt nervous about using the glassware during the lab procedures.

10. I felt unsure when following the lab procedures.

11. I felt insecure about using the lab equipment and instruments during the lab procedures.

12. I felt nervous when using and/or preparing the chemicals during the lab procedures.

13. I put a lot of effort into understanding the design of the lab procedures in terms of how different parts of the lab procedures were related to one another.

14. I tried hard to understand why specific lab equipment and instruments were used during the lab procedures.

15. I put a lot of effort into understanding why specific chemicals were used and/or prepared during the lab procedures.

16. I tried hard to understand why specific glassware was used during the lab procedures.

17. I felt insecure about the data/observations I was collecting during the lab.

18. I felt unsure about accurately recording the data from the instruments, equipment, glassware, and my observations.

19. I felt worried about collecting all the data/observations before the end of the lab.

20. I put a lot of effort into understanding and evaluating the data/observations as I was collecting them during the lab.

21. I put a lot of effort into thinking about techniques to help me accurately record the data from the instruments, equipment, glassware, and my observations.

22. I put a lot of effort into planning how I would collect all the data/observations before the end of the lab.

23. I tried hard to connect the lab experiment to the theory from lecture.

24. I put a lot of effort into thinking about real world applications and connections to the lab experiment.

25. I tried hard to understand the lab concepts rather than just memorizing them.

Acknowledgements

We would like to acknowledge the students for taking part in the study. We would also like to acknowledge Mrs Sylvia Diaz and several other instructors for assisting with the distribution of the surveys. Further, we would like to thank Mrs Sylvia Diaz, Dr Tina Thomas, and Dr Abudi Atesin for providing expert review on the survey.

References

  1. Alkan F., (2016), Development of chemistry laboratory self-efficacy beliefs scale, J. Balt. Sci. Educ., 15(3), 350–359.
  2. Ausubel D. P., (1963), The psychology of meaningful verbal learning: An introduction to school learning, New York, NY: Grune and Stratton, Inc.
  3. Ausubel D. P., (1968), Educational psychology: A cognitive view, New York, NY: Holt, Rinehart and Winston, Inc.
  4. Barbera J., Adams W. K., Wieman C. E. and Perkins K. K., (2008), Modifying and validating the Colorado learning attitudes about science survey for use in chemistry, J. Chem. Educ., 85(10), 1435–1439.
  5. Barrie S. C., Bucat R. B., Buntine M. A., da Silva K. B., Crisp G. T., George A. V., Jamie I. M., Kable S. H., Lim K. F., Pyke S. M., Read J. R., Sharma M. D. and Yeung A., (2015), Development, evaluation and use of a student experience survey in undergraduate science laboratories: The advancing science by enhancing learning in the laboratory student laboratory learning experience survey, Int. J. Sci. Educ., 37(11), 1795–1814.
  6. Bauer C. F., (2005), Beyond “student attitudes”: Chemistry self-concept inventory for assessment of the affective component of student learning, J. Chem. Educ., 82(12), 1864–1870.
  7. Bauer C. F., (2008), Attitude towards chemistry: A semantic differential instrument for assessing curriculum impacts, J. Chem. Educ., 85(10), 1440–1445.
  8. Bowen C. W., (1999), Development and score validation of a chemistry laboratory anxiety instrument (CLAI) for college chemistry students, Educ. Psychol. Meas., 59(1), 171–185.
  9. Brandriet A. R., Xu X., Bretz S. L. and Lewis J. E., (2011), Diagnosing changes in attitude in first-year college chemistry students with a shortened version of Bauer's semantic differential, Chem. Educ. Res. Pract., 12(2), 271–278.
  10. Bretz S. L., (2001), Novak's theory of education: Human constructivism and meaningful learning, J. Chem. Educ., 78(8), 1107.
  11. Bretz S. L., (2019), Evidence for the importance of laboratory courses, J. Chem. Educ., 96(2), 193–195.
  12. Bretz S. L., Fay M., Bruck L. B. and Towns M. H., (2013), What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory, J. Chem. Educ., 90(3), 281–288.
  13. Bruck A. D. and Towns M., (2013), Development, implementation, and analysis of a national survey of faculty goals for undergraduate chemistry laboratory, J. Chem. Educ., 90(6), 685–693.
  14. Bruck L. B., Bretz S. L. and Towns M. H., (2008), Characterizing the level of inquiry in the undergraduate laboratory, J. Coll. Sci. Teach., 38(1), 52–58.
  15. Bruck L. B., Towns M. and Bretz S. L., (2010), Faculty perspectives of undergraduate chemistry laboratory: Goals and obstacles to success, J. Chem. Educ., 87(12), 1416–1424.
  16. Costello A. B. and Osborne J. W., (2005), Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis, Pract. Assess., Res. Eval., 10(7), 1–9.
  17. DeKorver B. K. and Towns M. H., (2015), General chemistry students’ goals for chemistry laboratory coursework, J. Chem. Educ., 92(12), 2031–2037.
  18. Edionwe E., Villarreal J. R. and Smith K. C. (2011), How much cranberry juice is in cranberry-apple juice? A general chemistry spectrophotometric experiment, J. Chem. Educ., 88(10), 1410–1412.
  19. Flaherty A., O’Dwyer A., Mannix-McNamara P. and Leahy J. J., (2017), Aligning perceptions of laboratory demonstrators’ responsibilities to inform the design of a laboratory teacher development program, J. Chem. Educ., 94(8), 1007–1018.
  20. Fredricks J. A., Blumenfeld P. C. and Paris A. H., (2004), School engagement: Potential of the concept, state of the evidence, Rev. Educ. Res., 74(1), 59–109.
  21. Galloway K. R. and Bretz S. L., (2015a), Development of an assessment tool to measure students’ meaningful learning in the undergraduate chemistry laboratory, J. Chem. Educ., 92(7), 1149–1158.
  22. Galloway K. R. and Bretz S. L., (2015b), Using cluster analysis to characterize meaningful learning in a first-year university chemistry laboratory course, Chem. Educ. Res. Pract., 16(4), 879–892.
  23. Galloway K. R. and Bretz S. L., (2015c), Measuring meaningful learning in the undergraduate chemistry laboratory: A national, cross-sectional study, J. Chem. Educ., 92(12), 2006–2018.
  24. Galloway K. R. and Bretz S. L., (2015d), Measuring meaningful learning in the undergraduate general chemistry and organic chemistry laboratories: A longitudinal study, J. Chem. Educ., 92(12), 2019–2030.
  25. Galloway K. R. and Bretz S. L. (2016), Video episodes and action cameras in the undergraduate chemistry laboratory: eliciting student perceptions of meaningful learning, Chem. Educ. Res. Pract., 17(1), 139–155.
  26. Galloway K. R., Malakpa Z. and Bretz S. L., (2016), Investigating affective experiences in the undergraduate chemistry laboratory: Students’ perceptions of control and responsibility, J. Chem. Educ., 93(2), 227–238.
  27. Grove N. and Bretz S. L., (2007), CHEMX: An instrument to assess students’ cognitive expectations for learning chemistry, J. Chem. Educ., 84(9), 1524–1529.
  28. Hofstein A. and Lunetta V. N., (1982), The role of the laboratory in science teaching: Neglected aspects of research, Rev. Educ. Res., 52(2), 201–217.
  29. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: Foundations for the twenty-first century, Sci. Educ., 88(1), 28–54.
  30. Hofstein A. and Mamlok-Naaman R., (2007), The laboratory in science education: The state of the art, Chem. Educ. Res. Pract., 8(2), 105–107.
  31. IBM SPSS Statistics 23, (2015), Armonk, NY: IBM.
  32. Irby S. M., Borda E. J. and Haupt J., (2018), Effects of implementing a hybrid wet lab and online module lab curriculum into a general chemistry course: Impacts on student performance and engagement with the chemistry triplet, J. Chem. Educ., 95(2), 224–232.
  33. Johnstone A. H. and Al-Shuaili A., (2001), Learning in the laboratory; some thoughts from the literature, Univ. Chem. Educ., 5(2), 42–51.
  34. Laredo T., (2013), Changing the first-year chemistry laboratory manual to implement a problem-based approach that improves student engagement, J. Chem. Educ., 90(9), 1151–1154.
  35. Microsoft Excel 2016, (2016), Redmond, WA: Microsoft.
  36. Nadler J. T., Weston R. and Voyles E. C., (2015), Stuck in the middle: The use and interpretation of mid-points in items on questionnaires, J. Gen. Psychol., 142(2), 71–89.
  37. Novak J. D., (1977), A theory of education, Ithaca, NY: Cornell University Press.
  38. Nunnally J. C., (1978), Psychometric theory, New York, NY: McGraw-Hill, Inc.
  39. Nyutu E. N., Cobern W. W. and Pleasants B. A. S., (2019), Student engagement in direct instruction, undergraduate microbiology laboratories, J. Biol. Educ., 53(3), 250–264.
  40. Qualtrics, (2017), Provo, UT: Qualtrics.
  41. Raîche G., Walls T. A., Magis D., Riopel M. and Blais J.-G., (2013), Non-graphical solutions for Cattell's scree test, Methodology, 9(1), 23–29.
  42. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185.
  43. Sadler T. D., Puig A. and Trutschel B. K., (2011), Laboratory instructional practices inventory: A tool for assessing the transformation of undergraduate laboratory instruction, J. Coll. Sci. Teach., 41(1), 25–31.
  44. Smith K. C. and Cedillo D., (2014), Determining the mass and time of release of acetaminophen from gel capsules, J. Chem. Educ., 91(3), 437–439.
  45. Smith K. C. and Garza A., (2015), Using conductivity measurements to determine the identities and concentrations of unknown acids: An inquiry laboratory experiment, J. Chem. Educ., 92(8), 1373–1377.
  46. Smith K. C. and Sepulveda A., (2018), Students’ perceptions of common practices, including some academically dishonest practices, in the undergraduate general chemistry classroom laboratory, Chem. Educ. Res. Pract., 19(4), 1142–1150.
  47. Steenkamp J. E. M., De Jong M. G. and Baumgartner H., (2010), Socially desirable response tendencies in survey research, J. Mark. Res., 47(2), 199–214.
  48. Taber K. S., (2018), The use of Cronbach's Alpha when developing and reporting research instruments in science education, Res. Sci. Educ., 48(6), 1273–1296.
  49. Taherdoost H., Sahibuddin S. and Jalaliyoon N., (2014), Exploratory factor analysis: Concepts and theory, in 2nd International Conference on Mathematical, Computational and Statistical Sciences, Gdansk: WSEAS, pp. 375–382. Available at: http://www.wseas.us/e-library/conferences/2014/Gdansk/MATH/MATH-49.pdf, accessed 19 Feb 2018.
  50. Thompson B. and Daniel L. G., (1996), Factor analytic evidence for the construct validity of scores: A historical overview and some guidelines, Educ. Psychol. Meas., 56(2), 197–208.
  51. Weijters B., Cabooter E. and Schillewaert N., (2010), The effect of rating scale format on response styles: The number of response categories and response category labels, Int. J. Res. Mark., 27(3), 236–247.
  52. White R. T., (1996), The link between the laboratory and learning, Int. J. Sci. Educ., 18(7), 761–774.
  53. Xu X. and Lewis J. E., (2011), Refinement of a chemistry attitude measure for college students, J. Chem. Educ., 88(5), 561–568.

This journal is © The Royal Society of Chemistry 2020