Design and implementation of instructional videos for upper-division undergraduate laboratory courses

Jennifer A. Schmidt-McCormack, Marc N. Muniz, Ellie C. Keuter, Scott K. Shaw and Renée S. Cole*
University of Iowa, Iowa City, IA 52242, USA. E-mail: renee-cole@uiowa.edu

Received 24th April 2017, Accepted 31st May 2017

First published on 31st May 2017


Abstract

Well-designed laboratories can help students master content and science practices through the successful completion of laboratory experiments. Upper-division chemistry laboratory courses often present special challenges for instruction due to the instrument-intensive nature of the experiments. To address these challenges, particularly those associated with rotation-style course structures, pre-laboratory videos were generated for two upper-division laboratory courses, Analytical Measurements and Physical Measurements. A set of videos was developed for each experiment: a pre-laboratory lecture video, an experimental video, and a data analysis video. We describe the theoretical principles that guided the design of the instructional videos as well as the production process. To assess the impact of the videos on students' successful completion of the experiments, a mixed-methods approach to data collection was used, which included video-recorded laboratory observations, student one-on-one interviews, and the Meaningful Learning in the Laboratory Inventory (MLLI) survey. Our findings indicate that video-based resources can help alleviate some challenges associated with rotation-style labs, particularly the temporal disconnect between pre-laboratory lectures and experiment completion as well as the need for more student autonomy in upper-division laboratory courses.


Introduction

The undergraduate teaching laboratory is widely characterized as an essential part of a student's education in the chemical sciences (Hofstein and Lunetta, 2004; Evans et al., 2006; Reid and Shah, 2007; Elliott et al., 2008; American Chemical Society Committee on Professional Training, 2015). Johnstone and Al-Shuaili (2001) have described the teaching lab as the only opportunity for students to gain physical skills in the laboratory. These include observational skills, the ability to obtain and interpret experimental data, and the capacity to plan experiments (Herrington and Nakhleh, 2003). As students approach graduation, they are expected to have developed the requisite skills to become professionals in the field. Upper-division undergraduate laboratory courses, which are typically taken by third- and fourth-year biochemistry, chemistry, and chemical engineering majors, offer an opportunity for students to become more independent in terms of both their laboratory practice and the application of the content they are learning. In these courses, students are often given more independent access to instruments to solve chemical problems. Despite the role upper-division laboratory classes play in the undergraduate chemistry curriculum, they remain largely understudied, with only a limited number of studies addressing this learning environment (Burewicz and Miranowicz, 2006; Tsaparlis and Finlayson, 2014; Jolley et al., 2016; Mack and Towns, 2016).

A common challenge in upper-division undergraduate chemistry laboratory courses is that the instrument-intensive nature of the experiments often leads to the use of a rotational scheduling structure. In such a structure, each student completes all of the experiments over the course of the term, but at different times and in different sequences than their classmates. One of the pedagogical challenges of this rotational approach is that the pre-lab lecture component is often temporally far removed from when students perform the experiments; this time lag can range from days to months, depending on the structure of the course. Rotational-style laboratory courses can also be challenging for teaching assistants (TAs) to facilitate, as well as for the laboratory support staff who maintain the instruments. A TA in this type of class must assist and facilitate multiple student groups completing different experiments during the same lab period. With these challenges in mind, a way to directly engage students with the material pertinent to their individual schedule immediately before they enter the laboratory is needed (Jennings et al., 2007; Seery and McDonnell, 2013; Jordan et al., 2016).

We propose that offering students readily accessible resources to assist them in the laboratory will afford them more opportunities to self-direct their learning on an as-needed basis. This will better prepare students to succeed in completing laboratory experiments, and to do so with decreased reliance on TA/staff assistance. Providing instructional resources in the form of short videos and pre-lab quizzes gives each student autonomy over their learning because they can review the videos and quizzes more than once and at their own pace (Seery and McDonnell, 2013). Findings from previous research show that greater student autonomy can also increase students' positive perceptions of the laboratory (Shibley and Zimmaro, 2002; Cooper and Kerns, 2006; Jennings et al., 2007; Chini and Al-Rawi, 2013; Galloway and Bretz, 2016; Galloway et al., 2016). This study describes the design and implementation of instructional videos in upper-division chemistry labs to improve the student laboratory experience.

Background

Pre-laboratory videos

Pre-laboratory preparation is essential to student success in the laboratory, as it can help prepare students to complete the experiments and deepen their understanding of the content. Rollnick et al. (2001) described how first-year university students who were prepared for lab were able to begin the laboratory experiment almost immediately and work in an organized manner. However, it has been reported that many students do little to no laboratory preparation, especially when the pre-lab activities are not mandatory (Pogacnik and Cigic, 2006). After the introduction of mandatory pre-lab sessions, Pogacnik and Cigic (2006) found that students spent more time preparing for the laboratory and less time completing post-lab activities; even so, only 8% of students felt that the mandatory pre-lab increased the amount of time they spent preparing for lab. These pre-laboratory sessions, however, involved face-to-face meetings and quizzes, which required additional time from the teaching assistants to facilitate the sessions and to evaluate student work.

One approach to encourage student pre-laboratory preparedness has been the development of videos designed to address portions of lab procedures where students commonly struggle. Burewicz and Miranowicz (2006) conducted a study with fourth-year college students to determine whether traditional, video, or computer-assisted methods were most effective in delivering pre-laboratory activities. Findings indicated that the video and interactive computer-assisted activities increased the amount of time students spent completing the pre-lab activities and decreased the number of errors and total time spent on the experiment. This increased efficiency in the laboratory has also been seen in other studies (Nadelson et al., 2015; Box et al., 2017). Pre-laboratory instructional videos can also decrease the number of questions students ask about procedural issues, while increasing the number of questions related to calculations or theory (Pogacnik and Cigic, 2006; Winberg and Berg, 2007; Towns et al., 2015; Box et al., 2017). This change has been explained as a result of having the videos remove some of the cognitive load of performing the experiments and shifting the focus to thinking about the calculations and chemistry behind the experiments (Teo et al., 2014; Box et al., 2017). Multimedia pre-laboratory activities have been shown to increase students’ perceptions of being prepared for the laboratory (Jones and Edwards, 2010; Towns et al., 2015; Jolley et al., 2016), which can also influence students’ overall experience of the laboratory (Galloway and Bretz, 2016; Galloway et al., 2016). However, all these studies focused on laboratory environments where all the students were completing the same experiment during the laboratory session.

A number of different approaches have been used to create pre-laboratory videos. The most common approach is for instructors to create the resources (Burewicz and Miranowicz, 2006; Winberg and Berg, 2007; Powell and Mason, 2013; Teo et al., 2014; Fung, 2015; Nadelson et al., 2015; Fun Man, 2016; Jolley et al., 2016), although some instructors have had undergraduate students who completed the course create the videos (Jordan et al., 2016; Box et al., 2017). The latter approach was adopted because such students readily remember the portions of the experiment with which they struggled and can highlight those aspects. Most of the video resources used a third-person perspective (Burewicz and Miranowicz, 2006; Winberg and Berg, 2007; Powell and Mason, 2013; Nadelson et al., 2015; Jolley et al., 2016; Jordan et al., 2016; Box et al., 2017), but there have been cases where a first-person perspective was used (Fung, 2015; Fun Man, 2016). Published descriptions of the content of pre-laboratory videos have included calculations (Winberg and Berg, 2007; Powell and Mason, 2013; Jolley et al., 2016; Box et al., 2017), safety (Nadelson et al., 2015), physical demonstrations of laboratory techniques (Burewicz and Miranowicz, 2006; Powell and Mason, 2013; Teo et al., 2014; Fung, 2015; Nadelson et al., 2015; Fun Man, 2016; Jolley et al., 2016; Box et al., 2017), demonstrations of how to set up software (Burewicz and Miranowicz, 2006; Powell and Mason, 2013; Fung, 2015; Fun Man, 2016; Jolley et al., 2016; Jordan et al., 2016; Box et al., 2017), and the use of time lapses to illustrate experimental progress in a short period of time (Jordan et al., 2016). However, published accounts often lack a description of how the video production was carried out.

Theoretical framework

The theoretical framework of Self-Regulated Learning (SRL) aligns with our project goal of encouraging students to be more autonomous. Zimmerman (2002) described SRL as a learning cycle made up of three phases: the Forethought Phase, the Performance Phase, and the Self-Reflection Phase. The Forethought Phase includes planning and thinking about how to complete a task; students engage in forethought by reading the laboratory procedure and completing pre-laboratory exercises. The Performance Phase includes carrying out the task and involves focusing one's attention; students engage in it by carrying out the laboratory procedures and analysis. The Self-Reflection Phase involves reflecting on one's performance and evaluating how to improve it in the future; this phase would occur after students leave the laboratory and consider how they could improve their performance on the next experiment. Studies have shown that the self-regulatory processes of learners can display significant variation in both quality and quantity (with more successful students demonstrating SRL processes that are more aligned with the expectations of SRL theory), and that engagement in the different SRL processes can be supported by instruction (Zimmerman, 2002; Molenaar and Järvelä, 2014). We anticipated that providing students with laboratory videos and quizzes before the class period would scaffold their focus during the Forethought Phase, increasing their ability to monitor the progress of their laboratory experiments more autonomously and with less instructional support, thus improving their performance.

Research questions

The principal research question guiding our work was as follows:

• To what extent can laboratory videos and accompanying resources facilitate students’ successful completion of experiments in upper-level, rotation-style laboratory environments?

Video development

Three types of videos were created to support students in preparing for the laboratory and in collecting and analyzing data. The first type was essentially a pre-lab lecture providing background information, the second focused on the experimental aspects of the laboratory, and the third focused on data analysis. The design of the videos was guided by the Cognitive Theory of Multimedia Learning (Mayer et al., 1999; Mayer and Chandler, 2001; Mayer et al., 2001). A central idea of the theory is to minimize cognitive overload from processing the auditory and visual channels simultaneously. An example of this principle is to avoid on-screen text that duplicates what the instructor is voicing in the video (Robinson, 2004). When on-screen text is included, it should not differ from what the instructor is voicing, as this would overload the viewer's auditory channel by causing confusion about what to attend to. This is in line with the so-called “limited capacity assumption,” in which learners are assumed to possess a working memory capable of processing only a fixed amount of information, irrespective of the channel through which it is initially received. On-screen callouts were used judiciously to divide the videos into sections to aid navigation or to direct student attention to features that might otherwise be overlooked.

To meet the needs of hearing-impaired students, closed-caption text is available for all YouTube videos, although instructors will likely need to edit the automatically generated captions to ensure specialized terms are transcribed correctly. For lecture capture videos, closed captions should also be added to support students for whom the auditory channel presents a barrier; most programs make this easy to do. This approach allows closed captions to be turned on by those who need them without being ever-present for those who do not.

To gain insight as to what features would be best to include in the videos, focus group interviews were conducted with students who were enrolled in the Analytical Measurements laboratory course during the Spring 2014 semester. The students’ recommendations for the videos included showing how to use the software programs and succinctly providing background information about the experiments.

The videos were created by members of the research team, which included course instructors (RSC and SKS), a post-doctoral fellow (MNM), a graduate student who had also been a teaching assistant for the Analytical Measurements laboratory course (JASM), and an undergraduate student who had taken both laboratory courses (ECK). Specific contributions are noted in the description for each type of video below.

Pre-lab lecture videos

Pre-lab lecture videos addressed the fundamental concepts underlying each laboratory experiment. The videos for Physical Measurements were created by the course instructor (RSC) using a lecture capture system (Panopto). PowerPoint slides with instructor annotations provided the visual component, while the audio consisted of explanations of the chemical phenomena. The videos were designed to mirror a typical pre-laboratory lecture: the slides provided a foundation that the instructor annotated using the laptop's “inking” function as background information relevant to the experiment was presented. A screenshot from the Carbon Monoxide Fourier Transform Infrared Spectroscopy lecture video from Physical Measurements is provided in Fig. 1.
Fig. 1 Screenshot from a mini-lecture video covering the concepts from Carbon Monoxide Fourier Transform Infrared Spectroscopy Experiment in Physical Measurements.

These videos were intended to be viewed prior to beginning the experiment, but could also be used by students as a resource while writing their laboratory reports. The intent was to strengthen the connections between the chemical concepts and the experiment being performed. Although the intention was to create pre-lab lecture videos for both Physical and Analytical Measurements, the lecture videos for Analytical Measurements had not yet been created at the time of these case studies.

Experiment videos

Filming of the experimental videos was conducted by a professional video recording service. Scripts were generated by JASM, MNM, and ECK in consultation with the course instructors to guide the creation of the experimental videos. This process was recommended by the video production company to provide structure and focus for the creation of the videos. Members of the project team (JASM, MNM, and ECK) practiced the laboratory experiments and worked up the data beforehand to inform the script writing. Each script consisted of two columns, one describing the focus of the camera shot and the other the spoken words or other audio in the production. This structure aligns with the dual-channel aspect of the Cognitive Theory of Multimedia Learning (Mayer et al., 1999; Mayer and Chandler, 2001; Mayer et al., 2001), as one column represents the visual channel and the other the auditory channel. An example of a script for the Physical Measurements CO FTIR experimental video is included in the ESI.

All of the videos were filmed in the instructional laboratory using the same instruments used by students during the course. Relevant portions of the experiments were demonstrated by JASM, MNM, or ECK while technicians from a professional video service recorded the action. Videos ranged from five to twenty minutes in duration and were designed to complement reading the lab manual. These videos focused explicitly on showing the steps necessary to successfully use the instruments and related software to collect data during the laboratory periods. A screenshot from an Analytical Measurements video is shown in Fig. 2. Camera shots involving the solution preparation and glassware were included for demonstration purposes, but these were not the primary focus of the videos. Chromatograms or spectra were shown on screen so students could see what the software looked like with loaded data, but the quality or particular attributes of these spectra were not discussed in the videos. Editing of the videos was completed by the professional video service, with input from the research team. Bookmarks for specific content items were inserted post-production to facilitate easier navigation.


Fig. 2 Screenshot of the FTIR experimental Video from Analytical Measurements.

Data analysis videos

Data analysis videos were designed to provide guidance to students on how to analyze data and represent results in their laboratory reports. These videos were recorded using lecture capture software by a member of the research team. For Physical Measurements, example calculations on sample data sets in an Excel or Origin spreadsheet were shown while being narrated by RSC, MNM, or ECK. The focus was to remove some of the cognitive load in learning how to use software to allow students to focus on the results of the data analysis. For the Analytical Measurements data analysis videos, a SMART board system was used to generate freehand representations of examples of how the graphs/tables should look while JASM narrated. The data analysis videos also provided examples of sample calculations, including dimensional analysis. An example screenshot from the Analytical Measurements data analysis video for the Fourier Transform Infrared Spectroscopy (FTIR) experiment is provided in Fig. 3. These videos ranged in length from four to twenty minutes. In a similar fashion to the experimental videos, the data analysis videos were made with bookmarks so that students could easily navigate from section to section.
Fig. 3 Screenshot from the FTIR Data Analysis Video for Analytical Measurements. The slide is representative of the data analysis videos where representations and concepts were often handwritten to approximate the real data.

Development of quizzes

In addition to the traditional pre-laboratory exercises, in which students answered questions and completed calculations relevant to the experiment and turned in their responses prior to beginning an experiment, students were required to complete quizzes delivered through the online course management system before their participation in lab. Question formats included multiple choice, multiple select, ordering, and one-word fill-in-the-blank. The quiz questions were developed based on specific details found in the experimental videos, such as what wavelength to use to collect spectra or what cues indicate the software is ready to collect data. Questions also addressed specific safety aspects of the experiments. The quizzes were implemented to encourage students to watch the videos and to ensure they understood the content and details in the videos.

Implementation

Participants and setting

The participants for this study were third- and fourth-year undergraduate students in upper-division laboratory courses at a large, research-intensive, Midwestern university. The Analytical Measurements class enrollment consisted of students majoring in chemistry, biochemistry, and chemical engineering. Students in Physical Measurements were chemistry and chemical engineering majors. Both courses are stand-alone laboratory courses for which students are expected to take the complementary lecture course(s) prior to or concurrently with the laboratory course. The course schedules for both laboratory courses included two hours of lecture (two 1 hour class periods) and six hours of laboratory (two 3 hour blocks) per week, although the lecture portion of the course often did not meet after the first five or six weeks of the semester. This was particularly true after the implementation of the pre-laboratory videos, as they were designed to replace much of this portion of the course. The analytical course had two sections, one in the morning and one in the afternoon; the physical course had two sections that met on alternating days, both in the afternoon. The lecture sessions were conducted by the course instructor while the laboratory sessions were facilitated by the TAs. The laboratory videos and quizzes were implemented during the Fall 2014 (Physical) and Spring 2015 (Analytical) semesters. A summary of the course enrollments is presented in Table 1.
Table 1 Summary of the number of students enrolled per semester. The first column lists the course and semester, the second the total number of students enrolled in the course, the third the number of students per laboratory section, and the fourth the number of TAs per laboratory section

Course/semester | Students enrolled in course | Students per laboratory section | TAs per laboratory section
Analytical Measurements, Spring 2014 | 38 | 18–20 | 3
Analytical Measurements, Spring 2015 | 29 | 14–15 | 2
Physical Measurements, Fall 2014 | 18 | 7–11 | 1–2


Video implementation

Table 2 summarizes the videos created for Physical Measurements and for Analytical Measurements. For Analytical Measurements, a page listing links to a YouTube channel containing the experimental and data analysis videos was posted on the course site. In Physical Measurements, the course site was arranged with a folder for each experiment, with direct links to each video included in the folder. Students in both courses completed the quizzes within the course management system and were immediately shown their score, but were not given detailed feedback about which questions were answered incorrectly; this was done to discourage students from simply guessing on the one question they got wrong. Instead, students had to review the entire quiz to figure out where they might have gone wrong. All students answered the same questions for each laboratory experiment. Students in Analytical Measurements had to achieve 100% on the quiz in order to begin the laboratory experiment and had unlimited attempts to reach this goal. All quizzes were weighted equally, and the quiz scores together constituted 3% of the overall grade. Students in Physical Measurements had three attempts and did not have to obtain a minimum score, but the quizzes were considered a safety component and had to be completed in order to enter the laboratory and complete the experiment. The scores from the pre-laboratory quizzes constituted about 4% of the overall course grade.
Table 2 Summary of the videos available for each experiment in each course and of the experiments for which classroom observations were recorded. An “X” indicates that a video of that type was available or that an observation was recorded. An * next to the experiment title indicates that the entire class performed the experiment simultaneously (i.e. not done in rotation)
Physical measurements, Fall 2014
Experiment title | Pre-lab video | Experimental video | Data analysis video | Observation
Partial Molar Volume (PMV)* X X
Mass Spectrometric Measurements of Heterogeneous Catalysis (CMS) X X X
Enzyme Kinetics of Apples (EK) X X X X
Cyclodextrin Inclusion Complexes (CIC) X X X X
Cyanine Dyes and Quantum Dots (PIB) X X X
Atomic Force Microscopy (AFM) X X X
Theoretical Chemistry (Spartan)*

Analytical measurements
Experiment title | Pre-lab video | Experimental video | Data analysis video | Spring 2014 observation | Spring 2015 observation
Stats, plots & fits*
Circuits & Coulometry (CC)* X
Cyclic Voltammetry (CV) X X
UV-Vis Spectroscopy (UV-Vis) X X
Fluorescence Spectroscopy X X X X
Fourier Transform Infrared Spectroscopy (FTIR) X X X X
Atomic Emission Spectroscopy (AES) X X X X
Gas Chromatography (GC) X X X X
High Performance Liquid Chromatography (HPLC) X X X X
Capillary Electrophoresis (CE) X X X
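The pre-laboratory quiz gating rules described for the two courses can be summarized in a short sketch. The function and its labels are hypothetical illustrations of the stated policies, not code used in either course:

```python
# Illustrative sketch of the pre-lab quiz gating rules described above.
# The helper function and course labels are hypothetical.

def may_enter_lab(course, score, attempts_used):
    """Return True if a student may begin the laboratory experiment."""
    if course == "Analytical":
        # 100% required on the quiz; unlimited attempts allowed.
        return score == 100
    if course == "Physical":
        # No minimum score, but the quiz (a safety component) must be
        # completed, within the three allowed attempts.
        return 1 <= attempts_used <= 3
    raise ValueError(f"unknown course: {course}")

print(may_enter_lab("Analytical", 100, 5))  # True: reached 100% on a later attempt
print(may_enter_lab("Physical", 70, 1))     # True: quiz completed, score not gated
```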


Methods

Data collection

For this study, we employed a multi-pronged approach to data collection including video and audio-recorded laboratory observations, video and audio-recorded semi-structured interviews, and a pre-post survey to gauge students’ perceptions of their own learning in the laboratory. The study protocol was approved by the university's Institutional Review Board (IRB), and informed consent was obtained from all participants prior to data collection. In addition, verbal consent was obtained from all student group members immediately prior to recording to ensure they were comfortable with being filmed.

Observations were captured by recording student groups while they were completing the laboratory experiments. A summary of the recorded experiments is provided in Table 2. Student groups ranged in size from one to three students per observation.

Interviews were conducted with students enrolled in Analytical Measurements in 2014. These interviews were fairly open-ended, and their primary purpose was to gather student input about the most important features of the laboratory experiments to include in the videos. Interviews were also conducted with three students during the Fall 2014 semester of Physical Measurements and eighteen students from the Spring 2015 semester of Analytical Measurements. Students were asked how they prepared for lab, and follow-up questions were used to further probe their use of the videos and their overall laboratory experience. The interviews were conducted by MNM (Physical Measurements) using a Livescribe pen and by JASM (Analytical Measurements) using a video camera. The interview protocol in these cases focused on the general course structure, what resources students utilized to inform their actions in lab, and how the resources influenced their attitudes toward the laboratories. The interviews lasted from 15 to 45 minutes and took place at or near the end of the semester, after students had completed a majority of the laboratory experiments. Recordings were transcribed, and both the original recordings and transcripts were stored on a secure server.

The Meaningful Learning in the Laboratory Instrument (MLLI) (Galloway and Bretz, 2015a, 2015b) was administered to students enrolled in the Physical Measurements course in the Fall of 2014. The MLLI is a survey that students completed online in a pre-post fashion: the first administration occurred before students started completing experiments in the laboratory, and the second during the last week of instruction. The goal of the MLLI was to investigate how students' perceptions of the lab changed from the beginning to the end of the course. The instrument categorizes students' responses along cognitive, affective, and cognitive-affective dimensions.
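A pre-post design of this kind is typically summarized by the shift in each item's mean response between the two administrations. The sketch below illustrates that arithmetic with hypothetical responses on a 0–100 agreement scale; the item names and values are invented, not MLLI items or data from this study:

```python
# Hypothetical pre/post item means; not actual MLLI items or results.

def mean(xs):
    return sum(xs) / len(xs)

# Matched responses on a 0-100 agreement scale, one list per item.
pre  = {"item_1": [60, 70, 50], "item_2": [80, 75, 85]}
post = {"item_1": [70, 75, 65], "item_2": [70, 60, 80]}

# Positive shift: stronger agreement at the end of the course.
shifts = {item: mean(post[item]) - mean(pre[item]) for item in pre}
print(shifts)  # -> {'item_1': 10.0, 'item_2': -10.0}
```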

Data analysis

Observational data were analyzed by coding the video recordings using qualitative data analysis software to track times and codes. Each video was analyzed by first dividing the observation period into episodes characterizing the primary focus of student activity. Each episode was further coded using a scheme that captures several dimensions of students' engagement in the laboratory experiments: practices (P), cognitive processing (CP), and engaging with the TAs and laboratory support staff (TA/LSS). This analysis is based on a coding scheme developed by Xu and Talanquer (2013). The codes we adopted from the original scheme are directly related to students' autonomy in the laboratory (e.g. seeking TA or LSS help) and self-regulated learning (e.g. cognitive processing: interpretative). Open coding was used to develop the remainder of the codes to characterize the broad occurrences in the laboratory as well as the specific practices students were engaged in as the experiment was underway. Collectively, these codes serve as a means to report, in a general manner, on the teaching laboratory environments. A more detailed analysis centered on student discourse in such environments is forthcoming in a separate manuscript. The codes most pertinent to our present discussion are presented in Table 3.
Table 3 Summary of codes; the Code Name is listed in column 1 and the Code Definition is listed in column 2
Code name Definition
Troubleshooting (E) Students, and/or TA/LSS, are addressing something problematic with the laboratory apparatus.
Reference to a video resource (P) Students are referring to one of the laboratory instructional videos.
Interpretative (CP) Students are interpreting some data while in the lab.
Interacting with TA/LSS Students are interacting with TA or LSS in any manner.
Seeking TA/LSS help Students are actively seeking the TA or LSS for assistance.


The researchers used one entire video observation session from a single experiment from the Physical Measurements laboratory (CO FTIR) and one entire observation from a single experiment from the Analytical Measurements laboratory (FTIR) to establish consensus on the application of the coding scheme. One researcher coded all of the Analytical Measurements observations while another coded all of the Physical Measurements observations. 25% of the recorded video clips for each course were then coded by the other researcher and subsequently discussed by both researchers. Inter-rater agreement scores were calculated based on the degree to which the same code was present (or absent) in each episode coded by both researchers. The overall inter-rater agreement for the co-coded observations was 89%. All coding discrepancies were resolved among the coders. This agreement aligns with the 90% agreement reported by Xu and Talanquer (2013) for observations of laboratory interactions.

Data analysis for the interviews consisted of first transcribing the interviews verbatim. Interview transcriptions captured the entire conversation between the interviewer and interviewee, omitting filler words such as “um” and “like” only when they masked the meaning of what the participant was trying to say. The interviews were then coded using an open coding method. Any aspects of the transcripts related to the videos, course structure, instructor facilitation, or the laboratory in general were highlighted. Similar quotes were grouped together to facilitate the compilation of the overall themes that emerged from the transcripts. Themes were compared across the interviews from both Analytical and Physical Measurements.
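The episode-level percent agreement calculation described above can be illustrated with a minimal sketch. The episode data and abbreviated code names here are hypothetical, and this is not the analysis software used in the study:

```python
# Illustrative sketch of episode-level inter-rater percent agreement
# (hypothetical codings; the study reports 89% overall agreement).

CODES = ["Troubleshooting", "Reference to video", "Interpretative",
         "Interacting with TA/LSS", "Seeking TA/LSS help"]

def percent_agreement(coder_a, coder_b, codes=CODES):
    """Fraction of (episode, code) decisions where both coders agree
    that a code is present, or that it is absent, in an episode."""
    assert len(coder_a) == len(coder_b)
    decisions = agreements = 0
    for ep_a, ep_b in zip(coder_a, coder_b):
        for code in codes:
            decisions += 1
            if (code in ep_a) == (code in ep_b):
                agreements += 1
    return agreements / decisions

# Each episode is the set of codes a coder applied to it.
coder_a = [{"Troubleshooting", "Seeking TA/LSS help"}, {"Interpretative"}]
coder_b = [{"Troubleshooting"}, {"Interpretative"}]
print(percent_agreement(coder_a, coder_b))  # -> 0.9 (9 of 10 decisions agree)
```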

Results & discussion

The goal of developing and providing videos to students is to help them become more autonomous in the lab. Common themes emerged across the observations, interviews, and MLLI survey results that reflect the ways in which the students use the videos and accompanying online resources to support their performance in the laboratory. Sample quotes from student interviews and laboratory observations are included to illustrate each of the themes. All student names are pseudonyms, and the context and semester are included at the end of each quote.

Themes from laboratory observations and interviews

Similar themes were noted for both the Analytical Measurements and Physical Measurements laboratory courses with the exception of the theme related to accessing resources.
Accessing resources. Videos are only useful if they are used by students. Students in both courses had access to all of the videos through the online course management system. However, some students in Analytical Measurements reported being unaware of the data analysis videos. This was likely due to the manner in which the instructor had provided the links to the videos – as a separate document posted on the course management system rather than embedded with the instructions for each experiment. Access to the videos was changed in subsequent semesters, and students were more likely to make full use of the available resources.

In contrast, there was no mention of a lack of knowledge of any type of video during the Physical Measurements interviews. Students in the Analytical Measurements course stated that they did not know the data analysis videos existed (illustrated in the quote below), highlighting the need for better organization of student access to the videos.

Jeremy: I guess the only thing I would say is like, I never looked at the uh, data analysis video I didn’t realize they were there until a couple of weeks ago.

Interviewer: So did you actually use the data analysis videos once you figured out they were there?

Jeremy: Yeah, I think I used them for one or two labs just to try to get on the right track. (Analytical Measurements Interview, 2015)

This quote from Jeremy also illustrates how he changed his practice once he became aware of the videos.

Video use. The experimental videos were used extensively by students in both laboratory courses. Evidence for this is seen in the analytics from the course management system, which indicated that all students accessed the experiment videos, as well as in the interviews. Many students reported taking notes on the experimental videos as part of their preparation for the laboratory as illustrated in the following quotes from interviews in both Analytical Measurements and Physical Measurements. It was clear in both the interviews and in laboratory observations that students were using the experiment videos as intended.

Leonard: …I would go on and watch the experiment video, and while I was watching that…I would write out…bullet points of what we did so we didn’t get lost and knew we were following exactly the right techniques… (Physical Measurements Interview, 2014)

Alexandra: And while I am watching it (the video) I take notes and I think it really makes me pay attention to what I am doing and I come to class definitely knowing what I am doing. (Analytical Measurements Interview, 2015)

Students watched the lecture videos (when they were available for the course), but these were not as vital to them as the experiment videos. Even though students were not required to watch the lecture videos to successfully complete the pre-laboratory quiz, watching them was an expected part of the Physical Measurements course. Students who watched them noted the inherent value of these videos in assisting them in the laboratory. Some students preferred to watch these videos beforehand, often up to a few days in advance, so they would be able to seek additional help if they needed it.

Interviewer: How did you prepare for lab?

Howard: …If there was a theory video posted, I would watch it. …but in the case that I really didn’t understand something, I usually tried to look at it a couple of days before I was actually in lab so that [on the day the experiment was actually performed] I would…be able to go to office hours. (Physical Measurements Interview, 2014)

The data also indicated a missed opportunity for the Analytical Measurements course, as this course did not include pre-lab lecture videos as part of the course structure. Some of the students in Analytical Measurements expressed difficulty connecting information from the pre-laboratory lectures, provided at the beginning of the semester, to the experiments when they wrote their laboratory reports. Students stated it was difficult to remember the material that was covered in lecture due to the time lapse between when they attended lecture and when they completed the experiments, which could be up to four months. They also said it was challenging to understand how the concepts in the lecture connected to the concepts in the experiments. The following quote illustrates this theme.

Ethan: CV or CE I think those things are not covered as much as undergraduates. So we had lectures at the start of the semester but (that) was four months ago now and we spend a week on each one (laboratory) or something. Not even a week on those topics. So then I'm trying to recall and go back to those lecture slides. Make sense of it. (Analytical Measurements Interview, 2015)

Preparation. All of the students interviewed from both courses mentioned that they watched the videos and used the resources to help them become better prepared. This is indicative of some level of forethought from the students, which helped them complete the experiments. The primary impact was an increased familiarity with the instruments, as reflected in the student interview data illustrated in the quote below.

Gabe: Um, largely what I found the videos the most useful for was definitely for the instrumentation. Like I said we don’t know this instrumentation so that was always helpful. What you are actually doing in the experiment is nice to see being done as opposed to just reading about it that's not always helpful. (Analytical Measurements Interview, 2015)

The quote from Gabe also indicates that watching the videos was preferred to just reading the laboratory manual. The dynamic nature of the videos made it easier for students to process the steps for data collection. Students became more independent when they could see the instruments and how to use the software, which seemed to make them less reliant on the TAs and laboratory support staff in the laboratories (and therefore more efficient), as illustrated in the following quote.

Armando: I feel like the videos really helped. This is the first lab that actually had videos to them. That definitely helps just coming in you saw and you did most of the stuff well in the video I feel like that's a head-start, (not) wasting time asking questions in lab, (which) definitely helps. Things go smoother and smoother in labs. (Analytical Measurements Interview, 2015)

This decreased reliance on the TAs was also noted in the laboratory observations. Use of the resources eliminated some of the questions students had asked in previous semesters, such as where the instruments were located and how to use them.

It was commonly observed in Analytical Measurements in 2014 that if students were having trouble with a particular step of the procedure, they would usually stop and wait for the TA to come over and help them before continuing. After the introduction of the pre-laboratory videos, it was observed that students were able to set up the software using the computer on the instrument without TA assistance. This comparison provides evidence that the videos helped the students overcome the barrier of learning how to operate the instruments. It was also observed in Analytical Measurements in 2015 that students had a better sense of what areas they were working on and what steps each portion of the experiment entailed. Students in 2015 had a broader overview of the experiment and knew what steps to anticipate.

Students in both courses drew upon the videos for manipulating the software. Students could be overheard referring to the videos as they navigated the procedure as illustrated in the following excerpt from the Apple Enzyme Kinetics experiment.

Jeremy: Did it export?

Luke: I’m pretty sure I did that right.

Jodie: I don’t know…

Jeremy: Do they just copy and paste?

(Inaudible muttering)

Jeremy: In the video, do they just copy and paste the numbers? I don’t remember.

Jodie: Yeah. (Physical Measurements Observation, 2014)

No TA intervention was needed and the students were able to solve their problem on their own.

Troubleshooting. Even though our data suggest that the videos helped students become more autonomous in performing the laboratory experiments, the videos did not eliminate the need for TAs or laboratory support staff in the laboratory. This was particularly true when it came to troubleshooting problems with software or instrumentation. Examples of troubleshooting included scenarios such as when the computer software would not connect to the instrument or when a leaky valve needed to be replaced. Students were able to recognize that there was a problem preventing them from successfully obtaining appropriate data, but they were not able to identify the source of the problem or correct it. In these cases, students would solicit help from a TA or LSS member.

The laboratory observations also showed that, once a problem was encountered, students often developed confidence issues that impeded their progress in the laboratory. This is illustrated in observations of the Gas Chromatography experiment from Analytical Measurements in 2015. The students completing this experiment were on task and working independently until they had some difficulties getting the computer software to connect with the instrument. This was resolved by the TAs, and the instrument was returned to normal functioning. However, after the troubleshooting issue, the students sought affirmation from the TAs for each step of the experiment before they executed an action on the instrument. It seemed the students wanted the reassurance of the TAs to potentially avoid another issue. Another example of students not performing as autonomously as they had previously was observed in the High-Performance Liquid Chromatography experiment in the 2015 Analytical Measurements course. The students on Day 1 of the experiment came to the laboratory prepared, and they finished the experiment with no major issues. However, on Day 2 of the experiment, there was a leaky valve that needed to be replaced by the laboratory support staff member. After the seal was fixed, the students seemed extra cautious in how they approached setting up the software and loading their samples to be injected into the column. The incident seemed to take away some of the confidence the students had exhibited on Day 1.

The Physical Measurements course had troubleshooting episodes as well. One case occurred during the FTIR of CO lab, in which a leak developed in the Schlenk apparatus. Students generated various solutions (e.g. re-greasing the joints) and were engaged directly in troubleshooting for several minutes until they realized the problem was somewhat beyond the scope of their experience with such an apparatus. In the case of the atomic force microscopy (AFM) experiment, the laboratory support staff member was always immediately involved in troubleshooting due to the sensitive nature of the apparatus (e.g. expensive and delicate AFM tips). In these cases, direct student involvement in troubleshooting was quite limited. However, as in the Analytical Measurements observations, students generally did not seek out TA help for extremely simple fixes (e.g. figuring out how to take a blank on a spectrometer). This is an encouraging observation in the context of students’ development of relative autonomy in the laboratory environment, and it allowed the TAs and laboratory support staff to focus on trouble areas rather than guiding students through using the instruments.

Data interpretation. Students were able to use the videos as resources to help them interpret their data. The following excerpt from students as they were completing the FTIR CO lab provides one illustration of this theme.

Luke: Ok, so we’re at negative thirty [inHg] which is way better than the negative ten we had before because I know that's about what it was in the video.

Jeremy: Yeah, it was.

Jodie: Do we retake the blank or do you think that would not matter?

Jeremy: It's gonna take like five minutes to take the blank, so we might as well just take the blank. (Physical Measurements Observation, 2014)

Here, the student group recognizes that the pressure of the Schlenk line used to evacuate as well as fill the IR gas cell is appropriate based on the value given in the instructional video. They use this information to decide how to proceed with their experiment. This is a prime example of how students are utilizing the video resources in order to interpret their data during the laboratory period.

An example of how students used the videos to evaluate peak shapes can be seen in the following exchange, in which students in Analytical Measurements drew upon the videos during the GC experiment.

Dennis: It's [the GC results] looking pretty similar to the picture [video].

Heather: Yeah.

Dennis: Our peaks are sharp. (Analytical Measurements Observation, 2015)

The students were able to qualitatively state that their peaks were sharp and comparable to what would be expected from an acceptable chromatogram. They were also able to use the data analysis videos to help them interpret their data when they were writing their laboratory reports, as illustrated below.

Joseph: Your (the researchers’) videos were a savior! Like at 5 o'clock this morning we were watching that video just hunkering down on the analysis, that was for fluorescence. They were extremely beneficial, like extremely beneficial. (Analytical Measurements Interview, 2015)

In addition to analyzing data after the laboratory, students in both courses were able to use the videos to help them monitor the progress of their experiments and to gain a better understanding of what the data was supposed to look like before they came into lab. The videos assisted the students with analyzing their data on a qualitative level to help inform them with how to proceed with the data collection in the experiment. Some students mentioned that they see the value in analyzing their data as they collect it during the laboratory period, rather than waiting to analyze while constructing their laboratory report(s).

MLLI results and discussion

The Meaningful Learning in the Laboratory Inventory (MLLI) (Galloway and Bretz, 2015a, 2015b) was administered to students in the physical measurements lab, in a pre-post fashion, during the Fall 2014 term. As Galloway and Bretz (2015a, 2015b) indicated, responses to the items were reverse coded where appropriate (i.e. when a lower numerical response indicated a viewpoint that would be considered positive by an instructor, and vice versa). In these cases, the raw numerical values were simply subtracted from 100. Median scores were calculated for each student's responses to questions for each scale (affective, cognitive, and cognitive/affective). Scores for the pre-test were then compared to scores on the post-test. Herein, we refer to a “positive shift” as a desirable, or favorable, one and a “negative shift” as unfavorable.
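The scoring procedure described above (reverse-code an item by subtracting the raw value from 100, then take each student's median within a cluster) can be sketched as follows. The item numbers and response values are hypothetical and chosen only for illustration.

```python
import statistics

# Sketch of the MLLI scoring described above: reverse-code selected items by
# subtracting the raw 0-100 response from 100, then take the median of a
# student's responses within a cluster. Item numbers/values are hypothetical.

def cluster_median(responses, reverse_coded):
    """responses: {item_number: raw 0-100 value}; reverse_coded: set of items."""
    scored = {item: (100 - value) if item in reverse_coded else value
              for item, value in responses.items()}
    return statistics.median(scored.values())

# Q16 ("...be confused about the underlying concepts") would be reverse coded,
# since a low raw response is the favorable answer.
cognitive_pre = {11: 70, 16: 30, 19: 80}
print(cluster_median(cognitive_pre, reverse_coded={16}))  # median of 70, 70, 80
```

After this transformation, higher scores are uniformly favorable, so pre vs. post medians can be compared directly on the same scale.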

The plots in Fig. 4 show that there was a positive shift in students’ perception of their experience in the affective domain between the pre and post survey, and a negative shift in the cognitive domain. While the negative shift in students’ cognitive expectations is not desirable, it is consistent with the results seen in other studies (DiBiase and Wagner, 2002; Galloway and Bretz, 2015a, 2015b). These results indicate that more attention should be given to engaging students cognitively during the laboratory in addition to improving the psychomotor and affective aspects. However, the shifts in the affective domain in our study were not consistent with previously observed shifts: similar plots and descriptive statistics by Galloway and Bretz (2015a, 2015b) showed that students’ scores significantly decreased in all three domains, albeit with a range of effect sizes (small for affective). This result suggests that the video resources were successful in improving students’ experiences in the laboratory.


Fig. 4 Plots of pre-(left) vs. post-(right) course responses for each participant provide a useful visual representation of shifts in each of the domains, or clusters. The plots were constructed using the median score of all items in the affective and cognitive domains, respectively. The diagonal line represents no change in the pre-post scores. Points above the line indicate a positive (favorable) shift and points below the line represent a negative (unfavorable) shift.

A more detailed statistical analysis was performed on each set of data using SPSS to check for statistically significant differences between participants’ responses pre vs. post within the cognitive and affective clusters. Since the sample size was relatively small (N = 12), the Wilcoxon signed-rank test was used. Table 4 lists the questions that had statistically significant differences, as well as the effect sizes for each of the clusters. The effect sizes were calculated in accordance with a method described by Pallant (2013). In accordance with Cohen's standards for evaluating effect sizes, these effect sizes are considered between small and medium (Cohen, 1992). The only individual item within the affective domain with a positive, significant change was one focused on whether or not students felt frustrated.
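The effect-size calculation described by Pallant (2013) takes r = |z|/√N, where z comes from the normal approximation to the Wilcoxon signed-rank statistic and N is the total number of observations over both time points. A minimal sketch is below; the pre/post scores are hypothetical, and for simplicity the sketch assumes no ties among the absolute differences (real analyses, like the SPSS one above, use tie-corrected average ranks).

```python
import math

def wilcoxon_z_and_r(pre, post):
    """Normal-approximation z for the Wilcoxon signed-rank test and a
    Pallant-style effect size r = |z| / sqrt(N), where N counts observations
    over both time points. Assumes no ties among |differences| (simplification)."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]   # drop zero differences
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))  # rank |d|, smallest = 1
    ranks = [0] * n
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    w_plus = sum(rank for rank, d in zip(ranks, diffs) if d > 0)
    mu = n * (n + 1) / 4                                   # mean of W+ under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)      # sd of W+ under H0
    z = (w_plus - mu) / sigma
    return z, abs(z) / math.sqrt(2 * len(pre))

# Hypothetical pre/post scores for four students:
z, r = wilcoxon_z_and_r([50, 60, 70, 40], [61, 66, 82, 43])
```

With only N = 12 participants, as in the study, the normal approximation is rough, which is why exact or software-computed p-values are preferable; the sketch is meant only to make the effect-size arithmetic concrete.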

Table 4 Summary of the Wilcoxon signed rank test for affective and cognitive items from the Physical Measurements laboratory. Q16 and Q21 shifted in a positive (favorable) direction, while Q11 and Q19 shifted in a negative (unfavorable) direction
Domain Effect size Item number Item statement (Pre: I expect to…; Post: I…past tense…)
Cognitive 0.232 Q11 …think about what the molecules are doing
Q16 …be confused about the underlying concepts
Q19 …think about chemistry I already know
Affective 0.314 Q21 …be frustrated


Item 21 on the MLLI, which investigates the degree to which students feel frustrated in lab, shifted significantly in a desirable manner as indicated in Table 4. We note that, while this is the only item with a statistically significant shift in the affective domain, no questions in this domain significantly decreased. Item 16 on the MLLI (Q16) shifted in a positive, significant manner from pre to post as well. This item serves to investigate the degree to which students are confused about concepts explored in the laboratory. It is significant to note that, in Galloway et al.'s (2016) study of students’ affective responses in chemistry laboratory environments, students indicated they were “frustrated,” and “confused” with relatively high frequency after they had completed at least one laboratory experiment during the academic term. The favorable shift in students’ affective domain scores, as a whole, and the shift in Q16 is promising since a primary goal of the video resources was to help students feel more prepared and confident in the laboratory.

While two other items in the cognitive domain shifted negatively (Q11 and Q19), these focused on chemistry-specific aspects (molecules and chemistry in general). Specifics regarding the underlying chemistry (i.e. atomic and molecular scale phenomena) were covered in the mini lecture videos in Physical Measurements, but the nature of the laboratory creates an environment where students often do not think about chemistry concepts while they are collecting data (DeKorver and Towns, 2016). Some students in the interviews, consistent with the MLLI cognitive domain results, expressed a desire to engage in data analysis while they were present in the lab but often felt it was difficult to do so, as illustrated in the following excerpt.

Interviewer: So if you could take this course again what would you do to improve your experience?

Shane: I would probably try to understand the data. I know it's easy to just get the data. It makes a lot more sense when you look through (it) after, the like, you could be looking for stuff during it (the laboratory period) I guess. That's probably something that would be helpful. Just to, yeah.

Interviewer: So that's something you would be more…

Shane: It's hard to do that when you're getting in to figure out what you are supposed to be doing. Once you understand what you did and maybe you got it, probably would have helped if you had a better understanding of what you are looking for in lab. Sometimes we are just getting the data, getting the numbers and you don't really know why until you start putting them altogether. (Analytical Measurements Interview, 2015)

In line with the MLLI results from Physical Measurements, students in Analytical Measurements expressed positive attitudes towards the course. Although we do not have MLLI data from Analytical Measurements in Spring 2015, the quote below indicates that the students had a positive affective response.

Interviewer: Is there anything else about this course that we didn’t touch on that you wanted to talk about?

Nicholas: I really enjoyed this course, actually. I enjoy being in lab, I thought the labs were fun to do. There wasn’t very much tedium. There wasn’t a feeling of what is this? I thought it was actually a good course. (Analytical Measurements Interview, 2015)

The observation data support the conclusions from the MLLI results. Although there were instances of students being somewhat idle in lab (e.g. during the lengthy kinetics runs for the enzyme kinetics lab), there was a notable absence of language indicating frustration. Even during troubleshooting episodes, there was very little visible frustration on the part of the students aside from intermittent comments about “wanting to get out of lab on time.” Similar sentiments were found in Analytical Measurements in 2015, with students generally not getting flustered when there were glitches with the experiments. There were no mentions of students feeling frustrated or afraid to conduct the experiments in the interview data. In 2014, when students encountered trouble with the instruments, they were more likely to become frustrated, give up, and seek TA or laboratory support staff help without first attempting to fix the problem themselves.

Limitations

Limitations of our research study include that we only recorded one laboratory group per week and were only able to get one set of data per experiment per year. There were variations in the student composition of the groups being recorded from week to week and our observations only captured one snapshot in time. Another limitation is that we did not have any data collected for the Physical Measurements students before the videos were implemented, so we are only able to comment on how well the students perceived that the videos helped them. There was no MLLI data collected for the Analytical Measurements course, which limits the degree to which we can compare the similarities and differences among students in both courses.

Conclusions

The work presented here illustrates that the use of carefully designed pre-laboratory videos is an effective way to support upper-level students in completing laboratory experiments, including using instrumentation. Video-based resources can help alleviate some challenges associated with rotation-style labs, particularly the temporal disconnect between pre-laboratory lectures and experiment completion as well as the need for more student autonomy. Students were able to call upon these resources as they completed the laboratory and the subsequent data analysis and laboratory report. Findings from laboratory observations, interviews with students, and MLLI results indicate that students can become more autonomous and confident in conducting experiments in upper-level rotation-style laboratory courses when provided with appropriate video resources. The video resources also provided students opportunities to engage in the forethought and performance phases of SRL throughout the entire process of completing a laboratory experiment. While it was difficult to determine whether the self-regulated learning cycle was completed, since this could not be observed directly, there was evidence from the interviews that students did reflect on how to make better use of the resources between laboratory experiments.

Students from both courses made specific statements that indicated they relied on the instructional videos to inform their practices in the laboratory, and observations in the laboratory support that students appeared to know what to expect when in lab insofar as instrumentation, apparatus, and procedure were concerned. Most student–TA interactions that coincided with “seeking TA help” were for assistance with rather complex manipulations of instrumentation or for troubleshooting problems, particularly in labs with complicated apparatuses. Simple procedural aspects that were addressed in the instructional videos were generally executed without a problem. These assertions align well with the findings from one-on-one student interviews, in which many participants stated that they drew upon the instructional videos to become familiar with the various instruments, software, and experimental details fundamental to completing the lab.

The positive shift in the MLLI affective domain (from pre to post) was encouraging, especially given that students’ expectations of being confused and frustrated were not borne out by their experiences. The quotes and the results from the MLLI suggest the implementation of the videos yielded favorable shifts in students’ affective experiences in lab, along with a limited impact on their cognitive experiences. The triad of laboratory videos was intended to help alleviate the frustration of preparing for and completing the laboratory, not necessarily to increase students’ cognitive engagement. This is an area that future interventions could address more specifically in order to increase students’ cognitive engagement while completing the laboratory rather than only when completing reports and analysis.

Implications

Implications for research

While the results indicate that the videos were able to support students’ autonomy in the laboratory, findings from student interviews and the MLLI survey, together with previous studies (Galloway and Bretz, 2015a, 2015b), indicate that more work is needed to better support students cognitively in the laboratory. Self-reflection was evident in some of the findings from the student interviews, including how students changed their practice; however, the interventions were not specifically designed to address this aspect of student learning. This is an area where researchers could work to determine the optimal structure of such courses and how resources could be used more effectively to increase self-reflection in students. This is an important implication because metacognition, or reflecting on one's own thinking, has been established as an essential practice for learners to engage in while constructing their knowledge as well as for improving performance (Schraw et al., 2006; Zohar and Barzilai, 2013). Laboratory protocols themselves could be restructured to prompt students to engage in self-reflection. For example, students could be asked to respond to questions in their laboratory notebooks or laboratory reports such as “What could you do to improve your experience in the laboratory for next time?” to gauge whether students are reflecting on their use of video resources. Another prompt that could be added to the laboratory manual would ask students “How did you use the laboratory resources (including the videos) and what, if any, changes might you make to how you use them for the next experiment?” Additionally, incorporating short small-group discussions in the laboratory, in which students reflect on initial results and draw upon the instructional videos, among other resources, to plan next steps, would likely provide an environment conducive to encouraging self-reflection.
Analyzing student discourse in such settings would afford researchers greater insight into the efficacy of such resources with respect to promoting self-reflective and metacognitive practices, as well as a basis for their refinement.

Furthermore, it would be beneficial to examine the use of such video resources in advanced laboratory environments that are conducted in a simultaneous rather than a rotational manner. Of particular interest would be settings where there are opportunities for student reflection, analysis, and discussion during the laboratory period. It is possible that this would also improve cognitive engagement while completing the experiments. Another aspect to address in future studies is the perceptions and experiences of the teaching assistants who run the laboratories and how the videos might have impacted their facilitation of the laboratories. We have some evidence that the laboratory ran more smoothly after the addition of the video resources, but no systematic data were collected to address this more fully.

Implications for instructors

Video resources can help students be more prepared when they come to the laboratory. Our results showed that having these resources readily available made students feel more confident performing the experiments. However, in order for students to take full advantage of the videos, the videos have to be easily accessible. Issues with course structure and student frustration with locating the videos they need to watch have also been seen in other studies (Mason et al., 2013). Our experience shows that students are more likely to use video resources when they are organized by experiment in the online course management system rather than by video type. Successful implementation of the videos also relies on all members of the instructional staff being fully aware of the resources so they can guide students to the correct ones. Our study shows that videos with a course-embedded incentive to watch them, such as a quiz, were more likely to be watched by students. We also found that the videos can serve as a resource to help train new teaching assistants or laboratory staff by familiarizing them with the instruments and experiments.

Carefully designed video resources can help both students and teaching assistants have a more positive and productive laboratory environment. Even though a professional video service was used for the experimental videos, a handheld video camera could be used with similar outcomes. As changes have been made to laboratory protocols, short videos have been made by the research team using a small video camera to address the changes, and the experiment video was then edited to replace the affected segment. These inserted videos are not significantly different in appearance from the professionally filmed segments.

We highly recommend constructing scripts before filming the videos, and separating out the scripts into two columns: one column focusing on what the camera shot will be and the other on what the audio will be. The principles from Mayer's Cognitive Theory of Multimedia Learning (Mayer et al., 1999; Mayer and Chandler, 2001; Mayer et al., 2001) should be taken into account when constructing pre-laboratory videos. The first principle would be to have no unnecessary on-screen text when audio is playing in the video to reduce cognitive overload. The second principle would be to focus the camera shot on completing one action at a time to make it clear to the viewer what step is being illustrated. The third and last principle would be to include bookmark hyperlinks in all of the videos and to place text transitions between each of the sections on screen to signal to the viewer the beginning of another section.

Acknowledgements

Support for this work was provided by a University of Iowa Innovations in Teaching with Technology Award. We would also like to thank the students who participated in this project for being willing to share their experiences and insights.

References

  1. American Chemical Society Committee on Professional Training, (2015), Undergraduate Professional Education in Chemistry: ACS guidelines and evaluation procedures for bachelor's degree programs, Washington, DC.
  2. Box M. C., Dunnagan C. L., Hirsh L. A. S., Cherry C. R., Christianson K. A., Gibson R. J., Wolfe M. I. and Gallardo-Williams M. T., (2017), Qualitative and Quantitative Evaluation of Three Types of Student-Generated Videos as Instructional Support in Organic Chemistry Laboratories, J. Chem. Educ., 94, 164–170.
  3. Burewicz A. and Miranowicz N., (2006), Effectiveness of multimedia laboratory instruction, Chem. Educ. Res. Pract., 7, 1–12.
  4. Chini J. J. and Al-Rawi A., (2013), in Engelhardt P. V., Churukian A. D. and Rebello N. S. (ed.), 2012 Physics Education Research Conference, Melville: Amer Inst Physics, vol. 1513, pp. 98–101.
  5. Cohen J., (1992), A power primer, Psychol. Bull., 112, 155–159.
  6. Cooper M. M. and Kerns T. S., (2006), Changing the Laboratory: Effects of a Laboratory Course on Students' Attitudes and Perceptions, J. Chem. Educ., 83, 1356.
  7. DeKorver B. K. and Towns M. H., (2016), Upper-level undergraduate chemistry students' goals for their laboratory coursework, J. Res. Sci. Teach., 53, 1198–1215.
  8. DiBiase W. J. and Wagner E. P., (2002), Aligning General Chemistry Laboratory With Lecture at a Large University, Sch. Sci. Math., 102, 158–171.
  9. Elliott M. J., Stewart K. K. and Lagowski J. J., (2008), The Role of the Laboratory in Chemistry Instruction, J. Chem. Educ., 85, 145.
  10. Evans K. L., Leinhardt G., Karabinos M. and Yaron D., (2006), Chemistry in the Field and Chemistry in the Classroom: A Cognitive Disconnect? J. Chem. Educ., 83, 655.
  11. Fun Man F., (2016), Exploring Technology-Enhanced Learning Using Google Glass To Offer Students a Unique Instructor's Point of View Live Laboratory Demonstration, J. Chem. Educ., 93, 2117–2122.
  12. Fung F. M., (2015), Using First-Person Perspective Filming Techniques for a Chemistry Laboratory Demonstration To Facilitate a Flipped Pre-Lab, J. Chem. Educ., 92, 1518–1521.
  13. Galloway K. R. and Bretz S. L., (2015a), Development of an Assessment Tool To Measure Students’ Meaningful Learning in the Undergraduate Chemistry Laboratory, J. Chem. Educ., 92, 1149–1158.
  14. Galloway K. R. and Bretz S. L., (2015b), Measuring Meaningful Learning in the Undergraduate Chemistry Laboratory: A National, Cross-Sectional Study, J. Chem. Educ., 92, 2006–2018.
  15. Galloway K. R. and Bretz S. L., (2016), Video episodes and action cameras in the undergraduate chemistry laboratory: eliciting student perceptions of meaningful learning, Chem. Educ. Res. Pract., 17, 139–155.
  16. Galloway K. R., Malakpa Z. and Bretz S. L., (2016), Investigating Affective Experiences in the Undergraduate Chemistry Laboratory: Students' Perceptions of Control and Responsibility, J. Chem. Educ., 93, 227–238.
  17. Herrington D. G. and Nakhleh M. B., (2003), What defines effective chemistry laboratory instruction? Teaching assistant and student perspectives, J. Chem. Educ., 80, 1197.
  18. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: Foundations for the twenty-first century, Sci. Educ., 88, 28–54.
  19. Jennings K. T., Epp E. M. and Weaver G. C., (2007), Use of a multimedia DVD for Physical Chemistry: analysis of its effectiveness for teaching content and applications to current research and its impact on student views of physical chemistry, Chem. Educ. Res. Pract., 8, 308–326.
  20. Johnstone A. and Al-Shuaili A., (2001), Learning in the laboratory; some thoughts from the literature, Univ. Chem. Educ., 5, 42–51.
  21. Jolley D. F., Wilson S. R., Kelso C., O'Brien G. and Mason C. E., (2016), Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes To Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes, J. Chem. Educ., 93, 1855–1862.
  22. Jones S. M. and Edwards A., (2010), Online pre-laboratory exercises enhance student preparedness for first year biology practical classes, Int. J. Innov. Sci. Math. Educ., 18, 1–9.
  23. Jordan J. T., Box M. C., Eguren K. E., Parker T. A., Saraldi-Gallardo V. M., Wolfe M. I. and Gallardo-Williams M. T., (2016), Effectiveness of Student-Generated Video as a Teaching Tool for an Instrumental Technique in the Organic Chemistry Laboratory, J. Chem. Educ., 93, 141–145.
  24. Mack M. R. and Towns M. H., (2016), Faculty beliefs about the purposes for teaching undergraduate physical chemistry courses, Chem. Educ. Res. Pract., 17, 80–99.
  25. Mason G. S., Shuman T. R. and Cook K. E., (2013), Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course, IEEE Trans. Educ., 56, 430–435.
  26. Mayer R. E. and Chandler P., (2001), When learning is just a click away: Does simple user interaction foster deeper understanding of multimedia messages? J. Educ. Psychol., 93, 390.
  27. Mayer R. E., Moreno R., Boire M. and Vagge S., (1999), Maximizing constructivist learning from multimedia communications by minimizing cognitive load, J. Educ. Psychol., 91, 638.
  28. Mayer R. E., Heiser J. and Lonn S., (2001), Cognitive constraints on multimedia learning: when presenting more material results in less understanding, J. Educ. Psychol., 93, 187.
  29. Molenaar I. and Järvelä S., (2014), Sequential and temporal characteristics of self and socially regulated learning, Metacogn. Learn., 9, 75.
  30. Nadelson L. S., Scaggs J., Sheffield C. and McDougal O. M., (2015), Integration of video-based demonstrations to prepare students for the organic chemistry laboratory, J. Sci. Educ. Technol., 24, 476–483.
  31. Pallant J., (2013), SPSS survival manual, UK: McGraw-Hill Education.
  32. Pogacnik L. and Cigic B., (2006), How to motivate students to study before they enter the lab, J. Chem. Educ., 83, 1094.
  33. Powell C. B. and Mason D. S., (2013), Effectiveness of podcasts delivered on mobile devices as a support for student learning during general chemistry laboratories, J. Sci. Educ. Technol., 22, 148–170.
  34. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8, 172–185.
  35. Robinson W. R., (2004), Cognitive Theory and the Design of Multimedia Instruction, J. Chem. Educ., 81, 10.
  36. Rollnick M., Zwane S., Staskun M., Lotz S. and Green G., (2001), Improving pre-laboratory preparation of first year university chemistry students, Int. J. Sci. Educ., 23, 1053–1071.
  37. Schraw G., Crippen K. J. and Hartley K., (2006), Promoting Self-Regulation in Science Education: Metacognition as Part of a Broader Perspective on Learning, Res. Sci. Educ., 36, 111–139.
  38. Seery M. K. and McDonnell C., (2013), The application of technology to enhance chemistry education, Chem. Educ. Res. Pract., 14, 227–228.
  39. Shibley I. A. and Zimmaro D. M., (2002), The Influence of Collaborative Learning on Student Attitudes and Performance in an Introductory Chemistry Laboratory, J. Chem. Educ., 79, 745.
  40. Teo T. W., Tan K. C. D., Yan Y. K., Teo Y. C. and Yeo L. W., (2014), How flip teaching supports undergraduate chemistry laboratory learning, Chem. Educ. Res. Pract., 15, 550–567.
  41. Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O'Shea K., (2015), The Digital Pipetting Badge: A Method To Improve Student Hands-On Laboratory Skills, J. Chem. Educ., 92, 2038–2044.
  42. Tsaparlis G. and Finlayson O. E., (2014), Physical chemistry education: its multiple facets and aspects, Chem. Educ. Res. Pract., 15, 257–265.
  43. Winberg T. M. and Berg C. A. R., (2007), Students' cognitive focus during a chemistry laboratory exercise: effects of a computer-simulated prelab, J. Res. Sci. Teach., 44, 1108–1133.
  44. Xu H. and Talanquer V., (2013), Effect of the Level of Inquiry on Student Interactions in Chemistry Laboratories, J. Chem. Educ., 90, 29–36.
  45. Zimmerman B. J., (2002), Becoming a self-regulated learner: an overview, Theor. Pract., 41, 64–70.
  46. Zohar A. and Barzilai S., (2013), A review of research on metacognition in science education: current and future directions, Stud. Sci. Educ., 49, 121–169.

Footnotes

Electronic supplementary information (ESI) available. See DOI: 10.1039/c7rp00078b
Now at Western Washington University.

This journal is © The Royal Society of Chemistry 2017