Jennifer A. Schmidt-McCormack, Marc N. Muniz‡, Ellie C. Keuter, Scott K. Shaw and Renée S. Cole*
University of Iowa, Iowa City, IA 52242, USA. E-mail: renee-cole@uiowa.edu
First published on 31st May 2017
Well-designed laboratories can help students master content and science practices through the successful completion of laboratory experiments. Upper-division chemistry laboratory courses often present special challenges for instruction due to the instrument-intensive nature of the experiments. To address these challenges, particularly those associated with rotation-style course structures, pre-laboratory videos were generated for two upper-division laboratory courses, Analytical Measurements and Physical Measurements. Sets of videos were developed for each experiment: a pre-laboratory lecture video, an experimental video, and a data analysis video. We describe the theoretical principles that guided the design of the instructional videos as well as the production process. To assess the impact of the videos on students' successful completion of the experiments, a mixed-methods approach to data collection was used, which included video-recorded laboratory observations, one-on-one student interviews, and the Meaningful Learning in the Laboratory Inventory (MLLI) survey. Our findings indicate that video-based resources can help alleviate some challenges associated with rotation-style labs, particularly the temporal disconnect between pre-laboratory lectures and experiment completion, as well as the need for more student autonomy in upper-division laboratory courses.
A common challenge in upper-division undergraduate chemistry laboratory courses is that the instrument-intensive nature of the experiments often leads to the use of a rotational scheduling structure. In such a structure, each student completes all of the experiments over the course of the term, but at different times and in different sequences than their classmates. One pedagogical challenge associated with a rotational approach is that the pre-lab lecture component is often temporally far removed from when students perform the experiments. This time lag can range from days to months, depending on the structure of the course. Rotational-style laboratory courses can also be challenging for teaching assistants (TAs) to facilitate, as well as for the laboratory support staff who maintain the instruments. A TA in this type of class must assist and facilitate multiple student groups completing different experiments during the same lab period. With these challenges in mind, a way to directly engage students with the material pertinent to their individual schedule immediately before they enter the laboratory is needed (Jennings et al., 2007; Seery and McDonnell, 2013; Jordan et al., 2016).
We propose that by offering students readily accessible resources to assist them in the laboratory, they will be afforded more opportunities to self-direct their learning on an as-needed basis. This will better prepare students to succeed in completing laboratory experiments, and to do so with decreased reliance on TA/staff assistance. Providing instructional resources in the form of short videos and pre-lab quizzes allows each student to have autonomy over their learning because they have the ability to review the videos and quizzes more than once and at their own pace (Seery and McDonnell, 2013). Findings from previous research show that greater student autonomy can also increase students' positive perceptions of the laboratory (Shibley and Zimmaro, 2002; Cooper and Kerns, 2006; Jennings et al., 2007; Chini and Al-Rawi, 2013; Galloway and Bretz, 2016; Galloway et al., 2016). This study describes the design and implementation of instructional videos in upper-division chemistry labs to improve the student laboratory experience.
One approach to encourage student pre-laboratory preparedness has been the development of videos designed to address portions of lab procedures where students commonly struggle. Burewicz and Miranowicz (2006) conducted a study with fourth-year college students to determine whether traditional, video, or computer-assisted methods were most effective in delivering pre-laboratory activities. Findings indicated that the video and interactive computer-assisted activities increased the amount of time students spent completing the pre-lab activities and decreased the number of errors and total time spent on the experiment. This increased efficiency in the laboratory has also been seen in other studies (Nadelson et al., 2015; Box et al., 2017). Pre-laboratory instructional videos can also decrease the number of questions students ask about procedural issues, while increasing the number of questions related to calculations or theory (Pogacnik and Cigic, 2006; Winberg and Berg, 2007; Towns et al., 2015; Box et al., 2017). This change has been attributed to the videos removing some of the cognitive load of performing the experiments, shifting students' focus to the calculations and the chemistry behind the experiments (Teo et al., 2014; Box et al., 2017). Multimedia pre-laboratory activities have been shown to increase students' perceptions of being prepared for the laboratory (Jones and Edwards, 2010; Towns et al., 2015; Jolley et al., 2016), which can also influence students' overall experience of the laboratory (Galloway and Bretz, 2016; Galloway et al., 2016). However, all these studies focused on laboratory environments where all the students were completing the same experiment during the laboratory session.
A number of different approaches have been used to create pre-laboratory videos. The most common approach is for instructors to create the resources (Burewicz and Miranowicz, 2006; Winberg and Berg, 2007; Powell and Mason, 2013; Teo et al., 2014; Fung, 2015; Nadelson et al., 2015; Fun Man, 2016; Jolley et al., 2016), although some instructors had undergraduate students who had completed the course create the videos (Jordan et al., 2016; Box et al., 2017). The latter approach was adopted because such students readily remember the portions of the experiment with which they struggled and can highlight those aspects in the videos. Most of the video resources used a third-person perspective (Burewicz and Miranowicz, 2006; Winberg and Berg, 2007; Powell and Mason, 2013; Nadelson et al., 2015; Jolley et al., 2016; Jordan et al., 2016; Box et al., 2017), but there have been cases where a first-person perspective was used (Fung, 2015; Fun Man, 2016). Published descriptions of the content of pre-laboratory videos have included calculations (Winberg and Berg, 2007; Powell and Mason, 2013; Jolley et al., 2016; Box et al., 2017), safety (Nadelson et al., 2015), physical demonstration of the laboratory techniques (Burewicz and Miranowicz, 2006; Powell and Mason, 2013; Teo et al., 2014; Fung, 2015; Nadelson et al., 2015; Fun Man, 2016; Jolley et al., 2016; Box et al., 2017), demonstrations of how to set up software (Burewicz and Miranowicz, 2006; Powell and Mason, 2013; Fung, 2015; Fun Man, 2016; Jolley et al., 2016; Jordan et al., 2016; Box et al., 2017), and the use of time lapses to illustrate experimental progress in a short period of time (Jordan et al., 2016). However, descriptions of how the video production was carried out are often lacking.
• To what extent can laboratory videos and accompanying resources facilitate students’ successful completion of experiments in upper-level, rotation-style laboratory environments?
In order to meet the needs of hearing-impaired students, closed-caption text is available for all YouTube videos, although instructors will likely need to edit the automatically generated captions to ensure specialized terms are transcribed correctly. For lecture-capture videos, closed captions should also be added to support students for whom the auditory channel presents a barrier; most programs include the capability to add them easily. This approach allows closed captions to be turned on by those who need them without being omnipresent for those who do not.
To gain insight as to what features would be best to include in the videos, focus group interviews were conducted with students who were enrolled in the Analytical Measurements laboratory course during the Spring 2014 semester. The students’ recommendations for the videos included showing how to use the software programs and succinctly providing background information about the experiments.
The videos were created by members of the research team, which included course instructors (RSC and SKS), a post-doctoral fellow (MNM), a graduate student who had also been a teaching assistant for the Analytical Measurements laboratory course (JASM), and an undergraduate student who had taken both laboratory courses (ECK). Specific contributions are noted in the description for each type of video below.
Fig. 1 Screenshot from a mini-lecture video covering the concepts from the Carbon Monoxide Fourier Transform Infrared Spectroscopy experiment in Physical Measurements.
These videos were intended to be viewed prior to beginning the experiment, but could also be used by the students as a resource while writing their laboratory reports. The intent was to strengthen the connections between the chemical concepts and the experiment being performed. Although the intention was to create pre-lab lecture videos for both Physical and Analytical Measurements, the lecture videos had not yet been created for Analytical Measurements for these case studies.
All of the videos were filmed in the instructional laboratory using the same instruments used by students during the course. Relevant portions of the experiments were demonstrated by JASM, MNM, or ECK while technicians from a professional video service recorded the action. Videos ranged from five to twenty minutes in duration and were designed to complement reading the lab manual. These videos focused explicitly on showing the necessary steps to successfully use instruments and related software to collect data during the laboratory periods. A screenshot from an Analytical Measurements video is shown in Fig. 2. Camera shots involving the solution preparation and glassware were included for demonstration purposes, but these were not the primary focus of the videos. Chromatograms or spectra were shown on screen so students could see what the software looked like with loaded spectra, but the quality or particular attributes of these spectra were not discussed in the videos. Editing of the videos was completed by the professional video service, with input from the research team. Bookmarks for specific content items were inserted post-production to facilitate easier navigation.
| Course/semester | Students enrolled in course | Students per laboratory section | TAs per laboratory section |
|---|---|---|---|
| Analytical Measurements, Spring 2014 | 38 | 18–20 | 3 |
| Analytical Measurements, Spring 2015 | 29 | 14–15 | 2 |
| Physical Measurements, Fall 2014 | 18 | 7–11 | 1–2 |
Physical Measurements

| Experiment title | Pre-lab video | Experimental video | Data analysis video | Fall 2014 observation |
|---|---|---|---|---|
| Partial Molar Volume (PMV)* | X | X | | |
| Mass Spectrometric Measurements of Heterogeneous Catalysis (CMS) | X | X | X | |
| Enzyme Kinetics of Apples (EK) | X | X | X | X |
| Cyclodextrin Inclusion Complexes (CIC) | X | X | X | X |
| Cyanine Dyes and Quantum Dots (PIB) | X | X | X | |
| Atomic Force Microscopy (AFM) | X | X | X | |
| Theoretical Chemistry (Spartan)* | | | | |
Analytical Measurements

| Experiment title | Pre-lab video | Experimental video | Data analysis video | Spring 2014 observation | Spring 2015 observation |
|---|---|---|---|---|---|
| Stats, plots & fits* | | | | | |
| Circuits & Coulometry (CC)* | X | | | | |
| Cyclic Voltammetry (CV) | X | X | | | |
| UV-Vis Spectroscopy (UV-Vis) | X | X | | | |
| Fluorescence Spectroscopy | X | X | X | X | |
| Fourier Transform Infrared Spectroscopy (FTIR) | X | X | X | X | |
| Atomic Emission Spectroscopy (AES) | X | X | X | X | |
| Gas Chromatography (GC) | X | X | X | X | |
| High Performance Liquid Chromatography (HPLC) | X | X | X | X | |
| Capillary Electrophoresis (CE) | X | X | X | | |
Observations were captured by video-recording student groups while they completed the laboratory experiments. A summary of the recorded experiments is provided in Table 2. The groups ranged from one to three students per observation.
Interviews were conducted with students enrolled in Analytical Measurements in 2014. These interviews were fairly open-ended; their primary purpose was to gather student input about the most important features of the laboratory experiments to include in the videos. Interviews were also conducted with three students during the Fall 2014 semester of Physical Measurements and eighteen students from the Spring 2015 semester of Analytical Measurements. Students were asked how they prepared for lab, and follow-up questions were used to further probe their use of the videos and their overall laboratory experience. The interviews were conducted by MNM (Physical Measurements) using a Livescribe pen and by JASM (Analytical Measurements) using a video camera. The interview protocol in these cases focused on the general course structure, what resources students utilized to inform their actions in lab, and how the resources influenced their attitudes toward the laboratories. The interviews ranged from 15 to 45 minutes and took place at or near the end of the semester, after students had completed a majority of the laboratory experiments. Recordings were transcribed, and both the original recordings and transcripts were stored on a secure server.
The Meaningful Learning in the Laboratory Instrument (MLLI) (Galloway and Bretz, 2015a, 2015b) was administered to students enrolled in the Physical Measurements course in the Fall of 2014. The MLLI is a survey instrument that students completed online in a pre-post fashion: the first time before they started completing experiments in the laboratory and the second during the last week of instruction. The goal of administering the MLLI was to investigate how students' perceptions of the lab changed from the beginning to the end of the course. The instrument categorizes students' responses in cognitive, affective, and cognitive/affective dimensions.
Code name | Definition |
---|---|
Troubleshooting (E) | Students, and/or TA/LSS, are addressing something problematic with the laboratory apparatus. |
Reference to a video resource (P) | Students are referring to one of the laboratory instructional videos. |
Interpretative (CP) | Students are interpreting some data while in the lab. |
Interacting with TA/LSS | Students are interacting with TA or LSS in any manner. |
Seeking TA/LSS help | Students are actively seeking the TA or LSS for assistance. |
The researchers used one entire video observation session from a single experiment in the Physical Measurements laboratory (CO FTIR) and one entire observation from a single experiment in the Analytical Measurements laboratory (FTIR) to establish consensus for the application of the coding scheme. One researcher coded all of the Analytical Measurements observations while another researcher coded all of the Physical Measurements observations. 25% of the recorded video clips for each course were then coded by the other researcher and subsequently discussed by both researchers. Inter-rater agreement scores were calculated based on the degree to which the same code was present (or absent) in each episode coded by both researchers. An overall inter-rater agreement score of 89% was calculated for the co-coded observations, and all coding discrepancies were resolved among the coders. This agreement score aligns with the 90% agreement score reported by Xu and Talanquer (2013) for observing laboratory interactions.

Data analysis for the interviews consisted of first transcribing the interviews verbatim. Interview transcriptions captured the entire conversation between the interviewer and interviewee, omitting filler words such as "um" and "like" only when they masked the meaning of what the participant was trying to say. Next, the interviews were coded using an open coding method. Any aspects of the transcripts that related to the videos, course structure, instructor facilitation, or the laboratory in general were highlighted. Quotes similar in nature were grouped together to facilitate the compilation of the overall themes that emerged from the transcripts. Themes were compared across the interviews from both Analytical and Physical Measurements.
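The agreement metric used above (the fraction of episodes in which both researchers marked the same code as present or absent) can be sketched in a few lines. The data layout and function name below are our own illustration of that metric, not taken from the study, which reports only the resulting percentage:

```python
def percent_agreement(coder_a, coder_b):
    """Fraction of (episode, code) cells on which two coders agree.

    Each list holds one presence/absence mark (1 or 0) per cell,
    aligned cell-for-cell between the coders. This flat layout is a
    hypothetical simplification for illustration.
    """
    if len(coder_a) != len(coder_b):
        raise ValueError("coders must rate the same set of cells")
    matches = sum(1 for a, b in zip(coder_a, coder_b) if a == b)
    return matches / len(coder_a)

# Example: agreement on 3 of 4 cells gives 0.75 (i.e. 75%)
print(percent_agreement([1, 1, 0, 1], [1, 0, 0, 1]))
```

Simple percent agreement does not correct for chance agreement; measures such as Cohen's kappa do, but the study reports plain agreement, so that is what is sketched here.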
In contrast, students in Physical Measurements made no mention during the interviews of being unaware of any type of video. Students in the Analytical Measurements course, however, stated that they did not know the data analysis videos existed (illustrated in the quote below), highlighting the need for better organization of how students access the videos.
Jeremy: I guess the only thing I would say is like, I never looked at the uh, data analysis video I didn’t realize they were there until a couple of weeks ago.
Interviewer: So did you actually use the data analysis videos once you figured out they were there?
Jeremy: Yeah, I think I used them for one or two labs just to try to get on the right track. (Analytical Measurements Interview, 2015)
This quote from Jeremy also illustrates how he changed his practice once he became aware of the videos.
Leonard: …I would go on and watch the experiment video, and while I was watching that…I would write out…bullet points of what we did so we didn’t get lost and knew we were following exactly the right techniques… (Physical Measurements Interview, 2014)
Alexandra: And while I am watching it (the video) I take notes and I think it really makes me pay attention to what I am doing and I come to class definitely knowing what I am doing. (Analytical Measurements Interview, 2015)
Students watched the lecture videos when they were available for the course, but these were not as vital to them as the experiment videos. Although watching the lecture videos was not required to successfully complete the pre-laboratory quiz, it was an expected part of the Physical Measurements course. Students who watched them noted the inherent value of how these videos could assist them in the laboratory. Some students preferred to watch these videos beforehand, often up to a few days in advance, so they would be able to seek additional help if needed.
Interviewer: How did you prepare for lab?
Howard: …If there was a theory video posted, I would watch it. …but in the case that I really didn’t understand something, I usually tried to look at it a couple of days before I was actually in lab so that [on the day the experiment was actually performed] I would…be able to go to office hours. (Physical Measurements Interview, 2014)
The data also indicated a missed opportunity for the Analytical Measurements course, as this course did not include pre-lab lecture videos as part of the course structure. Some of the students in Analytical Measurements expressed having difficulty with connecting information from the pre-laboratory lectures, provided at the beginning of the semester, to the experiments when they wrote their laboratory reports. Students stated it was difficult to remember the material that was covered in lecture due to the time lapse between when they attended lecture and when they completed the experiments, which could be up to four months. They also said it was challenging to understand how the concepts in the lecture connected to the concepts in the experiments. The following quote illustrates this theme.
Ethan: CV or CE I think those things are not covered as much as undergraduates. So we had lectures at the start of the semester but (that) was four months ago now and we spend a week on each one (laboratory) or something. Not even a week on those topics. So then I'm trying to recall and go back to those lecture slides. Make sense of it. (Analytical Measurements Interview, 2015)
Gabe: Um, largely what I found the videos the most useful for was definitely for the instrumentation. Like I said we don’t know this instrumentation so that was always helpful. What you are actually doing in the experiment is nice to see being done as opposed to just reading about it that's not always helpful. (Analytical Measurements Interview, 2015)
The quote from Gabe also indicates that watching the videos was preferred to just reading the laboratory manual. The dynamic nature of the videos made it easier for students to process the steps for data collection. Students became more independent when they could see the instruments and how to use the software, which seemed to contribute to students being less reliant on the TAs and laboratory support staff in the laboratories (and therefore more efficient), as illustrated in the following quote.
Armando: I feel like the videos really helped. This is the first lab that actually had videos to them. That definitely helps just coming in you saw and you did most of the stuff well in the video I feel like that's a head-start, (not) wasting time asking questions in lab, (which) definitely helps. Things go smoother and smoother in labs. (Analytical Measurements Interview, 2015)
This decreased reliance on the TAs was also noted in the laboratory observations. Use of the resources eliminated some of the questions students had asked in previous semesters, such as where each instrument was located and how to use it.
It was commonly observed in Analytical Measurements in 2014 that if students were having trouble with a particular step of the procedure, they would usually stop and wait for the TA to come over and help them before continuing. After the introduction of the pre-laboratory videos, it was observed that students were able to set up the software on the instrument's computer without TA assistance. This comparison provides evidence that the videos helped the students overcome the barrier of learning how to operate the instruments. It was also observed in Analytical Measurements in 2015 that students had a better sense of which areas they were working in and what steps each portion of the experiment entailed. Students in 2015 had a broader overview of the experiment and knew what steps to anticipate.
Students in both courses drew upon the videos for manipulating the software. Students could be overheard referring to the videos as they navigated the procedure as illustrated in the following excerpt from the Apple Enzyme Kinetics experiment.
Jeremy: Did it export?
Luke: I’m pretty sure I did that right.
Jodie: I don’t know…
Jeremy: Do they just copy and paste?
(Inaudible muttering)
Jeremy: In the video, do they just copy and paste the numbers? I don’t remember.
Jodie: Yeah. (Physical Measurements Observation, 2014)
No TA intervention was needed and the students were able to solve their problem on their own.
The laboratory observations also showed that students often developed confidence issues that impeded their progress in the laboratory once a problem was encountered. This is illustrated in observations of the Gas Chromatography experiment from Analytical Measurements in 2015. The students completing this experiment were on task and working independently until there were some difficulties with getting the computer software to connect with the instrument. This was resolved by the TAs, and the instrument returned to functioning normally. However, after the troubleshooting issue the students sought affirmation from the TAs for each step of the experiment before they executed an action on the instrument. It seemed the students wanted the reassurance of the TAs to potentially avoid another issue. Another example of students not performing as autonomously as they had previously was observed in the High Performance Liquid Chromatography experiment in the 2015 Analytical Measurements course. The students on Day 1 of the experiment came to the laboratory prepared, and they finished the experiment with no major issues. However, on Day 2 of the experiment, there was a leaky valve that needed to be replaced by the laboratory support staff member. After the seal was fixed, the students seemed extra cautious in how they approached setting up the software and loading their samples to be injected into the column. The incident seemed to take away some of the confidence the students had exhibited on Day 1 of the experiment.
The Physical Measurements course had troubleshooting episodes as well. One case occurred during the FTIR of CO lab, in which a leak developed in the Schlenk apparatus. Students generated various solutions (e.g. re-greasing the joints) and were engaged directly in troubleshooting for several minutes until they realized it was somewhat beyond the scope of their experience with such an apparatus. In the case of the atomic force microscopy (AFM) experiment, the laboratory support staff member was always immediately involved in troubleshooting due to the sensitive nature of the apparatus (e.g. expensive and delicate AFM tips). In these cases, direct student involvement in troubleshooting was quite limited. However, as in the Analytical Measurements observations, students generally did not seek out TA help for extremely simple fixes (e.g. figuring out how to take a blank on a spectrometer). This is an encouraging observation in the context of students' development of relative autonomy in the laboratory environment, and it allows the TAs and laboratory support staff to focus on trouble areas rather than guiding students through using the instruments.
Luke: Ok, so we’re at negative thirty [inHg] which is way better than the negative ten we had before because I know that's about what it was in the video.
Jeremy: Yeah, it was.
Jodie: Do we retake the blank or do you think that would not matter?
Jeremy: It's gonna take like five minutes to take the blank, so we might as well just take the blank. (Physical Measurements Observation, 2014)
Here, the student group recognizes that the pressure of the Schlenk line used to evacuate as well as fill the IR gas cell is appropriate based on the value given in the instructional video. They use this information to decide how to proceed with their experiment. This is a prime example of how students are utilizing the video resources in order to interpret their data during the laboratory period.
An example of how students used the videos to determine the peak shapes can be seen in the following example. Here, students in Analytical Measurements drew upon the videos as a source during the GC experiment.
Dennis: It's [the GC results] looking pretty similar to the picture [video].
Heather: Yeah.
Dennis: Our peaks are sharp. (Analytical Measurements Observation, 2015)
The students were able to qualitatively state that their peaks were sharp and comparable to what would be expected from an acceptable chromatogram. They were also able to use the data analysis videos to help them interpret their data when writing their laboratory reports, as illustrated below.
Joseph: Your (the researchers’) videos were a savior! Like at 5’o clock this morning we were watching that video just hunkering down on the analysis, that was for fluorescence. They were extremely beneficial, like extremely beneficial. (Analytical Measurements Interview, 2015)
In addition to analyzing data after the laboratory, students in both courses were able to use the videos to help them monitor the progress of their experiments and to gain a better understanding of what the data was supposed to look like before they came into lab. The videos assisted the students with analyzing their data on a qualitative level to help inform them with how to proceed with the data collection in the experiment. Some students mentioned that they see the value in analyzing their data as they collect it during the laboratory period, rather than waiting to analyze while constructing their laboratory report(s).
The plots in Fig. 4 show that there was a positive shift in students' perception of their experience in the affective domain between the pre and post survey, and a negative shift in the cognitive domain. While the negative shift in students' cognitive expectations is not desirable, it is consistent with the results seen in other studies (DiBiase and Wagner, 2002; Galloway and Bretz, 2015a, 2015b). These results indicate that more attention should be given to engaging students cognitively during the laboratory in addition to improving the psychomotor and affective aspects. However, the shifts in the affective domain in our study were not consistent with previously observed shifts: similar plots and descriptive statistics by Galloway and Bretz (2015a, 2015b) showed that students' scores significantly decreased in all three domains, albeit with a range of effect sizes (small for affective). This result suggests that the video resources were successful in improving students' experiences in the laboratory.
A more detailed statistical analysis was performed on each set of data using SPSS to check for statistically significant differences between participants’ responses pre vs. post within the cognitive and affective clusters. Since the sample size was relatively small (N = 12), the Wilcoxon signed-rank test was used. Table 4 lists the questions that had statistically significant differences, as well as the effect sizes for each of the clusters. The effect sizes were calculated in accordance with a method described by Pallant (2013). In accordance with Cohen's standards for evaluating effect sizes, these effect sizes are considered between small and medium (Cohen, 1992). The only individual item within the affective domain with a positive, significant change was one focused on whether or not students felt frustrated.
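The analysis above was run in SPSS; purely as an illustrative sketch, the same test and a Pallant-style effect size (r = |Z|/√N, with N the total number of observations across the two time points) can be computed as follows. The implementation, which uses the normal approximation with mean ranks for tied differences, and all names in it are ours, not the authors':

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Wilcoxon signed-rank test for paired data (normal approximation).

    Returns (W, z, r): the smaller signed-rank sum, its z-score, and
    the effect size r = |z| / sqrt(N) described by Pallant (2013),
    where N counts the pre and post observations together.
    Zero differences are dropped; tied |differences| share mean ranks.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    if n == 0:
        raise ValueError("all paired differences are zero")
    abs_sorted = sorted(abs(d) for d in diffs)

    def mean_rank(v):
        # average the 1-based ranks of all entries tied at |d| = v
        positions = [i + 1 for i, a in enumerate(abs_sorted) if a == v]
        return sum(positions) / len(positions)

    w_plus = sum(mean_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(mean_rank(abs(d)) for d in diffs if d < 0)
    w = min(w_plus, w_minus)
    mu = n * (n + 1) / 4                               # mean of W under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)  # sd of W under H0
    z = (w - mu) / sigma
    r = abs(z) / math.sqrt(2 * len(pre))               # N = pre + post observations
    return w, z, r
```

With 12 paired responses, as in this study, N = 24 appears in the denominator of r, which is one reason effect sizes from small samples should be interpreted cautiously.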
| Domain | Effect size | Item number | Item statement (Pre: I expect to…; Post: I… [past tense]) |
|---|---|---|---|
| Cognitive | 0.232 | Q11 | …think about what the molecules are doing |
| | | Q16 | …be confused about the underlying concepts |
| | | Q19 | …think about chemistry I already know |
| Affective | 0.314 | Q21 | …be frustrated |
Item 21 on the MLLI, which investigates the degree to which students feel frustrated in lab, shifted significantly in a desirable manner, as indicated in Table 4. We note that, while this is the only item with a statistically significant shift in the affective domain, no questions in this domain significantly decreased. Item 16 on the MLLI (Q16), which investigates the degree to which students are confused about concepts explored in the laboratory, also shifted in a positive, significant manner from pre to post. It is notable that, in Galloway et al.'s (2016) study of students' affective responses in chemistry laboratory environments, students indicated they were "frustrated" and "confused" with relatively high frequency after they had completed at least one laboratory experiment during the academic term. The favorable shift in students' affective domain scores as a whole, and the shift in Q16, is promising since a primary goal of the video resources was to help students feel more prepared and confident in the laboratory.
While two other items in the cognitive domain shifted negatively (Q11 and Q19), these focused on chemistry-specific aspects (molecules and chemistry in general). Specifics regarding the underlying chemistry (i.e. atomic and molecular scale phenomena) were covered in the mini-lecture videos in Physical Measurements, but the nature of the laboratory creates an environment where students often do not think about chemistry concepts while they are collecting data (DeKorver and Towns, 2016). Consistent with the MLLI cognitive domain results, some students in the interviews expressed a desire to engage in data analysis while they were present in the lab but often found it difficult to do so, as illustrated in the following excerpt.
Interviewer: So if you could take this course again what would you do to improve your experience?
Shane: I would probably try to understand the data. I know it's easy to just get the data. It makes a lot more sense when you look through (it) after, the like, you could be looking for stuff during it (the laboratory period) I guess. That's probably something that would be helpful. Just to, yeah.
Interviewer: So that's something you would be more…
Shane: It's hard to do that when you're getting in to figure out what you are supposed to be doing. Once you understand what you did and maybe you got it, probably would have helped if you had a better understanding of what you are looking for in lab. Sometimes we are just getting the data, getting the numbers and you don't really know why until you start putting them altogether. (Analytical Measurements Interview, 2015)
In line with the MLLI results from Physical Measurements, students in Analytical Measurements expressed positive attitudes towards the course. Although we do not have MLLI data from Analytical Measurements in Spring 2015, the quote below indicates that students had a positive affective response.
Interviewer: Is there anything else about this course that we didn’t touch on that you wanted to talk about?
Nicholas: I really enjoyed this course, actually. I enjoy being in lab, I thought the labs were fun to do. There wasn’t very much tedium. There wasn’t a feeling of what is this? I thought it was actually a good course. (Analytical Measurements Interview, 2015)
The observation data support the conclusions from the MLLI results. Although there were instances of students being somewhat idle in lab (e.g. during the lengthy kinetics runs for the enzyme kinetics lab), there was a notable absence of language indicating frustration. Even during troubleshooting episodes, there was very little visible frustration on the part of the students aside from intermittent comments about “wanting to get out of lab on time.” Similar sentiments were found in Analytical Measurements in 2015, with students generally not becoming flustered when there were glitches with the experiments. There were no mentions of students feeling frustrated or afraid to conduct the experiments in the interview data. In 2014, by contrast, when students encountered trouble with the instruments, they were more likely to become frustrated, give up, and seek help from the TA or laboratory support staff without first attempting to fix the problem themselves.
Students from both courses made specific statements that indicated they relied on the instructional videos to inform their practices in the laboratory, and observations in the laboratory support that students appeared to know what to expect when in lab insofar as instrumentation, apparatus, and procedure were concerned. Most student–TA interactions that coincided with “seeking TA help” were for assistance with rather complex manipulations of instrumentation or for troubleshooting problems, particularly in labs with complicated apparatuses. Simple procedural aspects that were addressed in the instructional videos were generally executed without a problem. These assertions align well with the findings from one-on-one student interviews, in which many participants stated that they drew upon the instructional videos to become familiar with the various instruments, software, and experimental details fundamental to completing the lab.
The positive shift in the MLLI affective domain from pre to post was encouraging, especially given that students expected to be confused and frustrated but did not experience those feelings to the degree anticipated. The quotes and the MLLI results suggest the implementation of the videos yielded favorable shifts in students’ affective experiences in lab, along with a limited impact on their cognitive experiences. The triad of laboratory videos was intended to help alleviate the frustration of preparing for and completing the laboratory, not necessarily to increase students’ cognitive engagement. This represents an area that future interventions could address more specifically, with the aim of increasing students’ cognitive engagement while they complete the laboratory rather than only when completing reports and analysis.
Furthermore, it would be beneficial to examine the use of such video resources in advanced laboratory environments that are conducted in a simultaneous rather than a rotational manner. Of particular interest would be settings that provide opportunities for student reflection, analysis, and discussion during the laboratory period, as it is possible that this would also improve cognitive engagement while completing the experiments. Another aspect to address in future studies is the perceptions and experiences of the teaching assistants who run the laboratories and how the videos might have impacted their facilitation of the laboratories. We have some evidence that the laboratory ran more smoothly after the addition of the video resources, but no systematic data were collected to address this more fully.
Carefully designed video resources can help both students and teaching assistants have a more positive and productive laboratory environment. Although a professional video service was used for the experimental videos, a handheld video camera could be used with similar outcomes. As changes have been made to laboratory protocols, the research team has filmed short videos with a small video camera to address the changes and then edited the experiment video to replace the outdated segment. These inserted segments are not noticeably different in appearance from the professionally filmed footage.
We highly recommend constructing scripts before filming the videos and dividing each script into two columns: one describing the camera shot and the other the accompanying audio. The principles from Mayer's Cognitive Theory of Multimedia Learning (Mayer et al., 1999; Mayer and Chandler, 2001; Mayer et al., 2001) should be taken into account when constructing pre-laboratory videos. The first principle is to include no unnecessary on-screen text while audio is playing, to reduce cognitive overload. The second is to focus the camera shot on completing one action at a time, making it clear to the viewer which step is being illustrated. The third is to include bookmark hyperlinks in all of the videos and to place on-screen text transitions between sections to signal to the viewer the beginning of another section.
Footnotes
† Electronic supplementary information (ESI) available. See DOI: 10.1039/c7rp00078b
‡ Now at Western Washington University. |
This journal is © The Royal Society of Chemistry 2017 |