Mitzy A. Erdmann and Joe L. March*
Department of Chemistry, University of Alabama at Birmingham, Birmingham, Alabama 35205, USA. E-mail: march@uab.edu
First published on 7th July 2014
The increased use of video-capable cellular phones to document everyday life presents educators with an exciting opportunity to extend this capability into the introductory laboratory. This study assessed whether students enrolled in a southeastern U.S. university's first-year laboratory course retained technical information at a higher rate after creating a technique video. The videos were created on hand-held, video-capable devices that students owned prior to enrolling in the course, eliminating additional cost to students. Pre-/post-test analysis (N = 509) was performed to determine short- and long-term learning gains in reporting the volume of graduated glassware to the proper number of significant figures. Though both groups used various graduated glassware throughout the term, chi-square analysis showed that students who created a video detailing the use of a Mohr pipet reported the volume of graduated glassware correctly on the final exam and laboratory practical at a significantly higher rate than students who received only verbal instruction on the technique.
The explosion of social media in the past 5 years, including Facebook, YouTube, and Twitter, along with the development of personal electronic devices, has given the current generation of students access to learning technologies that were unavailable to previous generations. Personal phones are widely popular, and a large number of devices are manufactured and purchased annually (Global Mobile Statistics, 2013). Williams and Pence propose that the use of cellular phones or portable devices will impact chemical education (and society) in greater ways than the introduction of the personal computer (Williams and Pence, 2011). Additionally, the Horizon Report, an annual report that summarizes research and discussion on current issues in technology and education from publications and the internet, recognized cellular phones as an emerging technology for teaching and learning because of their wide-ranging capabilities, including video capture and data transfer (Johnson et al., 2011).
A survey of currently available cellular phones shows that even the most basic of these devices is capable of capturing video. This video can be transferred as a data file by docking the phone with a computer, by accessing the phone's internal memory card, or through wireless data transfer. For students who do not own an adequate phone or have difficulty transferring data from its storage device, inexpensive point-and-shoot cameras are an easy and readily available option. Our institution offers students video equipment on short-term loan, though no student in the study took advantage of this opportunity.
Technology has long been a part of the chemistry laboratory curriculum (March et al., 2000; Winberg and Berg, 2007). Specifically, video technology has been used for everything from training (Pantaleo, 1975) and demonstrations (Fortman and Battino, 1990) to self-reflection (Veal et al., 2009). Videos have been used extensively for in-laboratory instruction for a variety of chemistry courses, including upper level courses in physical (Rouda, 1973) and analytical chemistry (Williams, 1989). Searches through YouTube's internal search engine for standard laboratory techniques result in a number of useful tutorial videos (ChemLab, 2013; MIT, 2013). These videos are largely instructor/institution produced, with little to no student involvement. Instructor-produced pre-laboratory videos are a valuable asset in the classroom and have been shown to improve laboratory techniques and retention of information (DeMeo, 2001).
Despite this literature demonstrating the effectiveness of instructor-generated videos as a teaching and learning tool, the chemical education literature provides only a few descriptions of student-generated videos. Initial studies involving multimedia laboratory reports indicate that students are willing to report their results via less conventional means (Jenkinson and Fraiman, 1999). Student-authored videos on biochemistry topics were used in a second-year undergraduate course to engage students in their own learning (Ryan, 2013). That study found that the students were more engaged in their own learning, perceived deeper learning, and enjoyed working in groups. Lancaster describes the effectiveness of student-authored vignettes in which students use Camtasia Studio to prepare short review presentations on topics required on a final examination (Lancaster, 2014); examination results suggest a positive relationship between the introduction of the vignettes and the passing rate. A conference paper describes how student-generated videos allow the instructor to identify students' knowledge gaps and misconceptions (Niemczik et al., 2013). An additional paper shows that having teachers (as students in a professional development program) prepare videos improves their self-efficacy (Blonder et al., 2013). These results are consistent with observations in disciplines outside of chemistry (Hirschel et al., 2012; McCullagh, 2012), where the impact of the video is increased when students review themselves performing the task. Thus, student-created videos have been shown to have positive effects on student learning, but none of these studies focused on laboratory techniques or instruction.
Both the control and experimental groups created a video during the same week of the study during their scheduled laboratory period, but the technique described differed between the groups. The videos were not stand-alone assignments; both video topics were necessary to complete that week's laboratory assignment. Apart from the video assignment itself, students in both groups were expected to use the equipment the same number of times throughout the semester. The 16 sections offered in fall 2012 were divided into two groups: 9 sections formed the control group (N = 276), which created an analytical balance video, and 7 sections formed the treatment group (N = 233), which created a Mohr pipet video. Group assignments were made on the basis of the day/time that the laboratory section met, so students were not able to choose the content of the video they recorded. Assigning treatment/control on the basis of the day/time of the week resulted in some teaching assistants leading sections in both groups.
Fig. 1 Project timeline for the 15 week fall 2012 semester, indicating key events relative to the project.
Students published their videos to YouTube. Students who did not wish to make their video public were instructed to choose YouTube's ‘unlisted’ option, which allows the video to be published but viewed only by individuals with a direct link. One student with additional privacy concerns was accommodated by having the video viewed from a personal laptop during office hours. YouTube was selected as the submission platform because it converts a large number of file types to a single player that our teaching assistants were familiar with and offers help with file types and uploading; thus, teaching assistants did not have to learn to use multiple video players. Videos were graded by one of the section's teaching assistants the week after submission, and feedback was provided to the students during the next laboratory meeting via a scored rubric. Students submitting a video with gross technical errors received a rubric with additional written comments from their TA and were encouraged to edit their video, reshooting if necessary, and to resubmit it for grading.
Box 1. Pre-test item and coding criteria
Two post-tests were administered to measure short- and long-term learning gains. The short-term post-test was given in week 4 (two weeks after treatment). The post-test item was included as a single item on a multi-item pre-laboratory activity quiz (Box 2, post-test item). Again, students were given 10 minutes to complete the entire quiz. Longer-term gains were measured as a single item on the written final exam (Box 2, final examination item) and direct observation as part of a laboratory practical exam in week 15 (thirteen weeks after treatment). Students were instructed to review both the laboratory manual and their notes to prepare for both the written exam and practical exam. All students completed the written exam prior to starting the practical exam.
Box 2. Post-test and final exam items and coding criteria. The same image was presented for both prompts
For the laboratory practical exam, students were instructed that they would need to determine the density of a solution and provide data concerning both the accuracy and precision of their measurements. They were not allowed to bring their notebook or other notes. Stations were set up to mimic the week 2 density determination activity and included a balance, a 10 mL Mohr pipet, a 100 mL graduated cylinder, and beakers. Each laboratory section had outside proctors, who had been trained to monitor technique through a series of training videos. Proctors were not placed in observation positions until their scores on the training videos matched calibrated scores established by three different instructors. Proctors were assigned to laboratory sections but were not told which video the students had prepared. These proctors used a rubric to record whether students chose to use the Mohr pipet and whether they used their chosen glassware and the balance correctly. Students were given a worksheet on which they recorded data and performed calculations. Proper use of significant figures was assessed from the data recorded on the student worksheet and the observations of the proctors. All laboratory practical exam materials can be found in Appendix IV of the ESI.†
For the pre- and post-test, each response was coded using a 2/1/0 score. As the tasks required in the pre- and post-test items differed slightly, the skills required in each item were classified according to the revised Bloom's taxonomy domains: (2) remember and apply/analyze, (1) remember, and (0) incorrect (other responses) (Anderson et al., 2001; Krathwohl, 2002). On the practical, a student who used the pipet and recorded the volume to two decimal places was scored a (2); a student who used the pipet but recorded the volume incorrectly (an incorrect number of significant figures, or a beginning/final meniscus outside the graduation marks) was scored a (1); and a student who used the incorrect glassware was scored a (0).
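As a minimal sketch of the practical-exam coding scheme just described (the function name and signature are hypothetical, not code from the study), the decision rule could be written as:

```python
def score_practical(used_pipet, volume_str):
    """Code one practical-exam observation on the 2/1/0 scale:
    2 = pipet used and volume recorded to two decimal places,
    1 = pipet used but volume recorded incorrectly,
    0 = incorrect glassware chosen.
    (Hypothetical helper for illustration only.)"""
    if not used_pipet:
        return 0
    # count the digits recorded after the decimal point
    decimals = len(volume_str.split(".")[1]) if "." in volume_str else 0
    return 2 if decimals == 2 else 1

print(score_practical(True, "7.25"))  # pipet used, two decimal places -> 2
print(score_practical(True, "7.3"))   # pipet used, one decimal place  -> 1
print(score_practical(False, "9.1"))  # wrong glassware                -> 0
```

Note that this simplified check captures only the decimal-place criterion; the meniscus criteria in the rubric require a proctor's direct observation.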
Each student enrolled in the course was expected to spend similar time on task, so both the control and experimental groups were expected to show improvement in reporting the proper number of significant figures. Thus, McNemar's chi-square and odds ratios were calculated within each group to evaluate any differences in performance within the groups. This statistic is commonly used in pre-/post-test designs to compare and determine significance between the numbers of students whose scores increased, decreased, or remained the same (Elliott and Woodward, 2007). Between-groups differences were analyzed using chi-square (χ2) tests between the pre-/post-test, pre-test/written final exam, and post-test/written final exam. The chi-square statistic was selected because it uses the frequency data from a sample to evaluate the relationship between the variables in a population and is one of the most common nonparametric statistics (Gravetter and Wallnau, 2009). Chi-square calculations were performed using a 3 × 2 contingency table to assess whether the increase in the number of students who used the pipet and recorded the proper number of significant figures differed between the groups as a result of the treatment. Given the size of the data pool and the degrees of freedom, the minimal requirements for all cells of the contingency tables were met, so the correction for continuity was not necessary and was not applied. Statistical analyses were conducted using the Statistical Package for the Social Sciences (SPSS version 20 for Windows). Tabulated statistical data can be found in the ESI† (see Appendices V–VII).
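For readers unfamiliar with these tests, the following pure-Python sketch shows what the two statistics compute (the study itself used SPSS; the counts below are hypothetical examples, not the study's data). McNemar's test uses only the discordant pairs, b (incorrect → correct) and c (correct → incorrect), while the between-groups test uses a full table of observed frequencies:

```python
def mcnemar_chi2(b, c):
    """McNemar's chi-square without continuity correction:
    b = students who changed incorrect -> correct,
    c = students who changed correct -> incorrect."""
    return (b - c) ** 2 / (b + c)

def odds_ratio(b, c):
    """Matched-pairs odds ratio from the same discordant counts."""
    return b / c

def chi2_contingency(table):
    """Pearson chi-square for an r x k table of observed frequencies,
    summing (observed - expected)^2 / expected over all cells."""
    rows = [sum(r) for r in table]
    cols = [sum(col) for col in zip(*table)]
    n = sum(rows)
    return sum((obs - rows[i] * cols[j] / n) ** 2 / (rows[i] * cols[j] / n)
               for i, r in enumerate(table) for j, obs in enumerate(r))

# hypothetical example: 80 students improved, 20 regressed pre- to post-test
print(mcnemar_chi2(80, 20))  # 36.0
print(odds_ratio(80, 20))    # 4.0
```

In practice a library routine (e.g., SPSS or SciPy) would also supply the p-value; the point here is only that the within-group tests depend solely on who changed answers, not on who stayed the same.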
Key points regarding data collection along the semester are presented in the timeline shown in Fig. 1. During week 2, the pre-test was administered prior to all students filming videos detailing the use of either a Mohr pipet or an analytical balance. The post-test was administered during week 4, after students had edited and submitted the first draft of their videos. The longitudinal post-test data point (the written final exam) was administered the last week of the semester (week 15) along with the practical examination.
Of the 10 labs performed during the course, 8 required estimating the last decimal place when reading a volume. Mohr pipets were used to transfer solutions in 3 of the 10 weeks, but students also used burets (2 of 10 weeks) and graduated cylinders (at least 5 of 10 weeks) throughout the course. Teaching assistants demonstrated the proper use of each piece of glassware at least once during the term, though the rigor of the presentation admittedly varied among TAs. The large sample size is expected to mitigate this variability in treatment.
| Study period | Statistical test | Pipet group (N = 233) | Balance group (N = 276) |
|---|---|---|---|
| Pre-test to post-test | McNemar's χ2 | 43.2 | 25.3 |
| | Odds ratio | 4.6 | 2.6 |
| Pre-test to final | McNemar's χ2 | 82.6 | 84.8 |
| | Odds ratio | 11.1 | 9.8 |
| Post-test to final | McNemar's χ2 | 13.3 | 23.3 |
| | Odds ratio | 2.6 | 3.1 |
| Pre-test to practical | McNemar's χ2 | 21.3 | 8.8 |
| | Odds ratio | 2.5 | 0.6 |
| Final to practical | McNemar's χ2 | 31.6 | 130.5 |
| | Odds ratio | 5.1 | 36.0 |
The influence of the pipet video is tempered between the post-test and the written final examination (OR = 2.6, McNemar's χ2 = 13.3 (pipet) and OR = 3.1, McNemar's χ2 = 23.3 (balance)). These odds ratios indicate that the number of students providing correct answers increased for both groups by the end of the semester. This observation is not unexpected since both groups used graduated glassware throughout the semester.
Fig. 2 shows the number of students who were able to improve their performance from an incorrect to a correct answer between the pretest and subsequent tests (i.e., the number of students who fell into the ‘no/yes’ category in the McNemar's chi-square table).
Fig. 2 The number of students who answered incorrectly on the pretest but correctly on a subsequent test. All results are statistically significant.
Between-group analyses of short-term learning gains showed that creation of a technique video had only a marginal effect on student performance on the post-test; the χ2 statistic of 3.29 (p = 0.193) did not reach statistical significance. However, given that this assessment method appeals to a group of learners that is often overlooked in the chemistry laboratory environment (i.e., bodily/kinesthetic learners) (Gardner, 1983), the positive trend provides opportunities for further studies that include learning style preferences as another variable (Fig. 3).
Fig. 3 Comparison of student performance on all of the test items analyzed. Statistically significant differences are starred. Particular attention should be paid to the results of the practical.
Additional between-group analyses show that the pipet group answered the assessment item correctly on the written final exam at a higher rate than the balance group (81% versus 72%, χ2 = 9.78, p = 0.008). Though this finding is encouraging, the performance on the laboratory practical exam is the more important observation: the item on the written final exam was similar to the post-test item, so responses on the written final exam could have been influenced by the availability of the graded quiz as a study guide. During the practical exam, students in the pipet group used a Mohr pipet to transfer the solution at a higher rate (91%) than students in the balance group (75%) (χ2 = 50.53, p < 0.001). Thus, though both groups spent the same amount of time using the pipet earlier in the term to determine the density, the students who made a pipet video selected the intended piece of glassware at a significantly higher rate. Additionally, among students who used the Mohr pipet for the volume determination, the correct significant digits were reported at a much higher rate in the pipet group (74%) than in the balance group (46%) (χ2 = 34.77, p < 0.001), indicating that creation of the video leads to an increase in proper application of the techniques the video details.
In addition to the data collected on significant figures related to volume, items were included on the pre- and post-tests probing students' familiarity with the significant figures associated with the analytical balance. As with the volume data, no differences in a priori knowledge were found between the groups. Additionally, there was no significant difference in growth between the groups on items related to using an analytical balance, likely because the balances in question have a digital mass readout and require no estimation on the students' part.
Though the majority of student groups submitting videos successfully demonstrated proper technique in their initial submission, a potential limitation of the results lies in requiring groups to re-shoot their videos, or portions of them, when gross technical errors were observed. Collecting this additional footage required additional practice with the pipet that students in the control (balance) group did not perform, so time on task was not equivalent between the groups. The improved performance on the practical may therefore be attributed to the instructor's feedback and the students' corrective action, and not simply to making the video. However, asynchronous monitoring is not possible without the video, so this corrective action would likely never have occurred otherwise; thus, there could be an indirect benefit from the video preparation.
Student-created video assessments offer instructors the ability to observe technique and offer critique asynchronously, thus ensuring that the instructor can observe and provide feedback to all students in the laboratory. Jones et al. state that feedback can be seen as ‘the end of a cycle of learning and the beginning of the next’ and describe a method of assessment using screen capture digital video to provide feedback (Jones et al., 2012). In this method, students receive instructor feedback on their assignments via short screen capture videos created by the instructor. Thus, videos could be potentially useful in large enrolment laboratory courses, with the additional benefit of requiring equipment that the student already owns and software that can be obtained free of charge.
Many laboratory course designs require students to perform tasks in small groups. One potential problem in any such cooperative learning setting is the ‘hitch hiker’, the individual who defers to the ‘good’ student to complete the work (Cooper, 1995). In the laboratory, this problem often leads to the most technically astute member of the group taking responsibility for the majority of the data collection. Ensuring involvement and equal participation among all group members is difficult because of the extent of observation required, a difficulty exacerbated when graduate teaching assistants, who have limited experience with classroom management, are responsible. Video could be used to address the ‘hitch hiker’ issue: requiring all students to appear in the video enforces a minimum level of participation that is often difficult to monitor in a normal classroom environment.
An exciting potential application of student-created videos is in the distance-learning laboratory course setting. At the authors' institution and others, enrolment is ever increasing while infrastructure remains largely unchanged, a phenomenon that has led a number of institutions to increase their offerings of online courses (Phipps, 2013). Despite these growing pains, the contentious issue of the laboratory experience remains one of the largest obstacles to implementing online chemistry courses (Pienta, 2013). Safety, expense, retention, and academic rigor further complicate the online laboratory course environment (Patterson, 2000; Boschmann, 2003; Hoole and Sithambaresan, 2003; Casanova et al., 2006). Provided safety precautions are in place, video reports may have value in distance-learning environments where the instructor is not physically present, as they provide an active-learning method of assessment. Instructors of these courses could ensure that off-campus students are performing their own experiments by requiring them to create videos of themselves safely performing laboratory activities. In this model, the student-created video would be used for more than analysing technique; it would also ensure that students individually perform their tasks, allowing instructors to verify that the technical aspect of the laboratory experience is met. The authors' data, specifically the high chi-square value associated with the application of technique, imply that creating a video could lead to increased retention and more meaningful learning in the online laboratory setting, thus alleviating some of the academic rigor concerns.
This study also allows us to consider the development of a list of techniques that could be included in a student's personal electronic library for use in other laboratory courses or in the research laboratory. These techniques could be as simple as the proper use of a piece of glassware or much more technical, such as the use of an HPLC. A number of techniques and instruments are common across scientific fields, and videos of these techniques could be retrieved from the student's e-portfolio when needed.
Footnotes
† Electronic supplementary information (ESI) available: Survey results, statistical information, grading rubrics and the Chemistry video shooting guide. See DOI: 10.1039/c4rp00107a |
‡ Laboratory courses met only during the 5 day weeks of the semester. Students were not asked to meet during holiday weeks or during the week of final exams. |
This journal is © The Royal Society of Chemistry 2014 |