Video reports as a novel alternate assessment in the undergraduate chemistry laboratory

Mitzy A. Erdmann and Joe L. March *
Department of Chemistry, University of Alabama at Birmingham, Birmingham, Alabama 35205, USA. E-mail: march@uab.edu

Received 14th May 2014 , Accepted 7th July 2014

First published on 7th July 2014


Abstract

The increased use of video-capable cellular phones to document everyday life presents educators with an exciting opportunity to extend this capability into the introductory laboratory. The study assessed whether students enrolled in a southeastern U.S. university's first-year laboratory course retained technical information at a higher rate after creating a technique video. These videos were created on hand-held, video-capable devices that students owned prior to enrolling in the course, eliminating additional cost to students. Pre-/post-test analysis (N = 509) was performed to determine short- and long-term learning gains regarding reporting the volume of graduated glassware to the proper number of significant figures. Though both groups used various graduated glassware throughout the term, chi-square analysis showed that students who created a video detailing use of a Mohr pipet reported the volume of graduated glassware correctly on the final exam and laboratory practical at a significantly higher rate than those students who received only verbal instruction on the technique.


Introduction

Laboratory instruction has been considered an essential component of chemistry instruction since 1927 (DeMeo, 2001). The goals for general chemistry laboratory courses are generally a balance between training students in proper laboratory techniques and the development of research (critical thinking) skills (Lloyd, 1992). Even courses that are focused on critical thinking skills require students to take measurements in order to collect data that will allow them to draw accurate conclusions. While proper technique is implied and often demonstrated in these approaches, there is opportunity for error to be introduced the first time a student performs a technique.

The explosion of social media in the past five years, including Facebook, YouTube, and Twitter, along with the development of personal electronic devices, has given the current generation of students access to learning technologies that previous generations did not have. Personal phones are widely popular, and a large number of devices are manufactured and purchased annually (Global Mobile Statistics, 2013). Williams and Pence propose that the use of cellular phones or portable devices will impact chemical education (and society) in greater ways than the introduction of the personal computer (Williams and Pence, 2011). Additionally, the Horizon Report, an annual report that summarizes research and discussion on current issues in technology and education from publications and the internet, recognized cellular phones as an emerging technology for teaching and learning because of their wide-ranging capabilities, including video capture and data transfer (Johnson et al., 2011).

A survey of currently available cellular phones shows that even the most low-tech of these devices is capable of capturing video. This video can be transferred as a data file by docking the phone with a computer, accessing an internal memory card, or through wireless data transfer. For those students who do not own an adequate phone or have difficulty transferring data from the device, inexpensive point-and-shoot cameras are an easy and readily available option. Our institution offers students video equipment on short-term loan, though no student in the study took advantage of this opportunity.

Technology has long been a part of the chemistry laboratory curriculum (March et al., 2000; Winberg and Berg, 2007). Specifically, video technology has been used for everything from training (Pantaleo, 1975) and demonstrations (Fortman and Battino, 1990) to self-reflection (Veal et al., 2009). Videos have been used extensively for in-laboratory instruction for a variety of chemistry courses, including upper level courses in physical (Rouda, 1973) and analytical chemistry (Williams, 1989). Searches through YouTube's internal search engine for standard laboratory techniques result in a number of useful tutorial videos (ChemLab, 2013; MIT, 2013). These videos are largely instructor/institution produced, with little to no student involvement. Instructor-produced pre-laboratory videos are a valuable asset in the classroom and have been shown to improve laboratory techniques and retention of information (DeMeo, 2001).

Despite this literature demonstrating the effectiveness of instructor-generated videos as a teaching and learning tool, the chemical education literature provides only a few descriptions of student-generated videos. Initial studies involving multimedia laboratory reports indicate that students are willing to report their results via less conventional means (Jenkinson and Fraiman, 1999). Student-authored videos on biochemistry topics were used in a second year undergraduate course to engage students in their own learning (Ryan, 2013). This study found that the students were more engaged in their own learning, perceived deeper learning, and enjoyed working in groups. Lancaster describes the effectiveness of student-authored vignettes where students use Camtasia Studio to prepare short review presentations on topics required on a final examination (Lancaster, 2014). Passing marks suggest a positive relationship between the introduction of the vignettes and the passing rate. A conference paper describes how student-generated videos allow the instructor to identify students' knowledge gaps and misconceptions (Niemczik et al., 2013). An additional paper shows that having teachers (as students in a professional development program) prepare videos improves their self-efficacy (Blonder et al., 2013). These results are consistent with observations in disciplines outside of chemistry (Hirschel et al., 2012; McCullagh, 2012), where the impact of the video is increased when the student reviews themselves performing the task. Thus, student-created videos have been shown to have positive effects on student learning, but none of these studies have focused on laboratory techniques or instruction.

Context and rationale for this study

Students' ability to correctly report significant figures when reading a meniscus has traditionally been poor at the authors' institution. Analysis of laboratory final exams collected over a number of semesters indicates that students have difficulty understanding the need to estimate between the graduations (only 30% of students report the volume using the proper significant figures on final examinations). The frequent use of laboratory glassware that requires such a skill offered an opportunity to relate a laboratory technique with the creation of technique videos. Moreover, these videos could be created alongside laboratory activities that were already a part of the curriculum. The introductory exercise in our laboratory sequence requires students to develop an experimental procedure to determine the density of an unknown liquid (guidelines are provided and the instructor demonstrates the use of the equipment, but a detailed step-by-step procedure is not presented to students). The expected procedure requires the transfer of the liquid using glassware that has graduated volumetric marks and a mass measurement using an electronic balance. The glassware provided includes a 10 mL Mohr pipet, a 100 mL graduated cylinder, and a beaker. Class data and discussion of standard deviation and error analysis is expected to lead students to select the pipet when transferring small volumes of liquid. Because of this, the proper use of the Mohr pipet was chosen as the first video topic.

The research question

The work presented in this paper seeks to determine whether the data support the hypothesis that students who create a video detailing the proper use of a Mohr pipet as part of a laboratory exercise report a volume accurately and to the correct number of significant figures more frequently than students who complete the same laboratory exercise without preparing the pipet video (having prepared a video on using a balance instead).

Methods

To answer the research question, we integrated the creation of videos into the existing laboratory curriculum and used a pre-/post-test research design to analyze how students reported the significant figures associated with reading a volume. Assessment items were included on written quizzes and exams, and direct observations were made during a laboratory practical exam. All students enrolled in the course participated in the study. Informed consent was explained to all participants, though signed forms were not collected in an effort to maintain anonymity. Students were assigned to either the balance video (control) group or the pipet video (experimental) group based on lab section. Test items and observation rubrics were collected and coded, and statistical analysis was performed.

Formative surveys

A formative survey was developed to identify possible barriers to implementing a video requirement in the laboratory. All students enrolled in the introductory course during the summer 2011, fall 2011 and spring 2012 semesters were asked to complete a formative survey related to the availability of video cameras and their perceived self-efficacy with video capture, editing, and transfer (N = 799). These surveys were collected after informed consent was given. Students completing the formative survey were not included in the current study; the survey was administered in advance because we did not wish to place a graded requirement in the syllabus that a significant number of students could not complete. Students were also asked to indicate their preference for video assignments over more traditional assignments (written reports, exams, etc.). The results of these surveys indicated that the students were willing and able to prepare videos as part of the course requirements. Survey items and results can be found in the ESI (see Appendix I).

Subjects

Data were collected in the fall 2012 semester at a public university in the southeastern United States. All students in the study were enrolled in the first part of a two-term introductory chemistry laboratory sequence. The majority of students (95%) enrolled were STEM or pre-health majors. The course is a stand-alone laboratory course requiring students to master both conceptual and technical items. Students are not required to be co-registered in a lecture course, but most (98%) are co-registered for or have completed the corresponding first semester lecture. Students were instructed to prepare for the laboratory activities by reviewing a laboratory manual custom-published by the authors (March, 2012) and by reviewing texts and online resources associated with a list of suggested topics. The laboratory met for a three-hour block once a week for twelve weeks of a fifteen-week semester. In the fall of 2012, 16 sections were offered with an initial enrollment of 619 students (5.5% of the university student body). Each section had a 39:2 student-to-instructor ratio. All sections were led by at least one graduate teaching assistant, and occasionally upper-level undergraduate students who had performed exceptionally well in the laboratory were assigned to assist as the second instructor. Only those students completing all components of the study (pre-test, post-test, written final exam, and laboratory practical exam) were included in the final analysis. Of the 619 students initially enrolled, 509 were present for all four of these testing periods.

Both the control and experimental groups created a video during the same week of the study during their scheduled laboratory period, but the technique chosen to be described differed between the groups. The videos were not stand-alone assignments, and both of the video topics chosen were necessary to complete the laboratory assignment for the week. Despite the creation of the video, students in both groups were expected to use the equipment the same number of times throughout the semester. The 16 sections offered in fall 2012 were divided into two groups; 9 sections were in the control group (N = 276) which created an analytical balance video and 7 sections were in the treatment group (N = 233) which created a Mohr pipet video. Group assignments were made on the basis of the day/time that the laboratory section met, so students were not able to choose the content of the video they recorded. Assigning treatment/control on the basis of the day/time of the week resulted in some teaching assistants leading both groups.

Video production and submission

Detailed grading rubrics, video recording, and editing tutorials were provided to students via the course management website at the start of the term (see Appendices II and III, ESI). Students were instructed to review these materials as their pre-laboratory assignment in the week prior to video production. Student groups created videos in parallel to a standard laboratory exercise, and the creation of the video was presented to students as a complementary activity to the standard laboratory assignment. The laboratory exercise required students to determine the density of an aqueous solution using an analytical balance and volumetric glassware, including a Mohr pipet. The assignment offered an equipment list to students but did not give them explicit instruction on determining the density. Videos were recorded in groups of 3–4 students. Though formation of laboratory groups is at the discretion of the TAs, the majority of students are allowed to choose their laboratory partners. Each student had to perform the technique, and it had to be obvious that each student was present in the video (i.e., simply showing the student's hand was not sufficient). Make-up periods were provided in the event that students later realized they needed additional footage. Editing requirements were minimal, but time limits were placed on the video both to avoid lengthy segments where students are simply standing around and to limit the amount of time teaching assistants spent grading videos. Students were instructed to perform any edits outside of class time, and they had two weeks to edit the first version of their video (Fig. 1).
Fig. 1 Project timeline. The 15 week fall 2012 semester indicating key events relative to the project.

Students published their videos to YouTube. Students who did not wish to make their video public were instructed to choose YouTube's ‘unlisted’ option, which allows the video to be published while only individuals with a direct link to the video are able to view it. We did accommodate one student who was concerned about privacy by viewing the video from a personal laptop during office hours. YouTube was selected as a submission platform because it converts a large number of file types for a player that our teaching assistants were familiar with and offers help with file types and uploading. Thus, teaching assistants did not have to learn how to use multiple video players. Videos were graded by one of the section's teaching assistants the week after submission. Feedback was provided to the students during the next laboratory meeting via a scored rubric. Students submitting a video with gross technical errors received a rubric with additional written comments from their TA and were encouraged to edit their video, reshooting if necessary, and to submit it for re-grading.

Test items

A pre-test was administered as a single item (Box 1) on a five-question pre-laboratory quiz during week 2, the week that the pipet was used for the first time (Fig. 1). Students were provided information in the laboratory manual related to the meniscus, graduations, and estimation, but the quiz was administered prior to the pre-laboratory lecture or demonstration. Students were given 10 minutes to complete the quiz.

Box 1. Pre-test item and coding criteria



Two post-tests were administered to measure short- and long-term learning gains. The short-term post-test was given in week 4 (two weeks after treatment). The post-test item was included as a single item on a multi-item pre-laboratory activity quiz (Box 2, post-test item). Again, students were given 10 minutes to complete the entire quiz. Longer-term gains were measured by a single item on the written final exam (Box 2, final examination item) and by direct observation during the laboratory practical exam in week 15 (thirteen weeks after treatment). Students were instructed to review both the laboratory manual and their notes to prepare for the written and practical exams. All students completed the written exam prior to starting the practical exam.


Box 2. Post-test and final exam items and coding criteria. The same image was presented for both prompts



For the laboratory practical exam, students were instructed that they would need to determine the density of a solution and provide data concerning both the accuracy and precision of their measurements. They were not allowed to bring their notebook or other notes. Stations were set up to mimic the week 2 density determination activity and included a balance, a 10 mL Mohr pipet, a 100 mL graduated cylinder, and beakers. Each laboratory section had outside proctors (also known as supervisors or monitors), who had been trained to monitor technique through a series of training videos. Proctors were not placed in observation positions until their scores on training videos met calibrated scores established by three different instructors. Proctors were assigned to laboratory sections, but were not told which video the students had prepared. These proctors used a rubric to record whether students chose the Mohr pipet and whether they used their chosen glassware and the balance correctly. Students were given a worksheet on which they recorded data and performed calculations. Proper use of significant figures was based on the data recorded on the student worksheet and the observations of the proctors. All laboratory practical exam materials can be found in Appendix IV of the ESI.

Data collection and analysis

Written quizzes and the written final exam were administered and collected by the teaching assistants for the individual laboratory sections. Materials were scanned into PDF form by the teaching assistants and sent to the authors before being graded. Electronic copies of all test items were stored in a password protected folder. Each student was given an alphanumeric code so that individual student progress could be followed. Student data presented here was collected under the guidance of the University's Institutional Review Board for Human Use (Erdmann and March, 2011).

For the pre- and post-test, each response was coded on a 2/1/0 scale. As the tasks required in the pre- and post-test items differed slightly, the skills required in each item were classified according to the revised Bloom's taxonomy domains: (2) remember and apply/analyze, (1) remember, and (0) incorrect (other responses) (Anderson et al., 2001; Krathwohl, 2002). On the practical, a response was scored (2) if the student used the pipet and recorded the volume to two decimal places, (1) if the student used the pipet but recorded the volume incorrectly (an incorrect number of significant figures, or a beginning/final meniscus outside the graduation marks), and (0) if the student used the incorrect glassware.
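The 2/1/0 coding of the practical-exam observations can be sketched as a small scoring function. This is a hypothetical illustration of the decision rule only; the parameter names are our own, not fields from the study's rubric:

```python
def code_practical_response(used_pipet: bool, decimal_places: int,
                            meniscus_in_range: bool) -> int:
    """Map a proctor's observation to the 2/1/0 practical-exam code.

    2 - used the Mohr pipet and recorded the volume to 2 decimal places,
        with the meniscus within the graduation marks
    1 - used the pipet but recorded the volume incorrectly
    0 - used the incorrect glassware
    """
    if not used_pipet:
        return 0
    if decimal_places == 2 and meniscus_in_range:
        return 2
    return 1
```

Any response failing the significant-figure or meniscus criterion falls through to the middle score, mirroring the rubric's ordering of glassware choice before reading technique.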

Each student enrolled in the course was expected to spend similar time on task, and as such both the control and experimental groups were expected to show improvement in reporting the proper number of significant figures. Thus, McNemar's chi-square statistics and odds ratios were calculated within each group to evaluate any differences in performance within the groups. This statistic is commonly used in pre-/post-test designs to determine whether the number of students whose scores increased, decreased or remained the same differs significantly (Elliott and Woodward, 2007). Between-group differences were analyzed using chi-square (χ²) tests between the pre-/post-test, pre-test/written final exam, and post-test/written final exam. The chi-square statistic was selected because it uses frequency data from a sample to evaluate the relationship between variables in a population and is one of the most common nonparametric statistics (Gravetter and Wallnau, 2009). Chi-square calculations were performed on a 3 × 2 contingency table to assess whether the increase in the number of students who used the pipet and recorded the proper number of significant figures differed between the groups as a result of the treatment. Given the size of the data pool and the degrees of freedom, the minimal requirements for all cells of the contingency tables were met, so the correction for continuity was not necessary and was not applied. Statistical analyses were conducted using the Statistical Package for the Social Sciences (SPSS version 20 for Windows). Tabulated statistical data can be found in the ESI (see Appendices V–VII).
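The within- and between-group statistics described above can be sketched in a few lines of Python. This is a minimal illustration of the formulas only; the counts below are arbitrary placeholders, not the study's data, and the study itself used SPSS:

```python
def mcnemar(b, c):
    """McNemar's chi-square (without continuity correction, as in the study)
    and the odds ratio, computed from the discordant cells of a paired table.
    b: incorrect on pre-test, correct on post-test
    c: correct on pre-test, incorrect on post-test"""
    return (b - c) ** 2 / (b + c), b / c

def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table
    of observed frequencies (no continuity correction)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Within-group: illustrative discordant counts (not the study's data)
stat, odds = mcnemar(b=46, c=10)

# Between-group: a 3 x 2 table of 2/1/0 scores by group (illustrative counts)
between = chi_square([[120, 100],   # score 2: pipet, balance
                      [60, 90],     # score 1
                      [53, 86]])    # score 0
```

Significance is then judged by comparing the statistic against the χ² critical value for the appropriate degrees of freedom; for the 3 × 2 table, (rows − 1) × (columns − 1) = 2.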

Key points regarding data collection along the semester are presented in the timeline shown in Fig. 1. During week 2, the pre-test was administered prior to all students filming videos detailing the use of either a Mohr pipet or an analytical balance. The post-test was administered during week 4, after students had edited and submitted the first draft of their videos. The longitudinal post-test data point (the written final exam) was administered the last week of the semester (week 15) along with the practical examination.

Of the 10 labs performed during the course, 8 required estimating the last decimal place when reading a volume. Mohr pipets were used to transfer solutions in 3 of the 10 weeks, but students also used burets (2 of 10 weeks) or graduated cylinders (at least 5 of 10 weeks) throughout the course. Teaching assistants demonstrated the proper use of each piece of glassware at least once during the term, though the rigor of the presentation admittedly varied among TAs. The large sample size is expected to mitigate this variability in treatment.

Results and discussion

Between-group analysis of the pre-test results from the balance/pipet groups shows no significant difference (χ² = 0.6051, p = 0.739) in prior knowledge of volume measurements or significant figures between the two groups, so corrections for a priori knowledge were not performed. As both groups performed identical tasks throughout the semester, it is not surprising that within-group analyses indicate that both groups showed improved performance on the post-test and the final exam. The post-test odds ratios (OR) (Table 1) indicate that the preparation of the pipet video does have some influence on the experimental group's performance even though the percent correct is similar. Students who created a pipet video were 4.6-fold more likely (McNemar's χ² = 43.2) to increase their performance between the pre-test and post-test, while students who created a balance video were only 2.6-fold more likely (McNemar's χ² = 25.3). It is important to note that the difference between the two ORs does not indicate the magnitude of this influence (i.e., it is not a doubling effect).
Table 1 McNemar's chi-square analysis within groups for the pre-, post- and final examination data points. All probabilities are statistically significant at the 99% confidence level (p < 0.01)

Study period            Statistical test   Pipet group (N = 233)   Balance group (N = 276)
Pre-test to post-test   McNemar's χ²       43.2                    25.3
                        Odds ratio         4.6                     2.6
Pre-test to final       McNemar's χ²       82.6                    84.8
                        Odds ratio         11.1                    9.8
Post-test to final      McNemar's χ²       13.3                    23.3
                        Odds ratio         2.6                     3.1
Pre-test to practical   McNemar's χ²       21.3                    8.8
                        Odds ratio         2.5                     0.6
Final to practical      McNemar's χ²       31.6                    130.5
                        Odds ratio         5.1                     36.0


The influence of the pipet video is tempered between the post-test and the written final examination (OR = 2.6, McNemar's χ² = 13.3 (pipet) and OR = 3.1, McNemar's χ² = 23.3 (balance)). These odds ratios indicate that the number of students providing correct answers increased for both groups by the end of the semester. This observation is not unexpected, since both groups used graduated glassware throughout the semester.

Fig. 2 shows the number of students who were able to improve their performance from an incorrect to a correct answer between the pretest and subsequent tests (i.e., the number of students who fell into the ‘no/yes’ category in the McNemar's chi-square table).


Fig. 2 The number of students who answered incorrectly on the pretest but correctly on a subsequent test. All results are statistically significant.

Between-group analyses of short-term learning gains showed that creation of a technique video had only a marginal effect on post-test performance (χ² = 3.29, p = 0.193), a difference that does not reach conventional significance. However, given that this assessment method appeals to a group of learners that is often overlooked in the chemistry laboratory environment (i.e., bodily/kinesthetic learners) (Gardner, 1983), the positive trend between assessment and learning gains provides opportunities for further studies that include learning style preferences as another variable (Fig. 3).


Fig. 3 Comparison of student performance on all of the test items analyzed. Statistically significant differences are starred. Particular attention should be paid to the results of the practical.

Additional between-group analyses show that the pipet group answered the assessment item correctly on the written final exam at a higher rate than the balance group (81% versus 72%, χ² = 9.78, p = 0.008). Though this finding is encouraging, the performance on the laboratory practical exam is the more important observation. The item on the written final exam was similar to the post-test item, so responses on the written final exam could have been influenced by the availability of the graded quiz as a study guide. During the practical exam, students in the pipet group used a Mohr pipet to transfer the solution at a higher rate (91%) than students in the balance group (75%) (χ² = 50.53, p < 0.001). Thus, though both groups spent the same amount of time using the pipet earlier in the term to determine the density, those students who made a pipet video selected the intended piece of glassware at a significantly higher rate. Additionally, among students who used the Mohr pipet for the volume determination, the pipet group reported the correct significant digits at a much higher rate (74%) than the balance group (46%) (χ² = 34.77, p < 0.001), indicating that creation of the video leads to an increase in proper application of the techniques the video details.

In addition to the data collected regarding significant figures related to volume, items were included on the pre- and post-tests probing students' familiarity with the significant figures associated with the analytical balance. As with the volume data, no differences in a priori knowledge were found between the groups. Additionally, there was no significant difference in growth between the groups on items related to using an analytical balance, which is likely attributed to the fact that the balances in question do not require estimation on the students' part as the mass readout is digital.

Though the majority of student groups submitting videos were able to demonstrate proper technique with their initial submission, a potential limitation of the results lies in requiring groups to re-shoot their videos, or portions of them, when gross technical errors were observed. Collecting this additional footage required additional practice with the pipet that students in the control (balance) group did not perform, so time on task between the groups was not equivalent. The improved performance on the practical may therefore be attributed to the instructor's feedback and students' corrective action, and not simply to making the video. However, asynchronous monitoring is not possible without the video, so this corrective action would likely never have occurred. Thus, there could be an indirect benefit from the video preparation.

Implications for practice

The results of this study can be useful in many classroom situations. Using student-created videos offers an improved method for monitoring student technique by moving from a synchronous to an asynchronous evaluation model, and has uses in large-enrolment courses and distance-learning environments.

Student-created video assessments offer instructors the ability to observe technique and offer critique asynchronously, thus ensuring that the instructor can observe and provide feedback to all students in the laboratory. Jones et al. state that feedback can be seen as ‘the end of a cycle of learning and the beginning of the next’ and describe a method of assessment using screen capture digital video to provide feedback (Jones et al., 2012). In this method, students receive instructor feedback on their assignments via short screen capture videos created by the instructor. Thus, videos could be potentially useful in large enrolment laboratory courses, with the additional benefit of requiring equipment that the student already owns and software that can be obtained free of charge.

Many laboratory course designs require students to perform tasks in small groups. One potential problem in any such cooperative learning setting is that of the ‘hitch hiker’, the individual who defers to the ‘good’ student to complete the work (Cooper, 1995). In the laboratory, this hitch hiker problem often leads to the most technically astute member of the group taking responsibility for the majority of the data collection. Ensuring student involvement and equal participation among all group members in the laboratory is difficult due to the extent of observation of the group required. This difficulty is exacerbated when graduate teaching assistants are responsible, since they have limited experience with classroom management. Video could potentially be used to address the “hitch hiker” issue: requiring all students to be seen in the video enforces a minimum level of participation that is often difficult to monitor in a normal classroom environment.

An exciting potential application of student-created videos is in the distance-learning laboratory course setting. At the authors' institution and others, enrolment is ever increasing while infrastructure remains largely unchanged. This phenomenon has led a number of institutions to increase their offerings of online courses (Phipps, 2013). Despite these growing pains, the contentious issue of the laboratory experience remains one of the largest obstacles to implementing online chemistry courses (Pienta, 2013). Safety, expense, retention and academic rigor further complicate the online laboratory course environment (Patterson, 2000; Boschmann, 2003; Hoole and Sithambaresan, 2003; Casanova et al., 2006). Provided safety precautions are in place, video reports may have value in distance-learning environments where the instructor is not physically present, as they provide an active learning method of assessment. Instructors of these courses could ensure that off-campus students are performing their own experiments by requiring them to create videos of themselves safely performing laboratory activities. In this model, the student-created video would be used for more than just analysing technique; it would be used to ensure students are individually performing their tasks. In this way instructors could monitor that the technical aspect of the laboratory experience is met. The authors' data, specifically the high chi-square value associated with the application of technique, imply that creating a video could lead to increased retention and more meaningful learning in the online laboratory setting, thus alleviating some of the academic rigor concerns.

This study also allows us to consider the development of a list of techniques that could be included in a student's personal electronic library for use in later laboratory courses or in the research laboratory. These techniques could be as simple as the proper use of a piece of glassware or as technical as the operation of an HPLC. Many techniques and instruments are common across scientific fields, and videos on these techniques could be retrieved from the student's e-portfolio when needed.

Conclusions

Student-created technique videos were successfully integrated into the general chemistry laboratory curriculum as an alternate assessment. Formative surveys indicate that students are able to create and edit videos with little difficulty. Though the project required additional training of teaching assistants and the occasional need to address the hitch hiker problem (Cooper, 1995), the videos proved a worthwhile addition to the laboratory course. The short- and long-term measurements indicate an increase in students' ability to report a volume to the correct number of significant figures after having prepared a video describing the proper technique. These results suggest that instructors can consider using video laboratory reports to improve retention of proper laboratory technique. Further studies should probe whether the key step in the learning gain is the preparation of the video or the process of reviewing oneself in the video afterwards.

Acknowledgements

The authors would like to acknowledge Michele Foreman for the excellent video tutorials that she created for this project and Dr Julia Austin for graciously agreeing to review this paper. Special thanks also go out to the teaching assistants of the course for their patience and assistance during the course of the study.

Notes and references

  1. Anderson L. W., Krathwohl D. R., Airasian P., Cruikshank K. A., Mayer R. E. and Pintrich P. R., (2001), A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives, New York, NY: Longman.
  2. Blonder R., Jonatan M., Bar-Dov Z., Benny N., Rap S. and Sakhnini S., (2013), Can You Tube it? Providing chemistry teachers with technological tools and enhancing their self-efficacy beliefs, Chem. Educ. Res. Pract., 14, 269–285.
  3. Boschmann E., (2003), Teaching chemistry via distance education, J. Chem. Educ., 80, 704–708.
  4. Casanova R. S., Civelli J. L., Kimbrough D. R., Heath B. P. and Reeves J. H., (2006), Distance learning: a viable alternative to the conventional lecture-lab format in general chemistry, J. Chem. Educ., 83, 501–507.
  5. ChemLab - 1, (2013), Introductory Laboratory Techniques, http://www.youtube.com/watch?v=QJxuL1PeAg4.
  6. Cooper M., (1995), Cooperative Learning: An Approach for Large Enrollment Courses, J. Chem. Educ., 72, 162.
  7. DeMeo S., (2001), Teaching Chemical Technique. A Review of the Literature, J. Chem. Educ., 78, 373–379.
  8. Elliott A. C. and Woodward W. A., (2007), Statistical Analysis Quick Reference Guidebook with SPSS Examples, California: Sage Publications, Inc., p. 259.
  9. Erdmann M. A. and March J. L., (2011), Institutional Review Board for Human Use Form 4: IRB Approval Form, UAB IRB Office.
  10. Fortman J. J. and Battino R. J., (1990), A practical and inexpensive set of videotaped demonstrations, J. Chem. Educ., 67, 420–421.
  11. Gardner H., (1983), Frames of mind: The theory of multiple intelligences, New York, NY: Basic Books, p. 464.
  12. Global Mobile Statistics, (2013), http://mobithinking.com/mobile-marketing-tools/latest-mobile-stats.
  13. Gravetter F. J. and Wallnau L. B., (2009), Statistics for the Behavioral Sciences, 8th edn, California: Wadsworth, Cengage Publishing, p. 780.
  14. Hirschel R., Yamamoto C. and Lee P., (2012), Video Self-Assessment for Language Learners, SiSAL J., 3, 291–309.
  15. Hoole D. and Sithambaresan M., (2003), Analytical chemistry labs with kits and CD-based instructions as teaching aids for distance learning, J. Chem. Educ., 80, 1308–1310.
  16. Jenkinson G. T. and Fraiman A., (1999), A Multimedia Approach to Lab Reporting via Computer Presentation Software, J. Chem. Educ., 76, 283–284.
  17. Johnson L., Smith R., Willis H., Levine A. and Haywood K., (2011), The 2011 Horizon Report, Austin, Texas: The New Media Consortium.
  18. Jones N., Georghiades P. and Gunson J., (2012), Student feedback via screen capture digital video: stimulating students' modified action, High. Educ., 64, 593–607.
  19. Krathwohl D. R., (2002), A Revision of Bloom's Taxonomy: An Overview, Theor. Pract., 41, 212–218.
  20. Lancaster S., (2014), Education in Chemistry Blog, http://www.rsc.org/eic/2014/03/student-vignette-presentation.
  21. Lloyd B. W., (1992), The 20th century general chemistry laboratory: its various faces, J. Chem. Educ., 69, 866–869.
  22. March J. L., Moore J. W. and Jacobsen J. J., (2000), ChemPages Laboratory: Abstract of Special Issue 24 on CD-ROM, J. Chem. Educ., 77, 423–424.
  23. March J. L., (2012), Laboratory Experiments, CH 116/118, The University of Alabama at Birmingham, Plymouth, MI: Hayden-McNeil Publishing, p. 100.
  24. McCullagh J. F., (2012), How can video supported reflection enhance teachers' professional development?, Cult. Stud. Sci. Educ., 7, 137–152.
  25. MIT 5.301 Chemistry Laboratory Techniques, (2013), http://www.youtube.com/playlist?list=PL57499F5778AAB619.
  26. Niemczik C., Eilks I. and Pietzner V., (2013), Presented in part at the 5th Eurovariety in Chemistry Education, Limerick.
  27. Pantaleo D. C., (1975), Videotapes for laboratory instruction in freshman chemistry, J. Chem. Educ., 52, 112–113.
  28. Patterson M. J., (2000), Developing an internet-based chemistry class, J. Chem. Educ., 77, 554–555.
  29. Phipps L. R., (2013), Creating and teaching a web-based, university-level introductory chemistry course that incorporates laboratory exercises and active learning pedagogies, J. Chem. Educ., 90, 568–573.
  30. Pienta N. J., (2013), Online courses in chemistry: Salvation or downfall?, J. Chem. Educ., 90, 271–272.
  31. Rouda R. A., (1973), Student-produced videotapes in a physical chemistry laboratory course, J. Chem. Educ., 50, 126–127.
  32. Ryan B., (2013), A walk down the red carpet: students as producers of digital video-based knowledge, Int. J. Technol. Enhanc. Learn., 5, 24–41.
  33. Veal W. R., Taylor D. and Rogers A. L., (2009), Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory, J. Chem. Educ., 86, 393–398.
  34. Williams A. J. and Pence H. E., (2011), Smart Phones, a Powerful Tool in the Chemistry Classroom, J. Chem. Educ., 88, 683–686.
  35. Williams R. J., (1989), Availability of video tape to clarify the method of standard abbreviations, J. Chem. Educ., 66, 247.
  36. Winberg T. M. and Berg C. A. R., (2007), Students' cognitive focus during a chemistry laboratory exercise: effects of a computer-simulated prelab, J. Res. Sci. Teach., 44, 1108–1133.

Footnotes

Electronic supplementary information (ESI) available: Survey results, statistical information, grading rubrics and the Chemistry video shooting guide. See DOI: 10.1039/c4rp00107a
Laboratory courses met only during the 5 day weeks of the semester. Students were not asked to meet during holiday weeks or during the week of final exams.

This journal is © The Royal Society of Chemistry 2014