Michael K. Seery,* Hendra Y. Agustian, Euan D. Doidge, Maciej M. Kucharski, Helen M. O’Connor and Amy Price
EaStCHEM School of Chemistry, University of Edinburgh, Joseph Black Building, Edinburgh, UK. E-mail: michael.seery@ed.ac.uk
First published on 24th January 2017
Laboratory work is at the core of any chemistry curriculum but literature on the assessment of laboratory skills is scant. In this study we report the use of a peer-observation protocol underpinned by exemplar videos. Students are required to watch exemplar videos for three techniques (titrations, distillations, preparation of standard solutions) in advance of their practical session, and to demonstrate each technique to a peer while being reviewed. For two of the techniques (titrations and distillations), the demonstration was videoed on a mobile phone, providing evidence that the student had successfully completed the technique. In order to develop digital literacy skills, students are required to upload their videos to a video sharing site for instructor review. The activity facilitated the issuing of digital badges to students who had successfully demonstrated competency. Students’ ratings of their knowledge, experience, and confidence across a range of aspects of each technique increased significantly as a result of the activity. This work, along with student responses to questions, video access data, and observations from implementation, is reported in order to demonstrate a novel and useful way to incorporate peer-assessment of laboratory skills into a laboratory programme, as well as the use of digital badges as a means of incorporating and documenting transferable skills on the basis of student-generated evidence.
Since then, practical work has grown to become a core component of the chemistry curriculum (Kirschner and Meester, 1988; Hofstein and Lunetta, 2004; Reid and Shah, 2007). Chemistry courses accredited by the Royal Society of Chemistry list as one of their requirements that “students must develop a range of practical skills” and chemistry courses at Bachelor level need to demonstrate that at least 300 hours are assigned to practical work, excluding undergraduate research (RSC, 2015). In the US, work over the last decade has gauged what chemistry faculty consider goals of practical work in chemistry (Bruck et al., 2010; Bruck and Towns, 2013). These include engaging in the scientific process, developing critical thinking skills, communication skills, and mastery of laboratory techniques and skills. There is therefore a general sense that practical work is important, and that there is a value placed on the “hands-on” skills students achieve in the laboratory.
While the value of practical work is considered paramount by professional societies and faculty, there have long been calls for reform in teaching laboratories at both school and university level (Hofstein and Lunetta, 2004; Reid and Shah, 2007). Some of this has been in response to the challenge of whether practical work should be carried out at all, given its cost and time requirement (Hawkes, 2004). Recently, interesting work on student perceptions of practical work has emerged. This highlighted that students in earlier years are more likely to be driven by affective aspects of the work, such as finishing the practical quickly (DeKorver and Towns, 2015). Outcomes of a study involving students in upper-level undergraduate laboratories included the finding that there was substantial misalignment between faculty goals and student goals for practical work, and also emphasised the desire students at this level had to complete the practical work as quickly as possible (DeKorver and Towns, 2016). This is likely a reflection of one continuing and central failure of much of the laboratory work in chemistry curricula: that the laboratory work is not itself assessed.
The development of a rubric to assess undergraduate organic chemistry laboratory activities has been described (Chen et al., 2013). Acknowledging the fact that many large institutions rely on demonstrators (also called graduate teaching assistants) to assess student work, their rubric aimed to provide a systematic method to assess the tasks students needed to complete in several organic synthesis reactions. Their rubric considered particular skills required (e.g. refluxing), and identified sub-skills that students needed to demonstrate to achieve these skills (e.g. use of clamp, use of condenser). These sub-skills aligned with benchmark statements so that markers could determine whether the sub-skill requirements were fully or partially met, or neglected.
In the context of laboratory skills, there is an argument that there is a gap between what graduates leave university education with and what industrial employers report that they need (Kirton et al., 2014). These authors argue that measurement of academic competence (as reported by examination grades) does not necessarily indicate students have high proficiency in laboratory work. They describe their adaptation of the objective structured clinical examination approach used in healthcare education, to develop what they term structured chemistry examinations. These consisted of a laboratory session dedicated to students demonstrating their competencies in six areas, two of which included core laboratory practical skills. These were assessed according to a scoring sheet, which checked that students could complete the practical and quantitative aspects of various practical tasks (e.g. weighing using an analytical balance).
The use of video recordings for assessment of students completing a pipetting task (Towns et al., 2015) and, more recently, a burette task (Hensiek et al., 2016) has been described. Students are required to submit their video for assessment to their virtual learning environment, where they are graded according to a range of criteria (e.g. bringing the meniscus to the line in the pipette) using a rubric aligned with the instructions students were given. As well as feedback on their videos, students who successfully demonstrated the technique were also awarded a digital badge.
The work outlined above has informed the design of activities used in the approach described in this work. It is worth noting that other authors have described the assessment of practical work, including assessment in high school settings using several stations (Rhodes, 2010) and a practical exam in which students must demonstrate the competencies required to complete the tasks (Neeland, 2007). However, the direct observation of practical skills for the purpose of assessment appears to be limited to very few reports. In her report, Towns writes that the assessment of hands-on practical work needs more research.
The approaches for assessing laboratory skills described above illustrate interesting and innovative ways to allow students to demonstrate their competencies and skills under valid testing conditions. Although they take different forms, three components are common: (1) the clear description of what is expected of students; (2) an alignment of assessment processes with these expectations; and (3) a means to authenticate and validate the assessment of the activity. In our work in designing assessment of laboratory skills, it was evident that these components needed to be part of the design framework.
Hendry argues for the use of exemplars – examples of work or activities of a particular quality – so that students have a much clearer sense of what is required of them in advance of the task, rather than relying on feedback after the event. This is also described as scaffolding: an overall structure, presented so that students can develop their own work alongside it and, at points where they are unsure, use the scaffold to push beyond their zone of proximal development, as described by Vygotsky in his theory of social constructivism. The literature on exemplars does not intend to dismiss post hoc feedback; rather, it argues that educators should provide a scaffold (in the form of an exemplar) for students before they complete their task, allow them to complete it, and then provide feedback on their work, again using the scaffold as a basis.
A connected concept is that of formative assessment, which involves the use of activities enabling learners to bridge the gap between their current level of understanding or competence and the desired level. One aspect of formative assessment is the engagement of students in self-evaluation (Black and Wiliam, 2006). This connection between formative assessment and feedback was, of course, outlined by Sadler (emphasis added):
“A key premise is that for students to be able to improve, they must develop the capacity to monitor the quality of their own work during actual production. This in turn requires that students possess an appreciation of what high quality work is, that they have the evaluative skill necessary for them to compare with some objectivity the quality of what they are producing in relation to the higher standard, and that they develop a store of tactics or moves which can be drawn upon to modify their own work.” (Sadler, 1989)
Connecting Sadler's concepts to the previous discussion, it is clear that exemplars offer students an opportunity to “possess an appreciation of what high quality work is”. Thus, in designing a laboratory skills assessment protocol within Sadler's framework, it is necessary, along with the provision of exemplars, to enable students to “monitor the quality of their own work during production”, to “have the evaluative skills” needed to compare their work to the exemplars, and to “develop…tactics” to modify their work.
Four reasons to incorporate self- and peer-assessment are suggested (Sadler and Good, 2006). It offers a logistical advantage in providing a large group of students with feedback more quickly and in more detail. There is a pedagogical benefit in considering another student's work, which can prompt an opportunity to change ideas or further develop skills. It develops metacognitive skills beyond the subject-specific content by using higher-order thinking skills to offer judgement and prompt self-evaluation (Zoller et al., 1997). Finally, there is an affective component, as the process of peer evaluation can prompt a more productive, friendlier, and cooperative learning environment by encouraging shared ownership of the learning process (Weaver II and Cotrell, 1986).
This literature on pre-laboratory work guides the approach in this study. Exemplar work, in this case in the form of pre-laboratory demonstrations, may have some value, but this value can be enhanced by referring to it explicitly in laboratory time. In the framework devised here, there is of course a clear and obvious link between pre-laboratory and in-laboratory work, due to the nature of the activity (technique demonstrations). This point is highlighted because the approach attempts to align the general literature on exemplars with that on pre-laboratory work.
An advantage of digital badges is that they can give enhanced visibility to the many formal and informal learning scenarios students engage with during the course of their studies, but which may not be immediately obvious to someone reading a degree transcript.
Digital badges are often proposed as a means of motivating students. By linking with concepts popular in computer gaming, or on review websites that wish to reward contributors at different levels, advocates argue that the desire to achieve badges and build up a collection is a useful extrinsic motivator. However, critics of the approach argue that it is essentially a behaviourist approach to rewarding learning, shifting the focus to the goal rather than to the learning activities themselves (Elkordy, 2016). In response to this criticism, Elkordy cites Goldberg, who has argued that badges will have benefit when they are incorporated into a context that socially supports them, and where users understand their purpose and significance (Goldberg, 2012). Indeed, results from a study in a high school STEM context suggest that the use of badges was motivating both in terms of the learning goal and in task performance.
The use of digital badges in university chemistry laboratory education has been presented by Towns et al., whose work was described above. In this case, as well as assessment of the completion of a laboratory technique (pipetting, use of a burette), the videos students submitted of themselves completing the technique were used as evidence to demonstrate their competency, and, subject to that demonstration, students were issued with a digital badge for the technique. This work has formed the basis of the present study, with modifications to incorporate guidance from the literature on exemplars and peer review, as well as the desire to develop students’ digital literacy (discussed below).
The peer assessment protocol for laboratory skills is described in full below, but briefly it involved the following.
(1) Before the lab: students were asked to watch exemplar videos for the techniques they will demonstrate in advance of the lab. The techniques involved were titrations (requiring students to know how to pipette correctly), setting up distillation apparatus and explaining the distillation procedure, and preparing a standard solution from solid. The exemplar videos students were asked to watch are publicly available (Doidge et al., 2016; Kucharski and Seery, 2016a, 2016b, 2016c).
(2) During the lab: students demonstrated each technique to each other in the laboratory. During the demonstration, their peer used an observation sheet to check that each step was correctly completed. For two out of the three techniques (titration and distillation), students videoed their peer on a mobile phone as they were demonstrating. Students could review the peer observation sheet feedback in the laboratory, and opt to reshoot a video if they wished based on this feedback. Peers and demonstrators signed off on the form once all involved were satisfied that the technique had been successfully demonstrated.
(3) After the lab: students uploaded their videos to a video sharing website (e.g. YouTube or an internal University sharing site) for the two techniques for which they had video evidence. Students submitted links to their videos to the virtual learning environment. After review, students whose videos provided evidence of competency in a technique were issued with a digital badge for that technique (Fig. 1).
Previous work on the assessment of laboratory skills defined the format of the Peer Observation Sheets, which were essentially rubrics of activities students should complete at each stage of the demonstration (Chen et al., 2013). In addition, space was provided for peers to write feedback based on these rubric prompts. These were intended to act as discussion prompts during the peer review. For example, students had the option to review the video to check a particular protocol step on the basis of the peer review discussion. This aimed to allow students to address the final points highlighted by Sadler: to monitor the quality of their own work during production and to develop a capacity to modify and improve their work.
Once students had completed their demonstrations and peer review forms, these were signed off by the demonstrators and submitted for final review by the instructor. As the purpose and main learning outcome of the activity was that students demonstrated a technique to each other and reviewed their peer's techniques, the submission of three complete and signed peer review forms meant that they had successfully completed their requirements.
We consider that the laboratory activity described herein incorporates the development of several transferable skills. In order to prepare their demonstration, students are required to watch the video and organise in advance what they are going to do. As mentioned above, the process of peer review can develop metacognitive skills beyond subject specific content.
In addition to these, this activity offers students scope to develop their information technology skills. They are required to record video and upload that video to a sharing website. An important consideration in this is managing their digital footprint; the process of submitting a link to a video hosted elsewhere rather than just the video itself means that students have to make decisions about how they wish to control access to that video. As most of the formal online interactions between educators and students occur within their virtual learning environment (VLE), there is little or no opportunity for educators to support students in developing a professional online identity outwith the VLE, and which they are responsible for managing. Indeed it is argued that there is an onus on educators to formally consider this support and development within their curricula (Ng, 2015; Seery, 2016).
(1) To what extent did students watch exemplar videos prior to their laboratory session?
(2) How do students consider their own ability had changed as a result of completing the activity?
(3) What were the observations about the implementation of the lab-skills activity in practice?
A pre-/post-survey was used to examine students’ perceptions of their knowledge, confidence, and experience in several aspects of each procedure, based on the approaches used in previous reports on badging activities (Towns et al., 2015; Hensiek et al., 2016). Full surveys are in Appendix 2. Students rated themselves on a Likert scale of 1–5, where 1 represented a low value (no knowledge, no experience, not confident) and 5 represented a high value (very knowledgeable, very experienced, very confident). The procedural protocol steps also mirrored the approach of Chen et al. (2013) in developing statements for use in a rubric, by identifying key procedural steps to be considered in the demonstration.
For the pre-/post-analysis, the averages of the totals for each technique for knowledge, confidence, and experience were compared. Analysis was conducted with SPSS and Microsoft Excel. Students were also asked three questions in the survey, one relating to each technique: (1) to read a burette when provided with a close-up picture of liquid in a burette; (2) to state when they would change the flask to collect a second fraction during a distillation; and (3) to calculate a concentration having been given a mass and molar mass. The answers were categorised as follows. For burette readings, an answer was categorised as correct if the correct reading was given to two decimal places, and as incorrect if the reading was wrong or was given to only one decimal place. For distillation, an answer was categorised as correct if the student correctly explained when to change the flask, and otherwise as incorrect. For the standard solution concentration calculation, an answer was categorised as correct if the student gave the correct concentration to the correct number of significant figures; otherwise it was categorised as incorrect, noting whether the calculation itself or the number of significant figures was wrong. The aim of this categorisation is to provide an additional source of data to contextualise the student responses, by looking at relationships between students’ knowledge, experience, and confidence and their answers to these questions.
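The burette-reading categorisation rules described above can be sketched as a small function. This is a hypothetical illustration: the function name and the reference reading of "23.45" are assumptions for the example, not values from the study.

```python
# Hypothetical sketch of the burette-reading categorisation described above.
# The reference reading "23.45" is illustrative, not taken from the study.

def categorise_burette(answer: str, correct: str = "23.45") -> str:
    """Correct only if the reading matches AND is given to two decimal places."""
    decimals = len(answer.split(".")[1]) if "." in answer else 0
    if answer == correct and decimals == 2:
        return "correct"
    if decimals == 1:
        # Readings given to one decimal place were logged as their own category.
        return "incorrect (one decimal place)"
    return "incorrect (reading)"
```

For example, `categorise_burette("23.4")` falls into the one-decimal-place category even though the digits are consistent with the correct reading, mirroring the rule that a reading to one decimal place is counted as incorrect.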
For the pre- and post-laboratory data, descriptive statistics were presented and analysed in order to examine the central tendency, using median values; at this stage the data were treated as collected, with missing values reported but all valid responses included. For the pre-/post-analysis, the data were cleaned so that only matching pairs of responses were considered, reducing the total responses from 148 to 120. These were analysed with a paired t-test, with Cohen's d used to estimate effect size. This process involves averaging the Likert responses to generate one overall pre-score and one overall post-score for each of the knowledge, experience, and confidence values for each technique, and subsequently analysing the differences between these scores. Averaging Likert scales is subject to some discussion in the education literature, as it involves the averaging of ordinal values. In a discussion of this kind of analysis, Lalla writes that parametric tests can be used if it is assumed that the ordinal variable approximates a measurement process evaluating a continuous underlying variable (Lalla, 2017). Being aware of the criticisms of this approach, we place our pre-/post-analysis in the context of an initial exploration of the quantification of the actual responses themselves, and subsequently use statistical analysis of pre-/post-scores to summarise any observed differences quantitatively. The total number of students who completed the practical session was 158.
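The paired comparison described here can be sketched as below, using only the Python standard library. The data are illustrative, and Cohen's d is computed as mean(differences)/sd(differences), one common convention for paired designs; the study's analysis (conducted in SPSS and Excel) may have used a different convention.

```python
import math
from statistics import mean, stdev

def paired_t_and_cohens_d(pre, post):
    """Paired t statistic and Cohen's d for matched pre/post scores.

    d is taken as mean(differences) / sd(differences), one common
    convention for paired designs (an assumption for this sketch).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    d_mean, d_sd = mean(diffs), stdev(diffs)  # sample sd, n - 1 denominator
    t = d_mean / (d_sd / math.sqrt(len(diffs)))
    return t, d_mean / d_sd
```

With SciPy available, `scipy.stats.ttest_rel(post, pre)` returns the paired t statistic together with its p-value directly.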
Analysis of viewing figures and length of video viewed was obtained from YouTube analytics dashboard. The analytics dashboard allows viewership to be filtered by date range and also by geographic region (i.e. UK). Analytics also provide information on the viewing platform (PC, mobile, etc.). This information was exported from the YouTube analytics dashboard and subsequently processed in Microsoft Excel. In order to provide a combined overview of viewing of the video in the time prior to the labs, a “weighted frequency” was calculated from the product of the number of viewers on a particular day and the length of time the video was viewed for on that particular day.
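The "weighted frequency" calculation described above can be sketched as follows; the daily figures used in the example are illustrative, not the study's data.

```python
# Weighted frequency: for each day, the number of views multiplied by the
# average time (in minutes) the video was watched that day. Figures illustrative.

def weighted_frequency(daily_views, daily_avg_minutes):
    return [v * m for v, m in zip(daily_views, daily_avg_minutes)]
```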
The analytics dashboard also provides information on viewer retention over the course of a video. This information is not available to export, and hence for each video, within the date range and geographic filters considered, a screen-shot was taken of the analytics dashboard.
Students completed the laboratory activity during their third week (second laboratory session) of first year at university. Because of the size of the class, laboratory sessions ran in three 3 hour sessions: Tuesday mornings and afternoons, and Wednesday afternoons. Videos were made available prior to the sessions. YouTube access statistics for the three videos for the 6 days prior to the lab sessions, and the lab session days themselves, are summarised in Table 1. Access before and after this period was negligible. Given that, and that the majority of views came from the UK, it is assumed that essentially all views are associated with this activity. 158 students completed the laboratory activity, and while it is not possible to say that all students watched the videos in advance of the practical class, the number of views (267 titration, 295 distillation, 243 standard solution) suggests that most did, with many students watching repeatedly.
| | Titration | Distillation | Standard solution |
| --- | --- | --- | --- |
| URL | http://bit.ly/skillstitrating | http://bit.ly/skillsdistillation | http://bit.ly/skillsstandardsoln |
| Video length (m:ss) | 4:57 | 7:15 | 5:57 |
| All views/UK views | 269/267 | 300/295 | 264/243 |
| Average view duration, m:ss (%) | 4:08 (83%) | 5:09 (71%) | 4:17 (72%) |
| Viewing platform | Computer: 91%; mobile: 7.2%; tablet: 1.5% | Computer: 90%; mobile: 8.1%; tablet: 2% | Computer: 92%; mobile: 6.6%; tablet: 1.6% |
An important consideration is the extent of the video that students watched. Average view times are shown in Table 1, along with the percentage of the entire video that this represents: 83% for titrations, 71% for distillations, and 72% for standard solutions.
These figures will inevitably include students who re-watched only a segment of a video. Hence, of more interest is the retention of a student viewer over the course of a video. The YouTube analytics platform provides this information graphically, and the plots for the three videos are shown in Appendix 3. These illustrate a remarkable stability in viewing across almost the entire length of each video, suggesting that students who started to watch tended to watch almost all of the video. “Drop-offs” were noted at the end of each video, at 4:40, 5:40, and 5:20 for titration, distillation, and standard solution respectively. These times correspond to the finishing notes of each video: confirmation of the calculation, a repeated statement about distillation, and the method of the concentration calculation for standard solutions. The drop-off periods do not relate to the lab-skills part of the video. These end-of-video drop-offs also lower the average viewing times, since the final segment of each video goes unwatched.
Finally, YouTube analytics provides information on the viewing platform. These data show that the dominant viewing platform was a personal computer, used over 90% of the time, followed by mobile phones (6–8%) and then tablets. The percentages do not add up to 100%, probably because the platform was not recognised for some views.
Fig. 2 represents the YouTube viewing data graphically. It compares a “weighted frequency” of views, accounting for the number of views and the average viewing time across all platforms, for the days (−6 to −1) running up to the laboratory sessions, which are identified as days 1 and 2. Platform viewing data tended to mimic these data, although the highest use of mobile and tablet platforms was on the lab days themselves, reflecting the fact that students reviewed these videos in the laboratory session itself. This was facilitated by making short URLs available to students (as indicated in Table 1) so that they could easily call up a video if required. Access to dynamic information in situ has been proposed as a means of reducing in-lab cognitive load (Kolk et al., 2012).
In addition to the three main exemplar videos, students were also referred in the titration video to review a video on how to pipette (http://bit.ly/skillsvolpipette). This link was also directly provided in their pre-laboratory links. This video showed a very similar access profile to the main laboratory videos: 219 UK views, with an average view of 3:29 of 4:15, corresponding to 82%. Interestingly, this video's retention remained uniform over the course of its length (Appendix 3), and did not show the drop-off that other videos displayed. This adds weight to the conclusion that drop-off in the other videos is probably due to the fact that they finished with a section not directly related to the actual lab skill, whereas the pipetting video finished at the end of the skill demonstration without lingering on other considerations.
The above data show that, in general, students completing the practical session involving demonstration of laboratory skills watched the exemplar videos in advance.
Research Question 2: How do students consider their own ability had changed as a result of completing the activity?
Students were surveyed before and after the laboratory activity in a manner similar to that described previously (Towns et al., 2015; Hensiek et al., 2016). These surveys asked students to rate their knowledge, confidence and experience on a 5-point Likert scale, prior to and after the laboratory session (Appendix 2). The pre-test survey highlighted some interesting observations. In general students reported the highest previous knowledge, confidence, and experience of standard solutions, followed by titrations, with the lowest scores for distillations. For example, the median value for knowledge of “weighing out a solid onto a balance” was 5 in the pre-lab survey. In contrast, the median value for experience of “correctly greasing glassware” in distillation was 1.
In the case of titrations, students’ high ratings prior to the laboratory increased further, with the changes showing a decrease in the number of responses rated “3” and “4” and an increase in the number rated “5”. The most substantial changes were observed for distillations; students rated their experience much lower than their knowledge prior to the lab, reflecting that many of them would have learned about distillation in school but not performed one, owing to the cost of the distillation apparatus. Accordingly, large changes are observed across all three categories, but student experience sees the largest shift, with the largest decrease in the number of responses rated “1”. The pre-lab ratings for standard solutions were the highest, reflecting that students have likely learned about and completed many standard solution preparations in their school work. Thus the largest shift here is a decrease in ratings of “4”, with a corresponding increase in ratings of “5”.
As well as counting the number of responses directly, it is possible to conduct a pre-/post-statistical test to ascertain whether there is any significant difference in the means of the responses before and after the laboratory activity. A sum of the pre-lab means and post-lab means for each of the series of statements for the three techniques is shown. As there are 6 statements for titrations, 7 for distillation, and 4 for standard solutions, the maximum possible scores for these techniques are 30, 35, and 20 respectively. After the data were cleaned as described in the Methods section, a paired t-test was conducted on matching pairs before and after the laboratory activity. These data are shown in Table 2. In all cases, there is a significant difference (p < 0.001) between the pre- and post-mean scores for each of the knowledge, experience, and confidence scales. By calculating Cohen's d values, these differences were all found to have a large or very large effect size (Sawilowsky, 2009), with the exception of “experience” of standard solutions. Although the median values for distillation were among the lowest in the pre-laboratory data, the effect sizes of the increases were the highest of the three techniques.
| Technique | Pre-lab mean | Post-lab mean | Cohen's d (effect size) |
| --- | --- | --- | --- |
| Titration (30) | | | |
| Knowledge | 23.44 | 28.18 | 1.41 (v. large) |
| Experience | 22.03 | 26.88 | 1.08 (large) |
| Confidence | 22.25 | 27.51 | 1.27 (v. large) |
| Distillation (35) | | | |
| Knowledge | 21.61 | 30.85 | 1.55 (v. large) |
| Experience | 16.09 | 26.32 | 1.48 (v. large) |
| Confidence | 19.99 | 29.03 | 1.32 (v. large) |
| Standard solution (20) | | | |
| Knowledge | 17.39 | 19.08 | 0.82 (large) |
| Experience | 16.47 | 18.17 | 0.56 (medium) |
| Confidence | 16.00 | 18.57 | 0.80 (large) |
| Question | Pre-lab | Post-lab |
| --- | --- | --- |
| Burette reading | Correct: 34% | Correct: 66% |
| | Incorrect (1 decimal place): 48% | Incorrect (1 decimal place): 23% |
| | Incorrect (reading): 18% | Incorrect (reading): 10% |
| Distillation procedure | Correct: 43% | Correct: 75% |
| | Incorrect: 43% | Incorrect: 24% |
| | Don’t know: 14% | Don’t know: 1% |
| Concentration calculation | Correct: 31% | Correct: 32% |
| | Incorrect (sig. fig.): 45% | Incorrect (sig. fig.): 53% |
| | Incorrect (calculation): 22% | Incorrect (calculation): 13% |
As well as the total responses, it was noted that the percentage of students who gave an incorrect burette reading but whose self-rating average exceeded 3/5 for titrations was 13% in the pre-lab survey and 9% in the post-lab survey. The proportion who incorrectly answered the distillation question but whose self-rating average exceeded 3/5 was 15% in the pre-lab survey and 20% in the post-lab survey. Finally, the proportion who answered the molarity calculation incorrectly but had a higher than 3/5 average self-rating was 21% in the pre-lab survey and 16% in the post-lab survey.
The above data demonstrate that, for titrations and distillations, both students’ perceptions of their laboratory competency and external measures of some aspects of these competencies improved over the project.
The exception is the data on standard solutions. While students’ self-perceptions increased as a result of the activity, there was little change in the responses to the calculation question, aside from a reduction in the number of incorrect responses. The reason for this can only be speculated upon from the available data. This protocol differed from the others: students were not required to video each other doing this activity, merely to observe each other. In practice (as indicated below) we noted that many students did not undergo peer observation, as there was no explicit need; the absence of a requirement for a video meant that peers could work individually if they wished.
Another reason may be that the peer observation sheet does not explicitly mention significant figures as a consideration, merely that the student should add a label to their flask “with appropriate details.” Therefore, significant figures might not have been considered, or might even have been perceived as irrelevant. It will be interesting to observe whether this changes in a future iteration where significant figures are explicitly mentioned.
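The calculation at issue is a short one: moles of solute from the weighed mass, divided by the flask volume, reported to the significant figures of the weighing. A sketch of the expected working (the 2.65 g mass is an assumed illustrative value, since the actual mass varies by student and the paper's [x] g is unspecified):

```python
# Illustrative only: assumed mass of 2.65 g (3 significant figures).
molar_mass = 105.99          # g/mol, Na2CO3
mass_g = 2.65                # assumed example mass
volume_dm3 = 250 / 1000      # 250 cm3 volumetric flask, in dm3

conc = (mass_g / molar_mass) / volume_dm3   # mol/dm3
# '#.3g' keeps trailing zeros, matching the 3 sig figs of the mass
print(f"{conc:#.3g} mol dm-3")
```

The common error described above would be printing the raw float (0.1000094…) or truncating to “0.1”, rather than matching the precision of the measurement.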
Research Question 3: What were the observations about the implementation of the lab-skills activity in practice?
The laboratory sessions were structured around the Peer Observation Sheets (Appendix 1), with students being given space to provide feedback on their peer's video demonstration. These sheets also indicated which components of the demonstration students should record on their video.
As mentioned above, students were not required to video the standard solution preparation. This was partly because it involved two significant tasks: weighing out a solid correctly, and making up the solution after transferring the solid. It was felt that this might take too much time to video. Because no video was required, students were observed not completing the peer review and simply preparing their solutions themselves. This may explain the small change in pre-/post-survey question responses for this technique, but regardless, it demonstrates the necessity of some evidence of peer review for this approach. A future iteration plans to separate the weighing and standard solution procedures, so that each can be videoed and afforded a digital badge.
Suggested sites for submitting videos included YouTube, Vimeo, and the university's own video sharing site. In order of preference, students uploaded their videos to the university's own video sharing site (58%), YouTube (40%), Vimeo (1%), and a Dropbox or similar link (1%).
Students were required to submit links to their videos, rather than the video files themselves, to the virtual learning environment. The purpose of this was to develop students’ digital literacy and awareness of their digital footprint. Students were informed that they could list their video publicly or have it unlisted (available to anyone with the link), as they chose. They were informed that private videos could not be viewed, but that they could make their videos private after instructor review. A surprising finding was that most students chose the university's own video site to host their videos. No data on the reason for this was collected, but anecdotally, several students commented that they saw this work as “academic” and therefore felt it was better placed there than on a site such as YouTube. Other students considered the university website more secure for their academic work. In reviewing some videos again as part of the research project after instructor review, it was noted that some students had exercised the option to change their video settings to private. These kinds of options and choices mean that students are developing the ability to control their own digital footprint.
Students also received feedback on the aspects of the Peer Observation Sheet that they needed to complete themselves, namely the titration readings and average titre, and the standard solution concentration calculation. 17% of the student reports marked did not record one or more of their titration readings to two decimal places, while the remainder did. 16% of students did not complete the calculation of their standard solution correctly. By far the most common mistake concerned significant figures: 59% of students did not give the correct number of significant figures for their standard solution calculations, in line with the responses observed for the post-lab survey quiz, which also involved significant figures.
Finally, students received feedback on their videos. For titrations, this tended to focus on specific issues that may affect the accuracy of results. While not prevalent, the most common error was not washing the burette tip after each dropwise addition close to the endpoint, followed by not reading the burette to two decimal places at either the start or the end of the titration. Distillation feedback was less rich; students tended to set up and explain the distillation very well. Typical comments, when required, concerned the correct arrangement of the condenser tubing.
Another limitation of our study is that all of our participants had studied chemistry in school and likely had some practical experience. It therefore cannot be concluded that the approach taken here is appropriate for teaching techniques ab initio, although the results from the distillation experiment suggest that the proposed framework works well even without prior experience of the practical technique.
Student ID
Lab group (Day/time)
You are asked in these questions to rate between 1–5 your own ability in terms of knowledge, experience and confidence in various aspects of completing techniques.
• 1 is a low value (little knowledge, no experience, not confident).
• 5 is a high value (very knowledgeable, lots of experience, very confident).
• Your ratings do not affect your lab score in any way!
1. Adding liquid to burette
2. Where initial level of liquid in burette should be
3. Amount of indicator to add
4. What to do to analyte in conical flask when adding solution from burette
5. Steps to take when near end point (dropwise adding, washing)
6. Reading a burette to correct number of decimal places
A picture of a burette with some liquid is shown. What is the correct reading of this value?
2. Correct sequence to connect rubber tubing to condenser
3. Know how to correctly grease glassware
4. Correct assembly of apparatus including placement of clamp and thermometer.
5. Arrangement of cables and tubing in a safe manner
6. Correct method for adding liquid and required number of bumping granules
7. Protocol for collecting different fractions
In a short statement, explain how you would know when to transfer flasks after you collect your first fraction during a distillation.
2. Transferring solid to beaker and solvating
3. Transferring solution to volumetric flask
4. Making up a solution to the mark in a volumetric flask
[x] g of Na₂CO₃ is weighed out and made up to 250 cm³ with water. What concentration would you write on the label of this flask?
Distillation video: http://bit.ly/skillsdistillation
Standard solution video: http://bit.ly/skillsstandardsoln
Pipetting video: http://bit.ly/skillsvolpipette
This journal is © The Royal Society of Chemistry 2017