Developing laboratory skills by incorporating peer-review and digital badges

Michael K. Seery*, Hendra Y. Agustian, Euan D. Doidge, Maciej M. Kucharski, Helen M. O’Connor and Amy Price
EaStCHEM School of Chemistry, University of Edinburgh, Joseph Black Building, Edinburgh, UK. E-mail: michael.seery@ed.ac.uk

Received 4th January 2017, Accepted 24th January 2017

First published on 24th January 2017


Abstract

Laboratory work is at the core of any chemistry curriculum but literature on the assessment of laboratory skills is scant. In this study we report the use of a peer-observation protocol underpinned by exemplar videos. Students are required to watch exemplar videos for three techniques (titrations, distillations, preparation of standard solutions) in advance of their practical session, and to demonstrate each technique to a peer while being reviewed. For two of the techniques (titrations and distillations), the demonstration was videoed on a mobile phone, providing evidence that the student had successfully completed the technique. In order to develop digital literacy skills, students are required to upload their videos to a video sharing site for instructor review. The activity facilitated the issuing of digital badges to students who had successfully demonstrated competency. Students’ ratings of their knowledge, experience, and confidence in a range of aspects associated with each technique significantly increased as a result of the activity. This work is reported, along with student responses to questions, video access data, and observations from implementation, in order to demonstrate a novel and useful way to incorporate peer-assessment of laboratory skills into a laboratory programme, as well as the use of digital badges as a means of incorporating and documenting transferable skills on the basis of student-generated evidence.


Introduction

Purpose of laboratory work

The first teaching laboratory in chemistry in Britain was established at the University of Edinburgh in 1807, although the notion of associating practical work with a chemistry curriculum dates back further. William Cullen, who held the first independent chemistry lectureship in Britain and Ireland, and was from 1756 Professor of Chemistry at Edinburgh, made laboratories available to his students, so that they might explore some of the concepts he discussed in his lectures (Anderson, 1978).

Since then, practical work has grown to become a core component of the chemistry curriculum (Kirschner and Meester, 1988; Hofstein and Lunetta, 2004; Reid and Shah, 2007). Chemistry courses accredited by the Royal Society of Chemistry list as one of their requirements that “students must develop a range of practical skills” and chemistry courses at Bachelor level need to demonstrate that at least 300 hours are assigned to practical work, excluding undergraduate research (RSC, 2015). In the US, work over the last decade has gauged what chemistry faculty consider goals of practical work in chemistry (Bruck et al., 2010; Bruck and Towns, 2013). These include engaging in the scientific process, developing critical thinking skills, communication skills, and mastery of laboratory techniques and skills. There is therefore a general sense that practical work is important, and that there is a value placed on the “hands-on” skills students achieve in the laboratory.

While the value of practical work is considered paramount by professional societies and faculty, there have long been calls for reform in teaching laboratories at both school and university level (Hofstein and Lunetta, 2004; Reid and Shah, 2007). Some of this has been in response to the challenge of whether practical work should be carried out at all, given its cost and time requirement (Hawkes, 2004). Recently, interesting work on the student perception of practical work has emerged. This highlighted that students in earlier years are more likely to be driven by affective aspects of work, such as finishing the practical quickly (DeKorver and Towns, 2015). Outcomes of a study involving students in upper-level undergraduate laboratories included the finding that there was substantial misalignment between faculty goals and student goals for practical work, and also emphasised the desire students at this level had to complete the practical work as quickly as possible (DeKorver and Towns, 2016). This is likely a reflection of one continuing and central failure of much of the laboratory work in chemistry curricula: that the laboratory work is not itself assessed.

Assessment of laboratory work

Despite the value placed on laboratory work by faculty and professional bodies, there are few reports on the direct assessment of laboratory work or on demonstration of competencies and skills. Assessment tends to focus on the laboratory report or on some outcome of the laboratory work, such as yield or product purity (Graham et al., 2008). Some recent reports directly describing the assessment of practical skills are described below.

The development of a rubric to assess undergraduate organic chemistry laboratory activities has been described (Chen et al., 2013). Acknowledging the fact that many large institutions rely on demonstrators (also called graduate teaching assistants) to assess student work, their rubric aimed to provide a systematic method to assess the tasks students needed to complete in several organic syntheses. Their rubric considered particular skills required (e.g. refluxing), and identified sub-skills that students needed to demonstrate to achieve these skills (e.g. use of clamp, use of condenser). These sub-skills aligned with benchmark statements so that markers could determine whether the sub-skill requirements were fully or partially met, or neglected.

In the context of laboratory skills, there is an argument that there is a gap between what graduates leave university education with and what industrial employers report that they need (Kirton et al., 2014). These authors argue that measurement of academic competence (as reported by examination grades) does not necessarily indicate that students have high proficiency in laboratory work. They describe their adaptation of the objective structured clinical examination approach used in healthcare education to develop what they term structured chemistry examinations. These consisted of a laboratory session dedicated to students demonstrating their competencies in six areas, two of which included core laboratory practical skills. These were assessed according to a scoring sheet, which checked that students could complete the practical and quantitative aspects of various practical tasks (e.g. weighing using an analytical balance).

The use of video recordings for assessment of students completing a pipetting task (Towns et al., 2015) and, more recently, a burette task (Hensiek et al., 2016) has been described. Students are required to submit their video for assessment to their virtual learning environment, where they are graded according to a range of criteria (e.g. bringing the meniscus to the line in the pipette) using a rubric aligned with the instructions students were given. As well as feedback on their videos, students who successfully demonstrated the technique were also awarded a digital badge.

The work outlined above has informed the design of activities used in the approach described in this work. It is worth mentioning here that other reported approaches to the assessment of practical work include a station-based assessment in high school settings (Rhodes, 2010) and a practical exam in which students must have the competencies to complete the tasks required (Neeland, 2007). However, the direct observation of practical skills for the purpose of assessment appears to be limited to very few reports. In her report, Towns writes that the assessment of hands-on practical work needs more research.

The approaches for assessing laboratory skills described above illustrate interesting and innovative ways to allow students to demonstrate their competencies and skills under valid testing conditions. Although they take different forms, three components are common: (1) a clear description of what is expected of students; (2) an alignment of assessment processes with these expectations; and (3) a means to authenticate and validate the assessment of the activity. In our work in designing assessment of laboratory skills, it was evident that these components needed to be part of the design framework.

Formative assessment

Most assessment at university level comes after the corresponding teaching event. Students are assessed on their lecture content after lectures by examinations, and typically on their laboratory work by means of a laboratory report after they have completed the work. The methods described above for assessment of laboratory skills were also summative; students are given feedback after the event. Hendry challenges the notion of “loading up” feedback; that is, feedback that is stored up and provided to students after their work has been completed (Hendry, 2013). This approach typically gives students information on how they might do the task better, and highlights any errors made. An issue with this mode of feedback is that its relevance is lost to students; the task it refers to is complete and there is no mechanism for students to demonstrate that they have engaged with this feedback or that they can recomplete the task with the feedback in mind.

Hendry argues for the use of exemplars – examples of work or activities of a particular quality – so that students have a much clearer sense of what is required of them in advance of the task, rather than relying on feedback after the event. This is also described as scaffolding, which provides an overall structure for students, presented so that they can develop their own work alongside it and, at points where they are unsure, use the scaffold to push beyond their zone of proximal development, as described by Vygotsky in his theory of social constructivism. The literature on exemplars does not intend to dismiss post hoc feedback; rather it argues that educators should provide a scaffold (in the form of an exemplar) for students before they complete their task, allow them to complete it, and then provide feedback on their work, again using the scaffold as a basis.

A connected concept is that of formative assessment, involving the use of activities enabling learners to bridge the gap between their current level of understanding or competence and the desired level. One aspect of formative assessment is the engagement of students in self-evaluation (Black and Wiliam, 2006). This connection between formative assessment and feedback was outlined by Sadler (emphasis added):

“A key premise is that for students to be able to improve, they must develop the capacity to monitor the quality of their own work during actual production. This in turn requires that students possess an appreciation of what high quality work is, that they have the evaluative skill necessary for them to compare with some objectivity the quality of what they are producing in relation to the higher standard, and that they develop a store of tactics or moves which can be drawn upon to modify their own work.” (Sadler, 1989)

Connecting Sadler's concepts to the previous discussion, it is clear that exemplars offer students an opportunity to “possess an appreciation of what high quality work is”. Thus, in designing a laboratory skills assessment protocol within Sadler's framework, it is necessary, along with the provision of exemplars, to enable students to “monitor the quality of their own work during production”, to ensure they “have the evaluative skills” needed to compare their work to the exemplars, and to help them “develop…tactics” to modify their work.

Four reasons to incorporate self- and peer-assessment are suggested (Sadler and Good, 2006). It offers a logistical advantage in providing a large group of students with feedback more quickly and in more detail. There is a pedagogical benefit in considering another student's work, which can prompt an opportunity to change ideas or further develop skills. It develops metacognitive skills beyond the subject-specific content by using higher order thinking skills to offer judgement and prompt self-evaluation (Zoller et al., 1997). Finally, there is an affective component, as the process of peer evaluation can prompt a more productive, friendlier and cooperative learning environment, by encouraging a shared ownership of the learning process (Weaver II and Cotrell, 1986).

Pre-laboratory work

The discussion of exemplars, above, can be related to the work on pre-laboratory activities, which is extensive in the chemistry education literature. Pre-laboratory videos and simulations have been described as a means of preparing students for laboratory work by reducing the cognitive load in laboratory time (Winberg and Berg, 2007; Jolley et al., 2016). Recent work published in this journal suggested that pre-laboratory activities on their own did not significantly change student perceptions of laboratory work, but that when this preparatory work was explicitly acted on in the laboratory, students’ negative feelings towards laboratory work decreased (Spagnoli et al., 2017).

This literature on pre-laboratory work guides the approach in this study. Exemplar work, in this case in the form of pre-laboratory demonstrations, may have some value, but this value can be enhanced by explicitly relating to it in laboratory time. In the framework devised here, there is of course a clear and obvious link between pre-laboratory and in-laboratory work, due to the nature of the activity (technique demonstrations). This point is highlighted in an attempt to align the general literature on exemplars with that on pre-laboratory work.

Digital badges

One way of acknowledging student competence in particular skills is to issue them with a digital badge. Digital badges are of increasing interest in education as a means of “micro-accreditation”; issuing an institutional acknowledgement for coursework where the student has displayed evidence for stated achievements. Students may display these badges on their own social media or personal profiles, websites, etc. (Casilli and Hickey, 2016). There are a growing number of examples of the practice of issuing digital badges with positive findings, including recent work in English education (Yang et al., 2015), medical education (Mehta et al., 2013), and secondary STEM education (Elkordy, 2016).

An advantage of digital badges is that they can give enhanced visibility to the many formal and informal learning scenarios students engage with during the course of their studies, but which may not be immediately obvious to someone reading a degree transcript.

Digital badges are often proposed as a means of motivating students. By linking with concepts popular in computer gaming, or on review websites that wish to reward contributors of different levels, advocates argue that the desire to achieve badges and build up a collection is a useful extrinsic motivator. However, critics of the approach argue that it is essentially a behaviourist approach to rewarding learning, shifting the focus to the goal rather than to the learning activities themselves (Elkordy, 2016). In response to this criticism, Elkordy cites Goldberg, who has argued that badges will have benefit when they are incorporated into a context that socially supports them, and where users understand their purpose and significance (Goldberg, 2012). Indeed, results from a study in a high school STEM context suggest that the use of badges was motivating both in terms of the learning goal and in task performance.

The use of digital badges in university chemistry laboratory education has been presented by Towns et al., whose work was described above. In this case, as well as assessment of the completion of a laboratory technique (pipetting, use of a burette), the videos submitted by students of their completion of the technique were used as evidence to demonstrate their competency and, subject to demonstration of competency, students were issued with a digital badge for that technique. This work has formed the basis of the present study, with modifications to incorporate guidance from the literature on exemplars and peer review, as well as the desire to develop students’ digital literacy (discussed below).

Description of peer-assessment protocol for assessment of laboratory skills

The literature presented above underpins the framework for the design of peer assessment protocols for laboratory skills. The following describes how it was implemented for three techniques: performing titrations; explaining a distillation procedure; and making up a standard solution from a solid. A dedicated laboratory session was allocated for this activity. This approach was taken over the alternative (where students demonstrate the techniques at some stage over their laboratory course) as it was felt that students who were least confident and had least experience might struggle to find time in the otherwise busy laboratory programme.

The peer assessment protocol for laboratory skills is described in full below, but briefly it involved the following.

(1) Before the lab: students were asked to watch exemplar videos for the techniques they would demonstrate, in advance of the lab. The techniques involved were titrations (requiring students to know how to pipette correctly), setting up distillation apparatus and explaining the distillation procedure, and preparing a standard solution from a solid. The exemplar videos students were asked to watch are publicly available (Doidge et al., 2016; Kucharski and Seery, 2016a, 2016b, 2016c).

(2) During the lab: students demonstrated each technique to each other in the laboratory. During the demonstration, their peer used an observation sheet to check that each step was correctly completed. For two out of the three techniques (titration and distillation), students videoed their peer on a mobile phone as they were demonstrating. Students could review the peer observation sheet feedback in the laboratory, and opt to reshoot a video if they wished based on this feedback. Peers and demonstrators signed off on the form once all involved were satisfied that the technique had been successfully demonstrated.

(3) After the lab: for the two techniques for which they had video evidence, students uploaded their videos to a video sharing website (e.g. YouTube or an internal University sharing site). Students submitted links to their videos to the virtual learning environment. After review, students whose videos provided evidence that they had demonstrated competency in a technique were issued with a digital badge for that technique (Fig. 1).


Fig. 1 Digital badges designed for titration, distillation, and standard solution techniques. In the implementation described in this work, badges for titration and distillation technique were issued.

Pre lab work

Sadler highlights the need to have an appreciation of what high quality work is, and the literature on exemplars demonstrates that this is a suitable approach to providing this information. Therefore, in advance of laboratory classes, students are required to watch video demonstrations of the techniques they will be asked to perform in the laboratory class. These exemplar videos are intended to allow students to see what will be required of them, in their own laboratory setting. This differs from previously reported approaches, where students viewed teaching assistant demonstrations in the laboratory. The rationale for this approach was to formalize the concept of exemplars, so that students know that there is an expectation that they should review the procedure in advance of the laboratory class. It was also evident in the preparation of exemplar videos that there was a wide range of views on what “correct technique” was, and therefore we wished to document a fully correct, literature-based approach as a reference for all involved in the laboratory activity.

In lab peer demonstration and review

To structure in-lab work, we were keen to align with a common theme from earlier reports of assessing laboratory work – that students have a clear description of what is expected. Therefore, we developed the Peer Observation Sheets to structure student activity in the lab (Appendix 1). These described, for each technique in turn, the steps students should take, as well as points for the peer to consider when providing feedback. These aimed to address Sadler's point about enabling students to evaluate their work by comparing to the exemplar. They also defined the points at which students should start and end videoing. Thus the Peer Observation Sheets were used to structure the overall flow of the laboratory session.

Previous work on assessment of laboratory skills defined the format of the Peer Observation Sheets, which were essentially rubrics of activities students should complete at each stage of the demonstration (Chen et al., 2013). In addition, space was provided for peers to write feedback based on these rubric prompts. These were subsequently intended to act as discussion prompts during the peer review. For example, students have the option to review the video to check on a particular protocol step on the basis of peer review discussions. This aimed to allow students to address the final points highlighted by Sadler: to monitor the quality of their own work during production and to develop a capacity to modify and improve their work.

Once students had completed their demonstrations and peer review forms, these were signed off by the demonstrators and submitted for final review by the instructor. As the purpose and main learning outcome of the activity was that students demonstrated a technique to each other and reviewed their peer's techniques, the submission of three complete and signed peer review forms meant that they had successfully completed their requirements.

Post lab review of work

The third and final theme arising from previous work on assessment of laboratory skills was a means to authenticate and validate the assessment of the activity. This was achieved by reviewing student videos and providing them with feedback on issues aligning with those raised in the Peer Observation Sheets. Students uploaded their videos to a video sharing site (e.g. YouTube, Vimeo, or the university's own hosting site) and submitted the URL links to their videos (titration demonstration and distillation explanation) into an assignment area in their virtual learning environment. Once their assignment had been viewed, students received individual feedback based on the technique displayed in their videos, and assuming the video displayed an appropriate level of competency, they were issued with a digital badge via the virtual learning environment, which they could push out to their own Open Badge Backpack (https://backpack.openbadges.org/). The backpack is an independent hosting site for badges, which allows learners to collect and display badges from wherever they may earn them. As it is not dependent on any institution, the purpose of the backpack is that learners can access and control their badges once they have moved on from any institution where their badges were issued (e.g., in this case, the university). Students also received feedback on the answers noted on their Peer Observation Sheets, in relation to the number of significant figures and their standard solution calculations.

Combination of effects: developing transferable skills

Much of the innovation and reform regarding practical work has capitalised on the opportunities the laboratory environment offers in terms of addressing a wider set of transferable and professional skills. Outcomes of practical work have been grouped into three broad themes: practical skills, transferable skills, and intellectual skills (Carnduff and Reid, 2003). Transferable skills considered included aspects such as team working, organisation, time management, communication, presentation, information retrieval, data processing, numeracy, designing strategies, and problem solving.

We consider that the laboratory activity described herein incorporates the development of several transferable skills. In order to prepare their demonstration, students are required to watch the video and organise in advance what they are going to do. As mentioned above, the process of peer review can develop metacognitive skills beyond subject specific content.

In addition to these, this activity offers students scope to develop their information technology skills. They are required to record video and upload that video to a sharing website. An important consideration in this is managing their digital footprint; the process of submitting a link to a video hosted elsewhere rather than just the video itself means that students have to make decisions about how they wish to control access to that video. As most of the formal online interactions between educators and students occur within their virtual learning environment (VLE), there is little or no opportunity for educators to support students in developing a professional online identity outwith the VLE, and which they are responsible for managing. Indeed it is argued that there is an onus on educators to formally consider this support and development within their curricula (Ng, 2015; Seery, 2016).

Research questions

The aim of the research study is to explore some factors around assessment and learning of practical laboratory skills. In particular, we were keen to explore the following.

(1) To what extent did students watch exemplar videos prior to their laboratory session?

(2) How do students consider their own ability to have changed as a result of completing the activity?

(3) What were the observations about the implementation of the lab-skills activity in practice?

Methodology

A quantitative approach is used to address these research questions. The advantages of this include that we are able to readily obtain data regarding the access and use of our exemplar videos, as well as quantify changes in the pre-/post survey data described below. This data is used to give a sense of the interaction and outcomes in the circumstances observed in this particular case. The laboratory session and its associated work follow an unusual format in relation to the general scheme used for the remainder of the sessions which students experience over the course of their first year laboratories. These tend to follow a more traditional format, where students answer some general pre-laboratory questions, complete the laboratory session, and prepare a worksheet or short report for assessment. However, as an early session is used (the second session out of ten in the semester), it is assumed that, from the perspective of students recently arrived at university, the uniqueness of the session is not apparent at this stage. We use the pre-/post survey described below to gain a sense of how students perceive their knowledge, experience, and confidence changing as a result of the activities. Details of how this is conducted are outlined below.

Methods

Ethical approval was secured from the School's Research Committee in line with institutional guidelines. In accordance with British Educational Research Association guidelines (BERA, 2011), students were informed about the research prior to completion of the survey, as well as being offered the right to withdraw their contributions at any time. Students who completed the survey but opted not to have their results considered in the analysis were removed from the dataset (n = 3). The survey was conducted on the Bristol Online Surveys platform, which is fully compliant with UK data protection laws. Data was only held on University-secured computers and was not transferred electronically by other means.

A pre-/post survey was used to examine students’ perceptions of their knowledge, confidence, and experience in several aspects of each procedure, based on the approaches used in previous reports on badging activities (Towns et al., 2015; Hensiek et al., 2016). Full surveys are in Appendix 2. Students rated each aspect on a Likert scale of 1–5, where 1 represented a low value (no knowledge, no experience, not confident) and 5 represented a high value (very knowledgeable, very experienced, very confident). The procedural protocol steps also mirrored the approach of developing statements to be used in the rubric by Chen et al. (2013), by identifying key procedural steps to be considered in the demonstration.

For pre-/post-analysis, the averages of the totals for each technique for knowledge, confidence, and experience were compared. Analysis was conducted with SPSS and Microsoft Excel. Students were also asked three questions in the survey, one relating to each technique: students were asked (1) to read a burette when provided with a close-up picture of liquid in a burette; (2) when they would change a flask to collect a second fraction during a distillation; and (3) to calculate a concentration having been given a mass and molar mass. The answers to these were categorised as follows. For burette readings, answers were categorised as correct if the correct reading was given and it was given to two decimal places. An answer was categorised as incorrect if the reading was incorrect or if it was given to one decimal place. For distillation, answers were categorised as correct if students correctly explained when to change the flask; otherwise they were categorised as incorrect. For standard solution concentration calculations, answers were categorised as correct if students gave the correct concentration to the correct number of significant figures. Otherwise, an answer was categorised as incorrect, noting whether the calculation was incorrect or whether the number of significant figures was wrong. The aim of this categorisation is to provide an additional source of data to contextualise the student responses, by looking at relationships between students’ knowledge, experience, and confidence and their answers to these questions.
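To make the burette-reading categorisation concrete, the following is a minimal sketch in Python of the rules described above; the expected reading and the example answers are hypothetical, and the study itself does not state how the categorisation was carried out in practice.

```python
def categorise_burette_answer(answer: str, correct_reading: float) -> str:
    """Classify a burette-reading answer using the scheme described above."""
    answer = answer.strip()
    try:
        value = float(answer)
    except ValueError:
        return "incorrect (reading)"          # non-numeric or unreadable answer
    decimal_places = len(answer.split(".")[1]) if "." in answer else 0
    if decimal_places != 2:
        return "incorrect (decimal places)"   # e.g. quoted to one decimal place
    if abs(value - correct_reading) < 0.005:  # matches the expected reading
        return "correct"
    return "incorrect (reading)"

print(categorise_burette_answer("23.45", 23.45))  # correct
print(categorise_burette_answer("23.4", 23.45))   # incorrect (decimal places)
print(categorise_burette_answer("23.55", 23.45))  # incorrect (reading)
```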

For the pre- and post-laboratory data, descriptive statistics were presented and analysed in order to examine central tendency, which was done with median values. At this point, the data were treated as collected; missing values are reported but all valid responses are included. For pre-/post-analysis, the data were cleaned so that only matching pairs of responses were considered, reducing the total responses from 148 to 120. These were analysed with a paired t-test, with Cohen's d used as a measure of effect size. This process involves the averaging of Likert responses to generate one overall pre-score and one overall post-score for each of the knowledge, experience, and confidence values for each technique, and the subsequent analysis of the differences between these scores. Averaging Likert scales is subject to some discussion in the education literature, as it involves the averaging of ordinal values. In a discussion of this kind of analysis, Lalla writes that parametric tests can be used if it is assumed that the ordinal variable is an approximate measurement process which evaluates a continuous underlying variable (Lalla, 2017). However, being aware of the criticisms of this approach, we place our pre-/post-analysis in the context of an initial exploration of the quantification of the actual responses themselves, and subsequently use statistical analysis of pre-/post-scores to summarise any observed differences quantitatively. The total number of students who completed the practical session was 158.
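As an illustration of this pre-/post-analysis, the following is a minimal sketch in Python rather than the SPSS and Excel workflow used in the study; the file name and column names are assumptions, and the Cohen's d shown uses one common convention for paired samples (mean difference divided by the standard deviation of the differences), which may differ from the convention used to produce Table 2.

```python
import pandas as pd
from scipy import stats

def paired_analysis(pre: pd.Series, post: pd.Series):
    """Paired t-test and Cohen's d for matched pre-/post-scores."""
    diff = post - pre
    t_stat, p_value = stats.ttest_rel(post, pre)   # paired t-test
    cohens_d = diff.mean() / diff.std(ddof=1)      # d for paired samples
    return t_stat, p_value, cohens_d

# Hypothetical file: one row per student, summed Likert scores for one scale
# of one technique (e.g. titration knowledge, 6 sub-scales, maximum 30).
scores = pd.read_csv("titration_knowledge_totals.csv")  # columns: pre_total, post_total
t_stat, p_value, cohens_d = paired_analysis(scores["pre_total"], scores["post_total"])
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")
```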

Analysis of viewing figures and the length of video viewed was obtained from the YouTube analytics dashboard. The analytics dashboard allows viewership to be filtered by date range and also by geographic region (i.e. UK). Analytics also provide information on the viewing platform (PC, mobile, etc.). This information was exported from the YouTube analytics dashboard and subsequently processed in Microsoft Excel. In order to provide a combined overview of viewing of the videos in the time prior to the labs, a “weighted frequency” was calculated from the product of the number of viewers on a particular day and the length of time the video was viewed for on that particular day.
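As a sketch of this processing step (carried out in Excel in the study), the following Python snippet computes a daily weighted frequency from a hypothetical export of the analytics data; the file and column names are assumptions, and the fraction-of-total-views form described in the caption of Fig. 2 is shown.

```python
import pandas as pd

# Hypothetical daily export from the YouTube analytics dashboard for one video.
daily = pd.read_csv("titration_video_daily.csv")  # columns: date, views, avg_view_duration_s

total_views = daily["views"].sum()
# Weighted frequency: fraction of total views on a day multiplied by the
# average viewing time on that day (cf. the caption of Fig. 2).
daily["weighted_frequency"] = (daily["views"] / total_views) * daily["avg_view_duration_s"]
print(daily[["date", "weighted_frequency"]])
```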

The analytics dashboard also provides information on viewer retention over the course of a video. This information is not available to export, and hence for each video, within the date range and geographic filters considered, a screen-shot was taken of the analytics dashboard.

Results and discussion

Research Question 1: To what extent did students watch exemplar videos prior to their laboratory session?

Students completed the laboratory activity during their third week (second laboratory session) of first year at university. Because of the size of the class, laboratory sessions ran in three 3 hour sessions: Tuesday mornings and afternoons, and Wednesday afternoons. Videos were made available prior to the sessions. YouTube access statistics for the three videos for the 6 days prior to the lab sessions, and the lab session days themselves, are summarised in Table 1. Access before and after this period was negligible. Given that fact, and that the majority of views were from the UK, it is assumed that essentially all views are associated with this activity. 158 students completed the laboratory activity and, while it is not possible to say that all students watched the videos in advance of the practical class, the number of views (267 titration, 295 distillation, 243 standard solution) suggests that most did, with many students watching repeatedly.

Table 1 Summary of YouTube analytic data for the three exemplar videos
Titration: URL http://bit.ly/skillstitrating; video length 4:57; all views/UK views 269/267; average view duration 4:08 (83%); viewing platform: computer 91%, mobile 7.2%, tablet 1.5%.
Distillation: URL http://bit.ly/skillsdistillation; video length 7:15; all views/UK views 300/295; average view duration 5:09 (71%); viewing platform: computer 90%, mobile 8.1%, tablet 2%.
Standard solution: URL http://bit.ly/skillsstandardsoln; video length 5:57; all views/UK views 264/243; average view duration 4:17 (72%); viewing platform: computer 92%, mobile 6.6%, tablet 1.6%.


An important consideration is the extent of the video that students watched. Average view times are shown in Table 1, along with the percentage of the entire video that this represents. For titrations, the percentage of video viewed averaged 83%; for distillations, it was 71%; and for standard solutions, it was 72%.

These figures will automatically include students who re-watch a video but only a segment of it. Hence, of more interest is the retention of a student viewer over the course of a video. The YouTube analytics platform provides this information graphically, and the plots for the three videos are shown in Appendix 3. These illustrate a remarkable stability in viewing across almost the entire length of each video, suggesting that students who started to watch tended to watch almost all of the video. “Drop-offs” were noted at the end of each video, at times 4:40, 5:40, and 5:20 for titration, distillation, and standard solution respectively. These times correspond to the finishing notes of each video: confirmation of the calculation, a repeated statement about distillation, and the method of concentration calculation for standard solutions. The drop-off periods do not relate to the lab skills part of the video. These end-of-video drop-offs also distort the average viewing times, reducing the average because the final component of the video is not viewed.

Finally, YouTube analytics provides information on the viewing platform. These data show that the dominant viewing platform was a personal computer, which was used over 90% of the time. The next most common choice was mobile phones (6–8%), followed by tablets. The percentages do not add up to 100%, probably because there were some views where the platform was not recognised.

Fig. 2 represents the YouTube viewing data graphically. It compares a “weighted frequency” of views, accounting for the number of views and the average viewing time across all platforms, for the days (−6 to −1) running up to the laboratory sessions, which are identified as days 0 and 1. Platform viewing data tended to mimic these data, although the highest use of mobile and tablet platforms was on the lab days themselves, reflecting the fact that students reviewed these videos in the laboratory session itself. This was facilitated by making short URLs available to students (as indicated in Table 1) so that they could easily call up the video if required. Access to dynamic information in situ has been proposed as a means of reducing in-lab cognitive load (Kolk et al., 2012).


Fig. 2 Viewings of videos in advance of lab (days −6 to −1) and on lab days (day 0 = Tuesday, 1 = Wednesday) represented as weighted frequency, the product of the fraction of total views and the viewing time on a particular day. Bars from left to right represent: titration (blue), distillation (red) and standard solution (green) videos.

In addition to the three main exemplar videos, students were also referred in the titration video to review a video on how to pipette (http://bit.ly/skillsvolpipette). This link was also directly provided in their pre-laboratory links. This video showed a very similar access profile to the main laboratory videos: 219 UK views, with an average view of 3:29 of 4:15, corresponding to 82%. Interestingly, this video's retention remained uniform over the course of its length (Appendix 3), and did not show the drop-off that other videos displayed. This adds weight to the conclusion that drop-off in the other videos is probably due to the fact that they finished with a section not directly related to the actual lab skill, whereas the pipetting video finished at the end of the skill demonstration without lingering on other considerations.

The above data indicate that, in general, students completing the practical session involving demonstration of laboratory skills watched the exemplar videos in advance.

Research Question 2: How do students consider their own ability to have changed as a result of completing the activity?

Students were surveyed before and after the laboratory activity in a manner similar to that described previously (Towns et al., 2015; Hensiek et al., 2016). These surveys asked students to rate their knowledge, confidence and experience on a 5-point Likert scale, prior to and after the laboratory session (Appendix 2). The pre-test survey highlighted some interesting observations. In general, students reported the highest prior knowledge, confidence, and experience for standard solutions, followed by titrations, with the lowest scores for distillations. For example, the median value for knowledge of “weighing out a solid onto a balance” was 5 in the pre-lab survey. In contrast, the median value for experience of “correctly greasing glassware” in distillation was 1.

Self-rating of knowledge, confidence, and experience

We analysed the pre-/post-survey data in two ways. Firstly, we counted the number of responses at each point on the scale in each category before and after the lab. The change in the number of responses allows us to consider the changing levels of student perception of their levels of knowledge, confidence, and experience. The data are summarised graphically in Fig. 3. In all cases, we see a decrease in the number of responses in the lower-numbered Likert categories, with an increase in the number of responses at the higher-numbered Likert points, especially point 5. This reflects a growth in the number of students choosing 5 in response to the survey questions after the laboratory, indicating that they consider their levels of knowledge, confidence, and experience to have increased (Fig. 3). The increase in the number of students selecting “5” was observed across all three experiments: titration, distillation, and standard solutions. By monitoring the consequent decrease at the other scale points, we can obtain a sense of the shift in students’ self-evaluated knowledge, confidence, and experience.
Fig. 3 % change in number of responses to survey for all answers in titrations (top), distillations (middle) and standard solutions (bottom). Percentage changes in responses for knowledge (blue bottom bar), experience (green middle bar), and confidence (red top bar) are shown.

In the case of titrations, students’ already high ratings prior to the laboratory increased further, and the changes show a decrease in the number of responses rated “3” and “4”, and an increase in the number of responses rated “5”. The most substantial changes were observed for distillations; students rated their experience much lower than their knowledge prior to the lab, reflecting that many of them would have learned about distillation in school but not performed one, due to the cost of the distillation apparatus. Therefore large changes are observed across all three categories, but student experience sees the largest shift; the largest decrease is in the number of responses rated “1”. The pre-lab ratings for standard solutions were the highest, reflecting that students have likely learned about and completed many standard solution preparations in their school work. Thus the largest shift here is a decrease in ratings of “4”, with a corresponding increase in ratings of “5”.

As well as counting the number of responses directly, it is possible to conduct a pre-/post statistical test to ascertain whether there is any significant difference in the means of the responses before and after the laboratory activity. The sums of the pre-lab means and post-lab means for the series of statements for the three techniques are shown in Table 2. As there are 6 statements for titrations, 7 for distillation, and 4 for standard solutions, the maximum possible scores for these techniques are 30, 35, and 20 respectively. After the data were cleaned as described in the Methods section, a paired t-test was conducted on matching pairs of responses before and after the laboratory activity. In all cases, there is a significant difference (p < 0.001) between the pre- and post-mean scores for each knowledge, experience, and confidence scale. The calculated Cohen's d values indicate that these differences all correspond to a large or very large effect size (Sawilowsky, 2009), with the exception of “experience” of standard solutions. Although the median values for distillation were among the lowest in the pre-laboratory group, the effect sizes of the increases were the highest of the three techniques.

Table 2 Accumulated mean pre- and post-laboratory activity scores for students rating of their knowledge, experience, and confidence in titrations (6 sub-scales), distillation (7 sub-scales) and preparing standard solutions (4 sub-scales)
Technique Pre-lab mean Post-lab mean Cohen's d (effect size)
Titration (30)
Knowledge 23.44 28.18 1.41 (v. large)
Experience 22.03 26.88 1.08 (large)
Confidence 22.25 27.51 1.27 (v. large)
Distillation (35)
Knowledge 21.61 30.85 1.55 (v. large)
Experience 16.09 26.32 1.48 (v. large)
Confidence 19.99 29.03 1.32 (v. large)
Standard solution (20)
Knowledge 17.39 19.08 0.82 (large)
Experience 16.47 18.17 0.56 (medium)
Confidence 16.00 18.57 0.80 (large)


Pre- and post-laboratory questions

As well as self-rated perceptions, students were also asked a question in the pre- and post-laboratory surveys relating to each laboratory skill. For titrations, students were shown a picture of a burette and asked to note the reading (required to two decimal places); for distillations, students were asked to explain when they would change a flask to collect a second fraction; and for standard solutions, students were asked to calculate a solution concentration given a particular mass. The numbers given in the question meant that the answer should be reported to two significant figures. The post-lab questions were the same, but involved a different burette reading and a different concentration calculation. Responses to these questions are shown in Table 3.
Table 3 Categorisation of responses to a question for each of the techniques pre- and post-laboratory work
Burette reading – pre-lab: correct 34%; incorrect (1 decimal place) 48%; incorrect (reading) 18%. Post-lab: correct 66%; incorrect (1 decimal place) 23%; incorrect (reading) 10%.
Distillation procedure – pre-lab: correct 43%; incorrect 43%; don’t know 14%. Post-lab: correct 75%; incorrect 24%; don’t know 1%.
Concentration calculation – pre-lab: correct 31%; incorrect (sig. fig.) 45%; incorrect (calculation) 22%. Post-lab: correct 32%; incorrect (sig. fig.) 53%; incorrect (calculation) 13%.


As well as total responses, it was noted that the percentage of students who gave an incorrect burette reading but whose self-rating average exceeded 3/5 for titrations was 13% in the pre-lab survey and 9% in the post-lab survey. The proportion who incorrectly answered the distillation question but whose self-rating average exceeded 3/5 was 15% in the pre-lab survey and 20% in the post-lab survey. Finally, the proportion who answered the molarity calculation incorrectly but who had a higher than 3/5 average self-rating was 21% in the pre-lab survey and 16% in the post-lab survey.

The above data aim to demonstrate that, for titrations and distillations, both students’ perceptions of their laboratory competency and external measures of some aspects of these competencies improved over the project.

The exception is the data on standard solutions. While students’ self-perceptions increased as a result of the activity, there was little change observed in the responses to the calculation question, aside from a reduction in the number of incorrect calculations. The reason for this can only be speculated upon from the available data. This protocol differed from the others; students were not required to video each other doing this activity, merely to observe each other. In practice (as indicated below) we noted that many students did not undergo peer observation as there was no explicit need – the absence of a requirement for a video meant that peers could work individually if they wished.

Another reason may be that the peer observation sheet does not explicitly mention significant figures as a consideration, merely that the student should add a label to their flask “with appropriate details.” Therefore significant figures might not have been considered, or indeed it might have been perceived that this was not a consideration. It will be interesting to observe whether this changes in a future iteration where significant figures are explicitly mentioned.
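To illustrate the kind of calculation and rounding expected of students, the following is a minimal worked sketch in Python with hypothetical numbers (the survey and laboratory work used different values):

```python
# Hypothetical standard solution calculation: c = m / (M x V), reported to
# two significant figures. The mass below is illustrative only.
mass_g = 1.3            # mass of Na2CO3 weighed out (hypothetical)
molar_mass = 105.99     # molar mass of Na2CO3 in g mol-1
volume_dm3 = 0.250      # 250 cm3 volumetric flask

concentration = mass_g / (molar_mass * volume_dm3)   # in mol dm-3
print(f"{concentration:.2g} mol dm-3")                # 0.049 mol dm-3 (two significant figures)
```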

Research Question 3: What were the observations about the implementation of the lab-skills activity in practice?

The laboratory sessions were structured around the Peer Observation Sheets (Appendix 1), with students being given space to provide feedback on their peer's video demonstration. These sheets also indicated which components of the demonstration students should record on their video.

Recording and submitting videos

There were no difficulties reported in terms of students not wishing to be videoed or not having a mobile phone to record their video. As part of the demonstrator induction, it was made clear that if students did not wish to be recorded, then they could complete the demonstration with a demonstrator present for the purpose of showing their competency and to complete the laboratory activity. In this case, the student would not receive the digital badge, as this was based on the evidence produced. However, all students in this implementation successfully recorded and submitted their two videos (titration and distillation). Students were required to upload their video to a video sharing site and submit the link to the virtual learning environment for review, within 48 hours of their laboratory session.

As mentioned above, students were not required to video the standard solution preparation. This was partly because it involved two significant tasks: weighing out a solid correctly, and making up the solution after transferring the solid. It was felt that this might take too much time to video. In the absence of a requirement to video, students were observed not completing the peer review and simply preparing their solutions themselves. This may explain the small change in pre-/post survey question responses for this technique, but regardless, it demonstrates the necessity of some evidence of peer review in this approach. A future iteration plans to separate the weighing and standard solution procedures, so that they can be videoed and afforded a digital badge.

Suggested sites to which students could submit their video included YouTube, Vimeo, and the university's own video sharing site. Students uploaded their videos to, in order of popularity, the university's own video sharing site (58%), YouTube (40%), Vimeo (1%), and Dropbox or a similar link (1%).

The submission of links to videos, rather than the videos themselves, to the virtual learning environment was required. The purpose of this was to develop students’ digital literacy and awareness of their digital footprint. Students were informed that they should submit their video and list it publicly or have it unlisted (available to anyone with the link) as they chose. They were informed that private videos could not be viewed, but that they could make their videos private after instructor review. A surprising finding was that most students chose the university's own video site to host their videos. No data on the reason for this was collected, but anecdotally, several students commented that they saw this work as “academic” and therefore better placed there than on a site such as YouTube. Other students considered the university website more secure for their academic work. In reviewing some videos again as part of the research project after instructor review, it was noted that some students had exercised the option to change their video settings to private. These kinds of options and choices mean that students are developing the ability to control their own digital footprint.

Feedback on performance

Students were required to complete the Peer Observation Sheets to provide feedback on their lab partner's performance. Analysis of these sheets, however, indicated that very little written feedback was provided; comments such as “nicely demonstrated” or “well done” were common. In a small number of cases (∼10%), feedback on technique was provided. This typically took the form of suggestions on how to improve; for example, one titration feedback sheet noted “add liquid more slowly near endpoint”. However, this was not typical.

Students also received feedback on the aspects of the Peer Observation Sheet that they needed to complete themselves, namely the titration readings and average titre, and the standard solution concentration calculation. Of the student reports marked, 17% did not record one or more of their titration readings to two decimal places, while the remainder did. 16% of students did not complete the calculation of their standard solution correctly. By far the most common mistake concerned significant figures: 59% of students did not give the correct number of significant figures for their standard solution calculations, in line with the responses observed for the post-lab survey question, which also involved significant figures.

Finally, students received feedback on their videos. For titrations, this tended to focus on specific issues which may affect the accuracy of results. While not prevalent, the most common error was not washing the burette tip after each dropwise addition close to the endpoint, followed by not reading the burette to two decimal places either at the start or the end of the titration. Distillation feedback was less rich; students tended to set up and explain the distillation very well. Typical comments, when required, concerned the correct arrangement of the condenser tubing.

Digital badges

Students’ videos were reviewed and assessed holistically to determine whether competency was displayed in the technique. In almost all cases, students were issued with a digital badge. Students were awarded five points for submitting each video, with a point deducted for issues which affected accuracy or operation. Students who received more than three points out of five were awarded the badge automatically by the virtual learning environment. In the VLE used (Blackboard), this is managed by setting criteria: a check to see whether the student submitted their assignment, which would contain the video link, and a check to see whether the score awarded exceeded 3/5. Once these criteria were met, students were awarded the badge (called an “achievement” in Blackboard). They had the option to “push” (publish) this badge to the Open Badges backpack, the independent platform for hosting badges. Because of data protection issues, the student must be offered this choice, and thus we were not able to secure data on how many students opted to publish their badge, nor indeed on what role, if any, the badges had in motivating students to do well in the activity. Therefore the framework proposed incorporates badging as a means to package the entire exercise, and our future work will focus on these motivational aspects, and on students’ interest in displaying badges. Some hints came through anecdotal feedback from students, in the form of queries about “getting the badge” during the lag time between video submission and assessment.
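As an illustration of the awarding rule described above (and not of Blackboard's actual criteria engine, which is configured through its interface), a minimal sketch of the logic might look as follows:

```python
def award_badge(video_link_submitted: bool, score: int) -> bool:
    """Return True if the digital badge should be issued for a technique.

    Mirrors the criteria described above: an assignment containing the video
    link must have been submitted, and the awarded score must exceed 3 out of 5.
    """
    return video_link_submitted and score > 3

print(award_badge(True, 4))   # True: badge issued
print(award_badge(True, 3))   # False: score does not exceed 3/5
print(award_badge(False, 5))  # False: no video link submitted
```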

Limitations

In this study, the use of pre-lab exemplar videos, in-lab demonstration with videoing (for two of the three techniques), and in-lab peer review was used to facilitate the learning of laboratory skills. The entire process involved the production of evidence of competency, which meant that it could be packaged up in the awarding of digital badges. Because of the combination of approaches, it is not clear whether one of these approaches alone leads to the observed improvements in students’ perceptions of their knowledge, confidence, and experience of the techniques or, if it is a combination, what the relative weighting is of the different components having an effect. For example, the literature on pre-laboratory activity cited above illustrates that this can have benefit in terms of reducing cognitive load in the laboratory. Little is known about the motivational aspects of digital badges. However, the purpose was not to isolate each component involved but to show that the combination, which was designed in accordance with the framework proposed by Sadler, has some merit.

Another limitation in our study is that all of our participants had studied chemistry in school and likely had some practical experience. Therefore it cannot be concluded that the approach taken here is appropriate for teaching techniques ab initio, although the results from the distillation experiment suggest that even without prior experience in the practical technique, the framework proposed works well.

Conclusions

Peer-review of laboratory techniques incorporating peer-recorded video has enabled a useful in situ feedback method for students in the development of their laboratory skills. Exemplar videos provided in advance of the laboratory class give students information on the correct protocol. For the demonstration of each technique, students and their peers use a peer-observation sheet, which allows the feedback to be structured and aligned with the exemplar videos. Video recording is valuable as a means of prompting this feedback and ensuring peer dialogue – a point underlined by the third technique, where videoing was not required – and also acts as evidence of competency. This evidence is acknowledged by means of a digital badge, recognising students’ ability to complete the technique. The activity described provides a useful means of facilitating peer assessment, as well as documenting and acknowledging transferable skill development by means of digital badges.

Appendices

Appendix 1a: peer observation sheet – titrations


[Image: peer observation sheet for titrations]

Appendix 1b: peer observation sheet – distillations


[Image: peer observation sheet for distillations]

Appendix 1c: peer observation sheet – standard solutions


[Image: peer observation sheet for standard solutions]

Appendix 2: pre-/post-survey questions

Name

Student ID

Lab group (Day/time)

You are asked in these questions to rate between 1–5 your own ability in terms of knowledge, experience and confidence in various aspects of completing techniques.

• 1 is a low value (little knowledge, no experience, not confident).

• 5 is a high value (very knowledgeable, lots of experience, very confident).

Your ratings do not affect your lab score in any way!

Titrations

Rate your knowledge, experience, and confidence of the following aspects of titrations:

1. Adding liquid to burette

2. Where initial level of liquid in burette should be

3. Amount of indicator to add

4. What to do to analyte in conical flask when adding solution from burette

5. Steps to take when near end point (dropwise adding, washing)

6. Reading a burette to correct number of decimal places

A picture of a burette containing some liquid is shown. What is the correct reading?

Quickfit distillation

1. Identify the necessary glassware for distillation

2. Correct sequence to connect rubber tubing to condenser

3. Know how to correctly grease glassware

4. Correct assembly of apparatus including placement of clamp and thermometer

5. Arrangement of cables and tubing in a safe manner

6. Correct method for adding liquid and required number of bumping granules

7. Protocol for collecting different fractions

In a short statement, explain how you would know when to transfer flasks after you collect your first fraction during a distillation.

Preparing a standard solution

1. Weighing out a solid onto a balance

2. Transferring solid to beaker and solvating

3. Transferring solution to volumetric flask

4. Making up a solution to the mark in a volumetric flask

[x] g of Na₂CO₃ is weighed out and made up to 250 cm³ with water. What concentration would you write on the label of this flask?
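For reference, the intended calculation can be written as a sketch, assuming the mass [x] is given in grams, taking M(Na₂CO₃) ≈ 105.99 g mol⁻¹, and using the 250 cm³ (0.250 dm³) volumetric flask volume:

\[
c \;=\; \frac{n}{V} \;=\; \frac{[x]\ \mathrm{g}\,/\,105.99\ \mathrm{g\ mol^{-1}}}{0.250\ \mathrm{dm^{3}}}
\]

which gives the concentration in mol dm⁻³ to be written on the label.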

Appendix 3: YouTube retention plots (taken directly from YouTube analytics dashboard)

Titration video: http://bit.ly/skillstitrating
[Image: c7rp00003k-u4.tif]

Distillation video: http://bit.ly/skillsdistillation

[Image: c7rp00003k-u5.tif]

Standard solution video: http://bit.ly/skillsstandardsoln

[Image: c7rp00003k-u6.tif]

Pipetting video: http://bit.ly/skillsvolpipette

[Image: c7rp00003k-u7.tif]

Acknowledgements

MKS and MMK wish to acknowledge the Institute for Academic Development Principal's Teaching Award Scheme. EDD, HMO’C and AP acknowledge the Principal's Career Development Scholarships, University of Edinburgh.

References

  1. Anderson G. W., (1978), The Playfair Collection and the Teaching of Chemistry at the University of Edinburgh 1713–1858, Edinburgh: The Royal Scottish Museum.
  2. BERA, (2011), Ethical Guidelines for Educational Research.
  3. Black P. and Wiliam D., (2006), Inside the black box: raising standards through classroom assessment, Granada Learning.
  4. Bruck A. D. and Towns M., (2013), Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory, J. Chem. Educ., 90(6), 685–693, DOI: 10.1021/ed300371n.
  5. Bruck L. B., Towns M. and Bretz S. L., (2010), Faculty Perspectives of Undergraduate Chemistry Laboratory: Goals and Obstacles to Success, J. Chem. Educ., 87(12), 1416–1424, DOI: 10.1021/ed900002d.
  6. Carnduff J. and Reid N., (2003), Enhancing undergraduate chemistry laboratories: pre-laboratory and post-laboratory exercises, Royal Society of Chemistry.
  7. Casilli C. and Hickey D., (2016), Transcending conventional credentialing and assessment paradigms with information-rich digital badges, Inf. Soc., 32(2), 117–129, DOI: 10.1080/01972243.2016.1130500.
  8. Chen H.-J., She J.-L., Chou C.-C., Tsai Y.-M. and Chiu M.-H., (2013), Development and Application of a Scoring Rubric for Evaluating Students’ Experimental Skills in Organic Chemistry: An Instructional Guide for Teaching Assistants, J. Chem. Educ., 90(10), 1296–1302, DOI: 10.1021/ed101111g.
  9. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92(12), 2031–2037, DOI: 10.1021/acs.jchemed.5b00463.
  10. DeKorver B. K. and Towns M. H., (2016), Upper-level undergraduate chemistry students’ goals for their laboratory coursework, J. Res. Sci. Teach., 53(8), 1198–1215, DOI: 10.1002/tea.21326.
  11. Doidge E. D., O'Connor H. M., Price A. and Seery M. K., (2016), Using a volumetric pipette correctly, retrieved from https://www.youtube.com/watch?v=yL5XZhrWZ6I.
  12. Elkordy A., (2016), Development and Implementation of Digital Badges for Learning Science, Technology, Engineering and Math (STEM) Practices in Secondary Contexts: A Pedagogical Approach with Empirical Evidence, in Foundation of Digital Badges and Micro-Credentials, Springer, pp. 483–508.
  13. Goldberg D. T., (2012), Badges for Learning: Threading the Needle Between Skepticism and Evangelism, retrieved from http://dmlcentral.net/badges-for-learning-threading-the-needle-between-skepticism-and-evangelism/.
  14. Graham K. J., Johnson B. J., Jones T. N., McIntee E. J. and Schaller C. P., (2008), Designing and Conducting a Purification Scheme as an Organic Chemistry Laboratory Practical, J. Chem. Educ., 85(12), 1644, DOI: 10.1021/ed085p1644.
  15. Hawkes S. J., (2004), Chemistry Is Not a Laboratory Science, J. Chem. Educ., 81(9), 1257, DOI: 10.1021/ed081p1257.
  16. Hendry G., (2013), Integrating feedback with classroom teaching, in Merry S., Price M., Carless D. and Taras M. (ed.), Reconceptualising Feedback in Higher Education: Developing Dialogue with Students, Routledge, pp. 133–134.
  17. Hensiek S., DeKorver B. K., Harwood C. J., Fish J., O’Shea K. and Towns M., (2016), Improving and Assessing Student Hands-On Laboratory Skills through Digital Badging, J. Chem. Educ., 93(11), 1847–1854, DOI: 10.1021/acs.jchemed.6b00234.
  18. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundations for the twenty-first century, Sci. Educ., 88(1), 28–54, DOI: 10.1002/sce.10106.
  19. Jolley D. F., Wilson S. R., Kelso C., O’Brien G. and Mason C. E., (2016), Analytical Thinking, Analytical Action: Using Prelab Video Demonstrations and e-Quizzes To Improve Undergraduate Preparedness for Analytical Chemistry Practical Classes, J. Chem. Educ., 93(11), 1855–1862.
  20. Kirschner P. A. and Meester M. A. M., (1988), The laboratory in higher science education: problems, premises and objectives, High. Educ., 17(1), 81–98, DOI: 10.1007/bf00130901.
  21. Kirton S. B., Al-Ahmad A. and Fergus S., (2014), Using Structured Chemistry Examinations (SChemEs) As an Assessment Method To Improve Undergraduate Students’ Generic, Practical, and Laboratory-Based Skills, J. Chem. Educ., 91(5), 648–654, DOI: 10.1021/ed300491c.
  22. Kolk K. V. D., Beldman G., Hartog R. and Gruppen H., (2012), Students Using a Novel Web-Based Laboratory Class Support System: A Case Study in Food Chemistry Education, J. Chem. Educ., 89(1), 103–108, DOI: 10.1021/ed1005294.
  23. Kucharski M. M. and Seery M. K., (2016a), Completing a Distillation, November 2016, retrieved from https://www.youtube.com/watch?v=qZRRRXlZexg.
  24. Kucharski M. M. and Seery M. K., (2016b), Preparing a standard solution, November 2016, retrieved from https://www.youtube.com/watch?v=MeOAPbMvubE.
  25. Kucharski M. M. and Seery M. K., (2016c), Titration Exemplar Video, November 2016, retrieved from https://www.youtube.com/watch?v=rK7Egs-SJus.
  26. Lalla M., (2017), Fundamental characteristics and statistical analysis of ordinal variables: a review, Quality & Quantity, 51(1), 435–458.
  27. Mehta N. B., Hull A. L., Young J. B. and Stoller J. K., (2013), Just Imagine: New Paradigms for Medical Education, Academic Medicine, 88(10), 1418–1423, DOI: 10.1097/ACM.0b013e3182a36a07.
  28. Neeland E. G., (2007), A One-Hour Practical Lab Exam for Organic Chemistry, J. Chem. Educ., 84(9), 1453, DOI: 10.1021/ed084p1453.
  29. Ng W., (2015), New digital technology in education: conceptualizing professional learning for educators, Springer.
  30. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185, DOI: 10.1039/B5RP90026C.
  31. Rhodes M. M., (2010), A Laboratory Practical Exam for High School Chemistry, J. Chem. Educ., 87(6), 613–615, DOI: 10.1021/ed100200k.
  32. RSC, (2015), Accreditation of Degree Programmes.
  33. Sadler D. R., (1989), Formative assessment and the design of instructional systems, Instructional Science, 18(2), 119–144.
  34. Sadler P. M. and Good E., (2006), The impact of self- and peer-grading on student learning, Educ. Assess., 11(1), 1–31.
  35. Sawilowsky S. S., (2009), New effect size rules of thumb, J. Mod. Appl. Stat. Methods, 8(2), 597–599.
  36. Seery M. K., (2016), CVs for the 21st Century, Educ. Chem., 53(2), 32.
  37. Spagnoli D., Wong L., Maisey S. and Clemons T. D., (2017), Prepare, do, review: a model used to reduce the negative feelings towards laboratory classes in an introductory chemistry undergraduate unit, Chem. Educ. Res. Pract., 18(1), 26–44.
  38. Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O’Shea K., (2015), The Digital Pipetting Badge: A Method To Improve Student Hands-On Laboratory Skills, J. Chem. Educ., 92(12), 2038–2044, DOI: 10.1021/acs.jchemed.5b00464.
  39. Weaver II R. L. and Cotrell H. W., (1986), Peer evaluation: a case study, Innovative High. Educ., 11(1), 25–39.
  40. Winberg T. M. and Berg C. A. R., (2007), Students' cognitive focus during a chemistry laboratory exercise: effects of a computer-simulated prelab, J. Res. Sci. Teach., 44(8), 1108–1133.
  41. Yang J. C., Quadir B. and Chen N.-S., (2015), Effects of the Badge Mechanism on Self-Efficacy and Learning Performance in a Game-Based English Learning Environment, J. Educ. Comput. Res., 54(3), 371–394, DOI: 10.1177/0735633115620433.
  42. Zoller U., Tsaparlis G., Fatsow M. and Lubezky A., (1997), Student self-assessment of higher-order cognitive skills in college science teaching, J. Coll. Sci. Teach., 27(2), 99.

This journal is © The Royal Society of Chemistry 2017