What laboratory skills do students think they possess at the start of University?

Nimesh Mistry* and Stephen G. Gorman
School of Chemistry, University of Leeds, Leeds, West Yorkshire LS2 9JT, UK. E-mail: N.Mistry@leeds.ac.uk

Received 15th April 2019, Accepted 1st April 2020

First published on 6th April 2020


To be able to design a laboratory course, it is important to know what laboratory skills students possess before the course starts. This way, the course can focus on developing skills in areas that are lacking. Despite the extensive literature on laboratory education, there are few studies on what laboratory skills students have at this stage of their education. In this work, we aimed to address this by surveying students’ perceived knowledge, experience and confidence across a range of laboratory competencies at the start of a chemistry degree. Our key findings were that students perceived themselves to have the knowledge, experience and confidence to perform lower-order competencies such as practical techniques, but lacked the knowledge, experience and confidence to perform higher-order competencies such as designing experiments. From our results, we propose that instructors should be aware that experiments focussing on certain practical techniques may not be teaching students how to perform those techniques for the first time, but rather providing additional experience and confidence. We also propose that instructors should use laboratory courses to teach higher-order skills such as experimental design and problem-solving, where these skills are more evidently lacking.


Introduction

Since it was first postulated by Piaget, constructivism has gained significant importance in chemistry education as a model for how students learn (Bodner, 1986). Under the constructivist model, learners are not blank slates who absorb knowledge intact from the instructor. Instead, knowledge is constructed in the mind of the learner (Cooper and Stowe, 2018). This implies that learning is shaped by what the learner experiences, and that new knowledge is integrated into their pre-existing knowledge structures.

The importance of prior knowledge in the constructivist model has been highlighted by Ausubel, who stated that for students to construct knowledge in a meaningful way “students must have appropriate prior knowledge to which the new knowledge can be connected” and “new knowledge must be perceived as relevant to this prior knowledge” (Ausubel, 1968; Bretz, 2001). Vygotsky proposed that the amount of new knowledge a student can learn is inextricably linked to what they know already, commonly known as the zone of proximal development (ZPD) (Vygotsky and Cole, 1978).

In the field of laboratory education, there has been a great deal of discussion concerning the skills that students should learn through laboratory education (Kirschner and Meester, 1988; Hofstein and Lunetta, 2004; Reid and Shah, 2007; Bruck et al., 2010; Bruck and Towns, 2013; George-Williams et al., 2018). Recently, a framework for learning in the laboratory has been proposed, based upon the prior literature (Seery et al., 2019a). With complex learning theory as a theoretical framework, Seery proposes four guiding principles regarding laboratory learning:

1. The overarching purpose of the laboratory is to teach learners how to ‘do’ science.

2. Preparing students for learning in the laboratory is beneficial.

3. Explicit consideration needs to be given to teaching experimental techniques.

4. Consideration of learners’ emotions, motivations, and expectations is imperative in laboratory settings.

In the UK, the Quality Assurance Agency (QAA) provides a framework for what students should learn in an undergraduate chemistry degree. This includes guidelines of the laboratory skills that undergraduate chemistry students should learn (Quality Assurance Agency, 2014). Equivalent frameworks are also used in other countries, such as the ACS Guidelines used in the US (ACS, 2015) and the Australian Qualifications Framework in Australia (AQF, 2013). Furthermore, the Royal Society of Chemistry (RSC) also provides accreditation for chemistry degrees as a benchmark for the quality of the programme (Royal Society of Chemistry, 2017). RSC accreditation is used by Universities around the world. To receive accreditation, University chemistry departments have to teach the laboratory skills specified by the RSC.

Using constructivism as a model for learning, prior knowledge should also be considered by instructors who design and deliver laboratory courses. This was recognised by Reid and Shah (2007) who stated that “it is important that those directing university chemistry laboratories are aware of what is currently happening at schools… In this way, it is possible to plan university chemistry laboratories so that they can avoid repeating school laboratory experiences but also build on the kind of thinking skills which school courses seek to inculcate.”

Many students enrolled onto a chemistry or general science degree will have had some prior laboratory experience from high school. The majority of students in the UK, for example, will have completed an ‘A’ level qualification in chemistry. The UK government provides the specifications that exam boards must follow to provide an ‘A’ level qualification (Department for Education, 2014). This is the equivalent of the Framework for K-12 education used in the US (NRC, 2012). Included in these specifications are details of the laboratory skills students should develop from studying chemistry at high school. These specifications can therefore be used to determine what laboratory skills students have prior to enrolling on a chemistry degree programme.

At the beginning of a chemistry degree, George-Williams et al. (2018) investigated the skills that students expected to learn through a University laboratory course. Most students expected to enhance their understanding of theory, learn how to apply theory, and develop practical skills. This study helps to inform laboratory instructors of students’ perceptions looking forward. However, it would also be useful to have students look back at what they have learnt previously, so instructors can determine how prepared students are for laboratory courses at University. A better understanding would also help universities plan their laboratory curricula accordingly. A criticism of laboratory courses is that they do not directly assess laboratory skills (Seery et al., 2017; Seery et al., 2019b). In the UK, the ‘A’ level qualification assesses practical skills by written exam. This means grades may have little bearing on the practical ability of students. Therefore, asking students to self-assess their laboratory skills could provide a useful measure of their skills if entry grades do not reflect practical skill levels.

Herein, we describe our work to understand what laboratory skills students possess at the start of University, through student self-assessment of their laboratory skills, and how this can be used to inform instructors of how to design the laboratory curriculum at University.

Our research questions for this study are:

1. What laboratory skills do students think they possess at the start of University?

2. How should this inform the curriculum design of a University level laboratory course?

Methodology

Survey design

For the purpose of this study, a quantitative approach was used based upon students’ self-assessment of laboratory skills. Having students self-assess their knowledge, experience and confidence of laboratory skills has been used previously to evaluate the development of laboratory curricula (Towns et al., 2015; Hensiek et al., 2016; Seery et al., 2017). We felt this approach would be a suitable method to evaluate laboratory skills for our study.

The survey focused upon skills in synthetic chemistry (organic and inorganic) laboratory courses (Table 1). Survey items (questions) were designed around laboratory skills relating to synthetic chemistry that were specified in the relevant documents for high school level (Department for Education, 2014) and for University level chemistry education (Quality Assurance Agency, 2014; Royal Society of Chemistry, 2017).

Table 1 Details of the survey given to students at the start of University and results given as mean scores (N = 308)
Category Survey item Knowledge Experience Confidence
Literacy Recording experimental details in your laboratory notebook. 3.32 3.27 3.11
Writing a full laboratory report. 2.42 2.27 2.22
 
Health and safety Following health and safety information given in the laboratory manual for experiments. 4.13 3.97 4.06
Handling and disposing chemicals safely in the laboratory. 3.60 3.26 3.45
Assess the risk of a particular situation in the laboratory and deal with it in a safe manner. 3.63 3.48 3.17
Being able to work safely in the laboratory. 4.24 4.16 4.17
 
Problem-solving Using demonstrators (laboratory teaching assistants) to help me solve problems I encounter during an experiment. 3.77 3.36 3.79
Understanding how the advice given to me by a demonstrator (laboratory teaching assistant) will solve my problem. 3.90 3.55 3.86
Being able to make my own assessment of a problem I encounter during an experiment. 3.44 3.22 3.21
Being able to devise my own solution to a problem I encounter during an experiment. 3.17 2.91 2.91
 
Practical skills Setting-up and running a reaction under reflux. 3.60 3.23 3.11
Setting-up and running a reaction under controlled (dropwise) addition of reagents. 3.76 3.55 3.45
Monitoring the progress of a reaction by thin layer chromatography (TLC). 3.32 2.96 3.02
Isolating a crude product by liquid–liquid extraction (work-up) using a separating funnel. 3.16 2.75 2.84
Purifying a solid by recrystallisation. 3.63 3.31 3.22
Purifying a liquid by distillation. 3.69 3.37 3.37
 
Practical theory The chemical theory that underpins thin layer chromatography. 3.31 N/A 3.06
The chemical theory that underpins liquid–liquid extraction (work-up) using a separating funnel. 2.94 N/A 2.75
The chemical theory that underpins recrystallisation. 3.13 N/A 2.90
 
Experimental design Choosing a suitable set-up of a reaction (i.e. choice of glassware) if this information has not been given in a procedure. 3.07 2.60 2.71
Choosing suitable reaction parameters (e.g. solvent system) for monitoring reaction progress by thin layer chromatography. 2.36 2.07 2.08
Designing a procedure to purify a mixture by liquid–liquid extraction (work-up) with a separating funnel. 2.55 2.20 2.27
Finding an appropriate solvent to purify a solid by recrystallisation. 2.49 2.16 2.25
Choosing analytical methods that will verify if my reaction was successful or not. 3.10 2.69 2.75


The high school level guidelines were used to design survey items for laboratory skills which students would be expected to have learnt before arriving at University. The QAA benchmark statement and RSC accreditation criteria were used to design survey items of skills that students could be expected to develop through the laboratory course.

The survey was also designed to consider skills which could be deemed to be higher and lower-order. In chemistry education, lower and higher-order skills have been rationalised according to Bloom's taxonomy (Zoller and Pushkin, 2007; Ghani et al., 2017). Problem-solving, critical thinking and decision making are given as higher-order skills (Zoller and Pushkin, 2007; Ghani et al., 2017). In laboratory education, the development of higher-order skills is more likely to occur with the use of inquiry or problem-based experiments (Hofstein et al., 2005; Cooper and Kerns, 2006; Seery et al., 2019b). In addition to problem-solving and critical thinking, scientific thinking, data evaluation and experimental design are all given as examples of higher-order skills.

Recently, Tekkumru-Kisa developed the Task Analysis Guide in Science (TAGS) framework to encapsulate the variety of tasks (i.e. the types of instruction and activities completed by students) used in science education (Tekkumru-Kisa et al., 2015). Tasks are categorised as being either scientific content, scientific practice, or an integration of the two. Within each category, the nature of a scientific task can be defined by its cognitive level. At the lowest cognitive level are tasks involving memorisation (level 1). Next are tasks involving scripts (level 2), followed by tasks involving guidance for understanding (levels 3 and 4), and finally tasks involving doing science (level 5). By cross-referencing the category of task with its cognitive level, the TAGS framework defines nine types of tasks used in science education.

Under the TAGS framework, traditional ‘cookbook’ style experiments are determined to be a scientific content and practice type task with low (level 2) cognitive demand. This reflects the literature on student learning in ‘cookbook’ style experiments, which is often highly critical of this teaching method (Domin, 1999; Hofstein and Lunetta, 2004; Reid and Shah, 2007). Student reflections on what they are thinking while performing such laboratory experiments reveal that students focus on following procedures without thinking about what they are doing or why, and ask demonstrators (teaching assistants) to correct their mistakes instead of trying to solve them themselves (DeKorver and Towns, 2015; DeKorver and Towns, 2016; Galloway and Bretz, 2016). Complex learning theory has been used as a framework for developing experimental design skills (Seery et al., 2019b). The development of these skills is preceded by the teaching of practical techniques, which implies that practical techniques can be considered a lower-order skill in comparison to experimental design.

When comparing laboratory skills in the secondary and tertiary level specifications, most of the overlap concerned various practical techniques; however, both also placed some emphasis on the development of problem-solving skills.

The survey items fell into six general categories of literacy, health and safety, practical technique, practical theory, experimental design and problem-solving.

In some categories, there are lower and higher-order survey items within them. For example, the health and safety item ‘Following health and safety information given in the manual for experiments’ aligns with the lower-order skills of following procedures with little independent thinking. However, the category also contains the item ‘Assessing the risk of a particular situation in the laboratory and deciding how to deal with it in a safe manner’ which requires higher-order independent thought and decision making.

Some categories were mostly made up of lower or higher-order items, with the categories themselves linked together to provide a progression from lower to higher-order skills. An example of this is the way the categories of practical skills, practical theory and experimental design were linked. As mentioned previously, performing practical techniques can be considered lower-order as students can develop those skills with no understanding of how or why they are used (DeKorver and Towns, 2015; DeKorver and Towns, 2016; Galloway and Bretz, 2016). Practical theory requires some understanding so is higher-order in relation to practical skills, whilst experimental design requires the application of that understanding so is considered higher-order than practical theory (Cooper and Kerns, 2006; Seery et al., 2019a; Seery et al., 2019b).

For most survey items, students were asked to rate their knowledge, experience and confidence to align with Novak's theory of meaningful learning (Novak and Gowin, 1984; Bretz, 2001). This theory builds upon Ausubel's concepts of meaningful learning to provide a framework for how students integrate new knowledge and skills. Novak argued that meaningful learning occurs when new knowledge and skills connect with students across the cognitive (thinking), affective (feeling) and psychomotor (doing) domains. The Meaningful Learning in the Laboratory Instrument (MLLI) measures cognitive and affective dimensions of learning, and has been used widely to evaluate the effectiveness of laboratory courses (Galloway and Bretz, 2015a; Galloway and Bretz, 2015b; Galloway and Bretz, 2015c; Galloway and Bretz, 2015d; George-Williams et al., 2019).

For this survey, students' knowledge of particular laboratory skills aligns with the cognitive domain, their experience aligns with the psychomotor domain, and their confidence aligns with the affective domain (Towns et al., 2015; Hensiek et al., 2016).

The practical theory category was slightly different in that students only had to rate their knowledge and confidence. Students were asked to rate their knowledge, experience and confidence as a numerical value on a scale of 1–5. A score of 1 indicates low knowledge, experience or confidence, whereas a score of 5 indicates high knowledge, experience or confidence.

Data collection

Ethical approval was granted by the institution's ethical review board. The University of Leeds is a large research-intensive university in the UK. The survey was administered to first year Chemistry and Natural Science students over a two-year period (2017 and 2018). The survey was only available to students in the first two weeks of semester 1 of their first year, to eliminate the possibility of responses after the first year laboratory course had started. The survey was delivered using the online survey tool (www.onlinesurveys.ac.uk) and students were notified of the survey through the University Blackboard tool. Before completing the survey, students were made aware of the aims of the study, how it did not contribute to their grades, how it was not compulsory, and how they could withdraw their data at a future date if they so desired.

Data analysis

A total of 308 responses were received from two cohorts of first year students in 2017 and 2018. The response rates for each year were 84% and 78% respectively. We believe that the high response rate provides an accurate representation of first year students' self-assessment of laboratory skills for our institution.

Analysis of the validity and reliability of the survey was performed using SPSS. To determine whether the results from the two year groups differed significantly, and hence whether they could be combined, an independent t-test was performed for each item with year group as the independent variable. No item showed a statistically significant difference (p > 0.05), so the results are discussed and presented below as the combined responses of the two cohorts.
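As an illustration of this cohort check, the same per-item comparison can be sketched with SciPy's independent t-test. The ratings below are synthetic stand-ins, not the study's responses, and the analysis in the paper was run in SPSS.

```python
# Hypothetical per-item cohort comparison: an independent t-test on
# Likert ratings (1-5) from two cohorts. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
cohort_2017 = rng.integers(1, 6, size=160).astype(float)  # simulated ratings 1-5
cohort_2018 = rng.integers(1, 6, size=148).astype(float)

t_stat, p_value = stats.ttest_ind(cohort_2017, cohort_2018)
pool_cohorts = p_value > 0.05  # no significant difference -> safe to combine
```

In practice this test would be repeated for every survey item, pooling the cohorts only if no item shows a significant difference.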

Cronbach alpha values were measured for all survey items and for all knowledge, experience and confidence items to determine the scale reliability. Typically, an alpha value above 0.7 indicates good reliability. A Cronbach alpha value of 0.976 was measured for knowledge, experience and confidence items combined. All knowledge items gave an alpha value of 0.941; for all experience items the alpha value was 0.922; for all confidence items the alpha value was 0.938. These alpha values show good internal consistency for all items in the survey.
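Cronbach's alpha itself is straightforward to compute from a respondents × items rating matrix. The following NumPy sketch implements the standard formula on made-up data; it is illustrative only, not the SPSS routine used in the study.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total score)).
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: 2-D array with rows = respondents, columns = survey items."""
    k = ratings.shape[1]
    item_variances = ratings.var(axis=0, ddof=1)      # per-item variance
    total_variance = ratings.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Perfectly consistent items (every respondent rates both items identically)
perfect = np.array([[1, 1], [3, 3], [5, 5]], dtype=float)
print(cronbach_alpha(perfect))  # → 1.0
```

Alpha approaches 1 as items co-vary consistently, which is why values above roughly 0.7 are read as good scale reliability.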

Exploratory factor analysis (EFA) was used to identify the latent constructs of the survey (Appendix 6). Principal Axis Factoring was used as the factor extraction method, and Promax rotation was used to allow the factors to be correlated. This analysis revealed that the knowledge, experience and confidence of an item loaded onto the same factor in all cases, showing that the same skill was being measured whether it was rated for knowledge, experience or confidence. EFA also revealed some items in the same category loading onto the same factor; however, there was no category in which all items factored together. Therefore, EFA shows that the categories in our survey are correctly defined as general skill areas, whilst the items measure specific skills. Factor loading of items also correlated with our design of higher and lower-order skills. Further details are given in the Results and Discussion section, where the results for each category are discussed in more detail.
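The factor-loading step can be sketched as follows, with one caveat: the study used Principal Axis Factoring with Promax (oblique) rotation, whereas scikit-learn's FactorAnalysis performs maximum-likelihood extraction and offers only orthogonal (varimax) rotation, so this is an approximate illustration on synthetic data.

```python
# Illustrative EFA on synthetic survey-like data: two hypothetical latent
# skills generating six item responses, then recovering an items x factors
# loading matrix. Approximates, but does not replicate, the paper's
# Principal Axis Factoring + Promax analysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
latent = rng.normal(size=(308, 2))          # two hypothetical latent skills
true_loadings = rng.normal(size=(2, 6))     # six survey items
responses = latent @ true_loadings + rng.normal(scale=0.5, size=(308, 6))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(responses)
loadings = fa.components_.T                 # items x factors loading matrix
```

Items that load strongly onto the same factor would then be interpreted as measuring a shared underlying skill, as described above.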

Pearson's r, also known as the Pearson correlation coefficient, was measured for all items. This method provides an indication of the strength of correlation between continuous variables. A value close to +1 indicates a strong positive relationship between two variables, a value close to −1 indicates a strong negative relationship, and a value close to 0 indicates no correlation. Strong correlation values between 0.7–0.8 (p < 0.05) were obtained for the knowledge, experience and confidence of the same item. Moderate values of 0.3–0.7 (p < 0.05) were obtained for items within the same category, with a few exceptions where the correlation was stronger. Weak correlation values between 0–0.3 (p < 0.05) were obtained between items from different categories.
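As a sketch of this correlation analysis, SciPy's pearsonr returns both r and its p-value. The knowledge/experience ratings below are synthetic and constructed to be correlated, mimicking the strong within-item correlations reported above.

```python
# Pearson correlation between knowledge and experience ratings of the
# same (synthetic) item; the real analysis used the actual survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
knowledge = rng.integers(1, 6, size=308).astype(float)
# Experience built from knowledge plus noise, clipped back to the 1-5 scale
experience = np.clip(knowledge + rng.normal(scale=0.8, size=308), 1, 5)

r, p = stats.pearsonr(knowledge, experience)  # r near +1 => strong positive link
```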

Further analysis of survey items and categories was conducted by analysing the distributions of responses as percentages (Fig. 1 and 2, Appendices) and by comparing mean scores (Table 1). Paired t-tests were used to determine significant differences between the mean knowledge, experience and confidence scores of the same item (Appendix 7). These results will also be discussed in the Results and Discussion section. There is some debate about averaging Likert data in this type of analysis (Lalla, 2017). However, this debate centres on primary data being converted from ordinal statements (e.g. agree/disagree) into numerical values, then treating those responses as values on a continuous scale. In our study, students provide their primary data as numerical values, so we believe averaging these scores is a valid way to interpret the data. We also analyse the distribution of responses, which is a commonly accepted approach for Likert data.
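The paired comparison differs from the cohort comparison in that both scores in each pair come from the same respondent, so a paired (ttest_rel) rather than independent t-test is appropriate. Again the data below are synthetic, constructed so that knowledge is rated slightly higher than experience.

```python
# Paired t-test comparing knowledge and experience ratings of one item,
# where each pair of ratings comes from the same respondent. Synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
knowledge = rng.integers(2, 6, size=308).astype(float)
experience = knowledge - rng.integers(0, 2, size=308)  # 0 or 1 point lower

t_stat, p_value = stats.ttest_rel(knowledge, experience)
significant = p_value < 0.05  # knowledge rated significantly higher than experience
```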


Fig. 1 Percentage distribution for items in the literacy category (N = 308).

Fig. 2 Percentage distribution of responses for items relating to recrystallisation (N = 308) going in increasing order of cognitive demand.

Results and discussion

Literacy

Under the category of literacy, we wanted to determine students’ ability to record experimental data in a laboratory notebook and write scientific reports. These are arguably the two most important forms of written communication that a chemistry student should develop in a laboratory course. EFA and Pearson's r showed that students perceived these two items as two distinct skills.

Analysis of the percentage distribution of responses revealed that the most common ratings for recording experimental details in a lab book were 3 and 4 (Fig. 1). Mean values for knowledge, experience and confidence ranged from 3.11 to 3.32 (Table 1). Paired t-tests showed that the difference between knowledge and experience for this item was not statistically significant (p = 0.281); however, confidence differed significantly from both knowledge and experience (p < 0.05).

In comparison, ratings for writing a full laboratory report were lower, with the majority of students selecting lower ratings (Fig. 1), giving mean values between 2.22 and 2.42 (Table 1). As with the previous item, paired t-tests showed knowledge and experience were not significantly different (p = 0.183), whereas confidence was significantly different to both knowledge and experience (p < 0.05).

High school specifications under scientific literacy state that students should be able to “keep appropriate records of experimental activities and report findings from experimental activity” (Department for Education, 2014). University-level criteria also highlight the need for students to learn written communication skills (Quality Assurance Agency, 2014; Royal Society of Chemistry, 2017).

These results indicate that students perceive themselves to have good levels of knowledge and experience of recording the results of experimental activities in a laboratory notebook – an essential skill for a bench chemist. Nevertheless, University laboratory courses can still add value by developing this skill through further practice. Writing reports is perceived to be much less well developed, and therefore these skills need to be developed in University courses. It is also notable that for both items, knowledge received the highest mean score.

Health and safety

Learning how to handle chemicals safely is a key competency which was highlighted in both high school and University level practical specifications.

EFA resulted in two factors loading for the four items in this category. Following health and safety information […], handling and disposing chemicals safely […] and being able to work safely in the laboratory loaded onto the same factor. This is not surprising as these items relate to being safe in the laboratory, and can be achieved with lower-order cognition (i.e. following rules without understanding why). A separate factor was generated for assessing the risk of a particular situation […], which confirmed our hypothesis that this skill requires a higher level of cognition.

For the item being able to work safely in the laboratory, the majority of students gave high ratings of 4 and 5 (Appendix 1), leading to some of the highest mean scores generated from the survey responses (4.16–4.24). Paired t-tests showed the mean score for knowledge was significantly different to experience and confidence (p < 0.05), whereas experience and confidence were not (p = 0.73). The item following health and safety information […] had similarly high ratings and mean scores (3.97–4.13); here the mean scores for knowledge, experience and confidence were all significantly different (p < 0.05). Despite loading onto the same factor as the previous two items, handling and disposing chemicals safely […] had a lower distribution of ratings and lower mean scores (3.26–3.60). Paired t-tests showed these mean scores were all significantly different (p < 0.05). Assessing the risk of a particular situation […] gave similar mean scores to the previous item (3.17–3.63); paired t-tests confirmed the mean scores for knowledge, experience and confidence were all significantly different (p < 0.05).

High school criteria specify that students must be able to safely and correctly use a range of practical equipment and materials and follow written instructions (Department for Education, 2014). From these results, it appears that students believe they can do this to a high level, despite rating themselves comparatively lower on their ability to handle chemicals and make their own hazard assessments. As with the literacy category, knowledge was rated more highly than experience and confidence for each item.

Problem-solving

The ability to “solve problems in practical contexts” was clearly specified in the high school specification (Department for Education, 2014). Both sets of University level guidelines give extensive detail on how students should develop problem-solving skills through a chemistry degree (Quality Assurance Agency, 2014; Royal Society of Chemistry, 2017). Whilst the HE criteria do not explicitly state that these skills have to be developed through practical work, the laboratory provides an important environment for students to develop problem-solving skills.

EFA loaded two factors for the problem-solving items, which correlated with our survey design of including higher and lower levels of problem-solving. The lower cognitive levels of problem-solving – asking a demonstrator (laboratory teaching assistant) for help solving a problem and understanding that advice – loaded onto one factor. The second factor contained the items of higher cognitive demand, where students diagnose and solve their own problems.

For the two lower-order items, Using demonstrators (laboratory teaching assistants) to help me solve problems […] and Understanding how the advice given to me by a demonstrator […], the majority of students gave high ratings between 3 and 5 (Appendix 2), with mean values ranging between 3.36–3.90 (Table 1). Paired t-tests showed the mean scores for experience were significantly different to knowledge and confidence (p < 0.05), whereas knowledge and confidence were statistically similar (p = 0.681 and 0.201 respectively).

The two higher-order items, make my own assessment of a problem […] and devise my own solution […], were given lower ratings in comparison to the lower-order items. The majority of ratings were between 3 and 4, and mean scores ranged from 2.91 to 3.44. Paired t-tests for both items showed the mean scores for knowledge were significantly different to experience and confidence (p < 0.05), whereas experience and confidence were similar (p = 0.931 and 1 respectively).

Practical skills

Items in this category were not meant to be an exhaustive list of practical techniques, but to cover the fundamental practical skills that students would develop in a synthetic laboratory course. The techniques chosen for the survey were all explicitly mentioned in the high school specification. University level criteria also state that students should learn the practical skills needed to perform organic and inorganic synthesis.

EFA loaded each practical skill onto a different factor. Paired t-tests showed the differences between the mean scores for knowledge, experience and confidence of all practical skills items were statistically significant (p < 0.05).

The majority of students gave high self-ratings of 3–4 for being knowledgeable of how to perform a reflux, controlled (dropwise) addition, recrystallisation and distillation (Fig. 2 and Appendix 3), with mean knowledge scores between 3.60 and 3.76 (Table 1). Experience and confidence were rated slightly lower in comparison, giving mean scores between 3.11 and 3.55.

Students gave lower ratings for their knowledge, experience and confidence in performing a liquid–liquid extraction and thin layer chromatography. As with the previously mentioned practical skills, the mean values for knowledge (3.16 and 3.32) were higher than the comparative experience and confidence values (2.75–3.02). It is unclear why these two skills were rated lower than the others; one possibility is that the expense or hazards associated with these techniques limited students' exposure to them at high school.

However, the overall results from this category show that a majority of students perceive themselves to have learnt how to perform standard synthetic chemistry techniques before starting University. Therefore, increasing students’ experience and confidence could be the learning value that students gain from practising these skills at University.

Practical theory and experimental design

The high school specification states that students should be able to “comment on experimental design and evaluate scientific methods” (Department for Education, 2014). University level criteria state that Bachelor's students must have “the ability to plan experimental procedures, given well defined objectives”, whilst Master's students should also have the “ability to select appropriate techniques and procedures” and display “competence in the planning, design and execution of experiments” (Quality Assurance Agency, 2014).

For all these objectives, an understanding of how experiments are performed and the theoretical basis for practical techniques is required. We categorised this section as practical theory, and designed the survey items to evaluate student understanding of the same practical techniques that were included in the practical skills category. These same objectives from the secondary and tertiary specifications influenced the design of survey items under the experimental design category. Experimental design items were also linked to practical techniques and practical theory categories by asking students if they could choose the appropriate techniques or conditions to design a synthetic chemistry experiment.

EFA of the practical theory items provided two factors. Understanding the process of liquid–liquid extraction and recrystallisation loaded onto the same factor, whilst understanding thin layer chromatography gave a separate factor. Paired t-tests showed the differences between the mean scores of knowledge and confidence for all items were statistically significant (p < 0.05).

Students’ ratings of practical theory were lower than their ratings of the corresponding practical skills (Fig. 2 and Appendix 4). As a result, mean values were also lower, ranging from 2.71 to 3.31. This shows that many students believe they are able to perform these techniques despite having little understanding of how they work.

In the experimental design category, students were asked to rate their ability to choose a suitable reaction set-up, choose the appropriate solvent system for thin layer chromatography and recrystallisation, design a liquid–liquid extraction, and choose the appropriate analytical technique for a synthetic chemistry experiment.

For these items, EFA yielded four factors for the five items. Thin layer chromatography and recrystallisation loaded onto the same factor, whilst the other items loaded separately. This is not surprising, as designing both recrystallisation and thin layer chromatography conditions involves the choice of solvents. Paired t-tests showed that the mean scores for knowledge were significantly different for all items (p < 0.05). For two of the items in this category, choosing a suitable set-up […] and designing a liquid–liquid extraction […], the mean scores for experience and confidence were significantly different (p < 0.05). For the remaining items, choosing a suitable solvent for thin layer chromatography, choosing a suitable solvent for recrystallisation, and choosing analytical methods […], the experience and confidence means were statistically similar (p = 0.687, 0.077 and 0.095 respectively).
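The paired t-tests used throughout this section can be illustrated with scipy.stats.ttest_rel; the 5-point ratings below are simulated purely for the sketch (with knowledge shifted above confidence) and are not the survey data:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n = 308  # sample size reported in the appendices

# Simulated paired 5-point ratings for one item: each student rates
# both their knowledge and their confidence, with knowledge shifted higher.
knowledge = np.clip(np.rint(rng.normal(3.3, 1.0, n)), 1, 5)
confidence = np.clip(np.rint(rng.normal(2.8, 1.0, n)), 1, 5)

t_stat, p_value = ttest_rel(knowledge, confidence)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

As in the analysis above, a difference would be reported as statistically significant when p < 0.05.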

For all experimental design items, the majority of students gave low ratings between 1 and 3 (Fig. 2 and Appendix 5), leading to mean scores between 2.04 and 3.10 (Table 1). These results show that students perceive themselves to lack the skills needed to design experiments; activities that develop these skills are therefore required in University-level laboratory courses.

Implications for University laboratory courses

Our results and analysis from asking students to self-rate their laboratory skills at the start of a University chemistry degree show that students perceived themselves to have developed these skills to varying degrees, depending on the type of skill. Those involved in designing and delivering laboratory courses should take these perceived abilities into consideration.

In the Framework for Learning in the Chemistry Laboratory proposed by Seery et al. (2019a), one of the four guiding principles states that laboratory courses should have some focus on practical techniques. Most instructors expect students to learn lower-order skills such as practical techniques in a laboratory course (Bruck et al., 2010; Bruck and Towns, 2013; George-Williams et al., 2018). Here we have shown that students self-rate their ability to perform certain practical techniques at a high level before University courses have started. University instructors should therefore be aware that experiments whose learning objectives are to teach practical techniques may not in fact be teaching students how to perform these skills, but rather providing them with more experience and confidence.

The laboratory is a complex learning environment (Seery et al., 2019a), and this can leave students feeling overwhelmed by the amount of information they have to deal with, particularly at the start of a laboratory course (Reid and Shah, 2007). Laboratory induction activities can help to reduce this cognitive load by familiarising students with the laboratory environment before they have to perform assessed experiments. We recommend asking students to perform techniques with which they are already familiar, allowing them to focus on assimilating into their new laboratory surroundings.

Our results also show that there is a disconnect between students’ perceptions of being able to perform techniques, their understanding of why they are being asked to use them, and their ability to apply them to design an experiment. This indicates that the experiments performed in high school are largely expository (cookbook) experiments (Domin, 1999). This style of experiment has been widely criticised for its inability to develop students' higher-order skills (Kirschner and Meester, 1988; Hofstein and Lunetta, 2004). We have shown that first-year students can improve their understanding of practical techniques and develop experimental design skills through structured guided-inquiry experiments (Mistry et al., 2016). Other approaches to improving experimental design skills include a two-stage experiment, where the first stage introduces the technique in an expository format and the second is more open-ended (Seery et al., 2019b).

An additional advantage of more open-ended experiments is that they improve other higher-order skills such as problem-solving. The level of detail devoted to the development of these skills in HE specifications indicates how important it is that students learn to solve problems. In a laboratory context, it is important that students can ask demonstrators for help if needed, but our results show that students could improve their ability to solve problems independently. If, as our results indicate, students do not need a whole laboratory course focused on teaching practical techniques, then courses could incorporate open-ended experiments and projects to develop higher-order problem-solving skills (Berg et al., 2003; Hofstein et al., 2005; Flynn and Biggs, 2012; Sandi-Urena et al., 2012; Bertram et al., 2014).

It is pleasing that students feel they have the skills to work safely by following instructions. Given the importance of health and safety, we recommend that University instructors assess students’ abilities themselves in introductory experiments, using chemicals that are more hazardous than those students would have used in secondary education. At some stage in a course, however, students should be given more independence to make their own decisions relating to health and safety. For example, students could be asked to perform their own COSHH assessment for certain experiments once instructors are confident that students can follow the health and safety procedures in their laboratory.

Finally, for literacy skills there is a clear need to develop students' ability to communicate scientific experiments through written reports. Many traditional experiments assess laboratory skills through students’ ability to write laboratory reports. This practice has been criticised because reports do not directly reflect laboratory skills, leading to other forms of laboratory assessment being reported (Kirton et al., 2014; Seery et al., 2017). The authors support these forms of assessment. However, laboratory courses should still train students to write reports; one possible solution is to use report writing as the assessment format for open-ended experiments or projects.

Limitations

There are limitations that should be considered when interpreting the data. Students’ perceptions of their laboratory skills may not be an accurate representation of their abilities: students may have over- or underestimated them, particularly for skills with which they are unfamiliar. Asking students to complete this pre-survey after they had finished the course may have produced different results (Towns et al., 2015).

Another limitation is that this research was conducted at a single institution in the UK, so the results may not be representative of all UK chemistry students. However, as mentioned earlier, practical skills are not graded for ‘A’ level qualifications; the only requirement is that students complete the skills. Entry requirements are therefore not correlated with practical ability, and it is reasonable to conclude that students with different entry grades to those at the University of Leeds could have a similar skills profile.

These results may also not be representative of undergraduate students in other countries, as students in the UK specialise in chemistry earlier than in many other countries. These results may therefore be more representative of sophomore college students at the start of general chemistry I or organic chemistry I, or equivalent. We do believe, however, that the trend observed in these results, where knowledge is rated higher than experience and confidence, could generalise to all students transitioning from high school to University.

Finally, demographic data were not obtained in this survey. It is possible that there could be differences between students based on demographic factors such as gender and ethnicity.

Conclusion

In summary, we have used a self-assessment survey of laboratory skills that students would be expected to gain either before or during a chemistry degree. Students perceived themselves to have developed some of the skills expected of them before starting University, but there is scope for them to gain more experience and confidence through laboratory courses at degree level. Students were also more likely to rate their ability to perform lower-order laboratory skills more highly than higher-order skills. Many students believed that, at the start of University, they had not developed some of the higher-order skills specified in high school criteria, such as experimental design.

These results and findings can help HE instructors plan and design laboratory courses that provide the appropriate types of practical activity to develop the skills students lack.

We are continuing our use of this survey to monitor the development of these skills in our own laboratory courses and will disseminate the findings in due course.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: percentage distributions from the health and safety category; N = 308


image file: c9rp00104b-u1.tif

Appendix 2: percentage distributions from the problem-solving category; N = 308


image file: c9rp00104b-u2.tif

Appendix 3: percentage distributions from the practical skills category; N = 308


image file: c9rp00104b-u3.tif

Appendix 4: percentage distributions from the practical theory category; N = 308


image file: c9rp00104b-u4.tif

Appendix 5: percentage distributions from the experimental design category; N = 308


image file: c9rp00104b-u5.tif

Appendix 6: exploratory factor analysis


Table 3 Health and safety category
  Factor loading
Following health and safety information given in the laboratory manual for experiments: knowledge 0.967  
Following health and safety information given in the laboratory manual for experiments: experience 0.945  
Following health and safety information given in the laboratory manual for experiments: confidence 0.915  
Handling and disposing chemicals safely in the laboratory: experience 0.669  
Being able to work safely in the laboratory: experience 0.607  
Handling and disposing chemicals safely in the laboratory: knowledge 0.6  
Being able to work safely in the laboratory: knowledge 0.554  
Being able to work safely in the laboratory: confidence 0.518  
Handling and disposing chemicals safely in the laboratory: confidence 0.491  
Assess the risk of a particular situation in the laboratory and deal with it in a safe manner: confidence   0.998
Assess the risk of a particular situation in the laboratory and deal with it in a safe manner: experience   0.914
Assess the risk of a particular situation in the laboratory and deal with it in a safe manner: knowledge   0.877


Table 4 Problem-solving category
  Factor loading
Being able to devise my own solution to a problem I encounter during an experiment: knowledge 0.974  
Being able to devise my own solution to a problem I encounter during an experiment: confidence 0.935  
Being able to devise my own solution to a problem I encounter during an experiment: experience 0.868  
Being able to make my own assessment of a problem I encounter during an experiment: confidence 0.847  
Being able to make my own assessment of a problem I encounter during an experiment: knowledge 0.818  
Being able to make my own assessment of a problem I encounter during an experiment: experience 0.772  
Using demonstrators to help me solve problems I encounter during an experiment: knowledge   0.935
Understanding how the advice given to me by a demonstrator will solve my problem: knowledge   0.878
Using demonstrators to help me solve problems I encounter during an experiment: confidence   0.872
Using demonstrators to help me solve problems I encounter during an experiment: experience   0.831
Understanding how the advice given to me by a demonstrator will solve my problem: experience   0.787
Understanding how the advice given to me by a demonstrator will solve my problem: confidence   0.740


Table 5 Practical skills category
  Factor loading
Setting-up and running a reaction under controlled addition of reagents: experience 0.974          
Setting-up and running a reaction under controlled addition of reagents: confidence 0.935          
Setting-up and running a reaction under controlled addition of reagents: knowledge 0.868          
Monitoring the progress of a reaction by thin layer chromatography (TLC): experience   0.962        
Monitoring the progress of a reaction by thin layer chromatography (TLC): confidence   0.945        
Monitoring the progress of a reaction by thin layer chromatography (TLC): knowledge   0.927        
Purifying a liquid by distillation: knowledge     0.953      
Purifying a liquid by distillation: experience     0.942      
Purifying a liquid by distillation: confidence     0.910      
Isolating a crude product by liquid–liquid extraction: confidence       0.969    
Isolating a crude product by liquid–liquid extraction: experience       0.966    
Isolating a crude product by liquid–liquid extraction: knowledge       0.877    
Purifying a solid by recrystallisation: confidence         0.980  
Purifying a solid by recrystallisation: experience         0.930  
Purifying a solid by recrystallisation: knowledge         0.847  
Setting-up and running a reaction under reflux: experience           0.999
Setting-up and running a reaction under reflux: confidence           0.868
Setting-up and running a reaction under reflux: knowledge           0.835


Table 6 Practical theory category
  Factor loading
The chemical theory that underpins thin layer chromatography: knowledge 0.913  
The chemical theory that underpins thin layer chromatography: confidence 0.909  
The chemical theory that underpins liquid–liquid extraction: knowledge   0.874
The chemical theory that underpins liquid–liquid extraction: confidence   0.853
The chemical theory that underpins recrystallisation: confidence   0.722
The chemical theory that underpins recrystallisation: knowledge   0.659


Table 7 Experimental design category
  Factor loading
Choosing suitable parameters for monitoring reaction progress by TLC: knowledge 0.902      
Choosing suitable parameters for monitoring reaction progress by TLC: confidence 0.890      
Choosing suitable parameters for monitoring reaction progress by TLC: experience 0.848      
Finding an appropriate solvent to purify a solid by recrystallisation: confidence 0.749      
Finding an appropriate solvent to purify a solid by recrystallisation: experience 0.702      
Finding an appropriate solvent to purify a solid by recrystallisation: knowledge 0.690      
Designing a procedure to purify a mixture by liquid–liquid extraction: experience   0.948    
Designing a procedure to purify a mixture by liquid–liquid extraction: confidence   0.923    
Designing a procedure to purify a mixture by liquid–liquid extraction: knowledge   0.921    
Choosing a suitable set-up of a reaction if this information has not been given in a procedure: experience     0.874  
Choosing a suitable set-up of a reaction if this information has not been given in a procedure: confidence     0.843  
Choosing a suitable set-up of a reaction if this information has not been given in a procedure: knowledge     0.795  
Choosing analytical methods that will verify if my reaction was successful or not: knowledge       0.949
Choosing analytical methods that will verify if my reaction was successful or not: experience       0.885
Choosing analytical methods that will verify if my reaction was successful or not: confidence       0.876


Appendix 7: paired t-tests

Tables 8 and 9.
Table 8 Literacy, health and safety and problem-solving categories
Item 1 Item 2 t df Sig. (p-value)
Recording experimental details in your laboratory notebook.: knowledge Recording experimental details in your laboratory notebook.: experience 1.08 308 0.281
Recording experimental details in your laboratory notebook.: knowledge Recording experimental details in your laboratory notebook.: confidence 5.664 308 <0.001
Recording experimental details in your laboratory notebook.: experience Recording experimental details in your laboratory notebook.: confidence 3.881 308 <0.001
Writing a full laboratory report.: knowledge Writing a full laboratory report.: experience 4.042 308 0.183
Writing a full laboratory report.: knowledge Writing a full laboratory report.: confidence 5.617 308 <0.001
Writing a full laboratory report.: experience Writing a full laboratory report.: confidence 1.335 308 <0.001
 
Following health and safety information given in the manual: knowledge Following health and safety information given in the manual: experience 4.644 308 <0.001
Following health and safety information given in the manual: knowledge Following health and safety information given in the manual: confidence 2.022 308 0.044
Following health and safety information given in the manual: experience Following health and safety information given in the manual: confidence 2.182 308 0.03
Handling and disposing chemicals safely in the laboratory: knowledge Handling and disposing chemicals safely in the laboratory: experience 7.413 308 <0.001
Handling and disposing chemicals safely in the laboratory: knowledge Handling and disposing chemicals safely in the laboratory: confidence 3.792 308 <0.001
Handling and disposing chemicals safely in the laboratory: experience Handling and disposing chemicals safely in the laboratory: confidence 4.621 308 <0.001
Assess the risk of a particular situation in the laboratory: knowledge Assess the risk of a particular situation in the laboratory: experience 10.579 308 <0.001
Assess the risk of a particular situation in the laboratory: knowledge Assess the risk of a particular situation in the laboratory: confidence 4.276 308 <0.001
Assess the risk of a particular situation in the laboratory: experience Assess the risk of a particular situation in the laboratory: confidence 7.132 308 <0.001
Being able to work safely in the laboratory: knowledge Being able to work safely in the laboratory: experience 3.022 308 0.003
Being able to work safely in the laboratory: knowledge Being able to work safely in the laboratory: confidence 2.759 308 0.006
Being able to work safely in the laboratory: experience Being able to work safely in the laboratory: confidence 0.346 308 0.73
 
Using demonstrators to help me solve problems: knowledge Using demonstrators to help me solve problems: experience 8.919 308 <0.001
Using demonstrators to help me solve problems: knowledge Using demonstrators to help me solve problems: confidence 0.412 308 0.681
Using demonstrators to help me solve problems: experience Using demonstrators to help me solve problems: confidence 8.22 308 <0.001
Understanding how the advice will solve my problem: knowledge Understanding how the advice given to me will solve my problem: experience 8.489 308 <0.001
Understanding how the advice given to me will solve my problem: knowledge Understanding how the advice given to me will solve my problem: confidence 1.281 308 0.201
Understanding how the advice given to me will solve my problem: experience Understanding how the advice given to me will solve my problem: confidence 7.582 308 <0.001
Being able to make my own assessment of a problem: knowledge Being able to make my own assessment of a problem: experience 6.279 308 <0.001
Being able to make my own assessment of a problem: knowledge Being able to make my own assessment of a problem: confidence 6.151 308 <0.001
Being able to make my own assessment of a problem: experience Being able to make my own assessment of a problem: confidence 0.087 308 0.931
Being able to devise my own solution to a problem: knowledge Being able to devise my own solution to a problem: experience 7.864 308 <0.001
Being able to devise my own solution to a problem: knowledge Being able to devise my own solution to a problem: confidence 6.901 308 <0.001
Being able to devise my own solution to a problem: experience Being able to devise my own solution to a problem: confidence 0 308 1


Table 9 Practical skills, practical theory and experimental design categories
Item 1 Item 2 t df Sig. (p-value)
Setting-up and running a reaction under reflux: knowledge Setting-up and running a reaction under reflux: experience 8.33 308 <0.001
Setting-up and running a reaction under reflux: knowledge Setting-up and running a reaction under reflux: confidence 12.449 308 <0.001
Setting-up and running a reaction under reflux: experience Setting-up and running a reaction under reflux: confidence 3.233 308 0.001
Setting-up and running a reaction under controlled addition: knowledge Setting-up and running a reaction under controlled addition: experience 6.346 308 <0.001
Setting-up and running a reaction under controlled addition: knowledge Setting-up and running a reaction under controlled addition: confidence 9.234 308 <0.001
Setting-up and running a reaction under controlled addition: experience Setting-up and running a reaction under controlled addition: confidence 3.117 308 0.002
Monitoring the progress of a reaction by TLC: knowledge Monitoring the progress of a reaction by TLC: experience 8.598 308 <0.001
Monitoring the progress of a reaction by TLC: knowledge Monitoring the progress of a reaction by TLC: confidence 8.183 308 <0.001
Monitoring the progress of a reaction by TLC: experience Monitoring the progress of a reaction by TLC: confidence 4.562 308 0.019
Isolating a crude product by liquid–liquid extraction: knowledge Isolating a crude product by liquid–liquid extraction: experience 10.213 308 <0.001
Isolating a crude product by liquid–liquid extraction: knowledge Isolating a crude product by liquid–liquid extraction: confidence 8.716 308 <0.001
Isolating a crude product by liquid–liquid extraction: experience Isolating a crude product by liquid–liquid extraction: confidence 2.801 308 0.005
Purifying a solid by recrystallisation: knowledge Purifying a solid by recrystallisation: experience 7.937 308 <0.001
Purifying a solid by recrystallisation: knowledge Purifying a solid by recrystallisation: confidence 10.697 308 <0.001
Purifying a solid by recrystallisation: experience Purifying a solid by recrystallisation: confidence 2.54 308 0.012
Purifying a liquid by distillation: knowledge Purifying a liquid by distillation: experience 8.343 308 <0.001
Purifying a liquid by distillation: knowledge Purifying a liquid by distillation: confidence 9.409 308 <0.001
Purifying a liquid by distillation: experience Purifying a liquid by distillation: confidence 6.213 308 0.003
 
The chemical theory that underpins thin layer chromatography: knowledge The chemical theory that underpins thin layer chromatography: confidence 8.729 308 <0.001
The chemical theory that underpins liquid–liquid extraction: knowledge The chemical theory that underpins liquid–liquid extraction: confidence 7.15 308 <0.001
The chemical theory that underpins recrystallisation: knowledge The chemical theory that underpins recrystallisation: confidence 2.483 308 0.014
 
Choosing a suitable set-up of a reaction: knowledge Choosing a suitable set-up of a reaction: experience 12.398 308 <0.001
Choosing a suitable set-up of a reaction: knowledge Choosing a suitable set-up of a reaction: confidence 10.637 308 <0.001
Choosing a suitable set-up of a reaction: experience Choosing a suitable set-up of a reaction: confidence 2.908 308 0.004
Choosing suitable parameters for monitoring reaction progress by TLC: knowledge Choosing suitable parameters for monitoring reaction progress by TLC: experience 8.047 308 <0.001
Choosing suitable parameters for monitoring reaction progress by TLC: knowledge Choosing suitable parameters for monitoring reaction progress by TLC: confidence 9.475 308 <0.001
Choosing suitable parameters for monitoring reaction progress by TLC: experience Choosing suitable parameters for monitoring reaction progress by TLC: confidence 0.404 308 0.687
Designing a procedure to purify a mixture by liquid–liquid extraction: knowledge Designing a procedure to purify a mixture by liquid–liquid extraction: experience 10.438 308 <0.001
Designing a procedure to purify a mixture by liquid–liquid extraction: knowledge Designing a procedure to purify a mixture by liquid–liquid extraction: confidence 9.063 308 <0.001
Designing a procedure to purify a mixture by liquid–liquid extraction: experience Designing a procedure to purify a mixture by liquid–liquid extraction: confidence 2.15 308 0.032
Finding an appropriate solvent to purify a solid by recrystallisation: knowledge Finding an appropriate solvent to purify a solid by recrystallisation: experience 6.427 308 <0.001
Finding an appropriate solvent to purify a solid by recrystallisation: knowledge Finding an appropriate solvent to purify a solid by recrystallisation: confidence 7.298 308 <0.001
Finding an appropriate solvent to purify a solid by recrystallisation: experience Finding an appropriate solvent to purify a solid by recrystallisation: confidence 1.774 308 0.077


Acknowledgements

We would like to thank the University of Leeds Pedagogic Research in Mathematics and Physical Sciences (PRiSM) group for help with gaining ethical approval for this work.

References

  1. ACS, Guidelines and Evaluation Procedures for Bachelor's Programs, https://www.acs.org/content/acs/en/about/governance/committees/training/acs-guidelines-supplements.html, (accessed 24 February, 2020).
  2. AQF, Australian Qualifications Framework, https://www.aqf.edu.au/sites/aqf/files/aqf-2nd-edition-january-2013.pdf, (accessed 20 January, 2020).
  3. Ausubel D. P., (1968), The psychology of meaningful verbal learning: an introduction to school learning, New York: Grune & Stratton.
  4. Berg C. A. R., Bergendahl V. C. B., Lundberg B. and Tibell L., (2003), Benefiting from an open-ended experiment? A comparison of attitudes to, and outcomes of, an expository versus an open-inquiry version of the same experiment, Int. J. Sci. Educ., 25, 351–372.
  5. Bertram A., Davies E. S., Denton R., Fray M. J., Galloway K. W., George M. W., Reid K. L., Thomas N. R. and Wright R. R., (2014), From Cook to Chef: Facilitating the Transition from Recipe-driven to Open-ended Research-based Undergraduate Chemistry Lab Activities, New Dir., 10, 26–31.
  6. Bodner G. M., (1986), Constructivism: A theory of knowledge, J. Chem. Educ., 63, 873.
  7. Bretz S. L., (2001), Novak's Theory of Education: Human Constructivism and Meaningful Learning, J. Chem. Educ., 78, 1107.
  8. Bruck A. D. and Towns M., (2013), Development, Implementation, and Analysis of a National Survey of Faculty Goals for Undergraduate Chemistry Laboratory, J. Chem. Educ., 90, 685–693.
  9. Bruck L. B., Towns M. and Bretz S. L., (2010), Faculty Perspectives of Undergraduate Chemistry Laboratory: Goals and Obstacles to Success, J. Chem. Educ., 87, 1416–1424.
  10. Cooper M. M. and Kerns T. S., (2006), Changing the Laboratory: Effects of a Laboratory Course on Students' Attitudes and Perceptions, J. Chem. Educ., 83, 1356.
  11. Cooper M. M. and Stowe R. L., (2018), Chemistry Education Research—From Personal Empiricism to Evidence, Theory, and Informed Practice, Chem. Rev., 118, 6053–6087.
  12. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92, 2031–2037.
  13. DeKorver B. K. and Towns M. H., (2016), Upper-level undergraduate chemistry students’ goals for their laboratory coursework, J. Res. Sci. Teach., 53, 1198–1215.
  14. Department for Education U. G., GCE AS and A level subject content for biology, chemistry, physics and psychology, https://www.gov.uk/government/publications/gce-as-and-a-level-for-science, (accessed 8 April, 2019).
  15. Domin D. S., (1999), A Review of Laboratory Instruction Styles, J. Chem. Educ., 76, 543.
  16. Flynn A. B. and Biggs R., (2012), The Development and Implementation of a Problem-Based Learning Format in a Fourth-Year Undergraduate Synthetic Organic and Medicinal Chemistry Laboratory Course, J. Chem. Educ., 89, 52–57.
  17. Galloway K. R. and Bretz S. L., (2015a), Development of an Assessment Tool To Measure Students’ Meaningful Learning in the Undergraduate Chemistry Laboratory, J. Chem. Educ., 92, 1149–1158.
  18. Galloway K. R. and Bretz S. L., (2015b), Measuring Meaningful Learning in the Undergraduate Chemistry Laboratory: A National, Cross-Sectional Study, J. Chem. Educ., 92, 2006–2018.
  19. Galloway K. R. and Bretz S. L., (2015c), Measuring Meaningful Learning in the Undergraduate General Chemistry and Organic Chemistry Laboratories: A Longitudinal Study, J. Chem. Educ., 92, 2019–2030.
  20. Galloway K. R. and Bretz S. L., (2015d), Using cluster analysis to characterize meaningful learning in a first-year university chemistry laboratory course, Chem. Educ. Res. Pract., 16, 879–892.
  21. Galloway K. R. and Bretz S. L., (2016), Video episodes and action cameras in the undergraduate chemistry laboratory: eliciting student perceptions of meaningful learning, Chem. Educ. Res. Pract., 17, 139–155.
  22. George-Williams S. R., Ziebell A. L., Kitson R. R. A., Coppo P., Thompson C. D. and Overton T. L., (2018), ‘What do you think the aims of doing a practical chemistry course are?’ A comparison of the views of students and teaching staff across three universities, Chem. Educ. Res. Pract., 19, 463–473.
  23. George-Williams S. R., Karis D., Ziebell A. L., Kitson R. R. A., Coppo P., Schmid S., Thompson C. D. and Overton T. L., (2019), Investigating student and staff perceptions of students' experiences in teaching laboratories through the lens of meaningful learning, Chem. Educ. Res. Pract., 20, 187–196.
  24. Ghani I. B. A., Ibrahim N. H., Yahaya N. A. and Surif J., (2017), Enhancing students' HOTS in laboratory educational activity by using concept map as an alternative assessment tool, Chem. Educ. Res. Pract., 18, 849–874.
  25. Hensiek S., DeKorver B. K., Harwood C. J., Fish J., O’Shea K. and Towns M., (2016), Improving and Assessing Student Hands-On Laboratory Skills through Digital Badging, J. Chem. Educ., 93, 1847–1854.
  26. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: Foundations for the twenty-first century, Sci. Educ., 88, 28–54.
  27. Hofstein A., Navon O., Kipnis M. and Mamlok-Naaman R., (2005), Developing students' ability to ask more and better questions resulting from inquiry-type chemistry laboratories, J. Res. Sci. Teach., 42, 791–806.
  28. Kirschner P. A. and Meester M. A. M., (1988), The laboratory in higher science education: Problems, premises and objectives, Higher Educ., 17, 81–98.
  29. Kirton S. B., Al-Ahmad A. and Fergus S., (2014), Using Structured Chemistry Examinations (SChemEs) As an Assessment Method To Improve Undergraduate Students’ Generic, Practical, and Laboratory-Based Skills, J. Chem. Educ., 91, 648–654.
  30. Lalla M., (2017), Fundamental characteristics and statistical analysis of ordinal variables: a review, Qual. Quant., 51, 435–458.
  31. Mistry N., Fitzpatrick C. and Gorman S., (2016), Design Your Own Workup: A Guided-Inquiry Experiment for Introductory Organic Laboratory Courses, J. Chem. Educ., 93, 1091–1095.
  32. Novak J. D. and Gowin D. B., (1984), Learning how to learn, Cambridge: Cambridge University Press.
  33. NRC, (2012), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, Washington, DC: The National Academies Press.
  34. Quality Assurance Agency UK, Subject Benchmark Statement: Chemistry, https://www.qaa.ac.uk/quality-code/subject-benchmark-statements, (accessed 8 April, 2019).
  35. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8, 172–185.
  36. Royal Society of Chemistry UK, Accreditation of degree programmes, https://www.rsc.org/education/courses-and-careers/accredited-courses/, (accessed 8 April, 2019).
  37. Sandi-Urena S., Cooper M. and Stevens R., (2012), Effect of Cooperative Problem-Based Lab Instruction on Metacognition and Problem-Solving Skills, J. Chem. Educ., 89, 700–706.
  38. Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O’Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges, Chem. Educ. Res. Pract., 18, 403–419.
  39. Seery M. K., Agustian H. Y. and Zhang X., (2019a), A Framework for Learning in the Chemistry Laboratory, Isr. J. Chem., 59, 546–553.
  40. Seery M. K., Jones A. B., Kew W. and Mein T., (2019b), Unfinished Recipes: Structuring Upper-Division Laboratory Work To Scaffold Experimental Design Skills, J. Chem. Educ., 96, 53–59.
  41. Tekkumru-Kisa M., Stein M. K. and Schunn C., (2015), A framework for analyzing cognitive demand and content-practices integration: Task analysis guide in science, J. Res. Sci. Teach., 52, 659–685.
  42. Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O’Shea K., (2015), The Digital Pipetting Badge: A Method To Improve Student Hands-On Laboratory Skills, J. Chem. Educ., 92, 2038–2044.
  43. Vygotsky L. S. and Cole M., (1978), Mind in society: the development of higher psychological processes, Cambridge: Harvard University Press.
  44. Zoller U. and Pushkin D., (2007), Matching Higher-Order Cognitive Skills (HOCS) promotion goals with problem-based laboratory practice in a freshman organic chemistry course, Chem. Educ. Res. Pract., 8, 153–171.

This journal is © The Royal Society of Chemistry 2020