Identifying skill inequalities in undergraduate chemistry laboratory teaching

Aaron G. Jimenez and David P. August*
School of Chemistry, The University of Edinburgh, David Brewster Road, Edinburgh, EH9 3FJ, UK. E-mail: David.August@ed.ac.uk

Received 21st April 2025, Accepted 17th June 2025

First published on 19th June 2025


Abstract

Designing effective laboratory courses that take prior knowledge and experience into account is important for reducing inequalities and skill gaps within higher education. Whilst many anecdotal trends are known, this study aims to provide quantitative confirmation of skill gaps within first-year undergraduate cohorts. Students who studied A-level chemistry considered themselves both more experienced and more confident in a number of lab skills when compared to students who had completed either Scottish Highers or the International Baccalaureate, with skills involving more sophisticated equipment the most likely to differ. A similar relationship was observed between private and state-funded schooling, perhaps linked to the fact that a much higher proportion of A-level respondents were privately educated. International students displayed similar experience and confidence in general lab skills compared to students from the UK, but were less confident in their written and spoken English skills, despite the fact that 91% of them completed their final school education in English. International students were also less confident in their ability to design experimental methodologies. Covid-19 continues to have an impact on recent student cohorts, with Scottish students experiencing higher levels of disruption. It is hoped that confirmation of these long-held preconceptions, and the identification of the specific lab skills that vary most with educational background, will help future course design provide focused support to the students who need it most.


Introduction

Rigorous laboratory training is widely considered to be a vital component at all stages of the chemistry undergraduate curriculum, for developing both research skills and an understanding of the applications of chemical theory (Hofstein and Lunetta, 2004; Reid and Shah, 2007). Many previous studies have also identified the need to teach laboratory skills based on constructivist principles or inquiry-based learning—with students laying foundational skills before building on these to develop research independence and experiment design (Domin, 1999; Farrell et al., 1999; DeKorver and Towns, 2015; Wheeler et al., 2015; Seery et al., 2019). However, the varying lab experience of incoming undergraduate students presents a challenge when designing courses that are inclusive for all.

With many universities in the UK accepting students from a broad range of cultural and educational backgrounds, even with set entry requirements, there is an inevitable variation in lab skill confidence and competence amongst first-year cohorts. Within the UK alone, secondary education curricula include A-levels, the International Baccalaureate and Scottish Highers (or Advanced Highers), but others, including the Irish Leaving Certificate, are also encountered. The differences between the devolved UK education systems complicate university admissions further, with Scottish students having the option to leave school and begin higher education a year earlier than their English and Welsh peers, although Scottish universities may also offer Direct Entry to 2nd year provided certain criteria are met.

There are many widely held beliefs about how prior schooling experience may impact lab skills; Mistry and Gorman have already shown that students’ competency can vary greatly depending on the type of skill (Mistry and Gorman, 2020). Batty and Reilly have also shown how prior laboratory training can affect a student's confidence in undergraduate science laboratories (Grant, 2011; Batty and Reilly, 2023). However, there has, to date, been very little quantitative analysis or identification of the factors that most strongly contribute to students’ prior practical experience. This study aims to investigate:

1. How do factors such as country of study, curriculum followed and school type influence students’ perceptions of experience and confidence in key laboratory skills?

2. Which groups of students may require additional support at the start of their undergraduate chemistry studies?

3. How can the results of this study inform the curriculum redesign of first-year laboratory courses to narrow the laboratory skill gap and promote equitable retention across diverse schooling backgrounds?

Research methodology

Survey design

A survey method was chosen and developed based on previous studies that have already successfully used student self-assessment to quantify lab skills (Towns et al., 2015; Hensiek et al., 2016; Seery et al., 2017; Mistry and Gorman, 2020). The survey was split into four main sections:

1. Contextual information to assess the students’ previous education and cultural background.

2. Student self-assessments of their experience in 23 key lab skills.

3. Student self-assessments of their confidence in the same set of 23 key lab skills.

4. Student self-assessments of 13 complementary soft skills.

The data collected included contextual information on the students’ school type, the curriculum they followed and their country of study during the final two years of their schooling, as well as any final grades achieved. Respondents were also presented with three 7-point Likert scale questions to gauge their overall perceived quality of teaching during this period and one 5-point Likert scale question to investigate the impact of Covid-19 on their chemistry education.

The 23 key laboratory skills were chosen and collated from three high-school chemistry curricula: A-levels, Scottish Advanced Highers and the International Baccalaureate. Notably, these foundational skills featured heavily in first-year undergraduate laboratory manuals, the QAA benchmark statement (QAA, 2022) and RSC accreditation criteria (RSC, 2024), representing competencies that students are expected to acquire by the end of their high-school education or first year of university.

A 5-point Likert scale, complemented by detailed scale descriptors (see Section S7 of the ESI), was utilised to ensure result consistency and mitigate potential ambiguities (Sullivan and Artino, 2013). A value of 1 denoted minimal experience or low confidence, whereas a value of 5 signified extensive experience or high confidence. Descriptors for ‘Experience’ quantified the frequency of skill utilisation, whilst those for ‘Confidence’ also considered the need for guidance, supervision and the participant's comfort in executing the skill safely and effectively (Hensiek et al., 2016). Students who rated their confidence in a skill as a 4 or a 5 also felt comfortable teaching the skill safely and effectively to others. Participants could also indicate that they had not encountered a specific skill in their schooling; this was assigned a value of 0.

The final aspect of the survey sought to assess transferable or “soft” skills, deemed critical to success in the laboratory and wider chemistry education (Montgomery et al., 2022). The list of skills used was identified following initial interviews with members of academic teaching and support staff (see Section S2 of the ESI for a list of the 13 skills utilised), and was also guided by existing literature (Reid and Shah, 2007; Kondo and Fair, 2017; Hill et al., 2019). For this section, a 5-point bipolar Likert scale was utilised, ranging from ‘Strongly Disagree’ to ‘Strongly Agree’.

Data collection

Ethical approval was granted by the institution's School-level ethical review board. The institution is a large, research-intensive higher education institution based in the UK. The survey was carried out with both first-year and direct-entry students who were registered on chemistry laboratory courses within the first semester of their undergraduate studies. The survey was advertised via cohort emails, the online teaching platform Blackboard Learn Ultra and direct attendance of the researchers in class. The survey was made available to students between the 20th of November 2023 and the 4th of February 2024 and was administered via the JISC online survey tool v2 (https://www.onlinesurveys.ac.uk). Prior to completing the survey, students were made aware of the aims of the study and reminded that all data were collected anonymously and stored securely, that participation was optional, and that they could withdraw their data at a future date using a unique response code if they so desired.

Data analysis

A total of 140 responses were received from first-year students in the 2023/24 cohort, out of a possible 581. This represents a response rate of 58% for those on a Chemistry-specific track and 8% for those on a Biological Chemistry track. No statistically significant differences were observed between the Chemistry and Biological Chemistry students with respect to the overall perceived quality of teaching (see Section S9 of the ESI).

Cronbach alpha values were calculated for the three ‘Overall perceived quality of teaching’ questions to determine the consistency and reliability of the Likert scale used (Warmbrod, 2014) (see Section S6 of the ESI). A value above 0.7 was taken to indicate an acceptable scale, in which high correlation and consistency are observed across the items employed (Sullivan and Artino, 2013).
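As a point of reference, this internal-consistency check can be reproduced with a short script. The sketch below is a minimal illustration rather than the authors' actual analysis pipeline: the function name `cronbach_alpha` and the example response matrix are hypothetical, and the 0.7 threshold follows the criterion stated above.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) array of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each question
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to the three 'overall perceived quality of teaching'
# questions (7-point scale), one row per respondent.
quality_items = np.array([
    [5, 6, 5],
    [4, 4, 3],
    [6, 7, 6],
    [3, 4, 4],
    [5, 5, 6],
])

print(f"Cronbach's alpha = {cronbach_alpha(quality_items):.2f}")  # > 0.7 taken as acceptable
```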

As previously advocated by other studies (Hensiek et al., 2016; Mistry and Gorman, 2020), comparison of mean values using numerical primary data was determined to be a valid method to interpret the data for these questions. Statistical significance was thus determined using a two-tailed independent t-test.
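To illustrate this step, the snippet below runs a two-tailed independent t-test on two hypothetical groups of 7-point ratings using SciPy; the group data are invented, and the choice of `equal_var` (False gives Welch's variant, which does not assume equal variances) is an assumption not specified in the text.

```python
from scipy import stats

# Hypothetical 'prepared for the lab' ratings (1-7 scale) for two groups.
private = [6, 5, 7, 6, 5, 6, 4, 7]
state = [5, 4, 5, 6, 4, 3, 5, 5]

# Two-tailed independent t-test (the default alternative is two-sided).
t_stat, p_value = stats.ttest_ind(private, state, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 treated as significant
```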

A 1–7 bipolar Likert scale was employed for the three questions related to overall perceptions, allowing for increased granularity and precision in data collection compared to a 5-point scale (Jamieson, 2004). Scale descriptors were provided for the extreme and central values, where a value of 1 represented ‘Not at all’, a value of 4 represented ‘Average’ and a value of 7 represented ‘Exceptionally’ (see Section S7 of the ESI).

Whilst the Cronbach alpha values suggest reliable scales for the remaining self-assessment practical skill sections, it was decided that the data should not be treated as parametric (Lalla, 2017). As discussed by Lalla, the prescribed scale descriptors used in the survey design are not equidistant, rendering the data non-parametric and ordinal in nature (Lalla, 2017). This disparity becomes particularly apparent when considering the additional ‘0’ value assigned to students who had not encountered a given skill at school. Furthermore, the application of a paired t-test is inappropriate for Sections 2 and 3 due to the non-normal distribution of the data (McCrum-Gardner, 2008). For this study, the individual skills data were treated as non-parametric, ordinal data, assuming unequal variance and non-normal distribution. A Mann–Whitney U test was employed to determine statistical significance (McKnight and Najab, 2010), with further discussion provided in the Results and discussion section (see Section S5 of the ESI for a full list of p-values obtained for each skill).
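A minimal sketch of the non-parametric comparison is shown below, assuming hypothetical 0–5 experience ratings for two groups of respondents (with 0 denoting a skill not encountered at school); SciPy's `mannwhitneyu` carries out the two-sided test.

```python
from scipy.stats import mannwhitneyu

# Hypothetical 0-5 experience ratings for a single skill in two groups.
a_level_ratings = [4, 5, 3, 4, 5, 2, 4, 3, 5, 4]
highers_ratings = [2, 3, 0, 2, 4, 1, 3, 2, 0, 3]

# Two-sided Mann-Whitney U test on the ordinal ratings.
u_stat, p_value = mannwhitneyu(a_level_ratings, highers_ratings, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```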

Limitations

All participants were surveyed approximately halfway through their first year of university laboratory teaching. A very small number (n = 2) were direct-entry students, meaning that they entered directly into the second year of study, but they had still only completed half a year of university laboratory teaching at the point of the survey. Whilst this means that results are biased by early experiences in university lab courses, it also gave students the opportunity to reflect on their introduction to university education and base their responses on how their experiences of university education matched their expectations.

Survey answers were based on the students’ own perceptions of their skill competency and confidence, rather than any form of skill assessment.

Although respondents were from a range of degree programmes, all results were obtained from a single year group at a single UK research institution. Whilst the number of respondents provides a statistically relevant sample size, the results may be biased towards a single institution's geographical location or student intake. Some groups may also be over- or under-represented with respect to the actual make-up of the full cohort.

Results and discussion

School type

Although only 6.4% of English and 4% of Scottish pupils attend private schools, these schools utilise 15% of educational resources (Green, 2022). In England, 18% of 16–19-year-olds attend private schools (Green, 2022) and these students remain overrepresented at top universities; 60% of privately educated students complete degrees at Russell Group universities, compared to 23% of students attending a comprehensive school (Montacute and Cullinane, 2018). Windle has previously discussed the impact of social restriction within the UK education system, stating that private schools have become increasingly selective as they focus further on ‘guaranteeing’ entry to Oxford and Cambridge Universities (Windle, 2015).

To better understand the impact of the differences in educational background, specifically in an undergraduate laboratory setting, participants were divided into two categories. Categorisation was determined by the source of educational funding: private or state funded (Table 1). For this analysis, the one respondent who self-reported as home-schooled was omitted.

Table 1 Categorisation of recorded school type into private vs. state funding
Private grouping (n = 56): Independent schools (n = 35); International schools (n = 14); Independent schools with a significant scholarship or bursary (n = 7).
State grouping (n = 83): Non-selective state schools (n = 80); Selective state (grammar) schools (n = 3).


When assessing the overall perceived quality of teaching by school type (Fig. 1), privately educated students reported feeling that their schools prepared them better for the laboratory than state-educated students did (p = 0.036). This is despite the fact that no statistically significant differences were observed in the overall perceived quality of teaching of “theory and exams” (p = 0.701) or “lab and practical skills” (p = 0.162).


Fig. 1 Overall perceived quality of teaching comparing private vs. state schools.

Further analysis of the students’ perceived experience across the 23 key skills showed that privately educated students felt statistically more experienced across 4 key skills and more confident in 6 lab skills (see Table 2). Notably, the majority of these skills involved significant pieces of apparatus (gas syringes and voltmeters) or more expensive glassware (QuickFit). This is consistent both with the known resource allocation and with the perception that privately educated students have access to more educational resources than their state-funded counterparts (Green, 2022).

Table 2 Skills in which privately educated students felt statistically more experienced and more confident compared to state-educated respondents
Experience: Completing a redox titration with a range of indicators; Measuring gas volume using a gas syringe; Conducting a calorimetry experiment; Tests for key organic molecules and ions.
Confidence: Measuring gas volume using a gas syringe; Conducting a calorimetry test; Constructing a voltaic cell; Use of QuickFit glassware; Tests for key organic molecules and ions; Writing risk assessments.
Soft skills: I am confident designing my own experimental methodologies. The only skill in which state-educated students scored statistically higher was Written English Skills.


The only area in which state-educated students scored statistically significantly higher was Written English Skills. However, this is very likely influenced by the higher proportion of privately educated students who studied abroad (48%), compared to their state-educated counterparts (less than 5%) (Fig. 2). Notably, 78% of internationally educated private-school students stated that English was not their first language, compared to 21% of respondents who attended UK-based private schools. This may also highlight the low number of state-funded international students who are able to continue into further education within the UK, or reflect the higher proportion of international educational institutions that are privately funded.


Fig. 2 Student self-assessed proficiency in written English skills by school type highlighting the differences between privately educated students who were educated in either the UK or abroad.

School curriculum

To investigate the influence of the curriculum students studied on their practical laboratory skills, participants were categorised into four primary groups based on the curriculum they pursued during their final two years of schooling (Fig. 3). The three main curriculum categories were Scottish Highers or Advanced Highers (SQA, 2019a, 2019b), A-levels (AQA, 2015), and the International Baccalaureate, IB (International Baccalaureate, 2025). The fourth category comprised a subset of students (6%) who completed alternative curricula.
Fig. 3 Breakdown of respondents based on curricula studied in the final two years of school.

Whilst 29% of survey respondents studied in England, 39% of respondents completed A-levels, the standard English pre-university curriculum, indicating a high uptake of A-levels amongst international students (see Fig. 4). Conversely, 48% of respondents studied at Scottish schools, whilst only 46% studied Scottish Highers or Advanced Highers, indicating that around 2% of respondents studied at Scottish schools but followed alternative curricula. State-funded schooling within the UK almost always follows the relevant national curriculum, so any mismatch between country of study and curriculum is largely due to the private education sector.


Fig. 4 Breakdown of respondents based on location of study during the final two years of school.
A-Levels. The overall perception of teaching quality for students who undertook A-levels was statistically higher (p < 0.05) than for students who undertook Scottish Highers or other curricula. This was true for all three sub-categories: Theory and Exams, Lab and Practical Skills, and Prepared for the Lab (Fig. 5). There was no statistical difference (p > 0.05) between A-level and IB students, perhaps indicating the similarity in course level between these two curricula, in contrast to students who only completed Scottish Highers.
Fig. 5 Overall perceived quality of teaching based on the curricula studied by respondents.
International Baccalaureate. Despite no overall differences, students who studied the IB felt statistically less experienced in 6 skills and less confident in 3 in comparison with students taking A-levels (see Table 3). Similar to the differences observed with school type, IB students also appear to feel less experienced across reagent- and equipment-specific skills, despite the high percentage of international students who were privately educated.
Table 3 Statistically significant skill gaps, where A-level students scored higher than IB students
Experience: Distillation using a Liebig condenser; Experiments under reflux; Use of QuickFit glassware; Tests for key organic molecules and ions; Vacuum filtration using a Buchner funnel; Initial rate method.
Confidence: Experiments under reflux; Tests for key organic molecules and ions; Vacuum filtration using a Buchner funnel.
Soft skills: I understand the theory behind my labs; I am independent in the lab.


The IB itself does not require students to complete prescribed practical work in the way that the A-level does (AQA, 2015), with large sections of both the course and assessment covering flexible teaching approaches or student-led investigations (International Baccalaureate, 2025). This more general and inquiry-led approach may explain why, despite IB students reporting less experience in several specific laboratory skills, their overall perception of “preparation for the lab” was statistically equal to that of A-level students.

Scottish Highers/Advanced Highers. Students who studied Scottish Highers felt that the quality of their teaching for Theory and Exams and for Lab and Practical Skills, as well as their overall preparedness for the laboratory, was significantly lower than for students who studied A-levels (Fig. 5).

Additionally, students who undertook Scottish Highers or Advanced Highers felt less experienced across 11 and less confident across 9 of the surveyed skills (Table 4). As with the IB, many of the skills identified involved access to specific apparatus such as QuickFit glassware or distillation equipment, perhaps due to the higher proportion of A-level students who were educated privately. Of those students who studied Scottish Highers, 89% went on to study Advanced Highers in their final year of school, a level more comparable to A-level. However, whilst 98% of the surveyed A-level students were awarded a grade of A or A*, only 69% of those who obtained a Scottish Higher went on to obtain a grade of A or B at Advanced Higher. When the results for overall perceived quality of teaching are grouped, there is a significant increase in the quality of “theory and exam” teaching for students achieving an A or B at Advanced Higher level versus those who obtained a C or lower (p = 0.014) (Fig. 6). However, despite the lower grades, those obtaining a C or lower gave a similar rating for their practical skills and lab preparation, perhaps indicating the perceived value of practical experience versus knowledge and understanding. Whilst the trends are less clear for those students who did not take Advanced Highers, due to the low number of respondents (n = 7), these students still felt statistically less prepared for the lab than students who did study Advanced Higher (p = 0.024 and p = 0.027 versus A/B or <C grades respectively).

Table 4 Statistically significant skill gaps, where A-level students scored higher than respondents who studied Scottish Highers or Advanced Highers
Experience: Measure gas volume using a gas syringe; Conduct a calorimetry experiment; Construct a voltaic cell; Distillation using a Liebig condenser; Experiment under reflux; Use of QuickFit glassware; Tests for key organic molecules and ions; Measure the voltage of an electrochemical cell; Writing risk assessments; Initial rate method; Continuous analysis for rate experiments.
Confidence: Measure gas volume using a gas syringe; Conduct a calorimetry experiment; Construct a voltaic cell; Distillation using a Liebig condenser; Tests for key organic molecules and ions; Measure voltage of electrochemical cell; Writing risk assessments; Initial rate method; Continuous analysis for rate experiments.
Soft skills: No significant differences identified.



Fig. 6 Overall perceived quality of teaching by Scottish students who achieved an A/B grade in Adv. Higher chemistry versus those who achieved a grade of C or lower, as well as those who did not take Adv. Higher chemistry.

These results highlight the complications arising from Scottish Highers and university admissions, with Advanced Highers having little to no impact on general university admission criteria in Scotland (although direct 2nd year entry or other adjustments may be available at some Scottish institutions). As a result, many Scottish Higher students enter their first year of university study having either not studied chemistry in their final year of school, or having achieved a significantly lower grade (C or below) in their final year of study. This difference in the undergraduate admissions process likely explains why students who completed Higher/Advanced Higher chemistry perceive their experience and confidence in a range of lab skills to be lower than their A-level counterparts, rather than the gap reflecting a direct difference in the quality and quantity of lab training encountered in each curriculum.

Country of study

Despite recent concerns over the long-term sustainability of international student admissions (PWC, 2024), in 2023/24 UK Universities continued to maintain large international student populations—with international students making up 15% and 52% of the total UK student population for undergraduate and postgraduate programmes respectively (HESA, 2024).

The survey respondents were split into two categories: Home (those students who completed their final two years of school in the UK) and International (Table 5). This is distinct from university admission data, where some students may have studied within the UK but still be classified as ‘international’ due to their residency or fee-paying status.

Table 5 Categorisation of recorded school location into Home vs. International
Home grouping (n = 107): Scotland (n = 67); England & Wales (n = 40).
International grouping (n = 33): China (n = 16); Other European (n = 10); Other (n = 7).


Home students felt statistically more experienced in only two skills, and more confident in one compared with international students (Table 6). However, there was no significant difference in the overall perception of previous lab teaching between home and international students.

Table 6 Statistically significant skill gaps, where UK home students scored higher than international respondents
Experience: Experiments under reflux; Vacuum filtration using a Buchner funnel.
Confidence: Vacuum filtration using a Buchner funnel.
Soft skills: My written English skills are strong; My spoken English skills are strong; I feel confident designing my own experimental methodologies.


International students felt less confident in both their English speaking and writing skills compared with their UK counterparts. This is perhaps unsurprising given that 78% of the international students stated that English was not their first language, compared to 15% of home students. However, 91% of the international students surveyed completed their final school studies in English, indicating that previous grades in courses taught in English may be an ineffective mechanism to guarantee a specific level of English proficiency and that post-admission English support may still be relevant for these students.

International students were also statistically less confident in their ability to design their own experimental methodologies. Perhaps this relates to the perceived notion that international schooling, particularly in China, can focus more on the recollection of knowledge and facts rather than the application of this knowledge to unknown or unseen problems. However, it is important to note that recent studies have shown only “weak” evidence for this difference in critical thinking ability (Rear, 2017; Fan and See, 2022). It may even be that this perceived cultural difference is itself responsible for the different confidence levels observed.

Impact of Covid-19

As a result of the Covid-19 pandemic, widespread and severe restrictions were in place in the UK, and much of the rest of the world, from March 2020 through to March 2021, which had a direct impact on chemistry education (Betthäuser et al., 2023; Cramman et al., 2024). Whilst 95% of the survey participants completed their schooling after 2022, with 84% completing their schooling in 2023, there remained the possibility that their earlier experiences of Covid-19 disruption, and of residual policies, varied.

The overall perceived impact of Covid-19 was mixed, with a global mean response of 2.45 and standard deviation of 1.15 on a 5-point Likert scale ranging from 1 (‘not at all disrupted’) to 5 (‘severely disrupted’). Despite previous reports on the unequal impact of Covid-19 for students from lower income backgrounds (Betthäuser et al., 2023; Golden et al., 2023), state-educated students reported no significant difference in disruption compared to privately educated students (p = 0.292) (see Section S10 of the ESI).

Participants who studied in Scotland reported significantly higher disruption compared with those in England/Wales (p < 0.001) or based outside of the UK (p = 0.046), with a clear increase in the number reporting moderate or worse disruption (Fig. 7). A similar trend was observed for participants who studied Scottish Highers/Advanced Highers versus A-levels (p = 0.008). No significant differences were observed for students who undertook the IB.


Fig. 7 Overall perceived impact of Covid-19 on students’ chemistry school education, separated into three groups dependent on the country of study.

No significant difference in the impact of Covid-19 was observed between Scottish students who completed an Advanced Higher versus those who only completed Highers (p = 0.973). One reason for the perceived increase in disruption for Scottish students could be the decision to remove the practical assignment and project from both the Higher and Advanced Higher chemistry assessment from 2020 to 2023 (Traynor et al., 2025). This will have been the case for all respondents in this study who completed a Scottish Higher in chemistry.

Implications for lab course design

This study identifies a number of laboratory techniques and soft skills that are likely to show significant differences in experience or confidence levels amongst new undergraduate chemistry students (Table 7).
Table 7 Overview of key skills that showed statistically significant (p < 0.05) differences in perceived experience or confidence across first year undergraduate chemistry students
Experience: Distillation using a Liebig condenser; Experiments under reflux; Use of QuickFit glassware; Tests for key organic molecules and ions; Vacuum filtration using a Buchner funnel; Initial rate method; Measure gas volume using a gas syringe; Conduct a calorimetry experiment; Construct a voltaic cell; Measure the voltage of an electrochemical cell; Writing risk assessments; Continuous analysis for rate experiments.
Confidence: Experiments under reflux; Tests for key organic molecules and ions; Vacuum filtration using a Buchner funnel; Measure gas volume using a gas syringe; Conduct a calorimetry experiment; Construct a voltaic cell; Distillation using a Liebig condenser; Measure voltage of electrochemical cell; Writing risk assessments; Initial rate method; Continuous analysis for rate experiments.
Soft skills: I understand the theory behind my labs; I am independent in the lab; My written English skills are strong; My spoken English skills are strong; I feel confident designing my own experimental methodologies.


Whilst all incoming undergraduate students are likely to have suitable experience with simple lab apparatus, there is a wider skill gap for techniques that require access to more expensive or specialised equipment (e.g., distillation apparatus or gas syringes). Care should therefore be taken in early undergraduate lab sessions to make sure sufficient guidance is provided, either before or during practical sessions, for any apparatus that may not have been widely available in high-school laboratories.

International students, and those completing Scottish Highers, perceived themselves as having less experience or confidence in experimental design (and in related investigative tasks such as risk assessments). With the popularity of inquiry-led approaches on the rise in undergraduate laboratories (Pavelich and Abraham, 1977; Domin, 1999; Fay et al., 2007; Walker et al., 2011; Thomson and Lamie, 2022), there is a concern that an initial imbalance in student skills may lead to a widening skill gap, as previous lab experience can enable greater success. This is particularly true for exercises carried out in groups, where more confident or experienced individuals can take the lead and inadvertently widen the skill gaps further (Francis et al., 2022; Doucette and Singh, 2023).

Conclusions

This study seeks to provide quantitative confirmation of the anecdotal observations that many university lab organisers are already aware of: that first-year chemistry students arrive with a diverse range of educational and cultural backgrounds shaped by school type, curriculum followed and country of study.

Which factors influence laboratory skills the most?

Students who had undertaken Scottish Highers or the International Baccalaureate considered themselves less experienced or confident in skills requiring more advanced equipment compared with those who completed A-levels. The same was also true when comparing state-funded schools with the private education sector, perhaps reflecting the greater percentage of A-level students who were educated privately and may have had better access to more expensive laboratory equipment (Green, 2022). Nationality or location of study was less important, with very little perceived difference in the preparedness of international students versus UK students, with the exception of English language skills. A reduced confidence in English amongst international students was observed despite the fact that 91% of the international students surveyed completed their school studies in English. Disruption due to Covid-19 was most evident in students who studied in Scotland.

Which groups of students require more support?

It is clear from these results that many Scottish students, particularly those who did not complete Adv. Highers or who performed poorly in them, perceived themselves as being less prepared for undergraduate chemistry practicals. Whilst many Scottish universities provide options allowing those with good A-level or Adv. Higher grades direct entry to 2nd year, many students do not take this option, and these skill inequalities persist within 1st year cohorts. Furthermore, universities in England and Wales often allow a combination of Highers and Adv. Highers as entry requirements (typically with an Adv. Higher in Chemistry); however, some allow entry with only Highers, particularly for those students, often registered as widening participation students, for whom Adv. Highers were not offered. It is therefore important that students with only Scottish Highers (or poor performances at Adv. Higher) are provided with additional support after admission to assist with starting undergraduate lab courses.

How can the results of this study inform future curriculum design?

Whilst many UK chemistry undergraduate courses already provide excellent laboratory training programmes, this study has determined which common laboratory skills should likely be targeted in the design of first year undergraduate laboratory courses to address inequalities in students’ past experiences. Care should be taken to provide additional support, before and during sessions, for activities involving more expensive lab apparatus (e.g. QuickFit glassware). Additional and optional pre-lab information should also be provided to all students, to enable those with limited experience the opportunity to prepare further.

Whilst this study has identified the skill inequalities likely present in current 1st-year chemistry cohorts at UK institutions, it is not yet clear whether this skill gap continues to exist in chemistry graduates or if existing measures within teaching laboratories, and a reduction in the long-term impact of Covid-19 policies, are sufficient to address existing inequalities. The survey set out in this study can be used to assess specific skill gaps within new undergraduate cohorts at institutions where the majority of applicants hold A-levels, Scottish Highers or an IB—providing the information required to ensure that the resources provided accurately map to the specific skills development required.

Author contributions

AJ designed and carried out the research, data curation and formal analysis. DA provided supervision and resources. Both authors contributed to writing, reviewing and editing this document.

Conflicts of interest

There are no conflicts of interest to declare.

Data availability

The data analysis supporting this article has been included as part of the ESI. The full data set collected from human participants is not available to maintain confidentiality.

Acknowledgements

The authors wish to thank the University of Edinburgh for funding as well as all the staff and students who participated in this study. They would like to thank Dr Nina Chadwick, Dr Peter Kirsop, Dr Uwe Schneider, Kirsty Bain, Dr Mairi Haddow and Dr Ben Arenas for their expert input and reflections on the content of this article.

Notes and references

  1. AQA, (2015), AS and A-Level Chemistry Specification. Available at https://cdn.sanity.io/files/p28bar15/green/9bc48d822ce0f2a5b8a6ce62de590a4c06281535.pdf (Accessed: 20 March 2025).
  2. Batty L. and Reilly K., (2023), Understanding barriers to participation within undergraduate STEM laboratories: towards development of an inclusive curriculum, J. Biol. Educ., 57(5), 1147–1169 DOI:10.1080/00219266.2021.2012227.
  3. Betthäuser B. A., Bach-Mortensen A. M. and Engzell P., (2023), A systematic review and meta-analysis of the evidence on learning during the COVID-19 pandemic, Nat. Hum. Behav., 7(3), 375–385 DOI:10.1038/s41562-022-01506-4.
  4. Cramman H., Arenas B., Awais R., Balaban C., Cropper C. and Dennis F. et al., (2024), Post-16 students’ experience of practical science during the COVID-19 pandemic and the impact on students’ self-efficacy in practical work, Enhancing Teach. Learn. High. Educ., 2, 39–69 DOI:10.62512/etlhe.14.
  5. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92(12), 2031–2037 DOI:10.1021/acs.jchemed.5b00463.
  6. Domin D. S., (1999), A Review of Laboratory Instruction Styles, J. Chem. Educ., 76(4), 543 DOI:10.1021/ed076p543.
  7. Doucette D. and Singh C., (2023), Making Lab Group Work Equitable and Inclusive, J. Coll. Sci. Teach., 52(4), 31–37 DOI:10.1080/0047231X.2023.12290630.
  8. Fan K. and See B. H., (2022), How do Chinese students’ critical thinking compare with other students? A structured review of the existing evidence, Think. Skills Creat., 46, 101145 DOI:10.1016/j.tsc.2022.101145.
  9. Farrell J. J., Moog R. S. and Spencer J. N., (1999), A Guided-Inquiry General Chemistry Course, J. Chem. Educ., 76(4), 570 DOI:10.1021/ed076p570.
  10. Fay M. E., Grove N. P., Towns M. H. and Bretz S. L., (2007), A rubric to characterize inquiry in the undergraduate chemistry laboratory, Chem. Educ. Res. Pract., 8(2), 212–219 10.1039/B6RP90031C.
  11. Francis N. J., Allen M. and Thomas J., (2022), Using group work for assessment – an academic's perspective, Advance HE. Available at https://www.advance-he.ac.uk/sites/default/files/2022-03/Using%20group%20work%20for%20assessment%20%E2%80%93%20an%20academic%E2%80%99s%20perspective.pdf (Accessed: 26 March 2025).
  12. Golden A. R., Srisarajivakul E. N., Hasselle A. J., Pfund R. A. and Knox J., (2023), What was a gap is now a chasm: remote schooling, the digital divide, and educational inequities resulting from the COVID-19 pandemic, Curr. Opin. Psychol., 52, 101632 DOI:10.1016/j.copsyc.2023.101632.
  13. Grant L., (2011), Lab Skills of New Undergraduates, The Gatsby Charitable Foundation, Available at https://www.gatsby.org.uk/uploads/education/reports/pdf/russell-group-survey-of-lab-skills-report-laura-grant-may-2011.pdf (Accessed: 26 March 2025).
  14. Green F., (2022), Private schools and inequality, IFS Deaton Review of Inequalities, Oxford Open Econom., 3, 842–849.
  15. Hensiek S., DeKorver B. K., Harwood C. J., Fish J., O’Shea O. and Towns M., (2016), Improving and Assessing Student Hands-On Laboratory Skills through Digital Badging, J. Chem. Educ., 93, 1847–1854 DOI:10.1021/acs.jchemed.6b00234.
  16. HESA, (2024), Statistical Bulletin SB269. Higher Education Student Statistics: UK, 2022/23 – Where students come from and go to study. Available at https://www.hesa.ac.uk/news/08-08-2024/sb269-higher-education-student-statistics/location (Accessed: 19 March 2025).
  17. Hill M. A., Overton T. L., Thompson C. D., Kitson R. R. A. and Coppo P., (2019), Undergraduate recognition of curriculum-related skill development and the skills employers are seeking, Chem. Educ. Res. Pract., 20(1), 68–84 10.1039/C8RP00105G.
  18. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundations for the twenty-first century, Sci. Educ., 88, 28–49 DOI:10.1002/sce.10106.
  19. International Baccalaureate, (2025), Diploma Programme Subject Brief: Chemistry. Available at https://www.ibo.org/globalassets/new-structure/recognition/pdfs/dp_sciences_chemistry_subject-brief_jan_2022_e.pdf (Accessed: 19 March 2025).
  20. Jamieson S., (2004), Likert scales: how to (ab)use them? Med. Educ., 38(12), 1217–1218 DOI:10.1111/j.1365-2929.2004.02012.x.
  21. Kondo A. E. and Fair J. D., (2017), Insight into the Chemistry Skills Gap: The Duality between Expected and Desired Skills, J. Chem. Educ., 94(3), 304–310 DOI:10.1021/acs.jchemed.6b00566.
  22. Lalla M., (2017), Fundamental characteristics and statistical analysis of ordinal variables: a review, Qual. Quant., 51, 435–458 DOI:10.1007/s11135-016-0314-5.
  23. McCrum-Gardner E., (2008), Which is the correct statistical test to use? Br. J. Oral. Maxillofac. Surg., 46(1), 38–41 DOI:10.1016/j.bjoms.2007.09.002.
  24. McKnight P. E. and Najab J., (2010), Mann-Whitney U Test, in Weiner I. B. and Craighead W. E. (ed.), The Corsini Encyclopedia of Psychology, Wiley, p. 1 DOI:10.1002/9780470479216.corpsy0524.
  25. Mistry N. and Gorman S. G., (2020), What laboratory skills do students possess at the start of university? Chem. Educ. Res. Pract., 21, 823–838 10.1039/C9RP00104B.
  26. Montacute R. and Cullinane C., (2018), Access to Advantage: the influence of schools and place on admissions to top universities, The Sutton Trust.
  27. Montgomery T. D., Buchbinder J. R., Gawalt E. S., Iuliucci R. J., Koch A. S., Kotsikorou E., Lackey P. E., Lim M. S., Rohde J. J., Rupprecht A. J., Srnec M. N., Vernier B. and Evanseck J. D., (2022), The Scientific Method as a Scaffold to Enhance Communication Skills in Chemistry, J. Chem. Educ., 99, 2338–2350 DOI:10.1021/acs.jchemed.2c00113.
  28. Pavelich M. J. and Abraham M. R., (1977), Guided Inquiry Laboratories for General Chemistry Students. J. Coll. Sci. Teach., 7(1), 23–26, https://www.jstor.org/stable/42984818.
  29. PWC, (2024), UK Higher Education Financial Sustainability Report. Available at: https://www.pwc.co.uk/industries/government-public-sector/education/financial-sustainability-of-uk-higher-education-sector.html (Accessed: 19 March 2025).
  30. QAA, (2022), Subject Benchmark Statement: Chemistry. Available at: https://www.qaa.ac.uk/docs/qaa/sbs/sbs-chemistry-22.pdf?sfvrsn=46b1dc81_6 (Accessed: 24 March 2025).
  31. Rear D., (2017), Reframing the Debate on Asian Students and Critical Thinking: Implications for Western Universities, JCIE, 12(2), 18–33 DOI:10.20355/C5P35F.
  32. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8, 172–185 10.1039/B5RP90026C.
  33. RSC, (2024), Accreditation of Degree Programmes. Available at: https://www.rsc.org/standards-and-recognition/accreditation/degree-accreditation (Accessed: 24 March 2025).
  34. Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O’Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges, Chem. Educ. Res. Pract., 18, 403–419 10.1039/C7RP00003K.
  35. Seery M. K., Agustian H. Y. and Zhang X., (2019), A Framework for Learning in the Chemistry Laboratory, Isr. J. Chem., 59(6–7), 546–553 DOI:10.1002/ijch.201800093.
  36. SQA, (2019a), Advanced Higher Chemistry Course Specification, Available at https://www.sqa.org.uk/sqa/files_ccc/RNQAHChemistryCourseSpecification.pdf (Accessed: 20 March 2025).
  37. SQA, (2019b) Higher Chemistry Course Specification, Available at https://www.sqa.org.uk/files_ccc/HigherCourseSpecChemistry.pdf (Accessed: 20 March 2025).
  38. Sullivan G. M. and Artino A. R., (2013), Analyzing and Interpreting Data from Likert-Type Scales, J. Grad. Med. Educ., 5, 541–542 DOI:10.4300/JGME-5-4-18.
  39. Thomson P. I. T. and Lamie P., (2022), Introducing Elements of Inquiry and Experimental Design in the First Year of an Undergraduate Laboratory Program, J. Chem. Educ., 99(12), 4118–4123 DOI:10.1021/acs.jchemed.2c00311.
  40. Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O’Shea O., (2015), The Digital Pipetting Badge: A Method To Improve Student Hands-On Laboratory Skills, J. Chem. Educ., 92, 2038–2044 DOI:10.1021/acs.jchemed.5b00464.
  41. Traynor E., Scott F. J. and Thomson P. I. T., (2025), Responses of teachers in Scotland to the reintroduction of the practical project in the advanced higher chemistry curriculum. Chem. Teach. Int., 7(1), 183–193 DOI:10.1515/cti-2024-0073.
  42. Walker J. P., Sampson V. and Zimmerman C. O., (2011), Argument-Driven Inquiry: An Introduction to a New Instructional Model for Use in Undergraduate Chemistry Labs. J. Chem. Educ., 88(8), 1048–1056 DOI:10.1021/ed100622h.
  43. Warmbrod J. R., (2014), Reporting and Interpreting Scores Derived from Likert-type Scales, J. Agricul. Educ., 55(5), 30–47 DOI:10.5032/jae.2014.05030.
  44. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2015), Teaching assistants’ perceptions of a training to support an inquiry-based general chemistry laboratory course, Chem. Educ. Res. Pract., 16(4), 824–842 10.1039/C5RP00104H.
  45. Windle J. A., (2015), Making Sense of School Choice: Politics, Policies, and Practice Under Conditions of Cultural Diversity, Palgrave Macmillan DOI:10.1057/9781137483539.

Footnote

Electronic supplementary information (ESI) available. See DOI: https://doi.org/10.1039/d5rp00129c
