Investigating the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate chemistry

Reyne Pullen *, Stuart C. Thickett * and Alex C. Bissember *
School of Physical Sciences – Chemistry, University of Tasmania, Hobart, Tasmania 7001, Australia. E-mail: Reyne.Pullen@utas.edu.au; Stuart.Thickett@utas.edu.au; Alex.Bissember@utas.edu.au

Received 15th December 2017, Accepted 12th March 2018

First published on 20th March 2018


Abstract

In chemistry curricula, both the role of the laboratory program and the method of assessment used are subject to scrutiny and debate. The ability to identify clearly defined competencies for the chemistry laboratory program is crucial, given the numerous other disciplines that rely on foundation-level chemistry knowledge and practical skills. In this report, we describe the design and implementation of a competency-based assessment model recently introduced into the first-year laboratory program at an Australian university, together with the results and feedback obtained. Previously, this laboratory program was assessed via a quantitative, criterion-referenced assessment model. At the core of this new model was a set of competency criteria relating to skills acquisition, chemical knowledge and application of principles, safety in the laboratory, and professionalism and teamwork. By design, these criteria were aligned with the learning outcomes of the course and the degree itself, as well as the requirements of local accrediting bodies. Qualitative and quantitative feedback from students (and staff) obtained before and after the implementation of this new model suggested that this approach provided an enhanced learning experience, enabling a greater focus on the acquisition of fundamental laboratory skills and techniques.


Introduction

Chemistry is an experimental science and, as a result, the laboratory occupies a central place in chemistry curricula in universities worldwide (Lagowski, 2000; DeMeo, 2001). The undergraduate laboratory serves a number of purposes, including the development of practical skills and techniques, in addition to reinforcing and extending theoretical concepts presented in the classroom (Hofstein and Lunetta, 1982, 2004; Hegarty-Hazel, 1990; Bennett and O’Neale, 1998; Moore, 2006). The role of the undergraduate laboratory and the nature of the skills developed in this core learning environment have long been the subject of scrutiny and discussion (Garnet and Garnet, 1995; Reid and Shah, 2007).

An ongoing concern regarding the training of students pertains to the lack of “competent” graduates in science, technology, engineering and mathematics (STEM) disciplines with the necessary skills for future employment (National Academy of Science, 1996; Baranyai and Prinsley, 2015; Sarkar et al., 2016). Reid and Shah (2007) propose several reasons for this phenomenon, including an ongoing reduction in laboratory hours (due to various pressures) and a mismatch between the skills acquired by modern graduates during their degree programs and the skills required by prospective employers. They suggest that the solution lies in identifying and incorporating innovative and alternative means for developing the core skills and competencies emphasised in laboratories. However, effecting meaningful change in the teaching laboratory is a challenging exercise, given the number of variables that can be investigated. To this end, Hofstein and Lunetta (2004) have outlined a range of features of laboratory programs that could be modified, such as modes of learning and teaching styles, student attitudes towards chemistry laboratory work, students’ perceptions of the laboratory learning environment, and the use of alternative modes of assessment.

Bretz (2012) provides an elegant breakdown of assessment in its current form, stating that it can be reduced to two major questions: “What do we want our students to know?” and “How will we know that they know it?” A number of reports have considered various aspects of these themes, including studies concerning the development of detailed rubrics for both general and experiment-specific purposes (McInerney and McInerney, 2010; Pullen, 2016), peer- and self-assessment tools (Wenzel, 2007; Seery et al., 2017), the use of concept maps (Ghani et al., 2017), and the use of video responses (Erdmann and March, 2014; Tierney et al., 2014). Before laboratory assessment can be considered on its own, the skills or competencies sought as outcomes must first be stated. Some approaches treat the laboratory as an opportunity for expansive learning, as demonstrated by Kirton et al. (2014), who developed Structured Chemistry Examinations (SChemEs) as a means of developing students’ basic techniques, numeracy, apparatus assembly and handling, interpretive exercises, and information management. Hagen and Gragson (2010) approached laboratory assessment through the lens of developing learning tasks to improve and refine technical writing skills. Another strategy is the use of “badging” or micro-credentialing to demonstrate the attainment of various skills or competencies both in and outside of the laboratory (Towns et al., 2015, 2016; Seery et al., 2017).

Assessment in the first-year undergraduate Chemistry laboratory program at the University of Tasmania, Australia, has traditionally featured a quantitative model employing a criterion-referenced assessment rubric. When this model was considered in the context of developing student competency and related generic skills, a growing disconnect became apparent between what was being assessed – the end product – and the skills developed during the laboratory program. More specifically, there was concern that the focus of the laboratory program had shifted from developing student competency to concentrating much more on quantitatively assessing it. Furthermore, our observations and interactions with undergraduates indicated that the existing form of assessment often led students to focus solely on obtaining a good grade, to the extent that the acquisition of fundamental skills became a much lower priority and the techniques and concepts presented were often overshadowed and/or quickly forgotten. Similar observations have been noted previously (Bailey and Garner, 2010; Black and Wiliam, 2010; DeKorver and Towns, 2015). We were concerned that the above-mentioned issues were hindering the development of skills required in second-year chemistry units (and in other non-chemistry units comprising the BSc or other degree programs offered at the University of Tasmania).

After considering a range of alternative assessment strategies, a competency-based assessment model was explored. This aligned with the use of criterion-based assessment for learning outcomes advocated by both Biggs and Tang (2011) and the Guidelines for Good Assessment Practice (University of Tasmania, 2011). The latter states, “Criteria should be clearly based on the learning outcomes in a unit outline.” We aimed to design a first-year chemistry laboratory program that provided an enhanced student experience, a greater focus on developing student competency, and improved alignment of the laboratory program with both the Australian Council of Deans of Science Teaching and Learning Centre's National Chemistry Threshold Learning Outcomes (TLOs) for undergraduate university-level chemistry (Australian Learning and Teaching Council, 2011) and unit-specific intended learning outcomes (ILOs). It was postulated that a competency-based assessment model might be best suited to addressing this challenge. Employing competency-based assessment in chemistry has been reported previously (Foukaridis and McFarlane, 1988).

This report describes the implementation and evaluation of a new competency-based assessment model introduced into the first-year Chemistry laboratory program at the University of Tasmania in 2017. Pre- and post-implementation data collection and analysis have enabled a comparison of the perceptions and experiences of first-year students who completed the laboratory program under either the traditional quantitative, criterion-referenced assessment model (2016) or the new assessment model (2017). More specifically, this work sought to investigate the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate chemistry.

This pilot study primarily aimed to answer the following research questions:

(1) Will a competency-centric assessment model enhance the development and consolidation of fundamental skills in the first-year chemistry laboratory in the short term?

(2) Can this assessment model shift student focus away from assessment and towards acquiring key laboratory techniques and reinforcing theory?

(3) Will the provision of a more transparent assessment model enable students to more clearly identify the intended learning outcomes?

Research methodology

Background

KRA114 – Chemistry 1B is one of the two first-year Chemistry units that constitute the prerequisites for students intending to major in Chemistry at the University of Tasmania (UTAS) and for those intending to proceed to any second-year UTAS chemistry unit. This unit builds on the year 12 chemistry syllabus offered to secondary school students and provides a foundation in four sub-disciplines of chemistry: organic, inorganic, physical, and analytical chemistry. Emphasis is placed on fostering an understanding of fundamental chemical concepts, developing problem-solving ability, and acquiring the skills necessary to carry out experiments safely and efficiently in the laboratory. The KRA114 intended learning outcomes (ILOs) are provided in Table 1.
Table 1 KRA114 intended learning outcomes
(1) Demonstrate an understanding of fundamental chemical principles and theories.
(2) Apply chemical principles to explain the chemical and physical properties of substances, their structure and the interactions that take place between them.
(3) Apply chemical concepts and theoretical principles to solve problems.
(4) Test and investigate theoretical chemical principles by successfully using fundamental experimental techniques in the laboratory.
(5) Demonstrate a capacity for self-directed learning, working responsibly and safely and recognise the critical role chemistry plays in society.


Each year, ∼260 students enrol in KRA114 at UTAS. Consistent with their respective degree requirements, students enrolled in Pharmacy, Medical Research, Biotechnology, and Agricultural Science degrees must successfully complete KRA114 in order to progress through their identified degree pathways. Allowing for minor annual variations in the percentages of first-year Chemistry students enrolled in different degree programs, the profile of the cohort is similar from year to year. Typically, only ∼20% of these first-year Chemistry students choose to enrol in second-year Chemistry units. Consequently, it is imperative that students who are not progressing to further chemistry study also learn and demonstrate competency in fundamental chemistry skills and techniques consistent with the requirements and expectations of other disciplines.

Throughout the laboratory program, students complete eight experiments on a variety of topics, each designed to facilitate the development of core skills or techniques and to reinforce key concepts. A laboratory session comprises up to 48 students overseen by three instructors (a ratio limit of 16 students per instructor), with support provided by a laboratory technician. Traditionally, for each experiment, students are awarded a summative grade based on two components. A pre-laboratory multiple-choice quiz (30 marks) is undertaken on arrival at the laboratory to measure a student's preparedness for the experiment. As such, the quiz contains a small number of questions targeting key techniques, calculations, and safety procedures that students will utilise during the experiment. The remainder of the summative grade (100 marks) is awarded by the allocated laboratory instructors, guided by a criterion-referenced assessment rubric. This component encapsulates pre-laboratory requirements and safety within the laboratory, use of correct techniques and calculations, and understanding of the key concepts and principles. The KRA114 assessment rubric employed in 2016 is included in Appendix 1 (ESI).

Development of a new competency-based qualitative laboratory assessment model

The overarching aim of this project was to test the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate Chemistry at UTAS. We were specifically interested in investigating the effect that a departure from quantitative assessment in the KRA114 laboratory program to a qualitative, competency-based assessment model would have on KRA114 learning outcomes and the student experience. Feedback from KRA114 students, laboratory instructors, technical staff involved with the unit, and UTAS Chemistry academic staff was used to inform and guide the development of a new competency-based assessment model underpinned by a criterion-referenced assessment (CRA) framework. By design, this more skills-focused assessment model was structured to enhance the first-year UTAS Chemistry laboratory program, to improve alignment between the KRA114 unit ILOs and the Australian Council of Deans of Science Teaching and Learning Centre's National Chemistry Threshold Learning Outcomes (TLOs) for undergraduate university-level chemistry (Australian Learning and Teaching Council, 2011), and to comply with the Royal Australian Chemical Institute's (RACI) degree accreditation requirements. The National Chemistry TLOs comprise four categories: understanding the culture of chemistry; inquiry, problem solving and critical thinking; communication; and personal and social responsibility. The expanded TLOs can be found in Appendix 2 (ESI).

A key consideration in the development of a new KRA114 laboratory assessment model was determining the fundamental competency-based criteria that would serve as the assessment framework. By design, assessment would be based not only on the results submitted at the end of each practical session, but also on student performance in the laboratory in areas including: preparation, demonstration of competency in particular skills and techniques, bookkeeping, experimental accuracy, adherence to safe work practices, understanding of fundamental chemical principles, teamwork, and professionalism. With this in mind, eleven competency-based assessment criteria were identified (Table 2). A full breakdown of each criterion is provided in Appendix 3 (ESI). Each experiment in the laboratory manual outlined the criteria under investigation and clearly stated the requirements for achieving the various skills-based competencies assessed. A representative 2017 KRA114 laboratory experiment is included in Appendix 4 (ESI).

Table 2 Links between the eleven KRA114 competency-based laboratory assessment criteria and KRA114 unit ILOs (see Appendix 3 (ESI) for more details) and TLOs for undergraduate university-level chemistry in Australia (see Appendix 2 (ESI) for more details)
KRA114 competency-based criteria | KRA114 ILOs | Chemistry TLOs
C1: Proficiency in Using Analytical Glassware | 4 | 2
C2: Proficiency in Using Chemical Glassware | 4 | 2
C3: Experimental Accuracy | 1, 2, 4 | 2, 3
C4: Recording Observations | 1, 2, 3, 4 | 1, 2, 3, 4
C5: Mastering Chemical Calculations and Equations | 1, 2, 4 | 1, 2, 3
C6: Understanding and Applying Chemical Principles | 1, 2, 3, 4, 5 | 1, 2, 3, 4
C7: Heating, Cooling and Isolating Materials | 1, 2, 4 | 2
C8: Safety Awareness in a Chemical Laboratory | 1, 4, 5 | 1, 4
C9: Efficiency and Time Management | 5 | 4
C10: Professionalism and Preparation | 5 | 4
C11: Collaboration and Teamwork | 1, 5 | 4


By design, the KRA114 competency-based laboratory assessment criteria were aligned with both the KRA114 unit ILOs and the chemistry TLOs. Biggs (2002) articulates the need for constructive alignment and its importance in developing a learning environment whose learning activities are designed with the ultimate aim of students meeting the ILOs and TLOs. Consequently, each criterion was aligned with at least one chemistry TLO and one unit-specific ILO. Through this alignment, it was intended that the ILOs would be more evident to students and therefore more achievable. Table 2 summarises the alignment of each criterion with both the KRA114 ILOs and the chemistry TLOs.
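For readers who prefer to see the alignment constraint stated operationally, the short Python sketch below is our illustration only (not part of the published model): it encodes the Table 2 mapping as a data structure and checks that every criterion links to at least one unit ILO and one chemistry TLO.

```python
# Illustrative sketch only (not from the study): the Table 2 alignment
# expressed as a data structure, with a check of the stated constraint
# that each criterion maps to at least one unit ILO and one chemistry TLO.
ALIGNMENT = {
    "C1: Proficiency in Using Analytical Glassware": ({4}, {2}),
    "C2: Proficiency in Using Chemical Glassware": ({4}, {2}),
    "C3: Experimental Accuracy": ({1, 2, 4}, {2, 3}),
    "C4: Recording Observations": ({1, 2, 3, 4}, {1, 2, 3, 4}),
    "C5: Mastering Chemical Calculations and Equations": ({1, 2, 4}, {1, 2, 3}),
    "C6: Understanding and Applying Chemical Principles": ({1, 2, 3, 4, 5}, {1, 2, 3, 4}),
    "C7: Heating, Cooling and Isolating Materials": ({1, 2, 4}, {2}),
    "C8: Safety Awareness in a Chemical Laboratory": ({1, 4, 5}, {1, 4}),
    "C9: Efficiency and Time Management": ({5}, {4}),
    "C10: Professionalism and Preparation": ({5}, {4}),
    "C11: Collaboration and Teamwork": ({1, 5}, {4}),
}

for criterion, (ilos, tlos) in ALIGNMENT.items():
    # Every criterion must be traceable to the unit ILOs and national TLOs.
    assert ilos and tlos, f"{criterion} lacks an ILO or TLO link"
print(f"All {len(ALIGNMENT)} criteria map to at least one ILO and one TLO.")
```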

In the new assessment structure, the laboratory program represented a hurdle requirement. Specifically, to be eligible to pass KRA114, students were required to attend and complete a minimum of seven of the eight laboratory classes and successfully demonstrate competency in at least ten of the eleven competency-based assessment criteria. Each of the eight experiments that comprise the laboratory program assesses a student's ability across multiple criteria. Each criterion was assessed at least twice during the semester, and students were required to demonstrate proficiency in criterion 8 (safety awareness in a chemical laboratory) at all times (Table 3); a minimal code sketch of this eligibility rule follows Table 3. It was anticipated that the development of student competency could be achieved via several approaches. At the outset of each laboratory session, the instructor would lead a brief discussion to provide an overview of the experiment and expectations. This would feature detailed instructions for any safety procedures or key experiment-specific techniques, including a physical demonstration where necessary. Throughout the laboratory session, each instructor was expected to actively engage with individuals and groups to provide formative feedback on their progress. The instructor would lead group discussions where appropriate to ensure consistency of method development. Furthermore, the instructor would engage with each student individually throughout the laboratory session to discuss their performance and to identify areas for growth or improvement. This would include providing written feedback within the laboratory manual in response to the student's recorded data and conclusions.

Table 3 Links between the eight KRA114 laboratory experiments and KRA114 competency-based laboratory assessment criteria
KRA114 laboratory experiment | Competency-based assessment criteria (C1–C11)
(1) The Determination of the Kf of the Iron Thiocyanate Complex by a Spectrometric Method
(2) Ionic Equilibria: Acids, Bases and Buffers
(3) Solvent Extraction
(4) Synthesis of Aspirin
(5) The Hydrolysis of Methyl Salicylate
(6) Preparation of Potassium Trisoxalatoferrate(III)
(7) Coordination Complexes: The Study of Metal Ions in Solution
(8) The Determination of the Rate of Hydrolysis of Crystal Violet
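
As flagged above, the hurdle rule can be summarised programmatically. The following Python sketch is our illustration only; the function and argument names are hypothetical and are not drawn from any UTAS system.

```python
# Hypothetical sketch of the KRA114 laboratory hurdle rule described in
# the text; all names are ours, not UTAS code.

def meets_laboratory_hurdle(labs_completed, criteria_achieved, safety_met_every_session):
    """Return True if a student is eligible to pass the laboratory component.

    labs_completed:           number of the 8 experiments attended and completed
    criteria_achieved:        set of criteria numbers (1-11) demonstrated
    safety_met_every_session: True if C8 (safety awareness) was shown at all times
    """
    return (labs_completed >= 7                # at least 7 of 8 classes
            and len(criteria_achieved) >= 10   # at least 10 of 11 criteria
            and safety_met_every_session)      # C8 demonstrated throughout

# Example: one laboratory class missed, every criterion met except C9.
print(meets_laboratory_hurdle(7, {1, 2, 3, 4, 5, 6, 7, 8, 10, 11}, True))  # True
```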


Participants

In 2016 and 2017, 257 and 235 students completed the KRA114 laboratory program, respectively. Each year, at the end of semester, a paper-based questionnaire was used to obtain student feedback on the KRA114 laboratory program (Table 4), in addition to the general UTAS online unit evaluation survey. It was not compulsory for students to provide feedback via these two mechanisms, and all feedback was provided anonymously. The questionnaire was designed primarily as a tool to aid the evaluation of this new assessment model. The paper-based questionnaire employed a mixed-methods approach, incorporating both Likert response items (quantitative) and two free-text response items (qualitative). Anecdotal feedback was also obtained from KRA114 laboratory instructors, technical staff involved with the unit, and UTAS Chemistry academic staff in 2016 and 2017, along with unsolicited student and staff feedback during this period. Prior to the commencement of this study, ethics approval for this research was granted by the UTAS Social Sciences Human Research Ethics Committee (Ethics Reference Number: H0016079).
Table 4 KRA114 questionnaire used to obtain KRA114 student feedback in 2016 and 2017
Question | Response options
Q1. I found the laboratory component to be interesting? | Strongly Disagree; Disagree; Neutral; Agree; Strongly Agree
Q2. It was clear to me what I was supposed to learn from completing the laboratory component? | (as Q1)
Q3. Based on the feedback that I have been provided during the laboratory component, I feel that I understand the experiments and have developed my laboratory skills during the semester? | (as Q1)
Q4. It was clear to me how the laboratory experiments would be assessed? | (as Q1)
Q5. The way in which the laboratory component is assessed helps me learn and improve my understanding of chemistry? | (as Q1)
Q6. I enjoyed the chemistry laboratory component this semester? | (as Q1)
Q7. Overall, as a learning experience, I would rate the laboratory component as… | Very Poor; Poor; Average; Good; Excellent
Q8. What aspects of the laboratory did you find most enjoyable and interesting? | Free-text Response
Q9. What skills have you acquired or improved upon completing the chemistry laboratory component? | Free-text Response


Prior to the implementation of this new laboratory assessment structure, the academic staff responsible for the KRA114 laboratory program focused on mentoring and training the laboratory instructors. These instructors were either PhD students or postdoctoral research fellows working in UTAS Chemistry. Importantly, the majority had prior experience as KRA114 laboratory instructors and were thus familiar with all eight KRA114 experiments. A training and mentoring program was developed to ensure that the laboratory instructors understood and could clearly communicate the features and nuances of this new assessment model to students, ensuring that they were suitably equipped to lead and facilitate laboratory classes.

Results and discussion

Outcomes of competency-based, qualitative laboratory assessment model

The results were collected in two forms: Likert-style responses to a series of student perception questions, and two free-text questions inviting students to elaborate on their laboratory experience. Fig. 1 provides a comparison of student feedback from those who completed the KRA114 laboratory program under the existing quantitative assessment model (2016) and those who undertook it under the revised qualitative assessment model (2017). For simplicity, the five-point Likert scale responses have been condensed into three groups: negative (Strongly Disagree, Disagree; red), neutral (grey), and positive (Strongly Agree, Agree; green). Inspection of the five-point breakdown of the results indicated that the majority of positive and negative responses lay within the more conservative categories (Agree and Disagree, respectively). The uncondensed data can be found in Appendix 5 (ESI). In 2016, 182 students (∼71% of the cohort) and, in 2017, 121 students (∼51%) provided feedback via the aforementioned questionnaire. No specific reasons have been identified for the ∼20 percentage-point decrease in response rate, as the same approach to disseminating these surveys was followed in both years.
Fig. 1 Respective 2016 and 2017 KRA114 questionnaire responses to questions 1–7 (see Table 4 for questions). These data are based on responses from 182 KRA114 students (2016; top row) and 121 KRA114 students (2017; bottom row). The responses have been divided into negative (Strongly Disagree/Disagree; red), neutral (grey), and positive (Agree/Strongly Agree; green) for questions 1–6, and into negative (Very Poor/Poor; red), neutral (Average; grey), and positive (Good/Excellent; green) for question 7. All feedback was provided voluntarily and anonymously.
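
The condensing of the five-point scale into three bins, as described above, amounts to a simple relabelling and tally. A minimal Python sketch (standard library only; the sample responses are invented for illustration, not study data) is shown below.

```python
# Collapse five-point Likert responses into the three bins used in Fig. 1.
# The sample responses below are invented placeholders, not study data.
CONDENSE = {
    "Strongly Disagree": "negative", "Disagree": "negative",
    "Neutral": "neutral",
    "Agree": "positive", "Strongly Agree": "positive",
}

def condense(responses):
    """Return the percentage of responses falling in each condensed bin."""
    bins = {"negative": 0, "neutral": 0, "positive": 0}
    for response in responses:
        bins[CONDENSE[response]] += 1
    total = sum(bins.values())
    return {bin_: 100 * count / total for bin_, count in bins.items()}

sample = ["Agree", "Agree", "Neutral", "Strongly Agree", "Disagree"]
print(condense(sample))  # {'negative': 20.0, 'neutral': 20.0, 'positive': 60.0}
```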

Although no formal analysis was undertaken for these responses, these data generally indicated that the implementation of the new assessment model did not negatively affect or compromise the student experience (Table 4 and Fig. 1). More specifically, these data suggested that across six of the seven questions posed, there was a noticeable improvement in the student experience (on the order of ∼5–15% in these cases). However, responses to question 3 relating to feedback in the laboratory suggest that there was a slight decrease in student satisfaction as a result of the change (∼3%). The positive impact of the new (more transparent) competency-based assessment model on improving student satisfaction is reflected in the responses to questions 4 and 5. Furthermore, a greater percentage of students appear to have enjoyed themselves in the laboratory and described it as a better learning experience (Q1, 6 and 7).

The final two questions (Q8 and Q9, Table 4) of the questionnaire arguably offer greater and more detailed insight into students’ experiences and their understanding of the skills that the first-year laboratory program aimed to develop. The qualitative data were analysed using thematic analysis methodology (Miles and Huberman, 1994). Individual responses were grouped by theme and descriptive analyses were performed on these data. In this way, a broad range of themes was identified in each set of qualitative responses. The major themes (a minimum of 10 responses) are depicted in Fig. 2 and 3. A more detailed description of the themes identified is provided in Appendix 6 (ESI).


Fig. 2 2016 and 2017 KRA114 questionnaire responses to question 8 (see Table 4 for questions): common major themes analysis. A major theme is defined as having a minimum of 10 responses.

Fig. 3 2016 and 2017 KRA114 questionnaire responses to question 9 (see Table 4 for questions): common major themes analysis. A major theme is defined as having a minimum of 10 responses.
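
Given the definition above (a major theme attracts at least 10 responses), separating major from minor themes reduces to a frequency count over the coded responses. A minimal Python sketch follows; the coded labels are placeholders, not the study's dataset.

```python
from collections import Counter

MAJOR_THEME_THRESHOLD = 10  # minimum responses for a "major" theme (Fig. 2 and 3)

def split_themes(coded_responses):
    """Partition theme codes into major and minor themes by response count.

    coded_responses: one theme label per student comment, as assigned
    during thematic coding. Labels here are placeholders.
    """
    counts = Counter(coded_responses)
    major = {theme: n for theme, n in counts.items() if n >= MAJOR_THEME_THRESHOLD}
    minor = {theme: n for theme, n in counts.items() if n < MAJOR_THEME_THRESHOLD}
    return major, minor

# Invented example: 12 comments coded one way, 3 coded another.
codes = ["lab skills/hands on"] * 12 + ["variety in the laboratories"] * 3
major, minor = split_themes(codes)
print(major)  # {'lab skills/hands on': 12}
print(minor)  # {'variety in the laboratories': 3}
```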

A number of common major themes were shared between the respective 2016 and 2017 question 8 (Q8) data (Fig. 2). Notably, in both years, the most commonly discussed themes were visual outcomes, lab skills/hands on, instructor, experiment-specific themes, and links to the lecture. There was, however, a relative change in the prevalence of these major themes from 2016 to 2017: for example, “experiment specific” increased greatly (from ∼10% to ∼25%), while “visual outcomes” decreased (from ∼20% to ∼10%). More significant differences were observed in the minor themes.

Although the manner in which question 8 was posed has a positive bias, this was deliberate: we wanted to ensure that students commented on elements of the laboratory program that they enjoyed and found interesting (Table 4). In the past, student feedback was (understandably) directed towards the negative aspects of the laboratory program. Consequently, staff had a particularly limited understanding of what was working well and which features were positively received by the student cohort.

The minor themes for the 2016 Q8 data included (in order of number of responses, greatest to least): variety in the laboratories, inquiry/exploration, getting the correct answer, extending beyond curriculum, pre-laboratory quizzes, and information provided. The minor themes for the 2017 Q8 data included (in order of number of responses, greatest to least): working with others, understanding the content, preferred normal grading model, variety in the laboratories, relaxed and reduced-stress environment, new competency assessment model, overall enjoyment, real-world laboratory experiments, the range of equipment used, continuity throughout the laboratory course, safe to make mistakes, and student ownership. While many of these themes were identified in both 2016 and 2017, some were unique to the new competency assessment model, which suggested a change in the perceived benefits of the laboratory course. Specifically, these were the minor themes relating to a relaxed and reduced-stress environment, continuity throughout the laboratory course, safe to make mistakes, and student ownership.

A number of common major themes were also shared between the respective 2016 and 2017 question 9 (Q9) data. Notably, in both years, the most commonly discussed theme was the use of techniques and/or equipment (Fig. 3). However, although techniques and equipment were the most frequently identified skills acquired, the 2016 Q9 responses offered little specificity. In contrast, the 2017 Q9 data illustrated that students were much better able to identify and name the specific techniques and experience that they had acquired (Fig. 3), including skills such as recrystallization, titration, distillation, filtration, and spectroscopic methods. Furthermore, safety emerged as a major theme in 2017, having been only a minor theme in 2016, and was often discussed with reference to specific techniques and practices.

The minor themes for the 2017 Q9 data varied greatly, with a high number of one-off responses. Nevertheless, some notable inclusions among these minor themes were: time management and efficiency, accuracy and precision, planning and organization, and communication. In a recent study, Sarkar et al. (2016) identified 20 areas of knowledge and skills as important for employability. Six of these were recognised in student responses from the 2017 cohort of the first-year unit KRA114: content knowledge, application of knowledge and skills, research skills, mathematical skills, team working skills, and time management and organisational skills.

These snapshots paint a picture of an environment in which the focus shifted from achieving the highest possible grades under high pressure to one where skills taught in one experiment are recognised and utilised at later stages of the laboratory course, with reduced stress and pressure. Moreover, the ability of students to learn concepts and techniques is no longer constrained by the strictures of quantitative assessment, which can be inconsistent and, in students’ words, “stressful” when applied in an “intense” 3 hour period. This is borne out when comparing the combined responses from the laboratory questionnaire with the end-of-unit responses of the respective 2016 and 2017 KRA114 student feedback (Table 5).

Table 5 Comparison of all specific KRA114 feedback from individual students regarding laboratory experiences: quantitative laboratory assessment (2016); competency-based laboratory assessment (2017). All feedback was provided voluntarily and anonymously
2016 KRA114 Student Feedback Regarding Laboratory Experiences
2016 Positive Comments
– “I found the practicals really good – all the staff involved in the practical classes were extremely helpful and approachable and they were more than happy to help out with any questions about the content.”
– “The labs were very helpful for understanding some of the content.”
– “Practical classes were definitely the most helpful aspects, tying in lecture content to practical applications of knowledge.”
– “One of the highlights of my week was filling out the prelab, getting the lab quiz done and then enjoying the lab.”
– “Labs, although stressful in nature (no escaping that) were very well planned.”
2016 Negative Comments
– “The only issue I found was that in many of the lab sessions, the amount of lab work required (both practical and written) was often far too much. For this reason, I felt that it was quite difficult to learn what was intended to be learnt from the labs since most of my lab time was spent rushing to get everything done in time.”
– “The practical element is a very poor assessment, yes I agree that it should be part of the criteria but it is impossible by some weird marking scheme for anyone to get 100%. Regardless of how well you do the Prac you will only be told oh, the lab techs can do it better and thus you can't have all the marks. there is also great inconsistencies between the marking from each demonstrator.”
– “I found that the labs were more of a test than a learning exercise. I found the marking in the labs inconsistent. Last semester my marker was quite brutal compared to some of my friends who got similar marks to me in college and vice versa this semester.”
– “pracs can be made less stressful.”
– “The laboratories were very stressful with too much weight focused on the test, if you did poorly on the test it caused the rest of the lab to be stressful.”
– “Chemistry laboratories are quite intimidating, particularly for people with anxiety. I think they could be improved by making laboratory groups smaller, and taking more time to go through the practical prior to the experiment. Additionally, laboratories could be further improved by taking time to show us how to do calculations.”
– “The pre-lab 5 minute quizzes are highly stressful and not very good for accurately assessing what people understand. I can't even read the questions properly in 5 minutes, let alone do the maths and answer them.”
– “The laboratory sessions are quite stressful, especially the short quiz at the beginning, I found it unfair that the short quiz at the beginning of the lab would count towards so much of your lab mark.”

2017 KRA114 Student Feedback Regarding Laboratory Experiences
2017 Positive Comments
– “The order of the labs allowed me to carry on and practice skills I had learned in the laboratory environment.”
– “I could actively observe and apply the knowledge acquired in lectures to understand what was occurring in the experiments.”
– “I enjoyed not having major assessments so it was relaxed and I could get familiar with the lab.”
– “I enjoy the fact that the practicals are not counted in the overall marks of the semester. I feel less pressure and at the same time enjoy and learn from my mistakes.”
– “The experiments are not assessed, therefore I’m not as stressed when doing the experiment.”
– “I liked that the 2nd semester labs built on our knowledge from semester 1 labs.”
– “Very relaxed environment – stress free.”
– “Not really serious – more relaxed and it's okay to make mistakes.”
– “Practice of techniques without pressure of major assessment.”
– “The criterion based assessment for the laboratories is certainly appreciated (rather than having percentage grades).”
– “I found the lab expectations of this course to be very beneficial and has made me feel more comfortable in the labs going into the future of my degree.”
– “The labs where great.”
– “Good competency based assessment of the laboratory component, which was interesting and helped understanding by applying concepts in a tactile way.”
– “The labs are related to the lecture content and all the marking and assessment is fair and not subjective.”
2017 Negative Comments
– “Some of the experiments had very vague instructions and were hard to understand and follow.”
– “Maybe labs ought to contribute to your mark like they did prior to this year.”
– “I found all the labs interesting, however, I would rather that the labs were graded as they have been in previous years.”
– “I did not feel as though the practicals enhanced my knowledge of lecture content. I would have much preferred a review session instead.”
– “Relevant questions in pre labs that are examinable.”


Student perceptions, reflected in the end-of-unit responses, changed considerably from 2016 to 2017. The 2016 data indicated that students found the laboratory useful for developing an understanding of the chemistry content and linking this to lecture materials. Analysis of the negative comments suggested that many students found the laboratory to be a high-stress environment and regarded quantitative assessment as detrimental to the experience. Comments also indicated that some students were frustrated by the need to rush to complete the assessable content, which they identified as having a negative impact on their learning. In contrast, the 2017 data suggest that, although some student feedback indicated a preference to return to the traditional assessment model, a much higher number of responses praised the relaxed nature of the laboratory environment under the new assessment paradigm. Furthermore, students identified an increased focus on skills and the ongoing development and refinement of those skills and techniques obtained through the laboratory course. This does not mean that improvements are not required; several responses from the 2017 cohort indicated that greater clarity in the instructions and the design of relevant pre-laboratory questions could enhance the laboratory experience.

The observations of the KRA114 laboratory instructors lend additional weight to the responses received from the student cohort. Without exception, all instructors, including technical staff present in the laboratory, provided anecdotal feedback to the authors noting an improved “feeling” within the laboratory: students appeared at ease with the experience and showed a genuine interest in extending or linking the skills and techniques to other experiences. Representative feedback from laboratory instructors is provided in Table 6. These comments indicated that, as a result of this new approach, laboratory instructors were able to concentrate their efforts more effectively on developing student competency rather than being preoccupied with quantitatively assessing it. Notably, instructors were concerned that some students were reducing their efforts to excel within the laboratory. However, when placed in the context of the entire unit, numerous opportunities remain for diligent students to distinguish themselves from their peers (e.g. quizzes, tests, final exam).

Table 6 Representative Feedback from KRA114 laboratory instructors regarding the changes to KRA114 laboratory assessment
Representative Feedback from KRA114 Laboratory Instructors Regarding New Laboratory Experience
Positive Comments
– “The criteria are simple and clear. If a student has failed to meet a certain criterion it is very clear for them why, and what they must do to remedy that in the future. This also seems to have alleviated a great deal of stress attached to the numbers in marking. Despite explaining to students multiple times that each lab represents an incredibly tiny fraction of their overall mark, so they needn’t split hairs, students would often get visibly distressed between earning an 83 or an 85. Have not heard the lament of “that's not what my partner got” since this shift has been made.”
Negative Comments
– “While the majority have come to lab classes keen to learn, some of the less inspired students have adopted an attitude of passing is a given and that the criteria are seen as a minimum required therefore the maximum effort necessary.”
– “The criteria do no set the bar particularly high and as such there is little separating of the wheat from the chaff and whether or not that is the intent of the marking in the labs, it is disheartening to some students to have little reward in way of excelling among their peers. There is little incentive to do anything other than reach the minimum and some students take advantage.”
Overall Comments
– “There are a few small issues with certain students, but that will likely always be the case. I think it is a fantastic change. Moving to skills-based model has definitely made the laboratories a more jovial and less stressful environment, which I believe is more conducive to students learning. The relationship between instructor and student (in my opinion) has undergone a shift from assessor and nervous assesse to that of an educator interacting students in a group learning session…”


Summary

As part of this study, data have been collected and analysed in order to investigate the three core research questions stated above. The student response data suggest that there were no significant negative changes and that, as a result of moving to a competency-based assessment model, there has been a shift in students' perceptions of the skills obtained through the laboratory course.

With respect to question 1, the student response data and anecdotal evidence indicate that students are both identifying and acquiring these skills and techniques in the short term. Future studies will focus on determining the longer-term impact on student performance in second- and third-year chemistry units. Regarding question 2, there was a definite shift towards students recognising the specific skills and techniques acquired, coupled with the recognition that a more relaxed assessment model permits this focus. This in turn feeds into question 3: students are better able to identify the skills and techniques they have acquired, and their capacity to link these skills across multiple experiments demonstrates an improved understanding of the unit ILOs. Importantly, students found the assessment model more transparent.

Conclusions

In conclusion, this study aimed to address a number of issues concerning the skills and practices being developed (or not developed) by first-year chemistry students in the laboratory. The data collected and analysed provide insight into the success of implementing a competency-based assessment model and illustrate a shift in the perceptions of students towards recognising these skills and techniques. In the future, we will continue to monitor and fine-tune this assessment model, and the longer-term effects of this study will be recorded at second- and third-year levels to determine the longevity of these outcomes for continuing students.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

The authors gratefully acknowledge Dr Luke Hunter (University of New South Wales) for helpful preliminary discussions.

References

  1. Australian Learning and Teaching Council, (2011), Learning and Teaching Academic Standards Project (Science), retrieved 2015, from http://www.acds-tlcc.edu.au/wp-content/uploads/sites/14/2015/02/altc_standards_SCIENCE_240811_v3_final.pdf.
  2. Bailey R. and Garner M., (2010), Is the feedback in higher education assessment worth the paper it is written on? Teachers’ reflections on their practices, Teach. High. Educ., 15(2), 187–198.
  3. Baranyai and Prinsley, (2015), STEM skills in the workforce: what do employers want? Occasional Paper Series, Issue 9, Office of the Chief Scientist, Australian Government, Canberra.
  4. Bennett S. W. and O’Neale K., (1998), Skills development and practical work in chemistry, Univ. Chem. Educ., 2, 58–62.
  5. Biggs J. B., (2002), ‘Aligning the curriculum to promote good learning’, constructive alignment in action: Imaginative curriculum symposium, LTSN Generic Centre.
  6. Biggs J. B. and Tang C., (2011), Teaching for quality learning at university, Retrieved from http://www.eblib.com.
  7. Black D. P. and Wiliam D., (2010), “Kappan classic”: inside the black box: raising standards through classroom assessment, Phi Delta Kappan, 92(1), 81–90.
  8. Bretz S. L., (2012), Navigating the Landscape of Assessment, J. Chem. Educ., 89(6), 689–691.
  9. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92(12), 2031–2037.
  10. DeMeo S., (2001), Teaching chemical technique, a review of the literature, J. Chem. Educ., 78, 373–379.
  11. Erdmann M. A. and March J. L., (2014), Video reports as a novel alternative assessment in the undergraduate chemistry laboratory, Chem. Educ. Res. Pract., 15, 650–657.
  12. Foukaridis G. N. and McFarlane L. R., (1988), Competency-based training for chemists, J. Chem. Educ., 65(12), 1057–1059.
  13. Garnet P. J. and Garnet P. J., (1995), Refocusing the chemistry lab: a case for laboratory-based investigations, Aust. Sci. Teach. J., 41(2), 26–39.
  14. Ghani I. B. A., Ibrahim N. H., Yahaya N. A. and Surif J., (2017), Enhancing students' HOTS in laboratory educational activity by using concept map as an alternative assessment tool, Chem. Educ. Res. Pract., 18, 849–874.
  15. Hagen J. P. and Gragson D. E., (2010), Developing technical writing skills in the physical chemistry laboratory: a progressive approach employing peer review, J. Chem. Educ., 87(1), 62–65.
  16. Hegarty-Hazel E., (1990), The student laboratory and the science curriculum, London: Routledge.
  17. Hofstein A. and Lunetta V. N., (1982), The laboratory in science teaching: neglected aspects of research, Rev. Educ. Res., 52, 201–217.
  18. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundation for the 21st century, Sci. Educ., 88, 28–54.
  19. Kirton S. B., Al-Ahmad A. and Fergus S., (2014), Using Structured Chemistry Examinations (SChemEs) as an assessment method to improve undergraduate students' generic, practical and laboratory-based skills, J. Chem. Educ., 91, 648–654.
  20. Lagowski J. J., (2000), Lessons for the 21st century: 1999 James Flack Norris Award, sponsored by ACS Northeast Section, J. Chem. Educ., 77, 818–823.
  21. McInerney D. M. and McInerney V., (2010), Educational psychology constructing learning, 5th edn, Frenchs Forest, NSW: Pearson Australia.
  22. Miles M. B. and Huberman A. M., (1994), Qualitative data analysis: an expanded sourcebook, 2nd edn, Thousand Oaks, CA: Sage.
  23. Moore J. W., (2006), Let's go for an A in lab, J. Chem. Educ., 83(4), 519.
  24. National Academy of Science, (1996), From analysis to action: Undergraduate education in science, mathematics, engineering and technology, Washington, DC: The National Academy Press, pp. 1–10.
  25. Pullen R., (2016), An evaluation and redevelopment of current laboratory practices: an in-depth study into the differences between learning and teaching styles, PhD thesis, Retrieved from UTAS Open Repository at http://eprints.utas.edu.au/23475/.
  26. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185.
  27. Sarkar M., Overton T., Thompson C. and Rayner G., (2016), Graduate employability: views of recent science graduates and employers, Int. J. Innov. Sci. Math. Educ., 24(3), 31–48.
  28. Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O’Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges, Chem. Educ. Res. Pract., 18, 403–419.
  29. Tierney J., Bodek M., Fredricks S., Dudkin E. and Kistler K., (2014), Using web-based video as an assessment tool for student performance in organic chemistry, J. Chem. Educ., 91(7), 982–986.
  30. Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O’Shea K., (2015), The digital pipetting badge: a method to improve student hands-on laboratory skills, J. Chem. Educ., 92(12), 2038–2044.
  31. Towns M., Hensiek S., DeKorver B. K., Harwood C. J., Fish J. and O’Shea K., (2016), Improving and assessing student hands-on laboratory skills through digital badging, J. Chem. Educ., 93(11), 1847–1854.
  32. University of Tasmania, (2011), Guidelines for good assessment practice, Rev. edn, Hobart, Australia, Retrieved from http://www.teaching-learning.utas.edu.au/_data/assets/pdf_file/0004/158674/GAG_v16_webversion.pdf.
  33. Wenzel T. J., (2007), Evaluation tools to guide students’ peer-assessment and self-assessment in group activities for the lab and classroom, J. Chem. Educ., 84(1), 182–186.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/c7rp00249a

This journal is © The Royal Society of Chemistry 2018