Inquiry and industry inspired laboratories: the impact on students’ perceptions of skill development and engagements

Stephen R. George-Williams *, Jue T. Soo , Angela L. Ziebell , Christopher D. Thompson and Tina L. Overton
School of Chemistry, Monash University, Victoria, 3800, Australia. E-mail: Stephen.george@monash.edu

Received 29th November 2017 , Accepted 1st March 2018

First published on 2nd March 2018


Abstract

Many examples exist in the chemical education literature of individual experiments, whole courses or even entire year levels that have been completely renewed under the tenets of context-based, inquiry-based or problem-based learning. The benefits of these changes are well documented and include higher student engagement, broader skill development and better perceived preparation for the workforce. However, no examples appear to have been reported in which an entire school's teaching laboratory programme has been significantly redesigned with these concepts in mind. Transforming Laboratory Learning (TLL) is a programme at Monash University that sought to incorporate industry inspired context-based, inquiry-based and problem-based learning into all the laboratory components of the School of Chemistry. One of the ways in which the effect of the programme was evaluated was through the use of an exit survey delivered to students at the completion of seven experiments that existed before the TLL programme as well as seven that were generated directly by the TLL programme. The survey consisted of 27 closed questions alongside three open questions. Overall, students found the new experiments more challenging but recognised that they were more contextualised and that they allowed them to make decisions. The students noted the lack of detailed guidance in the new laboratory manuals but raised the challenge, context and opportunity to undertake experimental design as reasons for enjoying the new experiments. Students' perceptions of their skill development shifted to reflect skills associated with experimental design when undertaking the more investigation driven experiments. These results are consistent with the wider literature and indicate the potential large-scale success of the TLL programme in developing graduates who are better prepared for the modern workforce.


Introduction

Transforming Laboratory Learning (TLL) is a programme at Monash University that was designed to significantly modernise the entire teaching laboratory programme in the School of Chemistry. This programme included 17 chemistry courses – four in first year, five in second year and eight in the third year, with the most significant changes focused towards the second and third years of the programme. Monash University is a large Australian university and the School of Chemistry has over 2000 enrolled students.

Several studies have suggested that chemistry graduates lack (or are unable to articulate) many transferable skills that are desired by employers, such as time management, independent learning and team-working (Hanson and Overton, 2010; Sarkar et al., 2016). Even students who continue into research positions have been found to be lacking an appreciation of scientific methodology or experimental design as ‘virtually no attention is given to the planning of the investigation or to interpreting the results’ (Domin, 1999).

The skills agenda has gained prominence within Australia and is well exemplified by the 2016 governmental report (Norton and Cakitaki, 2016) which found that many undergraduates struggled to find work within four months of graduation, with science graduates faring less well than arts graduates. Monash University offers many internal programmes designed to enhance the employability of undergraduate students either through attempts to broaden skill development or work placements. However, until recently these have been largely extracurricular.

The TLL programme sought to enable undergraduate students to develop the skills they needed to obtain employment through a redesigned laboratory programme. In common with many other institutions, the original Monash University laboratory programme relied heavily on traditional expository (or recipe-based) experiments. These types of laboratory activities (i.e. heavily prescriptive ones) are generally utilised because they consume minimal resources, whether these be time, space, equipment or personnel (Lagowski, 1990). Whilst resource consumption is a major consideration, it has been noted that students will often proceed through these experiences with little to no thought about the reasoning behind the procedures (Woolfolk, 2005). Furthermore, these experiences achieve little in the way of developing a wider range of transferable skills and usually lack any real-world context (Johnstone and Al-Shuaili, 2001; McDonnell et al., 2007). To address this, a variety of different delivery methods have been attempted.

Industry inspired context and assessment

The use of industrial context in the design of new laboratory programmes has been achieved in many modern examples (Bingham et al., 2015; Pilcher et al., 2015; Erhart et al., 2016). Typically, either the issues faced in an industrial problem or the actual methodology used in industry is used to guide the experimental design. Students generally respond positively to the experiences and respond in a manner consistent with context-based learning, i.e. they become more engaged with the material and tend to achieve higher levels of learning (Pringle and Henderleiter, 1999).

It should be noted that these same outcomes, i.e. high engagement with real industry inspired examples, could be achieved through the use of industrial placements or work-integrated learning (Cooper et al., 2010). However, when cohort sizes are very large, industrial placements are simply not a practical means to achieve this contextualisation. Hence, the inclusion of some work-integrated learning into the undergraduate teaching laboratories may bridge this gap.

Increased inquiry and student control

The use of inquiry-based learning is another common experimental design used to address the issues associated with expository experiments (Domin, 1999; Cummins et al., 2004). In these cases, students are given a greater amount of freedom to either discover results for themselves (as opposed to simply confirming theory) or to choose how to undertake a given investigation (i.e. what methods to use) (Domin, 1999). These experiences are known to diversify the skills developed by students, particularly developing a greater range of transferable skills and a deeper understanding of scientific methodology (Weaver et al., 2008). These experiences do however require significant scaffolding to support the students (Bruck and Towns, 2009). In fact, it has been noted that:

‘Inquiry lab students valued more authentic science exposure but acknowledged that experiencing the complexity and frustrations faced by practicing scientists was challenging, and may explain the widespread reported student resistance to inquiry curricula’ (Gormally et al., 2009).

Solving real-world problems

Problem-based learning can be described as a composite of inquiry-based learning and context-based learning, wherein students are given control to investigate a scenario that has been contextualised to the real-world before all appropriate content has been delivered to them (Duch et al., 2001). As such, the benefits of both inquiry-based learning and context-based learning are achieved, with students reporting high levels of engagement whilst developing a wide range of scientific and transferable skills (Ram, 1999). The specific development of problem solving skills and strategies was shown in the work of Sandi-Urena et al. (2012) who used both quantitative and qualitative means to indicate a notable increase in these skills, even without explicit instruction. There are a large number of examples of problem-based learning in chemistry education in the literature which include individual experiments (Chuck, 2011; Mc Ilrath et al., 2012; Dopico et al., 2014), whole courses (Jansson et al., 2015; Pilcher et al., 2015) or even entire year levels (Kelly and Finlayson, 2007). However, there are no apparent examples of an entire school wide laboratory programme being reformed.

Staff involvement and training

It is also worth noting that throughout the TLL program, both Teaching Associates and technical staff were routinely involved in the generation of the new experiments. This approach was used in an attempt to ensure buy-in from all teaching and technical staff. Teaching Associate notes and guidelines were also generated throughout the program in an attempt to ensure the legacy of the TLL program.

Measuring the effect of TLL

The focus of this study was to identify a means by which to monitor and measure the overall effect of the TLL programme on a very large number of undergraduate students. Considering the numerous changes being implemented, it was decided that a survey would be an appropriate means to probe the effect on such a large cohort.

Through this survey, the effect of the TLL programme on a range of areas could be monitored, in particular whether the programme better prepared students for the workforce by creating a more engaging, industry-focused laboratory programme that allowed them to develop a wider range of transferable skills. The survey was designed to measure:

(1) The level of inquiry and contextualisation of the experiments, as noted by the students.

(2) The reported underlying motivations of the students.

(3) The overall perception of learning.

(4) Student perception of the skills developed in a given experiment.

(5) Student identified issues associated with the experiments.

(6) The level of (and reason for) the student enjoyment of the experiments.

Method

The aim of this study was to compare students’ perceptions of a range of teaching laboratories that either existed prior to the TLL programme or were produced as a result of the TLL programme. Industrial partners were sought to consult on the newly implemented experiments. In many cases, the learning material was branded with their respective logos and the instructions were often written as though the industrial partners themselves were directing the students (e.g. ‘Rationale, a Melbourne based skin care company, has asked Monash Consulting to investigate the use of a new range of compounds to be used as active ingredients in sunscreen’). Assessment was made more varied and was often tailored to this new context. For example, students were asked to provide an executive summary to the company. This is an example of authentic assessment, wherein the assigned task matches real world procedures (Wiggins, 1990), and was used to further embed the experiments in the real world.

Alongside the inclusion of industrially relevant experiments, an increase in the level of inquiry was achieved by removing excessive guidance in the laboratory manual (which was replaced by prompting questions or multi-directional flowcharts), obscuring the experimental outcome that the students might achieve or through simply instructing the students to devise their own means by which to complete the experiment. All experiments generated from the programme were designed with the principles of problem-based learning in mind, either by providing a context or through ensuring at least a minimal level of inquiry.

Survey development

There are many surveys in the chemical education literature designed to measure students’ attitudes, self-efficacy and overall learning in undergraduate chemistry courses (Bauer, 2005, 2008; Grove and Bretz, 2007; Cooper and Sandi-Urena, 2009). However, there are few that focus specifically on the undergraduate laboratory experience. Those that do, such as the MLLI (Meaningful Learning in the Laboratory Instrument) survey (Galloway and Bretz, 2015), focus on the students’ expectations, thoughts and feelings rather than specific skills being developed. The MLLI survey also contains questions relating to perceived student control or the contextualisation of the lab, but in a non-direct fashion (e.g. ‘I expect to learn chemistry that will be useful in my life’ or ‘I expect to make decisions about what data to collect’).

A survey more suited to our purpose was found in the work of Russel (2008), which contained closed questions designed to monitor the effect of the inclusion of more research-based experiments in their undergraduate teaching laboratories. Many questions overlapped with the aims of TLL, especially those around context, the level of guidance within the lab manual or the underlying motivation or engagement of the students. This survey had been validated and was considered an ideal starting point for the final survey used in this study.

The Russel (2008) survey was modified over several iterations. The formatting was changed to be consistent with other studies being undertaken and any mention of ‘chemistry course’ was changed to ‘lab’. Three questions were removed as they were considered unlikely to be altered by a single laboratory experience (e.g. ‘This lab experience has made me more interested in earning a Doctoral degree (PhD) or Master degree in a science field’). Eight items were altered to their negative version (e.g. ‘This lab experience made me learn’ became ‘This lab experience did not make me learn’) to avoid students agreeing to every item, i.e. to avoid acquiescence bias (Watson, 1992). Six new closed questions were also added to capture the students’ perceptions of the ease, challenge, contextualisation, openness or level of interest in the experiments. Finally, three open questions were added to further probe the students’ perceptions. These questions were ‘What skills did you develop throughout today's experience?’, ‘Was there anything that could be improved about today's lab?’ and ‘Did you enjoy today's lab? Why/why not?’ The final version of the survey consisted of 27 closed questions, one distractor question (‘Please select agree and disagree to this question’) and three open questions.

Data collection

The survey was administered to students either in the teaching laboratory at the completion of an experiment or immediately after their laboratory session during a free lunch. In many cases, the same students filled out multiple surveys for multiple experiments. The seven experiments selected for the ‘Pre-TLL’ sample were chosen to represent traditional expository experiments, as opposed to any that already utilised inquiry-based learning. These seven experiments were chosen predominantly due to convenience and to avoid overlap with other research programs (i.e. to avoid students filling out too many surveys in a given course). The seven ‘Post-TLL’ experiments were the first new experiments to be generated by the TLL programme. The surveys were disseminated in 2016 and 2017 – the first two years of the TLL programme. Consequently, students enrolled in chemistry courses in 2016 and 2017 could have seen the survey multiple times if they happened to be enrolled in the chemistry courses of interest. No repeat measurements of any experiment were performed.

The number of students varied between the courses, from approximately 100 to 1200+. In many cases where the student cohort was large, only a subset of the students were surveyed. For example, for the analysis of a new experiment in first year that ran twice a day for five days of the week, students were invited to complete the survey at the completion of the experiment on Tuesday and Thursday morning. Hence, not all students were surveyed but the sample was large and representative of the cohort.

Research theoretical framework

Through the use of both quantitative and qualitative questions, this research adopts a concurrent mixed-methods approach, i.e. both types of data were collected simultaneously (Creswell, 2009). It is believed that the students responded to the questions in a manner consistent with Constructivism which ‘underlies the assumption that learning is an active process where knowledge is constructed based on personal experiences and the continual testing of hypotheses’ (Leal Filho and Pace, 2016). Hence, the experiences that students have just had will directly impact their responses to the questions. The students were neither discouraged nor encouraged to discuss the survey with one another, so occasional collusion cannot be discounted and may lead to a small proportion of group responses rather than individual. The laboratory teaching staff were unaware of the content of the survey and were therefore unlikely to bias the viewpoints of the students through in-class conversations. The open questions allowed the students to respond in a less directed manner ensuring a greater depth to their responses.

Data analysis

The total number of responses is shown in Table 1, and completion rates ranged from 38% to 76% of the cohort surveyed. Further details regarding the individual experiments can be found in Table 2. The quantitative data was transcribed into Excel after recoding (e.g. Strongly Disagree = 1, Disagree = 2, Neutral = 3, Agree = 4 and Strongly Agree = 5) and then analysed (as frequencies, not averages) with the SPSS data analysis software. The responses to the open questions were transcribed verbatim into Excel, where they were also analysed. Overall, 525 responses were collected about experiments that existed prior to the TLL programme and 609 about those generated by the TLL programme.
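As a minimal sketch of the recoding and frequency tabulation just described (the responses below are hypothetical, and `Counter` stands in for the Excel/SPSS workflow):

```python
from collections import Counter

# Recoding scheme described in the text.
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
          "Agree": 4, "Strongly Agree": 5}

# A handful of hypothetical responses to one closed question.
responses = ["Agree", "Neutral", "Agree", "Strongly Agree",
             "Disagree", "Agree"]

coded = [LIKERT[r] for r in responses]

# Analyse as frequencies, not averages: count how often each code appears.
frequencies = dict(sorted(Counter(coded).items()))
print(frequencies)  # {2: 1, 3: 1, 4: 3, 5: 1}
```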
Table 1 The number of responses to the surveys, also expressed as the percentage of students who had just completed the respective experiment
Experiment name Number of responses % of students
Pre TLL Rearrangement 51 49
EAS 40 38
Anthracene oxidation 91 76
Macrocycles 37 46
Isomerisation 49 61
Proteins (2016) 70 58
Panacetin 184 74
Post TLL Proteins (2017) 138 69
Sunscreen 51 49
Pseudenol 65 63
Electronic waste 160 59
Nylon 69 58
Food project 97 49
Enzymes 28 47


Table 2 A brief description of the 14 experiments surveyed
Experiment name Year level – course focus Method type Context (real life or industry) Scientific content Additional notes
Pre TLL Rearrangement 2nd – Inorganic and organic chemistry Expository None noted Carbocation rearrangement One four hour session. Historical methods used (e.g. hydrazone wet test). Students worked in pairs.
EAS 2nd – Inorganic and organic chemistry Expository None noted Electrophilic aromatic substitution One four hour session. Typically completed within 2 hours. Students worked in pairs.
Anthracene oxidation 3rd – Medicinal chemistry Expository Enzyme mimics Oxidation using vanadium catalysts One four hour session. Contains a long wait time of 2 hours. Underutilised context. Students worked in pairs.
Macrocycles 3rd – Advanced inorganic chemistry Expository (mimics literature) None noted Synthesis of macrocyclic cage complexes One four hour session. Required students to obtain method from literature sources. Students worked in pairs.
Isomerisation 3rd – Advanced inorganic chemistry Expository None noted Kinetics of ligand isomerisation One four hour session. Utilises kinetics/physical chemistry in student perceived synthetically focused course. Students worked in pairs.
Proteins (2016) 2nd – Food chemistry Expository Food proteins Protein detection and measurement One four hour session. Context limited by use of non-commercially available defatted soy protein. Students worked in groups of 4.
Panacetin 2nd – Inorganic and organic chemistry Expository Black market pharmaceuticals Solute/solvent partitioning One four hour session. Typically completed within 2 hours (out of the 4 assigned). Students worked in pairs.
Post TLL Proteins (2017) 2nd – Food chemistry Expository Food proteins Protein detection and measurement One four hour session. Commercially available milks used. Required students to obtain method from literature sources. Students worked in groups of 4.
Sunscreen 2nd – Inorganic and organic chemistry Expository (mimics literature) Sunscreens and UV-active materials Aldol condensation One four hour session. Traditional synthesis with students aiming to make the best sunscreen. Students worked in groups of 3.
Pseudenol (previously Panacetin) 2nd – Inorganic and organic chemistry Flowchart/student directed Black market pharmaceuticals Solute/solvent partitioning One four hour session. Non-stepwise method, prompting questions used. Context strengthened. Students worked in pairs.
Electronic waste 1st – Introductory chemistry Flowchart/student directed Metal wastes from electronic goods Transition metals, complexes and colour One three hour session. Non-stepwise method, students follow multi-directional flowchart. Students worked individually.
Nylon 3rd – Materials chemistry Inquiry/investigation Production of Nylon Step-growth polymerisation One four hour session. Very simple method. Students allowed to investigate anything available. Students worked individually.
Food project 2nd – Food chemistry Inquiry/investigation Nutritional components of foods Methods used for non-ideal food samples. 2 four hour sessions, 1 week apart. Methods utilised in prior experiments of same unit. Students worked in groups of 4.
Enzymes 2nd – Biological chemistry Expository then Inquiry/investigation Commercially available digestive supplements Enzyme degradation of complex sugars 2 four hour sessions, 1 week apart. First week traditional method followed by inquiry-based second week. Students worked in pairs.


For the quantitative analysis, the responses were combined into two major groups: the largely expository, unchanged experiments (Pre-TLL) and those generated or revised during the TLL programme (Post-TLL). Significant differences between student responses to each of the 27 closed questions were measured through the Wilcoxon Signed Rank test, which presumed dependence between students responding to both the Pre-TLL and Post-TLL surveys. For questions that showed a statistically significant difference at the 95% confidence level, the resulting Z values (obtained from SPSS) were divided by the square root of the respective sample size of respondents to generate a measure of the effect size (r). The cut-offs for the ‘size’ of the effects were determined through the work of Hattie (2008), which was later extended (Fritz et al., 2012; Lenhard and Lenhard, 2016) to r values. The ranges were defined by:

(1) 0 ≤ r ≤ 0.1. ‘Student effect size’. This refers to the natural variation in any group of students. For example, a more motivated student may respond more positively than a less motivated student.

(2) 0.1 < r ≤ 0.2. ‘Teacher effect size’. This refers to the effect of a particularly motivated teacher over the course of a single year (i.e. this effect size could be achieved given time/motivation).

(3) r > 0.2. ‘Zone of desired effect’. This refers to interventions that have an immediate impact and are where educators should typically focus their efforts.
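Assuming the Z values reported by SPSS, the effect-size calculation and the banding defined above can be sketched as follows (function names are illustrative):

```python
import math

def effect_size_r(z: float, n: int) -> float:
    """Effect size used in the study: r = Z / sqrt(N)."""
    return z / math.sqrt(n)

def hattie_band(r: float) -> str:
    """Classify |r| into the three ranges defined above."""
    r = abs(r)
    if r <= 0.1:
        return "student effect size"
    if r <= 0.2:
        return "teacher effect size"
    return "zone of desired effect"

# Worked example using one row of Table 3: the 'sufficient guidance'
# item had Z = -9.3 with N = 1117 responses.
r = effect_size_r(-9.3, 1117)
print(round(r, 2), hattie_band(r))  # -0.28 zone of desired effect
```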

The qualitative data was analysed for emerging themes in an inductive manner. Themes were generated through several rereads of the responses during which recurring themes were identified. These themes were studied in order to identify any redundancies and each theme was given a code. Themes extracted from the data and their codes were given to two other researchers who attempted to code a portion of the qualitative data for each of the three open questions. If needed, themes and codes were refined and these final themes were then used to recode the rest of the responses. This data was then expressed as a percentage of participants who raised a particular theme.
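The final tallying step, expressing each theme as a percentage of participants who raised it, can be sketched as follows; the themes and responses here are hypothetical:

```python
from collections import Counter

# Hypothetical coded responses: each participant's answer is tagged with
# the set of themes it raised during the inductive coding described above.
coded_responses = [
    {"experimental design", "teamwork"},
    {"time management"},
    {"experimental design"},
    {"teamwork", "time management"},
]

theme_counts = Counter(theme for tags in coded_responses for theme in tags)
n = len(coded_responses)

# Percentage of participants raising each theme.
percentages = {theme: 100 * count / n for theme, count in theme_counts.items()}
print(percentages["experimental design"])  # 50.0
```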

Limitations

The first limitation of this study is the percentage of students responding (38–76%) and the overall sample sizes (28–184). The percentages of a given cohort responding varied from somewhat representative (38%) to highly representative (76%). In some cases, the cohort itself was relatively small (approximately 60) whilst others were impractically large (approximately 1200+). Therefore, this situation either resulted in small data sets (Enzyme experiment) or required the use of subsets (e.g. only surveying two out of eight lab classes), a process known as convenience sampling (Henry, 1990). Potentially, the use of subsets or small cohorts could lead to less statistically valid data or non-representative data.

This can be further considered through comparison to the work of Barlett et al. (2001) in which they provided acceptable minimal sample sizes for varying populations. However, this is complicated by the fact that the researchers only provided acceptable sample sizes for either continuous or categorical data, whilst this study utilises ordinal data. Even presuming the lower acceptable sample sizes for the continuous data set apply for an alpha of 0.05 (i.e. a 5% error), six of the seven sample sizes are considered below the required levels. This is simply due to the smaller cohorts enrolled in those courses. However, when the datasets are combined into Pre-TLL and Post-TLL, the number of responses (522 and 608 respectively) are considerably above the required number (106). Hence, whilst some individual experiments may not be fully represented through this analysis, the overall Pre-TLL vs. Post-TLL comparison has sufficient statistical power.

Additionally, as students may have filled out the survey on multiple occasions, it is possible that they became accustomed to the survey and answered the questions in a repetitive manner. The inclusion of negative stems and a distractor question hopefully forced students to stop and reread the questions, but this cannot be confirmed at this time.

Another potential limitation was the experiments that were chosen to represent the Pre-TLL dataset. These experiments were selected for practical concerns with timetabling and the delivery of other questionnaires not related to this study. Hence, some experiments that did not form part of this study may have affected the results obtained (i.e. if all teaching experiments in all chemistry courses were surveyed, different results may have been obtained).

It is also possible that the previous background of the students, in terms of encounters with other teaching laboratories, could influence their responses. This cannot be completely discounted but, during the first year at Monash University, all students undertake multiple inquiry-based laboratories (called IDEA experiments) during semesters one and two. Consequently, all students in higher years have at least some experience with inquiry experiments, which would potentially mitigate this issue.

Beyond the Cronbach's alpha test, no further measurements of reliability or validity were performed. Hence, there may be issues regarding the validity or reliability of the instrument. However, the inclusion of qualitative data through the open questions and sourcing the bulk of the survey from a pre-validated and thoroughly tested source was believed to be sufficient to counter this concern.

Finally, the method of interpretation could be a source of error. However, it is believed that the iterative nature of the theme generation negated this issue to a significant degree, particularly through the use of multiple coders.

Results and discussion

As many changes were made to the literature version of the survey, a test of the internal reliability of the modified version was carried out using the SPSS data analysis software. After correcting for the negative items, the Cronbach's alpha values of the entire Pre-TLL and Post-TLL datasets were found to be 0.846 and 0.866 respectively. As the common literature threshold for an internally consistent survey is ≥0.7 (Santos, 1999), it appeared that the numerous changes to the survey did not affect this value. As such, the items were all considered to probe the same underlying concept (in this case, the single laboratory experience), indicating the use of a reliable scale.
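The internal-reliability statistic reported here can be reproduced from its standard definition; the sketch below uses only the standard library, with toy data in place of the actual survey responses:

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for respondent rows of per-item Likert codes:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(rows[0])                         # number of survey items
    cols = list(zip(*rows))                  # per-item response columns
    item_vars = sum(variance(col) for col in cols)
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Toy data: five respondents answering three items (after recoding to 1-5
# and reverse-scoring the negative stems, as described in the text).
data = [(4, 4, 5),
        (3, 3, 4),
        (5, 4, 5),
        (2, 2, 3),
        (4, 3, 4)]
alpha = cronbach_alpha(data)
print(round(alpha, 2))  # 0.96
```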

Quantitative analysis

The occurrence of any significant differences between students responding to the survey after a Pre-TLL or Post-TLL experience was investigated. It should be noted that even though the sample sizes for the individual experiments were very different (28–184), all data for all the experiments was combined in order to measure the Pre-TLL and Post-TLL effects. Furthermore, the six new closed questions (focusing on the students’ perceptions of the ease, challenge, contextualisation, openness or level of interest in the experiments) were added after analysis of some of the original experiments. Hence, the sample sizes are notably lower for those six questions.

The Wilcoxon Signed Rank test showed that 18 of the 27 closed questions were answered in a significantly different way. All questions, alongside the p, Z and r values, are shown in Table 3. Of these, only eight showed an effect size within the ‘zone of desired effect’ (Hattie, 2008; Lenhard and Lenhard, 2016). Fig. 1 shows the responses to these eight questions.

Table 3 The results of the Wilcoxon Signed Rank test on the 27 closed questions showing the number of responses (N), the p value, the calculated Z value and the r effect size
Question N p Z r
This lab experience was worthwhile. 680 0.95 −0.1
This lab experience was interesting. 680 0.79 −0.3
This lab experience helped me better understand chemistry, in general, as a result of completing the chemistry lab. 1122 0.55 −0.6
In my life, I will not use skills I’ve learned in this chemistry lab. 1098 0.44 −0.8
This lab experience did not make me learn. 1118 0.43 −0.8
Having the opportunity to use chemistry instruments helped me learn course topics. 1115 0.40 −0.8
Even if I don’t end up working in a science related job, the laboratory experience will still benefit me. 1111 0.33 −1.0
This lab experience has made me more interested in a science career. 1108 0.28 −1.1
This lab experience made me realize I could do science research in a real science laboratory (for instance at a university, or with a pharmaceutical company). 1114 0.076 −1.8
Having the opportunity to use chemistry instruments made this course less interesting for me. 1114 0.051 −1.9
This lab experience has made me less interested in science. 1117 0.045 −2.0 −0.06
This lab experience has made me more interested in chemistry. 1118 0.008 −2.7 −0.08
This lab experience helped me understand how the topics that are covered in chemistry lecture are connected to real research. 1119 0.003 −3.0 −0.09
This lab experience gave me a better understanding of the process of scientific research as a result of this experiment. 1106 0.002 −3.1 −0.09
Finding answers to real research questions motivated me to do well in the chemistry lab. 1115 0.001 −3.3 −0.10
Finding answers to real world questions motivated me to do well in the chemistry lab. 1106 <5 × 10−4 −3.5 −0.10
This lab experience presented real science to students, similar to what scientists do in real research labs. 1123 <5 × 10−4 −3.9 −0.12
This lab experience was not very similar to real research. 1120 <5 × 10−4 −4.3 −0.13
This lab experience was not well organized. 1112 <5 × 10−4 −5.1 −0.20
This lab experience was open enough to allow me to make decisions. 678 <5 × 10−4 −5.4 −0.16
This lab experience was easy. 594 <5 × 10−4 −6.0 −0.25
In this lab, the instructional materials did not provide me with explicit instructions about my experiment. 679 <5 × 10−4 −6.5 −0.25
This lab experience was well contextualised to real life or the workforce. 681 <5 × 10−4 −6.5 −0.25
This lab experience was challenging. 679 <5 × 10−4 −8.6 −0.26
In this lab, I did not repeat experiments to check results. 1121 <5 × 10−4 −8.6 −0.25
In this lab, the instructional materials provided me with sufficient guidance for me to carry out the experiments. 1117 <5 × 10−4 −9.3 −0.28
In this lab, I can be successful by simply following the procedures in the lab manual. 1118 <5 × 10−4 −9.3 −0.28



Fig. 1 Horizontal stacked bar charts for the eight questions showing an effect size within the ‘zone of desired effect’ Pre and Post TLL.

It is worth noting that this analysis does not take into account the variation in sample sizes for the different experiments, which ranged from 28 to 184. To investigate the effect of this, 28 random responses were chosen from each data set (e.g. only 28 of the responses to the electronic waste experiment, and likewise for all of the other experiments). Following a Wilcoxon Signed Rank test, only one new question (which focused on general chemistry understanding) showed a significant difference, and even then it exhibited only a small effect size. Six other questions no longer showed a significant difference, although they had previously exhibited effect sizes of r < 0.14 and were considered negligible. One new question rose above the r = 0.2 threshold (which focused on student perception of the organisation of the experiment), whilst the effect sizes of the original eight questions increased by 0.02–0.08. Hence, at worst, combining the data for analysis causes the overall effect sizes to be underestimated for the original eight questions, and these were considered to be the items most affected by the TLL programme. The full data for this analysis are presented in the Appendix.
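The rank-based comparison and the r effect size (r = Z/√N) reported throughout this section can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the use of SciPy's Mann–Whitney U (the unpaired Wilcoxon rank-sum test, appropriate for independent Pre- and Post-TLL cohorts), the normal approximation for Z, and the omission of a tie correction are all assumptions.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def rank_test_effect_size(pre, post):
    """Compare two independent sets of Likert responses (1-5) with a
    rank-based test and report p, an approximate Z and the effect size
    r = Z / sqrt(N), in the style of the tabulated results."""
    u, p = mannwhitneyu(pre, post, alternative="two-sided")
    n1, n2 = len(pre), len(post)
    # Normal approximation: convert U to a Z score (tie correction omitted)
    mu = n1 * n2 / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    r = z / np.sqrt(n1 + n2)
    return p, z, r

# Hypothetical data: Post-TLL responses shifted towards 'agree'
pre = [1, 2, 2, 3] * 10
post = [3, 4, 4, 5] * 10
p, z, r = rank_test_effect_size(pre, post)
```

A negative Z and r here indicate, as in the tables, that responses shifted upwards (towards agreement) in the second group.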

Fig. 1 indicates the direction of the change in the students’ responses for the eight questions that showed a change within the ‘zone of desired effect’. For example, the top two horizontal stacked bars show that the students’ responses shifted right for the Post-TLL experiments compared to the Pre-TLL experiments; that is, students were more likely to select neutral, agree or strongly agree when asked whether the experiment was open enough for them to make decisions. With the exception of the responses on the level of contextualisation within the experiment (which saw a large decrease in neutral responses and a rise in agree and strongly agree), the shift of the horizontal stacked bars provides a simple approximation of the shift in the students’ responses.

Overall, students’ responses implied that they found the Post-TLL experiments less easy and more challenging, contextualised and open (i.e. they could make more decisions themselves). They were also more likely to repeat experiments to check results, and reported that the laboratory manual offered less guidance and could not be relied upon alone to complete the experiment. These results are considered very positive outcomes for the TLL programme, as removing dependence on the laboratory manual was a key goal (it was perceived that students over-relied upon it for guidance). This indicates a move away from expository, recipe-style manuals and an increase in inquiry. Furthermore, the recognition of increased openness, challenge and contextualisation was also considered a positive result.

It is also of interest to note the questions that showed little to no significant difference for the modified experiments. Questions referring to overall motivation (e.g. finding answers to real research questions/real world questions motivated me to do well), interest (e.g. ‘This lab experience made me more interested in a science career’ or ‘This lab experience was interesting/worthwhile’) or overall learning (e.g. ‘This lab experience did not make me learn’) showed little to no significant difference. It would seem that either even more substantial changes to the laboratory programme would be required to affect these items, or that the students were unlikely to become more positive; approximately 80% of students consistently stated that the lab was interesting or worthwhile, so there was very little room for improvement no matter what changes were made. It is also possible that intrinsic factors such as interest and motivation are simply too innate to a given student to be influenced by external factors such as the type of laboratory experienced.

That being said, the closed questions, and the subsequent quantitative analysis, can only provide a surface-level picture of the impact of the new experiments. Hence, discussion of the responses to the three open questions is required.

Perceived skill development

The first open question asked the students ‘What skills did you develop throughout today's experience?’. It should be noted that the Panacetin experiment is absent from the open questions as the survey did not include these prompts at that time. For the other experiments, students raised between one and four different skills. Tables 4 and 5 show the top three skills (raised by ≥10% of the students, a threshold believed to yield meaningful data) for any given experiment.
Table 4 The top three skills raised by ≥10% of the students to any Pre-TLL experiment
Experiment Top three skills developed Percentage of respondents (%)
Rearrangement Practical skills 88
Transferable skills 25
Theory 13
EAS Practical skills 76
Theory 32
Transferable skills 10
Anthracene oxidation Practical skills 76
Patience 17
Theory 16
Macrocycles Practical skills 69
Following a literature method 16
None 11
Isomerisation Practical skills 84
Transferable skills 49
Theory 19
Proteins (2016) Practical skills 84
Transferable skills 35
Teamwork 16


Table 5 The top three skills raised by ≥10% of the students to any Post-TLL experiment
Experiment Top three skills developed Percentage of respondents (%)
Proteins (2017) Practical skills 80
Transferable skills 35
Teamwork 25
Sunscreen Practical skills 76
Transferable skills 43
Time management 29
Pseudenol Practical skills 71
Transferable skills 36
Critical thinking 10
Electronic waste Transferable skills 45
Observational skills 40
Note taking 22
Nylon Transferable skills 42
Experimental design 39
Practical skills 35
Food project Transferable skills 76
Teamwork 49
Experimental design/practical skills 29
Enzymes Experimental design 56
Practical skills/transferable skills 33
Teamwork 19


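The ≥10% thresholding used to build Tables 4 and 5 (and the later improvement and enjoyment tables) can be sketched as follows. The coded-response format and the function name are hypothetical; this is a minimal illustration of the tallying logic, not the authors' analysis code.

```python
from collections import Counter

def top_themes(coded_responses, threshold=0.10, top_n=3):
    """Tally coded open-response themes and keep the top `top_n` themes
    raised by at least `threshold` of respondents, with percentages.
    `coded_responses`: one list of theme codes per student (hypothetical format)."""
    n = len(coded_responses)
    # Count each theme at most once per student, even if raised repeatedly
    counts = Counter(t for themes in coded_responses for t in set(themes))
    ranked = [(theme, round(100 * c / n))
              for theme, c in counts.most_common() if c / n >= threshold]
    return ranked[:top_n]

responses = [["practical skills"],
             ["practical skills", "teamwork"],
             ["practical skills"],
             ["theory"],
             ["practical skills", "theory"]]
themes = top_themes(responses)
```

With these five hypothetical students, the function reports practical skills (80%), theory (40%) and teamwork (20%), mirroring the table layout above.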
The Pre-TLL results in Table 4 show several notable features. Firstly, the development of practical skills (student examples include ‘Use of the glassware’ and ‘Liquid–liquid extraction. TLC and how to interpret it. How to use pKA and separate solution mixture. Saw HNMR machine being used’) was a major theme for all six Pre-TLL experiments, raised by 69–88% of the respondents. Secondly, many students stated a greater understanding of particular theoretical concepts (student examples include ‘Improved my mechanism understanding’ and ‘Understanding of how a catalyst works’) as a skill that they had developed, ranging from 13 to 32% of the responses. Thirdly, even though these experiments were predominantly expository, students often raised a range of transferable skills, including, but not limited to, time management, teamwork, critical thinking and communication (student examples include ‘Teamwork skills and communication skills’ and ‘Critically thinking about instructions’). Individual transferable skills were generally not raised by more than 10% of the cohort; hence, an overarching theme was generated that subsumed all transferable skills, which was raised in four of the six Pre-TLL cases by 16–49% of respondents.

Deviations from the above observations can be explained by the nature of the experiments themselves – patience was raised in the anthracene oxidation experiment, which involved a two hour reflux, whilst following a literature method was raised in the macrocycles experiment, as students were expected to find the method in the literature before arriving at the laboratory session. Overall, the Pre-TLL results appear focused on the development of practical skills and a limited set of transferable skills, and regularly foreground theoretical understanding. As already noted, these findings are not unique, but they highlight the success of the survey at detecting these students’ perceptions.

Analysis of the Post-TLL responses in Table 5 indicates a range of similarities and differences when compared to the Pre-TLL results. Three of the Post-TLL experiments (Proteins (2017), Sunscreen and Pseudenol) show very similar results to the Pre-TLL experiments. This is to be expected, as these new experiments, whilst contextualised, were still focused on the development of new practical techniques and could be considered largely expository, albeit with more inquiry focus than many Pre-TLL experiments, as the overall outcomes of the experiments were discovered by the students. However, it is worth noting that in all three cases an individual transferable skill (teamwork, time management and critical thinking, respectively) was now raised often enough by the students to become one of the top three themes. Hence, these Post-TLL experiments were still recognised as opportunities to develop practical skills whilst raising recognition of several transferable skills. This is likely a result of the increased connection to the students’ daily lives and/or potential career paths when experiencing a contextualised laboratory.

The Post-TLL experiments that allowed students to undertake experiments of their own design (Nylon, Food Project and Enzymes) showed responses in common with one another. In all three cases, the development of experimental design skills (student examples include ‘Creating methods’ and ‘How to develop experiments to achieve certain aims or outcome’) was recognised by 29–56% of the students. The development of transferable skills also became much more prominent in the responses, with the extreme result of 76% of students raising them in the Food Project experiment. The development of practical skills was still raised in all of these experiments, but to a much lower degree (29–35%). This is likely an artefact of the students now raising a much larger breadth of developed skills.

The only case in which the development of practical skills was not a significant theme was the electronic waste experiment. This experiment involved the dropwise addition of metal ion solutions to a range of reagents and was, therefore, practically quite simple. Hence, other skills were raised by the students such as taking observations and making detailed notes.

Lastly, an increase in theoretical understanding was no longer raised as a common theme in the Post-TLL experiments. This would appear to contradict the quantitative results, in which there was no notable difference in the students’ reported level of chemistry understanding or overall learning. However, it is possible that the new experiences simply provided a richer environment for skill development, so that students identified a broader range of skills and mentioned a deeper theoretical understanding less often as a skill that had been developed. Hence, it appeared to a lesser extent in the open answers but remained unchanged in the closed responses. This is particularly positive, as the TLL experiments were designed to incorporate a larger diversity of learning experiences, rather than simply providing a chance to study a given theoretical principle. Overall, the Post-TLL results showed a larger range of skill development, particularly incorporating more transferable skills and experimental design. This was achieved without sacrificing the development of practical skills in the more expository experiences.

Improvements suggested by students

The second open question asked ‘Was there anything that could be improved about today's lab?’. The top three issues (raised by ≥10% of the students) for any experiment are shown in Tables 6 and 7.
Table 6 The top three improvements raised by ≥10% of the students to any Pre-TLL experiment
Experiment Top three improvements raised by ≥10% of respondents Percentage of respondents (%)
Rearrangement Greater guidance 42
No changes required 21
Better time management 19
EAS No changes required 43
Procedural issues 14
Greater guidance 14
Anthracene oxidation Less waiting time 48
Greater guidance 35
No changes required 12
Macrocycles No changes required 31
Greater guidance 26
More explanation of theory/better time management 14
Isomerisation Greater guidance 55
Better teaching associates/greater context required 15
Procedural issues 10
Proteins (2016) No changes required 48
Greater guidance 25
Better time management 15


Table 7 The top three improvements raised by ≥10% of the students to any Post-TLL experiment
Experiment Top three improvements Percentage of respondents (%)
Proteins (2017) Greater guidance 47
No changes required 28
Procedural issues 17
Sunscreen Greater guidance 40
Better time management 33
No changes required 13
Pseudenol Greater guidance 64
Better time management 20
No changes required 11
Electronic waste Greater guidance 50
No changes required 15
Introduce group or team work 11
Nylon Greater guidance 62
Better time management 22
No changes required 15
Food project Greater guidance 52
Pre-assignment to groups 21
Enzymes No changes required 29
Procedural issues 21
Greater guidance 18


Table 6 shows that the improvements requested for the Pre-TLL experiments were quite varied. The themes included better guidance (student examples include ‘More guidance’ and ‘The instructions in the lab manual were vague’), better use of time (student examples include ‘Ran very close to time, organisation could be better’ and ‘Could maybe find something to make it last closer for the 4 hours, as opposed to finishing at 4:30 [a 90 minute early leaving time]’), better Teaching Associates (student examples include ‘Lab demonstrators gave barely any info as to the theory, explaining kinetics (4102) or any of my data’ and ‘Having better TA’), fixed procedural issues (student examples include ‘Flask was not sufficient to capture all solid after recrystallisation’ and ‘Filtration, most solid fell through into the flask’), and even calls for no changes at all (student examples include ‘Not that I can think of’ and ‘Not really, exercise 3 is pretty well organised and went smoothly’). The only universal theme was that students called for greater guidance in every Pre-TLL experiment.

The Post-TLL results (Table 7) show a similar range of themes to the Pre-TLL results, albeit with a very different focus. The desire for greater guidance was now the main issue raised in six of the seven Post-TLL experiments. This shift is in good agreement with the quantitative data, which highlighted that the students no longer felt that the laboratory manual provided sufficient guidance to complete the experiment. The strong desire for guidance is also a logical extension of the Pre-TLL results, as guidance was already a perceived issue for the students and the TLL programme deliberately sought to remove the recipe-like approach. This backlash is a likely result of students being accustomed to expository experiments and the stepwise instructions normally provided. Hence, this result is considered positive, as students will need to learn to deal with limited guidance throughout their future careers; this is the first step in acclimatising students to the uncertainties of a real workplace.

Calls for better time management were more frequent in three of the Post-TLL cases (20–33%) than in the Pre-TLL cases (14–19%). This is most likely due to the longer, and more challenging, experiments generated through the TLL programme. Additionally, the call for group/team work in the electronic waste experiment was simply due to the requirement for students to work individually in a course where they typically worked in groups. Finally, the desire for pre-assignment to groups was a response to the fact that students were given a topic to investigate, rather than being allowed to choose from a list. Overall, many of the issues raised appeared to be responses from students speaking out against the new, more challenging, less prescriptive programme. This situation could possibly be ameliorated through conversations with students about the aims of the programme. That being said, such teething problems are common where inquiry or problem-based learning has been implemented (Bruck and Towns, 2009; Gormally et al., 2009) and may subside over time.

Perceived enjoyment

The third open question asked ‘Did you enjoy today's lab? Why/why not?’. As the question was composed of two sections, the analysis of the responses was also split into two. The first reading of the responses was simply whether the student enjoyed the laboratory or not. These results are shown in Tables 8 and 9.
Table 8 The percentage of students indicating that they enjoyed the Pre-TLL experiments
Experiment Respondents who enjoyed the laboratory (%)
Rearrangement 95
EAS 87
Anthracene oxidation 57
Macrocycles 70
Isomerisation 74
Proteins (2016) 84
Average 81
Standard deviation 14


Table 9 The percentage of students indicating that they enjoyed the Post-TLL experiments
Experiment Respondents who enjoyed the laboratory (%)
Proteins (2017) 82
Sunscreen 86
Pseudenol 78
Electronic waste 92
Nylon 78
Food project 70
Enzymes 78
Average 80
Standard deviation 7


In both sets, the average percentage of students stating that they enjoyed the experience was high (≥80%), and the averages were the same within one standard deviation. Hence, the new laboratory experiments had no measurable impact (at least as measured by this survey) on students’ reported enjoyment. This would appear to contrast with the results of others (Gormally et al., 2009) who noted that students were ‘resistant’ to such changes in the curriculum. Potentially, this could imply that the new experiments were better received than originally anticipated. However, another reading of the data is that the students enjoyed the old expository laboratory experiments just as much as the new ones, which implies that enjoyment may not be the best measure by which to judge the effectiveness of any particular teaching intervention.

The reasons behind their enjoyment were very informative. The top three reasons (raised by ≥10% of the students) for enjoying any experiment are shown in Tables 10 and 11.

Table 10 The top three reasons for enjoying the laboratory by ≥10% of the students to any Pre-TLL experiment
Experiment Top three reasons for enjoying the laboratory Percentage of respondents (%)
Rearrangement Good teaching associate 32
Good practical skills 22
Interesting, worthwhile or fun 20
EAS Short experiment 53
Easy experiment 30
Good teaching associate 20
Interesting, worthwhile or fun 20
Anthracene oxidation Interesting, worthwhile or fun 35
Easy experiment 15
Interesting theory 13
Macrocycles Interesting, worthwhile or fun 51
Good time management 14
Good practical skills 11
Isomerisation Interesting, worthwhile or fun 58
Good time management 17
Good teaching associate 17
Proteins (2016) Interesting, worthwhile or fun 32
Good guidance 23
Good teaching associate 17


Table 11 The top three reasons for enjoying the laboratory by ≥10% of the students to any Post-TLL experiment
Experiment Top three reasons for enjoying the laboratory Percentage of respondents (%)
Proteins (2017) Strong context 36
Interesting, worthwhile or fun 31
Good time management 14
Sunscreen Interesting, worthwhile or fun 26
Strong context 23
Good challenge 21
Pseudenol Strong context 31
Interesting, worthwhile or fun 28
Good challenge 15
Electronic waste Interesting, worthwhile or fun 48
Easy experiment 15
Strong context 10
Nylon Interesting, worthwhile or fun 51
Strong context 34
Chance to develop method or undertake investigation 26
Food project Chance to develop method or undertake investigation 24
Good teamwork or team 23
Strong context 14
Enzymes Chance to develop method or undertake investigation 39
Strong context 10


Throughout the Pre-TLL experiments (Table 10), the most common reason raised (20–58%) for enjoying an experiment was that it was interesting, worthwhile or fun (student examples include ‘Yes, it was interesting’ and ‘It was fun and pretty’). Typically, students did not state why the experiment merited any of these descriptions. The importance of a good Teaching Associate was another major theme (student examples include ‘TA is nice and helpful. She makes the practical go very smoothly’ and ‘engaging demos and interesting end results’), appearing in the responses to four of the six experiments (17–32%). Whilst it is pleasing that those particular Teaching Associates were well received, it is concerning that the students attributed a large amount of the success of the experience to a small number of staff. This dependence on the Teaching Associate is a significant area of research (e.g. note the work of Velasco et al. (2016)), particularly in their training and development (Flaherty et al., 2017), and it is not overly surprising to see it come through in this case.

Outside of these main themes, students reported enjoying easy (student examples include ‘It was simple and instructions are clear’ and ‘b/c it was easier than previous labs’) or short experiments (student examples include ‘it was only 2 hours’ and ‘it was quick’). This result is in good agreement with the work of DeKorver and Towns (2015), which showed that students often focus on simply completing the experiment as quickly as possible in order to achieve the highest mark possible.

Interesting practical skills (student examples include ‘Learning/practicing interesting techniques’ and ‘The techniques were consistent and satisfying’) and significant guidance (student examples include ‘because clear instructions were given’ and ‘very clear instructions’) were also raised, but only in response to single experiments. Overall, no mention was made of context or inquiry, which is to be expected, both from the nature of the experiments and the quantitative data discussed earlier.

The Post-TLL responses shown in Table 11 indicate a very different pattern from the Pre-TLL responses. Firstly, whilst the students still routinely stated that the experiments were interesting, worthwhile or fun, they were far more likely to raise the context of the experiment as a reason for this (student examples include ‘the context was interesting’ and ‘interesting as an investigative exercise similar to industry processes’). In fact, enjoyment due to the context of the experiment was a notable theme in all seven Post-TLL experiments (10–36%). This raising of context as a reason for enjoyment is common in other implementations of context-based learning (Pringle and Henderleiter, 1999).

It is also interesting that whilst the Sunscreen and Pseudenol experiments were known to be difficult, a number of students (15–23%) raised the challenge as a reason for their enjoyment of the lab (student examples include ‘Yes, quite challenging’ and ‘Was a good thinking and practical challenge in chem and science principles’). Themes relating to the ease of the experiment were only noted in the first year experiment, electronic waste, which is reasonable considering the year level involved and the simple practical skills utilised.

For the experiments that included a significant component of inquiry or experimental design (Nylon, Food Project and Enzymes), this was directly stated by the students (24–39%) as a reason they enjoyed the experience (student examples include ‘make some polymers and test the properties within own design’ and ‘It was interesting to design our experiments’). Only one theme was unique to a single experiment: ‘good teamwork or team’ in the Food Project experiment (student examples include ‘team was good’ and ‘I really enjoyed the team dynamic’). Overall, this increase in enjoyment as a direct result of increased inquiry or problem-based learning is a well-documented outcome for these types of teaching laboratories, as raised earlier in this article (Weaver et al., 2008).

It is worth noting that whilst the Post-TLL experiments were never considered short and only rarely easy, there appeared to be no notable negative impact on student enjoyment. Overall, these results are promising for the TLL programme: students appear to enjoy the experiments for the very reasons they were created, namely to incorporate more industry-contextualised, inquiry/problem-based experiences. This is in spite of the reported issues with guidance shown in the quantitative data and in the improvements suggested by the students.

Conclusions

Overall, the individual experiments generated by the Transforming Laboratory Learning (TLL) programme at Monash match the outcomes expected from the literature on context- and inquiry-based learning. A single survey (consisting of 27 closed questions and three open questions) showed that the new undergraduate experiments left students more aware of (and more able to articulate) a larger range of skills that they had developed. Whilst the students recognised that the experiences were more challenging and contained far less guidance, this did not appear to impact their level of interest, enjoyment or overall appreciation of the experiments.

Furthermore, students routinely recognised that the experiments were more contextualised and more open (i.e. they were more able to make decisions). A large amount of effort was undertaken to incorporate more student control and greater real world context so this is a welcome result.

The students were also more likely to state that they repeated experiments, indicating an increase in a simple scientific practice – that of reproducibility. The students regularly reported (in both the closed and open questions) that the laboratory manual no longer provided enough information on its own to guide them through the experiment. As a central aim of TLL was to remove excessive guidance and encourage scientific practices (e.g. repetition), these were seen as favourable outcomes. However, it is worth noting that there is always room for improvement with regard to student guidance, and it is likely that the clarity of the student instructions could be further improved.

Students were more likely in the Post-TLL experiments to raise the development of transferable skills and skills associated with experimental design. This was accompanied by a decreased focus on development of theoretical understanding.

The proportion of students stating that they enjoyed the experiment did not change after the TLL programme. However, in the new experiments, students raised the strong context and open design of the experiments as reasons for their enjoyment – themes that were absent from the Pre-TLL data.

Overall, this research shows that the advantages gained through contextualisation and inquiry/problem-based learning persist when incorporated into a large, complex, multi-year undergraduate programme. Whether these changes will have persistent, long-term effects on the students’ understanding and articulation of their transferable skills will be determined through future research.

Implications for practice

There are two main outcomes of this research that could guide staff involved in delivering teaching laboratories. The first is that one may not need to completely overhaul all undergraduate laboratories to obtain the benefits of inquiry/problem-based learning and context-based learning. Indeed, a range of laboratories can be generated that incorporate increased context or enhanced inquiry (individually or together), and their combined benefit may still prove substantial. Furthermore, these changes can be implemented in many different chemistry courses and still provide the same apparent benefit. The second major practical outcome is the generation of the modified survey itself. Whilst further measurements of validity and reliability may be required, this tool appears to provide a powerful measure of students’ perceptions of any new experiments that may be generated. Furthermore, the data in this article may provide a useful comparison for future users of this particular instrument.

Future work

This work primarily focuses on the students’ changing responses to a range of individual experiments generated through the TLL programme. More global investigations of the TLL programme are also being undertaken. These include, but are not limited to, focus groups for individual chemistry courses before and after the TLL programme and focus groups of students undertaking their final year project. Annual surveys (including the MLLI survey) are also being undertaken to track the students’ perceptions of (a) laboratory aims, as well as their expectations of their actions and feelings throughout teaching laboratories, and (b) their perceived level of employability and overall skill development over the three-year chemistry programme. It is also worth noting that this survey will continue to be used throughout the remainder of the TLL programme, with the results used to further guide the researchers as part of an action research approach.

Conflicts of interest

There are no conflicts to declare.

Appendix

The Wilcoxon results after correcting for sample size are shown below (Table 12).
Table 12 The results of the Wilcoxon Signed Ranked test on the 27 closed questions (after correcting for sample size) showing the number of responses (N), the p result, the calculated Z value and the r effect size
Question N p Z r
This lab experience was worthwhile. 652 1.00 0.00
This lab experience was interesting. 665 0.95 −0.06
This lab experience helped me better understand chemistry, in general, as a result of completing the chemistry lab. 660 0.70 −0.39
In my life, I will not use skills I’ve learned in this chemistry lab. 662 0.62 −0.49
This lab experience did not make me learn. 666 0.57 −0.57
| Survey statement | N | p | Test statistic | Effect size |
|---|---|---|---|---|
| Having the opportunity to use chemistry instruments helped me learn course topics. | 663 | 0.49 | −0.68 | |
| Even if I don't end up working in a science related job, the laboratory experience will still benefit me. | 500 | 0.39 | −0.86 | |
| This lab experience has made me more interested in a science career. | 662 | 0.32 | −0.99 | |
| This lab experience made me realize I could do science research in a real science laboratory (for instance at a university, or with a pharmaceutical company). | 663 | 0.32 | −0.99 | |
| Having the opportunity to use chemistry instruments made this course less interesting for me. | 664 | 0.31 | −1.01 | |
| This lab experience has made me less interested in science. | 659 | 0.30 | −1.03 | |
| This lab experience has made me more interested in chemistry. | 661 | 0.078 | −1.76 | |
| This lab experience helped me understand how the topics that are covered in chemistry lecture are connected to real research. | 655 | 0.068 | −1.83 | |
| This lab experience gave me a better understanding of the process of scientific research as a result of this experiment. | 667 | 0.064 | −1.85 | |
| Finding answers to real research questions motivated me to do well in the chemistry lab. | 500 | 0.055 | −1.92 | |
| Finding answers to real world questions motivated me to do well in the chemistry lab. | 667 | 0.034 | −2.12 | −0.12 |
| This lab experience presented real science to students, similar to what scientists do in real research labs. | 665 | 0.018 | −2.37 | −0.17 |
| This lab experience was not very similar to real research. | 498 | 0.016 | −2.42 | −0.23 |
| This lab experience was not well organized. | 665 | 0.008 | −2.67 | −0.10 |
| This lab experience was open enough to allow me to make decisions. | 661 | <0.0005 | −4.45 | −0.34 |
| This lab experience was easy. | 665 | <0.0005 | −4.83 | −0.36 |
| In this lab, the instructional materials did not provide me with explicit instructions about my experiment. | 662 | <0.0005 | −5.00 | −0.21 |
| This lab experience was well contextualised to real life or the workforce. | 502 | <0.0005 | −5.05 | −0.29 |
| This lab experience was challenging. | 415 | <0.0005 | −5.05 | −0.30 |
| In this lab, I did not repeat experiments to check results. | 499 | <0.0005 | −5.08 | −0.29 |
| In this lab, the instructional materials provided me with sufficient guidance for me to carry out the experiments. | 664 | <0.0005 | −5.33 | −0.36 |
| In this lab, I can be successful by simply following the procedures in the lab manual. | 666 | <0.0005 | −5.75 | −0.29 |
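The comparisons above report, for each survey item, a sample size, a p value, a test statistic, and (for significant items) an effect size. As an illustration only, the sketch below implements a rank-sum (Mann–Whitney U) comparison of two groups' Likert-scale responses using the normal approximation with tie correction, reporting the effect size convention r = z/√N. The function name and the toy data are assumptions for illustration; this is not a reproduction of the authors' analysis pipeline or their effect-size calculation.

```python
import math
from itertools import chain

def rank_sum_test(a, b):
    """Two-sided Mann-Whitney U test via the normal approximation,
    with tie correction. Returns (U, z, p, r), where r = z / sqrt(N)
    is one common effect-size convention for rank-based tests.
    Illustrative sketch only, not the paper's analysis."""
    combined = sorted(chain(a, b))
    n1, n2 = len(a), len(b)
    N = n1 + n2
    # Assign average ranks to tied values and accumulate the
    # tie-correction term sum(t^3 - t) over tie groups of size t.
    ranks, tie_term, i = {}, 0.0, 0
    while i < N:
        j = i
        while j < N and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1..j
        t = j - i
        tie_term += t ** 3 - t
        i = j
    R1 = sum(ranks[x] for x in a)          # rank sum of group a
    U = R1 - n1 * (n1 + 1) / 2             # Mann-Whitney U for group a
    mu = n1 * n2 / 2                       # mean of U under H0
    sigma = math.sqrt(n1 * n2 / 12 * ((N + 1) - tie_term / (N * (N - 1))))
    z = (U - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided normal-approx. p
    r = z / math.sqrt(N)
    return U, z, p, r
```

For identical groups U equals its null mean (z = 0, p = 1), while clearly separated groups give a large negative z, a small p, and a sizeable |r|, mirroring the shape of the significant rows in the table.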


Acknowledgements

The authors would like to acknowledge and thank, first and foremost, the students who participated in this study, whose honest feedback will result in a far stronger learning experience for students at Monash University. The authors would also like to express their gratitude to the technical staff at Monash University for their constant support and advice. The authors acknowledge Monash University for funding, and hosting, the Transforming Laboratory Learning programme. Ethics approval was obtained from the Monash University Human Ethics Research Committee, application number 2016000584.

References

  1. Barlett J. E., Kotrlik J. W. and Higgins C. C., (2001), Organizational research: determining appropriate sample size in survey research, Inf. Technol., Learn., Perform. J., 19(1), 43.
  2. Bauer C. F., (2005), Beyond “Student Attitudes”: Chemistry Self-Concept Inventory for Assessment of the Affective Component of Student Learning, J. Chem. Educ., 82(12), 1864 DOI:10.1021/ed082p1864.
  3. Bauer C. F., (2008), Attitude toward Chemistry: A Semantic Differential Instrument for Assessing Curriculum Impacts, J. Chem. Educ., 85(10), 1440 DOI:10.1021/ed085p1440.
  4. Bingham G. A., Southee D. J. and Page T., (2015), Meeting the expectation of industry: an integrated approach for the teaching of mechanics and electronics to design students, Eur. J. Eng. Educ., 40(4), 410–431 DOI:10.1080/03043797.2014.1001813.
  5. Bruck L. B. and Towns M. H., (2009), Preparing Students To Benefit from Inquiry-Based Activities in the Chemistry Laboratory: Guidelines and Suggestions, J. Chem. Educ., 86(7), 820 DOI:10.1021/ed086p820.
  6. Chuck J.-A., (2011), Hypothetical biotechnology companies: a role-playing student centered activity for undergraduate science students, Biochem. Mol. Biol. Educ., 39(2), 173–179.
  7. Cooper M. M. and Sandi-Urena S., (2009), Design and Validation of an Instrument To Assess Metacognitive Skillfulness in Chemistry Problem Solving, J. Chem. Educ., 86(2), 240 DOI:10.1021/ed086p240.
  8. Cooper L., Orrell J. and Bowden M., (2010), Work integrated learning: a guide to effective practice, Routledge.
  9. Creswell J., (2009), Research design. Qualitative, Quantitative, and Mixed Methods Approaches, Thousand Oaks, CA: SAGE Publications.
  10. Cummins R. H., Green W. J. and Elliott C., (2004), “Prompted” Inquiry-Based Learning in the Introductory Chemistry Laboratory, J. Chem. Educ., 81(2), 239 DOI:10.1021/ed081p239.
  11. DeKorver B. K. and Towns M. H., (2015), General Chemistry Students’ Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92(12), 2031–2037 DOI:10.1021/acs.jchemed.5b00463.
  12. Domin D. S., (1999), A Review of Laboratory Instruction Styles, J. Chem. Educ., 76(4), 543 DOI:10.1021/ed076p543.
  13. Dopico E., Linde A. R. and Garcia-Vazquez E., (2014), Learning gains in lab practices: teach science doing science, J. Biol. Educ., 48(1), 46–52 DOI:10.1080/00219266.2013.801874.
  14. Duch B. J., Groh S. E. and Allen D. E., (2001), Why problem-based learning, The Power of Problem-Based Learning, 3–11.
  15. Erhart S. E., McCarrick R. M., Lorigan G. A. and Yezierski E. J., (2016), Citrus Quality Control: An NMR/MRI Problem-Based Experiment, J. Chem. Educ., 93(2), 335–339 DOI:10.1021/acs.jchemed.5b00251.
  16. Flaherty A., O'Dwyer A., Mannix-McNamara P. and Leahy J. J., (2017), The influence of psychological empowerment on the enhancement of chemistry laboratory demonstrators' perceived teaching self-image and behaviours as graduate teaching assistants, Chem. Educ. Res. Pract., 18(4), 710–736 DOI:10.1039/C7RP00051K.
  17. Fritz C. O., Morris P. E. and Richler J. J., (2012), Effect size estimates: Current use, calculations, and interpretation, J. Exp. Psychol.: Gen., 141(1), 2–18 DOI:10.1037/a0024338.
  18. Galloway K. R. and Bretz S. L., (2015), Development of an Assessment Tool To Measure Students’ Meaningful Learning in the Undergraduate Chemistry Laboratory, J. Chem. Educ., 92(7), 1149–1158 DOI:10.1021/ed500881y.
  19. Gormally C., Brickman P., Hallar B. and Armstrong N., (2009), Effects of inquiry-based learning on students’ science literacy skills and confidence, Int. J. Scholar. Teach. Learn., 3(2), 16 DOI:10.20429/ijsotl.2009.030216.
  20. Grove N. and Bretz S. L., (2007), CHEMX: An Instrument To Assess Students' Cognitive Expectations for Learning Chemistry, J. Chem. Educ., 84(9), 1524 DOI:10.1021/ed084p1524.
  21. Hanson S. and Overton T., (2010), Skills required by new chemistry graduates and their development in degree programmes, Hull, UK: Higher Education Academy UK Physical Sciences Centre.
  22. Hattie J., (2008), Visible learning: a synthesis of over 800 meta-analyses relating to achievement, Routledge.
  23. Henry G. T., (1990), Practical sampling, Sage, vol. 21.
  24. Jansson S., Söderström H., Andersson P. L. and Nording M. L., (2015), Implementation of Problem-Based Learning in Environmental Chemistry, J. Chem. Educ., 92(12), 2080–2086 DOI:10.1021/ed500970y.
  25. Johnstone A. and Al-Shuaili A., (2001), Learning in the laboratory; some thoughts from the literature, Univ. Chem. Educ., 5(2), 42–51.
  26. Kelly O. C. and Finlayson O. E., (2007), Providing solutions through problem-based learning for the undergraduate 1st year chemistry laboratory, Chem. Educ. Res. Pract., 8(3), 347–361 DOI:10.1039/B7RP90009K.
  27. Lagowski J. J., (1990), Entry-level science courses: the weak link, ACS Publications.
  28. Leal Filho W. and Pace P., (2016), Teaching Education for Sustainable Development at University Level, Springer.
  29. Lenhard W. and Lenhard A., (2016), Calculation of Effect Sizes, from https://www.psychometrica.de/effect_size.html.
  30. Mc Ilrath S. P., Robertson N. J. and Kuchta R. J., (2012), Bustin’ Bunnies: An Adaptable Inquiry-Based Approach Introducing Molecular Weight and Polymer Properties, J. Chem. Educ., 89(7), 928–932 DOI:10.1021/ed2004615.
  31. McDonnell C., O'Connor C. and Seery M. K., (2007), Developing practical chemistry skills by means of student-driven problem based learning mini-projects, Chem. Educ. Res. Pract., 8(2), 130–139 DOI:10.1039/B6RP90026G.
  32. Norton A. and Cakitaki B., (2016), Mapping Australian higher education 2016, Grattan Institute, p. 7.
  33. Pilcher L. A., Riley D. L., Mathabathe K. C. and Potgieter M., (2015), An inquiry-based practical curriculum for organic chemistry as preparation for industry and postgraduate research, S. Afr. J. Chem., 68, 236–244.
  34. Pringle D. L. and Henderleiter J., (1999), Effects of Context-Based Laboratory Experiments on Attitudes of Analytical Chemistry Students, J. Chem. Educ., 76(1), 100 DOI:10.1021/ed076p100.
  35. Ram P., (1999), Problem-Based Learning in Undergraduate Instruction. A Sophomore Chemistry Laboratory, J. Chem. Educ., 76(8), 1122 DOI:10.1021/ed076p1122.
  36. Russell C. B., (2008), Development and Evaluation of a Research-Based Undergraduate Laboratory Curriculum, PhD, West Lafayette, Indiana: Purdue University.
  37. Sandi-Urena S., Cooper M. and Stevens R., (2012), Effect of Cooperative Problem-Based Lab Instruction on Metacognition and Problem-Solving Skills, J. Chem. Educ., 89(6), 700–706 DOI:10.1021/ed1011844.
  38. Santos J. R. A., (1999), Cronbach's alpha: a tool for assessing the reliability of scales, J. Extension, 37(2), 1–5.
  39. Sarkar M., Overton T., Thompson C. and Rayner G., (2016), Graduate Employability: Views of Recent Science Graduates and Employers, Int. J. Innov. Sci. Math. Educ., 24(3), 31–48.
  40. Velasco J. B., Knedeisen A., Xue D., Vickrey T. L., Abebe M. and Stains M., (2016), Characterizing Instructional Practices in the Laboratory: The Laboratory Observation Protocol for Undergraduate STEM, J. Chem. Educ., 93(7), 1191–1203 DOI:10.1021/acs.jchemed.6b00062.
  41. Watson D., (1992), Correcting for Acquiescent Response Bias in the Absence of a Balanced Scale: An Application to Class Consciousness, Sociol. Method. Res., 21(1), 52–88 DOI:10.1177/0049124192021001003.
  42. Weaver G. C., Russell C. B. and Wink D. J., (2008), Inquiry-based and research-based laboratory pedagogies in undergraduate science, Nat. Chem. Biol., 4, 577 DOI:10.1038/nchembio1008-577.
  43. Wiggins G., (1990), The Case for Authentic Assessment, ERIC Digest.
  44. Woolfolk A., (2005), Educational Psychology, Boston, MA: Allyn and Bacon.

This journal is © The Royal Society of Chemistry 2018