A disconnect between staff and student perceptions of learning: an ACELL educational analysis of the first year undergraduate chemistry experiment ‘investigating sugar using a home made polarimeter’

Michael G. Crisp a, Scott H. Kable b, Justin R. Read bc and Mark A. Buntine *cd
aSchool of Chemistry, Physics and Earth Sciences, Flinders University of South Australia, Adelaide, SA 5001, Australia
bSchool of Chemistry, University of Sydney, Sydney, NSW 2006, Australia. E-mail: scott.kable@sydney.edu.au
cSchool of Chemistry and Physics, The University of Adelaide, Adelaide, SA 5005, Australia. E-mail: justin.read@sydney.edu.au
dDepartment of Chemistry, Curtin University, GPO Box U1987, Perth, WA 6845, Australia. E-mail: m.buntine@curtin.edu.au

Received 19th November 2010, Accepted 19th August 2011

First published on 7th October 2011


Abstract

This paper describes an educational analysis of a first year university chemistry practical called ‘Investigating sugar using a home made polarimeter’. The analysis follows the formalism of the Advancing Chemistry by Enhancing Learning in the Laboratory (ACELL) project, which includes a statement of educational objectives and an analysis of the student learning experience. The practical requires students to accurately prepare solutions of known concentrations of a common consumer chemical (sucrose), and then investigate the interaction between these solutions and plane-polarised light. The instrument used is a “home built” polarimeter which students assemble, allowing them to recognise that scientific apparatus need not be mysterious in its operation or construction. Student feedback data were collected using the ACELL Student Learning Experience (ASLE) instrument. Analysis of the data shows that students overwhelmingly rate the experiment as “worthwhile” or better. However, many also rate the experiment as “boring” or “uninteresting”. By contrast, staff and student feedback at an ACELL experiential workshop rated the experiment very highly on the “interest” criterion. In this contribution we discuss this misalignment of staff and student perceptions of various elements, including “interest”, and explore the correlation with the overall laboratory experience.


I. Introduction

Most researchers agree that the laboratory experience consistently ranks highly as a contributing factor toward students' interest in, and attitudes to, their science courses (Osborne et al., 2003). Consequently, good laboratory programs should play a major role in influencing student attitudes, learning and performance (Byers, 2002). In fact they can define student experiences of the sciences, and, if done poorly, could be the major contributing factor in causing students to disengage from the subject area (Deters, 2005). Academic staff have a primary responsibility for developing and monitoring the efficacy of laboratory exercises, with an overall objective of motivating and engaging students to learn. The ASELL (Advancing Science by Enhancing Learning in the Laboratory) approach, and that of its progenitor projects (Read, 2006; Jamie et al., 2007)§, has shown demonstrably that positive student perceptions of a laboratory exercise lead to improved student learning outcomes (Buntine et al., 2007). Notwithstanding this connection, in the present contribution we demonstrate an instance of a disconnect between academic staff perceptions of the educational value of a laboratory exercise and those expressed by the students who undertook the experiment. Our exploration of this disconnect involves an entry-level undergraduate experiment designed to explore aspects of the optical activity of molecules measured via polarimetry.

Understanding the chemical and physical properties associated with chiral substances and racemic mixtures is fundamental to the study of chemistry. Introductory organic chemistry students would be expected to be familiar with chirality in the macroscopic environment, exemplified by hands and shoes, and to recognise when molecular models represent chiral substances. They would be introduced to examples such as (R)- and (S)-carvone and their differing odours (spearmint and caraway, respectively) and the different biological properties of the enantiomers of thalidomide. As their studies progress, students learn about the importance of chirality in biological macromolecules like proteins and DNA, and they come to understand that molecular recognition depends on shape in three dimensions rather than simply on structure. Moreover, students of inorganic chemistry encounter chirality involving metal centres and in substances which lack a stereogenic carbon centre.

Historically, chiral substances were described as being optically active because they rotate the plane of polarisation of plane-polarised light. Jean-Baptiste Biot first observed this property in 1815, and it is the basis for useful analytical techniques such as polarimetry, as well as spectroscopic techniques such as circular dichroism. However, polarimetry is typically not included in introductory discussions of chirality in Australian universities, nor do students generally have the opportunity to make use of a polarimeter at first year level. The lack of opportunity is often driven by the expense and fragility of commercially-built apparatus and the large numbers of students undertaking first year classes.

In the first year of university chemistry courses, most discussions of chirality are necessarily focussed at the molecular level, involving recognising stereocentres and assigning configurations based on the Cahn–Ingold–Prelog priority rules. By including experiences of polarimetry at first year, the opportunity arises for students to make connections between the macroscopic and molecular levels – an area with which, research shows, students have difficulty (Russell et al., 1997; Treagust et al., 2003; Wu, 2003).

The experiment Investigating Sucrose Using a Home Made Polarimeter was adapted from Stary and Woldow (2001) and was introduced into the second semester of First Year undergraduate chemistry at Flinders University in 2005. It was part of a project involving the redevelopment of the First Year undergraduate chemistry practicals, and was intended to allow students to develop important generic scientific skills in graphing and data interpretation. During the exercise, students must accurately prepare solutions of known concentrations of a common consumer substance (sucrose), and then investigate the interaction between these solutions and plane-polarised light. The instrument used is a “home built” polarimeter which students assemble, allowing them to recognise that scientific apparatus need not be mysterious in its operation or construction. Furthermore, the simple dependence of measured optical rotation on the chemical amount of the optically active substance present (and thus on the product of the concentration and the path length):

 
Θ = α(λ, T) × c × l (1)
makes the collected data amenable to straightforward analysis. In eqn (1), Θ is the measured optical rotation (deg), l the path length (cm), c the concentration (M) and α(λ, T) is the specific rotation of the compound (deg M−1 cm−1), which is a function of temperature, T, and wavelength, λ.
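To illustrate the linear dependence expressed in eqn (1), the short sketch below computes the expected rotation for a series of concentrations at fixed path length. The value of α used is purely hypothetical, chosen for readability of the output; it is not the literature specific rotation of sucrose.

```python
# Illustration of eqn (1): Theta = alpha(lambda, T) * c * l
# alpha below is a HYPOTHETICAL specific rotation, not a literature value.

def optical_rotation(alpha: float, c: float, l: float) -> float:
    """Return the optical rotation Theta (deg) for a specific rotation
    alpha (deg M^-1 cm^-1), concentration c (M) and path length l (cm)."""
    return alpha * c * l

alpha = 2.0  # hypothetical specific rotation, deg M^-1 cm^-1
for c in (0.25, 0.50, 1.00):  # sucrose concentrations in M
    theta = optical_rotation(alpha, c, 10.0)  # 10 cm path length
    print(f"c = {c:.2f} M -> Theta = {theta:.1f} deg")
# -> 5.0, 10.0 and 20.0 deg: rotation doubles as concentration doubles
```

Students plotting Θ against c (or against l at fixed c) therefore expect a straight line through the origin, which is the basis of the graphing and data-interpretation objectives described above.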

This paper is not intended to describe the experiment itself, which is adapted from that previously reported (Stary and Woldow, 2001) and described in full online [www.asell.org], but rather to present evidence as to its positive influence on student learning and to describe an interesting disconnect between staff perceptions of the educational value of the experiment and those expressed by the students who undertook the experiment.

The scientific and educational objectives of the experiment are described in detail in the ESI. In brief, the experiment serves to promote student development at both the practical and intellectual levels. Students learn about the critical components of a polarimeter as they assemble the instrument themselves (see Fig. 1). They experiment with rotating just the top-most polarising filter to see its effect on the intensity of transmitted light, and learn how to use the extinction point to zero the instrument. The instrument is then used to examine the effect on the transmitted light of different sucrose concentrations, and of different path lengths at constant concentration. These results are used to prepare a simple report.


Fig. 1 Schematic of the polarimeter provided to the students to assist them in building it (right) and a photograph (left) of the assembled polarimeter.

II. Methodology

The quality of the student learning experience was measured and evaluated in two stages (ACELL, 2007). The first stage in the protocol was to test and evaluate the experiment in a third party laboratory by a small group of staff and students, whom we call “workshop delegates” herein. This is intended to provide early feedback about the experiment to the submitter, and to ensure that the experiment is readily transferable to other institutions that might wish to adopt it. The data for the second phase of the educational analysis were collected in the First Year Chemistry undergraduate laboratory at Flinders University.

Workshop surveys

The polarimetry experiment was evaluated at the 2006 ACELL workshop at the University of Sydney (Read et al., 2006a, 2006b) by 11 “workshop delegates”: 6 academic staff and 5 students. The students at the workshop were all either post-graduate students who had experience as laboratory demonstrators, or more senior undergraduate students, whose chemistry experience and knowledge exceed the level for which this experiment was designed. The 11 delegates were randomly assigned to evaluate this experiment, except that, by design, none were from the submitting institution. The delegates undertook the entire experimental procedure, working in mixed teams of staff–staff, student–student and staff–student. They were then surveyed about their experience of the experiment and its educational objectives. They were further asked how they thought First Year students might perceive the experiment when conducted under realistic laboratory conditions.

The Workshop survey instrument probes two facets of the experiment – practical details, such as whether it works properly, and educational aspects. The instrument has 12 Likert scale items, and invites open-ended responses to five specific questions about the experiment. The influence of the workshop on the professional development of staff has been discussed in several previous papers (Buntine et al., 2007; Read et al., 2006a, 2006b). A copy of the ACELL Workshop Survey instrument and data are available as ESI.

ACELL student learning experience (ASLE) surveys

The second stage of testing involves collection and evaluation of data relating to the students’ experience of learning after undertaking the experiment at the home institution. The ASLE survey was administered to volunteer students at Flinders University during the second semester of 2006. All students from the cohort undertaking the experiment (enrolment 220) were invited to respond to the survey, and 93 responses were received. We recognise that self-selection can introduce a bias into data sets such as this, but the anonymity of the responses makes any statistical investigation of the respondents impossible. However, a response rate of 42% allows a reasonable degree of confidence in the applicability of inferences to the cohort as a whole.

The ASLE instrument includes fourteen 5-point Likert items and five free-form responses to specific questions. A copy of the instrument is also available as ESI. Twelve of the Likert items are evaluated on a scale from “strongly disagree” to “strongly agree”. A thirteenth item, probing the adequacy of the length of time allotted to the experiment, is evaluated on a scale from “way too much time” to “nowhere near enough time”. The final Likert item asks the student about their overall learning experience and is evaluated from “outstanding” to “worthless”. The five open-response items probe the best and worst features of the experiment. Alongside the ASLE instrument, a summary of the student response data is presented in the ESI.

Data analysis framework

The responses to both surveys were analysed in several ways. The Likert data were analysed as broad agreement (BA) minus broad disagreement (BD), where broad agreement was defined as responses A and B (agree and strongly agree) and broad disagreement as responses D and E (disagree and strongly disagree); the middle, C, responses were not included. We also analysed the Likert data using the ACELL approach, in which responses are given a value of +2 for A responses (strongly agree) through −2 for E responses (strongly disagree), with the central response being zero. We recognise that the assignment of numerical scores to ordinal data is not universally accepted; the topic, and the specific numerical assignments, have been discussed in detail by the ACELL team (Buntine et al., 2007; ACELL, 2007). The purpose of including the 5-point scale here is that it explicitly gives higher weight to the extreme A and E responses, differentiating them from the weaker B and D responses. Response sets that are heavily weighted within the A/B or D/E ranges are not differentiated by the BA − BD analysis. For example, both Qs 1 and 7 in the Workshop survey (Table 1) have BA − BD scores of 100%, demonstrating that all respondents were in broad agreement with the statements. However, the ACELL score for Q1 is 1.91, while the score for Q7 is 1.36. Clearly, the response to Q1 (nearly all strongly agree) is much stronger than that to Q7 (most agree); the ACELL score differentiates this, while BA − BD does not. All of the broad conclusions from this paper are based on analysis of the full 5-point scale, with the BA − BD and ACELL scores used only for guidance and as a quick snapshot of the responses.
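The two summary scores can be sketched as follows. The response counts below are hypothetical distributions constructed to reproduce the Q1/Q7 contrast (identical BA − BD, different ACELL scores); the actual workshop distributions are in the ESI.

```python
# Sketch of the two summary scores: %BA - %BD and the ACELL score.
# The counts below are HYPOTHETICAL, chosen only to illustrate the
# Q1/Q7 contrast; the real distributions are reported in the ESI.

def ba_minus_bd(counts: dict) -> float:
    """%BA - %BD: (A + B) minus (D + E) as a percentage of all responses.
    C (neutral) responses contribute to the total but to neither side."""
    total = sum(counts.values())
    ba = counts.get("A", 0) + counts.get("B", 0)
    bd = counts.get("D", 0) + counts.get("E", 0)
    return 100.0 * (ba - bd) / total

def acell_score(counts: dict) -> float:
    """Mean of responses scored A = +2, B = +1, C = 0, D = -1, E = -2."""
    weights = {"A": 2, "B": 1, "C": 0, "D": -1, "E": -2}
    total = sum(counts.values())
    return sum(weights[k] * n for k, n in counts.items()) / total

# Both distributions give BA - BD = 100% (all 11 respondents agree),
# yet the ACELL scores differ, as for Q1 and Q7 in Table 1:
mostly_strong = {"A": 10, "B": 1}  # nearly all strongly agree
mostly_agree = {"A": 4, "B": 7}    # most merely agree
print(ba_minus_bd(mostly_strong), round(acell_score(mostly_strong), 2))  # 100.0 1.91
print(ba_minus_bd(mostly_agree), round(acell_score(mostly_agree), 2))    # 100.0 1.36
```

This makes concrete why the ACELL score separates strongly positive response sets that BA − BD treats as identical.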
Table 1 Summary of delegate responses to the Workshop survey Likert scale items. The full distribution of responses is available in the ESI†
Number Item %BA − %BDa ACELL scoreb
This expt All expts This expt All expts
a BA = broad agreement; BD = broad disagreement. b For the ACELL score, items 1 to 12, were scored as +2 (strongly agree) to −2 (strongly disagree), with a 0 (neutral) midpoint. For item 13, a +2 (way too much time) to −2 (nowhere near enough time) scale has been used, with a 0 (about right) midpoint. For item 14, a +2 (outstanding) to −2 (worthless) scale has been used, with a 0 (worthwhile) midpoint.
1 The experiment works 100% 85% 1.91 1.41
2 I expect that completing this experiment will effectively help students to develop their theoretical and conceptual knowledge 73% 81% 1.27 1.20
3 I expect that completing this experiment will effectively help students to develop their scientific and practical skills 82% 81% 1.27 1.19
4 I expect that completing this experiment will effectively help students to develop their thinking skills 64% 76% 0.82 1.11
5 I expect that completing this experiment will effectively help students to develop their generic skills 82% 71% 1.18 0.93
6 I expect that students will find this experiment interesting 91% 74% 1.55 1.06
7 I believe that the laboratory notes, when supported with guidance from demonstrators and other resources, will provide sufficient support for students as they learn. 100% 76% 1.36 1.07
8 Useful assessment criteria are clearly stated 73% 27% 1.18 0.42
9 The experiment requires students to participate as active learners 91% 79% 1.36 1.10
10 The statement of ‘required prior knowledge and skills’ (template, section 1.3), is accurate. 91% 76% 1.27 1.03
11 Sufficient background information, of an appropriate standard, is provided in the introduction 82% 74% 1.18 1.06
12 The experiment provides students with opportunities to take responsibility for their own learning 100% 82% 1.27 1.13
13 The amount of time available for students to complete this experiment is: 27% 13% 0.27 0.13
14 Overall, as a learning experience, I would rate this experiment as: 64% 55% 0.91 0.65


The open-ended responses to the ASLE survey were subjected to a content analysis (Buntine and Read, 2007) to investigate common aspects of the students’ responses, and to provide insight and context to the statistical analysis of the Likert items. The Workshop open-ended responses were not analysed in this way because 11 responses are insufficient for a meaningful analysis. We have included several responses in this report to illustrate particular points. The full set of responses is included in the ESI.

Finally, both survey instruments have a number of common Likert items, albeit in a predictive tense for the workshop instrument and an evaluative tense for the ASLE instrument. For example, the workshop survey asks the delegates to respond to “I expect that the students will find this experiment interesting”, while the ASLE instrument asks the students “I found this to be an interesting experiment”. A Wilcoxon Rank–Sum test is an appropriate statistical test for comparing distributions with different numbers of respondents (Wilcoxon, 1945; Mann and Whitney, 1947). The null hypothesis in each case – that the distribution of responses is the same for the workshop and ASLE surveys – was tested for N = (11, 93).
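The rank-sum comparison can be sketched as below. The samples are hypothetical (coded A = +2 through E = −2, as in the ACELL scoring), and the sketch computes only the rank-sum statistic with average ranks for ties; in practice a library routine such as scipy.stats.mannwhitneyu would handle the statistic and its p-value.

```python
# Minimal sketch of the Wilcoxon rank-sum statistic for two unequal-sized
# Likert samples, using average ranks for tied values. The data are
# HYPOTHETICAL; the surveys compared in this paper had N = (11, 93).

def rank_sum_statistic(x: list, y: list) -> float:
    """Return the sum of the ranks of sample x within the pooled data,
    assigning each tied group of values its average rank (1-based)."""
    pooled = sorted(x + y)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1  # advance past the tied group
        ranks[pooled[i]] = (i + 1 + j) / 2.0  # average of ranks i+1..j
        i = j
    return sum(ranks[v] for v in x)

workshop = [2, 2, 1, 1, 1]          # hypothetical workshop responses
students = [1, 0, 0, -1, 1, 0, -1]  # hypothetical ASLE responses
w = rank_sum_statistic(workshop, students)
expected = len(workshop) * (len(workshop) + len(students) + 1) / 2
print(w, expected)  # 47.0 vs 32.5 under H0: workshop sample ranks higher
```

A rank sum well above its null expectation, as here, indicates that the first sample's responses sit systematically higher in the pooled ordering, which is the pattern examined for the common items below.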

III. Analysis of results

Workshop survey

Table 1 shows an analysis of the 14 Likert items surveyed at the workshop for this experiment. Many of the workshop survey items are utilitarian in nature, for example, “The experiment works”, and the experiment was favourably reviewed on these items (see ESI). Also included in Table 1 are the average responses for the set of 24 experiments that were surveyed at the 2006 workshop, which cover a diverse range of chemistry sub-disciplines and year levels. Not surprisingly, as delegates brought experiments that they thought were good learning experiences, the workshop average responses are quite favourable. Nonetheless, the polarimetry experiment was generally received more favourably than the average; indeed it was one of the most favourably surveyed experiments of that workshop. The full survey results are available in the ESI; here we select a few of the more interesting results.

As discussed above, some of the questions in the Workshop Instrument were designed to provide overlap with the ACELL Student Learning Experience (ASLE) instrument, to be conducted at a later time in the real laboratory setting. In particular, the survey items on “interest” (Item 6) and “responsibility for own learning” (Item 12), are exact counterparts to the ASLE questions. The response in Table 1 to the “interest” item was the strongest of all 24 experiments surveyed at the workshop. The strong response to this item was clarified in the freeform comments to the question “What, if anything, are the notable strengths of this experiment?” While the number of respondents is too small for a meaningful qualitative analysis, there were several responses that highlighted the clear and simple nature of the instrumentation as the reason for the interest:

Demystifying instrumentation

The hands on, playing with the polarimeter

… it keeps you entertained and you can have a lot of fun with it.

Based on the strong feedback from the ACELL workshop, this experiment was selected for further testing in the environment of a real First Year undergraduate laboratory.

ASLE survey

The fourteen Likert scale items from the ASLE instrument are presented in Table 2, along with the scoring used for the items and a summary of the results obtained.
Table 2 Summary of student feedback responses to the ASLE Likert scale items. The full distribution of responses are available in the ESI†
Number Item %BA − %BDa ACELL scoreb
This expt This expt
a BA = broad agreement; BD = broad disagreement. b For the ACELL score, items 1 to 12, were scored as +2 (strongly agree) to −2 (strongly disagree), with a 0 (neutral) midpoint. For item 13, a +2 (way too much time) to −2 (nowhere near enough time) scale has been used, with a 0 (about right) midpoint. For item 14, a +2 (outstanding) to −2 (worthless) scale has been used, with a 0 (worthwhile) midpoint.
1 This experiment has helped me to develop my data interpretation skills 68% +0.83
2 This experiment has helped me to develop my laboratory skills 63% +0.78
3 I found this to be an interesting experiment 33% +0.41
4 It was clear to me how this laboratory exercise would be assessed 53% +0.72
5 It was clear to me what I was expected to learn from completing this experiment 72% +0.97
6 Completing this experiment has increased my understanding of chemistry 57% +0.76
7 Sufficient background information, of an appropriate standard, is provided in the introduction 80% +1.15
8 The demonstrators offered effective support and guidance 82% +1.33
9 The experimental procedure was clearly explained in the lab manual or notes 73% +1.00
10 I can see the relevance of this experiment to my chemistry studies 77% +1.14
11 Working in a team to complete this experiment was beneficial 86% +1.43
12 The experiment provided me with the opportunity to take responsibility for my own learning 71% +0.96
13 I found that the time available to complete this experiment was 5% +0.07
14 Overall, as a learning experience, I would rate this experiment as 29% +0.32


We commence the analysis of the student learning experience with the overall experience question: “14. Overall, as a learning experience, I would rate this experiment as:”. The feedback is strong; 94% of students considered the experiment to be “worthwhile” or better (see ASLE data in the ESI). Analysis of the open-response student feedback reinforces and provides some context to the overall positive sense in which they view the laboratory exercise. Of a total of 63 comments relating to the student experience of the experiment (category EE in Table 3), 53 were positive. Many students commented favourably on learning about the operational principles of a polarimeter by assembling the simple apparatus themselves – comments that are aligned well with one of the objectives of demystifying the polarimeter.

Table 3 Summary of categories used in content analysis of the ASLE open-response items
Category/Theme Abbreviation Total comments Sub-categories
Understanding of Chemistry UC 69 How polarimeter works (28)
Relationship between concentration and optical activity (23)
Chirality and optical activity (15)
Graphing and data analysis (3)
Experience of Experiment EE 63 Positive comments (53)
Negative comments (10)
Interesting Aspects of Experiment IAE 36 Using the polarimeter (21)
Seeing sucrose rotate the plane of PPL (8)
Making solutions (7)
Potential Improvements PI 27 Fine as it is (16)
Student notes (5)
Include variety of compounds to analyse (3)
Instrument (3)
Miscellaneous Comments MC 6  


Although 94% of students rated the experiment as worthwhile or better, the distribution was strongly peaked at just worthwhile (see ESI); only 34% responded with the two highest categories. As shown in Tables 1 and 2, both summary scoring methods show a much lower response to this item in the ASLE survey than in the Workshop survey: BA − BD = 64% (Workshop) and 29% (ASLE), or ACELL score = 0.91 (Workshop) and 0.32 (ASLE). A hint of the reasons for the lower score in the ASLE survey can be found in the 10 negative comments about the student experience, which can mostly be summarised as the students perceiving the practical to be “boring” or “uninteresting”. These comments, and the differences between the Workshop and ASLE surveys, warrant further examination. Below, we analyse the other 13 ASLE questions for evidence of the lower overall student experience score, and in the subsequent section we compare, statistically, the common items in the two surveys.

Administrative items on the ASLE survey

A number of the items in the ASLE survey relate to more basic aspects of the organisation of the practical. These items are staff-oriented, and reflect the ability of the teaching staff to prepare the laboratory appropriately for the students. Items in this category include the clarity of the laboratory notes (Item 9), the training of the demonstrators (Item 8), sufficient background information, either in the notes or by demonstrator introduction (Item 7), clear assessment (Item 4) and the length of time for the practical (Item 13).

The first administrative item that we consider is item 13, which explores whether the length of time for the practical is appropriate. The students' responses to this item were very strongly peaked, with 73 of 87 responses “C: about right” (see ESI). Nine students thought there was too much time, and 5 thought not enough time was allocated to the practical.

Students responded very favourably to the other four administrative items in the survey, so we do not dwell on them. Students responded positively to the clarity and appropriateness of the documentation describing the experiment (the lab manual) (Item 9) and to the appropriateness of the background information (Item 7); in each case 82% of respondents agreed or strongly agreed that the notes were clear and the background material appropriate. Further evidence in support of the claim that the laboratory notes assist students in preparing for and undertaking the exercise can be found in the open-response data in Table 3, where only 5 of 27 responses to the question ‘What aspects of the experiment need improvement and what changes would you suggest?’ indicate that some improvement in the documentation is required. The quality of the demonstrators (Item 8) was also very favourably received.

The weakest response to the administrative questions concerns the clarity of the laboratory assessment. Although the assessment was clear to 66% of respondents, 13% remained unclear about assessment processes. The assessment for this experiment was performed after each laboratory session, and therefore after students had completed the ASLE survey. As such, it is not unreasonable to expect that some students had not thought too deeply about the assessment whilst completing the survey.

Scientific and generic skills on the ASLE survey

Four questions on the ASLE instrument probe the students' development of scientific, practical and generic skills. Item 1 probes the development of data analysis skills; Item 2 general laboratory skills; Item 6 their understanding of chemistry and Item 11 explores teamwork.

The responses to Items 1 and 2 are statistically indistinguishable, both in the summary data in Table 2 and in the complete distributions in the ESI. Students responded favourably to the experiment assisting in developing their general laboratory skills, especially their volumetric skills in making solutions of known concentration. They appreciated that accurately making solutions of a known concentration was a critical step in exploring the relationship between solution concentration and optical activity. A number of students commented favourably on the simple nature of the instrument, for example:

“[I have] learn[t] how a polarimeter works, and the parameters of using the instrument (concentration and path length)”

The targeted learning objectives of plotting and error analysis, listed under “Scientific and Practical Skills” and “Thinking and Generic Skills”, were not identified by many students; only 3 students commented that graphing and data analysis skills were key themes of the experiment. Nonetheless, it is clear from the responses to Likert Item 1 that 72% of students agree that the exercise improves their data analysis skills (see ESI; Table 2 indicates a BA − BD score of 68%). This outcome indicates that one of the original aims of introducing the experiment, viz., to improve the generic skills of students in terms of graphing and data interpretation, has been achieved without distracting students from the core chemical concepts being explored.

The score on the teamwork item (11. Working in a team to complete this experiment was beneficial) was the highest of the 12 standard Likert items – 89% were in agreement and only 3% in disagreement. However, teamwork did not feature very strongly in the open-response data, with only two comments – one favourable and one critical (perhaps because of the random allocation of partners). This suggests that, although the students valued working in a team (pair), this was an implicit factor in their overall laboratory learning experience and not one that featured very highly in their perceptions of the laboratory.

About 65% of respondents agree or strongly agree that ‘Completing this experiment has increased my understanding of chemistry’ (Likert Item 6). Student responses in the qualitative section of the survey show that the specific learning objectives identified in the Educational Template (see ESI) were identified correctly by the students. In particular, two highlighted learning outcomes in the “Theoretical and Conceptual Knowledge” category were strongly identified in the sub-categories within the Understanding Chemistry theme in Table 3.

Motivating items on the ASLE survey

Four items in the ASLE instrument probe higher level engagement of the students with the practical, viz., Interest (Item 3), Clear learning outcomes (Item 5), Relevance to studies (Item 10) and Responsibility for own learning (Item 12). These items are student-centred and were designed by ACELL to probe motivating aspects of the experiment. The full distribution of responses for these items is shown in Fig. 2. Three of these items show very positive responses – at least as strong as the administrative and skills items. However, one item – Interest – was much lower than the others.

Fig. 2 ASLE student responses to Q14, probing the students' overall experience of the practical, and the four items probing higher level engagement of the students. The full statements can be found in the ESI.

Table 2 shows a very favourable student response to Likert Item 10 (‘I can see the relevance of this experiment to my chemistry studies’); over 80% of respondents agree or strongly agree with this sentiment. In addition, students found that the use of ‘real world' sugar – a consumer product – made the experiment more ‘relevant’. Over 78% of students also agreed or strongly agreed with the statement “It was clear to me what I was expected to learn from completing this experiment”, further indicating that a targeted objective of the practical exercise is satisfied.

The item querying “responsibility for own learning” also scored very strongly; 72% of students agreed or strongly agreed with the statement “The experiment provided me with the opportunity to take responsibility for my own learning”, and only 1% of students disagreed. It is well known in the literature that recipe, or formulaic, experiments do not engage students (Buntine et al., 2007 and references therein). Student responses suggest that the experiment is an effective learning exercise in terms of creating motivated learners. Davis and Murrell (1993) discuss the role of explicitly encouraging student responsibility in the classroom as a key component in transforming teaching into learning. They comment that student participation in their education is a key motivator that contributes to creating an effective learning environment, and that a lack of student responsibility can lead to a cohort of passive, disaffected students who become disengaged from the learning process. An extensive literature (e.g., Babbage, 1998; Alderman, 1999; Fallows and Ahmet, 1999) indicates that motivated students, whether at school or at university, become better, more independent learners. The data reported here suggest that this laboratory exercise, whilst not perceived by students to engage their topical interest, certainly contributes to motivating them to learn about the chemical world around them.

The most surprising responses were to the item on interest: “3. I found this to be an interesting experiment”. As can be seen in Table 2, the summary score on this item was only 33% (BA − BD) or +0.41 (ACELL). This is by far the lowest mean score for any of the 12 Likert items on either scale, and as can be seen in Fig. 2, approximately 15% of students found the exercise uninteresting. In light of the strong anticipated interest from the workshop data, we explore the issue of student interest below.
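As an illustrative sketch of how two such summary scores can be computed from a five-point Likert response distribution (the coding of the ACELL mean as −2 for strongly disagree through +2 for strongly agree is an assumption for this sketch, not a statement of the published ACELL scoring rules):

```python
# Illustrative scoring of a 5-point Likert distribution.
# The -2..+2 coding for the mean score is an assumption in this sketch.
CODES = {"SD": -2, "D": -1, "N": 0, "A": 1, "SA": 2}

def summary_scores(counts):
    """counts: dict mapping 'SD','D','N','A','SA' -> number of respondents.
    Returns (BA - BD as a percentage, mean coded score)."""
    n = sum(counts.values())
    ba = counts["A"] + counts["SA"]   # broad agreement
    bd = counts["D"] + counts["SD"]   # broad disagreement
    ba_minus_bd = 100.0 * (ba - bd) / n
    mean = sum(CODES[k] * v for k, v in counts.items()) / n
    return ba_minus_bd, mean

# Hypothetical distribution with modest net agreement
dist = {"SD": 3, "D": 12, "N": 40, "A": 30, "SA": 15}
print(summary_scores(dist))  # -> (30.0, 0.42)
```

The two measures can diverge: the (BA − BD) percentage ignores the distinction between “agree” and “strongly agree”, while the coded mean weights the extremes more heavily.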

Comparison between ASLE and workshop surveys

In the above analysis, two items were noteworthy for the difference between the preliminary survey at the workshop and the student experience survey in the laboratory. The score on the overall experience on the ASLE survey (ACELL = 0.32) is lower than the score from the workshop survey data (ACELL = 0.91). Particularly striking is that the score on Interest at the workshop was the highest of any surveyed experiment at that workshop, yet the ASLE score was by far the lowest of any item. The difference in these average scores warrants further attention.

The Workshop survey and ASLE survey have four items that correspond directly (see Fig. 3). Two other items on the Workshop survey are more generally linked to the responses on the ASLE instrument. In particular, the workshop item “I expect that completing this experiment will effectively help students to develop their scientific and practical skills” is broken out into Questions 1 (lab skills) and 2 (data interpretation skills) on the ASLE survey, and the workshop item “I believe that the laboratory notes, when supported with guidance from demonstrators and other resources, will provide sufficient support for students as they learn” is broken out in the ASLE instrument to probe notes and demonstrators separately. Nonetheless, the items are clearly linked, and the average of responses to the relevant items in the ASLE survey is plotted in Fig. 3 alongside these two workshop items.


Fig. 3 Comparison between expectations of the experiment at the ACELL workshop, and responses from students in the real laboratory environment. Only items (a) and (b) were statistically different at p = 0.05. Responses strongly agree (SA) through strongly disagree (SD) (see text).

A Wilcoxon Rank–Sum Test is an appropriate statistical test for distributions that have a different number of respondents (Wilcoxon, 1945; Mann and Whitney, 1947). The null hypothesis that the distribution of responses is the same for the workshop and ASLE surveys was tested for N = (11, 93). At p = 0.05, the null hypothesis is retained for four of the distributions, as shown in Fig. 3. Specifically, the items on Assessment, Responsibility for own learning, (lab + data) skills, and (demonstrators + practical notes) show distributions that are not significantly different between the workshop and the teaching laboratory. This is an important finding because it demonstrates that the workshop experience was sufficiently realistic and rigorous that most of the preliminary observations were transferred into the real laboratory situation.
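The rank-sum procedure itself is compact enough to sketch in a few lines. The sketch below uses the normal approximation to the sampling distribution (rather than exact small-sample tables) and, for brevity, omits the tie correction to the variance, so its p-values are approximate; it is an illustration of the technique, not the exact computation performed in the study.

```python
import math
from collections import defaultdict

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum (Mann-Whitney) test, normal approximation.
    Ties receive average ranks; the tie correction to the variance is
    omitted for brevity, so p-values are approximate."""
    combined = sorted(list(x) + list(y))
    positions = defaultdict(list)
    for idx, value in enumerate(combined, start=1):
        positions[value].append(idx)
    # Average rank for each distinct value (handles ties)
    avg_rank = {v: sum(p) / len(p) for v, p in positions.items()}
    n1, n2 = len(x), len(y)
    r1 = sum(avg_rank[v] for v in x)      # rank sum of the first sample
    u = r1 - n1 * (n1 + 1) / 2            # Mann-Whitney U statistic
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Identical samples: U sits at its expected value and p = 1
print(rank_sum_test([1, 2, 3], [1, 2, 3]))
```

The test makes no assumption of equal sample sizes, which is what makes it suitable for comparing a small workshop cohort with a much larger laboratory cohort, as in the N = (11, 93) comparison above.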

The item that probes the overall learning experience is designed as a catch-all for evaluating the experiment. Fig. 3 shows that the distribution of responses is significantly different in the teaching laboratory from that in the workshop (p = 0.04). The only other item showing a significant difference (p < 0.01) is the item that probes the level of Interest. Indeed, as discussed above, the item on Interest in the ASLE survey stands out as being far lower than any other item. We therefore conclude that the lower overall rating of the experiment in the teaching laboratory arises because the students' interest is less engaged in the laboratory than that of the staff and students at the workshop.

The open-ended responses to “What were the strengths of the experiment” in the workshop survey reveal that the simple, transparent instrumentation emerges as one of the strongest features of the experiment:

Enjoyable and valuable: it truly demystifies the principle and utility of the polarimeter. I find instruments the most difficult to learn, to understand, hence I found this experiment a very valuable one to undertake.

Many students in the teaching laboratory recognised that understanding the way a polarimeter works was one of the main objectives of the laboratory. In reply to the question: “What did you think was the main lesson to be learnt from the experiment?”, comments such as:

“How simple polarimetry works practically”, or “How a polarimeter works”

were common. However, several students responded to “What aspects of the experiment need improvement and what changes would you suggest?” with comments that reflected a wish for more sophisticated instrumentation:

“Perhaps a more complex polarimeter…”, “Equipment…”, and “The polarimeter was frustrating”.

The role of situational and individual interest as a motivational tool has received considerable attention (Schiefele and Krapp, 1996; Ainley et al., 2002; Hidi et al., 2004; Hidi and Renninger, 2006). In the present experiment, for the staff and more experienced students (no students at the workshop were First Year students), situational interest was triggered and held by using a very simple piece of scientific apparatus. Such simplicity is more novel for these experimenters, whose backgrounds include far greater exposure to sophisticated instruments, whether in research laboratories or higher-level teaching laboratories.

The simple polarimeter also appears to trigger situational interest in the first year students, but in this case holding the interest is more difficult. The students are more readily frustrated when the experiment does not work as they expect. They do not have the experience in experimentation to know that this is a common occurrence in new experiments, irrespective of the sophistication of the apparatus (indeed they might come to learn that the more sophisticated the experiment, the greater the chance of it going awry!). The result is that it is easy to blame the instrument, and their interest is not held. Maintained situational interest has been shown to be associated with a higher level of cognitive engagement than triggered situational interest alone (Mitchell, 1993). It is likely that the inability of the polarimeter to maintain interest for some students also led to the lower overall score than expected from the workshop data.

IV. Summary and conclusions

The objectives of this experiment were to show students the mechanics and principles of polarimetry. In performing the experiment the students practise laboratory and data analysis skills, and learn that many scientific instruments have a simple basis. In all, these objectives were met well, with very positive student evaluation through the ASLE instrument. However, the overall evaluation was statistically weaker than the equivalent feedback collected at an ACELL workshop. We conclude that the reason for the weaker survey result on this item is that the students' interest was not engaged as strongly as that of the staff and more experienced students at the workshop. The underlying reason for the difference in interest appears to be the different scientific experience of the two cohorts, with a simple apparatus being more novel and interesting to the experienced cohort than to the group of inexperienced First Year students for whom the experiment was designed. We are exploring further how the situational interest of the First Year students might be better held, including having a commercial polarimeter available in the laboratory so that students can compare results from their simple apparatus against it.

Acknowledgements

The ACELL project would not be possible without the financial support of the Australian Government, through the Higher Education Innovation Program. We thank the students of Flinders University for participating in this study by volunteering to complete surveys. Collection of data described in this experiment was authorised by the Human Research Ethics Committee at the University of Sydney, project number 12-2005/2/8807.

References

  1. ACELL, (2007), Guidelines and procedures, Available from http://asell.org/Publications/Document-Library.
  2. Ainley M., Hidi S. and Berndorff D., (2002), Interest, learning and the psychological processes that mediate their relationship, J. Educ. Psychol., 94, 545–561.
  3. Alderman M. K., (1999), Motivation for achievement: possibilities for teaching and learning, Mahwah NJ, Lawrence Erlbaum Associates.
  4. Babbage K. J., (1998), High-impact teaching: overcoming student apathy, Lancaster PA, Technomic Publishing.
  5. Buntine M. A., Read J. R., Barrie S. C., Bucat R. B., Crisp G. T., George A. V., Jamie I. M. and Kable S. H., (2007), Advancing Chemistry by Enhancing Learning in the Laboratory (ACELL): A model for providing professional and personal development and facilitating improved student laboratory learning outcomes, Chem. Educ. Res. Pract., 8, 232–254.
  6. Buntine M. A. and Read J. R., (2007a), Guide to content analysis. Available from http://www.asell.org/Educational-Information/Guide-to-Content-Analysis.
  7. Byers W., (2002), Promoting active learning through small group laboratory classes, Univ. Chem. Educ., 6, 28–34.
  8. Davis T. M. and Murrell P. H., (1993), Turning teaching into learning: the role of student responsibility in the collegiate experience, ASHE-ERIC Higher Education Reports, Report 8.
  9. Deters K. M., (2005), Student opinions regarding inquiry-based labs, J. Chem. Educ., 82, 1178–1180.
  10. Fallows S. and Ahmet K., (1999), Inspiring students: case studies in motivating the learner, London, Kogan Page.
  11. Hidi S., Renninger K. A. and Krapp A., (2004), Interest, a motivational variable that combines affective and cognitive functioning, in D. Dai and R. Sternberg (ed.). Motivation, Emotion and Cognition: Integrative Perspectives on Intellectual Functioning and Development (pp. 89–115), Hillsdale, NJ: Erlbaum.
  12. Hidi S. and Renninger K. A., (2006), The four-phase model of interest development, Educ. Psychol., 41, 111–127.
  13. Jamie I. M., Read J. R., Barrie S. C., Bucat R. B., Buntine M. A., Crisp G. T., George A. V. and Kable S. H., (2007), From APCELL to ACELL and beyond–expanding a multi-institution project for laboratory-based teaching and learning, Aust. J. Educ. Chem., 67, 7–13.
  14. Mann H. B. and Whitney D. R., (1947), On a test of whether one of two random variables is stochastically larger than the other, Ann. Math. Stat., 18, 50–60.
  15. Mitchell M., (1993), Situational interest: its multifaceted structure in the secondary school mathematics classroom, J. Educ. Psychol., 85, 424–436.
  16. Osborne J., Simon S. and Collins S., (2003), Attitudes towards science: a review of the literature and its implications, Int. J. Sci. Educ., 25, 1049–1079.
  17. Read J. R., (2006), The Australian Chemistry Enhanced Laboratory Learning project, Chem. Aust., 73(1), 3–5.
  18. Read J. R., Barrie S. C., Bucat R. B., Buntine M. A., Crisp G. T., George A. V., Jamie I. M. and Kable S. H., (2006a), Achievements of an ACELL workshop, Chem. Aust., 73(9), 17–20.
  19. Read J. R., Buntine M. A., Crisp G. T., Barrie S. C., George A. V., Kable S. H., Bucat R. B. and Jamie I. M., (2006b), The ACELL project: Student participation, professional development, and improving laboratory learning, Symposium Proceedings: Assessment in Science Teaching and Learning. Sydney, NSW: UniServe Science, 113–119.
  20. Russell J. W., Kozma R. B., Jones T., Wykoff J., Marx N. and Davis J., (1997), Use of simultaneous-synchronized macroscopic, microscopic, and symbolic representations to enhance the teaching and learning of chemical concepts, J. Chem. Educ., 74, 330–334.
  21. Schiefele U. and Krapp A., (1996), Topic interest and free recall of expository text, Learn. Individ. Differ., 8, 141–160.
  22. Stary F. E. and Woldow N., (2001), Build a simple polarimeter, J. Chem. Educ., 78, 644.
  23. Treagust D. F., Chittleborough G. and Mamiala T. L., (2003), The role of submicroscopic and symbolic representations in chemical explanations, Int. J. Sci. Educ., 25, 1353–1368.
  24. Wilcoxon F., (1945), Individual comparisons by ranking methods, Biometr. Bull., 1, 80–83.
  25. Wu H. K., (2003), Linking the microscopic view of chemistry to real-life experiences: Intertextuality in a high-school science classroom, Sci. Educ., 87, 868–891.

Footnotes

Electronic supplementary information (ESI) available. See DOI: 10.1039/c0rp90015j
Present address: University of South Australia, Adelaide SA 5001, Australia. E-mail: michael.crisp@unisa.edu.au
§ This federal government-funded project began in Australia in 1999 with a physical chemistry focus (APCELL) and evolved into an all-of-chemistry variant (ACELL) in 2004. In 2009 funding was secured to expand the project to include the kindred science disciplines of biology, chemistry and physics (ASELL).

This journal is © The Royal Society of Chemistry 2011