Michael G. Crisp,‡a Scott H. Kable,b Justin R. Readbc and Mark A. Buntine*cd
aSchool of Chemistry, Physics and Earth Sciences, Flinders University of South Australia, Adelaide, SA 5001, Australia
bSchool of Chemistry, University of Sydney, Sydney, NSW 2006, Australia. E-mail: scott.kable@sydney.edu.au
cSchool of Chemistry and Physics, The University of Adelaide, Adelaide, SA 5005, Australia. E-mail: justin.read@sydney.edu.au
dDepartment of Chemistry, Curtin University, GPO Box U1987, Perth, WA 6845, Australia. E-mail: m.buntine@curtin.edu.au
First published on 7th October 2011
This paper describes an educational analysis of a first year university chemistry practical called ‘Investigating sugar using a home made polarimeter’. The analysis follows the formalism of the Advancing Chemistry by Enhancing Learning in the Laboratory (ACELL) project, which includes a statement of educational objectives and an analysis of the student learning experience. The practical requires students to accurately prepare solutions of known concentrations of a common consumer chemical (sucrose), and then investigate the interaction between these solutions and plane-polarised light. The instrument used is a “home built” polarimeter which students assemble, allowing them to recognise that scientific apparatus need not be mysterious in its operation or construction. Student feedback data were collected using the ACELL Student Learning Experience (ASLE) instrument. Analysis of the data shows that students overwhelmingly rate the experiment as “worthwhile” or better. However, many also rate the experiment as “boring” or “uninteresting”. By contrast, staff and student feedback at an ACELL experiential workshop rated the experiment very highly in terms of the “interest” criterion. In this contribution we discuss this (mis)alignment of staff and student perceptions of various elements, including “interest”, and explore the correlation with the overall laboratory experience.
Understanding the chemical and physical properties associated with chiral substances and racemic mixtures is fundamental to the study of chemistry. Introductory organic chemistry students would be expected to be familiar with chirality in the macroscopic environment, exemplified by hands and shoes, and to recognise when molecular models represent chiral substances. They would be introduced to examples such as (R)- and (S)-carvone and their differing odours (spearmint and caraway, respectively) and the different biological properties of the enantiomers of thalidomide. As their studies progress, students learn about the importance of chirality in biological macromolecules like proteins and DNA, and they come to understand that molecular recognition depends on shape in three dimensions rather than simply on structure. Moreover, students of inorganic chemistry encounter chirality involving metal centres and in substances which lack a stereogenic carbon centre.
Historically, chiral substances were described as being optically active because they rotate the plane of polarisation of plane-polarised light. Jean-Baptiste Biot first observed this property in 1815, and it is the basis for useful analytical techniques such as polarimetry, as well as spectroscopic techniques such as circular dichroism. However, polarimetry is typically not included in introductory discussions of chirality in Australian universities, nor do students generally have the opportunity to use a polarimeter at first year level. This lack of opportunity is often driven by the expense and fragility of commercially built apparatus and the large numbers of students undertaking first year classes.
In the first year of university chemistry courses, most discussions of chirality are necessarily focussed at the molecular level, involving recognising stereocentres and assigning configurations based on the Cahn–Ingold–Prelog priority rules. By including experiences of polarimetry at first year, the opportunity arises for students to make connections between the macroscopic and molecular levels – an area with which, research shows, students have difficulty (Russell et al., 1997; Treagust et al., 2003; Wu, 2003).
The experiment Investigating Sucrose Using a Home Made Polarimeter was adapted from Stary and Woldow (2001) and was introduced into the second semester of First Year undergraduate chemistry at Flinders University in 2005. It was part of a project involving the redevelopment of the First Year undergraduate chemistry practicals, and was intended to allow students to develop important generic scientific skills in graphing and data interpretation. During the exercise, students must accurately prepare solutions of known concentrations of a common consumer substance (sucrose), and then investigate the interaction between these solutions and plane-polarised light. The instrument used is a “home built” polarimeter which students assemble, allowing them to recognise that scientific apparatus need not be mysterious in its operation or construction. Furthermore, the simple dependence of measured optical rotation on the chemical amount of the optically active substance present (and thus on the product of the concentration and the path length):
Θ = α(λ, T) × c × l     (1)

where Θ is the measured rotation angle, α(λ, T) is the specific rotation at wavelength λ and temperature T, c is the concentration and l is the path length, provides a straightforward context in which students can practise graphing and data interpretation.
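To make eqn (1) concrete, the short sketch below computes the rotation expected for a few illustrative sucrose solutions. It is not part of the experiment; the specific rotation of roughly +66.5° mL g⁻¹ dm⁻¹ for sucrose (589 nm, 20 °C) and the concentrations and path lengths are assumed values chosen purely for illustration.

```python
# Illustrative only: expected optical rotation from Biot's law (eqn 1),
# assuming a nominal specific rotation for sucrose of +66.5 deg mL g^-1 dm^-1
# (589 nm, 20 C). The values below are hypothetical, not data from the experiment.

SPECIFIC_ROTATION = 66.5  # deg mL g^-1 dm^-1 (assumed)

def expected_rotation(conc_g_per_ml: float, path_dm: float) -> float:
    """Return the rotation angle (degrees) predicted by eqn (1)."""
    return SPECIFIC_ROTATION * conc_g_per_ml * path_dm

for conc in (0.05, 0.10, 0.20):       # concentration, g/mL
    for path in (0.5, 1.0):           # path length, dm
        theta = expected_rotation(conc, path)
        print(f"c = {conc:.2f} g/mL, l = {path:.1f} dm -> theta = {theta:.1f} deg")
```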
This paper is not intended to describe the experiment itself, which is adapted from that previously reported (Stary and Woldow, 2001) and described in full online [www.asell.org], but rather to present evidence as to its positive influence on student learning and to describe an interesting disconnect between staff perceptions of the educational value of the experiment and those expressed by the students who undertook the experiment.
The scientific and educational objectives of the experiment are described in detail in the ESI.† In brief, the experiment serves to promote student development at both the practical and intellectual levels. Students learn about the critical components of a polarimeter as they assemble the instrument themselves (see Fig. 1). They experiment with rotating just the top-most polarising filter to see its effect on the intensity of transmitted light, and learn how to use the extinction point to zero the instrument. The instrument is then used to examine the effect on the transmitted light of different sucrose concentrations and of different path lengths at constant concentration. These results are used to prepare a simple report.
Fig. 1 Schematic of the polarimeter provided to the students to assist them in building it (right) and a photograph (left) of the assembled polarimeter.
The Workshop survey instrument probes two facets of the experiment – practical details, such as whether it works properly, and educational aspects. The instrument has 12 Likert-scale items, and open-ended responses are invited to five specific questions about the experiment. The influence of the workshop on the professional development of staff has been discussed in several previous papers (Buntine et al., 2007; Read et al., 2006a, 2006b). A copy of the ACELL Workshop Survey instrument and the associated data are available as ESI.†
The ASLE instrument includes fourteen 5-point Likert items and five free-form responses to specific questions. A copy of the instrument is also available as ESI.† Twelve of the Likert items are evaluated on a scale from “strongly disagree” to “strongly agree”. A thirteenth item, probing the adequacy of the length of time allotted to the experiment, is evaluated on a scale from “way too much time” to “nowhere near enough time”. The final Likert item asks students about their overall learning experience and is evaluated from “outstanding” to “worthless”. The five open-response items probe the best and worst features of the experiment. In addition to the ASLE instrument itself, a summary of the student response data is presented in the ESI.†
Table 1 Summary of responses to the ACELL Workshop survey

Number | Item | %BA − %BDa (this expt) | %BA − %BDa (all expts) | ACELL scoreb (this expt) | ACELL scoreb (all expts)
---|---|---|---|---|---
1 | The experiment works | 100% | 85% | 1.91 | 1.41 |
2 | I expect that completing this experiment will effectively help students to develop their theoretical and conceptual knowledge | 73% | 81% | 1.27 | 1.20 |
3 | I expect that completing this experiment will effectively help students to develop their scientific and practical skills | 82% | 81% | 1.27 | 1.19 |
4 | I expect that completing this experiment will effectively help students to develop their thinking skills | 64% | 76% | 0.82 | 1.11 |
5 | I expect that completing this experiment will effectively help students to develop their generic skills | 82% | 71% | 1.18 | 0.93 |
6 | I expect that students will find this experiment interesting | 91% | 74% | 1.55 | 1.06 |
7 | I believe that the laboratory notes, when supported with guidance from demonstrators and other resources, will provide sufficient support for students as they learn. | 100% | 76% | 1.36 | 1.07 |
8 | Useful assessment criteria are clearly stated | 73% | 27% | 1.18 | 0.42 |
9 | The experiment requires students to participate as active learners | 91% | 79% | 1.36 | 1.10 |
10 | The statement of ‘required prior knowledge and skills’ (template, section 1.3), is accurate. | 91% | 76% | 1.27 | 1.03 |
11 | Sufficient background information, of an appropriate standard, is provided in the introduction | 82% | 74% | 1.18 | 1.06 |
12 | The experiment provides students with the opportunity to take responsibility for their own learning | 100% | 82% | 1.27 | 1.13 |
13 | The amount of time available for students to complete this experiment is: | 27% | 13% | 0.27 | 0.13 |
14 | Overall, as a learning experience, I would rate this experiment as: | 64% | 55% | 0.91 | 0.65 |

a BA = broad agreement; BD = broad disagreement. b For the ACELL score, items 1 to 12 were scored from +2 (strongly agree) to −2 (strongly disagree), with a 0 (neutral) midpoint. For item 13, a +2 (way too much time) to −2 (nowhere near enough time) scale has been used, with a 0 (about right) midpoint. For item 14, a +2 (outstanding) to −2 (worthless) scale has been used, with a 0 (worthwhile) midpoint.
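The two summary statistics used in Tables 1 and 2 are straightforward to compute from a response distribution. The sketch below shows one way this could be done; the response counts are invented for illustration and are not the survey data.

```python
# Hypothetical Likert counts (strongly disagree .. strongly agree); not survey data.
counts = {"SD": 1, "D": 2, "N": 3, "A": 5, "SA": 4}
weights = {"SD": -2, "D": -1, "N": 0, "A": 1, "SA": 2}   # scoring used for items 1-12

n = sum(counts.values())
broad_agree = counts["A"] + counts["SA"]       # "broad agreement" (BA)
broad_disagree = counts["SD"] + counts["D"]    # "broad disagreement" (BD)

ba_minus_bd = 100 * (broad_agree - broad_disagree) / n
acell_score = sum(weights[k] * v for k, v in counts.items()) / n

print(f"%BA - %BD  = {ba_minus_bd:.0f}%")
print(f"ACELL score = {acell_score:+.2f}")
```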
The open-ended responses to the ASLE survey were subjected to a content analysis (Buntine and Read, 2007) to investigate common aspects of the students’ responses, and to provide insight and context to the statistical analysis of the Likert items. The Workshop open-ended responses were not analysed in this way because 11 responses are insufficient for a meaningful analysis. We have included several responses in this report to illustrate particular points. The full set of responses is included in the ESI.†
Finally, both survey instruments have a number of common Likert items, albeit phrased predictively in the workshop instrument and evaluatively in the ASLE instrument. For example, the workshop survey asks delegates to respond to “I expect that the students will find this experiment interesting”, while the ASLE instrument asks students to respond to “I found this to be an interesting experiment”. A Wilcoxon Rank–Sum Test is an appropriate statistical test for comparing distributions with different numbers of respondents (Wilcoxon, 1945; Mann and Whitney, 1947). The null hypothesis in each case – that the distribution of responses is the same for the workshop and ASLE surveys – was tested for N = (11, 93).
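As a sketch of how such a comparison could be run in practice, the snippet below uses scipy's mannwhitneyu (equivalent to the Wilcoxon rank-sum test) on two invented sets of coded Likert responses that mimic the unequal group sizes of N = (11, 93); it is illustrative only and does not use the actual survey data.

```python
# Sketch of the rank-sum comparison described in the text, using invented
# Likert responses coded -2 (strongly disagree) .. +2 (strongly agree).
# scipy.stats.mannwhitneyu is equivalent to the Wilcoxon rank-sum test.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
workshop = rng.choice([0, 1, 2], size=11, p=[0.1, 0.3, 0.6])      # N = 11 (hypothetical)
students = rng.choice([-2, -1, 0, 1, 2], size=93,
                      p=[0.05, 0.10, 0.35, 0.35, 0.15])           # N = 93 (hypothetical)

stat, p_value = mannwhitneyu(workshop, students, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")   # reject the null hypothesis if p < 0.05
```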
As discussed above, some of the questions in the Workshop Instrument were designed to provide overlap with the ACELL Student Learning Experience (ASLE) instrument, to be administered at a later time in the real laboratory setting. In particular, the survey items on “interest” (Item 6) and “responsibility for own learning” (Item 12) are exact counterparts of the ASLE questions. The response in Table 1 to the “interest” item was the strongest of all 24 experiments surveyed at the workshop. The strong response to this item was clarified in the freeform comments to the question “What, if anything, are the notable strengths of this experiment?” While the number of respondents is too small for a meaningful qualitative analysis, several responses highlighted the clear and simple nature of the instrumentation as the reason for the interest:
“Demystifying instrumentation”
“The hands on, playing with the polarimeter”
“… it keeps you entertained and you can have a lot of fun with it.”
Based on the strong feedback from the ACELL workshop, this experiment was selected for further testing in the environment of a real First Year undergraduate laboratory.
Table 2 Summary of ASLE survey responses

Number | Item | %BA − %BDa (this expt) | ACELL scoreb (this expt)
---|---|---|---
1 | This experiment has helped me to develop my data interpretation skills | 68% | +0.83 |
2 | This experiment has helped me to develop my laboratory skills | 63% | +0.78 |
3 | I found this to be an interesting experiment | 33% | +0.41 |
4 | It was clear to me how this laboratory exercise would be assessed | 53% | +0.72 |
5 | It was clear to me what I was expected to learn from completing this experiment | 72% | +0.97 |
6 | Completing this experiment has increased my understanding of chemistry | 57% | +0.76 |
7 | Sufficient background information, of an appropriate standard, is provided in the introduction | 80% | +1.15 |
8 | The demonstrators offered effective support and guidance | 82% | +1.33 |
9 | The experimental procedure was clearly explained in the lab manual or notes | 73% | +1.00 |
10 | I can see the relevance of this experiment to my chemistry studies | 77% | +1.14 |
11 | Working in a team to complete this experiment was beneficial | 86% | +1.43 |
12 | The experiment provided me with the opportunity to take responsibility for my own learning | 71% | +0.96 |
13 | I found that the time available to complete this experiment was | 5% | +0.07 |
14 | Overall, as a learning experience, I would rate this experiment as | 29% | +0.32

a BA = broad agreement; BD = broad disagreement. b For the ACELL score, items 1 to 12 were scored from +2 (strongly agree) to −2 (strongly disagree), with a 0 (neutral) midpoint. For item 13, a +2 (way too much time) to −2 (nowhere near enough time) scale has been used, with a 0 (about right) midpoint. For item 14, a +2 (outstanding) to −2 (worthless) scale has been used, with a 0 (worthwhile) midpoint.
We commence the analysis of the student learning experience with the overall experience question: “14. Overall, as a learning experience, I would rate this experiment as:”. The feedback is strong; 94% of students considered the experiment to be “worthwhile” or better (see ASLE data in the ESI†). Analysis of the open-response student feedback reinforces, and provides some context to, the overall positive sense in which they view the laboratory exercise. Of a total of 63 comments relating to the student experience of the experiment (category EE in Table 3), 53 were positive. Many students commented favourably on learning about the operational principles of a polarimeter by assembling the simple apparatus themselves – comments that align well with the objective of demystifying the polarimeter.
Table 3 Content analysis of the open-ended ASLE responses

Category/Theme | Abbreviation | Total comments | Sub-categories
---|---|---|---
Understanding of Chemistry | UC | 69 | How polarimeter works (28)
 | | | Relationship between concentration and optical activity (23)
 | | | Chirality and optical activity (15)
 | | | Graphing and data analysis (3)
Experience of Experiment | EE | 63 | Positive comments (53)
 | | | Negative comments (10)
Interesting Aspects of Experiment | IAE | 36 | Using the polarimeter (21)
 | | | Seeing sucrose rotate the plane of PPL (8)
 | | | Making solutions (7)
Potential Improvements | PI | 27 | Fine as it is (16)
 | | | Student notes (5)
 | | | Include variety of compounds to analyse (3)
 | | | Instrument (3)
Miscellaneous Comments | MC | 6 |
Although 94% of students rated the experiment as worthwhile or better, the distribution was strongly peaked at just “worthwhile” (see ESI†); only 34% responded with the two highest categories. As shown in Tables 1 and 2, both of the summary scoring methods show a much lower response to this item in the ASLE survey than in the Workshop survey: BA − BD = 64% (Workshop) and 29% (ASLE), or ACELL score = 0.91 (Workshop) and 0.32 (ASLE). A hint of the reasons for the lower score in the ASLE survey can be found in the 10 negative comments about the student experience, most of which can be summarised as students perceiving the practical to be “boring” or “uninteresting”. These comments, and the differences between the Workshop and ASLE surveys, warrant further examination. Below, we analyse the other 13 ASLE questions for evidence of the lower overall student experience score, and in the subsequent section we compare, statistically, the common items in the two surveys.
The first administrative item that we consider is item 13, which explores whether the length of time for the practical is appropriate. The students' responses to this item were very strongly peaked, with 73 of 87 responses being “C: about right” (see ESI†). Nine students thought there was too much time, and five thought not enough time was allocated to the practical.
Students responded very favourably to the other four administrative items in the survey, and we will not dwell on these. The students responded positively to the clarity and appropriateness of the documentation describing the experiment (the lab manual) (Item 9) and to the appropriateness of the background information (Item 7). In each case 82% of respondents agreed or strongly agreed that the notes were clear and the background material appropriate. Further evidence in support of the claim that the laboratory notes assist students in preparing for and undertaking the exercise can be found in the open response data in Table 3, where only 5 of 27 responses to the question, ‘What aspects of the experiment need improvement and what changes would you suggest?’, indicate that some improvement in the documentation is required. The quality of the demonstrators (Item 8) was also very favourably received.
The weakest response to the administrative questions concerns the clarity of the laboratory assessment. Although the assessment was clear to 66% of respondents, 13% remained unclear about the assessment processes. The assessment for this experiment was performed after each laboratory session, and therefore after students had completed the ASLE survey. As such, it is not unreasonable to expect that some students had not thought deeply about the assessment whilst completing the survey.
The responses to Items 1 and 2 are statistically indistinguishable – both in the summary data in Table 2 and in the complete distributions in the ESI.† Students responded favourably to the experiment assisting in developing their general laboratory skills, especially their volumetric skills in making solutions of known concentration. They appreciated that accurately making solutions of known concentration was a critical step in exploring the relationship between solution concentration and optical activity. A number of students commented favourably on the simple nature of the instrument, for example:
“[I have] learn[t] how a polarimeter works, and the parameters of using the instrument (concentration and path length)”
The targeted learning objectives of plotting and error analysis, listed under “Scientific and Practical Skills” and “Thinking and Generic Skills”, were not identified by many students; only three students commented that graphing and data analysis skills were key themes of the experiment. Nonetheless, it is clear from the responses to Likert Item 1 that 72% of students agree that the exercise improves their data analysis skills (see ESI;† the table indicates a BA − BD score of 68%). This outcome indicates that one of the original aims of introducing the experiment, viz., to improve the generic skills of students in terms of graphing and data interpretation, has been achieved without distracting students from the core chemical concepts being explored.
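As an illustration of the kind of graphing and data interpretation the exercise targets, the sketch below fits a straight line (as implied by eqn (1)) to some invented rotation-versus-concentration readings; the numbers are hypothetical and are not drawn from student data.

```python
# Hypothetical rotation readings (degrees) at a fixed path length l = 1.0 dm,
# used only to illustrate the linear analysis implied by eqn (1).
import numpy as np

conc = np.array([0.05, 0.10, 0.15, 0.20])    # concentration, g/mL (invented)
theta = np.array([3.4, 6.5, 10.1, 13.2])     # measured rotation, degrees (invented)

slope, intercept = np.polyfit(conc, theta, 1)    # least-squares straight line
print(f"slope = {slope:.1f} deg mL/g  (estimate of alpha * l)")
print(f"intercept = {intercept:+.2f} deg (should be close to zero)")
```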
The score on the teamwork item (11. Working in a team to complete this experiment was beneficial) was the highest of the 12 standard Likert items – 89% were in agreement and only 3% in disagreement. However, teamwork did not feature very strongly in the open response data, with only two comments – one favourable and one critical (perhaps because of the random allocation of partners). This suggests that, although the students valued working in a team (pair), this was an implicit factor in their overall laboratory learning experience and not one that featured very highly in their perceptions of the laboratory.
About 65% of respondents agree or strongly agree that ‘Completing this experiment has increased my understanding of chemistry’ (Likert Item 6). Student responses in the qualitative section of the survey show that the specific learning objectives identified in the Educational Template (see ESI†) were identified correctly by the students. In particular, two highlighted learning outcomes in the “Theoretical and Conceptual Knowledge” category were strongly represented in the sub-categories within the Understanding of Chemistry theme in Table 3.
Fig. 2 ASLE student responses to Q14, probing the students' overall experience of the practical, and the four items probing higher level engagement of the students. The full statements can be found in the ESI.†
Table 2 shows a very favourable student response to Likert Item 10 (‘I can see the relevance of this experiment to my chemistry studies’); over 80% of respondents agree or strongly agree with this sentiment. In addition, students found that the use of ‘real world’ sugar – a consumer product – made the experiment more ‘relevant’. Over 78% of students also agreed or strongly agreed with the statement that “It was clear to me what I was expected to learn from completing this experiment”, further indicating that a targeted objective of the practical exercise is satisfied.
The item querying “responsibility for own learning” also scored very strongly; 72% of students agreed or strongly agreed with the statement that “The experiment provided me with the opportunity to take responsibility for my own learning”. Only 1% of students disagreed with the statement. It is well known in the literature that recipe, or formulaic, experiments do not engage students (Buntine et al., 2007 and references therein). Student responses suggest that the experiment is an effective learning exercise in terms of creating motivated learners. Davis and Murrell (1993) discuss the role of explicitly encouraging student responsibility in the classroom as a key component in transforming teaching into learning. They comment that student participation in their education is a key motivator that contributes to creating an effective learning environment, and, conversely, that a lack of student responsibility can lead to a cohort of passive, disaffected students who become disengaged from the learning process. There exists an extensive literature (e.g., Babbage, 1998; Alderman, 1999; Fallows and Ahmet, 1999) indicating that motivated students, whether at school or at university, become better, more independent learners. The data reported here suggest that this laboratory exercise, whilst not perceived by students to engage their topical interest, certainly contributes to motivating them to learn about the chemical world around them.
The most surprising responses were to the survey item on interest: “3. I found this to be an interesting experiment”. As can be seen in Table 2, the summary score on this item was only 33% (BA − BD) or +0.41 (ACELL). This is by far the lowest score for any of the 12 standard Likert items on either scale, and, as can be seen in Fig. 2, approximately 15% of students found the exercise uninteresting. In light of the strong interest anticipated from the workshop data, we explore the issue of student interest below.
The Workshop survey and ASLE survey have four items that are rigorously correlated (see Fig. 3). Two other items on the Workshop survey are more generally linked to the responses on the ASLE instrument. In particular, the workshop item “I expect that completing this experiment will effectively help students to develop their scientific and practical skills” is broken out into Questions 1 (lab skills) and 2 (data interpretation skills) on the ASLE survey, and the workshop item “I believe that the laboratory notes, when supported with guidance from demonstrators and other resources, will provide sufficient support for students as they learn” is broken out in the ASLE instrument to probe notes and demonstrators separately. Nonetheless, the items are clearly linked, and the average of responses to the relevant items in the ASLE survey is plotted in Fig. 3 in comparison with these two workshop items.
Fig. 3 Comparison between expectations of the experiment at the ACELL workshop and responses from students in the real laboratory environment. Only items (a) and (b) were statistically different at P = 0.05. Responses range from strongly agree (SA) through strongly disagree (SD) (see text).
A Wilcoxon Rank–Sum Test is an appropriate statistical test for comparing distributions with different numbers of respondents (Wilcoxon, 1945; Mann and Whitney, 1947). The null hypothesis that the distribution of responses is the same for the workshop and ASLE surveys was tested for N = (11, 93). At p = 0.05, the null hypothesis is accepted for four of the distributions, as shown in Fig. 3. Specifically, the items on Assessment, Responsibility for own learning, (lab + data) skills, and (demonstrators + practical notes) show distributions that are not significantly different between the workshop and the teaching laboratory. This is an important finding because it demonstrates that the workshop experience was sufficiently realistic and rigorous that most of the preliminary observations transferred into the real laboratory situation.
The item that probes the overall learning experience is designed as a catch-all for evaluating the experiment. Fig. 3 shows that the distribution of responses is significantly different in the teaching laboratory than in the workshop (p = 0.04). The only other item that shows a significant difference (p < 0.01) is the item that probes the level of Interest. Indeed, as discussed above, the Interest item in the ASLE survey stands out as being far lower than any other item. We therefore conclude that the lower overall rating of the experiment in the teaching laboratory arises because the students' interest is less engaged in the laboratory than the staff and students' interest was at the workshop.
The open-ended responses to “What were the strengths of the experiment” in the workshop survey reveal that the simple and clear instrumentation emerged as one of the strongest features of the experiment:
Enjoyable and valuable: it truly demystifies the principle and utility of the polarimeter. I find instruments the most difficult to learn, to understand, hence I found this experiment a very valuable one to undertake.
Many students in the teaching laboratory recognised that understanding the way a polarimeter works was one of the main objectives of the laboratory. In reply to the question: “What did you think was the main lesson to be learnt from the experiment?”, comments such as:
“How simple polarimetry works practically”, or “How a polarimeter works”
were common. However, several students responded to “What aspects of the experiment need improvement and what changes would you suggest?” with comments that reflected a wish for more sophisticated instrumentation:
“Perhaps a more complex polarimeter…”, “Equipment…”, and “The polarimeter was frustrating”.
The role of situational and individual interest as a motivational tool has received considerable attention (Schiefele and Krapp, 1996; Ainley et al., 2002; Hidi et al., 2004; Hidi and Renninger, 2006). In the present experiment, for the staff and more experienced students (no students at the workshop were First Year students), situational interest was triggered and held by the use of a very simple piece of scientific apparatus. Such an apparatus is more novel for these experimenters, whose backgrounds include far more personal experience with sophisticated instruments, whether in research laboratories or higher level teaching laboratories.
The simple polarimeter appears also to trigger situational interest in the first year students, but in this case holding that interest is more difficult. The students are more readily frustrated when the experiment does not work as they expect. They do not have the experience in experimentation to know that this is a common occurrence in new experiments, irrespective of the sophistication of the apparatus (indeed, they might come to learn that the more sophisticated the experiment, the greater the chance of it going awry!). The result is that it is easy to blame the instrument, and their interest is not held. Maintained situational interest has been shown to be associated with a higher level of cognitive engagement than merely triggered situational interest (Mitchell, 1993). It is likely that the inability of the polarimeter to maintain interest for some students also contributed to the overall score being lower than expected from the workshop data.
Footnotes
† Electronic supplementary information (ESI) available. See DOI: 10.1039/c0rp90015j
‡ Present address: University of South Australia, Adelaide SA 5001, Australia. E-mail: michael.crisp@unisa.edu.au
§ This federal government-funded project began in Australia in 1999 with a physical chemistry focus (APCELL) and evolved into an all-of-chemistry variant (ACELL) in 2004. In 2009 funding was secured to expand the project to include the kindred science disciplines of biology, chemistry and physics (ASELL).
This journal is © The Royal Society of Chemistry 2011