Effect of clickers versus online homework on students' long-term retention of general chemistry course material

Misganaw T. Gebru, Amy J. Phelps* and Gary Wulfsberg
Chemistry Department, Middle Tennessee State University, PO Box 68, Murfreesboro, Tennessee 37132, USA. E-mail: Amy.Phelps@mtsu.edu

Received 12th March 2012, Accepted 8th April 2012

First published on 2nd May 2012


Abstract

This study reports the effects of student response systems (clickers) versus online homework on students' long-term retention of General Chemistry I course material. Long-term content retention was measured by a comprehensive yearlong American Chemical Society (ACS) GC97 exam administered seven months after students had completed the General Chemistry I course. The analysis indicated that although students who used clickers or online homework earned average scores on the ACS GC97 exam a little over 2 percentage points higher than the non-clicker, non-online homework (lecture-only) group, this difference was not statistically significant. Interestingly, the data also revealed that more students were retained in both the clicker and online homework classes than in the lecture-only classes. This work suggests that treatments that enhance feedback to students may increase student retention in the course sequence with no loss in learning.


Introduction

Retention of introductory general chemistry course material is vital for student success in future chemistry and chemistry-related courses. However, in many cases, the learning environment in general chemistry classes does not seem to promote students' long-term retention of material. One of the major roadblocks to creating an optimal learning environment in general chemistry courses is that they are usually taught as large classes. Unfortunately, large classes are usually associated with less than favorable outcomes, including increased faculty reliance on the traditional lecture method, less active student involvement in the learning process, fewer instructor-student and student-student interactions, and reduced frequency of (or no) graded homework assignments, resulting in less feedback to students (Cuseo, 2007; Hall et al., 1999; Hoekstra, 2008; Trees and Jackson, 2007). This is precisely the situation facing many professors as the number of students enrolled continues to increase, outpacing the hiring of new faculty.

In order to minimize the undesired results associated with teaching large classes, and more importantly, to increase student long-term (beyond the end of the course) retention of course material in large classes, instructors have adopted a variety of teaching strategies inside and outside of the classroom. One of the promising strategies that can enhance student learning and retention of information is the integration of emerging technologies into instruction. Among the technologies that have been extensively used in many institutions of higher education are Student Response Systems (clickers), which require students to answer questions, such as ConcepTests, during class (Brooks and Koretsky, 2011; Caldwell, 2007; Crouch and Mazur, 2001; Landis et al., 1998; MacArthur and Jones, 2008; Rickey and Stacy, 2000; Woelk, 2008), and online homework (OHW) systems, which require students to answer questions outside of the classroom (Chamala et al., 2006; Cole and Todd, 2003; Cuadros and Yaron, 2007; Freasier et al., 2003; Harris, 2009). Although clickers and online homework are used in different contexts, both have been praised for engaging students in learning activities and providing immediate feedback that can assist in student learning (Bunce et al., 2006; Cole and Todd, 2003; Dangel and Wang, 2008; FitzPatrick et al., 2011; Hall et al., 1999; Kennedy and Cutts, 2005; Lanz, 2010).

Studies examining the effect of clickers in chemistry found some promising results. King and Joshi (2008) found that using clickers in a large general chemistry lecture class at the university level enhanced student performance on exam questions that were related to content taught with clickers and considerably improved female students' class participation. An ethnographic study conducted by Hoekstra (2008) suggested that the use of clickers in a large general chemistry lecture class at the university level improved student engagement, enhanced peer discussions among students, facilitated effective problem-based learning, and increased students' comfort level when working together.

Sevian and Robinson (2011) demonstrated that clickers could be used effectively in both small and large undergraduate level general chemistry lecture courses, in a small graduate-level class for environmental toxicology, and in undergraduate environmental science laboratory classes. Their study indicated that clickers were effective in promoting learning in the sciences, especially when the use of clickers was “transparently integrated with the content” and maintained the “flow of the class” without diverting students' attention from the lesson.

Bunce et al. (2006) compared the effect of clickers versus online quiz assessment on students' performance in a medium sized (N = 41) lecture class of a general, organic, and biochemistry course for nursing students in a small private university. The results indicated that using online quizzes significantly improved students' performance on teacher-written exams, but clickers did not. Neither clickers nor online quizzes significantly improved students' performance on the organic and biochemistry subsections of the ACS General, Organic, and Biochemistry exam Form 2000.

Research has also been done on the impact of clickers in other disciplines, including biology, pharmacy, psychology, and computer science. These studies have examined the effect of clickers on student learning (Crossgrove and Curran, 2008; Doucet et al., 2009; Gauci et al., 2009; Lui et al., 2010; Preszler et al., 2007), on students' long-term retention of course material (Crossgrove and Curran, 2008; Doucet et al., 2009; Lui et al., 2010), and on the differential effects of clickers versus other classroom techniques such as the class discussion method (Martyn, 2007), the group questioning method (Mayer et al., 2009), and paper-based unexpected quizzes (Shapiro, 2009). Most of these studies found statistically significant differences in student performance favoring clickers (Crossgrove and Curran, 2008; Doucet et al., 2009; Gauci et al., 2009; Lui et al., 2010; Mayer et al., 2009; Preszler et al., 2007; Shapiro, 2009).

However, the results on long-term retention of information are inconsistent. Lui et al. (2010) found no statistically significant difference in long-term (one month) retention of pharmacy course material between students taught with and without clickers. Doucet et al. (2009) indicated that using clickers in a veterinary clinical pharmacology course did not significantly enhance students' long-term (twelve month) content retention.

Crossgrove and Curran (2008) investigated students' long-term retention of course material, as measured by tests administered four months after they had completed the course, for an introductory biology course for non-majors and a genetics course for biology majors. Results from this study indicated that using clickers in the introductory non-major biology class significantly improved students' long-term retention of material that was related to clicker-based questions, but using clickers in the biology major genetics class made no difference in students' long-term retention. The authors noted that the clicker questions in the genetics course were application or comprehension questions whereas the non-clicker questions were knowledge or comprehension questions, and that the level of feedback provided to the genetics students was not the same as that provided in the non-major course. The study suggests that students perform better when exam questions are on material covered using clickers, and that the level of feedback given to students after answering clicker questions is vital for enhancing students' performance on exams.

Studies that compared the effect of clickers to other active learning strategies also found mixed results. For instance, Martyn (2007) found no statistically significant difference in students' performance on introductory computer information systems exams when clickers or the class discussion method were used during lectures. Mayer et al. (2009) investigated the effect of clickers on student performance in comparison to the group questioning method, finding significantly better student performance on educational psychology exams for students taught with clickers than for students taught with the group questioning method. Shapiro (2009) showed that students who used clickers outperformed students who used either paper-based unexpected quizzes or paper-based extra credit opportunities, especially on test questions that were similar to the clicker questions. In summary, the findings from recently published studies did not agree as to whether clickers positively influence student learning or long-term content retention.

Studies investigating the effect of online homework on students' performance in chemistry and non-chemistry courses found inconclusive results as well. In some cases, online homework assignments improved students' performance on course exams significantly more than paper-based homework assignments (Arasasingham et al., 2005; Burch and Kuo, 2010; Dillard-Eggers et al., 2008), and in others, online homework assignments were found to be only as effective as paper-based homework assignments (Allain and Williams, 2006; Cole and Todd, 2003; Fynewever, 2008; Kodippili and Senaratne, 2008; Palocsay and Stevens, 2008).

While clickers and online homework have each been studied in a variety of settings, none of the studies reviewed directly investigated the effect of clickers on students' long-term retention of general chemistry material, particularly in comparison to online homework. This study hypothesizes that the benefits of clickers and online homework, namely engaging students in learning activities and providing them with immediate feedback, would translate into improved long-term retention of information.

Theoretical framework

The constructivist perspective asserts that learning is a process of knowledge construction rather than knowledge recording or absorption (Anthony, 1996). In other words, knowledge is actively constructed by the learner based on prior knowledge rather than being transferred directly from the mind of the teacher (Green and Gredler, 2002). Therefore, students need to interact with their teacher and peers during classroom instruction, and should actively participate in a system of practices to develop conscious awareness of and mastery of subject-matter concepts (Green and Gredler, 2002). We applied this perspective to this study because we assume that using clickers or online homework, by engaging students in learning activities and providing them with immediate feedback, can help students actively interpret and impose meaning through their existing knowledge structures (Anthony, 1996). As a result, we expected students to develop a deeper understanding, or better construction of knowledge, that would be reflected in their long-term retention of the course material.

The purpose of this study was to investigate the effectiveness of clickers on students' long-term retention of General Chemistry I content, as compared to instruction supplemented by online homework or a lecture-only (non-clicker, non-online homework) approach. The findings from this study would fill a gap in the literature concerning the effect of clickers versus OHW on long-term learning.

Methods

Participants

The participants in this study were 160 undergraduate students at a regional comprehensive university who took the yearlong ACS GC97 final exam in a General Chemistry II (GC II) course in Fall 2008, 2009, or 2010, after having taken the General Chemistry I (GC I) course during the previous Spring semester. Students taking GC II in the summer were excluded from this study since the ACS standardized exam (the data collection instrument) was not used consistently during the summer. This study used data from the Spring General Chemistry I and Fall General Chemistry II semesters because clickers were only used in General Chemistry I during the Spring semesters; the professor implementing clickers in GC I had other assignments in the Fall semesters.

Material and instruments

Clickers. TurningPoint clickers (with receiver and software) were used in selected General Chemistry I classrooms to engage students in active learning during the Spring 2008, 2009, and 2010 semesters. The same professor taught all of the General Chemistry I classes that used clickers. After a trial run in 2008, each class session contained 4–6 multiple-choice clicker questions. The professor who taught the clicker classes regularly assigned ungraded problems from the textbook to help students prepare for the clicker questions. Based on students' self-reported data collected via clicker votes, only about half of these students did the ungraded textbook assignments. This study included all students in the clicker classes regardless of whether or not they did the textbook assignments.
Online homework. The classes in this study used two different types of online homework systems (WebAssign or OWL) to help General Chemistry I students gain practice with the course material outside the classroom. WebAssign was used in Spring 2008 and Spring 2009, and OWL was used in Spring 2010. This change was based on a departmental decision to adopt a new textbook supported by a different online homework system. A single professor, different from the one who taught the classes that used clickers, taught all of the General Chemistry I classes that used online homework assignments, while several other professors used neither clickers nor online homework assignments in General Chemistry I classes.
American Chemical Society (Yearlong) General Chemistry exam (GC97). The GC97 exam was designed and validated by the ACS Examinations Institute and endorsed by ACS as an appropriate means of assessing students' knowledge of chemistry. The general chemistry courses in this study used the GC97 as a common final exam for all students. The GC97 exam consisted of 70 multiple-choice items, which the Coordinator of General Chemistry divided into 40 questions related to material discussed in General Chemistry I (GC I subset) and 30 questions related to material covered in General Chemistry II (GC II subset). This categorization was evaluated and accepted by the general chemistry faculty participating in the study. The 40 GC I subset questions from the ACS exam were used to measure students' long-term retention of General Chemistry I material, since seven months would have passed since these students completed the GC I course.

Research design

In this study, a quasi-experimental design (Borg and Gall, 1989) was used to compare the GC I subset scores from the ACS GC97 exam for students in the clicker group, the online homework group, and the lecture-only group. The study used the type of learning activity completed by students (online homework outside of the classroom, the use of clickers in the classroom, or neither of these) as the independent variable and the ACS GC I subset score as the dependent variable. The classification of the participants into experimental and control groups by GC I instructional method is presented in Fig. 1. At the end of the General Chemistry II course, all of the participants completed a common ACS final exam in order to measure their long-term retention of General Chemistry I course material; thus our definition of "long-term" is "7–10 months after introducing the concept". The data from the ACS GC97 exam were retrieved with the approval of the university IRB committee and in consultation with the coordinator of the general chemistry courses. The yearlong ACS exam answer sheets were machine scored, and the percentage of correct responses on the GC I subset was determined for each student.
Fig. 1 Classification of the participants into experimental and control groups based on GC I instructional method.

Statistical analyses

Welch's analysis of variance (Welch's ANOVA) does not assume equal group variances and is commonly used when the experimental and control group sizes differ, as was the case in this study. Welch's ANOVA was used to compare the three groups' scores on the GC I subset of the ACS GC97 exam as the measure of students' long-term retention of general chemistry course material. All statistical analyses were performed using Predictive Analytics Software (PASW) Statistics version 18 (SPSS Inc., 2009).
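As a rough illustration of the calculation behind this analysis (the analysis reported here was run in PASW/SPSS, not with the code below), the following Python sketch implements Welch's ANOVA from its standard formula. The three input arrays are hypothetical stand-ins for per-student GC I subset percentages; only the group sizes (68, 38, 54) mirror this study.

import numpy as np
from scipy import stats

def welch_anova(*groups):
    """Welch's (unequal-variance) one-way ANOVA; returns (F, df1, df2, p)."""
    k = len(groups)
    n = np.array([len(g) for g in groups], dtype=float)
    means = np.array([np.mean(g) for g in groups])
    variances = np.array([np.var(g, ddof=1) for g in groups])

    w = n / variances                                  # precision weights
    grand_mean = np.sum(w * means) / np.sum(w)

    numerator = np.sum(w * (means - grand_mean) ** 2) / (k - 1)
    tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))   # unequal-variance correction term
    denominator = 1 + 2 * (k - 2) * tmp / (k ** 2 - 1)

    F = numerator / denominator
    df1 = k - 1
    df2 = (k ** 2 - 1) / (3 * tmp)
    return F, df1, df2, stats.f.sf(F, df1, df2)

# Hypothetical data with the same group sizes as Table 1 (means and SDs are illustrative only)
rng = np.random.default_rng(0)
online_hw = rng.normal(49.8, 14.8, 68)
clicker   = rng.normal(50.3, 14.3, 38)
lecture   = rng.normal(47.7, 14.7, 54)

F, df1, df2, p = welch_anova(online_hw, clicker, lecture)
print(f"Welch's F({df1}, {df2:.0f}) = {F:.3f}, p = {p:.3f}")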

Results and discussion

In order to investigate the effectiveness of clickers and online homework on students' long-term retention of General Chemistry I content, the average scores on the 40 GC I subset questions (expressed as percentages) were compared for the students in the clicker group, the online homework group, and the lecture-only (non-clicker, non-online homework) group. Descriptive statistics for the ACS GC I subset scores for these groups appear in Table 1.
Table 1 Descriptive statistics for the GC I subset of ACS common final exam scores (a)

Group                     N     x̄ (%)    SD
Experimental groups
  Online homework         68    49.82     14.81
  Clickers                38    50.33     14.30
Control group
  Lecture-only            54    47.69     14.73

(a) The GC I subset scores on the ACS exam used in this study are based on the 40 GC I questions and are presented as percentages. Comparison with the National Average Score for the ACS GC97 form was not possible since the National Average score is based on all 70 questions on the GC97 exam.


The average scores indicate that both the clicker (x̄ = 50.33, SD = 14.30) and online homework (x̄ = 49.82, SD = 14.81) students earned scores a little over 2 percentage points higher than the lecture-only (x̄ = 47.69, SD = 14.73) students.

Welch's ANOVA indicated that there was not a statistically significant difference (at the α = 0.05 level) among the three instructional methods (clicker, online homework, lecture-only) in improving students' long-term retention of the General Chemistry I course material, Welch's F(2, 93) = 0.461, p = 0.632. This finding indicates that the additional use of clickers in the classroom or of online homework outside the classroom at this regional comprehensive university did not significantly improve students' long-term retention of material from the General Chemistry I course.

Interestingly, a greater proportion of students finished the general chemistry sequence in the following Fall semester from the General Chemistry I classes in which clickers or online homework were used than from the lecture-only General Chemistry I classes (Table 2).

Table 2 Number of students who enrolled for GC II and took the ACS GC97 exam in the following fall semester after they had taken GC I in the spring semesters of 2008, 2009, and 2010

Group              GC I, Spring (2008, 2009, 2010)    GC II/ACS exam, Fall (2008, 2009, 2010)    Retention rate (%)
Online homework    290                                 68                                          23.4
Clicker            153                                 38                                          24.8
Lecture-only       395                                 54                                          13.7
Total              838                                 160                                         19.1
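For clarity, each retention rate in Table 2 is simply the percentage of the Spring GC I enrollment that went on to complete GC II and the ACS GC97 exam the following Fall; for the online homework group, for example:

retention rate = (68 / 290) × 100% ≈ 23.4%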


Anecdotal evidence and a quick survey administered using clickers in the clicker classrooms indicated that using clickers in the General Chemistry I course increased student attendance (more than 90% attended the clicker classes). The survey results also showed that the vast majority of the students in the clicker group felt that clickers should be used more broadly in other courses university-wide. Unfortunately, this study did not compare student attendance among the three groups since the instructors who taught the online homework and lecture-only classes did not record students' attendance on a regular basis.

Conclusions

The literature on the use of student response systems (clickers) and online homework is mixed, and it appears that the studies that found differences used assessment questions tightly associated with the clicker questions (Crossgrove and Curran, 2008; King and Joshi, 2008; Shapiro, 2009). Additionally, Bunce et al. (2006) found that online quizzes significantly improved students' performance on teacher-written exams but not on a standardized ACS exam. In this study, we used the nationally standardized ACS GC97 exam as a measure of students' knowledge, since our goal was to evaluate whether these instructional methods improved students' ability to answer questions that were not written by their instructor.

Our data also indicated that there was no significant difference in long-term retention of chemistry content among any of the instructional methods tested. These findings are consistent with the existing literature on clickers (Crossgrove and Curran, 2008; Doucet et al., 2009; Lui et al., 2010). One explanation for why the online homework group in our study did not outperform the lecture-only (control) group is that the students in the control group were assigned in-book homework; online and in-book homework should not produce much of a difference (as long as students are doing it), as indicated by studies that compared online with paper-based homework (Allain and Williams, 2006; Cole and Todd, 2003; Fynewever, 2008; Kodippili and Senaratne, 2008; Palocsay and Stevens, 2008). Additionally, the results of this study might have been confounded by an instructor effect, since different instructors were involved in teaching the three groups.

Studies of long-term retention are often plagued by attrition and a loss of participants (Borg and Gall, 1989), and this study is no different. Although the clicker group originally had 153 students over the three Spring semesters, of whom 112 earned an A, B, or C, only 38 of these students enrolled in General Chemistry II the next Fall and took the ACS final exam. Looking at the numbers for the online homework and lecture-only groups across the same period of time, one notices dramatic attrition in all three groups, but a greater proportion of students were retained in the clicker and online homework classes than in the lecture-only classes.

Implications

The questions posed in this study are vital for the future of chemical education at universities such as this regional comprehensive university, since reductions in state support cannot be offset by forever increasing student tuition. Larger classes seem inevitable, and one wonders: must long-term learning of chemistry suffer as a consequence? Our study investigated whether the use of clickers or online homework could offset the drawbacks of larger class sizes. Unfortunately, our sample sizes were too small to validate the positive long-term trends that could be present. We can conclude, however, that using instructional techniques that increase feedback to students enhances retention in the course sequence without any negative impact on performance.

Acknowledgements

We would like to thank Professor Gary White for his assistance in retrieving the archival data, and the other professors in the chemistry department for allowing their Scantron data to be used or for teaching one of the treatment or control sections.

References

  1. Allain R. and Williams T., (2006), The effectiveness of online homework in an introductory science class, J. Coll. Sci. Teach., 35, 28–30.
  2. Anthony G., (1996), Active learning in a constructivist framework, Educ. Studies in Math., 31, 349–369.
  3. Arasasingham R. D., Taagepera M., Potter F., Martorell I. and Lonjers S., (2005), Assessing the effect of Web-Based learning tools on student understanding of stoichiometry using knowledge space theory, J. Chem. Educ., 82, 1251–1262.
  4. Borg W. R. and Gall M. D., (1989), Educational Research: An introduction, (5th ed.). New York: Longman.
  5. Brooks B. J. and Koretsky M. D., (2011), The influence of group discussion on students' responses and confidence during peer instruction, J. Chem. Educ., 88, 1477–1484.
  6. Bunce D. M., VandenPlas J. R. and Havanki K. L., (2006), Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes, J. Chem. Educ., 83, 488–493.
  7. Burch K. J. and Kuo Y.-J., (2010), Traditional vs. online homework in college algebra, Math. and Comp. Educ., 44, 53–63.
  8. Caldwell J. E., (2007), Clickers in the large classroom: Current research and best-practice tips, CBE Life Sci. Educ., 6, 9–20.
  9. Chamala R. R., Ciochina R., Grossman R. B., Finkel R. A., Kannan S. and Ramachandran P., (2006), EPOCH: An organic chemistry homework program that offers response-specific feedback to students, J. Chem. Educ., 83, 164–169.
  10. Cole R. S. and Todd J. B., (2003), Effects of web-based multimedia homework with immediate rich feedback on student learning in general chemistry, J. Chem. Educ., 80, 1338–1343.
  11. Crossgrove K. and Curran K. L., (2008), Using clickers in non-majors- and majors-level biology courses: student opinion, learning, and long-term retention of course material, CBE Life Sci. Educ., 7, 146–154.
  12. Crouch C. H. and Mazur E., (2001), Peer instruction: Ten years of experience and results, Am. J. Phys., 69, 970–977.
  13. Cuadros J. and Yaron D., (2007), “One firm spot”: The role of homework as lever in acquiring conceptual and performance competence in college chemistry, J. Chem. Educ., 84, 1047–1052.
  14. Cuseo J., (2007), The empirical case against large class size: Adverse effects on the teaching, learning, and retention of first-year students, J. Faculty Development, 21, 5–21.
  15. Dangel H. L. and Wang C. X., (2008), Student response systems in higher education: Moving beyond linear teaching and surface learning, J. Educ. Technol. Development and Exchange, 1, 93–104.
  16. Dillard-Eggers J., Wooten T., Childs B. and Cooker J., (2008), Evidence on the effectiveness of on-line homework, College Teaching Methods and Styles Journal, 4, 9–15.
  17. Doucet M., Vrins A. and Harvey D., (2009), Effect of using an audience response system on learning environment, motivation and long-term retention, during case-discussions in a large group of undergraduate veterinary clinical pharmacology students, Medical Teacher, 31, e570–e579.
  18. FitzPatrick K. A., Finn K. E. and Campisi J., (2011), Effect of personal response systems on student perception and academic performance in courses in a health sciences curriculum, Adv. Physiol. Educ., 35, 280–289.
  19. Freasier B., Collins G. and Newitt P., (2003), A web-based interactive homework quiz and tutorial package to motivate undergraduate chemistry students and improve learning, J. Chem. Educ., 80, 1344–1347.
  20. Fynewever H., (2008), A comparison of the effectiveness of web-based and paper-based homework for general chemistry, Chem. Educator, 13, 264–269.
  21. Gauci S. A., Dantas A. M., Williams D. A. and Kemm R. E., (2009), Promoting student-centered active learning in lectures with a personal response system, Adv. Physiol. Educ., 33, 60–71.
  22. Green S. K. and Gredler M. E., (2002), A review and analysis of constructivism for school-based practice, Sch. Psychol. Rev., 31, 53–70.
  23. Hall R. W., Butler L. G., Kestner N. R. and Limbach P. A., (1999), Combining feedback and assessment via Web-based homework, Campus-Wide Information Systems, 16, 24–26.
  24. Harris H., (2009), Electronic homework management systems: Reviews of popular systems, J. Chem. Educ., 86, 691.
  25. Hoekstra A., (2008), Vibrant student voices: exploring effects of the use of clickers in large college courses, Learning, Media and Technology, 33, 329–341.
  26. Kennedy G. E. and Cutts Q. I., (2005), The association between students' use of an electronic voting system and their learning outcomes, J. Comp. Assist. Learn., 21, 260–268.
  27. King D. B. and Joshi S., (2008), Gender differences in the use and effectiveness of personal response devices, J. Sci. Educ. Technol., 17, 544–552.
  28. Kodippili A. and Senaratne D., (2008), Is computer-generated interactive mathematics homework more effective than traditional instructor-graded homework?, Brit. J. Educ. Technol., 39, 928–932.
  29. Landis C. R., Peace Jr. G. E., Scharberg M. A., Branz S., Spencer J. N., Ricci R. W., Zumdahl S. A. and Shaw D., (1998), The new traditions consortium: Shifting from a faculty-centered paradigm to a student-centered paradigm, J. Chem. Educ., 75, 741–744.
  30. Lanz M. E., (2010), The use of 'clickers' in the classroom: Teaching innovation or merely an amusing novelty?, Computers in Human Behavior, 26, 556–561.
  31. Lui F. C., Gettig J. P. and Fjortoft N., (2010), Impact of a student response system on short- and long-term learning in a drug literature evaluation course, Am. J. Pharm. Educ., 74, 1–5.
  32. MacArthur J. R. and Jones L. L., (2008), A review of literature reports of clickers applicable to college chemistry, Chem. Educ. Res. Pract., 9, 187–195.
  33. Martyn M., (2007), Clickers in the classroom: an active learning approach, Educause Q., 2, 71–74.
  34. Mayer R. E., Still A., DeLeeuw K., Almeroth K., Bimber B., Chun D., Bulger M., Campbell J., Knight A. and Zhang H., (2009), Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes, Contemp. Educ. Psychol., 34, 51–57.
  35. Palocsay S. W. and Stevens S. P., (2008), A study of the effectiveness of web-based homework in teaching undergraduate business statistics, Decision Sciences J. of Innovative Education, 6, 213–232.
  36. Preszler R. W., Dawe A., Shuster C. B. and Shuster M., (2007), Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses, CBE Life Sci. Educ., 6, 29–41.
  37. Rickey D. and Stacy A. M., (2000), The role of metacognition in learning chemistry, J. Chem. Educ., 77, 915–920.
  38. Sevian H. and Robinson W. E., (2011), Clickers promote learning in all kinds of classes—small and large, graduate and undergraduate, lecture and lab, J. Coll. Sci. Teach., 40, 14–18.
  39. Shapiro A., (2009), An empirical study of personal response technology for improving attendance and learning in a large class, J. Scholarship of Teaching and Learning, 9, 13–26.
  40. SPSS Inc., (2009), PASW Statistics 18.0 Command and Syntax Reference. Chicago, IL: SPSS Inc.
  41. Trees A. R. and Jackson M. H., (2007), The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems, Learning, Media and Technology, 32, 21–40.
  42. Woelk K., (2008), Optimizing the use of personal response devices (clickers) in large-enrollment Introductory courses, J. Chem. Educ., 85, 1400–1405.
