Language of mechanisms: exam analysis reveals students' strengths, strategies, and errors when using the electron-pushing formalism (curved arrows) in new reactions

Alison B. Flynn * and Ryan B. Featherstone
Department of Chemistry and Biomolecular Sciences, University of Ottawa, 10 Marie Curie Private, Ottawa, ON, Canada K1N 6N5. E-mail: alison.flynn@uOttawa.ca

Received 1st June 2016, Accepted 4th October 2016

First published on 4th October 2016


This study investigated students' successes, strategies, and common errors in their answers to questions that involved the electron-pushing (curved arrow) formalism (EPF), part of organic chemistry's language. We analyzed students' answers to two question types on midterms and final exams: (1) draw the electron-pushing arrows of a reaction step, given the starting materials and products; and (2) draw the products of a reaction step, given the starting materials and electron-pushing arrows. For both question types, students were given unfamiliar reactions. The goal was for students to gain proficiency—or fluency—using and interpreting the EPF. By first becoming fluent, students should have lower cognitive load demands when learning subsequent concepts and reactions, positioning them to learn more deeply. Students did not typically draw reversed or illogical arrows, but there were many other error types. Scores on arrows questions were significantly higher than on products questions. Four factors correlated with lower question scores: compounds bearing implicit atoms, intramolecular reactions, assessment year, and the conformation of the reactants as drawn on the page. We found little evidence of analysis strategies such as expanding or mapping structures. We also found a new error type, which we describe as picking up electrons and setting them down on a different atom. These errors revealed the difficulties that arose even before students had to consider the chemical meaning and implications of the reactions. Herein, we describe our complete findings and suggestions for instruction, including videos that we created to teach the EPF.


Introduction

Reaction mechanisms are a central aspect of study in chemistry, commonly explored and explained in organic chemistry using multiple shorthands and the electron-pushing (curved arrow) formalism (EPF)—a highly meaningful symbolism (Bhattacharyya, 2013). Although the EPF is a core part of organic chemistry's culture (Bhattacharyya and Bodner, 2014), students struggle to interpret reaction mechanisms and use the EPF. Graduate student participants in one study drew arrows that got them to the products, despite those arrows not representing the most likely mechanism (Bhattacharyya and Bodner, 2005). In another study, “students provided the mechanism and curved arrows only after having predicted the product” (p. 848), indicating that the EPF was not useful to them in their problem-solving process as it would be to an expert (Grove et al., 2012b). To analyze reactions (e.g., identify reagents or predict products), students seem to use a combination of heuristics and product analysis (e.g., stability) rather than the feasibility of a mechanism (Rushton et al., 2008; Graulich, 2015a). Further, students struggle to connect structure and function (Cartrette and Mayo, 2011; Cruz and Towns, 2014; Anzovino and Bretz, 2015; DeFever et al., 2015). Despite all these difficulties, students seem to benefit from mechanistic thinking, especially for more difficult questions whose answers are not immediately obvious (Grove et al., 2012a), suggesting that instruction should be improved to help students develop expertise earlier in their studies. The goal of our study was to isolate and investigate students' successes and errors when using and interpreting the EPF, independently of their ability to explain and predict reaction mechanisms and outcomes. We believe that being able to use and interpret the EPF is a key step toward understanding the molecular level of reactions, that is, the invisible. We explain our reasoning in the following paragraphs.

Making sense of the invisible is a major challenge that students face in learning chemistry (Kozma and Russell, 1997). The objects and changes we see at the macroscopic level reflect the submicroscopic level (Johnstone, 1982, 2000); these levels are communicated through and represented with the third, symbolic level (Taber, 2013). This third level forms chemistry's language—its many symbols, representations, and tools (Talanquer, 2011; Taber, 2013) that serve as chemistry's words, grammar, and syntax (Taskin and Bernholt, 2014) (Fig. 1). This language is deeply meaningful to experts but little more than a collection of lines and dots to many students (Bodner and Domin, 2000). Experts readily interpret, construct, and switch between representations and levels to best represent a desired molecule or process (Gilbert et al., 2008). However, this switching is often difficult and problematic for learners, who do not always know which aspects of symbolic representations are significant, much less how to translate between them (Kozma and Russell, 1997; Johnstone, 2000; Cheng and Gilbert, 2009; Gilbert and Treagust, 2009; Strickland et al., 2010). This difficulty developing representational competence transcends STEM fields (LaDue et al., 2015), including physics (Pape and Tchoshanov, 2001), math (Pollack, 2012), biology (Dees et al., 2014; Wright et al., 2014; LaDue et al., 2015), and biochemistry (Linenberger and Bretz, 2014). Not understanding the language makes it difficult to learn the concepts and easy to misinterpret the message; and yet, most curricula expect students to simultaneously learn and use this new language in complex situations (Taber, 2009).


Fig. 1 Chemistry's triplet (Johnstone, 2000). Note: here the symbolic domain for Tylenol includes symbols (e.g., line structures), and representations of macroscopic (e.g., image of a bottle) and submicroscopic (e.g., electrostatic potential map) levels.

The complex language of organic chemistry contributes to its reputation as a difficult and mysterious subject, a gauntlet to be run rather than the pathway to an exciting field (Anderson and Bodner, 2008; Lewington, 2013; Moran, 2013). Taber (2009) stressed that students needed to understand the meaning of curved arrows and argued that “teachers need to do more to induct learners into the intended symbolism we use” (p. 84). Bhattacharyya and Bodner (2005) emphasized that: “First, students need to be more explicitly aware that arrow-pushing serves an explicatory function” (p. 1407).

To help students gain fluency in organic chemistry's language, the University of Ottawa's curriculum explicitly teaches students the language before teaching chemical reactions and principles (Flynn and Ogilvie, 2015). This language instruction is anchored in students' prior knowledge—such as drawing Lewis structures—and has four major learning outcomes (Fig. 2): (LO1) Draw the electron-pushing arrows for a step, given the starting materials and products for that step; (LO2) Draw the products of a reaction step, given the starting materials and electron-pushing arrows for that step; (LO3) Draw the transition state structure for a given reaction step (understanding that some of the mechanistic details and conformational information may be absent); and (LO4) Draw a mechanism for the reverse reaction, given the mechanism in the forward direction. These question types always use reactions that students have not learned, to emphasize using and interpreting the EPF and not relying on knowing the mechanism.


Fig. 2 Organic chemistry language learning outcomes (LOs). Reactions are new to the students, such as this one (Clavette et al., 2012).

Research questions

In this study, we explored Organic Chemistry I and II students' successes, strategies, and the nature of their errors related to the first two learning outcomes, guided by the following research questions:

(RQ1) What are students' success rates in solving organic chemistry's language questions that involve using and interpreting the electron-pushing formalism?

(RQ2) What types of strategies and errors are students using and making in those question types?

Conceptual context: information processing theory and meaningful learning

Information processing theory (IPT) and meaningful learning form the conceptual context of this project (Fig. 3) (Mayer, 2012; Schunk, 2016). According to IPT, information enters through the sensory registers. Inputs perceived as useful are attended to and transferred to working memory (WM), which is responsible for holding, rehearsing, and processing new and already stored information. That information may be encoded for storage in long-term memory (LTM), a process facilitated by organization, elaboration, and links with networks—hallmarks of meaningful learning. For learning to be meaningful: (1) the learner needs relevant prior knowledge upon which to anchor the new knowledge, (2) the new information must be meaningful in and of itself, and (3) the learner must choose to incorporate this new information with existing knowledge (Ausubel et al., 1968; Bretz, 2001; Novak, 2007).
Fig. 3 Information Processing Theory (IPT) and meaningful learning form the project's conceptual context.

Also integral throughout IPT are social interactions and the learner's characteristics, including decisions, self-regulation, culture, and affect (Mayer, 2012). As the learner transfers information from WM to LTM, they are actively constructing their own knowledge (Vygotsky, 1978). Students' perception and use of symbols and representations will depend on their associations to existing knowledge, the meaning they attribute to the language, the associated cognitive load, and other factors (Matlin, 2009). In principle, if students are “fluent” in chemistry's language, they should have lower cognitive load demands and will be positioned to more deeply analyze subsequent reactions.

By analyzing students' success rates and errors, we will begin to learn what features of structures they attend to (and do not attend to) and how they interpret organic chemistry representations.

Methods

Setting and course

This research was conducted in Organic Chemistry I and II courses taught at a large, research-intensive Canadian university; each course is one semester in length. Organic Chemistry I is offered in the winter semester of students' first year of their studies and Organic Chemistry II is offered in the fall and summer semesters. Students can take these courses in English or French.

Both courses are twelve weeks long and consist of two weekly classes (1.5 hours each, mandatory, flipped format) (Flynn, 2015a) and a tutorial session (1.5 hours, optional; also called a recitation or discussion group). Assessment for each course generally comprises two midterms, a final exam, online homework assignments, and class participation using a classroom response system (‘Top Hat’, 2016). The Organic I course has a required, associated laboratory section (3 hours biweekly); the Organic II course has a laboratory course that runs concurrently and is required only for some programs (3 hours weekly). The courses are composed of ∼75% Faculty of Science students, ∼17% Faculty of Health Sciences students, and ∼8% students from other faculties. A new curriculum has been used in the courses since 2012 and has been implemented in all Organic Chemistry sections (six sections of Organic I, three sections of Organic II) (Flynn and Ogilvie, 2015).

The organic chemistry language instruction was given in Organic Chemistry I, immediately after a module on structure and before the first reactivity module on acid–base chemistry. Two of the six course sections have piloted the chemistry language instruction. Every assessment in Organic I and II contained questions aligned with the aforementioned language learning outcomes, always using a reaction that students had never learned at that point in their studies, so that the focus was on interpreting the symbolism rather than recalling a reaction type. The timing of the questions is summarized in Table 1. The question label (e.g., P16_2_399) gives the question type (P = draw the products), question label (16), the total points possible on that question (2), and the number of answers that we analyzed for that question (N = 399).
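For readers mining a similar dataset, the label convention above can be captured in a few lines. This is an illustrative sketch only; the names `Question`, `parse_label`, and `TYPE_NAMES` are ours, not part of the study's analysis code:

```python
# Parse question labels like "P16_2_399" into their components, following
# the convention described in the text: question type ("A" = arrows,
# "P" = products), question label, points possible, and answers analyzed.
from typing import NamedTuple

class Question(NamedTuple):
    qtype: str   # "Arrows" or "Products"
    label: str   # question label, e.g. "16"
    points: int  # total points possible on the question
    n: int       # number of answers analyzed

TYPE_NAMES = {"A": "Arrows", "P": "Products"}

def parse_label(code: str) -> Question:
    head, points, n = code.split("_")
    return Question(TYPE_NAMES[head[0]], head[1:], int(points), int(n))

print(parse_label("P16_2_399"))
# Question(qtype='Products', label='16', points=2, n=399)
```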

Table 1 Summary of questions asked in Organic Chemistry I or II
Question (a) | Question type | Course (Organic Chemistry I or II) | Assessment type | Year
(a) The question label (e.g., P16_2_399) gives the question type (P = draw the products), label (16), the total points possible on that question (2), and the number of answers that we analyzed for that question (N = 399).
A1_8_221 Arrows II Exam 2014
A2_8_388 Arrows II Exam 2015
A4_5_42 Arrows I M1 2015
A5_11_121 Arrows II M2 2014
A6_6_344 Arrows I Exam 2015
A8_8_372 Arrows I M1 2015
A11_8_290 Arrows I M2 2015
A12_4_20 Arrows I Exam 2014
A13_4_344 Arrows I Exam 2015
A14_7_95 Arrows II M1 2015
A15_7_127 Arrows II M2 2015
P1_2_361 Products I M1 2015
P2_2_126 Products II M2 2014
P3_2_200 Products I Exam 2014
P4_2_344 Products I Exam 2015
P5_2_200 Products I Exam 2014
P6_2_345 Products I Exam 2015
P7_2_98 Products II M1 2015
P8_2_200 Products I Exam 2014
P9_2_344 Products I Exam 2015
P10_2_344 Products I M1 2015
P11_2_126 Products II M2 2014
P12_2_290 Products I M2 2015
P13_2_290 Products I M2 2015
P14_2_290 Products I M2 2015
P16_2_399 Products II Exam 2015
P17_3_16 Products II M2 2015


This study focused only on the first two learning outcomes, abbreviated (1) “Arrows” or “A” series questions (Fig. 4) and (2) “Products” or “P” series questions (Fig. 5). The compounds were all drawn as if they were flat, to minimize complexity in the drawings and focus on the formalism; orbitals were not included in these question types (stereochemistry and molecular orbitals are included in other question types with other reactions and mechanisms in the course). In total, 15 Arrows and 17 Products questions were asked, giving a total of 7411 answers to analyze (3114 arrows, 4297 products). The University of Ottawa's Office of Research Ethics and Integrity approved this secondary use of data for this study, in accordance with policies for the ethical treatment of human research participants (Canadian Institutes of Health Research and Social Sciences and Humanities Research Council of Canada, 2010; Taber, 2014).


Fig. 4 “Draw the arrows” questions used in this study (labeled “A”). Students were given the starting materials and products of each reaction step. They had to expand the structure to reveal any non-bonding electrons and bonds involved, then draw the electron-pushing arrows. The anticipated answers are drawn in red. Some questions were given on more than one midterm/exam; each instance was given its own label. For example, A4–A7 refer to the same question given on different examinations (see Table 1).

Fig. 5 Draw the products questions used in this study (labeled “P”). Students were given the starting material(s) and electron-pushing arrows and had to draw the product of that step. The anticipated answers are drawn in green.

Question analysis

First, the score on each step of the questions (graded by a course teaching assistant using a marking scheme) was recorded on paper; the researcher who entered the scores into the database file then verified, against the same marking scheme, that the scores had been assigned correctly and consistently (checking for grading errors).

Next, each question was coded to identify the common strategies and the common errors demonstrated in the answer, similar to a process used when analyzing students' answers to synthesis questions (Bodé and Flynn, 2016). For the language questions, four different researchers analyzed eight different questions using an open coding method. The coding schemes were compared and a systematic coding template was developed (Appendix 1). All questions were analyzed or re-analyzed using the coding template by one of the researchers. A second rater coded a subset of answers (at least 10%) for a number of questions, and inter-rater reliability scores were calculated for each question, resulting in Krippendorff's α ≥ 0.92 in all cases (Krippendorff, 2011).
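As a rough illustration of this reliability check (not the software used in the study), Krippendorff's α for nominal codes and complete data can be computed from a coincidence matrix; the function name below is ours:

```python
# Minimal sketch: Krippendorff's alpha for nominal data, two or more
# coders, no missing values. alpha = 1 - D_o/D_e, built from the
# coincidence matrix of ordered code pairs within each coded unit.
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(ratings):
    """ratings: list of per-unit code lists, e.g. [["e1", "e1"], ...]."""
    o = Counter()  # coincidence matrix o[(c, k)]
    for unit in ratings:
        m = len(unit)
        for c, k in permutations(unit, 2):  # ordered pairs of coders
            o[(c, k)] += 1 / (m - 1)
    n_c = Counter()  # marginal totals per code
    for (c, _k), w in o.items():
        n_c[c] += w
    n = sum(n_c.values())
    disagree_obs = sum(w for (c, k), w in o.items() if c != k)
    disagree_exp = sum(n_c[c] * n_c[k] for c in n_c for k in n_c if c != k)
    return 1 - (n - 1) * disagree_obs / disagree_exp

# Two raters coding four answers; they disagree on the last one.
codes = [["e1", "e1"], ["e1", "e1"], ["e2", "e2"], ["e2", "e1"]]
print(round(krippendorff_alpha_nominal(codes), 3))  # 0.533
```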

The distribution of data was significantly non-normal in all cases. For example, a histogram of answer scores for question A1 (Fig. 6) shows negatively skewed data (skewness = −1.5), a Kolmogorov–Smirnov test was significant (p < 0.001), and a Q–Q plot showed deviations from normality. Accordingly, non-parametric statistical tests were used for all analyses.
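The skew statistic behind such a normality screen can be sketched as follows (illustrative only; the scores are hypothetical, and `skewness` implements the Fisher–Pearson coefficient g1):

```python
# Minimal sketch: sample skewness g1 = m3 / m2**1.5 (population moments).
# Negative values indicate a left skew: scores bunched near the top with
# a tail of low scores, as in the Fig. 6 histogram.
def skewness(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n  # second central moment
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

# Hypothetical percentage scores: mostly high, with a low-score tail.
scores = [100, 100, 100, 88, 88, 75, 75, 50, 25, 0]
print(skewness(scores) < 0)  # True
```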


Fig. 6 Histogram of scores for question A1 (%) shows negatively skewed data.

Results and discussion

Students scored higher on Arrows questions than on Products questions

As we investigated RQ1, we expected scores on both question types (which had a single correct answer) to be higher than the corresponding assessment score, because all the necessary information was given to students to solve the problem and they only had to use or interpret the electron-pushing formalism. Because students were given all the information needed to solve the questions, we think the questions are relatively easy—i.e., at the “understand” level in Bloom's taxonomy (Krathwohl, 2002)—and they had the prior knowledge and ability to get to the correct answer (e.g., when explicitly asked to do so, students are very successful translating between line and Lewis structures or assigning a formal charge). In other words, we expected these scores to be nearly 100% in all cases. Instead, we found that students' scores were lower than expected for both question types, on average.

The average and median scores on Arrows questions were higher than on Products questions: 72% and 86% for Arrows questions (N = 2926, SD = 33%), respectively, versus 55% and 50% for Products questions (N = 3993, SD = 40%), U = 4,628,100, z = −15.4, p < 0.001, r = −0.18 (Fig. 7). Each point represents one student's averaged score within one course; a student who took two courses (e.g., Organic I and II, or repeated one of the courses) would appear as a separate point for each course. There is less variance among Products scores because these questions were typically scored out of 2 or 3, whereas Arrows questions were scored out of larger values (e.g., 8). There was low correlation between students' average question scores and their corresponding midterm or exam scores: R² = 0.38 for Arrows questions and R² = 0.14 for Products questions.


Fig. 7 Average score on arrows (left, red triangles) and products (right, green squares) questions versus the corresponding assessment score (midterm or exam). Each point represents one student's averaged scores.

Students scored higher on some questions than on others

We analyzed the distribution of responses to each question using boxplots (Fig. 8), histograms (not shown), and descriptive statistics (not shown). Specific question attributes were correlated with lower scores on both question types: implicit atoms, intramolecular reactions, year in which the question was asked, and conformation of the reactants drawn on the page. These features in the questions reveal barriers that exist before students even get to thinking about reactive sites in the molecules or other aspects of chemical reactions (e.g., stereochemistry, orbital overlap).
Fig. 8 Distribution of scores for each question analyzed. x-Axis label (e.g., A1_8_221) represents the question type (A = arrows, P = products), question number (1), possible score on that question (8 = 8 possible points), and the number of answers analyzed for that question (221).

Students' scores were higher on questions with explicit atoms (Mdn = 83%) compared to implicit atoms (Mdn = 50%), U = 3,786,072, z = −22.6, p < 0.001, r = −0.27. Explicit atoms refer to atoms whose symbols are always explicitly written, such as N for nitrogen and O for oxygen, as well as carbon or hydrogen atoms that have been drawn as a fully or partially expanded structure. Implicit atoms refer to the carbon and hydrogen atoms that are embedded in a line structure. For example, students scored higher on A6 (Mdn = 100%) than on A8 (Mdn = 63%; Fig. 9).


Fig. 9 Comparison between questions with explicit (top) and implicit (bottom) atoms involved in the mechanism.

Students' scores were also higher on questions that involved intermolecular reaction steps (Mdn = 75%) compared to intramolecular steps (Mdn = 50%), U = 4,850,614, z = −14.2, p < 0.001, r = −0.17. For example, a number of questions involving intramolecular reactions had median scores of about 50%: P5, P6, P7, and P14.

Some questions had different scores in different years. For example, P8 (Mdn = 50%, asked in 2014) and P9 (Mdn = 100%, asked in 2015) represent the same question, which was given on different final exams in Organic I; exam papers were never returned to students, U = 38,774.5, z = 2.73, p = 0.006, r = 0.12. A similar trend was seen between P2 (Organic II, 2014, Mdn = 0%) and P1 (Organic I, 2015, Mdn = 50%), U = 17,695, z = −4.26, p < 0.001, r = 0.19. The effect sizes for this trend were small. These differences could be due to an increased emphasis on this question type in 2015, with associated class videos (Flynn, 2015b), practice, and a systematic way to solve the questions.

P3 and P4 were identical questions, except that the reactant was drawn in a different conformation in each one (Fig. 5). P3's scores (Mdn = 100%) were significantly higher than P4's scores (Mdn = 0%), U = 25,103.0, z = −6.06, p < 0.001, r = −0.26. The different conformations (on the page) clearly had a marked effect on students' ability to connect atoms. In another example, P16's low scores (Mdn = 0%) likely reflect the nature of the question: intramolecular, implicit atoms, and an unusual substructure in the product—a cyclopropanone.

Students' strategies and errors on Arrows questions

We analyzed students' answers to find out what kinds of strategies they were using (Fig. 10) and what errors they were making (Fig. 11); most students (>95%) attempted every question. We analyzed five types of Arrows questions, breaking down our analysis by reaction step, giving a total of 18 reaction steps analyzed (Fig. 10).
Fig. 10 Scores and strategies demonstrated on answers to arrows questions. Each reaction step is represented with a different lowercase letter (see legend). LP = lone pairs or non-bonding electrons. The given action (e.g., drew required LPs) was required for steps with a dot (.).

Fig. 11 Scores and errors demonstrated on answers to arrows questions. Each reaction step is represented with a different lowercase letter (see legend).

Most students expanded the structures when required (Fig. 10), with exceptions in steps A8a and A14d. In both questions, few students expanded the reactive carbon to reveal the reactive proton and bond; implicit atoms were involved in both cases. Mapping and expanding are two strategies that could have been helpful in these cases, which seemed more difficult than other questions.

Some low scores may reflect symbolism use issues or conceptual understanding issues. For example, scores on all steps of A4, A8, A11, and A14 might have been higher if the non-bonding electrons (lone pairs) had been drawn in (Fig. 10); students may not have understood the bonding changes in the reaction or may simply have neglected to draw in the electrons. Students' conceptual understanding of bonding could also have impacted their analysis, although we did not analyze their conceptual understanding in this study.

Very few arrows were drawn in reverse, from atom to electrons (Fig. 11). In line with our requirements for these questions, most arrows started from electrons (non-bonding or bonding) and pointed to an atom or bond. In only a few cases, the arrows started from an atom (instead of drawing in the non-bonding electrons) or from the negative charge; these ways of drawing arrows are commonly accepted in the organic chemistry community and so other instructors might have accepted these arrows as being correct (which would correspondingly lead to higher scores in these questions). There were only a few cases of extra, missing, or illogical arrows (e.g., that did not have meaning in the context of the reaction), suggesting that students are tracking electron movement (in most cases) and understand how to account for that movement within each reaction step.

Not expanding or mapping atoms and electrons seems to be a major source of errors, although students might have been doing this internally, or by drawing out electrons, atoms, or bonds instead of numbering or colouring the reaction components. For example, we analyzed steps A14a and A14d to identify the answers students were giving. In A14a, only about half of students provided the correct arrows (Fig. 12). Another 35% showed one of the correct arrows (answers 2–4), which depicted how the negative charge on the sulfur atom was generated. Still, half of students could not correctly identify the electron source of the new N=N π bond, with 35% of students drawing an arrow from the negative charge. For step A14d (Fig. 13), only 18% of students provided the accepted arrows, either the anticipated ones (answer 1) or an accepted alternative (answer 2). 76% of students drew the π bond forming using the carbene electrons, which had been explicitly drawn for the students, but did not show how the hydride migrated to the carbene carbon (answer 3). 68% of students never drew in a hydrogen atom β to the carbene (not shown as a separate answer), presumably because they did not realize it was involved in the reaction.


Fig. 12 A14a: distribution of answers. The product of each step was provided as usual for this question type.

Fig. 13 A14d: distribution of common answers. The product of each step was provided as usual for this question type.

Strategies and errors on Products questions

Products questions had more errors. Most students attempted every question (Fig. 14). A variable number of students drew non-bonding electrons (i.e., lone pairs). Few students expanded bonds or atoms, redrew the structure, or showed evidence of mapping. Students had been taught to use these strategies, initially as part of in-class instruction (2014), then through a series of videos and in-class activities (2015) (Flynn, 2015b). Further, we have seen that these strategies are associated with successful solutions to synthesis problems (Flynn, 2015b; Bodé and Flynn, 2016).
Fig. 14 Strategies that students demonstrated on products questions.

The majority of students (>90%) correctly drew in non-bonding electrons, expanded bonds and atoms, and mapped, when they chose to do so. When these questions were first given to students, only the non-bonding electrons explicitly involved in the reaction mechanism were shown.

For Arrows questions in the first phase of this project (2012–2014), only the non-bonding electrons directly involved in a mechanism were drawn out. Based on students' errors, omitting the other electrons does not seem to cause problems except in more complicated examples; in those cases, not showing students the electrons seemed to lead to confusion. In one example that involved a rearrangement, a large proportion of students drew non-bonding electrons in P2 (56%) and the majority did so incorrectly (58%), unlike in easier questions. Most of these errors related to non-bonding electrons drawn on the nitrogen atom of the acetylated hydroxamic acid. During instruction, all the non-bonding electrons are now drawn on atoms directly involved in a reaction step; students will also be required to do so on assessments.

We found differences between the answers to P3 and P4 in addition to the higher scores in P3. All the students who expanded structures in P3 did so correctly, compared to only 90% in P4. In P3, 88% of students drew in non-bonding electrons correctly, compared to only 67% of students in P4.

Many students (16%, N = 120) gave an unanticipated answer to the third step of question A15 (Fig. 15). Another 28% gave the anticipated response, 24% gave a partial response (i.e., were missing arrows), and 32% made a fundamental error in their response (e.g., reversed arrow, etc.). There are a couple of different ways to interpret students’ unanticipated answers. Although they had learned much about carbonyl chemistry by this point in the course (e.g., acylation, saponification), and could have used their chemistry knowledge when answering this question, perhaps they were simply conducting a bookkeeping exercise. That is, perhaps they were simply adding arrows that get them to the product (Bhattacharyya and Bodner, 2005). Alternatively, perhaps they believed their answer was chemically plausible, in the same way that a chemist would propose and explore many possible mechanisms and discount certain ones based on experimental evidence. Students’ alternative answers provide excellent opportunities for further class discussion and analysis.


Fig. 15 Two common answers to A15c (step 3). Students did not have to label the specific oxygen atoms. N = 120.

When we analyzed students' errors in the Products questions, we found many formal charge errors throughout the majority of questions (Fig. 16). We contrasted these many errors with situations in which students chose to draw out the non-bonding electrons or expand structures; in the latter situations, the majority did so correctly (>90%). The only questions that did not have a high incidence of formal charge errors in the answer were P3 and P11. In P3, the structure was drawn in the pseudo-reactive conformation and the median score was 100%. We believe that the process of explicitly drawing electrons, bonds, and atoms would help students more clearly interpret a reaction process and avoid formal charge errors.


Fig. 16 Errors found in products questions.

There was also a type of error that we had not observed before, which we describe as the following arrow interpretation: picking up electrons and moving them to a completely different atom. For example, two different answers were accepted for P1 and P2 that show how students interpret electron-pushing arrows differently (Fig. 17). In many answers, the nitrogen–carbon electrons were “placed” on the nitrogen atom instead of migrating the bond and aryl group to the nitrogen atom. We had never seen this type of error on past exams, but we had also never asked this type of question, in which the arrows were provided. We do not know whether instruction has influenced the way students interpret this question type (i.e., by asking students to interpret reactions they have not seen before) or whether students instinctively interpret these arrows in this way. The latter has important implications for instruction, as students seem to see alternative interpretations of the electron-pushing arrows that we present in classes and would need opportunities to learn the chemistry (plausibility) behind these alternate reaction pathways.


Fig. 17 Two common answers to P1 (N = 160) and P2 (N = 126).

We found many error types in answers to P10. This question was given in the first midterm of Organic Chemistry I, only a week after students had learned the electron-pushing formalism (through videos, in-class activities, and a problem set). These errors suggest how students initially interpret the electron-pushing formalism and support our opinion that the EPF should be taught explicitly, so that the formalism itself does not become a barrier to learning the chemistry that follows. Time, practice, and feedback are needed to support students' learning.

In 2015, the instructor explained to students that the electrons from a reacting bond stay with one of the atoms from that bond (Flynn, 2015b). Even so, more than half of the students in the final exam of the 2015 Organic II course interpreted an electron-pushing arrow by picking up and moving electrons (Fig. 18, answers 2–4). The high proportion of students who gave this answer might have interpreted the cyclopropanone substructure as implausible, which would suggest that they were considering the nature of the product formed and not simply conducting a bookkeeping exercise for the sake of answering a question. In other words, they might be linking the electron-pushing formalism to chemical meaning, a possibility that we are investigating in a related project. Some students wrote notes beside their answers that support this idea, such as one student who drew the correct answer and wrote: “This is what it [the question] seems to be indicating, but tried it with a molecular model and didn’t work too well.” Note: although there are alternative reaction mechanisms for the Favorskii reaction, this was the version presented to students (as starting material and electron-pushing arrows). More concerning than an alternate interpretation of the arrow formalism, a full 70% of students made an implicit atom error in this question (answers 3–6).


Fig. 18 Distribution of common responses to P16 revealed many implicit atom errors, formal charge errors, and “pick up electrons” interpretations. N = 104.

Very few students drew pentavalent carbon atoms (<3%), except on P10 and P16. Question P10 was given in the first midterm of Organic I, when students had just learned the electron-pushing formalism, which could explain that error. For question P16, students may have realized that answer 5 (Fig. 18) was not a plausible answer but could not think of a better alternative, or they may not have realized their error.

Our investigation of students' strategies and errors suggested further barriers: students demonstrated few of the strategies that we had taught them and made a number of errors when interpreting the formalism to draw products. Why were so few strategies demonstrated even though students were taught to use them? Perhaps these strategies are not being consistently modeled in class; that is, perhaps students were told to use the strategies but the instructor did not model them consistently, so students did not get sufficient practice to see their importance. Perhaps students did not see the purpose of the strategies because they did not recognize that they were making mistakes. If the strategies are beneficial, then students who begin to use them should have higher success on these questions and, presumably, on other mechanism-type questions.

Some errors are particularly surprising, such as formal charge errors, especially given that students who expanded structures did not typically make them. One possibility is that novice students perceive the structures differently than more experienced students and experts do; that is, they may not “see” the implicit atoms. This difficulty with implicit atoms could be leading to a string of subsequent errors. If this were true, then students' scores should increase if instructors consistently modeled the strategies of expanding and mapping structures throughout the course (i.e., going beyond recommending the strategies or modeling them only at the beginning and for hard questions), or if they required students to expand structures (at least around the reacting centres).

Limitations

We do not yet know whether teaching the formalism before reactions, giving students opportunities to practice, and assessing their skills on the intended learning outcomes on all midterms and exams is helping students gain greater fluency in the electron-pushing formalism. Future studies will continue to investigate this question. Nor do we know from this study how students conceptualize or construct their understanding of the reactions or of questions of this type; we are following up on their meaning-making in another study.

Conclusions

In this study, students' scores, strategies, and errors were investigated on two organic chemistry language/EPF question types: (1) drawing the electron-pushing arrows, given the starting materials and products of a reaction step (called “Arrows”), and (2) drawing the products of a reaction step, given the starting materials and electron-pushing arrows of that step (called “Products”). We explicitly teach students to interpret and use organic chemistry's language as a step or scaffold as they develop greater competency in chemistry. Any reaction can be used to create the question types described herein; we chose reactions that students had not yet learned, and we showed each step of the reaction explicitly. The goal of these question types is for students to interpret and use the electron-pushing formalism, not to predict or interpret the chemistry principles involved in the reaction.

The vast majority of students attempted every question, and very few reversed-arrow or illogical-arrow errors were found. Students' scores were higher on the Arrows questions than on the Products questions. Questions with implicit atoms or intramolecular steps were significantly correlated with lower scores, signifying barriers to later chemistry learning. Beyond identifying errors with the conventions of the EPF, our study found more fundamental issues of understanding, such as students interpreting some electron-pushing arrows as indicating that electrons should be relocated to a different atom (especially in rearrangements).

Few students demonstrated the strategies they had been taught, including expanding structures by drawing out non-bonding electrons, bonds, or implicit atoms (carbon or hydrogen), and mapping the atoms and electrons. These strategies could be beneficial, especially in more challenging questions (e.g., those with many implicit atoms, intramolecular reactions, or unfamiliar structures or atoms).

There were some questions in which students provided unexpected answers that could enrich classroom discussions; P1 and P2 are prime examples (Fig. 17). Alternative mechanistic interpretations can be linked to a research setting in which a reaction's mechanism is being investigated. In class, we can discuss how to probe various alternatives and what to look for in the data. While we would not expect students early in an Organic I course to be able to propose mechanisms, we would expect them to know that alternatives are possible; by the end of an Organic II course, students could be expected to propose or compare mechanistic pathways for a reaction.

Research has shown the many difficulties students have with using this formalism and with reaction mechanisms. These difficulties include connecting Brønsted-Lowry theory, Lewis theory, nucleophile/electrophile concepts and reactivity, and structure/property relationships (Strickland et al., 2010; Cartrette and Mayo, 2011; Cruz and Towns, 2014; Anzovino and Bretz, 2015; DeFever et al., 2015). Students often seem to rely on intuitive judgments when making decisions (Graulich, 2015b; Weinrich and Talanquer, 2015, 2016), focus on product stability rather than feasibility of the reaction mechanism (Rushton et al., 2008), and struggle with multi-variate problems (Kraft et al., 2010).

Experts largely agree that the principal use of mechanistic reasoning with the EPF is to explain and predict the outcomes of chemical processes (Bhattacharyya, 2013), and research has shown a benefit to mechanistic thinking, despite students’ difficulties (Grove et al., 2012a). However, students often propose mechanisms that are not meaningful: they propose mechanisms that simply “get them to the product” (Bhattacharyya and Bodner, 2005), “decorate with arrows” after providing a product (Grove et al., 2012b), or interpret representations based on surface features (Kozma and Russell, 1997). With so many aspects of the language and tools unique to organic chemistry (Bhattacharyya and Bodner, 2014), students' understanding and interpretation of this language and these tools will certainly affect their ability to learn chemistry concepts and make connections.

Our study revealed that many of these difficulties arise before students even have to consider the chemical meaning and implications of the reactions, when “all” they have to do is interpret the symbolism. Some of our findings are consistent with previous research; for example, they demonstrate that, for some students, the symbolism is a collection of “letters and lines and numbers that cannot correctly be called symbols because they do not represent or symbolize anything that has physical reality” (Bodner and Domin, 2000). This seems especially to be the case when implicit atoms are involved in a reaction. However, we found few reversed-arrow or illogical errors, suggesting that students are attributing the meaning of electron movement and bond formation/breakage to the curved arrows. In other words, the EPF does seem meaningful to students in many contexts.

As our next research steps, we are investigating how students connect the organic chemistry symbolism with their prior learning, and how they describe their interpretation of EPF as it relates to the sub-microscopic level (i.e., how they understand and translate between these levels).

If students are to understand the sub-microscopic level, they need an understanding of and fluency in the symbolic level, as well as an ability to connect the sub-microscopic and macroscopic levels (Johnstone, 2000; Gilbert and Treagust, 2009). For instruction, we recommend modeling strategies such as expanding and mapping structures, embedding these strategies and question types in formative and summative assessment opportunities, and giving rapid or immediate feedback. To help students practice the skills involved, we developed a free, bilingual, online learning module that includes explanatory videos (Flynn, 2015b; Flynn et al., 2016), questions, feedback, and a metacognitive skill-building layer. Instruction (explanations, modeling, activities, and assessment) could also help students connect the symbolic level with the macroscopic and sub-microscopic levels (Cheng and Gilbert, 2009), along with explorations of the strengths and limitations of the representations we use.

Appendix 1

Table 2 Coding template for arrows and products questions

| Arrows questions (analysis repeated for each step) | Products questions |
| --- | --- |
| File #, Step # | File # |
| Assessment score | Assessment score |
| Step a/b/c/d/e (possible score) | Score (possible score) |
| No attempt | No attempt |
| Required LPs drawn/not drawn (R/W) | |
| Some LPs drawn (R/W) | Some LPs drawn (R/W) |
| All LPs drawn/not drawn (R/W) | All LPs drawn/not drawn (R/W) |
| Expanded bonds/atoms—required ones (R/W) | |
| Expanded bonds/atoms—some more than required (R/W) | Expanded bonds/atoms—some (R/W) |
| Expanded bonds/atoms—all (R/W) | Expanded bonds/atoms—all (R/W) |
| Redrew starting material(s)—part (R/W) | Redrew structure—part (R/W) |
| Redrew starting material(s)—all (R/W) | Redrew structure—all (R/W) |
| Counted Cs in SM | |
| Mapping evidence (R/W) | Mapping evidence (R/W) |
| Formal charge error | Formal charge error |
| Wrong reacting partners | Picked electrons up and moved them to a completely different atom/bond |
| Wrong LP used | Added extra step/intermediate |
| Arrow reversal (atom to e–s) | |
| Arrows: extra (E) or missing (M) | Ignored/missed an arrow (didn't do what the arrow says to do)—which one? |
| Arrows: didn't start from e–s | Wrong configuration (R/S, E/Z) |
| Implicit atom error (H or C as part of a line structure) (E/M) | Stereochemical discrepancy (e.g., 3D on sp2 C in plane) |
| Comments | Carbons: extra/missing (E/M) |
| | Functional group: extra/missing (E/M) |
| | Bond: extra/missing (E/M) |
| | Implicit atom error (H or C as part of a line structure) (E/M) |
| | Comments |


Acknowledgements

We thank the University of Ottawa for funding this work and the following research assistants for their contributions: Declan Webber, Delphine Amellal, Jade Choo Foo, John Giorgi, and Sarang Gupta.

References

  1. Anderson T. L. and Bodner G. M., (2008), What can we do about “Parker”? A case study of a good student who didn't “get” organic chemistry, Chem. Educ. Res. Pract., 9, 93–101.
  2. Ausubel D. P., Novak J. D. and Hanesian H., (1968), Educational Psychology: A Cognitive View, 2nd edn, NY: Holt, Rinehart and Winston.
  3. Anzovino M. E. and Bretz S. L., (2015), Organic chemistry students' ideas about nucleophiles and electrophiles: the role of charges and mechanisms, Chem. Educ. Res. Pract., 16(4), 797–810. Retrieved from http://pubs.rsc.org/en/content/articlehtml/2015/rp/c5rp00113g.
  4. Bhattacharyya G., (2013), From Source to Sink: Mechanistic Reasoning Using the Electron-Pushing Formalism, J. Chem. Educ., 90(10), 1282–1289. Retrieved from http://pubs.acs.org/doi/abs/10.1021/ed300765k.
  5. Bhattacharyya G. and Bodner G. M., (2005), ‘It Gets Me to the Product’: How Students Propose Organic Mechanisms, J. Chem. Educ., 82(9), 1402. Retrieved from http://dx.doi.org/10.1021/ed082p1402.
  6. Bhattacharyya G. and Bodner G. M., (2014), Culturing reality: how organic chemistry graduate students develop into practitioners, J. Res. Sci. Teach., 51(6), 694–713. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/tea.21157/abstract.
  7. Bodé N. E. and Flynn A. B., (2016), Strategies of Successful Synthesis Solutions: Mapping, Mechanisms, and More, J. Chem. Educ., 93(4), 593–604. DOI: 10.1021/acs.jchemed.5b00900.
  8. Bodner G. M. and Domin D. S., (2000), Mental models: the role of representations in problem solving in chemistry, Univ. Chem. Educ., 4(1), 24–30.
  9. Bretz S. L., (2001), Novak's Theory of Education: Human Constructivism and Meaningful Learning, J. Chem. Educ., 78(8), 1107.
  10. Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, and Social Sciences and Humanities Research Council of Canada, (2010), TCPS 2 – Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans, Government of Canada. Retrieved from http://www.pre.ethics.gc.ca/eng/index/.
  11. Cartrette D. P. and Mayo P. M., (2011), Students' understanding of acids/bases in organic chemistry contexts, Chem. Educ. Res. Pract., 12, 29–39. Retrieved from http://dx.doi.org/10.1039/C1RP90005F.
  12. Cheng M. and Gilbert J. K., (2009), Towards a Better Utilization of Diagrams in Research into the Use of Representative Levels in Chemical Education BT – Multiple Representations in Chemical Education, in Multiple Representations in Chemical Education, Dordrecht: Springer, vol. 4, pp. 55–73. Retrieved from http://link.springer.com/10.1007/978-1-4020-8872-8_4.
  13. Clavette C., Gan W., Bongers A., Markiewicz T., Toderian A. B., Gorelsky S. I. and Beauchemin A. M., (2012), A tunable route for the synthesis of azomethine imines and β-aminocarbonyl compounds from alkenes, J. Am. Chem. Soc., 134(39), 16111–16114. DOI: 10.1021/ja305491t.
  14. Cruz D. and Towns M. H., (2014), Students' understanding of alkyl halide reactions in undergraduate organic chemistry, Chem. Educ. Res. Pract., 15, 501–515.
  15. Dees J., Momsen J. L., Niemi J. and Montplaisir L., (2014), Student interpretations of phylogenetic trees in an introductory biology course, CBE Life Sciences Education, 13(4), 666–676. DOI: 10.1187/cbe.14-01-0003.
  16. DeFever R. S., Bruce H. and Bhattacharyya G., (2015), Mental Rolodexing: Senior Chemistry Majors' Understanding of Chemical and Physical Properties, J. Chem. Educ., 92(3), 415–426. Retrieved from http://pubs.acs.org/doi/abs/10.1021/ed500360g.
  17. Flynn A. B., (2015a), Structure And Evaluation Of Flipped Chemistry Courses: Organic & Spectroscopy, Large And Small, First To Third Year, English And French. Chem. Educ. Res. Pract., 16, 198–211. Retrieved from http://pubs.rsc.org/en/Content/ArticleLanding/2014/RP/C4RP00224E.
  18. Flynn A. B., (2015b), Reaction mechanisms: interpreting organic chemistry's language, YouTube, University of Ottawa. Retrieved from https://www.youtube.com/playlist?list=PL3RQqudFCQqd7UHNonwMinwv_tjhNJqxI.
  19. Flynn A. B. and Ogilvie W. W., (2015), Mechanisms before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow, J. Chem. Educ., 92(5), 803–810. Retrieved from http://pubs.acs.org/doi/abs/10.1021/ed500284d.
  20. Flynn A. B., Caron J., Laroche J., Richard G., Bélanger M. and Featherstone R., (2016), Orgchem101.com: An organic chemistry and metacognitive skill and concept building tool. Retrieved 18 September 2016, from http://orgchem101.com/.
  21. Gilbert J. K. and Treagust D. F., (ed.), (2009), Multiple Representations in Chemical Education. Models and Modeling in Science Education, vol. 4, Dordrecht: Springer.
  22. Gilbert J. K., Reiner M. and Nakhleh M., (ed.), (2008), Visualization: an emergent field of practice and enquiry in science education, in Visualization theory and practice in science education, Dordrecht: Springer, pp. 3–24.
  23. Graulich N., (2015a), Intuitive Judgments Govern Students' Answering Patterns in Multiple-Choice Exercises in Organic Chemistry, J. Chem. Educ., 92(2), 205–211. Retrieved from http://pubs.acs.org/doi/abs/10.1021/ed500641n.
  24. Graulich N., (2015b), The tip of the iceberg in organic chemistry classes: how do students deal with the invisible? Chem. Educ. Res. Pract., 16(1), 9–21. Retrieved from http://pubs.rsc.org/en/content/articlehtml/2015/rp/c4rp00165f.
  25. Grove N. P., Cooper M. M. and Cox E. L., (2012a), Does Mechanistic Thinking Improve Student Success in Organic Chemistry? J. Chem. Educ., 89(7), 850–853. Retrieved from http://dx.doi.org/10.1021/ed200394d.
  26. Grove N. P., Cooper M. M. and Rush K. M., (2012b), Decorating with Arrows: Toward the Development of Representational Competence in Organic Chemistry. J. Chem. Educ., 89, 844–849. Retrieved from http://dx.doi.org/10.1021/ed2003934.
  27. Johnstone A. H., (1982), Macro- and micro-chemistry, Sch. Sci. Rev., 64, 377–379.
  28. Johnstone A. H., (2000), Teaching of Chemistry – Logical or Psychological?, Chem. Educ. Res. Pract., 1(1), 9–15. Retrieved from http://pubs.rsc.org/en/content/articlelanding/2000/rp/a9rp90001b#!divAbstract.
  29. Kozma R. B. and Russell J., (1997), Multimedia and understanding: expert and novice responses to different representations of chemical phenomena, J. Res. Sci. Teach., 34(9), 949–968. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/(SICI)1098-2736(199711)34:9<949::AID-TEA7>3.0.CO;2-U/abstract.
  30. Kraft A., Strickland A. M. and Bhattacharyya G., (2010), Reasonable reasoning: multi-variate problem-solving in organic chemistry, Chem. Educ. Res. Pract., 11, 281–292. Retrieved from http://dx.doi.org/10.1039/C0RP90003F.
  31. Krathwohl D. R., (2002), A Revision of Bloom's Taxonomy: An Overview, Theor. Pract., 41(4), 212–218. Retrieved from http://www.unco.edu/cetl/sir/stating_outcome/documents/Krathwohl.pdf.
  32. Krippendorff K., (2011), Computing Krippendorff's Alpha-Reliability. Retrieved 27 May 2016, from http://repository.upenn.edu/asc_papers/43.
  33. LaDue N. D., Libarkin J. C. and Thomas S. R., (2015), Visual Representations on High School Biology, Chemistry, Earth Science, and Physics Assessments, J. Sci. Educ. Technol., 24(6), 818–834. DOI: 10.1007/s10956-015-9566-4.
  34. Lewington J., (2013), Get the lecture before you even arrive in class, The Globe and Mail.
  35. Linenberger K. J. and Bretz S. L., (2014), Biochemistry students' ideas about shape and charge in enzyme–substrate interactions, Biochem. Mol. Biol. Educ., 42(3), 203–212. DOI: 10.1002/bmb.20776.
  36. Matlin M. W., (2009), Cognitive Psychology, Hoboken, NJ: Wiley.
  37. Mayer R. E., (2012), Information processing, in Harris K. R., Graham S., Urdan T., McCormick C. B., Sinatra G. M. and Sweller J. (ed.), APA educational psychology handbook, Vol 1: Theories, constructs, and critical issues, Washington, DC: American Psychological Association.
  38. Moran B., (2013), How to Get an A− in Organic Chemistry, The New York Times.
  39. Novak J. D., (2007), Human constructivism: A unification of psychological and epistemological phenomena in meaning making, International Journal of Personal Construct Psychology, 6(2), 167–193.
  40. Pape S. J. and Tchoshanov M. A., (2001), The Role of Representation(s) in Developing Mathematical Understanding, Theor. Pract., 40(2), 118–127. DOI: 10.1207/s15430421tip4002_6.
  41. Pollack C., (2012), The Invisible Link: Using State Space Representations to Investigate the Connection Between Variables and Their Referents, Mind, Brain, and Education, 6(3), 156–163. DOI: 10.1111/j.1751-228X.2012.01151.x.
  42. Rushton G. T., Hardy R. C., Gwaltney K. P. and Lewis S. E., (2008), Alternative conceptions of organic chemistry topics among fourth year chemistry students, Chem. Educ. Res. Pract., 9, 122–130. DOI: 10.1039/B806228P.
  43. Schunk D., (2016), Learning Theories: An Educational Perspective, 6th edn, New York, NY: Pearson.
  44. Strickland A. M., Kraft A. and Bhattacharyya G., (2010), What Happens when Representations Fail to Represent? Graduate Students’ Mental Models of Organic Chemistry Diagrams, Chem. Educ. Res. Pract., 11, 293–301. Retrieved from http://pubs.rsc.org/en/Content/ArticleLanding/2010/RP/c0rp90009e.
  45. Taber K. S., (2009), Learning at the Symbolic Level, in Multiple Representations in Chemical Education, Dordrecht: Springer Netherlands, vol. 4, pp. 75–105. Retrieved from http://link.springer.com/10.1007/978-1-4020-8872-8_5.
  46. Taber K. S., (2013), Revisiting the chemistry triplet: drawing upon the nature of chemical knowledge and the psychology of learning to inform chemistry education, Chem. Educ. Res. Pract., 14(2), 156–168. DOI: 10.1039/C3RP00012E.
  47. Taber K. S., (2014), Ethical considerations of chemistry education research involving ‘human subjects’, Chem. Educ. Res. Pract., 15, 109–113. DOI: 10.1039/c4rp90003k.
  48. Talanquer V., (2011), Macro, Submicro, and Symbolic: The many faces of the chemistry ‘triplet’, Int. J. Sci. Educ., 33(2), 179–195. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/09500690903386435.
  49. Taskin V. and Bernholt S., (2014), Students' Understanding of Chemical Formulae: A review of empirical research, Int. J. Sci. Educ., 36(1), 157–185. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/09500693.2012.744492.
  50. Top Hat, (2016), [ELEC]. Retrieved 1 June 2016, from https://tophat.com/.
  51. Vygotsky L., (1978), Interaction between learning and development, in Mind in Society: The development of higher psychological processes, Cambridge, MA: Harvard University Press.
  52. Weinrich M. and Talanquer V., (2015), Mapping students' conceptual modes when thinking about chemical reactions used to make a desired product, Chem. Educ. Res. Pract. Retrieved from http://pubs.rsc.org/en/content/articlehtml/2015/rp/c5rp00024f.
  53. Weinrich M. and Talanquer V., (2016), Mapping students' modes of reasoning when thinking about chemical reactions used to make a desired product, Chem. Educ. Res. Pract., DOI: 10.1039/C5RP00208G.
  54. Wright L. K., Fisk J. N. and Newman D. L., (2014), DNA → RNA: What Do Students Think the Arrow Means? CBE Life Sciences Education, 13(2), 338–348. DOI: 10.1187/cbe.CBE-13-09-0188.

This journal is © The Royal Society of Chemistry 2017