Upper-division chemistry students’ navigation and use of quantum chemical models

Marc N. Muniz *a, Cassidy Crickmore a, Joshua Kirsch a and Jordan P. Beck *b
aWestern Washington University, Bellingham, Washington 98225, USA. E-mail: Marc.muniz@wwu.edu
bConcordia University Wisconsin, Mequon, Wisconsin 53097, USA. E-mail: Jordan.beck@cuw.edu

Received 22nd January 2018, Accepted 22nd April 2018

First published on 24th April 2018


Abstract

Chemical processes can be fully explained only by employing quantum mechanical models. These models are abstract and require navigation of a variety of cognitively taxing representations. Published research about how students use quantum mechanical models at the upper-division level is sparse. Through a mixed-methods study involving think-aloud interviews, a novel rating task, and an existing concept inventory, our work aims to fill this gap in the literature and begin the process of characterizing learning of quantum chemistry in upper-division courses. The major findings are that upper-division students tend to conflate models and model components. Students, unlike experts, focus on surface features. Our data indicates two specific surface features: lexical features and a “complex equals better” heuristic. Finally, there is no correlation in our data between a student's facility with navigating models and their conceptual understanding of quantum chemistry as a whole. We analyze the data through the lens of a framework which enables us to cast model conflation as a problem of ontology.


Introduction

Many chemical and physical properties and processes can be fully explained only by employing quantum mechanical models. For example, the stability of the chemical bond arises from the exchange integral, a quantum mechanical concept with no classical analog. Yet, these models are quite abstract and require that learners navigate a variety of cognitively taxing representations. This is especially true in upper-division (i.e. typically a third-year or fourth-year course in the undergraduate curriculum) quantum chemistry, as relatively complex mathematical representations are integrated into the development and use of quantum chemical models (Dick-Perez et al., 2016; Mack and Towns, 2016). A relatively substantial body of work exists to characterize student understanding about quantum models at the secondary and general chemistry levels (Harrison and Treagust, 1996; Taber, 2002a, 2002b, 2005; Tsaparlis and Papaphotis, 2009; Dangur et al., 2014), and in physics courses (Styer, 1996; Singh, 2001; Cataloglu and Robinett, 2002; Zhu and Singh, 2012; Dini and Hammer, 2017; Marshman and Singh, 2017). Many examples of student difficulty in this domain have appeared in the literature across many levels and subdomains. For example, at the secondary level, Harrison and Treagust found that students have a tendency to conflate components and attributes of biological systems with those of atoms and molecules (e.g. cells divide and replicate and so should atoms) (Harrison and Treagust, 1996). Taber reports that some secondary-level students equate the term “shell” with “orbital” insofar as their physical meanings are concerned (Taber, 2005). Tsaparlis and Papaphotis found that students at the secondary level tended to retain ideas associated with the planetary and Bohr-type models of the atom despite making some gains with respect to thinking about the probabilistic nature of electron behavior in atoms (Tsaparlis and Papaphotis, 2009). 
While students, after an intervention, shifted their views toward a more scientifically normative viewpoint on the nature of orbitals, the authors found that students’ initial ideas about this content are quite resistant to change. At the honors secondary level and first-year undergraduate level, Dangur and coworkers found that students retain some components of a deterministic-type model of the atom while, at times, integrating scientifically normative (probabilistic) language about electron behavior in atoms (Dangur et al., 2014). This was particularly true of students they characterized as holding a “hybrid model” of the atom: a model containing both deterministic (e.g. electrons “orbit” the nucleus) and quantum mechanical (e.g. energy levels are quantized) assumptions.

Stefani and Tsaparlis conducted a phenomenographic study of second-year undergraduate students’ understanding of basic quantum chemical models and concepts (Stefani and Tsaparlis, 2009). The researchers categorized student responses to an interview protocol, which covered content ranging from atomic orbitals and the Schrödinger equation to simple molecular orbital theory. They found that even the students who performed most strongly on the interview expressed ideas about quantum chemical constructs (e.g. hybrid orbitals) that were not scientifically normative. Further, they explicitly identified fragmented knowledge as a barrier to students’ development of scientifically normative ideas about the theoretical underpinnings of atomic, molecular, and hybrid orbitals. In an earlier study, Tsaparlis used three years’ worth of students’ final exam data from an upper-division quantum chemistry course to illustrate that students struggle to construct their understanding across a wide variety of content including, but not limited to, orbitals (atomic and molecular), the orbital approximation, and a wide array of useful mathematical and symbolic-level representations (e.g. operators, atomic and molecular term symbols) (Tsaparlis, 1997). It is important to note that the data Tsaparlis analyzed in this work came exclusively from students who had passed the course. This supports the notion that even “high-performing” students encounter significant challenges when developing understanding of quantum chemical phenomena.

In the physics education literature, Singh identified three major ways in which upper-division undergraduate physics students struggled to “discriminate between related concepts” in a wide variety of contexts (including, but not limited to, quantum measurements and time dependence of states) (Singh, 2001). This involves the tendency of information to “run together” in such a manner that various systems and models cannot be appropriately deconvolved in students’ minds. More recently, Singh, in collaboration with Marshman, found that upper-division quantum physics students experienced challenges recognizing that the constructs of probability distribution and expectation value are separate but related (Marshman and Singh, 2017). At the same time, many students did not recognize that equivalent statements of the expectation value (one in Dirac notation, and one in traditional integral notation) are, indeed, equivalent.

To summarize, results from existing work on student ideas in quantum chemistry at the secondary and undergraduate levels as well as in quantum physics at the upper-division undergraduate level indicate that students have a tendency to make inappropriate associations between model or system components, or do not have a means of distilling useful information from prompts within quantum contexts. Prior work also suggests that students focus on surface-level features (e.g. “nucleus” in biology should mean the same thing as “nucleus” in chemistry) to draw conclusions about the systems under study. The work also indicates that students have strongly-held conceptions about the behavior of electrons and atoms, and that these deterministic or hybrid-type ideas are stable and difficult to change.

While more work has been done to characterize advanced students’ knowledge in quantum physics, comparatively little has been carried out to investigate how students navigate and use quantum chemical models at the upper-division level (with the work of Tsaparlis being a notable exception). This is unsurprising, given that the latest report from the U.S. National Academy of Sciences on discipline-based education research explicitly states: “…DBER on upper division and graduate courses is currently relatively limited…” (Council, 2012). Furthermore, the report states that “The majority of (studies on students engaging in scientific practices, such as modeling) involve only students majoring in the biological sciences, and it is much more common for these studies to take place in lower division courses than upper-division courses.” Our work aims to fill this crucial gap in the literature and begin the process of characterizing students’ learning of quantum chemistry in upper-division courses.

Models and the scientific practice of modeling play a crucial role in enabling learners to construct and appropriately use their knowledge to predict and explain quantum chemical phenomena. For this reason, we have chosen a model- and modeling-centered approach to our investigations. To guide this work, we posed the following research questions (RQs):

RQ1 How do upper-division undergraduate physical chemistry students develop and apply quantum mechanical models in the context of the hydrogen atom?

RQ2 How do upper-division undergraduate physical chemistry students select quantum mechanical models and model components to explain various quantum chemical phenomena? How does this compare to experts’ approach(es)?

RQ3 Are students’ abilities to select appropriate models or model components correlated with the strength of their conceptual understanding of quantum chemistry as a whole?

RQ2 was formulated based on the results from data collected to address RQ1. Likewise, RQ3 was formulated while data pertinent to RQ2 was being interpreted. Thus, while we will briefly describe the methodology insofar as it applies to the entire study, the results and discussion will be presented as a narrative that follows the progression of the development of RQs 1, 2, and 3.

Research methodology, participants, and settings

To address the principal research questions, we developed a small-scale mixed-methods study centered on the use of three distinct tools and methods (administered at the end of the term):

Tool 1 Think-aloud interviews (Bowen, 1994) in which participants were asked to construct a model of the hydrogen atom and use the model to address typical questions encountered in a quantum chemistry course (addresses RQ1).

Tool 2 A quantum chemical model rating task that characterizes whether and how students choose appropriate models, or model components, to solve a variety of quantum chemical problems (addresses RQ2, partly addresses RQ3).

Tool 3 A concept inventory, the Quantum Chemistry Concept Inventory (QCCI) (Dick-Perez et al., 2016), to determine students’ conceptual understanding of quantum chemistry in multiple contexts (partly addresses RQ3).

Complete versions of the think-aloud interview protocol and rating task are provided in Appendices 1 and 2. For both Tool 1 and Tool 2, students were provided with written prompts and a resource sheet. They were asked to read the prompts aloud and to respond both verbally and with written answers.

Participants were drawn from upper-division physical chemistry courses at a large, public regional comprehensive university in the Pacific Northwest (LRU), a small private liberal arts university in the Midwestern United States (SLA), and a large, public research university in the Midwestern United States (R1U) during the 2015–2016 and 2016–2017 academic years. Table 1 provides a summary of the number of participants from each institution. Some student participants completed more than one of these three tasks. A full grid delineating the number of unique students who completed each task is given in Appendix 3.

Table 1 Summary of the number of participants in the study, delineated by method/instrument and institution
Institution H-Atom protocol Rating task with (without) interview QCCI
LRU 4 10 (8) 13
SLA 6 4 (0) 8
R1U 3 4 (0) 12


Four experts were also recruited to engage in a think-aloud interview centered on the rating task. For the purpose of this study, experts are defined as individuals with a PhD in chemistry or closely related field and with at least two terms of experience teaching upper-division quantum chemistry. All participants provided informed consent and all research procedures were in compliance with the IRB standards set forth at each institution. For this report, students have been randomly assigned pseudonyms and gender pronouns.

Theoretical framework

The lens through which we have chosen to interpret the data is inspired by Michelene Chi's framework for novice/expert knowledge organization. We find that students often conflate bits and pieces of models with one another, or—in some cases—the models altogether, which we analyze in light of Slotta, Chi, and Joram's view that novices tend to misclassify content within a domain, i.e. that it is a problem of ontology (Slotta et al., 1995). The work of Chi, Feltovich, and Glaser provides strong evidence that novices tend to sort their knowledge in a domain based on surface-level or “first-order” features while experts tend to do so based on deep, structural, derived, or “second-order” features (Chi et al., 1981). Results from our think-aloud interviews centered on the model rating task led us to explore possible underlying reasons for the students’ difficulty in selecting appropriate models in the domain of quantum chemistry. The data contain several indications that students invoke models or model components for reasons that can very well be considered “surface level” and “first-order” as opposed to “deep” or “structural.” This led us to draw upon the perspective articulated by Chi and coworkers in order to more thoroughly explain the results of our work.

Results and discussion

Results pertinent to RQ1

The hydrogen atom think-aloud interview protocol is shown in its entirety in Appendix 1. Participants were provided with written prompts and a resource sheet and asked to respond to the prompts verbally and with written answers. The protocol was, roughly, separated into three segments:

1. Model construction: participants were prompted to draw upon a resource sheet in order to build a model of the hydrogen atom. They were prompted to consider how potential and kinetic energy were taken into account in their model.

2. Model application: participants were prompted to use their model to help explain several figures and diagrams.

3. Model extension: participants were simply asked to state what, if anything, they would change in order to model a helium atom as opposed to a hydrogen atom.

The research team (which consists principally of the two corresponding authors, but also includes the help of the two upper-division undergraduates listed as authors on this manuscript), using qualitative analysis software (Dedoose and Atlas.ti) as well as the comment feature of Microsoft Word, examined the transcripts of student responses for emergent patterns and themes by constructing memos and, eventually, preliminary codes. This approach is one that is traditionally used in qualitative research, particularly with methods whose principal aim is to categorize thematic elements in the data. Thus, our approach to the qualitative component of our data analysis can be best described as a form of thematic analysis (Braun and Clarke, 2006).

After multiple meetings and discussions among the researchers, it was determined that three general codes could be constructed from, and applied to, the existing data:

• Physical misinterpretation: occurred when a participant provided an improper explanation for a phenomenon, model, or model component itself.

• Reluctance to mathematize: occurred when a participant avoided drawing upon mathematical formulations as components of their model.

• Conflation: occurred when models or model components were conflated with one another or invoked in improper circumstances. There are two sub-codes associated with the parent code of “Conflation.”

∘ Explicit: the language used by the participant suggests that they are aware of their struggles in the context under consideration.

∘ Implicit: the language used by the participant suggests that they are unaware of their struggles in the context under consideration.

Application of these codes to the data set from academic year 2015–2016 revealed a number of problematic conceptions among students about the hydrogen atom system as well as a tendency to conflate various models and model components with one another or invoke them under improper circumstances. As these results are similar to those reported in the physics literature (McKagan et al., 2008; Baily and Finkelstein, 2010; Savall-Alemany et al., 2016), only a few, short representative quotes are provided below to demonstrate how these themes appeared in our data.

Theme of physical misinterpretation. The principal category of physical misinterpretations, despite the interviews having occurred after formal instruction about the quantum mechanical model of the hydrogen atom, is retention of determinism. A prime example of a physical misinterpretation that falls into this category is the following excerpt from Adrian's interview, which includes his sketch in Fig. 1.
[Fig. 1 image: c8rp00023a-f1.tif]
Fig. 1 Adrian's sketch of an electron interacting with a proton in a hydrogen atom model. The dashed lines are, in the quote below, assumed to be interpreted as a “path.” A probabilistic interpretation of the electron's behavior in this system is not provided.

Adrian (in response to the prompt about accounting for kinetic and potential energy in the model): “Um I didn’t really draw them (potential and kinetic energy) into my model, um, however the electron orbital could be taken as, um, the path where the potential energy is kinda leading…it's keeping the electron in its orbital and it's not straying too far from the path that it normally takes.”

Adrian's statements imply that the electron follows a particular path around the proton (i.e. it orbits the proton). This indicates that Adrian has yet to move beyond a deterministic interpretation of the electron insofar as it behaves in relation to the positively charged nucleus.

Theme of reluctance to mathematize. Accompanying the deterministic thought is a reluctance to draw upon appropriate mathematical formulations as components of the overall model of the atom. While mathematical constructs are simply parts of the larger construct that is the quantum mechanical model of the atom, it is quite important for a student of physical chemistry to link the underlying concepts to the expressions themselves. The centrality of mathematics in physical chemistry is emphasized in the description of physical chemistry given by the American Chemical Society (ACS, 2015): “Physical chemistry… contains mathematical models that provide quantitative predictions. Physical chemistry contains the mathematical underpinning to concepts…”. Thus, the mathematization of physical systems (including the hydrogen atom) is a focus of many quantum chemistry courses. Despite this, only four of the 13 students who completed the H-atom interview included any sort of mathematical formulation when faced with the first prompt of the interview, which asked them to “Use sketches, diagrams, and mathematical formulae to construct a model of the interaction between the electron and the proton in the hydrogen atom”. The written portions of these four students’ responses to this prompt are given in Fig. 2.
[Fig. 2 image: c8rp00023a-f2.tif]
Fig. 2 Responses to the initial prompt that included any mathematical components.

Of the depictions displayed in Fig. 2, Kris gives the most complete response, including the time-independent Schrödinger equation with the potential and kinetic energy operators separated out (panel A). Jamie (panel B) also included the Schrödinger equation, but mislabeled the kinetic and potential energy portions. Both Jodie (panel C) and Jessie (panel D) attempted to provide an expression for the potential energy operator. Jodie included the V(r) expression only after prompting from the interviewer.

Given the importance of mathematics in physical chemistry and that the prompt explicitly asked students to use mathematical formulae, it is notable that nine of the 13 students did not include any mathematical formulations. This is evidence that

• students do not consider mathematical formulae to be models or model components, and/or

• students are so uncomfortable with the mathematics of quantum chemistry that they choose to avoid it.

Our data provides evidence that both could be true. For the former, consider how Eddie responds to the first prompt:

Eddie: “It's kind of a hard question to answer cause when you draw a diagram, like kind of what you want to draw is like these two discrete particles with this like interaction where one orbits the other one… So it's, I guess that's hard to draw… Is it like in a wave function that has to like meet back up with itself at the end or like what… And then you how do you draw that? It's hard to pick one. I guess you just have to pick a model that satisfies the aspect of the system that you're trying to talk about at the time.”

Students’ reluctance to consider mathematical formulae to be models or model components has been observed at the general chemistry level (Becker et al., 2017; Brandriet et al., 2018). To our knowledge, this is the first direct evidence that the same reluctance occurs at the upper-division level in chemistry.

Our data also provides ample evidence for the second possibility: that students do not understand the mathematics and so avoid it when possible. For example, a student should be able to look at the terms in the Hamiltonian operator for the hydrogen atom and clearly state that the kinetic and potential energy operators are represented by the Laplacian and Coulombic terms, respectively. They should also understand that, once appropriate solutions to the differential equations arising from the Schrödinger equation for this system have been found, the wavefunctions (eigenfunctions) will be functions associated with the energies (eigenvalues) of particular states, but should not be conflated with the observable (energy, in this case) itself. A prime example of a participant who makes precisely this conflation is as follows:

Jodie (in response to the prompt about accounting for kinetic and potential energy in the model): “Potential and kinetic energy. I think those terms are going to be in the operator for the wavefunction of an electron.”

Interviewer: “When you say in the operator for the wavefunction, what do you mean?”

Jodie: “I guess the set of parameters of the wavefunction is used to measure the energy of the wavefunction. So the kinetic energy and potential energy are energy. So the functions for those are used to find the whole energy of the function, maybe the electron.”

Note that Jodie clarifies her statement about the “operator for the wavefunction” by alluding to the “set of parameters of the wavefunction.” The implication is that the parameters of the function that describes the state are used as a means to measure the observable (in this case, energy). While operating on a particular wavefunction with the Hamiltonian operator will certainly yield the total energy of the state, an expression that contains parameters of the function, it is clear that the participant has a great deal of difficulty explicitly focusing on the information contained within the Hamiltonian operator. It is this operator that, specifically, serves as the platform in quantum mechanics within which terms for kinetic and potential energy are built.
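To make the distinction at issue concrete, the standard textbook form of the hydrogen-atom problem (e.g. as presented in McQuarrie and Simon, 1997) separates exactly the two terms Jodie struggles to locate. Writing the time-independent Schrödinger equation with the electron mass m_e (strictly, the reduced mass):

```latex
\hat{H}\,\psi = E\,\psi, \qquad
\hat{H} = \underbrace{-\frac{\hbar^{2}}{2m_{e}}\nabla^{2}}_{\text{kinetic energy operator}}
\;\underbrace{-\;\frac{e^{2}}{4\pi\varepsilon_{0}\,r}}_{\text{potential energy operator}}
```

The eigenfunctions describe the states, and the eigenvalues are the energies; the kinetic and potential energy terms reside in the Hamiltonian operator, not in the “parameters of the wavefunction.”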

Theme of conflation in H-atom protocol interview data. Jodie's difficulty in focusing on the relevant features of the Schrödinger equation to develop a response to the prompt is of interest in its own right in the context of how students interpret the quantum chemical model they are attempting to assemble and use. What transpires immediately after the excerpt provided above requires a somewhat different characterization than merely a reluctance to mathematize:

Interviewer (in response to Jodie's statement ending in “…electron.”): “So is there a way to take that into account? In building up the model?”

Jodie: “In a model…I don’t know how to make a picture out of that. There was that thing with the box. I’m bad at pictures.”

Interviewer: “Does it necessarily have to be just a picture? Or…”

Jodie: “Well, it could be a visualization. I guess it could be the graph of particle in a box kind of thing. I still don’t really get that kind of graph anyway.”

In addition to Jodie's implication that a model must be a “picture” or a “visualization” (rather than recognizing that such pictures or visualizations are likely model components, accompanied by other model components such as mathematical expressions), she invokes a model (the particle in a box) that is not directly relevant to the system under consideration. It is clear that Jodie is conflating the particle in a box model with the hydrogen atom model. In this segment, Jodie does not indicate that she is consciously aware of this conflation. We have coded such instances as “implicit.” The implicit conflation code was one of the most robust themes that emerged from the data.

In addition to “implicit conflation,” there were several instances where students were aware of and able to verbalize their difficulty consolidating and appropriately applying several models or model components. Such statements were coded as “explicit”. One representative example is given below.

Avery: “Everything has been clouded up by so many other things. Um, I think this has to do with the energy and the Rydberg.”

Interviewer: “So when you say it's been clouded, what do you mean?”

Avery: “I mean like it's been so long since we talked about this emission spectrum thing that I don’t remember what all it's signifying.”

Results pertinent to RQ2

Given the emergence of conflation data from the H-atom think-aloud data, the research team was inspired to investigate the possibility of conflation with respect to models and model components across multiple content areas within the domain of quantum chemistry. Thus, our team constructed a novel model rating task as a tool to investigate research question 2: How do upper-division undergraduate physical chemistry students select quantum mechanical models and model components to explain various quantum chemical phenomena? How does this compare to experts’ approach(es)?

The full rating task and resource sheet are provided in Appendix 2. In short, participants were instructed to consider eight prompts drawn, or adapted, from two widely-used and prominent physical chemistry texts (McQuarrie and Simon, 1997; Atkins and De Paula, 2009). Rather than solve each problem, they were instructed to rate four quantum mechanical models or model components on a four-point Likert scale ranging from 1 (very unhelpful) to 4 (very helpful) in terms of how helpful they would be for solving the problem. Each model or model component was to be considered independently. So, for example, on any given prompt each of the four choices could be rated a 1 if the participant believed each choice to be very unhelpful in addressing the prompt.

The rating task was administered to 18 student participants as a think-aloud interview during the 2016–2017 academic year. Additionally, some participants at the LRU completed the rating task without going through the interview process. All responses were transcribed and coded. While the research team analyzed responses to all of the prompts, only three of these eight questions will be discussed here as they provide the richest data in terms of model conflation. These three prompts and models to rate are provided in Table 2.

Table 2 Rating task items discussed in this manuscript. The full task is given in Appendix 2. Note that item 6 is represented here with a prompt description, since sufficient text was borrowed from problem 3–6 on p. 97 of McQuarrie and Simon, 1997
Item number Prompt Models/model components to rate
1 Atoms in a chemical bond vibrate around the equilibrium bond length. Show that the average displacement from the equilibrium bond length of a chemical bond equals zero and is independent of vibrational energy. a. Spherical harmonics
b. Rigid rotator model
c. Harmonic oscillator model
d. Morse potential energy curve
3 The “spacings” between lines in a pure microwave spectrum of HBr are not equidistant. Why? a. Rigid rotator model
b. Anharmonic oscillator model
c. Centrifugal distortion term
d. Spin selection rules
6 Prompt description: Given the length of a linear, conjugated molecule (hexatriene), predict the HOMO to LUMO transition energy. a. Particle in a box model
b. Particle on a ring model
c. Harmonic oscillator model
d. Free particle model
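For orientation, item 6 invokes the classic free-electron (particle in a box) treatment of a conjugated π system. Under the usual textbook assumptions (six π electrons filling the levels pairwise, so the HOMO is n = 3 and the LUMO is n = 4, with the box length L set by the length of the conjugated chain), the target expression is:

```latex
\Delta E_{\text{HOMO}\rightarrow\text{LUMO}}
= \frac{h^{2}}{8 m_{e} L^{2}}\left(n_{\text{LUMO}}^{2} - n_{\text{HOMO}}^{2}\right)
= \frac{7h^{2}}{8 m_{e} L^{2}}
```

It is this mapping of the physical system onto the model that makes the particle in a box the appropriate choice among the four options.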


Four physical chemistry experts—all of whom have multiple years of experience teaching upper-division quantum chemistry—engaged in think-aloud interviews with the rating task. The experts’ ratings, coupled with their interview transcripts, served as a baseline against which to compare the ratings and statements of student participants. Accordingly, the expert interview transcripts were analyzed for general themes. As expected, the experts gave fluid, scientifically normative responses that focused on deep, structural features of the knowledge domain; these responses are summarized in Table 3.

Table 3 Expert responses to the rating task items
Item number (topic) Expert responses
1 (Vibration) • Harmonic oscillator is helpful
• Spherical harmonics are unhelpful
• Clearly articulated the physical systems of motion involved in the model/model components
3 (Microwave spectroscopy) • Focused on rotational motion
• Described the physical system – rotating diatomic
6 (Hexatriene) • Described the physical system – linear, conjugated molecule


It is well documented in the literature that experts approach problems and organize knowledge in ways that are quite different from novices (Chi et al., 1981; Daley, 1999; Smith et al., 2013; Irby et al., 2016). The results presented here certainly adhere to the literature in these areas. They also support the notion that learners, as novices, seldom organize their knowledge in a domain based on deep, structural characteristics of that domain.

The rating task think-aloud interviews produced a wealth of data that not only uncovered the model conflation we sought to “tease out” across various content areas in the quantum chemistry curriculum, but also illustrated a tendency of students to attend to superficial aspects of models and model components. Below, we present the results along the lines of both conflation and fixation on surface features.

The codes that resulted from, and were applied to, this data are:

• Conflation: occurs when models or model components are conflated with one another or invoked in improper circumstances. This code was typically applied when there was a departure from the expert majority response with respect to a particular model or model component on a given item. For this analysis, the actual ratings were dichotomized, i.e. ratings of 1 and 2 were collapsed into a single category called “unhelpful” and ratings of 3 or 4 were collapsed into a single category called “helpful”. In addition to divergence from the expert position described above, there must be evidence in the students’ statements that the rating is not simply due to “not knowing.” Since explicit statements of conflation did not arise within the data collected for this task, there was no need to re-invoke the child codes (explicit vs. implicit) in this analysis.

• Fixation on surface feature: this theme came about due to the tendency of student participants to focus on superficial aspects of models and model components as opposed to their underlying predictive/explanatory power or relevance to the problem. This theme emerged from the data in two general ways, and led to the development of the following child codes:

∘ Lexical: participant focused on a surface feature that was due to the language used to describe the model or model component. The language was interpreted in such a way that it is superficial and unrelated to the utility of the model or model component to address the problem.

∘ Complex = better: participant automatically assumed that a model that was either introduced later in the sequence, or that shares certain structural similarities with a more appropriate model, is better suited to solving the problem under consideration.

Theme of conflation in rating task interview data. The data from the rating task interviews yielded many instances of model conflation. This is not surprising given that this task was designed to investigate such confusion. Only a few representative quotes are provided to demonstrate the types of student statements that were coded in this category.

A prime example of conflation is provided by Addison in response to Item 1 (vibration):

Addison: “Okay, so. Spherical harmonics describe the motion for a chemical, a chemical bond vibrating around the equilibrium bond length; that will be the, um, part of the wavefunction that describes the motion going back and forth, so I ranked it ‘four,’ very helpful.”

Here, Addison has conflated the rigid rotator wavefunctions (spherical harmonics) with those of the harmonic oscillator.

Ryan, in response to Item 3 (microwave spectroscopy) constructs a response that conflates molecular rotation and vibration:

Ryan: “A better model would be the anharmonic model because it does illustrate that just the spectrum, the lines of the spectrum going from lower to higher energy are, they get closer together. It's not constant. And, I guess the centrifugal distortion sort of explains that, it's the correction for the anharmonicity.”

Note, here, that the conflation of rotation and vibration is with respect to the corrections to ideal motion. Ryan believes that the unequal spacing in the pure microwave (rotational) spectrum can be accounted for by an anharmonic model (vibrational) and then goes on to express centrifugal distortion as a “correction for the anharmonicity.”

As a final representative example of conflation, consider Brooklyn's description of the model(s) that would be most appropriate to address Item 6 (hexatriene):

Brooklyn: “…I said that both the particle-on-a-ring and the harmonic oscillator would be equally helpful because, in reality, it's kind of a combination between the two. And the particle in a box. I mean, the particle in a box is kind of like a particle on a ring sort of, but less accurate.”

The most salient feature of this response, at first glance, is perhaps the combination of a model for internal motion (harmonic oscillator) with a simple model for electronic behavior (particle-on-a-ring). On further reading, however, one discovers another interesting statement in the excerpt: one suggesting that Brooklyn's choice of particle-on-a-ring over particle-in-a-box rests on the interpretation that the former is more complex than the latter. Judging one model as better than another because it is more complex, irrespective of its appropriateness to the system being considered, is arguably reliance on a type of surface feature: the student's own interpretation of the relative complexity of the models leads them to virtually ignore whether a model is appropriate with respect to symmetry, or even to the entity or type of motion under consideration.

This revelation naturally leads us into a discussion about surface features, which we have characterized as either related to a simplistic interpretation of the language associated with the model or model component (lexical) or, as in the case of Brooklyn above, the interpretation of the models or model components in such a way that increasing complexity is always better (complex = better).

The reliance of novice practitioners on surface features is well reported in the literature (Chi et al., 1981; Smith et al., 2013; Irby et al., 2016) and is not particularly surprising in this context. What is informative and interesting here is the pair of surface-feature classes that were overwhelmingly present in our data. These two classes, lexical and complex = better, are briefly discussed below.

Theme of surface features: lexical. Beginning with the first item of the rating task (focused on molecular vibration), we find evidence of students’ fixation on surface-level features. Consider the following excerpt from Robin's response:

Robin: “I put, uh “a” (spherical harmonics) and “c” (harmonic oscillator) both as very helpful for solving the problem. Um, because if you’re dealing with bond vibration, you’re going to be looking at uh, um, you’re going to be looking at harmonics.”

The term “harmonics” is applied by the participant in a manner that is far too broad: he believes that, simply because “harmonics” appears in the name of the model or model component, it must be useful for solving this particular problem.

While the incidence of the surface feature (lexical) code was quite high on the first item, it also appeared in other contexts. For example, consider Sam's response to the item about the microwave spectrum of HBr.

Sam: “Probably like it's not harmonic so if it was perfectly harmonic I think it would have the same distance… Centrifugal distortion term I think that's just like a correction to it being anharmonic… To explain it I think it's mostly because its anharmonic is why they are not equal.”

Sam focuses on harmonicity vs. anharmonicity to address this prompt. Presumably because he learned that the energy levels of a harmonic oscillator are evenly spaced, Sam may have created a heuristic that “harmonic means equal” and “anharmonic means unequal,” with no regard for the property being addressed. Sam approaches this question with a focus on the lexicon of the prompt (the equal spacing of spectral lines) and then applies the heuristic.

Theme of surface features: complex = better. We now turn our attention to surface-level features that are of a variety that we characterize as “complex = better.” Students who make their decisions based on this type of surface feature focus on model complexity as a proxy for model appropriateness: the more complex the model, the “better” it is for solving the problem.

One example of a student relying on this surface feature appeared in Brooklyn's response to Item 6 on the rating task (about the transition energy of hexatriene), as discussed above. Brooklyn refers to the particle-on-a-ring model as more accurate than particle in a box and, therefore, suggests that it is more appropriate for solving the problem. This is a prime example of a participant focusing on a “complex = better” type of surface feature.

As another example, consider Gwen's response to Item 3 on the rating task (about the microwave spectrum of HBr):

Gwen: “For choice A (rigid rotator model) I will give it a four. It is very helpful. Um um so so for choice B (anharmonic oscillator model) I will also rank it very helpful because ah according to the question HBr the spectrum is supposed to be equidistance from the original rotator model umm like the equation of it but actually it's not. So there must be something like to the model itself might not be exactly precise so there are some anharmonic terms which can also be added in to describe it. So we need to know that…”

Gwen correctly states that the rigid rotator produces spectral features with equidistant spacings. She then reasons that, since the actual spectrum is stated to have non-equidistant spacings, a more complex model is required. Finally, she concludes that anharmonic terms should be added to account for the original model being imprecise. It is this final conclusion that illustrates the complex = better surface feature. Much of Gwen's response is normative, but she eventually falls back on the idea that the more complex anharmonic oscillator model should be able to account for the microwave spectrum of HBr.
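The distinction Gwen misses (a rotational correction vs. a vibrational one) can be made concrete numerically. The sketch below, with illustrative constants roughly of HBr's magnitude rather than fitted spectroscopic values, shows that rigid-rotor transition lines at 2B(J + 1) are equally spaced, while the centrifugal distortion term, not anharmonicity, produces the observed shrinking spacing:

```python
# Rotational transition lines: the rigid rotor gives nu = 2B(J+1), i.e. equal
# spacing; adding the centrifugal distortion term, nu = 2B(J+1) - 4D(J+1)^3,
# makes the spacing decrease with J. B and D are illustrative values roughly
# of HBr's magnitude (cm^-1), not fitted spectroscopic constants.
B, D = 8.46, 3.5e-4  # cm^-1 (illustrative)

def line_position(J, distortion=True):
    """Position of the J -> J+1 rotational transition, in cm^-1."""
    nu = 2 * B * (J + 1)
    if distortion:
        nu -= 4 * D * (J + 1) ** 3
    return nu

rigid = [line_position(J, distortion=False) for J in range(6)]
distorted = [line_position(J) for J in range(6)]
rigid_gaps = [b - a for a, b in zip(rigid, rigid[1:])]
distorted_gaps = [b - a for a, b in zip(distorted, distorted[1:])]
# rigid_gaps are all exactly 2B; distorted_gaps shrink as J increases
```

No anharmonicity parameter appears anywhere: the decreasing spacing of a pure rotational spectrum is entirely a rotational effect.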

Results pertinent to RQ3

It is important to emphasize that our model rating task is, by and large, designed as a tool for qualitative investigation of students’ evaluations of the relative helpfulness of models for solving quantum chemical problems. Without extensive psychometric work to develop the rating task as an instrument, our strongest conclusions from its data come from the qualitative analysis (i.e. the discussion above). Nevertheless, our findings led us to the third and final research question that we present in this article.

To investigate RQ3 (are students’ abilities to select appropriate models or model components correlated with the strength of their conceptual understanding of quantum chemistry as a whole?) we administered the Quantum Chemistry Concept Inventory (QCCI)—an instrument for measuring conceptual understanding of upper-division quantum chemistry students (Dick-Perez et al., 2016). The QCCI is a 13-item multiple choice diagnostic assessment tool which was administered by Dick-Perez et al. to 140 quantum chemistry students in the United States and Canada as a post-test (i.e. post instruction). Each multiple choice question has only one correct, best answer. The question topics include approximations, bonding, the harmonic oscillator, spectroscopy, etc. Given that the QCCI is a multiple-choice and therefore quantitative instrument, we decided to compare the degree to which student participants deviated from the expert responses on the model rating task to their overall scores on the QCCI.

One simple way to view the model rating task data, in aggregate, is to dichotomize the results of both the student and expert participants (ratings of 1 and 2 were grouped into a general “unhelpful” category, while ratings of 3 and 4 were grouped into a general “helpful” category). For the purposes of this study, the scientifically normative response (helpful or unhelpful) for each model or model component was the response given by the majority of the four experts and is referred to as the “expert response.” There were three instances in which the experts did not reach consensus (i.e. two answered helpful and two answered unhelpful); in these cases no “expert response” was established.
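As a minimal sketch (not the authors' analysis code), the dichotomization and expert-majority rule described above can be expressed as:

```python
# Sketch of the dichotomization and expert-majority rule described above.
from collections import Counter

def dichotomize(rating):
    """Collapse a 1-4 rating: 1-2 -> 'unhelpful', 3-4 -> 'helpful'."""
    return "helpful" if rating >= 3 else "unhelpful"

def expert_response(expert_ratings):
    """Majority dichotomized response among the experts; None on a 2-2 split
    (no expert response established)."""
    counts = Counter(dichotomize(r) for r in expert_ratings)
    response, n = counts.most_common(1)[0]
    return response if n > len(expert_ratings) / 2 else None
```

For example, expert ratings of (4, 3, 3, 1) dichotomize to three “helpful” and one “unhelpful,” yielding an expert response of “helpful,” whereas (4, 4, 1, 2) is a 2-2 split and yields no expert response.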

Fig. 3 represents dichotomized responses for the student participants (novices).


Fig. 3 Dichotomized responses of novices to items 1, 3, and 6 of the rating task, respectively. U and H stand for “unhelpful” and “helpful.” The abbreviations for the models or model components are as follows: SH = spherical harmonics, RRM = rigid rotor model, HOM = harmonic oscillator model, MPEC = Morse potential energy curve, AOM = anharmonic oscillator model, CDT = centrifugal distortion term, SSR = spin selection rules, PBM = particle in a box model, PRM = particle on a ring model, FPM = free particle model. The expert response for each model or model component is indicated with an asterisk (*). No expert response was established for MPEC.

The plots in Fig. 3 give a quantitative overview of responses to items, and certain problematic responses by novices that were detailed in the qualitative results provided in the previous section can be readily observed. For example, a relatively high number of student participants characterized spherical harmonics as “helpful” in the first problem and a significant number characterized the anharmonic oscillator model and harmonic oscillator models as helpful in items 3 and 6, respectively.

The dichotomized representation of the data provides a relatively simple means of comparing, across the board, how student participants performed on the model rating task. We quantified each student's divergence from the expert responses as a “discrepancy,” calculated as follows:

• The novice and expert responses, in dichotomized form, were placed side by side in a spreadsheet.

• If the responses matched (i.e. the novice response matched the expert response), a score of “0” was assigned, indicating that there was no discrepancy between novice and expert on that particular model/model component for that item.

• If the responses did not match, a score of “−1” was assigned, indicating that there was a discrepancy between novice and expert on that particular model/model component for that item.

• The scores, 0's and −1's, were summed for each student participant across all models/model components and all items on the rating task.

These discrepancies provide us with a simple, yet descriptive, representation of how far from the expert position, as a whole, participants were in their assignment of “helpful” or “unhelpful” ratings to the models and model components.
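The scoring procedure in the bullets above can be sketched as follows (an assumed implementation for illustration, not the authors' code; the model abbreviations follow Fig. 3):

```python
# Discrepancy scoring: 0 for each novice/expert match, -1 for each mismatch,
# summed over all models/model components; items with no established expert
# response are skipped.
def discrepancy_score(novice, expert):
    """novice, expert: dicts mapping a model/component label to a
    dichotomized rating, 'helpful' or 'unhelpful' (None = no consensus)."""
    score = 0
    for key, expert_resp in expert.items():
        if expert_resp is None:
            continue  # 2-2 expert split: no expert response established
        if novice.get(key) != expert_resp:
            score -= 1
    return score

# e.g. a novice who disagrees with the experts only on spherical harmonics
expert = {"SH": "unhelpful", "HOM": "helpful", "MPEC": None}
novice = {"SH": "helpful", "HOM": "helpful", "MPEC": "helpful"}
score = discrepancy_score(novice, expert)  # -1: one mismatch, MPEC skipped
```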

Table 4 provides descriptive statistics of the model rating task discrepancies and the QCCI scores.

Table 4 Descriptive statistics (average and standard deviation) for the 21 students who completed both the rating task and the QCCI
                     Rating task discrepancy    QCCI score
Average                      −8.7                   7.8
Standard deviation            3.2                   2.7


The range of possible discrepancy values was 0 (perfect agreement with expert positions) to −29 (opposite of expert position on each part). The actual range of student responses was −3 to −15 with an average of −8.7. To provide a reference point, the expert responses were coded for discrepancies against the majority position in the same manner as the student responses. The four experts each had a discrepancy of either −1 or −2 with an average discrepancy of −1.8.

One further level of analysis can be applied to the student discrepancy data by categorizing the types of discrepancies. An expert response was established for 29 models and model components on the rating task. Of those, 11 were classified as helpful and 18 as unhelpful. Thus, assuming that students are equally likely to disagree with experts on helpful and unhelpful models, we would expect approximately 38% (11/29) of student discrepancies to be of the type where students rated a model unhelpful when the experts rated it helpful. Conversely, approximately 62% (18/29) would be expected to be of the other type: students rated a model helpful when the experts rated it unhelpful. The student data bears out this simple assumption: 35% of all discrepancies were of the former type and 65% of the latter. Without more rigorous statistical analysis it is not possible to make a definitive statement, but students do appear equally likely to disagree with experts in the helpful and unhelpful directions. This was initially surprising to the research team because, anecdotally, it seemed as though students were very eager to rate a model as helpful. This researcher bias was likely due to a strong focus on a few key models that were intentionally included as unhelpful models.
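The expected split is simple arithmetic on the counts given above:

```python
# Expected shares of the two discrepancy types, assuming equal likelihood of
# disagreement on either type: 11 of the 29 expert responses were "helpful"
# and 18 were "unhelpful".
n_helpful, n_unhelpful = 11, 18
total = n_helpful + n_unhelpful

# experts said helpful, students said unhelpful
exp_type1 = n_helpful / total    # about 0.38
# experts said unhelpful, students said helpful
exp_type2 = n_unhelpful / total  # about 0.62
```

The observed shares (35% and 65%) fall close to these expected proportions.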

The discrepancies, taken alone, do not provide us much further insight with respect to our third research question, which aims to determine if there is a relationship between students’ conceptual knowledge in the domain of quantum chemistry and their abilities to choose appropriate models or model components to address problems in this domain. For this reason, we collected student responses to the Quantum Chemistry Concept Inventory (QCCI)—which, as stated earlier, is an instrument that assesses students’ conceptual understanding of quantum chemistry.

At first glance, the mean QCCI score of our student participant population (cf. Table 4) is slightly above the 6.9 ± 2.7 post-term score reported by Dick-Perez et al. in their original paper about the QCCI (Dick-Perez et al., 2016). However, the difference between the two means is not statistically significant (two-tailed t-test, p = 0.16). Thus, the performance of our cohort on this task compares reasonably well to that of the broader sample of Dick-Perez and coworkers.
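The reported comparison can be reproduced approximately from the summary statistics alone. The sketch below uses a pooled two-sample t statistic (the 6.9 ± 2.7, n = 140 values are from Dick-Perez et al.; 7.8 ± 2.7, n = 21 are from Table 4) and, as an assumption of this sketch, a large-sample normal approximation for the p-value rather than the exact t distribution:

```python
# Pooled two-sample t-test from summary statistics; p-value via a
# standard-normal approximation, which is adequate here (df = 159).
import math

def two_tailed_p(m1, s1, n1, m2, s2, n2):
    # pooled standard deviation
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    # two-tailed p = 2 * P(Z > |t|), via the complementary error function
    return math.erfc(abs(t) / math.sqrt(2))

# our cohort (7.8 +/- 2.7, n = 21) vs. the QCCI validation cohort
# (6.9 +/- 2.7, n = 140; Dick-Perez et al., 2016)
p = two_tailed_p(7.8, 2.7, 21, 6.9, 2.7, 140)
# p comes out near the reported 0.16, i.e. not significant
```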

These statistics provide a simple overview of the data and lead us, tentatively, to conclude that, despite moderately strong conceptual knowledge of quantum chemistry, students struggle to a great extent when it comes to choosing appropriate models or model components to approach quantum chemical problems. Fig. 4 illustrates the lack of correlation between rating task and QCCI scores.


Fig. 4 QCCI vs. rating task discrepancy scores. Some of the student participants quoted in the manuscript are indicated to give a sense of the sampling of representative quotes.

Note, in Fig. 4, that there is no correlation between the rating task discrepancies and the QCCI scores (r² = 0.02). Moreover, the range of rating task discrepancies for a given QCCI score is quite large. For instance, four participants attained a near-perfect score (12 out of 13) on the QCCI; among these participants, however, the rating task discrepancies ranged from −6 (near the best) down to −13 (near the worst). These results support the notion that strong conceptual knowledge within a particular domain does not necessarily imply a robust understanding of how to choose and apply appropriate models to solve a problem.
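The r² reported in Fig. 4 is the Pearson coefficient of determination. A minimal, self-contained version is sketched below; the paired scores in the example are hypothetical illustrations, not the study's data:

```python
# Pearson coefficient of determination (r^2) for paired scores.
def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# hypothetical (rating task discrepancy, QCCI score) pairs for illustration;
# these are constructed so the discrepancy carries no linear information
# about the QCCI score
discrepancy = [-3, -6, -9, -12, -15]
qcci = [8, 11, 5, 11, 8]
weak = r_squared(discrepancy, qcci)  # essentially zero for these pairs
```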

Discussion

The perspective we adopt to analyze the results is that of Slotta, Chi, and Joram, which casts issues of model conflation and confusion as a problem of ontology (Slotta et al., 1995). In their work, which focused on learners of basic physics, novices experienced difficulty appropriately categorizing the types of problems encountered in physics. This “miscategorization” means that information is not aligned with the scientifically normative structure of knowledge; further meaningful inquiry is therefore inhibited until the learner is guided to categorize the knowledge appropriately.

While the work of Chi et al. (Chi et al., 1981; Slotta et al., 1995) focuses on how students categorize knowledge about basic classical physics, we can readily extend the explanatory power of their perspective to our work. That is, we wish to make the claim that the model conflation we observe in our data can be explained as a miscategorization of models within the quantum chemical context.

In order to facilitate the use of this framework, we present a basic diagram that lays out part of the knowledge structure of quantum chemistry at the undergraduate level in Fig. 5.


Fig. 5 Hierarchical diagram of the domain of quantum chemistry, as it is typically taught at the undergraduate level. Select models and model components have been inserted for illustrative purposes.

The information contained within Fig. 5 presents the domain of quantum chemistry as an area of study that centers on the application of quantum mechanics to atoms and molecules. It also provides the broad categories into which models (and their components) can, ultimately, be placed and provides some relevant models and model components.

Upon careful examination of Fig. 5, one can begin to notice where students can potentially experience difficulties with the appropriate categorization of models or model components and, thus, their scientifically normative application. To further illustrate this, we will briefly recap some of the findings from the H-atom protocol and the model rating task think-aloud interviews with this perspective.

With the H-atom protocol, participants at times spoke quite candidly about their difficulty categorizing and organizing the models and model components. We presented representative quotes that were coded as “explicit conflation” and illustrated participants “grasping at straws” (i.e. invoking the model they could best remember which, in one prominent case, was the particle-in-a-box model). These findings fit neatly into an interpretation that is based on miscategorization of models and model components. In the case of the H-atom protocol, this manifests itself in a manner that renders the participants unable to choose, or select, the appropriate models and model components to successfully predict and explain the behavior of the hydrogen atom.

In the case of the model rating task, which was explicitly created to investigate whether and how participants choose appropriate models and model components, the applicability of the framework of Chi et al. and Slotta et al. is even more apparent. In response to Item 1 (vibration), a sizable portion (42.3%) of participants chose spherical harmonics as an appropriate model component, ostensibly because of the presence of the term “harmonics.” In addition to being a prime example of fixation on a lexical surface feature, this is clearly an indication of miscategorization. While the participants would be right to focus on simple harmonic behavior, their failure to attend to the model component as a whole (“spherical harmonics”) causes them to mistakenly ascribe utility to it as a tool for predicting and explaining molecular vibration. The participants have categorized their knowledge based on this lexical surface feature (the term “harmonics”) rather than on the appropriateness of the model component for the type of motion under consideration.

In Item 3 (microwave spectroscopy), responses tended to conflate vibrational and rotational motion. In fact, a rather large majority (76%) of participants rated the anharmonic oscillator as helpful (see Fig. 3)! Participants generally recognized that some sort of perturbation to an ideal system was necessary to account for the uneven spacing. However, they did not make the connection between the type of motion under consideration and, therefore, the appropriate model or model component to explain that spacing. This can readily be cast as a miscategorization: the type of internal motion is initially assumed to be vibrational rather than rotational, and the anharmonic oscillator model is then substituted for the centrifugal distortion term. The type of motion under consideration has been miscategorized altogether.

For Item 6 (hexatriene) on the rating task, nearly a third of student participants (31%) rated the harmonic oscillator as “helpful.” This constitutes a clear miscategorization within the ontological framework of basic quantum chemistry: the system that is under consideration is a linear conjugated polyene; thus, a one-dimensional particle-in-a-box model would be appropriate to describe the pi electronic structure of the molecule. Instead, this fraction of student participants assigned weight to a model that is specifically designed to predict and explain vibrational motion of molecules (in the quantum chemistry context). Thus, this is a clear example of invoking a model whose target physical system is fundamentally different from the system under study.
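The normative choice for Item 6 can be made concrete with a back-of-the-envelope particle-in-a-box estimate. This sketch is not from the article; the box length used below is an assumption of roughly the length of the conjugated chain, in the style of textbook treatments of hexatriene:

```python
# 1-D particle-in-a-box estimate of hexatriene's pi -> pi* transition.
# Six pi electrons fill n = 1, 2, 3; the HOMO -> LUMO transition is
# n = 3 -> 4, so dE = (4^2 - 3^2) h^2 / (8 m L^2) = 7 h^2 / (8 m L^2).
h = 6.626e-34   # Planck constant, J s
m = 9.109e-31   # electron mass, kg
c = 3.0e8       # speed of light, m/s
L = 7.3e-10     # assumed box length for hexatriene, m (rough estimate)

dE = 7 * h**2 / (8 * m * L**2)  # transition energy, J
wavelength = h * c / dE         # predicted absorption wavelength, m
# lands in the ultraviolet (a few hundred nm), qualitatively sensible for
# a linear conjugated polyene
```

The harmonic oscillator, by contrast, targets nuclear vibrational motion and offers no route to an electronic transition energy for this system.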

Our findings are, ultimately, consistent with what Slotta, Chi and coworkers have found in the context of basic physics both with respect to the problem of ontology and the novice-expert divide in knowledge organization. Novices tend to miscategorize models and model components insofar as how they predict and explain the physicochemical system or phenomenon under study. Novices tend to rely on, and approach problems based on, surface features, contributing to the difficulties in coupling models/model components with a given phenomenon. Experts, meanwhile, are able to almost immediately identify the type of phenomenon under study, categorize it, and assign high utility to models and/or model components that are appropriate for the system at hand.

Study limitations

The two principal limitations in our study are as follows:

1. Our sample size is relatively small.

2. Psychometric analyses have not been conducted on the model rating task in the manner they were for the QCCI.

With regard to the first limitation, it is worth noting that upper-division undergraduate courses in chemistry do not have population sizes on the same order of magnitude as lower-division (e.g. general chemistry or organic chemistry) courses. Despite this, we have collected a relatively large amount of rich qualitative data and have reported several emergent themes, and an interpretation of these themes, in the sections above. The themes were remarkably consistent across multiple institutions, which is an indication of the reliability of our approach. Furthermore, the majority of our quantitative data (e.g. frequencies of unhelpful vs. helpful responses to each model or model component in the model rating task) was reported to support the qualitative results based on the think-aloud interviews. The exception to this was in our correlation of the rating task and the QCCI instrument, which naturally leads to our second limitation.

The model rating task was, principally, designed to gain deeper qualitative insight into how students assigned priority to various models and model components in the context of quantum chemistry. Descriptive values were compiled and used in a manner that supplemented the qualitative analysis. At the same time, we calculated a total score in the form of a total discrepancy from expert consensus. We used these values to determine if there is a correlation between this deviation and student participant scores on the QCCI. However, despite the disparity between the psychometric work performed on the QCCI and the lack thereof for the model rating task, we propose that the wealth of qualitative data we have collected about the model rating task lends itself to the face validity and, to some extent, the content validity of its results. That is, it serves as an adequate means of assessing students’ abilities to choose models or model components that are appropriate for physicochemical systems and phenomena.

Implications for research and future work

Our work has provided some insight into the ways in which upper-division chemistry students apply quantum mechanical models and model components to the hydrogen atom, as well as how they prioritize certain models over others when approaching various problems. Furthermore, it has provided some evidence that one's ability to choose appropriate models and model components to solve a problem is at least somewhat independent of one's ability to understand a concept or solve a problem within a narrowly-framed context. We have a couple of suggestions for future CER work in this area:

1. Develop protocols for other physicochemical systems (e.g. vibrating or rotating molecules) and conduct think-aloud interviews to investigate how students engage in the modeling process in these contexts.

2. Use a much larger sample size and psychometrics to rigorously validate, and improve, the model rating task.

We expect that these two approaches will contribute further to our small, but growing, literature base for student learning in upper-division chemistry environments. We are currently collecting data on a harmonic oscillator protocol, which is constructed in a similar fashion to the H-atom protocol from the current work.

Implications for teaching

Perhaps the strongest implication for teaching practice in upper-division quantum chemistry is to guide students to categorize their knowledge about physicochemical systems and phenomena such that the appropriate models and model components are invoked when solving problems about those systems. Slotta and Chi's description of ontology training in the context of introductory physics is a possible starting point (Slotta and Chi, 2006). While their work focuses on a vastly different ontology, the principles can readily be ported to our context: explicitly guiding students to classify physicochemical systems and phenomena, and to classify the appropriate models. How this looks in practice will, of course, vary. A series of scaffolded writing prompts encouraging students to reflect on their knowledge structures is one possibility. Another is to build the categorization directly into problems as students learn the content (e.g. prompts that ask “what is the particle under study?”; “what is the ‘box’ in the context of particle-in-a-box?”; “why do you think the 1-D particle-in-a-box is an appropriate model for this system while the harmonic oscillator is not?”). Yet another avenue is to have students build their own ontological map, similar to Fig. 5, as they progress through the quantum chemistry curriculum. This approach is similar to concept mapping (Starr and Parente de Oliveira, 2013) and is meant to evolve gradually to encompass the entire domain as studied in an undergraduate course.

Concluding statement

Through the use of three distinct research tools, we sought to address three research questions designed to fill a crucial gap in the chemical education literature regarding how upper-division chemistry students develop and use models and model components. The evidence suggests that physical chemistry students ascribe utility to models that experts consider inappropriate for certain problems. Upper-division chemistry students are reluctant to invoke mathematical models and rely on surface features that we described as lexical and “complex = better.” It is helpful, we believe, to view students’ difficulties in terms of ontology; such a framework offers instructors some guidance on how to address the challenges revealed by the data above. Finally, there is no correlation in our data between a student's facility with navigating models and their overall conceptual understanding of quantum chemistry.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: hydrogen atom interview protocol and resource sheet (Tool 1)

Interview protocol (H-atom)

Consider a hydrogen atom:

a. Using your resource sheet, use sketches, diagrams, and mathematical formulae to construct a model of the interaction between the electron and proton in the hydrogen atom.

b. Should the electron fall into the nucleus? Why or why not? Justify whether or not this has been taken into account in your model.

c. How do potential and kinetic energy play a role in your model (if at all)? Explain.

d. Address the question: “How far away from the nucleus is the electron in the hydrogen atom?” Draw upon your model to support your response.

e. Using your model, explain the following emission spectrum obtained for a sample of hydrogen:

(Image of the hydrogen emission spectrum was inserted here.)

f. Kwame and Jean, upon observing the data presented above, disagree about the nature of the specific lines observed in the hydrogen emission spectrum. Kwame suggests that the individual lines arise from the emission of a photon coinciding with an electron transitioning from a higher energy state to a lower energy state. Jean argues that the lines occur due to the emission of an electron as the atom moves from a higher energy state to a lower energy state. Whose response do you most agree with? Use what you have developed in parts a and b to support your response.

g. Kenya and Jose all (sic) examine the following images representing the 2p orbital of the hydrogen-like atom.

(Images representing the 2p orbital of the hydrogen-like atom were inserted here.)

• Jose argues that both images represent the electron occupying an orbital. The image on the left shows exactly the shape of the orbital as a “container” of the electron. The image on the right shows how the electron bounces around in that container and how often it makes impact with the outer boundaries of the orbital.

• Kenya disagrees and suggests that the orbital is the square of a wave function that describes the behavior of the electron, and represents a probability density. The image on the left shows an isosurface of the probability density function, and contains some percentage of the likelihood of finding the electron. The image on the right is a dot density diagram, in which the density of the dots represents the likelihood of finding the electron.

Whose response do you most agree with? Use your model to support your response, as well as the images the two students looked at to make their claims.

h. What (if anything) about your model would need to change if you were to add additional nucleons and an electron to make a helium atom? Explain your answer.

image file: c8rp00023a-u3.tif

i. The plots on the left and right are the 1s orbital probability density and radial probability distribution functions, respectively, vs. the distance of the electron from the nucleus (r). Use your model and the information contained within both plots to explain the physical significance of the Bohr radius (a0 = 5.29 × 10−11 m or 0.529 Å).
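As a check on the expected answer to item i (not part of the original protocol), the 1s radial probability distribution r2R1s(r)2 peaks exactly at the Bohr radius; a minimal SymPy sketch:

```python
import sympy as sp

r, a0 = sp.symbols('r a0', positive=True)

# Unnormalized 1s radial distribution function: P(r) = r^2 * R_1s(r)^2,
# with R_1s(r) proportional to exp(-r/a0)
P = r**2 * sp.exp(-2*r/a0)

# The maximum of P(r) is the most probable electron-nucleus distance
critical_points = sp.solve(sp.diff(P, r), r)
r_mp = [s for s in critical_points if s != 0][0]
print(r_mp)  # a0: the most probable 1s radius is the Bohr radius
```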

Resource sheet (H-atom protocol)

Useful formulae:
image file: c8rp00023a-t1.tif
(Image of table containing hydrogenlike wavefunctions up through n = 2 from table 6.5 on p. 208 of McQuarrie and Simon, 1997, was inserted here.)

Useful physical constants and additional information:

ħ = 1.055 × 10−34 m2 kg s−1

c = 3.0 × 108 m s−1

RH = 1.097 × 107 m−1

a0 = 5.29 × 10−11 m or 0.529 Å

image file: c8rp00023a-t2.tif

Appendix 2: rating task and resource sheet (Tool 2)

Quantum chemistry rating task

For each of the following problems, rate the utility of the particular model (or model component) as either:

1. Very unhelpful for solving the problem

2. Unhelpful for solving the problem

3. Helpful for solving the problem

4. Very helpful for solving the problem

1. Atoms in a chemical bond vibrate around the equilibrium bond length. Show that the average displacement from the equilibrium bond length of a chemical bond equals zero and is independent of vibrational energy.

a. Spherical harmonics _____

b. Rigid rotator model _____

c. Harmonic oscillator model _____

d. Morse potential energy curve _____
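For reference, the result item 1 asks for can be verified symbolically: the average displacement vanishes for every harmonic oscillator level because the integrand ψv2·x is odd. A minimal SymPy sketch (the scale factor α is set to 1 here, which does not affect the conclusion):

```python
import sympy as sp

x = sp.symbols('x', real=True)

# Harmonic oscillator eigenfunctions: psi_v(x) ∝ H_v(x) exp(-x^2/2);
# normalization constants cancel in the ratio <psi|x|psi> / <psi|psi>
results = []
for v in range(4):
    psi = sp.hermite(v, x) * sp.exp(-x**2 / 2)
    avg_x = (sp.integrate(x * psi**2, (x, -sp.oo, sp.oo))
             / sp.integrate(psi**2, (x, -sp.oo, sp.oo)))
    results.append(sp.simplify(avg_x))
print(results)  # [0, 0, 0, 0]: <x> = 0 independent of vibrational level
```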

2. Calculate the most probable radius for an electron in the 2s state of a He+ ion.

a. Hydrogen atom model _____

b. Associated Laguerre polynomials _____

c. Variational principle _____

d. Slater determinant _____
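As a numerical check on item 2 (in reduced units with a0 = 1, so the printed value is in units of the Bohr radius), the most probable radius follows from maximizing the 2s radial distribution with Z = 2; a SymPy sketch:

```python
import sympy as sp

r = sp.symbols('r', positive=True)
Z, a0 = 2, 1  # He+ ion; r is measured in units of the Bohr radius

# Unnormalized 2s radial function for a hydrogenlike ion
R20 = (2 - Z*r/a0) * sp.exp(-Z*r / (2*a0))

# Radial probability distribution; its largest stationary point is the
# most probable radius (smaller roots are the node and an inner maximum)
P = r**2 * R20**2
r_mp = max(sp.solve(sp.diff(P, r), r))
print(r_mp, float(r_mp))  # 3/2 + sqrt(5)/2 ≈ 2.62 a0
```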

3. The lines in a pure microwave spectrum of HBr are not equally spaced. Why?

a. Rigid rotator model _____

b. Anharmonic oscillator model _____

c. Centrifugal distortion term _____

d. Spin selection rules _____
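The non-equidistant lines in item 3 follow directly from the centrifugal distortion term: line positions go as ν(J→J+1) = 2B̃(J+1) − 4D̃(J+1)3, so successive gaps shrink as J grows. A short numerical sketch (B̃ and D̃ are illustrative values of roughly the right magnitude for HBr, not fitted constants):

```python
# Pure rotational (microwave) line positions for J -> J+1 transitions:
# nu(J) = 2B(J+1) - 4D(J+1)^3, from E_rot(J) = B J(J+1) - D J^2 (J+1)^2
B, D = 8.46, 3.7e-4  # cm^-1; illustrative magnitudes for HBr

positions = [2*B*(J + 1) - 4*D*(J + 1)**3 for J in range(8)]
spacings = [b - a for a, b in zip(positions, positions[1:])]
print(spacings)  # each gap is slightly smaller than the previous one
```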

4. Calculate the Hückel pi-electron energies of benzene.

a. Slater determinant _____

b. Secular determinant _____

c. Harmonic oscillator model _____

d. LUMO _____
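For reference, the Hückel energies item 4 asks for can be obtained by diagonalizing the 6-site ring Hamiltonian; a NumPy sketch in reduced units (α = 0, β = −1, so the orbital energies are α + xβ with x the printed eigenvalues):

```python
import numpy as np

# Hückel pi system of benzene: H = alpha*I + beta*A, with A the
# adjacency matrix of a 6-membered ring
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1

x = np.sort(np.linalg.eigvalsh(A))
print(x)  # [-2 -1 -1 1 1 2]: energies alpha±2beta and alpha±beta (doubly)
```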

5. Consider the following plot of the molecular potential energy curve for the H2+ molecular ion, along with the Coulomb (J) and exchange (K) contributions.

image file: c8rp00023a-u4.tif

Which of the contributions (J or K) is responsible for the stability of the chemical bond?

a. Morse potential energy curve _____

b. Radial probability distribution functions _____

c. Linear variational principle _____

d. Overlap integrals _____

6. The length of hexatriene can be estimated to be 867 pm. Show that the first electronic transition is predicted to occur at 2.8 × 104 cm−1.

a. Particle in a box model _____

b. Particle on a ring model _____

c. Harmonic oscillator model _____

d. Free particle model _____
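A quick check of the transition energy asserted in item 6, under the particle-in-a-box model (the intended answer here): hexatriene's six pi electrons fill n = 1 through 3, so the first transition is n = 3 → 4. A short numerical sketch using standard constants:

```python
# Constants (SI, rounded)
h = 6.626e-34    # Planck constant, J s
m_e = 9.109e-31  # electron mass, kg
c = 2.998e10     # speed of light in cm/s, to get wavenumbers directly
L = 867e-12      # box length from the problem statement, m

# HOMO -> LUMO transition for 6 pi electrons: n = 3 -> n = 4
dE = (4**2 - 3**2) * h**2 / (8 * m_e * L**2)  # Joules
nu_tilde = dE / (h * c)                       # cm^-1
print(f"{nu_tilde:.2e} cm^-1")  # ≈ 2.8e4 cm^-1, matching the item
```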

7. In a sentence or two, explain why the 2s orbital in this figure has a light ring near its center whereas the 1s orbital does not.

Figure caption: representations of the 1s and 2s hydrogenic atomic orbitals in terms of their electron densities (as represented by the density of shading).

image file: c8rp00023a-u5.tif

a. Radial distribution functions: _____

b. Hydrogenic wavefunction: _____

c. Particle in a box: _____

d. Harmonic oscillator energies: _____
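The light ring item 7 asks about corresponds to the radial node of the 2s function; a minimal SymPy check using the unnormalized hydrogen 2s radial part:

```python
import sympy as sp

r, a0 = sp.symbols('r a0', positive=True)

# Unnormalized hydrogen 2s radial function; the (2 - r/a0) factor changes
# sign once, so the electron density vanishes on one spherical shell
R20 = (2 - r/a0) * sp.exp(-r / (2*a0))
node = sp.solve(R20, r)
print(node)  # [2*a0]: the light ring is the radial node at r = 2*a0
```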

8. Can the position and total angular momentum of any electron be measured simultaneously to arbitrary precision?

a. Born interpretation of the wavefunction _____

b. Normalization condition _____

c. Heisenberg uncertainty principle _____

d. Free particle model _____

Note: Items 1 and 6, and the image for item 7 are adapted from Atkins (Atkins and De Paula, 2009). The image from item 5 is adapted from McQuarrie and Simon (McQuarrie and Simon, 1997).

Quantum chemistry rating task resource sheet

Associated Laguerre polynomials: image file: c8rp00023a-t3.tif

Anharmonic oscillator model: V(l) = D(1 − e^(−β(l − l0)))², (Image of Morse potential energy curve, from figure 5.6 on p. 165 of McQuarrie and Simon, 1997, was inserted here.)

Born interpretation of the wavefunction: p(q) = ψ(q)* ψ(q), where q is a set of coordinates

Centrifugal distortion term: Erot(J) = B̃J(J + 1) − D̃J²(J + 1)²

Free particle model: image file: c8rp00023a-t4.tif

Harmonic oscillator model: image file: c8rp00023a-t5.tif, (Image of harmonic oscillator potential energy curve with v = 0 through v = 4 energy levels superimposed, from figure 5.7 on p. 167 of McQuarrie and Simon, 1997, was inserted here.)

Heisenberg uncertainty principle: image file: c8rp00023a-t6.tif

Hydrogenic wavefunction: ψn,l,m(r,θ,ϕ) = Rn,l(r)Yl,m(θ,ϕ)

Linear variational principle: image file: c8rp00023a-t7.tif

LUMO: Lowest Unoccupied Molecular Orbital

Morse potential energy curve: (Image of Morse potential energy curve, from figure 5.6 on p. 165 of McQuarrie and Simon, 1997, was inserted here.)

Normalization condition: image file: c8rp00023a-t8.tif

Overlap integrals: image file: c8rp00023a-t9.tif

Particle in a box model: image file: c8rp00023a-t10.tif, (Image of 1-D PIB potential, from figure 3.1 on p. 80 of McQuarrie and Simon, 1997, was inserted here.)

Particle on a ring model: image file: c8rp00023a-t11.tif,

Radial distribution functions: Pr(r) = r2Rn,l2(r)

Rigid rotator model: Erot(J) = B̃J(J + 1), (Image of simple, 2-body rigid rotator schematic from figure 5.9 on p. 174 of McQuarrie and Simon, 1997, was inserted here.)

Secular determinant: image file: c8rp00023a-t12.tif

Slater determinant: image file: c8rp00023a-t13.tif

Spherical harmonics: Yl,m(θ,ϕ)

image file: c8rp00023a-u7.tif

Spin selection rules: ΔS = 0, where S is the total spin angular momentum.

Appendix 3: student participant grid

Table 5 Number of students who completed each task, delineated by institution. Key: H = hydrogen atom protocol, RI = rating task with interview, RNI = rating task with no interview, Q = QCCI. Students listed in any column labeled “only” completed a single protocol; for example, students in the “H only” column completed only the hydrogen atom protocol. The final column, R (any) & Q, is the sum of the previous two columns (RI & Q + RNI & Q) and represents the students included in Fig. 4 of the manuscript.

Institution | H only | RI only | RNI only | Q only | H & Q | RI & Q | RNI & Q | R (any) & Q
LRU         | 4      | 2       | 3        | 5      | 0     | 8      | 5       | 13
SLA         | 6      | 0       | 0        | 4      | 0     | 4      | 0       | 4
R1U         | 0      | 0       | 0        | 5      | 3     | 4      | 0       | 4
Total       | 10     | 2       | 3        | 14     | 3     | 16     | 5       | 21


Acknowledgements

The authors thank all research participants for providing their time and responses to the various tasks, as well as the instructors who allowed the researchers access to their physical chemistry classes so that student participants could be recruited. This work was funded, in part, by the Washington NASA Space Grant Consortium, NASA Grant #NNX15AJ98H (summers of 2016 and 2017), and by research startup funds from Western Washington University. We thank both institutions for their generous support.

References

  1. ACS, (2015), Physical Chemistry Supplement, https://www.acs.org/content/dam/acsorg/about/governance/committees/training/acsapproved/degreeprogram/physical-chemistry-supplement.pdf.
  2. Atkins P. and De Paula J., (2009), Physical Chemistry, 9th edn, Oxford, UK: Oxford University Press.
  3. Baily C. and Finkelstein N. D., (2010), Teaching and understanding of quantum interpretations in modern physics courses, Phys. Rev. Spec. Top. – Phys. Educ. Res., 6, 010101.
  4. Becker N. M., Rupp C. A. and Brandriet A., (2017), Engaging students in analyzing and interpreting data to construct mathematical models: an analysis of students' reasoning in a method of initial rates task, Chem. Educ. Res. Pract., 18, 798–810.
  5. Bowen C. W., (1994), Think-Aloud Methods in Chemistry Education: Understanding Student Thinking, J. Chem. Educ., 71, 184.
  6. Brandriet A., Rupp C. A., Lazenby K. and Becker N. M., (2018), Evaluating students' abilities to construct mathematical models from data using latent class analysis, Chem. Educ. Res. Pract., 19, 375–391.
  7. Braun V. and Clarke V., (2006), Using thematic analysis in psychology, Qual. Res. Psychol., 3, 77–101.
  8. Cataloglu E. and Robinett R. W., (2002), Testing the development of student conceptual and visualization understanding in quantum mechanics through the undergraduate career, Am. J. Phys., 70, 238–251.
  9. Chi M. T. H., Feltovich P. J. and Glaser R., (1981), Categorization and Representation of Physics Problems by Experts and Novices, Cogn. Sci., 5, 121–152.
  10. National Research Council, (2012), Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: The National Academies Press.
  11. Daley B. J., (1999), Novice to Expert: An Exploration of How Professionals Learn, Adult. Educ. Q., 49, 133–147.
  12. Dangur V., Avargil S., Peskin U. and Dori Y. J., (2014), Learning quantum chemistry via a visual-conceptual approach: students' bidirectional textual and visual understanding, Chem. Educ. Res. Pract., 15, 297–310.
  13. Dick-Perez M., Luxford C. J., Windus T. L. and Holme T., (2016), A Quantum Chemistry Concept Inventory for Physical Chemistry Classes, J. Chem. Educ., 93, 605–612.
  14. Dini V. and Hammer D., (2017), Case study of a successful learner's epistemological framings of quantum mechanics, Phys. Rev. Phys. Educ. Res., 13, 010124.
  15. Harrison A. G. and Treagust D. F., (1996), Secondary students' mental models of atoms and molecules: Implications for teaching chemistry, Sci. Educ., 80, 509–534.
  16. Irby S. M., Phu A. L., Borda E. J., Haskell T. R., Steed N. and Meyer Z., (2016), Use of a card sort task to assess students' ability to coordinate three levels of representation in chemistry, Chem. Educ. Res. Pract., 17, 337–352.
  17. Mack M. R. and Towns M. H., (2016), Faculty beliefs about the purposes for teaching undergraduate physical chemistry courses, Chem. Educ. Res. Pract., 17, 80–99.
  18. Marshman E. and Singh C., (2017), Investigating and improving student understanding of the probability distributions for measuring physical observables in quantum mechanics, Eur. J. Phys., 38, 025705.
  19. McKagan S. B., Perkins K. K. and Wieman C. E., (2008), Why we should teach the Bohr model and how to teach it effectively, Phys. Rev. Spec. Top. – Phys. Educ. Res., 4, 010103.
  20. McQuarrie D. A. and Simon J. D., (1997), Physical Chemistry A Molecular Approach, Sausalito, CA: University Science Books.
  21. Savall-Alemany F., Domènech-Blanco J. L., Guisasola J. and Martínez-Torregrosa J., (2016), Identifying student and teacher difficulties in interpreting atomic spectra using a quantum model of emission and absorption of radiation, Phys. Rev. Phys. Educ. Res., 12, 010132.
  22. Singh C., (2001), Student understanding of quantum mechanics, Am. J. Phys., 69, 885–895.
  23. Slotta J. D. and Chi M. T. H., (2006), Helping Students Understand Challenging Topics in Science Through Ontology Training, Cognition and Instruction, 24, 261–289.
  24. Slotta J. D., Chi M. T. H. and Joram E., (1995), Assessing Students' Misclassifications of Physics Concepts: An Ontological Basis for Conceptual Change, Cognition and Instruction, 13, 373–400.
  25. Smith M. K., Jones F. H. M., Gilbert S. L. and Wieman C. E., (2013), The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices, CBE Life Sci. Educ., 12, 618–627.
  26. Starr R. R. and Parente de Oliveira J. M., (2013), Concept maps as the first step in an ontology construction method, Inf. Syst., 38, 771–783.
  27. Stefani C. and Tsaparlis G., (2009), Students' levels of explanations, models, and misconceptions in basic quantum chemistry: A phenomenographic study, J. Res. Sci. Teach., 46, 520–536.
  28. Styer D. F., (1996), Common misconceptions regarding quantum mechanics, Am. J. Phys., 64, 31–34.
  29. Taber K. S., (2002a), Compounding Quanta: Probing the Frontiers of Student Understanding of Molecular Orbitals, Chem. Educ. Res. Pract., 3, 159–173.
  30. Taber K. S., (2002b), Conceptualizing Quanta: Illuminating the Ground State of Student Understanding of Atomic Orbitals, Chem. Educ. Res. Pract., 3, 145–158.
  31. Taber K. S., (2005), Learning quanta: Barriers to stimulating transitions in student understanding of orbital ideas, Sci. Educ., 89, 94–116.
  32. Tsaparlis G., (1997), Atomic orbitals, molecular orbitals and related concepts: Conceptual difficulties among chemistry students, Res. Sci. Educ., 27, 271.
  33. Tsaparlis G. and Papaphotis G., (2009), High-school Students' Conceptual Difficulties and Attempts at Conceptual Change: The case of basic quantum chemical concepts, Int. J. Sci. Math. Educ., 31, 895–930.
  34. Zhu G. and Singh C., (2012), Surveying students’ understanding of quantum mechanics in one spatial dimension, Am. J. Phys., 80, 252–259.

This journal is © The Royal Society of Chemistry 2018