Vanessa Rosa Ralph *a, Nicole E. States b, Adriana Corrales c, Yvonne Nguyen d and Molly B. Atkinson c
aTeaching Engagement Program, Office of the Provost and Department of Chemistry and Biochemistry, University of Oregon, 1585 E 13th Ave, Eugene, Oregon 97403, USA. E-mail: vralph@uoregon.edu
bDepartment of Chemistry, University of Iowa, 230 N Madison St, Iowa City, Iowa 52242, USA
cDepartment of Chemistry, University of North Texas, 1508 W Mulberry St, Denton, Texas 76201, USA
dDepartment of Chemistry, University of South Florida, 12111 USF Sweetgum Ln, Tampa, Florida 33620, USA
First published on 15th April 2022
Emphasizing stoichiometry appears to be a norm of introductory chemistry courses. In this longitudinal, mixed-methods study, we examined how the emphasis on stoichiometry in assessments of introductory chemistry impacted educational equity and student learning. Using quantitative methods, we identified mole and stoichiometric conversions as two of the most frequently assessed and inequitable competencies, perpetuating systemic inequities in access to pre-college mathematics preparation. Above all other competencies, midterm assessments of stoichiometry were the most impactful, serving as the strongest predictor of students’ scores on both the first- and second-semester introductory chemistry final exams. These results informed the development of a think-aloud protocol used to describe how students approached assessments of stoichiometry. Students described stoichiometry as a step-by-step series of calculations, rarely associating this algorithm with the process of a chemical reaction by which reactants break bonds and rearrange to form products. Student responses suggest stoichiometry substitutes memorizing algorithms to solve math problems in the context of chemistry for learning how to apply chemistry to think about the problems scientists solve. Shifting the foundation of introductory chemistry courses from algorithmic to applied competencies reflects scientific practice and may be one strategy for educators to disrupt systemic barriers to access and retention in STEM education. Based on these findings and the advancements of other research, we offer implications for supporting educators as they iteratively develop increasingly relevant and equitable assessments of introductory chemistry.
Internationally, there have been calls to reform education from static, industrialized norms toward equitably equipping students with competencies relevant to thriving in the rapidly developing disciplines of STEM (OECD, 2018; Freeman et al., 2019; Duschl et al., 2021). Competencies are the skills, abilities, and other qualities practitioners need to participate in a discipline (Albanese et al., 2008; Momsen et al., 2013; OECD, 2018). In the following sections, we describe how we defined relevance and equity for this study alongside the prior research to which this work seeks to contribute.
The OECD characterized relevance as epistemic knowledge or competencies that allow students to think like a practitioner of a discipline, making a clear distinction between learning to solve problems and learning to think about the problems practitioners solve. Throughout, we will refer to learning to solve problems as algorithmic competencies and learning to think about the problems practitioners solve as applied competencies.
Assessments are a vital component of curricular reform. How students are assessed informs which competencies they perceive as valuable to participating in a discipline (Asikainen et al., 2013; Momsen et al., 2013; Dent and Koenka, 2016; Herrmann et al., 2017; Lynam and Cachia, 2018; Andrade and Brookhart, 2019; Phelps, 2019a, 2019b). Chemistry Education Researchers have developed frameworks to support educators as they differentiate algorithmic and applied competencies and evaluate which competencies should be assessed relative to which are assessed. Some examples include differentiating algorithmic competencies from conceptual understanding (Nakhleh, 1993; Niaz, 1995; Pushkin, 1998), lower- and higher-order cognitive skills (Zoller et al., 1995; Zoller, 2002; Toledo and Dubas, 2016), the thinking processes students are likely to employ when solving different types of assessment tasks (Smith et al., 2010), and the potential of an assessment task to elicit dimensions of scientific knowledge and practice (Laverty et al., 2016).
Despite these advancements, substantial emphases on algorithmic competencies continue to be the norm (Smith et al., 2010; Ralph and Lewis, 2018; Shah et al., 2021; Stowe et al., 2021). For example, Stowe et al. (2021) observed learning environments where more than half of assessment points were awarded to algorithmic competencies with relatively little (less than 5%) emphasis on applied competencies. Algorithmic competencies have also been identified as the most inequitable approach to assessing first (Ralph and Lewis, 2018) and second (Shah et al., 2021) semester introductory chemistry courses.
We argue that the persistent emphasis on algorithmic competencies and inequities in STEM education are related. Evidence of medium to strong correlations between students’ math test scores and achievement in introductory chemistry courses spans decades (Pedersen, 1975; Pickering, 1975; Ozsogomonyan and Loftus, 1979; Craney and Armstrong, 1985; Rixse and Pickering, 1985; Bunce and Hutchinson, 1993; McFate and Olmsted, 1999; Wagner et al., 2002; Nguyen et al., 2017; Berkowitz and Stern, 2018; Thompson et al., 2018; Powell et al., 2020; Williamson et al., 2020).
Often termed “underprepared,” “at risk,” or “low achieving,” students scoring in the bottom quartile of their cohorts are disproportionately at risk of attaining unfavorable academic outcomes in chemistry (Wagner et al., 2002; Gellene and Bentley, 2005; Lewis and Lewis, 2007; Hall et al., 2014; Ye et al., 2016; Ralph and Lewis, 2018; Williamson et al., 2020). Further, students who identify as women, Black, or Latinx are over-represented among those scoring in the bottom quartile (Carmichael et al., 1986; Crisp et al., 2009; Grossman and Porche, 2014; Ralph and Lewis, 2018; Vincent-Ruz et al., 2018; King and Pringle, 2019; Robinson et al., 2019; Witherspoon et al., 2019). Developed as tools for racial prejudice (Selden, 1999; Davis and Martin, 2008; Au, 2010; Knoester and Au, 2017), “standardized” math test scores are still nearly ubiquitously used by educational institutions to prevent the admission and enrollment of students with lower math test scores (Bialek and Botstein, 2004; Nicholas et al., 2015).
As there are no biological differences to explain why students in different socially constructed categories attain different math and chemistry test scores (Selden, 1983; Byrd and Hughey, 2015; Atmaykina and Babayan, 2018), we (1) argue math test scores conflate measures of math aptitude with access to mathematical preparation and (2) posit the emphasis of algorithmic competencies on chemistry assessments is a systemic barrier to education equity in STEM requiring further investigation.
Both studies operationalized inequitable topics within first- or second-semester introductory chemistry as those presenting a barrier to students with inequitable access to pre-college mathematics preparation. However, coding by topic bounds the evaluation to a single semester of the two-semester course sequence. A longitudinal design allows us to ask whether the competencies assessed across the course sequence were inequitable, highly emphasized (i.e., a substantive portion of the assessments used to define academic success), and impactful (i.e., strongly correlated to students’ future academic outcomes). Neither study incorporated qualitative data analyses describing how students approached inequitable assessment tasks. A mixed-methods design allows us to examine competencies identified as inequitable for their relevance to disciplinary practice.
1. Which competencies assessed across first- and second-semester introductory courses were the most emphasized, impactful, and inequitable to students’ success?
2. How is the knowledge elicited by assessments of highly emphasized, impactful, and inequitable competencies relevant to the practice of chemistry?
Topics covered in the first semester included properties of substances and reactions, thermochemistry, atomic-molecular structure, bonding, and periodicity. Second-semester topics included solutions, thermodynamics, kinetics, equilibria, electrochemistry, and nuclear chemistry.
Courses were coordinated by a team of instructors who shared a textbook, learning objectives, syllabus, and grading scheme. Instructors chose a peer-reviewed, openly licensed introductory textbook offered through OpenStax (Chemistry: 2e, 2019) for both courses. “Chemistry: 2e” was not atoms-first, introducing the topic of stoichiometry ahead of electronic structure, periodic properties, chemical bonding, and molecular geometry.
Academic success was primarily defined by high-stakes, summative assessment outcomes (70%), online homework completion (10%), and class participation (e.g., clicker responses and quizzes; 20%). Instructors coordinated the implementation of flipped classes and peer-led team learning (Robert et al., 2016) across all sections of first-semester general chemistry and several sections of the second-semester course (Ralph and Lewis, 2020).
To enroll in first-semester chemistry, the institution required students to attain either a 570 on a pre-college math test known as the SAT or complete a college-level algebra course with a “C” or higher, in addition to either one year of high school chemistry or the completion of a chemistry preparation course. The bottom quartile (or 25th percentile) of math test scores for this cohort was 570, suggesting students scoring in the bottom quartile and students without registered math test scores relied on college-level algebra courses to meet the mathematics prerequisites set by the institution. In alignment with prior research (Wagner et al., 2002; Gellene and Bentley, 2005; Lewis and Lewis, 2007; Hall et al., 2014; Ye et al., 2016; Ralph and Lewis, 2018; Williamson et al., 2020), we observed disparate longitudinal retention rates for students scoring in the top-three quartiles, students scoring in the bottom quartile, and students enrolled without pre-college math test scores (see Fig. 1).
Overall, there were 2034 students enrolled in first-semester general chemistry. Of the 1449 students who scored in the top-three quartiles, 936 (or 65%) were retained (i.e., passed GC1, enrolled, and passed GC2). In contrast, the retention rate for students scoring in the bottom quartile was 42%. Students without registered math test scores (“No Score” in Fig. 1) had a retention rate of 28%. Findings suggest that access to pre-college mathematics preparation is critical for success in introductory chemistry courses.
Student Groupc | Sample Size | % T3Qs | % BQ | % No Score
---|---|---|---|---
Overall | 2034 | 71a (0.65)b | 21 (0.42) | 8 (0.28)
Asian | 288 | 86 (0.77) | 12 (0.48) | 2 (0.00)
Black | 192 | 57 (0.68) | 33 (0.39) | 10 (0.21)
Foreign national | 94 | 62 (0.64) | 25 (0.38) | 13 (0.17)
Indigenous | 20 | 75 (0.73) | 15 (0.33) | 10 (0.50)
Latinx | 478 | 64 (0.63) | 28 (0.47) | 8 (0.26)
Not reported | 71 | 73 (0.52) | 20 (0.57) | 7 (0.00)
White | 891 | 74 (0.61) | 18 (0.38) | 8 (0.36)

a Proportion of students represented in the top-three quartiles (T3Qs), bottom quartile (BQ), or without math test scores (no score). b Pass rates (in parentheses) for chemistry students retained across the course sequence. c Data limited to conflated categorizations of race and ethnicity as collected by the Institution's Registrar.
As mentioned previously, there are no biological reasons why students of differing socially constructed identities should perform disparately on presumed measures of “math aptitude” (Selden, 1983; Allen, 1999; Byrd and Hughey, 2015). The exclusion of Black, Foreign National, and Latinx students from (1) having a math test score, (2) scoring in the top-three quartiles, and (3) being retained in introductory chemistry courses suggests these test scores reflect the consequences of systemic racism (Wilson-Kennedy et al., 2020; Madkins and Morton, 2021).
In addition to inequitable representation across subgroups, we observed intersectionality, the overlapping and interdependent systems of discrimination across socially constructed categories (Petersen, 2006; Bowleg, 2008; López et al., 2018), within subgroups: students scoring in the top-three quartiles were far more likely to be retained in introductory chemistry courses than their peers who scored in the bottom quartile. Thus, pre-college math test scores functioned as a proxy for educational access or exclusion, informing how we operationalized educational equity.
Rodriguez and colleagues (2012) identified the impact of equity operationalization on interpreting a study's results. We operationalized equity as parity (or, more appropriately, equality), defining equitable approaches to assessing chemistry competencies as those resulting in similar outcomes for chemistry students scoring in the top three and bottom quartiles of math test scores (Lynch, 2000; Rodriguez et al., 2012). Having identified our population of interest and a model for operationalizing equity, we then sought a framework to categorize individual chemistry test items by competency.
Knowledge | Competency
---|---
Foundational: students are asked to discern definitions or representations to recall or identify pertinent chemical contexts. | Recall: match a term to a definition or vice versa.
| Identify: apply a definition to categorize chemical species (e.g., acid or base, ionic or covalent).
| Translate: interpret qualitative descriptions of observable phenomena (e.g., a change in color), quantitative expressions (e.g., equilibrium expressions, rate laws), or representations (e.g., photographs, particulate diagrams including Lewis structures, chemical equations).
Algorithmic: students are asked to engage in step-by-step solution processes concluding with conversions or multi-step calculations. | Math operations: conduct a single-step calculation (excluding conversions).
| Macroscopic conversions: convert macroscopic quantities (e.g., volume in mL to L).
| Mole conversions: convert between macroscopic and microscopic quantities of the same chemical species (e.g., volume to moles).
| Stoichiometric conversions: convert between moles of one chemical to moles of another using the coefficients of a chemical equation (e.g., mol-mol ratios and ICE tables).
| Multi-step calculations: substitute the resultant value of one calculation into another (excluding conversions).
Applied: students are asked to use foundational or algorithmic knowledge to predict or explain chemical phenomena, compare chemical species, or evaluate data. | Compare: reason at the interface of measures, structures, and properties to communicate differences between chemical species or phenomena (e.g., use the structures of two chemical species to determine which would have a higher boiling point).
| Evaluate: interpret quantitative data (e.g., tables, graphs, calculations, expressions) or descriptions to assess the nature of chemical phenomena (e.g., determine the degree to which a solution is saturated using a solubility curve).
| Predict or explain: extend relevant models, chemical theories, or laws to predict or explain changes in chemical systems (i.e., apply atomic- and molecular-level models to explain changes in a solution).
The first and fourth authors independently coded all midterms and final exams administered across the year-long course sequence. These authors achieved 93.3% agreement and a Cohen's kappa of 0.891; both measures can be interpreted as strong agreement (Berry and Mielke, 1988).
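Percent agreement and Cohen's kappa for two coders can be sketched in a few lines; the item labels below are hypothetical stand-ins, not the study's coding data.

```python
from collections import Counter

def percent_agreement(codes_a, codes_b):
    """Fraction of items the two coders labeled identically."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected if both coders labeled independently at their own
    marginal rates: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(codes_a)
    p_o = percent_agreement(codes_a, codes_b)
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for ten assessment items (labels mirror the
# framework's competencies; these are NOT the study's data).
a = ["recall", "identify", "mole", "stoich", "stoich",
     "evaluate", "predict", "mole", "stoich", "compare"]
b = ["recall", "identify", "mole", "stoich", "multi",
     "evaluate", "predict", "mole", "stoich", "compare"]
print(percent_agreement(a, b), round(cohens_kappa(a, b), 3))  # → 0.9 0.882
```

Because kappa discounts chance agreement, it is the more conservative of the two measures, which is why papers typically report both.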
We explain in Appendix 1 how previously published frameworks and protocols were synthesized and applied according to data collected at the research setting. For exemplar assessment tasks for each competency, see Appendix 2.
Three measures were used to identify the competencies critical to students' success in first- (GC1) and second-semester (GC2) introductory chemistry: (1) emphasis was the percent of assessment items represented by a given competency, (2) impact was measured using Pearson correlations (Pearson, 1909) between students’ competency scores on midterm exams and final exam scores in either course, and (3) inequity was calculated as the numerator of Cohen's d (Cohen, 1988), or the mean differential between students scoring in the top three and bottom quartiles (see Table 3).
Exceedingly challenging or easy competencies could artificially condense student outcomes, giving the appearance of equality (Ho and Yu, 2015). However, mean competency scores ranged from 55.1–76.6%, suggesting floor/ceiling effects were unlikely to influence measures of inequity.
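The three measures can be sketched as follows; all data values below are illustrative, not the study's.

```python
import statistics

def emphasis(item_codes, competency):
    """Emphasis: percent of assessment items coded to a competency."""
    return 100 * sum(c == competency for c in item_codes) / len(item_codes)

def pearson_r(x, y):
    """Impact: Pearson correlation between students' midterm
    competency scores (x) and final exam scores (y)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def mean_differential(t3q_scores, bq_scores):
    """Inequity: the numerator of Cohen's d, i.e. the raw gap between
    the mean scores of the top-three and bottom quartiles."""
    return statistics.mean(t3q_scores) - statistics.mean(bq_scores)

codes = ["stoich", "mole", "recall", "stoich"]  # hypothetical item codes
midterm = [55, 70, 80, 90]                      # hypothetical scores
final = [50, 65, 85, 88]
print(emphasis(codes, "stoich"))                               # → 50.0
print(round(pearson_r(midterm, final), 2))                     # → 0.97
print(round(mean_differential([76.6, 70.0], [59.6, 53.0]), 1)) # → 17.0
```

Reporting the raw mean differential rather than the full Cohen's d keeps the inequity measure in the original percentage-point units, so gaps are directly comparable across competencies.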
These quantitative measures were applied to identify the most frequently assessed, impactful, and inequitable competencies. Then, the quantitative findings informed the design of the qualitative methodology used to answer the second research question.
Our colleagues in Physics Education Research developed the modeling framework to characterize different facets of how students apply and integrate knowledge (e.g., principles, concepts, and measurements) to make sense of a phenomenon (Zwickl et al., 2015). The first, second, and fourth authors used the modeling framework to identify the target phenomenon of an assessment task and describe the extent to which students approached the competency as algorithmic (e.g., first, do this, then, do that) or applied (e.g., this principle/concept/measure can be used to make sense of the presented phenomena).
This framework prompted us to use an interview protocol comprised of a variety of prompts capable of eliciting a range of student experiences with a given competency. Developed in education research, phenomenography focuses on producing descriptions of a range of student experiences with a phenomenon (Walsh et al., 2007). Consistent with prior research in physics (Walsh et al., 2007; Ornek, 2008) and chemistry education (Bretz, 2008; Stefani and Tsaparlis, 2009), phenomenography was used in this study to produce descriptions of the range in ways a small group of students perceived and understood a competency as it related to the target phenomenon. A think-aloud protocol, an interview method wherein students are prompted to solve a task or activity aloud to the researchers (Larkin and Rainard, 1984), appeared well suited to the modeling framework and phenomenographic methodology.
Maximum variation sampling (Shaheen et al., 2019) was used to select student participants. We hypothesized that students with high chemistry test scores would approach the competency as more applied than algorithmic. Thus, we alternated between interviewees who scored in the top three and bottom quartiles of the first instructor-authored exam on which the identified competency was assessed.
The number of participants recruited for the study was informed by saturation. Following each interview, meetings were held to discuss prevalent themes and evaluate whether saturation had been achieved (Mack et al., 2005; Curtis and Curtis, 2011; McGrath et al., 2018). Given the specificity of the protocol, emergent themes became repetitive after seven participants. The researchers interviewed eleven students, identifying no new themes in the last four interviews, a sample size appropriate to the phenomenographic methodology implemented (Collins et al., 2006; Bartholomew et al., 2021).
Then, we separated to identify and describe the themes observed, reconvening to condense our collective themes to descriptive patterns, audit for negative cases, and discuss agreements and discrepancies (Ornek, 2008; Wilson, 2015; Ataro, 2020). Once the first and fourth authors reached a consensus, this cycle was repeated with the first and second authors (Miles et al., 2013). Overall, this cycle was consistent with the process of collaborative consensus coding, enabled the analysis to occur through multiple perspectives, and was a more appropriate option than interrater reliability (Sweeney et al., 2013).
The most frequently assessed competency (n = 42) was stoichiometric conversions. Together, mole and stoichiometric conversions (n = 70) comprised more than one-third of the assessment tasks administered in the chemistry course sequence.
The types of assessment tasks assigned around these competencies were remarkably diverse, including calculations of molar mass, isotopic abundance, empirical and molecular formulae, percent mass/yield, molarity, molality, mole fraction, serial dilutions, lattice energy, partial/vapor/osmotic pressure, ideal gas laws, Gibbs free energy, the heat of vaporization, molar entropy, equilibrium/rate constants, pH/pOH, titrations, molar solubility, precipitants, common ion effects, electroplating, balancing equations and half-/formation reactions. While mole and stoichiometric conversions originally appeared to be narrowly defined competencies in the deductive coding scheme (see Fig. 1), these tasks were heavily represented, warranting individual categories.
In descending order, the most inequitable competencies were multi-step calculations (where mean differentials were, on average, 17.1%), stoichiometric conversions (AVGMD = 17.0%), evaluate (16.4%), and mole conversions (16.4%).
The nuances observed in these data warrant further description. First, the inequity of a competency could differ across the course sequence. The most notable change observed was in the competency mole conversions, for which inequity decreased from 18.9% in GC1 to 14.3% in GC2. This change may relate to attrition, as students with inequitable access to pre-college math preparation were much less likely to complete all four exams in each course (31%) than their peers (53%; see Fig. 1). Second, while the assessment of applied knowledge was generally more equitable, evaluate tasks (n = 19) relying on quantitative data (n = 15, AVGMD = 17.2%) to examine the nature of a chemical phenomenon were similar in inequity to solely algorithmic tasks.
However, the range of mean differentials for quantitative, evaluate tasks was 7.8 to 26.9%. The most equitable evaluate tasks relied on math diagrams, requesting (for example) that students use a vapor pressure diagram to identify which substance would have the highest vapor pressure at a given temperature or a solubility curve to identify whether a given quantity of solute is super/un/saturated. The most inequitable evaluate tasks also relied on math diagrams. However, these tasks required some form of conversion. For example, students were given a solubility curve with a y-axis in units of “g of solute per 100 g of solution” with a prompt “how many grams of solute would dissolve in 50 g of water.” In another task, students identified the equivalence point of a weak acid titrated with a strong base or the acidity/basicity of a solution relying on pH calculations (which often accompany mole conversions).
These examples highlight the relative emphasis on assessing the nature of chemical phenomena (applied) versus enacting conversions (algorithmic) as a potential underlying factor in the equity of an evaluate assessment task. They also demonstrate the potential for conversions to impact students' trajectories through introductory chemistry courses in ways that may not correspond to the relevance of this competency to the practice of chemistry.
Situated in the top-right quadrant of Fig. 4, midterm assessments of stoichiometric conversions were above average in correlation, accounting for 44–46% of the variance observed on students’ GC1 and GC2 final exam scores.
In summary, stoichiometric conversions were flagged as a highly inequitable and frequently assessed competency strongly correlated to students’ academic success. Should stoichiometry be the foundation of introductory chemistry courses? Is it relevant in supporting students, not just to solve problems, but to think about the problems chemists solve?
As described in the methods, we used maximum variation sampling (Shaheen et al., 2019) to recruit interview participants scoring in the top-three and bottom quartiles of the first midterm exam on which the competency was assessed, i.e., Test 1 of first-semester introductory chemistry (see Table 4).
Pseudonym | Test 1 scores (%) | SAT scores | Stoichiometric competency (GC1, GC2) | Applied |
---|---|---|---|---|
Adam | 65.8 | 540 | 0.38, 0.36 | |
Aimee | 67.1 | No score | 0.28, 0.38 | |
Eric | 92.4 | 650 | 0.67, 0.40 | |
Isaac | 64.6 | 430 | 0.48, 0.68 | |
Jakob | 91.8 | 590 | 0.72, 0.52 | |
Jean | 43.7 | No score | 0.62, 0.59 | |
Lily | 100.0 | 750 | 0.67, 0.78 | |
Maeve | 91.1 | 610 | 0.81, 0.66 | |
Ola | 64.6 | 550 | 0.90, 0.90 | |
Rahim | 65.2 | 680 | 0.48, 0.61 | |
Ruby | 100.0 | No score | 0.86, 0.88 | ✓ |
Despite variations in access to pre-college mathematics preparation and in students’ demonstrated proficiency with the competency of stoichiometry, only one student was identified as using an applied approach to stoichiometry. Therefore, neither a student's math test score nor their proficiency with the competency of stoichiometry appeared to inform whether they adopted an applied approach to thinking about the problems chemists solve.
The prompts used in the think-aloud protocol were related to this chemical equation:
2Ni2O3(s) → 4Ni(s) + 3O2(g)
More moles of nickel will be produced because 2 moles of oxygen atoms will break ionic bonds with nickel and rearrange to form one mole of covalently bound molecular oxygen (O2).
However, students unanimously elicited algorithmic strategies to answer the prompt, often relying on coefficients in the chemical equation. For example, Maeve responded by stating:
I figured because there was there were four moles of nickel, that would mean that the amount of moles of oxygen would be smaller than the amount of moles produced by nickel because there's four moles of nickel as opposed to 3 moles of oxygen.
These justifications led to an accurate prediction: more moles of nickel are produced than moles of oxygen. However, one reflects an applied approach related to the atomic and molecular behavior of the target phenomenon (i.e., the process of converting reactants to products) and the other an algorithmic approach through a series of quantitative steps disconnected from the target phenomenon.
This disconnection from the target phenomenon appeared to impede some students’ responses, obfuscating the chemical reaction process with algorithmic approaches to understanding coefficients, numbers, or the masses of chemical species. For example, Isaac claimed, “I'm going to say [the moles of O2 produced would be] higher because there's two molecules of O2. And then there's three moles. So, three times two would be six [atoms]. Then there's only four [atoms] of nickel.” Other students emphasized the mass of the atoms, like Lily, who wrote, “higher, the mass is higher,” emphasizing atomic mass as knowledge elicited when considering which product would be produced in a greater quantity of moles. Adam stated, “… nickel would be higher. I just kind of ran through it in my head, and I'm pretty sure that three is a bigger number than what I would get if I were to multiply them by this [underlining the coefficient].”
Some students expressed an aversion to the first prompt, requesting to skip to the next, which involved a (potentially more familiar) unit conversion. For example, Jakob stated: “Do I have to answer this first one?” When prompted as to why they did not want to answer the prompt, Jakob stated, “I just like, I’ve kind of wanted to just jump into it because I don’t really think conceptually like that. I’m just like, oh, let's just do it.” Overall, none of the interviewed students elicited an approach that applied the process of chemical reactions to stoichiometry when responding to the first prompt.
x = 2.23 mol O2
I liked it when it's like this that just makes it looks more like a math equation. So, I think it's easier to understand when it's like that… I like math, so I try to put it in terms of like how you do it in math. But like once you incorporate like the chemicals, I get all confused… It's kind of like, um, like one day you just see the periodic table and they're like, okay, now we're just going to throw all these elements into regular math. And then it's like, what? But yeah.
While the prompt could be solved with a single mole-to-mole conversion, many students elicited a variety of algorithmic approaches, often expressing the need to incorporate grams into the conversion. Jakob started their solution by writing “mass → moles.” Jean had trouble starting the problem, asking, “Well, I know like you have to do the mole equation, but I'm forgetting how you start. Like should I divide this by the whole mass of NiO2 or just?” Even Aimee – who expressed confidence in their solution – stated, “Because I also have the, the grams for nitrogen. And since I also have the moles, I'm trying to figure out how, if I'm trying to see, because I need to cross out for moles and grams for nitrogen.”
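The single mole-to-mole conversion described above amounts to multiplying by a ratio of balanced-equation coefficients; the sketch below uses illustrative input values, since the prompt's given quantity is not reproduced here.

```python
from fractions import Fraction

# Coefficients of the balanced equation used in the protocol:
# 2 Ni2O3(s) -> 4 Ni(s) + 3 O2(g)
COEFF = {"Ni2O3": 2, "Ni": 4, "O2": 3}

def convert_moles(given_species, given_moles, target_species):
    """Single mole-to-mole conversion via the coefficient ratio."""
    return given_moles * Fraction(COEFF[target_species], COEFF[given_species])

# Illustrative value only (not the value from the study's prompt):
print(float(convert_moles("Ni2O3", 2.0, "O2")))  # → 3.0
```

No molar masses are needed, which is precisely the step many interviewees tried to insert anyway.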
Many students drew connections to past problems, like Isaac when they stated, “Um, I feel like a lot of equations use it, so I feel like the molar mass is probably going to be in there somewhere.” Aimee explained how grams are often incorporated, offering this explanation of why they converted to grams:
Well, it just makes it seem like, um, well, when we learned it was like a big thing of it. Like you had the chance to, we usually got the grams. So, you have the change to grams to moles. That's what most of it was like, and then, well, like the mol-to-mol was the hardest one. So, it was like harder to remember how to do that. So, I want to say they tell us an easy way to get from grams from moles to grams and then go back to moles but not moles to moles.
Some students expressed challenges with reconciling algorithmic and applied approaches in chemistry. Jakob explains how the first prompt was hard to grasp:
I was taught how to do like just the math first, not the understanding part. So, I like in the beginning I asked you, do I have to answer with this first? I was thrown off because I was like, oh, I don't really know how to do it. I just know how to do it mathematically. And then, once I do it mathematically, I could explain it to you conceptually. So, I feel like conceptually first threw me off because I just wanted to jump into the math and then explain… I feel like you should know how to do it conceptually first and then do it numerically. But I feel like when people grow up doing math, they jump right into the math. They don't really think about why it's this.
Ola reflects on the lack of engaging with “why” throughout their educational experiences in chemistry:
Yeah. That that's what it is. Is it the fact that um, like every, every time I've been taught at even from high school, it just, it, there is no explanation why. It's just that mole to gram and then that, you know, grams back to moles. It never really clicked, I guess.
Student responses suggest an overemphasis of algorithmic reasoning during the instruction and assessment of chemistry may (1) implicitly communicate to students that chemistry is about word problems using chemical phenomena as the context for calculations, not understanding how and why phenomena occur in terms of atomic and molecular behavior, and (2) impede students’ desire to learn explanations, processes, and applications of the knowledge.
Apart from Ruby, students sought to enact an array of calculations. For example, Jakob responded:
… when you take the molar mass of nickel, it is 58.69, but you have two of them. So, you have to multiply that by two. And then same thing for oxygen, it's 16 times the three. And then once you add that up, you have to times it by two because there's like two of each.
Students often elicited knowledge related to the conservation of mass but appeared to be impeded by algorithmic knowledge. For example, Lily invoked the law of mass conservation after multiplying the mass of Ni by the coefficient in the chemical equation when she stated, “the mass of the products a little bit. Um, yeah. So yeah, mass of the products would be like the four times Ni, oh wow. It'd be the same thing, wouldn't it?” Additionally, Isaac expressed, “I don't know how you would see that; I'm not sure.”
To improve the prompt's accessibility, the interviewers began asking students to draw what would be in a beaker before and after the reaction took place (see Fig. 5).
Using their visual (top-left of Fig. 5), Rahim explains: “Okay. Yes, there are reactants [left two beakers]. So together there are two compounds of Nickel and oxygen…and then after the reaction takes place, there are four moles of the Nickel and then three of the oxygen.” Similarly, Jean enacted a calculation using coefficients and atomic masses and explained their illustration as: “Um, because NiO is just one compound and then Ni, and then O2 [represented with circles later crossed out], it's two different compounds,” later editing their illustration (blue) to emphasize the combination of the reactants to form the products within the beaker. Ola initially answers “491.1786 grams,” describing the use of subscripts and atomic masses in calculating their response, maintaining that the mass of the products remains the same before and after the reaction occurs.
While four students achieved a competency score for stoichiometric conversions exceeding 70% on assessments administered in the first-semester introductory chemistry course (see Table 4), qualitative results suggest that just one student (Ruby) applied the competency of stoichiometry to the process of a chemical reaction. When asked by the interviewers how they came to develop this understanding of stoichiometry, Ruby shared that they identify as an international student and described educational experiences outside of the United States.
Truly, the US education is kind of broad, like the information they brought up. it is not really deep and their solution. …uh, it the same not really complicated, but like in my country, they do some more complicated question…
This response reflects the influence of systemic norms in science education and the potential positive impacts following a shift from emphasizing algorithmic to applied competencies. Subject to this norm of science education, educators may unknowingly perpetuate students’ systemic marginalization by emphasizing algorithmic competencies in introductory STEM courses (Ralph and Lewis, 2018; Shah et al., 2021). In summary, assessment design can convey which competencies are of value to a discipline and whom a discipline selects for participation, informing whether prerequisite STEM courses serve as a point of access or exclusion.
Situating these findings in the context of prior research suggests these results may reflect a norm in chemistry education. A predominant emphasis on algorithmic competencies in assessments of chemistry has been documented in several studies (Tai et al., 2006; Smith et al., 2010; Momsen et al., 2013; Ralph and Lewis, 2018; Shah et al., 2021; Stowe et al., 2021). For example, chemistry courses observed at three different institutions assigned (on average) 29.2%, 46.7%, and 59.2% of assessment points to algorithmic competencies (Stowe et al., 2021). Smith and colleagues (2010) reported algorithmic emphases of 23% (first term) and 16% (second term) even on an initiative by the American Chemical Society to emphasize fewer “traditional” (i.e., algorithmic) and more “conceptual” chemistry questions, suggesting it is a “tradition” in introductory chemistry to emphasize algorithmic competencies.
When placed alongside efforts to identify inequitable approaches to assessing introductory chemistry, these systemic trends reflect barriers to educational equity. For example, stoichiometry and the mole concept have previously been identified as the most inequitable topics in first-semester chemistry courses (Ralph and Lewis, 2018). In second semester introductory chemistry, researchers identified algorithmic competencies as comprising most assessments, potentially driving the inequity in the topics: chemical equilibria and intermolecular forces and properties (Shah et al., 2021).
The current study contributes to this literature base a longitudinal analysis of competencies spanning first- and second-semester introductory chemistry courses. The results reify that algorithmic competencies are highly emphasized, impactful, and inequitable for students in introductory chemistry courses. However, math remains a valuable tool supporting chemists as they think about problems arising in the modern world. How might educators reconcile the utility of mathematics with the inequity of algorithmic competencies?
This study adopts a mixed-methods approach, contributing qualitative descriptions of whether students experience the competency of stoichiometric conversions as solely algorithmic or as a valuable tool for examining chemical phenomena. Relying on the modeling framework advanced by education researchers in the physical sciences (Zwickl et al., 2015), scientists' intellectual work was characterized around (1) a target phenomenon and (2) the knowledge used to model the processes underlying the target phenomenon. In the context of stoichiometric conversions, it was theorized that the target phenomenon would be the process of converting reactants into products, with the knowledge used to model this process involving the rearrangements that follow breaking chemical bonds to form new bonds and products. Of the students interviewed, one applied knowledge about the target phenomenon when engaging in stoichiometric conversions. All others relied on an algorithmic approach disjointed from the target phenomenon, with varying degrees of success in responding to the prompts correctly.
Competency scores in the assessments of stoichiometric conversions did not reflect a student's integration of mathematics to examine the chemical phenomena and instead more often correlated to students’ math test scores (see Table 4).
Again, when situated in prior research, the observation that students hold stoichiometry as an algorithm and not as a tool to understand chemical phenomena appears to be repeatedly reified across several works contributed by chemistry education researchers. To provide a few examples, several scholars have reported students’ reliance on algorithmic approaches to stoichiometry, using the algorithm to solve problems at the expense of applying this knowledge to think about phenomena (Mason et al., 1997; BouJaoude and Barakat, 2003; Arasasingham et al., 2004; Dahsah and Coll, 2007; Cracolice et al., 2008). Only one prior study could be found connecting the knowledge students elicit to the inequity of the competency of stoichiometric conversions. The evidence presented suggests these inequities reflect differences in the rates at which students answer stoichiometric conversions correctly while relying on incoherent chemical reasoning (Ralph and Lewis, 2019), reflecting the inaccurate and algorithmic approaches students expressed in the current study.
We argue this is not a deficit of the student or the teacher, but instead reflects the limitations of an industrialized education system that offers little institutionalized support or incentive to advance curricular and pedagogical reform. Despite this system, these findings reflect the need for a critical conversation about the purpose of chemistry education and how what we choose to perpetuate in our classrooms impacts who gains access to careers in STEM.
Fig. 6 A visual guide for curriculum evaluation: envisioning a guiding framework for evaluations along dimensions of relevance (x-axis) and equity (y-axis).
For example, while assessments of applied knowledge (i.e., compare, evaluate, predict, or explain) were generally more equitable than assessments of algorithmic knowledge, the applied tasks administered in this setting were still inequitable, spanning mean differentials between 14 and 16% (see Fig. 4).
Such assessments could be characterized on the plane in Fig. 6 as relevant but inequitable. Thus, the advisable action would be to make evidence-based and data-driven revisions to improve the relevance and equity of our courses.
Some instructors calculate the difficulty of a task using the percentage of students who answered it correctly to evaluate whether a task is too difficult. With support, instructors could calculate inequity using the difference in percentages of students who answered a task correctly between subgroups of interest and evaluate whether a task is inequitable. Iteratively, these adjustments of how we define academic success and who is most impacted by these definitions can potentially impact the relevance and equity of our courses substantially. An example spreadsheet with equations to conduct by-item equity and relevance analyses is available in the ESI.† Specific examples of assessment tasks for each code can be found in Appendix 2.
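The by-item analysis described above can be sketched as follows. This is a minimal illustration, not the spreadsheet provided in the ESI; the item responses and subgroup labels are hypothetical:

```python
# Hypothetical by-item analysis: difficulty = % of students answering correctly;
# inequity = difference in % correct between two subgroups of interest.
# Item responses are coded 1 (correct) or 0 (incorrect).
group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # e.g., students with pre-college math preparation
group_b = [1, 0, 0, 1, 0, 0, 1, 0]  # e.g., students without

def percent_correct(scores):
    """Percentage of students who answered the item correctly."""
    return 100 * sum(scores) / len(scores)

all_scores = group_a + group_b
difficulty = percent_correct(all_scores)                          # item difficulty
equity_gap = percent_correct(group_a) - percent_correct(group_b)  # mean differential

print(f"Difficulty: {difficulty:.2f}% correct")
print(f"Equity gap: {equity_gap:.1f} percentage points")
```

An instructor might flag items whose equity gap exceeds a chosen threshold for revision, mirroring how difficulty thresholds are already used to flag items that are too hard.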
The Center for Urban Education offers Racial Equity Tools, a suite of resources offering step-by-step guides, measures, and other evaluative tools for advancing racial equity in educational institutions (CUE Racial Equity Tools, 2020). Additionally, institutional efforts to broaden access to student outcomes disaggregated by social constructs (e.g., gender, race, ethnicity) can be used as models for advancement. For example, see the Grades and Equity Gaps Dashboard by Faculty Development at California State University, Chico (Hall, 2022), the Know Your Students web-based application offered by the Center for Educational Effectiveness (Steinwachs, 2021), the Comprehensive Analytics for Student Success project through the Office of the Provost at the University of California, Irvine (Cherland et al., 2019), and Equity Data by the Center for Innovations in Teaching and Learning at the University of California, Santa Cruz (Equity Data, 2022).
Other opportunities to advance the current research include examining (1) other competencies (besides stoichiometry) identified as inequitable, and (2) intersections of competencies, as each of the codes used in the current study represents a task's primary competency, ignoring when a variety of competencies were assessed within one task. In sum, these research efforts could be used to further identify the hallmarks of equitable approaches to assessing chemistry.
For larger-scale impacts to the equity and value of introductory chemistry courses, we agree with the following quote by Dr. Vincent-Ruz (2020) and encourage instructors to extend the practice of Relevance and Equity beyond practices of assessment design.
“Finally, equity work is not a methodology or the inclusion of diverse groups in a sample. A true commitment to equity research is centering those principles in every decision we make as researchers and practitioners.” (p. 71)
We also agree with our colleagues in education research who call for disaggregating data in evaluative efforts (Mukherji et al., 2017; Bancroft, 2018; Bancroft et al., 2021; Collins and Olesik, 2021).
As researchers engage in this work, we ask that they continue to shift their discourse out of deficit perspectives and into system-level interpretations. For example, prior research has attributed deficits to students with inequitable access to pre-college mathematics, referring to these students as “at-risk” or of “low math aptitude” (Lewis and Lewis, 2007; Shields et al., 2012; Hall et al., 2014; Ye et al., 2016; Ralph and Lewis, 2018, 2019), without acknowledging these measures were historically designed to exclude, and are presently successful in excluding, students by social constructs such as race and ethnicity (Davis and Martin, 2008; Knoester and Au, 2017). To provide a starting point, we encourage our colleagues to seek information about reframing deficit perspectives (Harper, 2010) and engaging in quantitative research critically informed by broader systemic and societal influences (Garcia et al., 2018; Gillborn et al., 2018).
While relying on data from students scoring within the bottom and top three quartiles of assessments (e.g., SAT, ACT, or chemistry tests) aptly reflects the quantitative data, a second limitation was purposeful sampling by the maximum variation method, which may have introduced bias into the results. As a result, we recognize that the knowledge elicited from the think-aloud interviews may not be generalizable to every institution's population, but given evidence in prior research, it appears pertinent to many. We also acknowledge that the purpose of qualitative methods, particularly from a phenomenological approach, is not to be generalizable but to provide a detailed description of individuals’ experiences as they engage with and process specific phenomena.
A third limitation of this work is that only high-stakes, summative assessments were included in the initial analysis. Formative assessments are more likely to reveal students’ progress toward learning goals. Future work could examine the connection between formative and summative assessments and equitable outcomes for chemistry students.
Finally, throughout the quantitative methods, we relied predominantly on descriptive statistics. This choice reflects the purpose of this work, to support educators as they evaluate their curricula. By relying on descriptive statistics (frequencies, percentages, rates) and a relatively familiar parametric statistic (correlation), we prioritized the accessibility of our methods. However, binary measures of the statistical significance between measures (e.g., whether the emphasis of mole conversions in assessment was significantly different from stoichiometry) and other more complex methods could be used to examine networks of associations between students’ competency scores, math test scores, and chemistry assessment outcomes.
These quantitative results informed the qualitative methods used to collect students’ response processes and characterize their experiences engaging in assessments of this competency, evaluating whether the knowledge elicited was relevant to applying chemistry to think about phenomena. Stoichiometric conversions elicited knowledge of step-by-step calculations removed from the chemical contexts in which they are situated (algorithmic) and were observed to hinder students’ integration of the target phenomenon stoichiometry represents: the process of converting reactants to products via the breaking, rearrangement, and forming of new chemical bonds and products. The authors challenge other proprietors of the education system to consider (1) what do our assessments communicate in terms of the intellectual work we value in this discipline, (2) whom do our assessments exclude from STEM participation, and (3) how can we improve the relevance of the measures we use to assess academic success equitably and prepare students seeking careers in STEM?
Contribution | Authors’ initials |
---|---|
Conceptualization | VRR |
Methodology | YN & VRR |
Data curation | YN & VRR |
Coding | YN, VRR, & NES |
Formal analysis | VRR & NES |
Interpretation | MBA, AC, VRR, & NES |
Literature search | MBA, AC, VRR, & NES |
Writing – original draft | MBA, AC, VRR, & NES |
Writing – review and editing | MBA, AC, YN, VRR, & NES |
Project administration and supervision | MBA & VRR |
Current | Original (Smith et al., 2010) |
---|---|
FOUNDATIONAL | |
Recall: match a term to a definition or vice versa. | D-R: recognizing a definition in multiple-choice format. |
Identify: apply a definition to categorize chemical species (e.g., acid or base, ionic or covalent). | D-RUA: recalling, understanding, or applying a definition in an open-ended question. |
Translate: interpret qualitative descriptions of observable phenomena (e.g., a change in color), quantitative expressions (e.g., equilibrium expressions, rate laws), or representations (e.g., photographs, particulate diagrams including Lewis structures, chemical equations). | C-P: analysis of a pictorial representation (chemical symbols or equations). |
ALGORITHMIC | |
Macroscopic conversions: convert macroscopic quantities (e.g., volume in mL to L). | A-MaD: macroscopic-dimensional analysis questions requiring conversions between units of macroscopic quantities. |
Mole conversions: convert between macroscopic and microscopic quantities of the same chemical species (e.g., volume to moles). | A-MaMi: macroscopic-microscopic conversion questions between moles and volumes or masses. |
Stoichiometric conversions: convert between moles of one chemical to moles of another using the coefficients of a chemical equation (e.g., mol-mol ratios and ICE tables). | A-Mis: microscopic-symbolic conversion questions requiring stoichiometric conversions of particle or mole quantities of substances usually based on chemical formulas or equations. |
Multi-step calculations: Substituting the resultant value of one calculation into another (excluding conversions). | A-Mu: multi-step questions involve multiple steps frequently based on the use or algebraic manipulation of mathematical formulas. |
The subsumed competencies of foundational and algorithmic knowledge used in this study are similar to those described by Smith and colleagues, given their applicability and utility in describing assessment items in the research setting. However, changes were made. For example, assessment tasks rarely relied solely on word association (e.g., the “definition” codes in the original scheme) and more often required students to apply a definition to categorize symbolically or otherwise represented chemical contexts (e.g., acid or base, ionic or covalent, oxidizing or reducing). These competencies were reorganized in the coding scheme used for the current study as “recall” and “identify,” wherein either could be applied to a multiple-choice assessment item. This reflects the difference between students associating a definition with its term and applying this definition toward identifying a relevant chemical context. As this reconceptualization of the “definition” knowledge reflects students' understanding and application of what could be considered introductory course content, the dimension of knowledge was renamed “foundational” for use in the current study.
Items coded as “conceptual” were the most challenging to nest in the framework, as the competencies described in the original coding scheme did not encompass many of the items observed in the research setting. For example, the code “analysis of pictorial representations” referred to items requiring students to translate pictorial representations into analyzing a chemical phenomenon. At the research setting, several items required translation across scales of representation but did not necessarily rely upon a pictorial representation (i.e., translating qualitative descriptions of a macroscopic phenomenon to symbolic representations that span the molecular, atomic, and subatomic scale). Additionally, there was no code to describe items wherein algorithmic calculations were necessary to evaluate chemical phenomena (e.g., calculate ΔG to evaluate whether the equilibrium of a chemical reaction will favor the reactants or products).
A subset (90 of 298 or 30.2%) of the assessment tasks were coded using each published framework or protocol to evaluate each by its applicability to the data collected. In early examinations of interrater reliability, where percent agreement was measured at 72% (65 of 90 items or Kappa = 0.549, indicating weak agreement), discerning items of the conceptual competencies from foundational and algorithmic contributed many disagreements amongst the researchers (19 of the 25 mismatched items). Then, we discovered complementary strengths and weaknesses between this and other frameworks published in prior research, which informed the decision to combine these frameworks into a single coding scheme.
In Defining Conceptual Understanding by Holme and colleagues (2015), what a “conceptual” assessment task entails was defined by the open-ended responses of 1396 chemistry instructors. The study helped expand descriptions of “conceptual” assessment tasks (see Table 7).
Current | Original (Smith et al., 2010) | (Holme et al., 2015) | (Laverty et al., 2016) |
---|---|---|---|
Applied: students are asked to use foundational or algorithmic knowledge to predict or explain chemical phenomena, compare chemical species, or evaluate data. | Depth: reason about core chemistry ideas using skills that go beyond mere rote memorization or algorithmic problem solving. | ||
Compare: reason at the interface of measures, structures, and properties to communicate differences between chemical species or phenomena (e.g., use the structures of two chemical species to determine which would have a higher boiling point). | Transfer: apply core chemistry ideas to chemical situations that are novel to the student. | ||
Evaluate: interpret quantitative data (e.g., tables, graphs, calculations, expressions) or descriptions to assess the nature of chemical phenomena (e.g., determine the degree to which a solution is saturated using a solubility curve). | C-I: questions involving the analysis of interpretation of data. | Problem-solving: demonstrate the critical thinking and reasoning involved in solving problems including laboratory measurement. | Evaluating information: make sense of information presented and demonstrate reasoning to support or deny its validity. |
Analyzing and interpreting data: given a claim or data, select an interpretation of its meaning. | |||
Using mathematics and computational thinking: perform a mathematical manipulation and interpret the results in the context of a phenomenon. | |||
Predict or explain: extend relevant models, chemical theories, or laws to predict or explain changes in chemical systems (i.e., apply atomic- and molecular-level models to explain changes in a solution). | C-E: questions involving the explanation of underlying ideas behind chemical phenomena. | Predict: expand situational knowledge to predict and/or explain behavior of chemical systems. | Developing and using models: given a representation, select an appropriate explanation or prediction about phenomenon. |
C-O: questions involving the prediction of outcomes. | Constructing explanations and engaging in argument from evidence: Select reasoning and evidence to support a claim. |
For example, the code “depth” was instrumental in defining the difference between conceptual and the other two competencies (foundational and algorithmic). Our interpretation of the framework was that it did not minimize the role of memorization or algorithmic computation in learning chemistry but acknowledged the cognitive effort required to commit these competencies to practice via application.
However, two weaknesses were identified in the descriptions provided by Holme and colleagues (Holme et al., 2015). The first was in the code description “transfer – applying core chemistry ideas using scenarios novel to students.” Such a code would require data supporting (or assumptions about) what is and is not novel to individual students. Thus, “transfer” was redefined as “compare” in the synthesized coding scheme. The second issue was tautology. The code “problem solving” was defined as “relating to demonstrations of critical thinking in reasoning.” This use of other vaguely defined terms (“problem-solving,” “reasoning,” and “critical thinking”) to define “conceptual” required an additional perspective to solidify codes relating to “conceptual” knowledge.
Thus, we integrated the Scientific Practices Criteria from the Three-Dimensional Learning Protocol advanced by Laverty and colleagues (2016). The criteria described helped identify “conceptual” competencies. For example, the criteria for “Using Mathematics and Computational Thinking” and “Analyzing and Interpreting Data” helped to differentiate between multi-step algorithmic (the performance of the calculation or test) and evaluate (interpreting quantitative data to assess the nature of chemical phenomena) test items. Overall, “conceptual” tasks were often applications of foundational or algorithmic knowledge toward a larger goal (e.g., modeling, predicting, explaining). Thus, “conceptual” tasks were relabeled as applied. A limitation of the Three-Dimensional Learning Protocol was in its applicability to the collected dataset. More than half of the assessment tasks collected in the research setting did not meet the criteria established for engaging in Scientific Practices, often because the collected assessment tasks did not require students to justify or explain their answer selections. We were able to clarify characterizations of the assessments collected in the research setting by synthesizing all three frameworks.
Recall: matching a term to a definition or description and vice versa.
The [A]t expression in an integrated rate law describes which of the following:
(A) The initial rate of the reaction
(B) The rate of the reaction at any point in time
(C) The concentration of a reactant at any point in time
(D) The concentration of a product at any point in time
(E) The initial concentration of a product
Requires students to match the definition for [A]t.
Identify: applying a definition to categorize a relevant chemical context (e.g., acid or base, ionic or covalent, oxidizing or reducing).
Write the equilibrium constant expression for the reaction:
2HgO(s) ⇄ 2Hg(l) + O2(g)
(A) K = [Hg]2[O2]
(B) K = [O2]
Requires students to identify this reaction as a heterogeneous equilibrium for which only gaseous and aqueous species would contribute a concentration.
Translate: interpreting (1) qualitative descriptions across scale (e.g., macroscopic observations explained using submicroscopic phenomena), or (2) representations across scale (e.g., photographs or descriptions of chemicals at the macroscopic-level, particulate representations of chemicals at the particulate-level, and symbolic representations of chemicals at the reaction-, molecular-, elemental-, and subatomic-levels).
Consider the following chemical reaction,
HSO4−(aq) + H2O(l) ⇌ SO42−(aq) + H3O+(aq)
Identify all statements that are true regarding the reaction above.
(I) HSO4−(aq) is a base in this reaction.
(II) H3O+(aq) is the conjugate acid of H2O(l).
(III) SO42−(aq) is the conjugate base of HSO4−(aq).
(A) only I (B) only II (C) only III (D) both I and II (E) both II and III
Requires students to translate from the given symbolic representation to consider particulate-level changes to the chemical species.
Macroscopic conversions: converting macroscopic quantities (e.g., volume in mL to L).
No items in the research setting were exclusively related to macroscopic conversions. However, many required these conversions as a subsidiary step. For example, the item prior relating to ΔG requires a student to convert ΔH from 65.0 kJ to 65000 J to subtract the term TΔS where ΔS is in units of J K−1.
Mole conversions: converting between macroscopic quantities (e.g., volumes, masses) and microscopic quantities (e.g., moles).
Similarly, no items were observed to enact only a mole conversion. For example, in the exemplar for the code “Multi-Step,” students would need to convert from moles of the solvent to the mass of the solvent (in kg) to determine the molality of the solution.
Stoichiometric conversions: converting between chemical contexts (i.e., moles of one chemical to moles of another) via applying a mol–mol ratio or incorporating chemical coefficients in a calculation.
Stoichiometric conversions applied to convert between chemical contexts (e.g., ΔHrxn and Q for 2.05 mol of NH3):
Consider the following balanced chemical equation:
4NH3(g) + 5O2(g) → 4NO(g) + 6H2O(l) ΔHrxn = +1168 kJ
How much heat is absorbed/released when 2.05 mol of NH3(g) reacts with excess O2(g) to produce NO(g) and H2O(l)?
(A) 5.99 × 102 kJ of heat is released.
(B) 5.99 × 102 kJ of heat is absorbed.
(C) 2.40 × 103 kJ of heat is released.
(D) 2.40 × 103 kJ of heat is absorbed.
(E) 1.02 × 104 kJ of heat is absorbed.
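The stoichiometric conversion in this item can be sketched numerically. The sign convention comment is my addition; the values come directly from the item:

```python
# Thermochemical stoichiometry: dH_rxn = +1168 kJ per 4 mol of NH3 reacted,
# as written in the balanced equation. A positive dH means heat is absorbed.
dH_rxn_kJ = 1168.0      # kJ per mole of reaction as written
mol_NH3_per_rxn = 4.0   # coefficient of NH3 in the balanced equation
mol_NH3 = 2.05          # moles of NH3 reacting

heat_kJ = mol_NH3 * dH_rxn_kJ / mol_NH3_per_rxn
print(f"{heat_kJ:.1f} kJ of heat is absorbed")  # 598.6 kJ, i.e., answer (B)
```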
Stoichiometric conversions are applied in the calculation of a state function (e.g., entropy) as for the item shown below:
Calculate ΔS° for the following reaction. The S° for each species is shown below the reaction.
C2H2(g) + 2H2(g) → C2H6(g)
S° (J (mol K)−1) 200.9 130.7 229.2
(A) −102.4 J (mol K)−1
(B) −233.1 J (mol K)−1
(C) 229.2 J (mol K)−1
(D) 303.3 J (mol K)−1
(E) 560.8 J (mol K)−1
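The stoichiometric conversion here enters through the coefficient of H2 when summing standard entropies; a minimal sketch using the item's values:

```python
# dS°_rxn = sum(n * S°, products) - sum(n * S°, reactants)
# for C2H2(g) + 2 H2(g) -> C2H6(g); S° values in J (mol K)^-1 from the item.
S = {"C2H2": 200.9, "H2": 130.7, "C2H6": 229.2}

dS_rxn = 1 * S["C2H6"] - (1 * S["C2H2"] + 2 * S["H2"])
print(f"{dS_rxn:.1f} J (mol K)^-1")  # -233.1, i.e., answer (B)
```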
Multi-step calculations: substituting the product of one calculation into another to arrive at a solution (excluding unit conversions). Note: not all algorithmic, multi-step items required computation (see the second exemplar).
The mole fraction of KNO3 in an aqueous solution is 0.0194. What is the molality of the solution? (MM of H2O is 18.02 g mol−1)
(A) 0.0194 m
(B) 0.0388 m
(C) 0.194 m
(D) 1.10 m
(E) 2.20 m
This task requires several steps, including:
1. Use the provided mole fraction of KNO3 (0.0194) to determine the moles of solute (the numerator of molality),
2. Subtract these moles of solute from moles of the solution to determine the moles of solvent,
3. Convert moles of solvent to the mass of solvent,
This was in addition to converting mass in g to mass in kg (the denominator of molality) to determine the molality of the solution, which (per the coding scheme) was not included as “steps” in the multi-step calculation.
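The steps above can be sketched as follows. Taking 1 mol of solution as the basis is an assumption for illustration (mole fraction is intensive, so any basis gives the same result):

```python
# Molality from mole fraction, following the steps listed above.
# Basis (assumed for illustration): 1 mol of solution in total.
x_KNO3 = 0.0194  # mole fraction of KNO3
MM_H2O = 18.02   # g/mol

mol_solute = x_KNO3 * 1.0                  # step 1: moles of KNO3 in 1 mol of solution
mol_solvent = 1.0 - mol_solute             # step 2: moles of H2O
kg_solvent = mol_solvent * MM_H2O / 1000   # step 3, plus the g -> kg conversion

molality = mol_solute / kg_solvent
print(f"{molality:.2f} m")  # 1.10 m, i.e., answer (D)
```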
Which answer best describes how to solve for the pH of a KOH solution when given the concentration of the solution?
(A) Write the base reaction, write an equilibrium table, and solve for [OH−], convert to pOH and then to pH
(B) Write the acid reaction and an equilibrium table, solve for [H3O+] and then pH
(C) Set concentration of KOH equal to [H3O+] and solve for pH
(D) Write the acid reaction, convert Ka to Kb, write an equilibrium table and solve for [OH−], convert to pOH, and then to pH
(E) Set concentration of KOH equal to [OH−], solve for pOH, and then pH
Requires students to identify the correct sequence of steps in solving the item.
Compare: reasoning at the interface of measures, structures, and properties to inform differences between chemical species or phenomena (e.g., relating the structures of two chemical species to compare their boiling points). These two exemplars differ in that one requires computation and the other does not.
A voltaic cell is designed with the following cell notation:
Li(s)|Li+(aq)‖Pb2+(aq)|Pb(s)
Which set of concentrations for Li+(aq) and Pb2+(aq) would produce the largest cell potential (Ecell) for this voltaic cell?
Li+(aq) | Pb2+(aq) | |
---|---|---|
(A) | 0.1 M | 1.0 M |
(B) | 1.0 M | 1.0 M |
(C) | 1.0 M | 0.1 M |
(D) | 0.1 M | 0.1 M |
(E) | Each has the same Ecell |
Requires students to compare each set of concentrations to determine which would render the smallest Q ([ion, oxidized]/[ion, reduced]) and thereby produce the largest Ecell.
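This comparison can be sketched numerically. The cell reaction assumed here (2 Li(s) + Pb2+(aq) → 2 Li+(aq) + Pb(s), so Q = [Li+]²/[Pb2+]) follows from the cell notation but is our reading, not stated in the item:

```python
# By the Nernst equation, Ecell = E°cell - (RT/nF) ln Q, so at fixed E°cell
# the smallest reaction quotient Q gives the largest Ecell.
# Assumed cell reaction: 2 Li(s) + Pb2+(aq) -> 2 Li+(aq) + Pb(s).
options = {"A": (0.1, 1.0), "B": (1.0, 1.0), "C": (1.0, 0.1), "D": (0.1, 0.1)}

Q = {label: li**2 / pb for label, (li, pb) in options.items()}
best = min(Q, key=Q.get)  # option with the smallest Q, hence the largest Ecell
print(best, Q[best])      # A 0.01
```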
Which of the following aqueous solutions (solute added to water) would have the lowest freezing point?
(A) 2.0 m KNO3
(B) 2.0 m CaF2
(C) 2.0 m KF
(D) 2.0 m CH3OH
(E) 2.0 m HClO4
Requires students to compare solutions by the ideal van't Hoff factor produced by each solute.
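This comparison reduces to counting ideal dissociation particles; a minimal sketch (the particle counts are the standard ideal values, assumed here):

```python
# Freezing-point depression: dTf = i * Kf * m. At equal molality (2.0 m),
# the solution with the largest ideal van't Hoff factor i freezes lowest.
i = {"KNO3": 2, "CaF2": 3, "KF": 2, "CH3OH": 1, "HClO4": 2}  # ideal particles per formula unit

lowest_fp = max(i, key=i.get)
print(lowest_fp)  # CaF2, i.e., answer (B)
```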
Evaluate: interpreting data, quantitative (in the form of a table, graph, or calculations within the context of a mathematical model) or qualitative (descriptions of the context and chemical systems involved), to assess the nature of a chemical context (e.g., determining the degree to which a solution is saturated using a solubility curve).
Determine the rate law for the following reaction using the data provided.
| Experiment | [NO], M | [O2], M | Reaction rate (mol L−1 s−1) |
|---|---|---|---|
| 1 | 5.5 × 10−3 | 3.0 × 10−2 | 8.55 × 10−3 |
| 2 | 1.1 × 10−2 | 3.0 × 10−2 | 1.71 × 10−2 |
| 3 | 5.5 × 10−3 | 6.0 × 10−2 | 3.42 × 10−2 |
(A) Rate = k[NO][O2]
(B) Rate = k[NO]2[O2]
(C) Rate = k[NO]2[O2]2
(D) Rate = k[NO][O2]1/2
(E) Rate = k[NO][O2]2
Requires students to use the quantitative data provided to determine the reaction order with respect to each reactant and thereby the rate law.
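The isolation method behind this item can be sketched numerically: compare pairs of experiments in which only one concentration changes, and take the order as log(rate ratio)/log(concentration ratio). The data below are taken from the table above:

```python
import math

def order(conc1: float, conc2: float, rate1: float, rate2: float) -> float:
    """Reaction order from two experiments in which only one
    concentration changes: n = log(rate2/rate1) / log(conc2/conc1)."""
    return math.log(rate2 / rate1) / math.log(conc2 / conc1)

# Data from the table: ([NO], [O2], rate)
exp1 = (5.5e-3, 3.0e-2, 8.55e-3)
exp2 = (1.1e-2, 3.0e-2, 1.71e-2)
exp3 = (5.5e-3, 6.0e-2, 3.42e-2)

# Experiments 1 and 2 isolate [NO]; experiments 1 and 3 isolate [O2]
order_no = order(exp1[0], exp2[0], exp1[2], exp2[2])
order_o2 = order(exp1[1], exp3[1], exp1[2], exp3[2])
print(round(order_no), round(order_o2))  # rate = k[NO][O2]^2
```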
Predict or explain: extending relevant models, chemical theories, or laws (e.g., the valence bond theory, the law of mass conservation, the laws of definite and multiple proportions, the ideal gas law) beyond algorithmic calculations to predict or select an explanation for changes in chemical systems (e.g., cause-effect relationships, structure-property relationships, and chemical interactions in solution).
Consider the following reaction at equilibrium. What effect will removing NO2 have on the system?
SO2(g) + NO2(g) ⇌ SO3(g) + NO(g)
(A) The reaction will shift in the direction of products.
(B) The reaction will shift in the direction of reactants.
(C) The reaction will shift to decrease the pressure.
(D) The equilibrium constant will decrease.
(E) No change will occur since NO2 is not included in the equilibrium expression.
Requires students to predict the impact removing NO2 will have on the equilibrium of the provided chemical context.
The normal boiling point for hydrazoic acid, HN3, is 37 °C compared to ammonia, NH3, which has a boiling point of −33.34 °C. This is best explained by:
(A) HN3 has a lower pH
(B) HN3 has larger intermolecular forces
(C) HN3 has a higher vapor pressure
(D) HN3 has a larger van't Hoff factor
(E) HN3 has lower surface tension
Requires students to explain the observed phenomenon (differences in boiling points between two chemical species).
Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation or other sources of funding. Finally, the authors wish to acknowledge our mentors, Drs Stacey Lowery Bretz, Renée Cole, Regis Komperda, Scott E. Lewis, Norbert J. Pienta, and Kathy Schuh for their investments in our methodological and philosophical training, the anonymous reviewers whose labor is critical to the advancement of education research, and the community of scholars who readily engaged in critical discussions that informed this research: Leslie Bolda, Megan Y. Deshaye, Kevin Pelaez, Leah J. Scharlott, Nicole Suarez, Andrea Van Wyk, and Drs. Kathryn N. Hosbein, Morgan E. Howe, Nicole M. James, Christiane N. Stachl, and Paulette Vincent-Ruz.
Footnote
† Electronic supplementary information (ESI) available. See DOI: https://doi.org/10.1039/d1rp00333j
This journal is © The Royal Society of Chemistry 2022