Unpacking graduate students’ knowledge for teaching solution chemistry concepts

Jacob C. Lutter, Lillian V. A. Hale and Ginger V. Shultz*
Department of Chemistry, Willard H. Dow Laboratories, University of Michigan, Ann Arbor, Michigan 48109, USA. E-mail: gshultz@umich.edu

Received 7th August 2018, Accepted 25th October 2018

First published on 6th November 2018


Abstract

Graduate students play a critical role in undergraduate education at doctoral-granting institutions, yet their training is often brief and overgeneralized. Little is known about how they develop knowledge for teaching chemistry in this context. To further understand how graduate students develop knowledge for teaching, we used a questionnaire to measure pedagogical content knowledge of solution chemistry concepts. These concepts are revisited frequently in the undergraduate curriculum, particularly in laboratory courses where graduate students commonly teach. Questions were included to measure both the pedagogical content knowledge and content knowledge of graduate students with a range of teaching experience. Analysis revealed that graduate students’ content knowledge is stronger than their pedagogical content knowledge, which increases modestly with teaching experience. Interviews were performed with a subset of respondents to determine whether they interpreted the questions as intended and to identify the source of knowledge they used when responding. The interviews revealed that graduate students relied heavily on their own experience as students rather than direct experience teaching solution chemistry concepts.


Introduction

There is a wealth of research on undergraduate learning in the sciences that has contributed to improved instructional practices at the college level (NRC, 2012). By contrast, instructors’ knowledge and use of these practices at the college level have received relatively little attention (Bond-Robinson, 2005; Alvarado et al., 2015; Mack and Towns, 2015; Rollnick and Davidowitz, 2015; Hale et al., 2016; Connor and Shultz, 2018). To address this gap, we sought to examine chemistry instructors who are in their formative years of teaching, specifically graduate students who teach at PhD-granting institutions. Graduate students represent the pool from which postsecondary institutions will ultimately draw new faculty; thus, research on graduate student instruction is critical for improving professional development activities around instruction (Luft et al., 2004). For this reason, studies focused on graduate student instruction have the potential to improve the quality of STEM instruction overall.

Graduate students are key contributors to classroom instruction at PhD-granting institutions in the U.S. because they have more direct contact hours with undergraduates than faculty do (Luft et al., 2004). As a result, graduate student teaching may have a larger impact on undergraduate learning than anticipated. This impact is further complicated by the culture at such institutions, where excellence in research is promoted and instructional development often falls by the wayside (Anderson et al., 2011). Graduate student instructional training at many institutions is short, overgeneralized, and disconnected from the research literature on STEM learning (Anderson et al., 1987; Anderson et al., 2011). In addition, many graduate students do not teach beyond their first year, which, compounded with “boot camp” style training, leaves scarce opportunity to develop formal pedagogical skills (Luft et al., 2004).

Most graduate students also do not have substantial teaching experience or instructional training when beginning graduate school. On the other hand, they are successful students who have spent a large amount of time in the classroom. For these reasons, they are likely to adopt teaching strategies modelled on those of their own instructors (Grossman, 1989). In these ways graduate students are distinct from the pre-service and professional secondary teachers on whom most studies of teaching have been conducted (Grossman, 1989; Grossman, 1990; Hale et al., 2016; Connor and Shultz, 2018), and additional studies are needed to elucidate the unique nature of their knowledge for teaching. Specifically, improving our understanding of when and where this knowledge is developed, and of whether graduate students develop it over time as they teach, will inform professional development that cultivates knowledge for teaching.

Knowledge for teaching chemistry

Pedagogical content knowledge (PCK) is a teacher-specific type of knowledge first described by Lee Shulman. Shulman proposed that teachers hold a specialized knowledge about content that is wholly separate from the type of knowledge that subject matter experts may have about the same content (Shulman, 1986; Shulman, 1987). He defined this specialized knowledge as (Shulman, 1987, p. 8):

“That special amalgam of content and pedagogy that is uniquely the province of teachers, their own special form of professional understanding.”

Shulman also considered PCK to be subject specific, meaning, for example, that a Chemistry teacher holds a distinct knowledge for teaching that differs from that of an English teacher (Shulman, 1986; Shulman, 1987).

The PCK model has evolved substantially since it was first conceptualized by Shulman and continues to be refined. Grossman expanded the PCK model by situating it within a set of knowledge types contained in a teacher's knowledge base, which include subject matter knowledge, general pedagogical knowledge, and knowledge of context (Grossman, 1990). Magnusson, Krajcik, and Borko drew on Grossman's model to define a science-specific PCK that includes a teacher's “knowledge and beliefs around scientific literacy” (Magnusson et al., 1999). Most recently, a consensus definition of PCK was formulated during a summit of area experts, who divided it into what a teacher knows about a particular topic and the context in which they are teaching (reflection on action), and how a teacher uses that knowledge (reflection in action) to achieve “enhanced student outcomes” (Gess-Newsome, 2015).

The PCK summit called for studies that (1) examine the nature of PCK in context; (2) ascertain the relative quality of PCK; and (3) use both direct and indirect measures (Carlson et al., 2015). In this study, we sought to examine PCK in the unique context in which graduate students develop it. As such, we view graduate students as sharing more in common with teaching interns than with the pre-service teachers about whom the majority of PCK studies have been conducted. Like interns, graduate students receive minimal instructional training and must develop teaching knowledge “on the job”. Grossman explored the development of teaching knowledge by teaching interns and observed that they encountered specific challenges when learning to teach, some of which are relevant to graduate students (Grossman, 1989). In particular, Grossman found that interns relied heavily on their experience as students and therefore reported using strategies that were not necessarily appropriate for the context in which they were teaching. Additionally, interns commonly held beliefs about what they expected their students to know that did not match what their students actually knew.

Another important outcome of the consensus PCK model is the agreement that the appropriate grain size to be measured is at the topic level (Gess-Newsome, 2015). That is, rather than focusing on the discipline (e.g. chemistry, English), PCK should be examined by topic (e.g. acid–base chemistry, magnetism). This aspect of PCK is consistent with the ‘topic specific’ model of PCK (TS-PCK) described by Mavhunga (Mavhunga and Rollnick, 2013). The character of teachers’ TS-PCK is distinct to each discrete topic that they may teach; an individual's knowledge for teaching equilibrium is different from their knowledge for teaching solubility. The TS-PCK model conceptualizes that knowledge is developed as an instructor transforms each discrete topic into a “pedagogically powerful form” as they reflect on practice (Hill, 2008; Mavhunga and Rollnick, 2013; Rollnick and Davidowitz, 2015). TS-PCK is divided into five areas: (1) students’ prior knowledge; (2) curricular saliency; (3) what is difficult to teach; (4) representations and analogies; and (5) conceptual teaching strategies.

Importantly, PCK correlates with student learning outcomes and instructional quality and can be improved through targeted professional development (Hill et al., 2005; Park et al., 2011), which suggests that by characterizing graduate students’ PCK and using this understanding to inform their training, we can improve their instruction and student learning outcomes on particular topics. Only a few studies have specifically examined the PCK of chemistry graduate students. Bond-Robinson found that graduate students were more apt to develop management skills than teaching knowledge even when directed teaching feedback was provided (Bond-Robinson, 2005). This is consistent with observations of teaching interns, who likewise develop classroom management skills before they develop teaching knowledge (Grossman, 1990). In prior studies, Hale, and later Connor, found that experienced graduate students had higher quality PCK than inexperienced graduate students, suggesting that it does develop over time in the absence of extensive training (Hale et al., 2016; Connor and Shultz, 2018). However, the difference was relatively modest. Graduate students relied heavily on their own experience as students, which is also consistent with Grossman's observations of teaching interns (Grossman, 1989). Additional research is needed to better understand the unique nature of graduate student instructors’ knowledge for teaching and the extent to which these prior findings hold in light of the specific context in which they learn to teach.

Content knowledge of solution chemistry

Content knowledge (CK) is a component of a teacher's knowledge base that is correlated with PCK and is considered a prerequisite for its development (Mavhunga and Rollnick, 2013; Alvarado et al., 2015). Moreover, the way that CK is used for teaching is distinct from the way that it is used by practicing experts (e.g. professional chemists) (Grossman, 1989), and “Content Knowledge alone is not sufficient for effective teaching” (Hill et al., 2005). Although both teachers’ CK and PCK are correlated with students’ learning, CK has lower predictive power than PCK in this regard (Hill et al., 2005; Hill, 2008). Graduate students typically have minimal teaching experience and, as novice instructors, are likely to have stronger CK as compared to PCK (Hale et al., 2016; Connor and Shultz, 2018).

CK is required for developing knowledge for teaching a particular subject area (Mavhunga and Rollnick, 2013; Alvarado et al., 2015), such as solution-state chemistry. Content knowledge for this study includes solubility and concentration as they would be applied in an organic chemistry laboratory course, because in the United States graduate students are frequently assigned to teach in laboratory courses and would be more likely to have had experience teaching concentration and solubility in this context. A meaningful understanding of concentration for laboratory courses (Lowery-Bretz, 2001; Lowery-Bretz et al., 2013) requires that students be able to relate microscale models of molecules in solution to physical observations (Taber, 2001), such as visualizing the dissolution of molecules and mentally translating that to the clear solution that they observe when a substance is completely dissolved (Valanides, 2000).

For the purposes of this study, CK is divided into declarative, procedural, and conceptual knowledge (Hiebert and Lefevre, 1986; Bransford et al., 2000; Anderson and Krathwohl, 2001). Declarative knowledge includes knowledge of discrete and unconnected content; a student with declarative knowledge of this topic may be able to recall the definition of concentration (Anderson and Krathwohl, 2001). Conceptual understanding has previously been defined as a network of individual pieces of information and the connections between them (Hiebert and Lefevre, 1986). Procedural knowledge includes knowledge of the “rules and procedures for solving exercises” (Hiebert and Lefevre, 1986). Said another way, procedural knowledge is understanding how to do something, and conceptual knowledge is an understanding of the relationships between ideas and how they function (Anderson and Krathwohl, 2001). Application of these topics in the lab requires both procedural knowledge (Furio-Mas et al., 2007), such as calculating concentration, and conceptual knowledge, such as interpreting the implications of molecular structure for solution-state properties (Cooper et al., 2013). It is important to note that procedural knowledge can be learned by rote, whereas conceptual knowledge cannot (Glaser, 1984). Therefore, distinguishing procedural and conceptual CK may establish differences in the way graduate students develop PCK.
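
As an illustration of this distinction (our example, not a questionnaire item), the procedural step of calculating concentration amounts to dividing the amount of solute by the total solution volume,

    c = \frac{n}{V} = \frac{0.50\ \text{mol}}{0.250\ \text{L}} = 2.0\ \text{mol L}^{-1},

whereas the conceptual counterpart is explaining why withdrawing half of this solution halves the quantity of solute but leaves its concentration unchanged.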

Learning of solution chemistry

A large number of studies have been reported on student learning of solution chemistry and related topics. Studies related to procedural knowledge of solutions and concentration have focused on the concepts of moles and stoichiometry (Wood and Breyfogle, 2006; Furio-Mas et al., 2007; Pekdağ and Azizoğlu, 2013; Ramful and Narod, 2014; Leonard, 2016). Other studies have investigated the particular difficulty students have with proportional reasoning as it relates to concentration (Wood and Breyfogle, 2006; Ramful and Narod, 2014). These studies revealed that students learn the mole concept only superficially and tend to conflate mass with “the amount of substance” (Furio et al., 2002). Furio attributed this to students’ developmental level and suggested that instructional strategies, such as using metaphors or analogies, may help students who are conducting mole calculations (Furio et al., 2002).
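
The conflation at issue is between the mass m of a sample and its amount of substance n, which are related through the molar mass M. As a worked illustration (ours, not drawn from the studies above), for NaCl with M = 58.44 g mol−1,

    n = \frac{m}{M} = \frac{58.44\ \text{g}}{58.44\ \text{g mol}^{-1}} = 1.00\ \text{mol},

so equal masses of different substances generally correspond to different amounts of substance.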

Conceptual understanding of solubility and concentration has been investigated for both students (Pinarbasi and Canpolat, 2003; de Berg, 2012) and pre-service teachers (Valanides, 2000; Cokadar, 2009; Ozden, 2009). These studies show that both students and teachers hold misconceptions about solution-state chemistry. The consistency between the conceptions held by students and those held by teachers suggests that misconceptions may arise from teachers. For example, Ebenezer investigated high school students’ conceptions of solubility and found that students confuse dissolving with melting and tend to incorrectly extend their understanding of materials at the macroscopic level to the microscopic level (Ebenezer and Erickson, 1996). This study also revealed a discrepancy between students’ and teachers’ use of technical vocabulary, where, for example, students’ use of the word “particle” in explaining a microscopic phenomenon did not convey that they sufficiently understood the meaning intended by the instructor. Likewise, when investigating the conceptions of pre-service teachers, Valanides found that they had difficulty translating macro-scale observations to micro-scale representations of dissolution (Valanides, 2000). Ozden found that pre-service teachers held alternative conceptions about the relationship between solubility and temperature, the meaning of a ‘saturated’ solution, and the influence of dissolved substances on the boiling point of a solution (Ozden, 2009).

Students must also develop knowledge of structure–property relationships (Cooper et al., 2013); namely, they must be able to interpret molecular structure to make predictions about solubility. Cokadar investigated students’ conceptions of solubility in relation to their understanding of polarity and found that students hold misconceptions about polarity that influenced their ability to predict the solubility of solute–solvent pairs (Cokadar, 2009). Students hold incorrect or partial understandings about what structural features give rise to polarity. Furthermore, they do not take molecular symmetry into account and attribute polarity to single features of a molecule. Cokadar reported that very few students demonstrated a “sound understanding” of solubility and determined that they relied primarily on “like dissolves like” as a guiding rule. This finding is consistent with literature indicating that students often rely on weak heuristics, which enable them to correctly answer questions that test computational or procedural knowledge but may fail when used to answer questions that require deep conceptual understanding (Maeyer and Talanquer, 2013; McClary and Talanquer, 2015). Given the difficulty of solution chemistry, it is essential that we investigate chemistry instructors’ knowledge for teaching it so that we can improve instructional quality and student outcomes.

Finally, it is important to note that the majority of prior studies on this topic are aimed at general chemistry and may not be applicable here. Students’ notions about solubility are formed in general chemistry, where water is the most commonly used solvent and solubility is considered in relation to precipitation reactions of metal ions in water (de Berg, 2012). This perspective may be problematic in organic chemistry courses, where organic solvents, rather than water, are considered.

The goal of this study was to understand more specifically how chemistry graduate students develop teaching knowledge at the topic level using a direct measure. The study was guided by the following research questions:

1. What is the extent of chemistry graduate students’ CK and PCK as they relate to teaching concentration and solubility?

2. What is the source of graduate students’ PCK of these same topics and how does PCK develop in this context?

Methods

A mixed methods approach was employed, relying primarily on a questionnaire designed to measure both graduate students’ TS-PCK and CK of solution chemistry concepts. Additionally, the questionnaire captured quantitative data on the participants’ teaching experiences and attitudes toward teaching organic chemistry. Semi-structured interviews were performed with a subset of the participants to determine whether the questions were interpreted as intended and to elucidate the origin of participants’ responses.

Data collection

The data presented in this manuscript are a subset from a larger study of pedagogical content knowledge among graduate students with various levels of experience. Many graduate students teach in a laboratory setting, so topics were selected from those commonly encountered in introductory organic chemistry laboratory courses. Previous work on the topic-specific PCK of other topics has been reported (Hale et al., 2016; Connor and Shultz, 2018); this manuscript applies the model to solution chemistry concepts. These topics are worthy of analysis because graduate students are likely to have had frequent opportunities to teach them. Understanding concepts such as concentration and solubility requires several fundamental chemical concepts (e.g. structure–property relationships) (Cooper et al., 2013). Combining these concepts into an understanding of solution properties is nearly automatic for the graduate student instructor, whose status as a developing expert may preclude recognition of the need to see solution chemistry as a sum of these parts. A barrier between instructor understanding and student understanding can therefore easily form, where the instructor does not recognize which concepts are missing or misunderstood.

Questionnaire design

The extent of graduate students’ solution chemistry PCK was measured using a questionnaire that was developed from the blueprint outlined in Table 1. An open-ended short-answer format was used in order to obtain a range of possible answers from the participants. Because CK is a prerequisite for PCK, questions were developed to interrogate both types of knowledge at various levels of difficulty (Mavhunga and Rollnick, 2013). The four CK questions were based on textbook questions and were designed to measure the graduate students’ declarative knowledge, procedural knowledge, and conceptual knowledge of solution chemistry concepts (Anderson and Krathwohl, 2001). These items were similar to those given to undergraduate organic chemistry students at the institution at which the study was conducted.
Table 1 Questionnaire blueprint for CK and PCK item development
Type of CK CK question
Declarative knowledge SQ1: Select pairs of solutes and solvents that have good solubility.
SQ4: Predict which solvent 3-pentanone would be least soluble in.
Procedural knowledge CQ1: Calculate the concentration of a specific solution.
Conceptual knowledge SQ3: When running a reaction, must the reagent be soluble in the reaction solvent?

Aspect of PCK PCK question
Conceptual teaching strategies CQ2: As if to a student, explain the difference between concentration and quantity.
CQ3: As if to a student, explain why it is important to know the concentration of a reagent in a given reaction mixture.
SQ2: As if to a student, explain how you would determine if a compound is soluble in a solvent.
Students’ prior knowledge CQ4: What do you expect students to understand about the concept of concentration when they begin this course?
What is difficult to teach CQ5: What about the concept of concentration do you think students will find most challenging in this course?


The five PCK questions were developed from literature examples of TS-PCK as well as the authors’ classroom experiences (Hill, 2008; Mavhunga and Rollnick, 2013). PCK can be reliably measured using particular components of the construct (Rowan et al., 2001), so the questions included only the three components that a graduate student would be likely to develop: understanding students’ prior knowledge, what is difficult to teach, and conceptual teaching strategies. The test items were refined through discussion with experts who had organic chemistry teaching experience, and piloted prior to data collection.

Cognitive interviews were conducted to validate the results and uncover the source of participant knowledge when responding. The interviews included items from the questionnaire and items that identified the graduate students’ attitudes towards teaching, prior teaching experience, and academic background. Because participants were not observed, these items might not differentiate between PCK and simple reasoning strategies (Willis et al., 1999). Therefore, “verbal probing techniques” outlined by Willis were incorporated to unearth why interviewees answered questionnaire items the way they did. The interviewer exercised substantial latitude, in particular when following lines of inquiry regarding the sources of knowledge used when responding to the questions.

Participants

The participants consisted of 68 graduate students enrolled in a chemistry doctoral program at a public university. Of the 68 participants, 36 participated in their first year before teaching the organic II laboratory for the first time. A subset of the participants (27 of the 68) participated both before and after their first time teaching the course, and both of their responses were used in the analysis to enable evaluation of TS-PCK growth over the semester. The remaining participants represent a wide range of experienced graduate students with at least two terms of experience teaching the course who were not teaching at the time of the study. These participant demographics are summarized in Table 2. Seven of the participants, including four novices and three experienced instructors, participated in cognitive interviews. Only five instructors were interviewed initially; however, after analysis it was clear that one instructor with extensive experience drew on different experiences in responding as compared to the other participants. Therefore, two additional instructors with relatively more experience (greater than one year) were interviewed after the initial analysis was completed. These two participants’ responses were found to be consistent with those of the first experienced instructor, and thus it was determined that a range of perspectives had been obtained. All consented to participate and IRB approval was obtained.
Table 2 Participant characteristics
Characteristic Number of participants
Gender
Male 41
Female 27
Year in program
1 53
2 3
3 2
4+ 10
Division
Organic 42
Other 26
Terms experience
0 36
1 19
2 2
3+ 14


Data analysis

A rubric was generated to analyse PCK questionnaire responses and was modelled after that reported by Rollnick (Rollnick and Davidowitz, 2015). The rubric for each question is provided in the appendix (Tables 4–8). For each rubric, scores ranged from 0 (incorrect) to 4 (transforming) and were assigned by the authors. A similar rubric was used for content knowledge items. For all PCK questions, a “limited” response at the low end was one that was factually correct but did not elaborate or expand for the student, whereas a “transforming” response at the high end was one that modified the explanation for the student in a way that demonstrated a transformation of CK by the participant. For example, a limited explanation (score = 1) for conceptual teaching strategies was:

“The concentration is the number of moles per 1 L of solution. Quantity is just an amount i.e. moles”

This response communicated an accurate definition of concentration, but one that was not transformed in any way for students. In contrast, the following response was rated as transforming and communicated the definition using an everyday concept “people in a room”:

“Quantity (moles) is like counting the how many people are in the room. Concentration is like finding the average number of people standing in a square foot of that same room.”

A subset of responses was scored first, and specific rubric levels were iteratively developed through discussion among the authors. A further subset of 20% of test items was scored independently by two authors and discussed until greater than 90% agreement was attained. Additional measures of inter-rater reliability were not obtained. A single author then scored the remaining responses. Response analysis and validation were performed using Rasch analysis with Winsteps software (Neumann et al., 2011). Raw scores were transformed to logit (log-odds) units, which place test-taker ability and test-item difficulty on a common scale. The scale is anchored so that the mean item difficulty is zero; an item of average difficulty therefore has a measure of zero logits (Bond and Fox, 2013). The model also provides a measure of a participant's ability with respect to a single latent variable, in this case CK or PCK of solution and concentration topics. Person reliability, item reliability, and fit statistics were within acceptable ranges and are provided in Table 3. The results were then statistically analysed.
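
The Rasch model expresses the probability of a correct response as a logistic function of the difference between person ability θ and item difficulty b. As a rough illustration of the kind of estimation Winsteps performs, the following Python sketch fits a dichotomous Rasch model by joint maximum likelihood and anchors the mean item difficulty at zero; it is a simplified stand-in under stated assumptions (binary rather than polytomous rubric scores, hypothetical data), not the procedure used by the software.

    import numpy as np

    def fit_rasch(X, n_iter=500, lr=0.05):
        """Joint maximum-likelihood fit of a dichotomous Rasch model.
        X: persons x items matrix of 0/1 scores (hypothetical data)."""
        n_persons, n_items = X.shape
        theta = np.zeros(n_persons)  # person ability, in logits
        b = np.zeros(n_items)        # item difficulty, in logits
        for _ in range(n_iter):
            p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
            theta += lr * (X - p).sum(axis=1)  # gradient ascent on the log-likelihood
            b -= lr * (X - p).sum(axis=0)
            b -= b.mean()  # anchor the scale: mean item difficulty = 0 logits
        return theta, b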

Table 3 Rasch reliability measures
Indices Content knowledge Pedagogical content knowledge
Person reliability 0.48 0.61
Item reliability 0.96 0.94
Fit statistics (−2 ≤ t ≤ +2) All scores All scores


Interviews were audio recorded and transcribed verbatim in preparation for a deductive coding process according to the model outlined by Hill (Hill, 2008). Coding was designed to capture interviewees’ sources of knowledge used when responding to the questionnaire. Modifications were made to Hill's model in three respects: only open-ended questions were involved, a topic-specific conceptualization of PCK was used, and sources of knowledge were emphasized. Multiple-choice or forced responses, which were originally used by Hill, were avoided here, and thus the use of test-taking strategies by interviewees was not evaluated. The “mathematical reasoning” code was renamed “chemical reasoning” for our purposes and was used to differentiate classroom experience from practical experience. Each author coded a subset of the interview data, and discrepancies were resolved through discussion. Due to the small sample size, the quantitatively transformed interview data were not statistically analysed.

Results

A questionnaire was used to investigate the extent of graduate students’ CK and PCK for teaching solution chemistry concepts in an organic chemistry context. Interviews were conducted to determine whether questions were interpreted as intended and to uncover the source of knowledge used by participants in responding. From the statistical analysis of responses, correlations were examined among CK scores, PCK scores, and terms taught.

Rasch analysis was used to evaluate and compare test item difficulty against participants’ scores, placing both on a common logit scale. A Wright map (Fig. 1) was used to evaluate how likely a participant would be to correctly answer an item (question) of a given difficulty: a participant with a logit score greater than the location of the item on the map is more likely to give a higher scoring answer. In Fig. 1, a participant who appears at 2.5 on the logit scale would be likely to answer all four content questions correctly. In contrast, a participant at −0.5 would only be likely to answer SQ1 correctly.
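
Under the dichotomous Rasch model, this reading of the map corresponds to

    P(\text{correct}) = \frac{1}{1 + e^{-(\theta - \delta)}},

where θ is the person measure and δ the item measure. For illustration (item measures read approximately from Fig. 1, so the numbers are ours): with the hardest CK item near δ ≈ 1 logit, a participant at θ = 2.5 has P ≈ 1/(1 + e^{−1.5}) ≈ 0.82 of answering it correctly, whereas a participant at θ = −0.5 has P ≈ 0.18.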


Fig. 1 Wright map of content knowledge questions.

The Wright map for CK questions (SQ1, SQ3, SQ4, and CQ1) is shown in Fig. 1. The difficulty of the items ranged between −1 and 1, while participant scores ranged between −2 and 4 on the logit scale. The person reliability was 0.48 and the item reliability was 0.96 (Table 3), which was consistent with similar measures reported by others for similar constructs. The distribution of participant performance is centred around 1 logit unit and suggests that most participants had a high likelihood of performing well on this aspect of the questionnaire.

The Wright map for PCK questions (SQ2, CQ2, CQ3, and CQ5) is shown in Fig. 2. The item difficulty ranged between −1 and 1 and the participant scores were between −6 and 2 on the logit scale. The person reliability was 0.61 and the item reliability was 0.94 (Table 3). The distribution of participant performance is much lower than for the CK questions, centred around −0.5, and has a much broader range, indicating that the participants were less likely to perform well on the PCK questions and demonstrating greater variability in their knowledge for teaching.


Fig. 2 Wright map of pedagogical content knowledge questions.

The relationship between mean score and terms of teaching experience was evaluated using a bubble plot (Fig. 3). Mean scores trend upwards for both CK and PCK as more terms are taught. The mean CK trend begins at 0.08, rises to a maximum of 1.08 by term 3, and then decreases slightly to 1.05 for those who have taught 3+ terms. There is a small, positive, and significant correlation between CK and terms taught of 0.338 (Pearson, p < 0.05, two-tailed). The mean PCK trend begins at −1.73 and rises steadily to a maximum of −0.45 for participants with 3+ terms of experience. PCK and terms taught are also correlated (0.392, p < 0.05, two-tailed), indicating a gain in PCK with teaching experience. In general, participants scored higher on the CK set than on the PCK set, which is consistent with the Wright map results.
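
The correlations reported here are ordinary Pearson product-moment coefficients between per-participant logit measures and terms taught. A minimal SciPy sketch (with hypothetical arrays in place of the study data) is:

    import numpy as np
    from scipy.stats import pearsonr

    # Hypothetical per-participant values; the study data are not reproduced here.
    terms_taught = np.array([0, 0, 1, 1, 2, 2, 3, 4])
    ck_logits = np.array([-0.2, 0.1, 0.4, 0.6, 0.9, 1.1, 1.0, 1.2])

    r, p = pearsonr(ck_logits, terms_taught)  # two-tailed p-value by default
    print(f"Pearson r = {r:.3f}, p = {p:.3f}")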


Fig. 3 Bubble plot of mean content knowledge and pedagogical content knowledge scores as a function of terms of teaching experience. The number of participants in each bin: 0 = 36, 1 = 21, 2 = 22, and 3+ = 15.

Fit statistics were obtained and used to assess how productively the questionnaire measured the constructs. For all items, MNSQ < 1.5 and/or |t| ≤ 2, confirming that the items were productive for measurement (Linacre, 2018). These are acceptable fit statistics, indicating that CK and PCK were reliably measured.
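
MNSQ refers to the Rasch mean-square fit statistics (infit and outfit). For the dichotomous case, a sketch of the standard computations, continuing the hypothetical fit_rasch example above, is shown below; values near 1 indicate that responses vary about as much as the model predicts, and the MNSQ < 1.5 criterion flags items with excess unmodelled noise.

    import numpy as np

    def fit_statistics(X, theta, b):
        """Infit/outfit mean-squares per item for a dichotomous Rasch model."""
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        var = p * (1.0 - p)                              # model variance of each response
        sq_resid = (X - p) ** 2                          # squared residuals
        outfit = (sq_resid / var).mean(axis=0)           # unweighted mean-square
        infit = sq_resid.sum(axis=0) / var.sum(axis=0)   # information-weighted mean-square
        return infit, outfit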

The CK and PCK scores for each participant were plotted to compare novice (0–1 terms taught) and experienced (2 or more terms taught) participants (Fig. 4). Higher scores for both CK and PCK were more consistently observed for experienced participants. The lack of a meaningful population in the upper left quadrant supports the claim that CK is a prerequisite for PCK, and there is a small but significant correlation (0.259, p < 0.05, two-tailed) between CK and PCK. However, the larger population in the lower right quadrant compared to the upper right quadrant indicates that high CK does not equate to high PCK within the context of the study.


Fig. 4 Scatterplot of pedagogical content knowledge and content knowledge logit scores.

Interviews were performed to investigate how the questions were interpreted by the participants and to elucidate the source of knowledge used in responding. All participants described a clear understanding of the wording and intent of each question, and no particular questions were identified as problematic during interviews. The interviews did reveal differences in the reported source of knowledge for each PCK question. Participants attributed their general teaching style, as well as their specific knowledge for teaching solution chemistry topics, to their own experiences as students, their experiences as TAs, or a reasoning process.

Participants reported relying on a variety of strategies when teaching solution concepts, which they drew on in responding to PCK questions. Each interviewee referred back to the widely used “like dissolves like” heuristic as a go-to tactic for helping students understand solubility, which is consistent with PCK question responses given by a large proportion of participants. One participant described his thinking about the heuristic as reasoned from the perspective of a student:

“Why do I draw on that phrase? I guess it's just easy to remember. If I were in (lab) still in the phase where I didn’t understand things and the TA said that to me, that would be something I could hold on to. I could have thought about that. It's just so simple.”

Another more experienced participant sourced his thinking from direct teaching experience and related an understanding of the limitations of the heuristic and the need to use other strategies:

“I’ve always gone back to the “like dissolves like” that they have been taught, and sometimes it helps and sometimes it doesn’t. This goes back to the polarity is a gradient and its ratio of C to H atoms and charges. Sometimes they need to see a physical representation – this is a clear solution or a cloudy solution. To see the theoretical – why is it soluble? Then you go the polar route.”

This response was consistent with the responses of the two other experienced participants, who also conveyed the limitation of the “like dissolves like” heuristic in helping students to understand polarity.

Participants also described other strategies such as connecting solubility to real life experiences, for example mixing oil and vinegar for salad dressing. They also asked students to draw in order to help them visualize structure differences:

“If I was talking to a student I would say “let's draw it out” and really break it down. Lead them to the point where polarity is going to be the most different between 3-pentanone and water [SQ4, Table 1], and that's why it would be the least soluble.”

Four of the seven participants related an understanding that students would have difficulty going beyond a distinction of solubility as a binary property (it is or isn’t soluble) as compared to a range. For example, one said:

“Students think of things as being polar or not polar. As you go along you gain experience and you realize that everything has some degree of polarity. It's a spectrum not a yes or no statement.”

Participants reported relying on their own experiences as students as well as their experience teaching when responding to questions about solution chemistry concepts. They also described experiences from teaching when asked about teaching strategies for solution chemistry concepts, but had a tendency to reason from general teaching experiences as opposed to describing experiences specific to teaching solution concepts. For example, one participant described a general teaching strategy that might be applied to any concept, which suggested a reliance on a more general knowledge for teaching as opposed to specific knowledge for teaching this topic:

“The first thing I thought about was intermolecular interactions. For me, when I try teaching something I try to think of an example. So it's not just a concept out of the air. You have something concrete to link it to.”

Although drawing on “intermolecular interactions” is an appropriate strategy for teaching solubility, we considered this a general teaching strategy because the participant specifically described that their use of a “concrete example” was a strategy adapted from experience teaching other topics.

Interviewees with less teaching experience drew almost completely on their own experiences as students when describing why these topics might be easy or difficult. One participant reflected on his experience in the lab to rationalize how he expected students to think about solubility:

“It's a value judgement I guess. I did say for concentration in organic solvents might be tricky because students may be used to thinking about concentration in water. But its something I’ve got to wrap my head around. There are more solvents than just water. Once you get past that it flows pretty logically from there.”

One participant reflected on research experience as a source of knowledge:

“The thing that got me over the hump, in terms of thinking about it, was being in the lab a lot. You begin to get familiar with the different solvents, but that's kind of difficult for students because they don’t have that familiarity and they aren’t going to get it at any point.”

Both examples indicate a specificity of thinking about teaching concentration and solubility that arose from their own experiences in the classroom or lab rather than direct teaching experience.

Limitations

Pedagogical content knowledge is challenging to study because it exists both in the mind of the instructor (internal) and through demonstration in practice (external) (Carlson, 1990). Open-ended questions were used rather than observations: observation would provide information about how graduate students enact their knowledge for teaching, but it is not an efficient method for surfacing knowledge from a large number of participants. Inferential techniques, such as short answer questions, may not reveal how an instructor will act on this knowledge, and thus the questionnaire described here cannot fully capture knowledge for teaching solution chemistry concepts. Multiple-choice questions were not used because they may show poor criterion-related validity (Carlson, 1990); open-ended questions provided a means to capture aspects of this knowledge that we could not anticipate when designing the items. Open-ended questions also offered the advantage of capturing responses from a greater number of participants than might be practical with other methods. Interviews were performed to determine whether the questions were interpreted as intended, but only seven graduate students participated in the interviews, so a full range of possible views may not have been obtained. Finally, response bias (Fowler, 2013) may have prevented us from capturing the full extent of some participants’ knowledge. For example, participants who were fatigued or unmotivated may have provided terse responses to some questions, which were scored lower than was representative of their actual understanding.

Discussion

In this study we examined chemistry graduate students’ knowledge for teaching solution chemistry using a set of open-ended questions. The questionnaire results demonstrated positive relationships between PCK, CK, and teaching experience that are consistent with the framework proposed by Shulman (Shulman, 1986) and findings reported separately by Hale (Hale et al., 2016), Connor (Connor and Shultz, 2018), and Mavhunga (Mavhunga and Rollnick, 2013). A positive and significant correlation between CK and PCK was observed (Fig. 4), which indicates that graduate students with greater CK are likely to demonstrate greater PCK. However, higher CK did not guarantee higher PCK; several participants with high CK scores did not perform well on the PCK questions relative to their peers. Further, no participants had high PCK and low CK, which is consistent with the notion that CK is a prerequisite for PCK.

A modest increase in PCK was observed for participants with more teaching experience, indicating that PCK can be developed through practice. A small, positive, significant correlation between both PCK and CK and the number of terms taught was observed (Fig. 3). These trends are consistent with observations made by Lederman (Lederman, 1992) and Davis (Davis and Krajcik, 2005), who reported increases in PCK upon self-assessment and practice of pedagogical techniques. The increase in CK may be related to teaching the content and becoming familiar with these concepts; graduate students with at least three terms of teaching experience did not score below zero on the CK logit scale (Fig. 4).

The results support the assertion that graduate students’ knowledge for teaching compares more closely with that of the interns described by Grossman (Grossman, 1989), who also lacked formal pedagogical training. Study participants performed well on CK questions, which is consistent with their relative subject matter expertise and the level of CK question difficulty; they can correctly answer textbook questions that might be posed to their students. Participants performed less well on PCK questions, which indicates that their knowledge for teaching solution-state chemistry is not as well developed. The Wright maps in Fig. 1 and 2 demonstrate a disparity in participant performance on the CK items relative to the PCK items. This difference is also observed in the bubble plot in Fig. 3, where the average score is consistently higher for the CK questions than for the PCK questions. Finally, the scatterplot in Fig. 4 shows a distribution of participants with a strong bias toward positive CK scores as compared to a more negative bias in PCK scores. This result is consistent with the participants’ background expertise in CK gained through undergraduate studies and their minimal prior teaching experience or training.

A difficulty hierarchy (Linacre, 2004) (Fig. 2) was observed in the PCK questions, where participants performed best on CQ2 (conceptual teaching strategy), which asked them to describe the difference between concentration and quantity. They performed least well on CQ4/5 (knowledge of students), which asked participants to predict what students would struggle with and what expectations of prior knowledge they have for students in their class. This finding agrees with a prior investigation of graduate student PCK by Hale (Hale et al., 2016), in that both studies found that graduate students performed best on questions related to conceptual teaching strategies and poorest on questions related to knowledge of their students.

The experiences that contributed to graduate students’ knowledge for teaching solution chemistry were examined using semi-structured interviews. In responding to PCK questions, less experienced participants’ knowledge derived primarily from a process of reasoning based on their own experience as students. To a lesser extent they drew on teaching experience, though this was largely adapted from general teaching experiences rather than direct experience teaching solubility and concentration. The tendency to rely on experiences as students is also consistent with Grossman, who found that novice instructors with no pedagogical training emulate the instruction they themselves received (Grossman, 1989).

All participants reported relying on heuristics such as “like dissolves like”, which were sourced from their own experience as students. However, the more experienced participants each described the limitations of heuristics and explained why they are limited. Similarly, Cokadar reported that prospective high school chemistry teachers relied on “like dissolves like” as a guiding rule (Cokadar, 2009). The reliance on heuristics by more novice instructors suggests that views on heuristics may be a marker for PCK development, and that those with higher PCK will know when such a heuristic is appropriate for students to use and when to use other strategies. However, additional study is needed to establish whether experienced graduate students are more likely to recognize the limitations of such heuristics.

Implications

This study is among the first reported inferential measures of chemistry graduate students’ pedagogical content knowledge. Because graduate students learn to teach in a much different context than secondary educators, this study adds to the existing literature on teacher knowledge. The questionnaire described here provides a method that can be used to better understand how graduate students learn to teach specific chemistry topics, namely solubility and concentration. There was a modest increase in these graduate students’ pedagogical content knowledge of these topics with terms of teaching experience, which suggests that it can be developed over time. Furthermore, interviews revealed that graduate students relied primarily on their own experience as students when responding to PCK questions. In this regard, they may share more in common with subject matter experts who have little or no instructional training than with the secondary teachers on whom the PCK model has primarily been developed. For this reason, additional work is needed to better understand how graduate students develop knowledge for teaching solution concepts and other topics in chemistry.

The results of this study indicate that graduate student instructional training can be improved to better expedite the development of this knowledge. One possible adaptation that arose directly from these findings relates to the different views on heuristics held by novice and experienced instructors. One approach could be to include discussion with graduate student instructors about the benefits and limitations of heuristics in order to prepare them to use heuristics such as “like dissolves like” more effectively.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: PCK scoring rubrics

Table 4 Rubric for question CQ2: as if to a student, explain the difference between concentration (molarity) and quantity (moles)
Score Definition Exemplar
(0) Incorrect Provides incorrect explanation. N/A
(1) Limited Defines both concentration and quantity correctly, but does not provide a standard or elaborated explanation. “Concentration is moles per liter of solvent. Moles is just the amount of solid material you have.”
(2) Basic Identifies the difference between concentration and quantity and provides standard “textbook like” explanation. “Moles refers to the number of molecules in terms of Avogadro's number, and molarity provides us with concentration, refers to the number of said moles that would be present in 1 liter of solution.”
(3) Developing Identifies the difference between concentration and quantity, provides textbook like explanation, which they expand on or rephrase. “Concentration is the amount of solute per unit of solvent, while quantity is the total amount of solute you are using. If you make up a big batch of solution and begin using it up in an experiment, the total amount of solute decreases as you use it up, but the concentration remains the same.”
(4) Transforming Identifies the difference between concentration and quantity, provides textbook explanation, which they expand or rephrase. Demonstrates a transformation of the knowledge indicative of PCK. “Moles is the term used to describe a collection of particles, so as an example if you have a cookie with 5 chocolate chips you will say that there are 5 chocolate chips. If you want to look at the concentration, you must compare how many moles, or in this case chocolate chips, are available per a unit of space/volume. In this case, you could say there are 5 chocolate chips per cookie. It is also very important to make sure the student can describe that moles is a book keeping number to organize a large quantity.”


Table 5 Rubric for question CQ3: as if to a student, explain why it is important to know the concentration of a reagent in a given reaction mixture
Score Definition Exemplar
(0) Incorrect Provides incorrect explanation. N/A
(1) Limited Explains why it is important to know concentration of a reagent in a given reaction mixture. “Concentration can be used to determine rate constants.” OR “Concentration has large effects on reactivity and rate.”
(2) Basic Explains why it is important to know concentration of a reagent and provides standard “textbook like” explanation. “It is important because the amount of reagent you actually added needs to be calculated using the concentration. If you are adding a very dilute solution, you have to add more of it to add the full amount needed.”
(3) Developing Explains why it is important to know concentration of a reagent, provides standard textbook like explanation, and explains or rephrases explanation. “Molecules must encounter each other to react, higher concentration means more encounters, and therefore faster reaction.”
(4) Transforming Explains why it is important to know concentration of a reagent, provides standard textbook like explanation, and explains or rephrases explanation. Demonstrates a transformation of the knowledge indicative of PCK. N/A


Table 6 Rubric for question CQ4: what do you expect students to understand about the concept of concentration when they begin this course?
Score Definition Exemplar
(0) Incorrect Provides incorrect explanation or insufficient response to score. No expectation.
(1) Limited Identifies an area that students should know, but does not explain or expand. “How to calculate C = n/V.” OR “Everything. It is a simple topic that everyone learns in high school chemistry.”
(2) Basic Identifies an area that students should know and includes a standardized explanation. “They should know everything about it, because they should not have been allowed to skip/pass Gen Chem without knowing it or pass Orgo I without knowing it. However, in my experience they know next to nothing about it and need to be taught it in lab”
(3) Developing Identifies an area that students should know and provides an expanded or rephrased explanation. “I think a student will know that some things are very concentrated while others are not. Many may know the concepts of molarity and molality and maybe ppm. I think their conceptual ability to use it to problem solve may still be very basic.”
(4) Transforming Identifies an area that students should know and demonstrates a transformed understanding (curricular saliency). “From past experience, a lot of students had trouble with dimensional analysis and calculating concentration and moles. I think they understand it based on the units of moles per volume and not conceptually as number of molecules distributed in a certain volume.”


Table 7 Rubric for question CQ5: what about the concept of concentration do you think students will find most challenging in this course (organic chemistry II laboratory)?
Score Definition Exemplar
(0) Incorrect Provides incorrect explanation. N/A
(1) Limited Identifies an area of student difficulty. “How to calculate it.” OR “How to distinguish it from total amount.”
(2) Basic Identifies an area of student difficulty and provides standardized textbook like explanation. “How to use it to calculate how much of the liquid that you need to add to have the desired reaction.”
(3) Developing Identifies an area of student difficulty and provides expanded or rephrased explanation. “Concentration, as a quantitative measure, is not an everyday tool we use. Students will need to connect the concentration of a substance to a problem that they have, and recognize that it can affect the quality of their results just as much as any other aspect of how they set up their experiments.”
(4) Transforming Identifies an area of student difficulty, provides an expanded or rephrased explanation, and demonstrates a transformed understanding of student difficulty (knowledge of students). N/A no participants demonstrated specific knowledge of students.


Table 8 Rubric for question SQ2: as if to a student, explain how you would determine if a compound is soluble in a solvent
Score Definition Exemplar
(0) Incorrect Insufficient answer to score or incorrect. N/A
(1) Limited Simplistic answer with limited information. “Take a small amount of a compound and add 1 mL of solvent to it.”
(2) Basic Provides straightforward instruction or observation sufficient for student to get started. “Add some solvent to the compound. If there is no visible solid then the compound is soluble.”
(3) Developing Detailed instruction, which student could follow and enact. “Take a small vial and add ∼3 mL of solvent. Add a very small amount, the tip of a micro spatula, of the solute and pour it in. Swirl/wait for a minute or two and see if the solute has dissolved to leave a clear solution with no chunks.”
(4) Transforming Provides detailed instruction, which student could follow and enact. In addition, the instruction provides a strong conceptual teaching strategy. N/A no participants included a conceptual teaching strategy.


Acknowledgements

Acknowledgement is made to the UM President's Postdoctoral Fellowship and the UM CSEI|UM Future Faculty Program for funding.

Notes and references

  1. Alvarado C., Canada F., Garritz A. and Mellado V., (2015), Canonical pedagogical content knowledge by CoRes for teaching acid base chemistry at high school, Chem. Educ. Res. Pract., 16, 603–618.
  2. Anderson C. W., Smith E. L. and Richardson-Koehler V., (1987), Educator's handbook: A research perspective.
  3. Anderson L. W. and Krathwohl D. R., (2001), A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives: Complete edition, New York: Longman.
  4. Anderson W. A., Banerjee U., Drennen C. L., Elgin S. C. R., Epstein I. R. and Handelsman J., et al., (2011), Changing the culture of science education at research universities, Science, 331, 152–153.
  5. Bond T. G. and Fox C. M., (2013), Applying the Rasch Model: Fundamental Measurement in the Human Sciences, New York: Routledge.
  6. Bond-Robinson J., (2005), Identifying pedagogical content knowledge in the chemistry laboratory, Chem. Educ. Res. Pract., 6, 83–103.
  7. Bransford J., Brown A. and Cocking R., (2000), How people learn: brain, mind, experience, and school, Washington, D.C.: National Academies Press.
  8. Carlson J., Stokes L., Helms J., Gess-Newsome J. and Gardner A., (2015), The PCK Summit: A process and structure for challenging current ideas, provoking future work, and considering new directions, New York: Routledge.
  9. Carlson R. E., (1990), Assessing Teachers' Pedagogical Content Knowledge: Item Development Issues, J. Pers. Eval. Educ., 4, 157–173.
  10. Cokadar H., (2009), First year prospective teachers' perceptions of molecular polarity and properties of solutions, Asian J. Chem., 21, 75–85.
  11. Connor M. C. and Shultz G. V., (2018), Teaching assistants' topic-specific pedagogical content knowledge in 1H NMR spectroscopy, Chem. Educ. Res. Pract., 19, 653–669.
  12. Cooper M. M., Corley L. M. and Underwood S. M., (2013), An investigation of college chemistry students' understanding of structure-property relationships, J. Res. Sci. Teach., 50, 699–721.
  13. Davis E. A. and Krajcik J., (2005), Designing educative curriculum materials to promote teacher learning, Educ. Res., 44, 263–272.
  14. de Berg K., (2012), A study of first-year chemistry students' understanding of solution chemistry at the tertiary level, Chem. Educ. Res. Pract., 13, 8–16.
  15. Ebenezer J. V. and Erickson G. L., (1996), Chemistry students' conceptions of solubility: A phenomenography, Sci. Educ., 80, 181–201.
  16. Fowler F. J., (2013), Survey Research Methods, Boston, MA: Sage Publications.
  17. Furio C., Azcona R. and Quisasola J., (2002), The learning and teaching of the concepts ‘amount of substance’ and ‘mole’: A review of the literature, Chem. Educ. Res. Pract., 3, 277–292.
  18. Furio-Mas C., Calatayud M. and Barcenas S., (2007), Surveying students' conceptual and procedural knowledge of acid-base behavior of substances, J. Chem. Educ., 84, 1717–1724.
  19. Gess-Newsome J., (2015), A model of teacher professional knowledge and skill including PCK, New York, NY: Routledge.
  20. Glaser R., (1984), Education and thinking: The role of knowledge, Am. Psychol., 39, 1–54.
  21. Grossman P. L., (1989), Learning to teach without teacher education, Teach. Coll. Rec., 91, 191–208.
  22. Grossman P. L., (1990), The making of a teacher: Teacher knowledge and teacher education, Teach. Coll. Press.
  23. Hale L. V. A., Lutter J. C. and Shultz G. V., (2016), The development of a tool for measuring graduate students' topic specific pedagogical content knowledge, Chem. Educ. Res. Pract., 17, 700–710.
  24. Hiebert J. and Lefevre P., (1986), Conceptual and procedural knowledge: The case of mathematics, New Jersey: Lawrence Erlbaum and Associates.
  25. Hill H. C., (2008), Unpacking pedagogical content knowledge: Conceptualizing and measuring teachers' topic-specific knowledge of students, J. Res. Math. Educ., 39, 372–400.
  26. Hill H. C., Rowan B. and Ball D. L., (2005), Effects of teachers mathematical knowledge for teaching on student achievement, Am. Educ. Res. J., 42, 371–406.
  27. Lederman N. G., (1992), Students' and teachers' conceptions of the nature of science: A review of the research, J. Res. Sci. Teach., 29, 331–359.
  28. Leonard B. P., (2016), Why is 'amount of substance' so poorly understood? The mysterious Avogadro constant is the culprit!, Accred. Qual. Assur., 21, 231–236.
  29. Linacre J. M., (2004), Test Validity and Rasch Measurement: Construct, Content, etc., Rasch Meas. Trans., 18, 970–971.
  30. Linacre J. M., (2018), A user's guide to Winsteps/Ministep: Rasch-model computer programs, retrieved from http://www.winsteps.com/a/winsteps.pdf.
  31. Lowery-Bretz S., (2001), Novak's theory of education: Human constructivism and meaningful learning, J. Chem. Educ., 78, 1107–1108.
  32. Lowery-Bretz S., Fay M., Bruck L. B. and Towns M. H., (2013), What faculty interviews reveal about meaningful learning in the undergraduate chemistry laboratory, J. Chem. Educ., 90, 281–288.
  33. Luft J. A., Kurdziel J. P., Roehrig G. H. and Turner J., (2004), Growing a garden without water: teaching assistants in introductory laboratory sciences at a doctoral/research university, J. Res. Sci. Teach., 41, 211–233.
  34. Mack M. R. and Towns M. H., (2015), Faculty beliefs about the purposes for teaching undergraduate physical chemistry courses, Chem. Educ. Res. Pract., 17, 80–99.
  35. Maeyer J. and Talanquer V., (2013), Making predictions about chemical reactivity: assumptions and heuristics, J. Res. Sci. Teach., 50, 748–767.
  36. Magnusson S., Krajcik J. and Borko H., (1999), Nature, sources, and development of pedagogical content knowledge for science teaching, Netherlands: Kluwer Academic Publishers.
  37. Mavhunga E. and Rollnick M., (2013), Improving PCK of Chemical Equilibrium in Pre-service Teachers, African Journal of Research in Mathematics, Science and Technology Education, 17, 113–125.
  38. McClary L. and Talanquer V., (2015), Heuristic reasoning in chemistry: Making decisions about acid strength, Int. J. Sci. Educ., 33, 1433–1454.
  39. Neumann I., Neumann K. and Nehm R., (2011), Evaluating instrument quality in science education: Rasch-based analyses of a nature of science test, Int. J. Sci. Educ., 33, 1373–1405.
  40. NRC, (2012), Discipline-based education research: Understanding and improving learning in undergraduate science and engineering, National Academies Press.
  41. Ozden M., (2009), Prospective science teachers' conceptions of solution chemistry., J. Balt. Sci. Educ., 8, 69–78.
  42. Park S., Jang J., Chen Y. and Jung J., (2011), Is pedagogical content knowledge necessary for reformed science teaching? Evidence from an empirical study, Res. Sci. Educ., 41, 245–260.
  43. Pekdağ B. and Azizoğlu N., (2013), Semantic mistakes and didactic difficulties in teaching the “amount of substance” concept: a useful model, Chem. Educ. Res. Pract., 14, 117–129.
  44. Pinarbasi T. and Canpolat N., (2003), Students' understanding of solution chemistry concepts, J. Chem. Educ., 2003, 1.
  45. Ramful A. and Narod F. B., (2014), Proportional reasoning in the learning of chemistry: Levels of complexity, Math. Educ. Res., 26, 25–46.
  46. Rollnick M. and Davidowitz B., (2015), Topic Specific PCK of Subject Matter Specialists in Grade 12 Organic Chemistry, Eduardo Mondlane University.
  47. Rowan B., Schilling S. G., Ball D. L. and Miller R., (2001), Measuring teachers' pedagogical content knowledge in surveys: an exploratory study, Ann Arbor: University of Pennsylvania.
  48. Shulman L. S., (1986), Those who understand: Knowledge growth in teaching, Educ. Res., 15, 4–14.
  49. Shulman L. S., (1987), Knowledge and teaching: Foundations of the new reform, Harv. Educ. Rev., 57, 1–22.
  50. Taber K. S., (2001), Building the structural concepts of chemistry: Some considerations from educational research, Chem. Educ. Res. Pract., 2, 123–158.
  51. Valanides N., (2000), Primary student teachers' understanding of the particulate nature of matter and its transformations during dissolving., Chem. Educ. Res. Pract., 1, 249–262.
  52. Willis G., Lessler J. T. and Caspar R. A., (1999), presented in part at the Meeting of the American Statistical Association, Iowa.
  53. Wood C. and Breyfogle B., (2006), Interactive demonstrations for mole ratios and limiting reagents, J. Chem. Educ., 83, 741–748.
