Jacob C.
Lutter
,
Lillian V. A.
Hale
and
Ginger V.
Shultz
*
Department of Chemistry, Willard H. Dow Laboratories, University of Michigan, Ann Arbor, Michigan 48109, USA. E-mail: gshultz@umich.edu
First published on 6th November 2018
Graduate students play a critical role in undergraduate education at doctoral-granting institutions and yet their training is often brief and overgeneralized. Little is known about how they develop knowledge for teaching chemistry in this context. To further understand how graduate students develop knowledge for teaching, we used a questionnaire to measure pedagogical content knowledge of solution chemistry concepts. These concepts are revisited frequently in the undergraduate curriculum, particularly in laboratory courses where graduate students commonly teach. Questions were included to measure both the pedagogical content knowledge and content knowledge of graduate students with a range of teaching experience. Analysis revealed that graduate students’ content knowledge is stronger than their pedagogical content knowledge, which increases modestly with teaching experience. Interviews were performed with a subset of respondents to determine whether they interpreted the questions as intended and the source of knowledge they used in responding. The interviews revealed that graduate students relied heavily on their own experience as students rather than direct experience teaching solution chemistry concepts.
Graduate students are key contributors to classroom instruction at PhD granting institutions in the U.S. because they have more direct contact hours with undergraduates relative to faculty (Luft et al., 2004). As a result, graduate student teaching may have a larger impact on undergraduate learning than anticipated. This impact is further complicated by the culture at such institutions, where excellence in research is promoted and instructional development often falls by the wayside (Anderson et al., 2011). Graduate student instructional training at many institutions is often short, overgeneralized, and out of touch with research-based STEM learning literature (Anderson et al., 1987; Anderson et al., 2011). In addition, many graduate students do not teach beyond their first year, which, compounded with “boot camp” style training, results in scarce opportunity to develop formal pedagogical skills (Luft et al., 2004).
Most graduate students also do not have substantial teaching experience or instructional training when beginning graduate school. On the other hand, they are successful students who have spent a large amount of time in the classroom. For these reasons, they are likely to adopt teaching strategies modelled on those of their own instructors (Grossman, 1989). In these ways graduate students are distinct from the pre-service and professional secondary teachers on whom most teaching studies have been conducted (Grossman, 1989; Grossman, 1990; Hale et al., 2016; Connor and Shultz, 2018), and additional studies are needed to elucidate the unique nature of their knowledge for teaching. Specifically, improving our understanding of when and where this knowledge is developed, and whether graduate students develop it over time as they teach, will inform professional development that cultivates knowledge for teaching.
“That special amalgam of content and pedagogy that is uniquely the province of teachers, their own special form of professional understanding.”
Shulman also considered PCK to be subject specific, meaning, for example, that a Chemistry teacher holds a distinct knowledge for teaching that differs from that of an English teacher (Shulman, 1986; Shulman, 1987).
The PCK model has evolved substantially since it was first conceptualized by Shulman and continues to be refined. Grossman expanded the PCK model by situating it within a set of knowledge types contained in a teacher's knowledge base, which include subject matter knowledge, general pedagogical knowledge, and knowledge of context (Grossman, 1990). Magnusson and Krajcik drew on Grossman's model to define a science specific PCK that includes a teacher's “knowledge and beliefs around scientific literacy” (Magnusson et al., 1999). Most recently a consensus definition of PCK was formulated during a summit of area experts, who divided it into what a teacher knows about a particular topic, the context in which they are teaching (reflection on action), and how a teacher uses that knowledge (reflection in action) to achieve “enhanced student outcomes” (Gess-Newsome, 2015).
The PCK summit called for studies that (1) examine the nature of PCK in context; (2) ascertain the relative quality of PCK; and (3) use both direct and indirect measures (Carlson et al., 2015). In this study, we sought to examine PCK in the unique context in which graduate students develop it. As such, we view graduate students as sharing more in common with teaching interns than pre-service teachers, about whom the majority of PCK studies have been conducted. Like interns, graduate students receive minimal instructional training and must develop teaching knowledge “on the job”. Grossman explored the development of teaching knowledge by teaching interns and observed that they encountered specific challenges when learning to teach, some of which are relevant to graduate students (Grossman, 1989). In particular Grossman found that interns relied heavily on their experience as students and therefore reported using strategies that weren’t necessarily appropriate for the context in which they were teaching. Additionally, interns commonly held a belief about what they expected their students to know that didn’t match what their students actually knew.
Another important outcome of the consensus PCK model is the agreement that the appropriate grain size to be measured is at the topic level (Gess-Newsome, 2015). That is, rather than focusing on the discipline (e.g. chemistry, English), PCK should be examined by topic (e.g. acid–base chemistry, magnetism). This aspect of PCK is consistent with the ‘topic specific’ model of PCK (TS-PCK) described by Mavhunga (Mavhunga and Rollnick, 2013). The character of teachers’ TS-PCK is distinct to each discrete topic that they may teach; an individual's knowledge for teaching equilibrium is different from their knowledge for teaching solubility. The TS-PCK model conceptualizes that knowledge is developed as an instructor transforms each discrete topic into a “pedagogically powerful form” as they reflect on practice (Hill, 2008; Mavhunga and Rollnick, 2013; Rollnick and Davidowitz, 2015). TS-PCK is divided into five areas: (1) students’ prior knowledge; (2) curricular saliency; (3) what is difficult to teach; (4) representations and analogies; and (5) conceptual teaching strategies.
Importantly, PCK correlates with student learning outcomes and instruction quality and can be improved through targeted professional development (Hill et al., 2005; Park et al., 2011), which suggests that by characterizing graduate students’ PCK and using this understanding to inform their training, we can improve their instruction and student learning outcomes on particular topics. Only a few studies have specifically examined the PCK of chemistry graduate students. Bond-Robinson found that graduate students were more apt to develop management skills than teaching knowledge even when directed teaching feedback was provided (Bond-Robinson, 2005). This is consistent with observations of teaching interns, who likewise develop classroom management skills before they develop teaching knowledge (Grossman, 1990). In prior studies Hale, and later Connor, found that experienced graduate students had higher quality PCK than inexperienced graduate students, suggesting that it does develop over time in the absence of extensive training (Hale et al., 2016; Connor and Shultz, 2018). However, the difference was relatively modest. Graduate students relied heavily on their own experience as students, which is also consistent with Grossman's observations of teaching interns (Grossman, 1989). Additional research is needed to better understand the unique nature of graduate student instructors’ knowledge for teaching and the extent to which these prior findings are relevant in light of the specific context in which they learn to teach.
Content knowledge (CK) is required for developing knowledge for teaching a particular subject area (Mavhunga and Rollnick, 2013; Alvarado et al., 2015), like solution-state chemistry. Content knowledge for this study includes solubility and concentration as they would be applied in an organic chemistry laboratory course, because in the United States graduate students are frequently assigned to teach in laboratory courses and would be more likely to have had experience teaching concentration and solubility in this context. A meaningful understanding of concentration for laboratory courses (Lowery-Bretz, 2001; Lowery-Bretz et al., 2013) requires that students be able to relate microscale models of molecules in solution to physical observations (Taber, 2001), such as visualizing the dissolution of molecules and mentally translating that to the clear solution that they observe when a substance is completely dissolved (Valanides, 2000).
For the purposes of this study, CK is divided into declarative, procedural, and conceptual knowledge (Hiebert and Lefevre, 1986; Bransford et al., 2000; Anderson and Krathwohl, 2001). Declarative knowledge includes knowledge of discrete and unconnected content; a student with declarative knowledge of this topic may be able to recall the definition of concentration (Anderson and Krathwohl, 2001). Conceptual understanding has previously been defined as a network comprised of individual pieces of information and the connections between them (Hiebert and Lefevre, 1986). Procedural knowledge includes knowledge of the “rules and procedures for solving exercises” (Hiebert and Lefevre, 1986). Said another way, procedural knowledge is an understanding of how to do something, while conceptual knowledge is an understanding of the relationships between ideas and how they function (Anderson and Krathwohl, 2001). Application of these topics in the lab requires both procedural knowledge (Furio-Mas et al., 2007), such as calculating concentration, and conceptual knowledge, such as interpreting the implications of molecular structure for solution-state properties (Cooper et al., 2013). It is important to note that procedural knowledge can be learned by rote, whereas conceptual knowledge cannot (Glaser, 1984). Therefore, distinguishing procedural and conceptual CK may establish differences in the way graduate students develop PCK.
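To illustrate the procedural side, a concentration calculation of the kind graduate students would teach in lab reduces to a two-step computation. A minimal sketch; the solute (NaCl) and the quantities are illustrative assumptions, not items from the study's questionnaire:

```python
def molarity(mass_g, molar_mass_g_per_mol, volume_L):
    """Procedural knowledge in action: concentration (mol/L) from the
    mass of solute, its molar mass, and the final solution volume."""
    moles = mass_g / molar_mass_g_per_mol
    return moles / volume_L

# Hypothetical example: 5.85 g NaCl (58.44 g/mol) dissolved to 0.500 L
print(round(molarity(5.85, 58.44, 0.500), 3))  # about 0.200 mol/L
```

Conceptual knowledge, by contrast, is what lets a student explain why this ratio, rather than the raw amount of solute, governs the behaviour of the solution.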
Conceptual understanding of solubility and concentration has been investigated for both students (Pinarbasi and Canpolat, 2003; de Berg, 2012) and pre-service teachers (Valanides, 2000; Cokadar, 2009; Ozden, 2009). These studies show that both students and teachers hold misconceptions about solution-state chemistry. The consistency between the conceptions held by students and those held by teachers suggests that misconceptions may arise from teachers. For example, Ebenezer investigated high school students’ conceptions of solubility and found that students confuse dissolving with melting and had a tendency to incorrectly extend their understanding of materials at the macroscopic level to the microscopic level (Ebenezer and Erickson, 1996). This study also revealed a discrepancy between the students’ and teachers’ vocabulary in portraying meaning using technical terms, where, for example, students’ use of the word “particle” in explaining a microscopic phenomenon did not convey that they sufficiently understood the meaning intended by the instructor. Likewise, when investigating the conceptions of pre-service teachers, Valanides found that they had difficulty translating macro-scale observations to micro-scale representations of dissolution (Valanides, 2000). Ozden found that pre-service teachers held alternative conceptions about the relationship between solubility and temperature, the meaning of a ‘saturated’ solution, and the influence of dissolved substances on the boiling point of a solution (Ozden, 2009).
Students must also develop knowledge of structure–property relationships, (Cooper et al., 2013) namely they must be able to interpret molecular structure to make predictions about solubility. Cokadar investigated students’ conceptions of solubility with relation to their understanding of polarity and found that students hold misconceptions about polarity that influenced their ability to predict solubility pairs (Cokadar, 2009). Students hold incorrect or partial understandings about what structural features give rise to polarity. Furthermore, they do not take into account molecular symmetry and attribute polarity to single features of a molecule. Cokadar reported that very few students demonstrated a “sound understanding” of solubility, and it was determined that they primarily relied on “like dissolves like” as a guiding rule. This finding is consistent with literature that indicates that students often rely on weak heuristics, which will enable them to correctly answer questions that test computational or procedural knowledge, but may fail when they are used to answer questions that require a deep conceptual understanding (Maeyer and Talanquer, 2013; McClary and Talanquer, 2015). Given the difficulty of solution chemistry, it is essential that we investigate chemistry instructors’ knowledge for teaching it so that we can improve instructional quality and student outcomes.
Finally, it is important to note that the majority of prior studies on this topic are aimed at general chemistry and may not be applicable here. Students’ notions about solubility are formed in general chemistry, where water is the most used “solvent” and solubility is considered in relation to precipitation reactions of metal ions in water (de Berg, 2012). This perspective may be problematic in organic chemistry courses, where organic solvents, rather than water, are considered.
The goal of this study was to understand more specifically how chemistry graduate students develop teaching knowledge at the topic level using a direct measure. The study was guided by the following research questions:
1. What is the extent of chemistry graduate students’ CK and PCK as they relate to teaching concentration and solubility?
2. What is the source of graduate students’ PCK of these same topics and how does PCK develop in this context?
Type of CK | CK question
---|---
Declarative knowledge | SQ1: Select pairs of solutes and solvents that have good solubility.
 | SQ4: Predict which solvent 3-pentanone would be least soluble in.
Procedural knowledge | CQ1: Calculate the concentration of a specific solution.
Conceptual knowledge | SQ3: When running a reaction must the reagent be soluble in the reaction solvent?
Aspect of PCK | PCK question
---|---
Conceptual teaching strategies | CQ2: As if to a student, explain the difference between concentration and quantity.
 | CQ3: As if to a student, explain why it is important to know the concentration of a reagent in a given reaction mixture.
 | SQ2: As if to a student, explain how you would determine if a compound is soluble in a solvent.
Students’ prior knowledge | CQ4: What do you expect students to understand about the concept of concentration when they begin this course?
What is difficult to teach | CQ5: What about the concept of concentration do you think students will find most challenging in this course?
The five PCK questions were developed from literature examples of TS-PCK as well as the authors’ classroom experiences (Hill, 2008; Mavhunga and Rollnick, 2013). PCK can be reliably measured using particular components of the construct (Rowan et al., 2001), so the questions included only the three components that a graduate student would be likely to develop: understanding students’ prior knowledge, what is difficult to teach, and conceptual teaching strategies. The test items were refined through discussion with experts who had organic chemistry teaching experience, and were piloted prior to data collection.
Cognitive interviews were conducted to validate the results and uncover the source of participant knowledge when responding. The interviews included items from the questionnaire and items that identified the graduate students’ attitude towards teaching, prior teaching experience, and academic background. Because participants were not observed, these items might not differentiate between PCK and simple reasoning strategies (Willis et al., 1999). Therefore, “verbal probing techniques” outlined by Willis were incorporated to unearth why the interviewee answered survey items the way they did. The interviewer exercised substantial latitude, in particular when following lines of inquiry regarding the sources of knowledge used when responding to the questions.
Characteristic | Number of participants
---|---
Gender |
Male | 41
Female | 27
Year in program |
1 | 53
2 | 3
3 | 2
4+ | 10
Division |
Organic | 42
Other | 26
Terms experience |
0 | 36
1 | 19
2 | 2
3+ | 14
“The concentration is the number of moles per 1 L of solution. Quantity is just an amount i.e. moles”
This response communicated an accurate definition of concentration, but one that was not transformed in any way for students. In contrast, the following response was rated as transforming and communicated the definition using an everyday concept “people in a room”:
“Quantity (moles) is like counting the how many people are in the room. Concentration is like finding the average number of people standing in a square foot of that same room.”
A subset of responses was scored, and specific rubric levels were developed iteratively through discussion among the authors. A subset of 20% of test items was scored independently by two authors and discussed until greater than 90% agreement was attained. Additional measures of inter-rater reliability were not obtained. A single author then scored the remaining responses. Response analysis and validation were performed using Rasch analysis with Winsteps software (Neumann et al., 2011). Raw scores were transformed to logit units, a log-odds measure that places test-taker ability and test-item difficulty on a common scale. The scale sets the mean item difficulty to zero, so an item of average difficulty has a logit measure of zero (Bond and Fox, 2013). The model also provides a measure of each participant's ability with respect to a single latent variable, in this case CK or PCK of solution and concentration topics. Person reliability, item reliability, and fit statistics were within an acceptable range and are provided in Table 3. The results were then statistically analysed.
Indices | Content knowledge | Pedagogical content knowledge
---|---|---
Person reliability | 0.48 | 0.61
Item reliability | 0.96 | 0.94
Fit statistics (t = −2 and +2) | All scores | All scores
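The double-scoring step described above amounts to computing simple percent agreement between two raters. A minimal sketch; the rubric scores below are hypothetical, not the study's data:

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of double-scored items on which two raters assigned
    the same rubric score (0, 1, or 2)."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical rubric scores for ten double-scored responses
rater_a = [2, 1, 0, 2, 2, 1, 1, 0, 2, 1]
rater_b = [2, 1, 0, 2, 1, 1, 1, 0, 2, 1]
print(percent_agreement(rater_a, rater_b))  # 0.9
```

Percent agreement is a weaker index than chance-corrected measures such as Cohen's kappa, which is consistent with the authors' note that additional inter-rater reliability measures were not obtained.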
Interviews were audio recorded and transcribed verbatim in preparation for a deductive coding process according to the model outlined by Hill (Hill, 2008). Coding was designed to capture the sources of knowledge interviewees used when responding to the questionnaire. Hill's model was modified in three ways: it was adapted to the open-ended question format used here, a topic-specific conceptualization of PCK was used, and sources of knowledge were emphasized. Because the multiple-choice and forced-response items originally used by Hill were avoided here, interviewees’ use of test-taking strategies was not evaluated. The “mathematical reasoning” code was renamed “chemical reasoning” for our purposes and was used to differentiate classroom experience from practical experience. Each author coded a subset of the interview data collected and discrepancies were resolved through discussion. Due to the small sample size, the quantitatively transformed interview data were not statistically analysed.
Rasch analysis was used to evaluate and compare test-item difficulty against participants’ scores, placing both on a common logit scale on which higher logit values correspond to greater item difficulty and greater person ability. A Wright map (Fig. 1) was used to evaluate how likely a participant would be to correctly answer an item (question) of a given difficulty: a participant whose logit score is greater than the location of an item on the map is more likely to give a higher-scoring answer. In Fig. 1, a participant at 2.5 on the logit scale would be likely to answer all four content questions correctly, whereas a participant at −0.5 would be likely to answer only SQ1 correctly.
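This Wright-map reading follows from the Rasch model, in which the probability of success depends only on the difference between person ability and item difficulty in logits. A minimal sketch of the dichotomous form (the study's 0–2 rubric items would use a polytomous extension, which Winsteps fits):

```python
import math

def rasch_p(theta, delta):
    """Dichotomous Rasch model: probability of success for a person of
    ability theta on an item of difficulty delta (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# Ability equal to difficulty gives an even chance of success
print(rasch_p(0.0, 0.0))            # 0.5
# A participant at 2.5 logits facing an item at 1 logit succeeds ~82% of the time
print(round(rasch_p(2.5, 1.0), 2))  # 0.82
```

The monotone dependence on theta − delta is why a participant located above an item on the Wright map is more likely to answer it correctly.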
The Wright map for CK questions (SQ1, SQ3, SQ4, and CQ1) is shown in Fig. 1. The difficulty of the items ranged between −1 and 1, while participant scores ranged between −2 and 4 on the logit scale. The person reliability was 0.48 and the item reliability was 0.96 (Table 3), consistent with measures reported by others for similar constructs. The distribution of participant performance is centred around 1 logit unit, suggesting that most participants had a high likelihood of performing well on this aspect of the questionnaire.
The Wright map for PCK questions (SQ2, CQ2, CQ3, and CQ5) is shown in Fig. 2. The item difficulty ranged between −1 and 1 and the participant scores were between −6 and 2 on the logit scale. The person reliability was 0.61 and the item reliability was 0.94 (Table 3). The distribution of participant performance is much lower than for the CK questions, centred around −0.5, and has a much broader range, indicating that participants were less likely to perform well on the PCK questions and showing greater variability in their knowledge for teaching.
The relationship between mean score and terms of teaching experience was evaluated using a bubble plot (Fig. 3). Both CK and PCK trend upwards as more terms are taught. The mean CK trend begins at 0.08 and rises to a maximum of 1.08 by term 3, then decreases slightly to 1.05 for those who have taught 3+ terms. There is a small, positive, significant correlation between CK and terms taught of 0.338 (Pearson, p < 0.05, two-tailed). The mean PCK trend begins at −1.73 and rises steadily to a maximum of −0.45 for GTAs with 3+ terms of experience. PCK and terms taught are also correlated (0.392, p < 0.05, two-tailed), indicating a gain in PCK with teaching experience. In general, participants scored higher on the CK set compared to the PCK set, which is consistent with the Wright map results.
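Pearson coefficients like those reported here can be computed directly. A minimal pure-Python sketch, equivalent to the statistic returned by scipy.stats.pearsonr (without the p-value); the (terms taught, PCK score) pairs below are hypothetical, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical (terms taught, PCK logit score) pairs for ten participants
terms = [0, 0, 0, 1, 1, 1, 2, 2, 3, 3]
pck = [-1.8, -1.2, -2.0, -1.0, -0.6, -1.4, -0.7, -0.3, -0.5, -0.2]
print(round(pearson_r(terms, pck), 3))  # positive: PCK rises with experience
```

A positive r of the magnitude reported (0.338–0.392) indicates a real but modest association, which matches the paper's characterization of the gains as small.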
Fit statistics were obtained and used to assess the questionnaire's level of productive measurement. For all items, MNSQ < 1.5 and/or |t| ≤ 2, confirming that the items were productive for measurement (Linacre, 2018). These acceptable fit statistics indicate that CK and PCK were reliably measured.
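This screening rule can be stated compactly. A sketch assuming the commonly cited Linacre thresholds, with the stricter reading that both conditions must hold (MNSQ within the 0.5–1.5 productive range and |t| ≤ 2); the item fit values shown are hypothetical:

```python
def item_is_productive(mnsq, t):
    """Rule of thumb for Rasch item fit: mean-square fit within the
    productive 0.5-1.5 range and standardized fit |t| no greater than 2."""
    return 0.5 <= mnsq <= 1.5 and abs(t) <= 2

# Hypothetical (item, MNSQ, t) triples for three items
items = [("SQ1", 0.92, -0.6), ("SQ3", 1.10, 0.8), ("CQ1", 1.48, 1.9)]
flagged = [name for name, mnsq, t in items if not item_is_productive(mnsq, t)]
print(flagged)  # [] -- no items flagged as misfitting
```

Items failing the check would typically be revised or removed before the Rasch measures were interpreted.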
The CK and PCK scores for each participant were plotted to compare novice (0–1 terms taught) and experienced (2 or more terms taught) participants (Fig. 4). Higher scores for both CK and PCK were more consistently observed for experienced participants. The near absence of participants in the upper left quadrant supports the claim that CK is a prerequisite for PCK, and there is a small but significant correlation (0.259, p < 0.05, two-tailed) between CK and PCK. However, the larger population in the lower right quadrant compared to the upper right indicates that high CK does not guarantee high PCK within the context of the study.
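This quadrant reading amounts to classifying each participant by the signs of their CK and PCK logit scores. A minimal sketch; the (CK, PCK) pairs below are hypothetical, not the study's data:

```python
from collections import Counter

def quadrant(ck, pck):
    """Label a participant by the signs of CK (x-axis) and PCK (y-axis)."""
    if ck >= 0 and pck >= 0:
        return "upper right"   # high CK and high PCK
    if ck < 0 and pck >= 0:
        return "upper left"    # high PCK without CK (rare if CK is prerequisite)
    if ck >= 0 and pck < 0:
        return "lower right"   # high CK but low PCK
    return "lower left"        # low on both

# Hypothetical participant (CK, PCK) logit scores
scores = [(1.2, -0.5), (0.8, 0.3), (-0.4, -1.1), (2.0, -1.8), (0.5, -0.2)]
print(Counter(quadrant(ck, pck) for ck, pck in scores))
```

In this toy sample, as in the study, the lower-right cell dominates the upper-right: strong content knowledge alone does not imply strong knowledge for teaching.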
Interviews were performed to investigate how the questions were interpreted by the participants and to elucidate the source of knowledge used in responding. All participants described a clear understanding of the wording and intent of each question and no particular questions were identified as problematic during interviews. The interviews did reveal differences in the reported source of knowledge in answering each PCK question. Participants indicated that their general teaching style as well as their specific knowledge for teaching solution chemistry topics was attributed to their own experiences as students, their experiences as TAs, or through a reasoning process.
Participants reported relying on a variety of strategies when teaching solution concepts, which they drew on in responding to PCK questions. Each interviewee referred back to the widely used “like dissolves like” heuristic as a go-to tactic for helping students understand solubility, which is consistent with PCK question responses given by a large proportion of participants. One participant described his thinking about the heuristic as reasoned from the perspective of a student:
“Why do I draw on that phrase? I guess it's just easy to remember. If I were in (lab) still in the phase where I didn’t understand things and the TA said that to me, that would be something I could hold on to. I could have thought about that. It's just so simple.”
Another more experienced participant sourced his thinking from direct teaching experience and related an understanding of the limitations of the heuristic and the need to use other strategies:
“I’ve always gone back to the “like dissolves like” that they have been taught, and sometimes it helps and sometimes it doesn’t. This goes back to the polarity is a gradient and its ratio of C to H atoms and charges. Sometimes they need to see a physical representation – this is a clear solution or a cloudy solution. To see the theoretical – why is it soluble? Then you go the polar route.”
This response was consistent with the responses of the two other experienced participants, who also conveyed the limitation of the “like dissolves like” heuristic in helping students to understand polarity.
Participants also described other strategies such as connecting solubility to real life experiences, for example mixing oil and vinegar for salad dressing. They also asked students to draw in order to help them visualize structure differences:
“If I was talking to a student I would say “let's draw it out” and really break it down. Lead them to the point where polarity is going to be the most different between 3-pentanone and water [SQ4, Table 2], and that's why it would be the least soluble.”
Four of the seven participants related an understanding that students would have difficulty going beyond a distinction of solubility as a binary property (it is or isn’t soluble) as compared to a range. For example, one said:
“Students think of things as being polar or not polar. As you go along you gain experience and you realize that everything has some degree of polarity. It's a spectrum not a yes or no statement.”
Participants reported relying on their own experiences as students as well as their experience teaching when responding to questions about solution chemistry concepts. They also described experiences from teaching when asked about teaching strategies for solution chemistry concepts, but had a tendency to reason from general teaching experiences as opposed to describing experiences specific to teaching solution concepts. For example, one participant described a general teaching strategy that might be applied to any concept, which suggested a reliance on a more general knowledge for teaching as opposed to specific knowledge for teaching this topic:
“The first thing I thought about was intermolecular interactions. For me, when I try teaching something I try to think of an example. So it's not just a concept out of the air. You have something concrete to link it to.”
Although the strategy of appealing to “intermolecular interactions” is appropriate for teaching solubility, we considered this to be a general teaching strategy because the participant specifically described that their use of a “concrete example” was a strategy that they adapted from experience teaching other topics.
Interviewees with less teaching experience drew almost completely on their own experiences as students when describing why these topics might be easy or difficult. One participant reflected on his experience in the lab to rationalize how he expected students to think about solubility:
“It's a value judgement I guess. I did say for concentration in organic solvents might be tricky because students may be used to thinking about concentration in water. But its something I’ve got to wrap my head around. There are more solvents than just water. Once you get past that it flows pretty logically from there.”
One participant reflected on research experience as a source of knowledge:
“The thing that got me over the hump, in terms of thinking about it, was being in the lab a lot. You begin to get familiar with the different solvents, but that's kind of difficult for students because they don’t have that familiarity and they aren’t going to get it at any point.”
Both examples indicate a specificity of thinking about teaching concentration and solubility that arose from their own experiences in the classroom or lab rather than direct teaching experience.
A modest increase in PCK was observed for participants with more teaching experience, indicating that PCK can be developed through practice. A small positive, significant correlation between both PCK and CK and the number of terms taught was observed (Fig. 3). These trends are consistent with observations made by Lederman (Lederman, 1992) and Davis (Davis and Krajcik, 2005), who reported increases in PCK upon self-assessment and practice of pedagogical techniques. The increase in CK may be related to teaching the content and becoming familiar with these concepts; graduate students with at least three terms of teaching experience did not score below zero on the CK logit scale (Fig. 4).
Results support the assertion that graduate student knowledge for teaching compares more closely with the interns described by Grossman (Grossman, 1989), who also lacked formal pedagogical training. Study participants performed well on CK questions, which is consistent with their relative subject matter expertise and the level of CK question difficulty; they can correctly answer textbook questions that might be posed to their students. Participants performed less well on PCK questions, which indicates that their knowledge for teaching solution-state chemistry is not as well developed. The Wright maps in Fig. 1 and 2 demonstrate a disparity in participant performance on the CK items relative to the PCK items. This difference is also observed in the bubble plot in Fig. 3, where the average score is consistently higher for the CK questions when compared to PCK questions. Finally, the scatterplot in Fig. 4 shows a distribution of participants with a strong bias toward positive CK scores as compared to a more negative bias in PCK scores. This result is consistent with the participants’ reported background expertise in CK gained through undergraduate studies and minimal prior teaching experience or training.
A difficulty hierarchy (Linacre, 2004) (Fig. 2) was observed in the PCK questions where participants performed best on CQ2 (conceptual teaching strategy), which asked them to describe the difference between concentration and quantity. They performed the least well on CQ4/5 (knowledge of students) which asked participants to predict what students would struggle with and what expectations of prior knowledge they have for students in their class. This finding agrees with a prior investigation of graduate student PCK by Hale (Hale et al., 2016), in that both studies found that graduate students did the best on questions that relate to conceptual teaching strategies and poorest on questions that related to knowledge of their students.
The experiences that contributed to graduate students’ knowledge for teaching solution chemistry were examined using semi-structured interviews. In responding to PCK questions, less experienced participants drew primarily on a process of reasoning based on their own experience as students. To a lesser extent they drew on teaching experience, though this was largely adapted from general teaching experience rather than direct experience teaching solubility and concentration. The tendency to rely on experiences as students is also consistent with Grossman, who found that novice instructors with no pedagogical training emulate the instruction they themselves received (Grossman, 1989).
All participants reported relying on heuristics such as “like dissolves like”, which were sourced from their own experience as students. However, the more experienced participants each described the limitations of heuristics and explained why they are limited. Similarly, Cokadar reported that prospective high school chemistry teachers relied on “like dissolves like” as a guiding rule (Cokadar, 2009). The reliance on heuristics by novice instructors suggests that views on heuristics may be a marker of PCK development, and that those with higher PCK will know when such a heuristic is appropriate for students to use and when to use other strategies. However, additional study is needed to establish whether experienced graduate students would be more likely to recognize the limitations of this heuristic.
The results of this study indicate that graduate student instructional training can be improved to better expedite the development of this knowledge. One possible adaptation arising directly from these findings relates to the differing views on heuristics held by novice and experienced instructors: training could include discussion with graduate student instructors of the benefits and limitations of heuristics, in order to prepare them to use heuristics such as “like dissolves like” more effectively.
Score | Definition | Exemplar |
---|---|---|
(0) Incorrect | Provides incorrect explanation. | N/A |
(1) Limited | Defines both concentration and quantity correctly, but does not provide a standard or elaborated explanation. | “Concentration is moles per liter of solvent. Moles is just the amount of solid material you have.” |
(2) Basic | Identifies the difference between concentration and quantity and provides standard “textbook like” explanation. | “Moles refers to the number of molecules in terms of Avogadro's number, and molarity provides us with concentration, refers to the number of said moles that would be present in 1 liter of solution.” |
(3) Developing | Identifies the difference between concentration and quantity, provides textbook like explanation, which they expand on or rephrase. | “Concentration is the amount of solute per unit of solvent, while quantity is the total amount of solute you are using. If you make up a big batch of solution and begin using it up in an experiment, the total amount of solute decreases as you use it up, but the concentration remains the same.” |
(4) Transforming | Identifies the difference between concentration and quantity, provides textbook explanation, which they expand or rephrase. Demonstrates a transformation of the knowledge indicative of PCK. | “Moles is the term used to describe a collection of particles, so as an example if you have a cookie with 5 chocolate chips you will say that there are 5 chocolate chips. If you want to look at the concentration, you must compare how many moles, or in this case chocolate chips, are available per a unit of space/volume. In this case, you could say there are 5 chocolate chips per cookie. It is also very important to make sure the student can describe that moles is a book keeping number to organize a large quantity.” |
Score | Definition | Exemplar |
---|---|---|
(0) Incorrect | Provides incorrect explanation. | N/A |
(1) Limited | Explains why it is important to know concentration of a reagent in a given reaction mixture. | “Concentration can be used to determine rate constants.” OR “Concentration has large effects on reactivity and rate.” |
(2) Basic | Explains why it is important to know concentration of a reagent and provides standard “textbook like” explanation. | “It is important because the amount of reagent you actually added needs to be calculated using the concentration. If you are adding a very dilute solution, you have to add more of it to add the full amount needed.” |
(3) Developing | Explains why it is important to know concentration of a reagent, provides standard textbook like explanation, and explains or rephrases explanation. | “Molecules must encounter each other to react, higher concentration means more encounters, and therefore faster reaction.” |
(4) Transforming | Explains why it is important to know concentration of a reagent, provides standard textbook like explanation, and explains or rephrases explanation. Demonstrates a transformation of the knowledge indicative of PCK. | N/A |
Score | Definition | Exemplar |
---|---|---|
(0) Incorrect | Provides incorrect explanation or insufficient response to score. | No expectation. |
(1) Limited | Identifies an area that students should know, but does not explain or expand. | “How to calculate C = n/V OR Everything. It is a simple topic that everyone learns in high school chemistry.” |
(2) Basic | Identifies an area that students should know and includes a standardized explanation. | “They should know everything about it, because they should not have been allowed to skip/pass Gen Chem without knowing it or pass Orgo I without knowing it. However, in my experience they know next to nothing about it and need to be taught it in lab” |
(3) Developing | Identifies an area that students should know and provides an expanded or rephrased explanation. | “I think a student will know that some things are very concentrated while others are not. Many may know the concepts of molarity and molality and maybe ppm. I think their conceptual ability to use it to problem solve may still be very basic.” |
(4) Transforming | Identifies an area that students should know and demonstrates a transformed understanding (curricular saliency). | “From past experience, a lot of students had trouble with dimensional analysis and calculating concentration and moles. I think they understand it based on the units of moles per volume and not conceptually as number of molecules distributed in a certain volume.” |
Score | Definition | Exemplar |
---|---|---|
(0) Incorrect | Provides incorrect explanation. | N/A |
(1) Limited | Identifies an area of student difficulty. | “How to calculate it.” OR “How to distinguish it from total amount.” |
(2) Basic | Identifies an area of student difficulty and provides a standardized, textbook-like explanation. | “How to use it to calculate how much of the liquid that you need to add to have the desired reaction.” |
(3) Developing | Identifies an area of student difficulty and provides expanded or rephrased explanation. | “Concentration, as a quantitative measure, is not an everyday tool we use. Students will need to connect the concentration of a substance to a problem that they have, and recognize that it can affect the quality of their results just as much as any other aspect of how they set up their experiments.” |
(4) Transforming | Identifies an area of student difficulty, provides an expanded or rephrased explanation, and demonstrates a transformed understanding of student difficulty (knowledge of students). | N/A; no participants demonstrated specific knowledge of students. |
Score | Definition | Exemplar |
---|---|---|
(0) Incorrect | Insufficient answer to score or incorrect. | N/A |
(1) Limited | Simplistic answer with limited information. | “Take a small amount of a compound and add 1 mL of solvent to it.” |
(2) Basic | Provides straightforward instruction or observation sufficient for student to get started. | “Add some solvent to the compound. If there is no visible solid then the compound is soluble.” |
(3) Developing | Detailed instruction, which student could follow and enact. | “Take a small vial and add ∼3 mL of solvent. Add a very small amount, the tip of a micro spatula, of the solute and pour it in. Swirl/wait for a minute or two and see if the solute has dissolved to leave a clear solution with no chunks.” |
(4) Transforming | Provides detailed instruction, which student could follow and enact. In addition, the instruction provides a strong conceptual teaching strategy. | N/A; no participants included a conceptual teaching strategy. |
This journal is © The Royal Society of Chemistry 2019