Adam G. L. Schafer and Ellen J. Yezierski*
Miami University, Department of Chemistry and Biochemistry, Oxford, OH, USA. E-mail: yeziere@miamioh.edu
First published on 4th November 2020
Designing high school chemistry assessments is a complex and difficult task. Although prior studies about assessment have offered teachers guidelines and standards to support generating quality assessment items, little is known about how teachers engage with these supports or enact their own beliefs while developing assessments. Presented in this paper are the results from analyzing discourse among five high school chemistry teachers during an assessment item generation activity, including the assessment items produced throughout the activity. Results include a detailed description of the role of the knowledge bases embedded within high school chemistry teachers’ pedagogical content knowledge and the processes used to enact these knowledge bases during planned formative assessment design. Implications for chemistry teacher professional development are posited in light of the findings, as are potential future investigations of how high school chemistry teachers generate assessment items.
Formative assessment practices have been the focus of several domain-general investigations (Tomanek et al., 2008). Many formative assessment practices can be applied without domain specificity. For example, all teachers can provide their students with opportunities to demonstrate knowledge and skills that can be used to inform continued learning. However, the specific formative assessment practices employed by teachers are highly dependent upon expertise and cultures embedded within the discipline (Coffey et al., 2011). For instance, chemistry teachers need a wealth of knowledge about chemical species and how those species interact to effectively design a planned formative assessment about the solubility of ionic salts, one that elicits student responses informative for guiding continued instruction.
In the absence of chemistry-specific support, chemistry teachers are left translating the available domain-general guidelines into practices suitable for their own classrooms (Black and Wiliam, 1998). As the Next Generation Science Standards (NGSS) become more common and widespread, teachers are being provided more resources for designing formative assessments that are 3-dimensional (encompassing the NGSS disciplinary core ideas, cross-cutting concepts, and science and engineering practices) (NGSS Lead States, 2013; Harris et al., 2016; Underwood et al., 2018). However, effectively enacting the reformed formative assessment practices of the NGSS can be difficult and may present teachers with additional challenges to overcome. A recent report from the National Academies of Sciences, Engineering, and Medicine emphasized the importance of identifying core practices for teachers to develop to meet the demands of the NGSS (National Academies of Sciences, Engineering, and Medicine, 2015). In examining assessment's critical role in informing instruction, the report specifically communicated that “teachers need to master a range of formative and summative assessment strategies” (p. 103).
The effective use of formative assessment is linked to improved student engagement and performance (Ruiz-Primo and Furtak, 2007; Furtak et al., 2016). However, teachers may find it difficult to translate domain-general support into implemented formative assessments, and newly implemented practices may result in student outcomes that are not measurable (Buck and Trauth-Nare, 2009; Gómez and Jakobsson, 2014; Harshman and Yezierski, 2017). Additionally, very few investigations study how teachers interact with specific content during assessment design (Tomanek et al., 2008; Coffey et al., 2011). Research about assessment practices often focuses on general knowledge and beliefs about assessment (Remesal, 2011; Opre, 2015; Yan and Cheng, 2015), common assessment tools (Black and Wiliam, 1998; Suskie, 2009), and grading practices (Henderson et al., 2004; Dubey and Geanakoplos, 2010; Toledo and Dubas, 2017), but not on teachers’ assessment design processes, especially those of chemistry teachers (Coffey et al., 2011).
One study by Tomanek and colleagues investigated teachers’ considerations when selecting and evaluating formative assessment tasks (Tomanek et al., 2008). In that study, several prospective and practicing teachers selected assessment tasks or evaluated the potential of a task to assess student understanding. The findings show that teachers exhibit general tendencies when evaluating and selecting assessment items, often influenced by two broad categories of concerns: (1) characteristics of the task and (2) characteristics of the students or the curriculum. These findings align with other calls for further investigation into teachers’ processes of developing assessments (Coffey et al., 2011; Park and Chen, 2012). The process of designing assessment items is a personal experience (Yan and Cheng, 2015), and a study capturing teachers’ considerations during this process may expand understanding of how teacher beliefs about assessment are translated into practice.
Table 1 Knowledge bases embedded within PCK

| Knowledge base (knowledge of…) | Description (knowledge of…) |
|---|---|
| Content | The academic ideas and concepts that are pertinent to a discipline |
| Curriculum | The structure, scope, sequence, and goals of a curriculum |
| Students | Students’ general characteristics, cognitive development, and variations in approaches to learning |
| Pedagogy | Skills and strategies related to learning theories, instructional principles, and classroom management |
| Assessment | The design and interpretation of formative and summative assessments as well as how to take action from assessment data |
The knowledge bases in Table 1 describe the collective knowledge available to inform the processes of teaching a particular topic in a particular way for a particular purpose (Hume et al., 2019). Early conceptualizations positioned PCK as a separate knowledge domain alongside the knowledge bases (Shulman, 1987). More recent views recognize that PCK is not a “freestanding type of knowledge”; rather, one's PCK continuously influences, and is influenced by, the embedded knowledge bases (e.g., Magnusson et al., 1999; Abell, 2008; Park and Chen, 2012). As such, evidence of one knowledge base does not “equate to PCK.” However, we posit that our understanding of the complex nature of PCK can be strengthened by better understanding how the integrated components (i.e., the knowledge bases) are enacted to transform knowledge into opportunities for learning.
Each of the knowledge bases in Table 1 can be described as existing within the collective community at large, as held within the mind of the individual, or as enacted by the individual (Park and Oliver, 2008; Hume et al., 2019). A teacher may have access to knowledge from a particular knowledge base (collective-PCK) but may not hold that knowledge personally (personal-PCK) or enact it in classroom practices (enacted-PCK) (Park and Oliver, 2008; Hume et al., 2019). The five knowledge bases do not exist in isolation and are highly interconnected (Park and Chen, 2012; Hume et al., 2019). For example, a chemistry teacher would likely consider ideas within “knowledge of content” when designing and implementing an assessment (which would require the application of “knowledge of assessment”). Although enacted-PCK typically describes in-class activities only, the knowledge a teacher applies to activities outside of class is likely not drastically different; for example, a teacher still enacts some knowledge about assessments when designing formative assessments or interpreting assessment results. As such, providing teachers an environment that encourages them to communicate ideas during the generation of planned formative assessment items could reveal teacher practices for translating their personal-PCK into enacted chemistry assessment knowledge.
1. What is the role of high school chemistry teachers’ pedagogical content knowledge when generating planned formative assessment items for a solubility lab?
2. What processes do high school chemistry teachers undergo when enacting their pedagogical content knowledge while designing planned formative assessment items for a solubility lab?
Excerpts of teacher dialogue were coded according to the considerations expressed during item development, which were then classified within PCK categories. For this study, a “consideration” is an idea communicated by the teachers during assessment item generation. Enacted-PCK describes a teacher's personal-PCK applied to a teaching and learning situation; this could arguably occur in the classroom environment or outside it, in situations such as designing an assessment or interpreting assessment results. By observing chemistry teachers during assessment item generation for a single lab, the study described herein focuses on enacted-PCK (i.e., the PCK that teachers actually employ) (Park and Oliver, 2008; Hume et al., 2019). Furthermore, since the teachers were observed generating assessment items for a solubility lab, the study focuses on enacted-PCK specific to the chemistry topic of solubility and any related content (Magnusson et al., 1999; Abell, 2008), as characterized through the knowledge bases in Table 1.
To establish the trustworthiness of the coding, frequent interrater debriefings were held between two members of the research team. Two raters separately coded roughly 15% of the teacher considerations expressed during assessment item generation as a means of establishing interrater agreement; comparison of their code applications resulted in 98% agreement. Additionally, code applications were presented to graduate students and chemistry education researchers who were not affiliated with this investigation. The exceptionally high interrater agreement was likely a result of frequent debriefings during the code generation process. Any disagreement in code applications was discussed by the raters, followed by minor modification of code descriptions when necessary. This iterative process of consensus building continued, with reapplication of codes to the data set and interrater debriefing, until complete agreement was reached.
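The agreement statistic reported above is simple percent agreement, the fraction of jointly coded considerations that received the same code from both raters. A minimal illustration follows; the counts of 49 and 50 are hypothetical, not figures reported in the study:

$$\text{percent agreement} = \frac{\text{considerations coded identically}}{\text{considerations coded by both raters}} \times 100\%, \qquad \text{e.g., } \frac{49}{50} \times 100\% = 98\%.$$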
The coded statements were used to characterize high school chemistry teachers’ processes for generating planned formative chemistry assessment items. By matching the audio and video files, the research team linked each teacher idea to the item being generated at the time the idea was communicated.
1. The student will be able to correctly predict what happens when a soluble salt dissolves in water.
2. The student will be able to explain how an ionic compound dissolves in water, noting the role of water molecules in this process.
3. The student will be able to explain what happens to the charge of the ions during solvation and be able to explain why.
The EOs are stated in both the student and teacher versions, available on the TIMU website (Target Inquiry at Miami University). Each group of teachers generated one assessment item per EO, as shown in Table 2.
Table 2 Assessment items generated by each group for each educational objective (EO)

| EO | Group 1 assessment items | Group 2 assessment items |
|---|---|---|
| 1 | Which choice below correctly expresses what happens to the sodium particles in NaCl when sodium chloride dissolves in H2O? | Ca(NO3)2 (s) dissolves in water. |
| | a. Na | a. What will you observe? |
| | b. Na⁺ | b. Write the equation that describes this process. |
| | c. NaH2O | c. Draw a particulate model of the Ca(NO3)2 (aq) after it is all dissolved. |
| | d. None of the above | |
| 2 | If the polarity of the particles in H2O is switched so that the “O” end is now partially positive, and the “H” end is now partially negative, what part of the NaCl would be attracted to the “H” end of the water molecule? | Draw a calcium ion in water include 4 water molecules in your drawing. Explain why they are arranged in the way you provide. |
| | a. “H” end of water would surround the Na ions of the salt crystal | |
| | b. “H” end of water would surround the Cl⁻ ions of the salt crystal | |
| | c. “H” end to “O” end of water | |
| | d. None of previous | |
| 3 | What is the difference in charge of the Cl particle before and after NaCl dissolves in water? | Na2SO4 + H2O → |
| | a. Before its negative, after its neutral | a. Na2²⁺(aq) + SO4²⁻(aq) |
| | b. Before its negative, after its negative | b. 2Na(aq) + SO4(aq) |
| | c. Before its neutral, after its neutral | c. 2Na⁺(aq) + SO4²⁻(aq) |
| | d. Before its positive, after its negative | d. 2Na(aq) + S(aq) + 2O2(g) |
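For reference, the conventional symbolic (dissociation) equations for the three salts featured in Table 2 are given below; these are standard chemistry added here for the reader's orientation, not material drawn from the teachers’ items:

$$\begin{aligned} \mathrm{NaCl(s)} &\rightarrow \mathrm{Na^{+}(aq)} + \mathrm{Cl^{-}(aq)}\\ \mathrm{Ca(NO_3)_2(s)} &\rightarrow \mathrm{Ca^{2+}(aq)} + 2\,\mathrm{NO_3^{-}(aq)}\\ \mathrm{Na_2SO_4(s)} &\rightarrow 2\,\mathrm{Na^{+}(aq)} + \mathrm{SO_4^{2-}(aq)} \end{aligned}$$

Presumably, option b for the EO 1 item and option c for the EO 3 item were the keyed responses, as these match the representations above.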
Table 3 Ideas communicated within each knowledge base

| Idea communicated | Knowledge base (knowledge of…) |
|---|---|
| Student prior knowledge | Students |
| Student response | Students |
| Representational level | Content |
| Dissolving | Content |
| Substance | Content |
| Electrostatic interaction | Content |
| Item format | Assessment |
| Learning activity (Lab) | Curriculum |
| Educational objective | Curriculum |
| State/national standards | Curriculum |
Following Table 3 is a brief description of the ideas communicated within each knowledge base.
Considerations coded as student response included teachers’ expectations about student answers to a question (Anne: A lot of kids have a tendency to pick none of the above.) and the elements of the student response that the teachers were attempting to elicit (Celine: What I'm getting at is would the kid be able to tell me that the oxygen end would be attracted to the chlorine now? Because the oxygen is positive now.)
These findings align with the literature about the “knowledge of students” knowledge base. For example, Magnusson et al. state that teachers’ “knowledge of students” includes considerations about the knowledge required for learning (or in this case assessing) and knowledge of areas of student difficulty (Magnusson et al., 1999). The teachers’ considerations about student responses included ways to construct the assessment item that either addressed common student difficulties to avoid confusion (Anne: Would you give them the original water and its polarity? Just so that's not what they miss.) or employed them to generate purposeful item distractors (Anne: Yeah, since [the students] would all choose ‘none of the previous’ because where's the chloride?)
In addition to the topic-specific content, teachers evoked chemistry-general ideas within the “knowledge of content” knowledge base by considering how to represent the content in the assessment items being generated. For example, teachers in this investigation were familiar with Johnstone's representational levels (Johnstone, 1991), and communicated how the representational level was important in defining how the content was addressed in the assessment item (Claude: So, correctly predict symbolically and particulately what happens. Don’t we have to include macroscopically as well?) In this regard, the “knowledge of content” knowledge base served to inform not only what content was addressed in the assessment items, but also how that content would be perceived by the student taking the assessment.
Teachers attempted to ensure the assessment item was similar to the lab with regard to the student response, such as having the student perform tasks similar to those in the lab. Additionally, the teachers sought similarity in the chemical species used in the assessment item by incorporating substances as common and as complex, within the context of their classrooms, as those used in the lab. For example, part of the lab involved dissolving copper(II) chloride in water; however, the teachers decided not to use copper(II) chloride in their assessment items (Claude: I wouldn't use chloride because chloride is what's in [the lab].) Instead, the teachers opted to incorporate similar salts into their assessment items (Celine: [The lab] never uses sodium chloride. So, let's use sodium chloride as an assessment question.) Essentially, the teachers in this investigation employed their “knowledge of curriculum” to connect their “knowledge of content” to their “knowledge of assessment.”
Similar to their considerations of the learning activity, teacher ideas communicated about the EOs bridged the gap between their “knowledge of content” and “knowledge of assessment.” Teachers consistently referred to the EOs to establish what content to address in the assessment item (Celine: What [the students] are doing is applying the opposites attract, and that water is pulling at parts [of the salt] which is the objective.), how that content should be represented (Claude: The expression ‘what happens’ is essentially covering all three [representational] levels.), and what to ask the student to do when responding to the item (Celine: …can they still apply the idea?)
Analyzing teacher discourse while identifying quotations that reveal teachers’ “knowledge of curriculum” demonstrated the interconnectedness of the PCK knowledge bases. Literature sources argue that the knowledge bases serve as a heuristic device, allowing for the representation of knowledge in the mind of the teacher, and that teachers likely activate multiple knowledge bases simultaneously (Shulman, 1986; Tamir, 1991; Magnusson et al., 1999; Park and Oliver, 2008; Abell and Siegel, 2011; Hume et al., 2019). In this investigation, the teachers wove together PCK knowledge bases during assessment item generation to inform the processes of checks and balances necessary for connecting the instructional materials (i.e., the lab and EOs) with the assessment items. These “processes” represent the transformation of personal PCK into learning opportunities as characterized by the knowledge bases and are explored further in the next section through the lens of discussion vignettes.
Analysis of teacher discourse revealed three processes that the teachers underwent during item generation, in which they established:
1. an appropriate task for the student to accomplish,
2. the content to assess versus the content to provide to the student, and
3. an appropriate representation of the content in the stem and in the student response.
The vignettes in Tables 4–7 show brief exchanges between the teachers during the assessment item generation activity. For reference, each vignette includes the assessment item generated from the discussion. The vignette in Table 4 represents the teachers’ process for establishing an appropriate task during item generation. The “task” of an assessment item is defined as the work to be completed by the learner to demonstrate competency in what was to be learned (McDonald, 1964; Hoffman and Medsker, 1983; Jonassen et al., 1999; Merrill, 2007).
Table 4 Vignette: establishing an appropriate task for the student to accomplish

| Statements | Assessment item |
|---|---|
| Ashton: Now for this one I have an idea. What if we have three drawings showing water around an ion; one of them is correct. | Draw a calcium ion in water include 4 water molecules in your drawing. Explain why they are arranged in the way you provide. |
| Claude: I love it. | |
| Ashton: You know the somewhat negative portion of the oxygen with the positive ion, and then one of them where they are just totally wrong. So, how about this one drawing something. I don't know I'm just thinking. | |
| Emmerson: So, does that explain though? | |
| Claude: It's a good question. And specifically, what is it about water to dissolve in the first place? Right. I mean a multiple-choice question where you have the three different possible orientations you may look into it and you may just know factually based on seeing but that's not necessarily explaining it… How about how about this. Instead of a multiple-choice, draw calcium ion and four water molecules and explain why you arranged them the way that you did. | |
Table 5 Vignette: establishing what content to assess versus what content to provide to the student

| Statements | Assessment item |
|---|---|
| Celine: So, if the water's polarity was reversed, the hydrogen end was negative and the oxygen end was positive, which choice would describe water's interaction with salt? So, let me tell you what I'm trying to get at. | If the polarity of the particles in H2O is switched so that the “O” end is now partially positive, and the “H” end is now partially negative, what part of the NaCl would be attracted to the “H” end of the water molecule? |
| Anne: Are you getting at solvation? | a. “H” end of water would surround the Na ions of the salt crystal |
| Celine: Yeah. Because it's like we give the kid a situation in which we say water is this now. Okay? | b. “H” end of water would surround the Cl⁻ ions of the salt crystal |
| Anne: Okay | c. “H” end to “O” end of water |
| Celine: If water's polarity was this way first. Explain which choice would correctly explain how water would interact with sodium chloride. What I'm getting at is would the kid be able to tell me that the oxygen end would be attracted to the chlorine now because the oxygen is positive now. | d. None of previous |
| Anne: So, you're getting at whether they know that chlorine is going to be negative. | |
| Celine: Right. So, the idea is noting the role of water molecules in this process. They know that this is switched, but can they still apply the idea that the positive end of water. | |
| Anne: So, would you give them the original water and its polarity? Just so that's not what they miss. Is not remembering [the polarity of water]. | |
| Celine: No. I wouldn't. If they don't remember [water's polarity] it doesn't matter because I'm telling them imagine if water's polarity was reversed and the oxygen is now positive and the hydrogen end is now negative. | |
Table 6 Vignette: establishing how information should be represented in the assessment item

| Statements | Assessment item |
|---|---|
| Ashton: Student will be able to correctly predict what happens when a salt dissolves in water. What if we just say calcium chloride plus water with the model and a symbol? Yeah, I dunno. I'm just thinking something simple. | Ca(NO3)2 (s) dissolves in water. |
| Claude: I mean they do both in [the lab]. So correctly predict we would say symbolically and particulately but ‘what happens.’ Don't we have to include macroscopically as well? The expression ‘what happens’ that's essentially covering all three levels. | a. What will you observe? |
| Ashton: How about this, calcium chloride, which is commonly used in the wintertime. | b. Write the equation that describes this process. |
| Claude: I wouldn't use chloride because chloride is what's in [the lab]. | c. Draw a particulate model of the Ca(NO3)2 (aq) after it is all dissolved. |
| Ashton: Alright. How about calcium bromide? | |
| Emmerson: The question doesn't have to cover. It's one question for that standard. | |
| Claude: It could be three parts, right? | |
| Emmerson: I suppose. | |
| Claude: So, the question would simply be. Calcium nitrate dissolves in water. What would you observe with your eyes? Express it symbolically. Express the process symbolically. | |
Table 7 Vignette: establishing how information should be represented in the item and its distractors

| Statements | Assessment item |
|---|---|
| Celine: Let's have them dissolve. So, he used copper chloride so we don't want to use copper chloride. Let's just use sodium chloride. I mean honestly the kids are familiar with sodium chloride. | Which choice below correctly expresses what happens to the sodium particles in NaCl when sodium chloride dissolves in H2O? |
| Anne: Right. | a. Na |
| Celine: Right. He never uses sodium chloride, so let's use sodium chloride as an assessment question. When we do multiple choice test question, which choice below correctly expresses what happens to sodium chloride when it dissolves in water. Now, we can do symbolic, we can do particulate, we can do macroscopic. I say we give a macroscopic description. No, a symbolic because he's really stressing the symbolic in [the lab]. Everything he's done is supposedly symbolic, macro. | b. Na⁺ |
| Anne: Right. | c. NaH2O |
| Celine: So, we could give them a choice of a. just the symbol Na. b. Na positive c. NaH2O because it bonds with water. Right? | d. None of the above |
| Anne: Yeah, those are all logical choices. You don't always have to have four choices. | |
| Celine: That's true. | |
In the Table 4 vignette, Ashton initially proposed having students perform a multiple-choice task to assess their knowledge of how an ionic compound dissolves in water. Afterwards, the teachers collectively reasoned about the proposed task by considering whether a multiple-choice task effectively assesses the students’ ability to “explain,” as stated in the EO. Throughout this exchange, the teachers communicated ideas from several PCK knowledge bases. For example, the “knowledge of content” knowledge base was communicated as teachers reasoned about the chemical phenomena in the item (Ashton: You know the somewhat negative portion of the oxygen with the positive ion…), and the “knowledge of assessment” knowledge base when considering the elicitation of students’ ability to “explain” (Emmerson: So, does that explain though?). While developing this item, the teachers kept the originally proposed chemical phenomenon but wove together ideas from multiple knowledge bases to generate a task for the student that (they believed) aligned with the requirements stated in the EO. These findings agree with other investigations that have similarly identified the importance teachers place on establishing an appropriate task for the student to perform when evaluating competency (Tomanek et al., 2008). Essentially, when judging competency, a chemistry teacher must establish whether “what the student does” appropriately evaluates student competency within a particular chemistry topic. The process of establishing a task represents the translation of personal-PCK into enacted-PCK for assessment item generation (Hume et al., 2019).
The vignette in Table 5 illustrates the teachers’ process for establishing what content to assess versus what content to provide to the student.
In the Table 5 vignette, Celine proposed a hypothetical “switch” of water's polarity in order to present the student with a novel situation involving a familiar chemical phenomenon. Following Celine's initial proposal, the teachers collaboratively considered whether the content was appropriately matched to the EO, as well as what supporting content the student would need to answer the question. Establishing the content to assess is another translation of the teacher's personal-PCK into enacted-PCK that is essential for designing chemistry assessment items. During this process, teachers considered multiple PCK knowledge bases in addition to the “knowledge of content” knowledge base. For example, teachers communicated ideas from the “knowledge of students” knowledge base by discussing student prior knowledge (Celine: If they don't remember [water's polarity] it doesn't matter…) and the student response (Celine: What I'm getting at is would the kid be able to tell me that the oxygen end would be attracted to the chlorine now because the oxygen is positive now). Establishing the content to assess is a necessary process in developing assessment items, as evidenced by the consistent evaluation of content in methods designed to evaluate assessment quality (Herman et al., 2005; Martone and Sireci, 2009; Polikoff and Porter, 2014). Arguably, the teacher should always consider what to assess (Sandlin et al., 2015); however, these findings reveal that establishing what to assess versus what information to provide to the student is a process that exists independently of establishing the task to elicit student knowledge and establishing how information is represented.
The vignettes in Tables 6 and 7 illustrate the teachers’ process for establishing how information should be represented in the assessment item.
Establishing how information should be represented was an important process for teachers while generating assessment items. In the Table 6 vignette, the teachers grappled with language in the EO that set the requirements for how to represent the information (Claude: The expression ‘what happens’ that's essentially covering all three [representational] levels.) The teachers considered ideas from the “knowledge of content” (Ashton: What if we just say calcium chloride plus water with the model and a symbol?) and “knowledge of assessment” (Claude: It could be three parts, right?) knowledge bases while establishing how information should be represented. In doing so, the teachers often referred to Johnstone's representational levels as a framework to guide their design (Johnstone, 1991).
Similar to Table 6, the vignette in Table 7 shows Anne and Celine grappling with how information should be represented. In their discourse, Anne and Celine considered ideas from the “knowledge of curriculum” knowledge base as they evaluated the way information was represented in the lab as well as the “knowledge of assessment” knowledge base by considering how information in the item distractors should be represented.
In both vignettes, the teachers’ considerations about how to appropriately represent information impacted the processes of establishing an appropriate task and establishing what content to assess, although the process for establishing how to represent information was discussed separately from the other two processes. Multiple studies state the importance of representational level as a means of perceiving chemical information (Johnstone, 1991; Taber, 2013). In this investigation, representational level was communicated as part of teachers’ considerations for how information should be represented in the assessment item, as well as how information should be represented in the students’ response to appropriately demonstrate competency with the task and chemistry content.
Research question 1: What is the role of high school chemistry teachers’ pedagogical content knowledge when generating planned formative assessment items for a solubility lab?
Although PCK was not directly investigated, the role of PCK was characterized through the lens of the embedded knowledge bases. Teachers did not communicate ideas directly related to the “knowledge of pedagogy” knowledge base, likely because the teachers were generating assessment items for a specific chemistry topic without discussing teaching strategies for that topic. The “knowledge of students” knowledge base served to identify common student difficulties related to the topic being assessed; teachers either carefully crafted the item to avoid these difficulties or employed them to assess common pitfalls in student knowledge. The “knowledge of content” knowledge base was communicated by the teachers when considering not only the content of the item being generated but also how the information in the item would be perceived by the student responding to it. The results illustrate the need for teachers to consider both the content to be assessed in the item and how to construct the item to elicit student knowledge about that content. The methods for constructing the item to elicit student knowledge about certain content were informed by the “knowledge of assessment” knowledge base, which was commonly communicated when teachers considered how to ensure the student response entailed an appropriate task. Consistent with the refined consensus model of PCK, which depicts the knowledge bases as interconnected (Hume et al., 2019), results showed that the “knowledge of curriculum” knowledge base was often activated in conjunction with other knowledge bases. For example, the “knowledge of curriculum” knowledge base was communicated by teachers as reasoning for how the construction of the assessment item should be connected to the content and the task being assessed in the item. Future investigations could further explore the interrelatedness of the PCK knowledge bases to better understand how each is employed during various teacher tasks.
Research question 2: What processes do high school chemistry teachers undergo when enacting their pedagogical content knowledge while designing planned formative assessment items for a solubility lab?
Throughout the professional development activity, the PCK knowledge bases informed several “assessment item processes” that teachers underwent during assessment item generation. These processes were recognized as establishing an appropriate task to elicit student knowledge, establishing appropriate content to assess versus content to provide to the student, and establishing how information should be represented in the item's stem and in the student response. These processes represent the chemistry teacher's enactment of their personal PCK; essentially, they are what the teacher does to take the knowledge they have and apply it to generate a product (i.e., the assessment item). Each of the three processes is identifiable separately throughout teacher discourse; however, the refined consensus model of PCK indicates that these processes, like the PCK knowledge bases, are likely interrelated (Hume et al., 2019). Each of the three processes identified was determined to be essential for teachers to undergo while designing chemistry-specific assessment items.
Another limitation is the likelihood that not all ideas were openly communicated. Although these teachers have willingly shared ideas in the past, they likely did not share all of their thoughts and ideas, leading to a possibly incomplete characterization of their enacted PCK. That said, it is important to recognize that the teachers were given ample opportunity to generate assessment items: the teachers ended the item generation period of their own volition, and they perceived the items developed to be of the best possible quality before critique from peers. We would like to recognize our support of these individuals (and all teachers) and hope these findings lead to collective growth.
A further limitation of this work is that the analysis focused on the ideas used to generate assessment items for a specific chemistry topic and not the quality of the assessment items generated or the content accuracy of teacher ideas. As such, the final items presented in this study are not necessarily examples of high-quality items. Further investigation is required to understand how the quality of any individual knowledge base (such as a teacher's content knowledge) influences teacher enactment of PCK.
Similar investigations have found that as teachers’ PCK strengthens, so too does their ability to foster student understanding (Pajares, 1992; Marzabal et al., 2018). A similar trend may be observable for the relationship between PCK and a teacher's ability to generate high-quality assessment items. High school chemistry teachers would likely benefit from reflecting on their assessment item development processes; again, teachers may scaffold these reflections through the lens of the individual knowledge bases. Taking the time to consider how knowledge is enacted within a particular chemistry context could both bolster productive skills and highlight gaps in practices that influence assessment design. As such, professional development designers should provide chemistry teachers with opportunities for sustained professional development that explicitly connects the process of assessment item design to the other knowledge bases as teachers transform their knowledge into classroom tools and tasks for assessment.
| Knowledge base (knowledge of…) | Ideas | Description (statements that directly communicate the…) |
|---|---|---|
| Students | Student prior knowledge | Student prior experiences aligned to the content, task, or representational level of the item |
| | Student response | Student ability to respond to the item or potential student response to the item |
| Content | Representational level | Johnstone's level emphasized in the stem or student response |
| | Dissolving | Phenomenon of a substance dissolving in solution |
| | Substance | Atoms, ions, particles, molecules, compounds involved in the phenomenon |
| | Electrostatic interaction | Strength or presence of attractive or repulsive forces due to electric charge (or partial electric charge) |
| Assessment | Item format | Arrangement of the stem or student response |
| Curriculum | Learning activity | What's the Solution? inquiry activity |
| | Educational objective | Educational objectives provided for the What's the Solution? inquiry activity |
| | State/national standards | State and/or national standards for chemistry |