Xuan Liao and Rui Liu*
College of Chemistry and Materials Science, Sichuan Normal University, Chengdu, Sichuan 610066, People's Republic of China. E-mail: liurui@sicnu.edu.cn
First published on 23rd September 2025
In school settings, language plays a crucial role in the transmission of knowledge. This case study describes and compares the Pedagogical Scientific Language Knowledge (PSLK) of a novice and an experienced junior high school chemistry teacher. Data were collected from 40 classroom observations, as well as semi-structured and post-class interviews. Using the PSLK framework, we evaluated the teachers’ performances across PSLK elements. Results revealed distinct PSLK characteristics and teaching differences between the two educators. The experienced teacher demonstrated effective integration and adaptive optimization of scientific language strategies, including multimodal scaffolding, conceptual-language synergy, and contextualization in real-life situations. In contrast, the novice teacher adopted fragmented and formulaic approaches, characterized by inconsistent terminology use, superficial connections between concepts and language, and limited adaptability in teaching strategies. These findings underscore the need for teacher education programs to explicitly foster the systematic integration and adaptive use of PSLK to support the professional growth of novice teachers.
According to Laszlo (2013), “Chemistry teachers are linguistic guides, they are interpreters. They teach their students how to craft well-formed chemical sentences”. Therefore, chemistry teachers should be well-prepared to address students’ difficulties in learning and using scientific language and to make the concepts of scientific language accessible to all students. To ensure effective knowledge translation, chemistry teachers should not only possess a deep understanding of Chemish but also develop pedagogical strategies to bridge the gap between scientific language and student comprehension. The responsibility for equipping students with the ability to understand Chemish and to use it correctly and purposefully therefore lies with chemistry teachers (Mönch and Markic, 2022a).
Current research indicates that preservice and in-service science teachers often lack sufficient chemistry content knowledge (CK) for the high school level, which contributes to their struggles in incorporating multiple meanings of scientific terms into their instruction (Kind, 2014; Galiza et al., 2018). Additionally, some teachers do not see the value in developing teaching objectives for scientific language, or do not even regard teaching scientific language as their responsibility (Hansen-Thomas et al., 2018). If teachers do not know how to teach scientific language in the classroom, students may develop anxiety about scientific language, which can hinder their learning (Taibu and Ferrari-Bridgers, 2020). For example, teachers use terms that students do not yet understand in an everyday sense, potentially leading to student confusion (Morgado Fernández and Sologuren Insúa, 2024). All of these factors may hinder students’ learning of scientific language. Therefore, chemistry teachers require a specialized form of teacher knowledge: Pedagogical Scientific Language Knowledge (PSLK), first proposed by Markic (2017). Central to PSLK is the “scientific language transformation ability”—a competence that enables teachers to adapt scientific language into contextually appropriate forms while maintaining scientific rigor.
Currently, several Pedagogical Content Knowledge (PCK) frameworks (Magnusson et al., 1999; Gess-Newsome, 2015; Carlson et al., 2019) have been proposed and applied to science education contexts, but there is still a lack of systematic knowledge frameworks in the field of scientific language. This gap is particularly critical, given the role of chemical symbols and images as a suite of highly distinctive, subject-specific semiotic resources that are rarely used in other academic disciplines and are therefore characteristic of chemistry (Yu and Doran, 2024). Existing research on chemistry teachers’ PSLK focuses predominantly on model development and preliminary validation (Mönch and Markic, 2023, 2024), with limited investigation into variations in teachers’ PSLK or the factors driving its development through professional development frameworks. Consequently, empirical research on PSLK implementation in chemistry teaching contexts is critically needed.
PSLK represents a specialized refinement and contextualization of PCK within the domain of scientific language instruction. While PCK broadly encompasses the knowledge teachers need to transform subject content in teaching, PSLK specifically focuses on the linguistic dimension of this transformation process in science (particularly chemistry). Given its role in mediating language across diverse chemical topics, PSLK thus functions more as a cross-topic principle for chemistry education. The recently refined PSLK model by Mönch and Markic (2024) identifies ten core, observable elements constituting this knowledge. Like established approaches for capturing PCK (e.g., classroom observations, CoRe), the observable manifestations of these PSLK elements during instruction provide a diagnostic lens to assess teachers’ PSLK performance and identify variations. This study leverages this framework and its diagnostic potential to empirically investigate how PSLK manifests in real classroom settings.
This study adopted a case study approach to reveal the application characteristics of PSLK in teaching practice for a novice case teacher and an experienced case teacher. The selection of a novice and an experienced case teacher specifically enables contrastive analysis of how professional socialization processes shape PSLK maturation over time. Therefore, the specific research questions are as follows:
(1) What is the PSLK performance of the experienced case teacher?
(2) What is the PSLK performance of the novice case teacher?
(3) What are the differences in PSLK between the experienced case and the novice case teacher on the same topic?
Chemish is particularly distinctive due to its unique symbolic system, triple representations (macroscopic, microscopic, and symbolic), and a vast array of specialized terms that often diverge from their meanings in everyday language (Markic and Childs, 2016; Rees et al., 2018; Vaccaro et al., 2022). Consequently, general frameworks of PLK may be insufficient to capture the specific challenges inherent in chemistry instruction. Based on this, Markic (2017) proposed the concept of PSLK for (chemistry) teachers, building upon both PCK and PLK. This construct is defined as linguistic knowledge related to subject-matter teaching, situated in a specific context of teaching and learning, which Markic adapts to the notion of PSLK as a science or chemistry teacher's “knowledge of scientific language related to teaching and learning chemistry” (Mönch and Markic, 2022a).
PCK, a core component of teacher knowledge proposed by Shulman, synthesizes CK with pedagogical knowledge (PK) (Shulman, 1987). In contrast, PSLK specifically targets the teaching of Chemish, requiring that teachers possess not only fundamental knowledge of terminology and the concepts behind it (CK) but also an awareness of its semantic features, grammatical structures, and diverse representational forms spanning the macroscopic, microscopic, and symbolic levels (Mönch and Markic, 2022b). This distinctiveness arises because Chemish functions not merely as a knowledge carrier but inherently constitutes a major difficulty in learning chemistry (Mönch and Markic, 2022a). Consequently, PSLK integrates generic PCK principles—including analysis of student preconceptions and selection of instructional strategies—to address challenges unique to Chemish, such as distinguishing everyday versus scientific semantics. This integration ensures that PSLK remains grounded in topic-specific chemical pedagogy while simultaneously operating as a cross-cutting principle across instructional contexts, ultimately advancing students’ disciplinary literacy in chemistry (e.g., scientific explanation and argumentation). This theoretical framework aligns closely with the PSLK model advanced by Mönch and Markic (2024), which emphasizes synthesizing knowledge of the multidimensional features of scientific language (its CK side) with strategies for teaching it (its PCK side).
Research indicates that scientific language can constitute a major barrier for students learning chemistry (Rees et al., 2018), necessitating that teachers possess specialized PSLK to help students overcome these challenges. PSLK is conceptualized as “an integration of CK and PCK”, highlighting its unique role in chemical education. It involves translating CK into practice, encompassing the identification of student difficulties with scientific terminology and the implementation of effective instructional strategies (Mönch and Markic, 2022b). For instance, when students encounter the abstract concept of the “mole”, teachers with well-developed PSLK can anticipate potential learning obstacles and employ targeted linguistic explanations, analogies, and concrete examples to scaffold conceptual understanding.
Fig. 1 Model of Pedagogical Scientific Language Knowledge (Mönch and Markic, 2024).
Mönch and Markic's systematic research identifies ten core PSLK elements (inner ring in Fig. 1): knowledge of (i) scientific language role models, (ii) the development of the concept before the development of the scientific language, (iii) making scientific terms and language explicit, (iv) providing a discursive classroom, (v) providing multiple resources and representations, (vi) providing scaffolds for scientific language development, (vii) communicating expectations clearly, (viii) specific methods and tools for teaching and learning the scientific language, (ix) motivation when learning scientific language and (x) lesson preparation and follow-up (Mönch and Markic, 2024). Empirical studies validate this multidimensional model, emphasizing the importance of explicit instruction, contextualized scaffolding, and curriculum consistency (Mönch and Markic, 2023). The PSLK model critically shapes teachers’ decisions regarding scientific representations, discourse norms, and student perception analysis, thereby defining their PSLK performance as the understanding and application of this knowledge.
Recent empirical studies, such as Mönch and Markic (2023), have validated the applicability of the PSLK model through pre-service teacher training programs, demonstrating its effectiveness in fostering scientific language awareness and pedagogical adaptability. For instance, educators with robust PSLK are prepared to employ diverse instructional approaches—such as using vivid explanations, examples, experiments, and analogies—to make abstract concepts tangible and scaffold connections between scientific terminology and conceptual understanding (Mönch and Markic, 2022b). They further guide students in articulating experimental phenomena and results through structured language tools (e.g., sentence starters, diagrams), ensuring accurate communication of scientific processes (Kieferle and Markic, 2024). Additionally, effective teachers employ various representations (e.g., analogies, metaphors, and models) to bridge abstract scientific concepts with students' prior knowledge and real-world experiences, thereby facilitating understanding (Sri et al., 2021). However, systematic comparisons of PSLK across teacher types (e.g., experienced vs. novice teachers) remain underexplored.
As Shulman emphasized, teacher knowledge is not merely static information stored in the mind; it should be manifested and operationalized through pedagogical practices—such as explaining, exemplifying, questioning, and assessing (Shulman, 1986). Thus, classroom practices serve as a critical site for observing how PCK is enacted. This aligns with the view that the value of PCK lies in its impact on teaching practice, and that direct observation of classrooms is a profitable method for uncovering the tacit dimensions of PCK that are difficult to capture through interviews alone (Käpylä et al., 2009). In this study, “Pedagogical Scientific Language Knowledge” (PSLK) is conceptualized as a highly specialized form of PCK within science education. Just as PCK is enacted through instructional behaviors, PSLK can be demonstrated via teachers’ practical actions—including language use, instructional design, and teacher–student interactions in classrooms.
Performance assessment is an approach that evaluates knowledge and skills by requiring individuals to execute authentic real-world tasks (Griffin et al., 2012, p. 39). Applied to PSLK, this means assessing teachers’ specialized language knowledge through their enactment of language-focused pedagogical practices in the classroom. Systematic classroom observation is a key method for measuring the effectiveness of teaching and collecting evidence of teaching practice (Lara-Alecio et al., 2024); specifically, it involves documenting observable behaviors related to scientific language use, scaffolding, and interaction. This study's use of classroom observation is therefore grounded in the theoretical premise that PSLK—as an internal, cognitive construct focused on teaching scientific language—can be measured through a series of concrete language-related behaviors in the “performance task” of classroom teaching. The classroom observation instrument functions to deconstruct the abstract PSLK construct into observable, recordable, and assessable “behavioral indicators” related to scientific language pedagogy, thereby enabling the evaluation of this implicit knowledge.
While teachers play a crucial mediating role in facilitating students’ acquisition of scientific language, students themselves confront a dual challenge in learning chemistry: they must simultaneously master abstract scientific concepts and acquire a novel, highly specialized “language” (Rees et al., 2018). This system includes discipline-specific vocabulary, symbolic representations, and syntactic structures that differ significantly from everyday language. Failure to acknowledge the distinct cognitive load imposed by this linguistic dimension may lead educators to misinterpret learning difficulties merely as conceptual misunderstandings, thereby overlooking the fundamental role of language processing in the comprehension of chemistry. Empirical evidence consistently demonstrates a positive correlation between students’ language proficiency and their chemistry achievement (Firmayanto et al., 2020). Conversely, inadequate language skills are directly associated with poor learning outcomes, indicating that failures primarily stem from an inability to process the instructional language rather than from the inherent complexity of the subject matter (Elhadary and Elhaty, 2021). This underscores that any analysis of student success or failure in chemistry should integrate linguistic considerations.
The inherent complexity of chemical language directly contributes to the discipline's cognitive load. When students encounter concepts such as “Le Chatelier's principle”, their working memory must concurrently process terms like “equilibrium”, “concentration”, “pressure”, and “temperature”, alongside the principle's logical relationships. Cardellini (2014) argues that the abstract and multifaceted nature of chemical symbols places a significant burden on students' working memory, as they must simultaneously learn and manipulate this unfamiliar representation system (p. 240). While some linguistic load is intrinsic, a significant portion is extraneous, arising from suboptimal instructional communication. When instructors or textbooks use ambiguous language, fail to define terms clearly, or present symbolic equations without adequate verbal explanation, students divert cognitive resources to deciphering instructions rather than learning chemical concepts (Chepyegon, 2011). Recognizing that unclear communication generates extraneous cognitive load provides educators with a powerful theoretical tool to optimize pedagogical language.
The research process and research idea of this study are shown in Fig. 2. The design systematically aligns the three research questions with corresponding data collection and analysis phases. Classroom observations, triangulated with interviews, capture teachers’ PSLK implementation and conceptual understanding (RQ1/RQ2), while a structured comparative analysis synthesizes discrepancies across the ten PSLK components (RQ3). The figure emphasizes the iterative workflow in which each question drove targeted data collection (e.g., observation transcripts, interviews) and distinct analytical approaches (deductive coding, comparative frameworks), ensuring theoretical–empirical coherence throughout.
However, given that theoretical constructs themselves are not directly observable, a critical next step is to operationalize these abstract elements. The challenge of operationalizing complex educational constructs is well-documented in the literature, requiring a clear translation from theory to measurable practice (Borges del Rosal et al., 2016). This process constitutes the central pillar of our tool design. For each PSLK element, we defined a series of specific, unambiguous, and observable instructional behaviors. These observable practices are systematically measured using a standardized rubric during classroom observations. Kane et al. (2010) provide empirical evidence that a teacher's performance on a structured observational framework is a significant predictor of their effectiveness in promoting student achievement growth, as measured by standardized test scores. For instance, the abstract PSLK element of “Scientific Language Role Models” was operationalized into the following observable indicators: (1) teachers’ normative and consistent use of scientific language, along with pedagogical awareness, encompassing conscious application, accuracy and consistency, interpretation to facilitate understanding, and recognition of their exemplary role; (2) teachers observing students’ scientific language expression in group discussions and using it as an exemplary model for peer learning; (3) the frequency and effectiveness with which teachers use counterexamples in scientific language to guide students in identifying and correcting inappropriate scientific expressions.
This procedure was replicated for every PSLK element, transforming a catalog of theoretical constructs into a practical tool for systematic observation. Detailed observation indicators are presented in the Appendix. This approach ensures data collection relies not on subjective impressions but on documented presence and quality of specific pedagogical behaviors, directly reflecting the application of PSLK in practice.
Subsequently, we conducted a pilot test of the classroom observation form. Observational data were collected from ten consecutive lessons delivered by an experienced chemistry teacher (“Hana”). Post-observation, a one-hour semi-structured interview with Hana was conducted, in which we discussed specific classroom events and compared observer ratings with the teacher's stated instructional intentions and reflections. This triangulation enabled assessment of alignment between instrument-captured data and the teacher's own understanding of instructional decision-making. The pilot phase proved crucial for tool refinement. In-depth analysis revealed two critical gaps: (1) initial items inadequately captured explicit strategies teachers use to scaffold complex chemical concepts for novice learners. Consequently, the item “Helping students understand concepts” was added to focus on these targeted scaffolding techniques. (2) The tool insufficiently reflected the key practice of linking abstract chemical principles to students’ daily lives and societal issues. To address this, the item “Linking to real-life situations” was incorporated. These two new items were converted into observation indicators during the pilot-testing process (see the Appendix for details).
In contrast, pilot testing confirmed that the item “Pre-lesson preparation and post-lesson follow-up” could not be effectively assessed through classroom observation alone. Given our methodology's inherent constraints to direct observation, this item was removed to maintain instrument focus and validity. Simultaneously, we eliminated the dimension “the motivation when learning scientific language.” Our observation tool primarily targets externally observable teaching behaviors (e.g., language modeling, questioning strategies, resource deployment, scaffolding) and linguistic features of teacher–student interactions. Reliably inferring motivation necessitates more nuanced methods; importantly, we contend that the core concerns of the “motivation” dimension are substantively addressed by the newly added dimension (Linking to real-life situations). This addition was informed by pilot findings indicating that connecting abstract concepts/language to lived experiences constitutes one of the most direct and observable teacher strategies for stimulating interest and intrinsic motivation.
The revised instrument, incorporating these modifications, was deployed in the primary data collection phase. Data collection continued until theoretical saturation was achieved, ensuring the final dataset sufficiently supported the study's analytical objectives. The observation instrument was reviewed by two domain experts (chemistry education professors with >10 years’ experience) for content validity. The final PSLK Classroom Observation Form is presented in Table 1.
| Observation project | | Observation results |
|---|---|---|
| Scientific language role models | The chemistry teacher serving as a scientific language role model | |
| | Students serving as scientific language role models | |
| | Other instances serving as scientific language role models | |
| Providing a discursive classroom | Providing opportunities for students to practice scientific language | |
| | Incorporating multiple dimensions of language | |
| | Asking questions | |
| | Using mistakes as learning opportunities | |
| Communicating expectations clearly | | |
| Multiple resources and representations | | |
| Developing the concept first | | |
| Explicating scientific language | | |
| Specific methods and tools | Introducing Chemish | |
| | Practicing Chemish | |
| | Summarizing Chemish | |
| | Monitoring students’ use of Chemish | |
| Scaffolds for scientific language development | Oral strategies | |
| | Visual aids | |
| | Written strategies | |
| Helping students understand concepts (new) | | |
| Linking to real-life situations (new) | | |
We conducted three months of classroom observations (each capturing the entire 40-minute lesson), resulting in 40 lessons in total: 20 taught by the novice teacher and 20 by the experienced teacher. Each teacher was observed delivering 1–2 new lessons per week, ensuring both teachers had multiple observation records under identical lesson themes. While primary observational data were recorded in real time using the structured form (Table 1), lessons were video recorded with participant consent to: (1) serve as stimuli for subsequent stimulated-recall interviews; (2) provide supplementary data for analysis.
1. Why is it important for students to learn the language of science?
2. What scientific language should students learn?
3. How does students’ prior knowledge affect the teaching of the language of science?
4. Are there other factors that can influence the teaching of science language in chemistry classes?
5. Why is the language of science a challenge for students?
6. What are some ways to diagnose student understanding/misunderstanding when teaching the language of science?
7. How do you consider the language of science when preparing or teaching lessons? Involving certain groups of students as well?
8. Do you use any special teaching methods when teaching the language of science?
9. What methods have you already tried in teaching scientific language? What has worked well and what has not?
10. As a teacher, you know more about the language of science than your students need to know. Is this sometimes a challenge for you?
11. Looking back, how have you changed teaching the language of science since you became a teacher?
12. Is there anything that you had not thought of before but did during this interview?
13. To summarize from your perspective, what are the most important points to consider when teaching the language of science?
14. What other help is available when teaching the language of science?
| PSLK level | Code name | Coding instructions | Standard of judgment | Example (using the observation program “Teacher's Role as a Role Model” as an example) |
|---|---|---|---|---|
| 1 | Unconsciousness/fragmentation | Teachers did not fully realize the importance of this element and implemented it sporadically and unsystematically, often based on intuition. | Elements are not present or are implemented haphazardly, without a clear purpose, and are not integrated with students’ cognitive needs. | Tina uses contradictory expressions such as “speeds up the reaction” and “participates in the reaction” interchangeably when describing “catalysts” and does not distinguish between everyday and scientific terms (e.g., referring to “oxidation” as “rusting”). (Data source: Tina's Classroom Observation Sheet under the Theme of “Catalyst”) |
| 2 | Mechanical actuation | Teachers were able to mimic standard strategies, but lacked contextualized adjustments, and implementation was formal. | Elements are mechanically applied (e.g., copying textbook templates), no attention is paid to student feedback, and strategies are weakly related to instructional objectives. | Tina strictly uses standardized expressions such as “iron undergoes slow oxidation in moist air”, but when students are confused between “composition and constituents”, she simply repeats the definitions without identifying the differences through examples. (Data source: Tina's Classroom Observation Sheet under the Theme of “Composition of substances”) |
| 3 | Systematization and integration | Teachers can systematically design strategies that are flexible and adaptable to the cognitive characteristics of their students, resulting in a reusable instructional framework that can be tailored to individual needs. | Elements are integrated into the main lines of the instructional design, strategies are logically coherent, and implementation paths can be dynamically optimized through diagnostic questioning or tasks. | When teaching “combustion conditions” Hana systematically designs a “three-color labeling method”: blue labeling “combustible” (material basis), yellow labeling “temperature reaches the ignition point” (energy conditions), red labeling “contact with oxygen” (reaction environment), and the color-coding system is reproduced in subsequent combustion experiments (e.g., combustion of sulfur, iron wire). (Data source: Hana's Classroom Observation Sheet under the Theme of “Combustion and Extinguishing”) |
| 4 | Innovative optimization | Teachers creatively develop strategies tailored to the learning context, design individualized teaching models, and guide students in constructing knowledge. | Element implementation reflects originality and active student participation in the strategy improvement process. | Hana designs a Coke Mentos activity, records the students’ experiments, and asks them to explain the video they created. The students describe the phenomenon as “a large amount of gas was generated, leading to eruption”, which Hana corrects to “dense bubbles were rapidly generated inside the bottle, forming a white column of bubbles that erupted out of the bottle's mouth!” (Data source: Hana's Classroom Observation Sheet under the Theme of “Interest Experiment”) |
To ensure objectivity and consistency in data analysis, the first author trained a research assistant on the application of the classroom observation form and analytical framework. Both coders conducted pre-test coding using video recordings. Prior to formal coding, the researchers participated in training workshops, including: (1) a two-hour workshop on the PSLK framework and the operational definitions of behavioral indicators (see the Appendix); (2) dual coding of two lessons taught by the same instructor on identical topics to calibrate interpretation of the analytical framework; (3) calculation of inter-rater reliability on four lesson samples (Cohen's κ = 0.82). A final coding standard was then established, with discrepancies resolved through discussion with the research team.
Following clarification of the coding standard, analytical rigor was ensured through double independent coding and consistency verification. Both coders conducted a three-stage analysis using NVivo 12 Plus (QSR International): (1) translating the PSLK framework into a hierarchical node structure (10 theoretical-dimension parent nodes, 40 level sub-nodes) through matrix coding; (2) independently annotating 440 reference points while documenting coding rationale in memos and verifying construct boundaries through Boolean/proximity queries; (3) randomly sampling 30% of lessons (n = 12) to ensure proportional representation across classrooms, achieving a weighted kappa of κ = 0.76 (SD = 0.09) with an 87.6% exact agreement rate, meeting the Landis and Koch benchmark for substantial agreement.
| Observation project | Level 1 (unconsciousness/fragmentation) | Level 2 (mechanical actuation) | Level 3 (systematization and integration) | Level 4 (innovative optimization) | Mean level |
|---|---|---|---|---|---|
| Scientific language role models (O1) | 2 | 5 | 11 | 2 | 2.65 |
| Providing a discursive classroom (O2) | 1 | 7 | 10 | 2 | 2.65 |
| Communicating expectations clearly (O3) | 6 | 8 | 5 | 1 | 2.05 |
| Multiple resources and representations (O4) | 0 | 9 | 9 | 2 | 2.65 |
| Developing the concept first (O5) | 3 | 10 | 6 | 1 | 2.25 |
| Explicating scientific language (O6) | 4 | 11 | 4 | 1 | 2.10 |
| Specific methods and tools (O7) | 2 | 7 | 9 | 2 | 2.55 |
| Scaffolds for scientific language development (O8) | 1 | 6 | 10 | 3 | 2.75 |
| Helping students understand concepts (O9) | 0 | 5 | 13 | 2 | 2.85 |
| Linking to real-life situations (O10) | 1 | 8 | 8 | 3 | 2.65 |
To quantify the average level of each PSLK element, each level was assigned a numerical value (Level 1 = 1 point, Level 2 = 2 points, Level 3 = 3 points, Level 4 = 4 points). The mean score for each element was then calculated by multiplying each level's frequency by its point value, summing these products, and dividing by the total sample of 20 lessons. For example, Hana's mean score for “O1” was computed as (2 × 1 + 5 × 2 + 11 × 3 + 2 × 4)/20 = 2.65. This procedure was repeated for all elements, with the average scores for each element shown in Table 4.
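The frequency-weighted mean described above can be expressed compactly as follows; the `mean_level` helper is illustrative, not part of the study's instruments, and the frequencies are taken from the O1 row of Table 4.

```python
def mean_level(freqs, n_lessons=20):
    """Frequency-weighted mean PSLK level.

    freqs maps each level value (1-4) to the number of the n_lessons
    observed lessons rated at that level for a given element.
    """
    return sum(level * count for level, count in freqs.items()) / n_lessons

# Hana's observed frequencies for element O1 (scientific language role models)
o1 = {1: 2, 2: 5, 3: 11, 4: 2}
print(mean_level(o1))  # → 2.65
```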
In terms of “O1”, Hana standardized the use of scientific language and reinforced the accuracy of language expression through her board design. However, when students were confused about terminology, such as “volatilization” versus “evaporation”, she offered direct corrections instead of analyzing the differences in scientific language through examples; her strategies for responding to students’ linguistic misconceptions still need strengthening. For example, in explaining “catalysis”, she demonstrated the systematic integration of experimental phenomena and terminology by pairing terms with the match-rekindling demonstration; yet when dealing with students’ immediate questions, she still relied on repeating definitions rather than guiding students to analyze them on their own.
The interview data supported Hana's view of chemical language as both a disciplinary “tool” and a “second language” (as she said, “chemical language should be a tool in chemistry” and “it should fall into the category of like a second language”), which drove her emphasis on linguistic normativity. However, she also noted that, when addressing student confusion, she relied more heavily on immediate “exercise” (as she said, “the most effective way is to exercise, to let the students practice immediately after the lecture”) than on in-depth guided analysis, which aligns with the tendency toward mechanical corrections observed in her classes.
“O2” is a prominent element in Hana's teaching practice. Her classroom consistently fosters a language-learning environment through multi-layered interactions and varied representations. In Hana's lesson on the law of conservation of mass, this PSLK element centers on the two-way dynamic between experimental phenomena and conceptual construction: she uses two exploratory experiments, “White Phosphorus Combustion” and the “Iron–Copper Sulfate Reaction”, as entry points, guiding students to observe macroscopic mass changes before and after the reaction. To bridge the cognitive gap between macroscopic observations and microscopic principles, Hana innovatively introduces the analogy of “building with blocks”: atoms in a chemical reaction are likened to differently shaped block units, and the reaction process to “reassembling a new structure using the original blocks”, where the total block count remains unchanged despite external transformations. This real-world analogy illustrates how the observed phenomenon of mass conservation follows from the conservation of atoms. Hana's tendency to innovate through dynamic pacing adjustments, for instance extending discussion time based on student responses, was evident in some sessions. However, in other classes “questioning” remained teacher-driven (e.g., self-answering), limiting students’ active participation and showing that her discursive classroom approach had yet to fully break from traditional lecture modes.
The interviews showed that Hana understood the importance of student-centered lesson planning when designing interactions (as she said, “being able to prepare the lesson more from the students’ perspective”). She recognized that methods for sparking student interest (like games) were effective but time-intensive (as she said, “it requires the teacher to put energy into prepping the lesson”). This highlights the tension she felt between innovating to optimize (e.g., adjusting discussions on the fly) and maintaining efficiency (sometimes still reverting to teacher-led instruction), partly driven by practical constraints of classroom time and midterm exam requirements (as she said, “there's a trade-off involved with knowing a little bit more about the midterm exams”).
In the dimensions of “O9” and “O10”, Hana showed impressive creativity. For example, when teaching “atomic structure”, she used the metaphor of a “stadium” for the atom and “ants in the stadium” for electrons to bring these concepts to life; in a lesson on the “diversity of carbon allotropes”, she introduced examples like the British royal scepter and graphite bombs to effectively spark students’ verbal engagement. These strategies not only deepened conceptual understanding but also lessened the cognitive load of scientific language by using relatable, real-world contexts, meeting the Level 4 criterion for innovative optimization. However, some attempted life connections remained superficial. For instance, using the proportion of female students in the class to illustrate mass fraction, while mathematically analogous, failed to explore deeper relationships between chemical language and lived experience, suggesting inconsistent execution.
Hana's intent to link abstract concepts (like valence) to students’ life experiences (“bartering” analogies) or interests (choosing stomach medicine) was evident in the interview (as she said, “try to make it as scientific as possible, but also accessible to the class”), supporting the innovative nature of her approach. However, she also admitted that she sometimes deliberately held back from going deeper or adding more connections to avoid overloading students or straying from the exam's focus (as she said, “throwing it out there would be a memory burden for the students, preventing them from treating it as a key piece of knowledge”). This explains the unevenness seen in practice.
In addition, Hana demonstrated a notably systematic approach in employing “O8” and deploying “O7.” For instance, when teaching “filtration operation”, she broke the procedure into steps through guided questioning and reinforced memorization using mnemonics (e.g., “One Adherence, Two Lows, Three Contacts”); similarly, in the lesson on “Preparing Carbon Dioxide”, she used “Experiment Solitaire” to have students infer the rules of chemical language independently. These strategies reflect her flexible instructional design and her skill in adjusting scaffolding to students’ cognitive levels. However, classroom observations also revealed that some written strategies were underutilized and that an over-reliance on verbal explanations potentially reduced the diversity of chemical language representations.
Regarding the distribution of PSLK levels, discernible divergences existed in Hana's performance on elements like “O6” and “O5.” In O6-focused practices, for example, the teacher frequently used qualifying statements like “junior high school level can be considered” or emphasized “rigorous step-by-step calculations” to standardize student language output, which may foster a rigid understanding of scientific language (Level 2). Conversely, in O5 implementation, the teacher prioritized concept construction through experimental investigation and analogical reasoning before introducing the scientific language—for instance, ensuring understanding of “reduction” before formally defining the “oxygen-capturing reaction”, which reflects a sophisticated knowledge of the conceptual-linguistic relationship (Level 3). This disparity suggests that Hana's PSLK is unevenly developed, with strengths concentrated on elements tied to direct instructional behaviors; more systematic reflection and adjustments are needed on how language design can foster students’ deeper cognitive processing of scientific concepts.
Based on the comprehensive analysis of Hana's PSLK performance across the ten elements and observational data, three core characteristics stand out: (1) she skillfully weaves scientific language into concepts using metaphors and inquiry, but her strategies for standardizing terms and setting expectations often fall back on mechanical corrections or test-focused instructions, which limits deeper language precision; (2) her dynamic changes show real-world ingenuity but lack consistent application and theoretical grounding, leading to patchy improvements across lessons; (3) while she excels at building discussion-rich settings and linking concepts to language, relying too much on indirect support can blur scientific rigor, and her curriculum alignment prioritizes test success over genuine language understanding.
| Observation project | Level 1 (unconsciousness/fragmentation) | Level 2 (mechanical actuation) | Level 3 (systematization and integration) | Level 4 (innovative optimization) | Mean level |
|---|---|---|---|---|---|
| Scientific language role models (O1) | 7 | 10 | 3 | 0 | 1.80 |
| Providing a discursive classroom (O2) | 8 | 8 | 4 | 0 | 1.80 |
| Communicate expectations clearly (O3) | 2 | 10 | 5 | 3 | 2.45 |
| Multiple resources and representations (O4) | 1 | 6 | 10 | 3 | 2.75 |
| Developing the concept first (O5) | 6 | 9 | 4 | 1 | 2.00 |
| Explicating scientific language (O6) | 6 | 9 | 3 | 2 | 2.05 |
| Specific methods and tools (O7) | 0 | 4 | 12 | 4 | 3.00 |
| Scaffolds for scientific language development (O8) | 2 | 12 | 5 | 1 | 2.25 |
| Helping students understand concepts (O9) | 5 | 10 | 4 | 1 | 2.05 |
| Linking to real-life situations (O10) | 4 | 10 | 5 | 1 | 2.15 |
In the “O1” dimension, Tina exhibited limitations in normative precision and contextualization. Classroom observations revealed frequent use of vague or redundant language, which undermined the accuracy of scientific terminology. For instance, when explaining “energy changes in material changes”, she often illustrated heat absorption with the example of “ice melts into water and absorbs heat”, yet failed to clearly distinguish physical changes (such as melting with heat absorption) from chemical changes (endothermic reactions). Concurrently, Tina over-relied on generic directives without step-by-step scaffolding, leading students to conflate everyday experiences with scientific concepts. Additionally, her rapid speech pace and lack of strategic pausing hindered students’ ability to grasp linguistic focal points. Although she attempted in some classes to use symbolic notation on the board, video evidence showed fragmented notations that failed to effectively reinforce language modeling, exemplified by disorganized writing during the “Calculating Chemical Equations” instruction. Tina's behavioral patterns reveal a superficial grasp of scientific language conventions, alongside identifiable deficits in the systematic design of language-modeling strategies.
“O2” was a weak aspect of Tina's practice. Tina's classes predominantly featured one-way lectures: in the “Atomic Structure” lesson, her questions called only for simple recall (e.g., “What are atoms made of?”) and failed to prompt deep thinking, while during “Checking the Airtightness of Devices”, she gave students no time for independent thought, directly dictating procedural steps from the lesson plan. Though some classes attempted experiments, the activity designs were perfunctory and failed to establish a chemical language learning environment conducive to trial and error. These practices indicate that Tina's understanding of discursive classrooms remains limited to “providing practice opportunities”, overlooking the synergistic development of language and thinking.
When asked about effective ways to promote students’ understanding of the language of chemistry, Tina responded: “Currently, only the method of learning to classify substances and then learning how to read their names has been used, but the results have been less than satisfactory, and there are no other special methods for the time being.” This reflects her underdeveloped capacity to orchestrate multidirectional discourse, which manifested as didactic instruction with token interactive tasks. Because she lacked the awareness or resources to explore diverse interactive strategies, her “O2” practice remained superficial, providing only opportunities for basic practice.
Tina demonstrated fragmentation in “O9” and “O10”. For example, when teaching “valence”, Tina used a “building blocks assembling a house” analogy to illustrate atomic reorganization during reactions but only superficially noted that “the number of blocks remains unchanged”, without connecting this to the “conservation of atomic species and number” or the law of conservation of mass. In the “CO” lesson, while mentioning the “gas composition” life example, she neglected to explore the chemical mechanism of CO's toxicity. Such strategies often relied on fragmented anecdotes, lacking systematic design and causing disconnects between lived experience and scientific language. During the interview, Tina expressed awareness of the importance of “pre-penetrating chemical ideas” (“to penetrate relevant ideas from time to time”), yet failed to translate this awareness into effective classroom practice (as she said, “there are no other special methods for the time being”); as a result, the connections across the observed cases appeared fragmented and unsystematic, reflecting a disconnect between theoretical concepts and classroom practice. Notably, Tina showed promise in isolated lessons, particularly with novel tools: during “Checking Device Airtightness”, her interactive whiteboard sketches proved markedly more effective than Hana's traditional chalk diagrams.
During the interview, Tina mentioned the intention to utilize “physical models or metaphors” (e.g., models of atoms) to help students understand, consistent with observed localized strengths in using specific technological tools (e.g., interactive whiteboards), suggesting she perceives the potential value of multimodal resources but has not yet systematically integrated or transferred them to broader teaching contexts.
The “O7” and “O8” dimensions further reveal Tina's mechanical approach. In the “Properties of Oxygen” lesson, she segmented the transition from textual to symbolic expression without addressing its cognitive hurdles, even though the lesson's key knowledge was the conditions for applying the “↑” symbol. While teaching chemical-equation calculation, she emphasized rote formatting (“first write ‘solution’”) but overlooked instruction in the underlying calculation logic, instead directly providing the “left-right ratio” rule. Because these strategies merely mimic standard procedures without contextual embedding, the intensity of scaffolding was misaligned with student needs. Tina's infrequent use of written strategies further undermined the diversity of linguistic representations.
Tina's focus on students’ cognitive load was evident in her interview, where she stressed that “the most important thing to consider is students’ receptivity” and the importance of “knowing how much students can absorb in a lesson.” However, her approach to assessing student understanding relied predominantly on conventional paper-and-pencil tests and simple questioning, lacking more nuanced formative assessment or differentiated instructional scaffolding (e.g., no mention of tailored support for students at varying ability levels). As a result, her support measures (e.g., focusing on answer formatting) often failed to address students' deeper cognitive barriers (e.g., comprehending the symbolic logic of chemistry), and did not lead to adaptive teaching responses.
PSLK level distribution indicates that Tina's approach to “O5” and “O6” reflects a focus on procedural clarity and formal accuracy, which at times may come at the expense of conceptual depth. For instance, during instruction on “Chemical Formula Calculation”, she emphasized procedural rules such as “it must be × percent” without fully unpacking the underlying logic of mass fraction calculations. Similarly, in the “Properties of Oxygen” lesson, while she adeptly demonstrated standardized notations like “↑”, her rapid pacing and limited conceptual reinforcement may have contributed to a more surface-level engagement with the content. These observations suggest that Tina is still developing strategies to harmonize conceptual understanding with explicit language instruction, indicating a transitional phase in her integration of content and language pedagogy.
Tina believes learning chemistry “requires a combination of memorization, comprehension, and application”, placing particular emphasis on “standardization” and avoiding “mistakes in details” that can cause point loss (as she said, “mistakes in the language of chemistry can lead to loss of marks”; “You can lose points if you make mistakes in memorizing the language of chemistry”). This intense focus on test norms and accuracy partly explains her tendency to stress “rigorous steps” and specific formatting (e.g., “first write ‘solution’”) in her teaching, sometimes at the expense of fostering a deeper understanding of the concepts’ essence and the chemistry behind the language, confirming the observed prioritization of form over substance.
Based on a comprehensive analysis of Tina's PSLK performance across the elements and observational data, three core characteristics stand out: (1) the primary barrier stems from difficulties in translating knowledge of pedagogical strategies into consistent and responsive classroom practice, especially concerning the use of “O1”. This manifests as inconsistent scientific language modelling, an inability to foster substantive student discussions, and difficulty adapting to emergent learning opportunities or student responses; (2) her ability to connect scientific terminology to real-world contexts remains transitional. While she can create surface-level connections and occasionally explore deeper conceptual links, these practices lack consistency and systematic integration into her teaching framework; (3) despite these challenges, she demonstrates proficiency with novel technological tools (e.g., interactive whiteboards and simulations), sometimes using them more effectively than more experienced teachers.
Fig. 3's radar chart shows differences in the level distribution of PSLK. At Level 1, Tina shows a broader distribution than Hana on almost all dimensions (except O3), indicating more missing or fragmented elements. At Level 2, both teachers show a considerable distribution, but Tina's is much broader, especially in O1, O8, and O9. This suggests a more procedural use of PSLK compared to Hana, who demonstrates fewer instances at this level. At Level 3, Hana outperforms Tina on nearly all dimensions, with Tina slightly ahead on O4 and O7. This indicates that Hana demonstrates stronger systematic integration of PSLK. At Level 4, high-level practices are infrequent for both teachers. While Hana shows a slightly broader distribution, Tina demonstrates a comparative advantage in O3 and O7.
To facilitate comparative analysis between the two teachers, we compared their mean levels across the PSLK elements, generating the radar chart shown in Fig. 4. While Hana demonstrated higher scores than Tina across most elements, Tina outperformed Hana in specific dimensions (O3, O4, O7). Tina's proficiency in leveraging interactive whiteboard technologies (e.g., visualizing airtightness checks in apparatus setups) contributed to her notable advantage in the O4 and O7 dimensions (Section 6.2: “Tina's application of interactive whiteboard technology surpassed Hana's”). Furthermore, Tina demonstrated a stronger tendency to communicate expectations clearly (Section 6.2 interview: “Errors in chemical language result in point deductions”) and reinforced procedural consistency in O3 through repeated emphasis on writing conventions (e.g., mandatory inclusion of “solution” in chemical equations).
This suggests that, although most of Tina's PSLK elements are deficient or fragmented overall, her performance is comparable in specific areas, even surpassing Hana at Levels 3 or 4 on certain PSLK elements. The comparative analysis goes beyond surface teaching differences, uncovering fundamentally distinct knowledge structures within the experienced (Hana) and novice (Tina) teachers’ PSLK. These contrasts manifest not only in strategic choice but also in how knowledge is organized, activated, and adapted within dynamic classroom ecosystems.
Tina's PSLK, conversely, often operates as compartmentalized fragments. Elements like “explicating scientific language” or “helping students understand concepts” are addressed in isolation rather than synergy. While she sporadically deploys effective tactics (e.g., the superior use of interactive whiteboards), these lack cohesive pedagogical integration. Her compartmentalized fragments correspond to the Novice/Advanced Beginner stage: novice teachers are limited by their “reliance on context-free rules” and mechanically execute isolated strategies (such as dealing with “language interpretation” or “concept teaching” separately), unable to recognize the teaching connections between elements (Lyon, 2015). Her interview underscores this: strategies like classification-first approaches feel disconnected (“results have been less than satisfactory”), signaling unresolved tension in synthesizing PSLK elements into a student-responsive framework.
Tina's practice leans toward procedural fidelity. Her focus centres on delivering preset content and enforcing accuracy (as she said, “mistakes in the language of chemistry can lead to loss of marks”). Though aware of student “receptivity” and cognitive limits (as she said, “knowing how much students can absorb”), her assessment relies heavily on summative checks (tests, direct questioning), limiting dynamic scaffolding during concept formation. This leads to rigid procedural emphasis (as she said, “first write ‘solution’”) that may overlook conceptual gaps. Tina's procedural fidelity aligns with the rigidity of the novice stage: her “delivering preset content” and “enforcing accuracy” are survival strategies for the novice stage—avoiding cognitive overload by adhering to explicit rules (such as “must write ‘solution’ first”) (Slattery, 2017).
Tina's emerging innovations (e.g., tech integration) depend more on external models or tools. Her strengths, such as using physical models, lack systematic integration into her PSLK schema or cross-context transfer. Her interview reveals a desire to “penetrate relevant ideas from time to time”, but she acknowledged having few effective methods (as she said, “only the method of learning the classification has been used”), indicating reactive rather than generative innovation. This corresponds to the bottleneck of the competence stage in the Dreyfus model: although there is conscious planning, there is a lack of expert-level “pattern recognition agility”, resulting in superficial innovation attempts (Slattery, 2017).
However, the ability to coordinate PSLK elements is not innate but develops incrementally through reflective practice and deliberate design, as illustrated by Tina's fragmented application of elements compared to Hana's integrated and adaptive approach, a disparity quantified across multiple PSLK dimensions (Section 6.3). Evens et al. (2018) found that presenting knowledge domains in isolation (e.g., only PK and CK) was insufficient for developing PCK, suggesting that without integrated guidance, robust subject-specific expertise may not be fully established. Therefore, teacher education training should move beyond merely introducing PSLK components to cultivating teachers' pedagogical reasoning, which is the skill of dynamically coordinating language objectives with conceptual learning goals, student needs, and contextual demands. Professional development programs for teachers should incorporate analysis and design of integrated PSLK curricula, achievable through video reflection, case studies, or collaborative lesson planning. Future research could explore how digital tools support teachers in visualizing and coordinating PSLK elements in real time.
This contrast reveals a gap between merely possessing PSLK knowledge reserves (e.g., knowing which vocabulary is challenging or understanding several explanation strategies) and effectively applying that knowledge in real-world contexts. Adaptive expertise thus functions as the essential cognitive mechanism that integrates a teacher's conceptual understanding (knowing-why) with procedural knowledge (knowing-how), enabling not only the application of existing knowledge but also the innovation of context-sensitive strategies in response to novel challenges (Ng et al., 2022). It transforms teachers from mere “executors” of PSLK strategies into “designers” capable of dynamically restructuring and adapting teaching approaches based on students’ unique language barriers and cognitive needs. Tina's challenges in adapting her instruction to emergent student needs suggest that her difficulties stem less from a complete lack of PSLK elements and more from an immature ability to apply them dynamically. This is vividly illustrated in her teaching of “composition and constituents”, where, upon recognizing student confusion, she defaulted to mechanically repeating definitions rather than generating illustrative examples or analogies to bridge the understanding gap (Section 6.2). As Mönch and Markic (2022a) emphasize in their systematic review, the construction of PSLK requires a deep understanding of teaching contexts. This study further indicates that such understanding should be dynamic and immediate—the very essence of adaptive expertise.
Consequently, this research offers significant implications for teacher professional development programs. Pre-service training and in-service professional development should not merely focus on imparting specific PSLK strategies; they must also design learning environments that cultivate teachers’ adaptive expertise (Moran et al., 2023). This means teacher training curricula need to move beyond theoretical lectures and prioritize enhancing teachers’ capacity for “reflection-in-action” (Park and Oliver, 2008). This can be achieved, for instance, by utilizing authentic classroom video clips to guide pre-service or novice teachers in analyzing complex teacher–student interactions and student language difficulties, exploring multiple possible response strategies aligned with adaptive expertise rather than seeking a single “correct answer” (Vale et al., 2024). Creating low-risk simulated environments, such as the mixed-reality simulations used by Rosati-Peterson et al. (2021), likewise allows teachers to safely practice and adjust their instructional strategies based on immediate feedback from virtual learners.
Notably, Tina outperformed Hana in utilizing digital tools (e.g., interactive whiteboards) to visualize language (Section 6.2). This phenomenon can be interpreted through the Technological Pedagogical Content Knowledge (TPACK) framework (Koehler and Mishra, 2009). Despite the novice teacher's overall underdeveloped PSLK, her digital tool proficiency provided an alternative pathway for implementing effective language-focused instruction. This demonstrates that strong Technological Pedagogical Knowledge (TPK) might serve as compensatory scaffolding for immature PSLK. For instance, through the TPACK framework, preservice teachers can learn to evaluate and select technological tools that accurately represent and clarify scientific concepts for their students (Latip et al., 2023). Simultaneously, the use of multimodal resources (e.g., visualizations, hands-on activities) in subject-integrated teaching can reduce cognitive load by disaggregating concept and language learning, thereby supporting disciplinary language acquisition (Gieske et al., 2022).
Simultaneously, our research robustly validates the diagnostic value of the PSLK framework and demonstrates its powerful utility as a research and professional growth tool. The framework's detailed ten-element structure effectively captures subtle variations in teachers' PSLK, precisely reflecting the complex interplay between language awareness and chemistry instruction. Furthermore, when transformed into a structured observation tool, this framework demonstrates dual value: it serves as a reliable research instrument for capturing PSLK across diverse classrooms while also functioning as a vital scaffold for teacher reflection, self-assessment, and goal setting. By linking theoretical constructs to observable teaching behaviors, this tool bridges a critical gap between the principles of language pedagogical knowledge and instructional practice.
In summary, this study illuminates the practical characteristics of PSLK and its contextual relationship with teaching experience through in-depth analysis, providing empirical support for enhancing language literacy in teacher education. Future implementations should carefully consider the dynamic interplay between individual educators and their specific instructional contexts.
| Observation project | Observation indicators | Description |
|---|---|---|
| Scientific language role models | The chemistry teacher serving as a scientific language role model | Teachers’ normative and consistent use of scientific language, along with pedagogical awareness, encompasses conscious application, accuracy and consistency, interpretation to facilitate understanding, and recognition of their exemplary role. |
| | Students serving as scientific language role models | Teachers observe students demonstrating scientific language expression in group discussions and then use these expressions as exemplary models for peer learning. |
| | Other instances serving as scientific language role models | The frequency and effectiveness with which teachers use counterexamples in scientific language to guide students in identifying and correcting inappropriate scientific expressions. |
| Providing a discursive classroom | Providing opportunities for students to practice scientific language | Teachers create opportunities for students to practice communicating and expressing themselves using scientific language. |
| | Incorporating multiple dimensions of language | Teachers’ integrated demonstration of scientific language across multiple dimensions in teaching, discussion, and writing practice. |
| | Asking questions | The frequency with which teachers pose questions and the extent to which they guide students into deeper thinking through follow-up inquiries. |
| | Using mistakes as learning opportunities | How teachers create safe, supportive classroom environments that transform student mistakes into collective learning opportunities. |
| Communicating expectations clearly | | Teachers clearly distinguish between practice and assessment situations and state and explain in advance the evaluation criteria and requirements for the use of scientific language. |
| Multiple resources and representations | | Teachers employ multiple representation strategies to promote students’ holistic understanding of abstract concepts. |
| Developing the concept first | | Teachers guide students to construct concepts through experiments and observation of phenomena, rather than directly representing them with scientific language. |
| Explicating scientific language | | Teachers guide students to pay attention to the morphology, usage, and contextual features of scientific language. |
| Specific methods and tools | Introducing Chemish | Teachers gradually introduce scientific language and text types, guiding students to understand differences through comparative examples. |
| | Practicing Chemish | Teachers guide students to actively use and practice scientific language through diverse activities. |
| | Summarizing Chemish | Teachers guide students to systematically summarize scientific language and its connections and differences using tools such as tables and mind maps. |
| | Monitoring students’ use of Chemish | Teachers monitor students’ correct use of scientific language by listening, checking assignments, and asking students to explain terms. |
| Scaffolds for scientific language development | Oral strategies | Teachers guide students in constructing accurate scientific expressions through targeted questioning and terminology explanations. |
| | Visual aids | Teachers employ multiple characterization tools and visualization methods to make abstract concepts explicit. |
| | Written strategies | Teachers establish frameworks and logical pathways for written scientific language output by organizing tools and vocabulary paradigms through structured texts. |
| Helping students understand concepts (new) | | Teachers employ diverse strategies to ensure students gain a deep understanding of concepts and can clearly articulate their meaning. |
| Linking to real-life situations (new) | | Teachers connect scientific concepts to students’ life experiences and real-world scenarios, guiding students to describe and explain the real world using scientific language. |
This journal is © The Royal Society of Chemistry 2026