Discipline-specific cognitive factors that influence grade 9 students’ performance in chemistry

Lina Zhang ab, Lei Wang *a and David F. Treagust c
aBeijing Normal University, Beijing, 100875, China. E-mail: wangleibnu@126.com
bScience and Mathematics Education Assessment Center, Beijing Institute of Education, Beijing, 100044, China
cSTEM Education Research Group, School of Education, Curtin University, Perth, Western Australia 6102, Australia

Received 21st November 2020, Accepted 2nd May 2021

First published on 17th May 2021


Abstract

Students’ academic performance in chemistry can be the result of a number of cognitive and affective factors. This study explored the influence of the discipline-specific cognitive factors of knowledge structure, cognitive perspectives, and cognitive patterns on grade 9 students' chemistry performance. One instrument measured chemistry academic performance based on concept knowing, application and problem solving. Six tasks with marking keys measured the discipline-specific cognitive factors of knowledge structure, cognitive perspectives, and cognitive patterns. Different groups of grade 9 students participated in pilot tests and the field tests. The quality of the chemistry academic performance instrument and the six tasks was inspected by both expert assessment with six raters and computer-aided inspection including Rasch analysis and Kendall rater-consistency reliability tests. Correlation analysis and multiple regression analysis explored the relationship among academic performance and knowledge structure, cognitive perspectives, and cognitive patterns. According to the results of this research, knowledge structure, cognitive perspective and cognitive pattern all influenced grade 9 students’ chemistry performances; cognitive perspective was the most important factor. Based on these findings, we discuss individual student performance relative to their discipline-specific cognitive factors. We recommend that instruction of discipline-specific learning in chemistry pay attention to each of the three discipline-specific cognitive factors and that tasks be designed to promote the progress of each of these three discipline-specific cognitive factors, especially cognitive perspectives.


Introduction

Existing research in science education has highlighted the importance of cognitive factors as variables that influence students' academic performance (Lawson, 1985; Danili and Reid, 2006; Adrian-Yigit, 2018). Researchers have investigated the effects of formal reasoning ability, prior knowledge, field dependence/field independence, and memory capacity on students' academic achievement (Ingle and Shayer, 1971; Lawson, 1983, 1985; Chandran et al., 1987), though learning the discipline-specific knowledge of chemistry was not discussed in these studies.

In many studies, researchers have found that although conceptual change instruction had been implemented, students still scored low on cognitive tests and retained their existing preconceptions instead of scientific concepts taught by the teachers (Abell and Lederman, 2007; Taber et al., 2012; Adbo and Taber, 2014; Potvin et al., 2015; Derman and Eilks, 2016; Nimmermark et al., 2016). Research has shown that even with the same teacher and techniques, there are differences in students’ performance in chemistry learning (Taber, 1997, 1998, 2001; Nicoll, 2001; Coll and Treagust, 2003a, 2003b; Jarkko and Maija, 2018). This finding gives rise to research to investigate which discipline-specific cognitive factors influence students’ academic performance in chemistry.

Theoretical framework

In recent years, discipline-based education research (DBER) has developed as a relatively new area of research at the undergraduate level with findings that are consistent with cognitive science and studies in K-12 education (National Research Council, 2012a, 2012b). In this field of research, investigators are interested in developing an understanding of how students learn the concepts, practices and ways of thinking in a particular discipline. In the case of the research reported in this paper, that discipline is chemistry and specifically the concepts (related to knowledge structure), practices (related to laboratory work – not part of this study) and ways of thinking (related to cognitive perspectives and cognitive patterns) for chemistry in the grade 9 curriculum.

Cognitive psychologists use the term cognitive structure to characterize internal cognitive factors that influence students’ learning. Piaget (1983) used schema to describe cognitive structures that illustrate students’ cognitive development; the original schema, developed through assimilation or adaptation, leads to new, more advanced schema. Students’ knowledge is of great significance in promoting their cognitive development. In the field of science education, mental models are dynamic structures constructed to answer questions, understand phenomena, or deal with situations (Vosniadou, 1994; Vosniadou and Brewer, 1992, 1994; Coll and Lajium, 2011). Students’ mental models are derived from their cognitive structures and are influenced by their conceptual understanding. Existing research on mental models involves many scientific concepts such as atoms, molecules, chemical bonds, and metallic bonds. The common agreement from existing research is that many core concepts of science are incomprehensible to students because of their abstract nature (Vosniadou and Brewer, 1992, 1994; Harrison and Treagust, 2000; Hsieh et al., 2014). Studies of mental models usually assign factors such as scientific knowledge, especially its core concepts, to schema, but these studies have not analytically explored schema from the perspective of their constituent mental elements (National Research Council, 2012a, 2012b), that is, the discipline-specific cognitive factors. Therefore, research is needed to identify and investigate these cognitive factors. One approach to investigating the disciplinary knowledge in students’ cognitive structure is to study the discipline-specific cognitive factors of students’ knowledge structure, cognitive perspectives, and cognitive patterns. These cognitive factors affect students' performance and are the core mechanism of disciplinary competence (Wang, 2016; Chiu and Lin, 2019).
Each of the three discipline-specific cognitive factors has its own specific domain of knowledge and research and its own unique cognitive activities and problem-solving tasks, and all are needed to understand, analyze, and solve problems using the discipline's unique ideas and methods (Zhang and Wang, 2013; Zhang, 2015; Wang, 2016). Consequently, students’ academic performance can be measured by responses to specific questions designed to measure the three cognitive factors that make up students’ mental cognitive structures. These structures are evident in (a) how the discipline is organised – for example, how that knowledge is structured can be measured by a concept map showing links between concepts (Won et al., 2017), (b) the cognitive perspectives employed – for example, by the different modes of text, drawings and symbols (Park et al., 2020), and (c) the cognitive patterns demonstrated – for example, as measured by concept maps showing macro–micro representation and systematic thinking (Margel et al., 2006; Brandstädter et al., 2012). While there are different theories that can explain how these factors affect students’ understanding of scientific phenomena, our position is that the best theory is constructivism, which ‘holds the promise of providing some unity to the practice of education and learning’ (Dennick, 2016).

Discipline-specific cognitive factors

In chemistry, students’ explicit performance is determined by their knowledge structure, cognitive perspectives and cognitive patterns as shown in Fig. 1. The practicality of these relationships is now discussed.
Fig. 1 Discipline-specific cognitive factors that influence students' academic performance.
Knowledge structure (see Fig. 1). The knowledge factor refers to the chemical knowledge structure possessed by a student. Knowledge structure is a hypothetical construct indicating the organization of concepts in learners' long-term memories and the relationships between them; it plays an important role in promoting students' cognitive development and affects students' chemistry performance (Shavelson, 1974; Champagne et al., 1981; Nakiboglu, 2008). More than 50 years ago, Ausubel (1968) showed that cognitive development was related to cognitive structure and to the components of knowledge and their associations. Subsequent research by Rumelhart (Rumelhart and Norman, 1976; Rumelhart, 1986) argued that research should investigate knowledge schemas and how they enhance students' cognitive development (see also Rumelhart and Ortony, 1977; Jonassen et al., 1993; Zhou et al., 2015). Sources of chemical knowledge can be found in textbooks. However, after learning chemical knowledge, students’ individual differences dictate whether knowledge is retained and whether or not this knowledge can be demonstrated as knowledge output (Rumelhart, 1977, 1986). Thus, individual differences also affect students’ performance. Regardless of whether knowledge is presented as nouns, such as ‘molecules’, ‘atoms’, etc., or in the form of statements, such as ‘molecules are the smallest particles necessary to maintain the chemical nature of a substance’, it is necessary to interpret, analyze, and infer from the data whether such knowledge exists in students’ minds. For example, this knowledge is the basis of whether or not students can apply their understanding of the physical and chemical properties of iron to solve problems.

Ausubel (1968) put forward the importance of knowledge connections when discussing the structure of knowledge. The subordinate concept with connections and structure helps to form a more general concept through the process of superordinate learning. Forming a knowledge structure is a process that involves students memorizing and integrating knowledge and is the basis for knowledge in use (Ausubel, 1968; Novak and Gowin, 1984; Harris et al., 2016, 2019). For example, knowledge structure reflects students’ understanding of a certain reaction of iron; they not only know that iron can burn in oxygen but also that iron can slowly oxidize in air. Another example is that students know iron can react with hydrochloric acid and with copper sulphate solution and consider that these two are examples of metal activity and an iron replacement reaction, respectively.

Students’ knowledge structure affects their academic performance which can be evaluated by concept maps (Shavelson et al., 2005; Lopez et al., 2014; Hung and Lin, 2015; Won et al., 2017). It is also important to evaluate the links between the concepts and conceptions present in students' knowledge structure so that teaching can be more effective (Adrian and Subramaniam, 2018).

Cognitive perspective (see Fig. 1). Concepts such as molecule and atom may provide the perspective of particles when understanding specific substances or phenomena. Similarly, the concept of element provides a specific perspective for understanding matter and reactions. Only when knowledge has a cognitive perspective and reasoning paths are developed in problem solving can understanding be transformed into disciplinary competence (Wang, 2016). The cognitive perspective entails the ‘navigation’ of thinking and reasoning in the process of problem solving. Students are expected to possess the perspective of chemists, for example, ‘substance is formed by the arrangement of atoms in multiple ways’, ‘change is the rearrangement of atoms’, and ‘the process of chemical change is accompanied by changes in energy’ (Organization for Economic Co-operation and Development, 2006, 2015; International Association for the Evaluation of Educational Achievement, 2011, 2019; Ministry of Education, 2011; National Research Council, 2012a, 2012b, 2015). These perspectives are derived from an analysis of the core chemistry concepts (Claesgens et al., 2009; Zhang and Wang, 2013). A concept itself is objective but students’ perceptions of concepts, such as chemical bonds and molecules, are subjective and may be expressed by individual students in different ways. Even if the content of instruction is the same, different students will have a different understanding of the core concepts (Zhang, 2015). It is expected that the cognitive perspectives formed by students through chemistry learning, like chemists’ perspectives, result in a scientific understanding of substances and phenomena rather than the development of misconceptions.

Furthermore, cognitive perspectives reflect students’ perceptions of specific phenomena such as chemical change. There can be multiple perspectives on certain phenomena; generally, holding more perspectives reflects higher levels of understanding. For example, specific perspectives of matter and change, including classification and being made of particles, enable an understanding of specific substances and chemical changes. However, not all students’ perspectives of matter and change will be exactly the same. After learning chemistry concepts, some students may not reflect on these concepts and may not be able to form perspectives to understand matter and change like scientists. Nevertheless, it is expected that students can apply multiple perspectives flexibly in problem solving after learning about certain concepts, but not all students will have the same perspective; these different perspectives give rise to understandings about matter and change on the basis of the knowledge learned (Harris et al., 2016, 2019). If a perspective is formed, students should be able to apply that perspective to solve problems. For example, learning about molecules and atoms not only retains conceptual nouns as definitions of concepts in students’ minds but also enables students to explain, analyze, and demonstrate from the perspective of particles. As another example, when learning about iron, students not only pay attention to the macroscopic properties of iron, which is silvery white and has a metallic luster, but they also can actively state that ‘iron consists of iron atoms’, which means that they have formed a particle perspective.

Hence, a cognitive perspective is a key factor affecting student discipline-specific learning. Different students develop their own advanced perspectives when they understand specific substances and reactions or engage in problem solving. Faced with the same matter or phenomenon, different individuals may understand from different perspectives which are evident in their different ways of reasoning and their performance.

Cognitive pattern (see Fig. 1). Chemistry thinking involves the three levels of ‘macroscopic’, ‘sub-microscopic’, and ‘symbolic’ (Johnstone, 1991, 1993, 2000; Margel et al., 2008; Chandrasegaran et al., 2009). In the field of chemistry, an understanding of scale, proportion and quantity, systems and system models, structure and function, and stability and change involves different types of views that are gradually developed through the study of chemistry (National Research Council, 2012a, 2012b, 2015; Sevian and Talanquer, 2014; Weinrich and Talanquer, 2015). Therefore, chemical thinking involves not only the three levels of macro, micro, and symbolic but also includes the understanding of quantity, stability and systematic thinking. These understandings are what students should form through chemistry learning; they represent the objective scientific knowledge that exists in students' cognitive structures.

Cognitive patterns not only represent chemistry thinking but also crosscutting scientific understanding, including the three levels of understanding macro, micro, and symbolic as well as understanding quantity, stability and systematic thinking (National Research Council, 2012a, 2012b, 2015) and how students link concepts (Ye et al., 2015). Firstly, chemical concepts such as the movement of electrons outside the nucleus, the structure of water, and the structure of metal crystals are presented in chemistry textbooks. From the perspective of the discipline, students are required to associate the microscopic world with the macroscopic world (Fu, 1999; Ding, 2003; Zhang and Wang, 2013). The middle school chemistry curriculum requires students to master the idea that ‘substances are composed of extremely small particles that never stop moving’ and to ‘use atomic and molecular models to explain chemical reactions’ (Ministry of Education, 2011; National Research Council, 2012a, 2012b, 2015). Secondly, Gibbs free energy, enthalpy, and oxidation–reduction reaction equation balance are presented in chemistry textbooks requiring students to develop qualitative–quantitative patterns from the perspective of the discipline (Fu, 1999; Ding, 2003; Zhang and Wang, 2013). The middle school chemistry curriculum requires students to calculate the mass of a substance, to calculate products based on chemical equations, to understand the law of conservation of mass, to measure the acidity and alkalinity of a solution by pH value (Ministry of Education, 2011; National Research Council, 2012a, 2012b, 2015). Thirdly, the periodic law of elements, system and environment, and categories of crystals are presented in chemistry textbooks requiring students to develop systematic thinking (Fu, 1999; Ding, 2003; Zhang and Wang, 2013). 
The middle school chemistry curriculum requires students to distinguish between pure substances and mixtures, elemental substances and compounds, organic compounds and inorganic compounds, and to recognize that all substances are composed of one or more elements (Ministry of Education, 2011; National Research Council, 2012a, 2012b, 2015). Fourthly, reversible reactions, chemical equilibrium, and transformation of precipitation are presented in chemistry textbooks requiring students to possess a ‘static–dynamic’ thinking from the perspective of the discipline (Fu, 1999; Ding, 2003; Zhang and Wang, 2013). The middle school chemistry curriculum requires students to understand the three states of matter (solid, liquid, gaseous) and their transformation, and realize energy transformation through chemical reactions.

When recognizing specific substances and changes, each type of cognitive pattern does not exist in isolation but is combined in a specific way, for example, as macro-qualitative-fragmented-static or micro-quantitative-systematic-dynamic (Zhang and Wang, 2013). There are three different levels of cognitive patterns, from low to high, for understanding matter and change. For example, a macroscopic understanding of iron is at a lower level while a microscopic understanding, based on the perspective of particles (molecules, atoms, etc.), is at a higher level. The highest level of cognitive pattern is a combined macroscopic and microscopic understanding (Johnstone, 1982, 1991; Margel et al., 2008; Zhang and Wang, 2013; Talanquer, 2018). If students can understand matter and its changes from the perspective of particles on the basis of knowledge learned about the atom, they can also actively understand a substance in combination with its macroscopic aspects such as physical properties, use microscopic understanding to explain the nature of phenomena of macroscopic materials, and combine macro and micro to understand matter and change. In this way, students reach the highest level of the macro–micro cognitive pattern, which helps to develop their abstract thinking. Similarly, quantitative thinking, dynamic thinking, and systematic thinking are also developed on the basis of knowledge learning and the formation of perspectives (National Research Council, 2012a, 2012b, 2015; Zhang and Wang, 2013; Sevian and Talanquer, 2014; Weinrich and Talanquer, 2015).

Cognitive patterns, which affect students' academic performance, exist in students’ minds at a deeper level than cognitive perspectives; they are more general, more stable, and not easy to test, but reflect a diverse and high-level understanding of matter and change. The variety of cognitive patterns is a manifestation of richness of ideas, which indicates a high level of understanding. The continuous deepening of the level of cognitive patterns reflects an increase in the depth of understanding, which likewise indicates a high level of understanding (National Research Council, 2012a, 2012b, 2015; Zhang and Wang, 2013; Talanquer, 2018).

Academic performance

Academic performance describes students’ behaviours, which can be characterized by their actions on tasks expressing what students can do, often using an output verb such as ‘explain’. According to Bloom's taxonomy (Bloom et al., 1956; Anderson and Krathwohl, 2001), analyses of international large-scale assessments in science including PISA, TIMSS and NAEP, the chemistry curriculum standards in China and abroad, and expert checks, tasks are labelled with terms such as ‘Recognize’, ‘Represent’, and ‘Exemplify’ for Concept-Knowing; ‘Compare/Contrast’, ‘Explain’ and ‘Analyse’ for Concept-Application; and ‘Hypothesize/Predict’, ‘Design’ and ‘Conclude’ for Problem-Solving. The tasks and performance items used in this research to assess expressed behaviour levels of performance are shown in Table 1. Responses to these tasks are a measure of students’ academic performance and can be used to outline the level of students’ cognitive development.
Table 1 Task and academic performance items to test expressed behaviour
Level | Task | Academic performance
Knowing | Recognize | Remember the knowledge related to matter and change, e.g., write familiar chemical reaction equations
Knowing | Represent | Represent/describe the properties of the types of matter or phenomena of chemical changes
Knowing | Exemplify | Cite required examples
Knowing | Translate/interpret | Interpret the matter or chemical reaction information from tables or figures
Knowing | Summarize/classify | Determine the type of substance or chemical reaction
Applying | Compare/contrast | Compare two or more types of substance or chemical reactions
Applying | Judge/decide | Make judgments on types of substance or chemical reactions
Applying | Explain | Explain types of substance or chemical reactions
Applying | Analyze | Analyze certain types of substance or chemical reactions from one or more perspectives
Applying | Infer | Infer from one or more perspectives
Problem solving | Hypothesize/predict | Predict based on given information
Problem solving | Design | Design experiments, including experimental procedures, or selecting and effectively combining experimental apparatuses
Problem solving | Conclude | Prove/argue/find solutions; draw conclusions
Problem solving | Reflect/evaluate | Reflect on/evaluate experiments effectively
Problem solving | Produce/generalize | Generalize; put forward new questions


If students can complete certain tasks, for example, if they can write down a familiar chemical equation of a reaction, describe a phenomenon of a chemical reaction, cite examples of compounds, interpret information from tables or figures, and classify substances or chemical reactions, they have reached the level of ‘knowing’ chemical knowledge. If students can use certain perspectives to make comparisons and decisions, or offer explanations, analyses or inferences, for example, if they can compare the content of oxygen in Al2O3 and Fe2O3, determine that CO2 is an oxide, explain certain phenomena, infer from one or more perspectives, and analyze certain phenomena, they have reached the level of ‘applying’. If students can predict, design experiments, draw conclusions, reflect on and evaluate experiments, or generalize and put forward new questions based on the experiment or practice, they have reached the level of ‘problem solving’.

Researchers in education distinguish between different types of knowledge. Declarative knowledge is about knowing facts. A chemistry example is that an acid turns litmus red and neutralises alkalis. Procedural knowledge is knowing how to perform tasks with the actual learned knowledge. A chemistry example is knowing how to find the neutralisation point when an acid reacts with a base. Knowledge structure like schema is the knowledge of how concepts within a domain are interrelated (Jonassen et al., 1993; Anderson and Krathwohl, 2001; Organization for Economic Co-operation and Development, 2015). In this research we focus on students' chemistry performance based on their declarative knowledge and procedural knowledge in chemistry learning and the discipline-specific cognitive factors including knowledge structure, cognitive perspectives and cognitive patterns that affect students’ chemistry performance.

The relationship between academic performance and knowledge structure, cognitive perspective and cognitive pattern. Piaget proposed the theory of schema, using cognitive structures, to describe students’ performance and their intellectual development. Feng (Feng, 1998; Feng and Feng, 2011) believed that the performance of individuals is determined by their internal cognitive structure. Our consideration is that chemistry knowledge structure, cognitive perspectives and cognitive patterns are the three factors of the disciplinary cognitive structure formed by students in the process of learning chemistry. Consequently, the knowledge structure, cognitive perspectives and cognitive patterns are the internal factors which affect students’ external performance on assessments and can be represented by the relationship:
 
Academic performance = a × knowledge structure + b × cognitive perspective + c × cognitive pattern (1)

From formula (1) above, we assume that there is a linear relationship between academic performance and knowledge structure, cognitive perspective, and cognitive pattern, where a, b, and c represent the degree, or rate of contribution, of the influence of knowledge structure, cognitive perspective, and cognitive pattern on students' academic performance (see Fig. 1).
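A linear model of this kind is typically estimated by ordinary least squares over students' scores on the three factor tasks. The following is a minimal sketch only: the scores below are invented for illustration, not the study's data, and the coefficients a, b, and c correspond to the weights in formula (1).

```python
import numpy as np

# Hypothetical scores for six students: columns are knowledge structure,
# cognitive perspective, and cognitive pattern task scores.
X = np.array([
    [12.0,  8.0, 5.0],
    [10.0,  9.0, 6.0],
    [15.0, 12.0, 8.0],
    [ 8.0,  6.0, 4.0],
    [14.0, 11.0, 7.0],
    [ 9.0,  7.0, 5.0],
])
# Hypothetical academic performance scores (out of 30).
y = np.array([20.0, 21.0, 28.0, 14.0, 26.0, 16.0])

# Add an intercept column and solve for [intercept, a, b, c]
# by ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
intercept, a, b, c = coef

# Fitted performance values under the linear model.
predicted = X1 @ coef
```

In the study itself, the relative sizes of the standardized coefficients are what identify which factor contributes most to performance.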

Methodology

Research questions

The empirical part of this research involves the development of assessment tools for performance and the determination of students’ knowledge structure, cognitive perspectives and cognitive patterns. We are interested in which discipline-specific cognitive factors influence students’ chemistry performance and how knowledge structure, cognitive perspectives, and cognitive patterns affect this performance. Thus, the main research questions are: How do knowledge structure, cognitive perspective and cognitive pattern affect grade 9 students’ chemistry performance? Which factors have the greatest impact?

Data collection

Participants. A total of 302 students from four secondary schools (grades 7–12) in Beijing were involved in the research, which included pilot tests and field tests. In the pilot tests, 80 students participated in the first round to develop the instrument to measure academic performance and the tasks and scoring methods to measure knowledge structure, cognitive perspective and cognitive pattern; 150 students participated in the second round to report item quality as measured by Rasch analysis. Another 12 students participated in the interview and think-aloud tasks to further develop the tasks and scoring methods for knowledge structure, cognitive perspective and cognitive pattern. For the field test, a stratified sampling method was applied to select 20 students from each of three of the four schools, with high, medium, and low performance according to the unified junior high school graduation examination for academic achievement in Beijing. Consequently, the field test, which was conducted with 60 students, included the instrument to measure academic chemistry performance and tasks to measure knowledge structure, cognitive perspective and cognitive pattern.
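The stratified selection described above (20 students from each of three performance bands, 60 in total) can be sketched as follows. The rosters, band sizes, and student IDs here are hypothetical, not the schools' actual enrolments.

```python
import random

random.seed(0)  # reproducible draw for illustration

# Hypothetical rosters: student IDs grouped by school performance band.
strata = {
    "high":   [f"H{i:03d}" for i in range(120)],
    "medium": [f"M{i:03d}" for i in range(150)],
    "low":    [f"L{i:03d}" for i in range(110)],
}

# Draw 20 students from each stratum without replacement.
sample = {band: random.sample(students, 20)
          for band, students in strata.items()}

# Field-test cohort size: 3 strata x 20 students = 60.
total = sum(len(s) for s in sample.values())
```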

The average age of the grade 9 students was 14 years and 6 months. In mainland China, the chemistry curriculum is officially first offered in grade 9, which is the beginning period for the formation of the chemist's perspective and for developing cognitive patterns. This study of grade 9 students’ perspectives and cognitive patterns in chemistry is of great significance for understanding their formation as well as the level of the students’ academic performance and progress. Students follow the full-time Compulsory Education Chemistry Curriculum Standard and use the same textbook published by the People's Education Press (2012). Important concepts such as molecule, atom, chemical formula, chemical equation, and the law of mass conservation were learned before the chemistry performance instrument was administered.

Ethics clearance for the study was obtained from each school's Institutional Oversight Board. The testing process was approved by the school leaders and chemistry teachers, and the entire process was supervised by the teachers. All participants were informed of the purpose and procedure of the test and were told that their participation was voluntary and their anonymity would be guaranteed. The students consented to participate in the performance test, the test for knowledge structure, cognitive perspective and cognitive patterns, and possible interviews and think-aloud tasks.

The researchers collected data from the three schools that participated in the field testing within one week. The test for each school was completed within one day (to avoid any communication between students in different schools). The performance test took one hour while the test for knowledge structure, cognitive perspective and cognitive pattern also took one hour. Only the researchers knew the three schools that participated in the test and that there was no communication between the three schools. All data from written tests and interviews were in Mandarin. The instruments, student responses to the instruments and the interview transcripts were translated from Mandarin to English by an associate professor who majored in chemistry education; the English texts were then reviewed by a professor who is a native English speaker.

Instrument to measure academic performance in chemistry. The grade 9 students' chemistry performance assessment instrument is a paper-and-pen test that included 30 items with a total score of 30 (see Q1–Q30 in Appendix 1). The item types are multiple-choice and constructed-response items. The scoring method is right or wrong (1, 0) for each item/question. Eight examples of the items are shown in Fig. 2. Fifteen tasks with the expressed behaviours listed in Table 1 were used to design the items for the expected performance, which expresses what a student can do. For the content of the test, the knowledge of ‘metal’ was selected to examine students' understanding of ‘matter’ and ‘change’, which are the research objects of chemistry, the core content of curriculum standards, and the focus and difficulty of students' learning as well. Among these 30 items, 15 items correspond to students' understanding of ‘matter’, and the other 15 items correspond to students' understanding of ‘change’ (see Appendix 2 Table 9).
Fig. 2 Examples of eight items for the academic performance instrument in chemistry.

In Fig. 2, the items coded Q1, Q6, Q7 and Q8 test the following tasks of students' understanding of ‘matter’, respectively: Summarize/Classify, Compare/Contrast, Judge/Decide, and Explain; the items coded Q16, Q17, Q24 and Q25 test the following tasks of students' understanding of ‘change’: Recognize, Summarize/Classify, Infer, and Analyze.

The validity of the measures of academic performance was established by expert review combined with Rasch measurement. Six experts checked the academic performance instrument; two are professors in chemistry education and four are expert teachers in chemistry teaching and research. The students' academic performance measures were created with a one-dimensional Rasch model, and Winsteps 3.72.0 software was used to conduct the Rasch analysis and establish evidence for the validity and reliability of the data. The Rasch model, based on Item Response Theory (IRT) and applied here for item pool development, focuses on constructing relatively objective and equidistant scales, considering the difficulty of the items and students' ability separately, and in this way avoids problems such as the sample dependence of Classical Test Theory (CTT). Consequently, the Rasch model is often used to construct instruments to measure the development of students’ learning (Wilson, 2005; Liu, 2010; Stevens et al., 2010; Zhang, 2015). The construction of the instruments in this study followed the relevant steps for Rasch model-based instrument development (Wilson, 2005; Liu, 2010; Stevens et al., 2010). The research steps include: (1) defining the assessment framework; (2) item design; (3) pilot tests and revisions; (4) expert check; (5) field test; (6) computer-aided checking using the Rasch model; and (7) repeating steps 5–6 to improve the instrument.

Most of the chemistry academic performance items were adapted from senior high school entrance examination items, which possess good stability and discrimination and are of high quality in this context. The construction of the chemistry academic performance items followed the steps above. The computer-aided checks of the items comprised: (1) reliability; (2) person–item match (Wright map); (3) dimensionality (principal component analysis was applied to the standardized residuals to identify possible dimensions); and (4) item fit. The results showed a student (person) reliability of 0.83 and an item reliability of 0.98 (see Appendix 2, Fig. 6). Both person reliability and item reliability are higher than 0.7 and thus meet the reliability requirements of diagnostic tests; in other words, the items have a high degree of reliability. The Wright map, which shows the internal structure of the items and suggests construct validity of the measures, demonstrates that the students' abilities match the difficulty levels of the items and that the items cover the full range of students' abilities (see Appendix 2, Fig. 7). The result of the unidimensionality test in Appendix 2, Fig. 8 shows the factor loadings of the residuals. A variance of 50% or more explained by the Rasch dimension can be regarded as evidence that the scale is unidimensional (Linacre, 2011), and unidimensionality can be assumed if the second dimension (first contrast) has the strength of fewer than 3 items (in terms of eigenvalues) and the variance unexplained by the first contrast is less than 5% (Oon and Subramaniam, 2011). Factor analysis of the residuals showed that, once the measured construct (students' academic performance in chemistry) is controlled, the eigenvalue of the first contrast is 2.8, not much larger than 2 (see Appendix 2, Fig. 9), and almost all 30 items loaded on the measured construct (the exceptions being q12, q13, q23 and q25, marked B, A, a and C in Appendix 2, Fig. 8). This indicates that no second construct appears to be measured by the instrument (Peng et al., 2016). Therefore, the data largely meet the unidimensionality and local independence requirements, which provides evidence for the construct validity of the instrument. Across all 30 items, the INFIT MNSQ values range from a minimum of 0.70 to a maximum of 1.67. Only three items have INFIT MNSQ values greater than 1.3; the remaining 27 items fall between 0.70 and 1.30 (see Appendix 2, Fig. 9), meeting the accepted criteria and indicating that the 30-item test fits the model well (Wilson, 2005; Bond and Fox, 2007; Liu, 2010).
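As an illustration of the item-fit check in step (4), the dichotomous Rasch INFIT MNSQ statistic can be sketched in a few lines of Python. This is only a minimal sketch, not the Winsteps computation; the response matrix, abilities and difficulties below are invented for demonstration.

```python
import numpy as np

def infit_mnsq(responses, theta, beta):
    """Information-weighted mean-square fit for each item of a
    dichotomous Rasch model. responses: persons x items (0/1),
    theta: person abilities, beta: item difficulties (in logits)."""
    # Model-expected probability of a correct response
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - beta[None, :])))
    w = p * (1.0 - p)                # response variance under the model
    resid_sq = (responses - p) ** 2  # squared score residuals
    # INFIT MNSQ: variance-weighted mean square, per item
    return resid_sq.sum(axis=0) / w.sum(axis=0)

# Hypothetical data: 4 persons, 3 items
responses = np.array([[1, 0, 0],
                      [1, 1, 0],
                      [1, 0, 1],
                      [1, 1, 1]])
theta = np.array([-0.5, 0.0, 0.5, 1.5])
beta = np.array([-1.0, 0.2, 0.8])
print(np.round(infit_mnsq(responses, theta, beta), 2))
```

Values near 1.0 indicate good fit; the 0.70–1.30 band used above is a common rule of thumb rather than a fixed threshold.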

Tasks and scoring methods to measure knowledge structure, cognitive perspective and cognitive pattern. Six tasks were used to measure grade 9 students' knowledge structure, cognitive perspective and cognitive pattern. The tasks and the intentions corresponding to each factor are listed in Table 2.
Table 2 The six tasks showing type and intentions corresponding to knowledge structure, cognitive perspective, and cognitive pattern
Task Task description Type Intention
1 Please describe your understanding of iron Subjective self-reported items Taking iron as an example, the perspectives of matter were mainly examined
Tip: The descriptions you can use include: (1) text (2) drawing (3) symbols; if the above three methods are not sufficient to explain your understanding of iron, you can use other means to describe and indicate your method. Please write as much as you can
2 Please describe your understanding of the reaction between iron and oxygen Subjective self-reported items Taking the reaction between iron and oxygen as an example, the perspectives of change were mainly examined
Tip: The descriptions you can use include: (1) text (2) drawing (3) symbols; if the above three methods are not sufficient to explain your understanding of the reaction between iron and oxygen, you can use other means to describe and indicate your method. Please write as much as you can
3 Drawing a concept map (show Example 1, see Fig. 3) Concept map Taking iron as an example, the knowledge structure of matter was mainly examined
Please refer to Example 1 and construct a concept map with 'iron' as the core concept
4 Please refer to Example 1 to construct a concept map with 'matter' as the core concept Concept map To investigate cognitive patterns of matter
5 Please refer to Example 1 to construct a concept map with 'change' as the core concept Concept map To investigate cognitive patterns of change
6 Drawing a knowledge network map (show Example 2, see Fig. 3) Concept map (knowledge network map) Taking the reactions of iron as an example, the knowledge structure of change was mainly examined
Please refer to Example 2, construct a knowledge network map with ‘iron’ as the core concept, and write down related chemical reaction equations


The six tasks shown in Table 2 were used to assess students' knowledge structure, cognitive perspective and cognitive pattern. Knowledge structure was assessed by drawing two concept maps (tasks 3 and 6 in Table 2), cognitive perspective by subjective self-reported items (tasks 1 and 2 in Table 2), and cognitive pattern by drawing two further concept maps (tasks 4 and 5 in Table 2). A summary of how the student responses were scored is presented below, with further details of the tasks and the scoring in Appendix 3. Three students' answers were selected as examples to illustrate the scoring in Tables 3–5; they represent performance identified as excellent (Lily: Rasch measure score 53.7), middle (Xiao Ming: Rasch measure score 51.7), and to be improved (Jia: Rasch measure score 49.5). The responses of these three students are shown in Appendix 4.

Table 3 List for scoring method of knowledge structure
Task 1: draw a concept map with ‘iron’ as the core concept
Task 2: draw a knowledge network map of chemical reaction with ‘iron’ at its core, and write down related chemical reaction equations
Knowledge point Physical properties (K1); metal mineral (K2); metal materials (K3); the existence form of iron, metal corrosion and protection (K4); metal activity (K5); metal smelting, iron making (K6); chemical properties of metal (K7); slow oxidation (K8)
Score 0 (no knowledge point) 1 (isolated knowledge) 2 (related knowledge)
The student has not shown corresponding knowledge points in Task 1 and Task 2 The student demonstrates isolated knowledge points in Task 1 or Task 2 (there is no line in the figure to indicate the connection with other knowledge, or only a single knowledge point is expressed in words or equations) The student demonstrates related knowledge points in Task 1 or Task 2 (use lines to establish a clear relationship between multiple knowledge points in the diagram, or express the relationship between multiple knowledge points in language or equations)
Score note According to the students' answers to the two tasks, a score is given for each knowledge point, which includes 0, 1, 2. ‘0’ represents no knowledge point in the answers to the two tasks (pictures), ‘1’ represents only an isolated knowledge point in the answers to the two tasks, and ‘2’ represents related knowledge points.
Example (the complete answers of Lily, Xiao Ming and Jia are shown in Appendix 4) There is no knowledge about metal minerals, metal materials or alloys in Jia's knowledge structure from her answers to both tasks (items 3 and 6 in Table 20, Appendix 4) Xiao Ming's answer to task 2 (item 6 in Table 21, Appendix 4) includes iron reacting with acids, which shows he knows about metal activity. However, there is no reaction of iron with a salt solution, such as copper sulfate solution; therefore, his knowledge of metal activity is isolated Lily's answer to task 1 (item 3 in Table 22, Appendix 4) includes ‘(Iron) reacts with oxygen and water’ and ‘Iron oxide reacts with carbon monoxide (to prepare iron)’. In her answer to task 2 (item 6 in Table 22, Appendix 4), there is an obvious relation between iron making and slow oxidation, which shows she knows both iron making and slow oxidation and is able to relate the two


Table 4 List for scoring method of cognitive perspectives
Task 1: please describe your understanding of iron
Task 2: please describe your understanding of the reaction between iron and oxygen
Perspective Classification (Pe1); particle (Pe2); properties (Pe3); element (Pe4); energy (Pe5)
Score 0 (no perspective) 1 (isolated perspective) 2 (related perspectives)
The student has not shown the corresponding perspective in Task 1 or Task 2 The student demonstrates an isolated perspective in Task 1 or Task 2 (only a single perspective is expressed in words, equations, etc.) The student demonstrates related perspectives in Task 1 or Task 2 (the student can express the relationship between multiple perspectives in language, equations, etc.)
Score note According to the students' answers to the two tasks, a score is given for each perspective, which includes 0, 1, 2. ‘0’ represents no perspective in the answers to the two tasks, ‘1’ represents only an isolated perspective in the answers to the two tasks, and ‘2’ represents related perspectives
Example (the complete answers of Lily, Xiao Ming and Jia are shown in Appendix 4) There is no perspective of ‘energy’ in Jia's answers to both tasks (items 1 and 2 in Table 20, Appendix 4) Xiao Ming's answers to tasks 1 and 2 (items 1 and 2 in Table 21, Appendix 4) include ‘iron is a solid’, but there is no classification of the reactions of iron, which shows he has an isolated perspective of ‘classification’ In Lily's answers to tasks 1 and 2 (items 1 and 2 in Table 22, Appendix 4), she knows the chemical valence of iron and the phenomena of the two reactions of iron with oxygen. Additionally, she can balance both reaction equations correctly, which shows she has related perspectives of ‘element’


Table 5 List of scoring method for cognitive patterns
Task 1: draw a concept map with 'matter' as the core concept
Task 2: draw a concept map with 'change' as the core concept
Cognitive pattern Macro–micro (Pa1); qualitative–quantitative (Pa2); fragmented–systematic (Pa3); static–dynamic (Pa4)
Score 0 (low-level cognitive pattern) 1 (high-level cognitive pattern) 2 (related low-level and high-level cognitive pattern)
The student demonstrates only a low-level cognitive pattern in the 2 tasks The student demonstrates only a high-level cognitive pattern in the 2 tasks The student demonstrates related low-level and high-level cognitive pattern in the 2 tasks
Score note According to the students' answers to the two tasks, a score of 0, 1 or 2 is given for each cognitive pattern. ‘0’ represents only a low-level cognitive pattern (the student presents macro, qualitative, static and fragmented understandings in the answers to the 2 tasks), ‘1’ represents only a high-level cognitive pattern (the student presents micro, quantitative, dynamic and integrated understandings in the answers to the 2 tasks), and ‘2’ represents related low-level and high-level cognitive patterns (the student presents combined macro–micro, combined qualitative–quantitative, combined static–dynamic and systematic understandings in the answers to the 2 tasks)
Example (the complete answers of Lily, Xiao Ming and Jia are shown in Appendix 4) The classification criteria are confused in Jia's answers to both tasks (items 4 and 5 in Table 20, Appendix 4), which shows her understanding is ‘fragmented’ on the ‘fragmented–systematic’ cognitive pattern In Xiao Ming's answers to the 2 tasks (items 4 and 5 in Table 21, Appendix 4), there are clear classification criteria but incomplete examples, which shows he understands in an integrated (rather than fragmented) manner on the ‘fragmented–systematic’ cognitive pattern In Lily's answers to the 2 tasks (items 4 and 5 in Table 22, Appendix 4), there are clear classification criteria and correct examples, which shows she understands in a systematic manner on the ‘fragmented–systematic’ cognitive pattern


Measurement of knowledge structure. Measurement of knowledge structure is based on the students' answers to tasks 3 and 6 in Table 2. The details are shown in Fig. 3; the total score for knowledge structure is 16.
image file: d0rp00352b-f3.tif
Fig. 3 Tasks 3 and 6 for knowledge structure.

A score is given for each knowledge point. The eight knowledge points and the scoring framework (which includes 0, 1, 2) for knowledge structure are shown in Appendix 3. The scoring method for knowledge structure and three examples are shown in Table 3.

Measurement of cognitive perspectives. Measurement of cognitive perspectives is based on the students' answers to tasks 1 and 2 in Table 2. The details are shown in Fig. 4; the total score for cognitive perspective is 10.
image file: d0rp00352b-f4.tif
Fig. 4 Tasks 1 and 2 for cognitive perspective.

A score is given for each perspective. The five perspectives and the scoring framework (which includes 0, 1, 2) for cognitive perspective are described in Appendix 3. The scoring method for cognitive perspectives and three examples are shown in Table 4.

Measurement of cognitive patterns. Measurement of cognitive pattern is based on the students' answers to tasks 4 and 5 in Table 2. The details are shown in Fig. 5; the total score for cognitive pattern is 8.
image file: d0rp00352b-f5.tif
Fig. 5 Tasks 4 and 5 for cognitive pattern.

A score is given for each cognitive pattern. The four categories of cognitive pattern and the scoring framework (which includes 0, 1, 2) for cognitive pattern is in Appendix 3. The scoring method for cognitive pattern and three examples are shown in Table 5.
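Because all three rubrics score each coded element 0, 1 or 2, a student's totals can be aggregated in the same way for each factor. The following Python sketch, with invented scores for a hypothetical student, shows how the maxima of 16, 10 and 8 arise from the 8 knowledge points, 5 perspectives and 4 patterns in Tables 3–5.

```python
# Each factor is scored 0, 1 or 2 per coded element, so the maxima are
# 8 x 2 = 16 (knowledge structure), 5 x 2 = 10 (cognitive perspective)
# and 4 x 2 = 8 (cognitive pattern).

def total_score(scores: dict) -> int:
    assert all(s in (0, 1, 2) for s in scores.values()), "rubric allows only 0/1/2"
    return sum(scores.values())

# Hypothetical student: knowledge points K1-K8 (Table 3),
# perspectives Pe1-Pe5 (Table 4), patterns Pa1-Pa4 (Table 5)
knowledge = {"K1": 2, "K2": 1, "K3": 0, "K4": 2, "K5": 1, "K6": 2, "K7": 2, "K8": 0}
perspective = {"Pe1": 2, "Pe2": 1, "Pe3": 2, "Pe4": 1, "Pe5": 0}
pattern = {"Pa1": 1, "Pa2": 0, "Pa3": 2, "Pa4": 1}

print(total_score(knowledge), "/ 16")   # knowledge structure
print(total_score(perspective), "/ 10") # cognitive perspective
print(total_score(pattern), "/ 8")      # cognitive pattern
```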

Validity and reliability of the six tasks and scoring methods

The inspection of the items and scoring lists adopted the method of expert review combined with interrater-consistency inspection to ensure the validity of the tasks and scoring methods for knowledge structure, cognitive perspective and cognitive pattern. The same six experts who checked the academic performance instrument examined all three scoring methods for the six tasks measuring knowledge structure, cognitive perspective and cognitive pattern. The quality of the measurements was inspected by the Kendall rater-consistency reliability test, calculated with SPSS 26.0 statistical software. The Kendall coefficients are 0.95 for the knowledge structure assessment, 0.93 for the cognitive perspective assessment, and 0.99 for the cognitive pattern assessment. Additionally, the researchers interviewed 12 students with a specially designed think-aloud protocol about the tasks; the pilot test was then administered to 30 students, and revisions were made based on the analysis results before the field test.
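Although the Kendall coefficients above were computed in SPSS, the statistic itself is straightforward. The following Python sketch computes Kendall's coefficient of concordance W (without a tie correction) for an invented set of ratings, not the study's data.

```python
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ratings):
    """Kendall's coefficient of concordance W for an m x n matrix of
    m raters scoring n subjects (no tie correction)."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Hypothetical scores from 6 raters for 5 student responses
ratings = [[2, 1, 4, 3, 5],
           [2, 1, 4, 3, 5],
           [1, 2, 4, 3, 5],
           [2, 1, 3, 4, 5],
           [2, 1, 4, 3, 5],
           [1, 2, 4, 3, 5]]
print(round(kendalls_w(ratings), 2))
```

W ranges from 0 (no agreement) to 1 (perfect agreement), so values such as 0.93–0.99 indicate very high rater consistency.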

Data analysis

According to the theoretical framework of this study, students’ academic performance is determined by their knowledge structure, cognitive perspectives and cognitive patterns (see Fig. 1). Based on the assessment results, we used the SPSS 26.0 statistical program to conduct correlation analysis and multiple linear regression analysis to explore the relationship between performance and knowledge structure, cognitive perspective, and cognitive pattern, and so respond to the research question.

Firstly, scatterplots between the dependent variable and the independent variables were examined to ensure the relationships were linear. Secondly, correlation analysis was performed to analyze the relationships between the independent variables (knowledge structure, cognitive perspective and cognitive pattern) and the dependent variable (academic performance). Thirdly, multiple linear regression was performed using the ‘Enter’ method. Finally, the multiple linear regression was re-run to double-check the data using the ‘Stepwise’, ‘Backward’ and ‘Forward’ methods.
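The second and third steps of this pipeline can be sketched in Python. The data below are simulated (the study's raw scores are not reproduced here); the point is only to show that the SPSS ‘Enter’ method corresponds to an ordinary least-squares fit with all predictors entered at once, from which standardized coefficients follow by rescaling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scores for 60 hypothetical students on the three factors
n = 60
knowledge = rng.normal(7, 4, n)
perspective = 0.6 * knowledge + rng.normal(0, 2, n)
pattern = 0.3 * knowledge + rng.normal(0, 1.5, n)
performance = (0.1 * knowledge + 0.3 * perspective + 0.2 * pattern
               + rng.normal(50, 1, n))

X = np.column_stack([knowledge, perspective, pattern])
y = performance

# Step 2: Pearson correlations between performance and each factor
r = [np.corrcoef(x, y)[0, 1] for x in X.T]

# Step 3: 'Enter'-style multiple regression via ordinary least squares
A = np.column_stack([np.ones(n), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Standardized coefficients: b_j * sd(x_j) / sd(y)
betas = coefs[1:] * X.std(axis=0) / y.std()
print(np.round(r, 2), np.round(betas, 2))
```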

Results

In the Theoretical Framework section, we argued that, in chemistry, students’ academic performance is determined by their knowledge structure, cognitive perspectives and cognitive patterns, as shown in Fig. 1. These relationships are now discussed.

Relationship between academic performance and knowledge structure, cognitive perspective and cognitive pattern

Using the SPSS 26.0 software program, we investigated the relationships between the variables to answer the research question: How do knowledge structure, cognitive perspective and cognitive pattern affect students’ chemistry performance, and which factors have a greater impact? Firstly, we calculated the descriptive statistics for students’ scores for academic performance (we used the Rasch measure score rather than the raw score, so the total score is 55.82, transformed from the raw score of 30.00), knowledge structure (total score 16.00), cognitive perspective (total score 10.00), and cognitive patterns (total score 8.00) (see Table 6).
Table 6 Descriptive statistics for academic performance, knowledge structure, cognitive perspective and cognitive pattern
N Mean Std. deviation Variance Minimum Maximum Total
Note: students’ score of academic performance has been transformed by Rasch measurement from the raw score of 30 to the total score of 55.82 (the detailed transformation see Appendix 2, Fig. 11 and 12).
Academic performance 60 51.12 1.82 3.31 46.69 55.82 55.82
Knowledge structure 60 6.85 4.69 22.03 0.00 13.00 16.00
Cognitive perspective 60 5.58 2.90 8.38 0.00 10.00 10.00
Cognitive pattern 60 2.50 2.10 4.39 0.00 7.00 8.00


Through correlation analysis and multiple linear regression analysis, we determined the relationship between academic performance and knowledge structure, cognitive perspective, and cognitive pattern. The results of the correlation analysis show that the p-values of the t-tests of the correlation coefficients between academic performance (again using the Rasch measure score) and knowledge structure, cognitive perspective, and cognitive pattern are all less than 0.001, indicating that academic performance is related to all three factors. The statistically significant Pearson product–moment correlations between academic performance and knowledge structure, cognitive perspective, and cognitive pattern are 0.82, 0.86 and 0.74, respectively (see Table 7).

Table 7 Correlations for academic performance, knowledge structure, cognitive perspective and cognitive pattern (N = 60)
Performance Knowledge structure Cognitive perspective Cognitive pattern
a Correlation is significant at the 0.01 level (2-tailed).
Academic performance Pearson correlation 1 0.82a 0.86a 0.74a


Table 8 The relationship between academic performance and knowledge structure, cognitive perspective and cognitive pattern (N = 60)
Academic performance Unstandardized coefficient Standardized coefficient t-Value
B SE
a p < 0.05. b p < 0.001.
(Constant) 48.27 0.25 195.97b
Knowledge structure 0.10 0.47 0.27 2.20a
Cognitive perspective 0.29 0.81 0.46 3.61b
Cognitive pattern 0.21 0.76 0.24 2.71b


Table 9 Performance of the 15 tasks and the corresponding 30 items
Task Performance Item (matter) Item (change)
Recognize Remember the knowledge related to matter and change, e.g., write familiar chemical reaction equations Q2 Q16
Represent Represent/describe the properties of the types of matter or phenomena of chemical changes Q4 Q19
Exemplify Cite required examples Q3 Q18
Translate/interpret Interpret the matter or chemical reaction information from tables or figures Q5 Q20
Summarize/classify Determine the type of substance or chemical reaction Q1 Q17
Compare/contrast Compare two or more types of substance or chemical reactions Q6 Q21
Judge/decide Make judgments on types of substance or chemical reactions Q7 Q22
Explain Explain types of substance or chemical reactions Q8 Q23
Analyze Analyze certain types of substance or chemical reactions from one or more perspectives Q10 Q25
Infer Infer from one or more perspectives Q9 Q24
Hypothesize/predict Predict based on given information Q11 Q26
Design Design experiments, including experimental procedures, or selecting and effectively combining experimental apparatuses Q12 Q27
Conclude Prove/argue/find solutions; draw conclusions Q13 Q28
Reflect/evaluate Reflect/evaluate experiments effectively Q14 Q29
Produce/generalize Generalize; put forward new questions Q15 Q30


Additionally, there are statistically significant Pearson product–moment correlations between knowledge structure and cognitive perspective and between knowledge structure and cognitive pattern of 0.86 and 0.66, respectively (see Appendix 2, Table 10). There is also a statistically significant Pearson product–moment correlation between cognitive perspective and cognitive pattern of 0.70 (see Appendix 2, Table 10).

Table 10 Correlations for knowledge structure, cognitive perspective and cognitive pattern (N = 60)
Knowledge structure Cognitive perspective Cognitive pattern
a Correlation is significant at the 0.01 level (2-tailed).
Knowledge structure Pearson correlation 1 0.86a 0.66a
Cognitive perspective Pearson correlation 0.86a 1 0.70a
Cognitive pattern Pearson correlation 0.66a 0.70a 1


Before we performed the linear regression, collinearity diagnostics were conducted (see the data in Appendix 2, Table 11). In Cohen's view, a Variance Inflation Factor (VIF) larger than 10 provides evidence of collinearity. The VIF values for knowledge structure, cognitive perspective and cognitive pattern were 3.92, 4.39 and 2.02, respectively (see Appendix 2, Table 11), each lower than 10 (Cohen et al., 2003). Thus, although there is a statistically significant Pearson correlation between knowledge structure and cognitive perspective of 0.86 (>0.70), all the VIF values are less than 10, so the analysis provides no evidence of collinearity. Therefore, linear regression is a suitable method to investigate the relationship between academic performance and knowledge structure, cognitive perspective and cognitive pattern.
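The VIF diagnostic can be sketched directly from its definition: VIF_j = 1/(1 − R_j²), where R_j² comes from regressing predictor j on the remaining predictors. The Python sketch below uses invented correlated predictors standing in for the three factors, not the study's data.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X:
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining columns (with an intercept)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ coef
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Hypothetical correlated predictors standing in for the three factors
rng = np.random.default_rng(1)
a = rng.normal(size=60)
b = 0.8 * a + rng.normal(scale=0.5, size=60)
c = 0.4 * a + rng.normal(scale=0.8, size=60)
vifs = vif(np.column_stack([a, b, c]))
print(np.round(vifs, 2))
```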

Table 11 Coefficients for collinearity statistics (N = 60)
Academic performance Unstandardized coefficients Standardized coefficients t Collinearity statistics
B Std error Beta Tolerance VIF
a p < 0.05. b p < 0.001.
(Constant) 48.27 0.25 195.97b
Knowledge structure 0.10 0.05 0.27 2.20a 0.26 3.92
Cognitive perspective 0.29 0.08 0.46 3.61b 0.23 4.39
Cognitive pattern 0.21 0.08 0.24 2.71a 0.50 2.02


When fitting the linear regression, the independent variables entered into the model are knowledge structure, cognitive perspective, and cognitive pattern. The correlation coefficient R is 0.89, R2 is 0.79, and the adjusted R2 is 0.78 (see Appendix 2, Table 12). Using ANOVA, the regression yields F = 70.11 with an associated probability less than 0.001 (p < 0.001, see Appendix 2, Table 13), indicating that the regression effect is statistically significant. These values demonstrate a linear correlation between the dependent variable and the independent variables (see Table 8).

Table 12 Model summary for the relationship among academic performance, knowledge structure, cognitive perspective and cognitive pattern (N = 60)
Model R R square Adjusted R square Std error of the estimate
a Predictors: (constant), knowledge structure, cognitive perspective, cognitive pattern.
1 0.89a 0.79 0.78 0.86


Table 13 ANOVAb for the relationship among performance, knowledge, perspective and pattern (N = 60)
Academic performance Sum of squares df Mean square F
a p < 0.001. b Predictors: (constant), knowledge structure, cognitive perspective, cognitive pattern.
Regression 154.21 3.00 51.41 70.11a
Residual 41.06 56.00 0.73
Total 195.27 59.00


Table 14 The relationship between academic performance and knowledge structure and cognitive pattern (N = 60)
Academic performance Unstandardized coefficients Standardized coefficients t
B Std error Beta
a p < 0.001.
(Constant) 48.79 0.22 221.99a
Knowledge structure 0.23 0.04 0.59 6.60a
Cognitive pattern 0.30 0.08 0.35 3.89a


The companion probabilities of the t-values of the regression coefficients of knowledge structure, cognitive perspective, and cognitive pattern are all less than 0.05 (p < 0.05), indicating that the linear correlation between the three variables and academic performance is statistically significant.

From the data in Table 8, the standardized relationship is as follows:

Academic performance = 0.27 knowledge structure + 0.46 cognitive perspective + 0.24 cognitive pattern

From the results of the correlation analysis and multiple linear regression analysis, the contributions to academic performance of the discipline-specific cognitive factors were obtained as 0.27, 0.46 and 0.24. Following Cohen's method (Cohen et al., 2003), we checked the significance of the difference for each pair of factors. The t-value of the difference between the contributions of knowledge structure and cognitive pattern is 3.78 (p < 0.01), indicating that the contribution of knowledge structure to performance (0.27) is statistically significantly larger than that of cognitive pattern (0.24). The t-value of the difference between the contributions of knowledge structure and cognitive perspective is −3.47 (p < 0.01), and the t-value of the difference between the contributions of cognitive perspective and cognitive pattern is 5.27 (p < 0.01). These results indicate that the contribution of cognitive perspective (0.46) to performance is significantly larger than those of knowledge structure (0.27) and cognitive pattern (0.24). Further detail is shown in Appendix 2 (‘The significance of the contributions to academic performance of each pair of discipline-specific cognitive factors according to the data in above tables’ and Tables 15–20).
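The pairwise comparison can be sketched generically: given two regression coefficients together with their sampling variances and covariance (as reported in Appendix 2, Tables 15, 17 and 19), the t statistic for their difference follows from the standard error of a difference. The numbers below are invented for illustration and do not reproduce the paper's values.

```python
import math

def coef_diff_t(b1, b2, var1, var2, cov12):
    """t statistic for the difference between two regression coefficients,
    given their sampling variances and covariance (Cohen et al., 2003)."""
    se_diff = math.sqrt(var1 + var2 - 2.0 * cov12)
    return (b1 - b2) / se_diff

# Hypothetical coefficients, variances and covariance for illustration only
t = coef_diff_t(b1=0.45, b2=0.25, var1=0.005, var2=0.004, cov12=-0.002)
print(round(t, 2))
```

The resulting t is compared against a t distribution with n − k − 1 degrees of freedom (here 56) to judge whether one factor's contribution is significantly larger than the other's.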

Table 15 Coefficient of correlations and covariances for knowledge structure and cognitive pattern (N = 60)
Cognitive pattern Knowledge structure
Correlations Cognitive pattern 1.00 −0.658
Knowledge structure −0.658 1.00
Covariances Cognitive pattern 0.006 −0.002
Knowledge structure −0.002 0.001


Table 16 The relationship between academic performance and knowledge structure and cognitive perspective (N = 60)
Academic performance Unstandardized coefficients Standardized coefficients t
B Std error Beta
a p < 0.05. b p < 0.001.
(Constant) 48.22 0.26 186.30b
Knowledge structure 0.12 0.05 0.32 2.49a
Cognitive perspective 0.37 0.08 0.59 4.65b


Table 17 Coefficient of correlations and covariances for knowledge structure and cognitive perspective (N = 60)
Cognitive perspective Knowledge structure
Correlations Cognitive perspective 1.00 −0.86
Knowledge structure −0.86 1.00
Covariances Cognitive perspective 0.006 −0.003
Knowledge structure −0.003 0.002


Table 18 The relationship between academic performance and cognitive perspective and cognitive pattern (N = 60)
Academic performance Unstandardized coefficients Standardized coefficients t
B Std error Beta
a p < 0.05. b p < 0.001.
(Constant) 48.18 0.25 191.92b
Cognitive perspective 0.42 0.06 0.67 7.56b
Cognitive pattern 0.23 0.08 0.26 2.96a


Table 19 Coefficient of correlations and covariances for cognitive perspective and cognitive pattern (N = 60)
Cognitive pattern Cognitive perspective
Correlations Cognitive pattern 1.00 −0.70
Cognitive perspective −0.70 1.00
Covariances Cognitive pattern 0.006 −0.003
Cognitive perspective −0.003 0.003


Table 20 Jia's answers and corresponding analysis
Item Task Answer of student 3: Jia Note
1 1. Please describe your understanding of iron image file: d0rp00352b-u1.tif To understand from general properties
2 2. Please describe your understanding of the reaction between iron and oxygen image file: d0rp00352b-u2.tif To understand from the perspectives of elements in reaction and the transformation of elements involved in chemical reactions
3 3. Drawing a concept map (show Example 1) image file: d0rp00352b-u3.tif Metal activity; chemical properties of metal
(1) Please refer to Example 1 and construct a concept map with ‘iron’ as the core concept
4 (2) Please refer to Example 1 to construct a concept map with ‘matter’ as the core concept image file: d0rp00352b-u4.tif The classification criteria are not clear. Student's understanding has not formed stable cognitive patterns
5 (3) Please refer to Example 1 to construct a concept map with 'change' as the core concept image file: d0rp00352b-u5.tif The classification criteria are confused. Student's understanding has not formed stable cognitive patterns
6 4. Drawing a knowledge network map (show example 2) image file: d0rp00352b-u6.tif Metal activity; metal smelting, iron making; chemical properties of metal
Please refer to Example 2, construct a knowledge network map with ‘iron’ as the core concept, and write down related chemical reaction equations


To summarize, the academic performance of grade 9 students is statistically significantly and positively correlated with knowledge structure, cognitive perspective, and cognitive pattern (see Table 7). The contribution of cognitive perspective to academic performance is the largest in the multiple linear regression equation, at 0.46 (see Table 8), and significantly larger than the contributions of knowledge structure and cognitive pattern, indicating that cognitive perspective is the most important factor affecting students' performance in chemistry learning. The contribution of cognitive pattern in the multiple linear regression equation is 0.24 (see Table 8), indicating that cognitive pattern affects performance; while cognitive pattern is part of an interdisciplinary understanding in the scientific field, its impact on performance in a specific discipline, such as chemistry, is not as large as that of cognitive perspective. The contribution of knowledge structure in the multiple linear regression equation is 0.27 (see Table 8), indicating that knowledge structure influences students' performance in chemistry learning, but its contribution is smaller than that of cognitive perspective.

According to the above calculated relationship between academic performance and the discipline-specific cognitive factors, we have demonstrated that students’ academic performance is contributed to by knowledge structure, cognitive perspective, and cognitive pattern. Among them, the cognitive perspective is the most important factor in determining students' performance in chemistry.

Discussion of results in relation to individual student performance

The structure of knowledge in students' minds has individual characteristics related to the organization of their knowledge (Ausubel, 1968; Shavelson, 1974; Rumelhart and Norman, 1976; Champagne et al., 1981; Rumelhart, 1986; Nakiboglu, 2008), thereby affecting students' chemistry learning. The detailed results for the three students are described in Appendix 4, Tables 20–22. Lily's academic performance is the best among the three students; she possesses a systematic knowledge structure. By contrast, Jia's performance is the lowest; her knowledge integration is not well developed because much of the knowledge she has learned cannot be found in her concept map. It can be concluded that a higher achieving student possesses a systematic knowledge structure, while a lower achieving student demonstrates no, or only a few isolated, knowledge points. To improve students’ chemistry learning, it is important to integrate knowledge and thereby form a systematic knowledge structure.

The formation of cognitive perspectives stems from disciplinary core concepts (National Research Council, 2012a, 2012b, 2014, 2015; Harris et al., 2016, 2019). In chemistry learning, as described earlier, students should form five cognitive perspectives: classification, particle, properties, element, and energy; these perspectives affect students' chemistry learning. The detailed results on the cognitive perspectives of three students are described in Appendix 4, Tables 20–22. Based on the research reported here, cognitive perspective is the most important factor affecting students' chemistry learning. From the three students' answers, Lily's perspectives are the richest, whereas Jia's are the most limited. Lily pays attention to the physical and chemical properties of iron comprehensively and systematically. Her understanding of iron and of the reaction of iron and oxygen shows the specific perspectives of classification, particle, properties, element and energy, demonstrating rich perspectives on the reaction of iron and oxygen.

Xiao Ming pays attention to the physical and chemical properties of iron. He possesses several perspectives, including classification, element, and properties. Although he can describe some perspectives coherently in relation to one another, such as properties and element, his expressions do not show the perspectives of particle and energy, and are not coherent enough at the level of secondary perspectives, such as classifying familiar chemical changes. Jia pays attention to the physical and chemical properties of iron and recognizes the particles of iron. It is possible that she possesses only a limited perspective of general properties. Jia's understanding of the reaction of iron and oxygen shows only iron burning strongly in oxygen, without referring to slow oxidation. It is possible that she possesses limited perspectives on the elements in a reaction and on the transformation of elements involved in chemical reactions.

It can be concluded that a higher achieving student possesses multiple perspectives, while a lower achieving student demonstrates none or just a few perspectives. To improve students’ chemistry learning, it is important to provide students with more tasks to apply multiple perspectives and pay attention to those perspectives that have been neglected.

The formation of cognitive patterns stems from knowledge and cognitive perspectives, and has the characteristic of crosscutting scientific understanding (NRC, 2012a, 2012b, 2014, 2015; Sevian and Talanquer, 2014; Zhang, 2015; Weinrich and Talanquer, 2015). In chemistry learning, as described earlier, students should form four cognitive patterns: macro–micro, qualitative–quantitative, fragmented–systematic, and static–dynamic; these patterns affect students' chemistry learning. The detailed results on the cognitive patterns of three students are described in Appendix 4, Tables 20–22. Lily possesses combined macro–micro, combined qualitative–quantitative, static, and systematic understanding. In her concept map for matter (see item 4 in Table 22, Appendix 4), there are clear classification criteria, which are not unique (more than two hierarchies), such as pure substance–mixture and simple substance–compound. There is also understanding of the micro world, such as atoms and molecules, which she can relate to substances. In her concept map for change (see item 5 in Table 22, Appendix 4), there are clear classification criteria, such as physical change–chemical change and compound reaction–decomposition reaction–displacement reaction, etc. Xiao Ming possesses an integrated understanding. In his concept map for matter (see item 4 in Table 21, Appendix 4), there is a clear classification criterion but a wrong example. In his concept map for change (see item 5 in Table 21, Appendix 4), there is a clear classification criterion, such as physical change–chemical change, but these classification criteria and examples are incomplete. Jia demonstrates fragmented understanding. She knows some terms, but her classification criteria for matter and change are confused and include some misunderstandings (see items 4 and 5 in Table 20, Appendix 4).

The formation and application of the four cognitive patterns are the contributions of chemistry learning to scientific learning and students' cognitive development, namely, abstract thinking, system thinking, dynamic thinking and qualitative thinking. The most important and difficult cognitive pattern in junior high school is the macro–micro cognitive pattern (Johnstone, 1982, 1991) and the establishment of corresponding symbolic representations (Gilbert and Treagust, 2009). The next most important patterns are system thinking and quantitative thinking (Ministry of Education, 2011; National Research Council, 2012a, 2012b; Zhang and Wang, 2013; Zhang, 2015). Because there is no knowledge of chemical equilibrium in junior high school, dynamic thinking at this grade level is elementary and needs to be further developed in senior high school. To improve students' chemistry learning, it is important to form the four categories of cognitive pattern. The progress of cognitive patterns requires high-level tasks, such as those with terms like Design, Conclude, Reflect/Evaluate, and Produce/Generalize. These tasks need to be reflected in experimental inquiry activities and problem solving.

Conclusions and implications

In the current study, we investigated discipline-specific cognitive factors that influence students' academic performance in chemistry. In addition to the knowledge structure factor, cognitive perspectives and cognitive patterns are discipline-specific cognitive factors which affect students' academic performance in chemistry. Accordingly, a performance assessment instrument, together with tasks and scoring methods for knowledge structure, cognitive perspective and cognitive pattern, was constructed and implemented in the research. The results showed that knowledge structure, cognitive perspective, and cognitive pattern all correlated with the chemistry performance of grade 9 students, with cognitive perspective being the most important factor.

For assessment and evaluation of students' learning, attention should be paid to students' discipline-specific cognitive factors

The content of assessment and evaluation should focus on the formation of students' knowledge structures, cognitive perspectives and cognitive patterns. Researchers and teachers can help students' academic progress by relating the lesson not only to chemistry knowledge and its structure but also to cognitive perspectives and cognitive patterns. Furthermore, because cognitive factors are implicit, techniques need to be applied to evaluate the formation and application of students' knowledge structures, cognitive perspectives, and cognitive patterns. For example, a concept map for a substance and chemical change can be used to evaluate students' knowledge integration after a unit or project is completed. Forming a knowledge structure is a process that involves students comprehending and integrating knowledge and is the basis for knowledge in use (Ausubel, 1968; Novak and Gowin, 1984; Harris et al., 2016, 2019).

Teachers can use middle-level tasks, such as those requiring students to Explain, Analyze, etc., to provide opportunities for students to use their knowledge. Subjective self-reports or interviews can be used to detect the formation of perspectives, allowing students to apply knowledge in completing tasks (Harris et al., 2016, 2019). Use of high-level tasks, such as those requiring students to Design, Conclude, Reflect/Evaluate, and Produce/Generalize provides opportunities for students to engage in problem solving.

Integrating the problems to be solved into simple and complex situations is helpful for students to form cognitive patterns. Concept maps for matter and change, or think-aloud tasks, can be used to detect the formation of cognitive patterns and whether students can perform complex reasoning (Ingle and Shayer, 1971; Lawson, 1985; Chandran et al., 1987; Zhang, 2015; Won et al., 2017). In disciplinary teaching and instruction, we believe that concept maps, subjective self-reports, and interviews are practical evaluation methods for conducting research on students' learning. Furthermore, and importantly, these evaluation methods can be embedded in classroom assessment.

The design of assessment tasks should promote the progress of discipline-specific cognitive factors

The design of tasks should point to the development of cognitive factors. Existing research assessing students' academic performance often neglects discipline-specific cognitive factors. The current study considered cognitive factors closely related to disciplinary knowledge, enabling teaching to be more targeted and directed towards thinking and cognitive progress. This approach is not teaching for memorization of knowledge, nor for general conceptual change, but teaching that focuses on the development of cognitive perspectives and cognitive patterns.

Tasks can be designed according to students' cognitive development. For example, students like Jia do not have well-developed knowledge integration, so the teacher needs to provide knowledge integration tasks in the classroom to improve these students' knowledge structures and, on this basis, give simple tasks to apply knowledge, such as requiring students to Exemplify, Translate, Summarize/Classify and Compare/Contrast. For students like Xiao Ming, some perspective integration tasks should be used as homework to improve their perspectives and to avoid missing points from the instruction. For example, in the classroom, the students can demonstrate that they can use their knowledge (Harris et al., 2016, 2019) when tasks are provided, such as applying perspectives to Infer and Analyze, helping them to form a chemist's perspectives (Claesgens et al., 2009). Students can also be engaged in problem solving tasks (National Research Council, 2012a, 2012b, 2014, 2015; Wang, 2016; Zhang, 2016), such as being required to Design, Conclude, Reflect, and Evaluate in situations ranging from simple to complex. On the basis of the assessment of knowledge integration and the richness of perspectives, students like Lily should be given problem-solving tasks with situations ranging from simple to complex and unfamiliar, including real problem-solving tasks and open-ended problems without unique answers.

Tasks should have a gradient that reflects the different stages in which students' understanding develops. For example, at the stage of concept formation, tasks that help students to associate knowledge and form cognitive perspectives should be applied in teaching (Biggs and Collis, 1982; Webb, 1997; Linn, 2006; Organization for Economic Co-operation and Development, 2015). At the stage of applying a particular concept, the focus should be on tasks of Demonstration, Analysis, Prediction, and Design to develop the perspectives (Wang, 2016). At the stage of problem solving, focusing on the design of questioning and open-ended questions helps students to induct and transform their cognitive patterns and to explore unknown issues in unknown areas, leading students to apply the cognitive patterns for broader transformation through questions and activities in learning chemistry. Exploring unknown issues in unknown areas also means innovation for high achieving students (Zhang, 2016).

Limitations

This research aimed to study the discipline-specific cognitive factors of grade 9 students' academic performance in chemistry, focusing on the relationship between academic performance and knowledge structure, cognitive perspective, and cognitive pattern. This relationship had not been previously studied. The sample of this study was selected from urban grade 9 schools in economically developed areas of mainland China. The results in rural areas and economically underdeveloped areas may be different and need to be investigated in practice.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: academic performance instrument

1. Iron and its compounds are widely used in daily life.

(1) ① Iron is a(n) _ Q1 ___. (Fill in the letter number)

A. compound B. non-metal C. oxide D. metal E. element F. salt

② Ferric chloride is composed of _ Q2_____ elements.

(2) Please give 2 kinds of compounds containing iron and oxygen elements: _ Q3_. (Fill in the name or chemical formula)

(3) Please describe the physical properties of iron: ___ Q4_____.

(4) Please interpret the conditions for iron to rust: _________ Q5____________.

(5) Compare the iron content of ferriferrous oxide (Fe3O4) and ferric oxide (Fe2O3): ferriferrous oxide ___ Q6_____ ferric oxide. (Fill in ‘>’, ‘<’ or ‘=’)

(6) Is the following statement correct? ‘Both iron and copper are composed of metal atoms and free electrons’ ___ Q7_____ (Mark ‘✓’ or ‘✗’)

(7) ‘Do iron oxide and aluminium oxide contain the same amount of oxygen?’ Why? Please explain: _____________________ Q8_________________________

(8) The carbon content of pig iron is 2–4.3%, the carbon content of steel is 0.03%–2%; 10.0 g of carbon-containing iron alloy is fully burned in an oxygen stream to obtain 0.185 g of carbon dioxide. ① Please infer that the alloy is: _ Q9 __ (Fill in ‘pig iron’ or ‘steel’);

② Analyze your inference process (Write down the calculation process): Q10

2. In the cold winter, many people use ‘warm babies’ (a kind of portable body warmer, see Note 1 at the end of Appendix 1). In order to explore the composition of the ‘warm baby’ and its heating principle, the teacher found a ‘warm baby’ on the market and opened its package. The students saw that there was black powder inside. The teacher told the students that the black powder might be:

A: Iron powder, activated carbon, salt (such as sodium chloride), etc.

B: Copper oxide powder, activated carbon, salts (such as sodium chloride), etc.

The teacher and the students explored the composition of the ‘warm baby’.

[Hypothesize/Predict] The composition of ‘warm babies’ may be ___ Q11______.

[Design experiment] Please design an experimental plan: Q12

(Please choose chemicals and apparatus according to your design. Note: The teacher provided test tubes and dilute sulfuric acid as options; if you think the above two are not enough to complete your experiment, you can choose any chemicals and apparatus according to your needs.)

[Draw conclusions]

Please demonstrate the experimental process you designed: Q13

(Through experiments and analysis, it can be confirmed that the hypothesis is true.)

[Reflection and Evaluation]

Student A believes that there is no need to do any chemistry experiments to explore the composition of ‘warm babies’, and a conclusion can be drawn in half an hour.

Student B thinks it is better to use dilute sulfuric acid and iron nails to complete the experiment.

Please comment on the experimental design of the above two students and give your reasons: ___ Q14____________________________________________________

[Generalize/Produce] Iron rust is unfavorable for the protection of iron and should be removed as much as possible. However, this reaction is itself exothermic, so it is used by scientists to make ‘warm babies’;

In fact, metals and their oxides are widely used in daily life, and the antioxidant in the package of Wedomi brand cakes (see Note 2 at the end of Appendix1) is also a black powder. Based on this information, the research questions you can ask are: ________________________________ Q15_______________________________

3. Iron can react with many substances.

(1) ① Please write down the chemical equation of iron burning in oxygen: _ Q16__

② The reaction is: ____ Q17_______. (Fill in the letter number)

A. compound reaction B. decomposition reaction

C. displacement reaction D. oxidation reaction

(2) Please give 2 examples of displacement reactions in solutions involving iron:

______ Q18_________________________ (fill in the chemical equations)

(3) Put the iron nail into the copper sulfate solution,

① Please describe the phenomenon of the reaction: _________ Q19_________

② Please explain the application of metal activity sequence in this example: _ Q20__

(4) There are two chemical reactions:

a. Fe + 2HCl == FeCl2 + H2

b. 2Al + 6HCl == 2AlCl3 + 3H2

Equal masses of iron and aluminium react with enough dilute hydrochloric acid. Compare the mass of hydrogen generated by the above two reactions: the mass of hydrogen generated by reaction ‘a’ _ Q21_ the mass of hydrogen generated by reaction ‘b’ (fill in ‘>’, ‘<’ or ‘=’)

(5) Is the following statement correct? ‘We cannot distinguish the two materials of pure copper and bronze (copper-tin alloy) by a chemical method.’ ___ Q22_____ (Mark ‘✓’ or ‘✗’)

(6) How do we distinguish the two materials of pure copper and bronze (copper-tin alloy)? Why? Please explain: _____________________ Q23____________________

(7) If 3 g of impure zinc (contains impurity M) and 3 g of impure iron (contains impurity N) are fully reacted with sufficient dilute sulfuric acid to obtain 0.1 g hydrogen, ① please infer that M and N may be: __ Q24 __ (Fill in the letter number)

A. M is Fe, N is C B. M is Mg, N is Zn

C. M is Cu, N is C D. M is Cu, N is Mg

② Analyze your inference process (Write down the calculation process): Q25

4. There is a packet of black powder and a packet of red powder. The students in the chemistry experiment group worked together to study the composition of the two packages of powder. The teacher told the students that the composition of the two packages of powder may be:

Black powder: iron powder, copper oxide powder,

Red powder: copper powder, iron oxide powder.

Now the teacher and students explored the composition of the black powder.

[Hypothesize/Predict]

Hypothesis (1): The black powder may be iron powder

Hypothesis (2): The black powder may be copper oxide powder

Hypothesis (3): ___________ Q26_________________________

[Design experiment] The teacher provided the following chemicals: dilute sulfuric acid, copper sulfate solution, dilute hydrochloric acid, silver nitrate solution, etc. for selection; all equipment in the lab can be used. Please participate in the design of the experimental plan by drawing a diagram of the experimental device: Q27

[Draw conclusions]

What phenomena during the experiment would show that hypothesis 2 holds? __ Q28__

(Through experiments and analysis, it can be confirmed that the hypothesis 3 is true.)

[Reflection and Evaluation]

The role of the alcohol lamp in the experimental device is: ___ Q29__________

[Generalize/Produce]

Based on the above experiment, what do you want to study about these two packages of powder? Q30

Notes:

Note 1: ‘warm baby’

There is iron powder etc. in ‘warm babies’ (a kind of portable body warmer/warm paste). The reaction principle of the warm paste is to use a primary (galvanic) cell to accelerate the oxidation reaction and convert chemical energy into heat energy. In order to make the temperature last longer, the mineral material vermiculite is used for insulation. Because it must not react before use, the material of the bag is very special, consisting of a raw material layer, a gelatin layer and a non-woven bag.

Negative electrode: Fe − 2e⁻ = Fe²⁺

Positive electrode: O2 + 2H2O + 4e⁻ = 4OH⁻

The total reaction: 2Fe + O2 + 2H2O = 2Fe(OH)2

4Fe(OH)2 + 2H2O + O2 = 4Fe(OH)3

2Fe(OH)3 = Fe2O3 + 3H2O

The non-woven bag is made of a microporous breathable film, and it must have a conventional airtight outer bag. When in use, remove the outer bag and let the inner bag (non-woven bag) be exposed to the air, and the oxygen in the air enters the inside through the breathable membrane. In cold winter, people keep this in their coat pocket or place in their underwear for warmth.

Note 2: ‘antioxidant in the package of Wedomi brand cakes’

The antioxidant in the package of Wedomi brand cakes is Fe powder.

Appendix 2: tables and figures

a. Tables

The significance of the differences in the contributions to academic performance between each pair of discipline-specific cognitive factors was examined according to the data in Tables 9–19. Although in Table 14 the significance levels of both knowledge structure and cognitive pattern are lower than 0.001, this does not mean that the contribution rate of knowledge structure to performance is significantly larger than that of cognitive pattern. To interpret their contributions, we need to obtain the joint standard error se12 (see formula (2)).
 
se12 = √(se1² + se2² − 2cov12) (2)

From the data in Table 15, se1 is 0.001, se2 is 0.006 and cov12 is −0.002; according to formula (2), se12 is 63.54 × 10⁻³. Then the t value can be obtained according to formula (3).

 
t = (b1 − b2)/se12 (3)

From the data in Table 14, b1 is 0.59 and b2 is 0.35; according to formula (3), the t value with df = 59 is 3.78 (p < 0.01), indicating that the contribution rate of knowledge structure to performance is statistically significantly larger than that of cognitive pattern.

We used the same method to obtain the t value for comparing the contribution rates of knowledge structure and cognitive perspective (see the data in Appendix 2, Tables 16 and 17), which is −3.47 (p < 0.01), and the t value for comparing the contribution rates of cognitive perspective and cognitive pattern (see the data in Appendix 2, Tables 18 and 19), which is 5.27 (p < 0.01). These results indicate that the contribution rate of cognitive perspective to performance is significantly larger than those of knowledge structure and cognitive pattern.
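The arithmetic of formulas (2) and (3) can be sketched directly; the numbers below are those reported for the knowledge structure vs. cognitive pattern comparison in Tables 14 and 15:

```python
import math

def joint_se(se1, se2, cov12):
    # Formula (2): joint standard error of the difference between
    # two regression coefficients, se12 = sqrt(se1^2 + se2^2 - 2*cov12)
    return math.sqrt(se1**2 + se2**2 - 2 * cov12)

def t_value(b1, b2, se12):
    # Formula (3): t statistic for comparing the two coefficients
    return (b1 - b2) / se12

# Reported values: se1 = 0.001, se2 = 0.006, cov12 = -0.002
se12 = joint_se(0.001, 0.006, -0.002)   # ≈ 63.54 × 10⁻³
# Reported coefficients: b1 = 0.59 (knowledge structure),
# b2 = 0.35 (cognitive pattern); df = 59
t = t_value(0.59, 0.35, se12)           # ≈ 3.78
```

Running this reproduces the reported se12 and t = 3.78, confirming the two formulas are consistent with the figures in the text.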

b. Figures

The Rasch measurement model was developed by the Danish mathematician Georg Rasch, who proposed a probabilistic model to describe the response patterns of examinees to test items. Rasch held that when a person responds to a test item, a mathematical relationship governs the probability of the person correctly answering that particular item. According to Rasch, for any item i with a difficulty Di that can be scored as right (X = 1) or wrong (X = 0), the probability (Pni) of a person n with an ability Bn answering the item correctly can be expressed as:
 
Pni = e^(Bn − Di)/(1 + e^(Bn − Di)) (4)
Bn is the ability of person n and Di is the difficulty of item i. As can be seen from formula (4), the probability (P) of a person correctly answering an item depends solely on her/his ability and the difficulty of the item being answered.
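As a quick illustration of formula (4), the dichotomous Rasch probability can be computed directly; the ability and difficulty values below are arbitrary examples:

```python
import math

def rasch_p(ability_b, difficulty_d):
    # Formula (4): probability of a correct response (X = 1) in the
    # dichotomous Rasch model, given person ability Bn and item difficulty Di
    logit = ability_b - difficulty_d
    return math.exp(logit) / (1 + math.exp(logit))

p_equal = rasch_p(0.0, 0.0)   # ability equals difficulty -> probability 0.5
p_easy = rasch_p(2.0, 0.0)    # able person, easy item -> high probability
```

When ability equals difficulty the probability is exactly 0.5, which is why the Wright map in Fig. 7 lines persons and items up on the same logit scale.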

Winsteps 3.72.0 software for Rasch modelling was used to calibrate reliability, item difficulty and student ability estimates.

Fig. 6 shows that student (person) reliability is 0.83 and item reliability is 0.98. Both person reliability and item reliability are higher than 0.7, which meets the reliability requirements of diagnostic tests. In other words, the items have a high degree of reliability.


Fig. 6 Reliabilities for the performance instrument.

Fig. 7 shows the combined person and item estimate map. On the left hand side, we see how students' ability estimates (i.e. students' academic performance in chemistry) are distributed, and on the right hand side we see how the 30 items are distributed from the easiest (bottom) to the most difficult (top). Overall, both person ability and item difficulty estimates formed approximately normal distributions, and the distributions largely overlapped. Thus, the instrument can be considered to have adequate coverage.


Fig. 7 Wright map for the performance instrument.

Fig. 8 shows the loading scatterplot from the factor analysis of the residuals. The horizontal axis is the estimated item difficulty on the Rasch scale; the left hand vertical axis is the correlation coefficient between student item scores and another potential construct after the primary construct, i.e. students' academic performance in chemistry, is controlled; the right hand vertical axis is the frequency of items with a particular correlation coefficient and item difficulty; and the letters in the plot represent items. The factor analysis of the residuals showed that almost all 30 items (except for four items, q12, q13, q23 and q25, which correspond to B, A, a and C in Appendix 2, Fig. 8) had loadings (i.e. correlations) within the −0.4 to +0.4 range, which means that after the measured construct, i.e. students' academic performance in chemistry, is controlled, there does not seem to be another strong construct measured by the instrument.


Fig. 8 Dimensionality for the performance instrument.

The four items, however, have loadings that suggest the presence of an additional dimension. We checked the factor sensitivity ratio according to the data in Fig. 9. The value is 12.3% (the residual units divided by the Rasch variance units), which is larger than 5%. Additionally, the unexplained variance in the first contrast is 2.8, slightly larger than 2. This evidence suggests there may be another construct, but we did not investigate this further in this study.


Fig. 9 Construct 1 from principal component analysis.

A variance greater than or equal to 50% (in this study 52.8%) explained by the Rasch dimension can be regarded as evidence that the scale is unidimensional (Linacre, 2011). Scale unidimensionality can also be assumed if the second dimension (first contrast) has the strength of fewer than 3 items (in terms of eigenvalues) and the variance unexplained by the first contrast is less than 5% (Oon and Subramaniam, 2011). There are four items (q12, q13, q23, q25) with contrast loadings larger than 0.4 (see Fig. 9), but the INFIT statistics (see Fig. 10) show a good fit, so the four items were retained. Overall, the data were considered to meet the unidimensionality requirement (Peng et al., 2016), and thus also the local independence requirement, which provides evidence for the construct validity of the instrument.


Fig. 10 Item fit for the performance instrument.

Fig. 10 (columns 5 to 10) presents fit statistics for the 30 items in the instrument. Measures in the table are Rasch-estimated difficulties. SE stands for the standard error of the estimated difficulty measure; the closer SE is to 0, the better. Mnsq is a chi-square model-data fit statistic based on the difference, or residual, between the observed response patterns in the student sample and the response patterns predicted by the Rasch model, and ZSTD is the normalized Mnsq; the closer Mnsq is to 1 and ZSTD to 0, the better. INFIT is weighted, giving more weight to better-fitting responses, while OUTFIT is unweighted. Finally, P Corr refers to the partial correlation between students' scores on the item and their total test scores; the higher and more positive the correlation, the better. From Fig. 10, we see that the SEs for all items are below 1.0. Using a criterion of 0.70 to 1.3 as acceptable INFIT Mnsq, three items (q8, q6, q2) have INFIT Mnsq values beyond this range. Using a criterion of ZSTD within ±2.0 as acceptable INFIT, some items (i.e. q8, q6, q17, q28, q27, q15) could be misfitting. Except for q8, all partial correlations are positive and reasonably large. Overall, the data more or less fit the model.

Fig. 11 shows the nonlinear relationship between raw scores and Rasch measure scores for students’ academic performance.


Fig. 11 Nonlinear relationship between raw scores and Rasch measure scores for students’ academic performance.

Fig. 12 shows the conversion between raw scores (with a total score of 30) and Rasch measure scores for students' academic performance, obtained by running Winsteps 3.72 software.


Fig. 12 Transformation of raw scores and Rasch measure scores for students’ academic performance.

Appendix 3: scoring method for knowledge structure, cognitive perspective and cognitive pattern

a. Scoring method for knowledge structure

We scored the results according to the knowledge structure framework, which is a list of knowledge points from the national curriculum standards in mainland China. Based on the results of expert checking, the knowledge list included the following eight points:

1. Physical properties – refers to color, state, melting point, thermal conductivity, electrical conductivity, ductility, etc.

2. Metal mineral – refers to hematite, magnetite, etc.

3. Metal materials – refers to iron-containing materials. For example, steel and its products; pig iron, etc.

4. The existence form of iron; metal corrosion and protection – students should clearly express the different forms of iron in nature and daily life, such as: iron exists in nature in mixtures; iron exists in nature as hematite or magnetite; iron exists in daily life as rust or in objects such as a kitchen knife. Students should also clearly express that iron will rust and can be protected from corrosion by painting or other methods.

5. Metal activity – refers to the displacement reactions of metals with acids, and metals with salt solutions.

6. Metal smelting, iron making – refers to the smelting of metals, especially iron smelting from iron ore.

7. Chemical properties of metal – refers to the reactions of metals with oxygen, metals with acids, and metals with salt solutions.

8. Slow oxidation – refers to the slow oxidation of metals in the air, especially the slow oxidation of iron in the air.

The scoring method for knowledge structure is shown below and in Table 3 of the manuscript. The total score across these eight knowledge points is each student's knowledge structure assessment score. Students' answers to the two tasks were scored 0, 1 or 2 for each knowledge point. A score of ‘0’ represents no evidence of the knowledge point in the answers to the two tasks (pictures). A score of ‘1’ represents only an isolated knowledge point in the answer to task 1 or task 2. Isolated knowledge means that a student lists only one piece of secondary knowledge in the concept map, or does not relate it to other knowledge points by using lines or language. Secondary knowledge here refers to specific examples of the above eight knowledge points. For example, metal activity includes two secondary knowledge aspects: the displacement reaction of a metal with an acid, and the displacement reaction of a metal with a salt solution. If a student can list only one of these, his/her score for the knowledge point ‘metal activity’ is ‘1’. As another example, if a student lists ‘slow oxidation’ in task 1 without relating it to the other seven knowledge points, or writes the equation for the slow oxidation of iron in task 2, or draws a single arrow from Fe to Fe2O3, he/she scores ‘1’ for the knowledge point ‘slow oxidation’. A score of ‘2’ represents associated knowledge points. Related knowledge means that a student lists two or more pieces of secondary knowledge in the concept map, or relates them to other knowledge points by using lines or language. If a student can list both the reaction of iron with hydrochloric or sulfuric acid and the reaction of iron with copper sulfate solution, he/she receives the score ‘2’ for the knowledge point ‘metal activity’. Similarly, if a student lists ‘slow oxidation’ (or ‘iron reacts with oxygen and water’) in task 1 and relates it to one of the other seven knowledge points, or draws a double arrow between Fe and Fe2O3, he/she scores ‘2’ for the knowledge point ‘slow oxidation’.

The quality of the instrument was inspected with the Kendall rater-consistency reliability test, calculated using SPSS 26.0 statistical software. The six raters who participated in the scoring were all chemistry education researchers; their scoring yielded a Kendall coefficient of 0.95.
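The Kendall coefficient of concordance used here can be sketched in a few lines; this is a minimal illustration assuming no tied ranks (SPSS applies a tie correction), and the rater-by-student matrices below are hypothetical, not the study's data:

```python
import numpy as np

def kendalls_w(scores):
    """Kendall's coefficient of concordance W for an (m raters x n subjects)
    score matrix; simple form without a tie correction."""
    scores = np.asarray(scores, dtype=float)
    m, n = scores.shape
    # Convert each rater's scores to within-rater ranks (1 = lowest);
    # assumes each rater assigns distinct scores (no ties)
    ranks = scores.argsort(axis=1).argsort(axis=1) + 1
    rank_sums = ranks.sum(axis=0)
    # S: sum of squared deviations of the rank sums from their mean
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m**2 * (n**3 - n))

# Perfect agreement among three raters on four students gives W = 1.0
w_perfect = kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
# One reversed ranking drives W towards 0
w_mixed = kendalls_w([[1, 2, 3, 4], [4, 3, 2, 1], [1, 2, 3, 4]])
```

W ranges from 0 (no agreement) to 1 (complete agreement), so the reported value of 0.95 indicates very strong rater consistency.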

b. Scoring method for cognitive perspective

Students’ cognitive perspective assessment results were scored according to the cognitive perspective framework, which is based on analysis of texts including university chemistry textbooks, chemistry curriculum standards in China and abroad, and international large-scale assessments (PISA, TIMSS, and NAEP), together with student surveys, teacher interviews and expert checking (Zhang and Wang, 2013). The cognitive perspective framework includes the following five perspectives: classification, particle, properties, element and energy.

1. Classification (Pe1): to understand a substance from the perspective of classification (Pe1-1); to understand from the perspective of substances involved in the reaction, such as classifying familiar chemical changes (Pe1-2).

2. Particle (Pe2): to understand from the perspective of particles-qualitative. For example, matter is composed of tiny particles in motion, including molecules as combinations of atoms (e.g., H2O, O2, CO2), and atoms are composed of subatomic particles (electrons surrounding a nucleus containing protons and neutrons). (Pe2-1)

To understand from the perspective of particles-quantitative. For example, atoms form molecules that range in size from two to thousands of atoms, and each atom has its own characteristic mass. (Pe2-2)

To understand from the perspective of the types of particles involved in chemical reactions. (Pe2-3)

To understand from the perspective of the number of particles involved in chemical reactions. (Pe2-4)

To understand from the perspective of the arrangement of particles. (Pe2-5)

3. Properties (Pe3): to understand from general properties (Pe3-1). For example, compare the properties of common acids and bases: acids have a sour taste and react with metals; bases usually have a bitter taste and a slippery feel; strong acids and bases are corrosive; both acids and bases dissolve in water and react with indicators to produce different colour changes; and acids and bases neutralize each other. Taking iron as an example, students should recognize the general properties of metals, such as thermal conductivity, electrical conductivity, and ductility.

To understand from specific properties, such as combustion, stability or activity, and whether a substance is able to react with oxygen, and so on. (Pe3-2) As an example, students should discuss the properties of iron, including colour, state, density, hardness, melting point, etc.

4. Element (Pe4): to understand from the perspective of element qualitatively. (Pe4-1) For example, students can write the symbol for iron or explain that iron is composed of the element iron.

To understand from the perspective of element quantitatively. (Pe4-2) For example, students can calculate the content of an element in a substance based on chemical formulas, such as stating the iron content of Fe2O3.

To understand from the perspectives of elements in reaction and the transformation of elements involved in chemical reactions qualitatively. (Pe4-3) For example, students can describe the transformation of elements before and after a chemical reaction.

To understand from the perspectives of elements in reaction and the transformation of elements involved in chemical reactions quantitatively. (Pe4-4) For example, students can calculate the mass of a product based on chemical equations; or balance unfamiliar reaction equations.

5. Energy (Pe5): to understand from the perspective of the types of energy involved in reactions. For example, recognizing that some chemical reactions release energy (e.g., heat, light), while others absorb it (Pe5).

Finally, the total score across the five perspectives is each student's cognitive perspective score. The scoring method for cognitive perspective is shown below and in Table 4 of the manuscript.

Students' answers to the two tasks were scored 0, 1 or 2 for each cognitive perspective.

The score ‘0’ indicates that the perspective does not appear in the answers to the two tasks.

The score ‘1’ indicates only an isolated perspective in the answer to task 1 or task 2. An isolated perspective means that the student lists only one secondary perspective in the two tasks, or does not associate it with other perspectives using language, chemical formulas, chemical equations, figures, etc. A secondary perspective refers to a specific example of the five perspectives above. For example, classification includes two secondary perspectives: ‘to understand a substance from the perspective of classification’ and ‘to understand from the perspective of substances involved in the reaction, such as classifying familiar chemical changes’. If a student can only classify iron as a metal or a solid, his/her score for the perspective ‘classification’ is ‘1’. As another example, if a student knows that the reaction of iron with oxygen will ‘release energy’ in task 2, he/she scores ‘1’ for the perspective ‘energy’.

The score ‘2’ indicates related perspectives. Related perspectives mean that the student lists two or more secondary perspectives in the two tasks, or relates them to other perspectives using language, symbols or figures. If a student can classify iron as a metal or a solid, and can also classify the reaction of iron with oxygen as a combination reaction or an oxidation reaction, he/she receives the score ‘2’ for the perspective ‘classification’. As another example, if a student knows that the reaction of iron with oxygen ‘releases energy’ in task 2 and can also relate the two reactions of iron burning in oxygen and the slow oxidation of iron, for example by stating that ‘both reactions are exothermic’, he/she scores ‘2’ for the perspective ‘energy’.

The quality of the instrument was inspected by the Kendall rater-consistency reliability test, calculated with SPSS 26.0 statistical software. The six raters, all chemistry education researchers, participated in the scoring; the Kendall coefficient was 0.93.

c. Scoring method for cognitive pattern

Students' cognitive pattern assessment results were scored according to the cognitive pattern framework, which comprised the four cognitive patterns defined in this study: macro–micro, qualitative–quantitative, fragmented–systematic, and static–dynamic.

1. Macro–micro (Pa1) – students are expected to understand concepts from both the macro and micro worlds.

2. Qualitative–quantitative (Pa2) – students are expected to analyze both qualitatively and quantitatively.

3. Fragmented–systematic (Pa3) – students are expected to understand concepts both systematically and in an integrated manner.

4. Static–dynamic (Pa4) – students are expected to understand concepts dynamically. For example, students not only see the phenomena of change, but also pay attention to the dynamic processes of chemical reactions.

The scoring method for cognitive pattern is shown below and in Table 5 of the manuscript. The total score across the four categories of cognitive pattern is each student's cognitive pattern score.

Students' answers to the two tasks were scored 0, 1 or 2 for each cognitive pattern.

The score ‘0’ indicates only a low-level cognitive pattern in the answers to the two tasks. Low-level categories of cognitive pattern refer to macro, qualitative, fragmented and static understandings, whereas micro, quantitative, systematic and dynamic understandings are high-level categories. For ‘macro–micro’, if a student shows no expression of micro understanding in the two tasks, he/she scores ‘0’ for ‘macro–micro’; the same applies to ‘qualitative–quantitative’ and ‘static–dynamic’. If a student shows inconsistent classification criteria for matter or change, or gives only a few examples, he/she scores ‘0’ for ‘fragmented–systematic’.

The score ‘1’ indicates only a high-level cognitive pattern. For ‘macro–micro’, if a student shows micro understanding in the two tasks, he/she scores ‘1’ for ‘macro–micro’; the same applies to ‘qualitative–quantitative’ and ‘static–dynamic’. If a student shows integrated understanding, he/she scores ‘1’ for ‘fragmented–systematic’. For example,

Task 1: in the concept map for matter, there is a clear classification criterion, such as pure substance–mixture, etc.; or there are classification criteria and examples, but they are incomplete.

Task 2: in the concept map for change, there is a clear classification criterion, such as physical change–chemical change, etc.; or there are classification criteria and examples, but they are incomplete.

The score ‘2’ indicates related low-level and high-level cognitive patterns. For ‘macro–micro’, if a student shows related macro and micro understandings in the two tasks, he/she scores ‘2’ for ‘macro–micro’; the same applies to ‘qualitative–quantitative’ and ‘static–dynamic’. If a student shows systematic understanding, he/she scores ‘2’ for ‘fragmented–systematic’. For example,

Task 1: in the concept map for matter, there are clear classification criteria and examples, and the criteria are not unique (i.e. more than two hierarchies), such as pure substance–mixture and simple substance–compound;

Task 2: in the concept map for change, there are clear classification criteria and examples, and the criteria are not unique (i.e. more than two hierarchies), such as physical change–chemical change and compound reaction–decomposition reaction–displacement reaction, etc.

The quality of the instrument was inspected by the Kendall rater-consistency reliability test, calculated with SPSS 26.0 statistical software. The six raters, all chemistry education researchers, participated in the scoring; the Kendall coefficient was 0.99.

Appendix 4: three students’ answers to the test for knowledge structure, cognitive perspective and cognitive pattern

Student 1: Jia

Knowledge structure (see item 3 and item 6 in Table 20): Jia pays attention to the chemical properties of iron only. She possesses limited knowledge points, understanding metal activity and the chemical properties of iron (see item 3 in Table 20) as well as metal smelting and iron making (see item 6 in Table 20).

Cognitive perspective (see item 1 and item 2 in Table 20): Jia pays attention to the physical and chemical properties of iron and recognizes the particles of iron. It is possible that she possesses only a limited perspective: general properties. This student understands that iron is a solid. She knows that iron can react with oxygen and knows the related phenomenon (see item 1 in Table 20).

Jia's understanding of the reaction of iron and oxygen shows only the reaction of iron burning strongly in oxygen, without slow oxidation. It is possible that she possesses only limited perspectives: the perspectives of elements in reaction and the transformation of elements involved in chemical reactions. This student can write the reaction equation (see item 2 in Table 20).

Cognitive pattern (see items 4 and 5 in Table 20): For cognitive pattern, the typical characteristic of Jia is fragmented understanding. She knows some terms, but her classification criteria for matter and change are confused (see item 4 and item 5 in Table 20).

Jia's understanding of iron and its reactions is mainly limited to fragmented memorization of specific knowledge without real understanding. This student does not possess an integrated knowledge structure, has some misunderstandings (see Table 20), and has confused the classification criteria for matter and change.

Student 2: Xiao Ming

Knowledge structure (see item 3 and item 6 in Table 21 ): Xiao Ming pays attention to the physical and chemical properties of iron comprehensively. He possesses some knowledge points and understands the physical properties of iron, metal activity, and slow oxidation.

Cognitive perspective (see item 1 and item 2 in Table 21): Xiao Ming pays attention to the physical and chemical properties of iron. He possesses several perspectives, including classification, element, and properties. This student understands that iron is a solid and knows its colour and state. He knows that iron can react with oxygen and acid. Although he can describe some perspectives coherently, such as properties and element, his expressions do not show the perspectives of particle and energy, and are not coherent enough at the level of secondary perspectives, such as classifying familiar chemical changes (see item 1 in Table 21).

Table 21 Xiao Ming's answers and corresponding analysis
Item Task Answer of student 2: Xiao Ming Note
1 1. Please describe your understanding of iron image file: d0rp00352b-u7.tif To understand a substance from the perspective of classification; to understand from the perspective of properties; to understand from the perspective of element qualitatively
2 2. Please describe your understanding of the reaction between iron and oxygen image file: d0rp00352b-u8.tif To understand from the perspectives of elements in a reaction and the transformation of elements involved in chemical reactions qualitatively
3 3. Drawing a concept map (show Example 1) image file: d0rp00352b-u9.tif The student possibly knows about metal activity and properties of iron (Conclusions would be drawn combined with the answers in item 6)
(1) Please refer to Example 1 and construct a concept map with ‘iron’ as the core concept
4 (2) Please refer to Example 1 to construct a concept map with ‘matter’ as the core concept. image file: d0rp00352b-u10.tif To understand from the macro world for matter; to analyse qualitatively; to understand in an integrated manner
5 (3) Please refer to Example 1 to construct a concept map with ‘change’ as the core concept. image file: d0rp00352b-u11.tif To understand in an integrated manner for change; to understand statically.
6 4. Drawing a knowledge network map (show example 2) image file: d0rp00352b-u12.tif Metal activity (the reaction with acid); chemical properties of metal; slow oxidation
Please refer to Example 2, construct a knowledge network map with ‘iron’ as the core concept, and write down related chemical reaction equations


Xiao Ming's understanding of the reaction of iron and oxygen shows both the reaction of iron burning strongly in oxygen and slow oxidation. This student can write the reaction equation and balance it correctly (see item 2 in Table 21).

Cognitive pattern (see items 4 and 5 in Table 21): For cognitive pattern, the typical characteristic of Xiao Ming is integrated understanding. In his concept map for matter (see item 4 in Table 21), there is a clear classification criterion, but a wrong example. In his concept map for change (see item 5 in Table 21), there is a clear classification criterion, such as physical change–chemical change; there are classification criteria and examples, but these are incomplete.

Student 3: Lily

Knowledge structure (see item 3 and item 6 in Table 22): Lily pays attention to the physical and chemical properties of iron comprehensively and systematically. She possesses rich knowledge points and systematically understands the physical properties of iron, metal material, metal activity, iron making, the chemical properties of iron and slow oxidation (see item 3 and item 6 in Table 22).

Cognitive perspective (see item 1 and item 2 in Table 22): Lily pays attention to the physical and chemical properties of iron comprehensively and systematically. She possesses rich cognitive perspectives, including classification, particle nature, substance properties, and elements. This student systematically understands the physical properties of iron, such as colour and state; focusing on stability and activity among chemical properties, she is able to write the reaction between iron and oxygen, as well as other important chemical reactions, comprehensively (see item 1 in Table 22). The microstructural characteristics of iron are evident in her responses.

Table 22 Lily's answers and corresponding analysis
Item Task Answer of student 1: Lily Note
1 1. Please describe your understanding of iron image file: d0rp00352b-u13.tif To understand a substance from the perspective of classification; to understand from the perspective of particles-qualitative and particles-quantitative; to understand from general properties and specific properties; to understand from the perspective of element qualitatively
2 2. Please describe your understanding of the reaction between iron and oxygen image file: d0rp00352b-u14.tif To understand from the perspective of substances involved in the reaction; to understand from the perspectives of elements in reaction and the transformation of elements involved in chemical reactions; to understand from the perspective of the types of energy involved in reactions
Note: ‘warm baby’ is a kind of portable body warmer, see Note 1 at the end of Appendix 1
3 3. Drawing a concept map (show Example 1) image file: d0rp00352b-u15.tif Physical properties; metal material; metal activity; metal smelting, iron making; chemical properties of metal
(1) Please refer to Example 1 and construct a concept map with ‘iron’ as the core concept
4 (2) Please refer to Example 1 to construct a concept map with ‘matter’ as the core concept image file: d0rp00352b-u16.tif To understand matter from combined macro–micro understanding, combined qualitative–quantitative understanding, and systematically
5 (3) Please refer to Example 1 to construct a concept map with 'change' as the core concept image file: d0rp00352b-u17.tif To understand change systematically
6 4. Drawing a knowledge network map (show example 2) image file: d0rp00352b-u18.tif Metal mineral, metal activity; metal smelting, iron making; chemical properties of metal; slow oxidation
Please refer to Example 2, construct a knowledge network map with 'iron' as the core concept, and write down related chemical reaction equations


Lily's understanding of the reaction of iron and oxygen shows the specific perspectives of classification, properties, element and energy. This student distinguishes reactions under different conditions from the perspective of products and phenomena. Her understanding focuses on important perspectives, such as energy and substance type, and on other aspects of chemical change, including the application of the reaction. Lily possesses rich perspectives on the reaction of iron and oxygen (see item 2 in Table 22).

Cognitive pattern (see items 4 and 5 in Table 22): For cognitive pattern, the typical characteristics of Lily are combined macro–micro understanding, combined qualitative–quantitative understanding and the systematic nature of her cognition. In her concept map for matter (see item 4 in Table 22), there are clear classification criteria, which are not unique (more than two hierarchies), such as pure substance–mixture and simple substance–compound. There is understanding from the micro world, such as atoms and molecules, and the student can relate them to substances. In her concept map for change (see item 5 in Table 22), there are clear classification criteria, such as physical change–chemical change and compound reaction–decomposition reaction–displacement reaction, etc. From the student's overall answers to the matter and change tasks in items 4 and 5, being systematic is the most typical feature of Lily.

Acknowledgements

This work was supported by the International Joint Research Project of Faculty of Education, Beijing Normal University. We appreciate the constructive recommendations made by Professor Lewis and the anonymous reviewers on earlier versions of this manuscript as well as the formatting of the manuscript by the technical editors.

References

  1. Abell S. K. and Lederman N. G., (2007), Handbook of research on science education, Lawrence Erlbaum Associates.
  2. Adbo K. and Taber K. S., (2014), Developing an understanding of chemistry: A case study of one Swedish student's rich conceptualisation for making sense of upper secondary school chemistry, Int. J. Sci. Educ., 36, 1107–1136.
  3. Adrian S. L. L. and Subramaniam R., (2018), Mapping the knowledge structure exhibited by a cohort of students based on their understanding of how a galvanic cell produces energy, J. Res. Sci. Teach., 55(6), 777–809.
  4. Atabek-Yigit E., (2018), Can cognitive structure outcomes reveal cognitive styles? A study on the relationship between cognitive styles and cognitive structure outcomes on the subject of chemical kinetics, Chem. Educ. Res. Pract., 19(3), 746–754.
  5. Anderson L. W. and Krathwohl D. R., (2001), A taxonomy for learning, teaching, and assessing: a revision of Bloom's taxonomy of educational objectives, New York: Longman.
  6. Ausubel D., (1968), Educational psychology: A cognitive view, New York: Holt, Rinehart & Winston.
  7. Biggs J. B. and Collis K. F., (1982), Evaluating the quality of learning—The SOLO taxonomy, New York: Academic Press.
  8. Bloom B. S., Engelhart M. D., Furst E. J., Hill W. H. and Krathwohl D., (1956), Taxonomy of educational objectives: The cognitive domain, New York: McKay.
  9. Bond T. G. and Fox C. M., (2007), Applying the Rasch model: Fundamental measurement in the human sciences, 2nd edn, London: Lawrence Erlbaum.
  10. Brandstädter K., Harms U. and Großschedl J., (2012), Assessing system thinking through different concept-mapping practices, Int. J. Sci. Educ., 34, 2147–2170.
  11. Champagne A. B., Klopfer L. E., Desena A. T. and Squires D. A., (1981), Structural representations of students' knowledge before and after science instruction, J. Res. Sci. Teach., 18, 97–111.
  12. Chandran S., Treagust D. F. and Tobin K., (1987), The role of cognitive factors in chemistry achievement, J. Res. Sci. Teach., 24(2), 145–160.
  13. Chandrasegaran A. L., Treagust D. F. and Mocerino M., (2009), Emphasizing multiple levels of representation to enhance students’ understanding of the changes that occur during chemical reactions, J. Chem. Educ., 86(12), 1433–1436.
  14. Chiu M.-H. and Lin J.-W., (2019), Modeling competence in science education, Disc. Interdisc. Sci. Educ. Res., 1, 12,  DOI:10.1186/s43031-019-0012-y.
  15. Claesgens J., Scalise K., Wilson M. and Stacy A., (2009), Mapping student understanding in chemistry: The perspectives of chemists, Sci. Educ, 93(1), 56–85.
  16. Cohen J., Cohen P., West S. G. and Alken L. S., (2003), Applied multiple regression/correlation analysis for the behavioral sciences, 3rd edn, London: Lawrence Erlbaum.
  17. Coll R. K. and Lajium D., (2011), Modeling and the future of science learning, in Khine M. S. and Saleh I. M. (ed.), Models and modeling: cognitive tools for scientific enquiry, Dordrecht: Springer, pp. 3–22.
  18. Coll R. and Treagust D. F., (2003a), Investigation of secondary school, undergraduate and graduate learners’ mental models of ionic bonding, J. Res. Sci. Teach., 40(5), 464–486.
  19. Coll R. K. and Treagust D. F., (2003b), Learners' mental models of metallic bonding: A cross-age study, Sci. Educ., 87(5), 685–707.
  20. Danili E. and Reid N., (2006), Cognitive factors that can potentially affect pupils’ test performance, Chem. Educ. Res. Pract., 7(2), 64–83.
  21. Dennick R., (2016), Constructivism: Reflections of twenty-five years teaching the constructivist approach in medical education, Int. J. Med. Educ., 7, 200–205.
  22. Derman A. and Eilks I., (2016), Using a word association test for the assessment of high school students' cognitive structures on dissolution, Chem. Educ. Res. Pract., 17(4), 902–913.
  23. Ding T. Z., (2003), University Chemistry Course – Principles, Applications, Frontiers, Beijing: Higher Education Press (in Chinese).
  24. Feng Z. L., (1998), Structured and oriented teaching psychology principle, Beijing: Beijing Normal University Press (in Chinese).
  25. Feng Z. L. and Feng J., (2011), New teaching-structured and oriented teaching psychology principle, Beijing: Beijing Normal University Press (in Chinese).
  26. Fu X. C., (1999), University Chemistry, Beijing: Higher Education Press (in Chinese).
  27. Gilbert J. K. and Treagust D. F., (2009), Introduction: Macro, submicro and symbolic representations and the relationship between them: Key models in chemical education, in Gilbert J. K. and Treagust D. F. (ed.), Multiple representations in chemical education, Springer, pp. 1–8.
  28. Harris C. J., Krajcik J. S., Pellegrino J. W. and McElhaney K. W., (2016), Constructing assessment tasks that blend disciplinary core: Ideas, crosscutting concepts, and science practices for classroom formative applications, Menlo Park, CA: SRI International.
  29. Harris C. J., Krajcik J. S., Pellegrino J. W. and Debarger A. H., (2019), Designing knowledge-in-use assessments to promote deeper learning, Educ. Meas. Iss. Pract., 38 (2), 53–67.
  30. Harrison A. G. and Treagust D. F., (2000), Learning about atoms, molecules, and chemical bonds: A case study of multiple-model use in grade 11 chemistry, Sci. Educ., 84(3), 352–381.
  31. Hsieh P. H., Lin C. H., Tseng M. J., Jong J. P. and Chiu M. H., (2014), Case studies on investigating the 9th graders’ conceptions of particles: Taking diffusion as an example, Sci. Educ., 35(4), 2–23 (in Chinese).
  32. Hung C. H. and Lin C. Y., (2015), Using concept mapping to evaluate knowledge structure in problem-based learning, BMC Med. Educ., 15(1), 212.
  33. Ingle R. B. and Shayer M., (1971), Conceptual demands in nuffield O-level chemistry, Educ. Chem., 8(1), 182–183.
  34. International Association for the Evaluation of Educational Achievement (IEA), (2011), TIMSS 2011 assessing frameworks, Chestnut Hill, MA, USA: TIMSS & PIRLS International Study Center and International Association for the Evaluation of Educational Achievement (IEA).
  35. International Association for the Evaluation of Educational Achievement (IEA), (2019), TIMSS 2019 assessing frameworks, Chestnut Hill, MA, USA: TIMSS & PIRLS International Study Center and International Association for the Evaluation of Educational Achievement (IEA).
  36. Jarkko J. and Maija A., (2018), The challenges of learning and teaching chemical bonding at different school levels using electrostatic interactions instead of the octet rule as a teaching model, Chem. Educ. Res. Pract., 19(10), 932–953.
  37. Johnstone A. H., (1982), Macro and microchemistry, Sch. Sci. Rev., 64(1), 377–379.
  38. Johnstone A. H., (1991), Why is science difficult to learn? Things are seldom what they seem, J. Comput. Assist. Learn., 7(2), 75–83.
  39. Johnstone A. H., (1993), The development of chemistry teaching: A changing response to changing demand, J. Chem. Educ., 70(9), 701–705.
  40. Johnstone A. H., (2000), Chemical education research: Where from here? Univ. Chem. Educ., 4(1), 34–38.
  41. Jonassen D. H., Beissner K. and Yacci M., (1993), Structural knowledge: techniques for representing, conveying, and acquiring structural knowledge, Lawrence Erlbaum Associates.
  42. Lawson A. E., (1983), Predicting science achievement: The role of developmental level, disembedding ability, mental capacity, prior knowledge and belief, J. Res. Sci. Teach., 20(2), 141–162.
  43. Lawson A. E., (1985), A review of research on formal reasoning and science teaching, J. Res. Sci. Teach., 22(7), 569–617.
  44. Linacre J. M., (2011), A user's guide to Winsteps ministep Rasch-model computer programs, version 3.72.0, Chicago, IL: Winsteps.com.
  45. Linn M., (2006), The knowledge integration perspective on learning and instruction, in Swyer R. K. (ed.), The Cambridge handbook of the learning sciences, New York: Cambridge University Press.
  46. Liu X., (2010), Using and developing measurement instruments in science education: A Rasch modeling approach: Science & engineering education sources, Charlotte, NC: Information Age Publishing, Inc.
  47. Lopez E. J., Shavelson R. J., Nandagopal K., Szu E. and Penn J., (2014), Ethnically diverse students' knowledge structures in first-semester organic chemistry, J. Res. Sci. Teach., 51(6), 741–758.
  48. Margel H., Eylon B.-S. and Scherz Z., (2006), From textiles to molecules—teaching about fibers to integrate students' macro- and microscale knowledge of materials, J. Chem. Educ., 83(10), 1552.
  49. Margel H., Eylon B. S. and Scherz Z., (2008), A longitudinal study of junior high school students’ conceptions of the structure of materials, J. Res. Sci. Teach., 45(1), 132–152.
  50. Ministry of Education (MOE) of PRC, (2011), Full-time compulsory education chemistry curriculum standard (Trial), Beijing: Beijing Normal University Press (in Chinese).
  51. Nakiboglu C., (2008), Using word associations for non-major science students' knowledge structure before and after general chemistry instruction: The case of atomic structure, Chem. Educ. Res. Pract., 9, 309–322.
  52. National Research Council (NRC) of US, (2012a), A Framework for K-12 science education: Practices, crosscutting concepts and core ideas, Washington, DC: National Research Council.
  53. National Research Council (NRC) of US, (2012b), Discipline-based education research: Understanding and improving learning in undergraduate science and engineering, Washington, DC: National Research Council.
  54. National Research Council (NRC) of US, (2014), Developing assessments for the next generation science standards, Washington, DC: National Research Council.
  55. National Research Council (NRC) of US, (2015), Guide to implementing the next generation science standard, Washington, DC: National Research Council.
  56. Nicoll G., (2001), A report of undergraduates' bonding misconceptions, Int. J. Sci. Educ., 23, 707–730.
  57. Nimmermark A., Öhrström L., Mårtensson J. and Davidowitz B., (2016), Teaching of chemical bonding: A study of Swedish and South African students' conceptions of bonding, Chem. Educ. Res. Pract., 17(4), 985–1055.
  58. Novak J. D. and Gowin B. D., (1984), Learning how to learn, NY: Cambridge University Press.
  59. Oon P. T. and Subramaniam R., (2011), Rasch modeling of a scale that explores the take-up of physics among school students from the perspective of teachers, in Applications of Rasch measurement in learning environments research, Sense Publishers, pp. 119–139.
  60. Organization for Economic Co-operation and Development (OECD), (2006), Assessing scientific, reading and mathematical literacy, a framework for PISA 2006, Paris, France: Organization for Economic Co-operation and Development.
  61. Organization for Economic Co-operation and Development (OECD), (2015), PISA2015 Draft science framework, Paris, France: Organization for Economic Co-operation and Development.
  62. Park J., Chang J., Tang K.-S., Treagust D. F. and Won M., (2020), Sequential patterns of students’ drawing in constructing scientific explanations: Focusing on the interplay among three levels of pictorial representation, Int. J. Sci. Educ., 42(5), 677–702.
  63. Peng H., Liu X., Zheng C. and Jia M., (2016), Using Rasch measurement to validate an instrument for measuring the quality of classroom teaching in secondary chemistry lessons, Chem. Educ. Res. Pract., 2, 123–158.
  64. People's Education Press (PEP) of PRC, (2012), Full-time compulsory education textbook: Chemistry (Grade 9), Beijing: People's Education Press (in Chinese).
  65. Piaget J., (1983), Piaget's theory, in Mussen P. (ed.), Handbook of child psychology (Vol. 1, 4th ed.), New York: Wiley.
  66. Potvin P., Sauriol E. and Riopel M., (2015), Experimental evidence of the superiority of the prevalence model of conceptual change over the classical models and repetition, J. Res. Sci. Teach., 52(8), 1082–1108.
  67. Rumelhart D. E. and Norman D. A., (1976), Accretion, tuning and restructuring: Three modes of learning, http://files.eric.ed.gov/fulltext/ED134902.pdf.
  68. Rumelhart D. E., (1977), Toward an interactive model of reading, in Domic S. (ed.), Attention and performance, VI, Hillsdale, NJ: Lawrence Erlbaum Associates.
  69. Rumelhart D. E., (1986), Parallel distributed processing: Explorations in the microstructure of cognition, Cambridge Mass: MIT Press.
  70. Rumelhart D. E. and Ortony A., (1977), The representation of knowledge in memory, in Anderson R. C., Spiro R. J. and Montague W. E. (ed.), Schooling and the acquisition of knowledge, Hillsdale, NJ: Lawrence Erlbaum Associates.
  71. Sevian H. and Talanquer V., (2014), Rethinking chemistry: A learning progression on chemical thinking, Chem. Educ. Res. Pract., 15(1), 10–23.
  72. Shavelson R. J., (1974), Methods for examining representations of a subject-matter structure in student memory, J. Res. Sci. Teach., 11, 231–249.
  73. Shavelson R. J., Ruiz-Primo M. A. and Wiley E. W., (2005), Windows into the mind, High. Educ., 49(4), 413–430.
  74. Stevens S. Y., Shin N., Delgado C. and Krajcik J., (2010), Developing a hypothetical multi-dimensional learning progression for the nature of matter, J. Res. Sci. Teach., 47(6), 687–715.
  75. Taber K. S., (1997), Student understanding of ionic bonding: Molecular versus electrostatic framework, Sch. Sci. Rev., 78, 85–95.
  76. Taber K. S., (1998), An alternative conceptual framework from chemistry education, Int. J. Sci. Educ., 20, 597–608.
  77. Taber K. S., (2001), Building the structural concepts of chemistry: Some considerations from educational research, Chem. Educ. Res. Pract., 2, 123–158.
  78. Taber K. S., Tsaparlis G. and Nakiboglu C., (2012), Student conceptions of ionic bonding: Patterns of thinking across three European contexts, Int. J. Sci. Educ., 34, 2843–2873.
  79. Talanquer V., (2018), Progressions in reasoning about structure–property relationships, Chem. Educ. Res. Pract., 19(4), 998–1009.
  80. Vosniadou S. and Brewer W. F., (1992), Mental models of the earth: A study of conceptual change in childhood, Cognit. Psychol., 24(4), 535–585.
  81. Vosniadou S. and Brewer W. F., (1994), Mental models of the day/night cycle, Cognit. Sci., 18(1), 123–183.
  82. Vosniadou S., (1994), Capturing and modeling the process of conceptual change, Learn. Instruct., 4(1), 45–69.
  83. Wang L., (2016), Exploring performance and intrinsic composition of disciplinary competence, based on the multi-integrative model of ‘learning-applying-innovating’, Educ. Res., 39(9), 83–92 (in Chinese).
  84. Webb N. L., (1997), Criteria for alignment of expectations and assessments in mathematics and science education, Washington, DC: Council of Chief State School Officers and National Institute for Science Education Research.
  85. Weinrich M. L. and Talanquer V., (2015), Mapping students' conceptual modes when thinking about chemical reactions used to make a desired product, Chem. Educ. Res. Pract., 16(3), 561–577.
  86. Wilson M., (2005), Constructing measures: An item response modeling approach, Mahwah, NJ: Lawrence Erlbaum Associates.
  87. Won M., Krabbe H., Ley S. L., Treagust D. F. and Fischer H. E., (2017), Science teachers’ use of a concept map marking guide as a formative assessment tool for the concept of energy, Educ. Assess., 22(2), 95–110.
  88. Ye L., Oueini R. and Lewis S. E., (2015), Developing and implementing an assessment technique to measure linked concepts, J. Chem. Educ., 92(11), 1807–1812.
  89. Zhang L. N., (2015), Study on the process of chemistry learning in middle school – theory, technology and case, Beijing: Beijing Normal University Press (in Chinese).
  90. Zhang L. N., (2016), Implications of the PISA 2015 scientific literacy assessment for Chinese science teaching and assessment, Glob. Educ., 45(3), 15–24 (in Chinese).
  91. Zhang L. N. and Wang L., (2013), A study on the chemistry cognitive development in the study of core concept in junior high school students, Beijing: Beijing Normal University (in Chinese).
  92. Zhou Q., Wang T. T. and Zheng Q., (2015), Probing high school students' cognitive structures and key areas of learning difficulties on ethanoic acid using the flow map method, Chem. Educ. Res. Pract., 16, 589–602.

This journal is © The Royal Society of Chemistry 2021