Development of a measurement instrument to assess students' electrolyte conceptual understanding

Shanshan Lu and Hualin Bi *
College of Chemistry, Engineering and Materials Science of Shandong Normal University, No. 88 of East Culture Road, Ji'nan City, Shandong Province, China. E-mail: bihualin@sdnu.edu.cn

Received 13th June 2016, Accepted 27th July 2016

First published on 2nd September 2016


Abstract

To assess students' conceptual understanding levels and diagnose alternative frameworks of the electrolyte concept, a measurement instrument was developed using the Rasch model. This paper reports the use of the instrument to assess 559 students from grade 10 to grade 12 in two cities. The results provided both diagnostic and summative information about the students' conceptual understanding, suggesting that the instrument has a reasonable degree of validity. The results also demonstrated that the understanding of senior students in mainland China improved with increasing grade level, but that many alternative frameworks were entertained by students at each level.


Introduction

Research on conceptual learning and assessment has long been a focus of science education. Since the 1980s, research on alternative frameworks has been a prevalent topic. Researchers have found that, during conceptual learning, students often hold ideas that are inconsistent with scientific concepts. Various terms have been used to label these ideas, for example, misconceptions (Johnstone et al., 1977), preconceptions (Ausubel, 1968), alternative frameworks (Driver and Easley, 1978) and students' conceptions (Duit, 1993). In this paper, for consistency, we refer to them as alternative frameworks. To diagnose alternative frameworks, researchers have developed many kinds of diagnostic assessment tools. One important tool is the multiple-choice paper-and-pencil test (Caleon and Subramaniam, 2010; McClary and Bretz, 2012), such as the concept inventory (Mulford and Robinson, 2002; Evans et al., 2003; Brandriet and Bretz, 2014). Two-tier multiple-choice diagnostic tools are widely used at present (Treagust, 1988; Tan et al., 2002; Chandrasegaran et al., 2007); these can be used to diagnose students' alternative frameworks and also to provide insight into students' understanding based on their reasoning. Diagnostic tools are effective in providing qualitative information about students' understanding, that is, the alternative frameworks entertained by students; however, they do not usually provide summative measures because they typically lack sufficient internal consistency and are not unidimensional.

Since the 2000s, researchers have deepened their understanding of students' alternative frameworks. They have found that students' alternative frameworks persist even after formal instruction, change as conceptual understanding develops, and vary with the level of understanding (Aktan, 2012). If a tool is used only to diagnose students' alternative frameworks, and not to measure their conceptual understanding levels, it provides the teacher with only limited information regarding the students' level of understanding (Wilson, 2008). In the mid-2000s, Rasch measurement, which provides a means of integrating diagnostic assessment with summative assessment, was introduced to science education research (Liu, 2009, 2012). This approach is now commonly applied in the development of formative assessments aimed at constructing a learning path, for example, a learning progression (Corcoran et al., 2009; Claesgens et al., 2009; Hadenfeldt et al., 2013). Although some researchers have reported instruments that integrate diagnostic and summative assessment (Hoe and Subramaniam, 2016), few have reported instruments based on the Rasch model that integrate the two in chemistry education.

The electrolyte is a key concept in the Chinese high school chemistry curriculum and plays an important role in students' understanding of the behavior of solutions. Understanding of the electrolyte concept has been assessed within tests on solutions (Devetak et al., 2009), acids and bases (Chiu, 2007), electrochemistry (Ogude and Bradley, 1994; Loh et al., 2014), and various general chemistry concepts (Mulford and Robinson, 2002; Potgieter and Davidowitz, 2011). The tools used in these tests are mainly diagnostic. The purpose of this paper was to develop a measurement instrument combining the measurement of understanding levels with the diagnosis of alternative frameworks. Two questions were posed:

(1) How effective is the measurement instrument when it is used to measure electrolyte conceptual understanding levels and to diagnose electrolyte alternative frameworks?

(2) How does the students’ conceptual understanding of an electrolyte change from grade 10 through to grade 12?

Method

The development of a measurement instrument includes three components: cognition, observation, and interpretation (NRC, 2001). Cognition refers to a theory or construct of the way in which students develop conceptual understanding in a subject domain. Observation refers to the students' performance based on the type of assessment task and situation. Interpretation refers to a statistical model, which is a summary of the patterns one would expect to see in the data taking into account the students' understanding levels. Wilson (2008) proposed four building blocks as steps for the development of measurement instruments based on these three components. The first building block is the progress variable, which focuses on one characteristic to be measured at a time. The second building block is the item design, which refers to the variety of items or tasks used to prompt students' responses. The third building block is the outcome space, into which students' responses to all of the items associated with the progress variable are categorized. The fourth building block is the measurement model, for example, the Rasch model. The measurement instrument was developed based on the framework suggested by Treagust (1986) for two-tier instruments, and the data obtained were analyzed based on the Rasch model, using the four steps described below.

1. Defining the levels of understanding

The first step was to define the understanding levels of the electrolyte concept and the associated alternative frameworks. For senior students, the electrolyte concept plays an important role in understanding the behavior of aqueous solutions. The behavior of aqueous solutions, including conductivity, acidity and alkalinity, and ionic reactions, has been examined in previous studies.

For the conductivity of electrolyte solutions, Çalik (2005) surveyed 10th grade students' conceptions about the conductivity of electrolytes and non-electrolytes. It was found that 8% of the students believed that electrolyte solutions were not conductive, and that students found it difficult to list examples of electrolytes. The students had difficulty in connecting electrolytes with other categories of matter, such as acids, bases and salts. Ogude and Bradley (1994) examined pre-college and college students' understanding of ionic conductivity in electrolyte solutions. The results demonstrated that, when solving more complex problems related to batteries, students often attributed conductivity to electrons and thought that electrons can move freely in solution. They suggested that understanding of the electrolyte concept should not be isolated from other concepts.

For the ionization of electrolytes, Devetak et al. (2009) investigated 16-year-old students' microscopic understanding of aqueous solutions. They found that 46% of the students did not completely understand the concept of electrolyte ionization and confused the concepts of ionization and dissolution. A similar case was found in Goodwin's (2002) research, where senior students confused the melting of a solid with dissolution and considered both to be a change from a solid to a liquid. From their explanations of complex electrochemical phenomena, Ogude and Bradley (1994) found that pre-college and college students thought that electrolytes were decomposed by an electric current. Nusirjan and Fensham (1987) found that lower-grade senior students thought that solids turned into a mixture of molecules, atoms and ions when dissolved in water.

For strong and weak electrolytes, Chiu (2007) administered a national survey of acid and base conceptual understanding in Taiwan. It was found that 34% of the senior students thought that weak electrolytes exist in the form of molecules because, although some molecules decompose into ions, the positive and negative ions attract each other and recombine into molecules.

For the reaction of electrolytes in solution, Nusirjan and Fensham (1987) surveyed senior students' understanding of reactions occurring in aqueous solutions. The results demonstrated that students from three grades held alternative frameworks arising from the lack of a particulate view of solutions. Furthermore, there were obvious differences in the answers given by students from the different grades when asked to describe the products and the ionic species involved in the reactions.

The Chinese Chemistry Curriculum Standard of High School (The Ministry of Education of the People's Republic of China, 2003) sets out learning objectives for different stages of electrolyte concept learning. In this paper, we constructed the electrolyte conceptual understanding levels from these staged learning objectives. Furthermore, the students' alternative frameworks described above were assigned to the different levels, as shown in Table 1.

Table 1 Students' electrolyte conceptual understanding levels and alternative frameworks
Level 3: Students can describe the ionization of weak electrolytes and explain the acid–base properties of solutions by quantitatively mastering the particulate species and their changes.
Alternative framework 1: Weak electrolyte is in a molecular form in aqueous solutions, because some molecules decompose to ions, then positive and negative ions attract each other to combine as molecules again (Chiu, 2007).
Level 2: Students understand electrolyte ionization based on the interaction between particles from a microscopic perspective.
Alternative framework 1: Electrolytes are decomposed by an electric current (Ogude and Bradley, 1994).
Alternative framework 2: Solids become a mixture of molecules, atoms and ions when dissolved in water (Nusirjan and Fensham, 1987).
Alternative framework 3: Melting of a solid is the same as dissolution; both are a change from a solid to a liquid (Goodwin, 2002).
Alternative framework 4: Ionization and dissolution are the same process (Devetak et al., 2009).
Level 1: Students distinguish electrolyte and non-electrolyte by the property of solution conductivity.
Alternative framework 1: Electrolyte solutions are not conductive (Çalik, 2005).


The electrolyte conceptual understanding levels built as shown in Table 1 represent a model for the students’ learning of the concept of electrolyte. This model was the basis for the development of a measurement tool and its validity needed to be tested by using the results.

2. Designing items and scoring schemes

The second step was the design of items and scoring schemes. There are multiple forms of items, with different degrees of effectiveness, for measuring understanding levels and diagnosing alternative frameworks (Mintzes et al., 1999). Wilson (2008) used multiple-choice items, as a convenient and effective format, to measure understanding levels. In particular, the two-tier multiple-choice item proposed by Treagust (1988) has been used by many researchers (Tan et al., 2002; Chandrasegaran et al., 2007; Adadan and Savasci, 2012) to diagnose students' alternative frameworks.

The measurement instrument was composed entirely of two-tier multiple-choice items. Each two-tier item consisted of two questions. The first-tier question assessed whether or not students understood the content, in either a dichotomous or a multiple-choice format. The second-tier question probed the students' reasoning for their first-tier answer and was in multiple-choice format. Both tiers had only one correct answer. The distracters were designed based on the students' possible alternative frameworks, including those listed in Table 1 and others gathered through a questionnaire and interviews administered before the items were designed. The number of options was not the same for every question; in general, there were two to five options.

The basic criterion for designing the items was to ensure that the students could find, in the second tier, a reason corresponding to their answer in the first tier. Only when the correct choices are selected in both tiers does a student answer the item correctly. If the student selected the wrong choice in either tier, he/she was considered to entertain an alternative framework. The student's alternative framework could be diagnosed by combining the choices in both tiers. To illustrate the item design process, we take questions 5 and 6 (abbreviated as Q5/Q6) as an example, shown in Fig. 1.


Fig. 1 Item Q5/Q6 in measurement instrument.

Item Q5/Q6 was designed in accordance with understanding level 3. This item mainly assessed whether the students could reason quantitatively about the particulate species from the experimental fact of conductivity. When a student chose A for question 5 and B for question 6 (labeled 5A6B for short), he/she had answered this item correctly. Multiple-choice items have the disadvantage of inflating error because students guess when they do not know the correct answer. Taking Q5/Q6 as an example, the probability of obtaining the correct answer by guesswork was 0.125 (0.5 in the first tier multiplied by 0.25 in the second tier). A different combination of answers indicated that the student entertained an alternative framework. For example, if a student chose 5A6C, we inferred that the student entertained the alternative framework that only hydrogen ions are conductive. Some combinations were unreasonable for an item, such as 5B6D; students who selected such combinations were considered not to be taking the test seriously.
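To make the scoring and diagnosis rule concrete, the short sketch below (our illustration, not the authors' code) encodes the Q5/Q6 combinations discussed above and reproduces the guessing probability; the dictionary and function names are assumptions.

```python
# Illustrative sketch for item Q5/Q6 only; the labels follow the text above.
DIAGNOSIS_Q5Q6 = {
    ("A", "B"): "correct (5A6B)",
    ("A", "C"): "alternative framework: only hydrogen ions are conductive (5A6C)",
    ("B", "D"): "unreasonable combination, test not taken seriously (5B6D)",
}

def diagnose_q5q6(first_tier: str, second_tier: str) -> str:
    """Return a diagnosis label for a pair of Q5/Q6 choices."""
    return DIAGNOSIS_Q5Q6.get((first_tier, second_tier), "other alternative framework")

# Probability of answering the item correctly by pure guessing:
# 2 options in the first tier, 4 options in the second tier.
p_guess = (1 / 2) * (1 / 4)

print(diagnose_q5q6("A", "C"))   # only hydrogen ions are conductive
print(p_guess)                   # 0.125, as stated in the text
```

In practice, one such lookup per item, derived from the distracter design, would drive both the dichotomous scoring and the alternative-framework counts reported later.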

Four to six items were designed for each level. In total, 15 items, comprising 30 questions, made up the measurement instrument. The instrument is in Chinese and was translated into English (see Appendix 1); the effectiveness reported in this paper refers to the Chinese version. The distribution of items across the levels is shown in Table 2.

Table 2 Distribution of items in understanding levels
Understanding levels Distribution of items
Level 3: Explaining problems quantitatively Q5/Q6; Q13/Q14; Q15/Q16; Q21/Q22; Q29/Q30
Level 2: Understanding interactions of particles Q1/Q2; Q9/Q10; Q19/Q20; Q23/Q24; Q25/Q26; Q27/Q28
Level 1: Recognizing the category of matter Q3/Q4; Q7/Q8; Q11/Q12; Q17/Q18


The instrument's content and the corresponding understanding levels were reviewed by an experienced high school teacher, and the items showed good content-related validity. Most of the items matched the understanding levels given in Table 2, except for items Q13/Q14 and Q17/Q18, which were judged also to examine some content related to level 2. After balancing the assessment content, the final decision was to keep these two items at the levels presented in Table 2.

3. Data collection

The third step was the use of the measurement instrument to test senior students from mainland China. For Rasch measurement, there should be variation in the students' abilities. A total of 559 students from two high schools located in two cities agreed to participate in this study. We obtained approval from the two schools to conduct the study and complied with the relevant ethics regulations of China. All of the students who participated agreed to the use of their responses to the measurement instrument in the analysis. The students' details are presented in Table 3.
Table 3 Distribution of participants
Cities Grade 10 Grade 11 Grade 12 Total
Ji'nan 62 62 57 181
Shi Jiazhuang 129 128 121 378
Total 191 190 178 559


The students from the two cities used different versions of high school chemistry textbooks, both of which were developed based on the Chinese Chemistry Curriculum Standard of High School. Senior students from grades 10 to 12 were tested in this study. Before administration, the 10th grade students had learned the electrolyte concept, including the related concepts of ionization and ionic reactions. The 11th grade students had, in addition, learned about weak electrolytes, and the 12th grade students had learned about the more complex behaviour of electrolyte solutions. It is reasonable to expect that students' understanding deepens as these topics are learned progressively; demonstrating this with the test would provide some evidence that the test has good predictive validity (Linacre, 2011).

The whole test was supervised by the school teachers. All the students were asked to answer all the questions in the measurement instrument, and most of them managed to finish the test within 15 minutes. After the test, the 559 test papers were collected and sent to us by the teachers.

4. Measurement model and data analysis

The fourth step was the use of the Rasch model to estimate the difficulties of the items and the abilities of the students on the same scale. For the formula and use of the Rasch model, please refer to the relevant literature (Liu and Boone, 2006).

Firstly, all the test papers were collected and details of the cities and grades were added. The students' responses were recorded and saved in Excel as a data set. Although each item actually included two questions, the students' answer to each item was given a single code in the form of a "title number & option". For example, if a student chose A for question 5 and C for question 6, the answer was coded as 5A6C. The codes for the students' answers were consistent with the alternative framework labels of the items, so the students' alternative frameworks could be diagnosed from the frequency of the option combinations. A very few students answered only the first tier and left the second tier blank; such answers were coded as zero.

How were the conceptual understanding levels of the students inferred from these raw data? The Rasch model is considered to be an effective method; it estimates the difficulties of the items and the abilities of the students together and defines the probability of a correct answer as P = e^(θn − δi)/(1 + e^(θn − δi)), where θn is the ability of student n and δi is the difficulty of item i. According to the Rasch model, both the underlying understanding levels of the students and the difficulties of the items can be evaluated. On the one hand, by comparing the difficulties of the items assigned to different levels, the efficiency of the instrument for measuring the students' understanding levels can be tested. On the other hand, the understanding levels of the students from different grades can be compared. In contrast, the true-score model of classical test theory can hardly estimate students' abilities while taking the varying difficulties of the items into account.
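As a small illustration of this formula (a sketch under our own assumptions, not the analysis actually run in Winsteps), the function below computes the Rasch probability of a correct answer for a given ability θ and item difficulty δ; the example values are item measures reported later in Table 4.

```python
import math

def rasch_probability(theta: float, delta: float) -> float:
    """Dichotomous Rasch model: P = exp(theta - delta) / (1 + exp(theta - delta))."""
    return math.exp(theta - delta) / (1 + math.exp(theta - delta))

# A person of average ability (theta = 0) has a low chance on the hardest item
# (Q29/Q30, measure 3.40) and a high chance on the easiest item (Q9/Q10, -2.29).
print(round(rasch_probability(0.0, 3.40), 3))   # ~0.032
print(round(rasch_probability(0.0, -2.29), 3))  # ~0.908
```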

Two basic assumptions are made when using Rasch modeling to analyze data. One is the unidimensionality of the measurement instrument, that is, the expected performance of students is based on a single latent trait. The other is local independence of the items, that is, the answer to one item is not affected by the answers to the other items. For two-tier items, the answer to the second tier is bound to that of the first tier. To satisfy the second assumption, an item was scored one point only when both tiers were answered correctly, and zero points when either or both tiers were wrong. The raw data were therefore converted to a dichotomy of 1 and 0 and processed with Winsteps 3.72.0 software.
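A minimal sketch of this coding and dichotomization step (our illustration of the rule described above, not the authors' actual workflow; the answer key shown covers only the example item, and all names are assumptions):

```python
import re

# Key for the example item only: first-tier and second-tier correct options.
ANSWER_KEY = {"Q5/Q6": ("A", "B")}

def score_item(response_code: str, key: tuple) -> int:
    """Dichotomize one coded response such as '5A6C' or '13A14B':
    1 only if both tiers match the key, otherwise 0 (blank second tier -> 0)."""
    letters = re.findall(r"\d+([A-E])", response_code or "")
    if len(letters) != 2:
        return 0
    return int(tuple(letters) == key)

print(score_item("5A6B", ANSWER_KEY["Q5/Q6"]))  # 1, both tiers correct
print(score_item("5A6C", ANSWER_KEY["Q5/Q6"]))  # 0, alternative framework
```

The resulting persons-by-items matrix of 0s and 1s is what was then analyzed with Winsteps.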

Results and discussion

1. Validity and reliability of measures

To estimate the understanding levels with the Rasch model, the collected data should meet two assumptions: unidimensionality and local independence.

The unidimensionality check looks for components that do not correspond to the latent trait, which in this paper is the "conceptual understanding of electrolyte". Unidimensionality was tested by principal component analysis of the residuals.

Fig. 2 shows the dimensionality analysis of the measurement instrument. The horizontal axis represents the item measure, and the vertical axis represents the loading of each item on the first contrast. The Rasch model explained 38% of the total variance, leaving 62% unexplained. This indicates that the instrument was not strictly unidimensional; an additional construct may exist. Most items had a loading within the −0.4 to +0.4 range; three items (C-Q25/Q26, B-Q1/Q2, A-Q27/Q28) were outside this range. Items with a correlation of over 0.7 were considered to be highly locally dependent (Linacre, 2011). Since most of the correlations for the questions were below 0.7, the responses to most of the items fulfilled the criterion of local independence under the Rasch model. Other constructs seemed to underlie the residuals, as the loadings of two items were over 0.7: Q1/Q2 (0.73) and Q27/Q28 (0.79).


Fig. 2 Plot of item loading.
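For readers who wish to reproduce this kind of dimensionality check outside Winsteps, the sketch below (our own illustration, assuming a dichotomous Rasch model; not the procedure that generated Fig. 2) performs a principal component analysis of the standardized residuals and returns item loadings on the first contrast, together with the residual correlation matrix used to judge local dependence.

```python
import numpy as np

def first_contrast_loadings(X: np.ndarray, theta: np.ndarray, delta: np.ndarray):
    """X: persons x items 0/1 matrix; theta: person measures; delta: item measures.
    Returns (loadings on the first contrast, item residual correlation matrix)."""
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))  # expected scores
    Z = (X - P) / np.sqrt(P * (1.0 - P))                          # standardized residuals
    R = np.corrcoef(Z, rowvar=False)                              # residual correlations
    eigvals, eigvecs = np.linalg.eigh(R)                          # ascending eigenvalues
    loadings = eigvecs[:, -1] * np.sqrt(eigvals[-1])              # first contrast
    return loadings, R
```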

Do these items need to be separated from the others, or reconstructed in a new test? Research has demonstrated that real data are rarely strictly unidimensional, and deleting items from a measurement instrument requires careful consideration of whether they clearly diverge from the other items in light of the aim of the measurement (Linacre, 2011). We examined the items and found that they mainly evaluated the understanding of ionization. Ionization is an essential concept for understanding aqueous electrolytes, so we decided to keep these items. The items mentioned above require further investigation to improve the unidimensionality of the measurement instrument.

Validity. Does the test measure what it is intended to measure? If the test has measured three levels of electrolyte understanding, then the measurement instrument developed in this paper has construct validity (Linacre, 2011).

The first step was to ensure that all of the items fitted the Rasch model. The mean square residual (MNSQ) indicates the magnitude of misfit, in two forms: Outfit MNSQ and Infit MNSQ. Outfit is a chi-square-based statistic sensitive to outliers in the Rasch analysis; outliers are often lucky guesses by students of lower ability and careless mistakes by students of higher ability. Infit mean squares are influenced by response patterns, focusing on responses close to the difficulty of the item or the ability of the student. The expected value of MNSQ is 1.0. The PTMEA Corr. value is the correlation between responses to an item and the person measures; for Rasch analysis, the value should be positive and not close to zero (Bond and Fox, 2015). From Table 4, it can be seen that the PTMEA Corr. values for all of the items were positive and within a range of 0.06–0.63. The values for two items (Q15/Q16, Q29/Q30) were nearly zero, so these two items require further investigation.

Table 4 Item fit statistics(a)
Item Measure(a) Model S.E. Infit MNSQ Infit ZSTD Outfit MNSQ Outfit ZSTD PTMEA corr.
(a) N = 559. MNSQ values of 0.70–1.30 indicate the acceptable fit range; infit (weighted) or outfit (unweighted) values outside this range indicate poor fit of the data to the Rasch model.
Q1/Q2 −0.02 0.09 0.83 −4.9 0.78 −4.4 0.58
Q3/Q4 −2.02 0.12 1.03 −0.4 1.12 −0.9 0.30
Q5/Q6 2.17 0.14 1.10 1.1 1.52 2.6 0.20
Q7/Q8 −0.82 0.10 0.90 −2.7 0.81 −3.0 0.51
Q9/Q10 −2.29 0.13 1.00 0.1 0.95 −0.3 0.32
Q11/Q12 −2.11 0.12 1.08 1.1 1.16 1.2 0.26
Q13/Q14 −0.02 0.09 1.03 0.8 0.99 −0.1 0.42
Q15/Q16 2.18 0.14 1.25 2.6 1.92 4.3 0.06
Q17/Q18 −0.24 0.09 1.10 2.7 1.12 2.2 0.35
Q19/Q20 −0.70 0.10 1.02 0.60 1.00 0.0 0.41
Q21/Q22 0.31 0.10 1.07 1.8 1.05 0.9 0.38
Q23/Q24 0.27 0.10 1.04 0.9 1.03 0.6 0.40
Q25/Q26 0.19 0.10 0.76 −7.0 0.76 −4.6 0.63
Q27/Q28 −0.32 0.09 0.80 −5.9 0.75 −4.9 0.60
Q29/Q30 3.40 0.21 1.11 0.7 1.76 2.1 0.08
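The fit statistics in Table 4 were obtained from Winsteps; the sketch below (our own illustration, assuming a dichotomous Rasch model and assumed variable names) shows how the Outfit and Infit mean squares are defined from the residuals, which may help in reading the table.

```python
import numpy as np

def item_fit_mnsq(X: np.ndarray, theta: np.ndarray, delta: np.ndarray):
    """Return (infit, outfit) mean-square statistics per item for a
    persons x items 0/1 matrix X and Rasch measures theta (persons), delta (items)."""
    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))  # expected scores
    W = P * (1.0 - P)                                             # model variance
    Z2 = (X - P) ** 2 / W                                         # squared std. residuals
    outfit = Z2.mean(axis=0)                                      # unweighted mean square
    infit = ((X - P) ** 2).sum(axis=0) / W.sum(axis=0)            # information-weighted
    return infit, outfit
```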


The second step was measurement of the consistency between the difficulty of the items and the understanding levels constructed in Table 2. That is, the higher the level of the conceptual understanding, the harder the corresponding item. A Wright map (presented in Fig. 3) is a graphical representation of increased conceptual understanding. The locations of items on the Wright map are derived from empirical analyses of students' data from sets of items.


Fig. 3 Wright map.

From the Wright map, it can be seen that the items in the measurement instrument covered most of the students' abilities. However, there are three gaps, at around 2.5 logits, 1 logit and −1.5 logits, where no items correspond to the students' abilities; additional items are needed in further work. Three item-level results deserve explanation; each involves comparing an item's measured difficulty with its assigned level. Firstly, the measure for item Q13/Q14 (−0.02), assigned to level 3, is equal to that of item Q1/Q2, assigned to level 2; according to the expert review, item Q13/Q14 also covers content of level 2 and needs to be revised. Secondly, the measure for item Q17/Q18 (−0.24), assigned to level 1, is higher than those of some level 2 items, because "methanol" in item Q17/Q18 is not a substance familiar to students. Thirdly, the measure for item Q9/Q10 (−2.29), assigned to level 2, is the lowest, because the ionization of magnesium chloride, as a typical example of a solution, had already been learned by the students in junior high school. Overall, 80% of the items' difficulties were aligned with the construct.

The third step was calculation of the mean measure for every understanding level by averaging the measures of all the items at that level. As shown in Table 5, the mean measures increased from level 1 to level 3, thereby providing more evidence for the validity of the measures.

Table 5 Mean measures of understanding levels
Level Item (measurements) Mean S.D.
1 Q3/Q4(−2.02); Q7/Q8(−0.82); Q11/Q12(−2.11); Q17/Q18(−0.24). −1.30 1.30
2 Q1/Q2(−0.02); Q9/Q10(−2.29); Q19/Q20(−0.70); Q23/Q24(0.27); Q25/Q26(0.19); Q27/Q28(−0.32). −0.48 1.28
3 Q5/Q6(2.17); Q13/Q14(−0.02); Q15/Q16(2.18); Q21/Q22(0.31); Q29/Q30(3.40). 1.61 1.40
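The level means in Table 5 follow directly from the item measures in Table 4; the short check below (our illustration) reproduces them.

```python
# Item measures from Table 4, grouped by the levels of Table 2.
levels = {
    1: [-2.02, -0.82, -2.11, -0.24],
    2: [-0.02, -2.29, -0.70, 0.27, 0.19, -0.32],
    3: [2.17, -0.02, 2.18, 0.31, 3.40],
}
for level, measures in levels.items():
    print(level, round(sum(measures) / len(measures), 2))
# prints -1.3, -0.48 and 1.61, matching the means in Table 5
```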


Reliability. The reliability of the measures was established by means of the Rasch measurement model as well as classical test theory (Cronbach alpha coefficients). In the Rasch analysis, reliability is a property of the persons and the items measured, with two indicators: the person separation index and the item separation index. A separation index can also be converted to an equivalent of Cronbach's α, with a range of 0–1. A summary of the statistics of the measurement instrument is presented in Table 6.
Table 6 Summary statistics of person and item
Parameter (N) Infit MNSQ Infit ZSTD Outfit MNSQ Outfit ZSTD Separation Reliability
Persons (559) 0.97 0.0 1.11 0.2 1.18 0.58
Items (15) 1.11 −0.5 1.12 −0.2 12.68 0.99


It can be seen from Table 6 that the person separation index was 1.18, with an equivalent Cronbach's α of 0.58. This person reliability was not very high, but it does not undermine the instrument's use for informing teaching decisions and is suitable for low-stakes classroom assessment (Wei et al., 2012). The item separation index was very high, and the corresponding Cronbach's α value was 0.99.
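The reported reliabilities follow from the separation indices via the standard Rasch relation R = sep^2/(1 + sep^2); the two-line check below (our illustration) reproduces the values in Table 6.

```python
def separation_to_reliability(sep: float) -> float:
    """Convert a Rasch separation index to its equivalent reliability."""
    return sep ** 2 / (1 + sep ** 2)

print(round(separation_to_reliability(1.18), 2))   # 0.58, person reliability
print(round(separation_to_reliability(12.68), 2))  # 0.99, item reliability
```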

Cronbach's α from classical test theory indicates the consistency of students' responses across all the items in the measurement instrument. The α values for grade 10 students (α1 = 0.60), grade 11 students (α2 = 0.53) and grade 12 students (α3 = 0.66) did not exceed 0.7, the value commonly regarded as indicating acceptable reliability, which points to weak correlations among students' responses to the items. The results are similar to those obtained by Luxford and Bretz (2014), whose reliability values also did not exceed 0.7 when they tested high school students' understanding of other chemistry concepts. One cause of a low Cronbach's α, as proposed by Adams and Wieman (2011), is that instruments probing alternative frameworks measure students' fragmented knowledge.
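For reference, the classical-test-theory coefficient quoted here can be computed from a dichotomous score matrix as sketched below (a generic illustration, not the authors' script; the per-grade values above would come from applying it to each grade's subset of students).

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a persons x items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```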

2. Students' conceptual understanding levels and alternative frameworks

The second question of this paper was how students' conceptual understanding of the electrolyte concept changes from grade 10 through to grade 12. Using the measurement instrument developed in this paper, we assessed the students' understanding levels by grade and diagnosed their alternative frameworks.
Students' understanding levels at different grades. In light of the mean measures obtained for the three levels (listed in Table 5), the students' abilities were divided into understanding levels. When a student's ability value was lower than −1.30, the student's conceptual understanding of the electrolyte was judged to be below level 1; between −1.30 and −0.48, at level 1; between −0.48 and 1.61, at level 2; and greater than 1.61, at level 3. The students' understanding levels from grade 10 to 12 are listed in Table 7.
Table 7 Conceptual understanding levels of students in different grades
Grade N Mean (S.D.) Below level 1 (%) Level 1 (%) Level 2 (%) Level 3 (%)
10 191 −0.33 (1.07) 18.3 28.8 48.7 4.2
11 190 −0.21 (0.98) 14.2 55.8 26.3 3.7
12 178 0.04 (1.16) 13.5 19.7 60.7 6.1
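A small sketch of this classification rule (our illustration; how abilities falling exactly on a cut point are assigned is our assumption, since the text does not specify it):

```python
def understanding_level(ability: float) -> str:
    """Assign a Rasch ability estimate (in logits) to an understanding level
    using the cut points -1.30, -0.48 and 1.61 from Table 5."""
    if ability < -1.30:
        return "below level 1"
    elif ability < -0.48:
        return "level 1"
    elif ability <= 1.61:
        return "level 2"
    return "level 3"

print(understanding_level(-0.33))  # level 2 (the grade 10 mean ability in Table 7)
```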


The mean value for each grade was calculated by averaging the ability values of the students in that grade. As shown in Table 7, the mean values increased from grade 10 to grade 12, indicating that students' conceptual understanding of the electrolyte developed with grade and providing some evidence for the predictive validity of the measurement instrument. Differences in understanding among the three grades were tested by one-way ANOVA. The results revealed significant differences among the three grades [F(2,556) = 5.427, p = 0.005**]. In pairwise comparisons, the understanding of the 10th and 11th grade students (N = 190, M = −0.21, SD = 0.98) did not differ significantly (p = 0.897), and a similar result was obtained when the 11th and 12th grade students were compared (p = 0.083). However, there was a significant difference between the 10th and 12th grade students (p = 0.004**).
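The grade comparison could be reproduced along the following lines (a hedged sketch: the samples below are normal placeholders generated from the group sizes, means and SDs in Table 7, not the actual 559 person measures, so the printed statistics will not match the reported ones; the paper also does not state which post hoc procedure was used).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
grade10 = rng.normal(-0.33, 1.07, 191)   # placeholder person measures
grade11 = rng.normal(-0.21, 0.98, 190)
grade12 = rng.normal(0.04, 1.16, 178)

f_stat, p_value = stats.f_oneway(grade10, grade11, grade12)
print(f"F(2, 556) = {f_stat:.3f}, p = {p_value:.3f}")

# Pairwise follow-up comparisons (here plain t-tests for illustration).
pairs = {"10 vs 11": (grade10, grade11),
         "11 vs 12": (grade11, grade12),
         "10 vs 12": (grade10, grade12)}
for name, (a, b) in pairs.items():
    print(name, round(stats.ttest_ind(a, b).pvalue, 3))
```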

From Table 7, it can be seen that fewer than 10% of the students achieved level 3 in grades 10 to 12. This suggests that students have difficulty explaining complicated problems, such as solution conductivity and acid–base properties, even after learning all of the high school curriculum content relating to electrolytes. This result is similar to that of Potgieter and Davidowitz (2011), who found that most students still had difficulty predicting changes in pH arising from salt hydrolysis after finishing high school.

About 80% of the students' understanding was distributed between level 1 and level 2 in each grade. Specifically, the percentage of grade 11 students who attained level 1 was far greater than that of grade 10 and grade 12 students, whereas the percentage of grade 11 students who achieved level 2 was far lower than for the other two grades. This may be one reason why there was no significant difference between grade 11 and the other two grades, as noted above.

Students' alternative frameworks in different grades. To obtain more information about the students' understanding of the electrolyte, the students' alternative frameworks were diagnosed by counting the frequency of the students' choices. The alternative frameworks for the students from the different grades are shown in Table 8.
Table 8 Alternative frameworks for the students from the different grades
Understanding levels Alternative frameworks Options Grade 10 (%) Grade 11 (%) Grade 12 (%)
Note: only alternative frameworks with a frequency over 10% are listed in this table; "—" represents a frequency of less than 10%.
Level 3: Explaining problems quantitatively (a) The conductivity of a strong and a weak electrolyte is the same when they have the same concentration. 5B6A 62.3 59.7 58.7
(b) A strong acid is not necessarily a strong electrolyte. 13A14B 15.7 15.2 14.0
(c) There is no OH− in an acid solution. 13D14C 27.2 28.8 30.2
(d) There are small amounts of ions in water because water is a weak electrolyte. 15D16C 45.0 43.5 50.8
(e) The conductivity of a strong acid solution is greater than that of a weak acid solution. 21A22C 38.7 38.2 38.0
Level 2: Understanding particle interactions (a) Insoluble salts become a mixture of molecules and ions. 1B2B 14.1 14.1
27B28B 14.1 21.5
25D26D 26.2 22.5 20.1
(b) Insoluble salts can't ionize. 1C2A 12.6
25A26A 12.0
(c) There are no interactions that occur when an electrolyte ionizes. 15D16D 15.7 13.6 13.4
29A30B 61.8 64.4 65.9
(d) An electrolyte ionizes after it dissolves in water. 19A20A 16.8 14.1 16.2
Level 1: Recognizing matter category (a) Any electrolyte is a conductor of electricity. 3B4B 11.7
23D24A 11.0
(b) Any conductive matter is an electrolyte. 7A8B 28.8 24.6 21.8
(c) Organic compounds are not electrolytes. 17B18B 31.4 36.1 27.9
(d) There are ions in any solution. 23D24C 29.3 22.0 24.6
23C24C 13.6 10.1


As shown in Table 8, students' alternative frameworks were diagnosed for more than two-thirds of the items. This suggests that the measurement instrument is effective for diagnosing students' alternative frameworks about electrolytes. Table 8 also shows that students held various alternative frameworks at the different understanding levels.

At understanding level 1, the prominent alternative framework was "organic compounds are not electrolytes", held by 31.4%, 36.1% and 27.9% of students in grades 10 to 12, respectively. For example, when answering item Q17/Q18, students thought that there were no ions in a methanol solution because methanol is organic. Another dominant alternative framework was "conductive matter is an electrolyte", held by 28.8% of grade 10 students, 24.6% of grade 11 students and 21.8% of grade 12 students. A similar alternative framework, "there are ions in all solutions", was held by 29.3% of grade 10 students, 22.0% of grade 11 students and 24.6% of grade 12 students. These alternative frameworks suggest that the students made fuzzy connections among the concepts of electrolytes, conductive matter, solutions and ions. These results are similar to those obtained by Çalik (2005), who found that grade 10 students were unable to distinguish between conductive matter, solutions and ions; in that study, students also inferred that there were ions in all solutions, thus confusing electrolyte solutions with non-electrolyte solutions.

At understanding level 2, the dominant alternative framework was that no interactions occur during electrolyte ionization. For example, when answering item Q29/Q30, 61.8% of grade 10 students, 64.4% of grade 11 students and 65.9% of grade 12 students selected the option "a few formic acid molecules ionize automatically" with the reason that "a weak electrolyte only partially ionizes". This indicates that most students could not understand electrolyte ionization in terms of particulate interactions. In contrast to the results obtained by Ogude and Bradley (1994), few students thought that an electrolyte is decomposed by an electric current. One important reason is that the Chinese terms for ionization and electric current share the same character; teachers are aware of the likely confusion and explain to students that electrolyte ionization is not caused by a current. However, the students then misunderstood ionization as electrolytes decomposing automatically, without interactions between the particles. For example, when answering item Q15/Q16, 15.7% of grade 10 students, 13.6% of grade 11 students and 13.4% of grade 12 students thought that a few water molecules ionize automatically, ignoring the interactions among the molecules.

Another dominant alternative framework at understanding level 2 was that students confused ionization and dissolution. When answering item Q19/Q20, 16.8% of grade 10 students, 14.1% of grade 11 students and 16.2% of grade 12 students thought that the ionization of calcium chloride occurred after its dissolution in water. Devetak et al. (2009) also found that grade 10 and 11 students did not fully understand electrolyte ionization and confused it with dissolution at the microscopic level. When answering item Q25/Q26, 26.2% of grade 10 students, 22.5% of grade 11 students and 20.1% of grade 12 students thought that insoluble electrolytes are all weak electrolytes and form a mixture of ions and molecules after ionization. More than 10% of grade 11 students even thought that an insoluble electrolyte cannot ionize. These results are in accordance with those of Nusirjan and Fensham (1987), who found that students thought that insoluble salts turn into a mixture of molecules and ions after ionization.

At understanding level 3, the 10th to 12th grade students had difficulty explaining the conductivity and acid–base properties of solutions. When answering item Q5/Q6, the students could hardly compare the conductivity of strong and weak electrolytes of the same concentration: ignoring the fact that the electrolytes were at the same concentration, 62.3% of grade 10 students, 59.7% of grade 11 students and 58.7% of grade 12 students intuitively thought that a weak electrolyte does not necessarily exhibit weak conductivity. A similar result was found for item Q13/Q14, where 15.7% of grade 10 students, 15.2% of grade 11 students and 14.0% of grade 12 students thought that a strong acid might be a weak electrolyte. Some students answered items according to the literal meaning of terms; for example, 38.7% of grade 10, 38.2% of grade 11 and 38.0% of grade 12 students thought that the conductivity of a strong acid solution is stronger than that of a weak acid solution. In addition, 27.2% of grade 10 students, 28.8% of grade 11 students and 30.2% of grade 12 students had learnt by rote that there is no OH− in an acid solution. Also, 45.0% of grade 10 students, 43.5% of grade 11 students and 50.8% of grade 12 students thought that there are small amounts of ions in water because water is a very weak electrolyte. This suggests that the students made simplistic connections, for example between a strong electrolyte and strong conductivity, or between an acid and hydrogen ions, and so on. It also suggests that the students did not think in terms of the whole system when reasoning quantitatively about aqueous conductivity and acid–base properties.

Conclusions

The measurement instrument described herein has a degree of construct validity and predictive validity. It can be used both to measure students' understanding levels of the electrolyte concept and to diagnose their alternative frameworks. When the instrument was used to assess grade 10 to grade 12 students' conceptual understanding, it was found that most of the students were at conceptual understanding levels 1 and 2, and very few students reached level 3. At each level of conceptual understanding, students held various dominant alternative frameworks. For example, at level 1 students made fuzzy connections among several concepts, such as electrolytes, conductive matter, solutions and ions; at level 2 students confused ionization and dissolution; and at level 3 students had difficulty explaining the conductivity and acid–base properties of solutions. These alternative frameworks are persistent and do not disappear as students progress through the grades.

Developing a measurement instrument using the Rasch model is an iterative process. The results reported in this paper are from the first iteration of the test and demonstrate that a few items should be further investigated to improve the unidimensionality and item fit of the measurement instrument. In addition, an unexpected finding was that far fewer grade 11 students achieved level 2 than students in grades 10 and 12; further research is needed to explain this observation. Nevertheless, this study was framed as a feasibility study: it is possible to develop a measurement instrument that provides both diagnostic and summative information about students' conceptual understanding.

Appendix 1: electrolyte conceptual understanding test

Grade/Class_______Name_______NO._______.

This test consists of 30 questions in total, and each question has only one correct answer. Please complete all of the questions on your own. Some relevant definitions from the textbook are provided below. (In this appendix, the correct answer to each question is marked with an asterisk.)

Electrolyte: any compound that conducts electricity when melted or dissolved in water

Non-electrolyte: any compound that conducts electricity neither when melted nor when dissolved in water

Ionization: the process of forming ions when melted or dissolved in water

Strong electrolyte: an electrolyte that completely ionizes in water

Weak electrolyte: an electrolyte that only partially ionizes in water

Q1. Calcium carbonate (CaCO3) is a _______.

*A strong electrolyte

B weak electrolyte

C non-electrolyte

Q2. The reason for the answer of Q1 is_______.

A the CaCO3 can't dissolve in aqueous solution

*B the CaCO3 completely ionizes when melted

C the CaCO3 only partially ionizes in aqueous solution

D the CaCO3 can't ionize in aqueous solution

Q3. A KBr aqueous solution can light a bulb. The substance(s) that cause(s) the electrical conductivity is (are) _______.

A water

*B K+ and Br−

C solute

D electron

Q4. The reason for the answer of Q3 is that _______.

A water decomposed into hydrogen and oxygen gas by electricity

B KBr is an electrolyte, and all electrolytes conduct electricity

*C KBr ionized and freed the ions to move

D KBr decomposed into conductive substances by electricity

E pure water can't conduct electricity unless there are solutes in it

Q5. There are an H2CO3 solution of 1 mol L−1 and an H2SO4 solution of 1 mol L−1. The conductivity of the H2CO3 solution is weaker than that of the H2SO4 solution, so it is concluded that H2CO3 is a weak electrolyte. _______.

*A True

B False

Q6. The reason for the answer of Q5 is _______.

A weak conductivity does not necessarily mean a weak electrolyte

*B the H2CO3 solution contains molecules, and molecules are not conductive

C H2CO3 is a weak acid, and only H+ is conductive

D there is less O2− in the H2CO3 solution than in the H2SO4 solution

Q7. Which of the following is an electrolyte? _______.

A NaOH solution

B Cu

*C BaSO4

D SO2

Q8. The reason for the answer of Q7 is _______.

A this matter conducts electricity

B there are free moving ions in aqueous solution

C the compound conducts electricity when dissolved in water

*D this matter ionized when melted

Q9. What particles exist when magnesium chloride (MgCl2) is heated to the molten state? _______.

*A Mg2+ and Cl−

B MgCl2 molecules

C water and MgCl2 molecules

D Mg and Cl2

Q10. The reason for the answer of Q9 is _______.

*A MgCl2 completely decomposed into ions

B melting MgCl2 is a solution

C MgCl2 does not ionize when melted

D MgCl2 decomposed into Mg and Cl2 by electricity

Q11. Hydrofluoric acid (HF) is a weak electrolyte. The concentration of H+ in a 0.1 mol L−1 hydrofluoric acid solution is _______.

A equal to 0.1 mol L−1

B greater than 0.1 mol L−1

*C less than 0.1 mol L−1

Q12. The reason for the answer of Q11 is that _______.

A acid completely ionized into H+

B water molecules also ionized and produces some H+

*C HF molecules rarely ionized in solution

D not sure how many molecules ionized

Q13. NaHSO4 is a strong electrolyte and its solution is acidic. Which particulate(s) is (are) certainly not present in the solution? _______.

A SO42−

B H+ and OH−

*C HSO4−

D OH−

Q14. The reason for the answer of Q13 is that _______.

*A NaHSO4 completely ionized into Na+, H+ and SO42−

B NaHSO4 completely ionized into Na+ and HSO4−

C there is no OH− in any acid solution

D neither H+ nor OH− exists in salt solutions

Q15. Pure water has very weak electrical conductivity. How are the few ions in water produced? _______.

*A The interactions are broken in water molecules

B Single water molecule ionized automatically

C By electricity

D Water is a weak electrolyte

Q16. The reason for the answer of Q15 is _______.

*A there are interactions among water molecules

B molecules ionized by electricity

C weak electrolyte only partially ionized

D very few water molecules ionized

Q17. Are there ions when methanol (CH3OH) is dissolved in water? _______.

A Yes

*B No

Q18. The reason for the answer of Q17 is that _______.

A methanol ionized into ions

B methanol is an organic compound

C there is OH− in a methanol aqueous solution

*D methanol is a non-electrolyte

Q19. Calcium chloride (CaCl2) is a compound consisting of calcium ions and chloride ions. Does calcium chloride ionize when dissolved in water? _______.

*A Yes

B No

Q20. The reason for the answer of Q19 is that _______.

A CaCl2 ionized after dissolved

*B the ions are released from the structure by water molecules

C CaCl2 is insoluble

D ionized by electricity

Q21. The electrical conductivity of an HCl solution is _______ than (the same as) that of an HClO solution of the same concentration.

*A stronger

B weaker

C the same

Q22. The reason for the answer of Q21 is that _______.

A the concentrations of two solutions are the same

*B there are more ions in the HCl solution

C the conductivity of a strong acid is stronger

D there are more ions in the HClO solution

Q23. Which of the following contains freely moving chloride ions (Cl−)? _______.

A KCl crystal

B melted KCl

C KCl aqueous solution

*D KCl solution and melted KCl

Q24. The reason for the answer of Q23 is that _______.

A KCl solid consists of K+ and Cl−

B the melted KCl is a solution

C all solutions are conductive

*D KCl ionized both in solution and in melted state

Q25. What particulates exist in melted silver chloride (AgCl)? _______.

A AgCl molecules

*B Ag+ and Cl−

C AgCl molecules and water molecules

D AgCl molecules, Ag+ and Cl

Q26. The reason for the answer of Q25 is that _______.

A silver chloride didn't dissolve in water

B melted silver chloride is a solution

*C silver chloride is a strong electrolyte

D silver chloride is a weak electrolyte

Q27. One aqueous solution contains barium ions (Ba2+), and another contains sulfate ions (SO42−). When the two solutions are mixed, barium sulphate (BaSO4) precipitates. The BaSO4 is _______.

*A a strong electrolyte

B a weak electrolyte

C a non-electrolyte

D neither an electrolyte nor a nonelectrolyte

Q28. The reason for the answer of Q27 is _______.

A BaSO4 didn't dissolve in water

*B there are no molecules in the BaSO4 solution

C there is no Ba2+ or SO42− in BaSO4 solution

D BaSO4 didn't ionize in aqueous solution

Q29. Formic acid (HCOOH) is a weak electrolyte, and there are a few HCOO− and H+ ions in its aqueous solution. How are these ions produced? _______.

A Few HCOOH molecules ionized automatically

*B The interactions in formic acid were broken by water molecules

C There are ions in all electrolyte aqueous solutions

D There are H+ in all acid solutions

Q30. The reason for the answer of Q29 is that _______.

A acid ionized and produced H+

B a weak electrolyte ionized partially

*C the interaction forces are different in different molecules

Acknowledgements

This work was supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China (20133704110007) and the Natural Science Foundation of Shandong Province of China (ZR2012GM022).

Notes and references

  1. Adadan E. and Savasci F., (2012), An analysis of 16–17-year-old students' understanding of solution chemistry concepts using a two-tier diagnostic instrument, Int. J. Sci. Educ., 34(4), 513–544.
  2. Adams W. K. and Wieman C. E., (2011), Development and Validation of Instruments to Measure Learning of Expert-Like Thinking, Int. J. Sci. Educ., 33(9), 1289–1312.
  3. Aktan D. C., (2012), Investigation of students' intermediate conceptual understanding levels: the case of direct current electricity concepts, Eur. J. Phys., 34(1), 33–43.
  4. Ausubel D. P., (1968), Educational psychology: a cognitive view, New York: Holt, Rinehart and Winston.
  5. Bond T. G. and Fox C. M., (2015), Applying the Rasch Model: Fundamental Measurement in the Human Sciences, 3rd edn, Lawrence Erlbaum Associates Publishers.
  6. Brandriet A. R. and Bretz S. L., (2014), The development of the redox concept inventory as a measure of students' symbolic and particulate redox understandings and confidence, J. Chem. Educ., 91(8), 1132–1144.
  7. Caleon I. S. and Subramaniam R., (2010), Do Students Know What They Know and What They Don't Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students' Alternative Conceptions, Res. Sci. Educ., 40(3), 313–337.
  8. Çalik M., (2005), A Cross-age Study of Different Perspectives in Solution Chemistry from Junior to Senior High School, Int. J. Sci. Math. Educ., 3, 671–696.
  9. Chandrasegaran, A. L., Treagust, D. F. and Mocerino, M., (2007), The development of a two-tier multiple-choice diagnostic instrument for evaluating secondary school students' ability to describe and explain chemical reactions using multiple levels of representation, Chem. Educ. Res. Pract., 8(3), 293–307.
  10. Chiu M. H., (2007), A National Survey of Students' Conceptions of Chemistry in Taiwan, Int. J. Sci. Educ., 29, 421–452.
  11. Claesgens J., Scalise K., Wilson M. and Stacy A., (2009), Mapping student understanding in chemistry: the perspectives of chemists, Sci. Educ., 93(1), 56–85.
  12. Corcoran T. B., Mosher F. A. and Rogat A., (2009), Learning progressions in science: an evidence-based approach to reform, Philadelphia, PA: Consortium for Policy Research in Education.
  13. Devetak I., Vogrinc J. and Glazar S. A., (2009), Assessing 16-Year-Old Students' Understanding of Aqueous Solution at Submicroscopic Level, Res. Sci. Educ., 39(2), 157–179.
  14. Driver R. and Easley J., (1978), Pupils and paradigms: a review of literature related to concept development in adolescent science students, Stud. Sci. Educ., 5, 61–84.
  15. Duit R., (1993), Research on students' conceptions – developments and trends, in The Proceedings of the Third International Seminar on Misconceptions and Educational Strategies in Science and Mathematics, Ithaca, NY: Misconceptions Trust.
  16. Evans D. L., Gray G. L., Krause S., Martin J., Midkiff C., Notaros B. M., et al., (2003), Progress on concept inventory assessment tools, in Frontiers in Education (FIE 2003), 33rd Annual Conference, IEEE, vol. 1, pp. T4G-1.
  17. Goodwin A., (2002), Is salt melting when it dissolves in water? J. Chem. Educ., 79(3), 393–396.
  18. Hadenfeldt J. C., Bernholt S., Liu X., et al., (2013), Using ordered multiple-choice items to assess students' understanding of the structure and composition of matter, J. Chem. Educ., 90(12), 1602–1608.
  19. Hoe K. Y. and Subramaniam R., (2016), On the prevalence of alternative conceptions on acid–base chemistry among secondary students: insights from cognitive and confidence measures. Chem. Educ. Res. Pract., 17(2), 263–282.
  20. Johnstone A. H., Macdonald J. J. and Webb G., (1977), Misconceptions in school thermodynamics, Phys. Educ., 12(4), 248–251.
  21. Linacre J. M., (2011), A User's Guide to Winsteps: Rasch-Model Computer Programs, winsteps.com.
  22. Liu X., (2009), Using and Developing Measurement Instruments in Science Education: A Rasch Modeling Approach, Information Age Publishing, Inc.
  23. Liu X., (2012), Developing measurement instruments for science education research, Second international handbook of science education, Netherlands: Springer, pp. 651–665.
  24. Liu X. and Boone W. (ed.), (2006), Application of Rasch Measurement in Science Education, Maple Grove, MN: JAM Press.
  25. Loh A. S. L., Subramaniam R. and Tan K. C. D., (2014), Exploring students' understanding of electrochemical cells using an enhanced two-tier diagnostic instrument, Res. Sci. Technol. Educ., 32(3), 229–250.
  26. Luxford C. J. and Bretz S. L., (2014), Development of the Bonding Representations Inventory to identify student misconceptions about covalent and ionic bonding representations, J. Chem. Educ., 91(3), 312–320.
  27. McClary L. M. and Bretz S. L., (2012), Development and Assessment of a Diagnostic Tool to Identify Organic Chemistry Students' Alternative Conceptions Related to Acid Strength, Int. J. Sci. Educ., 34, 2317–2341.
  28. Mintzes J. J., Wandersee J. H. and Novak J. D., (1999), Assessing science understanding: a human constructivist view, San Diego, CA: Academic Press.
  29. Mulford D. R. and Robinson W. R., (2002), An Inventory for Alternate Conceptions among First-Semester General Chemistry Students, J. Chem. Educ., 79(6), 739–744.
  30. National Research Council, (2001), Knowing what students know: The science and design of educational assessment, Washington, DC: National Academy Press.
  31. Nusirjan and Fensham P., (1987), Descriptions and frameworks of solutions and reactions in solutions, Res. Sci. Educ., 17(1), 139–148.
  32. Ogude A. N. and Bradley, J. D., (1994), Ionic Conduction and Electrical Neutrality in Operating Electrochemical Cells, J. Chem. Educ., 71(1), 29–34.
  33. Potgieter M. and Davidowitz B., (2011), Preparedness for tertiary chemistry: multiple applications of the chemistry competence test for diagnostic and prediction purposes, Chem. Educ. Res. Pract., 12(2), 193–204.
  34. Tan K. C. D., Goh N. K., Chia L. S. and Treagust D. F., (2002), Development and Application of a Two-Tier Multiple Choice Diagnostic Instrument to Assess High School Students' Understanding of Inorganic Chemistry Qualitative Analysis, J. Res. Sci. Teach., 39, 283–301.
  35. The Ministry of Education of the People's Republic of China, (2003), High School Chemistry Curriculum Standards, Beijing: People's Education Press.
  36. Treagust D., (1986), Evaluating students' misconceptions by means of diagnostic multiple choice items. Res. Sci. Educ., 16(1), 199–207.
  37. Treagust D. F., (1988), Development and use of diagnostic tests to evaluate students' misconceptions in science, Int. J. Sci. Educ., 10(2), 159–169.
  38. Wei S., Liu X., Wang Z. and Wang X., (2012), Using Rasch Measurement To Develop a Computer Modeling-Based Instrument To Assess Students' Conceptual Understanding of Matter, J. Chem. Educ., 89(1), 335–345.
  39. Wilson M., (2008), Cognitive Diagnosis Using Item Response Models, J. Psychol., 216(2), 74–88.
