Advancing students’ scientific inquiry performance in chemistry through reading and evaluative reflection

Yu-Jan Tseng a, Zuway-R. Hong b and Huann-shyang Lin *bc
aInstitute of Education, National Sun Yat-sen University, 70 Lien Hai Road, Kaohsiung 804, Taiwan
bCentre for General Education, Kaohsiung Medical University, Kaohsiung, Taiwan
cCentre for General Education, National Sun Yat-sen University and Kaohsiung Medical University, 70 Lien Hai Road, Kaohsiung 804, Taiwan. E-mail: huannlin@mail.nsysu.edu.tw

Received 14th September 2021, Accepted 5th March 2022

First published on 8th March 2022


Abstract

Inspired by existing literature indicating that reading literacy is significantly associated with scientific reasoning and chemistry conceptual understanding, this quasi-experimental study explored the effectiveness of reflective reading of scientific articles on student inquiry performance. A total of 134 10th graders from southern Taiwan were divided into two Experimental Groups (EG) and a Comparison Group (CG). Both EG1 and EG2 students engaged in reading and discussing scientific articles and planning their own experiments. Evaluative reflection on their peers’ experimental designs was emphasized for EG1 students, while the recognition of variables in designing experimental procedures was emphasized for EG2 students. The CG students learned how to read and understand scientific articles without direct emphasis on their inquiry practices. The results revealed that EG students’ scientific inquiry performance in forming researchable questions and planning experimental procedures could be effectively improved by reading and reflecting on experimental design. Further analysis revealed that students with higher reflection ability performed better than those with lower reflection ability in the competency of experimental design applied to other contexts. Given that research on using reading in chemistry teaching is scarce, this study's pedagogical approach of reading with a focus on evaluative reflection provides an alternative teaching strategy for teachers who do not have enough time or laboratory equipment to provide their students with the opportunity to do hands-on experiments.


Introduction

Improving scientific literacy through scientific inquiry

The importance of scientific inquiry competency has been emphasized in national documents of science education (National Research Council, 2012) and international assessments of scientific literacy (OECD, 2016). The Programme for International Student Assessment (PISA) 2015 (OECD, 2016) defined evaluating and designing scientific inquiry as one of the three essential competencies for a scientifically literate person. This competency is “required to evaluate reports of scientific findings and investigations critically … [and is defined as] the ability to describe and appraise scientific investigations and propose ways of addressing questions scientifically” (p. 52). Students are expected to know the procedures and related science concepts for the design of an investigation, such as making hypotheses, forming research questions, understanding variables (i.e., control, independent, dependent), designing experiments, and collecting and presenting data. This study used the PISA 2015 assessment framework to assess student scientific inquiry performance.

The openness of scientific inquiry can be classified into confirmation inquiry, structured inquiry, guided inquiry, and open inquiry (Schwab, 1958; Bell et al., 2005; Akuma and Callaghan, 2019). These types of inquiry rely on different logics, sources of structure, procedural sequences, and lesson organization. Most past studies have concentrated on the data-interpretation phase and the engagement-in-argumentation phase (e.g., Bell et al., 2010; Chang et al., 2011). Few studies have focused on the competency of designing scientific inquiry experimental procedures (Rönnebeck et al., 2016). Experimental design in the existing literature can mostly be classified as either hands-on or computer-simulation based. In a traditional hands-on inquiry experiment, students carry out the experiment using the equipment and materials provided in a laboratory (Szalay et al., 2020). In contrast, experimental design in a computer simulation requires students to choose appropriate variables in the early stages of the simulation program (Mcelhaney and Linn, 2011).

Apart from these two ways of teaching experimental design, reading science articles has been used to promote students’ reasoning ability (De Waard et al., 2020), conceptual understanding (Nas et al., 2021), and scientific process skills (Tosun, 2019). The scarcity of studies focusing on the advancement of students’ scientific inquiry performance, especially on the sub-competencies of formulating researchable questions and designing experimental procedures, motivated the current study to explore the effectiveness of promoting students’ chemistry inquiry performance through reading and reflecting on contextual chemistry articles.

Evaluative reflection

After reviewing 32 inquiry-based learning articles, Pedaste and his colleagues (2015) concluded that communication and reflection are potentially present at all phases of inquiry-based learning. To ensure effective learning, students are encouraged to communicate, evaluate, and share their perspectives with classmates or team members when they engage in scientific inquiry. In this study, we put more emphasis on the cognitive practice of evaluation within communication and reflection. Therefore, evaluative reflection was practiced by students when they engaged in peer review and self-reflection.

Peer review is an effective learning approach that has been applied widely by teachers (Gaynor, 2020). Many studies have concluded that peer review effectively promotes students’ metacognitive skills (Topping, 1998; Nicol et al., 2014; Gupte et al., 2021). There is evidence that a majority of students gain some benefit even when they do not receive constructive feedback from their peers (Ketonen et al., 2020). Additionally, Gupte et al. (2021) pointed out that peer review improved students’ learning experiences in organic chemistry writing assignments. It appears that peer review motivates students to be involved in constructive interactions, personal reflections, and thoughtful revisions.

Self-reflection is an internal feedback process in which students reflect on their own thoughts through reading, deep thinking, and considering potential ways of revising. Previous findings indicated that students become more insightful through self-reflection (Orsmond et al., 2005; Steen-Utheim and Hopfenbeck, 2019). Self-reflection supports students in understanding their progression in learning science. Choi and Hand (2020) found that reflection played a significant role in inquiry learning. Sewry and Paphitis (2018) analysed students' reflections and showed that, through service-learning, students learned about society beyond the laboratory and applied their chemistry knowledge to the contexts of the local community.

The advantages of peer review, interaction, and reflection have encouraged us to integrate the above supportive practices as evaluative reflection when students are engaged in forming researchable questions and planning experimental procedures.

Forming researchable questions

Forming researchable questions is a key component of learning scientific inquiry (Tawfik et al., 2020). The practice of forming questions not only encourages students’ higher-order thinking skills (Hofstein et al., 2005) but also triggers students’ interest, motivation, and self-reflection on scientific inquiry (Huang et al., 2017). A quality research question is testable (Chang et al., 2011; Ebenezer et al., 2011), has the potential to increase understanding (Cavagnetto et al., 2010), and uses scientific evidence to generate new knowledge (Samarapungavan et al., 2011). Hofstein et al. (2019) pointed out that students who engage in scientific inquiry can pose more and higher-quality questions.

The reviewed literature indicates that previous studies have focused more on categorizing students' questions or on enhancing students' ability to form researchable questions through teaching interventions (Chang et al., 2011; Ebenezer et al., 2011; Kaya and Temiz, 2018; Kohen et al., 2020). Considering the benefits and characteristics of forming researchable questions, this study aims to analyse the quality of research questions formed by students and the development of their practices. In this study, a quality researchable question means that the student is capable of verifying the purpose of an experiment, pointing out the control variables of an experiment, and predicting the possible result of an experiment.

Experimental design

The practice of experimental design is a systematic thinking process of configuring the relationship among control, independent, and dependent variables (Pedaste et al., 2015). Students need to specify independent variables, measure dependent variables, and control confounding variables (Arnold et al., 2018). Measuring students’ experimental design ability is critical in science teaching (Vorholzer et al., 2020). Furthermore, confounding variables in students’ experimental designs are infrequently noticed. Arnold et al. (2021) suggested that controlling confounding variables can help students improve their experimental design capability. According to students’ statements in that study, controlling confounding variables enhances measurement accuracy and clarifies the relationship between the independent and dependent variables. Thus, how students identify dependent and independent variables, along with the quality of the experimental procedures in their designs, are the primary concerns of this study.

Enhancing student learning of science through reading

The role of reading in promoting student science learning has been emphasized (Yore et al., 2004; Fang and Wei, 2010; Norris and Phillips, 2015). Students’ reading ability is associated with their future participation in public issues related to science, health, and the environment (Trefil, 2008; Dori et al., 2018). A recent study showed a significant impact of students’ reading ability on their scientific reasoning performance (Lazonder et al., 2021). Additionally, Deng et al. (2019) claimed that students improved their competence in organic chemistry through reading.

Reading scientific articles is often challenging for high school students. An empirical eye-movement study showed that students with better reading skills spent more time on, and revisited more frequently, unfamiliar scientific terms while reading, whereas students with weaker reading skills tended to focus on common words (Yun, 2020). Neri et al. (2021) concluded that reading comprehension is crucial for students’ science performance. Existing literature has shown that high school students’ understanding of chemistry can be enhanced by reading modified scientific articles (Dori et al., 2018). In summary, adapted contextualization of scientific articles is a key element in promoting high school students’ learning of science through reading. The instruments and reading material for this study were developed based on suggestions from the above literature.

Theoretical framework of the study

The theoretical framework of this study was motivated by integrating the principles of social constructivism (Driver et al., 1996; Lee and Hutchison, 1998) and reflection (Lee and Hutchison, 1998). These science educators advocated that dynamic interactions and reflections support students’ learning in the process of knowledge creation and legitimization. It is therefore reasonable to assume that encouraging dynamic student–teacher and student–student interaction and personal reflection would benefit the development of students’ learning outcomes in scientific inquiry.

Research questions

The review of the literature highlights the potential impact of reading and reflection on student science learning. We are curious about the effectiveness of reading, and then reflecting on, a contextual scientific article on students’ inquiry performance. The specific research questions of this study are:

(1) What impacts, if any, do different reading strategies have on students’ scientific inquiry performance?

(2) How do students with high and low levels of evaluative reflection differ in forming researchable questions and designing experiments in chemistry?

Methodology

Participants

A total of 134 10th graders from four high school chemistry classes participated in this quasi-experimental study. The participants took a course named “Science Reading” that involved 6 weeks of instruction (2 hours per week). The four classes were randomly assigned as Experimental Group (EG) 1, comprising two classes (n = 66), EG2, comprising one class (n = 33), and a Comparison Group (CG; n = 35). Both EGs used the same reading material. All the students were from a typical senior high school located in a suburban area of Taiwan and came from families representing a broad range of socioeconomic backgrounds. The school authority randomly assigned students to each classroom. All of the students in the study consented to participate in the research and to have their work analysed and published. The three EG classes were taught by Teacher A, while the CG was taught by Teacher B. Both teachers had similar teaching experience.

Teaching interventions

Before the study commenced, students already had knowledge of basic gas properties (e.g., solubility, vapour pressure) and the gas laws. CG and EG students were taught to improve their scientific literacy but with different strategies. To match the students’ reading ability, the reading article was modified by two experts majoring in chemistry. This study was executed over 6 weeks (2 hours per week). The pre-test was administered in the first period of the study, and the post-test at the end of the intervention. The delayed post-test was administered 6 months later. Students spent about 50 minutes completing each test.
Teaching of the comparison group. Students in the CG were taught to read an article about air pollution. The article emphasized that scientists investigate the sources of air pollution based on simplified research models, and it discussed the effects of air pollutants of different particle sizes on the human body. Students were encouraged to answer the open-ended questions through small-group discussion, with the teacher clarifying ambiguities. After practising reading and making sense of this sample unit, students freely chose a science article on a different topic (provided by the teacher) and worked in small groups to prepare a group presentation for whole-class discussion.
First and second weeks.
Retrieve scientific information. The CG students read the article about air pollution after completing the pre-test. Students needed to retrieve scientific information and answer questions on their worksheet after group discussion (e.g., after discussing with your classmates, try to find six ways to reduce air pollution and choose the top three that you can achieve). Students’ answers had to be based on the article, and the teacher clarified students’ claims.
Third week.
Interpreting data scientifically. We provided data about air pollution in the reading material. Students were asked to generate an appropriate chart to interpret the relative contributions of different sources of pollution. The teacher supported students in ensuring that their scales and units were correct.
Fourth week.
Generate inferences. Students were encouraged to act as lawmakers and draft laws to prevent air pollution, providing scientific evidence to support their claims.
Fifth and sixth weeks.
Group reporting. Students chose another article provided by the teacher. During the presentation, students needed to summarize the article, generate a chart based on it, and interpret the chart. The teacher gave feedback after every presentation. All students were given the post-test after the group reports.
Teaching of experimental group 1. The students in EG1 were taught to read an article about using a microscope to observe bubbles in a soft drink. The article described how scientists observed and interpreted the bubbles in a soft drink and mentioned the formation, rising, and bursting of the bubbles. The reading material was adapted from Gérard Liger-Belair's research: an article introducing the scientific phenomena of champagne (Liger-Belair, 2015), a study of bubble dynamics in sparkling water (Liger-Belair et al., 2015), and a review article on bubble research (Liger-Belair, 2017). Because the students had no experience with some scientific instruments (e.g., 13C NMR or a stroboscope) or some scientific concepts (e.g., van der Waals forces), we added explanations to the reading material to help students grasp the challenging concepts. For example, we provided a description of Henry's Law: “The greater the pressure in the container, the more gas dissolved in water.” In addition, students’ evaluative reflections were emphasized during the 6 weeks of teaching.
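For reference only, the qualitative statement of Henry's Law quoted above corresponds to the conventional proportional form $c_{\mathrm{CO_2}} = k_{\mathrm{H}}\,p_{\mathrm{CO_2}}$, where $c_{\mathrm{CO_2}}$ is the equilibrium concentration of dissolved CO2, $p_{\mathrm{CO_2}}$ is the partial pressure of CO2 above the liquid, and $k_{\mathrm{H}}$ is the temperature-dependent Henry's law solubility constant. This equation was not part of the students’ reading material, and the symbols used here are the conventional ones rather than the article's.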
First week.
Self-experience on reading material. Students had four tasks to practice: (a) write an abstract after reading, (b) create a self-generated illustration for the article, (c) ask and answer two questions of their own based on the reading material, and (d) share a personal experience related to the reading material. The reading material described how a microscope was used to observe and interpret a common scientific phenomenon in daily life. Therefore, students were encouraged to explain phenomena scientifically, which is defined by OECD (2016) as an essential competency for scientific literacy. After completing the reading assignment, evaluative reflection was practiced by all students through self-questioning and answering the questions they raised themselves on the week-one learning sheet.
Second week.
Practicing divergent thinking. First, the teacher shared the pictorial illustrations of the reading assignment and the experiences that students generated during the first week. Students were asked to engage in evaluative reflection by selecting the 1–3 best pictorial illustrations and explaining why they selected them. Second, students started to think about the factors that cause bubbles to form in soft drinks and shared their ideas on the blackboard. Each student had to provide a factor that was different from the others. The teacher helped students clarify which factors could be evaluated by an experiment.
Third week.
Forming a researchable question. Every student in each discussion group had to indicate a purpose for their experiment. During the discussion, students shared their thoughts about the purpose and predicted a possible result for that purpose. The teacher helped students practice their evaluative thinking and clarified the statement of the relationship between the independent and dependent variables during the discussion (e.g., what are the reasons you think this factor is suitable for the experiment?). After the discussion, students were encouraged to select the best statement of the purpose of the bubble study.
Fourth week.
Designing an experiment. Based on the purpose they chose in the third week, students were asked to design an experiment to test their own hypothesis. During the design process, they were reminded to consider the relationships among the independent variable, the scale of the experiment, the control variables, and the dependent variable. The teacher pointed out the unclear parts of individual students’ experimental designs to motivate evaluative reflection on their own experimental design (e.g., Is your measurement of the dependent variable feasible and accurate?).
Fifth and sixth weeks.
Reporting and evaluative reflection. Students took turns presenting their experimental design and explaining its rationale. During the whole-class discussion, the rest of the students were encouraged to ask questions about the experiment or provide feedback to the presenter. To increase the level of student engagement in evaluative reflection, students were asked to write feedback on a learning sheet for every group's presentation of its research question and experimental design. The practice of evaluative reflection lasted for 4 hours. All students’ learning sheets were collected for coding of the evaluative reflection score. Finally, students were given the post-test in the last period of the sixth week.
Teaching of experimental group 2. The reading material and teaching interventions for EG2 were the same as for EG1, except that EG2 students were not asked to perform the evaluative reflection. Instead, they were encouraged to focus on recognizing the variables in the experimental design. For instance, the teacher asked students to point out the control variables during the experimental design phase (fourth week). During the reporting phase, students needed to identify the variables and the predicted result in each experimental design. Furthermore, both EGs completed the planning of an investigation but did not perform hands-on activities in the laboratory.

Instrument

The study is based on the assessment of students’ responses to a pre-test/post-test and delayed post-test. The details and validation of the instruments are described in this section.
Pre-test and post-test. The students’ scientific inquiry performance was measured using a pre-test and a post-test, each containing two sections. Each section had three open-ended questions that assessed whether students could form a researchable question and then design an experiment for their own research question. The two sections of this instrument are explained next.
Section 1:
“The bubbles in a soft drink are carbon dioxide. Scientists tried to figure out the position where the bubbles are generated, so they used a high-speed camera with a microscope lens aimed at the bottoms of the bubbles in the soft drink. They found that the nucleation sites are not the tiny, bumpy holes on the glass surface; because the radius of these tiny holes is too small, they cannot generate bubbles. Actually, the bubbles are generated from impurities such as cylindrical fibres left behind when the glass is wiped or deposited from dust in the air.”
After reading this paragraph, Tom designed the following experiment.
Equipment: 3 plastic cups (50 mL), sandpapers (#50, #200, #800; smaller number indicates the surface is rougher), a bottle of sparkling water (250 mL).
Steps:
1. Make a circle of sandpaper that can fit in the bottom of the cup.
2. Put the round sandpaper (#50, #200, #800) on the bottom of the cup.
3. Pour 30 mL sparkling water slowly into the cup.
4. Stand still for 5 minutes, then count the bubbles that stay on the sandpaper.
Question 1: What is the purpose that Tom wants to figure out in this experiment?
Question 2: What control variables were in this experiment?
Question 3: What result would Tom expect to find in this experiment?

In Section 1, students must first verify the purpose of the experiment. Then, according to the purpose of the investigation, they identify the control, dependent (i.e., measured), and independent (i.e., manipulated) variables in the experimental design. Finally, students are encouraged to predict the result of the experiment.

Section 2:
Here is one article describing the bubbles in a soft drink:
“… as the bubbles rise, the CO2 will diffuse through the surface of the bubbles into the bubbles, so the volumes and buoyancy of the bubbles will increase. Consequently, the rising speed of the bubbles will become faster, but the large molecules in the drink (e.g., sugars or proteins) will form molecular films that tend to slow down the bubbles …”
Terry thinks that if there is more sugar in the sparkling water, then the rising speed of the bubbles might be lower. To examine his thought, Terry prepared the following:
• Glasses (100 mL) • Stopwatch • Sparkling water
• Digital scale • Thermometer • Sugar
• Ruler • Distilled water • Brown sugar
Question 1: Based on Terry's thought, what data does he want to collect?
Question 2: According to the purpose, what are the independent variable and the dependent variable?
Question 3: Try to design your own experimental procedure.

Students were encouraged to design an experiment in Section 2. After reading the statements and Terry's hypothesis, students were asked to respond to three questions about collecting supportive data, identifying dependent and independent variables, and designing experimental procedures.

Delayed post-test. The delayed post-test was administered six months after the teaching intervention. It contained two scenarios similar to those of the pre-test and post-test; however, the theme of the delayed post-test was based on chemistry-related phenomena in daily life. Each section has three open-ended questions, as shown below.

Section 1:
Effervescent tablets are common in daily life. They are applied widely, for example as medicines, health supplements, or dental cleaners. When you drop a vitamin effervescent tablet into a glass of water, it starts to fizz and generates a lot of bubbles. After the tablet dissolves, the solution tastes like a soft drink with a fruity flavour. The main reason an effervescent tablet creates a lot of bubbles is an acid–base reaction. For instance, citric acid and baking soda are the main ingredients in a vitamin effervescent tablet. When the acid meets the baking soda in the water, they produce carbon dioxide. Therefore, we can see the bubbles during the dissolution.
After reading this paragraph, Rick designed the following experiment.
Equipment and chemicals:
• Plastic cups (100 mL) × 3 • Rice paper (a thin edible layer of starch, 6 cm × 6 cm) 3 pieces • Distilled water
• Citric acid 10 g • Salt 5 g
• Baking soda 10 g • Digital scale
• Timer
Steps:
1. Pour 50 mL water in each of the three cups.
2. Mix 3.0 g citric acid and 1.0 g baking soda in the rice paper to become tablet 1.
3. Mix 3.0 g citric acid and 2.0 g baking soda in the rice paper to become tablet 2.
4. Mix 3.0 g citric acid and 3.0 g baking soda in the rice paper to become tablet 3.
5. Put each tablet into a different cup of water; measure the time it takes each tablet to dissolve completely.
Question 1: What is the purpose that Rick wants to explore in this experiment?
Question 2: What are the control variables in this experiment?
Question 3: Please make a prediction about the result of this experiment.

Students are asked to verify the purpose of the experiment in the first question. They then point out the control variables based on the experimental design. Finally, students make an appropriate prediction of the experimental result.

Section 2:
On 4 August 2020, a huge explosion occurred due to a large amount of ammonium nitrate stored at the port of Beirut, the capital of Lebanon. After seeing the news, Julie remembers that she used a potassium nitrate (KNO3) solution as paint to draw on paper and that she ignited the drawn area with incense after it dried. She discovered that the burned area was larger than the drawn area. During this reaction, the potassium nitrate (KNO3) decomposed to potassium nitrite (KNO2) and oxygen (O2). The oxygen can support burning, so the flame burned beyond the drawn area.
Julie thought that the burned area would be larger with a higher concentration of potassium nitrate. To test her thought, she prepared the following experiment.
• 50 mL glasses × 3 • Potassium nitrate • Ruler (20 cm)
• Digital scale • Paint brush (0.5 cm width) • Copy paper (10 cm × 10 cm)
• Distilled water • Glass stick
Question 1: Based on Julie's thought, what data does she want to collect?
Question 2: According to the purpose, what are the independent variable and the dependent variable?
Question 3: Try to design experimental procedures for Julie.

Students were encouraged to design an experiment in Section 2. After reading the statements and Julie's hypothesis, students were asked to respond to the three questions about collecting supportive data, identifying dependent and independent variables, and designing experimental procedures.

Students’ answers to the questions of the pre-/post-test and the delayed post-test were coded individually by two experienced science teachers according to the scoring schemes shown in Table 1 and the Appendix (for the delayed post-test), respectively. The maximum score for each question is 2 points, so the maximum total score for each of the pre-test, post-test, and delayed post-test is 12 points. The total score represents a student's overall performance in scientific inquiry. Student performance on the competency of “forming researchable questions” is based on their scores for verifying the purpose of an experiment, pointing out the control variables, and predicting the result of the experiment. Their performance on the competency of “experimental design” is based on their scores for collecting supportive data, identifying dependent and independent variables, and designing experimental procedures. Spearman's rank correlation coefficient was used as the interrater reliability measure. The interrater reliability of the pre-test and post-test on the six questions ranged from 0.916 to 0.966 (p < 0.01); that of the delayed post-test ranged from 0.917 to 0.979 (p < 0.01). Samples of student answers for each category score are provided in Table 1 (pre- and post-tests) and the Appendix (delayed post-test).

Table 1 Examples for the scoring scheme of the pre-test and post-tests
Category Score Description Examples of student answers
1-1: Verifying the purpose of an experiment 0 Wrong answer or missing value “The position that bubbles generated.”
1 Only showing the variables but not mentioning their relationship “The relationship between the different sizes of dust and the numbers of bubbles that generated.”
2 Containing the relationship of the independent variable and dependent variable “Are there more bubbles on the rougher sandpapers?”
1-2: Pointing out the control variable of an experiment 0 Wrong answer or missing value “The number of sandpapers. The volume of sparkling water.”
1 Minor control variables “Plastic cups, sparkling water.”
2 Major control variables “(1) 30 mL of sparkling water. (2) Stand still for 5 minutes.”
1-3: Predicting the result of the experiment 0 Wrong answer or missing value “The nucleation site is not on the surface of sandpaper.”
1 Incomplete prediction “The rougher the sandpaper, the more air bubbles will be generated and the faster.”
2 Correct prediction “More bubbles on the rougher sandpaper.”
2-1: Collecting supportive data 0 Wrong answer or missing value “The speed of bubbles is affected by the portion of sugar.”
1 Partially correct answer “The concentration of the sugar solution”
2 Correct answer “Different weight of sugar in the sparkling water. The time for the bubbles to float up at the same distance.”
2-2: Identifying dependent and independent variables, respectively 0 Wrong answer or missing value “Independent variable: temperature, sparkling water. Dependent variable: different kind of sugar.”
1 Incomplete answer “Independent variable: the amount of sugar. Dependent variable: Types of sugar.”
2 Correctly indicating the independent variable and dependent variable “Independent variable: The amount of sugar. Dependent variable: The rising speed of bubbles.”
2-3: Designing experimental procedures 0 Wrong answer or missing value “Adding 50 mL sparkling water into the glass. Using scales to weigh 10 grams of sugar and checking the temperature.”
1 Short of the control variables or incomplete statement of data collection “Make the solutions that have different concentrations by using water and sugar. Add sparkling water in the solution. Measure the time that bubbles take rising to the same height.”
2 Appropriate experimental design “Step 1. Using three identical glasses and pouring in equal amounts of sparkling water. Step 2. Preparing different concentration of sugar solution and pouring into sparkling water. Step 3. Measuring the rising speed of the bubbles.”
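To make the score aggregation described before Table 1 concrete, the following minimal sketch (with hypothetical scores for a single student; the category labels follow Table 1) shows how the six question scores combine into the two sub-competency scores and the total inquiry score out of 12.

# Hypothetical question scores (0-2 each) for one student, keyed by the Table 1 categories
scores = {"1-1": 2, "1-2": 1, "1-3": 2, "2-1": 1, "2-2": 2, "2-3": 1}

# Sub-competency scores (maximum 6 points each) and the overall inquiry score (maximum 12)
forming_researchable_questions = scores["1-1"] + scores["1-2"] + scores["1-3"]
experimental_design = scores["2-1"] + scores["2-2"] + scores["2-3"]
total_inquiry_score = forming_researchable_questions + experimental_design
print(forming_researchable_questions, experimental_design, total_inquiry_score)  # 5 4 9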


Validation of the pre-test, post-test, and delayed post-test. The content validity of the instruments used in the pre-test, post-test, and delayed post-test was ensured by two experts who are seed teachers majoring in chemistry and science education. The construct validity of the instruments was examined following the established procedure of “comparing the scores of known groups” in a pilot study (Gronlund, 1985). A separate sample of students was selected from two schools for the pilot test. Based on students’ performance in the high school entrance exam, the high-achieving group was selected from a top-ranked school and the low-achieving group from a contrasting school. For the instruments used in the pre-test and post-test as well as the delayed post-test, Mann–Whitney tests indicated that the performance of the high-achieving group was significantly (p < 0.001) better than that of the low-achieving group.
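As an illustration only, a known-groups comparison of this kind can be carried out with a nonparametric Mann–Whitney U test, as sketched below in Python; the score vectors are hypothetical placeholders, not the pilot data of this study.

# Known-groups validity check with a Mann-Whitney U test (hypothetical scores, not the pilot data)
from scipy.stats import mannwhitneyu

high_achieving = [10, 11, 9, 12, 8, 10, 11, 9]   # total inquiry scores, top-ranked school
low_achieving = [4, 5, 6, 3, 5, 7, 4, 6]         # total inquiry scores, contrasting school

u_stat, p_value = mannwhitneyu(high_achieving, low_achieving, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")   # a small p value supports construct validity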

Data analysis

Scoring for the pre-test, post-test, and delayed post-test. All the data were scored by two trained teachers. At the beginning of coding, the two teachers discussed the scoring schemes and practised them on the same five students’ answer sheets. They then coded the remaining answers individually. If their scores for a student differed by more than 1 point, a discussion followed to reach consensus. Finally, Spearman's rank correlation coefficient was analysed as the interrater reliability measure; a p value smaller than 0.05 indicates a statistically significant correlation between the two raters’ scores. The scoring procedure is consistent with the recommendation of providing a reliability measure and reaching complete consensus through negotiated agreement (Watts and Finkenstaedt-Quinn, 2021).
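For readers who wish to reproduce this kind of interrater check, the sketch below computes Spearman's rank correlation between two raters' scores for the same set of answers; the rater score vectors are hypothetical and are not taken from the study's data.

# Interrater reliability via Spearman's rank correlation (hypothetical rater scores)
from scipy.stats import spearmanr

rater_a = [2, 1, 0, 2, 1, 2, 0, 1, 2, 1]   # scores (0-2) from the first rater
rater_b = [2, 1, 1, 2, 1, 2, 0, 1, 2, 0]   # scores (0-2) from the second rater on the same answers

rho, p_value = spearmanr(rater_a, rater_b)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")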

Students’ answers to the test questions were coded individually by two experienced science teachers based on the above scoring scheme. In addition, we used analysis of covariance (ANCOVA) to examine whether both EGs improved more than the CG. Howell (2012) suggested that ANCOVA can be used to adjust for initial differences and to reflect the treatment's impact on the dependent variable; this adjustment can clarify the treatment effect in an experimental study. The pre-test score served as the covariate, while the post-test and delayed post-test scores were examined as the dependent variables, respectively, to determine whether there were significant immediate and lasting differences among the three groups. The key assumptions underlying the use of ANCOVA (e.g., linearity of regression, normality of error terms, and homogeneity of regression slopes) were confirmed during the statistical analyses.
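A minimal sketch of this ANCOVA, assuming a hypothetical data frame with columns pre, post, and group (none of which are the study's data set), is given below using statsmodels; it also includes the homogeneity-of-regression-slopes check mentioned above.

# ANCOVA sketch: post-test score adjusted for pre-test, compared across the three groups
# (hypothetical data and column names, not the study's data set)
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "group": ["CG"] * 4 + ["EG1"] * 4 + ["EG2"] * 4,
    "pre":   [2, 3, 2, 4, 4, 5, 3, 4, 5, 4, 6, 3],
    "post":  [3, 3, 2, 4, 6, 7, 5, 6, 7, 6, 8, 7],
})

# Pre-test as covariate, group as a categorical factor; the F test on C(group) is the adjusted group effect
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(anova_lm(model, typ=2))

# Homogeneity of regression slopes: the pre x group interaction should be non-significant
interaction = smf.ols("post ~ pre * C(group)", data=df).fit()
print(anova_lm(interaction, typ=2))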

Evaluative reflection score. As mentioned earlier, students’ learning sheets of evaluative reflection were collected and coded by two trained chemistry teachers to assess their evaluative reflection score. The coding guide was as follows: 2 points for pointing out a specific problem and giving a constructive suggestion on the experimental design (e.g., “Based on their purpose, I think the volume of soft drink has to be the same.”); 1 point for a reasonable but incomplete comment on an experimental design (e.g., “Are the cups the same size?”); and 0 points for an irrelevant recommendation or an emotional reply. The maximum total score for each student was 16 points. Spearman's ρ indicated that the interrater reliability of the two coders was 0.95 (p < 0.01).

Results

Research Question 1: What impacts, if any, do different reading strategies have on students’ scientific inquiry performance?

Considering the impacts of evaluative reflection (ER) on students’ scientific inquiry performance, the students’ results on the pre-test, post-test, and delayed post-test are presented in Table 2. Students in both EGs had higher mean scores than those in the CG on the pre-test. A one-way analysis of variance with Scheffé post hoc comparisons revealed that both EGs’ pre-test performances were significantly better than that of the CG (p < 0.05), while there was no significant difference between EG1 and EG2 (p > 0.05). Given that ANCOVA can adjust for pre-existing differences among groups, it was used to compare the post-test and delayed post-test performances of the three groups. Table 2 shows the mean scores of the pre-, post-, and delayed post-tests along with the ANCOVA results for the three groups. Both EGs performed significantly (p < 0.05) better than the CG on the post-test and delayed post-test. However, there was no significant difference between the EGs on the post-test (p > 0.05) or the delayed post-test (p > 0.05).
Table 2 Students’ scientific inquiry performance between pre-test, post-test, and delayed post-test
Group Pre-test Unadjusted post-test Adjusted post-test Unadjusted delayed post-test Adjusted delayed post-test
Numbers in parentheses are standard deviations. α indicates a significant difference (p < 0.05) compared to the comparison group.
Comparison 2.69 (1.86) 2.80 (2.05) 3.42 (0.35) 3.39 (2.65) 3.82 (0.53)
Experimental 1 4.11 (2.47)α 5.96 (2.70) 5.82 (0.25)α 6.93 (2.43) 6.82 (0.35)α
Experimental 2 4.53 (2.56)α 7.00 (1.86) 6.63 (0.35)α 8.12 (3.42) 7.93 (0.50)α


Students’ results for forming researchable questions and experimental design on the pre-test, post-test, and delayed post-test are shown in Table 3. Concerning the performance of forming researchable questions, the CG's post-test and delayed post-test performances were significantly (p < 0.05) lower than those of the two EGs. On the post-test there was a significant difference between EG1 and EG2 (p = 0.02), but no significant difference was observed on the delayed post-test (p = 0.15).

Table 3 Students’ performance on forming researchable questions and experimental design
Group Pre-test Unadjusted post-test Adjusted post-test Unadjusted delayed post-test Adjusted delayed post-test
Numbers in parentheses are standard deviations. α indicates a significant difference (p < 0.05) compared to the comparison group. β indicates a significant difference (p < 0.05) between the two experimental groups.
Forming researchable questions
Comparison 1.47 (1.24) 1.50 (0.37) 1.77 (0.22) 1.93 (1.82) 2.10 (0.43)
Experimental 1 2.22 (1.53) 3.69 (1.60) 3.63 (0.16)α 3.75 (1.57) 3.71 (0.29)α
Experimental 2 1.42 (1.76) 4.44 (1.15) 4.28 (0.22)α,β 4.50 (3.25) 4.42 (0.40)α
Experimental design
Comparison 1.21 (1.28) 1.30 (1.12) 1.53 (0.23) 1.46 (1.41) 1.70 (0.29)
Experimental 1 1.89 (1.40) 2.27 (1.66) 2.22 (0.16)α 3.18 (1.63) 3.12 (0.20)α
Experimental 2 2.11 (1.53) 2.56 (1.22) 2.42 (0.23)α 3.62 (1.66) 3.51 (0.28)α


Regarding students’ performance on experimental design, the ANCOVA results show that both EGs performed significantly (p < 0.05) better than the CG. However, there was no significant difference between the two EGs on the post-test (p = 0.47) or the delayed post-test (p = 0.25).

Research Question 2: How do students with high and low levels of evaluative reflection differ in forming researchable questions and designing experiments in chemistry?

In order to examine the relationship between students’ ER and scientific inquiry performance, the EG1 students who exercised ER (N = 59, M = 7.43, SD = 1.72, median = 8.00) were divided into high- and low-ER groups according to the median ER score. The key assumptions underlying the use of an independent t-test (e.g., normal distributions and similar variances in both groups) were confirmed. The independent t-test results shown in Table 4 revealed no significant difference between the high-ER and low-ER students in the delayed post-test performance of forming a researchable question (t = 1.43, p = 0.16), whereas there was a significant difference between the two groups in the performance of experimental design (t = 2.41, p < 0.05).
Table 4 Independent sample t-test results of high and low evaluative reflection students’ performance in the delayed post-test
Competency ER score n M SD t Sig. (2-tailed)
*p < 0.05.
Forming researchable questions High 31 4.11 1.58 1.43 0.16
Low 22 3.52 1.34
Experimental design* High 31 3.61 1.36 2.41* 0.02
Low 22 2.55 1.86
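A minimal sketch of the median split and independent-samples t-test reported above, assuming hypothetical ER and delayed post-test experimental-design scores (not the EG1 data), is shown below.

# Median split on evaluative reflection (ER) scores, then an independent t-test on
# delayed post-test experimental-design scores (hypothetical data, not the EG1 sample)
import numpy as np
from scipy.stats import ttest_ind

er_scores = np.array([9, 6, 8, 10, 5, 7, 11, 4, 8, 9, 6, 12])
design_scores = np.array([4, 2, 3, 5, 1, 3, 5, 2, 4, 4, 2, 6])

median_er = np.median(er_scores)
high = design_scores[er_scores > median_er]    # high-ER group (tie handling is an assumption here)
low = design_scores[er_scores <= median_er]    # low-ER group

t_stat, p_value = ttest_ind(high, low, equal_var=True)   # Student's t-test with equal variances assumed
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")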


Discussion

This quasi-experimental study investigated the effectiveness of three teaching approaches implementing the reading of contextual science articles on students’ performance in scientific inquiry, formulating researchable questions, and designing experimental procedures. After the 6-week teaching intervention, our findings (Tables 2 and 3) reveal that the students of the two experimental groups taught through focused reading outperformed their counterparts in the competency of scientific inquiry and the sub-competencies of forming researchable questions and experimental design. In addition, the high-ER students outperformed the low-ER students on experimental design. The major findings are discussed in the following subsections.

The teaching approach of focused reading supports students’ scientific inquiry competency

The challenges of teaching scientific inquiry have been documented (Akuma and Callaghan, 2019), and various teaching approaches to promoting student scientific inquiry have been advocated in the science education community (Fang, 2021). In this study we explored the effectiveness of engaging students in reading scientific articles. The ANCOVA results for the post-test and delayed post-test revealed that students in the two experimental groups, who read scientific articles explaining scientific investigation procedures, outperformed those who read articles focused on presenting modelling data and evidence of scientific findings. Existing literature indicated that personal hands-on experiences of inquiry could be used as a way of enhancing learners’ understanding about the nature of inquiry (Van Brederode et al., 2020; Zion et al., 2020). In this study, we explored a teaching strategy of reading scientific articles in which students were encouraged to plan their own inquiry investigation and provide feedback on their peers’ experimental designs. Even though students were not engaged in practical hands-on laboratory activities, their competency in forming researchable questions and designing experimental procedures improved significantly.

The level of students’ engagement in evaluative reflection is associated with their performance in designing experimental procedures

The benefits of students’ ER in the learning of argumentation and in portfolio assessment design have been identified (Steen-Utheim and Hopfenbeck, 2019; Choi and Hand, 2020). Researchers have recommended that ER be implemented in science teaching (Newton and Tonelli, 2020; Ibourk and Kendrick, 2021). However, within the literature on reflection, few studies have focused on the higher-order cognitive skill of experimental design. The results of this study revealed that students with high-level ER performed significantly better in experimental design than those with lower levels of reflection engagement.

Limitations and further studies

The quasi-experimental design of this study allows us to confirm the effectiveness of using science articles as reading material for students to reflectively practise scientific inquiry competency. However, readers are reminded that the EG reading material about observing bubbles is closely related to the content of Sections 1 and 2 of the pre- and post-test. Although the reading material itself did not provide the opportunity to practise scientific inquiry, the similarity of the reading material's focus on observing bubbles might be a confounding variable explaining the differences in learning outcomes between the EG and CG students. In addition, given that the two experimental groups and the comparison group were taught by two different teachers, the teacher effect should be identified as a possible limitation or confounding variable of the study.

There are additional limitations of this exploratory study. The first limitation is the duration of the teaching intervention: the participants were actively engaged in the experimental approach for only six weeks (2 hours per week). More distinctive results may emerge if this teaching approach is implemented for a full semester. Secondly, students’ level of reading literacy was not considered when teaching or when assessing learning outcomes; it is worth exploring how students’ reading literacy affects their ER on experimental design. Thirdly, ER occurred not only in the students’ written feedback but also while they were designing their experimental procedures. Understanding how students’ ER influences their experimental design is valuable to explore in the future. Finally, we regard this study as a starting point for exploring the impact of different pedagogical approaches involving explicit reading materials and strategies on student scientific inquiry performance. Further studies are encouraged to explore the causal relationship between higher-order cognitive thinking skills such as convergent thinking (i.e., requiring the practice of integrating information and generating abstract categories) and the scientific competency of interpreting data and evidence scientifically.

Implications and contributions

With the use of a quasi-experimental design and analysis of covariance, the findings of this study add value to the literature and have implications for frontline chemistry teachers. Given that the existing literature indicates that the competencies of forming researchable questions and experimental design are evidently lacking even among university students (Mistry and Gorman, 2020), this study is important in that it enables us to better understand how pedagogical approaches of reading and reflecting on scientific articles can be implemented in classroom teaching for the purpose of promoting students’ competency in formulating researchable questions and experimental design. Our findings highlight the importance of including critical components in reading materials and add value to the existing literature by providing an alternative pedagogical approach that builds on previous research positing the benefits of reading for students’ science learning outcomes. We believe that the reading materials and teaching approaches described in this study (e.g., forming a researchable question, designing an experimental procedure, and practising evaluative reflection for the purpose of elaborating one's own inquiry procedures) are likely to enrich chemistry teachers’ inquiry-based science teaching practices.

Conclusions

This study sought to better understand how students’ scientific inquiry performance could be supported by reading science articles. Our investigations of student reading practices and learning interactions revealed that engaging students in reading and discussing contextual scientific articles, with a focus on explaining investigation procedures, supports the advancement of students’ inquiry competencies of forming research questions and designing experimental procedures. Students who exhibited a high level of evaluative reflection (i.e., pointing out specific flaws and providing constructive suggestions regarding their peers’ experimental designs) outperformed their counterparts on scientific inquiry performance. Given that research on using reading in chemistry teaching is scarce, this study's pedagogical approach of reading with a focus on evaluative reflection provides an alternative teaching strategy for teachers who do not have enough time or laboratory equipment to provide their students with the opportunity to do hands-on experiments. Further studies investigating the relationship between students’ reading literacy and their scientific inquiry competency are important, as both are essential for 21st-century citizens (OECD, 2016).

Conflicts of interest

There are no conflicts to declare.

Appendix: Examples for the scoring scheme of the delayed post-test

Category Score Description Examples of student answers
1-1: Verifying the purpose of an experiment 0 Wrong answer or missing value Producing CO2.
1 Only showing the variables but not mentioning their relationship Does NaHCO3 affect the number of bubbles.
2 Containing the relationship of the independent variable and dependent variable The relationship between the concentration of NaHCO3 and the dissolving time of the effervescent tablet.
1-2: Pointing out the control variable of an experiment 0 Wrong answer or missing value Sodium bicarbonate
1 Minor control variables Plastic cups, citric acid, rice paper, water
2 Major control variables Each has 50 mL water in the cups, 3.0 grams of citric acid, same rice paper
1-3: Predicting the result of the experiment 0 Wrong answer or missing value The time the effervescent tablet dissolved
1 Incomplete prediction Larger amount of baking soda has a faster reaction
2 Correct prediction Dissolving time: Tablet 3 > Tablet 2 > Tablet 1
2-1: Collecting supportive data 0 Wrong answer or missing value The relationship between the concentration of KNO3 and the area that burned
1 Partially correct answer The concentration of KNO3 and the oxygen that it generated
2 Correct answer The concentration of KNO3. Use ruler to measure the difference area between drawn and burned
2-2: Identifying dependent and independent variables, respectively 0 Wrong answer or missing value Independent variable: (blank). Dependent variable: (blank)
1 Incomplete answer Independent variable: different concentration of KNO3. Dependent variable: higher concentration of KNO3, bigger area that burned
2 Correctly indicating the independent variable and dependent variable Independent variable: different concentration of KNO3. Dependent variable: The area that burned by different concentrations of KNO3
2-3: Designing experimental procedures 0 Wrong answer or missing value Drawing on the copy paper by using the KNO3 solution and drying it
1 Short of the control variables or incomplete statement of data collection Preparing different concentrations of the KNO3 solution. Painting the solution on the paper and burning it
2 Appropriate experimental design 1. Pour 30 mL water in three glasses
2. Add 5 g KNO3 to the first glass (Solution A), 10 g to the second glass (Solution B), 15 g to the third glass (Solution C)
3. Draw a square (5 cm × 5 cm) on a copy paper by using a brush pen containing Solutions A, B and C, respectively
4. Ignite the paper by incense after drying
5. Record the area that burned

Acknowledgements

This study was supported by the Ministry of Science and Technology, Taiwan, under grant numbers MOST 109-2511-H-110-013 and MOST 109-2511-H-110-005-MY3. This study fulfilled the technical requirements necessary to demonstrate the use of ethical procedures in research with human participants. The authors are thankful to Prof. Larry Yore for his insightful comments and to Editor Shari Yore for her professional editing.

References

  1. Akuma F. V. and Callaghan R., (2019), A systematic review characterizing and clarifying intrinsic teaching challenges linked to inquiry-based practical work, J. Res. Sci. Teach., 56, 619–648.
  2. Arnold J. C., Boone W. J., Kremer K. and Mayer J., (2018), Assessment of competencies in scientific inquiry through the application of Rasch measurement techniques, Educ. Sci., 8, 184.
  3. Arnold J. C., Mühling A. and Kremer K., (2021), Exploring core ideas of procedural understanding in scientific inquiry using educational data mining, Res. Sci. Technol. Educ., 1–21.
  4. Bell R. L., Smetana L. and Binns I., (2005), Simplifying inquiry instruction, Sci. Teach., 72, 30–33.
  5. Bell T., Urhahne D., Schanze S. and Ploetzner R., (2010), Collaborative inquiry learning: Models, tools, and challenges, Int. J. Sci. Educ., 32, 349–377.
  6. Cavagnetto A., Hand B. M. and Norton-Meier L., (2010), The nature of elementary student science discourse in the context of the science writing heuristic approach, Int. J. Sci. Educ., 32, 427–449.
  7. Chang H.-P., Chen C.-C., Guo G.-J., Cheng Y.-J., Lin C.-Y. and Jen T.-H., (2011), The development of a competence scale for learning science: Inquiry and communication, Int. J. Sci. Math. Educ., 9, 1213–1233.
  8. Choi A. and Hand B., (2020), Students' construct and critique of claims and evidence through online asynchronous discussion combined with in-class discussion, Int. J. Sci. Math. Educ., 18, 1023–1040.
  9. Deng Y., Kelly G. J. and Deng S. L., (2019), The influences of integrating reading, peer evaluation, and discussion on undergraduate students' scientific writing, Int. J. Sci. Educ., 41, 1408–1433.
  10. De Waard E. F., Prins G. T. and Van Joolingen W. R., (2020), Pre-university students' perceptions about the life cycle of bioplastics and fossil-based plastics, Chem. Educ. Res. Pract., 21, 908–921.
  11. Dori Y. J., Avargil S., Kohen Z. and Saar L., (2018), Context-based learning and metacognitive prompts for enhancing scientific text comprehension, Int. J. Sci. Educ., 40, 1198–1220.
  12. Driver R., Leach J., Millar R. and Scott P., (1996), Young People's Images of Science, Open University Press.
  13. Ebenezer J., Kaya O. N. and Ebenezer D. L., (2011), Engaging students in environmental research projects: Perceptions of fluency with innovative technologies and levels of scientific inquiry abilities, J. Res. Sci. Teach., 48, 94–116.
  14. Fang S. C., (2021), Towards scientific inquiry in secondary earth science classrooms: Opportunities and realities, Int. J. Sci. Math. Educ., 19, 771–792.
  15. Fang Z. H. and Wei Y. H., (2010), Improving middle school students' science literacy through reading infusion, J. Educ. Res., 103, 262–273.
  16. Gaynor J. W., (2020), Peer review in the classroom: Student perceptions, peer feedback quality and the role of assessment, Assess. Eval. High. Educ., 45, 758–775.
  17. Gronlund N. E., (1985), Measurement and Evaluation in Teaching, New York: Macmillan.
  18. Gupte T., Watts F. M., Schmidt-Mccormack J. A., Zaimi I., Gere A. R. and Shultz G. V., (2021), Students' meaningful learning experiences from participating in organic chemistry writing-to-learn activities, Chem. Educ. Res. Pract., 22, 396–414.
  19. Hofstein A., Navon O., Kipnis M. and Mamlok-Naaman R., (2005), Developing students' ability to ask more and better questions resulting from inquiry-type chemistry laboratories, J. Res. Sci. Teach., 42, 791–806.
  20. Hofstein A., Dkeidek I., Katchevitch D., Nahum T. L., Kipnis M., Navon O., Shore R., Taitelbaum D. and Mamlok-Naaman R., (2019), Research on and development of inquiry-type chemistry laboratories in Israel, Isr. J. Chem., 59, 514–523.
  21. Howell D. C., (2012), Statistical methods for psychology, Cengage Learning.
  22. Huang X., Lederman N. G. and Cai C. J., (2017), Improving Chinese junior high school students' ability to ask critical questions, J. Res. Sci. Teach., 54, 963–987.
  23. Ibourk A. and Kendrick M., (2021), Elementary students' explanation of variation of traits and teacher's feedback using an online embedded assessment tool, Int. J. Sci. Educ., 43, 1173–1192.
  24. Kaya S. and Temiz M., (2018), Improving the quality of student questions in primary science classrooms, J. Balt. Sci. Educ., 17, 800–811.
  25. Ketonen L., Hahkioniemi M., Nieminen P. and Viiri J., (2020), Pathways through peer assessment: Implementing peer assessment in a lower secondary physics classroom, Int. J. Sci. Math. Educ., 18, 1465–1484.
  26. Kohen Z., Herscovitz O. and Dori Y. J., (2020), How to promote chemical literacy? On-line question posing and communicating with scientists, Chem. Educ. Res. Pract., 21, 250–266.
  27. Lazonder A. W., Janssen N., Gijlers H. and Walraven A., (2021), Patterns of development in children's scientific reasoning: Results from a three-year longitudinal study, J. Cogn. Dev., 22, 108–124.
  28. Lee A. Y. and Hutchison L., (1998), Improving learning from examples through reflection, J. Exp. Psychol.: Appl., 4, 187–210.
  29. Liger-Belair G., (2015), Six secrets of champagne, Phys. World, 28, 26–30.
  30. Liger-Belair G., (2017), Effervescence in champagne and sparkling wines: From grape harvest to bubble rise, Eur. Phys. J. Spec. Top., 226, 3–116.
  31. Liger-Belair G., Sternenberg F., Brunner S., Robillard B. and Cilindre C., (2015), Bubble dynamics in various commercial sparkling bottled waters, J. Food Eng., 163, 60–70.
  32. Mcelhaney K. W. and Linn M. C., (2011), Investigations of a complex, realistic task: Intentional, unsystematic, and exhaustive experimenters, J. Res. Sci. Teach., 48, 745–770.
  33. Mistry N. and Gorman S. G., (2020), What laboratory skills do students think they possess at the start of university? Chem. Educ. Res. Pract., 21, 823–838.
  34. Nas S. E., Akbulut H. İ., Calik M. and Emir M. İ., (2021), Facilitating conceptual growth of the mainstreamed students with learning disabilities via a science experimental guidebook: A case of physical events, Int. J. Sci. Math. Educ.
  35. National Research Council, (2012), Framework for science education, Washington DC: National Academy of Science.
  36. Neri N. C., Guill K. and Retelsdorf J., (2021), Language in science performance: Do good readers perform better? Eur. J. Psychol. Educ., 36, 45–61.
  37. Newton X. A. and Tonelli E. P., (2020), Building undergraduate STEM majors' capacity for delivering inquiry-based mathematics and science lessons: An exploratory evaluation study, Stud. Educ. Eval., 64, 12.
  38. Nicol D., Thomson A. and Breslin C., (2014), Rethinking feedback practices in higher education: A peer review perspective, Assess. Eval. High. Educ., 39, 102–122.
  39. Norris S. and Phillips L. M., (2015), Scientific literacy: Its relationship to “Literacy”, in Gunstone R. (ed.), Encyclopedia of Science Education, Dordrecht: Springer Netherlands.
  40. OECD, (2016), PISA 2015 Results (volume I): excellence and equity in education, Paris: PISA, OECD Publishing.
  41. Orsmond P., Merry S. and Reiling K., (2005), Biology students’ utilization of tutors’ formative feedback: A qualitative interview study, Assess. Eval. High. Educ., 30, 369–386.
  42. Pedaste M., Maeots M., Siiman L. A., De Jong T., Van Riesen S. A. N., Kamp E. T., Manoli C. C., Zacharia Z. C. and Tsourlidaki E., (2015), Phases of inquiry-based learning: Definitions and the inquiry cycle, Educ. Res. Rev., 14, 47–61.
  43. Rönnebeck S., Bernholt S. and Ropohl M., (2016), Searching for a common ground – A literature review of empirical research on scientific inquiry activities, Stud. Sci. Educ., 52, 161–197.
  44. Samarapungavan A., Patrick H. and Mantzicopoulos P., (2011), What kindergarten students learn in inquiry-based science classrooms, Cognit. Instruct., 29, 416–470.
  45. Schwab J. J., (1958), The teaching of science as inquiry, Bull. Atom. Sci., 14, 374–379.
  46. Sewry J. D. and Paphitis S. A., (2018), Meeting important educational goals for chemistry through service-learning, Chem. Educ. Res. Pract., 19, 973–982.
  47. Steen-Utheim A. and Hopfenbeck T. N., (2019), To do or not to do with feedback. A study of undergraduate students' engagement and use of feedback within a portfolio assessment design, Assess. Eval. High. Educ., 44, 80–96.
  48. Szalay L., Tóth Z. and Kiss E., (2020), Introducing students to experimental design skills, Chem. Educ. Res. Pract., 21, 331–356.
  49. Tawfik A. A., Graesser A., Gatewood J. and Gishbaugher J., (2020), Role of questions in inquiry-based instruction: Towards a design taxonomy for question-asking and implications for design, ETR&D-Educ. Tech. Res. Dev., 68, 653–678.
  50. Topping K., (1998), Peer assessment between students in colleges and universities, Rev. Educ. Res., 68, 249–276.
  51. Tosun C., (2019), Scientific process skills test development within the topic “Matter and its Nature” and the predictive effect of different variables on 7th and 8th grade students' scientific process skill levels, Chem. Educ. Res. Pract., 20, 160–174.
  52. Trefil J., (2008), Why science? Teachers College Press.
  53. Van Brederode M. E., Zoon S. A. and Meeter M., (2020), Examining the effect of lab instructions on students' critical thinking during a chemical inquiry practical, Chem. Educ. Res. Pract., 21, 1173–1182.
  54. Vorholzer A., Von Aufschnaiter C. and Boone W. J., (2020), Fostering upper secondary students' ability to engage in practices of scientific investigation: A comparative analysis of an explicit and an implicit instructional approach, Res. Sci. Educ., 50, 333–359.
  55. Watts F. M. and Finkenstaedt-Quinn S. A., (2021), The current state of methods for establishing reliability in qualitative chemistry education research articles, Chem. Educ. Res. Pract., 22, 565–578.
  56. Yore L. D., Hand B., Goldman S. R., Hildebrand G. M., Osborne J. F., Treagust D. F. and Wallace C. S., (2004), New directions in language and science education research, Read. Res. Q., 347–352.
  57. Yun E., (2020), Comparing the reading behaviours of students with high- and low-level comprehension of scientific terms by eye movement analysis, Res. Sci. Educ.
  58. Zion M., Schwartz R. S., Rimerman-Shmueli E. and Adler I., (2020), Supporting teachers' understanding of nature of science and inquiry through personal experience and perception of inquiry as a dynamic process, Res. Sci. Educ., 50, 1281–1304.
