Step by step learning using the I diagram in the systematic qualitative analyses of cations within a guided inquiry learning approach

Nalan Akkuzu * and Melis Arzu Uyulgan
Department of Mathematics and Science Education, Dokuz Eylul University, Izmir, Turkey. E-mail: nalan.akkuzu@gmail.com

Received 20th March 2017, Accepted 10th May 2017

First published on 10th May 2017


Abstract

The current study examines the performance and achievement of students in the Systematic Qualitative Analyses of Cations (SQACs). We sought answers to questions such as, ‘What are the students’ levels of performance?’ and ‘What is the relation between the average scores for performance and achievement?’. This was done by using the I diagram as a tool within a Guided Inquiry Learning Approach (GILA), which is based on constructivist theory. The sample consisted of sophomore students (N = 31) taking the Analytical Chemistry Laboratory-I course and attending the Chemistry Teaching Program of the Faculty of Education of a state university in the Aegean region of Turkey. During the analyses, the students attempted to solve specific problems and find the results of their qualitative analyses as they followed the sections of the I diagram step by step under the guidance of researchers. They also tried to find solutions to problems through logical reasoning and discussions with each other. A positive correlation was found between the achievement and performance of the students. During the experimental process based on the GILA, the students were able to structure their knowledge more clearly by carrying out cation analysis in a systematic manner, inquiring and suggesting scientific explanations. Although they had difficulties in some sections of the I diagram, including logical argument, data transformation and variable definition, they were able to establish a link between theory and practice by using their cognitive and meta-cognitive skills.


Introduction

Meaningful learning and improved performance and achievement in chemistry education require students' participation in active learning processes. The laboratory is the best environment for enabling students to participate in active learning processes (Tobin, 1990; Hofstein, 2004; Lunetta et al., 2007). It is the most important learning environment for hands-on and minds-on learning, in which students can gain firsthand experience, demonstrate their performance in experiments and become more active participants (Hofstein and Lunetta, 2004). Confucius emphasized the importance of practical application of knowledge by saying, ‘I hear and I forget, I see and I remember, I do and I understand.’ Bodner (1992) noted that students who were actively involved in laboratory applications learned better compared to those who were not and that the difference in terms of learning between the groups was similar to the difference between watching and playing basketball. According to him, ‘If you only watch professional basketball, it is hard for you to become a good player; but if you practice, you can become a good player.’ Jalil (2006) highlighted the opportunity that the laboratory provides for students to make connections between theory and practice. However, studies in the literature have shown that students who participate in laboratory applications do not have a level of performance and achievement as high as expected (Nakhleh, 1994; Singer et al., 2005). An investigation into the reasons behind this revealed that students who simply conducted laboratory experiments as if following recipes did not have clear ideas about the purpose of and methods to use in experiments and were not able to develop their knowledge of theory and practice (Tamir and Lunetta, 1981; Nakhleh and Krajcik, 1993; Domin, 1999; Mayer, 2004; Kirschner et al., 2006). Friedler and Tamir (1990) found that students had difficulty in understanding basic concepts related to experiments, failed to combine their observations with theoretical knowledge and, as a result of their observations, remembered irrelevant details which were far removed from the purpose of the experiment. According to Gunstone (1991), this is usually because students engage in technical activities in science laboratories and have very little opportunity for meta-cognitive activities. Madhuri et al. (2012) noted that the confirmatory laboratory method, which is frequently used in laboratories, contributes to this situation. This is because the confirmatory approach provides information about the purpose of the experiment, theoretical information related to the subject, information about the operational steps and the results of the experiment, and asks students to confirm the accuracy of the information given (Handelsman et al., 2004; Russell and Weaver, 2011; Brownell et al., 2012). Shiland (1999) and Domin (1999) referred to laboratory activities based on the confirmatory approach as closed or low inquiry laboratories and stated that these activities are usually transmissionistic, expository, demonstrative and teacher-centred. Within the framework of this approach, students know what to find at the end of the experiment and how to find it. Students cannot combine their knowledge and skills related to the experiment and their cognitive skills do not improve since they cannot engage in dynamic, critical and higher-order thinking. Allen et al. (1986) noted that the cookbook nature of verification experiments can actually inhibit intellectual stimulation.
Therefore, students experience problems in structuring new knowledge, which adversely affects their experimental performance and achievement. However, when conducting experiments, students actually experience the process and can establish a link between their prior knowledge and the new knowledge which they acquire in the laboratory. In this sense, the presence of meaningful learning in the laboratory is largely determined by the teaching approach. In light of this information, it is possible to say that the process of gaining theoretical knowledge from practical experience and producing scientific ideas is highly complicated. Thus, in this case, it is of great importance to answer the following questions: How can students acquire the theoretical knowledge related to the experiments carried out in the laboratory? What kind of effective methods or tools should be used to increase student performance in the laboratory?

In order to answer these questions, some institutions and researchers have suggested that it is necessary to rethink the role and practice of laboratory work in chemistry teaching (Bybee, 2000; National Research Council [NRC], 2000; Hofstein and Lunetta, 2004). Bybee (2000) stated that if used properly, the laboratory has the potential to be an important medium for introducing students to the central conceptual and procedural knowledge and skills in science. In the laboratory, students need to understand scientific concepts, improve their practical scientific skills and inquiry and problem-solving skills, develop increased motivation and interest and understand the nature of science. Ford (2008) mentioned that the students need to be active in obtaining knowledge and that they should think critically in order to understand how scientific knowledge is constructed and how scientists work. In this respect, the main focus of laboratory applications in recent years has been the inquiry-based learning approach which supports meaningful learning.

Theoretical framework of the inquiry-based learning approach

Inquiry-based learning is an active learning strategy based on Dewey's educational philosophy of pragmatic education and constructivism. The advancement of science is made possible through the development of inquiry. Inquiry is a natural process which is inherent to all humans and leads individuals to ask questions and discover and carefully test their discoveries to produce new knowledge (NRC, 2000; Llewellyn, 2002). By saying ‘The map does not take the place of the actual journey. The logically formulated material of a science is no substitute for the having of individual experiences. The mathematical formula for a falling body does not take the place of personal contact and immediate individual experience with the falling thing’, Dewey (1902) emphasized the importance of learning based on individual experience through investigation-inquiry (as cited in Pine et al., 2006). Inquiry-based learning focuses on learners' information-collecting processes rather than creating a product or finding solutions appropriate for the situation (Lim, 2001). Individuals make sense of the world in which they live by improving their scientific thinking and thus their learning skills during this process (Evans, 2001). For this reason, inquiry is, in a sense, a way of thinking. The National Research Council (NRC) states that students need to possess five essential characteristics related to the inquiry process in order to learn science. These include being engaged by scientifically oriented questions; giving priority to evidence, which allows students to develop and evaluate explanations that address such questions; formulating explanations from evidence to address these questions; evaluating their explanations in light of alternative explanations, particularly those reflecting scientific understanding; and, finally, communicating and justifying their proposed explanations (NRC, 2000). In this way, students gain scientific skills, i.e. the abilities to use scientific practices (NRC, 2012). Schwab (1962) and Colburn (2000) noted that three key activities are important for inquiry: asking questions, collecting data and interpreting those data. Coffman (2012) and Martin-Hansen (2002) stated that inquiry-based learning is a way of converting information into useful knowledge. Davis (2005) stated that in inquiry-based learning students assume responsibilities and actively engage in the process. According to her, students strive both to understand the subject deeply and to find solutions. In this process, students structure their knowledge by establishing links between their prior knowledge and the new information in a meaningful way (Zuiker and Whitaker, 2014). Hofstein et al. (2005) clearly showed that chemistry students who are involved in inquiry-type laboratory activities are able to ask more and better questions regarding chemical phenomena and they can develop high-level learning skills through these activities.

Depending on the amount of direction teachers give students, varying degrees of inquiry are possible (Schwab, 1962; Martin-Hansen, 2002; Buck et al., 2008). Four types of inquiry are used in practice: confirmation (or verification), structured inquiry, guided inquiry, and open inquiry (Herron, 1971; Bell et al., 2005; Banchi and Bell, 2008; Blanchard et al., 2010). These types vary according to the degree of freedom that the students are provided with in designing the investigation (Colburn, 2000; Bell et al., 2005). Banchi and Bell (2008) and Chin and Chia (2006) suggested that students need to be engaged in guided and open inquiry learning approaches in order to gain the above-mentioned skills at a higher level, because these approaches allow students to play an active role in planning and producing results and to take responsibility for their own learning. In this context, the specific inquiry-based learning approach that we used in this study to allow students to gain these skills was the guided inquiry learning approach. The reason we preferred the guided inquiry learning approach over the open inquiry-based learning approach was that the students had no experience with the latter and did not yet have the potential to assume full responsibility for conducting research as scientists. Mayer (2004) noted that students need guidance for how to think during inquiry-based laboratory activities in order to structure their knowledge. Arnold et al. (2014) emphasized that instructors should support students while working on inquiry tasks. For instance, instructors should give information about the procedural knowledge of the experiments so that students can gain a deeper understanding and flexible thinking skills. Chatterjee et al. (2009) compared the perceptions of students regarding guided inquiry- and open inquiry-based laboratory activities and showed that students were enthusiastic about learning in guided inquiry-based laboratory activities. In a meta-analysis study, Furtak et al. (2012) revealed that teacher-led inquiry lessons seemed to have a greater effect on student learning than those that were student-led. In addition, researchers have suggested that inquiry types should be used gradually until students gain the science process skills necessary for inquiry-based learning (NRC, 2000; Llewellyn, 2002; Mayer, 2004).

Guided inquiry learning approach (GILA)

In the GILA, teachers give students a problem for them to research and students determine the methods necessary for the solution of the problem and find the results on their own (Colburn, 2000; Buck et al., 2008). The origin of the guided-inquiry laboratory approach has been traced back to the early 20th century British science educator Henry Armstrong, who taught chemistry by a heuristic method in which the students' role was transformed into that of a discoverer (DeBoer, 1991). In the GILA, teachers take a role which leads students towards research and investigation and thus towards thinking for themselves. In this process, the level of guidance provided by teachers depends on the learning objectives. Coffman (2012) stated that the teacher's responsibility is to set up and facilitate the activity to make sure students are learning what is expected. In this context, teachers prepare the material necessary for the students to use, then ask questions and try to get answers from students, as well as facilitating learning by giving guidance and tips. Indeed, the role of the teacher in guided inquiry teaching is as coach and facilitator (Anderson, 2007). Bodner (1986) emphasized that guided-inquiry activities are based on the theory of constructivism in which ‘knowledge is constructed in the mind of the learner.’ When students become skillful in constructing explanations and doing inquiry, they acquire more independence in learning. In guided-inquiry laboratories the aim is to involve students in an inquiry process by confronting them with an area of investigation and letting them follow experimental directions, gather data on certain specified variables and establish relationships among the variables from their own data (Farrell et al., 1999; Banchi and Bell, 2008; Buck et al., 2008; Chatterjee et al., 2009). In recent years, a large number of studies have been conducted which employ the GILA in the laboratory to allow students to better understand subjects related to chemistry (Bunterm et al., 2014; Conway, 2014; Fakayode, 2014; Gupta et al., 2015; Raydo et al., 2015; Wheeler et al., 2015). In a quantitative comparison of the relative effectiveness of guided inquiry and verification laboratory instruction, Blanchard et al. (2010) found that students who participated in a guided inquiry-based laboratory unit showed significantly higher test scores when compared to the students participating in a verification laboratory-based unit. Similarly, Ketpichainarong et al. (2010) stated that an inquiry-based laboratory application for undergraduate students in biotechnology was effective for their conceptual learning, critical thinking and science process skills.

It is necessary to use various tools in laboratories where science teaching based on an inquiry-based learning approach is taking place. These will improve students' conceptual and methodological knowledge, lead them to design experiments, suggest and test hypotheses, and interpret findings and results (Novak and Gowin, 1984; Hofstein, 2004; Mayer, 2004; Waters, 2012). In order to apply such a learning approach in the laboratory, various techniques, tools or models, such as diagrams, text descriptions, concept maps and physical models are used (Nakhleh, 1994; Phillips and Germann, 2002; Cresswell and Loughlin, 2015). In this study we used the I diagram. Although rarely used in the literature, the I diagram is a tool that we believe can improve student performance and increase achievement by creating an effective analytical chemistry laboratory environment.

The role of the I diagram in the GILA

The I diagram is an educational tool proposed by Phillips and Germann (2002), inspired by Gowin's Vee diagram and based on Lawson's ‘if… and… then…’ template. The I diagram is one of the tools which improve students' coordination skills as well as their cognitive skills in the inquiry-based learning process. The I diagram allows students to follow steps such as logical argument, hypothesis suggestion, variable definition, procedure, data collection, data transformation and results in order to answer the question(s) related to the experiment. It also helps to create an active learning environment. Students follow the experimental process step by step using various modes of thinking such as brainstorming, reasoning, argumentation and critical thinking in a specific order, and are thus able to establish the link between practice and theory more easily. Phillips and Germann (2002) applied the heuristic PTEMA (Prepare, Trial, Evaluate, Model and Apply) technique based on guided inquiry using the I diagram. In their study, they investigated students' improvement in scientific inquiry during their investigation of changes in the osmotic system depending on the effects of different variables. They stated that the students are able to grasp the distinction between processes when they complete the I diagram in a more holistic, integrated manner and also that the I diagram encourages them to develop a deeper understanding of scientific inquiry. Karamustafaoğlu (2011) applied the I diagram to students and found that students' science process skills improved and that, consequently, they better comprehended concepts related to the subject. Based on these studies and the features of the I diagram, we believe that the I diagram can be used effectively, especially in laboratory applications, to enhance inquiry and thus performance and achievement. The reason why we chose to use the I diagram in the Analytical Chemistry Laboratory in our study was that the steps in the diagram and the steps followed in the analytical approach are similar in terms of the process of thinking. In the analytical approach, individuals display their level of performance by using certain steps to solve the problem and improve their scientific research skills in the process (Bybee, 2006). These steps are shown in Fig. 1.
Fig. 1 The steps followed in the analytical approach.

Analytical Chemistry, which is based on the analytical approach, involves two types of analyses, qualitative and quantitative. The importance of laboratory experiments involving qualitative analysis, which enable the elucidation of the structure of matter, is particularly emphasized in various studies (Hicks and Bevsek, 2012; Mandler et al., 2014). Qualitative systematic analysis experiments are among the best laboratory practices for providing students with the opportunity to learn about matter. These experiments are important for teaching the characteristics of matter and improving and enhancing the understanding of chemical concepts. In this context, different approaches, methods, techniques and tools are required to conduct these experiments more effectively in the laboratory environment. The I diagram is among the most important educational tools which enable students to demonstrate their level of performance, as students answer all parts of the I diagram step by step when conducting SQAC experiments. Students attempt to create a final product in this process. They try to structure their knowledge and use the structured knowledge in a meaningful way. Thus, the experimental performance of students and their use of their knowledge and skills can be determined. This study is important in terms of what it reveals about students' performance in SQAC experiments as they engage in inquiry within the GILA process and form an opinion about the analytical approach along the way.

The purpose of the study

The purpose of this study was to determine student performance in SQAC experiments as a result of applications where the I diagram tool was used within the GILA, identify the relation between student performance and achievement and determine students' opinions regarding the I diagram tool. To this end, an attempt was made to answer the following questions:

With I diagram applications in SQAC experiments:

• What are the students' levels of performance?

• What is the relation between the average scores of performance and achievement?

• What do students think about the I diagram tool?

Method

Since the study aimed to determine the level of student performance, the case study method, one of the qualitative research methods, was used. Case studies provide comprehensive information related to an event, an activity or a process (Merriam, 2014). This paper describes a case study of students' performance and achievement in SQACs. Our aim was to understand how using the I diagram based on the GILA functioned during the experiments on SQACs.

Participants

The sample of the study consisted of a total of 31 sophomore students attending the Chemistry Teaching Program of the Faculty of Education of a state university and taking the Analytical Chemistry Laboratory-I course. The students were 19 to 21 years old. 45% (N = 14) of the sample was composed of female students and 55% (N = 17) was composed of male students. The students were believed to have the basic science process skills necessary for the GILA. It was possible to assert that the students had gained observation, measurement and data-reporting skills in the Basic Chemistry Laboratory I and II courses which they had taken during their freshman year, had improved coordination skills and had already begun to learn about the inquiry process. For this reason, it could be said that they were already prepared for the GILA process, which leads students to take more responsibility for their learning.

Informed consent was obtained from each student before this research was carried out. They were notified that the process and the results of the application would be used for research purposes only and they were also assured that their identities would remain confidential. Additionally, the students were informed that they could withdraw from the study at any time without providing a reason or receiving detrimental treatment. The case study was conducted in compliance with ethical guidelines (Taber, 2014).

Instruments

As the case study method was used, the data were collected with various data collection instruments to ensure construct validity in the qualitative research. The instruments used in the study were the I diagram, the SQAC achievement test and the semi-structured interview.
I diagram tool. The I diagram was prepared as a document and used in the study to measure the performance of the students in SQAC experiments. Bailey (1982) stated that document review is one of the strongest data collection tools since it provides long-term or extended analysis, allows individuality and originality to be demonstrated and offers a quality data source. I diagrams were printed in A3 format and given to each student to fill in during the SQAC experiments. The I diagram documents given to students consisted of a front page and a back page. The method is called the I diagram because the shape of the first page resembles the letter ‘I’. Fig. 2 shows the different sections of the I diagram and their purposes. As can be seen, the planning and theoretical aspects of the inquiry were on the left side of the ‘I’ and the methodological aspects, including the data and conclusions, were on the right side. Arrows in the I diagram showed students how to follow the steps of the analysis which they were carrying out. Students who followed these arrows in order were able to complete their inquiry process. On the back page of the I diagram, the students were asked to write a summary of their prior knowledge, variables, any experimental errors, the limitations of the experiment and, finally, how the cations found are used in everyday life. Since these sections were essentially summaries, they were discussed in the laboratory environment after the experiment to enhance learning. For this reason, only the front page of the I diagram was considered in the assessment. In our study, the I diagrams used to determine the performance of the students relating to the SQAC experiments were assessed out of a total of 100 points. Fig. 2 presents how the sections of the I diagram are scored. The scoring was performed according to the holistic scoring rubric developed by the researchers. In the holistic scoring rubric, the entire performance exhibited by the student in using the I diagram was given a single score (Klein et al., 1998). The performance level of the students was scored with regard to four levels. Table 1 shows the performance levels, performance scores and criteria used to determine the quality of the performance at each level.
Fig. 2 I diagram sections (Phillips and Germann, 2002), the purposes of and scores for the I diagram sections in our research.
Table 1 Performance scores and criteria determined according to holistic scoring in the filled-in I diagrams
Performance level Performance score range Performance criteria
Highest 85–100 The student filled in the sections of the I diagram with accurate and adequate information.
Good 70–84 The student mostly filled in the sections of the I diagram with accurate and adequate information.
Moderate 50–69 The student filled in the sections of the I diagram with partially accurate and incomplete information.
Low 49 and below The student filled in only a few sections of the I diagram with accurate and adequate information.
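
For reference, the banding in Table 1 can be expressed as a simple threshold mapping. The Python sketch below merely restates the rubric cut-offs; it is not code used in the study, and the score in the usage line is hypothetical.

    def performance_level(score: float) -> str:
        # Map a holistic I diagram score (0-100) to the performance level defined in Table 1
        if score >= 85:
            return "Highest"
        if score >= 70:
            return "Good"
        if score >= 50:
            return "Moderate"
        return "Low"

    # Example: an I diagram scored at 72 points falls into the 'Good' band
    print(performance_level(72))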


The scoring was carried out based on the I diagrams prepared by the researchers for each cation group analysis. In order to ensure scoring reliability, the consistency between scores given by two researchers was checked and the Pearson Product-Moment Correlation (r) was calculated to be 0.985. Consequently, it could be said that the consistency between the researchers was high and the scoring rubric was reliable. The achievement levels reflecting the students' performance in each cation group analysis and average scores were calculated using SPSS 15.00 and presented in charts. Correlation analysis was also performed between the students' average I diagram scores and SQAC achievement test scores in order to investigate the relationship between performance and achievement scores.
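
Both the inter-rater check and the performance-achievement relationship reported in this paper rest on Pearson correlations. As an illustration of how such a coefficient can be obtained outside SPSS, the sketch below uses Python with hypothetical rater scores; the actual score lists are not published in the paper.

    from scipy.stats import pearsonr

    # Hypothetical holistic scores given by the two researchers to the same set of I diagrams
    rater_1 = [88, 72, 65, 91, 54, 78, 83, 69]
    rater_2 = [85, 75, 63, 90, 50, 80, 85, 66]

    r, p = pearsonr(rater_1, rater_2)
    print(f"inter-rater r = {r:.3f}, p = {p:.4f}")

The same call, applied to the 31 students' average I diagram scores and their SQAC achievement test scores, would yield the performance-achievement correlation discussed in the Results section.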

SQAC achievement test. A multiple choice achievement test was prepared by the researchers in order to identify the students' achievement level in the SQAC experiments. The remembering, understanding and applying levels of Bloom's cognitive domain were used to prepare the test questions. Since the researchers had been teaching the Analytical Chemistry Laboratory course for approximately 9 years, they prepared 25 test questions based on their experience and using different resources covering the subject of SQACs. In order to determine content validity, the opinions of two faculty members who are experts in the field of Analytical Chemistry were sought; the test items were reviewed and corrections were made to those items which required amending. For the pilot application of the test, the data were collected for three consecutive years and the pilot test was applied to a total of 142 students taking the Analytical Chemistry Laboratory-I course. After the application, item analysis was performed to ensure the validity of the SQAC achievement test. First, the item-total correlation, which expresses the relation between the scores obtained from individual test items and the total test score, was checked. When scoring the items in the multiple choice SQAC achievement test, correct answers were given the value ‘1’ and incorrect or blank answers were given ‘0’. Accordingly, four items with a negative item-total correlation (the 4th, 19th, 20th and 24th items) and three items with an item-total correlation below 0.2 (the 1st, 6th and 17th items) were removed from the test. After ensuring that the test scores showed normal distribution, the item difficulty index (pj) and the item discrimination index (rj) were calculated separately for each item remaining after the item analysis. The results of the item analysis can be seen in Table 2.
Table 2 SQAC achievement test item analysis results
IN RIN CUG CLG pj rj IN RIN CUG CLG pj rj
IN: item number; RIN: renumbered item number; CUG: number of students in the upper 27% group who answered the item correctly; CLG: number of students in the lower 27% group who answered the item correctly.
2 1 33 18 0.671 0.395 13 10 23 0 0.303 0.605
3 2 35 27 0.816 0.311 14 11 21 3 0.316 0.474
5 3 38 31 0.908 0.314 15 12 23 7 0.395 0.421
7 4 29 2 0.408 0.711 16 13 33 17 0.658 0.421
8 5 35 4 0.513 0.816 18 14 29 15 0.579 0.368
9 6 32 9 0.539 0.605 21 15 32 16 0.632 0.421
10 7 26 2 0.368 0.632 22 16 23 1 0.316 0.579
11 8 37 5 0.553 0.842 23 17 33 22 0.724 0.300
12 9 28 4 0.421 0.632 25 18 23 5 0.368 0.474


Table 2 shows that the difficulty indices of the 18 items varied between 0.303 and 0.908. It has been suggested that achievement tests should favour items with a difficulty index of around 0.500 and that individual item difficulty indices should lie between 0.300 and 0.800 (Kehoe, 1995). The average item difficulty index of the SQAC achievement test was calculated to be 0.527. These results suggested that the items could be used. The item discrimination indices of the test items were found to vary between 0.300 and 0.816. The mean discrimination index of the items was 0.518. Items with a discrimination index of 0.400 or above are highly discriminative and referred to as ‘very good items’, whereas items between 0.300 and 0.390 are referred to as ‘good items’ (Ellis and Mead, 2002). Consequently, it could be said that most of the items in the SQAC achievement test had a ‘very good’ discrimination index.
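
Although the formulas are not stated explicitly in the paper, most entries in Table 2 can be reproduced with the standard upper-lower 27% item statistics; this reconstruction is offered only as an aid to reading the table. With 142 pilot students, each 27% group contains ng ≈ 38 students, and

    pj = (CUG + CLG) / (2 ng),        rj = (CUG − CLG) / ng.

For example, the first row of Table 2 (CUG = 33, CLG = 18) gives pj = 51/76 ≈ 0.671 and rj = 15/38 ≈ 0.395, matching the reported values.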

The consistency of the items in a test shows the measurement reliability of that test. Accordingly, the reliability coefficient (KR-20) of the 18-item SQAC achievement test was calculated to be 0.77. In accordance with this result, it could be said that the test was highly reliable (Zimmaro, 2004). Sample questions from the 18-item test obtained after validity and reliability testing are given in Appendix 1. A correlation analysis was performed by calculating the Pearson correlation coefficient (r) and the determination coefficient (r2) between the scores obtained from the SQAC achievement test and the average performance scores obtained from the I diagram. After scoring, the total score obtained from the test (maximum 18 points) was converted to a 100-point scale and used in the correlation analysis. Table 3 shows the renumbered items and their content.
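
For readers who wish to reproduce the reliability figure, KR-20 for a dichotomously scored test can be computed as sketched below. This is only an illustrative Python implementation of the standard formula, not the procedure reported by the authors, and the function names and example values are hypothetical.

    import numpy as np

    def kr20(responses: np.ndarray) -> float:
        # responses: 0/1 matrix of shape (n_students, n_items)
        k = responses.shape[1]
        p = responses.mean(axis=0)                     # proportion answering each item correctly
        total_var = responses.sum(axis=1).var(ddof=1)  # variance of students' total scores
        return (k / (k - 1)) * (1 - np.sum(p * (1 - p)) / total_var)

    def to_hundred(raw_score: int, n_items: int = 18) -> float:
        # Conversion of an 18-item raw score to the 100-point scale used in the correlation analysis
        return raw_score / n_items * 100

    print(to_hundred(13))  # 13 correct answers out of 18 -> about 72.2 on the 100-point scale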

Table 3 Items in the SQAC achievement test and their content
Content of items Item no.
Cations in Systematic Qualitative Analysis and their characteristics 1, 4, 11 and 12
Group 1 cations and their characteristics 2, 9 and 13
Group 2 cations and their characteristics 5 and 14
Group 3 cations and their characteristics 6, 10 and 15
Group 4 and 5 cations and their characteristics 8, 16 and 18
The reactions of cations 3, 7 and 17


Semi-structured interview. The semi-structured interview technique was used to elucidate what the students' experiences with the applications based on the GILA had been and to determine their achievement and performance levels in more depth. To this end, the students were asked:

• Whether the I diagram application contributed to their performance in the SQACs, if yes, in what way;

• In which sections of the I diagram application they exhibited the lowest and the highest performance;

• What information they had learned related to the SQACs with the help of the I diagram application;

• What negative aspects the I diagram applications had.

The interviews were held with 20 volunteer students. Each interview lasted approximately 25–30 minutes. The interviews were recorded with a tape recorder and then transcribed. The interview data were subjected to content analysis. In the content analysis, the data were coded, categories were formed on the basis of the codes, frequencies and percentages were calculated for each category, and quotes from the students were provided. During the coding process, the agreement percentage between the researchers, calculated to establish the reliability of the analysis, was found to be 96%.
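
The paper does not specify the formula behind the 96% figure; a commonly used inter-coder agreement measure for this kind of content analysis, offered here only as a plausible reading rather than the authors' stated procedure, is

    agreement (%) = number of agreements / (number of agreements + number of disagreements) × 100.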

Procedure

The study was conducted over a period of one semester in Analytical Chemistry laboratories. The aim of the course was to enable the students to understand SQAC methods (such as macro, micro and semi-micro), the characteristics of cations and their reactions with reagents. With these objectives in mind, the scope of the course involved, both cognitively and psychomotorally, being able to distinguish cation groups depending on their characteristics, being able to note the reactions of cations with different reagents, being able to separate cations in solutions using analytical methods and learning about treatments such as filtration, decantation and centrifugation used to separate precipitates from solutions during cation analysis. Table 4 presents the order of experiments conducted each week for the purposes of the course and the application process.
Table 4 Application process by week
Week Application process
1 We provided information about the GILA, the I diagram tool and its sections.
2 We started a discussion related to SQACs in the laboratory and tried to provide initial knowledge to students regarding cation analysis.
We prepared the solutions and glass equipment necessary for experiments.
3 The students completed the pretesting stage in order to familiarize themselves with the cations in group 1, their characteristics, reagents and reactions.
4 The students performed analyses in order to identify group 1 cations (Ag+ and Pb2+) using I diagrams.
5 The students completed the pretesting stage in order to familiarize themselves with the cations in group 2, their characteristics, reagents and reactions.
6 The students performed analyses in order to identify group 2 cations (Cu2+ and Cd2+) using the I diagrams.
7 The students completed the pretesting stage in order to familiarize themselves with the cations in group 3, their characteristics, reagents and reactions.
8 The students performed analyses in order to identify group 3 cations (Fe3+, Al3+, Cr3+, Mn2+, Co2+, Zn2+ and Ni2+) using the I diagrams.
9 The students completed the pretesting stage in order to familiarize themselves with the cations in group 4 and 5, their characteristics, reagents and reactions.
10 The students performed analyses in order to identify group 4 and 5 cations (Ca2+, Sr2+, Ba2+, Na+ and NH4+) using the I diagrams.
11 We carried out the SQAC achievement test.
12 We conducted semi-structured interviews.


The application and data collection process took place over 12 weeks during the fall semester. The Analytical Chemistry Laboratory-I course lasted 4 hours per week. The students performed analyses individually and filled in the I diagram for each analysis depending on the cation or cations which they identified during the course of that week. Thus, it was ensured that the application process continued in a reliable manner (Fig. 3).


Fig. 3 Photographs from the I diagram applications.
A sample application for SQACs based on guided inquiry: analysis of group 3. During the week prior to the I diagram application, the students learned about group 3 cations and the main characteristics of these cations and observed what products (precipitate or complex) and reactions they created with their specific reagents. The students learned about iron (Fe3+), aluminum (Al3+), chromium (Cr3+) and titanium (Ti4+) among group 3A cations and zinc (Zn2+), manganese (Mn2+), cobalt (Co2+) and nickel (Ni2+) among group 3B cations. Thus, the students gained prior knowledge about group 3 cations. In the next week, I diagrams were distributed to students. A sample solution of cations to be found was given to each student. Only a titanium sample solution was not provided due to its cost. The solutions necessary for analysis, water baths in fume cupboards for heating, cooling baths for chilling, test tubes, washing bottles, pipettes, droppers, filter papers and pH papers were prepared by the researchers in the laboratory environment. Thus, the necessary conditions for the analysis of group 3 cations were provided to the students in the laboratory environment. The next step was the performance of the analysis together with the I diagram application. The application based on the GILA started with questions related to analysis. The questions we posed to the students in each section of the I diagram during the analysis are given in Table 5. If necessary, the students were guided by the researchers when filling in the I diagram. For example, in order to ensure uniformity for the guided inquiry, the researchers handed out a model procedure of the analysis after the student responses had been collected. The students were encouraged to discuss ideas and interact during the application. Students who completed the I diagram formulated a new research question. Finally, the students discussed the sections on the back page of the I diagram in the classroom environment and summarized the application.
Table 5 Questions posed to the students during the GILA-based application
I diagram sections Sample questions
Question What are we aiming to do in today's experiment?
With what sort of question would you start your research in order to identify the cations in the sample solution given to you?
Prior knowledge What are the characteristics of the group 3 cations seen in the pretesting stage? How do we distinguish group 3A and group 3B cations? What are the specific reagents of each cation? What products (precipitate, solution or complex) do they create with reagents?
What concepts come to your mind related to group 3 cations?
Logical argument What kind of hypothesis can you come up with regarding the cation group you will be looking for in the analysis? What can you do to analyse the cation group? What kind of experimental design can you create to determine the accuracy of your hypothesis? Which common reagent can you use to separate group 3 cations into 3A and 3B? How will you perform the analysis considering the specific reagents of each cation? What sort of results do you expect in your analyses? What do you expect to see at the end?
Experimental design (variable definition) What kind of variables are the reactive solutions which you have added to the sample solution during the analysis? What kind of variable is the sample solution into which you have added reactive solutions during the analysis? What are the manipulated variables in the analysis? Why?
Procedure How would you write down each process step of the analysis? According to the process steps, which common reagent should be added first? What kind of materials can you use in which step? Which cations are you going to look for in the precipitate and the solution formed?
Data collection What did you observe as a result of adding reagents into the sample solution? Did you observe a change in colour? What kind of product was formed? Was it a precipitate or solution or complex compound?
Data transformations How would you show resulting products and reagents in the reaction?
Results What kind of result did you find related to the analysis of group 3 cations? How would you describe the results you found with the variables you used? What is your decision regarding the cations you found in your sample solution? What is your evidence in reaching this decision? What kind of knowledge claim do your results reflect?
New knowledge What research do your results lead you to? How would you place your knowledge claim in a broader context?
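
To give a sense of what the data collection and data transformation sections asked for, a few representative reactions from the classical group 3 separation scheme are listed below. These are illustrative equations consistent with the reagents mentioned in the text (the NH3/NH4+ buffer and a sulfide source such as thioacetamide); they are not reproduced from the students' model answers, which the paper does not list.

    Fe3+ + 3NH3 + 3H2O → Fe(OH)3↓ + 3NH4+     (group 3A cation precipitated as a hydroxide in the buffered solution)
    Ni2+ + 6NH3 → [Ni(NH3)6]2+                (group 3B cation held in solution as an ammine complex)
    Zn2+ + S2− → ZnS↓                         (group 3B cation subsequently precipitated as a sulfide)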


Results

This section presents the findings obtained as a result of qualitative and quantitative data analysis and interpretations related to these findings.

First of all, we examined the findings related to the performance of the students in the SQAC experiments conducted during the Analytical Chemistry Laboratory-I course. Fig. 4 shows the average performance scores of the students in each cation group. The performances of the students in each analysis were presented by defining them in terms of performance levels. Fig. 5 shows that the majority of the students had a good (N = 10) or moderate (N = 10) performance level in the group 1 cation analysis, the majority of the students (N = 23) had a good performance level in the group 2 cation analysis, and a considerable number (N = 18) of the students reached the highest performance level in the group 4–5 cation analysis. In the group 3 cation analysis, however, the majority of the students (N = 21) had a moderate performance level. In light of these findings, it could be said that there was an increase in performance, apart from the fall in performance in the group 3 cation analysis. The reasons behind this fall in group 3 SQAC performance might be that the students had to deal, in the analysis stage, with too many cations in the limited amount of time allocated for the applications. They spent a great deal of time neutralizing the solution using pH papers in the first step of the analysis and made an effort to ensure the stability of the buffer solution (NH3/NH4+) used for pH adjustment. A sample I diagram document from one of the students is presented in Appendix 2.


Fig. 4 The average scores for I diagrams related to the SQAC experiments.

Fig. 5 Performance levels of the students in each cation group analysis.

Another finding of the study was related to the relation between student performance and achievement using the I diagram. Considering the findings related to the SQAC achievement test, the average score of the students was found to be 70. In addition, performance scores were calculated by averaging the scores obtained by the students from the I diagrams which they had filled in during each analysis. The findings in Table 6 showed that there was a moderate, positive and significant correlation (r = 0.496, p < 0.01) between the students' average performance scores and their SQAC achievement test scores (Evans, 1996). In light of this result, we inferred that the students' achievement level increased as their performance in the SQAC experiments increased. When the determination coefficient (r2 = 0.246) is taken into consideration, it can be stated that about 25% of the total variation in achievement is explained by the students' average performance scores on the I diagrams.

Table 6 Findings of the correlation analysis of students' performance average scores and SQAC achievement test scores
    Performance average scores
SQAC achievement test scores r 0.496**
p 0.005
N 31
**Correlation is significant at the 0.01 level.
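
As a quick arithmetic check of the determination coefficient quoted above: r2 = (0.496)2 ≈ 0.246, which corresponds to the statement that roughly 25% of the variance in the achievement test scores is shared with the average I diagram performance scores.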


One of the sub-problems of the study was the students' opinions regarding the I diagram application. The first question posed to the students during interviews was whether the I diagram contributed to their performance in the SQAC experiments. According to the findings, 90% (f: 18) of the students stated that the I diagram contributed to their performance, whereas 10% (f: 2) stated that it did not (Fig. 6).


Fig. 6 Opinions of the students about the contribution of the I diagram.

Following this question, the students were asked what kind of contribution the I diagram had made to their performance. The analysis of the answers received from the students produced six different categories (see Table 7). These findings showed that the students' statements were mostly related to the categories of increasing achievement (21%), developing problem-solving skills (20%), developing analytical thinking skills (18%), and developing science process skills (17%). The reason that the category of increasing achievement emerged in the student statements might be that they enhanced their knowledge by filling in the sections of the I diagram while performing analyses, because students improve their psychomotor skills as well as their cognitive skills during experiments. As a consequence, combining both of these skills can be said to contribute to their achievement. Also, the reason behind the students' opinion that the I diagram allowed them to develop problem-solving skills is that the analyses in these applications started with a research question and various steps were followed to answer this question. The reason behind the emergence of the developing analytical thinking skills category is that the students performed analyses in a systematic manner. During the I diagram application, the students had to fill in all the sections of the diagram related to the analysis by relating them to each other in order to find a solution to the problem.

Table 7 Findings related to the question, ‘How did using the I diagram contribute to your SQAC performance?’
Category f % Sample student opinions
Increasing achievement 14 21 Because we wrote down very detailed answers, it allowed us to learn better and therefore increased my level of achievement during the course. (S.3)
As a student taking the course for the second time, this time I really learned the characteristics of cations. (S.17)
Sometimes you study for the mid-term or the final exam and sometimes you can't. Normally, if you don't study, you get a low grade. However, because we evaluated the entire course at the end of this application and because we filled in the document while conducting the experiment, we had the opportunity to learn during the course, which consequently increased our achievement level. (S.2)
Developing problem-solving skills 13 20 We were always trying to answer a question. All the steps were aimed at solving a problem. (S.11)
Developing analytical thinking skills 12 18 We came to the conclusion step by step, in a systematic manner. When moving onto the next step in the I diagram, we made decisions keeping what we had written in the previous section in mind. (S.13)
It taught me about systematic interpretation. (S.4)
Developing science process skills 11 17 I defined the problem, made observations while conducting the experiment and collected data. That's what it taught me. (S.4)
Ensuring collaborative work 8 12 For example, I and my friends asked each other whether we should go on with the precipitate or the solution or which reagent we should add. I was able to proceed with the experiment more easily thanks to our discussions. (S.7)
Normally, everyone would conduct the experiment independently; however we were able to consult each other when we got stuck. (S.6)
Leading to research 8 12 I remember our previous chemistry experiments. I had no idea what to do or how to do it in those experiments. The logic was ‘conduct the experiment and move on.’ Here, you are like a scientist. You can ask questions and research resources to find out the answers. (S.9)


Another question posed to the students during interviews was in which section of the I diagram they had the best performance. The findings showed that most of the students had their best performance in the research question (38%) and concepts (36%) categories (see Table 8). This might arise from the fact that the students participated in applications with prior knowledge about the experiment. Another section of the I diagram in which some students had their best performance was procedure (27%). This is because the students designed the experiment in the logical argument section before the procedure section.

Table 8 Findings related to the question, ‘In which sections of the I diagram did you do best? Why?’
Category f % Sample student opinions
Research question 17 38 We knew which experiment we were going to conduct that week thanks to the pretests. So we just had to formulate the question. (S.12)
Concepts 16 36 I had no difficulties writing down the concepts in the prior knowledge section, because, after filling in the prior knowledge section, the concepts came to my mind immediately. And also all sections included concepts such as solution, precipitate and complex. So, it wasn't that difficult. (S.19)
Procedure 12 27 We wrote down what the relevant reagents might be when determining the cation and how the analysis should be conducted when designing the experiment. So it was easier to write down the procedure after experimental design. (S.18)


Another finding of the study was related to the I diagram sections in which the students had their worst performance. Table 9 shows that the students had the most difficulty in the logical argument (40%) section. The reason behind the drop in student performance in this section might be that they had difficulty in establishing a hypothesis related to the analysis and in developing an experimental design with regard to the steps of the experiment. Although the students had learned about the characteristics of the cation group to be analyzed and its reagents in the pretests, it was quite difficult for them to think through the steps of the analysis, such as which reagent to add to the precipitate or solution formed, or when to perform heating or chilling, in a step by step, systematic manner. Another section in which the students did not perform well was data transformation (31%). Although the students were able to note the products formed by the cations with the reagents using the data which they had collected, they could not write down the reactions. It could be said that this was because the students had difficulty with balancing the reactions and predicting complex compounds in particular. Another section in which they had difficulty was variable definition (29%). The students confused independent (causal) variables with dependent variables during the experiments. This might be due to the fact that the students failed to question what sort of variable each element of the analysis was: the reagent, the sample solution into which the reagent was added, the precipitate or solution formed, or the heating treatment.

Table 9 Findings related to the question, ‘In which sections of the I diagram did you do worst? Why?’
Category f % Sample student opinions
Logical argument 17 40 The most difficult part was to find the procedure to be followed for the experiment. I couldn't remember what to write where. I knew hydrochloric acid was added to silver, we learned that in pretest. I could only write about that. (S.16)
Data transformation 13 31 I had trouble creating reactions, mostly with complex compounds… and writing the conclusion or the creating the product section… like what was formed when a given substance reacts with a reagent. (S.7)
Variable definition 12 29 I confused dependent and independent variables all the time. What I remember is that heating is an independent variable in a water bath, because heat precipitates the sample solution. (S.8)


The average scores of the students for each section of the I diagram are shown separately for the SQAC groups in Appendix 3. The students performed better in the research question, prior knowledge (especially concepts) and procedure sections, whereas they had more difficulty in the logical argument, data transformation and variable definition sections (see Appendix 3). The maximum scores for the I diagram sections are given in Fig. 2.

During the interviews, the students were asked the question, ‘What information did you learn related to SQACs with the help of the I diagram?’. The findings showed that the students mostly learned about analysis steps (25%) (see Table 10). The reason for this might be that the inductive approach of the application involves first obtaining the information necessary for the qualitative analysis of cations, then creating the analysis steps using this information and, finally, coming to a conclusion. While coming to a conclusion, the students combine the parts into a whole and acquire holistic knowledge. For this reason, the students understood the analysis steps better. Some of the students (18%) stated that they were able to note what was formed without difficulty. According to this finding, it can be said that the I diagram helps students with writing down compounds such as precipitates and complexes. The findings also showed that the students learned concepts related to analysis (18%) better. It can be asserted that the students enhanced their theoretical knowledge by writing down key concepts and thus learned better. The categories of characteristics of cations, qualitative analysis and characteristics of analytical reactions were also seen in the student statements.

Table 10 Findings related to the question, ‘What information did you learn related to SQACs with the help of the I diagram? How?’
Category f % Sample student opinions
Analysis steps 11 25 I understood better how to do analysis, because I did a lot of thinking before writing down the analysis steps in the logical analysis section. By writing down how to do the analysis step by step in the procedure section and then applying them, I learned the steps related to analysis. (S.5)
Noting what products were formed 8 18 The most important thing for me was to write down what was formed. It was something I've had difficulty with since high school. From among the products, I could write down the precipitates, but it was hard to write down the complexes. I guess I have made some mistakes. (S.15)
Concepts related to analysis 8 18 To me, the concepts in the prior knowledge section were very informative. Also, there were concepts such as precipitate, solution and complex all the time. It was impossible to forget them. You got to write down almost all of them when analysing each cation. (S.2)
Characteristics of cations 7 16 I especially learned how to distinguish each cation group. For example, the substance which distinguishes silver from group 2 or other groups is diluted hydrochloric acid. This is now a piece of information that I will never forget. (S.12)
Qualitative analysis 6 14 Now I know the difference between qualitative analysis and quantitative analysis. I have learned that you cannot perform quantitative analysis without qualitative analysis. (S.8)
Characteristics of analytical reactions 4 9 For example, I needed to look for cadmium or copper when designing the experiment. What should I add, which substance would cause it to precipitate?, because there was a different reagent for each. When I remembered that bad smell in the pretest, Thioacetamide (CH3CSNH2) came to my mind and I wrote it down. (S.7)


Finally, the students were asked about their negative opinions regarding the I diagram. Table 11 shows that the students' statements mostly concentrated on the writing (42%), time constraint (33%) and redundancy of information (25%) categories. This might be because there were many different sections of the I diagram to be filled in. Also, SQAC experiments are usually tiring and time-consuming. In the analysis steps, the students carried out activities such as heating, centrifuging and washing, which took time.

Table 11 Findings related to the question, ‘What negative thoughts do you have about the I diagram? Explain them and give reasons’
Category f % Sample student opinions
Writing 10 42 Writing things down were a little problematic, the diagram was a report in itself. (S.10)
Time constraint 8 33 You need to conduct the experiment and write it down at the same time. Normally, we would conduct the experiment and that's it, but here you fill in the I diagram as well. This is my only negative thought. (S.2)
Redundancy of information 6 25 There is too much information to write down. I don't know if this much is necessary. Especially the prior knowledge section. You write down characteristics of each cation, how to distinguish them and where to use them. I couldn't think properly after a while. (S.14)


Discussion

In this study, the performance and achievement of the students in the Analytical Chemistry Laboratory-I class were examined based on the GILA. The I diagram was utilized in order to evaluate student performance using this approach. The students learned how to systematically analyze a sample solution containing unknown cations given to them in the laboratory. Sometimes they were able to observe precipitation or a change in colour as a result of the analysis and sometimes they obtained different results and were surprised, which caused new questions to form in their minds. Under the guidance of the researchers, the students tried to find solutions through discussion and the use of reason.

The performances of the students in the SQAC experiments

The study results indicated that the performance exhibited by the students in the group 1, 2 and 3 SQAC experiments was usually good or moderate, while their level of performance in the group 4 and 5 SQAC experiments was high. In their first analyses, the students showed a low level of performance, since they had just been introduced to the I diagram and the GILA process for the first time. In the group 4 and 5 cation analyses, the students performed flame tests alongside the analyses. Performance increased during these analyses both because they were much more interesting and because the students had by then become familiar with the GILA process. Based on these results, we can assert that the GILA process influenced the laboratory performance of the students increasingly positively over time. Many studies have suggested that this process increases laboratory performance, which is consistent with our results. In one of these studies, Mandler et al. (2014) applied the GILA in experiments related to water quality in an Analytical Chemistry laboratory and measured student performance in these experiments; they reported that this approach improved the students' skills in interpreting and assessing chemical data. In another study assessing student performance, Acar Sesen and Tarhan (2013) found that, as a result of 5-week GILA-based laboratory activities related to electrochemistry, the students clearly improved in their understanding of theoretical knowledge and experimental concepts, their willingness to research experimental questions before the experiment, and their abilities to provide meaningful explanations for the questions asked, to report their observations and record experimental results meaningfully, to share knowledge, to shape an argument, and to assess and deliver experimental results. In addition, many studies have shown that laboratory courses carried out with the GILA increase student performance (Farrell et al., 1999; Poock et al., 2007; Rissing and Cogan, 2009; Irinoye et al., 2014).

The contributions of the I diagram in the GILA process

In our study, the students engaged in inquiry by brainstorming at every stage of the experiment to find an answer to the research question. They established their prior knowledge by discussing the information which they had learnt in the pretesting stage. Based on this prior knowledge, the students identified which concepts might come up. In addition, looking for an answer to the research question during the experiment enhanced the students' problem-solving skills. For example, some students (20%) mentioned that their problem-solving skills had improved since they had engaged in a process of inquiry and had continued to try to answer the question posed. Similarly, Hofstein et al. (2004) noted that the questioning skills of the students improved as a result of their inquiry-based laboratory activities. They also concluded that such laboratory applications were interesting and improved students' science process skills and conceptual learning.

In the interviews, the students stated that they had developed meta-cognitive skills as well as science process, problem-solving and analytical thinking skills. Science process skills, which are at the centre of scientific study, allow for understanding and improving knowledge during the learning process (Bredderman, 1983). The I diagram is a learning tool which requires students to use these skills as they go through its sections. From this point of view, we can infer that the I diagram improves students' science process skills. In line with this result, Karamustafaoğlu (2011) and Phillips and Germann (2002) maintained that the I diagram allowed students to gain an understanding of scientific inquiry and improved their science process skills. Schraw et al. (2006) noted that cognitive and meta-cognitive outcomes in science education could be supported by the inquiry-based approach. Similarly, many studies have suggested that the inquiry-based approach serves many purposes, such as developing meaningful learning, improving critical thinking skills, ensuring active participation in the learning process, managing learning and enabling students to communicate with each other (Paris and Paris, 2001; Gibson and Chase, 2002; Hofstein, 2004; Akerson and Hanuscin, 2007). Hofstein (2004) stated that students' meta-cognitive efficacy improved thanks to learning environments which allowed them to manage their own learning and increase their self-awareness. Based on this statement, our concluding remark is that the GILA process carried out using the I diagram positively affected the meta-cognitive abilities of the students.

The GILA starts with a research question, and a process is followed to find an answer to this question (Wheeler et al., 2015). In our study, the students advanced the process in a way that enabled them to answer the question related to SQACs, and meaningful learning took place as a result of their questioning. The interviews showed that, thanks to this process, the students did not have any difficulties in the pretesting, research question and process steps sections when filling in the I diagram. This might be because the students had acquired knowledge about the cations in specific cation groups in the pretesting stage before starting the SQAC experiment. Using the GILA process, the students were also able to learn the key concepts of systematic analysis while they established their prior knowledge. In closed-ended experiments conducted as if following a recipe from a cookbook, by contrast, students do not pay attention to the concepts on laboratory sheets and thus cannot establish the necessary links. For this reason, the GILA process can be suggested as an effective one for students' conceptual learning. Studies consistent with this result have shown that laboratory applications based on the GILA allow students to engage in meaningful learning in science education (Hofstein, 2004; Taitelbaum et al., 2008; Köseoğlu and Bayır, 2012). For example, Gaddis and Schoffstall (2007) designed GILA-based experiments in an organic chemistry laboratory and reported that such applications enhanced students' conceptual understanding of the subject, their desire to understand the unknown and their reasoning skills. Another result of our study was that the students were able to determine the procedure in the I diagram with ease. This is because they determined the stages of the analysis by asking questions related to cation analysis in the experimental design section, which allowed them to engage in reasoning about the analysis of the relevant cation group.

The relation between performance and achievement on the way to meaningful learning

The results of the SQAC achievement test were examined in order to determine the relation between the students' performance and achievement. A positive correlation was found between the average scores obtained from this test and the I diagram performance scores. Accordingly, we determined that 25% of their achievement could be attributed to their laboratory performance. In the GILA process, the students developed hypotheses based on the experiment's research question, designed the experiment in the question–answer format in the logical argument section, discussed what the variables in the analysis were, collected and tabulated the data and presented knowledge claims by reaching a conclusion. During all these steps, the students structured their knowledge by carrying out cation analysis in a systematic manner, inquiring into and suggesting scientific explanations. Martin-Hansen (2002) pointed out that students should be presented with opportunities to pose questions related to the experiment, develop hypotheses, collect and analyze data, seek answers to their questions and make decisions about the outcome of the experiment in order to improve the effectiveness of laboratory applications. Hofstein et al. (2005) and Mandler et al. (2014) demonstrated that students were able to structure their knowledge while conducting experiments with the GILA process, which presented them with these opportunities. In addition, the process of structuring knowledge influenced their achievement in the SQAC experiments (21%). By both conducting experiments and following the experimental steps with the help of the diagram, the students enhanced their knowledge and acquired lasting knowledge without rote learning. Also, students were able to manage their own learning while conducting experiments (Mandler et al., 2012). The knowledge acquired by the students in relation to SQACs positively affected their scores on both the I diagram and the achievement test. In addition, the results obtained by the students in the SQAC achievement test were supported by the increasing achievement category which emerged in the interview findings. In their statements, the students reported that they were able to write down the analysis steps and the products formed in the chemical reactions and to learn about the characteristics of cations and their analytical reactions. It can thus be said that the GILA process performed via the I diagram contributed to the students' learning in SQAC experiments. Similar to our results, Hmelo-Silver et al. (2007) reported that students acquired deeper and more meaningful knowledge in this process and that this was reflected in their achievement levels. Studies in the literature have highlighted that inquiry-based laboratory applications increase student achievement and contribute to the improvement of cognitive skills such as problem-solving, analysis and decision-making (McDermott et al., 2000; Scherr, 2003; Hofstein et al., 2004; Acar Sesen and Tarhan, 2013). Also, as mentioned in other studies, a curriculum based on the GILA may increase student achievement in both theoretical courses and applications (Schroeder et al., 2007; Geier et al., 2008; Lewis and Lewis, 2008).
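A brief note on the arithmetic behind this figure: assuming the 25% value denotes the shared variance (coefficient of determination, R²) between the I diagram performance scores and the achievement test scores, the implied Pearson correlation is

$$r = \sqrt{R^{2}} = \sqrt{0.25} = 0.50,$$

which corresponds to a moderate positive correlation according to commonly used guidelines (e.g., Evans, 1996).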

In the GILA process, the students actively engaged in the experimental process and gained the ability to work safely and in a disciplined manner in the laboratory, to identify the cations in each group, to write down the reactions of cations with different reagents, to separate cations present in a mixture using the analytical method, and to calculate and report results. These were all among the learning outcomes of the course. Köseoğlu and Bayır (2012) reported that inquiry-based laboratory teaching applied to the topics of gravimetry, neutralimetry, manganometry and argentometry in the Analytical Chemistry Laboratory contributed to student learning in these topics. In a similar study, Demirelli (2003) inferred that conceptual learning on the topics of electrode calibration and the Gran method in the Analytical Chemistry Laboratory was improved with the help of an activity based on guided inquiry. Also, Fakayode (2014) found that guided inquiry laboratory experiments provided an opportunity for students to better understand the concepts and the practical utility of multiple analytical techniques for solving real-world problems during chemical analysis.
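As an illustration of the 'reactions of cations with different reagents' listed among these outcomes (standard group 2 chemistry given here for clarity, not reproduced from the students' reports), thioacetamide hydrolyses in hot acidic solution to release hydrogen sulfide, which then precipitates copper and cadmium as their sulfides:

$$\mathrm{CH_{3}CSNH_{2} + H_{2}O \rightarrow CH_{3}CONH_{2} + H_{2}S}$$
$$\mathrm{Cu^{2+} + H_{2}S \rightarrow CuS(s) + 2\,H^{+}} \quad \text{(black precipitate)}$$
$$\mathrm{Cd^{2+} + H_{2}S \rightarrow CdS(s) + 2\,H^{+}} \quad \text{(yellow precipitate)}$$

The released H2S also accounts for the 'bad smell' one student recalled in connection with thioacetamide.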

Difficulties the students have experienced in the GILA process

One of the results of this study was that the students had difficulty with the logical argument, data transformation and variable definition sections. In the logical argument section, the students develop hypotheses in order to answer the research question and design the experiment with if… then statements depending on their hypotheses (Phillips and Germann, 2002). For students, entering into such a thinking process requires considerable effort. This is because establishing links between the chemical events taking place during cation analysis, developing a hypothesis to explain the reasons behind these events and predicting the results of the analysis are all quite complex tasks. For this reason, some students said that they had difficulty with this process. However, a considerable number of students stated that the I diagram led them to think like a scientist and to do research (12%) and to think analytically (18%). DeBoer (1991) emphasized the importance of thinking like a scientist when structuring knowledge. Farrell et al. (1999) stated that GILA-based experiments led students to develop and test hypotheses and that the purpose of these applications was to establish links between observations and principles. In the variable definition section, the students tried to identify the dependent, independent and manipulated variables in the experiment. Questions such as ‘Which is the affecting chemical and which is the affected chemical, the reagent or the sample solution?’, posed to the students by the researchers, allowed them to brainstorm and identify these variables in the experiment. However, the students' statements showed that they confused these variables or had difficulty identifying them.
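A hypothetical example of the if… then statements expected in the logical argument section, built on standard carbonate-group chemistry rather than taken from an actual student diagram, might read: ‘If the unknown solution contains Ba2+, then adding (NH4)2CO3 in an ammoniacal medium will give a white precipitate, and the flame test will show a yellowish-green colour’, corresponding to

$$\mathrm{Ba^{2+}(aq) + CO_{3}^{2-}(aq) \rightarrow BaCO_{3}(s)} \quad \text{(white precipitate)}$$

Formulating such conditionals requires students to tie a predicted observation to the underlying chemistry before carrying out the analysis, which is precisely the effortful step many of them found difficult.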

The findings indicated that there was also negative feedback regarding the I diagram. This feedback focused particularly on the time constraint and on problems related to writing. The fact that SQAC experiments are time-consuming and require continuous inquiry might be the reason behind the students' negative opinions. Fakayode (2014) carried out analytical chemistry experiments based on the guided inquiry approach and received similar negative opinions from students after preliminary applications. He stated that the opinions of the students changed in a positive and favourable direction following discussions about the applications.

Conclusions and implications for practice

In conclusion, it can be said that performance and achievement in the Analytical Chemistry Laboratory improved thanks to the use of the I diagram based on the GILA. Such laboratory applications can play a constructive role in student-centred chemistry laboratories. Increasing the number of similar laboratory applications will make the curriculum more student-centred, allow prospective chemistry teachers to gain experience with new approaches to laboratory work and improve their knowledge and skills. In our study, the students gained a better idea of how to conduct scientific research thanks to the I diagram. They developed scientific thinking skills by following the steps of scientific research in this application, which will help them use these skills when conducting experiments in their future professional lives. In addition, this study focused on the relation between student performance and achievement in the GILA process implemented using the I diagram.

SQAC experiments in the analytical chemistry laboratory are exploratory experiments. In their implementation, ensuring that the analysis steps are developed by the students themselves can improve their research and scientific thinking skills, and experimental processes that appear complicated and time-consuming can be carried out in a more planned and orderly manner. Future studies could investigate which science process skills improve the most with the I diagram, or how science process skills are affected by it. The I diagram is also an assessment tool that helps teachers follow students' learning processes and evaluate the products of that learning. From this perspective, reviewing the completed I diagram at the end of an analysis may help a student identify more precisely what kind of mistake was made at which step.

Regarding the limitations of applying the I diagram within the GILA, these include crowded classes and timing problems in the implementation of the curriculum. In crowded classrooms, the application could be carried out by having students work in groups. Because students take active roles in the process, teacher guidance in this approach could help them understand the subject better, allow the teacher to intervene immediately when problems arise and thus make instruction more productive. This approach, in which teacher guidance is at the forefront, could also provide teachers with valuable experience, and improving themselves through such experience could help them design better learning processes. In addition, because our research is limited to a case study method, further studies could investigate the effectiveness of the I diagram through experimental studies in laboratories across various fields of chemistry. The effectiveness of the I diagram at different levels of education and in different courses and subjects could also be examined with respect to students' knowledge, skills and attitudes. During the GILA process in various laboratory applications, students could be provided with the I diagram as a tool that enables them to follow their own mental processes. Researchers could also attempt to improve students' conceptual learning and meta-cognitive skills by carrying out GILA-based laboratory applications in science education, allowing them to acquire a different perspective on science through scientific inquiry. With new applications developed in this way, it may be possible to move away from the classical approach to laboratory education based on closed-ended experiments.

Appendix 1

Sample questions from the SQAC achievement test

(Q5) Which one of the statements given below is the reason for the subdivision of group 2 cations into groups 2A and 2B?

This is due to the fact that… … …

(A) they dissolve by forming soluble complexes if concentrated ammonium acetate is used.

(B) they dissolve in a cooling bath by forming sulfur if dilute nitric acid is used.

(C) they precipitate in the form of hydroxides if concentrated sodium hydroxide is used.

(D) they precipitate if ammonium polysulphide is used.

(Q6) Which procedure should be applied in order to separate cations in a sample solution including Zn2+ and Mn2+ cations?

(A) Aqua regia should be added and the solution should be heated.

(B) The solution should be neutralized with NH3 and boiled for a few minutes by adding 3% H2O2 and 2 N NaOH.

(C) The solution should be made basic by adding aqueous NH3 and then the resulting mixture should be centrifuged.

(D) 3 M NaOH and 6% H2O2 should be added and then the resulting mixture should be heated.

(Q16) I. Their ions are highly charged.

II. They have large atomic radii.

III. Their inner electron shells are full.

IV. Their salts are colourless.

The statements above concern why group 5 cations may not form complexes. Which of them are correct?

(A) I and II

(B) I, II and III

(C) II and III

(D) I, III and IV

Appendix 2

A sample I diagram document of a student



Appendix 3

The average scores of the students from each section of the I diagram



References

  1. Acar Sesen B. and Tarhan L., (2013), Inquiry-based laboratory activities in electrochemistry: high school students' achievements and attitudes, Res. Sci. Educ., 43(1), 413–435,  DOI:10.1007/s11165-011-9275-9.
  2. Akerson V. L. and Hanuscin D. L., (2007), Teaching nature of science through inquiry: results of a 3-year professional development program, J. Res. Sci. Teach., 44(5), 653–680,  DOI:10.1002/tea.20159.
  3. Allen J. B., Barker L. N. and Ramsden J. H., (1986), Guided inquiry laboratory, J. Chem. Educ., 63(6), 533–534,  DOI:10.1021/ed063p533.
  4. Anderson R. D., (2007), Inquiry as an organizing theme for science curricula, in Abell S. K. and Lederman N. G. (ed.), Handbook of research on science education, Mahwah, NJ: Lawrence Erlbaum Associates, pp. 807–830.
  5. Arnold J., Kremer K. and Mayer J., (2014), Understanding students' experiments-what kind of support do they need in inquiry tasks? Int. J. Sci. Educ., 36(16), 2719–2749,  DOI:10.1080/09500693.2014.930209.
  6. Bailey K. D., (1982), Methods of social research, 2nd edn, New York: The Free Press.
  7. Banchi H. and Bell R., (2008), The many levels of inquiry, Sci. Child., 46(2), 26–29.
  8. Bell R. L., Smetana L. and Binns I., (2005), Simplifying inquiry instruction, Sci. Teach., 72(7), 30–33.
  9. Blanchard M. R., Southerland S. A., Osborne J. W., Sampson V. D., Annetta L. A. and Granger E. M., (2010), Is inquiry possible in light of accountability?: a quantitative comparison of the relative effectiveness of guided inquiry and verification laboratory instruction, Sci. Educ., 94(4), 577–616,  DOI:10.1002/sce.20390.
  10. Bodner G. M., (1986), Constructivism: a theory of knowledge, J. Chem. Educ., 63(10), 873–878,  DOI:10.1021/ed063p873.
  11. Bodner G. M., (1992), Why changing the curriculum may not be enough, J. Chem. Educ., 69(3), 186–190,  DOI:10.1021/ed069p186.
  12. Bredderman T., (1983), Effects of activity-based elementary science on student outcomes: a quantitative synthesis, Rev. Educ. Res., 53(4), 499–518,  DOI:10.3102/00346543053004499.
  13. Brownell S. E., Kloser M. J., Fukami T. and Shavelson R. J., (2012), Undergraduate biology lab courses: comparing the impact of traditionally based “cookbook” and authentic research-based courses on student lab experiences, J. Coll. Sci. Teach., 41(4), 36–45.
  14. Buck L. B., Bretz S. L. and Towns M. H., (2008), Characterizing the level of inquiry in the undergraduate laboratory, J. Coll. Sci. Teach., 38(1), 52–58.
  15. Bunterm T., Lee K., Ng Lan Kong J., Srikoon S., Vangpoomyai P., Rattanavongsa J. and Rachahoon G., (2014), Do different levels of inquiry lead to different learning outcomes? A comparison between guided and structured inquiry, Int. J. Sci. Educ., 36(12), 1937–1959,  DOI:10.1080/09500693.2014.886347.
  16. Bybee R. W., (2000), Teaching science as inquiry, in Minstrell J. and van Zee E. (ed.), Inquiring into inquiry learning and teaching in science, Washington, DC: American Association for the Advancement of Science, pp. 20–46.
  17. Bybee R. W., (2006), Scientific inquiry and science teaching, in Lawrence F. and Lederman N. G. (ed.), Scientific inquiry and nature of science: implications for teaching, learning, and teacher education, Netherlands: Springer, pp. 1–14.
  18. Chatterjee S., Williamson V. M., McCann K. and Peck M. L., (2009), Surveying students' attitudes and perceptions toward guided-inquiry and open-inquiry laboratories, J. Chem. Educ., 86(12), 1427–1432,  DOI:10.1021/ed086p1427.
  19. Chin C. and Chia L. G., (2006), Problem-based learning: using ill-structured problems in biology project work, Sci. Educ., 90(1), 44–67,  DOI:10.1002/sce.20097.
  20. Coffman T., (2012), Using inquiry in the classroom: developing creative thinkers and information literate students, 2nd edn, Lanham, MD: Rowman & Littlefield Education.
  21. Colburn A., (2000), An inquiry primer, Sci. Scope, 23(6), 42–44.
  22. Conway C. J., (2014), Effects of guided inquiry versus lecture instruction on final grade distribution in a one-semester organic and biochemistry course, J. Chem. Educ., 91(4), 480–483,  DOI:10.1021/ed300137z.
  23. Cresswell S. L. and Loughlin W. A., (2015), An interdisciplinary guided inquiry laboratory for first year undergraduate forensic science students, J. Chem. Educ., 92(10), 1730–1735,  DOI:10.1021/acs.jchemed.5b00183.
  24. Davis S. A., (2005), Inquiry-based learning templates for creating online educational paths, Master of science thesis, Texas A&M University.
  25. DeBoer G. E., (1991), A history of ideas in science education: implications for practice, New York: Teachers College Press.
  26. Demirelli H., (2003), A laboratory activity based on constructivist learning approach: electrode calibration and gran functions, Gazi Univ. J. Gazi Educ. Fac., 23(2), 161–170.
  27. Domin D. S., (1999), A review of laboratory instructional styles, J. Chem. Educ., 76(4), 543–547,  DOI:10.1021/ed076p543.
  28. Ellis B. B. and Mead A. D., (2002), Item analysis: theory and practice using classical and modern test theory, in Rogelberg S. G. (ed.), Handbook of research methods in industrial and organizational psychology, Malden, MA: Blackwell, pp. 324–343.
  29. Evans J. D., (1996), Straightforward statistics for the behavioral sciences, Pacific Grove, CA: Brooks/Cole publishing.
  30. Evans N., (2001), Inquiry-based professional development: letting questions direct teachers' learning [pdf], retrieved from http://files.eric.ed.gov/fulltext/ED461518.pdf.
  31. Fakayode S. O., (2014), Guided-inquiry laboratory experiments in the analytical chemistry laboratory curriculum, Anal. Bioanal. Chem., 406(5), 1267–1271,  DOI:10.1007/s00216-013-7515-8.
  32. Farrell J., Moog R. and Spencer J., (1999), A guided inquiry general chemistry course, J. Chem. Educ., 76(4), 570–574,  DOI:10.1021/ed076p570.
  33. Ford M. J., (2008), Disciplinary authority and accountability in scientific practice and learning, Sci. Educ., 92(3), 404–423,  DOI:10.1002/sce.20263.
  34. Friedler Y. and Tamir P., (1990), Life in science laboratory classrooms at secondary level, in Hegarty-Hazel E. (ed.), The student laboratory and the science curriculum, London: Routledge, pp. 337–356.
  35. Furtak E. M., Seidel T., Iverson H. and Briggs D. C., (2012), Experimental and quasi-experimental studies of inquiry-based science teaching: a meta-analysis, Rev. Educ. Res., 82(3), 300–329,  DOI:10.3102/0034654312457206.
  36. Gaddis B. A. and Schoffstall A. M., (2007), Incorporating guided-inquiry learning into the organic chemistry laboratory, J. Chem. Educ., 84(5), 848–851,  DOI:10.1021/ed084p848.
  37. Geier R., Blumenfeld P. C., Marx R. W., Krajcik J. S., Fishman B., Soloway E. and Clay-Chambers J., (2008), Standardized test outcomes for students engaged in inquiry-based science curricula in the context of urban reform, J. Res. Sci. Teach., 45(8), 922–939,  DOI:10.1002/tea.20248.
  38. Gibson H. L. and Chase C., (2002), Longitudinal impact of an inquiry-based science program on middle school students' attitudes toward science, Sci. Educ., 86(5), 693–705,  DOI:10.1002/sce.10039.
  39. Gunstone R. F., (1991), Reconstructing theory from practical experience, in Woolnough B. E. (ed.), Practical science, Milton Keynes: Open University Press, pp. 67–77.
  40. Gupta T., Burke K. A., Mehta A. and Greenbowe T. J., (2015), Impact of guided-inquiry-based instruction with a writing and reflection emphasis on chemistry students' critical thinking abilities, J. Chem. Educ., 92(1), 32–38,  DOI:10.1021/ed500059r.
  41. Handelsman J., Ebert-May D., Beichner R., Bruns P., Chang A., DeHaan R., Gentile J., Lauffer S., Stewart J., Tilghman S. M. and Wood W. B., (2004), Scientific teaching, Science, 304(5670), 521–522,  DOI:10.1126/science.1096022.
  42. Herron M. D., (1971), The nature of scientific inquiry, School Rev., 79(2), 171–212.
  43. Hicks R. W. and Bevsek H. M., (2012), Utilizing problem-based learning in qualitative analysis lab experiments, J. Chem. Educ., 89(2), 254–257,  DOI:10.1021/ed1001202.
  44. Hmelo-Silver C. E., Duncan R. G. and Chinn C. A., (2007), Scaffolding and achievement in problem-based and inquiry learning: a response to Kirschner, Sweller, and Clark (2006), Educ. Psychol., 42(2), 99–107,  DOI:10.1080/00461520701263368.
  45. Hofstein A., (2004), The laboratory in chemistry education: thirty years of experience with developments, implementation and research, Chem. Educ. Res. Pract., 5(3), 247–264,  10.1039/B4RP90027H.
  46. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundation for the 21st century, Sci. Educ., 88(1), 28–54,  DOI:10.1002/sce.10106.
  47. Hofstein A., Shore R. and Kipnis M., (2004), Providing high school chemistry students with opportunities to develop learning skills in an inquiry-type laboratory: a case study, Int. J. Sci. Educ., 26(1), 47–62,  DOI:10.1080/0950069032000070342.
  48. Hofstein A., Navon O., Kipnis M. and Mamlok-Naaman R., (2005), Developing students' ability to ask more and better questions resulting from inquiry-type chemistry laboratories, J. Res. Sci. Teach., 42(10), 791–806,  DOI:10.1002/tea.20072.
  49. Irinoye J., Bamidele E. F., Adetunji A. A. and Awodele B. A., (2014), Relative effectiveness of guided and demonstration methods on students' performance in practical chemistry in secondary schools in Osun State, Nigeria, Adv. Soc. Sci. Res. J., 2(2), 21–30,  DOI:10.14738/assrj.22.824.
  50. Jalil P. A., (2006), A procedural problem in laboratory teaching: experiment and explain, or vice-versa, J. Chem. Educ., 83(1), 159–163,  DOI:10.1021/ed083p159.
  51. Karamustafaoğlu S., (2011), Improving the science process skills ability of science student teachers using I diagrams, Eurasian J. Phys. Chem. Educ., 3(1), 26–38.
  52. Kehoe J., (1995), Basic item analysis for multiple-choice tests [pdf], retrieved from http://files.eric.ed.gov/fulltext/ED398237.pdf.
  53. Ketpichainarong W., Panijpan B. and Ruenwongsa P., (2010), Enhanced learning of biotechnology students by an inquiry-based cellulase laboratory, Int. J. Environ. Sci. Educ., 5(2), 169–187.
  54. Kirschner P. A., Sweller J. and Clark R. E., (2006), Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching, Educ. Psychol., 41(2), 75–86,  DOI:10.1207/s15326985ep4102_1.
  55. Klein S. P., Stecher B. M., Shavelson R. J., McCaffrey D., Ormseth T., Bell R. M., Comfort K. and Othman A. R., (1998), Analytic versus holistic scoring of science performance tasks, Appl. Meas. Educ., 11(2), 121–137,  DOI:10.1207/s15324818ame1102_1.
  56. Köseoğlu F. and Bayır E., (2012), The effects of inquiry-based analytical chemistry laboratory on prospective teachers' conceptual change, perceptions of science and the ways of learning science, J. Turk. Educ. Sci., 10(3), 603–625.
  57. Lewis S. and Lewis J., (2008), Seeking effectiveness and equity in a large college chemistry course: an HLM investigation of peer-led guided inquiry, J. Res. Sci. Teach., 45(7), 794-811,  DOI:10.1002/tea.20254.
  58. Lim B. R., (2001), Guidelines for designing inquiry-based learning on the web: online professional development of educators, Unpublished doctoral dissertation, Indiana University, ABD.
  59. Llewellyn D., (2002), Inquire within: implementing inquiry-based science standards, Thousand Oaks, CA: Corwin Press.
  60. Lunetta V. N., Hofstein A. and Clough M., (2007), Learning and teaching in the school science laboratory: an analysis of research, theory, and practice, in Lederman N. and Abel S. (ed.), Handbook of research on science education, Mahwah, NJ: Lawrence Erlbaum, pp. 393–441.
  61. Madhuri G. V., Kantamreddi V. S. S. N. and Prakash Goteti L. N. S., (2012), Promoting higher order thinking skills using inquiry-based learning, European J. Eng. Educ., 37(2), 117–123,  DOI:10.1080/03043797.2012.661701.
  62. Mandler D., Mamlok-Naaman R., Blonder R., Yayon M. and Hofstein A., (2012), High school chemistry teaching through environmentally oriented curricula, Chem. Educ. Res. Pract., 13(1), 80–92,  10.1039/C1RP90071D.
  63. Mandler D., Blonder R., Yayon M., Mamlok-Naaman R. and Hofstein A., (2014), Developing and implementing inquiry-based, water quality laboratory experiments for high school students to explore real environmental issues using analytical chemistry, J. Chem. Educ., 91(4), 492–496,  DOI:10.1021/ed200586r.
  64. Martin-Hansen L., (2002), Defining inquiry, Sci. Teach., 69(2), 34–37.
  65. Mayer R. E., (2004), Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction, Am. Psychol., 59(1), 14–19,  DOI:10.1037/0003-066X.59.1.14.
  66. McDermott L. C., Shaffer P. S. and Constantinou C. P., (2000), Preparing teachers to teach physics and physical science by inquiry, Phys. Educ., 35(6), 71–85,  DOI:10.1088/0031-9120/35/6/306.
  67. Merriam S. B., (2014), Qualitative research: a guide to design and implementation, 3rd edn, New York: John Wiley & Sons, Inc.
  68. Nakhleh M. B., (1994), Chemical education research in the laboratory environment: how can research uncover what students are learning? J. Chem. Educ., 71(3), 201–205,  DOI:10.1021/ed071p201.
  69. Nakhleh M. B. and Krajcik J. S., (1993), A protocol analysis of the influence of technology on students' actions, verbal commentary, and thought processes during the performance of acid–base titrations, J. Res. Sci. Teach., 30(9), 1149–1168,  DOI:10.1002/tea.3660300911.
  70. National Research Council (NRC), (2000), Inquiry and the national science education standards: a guide for teaching and learning, Washington, DC: National Academy Press.
  71. National Research Council (NRC), (2012), A framework for K-12 science education: practices, crosscutting concepts, and core ideas, Washington, DC: National Academies Press.
  72. Novak J. D. and Gowin D. B., (1984), Learning how to learn, New York, NY: Cambridge University Press.
  73. Paris S. G. and Paris A. H., (2001), Classroom applications of research on self-regulated learning, Educ. Psychol., 36(2), 89–101.
  74. Phillips K. A. and Germann P. J., (2002), The inquiry ‘I’: a tool for learning scientific inquiry, Am. Bio. Teach., 64(7), 512–520,  DOI:10.2307/4451356.
  75. Pine J., Aschbacher P., Roth E., Jones M., McPhee C., Martin C., Phelps S., Kyle T. and Foley B., (2006), Fifth graders' science inquiry abilities: a comparative study of students in hands-on and textbook curricula, J. Res. Sci. Teach., 43(5), 467–484,  DOI:10.1002/tea.20140.
  76. Poock J. R., Burke K. A., Greenbowe T. J. and Hand B. M., (2007), Using the science writing heuristic in the general chemistry laboratory to improve students' academic performance, J. Chem. Educ., 84(8), 1371–1379,  DOI:10.1021/ed084p1371.
  77. Raydo M. L., Church M. S., Taylor Z. W., Taylor C. E. and Danowitz A. M., (2015), A guided inquiry liquid/liquid extractions laboratory for introductory organic chemistry, J. Chem. Educ., 92(1), 139–142,  DOI:10.1021/ed400861r.
  78. Rissing S. W. and Cogan, J. G., (2009), Can an inquiry approach improve college student learning in a teaching laboratory? CBE – Life Sci. Educ., 8(1), 55–61,  DOI:10.1187/cbe.08-05-0023.
  79. Russell C. B. and Weaver G. C., (2011), A comparative study of traditional, inquiry-based, and research-based laboratory curricula: impacts on understanding of the nature of science, Chem. Educ. Res. Pract., 12(1), 57–67,  10.1039/C1RP90008K.
  80. Scherr R. E., (2003), An implementation of physics by inquiry in a large-enrollment class, Phys. Teach., 41(2), 113–118,  DOI:10.1119/1.1542051.
  81. Schraw G., Crippen K. J. and Hartley K., (2006), Promoting self-regulation in science education: metacognition as part of a broader perspective on learning, Res. Sci. Educ., 36(1), 111–139,  DOI:10.1007/s11165-005-3917-8.
  82. Schroeder C., Scott T., Tolson H., Huang T. and Lee, Y., (2007), A meta-analysis of national research: effects of teaching strategies on student achievement in science in the United States, J. Res. Sci. Teach., 44(10), 1436–1460,  DOI:10.1002/tea.20212.
  83. Schwab J. J., (1962), The teaching of science as enquiry, in Schwab J. J. and Brandwein P. F. (ed.), The teaching of science, Cambridge, MA: Harvard University Press.
  84. Shiland T. W., (1999), Constructivism: the implication for laboratory work, J. Chem. Educ., 76(1), 107–109,  DOI:10.1021/ed076p107.
  85. Singer S., Hilton M. and Schweingruber H., (2005), Needing a new approach to science labs, Sci. Teach., 72(7), 10.
  86. Taber K. S., (2014), Ethical considerations of chemistry education research involving ‘human subjects’, Chem. Educ. Res. Pract., 15, 109–113.
  87. Taitelbaum D., Mamlok-Naaman R., Carmeli M. and Hofstein A., (2008), Evidence for teachers' change while participating in a continuous professional development program and implementing the inquiry approach in the chemistry laboratory, Int. J. Sci. Educ., 30(5), 593–617,  DOI:10.1080/09500690701854840.
  88. Tamir P. and Lunetta V. N., (1981), Inquiry related tasks in high school science laboratory hand-books, Sci. Educ., 65(5), 477–484,  DOI:10.1002/sce.3730650503.
  89. Tobin K. G., (1990), Research on science laboratory activities: in pursuit of better questions and answers to improve learning, Sch. Sci. Math., 90(5), 403–418,  DOI:10.1111/j.1949-8594.1990.tb17229.x.
  90. Waters N. C., (2012), The advantages of inquiry-based laboratory exercises within the life sciences [pdf], retrieved from http://www.westpoint.edu/cfe/Literature/Waters_12.pdf.
  91. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2015), Teaching assistants perceptions of a training to support an inquiry-based general chemistry laboratory course, Chem. Educ. Res. Pract., 16(1), 824–842,  10.1039/c5rp00104h.
  92. Zimmaro D. M., (2004), Writing good multiple-choice exams. Measurement and Evaluation Center, University of Texas, Austin [pdf], retrieved from http://www.utexas.edu/academic/mec/research/pdf/writingmcexamshandout.pdf.
  93. Zuiker S. and Whitaker J. R., (2014), Refining inquiry with multi-form assessment: formative and summative assessment functions for flexible inquiry, Int. J. Sci. Educ., 36(6), 1037–1059,  DOI:10.1080/09500693.2013.834.
