Pre-service teachers' use of epistemic criteria in the assessment of scientific procedures for identifying microplastics in beach sand

Beatriz Crujeiras-Pérez * and Pablo Brocos
Departamento de Didácticas Aplicadas, Facultade de Ciencias da Educación, Universidade de Santiago de Compostela, Spain. E-mail: beatriz.crujeiras@usc.es; pablo.brocos@usc.es

Received 16th June 2020, Accepted 14th November 2020

First published on 23rd November 2020


Abstract

This study addresses the use of epistemic criteria related to the scientific practice of inquiry in the context of environmental chemistry. In particular, it analyses the types of criteria used by pre-service teachers when assessing the adequacy of several scientific procedures for identifying microplastics in beach sand, as well as the ways in which these participants make use of said criteria. The participants were 22 pre-service primary teachers who, working in small groups of 3–4, were given the task of assessing the scientific quality of three different procedures before selecting the one they considered to be the best option. The data collected includes audio recordings of the participants' small-group conversations and their written comments. The data analysis is framed in qualitative content analysis: the participants' conversations were transcribed and coded using the ATLAS.ti software. The coding frameworks used to address each research question were developed by taking into consideration both the literature and the collected data. The main results indicate different patterns in the types of criteria used in the participants' assessments, as well as different uses of criteria within each of the small groups. These results could have been influenced by the participants' limited knowledge of both scientific inquiry and chemistry.


Introduction

In this paper, we have designed and assessed a strategy that can be used to foster pre-service primary teachers' capacity to design scientific investigations, taking into account certain aspects of the disciplinary epistemic knowledge related to scientific inquiry. Evaluating and designing scientific inquiry is considered a requirement for achieving scientific literacy, and it is one of the scientific competencies assessed in the PISA tests (OECD, 2016a). This competence includes, among other aspects, the ability to propose ways of addressing scientific questions and to utilize certain specific skills associated with scientific inquiry, such as formulating hypotheses and questions, designing procedures, interpreting data, and drawing conclusions.

The relevance of this study emanates from the 2015 PISA report, which is the most recent report to focus on science as the major domain assessed (OECD, 2016b). The results of this report indicate that of the three assessed scientific competencies, evaluating and designing investigations yielded the lowest score in 55% of the evaluated countries (OECD, 2016b). Furthermore, numerous issues regarding the implementation of scientific inquiry in primary and secondary education, as well as in teacher education, have been identified in the literature. One of the main problems that students encounter is the difficulty in designing complete and well-defined procedures (Zimmerman, 2000; Crujeiras-Pérez and Jiménez-Aleixandre, 2017; García-Carmona et al., 2017). For teachers, the main issues are associated with a lack of experience and training in carrying out scientific investigations (Capps et al., 2012), as well as a lack of trust in their own ability to implement inquiry-oriented strategies in the classroom setting (Rikmanis et al., 2012). These difficulties, along with the PISA assessment results, highlight the need, firstly, to devise tasks aimed at promoting students' competences in the evaluation and design of scientific inquiries and, secondly, to develop teaching strategies that encourage instructors to appropriately implement inquiry approaches in their classrooms. This potential course of action is in line with the educational implications that have been emphasized in numerous studies about inquiry (e.g. Biggers and Forbes, 2012; Crujeiras-Pérez and Jiménez-Aleixandre, 2017; García-Carmona et al., 2017).
However, we believe that if participants are not engaged with the nature of scientific inquiry, that is to say, with the characteristics of the scientific processes through which scientific knowledge is constructed and justified (Schwartz et al., 2008), this perspective may be insufficient for achieving a substantial improvement in learning about scientific inquiry. This assertion is grounded in empirical studies that have determined that students do not acquire learning about scientific inquiry simply by doing science (e.g. Bell et al., 2003; Metz, 2004; Sandoval and Morrison, 2003), and it is also backed by science education literature reviews (Lederman et al., 2014; Lederman et al., 2013). Therefore, we propose that additional aspects directly involved in the construction of scientific knowledge, such as the epistemic knowledge associated with scientific inquiry, be taken into account, and this constitutes the focus of this study.

Epistemic knowledge involved in science learning

Epistemic knowledge draws from epistemology, the discipline that studies the nature of knowledge and knowing. This type of knowledge can be broadly defined as a set of knowledge, practices and motivations that are related to what is considered as knowledge and to how knowledge claims are justified (Chinn et al., 2014).

Within the framework of Science Education, epistemic knowledge is understood as the comprehension of the role of the specific constructs that are involved in the production of knowledge and of the essential characteristics of the knowledge-building processes (Duschl, 2008).

Addressing epistemic knowledge is relevant for science education since a higher level of epistemic knowledge allows for more productive learning about scientific practices and contents (Sandoval, 2005; Elby et al., 2016). Epistemic knowledge has been examined from a range of perspectives: disciplinary, personal and social (Kelly et al., 2012). This study focuses on the disciplinary perspective, which, according to these authors, includes knowledge about the nature of the evidence, criteria for selecting scientific theories, and the role of theoretical dependence in scientific methodologies, among other aspects.

We agree with Sandoval's (2014) consideration that epistemic knowledge is manifested through actions and reasoning, and, as such, it should be evaluated within its practice. This is also in line with Kelly's (2008) proposal that suggested that science learning should be evaluated through the study of the participants' practices in situ. Therefore, in this study the participants’ epistemic knowledge was examined by analysing their engagement in an inquiry task, placing particular emphasis on the epistemic criteria used when assessing several procedures for identifying microplastics in beach sand.

Epistemic criteria are the standards that are used by scientists in order to evaluate the validity and accuracy of scientific products (Pluta et al., 2011). According to these authors, making epistemic criteria explicit in the classroom setting is a good way of making the cognitive practices of science visible, for example, the awareness that criteria should be applied contextually rather than rigidly.

Several studies can be found in the literature that address students' use of criteria in scientific practices such as argumentation (Hogan and Maglienti, 2001), modelling (Pluta et al., 2011) and inquiry (Samarapungavan et al., 2006). However, we have yet to identify any studies in which the specific criteria for assessing a scientific procedure are identified in a given context. For this study, we have used the evaluation of the adequacy of several scientific procedures as the basis for learning both about the design itself (Pérez-Vidal and Crujeiras-Pérez, 2019) and about the epistemic knowledge associated with inquiry or, in other words, about the nature of scientific inquiry.

Learning about the nature of scientific inquiry

The epistemic knowledge of science, which is also known as the Nature of Science (NOS), is understood as a way of knowing and as the values and beliefs inherent to the development of scientific knowledge (Lederman, 1992). Nowadays, the literature also differentiates between NOS and NOSI (nature of scientific inquiry), with NOSI being related to either the nature of the processes involved in scientific inquiry (Schwartz et al., 2008) or the nature of the practices that are used by scientists to generate new scientific knowledge (Schwartz et al., 2012). Several authors such as Lederman et al. (2014), Osborne et al. (2003), or Schwartz et al. (2008) have developed frameworks for assessing NOSI, which focus on aspects such as research guided by scientific questions, diversity of methods of scientific investigations, justification of knowledge claims, and distinctions between data and evidence.

Addressing NOSI in science lessons is important as it not only allows students to perform better investigations, but it also enables them to participate in science-related discourses and make decisions as scientifically literate citizens (Flick and Lederman, 2006). In fact, as Schwartz et al. (2012) pointed out, learners may be able to conduct a scientific investigation, but it is important that they also understand the nature of the practices involved in order to be able to evaluate the validity of the claims and understand how scientific knowledge is developed and accepted.

Several epistemic aspects aligned with NOSI frameworks have been examined in empirical studies, for example, considering systematicity as a feature of the scientific methodology that leads to valued forms of knowledge (Sandoval and Reiser, 2004), considering hypotheses as being subject to falsification (Constantinou and Papadouris, 2004), acknowledging the role of evidence in the justification of scientific conclusions (Kittleson, 2011), identifying patterns in data sets (Sandoval et al., 2000), and understanding that conclusions should cohere with the range of available evidence (Falk and Yarden, 2009).

In order to ensure the adequate teaching and learning of NOSI, it is fundamental that teachers are familiar with these epistemic aspects, and, as such, these must be addressed in teacher training programs. However, this sort of training is not common in science education instructional programs (Lederman et al., 2013). Given the lack of training on this aspect, teachers may not be aware of its relevance in science instruction. In this regard, Strippel and Sommer (2015) explored the manner in which chemistry teachers addressed NOSI issues in their chemistry laboratory lessons, observing that these were not a primary goal for them.

It is worth mentioning that there is a debate as to how NOS and NOSI issues should be addressed in science education, namely whether this should be done in general terms or specifically for each discipline. In this sense, we agree with the idea of addressing the particular nature of chemical knowledge, which was proposed by Erduran (2001), Erduran and Scerri (2003) and Vesterinen and Aksela (2009). In this study, we have addressed NOSI aspects through a general perspective, focusing on how pre-service primary teachers apply epistemic criteria in their assessment of a scientific investigation. This constitutes the first step in their NOSI training, giving them the opportunity to become acquainted with the general epistemic aspects that must be considered in order to be able to plan and carry out adequate scientific investigations. Further instruction should include specific aspects associated with each scientific discipline.

The research questions that guided the investigation are:

(1) What epistemic criteria do pre-service primary teachers use for selecting the most appropriate procedure for identifying microplastics in beach sand?

(2) How do participants use the epistemic criteria to select the most appropriate procedure?

Methods

Study context and participants

This study is framed in a qualitative approach and it draws from qualitative content analysis (Schreier, 2014), thereby allowing for a systematic description of the context-dependent meaning of data. This methodological framework has been successfully employed in education research (e.g. Kapustka et al., 2009).

The participants were 22 pre-service primary teachers (PSTs) from a public university located in the northwest of Spain, who were taking the final science education subject included in their syllabus. The PSTs worked in six small groups of 3 to 4 participants (GA, GB, GC, GE, GG and GH).

It is important to note that in the primary education syllabus in Spain, science is taught as a single subject, with no differentiation between the specific scientific disciplines such as chemistry, physics, or biology. The scientific curriculum is organised around five groups of contents, namely: (1) introduction to scientific activity, (2) human beings and health, (3) living beings, (4) matter and energy, (5) technology, apparatus and machines. This curriculum is used as a framework for organising the instructional approaches that are addressed in science teacher education.

The aforementioned subject addressed a specific topic about the disciplinary aspects associated with the epistemic knowledge involved in the development of the scientific practices of inquiry, argumentation and modelling. The task analysed in this study focused on inquiry, and it formed part of a more comprehensive project in which these three scientific practices were addressed. The task was framed in environmental chemistry; the accumulation of microplastics is a current environmental issue that should be addressed in chemistry lessons in order to raise awareness among PSTs of the uses and management of plastics. This topic relates to the curricular content of waste production, pollution and environmental impact included in the Spanish science curriculum for Primary Education. Moreover, the context of identifying microplastics in beach sand relates to specific chemistry contents addressed in primary science lessons all over the world, such as mixtures and solutions and the separation of mixtures. In this study, we explored the potential of this context for engaging PSTs in inquiry and NOSI, as well as for learning chemistry.

The task began with a brief oral presentation about the environmental threat posed by the presence of microplastics in marine and coastal environments. The PSTs were then asked to carry out an investigation to ascertain whether these compounds were found in the regional beaches, highlighting the need to develop a procedure that was both scientifically adequate and reliable. For this purpose, three alternative procedures (A, B and C) were presented to the participants (see appendix), and they were required to choose the procedure that they considered to be the most reliable from a scientific point of view and to justify their choice.

The highest quality procedure (B) was designed by the authors, drawing from a standardized method for the extraction and determination of microplastics in beach sand by means of density separation with a saline solution, developed following a review of procedures (Besley et al., 2017). This adaptation, which was adjusted to the classroom context and to the time available, was previously tested and performed by the authors. The alternative procedures, A and C, did not differ from B in terms of their overall rationale or extraction mechanism; however, they offered lower levels of detail, systematicity, accuracy, reliability and reproducibility. For instance, procedure C does not include two samples for each kind of sand (reliability), and procedure A does not specify the quantity of sand to be analysed (detail). It should be noted that the aim of the task was to select the best procedure for performing the investigation and to justify this selection in terms of these epistemic criteria, rather than mastering the different chemical procedures, which would go beyond the objectives included in the science curriculum for primary education.
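The rationale behind density separation can be illustrated numerically: a saturated NaCl solution is denser than common low-density polymers but far less dense than sand, so microplastics float while sand sinks. The sketch below is a minimal illustration using approximate densities taken from general literature values, not measurements from this study.

```python
# Illustrative sketch of why density separation with saturated NaCl works:
# particles less dense than the solution float and can be recovered from
# the supernatant, while sand sinks. Densities are approximate literature
# values in g/cm^3, used here for illustration only.

BRINE_DENSITY = 1.20  # saturated NaCl solution, approximate value

MATERIALS = {
    "polyethylene (PE)": 0.95,
    "polypropylene (PP)": 0.90,
    "polystyrene (PS)": 1.05,
    "PET": 1.38,
    "PVC": 1.40,
    "quartz sand": 2.65,
}

def floats_in_brine(density, brine=BRINE_DENSITY):
    """A particle floats when its density is below that of the solution."""
    return density < brine

for material, density in MATERIALS.items():
    state = "floats" if floats_in_brine(density) else "sinks"
    print(f"{material}: {state}")
```

Note that denser polymers such as PET and PVC sink along with the sand, a known limitation of NaCl-based density separation.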

The task included other steps that it was not necessary to analyse for the purposes of this study; these addressed the modification of the discarded procedures in order to increase their scientific quality, the selection of materials, the data collected once the selected procedure was put into practice, and the conclusions drawn and their reliability.

Data collection and analysis

For data collection, the PSTs' conversations in small groups were recorded. Their written comments on the selection of the best procedure and their justifications were also collected. The coding was conducted using the transcripts in the languages in which the discourse was originally produced, Galician and Spanish; these languages were used interchangeably by the participants, given that the instruction took place in a bilingual context. Both authors are fully proficient in the two languages. Selected fragments were translated into English by the authors, who are experienced in producing translations in the context of science education research, taking care to preserve the meaning of the original discourse.

For the analysis, the PSTs' conversations were transcribed and coded using the ATLAS.ti software. In order to address the first research question, the transcriptions were examined according to the coding frame presented in Table 1. The categories for this coding frame were fundamentally developed in a concept-driven way, taking into consideration the aspects that must be met in an appropriate scientific procedure. These categories, which corresponded to the epistemic criteria, drew from the philosophy of science, in particular Bunge's (1968) characteristics of science, such as being accurate, testable and systematic, and from the understandings of inquiry included in science education policy documents (NRC, 1996; 2000), or, in other words, the understandings of the nature of inquiry (NOSI) that should be developed by K-12 students. For instance, one of these understandings is that “the methods and procedures that scientists use to obtain evidence must be clearly reported in order to enhance opportunities for further investigation”, which involves the “detail” and “systematicity” criteria, given that a scientific procedure must be detailed and well organised in order for others to be able to reproduce it. For the purpose of this analysis, the data was segmented into episodes, which were divided by thematic criteria.

Table 1 Coding framework for examining the epistemic criteria considered by the participants in the selection of the procedure
Epistemic criteria | Description | Terms used by participants
Accuracy | Referring to the inclusion of specific data, such as the quantities to be measured for each magnitude | Suitable, greater precision, more specific data
Systematicity | Referring to the order in which the steps were to be performed | Sequenced
Detail | Referring to the need for each step to be included and described | Complete, detailed, more exhaustive, specific, concrete
Reliability | Referring to the use of two samples of each beach sand | Effective, representative
Replicability | Considering the need for the procedure to be replicated by others | So that the person who reads it understands it, so that the procedure can be repeated


With regards to the second research question, the transcriptions were examined in terms of the use that the PSTs made of the epistemic criteria when selecting the procedure. This analysis was essentially inductive and it was conducted by examining each turn of speech in the transcripts and it resulted in the construction of the following five data-driven categories: (1) referring to explicit criteria and justifying them; (2) referring to explicit criteria, but providing inadequate justifications; (3) referring to explicit criteria, but providing no justification; (4) referring to the criteria implicitly; and (5) revealing misconceptions regarding the use of criteria.

It is worth mentioning that the PSTs were familiar with all of the epistemic criteria examined in this paper because these criteria had been addressed in the two 90-minute sessions that took place prior to them performing the analysed task.

We must also note that the process of analysing the two research questions was performed independently by the two authors through several cycles of analysis until an 85% agreement was attained in the coding. This percentage is considered to be more than acceptable for ensuring reliability in qualitative content analysis (Julien, 2008).
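The agreement figure reported above corresponds to simple percent agreement over the jointly coded episodes. The sketch below illustrates this computation with invented codes, not the study's actual data.

```python
# Minimal sketch of percent agreement between two independent coders.
# The episode codes below are invented for illustration only.

def percent_agreement(coder1, coder2):
    """Share (in %) of episodes to which both coders assigned the same code."""
    if len(coder1) != len(coder2):
        raise ValueError("Both coders must rate the same set of episodes")
    matches = sum(c1 == c2 for c1, c2 in zip(coder1, coder2))
    return 100 * matches / len(coder1)

coder_a = ["detail", "reliability", "accuracy", "detail", "replicability"]
coder_b = ["detail", "reliability", "detail", "detail", "replicability"]

print(f"{percent_agreement(coder_a, coder_b):.0f}% agreement")  # 4 of 5 codes match
```

Chance-corrected measures such as Cohen's kappa are often reported alongside raw percent agreement, since the latter does not account for agreement occurring by chance.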

Ethical considerations

The PSTs were informed of the goals of the study, how the collected data would be used, and the fact that their involvement in this study would not have any bearing on the assessment of the subject. They were invited to participate on a voluntary basis, and they were provided with an informed consent form, which all of them signed. In order to protect their anonymity, all of the participants were identified by pseudonyms. It should be noted that institutional review board (IRB) approval was not requested for this study because our institution only requires approval when the research uses participants' personal data. In our case, we used the participants' performances under pseudonyms; therefore, we only needed to collect their informed consent for participation in the study, as explained above.

Likewise, although this research required participation in a laboratory activity, the PSTs were not exposed to any significant risk, because the task was adapted to enable it to be performed in a school laboratory and as such it did not involve any potential hazards such as the use of strong acids, high temperatures, or dangerous materials and/or procedures. The samples of beach sand were collected and processed by the authors, who also prepared the saturated NaCl solution.

Results

Epistemic criteria considered in the selection of the most appropriate procedure for identifying microplastics in beach sand

The data corresponding to the type of criteria considered has been summarised in Table 2. In general, the type of criteria used and the frequency with which they were used was different in each of the small groups. Not all of the groups considered all of the criteria in their selections, in fact, only one of the six groups (GA) did so. Out of the other five groups, one of them (GG) considered four criteria, two groups used three criteria (GC, GE) and the final two groups used two criteria (GB, GH).
Table 2 Epistemic criteria used for selecting the most appropriate procedure
Epistemic criteria Frequency (number of episodes in which groups appeal to each criterion)
GA GB GC GE GG GH Total
Accuracy 4 0 3 2 2 1 12
Systematicity 2 0 0 0 0 0 2
Detail 3 1 4 4 4 0 16
Reliability 2 5 2 4 1 2 16
Replicability 2 0 0 0 2 0 4


In terms of frequency, the two criteria most used by the PSTs were detail and reliability, with a total frequency of 16 episodes each. The accuracy criterion was also widely used, appearing in 12 episodes, whereas the other two, systematicity and replicability, were mentioned less, being identified in just two and four episodes respectively.
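Frequencies like those in Table 2 amount to a cross-tabulation of coded episodes by group and criterion. The sketch below, using a handful of invented episode codes rather than the study's data, shows one straightforward way such a tally can be produced.

```python
from collections import Counter

# Invented coded episodes as (group, criterion) pairs, for illustration only.
episodes = [
    ("GA", "accuracy"), ("GA", "detail"), ("GA", "accuracy"),
    ("GB", "reliability"), ("GB", "reliability"),
    ("GC", "detail"),
]

# Frequency per (group, criterion) cell, as in the body of Table 2.
per_group = Counter(episodes)

# Totals per criterion, as in the "Total" column of Table 2.
per_criterion = Counter(criterion for _, criterion in episodes)

print(per_group[("GA", "accuracy")])  # 2
print(per_criterion["reliability"])   # 2
```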

One example of the criteria considered is reproduced in the following excerpt from GA's discourse:

Turn, transcription and coded criteria (in brackets):

80 Aroa: It gives like… a good explanation, no, but a more detailed one about… about what needs to be done, or something like that. [Detail]
81 Alicia: An explanation…
82 Aroa: More complete and detailed.
83 Alicia: [Writing] Complete… about what needs to be done.
84 Ángela: And yeah… also makes reference to… [slip of the tongue] to the grams, too.
85 Aroa: [simultaneously] No, instead of “about what needs to be done”, about the steps to follow. Well, or like that, whatever, don't change it. [Systematicity]
86 Alicia: That way we can put it step by step.
[…]
90 Alicia: Specifying… the quantities… [Accuracy]
91 Aroa: Specific [quantities].
92 Alicia: And the timing…
93 Aroa: Specific [timing].
94 Alicia: Specific? I already wrote specifying… quantities and timing…
95 Ángela: That must be used.
[…]
101 Alicia: I'd put something like that now as it's more reliable… [Reliability]
102 Aroa: [simultaneously] Or something like… so that… so that the person who reads it really understands… because that's what she [the teacher] told us. For example, the recipe, right? If they tell you, you have to make an omelette with eggs and potatoes and whatever, and in fact, the person who reads it, they don't know… I mean… [Replicability]
103 Alicia: What they add first.
104 Aroa: Yeah.

In this excerpt, the PSTs considered all of the criteria in their intervention. In turns 80–84 they examined whether the procedures included a complete and detailed explanation of what to do, and this was coded under the category of detail. Then, in turns 84–85, they continued their assessment by examining whether or not the procedure contained all of the required stages and whether it could be followed step by step, which was an indicator of systematicity. After that, in turns 90–95, the participants considered the need to specify the quantities of the variables that were to be measured, which was coded as accuracy. Next, in turn 101, they explicitly mentioned the reliability of the procedure, and in turns 102–104 they accounted for its replicability, suggesting that other people would need to understand the information in order to be able to replicate the procedure, meaning that it should include all of the specific steps and elements so that anyone could reproduce it under the same conditions.

We then examined the order in which the different groups used the criteria in order to identify existing patterns. To do so, we represented the timeline encompassing the assessment of the procedures and the selection of the most appropriate one (see Fig. 1).


[Figure 1 not reproduced]
Fig. 1 Participants' use of criteria across time. Legend: in the original figure, each criterion (accuracy, systematicity, detail, reliability and replicability) is represented by a distinct marker.

The sequence of criteria used by the small groups provides useful information on the importance that the PSTs placed on some of the criteria compared to the others, and it also made it possible to identify patterns in the PSTs' behaviours, as described below. Patterns are important in research as they allow findings to be translated to other contexts.

Moreover, this is also useful as it gives us an idea as to how the PSTs would conduct the investigation in terms of epistemic criteria if they were required to do so.

As represented in Fig. 1, the use of criteria was quite different across the groups. Although there was no clear pattern, some coincidences among some of the groups can be highlighted. Three of the six groups (GB, GC and GE) started the assessment by examining the level of detail (D) in the procedures. Their decision to do so may have been due to prior training about planning and carrying out scientific investigations, in which all of the aspects that a quality plan should meet were addressed. This training particularly emphasized the need for all of the steps to be followed in the investigation to be explained clearly, given that this aspect is widely addressed in the literature as one of the difficulties associated with students' inadequate performances in inquiry settings.

Moreover, there were two groups (GB and GH) in which the use of reliability (Rl) prevailed, although they only considered two criteria in their assessments. This may indicate that these groups considered reliability to be the most important criterion for the assessment. Finally, it is worth mentioning that whenever the replicability criterion (Rp) was mentioned, it always appeared at the end of the assessment process, which may suggest that the groups did not find it as relevant as the other criteria.

The fact that a given criterion was addressed on several occasions by the same group might indicate that the PSTs had different opinions or knowledge about it. It is worth noting, however, that the fact that a criterion was discussed on multiple occasions does not imply that they demonstrated better knowledge about it. Therefore, although the detail and reliability criteria were the most frequently used, this does not necessarily mean that the participants’ discourse revealed a more sophisticated knowledge about said criteria. In order to address this question, it is necessary to examine the use that participants made of said criteria, which is analysed in the next section.

PSTs' use of epistemic criteria for selecting the appropriate procedure

By analysing the PSTs' interventions, five different actions related to the use of the epistemic criteria were identified and these have been summarised in Table 3.
Table 3 Students' use of epistemic criteria for selecting the most appropriate procedure. Legend: A = accuracy; S = systematicity; D = detail; Rl = reliability; Rp = replicability
Actions Small group (number of episodes) N total (%)
A S D Rl Rp
a. Referring to explicit criteria and justifying them GE (1) GA (2) GB (5) GG (1) 13 (26)
GC (1) GH (1)
GE (2)
b. Referring to the explicit criteria, but providing inadequate justifications GG (1) GC (1) 3 (6)
GG (1)
c. Referring to the explicit criteria, but providing no justification GC (1) GA (1) GA (1) 10 (20)
GG (1) GG (3) GB (1)
GE (2)
d. Referring to the criteria implicitly GA (4) GA (2) GA (1) GG (1) GA (2) 19 (38)
GE (1) GC (3) GE (2) GG (1)
GH (1) GH (1)
e. Revealing misconceptions in the use of criteria GC (2) GC (1) 5 (10)
GE (2)


In general, the most frequent action was the implicit use of epistemic criteria for selecting the most appropriate procedure, which represented 38% of the episodes. However, the most sophisticated level, corresponding to the participants' use of explicit criteria together with a justification for said usage, accounted for 26% of the episodes, which can be considered an acceptable result given the difficulties identified in the literature in association with this operation. It is worth noting that 10% of the episodes involved the misuse of criteria, mistakenly attributing the meaning of one criterion to another.

In the small groups the use of criteria was quite different both in terms of the types of criteria and their actions. We will now discuss each category with examples from the PSTs' interventions.

a. Referring to explicit criteria and justifying them. This category, which corresponds to the highest quality action in relation to the use of criteria, appeared in only 13 of the 50 episodes. In terms of its overall frequency, the highest numbers of episodes were related to reliability (N = 6) and detail (N = 5), whereas references to the other criteria appeared only once (accuracy and replicability) or not at all (systematicity). In addition, there were differences among the groups: three of them used detail (GA, GC and GE), two used reliability (GB and GH), while accuracy (GE) and replicability (GG) were each only used once.

These results show that the groups tended to discuss only one criterion in an explicit and justified manner, except for GE, who considered two of them. On the other hand, the fact that the majority of the episodes corresponded to the use of reliability and detail, might suggest that the PSTs were more familiar with these criteria.

The following excerpt illustrates how GB made explicit use of the reliability criterion and justified it:

23 Bárbara: This procedure…tests two samples and their results may therefore be more reliable. If one [one of the tests] goes wrong, they always have another opportunity to check, or the results of the two samples can be compared afterwards so that it [the experiment] is more… you know? To check if the results are really the same and that they are reliable.

In this excerpt, Bárbara referred explicitly to reliability, mentioning it at the beginning of her intervention. She justified the idea of using two samples of sand by explaining that this would make it possible to compare results. Therefore, she associated the use of more than one sample with the reliability of the results.

b. Referring to explicit criteria but providing inadequate justifications. This category contains the episodes in which the PSTs made use of explicit criteria but provided incorrect justifications for their use. Out of the 50 episodes that were analysed, there were only three interventions that were included in this category, and these corresponded to two small groups (GC and GG) and to the use of two criteria: accuracy and detail.

The following excerpt reproduces the way in which GG used the accuracy criterion, justifying it inadequately:

165 Gemma: No, [procedure B] is more accurate because the other [procedure A] does not filter anything.

In this intervention, Gemma was comparing procedures A and B (see Appendix) using the term accuracy; however, she provided an inadequate justification for it, as her reasoning was based on aspects related to the level of detail of the steps included in the procedure rather than to their accuracy. In particular, she considered that filtering the supernatant liquid before observing the sample through the stereomicroscope would result in a more accurate procedure than simply removing the microplastics before observing them through the stereomicroscope.

c. Referring to explicit criteria but providing no justification. In this category, the PSTs made use of explicit criteria but without justification. Ten episodes were coded under this category, corresponding to the use of three of the five criteria (accuracy, detail and reliability). This behaviour was identified in five of the six small groups, with different frequencies. The most frequent criterion was detail (N = 6), used by three small groups (GA, GE and GH). The other criteria, accuracy and reliability, appeared in just two episodes each, corresponding to interventions by two small groups (GC and GG, and GA and GB, respectively).

The following excerpt reproduces the way in which GG made use of the detail criterion without justification:

182 Gemma: I prefer [procedure] B because it is more specific.

In this excerpt, the participant selected procedure B as the best option, basing her decision on its level of detail. She referred to this criterion using the term specific, which was correct; however, she did not justify why this procedure was more specific than the other two.

d. Referring to the criteria implicitly. As stated before, this category was the most frequent in the PSTs' interventions (19 out of 50 episodes) and, unlike the other categories, it included all five criteria. Overall, the most frequent criteria under this category were accuracy and reliability, each addressed in five episodes, followed by detail, which appeared in four. The other two criteria, systematicity and replicability, were less frequent.

Five of the six small groups performed this action, although with different frequencies and in relation to different criteria. In general, the groups made use of just one or two criteria, with the exception of GA, which used four of the five (accuracy, systematicity, detail and replicability).

The following excerpt reproduces an example of the way in which GC referred implicitly to the reliability criterion:

73 Cintia: What we can write is that… that…I mean, it runs tests, but it even runs two tests for each kind [of sample]. It does not just run one [test] like the others [procedures A and C].

74 Cruz: OK.

75 Celso: Yeah, it is what I…

76 Cintia: Exactly. No, no, but not only that, it is like it tests it twice in case one of the tests went wrong, and all that, you know?

In this excerpt, the participants were looking for aspects of procedure B that they could use to justify selecting it over the other two. In turn 73, Cintia referred to the number of samples tested per type of sand, given that this procedure involved using two samples of each. Moreover, in turn 76, she interpreted the use of two samples as a way of having more data to compare. It can be considered that she contemplated the reliability criterion in both interventions; however, she did not refer to it explicitly.

e. Revealing misconceptions in the use of criteria. This category corresponds to the episodes in which PSTs used explicit or implicit criteria wrongly by mistakenly attributing the meaning of one criterion to another.

Five episodes were coded under this category, identified in two of the six small groups (GC and GE) and related to two criteria: accuracy and reliability.

GC mistook both the accuracy and reliability criteria for the detail criterion, whereas GE used reliability when referring to effectiveness.

The following extract reproduces an example of how GC mistook accuracy for detail:

47 Cruz: [Procedure B] is more accurate, it contains more steps…

In this example, Cruz was comparing procedure B with the other two. She considered procedure B to be more accurate than the others; however, her justification was incorrect, as she associated accuracy with the inclusion of a greater number of steps, which corresponds to the detail criterion rather than to accuracy.

Discussion and conclusions

This study examined the epistemic criteria that were used by PSTs and the manner in which these were used in order to select the best scientific procedure for identifying microplastics in beach sand.

In general, the participants made use of five epistemic criteria related to inquiry (accuracy, systematicity, detail, reliability and replicability). However, the use of each criterion differed among the groups, with detail and reliability being the most frequently used. This finding is quite relevant in terms of the PSTs' engagement in planning scientific investigations, since it is widely agreed that participants usually propose ill-defined or incomplete plans (Zimmerman, 2000; Krajcik et al., 1998; Crujeiras-Pérez and Jiménez-Aleixandre, 2017). Therefore, the strategy of considering the epistemic criteria involved in a scientific procedure might contribute to improving PSTs' performance when planning a scientific investigation, an issue that has been highlighted in the literature as one of the main difficulties encountered by students.

Although the PSTs considered several criteria for selecting the best procedure, only one group made use of all of them. In light of these results, we believe that holding a general discussion with all of the groups before making the final decision might improve the use of criteria. This aligns with Pluta et al.'s (2011) proposal that a whole class discussion about criteria could promote better group and individual learning.

The patterns identified in the PSTs' performances over time provide relevant information for interpreting how they used the epistemic criteria involved in planning and carrying out scientific investigations. Moreover, although qualitative results are not generalisable, these patterns suggest the performances that could be expected in other learning contexts in which participants have to select, or even design, a scientific procedure. Beyond facilitating visualisation, identifying sequences in research data is relevant when transferring findings to an educational setting. In this case, the sequences that the PSTs followed when assessing inquiry procedures have implications for the design of learning tasks and teaching interventions. For instance, they show that the criterion of systematicity needs to be emphasised more in lessons, especially to counter the common misunderstanding, held by students with naïve views of NOSI, that laboratory inquiry amounts to trial-and-error performances rather than the design and conduct of systematic investigations (Sandoval and Reiser, 2004).

Regarding the second research question, which addresses the manner in which PSTs use epistemic criteria for selecting the most appropriate procedure, a low percentage of the PSTs were able to use explicit criteria and propose an adequate justification for their usage, whereas most participants used the criteria implicitly. These findings suggest that the number of sessions dedicated to the epistemic criteria associated with scientific practices before the task was performed was not sufficient to fully engage the PSTs in the use of epistemic knowledge. It is also important to note that the participants were not informed about the specific criteria they needed to apply, because we sought to investigate how they activate their epistemic knowledge about inquiry in a specific context, as well as to observe their comprehension of the relevance of this knowledge for meaningful engagement in inquiry practices, as recommended in the literature on learning through scientific practices (e.g. Duschl, 2008; Berland et al., 2016).

In fact, we consider that, rather than merely providing the PSTs with a list of predefined criteria that are to be used, it is only possible to meaningfully address the epistemic criteria involved in inquiry through the participants' reflective immersion in inquiry practices, such as the selection of an adequate procedure to investigate an issue, as proposed in this study.

Although the results show that the PSTs' use of epistemic criteria was not optimal, the strategy of assessing different procedures and selecting the best one for conducting an adequate scientific investigation might be a promising resource for promoting the effective engagement of participants in inquiry. Furthermore, the epistemic criteria used for the assessment can be understood as epistemic tools that support the process of planning an investigation, since epistemic tools are characterised as physical, symbolic, and/or discursive artefacts that support the construction of knowledge (Kelly and Cunningham, 2019).

However, in light of the results, we consider that the PSTs may require further training from instructors regarding the use of criteria, or the establishment of specific classroom norms, such as the need to justify each decision or criterion within the groups. Although the need for justification was included in the task handout, and the participants were familiar with the importance of this operation in argumentation practices, they may not have considered it necessary for inquiry. This aspect also highlights the need for scientific practices to be addressed from an integrated perspective, given that they are interconnected (Osborne, 2014; Jiménez-Aleixandre and Crujeiras-Pérez, 2017).

To conclude, another aspect that may have influenced the PSTs' performances is their epistemic beliefs, that is, their beliefs about the nature of knowledge and knowing (Hofer and Pintrich, 1997). Numerous studies have highlighted the relationship between these beliefs and students' learning (Mason et al., 2013; Getahun et al., 2016; Lin and Chang, 2018); however, further research addressing the relationship between such beliefs and the use of epistemic criteria may allow the development of innovative approaches that promote participants' adequate use of epistemic criteria when engaging in scientific practices.

Conflicts of interest

There are no conflicts of interest to declare.

Appendix: Task handout

Considering the dangers that microplastics pose for the environment and our health, we are going to investigate the presence of these compounds in our beaches by analysing the beach sand from different locations.

In order to obtain reliable results, we must follow a procedure that is as scientifically adequate as possible. Below you will find three examples of methods that have been used by three scientific groups when performing this investigation. You have to decide which one is the most adequate if you want to guarantee the reliability of your results.

– Which procedure should you use in order to obtain the most representative data for the investigation? Why?

Procedure A

We dry the sand samples in an oven for 48 hours. We prepare a sodium chloride (salt) saturated solution in water (358 g per litre). We take a piece of each of the samples and we put them in beakers. We add 200 mL of the solution to each sample, we stir and wait for a while. When we observe that there are some floating particles in the solution, we remove the (supernatant) liquid to observe it with a stereomicroscope. During the observation we will identify the number and type of microplastics. We will express the results as the number of elements per cm².

Procedure B

We dry the sand samples in an oven for 48 hours. Next, we prepare a sodium chloride (salt) saturated solution in water (358 g per litre), and we filter it. We take 50 g of each of the samples and we put them each into a beaker. We prepare two samples for each kind of sand. We add 200 mL of the solution to each sample, we stir continuously for 2 minutes and we let them rest for 10 minutes. After that time has elapsed, we filter the (supernatant) liquid, collecting the suspended solid particles in a paper filter, we wash the filter with distilled water, and we let it dry in the oven for 5 minutes. Next, we observe it with a stereomicroscope to identify the number and type of microplastics. We take a photo of the image projected in the stereomicroscope for each sample as evidence of the observation. We will express the results as the number of elements per cm².

Procedure C

We dry the sand samples in an oven for 48 hours. Next, we prepare a sodium chloride (salt) saturated solution in water (358 g per litre), and we filter it. We take 50 g of each kind of sample and we put them each into a beaker. We add 200 mL of the solution to each sample, we stir and let them rest for 10 minutes. We filter the (supernatant) liquid, and we collect the suspended solid particles in a paper filter, we wash the filter and we observe it with a stereomicroscope to identify the number and type of microplastics. We take a photo of the image projected in the stereomicroscope for each sample as evidence of the observation. We will express the results as the number of elements per cm².
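All three procedures rely on the same separation principle: in a saturated sodium chloride solution (density roughly 1.2 g/cm³ at room temperature), common low-density polymers rise to the supernatant while sand grains sink. A minimal sketch of this density check follows; the polymer and sand density values are approximate literature figures that we supply for illustration, not values taken from the task handout:

```python
# Density-based flotation behind procedures A-C: particles less dense than
# saturated NaCl brine rise to the supernatant; denser particles sink.
# All densities are in g/cm^3 and are approximate literature values
# used here for illustration only.
BRINE_DENSITY = 1.20  # saturated NaCl solution at room temperature

particle_densities = {
    "polyethylene (PE)": 0.95,
    "polypropylene (PP)": 0.91,
    "polystyrene (PS)": 1.05,
    "PET": 1.38,
    "PVC": 1.40,
    "quartz sand": 2.65,
}

# Particles recovered in the supernatant vs. those left with the sediment.
floaters = sorted(n for n, d in particle_densities.items() if d < BRINE_DENSITY)
sinkers = sorted(n for n, d in particle_densities.items() if d >= BRINE_DENSITY)
```

Note that, under these assumed values, denser polymers such as PET and PVC would sink with the sand, a known limitation of NaCl flotation methods for microplastic extraction.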

Acknowledgements

The authors would like to thank FEDER/Ministry of Science, Innovation and Universities-National Agency of Research – Project EDU2017-83915-R.

References

  1. Bell R. L., Blair L. M., Crawford B. A. and Lederman N. G., (2003), Just Do It? Impact of a Science Apprenticeship Program on High School Students’ Understandings of the Nature of Science and Scientific Inquiry, J. Res. Sci. Teach., 40(5), 487–509.
  2. Berland L., Schwartz C. V., Krist C., Kenyon L., Lo A. S. and Reiser B. J., (2016), Epistemologies in practice: making scientific practices meaningful for students, J. Res. Sci. Teach., 53(7), 1082–1112.
  3. Besley A., Vijver M. G., Behrens P. and Bosker T., (2017), A standardized method for sampling and extraction methods for quantifying microplastics in beach sand, Mar. Pollut. Bull., 114(1), 77–83.
  4. Biggers M. and Forbes C. T., (2012), Balancing teacher and student roles in elementary classrooms: pre-service elementary teachers' learning about the inquiry continuum, Int. J. Sci. Educ., 34(14), 2205–2229.
  5. Bunge M., (1968), La ciencia: su método y su filosofía (Science: method and philosophy), Buenos Aires (Argentina): Siglo veinte.
  6. Capps D. K., Crawford B. A. and Constas M. A., (2012), A review of empirical literature on inquiry professional development: alignment with best practices and a critique of the findings, J. Sci. Teach. Educ., 23, 291–318.
  7. Chinn C. A., Rinehart R. W. and Buckland L. A., (2014), Epistemic cognition and evaluating information: Applying the AIR model of epistemic cognition, in Rapp D. and Braasch J. (ed.), Processing inaccurate information: Theoretical and applied perspectives from cognitive science and the educational sciences, Cambridge, MA: MIT Press, pp. 425–453.
  8. Constantinou C. P. and Papadouris N., (2004), Teaching and learning about energy in middle school: an argument for an epistemic approach, Stud. Sci. Educ., 48(2), 161–186.
  9. Crujeiras-Pérez B. and Jiménez-Aleixandre M. P., (2017), High school students' engagement in planning investigations: findings from a longitudinal study in Spain, Chem. Educ. Res. Pract., 18(1), 99–112.
  10. Duschl R., (2008), Science education in three-part harmony: Balancing conceptual, epistemic and social learning goals. Rev. Res. Educ., 32, 268–291.
  11. Elby A., Macrander C. and Hammer D., (2016), Epistemic cognition in Science, in Green J., Sandoval W. A. and Braaten I. (ed.), Handbook of Epistemic Cognition, New York: Routledge, pp. 113–127.
  12. Erduran S., (2001), Philosophy of Chemistry: An Emerging Field with Implications for Chemistry Education, Sci. Educ., 10, 581–593.
  13. Erduran S. and Scerri E., (2003), The nature of chemical knowledge and chemical education, in Gilbert J. K, De Jong O., Justi R., Treagust D. F. and Van driel J. H. (ed.), Chemical Education: towards a research-based practice, New York: Kluwer Academic Publishers.
  14. Falk H. and Yarden A., (2009), “Here the Scientists Explain What I Said.” Coordination Practices Elicited During the Enactment of the Results and Discussion Sections of Adapted Primary Literature. Res. Sci. Educ., 39, 349–383.
  15. Flick L. B. and Lederman N. G. (ed.), (2006), Scientific inquiry and nature of science, Dordrecht: Springer.
  16. García-Carmona A., Criado A. M. and Cruz-Guzmán M., (2017), Primary pre-service teachers’ skills in planning a guided scientific inquiry. Res. Sci. Educ., 47, 989–1010.
  17. Getahun D. A., Saroyan A. and Aulls M. W., (2016), Examining Undergraduate Students’ Conceptions of Inquiry in Terms of Epistemic Belief Differences, Can. J. High. Educ., 46(2), 181–205.
  18. Hofer B. K. and Pintrich P. R., (1997), The development of epistemological theories: Beliefs about knowledge and knowing and their relation to learning, Rev. Educ. Res., 67(1), 88–140.
  19. Hogan K. and Maglienti M., (2001), Comparing the epistemological underpinnings of students’ and scientists’ reasoning about conclusions, J. Res. Sci. Teach., 38, 663–687.
  20. Jiménez-Aleixandre M. P. and Crujeiras B., (2017), Epistemic Practices and Scientific Practices in Science Education, in Taber K. S. and Akpan B. (ed), Science Education. New Directions in Mathematics and Science Education, Rotterdam: Sense Publishers, pp. 69–80.
  21. Julien H., (2008), Content analysis, in Given L. M. (ed), The SAGE encyclopedia of qualitative research methods, California: SAGE Publications, vol. 2, pp. 120–121.
  22. Kapustka K. A., Howell P., Clayton C. D. and Thomas S., (2009), Social justice in teacher education: A qualitative content analysis of NCATE conceptual frameworks, Equity Excell. Educ., 42(4): 489–505.
  23. Kelly G. J., (2008), Inquiry, activity, and epistemic practice, in Duschl R. and Grandy R. (ed.), Teaching scientific inquiry: Recommendations for research and implementation, Rotterdam, The Netherlands: Sense, pp. 99–117.
  24. Kelly G. J. and Cunningham C. M., (2019), Epistemic tools in engineering design for K-12 education, Sci. Educ., 103, 1080–1111.
  25. Kelly G. J., McDonald S. and Wickman P.-O., (2012), Science Learning and Epistemology, in Fraser B. J., Tobin K. G. and McRobbie C. J. (ed.), Second International Handbook of Science Education, Dordrecht, The Netherlands: Springer, vol. 1, pp. 281–291.
  26. Kittleson J. M., (2011), Epistemological Beliefs of Third-Grade Students in an investigation-rich classroom, Sci. Educ., 95, 1026–1048.
  27. Krajcik J., Blumenfeld P. C., Marx R. W., Bass K. M. and Fredricks J., (1998), Inquiry in project-based science classrooms: initial attempts by middle school students, J. Learn. Sci., 7(3/4), 313–350.
  28. Lederman N. G., (1992), Students’ and teachers’ conceptions of the nature of science: A review of the research, J. Res. Sci. Teach., 29(4), 331–359.
  29. Lederman N. G., Lederman J. S. and Antink A., (2013), Nature of science and scientific inquiry as contexts for the learning of science and achievement of scientific literacy, Int. J. Educ. Math. Sci. Technol., 1(3), 138–147.
  30. Lederman J. S., Lederman N. G., Bartos S. A., Bartels S. L., Meyer A. A. and Schwartz R. S., (2014), Meaningful assessment of learners’ understandings about scientific inquiry: The views about scientific inquiry (VASI) questionnaire, J. Res. Sci. Teach., 51 (1), 65–83.
  31. Lin F. and Chang C. K. K., (2018), Promoting elementary students’ epistemology of science through computer-supported knowledge-building discourse and epistemic reflection, Int. J. Sci. Educ., 40(6), 668–687.
  32. Mason L., Boscolo P., Tornatora M. C. and Ronconi L., (2013), Besides knowledge: a cross-sectional study on the relations between epistemic beliefs, achievement goals, self-beliefs, and achievement in science, Instr. Sci., 41, 49–79.
  33. Metz K. E., (2004), Children's understanding of scientific inquiry: their conceptualization of uncertainty in investigations of their own design, Cogn. Instr., 22(2), 219–290.
  34. National Research Council, (NRC), (1996), National Science Education Standards, Washington, DC: National Academies Press.
  35. National Research Council, (NRC), (2000), Inquiry and the national science education standards: A guide for teaching and learning, Washington, DC: National Academies Press.
  36. Organisation for Economic Co-operation and Development (OECD), (2016a), PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic and Financial Literacy, Paris, France: OECD Publishing.
  37. Organisation for Economic Co-operation and Development (OECD), (2016b), PISA 2015 results (volume I): excellence and equity in education, Paris, France: OECD Publishing.
  38. Osborne J., (2014), Scientific practices and inquiry in the science classroom, in Lederman N. G. and Abell S. K. (ed.), Handbook of research on science education, New York, NY: Routledge, vol. II, pp. 579–599.
  39. Osborne J., Collins S., Ratcliffe M., Millar R. and Duschl R., (2003), What “ideas-about-science” should be taught in school science: A Delphi study of the expert community. J. Res. Sci. Teach., 40(7), 692–720.
  40. Pérez-Vidal D. and Crujeiras-Pérez B., (2019), Desempeños del alumnado de Educación Secundaria en la evaluación de una investigación científica en el contexto de la industria láctea (Secondary school students' performance in the assessment of a scientific investigation in the context of the dairy industry), Ens. Cienc., 37(1), 5–23.
  41. Pluta W. J., Chinn C. A. and Duncan R., (2011), Learners’ epistemic criteria for good scientific models, J. Res. Sci. Teach., 48(5), 486–511.
  42. Rikmanis I., Logins J. and Namsone D., (2012), Teachers’ views on Inquiry-based Science Education, in Bolte C., Holbrook J. and Rauch F. (ed.), Inquiry-based Science Education in Europe: reflections from the PROFILES project, Berlin, Germany: Freie Universitat Berlin, pp. 14–24.
  43. Samarapungavan A., Westby E. L. and Bodner G. M., (2006), Contextual epistemic development in science: A comparison of chemistry students and research chemists, Sci. Educ., 90(3), 468–495.
  44. Sandoval W. A., (2005), Understanding students’ practical epistemologies and their influence on learning through inquiry, Sci. Educ., 89, 634–656.
  45. Sandoval W. A., (2014), Science education's need for a theory of epistemological development, Sci. Educ., 98(3), 383–387.
  46. Sandoval W. A. and Morrison K., (2003), High School Students’ Ideas about Theories and Theory Change after a Biological Inquiry Unit, J. Res. Sci. Teach., 40(4), 369–392.
  47. Sandoval W. A. and Reiser B. J., (2004), Explanation-driven inquiry: integrating conceptual and epistemic scaffolds for scientific inquiry, Sci. Educ., 88, 345–372.
  48. Sandoval W., Bell P., Coleman E., Enyedy N. and Suthers D., (2000), Designing Knowledge Representations for Learning Epistemic Practices of Science, Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, April 25.
  49. Schreier M., (2014), Qualitative Content Analysis, in Flick U. (ed.), The SAGE Handbook of Qualitative Data Analysis, London, United Kingdom: SAGE, pp. 170–183.
  50. Schwartz R. S., Lederman N. G. and Lederman J. S., (2008), An instrument to assess views of scientific inquiry: the VOSI questionnaire, National Association for Research in Science Teaching, March 30–April 2, Baltimore, U.S.
  51. Schwartz R., Lederman N. and Abd-El-Khalick F., (2012), A series of misrepresentations: a response to Allchin's whole approach to assessing nature of science understandings. Sci. Educ., 96(4), 685–692.
  52. Strippel C. G. and Sommer K., (2015), Teaching Nature of Scientific Inquiry in Chemistry: How do German chemistry teachers use labwork to teach NOSI? Int. J. Sci. Educ., 37(18), 2965–2986.
  53. Vesterinen V.-M. and Aksela M., (2009), A novel course of chemistry as a scientific discipline: how do prospective teachers perceive nature of chemistry through visits to research groups? Chem. Educ. Res. Pract., 10, 132–141.
  54. Zimmerman C., (2000), The development of scientific reasoning skills, Dev. Rev., 20, 99–149.

This journal is © The Royal Society of Chemistry 2021