Inoculating students against science-based manipulation strategies in social media: debunking the concept of ‘water with conductivity extract’

Nadja Belova * and Moritz Krause
University of Bremen, Biology/Chemistry, Bremen, Germany. E-mail: n.belova@uni-bremen.de; Tel: +49-42121863284

Received 24th June 2022 , Accepted 30th September 2022

First published on 11th October 2022


Abstract

Social media are a popular source of information for young people, serving not only communication but also the creation and distribution of content as well as advertising. However, such content may contain science-related information that in many cases is not based on scientifically sound sources. Content creators and/or advertisers use a specific set of strategies to make their claims appear more credible. To address the science-related strategies among them, we created a fake product called ‘HIQO – the water with conductivity extract’ and claimed that it improves the conductivity of the brain and thus makes the drinker smarter. We established a fully fledged web presence and ordered specially labelled bottles that in fact contained only ordinary mineral water. When creating the Internet resources, we deliberately built in the major manipulation strategies known from the literature. The product was embedded in a three-hour teaching sequence that was tested with a total of 106 students from three secondary schools in northern Germany. The module was evaluated using tasks to assess civic online reasoning, scored with a three-level rubric (beginning/emerging/mastery); learners who completed the teaching sequence before attempting the tasks reached the higher levels significantly more often than did those in the control group, who were given only the tasks.


Introduction

We live in an increasingly complex world, also referred to as the ‘post-truth world’ (Barzilai and Chinn, 2020), in which there are often no simple answers to multidimensional questions. This complexity is especially reflected in the media landscape: nowadays anyone can spread information through social media, and the number of users of social networks such as Instagram, TikTok and Facebook is growing worldwide from year to year (Statista, 2022).

The information in such media often has a scientific background but is not necessarily disseminated by experts, which has fundamentally changed the way scientific content is communicated and reflected upon (Höttecke and Allchin, 2020). In addition, chemistry-related information in particular is often depicted in a very negative light (known as ‘chemophobia’), although there seems to be a slow shift towards a more positive picture (Guerris et al., 2020). Today, media aspects are covered in various school subjects, especially in the humanities and social sciences (Belova et al., 2015), but the implementation of such aspects in science education is still quite sparse and mostly limited to traditional types of media such as newspapers (Reid and Norris, 2016; Höttecke and Allchin, 2020). However, this misses the media reality of young people, the majority of whom use social media. Recent data show that the number of social-media users worldwide increased from roughly a billion in 2010 to 3.6 billion in 2020 (Tankowska, 2021). Many of the topics discussed on social media have a scientific background, but actual experts are only rarely the ones speaking (Belova and Velikina, 2020). Because anyone can create and share content on platforms such as Instagram, Facebook and TikTok (thus eliminating the previously existing media gatekeepers), misinformation can spread rapidly and in extreme cases can lead to social divisions, as can currently be seen in the debates surrounding the COVID-19 pandemic (Archila et al., 2021).

The use of the Internet as a source of information thus offers young people enormous opportunities for democratic participation on the one hand, but also poses enormous challenges, because understanding the use (and misuse) of scientific information is an important competence for critical citizens and thus for participation in society (Marks et al., 2014). Moreover, addressing current and controversial issues that are discussed in mass media increases learners’ sense of relevance (Belova and Eilks, 2015). On the other hand, young people often handle information in social media uncritically, although one might think that as ‘digital natives’ (Prensky, 2001) they should have fewer problems with it than, for example, their parents (Tseng, 2018). However, this does not seem to be the case (Bennett, 2012): students struggle with many aspects of online information gathering, including searching for and evaluating information, which is why appropriate materials and instructional interventions are repeatedly called for, especially in science (Horn and Veermans, 2019). The related initiatives are all based on the overarching concept of media literacy, or, as UNESCO (2021) calls it, media and information literacy. According to UNESCO, this concept consists of three dimensions (information literacy, media literacy, digital literacy), each with a set of subordinate competences [e.g. ‘critically evaluate media content in the light of media functions’ or ‘make ethical use of information’ (UNESCO, 2021, p. 9)]. Comparing the current framework with previous versions (e.g. UNESCO, 2006), one notices the increasing complexity of the concept of media literacy due to the rapidly changing media landscape. Merging scientific literacy and media literacy in school science should form part of citizenship education in the modern age, and this merger can be called ‘scientific media literacy’ (Chang Rundgren and Rundgren, 2014). Reid and Norris (2016) recently outlined research areas for the next decade in what they call scientific media education (SME), emphasising that skills related to SME are equally important for all citizens in modern societies.

The present paper describes the creation and evaluation of a teaching and learning module fostering such SME-related skills, which was implemented in five chemistry learning groups (grades 10–12, age range 16–18) in three German comprehensive schools. The module focuses on a supposedly ‘science-based’ product, namely a drinking water with ‘conductivity extract’, that the students need to debunk. Among other things, we created an actual Instagram account for this product to give it the appearance of a genuine product, and we incorporated scientific information and science-based manipulation strategies into this account.

Theoretical framework

From media literacy to scientific media literacy

Because the role of mass media in society was already growing rapidly almost 80 years ago, media literacy was first addressed internationally in 1946 by UNESCO as part of fundamental education for everyone (Holmes et al., 1947). Science is a ubiquitous and recurring topic across nearly all media types: science-based reports on issues such as climate change, energy supply, mobility and medical innovations appear on television, radio and of course the Internet, and the examples of the omnipresence of science in the media are countless.

Developing capabilities to cope with information related to science in the media is indispensable if young people wish to become scientifically and media literate citizens. In the sense of a Bildung-oriented understanding of science education (Sjöström et al., 2017), science lessons should seek to promote the development of skills to prepare younger generations to become responsible citizens (Holbrook and Rannikmäe, 2007; Hofstein et al., 2011). The implementation of media into the science classroom also has the potential to make chemistry learning more meaningful to learners and more personally and societally relevant (Stuckey et al., 2013). The literature contains different definitions of the notion of media literacy, and in 2006 UNESCO outlined two main dimensions of media literacy, namely ‘reading’ and ‘writing’ media. On one hand, students need to understand the different forms of communication used in the media; on the other hand, they have to be able to create their own media. In the recently published updated version of the Media and Information Literacy curriculum, UNESCO (2021) points out that recognising reliable information is becoming an increasing challenge, and not only in the context of social media; trust in traditional media types is also declining. Therefore, according to UNESCO (2021, p. 6), media and information literacy is an umbrella term covering a large set of skills and competencies and consisting of (perhaps surprisingly) 23 sub-domains, ranging from advertising literacy via social-media literacy (which the present study builds upon) to visual literacy. In addition, 18 concrete learning objectives are formulated, and especially relevant to the present study are those aimed at locating and assessing information as well as its critical evaluation.

Other definitions of media literacy are aligned with the UNESCO framework in terms of content and suggested skills. According to Scheibe and Rogow (2012), media literacy is commonly defined as the set of capabilities that enables individuals to navigate their way through a media-dominated environment; along with other media educators (Hobbs and Jensen, 2009), they explicitly name four dimensions that define the concept of media literacy, namely accessing, analysing, evaluating and creating media.

For societally oriented scientific literacy, it is necessary to develop skills to critically access and evaluate science in the media (Eilks et al., 2014). Merging scientific literacy and media literacy in school science should form part of citizenship education in the modern age, which can be called ‘scientific media literacy’ (Chang Rundgren and Rundgren, 2014), ‘scientific media education’ (Reid and Norris, 2016) or ‘science media literacy’ (Höttecke and Allchin, 2020). Scientific media literacy encompasses a wide range of goals, skills and possible classroom activities connected to the learning of science, and it provides a justification for more intensive media education in the science classroom. Chang Rundgren and Rundgren (2014) argued that scientific media literacy could be an aspect of scientific literacy and includes three dimensions: (i) an understanding of the norms and methods of science (i.e. the nature of science), (ii) an understanding of key scientific terms and concepts (i.e. knowledge of scientific content) and (iii) an awareness and understanding of the impact of science and technology on society. The connection between science media literacy and the nature of science also plays a central role in the work of Höttecke and Allchin (2020), who presented a comprehensive theoretical basis for this connection and argued that nature-of-science education in our modern media landscape cannot function without this media perspective. Thus, media-literate citizens need to be aware that media contribute directly to the construction of scientific knowledge and that certain effects (filter bubbles, confirmation bias, etc.) can distort their own image of science and its nature. The paper by Höttecke and Allchin (2020) was published shortly before the first peak of the COVID-19 pandemic in Europe and has since become even more topical; such effects must therefore be examined actively in schools. In their paper on science media education, Reid and Norris (2016, p. 147) argued in a similar direction, calling it a ‘key content area’ for the future.

Misinformation, fake news and manipulation by the media in the context of science

Because of the substantial changes in our media landscape and the associated freedom to disseminate content without gatekeeping mechanisms, the spread of misinformation is becoming more frequent (Barzilai and Chinn, 2020; Moran, 2020; Tseng et al., 2021). However, although terms such as ‘misinformation’ and ‘fake news’ are used increasingly, their definitions remain incomplete (van der Linden, 2022). A commonly used and quite broad definition of misinformation is ‘false or misleading information masquerading as legitimate news’ (Allen et al., 2020, p. 1). Although the term ‘fake news’ is often used synonymously, it has been criticised mainly for its strong political overtones (van der Linden, 2022); Wardle and Derakhshan (2018, p. 43) even described it as ‘inadequate’ because it has an extremely negative connotation and can potentially be misused to discredit information that one simply does not like or that does not fit one's agenda. Wardle and Derakhshan (2018) distinguished among ‘misinformation’, ‘disinformation’ and ‘malinformation’ using two main categories, namely (i) genuinely false information and (ii) information intended to harm. ‘Disinformation’ falls into both of these categories, whereas ‘misinformation’ falls only into the first; by contrast, ‘malinformation’ is not wrong per se but is used to spread harassment or hate speech against a person or organisation. Note that Wardle and Derakhshan (2018, p. 44) deem any type of ‘misleading content’ to be misinformation.

In addition to the use of actual misinformation, which can extend to conspiracy ideas (Tseng et al., 2021), both private and commercial users of social media employ a range of strategies to make science-based content in general—and chemistry-based content in particular—appear more serious and therefore more credible (Kotsalas et al., 2017). These include the following frequently used strategies (Danciu, 2014; Kotsalas et al., 2017; van Prooijen and van Vugt, 2018; Guerris et al., 2020; Höttecke and Allchin, 2020): (i) deliberately emotive assertion (both positive and negative), (ii) insinuation of harm (e.g. regarding chemistry itself or ‘the chemical industry’), (iii) manipulated illustrations (e.g. cut-off graphs), (iv) linguistic exaggerations (e.g. ‘most effective’), (v) vague linguistic devices (e.g. ‘helps with’, ‘a feeling of’, etc.), (vi) presentation by an ‘expert in a white coat’, (vii) presenting correlation as causality, (viii) creating a feeling of logical connections, and (ix) creating a feeling of simple, satisfactory answers, to name only the most important ones.

Demonstrating such strategies, as well as the use of misleading content, is the primary goal of our teaching module. Moreover, research indicates that outright misinformation is relatively rare compared to the amount of reliable information (Acerbi et al., 2022), which makes it all the more important to expand the focus to all types of narratives used in the media (Wardle and Derakhshan, 2018).

Reflection on science media messages in the classroom and beyond

Research on social media in the science classroom in general is scarce (Roozenbeek and van der Linden, 2019; Tseng et al., 2021). Reid and Norris (2016) outlined certain research areas for scientific media education, such as the design of standardised assessment tests for media literacy, something that has yet to be done for science education; initial suggestions are, however, already available for measuring the more general construct of media literacy (Cuervo Sánchez et al., 2021). Reid and Norris (2016, p. 159) did not focus explicitly on social media but noted the importance of teaching students to be able to judge ‘representations of science on the Internet’ in order not to be confused by misinformation. A recently published book entitled Social Media in Education: Breakthroughs in Research and Practice (Information Resources Management Association, 2018) focuses mostly on using social media as a learning tool (e.g. for collaborative learning) in non-science subjects such as language education; it includes a chapter on practicing scientific argumentation through social media (Craig-Hare et al., 2018), which is focused mainly on social media as communication platforms and emphasises that students must be prepared to engage proficiently in science-based online interactions. Nevertheless, the authors note the need for students to understand the mechanisms of scientific argumentation online. Beyond the scope of that book, the use of social media in the context of science education has to date focused mainly on learning with media in the sense of communication tools (Hurst, 2018; Danjou, 2020); interventions for learning about science in the media started to appear on the research agenda of the science education community only a few years ago. After an intervention in which a group of students was confronted with flawed science-based information, Tseng (2018) outlined the importance of providing more opportunities to critique science-based information online; she showed that students who accepted the claims demonstrated low capabilities in scientific reasoning. In their preface to a special issue on education in a world characterised by uncertainty, Barzilai and Chinn (2020, p. 107) reviewed general educational responses to ‘post-truth’ problems and formulated broad educational objectives, such as developing media literacy along with scientific literacy and re-establishing the epistemic authority of science.

Although there is obviously awareness of the relevance of interventions that map such strategies concretely in the science classroom, to date there have been few concrete empirically tested proposals (such as that by Tseng (2018) outlined above) for classroom implementation. Kotsalas et al. (2017) described an intervention for different types of manipulation techniques used by the media in the context of sustainable development; they presented such techniques (e.g. creating a sense of urgency) explicitly to secondary school students via a short lecture, and a post-test showed that most of the students were able to ‘identify and decode’ (Kotsalas et al., 2017, p. 102) common techniques. Tseng et al. (2021) chose a similar approach using a critical reading intervention and a pre- and post-test design: students were given two persuasive articles on a scientific topic together with a ‘critical reading guide’ (treatment group only). In the pre-test and post-test, the students had to rate the trustworthiness of the science-based claims, and those from the treatment group had increased epistemic vigilance, although the effect size was quite small. Bråten et al. (2019) designed a six-week intervention for upper secondary school students on sourcing (selection of reliable sources, evaluation of sources, etc.): they observed higher sensitivity to careful source selection in the treatment group, and this effect persisted for more than five weeks; in their study design, they used science-based texts on climate change and nuclear power.

Although research on dealing with misinformation in the (science) classroom is still scarce, current research from other domains (mostly in psychology) reflecting on such media manipulation strategies shows impressively that even one-off interventions can have a lasting effect. For example, Roozenbeek and van der Linden (2019) developed an online game for recognising misinformation and the corresponding dissemination strategies, which is now even available in German: players take on the role of news producers and must actively apply certain strategies (such as polarisation or emotionalisation), whereby they are warned explicitly in the game about potentially manipulative activities; in a pre- and post-test design, the game was found to significantly reduce the perceived reliability of tweets that used several common strategies of online misinformation. In a later study using the same type of intervention, Maertens et al. (2021) found that this effect could last for up to three months. Therefore, it can be said that even one-off interventions do have an effect, which over a certain period of time leads people to deal more critically and reflectively with information on the Internet. Researchers call this ‘inoculation’ (Cook et al., 2017; Roozenbeek and van der Linden, 2019), basically a ‘vaccine’ against false information and manipulation. In this context, Cook et al. (2017) noted that people in a suspicious state are influenced less by misinformation; studies by Pennycook et al. (2020) (on COVID-19 misinformation), Guess et al. (2020) (on a large online campaign about spotting misinformation) and Tully et al. (2020) (on the promotion of news literacy via Twitter) gave similar results.

Context of research: a teaching module involving a fake product

Our initial idea was to develop a teaching module on the strategies used to manipulate consumers on social media (advertising), for instance by creating a checklist for the students (Kotsalas et al., 2017). After a phase of intensive brainstorming, the idea emerged to illustrate the strategies implicitly using a product as an example. Such an intervention is aimed at empowering the students to evaluate information that they see online (Lazer et al., 2018). To illustrate these strategies precisely, we decided to develop an (imaginary) product ourselves. In doing so, we made sure that it was a product with a chemistry-based mechanism of action that appeared to be logical and could be explained with knowledge from school chemistry. This is how we came up with ‘HIQO – the water with conductivity extract’, a drinking water enriched with a supposed ‘conductivity extract’ that is claimed to increase the conductivity in the brain, allowing nerve impulses to be transmitted better and ultimately increasing cognitive performance. Of course, this mechanism only sounds plausible; it lacks any serious basis, if only because the orally ingested ‘conductivity extract’ would have to pass through the blood–brain barrier.

The developed teaching sequence is based on this imaginary product and its social-media presence, the core of which was an Instagram account (@hiqo_official; see Fig. 1). We chose Instagram as the main platform for two reasons: (i) it is a popular social network among young adults (Statista, 2022); (ii) it combines different ways of presenting content, making it possible to include photographs, text and videos. At the beginning of our intervention, learners were asked to familiarise themselves with the product and the social-media account and to assess its credibility; many of them initially trusted the product and the promises associated with it. These first impressions were collected and clustered via a collaborative online tool. Based on the Instagram account, the learners then described the strategies that were used to give the product and the account a certain seriousness and credibility. In doing so, the learners managed to work out most of the implicitly used strategies. Fig. 2 and 3 show examples of posts from the Instagram account depicting selected strategies.


Fig. 1 Post from Instagram account for the product, showing a bottle of HIQO water.

Fig. 2 Instagram post showing insights into a fictional laboratory (translation: ‘Our work in the lab’).

Fig. 3 Instagram post showing ‘data’ on the conductivity of the water using different amounts of the ‘extract’.

In addition to working out these strategies and the corresponding assessment of the product, the focus here was also on the underlying chemistry (conductivity, ions). Then the ‘fake’ was revealed, which left a strong impression on the learners. In the next step, they compared the strategies they had worked out against ‘digital index cards’ that we had created, on which the strategies used in the account were named and briefly explained. There were also ‘frequently asked questions’ for further reading. If necessary, these index cards could also be used earlier as support or scaffolding material. The end of the teaching sequence plays an important role in securing and internalising what has been learned. Thus, at the end, the learners formulated a conclusion that described and reflected on their usage behaviour and perception of scientific information before the sequence and was aimed at their future behaviour in social media. The conclusions often included not letting themselves be dazzled too quickly and checking carefully whether a product or piece of information they find is reputable, especially if they recognise one of the strategies being used. Such a conclusion coincides with the ‘inoculation’ effect described above and can lead to increased sensitivity over several months. An overview of the module is given in Table 1. After creating the first version of the module, we conducted a pilot test with a learning group (a grade-11 chemistry class) and then made some changes to the concept. These related mainly to the Instagram account: learners criticised it because (i) the posts were all published on the same day (which strengthened the suspicion of fakery) and (ii) the pictures and the design of the posts did not correspond to current trends in ‘Instagram aesthetics’. A group of learners offered to rebuild the account, which happened during the autumn break of 2021. The final account can thus be seen as a collaborative project between us and the learners.

Table 1 Overview of structure of module
Phase – Content
Introduction – The teacher announces that they have discovered a new product and presents a bottle to the students. A student reads aloud the label on the bottle (including ‘Follow us on Instagram’).
Problematisation – The students look at the Instagram account (@hiqo_official) and formulate a first impression. The impressions are clustered (using a digital feedback tool) and discussed.
Work phase I – The students note down criteria that make the account either more or less credible.
Transition – The criteria found are discussed. Bottles are distributed to the students, who search for the ‘conductivity extract’. The technical background is clarified and the fake is revealed.
Work phase II – The students compare their criteria with the digital index cards and complete their notes.
Consolidation – A general conclusion is formulated.


Method

The teaching sequence was used and evaluated in a school context. The evaluation of the final version of the module was carried out using a total of six slightly adapted civic-online-reasoning assessment tasks (McGrew et al., 2018), scored by assigning three levels of critical reflection skills (beginning/emerging/mastery). Students who gave ‘mastery’ responses analysed online information well by paying close attention to the source, verifying it against other credible sources, and analysing the evidence offered. Students who gave ‘emerging’ responses were on the correct track when it came to evaluating the source or evidence, but they provided insufficient explanations. Finally, students who gave ‘beginning’ responses used problematic ways to evaluate material (see McGrew et al., 2018, p. 173). Note that the learners in the sequence were not explicitly prepared [e.g. as in the intervention described by Kotsalas et al. (2017)] but instead implicitly prepared for the competences tested in the tasks; for example, learners were asked to compare the reliability of sources, evaluate a website, or take a position on an argument in fictitious Facebook comments. Table 2 gives an overview of the tasks used in the study. The tasks were completed by the treatment group immediately after the intervention. The processing time was on average about 40 minutes. Ethical review and approval were waived for this study because of federal data protection regulations (no personal data were collected during the study). Participation was completely voluntary and informed consent was obtained from all participants prior to data collection. For student participants aged under 18, parental consent and student assent were obtained prior to commencement of the study.
Table 2 Overview of tasks used in the assessment
Task – Description – Source of task
1. Comparing articles – Explain which of two sources (one sponsored content, one traditional news) is a more reliable source about climate change. – McGrew et al. (2018) [original task translated into German]
2. Website reliability – Using any online sources, explain whether a website is a reliable source of information about a specific compound (chlorine dioxide). – McGrew et al. (2018) [original task translated into German; website changed]
3. Evaluating evidence – Evaluate the strength of evidence in a photograph posted on Facebook (on climate change being a hoax). – McGrew et al. (2018) [original task translated into German; post changed]
4. Facebook argument – Explain which poster in a Facebook conversation provides stronger evidence about parabens. – McGrew et al. (2018) [original task translated into German; conversation topic changed]
5. Researching a claim – Use an open Internet search to decide whether aluminium in cosmetics is dangerous to health. – McGrew et al. (2018) [original task translated into German; topic changed]
6. Supposed logic – Explain whether a connection made in an Instagram post (if glyphosate were harmless, field workers wouldn't be wearing protective gear) is credible. – New task created by the authors


Sample

The lesson plan was implemented with five learning groups in grades 10–12 (age range 16–18 years) at three German urban comprehensive schools; the total sample consisted of 106 students. The first author instructed all but one of the learning groups; for organisational reasons, the second author instructed the remaining group. For the study, we assembled a control group that was as similar as possible to the treatment group, consisting partly of parallel classes, to which we gave only the research instrument; this group did not go through the teaching module beforehand and was somewhat smaller (N = 80) than the treatment group because of class sizes and a high level of illness-related absence during the COVID-19 wave of the winter of 2021/22.

Research tools and data analysis

We chose the present tasks after reviewing existing instruments for measuring media literacy. The available instruments (e.g. Simons et al., 2017; Schilder and Redmond, 2019; Cuervo Sánchez et al., 2021) did not fit our scope: they are either too broad or explicitly focused on a specific media type, and they do not address the science-related aspects covered in our teaching module. By contrast, the tasks from McGrew et al. (2018) have the advantage of explicitly testing a set of strategies used in the media; they were designed originally to assess civic online reasoning, i.e. ‘the ability to effectively search for, evaluate, and verify social and political information online’ (McGrew et al., 2018, p. 166). Even though science-based information may initially seem less controversial than social or political information, it also involves multifaceted, complex and controversial issues, and our versions of the tasks focus on such controversial science issues (e.g. climate change, pesticide use) (see Beniermann et al., 2021). The exact changes made to the tasks are outlined in Table 2. After the present authors reached agreement on the final versions of the tasks, the first author adapted the original rubrics proposed by McGrew et al. (2018) to these tasks. Table 3 gives an example of the assessment rubric with anchor quotes for the three levels of task 2. In this task, learners were asked to assess the credibility of a dubious site on the medical use of chlorine dioxide. The ‘mastery’ answer assesses the site in a very differentiated way, argues comprehensibly and provides valid sources; the ‘emerging’ answer mentions only superficial criteria but assesses the page correctly; and the ‘beginning’ answer is blinded by the technical terms and the professional presentation. The first and second authors (the latter having been involved in creating the tasks but not the rubric) then scored ca. 30% of the students’ responses, which led to very high inter-rater agreement (Cohen's κ = 0.94), whereupon the first author scored the remaining responses using the rubrics.
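As an illustration of the inter-rater check described above, the following is a minimal sketch (assumed, not the authors' actual script) of how Cohen's kappa could be computed for two raters' rubric levels using scikit-learn in Python; the label lists are hypothetical placeholders, not the study data.

```python
# Minimal sketch: Cohen's kappa for two raters scoring the same responses
# on the three-level rubric. The label lists are hypothetical placeholders.
from sklearn.metrics import cohen_kappa_score

LEVELS = ["beginning", "emerging", "mastery"]

rater_1 = ["mastery", "emerging", "beginning", "emerging", "mastery", "beginning"]
rater_2 = ["mastery", "emerging", "beginning", "beginning", "mastery", "beginning"]

kappa = cohen_kappa_score(rater_1, rater_2, labels=LEVELS)
print(f"Cohen's kappa = {kappa:.2f}")  # the study reports kappa = 0.94 on ca. 30% of responses
```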
Table 3 Exemplary anchor quotes for the three levels of task 2 (website reliability)
Mastery – No, the website is not trustworthy because the articles on it are more or less only about chlorine dioxide and its supposedly good effect. The articles say very little about health, but only promote a product and do not present enough scientific facts about it. Moreover, nowhere is it about how exactly everyone is to be helped, as promised at the beginning. If you understand the organisation correctly, all you have to do is give everyone chlorine dioxide again and again and the whole world is saved. According to it, there will be no need for vaccinations. What helped me a lot in my decision is that if you research it further, you find out that [the organisation] is a ‘lobby organisation of suppliers and users of the disinfectant chlorine dioxide for the prevention and treatment of all kinds of diseases’ [credible source]. To make sure I didn't fall for an opposing side, I also looked for other sites that prove that the [organisation] belongs to a lobby and that its facts are just made up. In the process, I came across the site [credible source]. Here, doctors and journalists contacted one of the [organisation] members by email and asked how much there really was to the studies. The answer showed that members of [organisation] are vaccination opponents […]. This site also talks to actual doctors who confirm that chlorine dioxide cannot be a possible COVID-19 antidote. Some people have even died from it and in the meantime some members have also been reported to the police. (A_S3_T89)
Emerging – Website under construction, no one answers the phone, ‘about us’ page missing. Not credible! (A_S5_K127)
Beginning – I think the website looks quite credible. In general, the structure is very interesting and makes you want to read more. Pictures always help to arouse my interest. But the decisive point that makes the website look credible to me is that the website uses a lot of technical terms that you don't understand as a normal person. Because of all the technical terms, you immediately see that the website has a clue about its work. (A_S1_T81)


In addition to the post-intervention tasks, open-ended observation forms were completed during the lessons by the regular teachers of the corresponding classes. There was also a brief feedback session with the students after each intervention. These sessions were held in an open format and were kept fairly short because of the already substantial time commitment for the students; the students were asked what they liked, whether they had suggestions for improvement, and what else they thought about the intervention.

Findings

Overall perception

During the lesson observations, it was noticeable that the teachers rated the student activity in all groups as higher than in regular chemistry lessons. For example, they found that students who normally participate only rarely did participate in this particular lesson. In general, class participation was higher than usual, in part because students asked more questions, especially ones questioning their own media use (‘What exactly can I do if I notice that an Internet resource is using certain strategies?’). In addition, it was highlighted that the students did not drift to other topics in their conversations during the work phases, as is usually the case, but talked exclusively about the lesson topic. During the presentation of the results, there were many more volunteers than in regular classes. In the subsequent feedback discussions, the learners explained this with a higher sense of relevance (‘I will now check which accounts I follow and whether they are credible’) as well as the motivating role of social media (‘We usually never work with an Instagram account’). In addition, it was observed that students had many questions after the first viewing of the account (How does the extract affect people? Can the IQ perhaps even be affected?). An additional type of feedback was comments about the content of the account; for example, several groups suggested posting a fictitious ‘expert interview’ online, such as with a professor who would reaffirm the effect for greater credibility, and evidence suggests that this is a very effective strategy (Cook et al., 2017). However, after lengthy deliberation, this idea was discarded because it seemed dishonest to include a sound bite from an interview with someone about a product that did not work. Other minor details in the account (e.g. the selection of certain hashtags) were adjusted based on feedback. Furthermore, the teachers observed that at the beginning of the intervention, many students compared the mineral water that was handed out with their own in terms of mineralisation; we had anticipated this and deliberately ordered water with exceptionally high mineralisation.

In both the feedback sessions and the intervention itself, many learners highlighted the successful, professional-looking design of the account; until the resolution, they did not notice that stock photos and videos had been used in some cases. Thus, it would appear that the design conveyed very high perceived credibility (as did the graphs, tables and everything that looked ‘scientific’) and to a certain extent outshone the content, which was only gradually scrutinised. For example, details such as misleading axis labels on graphs were not recognised at first and were only noticed later; at first glance, the graphs established a high level of credibility. In addition, it was noticeable that the students, who had just dealt with the topic of ions and conductivity in regular chemistry lessons, were able to classify the (supposed) mode of action much more quickly, but were also able to debunk it. This shows that knowledge does protect against manipulation if it can be applied directly (Hove et al., 2011). In summary, although most students were initially sceptical about the product, the design of the account and the content had some persuasive power. Thus, most students were very surprised by the resolution of the fake and, according to the teachers, were still talking about it weeks later. This indicates a heightened sense of relevance on the part of the students, who also subsequently reported that they often find themselves thinking back to the intervention when they consume media in their daily lives.

Task performance

Fig. 4 gives the percentage distributions (beginning/emerging/mastery) for each task for the control and treatment groups. From these descriptive statistics alone, one can see that the treatment group clearly outperformed the control group. To confirm this, a Mann–Whitney U test was performed; this nonparametric test for independent samples is used when the conditions for a t-test are not met, which was the case here because the data were ordinally scaled (the three levels represent a rank order). There was indeed a significant difference between the control and treatment groups, i.e. U = 99618.500, Z = −9.170, p < 0.001, with a large effect size (Pearson's r = 0.67).
Fig. 4 Students' task performance.
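For transparency about the reported statistics, the following is a minimal sketch (assumed, not the authors' actual analysis script) of how such a Mann–Whitney U test and an effect size r = |Z|/√N could be computed in Python from ordinally coded rubric scores (0 = beginning, 1 = emerging, 2 = mastery); the score vectors are hypothetical placeholders, not the study data.

```python
# Minimal sketch: Mann-Whitney U test on ordinal rubric scores plus effect size r.
# The arrays below are hypothetical placeholders, not the data from the study.
import numpy as np
from scipy import stats

treatment = np.array([2, 1, 2, 0, 1, 2, 1, 2, 0, 2])  # hypothetical rubric codes
control = np.array([0, 1, 0, 0, 1, 0, 2, 0, 1, 0])    # hypothetical rubric codes

# Two-sided Mann-Whitney U test for two independent samples
u_stat, p_value = stats.mannwhitneyu(treatment, control, alternative="two-sided")

# Z from the normal approximation of U (tie correction omitted for brevity)
n1, n2 = len(treatment), len(control)
mu_u = n1 * n2 / 2
sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
z = (u_stat - mu_u) / sigma_u

# Effect size r = |Z| / sqrt(N), with N the total number of observations
r = abs(z) / np.sqrt(n1 + n2)

print(f"U = {u_stat:.1f}, Z = {z:.3f}, p = {p_value:.4f}, r = {r:.2f}")
```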

In task 1, which we adapted entirely from McGrew et al. (2018), learners were asked to compare two pieces of information about climate change: the first was a headline from The Atlantic magazine for an article written by a credible scientist; the second was information from the same magazine but showing a graph created by Shell that illustrates the role of the energy mix. In fact, the learners in both groups had the most difficulty with this task (see Fig. 4; 63% reached only the ‘beginning’ level even in the treatment group). The learners were very influenced by the scientific-looking pie chart [‘Article B appears more credible because a graphic illustrates the problems and opportunities and, moreover, a metaphor is not used as in Article A. This means that Article B appears more informative than Article A.’ (C_S1_K_17)]. In the treatment group, almost a quarter of the students recognised the sponsoring as less credible [‘Article B appears credible at first sight, as the diagram looks professional. However, this article was sponsored by Shell and Shell, as a gigantic oil company, is known to have a negative impact on the environment. So, this action seems to be greenwashing to avoid responsibility of climate change.’ (A_S1_T101)].

In task 2, learners were asked to evaluate a pseudoscientific site (already bordering on conspiracy ideologies) that promoted chlorine dioxide as a cure for all sorts of diseases, including COVID-19. Shockingly, over 60% of the students from the control group rated the site as reliable; the results were much better in the treatment group. In general, many learners here (just as with the Instagram account in the intervention) were dazzled by the professional appearance of the website. Task 3 involved a Facebook post that supposedly disproves man-made climate change: it shows water-level markings on buildings, some of which are over 300 years old, and this is presented as proof that floods have always existed. Here, the majority of the students were at the ‘emerging’ level [‘This article seems very dubious to me because it comes from Facebook and this is a platform where anyone can post what they want without it being checked for accuracy. Moreover, the author of the article does not seem to be an expert in this field. The article simply shows pictures with different water levels at different times in history. However, the article does not show how these water levels came about.’ (C_S1_K22)]. In addition to such source criticism, mastery-level learners also recognised the increasing frequency of natural disasters as an indication of climate change [‘No, the contribution is insufficient. Of course, there have also been devastating floods in past years, but the argument that these are due to climate change is not only based on the extent of the floods, but also on their frequency and location. For example, they occur in places without previous flooding problems, and more often than before the onset of climate change.’ (A_S3_T71)].

In task 4, students were asked to take a position on an argument on Facebook about the alleged cancer risk of parabens. One of the two people involved gave a reputable source that refuted the claim. The other person then showed two graphs (sales of paraben-containing products and breast cancer cases), which were only supposedly connected to each other and did not represent any proof of a cancer-causing effect of parabens. Almost 50% of the students in the treatment group and almost 80% of the students in the control group were blinded by these very graphs [‘Grace has the stronger arguments as she also includes studies, examples and graphs in the argument.’ (A_S1_K48)].

In task 5, students were asked to research the extent to which aluminium is harmful to health. The focus was on which sources they selected for their assessment and how they arrived at a verdict. Here, the learners in both groups performed best; it seems that most of them were able to select credible sources.

Finally, task 6, which we created entirely ourselves, involved the impact of supposedly logical connections. An Instagram meme was shown questioning why workers who spray glyphosate wear protective clothing, even though the producers claim that it is safe. Beyond this context, the meme has a strong emotionalising effect, and it influenced a large proportion of the learners (43% in the treatment group and 64% in the control group) [‘The reasoning of (author of the post) is very plausible because it looks really scary how the people with their protective suits move through the fields.’ (A_S1_K33)]. Only a few students in both groups showed a differentiated view of the context [‘Consumption can be safe because the product is purified and one is only exposed to low concentrations of the product. Workers, on the other hand, come into direct contact and are given extra protection – the dose makes the poison.’ (A_S3_T101)].

In general, it can be summarised that even the treatment group was still taken in by the strategies in many cases. However, the group that went through the teaching unit more often mentioned the strategies (previously covered in the module) explicitly when answering the questions, or used formulations from the teaching unit [‘Technical terms are not explained by the experts and occur in the text without context.’ (A_S1_T_68)].

Conclusions

In light of the transformed media landscape, it is now essential to question messages from the media critically. This should play a role in all school subjects, because young people have difficulties in adequately assessing science-based information (Dawson and Venville, 2009; Tseng, 2018). As a proposal for improving the questioning of science-based information in social media, we developed and evaluated a teaching module on a fictional product, i.e. water with ‘conductivity extract’. To the best of our knowledge, the teaching sequence that we developed and the associated study are the first of their kind (i.e. in the context of science teaching, with an immersive ‘trigger’ in the form of a fictional product and a focus on authentic social media), both in Germany and internationally.

In the evaluation of the module, we used tasks that were created to measure civic online reasoning, something that is known from evaluating social or political information. While science-based information may not seem equally controversial at first glance, even scientific facts are being distorted, changed or called into question in social media. Moreover, there are many complex and controversial issues that have a scientific basis (Beniermann et al., 2021), also known as socio-scientific issues (Sadler and Dawson, 2012). Therefore, we consider civic-online-reasoning tasks to be suitable instruments for our purposes as well.

One limitation of our study is that we did not measure the construct of scientific media literacy directly but instead used the tasks by McGrew et al. (2018) as items that assess different scientific aspects and strategies frequently used in the media; they are not intended to serve as a ‘summative instrument’ (McGrew et al., 2018, p. 185). In the long run, there is a need for an instrument that measures the construct of scientific media literacy. Another limitation, noted by the authors of the tasks and also valid in our study, is that the tasks are artificial because the students evaluate information that they did not encounter themselves on social media. However, we deliberately chose or devised the tasks to address the strategies that occurred in the instructional module.

Regarding the results of the tasks, it is useful to compare them with studies in which tasks of this type were also used. The first is that by McGrew et al. (2018) themselves, who described the piloting of the tasks as well as their final use with different groups (middle school, high school, college) and without any prior explicit intervention that could have prepared students to work on them. Here, the percentage distributions of the results coincide with the trends in our control group (see Fig. 4): for task 1 (Comparing articles), 85% of the students in our control group reached the beginning level, 6% the emerging level and 9% the mastery level; for the identical task, the distribution in the high-school cohort of McGrew et al. (2018) was 80% beginning, 9% emerging and 11% mastery, i.e. very similar. Horn and Veermans (2019) also used these tasks in Finland: the Finnish students showed slightly better results but similar tendencies; for example, in the aforementioned ‘Comparing articles’ task, 74% of the students answered at the beginning level, 12% at the emerging level and 14% at the mastery level. Evidently, it is quite difficult for students throughout the world to evaluate information in the media.

This is where teaching modules such as the one that we have developed could come in, and in all subjects. We were able to show at least a short-term effect; what we have not established is a long-term effect, although other comparable studies have shown that such effects last for several weeks or even months (Roozenbeek and van der Linden, 2019). There is, of course, a certain novelty effect that may overshadow the actual effectiveness of such interventions. In addition, the positive observations of the teachers, such as high participation and increased interest in the subject matter, should not be disregarded. In general, there is a need for more teaching suggestions of this kind, especially in the sciences, as well as more evidence of their effectiveness. The study results available so far suggest that it is best to repeat such learning sequences every few months; theoretically, the subjects involved could alternate. We should also consider how subject content in science subjects can be functionally linked to relevant issues discussed in the media, so that students understand the role of chemistry-related (or generally science-related) information in the public arena. It is important to note that simply knowing about manipulative strategies does not fully protect against manipulation, and emotions play a very large role in judgment (Pennycook et al., 2021): we believe what we want to believe. For example, very recent research has shown that young people evaluate information differently and less accurately when it is disseminated by influencers to whom they feel emotionally attached (Balaban et al., 2022; Sweeney et al., 2022). How to break through such mechanisms is a key question for future research. Behind this lies a fundamental question, namely what should be taught in schools and what role schools play in developing competences for reflective orientation in a digitalised world. Over 10 years ago, Chao et al. (2011, p. 324) wrote that ‘[social media are] so widespread and inculcated in our culture that it is futile to try to stop their influence at the classroom door.’ Schools therefore cannot close themselves off from these developments, which in turn results in a great need for further training of teachers, both pre-service and in-service.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

This research was funded by the Joachim Herz Foundation (“Kolleg Didaktik:digital”).

References

  1. Acerbi A., Altay S. and Mercier H., (2022), Research note: Fighting misinformation or fighting for information? Harv. Kennedy Sch. Misinf. Rev., 3(1), 1–15.
  2. Allen J., Howland B., Mobius M., Rothschild D. and Watts D. J., (2020), Evaluating the fake news problem at the scale of the information ecosystem, Sci. Adv., 6(14), eaay3539.
  3. Archila P. A., Danies G., Molina J., de Mejía A. M. T. and Restrepo S., (2021), Towards Covid-19 literacy, Sci. Educ., 30(4), 785–808.
  4. Balaban D. C., Mucundorfeanu M. and Mureşan L. I., (2022), Adolescents’ understanding of the model of sponsored content of social media influencer Instagram stories, Media Commun., 10(1), 305–316.
  5. Barzilai S. and Chinn C. A., (2020), A review of educational responses to the “post-truth” condition: Four lenses on “post-truth” problems, Educ. Psychol., 55(3), 107–119.
  6. Belova N. and Eilks I., (2015), Learning with and about advertising in chemistry education with a lesson plan on natural cosmetics – A case study, Chem. Educ. Res. Pract., 16(3), 578–588.
  7. Belova N. and Velikina I., (2020), Analyzing the chemistry in beauty blogs for curriculum innovation, Chem. Teach. Int., 2(2), 20180028.
  8. Belova N., Chang Rundgren S.-N. and Eilks I., (2015), Advertising and science education: A multi-perspective review of the literature, Stud. Sci. Educ., 51(2), 169–200.
  9. Beniermann A., Mecklenburg L. and Upmeier zu Belzen A., (2021), Reasoning on controversial science issues in science education and science communication, Educ. Sci., 11(9), 522.
  10. Bennett S., (2012), Digital natives, in Yan Z. (ed.), Encyclopedia of Cyber Behavior: Volume 1, Hershey, PA: IGI Global, pp. 212–219.
  11. Bråten I., Brante E. W. and Strømsø H. I., (2019), Teaching sourcing in upper secondary school: A comprehensive sourcing intervention with follow-up data, Read. Res. Q., 54(4), 481–505.
  12. Chang Rundgren S. N. and Rundgren C.-J., (2014), SSI pedagogic discourse: embracing scientific media literacy and ESD to face the multimedia world, in Eilks I., Markic S. and Ralle B. (ed.), Science Education Research and Education for Sustainable Development, Aachen, Germany: Shaker, pp. 157–168.
  13. Chao J., Parker K. and Fontana A., (2011), Developing an interactive social media based learning environment, Issues Informing Sci. Inf. Technol., 8, 323–334.
  14. Cook J., Lewandowsky S. and Ecker U. K., (2017), Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence, PLoS One, 12(5), e0175799.
  15. Craig-Hare J., Rowland A., Ault M. and Ellis J. D., (2018), Practicing scientific argumentation through social media, in Information Resources Management Association (ed.), Social Media in Education: Breakthroughs in Research and Practice, Hershey, PA: IGI Global, pp. 234–256.
  16. Cuervo Sánchez S. L., Foronda Rojo A., Rodriguez Martinez A. and Medrano Samaniego C., (2021), Media and information literacy: a measurement instrument for adolescents, Educ. Rev., 73(4), 487–502.
  17. Danciu V., (2014), Manipulative marketing: persuasion and manipulation of the consumer through advertising, Theor. Appl. Econ., 2(591), 19–34.
  18. Danjou P. E., (2020), Distance teaching of organic chemistry tutorials during the COVID-19 pandemic: Focus on the use of videos and social media, J. Chem. Educ., 97(9), 3168–3171.
  19. Dawson V. and Venville G. J., (2009), High-school students’ informal reasoning and argumentation about biotechnology: An indicator of scientific literacy? Int. J. Sci. Educ., 31, 1421–1445.
  20. Eilks I., Nielsen J. A. and Hofstein A., (2014), Learning about the role of science in public debate as an essential component of scientific literacy, in Tiberghien A., Bruguière C. and Clément P. (ed.), Topics and Trends in Current Science Education, Dordrecht, The Netherlands: Springer, pp. 85–100.
  21. Guerris M., Cuadros J., González-Sabaté L. and Serrano V., (2020), Describing the public perception of chemistry on twitter, Chem. Educ. Res. Pract., 21(3), 989–999.
  22. Guess A. M., Lerner M., Lyons B., Montgomery J. M., Nyhan B., Reifler J. and Sircar N., (2020), A digital media literacy intervention increases discernment between mainstream and false news in the United States and India, Proc. Natl. Acad. Sci. U. S. A., 117(27), 15536–15545.
  23. Hobbs R. and Jensen A., (2009), The past, present, and future of media literacy education, J. Media Lit. Educ., 1(1), 1–11.
  24. Hofstein A., Eilks I. and Bybee R., (2011), Societal issues and their importance for contemporary science education: A pedagogical justification and the state of the art in Israel, Germany and the USA, Int. J. Sci. Math. Educ., 9(6), 1459–1483.
  25. Holbrook J. and Rannikmäe M., (2007), The nature of science education for enhancing scientific literacy, Int. J. Sci. Educ., 29(11), 1347–1362.
  26. Holmes H. W. et al. (ed.), (1947), Fundamental Education, Common Ground for All Peoples: Report of a Special Committee to the Preparatory Commission of the United Nations Educational, Scientific and Cultural Organization, Paris, 1946, New York, NY: Macmillan.
  27. Horn S. and Veermans K., (2019), Critical thinking efficacy and transfer skills defend against ‘fake news’ at an international school in Finland, J. Res. Int. Educ., 18(1), 23–41.
  28. Höttecke D. and Allchin D., (2020), Reconceptualizing nature-of-science education in the age of social media, Sci. Educ., 104(4), 641–666.
  29. Hove T., Paek H.-J. and Isaacson T., (2011), Using adolescent eHealth literacy to weigh trust in commercial web sites: The more children know, the tougher they are to persuade, J. Adv. Res., 51(3), 524–537.
  30. Hurst G. A., (2018), Utilizing Snapchat to facilitate engagement with and contextualization of undergraduate chemistry, J. Chem. Educ., 95(10), 1875–1880.
  31. Information Resources Management Association, (2018), Social Media in Education: Breakthroughs in Research and Practice, Hershey, PA: IGI Global.
  32. Kotsalas I. P., Antoniou A. and Scoullos M., (2017), Decoding mass media techniques and education for sustainable development, J. Educ. Sustainable Dev., 11(2), 102–122.
  33. Lazer D. M., Baum M. A., Benkler Y., Berinsky A. J., Greenhill K. M., Menczer F. et al., (2018), The science of fake news, Science, 359(6380), 1094–1096.
  34. Maertens R., Roozenbeek J., Basol M. and van der Linden S., (2021), Long-term effectiveness of inoculation against misinformation: Three longitudinal experiments, J. Exp. Psychol. Appl., 27(1), 1–16.
  35. Marks R., Stuckey M., Belova N. and Eilks I., (2014), The societal dimension in German science education – From tradition towards selected cases and recent developments, Eurasia J. Math. Sci. Technol. Educ., 10(4), 285–296.
  36. McGrew S., Breakstone J., Ortega T., Smith M. and Wineburg S., (2018), Can students evaluate online sources? Learning from assessments of civic online reasoning, Theor. Res. Soc. Educ., 46(2), 165–193.
  37. Moran P., (2020), Social media: A pandemic of misinformation, Am. J. Med., 133(11), 1247–1248.
  38. Pennycook G., McPhetres J., Zhang Y., Lu J. G. and Rand D. G., (2020), Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention, Psychol. Sci., 31(7), 770–780.
  39. Pennycook G., Epstein Z., Mosleh M., Arechar A. A., Eckles D. and Rand D. G., (2021), Shifting attention to accuracy can reduce misinformation online, Nature, 592(7855), 590–595.
  40. Prensky M., (2001), Digital natives, digital immigrants part 1, On Horizon, 9(5), 1–6.
  41. Reid G. and Norris S. P., (2016), Scientific media education in the classroom and beyond: a research agenda for the next decade, Cult. Stud. Sci. Educ., 11(1), 147–166.
  42. Roozenbeek J. and van der Linden S., (2019), Fake news game confers psychological resistance against online misinformation, Palgrave Commun., 5(1), 65.
  43. Sadler T. D. and Dawson V., (2012), Socio-scientific issues in science education: Contexts for the promotion of key learning outcomes, in Fraser B., Tobin K. and McRobbie C. (ed.), Second International Handbook of Science Education, Dordrecht, The Netherlands: Springer, pp. 799–809.
  44. Scheibe C. and Rogow F., (2012), The Teacher's Guide to Media Literacy, Thousand Oaks, CA: Corwin.
  45. Schilder E. and Redmond T., (2019), Measuring media literacy inquiry in higher education: Innovation in assessment, J. Media Lit. Educ., 11(2), 95–121.
  46. Simons M., Meeus W. and T'Sas J., (2017), Measuring media literacy for media education: Development of a questionnaire for teachers' competencies, J. Media Lit. Educ., 9(1), 99–115.
  47. Sjöström J., Frerichs N., Zuin V. G. and Eilks I., (2017), Use of the concept of Bildung in the international science education literature, its potential, and implications for teaching and learning, Stud. Sci. Educ., 53(2), 165–192.
  48. Statista, (2022), Beliebteste soziale Netzwerke [The most popular social networks], retrieved from https://de.statista.com/prognosen/999733/deutschland-beliebteste-soziale-netzwerke.
  49. Stuckey M., Mamlok-Naaman R., Hofstein A. and Eilks I., (2013), The meaning of ‘relevance’ in science education and its implications for the science curriculum, Stud. Sci. Educ., 49(1), 1–34.
  50. Sweeney E., Lawlor M. A. and Brady M., (2022), Teenagers’ moral advertising literacy in an influencer marketing context, Int. J. Advert., 41(1), 54–77.
  51. Tankowska H., (2021), Number of social network users worldwide from 2017 to 2025, retrieved from https://www.statista.com/statistics/278414/number-of-worldwide-social-network-users/.
  52. Tseng A. S., (2018), Students and evaluation of web-based misinformation about vaccination: critical reading or passive acceptance of claims? Int. J. Sci. Educ. B: Commun. Public Engagem., 8(3), 250–265.
  53. Tseng A. S., Bonilla S. and MacPherson A., (2021), Fighting “bad science” in the information age: The effects of an intervention to stimulate evaluation and critique of false scientific claims, J. Res. Sci. Teach., 58(8), 1152–1178.
  54. Tully M., Vraga E. K. and Bode L., (2020), Designing and testing news literacy messages for social media, Mass Commun. Soc., 23(1), 22–46.
  55. UNESCO, (2006), Media education: a kit for teachers, students, parents and professionals, retrieved from http://unesdoc.unesco.org/images/0014/001492/149278e.pdf.
  56. UNESCO, (2021), Media and information literate citizens: think critically, click wisely!, retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000377068.
  57. van der Linden S., (2022), Misinformation: susceptibility, spread, and interventions to immunize the public, Nat. Med., 28, 460–467.
  58. van Prooijen J. W. and van Vugt M., (2018), Conspiracy theories: Evolved functions and psychological mechanisms, Perspect. Psychol. Sci., 13(6), 770–788.
  59. Wardle C. and Derakhshan H., (2018), Thinking about ‘information disorder’: formats of misinformation, disinformation, and mal-information, in Ireton C. and Posetti J. (ed.), Journalism, ‘Fake News’ and Disinformation, Paris, France: UNESCO, pp. 43–54.
