A novel qualitative method to improve access, elicitation, and sample diversification for enhanced transferability applied to studying chemistry outreach

Justin M. Pratt and Ellen J. Yezierski *
Department of Chemistry and Biochemistry, Miami University, Oxford, OH, USA. E-mail: yeziere@miamioh.edu

Received 19th October 2017, Accepted 21st December 2017

First published on 17th January 2018


Abstract

Conducting qualitative research in any discipline warrants two actions: accessing participants and eliciting their ideas. In chemistry education research (CER), survey techniques have been used to increase access to participants and diversify samples, while interview tasks (such as card sorting, using demonstrations, and using simulations) have been used to elicit participant ideas. Although surveys can increase participation and remove geographic barriers from studies, they typically cannot obtain the detailed, thick description of participant ideas that in-person interviews afford. Minimal research in CER has examined how to harness technology to synthesize these traditionally distinct research approaches to advance the field. This paper presents a novel method for interviewing research participants that employs freely available technology to investigate student ideas about the purposes of conducting chemistry outreach, how the success of an outreach event is evaluated, and student understanding of the chemistry content embedded in activities facilitated at events. Because the outreach practitioner population comes from numerous institutions and is therefore geographically diverse, technology is necessary to gain access to these students. To elicit their ideas and remove barriers associated with rapport, interview tasks were adapted and implemented electronically. The description of this novel set of methods is coupled with evidence from the interviews to illustrate the trustworthiness of the data obtained and to support the method as a means to improve qualitative data collection in CER. These methods create a unique data collection environment for off-site investigations and are applicable to all disciplines, as they shed light on how qualitative research in the 21st century can increase the diversity of samples and improve the transferability of findings.


Background

This study employs novel methods to access and elicit ideas from chemistry outreach practitioners who are geographically diverse. Because the context of this study is outreach, a review of the extant literature on informal science education (ISE) is presented. This is followed by a review of the literature that informs methods in chemistry education research (CER) so the novel features of the methods in this study can be highlighted.

Informal chemistry education

Teaching and learning can be parsed by where it occurs: in formal environments (i.e., schools) and informal environments (i.e., anything outside of a school setting). In science, informal learning environments can include science-specific institutions (such as museums and zoos), science-related media, and structured programs that occur outside of school (National Research Council, 2009; National Science Teachers Association, 2012).

Research on ISE has grown in the past decade across the world, including in the United States (U.S.), the United Kingdom (U.K.), and Australia, with government-supported projects and funding (National Research Council, 2009, 2010; Parliamentary Office of Science & Technology, 2011; Sewry et al., 2014; National Academies of Sciences Engineering and Medicine, 2016, 2017; National Innovation and Science Agenda, 2017; Australian Government, 2017a; National Science Foundation, n.d.). Such initiatives address growing concerns regarding the public's understanding of science (Funk and Goo, 2015; TNS BMRB and Royal Society of Chemistry, 2015) and the visibility of scientists (Wilsdon and Willis, 2004; Nisbet and Markowitz, 2015; National Academies of Sciences Engineering and Medicine, 2017). In chemistry, ISE is typically termed ‘outreach’ and involves college students and scientists engaging younger students and/or the public in chemistry activities (e.g., demonstration shows, science cafés, and public lectures) (National Academies of Sciences Engineering and Medicine, 2016). Large chemistry organizations support chemists in developing and implementing outreach events: the American Chemical Society (ACS), Alpha Chi Sigma (AXΣ), the Royal Australian Chemical Institute (RACI), and the Royal Society of Chemistry (RSC). All four of these organizations encourage their members to participate in outreach and provide outreach resources on their websites (Royal Australian Chemical Institute Incorporated, 2017; Royal Society of Chemistry, 2017a; Alpha Chi Sigma, 2017b; American Chemical Society, 2017b). The ACS, Australian Government, and RSC even sponsor large science/chemistry-focused outreach events each year: National Chemistry Week, National Science Week, and Chemistry Week, respectively (American Chemical Society, 2017b; Australian Government, 2017b; Royal Society of Chemistry, 2017b).

Minimal research has investigated chemistry outreach practices (e.g., practitioner goals and experiences, participant learning, long- and short-term impacts of events). While there are numerous publications discussing chemistry outreach, they primarily present ideas for outreach events and procedures for demonstrations (e.g., Koehler et al., 1999; Louters and Huisman, 1999; Swim, 1999; Flynn, 2005; Laursen et al., 2007; Carpenter et al., 2010; Kuntzleman and Baldwin, 2011; Houck et al., 2014). The first scholarly investigation of outreach practices examined the Fusion Science Theater program, which uses a specific demonstration show model coupled with either pre/post-tests or short interviews to collect evidence of audience conceptual learning as a result of attending events (Kerby et al., 2010, 2016; DeKorver et al., 2017). Some recent publications have surveyed event attendees to understand their perceptions of event implementation and their perceived cognitive and affective gains from attending (Sewry et al., 2014; Ting et al., 2017), but these studies do not shed light on why chemistry outreach is conducted or what events look like for those not involved with Fusion Science Theater.

In the U.S., the National Academies of Sciences, Engineering, and Medicine studied informal chemistry education practices and published a report in 2016 (National Academies of Sciences Engineering and Medicine, 2016). The scope of the report is broad and describes general ideas for all types of informal chemistry education events (e.g., outreach events, science cafés, and public lectures). The report includes only the perspectives of practicing chemists and presents generalized findings related to the goals chemists have for informal chemistry education events and the content areas they typically focus on during events. These goals include increasing the public's appreciation of science and encouraging people to enter the chemical sciences workforce. The goals that U.S. chemists have for conducting informal chemistry education align with U.K. outreach goals, as evidenced by two ongoing outreach research studies by the RSC designed to understand public attitudes towards chemistry and how outreach can increase university enrollment (Royal Society of Chemistry, 2017c). The RSC even emphasizes its goal of increased university enrollment by stating on its website, “The world needs more chemical scientists, and chemistry skills can lead our young people into a vast range of fulfilling careers” (Royal Society of Chemistry, 2017e). This goal clearly aligns with U.S. chemists’ goal of increasing the workforce in the chemical sciences. The National Academies’ report also presents a framework to guide planning and implementing informal chemistry events, which emphasizes setting goals and evaluating the success of achieving those goals. Although the framework is generalized to all informal event types, not just chemistry outreach, it is very similar to advice given by the RSC for planning chemistry outreach events (Royal Society of Chemistry, 2017d). With such an international focus on chemistry outreach, and calls from both chemistry education researchers (Christian and Yezierski, 2012) and the broader discipline-based education research community (National Research Council, 2012), it is timely to examine chemistry outreach practices using quality research approaches.

Methods in chemistry education research (CER)

Just as chemistry teaching/learning can be divided into two foci (formal and informal), so can the methods used in CER. The broadest distinction in methods is between quantitative and qualitative methods. When CER is focused on studying humans, researchers are limited by their ability to access research participants and the techniques that elicit the desired data from participants. Presented below are descriptions of various methods used in CER and the benefits and limitations associated with them, with respect to accessing participants and eliciting meaningful data.
Surveys. Surveys are one method used in CER that can generate both quantitative and qualitative data. Surveys are commonly used because they allow for quick data collection. Modern survey research administers surveys electronically via survey software rather than on paper. This allows researchers to disseminate surveys to a large population at minimal cost, thus increasing sample sizes for studies and the generalizability or transferability of conclusions. For example, Raker and colleagues sent one email and disseminated a survey to over 5000 potential research participants (Raker et al., 2015). Because of the ease of survey deployment via email, access to participants is maximized with email lists such as the Chemistry Education Research mailing list (Division of Chemical Education, 2016) and the National Association for Research in Science Teaching listserv (National Association for Research in Science Teaching, 2017). The limitations of survey research typically rest in the elicitation of participants’ ideas. While both quantitative and qualitative data can be collected with surveys, there is the concern of whether a research participant understands the questions being asked. While cognitive pretesting/interviews can minimize this concern (Collins, 2003; Gehlbach and Brinkworth, 2011; Dean et al., 2013), researchers are limited to analyzing only the responses given when the survey is administered; follow-up questions to clarify the meaning of participant answers to free-response questions cannot be asked, thus limiting the depth of analysis. Additionally, there has been discussion about the validity and reliability of self-report data obtained via surveys (Mayer, 1999; Coughlan et al., 2009) and about minimizing the limitations of self-reports by pilot testing survey questions and developing questions based on interviews or the literature (Kelly et al., 2003; Desimone and Carlson le Floch, 2004; Bretz and Linenberger, 2012). No matter how the questions are developed, researchers remain unable to ask participants follow-up questions to clarify survey responses without using additional research techniques.
Interviews. Interviews are also common in CER; in contrast to surveys, they can generate rich, detailed descriptions of a research participant's ideas. Semi-structured interviews are common as they allow follow-up/probing questions to be asked in an attempt to deepen the researcher's understanding of a participant's ideas (Drever, 1995; Patton, 2002). Elicitation remains a concern during interviews, since simply asking a participant to verbalize everything they know about a topic may not lead to rich, meaningful data. This is of particular concern when interviews are cognitive and focus on conceptual chemistry understanding; students often do not know what they do not know (Pazicni and Bauer, 2014). To increase elicitation during interviews, various tasks have been used in CER to obtain the rich data necessary to answer research questions. Some common tasks include card sorting (e.g., Krieter et al., 2016), using demonstrations (e.g., Nakhleh, 1994; Brandriet and Bretz, 2014), using multiple representations (e.g., Linenberger and Bretz, 2012b; Kelly et al., 2017), having participants draw (e.g., Linenberger and Bretz, 2012a), and using think-aloud protocols (e.g., Bowen, 1994; Herrington and Daubenmire, 2014). Interviews have thus become common when research goals require rich, detailed elicitation. However, the primary limitation of interviews revolves around access to participants.

The standard for interviews in any discipline is the in-person, face-to-face interview (Patton, 2002; Herrington and Daubenmire, 2014). To accomplish this, researchers must have access to research participants and a space to conduct the interview, and researcher and participant schedules must align. Because researchers must physically meet participants, pools of potential interviewees and sample sizes tend to be smaller, minimizing the variability in studies and the potential transferability of findings (Lincoln and Guba, 1985). To increase participation, gift cards to incentivize participation have become common in CER (e.g., Szteinberg and Weaver, 2013; Bauer, 2014; Anzovino and Bretz, 2016); however, concerns of minimal sample variability remain when studies include only participants from one location/region. Therefore, while interviews in CER benefit from rich elicitation, access to participants is a primary concern. Means of increasing access to participants and diversifying samples have started to emerge in CER through the use of multimedia communication programs that facilitate remote interviews, such as Skype. However, CER publications that have discussed using these programs have provided little detail about the implementation of interviews mediated by the internet and how concerns of elicitation are addressed (Herrington and Daubenmire, 2014; Harshman and Yezierski, 2015).


Online interviews. Outside of CER, qualitative researchers have been investigating using online methods to conduct interviews and have argued over the quality of data obtained online versus in person. Such online interview methods include asynchronous techniques (non-immediate responses from participants) and synchronous techniques (immediate responses similar to face-to-face interviews) (Mann and Stewart, 2000; Fielding et al., 2017). Various publications have focused on using different platforms to conduct these online interviews including over email (James, 2007), chat rooms (Chou, 2004), internet phone calls (Steeh and Piekarski, 2007), virtual reality systems/worlds (Dean et al., 2013; Girvan and Savage, 2013), and multimedia-based programs such as Skype (Deakin and Wakefield, 2014). All of these method-focused publications have discussed the benefits of these techniques for researchers and participants. The primary benefits include increased access (since researchers and participants no longer need to be in the same location) and decreased costs associated with data collection. Asynchronous techniques also allow participants time to think, reflect, and craft responses, while synchronous techniques mimic in-person interviews allowing for immediate responses from participants and opportunities to ask follow-up questions. However, these various techniques are limited, and the strengths and limitations of both approaches (asynchronous and synchronous) must be weighed in light of research questions. Asynchronous techniques may be more beneficial for investigating sensitive topics (such as depression) since participants can take their time when responding; synchronous techniques may be more beneficial for investigations of a benign nature (such as chemistry conceptual understanding) (Mann and Stewart, 2000; Sullivan, 2012; Iacono et al., 2016; Seitz, 2016; Fielding et al., 2017). Proponents of multimedia-based environments such as Skype point out the added benefit of communicating with gestures and body language that is afforded by simultaneous audio and visual communication (Hanna, 2012; Sullivan, 2012; Hamilton, 2014; Janghorban et al., 2014; Simeonsdotter Svensson et al., 2014; Iacono et al., 2016; Seitz, 2016). Such combined audio and visual communication is most similar to in-person interviews.
Multimedia-based interviews. One of the primary arguments against conducting interviews over multimedia-based programs is that rapport with participants (i.e., trust and participant comfort) may be hindered. Techniques used for in-person interviews, such as sharing coffee or tea, are no longer available when the researcher and participant are not in the same location, and technology problems (such as dropped calls due to a poor internet connection) may also hurt rapport (Simeonsdotter Svensson et al., 2014; Seitz, 2016; Weller, 2017). Multiple studies from a variety of disciplines (e.g., business, child development, nursing and health sciences, psychology, sociology) have addressed these concerns and conclude that programs such as Skype can actually increase rapport because the participant chooses the most convenient location from which to attend the interview (e.g., at home) (Bertrand and Bourdeau, 2010; Hanna, 2012; Hamilton, 2014; Janghorban et al., 2014; Simeonsdotter Svensson et al., 2014; Nehls et al., 2015; Iacono et al., 2016; Weller, 2017). Additionally, participants can feel more comfortable when not physically in the presence of the interviewer (Bertrand and Bourdeau, 2010; Hanna, 2012; Weller, 2017), and participants report no difference between in-person interviews and those conducted over Skype (Deakin and Wakefield, 2014; Hamilton, 2014; Iacono et al., 2016; Weller, 2017). The ability of Skype to increase rapport has prompted multiple calls for using multimedia-based interview techniques, which can increase access to participants and add diversity to samples despite the technology concerns. Work in fields outside of CER has generated advice for mitigating technology concerns and building rapport prior to conducting the interview: ensure that researchers and participants are fluent in the technology, test the technology prior to the interview, and have multiple contacts with the interviewee to establish the researcher as a real, trustworthy individual before meeting virtually (Bertrand and Bourdeau, 2010; Sullivan, 2012; Deakin and Wakefield, 2014; Nehls et al., 2015; Iacono et al., 2016; Weller, 2017). Flexibility and participant choice have also been discussed; allowing participants to choose which program to use may increase participation (Deakin and Wakefield, 2014; Simeonsdotter Svensson et al., 2014; Nehls et al., 2015; Seitz, 2016; Weller, 2017).

To minimize technology concerns, ensuring that both the researcher and participant have strong, reliable broadband internet has been suggested (Hay-Gibson, 2009; Sullivan, 2012; Hamilton, 2014; Janghorban et al., 2014; Nehls et al., 2015; Seitz, 2016); this may be less of a concern for current and future studies considering that 83% of households in Europe and 73% of Americans have a broadband connection (Eurostat, 2017; Pew Research Center, 2017). Additionally, 75% of K-12 students in America are connected to broadband internet (EducationSuperHighway, 2017). Participant fluency in the technology may be of lesser concern since 71% of Europeans and 88% of Americans use the internet every day (Eurostat, 2017; Pew Research Center, 2017). While the general population is familiar with and regularly uses the internet, teenagers and young adults may be the population best suited to multimedia-based techniques, since 96% of Europeans ages 16–24 and 99% of Americans ages 18–29 use the internet every day (Eurostat, 2017; Pew Research Center, 2017).

Research context

All of the issues and recommendations for conducting multimedia-based interviews became important considerations for a national chemistry outreach study. ACS and AXΣ, which are primarily based in the U.S., specifically recruit undergraduate student members and support student chapters at individual universities; these student chapters are small, institution-based groups of students affiliated with the national organization and supported/advised by a faculty or staff member. Every year, these student chapters report reaching almost 1 million members of the public through their outreach events (Connelly, 2015; Pratt, 2017). The ACS and AXΣ national organizations support these student chapters through awards and funding opportunities, illustrating the organizations’ commitment to chemistry outreach, and thus to outreach as part of the college student experience (American Chemical Society, 2017a, 2017c; Alpha Chi Sigma, 2017b). The outreach events that these student chapters typically engage in involve demonstration shows (where the audience watches the activity) or hands-on activities (where the audience actively participates in the activity) for elementary/primary school students (Connelly, 2015; Pratt, 2017; Pratt and Yezierski, 2018).

While the National Academies’ report on U.S. chemists (National Academies of Sciences Engineering and Medicine, 2016) is useful in providing a broad view of informal chemistry education, it lacks the perspectives of these university students who have a large impact through chemistry outreach. Additionally, recent findings suggest that the population of students involved with student chapters of these organizations is distinctly different from the U.S. chemists included in the National Academies’ report, and that generalized goals for all informal chemistry education events do not capture the nuanced differences among various event locations, audiences, and event types (Pratt and Yezierski, 2018). As such, the overarching chemistry outreach study described in this paper stems from the results of a national survey (Pratt and Yezierski, 2018) and seeks to understand why university students conduct chemistry outreach, the alignment between event evaluation methods and goals for events, and university students’ conceptual understanding of the chemistry content embedded in outreach activities. The population investigated was university students conducting chemistry outreach in conjunction with ACS and/or AXΣ student chapters. Considering that this population consists of regular, everyday internet users, multimedia-based interviews were a logical methodological choice. However, given the lack of literature describing multimedia-based interviews in CER, this paper analyzes the efficacy of the methods employed to collect data in this chemistry outreach study (see Fig. 1).


Fig. 1 Summary of the study about chemistry outreach. The focus of this paper is on the methods used to collect data related to the goals of the study.

Research questions

This paper seeks to describe and demonstrate the efficacy of the novel data collection technique employed in the context of an Institutional Review Board (IRB) approved study about chemistry outreach, a technique developed from literature recommendations and adaptations of in-person interview techniques to the online environment. Given the limitations of survey methods and traditional in-person interviews, this novel method was investigated with the goal of addressing the following research questions:

In a national study of chemistry outreach,

(1) How effectively can technological solutions provide access to interview participants across an entire country to diversify samples and improve the transferability of findings?

(2) How can in-person interview tasks be adapted to function in an online multimedia platform to elicit rich description commensurate with traditional face-to-face interviews?

Methods

Described below are the methods related to recruiting and gaining access to participants, and to eliciting their ideas, during interviews conducted virtually via multimedia communication programs (i.e., programs that allow simultaneous audio, video, and instant messaging communication).

Research question 1: recruiting and gaining access to participants

All qualitative studies are limited by access to research participants, which is constrained by geographic location, time zone differences, and gatekeepers. For this study, location was not a barrier to accessing research participants, as multimedia-based online interviews via programs such as Skype allowed researchers to sample participants from multiple institutions/student chapters. Access was therefore limited primarily by gatekeepers: faculty/staff advisors of individual student chapters and/or department chairs (Creswell, 2007; Maxwell, 2013). To reach a sample size leading to data saturation (recurring ideas with no new ideas emerging), multiple rounds of recruitment were necessary (Lincoln and Guba, 1985; Patton, 2002). These recruitment efforts included electronic recruitment via emails sent directly to gatekeepers, in-person recruitment at a national meeting, and snowball recruitment with previously interviewed participants. In snowball recruitment, a participant already in the study recruits other participants whom they identify as suitable for the study based on their own experiences (Patton, 2002). For example, in this study, snowball recruitment was employed by asking previously interviewed participants to recruit other students from their own student chapter. This technique relies on developing a relationship between the interviewer and the interviewee in order to tap into the network of relationships that the interview participant already has. While these recruitment efforts were fruitful in obtaining volunteers, sampling criteria limited the number of volunteers actually interviewed. Sampling criteria for this outreach-focused study included prior experience facilitating at least one outreach event with an audience (rather than experience planning an event) and familiarity with activities involving making liquid nitrogen ice cream, the “elephant toothpaste” reaction, and/or making slime. These criteria were based on results from a previous study (Pratt and Yezierski, 2018) and the need for participants to have prior experience discussing chemistry content with their target audience(s) before being interviewed. Detailed below are the recruitment efforts for both the pilot study (early spring 2016) and the full study (late spring 2016–spring 2017).
Pilot study. For the pilot study (early spring 2016), researchers targeted universities where members of student chapters had already participated in a national exploratory study, because a similar recruitment strategy was employed and proved successful in accessing this population of college students (Pratt and Yezierski, 2018). From these institutions, researchers randomly sampled 11 of the 74 universities for the pilot study and sent a recruitment email directly to student chapter gatekeepers (faculty/staff advisors of the student chapter and/or department chairs). The recruitment email informed gatekeepers of the goals of the study and requested that they forward a link to an electronic recruitment survey to the students in their chapter. The recruitment survey described the project to students (including details about a $15 Amazon.com gift card as compensation for participating in the interview), obtained informed consent, collected demographic information from the participants, and asked their ideas about the purpose(s) of conducting chemistry outreach.
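The random selection of pilot institutions is straightforward to reproduce. Below is a minimal sketch (not the authors' actual code, and with a hypothetical institution list) of drawing 11 of the 74 candidate universities as a simple random sample:

import random

# Hypothetical stand-ins for the 74 universities whose chapters took part
# in the national exploratory study.
universities = [f"University {i:02d}" for i in range(1, 75)]

random.seed(2016)  # fixed seed so the illustrative draw is reproducible
pilot_sample = random.sample(universities, k=11)  # sampling without replacement

for school in sorted(pilot_sample):
    print(school)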
Full study. For the full study (late spring 2016–spring 2017), the recruitment strategy employed during the pilot study was used again (i.e., a recruitment survey emailed to gatekeepers to forward to potential interview participants). However, multiple additional techniques were employed to access college student outreach practitioners. The first was targeted, in-person recruitment at the spring 2016 ACS National Meeting. The research team distributed over 250 flyers to individual students involved with student chapters from institutions across the U.S. during a chapter-centric poster session; the flyers had a QR code which linked students to the same recruitment survey described above.
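Producing such a flyer QR code is a one-step task in most languages. As a hedged illustration (the paper does not report which tool was used, and the survey URL below is a placeholder), the third-party Python package qrcode can generate the image:

import qrcode  # pip install qrcode[pil]

# Hypothetical link to the recruitment survey; the real URL is not reported.
survey_url = "https://example.edu/outreach-recruitment-survey"

img = qrcode.make(survey_url)          # encode the link as a QR code image
img.save("recruitment_flyer_qr.png")   # image file to place on the flyer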

In fall 2016, recruitment for the full study continued by directly contacting faculty/staff advisors (gatekeepers) of the student chapters that had participated in the national exploratory survey (Pratt and Yezierski, 2018); researchers contacted the 63 universities not recruited during the pilot study in two waves via the same procedure as the pilot study (contacting gatekeepers and requesting that they email a recruitment survey link to students). To increase the sample size and ensure variability of the sample, the research team also re-contacted the gatekeepers of the 11 chapters contacted for the pilot study (early spring 2016), while simultaneously conducting snowball recruitment with interviewed participants. To ensure trustworthiness of the data by evidencing data saturation (i.e., no new ideas emerging during interviews) and negative cases (i.e., data that did not support emergent patterns or trends) (Patton, 2002), the final recruitment technique involved contacting the 50 chapters awarded the highest organizational award for 2016: either the Exemplary Award from the ACS (American Chemical Society, 2017c) or the 3-star award from AXΣ (Alpha Chi Sigma, 2017a). The assumption was that award-winning chapters would have an involved advisor (gatekeeper), many outreach programs, and significant member participation. In late spring 2017, recurring ideas were captured during interviews and no new ideas emerged, which confirmed data saturation and prompted the end of recruitment and data collection (Patton, 2002).
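Operationally, saturation judgments of this kind can be supported by logging which qualitative codes each successive interview contributes. The sketch below is a minimal illustration under assumed, hypothetical code labels, not the authors' analysis procedure:

def new_codes_per_interview(coded_interviews):
    """Yield the set of previously unseen codes for each interview, in order."""
    seen = set()
    for codes in coded_interviews:
        fresh = set(codes) - seen
        seen |= set(codes)
        yield fresh

# Hypothetical codes assigned to four consecutive interviews.
coded = [
    {"spark_interest", "role_models"},
    {"spark_interest", "recruit_majors"},
    {"role_models", "recruit_majors"},   # nothing new
    {"spark_interest"},                  # nothing new
]

for i, fresh in enumerate(new_codes_per_interview(coded), start=1):
    print(f"Interview {i}: {len(fresh)} new code(s) {sorted(fresh)}")

# A sustained run of interviews contributing zero new codes is one
# operational indicator that saturation has been reached.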

Research question 2: eliciting responses from participants

Since the target population for this study was geographically diverse and in-person interviews were not feasible, researchers conducted synchronous, multimedia-based, semi-structured interviews. These interviews occurred using freely available programs (e.g., Skype and Google Hangouts), with audio recorded by a recorder placed next to the interviewer's computer. These programs allowed for simultaneous voice, video, and instant messaging communication. Skype was the primary program, as it is widely used (Microsoft, 2016, 2017a). Throughout this paper, the multiple programs used to conduct the interviews will be cited but will collectively be referred to as the interview platform (i.e., all of the technology used during the interviews).

Pilot study interviews were used primarily to build researcher expertise in using/troubleshooting the interview platform, as technology literacy of the interviewer has been presented as a concern in previous studies (e.g., Nehls et al., 2015). Additionally, in congruence with typical in-person interview practices in CER, the pilot study also helped test the interview guide for successful elicitation of meaningful data and give the interviewer practice probing student ideas using follow-up questions (e.g., Novick and Nussbaum, 1978; Linenberger and Bretz, 2012b). Since these interviews primarily served as preparation for the full study, data elicited from these interviews are not presented in this paper. What follows are detailed descriptions of the full study interviews focused on the interview task(s) used to elicit student ideas during each phase of the interview and the technology harnessed to administer the task(s) and collect data.

Building rapport and member checking (interview phase 1). In our technique, we built rapport through multiple email exchanges with participants prior to the interview, as suggested in the multimedia-interview literature (Nehls et al., 2015; Iacono et al., 2016; Weller, 2017). Additionally, early in the interview we greeted the participant, discussed the consent information from the recruitment survey, and had participants expand on the demographic information they provided during recruitment.

While greeting the participant (common practice in in-person interviews) differs when conducted online over the interview platform, similar questions can ease the participant and help them adjust to talking over the internet. Greetings such as “How are you?” and technology-related questions like “Can you see me? Am I loud enough?” allowed participants to communicate with the researcher and test the technology, all while growing accustomed to the interview platform. Additionally, the interviewer discussed the consent information to emphasize the confidentiality of the interview and the use of pseudonyms in all presentations of findings, and to allow participants to ask questions. This step was important since some participants had completed the recruitment survey (and provided informed consent for the interview) weeks or months prior to the interview; it also enabled the researcher to obtain verbal consent in addition to the written consent on the recruitment survey.

To build rapport and help participants become accustomed to the interview platform, demographic information collected in the recruitment survey was discussed, and participants were invited to elaborate upon their answers. This review technique elicited participants’ ideas by allowing them to provide more details about their previous outreach experiences, how long they have been involved with their student chapter, and previous research experience (if applicable). These details enhanced their survey responses, added to the rich description of outreach practices obtained in the interviews, and built trust and rapport with the participant prior to asking interview questions specifically related to the research goals of the overarching chemistry outreach study.

One of the primary goals of the overarching study was to explore college students’ perceived purpose(s) of conducting chemistry outreach. To elicit their ideas, two modes were employed. The first mode involved a question on the recruitment survey regarding participants’ perceived purpose(s) of outreach. The question was built from raw responses received from an exploratory study (Pratt and Yezierski, 2018), and asked participants to select purpose(s) that they agreed with and then rank them into a hierarchy of importance (where one is the most important). Fig. 2 shows the first tier of this question in which participants selected items before ranking them.


Fig. 2 Excerpt from the recruitment survey showing the first tier of the question and available options for purpose(s) of chemistry outreach; the second tier asked participants to rank only their selected items into a hierarchy of importance. Item choices were pulled directly from raw responses received from an exploratory study (Pratt and Yezierski, 2018).

Raw responses/hierarchies obtained from the recruitment survey were limited: they provided no details about how participants interpreted items or their rationales for item rankings. Therefore, the second mode of elicitation occurred during the interview, where participants were asked to revisit/reflect on their responses and discuss their selections and rankings with the interviewer. To do this, participants’ hierarchies were instant-messaged via the messaging option of the interview platform, and participants were instructed to think aloud as they reviewed their responses to give insight into why they selected items and why they ranked items into their specific hierarchy (Bowen, 1994). Because this step focused on participants’ prior experience in the study (i.e., the survey completed prior to scheduling the interview), it also continued building rapport between the interviewer and interviewee. Additionally, data obtained from revisiting the survey hierarchies became crucial to the overarching outreach study, since participants’ verbal reflections on their survey inputs provided more details regarding item interpretations and their processes for constructing hierarchies. This reflection was a form of member checking which added trustworthiness to the findings, as the think-aloud data ensured that researchers accurately interpreted participants’ hierarchies (Lincoln and Guba, 1985). Data regarding the hierarchies, item interpretations, and varied processes for constructing the hierarchies are presented in the Results and discussion section; these data illustrate why accurate researcher interpretations of hierarchies would not have been possible without this specific revisiting/elicitation technique.

Adapting an interview task for the online platform (interview phase 2). To elicit student ideas regarding their chemistry outreach practices, multiple interview tasks had to be either adapted for, or created specifically for, the online environment. The first of these tasks was used to elicit ideas related to the evaluation of outreach events (interview phase 2) and resembled the member checking task discussed above. In this case, however, the task was conducted ‘live’ during the interview and resembled the common interview task of card sorting (e.g., Krieter et al., 2016). To adapt the card sorting task for the online environment, an electronic survey tool was used to provide options for participants to select and then sort into a hierarchy of importance. As with the purposes provided in the recruitment survey, responses from a national, open-ended survey (Pratt and Yezierski, 2018) became the options for criteria of a successful outreach event used in this task. Using the electronic survey, participants selected the success criteria that they agreed with before ranking their selected items into a hierarchy of importance (see Fig. 3).
Fig. 3 Excerpt from the during-interview survey showing the first tier of the question and available options for success criteria of an outreach event; the second tier asked participants to rank their selected items into a hierarchy of importance. Item choices were pulled directly from raw responses received from an exploratory study (Pratt and Yezierski, 2018).

Just like the hierarchies of purposes of outreach revisited early in the interview, the survey link for this task was instant-messaged to participants using the messaging capability of the interview platform. However, more like a card sorting task, participants completed the task ‘live’ during the interview. Participants were also asked to think aloud as they completed the selection and ranking to give insight into why they selected items and the rationales for their rankings (Bowen, 1994). This think-aloud technique elicited the same type of descriptive, meaningful data obtained when participants reflected on their purpose(s) of outreach, but with concurrent collection of survey responses and verbal rationales, making it analogous to other interview tasks like card sorting; the survey technology was simply a means of translating the task into an electronic format. The sketch following this paragraph illustrates the two-tier structure of the task.
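As a hedged, self-contained illustration of that two-tier structure (select, then rank), the following console sketch mimics the select-then-rank flow; the item wordings are drawn from the success criteria discussed later in this paper, but the interaction is a stand-in for the survey software, not the deployed instrument:

ITEMS = [
    "the audience had fun",
    "the audience left smiling",
    "the audience was excited",
    "the presenters gave good explanations",
    "the presenters used good presentation skills",
]

def select_then_rank(items):
    # Tier 1: the participant selects the criteria they agree with.
    for i, item in enumerate(items, 1):
        print(f"[{i}] {item}")
    picked = input("Numbers of the items you agree with (space-separated): ")
    remaining = [items[int(n) - 1] for n in picked.split()]

    # Tier 2: the participant orders only the selected items
    # (1 = most important). Note that, like the survey software used in
    # the study, this forces a strict order with no tied rankings.
    ranked = []
    while remaining:
        for i, item in enumerate(remaining, 1):
            print(f"[{i}] {item}")
        choice = int(input(f"Pick rank {len(ranked) + 1}: "))
        ranked.append(remaining.pop(choice - 1))
    return ranked

if __name__ == "__main__":
    for rank, item in enumerate(select_then_rank(ITEMS), 1):
        print(f"{rank}. {item}")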

Developing an interview task for the online platform (interview phase 3). The third interview task focused on eliciting participants’ understanding of the chemistry content embedded in common chemistry outreach activities. The results obtained from an earlier investigation (Pratt and Yezierski, 2018) showed that activities involving making liquid nitrogen ice cream, the “elephant toothpaste” reaction (catalyzed decomposition of hydrogen peroxide), and making slime (gelation of polyvinyl alcohol with borax) are very common for this outreach-practitioner population. Therefore, sampling criteria for this study included having prior experience with one or more of these activities, since this part of the interview explored these activities in depth.

During the pilot study, this part of the interview guide was structured with only open-ended questions related to the typical age groups (audiences) targeted with these activities, the mode of implementation (demonstration shows vs. hands-on activities), and the expected learning goals for the activities before probing students more deeply about the chemistry content. Surprisingly, many participants were unfamiliar with the chemistry embedded in the activities, including not understanding the procedures involved. When prompted to explain the activities and the reactions involved, the majority of participants struggled and could not respond, despite having prior experience facilitating these activities (a sampling criterion). This difficulty was noticed early in the pilot study and prompted the researchers to revise the interview protocol by developing a novel task specifically focused on eliciting meaningful data about student conceptual chemistry understanding.

This novel task took the form of written, partially inaccurate student explanations incorporating ideas discussed during the pilot study and published misconceptions about the chemistry content underlying the activities. The research team crafted these explanations for the three outreach activities studied (making liquid nitrogen ice cream, the “elephant toothpaste” reaction, and making slime) and for three audience age levels (elementary/primary school, middle/early secondary school, and college general chemistry). During the full study interviews, the explanations were individually instant-messaged to participants via the messaging feature of the interview platform, accompanied by verbal instructions from the interviewer. The interviewer told participants that the explanations were a mixture of accurate and inaccurate ideas, and that they should read the explanations and critique them in terms of (1) content accuracy and (2) appropriateness for the indicated age group. These instructions purposefully and clearly indicated that the explanations were a combination of accurate and inaccurate content and that they came from other participants. Having students discuss inaccurate content has been used successfully in animation research, providing precedent for this task (Kelly et al., 2017), and the emphasis that the explanations came from other participants helped ensure that no power dynamics were introduced (i.e., an interview participant critiquing the interviewer's words). A sketch of the task's activity-by-audience structure follows.
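The task thus forms a 3 × 3 matrix of explanations (activity × audience level). A minimal organizational sketch is shown below; the activity and audience labels come from the study, but the explanation texts are placeholders, not the instruments the research team wrote:

ACTIVITIES = ("liquid nitrogen ice cream", "elephant toothpaste", "slime")
AUDIENCES = ("elementary/primary", "middle/early secondary",
             "college general chemistry")

# Map (activity, audience) -> the crafted explanation to instant-message.
explanations = {
    (act, aud): f"[explanation of {act} pitched at a {aud} audience, "
                f"mixing accurate and inaccurate ideas]"
    for act in ACTIVITIES
    for aud in AUDIENCES
}

def fetch_explanation(activity: str, audience: str) -> str:
    """Look up the explanation to paste into the interview platform's chat."""
    return explanations[(activity, audience)]

print(fetch_explanation("slime", "elementary/primary"))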

Ensuring trustworthiness

While quantitative studies rely on measures of validity and reliability to support findings, qualitative studies rely on establishing trustworthiness, which comprises credibility, dependability, and confirmability (Lincoln and Guba, 1985; Shenton, 2004). Multiple techniques were employed during the development and analysis of the method described in this paper to ensure trustworthiness of the data. First and foremost, because the development of this study was informed by findings from a previous study (Pratt and Yezierski, 2018), credibility is added through data-driven study development. The use of a pilot study to test the method and interview guide further adds credibility and dependability. Beyond these two techniques, the two researchers conducted weekly debriefing sessions throughout the study, discussing data analysis, preliminary findings and limitations, and next steps, which further adds credibility. As the method itself is entirely novel, peer scrutiny of the project was also conducted through multiple presentations at national and international conferences and through monthly, local presentations with chemistry education researchers not involved in the project. Such peer scrutiny ensured rigor in the design of the novel method and in the analysis of data, further supporting the trustworthiness of the study. Throughout the weekly debriefing and peer scrutiny sessions, detailed notes were recorded to establish an audit trail and add confirmability to the study.

In addition to these techniques for ensuring trustworthiness of the method design and data analysis, trustworthiness techniques were also embedded in the application of the method itself (as described above), including member checking, iterative questioning, and overlapping data collection tasks that allow for triangulation, which add credibility, dependability, and confirmability to the study. As such, multiple provisions were built into the method itself, the design of the method, and the analysis of data to ensure the trustworthiness of the data, results, and conclusions presented in this paper.

Results and discussion

The following section presents evidence of the efficacy of methods for access and elicitation to answer the research questions.

Research question 1: recruiting and gaining access to participants

For the pilot study, five students from the initial recruitment survey responses sent to gatekeepers met the sampling criteria and were interviewed. These participants recruited an additional two students through snowball sampling (Patton, 2002). In total, seven students (N = 7) were interviewed for the pilot study.

Fig. 4 summarizes the outcomes of recruitment for the full study. The full study began at the end of the spring 2016 semester with targeted, in-person recruitment at the spring 2016 ACS National Meeting. Because the recruitment occurred near the end of the semester and many of the volunteers were graduating seniors, only five interviews resulted from this recruitment effort despite distributing over 250 flyers.


Fig. 4 Ribbon/Sankey diagram (Bogart, 2017) summarizing the recruitment efforts for the full study and the number of research participants resulting from each effort.
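The published diagram was built with the tool cited in the caption (Bogart, 2017). As a hedged illustration only, a roughly equivalent figure can be drawn with plotly from the interview counts reported in this section (5, 16, 3, 2, and 11 interviews from the ACS meeting, the combined email waves, pilot re-contact, snowball recruitment, and award-winning chapters, respectively):

import plotly.graph_objects as go

labels = [
    "ACS meeting flyers", "Gatekeeper email waves", "Pilot re-contact",
    "Snowball", "2016 award chapters", "Interviewed (N = 37)",
]
counts = [5, 16, 3, 2, 11]  # interviews credited to each effort

fig = go.Figure(go.Sankey(
    node=dict(label=labels),
    link=dict(
        source=[0, 1, 2, 3, 4],  # each recruitment effort...
        target=[5, 5, 5, 5, 5],  # ...flows into the final sample
        value=counts,
    ),
))
fig.update_layout(title_text="Recruitment efforts feeding the full-study sample")
fig.show()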

From the combined two waves of electronic recruitment via emails sent to gatekeepers of student chapters that participated in the exploratory study, 32 students who met the sampling criteria volunteered for the study, and 16 interviews were conducted (eight from each wave of recruitment). As these responses did not evidence data saturation and had minimal sample variability, researchers re-contacted the universities recruited for the pilot study, resulting in three interviews for the full study. Simultaneous snowball recruitment with previously interviewed participants resulted in two additional interviews. Saturation of data was suspected after these five recruitment strategies (n = 26 interviews), since participant answers from the latter two recruitment efforts contained no novel features compared to earlier data. However, to ensure trustworthiness of the data via data saturation and a search for negative cases (Patton, 2002), one last recruitment effort targeted individual chapters which received awards in 2016. This route recruited 25 students who met the sampling criteria and resulted in 11 additional interviews; these interviews confirmed data saturation, as no new ideas emerged. In total, the full study included N = 37 students (17 males and 20 females). These students were from 22 unique student chapters/institutions (17 public and 5 private) of various sizes, as shown in Fig. 5.


Fig. 5 Map detailing the locations and university sizes for the participants in the full study. Total student enrollment was used to classify school size (small < 5000 students; medium 5000–15,000 students; large > 15,000 students).
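The enrollment thresholds in the Fig. 5 caption translate directly into a classification rule; a small sketch (with made-up example enrollments) is:

def classify_institution(total_enrollment: int) -> str:
    """Classify a school by total student enrollment, per the Fig. 5 bins."""
    if total_enrollment < 5000:
        return "small"
    if total_enrollment <= 15000:
        return "medium"
    return "large"

for enrollment in (3200, 9800, 41000):  # hypothetical enrollments
    print(enrollment, "->", classify_institution(enrollment))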

Clearly, recruitment of a geographically diverse population requires flexibility and multiple strategies to achieve a response rate comparable to in-person recruitment. This is likely because gatekeepers, not researchers, had direct contact with students, and the researchers were strangers to participants (unable to introduce themselves as in in-person recruitment). Additionally, scheduling interview times with students over email proved challenging due to differing time zones and students' busy schedules. These difficulties explain why only 37 of the 92 volunteers who met the sampling criteria were actually interviewed (40%). Despite these difficulties, survey and email technologies allowed the researchers to cast a wide net over the course of the study and contact over 100 chapters at individual universities through over 700 individual email interactions. This approach led to a very diverse sample, which would not have been possible had recruitment occurred at only a single institution.

Fig. 5 summarizes the diversity of institution locations and sizes for the participants in the full study; Table 1 summarizes the demographic information for the interviewed participants. While sample sizes for qualitative studies vary and are dictated by data saturation (Lincoln and Guba, 1985; Herrington and Daubenmire, 2014), the sample size of this study is in the range typical of in-depth qualitative studies in chemistry education research (e.g., Henderleiter et al., 2001; Luxford and Bretz, 2013; Benny and Blonder, 2018). However, no other studies have reported as diverse a sample, in terms of student demographics and locations, as this study. The lack of diversity in previous studies is likely due to limitations in accessing geographically diverse participants. Multimedia-based interviews greatly increased access to such participants in this study, which led to the large diversity in institutional and participant characteristics of this sample. Such diversity and inclusion of a wider range of perspectives increase the trustworthiness of the data and the potential transferability of findings from this study to other teaching and learning contexts (Lincoln and Guba, 1985).

Table 1 Demographic information for the students interviewed in the full study (N = 37)
Year in school             n     Major                        n
Sophomore (second year)    5     Chemistry or biochemistry    22
Junior (third year)        11    Science (non-chemistry)      11
Senior (≥fourth year)      19    Non-science                  2
Graduate student           2     Chemistry graduate student   2


Research question 2: eliciting responses from participants

For every participant, a single interview took place on the interview platform. These interviews ranged from 47 minutes to 2 hours and 36 minutes and averaged 1 hour and 21 minutes in length. As discussed in the Methods section, every participant interviewed had completed the recruitment survey to volunteer for the interview, and had both prior experience facilitating an outreach event with an audience and experience with at least one of the three targeted outreach activities (making liquid nitrogen ice cream, elephant toothpaste, or slime) prior to volunteering/being interviewed. Described below are results related to the various elicitation tasks; representative data are included to provide evidence supporting the efficacy of the method. In-depth analysis of the interview data related to the goals of the overarching chemistry outreach study will be the subject of future submissions.
Building rapport and member checking (interview phase 1). As mentioned in the Methods section, the first elicitation task asked participants to revisit/reflect on their hierarchies of purpose(s) of outreach and provided evidence of varied processes for constructing hierarchies and differing item interpretations. One example of varied construction processes comes from Veronica, who ranked purpose items both by importance and by order of occurrence during an event (despite the question prompt asking specifically for ranking by importance).

While reflecting on her hierarchy (Fig. 6), Veronica said that the purpose ‘Expose younger students/children to chemistry to spark their interest in the sciences’ was required for the purpose ‘Expose younger students/children to chemistry to help them decide if they want to study it’ to occur, which in turn led to the purpose ‘Provide role models for younger students/children (give them someone to look up to).’ In her hierarchy, she placed these three purposes first, second, and third, respectively. However, upon reflection, Veronica said that this ranking was based on the order in which the items occur in practice, not their importance: “[they] occur in a chain reaction.” She actually thought of them as a single unit in her hierarchy, as together “they [were] more important” than others listed lower in the hierarchy. Therefore, her rankings of first, second, and third did not accurately represent her ideas of importance; she would rather have all three ranked as equally important. This practice of grouping items and thinking about how they occur during an event, or, as Veronica put it, in order of “lead[ing] to,” was common in the data set (n = 16). The technique of having students reflect and report on their survey responses therefore produced meaningful and critical data, as the hierarchies would have been misinterpreted without this elicitation task.


Fig. 6 Veronica's selected items for purpose(s) of chemistry outreach and the corresponding hierarchy.

In addition, some participants chose to modify their rankings during reflection. One reason was that they interpreted items as equally important (like Veronica), but the survey software did not allow equal item rankings. Lois is another participant who discussed this desire for equal rankings; her original hierarchy submitted during recruitment is shown in Fig. 7.


Fig. 7 Lois's selected items for purpose(s) of chemistry outreach and the corresponding hierarchy.

Upon reflection, Lois said of the purposes she ranked fourth and fifth, “I would probably say they're equal cause I think they're hand-in-hand with each other.” This was common throughout the interviews, as some participants reported that some purposes were equally important (n = 10). Additionally, upon reflection, some participants added purposes to their hierarchies that they had not originally included (n = 2). This reflection and revision of hierarchies adds trustworthiness to the data as a form of member checking (Lincoln and Guba, 1985). Furthermore, it illustrates how researchers would have misinterpreted many of the responses had this revisiting/reflection task not been included in the interview. This finding embodies a key limitation of survey-only research methods and supports the combined survey and interview methods employed in this study. Additionally, this member-checking task helped participants become comfortable verbalizing their thoughts on familiar data/their prior experience, which was crucial in building rapport with the participants and ensuring that the remainder of the interview elicited meaningful data.

Overall, this interview phase successfully built rapport and trust with participants while also eliciting the details needed to fully understand participants’ ideas about the purpose(s) of outreach they communicated via their recruitment surveys. Evidence of rapport is best illustrated by Veronica's and Lois's conversational interactions during the interview. Veronica was very open about her lack of previous thought about the relationships between some of her selected items: “I don't know… I just thought about that. I've never actually like sat down and figured it out.” Lois's discussion of the purpose she ranked third also illustrates this conversational nature: “So then with number three, obviously I'm a girl, a female in science so [laughter] I totally get that one!” While responses from only two participants are presented, their data are representative of the meaningful data obtained from, and the conversational dialogue that occurred during, the interviews.

Additionally, no evidence throughout other phases of the interview suggested that participants were unwilling to share their ideas, or that the interview platform hindered discussion, as suggested by previous studies on multimedia-based interviews (e.g., Seitz, 2016; Weller, 2017). Therefore, not only did this phase successfully build good rapport with participants, but it also elicited rich details about students’ ideas regarding the purposes of conducting chemistry outreach. These details add trustworthiness to the data obtained via multimedia-based interviews. They also provide evidence that multimedia-based interviews have the ability to elicit meaningful, high-quality data commensurate with in-person interviews, and that rapport can be built over the internet.

Adapting an interview task for the online platform (interview phase 2). Results from the second interview task, the analogous card sorting task in which participants constructed hierarchies of important success criteria, evidenced construction strategies similar to those in the purposes-of-outreach member-checking task. These strategies included ranking items by order of occurrence during events (n = 9) and revising hierarchies to equate items (n = 19). This is not surprising, since the two tasks had similar structures, and the discussion of purposes prior to this task may have primed participants to think in a similar fashion when completing the success criteria task. One example of a participant discussing the order of occurrence of success criteria is Shayera, whose hierarchy of success criteria is shown in Fig. 8.
Fig. 8 Shayera's selected criteria for the success of an outreach event and the corresponding hierarchy.

Shayera specifically discussed the necessity of criteria ‘the presenters gave good explanations’ and ‘the presenters used good presentation skills’ leading to engagement of and excitement in the audience: “I put [those] lower because I feel like…without having [a] good presentation or good explanations you can't have an engaged and excited audience.”

Although some participants interpreted all items as communicating unique ideas and continued through the task (such as Shayera above), others interpreted some of the success criteria items as having the same meaning. These participants either (1) ranked items as equally important in their hierarchy or (2) selected only one representative item to include in their hierarchy. Ten participants interpreted the items in one of these two ways (n = 10). Items which typically had similar interpretations included ‘the audience had fun,’ ‘the audience left smiling,’ ‘the audience had a good time/enjoyed themselves,’ and ‘the audience was excited.’ Without the think-aloud descriptions accompanying the responses, these differences would have been unidentifiable, pointing towards the need for interviews rather than a survey-only investigation. Additionally, the thick description obtained from the participants (i.e., item interpretations and processes of constructing hierarchies) illustrates that the electronic card sorting task elicited meaningful data and that the interview platform did not hinder discussion. Similar to Veronica and Lois, Kendra had no reservations in discussing her feelings when completing the task, “I think I misinterpreted [it]. It's been a long week! I think it's a Tuesday? [laughter]”, or in calling out discrepancies in her discussion of items, “I think I may be contradicting myself!” These student statements provide further evidence that rapport was built and sustained throughout the interview. Additional evidence of this sustained rapport comes from participant responses during the third interview task (related to chemistry content), detailed below.

Developing an interview task for the online platform (interview phase 3). While pilot study data prompted the development of the novel task involving partially inaccurate explanations, the interview protocol for the full study retained open-ended questions to probe initial student ideas prior to the novel task. As in the pilot study, students in the full study struggled to discuss the chemistry content during open-ended questioning. This is not surprising, considering that the population sampled for the full study was the same as that of the pilot study and that no modifications were made to the open-ended questions in the interview protocol. Representative quotations from the full study which illustrate this difficulty include:

“It's been a long time since chem two!… it's not knowledge I deem important enough to keep in my brain on a daily basis.” – Sue

“I don't know what's goin’ on” – Shayera

“If we’ve learned anything today is that…[Max] doesn’t remember chemical reactions” – Max

While the willingness of the participants to share their uncertainty shows further evidence of the rapport built during the interview, these responses do not provide details about participants’ chemistry content knowledge related to outreach activities, thus supporting the use of the novel elicitation/critiquing task.

While a total of nine explanations were crafted and used during the full study, only representative student responses to an excerpt from one explanation, for the “elephant toothpaste” reaction, will be presented, since the goal is to illustrate the efficacy of the task for elicitation; the appendices include all nine explanations for reference.

The “elephant toothpaste” reaction involves the catalytic decomposition of hydrogen peroxide into oxygen and water (Conklin and Kessinger, 1996; Cesa, 2004; Trujillo, 2005). An excerpt from the explanation written for the college general chemistry level is below (note: inaccurate ideas are boldfaced here for reference but were not boldfaced when sent to participants):

“…a catalyst is used because the decomposition is not spontaneous. The catalyst allows the reaction rate to increase because the mechanistic pathway changes. The catalyzed mechanism has two steps with higher activation energies. Overall, the catalyst decreases the overall enthalpy change of the reaction… Once all of the catalyst is converted to the intermediate, the reaction dramatically speeds up as noted by the increase in foam being produced…”
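For reference, the scientifically accurate overall equation for this decomposition (a standard result consistent with the sources cited above, and not part of the explanation sent to participants) is

$$2\,\mathrm{H_2O_2(aq)} \xrightarrow{\text{catalyst}} 2\,\mathrm{H_2O(l)} + \mathrm{O_2(g)}$$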

This excerpt specifically targets participants’ conceptual understanding of the role of the catalyst in the reaction and its connection to thermodynamics and kinetics. Sue, who initially could not discuss the reaction during open-ended questioning (above), responded to a portion of this passage (quoted in her response below) with:

“The line a couple of sentences from the end ‘once all of the catalyst is converted to the intermediate’ uhm…catalysts don't change so that's super wrong.”

Shown here, despite Sue being unable to discuss the content at the beginning of the phase, the explanation elicited her knowledge of catalysts, specifically the known misconception that catalysts do not change during the course of a reaction (Cakmakci, 2010; Bain and Towns, 2016): “catalysts don’t change.” Similarly, Shayera, who also struggled to discuss the content earlier in the phase, responded to a portion of this explanation with:

“Uhm…‘overall the catalyst decreases the overall enthalpy change of the reaction’…yes? Yes, but only [as a] result…so…the catalyst decreases the activation energy which leads to a decrease in overall enthalpy, but a catalyst does not directly change enthalpy. The enthalpy changes as a result of the decreased activation energy.”

Shayera's response indicates that she may understand the relationship between catalysts and activation energy during a reaction (“the catalyst decreases the activation energy”), but she does not understand how activation energy and enthalpy relate (or in this case do not relate) in a catalyzed reaction.

Sue's and Shayera's responses are representative of the descriptive, meaningful data obtained through the novel explanation-critiquing interview task employed with all 37 participants; without this task, the interview would not have captured many participants’ chemistry understanding, providing evidence that the task was successful as an elicitation technique. While the task clearly elicited inaccurate ideas and misconceptions beyond those specifically included in the written explanations, a detailed analysis of these ideas is out of the scope of this paper and will be the subject of future manuscripts.

Even though the critiquing task successfully elicited verbal descriptions of the chemistry content from students, CER studies have shown that participants’ mental models may differ from their verbal and/or symbolic descriptions, calling for capturing drawings as well (Cooper et al., 2015). As such, drawing was offered to participants as an option for expressing their ideas during the interview. Those who chose to draw (n = 10) used their own materials from their interview location (e.g., notebook paper) and displayed their drawings by holding them up to their webcams. Using screen capturing software, the interviewer then captured images of the drawings; representative drawings pertaining to the “elephant toothpaste” reaction are shown in Fig. 9.


Fig. 9 Representative drawings of reaction coordinate diagrams drawn by Bruce (left) and Merina (right) when discussing the elephant toothpaste reaction.

In the case of drawings associated with elephant toothpaste, many participants drew only when discussing the age appropriateness of the explanations and the need to draw for kids to understand. For example, Bruce said, “If I was…telling this to a kid, I might draw on the board the activation energy barriers,” and then drew the left representation in Fig. 9. Bruce's drawing also evidences the published misconception that a catalyst does not change during a reaction (discussed by Sue above), since the formation of an intermediate involving the catalyst is not shown on the diagram (Cakmakci, 2010; Bain and Towns, 2016).

The drawings from the participants offered an additional lens to interpret their chemistry content knowledge related to the activities used in chemistry outreach events. While not all participants chose to use drawings to convey their understanding, providing the choice ensured rich, meaningful data collection, as it offered another way for participants to represent and communicate their ideas.

Technology limitations and considerations

What has been described is a novel online interview technique that harnesses video conferencing software, electronic survey software, instant messaging, and screen capturing software to elicit interview participants’ ideas. The examples provided illustrate the meaningful data and thick description obtained via this method. However, as noted in previous studies of online interviews, technology can be both a benefit and a limitation. While the various tasks implemented through the technology provided meaningful data mirroring the quality of data obtained via in-person interviews, technical difficulties did occur and must be considered when evaluating the efficacy of the method.

As the target population was composed of university students, the quality and reliability of internet access varied. Almost a third of the video calls (n = 12) froze and/or dropped during interviews, requiring both the interviewer and the participant to work to reestablish the connection. Additionally, every interview had at least one instance of an internet-connection issue in which the audio and/or video skipped. While some have argued that these issues hurt rapport (e.g., Seitz, 2016; Weller, 2017), we found no evidence of this. The participants seemed accustomed to troubleshooting connectivity issues and would continue the interview without any hindrance to rapport once the interviewer either repeated the question or asked the participant to repeat themselves. For example, internet connectivity was an issue for Lex; however, once reconnected, the interviewer simply repeated the question and continued the interview as normal, as shown in the following excerpt from the second elicitation task (the analogous card sorting task):

Lex – I wouldn't so much say the attendance has so much of uhm…part to play in successfulness because I could have a high attendance but have everyone distracted rather than having low attendance and…have anyone very…

[Call Disconnected]

[Call Reconnected]

Interviewer – You there?

Lex – Yes, I’m here

Interviewer – Ok and you were saying, you don’t think you need high attendance to have a successful event?

Lex – Uhm I…attendance may play a part but I don't believe that it should uh…so much denote whether it was successful or not because…a high attendance may also lead to more people err…lead to it being a little harder to engage err…everyone in the audience rather than in a small crowd where you can interact with everyone watching the show.

Being well versed in the interview guide and conducting a pilot study to develop literacy in the technology (Nehls et al., 2015) were crucial to this study; this preparation helped the interviewer be flexible, quickly troubleshoot the technology, and rephrase/repeat questions to continue the interview without diminishing rapport. The population itself (college students) may also be more accustomed to using and troubleshooting technology due to daily internet use (Eurostat, 2017; Pew Research Center, 2017), which may further explain why disconnections did not seem to hinder rapport.

Flexibility was also important in choosing the interview platform, and giving participants choices has been suggested in previous studies as an access technique (e.g., Deakin and Wakefield, 2014; Simeonsdotter Svensson et al., 2014; Nehls et al., 2015; Seitz, 2016; Weller, 2017). In this study, participants had differing preferences regarding which video conferencing software to use. With the goal of access in mind, providing options was necessary and allowed participants’ technology needs to dictate the platform. Interview software used in this study included Skype, Google Hangouts, and Facebook, and participants used laptops, desktop computers, cellphones, or a combination thereof. An added limitation was that the instant messaging capabilities of the multimedia programs would sometimes malfunction. As all of the interview tasks relied on sending information to the participant during the interview, email was used as a backup to continue the interview without disrupting the order of interview tasks prescribed in the interview guide.

Screen capturing suited the needs of this study to acquire the drawings some of the participants provided. However, the quality of the images obtained via this method varied widely (illustrated in Fig. 9), and it is worth noting that the researchers may need to recreate some drawings for presentation and publication. The screen capturing software used in this study was the Snipping Tool included by default on Windows computers (Microsoft, 2017b); a similar free program, Snip, exists for Apple computers (Tencent Technology (Shenzhen) Company Limited, 2012). In other qualitative CER studies, researchers have provided representations for the participant to interact with and encouraged participants to draw their own representations (e.g., Linenberger and Bretz, 2012b). While this was not the goal of this study, the methods from this study can easily be adapted to stimulate and capture more drawings through a freely available digital whiteboard tool, where the interviewer can load representations, graphics, and even documents for the interviewee to interact with (including drawing on), and then capture the annotations/representations as an image using screen capturing software (Expat Software, 2017). A paid version allows image capturing within the digital whiteboard tool without additional software. However, studies will need to investigate how easily research participants can use this tool, including drawing with a touchpad, computer mouse, and/or finger/stylus, and the learning that must occur for participants to use it comfortably. One study outside of CER has used this tool, but it does not discuss how the interviewees learned to use the program or how they performed their drawings (Hay-Gibson, 2009). As the participants in the study presented in this article demonstrated, various technologies may be used to attend the interview; additional technologies that give participants choices for drawing can likewise increase access and are worthy of future investigation.
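For researchers who prefer a scriptable alternative to manual screenshot utilities, the capture step itself can be automated. The following is a minimal sketch, assuming the Python Pillow library is installed; the function name and file naming scheme are illustrative, not part of this study's procedure:

# Minimal sketch: capture the screen (or a region of it) showing a
# participant's webcam-displayed drawing and save it with a timestamped name.
# Assumes Pillow is installed (pip install Pillow); ImageGrab works on
# Windows and macOS.
from datetime import datetime
from PIL import ImageGrab

def capture_drawing(participant_id, bbox=None):
    """Grab the full screen (or a bounding box) and save it as a PNG."""
    image = ImageGrab.grab(bbox=bbox)  # bbox = (left, top, right, bottom)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    path = f"{participant_id}_drawing_{stamp}.png"
    image.save(path)
    return path

# Example usage during an interview:
# capture_drawing("bruce", bbox=(100, 100, 900, 700))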

Summary of methods

In summary, the novel method described in this paper addresses two key considerations for research with human subjects: accessing research participants and eliciting responses from them. The techniques used in this novel method are summarized in Fig. 10. The novelty of the described method rests in the combined use of multimedia interviews with adapted in-person interview tasks, and the development of a novel interview task, to access research participants and elicit their ideas.
Fig. 10 Summary of the various techniques used to access research participants and elicit their ideas. The first row (dark hues of blue and green) uses icons to represent the various techniques and technology employed when accessing participants and eliciting their ideas; the second row (light hues of blue and green) provides specific methodological details related to access and elicitation; the third row (orange) describes the rapport-building techniques employed throughout both stages.

To access research participants and sample from a national population, three main recruitment techniques were used: (1) email, (2) in-person flyers, and (3) snowball recruitment. In all instances, electronic survey software was used to easily access students through gatekeepers. Interview scheduling also helped in accessing participants by offering them a choice of multimedia interface.

To elicit student ideas, audio, video, and instant messaging communication were employed, as they are embedded in freely available multimedia programs. Additionally, survey software was used to adapt in-person interview tasks for the online environment. Even though the interviews were conducted electronically, face-to-face elicitation techniques (think-aloud protocols, open-ended questions, and student drawing) were easily implemented. The instant messaging capability of the multimedia programs helped administer the various elicitation tasks, minimizing the physical, location-based barrier of not meeting participants in person.

To establish and sustain rapport during the interviews, common face-to-face techniques were employed, such as responding to facial expressions and body language (afforded by the video communication). Additionally, as suggested by previous studies on rapport building in multimedia-based interviews, establishing the interviewer as a ‘real’ person prior to the interview was necessary and was achieved through multiple email communications leading up to the actual interview (e.g., Deakin and Wakefield, 2014; Nehls et al., 2015; Iacono et al., 2016; Weller, 2017).

Conclusions and implications

This paper describes a novel method for qualitative data collection conducted electronically. One aim was to evaluate the effectiveness of these technological solutions in accessing participants across the entire U.S. The final sample included in this in-depth, qualitative study of chemistry outreach practitioners is more diverse than previously reported in CER (in terms of both institution and participant characteristics). The increased sample diversity includes more perspectives than typical single-institution studies, which improves the transferability of findings (Lincoln and Guba, 1985). Additionally, the diverse sample enables findings to capture the diverse outreach practices of the ACS and ΑΧΣ organizations as a whole, rather than those of only a few individual chapters.

The other goal of this study was to describe the adaptation of in-person interview tasks (and the creation of a novel interview task) for online multimedia-based interviews. Electronic surveys offer a means to adapt card sorting tasks for use in online interviews. Instant messaging allows for easy sharing of information (including written explanations to critique), and simultaneous audio and visual formats allow participants to think aloud and draw. The data presented alongside the descriptions of the various tasks illustrate the rich, meaningful data elicited using these multimedia-based approaches, commensurate with data obtained through traditional face-to-face interviews.

Implications for research

The data presented provide evidence that the method described here has the potential to change how chemistry education researchers design and carry out qualitative studies. Harnessing freely available technologies minimizes location-based barriers, allowing qualitative studies to cross local and potentially national boundaries to enable the study of more diverse groups than previously reported. Additionally, this study has shown that common interview practices for eliciting participant ideas (e.g., card sorting tasks) can be modified and implemented electronically (e.g., electronic surveys with item ranking). Straightforward technologies (such as screen capture) enable common interview practices, such as participant drawing, to be seamlessly integrated into an online environment. With increased access yielding rich data commensurate with face-to-face interviews, multimedia-based interviews prove to be a worthwhile data collection tool, illustrating how qualitative research in the 21st century can increase the diversity of samples and improve the transferability of findings without losing the rich description associated with in-person interviews.

More research needs to be conducted to further test this method, provide details on the differences between multimedia-based interviews and in-person interviews in CER, and optimize the method to identify when limitations of multimedia-based interviews preclude answering particular CER questions. As technology is continuously changing, investigations of using other tools in these electronic environments and developing new elicitation tasks tailored specifically to online interviews are needed to advance the field and increase understanding of best practices when conducting online, multimedia-based interviews.

Implications for practice

Although the aim of this study was not to investigate distance learning environments, the use of multimedia online tools in this study significantly overlaps with those used in online courses. Like online courses, this study required techniques to elicit, capture, and assess student understanding electronically. In the online course environment, while discussion forums and quizzes are common assessment tools administered through course management systems (e.g., Blackboard, Moodle), the techniques presented in this paper provide new routes for eliciting rich, descriptive ideas from students. Providing students with prompts, tasks, and think-aloud protocols can capture their thought processes and yield more descriptive data about their understanding than a quiz or exam score. All of the aforementioned techniques can be conducted from a distance with data collected using technology, such as the survey tool used in this study. The use of online, multimedia tools coupled with research-based elicitation techniques demonstrates how distance learning environments could employ richer assessment tools to better evaluate student knowledge. However, future work would be required to test the research techniques used in this study as assessment approaches in online classroom environments.

For outreach practitioners, the data presented in this paper, although limited in scope, call for a closer examination of outreach practices. The data presented about student conceptual understanding of the chemistry content embedded in outreach activities suggest gaps in chemistry learning. While a great deal of research has investigated student understanding in the formal classroom, a gap in conceptual understanding is apparent in this sample. This gap is of significant concern for outreach education since these students are becoming informal chemistry educators who lack scientifically accurate foundational knowledge. Future publications will explore the gaps in student conceptual understanding and how that impacts chemistry outreach events.

The findings presented in this paper also illustrate the diversity of ideas related to why outreach is conducted and how to evaluate the success of events, evidenced by the varying ways participants responded to the tasks. These results call for outreach practitioners to closely examine their own outreach practices, particularly regarding the alignment between their goals and evaluation criteria. Additionally, examining the training of event facilitators (informal educators) is necessary to improve the teaching and learning taking place in these informal chemistry education environments.

Future work

The data presented in this paper evidence the rigor and success associated with using these novel data collection techniques, and are representative of the rich, meaningful data collected over the course of an in-depth qualitative study of chemistry outreach practices. Further analyses of the data presented in this paper are ongoing to address the goals of the overarching study about chemistry outreach practices.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: explanations for the elephant toothpaste reaction

College general chemistry level

This reaction involves the catalytic decomposition of hydrogen peroxide into water and hydrogen gas. This acid–base reaction is an exothermic reaction because bonds are broken and heat is released. A catalyst is used because the decomposition is not spontaneous. The catalyst allows the reaction rate to increase because the mechanistic pathway changes. The catalyzed mechanism has two steps with higher activation energies. Overall, the catalyst decreases the overall enthalpy change of the reaction. The reaction starts off slowly because the first step is the rate limiting step. Soap is used to help break down the hydrogen peroxide. Once all of the catalyst is converted to the intermediate, the reaction dramatically speeds up as noted by the increase in foam being produced. Since the products are gas, the foam expands as the gas molecules inside the foam spread out.

Early secondary/late middle school (8th grade) level

Hydrogen peroxide changes into water and hydrogen using a catalyst. A catalyst speeds up the rate of reaction. Think about water; it evaporates if we sit a cup of it on the counter but it does it very slowly. If we put it on the stove, it goes much faster. The stove acts as a catalyst and speeds up the reaction. We use the soap to help break down the molecules of hydrogen peroxide. The water and hydrogen produced are gases which want to be as far apart from each other as possible so they expand, which is why the bubbles get bigger. The heat is produced because when hydrogen peroxide is broken apart, energy that is stored in the molecules is released.

Early primary/elementary school (2nd grade) level

In this reaction we change a liquid into multiple gases. When the particles become gas, they grow and become larger, taking up more space which is why gases spread out. However, most gases are invisible. In order to see this reaction occur, we need to capture the gas. Just like blowing bubbles or bubbles in the bathtub, we use soap here to capture the gas and give us evidence that the gas was produced. As the reaction progresses, you can see the foam grow and expand as the particles grow and become gas.

Appendix 2: explanations for making slime

College general chemistry level

Polymers are long chains of repeating units. White glue is primarily composed of the monomer vinyl acetate. In solution, these monomers easily slide past one another with minimal attractions to each other. This is why glue flows out of the bottle. Borax, when dissolved in water, yields boric acid B(OH)3 which condenses the monomer to vinyl alcohol. The boric acid causes a polymerization reaction, creating the polyvinyl alcohol polymer (PVA) which is a solid. The more borax that is added to the glue, the more PVA that is produced. This is why the slime can behave like a liquid or solid. When a small amount of borax is added to the glue, a small amount of PVA is created (meaning that the majority is the liquid polymer). As the concentration of borax is increased, more PVA is made, causing the mixture to behave more like a solid.

Early secondary/late middle school (8th grade) level

White glue is made up of a polymer. Polymers are long chains of molecules that are linked together. You use polymers every day; the rubber on your shoes, plastic drink cups, and Styrofoam containers are all different kinds of polymers. Some polymers are elastic and flexible (like rubber) and some are hard and firm (like hard plastics). The linked molecules in glue do not slide past each other easily (that is why you have to squeeze the bottle to get the glue to come out). When we add borax, we disrupt those connections. A small amount of borax causes some of the links to break, allowing the slime to flow more like a liquid. As we increase the amount of borax, new links form, causing the slime to start behaving more like a solid.

Early primary/elementary school (2nd grade) level

Solids hold their shape. Liquids do not (they flow). In this experiment, we are going to create a slime that behaves both like a solid and like a liquid. We start with glue which is very thick. When we add a small amount of detergent, the glue becomes less thick and more like a liquid (it flows). When we add a lot of detergent, the slime hardens into a solid (it holds its shape). Pushing or pulling on the slime causes the solid slime to loosen (and behave like a liquid). When we stop pushing or pulling on the slime, it returns to behaving more like a solid.

Appendix 3: explanations for making liquid nitrogen ice cream

College general chemistry level

The ice cream solution is a mixture of milk, sugar, and flavoring and milk is primarily composed of water. Dissolving the sugar in the milk increases the freezing point of the milk, causing it to freeze at a higher temperature. Nitrogen is a gas at room temperature because of strong intermolecular forces between the nitrogen molecules (London-dispersion interactions). Liquefying nitrogen requires low temperature and high pressure in order to decrease the kinetic energy/slow down the molecules enough to have the intermolecular forces take hold. As soon as the liquid nitrogen's container is opened, it boils because the vapor pressure of liquid nitrogen is so high. During boiling, the temperature of the liquid nitrogen increases as it changes into a gas. Heat from the ice cream solution is absorbed by the liquid nitrogen. Because the temperature difference between the ice cream solution and the liquid nitrogen is so great, the transfer of heat is very fast, allowing for the ice cream to freeze almost instantly. The water inside the ice cream mixture goes from a liquid state to a solid state because heat is lost and the molecules slow down, creating solid ice cream.

Early secondary/late middle school (8th grade) level

Ice cream is primarily made out of milk which is a mixture of fat and water. When we freeze the water, we get solid ice cream. We can use a freezer at your house to do this, but it takes a long time. If we use liquid nitrogen, we can do it in a few minutes. Your freezer at home is around 30 °F, liquid nitrogen is around −320 °F. Because liquid nitrogen is so cold, it freezes ice cream much faster. When we freeze the ice cream, the mixture goes from a liquid to a solid. In the liquid state, the molecules slide past each other and move around. In the solid state, the molecules are so cold that they stop moving. When we add the liquid nitrogen, cold from the liquid nitrogen transfers to the water in the ice cream mixture, causing the water particles to slow down and freeze. The liquid nitrogen loses its cold and increases in temperature to become a gas.

Early primary/elementary school (2nd grade) level

You use a freezer at home to keep your ice cream cold. Liquid nitrogen is about 12 times as cold as your freezer so it will let us make ice cream really fast. The liquid ice cream mixture is going to freeze to a solid because the liquid nitrogen gives its cold to the ice cream. When something gets really cold, it stops moving and shivers in place. The liquid nitrogen, once it has given its cold to the ice cream, heats up to become a gas and floats away.

Acknowledgements

We thank the student members of ACS and ΑΧΣ who volunteered and participated in this study, and the faculty/staff advisors and department chairs who aided our recruitment. We also thank the Yezierski and Bretz Research Groups for their feedback on early versions of the interview guide and for helping test the software. Additionally, we appreciate the constructive feedback from the reviewers, which helped us improve the presentation of the research. Lastly, we acknowledge Miami University for funding this project.

References

  1. Alpha Chi Sigma, (2017a), Alpha Chi Sigma awards – Star Chapter Awards, accessed August 21, 2017, accessed at: https://www.alphachisigma.org/about-us/awards/star-chapter-award.
  2. Alpha Chi Sigma, (2017b), Outreach programs, accessed August 4, 2017, accessed at: https://www.alphachisigma.org/about-us/outreach.
  3. American Chemical Society, (2017a), ACS student chapter grants, accessed August 4, 2017, accessed at: https://www.acs.org/content/acs/en/funding-and-awards/grants/acscommunity/studentaffiliatechaptergrants.html.
  4. American Chemical Society, (2017b), Community outreach, accessed August 4, 2017, accessed at: https://www.acs.org/content/acs/en/education/outreach.html.
  5. American Chemical Society, (2017c), Student chapter award recipients, accessed August 4, 2017, accessed at: https://www.acs.org/content/acs/en/funding-and-awards/awards/community/sachapter.html.
  6. Anzovino M. E. and Bretz S. L., (2016), Organic chemistry students’ fragmented ideas about the structure and function of nucleophiles and electrophiles: A concept map analysis, Chem. Educ. Res. Pract., 17(4), 1019–1029.
  7. Australian Government, (2017a), Inspiring Australia – Science engagement, accessed March 12, 2017, accessed at: https://www.business.gov.au/assistance/inspiring-australia-science-engagement.
  8. Australian Government, (2017b), National Science Week, accessed March 12, 2017, accessed at: https://www.scienceweek.net.au/.
  9. Bain K. and Towns M. H., (2016), A review of research on the teaching and learning of chemical kinetics, Chem. Educ. Res. Pract., 17, 246–262.
  10. Bauer C. F., (2014), Ethical treatment of the human participants in chemistry education research, Tools of Chemistry Education Research, Washington, D.C., American Chemical Society, pp. 279–297.
  11. Benny N. and Blonder R., (2018), Interactions of chemistry teachers with gifted students in a regular high-school chemistry classroom, Chem. Educ. Res. Pract.,  10.1039/C7RP00127D.
  12. Bertrand C. and Bourdeau L., (2010), Research interviews by Skype: A new data collection method, in Proceedings of the 9th European Conference on Research Methodology for Business and Management Studies, Madrid: IE Business School, pp. 70–79, retrieved from https://www.researchgate.net/profile/Catherine_Bertrand2/publication/256117370_Bertrand_C_Bourdeau_L_2010_Research_interviews_by_Skype_A_new_data_collection_method_In_J_Esteves_Ed_Proceedings_from_the_9th_European_Conference_on_Research_Methods_pp_70-79_S.
  13. Bogart S., (2017), SankeyMATIC (BETA): A Sankey diagram builder for everyone, accessed September 1, 2017, accessed at: http://sankeymatic.com.
  14. Bowen C. W., (1994), Think-aloud methods in chemistry education: Understanding student thinking, J. Chem. Educ., 71(3), 184–190.
  15. Brandriet A. R. and Bretz S. L., (2014), The development of the redox concept inventory as a measure of students’ symbolic and particulate redox understandings and confidence, J. Chem. Educ., 91(8), 1132–1144.
  16. Bretz S. L. and Linenberger K. J., (2012), Development of the enzyme–substrate interactions concept inventory, Biochem. Mol. Biol. Educ., 40(4), 229–233.
  17. Cakmakci G., (2010), Identifying alternative conceptions of chemical kinetics among secondary school and undergraduate students in Turkey, J. Chem. Educ., 87(4), 449–455.
  18. Carpenter Y., Phillips H. A., and Jakubinek M. B., (2010), Clock reaction: Outreach attraction, J. Chem. Educ., 87(9), 945–947.
  19. Cesa I. (ed.), (2004), Old Foamey: Decomposition of hydrogen peroxide, in Flinn ChemTopic Labs – Chemical Reactions, Volume 6, Batavia, IL, Flinn Scientific, Inc., pp. 85–86.
  20. Chou C., (2004), Internet heavy use and addiction among Taiwanese college students: An online interview study, Cyberpsychol. Behav., 4(5), 573–585.
  21. Christian B. N. and Yezierski E. J., (2012), A new chemistry education research frontier, J. Chem. Educ., 89(11), 1337–1339.
  22. Collins D., (2003), Pretesting survey instruments: An overview of cognitive methods, Qual. Life Res., 12, 229–238.
  23. Conklin A. R. and Kessinger A., (1996), Demonstration of the catalytic decomposition of hydrogen peroxide, J. Chem. Educ., 73(9), 838.
  24. Connelly T. M., (2015), Why is chapter community outreach so important? Chemistry, 24(1), 3.
  25. Cooper M. M., Williams L. C., and Underwood S. M., (2015), Student understanding of intermolecular forces: A multimodal study, J. Chem. Educ., 92(8), 1288–1298.
  26. Coughlan M., Cronin P., and Ryan F., (2009), Survey research: Process and limitations, Int. J. Ther. Rehabil., 16(1), 9–15.
  27. Creswell J. W., (2007), Qualitative Inquiry and Research Design: Choosing Among Five Traditions, 2nd edn, Thousand Oaks, CA, Sage Publications.
  28. Deakin H. and Wakefield K., (2014), Skype interviewing: Reflections of two PhD researchers, Qual. Res., 14(5), 603–616.
  29. Dean E., Head B., and Swicegood J., (2013), Virtual cognitive interviewing using skype and second life, in Hill C. A., Dean E. and Murphy J. (ed.), Social Media, Sociality, and Survey Research, Hoboken, NJ, John Wiley & Sons, Inc.
  30. DeKorver B. K., Choi M., and Towns M., (2017), Exploration of a method to assess children's understandings of a phenomenon after viewing a demonstration show, J. Chem. Educ., 94(2), 149–156.
  31. Desimone L. M. and Carlson le Floch K., (2004), Are we asking the right questions? Using cognitive interviews to improve surveys in education research, Educ. Eval. Policy Anal., 26(1), 1–22.
  32. Division of Chemical Education, (2016), Chemistry Education Research, accessed August 29, 2017, accessed at: https://cer.chemedx.org.
  33. Drever E., (1995), Using semi-structured interviews in small-scale research: a teacher's guide, Glasgow, Scottish Council for Research in Education.
  34. EducationSuperHighway, (2017), 2016 state of the states: Second annual report on K-12 broadband connectivity, accessed August 30, 2017, accessed at: https://s3-us-west-1.amazonaws.com/esh-sots-pdfs/2016_national_report_K12_broadband.pdf.
  35. Eurostat, (2017), Internet access and use statistics – Households and individuals, accessed August 30, 2017, accessed at: http://ec.europa.eu/eurostat/statistics-explained/index.php/Internet_access_and_use_statistics_-_households_and_individuals.
  36. Expat Software, (2017), Team whiteboarding with Twiddla – Painless team collaboration for the web, accessed August 30, 2017, accessed at: http://www.twiddla.com/home.aspx.
  37. Fielding N. G., Lee R. M. and Blank G. (ed.), (2017), The SAGE Handbook of Online Research Methods, 2nd edn, London, U.K., Sage Publications, Inc.
  38. Flynn N., (2005), Science days: An interdisciplinary outreach program, J. Chem. Educ., 82(10), 1483–1485.
  39. Funk C. and Goo S. K., (2015), A look at what the public knows and does not know about science, accessed August 29, 2017, accessed at: http://www.pewinternet.org/2015/09/10/what-the-public-knows-and-does-not-know-about-science/.
  40. Gehlbach H. and Brinkworth M. E., (2011), Measure twice, cut down error: A process for enhancing the validity of survey scales, Rev. Gen. Psychol., 15(4), 380–387.
  41. Girvan C. and Savage T., (2013), Guidelines for conducting text based interviews in virtual Worlds, in Childs M. and Peachey A. (ed.), Understanding Learning in Virtual Worlds, London, U.K., Springer London, pp. 21–39.
  42. Hamilton R. J., (2014), Using skype to conduct interviews for psychosocial research, Comput. Inform. Nurs., 32(8), 353–358.
  43. Hanna P., (2012), Using internet technologies (such as Skype) as a research medium: A research note, Qual. Res., 12(2), 239–242.
  44. Harshman J. and Yezierski E. J., (2015), Guiding teaching with assessments: high school chemistry teachers’ use of data-driven inquiry, Chem. Educ. Res. Pract., 16, 93–103.
  45. Hay-Gibson N. V., (2009), Interviews via VoIP: Benefits and disadvantages within a PhD study of SMEs, Libr. Inf. Res., 33(105), 39–50.
  46. Henderleiter J., Smart R., Anderson J., and Elian O., (2001), How do organic chemistry students understand and apply hydrogen bonding? J. Chem. Educ., 78(8), 1–5.
  47. Herrington D. G. and Daubenmire P. L., (2014), Using interviews in CER projects: Options, considerations, and limitations, Tools of Chemistry Education Research, American Chemical Society, pp. 31–59.
  48. Houck J. D., Machamer N. K., and Erickson K. A., (2014), Graduate student outreach: Model of a one-day “chemistry camp” for elementary school students, J. Chem. Educ., 91(10), 1606–1610.
  49. Iacono V. Lo, Symonds P., and Brown D. H. K., (2016), Skype as a tool for qualitative research interviews, Sociol. Res. Online, 21(2), 1–15.
  50. James N., (2007), The use of email interviewing as a qualitative method of inquiry in educational research, Br. Educ. Res. J., 33(6), 963–976.
  51. Janghorban R., Roudsari R. L., and Taghipour A., (2014), Skype interviewing: The new generation of online synchronous interview in qualitative research, Int. J. Qual. Stud. Health Well-Being, 9(1), 24152.
  52. Kelly K., Clark B., Brown V., and Sitzia J., (2003), Good practice in the conduct and reporting of survey research, Int. J. Qual. Health Care, 15(3), 261–266.
  53. Kelly R. M., Akaygun S., Hansen S. J. R., and Villalta-Cerdas A., (2017), The effect that comparing molecular animations of varying accuracy has on students’ submicroscopic explanations, Chem. Educ. Res. Pract., 18, 582–600.
  54. Kerby H. W., Cantor J., Weiland M., Babiarz C., and Kerby A. W., (2010), Fusion Science Theater presents the amazing chemical circus: A new model of outreach that uses theater to engage children in learning, J. Chem. Educ., 87(10), 1024–1030.
  55. Kerby H. W., DeKorver B. K., Cantor J., Weiland M. J., and Babiarz C. L., (2016), Demonstration show that promotes and assesses conceptual understanding using the structure of drama, J. Chem. Educ., 93(4), 613–618.
  56. Koehler B. G., Park L. Y., and Kaplan L. J., (1999), Science for kids outreach programs: College students teaching science to elementary students and their parents, J. Chem. Educ., 76(11), 1505–1509.
  57. Krieter F. E., Julius R. W., Tanner K. D., Bush S. D., and Scott G. E., (2016), Thinking like a chemist: Development of a chemistry card-sorting task to probe conceptual expertise, J. Chem. Educ., 93(5), 811–820.
  58. Kuntzleman T. S. and Baldwin B. W., (2011), Adventures in coaching young chemists, J. Chem. Educ., 88(7), 863–867.
  59. Laursen S., Liston C., Thiry H., and Graf J., (2007), What good is a scientist in the classroom? Participant outcomes and program design features for a short-duration science outreach intervention in K-12 classrooms, CBE – Life Sci. Educ., 6(1), 49–64.
  60. Lincoln Y. S. and Guba E. G., (1985), Naturalistic Inquiry, Newbury Park, CA, Sage Publications, Inc.
  61. Linenberger K. J. and Bretz S. L., (2012a), A novel technology to investigate students’ understandings, J. Coll. Sci. Teach., 42(1), 45–49.
  62. Linenberger K. J. and Bretz S. L., (2012b), Generating cognitive dissonance in student interviews through multiple representations, Chem. Educ. Res. Pract., 13, 172–178.
  63. Louters L. L. and Huisman R. D., (1999), Promoting chemistry at the elementary level: A low-maintenance program of chemical demonstrations, J. Chem. Educ., 76(2), 196–198.
  64. Luxford C. J. and Bretz S. L., (2013), Moving beyond definitions: What student-generated models reveal about their understanding of covalent bonding and ionic bonding, Chem. Educ. Res. Pract., 14, 214–222.
  65. Mann C. and Stewart F., (2000), Internet Communication and Qualitative Research – A Handbook for Researching Online. London, U.K., Sage Publications, Inc.
  66. Maxwell J. A., (2013), Qualitative Research Design: An Interactive Approach, 3rd edn, Thousand Oaks, CA, Sage Publications.
  67. Mayer D. P., (1999), Measuring instructional practice: Can policymakers trust survey data? Educ. Eval. Policy Anal., 21(1), 29–45.
  68. Microsoft, (2016), Satya Nadella and Terry Myerson: Build 2016, accessed August 30, 2017, accessed at: https://news.microsoft.com/speeches/satya-nadella-and-terry-myerson-build-2016/.
  69. Microsoft, (2017a), Microsoft by the numbers, accessed August 30, 2017, accessed at: https://news.microsoft.com/bythenumbers/skype-calls.
  70. Microsoft, (2017b), Use snipping tool to capture screenshots, accessed August 30, 2017, accessed at: https://support.microsoft.com/en-us/help/13776/windows-use-snipping-tool-to-capture-screenshots.
  71. Nakhleh M. B., (1994), Student's models of matter in the context of acid-base chemistry, J. Chem. Educ., 71(6), 495.
  72. National Academies of Sciences Engineering and Medicine, (2016), Effective Chemistry Communication in Informal Environments, Washington, D.C., The National Academies Press.
  73. National Academies of Sciences Engineering and Medicine, (2017), Communicating Science Effectively. Washington, D.C., National Academies Press.
  74. National Association for Research in Science Teaching, (2017), NARST: Connect to Listserv, accessed August 29, 2017, accessed at: https://www.narst.org/listserv.cfm.
  75. National Innovation and Science Agenda, (2017), Inspiring all Australians in science, technology, engineering, and mathematics, accessed March 12, 2017, accessed at: http://www.innovation.gov.au/page/inspiring-nation-scientists.
  76. National Research Council, (2009), Learning Science in Informal Environments: People, Places, and Pursuits, Washington, D.C., The National Academies Press.
  77. National Research Council, (2010), Surrounded by Science: Learning Science in Informal Environments, Washington, D.C., The National Academies Press.
  78. National Research Council, (2012), Future directions for discipline-based education research: Conclusions and recommendations, Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, D.C., The National Academies Press, pp. 186–204.
  79. National Science Foundation, (n.d.), Advancing informal STEM learning (AISL): Research on learning in formal and informal settings, accessed August 4, 2017, accessed at: https://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504793.
  80. National Science Teachers Association, (2012), An NSTA position statement: Learning science in informal environments, accessed April 8, 2017, accessed at: http://www.nsta.org/about/positions/informal.aspx.
  81. Nehls K., Smith B. D., and Schneider H. A., (2015), Video-conferencing interviews in qualitative research, in Hai-Jew S. (ed.), Enhancing Qualitative and Mixed Methods Research with Technology, Hershey, PA, IGI Global, pp. 140–157.
  82. Nisbet M. C. and Markowitz E., (2015), Public engagement research and major approaches, accessed March 12, 2017, accessed at: https://www.aaas.org/sites/default/files/content_files/Biblio_PublicEngagement_FINAL11.25.15.pdf.
  83. Novick S. and Nussbaum J., (1978), Junior high school pupils’ understanding of the particulate nature of matter: An interview study, Sci. Educ., 62(3), 273–281.
  84. Parliamentary Office of Science & Technology, (2011), Informal STEM Education, POSTnote, retrieved from http://www.parliament.uk/pagefiles/53788/postpn_382-informal-science-education.pdf.
  85. Patton M. Q., (2002), Qualitative Research and Evaluation Methods, 3rd edn, Thousand Oaks, CA, Sage Publications, Inc.
  86. Pazicni S. and Bauer C. F., (2014), Characterizing illusions of competence in introductory chemistry students, Chem. Educ. Res. Pract., 15, 24–34.
  87. Pew Research Center, (2017), Internet/broadband fact sheet, accessed August 30, 2017, accessed at: http://www.pewinternet.org/fact-sheet/internet-broadband/.
  88. Pratt J. M., (2017), Alpha Chi Sigma and chemistry outreach: Promoting the second object, The Hexagon, 108(1), 8–9.
  89. Pratt J. M. and Yezierski E. J., (2018), Characterizing the landscape: Collegiate organizations’ chemistry outreach practices, J. Chem. Educ., 95(1), 7–16.
  90. Raker J. R., Reisner B. A., Smith S. R., Stewart J. L., Crane J. L., Pesterfield L., and Sobel S. G., (2015), In-depth coursework in undergraduate inorganic chemistry: Results from a national survey of inorganic chemistry faculty, J. Chem. Educ., 92(6), 980–985.
  91. Royal Australian Chemical Institute Incorporated, (2017), 100 reactions for RACI100, accessed March 12, 2017, accessed at: https://www.raci.org.au/resourcecentre/100-reactions-for-raci100.
  92. Royal Society of Chemistry, (2017a), Chemistry outreach – Learn chemistry, accessed August 29, 2017, accessed at: http://www.rsc.org/learn-chemistry/resource/res00000301/outreach.
  93. Royal Society of Chemistry, (2017b), Connecting everyone with chemistry, accessed August 29, 2017, accessed at: http://www.rsc.org/campaigning-outreach/outreach/everyone/.
  94. Royal Society of Chemistry, (2017c), Connecting scientists, accessed August 29, 2017, accessed at: http://www.rsc.org/campaigning-outreach/outreach/scientists/.
  95. Royal Society of Chemistry, (2017d), Connecting scientists, accessed August 29, 2017, accessed at: http://www.rsc.org/campaigning-outreach/outreach/scientists/#getting-started-tab.
  96. Royal Society of Chemistry, (2017e), For science teachers, accessed August 29, 2017, accessed at: http://www.rsc.org/campaigning-outreach/outreach/educators/.
  97. Seitz S., (2016), Pixilated partnerships, overcoming obstacles in qualitative interviews via Skype: a research note, Qual. Res., 16(2), 229–235.
  98. Sewry J. D., Glover S. R., Harrison T. G., Shallcross D. E., and Ngcoza K. M., (2014), Offering community engagement activities to increase chemistry knowledge and confidence for teachers and students, J. Chem. Educ., 91(10), 1611–1617.
  99. Shenton A. K., (2004), Strategies for ensuring trustworthiness in qualitative research projects, Educ. Inf., 22, 63–75.
  100. Simeonsdotter Svensson A., Pramling Samuelsson I., Hellström A.-L., and Jenholt Nolbris M., (2014), Experiences of SKYPE communication in education and research – Data collection concerning young children with long-term illness, Early Child Dev. Care, 184(7), 1017–1030.
  101. Steeh C. and Piekarski L., (2007), Accommodating new technologies: Mobile and VoIP communication, in Lepkowski J. M., Tucker C., Brick J. M., de Leeuw E. D., Japec L., Lavrakas P. J., Sangster R. L. et al. (ed.), Advances in Telephone Survey Methodology, Hoboken, NJ, John Wiley & Sons, Inc.
  102. Sullivan J. R., (2012), Skype: An appropriate method of data collection for qualitative interviews? Hilltop Rev., 6(1), 54–60.
  103. Swim J., (1999), An elementary outreach program – Have demo will travel, J. Chem. Educ., 76(5), 628–629.
  104. Szteinberg G. A. and Weaver G. C., (2013), Participants’ reflections two and three years after an introductory chemistry course-embedded research experience, Chem. Educ. Res. Pract., 14(1), 23–35.
  105. Tencent Technology (Shenzhen) Company Limited, (2012), Snip on the Mac App Store, accessed August 30, 2017, accessed at: https://itunes.apple.com/us/app/snip/id512505421?mt=12.
  106. Ting J. M., Ricarte R. G., Schneiderman D. K., Saba S. A., Jiang Y., Hillmyer M. A., Lodge T. P. et al., (2017), Polymer day: Outreach experiments for high school students, J. Chem. Educ., 94(11), 1629–1638.
  107. TNS BMRB and Royal Society of Chemistry, (2015), Public attitudes to chemistry, accessed August 29, 2017, accessed at: http://www.rsc.org/globalassets/04-campaigning-outreach/campaigning/public-attitudes-to-chemistry/public-attitudes-to-chemistry-research-report.pdf.
  108. Trujillo C. A., (2005), A modified demonstration of the catalytic decomposition of hydrogen peroxide, J. Chem. Educ., 82(6), 855.
  109. Weller S., (2017), Using internet video calls in qualitative (longitudinal) interviews: Some implications for rapport, Int. J. Soc. Res. Methodol., 1–13.
  110. Wilsdon J. and Willis R., (2004), See-through Science: Why public engagement needs to move upstream, London, England, Demos.

Footnote

Note: students received the explanations without boldfacing, but inaccuracies are boldfaced here for reference.
