PAA black holes, Khan, and Quora: mapping understanding of isotopes through existing data sources

Wayne Breslyn
Department of Teaching and Learning, Policy and Leadership, University of Maryland, 2311 Benjamin Building, College Park, Maryland 20742, USA. E-mail: wbreslyn@umd.edu

Received 6th July 2019, Accepted 23rd October 2019

First published on 7th November 2019


Abstract

Publicly available, learner-generated questions were used to develop a methodology for advancing the exploratory stages of science education research. Data from four complementary online sources were collected, analyzed, and compared to the extant research literature on the science education topic of isotopes, a challenging concept for many chemistry learners. Data from People Also Ask features on a popular search engine, questions in response to two videos (n = 770), and questions posted to two question and answer websites (n = 600 and n = 29,213) were analyzed. Multiple findings not present in the literature were detected across all data sources in this study. Findings suggest that these online sources can serve to inform research in science education by providing a rich, ecologically valid, and accessible source of data. Implications include the use of existing online data prior to initiating research, writing assessments, and developing curriculum for science topics as a means to achieve more robust and generalizable findings and instructional resources.


Background

Science education research has yielded insights into student thinking on a wide range of topics. A great deal of research has focused on identifying students’ alternative conceptions and testing strategies to move students towards a more sophisticated understanding of a topic. The goal of such research is often to inform assessment, curriculum development, and teacher education in order to measure or improve student learning.

A challenge faced by researchers is accessing a diverse group of participants and obtaining meaningful data. In their development of a novel methodology to investigate chemical education outreach, Pratt and Yezierski (2018) note that access, as well as eliciting meaningful data, is a limiting factor in many research studies. Our research suggests that for many studies the use of online learner-generated questions can be particularly effective for addressing this challenge.

A further challenge is the development of instruments and research designs that accurately identify and measure a learner's understanding of a topic. Typically, pilot studies with small samples are conducted, and the outcomes, along with relevant literature, are then used to develop the design and instrumentation of a larger study (Van Teijlingen and Hundley, 2001). While pilot studies are useful for developing ideas, generalizable inference is limited by sample size, resources, and access to participants. Although pilot studies are valuable, caution is necessary because they can have an unwarranted influence on research instrumentation, data analysis, and ultimately the findings of the larger study.

The current study offers a potential means to address these methodological problems. The proposed methodology uses existing data sources to augment pilot work and strengthen instrument development and study design. The goal is not to replace more traditional research methodologies; instead, the goal is to provide insights into learners’ thinking based on a large, existing dataset prior to initiating research, assessment, or curriculum development.

Three primary categories of resources are addressed: search-engine-generated People Also Ask (PAA) questions (http://www.Google.com), questions asked by students in response to two videos on an education website (http://www.KhanAcademy.com), and questions asked on two popular question and answer websites (http://www.Quora.com, http://www.Answers.Yahoo.com). In this study the topic of isotopes was investigated.

The topic of isotopes was selected for three primary reasons. First, it provides a manageable breadth of content while still being theoretically and methodologically generative. Second, the topic is frequently taught at the secondary and university levels in both chemistry and physics courses. Finally, the topic of isotopes has a wide range of applications outside of the classroom, such as medicine, nuclear energy, and radiocarbon dating (U.S. Nuclear Regulatory Commission, 2000). Therefore, the topic of isotopes allows for a focus on methodology as well as the generation of knowledge that can be applied to chemistry and physics education.

To explore the validity of the methodology proposed in this study, a comparison was made between the research literature on students’ understanding of isotopes and findings from the four data sources used in this study. The primary research question addressed was:

How do data from PAA suggested questions and learner-generated online questions compare to the research literature on learners’ understanding of isotopes?

In addition to making a comparison to the extant literature, a further goal was to detect and describe learner ideas about isotopes not present in the research literature.

Theoretical framework

Development of the current methodology was guided by a conceptual change framework, specifically the Learning Progression (LP) framework. An LP is an empirically based description of learner understanding over time (Duschl et al., 2007). Changes in learners’ understanding are placed into levels, with the lower level describing learners’ prior knowledge and skills and an upper level describing what they know upon completing the progression (Duncan and Hmelo-Silver, 2009). The intermediate levels describe the changes in their understanding as they progress. We found this framework productive in describing learning over time for the topics of sea level rise (Breslyn et al., 2016) and climate change (Breslyn et al., 2017).

In this study we were drawn to a conceptualization of LP research as composed of five stages, with validity established through evidence from various sources (Jin et al., 2019). In this framework the proposed five stages are Development, Scoring, Generalization, Extrapolation, and Use. Each stage is based on assumptions that guide the research. For example, in the Development stage it is assumed that important concepts are addressed in the instrumentation used to collect evidence and that the instrumentation is effective in detecting student reasoning.

In the current study, the use of existing learner-generated questions is most applicable to the Development stage of LP research. It is at this stage that decisions are being made that can have a considerable impact on the remaining stages of research.

In the Development stage, Jin et al. include expert review, think-aloud interviews, and clinical interviews as the evidence used to guide research. We see the use of existing learner-generated questions as an initial form of evidence to support the Development stage by providing a large, accessible, and ecologically valid sample. Such a sample would strengthen the development of instrumentation for think-alouds, clinical interviews, and related methods, and therefore subsequent stages of research.

Literature review

In this study, the methodology presented focuses on learner-generated questions in an online environment to enhance and extend the initial stages of research. As described by Chin and Osborne (2008):

Students’ questions provide insights into their knowledge, understanding, and puzzlement, and act as a window into their minds.

Considerable research exists on the types of questions teachers ask their students, but research on learner-generated questions is sparse (Ronfard et al., 2018). It has been suggested that this is because students typically do not ask enough questions to be theoretically or practically generative (Commeyras, 1995; Graesser and Person, 1994). A strength of the methodology proposed in the current study is the diversity and volume of learner-generated questions available online for a wide range of science topics.

Although no studies on the topic of isotopes involved questions generated in an online environment, a number looked at the classification of the types of questions students ask. For example, in a comparison of a traditional lecture and an interactive learning environment, Griffith et al. (2019) categorized student questions as non-content, foundational knowledge, or application knowledge. Foundational knowledge questions were considered to be of a lower cognitive level than application questions. Pedrosa De Jesus et al. (2003) categorized student questions as either confirmational or transformational: confirmational questions sought clarification, whereas transformational questions involved the restructuring or reorganization of knowledge. In their study of elementary students’ text-based and knowledge-based questions, Scardamalia and Bereiter (1992) described what they termed “wonderment” questions. These questions are asked when students are curious or puzzled and are thought to have a larger impact on advancing understanding. Because learners tend to ask questions when they feel secure (Watts et al., 1997), we speculate that an online environment may result in more questions involving wonderment.

While there is little research available on the specific use of online questions, one useful example is the work of Baram-Tsabari et al. (2006) investigating children's interest in science. Using data from a popular online question and answer website, the authors analyzed 1555 learner-generated questions. These questions were coded based on disciplinary area of learner interest (e.g. biology, chemistry). Because the authors were able to access demographic information, they were able to investigate how gender and age related to the content area of questions asked.

The study offers several methodological insights. First, it is not always possible to determine why a learner asked a question. The authors attempted to differentiate between what they termed “spontaneous” and “school related” questions; unless explicitly stated to be for a school assignment, questions were classified as “spontaneous”. This highlights a limitation to be considered when using online data. Second, as the authors note, many learners who use question and answer websites find their question already answered on the site, so these questions are not available for analysis in the data. Finally, the self-selected sample of learners asking questions necessitates caution in generalizing findings to learners in other contexts.

Isotopes

To avoid researcher bias, a literature review on learner understanding of isotopes was not conducted until after data analysis was complete. As such, it is less likely that findings in the literature influenced coding and analysis. In conducting the literature review after coding, it was found that only one study focused specifically on the topic of isotopes (Schmidt et al., 2003). As noted by other researchers (Çalik et al., 2009), there is limited scholarship in this area. Two additional studies contained findings relevant to the current study. To support the comparison of the literature to the findings of the current study, all three studies are described below.

Schmidt et al. (2003)

Based on a sample of 3074 senior high school students in Germany, the study focused on difficulties students experience in understanding isotopes and allotropes and their relationship to the periodic table of the elements. Using a mixed methodology, the researchers collected pre and post data with a multiple-choice instrument that also allowed students to provide written explanations of their responses.

For example, students were asked about the number of electrons and neutrons in atoms of graphite and diamond. Possible responses were that graphite and diamond had the same number of neutrons, the same number of electrons, both, or neither. To answer the question correctly students must understand that graphite and diamond consist of carbon atoms in different arrangements. The carbon atoms will have the same number of protons and electrons; however, within a sample of graphite or diamond, isotopes of carbon with varying numbers of neutrons would exist.
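To make the reasoning concrete, a short worked check for neutral carbon atoms (standard chemistry, stated here for clarity):

\[
Z = 6 \;\Rightarrow\; \text{protons} = \text{electrons} = 6 \quad \text{in both allotropes}
\]
\[
N = A - Z:\qquad {}^{12}\mathrm{C}:\; N = 12 - 6 = 6, \qquad {}^{13}\mathrm{C}:\; N = 13 - 6 = 7
\]

Thus “same number of electrons” is correct, while “same number of neutrons” is not guaranteed, since any natural sample contains a mixture of isotopes.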

Of interest is how instrumentation was developed in the Schmidt et al. (2003) study and its influence on their findings. The multiple-choice questions were developed based on past tests from examination boards in the United Kingdom and United States. Items were piloted with a small group prior to the main study. The collection of data from the multiple-choice instrument was followed by interviews (n = 6) based on the responses to the multiple-choice questions. The study therefore ultimately relies on concepts about isotopes as conceived by the authors of the examination board test items. This places constraints on what is measured and found in the study.

Analysis of multiple-choice data indicated that students experienced difficulties with three concepts related to isotopes.

Students considered there to be “standard atoms” and isotopes of these atoms. Students often thought that standard atoms have the same number of protons and neutrons, are more stable, and have integer mass numbers, and that these standard atoms are the ones found on the periodic table.

Students considered graphite and diamonds to be isotopes of carbon.

Students confused the terms “isotope” and “isomer”.

Interviews, developed based on findings from the multiple-choice instrument, confirmed students’ alternative conception of a standard form of an element with equal numbers of protons and neutrons. Interviews also confirmed student confusion between isotopes and allotropes of carbon. No new findings were reported based on interview data.

It is important to note that the intent of the study was to determine why students accepted inaccurate statements about isotopes developed in the pilot study rather than identify alternative conceptions in a broader context.

As such, these findings may be due to the nature of the instrumentation and its development. The authors state that students may have been drawn to distractors and led to believe that the numbers of protons and neutrons were the same. Because of this, it is likely that the development of the multiple-choice questions resulted in a contextually limited description of student understanding of isotopes.

Other relevant studies

Two other studies contained findings related to learners’ understanding of isotopes. In both cases isotopes were not the focus of the study.

Çalik et al. (2009) conducted a study on the use of analogies in understanding the ‘atom’ concept using a pre-post multiple-choice instrument. A total of 36 students took part in the study.

Three alternative conceptions held by some students were:

• For isotopes, the atomic masses are the same, but the atomic numbers are different.

• For isotopes, the atomic mass and atomic numbers are the same.

• Atoms of isotopes have the same neutron number.

Tekin and Nakiboglu (2006) conducted a study on Turkish tenth grade students’ misconceptions of nuclear chemistry (N = 157) using a seven-item multiple-choice diagnostic test. Students were also asked to explain their thinking in writing for each of the seven questions. Their findings related to isotopes focus on stability and half-life. They found that many students thought:

• Isotopes with specific atomic numbers affect the stability of the nucleus.

• Isotopes with the longest half-life are the least stable.

• Radioisotopes are only used for energy production since they are dangerous to humans.

• The rate of radioactive decay depends on temperature and pressure.

As in the Schmidt et al. (2003) study, existing chemistry content, in this case textbooks, was used for question development, along with unspecified classroom observations.

For all of the studies in this literature review, instrument development was guided primarily by past examinations or textbooks (Çalik et al., 2009; Schmidt et al., 2003; Tekin and Nakiboglu, 2006). A major criticism of this approach is that starting with these secondary sources can lead to biased data and findings that do not necessarily reflect learners’ thinking. In other words, by either omitting key aspects or providing attractive distractors, such instruments yield data and findings that do not fully describe how learners think about the topic, the questions they commonly ask, the obstacles to their learning, or what they find interesting.

Context

Four complementary contexts provide a range of learners in terms of demographics as well as level of sophistication in their understanding of the topic of isotopes. This includes data from People Also Ask (PAA) questions generated by an internet search engine, questions in response to specific online videos on isotopes, and questions posted to two different online question and answer websites.

The nature of online data has contextual implications, and it is possible that the questions asked by learners online will differ from those asked in the classroom or in response to interview questions. It is hypothesized that the asynchronous and anonymous nature of online questioning will result in a more diverse set of self-generated questions, as students reluctant to ask questions in a classroom setting may be more willing to participate in an anonymous online environment. However, the relative anonymity of the learner also places limits on the information available about each individual (Holtz et al., 2012). Specifically, data on gender, age, nationality, etc. are not accessible for analysis.

Together the sources provide complementary data that can be used to generate a description of learner understanding of a topic as well as common challenges and areas of learner interest. A description of each source is provided below.

People Also Ask data (PAA)

Learners often begin their online quest for answers using a search engine. Many of the queries made to search engines take the form of questions (White et al., 2015). In response, major search engines added a PAA feature to provide users with an additional means of finding the information they are seeking. For example, in this study we searched on the term “isotopes”. Along with the search results a PAA box was displayed (see Fig. 1 below).
Fig. 1 Initial PAA questions for the search query “isotopes”.

Upon selecting a question, additional questions about the topic appear along with a link and information related to the question. These questions can also be selected, producing further questions, and the process can be repeated multiple times. This phenomenon is sometimes termed a PAA “black hole”, allowing multiple iterations and pathways for exploring the questions people ask about the topic of isotopes. Fig. 2 shows additional questions generated after selecting “What makes an isotope?”


Fig. 2 Additional questions after selecting “What makes an isotope?”.

PAA results are selected by the search engine as a function of how frequently the question has been asked (Matias et al., 2017). This means that the theme categories identified can be considered those most relevant and important to learners, similar to the “Customers who bought this item also bought…” feature on popular shopping websites.

Because the major search engines process several billion searches daily, the PAA questions can be considered to be generated from a substantial dataset encompassing a diversity of people searching, often in the form of questions. However, exactly how the PAA questions are generated is not publicly documented; the feature, while useful for exploratory work, is therefore limited.

Khan Academy (KA)

As one of the largest sources of science and math instruction available online, the non-profit KA has over 22,000 instructional videos available. Founded in 2008, it is managed by a staff of 150 employees and has a network of content experts to support resource generation.

In general, the videos focus on academic topics and are of high quality, having been reviewed by content experts. The comments sections, where learners ask questions, are active, with questions still being asked and answered on older videos. Further, questions receive timely and high-quality responses, providing an incentive for learners to ask questions.

In the context of this study, KA provides questions posted by learners to the comments section of a specific instructional experience that can be accessed by the researcher. It thus provides insights into how instruction influences the types of questions learners ask. For Quora (Q) and Yahoo!Answers (Y!A) it is not possible to know what, if any, instructional experience led to a question. In addition, questions that would be filtered out of an internet search based on a specific keyword are present in the KA data. For example, a question like “Why isn’t the mass of electrons considered when calculating mass number?” provides insight into learners’ thinking but would not be returned in a search using the keyword “isotopes”. While KA questions are advantageous in their specificity, question and answer websites like Q and Y!A offer a broader range of questions but provide less contextual information.

Quora (Q)

For the purpose of this study, the question and answer website http://www.Quora.com was included to provide a more general dataset not tied to a specific educational resource.

Q, a for-profit company, was formed in 2010 and as of 2018 reported 300 million monthly users. Users can post and answer questions on the website, with the Q community responding, often in great detail.

For the topic of isotopes there is a range of questions; however, in general the level of sophistication is higher than on KA and Y!A. For example, questions about the separation of isotopes, biological uptake, or the use of spectroscopy to identify isotopes are more common. This is of value as it extends the range and diversity of questions available for analysis.

From a technical perspective, working with data from Q is more challenging since keyword sorts do not report the total number of matching questions, only a continuous list that eventually stops producing new questions (the limit appears to be around 600). This prevents examining the relative number of questions for a given keyword.

Yahoo!Answers (Y!A)

Begun in 2005, Y!A is one of the older question and answer websites. As on Q, users can either post their own questions or answer questions posted by other users. As of 2015, Y!A received eleven million visitors per month.

In general, questions and answers on Y!A have wide breadth but lack depth, as measured by degree of expertise (Adamic et al., 2008). In this study, for the topic of isotopes, questions on Y!A were found to be more oriented towards homework and school projects than questions found via PAA, KA, and Q. It is expected that this is the case for other science topics as well.

While generally lacking in depth, Y!A data is valuable because of the number of questions and for providing a different context in which questions are asked. With many questions related to homework or school projects, Y!A can be viewed as providing information about topics relevant to learners’ academic lives.

Y!A has the advantage of a large, searchable dataset that reports the number of questions asked for each keyword, something not possible with Q data. This gives an idea of the relative numbers of each general type of question.

Methods

The study consists of four separate but complementary data sources and follows a Design Based Research (DBR) methodological framework (Collins et al., 2004). Initial code categories are developed using PAA data to guide the coding of learner-generated data from video comments (KA) and questions posted to a question and answer website (Q). The result is a refined codebook and well-developed themes.

To extend and explore the themes that emerged from coding, keyword searches are then conducted on a different question and answer website (Y!A) based on these codes. As needed, the PAA, Q, and Y!A data are revisited in an iterative manner. Finally, a literature review is conducted for comparison with the findings of the current study. Themes found in the literature, but not in the current study, are then explored further in the PAA, KA, Q, and Y!A data. The process is described in Fig. 3.


Fig. 3 Graphic representation of methodology.

Sample and data collection

PAA data, which is generated algorithmically, is analyzed to detect initial themes to guide the subsequent analysis of learner-generated questions. Using individual questions as the unit of analysis, the sample consists of 1373 learner-generated questions from KA and Q and 29,213 from Y!A.

Initially, data were collected from the PAA feature on the search engine http://www.Google.com. The term “isotopes” was entered in the search box and a search initiated. PAA questions were then repeatedly selected to generate further questions, and this process was continued until no new question types appeared (simple variations in syntax were excluded). Questions that did not relate to student understanding of isotopes were excluded from the dataset. For example, the questions “How much does it cost to get rid of radon?” and “Can goiter turn into cancer?” were excluded.

Questions asked by learners in the comments section of two videos on the popular education website, http://www.KhanAcademy.com, were accessed. For both videos, over four years of comments (June 10, 2014 to May 1, 2019) were accessible, with a total of 770 relevant comments. Questions not related to student understanding of the topic of isotopes or the atom were excluded. For example, questions about balancing chemical equations were removed from the dataset.

For Q, the first 600 questions returned for the term “isotopes” were collected, using inclusion and exclusion criteria similar to those for PAA and KA. For Y!A, to take advantage of the large number (29,213) of questions asked that included the term “isotopes”, data were manipulated via the Y!A website. This was possible because sorting on Y!A, unlike Q, returns a specific count of the number of questions asked for each sort.

Data analysis

Analysis took place in an iterative manner (see Fig. 3), beginning with PAA data. An initial set of codes was developed from PAA data by the first researcher and documented in a codebook. Because PAA questions are generated algorithmically, there is no clear unit of analysis, and these data were used only to generate initial codes. Using these initial codes, a subset of the KA and Q data was coded in an iterative manner by the first researcher. During this process, as codes were added, modified, and expanded upon, the codebook was further developed to include a description of each code, examples, and exclusion criteria.

Prior to coding the entire dataset, a research assistant, familiar with the topic of isotopes but not a chemistry student or instructor, coded a subset of data in order to assess Inter-rater Reliability (IRR). Initially, to familiarize the second researcher with the codebook and to refine it further, the first and second researchers collaboratively coded a subset of data from each source. As a result, modifications, primarily collapsing codes and adding clarifying descriptions of the inclusion and exclusion criteria, were made to the codebook based on discussions between the researchers.

To illustrate changes to the codebook, several examples are provided. Initially there was a code for questions about the addition or loss of neutrons and a separate code for questions about how isotopes are formed. These codes were collapsed since both deal with changes that result in the formation of isotopes. Similarly, two codes addressing the topic of ions, one for questions asking about the charge of ions and one for questions addressing the neutrality of atoms, were collapsed into a single code. As a final example, two separate codes existed for questions about the properties of isotopes, one for physical and one for chemical properties. To reduce the complexity of the codebook these codes were collapsed.

After modifications were made to the codebook, a different subset of data was independently coded by both researchers, and IRR was calculated and found to be above 90%.
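As a minimal sketch of how such a percent-agreement figure can be computed in R, assuming the two coders’ labels for the same subset of questions are stored side by side (the data and code names below are hypothetical, not the study’s actual codes):

# Percent agreement between two coders on the same subset of questions.
# The labels are illustrative only.
codings <- data.frame(
  coder1 = c("ions", "hydrogen", "notation", "formation", "ions"),
  coder2 = c("ions", "hydrogen", "naming",   "formation", "ions")
)
agreement <- mean(codings$coder1 == codings$coder2)  # proportion of matching codes
cat(sprintf("Inter-rater agreement: %.0f%%\n", agreement * 100))  # 80% for this toy data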

Data were coded using the open-source analysis software RQDA (Huang, 2018), a package for the R statistical software environment. Using RQDA, frequencies were calculated for each code.
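A sketch of this step, assuming a coded RQDA project file named isotopes.rqda (a hypothetical file name):

library(RQDA)                     # qualitative data analysis package for R
openProject("isotopes.rqda")      # hypothetical project containing the coded questions
codings <- getCodingTable()       # one row per coded segment, including the code name
freq <- sort(table(codings$codename), decreasing = TRUE)
prop <- round(100 * freq / sum(freq), 1)    # each code as a percentage of all codings
print(cbind(count = freq, percent = prop))  # categories under 5% were not reported
closeProject()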

Due to the volume of data available for Y!A, data were analyzed by sorting on keywords and keyword phrases. For example, the term “isotopes” returned 29,213 questions. Further sorting on the keywords “isotopes, ions” returned 2050 questions, and “isotopes, ions, neutral” returned 413 records. Sorting in this manner provides a general idea of the prevalence of themes, although, because the sort encompasses both questions and community responses, its accuracy is lower than that of manually coded data.
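The same kind of keyword co-occurrence count can be sketched in R, assuming the questions had been collected into a character vector (hypothetical data; the actual analysis used the Y!A website’s own search):

# Count questions matching "isotopes" and the subset also mentioning ions.
questions <- c(
  "What is the difference between isotopes and ions?",
  "Why are some isotopes radioactive?",
  "What are ions?"   # missed by a sort on "isotopes" (see Results)
)
has_isotope <- grepl("isotope", questions, ignore.case = TRUE)
has_ion <- grepl("\\bions?\\b", questions, ignore.case = TRUE, perl = TRUE)
sum(has_isotope)                # questions returned for "isotopes"
sum(has_isotope & has_ion)      # subset also containing "ion(s)"
round(100 * sum(has_isotope & has_ion) / sum(has_isotope))  # cf. 2050/29,213 = 7%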

Finally, a review of the research literature on students’ understanding of the topic of isotopes was conducted. The literature review was conducted after coding as a means to reduce researcher bias during coding and to allow for a fairer comparison.

Results

To describe learners’ understanding of the topic isotopes, and to illustrate the proposed methodology, data were coded and the codes were categorized thematically (Saldaña, 2015). To provide coherence between themes as they relate to the methodology described in this study, themes are presented based on their relation to the existing research literature. Themes detected in the current study but not reported in the research literature are presented first, followed by themes that appear in both this study and the literature. For both, percentages from each data source are provided as appropriate.

As the methodology is intended to be used in the pilot or exploratory stages of research, only code categories that reached 5% or greater in either Q or KA were reported. For example, codes for isotopes and the periodic table, spectroscopy, separation, isotopes of carbon or oxygen, the definition of isotopes, examples of isotopes, and the chemical and physical properties of isotopes were below 5% and were not reported in the results. Depending on the context, a researcher using the methodology might decide to explore findings of a smaller grain size if they appear to be theoretically generative.

New findings not reported in the literature

The following five themes were detected in the current study but not found in the existing research literature.

Isotopes, ions, and neutral atoms

A major source of confusion identified for learners was the relationship between ions and isotopes. This most commonly involved the difference between an ion and an isotope and how to know whether an atom is an isotope, an ion, or a neutral atom. Note that this theme specifically includes the concept of ions whereas the others do not. Ions and isotopes tend to be problematic for learners both because each involves a difference in a single subatomic particle (the electron for ions and the neutron for isotopes) and because they are often taught together.

The code was present in all data sources but most pronounced in KA. There were no instances of this theme found in the literature reviewed. Table 1 presents the frequency of occurrences across data sources, and quotes are included to provide examples of typical learner questions. Because it is challenging to achieve meaningful counts for PAA themes, only the presence of the code, along with sample questions, is provided.

Table 1 Ions and isotopes questions
PAA (presence and sample questions only): “Are ions and isotopes the same?”; “Can an atom be an ion and an isotope?”; “Can isotopes exist as ions?”
Khan (codes): 130/770 (17%). Examples: “Is it just assumed that a atom is neutral (obviously unless stated otherwise)?”; “What do an atom, ion, and isotope of an element all have in common?”
Quora (codes): 20/600 (3%). Examples: “How do ions and isotopes differ from atoms?”; “What are the differences between ions and isotopes?”
Yahoo!Answers (keyword sort): 2050/29,213 (7%). Examples: “How can you tell if a element is an isotope, ion, or stable?”; “Can ions be isotopes?”


While the theme occurred in all data sources except the literature, it was predominantly found in KA data. For KA this is likely due to the inclusion of the terms “ions” and “neutral element” in both videos. Further, the first video specifically included practice problems involving ions. However, the frequency of ion-related questions was consistent across questions from both videos.

It may be that learners viewing the KA videos are more focused on traditional academic questions, whereas the Q data tend to consist of more advanced questions. To test this further, the Y!A data were analyzed to detect whether questions about ions and isotopes were present.

For Y!A, of the 29,213 questions containing the term “isotopes”, 2050 also contained the term “ions”. This represents 7%, less than the 17% from KA data but still a meaningful number of learners. In addition, the keyword pair “isotopes” and “neutral” occurred 1118 times, over half the count for “isotopes” and “ions”, although many questions contain both search terms and therefore overlap.

The absence of this theme from the literature may be due to the manner in which research instruments were developed in those studies. Unless learners spontaneously brought up the difference between ions and isotopes during piloting, it is unlikely researchers would have included the theme in instrumentation for the study.

General questions about the atom

During the analysis of data, general questions about the atom frequently emerged in the KA data (16%), and to a lesser extent in Q (3%). This is of importance as understanding the structure of the atom is necessary for understanding the topic of isotopes. It also points to the value of analyzing learner-generated questions tied to a specific instructional intervention, such as the Khan Academy videos in this study (Table 2).
Table 2 General atom questions
PAA: general atom questions are not generated in PAA sorting when searching on “isotopes”.
Khan (codes): 120/770 (16%). Examples: “So the number of electrons doesn't change anything in the mass number?”; “Can an atom have no electrons?”; “Just a basic question, how does an atom look like?”
Quora (codes): 16/600 (3%). Examples: “How would you write the electron configuration an isotope of the element cobalt?”; “What is the difference between a proton, neutron and electron?”; “How do I calculate binding energy of a nucleus or isotope?”
Yahoo!Answers (keyword sort): because Y!A data were analyzed by keyword sorting, frequencies for general atom questions were not generated.


Due to the nature of searching on keywords in PAA, Q, and Y!A, most general questions about the atom were not returned in the search results. In contrast, because no sorting was used with KA data and all questions for both KA videos were coded, general questions about the atom were included in the dataset. To further clarify the importance of varying sources of data, one could imagine a learner studying isotopes who is confused about ions. On KA they would post a question such as “What are ions?” Since all KA learner-generated questions are coded, the question would be included in the analysis. If the learner asked the same question on the Q or Y!A websites, a computerized sort on the term “isotopes” would not return it, as “What are ions?” does not include the term “isotopes.”

While it is not surprising that general chemistry questions are asked when learning about isotopes, such questions have implications for teaching and are likely to be present when analyzing learner-generated questions for other specific chemistry topics.

Hydrogen

Hydrogen is involved in about 10% of all learner-generated questions in this study. This is likely because hydrogen is often used as an example to introduce learners to the topic of isotopes, as in the KA videos (Table 3).
Table 3 Questions related to hydrogen
PAA: questions involving “Hydrogen” do not appear in PAA searches for “isotopes”.
Khan (codes): 79/770 (10%). Examples: “Doesn't deuterium have 2 neutrons and tritium have 3?”; “Is protium considered as an isotope?”; “I thought protium, deuterium and tritium were the sub atomic particles of atom.”
Quora (codes): 46/600 (8%). Examples: “Why is the atomic mass of hydrogen isotopes fractional?”; “Why is protium an isotope?”; “Which isotope of hydrogen produces a higher temperature flame, protium or deuterium?”
Yahoo!Answers (keyword sort): 3248/29,213 (11%). Of the 3248 questions containing “Hydrogen”: “Protium” 240/3248 (7%); “Deuterium” 1058/3248 (33%); “Tritium” 826/3248 (25%).


Hydrogen is unique in that its most common isotope does not have any neutrons and that the first three isotopes of hydrogen have special names (protium, deuterium, and tritium). Questions about the lack of a neutron in protium are frequent in all sources of learner-generated questions. For example:

How does protium exist? If there are no neutrons, then doesn't the electron and proton get attracted to each other?

and

I don't understand why protium is a isotope if it is the same as hydrogen which has no neutron. (KA)

Why is protium an isotope? (Q)

In addition, learners frequently asked about the specific names for the primary isotopes of hydrogen (protium, deuterium, and tritium).

Do all elements' isotopes have names such as protium, deuterium, tritium? (KA)

An additional source of confusion in naming the isotopes of hydrogen is that protium has no neutrons, deuterium has one, and tritium has two. The prefixes pro, deu, and tri refer to the mass number (protons + neutrons), not the number of neutrons.

Did they get their names from how many protons and neutrons they have in their nucleus? (does pro stand for “one”, deu stand for “two”, and tri “stand for 3”?) (KA)
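A worked restatement of the naming logic (standard chemistry, summarized here for clarity):

\[
A = Z + N:\qquad {}^{1}_{1}\mathrm{H}\ \text{(protium, } N = 0\text{)},\qquad {}^{2}_{1}\mathrm{H}\ \text{(deuterium, } N = 1\text{)},\qquad {}^{3}_{1}\mathrm{H}\ \text{(tritium, } N = 2\text{)}
\]

The prefixes thus track the mass numbers 1, 2, and 3, not the neutron counts 0, 1, and 2.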

In addition, questions about the atomic mass of hydrogen, and about whether more isotopes exist beyond tritium, were also present in the data.

The theme of hydrogen was absent from the literature, even though its presence in instruction means it is frequently encountered when learning about isotopes. Both the absence of neutrons in protium and the special naming convention for isotopes of hydrogen lead to a large number of learner-generated questions.

Notation and naming

Notation for isotopes presents a challenge for learners, who are most familiar with the notation for elements on the periodic table. On the periodic table the atomic number usually appears above the element symbol, with the average atomic mass below. Isotopes of a specific element are written differently, with the atomic number below the element symbol and the mass number of the specific isotope on top (Fig. 4).
Fig. 4 Notation on periodic table and isotopic notation.
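In symbols, the isotopic notation of Fig. 4 can be summarized as:

\[
{}^{A}_{Z}\mathrm{X},\qquad \text{e.g. } {}^{12}_{6}\mathrm{C},\ {}^{13}_{6}\mathrm{C},\ {}^{14}_{6}\mathrm{C},
\]

where X is the element symbol, Z is the atomic number (protons), and A is the mass number (protons + neutrons).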

Further, we can refer to carbon by its atomic mass (a weighted average over the abundances of the isotopes of the element) as seen on the periodic table, or we can name an isotope of carbon by its mass number, for example Carbon-12, Carbon-13, or Carbon-14, depending on the number of neutrons present in the nucleus (Table 4).

Table 4 Notation and naming questions
PAA: “How are isotopes represented?”; “What is a nuclide symbol?”; “How do you write the name of an isotope?”; “How is the mass number of an isotope expressed in the name of an atom?”
Khan (codes): 96/770 (12%). Examples: “Are there any other special names for isotopes of other (non-hydrogen) elements?”; “What is carbon 12 called?”
Quora (codes): 13/600 (2%). Examples: “What does an isotope symbol represent?”; “Do all isotopes of elements get a unique name?”
Yahoo!Answers (keyword sort): 2240/29,213 (8%). Examples: “How are isotopes named?”; “What is the isotope name of the element zinc?”


Most of the naming and notation questions in the KA data (79/96) related to the hydrogen atom and protium, deuterium, and tritium. This is a result of hydrogen being used as the example to introduce isotopes to learners. For Q, naming and notation questions were present but not as prominent.

Questions about naming and notation from Y!A were more common and, compared to KA, more varied, with less emphasis on the special names for the isotopes of hydrogen. Although 8% of Y!A questions contained the word “isotopes” together with “naming” or “notation”, the actual percentage is lower because questions such as “Can you name three uses of isotopes?” also fulfil the search criteria.

Formation of isotopes

The formation of isotopes, including why they form, is closely related to the role of neutrons. Overall, learner-generated questions center on the origins of isotopes and on how easy or hard it is to add or remove neutrons to form isotopes (Table 5).
Table 5 Questions about the formation of isotopes
PAA: “How does an atom become an isotope?”; “How do atoms gain or lose neutrons to make isotopes?”; “How are isotopes made?”; “Do isotopes occur naturally?”; “How are isotopes formed? In other words, how do atoms gain neutrons?”
Khan (codes): formation 35/770 (5%); why isotopes exist 6/770 (<1%). Examples: “How exactly do scientists ‘take out’ a neutron?”; “Where do elements get extra neutrons?”
Quora (codes): formation 30/600 (5%); why isotopes exist 7/600 (1%). Examples: “Can radioactive isotopes be made for any element?”; “Do colliding neutron stars produce new elements or isotopes?”; “Is it possible to artificially produce any isotope of an atom from other isotopes?”
Yahoo!Answers (keyword sort): formation counts are not reliable, as the term “formation” is used frequently with “isotopes” in relation to radioisotope dating and the “formation” of the earth; counts for why isotopes exist are not reliable due to the common occurrence of “why” and “exist” along with “isotopes”.
Alignment with existing literature: theme not found.


Questions about the formation of isotopes and the role of neutrons are consistent across the PAA, KA, and Q data. To a lesser extent, learners also ask why isotopes exist. These questions are noteworthy as they are not typically covered in school science and are therefore likely driven by curiosity; such questions are sometimes termed “wonderment” questions (Scardamalia and Bereiter, 1992). For example, a learner in the KA data asked:

what makes an atom to have more or less neutrons? could one just keep adding or taking them away from the nucleus? and what is the biological use/impact of having more or less neutrons in an atom? (KA)

Such questions provide insights into learners’ thinking at a deeper level and can be theoretically generative. In the question above, the learner is seeking to understand the mechanism by which isotopes form and their influence in biological systems. As in other learner-generated questions, they are unsure how easy it is to add neutrons to, or remove them from, atoms. For example, a different learner asked:

How exactly do scientists “take out” a neutron? Do they use some type of mini tweezers or something? (KA)

The addition or removal of neutrons may seem reasonable to learners since the formation of ions, where electrons are lost or gained, is often mentioned when teaching about isotopes. This confusion can be seen in the following two questions:

How are neutrons exchanged between atoms, and can you have atoms with less neutrons than normal? (KA)

What varies in the different isotopes (form)? I thought that changing the number of electrons affect the form of an element? (KA)

Here learners are conflating neutrons (which are not exchanged between atoms) and electrons (which are exchanged in the formation of ionic bonds). It is possible that this is related to the learner confusion about ions and isotopes noted earlier. Note that learner-generated questions regarding ions are not included in this theme.

The Q data differed from the KA data in that there were no questions about the addition or removal of neutrons and more questions about the actual formation process.

Findings also reported in the literature

The following themes were found in both the current study and the research literature.

Stability and decay, half-life, and radioactivity

Learner-generated questions about stability/decay, half-life, and radioactivity are common across all data sources with the exception of KA data. Further, these terms often occur together in questions, for example, “Will all radioactive isotopes eventually decay into stable isotopes?” or “We only talk about radioactive elements having a half life. Does a radioactive atom actually disappear completely?” In this sense the terms can be thought of together as a general theme (Table 6).
Table 6 Stability and decay, half-life, and radioactivity questions
PAA: “Are all isotopes unstable?”; “How do you find the half-life of an isotope?”; “What is the most dangerous radioactive isotope?”; “What exactly does it mean for an atom to be stable?”; “How do we know which isotopes are radioactive?”; “What makes isotopes radioactive?”
Khan (codes): stability and decay 11/770 (1%); half-life 0/770 (0%); radioactivity 4/770 (<1%).
Quora (codes): stability and decay 69/600 (12%); half-life 23/600 (4%); radioactivity 48/600 (8%). Examples: “How are the stable isotopes of chlorine formed?”; “How are half-lives determined for very long-lived radioactive nuclides?”
Yahoo!Answers (keyword sort): “isotopes” and “stable” 3641/29,213 (12%); “isotopes” and “half-life” 5219/29,213 (18%); “isotopes” and “radioactivity” 1012/29,213 (3%); “isotopes” and “radioactive” 7592/29,213 (26%). Note: there is overlap in reported percentages for keyword sorts.
Alignment with existing literature (alternative conceptions reported in Tekin and Nakiboglu, 2006): isotopes with specific atomic numbers affect the stability of the nucleus; isotopes with the longest half-life are the least stable; radioisotopes are only used for energy production since they are dangerous to humans; radioactive decay depends on physical conditions.


While absent from the KA data, in the Q and Y!A data learners frequently have questions about stability, half-life, and radioactivity, indicating that these ideas are important to learners. The absence from the KA data is of interest: a search of the KA video transcripts found no instances of the terms stable/stability, decay, half-life, or radioactive. In contrast, learners frequently ask general questions about the atom in response to the two KA videos (see the theme, general questions about the atom), even though these are also not addressed in the videos. It is likely that a general understanding of the atom is necessary to understand isotopes, whereas stability, decay, half-life, and radioactivity are not; learners therefore ask general questions about the atom because they need this information to understand isotopes.

The literature reports related findings, although limited ones (Tekin and Nakiboglu, 2006). None of the alternative conceptions reported there were detected in the data coded for this study. However, a keyword sort of Q and Y!A for “radioactive” and “temperature”, as well as “radioactive” and “pressure”, did detect a small number of instances. After manually filtering out unrelated questions, only four questions specifically asked about the relationship between isotopes, radioactivity, and pressure or temperature.

Of note is that both temperature and pressure were distractors in the multiple-choice instrument used by Tekin and Nakiboglu (2006). This suggests that, while the idea is attractive to learners, it is not something they generate on their own, which is likely why it was not detected in the current study.

Other types of questions

Several other themes of note emerged during analysis. Questions about the number of isotopes were frequent across data sources, indicating that this is something learners deem important, although it is not clear why they ask. Of theoretical interest is the idea of a “standard” or “original” isotope, which was present in all data sources, including the literature (Table 7).
Table 7 Other questions related to isotopes
Number of isotopes: PAA “Which elements have no isotopes?”; Khan 40/770 (5%); Quora 45/600 (8%).
Standard element: not present in PAA data; Khan 14/770 (2%); Quora 6/600 (1%).
Subatomic particle calculations: PAA “How do you find the number of neutrons in an isotope?”; Khan 41/770 (5%); Quora 12/600 (2%).
Uses of isotopes: PAA “How are isotopes used?”; Khan 14/770 (2%); Quora 80/600 (13%).
Yahoo!Answers (keyword sort): data are not reported due to the occurrence of common search terms as well as overlapping terms.
Alignment with existing literature (alternative conceptions reported in Schmidt et al., 2003): isotopes differ from standard atoms and are less stable; standard atoms have equal numbers of protons and neutrons.


While questions about the number of isotopes for an element (e.g. “How many isotopes does tin have?”) are more frequent, of greater interest is the idea of a standard or original isotope. Here learners consider isotopes to be variations on one primary form of an element, often not considering that primary form to be an isotope itself.

For example:

Are all atoms isotopes, or are isotopes variants off of one standard atom? For example, would carbon 12 be the “standard” carbon atom, and carbon 13 an isotope OF carbon 12? Or is there no standard, and both carbon 12 and 13 are isotopes? (KA)

In this question the learner is unsure about the existence of a “standard” isotope of carbon, although they do reason that there may be no “standard” isotope at all. Other learners use the terms “original” and “normal” instead of “standard”.

Are there any isotopes that have less neutrons than the original element? (Q)

Is an isotope just the variations of the atom? So there is a normal common atom and then the variations are the isotopes… is that right? (KA)

Occasionally the original isotope is seen as having an equal number of neutrons and protons, something also found in the literature (Schmidt et al., 2003). For example:

Aren't there supposed to be equal number of protons and neutrons in every atom? (KA)

An isotope is defined as a nucleus with an unequal number of protons and neutrons. Could somebody explain to me why deuterium, e.g., has the status of an isotope although it has one proton and one neutron? (Q)

The theme of subatomic particle calculations was present in the KA data and, to a lesser extent, in the Q data. For example:

How do you work out the number of protons, neutrons and electrons in isotopes? (KA)

How many protons, electrons, and neutrons does isotope 79 Br + have? (Q)

This is not surprising, as this is a task students are frequently asked to complete in academic settings.

The theme of uses of isotopes was common in the Q data (13% of questions) but seldom seen in the KA data (2%).

For KA, most of the questions were about general uses of isotopes. For example:

What are some uses of isotopes? (KA)

Do isotopes have different uses or are the all used for the same thing? (KA)

The Q questions were primarily about general uses, medical uses, and the dangers or disadvantages of isotopes, such as:

Which isotopes are used as radioactive tracers? (Q)

What isotope is used to treat Leukemia? (Q)

What are the health consequences of exposure to isotopes? (Q)

Finally, while coding it was noted that questions about the separation of isotopes (15/600 or 2%) and the use of spectroscopy (12/600 or 2%) were present in the Q data, suggesting that these concepts are of interest to more advanced learners. Interestingly, searching Y!A for the keywords “isotope separation” (204/29,213 or 1%) and “isotope spectra” (369/29,213 or 1%) also detected similar learner questions, indicating a diversity of question complexity in the Y!A data.

Summary of primary themes

A summary of the primary themes is provided in Table 8 to facilitate synthesis across the study and assist the reader in engaging with the overall conclusions.
Table 8 Summary of primary themes
Theme | Percentages in current study | Alignment with literature
Isotopes, ions, and neutral atoms | KA: 17%, Q: 3%, Y!A: 7% | Not found.
General questions about the atom | KA: 16%, Q: 3%, Y!A: N/A | Not found.
Hydrogen | KA: 10%, Q: 8%, Y!A: 11% | Not found.
Notation and naming | KA: 12%, Q: 2%, Y!A: 8% | Not found.
Formation of isotopes | KA: 5%, Q: 5%, Y!A: N/A | Not found.
Stability | KA: 1%, Q: 12%, Y!A: 12% | Tekin and Nakiboglu (2006).
Half-life | KA: 0%, Q: 4%, Y!A: 18% | Tekin and Nakiboglu (2006).
Radioactivity | KA: <1%, Q: 8%, Y!A: 29% | Tekin and Nakiboglu (2006).
Standard atoms have equal numbers of protons and neutrons | KA: 2%, Q: 1%, Y!A: N/A | Schmidt et al. (2003).


Discussion

The goal of this study is to develop a methodology that uses existing online sources of learner-generated science questions to inform the design of research studies and the development of instrumentation. To accomplish this, learner-generated questions on the topic of isotopes were investigated, and findings were compared to the extant research literature both to establish the validity of the methodology and to explore practical aspects of its use.

Data on learner-generated questions for the topic of isotopes were investigated across four different online contexts. Findings about learner understanding of isotopes, while interesting, will primarily be discussed here as a means to illustrate the use and limitations of the methodology proposed in this study.

Within the context of providing robust and ecologically valid empirical data at the initial stages of research, or as a means to explore the validity of existing research, the methodology proposed in the current study is shown to be productive. In comparing findings with the extant research literature, it was possible to detect the majority of findings from the literature as well as to identify many new findings. Further, because the data are based on a larger sample, it was also possible to quantify the occurrence of these findings.

Divergence from literature findings

Rather than focus only on what was found, it is useful to look at what was not detected by the proposed methodology. In this study the methodology failed to find any mention that diamond and graphite are isotopes of carbon, as was found in the Schmidt et al. (2003) study. This is of concern since there were over 3000 participants in their study. There are two possible explanations. First, the methodology in the current study may not be representative outside of the online environment. A second, and testable, possibility is that the alternative conception found in the Schmidt et al. (2003) study was a result of how the instrumentation was developed and is not present in other contexts.

If students are confused about isotopes and their relationship to diamond and graphite, as seen in the Schmidt et al. (2003) study, this should be present in some form in the questions asked online by a larger sample of learners. A sort of over 30,000 questions in Q and Y!A found only five instances. This strongly suggests that the manner in which their instrument was developed produced this finding; that is, the distractors in the multiple-choice instrument were attractive to students although not something learners would have considered independently. This was also the case for the finding that temperature and pressure affect the rate at which isotopes undergo radioactive decay (Tekin and Nakiboglu, 2006).

Methodologically, this should serve as a caution in the development of research instruments. Research findings are influenced by decisions about what to include in, and what to leave out of, the instruments used to collect data. From a practical standpoint, this suggests that the use of existing data sources, along with the research literature, can provide empirical information to guide pilot studies and lead to more robust instruments, study designs, and findings.

“Wonderment” questions

During data analysis it became apparent that a subset of questions, found across data sources, was what Scardamalia and Bereiter (1992) termed “wonderment” questions. While the number of these questions is limited, we see them as particularly useful in the development of research instruments such as interview, think-aloud, and multiple-choice items. For example, learners asked:

… would one kind of oxygen isotope have a different effect in our body when we breathe it than another? (KA)

Is a “periodic table of isotopes” thinkable? (Q)

Do atoms keep their isotopes when they bond with another atom? (KA)

Questions like these are likely to be interesting to learners and theoretically useful in eliciting their thinking about isotopes. This is especially true when working with students who have achieved a basic understanding of the topic. While these occur in classroom contexts, they are rare (Carr, 1998; White and Gunstone, 1992, p. 170) and methodologically difficult to capture at the frequency found in online learner-generated questions.

Implications for research

The current study uses a new methodology and is guided by the learning progression (LP) framework. In particular, conceptualizing LP development as consisting of five stages (Jin et al., 2019), the use of learner-generated questions is most appropriate for the initial, or pilot, stage. In addition, the methodology presented here is flexible and would also be of value in researching learner understanding of topics in other contexts, such as informal science education and science, technology, and society, and in disciplines other than chemistry.

For researchers considering using online learner-generated questions, analysis will vary with research questions, available time, and theoretical framework. In this study the analysis was necessarily extensive in order to make a valid comparison with the extant literature and to establish the methodology. While the findings presented here benefited from this in-depth analysis, a more cursory look at the data would have detected many of the larger themes, as sketched below. A guide that can be modified to meet varying research contexts is given in the Appendix (ESI).
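As an illustration of such a cursory first pass, the sketch below tallies rough theme frequencies using keyword proxies. The theme names and keyword lists are hypothetical placeholders, not the coding scheme reported in this study.

```python
# Minimal sketch of a "cursory first pass": tally rough theme frequencies in a
# question corpus using keyword proxies. Theme names and keyword lists are
# hypothetical placeholders, not the coding scheme used in this study.
from collections import Counter

THEMES = {
    "atoms vs ions vs isotopes": ["ion", "neutral atom", "charge"],
    "radioactive decay": ["decay", "half-life", "radioactive"],
    "hydrogen isotopes": ["deuterium", "tritium", "heavy water"],
}

def theme_counts(questions):
    """Count how many questions touch each keyword-defined theme."""
    counts = Counter()
    for q in questions:
        text = q.lower()
        for theme, keywords in THEMES.items():
            if any(k in text for k in keywords):
                counts[theme] += 1
    return counts

# Illustrative usage with two made-up questions.
sample = [
    "What is the difference between an ion and an isotope?",
    "Why does tritium undergo radioactive decay?",
]
print(theme_counts(sample))
```

A pass of this kind surfaces candidate themes and their approximate frequencies; systematic qualitative coding would still be needed before drawing conclusions.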

In addition to topics found frequently in formal academic contexts, as is the case with isotopes, learner-generated questions are also accessible for other contexts. For example, topics not always covered in the classroom, such as CRISPR, robotics, nanotechnology, careers in science, or climate change, would benefit from the methodology. Further, data on the public understanding of science are also available and can be analyzed in a similar manner.

A logical next step is the development of instruments to determine whether themes that emerged from the analysis of online learner-generated questions, but that are absent from the literature, are present in other contexts. Instruments for interview, multiple-choice, and think-aloud protocols could establish the existence of these themes and extend the findings of this study. To investigate more sophisticated thinking about a concept, the inclusion of “wonderment” questions could also prove fruitful.

Future research should also address how a modified form of the proposed methodology influences the development of learning resources. In addition to practicing teachers and curriculum developers, this extends to the wide range of individuals producing open educational resources and content for web and video sharing platforms.

Implications for instruction

For teachers, a simplified approach to using online student-generated questions could be incorporated into existing professional development (PD), such as lesson study, individual or collaborative research, or PD focusing on science content. Exploring questions about a topic they are currently teaching, or plan to teach, would have an immediate impact on instruction. Within the typical duration of a PD experience, teachers would likely be able to detect the major themes for a topic, such as the differences between neutral atoms, ions, and isotopes found in this study.

Assessment writers will find student-generated questions valuable for developing meaningful formative and summative assessments of student learning. In a formative context these questions can inform the diagnosis of areas where students need additional support. For example, in this study general questions about the atom suggest that formative assessment about the structure of the atom could guide teaching. For summative assessment, item writing would benefit from access to the questions students ask, especially in identifying plausible distractors for multiple-choice items (Breakall et al., 2019).

Implications for developing education resources

Curriculum developers, including the developers of instructional resources such as videos, simulations, or animations, could benefit from a more thorough exploration of learner-generated questions coupled with the use of existing research literature. For example, noting the many questions about the unique aspects of hydrogen, a curriculum developer may choose to begin instruction with the isotopes of a different element and return to hydrogen later.

In addition to curriculum development, Open Educational Resources (OERs) are a growing part of the educational landscape. OERs are often developed through a crowd-sourced model (Allen et al., 2015), with individual authors having a wide range of teaching and content expertise. For these authors, a simplified version of the online learner-generated question methodology would provide a fuller context for creating more effective content that addresses learners’ needs.

Limitations

There are several limitations to the methodology and the findings of this study. First, we know little about the learners asking the questions. While it is assumed they represent a wide range of students, from secondary through university level, it is not feasible to investigate their gender, age, nationality, or socioeconomic status. There is also the possibility that learners asked more than one question or asked questions across data sources, although this does not appear to be common. Further, in many cases we do not know whether a question originated with the learner or was prompted by a school assignment, such as homework or a project.

A second limitation is the availability of online learner-generated questions. While content exists for most major science topics, large datasets may not be available for all topics; this may be the case for the nature of science or for scientific practices such as argumentation and critical thinking.

As noted by Lincoln and Denzin (2000), analyzing content through coding assumes that the categories are precise, which is often not the case. As such, in the current study there is likely some overlap between themes. Further, it is possible that, once established, categories may distract from other themes present in the data (Atkinson, 1992), and that not all themes in the data were detected in this study.

Finally, in the current study, to limit researcher bias during coding, the review of the literature was conducted after coding was complete. An unanticipated outcome of this strategy was the discovery that limited research was available on learner understanding of isotopes. Other topics, with a larger research base, could have afforded a more in-depth comparison with the findings of the current study.

Conflicts of interest

There are no conflicts to declare.

References

  1. Adamic L. A., Zhang J., Bakshy E. and Ackerman M. S., (2008), Knowledge sharing and Yahoo Answers: everyone knows something, in Proceedings of the 17th International Conference on World Wide Web, ACM, pp. 665–674.
  2. Allen G., Guzman-Alvarez A., Smith A., Gamage A., Molinaro M. and Larsen, D. S., (2015), Evaluating the effectiveness of the open-access ChemWiki resource as a replacement for traditional general chemistry textbooks, Chem. Educ. Res. Pract., 16(4), 939–948.
  3. Atkinson P., (1992), The ethnography of a medical setting: Reading, writing, and rhetoric, Qual. Health Res., 2(4), 451–474.
  4. Baram-Tsabari A., Sethi R. J., Bry L. and Yarden A., (2006), Using questions sent to an Ask-A-Scientist site to identify children's interests in science, Sci. Educ., 90(6), 1050–1072.
  5. Breakall J., Randles C. and Tasker R., (2019), Development and use of a multiple-choice item writing flaws evaluation instrument in the context of general chemistry, Chem. Educ. Res. Pract., 20, 369–382.
  6. Breslyn W., McGinnis J. R., McDonald R. C. and Hestness E., (2016), Developing a learning progression for sea level rise, a major impact of climate change, J. Res. Sci. Teach., 53(10), 1471–1499.
  7. Breslyn W., Drewes A., McGinnis J. R., Hestness E. and Mouza C., (2017), An empirically-based conditional learning progression for climate change, Sci. Educ. Int., 28(3), 214–223.
  8. Çalik M., Ünal S., Costu B., Dede N. and Ayas A., (2009), Investigating effectiveness of analogies embedded within four-step constructivist teaching model: a case of the ‘atom’ concept, J. Sci. Educ., 10, 36–40.
  9. Carr D., (1998), The art of asking questions in the teaching of science, Sch. Sci. Rev., 79, 47–50.
  10. Chin C. and Osborne J., (2008), Students' questions: a potential resource for teaching and learning science, Stud. Sci. Educ., 44(1), 1–39.
  11. Collins A., Joseph D. and Bielaczyc K., (2004), Design research: Theoretical and methodological issues, J. Learn. Sci., 13(1), 15–42.
  12. Commeyras M., (1995), What can we learn from students’ questions? Theory Pract., 34(2), 101–106.
  13. Duncan R. G. and Hmelo-Silver C. E., (2009), Learning progressions: Aligning curriculum, instruction, and assessment, J. Res. Sci. Teach., 46(6), 606–609.
  14. Duschl R. A., Schweingruber H. A. and Shouse A. W., (2007), Taking science to school: Learning and teaching science in grades K-8, Washington, DC: National Academies Press.
  15. Graesser A. C. and Person N. K., (1994), Question asking during tutoring, Am. Educ. Res. J., 31(1), 104–137.
  16. Griffith J., Vercellotti M. L. and Folkers H., (2019), What's in a question? A comparison of student questions in two learning spaces, Teach. Learn. Commun. Sci. Disord., 3(1), 7.
  17. Holtz P., Kronberger N. and Wagner W., (2012), Analyzing internet forums, J. Media Psychol., 24(2), 55–66.
  18. Huang R., (2018), RQDA: R-based Qualitative Data Analysis. R package version 0.3-1, URL http://rqda.r-forge.r-project.org.
  19. Jin H., van Rijn P., Moore J. C., Bauer M. I., Pressler Y. and Yestness N., (2019), A validation framework for science learning progression research, Int. J. Sci. Educ., 1–23.
  20. Lincoln Y. S. and Denzin N. K., (2000), The handbook of qualitative research, Sage.
  21. Matias Y., Keysar D., Chechik G., Bar-Yossef Z. and Shmiel T., (2017), US Pat. 9,679,027, Washington, DC: U.S. Patent and Trademark Office.
  22. Pedrosa De Jesus H. P., Teixeira-Dias J. J. and Watts M., (2003), Questions of chemistry, Int. J. Sci. Educ., 25(8), 1015–1034.
  23. Pratt J. M. and Yezierski E. J., (2018), A novel qualitative method to improve access, elicitation, and sample diversification for enhanced transferability applied to studying chemistry outreach, Chem. Educ. Res. Pract., 19(2), 410–430.
  24. Ronfard S., Zambrana I. M., Hermansen T. K. and Kelemen D., (2018), Question-asking in childhood: A review of the literature and a framework for understanding its development, Dev. Rev., 49, 101–120.
  25. Saldaña J., (2015), The coding manual for qualitative researchers, Sage.
  26. Scardamalia M. and Bereiter C., (1992), Text-based and knowledge-based questioning by children, Cogn. Instr., 9(3), 177–199.
  27. Schmidt H. J., Baumgärtner T. and Eybe H., (2003), Changing ideas about the periodic table of elements and students' alternative concepts of isotopes and allotropes, J. Res. Sci. Teach., 40(3), 257–277.
  28. Tekin B. B. and Nakiboglu C., (2006), Identifying students' misconceptions about nuclear chemistry. A study of Turkish high school students, J. Chem. Educ., 83(11), 1712.
  29. U.S. Nuclear Regulatory Commission, (2000), The Regulation and Use of Radioisotopes in Today's World, Washington, DC: Office of Public Affairs.
  30. Van Teijlingen E. and Hundley V., (2001), The Importance of Pilot Studies, Social Research Update, issue 35, 1–4.
  31. Watts M., Gould G. and Alsop S., (1997), Questions of Understanding: Categorising Pupils' Questions in Science, Sch. Sci. Rev., 79(286), 57–63.
  32. White R.T. and Gunstone R.F., (1992), Probing understanding, London: Falmer Press.
  33. White R. W., Richardson M. and Yih W. T., (2015), Questions vs. queries in informational search tasks, in Proceedings of the 24th International Conference on World Wide Web, ACM, pp. 135–136.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/c9rp00145j
