Wayne Breslyn
Department of Teaching and Learning, Policy and Leadership, University of Maryland, 2311 Benjamin Building, College Park, Maryland 20742, USA. E-mail: wbreslyn@umd.edu
First published on 7th November 2019
Publicly available, learner-generated questions were used to develop a methodology for advancing the exploratory stages of science education research. Data from four complementary online sources were collected, analyzed, and compared to the extant research literature on the science education topic of isotopes, a challenging concept for many chemistry learners. Data from People Also Ask features on a popular search engine, questions in response to two videos (n = 770), and questions posted to two question and answer websites (n = 600 and n = 29213) were analyzed. Multiple findings not present in the literature were detected across all data sources in this study. Findings suggest that these online sources can serve to inform research in science education by providing a rich, ecologically valid, and accessible source of data. Implications include the use of existing online data prior to initiating research, writing assessments, and developing curriculum for science topics as a means to achieve more robust and generalizable findings and instructional resources.
A challenge faced by researchers is accessing a diverse group of participants and obtaining meaningful data. In their development of a novel methodology to investigate chemical education outreach, Pratt and Yezierski (2018) note that access, as well as eliciting meaningful data, is a limiting factor in many research studies. Our research suggests that for many studies the use of online learner-generated questions can be particularly effective for addressing this challenge.
A further challenge is the development of instruments and research designs that accurately identify and measure a learner's understanding of a topic. Typically, pilot studies with small samples are conducted, and the outcomes, along with relevant literature, are then used to develop the design and instrumentation of a larger study (Van Teijlingen and Hundley, 2001). While pilot studies are useful for developing ideas, generalizable inference is limited by sample size, resources, and access to participants. Although pilot studies are valuable, caution is necessary because they can have an unwarranted influence on research instrumentation, data analysis, and ultimately the findings of the larger study.
The current study offers a potential means to address these methodological problems. The proposed methodology uses existing data sources to augment pilot work and strengthen instrument development and study design. The goal is not to replace more traditional research methodologies; instead, the goal is to provide insights into learners’ thinking based on a large, existing dataset prior to initiating research, assessment, or curriculum development.
Three primary categories of resources are addressed: search-engine-generated People Also Ask (PAA) questions (http://www.Google.com), questions asked by students in response to two videos on an education website (http://www.KhanAcademy.com), and questions asked on two popular question and answer websites (http://www.Quora.com, http://www.Answers.Yahoo.com). In this study the topic of isotopes was investigated.
The topic of isotopes was selected for three primary reasons. First, it provides a manageable breadth of content while still being theoretically and methodologically generative. Second, the topic is frequently taught at the secondary and university level in both chemistry and physics courses. Finally, isotopes have a wide range of applications outside the classroom, such as medicine, nuclear energy, and radiocarbon dating (U.S. Nuclear Regulatory Commission, 2000). The topic of isotopes therefore allows for a focus on methodology as well as the generation of knowledge that can be applied to chemistry and physics education.
To explore the validity of the methodology proposed in this study, a comparison was made between the research literature on students’ understanding of isotopes and findings from the four data sources used in this study. The primary research question addressed was:
How do data from PAA suggested questions and learner-generated online questions compare to the research literature on learners’ understanding of isotopes?
In addition to making a comparison to the extant literature, a further goal was to detect and describe learner ideas about isotopes not present in the research literature.
In this study we were drawn to conceptualizing learning progression (LP) research as being composed of five stages, with validity being established through evidence from various sources (Jin et al., 2019). In this framework the proposed five stages are Development, Scoring, Generalization, Extrapolation, and Use. Each stage is based on assumptions that guide the research. For example, in the Development stage it is assumed that important concepts are addressed in the instrumentation used to collect evidence and that the instrumentation is effective in detecting student reasoning.
In the current study, the use of existing learner-generated questions is most applicable to the Development stage of LP research. It is at this stage that decisions are being made that can have a considerable impact on the remaining stages of research.
In the Development stage, Jin et al. include expert review, think-aloud, and clinical interviews as the evidence used to guide research. We see the use of existing learner-generated questions as an initial form of evidence to support the Development stage by providing a large and accessible, ecologically valid sample. Such a sample would strengthen the development of instrumentation for think-aloud, clinical interviews, etc. and therefore subsequent stages of research.
Students’ questions provide insights into their knowledge, understanding, and puzzlement, and act as a window into their minds.
Considerable research exists on the types of questions teachers ask their students, but research on learner-generated questions is sparse (Ronfard et al., 2018). It has been suggested that this is because students typically do not ask enough questions to be theoretically or practically generative (Commeyras, 1995; Graesser and Person, 1994). A strength of the methodology proposed in the current study is the diversity and volume of learner-generated questions available online for a wide range of science topics.
Although no studies were found in which learners generated questions about isotopes in an online environment, a number of studies have classified the types of questions students ask. For example, in a comparison of a traditional lecture and an interactive learning environment, Griffith et al. (2019) categorized student questions as non-content, foundational knowledge, or application knowledge. Foundational knowledge questions were considered to be of a lower cognitive level than application questions. Pedrosa De Jesus et al. (2003) categorized student questions as either confirmation or transformation questions. Confirmation questions sought clarification, whereas transformation questions involved the restructuring or reorganization of knowledge. In their study of elementary students’ text-based and knowledge-based questions, Scardamalia and Bereiter (1992) described what they termed “wonderment” questions. These questions are asked when students are curious or puzzled and are thought to have a larger impact on advancing understanding. Because learners tend to ask questions when they feel secure (Watts et al., 1997), we speculate that an online environment may result in more wonderment questions.
While there is little research available on the specific use of online questions, one useful example is the work of Baram-Tsabari et al. (2006) investigating children's interest in science. Using data from a popular online question and answer website, the authors analyzed 1555 learner-generated questions. These questions were coded based on disciplinary area of learner interest (e.g. biology, chemistry). Because the authors were able to access demographic information, they were able to investigate how gender and age related to the content area of questions asked.
The study offers several methodological insights. First, it is not always possible to determine why a question was asked by the learner. The authors attempted to differentiate between what they termed “spontaneous” and “school related” questions. Unless explicitly stated to be for a school assignment, questions were classified as “spontaneous”. This highlights a limitation to be considered when using online data. Second, as the authors note, many learners who use question and answer websites find their question already answered on the site. As such, these questions are not available for analysis in the data. Finally, the self-selected sample of learners asking questions necessitates caution in generalizing findings to learners in other contexts.
For example, Schmidt et al. (2003) asked students about the number of electrons and neutrons in atoms of graphite and diamond. Possible responses were that graphite and diamond had the same number of neutrons, the same number of electrons, both, or neither. To answer the question correctly students must understand that graphite and diamond consist of carbon atoms in different arrangements. The carbon atoms will have the same number of protons and electrons; however, within a sample of graphite or diamond, isotopes of carbon with varying numbers of neutrons would exist.
Of interest is how instrumentation was developed in the Schmidt et al. (2003) study and its influence on their findings. The multiple-choice questions were developed based on past tests from examination boards in the United Kingdom and United States. Items were piloted with a small group prior to the main study. The collection of data from the multiple-choice instrument was followed by interviews (n = 6) based on the responses to the multiple-choice questions. The study therefore ultimately relies on concepts about isotopes as conceived by the authors of the examination board test items. This places constraints on what is measured and found in the study.
Analysis of multiple-choice data indicated that students experienced difficulties with three concepts related to isotopes.
• Students considered there to be “standard atoms” and isotopes of these atoms. Students often thought that standard atoms have the same number of protons and neutrons and are more stable. In addition, standard atoms were thought to have integer mass numbers and to be the atoms found on the periodic table.
• Students considered graphite and diamonds to be isotopes of carbon.
• Students confused the terms “isotope” and “isomer”.
Interviews, developed based on findings from the multiple-choice instrument, confirmed students’ alternative conception of a standard form of an element with equal numbers of protons and neutrons. Interviews also confirmed student confusion between isotopes and allotropes of carbon. No new findings were reported based on interview data.
It is important to note that the intent of the study was to determine why students accepted inaccurate statements about isotopes developed in the pilot study rather than to identify alternative conceptions in a broader context.
As such, these findings may be due to the nature of the instrumentation and its development. The authors state that students may have been drawn to distractors and led to believe that the number of protons and neutrons were the same. Because of this, it is likely that the development of the multiple-choice questions resulted in a contextually limited description of student understanding of isotopes.
Çalik et al. (2009) conducted a study on the use of analogies in understanding the ‘atom’ concept using a pre-post multiple choice instrument. A total of 36 students took part in the study.
Three alternative conceptions held by some students were:
• For isotopes, the atomic masses are the same, but the atomic numbers are different.
• For isotopes, the atomic mass and atomic numbers are the same.
• Atoms of isotopes have the same neutron number.
Tekin and Nakiboglu (2006) conducted a study on Turkish tenth grade students’ misconceptions of nuclear chemistry (N = 157) using a seven-item multiple choice diagnostic test. Students were also asked to explain their thinking in writing for each of the seven questions. Their findings related to isotopes focus on stability and half-life. They found that many students thought:
• Isotopes with specific atomic numbers affect the stability of the nucleus.
• Isotopes with the longest half-life are the least stable.
• Radioisotopes are only used for energy production since they are dangerous to humans.
• The rate of radioactive decay depends on temperature and pressure.
As in Schmidt et al. (2003), existing chemistry content, in this case textbooks, was used for question development, along with unspecified classroom observations.
For all of the studies in this literature review, instrument development was guided primarily by past examinations or textbooks (Çalik et al., 2009; Schmidt et al., 2003; Tekin and Nakiboglu, 2006). A major criticism of this approach is that starting with these secondary sources can lead to biased data and findings that do not necessarily reflect learners’ thinking. In other words, by either omitting key aspects or by providing attractive distractors, the data and findings do not produce a full description of how learners think about the topic, common questions, obstacles to their learning, or what they find interesting.
The nature of online data has contextual implications and it is possible that the questions asked by learners will differ from those asked in the classroom or in response to interview questions. It is hypothesized that the asynchronous and anonymous nature of online questions will result in a more diverse group of self-generated questions, as students reluctant in a classroom setting may be more willing to participate in an anonymous online environment. However, the relative anonymity of the learner also places limits on the information available about each individual learner (Holtz et al., 2012). Specifically, data on gender, age, nationality, etc. are not accessible for analysis.
Together the sources provide complementary data that can be used to generate a description of learner understanding of a topic as well as common challenges and areas of learner interest. A description of each source is provided below.
Upon selecting a question, additional questions about the topic appear along with a link and information related to the question. These questions can also be selected, resulting in further questions, and the process can be repeated multiple times. This phenomenon is sometimes termed a PAA “black hole”, allowing multiple iterations and pathways for exploring the questions people ask about the topic of isotopes. Fig. 2 shows additional questions generated after selecting “What makes an isotope?”
PAA results are selected by the search engine as a function of how frequently the question has been asked (Matias et al., 2017). This means that the theme categories identified can be considered to be those most relevant and important to learners. This is similar to the “Customers who bought this item also bought…” feature on popular shopping websites.
Because the major search engines process several billion searches daily, the PAA questions can be considered to be generated from a substantial dataset. This dataset reflects a diversity of people searching, often in the form of questions. However, exactly how the PAA questions are generated is not publicly disclosed; the data are therefore useful for exploratory work but limited.
In general, the videos focus on academic topics and are of a high quality having been reviewed by content experts. The comments sections, where learners ask questions, are active, with questions still being asked and answered for older videos. Further, questions receive timely and high-quality responses providing incentive for learners to ask questions.
In the context of this study, KA provides questions posted by learners to the comments section of a specific instructional experience that can be accessed by the researcher. Thus it provides insights into how instruction influences the types of questions asked by learners. For Q and Y!A it is not possible to know what, if any, instructional experience led to the question. In addition, questions that would be filtered out of an internet search based on a specific keyword are present in the KA data. For example, a question like “Why isn’t the mass of electrons considered when calculating mass number?” provides insights into learners’ thinking but would not be returned in a search using the keyword “isotopes”. While KA questions are advantageous in their specificity, question and answer websites like Q and Y!A offer a broader range of questions but provide less contextual information.
Q, a for-profit company, was formed in 2010 and as of 2018 reported 300 million monthly users. Users can post and answer questions on the website, with the Q community responding, often in great detail.
For the topic of isotopes there is a range of questions; however, in general the level of sophistication is higher than on KA and Y!A. For example, questions about the separation of isotopes, biological uptake, or the use of spectroscopy to identify isotopes are more common. This is of value as it extends the range and diversity of questions available for analysis.
From a technical perspective, working with data from Q is more challenging because keyword sorts do not report the total number of questions, returning only a continuous list that eventually stops producing new questions (the limit appears to be around 600 questions). This prevents examining the relative number of questions for a given keyword.
In general, the questions and answers on Y!A have wide breadth but lack depth, as measured by degree of expertise (Adamic et al., 2008). In this study, for the topic of isotopes, questions on Y!A were found to be more oriented towards homework and school projects than questions found via PAA, KA, and Q. It is expected that this is the case for other science topics.
While generally lacking in depth, Y!A data is valuable because of the number of questions and for providing a different context in which questions are asked. With many questions related to homework or school projects, Y!A can be viewed as providing information about topics relevant to learners’ academic lives.
Y!A has the advantage of searching a large dataset and obtaining information on the numbers of questions asked for each keyword, something not possible with Q data. This allows for an idea of the relative numbers of each general type of question.
To extend and explore themes that emerged from coding, keyword searches are then conducted on a different question and answer website (Y!A) based on these codes. As needed, PAA, Q, and Y!A data are revisited in an iterative manner. Finally, a literature review is conducted to compare with the findings of the current study. Themes found in the literature, but not in the current study, are then explored further in PAA, KA, Q, and Y!A data. The process is described in Fig. 3.
Initially, data were collected from the PAA feature on the search engine http://www.Google.com. The topic “isotopes” was entered in the search box and a search initiated. This process was followed until the questions being generated did not offer any new question types (simple variations in syntax were excluded). Questions that did not relate to student understanding of isotopes were not included in the data set. For example, the questions “How much does it cost to get rid of radon?” or “Can goiter turn into cancer?” were excluded.
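A rough sketch of this collection procedure is shown below: PAA questions are expanded iteratively until no new question types appear. It is a minimal, illustrative sketch only; in the study this step was carried out manually through the search engine, so the helper function and the small example mapping here are hypothetical stand-ins (the example questions themselves are drawn from the data reported in this paper).

```python
from collections import deque

# Illustrative stand-in for live retrieval; the mapping itself is hypothetical,
# although the example questions appear in the study's data.
EXAMPLE_PAA = {
    "isotopes": ["What makes an isotope?", "Are ions and isotopes the same?"],
    "What makes an isotope?": ["How are isotopes represented?", "Do isotopes occur naturally?"],
}

def fetch_paa_questions(query):
    """Hypothetical helper: return the PAA questions displayed for a query."""
    return EXAMPLE_PAA.get(query, [])

def normalize(question):
    # Placeholder for the manual screening step: collapse simple variations
    # in syntax so rephrasings are not counted as new question types.
    return question.lower().strip().rstrip("?")

def collect_paa(seed="isotopes"):
    seen, collected, frontier = set(), [], deque([seed])
    while frontier:
        query = frontier.popleft()
        for q in fetch_paa_questions(query):
            key = normalize(q)
            if key not in seen:      # keep only genuinely new question types
                seen.add(key)
                collected.append(q)
                frontier.append(q)   # selecting a question surfaces further questions
    return collected

print(collect_paa())  # collection stops once no new question types appear
```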
Questions asked by learners in the comments section of two videos on the popular education website, http://www.KhanAcademy.com, were accessed. For both videos over four years of comments (June 10, 2014 to May 1, 2019) were accessible with a total of 770 relevant comments. Questions not related to student understanding of the topic of isotopes or the atom were excluded. For example, questions about balancing chemical equations were removed from the dataset.
For Q the first 600 questions returned for the term “isotopes” were collected. Inclusion and exclusion criteria similar to PAA and KA were used. For Y!A, to take advantage of the large number (29213) of questions asked that included the term “isotopes”, data were manipulated via the Y!A website. This was possible because sorting on Y!A returned a specific count of the number of questions asked for each sort, unlike Q data.
Prior to coding the entire dataset, a research assistant, familiar with the topic of isotopes but not a chemistry student or instructor, coded a subset of data in order to assess inter-rater reliability (IRR). Initially, to familiarize the second researcher with the codebook and to refine it further, the first and second researchers collaboratively coded a subset of data from each source. As a result, modifications, primarily collapsing codes and adding clarifying descriptions for the inclusion and exclusion criteria, were made to the codebook based on discussions between researchers.
To illustrate changes to the codebook several examples are provided. Initially there was a code for questions about the addition or loss of neutrons and a separate code for questions about how isotopes are formed. These codes were collapsed since both deal with changes that result in the formation of isotopes. Further, two codes addressing the topic of ions were collapsed as well. A code for questions asking about the charge for ions and a code for questions addressing the neutrality of atoms were collapsed into one code. As a final example, two separate codes existed for questions about the properties of isotopes, one for physical and one for chemical properties. To reduce the complexity of the codebook these codes were collapsed.
After modifications were made to the codebook, a different subset of data was independently coded by both researchers, and IRR was calculated and found to be above 90%.
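The study reports only that IRR exceeded 90%, without specifying the statistic; a minimal sketch of a simple percent-agreement calculation is given below, assuming each researcher's codes for the shared subset are stored as parallel lists. The coder lists and code labels are hypothetical.

```python
def percent_agreement(codes_a, codes_b):
    """Simple percent agreement between two coders over the same items."""
    if len(codes_a) != len(codes_b):
        raise ValueError("Both coders must code the same set of questions")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Hypothetical codes assigned by each researcher to the same ten questions
coder_1 = ["ions", "hydrogen", "formation", "ions", "naming",
           "stability", "ions", "atom_general", "hydrogen", "formation"]
coder_2 = ["ions", "hydrogen", "formation", "ions", "naming",
           "stability", "ions", "atom_general", "naming", "formation"]

print(f"IRR (percent agreement): {percent_agreement(coder_1, coder_2):.0f}%")  # 90%
```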
Data were coded using the open source analysis software RQDA (Huang, 2018), a package of the R statistical software program. Using RQDA, frequencies were calculated for each code.
Due to the volume of data available for Y!A, data were analyzed by sorting on keywords and keyword phrases. For example, the term “isotopes” returned 29213 questions. Further sorting on the keywords “isotopes, ions” returned 2050 questions. For “isotopes, ions, neutral” a total of 413 records were returned. Sorting in this manner provides a general idea of the prevalence of themes, although, because the sort encompasses both questions and responses from the community, the accuracy is lower than for manually coded data.
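The keyword-sort logic can be illustrated with the minimal sketch below, assuming the questions have been exported as plain strings; in the study the sorting was performed through the Y!A website interface, so the function and the toy question list here are hypothetical.

```python
def keyword_count(questions, keywords):
    """Count questions whose text contains every keyword (case-insensitive).
    Note: naive substring matching produces false hits (e.g. "ion" inside
    "question"); whole-word matching would be needed for accurate counts."""
    keywords = [k.lower() for k in keywords]
    return sum(all(k in q.lower() for k in keywords) for q in questions)

# Hypothetical exported questions; real counts came from the Y!A site itself.
questions = [
    "Can ions be isotopes?",
    "How can you tell if a element is an isotope, ion, or stable?",
    "Why is protium an isotope?",
]

total = keyword_count(questions, ["isotope"])
ions = keyword_count(questions, ["isotope", "ion"])
print(f"{ions}/{total} ({100 * ions / total:.0f}%)")  # toy output: 2/3 (67%); the study reported e.g. 2050/29213 (7%)
```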
Finally a review of the research literature on students’ understanding of the topic “isotopes” was conducted. The literature review was conducted after coding as a means to reduce researcher bias during coding and allow for a fairer comparison.
As the methodology is intended to be used in the pilot or exploratory stages of research, only code categories that reached 5% or more in either Q or KA data were reported. For example, codes for isotopes and the periodic table, spectroscopy, separation, isotopes of carbon or oxygen, definition of isotopes, examples of isotopes, and the chemical and physical properties of isotopes were below 5% and were not reported in the results. Depending on the context, a researcher using the methodology might decide to explore findings of a smaller grain size if they appear to be theoretically generative.
The theme of isotopes, ions, and neutral atoms was present in all data sources but most pronounced in KA. There were no instances of this theme found in the literature reviewed. Table 1 presents the frequency of occurrences across data sources as well as in the literature. Quotes are included to provide examples of typical learner questions. Because it is challenging to achieve meaningful counts for PAA themes, only the presence of the code, along with sample questions, is provided.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort) |
---|---|---|---|
Are ions and isotopes the same? | 130/770 (17%) | 20/600 (3%) | 2050/29213 (7%) |
Can an atom be an ion and an isotope? | Is it just assumed that a atom is neutral (obviously unless stated otherwise)? | How do ions and isotopes differ from atoms? | How can you tell if a element is an isotope, ion, or stable? |
Can isotopes exist as ions? | What do an atom, ion, and isotope of an element all have in common? | What are the differences between ions and isotopes? | Can ions be isotopes? |
While the theme occurred in all data sources except the literature, it was predominantly found in KA data. For KA this is likely due to inclusion of the terms “ions” and “neutral element” in both videos. Further, the first video specifically included practice problems involving ions. However, the frequency of ion related questions was consistent across questions from both videos.
It may be that learners viewing the KA videos are more focused on traditional academic questions, whereas the Q data tend to consist of more advanced questions. To further test this, data from Y!A were analyzed to detect whether questions about ions and isotopes were present.
For Y!A, of the 29213 questions containing the term “isotopes”, 2050 also contained the term “ions”. This represents 7%, less than the 17% from KA data, but still a meaningful number of learners. In addition, the keyword combination “isotopes” and “neutral” occurred 1118 times, more than half the count for “isotopes” and “ions”, although many questions contain both sets of search terms.
The absence of this theme from the literature may be due to the manner in which research instruments were developed in those studies. Unless learners spontaneously brought up the difference between ions and isotopes during piloting, it is unlikely researchers would have included the theme in instrumentation for the study.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort)
---|---|---|---
General atom questions are not generated in PAA sorting when searching on “isotopes”. | 120/770 (16%) | 16/600 (3%) | Because Y!A data was analyzed by keyword sorting, frequencies for general atom questions were not generated.
 | So the number of electrons doesn't change anything in the mass number? | How would you write the electron configuration an isotope of the element cobalt? | 
 | Can an atom have no electrons? | What is the difference between a proton, neutron and electron? | 
 | Just a basic question, how does an atom look like? | How do I calculate binding energy of a nucleus or isotope? | 
Due to the nature of searching on keywords in PAA, Q, and Y!A, most general questions about the atom were not returned in the search results. In contrast, because no sorting was used with KA data and all questions for both KA videos were coded, general questions about the atom were included in the dataset. To further clarify the importance of varying sources of data, one could imagine a learner studying isotopes who is confused about ions. In KA they would post a question such as “What are ions?” Since all KA learner-generated questions are coded, the question would be included in analysis. If the learner asked the same question on Q or Y!A websites, a computerized sort for the term “isotopes” would not return the question in the search results because “What are ions?” does not include the term “isotopes.”
While not surprising that general chemistry questions are asked when learning about isotopes, these types of questions have implications for teaching and are likely to be present when analyzing learner-generated questions for other specific chemistry topics.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort)
---|---|---|---
PAA questions involving “Hydrogen” do not appear in PAA searches for “isotopes”. | 79/770 (10%) | 46/600 (8%) | 3248/29213 (11%)
 | Doesn't deuterium have 2 neutrons and tritium have 3? | Why is the atomic mass of hydrogen isotopes fractional? | Of the 3248 questions containing “Hydrogen” and:
 | Is protium considered as an isotope? | Why is protium an isotope? | “Protium” 240/3248 (7%)
 | I thought protium, deuterium and tritium were the sub atomic particles of atom. | Which isotope of hydrogen produces a higher temperature flame, protium or deuterium? | “Deuterium” 1058/3248 (33%)
 | | | “Tritium” 826/3248 (25%)
Hydrogen is unique in that its most common isotope does not have any neutrons and that the first three isotopes of hydrogen have special names (protium, deuterium, and tritium). Questions about the lack of a neutron in protium are frequent in all sources of learner-generated questions. For example:
How does protium exist? If there are no neutrons, then doesn't the electron and proton get attracted to each other?
and
I don't understand why protium is a isotope if it is the same as hydrogen which has no neutron. (KA)
Why is protium an isotope? (Q)
In addition, learners frequently asked about the specific names for the primary isotopes of hydrogen (protium, deuterium, and tritium).
Do all elements' isotopes have names such as protium, deuterium, tritium? (KA)
An additional source of confusion in naming the isotopes of hydrogen is that protium has no neutrons, deuterium has one, and tritium has two. The prefixes pro, deu, and tri refer to the mass number (protons + neutrons), not the number of neutrons.
Did they get their names from how many protons and neutrons they have in their nucleus? (does pro stand for “one”, deu stand for “two”, and tri “stand for 3”?) (KA)
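For reference (a clarifying restatement, not part of the learner data), in standard nuclide notation the mass number A = Z + N appears as a left superscript, so the prefixes correspond to mass numbers 1, 2, and 3 rather than to neutron counts:

```latex
% A = Z + N: mass number = protons + neutrons
\text{protium: } {}^{1}_{1}\mathrm{H}\;(N=0) \qquad
\text{deuterium: } {}^{2}_{1}\mathrm{H}\;(N=1) \qquad
\text{tritium: } {}^{3}_{1}\mathrm{H}\;(N=2)
```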
In addition, questions about the atomic mass of hydrogen, and if more isotopes exist after tritium, were also present in the data.
The theme of hydrogen was absent from the literature, even though its presence in instruction means it is frequently encountered when learning about isotopes. Both the absence of neutrons in protium and the special naming convention for isotopes of hydrogen lead to a large number of learner-generated questions.
Further, we can refer to carbon based on the atomic mass (a weighted average of the masses of an element's isotopes, weighted by their abundances) as seen on the periodic table, or we can name an isotope of carbon based on the mass number, for example Carbon-12, Carbon-13, or Carbon-14, depending on the number of neutrons present in the nucleus (Table 4).
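As a brief worked example of that weighted average (using commonly tabulated abundance values rather than figures from this study; carbon-14 occurs only in trace amounts and does not measurably affect the result):

```latex
\bar{m}(\mathrm{C}) \approx (0.9893)(12.000\,\mathrm{u}) + (0.0107)(13.003\,\mathrm{u}) \approx 12.011\,\mathrm{u}
```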
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort) |
---|---|---|---|
How are isotopes represented? | 96/770 (12%) | 13/600 (2%) | 2240/29213 (8%) |
What is a nuclide symbol? | Are there any other special names for isotopes of other (non-hydrogen) elements? | What does an isotope symbol represent? | How are isotopes named? |
How do you write the name of an isotope? | What is carbon 12 called? | Do all isotopes of elements get a unique name? | What is the isotope name of the element zinc? |
How is the mass number of an isotope expressed in the name of an atom? |
Most of the naming and notation questions for KA data (79/96) related to the hydrogen atom and protium, deuterium, and tritium. This is a result of hydrogen being used as an example to introduce isotopes to learners. For Q, naming and notation was present but not as prominent.
Questions about naming and notation from Y!A were more common, and compared to KA, more varied with less emphasis on the special names for the isotopes of hydrogen. Although 8% of questions overall for Y!A had the word “isotopes” and “naming” or “notation”, the actual percentage is lower due to questions such as “Can you name three uses of isotopes?” fulfilling the search criteria as well.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort) | Alignment with existing literature |
---|---|---|---|---|
How does an atom become an isotope? | Formation 35/770 (5%) | Formation 30/600 (5%) | Formation | Theme not found. |
How do atoms gain or lose neutrons to make isotopes? | Why exist 6/770 (<1%) | Why exist 7/600 (1%) | Formation: data are not reliable as the term “formation” is used frequently with “isotopes” in relation to radioisotope dating and the “formation” of the earth. | 
How are isotopes made? | How exactly do scientists “take out” a neutron? | Can radioactive isotopes be made for any element? | Why exist: data for why isotopes exist are not reliable due to the common occurrence of “why” and “exist” along with “isotopes”. | 
Do isotopes occur naturally? | Where do elements get extra neutrons? | Do colliding neutron stars produce new elements or isotopes? | ||
How are isotopes formed? In other words, how do atoms gain neutrons? | Is it possible to artificially produce any isotope of an atom from other isotopes? |
Questions about the formation of isotopes and the role of neutrons are consistent across PAA, KA, and Q data. To a lesser extent learners also ask about why isotopes exist. These questions are noteworthy as they are not typically covered in school science and therefore are likely driven by curiosity, sometimes termed “wonderment” questions (Scardamalia and Bereiter, 1992). For example, a learner in the KA data asked:
what makes an atom to have more or less neutrons? could one just keep adding or taking them away from the nucleus? and what is the biological use/impact of having more or less neutrons in an atom? (KA)
Such questions provide insights into learners’ thinking at a deeper level and can be theoretically generative. In the above question the learner is seeking to understand the mechanism by which isotopes form and their influence in biological systems. Similar to other learner-generated questions, the learner is unsure how easily neutrons can be added to or removed from isotopes. For example, a different learner asked:
How exactly do scientists “take out” a neutron? Do they use some type of mini tweezers or something? (KA)
The addition or removal of neutrons may seem reasonable to learners since the formation of ions, where electrons are lost or gained, is often mentioned when teaching about isotopes. This confusion can be seen in the following two questions:
How are neutrons exchanged between atoms, and can you have atoms with less neutrons than normal? (KA)
What varies in the different isotopes (form)? I thought that changing the number of electrons affect the form of an element? (KA)
Here learners are conflating neutrons (which are not exchanged between atoms) and electrons (which are exchanged in the formation of ionic bonds). It is possible that this is related to the learner confusion about ions and isotopes noted earlier. Note that learner-generated questions regarding ions are not included in this theme.
Data coded for Q differed from KA in that there were no questions in Q data about the addition or removal of neutrons and more questions about the actual formation process.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort) | Alignment with existing literature
---|---|---|---|---
Are all isotopes unstable? | Stability and decay 11/770 (1%) | Stability and decay 69/600 (12%) | Isotopes and stable 3641/29213 (12%) | Isotopes with specific atomic numbers affect the stability of the nucleus.
How do you find the half-life of an isotope? | Half-life 0/770 (0%) | Half-life 23/600 (4%) | Isotopes and half-life 5219/29213 (18%) | Isotopes with the longest half-life are the least stable.
What is the most dangerous radioactive isotope? | Radioactivity 4/770 (<1%) | Radioactivity 48/600 (8%) | Isotopes and radioactivity 1012/29213 (3%) | Radioisotopes are only used for energy production since they are dangerous to humans.
What exactly does it mean for an atom to be stable? | | How are the stable isotopes of chlorine formed? | Isotopes and radioactive 7592/29213 (26%) | Radioactive decay depends on physical conditions.
How do we know which isotopes are radioactive? | | How are half-lives determined for very long-lived radioactive nuclides? | Note: there is overlap in reported percentages for keyword sorts. | Above: alternative conceptions reported in Tekin and Nakiboglu (2006).
What makes isotopes radioactive? | | | | 
While not found in the KA data, learners in the Q and Y!A data frequently have questions about stability, half-life, and radioactivity, indicating that these ideas are important to learners. Of interest is the absence of such questions in the KA data. In a search of the KA video transcripts there are no instances of the terms stable/stability, decay, half-life, or radioactive. In contrast, in the two KA videos learners frequently ask general questions about the atom (see theme, general questions about the atom), which are also not addressed in the videos. It is likely that a general understanding of the atom is necessary to understand isotopes, whereas stability, decay, half-life, and radioactivity are not; learners therefore ask general questions about the atom because they need this information to understand isotopes.
The literature also reports findings related to this theme, although they are limited (Tekin and Nakiboglu, 2006). None of these specific alternative conceptions from the literature were detected in the coded data for this study. However, a keyword sort of Q and Y!A for radioactive and temperature, as well as radioactive and pressure, did detect a small number of instances; after manually filtering out unrelated questions, only four questions specifically asked about the relationship between isotopes, radioactivity, and pressure or temperature.
Of note is that both temperature and pressure were distractors in the multiple-choice instrument used by Tekin and Nakiboglu (2006). This suggests that, while the idea is attractive to learners when presented, it is not something they generate on their own, which is likely why it was not detected in the current study.
PAA | Khan (codes) | Quora (codes) | Yahoo!Answers (keyword sort) | Alignment with existing literature
---|---|---|---|---
Which elements have no isotopes? | Number of isotopes 40/770 (5%) | Number of isotopes 45/600 (8%) | Data for Y!A are not reported due to the occurrence of common search terms as well as overlapping terms. | Isotopes differ from standard atoms and are less stable.
Not present in PAA data. | Standard element 14/770 (2%) | Standard element 6/600 (1%) | | Standard atoms have equal numbers of protons and neutrons.
How do you find the number of neutrons in an isotope? | Subatomic particle calculations 41/770 (5%) | Subatomic particle calculations 12/600 (2%) | | Above: alternative conceptions reported in Schmidt et al. (2003).
How are isotopes used? | Uses of isotopes 14/770 (2%) | Uses of isotopes 80/600 (13%) | | 
While questions about the number of isotopes for an element (e.g. “How many isotopes does tin have?”) are more frequent, of greater interest is the idea of a standard or original isotope. Here learners consider isotopes to be variations on one primary isotope of an element, often not considering that primary form to be an isotope itself.
For example:
Are all atoms isotopes, or are isotopes variants off of one standard atom? For example, would carbon 12 be the “standard” carbon atom, and carbon 13 an isotope OF carbon 12? Or is there no standard, and both carbon 12 and 13 are isotopes? (KA)
In this question the learner is unsure about the existence of a “standard” isotope of carbon, although they do reason that there may be no “standard” isotope at all. Other learners use the terms “original” and “normal” instead of “standard”.
Are there any isotopes that have less neutrons than the original element? (Q)
Is an isotope just the variations of the atom? So there is a normal common atom and then the variations are the isotopes… is that right? (KA)
Occasionally the original isotope is seen to have an equal number of neutrons and protons, something found in the literature (Schmidt et al., 2003). For example:
Aren't there supposed to be equal number of protons and neutrons in every atom? (KA)
An isotope is defined as a nucleus with an unequal number of protons and neutrons. Could somebody explain to me why deuterium, e.g., has the status of an isotope although it has one proton and one neutron? (Q)
The theme subatomic particle calculations was present in KA data, and to a lesser extent, in Q data. For example:
How do you work out the number of protons, neutrons and electrons in isotopes? (KA)
How many protons, electrons, and neutrons does isotope 79 Br + have? (Q)
This is not surprising as this is a task students are frequently asked to complete in academic settings.
The theme of uses of isotopes was common in Q data (13% of questions) but seldom seen in KA (2%).
For KA, most of the questions were about general uses of isotopes. For example:
What are some uses of isotopes? (KA) or Do isotopes have different uses or are the all used for the same thing? (KA)
Q questions were primarily about general uses, medical uses, and danger/disadvantages of isotopes, such as:
Which isotopes are used as radioactive tracers? (Q)
What isotope is used to treat Leukemia? (Q)
What are the health consequences of exposure to isotopes? (Q)
Finally, while coding it was noted that questions about the separation of isotopes (15/600 or 2%) and the use of spectroscopy (12/600 or 2%) were present in the Q data, suggesting that these concepts are of interest to more advanced learners. Interestingly, searching Y!A for the keywords “isotope separation” (204/29213 or 1%) and “isotope spectra” (369/29213 or 1%) also detected similar learner questions, indicating a diversity of question complexity in the Y!A data.
Theme | Percentages in current study | Alignment with literature |
---|---|---|
Isotopes, ions, and neutral atoms | KA: 17% Q: 3% Y!A: 7% | Not found. |
General questions about the atom | KA: 16% Q: 3% Y!A: N/A | Not found. |
Hydrogen | KA: 10% Q: 8% Y!A: 11% | Not found. |
Notation and naming | KA: 12% Q: 2% Y!A: 8% | Not found. |
Formation of isotopes | KA: 5% Q: 5% Y!A: N/A | Not found. |
Stability | KA: 1% Q: 12% Y!A: 12% | Tekin and Nakiboglu (2006). |
Half-life | KA: 0% Q: 4% Y!A: 18% | Tekin and Nakiboglu (2006). |
Radioactivity | KA: <1% Q: 8% Y!A: 29% | Tekin and Nakiboglu (2006). |
Standard atoms have equal numbers of protons and neutrons. | KA: 2% Q: 1% Y!A: N/A | Schmidt et al. (2003).
Data on learner-generated questions for the topic of isotopes were investigated across four different online contexts. Findings about learner understanding of isotopes, while interesting, will primarily be discussed here as a means to illustrate the use and limitations of the methodology proposed in this study.
Within the context of providing robust and ecologically valid empirical data at the initial stages of research, or as a means to explore the validity of existing research, the methodology proposed in the current study is shown to be productive. In comparing findings with the extant research literature, it was possible to detect the majority of findings from the literature as well as identify many new findings. Further, because the data are based on a larger sample, it was also possible to quantify the occurrence of these findings.
If students are confused about isotopes and their relationship to diamond and graphite, as seen in the Schmidt et al. (2003) study, this should be present in some form in the questions asked online by a larger sample of learners. A sort of over 30000 questions in Q and Y!A found only five instances. This strongly suggests that the manner in which their instrument was developed resulted in this finding. That is, the distractors in the multiple-choice instrument were found to be attractive to students although not something learners would have considered independently. This was also the case for the finding that temperature and pressure affect the rate at which isotopes undergo radioactive decay (Tekin and Nakiboglu, 2006).
Methodologically this should serve as a caution in the development of research instruments. Research findings are influenced by decisions about what to include, and what not to, in the instruments used to collect data. From a practical standpoint, this suggests that the use of existing data sources, along with the research literature, can provide empirical information to guide pilot studies and lead to more robust instruments, study design, and findings.
… would one kind of oxygen isotope have a different effect in our body when we breathe it than another? (KA)
Is a “periodic table of isotopes” thinkable? (Q)
Do atoms keep their isotopes when they bond with another atom? (KA)
Questions like these are likely to be interesting to learners and theoretically useful in eliciting their thinking about isotopes. This is especially true when working with students who have achieved a basic understanding of the topic. While these occur in classroom contexts, they are rare (Carr, 1998; White and Gunstone, 1992, p. 170) and methodologically difficult to capture at the frequency found in online learner-generated questions.
For researchers considering using online learner-generated questions, analysis will vary with research questions, time, and theoretical framework. In this study the analysis was necessarily extensive in order to make a valid comparison with the extant literature and establish the methodology. While the findings presented here benefited from this in-depth analysis, a more cursory look at the data would have detected many of the larger themes. A guide that can be modified to meet varying research contexts is given in the Appendix (ESI†).
In addition to topics found frequently in formal academic contexts, as is the case with isotopes, learner-generated questions are also accessible for other contexts. For example, topics not always covered in the classroom, such as CRISPR, robotics, nanotechnology, careers in science, or climate change would benefit from the methodology. Further, data on the public understanding of science is also available and can be analyzed in a similar manner.
A logical next step is the development of instruments to determine if themes emerging from the analysis of online learner-generated questions, not found in the literature, are present in other contexts. Instruments for interview, multiple-choice, and think-aloud protocols could establish the existence of these themes and extend the findings in this study. In order to investigate more sophisticated thinking for a concept, the inclusion of “wonderment” questions could also prove fruitful.
Future research should also address how a modified form of the proposed methodology influences the development of learning resources. In addition to practicing teachers and curriculum developers, this extends to the wide range of individuals producing open educational resources and content for web and video sharing platforms.
To provide meaningful formative and summative assessment to measure student learning, assessment writers will find student-generated questions of value. In a formative context these questions can inform the diagnosis of areas where students need additional support. For example, in this study general questions about the atom suggest that formative assessment about the structure of the atom could guide teaching. For summative assessment, item writing would benefit from access to the questions students ask, especially in identifying plausible distractors for multiple choice items (Breakall et al., 2019).
In addition to curriculum development, Open Education Resources (OERs) are a growing part of the educational landscape. OERs are often developed through a crowd-sourced model (Allen et al., 2015) with individual authors having a wide range of teaching and content expertise. For these individuals a simplified version of using online learner-generated questions would provide a fuller context for the creation of more effective content that addresses learners’ needs.
A second limitation is the availability of online learner-generated questions. While content exists for most major science topics, large datasets may not be available for all topics. This may be the case for nature of science or scientific practices such as argumentation or critical thinking. This is a limitation of the proposed methodology.
As noted by Lincoln and Denzin (2000), in analyzing content through coding there is an assumption that the categories are precise, which is often not the case. As such, in the current study there is likely some overlap between themes. Further, it is possible that, once established, categories may distract from other themes present in the data (Atkinson, 1992) and that not all themes in this study were detected.
Finally, in the current study, to limit researcher bias during coding, a review of the literature was conducted after coding was complete. An unanticipated outcome of this strategy was that limited research was available about learner understanding of isotopes. Other topics, with a larger research base, could have provided a more in-depth comparison to findings from the current study.
Footnote
† Electronic supplementary information (ESI) available. See DOI: 10.1039/c9rp00145j