Alex T. Kararo,a Rachel A. Colvin,ab Melanie M. Cooper,c and Sonia M. Underwood*a
aDepartment of Chemistry & Biochemistry and STEM Transformation Institute, Florida International University, 11200 SW 8th St., Miami, FL 33199, USA. E-mail: Sonia.underwood@fiu.edu
bDepartment of Chemistry, Adelphi University, 1 South Avenue, Garden City, NY 11530, USA
cDepartment of Chemistry, Michigan State University, 578 Shaw Lane, East Lansing, MI 48824, USA
First published on 28th November 2018
The relationship between chemical structure and physical and chemical properties is essential to chemistry. Studies have shown that students have difficulty using structural representations to predict properties, which is not surprising given the sequence of inferences required for sense-making. However, obtaining a nuanced model of students’ understanding depends on how information is elicited. This study investigated how the phrasing of the question prompt may elicit students’ understanding of structure–property relationships. Students were given a two-part assessment: (1) four multiple-answer questions assessing students’ self-reported abilities to predict structure–property relationships, and (2) three questions requiring students to predict, argue, and explain a boiling point trend. Two groups of students were selected to determine the sensitivity of the instrument (one with less explicit instruction on structure–property relationships and one with more explicit instruction). We found that Part 1 of the assessment was able to differentiate between these two groups. The group with more explicit instruction was further analyzed to determine how their predictions on a boiling point task connected to their arguments and explanations of the phenomenon. Even though 64% of these students answered the boiling point ranking task correctly, they typically provided less complete arguments for why that structure had a higher boiling point. However, after scaffolding (i.e., providing relevant information about the phenomenon) and asking for an explanation, students’ responses began to include a much more mechanistic understanding, suggesting that asking students to provide explanations, rather than construct arguments, reveals their reasoning at a deeper level.
Students’ reasoning on ranking tasks often consists of heuristics, both student- and instructor-derived. Heuristics are “cognitive shortcuts” that limit the cognitive load that reasoning requires, with examples such as “the octet rule” and “like dissolves like” (Maeyer and Talanquer, 2010; Cooper et al., 2013). While these are helpful rules of thumb, they do not necessarily require the student to understand what is happening at a deeper level. The use of heuristics seems to be especially pervasive in answering questions that involve ranking tasks based on the values of some physical or chemical property (e.g., boiling point, melting point, acidity/basicity, or reactivity). While these heuristics might prove helpful for getting the correct answer, the reasoning behind them often lacks depth or accuracy. If students are not routinely asked to provide reasoning for their answers, they may have trouble when asked to reason about why a particular chemical phenomenon occurs.
Zohar and Nemet (2002) found that while 90% of the students in their study were able to provide a simple argument, only 16% used relevant biological knowledge in their arguments. However, once students were taught how to construct a quality argument and the relevant biological content, both the quality of the arguments and the use of correct scientific content increased. A further study by Von Aufschnaiter and colleagues (2008) found that students only provided specific content in their arguments when they had an adequate understanding of the topic being addressed, showing that argumentation and content knowledge are directly connected. Berland and Reiser (2009) asked students questions related to changes in the populations of different species. They found that while students did appear to see the value of evidence in their answers, their goal appeared to be more sense-making (i.e., explanation) and less persuasion (i.e., argumentation). Berland and Reiser (2009) state that the practices of explanation and argumentation complement each other: argumentation allows explanations of scientific phenomena to be discussed and accepted or refuted.
Appropriate types of prompts have been shown to encourage students to better articulate how and why something occurs when their earlier explanations lacked reasoning (Cooper et al., 2016). However, the structure of the prompt is crucial. The prompt must be accessible so that students can understand the task, and it must provide enough structure so that students know what is expected of them (Jin and Anderson, 2012; Cooper et al., 2016). If a question is not structured enough (i.e., too implicit), students may not understand what is expected of them and will not be able to display their complete understanding of the topic. If a task is over-structured (i.e., too explicit), students may be given enough information to complete a task that they otherwise could not have completed. For example, Cooper et al. (2016) found that including the term “electronegativity” in an assessment prompt led students to use the term without any increase in the richness or meaningfulness of their responses.
Kang and colleagues (2014) also found that scaffolding explanations, by contextualizing them as a phenomenon or having students construct drawings, led to higher quality scientific explanations. By providing a specific phenomenon, students not only have to recognize the general model for that concept but also apply that model to a specific situation. This can make the task more cognitively challenging, because students must apply the model to a specific phenomenon, but also more accessible, by situating it under recognizable conditions.
In this study a two-part assessment was developed that increases the scaffolding as the student moves through the task to determine at what point within the assessment the students begin to connect structure–property relationships. In the second part of the assessment, we contextualize the structure–property relationships into a specific example of boiling point trends in which we provide a boiling point ranking task and require the students to construct an argument to support their claim. Finally, the students are provided the same boiling point phenomenon and required to provide an explanation (rather than an argument) about why the phenomenon occurs. This sequence of tasks allows us to explore how scaffolding influences students’ knowledge of structure–property relationships, compare students’ arguments and explanations about a given phenomenon, and determine the sensitivity of the instrument as a whole with students who have had different levels of exposure to structure–property relationships as shown in prior studies (Cooper et al., 2015; Underwood et al., 2016).
RQ2: How does the phrasing of the prompt influence students’ ability to predict, argue, and explain structure–property relationships?
RQ3: How do students’ abilities to predict structure–property relationships compare to their explanations of boiling point trends?
The final version of the assessment consists of two parts: Part 1 with four multiple-answer questions (Table 1: Questions 1–4), and Part 2 with three open-ended questions (Table 1: Questions 5–7). The purpose of the first part of the assessment was to determine how modifications to the prompt impact students’ abilities to self-report the types of information that can be predicted about a substance from its chemical structure. Question 1 (Table 1) presented students with the Implicit Information from Lewis Structures Instrument (IILSI) (Cooper et al., 2012a, 2012b), in which students are provided with 20 different types of information and asked, without explicit prompting, which they believe could be predicted. Question 2 (Table 1) included the chemical structure of the amino acid alanine, a less familiar structure to students, to determine if the presence of a structure would prompt students to select more information on the IILSI. Question 3 (Table 1) reduced the options available in Question 2 from 20 to five. The purpose of this modification was to determine if students were more likely to identify chemical and physical properties when presented with fewer options, after removing those that are typically not chosen. Question 4 (Table 1) was designed to have the most explicit and familiar prompting by asking students whether specific properties of alanine (e.g., boiling point and melting point) would be high or low. Therefore, the items in Part 1 consisted of versions of the previously reported IILSI (Cooper et al., 2012a, 2012b) with increasingly scaffolded prompting, allowing us to determine the level of scaffolding that might help students identify relationships between structures and properties.
Question | Prompt |
---|---|
Question 1: IILSI (Cooper et al., 2012a, 2012b) | “What information could you determine using a Lewis structure and any other chemistry knowledge you have? (Mark all that apply)” |
20 answer choices including information such as “hybridization”, “element(s) present”, “relative melting point”, etc. | |
Question 2: IILSI (Cooper et al., 2012a, 2012b) with Lewis structure of alanine | “If given the Lewis structure as shown, what information could you determine using the structure and any other chemistry knowledge you have? (Mark all that apply)” |
20 answer choices (same as Question 1) | |
Question 3 (Lewis structure of alanine provided) | “Now, for the same compound what types of the following information could you determine using its Lewis structure and any other chemistry knowledge you have? (Mark all that apply)” |
5 answer choices: “relative boiling point”, “relative melting point”, “acidity/basicity”, “reactivity”, and “none of these properties can be predicted” | |
Question 4 (Lewis structure of alanine provided) | “For the same compound shown below, please select all of the properties that could be predicted using the structure and any other chemistry knowledge you have.” |
Answer choices: Boiling/Melting Point – “High boiling point”, “High melting point”, “Low boiling point”, “Low melting point”, and “Can’t be predicted”; Acidity/Basicity – “Acidic”, “Basic”, “Neutral”, “Acidic and Basic”, and “Can’t be predicted”; Reactivity – “Reactive”, “Non-reactive”, and “Can’t be predicted” | |
Question 5: Boiling point ranking task and reasoning | “If given the two compounds below [ethanol and dimethyl ether], which compound would you predict has the higher boiling point? Please explain your reasoning.” |
Informational Slide 1 | Students were told that the two compounds have the same chemical formula and the same molecular mass, that dimethyl ether is a gas and ethanol is a liquid at room temperature, and the values of their boiling points |
Informational Slide 2 | “The difference in properties between the two compounds is because ethanol can participate in hydrogen bonding.” |
Question 6: Draw hydrogen bonding for three molecules of ethanol (Cooper et al., 2015) | “Please draw and label a representation below in the blue box that clearly indicates where hydrogen bonding is present for three molecules of ethanol (CH3CH2OH).” |
Question 7: Explain why ethanol has a higher boiling point due to hydrogen bonding | “Using your representation of hydrogen bonding in the blue box, explain in the black box why the ability of ethanol to form hydrogen bonds results in ethanol having a higher boiling point than dimethyl ether.” |
Part 2 of the assessment was designed to investigate whether students can actually predict and explain structure–property phenomena such as boiling point. The first question in Part 2 (Table 1: Question 5) included a ranking task that required students to make an argument about a boiling point trend (i.e., predict the trend and support their claim with an argument). A ranking task was used because this is a common type of assessment seen by students in general chemistry courses. Although most students are familiar with and can answer these tasks, it has been shown that many students are unable to provide adequate explanations for them (McClary and Talanquer, 2011). Previous studies have shown that merely asking students to provide an argument (Table 1: Question 5) may not elicit a rich response. Because of this, informational slides were added to Part 2 of the assessment to provide students with the information that ethanol is a liquid at room temperature while dimethyl ether is a gas, and that this is because ethanol molecules can hydrogen bond. Question 6, originating from the Intermolecular Forces Assessment (IMFA) (Cooper et al., 2015), was added to further capture students’ ideas about hydrogen bonding, since research has shown that students have difficulty differentiating between bonding and electrostatic non-covalent interactions (National Research Council, 1999; Cooper et al., 2013, 2015). Question 7 (Table 1) then asked the students to use the information presented to them and their representation of hydrogen bonding for ethanol to explain the phenomenon (i.e., that ethanol is a liquid at room temperature while dimethyl ether is a gas).
Group A: the first group of students (N = 117) was from a large Southeastern public research university. At this institution, class sizes range from 200 to 300 students per section, and the curriculum would be considered traditional in content (i.e., a commercially available textbook and homework system). However, this course incorporated worksheets inspired by Process Oriented Guided Inquiry Learning (POGIL) (Farrell et al., 1999) using a flipped classroom approach (Baker, 2000; Tucker, 2012), with lecture videos provided to students outside of class time and students working in groups of four on worksheets during class time. In addition, personal student response systems (e.g., clickers) were used to monitor students’ understanding during class time. The instructors administered common multiple-choice exams. Approximately one 75-minute lecture period during the first semester of general chemistry was spent on the topic of comparing boiling point trends using Lewis structures. Although there were direct questions on an in-class worksheet regarding boiling point ranking tasks, no exam questions specifically focused on the topic.
Group B: the second group of students (N = 96) was from a large Midwestern public research university. The class size at this institution was around 400 students per lecture section, and an alternative general chemistry curriculum, Chemistry, Life, the Universe and Everything (CLUE) (Cooper and Klymkowsky, 2013), was used. Along with lectures, students attended a weekly 50-minute recitation of approximately 20 students per section, in which they worked in groups of four to complete worksheets. Clickers were also used in the classroom to monitor students’ understanding in real time. In this course, the topic of boiling point trends was explicitly discussed multiple times and assessed on exams in the form of both multiple-choice and short-answer questions.
It is important to note that the two populations are unique in many aspects (e.g., different universities, curricula, and demographics of students). Therefore, the two populations were used in this study to determine the sensitivity of the assessment, given the different emphasis placed on the boiling point trends for the two groups, and not to compare their performances in the assessment.
Code | Definition | Examples of students’ responses |
---|---|---|
Student Does Not Know/No Response/Cannot Be Predicted | Student expresses that they do not know the answer, they do not provide any reasoning, or that boiling point cannot be predicted from a Lewis structure. | “There needs to be more information for me to predict the boiling point.” – Derek |
“I don’t have the knowledge to predict bp [boiling point].” – Jeanne | ||
Non-normative | Student uses scientifically inaccurate or unrelated reasoning. | “It's going to be harder to break apart the carbon–carbon bonds, but dimethyl ether has bonds to oxygen, which are easier to break.” – Lindsay |
“Ethanol has an acidic H which means that if it lost that H, the molecule would be charged which means that it will have a higher boiling point.” – Amy | ||
Hydrogen bonding | Student explicitly mentions hydrogen bonding in their reasoning. | “Ethanol can form hydrogen bonds because they form between a H bonded to an O, N, F and the electron pair of another element (only N, O, and F).” – Bryan |
“Ethanol is able to hydrogen bond, therefore it has the higher boiling point.” – Edward | ||
Hydrogen bonding and strength of bonds/interactions | Student explicitly mentions hydrogen bonding. Student compares the strength of intermolecular forces or bonds in their response, ranking strengths or referring to bonds/interactions as being more difficult to break/interrupt. No energy argument included. | “Ethanol can form hydrogen bonds which are stronger than any that dimethyl ether can form.” – Nick |
“Ethanol has a hydrogen bond which is much stronger and harder to break.” – Destiny | ||
Hydrogen bonding, strength of bonds/interactions, and energy | Student explicitly mentions hydrogen bonding and strength of bonds/interactions. Student mentions energy in terms of it being higher/lower or more/less than another entity. Students may use the term “heat” instead of energy. | “The ethanol can form hydrogen bonds which are the strongest IMF and would require a lot of energy to break leading to a higher BP.” – Tiffany |
“Ethanol has hydrogen bonding, which is a strong bond. Breaking hydrogen bonds need to put a lot of energies in it, so the boiling point or melting point is higher for ethanol.” – Peter |
The student drawings of hydrogen bonding using three ethanol molecules in Question 6 were coded using the originally published coding scheme for the IMFA (Cooper et al., 2015). The codes include “between ethanol molecules”, “within ethanol molecules”, “ambiguous representation of hydrogen bonding”, and “student does not know” (Table 3). Other codes, such as “dimethyl ether” if students drew dimethyl ether molecules, appeared infrequently and were grouped into the larger theme of “other”. Inter-rater reliability between two of the authors for Question 6 produced Cohen’s kappa (κ) values ranging from 0.8 to 1.0.
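For readers who want to reproduce this kind of agreement check, Cohen's kappa can be computed directly from two coders' label lists. The sketch below is ours, not part of the study's analysis pipeline: the `cohens_kappa` helper is a hypothetical implementation of the standard formula, and the example labels (borrowed from the Table 3 code names) are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning categorical codes
    to the same items: (p_o - p_e) / (1 - p_e)."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed agreement: fraction of items given the same code.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected chance agreement from each rater's marginal code frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[code] * c2[code] for code in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for six drawings, reusing the IMFA code names.
coder_a = ["between", "between", "within", "ambiguous", "between", "within"]
coder_b = ["between", "between", "within", "between", "between", "within"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # 0.7
```

Kappa corrects the raw agreement (5/6 here) for the agreement expected by chance, which is why it is preferred over simple percent agreement for coding schemes with uneven code frequencies.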
Fig. 1 presents the performance of students from Groups A and B on Questions 1 and 5. For Question 1, labeled “Predict on IILSI”, students could select or not select boiling point as a property that could be predicted from a Lewis structure. Only 9% (N = 10) of students in Group A initially selected boiling point. This corresponds to their responses on Question 5, the boiling point ranking task, where these students appear to be guessing, with a relatively even split across the four options. In contrast, 60% (N = 58) of the students in Group B selected boiling point for Question 1, with a clear majority choosing ethanol as having the higher boiling point on Question 5.
Fig. 1 Comparison of performances on Question 1 (boiling point prediction on the IILSI) and Question 5 (boiling point ranking task) for Groups A and B. |
However, in the Group A curriculum the topic of structure–property relationships was covered in one lecture and on one worksheet, and the vast majority of those students did not select boiling point as having a relationship with a Lewis structure and appear to have been guessing on the boiling point ranking task. The students who had greater exposure to this topic, Group B, were more likely to identify the relationship between boiling point and Lewis structure as well as to identify the correct structure as having the higher boiling point. Therefore, these differences provide evidence that the assessment is sensitive enough to differentiate between different curricular emphases. The data shown here for the IILSI and IMFA are similar to previously reported findings (Cooper et al., 2012a, 2012b; Williams et al., 2015; Underwood et al., 2016), suggesting that the modified assessment behaved similarly to the individual assessment tasks for validity and reliability purposes.
Category | Group A | Group B |
---|---|---|
Never predict | 58% (N = 68) | 10% (N = 10) |
Inconsistently predict | 7% (N = 8) | 27% (N = 26) |
Predict upon prompting | 32% (N = 37) | 27% (N = 25) |
Always predict | 3% (N = 4) | 35% (N = 34) |
To simplify the discussion, we focus here on the results from Part 1 with respect to the physical property of boiling point, since it corresponds directly to the boiling point ranking task in Part 2 of the assessment. The findings for the other chemical/physical properties on the IILSI were similar and are presented in Appendix 2.
Ideally, we would like students to align with the “Always Predict” category, indicating that they consistently and implicitly connect boiling point trends with Lewis structures. As shown in Table 4, the majority of students in Group A (58%, N = 68) never selected boiling point as a property that could be predicted from a Lewis structure regardless of how the question was presented, whereas only 3% (N = 4) fell into the “Always Predict” category. However, 32% (N = 37) of the students in Group A aligned with the “Predict Upon Prompting” category, and the majority of these students (62%, N = 23 out of 37) first selected boiling point on Question 4.
In contrast, 35% (N = 34) of students in Group B consistently selected that boiling point could be predicted using a molecular structure, while the smallest percentage, 10% (N = 10), never selected boiling point across the assessment. Students in the “Predict Upon Prompting” category (27%, N = 25) followed a similar pattern to the students in Group A, with the largest percentage (60%, N = 15 out of 25) first choosing that boiling point could be predicted in Question 4 (Table 1). This is the most explicit, or structured, question, showing that the last question in Part 1 of the assessment had the most impact on these students’ selections of boiling point’s relationship to a Lewis structure. Providing a Lewis structure with the original IILSI (Question 2) did not provide much additional help to students; however, reducing the number of options and stating the options in terms of high/low (Question 4) was more helpful. It is possible that the last question cued the students into recalling information provided in a general chemistry classroom (e.g., water has a high boiling point compared to other molecular compounds of similar size such as ammonia or methane). However, since a majority of students in Group A did not choose that boiling point could be predicted using a molecular structure, even on Question 4, it appears that students must have some knowledge of this topic to choose boiling point. Further studies would need to be conducted to determine why students in the “Predict Upon Prompting” category do not link boiling point and Lewis structures until Question 4.
For the students in both groups who needed prompting to connect structures and properties (32% and 27% for Groups A and B, respectively), it took the most explicit prompting (Question 4) to do so. The wording of Question 4 also differed by including both high and low boiling points as options, implying that many of the students in the “Predict Upon Prompting” category may have needed the options worded in this way to identify the connection between the Lewis structure and the property. The trends observed in Part 1 of the assessment suggest that students had difficulty with the first step of constructing an argument (i.e., making a prediction or claim) and clearly need more experience to explicitly develop connections between structures and properties.
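The four categories in Table 4 can be read as a decision rule over a student's boiling-point selections on Questions 1–4. The sketch below is our reconstruction of that rule from the category names alone, not the authors' published coding scheme, and the `categorize` helper is hypothetical.

```python
def categorize(selections):
    """Bin one student's boiling-point selections on Questions 1-4.

    `selections` holds four booleans, True where the student marked
    boiling point as predictable from the structure on that question.
    NOTE: these rules are our reading of the category names in the
    text, not the authors' published operational definitions.
    """
    if not any(selections):
        return "Never predict"
    if all(selections):
        return "Always predict"
    first = selections.index(True)
    # Picked it up at a later, more scaffolded question and kept it.
    if first > 0 and all(selections[first:]):
        return "Predict upon prompting"
    # Selected it and later dropped it, or other mixed patterns.
    return "Inconsistently predict"

# The pattern reported as most common among prompted students:
print(categorize([False, False, False, True]))  # Predict upon prompting
```

Under this reading, a student who selected boiling point on Question 1 but not on Question 2 would be "Inconsistently predict", matching the distinction drawn between that category and "Predict Upon Prompting".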
In summary, the first half of the assessment demonstrated that over half of the students in Group A never connected structures and properties, despite explicit prompting to do so. On the other hand, students in Group B were far more likely to make the connection, with 35% predicting the boiling point structure–property relationship consistently across all four questions. The differences in performance between Groups A and B on Part 1 of the assessment provide further evidence of the sensitivity of this instrument.
Fig. 2 Distribution of codes for students’ arguments (Question 5) as it relates to their boiling point predictions for Group B. |
As shown in Fig. 1, 64% (N = 61) of the students in Group B chose ethanol, 22% (N = 21) chose dimethyl ether, and 12% (N = 11) chose that the two compounds have the same boiling point. As shown in Fig. 2, 20% (N = 19) of students chose ethanol and used the concept of hydrogen bonding as evidence for their prediction, with an additional 16% (N = 15) going beyond hydrogen bonding to incorporate the strength of the bond/interaction. Ideally, students would use the concepts of hydrogen bonding, strength of interactions, and the amount of energy needed to overcome the attractive forces between molecules to explain their boiling point prediction. Unfortunately, only 15% (N = 14) were able to do so on Question 5. It is promising, though, that students introduced relevant concepts into their arguments about the boiling point ranking task without being prompted to do so.
For the students who indicated that dimethyl ether had the higher boiling point, most provided non-normative arguments, using concepts such as stability and the breaking of bonds (22%, N = 21). This is expected, since to justify that dimethyl ether has a higher boiling point, students would most likely have to provide scientifically inaccurate or unrelated reasoning. The arguments of students who chose that the boiling points were equal (11%, N = 11) typically consisted of structural information, such as the number of atoms of each element. These students tended to fixate on the structure of a single molecule instead of thinking about how multiple molecules would interact within the substance.
Fig. 3 Distribution of codes for students’ explanations (Question 7) as it relates to their boiling point predictions for Group B. |
Students’ explanations (Question 7) were richer and included more ideas than their original arguments (Question 5). This effect was observed even for students who had earlier answered the boiling point ranking task incorrectly. Although only 15% (N = 14) of students invoked the concept of energy as part of their arguments in Question 5, 51% (N = 49) used the concept of energy in their explanations in Question 7, without being explicitly prompted to do so. This shows not only that students have that knowledge but also that they can use it to build on the information provided. Since the informational slides had already stated that hydrogen bonding was involved in the mechanism by which the substances boil, the students appear to have understood that repeating this idea was not sufficient to answer the question and were more likely to build on that concept to provide an explanation that included both the strength of interactions/bonds and the energy required to overcome them.
In summary, by changing the nature of the task from an argument to an explanation in Part 2 of the assessment, the students provided richer responses for why ethanol has a higher boiling point than dimethyl ether. To answer Question 5, the students were required to make an argument in which most of the students used the concepts of hydrogen bonding and strength of bonds/interactions. However, when provided with “the answer” and asked to construct an explanation for why ethanol has a higher boiling point than dimethyl ether (Question 7), many more students also included the concept of energy. That is, the structure of the prompt determined the types of information that the students used in their responses. The explanation prompt (Question 7) activated resources that the argumentation prompt (Question 5) did not.
Fig. 4 Distribution of codes for students’ arguments and explanations (Question 5 and Question 7, respectively) based on whether they selected boiling point for Question 1. |
For Question 5, students who selected on the IILSI (Question 1) that relative boiling point could be predicted using a molecular structure were more likely to bring in concepts such as hydrogen bonding or the strength of bonds/interactions. However, a large percentage of students (44%, N = 42) provided some type of non-normative argument, regardless of whether or not they identified a relationship between boiling point and Lewis structure in Question 1.
When asked for an explanation in Question 7, students in both groups shifted to more thorough explanations, with only 16% (N = 15) providing non-normative responses. The difference observed between the students who selected in Question 1 that boiling point could be predicted from the Lewis structure and those who did not lies in the depth of their explanations. More students brought in concepts like the strength of bonds/interactions and energy: 28% (N = 27) of the students who selected that boiling point could be predicted also provided an explanation connecting hydrogen bonding, strength of bonds/interactions, and energy, compared to only 9% (N = 9) of the students who did not select boiling point on the IILSI for Question 1. These analyses show that even if students consider concepts such as hydrogen bonding and strength of bonds/interactions, they do not necessarily see the connection between those concepts and structure–property relationships or use them as part of an argument. Asking for an explanation of a boiling point trend, rather than for the construction of an argument, makes students more likely to include those concepts. This also shows that students who initially identify the relationship between structure and properties, such as boiling point, are more likely to have a more mechanistic understanding of that relationship. It is important to note, however, that further studies would be needed to develop a better understanding of whether these students have a causal mechanistic understanding of this phenomenon (i.e., how and why energy plays a role in phase changes).
Overall, these analyses show the role of explanation in helping students identify and explain structure–property relationships. For Part 1 of the assessment, limiting the number of choices and wording them differently, such as in terms of high or low boiling point, appears to have allowed more students to self-report that they understand that structure and properties are connected.
Part 2 of the assessment provides evidence that many students do not access all the resources available to them to answer a question. When the students were asked to provide an explanation, many more of them were able to tie in other appropriate concepts to provide reasoning. While ideally we might like students to consistently and explicitly connect these concepts in order to argue and explain structure–property relationships, what is clear from this work is that along the way we must support the students by designing appropriate task prompts.
Comparing students’ responses between Parts 1 and 2 of the assessment, we found that students who initially predicted on the IILSI (Question 1) that boiling point depends on molecular structure were three times more likely to connect hydrogen bonding, strength of bonds/interactions, and energy in their explanations than students who did not select this item. Overall, these results suggest that despite students’ difficulties with predicting and explaining the relationship between structure and properties, providing the phenomenon along with relevant information led to more complete explanations of the relationship between structure and boiling point trends.
Semester administered | Student population |
---|---|
Spring 2013 | N = 895, second-semester general chemistry, University 1 |
| N = 303, second-semester organic chemistry, University 2 |
Fall 2015 | N = 167, second-semester general chemistry, University 2 |
Spring 2016 | N = 96, second-semester general chemistry, University 2^a |
| N = 116, second-semester general chemistry, University 3^a |
^a Populations of students who are discussed in the paper.
The first version of the assessment consisted of four questions (Table 1: Questions 1, 3, 4, and 5) and was piloted during the Spring 2013 semester to determine how students’ predictions of structure–property relationships changed depending on the prompt provided. Based on students’ initial responses, the informational slides, as well as the task of drawing their understanding of hydrogen bonding, were added to the assessment. The modified assessment was administered in Fall 2015 to confirm that the prompt was not so explicit that it would help students who otherwise would not have been able to answer the question, before the final administration in Spring 2016.
Group A (N = 117)

Category | Relative melting point | Reactivity | Acidity/basicity |
---|---|---|---|
Never predict | 67% (N = 78) | 43% (N = 50) | 46% (N = 54) |
Inconsistently predict | 8% (N = 9) | 20% (N = 23) | 4% (N = 5) |
Predict upon prompting | 24% (N = 28) | 35% (N = 41) | 44% (N = 52) |
Always predict | 2% (N = 2) | 3% (N = 3) | 5% (N = 6) |
Group B (N = 96)

Category | Relative melting point | Reactivity | Acidity/basicity |
---|---|---|---|
Never predict | 16% (N = 15) | 14% (N = 13) | 5% (N = 5) |
Inconsistently predict | 27% (N = 26) | 24% (N = 23) | 28% (N = 28) |
Predict upon prompting | 23% (N = 22) | 36% (N = 35) | 25% (N = 24) |
Always predict | 34% (N = 33) | 26% (N = 25) | 41% (N = 39) |
This journal is © The Royal Society of Chemistry 2019 |