Predictions and constructing explanations: an investigation into introductory chemistry students’ understanding of structure–property relationships

Alex T. Kararo a, Rachel A. Colvin ab, Melanie M. Cooper c and Sonia M. Underwood *a
aDepartment of Chemistry & Biochemistry and STEM Transformation Institute, Florida International University, 11200 SW 8th St., Miami, FL 33199, USA. E-mail: Sonia.underwood@fiu.edu
bDepartment of Chemistry, Adelphi University, 1 South Avenue, Garden City, NY 11530, USA
cDepartment of Chemistry, Michigan State University, 578 Shaw Lane, East Lansing, MI 48824, USA

Received 27th July 2018, Accepted 28th November 2018

First published on 28th November 2018


Abstract

The relationship between chemical structure and physical and chemical properties is essential to chemistry. Studies have shown that students have difficulty using structural representations to predict properties, which is not surprising given the sequence of inferences required for sense-making. However, obtaining a nuanced model of students’ understanding depends on how information is elicited. This study investigated how the phrasing of the question prompt may elicit students’ understanding of structure–property relationships. Students were given a two-part assessment: (1) four multiple-choice questions assessing students’ self-reported abilities to predict structure–property relationships, and (2) three questions requiring students to predict, argue about, and explain a boiling point trend. Two groups of students were selected to determine the sensitivity of the instrument (one with less explicit instruction on structure–property relationships and one with more explicit instruction). We found that Part 1 of the assessment was able to differentiate between these two groups of students. The group with more explicit instruction was further analyzed to determine how their prediction on a boiling point task connected to their arguments and explanations of the phenomenon. Even though 64% of students answered the boiling point ranking task correctly, they typically provided incomplete arguments for why that structure had the higher boiling point. However, after scaffolding (i.e., providing relevant information for the phenomenon) and asking for an explanation, students’ responses began to reflect a much more mechanistic understanding, suggesting that asking students to provide explanations, rather than construct arguments, reveals their reasoning at a deeper level.


Introduction

The ability to use chemical structures to predict the physical and chemical properties of substances is at the heart of chemistry and is seen as a core idea of the discipline, extending throughout the curriculum (Murphy et al., 2012; Cooper et al., 2017). The topics required for students to make sense of structure–property relationships are typically introduced during the first semester of general chemistry and include the construction of Lewis structures, types of bonding, and types of physical and chemical properties. Students who build a robust understanding of structure–property relationships in general chemistry should be able to build on this foundational knowledge as they move to upper-level courses such as biochemistry.

Student difficulty with structure–property relationships

Structure–property relationships are difficult for students to understand and apply (Cooper et al., 2012a, 2012b; Chang and Goldsby, 2013; Cooper et al., 2013; DeFever et al., 2015; Underwood et al., 2015). It has been reported that general chemistry students struggle with drawing Lewis structures, possibly because they do not understand the purpose of drawing such structures (Shane and Bodner, 2006; Cooper et al., 2010). Furthermore, it has been found that students at all levels, not just in general chemistry, have difficulty connecting chemical structures with macroscopic properties (Cooper et al., 2010; DeFever et al., 2015; Underwood et al., 2015). Even after two years of introductory chemistry courses (i.e., two semesters each of general chemistry and organic chemistry), many students do not report that physical and chemical properties can be predicted using a chemical structure (Underwood et al., 2015). Even upper-level undergraduates have demonstrated a fractured understanding of structure–property relationships (DeFever et al., 2015). Cooper et al. (2010) found that when general chemistry, organic chemistry, and graduate-level students were asked what types of information could be obtained using a chemical structure (e.g., reactivity or boiling point information), only half of the students, regardless of level of study, were able to report a chemical or physical property.

Prior research on students’ understanding of boiling point trends

Not only do students have difficulty understanding structure–property relationships in a general sense, but it has also been reported that they struggle with specific structure–property relationships, such as predicting the chemical properties of acidity and basicity (Maeyer and Talanquer, 2010, 2013), reactivity (Maeyer and Talanquer, 2010; McClary and Talanquer, 2011), and relative boiling points (Schmidt et al., 2009; Maeyer and Talanquer, 2010). Predicting boiling point trends is confounded by the misconception that boiling involves the breaking of covalent bonds, which has been shown to be prevalent among pre-college (Osborne and Cosgrove, 1983; Schmidt, 1996; Schmidt et al., 2009), undergraduate (Henderleiter et al., 2001), and graduate students (Bodner, 1991). Since it has been shown that many students do not understand the difference between bonding and intermolecular forces (Schmidt et al., 2009; Cooper et al., 2015), it is perhaps not surprising that students often provide faulty reasoning on ranking task questions related to this topic.

Students’ reasoning on ranking tasks often consists of heuristics, both student- and instructor-derived. Heuristics are “cognitive shortcuts” that limit the cognitive load that reasoning requires, with examples such as “the octet rule” and “like dissolves like” (Maeyer and Talanquer, 2010; Cooper et al., 2013). While these are helpful rules of thumb, they do not necessarily require the student to understand what is happening at a deeper level; indeed, the use of heuristics seems to be especially pervasive in answering questions that involve ranking tasks based on the values of some physical or chemical property (e.g., boiling point, melting point, acidity/basicity, or reactivity). While these heuristics might prove helpful for getting the correct answer, students’ reasoning often lacks depth or accuracy. If students are not routinely asked to provide reasoning for their answers, they may have trouble when asked to reason about why a particular chemical phenomenon occurs.

Scientific arguments and explanations

Both scientific arguments and explanations are used by the scientific community to build knowledge (Berland and McNeill, 2012; Berland and Reiser, 2009; McNeill and Krajcik, 2011). However, there is an important difference between explanations and arguments. Explanations, according to Chin and Brown (2000), are descriptions of how or why phenomena occur. The key aspect of an explanation is that the phenomenon to be explained is not in doubt, so the goal is sense-making or knowledge construction (Osborne and Patterson, 2011). An explanation involves using known scientific principles to reason about a phenomenon. An argument, on the other hand, is used to justify conclusions with a claim and evidence. The essential parts of an argument include (1) a claim, the conclusion or what the argument is about; (2) evidence, the scientific principles that support the claim; and (3) reasoning, the part of the argument that links together the claim and the evidence (McNeill and Krajcik, 2007, 2008; Osborne and Patterson, 2011; Cooper et al., 2015). The goal of an argument is to persuade, often via social construction of knowledge (Berland and McNeill, 2012). Having students construct explanations and arguments is a key goal in science education, as it requires students to support their scientific knowledge with relevant evidence and reasoning (Driver et al., 2000), developing a deeper understanding of scientific principles (Zohar and Nemet, 2002; McNeill and Krajcik, 2008).

Zohar and Nemet (2002) found that while 90% of the students in their study were able to provide a simple argument, only 16% used relevant biological knowledge in their arguments. However, once students were taught how to construct a quality argument and the relevant biological content, both the quality of the arguments and the use of correct scientific content increased. A further study by von Aufschnaiter and colleagues (2008) found that students only provided specific content in their arguments when they had an adequate understanding of the topic being addressed, showing that argumentation and content knowledge are directly connected. Berland and Reiser (2009) asked students questions related to changes in the populations of different species. They found that while students did appear to see the value of evidence in their answers, their goal appeared to be more sense-making (i.e., explanation) and less persuasion (i.e., argumentation). Berland and Reiser (2009) state that the practices of explanation and argumentation complement each other: argumentation allows explanations of scientific phenomena to be discussed and accepted or refuted.

Theoretical perspective for study design

The goal of this project was to investigate the effect of increasingly explicit assessment prompts on student responses. By increasing the level of scaffolding in the prompts and embedding the assessment question within a phenomenon, we wanted to see if (1) more students reported a connection between a chemical structure and the compound's macroscopic physical and chemical properties and (2) student responses became more thorough as the task moved from argumentation to explanation. In this paper we use the term prompt to refer to a specific item within the assessment, whereas we use scaffolding to refer to the phrasing of the task designed to elicit particular information as well as the arrangement of the tasks within the assessment. This language was chosen to align with wording used in previous studies. We come to this design from the knowledge-in-pieces and resources literature (DiSessa, 2014) on how students construct and use knowledge. There is ample evidence (Cooper and Stowe, 2018) that in chemistry much of what students understand is fragmentary and disconnected. Designing the assessment task in this way allows us to identify what types of scaffolding help students access appropriate resources to complete the task.

Appropriate types of prompts have been shown to encourage students to better articulate how and why something occurs, even when their earlier explanations lacked reasoning (Cooper et al., 2016). However, the structure of the prompt is crucial. The prompt must be accessible so that students can understand the task, and it must provide enough structure so that students know what is expected of them (Jin and Anderson, 2012; Cooper et al., 2016). If a question is not structured enough (i.e., too implicit), students may not understand what is expected of them and will not be able to convey their complete understanding of the topic. If a task is over-structured (i.e., too explicit), students may be given enough information to complete a task they otherwise would not have been able to complete. For example, Cooper et al. (2016) found that including the term “electronegativity” in an assessment prompt led students to use the term without any increase in the richness or meaningfulness of their responses.

Kang and colleagues (2014) also found that scaffolding explanations, by contextualizing them within a phenomenon or having students construct drawings, led to higher-quality scientific explanations. When provided with a specific phenomenon, students not only have to recognize the general model for a concept but also apply that model to a specific situation. This can make the task more cognitively challenging, because students must apply the model to a specific phenomenon, but also more accessible, because the task is situated under recognizable conditions.

In this study, a two-part assessment was developed that increases in scaffolding as students move through the tasks, in order to determine at what point within the assessment students begin to connect structures and properties. In the second part of the assessment, we contextualize structure–property relationships in a specific example of boiling point trends: we provide a boiling point ranking task and require the students to construct an argument to support their claim. Finally, the students are presented with the same boiling point phenomenon and required to provide an explanation (rather than an argument) of why the phenomenon occurs. This sequence of tasks allows us to explore how scaffolding influences students’ knowledge of structure–property relationships, compare students’ arguments and explanations about a given phenomenon, and determine the sensitivity of the instrument as a whole with students who have had different levels of exposure to structure–property relationships, as shown in prior studies (Cooper et al., 2015; Underwood et al., 2016).

Research questions

RQ1: To what extent does the assessment differentiate between students with different levels of exposure to structure–property relationships in their general chemistry courses?

RQ2: How does the phrasing of the prompt influence students’ ability to predict, argue, and explain structure–property relationships?

RQ3: How do students’ abilities to predict structure–property relationships compare to their explanations of boiling point trends?

Methods

Assessment design

The set of assessment tasks for this study was designed to allow us to investigate how increasingly scaffolded prompts impact the ways that students connect structure–property relationships. Appendix 1 presents a brief description of how the assessment was modified as well as who completed the tasks.

The final version of the assessment consists of two parts: Part 1 with four multiple-answer questions (Table 1: Questions 1–4), and Part 2 with three open-ended questions (Table 1: Questions 5–7). The purpose of the first part of the assessment was to determine how modifications in the prompt impact students’ abilities to self-report the types of information that can be predicted about a substance from its chemical structure. Question 1 (Table 1) presented students with the Implicit Information from Lewis Structures Instrument (IILSI) (Cooper et al., 2012a, 2012b), in which students are provided with 20 different types of information and asked to mark, without explicit prompting, those they believe could be predicted. Question 2 (Table 1) included the chemical structure of the amino acid alanine, a structure less familiar to students, to determine if the presence of a structure would prompt students to select more information on the IILSI. Question 3 (Table 1) reduced the options available in Question 2 from 20 to five. The purpose of this modification was to determine if students were more likely to identify chemical and physical properties when presented with a shorter list consisting only of the property-related options, which are typically not chosen. Question 4 (Table 1) was designed to have the most explicit and familiar prompting by asking students whether specific properties of alanine (e.g., boiling point and melting point) would be high or low. Therefore, the items in Part 1 consisted of versions of the previously reported IILSI (Cooper et al., 2012a, 2012b), with increasingly scaffolded prompting that would allow us to determine the level of scaffolding that might help students identify relationships between structures and properties.

Table 1 Assessment questions for Part 1 (Questions 1–4) and Part 2 (Questions 5–7)
Question 1: IILSI (Cooper et al., 2012a, 2012b) “What information could you determine using a Lewis structure and any other chemistry knowledge you have? (Mark all that apply)”
20 answer choices including information such as “hybridization”, “element(s) present”, “relative melting point”, etc.
Question 2: IILSI (Cooper et al., 2012a, 2012b) w/Lewis structure of alanine “If given the Lewis structure as shown, what information could you determine using the structure and any other chemistry knowledge you have? (Mark all that apply)”
20 answer choices (same as Question 1)
Question 3 (Lewis structure of alanine provided) “Now, for the same compound what types of the following information could you determine using its Lewis structure and any other chemistry knowledge you have? (Mark all that apply)”
5 answer choices: “relative boiling point”, “relative melting point”, “acidity/basicity”, “reactivity”, and “none of these properties can be predicted”
Question 4 (Lewis structure of alanine provided) “For the same compound shown below, please select all of the properties that could be predicted using the structure and any other chemistry knowledge you have.”
Answer choices: Boiling/Melting Point – “High boiling point”, “High melting point”, “Low boiling point”, “Low melting point”, and “Can’t be predicted”, Acidity/Basicity – “Acidic”, “Basic”, “Neutral”, “Acidic and Basic”, “Can’t be predicted”, and Reactivity – “Reactive”, “Non-reactive”, and “Can’t be predicted”
Question 5: Boiling point ranking task and reasoning “If given the two compounds below [ethanol and dimethyl ether], which compound would you predict has the higher boiling point? Please explain your reasoning.”
Informational Slide 1 Same chemical formula, same molecular mass; dimethyl ether is a gas and ethanol is a liquid at room temperature; students are told the values of their boiling points
Informational Slide 2 “The difference in properties between the two compounds is because ethanol can participate in hydrogen bonding.”
Question 6: Draw hydrogen bonding for three molecules of ethanol (Cooper et al., 2015) “Please draw and label a representation below in the blue box that clearly indicates where hydrogen bonding is present for three molecules of ethanol (CH3CH2OH).”
Question 7: Explain why ethanol has a higher boiling point due to hydrogen bonding “Using your representation of hydrogen bonding in the blue box, explain in the black box why the ability of ethanol to form hydrogen bonds results in ethanol having a higher boiling point than dimethyl ether.”


Part 2 of the assessment was designed to investigate whether students can actually predict and explain structure–property phenomena such as boiling point. The first question in Part 2 (Table 1 – Question 5) included a ranking task that required students to make an argument about a boiling point trend (i.e., predict the trend and support their claim with an argument). A ranking task was used since this is a common type of assessment seen by students in general chemistry courses. Although most students are familiar with and can answer these tasks, it has been shown that many students are unable to provide adequate explanations for ranking tasks (McClary and Talanquer, 2011). Previous studies have shown that merely asking students to provide an argument (Table 1, Question 5) may not elicit a rich response. Because of this, informational slides were added to Part 2 of the assessment to provide students with the information that ethanol is a liquid at room temperature while dimethyl ether is a gas, and that this difference is due to ethanol molecules being able to hydrogen bond. Question 6, originating from the Intermolecular Forces Assessment (IMFA) (Cooper et al., 2015), was added to further capture students’ ideas about hydrogen bonding, since research has shown that students have difficulty differentiating between bonding and electrostatic non-covalent interactions (National Research Council, 1999; Cooper et al., 2013, 2015). Question 7 (Table 1) then asked the students to use the information presented to them, along with their representation of hydrogen bonding for ethanol, to explain the phenomenon (i.e., ethanol is a liquid at room temperature while dimethyl ether is a gas).

Student population

The participants in this study consisted of two populations of students enrolled in a second-semester general chemistry course during the Spring 2016 semester. These students were administered the final version of the assessment as a homework assignment at the end of the semester on beSocratic, a free-form structure drawing program that allows students to submit written and drawn responses (Cooper et al., 2014). Each task was presented on a separate page, and students were told that once an answer was submitted they would not be able to return to a previous question. Students were asked to try their best and not to use any outside resources (e.g., peers or notes). All participants were notified of their rights as human subjects; this project was certified as exempt by the IRB.

Group A: the first group of students (N = 117) was from a large Southeastern public research university. At this institution, class sizes range from 200 to 300 students per section, and the curriculum would be considered traditional in content (i.e., commercially available textbook and homework systems). However, this course incorporated worksheets inspired by Process Oriented Guided Inquiry Learning (POGIL) (Farrell et al., 1999) using a flipped classroom approach (Baker, 2000; Tucker, 2012), with lecture videos provided to students outside of class time and students working in groups of four on worksheets during class time. In addition, personal student response systems (e.g., clickers) were used to monitor students’ understanding during class time. The instructors administered common multiple-choice exams. Approximately one 75-minute lecture period during the first semester of general chemistry was spent on the topic of comparing boiling point trends using Lewis structures. Although there were direct questions regarding boiling point ranking tasks on a worksheet given in class, no questions on the exams specifically focused on the topic.

Group B: the second group of students (N = 96) was from a large Midwestern public research university. The class size at this institution was around 400 students per lecture section, and an alternative general chemistry curriculum, Chemistry, Life, the Universe and Everything (CLUE) (Cooper and Klymkowsky, 2013), was used. Along with lectures, students attended a weekly 50-minute recitation of approximately 20 students per section, in which they worked in groups of four to complete worksheets. Clickers were also used in the classroom to monitor students’ understanding in real time. In this course, the topic of boiling point trends was explicitly discussed multiple times throughout the course and assessed on exams in the form of both multiple-choice and short-answer questions.

It is important to note that the two populations are unique in many aspects (e.g., different universities, curricula, and demographics of students). Therefore, the two populations were used in this study to determine the sensitivity of the assessment, given the different emphasis placed on the boiling point trends for the two groups, and not to compare their performances in the assessment.

Data analysis

Part 1 – making predictions – multiple answer

The student response data for Part 1 of the assessment (Questions 1–4) were exported from beSocratic as a CSV file and cleaned prior to analysis. Four students were removed for not completing the assessment. Responses to Question 4 were also simplified, since students were able to choose multiple responses related to one property and the goal of the study was to determine whether the students made a connection in general. For example, students were able to choose “high boiling point” or “low boiling point”, and if a student chose either response, they were coded as “select” for the boiling point property, since their answer could be correct depending on the reference substance the student used. That is, if students were asked about water, they may respond that it has a high boiling point when compared to ammonia, but a low boiling point when compared to acetic acid. This was done for all properties included in Question 4.
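
As an illustration of this recoding step, the sketch below shows one way it could be carried out in Python with pandas; the DataFrame and its column names are hypothetical stand-ins for the exported beSocratic data, not the authors’ actual pipeline.

```python
import pandas as pd

# Hypothetical excerpt of the exported Question 4 responses: one boolean
# column per answer choice for the boiling point property.
responses = pd.DataFrame({
    "student": ["S1", "S2", "S3"],
    "high_boiling_point": [True, False, False],
    "low_boiling_point": [False, True, False],
    "cannot_be_predicted": [False, False, True],
})

# A student who marked either "high" or "low" is coded as "select", since
# either choice could be correct depending on the reference substance used.
responses["boiling_point_code"] = (
    responses["high_boiling_point"] | responses["low_boiling_point"]
).map({True: "select", False: "not select"})

print(responses[["student", "boiling_point_code"]])
# S1 and S2 are coded "select"; S3 is coded "not select".
```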

Part 2 – constructing arguments/explanations – open-ended

Student arguments/explanations and drawings in Part 2 of the assessment (Questions 5–7) were coded to identify themes in student reasoning. Initially, the student responses were analyzed using an open-coding process, which led to singular codes identifying all of the concepts students used. After considering the concepts essential to the boiling point trend argument or explanation, the coding scheme was revised to include only relevant concepts such as hydrogen bonding, strength of interactions/bonds, and energy. Since most students’ responses contained multiple concepts, the codes were aggregated into larger themes. The final coding scheme with definitions and examples of student responses, using pseudonyms, is shown in Table 2. All other codes fell into the “other” category since they appeared infrequently. The same coding scheme was used for both Questions 5 and 7; inter-rater reliability was assessed between two of the authors for these questions and produced Cohen's kappa (κ) values from 0.8 to 1.0.
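
For reference, Cohen's kappa corrects the observed agreement between two raters for the agreement expected by chance, κ = (p_o − p_e)/(1 − p_e). Below is a minimal sketch of such an agreement check, using hypothetical code assignments rather than the actual study data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical theme assignments by two independent raters for five
# student responses, using labels from the Table 2 coding scheme.
rater_1 = ["hydrogen bonding", "non-normative", "hb + strength",
           "hb + strength + energy", "hydrogen bonding"]
rater_2 = ["hydrogen bonding", "non-normative", "hb + strength",
           "hb + strength + energy", "hb + strength"]

# kappa = (p_o - p_e) / (1 - p_e); 1.0 is perfect agreement, 0 is chance.
print(round(cohen_kappa_score(rater_1, rater_2), 2))
```
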
Table 2 Coding schemes used for students’ arguments/explanations
Code Definition Examples of students’ responses
Student Does Not Know/No Response/Cannot Be Predicted Student expresses that they do not know the answer, they do not provide any reasoning, or that boiling point cannot be predicted from a Lewis structure. “There needs to be more information for me to predict the boiling point.” – Derek
“I don’t have the knowledge to predict bp [boiling point].” – Jeanne
Non-normative Student uses scientifically inaccurate or unrelated reasoning. “It's going to be harder to break apart the carbon–carbon bonds, but dimethyl ether has bonds to oxygen, which are easier to break.” – Lindsay
“Ethanol has an acidic H which means that if it lost that H, the molecule would be charged which means that it will have a higher boiling point.” – Amy
Hydrogen bonding Student explicitly mentions hydrogen bonding in their reasoning. “Ethanol can form hydrogen bonds because they form between a H bonded to an O, N, F and the electron pair of another element (only N, O, and F).” – Bryan
“Ethanol is able to hydrogen bond, therefore it has the higher boiling point.” – Edward
Hydrogen bonding and strength of bonds/interactions Student explicitly mentions hydrogen bonding. Student compares the strength of intermolecular forces or bonds in their response, ranking strengths or referring to bonds/interactions as being more difficult to break/interrupt. No energy argument included. “Ethanol can form hydrogen bonds which are stronger than any that dimethyl ether can form.” – Nick
“Ethanol has a hydrogen bond which is much stronger and harder to break.” – Destiny
Hydrogen bonding, strength of bonds/interactions, and energy Student explicitly mentions hydrogen bonding and strength of bonds/interactions. Student mentions energy in terms of it being higher/lower or more/less than another entity. Students may use the term “heat” instead of energy. “The ethanol can form hydrogen bonds which are the strongest IMF and would require a lot of energy to break leading to a higher BP.” – Tiffany
“Ethanol has hydrogen bonding, which is a strong bond. Breaking hydrogen bonds need to put a lot of energies in it, so the boiling point or melting point is higher for ethanol.” – Peter


The student drawings of hydrogen bonding using three ethanol molecules in Question 6 were coded using the originally published coding scheme for the IMFA (Cooper et al., 2015). The codes include “between ethanol molecules”, “within ethanol molecules”, “ambiguous representation of hydrogen bonding”, and “student does not know” (Table 3). Other codes, such as “dimethyl ether” for students who drew dimethyl ether molecules, appeared infrequently and were placed in the larger theme of “other”. Inter-rater reliability was assessed between two of the authors for Question 6 and produced Cohen's kappa (κ) values from 0.8 to 1.0.

Table 3 Coding scheme used for student drawings of hydrogen bonding (Cooper et al., 2015)
Code “Between” “Within” “Ambiguous”
Drawing example (example student drawings for each code; images not reproduced)


Results and discussion

RQ1: To what extent does the assessment differentiate between students with different levels of exposure to structure–property relationships in their general chemistry courses?

Responses to Questions 1 and 5 were examined for Groups A and B to determine whether the assessment could detect differences between the performances of the two groups based on how the topic of structure–property relationships was emphasized in each course. Questions 1 and 5 were chosen since they represent students’ first responses to the two different tasks: self-reporting which structure–property relationships can be predicted and completing a boiling point ranking task.

Fig. 1 presents the performance of students from Groups A and B on Questions 1 and 5. For Question 1, labeled as “Predict on IILSI”, students could select or not select boiling point as a property that could be predicted from a Lewis structure. This graph shows that only 9% (N = 10) of students in Group A initially selected boiling point. This corresponds to their responses to Question 5, the boiling point ranking task, where students appear to be guessing, with a relatively even split across all four options. On the other hand, 60% (N = 58) of the students in Group B selected boiling point for Question 1, with a clear majority choosing ethanol as having the higher boiling point for Question 5.


Fig. 1 Comparison of performances on Question 1 (boiling point prediction on the IILSI) and Question 5 (boiling point ranking task) for Groups A and B.

In the Group A curriculum, the topic of structure–property relationships was covered in one lecture and on one worksheet, and the vast majority of those students did not select boiling point as having a relationship with a Lewis structure and appear to have guessed on the boiling point ranking task. The students who had greater exposure to this topic, Group B, were more likely to identify the relationship between boiling point and Lewis structure as well as to identify the correct structure as having the higher boiling point. Therefore, these differences provide evidence that the assessment is sensitive enough to differentiate between different curricular emphases. The data shown here for the IILSI and IMFA are similar to previously reported research findings (Cooper et al., 2012a, 2012b; Williams et al., 2015; Underwood et al., 2016), suggesting that the modified assessment behaved similarly to the individual assessment tasks for validity and reliability purposes.

RQ2: How does the phrasing of the prompt influence students’ ability to predict, argue, and explain structure–property relationships?

Self-reported ability to select properties (Part 1 of the assessment). Four categories emerged from analyzing how students’ responses changed as they progressed through Questions 1–4 (Table 1), based on their self-reported abilities to connect structures and properties: (1) students who never selected that a specific type of information could be predicted from a Lewis structure, regardless of the prompt (referred to as “Never Predict”); (2) students who were inconsistent in their selection, selecting a type of information for one question but not selecting the same information in a later question (referred to as “Inconsistently Predict”); (3) students who selected a type of information after a particular question and continued to select that information in later questions (referred to as “Predict Upon Prompting”); and (4) students who always selected a specific type of information regardless of how the prompt changed (referred to as “Always Predict”). Table 4 shows the percentage of students in Groups A and B who aligned with each category.
Table 4 Percentage of students in Groups A and B who were aligned with each category for Part 1 of the assessment
Category Group A Group B
Never predict 58% (N = 68) 10% (N = 10)
Inconsistently predict 7% (N = 8) 27% (N = 26)
Predict upon prompting 32% (N = 37) 27% (N = 25)
Always predict 3% (N = 4) 35% (N = 34)
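
The four category definitions above amount to a simple rule over a student's selection pattern across Questions 1–4; the sketch below is our illustration of that rule in Python, not analysis code from the study.

```python
def categorize(selections):
    """Classify one student's selections of a property across Questions 1-4.

    `selections` is a list of four booleans; e.g., [False, False, True, True]
    means the property was first selected on Question 3 and kept thereafter.
    """
    if not any(selections):
        return "Never Predict"
    if all(selections):
        return "Always Predict"
    first = selections.index(True)
    # Once selected, selected on every subsequent question.
    if all(selections[first:]):
        return "Predict Upon Prompting"
    # Selected on one question but dropped on a later one.
    return "Inconsistently Predict"

print(categorize([False, False, False, True]))  # Predict Upon Prompting
print(categorize([False, True, False, True]))   # Inconsistently Predict
```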


To simplify the discussion, we present the results from Part 1 with respect to the physical property of boiling point, since it corresponds directly to the boiling point ranking task in Part 2 of the assessment. The findings for the other chemical/physical properties on the IILSI were similar and are presented in Appendix 2.

Ideally, we would like students to align with the “Always Predict” category, indicating that they consistently and implicitly connect boiling point trends with Lewis structures. As shown in Table 4, the majority of students in Group A (58%, N = 68) never selected boiling point as a property that could be predicted from a Lewis structure, regardless of how the question was presented, whereas only 3% (N = 4) of students fell into the “Always Predict” category. However, 32% (N = 37) of the students in Group A aligned with the “Predict Upon Prompting” category, and the majority of these students, 62% (N = 23 out of 37), selected boiling point on Question 4.

In contrast, 35% (N = 34) of students in Group B consistently selected that boiling point could be predicted using a molecular structure, while the smallest percentage of students, 10% (N = 10), never selected boiling point across the assessment. Students in the “Predict Upon Prompting” category, 27% (N = 25), followed a similar pattern to the students in Group A, with the largest percentage, 60% (N = 15 out of 25), first choosing that boiling point could be predicted in Question 4 (Table 1). This is the most explicit or structured question, showing that the last question in Part 1 of the assessment had the most impact on these students’ selections of boiling point's relationship to a Lewis structure. Providing a Lewis structure with the original IILSI (Question 2) did not provide much additional help to students; however, reducing the number of options and stating the options in terms of high/low (Question 4) was more helpful. It is possible that the last question cued the students into recalling information provided in a general chemistry classroom (e.g., water has a high boiling point compared to other molecular compounds of similar size such as ammonia or methane). However, since a majority of students in Group A did not choose that boiling point could be predicted using a molecular structure, even on Question 4, it appears that students must have some knowledge of this topic to choose boiling point. Further studies would be needed to determine why students in the “Predict Upon Prompting” category do not link boiling point and Lewis structure until Question 4.

For the students in both groups who needed prompting to connect structures and properties (32% and 27% for Groups A and B, respectively), it took the most explicit prompting (Question 4) to do so. The wording in Question 4 also differed by including both high and low boiling points as options, implying that many of the students in the “Predict Upon Prompting” category may have needed the options worded in such a way to identify the connection between the Lewis structure and the property. The trends observed in Part 1 of the assessment suggest that students had difficulty with the beginning step of constructing an argument (i.e., making a prediction or claim) and clearly need more experience to explicitly develop connections between structures and properties.

In summary, the first half of the assessment demonstrated that over half of the students in Group A never connected structures and properties, despite explicit prompting to do so. On the other hand, over half of the students in Group B consistently predicted the boiling point structure–property relationship. The differences in performance of Groups A and B on Part 1 of the assessment provide further evidence of the sensitivity of this instrument.

Students’ ability to construct an argument versus an explanation for boiling point trends (Part 2 of the assessment).
Constructing an argument. The coding scheme outlined in Table 2 was applied solely to Group B students’ responses for the boiling point ranking task (Question 5), since the majority of the students in Group A did not connect Lewis structure to boiling point. Fig. 2 shows the codes for students’ arguments based on their boiling point predictions. The “cannot be predicted” option, which 3% (N = 3) of students chose, is excluded from the graph since these students’ reasoning consisted solely of “no explanation”.
Fig. 2 Distribution of codes for students’ arguments (Question 5) as it relates to their boiling point predictions for Group B.

As shown in Fig. 1, 64% (N = 61) of the students in Group B chose ethanol, 22% (N = 21) chose dimethyl ether, and 11% (N = 11) chose that the two compounds have the same boiling point. As shown in Fig. 2, 20% (N = 19) of students chose ethanol and used the concept of hydrogen bonding as evidence for their prediction, with an additional 16% (N = 15) of students going beyond hydrogen bonding to incorporate the strength of the bond/interaction. Ideally, students would use the concepts of hydrogen bonding, strength of interactions, and the amount of energy needed to overcome the attractive forces between molecules to explain their boiling point prediction. Unfortunately, only 15% (N = 14) were able to do so on Question 5. It is promising, though, that students introduced relevant concepts into their arguments about the boiling point ranking task without being prompted to do so.

Most of the students who indicated that dimethyl ether had the higher boiling point provided non-normative arguments, using concepts such as stability and the breaking of bonds (22%, N = 21). This is expected, since in order to justify that dimethyl ether has a higher boiling point, students would most likely have to provide scientifically inaccurate or unrelated reasoning. The arguments used by students who chose that the boiling points were equal (11%, N = 11) typically consisted of structural information, such as the number of atoms of each element. These students tended to fixate on the structure of a single molecule instead of thinking about how multiple molecules would interact within the substance.


Constructing an explanation. After Question 5, students were given two informational slides. These slides provided them with a phenomenon (ethanol is a liquid at room temperature while dimethyl ether is a gas, indicating that ethanol has a higher boiling point) and evidence for the phenomenon (ethanol has a higher boiling point because it can participate in hydrogen bonding). Question 7 asked them to provide the explanation for why hydrogen bonding leads to ethanol having a higher boiling point than dimethyl ether. Fig. 3 shows the distribution of codes for students’ explanations answering Question 7. The 3% of students (N = 3) who chose “Cannot Predict” were excluded from Fig. 3 since each student provided an explanation that aligned with a different code.
Fig. 3 Distribution of codes for students’ explanations (Question 7) as it relates to their boiling point predictions for Group B.

Students’ explanations (Question 7) were richer and included more ideas than their original arguments (Question 5). This effect was observed even for those students who had earlier answered the boiling point ranking task incorrectly. Although only 15% (N = 14) of students invoked the concept of energy as part of their arguments in Question 5, 51% (N = 49) of students used the concept of energy in their explanations in Question 7, without being explicitly prompted to do so. This shows not only that students have that knowledge but also that they can use it to build on the information provided. Since the informational slides already provided the idea that hydrogen bonding was involved in the mechanism by which substances boil, the students appear to have understood that repeating this idea was not sufficient to answer the question and were more likely to build on that concept to provide an explanation that included both the strength of interactions/bonds and the energy required to overcome them.


Students’ hydrogen bonding drawings. Finally, previous studies have shown that as many as 72% of general chemistry students believe that hydrogen bonding consists of an actual covalent bond within the molecule (Cooper et al., 2015). Because of this, Question 6 was included, which required students to draw and label a representation of hydrogen bonding using three ethanol molecules. We found that 68% (N = 65) of students in Group B correctly drew hydrogen bonding between the molecules of ethanol. That is, the majority of students using the concept of hydrogen bonding in Questions 5 and 7 had a correct understanding that hydrogen bonding occurs between molecules. Although there was no obvious trend between students’ drawings and their written responses, having students draw hydrogen bonding allowed us to gather a visual representation of their understanding of the concept.

In summary, by changing the nature of the task from an argument to an explanation in Part 2 of the assessment, the students provided richer responses for why ethanol has a higher boiling point than dimethyl ether. To answer Question 5, the students were required to make an argument in which most of the students used the concepts of hydrogen bonding and strength of bonds/interactions. However, when provided with “the answer” and asked to construct an explanation for why ethanol has a higher boiling point than dimethyl ether (Question 7), many more students also included the concept of energy. That is, the structure of the prompt determined the types of information that the students used in their responses. The explanation prompt (Question 7) activated resources that the argumentation prompt (Question 5) did not.

RQ3: How do students’ abilities to predict structure–property relationships compare to their explanations of boiling point trends?

For this analysis we examined the relationship between students’ predictions on the IILSI (Question 1) and students’ responses for boiling point trends (Questions 5 and 7). Only Group B is presented here for the same reasons mentioned previously. Fig. 4 shows the distribution of the codes for Questions 5 (argument) and 7 (explanation) based on whether students selected boiling point on the IILSI for Question 1 (60%, N = 58) or did not select boiling point (40%, N = 38).
Fig. 4 Distribution of codes for students’ arguments and explanations (Question 5 and Question 7, respectively) based on whether they selected boiling point for Question 1.

For Question 5, the students who selected on the IILSI in Question 1 that the relative boiling point could be predicted using a molecular structure were more likely to bring in concepts such as hydrogen bonding or the strength of bonds/interactions. However, a large percentage of students (44%, N = 42) provided some type of non-normative argument, regardless of whether or not they had identified a relationship between boiling point and Lewis structure in Question 1.

When asked for an explanation in Question 7, the students in both subgroups shifted to more thorough explanations, with only 16% (N = 15) of students providing non-normative responses. The difference between the students who selected in Question 1 that boiling point could be predicted from the Lewis structure and those who did not lies in the depth of their explanations. More students brought in concepts like strength of bonds/interactions and energy: 28% (N = 27) of the students both selected that boiling point could be predicted and provided an explanation connecting hydrogen bonding, strength of bonds/interactions, and energy, compared to only 9% (N = 9) of the students connecting these concepts in their explanations when they did not select boiling point on the IILSI for Question 1. These analyses show that even if students consider concepts such as hydrogen bonding and strength of bonds/interactions, they do not necessarily see the connection between those concepts and structure–property relationships or use them as part of an argument. By asking for an explanation about a boiling point trend, rather than having students construct an argument, students are more likely to include those concepts. This also shows that the students who initially identify the relationship between structure and properties, such as boiling point, are more likely to have a more mechanistic understanding of that relationship. It is important to note, however, that further studies would be needed to develop a better understanding of whether these students have a causal mechanistic understanding of this phenomenon (i.e., how and why energy plays a role in phase changes).

Overall, these analyses show the role of explanation in helping students identify and explain structure–property relationships. For Part 1 of the assessment, limiting the number of choices and wording them in a different way, such as high or low boiling point, appears to have allowed more students to self-report an understanding that structure and properties are connected.

Part 2 of the assessment provides evidence that many students do not access all the resources available to them to answer a question. When the students were asked to provide an explanation, many more of them were able to tie in other appropriate concepts to provide reasoning. While ideally we might like students to consistently and explicitly connect these concepts in order to argue and explain structure–property relationships, what is clear from this work is that along the way we must support the students by designing appropriate task prompts.

Conclusions

In this study we aimed to expand upon previous research investigating how prompts within an assessment elicit students’ knowledge of structure–property relationships. We focused on two goals: (1) determining what level of prompting is required to help students access the appropriate resources to connect structures with their physical and chemical properties, and (2) characterizing the different types of responses produced by asking students to construct either an argument or an explanation. In Part 1 of the assessment, four categories were identified for how students’ ability to predict properties changed with more explicit prompting. While the largest percentages of students fell into the “Never Predict” and “Always Predict” categories for Groups A and B respectively, a large number of students needed prompting, through limiting and re-wording the informational choices, before they reported connections between structures and properties. These findings support the notion that many students need explicit prompting to help them make connections between structures and properties. Eventually we might hope that students will be able to summon appropriate resources to answer questions like these more efficiently; however, it is clear that appropriate prompting can be helpful for students who are first learning these ideas. The findings in Part 2 of the assessment further support prior findings in the literature indicating that students struggle with reasoning about boiling point trends. Although a large number of students in our study initially provided non-normative arguments for why they thought ethanol or dimethyl ether had a higher boiling point, their explanations became more thorough when relevant information was provided. Additionally, when asked to construct an explanation about the phenomenon (that ethanol has a higher boiling point because of hydrogen bonding), many students were able to move beyond the information provided and began to introduce concepts like energy without being explicitly prompted to do so.

Comparing students’ responses between Parts 1 and 2 of the assessment, we found that students who initially predicted on the IILSI (Question 1) that boiling point depends on molecular structure were three times more likely (28% versus 9%) to connect hydrogen bonding, strength of bonds/interactions, and energy in their explanations than students who did not select this item. Overall, these results suggest that despite students’ difficulties with predicting and explaining the relationship between structure and properties, providing the phenomenon with relevant information led to more complete explanations for the relationship between structure and boiling point trends.

Implications

There is now an extensive body of research documenting students’ difficulty with understanding structure–property relationships, and it would be fair to assume that such a core concept in chemistry would be given great significance and taught in a way that stresses long-term understanding. There has been a strong push in recent years to abandon traditional methods of teaching in favor of active learning environments, such as POGIL (Farrell et al., 1999) and Peer-Led Team Learning (PLTL) (Gosser and Roth, 1998). Additionally, there has been a call to reconsider the general chemistry curriculum (what is taught and how it is taught) to better serve students (Rickard, 1992; Cooper et al., 2013; Schaller et al., 2014; Sevian and Talanquer, 2014). Here we suggest that formative (and summative) assessment items should be carefully chosen and designed to allow students to make connections and use all of the knowledge they have at their disposal. Scaffolding students’ explanations by providing relevant information can improve the potential to elicit students’ understanding, but there is still more room to improve students’ initial and explicit understanding of structure–property relationships via curricular redesign.

Limitations of the study

The findings in this study are limited in that only two student populations are presented. Not all curricula are the same, and other student populations may not perform similarly. In particular, these findings may differ for students who were given more time to develop an understanding of structure–property relationships. In addition, there are limitations with the assessment itself. For example, the first half of the assessment featured multiple-answer questions, meaning that students’ reasoning for their answer choices was not captured. Having students explain their answer choices for each question, especially Question 4, would provide insight into what students are thinking about the questions and how the prompting impacts their thought processes.

Conflicts of interest

There are no conflicts to declare.

Appendix 1

Multiple groups of students from various institutions were involved in the development of the assessment described in this paper. Table 5 displays the number of students from each institution who were administered the initial versions of the assessment.
Table 5 Student populations who were involved in the development of the assessment
Semester administered Student population
a Populations of students who are discussed in the paper.
Spring 2013 N = 895, second-semester general chemistry, University 1
N = 303, second-semester organic chemistry, University 2
Fall 2015 N = 167, second-semester general chemistry, University 2
Spring 2016 N = 96, second-semester general chemistry, University 2a
N = 116, second-semester general chemistry, University 3a


The first version of the assessment consisted of four questions (Table 1: Questions 1, 3, 4 and 5) and was piloted during the Spring 2013 semester to determine how students’ predictions of structure–property relationships changed depending on the prompt provided. Based on students’ initial responses, the informational slides, as well as a task asking students to draw their understanding of hydrogen bonding, were added to the assessment. The modified assessment was administered in Fall 2015 to confirm, before the final administration in Spring 2016, that the prompt was not so explicit that it would help students answer a question they otherwise could not.

Appendix 2

Part 1 of the assessment included physical and chemical properties such as relative melting point, reactivity, and acidity/basicity. Tables 6 and 7 present the distribution of students in Groups A and B across the four categories presented in the paper. While a majority of the students in Group A never predicted any of these properties, most of the students in Group B either always predicted or predicted upon prompting for these properties.
Table 6 Distribution of students in Group A who were aligned with each category
Category Relative melting point Reactivity Acidity/basicity
Never predict 67% (N = 78) 43% (N = 50) 46% (N = 54)
Inconsistently predict 8% (N = 9) 20% (N = 23) 4% (N = 5)
Predict upon prompting 24% (N = 28) 35% (N = 41) 44% (N = 52)
Always predict 2% (N = 2) 3% (N = 3) 5% (N = 6)


Table 7 Distribution of students in Group B who were aligned with each category
Category Relative melting point Reactivity Acidity/basicity
Never predict 16% (N = 15) 14% (N = 13) 5% (N = 5)
Inconsistently predict 27% (N = 26) 24% (N = 23) 28% (N = 28)
Predict upon prompting 23% (N = 22) 36% (N = 35) 25% (N = 24)
Always predict 34% (N = 33) 26% (N = 25) 41% (N = 39)


Acknowledgements

This study was supported by startup funds from Florida International University, State of Florida, for support of the UP:LIFT project, the Howard Hughes Medical Institute Grant No. HHMI 52008097, the National Science Foundation under FIU REU Site Grant No. CHE – 1560375, and other National Science Foundation funding DUE 0816692 (1359818), DUE 1043707 (1420005), and DUE 1122472 (1341987).

References

  1. Baker J. W., (2000), The “classroom flip”: using web course management tools to become the guide by the side, Presented at the 11th International Conference on College Teaching and Learning, Jacksonville, FL.
  2. Berland L. K. and McNeill K. L., (2012), For whom is argument and explanation a necessary distinction? A response to Osborne and Patterson, Sci. Educ., 96(5), 808–813.
  3. Berland L. K. and Reiser B. J., (2009), Making sense of argumentation and explanation, Sci. Educ., 93(1), 26–55.
  4. Bodner G., (1991), The conceptual knowledge of beginning chemistry graduate students, J. Chem. Educ., 68(5), 385–388.
  5. Chang R. and Goldsby K., (2013), General Chemistry: The essential concepts, 7th edn, New York, NY: McGraw Hill Education.
  6. Chin C. and Brown D. E., (2000), Learning in science: a comparison of deep and surface approaches, J. Res. Sci. Teach., 37, 109–138.
  7. Cooper M. M. and Klymkowsky M. W., (2013), Chemistry, life, the universe, and everything: a new approach to general chemistry and a model for curriculum reform, J. Chem. Educ., 90, 1116–1122.
  8. Cooper M. M. and Stowe R. L., (2018), Chemistry education research – from personal empiricism to evidence, theory, and informed practice, Chem. Rev., 118(12), 6053–6087.
  9. Cooper M. M., Grove N. and Underwood S. M., (2010), Lost in Lewis Structures: An Investigation of Student Difficulties in Developing Representational Competence, J. Chem. Educ., 87(8), 869–874.
  10. Cooper M. M., Underwood S. M. and Hilley C. Z., (2012a), Development and validation of the implicit information from Lewis structures instrument (IILSI): do students connect structures with properties? Chem. Educ. Res. Pract., 13, 195–200.
  11. Cooper M. M., Underwood S. M., Hilley C. Z. and Klymkowsky M. W., (2012b), Development and Assessment of a Molecular Structure and Properties Learning Progression, J. Chem. Educ., 89(11), 1351–1357,  DOI:10.1021/ed300083a.
  12. Cooper M. M., Corley L. M. and Underwood S. M., (2013), An investigation of college chemistry students’ understanding of structure–property relationships, J. Res. Sci. Teach., 50(6), 699–721.
  13. Cooper M. M., Underwood S. M., Bryfczynski S. and Klymkowsky M. W., (2014), A short history of the use of technology to model and analyze student data for teaching and research, in Cole R. and Bunce D. (ed.), Tools of Chemistry Education Research, Washington, DC: ACS Symp. Ser. 1166, pp. 219–239.
  14. Cooper M. M., Williams L. C. and Underwood S. M., (2015), Student understanding of intermolecular forces: a multimodal study, J. Chem. Educ., 92, 1288–1298.
  15. Cooper M. M., Kouyoumdjian H. and Underwood S. M., (2016), Investigating students’ reasoning about acid–base reactions, J. Chem. Educ., 93(10), 1703–1712.
  16. Cooper M. M., Posey L. A. and Underwood S. M., (2017), Core ideas and topics: building up or drilling down? J. Chem. Educ., 94(5), 541–548.
  17. DeFever R. S., Bruce H. and Bhattacharyya G., (2015), Mental rolodexing: Senior chemistry majors’ understanding of chemical and physical properties, J. Chem. Educ., 92(3), 415–426.
  18. DiSessa A. A., (2014), A history of conceptual change research: threads and fault lines, in The Cambridge Handbook of the Learning Sciences, New York, NY: Cambridge University Press, pp. 88–108.
  19. Driver R., Newton P. and Osborne J., (2000), Establishing the norms of scientific argumentation in classrooms, Sci. Educ., 84, 287–312.
  20. Farrell J. J., Moog R. S. and Spencer J. N., (1999), A guided-inquiry general chemistry course, J. Chem. Educ., 76(4), 570–574.
  21. Gosser D. K. and Roth V., (1998), The workshop chemistry project: peer led team learning, J. Chem. Educ., 75, 185–187.
  22. Henderleiter J., Smart R., Anderson J. and Elian O., (2001), How do organic chemistry students understand and apply hydrogen bonding? J. Chem. Educ., 78(8), 1126–1130.
  23. Jin H. and Anderson C.W., (2012), A learning progression for energy in socio-ecological systems, J. Res. Sci. Teach., 49(9), 1149–1180.
  24. Kang H., Thompson J. and Windschitl M., (2014), Creating opportunities for students to show what they know: the role of scaffolding in assessment tasks, Sci. Educ., 98(4), 674–704.
  25. Maeyer J. and Talanquer V., (2010), The role of intuitive heuristics in students’ thinking: ranking chemical substances, Sci. Educ., 94(6), 963–984.
  26. Maeyer J. and Talanquer V., (2013), Making predictions about chemical reactivity: assumptions and heuristics, J. Res. Sci. Teach., 50(6), 748–767.
  27. McClary L. and Talanquer V., (2011), Heuristic reasoning in chemistry: making decisions about acid strength, Int. J. Sci. Educ., 33(10), 1433–1454.
  28. McNeill K. L. and Krajcik J. S., (2007), Scientific explanations: characterizing and evaluating the effects of teachers’ instructional practices on student learning, J. Res. Sci. Teach., 45(1), 53–78.
  29. McNeill K. L. and Krajcik J. S., (2008), Chapter 11: Inquiry and scientific explanations: helping students use evidence and reasoning, in Science As Inquiry in the Secondary Setting, pp. 121–134.
  30. McNeill K. L. and Krajcik J. S., (2011), Supporting grade 5–8 students in constructing explanations in science: the claim, evidence, and reasoning framework for talk and writing, Boston, MA: Pearson.
  31. Murphy K., Holme T., Zenisky A., Caruthers H. and Knaus K., (2012), Building the ACS exams anchoring concept content map for undergraduate chemistry, J. Chem. Educ., 89(6), 715–720.
  32. National Research Council, (1999), How people learn: Brain, mind, experience and school, Washington, DC: National Academies Press.
  33. Osborne R. J. and Cosgrove M. M., (1983), Children's conceptions of the changes of state of water, J. Res. Sci. Teach., 20(9), 825–838.
  34. Osborne J. F. and Patterson A., (2011), Scientific argument and explanation: a necessary distinction? Sci. Educ., 95(4), 627–638.
  35. Rickard L. H., (1992), Reforms in general chemistry curriculum, J. Chem. Educ., 69(3), 175.
  36. Schaller C. P., Graham K. J., Johnson B. J., Fazal M. A., Jones T. N., McIntee E. J. and Jakubowski H. V., (2014), Developing and implementing a reorganized undergraduate chemistry curriculum based on the foundational chemistry topics of structure, reactivity, and quantitation, J. Chem. Educ., 91(3), 321–328.
  37. Schmidt H. J., (1996), Students’ understanding of molecular structure and properties of organic compounds, Presented at the annual meeting of the National Association for Research in Science Teaching, St. Louis, MO.
  38. Schmidt H., Kaufmann B. and Treagust D. F., (2009), Students’ understanding of boiling points and intermolecular forces, Chem. Educ. Res. Pract., 10(4), 265–272.
  39. Sevian H. and Talanquer V., (2014), Rethinking chemistry: a learning progression on chemical thinking, Chem. Educ. Res. Pract., 15, 10–23.
  40. Shane J. W. and Bodner G. M., (2006), General chemistry students’ understanding of structure-function relationships, Chem. Educ., 11(2), 130–137.
  41. Tucker B., (2012), The flipped classroom, Education Next, 12(1), 82–83.
  42. Underwood S. M., Reyes-Gastelum D. and Cooper M. M., (2015), Answering the questions of whether and when student learning occurs: using discrete-time survival analysis to investigate how college chemistry students’ understanding of structure–property relationships evolves, Sci. Educ., 99(6), 1055–1072.
  43. Underwood S. M., Reyes-Gastelum D. and Cooper M. M., (2016), When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula, Chem. Educ. Res. Pract., 17, 365–380.
  44. Von Aufschnaiter C., Erduran S., Osborne J. and Simon S., (2008), Case studies of how students’ argumentation relates to their scientific knowledge, J. Res. Sci. Teach., 45(1), 101–131.
  45. Williams L. C., Underwood S. M., Klymkowsky M. W. and Cooper M. M., (2015), Are noncovalent interactions an Achilles heel in chemistry education? A comparison of instructional approaches, J. Chem. Educ., 92(12), 1979–1987.
  46. Zohar A. and Nemet F., (2002), Fostering students’ knowledge and argumentation skills through dilemmas in human genetics, J. Res. Sci. Teach., 39, 35–62.
