Development and validation of the implicit information from Lewis structures instrument (IILSI): do students connect structures with properties?

Melanie M. Cooper *, Sonia M. Underwood and Caleb Z. Hilley
Department of Chemistry, Clemson University, Clemson, SC 29634, USA. E-mail: cmelani@clemson.edu

Received 16th November 2011 , Accepted 7th February 2012

First published on 24th February 2012


Lewis structures are simplified two-dimensional “cartoons” of molecular structure that allow a knowledgeable user to predict the types of properties a particular substance may exhibit. However, prior research shows that many students fail to recognize these structure-property connections and are unable to decode the information contained in the structures, leaving them unable to use Lewis structures for their intended purpose. We have developed a survey instrument, using responses from student interviews and open-ended questions from a previous research study, to determine students' beliefs about the information that can be obtained from Lewis structures. The survey was found to be reliable and valid through multiple administrations to a total of 3203 students, including first- and second-semester general chemistry and organic chemistry students, junior/senior-level undergraduate chemistry students, and graduate-level chemistry students. We propose that this survey can be a useful informational tool for instructors in a wide range of courses where students' understanding of structure-function relationships is important.


Introduction and theoretical background

There is a wide range of representations that chemists use to convey information about materials, and each has its own purpose depending on the nature of the material under study. Lewis structures are typically the first representation students encounter that can provide them with information leading to predictions of properties for a molecular substance. As first pointed out by G. N. Lewis, Lewis structures are a way to differentiate between polar and non-polar molecules (Lewis, 1916). He described polar molecules as reactive and exhibiting high intermolecular attraction, while non-polar molecules showed the opposite properties. That is, Lewis explicitly created these structures as an essential link between the structure of chemical compounds and their function.

This relationship between structure and properties is an overarching concept in chemistry, and one that is emphasized over and over again. For example, the ACS Committee on Professional Training supplement on organic chemistry emphasizes structure-property relationships (ACS Committee on Professional Training, 2008), and one study reported that 10 out of 23 organic chemistry educators interviewed and surveyed cited “correlation between structure and properties” as an important fundamental concept that students need for organic chemistry (Duis, 2011) (although one might also wonder why the other 13 did not mention it). However, research suggests that students have great difficulty making this connection (Kozma and Russell, 1997; Shane and Bodner, 2006), and even transforming a structural representation into a molecular-level representation is difficult (Johnstone, 1991; Nicoll, 2003). We have previously proposed (Cooper et al., 2010) that the inability of students to construct and use Lewis structures can be viewed through the lens of meaningful learning (Novak, 1977; Bretz, 2001; Novak, 2002). For meaningful learning, three key components must be present: (1) students must have prior knowledge to anchor the new knowledge, (2) the new knowledge must be perceived as relevant to other knowledge, and (3) the learner “must consciously and deliberately choose to relate new knowledge to knowledge the learner already knows in some nontrivial way” (Novak, 1998). In our interviews and surveys we found that students do not spontaneously connect the act of drawing a Lewis structure with its purpose. That is, the reason why they must learn to draw structures is not apparent to them.
The structure-function connection is in the future, perhaps even several chapters down the line, and therefore actually learning to draw structures requires the application of a set of what may seem to them rather arcane rules that are not connected to their prior knowledge and do not have an obvious future purpose.

Moreover, drawing Lewis structures is a relatively simple process compared to the knowledge and skills that students must use to extract meaningful information from them. We have previously shown (Cooper et al., 2010) that while students are able to articulate the explicit information that Lewis structures show (that is, information about the actual structure), they are significantly less able to recognize the implicit information that can be determined but that requires application of other knowledge; for example, whether a compound will be polar, have a relatively high boiling point, or be acidic. In our studies we found that while all students could identify structural features, typically less than half the students would propose either a chemical or physical property. This inability was seen across the spectrum of students from general chemistry to graduate students. Consider the sequence of actions that students must concatenate to move from a molecular structure to a prediction of physical or chemical properties for a simple molecular substance. As shown in Fig. 1, after a viable Lewis structure is constructed, a student must then determine the geometry assumed by the electron pairs around each atom (often taught using VSEPR (Gillespie, 1970)) so that they can eventually determine the actual shape of the molecule as outlined by the placement of the atoms. Next the student must use their knowledge of electronegativity to assign a direction of polarity to each bond, and then understand how the bond polarities are combined (as vectors) to determine the overall molecular polarity. Learners must then use the combination of 3D structure and electron density distribution to determine the types of intermolecular forces present, and finally use this understanding to predict properties such as relative boiling or melting points, and reactivity, such as acidity or basicity.
For more complex molecules, such as organic compounds, this process is even more difficult, since the structure may be composed of several regions of differing polarity, which may or may not interact.
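The chain of inferences described above can be made concrete in code. The following Python sketch is our illustration, not part of the study: it reduces each step to a single rule for diatomic molecules, using assumed Pauling electronegativity values and the common 0.4 rule-of-thumb cutoff for bond polarity.

```python
# Illustrative sketch of the inference chain in Fig. 1, reduced to diatomic
# molecules so that each step is a single rule. The electronegativity values
# and the 0.4 polarity cutoff are common textbook assumptions.
ELECTRONEGATIVITY = {"H": 2.20, "F": 3.98, "Cl": 3.16}

def dominant_imf(atom_a, atom_b):
    """Bond polarity -> molecular polarity -> dominant intermolecular force."""
    # Step 1: bond polarity from the electronegativity difference.
    delta = abs(ELECTRONEGATIVITY[atom_a] - ELECTRONEGATIVITY[atom_b])
    polar = delta > 0.4  # rule-of-thumb cutoff
    # Step 2: for a diatomic, molecular polarity equals bond polarity;
    # larger molecules would require summing bond dipoles as vectors.
    # Step 3: dominant intermolecular force follows from polarity.
    if polar and "H" in (atom_a, atom_b) and "F" in (atom_a, atom_b):
        # (A fuller rule would also cover H bound to O or N.)
        return "hydrogen bonding"
    if polar:
        return "dipole-dipole"
    return "dispersion only"

# Final step: stronger intermolecular forces imply a higher relative boiling
# point, so HF is predicted to boil above F2.
print(dominant_imf("H", "F"))   # hydrogen bonding
print(dominant_imf("F", "F"))   # dispersion only
print(dominant_imf("H", "Cl"))  # dipole-dipole
```

Even in this stripped-down form, the prediction requires chaining several independently learned rules, which illustrates the cognitive load the full task places on a novice.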


Fig. 1 The process for connecting structure and properties for a simple molecule.

While experts navigate this sequence of actions and inferences with ease, students do not find it so easy. One might predict that many students' working memory becomes overloaded when they have to string together so many actions, several of which require the application of rules that must be remembered “in a vacuum” since the whole purpose of doing this may not be apparent. Our prior studies (Cooper et al., 2010) have shown that for many students these links are never made, and therefore they must, of necessity, fall back on memorizing disconnected chunks and using surface-level features and heuristics to organize their ideas (Maeyer and Talanquer, 2010).

However, many instructors are not aware of the high cognitive demands that these tasks place on students, or of the depth of the problem. The purpose of the instrument described here is to provide instructors with an assessment that will allow them to monitor whether their own students can make these connections. The Implicit Information from Lewis Structures Instrument (IILSI) can be easily administered and provides information about student perceptions of the information that can be determined from Lewis structures. It consists of seventeen items of information that students may select. The construction process and validity and reliability assessments for the IILSI are discussed below.

Methods

All participants in this survey construction and validation research were provided with information detailing their rights as human subjects; informed consent was obtained from all of the participants. To ensure anonymity, students who participated in online surveys were provided with access codes known only to themselves and the researchers. To date, this survey has been administered to over 6900 students enrolled at a research university located in the southeastern United States, and over 100 students from a comprehensive university in the southeastern United States. The reliability and validity of the survey were determined by multiple administrations to a total of 3203 students, including first- and second-semester general chemistry and organic chemistry students, junior/senior-level undergraduate chemistry students, and graduate-level chemistry students (Table 1).
Table 1 Participants involved in creating and validating the IILSI (from a Research University, except where stated)
When administered | Who it was administered to | Why it was administered
Spring 2010 | 32 general chemistry students; 134 organic chemistry students; 10 graduate chemistry students | To develop the answer choices from Ed's Tools responses
Spring 2010 + Summer 2010 | 648 general chemistry students; 336 organic chemistry students (responses from 306 of these general chemistry students were used to support the validity) | Multiple revisions of the survey
Spring 2011 | 744 general chemistry students; 387 organic chemistry students | Further support of validity
Fall 2011 | 11 junior/senior-level chemistry students; 12 graduate chemistry students | Further support of validity
Fall 2011 | 89 general chemistry students from a Comprehensive University; 609 general chemistry students | Support of reliability
Fall 2010 + Fall 2011 | 191 general chemistry students | Further support of reliability


Survey design

In the design of our survey we were guided by previous work in this area on the development of diagnostic instruments (Treagust, 1988; Treagust and Chiu, 2011), in that we interviewed students, recorded their responses to open-ended questions, and developed the survey from those responses over multiple administrations. In our previous study investigating how students construct and understand Lewis structures (Cooper et al., 2010), we asked the open-ended question “What information can be obtained from a Lewis structure?” The question was administered to second-semester general chemistry students (N = 32), second-semester organic chemistry students (N = 134), and chemistry graduate students (N = 10). We used Ed's Tools, a program that provides an efficient way to collect data from free-response questions for large populations of students while aiding in the analysis process by allowing the user to code the responses before collating them automatically (Ed's Tools, 2009). Two researchers agreed upon the most common types of information, which comprised eleven main topics including intermolecular forces, polarity, and reactivity. Responses from ten general chemistry students, twenty-six organic chemistry students, and four chemistry graduate students were randomly selected and coded; the inter-rater reliability (Cohen's kappa) ranged from 0.83 to 1 across these topics.
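The inter-rater reliability statistic used here can be computed as follows. This Python sketch implements the standard Cohen's kappa formula, κ = (p_o − p_e)/(1 − p_e); the coder labels below are hypothetical and are not the study's data.

```python
# Cohen's kappa: chance-corrected agreement between two coders.
# The label lists below are hypothetical, for illustration only.
def cohens_kappa(coder1, coder2):
    n = len(coder1)
    categories = set(coder1) | set(coder2)
    # Observed agreement: fraction of responses labeled identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    p_e = sum((coder1.count(c) / n) * (coder2.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

coder1 = ["polarity", "polarity", "IMF", "reactivity", "IMF", "polarity"]
coder2 = ["polarity", "IMF",      "IMF", "reactivity", "IMF", "polarity"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.74
```

Because kappa subtracts the agreement expected by chance, it is a more conservative measure than raw percent agreement (5/6 ≈ 0.83 for these hypothetical labels).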

These answers were then used to develop the preliminary version of the IILSI, which consisted of three questions. The first question asked students “What information can be obtained from a Lewis structure?”, with 16 possible item choices that included the 11 most commonly stated types of information from the Ed's Tools free-response question. The other answer choices consisted of broader topics subdivided into smaller items; for example, the choice “physical properties” was presented along with the answer choices “boiling point” and “melting point.” The answer choice “no information” was also included, providing a choice for students who believed no information can be determined from a Lewis structure. This answer choice also serves as a validation measure to eliminate participants who merely check all of the answer choices. The second and third questions were open-ended follow-up questions that served two main purposes: (1) to check the validity of the survey by determining whether any of the answer choices were unclear to the students, and (2) to allow for any additional information that the students believed was absent from the list.

Once the preliminary version of the survey was created, it was revised over three semesters by administering it to a total of 648 first- and second-semester general chemistry students and 336 first- and second-semester organic chemistry students. For each administration the questions were slightly reworded for clarity using feedback from the open-ended questions. The wording of the question became “What information could you determine using a Lewis structure and any other chemistry knowledge you may have?” As an example of the changes made to the answer choices, “melting point” and “boiling point” became “relative melting point” and “relative boiling point,” since students asked whether the exact melting and boiling points were meant. Changes were made to the items until students no longer voiced any major concerns; the final version of the survey is shown in Fig. 2.


Fig. 2 Implicit Information from Lewis Structures Instrument (IILSI).

Data collection

These preliminary versions of the survey were administered to each student via paper and pencil. Once most of the changes had been made, the survey was moved to an online collection system, SurveyMonkey, as a multiple-answer question. Completion of the survey typically takes about 5–10 min. There were no significant differences between responses to the paper-and-pencil and online versions. All subsequent students' responses were collected using SurveyMonkey.

Results and discussion

Validity and reliability are important factors to evaluate when creating a survey instrument. The face, content, and predictive validity and reliability were determined as discussed below.

Face validity depends on whether an instrument appears to measure what it was intended to measure (Creswell and Plano Clark, 2007). To determine the face validity, four graduate students and one faculty member examined the survey and agreed that it was measuring its intended purposes. In addition, students' responses to question three, in which students could comment on the clarity of the answer choices, were used to further confirm the face validity of the survey and to modify those answer choices until students no longer reported any major concerns.

Content validity ensures that all aspects of an intended topic are encompassed in an instrument (Creswell and Plano Clark, 2007). Since the intended audience is the student, we used the students' responses to the request for any additional information that could be determined using a Lewis structure as a way to support the validity. For example, some students offered “acidity and basicity” in response to this prompt. When we added the phrase “and any other chemistry knowledge you have,” to ensure that students understood they could draw on information other than just the Lewis structure, there was no difference in the response distribution for either general or organic chemistry students. Two faculty members and five graduate students also agreed that these items represent the information that can be determined from Lewis structures.

To further help support validity, the survey was administered to a subsample of first-semester general chemistry students (N = 306). Students who had not received instruction on a specific topic typically did not pick the corresponding answer choice. For example, the answer choice “resonance” had a student selection rate of less than 20% for first-semester general chemistry students (N = 47) who had not received instruction on this topic in their current class, while another group of first-semester general chemistry students (N = 259) who had received instruction on this topic had a selection rate over 60%. That is: the instrument is sensitive to the instructional environment of the students, and our data suggests that students take the instrument seriously and answer to the best of their abilities.

Further evidence of the validity of the IILSI is shown in Fig. 3, the results from a subsample of second-semester general chemistry students (N = 744) and second-semester organic chemistry students (N = 387). The survey supports the findings from our previous mixed-methods study (Cooper et al., 2010). The results from the IILSI showed that less than 40% of general chemistry students spontaneously stated that they can determine chemical and physical property information from Lewis structures (answer choices acidity/basicity, reactivity, relative boiling and melting point, and physical properties), while less than 50% of organic chemistry students did so, indicating once again that students have a great deal of difficulty connecting structures with their properties. When examining how junior/senior-level undergraduate chemistry majors (N = 11) and graduate chemistry students (N = 12) view these structure-property connections, about 45% of them stated that chemical properties could be determined, and only about 15% of them selected physical properties. Just as we had found in our previous work with interviews, even junior/senior-level chemistry students do not automatically connect structures with properties (Cooper et al., 2010).


Fig. 3 Comparison of second-semester organic and general chemistry students' responses to the IILSI (*p < 0.05, **p < 0.01, and ***p < 0.001).

While some items showed a significant difference between the organic and general chemistry students' choices, these differences had small effect sizes. Using a chi-squared analysis, the effect size (phi coefficient [ϕ]) for each of the significantly different types of information was small (ϕ values between 0.1 and 0.3) (Table 2) (Cohen, 1988). That is, there appears to be little growth in students' ability to relate structure and properties as they move through organic chemistry. The items with effect sizes greater than 0.2, such as acidity/basicity, potential for resonance, and need for formal charges, might be expected to differ, since these topics are emphasized more in organic chemistry. However, it should be noted that typically less than half the students spontaneously choose items reflecting the idea that Lewis structures can be used for predictive purposes.

Table 2 Statistical results for comparison of general chemistry and organic chemistry students for IILSI
Information | General chemistry mean (%) | Organic chemistry mean (%) | χ2 | p-value | Effect size (ϕ)
Formal charges | 71.2 | 93.5 | 74.7 | <0.001 | 0.26
Bond angle | 81.9 | 75.2 | 6.6 | 0.011 | 0.08
Geometry/shape | 84.8 | 78.3 | 7.1 | 0.008 | 0.08
Potential for resonance | 55.5 | 76.2 | 45.8 | <0.001 | 0.20
Hybridization | 68.4 | 78.0 | 11.2 | 0.001 | 0.10
Intermolecular forces | 72.3 | 62.0 | 12.1 | 0.001 | 0.10
Acidity/basicity | 33.5 | 56.1 | 52.8 | <0.001 | 0.22
Reactivity | 38.0 | 51.2 | 17.4 | <0.001 | 0.12
Relative boiling point | 41.4 | 34.9 | 4.5 | 0.039 | 0.06
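For a comparison of two groups on a selected/not-selected item (one degree of freedom), the phi effect size follows directly from the chi-squared statistic as ϕ = √(χ²/N). A quick check against Table 2, using the combined sample of 744 general and 387 organic chemistry students:

```python
# Verify the phi effect sizes in Table 2 via phi = sqrt(chi2 / N),
# where N is the combined sample (744 general + 387 organic students).
import math

N = 744 + 387  # 1131 students
chi_squared = {  # chi-squared statistics taken from Table 2
    "Formal charges": 74.7,
    "Potential for resonance": 45.8,
    "Acidity/basicity": 52.8,
    "Bond angle": 6.6,
}
for item, chi2 in chi_squared.items():
    print(f"{item}: phi = {math.sqrt(chi2 / N):.2f}")
# Formal charges: phi = 0.26
# Potential for resonance: phi = 0.20
# Acidity/basicity: phi = 0.22
# Bond angle: phi = 0.08
```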


While most of the data used for the development and validation of the IILSI were collected at one institution, a research university, the IILSI has also been administered elsewhere: it was given online to general chemistry students (N = 89) at a comprehensive university in the southeastern United States who consented to participate. These students took the IILSI at the end of their first-semester general chemistry course, and their responses were compared to those of students at the research university who were at the same point in the curriculum. There were no significant differences between the two cohorts for any of the items (Fig. 4).


Fig. 4 A comparison of the IILSI results at the end of the general chemistry course for two institutions.

Although one of the most common ways to evaluate the reliability of an instrument is by correlating test-retest data, we believe this type of reliability testing was inappropriate, since closely repeated administration of the IILSI might affect students' future responses. Instead, the reliability of the final version of the IILSI was determined by comparing the pre-instruction results from two similar groups of first-semester general chemistry students (one in Fall 2010 [N = 99] and the other in Fall 2011 [N = 92]) who had been exposed to the same instruction and instructor. A chi-squared analysis was performed using SPSS to determine whether the two groups selected similar answer choices for the types of information that can be obtained using a Lewis structure. We found no significant difference between the two groups (p-values > 0.05) in their selection rates for all 17 possible answer choices. From these results, we concluded that the IILSI instrument is valid and reliable.
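The per-item comparison can be sketched as a Pearson chi-squared test (without continuity correction) on a 2 × 2 table of selected vs. not selected for each cohort. The selection counts below are hypothetical, for illustration; only the cohort sizes (99 and 92) come from the study.

```python
# Pearson chi-squared test for a 2 x 2 table [[a, b], [c, d]] with df = 1.
# Rows: cohorts (Fall 2010, Fall 2011); columns: selected / not selected.
import math

def chi2_2x2(a, b, c, d):
    """Return the chi-squared statistic and its p-value (df = 1)."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For one degree of freedom the survival function reduces to erfc.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Hypothetical counts for one item: 70 of 99 students selected it in
# Fall 2010, and 63 of 92 selected it in Fall 2011.
chi2, p = chi2_2x2(70, 99 - 70, 63, 92 - 63)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # chi2 = 0.112, p = 0.738
```

A p-value above 0.05, as here, means the two cohorts' selection rates for that item are statistically indistinguishable, which is the pattern reported for all 17 items.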

Conclusions

The IILSI instrument provides an effective, efficient quantitative method to determine whether students understand what Lewis structures are used for. It can be quickly and easily administered to large groups of students and can be used as an informative assessment given before and/or after instruction. It was found to be a valid and reliable measure for general and organic chemistry student populations at more than one institution. The IILSI provides instructors with a method to determine whether students are making the essential connection between structures and their properties.

Implications for teaching

As we discussed earlier, one of the most important concepts in chemistry is the relationship between structure and function. However, this connection requires students to concatenate a long sequence of actions that are often seen as disconnected activities with no overall purpose. For students to become proficient in chemistry (not merely memorizing facts and algorithms), they must spontaneously and automatically make these connections. The IILSI provides evidence for instructors about whether their students are making these connections. We interpret the IILSI results in a categorical manner; that is, we look at what students are, and are not, selecting, rather than using a score. All of the options available to students on the IILSI (except for “no information”) can be determined from a Lewis structure and other chemistry knowledge.

It should be noted that the IILSI requires students to make inferences about the implicit information that can be determined using Lewis structures and any other chemistry knowledge the student may have. While students may be able to retrieve this information if explicitly asked to do so, this explicit prompting of retrieval does not mean that students will use the information under other circumstances. That is, students may answer a test question correctly, but they may not use the knowledge and skills required to predict reactivity (for example) unless explicitly prompted to do so.

We believe that the inability to extract the implicit information from structural representations is a major impediment to learning chemistry, and one that can be addressed by changes in instruction and assessment practices. However, the first step towards change is for instructors to become aware of the magnitude and difficulty of the tasks that they set for students, and the Implicit Information from Lewis Structures Instrument is one way to detect whether instruction is effectively addressing these problems. It is important to note that the only reason to teach students to draw such structures is so that they can make these connections (not to answer questions on a multiple choice test).

Acknowledgements

This work was supported in part by NSF awards: 0817409, 0816692. We gratefully acknowledge the assistance of Dr Nathaniel Grove with the administration of the IILSI.

References

  1. ACS Committee on Professional Training, (2008), Organic chemistry supplement, Retrieved on November 15, 2011, from http://portal.acs.org/portal/PublicWebSite/about/governance/committees/training/acsapproved/degreeprogram/CTP_005614.
  2. Bretz S. L., (2001), Novak's theory of education: Human constructivism and meaningful learning, J. Chem. Educ., 78, 1107.
  3. Cohen J., (1988), Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, New Jersey: Lawrence Erlbaum Associates.
  4. Cooper M. M., Grove N., Underwood S. M. and Klymkowsky M. W., (2010), Lost in Lewis structures: An investigation of student difficulties in developing representational competence, J. Chem. Educ., 87, 869–874.
  5. Creswell J. W. and Plano Clark V. L., (2007), Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications, Inc.
  6. Duis J. M., (2011), Organic chemistry educators' perspectives on fundamental concepts and misconceptions: An exploratory study, J. Chem. Educ., 88, 346–350.
  7. Ed's Tools, (2009), Retrieved September 9, 2009, from https://solarsystem.colorado.edu/conceptInventories/.
  8. Gillespie R. J., (1970), The electron-pair repulsion model for molecular geometry, J. Chem. Educ., 47, 18–23.
  9. Johnstone A. H., (1991), Why is science difficult to learn? Things are seldom what they seem, J. Comp. Assist. Learn., 7, 75–83.
  10. Kozma R. B. and Russell J., (1997), Multimedia and understanding: Expert and novice responses to different representations of chemical phenomena, J. Res. Sci. Teach., 34, 949–968.
  11. Lewis G. N., (1916), The atom and the molecule, J. Am. Chem. Soc., 38, 762–785.
  12. Maeyer J. and Talanquer V., (2010), The role of intuitive heuristics in students' thinking: Ranking chemical substances, Sci. Educ., 94, 963–984.
  13. Nicoll G., (2003), A qualitative investigation of undergraduate chemistry students' macroscopic interpretations of the submicroscopic structure of molecules, J. Chem. Educ., 80, 205–213.
  14. Novak J. D., (2002), Meaningful learning: The essential factor for conceptual change in limited or inappropriate propositional hierarchies leading to empowerment of learners, Sci. Educ., 86, 548–571.
  15. Novak J. D., (1998), Learning, creating, and using knowledge: Concept maps as facilitative tools in schools and corporations. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
  16. Novak J. D., (1977), A theory of education. Ithaca: Cornell University.
  17. Shane J. W. and Bodner G. M., (2006), General chemistry students' understanding of structure-function relationships, Chem. Educator, 11, 130–137.
  18. Treagust D. F., (1988), Development and use of diagnostic tests to evaluate students' misconceptions in science, Int. J. Sci. Educ., 10, 159–169.
  19. Treagust D. F. and Chiu M., (2011), Diagnostic assessment in chemistry, Chem. Educ. Res. Pract., 12, 119–120.

Footnotes

We use the term “Lewis structures” to encompass structural representations that indicate atoms, bonds, lone pairs, unpaired electrons, and formal charges.
Students in this course come from a wide range of backgrounds, most have had at least one year of high school chemistry, and some have previously been taught this information.

This journal is © The Royal Society of Chemistry 2012