Student perspectives on chemistry intelligence and their implications for measuring chemistry-specific mindset

Deborah L. Santos a, Harrison Gallo, Jack Barbera b and Suazette R. Mooring *a
aDepartment of Chemistry, Georgia State University, Atlanta, Georgia, USA. E-mail: smooring@gsu.edu
bDepartment of Chemistry, Portland State University, Portland, Oregon, USA

Received 31st March 2021, Accepted 15th June 2021

First published on 18th June 2021


Abstract

Students’ beliefs about their ability to improve their intelligence (known as mindset) likely have more impact on their academic outcomes when engaging in challenging learning environments, such as introductory undergraduate chemistry courses. To date, little research has been conducted on the chemistry-specific aspects of intelligence which result in domain-specific mindset beliefs. Additionally, the existing mindset instrument and its variations have not been shown to produce valid data for a higher-education chemistry student population. In this work, we evaluate interpretations of mindset trait terminology (“intelligence,” “chemistry intelligence,” and “chemistry ability”) across a large, diverse sample to identify key cognitive aspects students consider important within chemistry-specific contexts and to qualitatively distinguish the perspectives of students who describe growth mindset beliefs from those holding fixed mindset beliefs. It was determined that all three mindset trait terms yield broad ranges of interpretations, and that those specific to chemistry elicit meanings more relevant to the chemistry classroom context. Three distinct mindset perspectives were identified qualitatively within the sample based on students’ interpretations of the nature and origins of “chemistry intelligence”. These groups had significantly different mean values for the mindset construct as measured by the existing mindset instrument; however, the groups heavily overlapped in response patterns. These findings support the need to develop a chemistry-specific instrument that can produce valid data for this population, as the different mindset perspectives were not distinguishable by the current quantitative measures.


Introduction

Introductory chemistry courses at the undergraduate level, such as general and organic chemistry, meet the criteria for gateway courses in that they are foundational, high risk, and high enrollment (Koch, 2017). Failure rates are known to be high in these chemistry courses (Amaral et al., 2013; Pienta, 2014; Harris et al., 2020), which are commonly taken by non-majors pursuing other sciences or professional career paths. Chemistry failure rates are likely linked to the relative difficulty of this academic domain (Lyons, 2006), which leads to increases in setback experiences for many students (Grant and Dweck, 2003). There are many factors that contribute to a student’s decision to withdraw from a chemistry course (Horowitz et al., 2013; McKinney et al., 2019). However, persistence is a key quality exhibited by some students who face challenges yet achieve success. A persistent response to academic setbacks is more common in students who have growth mindset beliefs about their intelligence in that domain (Hochanadel and Finamore, 2015; Binning et al., 2019; Henry et al., 2019; Karlen et al., 2019; Murner and Hessler, 2020). It is possible that chemistry instructors and programs could improve student outcomes by identifying and targeting students with fixed chemistry mindsets and implementing strategies to alter their beliefs about learning chemistry.

Mindset, here, refers to beliefs about the ability to improve one's intelligence. To date, very few studies have been published regarding mindset (or implicit theories of intelligence) in chemistry learning environments (Bedford, 2017; Fink et al., 2018; Limeri et al., 2020a). However, some studies provide evidence that perceptions of difficulty and challenge associated with STEM courses affect students' implicit beliefs about STEM domains (Burkley et al., 2010; Gunderson et al., 2017), in addition to producing more opportunities for setbacks which may cause students to confront those beliefs (Limeri et al., 2020a).

A few studies have reported a decline in domain-specific mindset beliefs, but not general intelligence mindset beliefs, over a semester-long period within STEM-related courses (Shively and Ryan, 2013; Scott and Ghinea, 2014). Dai and Cromley (2014) found that both the rate of decline and initial level of biology ability mindset predicted STEM major dropout. Gunderson et al. (2017) showed a preferential benefit of growth math beliefs for secondary and post-secondary students on math achievement compared to reading or writing, which was argued to be a result of the perception that math is a more challenging domain. Scott and Ghinea (2014) compared the effect of general intelligence mindset and computer programming aptitude mindset on frequency of programming practice using early course grade as a moderator and found that the domain-specific mindset scores were more predictive of programming practice. These and other studies provide evidence that each domain may have its own mindset construct, separate from that of general intelligence, and that the difficulty of STEM courses lends greater importance to growth beliefs about those domains (Shively and Ryan, 2013; Costa and Faria, 2018; van Aalderen-Smeets et al., 2019; Yu and McLellan, 2020). Thus, if chemistry mindset indeed exists, the ability to determine students’ implicit theories of chemistry intelligence hinges on accurate measurement of this construct.

Theoretical framework

Students hold different theories about the nature of intelligence and other attributes. These theories are formed throughout their development as a consequence of cultural or environmental messages, personal experiences, and observations of others (Dweck et al., 1995b; Macnamara and Rupani, 2017; Barger, 2019; van Aalderen-Smeets et al., 2019; Limeri et al., 2020a). Based on Dweck's social-cognitive theory, a student's implicit theory refers to beliefs about the extent of malleability of a trait, such as intelligence, and is commonly called mindset (Dweck and Leggett, 1988; Dweck, 2006). These theories are labeled “implicit” because they are internal perspectives that impact external actions; however, they may not be explicit to the individual who holds them (Levy et al., 1998; Yeager and Dweck, 2012).

There are two general categories of implicit theories: incremental and entity. Students who have a “growth mindset” believe that intelligence can improve over time and thus hold incremental theories about the nature of intelligence. Incremental theorists generally believe that effort is important for learning (Chen and Pajares, 2010) and are more likely to exhibit persistent behaviors in the face of challenges (Karlen et al., 2019). Incremental beliefs can impact the goals students set (Burnette et al., 2013) and yield adaptive learning behaviors, which are theorized to improve their likelihood of success (Blackwell et al., 2007; Cavanagh et al., 2018). The opposite is true for students who hold entity beliefs about intelligence, viewing it as a stable and natural quality. Entity theorists, who possess a “fixed mindset,” attribute their success to ability, believe that effort is only required if you have low ability, and are more focused on performance rather than learning (Blackwell et al., 2007). Maladaptive behaviors such as disengagement, helplessness, and failure-avoidance can result from entity beliefs in the presence of setbacks (Hong et al., 1999; Tempelaar et al., 2015).

Growth mindset does not imply the belief that all people hold equal capacity in all domains; however, it does hold that any ability can be developed with effort (Blackwell et al., 2007). An individual may find themselves in a learning environment within a particular domain in which previous experience or current performance communicates that they possess low inherent aptitude. The response to these conditions from an entity theorist would likely be helplessness and avoidance of demonstrating low ability. If it is the first time this student has encountered major academic setbacks, it may be the first time their implicit theory is elucidated and necessitates a decision between adaptive or maladaptive behaviors (van Aalderen-Smeets and van der Molen, 2018). On the other hand, a student with incremental beliefs would exhibit greater resistance to giving up or losing hope in this circumstance because their perspective is one of improving and learning (van Aalderen-Smeets and van der Molen, 2018).

Students who hold implicit beliefs at either extreme should exhibit greater disparity in behavior and outcomes if observed during a time when challenges are more likely to arise (Dupeyrat and Mariné, 2005; Limeri et al., 2020a). These responses to setbacks are particularly relevant within challenging courses such as general and organic chemistry, due to the presence of many students who lack personal interest in chemistry and who may experience challenges with learning and performing in those classes. An incremental theory of general intelligence may provide some benefit to these students; however, it is more likely that chemistry-specific incremental beliefs are vital to increasing the accessibility of the adaptive behaviors necessary to improve and achieve desired outcomes.

Considerations when measuring mindset

Several variations of Carol Dweck's general mindset scale have been produced over the years, but nearly all utilize the term intelligence for the attribute in question (Dweck, 1999; Hong et al., 1999; Blackwell et al., 2007). Dweck's mindset scale was developed primarily for K-12 contexts but has been used with college students without the extensive validation studies necessary for the use of any psychometric tool with a different target population (Dweck et al., 1995a; Levy et al., 1998; Hong et al., 1999; American Educational Research Association, 2014). Recently, an increasing number of studies and meta-analyses have questioned the practical benefits, straightforwardness, and validity of the mindset meaning system as a predictive theory (Burnette et al., 2013; Costa and Faria, 2018; Sisk et al., 2018; Burgoyne and Macnamara, 2020). It is probable that developmental stage, in addition to culture, impacts the theories students hold about intelligence, which could affect the nature of the mindset construct itself within different student populations (Anderson, 1995; Dai and Cromley, 2014; Yeager and Dweck, 2020).

Two meta-analyses of the mindset literature have concluded that mindset interventions produce inconsistent and smaller effect sizes with regard to improving student achievement at the undergraduate level, whereas significant correlations are consistently observed in younger student samples (Costa and Faria, 2018; Sisk et al., 2018). One explanation provided in the analysis by Sisk et al. (2018) is that, with the increased freedom of course choice at the college level, growth mindset students might be less deterred by difficult courses and may consequently obtain lower GPAs relative to students in other majors because of the differential challenge level.

In an attempt to understand the causes of this age effect on mindset intervention success, some questions have been raised over undergraduate interpretations of terminology used within the typical items of the implicit theories of intelligence instrument, such as “intelligence” or “ability” (Oliveira-Castro and Oliveira-Castro, 2003; Dupeyrat and Mariné, 2005; van Aalderen-Smeets et al., 2019; Limeri et al., 2020b). Aditomo (2015) points out that cultural dimensions influence individuals’ definitions of the term intelligence, in that many non-Western cultures view intelligence as involving knowledge, wisdom, and morality. When assessing the mindset of a culturally diverse student sample, the terminology may have strong influences on interpretation and response patterns.

Interpretations of “intelligence” and “ability” likely vary depending on the domain associated with them, as evidenced by cultural views on expectations of special ability levels, especially in STEM fields, which influence beliefs about the type of person who can succeed in a particular field (Leslie et al., 2015). Buckley et al. (2019) reported that students’ descriptions of intelligent behaviors in technology fields aligned with a fluid definition of intelligence according to theory on fluid and crystallized intelligence, which supports the notion that the domain can alter the type of intelligence called to mind when one reads a survey item. It is possible that undergraduates believe that theories of multiple intelligences or context-dependence apply (Gardner, 2006; Sternberg, 2000), even to a term like “chemistry intelligence.” Regardless of domain-specificity, these terms may still yield a high degree of variation in meaning to students and thus require a clearer understanding of these interpretations before incorporating them into mindset measures.

Construct validity and domain-specific mindset

In the process of instrument development, validity, or the extent to which evidence supports the interpretation of test scores for an intended purpose, is of prime focus (Wren and Barbera, 2013; American Educational Research Association, 2014). Construct validity is the overarching concept within modern validity theory and evaluates “the degree to which certain explanatory concepts or constructs account for performance on the test” (Messick, 1987). It is important to consider potential threats to validity in the development of a new measure for domain-specific mindset.

As previously noted, several meta-analyses show weak to no relationship between mindset and achievement with adult subjects across a large number of studies (Costa and Faria, 2018; Sisk et al., 2018). As these results do not align with the theoretical prediction of outcomes established with younger students (Blackwell et al., 2007), they suggest potentially weak construct validity of implicit theories measures for these subjects (Messick, 1987; Costa and Faria, 2018; Sisk et al., 2018).

According to the Standards for Educational and Psychological Testing (2014), validity is addressed through the accumulation of evidence pertaining to content, response process, internal structure, and relation to other variables. Standard 1.10 states that when the interpretation of a test depends on the appropriateness of the content, content-oriented evidence, such as a thorough description of procedures used to generate test content for a particular target population and construct, should be included as a rationale. Additionally, Standard 1.14 states that when the intention is to interpret subscale scores or score differences between individuals, sufficient evidence should be provided as a rationale for the appropriateness of those interpretations (American Educational Research Association, 2014).

In examining the validity of the mindset construct for adult chemistry students, variation in views regarding the meaning of intelligence to these students is an important aspect to consider, given the many cultural influences embedded in the term. To this point, Limeri et al. uncovered two distinct themes in interpretation of intelligence from a sample of organic chemistry undergraduate students: “knowledge” and “abilities” (Limeri et al., 2020b). This finding provides evidence for a lack of homogeneity in the meaning of mindset items to undergraduate students. Construct validity can be affected by these different meanings, in that an interpretation of intelligence as “knowledge” is likely considered to be more malleable than intelligence interpreted as “ability,” and the two interpretations might reflect different constructs.

Most often, in studies incorporating a domain-specific measure of implicit beliefs, the name of the domain is simply attached to the attribute without much consideration of potential alterations to psychometric functioning (Komperda et al., 2018) (for example, “chemistry intelligence” (Limeri et al., 2020a), “biology ability” (Dai and Cromley, 2014), or “computer programming aptitude” (Scott and Ghinea, 2014)). One can question the validity of these changes, how they might affect understanding of the implicit theories items, and the variety of ways a term like “chemistry intelligence” could be understood by a new target population. As many wording selections for the malleable/stable attribute have been presented in the literature, it is crucial to explore chemistry students’ perspectives on various attribute terminologies so that the influences these may have on responses can be better understood.

The goal in measuring implicit theories is to identify different implicit beliefs in order to analyze their effects on other variables or to assist those with fixed mindsets in developing adaptive learning beliefs and behaviors. The commonly cited technique for identifying a fixed mindset from an implicit theories scale is to reverse score the entity items then set a cutoff value in the lower half of the Likert scale for the mean score of all items (Hong et al., 1999; Costa and Faria, 2018; Yeager and Dweck, 2020). This technique assumes that entity and incremental scales are true opposites of one another and thus a singular construct, a topic heavily debated in the literature (Dweck et al., 1995b; Hong et al., 1999; Dupeyrat and Mariné, 2005; Tempelaar et al., 2015; Lüftenegger and Chen, 2017).
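To make this common scoring convention concrete, the sketch below illustrates reverse-scoring the entity items and applying a midpoint cutoff on a 6-point scale. It is not the instrument authors' analysis code; the item column names, the demo data, and the cutoff placement (3.5) are illustrative assumptions.

```python
import numpy as np
import pandas as pd

# Hypothetical item columns: four entity and four incremental items, each on a 1-6 Likert scale.
entity_items = ["ent_1", "ent_2", "ent_3", "ent_4"]
incremental_items = ["inc_1", "inc_2", "inc_3", "inc_4"]

def overall_mindset_score(responses: pd.DataFrame) -> pd.Series:
    """Reverse-score the entity items (1<->6, 2<->5, 3<->4), then average all eight items."""
    reversed_entity = 7 - responses[entity_items]  # reverse-keyed entity items
    all_items = pd.concat([reversed_entity, responses[incremental_items]], axis=1)
    return all_items.mean(axis=1)

def categorize(scores: pd.Series, cutoff: float = 3.5) -> np.ndarray:
    """Label mean scores at or below the scale midpoint as 'fixed', above it as 'growth'."""
    return np.where(scores <= cutoff, "fixed", "growth")

# Two fabricated respondents, purely to show the mechanics (not study data).
demo = pd.DataFrame(
    [[2, 1, 2, 1, 6, 5, 6, 5],   # low entity agreement, high incremental agreement
     [5, 5, 4, 6, 3, 2, 3, 2]],  # high entity agreement, low incremental agreement
    columns=entity_items + incremental_items,
)
scores = overall_mindset_score(demo)
print(pd.DataFrame({"mean_score": scores, "category": categorize(scores)}))
```

The cutoff placement is the contested part: as noted above, this approach treats the entity and incremental items as opposite ends of a single construct.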

Mean mindset scores reported for populations within undergraduate STEM courses are often extremely high, such that a growth mindset appears to be the normal belief a student would hold and a fixed mindset an extreme one (Shively and Ryan, 2013; Dai and Cromley, 2014; Flanigan et al., 2017; Lytle and Shin, 2020). Due to this skew toward growth mindset, the use of the cutoff score categorization technique leads to the conclusion that an overwhelming majority of STEM undergraduate students have growth mindsets. Understanding the relative prevalence of a fixed mindset in a population is important because any advantage provided by interventions or student support should then produce apparent improvements in outcomes.

One compelling explanation for the more extreme skew observed at the undergraduate level is that presenting a growth mindset to others, and thus to oneself, is more socially desirable among undergraduates than among younger students. This perspective also aligns well with a relativistic view on the nature of intelligence, an aspect of the predominant culture in US higher education (Hong et al., 1999; Brown, 2008; Lüftenegger and Chen, 2017). Measurements affected by social desirability face construct-irrelevant threats to validity, or “excess reliable variance that is irrelevant to the interpreted construct” (Messick, 1987). For example, in later iterations of Dweck's implicit theories instrument used with adult subjects, the incremental items were often removed because of social desirability (Hong et al., 1999). This removal of the incremental subscale for social desirability purposes would not be fully justified, in terms of the internal structure of the measure, if the debated two-factor structure is the most accurate representation of the construct (Lüftenegger and Chen, 2017). It is also possible that the high frequency of students reporting growth mindset may be, in part, due to variation in interpretation of the attribute wording, allowing some students a more malleable view of intelligence relative to others (Limeri et al., 2020b), which may not truly reflect the same construct. At any rate, the question of item wording must be resolved to provide qualitative evidence supporting the validity of data generated with the measure when used with undergraduate students, and more specifically when targeting a new intellectual domain (chemistry) as a separate mindset construct.

Purpose of the study

The work presented here is part of a larger study to design a chemistry-specific mindset instrument. In this study, attempts are made to establish validity evidence for the content of modified implicit theories items targeted at a chemistry-specific context.

A major objective of this qualitative study is to better understand possible interpretations of terminology commonly included in domain-specific survey items. This step is important in order to justify any modifications made to the instrument items, such as attribute terminology. This qualitative work has not been carried out thus far for chemistry-specific attribute terminology with adult students in a chemistry context.

Another major objective is to uncover evidence of fixed mindset beliefs as a fairly common trait present within the target population. One consideration is that individuals may hold elements of both entity and incremental theories and the selective activation of one belief over another is determined by whichever is subconsciously viewed as most useful within a specific context (Anderson, 1995; Lüftenegger and Chen, 2017; Scherr et al., 2017). Student descriptions of the attribute terminologies previously mentioned are examined here for expressed views on the nature of chemistry intelligence. Knowledge and characteristics of different beliefs about the nature of chemistry intelligence can inform the development of more sensitive measurement tools to detect chemistry-specific entity and incremental theorists.

To better understand the mindset construct within undergraduate chemistry students, the following questions were investigated: (RQ1) What are students’ reported mindsets when using a modified implicit theories scale with “chemistry intelligence” substituted for “intelligence?” (RQ2) How do students interpret the attribute terminology commonly used in measures of mindset? (RQ3) To what extent is the mindset construct valid within this undergraduate chemistry course population?

Methods

Participants

This study was presented to students enrolled in all general and organic chemistry courses at a large, diverse public research university in the southeastern United States during the fall 2020 semester. Students were invited to participate in a study at two survey time points – the first three weeks of the semester and the last three weeks before the final exam. All sampled courses were large lecture sections (N > 200) administered in an online format due to the COVID-19 pandemic policies in place. The sample was composed of students from two general chemistry I sections and one section each of general chemistry II, organic chemistry I, and organic chemistry II. The total number of consenting participants in the pretest survey during fall 2020 was 1,080. Quality control items were used to identify and remove careless responders, defined as those students who lacked sufficient attention to survey instructions (e.g., “This is a quality control item. Select ‘disagree.’”). After the removal of careless responders, the sample consisted of 851 students.

Student demographics according to course level are shown in Table 1. Of the total sample, only 5.1% were enrolled in non-STEM-related major programs. The rest were STEM majors, pre-professional, or post-baccalaureate students, with only 6.0% chemistry majors. The sample was majority female, which reflects the approximately 70% female STEM course enrollment at the institution. More than half of the sample came from low socioeconomic status households, and about a third of the students were first-generation college students.

Table 1 Sample demographics by course level and semester (values shown as percentages)

|  | General I, Fall 2020 (n = 322) | General II, Fall 2020 (n = 174) | Organic I, Fall 2020 (n = 253) | Organic II, Fall 2020 (n = 102) | Organic II, Spring 2020 (n = 100) |
|---|---|---|---|---|---|
| Response rate | 29.0 | 34.2 | 55.6 | 32.5 | 29.7 |
| Gender: Male | 30.0 | 28.7 | 26.5 | 31.4 | 27.0 |
| Gender: Female | 71.7 | 70.7 | 72.7 | 67.6 | 83.0 |
| Race/ethnicity: Black | 31.1 | 37.4 | 36.0 | 36.3 | 45.0 |
| Race/ethnicity: Asian | 32.6 | 21.8 | 30.8 | 33.3 | 36.0 |
| Race/ethnicity: White | 18.0 | 21.8 | 20.2 | 16.7 | 19.0 |
| Race/ethnicity: Hispanic | 11.5 | 13.2 | 8.3 | 9.8 | 5.0 |
| Race/ethnicity: Other | 6.8 | 5.7 | 4.7 | 3.9 | 5.0 |
| Pell eligible^a | 55.3 | 55.7 | 54.5 | 48.0 | 63.0 |
| First generation^b | 32.9 | 43.4 | 30.4 | 29.4 | 33.0 |

^a Pell Grant eligibility is used here as a proxy for socioeconomic status as it is a need-based US governmental financial aid program. ^b First generation status refers to students who reported that none of their parents or grandparents have attended college.


A second, smaller sample of organic II students (N = 100) was included in some analyses and was surveyed during the last three weeks of spring semester 2020 with a similar demographic makeup. A negligible amount of extra credit in the course was offered as an incentive to participate in the surveys and the students accessed the Qualtrics survey through a link posted to their online course page. The students voluntarily participated in the study or were otherwise permitted to complete an alternative assignment to earn the same amount of extra credit.

Data collection

The survey was administered online via Qualtrics survey software (QualtricsXM, Provo, UT) and included open-ended questions and a modified version of the Implicit Self-Theories of Intelligence scale derived from De Castella and Byrne (2015), in which we changed the term “intelligence” to “chemistry intelligence” or “chemistry ability.” To gauge students’ definitions associated with these three terms, each participant was asked two open-ended questions about two of the three terms (e.g., “How do you define chemistry intelligence? What experiences or observations have led you to this belief? Please write at least 3–4 sentences.”). After formulating their responses for each term, the corresponding implicit theories scale was provided for response using a 6-point Likert scale. Four entity statements were provided regarding beliefs about the stability of the trait (e.g., “My chemistry intelligence is something about me that I can’t change very much.”), while four incremental statements were provided regarding beliefs about the malleability of the trait (e.g., “No matter how much chemistry intelligence I have, I can change it quite a bit.”).

The intent was for responses to the instrument items to reflect the definitions students had just articulated. Sequencing the definition prompt before the implicit theories scales likely allowed participants to be more thoughtful when selecting a response, rather than trying to guess what the researcher's definition might be. To reduce survey fatigue, two survey versions were created so that each student responded with regard to only two of the three terms. The versions were assigned at random through Qualtrics survey logic, providing approximately 50% response each to the “chemistry ability” and “intelligence” questions, while all students were presented with the “chemistry intelligence” questions.

Data analysis

Descriptive analysis of implicit theories measure. Incremental and entity theory scores from the implicit theories of chemistry intelligence scale were computed as an average of the four items within each subscale, where a score of 6 represents strongly agree. A high score in incremental theory combined with a low score in entity theory would thus be theoretically interpreted as a growth mindset. The distributions of subscale means across the sample were examined through histogram plots.
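As a rough sketch of this descriptive step (not the authors' analysis code), the subscale means and their distributions could be computed as follows, assuming a hypothetical wide-format data file and item column names.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical wide-format file: one row per student, four incremental and four
# entity items from the chemistry intelligence scale, coded 1-6 (6 = strongly agree).
df = pd.read_csv("chem_mindset_pretest.csv")  # assumed file and column names

incremental_items = ["inc_1", "inc_2", "inc_3", "inc_4"]
entity_items = ["ent_1", "ent_2", "ent_3", "ent_4"]

# Subscale scores: the mean of the four items within each subscale (no reverse scoring,
# so a high entity mean indicates agreement with the fixed-trait statements).
df["incremental_mean"] = df[incremental_items].mean(axis=1)
df["entity_mean"] = df[entity_items].mean(axis=1)

# Histograms of the two subscale distributions, in the spirit of Fig. 1.
fig, axes = plt.subplots(1, 2, figsize=(8, 3), sharey=True)
for ax, col, title in zip(axes, ["entity_mean", "incremental_mean"], ["Entity", "Incremental"]):
    ax.hist(df[col], bins=21, range=(1, 6))
    ax.set_title(f"{title} subscale")
    ax.set_xlabel("Mean item score (1-6)")
axes[0].set_ylabel("Number of students")
plt.tight_layout()
plt.show()
```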
Open-response coding. The open-ended questions were subjected to an iterative content analysis strategy that began inductively and became more deductive toward the end of the analysis (Hsieh and Shannon, 2005). During the code category development phase, two researchers separately coded identical samples of 40 responses (of the approximately 1900 collected) from one open-ended item at a time in an inductive manner. In order to faithfully represent the responses provided by students, the researchers did not begin with a predetermined list of codes and instead allowed the codes to arise organically from the text data. This coding of identical samples was done in an iterative process, during which the two researchers independently coded the text data and then convened to discuss their findings. The goal at each step was to merge the separate lists of codes into a new combined list. This refinement process was carried out for several iterations, with novel codes identified and incorporated into the code list whenever appropriate and mutually agreed upon.

During codebook development, a single term was examined in a given iteration, but all three terms were incorporated across separate iterations. During each iteration, a new sample of 40 responses was coded. This process uncovered new categories over time because of differences in interpretations of the different terms, but in the end a merged codebook (see Appendix A) was deemed appropriate due to the high degree of code category overlap and the intent for comparison across terms. Cohen's kappa was calculated at 0.71 for overall interrater reliability after four iterations. Based on this moderate to good indicator of reliability (Watts and Finkenstaedt-Quinn, 2021), along with greater than 90% agreement for individual codes, it was determined that sufficient agreement on the usage of each code was achieved.
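For readers unfamiliar with these reliability figures, the sketch below shows how Cohen's kappa and percent agreement can be computed from two raters' code assignments (here with scikit-learn); the code labels and data are illustrative, not the study's coding records. When multiple codes can apply to one response, agreement is often computed per code on binary presence/absence indicators instead.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Illustrative coding decisions from two raters for the same five responses
# (code names follow the categories in Table 2; the data are made up).
codes = pd.DataFrame({
    "rater_1": ["knowledge", "understanding", "ability_to_apply", "motivation", "knowledge"],
    "rater_2": ["knowledge", "understanding", "problem_solving", "motivation", "knowledge"],
})

# Chance-corrected interrater reliability across the jointly coded sample.
kappa = cohen_kappa_score(codes["rater_1"], codes["rater_2"])

# Raw percent agreement, which the authors also report for individual codes.
percent_agreement = (codes["rater_1"] == codes["rater_2"]).mean() * 100

print(f"Cohen's kappa = {kappa:.2f}, percent agreement = {percent_agreement:.0f}%")
```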

The finalized code list was utilized in the coding of representative samples of all open-ended responses provided by students in each of the four courses surveyed during the Fall semester. The representative coding samples were prepared by systematically selecting alternating responses from the larger sample, split by course. This yielded 50% of the responses from each course, a total of 1112 responses. In this way, coding samples were representative of course levels. After coding these responses, it was determined that data saturation had been reached since new data yielded no additional themes (Guest et al., 2006; Bernard and Ryan, 2010; Given, 2016). Therefore, it was unnecessary to continue coding additional responses. The code frequencies of different definitions and beliefs regarding the terminologies were analyzed to compare facets of the word interpretations by this student sample and identify themes.
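One way to implement the systematic alternating selection described above is sketched here; the data file, the `course` and `response_id` column names, and the ordering step are assumptions for illustration, not the authors' materials.

```python
import pandas as pd

# Hypothetical file of Fall 2020 open-ended responses, one row per response,
# with a 'course' label and a stable 'response_id' ordering column (assumed names).
responses = pd.read_csv("fall2020_open_responses.csv")

def every_other_response(group: pd.DataFrame) -> pd.DataFrame:
    """Keep every second response within a course, preserving the original order."""
    return group.iloc[::2]

coding_sample = (
    responses.sort_values("response_id")
             .groupby("course", group_keys=False)
             .apply(every_other_response)
)

# Roughly 50% of each course's responses end up in the coding sample.
print(coding_sample["course"].value_counts())
```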

Qualitative mindset sorting. A separate analysis was conducted using the chemistry intelligence open-ended prompt to qualitatively sort responses into groups based on the explicit mention of the origins or nature of chemistry intelligence acquisition. The purpose of this analysis was to examine if the chemistry mindset construct can be elicited qualitatively through students’ expressed beliefs about the nature of chemistry intelligence implicit to many students’ definitions.

Several criteria were derived from Dweck's mindset theory to identify a response that belonged in a particular group. For example, students who described chemistry intelligence as something that is unequally distributed among people naturally or that is unchangeable, a “gift,” or superior “smartness” in that area, were grouped as “fixed mindset” responses. Those that highlighted the ability to improve over time, the ability of anyone to achieve it, or explicitly described their own gains in chemistry intelligence were categorized as “growth mindset.” And those that mentioned some aspect of both categories or had a more inclusive view, stating that chemistry intelligence could be represented by many different types of skills, were placed in a “middle mindset” group, which aligns with the theoretical existence of a “mixed mindset” (Dweck et al., 1995a; Lüftenegger and Chen, 2017). These groupings were compared with their respective implicit theories of chemistry intelligence scale responses in order to investigate the effectiveness of distinguishing different mindsets quantitatively with the existing instrument.

The first author (DLS) sorted students into groups and excluded those that did not discuss origins or the nature of acquiring chemistry intelligence. The second author (HG) independently coded the open-ended response samples corresponding to each mindset group. A total of 291 responses were selected for this analysis based on meeting the mindset group criteria (see Appendix B), which represents about 34% of the full data set. Interrater reliability analysis for response coding of these mindset groups yielded a Cohen's kappa value of 0.79, indicating good reliability. Further evidence of the distinctiveness of the groups was provided through analysis of their code frequencies and how they define and attribute chemistry intelligence in significantly different ways.

Results and discussion

In order to align the study findings with its purpose, the following results are framed by the research questions.

RQ1: What are students’ reported mindsets when using a modified implicit theories scale with “chemistry intelligence” substituted for “intelligence?”

The data derived from the modified implicit theories scales regarding chemistry intelligence administered to this chemistry undergraduate sample were examined and the average subscale scores are presented as distributions in Fig. 1. The incremental beliefs about chemistry intelligence are shown on the right side of Fig. 1 and have a substantial negative skew with a mean of 4.80 and a narrow standard deviation of 0.88. A similarly extreme skew is observed in the sample distribution for the entity scale, representing low overall agreement with the fixed nature of chemistry intelligence (mean = 2.18, SD = 0.95). Taken together, these subscale results signify that the majority of the sample self-report having a growth chemistry mindset as measured by this domain-modified instrument. These results align with those reported in STEM-specific studies with other adult student populations (Shively and Ryan, 2013; Dai and Cromley, 2014; Flanigan et al., 2017; Lytle and Shin, 2020). It is unlikely that nearly all students in the sample truly have a growth mindset if the theoretical links to adaptive behaviors and achievement from Dweck and colleagues’ meaning system are considered. The misalignment between reported beliefs and typical outcomes may be the result of a measurement problem, prompting further investigation into student interpretations of “chemistry intelligence” and an alternative way to identify students’ mindset, beyond simply setting a potentially arbitrary cut-off point.


Fig. 1 Distributions for entity and incremental subscales of the implicit theories instrument using the term “chemistry intelligence”.

RQ2: How do students interpret the attribute terminology commonly used in measures of mindset?

Variability of definition

When designing survey items, it is important that students interpret the wording similarly. The open-response questions allowed for a detailed examination of how chemistry students define terms of interest for use in mindset scale items: intelligence, chemistry intelligence, and chemistry ability. After coding representative samples of these open-response questions across all course levels, the frequencies with which specific codes appeared were examined. Each term elicited more than ten different definitions, which demonstrates the high degree of variability in interpretation for all three wording choices. Common definitions that fell into a specific code category are provided in Table 2, along with usage examples. Fig. 2 displays the code categories by percent frequency across the three different terms students were asked to define. This broad range of ways students interpret each term is evidence that items incorporating them may not yield consistent meaning to students, suggesting variation in the construct meaning to different individuals. Across all three terms, the same three definitions were consistently reported at a higher relative frequency: knowledge, understanding, and ability to apply. This finding suggests that these three definitions are the most common amongst chemistry undergraduate students in this sample and that they remain of somewhat equal importance regardless of the term in question.
Table 2 Content analysis codes for definitions of attribute terminologies. Bolding within example responses indicates the term that was being defined
| Definition code | Description | Example response |
|---|---|---|
| Ability to apply | Knowledge, concepts, skills | “**Intelligence** is the ability to recall and utilize information. However, this ability must also be done in a manner that is appropriate or beneficial to the situation at hand. I define it this way because there are people who can memorize things like trivia, but they have no skill to analyze or use that information to their benefit…” |
| Abstract thinking, visualization | Molecular perspective, three-dimensional representations | “[…**Chemistry intelligence**…] Things like the ability to visualize the 3D structures of molecules, to understand the shape of orbitals and the transfer of particles and charges. I find things like 3D molecule structures easy to visualize, while I find things like energy diagrams harder to understand…” |
| Communicate, explain | Teaching, explain to others, communicate effectively, speak well about knowledge, sound like you know what you're talking about | “I would define **chemistry intelligence** as not only knowing the materials covered but knowing the concept and being able to explain it…I find that I do better on tests if I try to teach it to my mom beforehand because she does not know science. If I can get her to understand it, then I know the concept well enough.” |
| Domain-specificity | Different kinds of intelligences, multifaceted, better in one area than another | “…**Intelligence** is not linear, as I may be intelligent for some things and lack intelligence in others.” |
| Efficacy for learning | Ease of learning, learn quickly, understand quickly, get it better than others, independence in learning | “I define **intelligence** as the capacity to learn new information easily. Someone with high intelligence can readily absorb new information more than someone with low intelligence…” |
| Emotional maturity | Adapting to new situations, objectivity, wisdom, good decision making | “I define **intelligence** as the ability to adapt and understand a variety of situations and be able to identify and overcome challenges. Intelligent people focus on the outcome and learning rather than the output. Intelligence also includes one's ability to identify mistakes and continue to grow and improve from them.” |
| Knowledge acquisition, retention | Memorize, amount of knowledge, depth of knowledge, ability to retain information | “I would describe **chemistry intelligence** as someone that has relevant knowledge and background about basic chemistry history and mathematical problems…For example, like which chemist are credited with certain formulas and theories and are capable of using these formulas in chemistry problems.” |
| Mathematical thinking | Math skills, math foundations | “**Chemistry intelligence** depends on one's proficiency at the math required for chemistry problems and ability to connect concepts to input and output of said problems. Intelligence in chemistry also depends on one's tolerance for spending the time to memorize…” |
| Motivation | Imposing structure on yourself for your own good, drive, willpower, interest, effort | “I think willingness and dedication are important in chemistry. There are many complicated topics that require the patience to learn and gain understanding. Practice is also required to further gain understanding in chemistry just as it is required in math.” |
| Performance, achievement | Success, grades, performance, do well, test taking, demonstrating proficiency/understanding, competent | “**Chemistry ability** is a measure of how well a person can display what they have learned about chemistry. People that have little exposure to chemistry tend to have a lower ability to display knowledge of chemistry or apply it. Chemistry ability can be thought of as skill, where practice in the subject increases your ability to perform in the subject.” |
| Problem solving | Critical thinking, analytical skills, logic and reasoning | “**Intelligence** is the ability to use information that you know to find solutions to problems. Simply memorizing numbers and facts is not enough to be intelligent. You have to be able to think outside the box. Using the information you memorized to think of a creative and inventive answer to a problem is true intelligence.” |
| Psychomotor | Do chemistry, actions, handling chemicals, performing experiments | “I believe **chemistry ability** is being able to execute tasks relating to chemistry. The way that my lab instructors can perform experiments effortlessly is amazing to me. That shows a clear understanding of the subject, to me.” |
| Subjective | The definition is subjective, defined by society, societal construct | “…I don't think that **intelligence** as a whole can be properly based on how much information someone has because that brings up the debate of what information is considered valuable and who gets to decide that…” |
| Understanding | Conceptual understanding, connecting concepts | “[**Intelligence** is] the ability to intake new information and overlap from all areas of existing knowledge you have to come to a logical conclusion that both allows success in understanding related material and opens the floor for more complex and developed information. A level of understanding that is not basic or elementary in any one subject.” |



Fig. 2 Relative code frequencies across all three terms as a percentage of total number of definition code references.

In addition to the broad range of definitions providing evidence of the subjectivity of each term, several students explicitly stated that intelligence is subjective.

Intelligence can differ depending on the situation and circumstance. Someone can show a high level of intelligence in street-smarts but that may not correlate with his/her level of intelligence in aeronautical sciences (and vice versa). Some people can have a jack-of-all-trades versatility to their intelligence while others may be savants in a more limited number of fields or even just one. In my experience, if what is defined as intelligence is limited to a few spectrums, it is an incomplete and not a fully thought out definition. I have come to this understanding of intelligence because when growing up I learned from people from different educational and vocational backgrounds and because what I learned helped me in life in some way or another, to me this shows that the definition of intelligence is not owned by one ideology, even if it may be the socially accepted one.

Other students expressed similar sentiments toward the term by stating that “it is very difficult to define intelligence because of how intelligence means something different to different people.” Another aspect mentioned by a few students was the societal influences on our understanding of the term, ‘intelligence,’ such as “school definitions of intelligence as well as…family and friends and even social media,” and that to define it “requires a quantification of the value of certain cognitive skills.” One student even stated the assumptions he believed the researchers were employing in the act of asking these questions:

I think in general intelligence, especially specific types of intelligence, is a useless term because it is used too broadly and can mean almost anything. So, I would like to say chemistry intelligence doesn't exist. Given the later question about “ability to change chemistry intelligence level” I infer that the researchers believe “chemistry intelligence” to be some type of innate knowledge about chemistry, or more generously knowledge the student acquired before the course. This is a concept I am against in principle as I believe that anyone given the right time and resources can become proficient at anything. Therefore, I think asking the student if they think they can change their chemistry intelligence level is rather backwards because it is a combination of the resources they have available to them, including time.

These types of responses suggest that the term intelligence can evoke negative affect for some students, based on how they believe others define it. Including the term ‘intelligence’ could therefore be problematic for students who hold principled objections to measuring and labeling individuals as having a certain innate amount of ability.

One goal in the item development phase for an assessment instrument is to collect evidence supporting the desired interpretation of the wording utilized. These results suggest that students may not be responding in the same way when reading the same item due to variation in interpretation. When asking students if they can change their chemistry intelligence, how ‘chemistry intelligence’ is defined can be important. Some cognitive abilities are likely viewed to be more malleable than others, such as knowledge, which increases with study. Students who believe chemistry intelligence is the “ability to understand” might be less likely to say that it is a malleable quality compared to “knowledge” (Limeri et al., 2020b). This variability in the meanings students ascribe to intelligence may obscure the mindset construct by producing inconsistencies in the measured construct meaning. If this is the case, it would not allow a clear distinction of growth mindset individuals from those who believe chemistry intelligence to be more stable.

Terminology impacts item meaning

The variety in wording that has been used in implicit theories studies suggests a need to understand how the domain of chemistry affects the understanding of the trait itself (e.g., “intelligence,” “ability,” “aptitude”). It has been argued that a general implicit theories scale given in a domain-specific context may not be a true reflection of students’ beliefs within that domain (Shively and Ryan, 2013; Scott and Ghinea, 2014). In comparing the definitions provided by students regarding general and chemistry intelligence, it is clear that there are differences in the ideas that come to mind for many students. For instance, “general intelligence” primarily highlights “knowledge” and “ability to apply,” but students uniquely mentioned the idea of multiple intelligences (Gardner, 2006) (“domain-specificity” code) and emotional maturity (open-mindedness, wisdom, adaptability, objectivity). General intelligence tended to have a less specific definition because many students recognized that it can be a diverse range of skills or abilities, depending on the person and context.

Chemistry intelligence responses yielded a different focus on “understanding” and “knowledge,” while also highlighting “motivation” (willpower, interest, effort), and more distinctly, “mathematical thinking” and “abstract thinking” (or visualization). Many students referred to “understanding” as requiring time and effort and contrasted it with memorization: “For chemistry, you have to really understand what goes on instead of simply memorizing things.” When defining chemistry intelligence, many students recognized the underlying concepts as key to success at higher levels: “A lot of chemistry involves understanding basic concepts that build to more complicated concepts and reactions.”

In the literature, mindset appears to be context-dependent, and the change in interpretation of the term intelligence when it is made specific to chemistry supports the argument that a general mindset measure applied within a specific domain may not be fully valid. These findings support the idea that the abilities students believe they are required to have in order to be considered “intelligent” in chemistry are more domain-specific than those deemed as general intelligence. Thus, it is pertinent to create implicit theories items that specifically probe beliefs within the chemistry domain.

When considering which trait term (i.e., “intelligence” or “ability”) is best used in the items we may develop, it is important to compare the students’ open-ended responses for the terms “chemistry intelligence” and “chemistry ability.” As seen in Fig. 2, the associations of both terms with understanding, applying, problem-solving, math, and motivation seem to be consistent across the two terms, which suggests that the reference to “chemistry” elicits these definitions more so than the trait term alone. The frequency of the “knowledge” code is much higher in reference to intelligence and a more unique description of a “psychomotor” definition becomes the focus for chemistry ability. This psychomotor category was applied each time students mentioned someone's ability to “do chemistry” or perform experiments and was typically used in a laboratory context. Additionally, chemistry ability was more commonly defined as “performance,” which is typically described as an ability to achieve good grades in chemistry.

Some students stated that they view “chemistry intelligence” and “chemistry ability” as highly interrelated, while others claimed that they are different and that one is easier to achieve than the other: “Chemistry ability is the skills and knowledge that one possesses about chemistry to complete a task. A person's chemistry ability does not equate to chemistry intelligence because a person can score high on a chemistry test, but their ability may only stop there. They would not know how to apply the knowledge they have to the real world.” Other students seem to say that intelligence is more deserving of respect, “I think that with hard work and dedication that you can increase your chemistry ability to chemistry intelligence.”

The ‘ability’ term had a higher association with an external performance aspect, while ‘intelligence’ had a closer association with internal cognitive potential. This difference in association clearly shows that the words ‘ability’ and ‘intelligence’ are not simply interchangeable. Many studies have exchanged the term ‘intelligence’ for another word without much justification other than that ‘intelligence’ is too broadly defined. The term ‘ability’ also brings up a broad range of definitions, and these may not be aligned with beliefs about the particular cognitive abilities the items are intended to probe. Care should be taken in selecting the term that best fits the research needs and the desired construct to be measured.

Context associations

In addition to providing definitions, students tended to reference a particular context in which the definition would be applicable. Context codes were developed based on those commonly mentioned: academic, real world, lab, and “street and book smarts” (a colloquial way to describe intelligence as both academic and real-world). Some important trends are observed (Fig. 3) for the different definitions in reference to context. For instance, general intelligence was equally weighted in frequency across all contexts except for lab. This can be contrasted with chemistry intelligence, which appeared most frequently in the academic context. The lab context was also occasionally referenced with regard to “chemistry intelligence,” but a more noticeable association was made for the lab context with “chemistry ability.” In addition to strict concerns over the definition of a term, the context brought to a student's mind by a particular wording might very well be important to validity arguments. If one is intending to measure beliefs pertaining to an academic classroom setting but uses terms that invoke thoughts of real world or laboratory applications, the responses may not consistently reflect the intended context.
Fig. 3 Relative context code frequencies across all three terms (those with “Unspecified” context not shown).

RQ3: To what extent is the mindset construct valid within this undergraduate chemistry course population?

Qualitative detection of mindset groups

Students’ open-ended responses to the definition of chemistry intelligence were sorted according to explicit mention of the origins of the trait, such as a natural ability or something that is developed with effort. A small subset of the data (N = 291) met the criteria (see Appendix B) and was placed into a mindset group representing the same viewpoint on its origins. The first author (DLS) conducted the filtering and sorting, then the second author (HG) coded each sample to examine differences in definitions, contexts, and origins according to the previously established coding scheme. The “Fixed” and “Middle” mindset groups resulted in 79 students each, while the “Growth” mindset group was slightly larger (n = 133). The first 79 out of 133 responses from the Growth group were coded to yield homogeneous coding group sizes. After cross-coding the groups, the reliability (Cohen's κ = 0.79) was calculated to verify that interpretations of the content were sufficiently reliable between the two coders for analysis of group differences. Examples of a response from each mindset group, along with the corresponding selection criteria, are shown in Table 3.
Table 3 Open-ended response examples sorted into each group based on specified grouping criteria. The highlighted segments contain coded content referencing origins of chemistry intelligence. The colors are to differentiate between portions of text which met different criteria within the same response


Some key differences were observed in the ways these groups described the nature of chemistry intelligence (Fig. 4) and defined the term (Fig. 5). Fig. 4 represents the origins of chemistry intelligence described by students without prompting and provides insight into the characteristic perspectives of each group. For instance, the “Growth” group frequently described intelligence as malleable, developable, and requiring effort, while these views were not characteristic of the “Fixed” or “Middle” groups. This finding aligns with mindset theory, in that students who hold incremental theories about intelligence believe it to be a malleable quality, while those with entity theories believe it to be stable over time, which can also imply that it is endowed at birth or something natural.


Fig. 4 Origins and nature of chemistry intelligence as code frequency percentages for each mindset group (each group n = 79).

Fig. 5 Definitions of chemistry intelligence across all three mindset groups as code frequency percentages (each group n = 79).

The Middle group often cited more inclusive terms, such as many aspects can be considered chemistry intelligence or that anyone can attain it (“equality of attainment”), while also commonly stating that it is natural for some people. An example of this viewpoint is: “I would define chemistry intelligence as one who grasps and/or has an affinity in the subject of chemistry. I would say this is so because some people grasp chemistry faster/slower than others. I do think this is subjective though and depends on who you ask.” Students placed in the Growth group were much more committed to the malleability concept: “…Someone can work to improve their intelligence. Based on personal experience, intelligence is a fluid entity. Someone can be intelligent in a certain subject and not others. But if they are willing to have an open mindset and learn, intelligence can be raised.” Likewise, fixed group students were more committed to the stability or natural nature of chemistry intelligence: “Chemistry intelligence is having a natural inclination towards chemistry and being good at it. I think I have just seen people around me (not only in chemistry but also in other subjects) be naturally good at a subject. They are able to grasp the material easily.”

When comparing definitions of chemistry intelligence by these mindset groups, our previous claim that some definitions better align with a growth belief is further supported. Fig. 5 shows that the Growth group predominantly defines chemistry intelligence with regard to “knowledge,” “understanding,” and “motivation,” while the Fixed group most frequently mentions “understanding” and “efficacy for learning” with nearly no mention of “motivation.” Knowledge can be expected to align well with malleability beliefs: we are born with essentially no knowledge but acquire it over our lifetime, and this is also the case with knowledge of chemistry. Motivation can also be expected to align more with a growth mindset belief in that it reflects the need for effort to achieve success.

It is interesting to note that the code for “understanding” was equally prevalent in all three groups’ definitions of chemistry intelligence. Students often described “understanding” as something that improves with effort, which aligns with beliefs about malleability; however, many others stated that it is easier for some people to understand chemistry concepts compared to others, which aligns well with beliefs that intelligence is innate and stable. This alternate view of “understanding” is reflected in the high relative frequency of the “efficacy for learning” code in the Fixed group. This code applies when understanding or learning is described in terms of pace and/or differential rates between individuals. Those who learn more efficiently, grasp things more quickly, or understand more easily are considered as having high efficacy for learning. It also seems somewhat logical to assume that efficacy for learning could be a natural quality; however, many Middle group individuals expressed that some natural qualities can be developed with effort. The Middle mindset group provided definitions similar in frequency and distribution to those coded from the Growth mindset group.

Comparing qualitative mindset groups with implicit theories instrument results

To further investigate the validity of the distinction between these groups, mean responses to the chemistry intelligence implicit theories subscales (incremental and entity) were compared. A one-way ANOVA (N = 291; Fixed n = 79, Middle n = 79, Growth n = 133) of the mindset groups on these subscales showed statistically significant differences among all three group means (according to the Tukey post hoc test) for chemistry intelligence incremental responses (F(2, 189) = 39.64, p < 0.001, ω² = 0.210) and chemistry intelligence entity responses (F(2, 189) = 36.36, p < 0.001, ω² = 0.203). The effect sizes for both of these comparisons are large, which suggests that the mindset group differences observed in open-ended responses reflect meaningful differences in implicit theories scale means (incremental: x̄ = 5.09 for Growth, 4.71 for Middle, 3.98 for Fixed; entity: x̄ = 1.89 for Growth, 2.26 for Middle, 3.00 for Fixed). Thus, mean scores can be used to detect differences in mindset among the groups.
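For readers who wish to reproduce this style of analysis, the following is a minimal sketch in Python (using scipy and statsmodels) of a one-way ANOVA with a Tukey post hoc test and an omega-squared effect size. The group sizes mirror those reported above, but the scores, variable names, and random seed are hypothetical placeholders rather than the study data or the authors’ analysis script.

```python
# Illustrative sketch (not the authors' code): one-way ANOVA across three
# mindset groups, Tukey HSD post hoc comparisons, and omega-squared effect size.
# The synthetic scores below are hypothetical placeholders.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
groups = {
    "growth": rng.normal(5.1, 0.9, 133),  # hypothetical incremental-subscale averages
    "middle": rng.normal(4.7, 0.9, 79),
    "fixed": rng.normal(4.0, 0.9, 79),
}

# One-way ANOVA across the three groups
f_stat, p_val = stats.f_oneway(*groups.values())

# Omega squared computed from the ANOVA sums of squares
scores = np.concatenate(list(groups.values()))
labels = np.concatenate([[name] * len(vals) for name, vals in groups.items()])
grand_mean = scores.mean()
ss_between = sum(len(v) * (v.mean() - grand_mean) ** 2 for v in groups.values())
ss_within = sum(((v - v.mean()) ** 2).sum() for v in groups.values())
df_between = len(groups) - 1
df_within = len(scores) - len(groups)
ms_within = ss_within / df_within
omega_sq = (ss_between - df_between * ms_within) / (ss_between + ss_within + ms_within)

# Tukey HSD identifies which pairs of group means differ significantly
tukey = pairwise_tukeyhsd(endog=scores, groups=labels)

print(f"F({df_between}, {df_within}) = {f_stat:.2f}, p = {p_val:.3g}, omega^2 = {omega_sq:.3f}")
print(tukey.summary())
```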

However, when individual response averages to the incremental and entity items were placed on a scatter plot (Fig. 6), the mindset group differences became more difficult to identify. Some clustering was observed for the Growth group in the response region associated with a growth mindset (i.e., high incremental and low entity beliefs). However, there was less clustering for the Fixed group, resulting in a spread that overlaps with the Growth group (Fig. 6). This reveals that many students who qualitatively described fixed mindset beliefs (indicated by + symbols in Fig. 6) still tended to respond in line with growth mindset beliefs when using the implicit theories scales. Additionally, this analysis demonstrates that the typical technique of using scale-center cut-off scores for identification of mindset beliefs (Hong et al., 1999) may result in miscategorization of some individuals in each group, most noticeably in the Fixed group. The overlap of these response clusters suggests that the simple modification of the mindset instrument through use of “chemistry intelligence” may be insufficient for distinguishing between groups.


Fig. 6 Scatter plot of individual incremental versus entity subscale scores. Fixed group students are represented by red plus signs and Growth group students are represented by blue dots. The similarly colored ovals indicate the spread of each group.

Given the overlap in typical responses shown in Fig. 6, it may not be possible to categorize individuals accurately into mindset groups using existing quantitative measures. While the ANOVA results suggest that these are indeed distinct groups of student views at the level of group means, the individual student response patterns show a high degree of variability and overlap. These results provide further evidence that a more sensitive measure of chemistry mindset may be needed to efficiently identify and support those students who could academically benefit from mindset interventions.
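As a concrete illustration of the scale-midpoint cut-off rule and the scatter-plot comparison described above, the following Python sketch shows one way such a categorization and plot could be produced. The column names, group labels, and the assumed midpoint of 3.5 (for a 1–6 Likert scale) are hypothetical and do not reflect the authors’ actual variables or analysis code.

```python
# A minimal sketch of the conventional scale-midpoint cut-off categorization and
# the incremental-versus-entity scatter plot in the style of Fig. 6. The DataFrame
# columns ("incremental_avg", "entity_avg", "qual_group") and the 3.5 midpoint are
# hypothetical placeholders.
import matplotlib.pyplot as plt
import pandas as pd


def midpoint_categorize(incremental_avg: float, entity_avg: float,
                        midpoint: float = 3.5) -> str:
    """Classify a respondent by the conventional scale-center cut-off rule."""
    if incremental_avg > midpoint and entity_avg < midpoint:
        return "growth"
    if incremental_avg < midpoint and entity_avg > midpoint:
        return "fixed"
    return "mixed"


def plot_mindset_groups(df: pd.DataFrame) -> None:
    """Scatter individual subscale averages, styled by qualitative mindset group."""
    styles = {"Fixed": ("red", "+"), "Growth": ("blue", ".")}
    for group, (color, marker) in styles.items():
        sub = df[df["qual_group"] == group]
        plt.scatter(sub["incremental_avg"], sub["entity_avg"],
                    c=color, marker=marker, label=group)
    plt.xlabel("Incremental subscale average")
    plt.ylabel("Entity subscale average")
    plt.legend()
    plt.show()
```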

Conclusions

Implications for research

The finding that a simple modification of the existing implicit theories instrument, adding the domain name “chemistry,” yielded a predominantly growth-centered distribution comparable to other STEM-specific studies with adult students suggests that this modification technique requires qualitative backing for construct validity purposes. When selecting an implicit theories scale, both the wording of the attribute name and references to particular academic domains affect the breadth and content of students’ interpretations. Terms like “intelligence” are understood differently by different individuals and likely yield inconsistent response patterns on survey scales. The most prevalent categories of definitions uncovered in this analysis were “knowledge,” “ability to apply,” and “understanding,” each of which different individuals may associate with different degrees of malleability.

One potential direction for improving implicit theories measurements within a particular course is to incorporate more specific terms that align with common views of “intelligence” and “ability” in that domain. For example, the highest frequency definition provided by students in our sample was “understanding.” This could lead researchers to modify an item which states, “My [chemistry] intelligence is something about me that I personally can’t change very much” (De Castella and Byrne, 2015), to, “My ability to understand [chemistry] is something about me that I personally can’t change very much.” This modification strategy could reduce ambiguity in the meaning of the scale items and provide qualitative evidentiary support. It could particularly address ambiguity for those students who claimed that they had never heard of “chemistry intelligence” and therefore could not define it. It also removes the need for the student to “guess” or “infer” the meaning intended by the researcher, since the more specific wording carries a clearer definition. Because the term intelligence was observed to evoke thoughts of the ability to carry oneself in the real world, and chemistry ability often evoked mention of the laboratory, shifting toward more specific definitions might remove the implications of academic, societal, practical, or workplace success and retain the belief about the domain itself, dissociated from any particular context.

Another measurement aspect considered in this work was the ability to detect different mindsets (i.e., those who hold stronger incremental or entity beliefs). Given the observation that the vast majority of students self-report a growth mindset on both the entity and incremental chemistry intelligence scales, one possible interpretation is that almost none of the undergraduate chemistry students in the sample hold fixed mindset beliefs. Because many students encounter challenges in these courses and fail to overcome them, it was pertinent to examine the qualitative responses for indications of students’ views on the nature of chemistry intelligence and to compare these with the self-report measures. The detection of three mindset groups with different views on the nature and origins of chemistry intelligence supports the validity of the mindset construct within the target population and the likelihood that “fixed,” “growth,” and mixed (or “middle”) views are all present in the sample.

The fixed mindset is theorized to represent a significant portion of student samples (Hong et al., 1999), not just an extreme belief. Our identification of 79 students who described fixed views about the nature of chemistry intelligence (27% of the sub-sample that explicitly mentioned origins, which itself accounts for 34% of the entire sample) indicates that this is a fairly common belief. If implicit theories are measured solely through existing quantitative measures, these fixed mindset students would be difficult to isolate based on their self-report averages alone, as is evident in the heavy overlap of their responses with those of the other two perspectives. Further efforts are needed to improve the sensitivity of these measures and their ability to detect the different perspectives. Additional construct validity evidence could be gathered through analysis of the theoretical connections to other variables commonly discussed as part of the “meaning systems” students employ in academic contexts, such as attributions for failure/success and achievement goal orientations (Hong et al., 1999).

Implications for teaching

Mindsets about different attributes are known to shift over time based on experiences, which means that chemistry instructors can play a significant role in nurturing adaptive beliefs and behaviors by creating a learning environment that communicates a growth mindset about chemistry intelligence to students. Emphasizing, during instruction, definitions of chemistry intelligence that are developed over time (and are therefore more easily viewed as malleable) could reduce students’ comparisons of “natural ability levels” with others in the class and the perceived need to demonstrate ability to others. Praising strategic effort and explicitly teaching study techniques convey positive associations with overcoming academic challenges and developing weaker areas of one’s chemistry intelligence. The instructor can also place emphasis on mastery through the types of teaching strategies and assessments implemented, so that learning, improvement, and understanding are valued above performance. Recognizing that the helpless behaviors of some students result from their beliefs about their abilities in chemistry is the starting point for redirecting them toward more adaptive beliefs and behaviors.

The findings presented here suggest caution in drawing conclusions about intervention success or failure based on existing surveys when used in undergraduate chemistry courses. To adequately assess interventions, a sensitive and accurate measure for chemistry mindset would first need to be developed in order to probe and monitor students’ implicit theories of chemistry for evaluating shifts in teaching strategy. However, it is likely that, as students intentionally engage in a positive learning environment that encourages development, they will see improvement and reshape their former beliefs about learning chemistry.

Limitations

This study was conducted at a single institution within the United States, which limits the generalizability of the findings; however, the institution is quite diverse across racial and ethnic backgrounds. Further, we collected data from students across several sections of general chemistry and organic chemistry lecture courses. The large, diverse sample accounts for a broad range of possible perspectives and provides insight into the target population's views.

Although many students with international backgrounds attend the university, facets of US educational and societal culture likely play a significant role in both the interpretation of the attribute terms and the views about malleability of intelligence. In US elementary and secondary schools, mindset is often embedded within instruction by teachers who wish to encourage their students to learn. Regardless of whether US students actually develop and hold these beliefs for themselves, they certainly would know the “correct” survey response if they had previous explicit instruction. As such, social desirability and acquiescence bias likely play a large role in the skewed responses to the domain-specific mindset scale. Further investigation with non-US student populations would be of great benefit to understanding how cultural differences may limit or support the generalizability of these findings. Additionally, ongoing cognitive interviews surrounding response process may shed light on the social desirability of particular wording or response options.

Another limitation to generalizability is that the majority of the respondents were female students, which reflects the high (approximately 70%) female enrollment in STEM courses at this university. Although this overrepresentation is expected for this population, similar studies should be conducted with higher male student representation to provide evidence of generalizability across gender.

The low response rate (approximately 30% of enrolled students) may be due to the recent transition to a non-traditional online course format resulting from COVID-19 pandemic institutional policies. The unfamiliar course format likely decreased student attention to details such as extra-credit opportunities, as navigating online courses was new to the students. This could limit the generalizability of the study findings in that those who responded were likely more engaged or organized at the beginning of the course.

Only half of the student responses from each course were coded and utilized in the analyses, so it is possible that some perspectives may have been overlooked. However, this sample size was sufficient to reach saturation of data themes.

Uncovering themes through content analysis is subject to bias in how the coding categories are formed and in designating what constitutes a sufficient response for a particular code. To address this, the development of codes and themes in this work was carried out by two researchers in an iterative process. This served to reduce ambiguity in the application of code categories, achieve consensus on code meanings, and consolidate redundant categories while maintaining authenticity with regard to the data. In single-paragraph written responses, it can be difficult to interpret the meaning intended by the student without bias, especially given grammatical errors and a lack of elaboration on their ideas. If these students had been interviewed, these issues could have been probed further to clarify meaning. However, attempts were made to reduce interpretation bias through the use of inter-rater coding comparisons and by searching for only explicit information within the text. The quantitative implicit theories measures critiqued for sensitivity in this work rely on self-report methods, which may be argued to contradict the description of these beliefs as “implicit” and thereby difficult to elicit in the form of a Likert-scale response. It is for this very purpose that we aim to improve and modify the implicit theories scale, in order to obtain more valid and reliable implicit theories data in undergraduate chemistry courses and to support their subsequent interpretation as representing particular mindsets.

Conflicts of interest

There are no conflicts to declare.

Appendix

Appendix A: full codebook definitions table

Code: Description

Context
 Academic, in class: Refers to school subjects and grades; does not acknowledge uses beyond education
 Lab: Experimenting, testing in the lab, lab skills
 Real world, life: Job, non-academic career skills, daily life
 Street and book smart: Mentions both as valuable, explicitly mentions both contexts, all aspects of life
 Unspecified: Too vague to imply a context

Definition
 Ability to apply: Knowledge, concepts, skills
 Abstract thinking, visualization: Mention of dimensionality, diagrams, structures
 Ambiguous: Student did not explain well enough to interpret meaning
 Communicate, explain: Teaching, explain to others, communicate effectively, speak well about knowledge, sound like you know what you're talking about
 Domain-specificity: Different kinds of intelligences, multifaceted, better in one area than another
 Efficacy for learning: Ease of learning, learn quickly, understand quickly, get it better than others, independence in learning
 Emotional maturity: Adapting to new situations, objectivity, wisdom, good decision making
 Knowledge acquisition, retention: Memorize, amount of knowledge, depth of knowledge, ability to retain information
 Mathematical thinking: Math skills, math foundations
 Motivation: Imposing structure on yourself for your own good, drive, willpower, interest, effort
 Performance, achievement: Success, grades, performance, do well, test taking, demonstrating proficiency/understanding, competent
 Problem solving: Critical thinking, analytical skills, logic and reasoning
 Psychomotor: Do chemistry, actions, handling chemicals, performing experiments
 Subjective: The definition is subjective, defined by society, societal construct
 Understanding: Conceptual understanding, connecting concepts
 Unsure: “I guess”, not sure about definition, implying based on combined word definitions

Origins
 Developed, experience: Over time it is formed, experiences help to develop, using experience to help exert intelligence/ability
 Equality of attainment: Everyone/anyone can be/is intelligent
 Foundation: Having a good foundation, access to resources
 Innate: Capacity, level, ability, natural attribute, born with it
 Malleable: Can improve or worsen, changeable
 Relatively stable: Unchanging, hard to change, fixed trait
 Requires effort: Explicitly mentions hard work or effort as a cause/necessary for intelligence

Appendix B: full separation criteria list for mindset group sorting

Chemistry intelligence is described as…

Fixed group:
• Something natural, an inclination, or superior ease of learning
• Indicated that people have different levels
• An unchangeable quality about someone
• Refers to being “smarter” than others or something only some people have

Middle group:
• Multifaceted
• Discussed personal growth
• Inclusive beyond the educational sphere
• Interest and motivation were mentioned as causes
• Both growth and fixed statements together
• Subjective/societal definition
• Attributed to teaching/learning styles

Growth group:
• Effort is necessary to develop it
• Explicitly discussed improving intelligence
• Anyone can achieve despite how they were born
• Discussed overcoming challenges in chemistry
• Learned over time, can increase, learning from cumulative experience

Acknowledgements

The authors would like to thank all of the participating general and organic chemistry instructors at Georgia State University for allowing their students to partake in this study for small amounts of extra course credit.

References

  1. Aditomo A., (2015), Students’ response to academic setback: ‘Growth mindset’ as a buffer against demotivation, Int. J. Educ. Psychol., 4(2), 198 DOI:10.17583/ijep.2015.1482.
  2. Amaral K. E., Shank J. D., Shibley Jr I. A. and Shibley L. R., (2013), Web-enhanced general chemistry increases student completion rates, success, and satisfaction, J. Chem. Educ., 90(3), 296–302.
  3. American Educational Research Association, (2014), Standards for educational and psychological testing, American Educational Research Association, American Psychological Association, and National Council on Measurement in Education.
  4. Anderson C. A., (1995), Implicit theories in broad perspective, Psychol. Inquiry, 6(4), 286–289.
  5. Barger M. M., (2019), Connections between instructor messages and undergraduate students' changing personal theories about education, J. Exp. Educ., 87(2), 314–331 DOI:10.1080/00220973.2018.1469111.
  6. Bedford S., (2017), Growth mindset and motivation: A study into secondary school science learning, Res. Pap. Educ., 32(4), 424–443 DOI:10.1080/02671522.2017.1318809.
  7. Bernard H. R. and Ryan G. W., (2010), Analyzing qualitative data: Systematic approaches, Thousand Oaks, CA: Sage.
  8. Binning K. R., Wang M.-T. and Amemiya J., (2019), Persistence mindset among adolescents: Who benefits from the message that academic struggles are normal and temporary? J. Youth Adolesc., 48(2), 269–286.
  9. Blackwell L. S., Trzesniewski K. H. and Dweck C. S., (2007), Implicit theories of intelligence predict achievement across an adolescent transition: A longitudinal study and an intervention, Child Dev., 78(1), 246–263 DOI:10.1111/j.1467-8624.2007.00995.x.
  10. Brown M. F., (2008), Cultural relativism 2.0, Curr. Anthrop., 49(3), 363–383 DOI:10.1086/529261.
  11. Buckley J., O’Connor A., Seery N., Hyland T. and Canty D., (2019), Implicit theories of intelligence in STEM education: Perspectives through the lens of technology education students, Int. J. Technol. Des. Educ., 29(1), 75–106 DOI:10.1007/s10798-017-9438-8.
  12. Burgoyne A. P. and Macnamara B. N., (2020), The reliability and validity of the mindset assessment profile tool, Preprint DOI:10.31234/osf.io/hx53u.
  13. Burkley M., Parker J., Paul Stermer S. and Burkley E., (2010), Trait beliefs that make women vulnerable to math disengagement, Person. Indiv. Diff., 48(2), 234–238 DOI:10.1016/j.paid.2009.09.002.
  14. Burnette J. L., O'Boyle E. H., VanEpps E. M., Pollack J. M. and Finkel E. J., (2013), Mind-sets matter: A meta-analytic review of implicit theories and self-regulation, Psychol. Bull., 139(3), 655–701 DOI:10.1037/a0029531.
  15. Cavanagh A. J., Chen X., Bathgate M., Frederick J., Hanauer D. I. and Graham M. J., (2018), Trust, growth mindset, and student commitment to active learning in a college science course, CBE – Life Sci. Educ., 17(1), ar10 DOI:10.1187/cbe.17-06-0107.
  16. Chen J. A. and Pajares F., (2010), Implicit theories of ability of Grade 6 science students: Relation to epistemological beliefs and academic motivation and achievement in science, Contemp. Educ. Psychol., 35(1), 75–87 DOI:10.1016/j.cedpsych.2009.10.003.
  17. Costa A. and Faria L., (2018), Implicit theories of intelligence and academic achievement: A meta-analytic review, Front. Psychol., 9(829), 1–16 DOI:10.3389/fpsyg.2018.00829.
  18. Dai T. and Cromley J. G., (2014), Changes in implicit theories of ability in biology and dropout from STEM majors: A latent growth curve approach, Contemp. Educ. Psychol., 39(3), 233–247 DOI:10.1016/j.cedpsych.2014.06.003.
  19. De Castella K. and Byrne D., (2015), My intelligence may be more malleable than yours: The revised implicit theories of intelligence (selftheory) scale is a better predictor of achievement, motivation, and student disengagement, Eur. J. Psych. Educ., 30(3), 245–267.
  20. Dupeyrat C. and Mariné C., (2005), Implicit theories of intelligence, goal orientation, cognitive engagement, and achievement: A test of Dweck's model with returning to school adults, Contemp. Educ. Psychol., 30(1), 43–59 DOI:10.1016/j.cedpsych.2004.01.007.
  21. Dweck C., (1999), Self-theories: Their role in personality, motivation, and development, Psychology, pp. 177–178.
  22. Dweck C. S., (2006), Mindset: The new psychology of success. New York: Random House.
  23. Dweck C. S. and Leggett E. L., (1988), A social-cognitive approach to motivation and personality, Psychol. Rev., 95(2), 256.
  24. Dweck C. S., Chiu C.-y. and Hong Y.-y., (1995a), Implicit theories and their role in judgments and reactions: A word from two perspectives, Psychol. Inquiry, 6(4), 267–285 DOI:10.1207/s15327965pli0604_1.
  25. Dweck C. S., Chiu C.-y. and Hong Y.-y., (1995b), Implicit theories: Elaboration and extension of the model, Psychol. Inquiry, 6(4), 322–333.
  26. Fink A., Cahill M. J., McDaniel M. A., Hoffman A. and Frey R. F., (2018), Improving general chemistry performance through a growth mindset intervention: Selective effects on underrepresented minorities, Chem. Educ. Res. Pract., 19(3), 783–806 DOI:10.1039/C7RP00244K.
  27. Flanigan A. E., Peteranetz M. S., Shell D. F. and Soh L.-K., (2017), Implicit intelligence beliefs of computer science students: Exploring change across the semester, Contemp. Educ. Psychol., 48, 179–196 DOI:10.1016/j.cedpsych.2016.10.003.
  28. Gardner H., (2006), Multiple intelligences: New horizons, New York: BasicBooks.
  29. Given L. M., (2016), 100 Questions (and Answers) about Qualitative Research, Thousand Oaks, CA: Sage.
  30. Grant H. and Dweck C. S., (2003), Clarifying achievement goals and their impact, J. Person. Soc. Psychol., 85(3), 541.
  31. Guest G., Bunce A. and Johnson L., (2006), How many interviews are enough? An experiment with data saturation and variability, Field Methods, 18, 59–82.
  32. Gunderson E. A., Hamdan N., Sorhagen N. S. and D'Esterre A. P., (2017), Who needs innate ability to succeed in math and literacy? Academic-domain-specific theories of intelligence about peers versus adults, Dev. Psychol., 53(6), 1188.
  33. Harris R. B., Mack M. R., Bryant J., Theobald E. J. and Freeman S., (2020), Reducing achievement gaps in undergraduate general chemistry could lift underrepresented students into a “hyperpersistent zone”, Sci. Adv., 6(24), eaaz5687 DOI:10.1126/sciadv.aaz5687.
  34. Henry M. A., Shorter S., Charkoudian L., Heemstra J. M. and Corwin L. A., (2019), FAIL is not a four-letter word: A theoretical framework for exploring undergraduate students’ approaches to academic challenge and responses to failure in STEM learning environments, CBE – Life Sci. Educ., 18(1), ar11 DOI:10.1187/cbe.18-06-0108.
  35. Hochanadel A. and Finamore D., (2015), Fixed and growth mindset in education and how grit helps students persist in the face of adversity, J. Int. Educ. Res., 11(1), 47–50.
  36. Hong Y.-y., Chiu C.-y., Dweck C. S., Lin D. M.-S. and Wan W., (1999), Implicit theories, attributions, and coping: A meaning system approach, J. Person. Soc. Psychol., 77(3), 588.
  37. Horowitz G., Rabin L. A. and Brodale D. L., (2013), Improving student performance in organic chemistry: Help seeking behaviors and prior chemistry aptitude, J. Schol. Teach. Learn., 13(3), 120–133.
  38. Hsieh H.-F. and Shannon S. E., (2005), Three approaches to qualitative content analysis, Qual. Health Res., 15(9), 1277–1288.
  39. Karlen Y., Suter F., Hirt C. and Maag Merki K., (2019), The role of implicit theories in students' grit, achievement goals, intrinsic and extrinsic motivation, and achievement in the context of a long-term challenging task, Learn. Indiv. Diff., 74, 101757 DOI:10.1016/j.lindif.2019.101757.
  40. Koch A. K., (2017), It's about the gateway courses: Defining and contextualizing the issue, New Dir. High. Educ., 2017(180), 11–17 DOI:10.1002/he.20257.
  41. Komperda R., Hosbein K. N. and Barbera J., (2018), Evaluation of the influence of wording changes and course type on motivation instrument functioning in chemistry, Chem. Educ. Res. Pract., 19(1), 184–198.
  42. Leslie S.-J., Cimpian A., Meyer M. and Freeland E., (2015), Expectations of brilliance underlie gender distributions across academic disciplines, Science, 347(6219), 262 DOI:10.1126/science.1261375.
  43. Levy S. R., Stroessner S. J. and Dweck C. S., (1998), Stereotype formation and endorsement: The role of implicit theories, J. Person. Soc. Psychol., 74(6), 1421–1436 DOI:10.1037/0022-3514.74.6.1421.
  44. Limeri L. B., Carter N. T., Choe J., Harper H. G., Martin H. R., Benton A. and Dolan E. L., (2020a), Growing a growth mindset: Characterizing how and why undergraduate students’ mindsets change, Int. J. STEM Educ., 7(1), 1–19 DOI:10.1186/s40594-020-00227-2.
  45. Limeri L. B., Choe J., Harper H. G., Martin H. R., Benton A. and Dolan E. L., (2020b), Knowledge or abilities? How undergraduates define intelligence, CBE – Life Sci. Educ., 19(1), ar5.
  46. Lüftenegger M. and Chen J. A., (2017), Conceptual issues and assessment of implicit theories, Z. Psychol., 225(2), 99.
  47. Lyons T., (2006), The puzzle of falling enrolments in physics and chemistry courses: Putting some pieces together, Res. Sci. Educ., 36(3), 285–311 DOI:10.1007/s11165-005-9008-z.
  48. Lytle A. and Shin J. E., (2020), Incremental beliefs, STEM efficacy and STEM interest among first-year undergraduate students, J. Sci. Educ. Technol., 1–10.
  49. Macnamara B. N. and Rupani N. S., (2017), The relationship between intelligence and mindset, Intelligence, 64, 52–59 DOI:10.1016/j.intell.2017.07.003.
  50. McKinney L., Novak H., Hagedorn L. S. and Luna-Torres M., (2019), Giving up on a course: An analysis of course dropping behaviors among community college students, Res. High. Educ., 60(2), 184–202.
  51. Messick S., (1987), Validity, ETS Res. Rep. Ser., 1987(2), i–208.
  52. Murner K. M. and Hessler E. E., (2020), The effects of difficulty and individual differences in mindset on persistence, Aisthesis: Honors Stud. J., 11(2), 18–22.
  53. Oliveira-Castro J. M. and Oliveira-Castro K. M., (2003), The relativity of “intelligence” in psychology and its adverbial function in ordinary language, Behav. Philos., 31, 1–17.
  54. Pienta N. J., (2014), Science scores, measures of success, and national competitiveness, J. Chem. Educ., 91(2), 159–160 DOI:10.1021/ed500060v.
  55. Scherr R. E., Plisch M., Gray K. E., Potvin G. and Hodapp T., (2017), Fixed and growth mindsets in physics graduate admissions, Phys. Rev. Phys. Educ. Res., 13(2), 020133 DOI:10.1103/PhysRevPhysEducRes.13.020133.
  56. Scott M. J. and Ghinea G., (2014), On the domain-specificity of mindsets: The relationship between aptitude beliefs and programming practice, IEEE Trans. Educ., 57(3), 169–174 DOI:10.1109/TE.2013.2288700.
  57. Shively R. L. and Ryan C. S., (2013), Longitudinal changes in college math students’ implicit theories of intelligence, Soc. Psychol. Educ., 16(2), 241–256.
  58. Sisk V. F., Burgoyne A. P., Sun J., Butler J. L. and Macnamara B. N., (2018), To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses, Psychol. Sci., 29(4), 549–571 DOI:10.1177/0956797617739704.
  59. Sternberg R. J., (2000), Implicit theories of intelligence as exemplar stories of success: Why intelligence test validity is in the eye of the beholder, Psychol., Publ. Pol., Law, 6(1), 159.
  60. Tempelaar D. T., Rienties B., Giesbers B. and Gijselaers W. H., (2015), The pivotal role of effort beliefs in mediating implicit theories of intelligence and achievement goals and academic motivations, Soc. Psychol. Educ., 18(1), 101–120.
  61. van Aalderen-Smeets S. I. and van der Molen J. H. W., (2018), Modeling the relation between students’ implicit beliefs about their abilities and their educational STEM choices, Int. J. Tech. Des. Educ., 28(1), 1–27.
  62. van Aalderen-Smeets S. I., Walma van der Molen J. H. and Xenidou-Dervou I., (2019), Implicit STEM ability beliefs predict secondary school students' STEM self-efficacy beliefs and their intention to opt for a STEM field career, J. Res. Sci. Teach., 56(4), 465–485.
  63. Watts F. M. and Finkenstaedt-Quinn S. A., (2021), The current state of methods for establishing reliability in qualitative chemistry education research articles, Chem. Educ. Res. Pract., DOI:10.1039/d1rp00007a.
  64. Wren D. and Barbera J., (2013), Gathering evidence for validity during the design, development, and qualitative evaluation of thermochemistry concept inventory items, J. Chem. Educ., 90(12), 1590–1601.
  65. Yeager D. S. and Dweck C. S., (2012), Mindsets that promote resilience: When students believe that personal characteristics can be developed, Educ. Psychol., 47(4), 302–314 DOI:10.1080/00461520.2012.722805.
  66. Yeager D. S. and Dweck C. S., (2020), What can be learned from growth mindset controversies? Am. Psychol., 75(9), 1269–1284 DOI:10.1037/amp0000794.
  67. Yu J. and McLellan R., (2020), Same mindset, different goals and motivational frameworks: Profiles of mindset-based meaning systems, Contemp. Educ. Psychol., 62, 101901.

This journal is © The Royal Society of Chemistry 2021