Deborah L. Santos,a Jack Barberab and Suazette R. Mooring*a
aGeorgia State University, Atlanta, Georgia, USA. E-mail: smooring@gsu.edu
bPortland State University, Portland, Oregon, USA
First published on 30th May 2022
Chemistry education research has increasingly considered the role of affect when investigating chemistry learning environments over the past decade. Despite its popularity in educational spheres, mindset has been understudied from a chemistry-specific perspective. Mindset encompasses one's beliefs about the ability to change intelligence with effort and has been shown to be a domain-specific construct. For this reason, students’ mindset would be most relevant in chemistry if it were measured as a chemistry-specific construct. To date, no instrument has been developed for use in chemistry learning contexts. Here we present evidence supporting the development process and final product of a mindset instrument designed specifically for undergraduate chemistry students. The Chemistry Mindset Instrument (CheMI) was developed through an iterative design process requiring multiple implementations and revisions. We analyze the psychometric properties of CheMI data from a sample of introductory (general and organic) chemistry students enrolled in lecture courses. We achieved good data-model fit via confirmatory factor analysis and high reliability for the newly developed items, indicating that the instrument functions well with the target population. Significant correlations were observed for chemistry mindset with students’ self-efficacy, mastery goals, and course performance, providing external validity evidence for the construct measurement.
To unravel the motivational relations responsible for differences in student outcomes, appropriate measures of each construct must be established. Several researchers have recently criticized the mindset meaning system (Burgoyne and Macnamara, 2021), the measurement quality associated with it (De Castella and Byrne, 2015; Lüftenegger and Chen, 2017; Limeri et al., 2020a), or both (Dupeyrat and Mariné, 2005; Tempelaar et al., 2015; van Aalderen-Smeets and van der Molen, 2018). Likewise, meta-analyses of the mindset literature have highlighted the inconsistencies of mindset as a predictor of achievement with undergraduate student populations (Costa and Faria, 2018; Sisk et al., 2018). These inconsistent findings may point to inappropriate measurement of the mindset construct with the population of interest, indicating possible lack of validity. Additionally, work published by Santos et al. (2021) and Limeri et al. (2020a) found that undergraduate chemistry students interpret the terminology used in mindset instruments (i.e., “intelligence”) in a broad range of ways, which leads to potential response process concerns as some interpretations may have different implied malleabilities associated with them (e.g., knowledge is inherently a grow-able quality). To avoid these varied interpretations and improve response fairness, less broadly defined wording can be used in mindset instrument items.
Post-secondary students cannot be expected to hold the same views that primary and secondary students would have about a complex subject such as intelligence. It is likely that undergraduates hold a more multiplistic definition of intelligence as they increasingly realize that success can be achieved within a variety of different domains and using a variety of cognitive skills. This is supported by arguments that a domain-specific mindset measure is more appropriate at the undergraduate level within domain-specific contexts (Shively and Ryan, 2013; Scott and Ghinea, 2014; Little et al., 2016; Gunderson et al., 2017; Gorson and O'Rourke, 2019). Many domain-specific mindset studies in STEM have incorporated mindset measures that simply modify the item language from “intelligence” to terms such as “biology ability” (Dai and Cromley, 2014), “programming aptitude” (Scott and Ghinea, 2014), or “math intelligence” (Shively and Ryan, 2013). These types of modifications seek to improve the predictive power of mindset on STEM course performance or other outcomes but lack the qualitative justification necessary to suggest valid construct measurement. Buckley et al. (2019) demonstrated that students provide a broad range of definitions for intelligence within the technological domain through the qualitative exploration of characteristic behaviors of intelligent people in technology. These findings indicate that ideas about intelligence within a single domain can be complex for students to define. In addition to supporting the need for domain-specific mindset measures, these findings support infusing specified definitions of domain-specific intelligence within the instrument to yield more consistent interpretations. Based on these prior studies, it is reasonable to assume that chemistry intelligence is a unique and complex trait. 
Therefore, its meaning should be clarified for students when asked to report their beliefs, especially considering that many have a novice-level understanding of the field.
Another aspect of measuring mindset that has been questioned in recent years is the factor structure intended by typical mindset instruments (Lüftenegger and Chen, 2017). Mindset instruments are usually designed to measure two subfactors, entity and incremental theory beliefs (Dweck, 1999; Yeager and Dweck, 2020). Despite the two-factor design, mindset is often treated as a unidimensional measure when interpreting students’ responses by using cutoff values or terciles to identify respondents with a fixed mindset (Hong et al., 1999). Studies have shown inconsistent results in factor structure, with some favoring a single-factor model and others favoring the intended two-factor structure (Gunderson et al., 2017; Lüftenegger and Chen, 2017; van Aalderen-Smeets and van der Molen, 2018). A further critique concerns measurement validity: many mindset studies report quantitative results using mindset as a predictor variable yet provide no evidence of valid instrument usage with the studied population, typically omitting confirmatory factor analysis. The validity questions raised here must therefore be taken into account when measuring domain-specific mindset, and they have driven our development of a chemistry-specific mindset instrument.
Theoretically, this results in incremental theorists exhibiting “growth mindset behaviors” such as putting forth more effort and persisting toward success because they believe it to be more attainable relative to entity theorists. Alternatively, entity theorists are more likely to exhibit “fixed mindset behaviors” such as procrastinating, avoiding evaluation, and self-hindering to remove emphasis from their natural ability onto their willful actions (Molden and Dweck, 2006; Burnette et al., 2013). These behaviors are self-protective responses to challenge that reflect ego threat, either as a result of interpreting challenge as a threat to their self-perceived value as intelligent individuals or confirming their negative self-perceptions. These relations suggest a link between mindset, self-efficacy, and achievement behaviors. Self-efficacy, the belief that one can achieve the desired outcome, has been shown to relate to mindset in several motivational analyses and thus is a useful variable to consider for demonstrating external validity (Komarraju and Nadler, 2013; Bedford, 2017; Lytle and Shin, 2020).
The originally proposed meaning system that students utilize based on their beliefs stated that achievement goals differ between growth and fixed mindset individuals (Dweck and Leggett, 1988). Achievement goals encompass two dimensions: mastery versus performance and approach versus avoidance (Elliot and McGregor, 2001). A student who sets mastery-approach goals is focused on increasing understanding of the content, while mastery-avoidance goals imply avoiding lack of understanding. Comparatively, performance-approach goals drive students toward achieving high grades, while performance-avoidance leads to avoiding poor grades. It has been proposed that growth mindset aligns with mastery-oriented goals and fixed mindset aligns with performance-oriented goals (Dweck and Leggett, 1988; Smiley et al., 2016). Empirical support for the link between fixed mindset and performance orientation is weak; rather, most students report some degree of performance orientation (Leondari and Gialamas, 2002; Burnette et al., 2013; Dinger and Dickhäuser, 2013; Karlen et al., 2019). This trend may be due to the increased emphasis on high-stakes testing and grades-based assessment within modern education systems. Finally, as previously discussed, mindset has varying empirical predictive power for achievement measures such as grades, yet theoretically, growth mindset should lead to improved grades through adaptive behaviors (Hong et al., 1999; Blackwell et al., 2007). Achievement goals and course performance variables offer additional potential for demonstrating external validity of appropriate mindset measures.
(1) Develop an instrument specific to mindset regarding chemistry intelligence intended for introductory undergraduate chemistry students.
(2) Determine the reliability and validity of measurements made with the developed instrument when used in the target population.
The two research goals were carried out by addressing the following research questions:
(1) How can item wording be modified to produce improved student response-process and construct measurement?
(2) How can the instrument's response-scale and dimensionality be modified to produce improved student response-process and construct measurement?
Across semesters, the sample was consistently representative of the overall course demographics. The majority of students identified as female (69%), which is representative of STEM course enrollment at the institution. Most students reported being in their third year (41%) and as a pre-professional or STEM major other than chemistry (90%). The samples were consistently representative of the racial and ethnic diversity at the university according to student reports (37% Black or African American, 28% Asian, 15% White (non-Hispanic), 12% Hispanic, 7% other). Approximately half of the students (53%) reported eligibility for a Pell Grant, which can be used as an approximate indicator of lower socioeconomic status, and approximately one-third (34%) identified as first-generation college students.
The rates for student participation relative to enrollment in participating course sections are shown in Table 1 for all surveys by semester. Participation rates varied across each instructor's section. A quality control procedure was used to flag careless responses: items directed students to select a particular answer to verify that they were paying attention to the content of the statements. After removing students who did not select the indicated response on these quality control items, the remaining participants’ data were analyzed.
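As a rough illustration of this screening step, the directed-response check can be implemented as a simple filter. This is a minimal sketch; the item name, expected response, and sample data are hypothetical and not taken from the actual survey.

```python
# Minimal sketch of the careless-response screen described above.
# The column name "attention_check" and the directed answer (4) are
# hypothetical examples, not the actual survey's items.
def filter_careless(responses, check_item="attention_check", expected=4):
    """Keep only respondents who selected the directed answer."""
    return [r for r in responses if r.get(check_item) == expected]

raw = [
    {"id": 1, "attention_check": 4, "item1": 7},  # passed the check
    {"id": 2, "attention_check": 2, "item1": 9},  # failed: flagged and removed
    {"id": 3, "attention_check": 4, "item1": 5},  # passed the check
]
clean = filter_careless(raw)
print([r["id"] for r in clean])  # → [1, 3]
```

Only data from respondents who pass every such check would be retained for the analyses that follow.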
| Semester | Timepoint | Total participants | Response rate (%) |
|---|---|---|---|
| Fall 2020 | Pretest | N = 851 | 45.4 |
| Fall 2020 | Posttest | N = 593 | 30.5 |
| Spring 2021 | Pretest | N = 595 | 30.5 |
| Spring 2021 | Posttest | N = 513 | 30.8 |
| Fall 2021 | Pretest | N = 514 | 46.5 |
| Fall 2021 | Posttest | N = 436 | 67.6 |
My ability to apply chemistry knowledge is something…
(I can’t change at all) 1 2 3 4 5 6 7 8 9 10 (I can change a lot)
Additional measures known to associate with mindset beliefs were also included in the surveys (Appendix B). Self-efficacy was measured using 6 out of the original 8 items from the self-efficacy subscale in the Motivated Strategies for Learning Questionnaire (Pintrich, 1991). Responses were reported using a 6-point Likert scale ranging from strongly disagree to strongly agree. Achievement Goals were measured using the 2 × 2 framework proposed by Elliot and McGregor (2001) and the 12 instrument items associated with it. The wording in these items was modified to reflect learning in a chemistry course by changing all references to “in this class” to “in chemistry.” The four subscales in this instrument each contain 3 items ranked on a 6-point Likert scale ranging from strongly disagree to strongly agree. The four dimensions are called Mastery-Approach, Mastery-Avoidance, Performance-Approach, and Performance-Avoidance.
Fig. 1 Example student quotes that emphasize each aspect of chemistry intelligence included in instrument items.
When presented with a range of cognitive abilities relevant to learning chemistry (Fig. 1), derived from prior results on definitions for “chemistry intelligence” (Santos et al., 2021), students in interviews agreed that all were important factors to intelligence in chemistry. A sorting task was introduced to the students, instructing them to categorize the 7 chemistry intelligence terms in whatever way they believed they fit together. In the process of sorting these terms, Benjamin viewed them as abilities that develop sequentially while learning chemistry and that they all fall under “overall chemistry intelligence” as an umbrella term. During that same task, Camille commented that overall chemistry intelligence can change depending on improvements in the other six abilities. She stated that half of the abilities are less changeable and the other half she labeled as the “growth part of chemistry.” Abraham said that the term “overall chemistry intelligence” related to all six of the other cognitive abilities listed. When reading a statement regarding the ability to change one's problem-solving ability in chemistry, Abraham responded by discussing the extent to which he believed chemistry intelligence can change. When asked why he brought up chemistry intelligence, he stated:
“Problem-solving ability connects a good amount with me to chemistry intelligence because if you have the ability to sort of comprehend hard problems, you have a good understanding of chemistry and you have a better chemistry intelligence than other students do. But not, it's not like all revolving around that. I guess it's like a certain aspect of your chemistry intelligence, which is a big, big thing. But I do think problem-solving is a good portion of chemistry intelligence.”
To support the shift from the Version 1 term “chemistry intelligence” to various definitions (Versions 2, 3, and 4), it is helpful to compare how students responded to the first and final item wordings in cognitive interviews. Students commented about the repetitive nature of the original Dweck-style items (Version 1). This insight, combined with their reported tendency to simply select the same response for all items in a category, suggests that students do not feel the need to consider each item individually, but rather aim to respond consistently. This trend was not observed when asked to respond to items containing the various cognitive abilities as definitions of chemistry intelligence (Versions 2, 3, and 4). For these items, it was clear that students spent more time considering the nuances in their own abilities and beliefs regarding those abilities, thus leading to more variation in answer selection and care in representing their views about each item. These findings support the response-process validity associated with final item wording of the CheMI.
The construct of intelligence is a complex trait and can be defined in many ways. This complexity also applies when referring to intelligence within the discipline of chemistry. Equal emphasis across a broad range of cognitive skills deemed important for intellectual success in chemistry courses was selected for this instrument to provide a multiplistic view of intelligence within the measure. Interviews revealed that when presented with all of these cognitive aspects of chemistry intelligence (Fig. 1), students agreed that they are all important and fit within the umbrella of “chemistry intelligence.” When asked what they believe the term “overall chemistry intelligence” to mean, they tended to respond that it meant all of the previously discussed aspects combined. These responses from students support the inclusion of each of these definitions within the construct of chemistry intelligence as presented in the instrument items. Additionally, cognitive interview findings that students understood the meaning of “overall chemistry intelligence” to refer to the other presented terms suggests that, within the context of the instrument, students are directed to interpret the broader term in light of all of the cognitive abilities referenced. These findings provide face validity evidence for student interpretations of the item wording as representing the same construct intended, chemistry intelligence. It can also be argued that any guesswork associated with interpreting a vague term such as “chemistry intelligence” is reduced within the context of the multiplistic definition as presented.
Substantial differences were observed between the Likert scale (Version 2) and semantic differential (Version 3) response distributions and were used to select the semantic differential scale for all future iterations. This decision was further supported through cognitive interview evidence. One last aspect was considered to improve the responses and measurement quality. A ceiling effect may be present for some students when using a 6-point scale, regardless of scale type. The fourth iteration (Version 4) of the instrument, therefore, contained a 10-point semantic differential scale. To test the efficacy of the expanded scale, Versions 3 and 4 were randomly assigned during the survey administered at the beginning of Spring 2021. More detail into the evidence and rationale behind each of the modification decisions made are presented in the following sections.
Students tended to prefer the 10-point scale to the 6-point scale because it was more familiar and provided more room for variation in their beliefs between items. For example, Abraham said, “Because there's more numbers and there's like more ways to put my feeling into the question. So, I feel like there's more numbers, like, I can better gauge how I feel about this certain thing. And then the other one, because I feel like, when I say 5 (out of 6), it's more of a vague answer than whenever I say 8 (out of 10).” Desiree also expressed that the 10-point scale is more familiar when thinking of the way people often rate things out of 10. When responding to a 10-point scale item, Elena said that she would choose 5, which aligned with the self-doubt she had expressed previously in the interview. However, when reading the same item on a 6-point scale, she stated that she would choose either a 4 or 5, which is much closer to a growth belief response. She also stated that the 10-point scale is more precise for her to be able to express her feeling on the statement, while the 6-point scale requires her to be more “decisive.” Camille said that a higher value on the 6-point scale equates in her mind to a slightly smaller value on the 10-point scale (5.95 out of 6 is the same to her as 9.5 out of 10). She also said, “So, it's something that's…1 to 10 is easier to be visualized, at least in my mind, than a 1 through 6, even though, like, in the end, it's still the same. I believe it can change a lot.” Benjamin commented on the reason for selecting a higher value on the 1 to 6 scale relative to the 1 to 10 scale, “I guess when it's like a smaller number range It feels like it's like, more severe as the numbers go lower.” Commenting on the precision of each scale, he added,
“I think the smaller scale kind of feels a little more limiting, like it almost over summarizes maybe. As for the 10-point scale, it might be able to be more specific. I mean, again, it's hard to say, because…I don't know what these – it's hard to say, like, what it even quantifies. I mean, because I'm just assuming, like, 6 and 10 are like infinity and then ones are nothings. Then it's like nothing to infinity.”
The cognitive interviews provided useful evidence to support decisions related to students’ response processes but did so most strongly for the transition from the 6- to 10-point semantic differential scale.
In addition to the improved central tendency of the mean across the instrument iterations, skewness and kurtosis both decreased with the modifications. These values are shown in Table 2. The smallest values for both skewness and kurtosis were observed in the final version (Version 4) of the instrument containing the 10-point semantic differential scale and the seven items with defined abilities. These values indicate a slight negative skew favoring growth mindset beliefs and a peak slightly taller than a perfectly normal curve, but both fall well within the range of an acceptable normal distribution (Jones, 1969). The skew toward growth mindset was observed consistently across versions and was most reduced in Version 4. This skew is most likely due to the social desirability of reporting growth mindset that has been noted in prior studies (Hong et al., 1999; Santos et al., 2021), and is likely impacted by the popularity of mindset instruction in K-12 learning contexts. The reduced skew observed with the instrument modifications made here is likely due to a decrease in social desirability influence combined with the wording and response-scale modifications.
| Instrument version | Sample | Response scale/dimension | Skewness | Kurtosis |
|---|---|---|---|---|
| Version 1 | Fall 2020 pretest, N = 851 | Likert/incremental | −0.837 | 0.761 |
| Version 1 | Fall 2020 pretest, N = 851 | Likert/entity | 0.985 | 0.928 |
| Version 2 | Fall 2020 posttest, N = 292a | Likert/incremental | −0.760 | 1.391 |
| Version 2 | Fall 2020 posttest, N = 292a | Likert/entity | 0.768 | 0.816 |
| Version 3 | Spring 2021 pretest, N = 289a | 6-point semantic differential | −0.202 | 0.248 |
| Version 4 | Spring 2021 pretest, N = 306a | 10-point semantic differential | −0.188 | 0.187 |

a These values are ∼50% of those reported in Table 1 due to the random version assignment utilized during these administrations.
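For readers unfamiliar with these statistics, skewness and kurtosis can be computed from standard moment-based definitions, as in the sketch below. The response data are fabricated for illustration, and this sketch uses the excess-kurtosis convention (a normal distribution scores 0); the published values may use a different software convention.

```python
import statistics

def skewness(xs):
    """Moment-based sample skewness: E[(x - mean)^3] / sd^3."""
    m = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - m) ** 3 for x in xs) / (n * sd ** 3)

def excess_kurtosis(xs):
    """Moment-based kurtosis minus 3, so a normal distribution scores 0."""
    m = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - m) ** 4 for x in xs) / (n * sd ** 4) - 3

# Fabricated 10-point-scale responses with a growth-leaning pile-up
# near the top of the scale, producing a negative skew.
scores = [3, 5, 6, 7, 7, 8, 8, 9, 9, 10]
print(round(skewness(scores), 3))         # negative: left tail
print(round(excess_kurtosis(scores), 3))  # near 0: roughly normal peak
```

A negative skewness here corresponds to the growth-favoring pile-up at the high end of the scale described above.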
Response process validity. In comparing response-scale formats, students in cognitive interviews tended to select higher response values when using the Likert scale and said they had more freedom when using a semantic differential because it did not express a positive or negative viewpoint. These two findings support the claim that social desirability or acquiescence bias may play a role in response patterns for Likert versions of mindset items (Lüftenegger and Chen, 2017). The less personal use of numbers without expressing a particular view, as seen in the semantic differential items, seems to influence student opinions to a lesser degree. Although the values themselves do not state their meaning (are “arbitrary,” according to Desiree), they hold less value judgment and are left to the student to interpret. This provides evidence that Versions 3 and 4 reduce response-scale format influences on student responses, supporting response-process validity. These influences described in the cognitive interviews are likely a major cause of the skewed distributions observed in Versions 1 and 2.
Finally, consideration of response scale size was used to examine possible ceiling effects associated with limited value ranges. Upon initially responding to an item in cognitive interviews, students did not simply select the same scaled value between the 6-point and 10-point scale versions (e.g., 5 out of 6 and 8 out of 10). They first selected their response, then attempted to explain their response despite realizing it did not align with a direct conversion numerically. A less extreme value was selected when using the 10-point scale, indicating that having more scale options allowed them to feel more comfortable selecting a lower value. Students also said that a number lower than 5 on the 6-point scale seemed to be an “extreme” view to them, possibly indicating the effect of social desirability associations with a growth mindset leading to responses closer to the highest value. This finding aligns with the increased central tendency observed in Version 4 (10-point scale) relative to Version 3 (6-point scale), as shown in Fig. 2. Combining these results, we have evidence that a 6-point scale yields a ceiling effect for many student responses, and that this effect is reduced with the expanded 10-point scale. Reduction of a ceiling effect is beneficial in measurement to obtain better resolution of the distribution by shifting away from the scale edge and toward the center. Further support for the 10-point scale was provided in interview comments that a 1- to 10-point scale is more familiar and that it allowed students to be more precise in their responses, as evidence of response-process improvements. Considering all of this evidence led to the decision to retain a 10-point scale in the final CheMI version. The full final version (Version 4) of the CheMI is shown in Appendix A.
| Data collection | N | χ² (df, p) | RMSEA (confidence interval) | CFI | SRMR |
|---|---|---|---|---|---|
| Fall 2021 pretest | 514 | 29.34 (14, 0.009) | 0.046 (0.022 to 0.070) | 0.994 | 0.014 |
| Fall 2021 posttest | 435 | 28.90 (14, 0.011) | 0.049 (0.023 to 0.075) | 0.993 | 0.015 |
Additionally, all items yielded high standardized factor loadings (all loadings ≥0.727), indicating a strong relation between each item and the overall latent construct. Fig. 3 presents the CheMI Fall 2021 pretest data fit to a CFA model with standardized factor loadings. Similar CFA results were obtained in the posttest administration of the instrument in terms of the strengths of all factor loadings. To determine the single-administration reliability of responses across the 7 CheMI items, McDonald's omega (ω) values were obtained for both the pre- and posttest survey administrations. Both time points yielded excellent reliability (ωpretest = 0.929, ωposttest = 0.934).
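McDonald's omega for a unidimensional congeneric model can be computed directly from standardized factor loadings. The sketch below uses hypothetical loadings (all ≥0.727, matching the reported minimum, but not the actual published values) to show the calculation.

```python
def mcdonald_omega(loadings):
    """McDonald's omega for a unidimensional model with standardized
    loadings: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    variances), where each standardized error variance is 1 - loading^2."""
    s = sum(loadings)
    errors = sum(1 - lam ** 2 for lam in loadings)
    return s ** 2 / (s ** 2 + errors)

# Hypothetical loadings for the 7 CheMI items (all >= 0.727, as reported;
# these are illustrative values, not the published loadings).
loadings = [0.73, 0.78, 0.80, 0.82, 0.84, 0.86, 0.88]
print(round(mcdonald_omega(loadings), 3))  # → 0.933
```

With seven items loading this strongly on a single factor, omega lands in the low 0.9s, consistent with the reliability values reported above.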
Fig. 3 CFA model of Fall 2021 pretest Version 4 with standardized factor loadings. Item wordings can be found in Appendix A.
In addition to the cognitive interview evidence previously discussed that supports the construct alignment of all 7 items as measuring mindset beliefs about chemistry intelligence, CFA data-model fit and factor loadings corroborate the construct validity. All items strongly correspond to a unidimensional chemistry mindset construct, with no apparent subfactors. Additionally, students respond reliably across all items. As all 7 cognitive aspects appear to contribute to the overall construct according to multiple data sources, all 7 items were retained in the final version of the CheMI.
Pretest correlations (N = 421) are shown in the numbered columns; the final column gives posttest correlations with chemistry mindset (matched data, N = 209).

| Variables | 1 | 2 | 3 | 4 | 5 | Posttest chemistry mindset |
|---|---|---|---|---|---|---|
| 1 Chemistry mindset | 1 | | | | | 0.630 |
| 2 Performance approach | 0.06 | 1 | | | | 0.167 |
| 3 Performance avoidance | 0.011 | 0.187b | 1 | | | −0.025 |
| 4 Mastery approach | 0.337 | 0.124a | 0.147b | 1 | | 0.218 |
| 5 Mastery avoidance | −0.139 | −0.094 | 0.127a | 0.105 | 1 | −0.226 |
| 6 Self-efficacy | 0.447 | 0.134a | −0.007 | 0.409b | −0.247b | 0.475 |

a p < 0.006, b p < 0.001 for Bonferroni corrections with 8 variable correlations.
Self-efficacy yielded the largest external correlations with CheMI scores at both the pretest (r = 0.447, p < 0.002) and the posttest (r = 0.475, p < 0.002). This indicates that students with higher reported self-efficacy in their chemistry courses were more likely to report that they can improve aspects of their chemistry intelligence, aligning with findings from prior studies that these two constructs are positively related (Komarraju and Nadler, 2013; Bedford, 2017; Lytle and Shin, 2020). Likewise, mastery-approach goals were observed to correlate with both pre- (r = 0.337, p < 0.002) and posttest chemistry mindset (r = 0.218, p < 0.002), suggesting that students focused on mastery are more inclined toward beliefs associated with improving their chemistry intelligence (Dweck and Leggett, 1988). A negative correlation was observed for mastery-avoidance goals and chemistry mindset (pre: (r = −0.139, p < 0.002); post: r = −0.226, p < 0.002). Although the mastery-avoidance dimension of achievement goals was not a part of Dweck's original theoretical framework, the negative correlation with mindset can be expected because students’ fears regarding their inability to learn the content align well with beliefs that chemistry intelligence cannot improve. No significant correlations were observed between chemistry mindset and either of the performance goal orientations. These results aligned well with previous findings that mindset more strongly relates to mastery-based achievement goals relative to performance-based goals (Leondari and Gialamas, 2002; Burnette et al., 2013; Dinger and Dickhäuser, 2013; Karlen et al., 2019). It should be noted here that a ceiling effect was observed for two of the four achievement goal dimensions: mastery-approach (found to significantly correlate with chemistry mindset) and performance-avoidance (not found to significantly correlate with chemistry mindset). 
Ceiling effects have also been observed in another study using the same achievement goals measure in chemistry (Lewis, 2018). The presence of these ceiling effects may limit the interpretability of the correlations observed; however, the expected relationships with chemistry mindset were observed, reducing this concern. Nevertheless, this may indicate a need for an improved achievement goal orientation measure for chemistry-specific contexts.
The pre- and posttest mindset measures correlated significantly with both measures of course achievement, formative (pre: r = 0.168, p < 0.005; post: r = 0.293, p < 0.005) and summative scores (pre: r = 0.228, p < 0.005; post: r = 0.331, p < 0.005). As inconsistent results or small effect sizes have been observed in correlating mindset and achievement for undergraduate students across numerous studies (Sisk et al., 2018; Costa and Faria, 2018), these findings were positive evidence of an improved mindset measure for chemistry contexts. Others have observed that including mediating variables between mindset and achievement yields significant predictive relationships (Macakova and Wood, 2020). However, a lack of sensitivity in the mindset measure itself may also reduce direct predictive power; the significant correlations observed here suggest that this instrument is more sensitive to differences in chemistry mindset beliefs.
Note: Table 4 reports Pearson correlations with pairwise deletion; theoretically relevant and significant values are bolded for emphasis.
Examining the correlations between the CheMI measurements and other variables (Table 4) addresses one final consideration of validity, namely external validity. Convergent validity evidence is provided through the strength and sign of correlations with self-efficacy, mastery-based achievement goals, and course achievement according to the mindset meaning system (Dweck and Leggett, 1988; Dweck, 1999). Literature reports consistent alignment of mindset and mastery-based achievement goals (Burnette et al., 2013) and a few studies have reported alignment with self-efficacy (Komarraju and Nadler, 2013; Bedford, 2017; Lytle and Shin, 2020). Theoretically, mindset beliefs in a domain should predict achievement (Hong et al., 1999; Blackwell et al., 2007), which is the primary incentive for conducting interventions. Yet, inconsistent findings in other studies with similar populations have brought these advantages into question for this academic stage (Sisk et al., 2018). We argue that inconsistent findings may be a symptom of poor measurement quality for the target population's mindset construct, especially if the domain-specificity of the construct has increased relevance for adult students. Divergent validity evidence was obtained by noting the near-absent correlations of mindset with performance-based achievement goals. Although a fixed mindset was originally found to relate to performance goals for young students (Dweck and Leggett, 1988), the increasing emphasis on performance as students progress toward high-stakes admissions processes is a possible cause for the lack of relationship between variables commonly reported in studies with secondary and tertiary students (Sisk et al., 2018).
Finally, several mindset-related studies in undergraduate STEM contexts have reported that domain-specific beliefs exhibit downward trajectories over time, indicating that students become more fixed in their beliefs (Dai and Cromley, 2014; Scott and Ghinea, 2014). However, the reported shifts in mindset are not large over a shorter time-scale such as one semester, so mindset beliefs at the pre- and post-semester collection times should correlate strongly with one another. In our sample, pre- and post-chemistry mindset yielded the strongest correlation between variables (Table 4). This result should be interpreted with caution, as many students were excluded from this correlation for lack of participation at both timepoints (N = 209 for matched data). The correlation of r = 0.630 nonetheless indicates that some change in students' beliefs did occur. Students likely use their prior history with chemistry performance as evidence when forming their mindset beliefs at the beginning of the course; however, factors such as challenges, the classroom environment, and performance feedback in the current course may cause fluctuations and minor shifts in mindset throughout the semester. Although some students may have changed their views during the course in response to their experiences and performance feedback (Limeri et al., 2020b), a single semester is a short time span for substantial changes in views.
Variation in course participation rates was observed; however, the sample was representative of typical STEM course enrollment demographics at the institution across all categories. Because the different course sections were given different assignments and exams, raw average performance scores for formative and summative assessment may reflect very different difficulty levels or assessment types. To mitigate this issue, performance scores were converted to z-scores within each course section, making them directly comparable relative to each section's performance distribution. Additionally, all other measures were collected as self-report values and thus may contain variation in interpretation and biases.
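The within-section standardization described above can be sketched as follows; the column names (`section`, `summative_raw`) are illustrative placeholders, not the study's variables. Each student's score is expressed relative to the mean and standard deviation of their own section, so a z of 0 always means "at the section average" regardless of how difficult that section's exams were.

```python
import pandas as pd

# Illustrative data: two sections with different raw score distributions.
df = pd.DataFrame({
    "section": ["A", "A", "A", "B", "B", "B"],
    "summative_raw": [70.0, 80.0, 90.0, 40.0, 50.0, 60.0],
})

# z-score each student's score within their own course section, so scores
# are comparable across sections that used different assignments and exams.
df["summative_z"] = df.groupby("section")["summative_raw"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=1)
)
print(df)
# Both sections map to z = -1, 0, 1 despite very different raw scores.
```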
| CheMI version | Mean | SD |
|---|---|---|
| Version 1 (6-point Likert scale), Fall 2020 Pretest, N = 851 | | |
| Incremental items | | |
| 1. No matter who I am, I can change my chemistry intelligence level. | 4.85 | 1.13 |
| 2. I can always change my chemistry intelligence. | 4.90 | 1.02 |
| 3. No matter how much chemistry intelligence I have, I can change it quite a bit. | 4.74 | 1.00 |
| 4. I can change my chemistry intelligence level significantly. | 4.73 | 1.12 |
| Entity items | | |
| 1. I have a certain amount of chemistry intelligence, and I really can’t do much to change it. | 2.22 | 1.08 |
| 2. My chemistry intelligence is something about me that I can’t change very much. | 2.20 | 1.13 |
| 3. To be honest, I can’t really change my chemistry intelligence. | 1.97 | 1.04 |
| 4. I can learn new things, but I cannot really change my level of chemistry intelligence. | 2.34 | 1.13 |
| Version 2 (6-point Likert scale), Fall 2020 Posttest, N = 292 (randomly assigned 50%) | | |
| Incremental items | | |
| 1. I can change my problem-solving ability in chemistry | 4.96 | 0.92 |
| 2. My ability to understand concepts in chemistry is something I can improve | 4.96 | 0.94 |
| 3. My ability to apply chemistry knowledge is something I can change | 4.96 | 0.90 |
| 4. My ability to master chemistry content is something I can improve | 4.98 | 0.94 |
| 5. I can improve my ability to visualize chemical structures and processes in chemistry | 4.76 | 1.08 |
| 6. My ability to use mathematical and logical reasoning in chemistry is something I can change | 4.77 | 1.06 |
| 7. My overall chemistry intelligence is something I can change | 4.93 | 0.99 |
| Entity items | | |
| 1. I can’t really change my problem-solving ability in chemistry | 2.21 | 1.03 |
| 2. I can’t change my ability to understand concepts in chemistry much | 2.27 | 1.05 |
| 3. My ability to apply chemistry knowledge is something I can’t really improve | 2.27 | 1.08 |
| 4. My ability to master chemistry content is something I can’t improve much | 2.14 | 1.05 |
| 5. I can’t really improve my ability to visualize chemical structures and processes in chemistry | 2.37 | 1.09 |
| 6. My ability to use mathematical and logical reasoning in chemistry is something I can’t change very much | 2.48 | 1.17 |
| 7. My overall chemistry intelligence is something I can’t change | 2.25 | 1.16 |
| Version 3 (6-point Semantic Differential), Spring 2021 Pretest, N = 289 (randomly assigned 50%); (I can’t change at all) 1 2 3 4 5 6 (I can change a lot) | | |
| 1. My problem-solving ability in chemistry is something… | 4.49 | 1.04 |
| 2. My ability to understand concepts in chemistry is something… | 4.64 | 0.98 |
| 3. My ability to apply chemistry knowledge is something… | 4.44 | 1.06 |
| 4. My ability to master chemistry content is something… | 4.46 | 1.07 |
| 5. My ability to visualize chemical structures and processes is something… | 4.22 | 1.17 |
| 6. My ability to use mathematical and logical reasoning in chemistry is something… | 4.46 | 1.05 |
| 7. My overall chemistry intelligence is something… | 4.55 | 1.08 |
| Version 4 (10-point Semantic Differential), Fall 2021 Posttest, N = 436; (I can’t change at all) 1 2 3 4 5 6 7 8 9 10 (I can change a lot) | | |
| 1. My problem-solving ability in chemistry is something… | 6.93 | 2.06 |
| 2. My ability to understand concepts in chemistry is something… | 7.08 | 2.05 |
| 3. My ability to apply chemistry knowledge is something… | 6.77 | 2.12 |
| 4. My ability to master chemistry content is something… | 6.83 | 2.28 |
| 5. My ability to visualize chemical structures and processes is something… | 6.48 | 2.12 |
| 6. My ability to use mathematical and logical reasoning in chemistry is something… | 6.91 | 2.03 |
| 7. My overall chemistry intelligence is something… | 7.00 | 2.15 |
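Instruments of this type are typically scored as a composite across items. The sketch below is a hypothetical illustration of scoring Version 1 responses on the 6-point scale; reverse-scoring the entity items before averaging is a common convention for such scales and is assumed here rather than taken from the paper, and the response values are invented.

```python
import pandas as pd

# Invented example responses for two students (6-point scale).
responses = pd.DataFrame({
    "inc1": [5, 4], "inc2": [6, 4], "inc3": [5, 3], "inc4": [5, 4],
    "ent1": [2, 3], "ent2": [1, 4], "ent3": [2, 3], "ent4": [2, 3],
})

SCALE_MAX = 6
entity_cols = ["ent1", "ent2", "ent3", "ent4"]

# Reverse-score entity items (assumed convention) so higher always
# indicates a more growth-oriented (incremental) belief.
reversed_entity = (SCALE_MAX + 1) - responses[entity_cols]

# Composite mindset score: mean over incremental and reversed entity items.
composite = pd.concat(
    [responses[["inc1", "inc2", "inc3", "inc4"]], reversed_entity], axis=1
).mean(axis=1)
print(composite)  # one score per student, on the 1-6 scale
```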
| Measure/construct | Original subscale | Modified subscale |
|---|---|---|
| a Elliot and McGregor (2001), A 2 × 2 achievement goal framework. b Pintrich, P. R. (1991), A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). | | |
| a Achievement goal questionnaire/performance-approach | 1. It is important for me to do better in this class than other students | 1. It is important for me to do better in chemistry than other students |
| | 2. It is important for me to do well compared to others in this class | 2. It is important for me to do well compared to others in chemistry |
| | 3. My goal in this class is to get a better grade than most of the other students | 3. My goal in chemistry is to get a better grade than most of the other students |
| a Achievement goal questionnaire/mastery-avoidance | 4. I worry that I may not learn all that I possibly could in this class | 4. I worry that I may not learn all that I possibly could in chemistry |
| | 5. Sometimes I'm afraid that I may not understand the content of this class as thoroughly as I'd like | 5. Sometimes I'm afraid that I may not understand chemistry content as thoroughly as I'd like |
| | 6. I am often concerned that I may not learn all that there is to learn in this class | 6. I am often concerned that I may not learn all that there is to learn in chemistry |
| a Achievement goal questionnaire/mastery-approach | 7. I want to learn as much as possible from this class | 7. I want to learn as much as possible from this chemistry class |
| | 8. It is important for me to understand the content of this course as thoroughly as possible | 8. It is important for me to understand chemistry content as thoroughly as possible |
| | 9. I desire to completely master the material presented in this class | 9. I desire to completely master the material presented in chemistry |
| a Achievement goal questionnaire/performance-avoidance | 10. I just want to avoid doing poorly in this class | 10. I just want to avoid doing poorly in chemistry |
| | 11. My goal in this class is to avoid performing poorly | 11. My goal in this chemistry class is to avoid performing poorly |
| | 12. My fear of performing poorly in this class is often what motivates me | 12. My fear of performing poorly in chemistry is often what motivates me |
| b Motivated strategies for learning questionnaire/self-efficacy | 1. I believe I will receive an excellent grade in this class | 1. I believe I will receive an excellent grade in this class |
| | 2. I'm certain I can understand the most difficult material presented in the readings for this course | 2. I'm confident I can understand the basic concepts taught in this course |
| | 3. I'm confident I can understand the basic concepts taught in this course | 3. I'm confident I can do an excellent job on the assignments and tests in this course |
| | 4. I'm confident I can understand the most complex material presented by the instructor in this course | 4. I expect to do well in this class |
| | 5. I'm confident I can do an excellent job on the assignments and tests in this course | 5. I'm certain I can master the skills being taught in this class |
| | 6. I expect to do well in this class | 6. Considering the difficulty of this course, the teacher, and my skills, I think I will do well in this class |
| | 7. I'm certain I can master the skills being taught in this class | |
| | 8. Considering the difficulty of this course, the teacher, and my skills, I think I will do well in this class | |
1. Student reads and signs the consent form before beginning the interview.
2. Researcher thanks the student for participating and begins with a few questions about their experience in the course.
3. Researcher instructs the student to complete a series of activities using the WebEx drawing tools and has the student project the activity documents on the screen as they work.
4. The student circles or crosses out items based on whether they correspond to their own beliefs about chemistry intelligence or behaviors in challenging chemistry scenarios.
5. After the student has responded to all items, the researcher asks further questions and prompts for the next part of the activity, such as sorting the remaining items into categories.
6. The researcher asks the student why they categorized the items in this way.
7. The researcher then shows the student the responses they gave on the survey earlier in the semester and asks why they selected those answer choices (do they actually believe this, or might there be another reason they chose it?). Any discrepancies between responses on the survey and during the previous activities can be discussed.
8. The researcher then asks the student to draw several graphs based on their own beliefs and to discuss them by comparing and contrasting the graphs/shapes.
9. The researcher asks final open-ended questions to conclude the interview.
10. The student acknowledges receipt of the gift card for participation.
Description of the phases and the questions students will be asked to respond to using a think-aloud protocol:
Phase 1: Beginning questions to practice talking
– Personal challenge, effort, and engagement in chemistry – previous and present experience?
– Personal interest in chemistry, reason for taking it, and career goals?
– Meaning of natural ability?
– Interest and natural – can you have something naturally that doesn’t interest you? Is interest natural or developed?
– Comes easily vs natural – are different things natural for different people?
Phase 2: Chemistry abilities sorting task (Fig. 4)
Instruct the student to describe aloud how they wish to sort the abilities into categories. They can create as many categories as they wish.
– Ask students to label or name each category
– Ask for definitions of each term in chemistry
Phase 3: Chemistry mindset items
– Give the Likert-scale version first (Version 1). Ask the student to respond to all items aloud. (Are they still reading each one?)
– Ask how they would respond on a 10-point semantic differential scale (Version 4)
– Ask why they would choose that value; what does that number mean in words?
– Ask them to compare the 10-point and 6-point scales (Version 3)
– Ask them to compare with the Likert-scale version (Version 2) – how would you respond and why? How does the format impact your answer or understanding of the item?
Items:
1. My problem-solving ability in chemistry is something…
(I can't change at all) 1 2 3 4 5 6 (I can change a lot)
2. My ability to understand concepts in chemistry is something…
3. My ability to apply chemistry knowledge is something…
4. My ability to master chemistry content is something…
5. My ability to visualize chemical structures and processes is something…
6. My ability to use mathematical and logical reasoning in chemistry is something…
7. My overall chemistry intelligence is something…
Phase 4: Final questions
– Where do you think your chemistry intelligence comes from? Is this true for others?
– Do you believe your chemistry intelligence can change and what led you to believe that?
This journal is © The Royal Society of Chemistry 2022