Katherine A. Blackford,^a Julia C. Greenbaum,^a Nikita S. Redkar,^b Nelson T. Gaillard,^a Max R. Helix^c and Anne M. Baranger*^ac

^a Department of Chemistry, University of California, Berkeley, California 94720, USA. E-mail: abaranger@berkeley.edu
^b Department of Chemical & Biomolecular Engineering, University of California, Berkeley, California 94720, USA
^c Graduate Group in Science and Mathematics Education, University of California, Berkeley, California 94720, USA
First published on 6th March 2023
Problem solving is a key component of authentic scientific research and practice in organic chemistry. One factor that has been shown to have a major role in successful problem solving in a variety of disciplines is metacognitive regulation, defined as the control of one's thought processes through the use of planning, monitoring, and evaluation strategies. Despite the growing interest in assessing and promoting metacognition in the field of chemical education, few studies have investigated this topic in the context of organic chemistry students. To gain a deeper understanding of how and why students make use of strategies related to metacognitive regulation in their approaches to solving problems, we conducted interviews with Organic Chemistry I, Organic Chemistry II, and graduate organic chemistry students and used multiple measures to examine students’ metacognition. As a part of these interviews, students verbalized their thoughts as they worked on complex predict-the-product problems and completed a self-report instrument indicating which planning, monitoring, and evaluation strategies they had used while completing each problem. Think-aloud protocols were analyzed for the presence of each of the behaviors included on the self-report instrument, and students’ use of metacognitive strategies was compared to identify differences between students with different levels of experience and between students who generated more and less successful solutions to the problems. Students who generated more successful solutions to the problems tended to report using a greater number of metacognitive strategies. When asked why they did or did not use certain metacognitive strategies, students indicated a number of factors, such as not feeling able to use these strategies effectively or believing that using these strategies was unnecessary. 
The results of this study support the importance of teaching metacognitive problem-solving strategies in organic chemistry courses and suggest several methods for the assessment and instruction of metacognition.
Among the types of problems commonly used to assess student knowledge, predict-the-product problems are distinctive in that students are not provided with an endpoint to work towards. Studies have shown that when students attempt to solve mechanistic problems in which the final product is given, they typically focus on proposing steps that “get me [closer] to the product” by reducing the number of structural differences between the reactants and products (Bhattacharyya and Bodner, 2005; Ferguson and Bodner, 2008; Caspari et al., 2018). Work by DeCocq and Bhattacharyya (2019) demonstrated that knowing the overall product of a transformation led to a dramatic change in the reasoning strategies organic chemistry students used when asked to provide the intermediate product and curved arrows for a single elementary step of a multi-step mechanism. In the absence of information about the final product of the transformation, students primarily proposed intermediate products based on their knowledge of the chemical properties of the reactants. After students were provided with the final product, many changed their answers to structures that more closely resembled this product. It is clear from these studies that student reasoning is highly affected by the information given in the problem statement, and that students’ approaches to problems in which the ultimate product is not known, such as predict-the-product problems, may more accurately reflect their ability to engage in chemical reasoning. For this reason, along with the relatively small number of studies investigating student reasoning on problems of this type and level of difficulty, recent work in our research group has centered on investigating student approaches to open-ended predict-the-product problems that are relatively complex and potentially ambiguous (Helix et al., 2022).
Our previous research on student approaches to open-ended predict-the-product problems involved analyzing think-aloud interviews in order to categorize student approaches in terms of common problem-solving actions (Helix et al., 2022). The results of this analysis were used to develop a general workflow model that describes the ways in which students with different levels of expertise in organic chemistry solve problems that rely on predicting reactivity. While completing this work, we became interested in examining additional strategies that students engage in while solving these types of problems, especially those that may differentiate between successful and unsuccessful problem solvers. One of the factors that has been shown to have a significant impact on problem-solving across disciplines is a student's ability to engage in metacognition, defined as the knowledge and control of one's own thought processes (Flavell, 1979; Rickey and Stacy, 2000; Schoenfeld, 2016). There has been growing interest among chemical education researchers in assessing and promoting metacognition, yet few studies have focused on organic chemistry courses (Arslantas et al., 2018). In a review of research conducted in the field of organic chemistry education, Graulich (2015) suggested that one of the main areas of future progress in this domain should be fostering metacognitive and learning strategies. Developing ways to teach metacognition and scaffold the development of specific metacognitive problem-solving skills in this context is made easier by having an understanding of both how and why students use these strategies in their approach to solving organic chemistry problems. This study therefore builds upon our previous research on student approaches to complex predict-the-product problems by providing a more comprehensive, multi-method examination of students’ use of metacognitive regulation strategies when solving problems of this type. 
In addition to determining which metacognitive behaviors are exhibited by students with different levels of experience in organic chemistry and exploring the connection between students’ metacognitive regulation and their success in solving problems, we also discuss students’ reasons for using these strategies.
Metacognition has been shown to have a significant impact on problem-solving success in specific disciplines such as chemistry (Rickey and Stacy, 2000; Gulacar et al., 2020) and mathematics (Schoenfeld, 1987; Artz and Armour-Thomas, 1992; Jacobse and Harskamp, 2012) as well as in general critical thinking tasks (Swanson, 1990; Ku and Ho, 2010). Schoenfeld (1987), for example, found that in the absence of metacognitive regulation, college students enrolled in his mathematical problem-solving course often continued down unproductive paths, despite having the requisite mathematical knowledge to solve the problem, because they did not pause to consider whether they were making progress in the right direction. This indicates that simply being familiar with the relevant concepts is not sufficient for solving genuine problems. Work by Swanson (1990) suggests that a high level of metacognition could in fact compensate for lower aptitudes; using think-aloud interview techniques, he observed that children with higher levels of metacognition performed better on problem-solving tasks than those with lower metacognitive activity regardless of differences in general academic aptitude. This association between metacognitive ability and problem-solving skills underscores the importance of studying metacognition in disciplines where problem solving is a central practice.
There are benefits and drawbacks to the various measures of metacognition. Concurrent assessments are generally considered to align better with actual behavior than off-line measures, likely because off-line measures require the learner to make judgments based on reconstructing their previous cognitive processes from memory (Veenman et al., 2006; Van Hout-Wolters, 2009). The issue of distortion due to memory failure can be partially mitigated by administering self-report measures immediately after completing a task and asking students to consider their behavior in a specific situation (Ericsson and Simon, 1993; Veenman, 2011). While this does not resolve all of the issues with self-report questionnaires, including the inclination to give socially desirable responses, being asked to consider one's behavior in a specific situation can make it easier for participants to recall their actual behavior (Van Hout-Wolters, 2009). Task-specific questionnaires typically correlate more strongly with concurrent methods than general questionnaires do; for example, Schellings et al. (2013) observed a correlation of r = 0.63 between think-aloud protocols and a task-specific questionnaire that was directly based on a taxonomy for coding those think-aloud protocols. The major drawback of concurrent assessments is that they tend to be much more time-consuming to administer and analyze, so it is not typically feasible to use them with large groups. Also, though thinking aloud is not considered to alter student behavior apart from increasing the time taken to complete a task, assessing metacognition in this way may lead to underestimations of metacognitive behavior (Ericsson and Simon, 1993; Veenman, 2011). This is because students may not be consciously aware of their self-regulatory processes, as these processes are often highly automated in adults (Schraw et al., 2006; Veenman et al., 2006).
To overcome the drawbacks associated with these individual measures of metacognition, many researchers have emphasized the advantage of using multiple methods to assess metacognition (Veenman, 2005; Cooper et al., 2008; Desoete, 2008; Schellings et al., 2013).
To evaluate interventions designed to promote metacognition and to investigate the nature of metacognition in chemistry problem solving, chemical education researchers need to assess students’ metacognitive ability. Researchers have most commonly used self-report instruments, either alone or in combination with other methods, for this purpose. Examples of general metacognitive self-report instruments that have been applied to chemical education research include the Inventory of Metacognitive Self-Regulation (development: Howard et al., 2000; use with students who had completed a general chemistry course: Wang, 2015) and the Metacognitive Awareness Inventory (development: Schraw and Dennison, 1994; use with students enrolled in a General Chemistry course: Gulacar et al., 2020). The Metacognitive Activities Inventory (MCAI), developed by Cooper and Sandi-Urena (2009), is an example of a domain-specific self-report instrument that was designed to measure metacognitive skillfulness in chemistry problem solving. Cooper and Sandi-Urena validated the use of this instrument among students enrolled in General Chemistry I and graduate students (Cooper and Sandi-Urena, 2009). Concurrent methods such as think-aloud interviews (Wang, 2015; Kadioglu-Akbulut and Uzuntiryaki-Kondakci, 2020; Heidbrink and Weinrich, 2021) and an automated online instrument known as Interactive MultiMedia Exercises or IMMEX (Cooper et al., 2008) are among the other measures researchers have used to assess metacognition in chemistry students. Several of these studies made use of multiple measures (Cooper et al., 2008; Wang, 2015; Kadioglu-Akbulut and Uzuntiryaki-Kondakci, 2020). In their investigation of metacognition use in general chemistry problem-solving, Cooper et al. (2008) observed convergence between the scores students received on the MCAI (a self-report instrument) and the IMMEX (a concurrent measure). 
Wang (2015) examined characteristics of students’ metacognition in different general chemistry topics using data from self-report measures, think-aloud interviews, and students’ judgments of their performance. Kadioglu-Akbulut and Uzuntiryaki-Kondakci (2020) investigated the effectiveness of self-regulatory instruction in a high school chemistry classroom using the Cognitive and Metacognitive Strategies Scale (a self-report instrument), think-aloud protocols, and journal entries.
Despite the growing interest in the role of metacognition in chemistry education, few studies have focused on organic chemistry students. In a recent review of metacognition in higher education chemistry, 27 of the 31 articles that met the inclusion criteria examined metacognition in students who were enrolled in introductory, general, or preparatory chemistry courses (Arslantas et al., 2018). Problems students encounter in organic chemistry courses differ from those encountered in general chemistry courses in that they are primarily non-mathematical and require a different set of fundamental skills (Cartrette and Bodner, 2010). According to Dye and Stanton (2017), many of the students they interviewed as part of their study on metacognition in upper-division biology students stated that organic chemistry was the first course in which they had to be metacognitive to succeed, likely due to their lack of experience with the type of problem solving required in organic chemistry courses. This suggests that investigating metacognition in organic chemistry students would be particularly valuable.
To our knowledge, only four reports on metacognition in organic chemistry students have been published (Lopez et al., 2013; Mathabathe and Potgieter, 2017; Graulich et al., 2021; Pulukuri and Abrams, 2021). Lopez et al. (2013) investigated the study strategies used by ethnically diverse organic chemistry students and found that students typically used strategies that involved reviewing course materials rather than more metacognitive study strategies and that there were no significant correlations between study strategies used and course performance. Mathabathe and Potgieter (2017) examined organic chemistry students’ use of metacognitive regulation during the collaborative planning of a laboratory group project. Based on previous coding schemes described in the literature as well as inductive analysis of transcripts of these collaborative planning sessions, the authors devised a coding scheme and decision tree for the classification of verbalizations related to planning, monitoring, control, and evaluation. Their coding scheme also classified verbalizations according to the type of regulation (self or other), area of regulation (cognition, task performance, or behavior), and depth of the regulatory behavior (high or low). Graulich et al. (2021) described the use of a scaffold that was designed to guide students through solving an organic chemistry case-comparison problem using a combination of instructional prompts and metacognitive suggestions. After writing down their initial solution and explanation for the given case-comparison problem, students watched videos of peers solving the same problem, completed a scaffolded analysis of these peer-solutions with a partner, developed a general procedure for handling contrasting cases tasks, and then revised their initial explanations. The authors found that this scaffolded activity led students to improve the quality of their mechanistic explanations. 
Pulukuri and Abrams (2021) compared metacognitive monitoring proficiency and learning gains between students who used different learning resources and found that students who learned organic chemistry concepts from question-embedded videos did better on both outcomes than those who learned from a textbook. Each of these studies suggests ways that metacognition can be observed in or encouraged in organic chemistry students.
1. What metacognitive strategies do undergraduate and graduate students use when solving organic chemistry problems?
2. How do students who are more and less successful at solving organic chemistry problems differ in their use of metacognitive regulatory strategies?
3. What reasons do students have for using or not using metacognitive strategies while solving organic chemistry problems?
Type of information | Undergraduate participants (N = 26) | Graduate participants (N = 12)
---|---|---
Gender | Women (65%) | Men (66%)
 | Men (19%) | Women (25%)
 | Non-binary or unsure (4%) | Non-binary or unsure (8%)
 | Did not answer (15%) |
Race/ethnicity | East Asian (50%) | White/Caucasian (75%)
 | South Asian (15%) | American Indian/Alaska Native (8%)
 | African American/black (8%) | East Asian (8%)
 | Mexican American/Chicano (8%) | Mexican American/Chicano (8%)
 | White/Caucasian (8%) | Middle Eastern/North African (8%)
 | Did not answer (20%) | South Asian (8%)
Year in undergraduate or graduate program | First year (12%) | First year (25%)
 | Second year (85%) | Second year (17%)
 | Third year (4%) | Third year (8%)
 | | Fourth year (42%)
 | | Fifth year (8%)
Undergraduate major or graduate research focus | Life science (77%) | Organic chemistry (100%)
 | Engineering (15%) | Biological chemistry (58%)
 | Public health (8%) | Analytical chemistry (16%)
 | Social science (4%) | Inorganic chemistry (16%)
It is important to note that the undergraduate students who volunteered to participate in interviews are not a fully representative sample of those enrolled in Organic Chemistry I or II. Overall, the undergraduate interview participants received final percentage grades in the course that were 0.5 standard deviations above the class average, and less than 20% received a grade lower than the class mean. However, as shown in Fig. 1, the undergraduate interviewees did differ widely in their performance in the course, ranging from over one standard deviation below the class average to over one standard deviation above the class average. Grade data was not collected for the graduate student participants.
This initial list of metacognitive activities was introduced to seven students who had previously taken one or more organic chemistry courses and had volunteered to participate in focus groups. During these focus groups, students completed a survey that asked how often they engaged in each activity while working on organic chemistry problems. They were then asked to provide feedback on the clarity of the questions and instructions. The wording of some items was changed in response to this round of feedback, while other items were removed from the list entirely. The final list (see Table 2) was narrowed down to nine strategies that students might use during the planning phase before attempting a solution, five monitoring strategies that students might use during the problem-solving process, and six strategies that students could use to evaluate the products and process of their approach after reaching a solution. We believed that this list could function as a measure of students’ use of metacognitive regulation strategies in the context of both a self-report instrument and a coding scheme for use with interview transcripts. To ensure this dual functionality, we also conducted pilot interviews with five Organic Chemistry I or Organic Chemistry II students during the semester before the main data collection took place. These pilot interviews followed the same protocol described in the “interview protocol” section of this work. Transcripts of the think-aloud problem-solving portion of these pilot interviews, as well as similar interviews that one of the authors had conducted with students enrolled in different organic chemistry courses, were analyzed to determine whether student usage of each skill was evident or not evident in order to confirm that these behaviors could be detected in students’ verbalizations of their thinking processes.
Type of strategy | Individual item on self-report instrument/coding scheme | Abbreviation
---|---|---
Planning | I set goals (e.g. “I need to make this bond,” or “I want to make this functional group”) before attempting a solution. | Set goals
 | Before I started working, I sorted through the information in the problem to determine what is relevant.^a | Sort relevant info
 | Before I started working, I looked for any reactions I recognized. | Look for reactions recognized
 | I reflected upon things I know that are relevant to the problem before I started working.^a | Reflect relevant knowledge
 | I tried to relate unfamiliar problems with previous problems I've encountered.^a | Relate to previous problems
 | I jotted down my ideas or things I know that are related to the problem before attempting a solution.^a | Jot down ideas
 | I made predictions about what would happen before I started working on the problem. | Make predictions
 | I brainstormed multiple ways to solve a problem before I actually started solving it.^b | Brainstorm multiple ways
 | I considered whether my proposed steps were reasonable before I actually started solving the problem.^a | Consider if plan reasonable
Monitoring | When I was in the middle of working on the problem, I paused to consider whether there was another way to solve it.^b | Consider another way
 | While I was working on the problem, I paused to consider whether I was making progress toward my goals.^b | Monitor progress toward goals
 | I paused to consider whether what I was doing was correct while I was working on the problem.^b | Monitor correctness
 | I took note of what I was uncertain about as I worked on the problem. | Note uncertainty
 | As I worked on the problem, I periodically checked back over what I had done so far to make sure my overall approach was reasonable. | Periodically check if reasonable
Evaluation | I thought about whether my answer was reasonable after I finished the problem.^a | Consider if answer reasonable
 | I made sure that my solution actually answered the question.^a | Check if answered question
 | I checked back over my work after I finished the problem to make sure I didn’t make any mistakes.^a | Check for mistakes
 | Once I reached an answer, I checked to see that it agreed with what I predicted.^a | Check if agreed with prediction
 | Once I finished the problem, I summarized the main take-away lesson I learned.^b | Summarize main takeaways
 | After I finished the problem, I considered how I might change my approach for future problems. | Consider changes for future

^a Duplicated or modified from an existing item on the MCAI (Cooper and Sandi-Urena, 2009). ^b Duplicated or modified from an existing item on the MAI (Schraw and Dennison, 1994).
The components of the interview protocol and timeline are provided in Fig. 2. Copies of the interview protocol and the surveys students completed during the interview are provided in Appendices S1 and S2 of the ESI.† At the beginning of the interview, a PDF file containing the problems used in the interview was emailed to each participant. Participants were then asked to state their undergraduate major or graduate research focus, their year of study, and each organic chemistry course they had taken or taught. After they answered these introductory questions, students were given guidelines for how they should use the think-aloud technique to verbalize their thoughts while solving a problem. They were then asked to solve an organic chemistry problem while vocalizing their thought processes. A list of the problems completed by the study participants and their accepted answers is included in Fig. 3. The same instructions were given for all problems: “Predict the major organic product(s) of the following reactions. Please indicate stereochemistry where appropriate.” Participants were asked to either use the screenshare feature while annotating the PDF file or, if they preferred to write on paper, angle their camera toward that sheet of paper. Students worked on the problem without interruptions, except for occasional prompts to speak up or brief feedback on their think-aloud technique, until they indicated that they had reached their final answer. Students were then provided with a link to a survey hosted on Qualtrics, where they were asked to indicate whether they had used each of the 20 metacognitive strategies introduced in Table 2 while solving the first interview problem.
For each item, students were able to select “yes” or “no.” As a part of this survey, students were also asked how frequently they used each strategy when working on homework and exam problems in their organic chemistry course; however, this component of the data collection was completed as a part of a broader study involving additional chemistry courses and is beyond the scope of this work. Students were then asked several questions about their problem-solving approach, including questions about their reasons for carrying out certain metacognitive activities either on the problem they had just worked on during the interview or in their organic chemistry course in general. Following this discussion, students were asked to complete a second problem, which had identical instructions, while thinking aloud. They were then prompted to fill out a second survey to indicate whether they had used each strategy while working on that problem. Students were permitted to review their written work (i.e. any notes, annotations, chemical structures, or mechanistic drawings they wrote down while working on each problem) while completing each self-report survey.
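The scoring implied by this survey design can be illustrated with a short sketch: each participant's 20 yes/no responses are tallied into counts of reported planning, monitoring, and evaluation strategies. The strategy abbreviations follow Table 2, but the helper names and the participant responses shown here are hypothetical, not data from the study.

```python
# Illustrative sketch: tally one participant's yes/no self-report responses
# into per-phase strategy counts. Groupings follow Table 2; data is hypothetical.
PHASES = {
    "planning": ["Set goals", "Sort relevant info", "Look for reactions recognized",
                 "Reflect relevant knowledge", "Relate to previous problems",
                 "Jot down ideas", "Make predictions", "Brainstorm multiple ways",
                 "Consider if plan reasonable"],
    "monitoring": ["Consider another way", "Monitor progress toward goals",
                   "Monitor correctness", "Note uncertainty",
                   "Periodically check if reasonable"],
    "evaluation": ["Consider if answer reasonable", "Check if answered question",
                   "Check for mistakes", "Check if agreed with prediction",
                   "Summarize main takeaways", "Consider changes for future"],
}

def count_reported(responses):
    """responses: dict mapping strategy abbreviation -> 'yes' or 'no'."""
    return {phase: sum(responses.get(s) == "yes" for s in strategies)
            for phase, strategies in PHASES.items()}

# One hypothetical participant's responses for a single problem.
responses = {s: "no" for strategies in PHASES.values() for s in strategies}
responses.update({"Set goals": "yes", "Monitor correctness": "yes",
                  "Note uncertainty": "yes", "Check for mistakes": "yes"})
print(count_reported(responses))  # {'planning': 1, 'monitoring': 2, 'evaluation': 1}
```

Summing the three counts gives the total number of strategies a student reported using on that problem, the quantity compared between more and less successful solvers.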
We chose to use both concurrent and self-report measures in order to get a more complete understanding of students’ usage of strategies related to metacognitive regulation when solving organic chemistry problems. The think-aloud interview method was chosen because it allows for an in-depth analysis of students’ problem-solving processes, and concurrent measures of metacognition are considered to better align with actual behavior as compared to off-line methods (Veenman et al., 2006). However, data collected using think-aloud protocols may not be complete if interview participants do not or cannot verbalize all of their thoughts (Veenman, 2011). For this reason, we chose to additionally ask students about their behavior using a retrospective, task-specific self-report questionnaire. Because memory distortions are likely to increase with the interval between task performance and retrospective reports, we chose to administer this questionnaire immediately after students finished solving each problem (Ericsson and Simon, 1993; Veenman, 2011). Considering the minimal interval between completion of the problem-solving task and self-report questionnaire, we expected that reviewing their written work would provide sufficient cues to minimize memory distortions without the additional time required to allow participants to fully review their recorded think-aloud protocol. There is considerable precedent for similar study designs in which students think aloud while completing a task and then complete a retrospective questionnaire about their strategy usage directly after task completion without reviewing their process (Bannert and Mengelkamp, 2008; Desoete, 2008; Schellings, 2011; Schellings et al., 2013; Merchie and Van Keer, 2014; Veenman and van Cleef, 2019; Rogiers et al., 2020).
We believed that, for the majority of interview participants, these problems would function as novel problems as opposed to routine exercises (Bodner, 2003). Whether any given chemistry question functions as a problem or an exercise depends on how familiar the person solving the task is with the material rather than on the innate difficulty of the task. For example, a stoichiometry problem that would serve as a routine exercise for a practicing chemist would be a novel problem for a student enrolled in their first chemistry course. The practicing chemist would likely complete the task in a logical, linear fashion based on recalled algorithms, while the student may take a more circuitous approach involving false starts and dead ends. The ambiguity and open-endedness of the chosen problems presented an opportunity for us to investigate how students approach less familiar problems where simple recall of information is not enough, and made it more likely that students would display the use of metacognitive behaviors during the process of solving these problems (Carr and Taasoobshirazi, 2008). Prior studies suggest that concurrent assessment of metacognitive regulation should be conducted using tasks that are of a level of complexity that would require the interview participants to intentionally control their thinking processes (Shin et al., 2003). Multiple sources of ambiguity were included in the design of these problems, including polyfunctional starting materials, an absence of detailed reaction conditions (e.g. temperature, equivalents), and the possibility of multiple potential products or competing solution pathways. Pilot interviews conducted with Organic Chemistry I and Organic Chemistry II students during the semester prior to the main study confirmed that students were generally interpreting the problems as expected and were able to at least generate some reasonable ideas about each problem despite their potential difficulty.
Interviews were fully transcribed, and the transcripts were annotated to indicate what students were writing as they spoke aloud. These transcripts were then coded by several members of the research team using MaxQDA qualitative data analysis software. Two different coding schemes were developed, one for analysis of the think-aloud portion of the interview and the other for analysis of the discussion portion. Definitions and examples of all codes are provided in Appendices S3 and S4 (ESI†). The first scheme includes codes that correspond to each of the 20 metacognitive strategies included in Table 2. These codes were assigned to each think-aloud problem transcript according to whether a student's usage of each skill was evident or not evident in the transcript. Definitions and criteria for the inclusion or exclusion of certain statements under each code were developed following extensive discussion between members of the research team, which included undergraduates who were currently enrolled in organic chemistry courses. The second scheme was developed to categorize the most common reasons that students gave for using or not using the metacognitive strategies described in Table 2. Codes and their definitions were developed inductively using a constant comparative method that consisted of reading the transcripts, noting down emerging themes and potential codes, and meeting to discuss agreements and disagreements between members of the research team. Saturation was reached with a set of 16 codes: nine corresponding to reasons students reported using the metacognitive skills, and seven corresponding to reasons for not using these skills. Similar codes were categorized into a total of seven major themes by two members of the research team. A list of these themes, codes, and their descriptions is included in Table 3.
Themes | Codes | Descriptions
---|---|---
Reasons for using strategies: the student uses this strategy because… | |
Using the strategy helps them solve the problem efficiently | Avoid wasting time/effort | It helps them avoid wasting time or effort during the problem-solving process.
 | Get started/narrow focus | It helps them get started on the problem or narrow their focus to certain pathways.
 | Builds confidence | It helps them feel more confident in their answer or thought process.
 | Many reactions to consider | They recognize that a wide variety of reactions or types of reactivity exist and could possibly be relevant to the problem.
 | Keeps them from forgetting | It helps prevent them from forgetting an idea or piece of information.
Using the strategy helps them solve the problem correctly | Keeps them on right track | It helps them stay on the right path and continue making progress toward an answer.
 | Helps avoid mistakes | It helps them avoid making mistakes.
Someone encouraged use | Someone encouraged use | Another person, such as an instructor or tutor, encouraged them to use this skill.
Helps them learn/improve | Helps them learn/improve | It helps them learn or improve their knowledge or problem-solving skills.
Reasons for not using strategies: the student does not use this strategy because… | |
Using the strategy is detrimental to their success | Prevents success: distracting | It distracts them and they therefore consider it to be detrimental to their success in solving the problem.
 | Prevents success: other | They consider it to be detrimental to their success in solving the problem for another reason, or they state that it is detrimental without stating a specific reason.
They are not able to use the strategy | Issues with timing | There is not typically enough time for them to use it.
 | Unable to use effectively | They believe they are unable to use the skill effectively, often because they do not feel experienced enough to do so.
Using the strategy is unnecessary | Unnecessary: have answer | They consider it to be unnecessary when they have already found an answer to the problem.
 | Unnecessary: redundant | They consider it to be unnecessary because they either use a different strategy for the same purpose or use a similar strategy at a different time in the problem.
 | Unnecessary: other | They consider it to be unnecessary for another reason, or they state that it is unnecessary without stating a specific reason.
After coding approximately 10% of the transcripts as a group, each remaining think-aloud or discussion transcript was coded independently by at least two members of the research team. The average interrater agreements between pairs of researchers for metacognitive skills observed during the think-aloud interview and for reasons for using or not using metacognitive strategies mentioned during the discussion portion of the interview were κ = 0.83 and κ = 0.80, respectively. All members of the research team met periodically to compare notes on the coding process and resolve any discrepancies in coding.
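The interrater agreement statistic used here, Cohen's κ, corrects raw agreement for agreement expected by chance. As a minimal illustrative sketch (the actual coding was performed in MaxQDA; the function and the toy binary codings below are hypothetical, not the study's data):

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    # observed proportion of items the two coders label identically
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # agreement expected by chance, from each coder's marginal label frequencies
    labels = set(coder_a) | set(coder_b)
    p_e = sum((coder_a.count(c) / n) * (coder_b.count(c) / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical present(1)/absent(0) codings of one strategy across 8 transcripts
coder_1 = [1, 1, 0, 0, 1, 0, 1, 1]
coder_2 = [1, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(coder_1, coder_2), 2))  # → 0.47 (moderate agreement)
```

Values near the 0.80 range reported above indicate substantial agreement beyond chance.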
After coding was complete, the average number of strategies that students were observed using and the average number they self-reported using on at least one of the interview problems were calculated for Organic Chemistry I students, Organic Chemistry II students, and graduate students. Observed strategy use was determined from the coding scheme, while self-reported use was determined from the surveys students completed after each problem. The average percent agreement between observed and self-reported use of metacognitive skills was then calculated for each of these groups; a percent agreement of zero would indicate no overlap between the strategies a student self-reported using and those they were observed using on a specific problem. The percentage of students who self-reported or were observed using each strategy on at least one of the interview problems was also calculated for each group. In addition, the average number of strategies self-reported or observed was calculated separately for students who received a performance score of 60% or less on the interview problems and for those who scored greater than 60%. t-Tests were used to compare self-reported and observed strategy usage between these groups of higher- and lower-performing students. IBM SPSS 27.0 was used for all statistical analyses. Finally, the number of times that students gave a certain reason for using or not using one of the 20 metacognitive strategies during the discussion portion of the interview was determined.
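The percent-agreement calculation can be sketched as follows. The exact formula is not stated above, so the function below assumes a Jaccard-style overlap (strategies flagged by both measures, divided by strategies flagged by either), which is consistent with the description that zero agreement means no overlap between the two sets. The function name and the strategy sets in the example are invented for illustration:

```python
def percent_agreement(self_reported, observed):
    """Overlap between self-reported and observed strategy sets for one problem.

    Assumption: agreement = |intersection| / |union| * 100, chosen so that
    disjoint sets give 0%, matching the description in the text.
    """
    union = self_reported | observed
    if not union:
        return 100.0  # trivially agree when neither measure flags any strategy
    return 100 * len(self_reported & observed) / len(union)

# Hypothetical strategy sets for one student on one problem
self_rep = {"Sort Relevant Info", "Make Predictions", "Set Goals", "Check for Mistakes"}
obs = {"Sort Relevant Info", "Make Predictions", "Monitor Correctness"}
print(percent_agreement(self_rep, obs))  # → 40.0
```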
Some strategies were used by nearly every student, others were rarely used by any student, and others were used more often by more or less experienced students. Among undergraduates in either organic chemistry course and graduate students, more than 90% reported and were observed sorting through the problem statement to determine what was relevant, reflecting upon prior knowledge they had that was relevant to the problem at hand, and monitoring whether what they were doing was correct as they worked on the problem. On the other hand, fewer than 50% of students reported or were observed jotting down their ideas prior to starting the problem, summarizing the main takeaway lessons learned after finishing the problem, or considering ways they might change their approach for future problems. It may be that students view the initial planning strategies such as sorting through the problem statement or reflecting upon their prior knowledge as necessary for determining how to solve the problem at hand, while evaluation strategies related to learning from the experience of doing problems, such as summarizing main takeaway lessons or considering how they might change their approach for the future, are primarily useful for improving one's performance on future problems.
Strategies with differences in usage between groups of students included making predictions and setting goals before beginning the problem, which were both performed more often by graduate students according to both measures. Both of these strategies require a student to think multiple steps ahead before beginning to work on the problem, which is likely more difficult for the undergraduate students, who had less experience with solving organic chemistry problems. Organic Chemistry I students, who had the least experience with organic chemistry, were more likely than other students to take note of what they were uncertain about when solving the problem; 100% of these participants exhibited this behavior according to both self-report surveys and observations.
Comparing the individual metacognitive problem-solving strategies that participants in this study self-reported using to other studies that make use of metacognitive self-report instruments is difficult because most report only composite survey scores. However, several of the strategies interview participants were observed using have been reported in other studies of chemistry students’ approaches to solving problems. For example, based on analyzing students’ responses to organic chemistry synthesis problems on exams (Bodé and Flynn, 2016) and during think-aloud interviews (Webber and Flynn, 2018), Flynn and coworkers found that students wrote down functional groups and identified other relevant explicit and implicit features of the problem, attempted multiple solutions, and rejected certain proposed reaction pathways. These strategies correspond most closely to several planning and monitoring strategies commonly used by participants in the present study, namely the “Sort Relevant Info,” “Reflect Relevant Knowledge,” “Jot Down Ideas,” “Brainstorm Multiple Ways” or “Consider Another Way,” and “Consider If Plan Reasonable” or “Monitor Correctness” strategies. Students have been observed using similar strategies during think-aloud interviews involving organic chemistry mechanism and predict-the-product problems (DeCocq and Bhattacharyya, 2019) and molecular polarity or thermodynamics problems (Wang, 2015). A few other studies have also reported how often students used certain metacognitive strategies. In their study of students’ approaches to open-ended chemistry problems, Overton et al. (2013) found that only 10 of 27 interview participants evaluated their answers; evaluation strategies were also used relatively infrequently among our sample. 
Heidbrink and Weinrich (2021) found that 23 out of 25 interview participants exhibited monitoring strategies such as appraising one's work or one's thought process when solving buffer problems, while fewer (19 out of 25) used planning strategies like goal setting or allocating resources or evaluation strategies like reflecting on their answer or identifying areas where they struggled in solving the problem. The monitoring strategies exhibited by the students in Heidbrink and Weinrich's study most closely correspond with the “Monitor Correctness” strategy described in this work, which we also observed in nearly all of the think-aloud protocols (36 out of 38). In sum, while few prior studies have provided quantitative information on the proportion of students who use some of the individual metacognitive problem-solving strategies described in this work, our findings are generally consistent with the literature on problem solving in chemistry.
Though some strategies were used approximately equally often according to both self-report and concurrent measures, there was in general a large discrepancy between the two measures. Table 5 summarizes the average number of strategies that Organic Chemistry I, Organic Chemistry II, and graduate students used during the interview according to both measures. On average, the number of strategies students reported using while solving either one of the interview problems was 66% greater than the number of strategies that they were observed using according to coding of their think-aloud interview transcripts. The average percent agreement between self-reported and observed usage of metacognitive regulatory strategies, which takes into account agreement between the two measures for each individual strategy, was 57%. Correlations between the two measures were weak and non-significant (first problem: r = 0.15, p = 0.38; second problem: r = 0.19, p = 0.25). This is consistent with the finding that self-reports tend to only weakly correlate with concurrent measurements of metacognitive behavior (Van Hout-Wolters, 2009; Craig et al., 2020). In a meta-analysis of studies assessing metacognitive skills, for example, Craig et al. (2020) found that analyzing 21 studies that correlated off-line and on-line measures of metacognition resulted in a pooled effect size estimate of 0.22. In the domain of chemical education, however, Wang (2015) observed stronger, significant correlations of 0.36 (p < 0.05) and 0.49 (p < 0.01) between a general self-report questionnaire and concurrent metacognition as measured using two different general chemistry think-aloud problem-solving tasks. When considering comparisons between task-specific questionnaires and think-aloud protocols more specifically, our observed correlations are on the low end compared to prior studies, in which correlations between these measures ranged from 0.10 to 0.63 (Van Hout-Wolters, 2009; Craig et al., 2020).
| Group of students | N | # strategies used during interview (self-reported) | # strategies used during interview (observed) | Self-reported vs. observed % agreement |
|---|---|---|---|---|
| Organic I | 10 | 16.3 ± 1.5 | 8.7 ± 2.4 | 53.8 ± 14.3 |
| Organic II | 16 | 14.4 ± 2.9 | 9.1 ± 1.9 | 56.1 ± 9.7 |
| Graduates | 12 | 15.6 ± 1.4 | 9.8 ± 2.3 | 59.8 ± 6.7 |
| All students | 38 | 15.3 ± 2.3 | 9.2 ± 2.2 | 56.7 ± 10.4 |
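The correlations between the two measures reported above are Pearson product-moment correlations. A minimal sketch of the computation, on invented paired counts rather than the study's data, is:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired counts: self-reported vs. observed strategies per student
self_reported = [16, 14, 15, 17, 13, 18]
observed = [9, 10, 8, 11, 9, 10]
print(round(pearson_r(self_reported, observed), 2))
```

A value near zero, as observed in this study, indicates that students who self-reported more strategies were not systematically the students observed using more strategies.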
There are several possible reasons for the observed discrepancies between the two measures of metacognitive behavior. Students may have reported using a greater number of strategies than they actually used due to social desirability bias, which is the tendency of survey or interview respondents to give answers that they believe will be viewed favorably by others (Paulhus, 1991). The mismatch between self-reported and observed metacognitive strategy usage might also be partially attributed to the Dunning–Kruger effect, which describes the finding that poor performers tend to overestimate their competence, leading to inflated self-assessments (Kruger and Dunning, 1999; Dunning, 2011). Students’ interpretation of the strategies described by the self-report items also may have differed from the definitions used by the researchers when coding the think-aloud protocols. Students were not asked to explain how they interpreted the items on the self-report measure used in this study, but studies on the response process validity of metacognitive self-report items in high school students have shown that some students find some items confusing or ambiguous, especially items related to planning skills (Berger and Karabenick, 2016) or items with more abstract terms or phrases such as “concepts,” “drawing conclusions,” or “finding information” (Schellings, 2011). It is also possible that some of the students’ thought processes were not included in their verbalizations. This is more likely when processes are highly automated or when a task is particularly difficult or requires a lot of effort (Ericsson and Simon, 1993; Veenman, 2016). When working on more difficult tasks, like the problems students were asked to solve in this study, learners are more likely to occasionally fall silent instead of continuously verbalizing their thoughts (Ericsson and Simon, 1993). 
These occasional silences were observed in most of the interviews we conducted, despite our urging students to continue verbalizing their thoughts. Students’ use of metacognitive strategies may thus be overestimated by their responses to the self-report survey and underestimated by coding of their verbalized thought processes, meaning that the true number of strategies they used while solving the interview problems likely lies somewhere between the two values.
There were particularly low levels of agreement between the two measures for several of the individual metacognitive strategies. In each of these cases, many more students self-reported using these strategies than were observed using them. For instance, the percentage of students who stated that, during the think-aloud portion of the interview, they had tried to relate an unfamiliar problem to previous problems they had encountered ranged from 80–100% depending on the course, but usage of this strategy was only detected in 0–10% of interview transcripts. This could be because students were more likely to verbalize that they were trying to relate a problem to previous problems they had encountered if they did in fact recall some similarity to a problem they had seen before. The use of the strategy itself may be less conscious, and it is only when using this strategy leads the student to notice something useful or unexpected that it surfaces in students’ verbalizations. Veenman et al. (2006) noted that “many evaluation and self-monitoring processes run in the ‘background’ of the cognitive processes that are being executed. Only after an error is detected, rightfully or not, the system becomes alerted” (p. 6). This could also explain the large differences that were seen with the “check if answered question” (self-reported: 88–100%, observed: 10–31%), “check if agreed with prediction” (self-reported: 44–80%, observed: 0–6%), and “monitor progress toward goals” (self-reported: 75–100%, observed: 8–13%) strategies. Students may be more likely to verbalize thoughts related to these strategies if, in using them, they notice a problem with their answer or their progress. That certain strategies were more difficult to discern from the think-aloud protocols than others underscores the importance of using multiple methods to determine which strategies students use during the problem-solving process.
Performance scores on problems, mean (SD):

| Group of students | N | First problem | Second problem | Problem A | Problem B | Problem C | Problem D |
|---|---|---|---|---|---|---|---|
| Organic I | 10 | 50.0 (25.0) | 37.5 (16.7) | 46.3 (21.3) | 41.3 (22.9) | — | — |
| Organic II | 16 | 46.1 (20.3) | 49.2 (23.9) | — | — | 46.9 (21.2) | 48.4 (23.2) |
| Graduates | 12 | 81.3 (22.3) | 78.1 (29.3) | — | — | 78.1 (20.7) | 81.3 (30.4) |
Due to the difficulty of the problems, only 12 solutions were fully correct, and most of these solutions were generated by graduate students. Therefore, we chose to consider any solution that received a score greater than 60% to be “more successful,” which corresponded to 20%, 25%, and 83% of the solutions generated by Organic Chemistry I, Organic Chemistry II, and graduate students, respectively. The number of metacognitive strategies students used in the process of generating more and less successful solutions is displayed in Fig. 4. When comparing all interview participants, those who generated more successful solutions self-reported using a significantly greater number of strategies related to metacognitive regulation than those who were less successful (p = 0.003, Cohen's d = 0.67). Because the distribution of solutions that were considered more successful heavily favored graduate students, we also made comparisons that only considered undergraduate participants. Similar results were observed; undergraduates whose solutions were considered more successful self-reported using more metacognitive strategies while solving these problems (p = 0.015, Cohen's d = 0.83). Among undergraduate participants and among participants as a whole, observed strategy usage trended in the same direction, but these differences only approached statistical significance (p = 0.053 and p = 0.067, respectively).
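The effect sizes reported above are Cohen's d values. As an illustrative sketch, assuming the standard pooled-standard-deviation formula for two independent samples (the analysis itself was run in SPSS, and the counts below are invented, not the study's data):

```python
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)  # sample (n - 1) standard deviations
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled_sd

# Hypothetical self-reported strategy counts for more vs. less successful solvers
more_successful = [17, 16, 18, 15, 17]
less_successful = [14, 15, 13, 16, 14]
print(round(cohens_d(more_successful, less_successful), 2))  # large effect
```

By common convention, d ≈ 0.5 is a medium effect and d ≈ 0.8 a large one, so the values of 0.67 and 0.83 reported above represent medium-to-large differences between the groups.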
The finding that students who generated more successful solutions to organic chemistry problem-solving tasks also reported using a significantly greater number of strategies related to metacognitive regulation is consistent with our hypotheses as well as with previously published research conducted with general chemistry students. Prior research has shown that students who scored higher on measures designed to assess metacognitive strategy usage performed better on specific problem-solving tasks (Cooper et al., 2008; Wang, 2015). Specifically, in their study involving students enrolled in a general chemistry laboratory course, Cooper et al. (2008) found that students who demonstrated greater use of metacognition according to a concurrent measure scored significantly higher on a metacognitive self-report instrument and also showed a significantly greater ability to solve ill-defined problems. Wang (2015) observed significant positive correlations between students’ performance on challenging problem-solving tasks related to thermodynamics and molecular polarity and their metacognitive regulation according to both a self-report questionnaire and analysis of think-aloud interview transcripts. These two studies are most directly comparable to our research methodology, as metacognition was assessed by both concurrent and self-report methods and performance was measured in terms of students’ ability to solve relatively complex problems.
The positive relationship between metacognition and problem-solving success observed in our study can additionally be compared to studies that investigate connections between student metacognition and course grades, though it is important to consider that a student's ability to solve complex problems is one of many potential influences on their grade. González and Paoloni (2015) found correlations of 0.64, 0.67, and 0.68, respectively, between students’ planning, monitoring, and evaluation scores on the Physics Metacognition Inventory and their final grades in introductory chemistry. Cooper and Sandi-Urena (2009) reported that students who received A grades in a general chemistry course scored significantly higher on the Metacognitive Activities Inventory (MCAI) compared to students who received lower grades in the course. Dianovsky and Wink (2012) observed a correlation of 0.56 between students’ scores on the MCAI and their numerical grades in a general education chemistry course. Several studies have also linked interventions designed to promote metacognition to improved performance in general chemistry courses. Cook et al. (2013) found that general chemistry students who attended a 50 minute lecture on metacognitive learning strategies received an average final grade that was a full letter grade higher than those who did not attend this lecture. Casselman and Atwood (2017) reported that students who engaged in homework-based metacognitive training that involved predicting their scores on assignments and making study plans received higher scores on midterm and final exams than those who did not. Mutambuki et al. (2020) noted that students exposed to instruction on metacognitive learning and study strategies in combination with active learning scored significantly higher on the final exam than those who were exposed to active learning alone, with a mean difference of 5%. Using the same metacognitive instructional model described in Mutambuki et al. (2020), Muteti et al. (2021) found that students who reported that this metacognitive lesson had a positive impact on their study strategies were more likely to receive A/B grades and less likely to receive D/F grades on the final exam than students who reported no influence. Overall, the connection between metacognition and performance that we observed in organic chemistry students is consistent with numerous studies conducted with general chemistry students, which reinforces the importance of assessing and promoting metacognitive strategy use in chemistry courses across sub-disciplines.
| Student pseudonym | Problem solved | Performance score on problem (% of possible points) | # of strategies used while solving problem (self-reported) | # of strategies used while solving problem (observed) |
|---|---|---|---|---|
| Andrew | A | 38 | 10 | 4 |
| Lily | B | 50 | 18 | 11 |
| Ben | C | 75 | 14 | 4 |
| Marta | D | 100 | 15 | 10 |
Less successful solution, fewer metacognitive strategies used:
Andrew received a relatively low score (38%) on Problem A and also exhibited fewer metacognitive behaviors than average according to both self-reported and concurrent measures. Andrew began the problem by reading the directions aloud. He then stated that the first thing he was looking for was the reactive site, and he noted that there was an alkene and an epoxide present in the starting material (Code: Sort Relevant Info). He predicted that the epoxide “is what would be breaking in this example” (Code: Make Predictions). He identified that the “H2SO4” present in the reaction conditions was an acid, which would protonate the epoxide and cause the epoxide to break apart to form a tertiary carbocation at the more substituted position of the epoxide (Code: Reflect Relevant Knowledge). He then stated that a water molecule would attack this carbocation, and that he was “pretty sure this is anti addition.” After drawing his final products (shown in Fig. 5), he looked back over what he had done to “make sure the stoichiometry and the equation is balanced” (Code: Check for Mistakes). In addition to the behaviors that were observed in his transcript according to the coding scheme, Andrew also reported that he had set goals, looked for reactions he recognized, related the problem to a previous problem he’d encountered, considered if his proposed steps were reasonable, considered if his answer was reasonable, checked if he’d answered the question, and checked if his answer agreed with his prediction. Andrew's final answer was partially correct in that he performed the hydration of the epoxide with the correct regioselectivity. However, he did not propose any reaction involving the alkene, and he drew an additional unreasonable stereoisomeric product.
Less successful solution, more metacognitive strategies used:
Lily received a score of 50% on her response to Problem B, which was categorized as “less successful,” but she was above-average in terms of the number of metacognitive strategies she reported and was observed using while solving this problem. Lily started by reading the directions aloud and stating that she noticed there was a bromide present in the starting material, which she predicted would act as a leaving group at some point during the reaction (Codes: Sort Relevant Info, Make Predictions). Drawing on her knowledge of nucleophile strength and substitution reactions, she proposed that the potassium ethoxide would react with the alkyl bromide in an SN2 reaction (Codes: Reflect Relevant Knowledge, Look for Reactions Recognized). After completing this SN2 reaction, she stated that she was now stuck because she didn’t know what to do with the ethanol that was also present in the reaction conditions, and she wanted to use every listed reagent in the reactions she proposed (Code: Note Uncertainty). She considered using the potassium ethoxide to deprotonate the ethanol, but she didn’t think this made sense, and she questioned whether the SN2 reaction was the correct path (Code: Monitor Correctness). She considered carrying out an E2 reaction in step 1 instead, but realized that she had still not met her goal of using every listed reagent, since the ethanol did not participate in her proposed E2 reaction either (Codes: Consider Another Way, Monitor Progress Toward Goals). In the end, she returned to her initial proposed SN2 reaction because she thought she had seen potassium ethoxide act as a strong nucleophile more often than as a strong base.
Moving on to the second set of reagents, Lily proposed that the ethoxy group on her SN2 product could be protonated by the sulfuric acid because she had seen something similar happen in a previous problem, but she wasn’t sure what to do after this protonation (Code: Relate to Previous Problems). At this point, Lily went back over her previous work and again thought about whether her product for step 1 was reasonable (Code: Periodically Check if Reasonable). Her conclusion was “I still think the final product of reaction one is not correct, but I have no other way. I need to base it on that to solve the next question.” She then proposed a second SN2 reaction between methanol and the protonated ethoxy group of her intermediate product, and stated that the resulting final product (shown in Fig. 6) “looks fine” and that there would be no further reactivity (Code: Consider if Answer Reasonable). Other strategies that Lily reported using included setting goals, brainstorming multiple ways to approach the problem before she started working, considering whether her proposed steps were reasonable, checking if she had answered the question, checking for mistakes, checking that her answer agreed with what she had predicted, summarizing the main takeaway lesson, and considering how she could change her approach for the future. Lily's final answer received some partial credit because, though she had proposed SN2 reactions rather than the more favorable E2 and SN1/E1 reactions for each step of the problem, she carried out the reactions that she did propose with correct stereochemistry and regioselectivity.
More successful solution, fewer metacognitive strategies used:
Ben's solution to Problem C received a score of 75%, and was therefore categorized as “more successful.” According to his response to the self-report survey, he used an approximately average number of metacognitive strategies, but the number of strategies he was observed using was below average. At the beginning of the problem-solving process, Ben noted that the conditions were acidic and that there were several sites on the starting materials that could potentially be protonated (Code: Sort Relevant Info). He considered protonating each of these sites (Code: Brainstorm Multiple Ways). He then determined that protonation of the aldehyde would be the most productive option because he knew that the amine would most likely function as a nucleophile, and the aldehyde was the most electrophilic functional group present (Code: Reflect Relevant Knowledge). Once he had decided on the nucleophile and electrophile, he drew out the mechanism for forming an imine from the aldehyde. After he reached this product (shown in Fig. 7), he questioned whether the geometry of the imine was correct, but decided that the major product would be the one he had drawn and that he was done with the problem (Code: Consider if Answer Reasonable). In addition to the behaviors that were observed in his transcript, Ben also reported that he had set goals, looked for reactions he recognized, related the problem to a previous problem he’d encountered, made predictions, considered if his proposed steps were reasonable, considered if there was another way to solve the problem, monitored his progress toward his goals, considered whether what he was doing was correct, noted what he was uncertain about, checked if he’d answered the question, and checked if his answer agreed with his prediction.
Because Ben did form an imine by reacting the amine with the more reactive of the two carbonyls, did not make any stereochemical errors, and did not propose any additional unreasonable reactions, his answer was considered “more successful.” He was not fully successful, however, because he did not consider whether any additional reactivity was possible after forming the imine, such as the Mannich reaction or an amine-catalyzed aldol reaction.
More successful solution, more metacognitive strategies used:
Marta received a score of 100% on Problem D, and she used an above-average number of metacognitive strategies according to both self-report and concurrent measures. Upon first seeing the problem, she noted the presence of a phosphorus ylide as well as the acidic conditions (Code: Sort Relevant Info). She then predicted that the first step of the reaction would reveal a carbonyl, because she recalled she had typically seen this type of phosphonate reagent reacting with carbonyls (Codes: Make Predictions, Reflect Relevant Knowledge). She stated that she was not sure which acetal oxygen she should protonate first, but she decided to choose the one in the ring, keeping in mind that she could try the oxygen that was part of the isopropoxy group as well if her first idea did not work (Code: Brainstorm Multiple Ways). As she worked on cleaving the acetal, she recalled that she would need to indicate stereochemistry in her answer, so she made sure that she had considered this while drawing intermediate structures (Code: Monitor Progress Toward Goals). Once she generated the correct aldehyde product of step 1, she looked back over her work to consider whether what she had done was reasonable and then decided to go back to the beginning and try protonating the isopropoxy group first instead (Codes: Periodically Check if Reasonable, Consider Another Way). She erroneously determined that this path was incorrect and would not lead to the desired carbonyl product (Code: Monitor Correctness).
Marta then continued on to the second step of the reaction. As she drew out the mechanism for the HWE reaction, she stated that she was not sure about one step of the mechanism and would want to look it up if she had access to an answer key (Code: Note Uncertainty). After she reached her final answer (shown in Fig. 8), she repeatedly counted the atoms present in her answer and in her intermediates to make sure she had drawn the product correctly (Code: Check for Mistakes). Marta also reported that she had set goals, looked for reactions she recognized, related the problem to a previous problem she’d encountered, considered if her proposed steps were reasonable, considered if her answer was reasonable, checked if she’d answered the question, and checked if her answer agreed with her prediction. Marta's answer was fully correct and was considered “more successful.”
Considering the interview participants as a group, students who generated more successful solutions tended to use a greater number of metacognitive regulatory strategies. From our analysis of the individual problem-solving pathways of Andrew, Lily, Ben, and Marta, however, it is clear that the relationship between metacognition and problem-solving success is more nuanced. Andrew and Ben both used a below-average number of metacognitive strategies in their approach to Problems A and C, respectively. Neither student received full points for their solutions because, after identifying a reasonable starting point with the use of planning strategies, they did not consider the potential for further reactivity. Had these students engaged in monitoring strategies such as pausing to consider whether there was another way to solve the problem, they may have received higher scores. Andrew's solution to Problem A received a lower performance score than Ben's solution to Problem C and was ultimately categorized as less successful because Andrew's solution contained stereochemical errors that point to a gap in his understanding of this concept. This difference in task performance between students with a similar level of metacognitive strategy usage was also seen when comparing the approaches of Lily and Marta. Lily and Marta both displayed an above-average number of metacognitive behaviors, yet Marta's solution to Problem D received full points, while Lily's solution to Problem B was considered less successful. Based on her verbalized thoughts, Lily seemed to be unsure about the role of the solvent and the favorability of different substitution or elimination reactions under the given reaction conditions, which led her to struggle to generate a reasonable solution. However, Lily's use of planning and monitoring strategies did help her to identify, consider, and dismiss several potential types of reactivity. 
Overall, these four cases suggest that when solving complex organic chemistry problems, a solid foundation of conceptual knowledge and metacognitive problem-solving skills can both be major contributors to success.
The reasons students gave for using metacognitive planning, monitoring, and evaluation strategies mostly aligned with our expectations and showed that students used these strategies for their intended purposes. As expected, students generally used planning strategies to help identify and explore possible options, monitoring strategies to keep themselves on track and avoid making mistakes or wasting time or effort, and evaluation strategies to assess the merits of their answer and approach as well as to learn from their experience of solving the problem. Though no existing studies to our knowledge have examined students' reasons for using metacognitive strategies in the context of problem solving, students have been found to give similar reasons for using or not using metacognitive strategies while reading academic texts (Thuy, 2020; Andriani and Mbato, 2021). One interesting observation is that many students mentioned being encouraged by their instructors or tutors to use certain planning strategies, but this reasoning was mentioned less often with regard to monitoring or evaluation strategies. If instructors typically concentrate on teaching planning strategies, it would be useful to additionally introduce and model the use of various monitoring and evaluation strategies during class. Students' reasons against using metacognitive strategies, especially those related to feeling unable to use certain strategies effectively, point towards opportunities for instructors to provide students with additional guidance and support in implementing these strategies. It is important to note that our goal in advocating that instructors teach students about metacognitive regulation is not for students to use every strategy listed in Table 2 when working on every organic chemistry problem they encounter. Students may rightfully not find some strategies useful in every situation, especially for more straightforward problem-solving tasks. 
Instead, we believe it is beneficial to introduce these skills and give students the tools to use them when needed.
Lastly, it is important to consider how our positionalities as instructors, researchers, and students influenced our analysis and interpretation of the data collected in this investigation. At the time this work was conducted, the first and fifth authors were doctoral students studying organic chemistry and chemical education at the same institution as the interview participants. The second, third, and fourth authors are current or former undergraduate students who had recently taken organic chemistry courses at this institution. The corresponding author is a professor who has taught organic chemistry courses at this institution for over a decade. Though none of the authors have taught or taken organic chemistry courses with any of the study participants, each author is either a product of or is involved in the teaching of the organic chemistry curriculum at this institution. Each of us is therefore experienced with solving organic chemistry problems similar to those investigated in this study, and we all have our own ideas about what metacognitive strategies work well for us or our students and why we choose to use or not use certain strategies. These personal experiences may have influenced our interpretation and understanding of students’ words and actions while coding the interview transcripts. For example, a researcher may have more readily noted a student's usage of strategies that more closely matched their own problem-solving approaches. To mitigate potential bias and ensure that both student and instructor perspectives were taken into account, each interview was coded by at least one member of the research team who had recently taken an organic chemistry course and one who had recently taught an organic chemistry course, and any differences in interpretation were discussed until agreement was reached.
Our analysis focused on three main research questions. First, what metacognitive strategies do students use when solving complex predict-the-product problems? Analysis of think-aloud problem-solving interviews and task-specific self-report questionnaires led us to conclude that the strategies most commonly used by students were those related to identifying relevant information, recalling prior knowledge, and monitoring or evaluating the correctness of one's progress or products, whereas far fewer students engaged in evaluation strategies that involved reflecting and learning from the experience of problem solving. When comparing the approaches of graduate and undergraduate students, one trend we observed was the higher prevalence of forward-thinking strategies, including setting goals and making predictions at the beginning of the problem, among graduate students. When examining students’ use of metacognitive regulation strategies measured concurrently during think-aloud interviews as compared to their self-reported use of these same strategies, significant discrepancies between these two measures were found. Our second research question asked whether students who are more and less successful at solving organic chemistry problems differ in their use of metacognitive regulatory strategies. We found that students who generated more successful solutions self-reported using a significantly greater number of metacognitive strategies during the problem-solving process, and comparisons of observed strategy usage trended in the same direction. Analyzing individual examples of student problem-solving pathways showed that, while the use of a greater number of metacognitive strategies does not always lead to greater success on non-trivial organic chemistry problems, using these strategies can help students generate possible ideas, ensure that they are making progress in the right direction, and determine whether their answer is reasonable and complete. 
Our final question involved the reasons students have for using or not using metacognitive strategies. Students stated that they found many of the strategies described herein to be useful for helping narrow down options, avoid mistakes, and keep themselves on track during the process of problem solving. Yet students also had several reasons for not using these strategies, such as believing that using a strategy was unnecessary or distracting or that they were not capable of using the strategy effectively. Each of these findings suggests specific implications for research and practice.
When considering implications for research, the significant discrepancy observed between concurrent and self-report measures emphasizes the importance of using multiple measures to detect metacognitive regulation in students, as the use of a single measure may result in an incomplete understanding of students' cognitive processes related to this complex construct. The reasons for the observed discrepancies are not entirely clear; however, possible factors include social desirability bias (Paulhus, 1991), differences between students' interpretations of the strategies described by the self-report items and the definitions used by the researchers when coding the think-aloud protocols, or the omission of some of students' more automated cognitive processes from their think-aloud interview verbalizations (Ericsson and Simon, 1993). We suggest that future studies that rely upon self-report assessments of metacognitive regulation could make use of cognitive interviews in which students are asked to explain their thought process as they answer each item of the questionnaire (Schellings, 2011; Berger and Karabenick, 2016). Analysis of these interviews could help explain the reasons for any disagreement between self-reported and observed metacognition as well as point to ways in which survey items or coding definitions could be modified to better assess strategy usage in students.
There are several teaching strategies instructors can use to enhance students’ use of metacognitive regulation strategies. Instructors of introductory organic chemistry courses could introduce metacognitive strategies by modeling the use of planning, monitoring, and evaluation strategies while explaining their thought process as they go over example problems during class. Rather than only presenting polished, linear solutions, instructors could also showcase the false starts and dead ends involved in real problem solving as well as how to recover from them. For example, when presenting a solution to a problem, the instructor could begin by setting goals, making predictions, and brainstorming potential approaches, either on their own or with input from the class. As they work through the problem, they could pause to ask themselves or their students whether they are making progress towards their goals. If they determine that they are not in fact making progress, they could backtrack and try another method. After reaching an answer, they could model the use of evaluation strategies such as checking for mistakes or checking whether their answer agreed with their prediction. Instructors could also give students opportunities to practice using metacognitive strategies with the help of problem-solving workflows. Examples of problem-solving scaffolds that could promote discipline-specific metacognition in students include the “Goldilocks Help” workflow, developed by Yuriev et al. (2017) in order to scaffold the development of metacognitive self-regulation and problem-solving skills in general and physical chemistry courses, and a problem-solving workflow designed for predicting organic reactivity that was developed by our research group (Helix et al., 2022). 
Instructors could also provide students with an opportunity to practice using these strategies on scaffolded homework or in-class assignments that include explicit prompts that would, for example, ask students to write down goals or predictions before solving a problem or to write down “main take-away lessons” after completing a problem. Students’ prior experiences with using metacognitive strategies and their memories of their past successes and failures influence their subsequent metacognitive and self-regulatory strategy choices (Finn, 2020), so having the opportunity to practice using these strategies with the help of problem-solving workflows or scaffolded assignments could enable students to feel more confident in their ability to use these strategies effectively, including in situations where they are constrained for time.
Drawing on these suggested teaching methods and the results of this investigation, we have recently piloted a series of problem-solving workshops with a small number of organic chemistry students at our institution. According to Arslantas et al. (2018), metacognitive instruction should include “explicit instruction, modeling, integration of metacognitive skills with course content, and opportunities for practice and reflection” (p. 59). These workshops therefore begin with explicit instruction on metacognition and its importance, drawing on data collected during this study on the reasons students use certain strategies. This is followed by instructor modeling of strategies that we identified as particularly underused among undergraduate students, such as making predictions or summarizing main takeaway lessons. Students then complete scaffolded worksheets in which they are asked to write down their answers to prompts related to these strategies before, during, and after working on organic chemistry problems. Preliminary data suggest that these workshops were helpful to students, though additional research is needed to determine their efficacy in a larger classroom setting.
Footnote
† Electronic supplementary information (ESI) available. See DOI: https://doi.org/10.1039/d2rp00208f
This journal is © The Royal Society of Chemistry 2023