Promoting metacognition through measures of linked concepts with learning objectives in introductory chemistry

Alex Gilewski a, Mikhail Litvak b and Li Ye *c
aDepartment of Science, Cerro Coso Community College, Ridgecrest, 3000 College Heights Blvd, Ridgecrest, California 93555, USA. E-mail: alex.gilewski@cerrocoso.edu
bDepartment of Biology, California State University, Northridge, 18111 Nordhoff Street, Northridge, California 91330, USA
cDepartment of Chemistry and Biochemistry, California State University, Northridge, 18111 Nordhoff Street, Northridge, California 91330, USA. E-mail: li.ye@csun.edu

Received 25th February 2022 , Accepted 9th June 2022

First published on 13th June 2022


Abstract

Previous research has shown the validity of the Measure of Linked Concepts (MLC) assessment in college introductory chemistry. Herein, we present a novel model of pairing the MLC with a metacognitive exercise aligned with learning objectives in introductory chemistry courses as an effective tool for instructors to integrate into instruction and assessment. A mixed-methods approach was used to explore the effectiveness of MLCs with the metacognitive exercise on students’ performance and metacognitive thinking and behaviors. The study was implemented in introductory chemistry at two institutions over two semesters. The multiple regression model showed that MLC scores significantly predicted students’ final exam scores in introductory chemistry. Students’ MLC scores improved significantly (by 18%) after pairing with the metacognitive exercise. Notably, the theme of metacognition was observed far more frequently in the qualitative data when the MLC was paired with the metacognitive exercise than without it (37% vs. 9%). More importantly, most of the participants (87%) reported that they looked up the learning objectives associated with the MLC statements they missed. More than half of the participants indicated they made plans to master the missed learning objectives; their plans included reviewing lecture notes pertaining to the topics, doing more practice problems related to the missed learning objectives, or seeking help from tutors or instructors. Pairing MLCs with the metacognitive exercise shows promising signs of improving student performance and metacognition. The authors suggest that adopters of MLCs utilize the new model of pairing them with the metacognitive exercise.


Introduction and background

Many college students struggle with introductory science courses because of the breadth and depth of the content covered. Educators and researchers are developing new methodologies for improving student learning skills and outcomes. David Ausubel's theory of Meaningful Learning provides guidance for designing effective methodologies. Meaningful learning is characterized by the long-term retention of conceptual information and the application of prior knowledge to new contexts (Ausubel, 1968). It is established when learners scaffold new concepts upon their pre-existing knowledge; that is, novel concepts are connected to previously learned ones. This is distinct from rote learning, in which only short-term retention is attained and concepts are not integrated.

Meaningful learning can be aided through various learning tools. Several assessment techniques have been reported in the literature to help students make connections between concepts. One of these assessments is Creative Exercises (CEs), in which students are asked to write original statements in response to a given prompt such as “34 grams of carbon dioxide” (Lewis et al., 2011; Ye and Lewis, 2014; Warfa and Odowa, 2015; Gilewski et al., 2019). For a student to receive credit, statements must be accurate, germane to the prompt, and distinct from the other statements in terms of the concepts being applied. CEs have been previously used to demonstrate students’ linking of chemistry concepts: students were able to link more concepts over time and showed performance gains with CEs. Survey data revealed that CEs helped students reinforce prior concepts and connect concepts in their chemistry courses, giving them more flexibility and providing an additional study tool (Gilewski et al., 2019). Although the open-ended nature of CEs gives students the autonomy to actively retrieve prior knowledge and apply it to new contexts, it limits the instructors’ flexibility in showing the relevance of particular prior concepts to new concepts, and the grading for larger classes can be time-consuming. Thus, the research team developed an alternative assessment termed the Measure of Linked Concepts (MLC), in which students are given a prompt that focuses on the new topic learned in the chemistry course and a range of statements covering chemical concepts related to the prompt (Ye et al., 2015a, 2015b). Students are asked to evaluate the truth of each statement. The statements upon which the MLC assessments were created were originally based on student responses to CEs.
For example, when students learn the molecular shapes of chemical compounds, usually taught toward the end of the semester in the introductory course, the instructor could provide a prompt presenting a chemical compound and its Lewis structure and link a variety of concepts for students to evaluate. The statements can not only measure current concepts such as molecular shapes, bond polarity, and molecular polarity but also show the relevance of prior concepts like classification of matter, the periodic table, nomenclature, etc. (see the example MLC in Table 1). MLCs provide instructors with a tool for linking prior knowledge with subsequent topics. The closed-ended nature of the assessment enables instructors and students to get immediate feedback on the prevalence of students’ incorrect responses. While MLC assessments have been studied on an item-by-item basis to evaluate student understanding of specific topics and to inform future instruction, they have not yet been demonstrated to have a predictive effect on overall student performance in chemistry (Ye et al., 2015a, 2015b).

Table 1 An example of the MLCs
Instruction:
Consider the information given and indicate whether each statement below is True (T), False (F), or Unsure (U) (you may choose Unsure when you do not know or are not sure of your answer). You will get 1 point for a correct response or 0.2 points for selecting Unsure for each statement. After you have completed all the statements, please complete the metacognitive exercise at the end.
NH3 is commonly used worldwide as a fertilizer. It is also a common cleaning agent when dissolved in water. The Lewis structure of NH3 is provided below. The electronegativity of N is 3.0 and H is 2.1
[Image: Lewis structure of NH3]
Statements:
*Numbers in the parentheses indicate the learning objectives (LO):
_1. The molecular geometry of NH3 is trigonal planar. (LO 7.5)
_2. Nitrogen will have a slightly positive charge. (LO 7.7)
_3. N–H bonds are polar. (LO 7.8)
_4. This is a nonpolar molecule. (LO 7.9)
_5. NH3(aq) is a homogeneous mixture. (LO 1.3)
_6. Every atom in NH3 is a nonmetal. (LO 3.8)
_7. In one mole of NH3 there are 6.022 × 1023 atoms of hydrogen. (LO 4.4)
_8. Nitrogen has 2 core electrons. (LO 5.7)
_9. There are 8 total valence electrons in all the atoms of NH3. (LO 5.7)
_10. The name of NH3 is ammonium. (LO 6.3)
Metacognitive exercise:
Which learning objectives did you miss on the last [MLC] quiz?
What is your plan to master these learning objectives in the future?


Metacognition has also been identified as a factor in improving student performance in chemistry. Metacognition, meaning “thinking about thinking,” has been documented since the days of Aristotle, but its definitions and ramifications for education were examined in depth by John Flavell in the 1970s. Metacognition can be described as a self-monitoring process: students reflect on the quality of their learning, their difficulty with certain topics, or the effectiveness of their study skills (Flavell, 1976). Various models have been proposed to describe how learners can apply metacognitive skills and improve performance in science courses (Akyol et al., 2010; Sinapuelas and Stacy, 2015; Chan and Bauer, 2016; Ye et al., 2016). Although the models vary across contexts, their shared notion is that learners need to monitor and regulate their learning processes (Zimmerman, 2001). In the context of college chemistry courses, research studies indicate that metacognition can be improved through purposeful training (Cook et al., 2013; Zhao et al., 2014; Casselman and Atwood, 2017; Avargil, 2019; Graham et al., 2019; Mutambuki et al., 2020; Muteti et al., 2020). Cook and colleagues gave students in general chemistry a 50 minute lecture on metacognition that introduced them to a range of metacognitive activities (e.g., the study cycle, rewriting lecture notes, pretending to teach, previewing material, and working on homework problems without any examples present). Students who attended the lecture earned grades one letter higher, on average, than students who did not (Cook et al., 2013).
The study highlighted the importance of dispelling the student notion that high school-level learning skills (e.g., memorization) would still suffice in college, and the authors suggested that the timing of the lecture is critical: giving students the opportunity to reflect on their learning shortly after high-stakes assessments is optimal. In a follow-up study, a separate metacognitive lecture was implemented in general chemistry I and II shortly after the first examination. The study found that the metacognition lecture increased students’ chemistry self-concept in general chemistry I, and students had a higher success rate in general chemistry II compared to previous years (Zhao et al., 2014). A recent study of college general chemistry I compared the performance of a treatment group (explicit teaching of metacognition by a lecture combined with active learning) and a comparison group (active learning alone); the treatment group scored 5% higher on the common cumulative final exam (Mutambuki et al., 2020). A companion qualitative study showed that metacognition training in lectures led students to use fewer ineffective learning strategies such as rote memorization and to gain higher-order thinking skills like applying and analyzing (Muteti et al., 2020). In addition to lectures, metacognition training has also been implemented through tutoring sessions, online learning modules, and homework (Casselman and Atwood, 2017; Avargil, 2019; Graham et al., 2019). For instance, Casselman and Atwood reported that students who completed metacognition prompts asking them to predict their scores on homework assignments and to create a future study plan outperformed students who did not answer those prompts by approximately 4% on the American Chemical Society (ACS) general chemistry final exam (Casselman and Atwood, 2017).
Avargil investigated how metacognitive prompts in online learning modules increased students’ self-efficacy, graphing skills, and conceptual understanding of chemistry. The group receiving metacognitive prompts, in which students were asked to reflect on their chemical understanding and graphing skills, outperformed the group whose modules contained only content knowledge on the post-test when controlling for pre-test scores (Avargil, 2019).

Research studies have shown that metacognitive skills play an important role in increasing students’ learning and outcomes in college chemistry courses. These skills can be taught by lecture or embedded into online learning modules and homework, giving instructors an effective tool to integrate into instruction. However, very few studies have incorporated metacognitive training into assessment practices. Our research team designed a new version of the MLCs, associated with learning objectives and a metacognitive exercise, informed by the existing literature and our prior study. The metacognitive exercise encourages students to reflect on their performance on specific learning objectives and to develop plans to address gaps in their understanding of chemistry concepts immediately after they receive feedback on the MLCs. We hypothesized that these practices would improve students’ metacognition and learning outcomes. The purposes of the current study are: first, to investigate the predictive effect of an assessment (i.e., MLCs) designed to help students connect concepts in introductory courses; second, to integrate learning objectives and a metacognitive exercise into the MLCs and examine how the new components make a difference in student learning. Herein, we present a mixed-methods study exploring the effect of MLCs in introductory chemistry at two institutions. Our research questions were:

(1) Do MLCs predict student performance on the final exam in introductory chemistry?

(2) To what extent does the metacognitive exercise aligned with learning objectives improve student performance?

(3) In what ways do students use the metacognitive exercise to improve their knowledge of chemistry?

Methodology

Settings and data collection

The study took place over two semesters at two institutions in the southwestern United States. The courses involved were both one-semester introductory chemistry courses taught in person by the same instructors using the same curriculum materials and teaching approaches. The first semester was Fall; the second semester was Spring. The first institution was a community college. Students met twice a week for 2 hours each for the lecture and 3 hours weekly for a laboratory session. The topics covered in Introductory Chemistry were taught in the following order: the scientific method, matter and energy, using numbers in science, atomic structure, the mole concept, energy and electronic structure, nomenclature, chemical bonding, gas laws, intermolecular forces, solutions, chemical reactions, stoichiometry, equilibrium, acid–base chemistry, and nuclear chemistry. MLCs were given as weekly online quizzes administered through the college's learning management system. Students had one week from completing a topic in the lecture to answer the MLC designed for that topic. A total of sixteen MLCs were administered over the semester, with one MLC correlating with each chapter of the textbook used except for the first chapter (the scientific method). In the second semester, learning objectives associated with the statements were integrated into the MLCs. Students were also asked to complete a metacognitive exercise in which they needed to look at the learning objectives they missed and write a plan for mastering them. In both semesters, students were given immediate feedback on which statements they missed after completing the MLCs. About 4% of the course grade was assigned to the MLC quizzes. A cumulative multiple-choice final exam using the same learning objectives as the MLCs was administered at the end of the semester. The second institution was a public, primarily teaching four-year university.
Students met for lectures twice per week for 75 minutes each and had an optional separate laboratory class. The topics covered, in order, were: the scientific method, measurement, matter and energy, atoms and elements, molecules and compounds, chemical composition, chemical reactions, stoichiometry, solutions, electrons in atoms, and the periodic table. A total of twelve MLCs were given in a similar fashion as at the first institution.

In addition, surveys were administered at both institutions at the end of each semester. One focus group interview was conducted at each institution at the end of the second semester. Students were given a gift card for their participation in the interview, and several gift cards were raffled off among all students who submitted survey responses to promote the response rate. This project was approved by the Institutional Review Board and Human Subjects Committee at the respective institutions. As per the request of the Institutional Review Board, students were given informed consent forms to affirm their participation in the study. All but one student gave permission to be included in the study; that student, in the first semester, subsequently dropped the course, so their data are not included.

Generation of MLCs

As the MLCs developed for use in the original paper were geared towards a general chemistry course, new assessments needed to be written for an introductory chemistry course (Ye et al., 2015a, 2015b). One MLC exercise was written to correlate with each chapter of the textbook used by both institutions (Tro, 2015). No MLC was written for the first chapter, as students would not yet have previous concepts to link to current concepts. MLCs usually comprised ten questions each: approximately 4 questions were based on the current chapter, with the remaining 6 derived from concepts learned in previous chapters. To ensure the content of the MLCs was accurate and appropriate to the level of the target student population, three undergraduate researchers originally drafted the MLCs, which were then examined and revised by two content experts (the instructors of the courses). Students received 1 point per correct answer, or could indicate they were unsure to receive 0.2 points; an incorrect answer yielded 0 points. The unsure option was added to give students an explicit way to indicate uncertainty while still receiving a small number of points – students would then be aware of exactly which objectives they did not know while taking the assessment.
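The scoring rule described above (1 point for a correct true/false response, 0.2 points for Unsure, 0 for an incorrect answer) can be sketched as a short function. This is an illustrative sketch only; the function name and data representation are our assumptions, not part of the study's materials.

```python
def score_mlc(responses, key):
    """Score one MLC attempt under the rubric described above:
    1 point for a correct T/F answer, 0.2 for 'U' (unsure), 0 otherwise.
    `responses` and `key` are equal-length sequences of 'T'/'F'/'U'."""
    total = 0.0
    for answer, correct in zip(responses, key):
        if answer == 'U':
            total += 0.2          # partial credit for acknowledged uncertainty
        elif answer == correct:
            total += 1.0          # full credit for a correct judgment
    return total
```

On a ten-statement MLC such as the NH3 example in Table 1, the maximum score is thus 10, and a student who marked every statement Unsure would receive 2 points.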

In the first semester, each MLC consisted of a chemistry prompt describing a real-world scenario with a target concept and a series of statements assessing previous concepts taught in the course (version 1). In the second semester (version 2), two changes to the MLC were implemented: first, students were given a list of learning objectives, and the MLC statements were revised to indicate which learning objective(s) each item tested; second, a metacognitive exercise was added to the weekly quizzes after students reviewed their quiz results. The students were asked: “Which learning objectives did you miss on the last [MLC] quiz? What is your plan to master these learning objectives in the future?” Fig. 1 and Table 1 show the graphical abstract of the second version and an example of the MLCs. The red annotations in Fig. 1 show the changes made in the second version as compared to the first version. The learning objectives for each institution, as well as all the MLCs used in the study, can be found in Appendices S1 and S2 (ESI).


Fig. 1 The graphical abstract of the MLCs.

Data analysis

The quantitative data used for analysis included midterm and final exam scores and average MLC scores, as well as demographic information such as gender and ethnicity. Students who self-identified as Latinx, African American, Native American, or Other Pacific Islander were coded as underrepresented minorities (URM). Students who did not complete at least half of the MLC exercises and did not have a final exam score were excluded from the quantitative analysis. Due to exigent circumstances, the final exams at the second institution were subject to emergency cancellation; as a result, data from the second institution were not included in the quantitative analyses (research questions 1 and 2). Descriptive statistics were used to describe the characteristics of the sample. The reliability of the MLCs was evaluated using Kuder–Richardson 20 (KR-20), a metric showing how consistent the items are for a measure with binary variables (i.e., answers that are true or false). The Pearson product-moment correlation coefficient was used to indicate the relationship between MLC and final exam scores. To explore the predictive effect of the MLCs on student performance, a standard multiple regression was used. Additionally, independent t-tests were conducted and effect sizes calculated to show the differences in averages between the two versions of the MLCs. All statistical analyses presented here were conducted using IBM SPSS 26.
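For readers unfamiliar with KR-20, it can be computed from the per-item difficulties and the variance of total scores. The sketch below is illustrative only: it uses the population-variance convention (one common choice) and is not a reproduction of the SPSS procedure or data used in the study.

```python
def kr20(item_matrix):
    """Kuder-Richardson 20 for dichotomous (0/1) items.
    item_matrix: one row per student, each row a list of 0/1 item scores."""
    n_students = len(item_matrix)
    k = len(item_matrix[0])  # number of items
    # Sum of p*q over items, where p = proportion answering the item correctly
    pq_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_matrix) / n_students
        pq_sum += p * (1 - p)
    # Variance of students' total scores (population variance convention)
    totals = [sum(row) for row in item_matrix]
    mean = sum(totals) / n_students
    var = sum((t - mean) ** 2 for t in totals) / n_students
    return (k / (k - 1)) * (1 - pq_sum / var)
```

Values near the 0.77–0.78 reported in this study indicate that students who do well on some items tend to do well on the others, i.e., the items behave consistently as a measure.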

The survey responses and interviews from the two institutions were used for research question 3. The qualitative data were combined because we did not observe any apparent differences in the emerging themes from the survey and interview data between the two institutions. The interviews were transcribed verbatim by two chemistry education researchers, who cross-checked the entirety of each other's work for accuracy. The interview transcripts and survey responses were then reviewed by one of the transcribers and another researcher for thematic analysis. The two researchers independently coded statements in one of the interview transcripts as belonging to the emerging themes through open coding and constant comparative methods (Glaser and Strauss, 1967; Patton, 2002). The interrater reliability (Cohen's κ = 0.89) was determined to be excellent, and one researcher coded the remainder of the data (Cohen, 1960; Landis and Koch, 1977).
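Cohen's κ compares the raters' observed agreement with the agreement expected by chance from each rater's category frequencies. A minimal sketch (illustrative only; not the coding scheme or data from this study):

```python
def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters' categorical codes on the same items."""
    n = len(codes_a)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    categories = set(codes_a) | set(codes_b)
    pe = sum((codes_a.count(c) / n) * (codes_b.count(c) / n)
             for c in categories)
    return (po - pe) / (1 - pe)
```

Under the Landis and Koch (1977) benchmarks cited above, values above 0.80 (such as the 0.89 reported here) are conventionally read as almost perfect agreement.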

Results and discussion

Research question 1: Do MLCs predict student performance on the final exam in introductory chemistry?

Table 2 illustrates the descriptive statistics for the samples in the two semesters. There were 57% female and 25% URM students in the first semester and 68% female and 33% URM students in the second semester. The average MLC score increased from 5.0 in the first semester to 6.8 in the second semester. The final exam percentage increased from 55.6% in the first semester to 57.8% in the second semester. The KR-20 was 0.781 in the first semester and 0.771 in the second semester, suggesting good internal consistency reliability for the data collected by the assessment with the study sample (Fraenkel and Wallen, 2008). Pearson product-moment correlation coefficients (r) were used to measure the associations between student performance on MLCs and final exams. The correlations between the two variables were positive and significant at the p = 0.05 level. The correlation was higher in the second semester (N = 40, r = 0.688, p < 0.001) than in the first semester (N = 70, r = 0.372, p = 0.018). A moderately strong positive correlation between MLC scores and the final exam was observed for the combined data of the two semesters (N = 110; r = 0.486; p < 0.001). This strong relationship with an associated construct provides evidence for the validity of the data collected by the MLC assessment with the study sample (Arjoon et al., 2013).
Table 2 Descriptive statistics
Note: only data from institution 1 (community college) was used.
                              Semester 1    Semester 2
N                             70            40
% Female                      57            68
% URM                         25            33
Avg. MLC score (max. 10)      5.0           6.8
Final exam score (%)          55.6          57.8


To explore student performance on MLCs in the context of their overall learning, a standard multiple regression analysis was performed with data combined from both semesters. Midterm exam scores, student demographics including gender and URM status, and MLC scores were used as predictors to predict students’ final exam scores in the model. MLC scores were found to be a statistically significant predictor of final exam scores in the course. The results of the multiple regression analysis can be found in Table 3. The regression equation is:

Final exam % = −66.878 + 3.248 × Gender + 6.208 × URM status + 0.997 × Midterm exam average % + 7.594 × MLC average

Table 3 Multiple regression model predicting final exam scores (R2 = 0.650; p < 0.001)
Note: only data from institution 1 (community college) was used. a Indicates the predictor is significant in the regression model.
Model                     Unstandardized B    Std. error    Standardized β    t         Sig.
(Constant)                −66.878             11.472                          −5.830    <0.001a
Gender                    3.248               3.441         0.067             0.944     0.348
URM status                6.208               3.955         0.119             1.570     0.121
Midterm exam average      0.997               0.096         0.704             10.357    <0.001a
MLC average               7.594               1.389         0.401             5.467     <0.001a


According to the multiple regression model, the independent variables significantly predicted 65.0% of the variance in final exam scores. Students could expect about a 7.6% increase in their final exam score for each point their MLC average score rose (to a maximum of 10 points), holding the other independent variables constant. Unsurprisingly, the midterm exam average was the strongest predictor in the model; a 1% increase in average midterm exam score was predicted to increase the final exam score by about 1%, holding all other variables constant. There was no significant difference in the final exam based on gender or URM status.
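The interpretation above follows directly from applying the reported regression equation. In the sketch below, the 0/1 indicator coding of gender and URM status is our assumption for illustration (the coding is not stated explicitly in the text), and the input values are hypothetical.

```python
def predict_final_exam(gender, urm, midterm_pct, mlc_avg):
    """Apply the reported regression equation to predict final exam %.
    gender, urm: 0/1 indicator variables (coding assumed for illustration);
    midterm_pct: average midterm exam score in percent;
    mlc_avg: average MLC score (0-10)."""
    return (-66.878 + 3.248 * gender + 6.208 * urm
            + 0.997 * midterm_pct + 7.594 * mlc_avg)
```

For example, raising a hypothetical student's MLC average from 5.0 to 6.0 while holding the other predictors fixed raises the predicted final exam score by exactly the MLC coefficient, about 7.6 percentage points.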

MLCs have been previously implemented in the general chemistry course at a public research university (Ye et al., 2015a, 2015b), although their effect on other assessment scores had not yet been demonstrated in the literature. The results of this study indicate a beneficial effect resulting from the addition of MLCs to the introductory chemistry assessment toolkit. As the MLC assessment is designed to promote students connecting concepts between topics in introductory chemistry courses, the meaningful learning framework suggests that connecting concepts will strengthen long-term retention (Ausubel, 1968). The strong predictive effect of MLC scores on final exam scores indicates the value of implementing MLCs in the introductory chemistry course. These results mirror the study previously published by the authors on the effectiveness of the precursor of MLCs, Creative Exercises, in improving students’ linking of concepts and academic performance in introductory chemistry (Gilewski et al., 2019).

Research question 2: To what extent does the metacognitive exercise aligned with learning objectives improve student performance?

First, to ensure the samples were comparable across the two semesters, the percentages of female (χ2 = 1.147; p = 0.284) and URM students (χ2 = 0.741; p = 0.389) were compared by chi-square tests, and no significant difference was found between the two semesters. Additionally, the scores on the first MLC exercise given in each semester were compared, and no significant difference was observed (average1 = 6.8; average2 = 6.4; df = 108; p = 0.119). Furthermore, to explore whether the metacognitive exercise added to the MLCs during the second semester improved student performance on MLCs and final exam scores, average scores were compared through independent sample t-tests. Students performed significantly better on MLCs in the second semester (see Table 2, average1 = 5.0; average2 = 6.8; difference = 1.8; df = 108; p < 0.001; Hedges’ g = 1.6). Hedges’ g was chosen as a more appropriate effect size approximation due to the different sample sizes across the two semesters; an effect size between 1.2 and 2.0 has been described as a very large effect in Sawilowsky's expansion of Cohen's original effect size interpretation parameters (Sawilowsky, 2009). Student performance on MLCs increased by 18% after pairing the MLCs with the metacognitive exercise. Final exam scores were also higher in the second semester, although the difference was not significant (see Table 2, average1 = 55.6; average2 = 57.8; difference = 2.2; df = 108; p = 0.679). These data are presented in Fig. 2a and b. These results indicate that the metacognitive exercise (labeled ME in Fig. 2) aligned with learning objectives might have played an important role in improving student MLC performance.
However, although students’ final exam performance improved by 2.2% compared to the group using the original MLC, this influence of the paired metacognitive exercise was not statistically significant. One possible explanation is that MLCs were used many times across the semester: students may have become familiar with the format of the MLCs, and linking statements to learning objectives may have given students hints about which chapters or concepts the statements measured, whereas the final exam is multiple-choice with no indication of the learning objectives associated with the questions. In addition, the regression model explained only 65% of the variance; other confounding variables such as student motivation, attitude, and prior knowledge might also impact students’ learning outcomes in chemistry.
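Hedges' g, used above, is Cohen's d with a small-sample bias correction, which suits the unequal group sizes here (N = 70 vs. 40). A minimal sketch from raw scores (illustrative only; the study's value of 1.6 was computed from the actual MLC data):

```python
import math

def hedges_g(sample1, sample2):
    """Hedges' g: standardized mean difference (Cohen's d with pooled SD)
    multiplied by a small-sample bias correction factor."""
    n1, n2 = len(sample1), len(sample2)
    m1 = sum(sample1) / n1
    m2 = sum(sample2) / n2
    # Sample variances (n - 1 denominator)
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m2 - m1) / pooled_sd
    # Small-sample bias correction (approximation to the exact J factor)
    correction = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * correction
```

Because the correction factor approaches 1 as samples grow, g and d are nearly identical at the sample sizes in this study; the pooled standard deviation is what makes the measure comparable across groups of different size.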
Fig. 2 (a) Average MLC scores (max. 10) between the two semesters; p < 0.001; (b) average final exam scores between the two semesters; p = 0.679.

Research evidence has shown that assessment practices not only support student learning but also have an overwhelming influence on how and how much students study (Gibbs and Simpson, 2005; Ye et al., 2015a, 2015b). The qualitative data in this study (see research question 3) may help explain the quantitative effects of the MLCs on students’ learning.

Research question 3: In what ways do students use the metacognitive exercise to improve their knowledge of chemistry?

To explore how students used the metacognitive exercise, survey and focus group interview data were analyzed from both institutions. Students were asked questions to probe their experience with the MLCs and the metacognitive exercise. A list of the semi-structured focus group interview and open-ended survey questions can be found in Appendix S3 (ESI). Interestingly, the themes independently generated from both semesters’ data were similar. Four themes were identical between the two semesters: conceptual connection, study habits, metacognition, and feedback. Each semester also had one unique theme: in the first semester, students described affective responses; in the second semester, students expressed a dislike for mathematical questions as MLCs. Representative quotes demonstrating these themes for each semester are found in Table 4. Notably, the responses categorized under the theme of metacognition were both much more prevalent and more detailed in the second semester. When students were asked to explain whether MLCs helped them to make connections among topics, 9% of responses (10 out of 116) from the first semester were coded as metacognition, whereas 37% of responses (26 out of 70) in the second semester fell under this category. These findings indicate that the metacognitive exercise added to the MLCs prompted students to think about their learning and mastery of learning objectives. Furthermore, students' responses to the metacognitive exercise were examined to understand what plans students developed for improving future learning. As illustrated in Fig. 3a and b, in the semester in which the metacognitive exercise was added, 87% of students responded that they looked at learning objectives for which they performed poorly on MLCs. In addition, 54% noted that they developed a plan to master these missed objectives.
When responses to the metacognitive exercise were examined in aggregate, almost all students (95%) mentioned they would review their lecture notes pertaining to the topic(s) covered by the missed learning objective(s), and more than half indicated they would make a plan to master those objectives. The majority of the students (80%) indicated they would do more practice problems related to the missed learning objectives. A smaller proportion (11%) indicated they would seek tutoring or help from the instructors. A previous qualitative study showed that college chemistry students greatly improved their time management and reflective learning after receiving metacognitive training in a general chemistry course (Muteti et al., 2020). Its findings indicated that 67% of students (N = 115) reported a positive influence of the intervention; more specifically, more students reported higher-order learning strategies and fewer students reported rote memorization after the intervention. The qualitative results of our study align with those findings while providing another potential path: incorporating a metacognitive exercise aligned with learning objectives into assessments in introductory chemistry courses. In doing so, students’ reflective thinking and subsequent behaviors are more targeted towards the missed learning objectives.
Table 4 Theme and representative quotes from the survey and focus group interview responses
Theme Representative quotation (Semester 1: S1, Semester 2: S2)
Conceptual connection – conceptual understanding and connections made between topics and concepts “…made you think of more than just one aspect of the question/statement and encouraged you to relate/understand on a broader, generalized scale.” S1
“Because in order to answer correctly you have to be able to make connections between the topics and subtopics that are lectured about.” S2
“It actually did help a lot because it was basically everything the instructor taught us in class. So it recapped in case we didn’t understand.” S2
Study habits – change learning behaviors or take specific actions to study “…just study and make connections between chapters.” S1
“…additional opportunity to refresh your knowledge.” S1
“I spent more time studying the topics that I needed to work on.” S2
“I would jot down what concepts I did not understand and felt like I needed more practice with. Later on, I studied those concepts and did many practice problems.” S2
Metacognition – self-monitoring process; students reflect on their difficulty with certain topics and the quality of their learning “It helped me to realize what I know and with what I still have problems, so I can concentrate on it, read a little bit more.” S1
“It helped me by seeing my strengths and weaknesses and focusing on what I needed to focus on.” S2
“They helped me to go back to see my errors and allowed me to fix them by practicing more on that certain topic until I get the hang of it.” S2
“Well, the learning objective, it shows what the topic is and I would from knowing the topic, go through the lecture notes, and then if there was more than I wouldn’t understand I would probably ask the professor about it.” S2
Feedback – perspectives related to feedback obtained from MLCs “It is great to get immediate feedback once the quiz closes so you have it to review for tests and general conceptual review.” S1
“Maybe like whatever question we got wrong, like as a class got wrong, most people, maybe the instructor can elaborate on that [in class] or go over it [in class].” S2
“If the instructor can aggregate that information and reinforce topics where people had issues.” S2
Math (S2 only) – MLC statements require mathematical calculations “I don’t want to see in the future, math-related questions for true/false, because…it's just agonizing to do more math on the normal exam and then we see another math-related question on the true/false [questions].” S2
Affective domain (S1 only) – feelings, self-confidence, motivation “I wish I could explain more.” S1
“…preferred true and false for making my choice easier to choose and gave me confidence for the next question.” S1



Fig. 3 (a) Survey responses about student metacognitive behaviors (N = 47); (b) student plans to master learning objectives collected from the metacognitive exercise (N = 127).

Limitations

One limitation of the study is that the quantitative data were collected only from the community college, because of an emergency class cancellation due to exigent circumstances at the public four-year university. Community colleges in California do not require students to take math placement tests or other standardized tests of academic ability before enrolling in the course, so we could not use such measures to control for students’ background knowledge prior to the course, which might be associated with their performance in the course. Another limitation of this study is its serial nature; this design was chosen because each instructor in the study taught only one section of the course in the semesters studied. We preferred to keep the same instructors so that we could control factors such as course curricula, teaching approaches, and assessments, which are most likely to affect the outcomes of the study. A future parallel study utilizing a control/treatment design in the same semester would help validate these results with more comparable student backgrounds. It might also be of interest for future researchers to study the effect of MLCs in different chemistry courses across multiple universities, and replicating the findings at other types of institutions with diverse student populations may be necessary.

Conclusions and implications

A range of prior studies has shown how instruction and online learning modules can improve students’ metacognition in chemistry courses (e.g., Cook et al., 2013; Casselman and Atwood, 2017; Muteti et al., 2020). Our study provides an example showing that students’ metacognitive thinking and behaviors can also be taught and significantly influenced through a metacognitive exercise embedded in assessments. The multiple regression model showed that MLC scores were a significant predictor of students’ final exam performance in introductory chemistry. Pairing the assessment with a metacognitive exercise aligned with learning objectives greatly improved student performance on the MLCs. Qualitatively, students had overwhelmingly positive impressions of the assessment and demonstrated more metacognitive thinking and behaviors after the modifications were added.

The evidence from the literature combined with our results underscores the importance for instructors to purposefully select assessments and to bring metacognitive awareness into their teaching. Additionally, the metacognitive exercise that was incorporated into the MLCs takes minimal effort for instructors to prepare and requires little lecture time to teach metacognition, while promoting students’ metacognitive thinking and regulating their behaviors to master the learning objectives of the course. The MLCs can be easily integrated into and modified for other introductory chemistry courses; all the exercises can be found in Appendix S2 (ESI). The assessments are ready for immediate use if the same textbook is used; otherwise, the numbering of the learning objectives may need to be modified. As shown in this study, substantial improvement in student performance on the assessment followed the implementation of the metacognitive exercise. We suggest that adopters of MLCs utilize the new model of pairing them with the metacognitive exercise. Given students’ responses that immediate feedback is desired, the MLC is likely best administered online, as through a Learning Management System or other online platform, where the correct answers can be shown to students after they submit their responses. Alternatively, MLCs can be used in recitation or discussion sections, or as group activities; students could also be asked to explain their answers to their peers, ensuring full comprehension of the concepts and providing an opportunity to explain chemistry to others. In addition, the MLCs in the appendix could serve as starting examples; instructors are encouraged to modify the statements based on the learning objectives they want to emphasize in their classes and to utilize the MLCs as both formative and summative assessments.
Because students reflect upon their understanding of specific learning objectives, their responses to the MLCs and the metacognitive exercise could also be valuable for instructors and researchers to track students’ learning trajectories on certain topics or learning objectives over time and to develop interventions when downward trends are observed.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

The authors would like to thank all the participants in the study and the reviewers for their valuable feedback on the manuscript.

References

  1. Akyol G., Sungur S. and Tekkaya C., (2010), The contribution of cognitive and metacognitive strategy use to students' science achievement, Educ. Res. Eval., 16(1), 1–21.
  2. Arjoon J. A., Xu X. and Lewis J. E., (2013), Understanding the state of the art for measurement in chemistry education research: Examining the psychometric evidence, J. Chem. Educ., 90(5), 536–545.
  3. Ausubel D. P., (1968), Educational psychology: A cognitive view, Holt, Rinehart, and Winston.
  4. Avargil S., (2019), Learning chemistry: Self-efficacy, chemical understanding, and graphing skills, J. Sci. Educ. Technol., 28(4), 285–298.
  5. Casselman B. L. and Atwood C. H., (2017), Improving general chemistry course performance through online homework-based metacognitive training, J. Chem. Educ., 94(12), 1811–1821.
  6. Chan J. Y. K. and Bauer C. F., (2016), Learning and studying strategies used by general chemistry students with different affective characteristics, Chem. Educ. Res. Pract., 17, 675–684.
  7. Cohen J., (1960), A coefficient of agreement for nominal scales, Educ. Psychol. Meas., 20, 37–46.
  8. Cook E., Kennedy E. and McGuire S. Y., (2013), Effect of teaching metacognitive learning strategies on performance in general chemistry courses, J. Chem. Educ., 90(8), 961–967.
  9. Flavell J. H., (1976), Metacognitive aspects of problem-solving, in The nature of intelligence, L. B. Resnick (ed.), Hillsdale, NJ: Lawrence Erlbaum, pp. 231–235.
  10. Fraenkel J. R. and Wallen N. E., (2008), How to Design and Evaluate Research in Education, 7th edn, New York: McGraw-Hill.
  11. Gibbs G. and Simpson C., (2005), Conditions under which assessment supports students’ learning, Learn. Teach. High. Educ., 1, 3–31.
  12. Gilewski A., Mallory E., Sandoval M., Litvak M. and Ye L., (2019), Does linking help? Effects and student perceptions of a learner-centered assessment implemented in introductory chemistry, Chem. Educ. Res. Pract., 20(2), 399–411.
  13. Glaser B. G. and Strauss A. L., (1967), The discovery of grounded theory: strategies for qualitative research, New York: Aldine De Gruyter.
  14. Graham K. J., Bohn-Gettler C. M. and Raigoza A. F., (2019), Metacognitive training in chemistry tutor sessions increases first year students’ self-efficacy, J. Chem. Educ., 96(8), 1539–1547.
  15. Landis J. R. and Koch G. G., (1977), The measurement of observer agreement for categorical data, Biometrics, 33, 159–174.
  16. Lewis S. E. L., Shaw J. A. and Freeman K., (2011), Establishing open-ended assessments: Investigating the validity of creative exercises, Chem. Educ. Res. Pract., 12(2), 158–166.
  17. Mutambuki J. M., Mwavita M., Muteti C. Z., Jacob B. I. and Mohanty S., (2020), Metacognition and active learning combination reveals better performance on cognitively demanding general chemistry concepts than active learning alone, J. Chem. Educ., 97(7), 1832–1840.
  18. Muteti C. Z., Zarraga C., Jacob B. I., Mwarumba T. M., Nkhata D. B., Mwavita M., Mohanty S. and Mutambuki J. M., (2020), I realized what I was doing was not working: The influence of explicit teaching of metacognition on students’ study strategies in a general chemistry I course, Chem. Educ. Res. Pract., 22(1), 122–135.
  19. Patton M. Q., (2002), Qualitative Research & Evaluation Methods, SAGE Publ., Inc.
  20. Sawilowsky S., (2009), New effect size rules of thumb, J. Mod. Appl. Stat. Meth., 8(2), 597–599.
  21. Sinapuelas M. L. S. and Stacy A. M., (2015), The relationship between student success in introductory university chemistry and approaches to learning outside of the classroom, J. Res. Sci. Teach., 52(6), 790–815.
  22. Tro N., (2015), Introductory chemistry essentials, 6th edn, Boston, MA: Pearson Education, Inc.
  23. Warfa A. R. M. and Odowa N., (2015), Creative exercises (CEs) in the biochemistry domain: An analysis of students’ linking of chemical and biochemical concepts, Chem. Educ. Res. Pract., 16(4), 747–757.
  24. Ye L. and Lewis S. E., (2014), Looking for links: Examining student responses in creative exercises for evidence of linking chemistry concepts, Chem. Educ. Res. Pract., 15(4), 576–586.
  25. Ye L., Oueini R., Dickerson A. P. and Lewis S. E., (2015a), Learning beyond the classroom: Using text messages to measure general chemistry students’ study habits, Chem. Educ. Res. Pract., 16, 869–878.
  26. Ye L., Oueini R. and Lewis S. E., (2015b), Developing and implementing an assessment technique to measure linked concepts, J. Chem. Educ., 92(11), 1807–1812.
  27. Ye L., Shuniak C., Oueini R., Robert J. and Lewis S., (2016), Can they succeed? Exploring at-risk students’ study habits in college general chemistry, Chem. Educ. Res. Pract., 17, 878–892.
  28. Zhao N., Wardeska J. G., McGuire S. Y. and Cook E., (2014), Metacognition: An effective tool to promote success in college science learning, J. Coll. Sci. Teach., 43(4), 48–54.
  29. Zimmerman B. J., (2001), Theories of self-regulated learning and academic achievement: An overview and analysis, in B. J. Zimmerman and D. H. Schunk (ed.), Self-regulated learning and academic achievement, 2nd edn, Hillsdale, NJ: Erlbaum, pp. 1–37.

Footnote

Electronic supplementary information (ESI) available: Appendix S1: learning objectives for introductory chemistry; Appendix S2: measures of linked concepts assessment; Appendix S3: survey questions and interview protocol. See DOI: https://doi.org/10.1039/d2rp00061j

This journal is © The Royal Society of Chemistry 2022