Alex Gilewski,a Mikhail Litvak,b and Li Ye*c
aDepartment of Science, Cerro Coso Community College, Ridgecrest, 3000 College Heights Blvd, Ridgecrest, California 93555, USA. E-mail: alex.gilewski@cerrocoso.edu
bDepartment of Biology, California State University, Northridge, 18111 Nordhoff Street, Northridge, California 91330, USA
cDepartment of Chemistry and Biochemistry, California State University, Northridge, 18111 Nordhoff Street, Northridge, California 91330, USA. E-mail: li.ye@csun.edu
First published on 13th June 2022
Previous research has shown the validity of the Measure of Linked Concepts (MLC) assessment in college introductory chemistry. Herein, we present a novel model of pairing the MLC with a metacognitive exercise aligned with the learning objectives of introductory chemistry courses, as an effective tool for instructors to integrate into instruction and assessments. A mixed-methods approach was used to explore the effectiveness of MLCs with the metacognitive exercise on students’ performance and on their metacognitive thinking and behaviors. The study was implemented in introductory chemistry at two institutions over two semesters. The multiple regression model showed that MLC scores significantly predicted students’ final exam scores in introductory chemistry. Students’ MLC scores improved significantly (18% higher) after pairing with the metacognitive exercise. Notably, the theme of metacognition was much more prevalent in the qualitative data when the MLC was paired with the metacognitive exercise than without it (37% vs. 9%). More importantly, most of the participants (87%) reported that they looked up the learning objectives associated with the MLC statements they missed. More than half of the participants indicated they made plans to master the missed learning objectives, such as reviewing lecture notes on those topics, doing more practice problems related to the missed learning objectives, or seeking help from tutors or instructors. Pairing MLCs with the metacognitive exercise shows promising signs of improving student performance and metacognition. The authors suggest that adopters of MLCs utilize this new model of pairing them with the metacognitive exercise.
Meaningful learning can be aided through various learning tools. Several assessment techniques have been reported in the literature to help students make connections between concepts. One of these assessments is Creative Exercises (CEs), in which students are asked to write original statements in response to a given prompt such as “34 grams of carbon dioxide” (Lewis et al., 2011; Ye and Lewis, 2014; Warfa and Odowa, 2015; Gilewski et al., 2019). For a student to receive credit, statements must be accurate, germane to the prompt, and distinct from other statements in terms of which concepts are being applied. CEs have been previously used to demonstrate students’ linking of chemistry concepts. Students were able to link more concepts over time and were shown to have performance gains with CEs. Survey data revealed that CEs helped students reinforce prior concepts and connect concepts in the chemistry courses, giving them more flexibility and providing an additional study tool (Gilewski et al., 2019). Although the open-ended nature of CEs gives students the autonomy to retrieve prior knowledge and actively apply it to new contexts, it limits the instructors’ flexibility in showing the relevance of particular prior concepts to new concepts, and the grading for larger classes can be time-consuming. Thus, the research team developed an alternative assessment termed the Measure of Linked Concepts (MLC), in which students are given a prompt focused on the new topic they learned in the chemistry course, along with a range of statements covering relevant chemical concepts related to the prompt (Ye et al., 2015a, 2015b). Students are asked to evaluate the correctness of each statement. The statements upon which the MLC assessments were created were originally based on student responses to CEs.
For example, when students learn the molecular shapes of chemical compounds, which is usually taught toward the end of the semester of the introductory course, the instructor could provide a prompt with a chemical compound with its Lewis structure and link a variety of concepts for students to evaluate. The statements could not only measure current concepts such as molecular shapes, bond polarity, and molecular polarity but also show the relevance of prior concepts like classification of matter, the periodic table, nomenclature, etc. (see the example MLC in Table 1). MLCs provide instructors a tool for linking prior knowledge with subsequent topics. The closed-ended nature of the assessment enables instructors and students to get immediate feedback on the prevalence of students’ incorrect responses. While MLC assessments have been studied on an item-by-item basis to evaluate student understanding of specific topics and to inform future instruction, they have not yet been demonstrated to have a predictive effect on overall student performance in chemistry (Ye et al., 2015a, 2015b).
Metacognition has also been identified as a factor in improving student performance in chemistry. Metacognition, meaning “thinking about thinking,” has been documented since the days of Aristotle; however, the definitions and ramifications of metacognition in education were deeply examined by John Flavell in the 1970s. Metacognition can be described as a self-monitoring process: students reflect on the quality of their learning, their difficulty with certain topics, or the effectiveness of their study skills (Flavell, 1976). Various models have been proposed to describe how learners could apply metacognitive skills and improve performance in science courses (Akyol et al., 2010; Sinapuelas and Stacy, 2015; Chan and Bauer, 2016; Ye et al., 2016). Although the models vary across contexts, their shared notion is that learners need to monitor and regulate their learning processes (Zimmerman, 2001). In the context of college chemistry courses, research studies indicated that metacognition can be improved through purposeful training (Cook et al., 2013; Zhao et al., 2014; Casselman and Atwood, 2017; Avargil, 2019; Graham et al., 2019; Mutambuki et al., 2020; Muteti et al., 2020). Cook and colleagues gave students in general chemistry a 50 minute metacognition lecture that introduced them to a range of metacognitive activities (e.g., the study cycle, rewriting lecture notes, pretending to teach, previewing material, and working on homework problems without any examples present). The results showed that students who attended the lecture earned grades one letter grade higher than students who did not attend (Cook et al., 2013).
The study highlighted the importance of dispelling students’ notions that high school-level learning skills (e.g., memorization) would suffice in college, and the authors suggested that the timing of the lecture is critical: giving students the opportunity to reflect on their learning shortly after high-stakes assessments is optimal. In a follow-up study, a separate metacognitive lecture was implemented in general chemistry I and II shortly after the first examination. The study found that the metacognition lecture increased students’ chemistry self-concept in general chemistry I, and students had a higher success rate in general chemistry II as compared to previous years (Zhao et al., 2014). A recent study compared student performance in the college general chemistry I course between a treatment group (explicit teaching of metacognition by a lecture combined with active learning) and a comparison group (active learning alone). The results showed the treatment group scored 5% higher on the common cumulative final exam (Mutambuki et al., 2020). The accompanying qualitative study showed that metacognition training in lectures led to students using fewer ineffective learning strategies, such as rote memorization, and gaining higher-order thinking skills like applying and analyzing (Muteti et al., 2020). In addition to lectures, metacognition training has also been implemented through tutoring sessions, online learning modules, and homework (Casselman and Atwood, 2017; Avargil, 2019; Graham et al., 2019). For instance, Casselman and Atwood reported that students who completed metacognition prompts, which asked them to predict their scores on the homework assignments and create a future study plan, outperformed students who did not answer those prompts by approximately 4% on the American Chemical Society (ACS) general chemistry final exam (Casselman and Atwood, 2017).
Avargil investigated how metacognitive prompts in online learning modules increased students’ self-efficacy, graphing skills, and conceptual understanding of chemistry. The group given metacognitive prompts, in which students were asked to reflect on their chemical understanding and graphing skills, outperformed the group whose modules contained only content knowledge on the post-test when controlling for the pre-test (Avargil, 2019).
Research studies have shown that metacognitive skills play an important role in increasing students’ learning and outcomes in college chemistry courses. Metacognition can be taught by lecture or embedded into online learning modules or homework as an effective tool for instructors to integrate into instruction. However, very few studies have incorporated metacognitive training into assessment practices. Our research team designed a new version of MLCs, associated with learning objectives and a metacognitive exercise, informed by the existing literature and our prior study. The metacognitive exercise encourages students to reflect on their performance on specific learning objectives and develop plans to address gaps in their understanding of chemistry concepts immediately after they receive feedback on MLCs. We hypothesized that those practices would improve students’ metacognition and learning outcomes. The purposes of the current study are: first, to investigate the predictive effect of an assessment (i.e., MLCs) designed to help students connect concepts in introductory courses; second, to integrate learning objectives and a metacognitive exercise into the MLCs and examine how the new components make a difference in student learning. Herein, we present a mixed-methods study exploring the effect of MLCs in introductory chemistry at two institutions. Our research questions were:
(1) Do MLCs predict student performance on the final exam in introductory chemistry?
(2) To what extent does the metacognitive exercise aligned with learning objectives improve student performance?
(3) In what ways do students use the metacognitive exercise to improve their knowledge of chemistry?
In addition, surveys were administered at both institutions at the end of each semester. One focus group interview was conducted at each institution at the end of the second semester. Students were given a gift card for their participation in the interview, and several gift cards were raffled off among all students who submitted responses to the surveys to promote the response rate. This project was approved by the Institutional Review Board and Human Subjects Committee at the respective institutions. As per the request of the Institutional Review Board, students were given informed consent forms to affirm their participation in the study. All but one of the students gave permission to be included in the study; the one student who did not, in the first semester, later dropped the course, so their data are not included.
In the first semester, each MLC consisted of a chemistry prompt presenting a real-world scenario with a target concept and a series of statements assessing previous concepts taught in the course (version 1). In the second semester (version 2), two changes to the MLC were implemented: first, students were given a list of learning objectives, and the MLC statements were revised to indicate which learning objective(s) a particular item was testing; second, a metacognitive exercise was added to the weekly quizzes after students reviewed their quiz results. The students were asked: “Which learning objectives did you miss on the last [MLC] quiz? What is your plan to master these learning objectives in the future?” Fig. 1 and Table 1 show the graphical abstract of the second version and an example of the MLCs. The red annotations in Fig. 1 show the changes made in the second version as compared to the first version. The learning objectives for each institution, as well as all the MLCs used in the study, can be found in Appendices S1 and S2 (ESI†).
The survey responses and interviews from the two institutions were used for research question 3. The qualitative data were combined because we did not observe any apparent differences in the emerging themes from the survey and interview data between the two institutions. The interviews were transcribed verbatim by two chemistry education researchers, who cross-checked the entirety of each other's work for accuracy. The interview transcripts and survey responses were then reviewed by one of the transcribers and another researcher for thematic analysis. The two then independently coded statements in one of the interview transcripts as belonging to the emerging themes through open-coding and constant comparative methods (Glaser and Strauss, 1967; Patton, 2002). The interrater reliability (Cohen's κ = 0.89) was determined to be excellent, and one researcher coded the remainder of the data (Cohen, 1960; Landis and Koch, 1977).
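As a minimal illustration of the agreement statistic reported above, Cohen's κ can be computed directly from two coders' labels over the same set of statements. The theme labels below are hypothetical examples, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: fraction of items both coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    f1, f2 = Counter(coder1), Counter(coder2)
    p_e = sum(f1[c] * f2[c] for c in f1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical theme codes assigned by two independent coders
coder_a = ["metacognition", "study habits", "feedback", "metacognition", "math"]
coder_b = ["metacognition", "study habits", "feedback", "study habits", "math"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.74
```

Because κ subtracts the agreement expected by chance, it is a more conservative measure than raw percent agreement, which is why values above roughly 0.8 are conventionally labeled excellent (Landis and Koch, 1977).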
| | Semester 1 | Semester 2 |
|---|---|---|
| Note: only data from institution 1 (community college) was used. | | |
| N | 70 | 40 |
| % Female | 57 | 68 |
| % URM | 25 | 33 |
| Avg. MLC score (max. 10) | 5.0 | 6.8 |
| Final exam score (%) | 55.6 | 57.8 |
To explore student performance on MLCs in the context of their overall learning, a standard multiple regression analysis was performed with data combined from both semesters. Midterm exam scores, student demographics including gender and URM status, and MLC scores were used as predictors to predict students’ final exam scores in the model. MLC scores were found to be a statistically significant predictor of final exam scores in the course. The results of the multiple regression analysis can be found in Table 3. The regression equation is:
| Final exam % = −66.878 + 3.248 × Gender + 6.208 × URM status + 0.997 × Midterm exam average % + 7.594 × MLC average |
| Model | Unstandardized B | Standard error | Standardized B | t | Sig. |
|---|---|---|---|---|---|
| Note: only data from institution 1 (community college) was used. a Indicates a significant predictor in the regression model. | | | | | |
| (Constant) | −66.878 | 11.472 | — | −5.830 | <0.001a |
| Gender | 3.248 | 3.441 | 0.067 | 0.944 | 0.348 |
| URM status | 6.208 | 3.955 | 0.119 | 1.570 | 0.121 |
| Midterm exam average | 0.997 | 0.096 | 0.704 | 10.357 | <0.001a |
| MLC average | 7.594 | 1.389 | 0.401 | 5.467 | <0.001a |
According to the multiple regression model, the independent variables significantly predicted final exam scores, explaining 65.0% of the variance. Students could expect about a 7.6% increase in their final exam score for each point their MLC average score rose (to a maximum of 10 points) when holding the other independent variables constant. Unsurprisingly, the midterm exam average was found to be the strongest predictor in the model; a 1% increase in average midterm exam score was predicted to increase the final exam score by about 1% when holding all the other variables constant. There was no significant difference in the final exam based on gender or URM status.
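The interpretation above can be sketched by applying the regression equation directly. The coefficients are taken from Table 3, but the 0/1 indicator coding of gender and URM status and the student profile are assumptions for illustration only:

```python
# Coefficients from the reported multiple regression model (Table 3)
INTERCEPT = -66.878
B_GENDER  = 3.248   # assumed 0/1 indicator coding (illustrative)
B_URM     = 6.208   # assumed 0/1 indicator coding (illustrative)
B_MIDTERM = 0.997   # per 1% of midterm exam average
B_MLC     = 7.594   # per point of MLC average (max. 10)

def predicted_final_exam(gender, urm, midterm_pct, mlc_avg):
    """Predicted final exam % for the given predictor values."""
    return (INTERCEPT + B_GENDER * gender + B_URM * urm
            + B_MIDTERM * midterm_pct + B_MLC * mlc_avg)

# Hypothetical student: gender = 1, non-URM, 70% midterm average, MLC average 6
print(round(predicted_final_exam(1, 0, 70.0, 6.0), 1))  # → 51.7
```

Raising this hypothetical student's MLC average by one point adds the B_MLC coefficient, about 7.6 percentage points, to the predicted final exam score, matching the interpretation in the text.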
MLCs have been previously implemented in the general chemistry course at a public research university (Ye et al., 2015a, 2015b), although their effect on other assessment scores had not yet been demonstrated in the literature. The results of this study indicate a beneficial effect resulting from the addition of MLCs to the introductory chemistry assessment toolkit. As the MLC assessment is designed to help students connect concepts between topics in introductory chemistry courses, the meaningful learning framework suggests that connecting concepts will strengthen long-term retention (Ausubel, 1968). Evidence of the strong predictive effect of MLC scores on final exam scores indicates the value of implementing MLCs in the introductory chemistry course. These results mirror the study previously published by the authors on the effectiveness of the precursor of MLCs, Creative Exercises, in improving students’ linking of concepts and academic performance in introductory chemistry (Gilewski et al., 2019).
Fig. 2 (a) Average MLC scores (max. 10) between the two semesters, p < 0.001; (b) average final exam scores between the two semesters, p = 0.679.
Research evidence has shown assessment practices not only support student learning but also have an overwhelming influence on how and how much students study (Gibbs and Simpson, 2005; Ye et al., 2015a, 2015b). The qualitative data in this study (see research question 3) might explain and address the quantitative effects of the MLCs on students’ learning.
| Theme | Representative quotation (Semester 1: S1, Semester 2: S2) |
|---|---|
| Conceptual connection – conceptual understanding and connections made between topics and concepts | “…made you think of more than just one aspect of the question/statement and encouraged you to relate/understand on a broader, generalized scale.” S1 |
| “Because in order to answer correctly you have to be able to make connections between the topics and subtopics that are lectured about.” S2 | |
| “It actually did help a lot because it was basically everything the instructor taught us in class. So it recapped in case we didn’t understand.” S2 | |
| Study habits – change learning behaviors or take specific actions to study | “…just study and make connections between chapters.” S1 “…additional opportunity to refresh your knowledge.” S1 |
| “I spent more time studying the topics that I needed to work on.” S2 | |
| “I would jot down what concepts I did not understand and felt like I needed more practice with. Later on, I studied those concepts and did many practice problems.” S2 | |
| Metacognition – self-monitoring process; students reflect on their difficulty with certain topics and the quality of their learning | “It helped me to realize what I know and with what I still have problems, so I can concentrate on it, read a little bit more.” S1 |
| “It helped me by seeing my strengths and weaknesses and focusing on what I needed to focus on.” S2 | |
| “They helped me to go back to see my errors and allowed me to fix them by practicing more on that certain topic until I get the hang of it.” S2 | |
| “Well, the learning objective, it shows what the topic is and I would from knowing the topic, go through the lecture notes, and then if there was more than I wouldn’t understand I would probably ask the professor about it.” S2 | |
| Feedback – perspective related to feedback obtained from MLCs | “It is great to get immediate feedback once the quiz closes so you have it to review for tests and general conceptual review.” S1 |
| “Maybe like whatever question we got wrong, like as a class got wrong, most people, maybe the instructor can elaborate on that [in class] or go over it [in class].” S2 | |
| “If the instructor can aggregate that information and reinforce topics where people had issues.” S2 | |
| Math (S2 only) – MLC statements require mathematical calculations | “I don’t want to see in the future, math-related questions for true/false, because…it's just agonizing to do more math on the normal exam and then we see another math-related question on the true/false [questions].” S2 |
| Affective domain (S1 only) – feelings, self-confidence, motivation | “I wish I could explain more.” S1 |
| “…preferred true and false for making my choice easier to choose and gave me confidence for the next question.” S1 |
Fig. 3 (a) Survey responses about student metacognitive behaviors (N = 47); (b) student plans to master learning objectives collected from the metacognitive exercise (N = 127).
The evidence from the literature combined with our results underscores the importance for instructors of purposefully selecting assessments and maintaining metacognitive awareness in their teaching. Additionally, the metacognitive exercise incorporated into MLCs takes minimal effort for instructors to prepare and requires little lecture time to teach metacognition, while promoting students’ metacognitive thinking and their regulation of behaviors to master the learning objectives of the course. The MLCs can be easily integrated into and modified for other introductory chemistry courses; all the exercises can be found in Appendix S2 (ESI†). The assessments are ready for immediate use if the same textbook is used; otherwise, the numbering of the learning objectives may need modification. As evidenced in this study, substantial improvement in student performance on the assessment was made after the implementation of the metacognitive exercise. We suggest that adopters of MLCs utilize the new model of pairing them with the metacognitive exercise. Given the student responses that immediate feedback is desired, the MLC is likely best instituted online, such as through a Learning Management System or other online platform, where the correct answers can be shown to students after they submit their responses. Alternatively, MLCs can be used in recitation or discussion sections, or as group activities; students could also be asked to explain their answers to their peers, ensuring full comprehension of the concepts and providing an opportunity to explain chemistry to others. In addition, the MLCs in the appendix can serve as starting examples; instructors are encouraged to modify the statements based on which learning objectives they want to emphasize in their classes and to utilize the MLCs as both formative and summative assessments.
Because students reflect upon their understanding of specific learning objectives, their responses to the MLCs and the metacognitive exercise could also be valuable for instructors and researchers to track students’ learning trajectories on certain topics or learning objectives across time and to develop interventions when downward trends are observed.
Footnote
| † Electronic supplementary information (ESI) available: Appendix S1: learning objectives for introductory chemistry; Appendix S2: measures of linked concepts assessment; Appendix S3: survey questions and interview protocol. See DOI: https://doi.org/10.1039/d2rp00061j |
| This journal is © The Royal Society of Chemistry 2022 |