Safaa Y. El-Mansy,a Jack Barberaa and Alissa J. Hartig*b

aDepartment of Chemistry, Portland State University, Portland, Oregon, USA
bDepartment of Applied Linguistics, Portland State University, Portland, Oregon, USA. E-mail: ahartig@pdx.edu

First published on 22nd November 2021
The level of students’ engagement during active learning activities conducted in small groups is important to understanding the effectiveness of these activities. The Interactive–Constructive–Active–Passive (ICAP) framework is a way to determine the cognitive engagement of these groups by analyzing the conversations that occur while student groups work on an activity. This study used qualitative content analysis and ICAP to investigate cognitive engagement during group activities in a General Chemistry course at the question level, a finer grain size than previously studied. The analysis determined the expected engagement based on question design and the observed engagement based on group conversations. Comparisons of expected and observed engagement showed cases of mismatch, and further analysis determined that incorrect model use, unfamiliar scientific vocabulary, and difficulty moving between molecular representations were all contributing themes to the observed mismatches. The implications of these findings with regard to teaching and research are discussed.
At the same time, evidence supporting the effectiveness of a given strategy can be inconsistent (e.g., Andrews et al., 2011), and simply adding AL strategies to a learning environment does not necessarily lead to the same performance outcomes across groups (Shortlidge et al., 2019). Likewise, a meta-analysis of peer-reviewed studies on the effectiveness of a wide range of AL strategies within chemistry found that the effect size of these practices varied widely, in some cases resulting in no positive impact (Rahman and Lewis, 2020). As Cooper (2016) points out, the umbrella of AL also covers a wide range of classroom practices, making it difficult to define what specific aspects of AL are effective and under what conditions such strategies work.
At a minimum, the effectiveness of any AL strategy depends on learners’ meaningful cognitive engagement with the learning materials (Bonwell and Eison, 1991). While there is little dispute that learners benefit more from active compared to passive learning (Freeman et al., 2014), a broader hierarchy of cognitive engagement has been proposed (Chi, 2009; Chi and Wylie, 2014). The ICAP framework (Chi, 2009; Chi and Wylie, 2014) offers a way to understand the varied outcomes in AL through a hierarchy of four levels of cognitive engagement: Interactive, Constructive, Active, and Passive. In this framework, simply being Active is one of the lower levels of engagement and is less likely to foster students’ understanding than the higher level Constructive or Interactive modes (Chi and Wylie, 2014). In the ICAP framework, students’ level of cognitive engagement is evaluated based on their overt physical and verbal behaviors (Fig. 1). For example, behaviors related to receiving information, such as reading a text or listening to instructions, would indicate Passive engagement. Active engagement would involve physical manipulations of information while learning, such as highlighting or underlining text. During Constructive engagement, students would perform the same physical manipulations that occur in Active engagement; in addition, they would generate output beyond the information provided in the learning materials. Examples of Constructive engagement include summarizing a text or taking notes in one's own words. Similar to the Constructive mode, during Interactive engagement, students would generate new information; however, this generation would occur through dialoguing among students or between students and instructors.
Fig. 1 Modes of cognitive engagement (in bold) and characteristic behavior (in italics) according to the ICAP framework (Chi et al., 2018).
Studies within the ICAP framework have operationalized cognitive engagement by observing students’ physical behaviors (Villalta-Cerdas and Sandi-Urena, 2014; Wiggins et al., 2017), categorizing activities by their broad instructional design features (Wiggins et al., 2017; Henderson, 2019; Lim et al., 2019; Menekse and Chi, 2019), and analyzing student conversations (Chi, 2009; Menekse and Chi, 2019; Liyanage et al., 2021). Each of these approaches has strengths and weaknesses. ICAP studies that examine engagement in terms of students’ physical behaviors have used large-scale observation of overt behaviors at regular intervals at a distance in order to capture whole-class data (i.e., an observer seated in the back of the room with a chart, such as the “live coding” used by Wiggins et al. (2017) or the observation procedures used in Villalta-Cerdas and Sandi-Urena (2014)). While this approach may be able to distinguish Passive engagement from higher ICAP levels, the differences between Active, Constructive, and Interactive engagement are difficult to tease out at this level of granularity. For example, if a student is writing something on a worksheet, this could be simply Active engagement if it involves identifying relevant information on a graph and recording the answer. However, if the student is making inferences based on trends observed in the same graph, this student would be engaging at a Constructive level. What students are saying while engaging in these physical behaviors is essential to determining what level of engagement they reflect.
ICAP studies that rely on the instructional design features of the activity as a whole are based on the idea that the structure of the activity itself will constrain the ways that students can engage with it. For example, to assess the impact of cognitive engagement on learning, Henderson (2019) used a series of instructional conditions designed to reflect various ICAP levels, in which a lecture-based condition was used for Passive engagement, an individual writing activity was used to elicit Constructive engagement, and a peer instruction format was used to prompt Interactive engagement. This focus on coding engagement based on instructional design features assumes, however, that all students in a group will engage at the same level throughout an activity and does not distinguish among the levels of cognitive engagement required for different types of questions or phases within an activity.
These assumptions merit greater scrutiny. Research has shown that the type of activity students participate in can affect the nature of their conversation when working in small groups (Young and Talanquer, 2013). These differences in group conversations may reflect different modes of engagement. A study on small group activities using Peer-Led Guided Inquiry (PLGI) found that students’ construction of arguments varied based on the number of students participating (Kulatunga et al., 2013). It is possible that these students were engaging at different levels. Variations in conversation may also be important in Process-Oriented Guided Inquiry Learning (POGIL) activities (Farrell et al., 1999; Hanson et al., 2018). Through the lens of the ICAP framework, not every part of an activity may elicit the same level of cognitive engagement. For example, POGIL activities involve a three-step learning cycle (Atkin and Karplus, 1962) where students first explore information provided in a model, then identify trends and patterns during the concept invention step, and finally apply the learned concept to new situations (Hanson et al., 2018). The direct questions about a model during the exploration stage of this cycle are meant to ensure that students understand the model on which later parts of the activity are based. In terms of the ICAP framework, many of these questions rely primarily on Active engagement because they ask students to identify and/or reflect on information in a model that is provided for them and do not require the generation of additional information. By contrast, questions from the concept invention and application stages are more likely to elicit Constructive or Interactive engagement because they require students to make inferences that go beyond the information provided in the original model. This type of variation might be expected in any type of scaffolded learning activity.
ICAP studies that examine student conversations have generally used discourse analysis as a means for understanding student engagement during AL activities. Discourse analysis examines texts and talk in context in order to understand participants’ actions (Wood and Kroger, 2000), and in education research, discourse analysis focuses on the role of spoken language in teaching and learning (Cole et al., 2014). Discourse analysis research in chemistry education research has largely focused on patterns of interaction or argumentation in various instructional settings (Kulatunga and Lewis, 2013; Xu and Talanquer, 2013; Young and Talanquer, 2013; Warfa et al., 2014; Current and Kowalske, 2016; Moon et al., 2016; Repice et al., 2016; Shultz and Li, 2016; Stanford et al., 2016; Dohrn and Dohn, 2018). The use of discourse analysis in ICAP studies both within and outside of chemistry education research has generally been oriented toward the coding of individual student conversational turns, for example, the frequency of specific discourse moves (e.g., claim, accept, oppose) (Menekse and Chi, 2019), or the frequency, distribution, and engagement level evident in student conversational turns during small-group discussions (Liyanage et al., 2021).
Discourse analysis can also be applied at a broader level, beyond individual turns. Because the highest two engagement levels outlined in the ICAP theory rely on distinctions that relate not just to what individual students are doing but to how they respond to one another during small-group conversations, coding longer exchanges is especially useful for distinguishing between Constructive and Interactive engagement. As noted above, there is a need to examine the extent to which actual student engagement in an activity matches the planned level of engagement based on the instructional design features of the activity itself. Therefore, using these ICAP levels as coding categories for both the activity design features and for students’ observed engagement as evident in their conversations across different parts of an activity can provide a systematic way of investigating this alignment.
Whereas discourse analysis is useful in understanding how students interact with one another, an alternative method is needed to investigate what is being said, i.e., the content of the conversation. Qualitative content analysis (QCA) is well suited to filling this gap. QCA offers a method for systematically coding the content of textual data, whether verbal or written, to identify patterns (Schreier, 2012). QCA includes both deductive approaches (directed content analysis) and inductive approaches (conventional content analysis) (Hsieh and Shannon, 2005). Conventional content analysis can provide insights into phenomena that are not yet well described (Hsieh and Shannon, 2005). Because little research to date has explored the alignment between the instructional design features of individual parts of an activity and the actual level of engagement that they generate, an inductive approach is better suited to developing an understanding of instances where mismatches occur. Where mismatches between the planned and actual levels of engagement are found, conventional content analysis can be used to examine the content of students’ discussions during these parts of an activity in order to identify patterns or themes that explain these mismatches. Therefore, conventional content analysis can be used to identify patterns as to which specific aspects of question design seem to foster higher or lower engagement across different groups as well as any other relevant themes that arise in students’ conversations.
This study addressed the following research questions:

(1) What range of engagement modes is expected during a general chemistry AL activity based on the question design?
(2) What range of engagement modes is observed during a general chemistry AL activity based on students’ physical and verbal behaviors during group conversations?
(3) If mismatches occur between the expected and observed levels of cognitive engagement, what themes account for these mismatches?
Activity worksheets were developed in-house and structured using a format that included a model containing conceptual material followed by key questions, exercises, and problems. Key questions (KQ) generally asked about information explicitly presented in the model, providing an opportunity for students to gain familiarity with the content. Exercises (EX) included questions that required students to apply the content and infer an answer either conceptually or by performing a calculation. Problems (P) were similar to exercises but tended to be more complex, generally involving multiple steps or novel applications of the model content. The completed activity worksheets were turned in through the learning management system, and a nominal number of points were awarded for participation and attendance during the activity.
| Fall 2020: Group A | Fall 2020: Group B | Winter 2021: Group C |
|---|---|---|
| Nani | Jacob | Nani |
| Beth | Helen | Helen |
| Katie | Grace | Grace |
| Leslie | — | — |
Three activities were observed during fall term and one activity was observed during winter term. The activities during the fall were evenly spaced, with the first one covering the concepts of mole and molar mass occurring near the beginning of the term, the second one covering concepts involving solutions and dilutions occurring near the midway point of the term, and the third activity covering electronegativity and polarity occurring near the end of the term. During winter term, the single activity occurred near the beginning of the term and covered concepts surrounding thermal energy and calorimetry. Each breakout room session was audio and video recorded. These recordings were transcribed verbatim by a transcription service. Transcripts were then reviewed and edited as needed by author S. Y. E. and pertinent physical actions from the participants (e.g., nod of agreement) were added to the transcripts. Unclear conversation was denoted by [XXX] in the transcripts.
Previous work investigating group conversations using ICAP looked at quantitative measures such as frequency of conversational turns or discourse moves (Wiggins et al., 2017; Menekse and Chi, 2019); however, this type of analysis does not provide insight into the relation between the group conversation and the question design. To address this gap, a second round of coding applied the ICAP framework to the group responses to each question in an activity. Each group's response to a question was coded based on the content of the conversation and the definition of each of the ICAP modes. The codebook for both types of coding is presented in Table 2. Each question and group response in the transcripts was coded deductively based on features of the levels of engagement outlined in the ICAP framework (Chi, 2009; Chi and Wylie, 2014).
| Code | Description |
|---|---|
| **Question codes** | |
| Active (A) | Information to answer the question can be found in the provided materials |
| Constructive/Interactive (C/I) | New information needs to be generated to answer the question prompt |
| **Group response codes** | |
| Active (A) | Conversation reflects that an answer was taken from information provided |
| Constructive (C) | One person provides the answer, generating new information; can include forms of agreement from other group members (e.g., head nods, “yeah”, “uh-huh”, etc.) |
| Interactive (I) | Participants generate information to answer the question based on one another's responses; other participants’ contributions of off-topic talk or forms of agreement are not included in this code |
For example, Key Question 6 from the Solutions and Dilutions activity (SD-KQ6) was coded as Active because the information in the model (Fig. 2) explicitly states the required information in the text blurb and in the equation in the gray box at the top of the table.
(SD-KQ6) When making a dilute solution, which of the following remains constant? (i) The concentration (ii) The moles of solute (iii) The volume of the solution.
However, Key Question 9 from the same activity (SD-KQ9) asks students to provide an algebraic expression for MD (i.e., the molarity of the dilute solution). Since this question asks students to manipulate the equation in the model (Fig. 2), they would be generating new information. Therefore, SD-KQ9 was coded as a Constructive/Interactive question.
(SD-KQ9) In preparing for an experiment, you need to know what the concentration of a dilute solution (MD) will be. Provide an algebraic solution using the relation in the model for this concentration.
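A sketch of the algebraic manipulation SD-KQ9 calls for, assuming the model presents the standard dilution relation (the model itself appears in Fig. 2 and is not reproduced here):

```latex
M_C V_C = M_D V_D
\quad\Longrightarrow\quad
M_D = \frac{M_C V_C}{V_D}
```

Because the rearranged form is not stated in the model, producing it requires students to generate new information, consistent with the Constructive/Interactive code.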
In total, 68 questions were coded across the four activities (Table 3). Since the groups did not complete the activities in their entirety during the time allotted, the data includes only those activity questions which had a corresponding group response. Additionally, questions which were answered by both Groups A and B were counted only once. The overall results show that 13 questions were Active and 55 questions were Constructive/Interactive.
| Activity | A, n (%) | C/I, n (%) |
|---|---|---|
| Mole and Molar Mass | 2 (11) | 16 (89) |
| Solutions and Dilutions | 3 (23) | 10 (77) |
| Electronegativity and Polarity | 3 (17) | 15 (83) |
| Thermal Energy and Calorimetry | 5 (26) | 14 (74) |
| Total | 13 (19) | 55 (81) |
In general, the majority of questions (81%) across all activities were Constructive/Interactive. Table 3 shows that the percentage of questions coded as Active varied by activity, making up as much as roughly one quarter of the coded questions. Such variation was not captured in previous studies that coded at the activity level (Menekse et al., 2013; Wiggins et al., 2017; Henderson, 2019).
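As a consistency check, the percentages in Table 3 can be recomputed from the raw counts. This is a minimal sketch; the counts and variable names are taken directly from the table, not from the original coding data:

```python
# Recompute the Table 3 percentages from the raw question counts.
# Tuples are (Active, Constructive/Interactive) counts per activity.
counts = {
    "Mole and Molar Mass":            (2, 16),
    "Solutions and Dilutions":        (3, 10),
    "Electronegativity and Polarity": (3, 15),
    "Thermal Energy and Calorimetry": (5, 14),
}
for activity, (a, ci) in counts.items():
    total = a + ci
    print(f"{activity}: A {a} ({a / total:.0%}), C/I {ci} ({ci / total:.0%})")

a_total = sum(a for a, _ in counts.values())     # 13 Active questions
ci_total = sum(ci for _, ci in counts.values())  # 55 C/I questions
grand = a_total + ci_total                       # 68 questions overall
print(f"Total: A {a_total} ({a_total / grand:.0%}), "
      f"C/I {ci_total} ({ci_total / grand:.0%})")
```

The rounded percentages reproduce the values reported in the table, including the 19%/81% overall split.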
Excerpt 1 illustrates a group response that was coded as Active. In this excerpt, members of Group A are responding to SD-KQ6. Beth's comment (line 261) mentions looking at the equation which is a reference to the model (Fig. 2); therefore, this group response was coded as Active.
(SD-KQ6) When making a dilute solution, which of the following remains constant? (i) The concentration (ii) The moles of solute (iii) The volume of the solution
Excerpt 1: Group response to SD-KQ6, coded as Active
260 KATIE: Okay. So, key question six: “When making a dilute solution, which of the following remains constant?” Circle your response: “One, the concentration, two, the moles of the so-, solute, or three, the volume of the solution.”
261 BETH: It looks from the equation [in the model] that the moles of the solute stay constant.
262 NANI: Yeah.
Excerpt 2, on the other hand, illustrates a group response that was coded as Constructive. This excerpt focuses on Group C's response to Key Question 3 from the Thermal Energy and Calorimetry activity (TEC-KQ3), where students are asked to explain the difference in heat capacity between two blocks. Fig. 3 presents a portion of the model from the Thermal Energy and Calorimetry (TEC) activity.
In Excerpt 2, Helen provides the answer to the question associated with this portion of the model (line 52), and the contributions from Nani and Grace are forms of agreement (lines 53 and 54). Therefore, Helen is the only student generating new information and this group response was coded as Constructive.
(TEC-KQ3) How does the difference in specific heat capacity between blocks 2 and 3 relate to their final temperature? Briefly explain.
Excerpt 2: Group response to TEC-KQ3, coded as Constructive
51 GRACE: So, “How does the difference in specific heat capacity between blocks two and three relate to their final temperature?”
52 HELEN: So it, it’s the same as mass, right? So, like a greater specific heat capacity will result in a lower final temperature.
53 GRACE: Yeah.
54 NANI: (nods).
55 HELEN: So, so block two will have a greater final temperature.
56 GRACE: Mm hmm.
Excerpt 3 gives an example where the coding of the group response was ambiguous. In this excerpt, students from Group A respond to Key Question 7 from the Solutions and Dilutions activity (SD-KQ7). Although the answer is present in the model (Fig. 2), and Katie gives the correct answer (line 271), it is unclear from the conversation whether Katie's response was based on the information in the model (Active) or whether she generated new knowledge (Constructive). In the absence of evidence that the response came from the model, it was assumed that she generated new knowledge, and the group response was coded as Constructive.
(SD-KQ7) When making a dilute solution, which of the following decreases? Circle your response. (i) The concentration (ii) The moles of solute (iii) The volume of the solution
Excerpt 3: Ambiguous group response to SD-KQ7, coded as Constructive
270 BETH: Okay. The sec-, or the seventh quest-, seven, seventh key question is, um, “When making a dilu-, dilute solution, which of the following decreases, circle your response? Um, one, the concentration, two, the moles of the solute, or three, the volume of solution.”
271 KATIE: Wouldn't it be the concentration since we're diluting it?
272 BETH: Yeah, I think so.
Excerpt 4 shows an example of Interactive engagement, where Katie, Leslie, and Beth all contribute new information to solving the calculation in Exercise 5 from the Solutions and Dilutions activity (SD-EX5). Leslie and Katie start by determining what variable they are solving for (lines 400 and 401). Leslie then builds on this by identifying the numerical value for MC, and Katie further contributes new knowledge by mentioning the form of the equation they should use to solve (lines 403 and 404). Beth further builds on this knowledge by providing the numerical solution (line 408). Since Leslie, Katie, and Beth all contribute pieces of information to answer the question and each of their statements builds upon the previous student's comment, this response was coded as Interactive.
(SD-EX5) What is the concentration when 11.75 mL of 0.375 M sucrose is diluted to 50.0 mL?
Excerpt 4: Group response to SD-EX5, coded as Interactive
400 KATIE: Ok, “What is the concentration when 11.75 milliliters of 0.375 molarity or mole?” I don't even know. Sucrose is diluted to 50 milliliters. Okay. So now we're trying to find MC. Again, MC.
401 LESLIE: No, we're find-, we're trying to find, MD now.
402 BETH: Yeah. I think MD.
403 LESLIE: Cause MC is that 0.375.
404 KATIE: Oh yeah, so we're finding...so we would do our MC times VC divided by VD then?
405 BETH: Yeah.
406 LESLIE: Did you guys get there?
407 KATIE: Just about...Oh geez!
408 BETH: Do you guys get 0.0881?
409 LESLIE: Mm hmm.
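The arithmetic the group works through in Excerpt 4 can be verified directly. A minimal check, assuming the standard dilution relation and using the students' own variable names (MC, VC, VD, MD):

```python
# Verify the SD-EX5 calculation from Excerpt 4: diluting 11.75 mL of
# 0.375 M sucrose to a final volume of 50.0 mL.
MC = 0.375   # concentration of the concentrated solution (mol/L)
VC = 11.75   # volume of the aliquot (mL)
VD = 50.0    # volume of the dilute solution (mL)

MD = MC * VC / VD  # concentration of the dilute solution (mol/L)
print(round(MD, 4))  # 0.0881, matching Beth's answer in line 408
```

The volume units cancel, so the milliliter values can be used directly without converting to liters.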
In total, 101 group responses were coded (Table 4). Groups A and B have a different number of response codes for each activity because they moved at different speeds and therefore did not answer the same number of questions. As with the question coding, since students did not complete the activities during the time allotted, coded responses are only for completed questions, not all questions in the activity. Overall, group responses were distributed across the three engagement modes with 8 responses coded as Active, 32 responses coded as Constructive, and 61 responses coded as Interactive. Results indicate that Interactive group responses ranged from 64% to 87% for Group A and from 39% to 77% for Group B across the Mole and Molar Mass, Solutions and Dilutions, and Electronegativity and Polarity activities. Only Group C completed the Thermal Energy and Calorimetry activity, and only 58% of their responses during this activity reached the level of Interactive engagement. Overall, observed engagement levels across groups and across questions within an activity varied widely.
| Activity | Group | A, n (%) | C, n (%) | I, n (%) |
|---|---|---|---|---|
| Mole and Molar Mass | A | 0 (0) | 1 (13) | 7 (87) |
| | B | 1 (6) | 7 (39) | 10 (55) |
| Solutions and Dilutions | A | 1 (9) | 3 (27) | 7 (64) |
| | B | 1 (8) | 2 (15) | 10 (77) |
| Electronegativity and Polarity | A | 0 (0) | 5 (36) | 9 (64) |
| | B | 2 (11) | 9 (50) | 7 (39) |
| Thermal Energy and Calorimetry | C | 3 (16) | 5 (26) | 11 (58) |
| Total | | 8 (8) | 32 (32) | 61 (60) |
| Activity | Group | C/I questions, n | C responses, n (%) | I responses, n (%) |
|---|---|---|---|---|
| Mole and Molar Mass | A | 8 | 1 (13) | 7 (87) |
| | B | 16 | 7 (44) | 9 (56) |
| Solutions and Dilutions | A | 8 | 1 (13) | 7 (87) |
| | B | 10 | 1 (10) | 9 (90) |
| Electronegativity and Polarity | A | 11 | 4 (36) | 7 (64) |
| | B | 15 | 9 (60) | 6 (40) |
| Thermal Energy and Calorimetry | C | 14 | 5 (36) | 9 (64) |
| Total | | 82 | 27 (33) | 55 (67) |
In addition to the variation in response coding seen across activities, variation was also observed across groups (Table 6). For Groups A and B, which completed the same three activities, several of the response codes differed between the two groups on the questions that both groups completed. For example, Table 6 shows that of the 8 completed questions coded as Constructive/Interactive in the Mole and Molar Mass activity, the responses of Groups A and B matched on only 6, all coded as Interactive. The fewest matches between groups were observed on the 11 Electronegativity and Polarity questions, where only 5 of the response codes matched.
| Activity | Matching C responses | Matching I responses |
|---|---|---|
| Mole and Molar Mass (8) | 0 | 6 |
| Solutions and Dilutions (8) | 1 | 5 |
| Electronegativity and Polarity (11) | 2 | 3 |
Upon comparison of question codes to the response codes of each group, mismatches were found exclusively in questions coded as Active. A breakdown of the frequency of questions and group responses coded as Active is shown in Table 7. While 19 total questions were coded as Active, only 8 responses were also coded as Active, a 42% match. This means that more than half of the questions coded as Active had a mismatch with their corresponding group response codes, where students were responding at a higher engagement mode than was indicated by the question design. Among the 11 Active questions which showed a higher group response engagement mode, the responses split almost evenly between Constructive (5) and Interactive (6) engagement.
| Activity | Group | Active questions, n | A responses | C responses | I responses |
|---|---|---|---|---|---|
| Mole and Molar Massa | A | 0 | 0 | 0 | 0 |
| | B | 2 | 1 | 1 | 0 |
| Solutions and Dilutions | A | 3 | 1 | 2 | 0 |
| | B | 3 | 1 | 1 | 1 |
| Electronegativity and Polarity | A | 3 | 0 | 1 | 2 |
| | B | 3 | 2 | 0 | 1 |
| Thermal Energy and Calorimetry | C | 5 | 3 | 0 | 2 |
| Total | | 19 | 8 | 5 | 6 |

a Groups A and B have different numbers of Active questions because Key Questions 1–4 were assigned prior to class; Group A did not discuss them, while Group B went over them as a group before proceeding.
To further investigate these mismatches, conventional content analysis was used to identify potential causes by examining each mismatched question and group response for specific phrases that pointed to the source of the mismatch. Causes were then collected into common themes. Table 8 summarizes these results. Each of the questions in these mismatched cases was coded as Active because the information needed to answer the question was explicitly available in the activity.
| Theme | Activity | Group | Mismatched cases |
|---|---|---|---|
| Model use | Thermal Energy and Calorimetry | C | 2 |
| | Mole and Molar Mass | B | 1 |
| Unfamiliar vocabulary | Solutions and Dilutionsa | A | 1 |
| | | B | 1 |
| Molecular representations | Electronegativity and Polaritya | A | 2 |
| | | B | 1 |
| Ambiguous | Solutions and Dilutionsa | A | 1 |
| | | B | 1 |
| | Electronegativity and Polarity | A | 1 |

a The mismatched cases in these activities occurred on the same questions in Groups A and B.
For example, in Excerpt 5, Group C responds to Key Questions 4 and 5 from the Thermal Energy and Calorimetry activity (TEC-KQ4 and TEC-KQ5). Since the answers to both of these questions are explicitly stated in the model (Fig. 3), both questions were coded as Active. Although the group response to TEC-KQ4 did refer to the model and was coded as Active, the response was incomplete. The correct response should have included both ΔT and q, but Helen and Grace used the model to decide that the answer should only include ΔT (lines 68–71). Because of this incomplete use of the model, Helen and Grace engaged interactively to answer the next question in the activity, TEC-KQ5, which built upon the aspects of the model highlighted in TEC-KQ4. This interaction starts at line 72, with Grace's realization that they need two variables. From there, Helen builds on this, suggesting the two variables are Ti and Tf (line 73). Although the final answer they come to is incorrect, one can see that it is the incomplete use of the model in TEC-KQ4 that prompts the Interactive engagement in TEC-KQ5.
TEC-KQ4: When mathematically determining q, which variables can be positive or negative?
TEC-KQ5: How are the two variables in KQ4 related?
Excerpt 5: Example of incomplete model use
68 GRACE: It s-, it shows at the top model [referring to the model in Box 2], which ones. So...
69 HELEN: Yeah, it does. So only ΔT.
70 GRACE: Yeah. ΔT, and if you want to include the thermal energy, you could say that, but we're already talking about it, so...
71 HELEN: Yeah. I don't think you would include q.
72 GRACE: And then, “How are the two variables related?” Um...Oh, they said the two variables. Okay. So, you can’t include q. It’s the same thing...
73 HELEN: Okay. No, no. So, it's temperature final and temperature initial.
74 GRACE: Oh, those are the two variables. Ohhh...
75 HELEN: Yeah.
76 GRACE: Okay. Never mind. Um, But the temp...Oh yeah. The temp can be negative.
77 GRACE: Well if one's, if one's, it depends on which one, if the final's higher than the initial, then you get a positive number. If the initial's higher than the final, you get a negative number. So I suppose that's how it's related...right.
(SD-KQ8) In a dilution, which is always larger? Circle your response. (i) The volume of the aliquot (ii) The volume of the final solution.
Excerpt 6: Example of unfamiliar vocabulary
166 HELEN: I know it's the second one, but what exactly is the ali- aliquot? Cause I know [XXX] fairly small, so small sample or whatever.
167 JACOB: I guess the aliquot would be, do you think it would be the given volume?
168 GRACE: I'm just looking it up.
169 JACOB: Fair enough.
170 HELEN: What does Google say?
171 GRACE: A portion of a larger whole, a specific sample taken for chemical analysis or other treatment. I think it’s like a portion of the sample. So the portion is obviously going to have less.
172 JACOB: So in a dilution, which is yeah, the volume of final solution will be larger.
Key Question 8 from the Electronegativity and Polarity activity (EP-KQ8) asks students to explain why DL2 is a polar molecule. Since this information is depicted in the model (Fig. 4), this question is coded as Active. Students in Groups A and B seemed to have difficulty moving between the Lewis structure representation and bond dipole representation of molecules. In Excerpt 7, the Interactive engagement of Group A is prompted by Beth asking about the number of arrows that should be drawn (line 285). Leslie builds on the question explaining that she drew three arrows (line 286), and the instructor (INST in excerpt) builds further, adding new information that there should be two component arrows for each bond dipole (line 290).
EP-KQ8: Using the blank Cartesian coordinate system, draw the x- and y-components of each bond and use them to explain why DL2 is a polar molecule.
Excerpt 7: Example of molecular representations (Group A)
285 BETH: For this one, do you only draw two arrows or should there be more than two?
286 LESLIE: I'm doing three for that one. So like the two going on the X and then the one going down for the Y.
287 BETH: OK.
288 INST: [Key Question] Eight. Okay. And are you looking at it or have you talked about it?
289 BETH: Um, we've talked about how many arrows to draw and um, I think we decided on drawing like three arrows. Uh, I drew like two, um, on the X axis, like going different directions and then one down on the Y axis.
290 INST: Okay. So for each of the diagonal arrows, they have both an X and a Y component. Yeah. So the downward, yes. I see what, you're what you're drawing, Katie. So you, so you have for each of the diagonals, you have an X and a Y. And so for this one, you have an X and a Y. So you actually have two downward arrows on the Y axis.
291 BETH: Two downwards? Ok.
While the Interactive engagement in Excerpt 7 was prompted by difficulty in translating between the Lewis structure and the representation depicting bond dipoles, in Excerpt 8 the trigger for Interactive engagement is a desire to understand more deeply the role of specific features of the Lewis structure (i.e., lone pairs of electrons) in the dipole representation. In Excerpt 8, Group B engages interactively to try to gain a deeper understanding of what the vector model of dipoles represents. Their response to the same question begins with a discussion of the Lewis structure to identify the molecular geometry (lines 368–371). From there, they reference the model to determine how to draw the components of the bond dipoles (lines 373–386). Lines 387–396 show the group generating new information as they attempt to make the connection between the lone pairs of electrons in the Lewis structure and the bond dipoles. In lines 385 and 386, both Helen and Jacob directly refer to Fig. 4 in the model, stating that the answer is there (Active engagement). However, Grace's desire to understand how the lone pairs fit into the vector representation causes the group to engage at the higher Interactive mode (lines 387 and 393). In both groups’ conversations, it is apparent that students’ attempts to move from the Lewis structure representation of the molecule to the vector model of bond dipoles triggered the higher mode of engagement.
Excerpt 8: Example of molecular representations (Group B)
368 GRACE: Oh, and this one has lone pairs. What kind of structure does that make?
369 JACOB: The chart's...DL2, lone pairs.
370 JACOB: It's bent.
371 GRACE: I think bent?
372 JACOB: Yeah.
373 JACOB: Cause if we're looking at the model, um, the model gives like the best description of it above, uh, for the DL2. So net molecular dipole due to bent geometry. And it shows you below what that bent geometry looks like on the planes.
374 GRACE: So for this we're doing four.
375 JACOB: And it's asking us why it's polar.
376 GRACE: Oh, I'm assuming they don't cancel each other out.
377 HELEN: The left and right aspects do, but they still have a net, like, down.
378 GRACE: Wait, what?
379 JACOB: It has a net molecular dipole.
380 GRACE: Yeah. No...Have you guys started drawing the coordinate? I don't know how they'll look. Are they pointing down Y?
381 JACOB: They're pointing down, yeah, Y.
382 GRACE: Okay. At what angle?
383 HELEN: Like 45ish each in the third and fourth quadrant.
384 JACOB: Like it's coming out of the origin.
385 HELEN: I mean, it's just like the green and pink arrows in Fig. 4 is what I drew. But on one axis or like one....
386 JACOB: Same. I simply, I literally don't know why, like I know, but I also like don't know, so I just looked at Fig. 4 that has the answer so... Well, it has like what we're supposed to be gathering from it.
387 GRACE: What about the lone pairs?
388 JACOB: Um, it shows in Fig. 4, like kind of, uh, the lone pairs are kind of like on the arrows or like, do you see Fig. 4?
389 GRACE: Oh yeah.
390 JACOB: So that's kind of what Fig. 4 does with the...
391 GRACE: So I've got two of them with the arrows pointing opposite ways in the third and fourth quadrants.
392 JACOB: Yes.
393 GRACE: What are the ones for the lone pairs?
394 HELEN: It just says of each bond. I don't think you have to worry about the lone pairs.
395 JACOB: And then ask why it's polar. And um, like Helen said, it doesn't cancel because of the net molecular dipole.
396 HELEN: I said it's polar because though the dipoles cancel out in the x-direction, they have a net downward dipole moment. I don’t think that's like correct language, but...
397 JACOB: I mean, I think it's it, but it gets your point across.
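The vector reasoning the group works through in Excerpt 8 can be summarized with a short calculation (the notation is ours, not the activity's): for a bent molecule with two equivalent bond dipoles of magnitude \(\mu\), each making an angle \(\theta\) with the \(-y\) axis,

```latex
\vec{\mu}_1 = (-\mu\sin\theta,\; -\mu\cos\theta), \qquad
\vec{\mu}_2 = (+\mu\sin\theta,\; -\mu\cos\theta)
```

```latex
\vec{\mu}_{\text{net}} = \vec{\mu}_1 + \vec{\mu}_2 = (0,\; -2\mu\cos\theta) \neq \vec{0}
```

The \(x\)-components cancel while the \(y\)-components add, yielding the net downward dipole Helen describes in line 396 and explaining why the molecule is polar.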
RQ1: What range of engagement modes are expected during a general chemistry AL activity based on the question design?
This study used a finer grain size, i.e., identifying engagement modes at the question level rather than the activity level. Results indicated that across the four activities observed, the majority of questions (81%) were designed to elicit Constructive or Interactive engagement. Investigation at this finer grain size confirms that not all questions were designed with the same mode of engagement in mind, and therefore studies which assume a single engagement mode for the entire activity may miss insights that can be seen when looking at engagement at the question level.
RQ2: What range of engagement modes are observed during a general chemistry AL activity based on students’ physical and verbal behaviors during group conversations?
The study also identified observed engagement modes of student groups by using ICAP to examine group conversations. Results indicated that within a single activity, the engagement of the group based on their conversation varied from Active to Interactive, with the majority of the group responses (60%) showing Interactive engagement. Additionally, within each group, the percentage of Interactive responses was not consistent across all activities (64–88% for Group A; 39–77% for Group B). These results provide further evidence that coding engagement at the question level for both questions and responses can give insight into students’ engagement which is lost when coding at the activity level.
RQ3: If mismatches occur between the expected and observed levels of cognitive engagement, what themes account for this mismatch?
By comparing the expected engagement mode based on the question design with the observed engagement mode based on the group responses, cases of mismatch were identified. The group conversations were then further investigated using qualitative content analysis for common themes that caused the mismatches. Results suggested that the higher-than-expected observed engagement levels were related to three themes: model use, unfamiliar vocabulary, and difficulty moving between molecular representations.
Use of new and potentially unfamiliar scientific terms can spark students’ curiosity and lead to higher modes of engagement. This idea was supported in this study, where use of the unfamiliar term “aliquot” resulted in more conversation and a higher engagement mode. Although there is a danger that discussion of such vocabulary could result in unhelpful, tangential conversations, group discussions around the term “aliquot” seemed to help students reason out an answer to the question. In addition, learning relevant new vocabulary is essential to students’ growth as scientists. Therefore, use of unfamiliar vocabulary that is relevant to the concept being taught can be a useful tool to promote student learning.
It should be noted that although ICAP states that cognitive engagement increases as one moves from Passive to Active to Constructive to Interactive, it should not be inferred that Interactive engagement is always the most desirable. As shown in this study, these higher-than-expected modes were due to a variety of factors that could provide insight for future improvements in the activities or instructional practices. Worksheets for these activities were structured such that students begin with Key Questions, which are designed to orient students to the pertinent information in the model (i.e., Active engagement), followed by Exercises and Problems, which allow students to manipulate and apply the information in a more advanced manner (i.e., Constructive or Interactive engagement). By scaffolding worksheets in such a manner, students use knowledge gained at the lower engagement modes to foster a deeper understanding during the more complex Exercises and Problems.
This journal is © The Royal Society of Chemistry 2022