Eleni K. Geragosian a, Diana Zhu b, Marc Skriloff b and Ginger V. Shultz *b
aDepartment of Chemistry & Biochemistry, University of Detroit Mercy, 4001 W. McNichols Rd., Detroit, Michigan, USA
bDepartment of Chemistry, University of Michigan, Ann Arbor, Michigan 48109, USA. E-mail: gshultz@umich.edu
First published on 30th October 2023
Chemistry graduate teaching assistants (GTAs) have substantial face time with undergraduate students at large research institutions, where they lead discussion and lab sessions. Emerging research describes GTAs’ content and teaching knowledge for introductory chemistry classes, but we need to know more about how GTAs manage their classes in the moment and how they assess student learning during class time. We conducted classroom observations and post-observation interviews with six chemistry GTAs with varying years of teaching experience who were teaching a variety of classes (e.g., general chemistry discussion, biochemistry discussion, organic chemistry lab, and computational chemistry lab). These GTAs were each observed and interviewed multiple times over the course of a semester. Through qualitative analysis guided by the teacher noticing framework, we describe what chemistry GTAs notice, or pay attention to, regarding student learning in their teaching sessions and how they interpret what they notice. We found that chemistry GTAs often paid attention to the types of questions that students asked but relied on students to take the initiative to ask questions in order to assess their learning. GTAs also often focused on superficial features of their class sessions to assess learning, such as whether students finished their tasks and left the session early. However, some GTAs noticed more sophisticated evidence of student understanding, such as when students connected content covered across multiple class sessions. The results of this study contribute to our understanding of how chemistry GTAs lead their sessions and evaluate student learning during them. The results serve to inform potential training designs that can support chemistry GTAs’ teacher learning through learning to notice, and to create opportunities to notice, significant features of their classrooms.
Chemistry GTAs’ instruction is also influenced by the context in which they teach. GTAs at research institutions have reported feeling that teaching is not valued in their departments, which inhibits their development as instructors (Luft et al., 2004; Lane et al., 2019; Zotos et al., 2020). GTAs consider their role as supplemental, viewing themselves as lab managers, tutors, someone to answer students’ questions, and an approachable resource for students (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020). This perception of their role in their classrooms influences their actions to teach and assess their students (Zotos, 2022).
Researchers have reported various ways that chemistry GTAs evaluate student learning, an important skill for supporting students in developing chemistry knowledge. To evaluate student learning, GTAs examine students’ facial expressions, check students’ grades, ask rhetorical questions like “do you understand?”, and determine whether students can explain a concept themselves (Zotos et al., 2020). Generally, GTAs use assessment strategies they feel are simple to use and require little added effort from the instructor and student, like asking students to write their “muddiest point” from the lesson (Mutambuki and Schwartz, 2018). Such methods may provide limited depictions of student learning, and research has shown that GTAs’ teaching strategies may be misaligned with what they perceive students to struggle with (Baldwin and Orgill, 2019). Even so, many studies have recommended that GTA trainings increase their focus on formative and summative assessment strategies (e.g., Luft et al., 2004).
Education researchers have recognized the challenges faced by chemistry GTAs and have implemented, evaluated, and published various training programs geared toward supporting GTAs in their respective roles. For example, Mutambuki and Schwartz (2018) recently implemented a professional development program for chemistry GTAs and found that elements of the professional training were adopted by GTAs later in the semester. Before participating in the professional development, which focused in part on various formative assessments, GTAs relied on summative assessments like lab reports to assess student learning. The formative assessments discussed in this training included, for example, ungraded quizzes, asking students to identify their “muddiest point,” and having students write for one minute in response to the question “what did you learn the most from the lesson?” After engaging in professional development, most GTAs reported using at least two types of formative assessments, which they described to be “vital in obtaining immediate feedback on students’ areas of difficulties in learning, and for assessing conceptual understanding or knowledge transfer” (Mutambuki and Schwartz, 2018, p. 117). However, researchers recognize that GTA challenges persist, and for training to be successful, the design must be contextualized to the university and department in which GTAs are situated (Mutambuki and Schwartz, 2018; Zotos et al., 2020). To further expand our understanding of GTAs’ experiences and conceptions of teaching, we used the teacher noticing framework to describe GTAs’ teaching (Sherin et al., 2011).
The classroom elements that instructors notice provide insight into where instructors believe attention is or is not needed. Noticing may also provide insight into teachers’ cognition regarding how they intend to frame various class activities (Russ and Luna, 2013) and how they assess student learning during class time (Dini et al., 2019). To identify what teachers pay attention to, researchers have used a variety of methods, including (a) having instructors record (write) elements of instruction that they notice while or after watching video clips of instruction (Morris, 2006; Star and Strickland, 2008; Jacobs et al., 2010; König et al., 2014; Blömeke et al., 2022), (b) having instructors reason out loud as they evaluate and grade student responses to exam questions (Talanquer et al., 2015; Herridge and Talanquer, 2021; Herridge and Tashiro, 2021), and (c) having instructors record instances of student reasoning during their instruction and reflect on those clips afterward (Russ and Luna, 2013; Sherin and Dyer, 2017; Luna et al., 2018).
Research in this area has demonstrated that novice teachers tend to focus on surface-level features of classroom interactions, attend more to teachers’ actions than to students’, and view lessons as chronological but disconnected sequences of events. More experienced teachers, in contrast, tend to focus on students’ actions and issues of content, and can more consistently identify students’ thinking (Carter et al., 1988; Sabers et al., 1991; Morris, 2006; Star and Strickland, 2008; Barnhart and van Es, 2015; Chan and Yau, 2021; Chan et al., 2021). Additionally, Erickson (2011) and Chan and Yau (2021) found that novice teachers tend to focus on the learning of the whole class rather than of individual students, which may inaccurately indicate lesson success to the instructor. The discrepancy between novice and experienced teachers may arise because novice teachers do not know what to pay attention to and what to ignore (Feiman-Nemser and Buchmann, 1985). Teacher noticing can provide a window into teachers’ epistemological framing for class activities and their learning goals for particular sessions (Russ and Luna, 2013). Accordingly, a teacher's noticing patterns vary across different class activities. Investigating what chemistry laboratory and discussion GTAs notice in their classrooms can provide similar insight into where they focus their attention and how that focus is guided by their goals for the teaching sessions.
While noticing and interpreting have been conceptualized to occur simultaneously (Sherin and Star, 2011; Sherin and Russ, 2014; Walkoe, Sherin, and Elby, 2020), some researchers have isolated the dimension of interpretation (Sherin and van Es, 2009; van Es, 2011). Most of these studies focus on how teachers make sense of students’ understanding of content. For example, Sherin and van Es (2009) characterized teachers’ interpretations of students’ mathematical thinking during videos of instruction as being descriptive (describing what they observed in the video), evaluative (evaluating the quality of interactions in the video), or interpretive (making inferences about what took place). They posit that the “interpretive” stance is the most sophisticated level of interpretation, as it involves invoking substantive knowledge of content to examine classroom phenomena (van Es and Sherin, 2021). Understanding how teachers interpret what they notice is important in understanding how teachers use their knowledge and experiences to make sense of what is observed (van Es and Sherin, 2021). Teachers’ interpretations inform the actions they take or do not take in the classroom (Morris, 2006; van Es and Sherin, 2021).
Teachers’ actions taken in response to interpreting what they notice may involve further questioning to draw out or guide students’ understanding, explaining content in a different way, re-directing students’ attention, referring students to other resources, and prompting discussion between students, among others (Sherin and Star, 2011). van Es and Sherin (2021) have proposed an alternative third dimension of teacher noticing, which they refer to as shaping. This dimension is defined as “constructing interactions, in the midst of noticing, to gain access to additional information that further supports their noticing” (van Es and Sherin, 2021, p. 23) and thus may involve asking questions to elicit students’ understanding of content, which the teacher would then notice. Dini et al. (2019) identified that teachers used questions during class discussions either to elicit student thinking or to advance it. These teaching moves were influenced both by teachers’ goals for the teaching session and by what they noticed and interpreted about student thinking.
All dimensions of the teacher noticing cycle are profoundly influenced by teachers’ prior experiences as instructors and learners, knowledge of teaching, cultural backgrounds, instructional goals, knowledge of content, and more (Jacobs et al., 2010; Erickson, 2011). Therefore, teacher noticing is highly contextual and can differ across teachers even within a single context. Because chemistry GTAs teach in a context distinct from that of other teachers, investigating their teacher noticing may provide unique insight into their teaching practice. Additionally, highly sophisticated interpretation and response skills likely require sophisticated noticing skills (Chan and Yau, 2021). Many efforts have been made to incorporate the development of pre-service teachers’ noticing skills into training designs, which could inform future chemistry GTA training programs. Sherin and van Es (2009) designed a training in which preservice teachers participated in video clubs. In club meetings, preservice teachers watched instructional videos and then discussed prompting questions, such as “what stands out to you here?”, with their peers. This design and similar trainings focused on developing teacher noticing skills have proven successful in guiding teachers’ attention to important aspects of student understanding and in supporting more sophisticated interpretations of what they notice (Star and Strickland, 2008; Sherin and van Es, 2009; Benedict-Chambers, 2016; Chan and Yau, 2021). Analogous trainings for chemistry GTAs, informed by empirical data about their teacher noticing, could help chemistry GTAs build their teaching knowledge and practice, addressing a persistent challenge in higher education.
In accordance with the wider goal of better preparing GTAs for their teaching role, we examined two dimensions of teacher noticing, noticing and interpretation, for chemistry GTAs. The study was guided by the following research question:
What do chemistry GTAs notice about student learning during laboratory and discussion class sessions, and how do they interpret what they notice?
Understanding this aspect of GTAs’ teaching may provide more detailed insight into GTAs’ conceptions of teaching and learning as well as how training may support their development. Teachers who can pay close attention to students’ ideas and conceptions are better able to create opportunities for student learning (Jacobs et al., 2007).
Discussion GTAs are generally expected to review the content taught in lecture and support students as they work through practice problems related to lecture content. Up to thirty students are present for a given discussion session, and sessions typically last one hour. Depending on the course, practice problems may be prepared by the course professor or by the GTA. Discussion GTAs are often free to choose how they spend class time; for example, GTAs may walk around to answer questions as students work in groups or review the practice problems as a whole class. In both lab and discussion sessions, GTAs are expected to answer students’ questions to support their learning.
Typically, GTAs are assigned to teach courses based on availability and whether the GTAs’ schedules align with course offerings. Before their first semester of graduate school, GTAs attend a two-day instructor training focused on departmental logistics, how to handle common situations with students, and expectations for their role. Professors and senior graduate students lead GTA training, and the majority of GTA training occurs in a full-group setting. Senior GTAs lead mock teaching sessions, during which each incoming GTA presents a question in a classroom. During this session, the incoming GTA receives feedback on their teaching from the senior GTA and peers. Throughout the semester, most course faculty hold weekly staff meetings to ensure all GTAs are informed of course logistics and are prepared for their sessions for the week.
GTA | Number of semesters as a GTA | Course | Course context | Number of observations |
---|---|---|---|---|
Abby | 1 | Biochemistry discussion | Course professor provided guidance on what to cover during sessions, but GTAs chose how to cover it. | 4 |
Mallory | 1 | General chemistry discussion | Course professor provided practice problems for students to work on in groups. GTAs were expected to answer students’ questions as they worked. | 4 |
Calvin | 2 | Organic chemistry II lab | Course professor provided experimental protocol and guidance on completing the experiment. GTAs were expected to help students as they worked in groups to complete the experiment. | 4 |
Sol | 3 | Organic chemistry discussion | Course professor did not provide materials; GTAs chose how to organize their sessions, what material to cover, and how to cover it. | 4 |
Grace | 4 | Computational chemistry lab | Course professor provided experimental protocol and guidance on completing the experiment. GTAs were expected to help students as they worked in groups to complete the experiment. | 4 |
Andrew | 5 | General chemistry lab | Course professor provided experimental protocol and guidance on completing the experiment. GTAs were expected to help students as they worked in groups to complete the experiment and a related worksheet. | 3a |

a Only three observations were conducted with Andrew due to technical complications.
Data collection for this study was part of a larger study (Zotos, 2022). Each participant was observed while they taught four times throughout one semester, except for Andrew, who was observed three times. We conducted observations of discussion sessions using stationary cameras to capture the classroom (what students were doing, what the GTA was doing, and what was drawn on the board). Because stationary cameras were not able to capture the entirety of the lab space and GTA-student interactions, we collected observation data of lab sessions via a small wearable camera affixed to GTAs’ lab glasses. This small recording device allowed us to view the session from the GTAs’ perspective. For each observation, a researcher set up the equipment before the session started, left the room, and returned at the end of the session to collect the equipment.
Within 24 hours after each observation, a researcher conducted a semi-structured interview with each participant during which the GTA reflected on their teaching session. The interview questions were geared toward capturing GTAs’ goals for their session, indicators of success, and other perceptions of how their session went. The interviews typically lasted 20–30 minutes and served as our primary data source. The full post-observation interview protocol is included in Appendix 1. The interviews were transcribed verbatim through an off-site service.
Code | Definition | Example |
---|---|---|
Notice | Code when the GTA recalls something specific that occurred during the class session related to student learning of chemistry | So usually in that section, if they're not understanding something, they'll turn to each other and start to talk. And that's what they were doing during the derivation, so I stopped and tried to answer those questions. But with the problems, they weren't doing that. And if they did have a question, they were asking it to me and they were all very next-level questions. Like, they were following me along and asking follow-up questions, not clarifying questions. Um, and they seemed, like when I would ask a follow-up question, like in the next problem related to the last problem, they would remember things. So I feel like they were following along with me. And that's how I gauged that it was going well. (Abby post-observation interview 1) |
Interpretation | What the GTA interpreted based on something they noticed during the observed session/how they made sense of what they noticed | |
Response – Action | How the GTA responded to what they noticed | |
Response – No action | Code when the GTA did not respond to something they noticed, whether because they did not know how to respond or chose not to respond | |
To conduct analysis, we analysed one observation and post-observation interview at a time. We first watched the recorded teaching session and memoed moments when teacher noticing may have occurred based on our own perception. We did not specifically code observation videos because this would have required inferring what participants noticed. Rather, we watched the observed class sessions to familiarize ourselves with the nature of each session and to obtain context for what participants mentioned in post-observation interviews. After watching each observation, we coded the associated post-observation interview using the codebook detailed in Table 2.
To ensure proper use of our codebook, the first, second, and third authors independently watched one observation and coded the associated post-observation interview (Abby's first observation and interview), then met to discuss our coding of the post-observation interview. We discussed discrepancies and refined our codebook accordingly. For example, in Abby's first post-observation interview, she mentioned something she noticed in a previous session (the first sentence in the example in Table 2). Because we aimed to describe teacher noticing within observed sessions, we decided to limit our coding of teacher noticing to events that occurred within the observed session. This specific line was instead coded as an interpretation because Abby described how she interprets students talking to each other. We repeated this process with another observation and continued our conversations until we reached consensus. With multiple researchers actively participating in data analysis, we aimed to mitigate researcher bias by allowing for negotiation and consensus building (Watts and Finkenstaedt-Quinn, 2021). Each researcher then independently coded a unique subset of interviews. Each week, all three researchers met to discuss the independently coded interviews until coding was complete. We coded “notice” an average of seven times per interview; individual interviews contained four to fourteen instances of noticing.
After coding each interview for teacher noticing, interpretation, and response, we created spreadsheets to summarize noticing events for each participant. Each row contained a ‘notice,’ the associated interpretation, and the associated response. This allowed us to compare and view noticing events concisely. Through this process, we found that many GTAs’ teacher noticing codes did not have an associated response-action or response-no action code. This is likely due to the nature of our interviews, as we asked GTAs to recall events but did not necessarily ask how they responded. Thus, we decided to focus on teacher noticing and interpretation during further analysis, aligning with other teacher noticing studies in mathematics education (Sherin, 2007; Sherin and van Es, 2009; Colestock and Sherin, 2009). Once the spreadsheet was complete, we began inductively identifying patterns in what GTAs noticed. For example, four GTAs noticed that students asked a lot of questions, and three GTAs noticed that students struggled with different parts of the lab protocol. We listed these noticing patterns and continued to review GTAs’ teacher noticing as we looked for instances that supported or conflicted with our initially identified patterns. We then grouped GTAs’ noticing events by similarity; for example, noticing related to student questions and noticing that students struggled with the lab protocol were grouped into the “noticing evidence of student understanding” category. The three final inductive categories of GTA teacher noticing were: noticing evidence of student understanding, noticing student participation, and noticing the pace of the teaching session. All teacher noticing events that were coded in two or more different GTAs’ interviews are included in these groupings; if a teacher noticing event was only coded in one interview, it was not included in a category. The Results section is organized by category of teacher noticing, and the associated interpretations are reported within each category.
In some cases, multiple GTAs noticed similar events but interpreted them differently. These instances are described in the Results section below.
Fig. 2 GTAs' noticing events and interpretations related to evidence of student understanding of content.
Generally, every discussion that I had yesterday a student asked, “What's a beta hydrogen and how do I determine it?” There's nothing special about a beta hydrogen, and it's such an easy concept, but when it was covered in class, they didn't pick it up.
Sol interpreted students’ questions about beta-hydrogens to be about a fundamental topic and thus indicating students did not understand this topic. Similarly, when GTAs noticed that students were asking a lot of questions, GTAs interpreted that to indicate that students were confused about course content or not prepared for an upcoming exam.
In contrast, some questions asked by students were interpreted by GTAs as being more advanced questions. GTAs further interpreted that to indicate that students were progressing in their understanding of chemistry content or abilities to carry out steps in a lab protocol. For example, Calvin (organic chemistry lab GTA) reflected on students’ questions as the semester progressed. After a lab session that occurred about eight weeks into the semester, Calvin said,
They don't ask the same inexperienced questions anymore. The questions are much more fundamental. […] Most of them, in most cases, know how to handle the reaction.
Calvin, like other GTA participants, interpreted students’ ability to ask higher-level questions to indicate that students understood course content, a central goal of chemistry laboratory and discussion sessions.
The GTAs who noticed that students were not asking questions differed in their interpretations. This indicated to some GTAs that the session was going well, and students understood content. For example, Mallory (general chemistry discussion GTA) held a discussion session in which students worked on the practice problems provided by the course professor. Toward the end of the session, Mallory reviewed practice problems and invited students to ask questions about the content. In the post-observation interview for this session, Mallory said,
No one had questions on mole conversions, so I assumed that was because they seemed to know what was going on and seemed to understand it.
Instances when students were not asking questions or when students were asking a lot of questions indicated to other GTAs that the students were not following along or understanding content. Mallory (general chemistry discussion GTA) said, “If there are no more questions, that'd imply that at least people either have said, “Okay, I think I understand this,” or have given up completely.” Mallory provides an example of a GTA who was not sure how to interpret students’ lack of questions; she was unsure if students understood content or if they were no longer engaged in the teaching session.
GTAs also recalled when students made connections across sessions, which GTAs interpreted to indicate that students understand course content. For example, in a lab session led by Andrew (general chemistry lab GTA), students completed a lab activity and worked on a related worksheet in their groups. In the post-observation interview for this session, Andrew discussed a worksheet question that students completed in class and said, “It was a fairly difficult question I was asking them about equilibrium, and I think they all got it because they were able to connect the dots between the two labs.” Andrew noticed that students recalled content from a previous lab to inform their current course work, which indicated to Andrew that students understood that content.
Finally, lab GTAs noticed when students struggled with specific parts of the protocol, indicating to them that certain lab techniques are challenging for students. Andrew (general chemistry lab GTA) led a session to introduce the technique of distillation to his students. In the lab, students were tasked with extracting caffeine from tea. During the post-observation interview, Andrew said, “A lot of students had a long time trying to actually get the [solvent] to boiling, because they filled their beakers with huge amounts of water and took forever to heat up.” Andrew and other lab GTAs recalled particular instances when students appeared to struggle with experimental techniques.
I started seeing groups, two or three teams, huddled together at the same table, working on answering questions together, trying to teach others how to use the equations to propagate uncertainty.
Andrew noticed his students working in groups to help each other answer questions.
Other GTAs noticed if students were not talking to each other and interpreted this in different ways. After Abby (biochemistry discussion GTA) led a review session by discussing key concepts for the course, she noticed students did not have any questions about the reviewed content. Abby interpreted this to mean that students were following along with her. When Mallory noticed students working independently during group work in her general chemistry discussion class, she interpreted it to mean that the students who were not talking preferred to work independently. For example, Mallory said,
There are some kids who are better at group work than others. There’re some kids who learn better while working alone. There are some kids who will ask questions no matter whether they're in a group or they're in an individual setting. I think for some kids, it is beneficial, but for other kids, it's a neutral contribution.
Mallory noticed that some of her students worked alone, which indicated to Mallory that those students preferred to learn independently.
GTAs also noticed whether students were quiet in general (not talking to each other or to their GTA), which indicated to GTAs that students did not know what was going on or were overwhelmed by course content. Abby (biochemistry discussion GTA) noticed students were quiet during a session held before students’ midterm examination and said,
I think they were just so overwhelmed because they have so many chapters to cover and so much material that they are a little quiet.
Andrew (general chemistry lab GTA), on the other hand, noted that he was unsure about how to interpret when students were quiet and said,
There were some people that are very quiet and […] I think some of them… they might not have known what was going on, but I didn't know it because they weren't coming up to me. That's something I'm still trying to figure out.
Andrew noticed that some students were quiet and noted that he was uncertain how to assess students who did not ask him questions.
GTAs noticed if students were participating in class, which indicated to GTAs that students were learning and that the session went well. After a session that occurred at the end of the term, in which Sol (organic chemistry discussion GTA) asked students to go to the board and create a mind-map of concepts covered throughout the term, he said,
They actually really participated well. Sometimes I really have to drag them up to the board, especially when it's not just doing a problem, but everyone got up. Everyone participated, and they interacted well, so that went well.
In this quote, Sol recalled an activity in which students readily participated, which indicated to him that his session went well.
Finally, GTAs noticed if students were using external resources to complete tasks during class. For example, Andrew (general chemistry lab GTA) described an instance where he noticed students rewatching lecture videos while working on a worksheet during lab:
They were watching my video to make sure they knew how to do the calculations. And so that worked out really well. I'm really liking how students are able to use the pre-lab videos at the end and try to connect the dots and be able to do their experimental workup and try to connect that with what we taught in lecture.
Students’ use of the pre-lab video indicated to Andrew that they were making connections between content taught in lecture and the lab experiment.
The other thing that I noticed is that everybody works at different speeds, and there's not one specific way that you can do this. […] There's a way that everybody can do it differently based on how your brain works.
Grace noticed that students work through their lab protocols in different ways and at different speeds due to their different ways of thinking.
GTAs also recalled when students completed their work by the end of class and left on time, which indicated to GTAs that students understood their tasks for the day and finished the required work or that the session went well. For example, Calvin (organic chemistry lab GTA) often reflected on the time students needed to complete the lab experiment. In a post-observation interview, Calvin said,
It went totally fine. Got stuff done, we were supposed to do a certain experiment. Everyone did that. And yeah, it was alright.
In some cases, GTAs reflected on times when students left the session early. Some GTAs interpreted this similarly to when they noticed that students left the session on time: students finished their tasks, and the lab session went well. Other GTAs, however, interpreted this to mean that these students did not think staying in class for the entire duration was useful. For example, after a session held on the same day as a midterm examination, Mallory (general chemistry discussion GTA) said:
A lot of students left because they wanted to study on their own and didn't think it would be useful to stay in the session.
When students left Mallory's discussion session early, she interpreted it to mean that students would rather study on their own.
Our GTA participants’ teacher noticing related to student understanding of content revealed that GTAs often evaluated student learning based on the questions students asked. GTAs’ focus on student questions may stem from their view of their role as tutors, or people who answer students’ questions about content taught in lecture (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020). In both lab and discussion sessions, GTAs often relied on students taking the initiative to speak up. Gauging understanding by the questions students ask does provide an indication of learning, but only from the students willing to ask questions. In a quote above, Andrew mentioned that he noticed his students were quiet during the lab session, which may have indicated that they did not know what was going on, but he was not completely sure; he said he is still trying to figure out how to assess those students. Andrew describes a situation in which the class may benefit from what van Es and Sherin (2021) describe as “shaping”: creating opportunities to elicit student thinking. Shaping can be particularly useful in a chemistry lab, where students may need support connecting their lab work to chemistry concepts learned in lecture (Heppert et al., 2002; DiBiase and Wagner, 2010), and in situations like Andrew's, when students are reserved during sessions. We rarely observed our GTA participants deliberately creating these types of opportunities for students to share their thinking. Instead, we observed GTAs relying on students to take the initiative to ask questions, perhaps because GTAs view their role as tutors and question-answerers. GTAs also view their role as lab managers (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020), which may explain, in part, why our GTA participants also evaluated student learning based on relatively surface-level factors, such as whether students were quiet and whether students finished lab on time.
Our GTA participants’ focus on surface-level factors is consistent with other studies describing pre-service teacher noticing (Morris, 2006; Barnhart and van Es, 2015; Chan and Yau, 2021). Our GTA participants often demonstrated what Barnhart and van Es (2015) described as lower levels of sophistication in noticing skills, as they frequently attended to student behaviour and classroom climate. For example, they noticed whether students finished work on time or early. While this may be influenced by one of the central goals of chemistry lab sessions (that students complete the experiment during class), when instructors exhibit limited noticing skills, their potential to interpret and respond in more sophisticated ways is also limited (Barnhart and van Es, 2015). In some cases, GTAs demonstrated what Barnhart and van Es (2015) describe as medium sophistication, noticing individual students’ thinking based on the questions students asked as well as instances when students connected concepts across multiple class sessions. This noticing skill can be leveraged in many ways, one of which is to improve GTAs’ overall noticing skills, which can in turn support the development of their interpretation and response skills.
In some cases, multiple GTAs noticed similar events in their sessions but interpreted them differently. For example, when Mallory, Abby, and Grace noticed that students did not have questions, the lack of questions indicated to them that students were following along and understanding the material. In contrast, when Andrew noticed the same thing, he interpreted it to mean that students were not following along. Additionally, GTAs often noticed when students did ask questions. In some cases, GTAs interpreted students’ questions as fundamental; in other cases, they interpreted them as more advanced. GTAs further interpreted fundamental questions to indicate that students were struggling to understand content and advanced questions to indicate that students understood content. GTAs’ categorization of questions as fundamental or advanced is likely related to each GTA's knowledge and previous experiences in instructional settings, perhaps both as a student and as an instructor. This further emphasizes the subjective and complex nature of teacher noticing and interpretation (Jacobs et al., 2010; Erickson, 2011). Furthermore, the context in which GTAs teach may influence their noticing and interpretation. When GTAs noticed that students left the session early, Grace, Andrew, and Calvin, who taught lab sessions, interpreted it to mean that the lab went well and that students understood the lab and completed their required work. When students left early from Mallory's discussion session, she interpreted it to mean that students felt it would be more helpful to study on their own.
Because GTAs interpreted surface-level actions as signs of student learning, and because they interpreted these actions differently, there is a need to support GTAs in learning about formative assessments that elicit student thinking and gauge students’ progress against the learning goals for the course. GTAs may also benefit from the opportunity to implement and evaluate formative assessments in their own sessions, as in the GTA training program reported by Mutambuki and Schwartz (2018). This may help GTAs obtain a clearer and more accurate account of student learning, while also increasing student participation, rather than making assumptions based on superficial features of their sessions.
As our GTA participants primarily relied on student actions to assess learning, GTA training programs could increase focus on both noticing student thinking and creating opportunities to elicit student thinking. Training may also support GTAs in learning to leverage students’ thoughts to move the class forward—an important teaching strategy for promoting productive classroom discourse (Warren et al., 2001; Empson, 2003; Richards and Robertson, 2016; Gehrtz et al., 2022)—which in turn may support GTAs in further noticing students’ understanding of course content. The types of questions teachers ask can also influence the ways students think about and learn course content (Chin, 2007). One method of promoting instructors’ teacher noticing skills is to have instructors watch classroom recordings and reflect on what they observed. This can be done in pre-semester training, where GTAs watch recordings of other GTAs teaching and respond to prompts focused on specific aspects of the classroom (Morris, 2006; Sherin and van Es, 2009). If GTA training continues into the semester, training leaders may ask GTAs to record a 4–5 minute clip of their own teaching that demonstrates student thinking and to share the recording with their peers. This process may encourage GTAs to elicit student thinking while they teach, as their recordings will be shared with peers (Sherin and Dyer, 2017). For such training programs to be most productive, GTAs must also reflect on their observations with peers. The observation and reflection should be guided by the dimensions of the teacher noticing framework and thus should focus on what GTAs notice, how they interpret what they notice, and how they would respond. The teacher noticing framework can be a productive avenue for supporting GTA teacher learning in their own context as they engage in their teaching role.
What material are you covering right now?
What were the aims of your particular section?
What were the goals besides covering the content?
Did you achieve your goals?
What parts of your section could have gone better?
What parts of your section went particularly well?
How can you tell?
Have you taught this content before?
Did you use any teaching techniques that you haven’t used before?
Were there challenges with using these new techniques?
What went better than the previous week?
What do you feel like you could improve on for next week?
This journal is © The Royal Society of Chemistry 2024