Chemistry graduate teaching assistants’ teacher noticing

Eleni K. Geragosian a, Diana Zhu b, Marc Skriloff b and Ginger V. Shultz *b
aDepartment of Chemistry & Biochemistry, University of Detroit Mercy, 4001 W. McNichols Rd., Detroit, Michigan, USA
bDepartment of Chemistry, University of Michigan, Ann Arbor, Michigan 48109, USA. E-mail: gshultz@umich.edu

Received 2nd January 2023 , Accepted 8th October 2023

First published on 30th October 2023


Abstract

Chemistry graduate teaching assistants (GTAs) have substantial face time with undergraduate students at large research institutions, where they lead discussion and lab sessions. Emerging research describes GTAs’ content and teaching knowledge for introductory chemistry classes, but we need to know more about how GTAs manage their classes in the moment and how they assess student learning during class time. We conducted classroom observations and post-observation interviews with six chemistry GTAs who had varying years of teaching experience and taught a variety of classes (e.g., general chemistry discussion, biochemistry discussion, organic chemistry lab, and computational chemistry lab). These GTAs were each observed and interviewed multiple times over the course of a semester. Through qualitative analysis guided by the teacher noticing framework, we describe what chemistry GTAs notice, or pay attention to, regarding student learning in their teaching sessions and how they interpret what they notice. We found that chemistry GTAs often paid attention to the types of questions that students asked but relied on students taking the initiative to ask questions in order to assess their learning. GTAs also often focused on superficial features of their class sessions to assess learning, such as whether students finished their tasks and left their session early. However, some GTAs noticed more sophisticated evidence of student understanding, such as when students connected content covered across multiple class sessions. The results from this study contribute to our understanding of how chemistry GTAs lead their sessions and evaluate student learning during them. The results serve to inform potential training designs that can support chemistry GTAs’ teacher learning through learning to notice, and to create opportunities to notice, significant features of their classrooms.


Background

Graduate teaching assistants

Graduate students routinely lead lab and discussion sessions in STEM departments at large research institutions. Teaching assistantships often begin during graduate students’ first semester of graduate school, yet these students receive little training (Golde and Dore, 2001; Luft et al., 2004). Although graduate teaching assistants’ (GTAs’) responsibilities may vary across chemistry departments, GTAs are generally expected to plan and prepare for their sessions, effectively communicate course content, facilitate discussions with students, fairly assess student learning, and demonstrate respect and professionalism (Deacon et al., 2017). Typically, lab GTAs are expected to support students in completing lab protocols, while discussion GTAs are expected to support students in understanding the content taught in lecture. Research focused on the experiences of GTAs within their respective teaching roles has identified that GTAs generally hold traditional conceptions of how students learn; they believe students learn best when information is clearly presented to them in a direct manner (Kurdziel et al., 2003). As a result, GTAs are often resistant to, and struggle with, implementing inquiry-based instruction (Mutambuki and Schwartz, 2018). Without substantive training, GTAs’ conceptions about student learning are often shaped by their prior experiences in their undergraduate education, which they commonly draw on to inform their teaching (Bond-Robinson and Rodriques, 2006; Sandi-Urena and Gatlin, 2013; Zotos et al., 2020). GTAs’ reliance on prior educational experiences to inform their teaching methods is paralleled in research on faculty instructors, which shows that prior experiences influence instructors’ beliefs about teaching and learning and, in turn, impact their instruction and development as instructors (Harwood et al., 2006; Lotter et al., 2007; Sandi-Urena and Gatlin, 2013; Mutambuki and Schwartz, 2018; Gibbons et al., 2018).

Chemistry GTAs’ instruction is also influenced by the context in which they teach. GTAs at research institutions have reported feeling that teaching is not valued in their departments, which inhibits their development as instructors (Luft et al., 2004; Lane et al., 2019; Zotos et al., 2020). GTAs view their role as supplemental, seeing themselves as lab managers, tutors, sources of answers to students’ questions, and approachable resources for students (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020). This perception of their role in their classrooms influences how they teach and assess their students (Zotos, 2022).

Researchers have reported various ways that chemistry GTAs evaluate student learning, an important skill for supporting students in developing chemistry knowledge. To evaluate student learning, GTAs examine students’ facial expressions, check students’ grades, ask rhetorical questions like “do you understand?”, and determine whether students can explain a concept themselves (Zotos et al., 2020). Generally, GTAs use assessment strategies they feel are simple to use and require little added effort from the instructor and student, like asking students to write their “muddiest point” from the lesson (Mutambuki and Schwartz, 2018). Such methods may provide limited depictions of student learning, and research has shown that GTAs’ teaching strategies may be misaligned with what they perceive students to struggle with (Baldwin and Orgill, 2019). Even so, many studies have recommended that GTA trainings increase their focus on formative and summative assessment strategies (e.g., Luft et al., 2004).

Education researchers have recognized the challenges faced by chemistry GTAs and have implemented, evaluated, and published various training programs geared toward supporting GTAs in their respective roles. For example, Mutambuki and Schwartz (2018) recently implemented a professional development program for chemistry GTAs and found that elements of the professional training were adopted by GTAs later in the semester. Before participating in the professional development, which focused in part on various formative assessments, GTAs relied on summative assessments like lab reports to assess student learning. The formative assessments discussed in this training included, for example, ungraded quizzes, asking students to identify their “muddiest point,” and having students write for one minute in response to the question “what did you learn the most from the lesson?” After engaging in professional development, most GTAs reported using at least two types of formative assessments, which they described to be “vital in obtaining immediate feedback on students’ areas of difficulties in learning, and for assessing conceptual understanding or knowledge transfer” (Mutambuki and Schwartz, 2018, p. 117). However, researchers recognize that GTA challenges persist, and for training to be successful, the design must be contextualized to the university and department in which GTAs are situated (Mutambuki and Schwartz, 2018; Zotos et al., 2020). To further expand our understanding of GTAs’ experiences and conceptions of teaching, we used the teacher noticing framework to describe GTAs’ teaching (Sherin et al., 2011).

Teacher noticing

Effectively noticing and interpreting student thinking during class and on assessments is required to fulfil GTAs’ classroom duties, such as facilitating discussions with students and assessing student learning (Deacon et al., 2017). Teacher noticing is a useful framework for understanding how teachers manage their classrooms based on what they notice about student learning (Sherin et al., 2011). When instructors teach, they are presented with a “blooming buzzing confusion of sensory data” (Sherin et al., 2011, p. 5). In other words, many different things happen simultaneously during a class session (e.g., students talking to each other, recording notes, asking questions). Because it is impossible for instructors to pay attention to every aspect of their classroom, they choose to attend to (or notice) some specific element. Instructors then interpret what they notice and choose a response, which may include taking or not taking action. This process forms a cycle: the instructor notices something in their classroom, interprets what they notice, and responds (or does not respond) based on their interpretation (see Fig. 1). Once the instructor responds, they choose to notice another aspect of their classroom, repeating the cycle (Sherin and Star, 2011). These parts of the teacher noticing cycle (notice, interpretation, and response) will be referred to as dimensions of teacher noticing.
Fig. 1 The teacher noticing cycle.

The classroom elements that instructors notice provide insight into where instructors believe attention is or is not needed. They may also provide insight into teachers’ cognition regarding how they intend to frame various class activities (Russ and Luna, 2013) and how they assess student learning during class time (Dini et al., 2019). To identify what teachers pay attention to, researchers have used a variety of methods, including (a) having instructors record (write) elements of instruction that they notice while or after watching video clips of instruction (Morris, 2006; Star and Strickland, 2008; Jacobs et al., 2010; König et al., 2014; Blömeke et al., 2022), (b) having instructors reason out loud as they evaluate and grade student responses to exam questions (Talanquer et al., 2015; Herridge and Talanquer, 2021; Herridge and Tashiro, 2021), and (c) having instructors record instances of student reasoning during their instruction and reflect on those clips afterward (Russ and Luna, 2013; Sherin and Dyer, 2017; Luna et al., 2018).

Research in this area has demonstrated that novice teachers tend to focus on surface-level features of classroom interactions, attend more to teachers’ actions than to students’, and view lessons as chronological but disconnected sequences of events. More experienced teachers, in contrast, tend to focus on students’ actions and issues of content, and can more consistently identify students’ thinking (Carter et al., 1988; Sabers et al., 1991; Morris, 2006; Star and Strickland, 2008; Barnhart and van Es, 2015; Chan and Yau, 2021; Chan et al., 2021). Additionally, Erickson (2011) and Chan and Yau (2021) found that novice teachers tend to focus on the learning of the whole class rather than of individual students, which may inaccurately indicate lesson success to the instructor. The discrepancy between novice and experienced teachers may arise because novice teachers do not know what to pay attention to and what to ignore (Feiman-Nemser and Buchmann, 1985). Teacher noticing can provide a window into teachers’ epistemological framing for class activities and their learning goals for certain sessions (Russ and Luna, 2013). Accordingly, a teacher’s noticing patterns vary across different class activities. Investigating what chemistry laboratory and discussion GTAs notice in their classrooms can provide similar insight into where they focus their attention and how that focus is guided by their goals for the teaching sessions.

While noticing and interpreting have been conceptualized to occur simultaneously (Sherin and Star, 2011; Sherin and Russ, 2014; Walkoe, Sherin, and Elby, 2020), some researchers have isolated the dimension of interpretation (Sherin and van Es, 2009; van Es, 2011). Most of these studies are focused on how teachers make sense of students’ understanding of content. For example, Sherin and van Es (2009) characterized teachers’ interpretations of students’ mathematical thinking during videos of instruction as being descriptive (describing what they observed in the video), evaluative (evaluating the quality of interactions in the video), or interpretive (making inferences about what took place). They posit that the “interpretive” stance is the most sophisticated level of interpretation as it involves invoking substantive knowledge of content to examine classroom phenomena (van Es and Sherin, 2021). Understanding how teachers interpret what they notice is important in understanding how teachers use their knowledge and experiences to make sense of what is observed (van Es and Sherin, 2021). Teachers’ interpretations inform the actions they take or do not take in the classroom (Morris, 2006; van Es and Sherin, 2021).

Teachers’ actions taken in response to interpreting what they notice may involve, among others, further questioning to draw out or guide students’ understanding, explaining content in a different way, re-directing students’ attention, referring students to other resources, and prompting discussion between students (Sherin and Star, 2011). van Es and Sherin (2021) have proposed an alternative third dimension of teacher noticing, which they refer to as shaping. This dimension is defined as “constructing interactions, in the midst of noticing, to gain access to additional information that further supports their noticing” (van Es and Sherin, 2021, p. 23) and thus may involve asking questions to elicit students’ understanding of content, which the teacher would then notice. Dini et al. (2019) identified that teachers used questions during class discussions to elicit or advance student thinking. These teaching moves were influenced both by teachers’ goals for the teaching session and by what they noticed and interpreted about student thinking.

All dimensions of the teacher noticing cycle are profoundly influenced by teachers’ prior experiences as instructors and learners, knowledge of teaching, cultural backgrounds, instructional goals, knowledge of content, and more (Jacobs et al., 2010; Erickson, 2011). Therefore, teacher noticing is highly contextual and can differ across teachers even within a single context. Because chemistry GTAs teach in a context distinct from that of other teachers, investigating their teacher noticing may provide unique insight into their teaching practice. Additionally, highly sophisticated interpretation and response skills likely require sophisticated noticing skills (Chan and Yau, 2021). Many efforts have been made to incorporate the development of pre-service teachers’ noticing skills into training designs, which could inform future chemistry GTA training programs. Sherin and van Es (2009) designed a training in which preservice teachers participated in video clubs. In club meetings, preservice teachers watched instructional videos and then discussed them with their peers using prompting questions such as “what stands out to you here?” This design and other similar trainings focused on developing teacher noticing skills have proven successful in guiding teachers’ attention to important aspects of student understanding and in supporting more sophisticated interpretations of what they notice (Star and Strickland, 2008; Sherin and van Es, 2009; Benedict-Chambers, 2016; Chan and Yau, 2021). Analogous trainings for chemistry GTAs, informed by empirical data about their teacher noticing, could help chemistry GTAs build their teaching knowledge and practice, addressing a common need in higher education.

In accordance with the wider goal of better preparing GTAs for their teaching role, this study examined two dimensions of teacher noticing, notice and interpretation, for chemistry GTAs. The study was guided by the following research question:

What do chemistry GTAs notice about student learning during laboratory and discussion class sessions, and how do they interpret what they notice?

Understanding this aspect of GTAs’ teaching may provide more detailed insight into GTAs’ conceptions of teaching and learning, as well as how training may support their development. Teachers who can pay close attention to students’ ideas and conceptions are better able to create opportunities for student learning (Jacobs et al., 2007).

Methods

Context

This study was conducted at a large research institution in the Midwestern United States. Graduate students who hold GTA positions are expected to spend twenty hours per week on research and twenty hours on their teaching assistantship, which includes preparing to teach, leading their sessions, holding office hours, grading, and more. GTAs’ responsibilities depend in part on whether they are assigned to a lab or a discussion. Lab GTAs are provided lab protocols by the course professor and are expected to lead their students through the protocol, explain relevant content, guide students through data analysis, and grade lab reports. Although the course professor may be in the lab for a short period of time, GTAs are the only instructor present for the majority of the class. Lab sessions last three to four hours, during which approximately twenty students work through the protocol in groups and the GTA supports them.

Discussion GTAs are generally expected to review the content taught in lecture and support students as they work through practice problems related to lecture content. Up to thirty students are present for a given discussion session, and sessions typically last one hour. Depending on the course, practice problems may be prepared by the course professor or by the GTA. Discussion GTAs are often free to choose how they spend class time; for example, they may walk around to answer questions as students work in groups or review the practice problems with the whole class. In both lab and discussion sessions, GTAs are expected to answer students’ questions to support their learning.

Typically, GTAs are assigned to teach courses based on availability and whether their schedules align with course offerings. Before their first semester of graduate school, GTAs attend a two-day instructor training focused on departmental logistics, how to handle common situations with students, and expectations for their role. Professors and senior graduate students lead this training, the majority of which occurs in a full-group setting. Senior GTAs lead mock teaching sessions, during which each incoming GTA presents a question in a classroom and receives feedback on their teaching from the senior GTA and peers. Throughout the semester, most course faculty hold weekly staff meetings to ensure all GTAs are informed of course logistics and are prepared for their sessions for the week.

Data collection

To investigate chemistry GTAs’ teacher noticing in their lab and discussion sections, we conducted classroom observations and post-observation interviews with six GTAs. Our participants had a range of teaching experience and taught different courses: three taught lab sessions and three taught discussion sessions at the time of observation (see Table 1 for a summary of our participant population). With a small number of participants, we were able to collect and analyse a large amount of data per participant to capture their teacher noticing (Table 1). All participants were given pseudonyms, informed consent was obtained from all participants and their students, and IRB approval was obtained for this study.
Table 1 Participant information, including numbers of semesters as a GTA, course taught, course context, and number of observations recorded with each participant
GTA Number of semesters as a GTA Course Course context Number of observations
a Only three observations were conducted with Andrew due to technical complications.
Abby 1 Biochemistry discussion Course professor provided guidance on what to cover during sessions, but GTAs chose how to cover it. 4
Mallory 1 General chemistry discussion Course professor provided practice problems for students to work on in groups. GTAs were expected to answer students’ questions as they work. 4
Calvin 2 Organic chemistry II lab Course professor provided experimental protocol and guidance on completing the experiment. GTAs are expected to help students as they work in groups to complete the experiment. 4
Sol 3 Organic chemistry discussion Course professor did not provide materials; GTAs chose how to organize their sessions, what material to cover, and how to cover it. 4
Grace 4 Computational chemistry lab Course professor provided experimental protocol and guidance on completing the experiment. GTAs are expected to help students as they work in groups to complete the experiment. 4
Andrew 5 General chemistry lab Course professor provided experimental protocol and guidance on completing the experiment. GTAs are expected to help students as they work in groups to complete the experiment and a related worksheet. 3a


Data collection for this study was part of a larger study (Zotos, 2022). Each participant was observed four times while teaching over one semester, except for Andrew, who was observed three times. We conducted observations of discussion sessions using stationary cameras to capture the classroom (what students were doing, what the GTA was doing, and what was drawn on the board). Because stationary cameras were not able to capture the entirety of the lab space and GTA–student interactions, we collected observation data of lab sessions via a small wearable camera affixed to GTAs’ lab glasses. This small recording device allowed us to view the session from the GTAs’ perspective. For each observation, a researcher set up the equipment before the session started, left the room, and returned at the end of the session to collect the equipment.

Within 24 hours after each observation, a researcher conducted a semi-structured interview with each participant during which the GTA reflected on their teaching session. The interview questions were geared toward capturing GTAs’ goals for their session, indicators of success, and other perceptions of how their session went. The interviews typically lasted 20–30 minutes and served as our primary data source. The full post-observation interview protocol is included in Appendix 1. The interviews were transcribed verbatim through an off-site service.

Data analysis

With the goal of exploring what our chemistry GTA participants noticed and how they interpreted what they noticed, data analysis was guided by teacher noticing theory (Sherin et al., 2011). Our codebook was created through conversations between the first, second, and third authors, who used provisional coding methods (Saldaña, 2016) in which predetermined codes were identified based on our theoretical framework. Our codebook was particularly influenced by Sherin and Star's (2011) description of teacher noticing “as the selection of noticed-things from sense data” (p. 69). Our codebook contained four codes: notice, interpretation, response-action, and response-no action. Our coding unit was the sentence fragment, and each unit could receive only one code. Additional details about our codebook and an example of a coded excerpt are presented in Table 2.
Table 2 Our codebook, informed by the theory of teacher noticing (Sherin and Star, 2011)
Code Definition Example
Notice Code when the GTA recalls something specific that occurred during the class session related to student learning of chemistry So usually in that section, if they're not understanding something, they'll turn to each other and start to talk. And that's what they were doing during the derivation, so I stopped and tried to answer those questions. But with the problems, they weren't doing that. And if they did have a question, they were asking it to me and they were all very next-level questions. Like, they were following me along and asking follow-up questions, not clarifying questions. Um, and they seemed, like when I would ask a follow-up question, like in the next problem related to the last problem, they would remember things. So I feel like they were following along with me. And that's how I gauged that it was going well. (Abby post-observation interview 1)
Interpretation What the GTA interpreted based on something they notice during the observed session/how they made sense of what they notice
Response – Action How the GTA responded to what they notice
Response – No action Code when the GTA did not respond to something they notice, whether because they didn’t know how to respond or chose not to respond


To conduct analysis, we analysed one observation and its post-observation interview at a time. We first watched the recorded teaching session and memoed times where teacher noticing may have occurred based on our own perception. We did not code the observation videos themselves because this would have required inferring what participants noticed. Rather, we watched the observed class sessions to familiarize ourselves with the nature of each session and to obtain context for what participants mentioned in post-observation interviews. After watching each observation, we coded the associated post-observation interview using the codebook detailed in Table 2.

To ensure proper use of our codebook, the first, second, and third authors independently watched one observation and coded the associated post-observation interview (Abby's first observation and interview), then met to discuss our coding of the post-observation interview. We discussed discrepancies and refined our codebook accordingly. For example, in Abby's first post-observation interview, she mentioned something she noticed in a previous session (the first sentence in the example in Table 2). Because we aimed to describe teacher noticing within observed sessions, we decided to limit teacher noticing codes to events that occurred within the observed session. This line was instead coded as an interpretation because Abby described how she interprets students talking to each other. We repeated this process with another observation and continued our conversations until we reached a consensus. With multiple researchers actively participating in data analysis, we aimed to mitigate researcher bias by allowing for negotiation and consensus building (Watts and Finkenstaedt-Quinn, 2021). Each researcher then independently coded a unique subset of interviews, and all three researchers met weekly to discuss the independently coded interviews until coding was complete. We coded “notice” an average of seven times per interview, with individual interviews containing between four and fourteen instances of noticing.

After coding each interview for teacher noticing, interpretation, and response, we created spreadsheets to summarize noticing events for each participant. Each row contained a ‘notice,’ the associated interpretation, and the associated response, allowing us to compare and view noticing events concisely. Through this process, we found that many of GTAs’ teacher noticing codes did not have an associated response-action or response-no action code. This is likely due to the nature of our interviews, as we asked GTAs to recall events but did not necessarily ask how they responded. Thus, we decided to focus on teacher noticing and interpretation during further analysis, aligning with other teacher noticing studies in mathematics education (Sherin, 2007; Sherin and van Es, 2009; Colestock and Sherin, 2009). Once the spreadsheet was complete, we began inductively identifying patterns in what GTAs noticed. For example, four GTAs noticed that students asked a lot of questions, and three GTAs noticed that students struggled with different parts of the lab protocol. We listed these noticing patterns and continued to review GTAs’ teacher noticing as we looked for instances that supported or conflicted with our initially identified patterns. We then grouped GTAs’ noticing events by similarity; for example, noticing related to student questions and noticing that students struggled with the lab protocol were grouped into the “noticing evidence of student understanding” category. The three final inductive categories of GTA teacher noticing were: noticing evidence of student understanding, noticing student participation, and noticing the pace of the teaching session. All teacher noticing events that were coded in two or more different GTAs’ interviews are included in these groupings; if a teacher noticing event was coded in only one interview, it was not included in a category. The Results section is organized by category of teacher noticing, and associated interpretations are reported. In some cases, multiple GTAs noticed similar events but interpreted them differently; these instances are described in the Results section below.

Results

We sought to explore what chemistry GTAs notice, and how they interpret what they notice, during their lab and discussion sessions. Through qualitative analysis of interview and observational data from six chemistry GTAs, we identified components of discussion and lab sessions that our GTA participants noticed about their students during class. Chemistry GTAs’ noticing events were inductively grouped into three categories: student understanding, student participation, and pacing of the teaching session. In the sections below, we describe these noticing events and GTAs’ interpretations of them to provide insight into what GTAs believe is important to pay attention to during class and how they interpret those events to inform their teaching.

Noticing evidence of student understanding

All GTA participants recalled events related to students’ questions, which indicated student understanding, or lack thereof, depending on the GTA and the context of their teaching session. GTAs noticed if students asked questions generally, if students asked a lot of questions, or if students did not ask questions (Fig. 2). When GTAs interpreted students’ questions as basic or fundamental, they further interpreted that to indicate students were struggling to understand the course content. For example, after a session in which Sol (organic chemistry discussion GTA) reviewed elimination reaction mechanisms by drawing mechanisms on the chalkboard and asking students questions during his explanation, he said,
Fig. 2 GTAs' noticing events and interpretations related to evidence of student understanding of content.

Generally, every discussion that I had yesterday a student asked, “What's a beta hydrogen and how do I determine it?” There's nothing special about a beta hydrogen, and it's such an easy concept, but when it was covered in class, they didn't pick it up.

Sol interpreted students’ questions about beta-hydrogens to be about a fundamental topic and thus indicating students did not understand this topic. Similarly, when GTAs noticed that students were asking a lot of questions, GTAs interpreted that to indicate that students were confused about course content or not prepared for an upcoming exam.

In contrast, some questions asked by students were interpreted by GTAs as being more advanced questions. GTAs further interpreted that to indicate that students were progressing in their understanding of chemistry content or abilities to carry out steps in a lab protocol. For example, Calvin (organic chemistry lab GTA) reflected on students’ questions as the semester progressed. After a lab session that occurred about eight weeks into the semester, Calvin said,

They don't ask the same inexperienced questions anymore. The questions are much more fundamental. […] Most of them, in most cases, know how to handle the reaction.

Calvin, like other GTA participants, interpreted students’ ability to ask higher-level questions to indicate that students understood course content, a central goal of chemistry laboratory and discussion sessions.

The GTAs who noticed that students were not asking questions differed in their interpretations. This indicated to some GTAs that the session was going well, and students understood content. For example, Mallory (general chemistry discussion GTA) held a discussion session in which students worked on the practice problems provided by the course professor. Toward the end of the session, Mallory reviewed practice problems and invited students to ask questions about the content. In the post-observation interview for this session, Mallory said,

No one had questions on mole conversions, so I assumed that was because they seemed to know what was going on and seemed to understand it.

Instances when students were not asking questions or when students were asking a lot of questions indicated to other GTAs that the students were not following along or understanding content. Mallory (general chemistry discussion GTA) said, “If there are no more questions, that'd imply that at least people either have said, “Okay, I think I understand this,” or have given up completely.” Mallory provides an example of a GTA who was not sure how to interpret students’ lack of questions; she was unsure if students understood content or if they were no longer engaged in the teaching session.

GTAs also recalled when students made connections across sessions, which GTAs interpreted to indicate that students understand course content. For example, in a lab session led by Andrew (general chemistry lab GTA), students completed a lab activity and worked on a related worksheet in their groups. In the post-observation interview for this session, Andrew discussed a worksheet question that students completed in class and said, “It was a fairly difficult question I was asking them about equilibrium, and I think they all got it because they were able to connect the dots between the two labs.” Andrew noticed that students recalled content from a previous lab to inform their current course work, which indicated to Andrew that students understood that content.

Finally, lab GTAs noticed when students struggled with specific parts of the protocol, indicating to them that certain lab techniques are challenging for students. Andrew (general chemistry lab GTA) led a session to introduce the technique of distillation to his students. In the lab, students were tasked with extracting caffeine from tea. During the post-observation interview, Andrew said, “A lot of students had a long time trying to actually get the [solvent] to boiling, because they filled their beakers with huge amounts of water and took forever to heat up.” Andrew and other lab GTAs recalled particular instances when students appeared to struggle with experimental techniques.

Noticing student participation

GTAs also recalled the ways students participated in class (Fig. 3). Some GTAs noticed when students worked together, which indicated to GTAs that students were trying to help each other with their tasks. For example, after a typical session in which Andrew (general chemistry lab GTA) guided students through an experiment, he said,
Fig. 3 GTAs' noticing events and interpretations related to student participation.

I started seeing groups, two or three teams, huddled together at the same table, working on answering questions together, trying to teach others how to use the equations to propagate uncertainty.

Andrew noticed his students working in groups to help each other answer questions.

Other GTAs noticed if students were not talking to each other and interpreted this in different ways. After Abby (biochemistry discussion GTA) led a review session by discussing key concepts for the course, she noticed students did not have any questions about the reviewed content. Abby interpreted this to mean that students were following along with her. When Mallory noticed students working independently during group work in her general chemistry discussion class, she interpreted it to mean that the students who were not talking preferred to work independently. For example, Mallory said,

There are some kids who are better at group work than others. There’re some kids who learn better while working alone. There are some kids who will ask questions no matter whether they're in a group or they're in an individual setting. I think for some kids, it is beneficial, but for other kids, it's a neutral contribution.

Mallory noticed that some of her students worked alone, which indicated to Mallory that those students preferred to learn independently.

GTAs also noticed whether students were quiet in general (not talking to each other or to their GTA), which indicated to GTAs that students did not know what was going on or were overwhelmed by course content. Abby (biochemistry discussion GTA) noticed students were quiet during a session held before students’ midterm examination and said,

I think they were just so overwhelmed because they have so many chapters to cover and so much material that they are a little quiet.

Andrew (general chemistry lab GTA), on the other hand, noted that he was unsure about how to interpret when students were quiet and said,

There were some people that are very quiet and […] I think some of them… they might not have known what was going on, but I didn't know it because they weren't coming up to me. That's something I'm still trying to figure out.

Andrew noticed that some students were quiet and noted that he was uncertain of how to assess students who did not ask him questions.

GTAs noticed if students were participating in class, which indicated to GTAs that students were learning and that the session went well. After a session that occurred at the end of the term, in which Sol (organic chemistry discussion GTA) asked students to go to the board and create a mind map of concepts covered throughout the term, he said,

They actually really participated well. Sometimes I really have to drag them up to the board, especially when it's not just doing a problem, but everyone got up. Everyone participated, and they interacted well, so that went well.

In this quote, Sol recalled an activity in which students readily participated, which indicated to him that his session went well.

Finally, GTAs noticed if students were using external resources to complete tasks during class. For example, Andrew (general chemistry lab GTA) described an instance in which he noticed students rewatching pre-lab videos while working on a worksheet during lab:

They were watching my video to make sure they knew how to do the calculations. And so that worked out really well. I'm really liking how students are able to use the pre-lab videos at the end and try to connect the dots and be able to do their experimental workup and try to connect that with what we taught in lecture.

Students’ use of the pre-lab video indicated to Andrew that they were making connections between content taught in lecture and the lab experiment.

Noticing the pace of the teaching session

All GTA participants reflected on the pace of the teaching session based directly on student actions or behaviors (Fig. 4). GTAs noticed that students worked at different paces, with some completing their tasks more quickly than others. GTAs who noticed a student or group working more slowly than others interpreted it to mean that they were struggling with the material, especially if they were performing a technique for the first time. For example, after a session that occurred early in the semester, Calvin (organic chemistry lab GTA) reflected on students’ liquid–liquid extraction technique and said, “Since they are doing it for the first time, they will be slow at it, and they need to be guided.” Grace (computational chemistry lab GTA) also noticed that some students were working more slowly on their computational lab protocol. Grace attributed this to students’ brains working differently:
Fig. 4 GTAs' noticing events and interpretations related to the pace of the teaching session.

The other thing that I noticed is that everybody works at different speeds, and there's not one specific way that you can do this. […] There's a way that everybody can do it differently based on how your brain works.

Grace noticed that students work through their lab protocols in different ways and at different speeds due to their different ways of thinking.

GTAs also recalled when students completed their work by the end of class and left on time, which indicated to GTAs that students understood their tasks for the day and finished the required work or that the session went well. For example, Calvin (organic chemistry lab GTA) often reflected on the time students needed to complete the lab experiment. In a post-observation interview, Calvin said,

It went totally fine. Got stuff done, we were supposed to do a certain experiment. Everyone did that. And yeah, it was alright.

In some cases, GTAs reflected on times when students left the session early. Some GTAs interpreted this similarly to when they noticed that students left the session on time: students finished their tasks, and the lab session went well. Other GTAs, however, interpreted this to mean that these students did not think staying in class for the entire duration was useful. For example, after a session held on the same day as a midterm examination, Mallory (general chemistry discussion GTA) said:

A lot of students left because they wanted to study on their own and didn't think it would be useful to stay in the session.

When students left Mallory's discussion session early, she interpreted it to mean that students would rather study on their own.

Discussion

To identify what chemistry GTAs notice and how they interpret what they notice in their discussion and lab classrooms, we conducted multiple observations and post-observation interviews with six chemistry GTAs teaching a variety of courses. In this study, we described two dimensions of the teacher noticing cycle: notice and interpretation (Sherin et al., 2011). GTAs’ prior experiences as teachers and learners, knowledge of teaching, cultural backgrounds, knowledge of content, their own teaching context, and more likely influence what GTAs choose to pay attention to during their sessions (Jacobs et al., 2010; Erickson, 2011), which makes chemistry GTAs a unique population of instructors. We presented three inductive categories of GTAs’ teacher noticing: student understanding of content, student participation, and the pace of the teaching session.

Our GTA participants’ teacher noticing related to student understanding of content revealed that GTAs often evaluated student learning based on the questions asked by students. GTAs’ focus on student questions may be a result of GTAs viewing their role as tutors or as someone to answer students’ questions about content taught in the lecture (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020). In both lab and discussion sessions, GTAs often relied on students taking the initiative to speak up. Gauging student understanding by the questions asked provides an indication of learning for the students willing to ask questions, but not for all students. In a quote above, Andrew mentioned that he noticed his students were quiet during the lab session, which may have indicated that they did not know what was going on, but he was not completely sure; he said he is still trying to figure out how to assess those students. Andrew describes a situation in which the class may benefit from what van Es and Sherin (2021) describe as “shaping”: creating opportunities to elicit student thinking. Shaping can be particularly useful in a chemistry lab, where students may need support connecting their lab work to chemistry concepts learned in lecture (Heppert et al., 2002; DiBiase and Wagner, 2010), and in situations like Andrew's, when students are reserved during sessions. We rarely observed our GTA participants deliberately creating these types of opportunities for students to share their thinking. Instead, we observed GTAs relying on students to take the initiative to ask questions, perhaps due to GTAs’ view of their role as tutors and question-answerers. GTAs also view their role as lab managers (Sandi-Urena and Gatlin, 2013; Zotos et al., 2020), which may explain, in part, why our GTA participants also evaluated student learning based on relatively surface-level factors, such as whether students were quiet and whether students finished lab on time.

Our GTA participants’ focus on surface-level factors is consistent with other studies that describe pre-service teacher noticing (Morris, 2006; Barnhart and van Es, 2015; Chan and Yau, 2021). Our GTA participants often demonstrated what Barnhart and van Es (2015) described as lower levels of sophistication in noticing skills, as they often attended to student behaviour and classroom climate. For example, our GTA participants noticed whether students finished work on time or early. While this may be influenced by one of the central goals of chemistry lab sessions (that students complete the experiment during class), when instructors exhibit limited skills in noticing, their potential to interpret and respond in more sophisticated ways is also limited (Barnhart and van Es, 2015). In some cases, GTAs demonstrated what Barnhart and van Es (2015) describe as medium sophistication, as they noticed individual students’ thinking based on the questions asked by students as well as instances when students connected concepts across multiple class sessions. This noticing skill can be leveraged to improve GTAs’ overall noticing, which can in turn support the development of their interpretation and response skills.

In some cases, multiple GTAs noticed similar events in their sessions but interpreted them differently. For example, when Mallory, Abby, and Grace noticed that students did not have questions, the lack of questions indicated to them that students were following along and understanding material. In contrast, when Andrew noticed the same thing, he interpreted it to mean that students were not following along. Additionally, GTAs often noticed when students did ask questions. GTAs interpreted some of these questions as fundamental and others as more advanced. GTAs further interpreted fundamental questions to indicate that students were struggling to understand content and advanced questions to indicate that students understood content. GTAs’ categorization of questions as fundamental or advanced is likely related to each GTA's knowledge and previous experiences in instructional settings, perhaps both as a student and as an instructor. This further emphasizes the subjective and complex nature of teacher noticing and interpretation (Jacobs et al., 2010; Erickson, 2011). Furthermore, the context in which GTAs teach may influence their noticing and interpretation. When students left the session early, Grace, Andrew, and Calvin, who taught lab sessions, interpreted it to mean that the lab went well and that students understood the lab and had completed their required work. When students left early from Mallory's discussion session, she interpreted it to mean that students felt it would be more helpful to study on their own.

Because GTAs interpreted surface-level actions as signs of student learning, and because they interpreted these actions differently, there is a need to support GTAs in learning about formative assessment techniques that elicit student thinking and assess students’ progress against the learning goals for the course. GTAs may benefit from the opportunity to implement and evaluate such techniques in their own sessions, as in the GTA training program reported by Mutambuki and Schwartz (2018). This may help GTAs obtain a clearer and more accurate account of student learning, while also increasing student participation, rather than relying on assumptions based on superficial features of their sessions.

Limitations

Teacher noticing is a complex process that is difficult to capture in its entirety. In the study presented herein, we based our report of GTA teacher noticing on what GTAs mentioned in their post-observation interviews. Asking teachers to recall events that occurred during a teaching session provides a window into what they pay attention to (Borko and Livingston, 1989). However, it is certainly possible that our participants noticed, interpreted, and responded to other events in their discussion or lab sessions that they did not recall in post-observation interviews. Furthermore, many interpretation processes happen internally and in the moment and cannot be directly observed. We aimed to avoid making assumptions about what was or was not noticed based solely on the observation recordings. Additionally, we did not review observation recordings during post-observation interviews. A study that involves GTAs watching recordings of their teaching and reflecting on what they notice may provide more detailed insight into what GTAs notice during instruction, how they interpret what they notice, and how they respond (Ainley and Luntley, 2007; Rosaen et al., 2008; Sherin and Dyer, 2017). Additionally, our findings may not be generalizable to the broader chemistry GTA population given the small number of participants. However, our goal was not to provide generalizable cases but rather to explore a few cases that can provide a starting point for additional empirical studies focused on chemistry GTAs’ teacher noticing.

Implications

For research

This study is the first to describe chemistry GTAs’ teacher noticing in the classroom, which provides insight into what GTAs believe is important to pay attention to during instruction. This work can inform future studies that specifically investigate chemistry GTAs’ teacher noticing of students’ chemical thinking, as this framework has been used successfully in investigating pre-service and in-service teachers’ noticing of students' mathematical and scientific thinking (e.g., Sherin and van Es, 2009; Sherin et al., 2011; Benedict-Chambers, 2016). Such a study may provide insight into whether GTAs’ interpretations of students’ questions as fundamental or advanced truly align with the questions posed by students. Similarly, identifying GTAs’ teacher noticing in an inquiry lab, where GTAs need to lead students through their work, may provide interesting insights. Because our data involved GTAs’ recall of events that happened in their sessions, a study in which GTAs watch their own recording during the post-observation interview may help GTAs recall key events they may have otherwise forgotten and provide more detail about how they interpret what they noticed (Ainley and Luntley, 2007; Rosaen et al., 2008). A research design such as this may allow GTAs to expand on some of the noticing events reported in this study, such as how GTAs categorize students’ questions as basic or advanced. Additionally, a study in which GTA participants watch a standardized recording may be helpful in identifying similarities in teacher noticing across GTA participants (Star and Strickland, 2008; Colestock and Sherin, 2009). Finally, a comparison of teacher noticing across a wider range of experience, such as a novice GTA and a faculty member with experience teaching lab sessions, would be useful in further understanding the development of teacher noticing skills. It would also provide insight into the potential influence of teaching roles, as GTAs are often given lab protocols and asked to help students complete them, while faculty are the ones who develop the lab protocols.

For practice

While it may be instinctual for GTAs to focus on whether students finished the lab early or on the specific questions individual students ask, given GTAs’ perceptions of their role, these types of indicators do not provide a complete picture of student learning (Sabers et al., 1991). We recommend that GTA training focus on developing GTAs’ skills in creating opportunities for students to share their thinking and in using this to guide students’ learning, as leveraging students’ thinking creates a more equitable and positive learning experience for students (Warren et al., 2001; Empson, 2003; Thornton, 2006; Richards and Robertson, 2016). Simultaneously, instructors develop their own teaching knowledge as they learn more about how students think and grapple with content (Franke et al., 2001; Kim, 2019). Methods such as approaching students as they work on tasks and asking open-ended questions provide better indicators of student learning, as these types of questions usually require a student-generated explanation. Additionally, teaching strategies like “think-pair-share” help to develop peer relationships while also providing an opportunity for students to share their thinking (White et al., 2021). Such methods do not rely on the content covered in class; they can be productive in both lab and discussion courses at any level.

As our GTA participants primarily relied on student actions to assess learning, GTA training programs could increase focus on both noticing student thinking and creating opportunities to elicit student thinking. Training may also support GTAs in learning to leverage students’ thoughts to move the class forward—an important teaching strategy to promote productive classroom discourse (Warren et al., 2001; Empson, 2003; Richards and Robertson, 2016; Gehrtz et al., 2022)—which in turn may support GTAs in further noticing students’ understanding of course content. The types of questions teachers ask can also influence the ways students think about and learn course content (Chin, 2007). One method to promote instructors’ teacher noticing skills is to have instructors watch classroom recordings and reflect on what they observed. This can be done in pre-semester training, where GTAs watch recordings of other GTAs teaching and respond to prompts focused on specific aspects of the classroom (Morris, 2006; Sherin and van Es, 2009). If GTA training continues into the semester, training leaders may ask GTAs to record a 4–5 minute clip of their own teaching that demonstrates student thinking, and to share the recording with their peers. This process may encourage GTAs to elicit student thinking while they teach as their recordings will be shared with peers (Sherin and Dyer, 2017). For such training programs to be most productive, GTAs must also reflect on their observations with peers. The observation and reflection should be guided by the dimensions of the teacher noticing framework and thus should focus on what the GTAs notice, how they interpret what they notice, and how they would respond. The teacher noticing framework can be a productive avenue to support GTA teacher learning in their own context as they engage in their teaching role.

Conflicts of interest

There are no conflicts to declare.

Appendix 1: Post-observation interview

How did your section go?

What material are you covering right now?

What were the aims of your particular section?

What were the goals besides covering the content?

Did you achieve your goals?

What parts of your section could have gone better?

What parts of your section went particularly well?

How can you tell?

Have you taught this content before?

Did you use any teaching techniques that you haven’t used before?

Were there challenges with using these new techniques?

What went better than the previous week?

What do you feel like you could improve on for next week?

References

  1. Ainley J. and Luntley M., (2007), The role of attention in expert classroom practice, J. Math. Teacher Educ., 10, 3–22.
  2. Baldwin N. and Orgill M., (2019), Relationship between teaching assistants’ perceptions of student learning challenges and their use of external representations when teaching acid–base titrations in introductory chemistry laboratory courses, Chem. Educ. Res. Pract., 20(4), 821–836 DOI:10.1039/c9rp00013e.
  3. Barnhart T. and Van Es E., (2015), Studying teacher noticing: examining the relationship among pre-service science teachers’ ability to attend, analyze and respond to student thinking. Teach. Teach. Educ., 45, 83–93 DOI:10.1016/j.tate.2014.09.005.
  4. Benedict-Chambers A., (2016), Using tools to promote novice teacher noticing of science teaching practices in post-rehearsal discussions, Teach. Teach. Educ., 59, 28–44 DOI:10.1016/j.tate.2016.05.009.
  5. Blömeke S., Jentsch A., Ross N., Kaiser G. and König J., (2022), Opening up the black box: teacher competence, instructional quality, and students’ learning progress, Learn. Instruct., 79, 101600 DOI:10.1016/j.learninstruc.2022.101600.
  6. Bond-Robinson J. and Rodriques R. A. B., (2006), Catalyzing graduate teaching assistants’ laboratory teaching through design research, J. Chem. Educ., 83(2), 313 DOI:10.1021/ed083p313.
  7. Borko H. and Livingston C., (1989), Cognition and improvisation: Differences in mathematics instruction by expert and novice teachers, Amer. Educ. Res. J., 26(4), 437–498,  DOI:10.3102/00028312026004473.
  8. Carter K., Cushing K., Sabers D., Stein P. and Berliner D., (1988), Expert-Novice Differences in Perceiving and Processing Visual Classroom Information, J. Teach. Educ., 39(3), 25–31,  DOI:10.1177/002248718803900306.
  9. Chan K. K. H. and Yau K. W., (2021), Using Video-Based Interviews to Investigate Pre-service Secondary Science Teachers’ Situation-Specific Skills for Informal Formative Assessment, Int. J. Sci. Math. Educ., 19(2), 289–311 DOI:10.1007/s10763-020-10056-y.
  10. Chan K. K. H., Xu L., Cooper R., Berry A. and van Driel J. H., (2021), Teacher noticing in science education: Do you see what I see? Studies Sci. Educ., 57(1), 1–44.
  11. Chin C., (2007), Teacher questioning in science classrooms: approaches that stimulate productive thinking, J. Res. Sci. Teach., 44(6), 815–843 DOI:10.1002/tea.20171.
  12. Colestock A. and Sherin M., (2009), Teachers' sense-making strategies while watching video of mathematics instruction, J. Tech. Teach. Educ., 17(1), 7–29.
  13. Deacon C., Hajek A. and Schulz H., (2017), Graduate teaching assistants' perceptions of competencies required for work in undergraduate science labs, Int. J. Sci. Educ., 39(16), 2189–2208 DOI:10.1080/09500693.2017.1367110.
  14. DiBiase W. and Wagner E. P., (2010), Aligning general chemistry laboratory with lecture at a large university, School Sci. Math., 102(4), 158–171.
  15. Dini V., Sevian H., Caushi K. and Orduna Picon R., (2019), Characterizing the formative assessment enactment of experienced science teachers, Sci. Educ., 104, 290–325.
  16. Empson S. B., (2003), Low-performing students and teaching fractions for understanding: An interactional analysis, J. Res. Math. Educ., 34(4), 305–343,  DOI:10.2307/30034786.
  17. Erickson F., (2011), On teacher noticing, in Sherin M., Jacobs V. and Philipp R. (ed.), Mathematics teacher noticing: Seeing through teachers' eyes, New York, NY: Routledge, pp. 17–34.
  18. Feiman-Nemser S. and Buchmann M., (1985), Pitfalls of experience in teacher preparation, Teach. Col. Rec., 87(1), 53–65,  DOI:10.1177/016146818508700107.
  19. Franke M. L., Carpenter T. P., Levi L. and Fennema E., (2001), Capturing teachers' generative change: A follow-up study of professional development in mathematics, Amer. Educ. Res. J., 38(3), 653–689,  DOI:10.3102/00028312038003653.
  20. Gehrtz J., Brantner M. and Andrews T. C., (2022), How are undergraduate STEM instructors leveraging student thinking? Int. J. STEM Educ., 9(1), 1–20,  DOI:10.1186/s40594-022-00336-0.
  21. Gibbons R. E., Villafane S. M., Stains M., Murphy K. L. and Raker J. R., (2018), Beliefs about learning and enacted instructional practices: an investigation in postsecondary chemistry education, J. Res. Sci. Teach., 55(8), 1111–1133,  DOI:10.1002/tea.21444.
  22. Golde C. M. and Dore T. M., (2001), At Cross Purposes: What the Experiences of Doctoral Students Reveal about Doctoral Education, Philadelphia: Pew Charitable Trusts.
  23. Harwood W. S., Hansen J. and Lotter C., (2006), Measuring Teacher Beliefs About Inquiry: The Development of a Blended Qualitative/Quantitative Instrument, J. Sci. Educ. Technol., 15(1), 69–79 DOI:10.1007/s10956-006-0357-4.
  24. Heppert J., Ellis J. and Robinson J., (2002), Problem solving in the chemistry laboratory, J. College Sci. Teach., 31(5), 322.
  25. Herridge M. and Talanquer V., (2021), Dimensions of Variation in Chemistry Instructors’ Approaches to the Evaluation and Grading of Student Responses, J. Chem. Educ., 98, 270–280 DOI:10.1021/acs.jchemed.0c00944.
  26. Herridge M. and Tashiro J., (2021), Research and Practice of student written responses and its impact on grading, Chem. Educ. Res. Pract., 22(4), 948–972 DOI:10.1039/d1rp00061f.
  27. Jacobs V. R., Franke M. L., Carpenter T. P., Levi L. and Battey D., (2007), Professional development focused on children's algebraic reasoning in elementary school, J. Res. Math. Educ., 38(3), 258–288,  DOI:10.2307/30034868.
  28. Jacobs V. R., Lamb L. L. C. and Philipp R. A., (2010), Professional noticing of children's mathematical thinking, J. Res. Math. Educ., 41(2), 169–202.
  29. Kim H. j., (2019), Teacher learning opportunities provided by implementing formative assessment lessons: Becoming responsive to student mathematical thinking, Int. J. Sci. Math. Educ., 17, 341–363,  DOI:10.1007/s10763-017-9866-7.
  30. König J., Blömeke S., Klein P., Suhl U., Busse A. and Kaiser G., (2014), Is teachers’ general pedagogical knowledge a premise for noticing and interpreting classroom situations? A video-based assessment approach, Teach. Teach. Educ., 38, 76–88 DOI:10.1016/j.tate.2013.11.004.
  31. Kurdziel J. P., Turner J. A., Luft J. A. and Roehrig G. H., (2003), Graduate Teaching Assistants and Inquiry-Based Instruction: Implications for Graduate Teaching Assistant Training, J. Chem. Educ., 80(10), 1206 DOI:10.1021/ed080p1206.
  32. Lane A. K., Hardison C., Simon A. and Andrews T. C., (2019), A model of the factors influencing teaching identity among life sciences doctoral students, J. Res. Sci. Teach., 56, 141–162 DOI:10.1002/tea.21473.
  33. Lotter C., Harwood W. S. and Jose J., (2007), The Influence of Core Teaching Conceptions on Teachers’ Use of Inquiry Teaching Practices, J. Res. Sci. Teach., 44(9), 1318–1347 DOI:10.1002/tea.
  34. Luft J. A., Kurdziel J. P., Roehrig G. H. and Turner J., (2004), Growing a garden without water: graduate teaching assistants in introductory science laboratories at a doctoral/research university, J. Res. Sci. Teach., 41(3), 211–233 DOI:10.1002/tea.20004.
  35. Luna M. J., Selmer S. J. and Rye J. A., (2018), Teachers’ Noticing of Students’ Thinking in Science Through Classroom Artifacts: In What Ways Are Science and Engineering Practices Evident? J. Sci. Teach. Educ., 29(2), 148–172 DOI:10.1080/1046560X.2018.1427418.
  36. Morris A. K., (2006), Assessing pre-service teachers’ skills for analyzing teaching, J. Math. Teach. Educ., 9, 471–505 DOI:10.1007/s10857-006-9015-7.
  37. Mutambuki J. M. and Schwartz R., (2018), We don’t get any training: the impact of a professional development model on teaching practices of chemistry and biology graduate teaching assistants, Chem. Educ. Res. Pract., 19, 106–121 DOI:10.1039/C7RP00133A.
  38. Richards J. and Robertson A. D., (2016), A review of the research on responsive teaching in science and mathematics, in Robertson A. D., Scherr R. E. and Hammer D. (ed.), Responsive teaching in science and mathematics, Routledge, pp. 1–35.
  39. Rosaen C. L., Lundeberg M., Cooper M., Fritzen A. and Terpstra M., (2008), Noticing Noticing: How Does Investigation of Video Records Change How Teachers Reflect on Their Experiences? J. Teach. Educ., 59(4), 347–360,  DOI:10.1177/0022487108322128.
  40. Russ R. and Luna M., (2013), Inferring teacher epistemological framing from local patterns in teacher noticing, J. Res. Sci. Teach., 50(3), 284–314.
  41. Sabers D. S., Cushing K. S. and Berliner D. C., (1991), Differences Among Teachers in a Task Characterized by Simultaneity, Multidimensionality, and Immediacy, Am. Educ. Res. J., 28(1), 63–88 DOI:10.3102/00028312028001063.
  42. Sandi-Urena S. and Gatlin T., (2013), Factors contributing to the development of graduate teaching assistant self-image, J. Chem. Educ., 90(10), 1303–1309 DOI:10.1021/ed200859e.
  43. Sherin M. G., (2007), The development of teachers' professional vision in video clubs, in Goldman R., Pea R., Barron B. and Derry S. J. (ed.), Video research in the learning sciences, Mahwah, NJ: Erlbaum, pp. 383–395.
  44. Sherin M. G. and Dyer E. B., (2017), Mathematics teachers’ self-captured video and opportunities for learning, J. Math. Teach. Educ., 20, 477–495 DOI:10.1007/s10857-017-9383-1.
  45. Sherin M., Jacobs V. and Philipp R., (2011), Mathematics teacher noticing: Seeing through teachers' eyes, New York, NY: Routledge.
  46. Sherin B. and Star J. R., (2011), Reflections on the study of teacher noticing, in Sherin M. G., Jacobs V. R. and Philipp R. A. (ed.) Mathematics Teacher Noticing: Seeing Through Teachers' Eyes, New York, NY: Routledge, pp. 66–78.
  47. Sherin M. G. and van Es E. A., (2009), Effects of Video Club Participation on Teachers’ Professional Vision, J. Teach. Educ., 60(1), 20–37.
  48. Star J. R. and Strickland S. K., (2008), Learning to observe: using video to improve preservice mathematics teachers’ ability to notice, J. Math. Teach. Educ., 11, 107–125 DOI:10.1007/s10857-007-9063-7.
  49. Talanquer V., Bolger M. and Tomanek D., (2015), Exploring prospective teachers' assessment practices: noticing and interpreting student understanding in the assessment of written work, J. Res. Sci. Teach., 52(5), 585–609.
  50. Thornton H., (2006), Dispositions in action: Do dispositions make a difference in practice? Teach. Educ. Quart., 33(2), 53–68.
  51. van Es E., (2011), A framework for learning to notice student thinking, in Sherin M., Jacobs, V. and Philipp R. (ed.), Mathematics teacher noticing: Seeing through teachers' eyes, New York, NY: Routledge, pp. 134–151.
  52. van Es E. A. and Sherin M. G., (2021), Expanding on prior conceptualizations of teacher noticing, ZDM-Math. Educ., 53(1), 17–27 DOI:10.1007/s11858-020-01211-4.
  53. Walkoe J., Sherin M. and Elby A., (2020), Video tagging as a window into teacher noticing, J. Math. Teach. Educ., 23(4), 385–405 DOI:10.1007/s10857-019-09429-0.
  54. Warren B., Ballenger C., Ogonowski M., Rosebery A. S. and Hudicourt-Barnes J., (2001), Rethinking diversity in learning science: The logic of everyday sense-making, J. Res. Sci. Teach., 38(5), 529–552,  DOI:10.1002/tea.1017.
  55. Watts F. M. and Finkenstaedt-Quinn S. A., (2021), The current state of methods for establishing reliability in qualitative chemistry education research articles, Chem. Educ. Res. Pract., 22(3), 565–578 DOI:10.1039/D1RP00007A.
  56. White K. N., Vincent-Layton K. and Villarreal B., (2021), Equitable and inclusive practices designed to reduce equity gaps in undergraduate chemistry courses, J. Chem. Educ., 98, 330–339 DOI:10.1021/acs.jchemed.0c01094.
  57. Zotos E. K., (2022), Chemistry graduate students' knowledge for teaching and factors that influence their development as instructors, Doctoral dissertation, Ann Arbor, MI: University of Michigan.
  58. Zotos E. K., Moon A. C. and Shultz G. V., (2020), Investigation of chemistry graduate teaching assistants’ teacher knowledge and teacher identity, J. Res. Sci. Teach., 57(6), 943–967 DOI:10.1002/tea.21618.

This journal is © The Royal Society of Chemistry 2024