Assessing TA buy-in to expectations and alignment of actual teaching practices in a transformed general chemistry laboratory course

Erin M. Duffy *a and Melanie M. Cooper b
aDepartment of Chemistry, Western Washington University, 516 High St – MS 9150, Bellingham, WA 98225, USA
bDepartment of Chemistry, Michigan State University, 578 South Shaw Lane, East Lansing, MI 48824, USA

Received 27th March 2019, Accepted 27th August 2019

First published on 2nd September 2019

Inquiry-style laboratory courses, in which students engage in open-ended projects rather than a prescribed set of experimental steps (“cookbooks”), are becoming increasingly popular at the undergraduate level. Reformed curricula require reforms in teacher training; in the case of large universities, laboratory instructors are typically graduate teaching assistants (TAs). The General Chemistry Laboratory courses at a large, public, research-intensive university in the Midwestern region of the United States recently underwent a transformation from a “cookbook” to a project-based lab, and despite efforts to improve training, TAs continue to express difficulty teaching the course. To determine the source of these difficulties, we conducted multiple video observations and semi-structured interviews with seven TAs throughout one semester. We report TAs’ beliefs about what is expected of them, their philosophical alignment to perceived expectations, and a comparison of the Lab Coordinator's expectations to TAs’ actual teaching practices. We found that the TAs generally agreed with behaviors they were expected to perform, but their responses about actions they were not supposed to perform indicated that they were unsure of what the Lab Coordinator expected and personally believed that an ideal TA would perform those actions. This work highlights a need to clearly communicate the aims and expectations in a course and the rationale for those choices.


Laboratory courses are a longstanding tradition in chemistry degree programs. Historically, these courses were developed to train students to become competent workers in industry (Mabery, 1892); however, as fewer students in chemistry laboratory courses ultimately became chemistry practitioners, the goals have broadened beyond practical laboratory skills. By the 1960s, the idea that laboratory courses are an opportunity for students to engage in “inquiry” emerged (Hofstein and Lunetta, 1982), but it was not until the early 2000s, with the introduction of Inquiry and the National Science Education Standards (National Research Council, 2000), that inquiry learning became more prominent. In contemporary reviews of the chemistry laboratory education literature, Hofstein and Lunetta recommended an increase of inquiry-based laboratory courses in lieu of “cookbook” labs (Hofstein and Lunetta, 2004), and Reid and Shah called for a greater emphasis on “the process of thought and enquiry and much less on getting a ‘right’ answer” (Reid and Shah, 2007). An actionable interpretation of inquiry is to engage students in scientific practices, such as those defined by the National Research Council's Framework for K-12 Science Education (National Research Council, 2012):

1. Asking questions and defining problems

2. Developing and using models

3. Planning and carrying out investigations

4. Analyzing and interpreting data

5. Using mathematics and computational thinking

6. Constructing explanations and designing solutions

7. Engaging in argument from evidence

8. Obtaining, evaluating, and communicating information

The General Chemistry Laboratory (GCL) course sequence discussed in this work uses an adaptation of the Cooperative Chemistry curriculum (Cooper, 1994; Sandi-Urena et al., 2011a, 2011b), which extensively incorporates scientific practices (Carmel et al., 2017, 2019a). This curriculum was originally developed by the second author in the early 1990s but was adopted by the institution's Department of Chemistry in 2015 concurrent with a transformation of the General Chemistry lecture curriculum (Cooper et al., 2012; Cooper and Klymkowsky, 2013; Williams et al., 2015; Underwood et al., 2016). Adoption and adaptation of Cooperative Chemistry at this institution were led by a new General Chemistry Laboratory Coordinator and a former postdoctoral research associate in the second author's research group. The second author provided the curricular materials as used at the originating institution, and the adoption leaders independently adapted the materials for use at this institution.

In Cooperative Chemistry, students work in small groups of 3 to 4 members on multi-week projects to investigate questions or to solve problems posed by scenarios designed to show real-world applications of the kinds of problems the students are addressing. Students are required to determine what needs to be done to solve the problem, design experimental procedures, perform experiments, analyze and interpret experimental data, make arguments about what the data means, and report findings for each project. These projects and the extent to which students engage in various scientific practices for each project have been described previously (Carmel et al., 2019b); for clarity, brief project descriptions for the experiments observed in this work are shown in Table 1 with more details provided in Appendix 1.

Table 1 Brief description of projects performed during classroom observations
Course Title of project General goals for each week
GCL1 Identification and synthesis of an unknown ionic compound • Week 1: Conduct qualitative tests to determine the identity of an unknown ionic compound
• Week 2: Confirm identity by conducting additional tests and comparing to known sample
GCL1 Food dye spectroscopy • Week 1: Experimentally determine the following relationships: wavelengths of absorbance vs. apparent color, and intensity of absorbance vs. concentration.
• Week 2: Identify the dyes present in a sports drink and create a solution that mimics the color profile of the drink
GCL2 Soaps and detergents • Week 1: Synthesize soaps and detergents from various starting materials
• Week 2: Standardize acids/bases for use in analyses of wastewaters (filtrate after isolating soap/detergent)
• Week 3: Analyze wastewaters from synthesis and determine “best” soap/detergent out of those synthesized
GCL2 Artificial kidney stones • Week 1: Synthesize various water-insoluble salts that are typical of kidney stone composition
• Week 2: Compare various methods of dissolving the salts and determine the “best” method to combat kidney stones

However, while students have the opportunity to use scientific practices in the laboratory, doing so in the context of chemistry is nontrivial. As is common in large universities (Luft et al., 2004), graduate teaching assistants are responsible for teaching undergraduate students in the GCL courses at this institution. Since the transformation of the laboratories, the department has increased time for TA training, and efforts have been made to better prepare TAs so that they can facilitate learning in our reformed courses. While some of these efforts have been led by the authors of this work, not all of these training opportunities were specific to GCL TAs. These general and lab-specific trainings are described in more detail in Appendix 2.

Despite this added preparation, TAs continued to express difficulty with teaching the laboratory. Some of these difficulties may stem from the fact that the prior laboratory course was much more traditional in nature: the TA instructed, rather than guided, students who worked on one-week exercises by following a given procedure to produce a known outcome. The shift to a different type of teaching and learning experience posed a challenge for experienced TAs who had become accustomed to teaching “cookbook”-style labs; however, new TAs also expressed difficulties with teaching the laboratories. TAs’ continued struggle suggests a need to provide increased and/or improved support.

Because TAs’ instructional practices are influenced by their beliefs about teaching and learning (Volkmann and Zgagacz, 2004; Rodriques and Bond-Robinson, 2006), we first need to understand what our TAs think if we want to better support them (Wheeler et al., 2015). For TAs specifically, others (Luft et al., 2004; Boman, 2013; Flaherty et al., 2017; Wheeler et al., 2017c; Mutambuki and Schwartz, 2018) have extensively studied the impact of training activities on TA beliefs; these studies suggest that the messages TAs receive from training are important. More broadly, instructors in higher education (tenure-track faculty and nontenure-track instructors) form their beliefs from a variety of experiences: prior experiences as teachers, as students, as researchers, and from non-academic roles (Oleson and Hora, 2014). Although TAs may have fewer prior experiences as instructors than faculty, TAs’ views are also shaped by their teaching experiences (Wheeler et al., 2017c). Particularly relevant to this study, TAs tend to hold different views about teaching if they lead inquiry-based labs than if they lead traditional labs (Sandi-Urena et al., 2011a). However, even if TAs agree with the ideas behind reformed courses, studies in biology (Addy and Blanchard, 2010) and physics (Goertzen et al., 2009; Wilcox et al., 2016) education have shown that beliefs do not necessarily translate into practice.

An additional complicating factor for graduate TAs is that they play a hybrid role, as both students and employees of an institution. Although the legal status of graduate assistants in the U.S. varies by institution type (public vs. private) and across states (Flora, 2007), the TAs in this study are recognized as university employees with union representation. To our knowledge, the impact of the supervising instructor on TAs’ instructional practices in the teaching laboratory has not been studied. However, it is an important aspect to consider because TAs are obligated to carry out their teaching assignments according to the wishes of their supervisor. This is true regardless of TAs’ personal beliefs or whether or not the supervisor's directions are informed by research on teaching and learning. Indeed, it may be more broadly relevant to investigate the impact of a teaching supervisor who is not an expert in teaching and learning because it is a likely scenario; as the American Chemical Society Commission on Graduate Education reported, graduate student teaching “experiences generally are not drawn from carefully crafted programs designed to teach students how to teach” (American Chemical Society (ACS), 2012). With this in mind, we sought to determine the relationships between TA behavior, TAs’ personal beliefs, and TAs’ perceptions of their supervisor's expectations by conducting in-lab observations and interviews with chemistry laboratory TAs. Our work was guided by the following research questions (RQs):

1. What do TAs believe their teaching supervisor expects of them?

2. How do TAs’ beliefs about teaching and learning compare to expectations?

3. How do TAs’ actual teaching behaviors align with their supervisor's expectations?



TA participants were recruited at weekly GCL1 and GCL2 staff meetings at the start of the spring 2018 semester. They were provided with written documentation of their rights as research participants and a description of the IRB-approved study. Seven TAs volunteered to participate in the study and have been given pseudonyms to preserve their anonymity. Of the seven TAs, three were teaching the first semester of the General Chemistry Laboratory sequence (GCL1) and four were teaching the second semester of the sequence (GCL2). In all, the seven TAs accounted for 25% of TAs teaching the GCL sequence for the spring 2018 semester. Four of the participants were men, and three were women. Two TAs were international students. The TAs ranged from first-years through fourth-years in the Department of Chemistry's PhD program, and all but one TA had experience teaching this course. This TA transferred from a different university and had prior TA experience at his former institution. The experience profile of each of the seven TAs is summarized in Table 2.

The General Chemistry Lab Coordinator (henceforth referred to as the Lab Coordinator), a PhD-level chemist, had been hired specifically to lead the GCL course transformation from “cookbook” labs to Cooperative Chemistry labs. This study took place during the third year of the Lab Coordinator's tenure as leader of the GCL courses.

Outline of study

To address the research questions, our approach was two-pronged: (1) interviews were conducted to determine TA beliefs (RQs 1 and 2), and (2) observations were performed to determine TAs’ actual teaching behaviors (RQ 3). The interviews involved completion of a worksheet based on the Real-Time Instructor Observation Tool (West et al., 2013) to probe TAs’ opinions about the teaching behaviors we characterized in the observations. To determine how TAs’ beliefs and behaviors related to their supervisor's expectations, we asked the Lab Coordinator to complete the same worksheet and provide comments about his choices. Video recordings of the actual laboratory sessions allowed us to identify and code the types of interactions that occurred between the students and TAs. The interview, worksheet, and observation protocols are described in detail in the following sections.

The timeline of data collection is shown in Table 3. TAs were recorded during two laboratory sessions. The first recording (Observation 1) took place during week 7, when lab students were finishing the project on which they would present their findings in a formal lab report. The second recording (Observation 2) took place during the project on which students would report their findings in a poster presentation, weeks 11 and 13 for GCL1 and GCL2, respectively.

Table 2 Description of TA participants in this study
Pseudonym Course taught Prior GCL TA experience Year in PhD program International student?
Giovanni GCL1 0 semesters 4 No
Koga GCL1 1 semester 1 Yes
Surge GCL1 2 semesters 2 No
Brock GCL2 3 semesters 2 No
Erika GCL2 3 semesters 2 Yes
Misty GCL2 1 semester 1 No
Sabrina GCL2 2 semesters 3 No

The first round of interviews (Interview 1) took place 7–10 days before the second recording; thus, GCL1 TAs were interviewed earlier than GCL2 TAs and in closer proximity to the first classroom observation. The second round of interviews took place after the second video recording but before the end of the semester, at the convenience of the TAs.

Classroom observations

Collection of video data. Audio and video data of TAs were collected using a Swivl (Swivl, n.d.) setup, with which video was recorded via an iPad camera mounted on a Swivl robot, which was synced to a microphone worn by the TA. The microphone contained a tracking device that the Swivl robot rotated to follow, such that we could acquire video of the TA as he or she moved about the laboratory throughout the class period. Each class period was 2 hours and 50 minutes long; however, some videos were shorter because students left before the end of the class period.
Analysis of videos. TAs’ actions were coded using the web-based interface of the Real-Time Instructor Observation Tool (RIOT) (West et al., 2013). Other instructor observation protocols considered were the Classroom Observation Protocol for Undergraduate STEM (COPUS) (Smith et al., 2013), the Laboratory Observation Protocol for Undergraduate STEM (LOPUS) (Velasco et al., 2016), the Reformed Teaching Observation Protocol (RTOP) (Sawada et al., 2002), and the Teaching Assistant Inquiry Observation Protocol (TA-IOP) (Miller et al., 2014). Of these, the RTOP and TA-IOP characterize instructor behavior over the course period as a whole, specifically probing for behaviors predetermined as “good” teaching practices. Because we wanted to characterize what TAs were doing throughout the class period through a values-neutral lens, the RTOP and TA-IOP were inappropriate for this goal. The COPUS characterizes instructor behavior every two minutes throughout a class period; however, its codable actions are relevant for lecture-style courses (i.e., those in which an instructor primarily engages with the class as a whole and not with individual students or small groups). While the LOPUS is a laboratory-based modification of the COPUS, the nature of TA interactions with students is characterized by the content of conversations (e.g., scientific principles, experimental procedures), not by the relative balance of who is contributing to the conversation. Although the content of instructor discussions with students is important, for this study we were interested in how the TAs interacted with students, not what specifically was said in those interactions.

Thus, the RIOT was suitable for the present study because it distinguishes between types of student-TA interactions by intellectual contribution to the conversation (i.e., Open vs. Closed Dialogue and Listening to Student Question—see Table 4 for further description of all codes in the RIOT). This tool probes objectively observable behaviors, rather than behaviors interpreted as inherently valuable. It also allows for characterization in real-time rather than within the constraints of 2-minute intervals in which an instructor could perform multiple different actions. Finally, the RIOT was developed in the context of a transformed course (West et al., 2013), and was adapted in a similarly structured study about TAs for a physics course (Wilcox et al., 2016), providing precedent for its use for the purposes of this study.

Table 3 Timeline of data collection
Week # GCL1 GCL2
1 TA training TA training
3 Labs begin Labs begin
6 Spring break Spring break
7 Observation 1 Observation 1
9 Interview 1 No data collected
11 Observation 2 No data collected
12 No data collected Interview 1
13 Interview 2 Observation 2
14 No data collected Interview 2
15 Labs end Labs end

The RIOT may be applied using an open-source web application (Paul and Reid, n.d.). Via a graphical interface, the user selects the applicable code at the start of each action, and the application records the time in seconds between clicks. For example, if “Student Question” is clicked, followed by “Explaining Content” two minutes later, the application would record that this instance of “Student Question” lasted for 120 seconds. One limitation of the web application is that it allows for only a single action to be selected at any one time. That is, if a TA was engaging in Closed Dialogue in which they were Clarifying Instructions to a student, only one of those codes could be selected for that timestamp. Priority was given to the type of conversation over the content of conversations; therefore, in the aforementioned example, Closed Dialogue would have been coded, not Clarifying Instructions.

The actions coded with descriptions and examples of when each code applied are provided in Table 4. This list includes all but two actions that are present in the RIOT. “Students Talking Serially” was omitted because it involves whole class discussions, which do not occur in this laboratory setting, and “Student Presentation” was omitted because presentations were not scheduled during the laboratory sessions recorded.

The first author coded all 14 videos, and another researcher coded two of them to confirm that the RIOT protocol was applied in an understandable and replicable manner. As a measure of inter-rater reliability, the intraclass correlation coefficient (ICC) (Koo and Li, 2016) was used to compare the coders’ percentage distribution of each RIOT code per video. Using SPSS 25, the calculated ICCs (two-way mixed, single measures, absolute agreement, 95% confidence interval) for the initial coding attempts were 0.562 and 0.680, indicating moderate reliability. After discussion and recoding, the calculated ICCs for the two videos were 0.931 and 0.951, indicating excellent reliability.
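The two-way mixed, single-measures, absolute-agreement ICC used here corresponds to the ICC(2,1)/ICC(A,1) form. The following numpy sketch is our own illustration of that computation on made-up data, not the SPSS routine used in the study:

```python
import numpy as np

def icc_a1(ratings):
    """Two-way, single-measures, absolute-agreement intraclass
    correlation, computed from the ANOVA mean squares.
    ratings: (n_targets, k_raters) array, e.g. per-code percentage
    distributions from two coders for the same video."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-target (per-code) means
    col_means = x.mean(axis=0)   # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # targets
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))         # error
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between two coders yields an ICC of 1.0;
# small disagreements pull the value below 1.
identical = [[10, 10], [20, 20], [30, 30]]
print(icc_a1(identical))  # 1.0
```

Values near 1 indicate that the two coders attributed nearly identical proportions of class time to each RIOT code, which is the sense in which the recoded videos showed “excellent reliability.”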

Interviews and worksheet

Collection of interview and worksheet data. Interviews were conducted individually and recorded via a digital audio recording device. Interviews ranged from 38 to 63 minutes, where Interview 1 averaged 59 minutes and Interview 2 averaged 48 minutes. The first interview focused on general experiences in the lab and supports and barriers to performing TA duties. At the end of this interview, TA expectations and buy-in were probed using a modified version of the RIOT worksheet (Appendix 3) protocol described by Wilcox, Yang, and Chini (Wilcox et al., 2016). Briefly, the RIOT worksheet asked TAs to consider the relative amount of time they should spend or actually do spend on each action during a typical lab period. For reference, an abbreviated version of Table 4 was provided to TAs as they filled out the worksheet; this version of the table is provided in Appendix 4. After TAs filled out the RIOT worksheet, they were shown their distribution of RIOT codes (“RIOT profile”) from the first observation video and discussed anything that seemed interesting or surprising to them about their RIOT profile (e.g., did their self-perceptions align with observations?).

The second interview focused more specifically on the types of interactions coded in the RIOT, where we discussed TAs’ responses to the RIOT worksheet. In particular, we were interested in the comparison of TAs’ “Ideal TA” responses and “Course Instructor” responses, “Ideal TA” and “Student” responses, “Ideal TA” and “Actual TA” responses, and “Actual TA” responses and the coded videos. The interview protocols associated with each round of interviews are included in Appendices 5 and 6; however, as these were semi-structured interviews, the questions may not have been asked verbatim or in the order in which they appear in the protocol.

Analysis of interviews and worksheet. For each activity coded in the RIOT, the number of TAs who responded “High”, “Medium”, “Low”, or “No Time” was recorded. Because the time bins were not precisely defined, the absolute length of time corresponding to a relatively “High” amount of time to one TA could be “Medium” to another TA. However, in interviews, respondents generally indicated that “Medium” and “High” represented a substantial portion of class time, but “No Time” and “Low” represented a minimal fraction of the class period. Therefore, to assess TAs’ understanding of and buy-in to expectations, we consolidated the “Medium/High” and “Low/No Time” results into “Substantial” and “Minimal” bins.
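As a concrete illustration of this consolidation (the responses below are made up for the example, not actual data from this study):

```python
# Collapse the four worksheet time bins into the two analysis categories.
from collections import Counter

BIN = {"None": "Minimal", "Low": "Minimal",
       "Med": "Substantial", "High": "Substantial"}

# Hypothetical responses from seven TAs for one RIOT code
responses = ["High", "Med", "Low", "None", "Med", "High", "Low"]
tally = Counter(BIN[r] for r in responses)
print(tally)  # Counter({'Substantial': 4, 'Minimal': 3})
```

The tallies produced this way are the kind of counts reported per RIOT code in the results.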

All interviews, except one, were transcribed by an external freelance transcription service (Rev, n.d.) and proofread by the first author for accuracy. The remaining interview was transcribed by the first author. When analyzing the transcripts, all excerpts related to any of the RIOT codes were first selected for closer analysis. While most of these excerpts came from Interview 2, when TAs were explicitly asked to discuss their beliefs about the RIOT codes, any spontaneous discussion of behaviors that fit the definition of a RIOT code was included for analysis. Next, we sorted these excerpts by which RIOT code was the subject of the excerpt. For each RIOT code, the excerpts were sub-grouped into two categories: those that represented TAs’ interpretation of the Lab Coordinator's expectations, and those that represented their own personal ideals. These two groups of excerpts were further divided based on whether the quotes suggested support for spending a Substantial amount of time or a Minimal amount of time on the corresponding action. A third researcher, who had prior experience studying laboratory TAs but was not directly involved with this study, read the interview transcripts to confirm appropriate selection and interpretation of quotes. These groupings of excerpts allowed us to find reasons why TAs agreed or disagreed with certain actions, or with what they perceived to be the Lab Coordinator's expectations regarding an action. Commonalities among responses allowed us to identify emergent themes. For example, TAs’ discussions about wanting to spend a Substantial amount of time Explaining Content, while simultaneously recognizing the Lab Coordinator's expectation that they spend a Minimal amount of time on it, led us to identify a pervasive theme: TAs felt tension between wanting to meet the Lab Coordinator's expectations and wanting to help students succeed.

Results and discussion

RQ 1 and 2: What do TAs believe their teaching supervisor expects of them? How do TAs’ beliefs about teaching and learning compare to expectations?

TA responses to the RIOT worksheet, along with the Lab Coordinator's expectation of the relative amount of time TAs should spend on each action, are shown in Table 5. TAs’ responses to “How do you think the course instructor would like TAs to spend their time during the laboratory session?” are shown under the column heading “Perceived Expectations”, and TAs’ responses to “How do you think is the most helpful way for TAs to spend their time during the laboratory session?” are shown under the column heading “Ideal TA Actions”.
Table 4 RIOT codes, description of when code applied, and examples of some codes
RIOT code Definition
Lecture-like actions
Clarifying Instructions (CI) TA talks at class about goals for the day, safety instructions, assignments that are due, etc.
Explaining Content (EC) TA writes and/or talks about chemistry concepts or algorithms for calculations without student input (e.g., performing calculations on chalkboard)
Individual/small group conversation
Student Question (SQ) Student-dominated conversation: TA listens to student's question(s) and may provide short verbal or nonverbal responses (e.g., nodding)
Closed Dialogue (CD) TA-dominated conversation: TA tells students what to do or talks at length with little-to-no student response
Open Dialogue (OD) Balanced conversation: TA and student both contribute full sentences to dialogue.
Observation behaviors
Passive Observing (PO) General observation: TA scans or paces classroom without focusing on any particular group. Ranges from occasionally looking up from phone to slowly walking the aisles while glancing at students. Coded as default interaction when camera did not adequately follow TA (i.e., if there was no conversation recorded but one could not see what the TA was physically doing).
Active Observing (AO) Targeted observation: TA stops at one group's bench and observes the students’ interactions and/or individual member's actions. Interaction is limited to “check in” behavior, e.g., “How's it going?” with a short response (“We’re fine”) from student(s) before the TA moves along.
Other actions
Administrative/Class Prep (CP) Includes logistical noninteractive TA responsibilities, e.g., locking/unlocking drawers, cleaning up acid/base spills or broken glass, using the computer to request reagents, etc. A subcode under “Not interacting” in the RIOT.
Working on Apparatus (WA) TA physically manipulates experimental or computational equipment, e.g., adjusting a vacuum filtration setup, setting parameters for data acquisition, acquiring solvents for students. A subcode under “Not interacting” in the RIOT.
Not Interacting (NI) TA is not doing any of the above. Typically includes when TA was out of room (e.g., left for the restroom or was in the lab when all students were working up data or planning documents in the hallway) or talking to non-students (e.g., the lab technician).

Accurate perception of & buy-in to Lab Coordinator's expectations. To make sense of TAs’ responses to the RIOT worksheet, we compare what they said in the interviews to each of the ten RIOT actions. We describe alignment of perceived expectations to actual expectations of the Lab Coordinator as TAs’ “understanding” of expectations, and “buy-in” refers to the alignment of what TAs believe are ideal TA behaviors to the Lab Coordinator's expectations, which are marked with an asterisk in Table 5.
Table 5 Count of TA responses to “How do you think the course instructor would like TAs to spend their time during the laboratory session?” (perceived expectations) and “How do you think is the most helpful way for TAs to spend their time during the laboratory?” (Ideal TA Actions). Asterisk (*) represents the actual expectation of the Lab Coordinator
RIOT code Perceived expectations (# out of 7 TAs) Ideal TA actions (# out of 7 TAs)
Minimal Substantial Minimal Substantial
None Low Med High None Low Med High
Clarifying Instructions (CI) 0 2* 3 2 0 3* 2 2
Explaining Content (EC) 0* 3 4 0 0* 0 6 1
Student Question (SQ) 0 0 2* 5 0 0 1* 6
Closed Dialogue (CD) 0 5* 2 0 0 6* 1 0
Open Dialogue (OD) 0 0 0* 7 0 0 1* 6
Passive Observing (PO) 0 0 6* 1 0 1 3* 3
Active Observing (AO) 0 1 3* 3 0 1 4* 2
Classroom prep (CP) 0 3* 1 3 0 4* 1 2
Working on Apparatus (WA) 0 4* 2 1 0 1* 3 3
Not Interacting (NI) 2* 5 0 0 1* 5 1 0

Student question (SQ). All seven TAs understood and bought into the Lab Coordinator's expectation that they should spend a significant amount of time listening to Student Questions, and they seemed to see it as an important part of the TA's role, perhaps the defining responsibility of a TA. For example, when asked why she thinks she should spend a relatively high amount of time listening to student questions, Sabrina said, “’Cause that's what I’m here to do?” However, not all TAs had the same perspective about why listening to students is important. For example, Sabrina said that safety is “obviously” a reason, which was echoed by Erika: “I always encourage them to ask questions. I like it, because I’d rather them ask me questions than do something stupid without asking me”.

In addition to safety reasons, some of the TAs suggested that it was their responsibility to make students feel comfortable asking questions. For example, Erika said, “You should be able to let them approach you with questions”—that is, TAs should make themselves approachable so that students will be unafraid to ask for help. Brock made this point clear and said, “You want to be able to interact with them when they have questions or make them feel like they can come up and ask questions,” because if students feel like they can’t initiate conversations with the TA, then they might not ask for help when they need it. Koga believed that this happened during his first time teaching the course and referenced a student evaluation from the previous semester, where the comments suggested that a group of students didn’t feel comfortable asking him questions:

KOGA: [Listening to student questions is] actually a respect to them, [and] it could be due to my last semester's student evaluation. One student … said that I didn't help their team at all, and they figured out the case mostly by themselves… I cannot remember if I don't really help any teams. It's let me feel a little bit down, so in this semester I make sure I at least listen to their question.

Open dialogue (OD). Another area of agreement was Open Dialogue, where all of the TAs believed they were expected to spend a significant amount of time engaging in balanced conversations with students, and they bought in to the idea that it is a helpful way to interact. This notion is succinctly summarized by Surge, who said, “Open dialogue, I think that's been drilled. Ask them a question that can help them think forward, have them talking it through themselves. I think that's a good way to do it.” TAs also seem to converge on the rationale behind the importance of open dialogue: to guide students’ thinking in productive directions, rather than tell them what is the “right” answer for a given situation. Giovanni illustrated this in his second interview:

GIOVANNI: They need to have something to go off of. They need to have some sort of starting point … I used to think, at the beginning of this course, that you give everyone the same starting point. But I've grown to think that's not true, because you just simply have some groups that just are at a different level than others, whether that's because they have one person that's like really knowledgeable, because whatever reason, right? … Now I just sort of ask them, “Where are you guys at? What are you guys doing?” Then from there, I try to give them a little bit of something to guide them the right way.

Giovanni's comment is particularly astute. His words demonstrate that he has reflected upon his teaching, considered whether or not giving the same instruction was equally helpful to all groups, and modified his approach appropriately. While most of the TAs said that Open Dialogue involves giving students a launching point for productive thinking, Giovanni explicitly mentioned his strategy to figure out where a suitable launching point might be. Given that Giovanni had not previously taught this course—indeed, his previous lab TA experience was in a “cookbook”-style lab—it is especially impressive that he came to this realization and adjusted his teaching accordingly while still learning how each of the projects and the course overall worked.

Passive observing (PO) and active observing (AO). Another area in which TAs understood and bought in to expectations was with respect to the amount of time they should spend observing students. All but one of the TAs understood and agreed with the expectation that they should spend a significant amount of time in Passive or Active Observing. Giovanni, the sole dissenter, said, "[Observing] just informs me that they're trucking along. It doesn't really do anything to improve their learning experience. So I don't think I should spend much time doing that." However, other TAs mentioned that both modes of observing the students allowed them to monitor the class's safety and progress throughout the experiments. For example, Erika and Misty specified that in passive observing, they could quickly notice if any students had removed safety goggles during experimentation, what the students were working on, or if students were working at all. One TA mentioned that being able to do this without getting close to an individual group was better because then the students wouldn't feel like he was "hovering over their shoulder" because "it's harder to do your work when someone's sitting right there" (Surge).

In contrast to Surge's opinion, those who thought Active Observing was an ideal behavior indicated that paying closer attention to a particular group enabled them to better gauge student progress, understanding, and group dynamics. For example, Erika said that through active observing, she can “make sure they aren't frustrated” and determine whether “they [are] all working, [or] is it one person working?” Despite this, she acknowledged Surge's concern about making the students nervous by observing them too closely: “They say I'm intimidating, staring at them … I wouldn’t want my PI [Principal Investigator] watching me all the time [so] I don't think they want me there all the time,” which is why she said Active Observing should be done at “a medium level.” Indeed, many of the TAs mentioned this concern and that they needed to find a good balance, instead of exclusively choosing a passive or active approach to observing.

Closed dialogue (CD). All but one of the TAs understood and bought into the expectation that they should spend an insignificant amount of time engaging in Closed Dialogue. For Closed Dialogue, both Sabrina and Koga mentioned that if they talked too much or just told students what to do, then they might not hear or understand what the students actually need. For example, Sabrina said, “[If] I’m so focused on getting what I want to say out that I'm not listening to their [students’] questions … I might not even be answering their question that they did ask,” which is not helpful. Along the same line, Koga said, “If I give them [students] too much order and those orders does not meet their real case, I'm actually misleading them.” Alongside this comment, Koga mentioned that he should respect the students’ ownership of their projects and said, “They [students] are the ones doing the lab, so I should listen to them because they are trying it,” which cannot be accomplished with TA-dominated dialogue.

In contrast, Erika thought the students “love” closed dialogue “because they would rather have me do all the hard work and then come up with a nice answer for them to type.” Despite the perceived desire of her students, she argued against closed dialogue “because then they don’t think at all … I don't think that's helpful for them, intellectually. They are not learning to think.”

Misalignments: miscommunications or a tightrope for TAs? The TAs had mixed responses when interpreting the Lab Coordinator's expectations about the remaining actions: Clarifying Instructions, Explaining Content, Classroom Prep, and Working on Apparatus. However, of those, Explaining Content and Working on Apparatus showed particularly striking misalignments between the TAs' beliefs about ideal teaching practices and the actual expectations of the Lab Coordinator. These disagreements are discussed below.
Working on apparatus (WA). With respect to Working on Apparatus, 4 of 7 TAs understood that the Lab Coordinator did not want them to spend a lot of time doing this. Those who believed they were expected not to spend much time physically helping students with experiments noted that helping on this level would go against "the point of the class" (Giovanni) or suggested that the Lab Coordinator simply thinks it would not be necessary because he "assumes everything's gonna run smoothly" (Surge).

However, 6 of 7 TAs believed that it would be most helpful to spend a significant amount of time with hands-on assistance with experimental setups and techniques. Many of them stated that this was for safety reasons; if students do some techniques incorrectly, they might hurt themselves or others. While students are provided with videos that demonstrate various lab techniques, and the instructional team tries to minimize hazards in our labs by using common water-soluble salts, dilute acids and bases, or typical cooking ingredients (e.g., commercial food dyes), TAs still worried about students’ misuse of materials. For example, Sabrina mentioned this with respect to titrations:

SABRINA: I think it's always important to do it the first time they're using something, especially with titrations, setting the burettes up, showing them how the stopcock works. Make sure this is perpendicular; otherwise you're gonna have NaOH all over the floor… How to actually fill the burette properly—that keeps you safe so you're not pouring NaOH over your head, those things.

In addition to safety, Misty, Erika, and Sabrina also cited a need to conserve resources, that is, to avoid breaking equipment. However, they had different opinions on how involved a TA should be in the setup for more complicated techniques. Misty preferred a more hands-on approach and suggested that equipment, such as for vacuum filtration or titrations, should be set up for students by TAs:

MISTY: It's just easier if it is already set up, and then I just have to show them how to do it … I feel like the more time I spend working on it, the less time things are gonna end up broke [sic] … These [students] are looking for an in-and-out, and they don’t care if stuff gets broken. They don’t care if stuff is all over the place, and that's why I think it’d just be easier in a class like this to kind of do it [for them].

It appears that Misty had less faith in her students than did fellow GCL2 TAs Sabrina and Erika, who seemed more confident in allowing students to set up these techniques after being shown a first time:

SABRINA: I don't think it's the responsibility of the TA to set things up like that every single time. I do think that because you're taking responsibility for the class while you're teaching it, you need to check and make sure they're set up properly … You're walking around anyways; just look and make sure.

ERIKA: The next step they're taking might be orgo, so they should be able to know how to filter stuff properly. I want to give them that training … I only do it once and then I just, “You remember how I did it last time; why don't you do it now?”

As Erika noted, it may be important to help students learn these techniques so that they are prepared for future courses they might take. Although most of the students who take this course are not chemistry majors, a large fraction are life sciences majors or pre-health students (Appendix 7), who may need to take organic chemistry lab, as Erika suggested, where the ability to perform common lab techniques could be helpful.

Interestingly, none of the TAs mentioned that video tutorials of various lab techniques are provided to students in the electronic lab notebooks, and the background information for each project lists potentially useful techniques that students may want to review. Given the common concern that students could not safely perform complicated techniques, this suggests that these TAs and students were not fully embracing all of the resources available to them. Was their availability not communicated to TAs, or did TAs not encourage their students to use them? Were the resources hard to find or difficult to understand? Without the TAs’ discussion of the resources, we can only speculate about why they may not have been used.

Explaining content (EC). While 4 of 7 TAs correctly interpreted the Lab Coordinator's expectations about Explaining Content, all seven TAs disagreed with the notion that they should spend an insignificant amount of time performing this task. In his comments, the Lab Coordinator wrote that he "equates [explaining content] to lecturing," and the TAs did seem to agree that lecturing is something they should not do; however, Surge noted that explaining can encompass more than simply lecturing when he said, "I think the goal of the class is to take people who don't know a lot about chemistry and have them do chemistry, so I think there should be a reasonable amount of some of my scientific chemistry knowledge being used to help them, but obviously, I'm not gonna be lecturing at the board." Thus, it is not that TAs necessarily disagreed with the Lab Coordinator; rather, they interpreted what it means to "explain without student input" differently. In this example, the disagreement is simply a case of miscommunication of expectations.

However, the misalignment was more often a disagreement, rather than a miscommunication, between TAs and the Lab Coordinator about how best to support students. A recurring theme in the TA interviews was a tension between wanting to meet perceived expectations and wanting to help students move forward when preferred teaching strategies seemed insufficient. Some of the TAs, like Giovanni, agreed with the Lab Coordinator's expectations only in an ideal situation or with the strongest students. In a typical case, however, these TAs believed that most students need more help understanding the task at hand than TAs believe they are expected to give. As Erika said, sometimes students truly put in the effort: "sometimes they understand what they have to do, but they really don't know how to put it together, so you should spend more time on content." As Giovanni said, "It's a nice idea that these kids should be able to produce these thoughts on their own, but, from my experience, it's really rarely the case that you have a student that's just ready to go who gets it."

The Lab Coordinator suggested that "proper planning" should eliminate the need for TAs to explain content, but Brock disagreed. As he said, the students "can plan it right, but if they're not understanding why they're doing it, this lab style is no better than, 'Here's a list of directions.'" In his opinion, students can put together a viable plan to answer the questions posed by the project, but if they don't understand the rationale behind the procedures, then they won't gain anything from the planning phase of the experiments, making it no different than a traditional "cookbook" lab. For example, one GCL2 project involves standardizing a solution of NaOH, which students then use for titrations the next week. While the students could successfully execute these procedures in lab, three of the GCL2 TAs mentioned that their students often did not understand why the standardization was needed. Brock said that he realized after the lab period, while helping students with formal lab reports, that they did not know the purpose of these experiments: "When they're writing their papers, they don't quite get the connection between the standardization and then their [analyte] titrations… [They] come in my office hours to … [ask], 'Why did we need to do this? I don't understand how week two [standardization] connects to week three [analyte titrations].'" This indicates that students could look up how to do a standardization, write down those steps in a planning document, and follow those instructions, but doing so did not help students understand how the task fit in with the overall project.

The above example shows that explaining content may be needed outside of the experimentation phase, e.g., when students are planning procedures or analyzing data. As a general rule, Sabrina said that explaining something to students instead of trying to get them to figure it out is acceptable when they ask about something “that's more context and not just procedure.” Still, she emphasized that it is important that students first try on their own: “After they’ve looked something up, if they still have a question or they’re interested in it, then you should definitely engage. But I think that it's their responsibility to actually look these things up and use the resources they have so that they get that skill for the future.”

Erika and Giovanni mentioned that mathematical tasks are often a stumbling block, such as determining amounts of reagents to use in preparation for an experiment or performing calculations on data to analyze results. In these situations, Giovanni thought lecturing at the board to show the math would be more helpful; however, he only wanted to do this if students already figured out what type of manipulation they wanted to do but struggled with doing the arithmetic:

GIOVANNI: [If] it's something that we've already arrived at due to a conversation, then I do go to the board because I feel like there's some things I can wave my hands around, and it just doesn't make any sense. So even if I were like, “You do this, and you carry, and you move this over” … I might as well be speaking gibberish to them.

Sabrina and Giovanni both indicated it was only after the students used available resources to try to make progress independently and after they tried to guide students in the right direction through discussion that they would turn to providing an explanation or worked example on the chalkboard. However, they both mentioned that they believed the Lab Coordinator did not want them to do this. This tension between wanting to meet the Lab Coordinator's expectations and wanting to help students succeed was summarized by Erika:

ERIKA: It's [hard] because I do know what is expected of me, and I know what I shouldn't do, but the requirement of the student is so different … It's entirely up to [the students] to figure out, but … [it's] so hard for them. So then it's a very difficult situation for the TA, because you know you're not supposed to do this [help too much]. You know that if you don't do it, they're completely clueless, so there's this side push you always have to give. In the meantime, you know it's not right, so you have to restrain yourself from doing more … You don't want to get them to feel frustrated, so you always have to struggle with the … situation.

RQ3: How do TAs’ actual teaching behaviors align with their supervisor's expectations?

The percentage of time each TA spent on each of the ten actions coded in the RIOT during the two recorded class periods is shown in Fig. 1. These “RIOT Profiles” illustrate how each TA distributed his or her time in the lab.
image file: c9rp00088g-f1.tif
Fig. 1 RIOT profiles for each TA. In dark blue are the coded profiles for the first recording (Time 1), and in light blue are the coded profiles for the second recording (Time 2). CI = Clarifying Instructions, EC = Explaining Content, SQ = Student Question, CD = Closed Dialogue, OD = Open Dialogue, PO = Passive Observing, AO = Active Observing, CP = Classroom Prep, WA = Working on Apparatus, and NI = Not Interacting.
Few lecture-like actions: clarifying instructions and explaining content. The Clarifying Instructions and Explaining Content codes characterize time the TA spent talking at the entire class, and these codes took up little or no class time, as shown in Fig. 1. Because of the nature of the laboratory, most TAs reserved only about 5 minutes at the start of the class period for this. During the remainder of the class period, students worked in small groups, so if the TA were talking about instructions or chemistry concepts with students, it would be in a conversation with one of the groups or individual students rather than with the whole class. Because the type of conversation was prioritized over the content of conversations, these interactions would have been coded as Student Question, Closed Dialogue, or Open Dialogue, instead of Clarifying Instructions or Explaining Content.
Other actions varied by project activities: classroom prep and working on apparatus. Classroom Prep varied based on the activities that were being done during any particular project. For example, during some projects, students may consume reagents more quickly or generate more waste. This requires the TA to take the time to request refills from the stockroom or to put out a new waste container and properly seal the full container. Although these tasks are not pedagogical in nature, they are essential for labs to run smoothly, allowing students to perform experiments as planned and in a safe manner.

TAs indicated that Working on Apparatus would also vary from week to week based on specific experiments. Most said they tended to provide more hands-on assistance to students if they were performing a new technique. Because the observed labs did not involve new techniques, this may account for the relatively low frequency at which this was observed.

Bulk of class time: conversations and observations. Because we are interested in the alignment of TAs’ teaching practices with course expectations, we now focus the discussion on actions that indicate TAs are paying attention to students: conversing with students—Student Question, Closed Dialogue, and Open Dialogue—and observing students—Passive Observing and Active Observing. The percentage of time that TAs spent on each action is shown in Fig. 2a–e. We will refer to the sum of these as Attention Time, shown in Fig. 2f. For each, the average across fourteen observations is indicated by the black dashed line. The highlighted region spans half a standard deviation above and below the mean. These values are summarized in Appendix 8.
image file: c9rp00088g-f2.tif
Fig. 2 Profiles of TA attention time, i.e., time spent either talking with (blue) or observing (red) students during the laboratory period. Darker colors signify data from the first observation, and lighter colors represent data from the second observation. The average across fourteen observations is indicated by the black dashed line. The highlighted region spans half a standard deviation above and below the mean percentage of the class time for each code.

Conversations: closed and open dialogue and student question. Fig. 2a–c show the percentage of time that each TA spent in conversation with students. The three categories were distinguished by who did the most talking: students more than TAs (Student Question), TAs more than students (Closed Dialogue), or students about equal to TAs (Open Dialogue).
Closed dialogue. Of the types of conversations coded, everyone but Sabrina and Surge spent the least amount of time engaged in Closed Dialogue, in agreement with their beliefs about what they should do. When these TA-dominated conversations did occur, TAs explained that it was a last resort for when students “totally give up” (Koga) or are “completely clueless about everything” (Erika). Erika describes how this situation typically plays out:

ERIKA: You ask the questions to prompt them, but they're clueless. Then you have to give them a little bit more, still clueless. Then you go a little bit more, and ultimately, it's like you're the only one talking. I have to tell them, “Guys, this is not working,” but I have to give them something … You know this is not ideal, but you are just trying to make them talk.

Despite this, Erika, like most of the others, actually spent little time in Closed Dialogue. Koga was the only TA who spent a relatively high amount of time in Closed Dialogue, but this may be a limitation of the coding scheme. Because Closed Dialogue was coded when the TA dominated the conversation, Koga pointed out that the amount of time he spent speaking depended upon his ability to efficiently express his thoughts in English:

KOGA: I think I am still training how to make all the things clear at one time, so later I’m going to be like the native TAs and just use one sentence that explains all the things, but now so far [I] need to use two or three or four sentences to explain the same comments.

As a first-year non-native English speaker, Koga had relatively little experience with communicating in English. He suggested that with practice his comments to students will be more concise, but until then he might need to use more words until he finds a suitable way to say what he means. Thus, we believe that coding may be biased toward Closed Dialogue instead of Open Dialogue when a TA has weaker English-speaking skills.

Open dialogue. TAs varied widely in the time spent engaging in Open Dialogue with students (Fig. 2c), which is interesting because all of the TAs believed that they should spend a relatively high amount of time in this type of conversation. However, it is clear that TAs’ beliefs did not always translate into practice. Previous studies employing the RIOT showed that TAs spent, on average, under 5% of class time in Open Dialogue with students (West et al., 2013; Wilcox et al., 2016), whereas the TAs in this study spent an average of ∼20% of class time in Open Dialogue. This is likely because of the nature of the courses being compared. Our study focuses on a chemistry laboratory class, while the earlier studies looked at physics workshop and “mini-studio” classes (West et al., 2013; Wilcox et al., 2016). The RIOT has not previously been used to characterize TA behaviors in the chemistry laboratory, but studies using other protocols indicate that TA conversations with students tend to occupy a larger portion of laboratory class periods (Velasco et al., 2016; Flaherty et al., 2017), consistent with our findings here.
Student question. For the remaining conversation code, Student Question, the average amount of time that TAs spent listening to questions was only just above that for Closed Dialogue (∼13% Student Question vs. ∼10% Closed Dialogue), yet nearly all of them believed that they should spend a relatively high amount of time listening to questions and a relatively low amount of time in closed dialogue. However, this does not necessarily mean students' questions were only addressed on average ∼13% of the time. Because Student Question was coded only when the student dominated the conversation, it is possible that TAs also listened to students' questions during open dialogue. For example, Giovanni appears to have spent a relatively low amount of time listening to student questions, but he also spent the most time, relative to the other TAs, engaging in open dialogue. Indeed, for most TAs, as their Open Dialogue percentages increased from one observation to the next, the Student Question percentages decreased.
Observations: passive and active observing. Fig. 2d and e show the percentage of time that each TA spent observing students. All but one TA spent more than twice as much time Passive Observing as Active Observing. One reason for this may be a limitation of the coding protocol: warranting a code of Active rather than Passive Observing required overt behavior by the TA demonstrating focus on a specific group of students. A TA may actually have been watching or listening to one group in particular while appearing to passively monitor the whole class, but it is not possible to determine another person's mental focus through visual observation.

When asked, several TAs indicated a reluctance to observe the groups individually (Active Observing) because they thought it would make the students feel nervous to have the TA “over their shoulder”. Alternatively, the difference may simply be an attempt to exert less effort. As Erika notes, teaching lab is “physically exhausting” because it involves hours of walking and talking. For some TAs, Passive Observing allows them to remain in one location while still paying attention to students’ progress and safety.

TA reflections on the RIOT profiles. TAs had an opportunity to reflect on their RIOT profiles at the end of each interview. Most TAs seemed unsurprised by our findings, in that the distribution of time spent on each action seemed reasonable based on their recollection of the corresponding lab periods. However, two of the TAs, Giovanni and Sabrina, thought the observed distribution was different than they expected. Giovanni was surprised about the relative amounts of Closed and Open Dialogue that were coded. He thought he engaged in much more Closed Dialogue than Open Dialogue, although he actually spent the highest amount of time in Open Dialogue of the seven TAs. When asking students questions to gauge their thought process, he sometimes felt like he was asking them "very patronizing questions" instead of guiding questions. As a fourth-year graduate student, sometimes an appropriate answer to his questions seemed obvious to him, and in those moments, he said, "It feels like I'm literally telling them what to do … Those are the moments where I feel like I'm engaging in closed dialogue, but it might be that I'm just being too harsh on myself."

However, Sabrina was concerned by what she saw in her RIOT profiles:

SABRINA: When you see these, it looks really bad.… The student questions is a little different but the passive observing, not interacting, it seems really bad that it trumps all the other stuff … It just seems like I should be doing more things, but at the same time you do run into the situation where if I actually go over there … I'm going to distract them … A lot of it is it feels weird. Some groups are finishing much earlier than others and so it's kind of like, “Now there's nothing to do with them.” And the other ones are just sitting at their bench writing … I don't think I'm doing a poor job as a TA, but I don't think I'm the best TA. I'm not sure how to take it. I guess I need to pay more attention to what I'm doing and seeing if it's the right thing at that time.

These examples suggest that TAs can become more self-aware if they see the findings from an outside observer, which may be useful for professional development. Giovanni learned that his teaching practices were actually helpful and aligned with the course philosophy much better than he expected, but Sabrina realized she spent a disproportionate amount of time merely observing her students, when perhaps she could have been more productively interacting with her students.

Relating observed behaviors to beliefs about teaching. To compare TAs’ actual behavior to their beliefs, we first characterized the relative amounts of time each TA spent on an action as high, medium, or low. We define “high” as greater than 0.5 standard deviations (SD) above the mean for that action (i.e., above the highlighted region) and “low” as less than 0.5 SD below the mean (i.e., below the highlighted region), similar to the scheme used by Wilcox et al. (Wilcox et al., 2016). Any percentage of time within 0.5 SD of the mean is classified as a “medium” amount of time. These cutoff values are shown in Appendix 9. Using this classification scheme and the average percentage that each TA spent per action across the two observation videos, we show each TA's behavioral profile in Table 6:
Table 6 Relative amount of time TAs spent in conversing or observing actions, compared to average percent time for that action across all TA videos. “High” > 0.5 SD above mean, “Medium” = Within 0.5 SD of mean, “Low” = Less than 0.5 SD below mean. All conv. = Average of conversation actions (SQ + CD + OD), All obs. = Average of observation actions (PO + AO)
RIOT code Giovanni Koga Surge Brock Erika Misty Sabrina
SQ Low Low Low Med Med High Med
CD Med High Low Med Med Low Med
OD High Med Low Med Med Low Low
PO Low Low High Med Med High High
AO Med High Med Med High Low Low
All conv. High High Low Med Med Low Low
All obs. Low Low High Med Med High High
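
The high/medium/low scheme described above is a simple threshold rule around the group mean. As a minimal sketch (not the authors' analysis code; the percentages below are hypothetical, not data from this study), the classification could be implemented as:

```python
# Illustrative sketch: classify each TA's percent time on an action as
# High/Medium/Low relative to the group mean, using the +/- 0.5 standard
# deviation cutoffs described in the text (cf. Wilcox et al., 2016).
from statistics import mean, stdev

def classify(percentages):
    """Map each TA's percent time to 'High', 'Medium', or 'Low'."""
    mu, sd = mean(percentages), stdev(percentages)  # sample SD across TAs
    labels = []
    for p in percentages:
        if p > mu + 0.5 * sd:
            labels.append("High")
        elif p < mu - 0.5 * sd:
            labels.append("Low")
        else:
            labels.append("Medium")
    return labels

# Hypothetical Open Dialogue percentages for seven TAs
print(classify([35, 22, 8, 20, 18, 10, 12]))
```

Any percentage falling inside the band from 0.5 SD below to 0.5 SD above the mean (the highlighted region in Fig. 2) is labeled "Medium"; values outside the band become "High" or "Low".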

Since there were essentially two types of actions, "Conversations" and "Observations", we combined Student Question, Closed Dialogue, and Open Dialogue into the "All Conversation" bin. Similarly, we combined Passive Observing and Active Observing into the "All Observation" bin. From these categories (bottom two rows of Table 6), we can see that the TAs' teaching styles fall into three categories: Conversation-Heavy (a relatively high amount of time in conversation with students but a low amount of time quietly observing), Observation-Heavy (a relatively high amount of time quietly observing students but a low amount of time in conversations), and Balanced Conversation-Observation (medium for both). Because most of the TAs' behavior skewed toward observation or conversation, this suggests that finding a suitable balance between talking with students and stepping back to observe is a challenge.

What causes a TA to be Observation-Heavy or Conversation-Heavy in their teaching? All of the TAs believed that they should spend a substantial amount of time in conversation with students, which suggests that beliefs about talking with students did not drive their behavior. However, TAs’ views on observation actions differed, and these views, summarized in Table 7, seemed to be related to the observed patterns in behavior.

Table 7 Individual TA responses to the RIOT worksheet questions “How do you think the course instructor would like TAs to spend their time?” (Instructor) and “How do you think is the most helpful way for TAs to spend their time?” (Ideal TA) with respect to passive and active observing
TA Passive observing Active observing Observed behavior profile
Instructor Ideal TA Instructor Ideal TA
Giovanni Med Low Low Low Conversation-heavy
Koga Med High High Med Conversation-heavy
Surge High High Med Med Observation-heavy
Brock Med Med Med Med Balanced
Erika Med Med Med Med Balanced
Misty Med Med High High Observation-heavy
Sabrina Med High High High Observation-heavy

For all TAs but Koga, beliefs about observing aligned with their actual time spent observing. TAs in the Observation-Heavy category (Surge, Misty, and Sabrina) all responded "High" to Passive Observing, Active Observing, or both. In contrast, the TAs in the Balanced Conversation-Observation category (Erika and Brock) wrote "Medium" for all questions about observation, and their observed behavior did indeed correspond with their beliefs. However, for TAs in the Conversation-Heavy category (Giovanni and Koga), the correlation between worksheet responses and observed behavior is less clear. While Giovanni's belief that an ideal TA should spend a relatively low amount of time observing matched his actions, there was no clear relationship between Koga's beliefs and actions.

Relative to the other TAs, Koga spent the highest amount of time Active Observing but the least amount of time Passive Observing, even though he responded that he should spend a medium or high amount of time on both levels of observation. This suggests that some factor besides beliefs about ideal TA behaviors influences actual teaching behavior. Here it may simply be that the ability to strike this balance comes with experience: the two TAs in the Balanced Conversation-Observation category (Erika and Brock) were also the most experienced of this group.


Based on the responses to the RIOT survey and interview data, it appears that TAs generally understood that they should spend most of class time listening to student questions, engaging in open dialogue with students, and observing students, and they mostly bought into the idea that these were helpful teaching behaviors. However, for the actions the Lab Coordinator expected TAs to spend minimal time on, responses among the TAs were inconsistent. TAs cited communications from the Lab Coordinator that supported their differing perceptions of his expectations, which suggests that they received mixed messages about preferred practices in the lab. Furthermore, for some of these behaviors (Explaining Content and Working on Apparatus), TAs disagreed with the Lab Coordinator's expectations and instead felt that it would be helpful to spend more time performing these actions.

Some of the disagreement stemmed from TAs' lack of confidence in students: TAs believed that students required more help to complete the lab projects than the Lab Coordinator expected. Disagreement also arose from misunderstandings about how much help TAs were allowed to offer students. For example, Giovanni mentioned that when TAs were told not to make a specific teaching move, he saw that other TAs took this to mean they should not help students at all:

GIOVANNI: This one TA … [was] like, “Well, because of [an email stating what not to do], I'm not going to write anything on the board, and I'm just going to sit here.” … That's what they did … but the people who suffered were the kids, because they had no idea what to do … She [the TA] was too scared to say anything because she didn't want to get an email … saying that she was doing something wrong.

This suggests that miscommunication between TAs and the Lab Coordinator can have a serious impact on class instruction; it does not necessarily mean that TAs do not buy in to the course philosophy.

In fact, classroom observations indicated that TAs spent most of class time in at least some of the ways the Lab Coordinator would prefer, that is, in conversation with or observing students. Of the seven participating TAs, Erika and Brock spent a substantial amount of time both conversing and observing (Balanced Conversation-Observation); Giovanni and Koga spent much more time in conversations than observing (Conversation-Heavy); and Surge, Misty, and Sabrina spent much more time observing than talking with students (Observation-Heavy). While the Conversation-Heavy and Observation-Heavy TAs generally perceived and agreed with the expectation that they converse with students, the two groups' ideas about observation differed. Comparing the video data to the RIOT worksheet responses suggests that the balance of conversation to observation was determined by TAs' beliefs about Passive and Active Observing: the more a TA thought that time spent observing was expected by the Lab Coordinator and/or was an ideal teaching behavior, the more time they actually spent quietly observing students.


Limitations

Because the TAs who participated in this study were self-selected—and some specifically requested to teach GCL over any other course in the department—they may be intrinsically more interested or motivated than those who opted not to participate. Thus, relative to the average TA, the participants in this study could be more likely to agree with and/or fulfill the course expectations. We emphasize that these are the experiences of seven individuals in a specific academic context, comprising only a fraction of the total TA cohort for the GCL course. Their thoughts have provided insight into some of the challenges for General Chemistry Laboratory TAs, but their experiences might not be representative of the entire group of TAs at this particular institution. Furthermore, the findings presented here are not necessarily generalizable to other situations.

Coding TA behavior based on video data requires making some assumptions. First, because video was captured using a single camera, sometimes the TA was far away from the camera or had their back turned, making it difficult to see what they were doing or who they were talking to. This challenge was magnified in sections where students chose to work in the hallway just outside the laboratory to develop project plans for the next class meeting, and TAs would follow once all groups finished experimentation. In these situations where the TA was not visible, coding relied on audio data.

Finally, the types of dialogue between TAs and students were coded, without coding for the content of those conversations. This study focused on the behaviors of TAs and their reasoning for their actions, but it may be relevant to know what was most commonly discussed (e.g., procedural details, chemistry concepts, data analysis, etc.) to understand the needs of TAs and students. Knowledge of frequently asked questions or types of problems could help training developers better tailor activities to align with the realities of the classroom.

Implications for practice

To date, there are still few studies concerning TA beliefs in transformed chemistry laboratory curricula (Sandi-Urena et al., 2011a; Wheeler et al., 2017c). Our work contributes insight into the relationships among TA teaching practices, TA perceptions of the course leader's expectations, and TAs' personal beliefs about teaching and learning. We found that TAs did not interpret expectations consistently, and they struggled with a tension between satisfying the perceived expectations of the Lab Coordinator and teaching students in the way they deemed most helpful. Wheeler and co-workers (Wheeler et al., 2015, 2017a, 2017b, 2017c, 2019) have demonstrated the need for effective TA training in inquiry-based labs. We reiterate this message and show that two major areas that should be addressed in TA preparation are (1) making clear and explicit what the expectations are and why, and (2) teaching TAs pedagogical methods that may best support students.

Expectations influence actions—or inaction, as Giovanni noted about a fellow TA who did nothing to support her students for one project because she only knew what she was not supposed to do. Not only do TAs need to know what expectations are, they need to understand the reasoning behind them. For example, we saw that Misty did not know why the Lab Coordinator suggested that they have students “struggle a little” with the projects. Consequently, she relied on her past experiences as an undergraduate to decide that lab lectures are more helpful than letting students build their knowledge as they grapple with the experiments.

We saw that TAs found it difficult to help introductory students figure out how to approach the presented scenarios without resorting to telling them what to do or wanting to do certain techniques for them. To address TAs’ struggles with guiding novices in the lab, modeling helpful interactions, as has been done for TAs at the University of Virginia (Wheeler et al., 2015), is one practical approach to empower TAs to facilitate student learning. Such interactions may include scaffolding (van de Pol et al., 2010; Reiser and Tabak, 2014; Yuriev et al., 2017) or questioning (Tofade et al., 2013) techniques that TAs can use with students. While this sort of professional development may be offered in one-time TA training sessions, evidence suggests that short-term training is often ineffective (Feldon et al., 2017), and thus the ideal is to provide ongoing support. Wheeler et al. have discussed at length their approach to TA professional development for guided-inquiry laboratories, which focuses on three components: theory (on how people learn and expectancy-value theory), pedagogy, and practical issues (logistics, consistency, and TA expectations). This approach involves 25 hours of training at the start of the semester, followed by weekly meetings (20–30 contact hours) throughout the remainder of the term (Wheeler et al., 2017a; Wheeler et al., 2017c). While the TAs in this study had ∼5 hours of TA training at the start of the semester and participated in weekly meetings (∼1 hour per week) with the Lab Coordinator and fellow GCL TAs throughout the semester, our findings indicate that we need to use that time more effectively or allocate more time in order to better train and support TAs.

Beyond training sessions or courses in pedagogy, mentoring between senior and junior TAs can be an effective approach to provide ongoing support. Previously reported training programs for chemistry TAs (Richards-Babb et al., 2014; Lekhi and Nussbaum, 2015) have included mentoring components, and every participant in our study shared a specific way that more experienced TAs helped them in or outside the classroom. As Erika said, it is much more challenging if at first “you don’t have a good friend, a TA to talk to.” Thus, it may be important to facilitate the formation of mentoring relationships by explicitly identifying experienced TAs that new TAs can shadow and ask for advice or formally assigning mentor-mentee pairs.

Finally, to reduce TAs' desire to perform new lab techniques for students, one promising solution may be to incorporate digital badging to foster independence with common laboratory skills (Hensiek et al., 2016; Hennah and Seery, 2017; Seery et al., 2017). Badging allows students to demonstrate technical competence by recording a video in which they narrate their actions as they perform a specific laboratory skill (e.g., pipetting). If the student's video is adequate, the "badge" for that particular skill is awarded. These videos could assuage TAs' fears that students will be unable to safely or efficiently perform the experiments that they propose.

Conflicts of interest

There are no conflicts of interest to declare.


Appendix 1. Descriptions of projects observed in TA classroom videos

For all projects in the Cooperative Chemistry curriculum, students are required to write their own procedures for addressing the problem(s) posed in a given scenario. With the aid of provided background materials as well as the freedom to use any other resource of their choosing (textbooks, internet sources, etc.), each group decides what must be done to complete the project successfully and how to divide the work effectively among group members. The scenarios and required tasks for each project are provided below.
GCL1 observation 1: “identification and synthesis of an unknown ionic compound”. Students are provided with the following project scenario:

Your group is working as analytical chemistry interns with the [Institution] Office of Environmental Health and Safety (EHS). EHS is responsible for the collection and disposal of all chemical waste on campus.

Today, an unidentified white compound was discovered in one of the teaching labs, and your team has been called in to assist with the identification of the compound. Since your team is responsible for safe disposal of the compound, you will also need to determine as many of its physical and chemical properties as possible. EHS does not want to mix it with other waste that could create a potentially dangerous reaction.

As you are a team of interns, your supervisors have done some preliminary work and determined that the compound is likely one of the following:

NaCl KCl Na2SO4 CaCl2 MgSO4
Na2CO3 K2SO4 KNO3 Ca(NO3)2 (NH4)2CO3
NH4Cl (NH4)2SO4 CaCO3 MgCO3 CH3CO2Na

You will be given five grams (no more) of the compound; you will not know the identity of the compound, nor will you be given any other information about it.

Standard samples of the possible compounds along with the normal lab reagents will be available in the lab for you to test your hypotheses and compare with your unknown. Use the standards carefully; cross contamination or allowing the compounds to be exposed to air for long periods of time may cause erroneous results.

The required tasks for successful completion of this project are shown below:

1. Identify the unknown compound.

2. Discover as many chemical and physical properties of the compound as you can.

3. Devise two syntheses of the compound, and compare them for cost effectiveness, safety and potential yield of compound. Your group will then select the best for inclusion in the future work section of your formal report.

GCL1 observation 2: "food dye spectroscopy". Students are provided with the following project scenario:

You work for a discount beverage company that produces no-brand versions of popular sodas and fruit drinks. Your current project is to use the seven (7) allowed Food, Drug, and Cosmetic (FD&C) food colorings to mimic the color profile of a popular name brand beverage.

For this project you will be provided a commercial, artificially colored beverage to analyze. Your beverage will contain more than one FD&C food dye.

The required tasks for successful completion of this project are shown below:

1. Experimentally determine the relationship between the color of a compound and the wavelength of light absorbed.

2. Experimentally determine the relationship between the amount of light absorbed and the concentration of the colored species in solution.

3. Experimentally determine both the identity and concentration of the food dyes present in the beverage.

4. Create 100 mL of a sample solution with the correct color profile to compare side-by-side with the name brand beverage.

GCL2 observation 1: “soaps and detergents”. Students are provided with the following project scenario:

There has been another oil pipeline accident and the local wildlife has been covered with oil, so the local environmental group has decided to help. This group, to save money, has decided to make their own soap. The problem is the only recipe they have uses lard, animal fat, to make the soap. They, being the animal lovers that they are, would like an alternative.

It is your job to develop other types of soaps and detergents for this environmental group to use on the birds. The environmental group, being all for the environment, has requested that you test the soaps, detergents, and wastes from the processes of making the soaps and detergents for environmental impact. We have included their recipe for making the soap, and a well-known recipe for making detergents.

This is the second oil spill to hit the region in the past 50 years. There are horrific tales passed down about a scummy slime that was left on everything after the first oil spill was cleaned. Many suspect water contaminants were the cause of the scum. Since this is not always the case, the group thinks something in the well water used may be a problem. They have asked your team to research to see if you can determine the cause and find a way to prevent scum buildup.

The required tasks for successful completion of this project are shown below:

To make and test soaps and detergents in order to decide which one would be best for the environmental group to use in the future.

Your specific goals:

1. Test the solubility of fats, oils, soaps, and detergents.

2. Determine and compare the desirable properties of each soap and detergent.

3. Examine the environmental impact of the synthesis of soaps and detergents (particularly the pH of the respective waste waters).

4. Determine what is causing the scum.

5. Determine the best soap/detergent and protocol for cleaning up the oil.

GCL2 observation 2: “artificial kidney stones”. Students are provided with the following project scenario:

A kidney stone is a urologic disorder that is caused by the formation of a precipitate when some soluble ions present in blood and urine react. It is estimated that there were 2.7 million hospital visits and more than 600,000 emergency room visits in 2000 due to this disease. Scientists have found evidence of kidney stones in a 7000-year-old Egyptian mummy, and it remains one of the most common disorders of the urinary tract today.

In order to improve the quality of life for their patients, your team has been assigned by The Kidney Stone Center of the Rocky Mountains to investigate the formation of kidney stones and to suggest ways to dissolve and prevent them. (Accessed October 2015)

The required tasks for successful completion of this project are shown below:

1. Research the chemical composition of kidney stones. Numerous sites on the Internet and some chemistry textbooks are valuable sources of information.

2. Identify the major inorganic compounds present in kidney stones.

3. Prepare artificial kidney stones in a mini-scale laboratory. (Each group member should make a different type of artificial kidney stone.)

4. Research and experimentally test different methods of dissolving the artificial kidney stones your group synthesized.

5. Based on the results of the experiment, propose a strategy to prevent kidney stone formation.

Appendix 2. Description of TA training at this institution

Departmental new TA training. The Department of Chemistry devotes one week (∼9 am to ∼4 pm Monday to Friday) to New Graduate Student Orientation the week before classes begin. Approximately six hours (3 × 2 hour sessions) of this orientation are devoted to TA training. This training is not specific to any particular course because (1) new TAs are assigned to a variety of courses, and (2) TA assignments are not yet finalized by the time TA training sessions begin. Broadly, the three sessions covered the following topics:

1. Personal introductions, general information about teaching in the department (e.g., course numbers and corresponding course names, general responsibilities of TAs), addressing concerns about being a new TA

2. How people learn: introduction to constructivist ideas of teaching and learning, scientific practices and core ideas (from 3-dimensional learning), type I vs. type II thinking, metacognition, information processing and short-term memory

3. First day guide: professionalism, introducing oneself and contact information (email and office hours). Discussion of how to approach common problematic situations (e.g., improper lab attire, cheating on exams, not participating in class).

The TA training sessions for new 2017–2018 TAs were led by the General Chemistry Teaching Staff (Director of General Chemistry, Lecturer of General Chemistry, and General Chemistry Lab Coordinator), the first author (a postdoctoral scholar focused on chemistry education research [CER]), and the second author (a professor of chemistry with a focus on CER). The first session was led by the Director of General Chemistry, the second by the second author, and the third by the first author.

Of the TAs who participated in this study, two (2 – Koga and Misty) attended these training sessions. The remaining five (5) TAs would have participated in New Graduate Student Orientation in previous years. The second author was involved in previous years’ new TA training, but the first author was at a different institution and therefore not involved.

GCL-specific TA training. The Department of Chemistry has no dedicated time allotted for course-specific training. Instructors of Record are responsible for arranging a time to meet with TAs assigned to their course. For GCL, this is typically a ∼2 hour meeting before the first day of class. At least half of that time is dedicated to assigning sections to TAs (three sections per TA at 2 hours and 50 minutes per section). The remainder of the time focuses on reviewing TA responsibilities (grading, office hours, etc.) and navigating the online lab notebook.

In Spring 2018 (the semester of this study), the first author arranged three (3) additional meetings during the first week of classes. (Lab meetings did not begin until the second week of the semester.) These sessions were 90 minutes long and scheduled in the early afternoon, when no graduate courses were held in the Department, to minimize scheduling conflicts for the TAs. Lunch was provided to encourage attendance. Of the TAs who participated in the study, four (4 – Giovanni, Koga, Surge, and Erika) participated in full, two (2 – Brock and Misty) attended some of the sessions, and one (1 – Sabrina) was unable to attend any of the sessions.

The three sessions consisted of the following topics and activities:

1. Large group discussion of course philosophy (goals of general chemistry lab for a largely non-major population), unpacking of big goals like “critical thinking” and “problem solving” to the Framework's scientific practices, and relationship between the activities of the lab (i.e., the project scenarios and graded assignments) and meeting the goals of the course.

2. Large group discussion of scaffolding and questioning techniques that may help guide student thinking in productive directions. Small group activity in which TAs focused on one specific project in GCL1 or GCL2. Groups consisted of a mixture between new and experienced TAs for the course. Each group was asked to consider the student goals for each project, problems students typically encountered, and possible questions to guide students to meet project goals or overcome obstacles along the way.

3. Role-playing activity in which each small group played the role of a “student team” trying to plan the experiments to do for each project. Using the same groups as the previous session, the group that had focused on a particular project played the role of “TAs” to help the “student teams” with their plans. Due to time constraints, each group played the “TA” role for about 15 minutes before resuming the role of “students” for a different project.

Appendix 3. RIOT worksheet as administered to participants

[Image: the RIOT worksheet as administered to participants (c9rp00088g-u1.tif in the published article).]

Appendix 4. RIOT codes and definitions as provided to TAs during interviews

Action Description
^a Slack is the web-chat interface used to communicate with the Lab Coordinator and stockroom staff.
Clarifying Instructions Clarifying instructions, reading from lab documents
Explaining Content Explaining a concept without student input
Working on Apparatus Helping students with experimental apparatus or computers
Student Question Listening to a question from a student
Closed Dialogue TA controls conversation with student; student provides short answers
Open Dialogue No individual person controls the conversation; student responds with complete sentences and/or ideas
Passive Observing Listening briefly to group conversations
Active Observing Listening to one group but does not participate
Admin/Class Prep Cleaning, (un)locking drawers, Slack^a, writing on chalkboard
Not Interacting In a different room, non-course related chatting

Appendix 5. Protocol for interview 1

Intro. Today I want to ask you some questions about your experiences as a gen chem lab TA in this department. I’m looking for your honest opinions and feedback to learn what is currently working for you, as well as what needs to be improved. So don’t worry about tailoring your answers to what you think I want to hear, what my PI would want to hear, or what Joe would want to hear. I hope you’ll tell me what you think and feel about the course and your role as a TA.
Background. 1. What year are you?

2. What courses have you taught before?

a. How many times have you taught those courses?

Experiences. 1. How is lab going?

a. What do you think went well in lab this week?

b. Can you give an example of an interchange you had with students that you felt went particularly well? Why did it work well?

c. What did you feel did not go well in lab this week?

d. Why do you think these problems happened? How would you modify your teaching next time to deal with or prevent this problem?

Opinions/values. 1. When teaching lab, what do you see your job as?

2. What's hard about teaching this way, and what's easy?

a. Are there any materials or instructions you felt would have better prepared you to teach this lab?

3. What do you see as the advantages and disadvantages of a project-based lab course:

a. For you?

b. For the students?

4. How would you recommend tweaking the current format?

5. Do you think that this course helps teach what students should be learning in a chemistry lab?

Appendix 6. Protocol for interview 2

Intro. Today I’d like to talk a little more about the videos and the worksheet you filled out last time. But first, let's recap last lab:
Experience in lab. • Overall, how did this lab go?

• Do you think you did anything different during this lab compared to the last lab that was recorded?

• (If yes) What did you do differently? Why?

RIOT worksheet. • Here is what you wrote. Does this still look about right, or would you like to make any changes?
Course instructor/TA. 1. High/high

a. Can you give an example of this type of interaction?

b. Why do you think this type of interaction is important?

2. Low (or no time)/high

a. Can you give an example of this type of interaction?

b. Why do you think this type of interaction is important?

c. Why do you think the course instructor thinks this interaction is unimportant?

3. High/low (or no time)

a. Can you give an example of this type of interaction?

b. Why do you think the course instructor thinks this type of interaction is important?

c. Why do you think this interaction is unimportant?

Students/TA. 4. Low (or no time)/high

a. Can you give an example of this type of interaction?

b. Why do you think this type of interaction is important?

c. Why do you think students think this interaction is unimportant?

5. High/low (or no time)

a. Can you give an example of this type of interaction?

b. Why do you think students think this type of interaction is important?

c. Why do you think this interaction is unimportant?

Appendix 7. Percentage (%) of students by major category

Engineering & Computer Sciences 26 11
Life & Environmental Sciences 42 65
Physical Sciences 2 4
Pre-Health & Pre-Veterinary 10 2
Mathematics & Statistics 6 10
Other & Undeclared 14 8

Appendix 8. Mean and standard deviation of the percentage of time TAs spent on each conversing or observing action

“Conversing” actions include Student Question (SQ), Closed Dialogue (CD), and Open Dialogue (OD); the sum of these values is represented by %Conv. “Observing” actions include Passive Observing (PO) and Active Observing (AO); the sum of these values is represented by %Obs. The sum of all conversing and observing actions (%Conv + %Obs) is represented by “Attention Time” (AT).
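The summed quantities defined above can be sketched in code. This is an illustrative computation only; the example percentages below are invented for the sketch and are not the study's data.

```python
# Compute the Appendix 8 summary quantities from per-action RIOT percentages.
# The keys follow the abbreviations in the text: SQ, CD, OD (conversing)
# and PO, AO (observing).
def summarize(actions):
    conv = actions["SQ"] + actions["CD"] + actions["OD"]   # %Conv
    obs = actions["PO"] + actions["AO"]                    # %Obs
    return {"%Conv": conv, "%Obs": obs, "AT": conv + obs}  # Attention Time

# Hypothetical TA: 45% of class time conversing, 20% observing
example = {"SQ": 10, "CD": 15, "OD": 20, "PO": 12, "AO": 8}
print(summarize(example))  # {'%Conv': 45, '%Obs': 20, 'AT': 65}
```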

Appendix 9. “High” and “low” cutoffs for each observing or conversing action

“High” represents a percentage 0.5 SD above the mean; “low” represents a percentage 0.5 SD below the mean.
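A minimal sketch of this cutoff rule, assuming the sample standard deviation and a "medium" label for the middle band (the input percentages are illustrative, not the study's data):

```python
# Classify each value as "high" (> mean + 0.5 SD), "low" (< mean - 0.5 SD),
# or "medium" otherwise, relative to the cohort of values supplied.
from statistics import mean, stdev

def classify(values):
    m, sd = mean(values), stdev(values)
    def label(v):
        if v > m + 0.5 * sd:
            return "high"
        if v < m - 0.5 * sd:
            return "low"
        return "medium"
    return [label(v) for v in values]

# e.g., percentage of time seven TAs spent on one observing action
print(classify([10, 12, 15, 20, 25, 30, 40]))
```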


Acknowledgements

The authors wish to acknowledge Dr Kinsey Bain and Dr Aishling Flaherty for their assistance with analysis of video and interview data, respectively. The material presented is based on work supported by the Howard Hughes Medical Institute (#52008102).


  1. Addy T. M. and Blanchard M. R., (2010), The problem with reform from the bottom up: instructional practises and teacher beliefs of graduate teaching assistants following a reform-minded university teacher certificate programme, Int. J. Sci. Educ., 32(8), 1045–1071.
  2. American Chemical Society (ACS), (2012), Advancing Graduate Education in the Chemical Sciences, Washington, DC: American Chemical Society.
  3. Boman J. S., (2013), Graduate Student Teaching Development: Evaluating the Effectiveness of Training in Relation to Graduate Student Characteristics, Can. J. High. Educ., 43(1), 100–114.
  4. Carmel J. H., Herrington D. G., Posey L. A., Ward J. S., Pollock A. M. and Cooper M. M., (2019a), Helping Students to “Do Science”: Characterizing Scientific Practices in General Chemistry Laboratory Curricula, J. Chem. Educ., 96(3), 423–434.
  5. Carmel J. H., Herrington D. G., Posey L. A., Ward J. S., Pollock A. M. and Cooper M. M., (2019b), Helping Students to “Do Science”: Characterizing Scientific Practices in General Chemistry Laboratory Curricula, J. Chem. Educ., 96(3), 423–434.
  6. Carmel J. H., Ward J. S. and Cooper M. M., (2017), A Glowing Recommendation: A Project-Based Cooperative Laboratory Activity To Promote Use of the Scientific and Engineering Practices, J. Chem. Educ., 94(5), 626–631.
  7. Cooper M. M., (1994), Cooperative Chemistry Laboratories, J. Chem. Educ., 71(4), 307.
  8. Cooper M. M. and Klymkowsky M. W., (2013), Chemistry, life, the universe and everything: a new approach to general chemistry, and a model for curriculum reform, J. Chem. Educ., 90, 1116–1122.
  9. Cooper M. M., Underwood S. M., Hilley C. Z. and Klymkowsky M. W., (2012), Development and Assessment of a Molecular Structure and Properties Learning Progression, J. Chem. Educ., 89(11), 1351–1357.
  10. Feldon D. F., Jeong S., Peugh J., Roksa J., Maahs-Fladung C., Shenoy A. and Oliva M., (2017), Null effects of boot camps and short-format training for PhD students in life sciences, Proc. Natl. Acad. Sci. U. S. A., 114(37), 9854–9858.
  11. Flaherty A., O’Dwyer A., Mannix-McNamara P. and Leahy J. J., (2017), Evaluating the Impact of the “Teaching as a Chemistry Laboratory Graduate Teaching Assistant” Program on Cognitive and Psychomotor Verbal Interactions in the Laboratory, J. Chem. Educ., 94(12), 1831–1843.
  12. Flora B. H., (2007), Graduate Assistants: Students or staff, policy or practice? The Current Legal Employment Status of Graduate Assistants, J. High. Educ. Policy Manage., 29(3), 315–322.
  13. Goertzen R. M., Scherr R. E. and Elby A., (2009), Accounting for tutorial teaching assistants’ buy-in to reform instruction, Phys. Rev. Spec. Top. – Phys. Educ. Res., 5, 020109.
  14. Hennah N. and Seery M. K., (2017), Using Digital Badges for Developing High School Chemistry Laboratory Skills, J. Chem. Educ., 94(7), 844–848.
  15. Hensiek S., DeKorver B. K., Harwood C. J., Fish J., O’Shea K. and Towns M. H., (2016), Improving and Assessing Student Hands-On Laboratory Skills through Digital Badging, J. Chem. Educ., 93(11), 1847–1854.
  16. Hofstein A. and Lunetta V. N., (1982), The Role of the Laboratory in Science Teaching: Neglected Aspects of Research, Rev. Educ. Res., 52(2), 201–217.
  17. Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundations for the twenty-first century, Sci. Educ., 88, 28–54.
  18. Koo T. K. and Li M. Y., (2016), A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research, J. Chiropr. Med., 15(2), 155–163.
  19. Lekhi P. and Nussbaum S., (2015), Strategic Use of Role Playing in a Training Workshop for Chemistry Laboratory Teaching Assistants, Can. J. High. Educ., 45(3), 56–67.
  20. Luft J. A., Kurdziel J. P., Roehrig G. H. and Turner J., (2004), Growing a garden without water: graduate teaching assistants in introductory science laboratories at a doctoral/research university, J. Res. Sci. Teach., 41(3), 211–233.
  21. Mabery C. F., (1892), Aims of Laboratory Training, Science, 19(490), 351–354.
  22. Miller K., Brickman P. and Oliver J. S., (2014), Enhancing Teaching Assistants’ (TAs’) Inquiry Teaching by Means of Teaching Observations and Reflective Discourse, Sch. Sci. Math., 114(4), 178–190.
  23. Mutambuki J. M. and Schwartz R., (2018), We don’t get any training: the impact of a professional development model on teaching practices of chemistry and biology graduate teaching assistants, Chem. Educ. Res. Pract., 19(1), 106–121.
  24. National Research Council, (2012), A framework for K-12 science education: practices, crosscutting concepts, and core ideas, Washington, DC: National Academies Press.
  25. National Research Council, (2000), Inquiry and the National Science Education Standards: A Guide for Teaching and Learning, Washington, DC: The National Academies Press.
  26. Oleson A. and Hora M. T., (2014), Teaching the way they were taught? Revisiting the sources of teaching knowledge and the role of prior experience in shaping faculty teaching practices, High. Educ., 68(1), 29–45.
  27. Paul C. and Reid A., (n.d.), SJSU RIOT, viewed 18 March 2018.
  28. Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185.
  29. Reiser B. and Tabak I., (2014), Scaffolding, in The Cambridge Handbook of the Learning Sciences, Cambridge Handbooks in Psychology, Cambridge: Cambridge University Press, pp. 44–62.
  30. Rev, (n.d.), Rev, viewed 18 March 2019.
  31. Richards-Babb M., Penn J. H. and Withers M., (2014), Results of a Practicum Offering Teaching-Focused Graduate Student Professional Development, J. Chem. Educ., 91(11), 1867–1873.
  32. Rodriques R. A. B. and Bond-Robinson J., (2006), Comparing Faculty and Student Perspectives of Graduate Teaching Assistants’ Teaching, J. Chem. Educ., 83(2), 305.
  33. Sandi-Urena S., Cooper M. M. and Gatlin T. A., (2011a), Graduate teaching assistants’ epistemological and metacognitive development, Chem. Educ. Res. Pract., 12(1), 92–100.
  34. Sandi-Urena S., Cooper M. M., Gatlin T. A. and Bhattacharyya G., (2011b), Students’ experience in a general chemistry cooperative problem based laboratory, Chem. Educ. Res. Pract., 12(4), 434–442.
  35. Sawada D., Piburn M. D., Judson E., Turley J., Falconer K., Benford R. and Bloom I., (2002), Measuring reform practices in science and mathematics classrooms: the reformed teaching observation protocol, Sch. Sci. Math., 102(6), 245–253.
  36. Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O’Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges, Chem. Educ. Res. Pract., 18(3), 403–419.
  37. Smith M. K., Jones F. H. M., Gilbert S. L. and Wieman C. E., (2013), The Classroom Observation Protocol for Undergraduate STEM (COPUS): A New Instrument to Characterize University STEM Classroom Practices, CBE–Life Sci. Educ., 12(4), 618–627.
  38. Swivl, (n.d.), Swivl, viewed 18 March 2019.
  39. Tofade T., Elsner J. and Haines S. T., (2013), Best Practice Strategies for Effective Use of Questions as a Teaching Tool, Am. J. Pharm. Educ., 77(7), 155.
  40. Underwood S. M., Reyes-Gastelum D. and Cooper M. M., (2016), When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula, Chem. Educ. Res. Pract., 17, 365–380.
  41. van de Pol J., Volman M. and Beishuizen J., (2010), Scaffolding in Teacher–Student Interaction: A Decade of Research, Educ. Psychol. Rev., 22(3), 271–296.
  42. Velasco J. B., Knedeisen A., Xue D., Vickrey T. L., Abebe M. and Stains M., (2016), Characterizing Instructional Practices in the Laboratory: The Laboratory Observation Protocol for Undergraduate STEM, J. Chem. Educ., 93(7), 1191–1203.
  43. Volkmann M. J. and Zgagacz M., (2004), Learning to teach physics through inquiry: the lived experience of a graduate teaching assistant, J. Res. Sci. Teach., 41(6), 584–602.
  44. West E. A., Paul C. A., Webb D. and Potter W. H., (2013), Variation of instructor-student interactions in an introductory interactive physics course, Phys. Rev. Spec. Top. – Phys. Educ. Res., 9, 010109.
  45. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2015), Teaching assistants’ perceptions of a training to support an inquiry-based general chemistry laboratory course, Chem. Educ. Res. Pract., 16(4), 824–842.
  46. Wheeler L. B., Clark C. P. and Grisham C. M., (2017a), Transforming a Traditional Laboratory to an Inquiry-Based Course: Importance of Training TAs when Redesigning a Curriculum, J. Chem. Educ., 94(8), 1019–1026.
  47. Wheeler L. B., Maeng J. L., Chiu J. L. and Bell R. L., (2017b), Do teaching assistants matter? Investigating relationships between teaching assistants and student outcomes in undergraduate science laboratory classes, J. Res. Sci. Teach., 54(4), 463–492.
  48. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2017c), Characterizing Teaching Assistants’ Knowledge and Beliefs Following Professional Development Activities within an Inquiry-Based General Chemistry Context, J. Chem. Educ., 94(1), 19–28.
  49. Wheeler L. B., Chiu J. L., Maeng J. L. and Bell R. L., (2019), An exploratory study of teaching assistants’ motivation for inquiry-based teaching in an undergraduate laboratory context, Chem. Educ. Res. Pract., 20(1), 53–67.
  50. Wilcox M., Yang Y. and Chini J. J., (2016), Quicker method for assessing influences on teaching assistant buy-in and practices in reformed courses, Phys. Rev. Phys. Educ. Res., 12(2), 020123.
  51. Williams L. C., Underwood S. M., Klymkowsky M. W. and Cooper M. M., (2015), Are Noncovalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches, J. Chem. Educ., 92(12), 1979–1987.
  52. Yuriev E., Naidu S., Schembri L. S. and Short J. L., (2017), Scaffolding the development of problem-solving skills in chemistry: guiding novice students out of dead ends and false starts, Chem. Educ. Res. Pract., 18, 486–504.

This journal is © The Royal Society of Chemistry 2020