We don’t get any training: the impact of a professional development model on teaching practices of chemistry and biology graduate teaching assistants

Jacinta M. Mutambuki *a and Renee Schwartz b
aThe Teaching Center, Washington University in St. Louis, One Brookings Dr., Box 1022, St. Louis, MO 63130, USA. E-mail: Jacinta.mutambuki@wustl.edu
bCollege of Education & Human Development, Georgia State University, Atlanta, GA 30302, USA. E-mail: rschwartz@gsu.edu

Received 16th July 2017, Accepted 15th October 2017

First published on 16th October 2017


Abstract

This study investigated the implementation of best teaching practices by science graduate teaching assistants (GTAs; three chemists and two biologists) in five inquiry-based, interdisciplinary chemistry-biology experiments during a six-week professional development (PD) program, Engage PD. Additionally, we examined GTAs' experiences in implementing specific PD aspects. The PD program took place as the GTAs taught sections of biology and chemistry laboratory courses, each comprising five interdisciplinary experiments. The PD aspects included defining expected learning outcomes, subject-matter knowledge, relevance to the real world and chemistry-biology connections, and other active classroom teaching practices. Data were collected through classroom observations, reflection questionnaires, and individual interviews. Findings indicated that 57% of the PD aspects investigated were implemented in the five interdisciplinary experiments. Results also revealed GTAs' initial areas of struggle in implementing specific PD aspects. Perceived implementation difficulties were attributed to individual perceptions and beliefs, and to contextual factors. Through practice, continuous feedback, and reflection, most GTAs overcame the hurdles and refined their teaching. Findings imply the need to design PD programs that offer mentoring and support to GTAs and future faculty in implementing teaching innovations. The teaching context and reflection prompts are helpful in identifying areas of difficulty and ways to improve.


Introduction

Science graduate teaching assistants (GTAs) at research-based universities play a key role in undergraduate instruction, particularly in teaching the laboratory component of gateway courses (Luft et al., 2004; Connolly et al., 2016; Zehnder, 2016). For decades, these courses have been criticized for the “cookbook style” of laboratory experiments and expository methods of teaching, with emphasis placed on verification of scientific claims and memorization of science facts (Seymour and Hewitt, 1997; Buck et al., 2008; National Academies, 2010), and for complex and abstract concepts that serve to weed out students from majoring in STEM (Bok, 2006; Baldwin, 2009). Additionally, they often lack relevance to students' interests, connections to real-world experiences (Seymour and Hewitt, 1997; Rodriques and Bond-Robinson, 2006; National Academies, 2010), and interdisciplinary connections (National Academies, 2010). These shortcomings, among others, have been identified as possible reasons why many STEM entrants switch to non-STEM fields during the first two years of college (e.g., Seymour and Hewitt, 1997; Chen, 2015). Adoption of effective teaching approaches and classroom practices is important for increasing retention rates and improving student learning in undergraduate STEM education (President's Council of Advisors on Science and Technology (PCAST), 2012).

It is often assumed that GTAs can teach comfortably and possess both content knowledge in their assigned discipline and pedagogical knowledge (Burke et al., 2005; Zehnder, 2016); however, many accept teaching responsibilities with little training in effective teaching practices (Golde and Dore, 2001; Austin et al., 2009). Besides fulfilling departmental teaching duties (Austin, 2002), many graduate students aspire to pursue careers in teaching as faculty members and hence anticipate preparation for teaching (Golde and Dore, 2001; Connolly et al., 2016). Interestingly, only a few graduate programs concern themselves with providing the kinds of experiences science GTAs need to develop the knowledge and skills required to assist their current students and, as future faculty members, their future students (e.g., Kurdziel et al., 2003; Marbach-Ad et al., 2012; Wyse et al., 2014; Wheeler et al., 2015; Connolly et al., 2016; Dragisich et al., 2016a, 2016b). In short, many graduate programs, with their emphasis on educating researchers, largely neglect the advancement of pedagogical knowledge (e.g., Rushin et al., 1997; Luft et al., 2004; Schussler et al., 2015).

The literature indicates that GTAs often spend more time interacting with undergraduate students than faculty do, and students view them as more approachable, more relatable, and better able to personalize teaching than faculty (e.g., Kendall and Schussler, 2012). Bettinger and co-authors also showed that undergraduate students taught by GTAs are more likely to progress to advanced levels in their major than their counterparts taught by full-time faculty in the same course (Bettinger et al., 2016). As the ACS guidelines state, “Teaching assistants must be properly trained and supervised” (2015 ACS Guidelines for Bachelor's Degree Programs, p. 4). As such, training programs should not only strive to expose this cadre of professionals to effective teaching practices (Anderson et al., 2011; American Chemical Society (ACS) Committee on Professional Training, 2015; American Association for the Advancement of Science (AAAS), 2015), but also nurture them on the journey of implementation.

Reported training models for GTAs include pedagogy courses (e.g., Kurdziel et al., 2003; Marbach-Ad et al., 2012; Dragisich et al., 2016a, 2016b; Zehnder, 2016), self-reflection on one's videotaped teaching sessions (e.g., Rodriques and Bond-Robinson, 2006), peer observations and reflections (Miller et al., 2014), observations of experienced faculty's teaching (e.g., Volkmann and Zgagacz, 2004), and workshops on pedagogy and professional development skills (Dragisich et al., 2016a, 2016b). Despite the existence of some GTA preparation programs, assessment of the effectiveness of many of these programs has relied on self-reported data, especially GTAs' perceptions of the programs, and less on how GTAs implement aspects of the teaching-development programs they engage in. The current study is one of few, if any, to examine how GTAs implemented aspects of a professional development (PD) program, the Engage PD, within an inquiry-based context and with ongoing support and mentoring.

For instance, Wheeler et al. (2015) reported a training program on guided inquiry instruction for general chemistry teaching assistants assigned to teach an undergraduate laboratory course. The authors collected data using open-ended surveys and semi-structured interviews with 28 TAs to examine GTAs' perceptions of the training's influence on their implementation of a guided inquiry approach. Many GTAs expressed that specific components of the training positively impacted their teaching. Frequently highlighted components of the inquiry training included authentic experiences from completing the experiments, reviewing logistics, and supporting documents (Wheeler et al., 2015).

Dragisich et al. (2016a) reported findings from a professional development course with 38 first-year chemistry graduate students. The content of the course was a mix of practices related to teaching and professional development. The authors investigated the usefulness of the course and related materials, and its perceived impact on the teaching assignments of the GTAs. GTAs reported positive experiences of the effectiveness of the course modules, with a majority of participants rating most of the modules highly. Findings from peer reflections on GTAs' classroom behaviors also indicated that 26 out of 37 GTAs were rated as excellent instructors. Similar findings were reported by Marbach-Ad and co-authors (2012) with chemistry GTAs following their participation in a pedagogy mini-course, in which the GTAs reported positive feelings about the course content and structure. Another study reported by Dragisich et al. in 2016 with 40 chemistry GTAs involved a two-week GTA training program focused on departmental orientation and university policies, classroom management, pedagogies, safety, and resources for instructors and students. A majority of GTAs reported positive experiences from the training, with many indicating a favorable or neutral impact of the training program on their preparedness to teach. Additionally, the GTAs were perceived by their undergraduate students to be knowledgeable and well prepared to teach (Dragisich et al., 2016b).

Additional research is needed to establish whether GTAs indeed implement aspects of professional development (PD) programs, how they implement these practices, and what their experiences are in implementing them. In this design-based research study, we address these gaps as we illustrate one way in which science GTAs could be provided ongoing support and mentoring to develop knowledge about inquiry instruction and other classroom practices suited to learning science in a laboratory environment. In the subsequent section, we describe the context, articulating the course curriculum first and then the PD aspects.

Context

Course curriculum: inquiry-based interdisciplinary laboratory modules

The establishment of the Engage PD was precipitated by the need to reform our General Chemistry (CHEM 1100) and Introductory Molecular and Cellular Biology (BIOS 1500) laboratory courses, which are both required for biology and chemistry majors. Traditionally, these courses had rather low average scores (especially the chemistry courses) and contributed to attrition in these science disciplines. In particular, data from the university in question indicate that approximately 48% of the students who begin as chemistry majors, and take CHEM 1100, choose to leave the major after one year. Similarly, 39% of the students who begin as biology majors, and take BIOS 1500, choose to leave after one year. Such shared characteristics of gateway courses are widely criticized by stakeholders in undergraduate STEM education.

We modified five laboratory investigations for each course to incorporate inquiry-based instruction and increase the level of student engagement in learning. The modules also explicitly identified general concepts (Appendix 1) that draw connections between biology and chemistry concepts, thus highlighting science as interdisciplinary. The interdisciplinary experiments were implemented in four of the twenty-seven sections of the CHEM 1100 laboratory course; six of the fourteen sections of the BIOS 1500 laboratory course; and two sections for honors students enrolled in the BIOS 1500 laboratory course. These modules were taught in conjunction with the typical traditional experiments (four for CHEM 1100 and five for BIOS 1500), all targeting the same content objectives. Each section comprised 24 students. The labs met once a week for approximately three hours.

Inquiry-based instruction was considered because of its potential to equip students in STEM fields with research skills relevant to society (NRC, 1996; National Academies, 2010; Brewer and Smith, 2011), and because it “helps students to learn in the same way that scientists learn through research. Scientists ask questions, make observations, take measurements, analyze data, and repeat this process in an attempt to integrate new information” (National Academies, 2010, p. 4). Science laboratories provide an appropriate setting for scientific inquiry where students can engage in more authentic scientific practices (NRC, 2012). The level of inquiry depends on what information is provided to the students and what the students are responsible for developing themselves (Buck et al., 2008). The modules were designed around mixed levels of inquiry comprising Level 1 and Level 2 (Buck et al., 2008). For Level 1, the problem, theory, and experimental procedures are provided; hence students work on analyzing results, communicating results, and drawing conclusions. For Level 2, only the problem and theory are provided, so students also develop the experimental procedure. The modules thus provided opportunities for students to either design the entire experimental procedure or design parts of it with minimal guidance from the TAs (Buck et al., 2008). The mixed levels of inquiry were purposely chosen in designing the modules to allow students and TAs, respectively, to adjust gradually to learning and teaching through this approach. An example of the interdisciplinary laboratory experiments implemented in the biology (BIOS 1500) and chemistry (CHEM 1100) laboratory courses is presented in Appendix 2.

Development and implementation of the Engage PD program

Teaching is a complex problem-solving task that requires learning (Hardré, 2005). Educators believe that learning is situated; that is, how a person learns a set of knowledge and skills and the context in which the learning takes place are vital components of what is learned (Adler, 2000; Borko, 2004). The Engage PD is a six-week program designed to expose science TAs to various best teaching and classroom practices, and to mentor and support them as they translate the PD aspects into their teaching. TAs assigned to teach the laboratory sections containing the interdisciplinary modules engaged in the mentored PD program as they taught. The goal of the Engage PD was not only to expose them to inquiry instruction, but also to other best classroom practices reported in the literature. Table 1 provides a summary of the PD aspects discussed and the program timeline over the six weeks. We chose to focus on these PD aspects partly because of their suitability to the course goals and structure; however, these aspects can also be adapted to non-laboratory learning environments. Six PD sessions were held during one academic semester while the GTAs were teaching the laboratory courses. Each session lasted about 90 minutes.
Table 1 The Engage professional development program timeline
PD aspects and topics discussed Timing
Week(s) the aspect was introduced to participants Week(s) of follow-up
(1) How People Learn (Bransford et al., 2000). Week 1
(2) Subject-matter knowledge and lesson goals (Shulman, 1987; Hausfather, 2001; Richardson, 2003). Week 1 to 5 Week 2 to 6
(3) Relevance/real-world applications and chemistry-biology interdisciplinary connections (e.g., Holbrook, 2005; Schwartz, 2006; National Academies, 2010; AAAS, 2015). Week 1 to 5 Week 2 to 6
(4) Philosophy and implementation of guided inquiry-based learning (Lederman, 1999; Buck et al., 2008)—Levels 1 & 2 (Buck et al., 2008). Week 1 Week 2 to 6
(5) Questioning techniques (Nilson, 2010; Tofade et al., 2013). Week 1 Week 2 to 6
(6) Wait time 1 (Rowe, 1986; Atwood and Wilen, 1991; Stahl, 1994; Brooks and Brooks, 1999). Week 1 Week 2 to 6
(7) Lesson closure and benefits (Webster et al., 2009; Nilson, 2010). Week 1 Week 2 to 6
(8) Formative classroom assessment techniques (CATs) and implementation (Cross and Angelo, 1993; Nilson, 2010). Week 2 Week 3 to 6
(9) Reflection on Teaching (Shulman, 1987; Lang and Olson, 2000; Cornu and Peters, 2005). Week 1 Week 2 to 6


The PD sessions were facilitated by two experienced faculty members and a doctoral student, all with backgrounds in science education. One faculty member and the doctoral student had specialized backgrounds in chemistry, while the other faculty member specialized in biology. Additionally, the doctoral student had prior experience teaching the CHEM 1100 laboratory course and three years of experience teaching university-level science through inquiry-based approaches. Each PD meeting took place a day before the TAs taught the interdisciplinary laboratory experiments. Apart from classroom assessment techniques (CATs), which were introduced in Week 2, most PD aspects were introduced during Week 1 and followed up with details and examples over the subsequent five weeks. More information on how discussions about the PD aspects (1–8) proceeded during the meetings is presented in Appendix 3.

PD meetings in Weeks 2 to 6 also focused on individual reflections on the previous week's lesson, in which each TA shared with their peers and the facilitators successes and failures in implementing the PD aspects, and plans for improving their instruction. The interdisciplinary learning community was considered instrumental in providing feedback and support as TAs identified ways to improve their classroom practices around the PD aspects. To reinforce TAs' reflections on their teaching, each PD participant was issued a hard copy and emailed an electronic copy of a reflection questionnaire. The questionnaire contained open-ended prompts requiring the TAs to document how they implemented aspects 2 through 8 (Table 1); what challenges students experienced during the lesson and how they addressed those challenges; and the areas they should improve on and how they would improve on them in the subsequent lesson. Each PD participant was compensated $600 after data collection.

Assessment of the Engage PD program

To assess the impact of the program, we investigated the classroom teaching practices of five science GTAs as they participated in the PD program, and their experiences in implementing the PD aspects. Two research questions guided the current study: (1) Are aspects of the Engage PD program translated into science GTAs' teaching assignments? If so, how are they implemented? (2) What are the GTAs' experiences in implementing the aspects of the Engage PD?

Methods

Participants

The study was conducted at a research-based university located in the Midwestern United States. The study was approved by the Institutional Review Board (IRB), and treatment of the participants was in accordance with established ethical standards. Participation in the study was voluntary, and all participants consented to have their data used for research purposes. Seven TAs (two biology undergraduate majors and five graduate students, three from chemistry and two from biology) assigned to teach laboratory sections containing the revised experiments participated in the PD program; however, findings reported herein are based on the graduate teaching assistants (GTAs) only. The GTAs were selected because they were presumed to have richer background knowledge of their disciplines, and more teaching experience, than undergraduate TAs. Of the five GTAs, three were female and two were male. All the chemistry GTAs were in the third year of their PhD program, while the biology GTAs were both enrolled in master's programs, one in her first year and the other in his third year at the time the PD program was implemented.

Of the five GTAs, four had two semesters of experience teaching the first three interdisciplinary experiments, and one semester teaching the last two interdisciplinary experiments. Two of the three chemistry GTAs developed one interdisciplinary lesson each, while one of the two biology GTAs participated in the development of the five biology experiments. We note that GTAs with prior teaching experiences reported similar successes and difficulties in implementing the PD aspects as did their peers with little or no experience. A summary of the GTAs’ demographic information is provided in Table 2.

Table 2 Demographic information of the graduate teaching assistants
GTA name (pseudonyms) Department Teaching experience in years as a GTA (prior to the PD) Teaching experience in years (interdisciplinary experiments) No. of interdisciplinary experiments developed
(a) 1 year of experience as an undergraduate teaching assistant.
Catherine Chemistry 2 1 1
Cindy Chemistry 2 1 0
Christopher Chemistry 2 1 1
Bridget (a) Biology 0 0 0
Bill Biology > 2 1 5


Typically, GTAs at the university in question secure teaching appointments in exchange for a stipend and free tuition for their graduate programs. Apart from specialization in the discipline of study, neither pedagogical knowledge nor prior teaching experience is required to qualify for a GTA appointment. GTAs are randomly assigned to teach two to three laboratory sections of an introductory lower-level course, with a light teaching load (one section) sometimes given to GTAs who receive supplemental research funding from their advisors. The appointment duties for science GTAs include teaching laboratory sections in the discipline of specialization, in which they deliver pre-lab lectures; carry out experimental demonstrations, including walking students through the experimental procedure while clarifying specific amounts of reagents to use and expected outcomes; supervise students during the experiments; grade laboratory reports and students' responses to post-laboratory questions; and compute final laboratory grades for submission by the instructor of record.

Prior to the PD program, the five GTAs had participated in a one-day university-wide TA training as well as a half-day departmental training. The former familiarized them with university policies and expectations, while the latter exposed them to safety rules, safety equipment, and other relevant materials in the laboratories. Additionally, GTAs were issued laboratory manuals with clearly outlined procedures, and attended weekly departmental meetings with the respective professor of record for each introductory-level course. The weekly TA meetings involved discussions of the experimental setup, safety precautions for each experiment, anticipated student expectations, and grading of the laboratory report, with a grading key issued to them. These meetings were in addition to the Engage PD sessions. We note that the laboratory instruction for these introductory laboratory courses is not rigid; rather, GTAs are allowed some freedom and flexibility to incorporate other instructional practices to reinforce student learning. For example, they can choose to engage students with questions, have students work in groups of three to four rather than in pairs, implement low-stakes assessments, or use other approaches like those incorporated in the PD program. The “added” low-stakes assessments usually count for fewer than 5 points toward the final laboratory grade.

Data collection

Data were collected through classroom observations, a 30–60 minute semi-structured interview with each participant, and completed reflection questionnaires. Classroom observations were conducted with each GTA for the five interdisciplinary experiments through an observer and/or video recording. The interdisciplinary laboratory sessions were purposely scheduled to avoid overlap, which allowed the researchers an opportunity to observe each participant; however, the interdisciplinary sessions were video recorded when the two researchers were unable to observe the participants in person. The researcher documented, through field notes, evidence of the seven PD aspects (2–8, Table 1) implemented by the PD participants in the five modules. These notes detailed what aspects were incorporated and how they were implemented (Research Question 1).

A semi-structured, in-depth interview protocol was employed at the end of the semester to capture GTAs' experiences in teaching the interdisciplinary experiments around the PD aspects. The interviews served to address Research Question 2. During the interviews, participants were asked to describe (1) their experiences in implementing the PD aspects; (2) perceived challenges in implementing the PD aspects; and (3) how the Engage PD program influenced their teaching practices. Gathering this information was important in shedding light on each participant's observed classroom teaching practices. All the interviews were audio-recorded. GTAs' responses to the reflection questionnaires previously described herein were also used as data. The reflection questionnaire and the interview protocol are provided in Appendix 4.

Data analysis

Analyses of qualitative data were informed by the research questions (Creswell, 2003, 2007). For classroom observations, analysis was conducted by one researcher and proceeded in two phases: Phase I involved identifying evidence of the PD aspects (2–8) in the GTAs' teaching sessions for the interdisciplinary modules from both field notes and video recordings, and Phase II involved identifying patterns in their implementation styles with respect to the observed aspects. For evidence of the PD aspects (Phase I), the researcher first identified the aspects implemented in each of the five modules by each GTA, then determined frequency counts of the number of GTAs who implemented a given PD aspect across the five modules. For Phase II, the researcher looked for similarities and differences in the implementation styles of the observed PD aspects among the five GTAs. Two major themes were generated from these analyses: (1) most GTAs implemented the seven PD aspects, but implementation of some aspects was gradual; and (2) GTAs varied in their implementation of some PD aspects. We note that similar themes and data from the other sources were triangulated with these themes to provide a meaningful interpretation of the results.
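To make the Phase I counting concrete, the sketch below shows one way such frequency counts could be tabulated if the observation codes were entered as simple records. This is an illustrative example only, not the authors' analysis procedure, and all names and codes in it are hypothetical placeholders.

```python
# Illustrative sketch only (not the authors' analysis code).
# Assumes each classroom observation is recorded as a (GTA, experiment, PD aspect) tuple.
from collections import defaultdict

# Hypothetical placeholder records; in the study these came from field notes and videos.
observations = [
    ("Cindy", 1, "lesson goals"),
    ("Cindy", 1, "questioning techniques"),
    ("Bill", 1, "lesson goals"),
    ("Catherine", 2, "CATs"),
    # ... one record per PD aspect observed in a GTA's session
]

# Phase I: for each PD aspect and experiment, count how many distinct GTAs implemented it.
implementers = defaultdict(set)  # (aspect, experiment) -> set of GTA pseudonyms
for gta, expt, aspect in observations:
    implementers[(aspect, expt)].add(gta)

for (aspect, expt), gtas in sorted(implementers.items()):
    print(f"Expt {expt}: '{aspect}' implemented by {len(gtas)} GTA(s)")
```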

For interviews, Researcher 1 and Researcher 2 independently read and re-read two interview transcripts to make sense of the participants' statements, and assigned chunks of meaning (codes) to relevant segments of the transcripts (Hycner, 1985; Creswell, 2003, 2007). HyperRESEARCH qualitative analysis software was employed to help organize the codes (http://www.researchware.com/products/hyperresearch/). Both researchers compared and discussed the independently generated codes. The calculated percent agreement on the two transcripts was 91.7%, with a Cohen's kappa of 0.88 (N agreements = 44; N disagreements = 4; N cases = 48; N decisions = 96), indicating very good agreement (Komagata, 2002). The next phases of data analyses were conducted by Researcher 1.
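For readers unfamiliar with these inter-rater agreement statistics, the arithmetic follows directly from the reported counts; note that the expected chance agreement in the last line below is back-calculated from the reported kappa rather than taken from the paper.

```latex
% Observed (percent) agreement from the reported counts
P_o = \frac{N_{\text{agreements}}}{N_{\text{cases}}} = \frac{44}{48} \approx 0.917 \quad (91.7\%)

% Cohen's kappa corrects observed agreement for expected chance agreement P_e
\kappa = \frac{P_o - P_e}{1 - P_e}

% With the reported \kappa = 0.88, the implied chance agreement is
P_e = \frac{P_o - \kappa}{1 - \kappa} = \frac{0.917 - 0.88}{1 - 0.88} \approx 0.31
```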

The previously identified codes were applied to relevant segments of the remaining three interview transcripts while allowing new codes to emerge. The codes were reviewed by Researcher 2 against the three interview transcripts to ensure coherent interpretations (Tashakkori and Teddlie, 1998); no conflicts resulted. Codes identified from the interview transcripts were then merged into categories by Researcher 1. The categories were, in turn, applied to relevant segments of the reflection questionnaires while allowing new codes to emerge (Creswell, 2007). Categories related to those generated from the classroom observations were merged to generate themes (Creswell, 2003, 2007).

Procedures similar to those previously described for cross-verifying the data were applied to ensure coherent interpretations. Discrepancies in the developed categories and themes were discussed and resolved until a consensus was reached. Three major themes were generated from the interviews and reflection questionnaires: (1) GTAs reported initial difficulties in implementing the PD elements; (2) GTAs' initial difficulties were alleviated through continued support and nurturing; and (3) the Engage PD program was perceived as beneficial in improving GTAs' teaching practices. A review of the themes by a third party resulted in merging the first two themes into one: “GTAs experienced initial difficulties in implementing the PD aspects, but most overcame them over the semester.” A summary of the generated categories and themes based on the five interview transcripts and the reflection questionnaires is provided in Appendix 5. We discuss these themes, along with supporting participant data, in the subsequent section.

Results

The results are presented here by research question, with identified themes discussed for each.

Research question 1: are aspects of the Engage PD program translated into science GTAs’ teaching assignments? If so, how are they implemented?

Theme 1: most GTAs implemented all the assessed PD aspects, but implementation of some aspects was gradual. Classroom observation results showed that the GTAs implemented all seven PD aspects assessed in this study in at least one laboratory session during the PD training. Fig. 1 shows the PD aspects identified during the classroom observations versus the frequency of implementation in the five interdisciplinary experiments among the GTAs. Three of the seven PD aspects (i.e., highlighting lesson goals, integrating real-world applications and chemistry-biology connections, and questioning techniques) were implemented by all the GTAs in the five interdisciplinary experiments. Additionally, immediate implementation of CATs (graded and ungraded) was observed among all the GTAs in interdisciplinary Expt. 2 through Expt. 5. For wait time 1, we noted that the GTAs' average wait time increased from 2 s to 5 s during the first three experiments; wait times of 10 s and beyond were evident among four of the five GTAs, specifically in the last two experiments. These results suggest the GTAs became more comfortable letting their students think about and provide responses rather than filling the silence themselves.
Fig. 1 The PD aspects assessed during the classroom observations and the number of GTAs who implemented these aspects in teaching the five interdisciplinary experiments.

Two of the five GTAs (Cindy and Bill) consistently implemented inquiry lines of teaching in the five experiments. The number of inquiry implementers, however, gradually increased to three (plus Bridget) and four (plus Bridget and Catherine) during the fourth and fifth experiments, respectively. Whole-class lesson closures were only evident among the chemistry GTAs (Cindy, Catherine, and Christopher); however, Catherine and Cindy did not implement this aspect during the first and the third interdisciplinary experiments, respectively, while Christopher implemented this practice during Expt. 1 and switched to group closures for the subsequent experiments.

Theme 2: GTAs varied in their implementation of some PD aspects. Uniformity in implementation of the PD aspects among the GTAs was mainly observed for “communicating learning goals” and “adopting the inquiry instruction.” For lesson goals, GTAs articulated key ideas of what they expected students to be able to do during the lesson. For example, during the PCR experiment (see Appendix 2), Bridget introduced her lesson as follows: “Our objective today is to isolate DNA from corn chips then of course learn PCR…” (Source: Classroom Observations). When inquiry instruction was implemented, GTAs allowed students opportunities to devise experimental procedures, and to collect and analyze data with minimal guidance. There were, however, differences in the way the GTAs implemented some aspects, such as real-world and chemistry-biology connections, questioning techniques, CATs, and lesson closures. We discuss these differences in the sub-sections below.
Real-world and chemistry-biology connections. All the GTAs highlighted the real-world and chemistry-biology connections presented in each module; however, articulation of these connections was approached differently. For example, classroom observations showed that one chemistry GTA called on different students during the pre-lab lectures to read the chemistry-biology connections in the manual, then didactically unpacked the individual connections for students. The interviews confirmed similar sentiments:

Interviewer: Describe your experiences in integrating the chemistry-biology connection?

Catherine: … I had the students to actually read aloud the biology connection so that you know everyone could see: here is where biology is connecting with chemistry, here is where a lot of certain things we do in our daily lives actually matter, because even though the chemistry in the reaction is there, we want to know then what's going on in our bodies, what's going on in the environment from a biology perspective…

In contrast, the other GTAs probed students to draw the interdisciplinary connections during the interactive pre-laboratory lectures, and/or in discussion of the results during lesson closures. For example, in interdisciplinary Expt. 3, “Determining the Concentration of Acid in Food,” Cindy probed students during the pre-laboratory lecture to think about the acidity level in the human stomach, and why the stomach can withstand corrosion given the low-pH environment. Additionally, Cindy reinforced the biology connection to the chemistry principles during her lesson closure for interdisciplinary Expt. 1 by highlighting the existence of “salt in every biological system.” Below are examples of how Cindy integrated chemistry-biology connections during the pre-laboratory lecture in Expt. 3 and at the end of the lesson in Expt. 1, respectively.

Classroom Observations:

Expt. 3: Determining the Concentration of Acid in Food (Pre-Lab Lecture)

Cindy: So why do you think it's important to determine the pH of things? Like if something is really really acidic, what happens?

Student: It burns.

Cindy: It burns. What about if something is really really basic?

Student: It burns.

Cindy: They both burn, right? What do you think is the pH of the stomach acid?

Students: 2 [pH 2].

Cindy: 2 (Another student responds to the question).

Student: 4 [pH 4].

Cindy: So why can’t it burn you?

Students: We have stomach lining.

Expt. 1: Food Chemistry: A Qualitative Approach to Determine Biological Compounds (Combined-Group Lesson Closure)

Cindy: What did you find out?

Student: Salt is everywhere.

Cindy: salt is in every biological system. […]

Furthermore, while the provided real-world/chemistry-biology connections were integrated to the minimum requirements, most GTAs also integrated additional examples of these connections into their lessons beyond what was provided in the modules, though not equally across the two disciplines. For three GTAs (Cindy, Christopher, and Bridget), their generated examples focused more on their specific discipline of training and less on the other. For example, in Cindy's illustration above, she did not continue the conversation to explain the mechanism behind acid neutralization at the stomach lining (i.e., the production of bicarbonate-rich, alkaline mucus by the epithelial cells of the stomach wall). Similarly, in teaching Expt. 4, “Polymerase Chain Reaction (PCR),” Bridget linked the PCR concept to crime scene investigations, and even though she highlighted the chemical bonding, she did not explain in detail how the bonds are formed or broken during these processes. An example of how Bridget integrated the real-world/chemistry-biology connections, based on the reflection questionnaire, is provided below:

Expt. 4: Polymerase Chain Reaction

Reflection Questionnaire: Describe how you handled (explicitly taught) relevance of the concept or lesson to the students’ daily lives.

Bridget: Mostly linking the PCR on crime scene.

Describe how you handled (explicitly taught) the chemistry/biology connections.

Bridget: I talked about hydrogen and phosphodiester bonds, what they are and further talked about that during my explanation on DNA replication.

Based on the interviews, the chemistry GTAs indicated discomfort delving into connections outside their discipline. These GTAs were simply more confident in their understanding of the concepts within their own disciplines, and lacked sufficient efficacy to promote student understanding of relevant interdisciplinary connections beyond what was provided in the modules. The chemistry GTAs attributed their overemphasis on chemistry principles to the perception that more of their students were majoring in chemistry than in the other discipline (3 GTAs), and/or to their passion for their own discipline (1 GTA). For example, see the interview excerpts below:

Cindy: I just made sure that they knew the chemistry applications versus the biological ones, and I also don’t feel like a lot of students are in biology.

Christopher: As a chemistry TA, my passion is chemistry and I want them [students] to be excited about the chemistry, and so part of the challenge is not making it into a bio [biology] lab. There aren’t very many bio students actually in my class and so um to put a lot of the focus on biology, you know, I really didn’t do that this semester.

Questioning techniques. Each GTA asked a variety of short-answer and open-ended questions. The questions were mostly posed during the pre-laboratory lecture (lecture with questions) and at the lesson closure (discussion with questions). A few questions were also asked during the experimental work, particularly when the GTAs moved among the small groups. The questions varied in cognitive level. For example, two GTAs (Catherine and Bridget) often asked questions focused on lower-order thinking skills, particularly “what” type questions, with occasional higher-order questions, such as “why” type questions, in each of the five interdisciplinary experiments. An example of Bridget's questioning technique during the pre-lab lecture is presented below:

Classroom Observations: Interdisciplinary Expt. 3—pH and Temperature on Enzymatic Activity

Bridget: What is the control for the pH experiment?

Students: Temp

Bridget: Yes, temp is the control

Bridget: What is independent variable?

Students: Substrate and enzyme

Bridget: Yes

Bridget: For temp, what is the control?

Students: pH

In contrast, the other GTAs (Christopher, Cindy, and Bill) integrated both lower-order and higher-order questions in their questioning for all five interdisciplinary experiments. These three GTAs asked higher-order questions more often during each experiment than did Bridget and Catherine. The questions asked included “why” and “how” questions, or those requiring students to reflect on the results. Below are representative illustrations of Christopher's questioning at the beginning and at the end of the lesson, for Expt. 4 and Expt. 1, respectively:

Pre-Lab Lecture: Expt. 4—Leavening Power of the Baking Powder

Christopher: What do you know about baking powder?

Students: used in breads.

Christopher: Why is it used in breads?

Students: Causes bread to rise.

Christopher: Yes, it causes bread to rise.

Christopher: How does a baking powder cause bread to rise?

Students: It releases a gas.

Whole-Class Lesson Closure: Expt. 1—Food Chemistry: A Qualitative Approach to Determine Biological Compounds

Christopher: Why is it that the corn kernel showed negative results for lipid but the popped popcorn tests positive?

Student: The oil used in popping.

(TA also wrote results from each group on the board and engaged the whole class in a discussion).

Christopher: What made the results among your groups not the same?

Students: How much reagent/sample we put.

Christopher: Other reasons?

Student: Contaminations.

Classroom assessment techniques (CATs). Variation in the implementation of CATs included the use of ungraded CATs (4 GTAs), graded quizzes (2 GTAs), or both (1 GTA). Implemented CATs included the muddiest point (4 GTAs), application cards (3 GTAs), the one-minute paper (“what did you learn the most” from the lesson) (2 GTAs), and “focused description/listing” (1 GTA). According to the GTAs, the choice of implemented CATs was partly due to their simplicity and the low effort they required of students and the instructor.

Bridget: I implemented the “muddiest [point]” and “what did you learn the most.”

Interviewer: Why did you choose those ones?

Bridget: It's straight to the point, and I think it's easier for them to write; it's easy to understand the questions; easy to write so I think that's the best for them.

In implementing the graded quizzes, two GTAs posted a few items on the learning management system for students to complete prior to class. The questions were also discussed at the beginning of the lesson. These GTAs' reported motivation behind the graded quizzes or “focused description/listing” was to promote student preparation, that is, thinking about the concepts or the design of a given experiment prior to the laboratory:

Catherine: I implemented a quiz just for me to reinforce the reading before the lab…. I asked questions in the quiz that I thought were most important concepts of the experiment which, in turn, helped them realize “this is what the experiment is about. This is what we should be learning about, and this is probably how we should conduct the experiment.”

Christopher: I like the CAT where they came in and wrote down the procedure for the lab [focused description/listing], because for me they would read the lab; think about it before coming to class; and then hopefully would improve their lab experience because they won’t be as unclear throughout the lab on what the concept was about.

Lesson closure. As previously stated regarding lesson closures, only two chemistry GTAs consistently implemented whole-class lesson closures, while the other GTAs opted for group-to-group or combined-group lesson closures, with the biology GTAs implementing group-to-group closures. While in both formats the GTAs engaged students with questions to justify their results, we noted that whenever whole-class or combined-group lesson closures were implemented, the GTA tabulated results from each group or asked a group representative to write the results on the board, then asked students to compare and contrast the results. In contrast, group-to-group lesson closures reinforced reflection on the group's own observations. For example, see Christopher's example previously provided under “Questioning techniques.” We also provide an example of Cindy's one instance of a combined-group lesson closure:

Combined-Group Lesson Closure: Expt. 1—Food Chemistry: A Qualitative Approach to Determine Biological Compounds

Cindy: What did you find out?

Student: Salt is everywhere

Cindy: Salt is in every biological system.

(TA wrote results from each group on the board)

Cindy: Why do you think you got mixed results?

Student: Because some were positive and others not and we have different food items.

Cindy: Suppose you set up your experiment, would you go by what you think you know or by what you observe?

Student: By what I know.

Cindy: You cannot go with what you know because what you know may turn out to be different; you need to always rely on the observations to make conclusions

Research question 2: what are the GTAs’ experiences in implementing the aspects of the Engage PD?

Theme 3: GTAs reported initial difficulties in implementing some PD aspects, but most overcame them over the semester. Consistent with the observations (Fig. 1), the interview results indicated that all the GTAs initially experienced difficulties in implementing some PD elements, especially during the first two to three interdisciplinary lab sessions. Reported difficulties included implementing the inquiry approach (3/5 GTAs), wait time 1 (5/5 GTAs), formulating critical-thinking questions (2/5 GTAs), and holding whole-class lesson closures (3/5 GTAs). We discuss the reported hurdles and how GTAs improved on them below. We also provide examples of GTAs' reported hurdles (column 2) and perceived improvement in these areas (column 3) in Table 3.
Table 3 Engage PD aspects initially perceived as difficult to implement, and reported improvement by the GTAs
PD aspects Examples of participants’ excerpts (perceived initial implementation difficulties) Examples of participants’ excerpts (perceived improvement in implementation)
Guided inquiry-based instruction (3/5 GTAs; 60%) Catherine: While preparing for class, I would tend to think that the students would not get certain concepts or certain aspects of the laboratory experiment, so I would kind of start to doubt the students in a way before the class…. I think this doubting could have been because I was [not] a coach for them and I was not allowing them to think for themselves and, you know, let them figure it out [design experiments] on their own, and then if they really just had trouble then coming to me.

Christopher: I know the labs are meant to be inquiry-based, but I think that it is still difficult to implement when you have 24 students trying to do the same experiment in a 2 h 50 min period of time. Some of them you just know they haven’t noted the material until they show up for the pre-lab lecture…it can be somewhat difficult, especially when are not in the same level coming in. So to have them actually design an experiment [and] carry it out, I don’t know if there was a lot of that going on for a lot of experiments.

Catherine: I actually saw that letting students do it [design experiment] on their own and figuring it out, they would take that time to come up with something, you know, without me giving them much assistance. I was very happy for that and I guess I overcame one of those problems, you know, using inquiry…. At this last experiment [Energy Lab] even though it was the last, it was a learning experience for me, because I just said “no, this is what I’m gonna do; I’m gonna allow them to do it” and they did it.

Christopher: …There were opportunities to take a step back instead of just give them the data or give them the procedure of write out, I would ask them some questions that are meant to direct them towards, you know, whether it's a procedure or any sort of question. I know one lab in particular, the energy lab [Expt. 5], that I tried to have kind of that model where they [students] actually came in, designed experiment, and carried it out.

Wait time 1 (5/5 GTAs; 100%) Catherine: Early in this semester, I guess I was too quick to answer or ask another question to students; I didn’t allow enough time for the students to actually answer the questions because I felt like I was giving them too much time to answer the question. I felt like they should have known it right then because those were simple questions from reading the manual.

Cindy: At first, I had difficulty waiting for their responses.

Catherine: It [PD program] is helping me to become a better TA in the sense of allowing students to think a little bit more for themselves versus giving them the answer all the time or being too hasty to give the answer; allowing them to, you know, take a moment to think about what the question asks.

Cindy: …now I have started asking them a question and give them a couple seconds to think about it.

Questioning technique (2/5 GTAs; 40%) Bridget: It is not easy to ask the right questions. I may get some questions right for asking them [student] correctly but I think it's a part that you really need to have more experience on how students react and think of it again and again and again.

Cindy: At first, I had difficulty thinking of questions um to ask them [students], to lead them to right answer without actually giving them the answer um and letting them figure out themselves.

Bridget: I always get questions from them [students]; I don’t really ask a lot of questions, but with the [PD] training, I can use this and I think it's helpful for some students, especially for the shy ones; they don’t ask questions. So, constructing questions was a good thing and that led me to ask more questions. It is necessary, especially for the inquiry they need to think. They won’t think if you don’t ask questions; you have to ask so that they can think.

Cindy: I often would accept the answer “I don’t know” [prior to PD] and now I don’t accept “I don’t know” answers. …. I end up pushing them to think about it a little bit more and then try them to come up with their own answer.

Whole-class lesson closures (3/5 GTAs; 60%) Christopher: Wait time and closures are probably two of the things that I struggled with quite a bit this semester and probably in the previous semesters, and it goes back to having so many groups; twelve different lab partnerships working at the same time at different rates.

Bridget: Closure is a really really hard part. It's the hardest part for me, especially that I have different types of students… I can never get them to be at the same point; I have students that work fast, I have students that work slowly. It's not possible to make a closure as a class; when I have a small class, maybe it's possible but if I have a big class it's not possible.




Hurdles and improvement on guided inquiry-based instruction. Three GTAs (Bridget, Catherine, and Christopher) expressed reluctance to switch from the transmission model of teaching to inquiry-based teaching during the first three to four interdisciplinary laboratory experiments. Findings revealed that these GTAs had doubts about students' ability to design and execute experimental procedures without their involvement (3 GTAs) (e.g., see examples of excerpts in Table 3), and perceived the modular concepts as too complex for first-year students (1 GTA). For example, one biology GTA expressed that the concepts investigated in the interdisciplinary experiments were complex not only for first-year undergraduate students, but also for graduate students. The GTA said:

…Even for graduate students, not all can design the PCR stuff. They [students] don’t really know what PCR is, yeah, polymerase chain reaction; you amplify gene, “why do I need to do that”? They don't know why I need to do this. But I think for some students it's helpful. This is a freshman [first-year undergraduate]; it is the very first class. (Bridget)

Consequently, these GTAs were often observed giving students detailed information on how to conduct the experiments, including demonstrating parts of the experiments they perceived to be challenging for students. For example, during the PCR experiment (Appendix 2), Bridget explicitly pointed out to her students that the hardest part was designing the experiment, and then proceeded to help them with it. She said: “Our objective today is to isolate DNA from corn chips, then of course learn PCR, and the hardest part is to design the experiment, but we will work together, so don’t worry about that” (Bridget: Classroom observations). This was a case where the techniques of PCR could have been explained and demonstrated, and students then challenged to design an investigation to determine which chips contained GMOs. Instead, execution of the PCR technique took precedence over brainstorming the experimental procedure to investigate the presence of GMOs in common corn chips.

Our study further indicates that, with the extended PD meetings, Catherine, Bridget, and Christopher relinquished control and allowed their students to design their own experiments, although Christopher reverted to the transmission model immediately after students proposed the experimental procedure for the last experiment (Expt. 5). Although it took a while for these three GTAs to let go of their doubts about student learning through the inquiry approach, Bridget and Catherine completely shifted their role from “transmitters” to “facilitators” of learning, with Bridget adopting a more student-driven investigation approach in the last two experiments, and Catherine in Expt. 5. We illustrate the adoption of guided-inquiry instruction by the GTAs in Table 4. These two GTAs reported the inquiry instruction as a satisfying experience (see Catherine's response on guided-inquiry instruction in Table 3, column 3).

Table 4 The level(s) of inquiry for the interdisciplinary experiments and GTAs’ use of guided inquiry
Inter-disciplinary expt. Biology labs: inquiry level(s) Biology GTAs who implemented guided-inquiry instruction Chemistry labs: inquiry level(s) Chemistry GTAs who implemented guided-inquiry instruction
(a) Provided students with opportunities to design the experimental procedure, but later provided a step-by-step outline on how to carry out the experiment.
1 1 & 2 Bill 1 & 2 Cindy
2 1 & 2 Bill 1 & 2 Cindy
3 1 & 2 Bill 1 & 2 Cindy
4 2 Bill & Bridget 1 & 2 Cindy
5 1 & 2 Bill & Bridget 2 Cindy & Catherine; Christopher (a)



Hurdles and improvement on wait time 1. The average wait time for the five GTAs during the first three lessons was 3 seconds, lower than the time recommended during the PD program. All GTAs initially perceived that they were allowing students enough thinking time before providing them with the answers, which was not the case, as captured in the classroom observations. Their reported barriers to increasing wait time 1 related to prior or current experiences of students' reluctance to respond even to simple questions (5 GTAs) (e.g., see examples of excerpts in Table 3), and fear of compromising the time required to complete the experiments (1 GTA, Christopher). For example, in the reflection questionnaires, Christopher expressed that increasing wait time would interfere with the time required to address students' questions during the experiment, or to complete the experiment (Reflection Questionnaire, Expt. 3); see the excerpts below. He further stressed that his wait time could be increased if the aforementioned areas were not compromised (Reflection Questionnaire, Expt. 4). Surprisingly, classroom observations revealed that most of Christopher's laboratory sessions ended 25 to 42 minutes before the end of the allotted lesson period (i.e., 2 h 50 min). For Christopher, wait time was a barrier to completing the lab; he did not see the time as an opportunity to foster student thinking and gauge students' ideas formatively.

Reflection questionnaire: Do you think you can improve on your wait time? If so, what do you plan to do next time to improve?

Christopher: With so many groups working simultaneously, wait time directly leads to lengthening the duration of the lab, and many labs require the full amount of available lab time. (Expt. 3—Determining the Concentration of Acid in Food)

Christopher: Because of those times where I have many students competing for attention, I sometimes get in the habit of not giving much wait time even when only a single student has a question. I could improve my wait time during those times where the time is available. (Expt. 5—The Leavening Power of Baking Powder)

Classroom observations also confirmed the GTAs' hurdles with wait time during the first few modules, where they spontaneously responded to their own questions by rephrasing or answering them rather than allowing their students enough thinking time.

Classroom Observation: Expt. 1—Food Chemistry: A Qualitative Approach to Determine Biological Compounds

Catherine: “We know that walnut is a protein; what might be causing it not to test positive? Is it because it is processed or why is it like that?”

(Wait time 1 is 2 seconds)

Catherine: It's processed

Cindy: “Why do you think it's important to determine the pH of things?

(Wait time 1 is 3 seconds)

Cindy: Like if something is really really acidic what happens?”

During the extended PD, the GTAs, with the exception of one (Christopher), whose wait time averaged about 4 seconds, gradually increased their wait time 1 to at least 10 seconds. Findings across the three data sources (classroom observations, reflection questionnaires, and interviews) showed that some GTAs even waited longer, 30 seconds or more. Those who improved their wait time were able to see the value in providing opportunities for their students to think and respond. For example, see Table 3, column 3, for representative participants' responses on wait time. The key feature of this PD aspect was for the GTAs to overcome their discomfort with silence while students were thinking.


Hurdles and improvement on questioning techniques. Two GTAs (Bridget and Cindy) expressed initial difficulties in crafting and/or asking effective questions that would engage students in thinking about the concepts. Difficulties in questioning technique related to limited experience with this PD aspect (Bridget) and an inability to refrain from giving direct answers to students' questions (Cindy) (e.g., Table 3, column 2). Bridget mentioned that questioning in her classes was typically initiated by her students, while Cindy would not probe students to think further if they did not know the answer to her posed questions. With the PD meetings, however, these GTAs began engaging students with effective questioning strategies (see Table 3, column 3), as previously illustrated elsewhere in this paper.
Hurdles and improvement on lesson closures. Like some other PD aspects, whole-class lesson closure did not translate into some GTAs' instruction, particularly that of the biology GTAs, whose common approach was the group-to-group lesson closure. The chemistry GTAs attempted to hold whole-class lesson closures, although one GTA (Catherine) forgot to hold whole-class lesson closures during the 1st and 3rd chemistry interdisciplinary experiments; one GTA (Christopher) implemented this format of closure once and then switched to group closures; and one adopted the reverse approach to Christopher's. For example, Catherine said: “I did forget to do a couple of closures, but of course that is something I should learn as time go by.”

Reported difficulties in implementing this practice related to large class size (3 GTAs: Christopher, Bridget, and Bill), a lack of experience with the practice (1 GTA: Catherine), and having “different types of students or groups” (2 GTAs: Bridget and Christopher), which made it difficult to keep groups that had finished their work. It is probable that the tradition of student groups finishing a lab, turning in a report (if required), and leaving was difficult for the GTAs to overcome. Even with continued PD discussions on ways to engage students who finish lab activities earlier than the rest, we noted no shift to whole-class lesson closures among the GTAs who were using group closures. During the interviews, GTAs who implemented group lesson closures maintained that they could shift to whole-class lesson closures if class sizes were smaller (e.g., see Bridget's response, Table 3). On the other hand, Catherine reported a desire to improve on whole-class lesson closure in her future teaching. She said:

Catherine: closure is something I can definitely work on; how to bring students back, how to engage them now once they have done the experiment. Pull out the most important from that experiment and touch on that at the end of the class. Hopefully then that would wrap off and get them to thinking how to answer certain post-lab questions.

Overall, findings from the interviews revealed not only the PD aspects GTAs struggled with, but also how the extended PD program helped them improve their implementation of some of the aspects they perceived to be challenging.

Theme 4: the Engage PD program was reported to be beneficial in improving GTAs’ teaching practices. Despite the reported initial difficulties in implementing the PD aspects, the training program was perceived as beneficial in enhancing specific teaching practices among GTAs. For example, prior to the PD, the GTAs were not familiar with CATs; they often relied on summative approaches, such as graded laboratory reports, for assessment of student learning. During the PD program, however, most of them incorporated at least two types of CATs into their laboratory teaching. They reported these assessments as vital for obtaining immediate feedback on students’ areas of difficulty and for assessing conceptual understanding or knowledge transfer:

Christopher: The classroom assessment techniques [CATs] were the most helpful thing in this semester. I hadn’t implemented those before and trying to integrate those in the labs this semester um there was definitely a benefit in TA training…. It was a good way to collect data from each student individually and to specific questions that I thought would be most helpful for me at the end of the lab.

Cindy: It [CATs] provided me with feedback right away and maybe I could catch them [students] before they left the room and try to help them understand the bigger concept or the little concepts that were going on.

Additionally, a few PD participants indicated they were likely to implement CATs in their future teaching. For example, Catherine mentioned her future use of CATs not only in laboratory teaching, but also in lecture-based teaching environments. Similarly, Christopher appreciated becoming aware of the literature on CATs and expressed interest in using them to diagnose learning-related difficulties in the future:

Catherine: I think now I can take the CATs not just for teaching labs, but I can also include these once I’m done with my degree, say, teaching in a lecture hall.

Christopher: Just knowing what is available out there for these CATs is helpful because if I identify learning-related problems in future I definitely know there are techniques that I can use to identify those and probably correct those.

Furthermore, the Engage teaching reflections were perceived to be beneficial in helping GTAs plan, modify, or refine their instructional approaches (5 GTAs). The PD sessions required them to be reflective teachers, something that was new to them despite their having been teaching assistants prior to the study. Their reflection experiences helped them scrutinize and evaluate the consistency of their teaching, what worked or did not, and what changes to make for future lessons, and to become cognizant of their own classroom behaviors and those of their students:

Catherine: The reflections helped me reflect on what the students did; reflect on what questions they did ask; reflect on what I actually said or presented to them, or how I did the pre-lab and all of that. The questions [questionnaire] really helped me to see more or less am I changing anything? Is this new thing helping?…

Completing the reflection questionnaire also reinforced the adoption of specific PD aspects, such as wait time (2 GTAs) and lesson closures and CATs (1 GTA), in GTAs’ classrooms. The reflections helped them think about their teaching plan for the next lesson and integrate these practices into their teaching in a timely manner.

Christopher: Trying to improve wait time and closure, and implement the CATs I don’t think that that would have happened…I don’t know if I would be thinking about those things more if it were not for those forms [reflection questionnaires] and having to come to the TA [PD] training… There were some questions that said “what would you do differently next time” and taking your time to kind think about that and plan it out at least puts something in your mind to work with for the next lab [lesson].

Furthermore, in comparison to the departmental TA-training and university-wide TA Orientation programs at the university in question, the Engage PD offered practical strategies beneficial in transforming GTAs’ teaching practices. Holding the sessions while the GTAs were teaching was also beneficial, as they could immediately put into practice the techniques they discussed and reflected upon. When asked whether they would recommend TA training similar to the Engage program to their departments, the GTAs were positive about their Engage experience and critical of their past experiences with departmental TA trainings. They viewed traditional TA training as ineffective in preparing them for their teaching assignments, and recommended the adoption of Engage-type training programs focused on effective teaching strategies to support all TAs.

Interviewer: Would you recommend similar TA training to be adopted by your department?

Bill: We don’t get any training. Our training is we show up and they [faculty] hand us the things [lab manual and grading keys] and then they say “don’t sleep with your students” and then that's it. The other thing is we have a weekly meeting but that's not effective. For the classes that I have taught really, it's not how to teach [that we get] um and if it's important for TAs to teach, then we should get the training. If not that's fine too we can just show up and walk them through the lab and not teach them anything.

Catherine: Yes, especially when it comes to the CATs, closures, and inquiry in itself, because we want the students to get the most out of the laboratory experiment not just hands-on-wise but information-wise, concept-wise, you know, because really what they do in the lab can help them understand what's being taught in lectures. It can definitely help other TAs with the normal labs and so forth.

One indicator of the success of the Engage PD program is that three of the five GTAs were recognized for their teaching excellence at the departmental level, with one of these GTAs honored with the prestigious “All-University Teaching Excellence Award” in the year the PD program was implemented. The former award is given to the two best GTAs, while the latter is given to the six best GTAs across the entire university. Both of these competitive awards are issued annually based upon one's teaching portfolio and undergraduate student evaluations. Honoring these GTAs for their excellence in teaching is a sign that the PD program had a positive impact on their classroom teaching practices. Other studies also report that GTAs who participate in teaching development programs are likely to win university-wide teaching awards (e.g., Dragisich et al., 2016a, 2016b). Overall, the PD program was beneficial in exposing GTAs to a broad range of effective teaching practices beyond the traditional practices in these courses: transmitting information and supervising students in the verification of scientific facts.

Discussion

The expectation of any training program is that the trainees will apply the acquired skills and knowledge in executing the tasks for which they received the training. Our expectation was that the aspects of the Engage PD would be immediately translated into the GTAs’ classrooms. Our findings, however, revealed that teaching is a complex task in which immediate or long-term translation of the aspects of teaching-development programs into the classroom is not always guaranteed. We found that four of the seven PD aspects (communicating lesson goals, integrating real-world/chemistry-biology connections, questioning technique, and CATs) were adopted immediately and sustained throughout the interdisciplinary modules during the PD program. Barriers to the implementation of the remaining three PD aspects (i.e., inquiry instruction, wait time, and whole-class lesson closures) related to GTAs’ perceptions and beliefs about student learning, prior knowledge and experiences, and contextual factors (e.g., large class size, different types of students, or inadequate classroom management skills).

GTAs’ perceptions and beliefs about student learning contributed to a delayed shift from the transmission model to inquiry instruction for the three of the five GTAs who were initially hesitant to allow students to design experiments during the first three modules (Fig. 1). Specifically, barriers to inquiry instruction related to beliefs about (1) the complexity of the investigated science concepts for students, (2) the experimental design being taxing and challenging for first-year undergraduate students, (3) the cognitive abilities of students, or (4) student unpreparedness for the laboratory. These belief systems were shaped by prior experiences in teaching and experiences as students. For example, having taught traditional experiments in the same course in previous semesters, Catherine was reluctant to let students design procedures for the interdisciplinary modules because of their previous struggles with simple concepts in the traditional experiments, which she perceived to be less demanding than the interdisciplinary modules.

Similarly, Bridget's experiences interacting with fellow graduate students who struggled with similar modular concepts, such as PCR, made her uncomfortable letting students proceed with experimental designs without her involvement. Christopher's prior experiences with students not adequately preparing for the laboratory before the sessions completely barred his buy-in to inquiry instruction. Overall, GTAs’ beliefs accrued from prior experiences led them to underestimate students’ capability to learn through the guided-inquiry approach. Our findings align well with similar studies reported in the literature. For example, Wheeler et al. (2017) found that teaching experiences shifted some chemistry TAs’ beliefs toward being “disseminators” rather than “facilitators” in an inquiry-based laboratory setting. Additionally, Kurdziel et al. (2003) reported that chemistry GTAs questioned the appropriateness of the inquiry-based approach at the freshman (first-year) level, and perceived that students learn best by regurgitating information and by being provided clearly outlined directions for the laboratory procedure. Other studies have also reported perceived difficulties in implementing inquiry instruction among science TAs (French and Russell, 2002; Luft et al., 2004; Volkmann and Zgagacz, 2004; Gormally et al., 2011) and faculty (Brown et al., 2006).

Discussions during the PD meetings emphasized GTAs switching their role from “transmitting information” to guiding or facilitating learning, and not viewing their students as “blank slates” but affording them opportunities to own the learning. This continuous feedback and mentoring was important in changing GTAs’ views about inquiry learning, even though the fruits of this effort were not noted until the later interdisciplinary experiments for the three GTAs. Wheeler et al. (2017) also reported that chemistry TAs’ beliefs about their role in instruction shifted to more facilitator-type beliefs following a PD program situated around inquiry instruction. Together, these findings point to the role of belief systems in teaching and the need to support and mentor GTAs in overcoming beliefs that hinder effective instruction.

In addition to beliefs related to the learning environment (e.g., large class size, different types of groups or students), limited experience with lesson closure in prior teaching might explain why some GTAs forgot to implement this practice and others chose to implement group-to-group or combined-group lesson closures. This relates back to the aforementioned tradition of “letting students go as soon as they are done collecting data.” Many GTAs implemented group-to-group or combined-group lesson closures even though the PD emphasized whole-class lesson closures over group closures. Modeling effective lesson closures in laboratory settings might help TAs learn this essential element of effective teaching.

Limited prior practice with wait times extending beyond 2 seconds can also explain why GTAs did not increase their wait time 1 immediately. Subsequent PD sessions included specific discussion of wait time and the importance of waiting for students to think and respond, especially when asking higher-order cognitive questions. We discussed how both the students and the teacher might feel uncomfortable in the silence of wait time, but that eventually a student will respond as long as the teacher remains patient. In time, the length of silence shortens and students become more at ease sharing their ideas. Over time, we saw such extensions of wait time in most of our participants.

Furthermore, the use of mixed higher- and lower-order thinking questions by a majority of our sample indicates that GTAs can readily adopt and translate these questioning techniques into their classrooms. However, some will require more time than others to practice and become comfortable with asking higher-order questions. Other studies also suggest that TAs trained in developing effective assessments are better able to construct assessments that measure higher-order thinking skills than their untrained counterparts (e.g., Wyse et al., 2014).

Additionally, the perceived challenges in asking higher-order questions reported by two of the five GTAs are not surprising; similar hurdles have been reported elsewhere among other GTAs (e.g., Gormally et al., 2011). Other studies have also reported faculty frustration in finding good questions for engaging students in active-learning classes (Turpen et al., 2016), and an inability to ask high-level questions (Larson and Lovelace, 2013). Together, these studies reveal that “Using guiding questions is a learned skill that can take a good deal of practice to master” (Gormally et al., 2011, p. 49).

Differences among GTAs in integrating real-world/chemistry-biology connections beyond those provided in the modules can be explained by differences in their disciplines of specialization, especially prior experiences and backgrounds from the science gateway courses they enrolled in as undergraduate students. Generally, introductory-level chemistry courses tend to attract students from other majors, who may take them as prerequisites or electives toward fulfilling their undergraduate programs’ requirements. This may explain why the biology GTAs felt comfortable integrating chemistry connections, whereas the chemistry GTAs only “shallowly” integrated “other” examples of biology connections into their teaching.

An important Engage PD experience for GTAs was also documenting what they did or did not do, and their plans for improvement in subsequent lessons. This practice helped GTAs reflect on and refine their instructional approaches during the PD program. Benefits of using reflection tools in teaching have also been reported in other studies (e.g., Miller et al., 2014; Farrell and Ives, 2015). For instance, Farrell and Ives's (2015) study with one GTA showed that by documenting and reflecting on his beliefs, he became aware of them and their impact on his classroom practices. Likewise, Miller et al.'s (2014) study indicated that using a classroom observation protocol to observe peers and participating in reflective discourse provided novice TAs with (1) new approaches to teaching; (2) guidance on implementing pedagogical aspects; and (3) ways of improving their skills in communicating with students and managing the class.

Overall, these results and others reveal the role of prior experiences, knowledge, and belief systems about teaching, students, and the learning environment in instructional change. Importantly, extended PD programs coupled with mentoring and support are beneficial in helping GTAs overcome the noted barriers that may impede implementation of PD aspects in ways that support meaningful learning.

Conclusion and implications for practice

This study investigated whether and how the aspects of the Engage PD translated into GTAs’ teaching, and GTAs’ experiences in implementing these aspects. We address our research questions first, then discuss implications of the findings for future GTA training programs.

Research question 1: are aspects of the Engage PD program translated into science GTAs’ teaching assignments? If so, how are they implemented?

Results revealed that the majority of the PD aspects assessed (four of seven, or 57%) were immediately adopted and sustained throughout the interdisciplinary modules by the five PD GTAs, whereas the remaining 43% (guided inquiry instruction, wait time 1, and whole-class lesson closure) translated only gradually into some GTAs’ classrooms. The PD aspects implemented and sustained by the GTAs included: communicating goals for each lesson to students; integrating examples of real-world and chemistry-biology connections; employing questioning techniques; and using CATs. Furthermore, we noted differences in the implementation styles of the PD aspects among GTAs, particularly for real-world/chemistry-biology connections, questioning techniques, CATs, and lesson closures. While one GTA (chemistry) taught only the examples of real-world/chemistry-biology connections provided in the modules, the rest incorporated new examples into their lessons, even though three of these GTAs did not delve deeply into explaining the connections outside their disciplines.

Differences in questioning techniques related to the use of lower-order versus higher-order thinking questions, or a combination of the two. A slim majority (3 out of 5) employed the latter style. For CATs, most GTAs (4 out of 5) employed the ungraded format, a few (2 out of 5) employed graded quizzes, and one implemented a combination. Overall, most GTAs implemented at least two types of CATs in their classrooms. CATs perceived to be simple and to require low student and instructor effort were preferred over the others provided during the PD program. For lesson closures, only 2 GTAs often employed whole-class lesson closure, while the rest adopted group closures. In sum, most PD aspects were translated into GTAs’ classrooms as the GTAs engaged in the PD program.

Research question 2: what are the GTAs’ experiences in implementing the aspects of the Engage PD?

Results from the interviews and reflection questionnaires indicated that GTAs perceived the Engage PD positively as improving their teaching practices, particularly in exposing them to instructional practices and approaches (i.e., all the PD aspects) that they had never encountered in their prior teaching. However, GTAs reported initial difficulties in implementing the three aspects that were only gradually adopted in their assignments during the PD program: wait time 1 (5 GTAs), inquiry instruction (3 GTAs), and whole-class lesson closure (3 GTAs), as well as questioning technique (2 GTAs).

Barriers to the implementation of these PD aspects were attributed to individual perceptions and beliefs about teaching and learning, prior teaching experiences and practices, and contextual factors (e.g., student unpreparedness for the lesson, large class size, or inadequate classroom management skills). In particular, GTAs’ beliefs about teaching and learning impeded their use of guided inquiry instruction; however, through extended discussions and support during the PD meetings, most GTAs overcame many of these barriers and belief systems and adopted the PD aspects in their teaching assignments. The Engage teaching reflections were reported to be especially helpful for thinking about “what worked or did not work” during a lesson, and for modifying and refining the implementation of the PD aspects.

Implications for practice. Overall, the current findings imply that a one- or two-day TA training program, lacking a teaching context and ongoing feedback, does not work. Adoption of best teaching practices by GTAs is a gradual process that requires practice, ongoing feedback, and support in implementing these practices. A number of educators have pointed out that mentorship and continuous support for novice instructors who are trying out innovative teaching practices are pivotal for nurturing sustainable instructional change (Henderson et al., 2011; Turpen et al., 2016). Unfortunately, such programs have yet to be realized in many institutions of higher education. The current PD model is one example of how departmental units and faculty developers can support and nurture future faculty, including GTAs, in implementing effective teaching practices. Support through observation feedback and reflection prompts can help GTAs identify areas of difficulty and ways to improve.

Findings from this study and others shed light on the role of instructors’ beliefs in impeding instructional change. Beyond TAs’ beliefs about teaching and learning, there is consensus in the education community that teachers’ beliefs shape their instructional practices (e.g., Pajares, 1992; Mutambuki and Fynewever, 2012; Petcovic et al., 2013). We clearly saw this interaction when participants were reluctant to relinquish control of the investigative design to students because they believed the students were not ready, or did not have the ability, to complete the tasks successfully without more direct instruction. The extended PD meetings, coupled with reflections and discussions on the areas of struggle, helped to address beliefs that served as barriers to inquiry instruction and other effective teaching practices. Future GTA programs should be intentionally designed to uncover and confront participants’ beliefs about teaching and student learning in real time.

In addition, it is important that the PD program takes place as GTAs teach. The teaching context provides an immediate venue for practicing strategies and for reflecting on one's teaching. By engaging with students and starting small in implementing the PD aspects, GTAs can become better at adopting these practices in their teaching. It can also help them recognize dissatisfaction with their current teaching practices and be open to change (Anderson et al., 2011; Brewer and Smith, 2011; Southerland et al., 2011), and overcome any unforeseen discontinuity in embracing innovative teaching approaches when they later teach as faculty members (Henderson et al., 2012). We contend that changes must be injected into the overall structure of graduate programs, supporting and demanding extended professional development, such as the Engage PD, for GTAs. Perhaps the coming crop of future faculty (current GTAs), with proper mentorship and support, could begin such an effort.

Conflicts of interest

There are no conflicts of interest to declare.

Acknowledgements

We are indebted to all the participants for providing us with data for this study, and to the anonymous reviewers for their constructive feedback in improving this manuscript. This research was supported by the National Science Foundation under the CCLI program, Grant/Award Number 09411713.

References

  1. Adler J., (2000), Social practice theory and mathematics teacher education: a conversation between theory and practice, Nord. Math. Educ. J., 8, 31–55.
  2. American Association for the Advancement of Science (AAAS), (2015), Vision and Change in Undergraduate Biology Education: Chronicling Change, Inspiring the Future, Retrieved from http://visionandchange.org/files/2015/07/VISchange2015_webFin.pdf.
  3. American Chemical Society Committee on Professional Training, (2015), ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs, Washington, DC.
  4. Anderson W. A., et al., (2011), Changing the Culture of Science Education at Research Universities, Science, 331, 152–153.
  5. Atwood V. A. and Wilen W. W., (1991), Wait Time and Effective Social Studies Instruction: What Can Research in Science Education Tell Us? Soc. Educ., 55, 179–181.
  6. Austin A., (2002), Preparing the Next Generation of Faculty: Graduate School as Socialization to the Academic Career, J. Higher Educ., 73, 94–122.
  7. Austin A. E., Campa H., Pfund C., Gillian-Daniel D. L., Mathieu R. and Stoddart J., (2009), Preparing STEM doctoral students for future faculty careers, New Dir. Teach. Learn., 2009, 83–95.
  8. Baldwin R. G., (2009), The climate for undergraduate teaching and learning in STEM fields, New Dir. Teach. Learn., 2009, 9–17.
  9. Bettinger E. P., Long B. T. and Taylor E. S., (2016), When inputs are outputs: the case of graduate student instructors, Econ. Educ. Rev., 1–14.
  10. Bok D., (2006), Our Underachieving Colleges: A Candid Look At How Much Students Learn and Why They Should Be Learning More, Princeton, New Jersey: Princeton University Press.
  11. Borko H., (2004), Professional development and teacher learning: mapping the terrain, Educ. Res., 33, 3–15.
  12. Bransford J. D., Brown A. L. and Cocking R., (2000), How People Learn, Washington, DC: National Academy Press.
  13. Brewer C. A. and Smith D., (2011), American Association for the Advancement of Science, Vision and Change in Undergraduate Biology Education: A Call for Action, Washington, DC.
  14. Brooks J. G. and Brooks M. G., (1999), Becoming a constructivist teacher, in In Search of Understanding: the Case for Constructivist Classrooms, Alexandria, VA, USA: Association for Supervision and Curriculum Development (ASCD), pp. 101–118.
  15. Brown P. L., Abell S. K., Demir A. and Schmidt F. J., (2006), College science teachers' views of classroom inquiry, Sci. Educ., 90, 784–802.
  16. Buck L. B., Bretz S. L. and Towns M. H., (2008), Characterizing the level of inquiry in the undergraduate laboratory, J. Coll. Sci. Teach., 38, 52–58.
  17. Burke K. A., Hand B., Poock J. and Greenbowe T., (2005), Using the Science Writing Heuristic, J. Coll. Sci. Teach., 35, 36–41.
  18. Chen X., (2015), STEM attrition among high-performing college students: scope and potential causes, J. Technol. Sci. Educ., 5, 41–59.
  19. Connolly M. R., Savoy J. N., Lee Y.-G. and Hill L. B., (2016), Building a Better Future STEM Faculty: How Teaching Development Programs Can Improve Undergraduate Education, http://lsfss.wceruw.org/documents/Building_a_Better_Future_STEM_Faculty.pdf.
  20. Cornu R.-L. and Peters J., (2005), Towards Constructivist Classrooms: the role of the reflective teacher, Educ. Inquiry, 6, 50–64.
  21. Creswell J. W., (2003), Research design: qualitative, quantitative, and mixed method approaches, Thousand Oaks: Sage Publications.
  22. Creswell J. W., (2007), Qualitative inquiry & research design: choosing among five approaches, Thousand Oaks: Sage Publications.
  23. Cross K. P. and Angelo T. A., (1993), Classroom Assessment Techniques. A Handbook for Faculty, San Francisco: Jossey-Bass.
  24. Dragisich V., Keller V., Black R., Heaps C. W., Kamm J. M., Olechnowicz F., Raybin J., Rombola M. and Zhao M., (2016a), Development of an Advanced Training Course for Teachers and Researchers in Chemistry, J. Chem. Educ., 93, 1211–1216.
  25. Dragisich V., Keller V. and Zhao M., (2016b), An intensive training program for effective teaching assistants in chemistry, J. Chem. Educ., 93, 1204–1210.
  26. Farrell T. S. C. and Ives J., (2015), Exploring teacher beliefs and classroom practices through reflective practice: A case study, Language Teaching Research, 19, 594–610.
  27. French D. and Russell C., (2002), Do graduate teaching assistants benefit from teaching inquiry-based laboratories? Bioscience, 52, 1036–1041.
  28. Golde C. M. and Dore T. M., (2001), At Cross Purposes: What the experiences of doctoral students reveal about doctoral education (www.phd-survey.org), Philadelphia, PA: A report prepared for The Pew Charitable Trusts, Accessed February 4, 2016.
  29. Gormally C., Brickman P., Hallar B. and Armstrong N., (2011), Lessons Learned About Implementing an Inquiry-Based Curriculum in a College Biology Laboratory Classroom, J. Coll. Sci. Teach., 40, 45–51.
  30. Hardré P. L., (2005), Instructional design as a professional development tool-of-choice for graduate teaching assistants, Innovative Higher Educ., 30, 163–175.
  31. Hausfather S., (2001), Where's the Content? The Role of Content in Constructivist Teacher Education, Educ. Horiz., 80, 15–19.
  32. Henderson C., Beach A. and Finklestein N., (2011), Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature, J. Res. Sci. Teach., 48, 952–984.
  33. Henderson C., Dancy M. and Niewiadomska-Bugaj M., (2012), Use of research-based instructional strategies in introductory physics: where do faculty leave the innovation-decision process? Phys. Rev. Spec. Top. – Phys. Educ. Res., 8, 020104.
  34. Holbrook J., (2005), Making chemistry teaching relevant, Chem. Educ. Int., 6, 1–12.
  35. Hycner R. H., (1985), Some guidelines for the phenomenological analysis of interview data, Hum. Stud., 8, 279–303.
  36. Kendall K. D. and Schussler E. E., (2012), Does instructor type matter? Undergraduate student perception of graduate teaching assistants and professors, CBE Life Sci. Educ., 11, 187–199.
  37. Komagata N., (2002), Chance agreement and significance of the kappa statistic, http://www.tcnj.edu/komagata/pub/Kappa.Pdf (Stand: Mai 2004).
  38. Kurdziel J. P., Turner J. A., Luft J. A. and Roehrig G. H., (2003), Graduate teaching assistants and inquiry-based instruction: implications for graduate teaching assistant training, J. Chem. Educ., 80, 1206–1210.
  39. Lang M. and Olson J., (2000), Integrated science teaching as a challenge for teachers to develop new conceptual structures. Res. Sci. Educ., 30, 213–224.
  40. Larson L. R. and Lovelace M. D., (2013), Evaluating the efficacy of questioning strategies in lecture-based classroom environments: are we asking the right questions, J. Excellence Coll. Teach., 24, 105–122.
  41. Lederman J. S., (1999), National Geographic Science: Best Practices and Research Base Teaching Scientific Inquiry: Exploration, Directed, Guided, and Open-Ended Levels, pp. 18–20.
  42. Luft J. A., Kurdziel J. P., Roehrig G. H. and Turner J., (2004), Growing a garden without water: graduate teaching assistants in introductory science laboratories at a doctoral/research university, J. Res. Sci. Teach., 41, 211–233.
  43. Marbach-Ad G., Schaefer K. L., Kumi B. C., Friedman L. A., Thompson K. V. and Doyle M. P., (2012), Development and evaluation of a prep course for chemistry graduate teaching assistants at a research university, J. Chem. Educ., 89, 865–872.
  44. Miller K., Brickman P. and Oliver J. S., (2014), Enhancing teaching assistants' (TAs') inquiry teaching by means of teaching observations and reflective discourse. Sch. Sci. Math., 114, 178–190.
  45. Mutambuki J. and Fynewever H., (2012), Comparing Chemistry Faculty Beliefs about Grading with Grading Practices, J. Chem. Educ., 89, 326–334.
  46. National Academies, (2010), BIO 2010: Transforming Undergraduate Education for Future Research Biologists, http://dels.nas.edu/resources/static-assets/materials-based-on-reports/reports-in-brief/bio2010_final.pdf.
  47. National Research Council, (1996), National Science Education Standards, Washington, D.C: The National Academies Press.
  48. National Research Council, (2012), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, Washington, D.C: The National Academies Press.
  49. Nilson L. B., (2010), Teaching at its best: a research-based resource for college instructors, San Francisco, CA: Jossey-Bass.
  50. Pajares M. F., (1992), Teachers’ beliefs and educational research: cleaning up a messy construct, Rev. Educ. Res., 62, 307–332.
  51. Petcovic H. L., Fynewever H., Henderson C., Mutambuki J. M. and Barney J. A., (2013), Faculty grading of quantitative problems: a mismatch between values and practice, Res. Sci. Educ., 43, 437–455.
  52. President's Council of Advisors on Science and Technology, (2012), Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics, Washington D.C: Executive Office of the President.
  53. Richardson V., (2003), Constructivist Pedagogy, Teach. Coll. Rec., 105, 1623–1640.
  54. Rodriques R. A. B. and Bond-Robinson J., (2006), Comparing faculty and student perspectives of graduate teaching assistants' teaching, J. Chem. Educ., 83, 305–312.
  55. Rowe M. B., (1986), Wait time: slowing down may be a way of speeding up! J. Teach. Educ., 37, 43–50.
  56. Rushin J. W., De Saix J., Lumsden A., Streubel D. P., Summers G. and Bernson C., (1997), Graduate teaching assistant training: a basis for improvement of college biology teaching & faculty development? Am. Biol. Teach., 59, 86–90.
  57. Schussler E. E., Read Q., Marbach-Ad G., Miller K. and Ferzli M., (2015), Preparing biology graduate teaching assistants for their roles as instructors: an assessment of institutional approaches, CBE Life Sci. Educ., 14, 1–11.
  58. Schwartz A. T., (2006), Contextualized Chemistry Education: the American Experience. Int. J. Sci. Educ., 28, 977–998.
  59. Seymour E. and Hewitt N. M., (1997), Talking about leaving: why undergraduates leave the sciences, Boulder, CO: Westview Press.
  60. Shulman L. S., (1987), Knowledge and teaching: foundations of the new reform, Harvard Educ. Rev., 57, 1–23.
  61. Southerland S. A., Sowell S., Blanchard M. and Granger D. E., (2011), Exploring the construct of pedagogical discontentment: a tool to understand science teachers’ openness to reform, Res. Sci. Educ., 41, 299–319.
  62. Stahl R. J., (1994), Using “Think-Time” and “Wait-Time” Skillfully in the Classroom, ERIC Digest, http://files.eric.ed.gov/fulltext/ED370885.pdf.
  63. Tashakkori A. and Teddlie C., (1998), Mixed Methodology: Combining Qualitative and Quantitative Approaches, vol. 46, Thousand Oaks, CA: Sage Publications Ltd.
  64. Tofade T., Elsner J. and Haines S. T., (2013), Best practice strategies for effective use of questions as a teaching tool, Am. J. Pharm. Educ., 77, 155.
  65. Turpen C., Dancy M. and Henderson C., (2016), Perceived affordances and constraints regarding instructor's use of Peer Instruction: implications for promoting instructional change, Phys. Rev. Phys. Educ. Res., 12, 010116.
  66. Volkmann M. J. and Zgagacz M., (2004), Learning to teach physics through inquiry: the lived experience of a graduate teaching assistant, J. Res. Sci. Teach., 41, 584–602.
  67. Webster C. A., Connolly G. and Schempp P. G., (2009), The finishing touch: anatomy of expert lesson closures, Phys. Educ. Sport Pedagogy, 14, 73–87.
  68. Wheeler L. B., Clark C. P. and Grisham C. M., (2017), Transforming a Traditional Laboratory to an Inquiry-Based Course: Importance of Training TAs when Redesigning a Curriculum, J. Chem. Educ., 94, 1019–1026.
  69. Wheeler L. B., Maeng J. L. and Whitworth B. A., (2015), Teaching assistants' perceptions of a training to support an inquiry-based general chemistry laboratory course, Chem. Educ. Res. Pract., 16, 824–842.
  70. Wyse S. A., Long T. M. and Ebert-May D., (2014), Teaching assistant professional development in biology: designed for and driven by multidimensional data, CBE Life Sci. Educ., 13, 212–223.
  71. Zehnder C., (2016), Assessment of Graduate Teaching Assistants Enrolled in a Teaching Techniques Course, J. Coll. Sci. Teach., 46, 76–83.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/c7rp00133a
