The effect of instructional method on teaching assistants' classroom discourse

Kelley Current and Megan Grunert Kowalske *
Western Michigan University, Department of Chemistry & The Mallinson Institute for Science Education, 1903 W. Michigan Ave, Kalamazoo, MI 49008-5413, USA. E-mail: Kelley.M.Current@wmich.edu; Megan.Kowalske@wmich.edu

Received 24th February 2016, Accepted 6th April 2016

First published on 8th April 2016


Abstract

There has been increased interest in the transformation of post-secondary level instructional practices in STEM from more traditional to evidence-based practices that are more aligned with how learning occurs. Research has shown that instructional practices are linked to student learning outcomes even when content is unchanged; therefore, incorporating evidence-based practices into the classroom represents an area worthy of focus and resources. Problem-based learning (PBL) is an inquiry-oriented instructional strategy wherein students answer ill-structured real-world problems through collaboration, research, experimentation, and trial and error. The success or failure of an inquiry-oriented laboratory session depends in large part on the practices of the instructor. The work presented here focuses on the instructional practices of chemistry graduate teaching assistants (GTAs), who, over the course of a semester, taught both expository/demonstration and inquiry laboratory sessions. Upon comparing GTA discourse across inquiry and expository/demonstration settings, three assertions were generated: (a) there was an apparent relationship between the instructional mode (expository/demonstration versus inquiry) and the structure of the classroom discourse employed by GTAs, (b) patterns in classroom discourse repeated within given instructional modes, even when the nature of the content being covered varied widely, and (c) patterns in classroom discourse observed in inquiry labs exemplified a constructivist learning environment and were achieved with minimal intervention.


Introduction

There has been an increased interest in transforming instructional practices at the post-secondary level to better align with modern conceptions of how learning occurs (Roth and Roychoudhury, 1993; Henderson and Dancy, 2007; Dancy and Henderson, 2010). The incorporation of evidence-based practices into regular classroom instruction has its foundations in cognitive science and educational psychology, among other fields. Instructional practices that actively engage students in learning and constructing knowledge, like inquiry, problem-based learning, and Socratic dialog, have proven to be effective means of increasing student learning and engagement (Anderson, 2002; Prince, 2004; NRC, 2011). Student learning outcomes have been shown to improve with the alteration of instructional practices, even when curricula remain unchanged (Freedman, 1997; Nye et al., 2004; NRC, 2005; Slavin et al., 2009; Mikami et al., 2011). Given this relationship between instructional practices and student learning outcomes, aligning current instructional practice with evidence-based best practices is an important area of focus.

Problem-based learning (PBL) is an inquiry-oriented instructional strategy in which students answer ill-structured real-world problems through collaboration, literature research, experimentation, and trial and error (Hmelo-Silver, 2004). Instructors function as facilitators guiding the students as needed through the PBL setting (Hmelo-Silver, 2004). Inquiry-oriented modes of instruction, like PBL, embody the elements of a constructivist learning environment (Bodner, 1986; Palincsar and Brown, 1988; Brown and Campione, 1996). Constructivist instruction occurs when the instructor relinquishes control to the students by: (1) bringing out what students already suspect, know, believe, or are capable of, (2) encouraging students to reflect on these actions, and (3) supporting students by either confirming their path or by refocusing the students on what they must do or consider to make progress (Gage and Berliner, 1998).

The PBL setting has been shown to produce a variety of favorable learning outcomes. Dochy et al. (2003) conducted a meta-analysis outlining PBL effects on learners. Learners who experienced PBL demonstrated no significant increase in declarative knowledge relative to their counterparts who experienced expository instruction, but were better able to apply knowledge when assessed. Patel et al. (1993) compared expository and PBL learners and found that PBL students faced with a novel problem were better able to utilize a hypothesis-driven reasoning strategy than were the expository learners. In a carefully structured crossover experiment, learners exposed to both PBL and expository instruction over the course of a semester were able to generate “more integrative and explanatory essays” with the concepts that were taught using the PBL approach as opposed to the expository approach (Capon and Kuhn, 2004; Hmelo-Silver and Barrows, 2006, p. 103). Sandi-Urena et al. (2012) reported that learners from a PBL setting demonstrated improved problem-solving skills and increased self-regulation of metacognitive strategies even though no explicit instruction pertaining to these skills was provided. These findings show the capacity of the inquiry and PBL settings to produce desired student outcomes through their constructivist nature (Mervis, 2013).

PBL was designed to encourage students to generate an extensive and flexible knowledge base in a setting that promotes the development of an intrinsic motivation to learn. At the same time, it allows students to develop problem-solving and self-directed learning skills within a collaborative environment (Barrows and Kelson, 1995; Wang et al., 2016). To achieve the goals of the PBL setting and to shift student outcomes, it is crucial that the instructor function as an effective facilitator. The act of facilitation depends heavily on an instructor's chosen classroom discourse. Hmelo-Silver (2004) has described facilitators as expert learners capable of modeling thinking and learning strategies who also help students progress through the PBL experience. An instructor's ability to enact classroom discourse consistent with the tenets of facilitation can make or break the PBL setting and its ability to achieve the previously outlined goals (Roehrig and Kruse, 2005). The success or failure of an inquiry-oriented laboratory course taught by a Graduate Teaching Assistant (GTA), such as in this study, depends on the instructional practices employed by the GTA and whether they are aligned with the constructivist learning model (Roehrig and Kruse, 2005).

GTAs often act as instructors in undergraduate chemistry laboratories, and GTA fulfillment of this role is essential to the function of many institutions of higher education (Smith, 1993; Sandi-Urena et al., 2011). As a group, GTAs have limited instructional experience, spend more time with undergraduates than do professors or other instructional staff, and have increasingly been asked to teach inquiry-oriented lessons (O'Neal et al., 2007). GTAs have also been shown to play a crucial role in keeping undergraduates in the sciences (Seymour and Hewitt, 1997; O'Neal et al., 2007; Sandi-Urena et al., 2011). Despite all this, little has been done to nurture, support, or better understand the development of finely tuned instructional practices employed by GTAs.

It is from today's population of GTAs that the next generation of college instructors will come; thus, the instructional training of GTAs should at least be considered, if not thoughtfully designed. During their tenure as graduate students, teaching assistants develop a set of go-to instructional practices from their classroom experiences and interactions with undergraduate students. The future application of these instructional tools will shape GTAs' instructional practice after graduate school. Because of their impact on current students, future students, and the viability and evolution of higher education, GTAs represent a significant population of study.

Theoretical framework

Discourse analysis has been used to frame this study. While discourse analysis has many functions and is used for a variety of purposes, most broadly it is employed in the study of language organization. The conversation functions as the unit of analysis, the ‘who’ or ‘what’ on which a study has been focused (Stubbs, 1983; Trochim, 2014). Discourse analysis frequently frames studies investigating the patterns of naturally occurring spoken language (Stubbs, 1983). Classroom discourse impacts nearly all elements of instructional practice. According to Cazden (1988), discourse is the primary mode of communication used by instructors to teach and by learners to learn. Variation in how or when discourse is used by an instructor can be the difference between excellent and impaired instruction. For these reasons, classroom discourse should not be ignored when considering issues surrounding instruction and learning (Cazden, 1988; Krystyniak and Heikkinen, 2007).

For decades, linguistic methods have been employed in the study of classroom discourse between students and teachers (Sauntson, 2012). The discourse analyst working with long segments of data collected from the classroom setting seeks to uncover discourse regularities with respect to context and form (Brown and Yule, 1983). At the heart of all studies of classroom discourse is the drive to better understand the social aspects of the classroom to uncover factors influencing student achievement (Cazden, 1988). Sinclair and Coulthard's (1975) model, a commonly accepted means of describing classroom discourse, borrows from Bellack's (1966) initiate/response/evaluate (IRE) model of classroom discourse. In this model, the instructor begins a cycle by questioning the learner, the learner then attempts to respond, after which the instructor either corrects or affirms the learner's response. Unfortunately, these descriptive models of student-teacher discourse were derived from expository or demonstration-oriented classrooms, leaving the IRE model unusable in the categorization of classroom discourse generated in inquiry-oriented settings (Gilkison, 2003). Standard methods and coding schemes commonly employed in classroom discourse analysis were unable to capture the richness of the discourse taking place in inquiry settings.

According to Stubbs (1983), there are three primary decisions to be addressed when conceptualizing a study through the frame of discourse analysis: (a) the length of discourse to be studied per unit, (b) the natural or engineered bounds of discourse sequences, and (c) the consideration of non-linguistic factors. These three crucial decisions begin the process by which the researcher formalizes the unit of analysis (Trochim, 2014). For this study, the unit of analysis was defined by the initiation of conversation between the instructor and a group of students and was terminated when the instructor left the group and/or moved on to a conversation with a different group. The length of these conversations varied. Non-verbal sound cues such as GTA footsteps or long pauses in discourse that indicated the instructor's movement between groups were used to help define the natural beginning and ending of conversations.
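The natural bounding of conversations described above amounts to splitting a stream of time-stamped utterances wherever a sufficiently long silence occurs. The sketch below formalizes only that pause criterion; the utterance format and the 20-second gap threshold are illustrative assumptions, not parameters from the study.

```python
# Segment time-stamped utterances into conversations by pause length.
# NOTE: the (start, end, speaker) tuple format and the 20-second gap
# threshold are illustrative assumptions, not values from the study.

def segment_conversations(utterances, max_gap=20.0):
    """Group (start_sec, end_sec, speaker) utterances into conversations,
    starting a new conversation whenever the silence between consecutive
    utterances exceeds max_gap seconds."""
    conversations = []
    current = []
    for utt in sorted(utterances, key=lambda u: u[0]):
        if current and utt[0] - current[-1][1] > max_gap:
            conversations.append(current)
            current = []
        current.append(utt)
    if current:
        conversations.append(current)
    return conversations
```

In the study itself, boundaries were identified by ear (footsteps, long pauses); the sketch merely expresses that criterion as a rule.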

Purpose and research questions

Classroom discourse is a crucial component of instructional practice for all those who teach (Cazden, 1988; Pianta et al., 2012), yet its role is frequently taken for granted, or as Cazden (1988) describes it, viewed as being transparent, and it is therefore frequently overlooked despite its importance. With this in mind, the following research questions were proposed:

(1) How does classroom discourse differ from the expository/demonstration mode of instruction to the PBL mode of instruction?

(2) What types of classroom discourse are characteristic of the expository/demonstration mode of instruction and the PBL mode of instruction?

This study sought to describe and compare GTA classroom discourse from two distinct instructional settings, the expository/demonstration setting and the PBL setting, beginning an exploration of the relationship between classroom discourse and instructional mode as displayed by GTAs.

Methods

Study design

This study took place during two full semesters of General Chemistry laboratory at a large Midwestern University. Labs early in the semester were expository/demonstration, where students followed step-wise instructions to verify a pre-determined outcome. In inquiry sessions, students were asked to develop a procedure and collect data as a means of answering a research question, putting the focus on understanding research design and data analysis. As the semester progressed, sessions became more inquiry-oriented, concluding with a multi-week PBL unit. During Semester A, the first two sessions were expository/demonstration oriented in nature, the third session was a guided inquiry lab, the fourth session was an expository/demonstration lab, and the final three sessions were PBL (see Table 1). During Semester B, the first four sessions were expository/demonstration, while the final three sessions were PBL (see Table 1). During the semester, GTAs wore a digital audio recorder with microphone while in the instructional laboratory, allowing for the documentation of all classroom conversation that occurred between GTAs and their students.
Table 1 Ordering of labs within Semesters A & B
Semester   Week   Lab type by session
Semester A 1 Expository/demonstration
2 Expository/demonstration
3 Guided inquiry
4 Expository/demonstration
5 PBL Day 1
6 PBL Day 2
7 PBL Day 3
Semester B 1 Expository/demonstration
2 Expository/demonstration
3 Expository/demonstration
4 Expository/demonstration
5 PBL Day 1
6 PBL Day 2
7 PBL Day 3


The guided inquiry lab taught during Week 3 of Semester A was designed in conjunction with an externally funded project (NSF CCLI award number 0941713). This lab was a single day experiment where students were asked to design and carry out a procedure capable of answering a provided research question. After designing a procedure and collecting data, the students evaluated the data and were encouraged to make appropriate claims.

PBL was chosen to meet the course goals of engaging students in an authentic yet accessible research experience, allowing students to work with topics that were meaningful, relevant, and based on current research in the Chemistry Department, and increasing students' understanding of scientific research and the nature of science. The PBL sessions were designed by the authors with the following goals in mind: (a) to foster student creation of a deep and flexible knowledge base for a given topic, (b) to assist students in developing and enacting problem-solving skills, (c) to assist students in developing and enacting self-directed learning skills, (d) to foster student collaboration and teamwork skills, and (e) to allow students space to develop the intrinsic motivation to learn (Barrows and Kelson, 1995; Wang et al., 2016). During the PBL Day 1 session, students were presented with background literature highlighting an ill-structured real-world problem. From this background information, students formulated a research question or hypothesis and an experimental procedure. Upon returning to lab on Day 2, students carried out their procedure, analyzed the data produced, and assessed the soundness of their research design. If there were confounding factors or if the data collected were unable to address the research question/hypothesis, students made procedural alterations. On Day 3, students collected data a second time using their adapted procedure. Lastly, on Day 4, students presented their research questions/hypotheses, experimental design, data collection procedures, and findings to their classmates in the style of a scientific research presentation. To encourage student-to-student discourse, the presentation grading rubric included points for asking questions during the presentations. During the presentations, the GTAs acted as mediators and encouraged, reframed, or clarified student-generated questions. Day 4 did not contain a significant amount of GTA/student discourse and was therefore not analyzed for this project.

Ethical considerations and participant recruitment

Prior to the recruitment of participants, the authors carefully reflected upon the ethical considerations critical to the implementation of a chemical education research project (Taber, 2014). The authors were deeply committed to keeping participants and their students comfortable while also collecting high quality data from which strong conclusions could be drawn. As a first step, the authors co-constructed a basic description of the research design and aims to be read by participants and their students. This research description along with a formal letter of consent was submitted to the university's Human Subjects Institutional Review Board (HSIRB). After receiving HSIRB approval, the authors began participant recruitment. Many participants expressed concerns about not being able to perform “well enough” or provide the authors with “good data”. When presented with these concerns, the authors emphasized that they were interested in the investigation of the research question rather than collecting a specific set of data confirming preconceived notions.

In addition to addressing common concerns during the recruitment phase, the authors also emphasized that individuals were entirely free to participate in the study or to decline participation at any time. The authors recognize that being recorded during instruction can be quite distressing at times, particularly for graduate students who are dependent on their teaching assistantship for tuition remission and salary. Because the authors in no way wanted to impede the performance of instructors, participants concerned about the potential negative impact of study participation on their instruction declined to participate.

After the first round of data collection, the authors realized that stationary audio-visual cameras placed in the classroom were unsuitable for collecting high-quality data: the participant was often out of the frame or could not be heard. During the second round of data collection, participants were asked to wear an audio recorder with a lapel microphone as they taught. Asking participants to wear a digital voice recorder and microphone produced much clearer audio recordings, but decreased GTAs' willingness to participate in the study, limiting the available participant pool. GTAs were less willing to participate under these conditions because they felt that identification, and possible criticism or loss of financial support from the department, were more likely, despite assurances of confidentiality. Ultimately, only two GTAs (henceforth referred to as GTA1 and GTA2) opted to participate in the study when the lapel microphones and digital audio recorders were used.

Participants

GTA1 participated in this study for two consecutive academic semesters (Semesters A and B), yielding Data Sets 1 and 2. GTA2 participated in this study for a single academic semester (Semester A), yielding Data Set 3. Table 2 defines the data sets, in terms of specific semesters and GTA involvement. With respect to previous teaching experience, GTA1 had taught this course in a previous semester, with PBL in the curriculum. GTA2 had not taught this specific course or a PBL lab prior to Semester A. GTA2, however, had previous experience instructing first year general chemistry students in a separate course.
Table 2 Participants contributing to Data Sets 1, 2 & 3
Data Set Participant identifier Semester
1 GTA1 A
2 GTA1 B
3 GTA2 A


Data collection

To take part in this study, GTAs needed to be instructors in a general chemistry laboratory featuring PBL in the curriculum and be willing to be recorded for the entire semester. Prior to each instructional period, the researcher met with each participant to set up the audio recorder. The participants were instructed in the use of the audio recorder. At the end of each session, the researcher retrieved the recorder, transferred the audio file, and archived it. This provided a digital record of each participating GTA's classroom discourse for an entire academic semester.

Role of the researcher

Researchers typically function as nonparticipant observers when conducting observational classroom studies (Marshall and Rossman, 1999). By definition, nonparticipant observers do not involve themselves in the setting of study; they instead seek to be viewed as unobtrusive observers. During this study, the researcher behaved as a nonparticipant observer, having no special role in the classroom. Special precautions were taken by the researchers to prevent their preconceived notions regarding the outcome of the study from coloring study findings.

While conducting the study, the researchers generated and maintained a detailed record of data analysis steps. Yin (2003) reported how bias and error may be minimized by precise documentation of all analytic steps, such that a future researcher could replicate an analysis of the collected data, identifying the functional relationship outlined by the original researcher. In an attempt to meet Yin's (2003) standard, the researchers kept what Richards (2005) calls a “log trail”. The log trail: (a) noted when and how the researcher worked on the project, (b) formed a record of rationales for changes made as data collection and analysis progressed, and (c) speculated as to the potential influence these changes had on the research project's findings. The log trail assisted the researchers in tracking, checking, and being mindful of their biases and preconceived notions.

Data analysis

Once a digital record of each participant's instruction was collected, the researchers set out to develop and define a coding method by which the recordings could be analyzed. Work by Gilkison (2003) suggested that the richness of the discourse taking place within inquiry settings was unable to be captured using those standard methods and coding schemes commonly employed in classroom discourse analysis. Consequently, Gilkison (2003) conducted an exploratory case study outlining a series of discourse techniques characteristic of inquiry-oriented settings. The researchers chose to build on these findings as the basis for this project's coding method (Gilkison, 2003).

The types of classroom discourse observed by Gilkison (2003) were used by the researchers as process codes that described the specific types of classroom discourse across a variety of settings. Saldaña (2009) has defined process codes as descriptions of specific actions taken by participants. Table 3 has outlined the researchers' adaptations of Gilkison's (2003) categories into the process codes (Saldaña, 2009) used in this project. The authors felt it was necessary to modify the coding scheme to reframe Gilkison's categories as applicable process codes that could be clearly defined and consistently applied across a large and complex data set. These modifications contributed to the overall reliability of the data analysis process.

Table 3 Code names and adapted definitions (Gilkison, 2003)
Code name Definition
Elicitation Questions/statements that are not content-oriented and focus on what the student has completed or has observed.
Prompting Content-oriented questions/statements that encourage students to expand on or amend their ideas, provide an answer, or explain their thought process.
Giving feedback Indicating verbally that a response or statement made by one or more students is or is not correct.
Informing Making statements based on facts, ideas, or beliefs held by the teaching assistant that relate to course material, course content, or student actions.
Direct learning Making statements about what a curricular experience is meant to convey or why students are being asked to perform a task.


After selecting and refining code definitions, a coding procedure was developed to allow the researchers to consistently apply these process codes across the entire body of data. The researchers began by designing a standardized “coding box” (shown in Fig. 1). A coding process with three separate rounds, where each round of coding involved listening to the full audio recording, was developed. The researcher divided each audio recording into discrete segments by identifying the bounds of GTA/student conversations pertaining to classroom material during the first round of coding. The researcher assigned and justified the application of appropriate process codes while also checking the bounds of each conversation during the second round of coding. And lastly, the researcher audited the assignment of process codes within each conversation during the third round of coding. By conducting three rounds of coding the researchers were able to consistently apply the process codes (Saldaña, 2009). Fig. 1 has depicted the coding of a single conversation.


image file: c6rp00050a-f1.tif
Fig. 1 Depiction of sample coding box for a single conversation, shown at three different levels of analysis: (a) the audio recorder time stamp marking the beginning and end of each pertinent GTA/student conversation was marked, (b) representative codes within each conversation were marked, supporting discourse was noted, and the pertinence and timestamp of the conversation was checked, and (c) the representative codes were checked.
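The coding-box record depicted in Fig. 1 can be thought of as a small data structure per conversation. The sketch below is an illustrative model, not the authors' actual instrument; the field names are hypothetical, while the structure label follows the paper's convention of joining the codes present in a fixed order (e.g. "E,I").

```python
from dataclasses import dataclass, field

# Order in which the paper lists process codes within a structure label.
CODE_ORDER = ["E", "P", "GF", "I", "DL"]  # Elicitation, Prompting,
                                          # Giving feedback, Informing,
                                          # Direct learning

@dataclass
class CodedConversation:
    """Illustrative model of one coding-box entry (hypothetical fields)."""
    start: str                  # audio time stamp at conversation start
    end: str                    # audio time stamp at conversation end
    codes: set = field(default_factory=set)  # process codes assigned
    notes: str = ""             # supporting discourse excerpts

    def structure(self):
        """Conversation-structure label, e.g. 'E,I'."""
        return ",".join(c for c in CODE_ORDER if c in self.codes)
```

For example, a conversation coded with Elicitation and Informing would carry the structure label "E,I" regardless of the order in which the codes were assigned.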

During the process coding stage, the researchers began to notice patterns in the combinations of process codes present in specific sessions; there seemed to be common structures of conversation. As an example, Fig. 1 shows the coding of an E,I (Elicitation and Informing) conversation. E,I conversations were quite common, comprising approximately 30% of the conversations in some sessions. To clarify the pattern sensed by the researchers, a secondary coding method, longitudinal coding, was applied. Saldaña (2009) has defined longitudinal coding as the tracking and comparison of process codes (in this case, specific combinations of process codes) over time and within varied environmental conditions. Inter-coder agreement was also determined; the results are presented in Table 4.

Table 4 Inter-coder reliability quantification of PBL and Exp/Dem sessions
Session type   Agreement category   Value (%)
PBL session Exact agreement 76
Disagreement with one process code difference 12
Disagreement with more than one process code difference 12
Exp/Dem session Exact agreement 83
Disagreement with one process code difference 17
Disagreement with more than one process code difference 0


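The Table 4 percentages are simple per-conversation tallies of how far the two coders' code sets diverged. A minimal sketch of that tally follows; interpreting "one process code difference" as a symmetric difference of exactly one code is our assumption, and the function name and sample data are illustrative.

```python
# Sketch of the inter-coder agreement tally behind Table 4. The bucketing
# rule (size of the symmetric difference between the two coders' code
# sets) is an assumed interpretation of "one process code difference".

def agreement_summary(coder_a, coder_b):
    """coder_a, coder_b: parallel lists of per-conversation code sets.
    Returns whole-percent rates of exact agreement, one-code
    disagreement, and larger disagreements."""
    assert len(coder_a) == len(coder_b)
    exact = one = more = 0
    for a, b in zip(coder_a, coder_b):
        diff = len(set(a) ^ set(b))   # symmetric difference
        if diff == 0:
            exact += 1
        elif diff == 1:
            one += 1
        else:
            more += 1
    n = len(coder_a)
    return {k: round(100 * v / n) for k, v in
            [("exact", exact), ("one_code", one), ("more", more)]}
```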
After reviewing the data and comparing graphs from each session, the researchers noticed a dichotomy. Common code combinations or conversation structures appeared to be consistently student driven, while others were instructor centered. After seeking out and listening to each individual conversation structure, the researchers categorized each conversation structure as being either constructivist or initiate/response/evaluate (IRE) model consistent (Bellack, 1966). Constructivist conversations were student driven; in these conversations, GTAs asked students to share their opinions and form their own ideas while providing minimal guidance. IRE modeled conversations were instructor driven; in these conversations the GTA sought to correct students’ inaccuracies and expedite students’ ability to quickly and effectively complete the day's prescribed procedure.

Each conversation structure originating from the data has been classified as either a constructivist or an IRE model conversation (see Table 5). The relative proportions of conversation structures from each session were then graphed, as shown in Fig. 2. For each graph, the boxed sections on the right (shown in Fig. 2 with blue shading and red columns) represent the proportion of constructivist conversations, while the unboxed sections on the left (blue columns) represent the proportion of IRE conversations. For each session and each corresponding graph, the researcher compared the overall proportions of constructivist and IRE model conversation. The corresponding graph for each session has been provided in the associated content section.

Table 5 Division of conversation types between the two categories: expository and inquiry
Expository (IRE conversations):
I: Informing
GF: Giving feedback
E,I: Elicitation, informing
E,GF: Elicitation, giving feedback
P,GF: Prompting, giving feedback
P,I: Prompting, informing
GF,I: Giving feedback, informing
E,GF,I: Elicitation, giving feedback, informing
P,GF,I: Prompting, giving feedback, informing
GF,I,DL: Giving feedback, informing, direct learning
E,P,GF,I: Elicitation, prompting, giving feedback, informing
E,P,I,DL: Elicitation, prompting, informing, direct learning
E,GF,I,DL: Elicitation, giving feedback, informing, direct learning
P,GF,I,DL: Prompting, giving feedback, informing, direct learning
E,P,I,GF,DL: Elicitation, prompting, informing, giving feedback, direct learning

Inquiry (constructivist conversations):
E: Elicitation
P: Prompting
DL: Direct learning
E,P: Elicitation, prompting
E,DL: Elicitation, direct learning
E,P,GF: Elicitation, prompting, giving feedback
E,P,I: Elicitation, prompting, informing
E,P,DL: Elicitation, prompting, direct learning
P,GF,DL: Prompting, giving feedback, direct learning
P,I,DL: Prompting, informing, direct learning


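Table 5 functions as a lookup from conversation structure to category. A direct transcription of the table as code (the function name is ours, and structures outside the table are flagged rather than guessed):

```python
# Table 5 as a lookup: each conversation-structure label maps to the
# IRE (expository) or constructivist (inquiry) category.

IRE = {"I", "GF", "E,I", "E,GF", "P,GF", "P,I", "GF,I", "E,GF,I",
       "P,GF,I", "GF,I,DL", "E,P,GF,I", "E,P,I,DL", "E,GF,I,DL",
       "P,GF,I,DL", "E,P,I,GF,DL"}
CONSTRUCTIVIST = {"E", "P", "DL", "E,P", "E,DL", "E,P,GF", "E,P,I",
                  "E,P,DL", "P,GF,DL", "P,I,DL"}

def classify(structure):
    """Category of a conversation-structure label per Table 5."""
    if structure in IRE:
        return "IRE"
    if structure in CONSTRUCTIVIST:
        return "constructivist"
    raise ValueError(f"structure not listed in Table 5: {structure}")
```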

image file: c6rp00050a-f2.tif
Fig. 2 Graph depicting proportion of various types of conversations (as defined by types of discourse present) within a lab session.
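The per-session proportions plotted in Fig. 2 reduce to relative frequencies of the conversation-structure labels observed in that session. A minimal sketch (the session data here are invented for illustration):

```python
from collections import Counter

# Relative proportion of each conversation structure within one session,
# as plotted in Fig. 2. Input labels are invented for illustration.

def structure_proportions(session_structures):
    """session_structures: list of structure labels (e.g. 'E,I') for the
    conversations in one session. Returns each label's proportion."""
    counts = Counter(session_structures)
    total = len(session_structures)
    return {label: count / total for label, count in counts.items()}
```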

Results

Given what is known about increasing student learning and engagement through evidence-based best practices, the authors developed PBL laboratory units for general chemistry. Problem-based learning is an inquiry-oriented instructional strategy that calls upon students to answer ill-structured real-world problems (Hmelo-Silver, 2004). Within problem-based learning, the instructor functions as a facilitator by guiding students rather than directing them (Hmelo-Silver, 2004). GTAs frequently serve as laboratory instructors in chemistry and other STEM fields, and are essential to the teaching of undergraduate STEM courses (Smith, 1993; Sandi-Urena et al., 2011). Additionally, GTAs have been shown to play a crucial role in keeping undergraduates in the sciences (O'Neal et al., 2007; Sandi-Urena et al., 2011). Interestingly, little attention has been paid to the instructional practices of GTAs, especially across varying instructional modes.

In this study the researchers asked: (1) how does classroom discourse differ from the expository/demonstration mode of instruction to the PBL mode of instruction and (2) what types of classroom discourse are characteristic of the expository/demonstration mode of instruction and the PBL mode of instruction?

Data Set 1

Data Set 1 has been depicted in Fig. 3, showing the relative proportion of GTA1's IRE and constructivist model conversations during Semester A. During the expository/demonstration sessions, nearly 80% of the total conversations taking place were consistent with the IRE model. Interestingly, when transitioning from an expository/demonstration session to a guided inquiry session and back to an expository/demonstration session, constructivist type conversations increased within the guided inquiry setting but returned to previous levels in the expository/demonstration session. Constructivist structured conversations dominated within each PBL session. Based on Data Set 1, there appears to be a relationship between the environmental setting and the relative proportion of IRE/constructivist conversations; inquiry sessions contained more constructivist conversations while expository/demonstration sessions were dominated by IRE model conversations. Inquiry sessions commonly contained the following conversation structures: E; E,P,I; and E,P,GF,I; while expository/demonstration sessions contained the following conversation structures: I; E,I; P,I; and P,GF,I. A graph of the relative proportions of conversation structures has been provided for each session of Data Set 1 in the ESI, Appendix 1.
Fig. 3 Data Set 1, GTA1's relative proportion of IRE and Constructivist model conversations during each session of Semester A. (Exp/Dem represents Expository/Demonstration, Inq. represents guided inquiry, and PBL # represents Problem-Based Learning labs).
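As a rough sketch of how per-session proportions like those in Fig. 3 could be tallied, the snippet below counts conversation-structure labels as IRE or constructivist. The labels (E = elicitation, P = prompting, GF = giving feedback, I = informing) follow the paper's coding scheme, but the model assignments and the sample session are hypothetical illustrations, not the authors' actual codebook or data.

```python
from collections import Counter

# Illustrative model assignments, inferred from the structures this
# section lists for inquiry vs. expository/demonstration sessions;
# they are assumptions, not the authors' published codebook.
CONSTRUCTIVIST = {"E", "E,P,I", "E,P,GF,I", "E,P,GF"}
IRE = {"I", "E,I", "P,I", "P,GF,I", "GF,I", "E,GF,I"}

def session_proportions(structures):
    """Return the fraction of IRE, constructivist, and other
    conversations in one lab session, given structure labels."""
    counts = Counter()
    for s in structures:
        if s in IRE:
            counts["IRE"] += 1
        elif s in CONSTRUCTIVIST:
            counts["constructivist"] += 1
        else:
            counts["other"] += 1
    total = sum(counts.values())
    return {model: n / total for model, n in counts.items()}

# Hypothetical expository/demonstration session: IRE-dominated.
demo_session = ["I", "E,I", "E,I", "P,I", "E,P,I"]
props = session_proportions(demo_session)
```

In this hypothetical session, four of the five conversations fall under the IRE model, mirroring the roughly 80/20 split described for the expository/demonstration sessions.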

To illustrate the qualitative differences between IRE model and constructivist model conversations, transcriptions of select conversations have been provided. A fully transcribed elicitation and informing (E,I) conversation is provided in Table 6. This conversation was transcribed from one of GTA1's expository/demonstration sessions in Semester A. GTA1 elicited information from the students as a means of providing specific procedural steps which, if followed, would allow the students to complete the procedure as accurately and quickly as possible. Notice how GTA1 mined information from the students. Student responses midway through the conversation demonstrate that the students did not initially understand what GTA1 was asking. Once the students provided the desired procedural information, GTA1 launched into an explanation outlining their next steps.

Table 6 Transcriptions of E,I conversation
Transcribed example: elicitation and informing
TA: And you all got a chance to look at the book? {elicitation} You have not.
S: We haven't yet. But, how much water do we need to add?
TA: It [the lab manual] says twenty-five, I would try to do twenty {informing}. How much the of ammonium chloride left? {elicitation}
S: What do you mean left?
TA: What percentage? {elicitation}
S: 13.6 percent
TA: Okay, I would use more water, because you don't know how much sand is there and there [appears] to be a significant amount of sand. {informing}
S: Alright
TA: So, use more, use the whole thing, but you can do it, [add] like 15[ml and] stir it, 10 [ml and] stir it. You don't have to add it all at the same time. {informing}


In the conversation transcribed in Table 6, there was very little room for students to express ideas or questions. This transcription embodies instructor-centered instruction. During this conversation, the GTA assessed the validity of the students' procedure, not their understanding of the experiment or its underlying concepts.

A fully transcribed elicitation, prompting, and informing (E,P,I) conversation has been provided in Table 7. This conversation was transcribed from one of GTA1's inquiry sessions in Semester A. In this conversation, GTA1 asked students to describe how they planned to design their experiment. When students indicated that they were unsure of exactly what this meant, GTA1 provided a more specific line of questioning and gave the students a space in which to answer these questions. In a conversation taken from a PBL session, shown in Table 8, GTA1 participates in an extended discussion of the students' procedural design.

Table 7 Transcription of E,P,I conversation
Transcribed example: elicitation, prompting and informing
TA: So, how are things over here? {elicitation}
S: Good, I guess.
TA: Good, so what are we thinking so far? {prompting} what are we going to do for number one and number two? and so on? {elicitation}
S: [unclear]
TA: So what are your plans? {prompting}
S: So, we are going to get a 150 ml beaker, the hydro soap, 100 ml of water. Then we are going to use this pH strip to observe the color, to tell us the pH.
TA: So, these are the pH strips. They are a little different because they have four little buttons on them, each one is a different color. It makes it much more precise. {informing}
S: Right
TA: So, instead of saying, oh that's basic or oh that's acidic, this actually tells you how much. Alright. {informing}
S: Right.
TA: Good, so what are your plans for [number] two? {prompting}
S: Well we aren't sure yet.
TA: I will come back, you want me to come back? {elicitation} I don't want to give away the answer to everybody at the same time.
S: Well, what does it even mean? It just seems really easy? What do they mean make a procedure?
TA: Yeah, but how are you going to do that? {prompting} it says mix them together right? {prompting}
S: Yeah, so do we mix them in like three ml of water?
TA: So, you have your beakers with like 100 ml of water, and soap or water and detergent, right? {elicitation} So you have two beakers. It says: now mix those up together and add them to your samples. {informing} Well, what are you adding them to? {prompting} Think about that, what are you going to add them to and how much soap and detergent are you actually going to use for each test, right? So, those are some things to think about. {informing}


Table 8 Transcription of E,P,GF,I conversation
Transcribed example: elicitation, prompting, giving feedback, and informing
S: So, we want to test fruit for pesticides, but we are not sure that they will be there. And, we want to know if acids and bases could clean the fruit. Do we just bring fruit with pesticides on them?
TA: The fruit are coming from Meijer, so we don't really know. It's kind of random which one's actually have stuff on them. {informing}
S: I know.
TA: So really, if you are only going to test those you might not find those three pesticides. {informing}
S: So, could we test it for pesticides, clean it, and test it again.
TA: Yeah, there is no problem with that{giving feedback}, you just have to make sure that, like, maybe you circle a part with a Sharpie, and test that part. You just have to figure out how you are going to do that. {informing} How are you going to first test for pesticide, then try to clean it? {prompting} That will be an interesting [unclear]. I am not exactly sure how that is going to work. What kinds of acids and bases are you thinking? {prompting}
S: Like, HCl
TA: I'm trying to think if that's... so you're looking at a time thing? That's what you want to look at? {prompting}
S: No, we want to know how much pesticide there is.
TA: Again we can't, what is one of the limitations of our sensor? {prompting}
S: We can't quantify.
TA: Right {giving feedback}
S: But, say that we use some acid or base to see whether or not the pesticide comes off the fruit?
TA: So if, if you are using like five ml of acid and ten ml of acid and then 15? {prompting}
S: We could see how much is needed to take the sensor off the fruit.
TA: So, I have a question then. So you are saying that you need to eliminate water? {prompting}
S: Yeah
TA: Most acids and bases are made in what? {prompting}… water, right? {informing}
S: Couldn't we just dry it?
TA: Well, if you dry it you are going to be wiping pesticide off, you could be wiping the acid off. I mean, you don't really know, right? {informing} What's your control? How do you know what you're wiping off? {prompting}
S: Is there a way to control it?
TA: Yeah, there are ways to control it, it's just getting to that point. That is something to think about. I am not too hot on the idea of trying the cleaners, only because you are adding in a lot of variables to it, with the water and the drying it a certain way and things like that. {informing} So it's, I don't mean to shoot down your ideas. They are really good ideas. {giving feedback} It's just we don't have the- it will also be hard to keep things over for a week. Think about it, we've gotta keep these things, is refrigeration going to affect it? If it is left out it would rot. So, you've gotta think about these things too, will it sit for a week. It's easier to work with things if you can do it in the time that we have in lab here, okay? {informing}
S: Okay, could we test the seeds?
TA: So, if you're testing the seeds to look for contamination, are you testing anything else? {prompting}
S: hum
TA: That's a good idea. There is nothing wrong with that one, I like that one. That's something that's doable in our time frame. {giving feedback}
S: [unclear]
TA: Well, if you are testing the seeds for pesticides how will you know if they were even exposed to pesticides in the first place? {prompting}
S: I guess we could test the fruit itself.
TA: Okay, so you could test a few things on each fruit if you want. {informing} What else could you test? And, by testing the fruit itself I assume that you mean the flesh of the apple, right? {prompting}
S: Yeah.
TA: Well, where does the pesticide get applied? What else could you test? {prompting}
S: The skin.
TA: The skin. {giving feedback} So that's a good question, you could [answer it] by testing different parts of the fruit. So, you have a few things that you can look at and choose from. No need to do all of them. {informing} Does that help a little bit? {elicitation}
S: Yep.
TA: Okay, I gotta move on. I'll be back.


Unlike in the instructor-centered conversation shown in Table 6, the students in this conversation were encouraged to express and enact their own ideas. This transcription demonstrates student-centered conversation. Within this conversation, GTA1 supported the students rather than directing them.

Data Set 2

Data Set 2 is depicted in Fig. 4, showing the relative proportions of GTA1's IRE and constructivist model conversations during Semester B. During the expository/demonstration sessions, nearly 80% of the total conversations taking place were consistent with the IRE model, the most common being: I; E,I; GF,I; and E,GF,I. See the E,I conversation depicted in Table 9. Interestingly, each PBL lab contained ample amounts of constructivist conversations with the most common being: E; E,P,I; and E,P,GF,I. See the E,P,GF,I conversation depicted in Table 10. A graph of the relative proportions of conversation structures has been provided for each session of Data Set 2 in the ESI, Appendix 2.
Fig. 4 Data Set 2, GTA1's relative proportion of IRE and Constructivist model conversations during each session of Semester B.
Table 9 Transcription of E,I conversation
Transcribed example: elicitation and informing
TA: How's it going? {elicitation}
S: Good, there are only a few steps left.
TA: Almost there, I'll be back in like, if I am not back, if I am not back by like 3:35 you guys can just like stick it on a hot plate. {informing}
S: Okay
TA: Be sure to turn it all the way up. {informing} And, you are at 25 not 30, right? {elicitation}
S: Yeah


Table 10 Transcription of E,P,GF,I conversation
Transcribed example: elicitation, prompting, giving feedback, and informing
TA: So, what did you end-up getting? {elicitation}
S: I said that NaCl was a yellow or an orange
TA: Mhm, it was lighter than the other ones {giving feedback}
S: Yeah it was. And for this one, well I figured that there was some lead in (unknown number) J30. There was only a little bit of purple.
TA: I mean, these three red lines, boom they
S: (unclear)
TA: It's easier if you look for the, the fewer amount of colors because there is only, only like this one especially. Like boom, that red line right there, it is a huge tell-tale a sign of which one it is. I mean, I don't think you have that one. {informing}
S: So, for J30, it would be like lead
TA: yep {giving feedback}
S: and it would have some….
TA: So, what gives it that funky line at the end? {prompting}
S: I think magnesium maybe but,
TA: So, if you are not sure then you look down here and you say, oh well, is this purple one here? {prompting}
S: Yeah, there is some purple right there
TA: It has that purple line there, right? They (the lines) will be in the exact same spot. {informing}
S: So, lead and palladium probably.
TA: Does anything else give this funky red double line, probably not. And there are only two. So yeah, there are only two combined. {informing}
S: So, they would combine lead and palladium? We could get lead again?
TA: Oh they might, I don't know, some groups had lead three times.
S: Okay
TA: Just the way it went.


Interestingly, the fourth and final expository/demonstration lab of the semester contained similar proportions of constructivist and IRE model conversations. The researchers suspect that, because this lab was entirely focused on students interacting with and drawing Lewis dot structures, GTA1's conversations were inherently more conceptual and favored the co-construction of knowledge. Despite being classified as an expository/demonstration lab, the process by which students complete the lab (building models of different chemical structures) has elements of problem-solving and trial and error similar to what would be seen in an inquiry-oriented activity. While GTA1 knew the molecular geometry of the chemical structures, GTA1 refrained from giving students the answers and instead helped them determine the answers on their own. The conversation presented in Table 11 was taken from this anomalous expository/demonstration session.

Table 11 Transcription of E,P,GF,I conversation
Transcribed Example: elicitation, prompting, giving feedback, and informing
TA: Okay, what's up? {elicitation}
S: Okay, boron, it has 3 valence electrons, how can this possibly work?
TA: Well, there is also a minus charge isn't there, well, you are adding in an electron right? {informing}
S: I don't know; does fluorine have something to do with it?
TA: No, {giving feedback} there is an electron in there right, BF4 minus. So, there is another electron thrown into the mix. {informing}
S: Okay, is this other one correct though?
TA: Count up your electrons… how many from nitrogen? Oh, how many if you took nitrogen by its self, how many valence electrons? {prompting}
S: Five.
TA: How many from oxygen? {prompting}
S: (unclear)
TA: Huh?
S: Six.
TA: Yeah, so that would be eighteen plus five, that's twenty-three, twenty-four. {informing} How many electrons do you have on there? {prompting}
S: Well let me count... okay it looks like we have twenty-six.
TA: So, you have too many, right? {informing}
S: Yeah.
TA: So that little, that pair right there, it isn't there anymore. {informing} So what does it have to do? {prompting}
S: Make a double bond, but I really don't get it. There are still the same amount of electrons, who cares about the double bond?
TA: Yeah, but then nitrogen would have eight, oxygen would all have eight.
S: Okay, the octet rule.
TA: There you go. {giving feedback}


Based on Data Set 2, there appears to be a relationship between the environmental setting and the relative proportion of IRE/constructivist conversations. Inquiry sessions contained more constructivist conversations, while the expository/demonstration sessions (save one) were dominated by IRE model conversations.

Data Set 3

GTA2's classroom discourse patterns within inquiry and expository/demonstration sessions were difficult to discern, if present at all (see Fig. 5). GTA2 utilized a wider distribution of conversation structures within inquiry sessions than within the expository/demonstration sessions. The wider distribution of conversational structures did not, however, translate to an overall dominance of constructivist conversation.
Fig. 5 Data Set 3, GTA2's relative proportion of IRE and constructivist model conversations during each session of Semester A.

Within expository/demonstration sessions, GTA2's discourse aided students in quickly and efficiently completing procedures or addressing assessments (see Table 12). These conversations tended to be instructor-centered. GTA2 recounted procedural steps and answers to laboratory assessments, whether or not students had asked a question. This sharing of information came in the form of both individual conversations and classroom announcements. I; E,I; GF,I; and P,GF,I were the most commonly documented conversation structures within GTA2's expository/demonstration sessions. In the conversation shown in Table 12, GTA2 highlights a series of student mistakes, attempts to comfort the student, and then moves on to the next group. A graph of the relative proportions of conversation structures has been provided for each session of Data Set 3 in the ESI, Appendix 3.

Table 12 Transcription of GF,I conversation
Transcribed example: giving feedback and informing
TA: Okay [student's name] let's see whether you will get 100 out of 100 or not... okay no, let's see what happened.
S: Oh no.
TA: Just see what happened. This is grams and this is milligrams so you can see what was wrong here. {informing}
S: And this was not good either?
TA: Yeah, this answer was actually wrong.{giving feedback} This is not the best way to finish this problem. {informing} If you need more of an explanation you could come to my office hours and I could help you with that, right?
S: Yeah
TA: But, I think you are doing a good job so keep it up. {giving feedback}
S: Okay


Conversation types consistent with the IRE model were commonly used by GTA2 within inquiry-oriented sessions. However, other student-centered conversation structures (consistent with constructivist conversation) were also seen within inquiry-oriented sessions. When constructivist conversation was utilized, student ideas and needs drove the conversations. In these moments, GTA2 acted as a guide by cueing students to recall crucial facts and encouraging them to reflect upon all of their options. These actions point to the facilitative role periodically played by GTA2 within the inquiry-oriented sessions of Data Set 3. In the conversation depicted in Table 13, GTA2 engages with students by assisting them in framing the interpretation of their results while also reminding them of possible confounding variables.

Table 13 Transcription of E,P,GF conversation
Transcribed example: elicitation, prompting, and giving feedback
TA: So, what's happening over here? {elicitation}
S: We are stuck, all of our tests came back positive. Is that bad? It seems bad?
TA: Well, what do you think it could be? {prompting}
S: They could all be positive but, there could also be something else?
TA: Something else like, what? {prompting}
S: Oh, uh water. Maybe water is giving us a fake positive
TA: Maybe {giving feedback}. Okay how can you handle that? What can you do? {prompting}
S: Test again and try to dry things better?
TA: Sounds good {giving feedback}. I'll be back to see it soon.


Limitations

Ideally, the authors would have obtained more GTA discourse data for this study. As noted earlier, the sample of GTAs who provided data for this study was small. The first round of data collection used video recordings, which yielded unusable data. To improve the audio recording quality and allow for discourse analysis, the authors utilized digital audio recorders with lapel microphones. While this change resulted in high quality audio recordings, it significantly reduced the number of GTAs willing to participate in our study. Despite reassurances of anonymity and confidentiality, GTAs expressed persistent concerns about identification and possible criticism or loss of their teaching assistantship from the department based on the recordings, data, or findings. For these reasons, the participant pool was limited. The authors recognize that definitive conclusions cannot be drawn from such a small data set and that the limited participant pool may have impacted our results. The authors also acknowledge not knowing what motivated the participating GTAs to take part in the study. However, the authors would encourage future researchers with access to GTA populations to investigate the connection between lesson structure and GTAs' instructional practices.

Discussion

For the GTAs participating in this study, a distinct qualitative change away from IRE-modeled conversation was observed when the curriculum became problem-based or inquiry-driven. We were also able to quantify the differences in conversation structure present during expository/demonstration sessions as compared to inquiry sessions. IRE model conversations were not the dominant mode of instruction utilized by GTAs within inquiry sessions; rather, a constructivist model of classroom discourse increased in frequency. Of particular interest and relevance was the finding that an immediate shift toward learner-centered discourse occurred when switching to an inquiry-oriented curriculum. Given the importance of active learning and student engagement in the learning process, these are important findings, and they are summarized below.

Assertion 1: There was an apparent relationship between the instructional mode (expository/demonstration versus inquiry) and the structure of classroom discourse employed by GTAs.

Inquiry-oriented sessions were more frequently populated with conversation consistent with the constructivist model of classroom discourse than with the IRE model. Conversely, expository/demonstration sessions were more frequently populated by conversation structures consistent with the IRE model than with the constructivist model. Fifteen of the twenty labs analyzed for this project produced data consistent with this pattern, though the pattern displayed itself in varying degrees between the two participants (GTA1 and GTA2).

Assertion 2: Patterns in classroom discourse repeated within given instructional modes, even when the nature of content being covered varied widely.

Though the expository/demonstration sessions and the inquiry sessions were composed of very different types of classroom discourse when compared to each other (even when they were taught back to back), there was an interesting level of homogeneity among all of the sessions within a given mode, whether expository/demonstration or inquiry-oriented. This homogeneity was noteworthy given how widely the content varied. Lab activities covered topics including chemical formulas and limiting reagents, properties of gases, molecular geometry and atomic spectra, and the making of soaps from varying fatty acids.

Assertion 3: Patterns in classroom discourse observed in inquiry labs exemplified a constructivist learning environment and were achieved with minimal interference on the part of the researchers.

With the previous assertions, the connection between classroom discourse and mode of instruction has been discussed. It appears that there is a connection between the mode of instruction and the type of discourse utilized by GTAs. When GTA discourse shifts away from the classical IRE model there are exciting implications. Shifting away from the IRE model allows classroom discourse to better align with the conceptualized constructivist learning environment. According to the constructivist theory of learning, the learner is best served when individualized construction of knowledge occurs within the learner as he or she attempts to solve a problem with the limited guidance of a more knowledgeable individual (Gage and Berliner, 1998).

What is often taken for granted in education studies, however, or treated as transparent, is this “guidance” provided by a more knowledgeable individual (in this case, the GTA). Cazden (1988) commented on how infrequently educational studies focus on classroom discourse; today, much emphasis is placed on curriculum design, and classroom discourse is often overlooked. In this study, the classroom discourse observed in the PBL labs parallels classroom practices that align with the basic tenets of constructivist learning. The data presented here show that the theoretical components of the constructivist learning environment were actualized within the inquiry and PBL sessions more often than within the expository/demonstration sessions. As previously discussed, conversation types E; E,P,I; and E,P,GF,I were consistently among the most frequently employed conversation structures within the inquiry labs, and these structures represent real-world examples of constructivist classroom discourse.

With our participants we observed that, for a given chemistry GTA and a given section, GTA discourse aligned well with the IRE model in the expository/demonstration labs but departed from the IRE model in the PBL sessions. At the level of individual chemistry GTAs and students, this departure from the IRE model was characterized by the GTAs relinquishing full control of the discourse by: (a) bringing out what students already knew or believed, (b) encouraging reflection on and consideration of this knowledge, and (c) supporting the students with either confirmation or explanation (Gage and Berliner, 1998). This shift in classroom discourse, generated by the chemistry GTAs, embodies today's model of constructivist learning.

In addition to embodying constructivist learning, shifts in the structure of classroom discourse happened rapidly and without significant investments in GTA training. From one session to the next, participants demonstrated an ability to significantly change the structure of the conversations they were having with students. This change in classroom discourse structure embodied the constructivist learning environment and was strongly suggestive of a relationship between instructional mode and classroom discourse.

The take-home message from this study is that there is a connection between instructional mode and classroom discourse, and that significant inputs of time and training may not be required (on the part of the department or university) to produce classroom discourse consistent with constructivism. This study has shown that moving toward a problem-based or inquiry laboratory experience also helps move instructors toward a teaching style more aligned with constructivism. Once given the opportunity, professors and GTAs must take the reins and experiment with their own abilities to carry out inquiry-oriented instruction. To improve one's own instruction, or to better align it with the tenets of constructivism, the data from this study suggest that one may begin simply by finding or generating an inquiry-oriented lesson and carrying it out. Putting the instructor in a 'new environment' by changing the lesson plan and the aims of a lesson has, for our participants, substantially altered the ways in which they structured their conversations with students.

Acknowledgements

The authors wish to thank all study participants for their willingness to take part; without them this study would not have been possible. The Western Michigan University Research Development Award and the Western Michigan University Department of Chemistry helped support this project, in addition to the intellectual work of Dr Sherine Obare, Dr Steve Bertman, Dr John Miller, and their respective research groups. We would also like to extend our thanks to the Western Michigan University Stockroom Manager, Brianna Galli, for providing additional support to the authors and participants over the course of the study.

References

  1. Anderson R., (2002), Reforming science teaching: what research says about inquiry, J. Sci. Teacher Educ., 13(1), 1–12.
  2. Barrows H. and Kelson A., (1995), Problem-based learning in secondary education and the problem-based learning institute, Springfield, IL: Problem-Based Learning Institute.
  3. Bellack A. A., (1966), The language of the classroom, New York: Teachers College Press, Teachers College Columbia University.
  4. Bodner G. M., (1986), Constructivism: a theory of knowledge, J. Chem. Educ., 63(10), 873–878.
  5. Brown A. L. and Campione J. C., (1996), Psychological theory and the design of innovative learning environments: on procedures, principles, and systems, Lawrence Erlbaum Associates, Inc.
  6. Brown G. and Yule G., (1983), Discourse analysis, Cambridge: Cambridge University Press.
  7. Capon N. and Kuhn D., (2004), What's so good about problem-based learning? Cognition Instruct., 22(1), 61–79.
  8. Cazden C. B., (1988), Classroom discourse: the language of teaching and learning, Portsmouth, NH: Heinemann.
  9. Dancy M. and Henderson C., (2010), Pedagogical practices and instructional change of physics faculty, Am. J. Phys., 78(10), 1056–1063.
  10. Dochy F., Segers M., Van den Bossche P. and Gijbels D., (2003), Effects of problem-based learning: a meta-analysis, Learn. Instr., 13(5), 533–568.
  11. Freedman M. P., (1997), Relationship among laboratory instruction, attitude toward science, and achievement in science knowledge, J. Res. Sci. Teach., 34(4), 343–357.
  12. Gage N. L. and Berliner D. C., (1998), Educational psychology, Boston, MA: Houghton Mifflin.
  13. Gilkison A., (2003), Techniques used by ‘expert’ and ‘non-expert’ tutors to facilitate problem-based learning tutorials in an undergraduate medical curriculum, Med. Educ., 37(1), 6–14.
  14. Henderson C. and Dancy M. H., (2007), Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics, Phys. Rev. ST Phys. Educ. Res., 3(2), 020102.
  15. Hmelo-Silver C. E., (2004), Problem-based learning: what and how do students learn? Educ. Psychol. Rev., 16(3), 235–266.
  16. Hmelo-Silver C. E. and Barrows H. S., (2006), Goals and strategies of a problem-based learning facilitator, Interdiscip. J. Probl.-Based Learn., 1(1), 4.
  17. Krystyniak R. A. and Heikkinen H. W., (2007), Analysis of verbal interactions during an extended, open-inquiry general chemistry laboratory investigation, J. Res. Sci. Teach., 44(8), 1160–1186.
  18. Marshall C. and Rossman G., (1999), Designing qualitative research, Thousand Oaks, CA: Sage Publications.
  19. Mervis J., (2013), Transformation is possible if a university really cares, Science, 340(6130) 292–296.
  20. Mikami A. Y., Gregory A., Allen J. P., Pianta R. C. and Lun J., (2011), Effects of a teacher professional development intervention on peer relationships in secondary classrooms, School Psych. Rev., 40(3), 367.
  21. National Research Council, (2005), How students learn: Science in the classroom, Washington, DC: The National Academies Press.
  22. National Research Council, (2011), Promising practices in undergraduate science, technology, engineering, and mathematics education, Washington, DC: The National Academies Press.
  23. Nye B., Konstantopoulos S. and Hedges L. V., (2004), How large are teacher effects? Educ. Eval. Policy Anal., 26(3), 237–257.
  24. O'Neal C., Wright M., Cook C., Perorazio T. and Purkiss J., (2007), The impact of teaching assistants on student retention in the sciences: lessons for TA training, J. Res. Sci. Teach., 36(5), 24.
  25. Palincsar A. S. and Brown A. L., (1988), Teaching and practicing thinking skills to promote comprehension in the context of group problem solving, Rem. Spec. Educ., 9(1), 53–59.
  26. Patel V. L., Groen G. J. and Norman G. R., (1993), Reasoning and instruction in medical curricula, Cognition Instruct., 10(4), 335–378.
  27. Pianta R. C., Hamre B. K. and Mintz S. L., (2012), Classroom assessment scoring system-secondary manual, Charlottesville, VA: Teachstone.
  28. Prince M., (2004), Does active learning work? a review of the research, J. Eng. Educ., 93(3), 223–231.
  29. Richards L., (2005), Handling qualitative data: a practical guide, London: Sage Publications.
  30. Roehrig G. H. and Kruse R. A., (2005), The role of teachers' beliefs and knowledge in the adoption of a Reform-Based curriculum, Sch. Sci. Math., 105(8), 412–422.
  31. Roth W. and Roychoudhury A., (1993), The development of science process skills in authentic contexts, J. Res. Sci. Teach., 30(2), 127–152.
  32. Saldaña J., (2009), The coding manual for qualitative researchers, London: Sage Publications.
  33. Sandi-Urena S., Cooper M. M. and Gatlin T. A., (2011), Graduate teaching assistants' epistemological and metacognitive development, Chem. Educ. Res. Pract., 12(1), 92–100.
  34. Sandi-Urena S., Cooper M. and Stevens R., (2012), Effect of cooperative problem-based lab instruction on metacognition and problem-solving skills, J. Chem. Educ., 89(6), 700–706.
  35. Sauntson, H., (2012), Approaches to gender and spoken classroom discourse, London: Palgrave Macmillan.
  36. Seymour E. and Hewitt N. M., (1997), Talking about leaving: why undergraduates leave the sciences, Boulder, CO: Westview Press.
  37. Sinclair J. and Coulthard M., (1975), Towards an analysis of discourse, Oxford: Oxford University Press.
  38. Slavin R. E., Lake C. and Groff C., (2009), Effective programs in middle and high school mathematics: a best-evidence synthesis, Rev. Educ. Res., 79(2), 893–911.
  39. Smith K., (1993), Graduate teaching assistants, Innovative Higher Education, 17(3), 147.
  40. Stubbs M., (1983), Discourse analysis: the sociolinguistic analysis of natural language, Chicago, IL: University of Chicago Press.
  41. Taber K., (2014), Ethical considerations of chemistry education research involving ‘human subjects’, Chem. Educ. Res. Pract., 15(2), 109–113.
  42. Trochim M. K., (2014), Research methods knowledge base: unit of analysis, retrieved 3/10/2014 from http://www.socialresearchmethods.net.
  43. Wang Q., Pang W., Liang S. and Su Y., (2016), Developing an integrated framework of problem-based learning and coaching psychology for medical education: a participatory research, BMC Med. Educ., 16(2), 1–14.
  44. Yin R. K., (2003), Case study research: design and methods, 3rd edn, Thousand Oaks, CA: Sage Publications.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/c6rp00050a

This journal is © The Royal Society of Chemistry 2016