Untangling a complex relationship: teaching beliefs and instructional practices of assistant chemistry faculty at research-intensive institutions

Maia Popova a, Lu Shi b, Jordan Harshman c, Annika Kraft b and Marilyne Stains *b
aUniversity of North Carolina at Greensboro, Greensboro, North Carolina, USA
bUniversity of Virginia, Charlottesville, Virginia, USA. E-mail: mstains@virginia.edu
cAuburn University, Auburn, Alabama, USA

Received 26th September 2019, Accepted 3rd January 2020

First published on 14th January 2020


Abstract

In this era of instructional transformation of Science, Technology, Engineering, and Mathematics (STEM) courses at the postsecondary level in the United States, the focus has been on educating science faculty about evidence-based instructional practices, i.e., practices that have been empirically shown to enhance student learning outcomes. The literature on professional development at the secondary level has demonstrated a tight interconnectedness between one's beliefs about teaching and learning and one's instructional practices, and the need to attend to faculty's beliefs when engaging them in instructional change processes. Although discipline-based education researchers have made great strides in characterizing the instructional practices of STEM faculty, much less attention has been given to understanding STEM faculty's beliefs about teaching and learning. Knowledge of instructors' thinking can inform faculty professional development initiatives that encourage faculty to reflect on the beliefs that drive their classroom practices. Therefore, this study characterized the interplay between the beliefs and instructional practices of nineteen assistant chemistry professors. Luft and Roehrig's Teacher Beliefs Interview protocol was used to capture beliefs; classroom observations and course artifacts were collected to capture practices. Clear trends were identified between faculty's beliefs (characterized through constant-comparative analysis and cluster analysis) and practices (characterized with Blumberg's Learner-Centered Teaching Rubric). Overall, the beliefs of most participants were somewhat aligned with their instructional practices, with the exception of one cluster of faculty who held student-centered beliefs but received only moderate scores on the Learner-Centered Teaching Rubric.


Introduction

The discipline-based education research (DBER) community in the United States (US) has provided ample evidence of the positive impact that the use of evidence-based instructional practices has on student learning outcomes (Freeman et al., 2014). Consequently, governmental and professional organizations have supported a wave of instructional reforms, and many higher education institutions have taken it upon themselves to transform instructional practices in their undergraduate STEM courses. The DBER community has informed these reform efforts by characterizing their impact on faculty's practices and thought processes and by identifying contextual factors that influence faculty's instructional decisions (e.g., Shadle et al., 2017; Stains et al., 2018). This research has demonstrated the challenges of helping faculty adopt new practices and how little transformation has occurred in most STEM disciplines in the US, with chemistry having the lowest level of transformation.

Instructional transformation has been extensively studied at the secondary level, and many of the findings could provide meaningful insight to current efforts in higher education. One of these findings that has been understudied in DBER is the tight interconnectedness between one's instructional practices and one's beliefs about teaching and learning, and the need to pay attention to instructors' beliefs during professional development programs and reform efforts (Feyzioğlu, 2012; Hora, 2014; Wong and Luft, 2015; Şen and Sarı, 2018). Indeed, the DBER community has expended great effort to characterize instructional practices but has given much less attention to capturing STEM faculty's beliefs about teaching and learning. Studies on beliefs fall into three main categories: characterization of beliefs, measurement of the impact of instructional reforms on participants' beliefs, and exploration of the relationship between beliefs and practice.

Studies aiming to characterize teaching beliefs have focused on various types of instructors, including pre-service teachers (e.g., Mavhunga and Rollnick, 2016), secondary science teachers (e.g., Fletcher and Luft, 2011), undergraduate students (e.g., Pratt and Yezierski, 2018), graduate teaching assistants (GTAs) (e.g., Lee, 2019), postdoctoral scholars (e.g., Chapman and McConnell, 2018), and university faculty (e.g., Hora, 2014). Most of these studies characterize beliefs along a continuum from teacher-centered to student-centered, where the teacher-centered end refers to beliefs that support the transmission model of learning (i.e., students receive knowledge from the teacher), whereas the student-centered end represents beliefs that students construct knowledge when actively engaged in the educational process and when assuming responsibility for their own learning (Luft and Roehrig, 2007). Transitional beliefs represent the midpoint along this continuum. For example, Gardner and Parrish (2019) explored the beliefs of biology GTAs and found that, on average, most of their participants held transitional beliefs about teaching and learning. On the other hand, Chapman and McConnell (2018) reported that the geoscience graduate students and postdoctoral scholars in their study held a range of beliefs from instructor-centered to student-centered. Some studies highlighted that the same individual can hold different beliefs about the processes of teaching and learning. For example, Feyzioğlu (2012) reported that most science teachers in their sample held transitional beliefs about teaching and teacher-centered beliefs about learning, whereas Wong and Luft (2015) found that science teachers in their study held more teacher-centered beliefs about teaching and more student-centered beliefs about learning. Other studies reported that pre-service science teachers, undergraduate students conducting outreach activities, and GTAs held teaching beliefs that contradict the standards of contemporary literature on best practices of teaching and learning (Phelps and Lee, 2003; Gormally, 2016; Pratt and Yezierski, 2019). Our review of studies characterizing beliefs points to a lack of focus on STEM faculty in general and chemistry faculty in particular, and to inconsistencies in results across different STEM disciplines when the same type of instructor is investigated (e.g., GTAs).

Another group of studies has investigated the impact of interventions (e.g., professional development programs, pedagogy courses, adoption of student-centered teaching materials) on teaching beliefs. These studies suggest that some of the interventions were successful at shifting beliefs towards the student-centered side of the continuum (Mattheis and Jensen, 2014; Moore et al., 2015; Pelch and McConnell, 2016; Czajka and McConnell, 2019), whereas others failed to promote and sustain this shift (Fletcher and Luft, 2011; Lee, 2019). For instance, after an intervention targeting science teachers’ topic specific pedagogical content knowledge, Mavhunga and Rollnick (2016) reported that not all of the participants demonstrated a shift towards more student-centered beliefs. Fletcher and Luft (2011) reported that, after participation in a teacher preparation program, teachers in their study initially showed a shift to more student-centered beliefs, but ultimately returned to traditional beliefs by their first year in the classroom.

Although the literature demonstrates a tight interconnectedness between one's beliefs about teaching and learning and one's instructional practices (Czajka and McConnell, 2016, 2019), relatively few studies have closely examined this relationship. Douglas and colleagues investigated the relationship between GTAs' beliefs and practices and reported that GTAs' beliefs were consistent with their practices, exhibiting traits belonging to two primary categories: mostly teacher-centered and transitional (Douglas et al., 2016). Other studies, however, reported the opposite finding and identified a misalignment between teaching beliefs and instructional practices (Addy and Blanchard, 2010; Bennett and Park, 2011; Mansour, 2013; Dolphin and Tillotson, 2015; Şen and Sarı, 2018). For example, Addy and colleagues, who analyzed the beliefs and practices of a different sample of GTAs, reported that half of the participants held transitional beliefs, yet displayed fairly traditional, teacher-centered teaching (Addy and Blanchard, 2010). One methodological weakness of most of these studies is their characterization of instructional practices solely through classroom observation field notes or video recordings. Face-to-face class time comprises only a small portion of the time instructors engage students with the content. Therefore, more comprehensive methods of characterizing students' experiences with a course are necessary to accurately capture instructional practice.

As is evident from the literature reviewed above, there is currently little alignment between previous findings obtained in various contexts (i.e., with different samples, in different cultural backgrounds and educational settings). More research is needed to augment the body of literature on teaching beliefs and their relationship to instructional practices in order to detect a more consistent signal across studies performed in different educational contexts. At the same time, although a substantial body of literature has shed light on teaching beliefs at the secondary level, beliefs might differ at the postsecondary level because, unlike K-12 teachers, university instructors often do not have specialized training in teaching. Similarly, the beliefs of faculty might also differ from the beliefs of teaching assistants, who may not have much autonomy in the classroom or laboratory. This exploratory study aims to contribute to the relevant body of literature by characterizing the instructional beliefs and practices of assistant professors in chemistry at research-intensive institutions in the United States. The focus on a particular STEM discipline and a particular stage of the academic career will help advance this field of research: isolating the investigation to one specific population within one discipline (chemistry) enables in-depth findings without the concern that broader contextual differences are driving the variability among participants. In addition, unlike other studies that investigated the beliefs of experienced and/or exemplary faculty (Kane et al., 2004; Padilla and Garritz, 2015) or the beliefs of faculty with a range of teaching experiences (Hora, 2014; Moore et al., 2015; Czajka and McConnell, 2019), this study characterizes the beliefs of novice faculty specifically. This provides a focus on a particular demographic of chemistry instructors whose beliefs about teaching and learning might still be developing and are more prone to change. Along with a thorough analysis of beliefs about teaching and learning, this study provides a comprehensive analysis of instructional practices by characterizing both video observations and course artifacts. In particular, this study seeks to address the following research questions that pertain to assistant chemistry professors from research-intensive institutions:

1. What are assistant chemistry professors’ beliefs about teaching and learning?

2. What is the relationship between assistant chemistry professors’ beliefs about teaching and learning and instructional practices?

Theoretical frameworks

Kagan (1992, p. 65) defined beliefs as "tacit, often unconsciously held assumptions about students, classrooms, and the academic material to be taught," which directly impact how content is presented to students. Kagan's definition highlights that beliefs are held to be true and that they guide behavior. Over time, an individual develops a system of beliefs composed of multiple beliefs that are not independent of each other. The earlier beliefs in this network are held more strongly and are resistant to change. Another key characteristic of beliefs is that they form on the basis of evaluation and judgement and are inherently subjective. Interestingly, one can hold contradicting beliefs, which may trigger a feeling of dissonance (Pajares, 1992).

Beliefs are an important feature of multiple contemporary empirical models of instructional practice, including the Consensus Model of Teacher Professional Knowledge, which encompasses pedagogical content knowledge (Shulman, 1986; Gudmundsdottir and Shulman, 1987; Neumann et al., 2018), the Teacher-Centered Systemic Reform Model (Gess-Newsome et al., 2003), and the Interconnected Model of Teacher Professional Growth (Clarke and Hollingsworth, 2002). All three models highlight the influence of instructors' beliefs on their classroom practices.

The Consensus Model of Teacher Professional Knowledge is composed of four main components: (1) domain of teacher professional knowledge bases, which includes content knowledge (CK) – knowledge of disciplinary concepts, theories, and principles, and pedagogical knowledge (PK) – general knowledge of educational purposes and methods of teaching, learning, and assessment (Fernandez, 2014); (2) domain of topic-specific professional knowledge (TSPK) – knowledge of best practices of teaching and learning of a specific topic (Stender et al., 2017); (3) domain of classroom practice, which includes classroom context and pedagogical content knowledge (PCK) – combined knowledge of content and pedagogy that enables delivery of a subject matter in a form that is comprehensible for learners (Connor and Shultz, 2018); and (4) domain of student outcomes (Neumann et al., 2018). The model identifies “amplifiers and filters” that influence the relationships between some of these domains. In particular, beliefs about teaching and learning are shown to play an influential role between the domain of TSPK and the domain of classroom practice, emphasizing that beliefs impact both teacher knowledge and behavior (Schultz et al., 2018).

The Teacher-Centered Systemic Reform Model provides a framework for understanding change (or lack thereof) as an outcome of classroom reform initiatives (Gess-Newsome et al., 2003). According to this model, in order to achieve desirable outcomes, reform efforts need to take into account and target one or more of the following domains: (1) contextual factors (e.g., cultural context, school context, classroom context), (2) teacher personal factors (e.g., demographic profile, years of teaching experience), and (3) teacher thinking, which includes knowledge and beliefs about teaching, students, and content. Teachers’ knowledge and beliefs are at the center of this model and connect to all other domains, which highlights the robust link between teacher beliefs and teacher inclination to make changes to their teaching.

The Interconnected Model of Teacher Professional Growth describes another mechanism through which teacher change can occur, where change is defined as growth or learning. The model suggests that in order for change to be possible, at least one of the following four domains needs to be impacted: (1) the external domain (e.g., participation in a workshop), (2) the personal domain (e.g., beliefs, knowledge), (3) the domain of practice (e.g., classroom experimentation), and (4) the domain of consequence (e.g., student learning outcomes). The model illustrates the complex, non-linear nature of the relationships between these domains, as change in one domain can promote change in another domain through the mediating processes of reflection and enactment (Clarke and Hollingsworth, 2002). According to this model, change in the belief system may promote change in the domain of practice and teacher professional growth.

Methods

Sample

Nineteen assistant chemistry professors participated in this Institutional Review Board approved investigation. All participants were from institutions with high or very high research activity according to the Carnegie classification (Center for Postsecondary Research) and were located across fifteen different states in the US, spanning all four regional divisions distinguished by the US Census Bureau (U.S. Department of Commerce): 5 universities in the Northeast, 4 in the Midwest, 6 in the South, and 4 in the West. A code number was created for each participant in order to protect their identities.

Faculty were recruited while attending the Cottrell Scholars Collaborative New Faculty Workshop (CSC NFW) (Baker et al., 2014). Detailed demographics for the sample are shown in Table 1. Chemistry courses taught by faculty participants during data collection ranged from introductory undergraduate courses, such as general and organic chemistry, to advanced graduate courses, such as chemistry of polymers and mechanisms of chemical reactions.

Table 1 Descriptive demographics for the sample
Demographic variables Faculty, n
Sex
Female 10
Male 9
Course level taught
Graduate 10
Undergraduate 9
Year teaching
First 8
Second 5
Third 3
Fifth 3


Data collection

The individual steps of this study's research design are summarized in Fig. 1. Faculty participated in semi-structured, think-aloud interviews that were conducted by the first and third authors (Drever, 1995; Patton, 2002). The third author interviewed three participants in Fall 2016 and ten in Spring 2017; the first author interviewed six participants in Fall 2018. The interview protocol was a modified version of the Teacher Beliefs Interview (TBI) (Luft and Roehrig, 2007). The TBI protocol used in this study excluded one of the original seven questions and included three additional questions. Thus, the modified protocol included nine questions that elicited faculty's beliefs about how learning occurs, how to know when students understand, what the instructor's role in the classroom is, and several other topics (the full interview protocol can be found in Appendix 2, ESI). On average, the interviews lasted forty minutes. Because this population was geographically diverse, video-conferencing programs (e.g., Skype, Zoom) were used to interview the faculty participants, and the interviews were audio recorded.
Fig. 1 Research design of this study.

In addition to the interview data, we collected classroom observation videos and course artifacts associated with the teaching of one unit/chapter of each participant's course. All participants were mailed a video camera and a tripod to videotape one unit/chapter of their choice. Faculty were then asked to share their course artifacts, which included the syllabus, any materials used in class while covering the selected unit/chapter, and any assessment tools (homework, quizzes, and exams) used to assess student understanding of the selected unit/chapter. Each faculty member received a $50 gift card as compensation for their time.

Analysis of interviews

Once transcribed verbatim, the interviews were read by the first, fourth, and fifth authors to identify the beliefs about teaching and learning expressed by the faculty participants. However, upon reading through the transcripts, the researchers realized that most of the participants' responses described their classroom practices and only part of the responses expressed beliefs about teaching and learning. This is not surprising, as beliefs have been regarded in the literature as a "messy construct," the confusion around which centers on the difficulty of distinguishing between beliefs, pedagogical knowledge, and pedagogical practices (Pajares, 1992). As the designers of the original TBI protocol have acknowledged, in their desire to elicit beliefs, they might have "inadvertently captured behavioral intentions" of the teachers in their study (Luft and Roehrig, 2007, p. 43). A similar pattern was observed in this study. Therefore, the first, fourth, and fifth authors read through each of the transcripts to identify quotes that communicated participants' beliefs about teaching and learning (i.e., not descriptions of what they were doing in their classrooms, but explanations of why they were doing it). To capture as many quotes communicating beliefs as possible, the first and fifth authors individually identified and then compared all of the quotes from questions 1–6. Similarly, the first and fourth authors individually identified and then compared all of the quotes from questions 7–9. These efforts bolstered the reliability of the findings by reducing the number of missed quotes and resulted in the identification of 174 quotes across the nineteen interview transcripts.

The first, fourth, and fifth authors read through the identified quotes to generate the first version of the codebook, which consisted of descriptive codes (short phrases that summarize a passage of qualitative data) (Saldaña, 2013). All of the identified quotes were then uploaded to NVivo 12 to be stored, organized, and inductively coded using the first version of the codebook (Patton, 2002; Creswell, 2003; Bazeley and Jackson, 2013). The first, fourth, and fifth authors collaboratively coded the data. The coding process was accompanied by writing reflective memos in order to capture the researchers' thoughts about the data and assist communication between the investigators (Birks et al., 2008). The first author was assigned the responsibility of codebook editor, creating, updating, and revising the master codebook based on coding discussions with the team members.

The coding process underwent several cycles. Preliminary analysis involved the three researchers independently coding quotes from the first two participants using the first version of the codebook. During a debriefing session, the codes were revised, unique cases were discussed, and the second version of the codebook was created. In the second cycle of coding, the three researchers coded three more transcripts. At a follow-up debriefing session, the coders again discussed their coding process to make sure that they agreed and remained consistent in their assignment of particular codes to particular data. At this stage, the coders engaged in constant comparative analysis, comparing codes and exploring their relationships in order to integrate them into meaningful categories and themes (Bradley et al., 2007). Upon reflection, the third version of the codebook, which captured the organizational framework of codes and categories, was created and applied to the rest of the data by the first author. Once coding was complete, the fifth author examined all of the first author's coding applied to questions 1–6, and the fourth author examined all of the coding applied to questions 7–9. At a number of follow-up debriefing sessions, the team members discussed and examined every case of disagreement on coding until the researchers reached 100% interpretive convergence (Saldaña, 2013).

The confirmability (degree to which the results of an inquiry could be confirmed by other researchers) and dependability (stability of findings over time) of the analysis were ensured by keeping a careful record of audit trail notes that captured methodological procedures, data reduction steps, and data reconstruction products (Shenton, 2004; Anney, 2014). The credibility of the results (confidence in the rigor of the findings) was ensured through analyst triangulation, as frequent debriefing sessions helped the researchers address any biases and assumptions brought to the interpretative analysis (Anney, 2014; Pandey and Patnaik, 2014).

To further explore patterns in the belief systems of the research participants, an agglomerative hierarchical cluster analysis was conducted to classify faculty based upon the overall profile of their beliefs about teaching and learning. By running a matrix coding query in NVivo, a table was generated that indicated which beliefs were expressed by each participant, in which "0" showed absence of a belief and "1" showed presence of a belief. These categorical, nominal data were then uploaded to IBM SPSS 25 to perform the cluster analysis. The agglomerative procedure began with each participant representing an individual cluster and then successively merged clusters together until a hierarchy of nested groupings was created (Frades and Matthiensen, 2010). Since the data were nominal, Pearson's correlation similarity measure was selected to measure the association between the variables (Wilks, 2014).
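Although the clustering itself was run in SPSS, the general procedure can be illustrated with a minimal Python sketch. The binary matrix below is randomly generated placeholder data (not the study data), the number of belief codes is an assumption, and average linkage is shown simply as one possible choice of linkage method.

```python
# Illustrative sketch only: agglomerative hierarchical clustering of a binary
# participant-by-belief matrix using a Pearson-correlation-based distance,
# analogous in spirit to the SPSS analysis described above.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
# Rows = participants, columns = belief codes; 1 = belief expressed, 0 = absent.
beliefs = rng.integers(0, 2, size=(19, 26))

# Dissimilarity between belief profiles: 1 - Pearson correlation.
distances = pdist(beliefs, metric="correlation")

# Agglomerative clustering: each participant starts as its own cluster and
# clusters are successively merged (average linkage shown here).
merge_tree = linkage(distances, method="average")

# Cut the hierarchy into a chosen number of clusters (e.g., 4) and inspect membership.
labels = fcluster(merge_tree, t=4, criterion="maxclust")
print(labels)
```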

Analysis of classroom observations and course artifacts

Blumberg's Learner-Centered Teaching Rubric (LCTR), which was developed as a tool for instructors' self-assessment of their teaching, was utilized to analyze participants' classroom observations and course artifacts (Blumberg, 2009). The rubric was used to evaluate video observations of two class periods taught by each participant (resulting in a total of 38 videotaped classes), as well as each participant's course syllabus, lecture notes used during the two videotaped classes, and multiple assessments (e.g., homework, midterm exam, final exam) that were used to measure student understanding of the material introduced during the videotaped classes. The rubric is composed of five dimensions: (I) the function of content, (II) the role of the instructor, (III) the responsibility for learning, (IV) the purposes and processes of assessment, and (V) the balance of power. Each dimension consists of several components. The original rubric was modified to exclude components that were not relevant for the analysis of the obtained classroom observations and course artifacts. For example, within the dimension of the purposes and processes of assessment, we excluded the "justification of the accuracy of answers" component: grading disagreements are often discussed during office hours or after class, and since our video records captured only classroom practices, we could not analyze any discourse that occurred outside of class. Thus, the modified rubric included 14 components across the five dimensions (the modified rubric can be found in Appendix 4, ESI).

Instructors' classroom practices and course artifacts were coded across the 14 components using a four-point scale: "1" for employing teacher-centered approaches, "2" for showing lower-level transitioning to learner-centered approaches, "3" for showing higher-level transitioning to learner-centered approaches, and "4" for employing learner-centered approaches. Here, teacher-centered approaches refer to using lecture as the main method of teaching, whereas learner-centered approaches imply the use of various methods that shift the role of the instructor from giver of information to facilitator of student learning (Blumberg, 2009). The first and second authors simultaneously coded the video observations and course artifacts of four participants. Each coding session was followed by a debriefing session, where the researchers discussed their assignment of points for each of the components and resolved any disagreements. After high coding consistency was achieved between the researchers, they each independently coded the video observations and course artifacts of seven more participants. Note that one participant was excluded from this analysis because the researchers were not able to obtain the syllabus for their course.

Results

Types of beliefs about teaching and learning

Three themes that cut across the entire data corpus were identified through constant-comparative analysis: Beliefs about Students, Beliefs about How Students Learn, and Beliefs about Content. The codes within each of these themes were grouped into meaningful categories. Note that the terms “beliefs” and “ideas” are used interchangeably in the section below.
Beliefs about students. Beliefs about Students fell into two categories: beliefs that highlight student differences (n = 12) and beliefs that are general to all students (n = 8). The unique individual beliefs under these two categories are presented in Table 2.
Table 2 Themes, categories, and codes that capture beliefs of assistant chemistry professors. Note that the total number of instances for each theme is greater than the number of faculty in this study (N = 19) because one participant could express multiple beliefs and, therefore, be assigned to multiple codes within one theme
Category Code/belief Faculty, n
Theme I: Beliefs about Students
Highlighting student differences (n = 12) Different students possess different ability to grasp the material 10
• With help students can get better 6
• Instructor does not aim to reach all students 4
Different students put in different level of effort 4
International students are reluctant to participate 1
Non-major students are intimidated by chemistry 1
General to all students (n = 8) Humans have limited attention spans/working memory capacity 3
Students need to assume responsibility for their learning 3
Students are afraid to be judged by their peers 2
Theme II: Beliefs about How Students Learn
Mechanisms through which learning occurs (n = 17) Learn better by doing/thinking, not listening 8
By listening to the instructor 6
When making connections between concepts 5
By paying attention 4
By repetition 4
When applying their knowledge 3
When being conceptually engaged 2
When being reflective 1
When being conceptually challenged 1
Context in which learning occurs (n = 13) Can learn from each other 10
Learn best with instructor's guidance 4
Can learn outside of class 5
Cannot learn outside of class 1
Theme III: Beliefs about Content
Selection of content to prepare students for their future (n = 13) Real-world applications of what students learn 9
Incorporating literature or authentic content 6
Exposing students to a broad range of topics 2
Providing examples that help students remember the topic 2
Content that will make students more interested 1
The goal is student understanding, not content coverage (n = 11) Focus on foundational concepts 9
Teaching too much content is bad for students 7
Curriculum is a flexible agenda 3
Curriculum is a fixed agenda (n = 5) Need to equip students for future courses 4
Need to equip students for the ACS exam 1


The first category of Beliefs about Students captured beliefs that highlight student differences. The most prevalent belief under this category was the idea of heterogeneous distribution of intelligence (Blackwell et al., 2007). Our data suggested that some faculty in our study held growth mindset beliefs (i.e. beliefs that intelligence is a malleable quality that can be developed), whereas others held beliefs that are more aligned with the fixed mindset (i.e., view of intelligence as an unchangeable entity). Ten faculty stated that different students possess different ability to grasp the material and emphasized the distinction between “good students” and “poor students.” Six of these participants believed that with help students can improve, as in the case with participant #9. While explaining why she values group work activities and how she assigns students into groups participant #9 stated: “Some students will be really good at concepts and others will be not. And I don’t want all of [the] really good top students always bonding together… I like to randomize, so there's a good mix of abilities and skill levels in groups… But when I allow group work to happen, then students can start to help each other in terms of concepts and explanations.” Participant #9 believed that students could help each other refine their understanding of concepts through discussion and explanation. Contrary to the six faculty whose beliefs about students were consistent with the incremental theory of intelligence, four participants’ beliefs were closer to the fixed intelligence side of the fixed-to-growth continuum of beliefs about intelligence (Yeager and Dweck, 2012). For instance, while describing student differences in respect to understanding chemical concepts, participant #12 stated that she did not aim to reach all of her students: “Some people probably digest the new concepts more easily and some people are not, they aren’t quite good at getting some things right away. Everyone is different. So what we do is, definitely we can’t take care of everyone. It's very time consuming.” As implied in the quote, participant #12 believed that due to the limited class time, the students who were not as good at getting the concepts right away were not able to keep up with the pace of content coverage and there was nothing that the instructor could do about it. Four other faculty highlighted student differences in respect to the amount of effort that they put into learning the material. For example, participant #19 shared: “So I definitely have assigned reading from the textbook that they are supposed to do before they come to class. But I know that a majority of them don’t actually do the readings.” Similarly, participant #1 stated that “the best ones will dig in and learn it themselves. And the other ones will not.” The remaining beliefs under this category were idiosyncratic and highlighted differences among specific groups of students (native vs. international, major vs. non-major).

Eight faculty expressed beliefs that were general to all students (second category). Three faculty discussed the idea that humans have limited attention spans and/or limited working memory capacity. For example, when describing how she ensured student learning, participant #11 stated that during class she asked her students multiple questions because “all people, you know, have limited attention spans and whenever I ask questions, they’ll actually, actually really pay attention to you and then they’re thinking about the question.” Other faculty (n = 3) believed that in order for students to learn in the course, they needed to assume responsibility for their learning. This belief was best embodied by participant #10 who stated: “I can’t make them learn it, you know, I’m not there holding their hand while they’re actually studying so I give it to them the first time, I give them problem sets and quizzes and stuff to try to help them get to those goals, but I can’t, yeah I can’t force it into their heads, I’ve learned.” The final belief under this category was the idea that students were reluctant to participate in the educational process because of the fear of being judged by their classmates (n = 2). For example, while describing her difficulties with encouraging students to ask questions, participant #19 shared that “most of the time people don’t ask because they’re embarrassed to ask a question in a group so big.”

Beliefs about how students learn. Beliefs about How Students Learn fell into two categories: mechanisms through which learning occurs (n = 17) and context in which learning occurs (n = 13). The unique individual beliefs under these categories are presented in Table 2.

The most prevalent belief under the first category was the idea that students learn better by doing and/or thinking, not listening (n = 8), as exemplified by the following quote from participant #19: “I think that if they are forced to do something on their own rather than just sit there and listen to me, then they actually like retain more of what we’re going over.” These participants made a clear comparison between learning by doing/thinking and learning by listening and expressed their preference for the former. Interestingly, the second most prevalent idea under this category was a contrasting belief to the one above. Six participants stated that students learn best when listening to the instructor. For example, when asked how he maximizes student learning in his classroom, participant #1 explained: “If the lecture comes off kind of like a story, you know, even if it's not in a traditional sense a story, but if it, if one thing leads to the next logically, I think students remember it better. And I base that simply on the fact that that seemed to be how I learn best.” A related belief that students learn by paying attention was expressed by four participants. For example, when asked to explain why he thinks that “presenting information in a more interesting way” was critical for student learning, participant #2 stated: “I think that better presenting the information would keep the students more engaged and that sort of leads to better understanding and comprehension.” Other participants (n = 4) believed that students learned by repetition, which was exemplified by participant #13: “I think if a concept is not practiced it will be forgotten soon after you pass that course, no matter what grade you get… The more practice we do, the more settled that concept will be in, in our minds.” These participants valued exposing their students multiple times to the same concept or problem to achieve deeper learning. Another belief, consistent with the theory of meaningful learning (Novak, 1993), was the idea that students learn when making connections between concepts (n = 5). This belief was a key point in the response of participant #10 when explaining how he determined whether or not his students understood the material: “They should be able to work through like a challenging comprehension problem. And one of the things that I struggle with in that, is that I am trying to push them towards seeing how all of these things are interconnected. Because they really like to learn each topic in a vacuum and when they see that, oh, lipid oxidation is connected to the citric acid cycle, to me that's a good sign that they understand what we’re talking about.” Finally, a few other, less prevalent beliefs describing the mechanisms through which learning occurs included the ideas that students learn when being conceptually engaged, when reflecting on their learning, and when being conceptually challenged.

The second category under Beliefs about How Students Learn captured beliefs about context in which learning occurs. The most prevalent code under this category was the belief that students can learn from each other (n = 10): “I’ve given independent take-home quizzes where they are encouraged to work with their classmates, again, to kind of promote discussion about the topic with the idea that hopefully, if they have a misconception about a particular topic or idea, then working in groups will clarify some of the material” (participant #8). Some of the participants, whose responses fell under this code, believed that peer explanations sometimes were even more effective than an instructor's explanation, as was the case with participant #12: “sometimes, it's probably more easy for them to grasp the ideas from their peers rather than from me.” Others added that the “best way to learn is to teach,” which, in their opinion, explained why not only the students who were being taught benefit from peer instruction, but also the students who teach are able to deepen their understanding when explaining concepts to their classmates. Four other participants expressed the belief that students learn best with instructor's guidance: “I provide a little help so they know where to go, but that it's still challenging enough that it's interesting. I think the biggest lasting learning experience is when they actively, you know, work through examples that are at the right level for them, but they’re still a little guided” (participant #3). Finally, four participants expressed the idea that students could learn independently outside of the classroom, whereas one participant stated the opposite, suggesting that students could not learn outside of the classroom.

Beliefs about content. Beliefs about Content fell into three categories: selection of content to prepare students for their future (n = 13), the main goal was student understanding not content coverage (n = 11), and curriculum was a fixed agenda (n = 5). The unique individual beliefs under these three categories are presented in Table 2.

The most prevalent belief under the first category was the idea of the importance of selecting content that will help students recognize the real-world applications of what they learn (n = 9). This belief is evident in the response of participant #13: “What I’m trying to teach my students is, given that knowledge, how can we apply them [sic] in real problems? How can we take that knowledge and solve a problem that we face in science?” A related belief of the importance of incorporating scientific literature/authentic content was expressed by 6 faculty in this study. Consider the quote from participant #16, who expressed a belief that exposing students to scientific literature prepares them for their future research activity: “I actually assigned one paper for each student. So I am kind of training them for a lab grade. Many of them have no idea what bioanalytics is, but I just give them a paper and my hope is they learn from it.” Other less prevalent beliefs under this category included ideas such as the importance of exposing students to a broad range of topics (n = 2) to give them a “big picture” of the field, the importance of using examples that would help students remember the material (n = 2), and selecting content that would increase student interest towards the subject (n = 1).

The second category captures beliefs that communicate that the main goal of instruction was student understanding and not just content coverage (n = 11). The most prevalent belief under this category was the idea that, in order to help students acquire deep understanding of the subject, the instructor needed to spend more time teaching about the discipline's foundational concepts (n = 9). This belief was best embodied in participant #18's response: “There are a lot of things to talk about… exposing students to what is going on is great. But what particular parts of that class do you need to be able to move on… like what is truly foundational. And if everyone understands foundational and has a good, you know, I guess a level playing field to start at, that's more important than just exposing them to everything that could be taught for example in organic chemistry.” Seven faculty expressed another belief in the same vein – the idea that introducing too much content was bad for students. These participants justified this belief by explaining that covering too much information made students feel overwhelmed and, despite an instructor's good intentions, they did not retain much of that material. This was evident in the response of participant #2, who stated: “My first philosophy is to teach much less, but go in much greater depth… I think that there is a mistake with how much material is expected [to be covered]. I think you’re setting them up for failure. Even the ones that don’t fail don’t remember any of it when they walk out of the door.” These participants believed that covering less material more thoroughly promoted student conceptual understanding. The final belief under this category was the idea that curriculum was a flexible agenda. This belief was expressed by three participants who viewed curriculum as interactive and were willing to alter their plan of content coverage to ensure meaning making on the part of students: “After the workshop [CSC NFW] I do see the value in them understanding a topic. And so if we go a little over on a particular topic, I think it's important that I address their questions and that they do understand the material that we are talking about before we move on” (participant #8).

The final category under this theme captured beliefs that suggest that curriculum was a fixed agenda (n = 5). Participants who shared this belief stated that they did not alter the pace of content coverage regardless of their students’ understanding. One participant justified this belief by stating that she could not deviate from her syllabus because she needed to prepare her students for the ACS exam, which tests them on a broad range of topics. Four other participants suggested that it was critical to cover everything to equip students for future courses: “They will continue through Gen Chem I and Gen Chem II and if I don’t cover something, they will have [sic] trouble. I think we have to cover all of the course” (participant #6). A similar idea was expressed by participant #10: “So I made a schedule at the beginning of the semester and I just stuck to it for better or worse. I do feel like there's this mandatory list of topics that I at least have to touch on and so I can’t get myself slowed down by things that are just more complicated.”

Once themes, categories, and codes were identified, participants’ responses were analyzed based on their demographic parameters: sex, years of teaching, and course level taught. No patterns were identified in this analysis (likely due to the small sample size).

Types of belief systems

While it is valuable to detect themes via constant comparative analysis to characterize the teaching beliefs of faculty, some key patterns might not be readily apparent to human coders but are easily discovered when quantitative methods are used (Guest and Mclellan, 2003; Macia, 2015). To further explore patterns in the belief systems of our research participants, an agglomerative hierarchical cluster analysis was conducted to identify groupings of faculty within the sample. Cluster analysis has been and can be appropriately implemented in qualitative studies so long as it is used in exploratory (versus predictive) ways (Guest and Mclellan, 2003; Macia, 2015). This allowed us to make discoveries based on combinations of multiple variables and provided more insight than qualitative analysis alone. Although it is possible to classify faculty based upon the overall profile of their beliefs about teaching and learning without using quantitative approaches, the use of cluster analysis provided more rigor when analyzing similarities and differences in the patterns of the belief profiles of our participants. The cluster analysis was performed based on the previously identified codes under Beliefs about Students, Beliefs about How Students Learn, and Beliefs about Content. Using this approach, both 3- and 4-cluster solutions were supported by the scree plot. The cluster analysis was also conducted using different linkage methods (complete-linkage, centroid-linkage, and Ward's method), and the same 3- and 4-cluster solutions were obtained, indicating the stability of each of the obtained cluster solutions. Additionally, to further ensure stability of the cluster solutions, the cluster analysis was replicated seven additional times, each time shuffling the order in which the objects appeared in the database (Brandriet and Bretz, 2014; Harshman et al., 2017). The same 3- and 4-cluster solutions were obtained every time. Finally, the 3- and 4-cluster solutions were carefully examined by the first and fifth authors. The 4-cluster solution was chosen because it allowed for a description of the most homogeneous profiles of beliefs and interpretation of the unique characteristics of each individual cluster (a dendrogram illustrating the results of the analysis can be found in Appendix 5, ESI). Note that two participants (#5 and #7) were excluded from the cluster analysis as outliers because each expressed only 1 belief about teaching and learning, whereas the rest of the participants expressed, on average, 7 beliefs (the lowest observation was 4 unique beliefs and the highest was 11). The prominent features of each cluster are shown in Table 3. Comparative demographics for each cluster can be found in Appendix 2 (ESI). Below is a description of the patterns in the belief profiles of each cluster across the three themes: Beliefs about Students, Beliefs about How Students Learn, and Beliefs about Content. Given the limited sample size, the cluster solution was used to examine the data in a different way to maximize findings; we do not claim that these clusters will be observed in the broader assistant professor population.
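As an illustration of these stability checks, the sketch below (continuing the earlier placeholder example rather than using the study data) re-runs the clustering under alternative linkage methods and under shuffled participant orders, and uses the adjusted Rand index to confirm that the four-cluster partition is unchanged up to relabeling.

```python
# Sketch of cluster-solution stability checks on placeholder data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.metrics import adjusted_rand_score

def four_clusters(matrix, method):
    """Cluster a binary belief matrix into 4 groups with the given linkage method."""
    distances = pdist(matrix, metric="correlation")
    return fcluster(linkage(distances, method=method), t=4, criterion="maxclust")

rng = np.random.default_rng(1)
beliefs = rng.integers(0, 2, size=(17, 26))   # placeholder: 17 participants after removing outliers

baseline = four_clusters(beliefs, "complete")

# Agreement across linkage methods (adjusted Rand index of 1.0 = identical partitions).
for method in ("average", "ward"):
    print(method, adjusted_rand_score(baseline, four_clusters(beliefs, method)))

# Agreement across shuffled participant orders (the study repeated this seven times).
for trial in range(7):
    order = rng.permutation(len(beliefs))
    shuffled_labels = four_clusters(beliefs[order], "complete")
    restored = np.empty_like(shuffled_labels)
    restored[order] = shuffled_labels          # map labels back to the original order
    print("trial", trial, adjusted_rand_score(baseline, restored))
```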
Table 3 Patterns in the belief profiles of each cluster
Cluster label Beliefs about students Beliefs about how students learn Beliefs about content
1. Student-centered & consistent (n = 7) • Students possess different ability to grasp the material, but with instructor's help they can get better (n = 5) • Students can learn from each other (n = 7) • Incorporate literature or authentic content (n = 4)
• Students learn better by doing/thinking, not listening (n = 5) • Teach about real-world applications of what students learn (n = 5)
• The focus is on few foundational concepts that students will use in future (n = 5)
• Too much content is bad (n = 4)
2. Transitional & consistent (n = 3) • Students put in different level of effort (n = 2) • Students learn by listening to the instructor (n = 2) • Students need exposure to a broad variety of topics (n = 2)
• Students learn by paying attention (n = 3)
• Students can learn from each other (n = 3)
3. Instructor-centered & inconsistent (n = 3) • Students need to assume responsibility for their learning (n = 3) • Students learn better by doing or thinking, not listening (n = 2) • Depth promotes conceptual understanding (n = 2)
• Students learn by listening to the instructor (n = 2) • Curriculum is a fixed agenda (n = 2)
• Students learn when making connections between concepts (n = 2)
4. Limited number of beliefs (n = 4) • Students possess different ability to grasp the material, but instructor doesn't aim to reach all students (n = 2) • Students learn when making connections between concepts (n = 2) • Teach about real world applications of what students learn (n = 4)
• Students learn when being conceptually engaged (n = 2)


Cluster 1: student-centered and consistent beliefs (n = 7). Cluster 1 was represented by both males and females, with a range of teaching experience, some teaching graduate courses and others teaching undergraduate courses. In respect to the Beliefs about Students, faculty in this cluster showed growth mindset beliefs, stating that students possess different abilities to grasp the material but, with the instructor's help, they could improve. When it comes to the Beliefs about Content, most faculty discussed the importance of incorporating literature or authentic content, as well as real-world application examples of what students learn. They also emphasized that teaching too much material led to memorization and that, when selecting content to teach, the focus should be on foundational concepts that are critical for students' success in future courses and research. Finally, in respect to the Beliefs about How Students Learn, all noted that students could learn from each other and most mentioned that students learn better by doing/thinking instead of listening. No faculty in this cluster said that students learned best when listening to the instructor. Thus, this cluster was assigned the label of "Student-centered and consistent beliefs."
Cluster 2: transitional and consistent beliefs (n = 3). Cluster 2 was represented by males only, all teaching graduate courses, with a range of teaching experience. In respect to the Beliefs about Students, faculty stated that different students put in different amounts of effort. When it comes to the Beliefs about Content, participants expressed that students needed exposure to a broad variety of topics. Finally, in respect to the Beliefs about How Students Learn, all said that students learned by paying attention and from each other, but most also said that students learned by listening to the instructor. Thus, this cluster was assigned the label of "Transitional and consistent beliefs."
Cluster 3: instructor-centered and inconsistent beliefs (n = 3). Cluster 3 was represented by males only, with a range of teaching experience, some teaching graduate courses and others teaching undergraduate courses. In respect to the Beliefs about Students, all participants in this cluster stated that students need to assume responsibility for their learning. When it comes to the Beliefs about How Students Learn, the participants showed contradicting beliefs – stating that students learned better by doing/thinking instead of listening, but at the same time noting that students learned best when listening to the instructor. No participants stated that students could learn from each other. Finally, in respect to the Beliefs about Content, the faculty yet again demonstrated contradicting beliefs. Even though they believed that depth promoted conceptual understanding, they saw curriculum as a fixed agenda, meaning that they did not alter the pace of content coverage regardless of their students’ understanding. Therefore, the label assigned to this cluster was “Instructor-centered and inconsistent beliefs.”
Cluster 4: limited number of beliefs (n = 4). Cluster 4 was represented by females only, with a range of teaching experience, most teaching graduate courses. On average, participants in this cluster expressed 5 unique beliefs about teaching and learning, in comparison to an average of 8 unique beliefs expressed by participants in all other clusters. For this reason, there were very few noticeable patterns in the belief profile of this cluster. When it comes to the Beliefs about Students, half of the participants in this cluster expressed fixed mindset beliefs, as they noted that students possessed different abilities to grasp the material and there was nothing that the instructor could do to help students grow in their conceptual understanding. No other patterns were observed within the theme of the Beliefs about Students. In respect to the Beliefs about How Students Learn, half of the participants mentioned that students learned when making connections between concepts and when being conceptually engaged with the material. Finally, when it comes to the Beliefs about Content, all participants in this cluster discussed the importance of teaching about real-world applications of what students learned. Thus, due to the limited number of patterns in the belief profile of this cluster, the label assigned to it was "Limited number of beliefs."

Relationship between belief systems and instructional practices

LCTR scores for participants in each cluster are illustrated in Fig. 2. As can be seen in Fig. 2 under "total," participants in Cluster 1 (student-centered and consistent beliefs) and Cluster 2 (transitional and consistent beliefs) performed similarly on the LCTR (both obtained an average of about 2.5 points on the LCTR 1–4 scale). This mid-scale average signified that, even though participants in Cluster 1 expressed more student-centered beliefs, they were not able to fully translate their vision and beliefs into their classrooms. Participants in Cluster 3 (instructor-centered and inconsistent beliefs) scored an average of 2 points on the LCTR scale, whereas participants in Cluster 4 (limited number of beliefs) obtained the lowest average score of 1.7 points. This suggests an alignment between a smaller number of expressed beliefs about teaching and learning and more instructor-centered practices in the classroom.
Fig. 2 Box plots illustrating total LCTR scores for each cluster, as well as the scores for each cluster across the five dimensions of the LCTR.

To gain additional insight into why participants in different clusters scored the way they did on the LCTR, we analyzed the distributions of clusters' scores across the five dimensions of the LCTR (Fig. 2). As can be seen from Fig. 2, all four clusters showed different performance on dimension I – "the function of content," with Cluster 2 participants scoring the highest, indicating that they placed the largest emphasis on using content to help students acquire an in-depth understanding of the material and develop critical thinking skills. Fig. 2 also illustrates that Clusters 1 and 2 outperformed Clusters 3 and 4 on dimension III – "the responsibility for learning," dimension IV – "the purposes and processes of student assessment," and dimension V – "the balance of power." Thus, in comparison to Clusters 3 and 4, participants in Clusters 1 and 2 used a higher number of formative assessments, allowed for more flexibility in course policies, and set student expectations that enabled the responsibility for learning to be shared between the instructor and students. Finally, Clusters 1, 2, and 3 outperformed Cluster 4 on dimension II – "the role of the instructor." This suggests that participants in Cluster 4 did not utilize a variety of teaching techniques in their classrooms, making lecture their chief method of teaching. Additionally, participants in this cluster failed to align the learning goals for their course with the assessment methods used to gauge student attainment of these goals.
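For readers interested in how such summaries can be assembled, the short sketch below uses made-up scores (not the study's data; the column names are assumptions) to show one way of computing per-cluster LCTR averages and dimension-level box plots of the kind displayed in Fig. 2.

```python
# Minimal sketch with made-up scores: summarize LCTR scores by cluster and dimension.
import pandas as pd
import matplotlib.pyplot as plt

# One row per participant per rubric dimension, scored on the 1-4 LCTR scale.
scores = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "cluster":     [1, 1, 2, 2, 3, 3, 4, 4],
    "dimension":   ["I", "II", "I", "II", "I", "II", "I", "II"],
    "score":       [2.7, 2.4, 2.8, 2.5, 2.1, 1.9, 1.6, 1.7],
})

# "Total" panel: mean LCTR score for each cluster.
print(scores.groupby("cluster")["score"].mean())

# Dimension-level panels: distribution of scores within each cluster.
scores.boxplot(column="score", by=["dimension", "cluster"], grid=False)
plt.ylabel("LCTR score (1 = teacher-centered, 4 = learner-centered)")
plt.show()
```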

Conclusions and discussion

This study sought to identify beliefs about teaching and learning of assistant chemistry professors from research-intensive institutions, as well as the relationship between their beliefs and instructional practices. Three themes describing instructors’ beliefs were identified: Beliefs about Students, Beliefs about How Students Learn, and Beliefs about Content.

The most commonly discussed idea under Beliefs about Students was the belief that different students were willing to put in different levels of effort (i.e., some work harder than others) and that different students possessed different abilities to grasp the material (i.e., not all students are equally capable). Faculty who expressed these ideas emphasized the distinction between "good students" and "poor students" and expressed a range of beliefs along the fixed-to-growth mindset continuum (Yeager and Dweck, 2012). Some stated that with help students could get better, while others explained that there was nothing the instructor could do to help the "poor students." The presence of beliefs congruent with a fixed theory of intelligence is problematic, as it has previously been shown that instructors with a more fixed mindset tend to implement fewer active-learning strategies and are resistant to change (Aragón et al., 2018). Similar findings were described by Padilla and Garritz (2015), who reported that faculty from universities in Mexico also distinguished between "good" and "bad" students when interviewed about their teaching. Prawat (1992, p. 363) cautioned that teachers who place a strong focus on students' individual differences are more driven towards "mindless eclecticism in their instructional style," where the focus is solely on the use of multiple teaching methods to accommodate the needs of different students, instead of on careful selection of content to allow for student thinking and sense-making.

With respect to the Beliefs about How Students Learn, faculty in this study expressed multiple productive beliefs, such as that students were "able to learn from each other" and that students learned "when being conceptually engaged" and "when making connections between concepts." These findings resonate with the work of Hora (2014), who also identified that science faculty's beliefs about student learning include the ideas that learning is best facilitated through active, hands-on engagement with the material and through active construction of understanding. Hora also reported that faculty believe that students learn through repeated exposure to a topic/idea, as well as through osmosis – by being in the presence of and listening to an expert; both of these ideas were also expressed by chemistry faculty in this study. Similarly, Gess-Newsome and colleagues (Gess-Newsome et al., 2003) reported that science faculty in their study held beliefs that represented opposite ends of the constructivist to teacher-centered continuum: some argued that learning is much more than the rote acquisition of knowledge, whereas others advocated for the transmission model of learning, in which the instructor's ability to "tell story well" is equated with student learning. Finally, faculty in this study also expressed the belief that students learned "when paying attention." Prawat (1992) argued that those who hold this belief are in fact "naïve" constructivists, as they often equate activity with learning and see student interest in the classroom as both a necessary and sufficient condition for learning.

The final theme identified was Beliefs about Content. Most of the beliefs under this theme illustrated that faculty in this study valued student understanding and selected content that was useful for preparing students for their future research activities/careers. Multiple beliefs under this theme were congruent with those identified by Schultz and colleagues (Schultz et al., 2018): faculty value content with a focus "on the big picture," "on fundamentals," "on making connections between topics," and "on real world applications." A few faculty in our study, however, also expressed the belief that "curriculum is a fixed agenda" – an idea previously identified and discussed by Prawat (1992), as well as by Padilla and Garritz (2015). The participants who shared this belief stated that they did not alter the pace of content coverage regardless of their students' understanding.

Overall, the sophistication of the beliefs of assistant chemistry faculty was somewhat superficial. Participants expressed a range of ideas, from less productive to more productive; however, most of the articulated beliefs lacked depth. Thus, when discussing how students learn, participants' explanations featured separate fragments rather than integrated structures one might call "theories of learning" (e.g., constructivism, Ausubel and Novak's theory of meaningful learning, knowledge construction as integration of multiple types of learning) (Novak, 1993; Bretz, 2001; National Academies of Sciences, Engineering, and Medicine, 2018).

Since no patterns were identified when comparing the beliefs of faculty across different demographic parameters (sex, years of teaching experience, and course level taught), an agglomerative hierarchical cluster analysis was conducted to classify faculty based upon the overall profile of their beliefs about teaching and learning. The analysis clustered faculty with student-centered and consistent beliefs (Cluster 1), with transitional and consistent beliefs (Cluster 2), with instructor-centered and inconsistent beliefs (Cluster 3), and those who elaborated least on their teaching (Cluster 4). Although this study explored the belief systems of a fairly homogeneous sample of participants, there was noticeable variation in the sophistication of faculty's beliefs. According to the Teacher-Centered Systemic Reform Model, past experiences as students or contextual factors (e.g., cultural, school, and classroom contexts) could explain this variability (Gess-Newsome et al., 2003). Indeed, six of the faculty spontaneously shared that their beliefs were anchored in their prior experiences as students.
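For readers interested in the mechanics of this step, the sketch below illustrates one way an agglomerative hierarchical cluster analysis could be applied to a binary belief-coding matrix. It is a hypothetical illustration rather than this study's procedure: the input file, the Jaccard dissimilarity, and the average linkage method are assumptions, not the choices reported in the methods.

```python
# Minimal sketch (assumed, not the authors' script) of agglomerative hierarchical
# clustering on a binary belief-coding matrix: rows = participants,
# columns = belief codes (1 = belief expressed, 0 = not expressed).
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster, dendrogram

belief_matrix = np.loadtxt("belief_codes.csv", delimiter=",")  # hypothetical input file

# Pairwise dissimilarity between participants' belief profiles (Jaccard is an assumption)
dist = pdist(belief_matrix, metric="jaccard")

# Agglomerative clustering; average linkage is chosen here for illustration only
Z = linkage(dist, method="average")

# Cut the tree into four groups, mirroring the four clusters reported in this study
labels = fcluster(Z, t=4, criterion="maxclust")
print(labels)

# Dendrogram for visually inspecting how participants merge into clusters
dendrogram(Z, labels=[f"P{i + 1}" for i in range(belief_matrix.shape[0])])
plt.show()
```

Because Ward linkage formally assumes Euclidean distances, an averaging-based linkage is a common companion to binary dissimilarity measures; whichever metric and linkage a study actually uses should be taken from its methods rather than from this sketch.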

In contrast to previous research (Addy and Blanchard, 2010; Douglas et al., 2016; Şen and Sarı, 2018), we did not identify a full alignment or misalignment between beliefs and practices. Beliefs were generally correlated with practices, with beliefs being more advanced than practices. Analysis of participants' instructional practices revealed that participants with student-centered (Cluster 1) and transitional (Cluster 2) beliefs performed similarly on the LCTR and outperformed faculty with instructor-centered beliefs (Cluster 3) and those who expressed a limited number of beliefs (Cluster 4). In fact, Cluster 4 obtained the lowest score on the LCTR scale, which suggests that those who were least articulate about their teaching relied mostly on instructor-centered strategies in the classroom. A deeper analysis of the clusters' performance across the different dimensions of the LCTR supported this conclusion: in comparison to the other clusters, Cluster 4 participants did not utilize a variety of teaching techniques in their classrooms, making lecture their chief method of teaching. Cluster 1 also displayed an interesting result by showing difficulties in enacting beliefs: although faculty in this cluster held student-centered beliefs, they received only moderate scores on the LCTR, which suggests some misalignment between their beliefs and practices. The beliefs of participants in the other clusters were somewhat aligned with their practices as characterized with the LCTR. These results highlight the need for instructional reform facilitators to recognize the diversity of beliefs present within a somewhat homogeneous group of instructors and to differentiate the learning experience accordingly. For example, the Teacher-Centered Systemic Reform Model would suggest that helping faculty in Cluster 1 recognize the dissonance between their beliefs and instructional practices could be fruitful in leading them to change their practices (Gess-Newsome et al., 2003). On the other hand, this approach would not prove effective for faculty in Clusters 2 and 3, since their beliefs and practices are well-aligned and dissatisfaction would not be established.

Limitations

While this study shed light on the beliefs about teaching and learning of assistant chemistry faculty from research-intensive institutions, it has some key limitations. First and foremost, the small sample size (N = 19) and the self-selection bias of respondents do not allow for generalizability of the findings. Although the diversity of the participants minimizes this concern, the authors do not make any generalizability claims. In addition, our study participants were recruited while participating in the CSC New Faculty Workshop. Although this workshop did not aim to directly encourage faculty to reflect on their beliefs about teaching and learning, it could have had an indirect impact on our participants' beliefs.

The small sample size is also problematic for cluster analysis. However, cluster analysis was not utilized in a predictive manner; the authors used it solely in an exploratory fashion, to allow for a deeper qualitative examination of patterns in the belief systems of the research participants. At the same time, some researchers (Guest and Mclellan, 2003; Macia, 2015) argue that cluster analysis is "ideal for most qualitative data" since, unlike other statistical methods, cluster analysis does not find generalizable characteristics; instead, it assists in ordering the available data into clusters. It is suggested that cluster analysis can be both a useful and a powerful tool in qualitative data analysis, as long as researchers apply rigor when developing and applying codes and perform adequate manipulation of the data to make it suitable for cluster analysis (Guest and Mclellan, 2003; Macia, 2015). In this study, we took measures to satisfy these requirements for the adequate application of cluster analysis to qualitative data.

Beliefs can only be inferred from what people say; they cannot be directly measured or observed. If a participant holds a belief but does not vocalize it, it is impossible to capture this specific belief. Therefore, it is possible that we have not detected all of the beliefs about teaching and learning held by our research participants. An open-ended interview protocol was used to allow for follow-up questions and additional probing in order to reduce the number of undetected beliefs. It is also possible that, given a longer interview time and an even higher degree of probing, the faculty might have expressed a more intricate system of beliefs.

Finally, two different interviewers collected the data, which could have resulted in inconsistencies in how the interviews were conducted. Measures were taken to minimize this limitation: prior to interviewing the final six participants in Fall 2018, the first author listened to multiple interviews conducted by the third author in Fall 2016 and Spring 2017 in order to maintain a similar level of probing.

Implications and future work

Knowledge of instructors' thinking can inform faculty professional development initiatives that engage faculty in reflecting on the beliefs that drive their classroom practices. It is important to note that the work herein supports the notion that beliefs are very diverse and do not fall under the limiting dichotomy of teacher-centered versus student-centered. The variation in the sophistication of faculty's beliefs indicates that faculty professional development experiences need to be tailored towards providing their participants with more individualized support. In addition, future efforts need to be directed toward identifying channels to challenge faculty to become more self-aware of their beliefs about teaching and learning, to reflect on their teaching, and to identify any disconnect between their teaching beliefs and instructional practices. It is also critical to push faculty to reflect on the potential barriers to enacting their beliefs in their classrooms, with the aim of providing faculty with resources and support to overcome these barriers.

To gain deeper, more generalizable insights, future research should aim to replicate this study with a larger sample of faculty. Additionally, besides monitoring the relationship between faculty's teaching beliefs and instructional practices, more research is needed to understand the impact of these beliefs and practices on students' cognitive and affective outcomes. In particular, it is important to discern whether there is a relationship between higher LCTR scores for faculty's classroom practices (i.e., faculty from the 'student-centered & consistent' and 'transitional & consistent' clusters) and their students' conceptual understanding and/or skills acquisition.

Finally, additional longitudinal research is warranted to identify the advancement of faculty's beliefs about teaching and learning over time. Since participants in this study were interviewed after participating in the CSC NFW, it would be interesting to determine whether any short-term impacts of the workshop have diminished, increased, or stayed constant. Previous research that captured CSC NFW participants' beliefs via surveys reported that a short-term impact was observed after the program, as participants' beliefs became more student-centered; however, the observed gain diminished a year later (Stains et al., 2015). Luft and Roehrig (2007) explained that beginning faculty's beliefs are more likely to change than those of their more experienced colleagues. Therefore, future longitudinal research is needed to provide additional evidence on the development of beliefs of novice faculty.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

This work is supported by the National Science Foundation CAREER 1552448. We thank P. Blumberg, R. M. Erdmann, and Z. Nelson for their feedback on the use of LCTR.

References

  1. Addy T. M. and Blanchard M. R., (2010), The problem with reform from the bottom up: Instructional practises and teacher beliefs of graduate teaching assistants following a reform-minded university teacher certificate programme, Int. J. Sci. Educ., 32(8), 1045–1071,  DOI:10.1080/09500690902948060.
  2. Anney V. N., (2014), Ensuring the Quality of the Findings of Qualitative Research: Looking at Trustworthiness Criteria, J. Emerging Trends Educ. Res. Policy Stud., 5(2), 272–281.
  3. Aragón O. R., Eddy S. L. and Graham M. J., (2018), Faculty beliefs about intelligence are related to the adoption of active-learning practices, CBE Life Sci. Educ., 17, 1–9,  DOI:10.1187/cbe.17-05-0084.
  4. Baker L. A., Chakraverty D., Columbus L., Feig A. L., Jenks W. S., Pilarz M., Stains M., Waterman R., Wesemann J. L., (2014), Cottrell Scholars Collaborative New Faculty Workshop: Professional development for new chemistry faculty and initial assessment of its efficacy, J. Chem. Educ., 91, 1874–1881,  DOI:10.1021/ed500547n.
  5. Bazeley P. and Jackson K., (2013), in Seaman J. (ed.), Qualitative Data Analysis with Nvivo, 2nd edn, Thousand Oaks, CA: Sage Publications Ltd.
  6. Bennett W. D. and Park S., (2011), Epistemological Syncretism in a Biology Classroom: A Case Study, J. Sci. Educ. Technol., 20(1), 74–86,  DOI:10.1007/s10956-010-9235-6.
  7. Birks M., Chapman Y. and Francis K., (2008), Memoing in qualitative research: Probing data and processes, J. Res. Nurs., 13(1), 68–75.
  8. Blackwell L. S., Trzesniewski K. H. and Dweck C. S., (2007), Implicit Theories of Intelligence Predict Achievement Across an Adolescent Transition: A Longitudinal Study and an Intervention, Child Dev., 78(1), 246–263.
  9. Blumberg P., (2009), Developing learner-centered teaching: a practical guide for faculty, 2nd edn, San Francisco, CA: John Wiley & Sons.
  10. Bradley E. H., Curry L. A. and Devers K. J., (2007), Qualitative data analysis for health services research: developing taxonomy, themes, and theory, Health Serv. Res., 42(4), 1758–1772,  DOI:10.1111/j.1475-6773.2006.00684.x.
  11. Brandriet A. R. and Bretz S. L., (2014), Measuring meta-ignorance through the lens of confidence: Examining students’ redox misconceptions about oxidation numbers, charge, and electron transfer, Chem. Educ. Res. Pract., 15, 729–746,  10.1039/c4rp00129j.
  12. Bretz S. L., (2001), Novak's theory of education: Human constructivism and meaningful learning, J. Chem. Educ., 78(8), 1107.
  13. Center for Postsecondary Research, (n.d.), Carnegie Classification of Institutions of Higher Education, Retrieved May 24, 2019, from http://carnegieclassifications.iu.edu.
  14. Chapman L. A. Y. and McConnell D. A., (2018), Characterizing the Pedagogical Beliefs of Future Geoscience Faculty Members: a Mixed Methods Study, Innovative Higher Educ., 43(3), 185–200,  DOI:10.1007/s10755-017-9416-9.
  15. Clarke D. and Hollingsworth H., (2002), Elaborating a model of teacher professional growth, Teach. Teach. Educ., 18, 947–967.
  16. Connor M. C. and Shultz G. V., (2018), Teaching assistants’ topic-specific pedagogical content knowledge in 1H NMR spectroscopy, Chem. Educ. Res. Pract., 19, 653–669,  10.1039/c7rp00204a.
  17. Creswell J. W., (2003), in Laughton C. D. (ed.), Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 2nd edn, Thousands Oaks: Sage Publications.
  18. Czajka C. D. and McConnell D., (2016), Situated instructional coaching: a case study of faculty professional development, Int. J. STEM Educ., 3(10), 1–14,  DOI:10.1186/s40594-016-0044-1.
  19. Czajka C. D. and McConnell D., (2019), The adoption of student-centered teaching materials as a professional development experience for college faculty, Int. J. Sci. Educ., 41(5), 693–711,  DOI:10.1080/09500693.2019.1578908.
  20. Dolphin G. R. and Tillotson J. W., (2015), “Uncentering” teacher beliefs: The expressed epistemologies of secondary science teachers and how they relate to teacher practice, Int. J. Environ. Sci. Educ., 10(1), 21–38,  DOI:10.12973/ijese.2015.228a.
  21. Douglas J., Powell D. N. and Rouamba N. H., (2016), Assessing graduate teaching assistants’ beliefs and practices, J. Excellence Coll. Teach., 27(3), 35–61.
  22. Drever E., (1995), Using semi-structured interviews in small-scale research. A teacher's guide. Edinburgh: The SCRE Centre.
  23. Fernandez C., (2014), Knowledge base for teaching and pedagogical content knowledge (PCK): Some useful models and implications for teachers’ training, Probl. Educ. 21st Century, 60, 79–100. Retrieved from https://www.researchgate.net/publication/282330568.
  24. Feyzioğlu E. Y., (2012), Science teachers’ beliefs as barriers to implementation of constructivist-based education reform, J. Balt. Sci. Educ., 11(4), 302–317.
  25. Fletcher S. S. and Luft J. A., (2011), Early career secondary science teachers: A longitudinal study of beliefs in relation to field experiences, Sci. Educ., 95(6), 1124–1146,  DOI:10.1002/sce.20450.
  26. Frades I. and Matthiensen R., (2010), Overview on Techniques in Cluster Analysis, in Matthiesen R. (ed.), Bioinformatics Methods in Clinical Research, pp. 81–107,  DOI:10.1007/978-1-60327-194-3.
  27. Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H. and Wenderoth M. P., (2014), Active learning increases student performance in science, engineering, and mathematics, Proc. Natl. Acad. Sci. U. S. A., 111(23), 8410–8415.
  28. Gardner G. E. and Parrish J., (2019), Biology graduate teaching assistants as novice educators: Are there similarities in teaching ability and practice beliefs between teaching assistants and K–12 teachers? Biochem. Mol. Biol. Educ., 1–7,  DOI:10.1002/bmb.21196.
  29. Gess-Newsome J., Southerland S. A., Johnston A. and Woodbury S., (2003), Educational Reform, Personal Practical Theories, and Dissatisfaction: The Anatomy of Change in College Science Teaching, Am. Educ. Res. J., 40(3), 731–767,  DOI:10.3102/00028312040003731.
  30. Gormally C., (2016), Developing a Teacher Identity: TAs’ Perspectives About Learning to Teach Inquiry-based Biology Labs, Int. J. Teach. Learn. Higher Educ., 28(2), 176–192. Retrieved from http://www.isetl.org/ijtlhe/.
  31. Gudmundsdottir S. and Shulman L., (1987), Pedagogical Content Knowledge in Social Studies, Scand. J. Educ. Res., 31(2), 59–70,  DOI:10.1080/0031383870310201.
  32. Guest G. and Mclellan E., (2003), Distinguishing the Trees from the Forest: Applying Cluster Analysis to Thematic Qualitative Data, Field Methods, 15(2), 186–201,  DOI:10.1177/1525822X03251188.
  33. Harshman J., Yezierski E. and Nielsen S., (2017), Putting the R in CER: How the statistical program R transforms research capabilities, ACS Symp. Ser., 1260, 65–90,  DOI:10.1021/bk-2017-1260.ch006.
  34. Hora M. T., (2014), Exploring faculty beliefs about student learning and their role in instructional decision-making, Rev. Higher Educ., 38(1), 37–70,  DOI:10.1353/rhe.2014.0047.
  35. Kagan D., (1992), Implication of Research on Teacher Belief, Educ. Psychol., 1, 65–90.
  36. Kane R., Sandretto S. and Heath C., (2004), An investigation into excellent tertiary teaching: Emphasising reflective practice, Higher Educ., 47, 283–310.
  37. Lee S. W., (2019), The Impact of a Pedagogy Course on the Teaching Beliefs of Inexperienced Graduate Teaching Assistants, CBE Life Sci. Educ., 18, 1–12,  DOI:10.1187/cbe.18-07-0137.
  38. Luft J. A. and Roehrig G. H., (2007), Capturing Science Teachers’ Epistemological Beliefs: The Development of the Teacher Beliefs Interview, Electron. J. Sci. Educ., 11, Retrieved from http://ejse.southwestern.edu.
  39. Macia L., (2015), Using Clustering as a Tool: Mixed Methods in Qualitative Data Analysis, Qualitative Rep., 20, Retrieved from http://www.nova.edu/ssss/QR/QR20/7/macia3.pdf.
  40. Mansour N., (2013), Consistencies and Inconsistencies Between Science Teachers’ Beliefs and Practices, Int. J. Sci. Educ., 35(7), 1230–1275,  DOI:10.1080/09500693.2012.743196.
  41. Mattheis A. and Jensen M., (2014), Fostering improved anatomy and physiology instructor pedagogy, Adv. Physiol. Educ., 38, 321–329,  DOI:10.1152/advan.00061.2014.
  42. Mavhunga E. and Rollnick M., (2016), Teacher- or Learner-Centred? Science Teacher Beliefs Related to Topic Specific Pedagogical Content Knowledge: A South African Case Study, Res. Sci. Educ., 46, 831–855,  DOI:10.1007/s11165-015-9483-9.
  43. Moore T. J., Guzey S. S., Roehrig G. H., Stohlmann M. S., Park M. S., Kim Y. R., H. L. Callender, Teo H. J., (2015), Changes in Faculty Members’ Instructional Beliefs while Implementing Model-Eliciting Activities, J. Eng. Educ., 104(3), 279–302,  DOI:10.1002/jee.20081.
  44. National Academies of Sciences Engineering and Medicine, (2018), How People Learn II: Learners, Context, and Cultures,  DOI:10.17226/24783.
  45. Neumann K., Kind V. and Harms U., (2018), Probing the amalgam: The relationship between science teachers’ content, pedagogical and pedagogical content knowledge, Int. J. Sci. Educ., 41(7), 847–861,  DOI:10.1080/09500693.2018.1497217.
  46. Novak J. D., (1993), Human constructivism: a unification of psychological and epistemological phenomena in meaning making, Int. J. Pers. Constr. Psychol., 6, 167–193.
  47. Padilla K. and Garritz A., (2015), Tracing a research trajectory on PCK and chemistry university professors’ beliefs, in Berry A., Friedrichsen P. and Loughran J. (ed.), Re-examining pedagogical content knowledge in science education, 1st edn, New York: Routledge, pp. 75–87.
  48. Pajares M. F., (1992), Teachers’ Beliefs and Educational Research: Cleaning Up a Messy Construct, Rev. Educ. Res., 62(3), 307–332,  DOI:10.3102/00346543062003307.
  49. Pandey S. C. and Patnaik S., (2014), Establishing reliability and validity in qualitative inquiry: a critical examination, J. Dev. Manage. Stud. XISS, 12(1), 5743–5753. Retrieved from https://www.researchgate.net/publication/266676584.
  50. Patton M. Q., (2002), Qualitative Research & Evaluation Methods, 2nd edn, Thousand Oaks, CA: Sage Publications, Inc.
  51. Pelch M. A. and McConnell D. A., (2016), Challenging instructors to change: a mixed methods investigation on the effects of material development on the pedagogical beliefs of geoscience instructors, Int. J. STEM Educ., 3(5), 1–18,  DOI:10.1186/s40594-016-0039-y.
  52. Phelps A. J. and Lee C., (2003), The Power of Practice: What Students Learn from How We Teach, J. Chem. Educ., 80(7), 829–832.
  53. Pratt J. M. and Yezierski E. J., (2018), A novel qualitative method to improve access, elicitation, and sample diversification for enhanced transferability applied to studying chemistry outreach, Chem. Educ. Res. Pract., 19, 410–430,  10.1039/C7RP00200A.
  54. Pratt J. M. and Yezierski E. J., (2019), “You Lose Some Accuracy When You’re Dumbing it Down”: Teaching and Learning Ideas of College Students Teaching Chemistry through Outreach, J. Chem. Educ., 96(2), 203–212,  DOI:10.1021/acs.jchemed.8b00828.
  55. Prawat R. S., (1992), Teachers’ beliefs about teaching and learning: A constructivist perspective, Am. J. Educ., 100(3), 354–395,  DOI:10.1086/444021.
  56. Saldaña J., (2013), The Coding Manual for Qualitative Researchers, Seaman J. (ed.), 2nd edn, Thousand Oaks, CA: Sage Publications Inc.
  57. Schultz M., Lawrie G. A., Bailey C. H. and Dargaville B. L., (2018), Characterisation of teacher professional knowledge and skill through content representations from tertiary chemistry educators, Chem. Educ. Res. Pract., 19, 508–519,  10.1039/c7rp00251c.
  58. Şen Ö. F. and Sarı U., (2018), From Traditional To Reform-Based Teaching Beliefs and Classroom Practices of Elementary Science Teachers, Int. J. Innovation Sci. Math. Educ., 26(6), 76–95.
  59. Shadle S. E., Marker A. and Earl B., (2017), Faculty drivers and barriers: laying the groundwork for undergraduate STEM education reform in academic departments, Int. J. STEM Educ., 4(8), 1–13,  DOI:10.1186/s40594-017-0062-7.
  60. Shenton A. K., (2004), Strategies for ensuring trustworthiness in qualitative research projects, Educ. Inf., 22, 63–75.
  61. Shulman L. S., (1986), Knowledge Growth in Teaching, Am. Educ. Res. Assoc., 15(2), 4–14.
  62. Stains M., Pilarz M. and Chakraverty D., (2015), Short and Long-Term Impacts of the Cottrell Scholars Collaborative New Faculty Workshop, J. Chem. Educ., 92(9), 1466–1476,  DOI:10.1021/acs.jchemed.5b00324.
  63. Stains M., Harshman J., Barker M. K., Chasteen S. V., Cole R., DeChenne-Peters S. E., M. K. Eagan, J. M. Esson, J. K. Knight, F. A. Laski, M. Levis-Fitzgerald, C. J. Lee, S. M. Lo, L. M. McDonnell, T. A. McKay, N. Michelotti, A. Musgrove, M. S. Palmer, K. M. Plank, T. M. Rodela, E. R. Sanders, N. G. Schimpf, P. M. Schulte, M. K. Smith, M. Stetzer, B. Van Valkenburgh, E. Vinson, L. K. Weir, P. J. Wendel, L. B. Wheeler, Young A. M., (2018), Anatomy of STEM teaching in North American universities, Science, 359(6383), 1468–1470,  DOI:10.1126/science.aap8892.
  64. Stender A., Brückmann M. and Neumann K., (2017), Transformation of topic-specific professional knowledge into personal pedagogical content knowledge through lesson planning, Int. J. Sci. Educ., 39(12), 1690–1714,  DOI:10.1080/09500693.2017.1351645.
  65. U.S. Department of Commerce, (n.d.), United States Census Bureau, Retrieved May 24, 2019, from https://www.census.gov.
  66. Wilks D. S., (2014), Cluster Analysis, Concise Guide Mark. Res., 273–324,  DOI:10.1016/B978-0-12-385022-5.00015-4.
  67. Wong S. S. and Luft J. A., (2015), Secondary Science Teachers’ Beliefs and Persistence: A Longitudinal Mixed-Methods Study, J. Sci. Teach. Educ., 26, 619–645,  DOI:10.1007/s10972-015-9441-4.
  68. Yeager D. S. and Dweck C. S., (2012), Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed, Educ. Psychol., 47(4), 302–314,  DOI:10.1080/00461520.2012.722805.

Footnote

Electronic supplementary information (ESI) available. See DOI: 10.1039/c9rp00217k
