Silent and vocal students in a large active learning chemistry classroom: Comparison of performance and motivational factors

Carrie A. Obenland a, Ashlyn H. Munson b and John S. Hutchinson *c
aRice University, Department of Chemistry, MS-60, PO Box 1892, Houston, TX 77251, USA. E-mail: carrieobenland@rice.edu
bPacific Lutheran University, Department of Mathematics, Morken Center, Room 252, Tacoma, WA 98447, USA. E-mail: munsonah@plu.edu
cRice University, Department of Chemistry, MS-60, PO Box 1892, Houston, TX 77251, USA. E-mail: jshutch@rice.edu

Received 14th February 2012, Accepted 29th October 2012

First published on 15th November 2012


Abstract

Active learning is becoming more prevalent in large science classrooms, and this study shows the impact on performance of being vocal during Socratic questioning in a General Chemistry course. Over a two-year period, 800 college students were given a pre- and post-test using the Chemistry Concept Reasoning Test. The pre-test results showed no initial difference in conceptual understanding of chemistry between students who actively participated on a regular basis and those who did not. However, on the post-test, those vocal students who consistently attempted to respond to questions in class showed significantly greater learning gains than the silent students who rarely or never tried to answer questions. For many students, the initial motivation for participating was the minimal extra credit offered for answering questions. Once students started answering questions, they saw the educational value in getting feedback, enjoyed the engagement of the Socratic dialogue, and realized the benefit of putting chemistry concepts into words. Also tied to participation during class was the need to complete the reading assignments before class. Students who read prior to class were better able to participate and were then continually incentivized to stay current with the class readings. The active learning environment benefited all students by providing cognitive and social dimensions during class, and those students who participated frequently learned more chemistry.


Introduction

A strong and growing consensus has developed in the past few decades in favor of active learning approaches in college science classrooms (Donovan et al., 1999; NRC, 2000). “Active learning” generally refers to any of a variety of teaching methods that engage students in participation during class time, including class discussions, “clicker questions,” think-pair-share discussions, peer learning, problem-based learning, and Socratic dialogue (Bonwell and Eison, 1991; Lyman, 1992; King, 1993; Johnson et al., 1998; Crouch and Mazur, 2001; Bruff, 2009). These methods contrast with the dominant conventional lecture approach, in which students are passive recipients of information provided by the instructor (Walczyk and Ramsey, 2003). Research and practice have shown that, on average, students in active learning environments perform better academically, enjoy their courses more, and remain in these classes at a higher rate (Felder et al., 2000; Prince, 2004; Michael, 2006).

We have recently studied students in a large, active learning, introductory Chemistry course, focusing on those students who choose to remain silent, not responding or even attempting to respond to questions and not participating in class discussions (Obenland et al., 2012). Our immediate question was whether these “silent students” derive the same benefits from the active learning classroom as do their vocal counterparts.

To address this question, we began a systematic study of the behaviors, perceptions, learning and performance of these students in comparison to the other students in the class. For reasons reviewed below and discussed in our previous paper (Obenland et al., 2012), our research led us to define two categories of students: vocal and silent. Vocal students are those who participate or attempt to participate in class discussion on a weekly if not daily basis. Silent students are those who rarely or never participate or attempt to participate.

It is important to note that we do not refer to these groups as active and passive. Our study revealed that silent students are not passive, as might have been assumed (Obenland et al., 2012). Watkins et al. (2007) described three dimensions of active learning: behavioral, cognitive, and social. Based on our previous study, we can say that silent students are active learners in two of these dimensions, the cognitive and the social. Our silent students reported via survey that they engaged with the active learning environment by attempting to answer questions quietly to themselves and by listening to the answers given by other students to check against their own. This behavior reflects the cognitive and social dimensions of active learning. Thus, a significant conclusion of our prior research is that, in general, silent students are active, not passive, in their learning.

Our study comparing silent and vocal students showed that all students responded favorably to active learning approaches, but vocal students did respond somewhat more favorably. Both silent and vocal students self-reported that the active learning techniques kept them engaged and enhanced their understanding of the material. Silent students attributed their lack of vocal participation to their learning style or preference rather than worry or perceptions in the classroom (Obenland et al., 2012).

In this paper, our questions primarily concern the comparative learning progress and outcomes for silent and vocal students. Simply stated, do the silent students learn as well as the vocal students? Our primary tool for comparison was the Chemistry Concept Reasoning Test (CCRT), previously presented in this journal (Cloonan and Hutchinson, 2011). The CCRT contains multiple-choice questions that were designed to analyze students’ ability to reason with chemical concepts, including their ability to analyze and interpret data in the context of chemical models. To answer this research question, we employed the CCRT in a paired pre-test and post-test analysis comparing the performances of silent and vocal students.

Our results will show that after a semester in an active learning environment, both groups of students showed substantial improvement on the CCRT, but the vocal students showed a statistically significantly greater improvement. Thus, a second research question addressed in this paper concerns the motivation behind student participation. If vocal students make greater progress in learning outcomes than do silent students, how might teachers motivate silent students to become vocal? To help identify methods of encouraging vocal participation, we have studied the motivational factors and study habits of vocal students in comparison to silent students. Motivation has been the subject of active research in a variety of fields and has prompted multiple theoretical frameworks (Matusovich et al., 2009; Weiner, 1990; Wigfield and Eccles, 2000). Our research incorporates achievement goal theory by assessing both mastery goal and performance goal orientations (Ames and Archer, 1988; Harackiewicz et al., 1997). The results of this study provide insights into how and why vocal students chose to be vocal and thus provide a window into understanding why silent students chose to remain silent. In combination, these results indicate how silent students might be encouraged to become vocal.

In the next section, we describe the active learning environment in the General Chemistry course as well as the methods used to analyze the students in this course. We follow with an analysis of student performance gains on both the CCRT and course examinations. These results are analyzed for motivational factors and for study habits of vocal and silent students. Conclusions and recommendations are presented at the end.

Class environment

To situate this study within an active learning class and to build upon the previous research that characterized both vocal and silent students as active participants, we outline the pertinent aspects of the class in some detail.

Class setting

The population for this study was the first-semester General Chemistry class at Rice University over two years. Enrollment was 434 students for Fall 2010 and 394 students for Fall 2011. The class was approximately 90% freshmen and unevenly divided across three sections each semester. The three sections were taught in tandem by different instructors with a common syllabus, including identical reading assignments, homework, grading scheme, and exams. Students could attend any of the sections, as the content and teaching strategies across the three were consistent. Student learning was assessed with three midterm exams and one final exam. Each exam was divided approximately evenly between challenging traditional chemistry problems and conceptually focused questions that required students to write out explanations of chemical phenomena.

The curriculum for this course included the traditional General Chemistry content but also emphasized the Concept Development approach. Created at Rice, this approach leads students through observations and logical reasoning to the development of chemistry models and major chemical concepts. This concept development focus, implemented via Concept Development Studies in Chemistry (Hutchinson, 2007), has been found to be successful at this institution for a number of years (Hutchinson, 2000).

The active learning approach

The active learning approach for General Chemistry at Rice University has been developed over the past few decades and fits within the three dimensions of active learning outlined by Watkins et al. (2007) as well as the fourth dimension of affect added by Drew and Mackie (2011). Congruent with the concept development approach of providing students with the observations and reasoning behind chemical concepts, students were asked to verbally express their logic and understanding in this course. Socratic questioning was used during class time throughout the course to probe students’ understanding and guide students to the accepted understanding of concepts. During each 50 minute class period, approximately 40 questions were posed to the class. Students were asked to respond by raising their hands, a behavioral dimension of active learning. The instructor would then call on a student, whose response was shared with the class. However, our previous study showed that even those students who were not called on but sat quietly thinking of an answer to the question were actively participating cognitively (Obenland et al., 2012). Students were incentivized to participate within the grading scheme, which also added an external motivation, or affective dimension, to the active learning. Each day a student responded with an answer in class, that student received one point of extra credit towards their grade on a 1000-point scale.

Socratic questions were based on topics from reading assignments that students were asked to complete before coming to class. Thus, students were able to familiarize themselves with the content via reading. Then in class, through questioning and class discussion, the chemical concepts and models were built via the same reasoning presented in the reading. The class discussion format allowed students to ask questions often during class, which the instructor would then ask other students to attempt to answer. The flow of discussion provided an atmosphere where students could learn from each other’s ideas and also added a social dimension to the active learning.

Research methods

Assessment tools

In order to address our research questions, we collected data on student performance and perceptions using a variety of tools, including surveys, interviews, and exam questions. The surveys included multiple-choice questions regarding students’ frequency of participation and Likert-scale questions regarding their perceptions of the active learning classroom. Students were asked to voluntarily participate in this online survey in both Fall 2010 and Fall 2011, approximately three-quarters of the way through the semester. In Fall 2011 only, further multiple-choice questions were added about the students’ consistency in participation and feelings about Socratic questions. In both Fall 2010 and Fall 2011, the survey response rate was 84%.

Survey responses were confidential, as one question regarded willingness to participate in interviews. From those students willing to participate in interviews, a random number generator was used to choose which students, listed alphabetically by university identification, to contact via email. In Fall 2010, 17 students were interviewed, all of whom would be considered in the silent category, as this was the initial focus of our study. In Fall 2011 we expanded the scope of the study to include vocal students. Interviews were conducted with 21 students, 14 vocal and 7 silent. To complement these interviews, one focus group was conducted with four students. The focus group was composed of two vocal and two silent students to allow for conversation between differing students. The interviews and focus group followed a semi-structured protocol based on the survey questions. Questions were along the lines of “What is your motivation for participating?” and “Why do you think so many of your fellow students raise their hands?”.

The study used student responses to the Chemistry Concept Reasoning Test (CCRT) to evaluate any differences in learning between vocal and silent students. As previously reported in this journal, the CCRT was created and validated as a tool for measuring conceptual understanding and scientific reasoning pertaining to general chemistry topics (Cloonan and Hutchinson, 2011). The test is made up of multiple-choice questions and has two analogous forms to allow for pre- and post-testing. Although constrained to a multiple-choice format for ease of administration and grading, the questions use novel structures designed to test students’ understanding of the molecular level, logical reasoning, and the basis of scientific models. The test was validated at Rice University with previous classes of General Chemistry, so its use with this study’s population was particularly appropriate. The CCRT was administered online to General Chemistry students in the first week of the Fall 2010 semester as a pre-test. The test was a voluntary opportunity to earn minimal extra credit based on completion, and 71% of the class completed the test. In the last week of the semester, the post-test version of 23 questions that corresponded with the topics covered in the first semester of General Chemistry was given online to students. Extra credit was assigned based on completion and accuracy, and 86% of the students completed the test.

Certain exam questions were also analyzed to compare the performance of vocal and silent students. A style of question included at least once on every exam was “assess the accuracy”. In an “assess the accuracy” question, a question is stated along with a possible response that students are asked to evaluate, as shown in Fig. 1. This question format provides a venue to test misconceptions and probe higher-level thinking skills. Two midterm exams each included one question of this type, worth 20 points, and one midterm exam included two questions of this type, totalling 30 points. All instrument and interview protocols were approved by the IRB.


Fig. 1 Example of an “assess the accuracy” exam question.

Characterizing vocal and silent students

This study and the previous study (Obenland et al., 2012) aimed to understand the difference between students who participated in the active learning classroom often and those who did not. Our method to categorize students is consistent with our previous study, where one multiple-choice question from the survey was used: How often do you attempt to participate by raising your hand in lecture? Those students who self-reported that they attempted to participate at least once a week were categorized as “vocal.” Those students who self-reported that they attempted to participate less often than once a week were categorized as “silent.” These designations specifically indicate the amount of vocal participation in class, but as discovered previously, even silent students are active participants in the classroom. Of the students who completed the survey for Fall 2010, there were 245 vocal students and 121 silent students. For Fall 2011, there were 223 vocal students and 109 silent students.

Data analysis

The data from Fall 2010 were analyzed for every student who responded to both an online survey and the pre- and post-administrations of the CCRT. This allowed individual student responses to these three instruments to be paired. The survey responses were used to categorize students according to their level of participation. The survey question “How often do you attempt to participate by raising your hand in lecture?” had six possible responses: Once or more per lecture, Once or twice per week, Four to six times per month, One to three times per month, A few times ever, Never (for reference, the class met three times per week). Pre-test scores on the CCRT were compared across these six groups via an Analysis of Variance (ANOVA) to confirm that participation was not motivated by prior knowledge. Across the six categories of participation, there was no significant difference in prior knowledge, with a p-value of 0.324. Since the ANOVA showed no performance difference among the six categories, the remainder of the analyses divided the students into only the two categories of vocal and silent.
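
To illustrate this step, the following Python sketch shows how such a one-way ANOVA could be computed with SciPy. The data file name, column names, and the exact strings stored for the response options are hypothetical stand-ins; the paper does not specify the dataset layout or software used.

# Minimal sketch (hypothetical data layout): one-way ANOVA comparing CCRT
# pre-test scores across the six self-reported participation levels.
import pandas as pd
from scipy import stats

# Assumed columns: "participation" holds one of the six survey responses,
# "ccrt_pre" holds the pre-test score (out of 23 questions).
df = pd.read_csv("fall2010_paired_responses.csv")

levels = [
    "Once or more per lecture",
    "Once or twice per week",
    "Four to six times per month",
    "One to three times per month",
    "A few times ever",
    "Never",
]
groups = [df.loc[df["participation"] == lvl, "ccrt_pre"] for lvl in levels]

f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA on pre-test scores: F = {f_stat:.2f}, p = {p_value:.3f}")
# A large p-value (the study reports p = 0.324) indicates no detectable
# difference in prior knowledge across the participation levels.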

Also of interest was whether student perceptions of the Socratic and discussion questions used in the classroom were correlated with post-test performance on the CCRT. Spearman’s rho correlation coefficients were computed for responses to selected survey questions against post-test scores.
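
A sketch of this correlation step, with hypothetical column names standing in for the Likert-coded survey responses and the post-test scores:

# Minimal sketch (hypothetical columns): Spearman's rho between Likert-scale
# survey responses and the post-test CCRT score.
import pandas as pd
from scipy import stats

df = pd.read_csv("fall2010_paired_responses.csv")

# Assumed columns: "q6_engaged" and "q15_understand" hold Likert responses
# coded 1-5; "ccrt_post" holds the post-test score.
for question in ["q6_engaged", "q15_understand"]:
    rho, p_value = stats.spearmanr(df[question], df["ccrt_post"])
    print(f"{question}: Spearman's rho = {rho:.3f} (p = {p_value:.3f})")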

Student grades were also available for selected questions on the midterm and final exams. For students categorized as silent or vocal, a two-sample t-test was performed to detect whether vocal students scored significantly higher than silent students on relevant questions.
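
Such a comparison could be carried out along the following lines; the column names and the equal-variance, two-sided form of the test are our assumptions, since the paper does not specify them.

# Minimal sketch (hypothetical columns): two-sample t-test comparing vocal and
# silent students on the points earned for a given set of exam questions.
import pandas as pd
from scipy import stats

df = pd.read_csv("fall2010_paired_responses.csv")

vocal = df.loc[df["group"] == "vocal", "exam_question_points"]
silent = df.loc[df["group"] == "silent", "exam_question_points"]

# Two-sided test assuming equal variances (SciPy's default settings).
t_stat, p_value = stats.ttest_ind(vocal, silent)
print(f"vocal mean = {vocal.mean():.2f}, silent mean = {silent.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")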

Interviews were conducted with a total of 42 students, all by the same researcher. The interviewer took notes onto the prepared semi-structured protocol and also made audio recordings when approved by the interviewee. The notes were coded for emergent themes, which guided the survey data analysis.

Data collected during Fall 2011 were similar in nature to those from Fall 2010. However, these students did not complete the CCRT and responded to a slightly revised online survey. Based on the results of the Fall 2010 data, the survey was altered for the second year of the study in Fall 2011 to ask more specific questions to pinpoint why vocal students choose to be vocal in the active classroom. In order to draw any conclusions across the two different years of students, a chi-square test for homogeneity was completed to determine whether the percentages of silent and vocal students in the 2010 and 2011 surveys were the same. Once this was established, our study was able to assess potential motivations for vocal students by analyzing responses to specific questions in the survey. These survey questions will be provided in the next section.

Results and discussion

As previously mentioned, our study had two data collections, one from Fall 2010 students and another from Fall 2011 students. Both were first semester courses in General Chemistry. The data from the 2010 students was used to establish any differences in performance between silent and vocal students. Meanwhile, the modified survey used in the 2011 data collection was intended to provide insight into the motivational factors that encourage vocal students in the active classroom. The methods used and motivation for these analyses were detailed in the previous section. The results and discussion are divided between the 2010 performance data and the 2011 motivation data.

Analysis of data on performance from 2010

Pre- and post-CCRT results. In order to determine whether students categorized as vocal and students categorized as silent started with different levels of prior knowledge, a two-sample t-test was run on the pre-test scores from the 23 CCRT questions associated with the content of the first semester of General Chemistry. These results are given in Table 1.
Table 1 Two-sample t-test of CCRT pre-test in Fall 2010 for vocal and silent students
  Mean pre-test score St. Deviation p-Value
Vocal 8.21 2.82 0.220
Silent 7.74 2.85


There was no significant difference between the starting levels of these two groups. This test established that the silent and vocal participants began the Chemistry course with the same level of knowledge. This result was of primary importance to the study, as it confirms that those students who answer questions in class are not necessarily those students who start the course with the strongest Chemistry background.

We then sought to determine if the level of participation had an impact on how much learning occurred over the course of a semester. The difference in the score distributions between vocal and silent students is shown in Fig. 2, which presents the post-test scores on the CCRT. The scores of the vocal students had a slightly higher peak at the median with a larger shoulder towards higher scores. The silent student score distribution, however, was skewed more towards lower scores. This was analyzed statistically via a two-sample t-test of the paired improvement over the semester, calculated for each student as post-test score minus pre-test score on the CCRT. The results are shown in Table 2.


Fig. 2 Fall 2010 post-test CCRT score distribution for vocal and silent students.
Table 2 Two-sample t-test of CCRT points gained from pre to post-test in Fall 2010 for vocal and silent students
  Mean score gain St. Deviation p-Value
Vocal 6.80 3.49 0.005
Silent 5.51 3.10


There was a significant difference between the silent and vocal students’ performance improvement. Our previous study indicated that both silent and vocal students perceive a benefit to the active classroom and are participating in the active classroom, albeit in different ways. The results from the pre- and post-test allowed us to conclude that there was a comparative advantage in learning for those students who were willing to be vocal over those who were silent in the active classroom. On average, vocal students outperformed silent students on a test of conceptual understanding in chemistry, a statistically significant difference given that both groups began with the same baseline average.

Exam results. The analysis also included results for “assess the accuracy” exam questions, as defined previously, which were written with the intent of testing students’ ability to reason and explain. The analysis of these questions determined whether vocal students scored higher than silent students, and thus showed a greater ability to reason scientifically on these questions. These questions were worth a total of 70 points across three midterm exams, and a two-sample t-test was run on the mean scores for silent and vocal students, as shown in Table 3.
Table 3 Comparison of scores on “assess the accuracy” exam questions between silent and vocal students
  N Mean points St. Deviation p-Value
Vocal 241 53.11 9.32 <0.001
Silent 115 48.13 9.10


There was a significant difference between the silent and vocal groups, although the point differential was not large. This comparison of performance on another means of assessment reinforced the finding that vocal students showed greater learning gains than silent students.

Survey and CCRT correlations. Finally, the study used the 2010 data to compare performance against perceptions of the active classroom from the survey data. The active classroom mainly used Socratic questions to engage students, and the results above established that vocal students had an advantage over silent students. We therefore also sought to determine whether post-test scores on the CCRT were related to the perceived usefulness of the Socratic questions used in class. For two questions regarding the perceived impact of Socratic questions, Spearman’s rho was calculated for survey response versus points scored on the post-test CCRT. Correlations are given in Table 4.
Table 4 Spearman’s rho correlations for post-test CCRT scores and survey responses
Question Spearman’s rho
Q6: The Socratic questions in lecture keep me engaged. 0.131
Q15: Being asked Socratic questions in class helps me better understand the concepts. 0.238


These values indicated that only a weak correlation existed between perceived usefulness of Socratic questions and post-test CCRT score. Thus, students with a high score on the CCRT were not substantially more likely to perceive the Socratic questions as useful to the learning process. This weak relationship between level of learning and response to the Socratic method indicated that, according to the CCRT, the active learning methods were valuable to both high-achieving and low-achieving students. This was similar to our previous result, which showed that both silent and vocal students perceive a benefit to the active learning environment (Obenland et al., 2012).

Analysis of data on motivation from 2011

The previous analyses established that a learning benefit exists for those students who choose to be vocal during class. The data used in all previous tests came from the Fall 2010 collection. This led naturally to the question of why vocal students were vocal. Thus, the next step for the study was to analyze the motivations for students to be vocal. To this end, additional survey questions were incorporated into the Fall 2011 survey in order to better understand the behavior of vocal and silent students. This required a comparison between the 2010 and 2011 groups. A chi-square test for homogeneity was performed to demonstrate that the percentages of silent and vocal students were similar across the two years, as shown in Table 5. There was no evidence to suggest that the two cohorts differed in the proportions of students who reported themselves to be vocal or silent in the classroom. Thus, we continued the analysis by making comparisons across the 2010 and 2011 students.
Table 5 Comparison of vocal and silent groups across 2010 and 2011
  Vocal Silent p-Value
2010 245 121 0.949
2011 223 109  
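
For reference, this test can be reproduced from the counts in Table 5. The sketch below uses SciPy without the Yates continuity correction, which is one plausible way to obtain the reported p-value of 0.949; the paper does not state which software or correction was actually used.

# Chi-square test for homogeneity of the vocal/silent split across the two
# cohorts, using the counts reported in Table 5.
import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([
    [245, 121],   # Fall 2010: vocal, silent
    [223, 109],   # Fall 2011: vocal, silent
])

# correction=False disables the Yates continuity correction for 2x2 tables.
chi2, p_value, dof, expected = chi2_contingency(counts, correction=False)
print(f"chi-square = {chi2:.4f}, dof = {dof}, p = {p_value:.3f}")
# With these counts, p is approximately 0.95, consistent with the reported
# value: the two cohorts show no detectable difference in the vocal/silent split.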


A survey question was created for students to indicate their motivation for responding to the Socratic questions. Out of 11 options, the most commonly selected ones are listed in Table 6. Each of these options can be understood in the context of achievement goal theory as related to students’ mastery goal orientation (A, E) or performance goal orientation (B, C, D) (Ames and Archer, 1988; Harackiewicz et al., 1997).

Table 6 Percentage of students who identified with statements regarding Socratic questions for vocal and silent students and for those students that increased or decreased their participation over the course of the semester
Statement (check all that you identify with about the Socratic questions in class)   Vocal (N = 223)   Silent (N = 109)   Increased participation (N = 101)   Decreased participation (N = 48)
A. I feel as though, once I started answering questions, my understanding of the material increased.   46%   8%   48%   29%
B. My primary motivation for answering questions was always so that I could get extra credit points.   62%   33%   70%   52%
C. Once I correctly answered a question in class, I was motivated to try to participate more often.   68%   17%   74%   46%
D. My ability to participate was affected by how much time I had to prepare for class due to other demands on my time.   75%   63%   79%   81%
E. As I began to understand the material, I was more able to answer questions.   83%   34%   88%   52%


The most selected responses were divided to compare how silent and vocal students responded, as shown in Table 6. Students were also asked to select a description that matched their participation over the course of the semester from the following options: increased, decreased, stayed the same, or did not participate. For those students who either increased or decreased their participation over the course of the semester, Table 6 includes the percentages of those populations identifying with statements about the Socratic questions. Interview data will be tied to the survey data to interpret student survey responses and to understand why vocal students were vocal in class. Discussion of the survey data will be interwoven with findings from the interviews in the following sections on learning via Socratic questions, extra credit for participation, preparation for class, and studying in groups.

Learning via Socratic questions. Survey and interview results indicated the educational value of answering Socratic questions. Almost half of the vocal students, 46%, indicated via statement A that answering Socratic questions helped them understand the material better. This view was also held by about half, 48%, of those students who increased their participation over the semester.

Interview responses were generally positive towards Socratic questioning, with many students realizing the questions were aimed at helping them truly understand the material. One student said of an instructor, “I think he wanted us to figure it out rather than him just tell us.” Many students stated that Socratic questions helped them stay engaged in class and keep up with the ideas being discussed. One student said, “[Class] is not just a lecture…I’m being forced to think of the answers each time that the question is asked versus just listening to lecture. I think that’s very helpful in staying tuned in.” Another student expressed the same idea, explaining that what she liked about Socratic questioning was, “Always being involved in lecture, always being part of it and feeling like you’re following along as it goes, and the questions are kind of like checkpoints.” A few students commented on the helpfulness of listening to other students answer questions. One student said, “It helps to hear it explained from a student. Different wording of the same idea that is clearer.”

A high percentage of the vocal students, 68%, responded to statement C that answering a question correctly motivated further participation. Most of those who increased their participation, 74%, also identified with the statement regarding motivation from answering correctly in class. The majority percentages demonstrated that getting positive feedback encouraged more participation from the vocal students.

In interviews, few students explicitly stated that answering questions correctly was the main motivator to continue participating, even though the survey responses indicated it was. However, students did see value in participating: by answering questions, they received feedback on their understanding and were challenged to put that understanding into words. A student shared this comment, “Sometimes I feel like I know the material, but I don’t necessarily know how to phrase it. So I think answering questions definitely helps with…how to put concepts into words.” And another said, “Even though I knew the answer, I probably wouldn’t be able to explain it how I wanted coherently.” Thus, once students did start to answer questions, the feedback they received was helpful to them, as was the challenge of putting their thoughts into words. Due to the open-ended nature of the graded assessments in this course, the benefit of practicing explaining concepts out loud was obvious to students.

In summary, vocal students were motivated to answer Socratic questions because the questions kept them engaged, let them hear other students’ explanations, provided an outlet for feedback, and forced them to put the concepts into their own words.

Extra credit for participation. Many students needed external motivation, such as the extra credit gained via participation points, to be vocal in class. Of the vocal students, 62% identified with statement B, regarding extra credit as a main motivator, as did 33% of the silent students. Percentages were also high for this statement among those who increased their participation, 70%, and those who decreased their participation, 52%. This indicated that the minimal grade incentive provided by participation points was enough to motivate most students to answer questions.

In interviews, numerous students gave a similar response regarding why they participate: “Definitely the extra credit points. But also just participating helps me understand it. Because if I answer or intend to answer, I have to think about what I would say and understand it.” One student even went as far as to state, “In the beginning I was like, oh yeah, we get points… But then actually most of the time I raise my hand and answer questions, now I don’t actually go up and get points after class.” This student started out answering questions for the grade incentive but then found the value added from participating worthwhile without even claiming the extra credit.

Most of the vocal students interviewed saw the educational value in answering questions, yet they most often stated it was the participation points that encouraged them to start trying to answer. This interview data corresponded with the survey data, which indicated extra credit was a major initial motivator for students to become vocal in class.

Preparation for class. Most of the students, 75% of vocal and 63% of silent, identified with statement D, indicating that their ability to participate was affected by the time they had to prepare for class. Most of the students who increased participation, 79%, and who decreased participation, 81%, also felt participation was affected by time to prepare. When students had time to prepare for class, their ability to participate improved. The converse also appeared to hold: when students were overwhelmed by other classes or activities, they did not have time to prepare for chemistry class and were unable to participate.

In interviews, some students felt that they could still participate without having read the assignments before class, but most did not. One student stated, “Without reading, I won’t answer; but because others answer, I still understand.” A number of students expressed that as their semester got busier, their time to spend reading for class diminished, with their participation following. One student said, “At the beginning, I was more likely to have done the reading, so I participated more at the beginning. Then it varied if I’d done the reading.” Thus, the Socratic method did provide motivation for students to read the material before class. One student said, “Asking the questions challenged the students to go back and read and to understand. And definitely with the way you get extra points if you answer questions, I think that really encouraged people to go more in depth in their reading.” Making time to read and prepare prior to class did impact participation levels and, most probably, also performance.

A very high percentage, 83%, of the vocal students but only 34% of the silent students identified with statement E, indicating that as they understood the chemistry better, they were better able to answer questions. Survey data also showed that 88% of students who reported that their participation increased over the course of the semester identified with statement E. Thus, understanding of the material was clearly a motivational factor for answering questions.

Most of the students who were interviewed said that the way they were able to understand the material was by reading and studying. One student said she always read before class because if you did not, “Then you can’t get the participation extra credit. So, it was kind of a motivation to read before class, but then it also made a lot more sense if you read before class.” Multiple students expressed that their participation depended upon the amount of time they had to prepare for class by reading, as also seen from the survey data. This also helps explain the vocal students’ greater learning gains, as students tended to be vocal if they were keeping up with the course and making sure they understood the content.

Students did see preparation for class as valuable and, for most, essential to being able to be vocal in class. Those students who did participate recognized the impact of preparing and participating on their learning: they were engaged, received feedback, and stayed current with their studies.

Studying in groups. One hypothesis tested was whether students who studied and talked about chemistry in groups outside of class had a higher degree of participation in class. As shown in Table 7, the amount that students said they studied with others was relatively consistent between vocal and silent students. Thus, there was no apparent relationship between talking about chemical concepts with other students outside of class and talking about them in front of the class.
Table 7 Frequency of group study for Fall 2011 vocal and silent students
  At least once a week Occasionally Only before exams Never
Vocal 40% 32% 17% 10%
Silent 41% 24% 24% 11%


Conclusions

Our previous study revealed similarities in how both vocal and silent students positively perceived the active learning classroom, with both groups able to participate both cognitively and socially (Obenland et al., 2012). With this study, we have extended our understanding of the impact of active learning on vocal and silent students. The behavioral dimension of actually participating in class, or even just attempting to participate by raising a hand, did differentiate students by level of learning performance. The CCRT post-test and the “assess the accuracy” exam questions showed vocal students performing at a higher level than silent students on measures of chemistry knowledge. Because both silent and vocal students started the semester with the same measured conceptual understanding of chemistry, the differences at the end of the semester must have been due to factors at work over that semester.

Interviews and survey data together also indicated that students are more likely to be vocal in class when they have time to prepare for class and read the assigned material. Students have a great deal of schoolwork and other activities throughout the semester, and time management is one of the great life skills acquired in college. Students were very honest that having time to prepare for class directly impacted whether or not they could participate. Preparing for class would, of course, also impact understanding of the material as it was presented in class. It follows that those students who regularly prepared for class would often participate and would be classified as vocal. Thus, prepared and vocal students would benefit both from preparing for class and from the engagement and feedback of being vocal, making the greater learning gains of vocal students a natural consequence of everything involved in being vocal.

This suggests a cyclical model for why vocal students outperform silent students. Because they are more likely to prepare for class, vocal students are more likely to participate. By participating, they understand the material more deeply. Having once participated successfully, they are more likely to want to do so again. As such, they are more likely to prepare for class. Thus, vocal students are better prepared because they participate, and they participate because they are better prepared. How then does a student begin this cycle?

Extrinsic motivation was provided for participation in the form of minimal extra credit. Despite the minimal impact on the course grade, the participation points made a major impact on students’ motivation to be vocal. Both survey and interviews showed that participation points encouraged students to attempt to answer questions in class. The motivation was not great enough to overcome inhibitions for some students, but it did get the majority of the class raising their hands.

As discussed earlier, once students answered questions they did see the educational value of answering Socratic questions aloud. Students also realized the necessity of preparing for class by reading in advance, as without preparation they had limited opportunities to participate. Students learned from the challenge of putting chemical concepts into words, appreciated the feedback from the instructor, and used the drive to answer questions as a way to keep themselves on top of completing their reading assignments before class. Thus, vocal students who were initially motivated by extra points seemed to transition to being motivated by the educational benefits of participation and of preparing for participation. Extra credit for participation started a “virtuous cycle” for students who begin to participate in the active learning environment by being vocal, enjoy the engagement, benefit from feedback, and are motivated to continue to prepare for class so that they can continue to gain all the benefits of being vocal.

Participation points got students started being vocal, and the immediate feedback and added motivation to stay on top of the material kept students answering questions. One student stated that, “Questions are like stepping stones to a whole answer.” We feel these “stepping stones” completed the beneficial active learning experience for students who started with the extrinsic motivation of points and continued with the intrinsic motivation of increased understanding.

References

  1. Ames C. and Archer J., (1988), Achievement goals in the classroom: Students’ learning strategies and motivation processes, J. of Educ. Psych., 80, 260–267.
  2. Bonwell C. C. and Eison J.A., (1991), Active learning: Creating excitement in the classroom, Washington, DC: ASHE-ERIC Higher Education Report No. 1.
  3. Bruff D., (2009), Teaching with classroom response systems: Creating active learning environments, San Francisco, CA: Jossey-Bass.
  4. Cloonan C. A. and Hutchinson J. S., (2011), A chemistry concept reasoning test, Chem. Educ. Res. Pract., 12, 205–209.
  5. Crouch C. H. and Mazur E., (2001), Peer instruction: Ten years of experience and results, American Journal of Physics, 69, 970–977.
  6. Drew V. and Mackie L., (2011), Extending the constructs of active learning: implications for teachers’ pedagogy and practice, The Curriculum Journal, 22, 451–467.
  7. Donovan M. S., Bransford J. D. and Pellegrino J. W. (Eds.), (1999), How people learn: Bridging research and practice, Washington, DC: National Academy Press.
  8. Felder R., Woods D., Stice J. and Rugarcia A., (2000), The future of engineering education: II. Teaching methods that work, Chem. Eng. Educ., 34, 26–39.
  9. Harackiewicz J. M., Barron K. E., Carter S. M., Lehto A. T. and Elliot A.J., (1997), Predictors and consequences of achievement goals in the college classroom: Maintaining interest and making the grade, J. of Personality and Social Psychology, 73, 1284–1295.
  10. Hutchinson J. S., (2000), Teaching introductory chemistry using concept development case studies: interactive and inductive learning, Univ. Chem. Educ., 4, 3–8.
  11. Hutchinson J. S., (2007), Concept development studies in chemistry; Connexions; Retrieved June 6, 2009 (http://cnx.org/content/col10264/1.5/).
  12. Johnson D. W., Johnson R. T. and Smith K. A., (1998), Cooperative learning returns to college: What evidence is there that it works? Change, July/August, 27–35.
  13. King A., (1993), From sage on the stage to guide on the side, College Teaching, 41, 30–35.
  14. Lyman F., (1992), Think-Pair-Share, Thinktrix, Thinklinks, and Weird Facts: An interactive system for cooperative thinking. In Davidson N. and Worsham T. (Eds.), Enhancing thinking through cooperative learning (pp. 169–181). New York: Teachers College Press.
  15. Matusovich H., Streveler R. and Miller R., (2009), What does “motivation” really mean? An example from current engineering education research, Proceedings of the Research in Engineering Education Symposium 2009, Palm Cove, Queensland, Australia.
  16. Michael J., (2006), Where’s the evidence that active learning works?, Advances in Physiology Education, 30, 159–167.
  17. National Research Council (NRC), (2000), Inquiry and the national science education standards: A guide for teaching and learning, Washington, DC: National Academy Press.
  18. Obenland C. A., Munson A. H. and Hutchinson J. S., (2012), Silent students’ participation in a large active learning science classroom, J. Coll. Sci. Teach., 42, 90–98.
  19. Prince M., (2004), Does active learning work? A review of the research, J. Eng. Educ., 93, 223–231.
  20. Walczyk J. J. and Ramsey L. L., (2003), Use of learner-centered instruction in college science and mathematics classrooms, J. Res. Sci. Teach., 40, 566–584.
  21. Watkins C., Carnell E. and Lodge C., (2007), Effective learning in classrooms, London: Sage.
  22. Weiner B., (1990), History of motivational research in education, J. of Educ. Psych., 82, 616–622.
  23. Wigfield A. and Eccles J. S., (2000), Expectancy-value theory of achievement motivation, Contemporary Educational Psychology, 25, 68–81.

This journal is © The Royal Society of Chemistry 2013