Nirit
Glazer
University of Michigan, USA. E-mail: nirit.glazer@gmail.com
First published on 23rd February 2015
This study examines how students perceive their learning of creating and analyzing data in an introductory, college-level inquiry chemistry course that features oral presentations in student-centered discussions. A student Participant Perception Indicator (PPI) survey was administered to obtain data on student perceptions of their own data-creation and data-analysis skills, skills that are essential for learning and understanding science. Student perceptions regarding gaining knowledge were consistently higher than their perceptions regarding gaining confidence and experience; however, both the confidence and the experience measures increased significantly as the semester progressed. Further, significant differences in student perceptions were found between students who made oral presentations and students who did not. This finding strongly supports active learning theory, i.e., learning by doing, and strongly encourages student participation in knowledge creation. Findings were also analyzed according to student demographics (gender, school) to determine patterns for different populations within the groups of students. Such analysis is important for instructors and course designers, enabling them to adjust their manner of teaching to the student demographics in their classes and to adjust the feedback and guidance they provide, as needed.
Other studies clearly address the need for changes in undergraduate education and support a shift from instructor-centered learning to student-centered learning, particularly in introductory-level classes (Francisco et al., 1998; Slunt and Giancarlo, 2004). In traditional classes, the teacher, rather than the student, becomes highly skilled by speaking, consulting, organizing, and solving problems. In contrast, student-centered teaching focuses on the skills and cognitive development of students (Xu, 2003). In this teaching process, students no longer passively receive information, but actively participate in teaching, learning, and thinking. Many reports argue that student-centered learning is a superior form of active learning (Lloyd and Spencer, 1994; Felder and Brent, 1996; Siebert and McIntosh, 2001; Michael and Modell, 2003; Yuretich, 2003; Oliver-Hoyo et al., 2004). These studies indicate that student-centered activities are more effective at developing critical thinking when they require student involvement in higher-order thinking tasks, such as analysis of data and application of results to new situations.
One way to implement active learning in student-centered classrooms is to incorporate instructional strategies, such as oral presentations of lab findings, into guided discussion sessions. Oral presentations in a science laboratory course involve data-analysis, which reinforces the development of higher-order cognitive skills and problem solving and thereby enhances critical thinking (Kovac, 1999; Zoller, 1999; Kerner et al., 2002; Mckeachie, 2002). Moreover, a presentation of data-analysis exposes students to the thinking and qualitative reasoning processes by which scientists organize data, develop principles, make predictions, and design experiments.
This study describes a novel approach for teaching skills that are essential to the process of learning and understanding chemistry and evaluates the effectiveness of that innovative practice. More specifically, the study examines how students perceive their learning of creating and analyzing visual data in an introductory, college-level inquiry chemistry course. The perceptions of student experiences are essential to the development of educational processes, in addition to the assessment of academic achievements. Obtaining input from participants, input that an external observer might miss or consider irrelevant, can improve the development and implementation of learning environments (Fraser, 1994, 1998). Thus, this study investigates the perceptions of students regarding their ability to create and analyze data. The participating students were enrolled in a large (745 students, 33 sections) introductory, guided-inquiry chemistry course at a major research university. Results from this study are not limited to chemistry students, but are broadly applicable to any area of science and engineering.
The interpretation of data, and the ability to construct graphs and tables, are essential for the scientific process (Bowen and Roth, 2005). Graphs and tables are invaluable tools for representing data and for finding relationships between variables, particularly for determining patterns, properties and reactivity of matter. Such abilities are skills required by scientists for conducting investigations, for analyzing data, for drawing conclusions, and for writing research studies (Bowen and Roth, 2005).
The National Science Education Standards (National Research Council, 1996) and the Framework for K-12 Science Education (National Research Council, 2012) emphasize data-analysis skills. They state that students need to learn how to analyze evidence and data, and that the evidence they analyze may be from their investigations, from other student investigations, or from databases. The Standards also suggest data-related activities in which students ought to be skilled, such skills include the ability to collect, organize, and describe data, to create tables and graphs, and to analyze and interpret data to identify patterns, properties, and relationships.
Data-creation and data-analysis skills are not limited to the chemistry classroom, but are becoming increasingly important also outside the classroom. One of the central skills required in the 21st century is the ability to work with data, for instance, to make inferences from given data, to find trends, to criticize data, and to use the data. In the current information era, students need the ability to understand day-to-day medical results (e.g., growth-tracking charts and cholesterol levels), as well as commercial advertisements and news media (e.g., political elections, sports, and financial matters).
Educators call for practical programs that encourage the development of graphing and data-analysis competence (Tobin, 1990; Roth and McGinn, 1997; Glazer, 2011). The programs should provide opportunities to reflect on findings in the lab for clarifying understandings and misunderstandings with peers (Tobin, 1990). Although inquiry and associated skills, such as data analysis and data inference, are essential components in science learning, little is known about the perceptions by students of their own cognitive capability and their confidence to implement such skills. This paper enriches the literature with a study of student perceptions while learning in a particular course designed to foster cognitive skills important for inquiry, such as data-creation and data-analysis. This study is particularly valuable to high school and undergraduate science instructors and curricula developers interested in implementing a student-centered learning approach into their instruction in order to enhance data-creation and data-analysis skills.
Students in the course made observations in laboratory experiments and recorded them as a team in a central database. Students were then given problems that required organizing and analyzing the collated class data, and in the following week they presented their answers in a student-centered discussion. The main topics taught in the course included precipitation reactions, solution color and spectroscopy, redox reactions, acids and bases, and Lewis acids and bases.
The PPI survey was developed to determine whether students perceived learning as a result of attending the course described above. The survey asked students to respond to specified perception statements about whether they had acquired various discussion-related skills, such as data-creation, data-analysis, and speaking in front of peers. In addition to the perception statements, the survey included demographic questions, such as gender, school, section, and year in the program. This information is important to examine in case various groups of students responded to course objectives in different ways.
The PPI survey for this study included 20 statements that related to several factors which constituted broad objectives of the course, each addressed by a set of multiple (four or five) statements to increase reliability. A factor analysis was then run in order to group the statements into factors. The content of each statement was validated by a chemistry faculty member who taught the course, and by a science educator who was experienced in creating and analyzing surveys. For each perception statement, the students were asked to indicate how they perceived their knowledge, experience and confidence levels according to a five-point Likert scale, similar to the example in Fig. 1 below.
In the example above, response 5 in the knowledge category indicated the participant perceived that he/she had a great deal of knowledge about organizing data into a table or graph; response 3 in the experience category indicated the participant perceived that he/she had an average amount of experience about organizing data into a table or graph; and response 1 in the confidence category indicated the participant had no confidence about organizing data into a table or graph.
The Research Sample

The PPI survey was administered via the web in three rounds during the semester (beginning, middle, and end) to approximately 700 students (33 sections) in the introductory chemistry course. Overall, 1194 responses (N) were received. In order to increase reliability, only sections (21 sections) with a high level of responses (i.e., sections where more than half of the students responded consistently all three times) were included in the study. The number of responses in these sections totaled 951 (N = 951). Each round had an average response rate of 60–75%; i.e., there were 291–360 responses out of 475 students. The first round (Time 1) included 300 responses and was administered after the students had completed the first discussion; the second round (Time 2) included 291 responses and was administered mid-semester; and the last round (Time 3) included 360 responses and was administered at the end of the semester.
The responses were primarily from Engineering School students (N = 416), constituting 43.74% of the responses, and from Literature, Science, and Arts (LS&A) School students (N = 493), constituting 51.84% of the responses. Most of the students were freshmen (N = 810), constituting 85.17% of the responses, with 58% of the students being male (N = 552) and 42% being female (N = 399). Both presenters (N = 329, 34.6% of the responses) and non-presenters (N = 622, 65.4% of the responses) responded to the survey. Students' demographic and survey information are summarized in Table 1.
| Demographic/survey info | | N | Percentage (%) |
|---|---|---|---|
| School | LS&A | 493 | 51.84 |
| | Engineering | 416 | 43.74 |
| | Other | 42 | 4.42 |
| Year in the program | Freshman | 810 | 85.17 |
| | Sophomore | 98 | 10.30 |
| | Junior | 23 | 2.42 |
| | Senior | 20 | 2.10 |
| Gender | Male | 552 | 58.04 |
| | Female | 399 | 41.96 |
| Student role | Presenters | 329 | 34.60 |
| | Non-presenters | 622 | 65.40 |
| When survey taken | Time 1 – beginning | 300 | 31.55 |
| | Time 2 – middle | 291 | 30.60 |
| | Time 3 – end | 360 | 37.85 |
| Total responses | | 951 | 100.00 |
The data-creation and data-analysis factor comprised the following five survey statements:

1. State the main conclusion to the assigned question
2. Organize data into a table or graph
3. Use alternate methods to organize and visualize class data via tables or graphs
4. Create graphs/tables to visualize the data
5. Interpret relationships represented in the graphs/tables
A factor analysis was conducted to verify that the above five items could be reduced to a single factor. For each of the knowledge, experience, and confidence levels, all five survey items (statements) were found to be significantly correlated (all p-values less than 0.001). A reliability analysis confirmed that they fit well together (Cronbach's alpha for knowledge, experience, and confidence being 0.851, 0.827, and 0.821, respectively). Thus, all five items could be averaged for the analysis.
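As an illustration of the reliability check described above, Cronbach's alpha can be computed from the per-item variances and the variance of the summed scale scores. The following is a minimal sketch using fabricated Likert responses, not the study's actual survey data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scale
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses from six respondents to five items;
# strongly correlated items should yield high internal consistency
responses = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 2, 3, 2, 2],
    [4, 4, 4, 5, 4],
    [3, 4, 3, 3, 3],
])
alpha = cronbach_alpha(responses)
```

An alpha above roughly 0.8, as reported for each of the three measures, is conventionally taken to justify averaging the items into a single scale.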
A data-analysis was then conducted to identify patterns over time and patterns related to the knowledge, experience, and confidence levels. The data were also analyzed to determine patterns for different populations within the group of students. The results of this data-analysis follow.
As shown in Fig. 2, student perceptions of “knowledge” were at a higher level than their perceptions of “confidence” and “experience” in all the rounds. An ANOVA test with a 5% significance level (α) shows that, as the semester progressed, there was no significant change from Time 1 to Time 3 in student perceptions of their “knowledge” level (p-value = 0.389), but there was a significant increase in their “experience” level (p-value = 0.014) and an almost significant increase in their “confidence” level (p-value = 0.052).
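The time-trend comparison above corresponds to a standard one-way ANOVA across the three survey rounds. A minimal sketch with SciPy, using fabricated per-respondent ratings (the values are illustrative, not the study's data):

```python
from scipy import stats

# Hypothetical average "experience" ratings (1-5 Likert scale) per respondent
time1 = [2.8, 3.0, 3.2, 2.6, 3.1, 2.9]  # beginning of semester
time2 = [3.0, 3.3, 3.1, 2.9, 3.4, 3.2]  # mid-semester
time3 = [3.4, 3.6, 3.3, 3.1, 3.7, 3.5]  # end of semester

# One-way ANOVA: null hypothesis is that all three round means are equal;
# reject at the 5% level when p_value < 0.05
f_stat, p_value = stats.f_oneway(time1, time2, time3)
```

In the study's analysis, the analogous p-value of 0.014 for "experience" indicates a significant change across rounds, while 0.389 for "knowledge" does not.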
As can be seen in Fig. 3, students had higher perceptions of having gained knowledge of data-creation and data-analysis competence in comparison to gained confidence or gained experience in those skills. Also, confidence intervals (vertical lines) of “experience” and “confidence” overlap, but neither overlaps with the “knowledge” confidence intervals. Thus, the responses by students of having gained knowledge are significantly different from their responses of having gained confidence or experience. Furthermore, the “experience” and “confidence” responses had greater variations (longer confidence intervals) in comparison to “knowledge” responses.
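The overlap comparisons described above rest on 95% confidence intervals for each group's mean rating. A t-based interval for a sample mean can be sketched as follows, with hypothetical ratings rather than the study's data:

```python
import numpy as np
from scipy import stats

def mean_ci95(ratings):
    """95% t-based confidence interval for the mean of a sample."""
    x = np.asarray(ratings, dtype=float)
    m = x.mean()
    se = x.std(ddof=1) / np.sqrt(len(x))            # standard error of the mean
    half_width = se * stats.t.ppf(0.975, len(x) - 1)  # two-sided 95% critical value
    return m - half_width, m + half_width

# Hypothetical "knowledge" ratings; when two groups' intervals do not
# overlap, the difference between their means is taken as significant
knowledge = [4.1, 4.3, 3.9, 4.2, 4.0, 4.4]
lo, hi = mean_ci95(knowledge)
```

A longer interval, as seen for the "experience" and "confidence" responses, reflects greater variation in the ratings for a given sample size.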
As can be seen in Fig. 4, there was a significant difference between students who orally presented data and those who did not: the confidence intervals of the two groups do not overlap, the p-values for knowledge and experience are less than 0.001, and the p-value for confidence is less than 0.05. The perception ratings of presenters were significantly higher than those of non-presenters at the time the survey was taken. This strongly supports active learning theory, i.e., learning by doing. Students who were actively involved in a particular assignment perceived themselves as having learned more and as having gained more experience and confidence, in comparison to their teammates who were not in charge of the presentation.
As can be seen in Fig. 5, there was no significant difference between males and females regarding their perceptions of “experience” and “confidence”: the confidence intervals overlap, and the p-values (0.123 and 0.086, respectively) were both greater than 0.05. There was, however, a significant difference regarding their perception of “knowledge” (p-value = 0.036), with females perceiving their knowledge to be greater than males did. Overall, the “knowledge” and “experience” perceptions of females were higher than those of males, while their “confidence” perception was lower.
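The gender comparison above is a two-group comparison of mean ratings, which can be sketched with an independent-samples t-test. The data below are fabricated for illustration and carry no relation to the study's actual gender results:

```python
from scipy import stats

# Hypothetical mean "confidence" ratings (1-5 Likert scale) by gender
group_a = [3.2, 3.5, 3.8, 3.4, 3.6, 3.3]
group_b = [3.0, 3.3, 3.1, 3.4, 2.9, 3.2]

# Two-sided independent-samples t-test; a p_value below 0.05 would
# indicate a significant difference between the group means at the 5% level
t_stat, p_value = stats.ttest_ind(group_a, group_b)
```
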
Fig. 6 reveals a significant difference between LS&A students and engineering students regarding their perceptions of data-creation and data-analysis competence: the confidence intervals did not overlap, and the p-values were less than 0.05. For all measures (knowledge, confidence, and experience), the perceptions of engineering students were higher than those of LS&A students, with the greatest difference in their confidence level (p-value = 0.0004).
Overall, the students indicated that they had a great deal of knowledge of how to effectively create and analyze data. At the beginning of the semester they felt assured in their knowledge, but they did not feel confident or experienced in data-creation and data-analysis. The data showed that students grew in confidence and experience regarding their perceived data-creation and data-analysis skills.
Confidence and experience are important for learning. Improving confidence increases the chances of success. When students are more confident, they are likely to take a task more seriously, challenge themselves more, and succeed more (Linnenbrink and Pintrich, 2003). Experience also has an important impact on learning since learning new skills takes time. However, practice is not enough to ensure that a skill will be acquired, since practice under appropriate instruction, and appropriate feedback, are also important (National Research Council, 2001).
Fig. 4 shows that students who were active in a presentation perceived learning more than their teammates who were less active. This finding supports the correlation between self-efficacy, engagement, and learning presented in the model of Linnenbrink and Pintrich (2003). According to the National Science Education Standards, science is an active process in that “learning science is something that students do, not something that is done to them” (National Research Council, 1996, p. 20). This finding also provides important feedback for the course designer and suggests that more students need to be actively engaged in future assignments.
Researchers generally agree that the use of self-reported estimates of learning is valid within limits (Anaya, 1999a; Gonyea, 2005). At the most basic level, there is a concern regarding the validity of self-report measures (Razavi, 2001). The literature indicates that student self-reports have only moderately positive correlations with objective measures when used in estimating the learning or skill of individuals. When aggregated to compare the performance of groups, the reliability of self-reported measures is quite high and is generally considered to be a valid measure of a real difference in learning between groups (Pike, 1995, 1996; Anaya, 1999b; Pascarella, 2001; Volkwein and Yin, 2010).
This study focuses on the perceptions by different groups of students, based on their demographic characteristics or their role in the task. Future studies should also consider comparing perceptions of students in different achievement levels in order to check a correlation between their perceptions of learning and of their achievement.
The data were analyzed according to student demographics (gender, school) to determine patterns for different populations within the groups of students. This analysis is important for course designers, enabling them to adjust their manner of teaching to the student demographics in their classes and to adjust the feedback and guidance they provide, as needed. To maintain student efficacy, instructors can provide appropriate feedback to help students develop self-confidence. The feedback should be specific to the task and relevant to the learning skill to be acquired. The literature shows that participants' perceptions on one or more measures (knowledge, confidence, experience) may decrease in the middle of the semester (Berger et al., 1999). Finding such changes in perceptions reinforces the need to provide additional feedback and guidance during the middle of the semester as well.
In this study, there was no significant difference between male and female perceptions (Fig. 5) regarding how effectively they created and analyzed data. Using these results as a guide for adjusting the manner of teaching suggests that gender is not a major overall concern in tasks that require data-creation and data-analysis competence. The analysis of student perceptions according to their school affiliation (Fig. 6) shows a significant difference between engineering students and LS&A students. Engineering student perceptions of their knowledge, confidence, and experience measures are significantly higher than the perceptions of LS&A students, the greatest difference being in their confidence. It is possible that engineering students had more opportunity to deal with data-analysis and data-presentation in their engineering courses. Perhaps LS&A students need more instruction on data-creation and data-analysis related tasks.
Findings from this study support the notion that active learning affects self-efficacy. There was a significant difference between students who orally presented data and those who did not, which strongly supports the active learning theory, i.e., learning by doing. This aspect of the course corresponds to a broader reform effort for introductory college level courses, which effort encourages student participation in the creation of knowledge.
This journal is © The Royal Society of Chemistry 2015