Student perceptions of learning data-creation and data-analysis skills in an introductory college-level chemistry course

Nirit Glazer
University of Michigan, USA. E-mail: nirit.glazer@gmail.com

Received 18th October 2014, Accepted 23rd February 2015

First published on 23rd February 2015


Abstract

This study examines how students perceive their learning of creating and analyzing data in a college-level introductory inquiry chemistry course that features oral presentations in student-centered discussions. A student Participant Perception Indicator (PPI) survey was administered to obtain data on student perceptions of their own data-creation and data-analysis skills, which are essential for learning and understanding science. Student perceptions regarding gaining knowledge were consistently higher than their perceptions regarding gaining confidence and experience; however, both the confidence and the experience measures increased significantly as the semester progressed. Further, significant differences in student perceptions were found between students who made oral presentations and students who did not. This finding strongly supports active learning theory, i.e., learning by doing, and strongly encourages student participation in knowledge creation. Findings were also analyzed according to student demographics (gender, school) to determine patterns for different populations within the groups of students. Such analysis is important for instructors and course designers, enabling them to adjust their manner of teaching based on the demographics of their classes, and to adjust the feedback and guidance they provide, as needed.


Introduction

Current reform efforts in college-level undergraduate education in the USA aim to develop pedagogical methods for facilitating student-centered classroom instruction. Student-centered instruction, a form of active learning, shifts the focus from the teacher to the student. One way to foster active learning in student-centered classrooms is to incorporate data-analysis tasks, such as presentations of lab findings, in a guided discussion session. The Boyer Report (1998), Reinventing Undergraduate Education: A Blueprint for America's Research Universities, recommends processes for teaching, rather than for content, that can be applied appropriately across disciplines. The report encourages the creative use of technology for educational processes, and links communication skills with course work. It argues that undergraduate education should build on the freshman foundations, and suggests that the freshman year should offer numerous opportunities for both research-based and inquiry-based learning. Specifically, it argues that traditional classroom methods be replaced by inquiry methods. It strongly recommends that teacher-centered classrooms become student-centered, and that team-centered learning replace individual competitive learning.

Other studies clearly address the need for changes in undergraduate education, and support a shift from instructor-centered learning to student-centered learning, particularly in introductory-level classes (Francisco et al., 1998; Slunt and Giancarlo, 2004). In traditional classes, the teacher, rather than the student, becomes highly skilled through speaking, consulting, organizing, and solving problems. In contrast, student-centered teaching focuses on the skills and cognitive development of students (Xu, 2003). In this teaching process, students no longer passively receive information, but actively participate in teaching, learning and thinking. Many reports argue that student-centered learning is a superior form of active learning (Lloyd and Spencer, 1994; Felder and Brent, 1996; Siebert and McIntosh, 2001; Michael and Modell, 2003; Yuretich, 2003; Oliver-Hoyo et al., 2004). These studies indicate that student-centered activities are more effective when they require student involvement in higher-order thinking tasks, such as analyzing data and applying results to new situations.

One way to implement active learning in student-centered classrooms is to incorporate instructional technology, such as oral presentations of lab findings, in guided discussion sessions. Oral presentations in a science laboratory course involve data-analysis, which reinforces the development of higher-order cognitive skills and problem solving, and enhances critical thinking (Kovac, 1999; Zoller, 1999; Kerner et al., 2002; McKeachie, 2002). Moreover, presenting a data-analysis exposes students to the thinking and qualitative reasoning processes by which scientists organize data, develop principles, make predictions, and design experiments.

This study describes a novel approach for teaching skills that are essential to the process of learning and understanding chemistry, and evaluates the effectiveness of that innovative practice. More specifically, the study examines how students perceive their learning of creating and analyzing visual data in a college-level introductory inquiry chemistry course. The perceptions of student experiences are essential to the development of educational processes, in addition to the assessment of academic achievements. By obtaining input from participants (input that might be missed or considered irrelevant by an external observer), the development and implementation of learning environments can be improved (Fraser, 1994, 1998). Thus, this study investigates the perceptions of students regarding their ability to create and analyze data. The participating students were enrolled in a large (745 students, 33 sections) introductory, guided-inquiry chemistry course at a major research university. Results from this study are not limited to chemistry students, but are broadly applicable to any area of science and engineering.

Research question

The research question presented in this study is: do students perceive that they are learning, becoming more confident, and gaining experience regarding how to effectively create and analyze data in an introductory inquiry chemistry course at the college level?

Literature review: why data-creation and data-analysis are important in chemistry learning

The National Science Education Standards (National Research Council, 1996), the Framework for K-12 Science Education (National Research Council, 2012), the Boyer Report (1998), and other contemporary science education literature emphasize inquiry as essential for student learning, and support the use of inquiry for instruction (Krajcik et al., 1998; Lunetta, 1998; Olson and Loucks-Horsley, 2000; Hofstein and Lunetta, 2004; Bruck and Towns, 2009; Crawford, 2014). The inquiry lab environment develops in students the qualitative reasoning skills that scientists use to solve problems, such as formulating hypotheses, organizing data, making inferences, and designing experiments. The goal is to determine whether students perceived learning from such an environment.

The interpretation of data, and the ability to construct graphs and tables, are essential for the scientific process (Bowen and Roth, 2005). Graphs and tables are invaluable tools for representing data and for finding relationships between variables, particularly for determining patterns, properties and reactivity of matter. Such abilities are skills required by scientists for conducting investigations, for analyzing data, for drawing conclusions, and for writing research studies (Bowen and Roth, 2005).

The National Science Education Standards (National Research Council, 1996) and the Framework for K-12 Science Education (National Research Council, 2012) emphasize data-analysis skills. They state that students need to learn how to analyze evidence and data, and that the evidence they analyze may be from their own investigations, from other student investigations, or from databases. The Standards also suggest data-related activities in which students ought to be skilled; such skills include the ability to collect, organize, and describe data, to create tables and graphs, and to analyze and interpret data to identify patterns, properties, and relationships.

Data-creation and data-analysis skills are not limited to the chemistry classroom, but are becoming increasingly important also outside the classroom. One of the central skills required in the 21st century is the ability to work with data, for instance, to make inferences from given data, to find trends, to criticize data, and to use the data. In the current information era, students need the ability to understand day-to-day medical results (e.g., growth-tracking charts and cholesterol levels), as well as commercial advertisements and news media (e.g., political elections, sports, and financial matters).

Educators call for practical programs that encourage the development of graphing and data-analysis competence (Tobin, 1990; Roth and McGinn, 1997; Glazer, 2011). The programs should provide opportunities to reflect on findings in the lab for clarifying understandings and misunderstandings with peers (Tobin, 1990). Although inquiry and associated skills, such as data analysis and data inference, are essential components in science learning, little is known about the perceptions by students of their own cognitive capability and their confidence to implement such skills. This paper enriches the literature with a study of student perceptions while learning in a particular course designed to foster cognitive skills important for inquiry, such as data-creation and data-analysis. This study is particularly valuable to high school and undergraduate science instructors and curricula developers interested in implementing a student-centered learning approach into their instruction in order to enhance data-creation and data-analysis skills.

Method

The learning environment

The current paper studies an introductory chemistry lab course at a major US research university that implements student-centered learning. The course includes a team-based collaborative inquiry lab, with technology-assisted data sharing and technology-assisted data-analysis. It includes a student-centered post-lab discussion featuring oral presentations by students of findings regarding specified questions that require analysis and/or application of lab data. The course is non-traditional, in that a student-centered discussion follows the completion of collaborative inquiry lab investigations before the class moves on to a new lab topic. Usually, an introduction to general college chemistry requires students to apply lecture material in weekly labs. In the course featured in this study, the students were asked, before moving on to the next topic, to organize and analyze the varying team data, and to orally present the findings in a one-hour student-centered discussion. For example, when the students studied redox reactions, they were asked to present their observations and conclusions regarding the reactivity of halogens (oxidizing agents) and halides (reducing agents), and to explain whether there is a correlation between the reactivity of halogens and halides. In another question, also regarding redox, students were asked to present their interpretations of the reaction between Cu(NO3)2 and NaI, to explain how they identified the reacting species and any products formed, to explain the purpose of adding hexane, and to explain whether the hexane is a reactant.
Course structure. The course consists of a lecture given by a faculty member, and of a set of lab experiments and discussion sessions, each comprising multiple sections of 20–24 students taught by Teaching Assistants (TAs). The lab groups are broken down into teams of four students, where team members rotate roles in each experiment (team manager, recorder, technologist, and chemist/safety officer). The manager of the team is responsible for the presentations; thus, the role of the presenter is assumed by a different member for each experiment. Other team members are encouraged to help the presenter prepare for the discussion. Since roles rotate after the completion of each topic, each student is assigned to be the presenter at least once throughout the semester. Students taking this class are not chemistry majors and are usually enrolled because of distribution requirements. The students are primarily from the Engineering School or from the College of Literature, Science, and the Arts (LS&A). The LS&A is a liberal arts and sciences college within a large university. Students' majors can be in many different programs in three main areas of study: social sciences, natural sciences, and humanities. Some common majors are Psychology, Economics, Political Science, Biology, English, Communication Studies, Mathematics, and History.

Students in the course made observations in laboratory experiments and recorded the observations as a team in a central database. Students then received problems that required organization and analysis of the collated class data, and in the following week they presented their answers in a student-centered discussion. The main topics taught in the course included precipitation reactions, solution color and spectroscopy, redox reactions, acids and bases, and Lewis acids and bases.

The instrument

A student Participant Perception Indicator (PPI) survey was administered in order to obtain data on student perceptions of their own abilities. The PPI survey, grounded in the concept of self-efficacy that Albert Bandura introduced as a key component of social cognitive theory (Bandura, 1977), consisted of a short web survey that measured perceptions of knowledge, experience and confidence in three to five themes (factors) determined to be important for learning. The overall objective was to check whether students perceived that they were learning, that they were becoming more confident, and that they were gaining experience as a result of participating in the course, particularly when asked to visualize results and to orally communicate findings in the student-centered discussions. IRB approval was obtained prior to the beginning of data collection.

The PPI survey was developed to determine whether students perceived learning as a result of attending the course described above. The survey asked students to respond to specified perception statements about whether they had acquired various discussion-related skills, such as data-creation, data-analysis, and speaking in front of peers. In addition to the perception statements, the survey included demographic questions, such as gender, school, section, and year in the program. This information is important to examine, in case various groups of students responded to course objectives in different ways.

The PPI survey for this study included 20 statements that related to several factors which constituted broad objectives of the course, each addressed by a set of multiple (four or five) statements to increase reliability. A factor analysis was then run in order to group the statements into factors. The content of each statement was validated by a chemistry faculty member who taught the course, and by a science educator who was experienced in creating and analyzing surveys. For each perception statement, the students were asked to indicate how they perceived their knowledge, experience and confidence levels according to a five-point Likert scale, similar to the example in Fig. 1 below.


Fig. 1 Sample PPI question.

In the example above, response 5 in the knowledge category indicated the participant perceived that he/she had a great deal of knowledge about organizing data into a table or graph; response 3 in the experience category indicated the participant perceived that he/she had an average amount of experience about organizing data into a table or graph; and response 1 in the confidence category indicated the participant had no confidence about organizing data into a table or graph.

The research sample

The PPI survey was administered via the web in three rounds during the semester (beginning, middle, and end) to approximately 700 students (33 sections) in the introductory chemistry course. Overall, 1194 responses were received. In order to increase reliability, only sections with a high level of responses (i.e., the 21 sections where more than half of the students responded consistently all three times) were included in the study. The number of responses in these sections totaled 951 (N = 951). Each round had an average response rate of 60–75%, i.e., 291–360 responses out of 475 students. The first round (Time 1) included 300 responses and was administered after the students had completed the first discussion; the second round (Time 2) included 291 responses and was administered mid-semester; and the last round (Time 3) included 360 responses and was administered at the end of the semester.

The responses were primarily from Engineering School students (N = 416), constituting 43.74% of the responses, and from Literature, Science, and Arts (LS&A) School students (N = 493), constituting 51.84% of the responses. Most of the students were freshmen (N = 810), constituting 85.17% of the responses, with 58% of the students being male (N = 552) and 42% being female (N = 399). Both presenters (N = 329, 34.6% of the responses) and non-presenters (N = 622, 65.4% of the responses) responded to the survey. Students' demographic and survey information are summarized in Table 1.

Table 1 Students' demographic and survey information

Demographic/survey info                        N     Percentage (%)
School                 LS&A                    493   51.84
                       Engineering             416   43.74
                       Other                   42    4.42
Year in the program    Freshman                810   85.17
                       Sophomore               98    10.30
                       Junior                  23    2.42
                       Senior                  20    2.10
Gender                 Male                    552   58.04
                       Female                  399   41.96
Student role           Presenters              329   34.60
                       Non-presenters          622   65.40
Survey round           Time 1 (beginning)      300   31.55
                       Time 2 (middle)         291   30.60
                       Time 3 (end)            360   37.85
Total responses                                951   100.00


Results

The current research focuses on only one of the factors in the survey, namely whether students perceived that they had acquired the skills associated with data-creation and data-analysis. This factor is a combination of the following five individual perception statements:

1. State the main conclusion to the assigned question

2. Organize data into a table or graph

3. Use alternate methods to organize and visualize class data via tables or graphs

4. Create graphs/tables to visualize the data

5. Interpret relationships represented in the graphs/tables

A factor analysis was conducted to verify that the above five items could be reduced to one factor. For each of the knowledge, experience, and confidence levels, all five survey items (statements) were found to be significantly correlated (all p-values were less than 0.001). A reliability analysis confirmed that the items fit well together (Cronbach's alpha for knowledge, experience, and confidence was 0.851, 0.827, and 0.821, respectively). Thus, all five items can be averaged for the analysis.
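The internal-consistency check used here can be reproduced from raw item scores. Below is a minimal sketch of the Cronbach's alpha computation in Python; the score matrix is invented illustrative data, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert scores: 4 respondents x 5 statements (made-up numbers)
scores = np.array([
    [5, 4, 5, 4, 5],
    [3, 3, 4, 3, 3],
    [2, 2, 1, 2, 2],
    [4, 5, 4, 4, 4],
], dtype=float)

alpha = cronbach_alpha(scores)
print(round(alpha, 3))  # a high alpha suggests the items measure a common construct
```

Values such as the 0.851, 0.827, and 0.821 reported above are conventionally taken to indicate good internal consistency, which is what licenses averaging the five items into a single factor score.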

A data-analysis was then conducted to identify patterns over time, and patterns related to the different knowledge, experience, and confidence levels. In the data-analysis, the data was also analyzed to determine patterns for different populations within the group of students. Following are the results of this data-analysis.

Analysis of perception over time

Fig. 2 shows the mean of student perceptions in each round of the administered survey for each of the knowledge, experience and confidence levels. Analysis of the ratings in Fig. 2 shows the differences between the student perceptions regarding their knowledge, experience and confidence levels.
Fig. 2 Rating of perception over time by measures.

As shown in Fig. 2, student perceptions of “knowledge” were at a higher level than their perceptions of “confidence” and “experience” in all the rounds. An ANOVA test with a 5% significance level (α) shows that, as the semester progressed, there was no significant change from Time 1 to Time 3 in student perceptions of their “knowledge” level (p-value = 0.389), but there was a significant increase in their “experience” level (p-value = 0.014) and a nearly significant increase in their “confidence” level (p-value = 0.052).
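The over-time comparison described above is a one-way ANOVA across the three survey rounds. A minimal sketch using SciPy, with invented ratings standing in for the study's data:

```python
from scipy import stats

# Hypothetical 1-5 perception ratings for three survey rounds (made-up numbers,
# chosen only to illustrate an increase over the semester).
time1 = [3, 3, 4, 2, 3, 3, 4, 3]
time2 = [3, 4, 4, 3, 4, 3, 4, 4]
time3 = [4, 4, 5, 3, 4, 4, 5, 4]

# One-way ANOVA: is at least one round's mean different from the others?
f_stat, p_value = stats.f_oneway(time1, time2, time3)

# At the 5% significance level used in the study, p < 0.05 indicates
# a significant change in mean perception across the rounds.
print(p_value < 0.05)
```

With real survey data, each list would hold the per-student factor averages for one administration round, and a p-value such as the reported 0.014 for “experience” would mark a significant increase.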

Analysis of perception mean by measures

Fig. 3 shows the perception mean for each measure (knowledge, experience, and confidence). The vertical lines in the graph are confidence intervals that represent the range of responses within one standard deviation. The length of a confidence interval reflects the diversity of responses, with longer intervals indicating greater variation. As a rule of thumb, overlapping confidence intervals suggest no significant difference between the groups or variables, whereas non-overlapping intervals indicate a significant difference.
Fig. 3 Perception mean by measures.

As can be seen in Fig. 3, students had higher perceptions of having gained knowledge of data-creation and data-analysis competence than of having gained confidence or experience in those skills. Also, the confidence intervals (vertical lines) of “experience” and “confidence” overlap, but neither overlaps with the “knowledge” confidence interval. Thus, the student responses regarding gained knowledge are significantly different from their responses regarding gained confidence or experience. Furthermore, the “experience” and “confidence” responses showed greater variation (longer confidence intervals) than the “knowledge” responses.
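The overlap heuristic applied in Fig. 3 and the comparisons that follow can be sketched numerically: compute an approximate 95% confidence interval for each group mean and test whether the intervals intersect. The ratings below are invented for illustration:

```python
import math

def mean_ci(values, z=1.96):
    """Approximate 95% confidence interval (low, high) for the mean."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    half = z * math.sqrt(var / n)                         # z * standard error
    return mean - half, mean + half

def intervals_overlap(a, b):
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical 1-5 ratings for two measures (made-up numbers)
knowledge  = [4, 4, 5, 4, 4, 5, 4, 4, 5, 4]
experience = [3, 3, 3, 2, 3, 3, 4, 3, 3, 3]

ci_k = mean_ci(knowledge)
ci_e = mean_ci(experience)
# Non-overlapping intervals suggest a significant difference between measures.
print(intervals_overlap(ci_k, ci_e))
```

This is only the visual rule of thumb used to read the figures; the study's significance claims rest on the reported ANOVA p-values rather than on interval overlap alone.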

Analysis by the role of the students

Fig. 4 shows the perception mean for each measure in terms of the role of the students at the particular time the survey was taken, irrespective of whether the students presented the data by themselves, or whether another team member presented the data.
Fig. 4 Perception mean by role of the students.

As can be seen in Fig. 4, there was a significant difference between students who orally presented data and those who did not. Indeed, the confidence intervals of the two groups do not overlap; the p-values for knowledge and experience are less than 0.001, and for confidence less than 0.05. The perceptions of presenters were significantly higher than those of non-presenters at the time the survey was taken. This strongly supports active learning theory, i.e., learning by doing. Students who were actively involved in a particular assignment perceived themselves as having learned more and as having gained more experience and confidence, in comparison to their teammates who were not in charge of the presentation.

Analysis by gender

Fig. 5 shows an analysis of the perceptions by students of data-creation and data-analysis competence according to their gender.
Fig. 5 Analysis by gender.

As can be seen in Fig. 5, there was no significant difference between males and females regarding their perceptions of “experience” and “confidence”: the confidence intervals overlap, and the p-values (0.123 and 0.086, respectively) were both above 0.05. There was, however, a difference regarding their perception of “knowledge”, with females rating their knowledge higher than males did (p-value = 0.036). Overall, the “knowledge” and “experience” perceptions of females were higher than those of males, but their “confidence” perception was lower.

Analysis by the school affiliation

Fig. 6 shows the perception mean of students according to their school affiliation.
Fig. 6 Perception mean by school affiliation.

Fig. 6 reveals a significant difference between LS&A students and engineering students regarding their perception of data-creation and data-analysis competence: the confidence intervals do not overlap, and the p-values are less than 0.05. For all measures (knowledge, confidence, and experience), the perceptions of engineering students were higher than those of LS&A students, with the greatest difference in their confidence level (p-value = 0.0004).

Discussion

The overall objective was to check whether students perceived they were learning, becoming more confident, and gaining experience, regarding how to effectively create and analyze visualizations of data as a result of participating in an introductory college-level chemistry lab course, where they were asked to visualize results and orally communicate their findings in student-centered discussions. The results (Fig. 2 and 3) show that the responses by students regarding the gaining of knowledge were significantly different from those regarding the gaining of confidence or experience. As the semester progressed, there was no significant change in the perceptions by students of their “knowledge” regarding data-creation and data-analysis; however, the students perceived gaining more confidence and experience in those skills.

Overall, the students indicated that they had a great deal of knowledge regarding their ability to effectively create and analyze data. At the beginning of the semester, they felt assured in their knowledge, but they did not feel confident or experienced in data-creation and data-analysis. The data showed that students grew in confidence and experience regarding their data-creation and data-analysis skills.

Confidence and experience are important for learning. Improving confidence increases the chances of success. When students are more confident, they are likely to take a task more seriously, challenge themselves more, and succeed more (Linnenbrink and Pintrich, 2003). Experience also has an important impact on learning, since learning new skills takes time. However, practice alone is not enough to ensure that a skill will be acquired; practice must be accompanied by appropriate instruction and appropriate feedback (National Research Council, 2001).

Fig. 4 shows that students who were active in a presentation perceived learning more than their teammates who were less active. This finding supports the correlation between self-efficacy, engagement, and learning presented in the model of Linnenbrink and Pintrich (2003). According to the National Science Education Standards, science is an active process in that “learning science is something that students do, not something that is done to them” (National Research Council, 1996, p. 20). This finding also provides important feedback for the course designer, and suggests that more students need to be actively engaged in future assignments.

Strengths and limitations of using self-report data and the limitations of this study

Higher education scholars and institutional researchers rely heavily on self-reported survey data in their work. Surveys have a number of advantages over other data collection methods. For example, they can be implemented with relative ease, quickly and at little cost. In addition, they can relatively easily survey large populations, particularly with the assistance of the current technology. Because developing objective tests of student learning and skills can be extremely time consuming and costly, the use of self-reported surveys is widespread.

Researchers generally agree that the use of self-reported estimates of learning is valid within limits (Anaya, 1999a; Gonyea, 2005). At the most basic level, there is a concern regarding the validity of self-report measures (Razavi, 2001). The literature indicates that student self-reports have only moderately positive correlations with objective measures when used in estimating the learning or skill of individuals. When aggregated to compare the performance of groups, the reliability of self-reported measures is quite high and is generally considered to be a valid measure of a real difference in learning between groups (Pike, 1995, 1996; Anaya, 1999b; Pascarella, 2001; Volkwein and Yin, 2010).

This study focuses on the perceptions by different groups of students, based on their demographic characteristics or their role in the task. Future studies should also consider comparing perceptions of students in different achievement levels in order to check a correlation between their perceptions of learning and of their achievement.

Implications

High-school and/or undergraduate science instructors who are interested in implementing a student-centered learning approach in their instruction on data-creation and data-analysis skills can benefit from this study. Before adopting any learning approach, it is useful to know whether such an approach is perceived as valuable from the point of view of the students. The reactions of students, and the perceptions of their experiences, are important parameters of learning environments; they can complement the assessment of academic achievement to give a complete picture of the educational process (Fraser, 1994, 1998). Self-efficacy may also impact the motivation of students and their desire to learn (Zusho et al., 2003), and may also serve as an achievement barrier even among capable students (Steele and Aronson, 1995; Steele, 1997; Linnenbrink and Pintrich, 2003).

The data was analyzed according to student demographics (gender, school) to determine patterns for different populations within the groups of students. This analysis is important for course designers, enabling them to adjust their manner of teaching based on the demographics of their classes, and to adjust the feedback and guidance they provide, as needed. To maintain student efficacy, instructors can provide appropriate feedback in order to help students develop self-confidence. The feedback should be specific to the task and relevant to the learning skill to be acquired. The literature shows that participant perceptions of one or more measures (knowledge, confidence, experience) may decrease in the middle of the semester (Berger et al., 1999). Finding such changes in perceptions reinforces the need to provide feedback and guidance during the middle of the semester as well.

In this study, there was no significant difference between male and female perceptions (Fig. 5) regarding how effectively they created and analyzed data. Using these results as a guide for adjusting the manner of teaching suggests that gender is not a major overall concern in tasks that require data-creation and data-analysis competence. The analysis of student perceptions according to their school affiliation (Fig. 6) shows a significant difference between engineering students and LS&A students. Engineering student perceptions of their knowledge, confidence, and experience measures are significantly higher than the perceptions of LS&A students, the greatest difference being in their confidence. It is possible that engineering students had more opportunity to deal with data-analysis and data-presentation in their engineering courses. Perhaps LS&A students need more instruction on data-creation and data-analysis related tasks.

Findings from this study support the notion that active learning affects self-efficacy. There was a significant difference between students who orally presented data and those who did not, which strongly supports the active learning theory, i.e., learning by doing. This aspect of the course corresponds to a broader reform effort for introductory college level courses, which effort encourages student participation in the creation of knowledge.

Acknowledgements

I would like to thank Nancy Kerner for allowing me to collect data in her class, and Joseph Krajcik, Christine Feak, and Benjamin Barish for their feedback on my manuscript.

References

  1. Anaya G., (1999a), Accuracy of Self-Reported Test Scores, College and University, 75(2), 13–19.
  2. Anaya G., (1999b), College impact on student learning: Comparing the use of self-reported gains, standardized test scores, and college grades, Res. High. Educ., 40(5), 499–526.
  3. Bandura A., (1977), Self-efficacy: toward a unifying theory of behavioral change, Psychol. Rev., 84, 191–215.
  4. Berger C., Kerner N. and Lee Y., (1999), Understanding student perceptions of collaboration, laboratory and inquiry use in introductory chemistry, National Association for Research in Science Teaching.
  5. Bowen G. and Roth W., (2005), Data and graph interpretation practices among preservice science teachers, J. Res. Sci. Teach., 42(10), 1063–1088.
  6. Boyer E., (1998), The Boyer commission on educating undergraduates in the research university. Reinventing Undergraduate Education: A Blueprint for America's Research Universities, Menlo Park, CA: Carnegie Foundation for the Advancement of Teaching.
  7. Bruck L. B. and Towns M. H., (2009), Preparing Students to Benefit from Inquiry-Based Activities in the Chemistry Laboratory: Guidelines and Suggestions, J. Chem. Educ., 86(7), 820–822.
  8. Crawford B. A., (2014), From inquiry to scientific practices in the science classroom, in Lederman N. G. and Abell S. K. (ed.), Handbook of research in science education, vol. 2, Routledge, pp. 515–544.
  9. Felder R. and Brent R., (1996), Navigating the bumpy road to student-centered instruction, College Teaching, 44(2), 43–47.
  10. Francisco J., Nicoll G. and Trautmann M., (1998), Integrating multiple teaching methods into a general chemistry classroom, J. Chem. Educ., 75(2), 210.
  11. Fraser B., (1994), Research on classroom and school climate, Handbook of Research on Science Teaching and Learning, pp. 493–541.
  12. Fraser B., (1998), Science learning environments: assessment, effects and determinants, International Handbook of Science Education, 1, 527–561.
  13. Glazer N., (2011), Challenges with graph interpretation: a review of the literature, Studies in Science Education, 47(2), 183–210.
  14. Gonyea R. M., (2005), Self-reported data in institutional research: review and recommendations, New Directions for Institutional Research, 127, 73.
  15. Hofstein A. and Lunetta V., (2004), The laboratory in science education: foundations for the twenty-first century, Sci. Educ., 88(1), 28–54.
  16. Kerner N., Black B., Monson E. and Meeuwenberg L., (2002), Training Instructors to Facilitate Collaborative Inquiry, Journal of Student Centered Learning, 1(1), 29–36.
  17. Kovac J., (1999), Student active learning methods in general chemistry, J. Chem. Educ., 76(1), 120.
  18. Krajcik J., Blumenfeld P., Marx R., Bass K., Fredricks J. and Soloway E., (1998), Inquiry in project-based science classrooms: initial attempts by middle school students, J Learn. Sci., 7(3), 313–350.
  19. Linnenbrink E. and Pintrich P., (2003), The role of self-efficacy beliefs in student engagement and learning in the classroom, Read. Writ. Q., 19(2), 119–137.
  20. Lloyd B. and Spencer J., (1994), The Forum: New Directions for General Chemistry: Recommendations of the Task Force on the General Chemistry Curriculum, J. Chem. Educ., 71(3), 206.
  21. Lunetta V., (1998), The school science laboratory: historical perspectives and contexts for contemporary teaching, International Handbook of Science Education, 1, 249–264.
  22. Mckeachie W. J., (2002), Teaching Tips: Strategies, Research, and Theory for College and University Teachers, Boston, MA: Houghton Mifflin Company, 11th edn.
  23. Michael J. and Modell H., (2003), Active learning in secondary and college science classrooms: A working model for helping the learner to learn, Lawrence Erlbaum.
  24. National Research Council, (1996), National Science Education Standards, Washington, DC: National Academy Press.
  25. National Research Council, (2001), in Pellegrino J. W., Chudowsky N. and Glaser R. (ed.), Knowing what students know: the science and design of educational assessment, Washington, DC: National Academies Press.
  26. National Research Council, (2012), in Quinn H., Schweingruber H. and Keller T. (ed.), A framework for K-12 science education practices, crosscutting concepts, and core ideas, Washington, DC: The National Academies Press.
  27. Oliver-Hoyo M., Allen D., Hunt W., Hutson J. and Pitts A., (2004), Effects of an active learning environment: teaching innovations at a research I institution, J. Chem. Educ., 81(3), 441.
  28. Olson S. and Loucks-Horsley S., (2000), Inquiry and the National Science Education Standards: A Guide for Teaching and Learning, National Academies Press.
  29. Pascarella E., (2001), Using student self-reported gains to estimate college impact: a cautionary tale, J. Coll. Student Dev., 42(5), 488–492.
  30. Pike G. R., (1995), The relationships between self-reports of college experiences and achievement test scores, Res. High. Educ., 36, 1–22.
  31. Pike G. R., (1996), Limitations of using students' self-reports of academic development as proxies for traditional achievement measures, Res. High. Educ., 37, 89–114.
  32. Razavi T., (2001), Self-report measures: an overview of concerns and limitations of questionnaire use in occupational stress research, University of Southampton, Southampton, UK, 23 pp. (Discussion Papers in Accounting and Management Science, (01-175), Retrieved February 24, 2015 from: http://eprints.soton.ac.uk/35712/).
  33. Roth W. and McGinn M., (1997), Graphing: Cognitive ability or practice? Sci. Educ., 81(1), 91–106.
  34. Siebert E. and McIntosh W., (2001), College Pathways to the Science Education Standards, Arlington, VA: National Science Teachers Association.
  35. Slunt K. M. and Giancarlo L. C., (2004), Student-Centered Learning: A Comparison of Two Different Methods of Instruction, J. Chem. Educ., 81(7), 985–988.
  36. Steele C., (1997), A threat in the air: How stereotypes shape intellectual identity and performance, Am. Psychol., 52(6), 613–629.
  37. Steele C. and Aronson J., (1995), Stereotype threat and the intellectual test performance of African Americans, J. Pers. Soc. Psychol., 69(5), 797–811.
  38. Tobin K., (1990), Research on science laboratory activities: in pursuit of better questions and answers to improve learning, Sch. Sci. Math., 90(5), 403–418.
  39. Volkwein J. F. and Yin A. C., (2010), Measurement issues in assessment, New Directions for Institutional Research, pp. 141–154,  DOI:10.1002/ir.336.
  40. Xu J., (2003), The reform of teaching in General Chemistry: establishing student-centered teaching strategies, The China Papers, 15–19.
  41. Yuretich R., (2003), Encouraging Critical Thinking: Measuring Skills in Large Introductory Science Classes, J. Coll. Sci. Teach., 33(3), 6.
  42. Zoller U., (1999), Scaling-up of higher-order cognitive skills-oriented college chemistry teaching: An action-oriented research, J. Res. Sci. Teach., 36(5), 583–596.
  43. Zusho A., Pintrich P. and Coppola B., (2003), Skill and will: the role of motivation and cognition in the learning of college chemistry, Int. J. Sci. Educ., 25(9), 1081–1094.

This journal is © The Royal Society of Chemistry 2015