Learn to teach chemistry using visual media tools

Suat Turkoguz *
Dokuz Eylül University Buca Faculty of Education, Department of Science Education, İzmir, Turkey. E-mail: suat.turkoguz@gmail.com; Fax: +90 232 420 48 95; Tel: +90 505 698 06 34

Received 6th January 2012, Accepted 7th June 2012

First published on 13th July 2012


Abstract

The aim of this study was to investigate undergraduate students' attitudes to using visual media tools in the chemistry laboratory. One hundred and fifteen undergraduates studying science education at Dokuz Eylül University, Turkey participated in the study. They video-recorded chemistry experiments with visual media tools and assessed them on a social website. A Likert scale consisting of 29 items was developed to obtain the students' views on using visual media tools in the chemistry laboratory. The scale has a four-factor structure. The Cronbach's alpha reliability coefficient was 0.95. In conclusion, this study found that using visual media tools in the general chemistry laboratory provides a positive contribution to students' behaviours and skills, improves students' learning attitudes to chemistry laboratory courses and increases their interest in visual media tools. The students also showed positive attitudes to the applicability of such tools.


Introduction

The technological advances in visual media tools, especially mobile phones with multi-visual media players, and the widespread use of mobile phones in social networks on the internet have had considerable consequences for teaching and learning. Through this technology, large amounts of information can be conveyed to many students. Via the internet, the products of visual media can easily be downloaded and shared in different situations (Varol, 1995; Petko and Reusser, 2003; Tetko et al., 2005; Vural and Zellner, 2010). These tools are important because visual products can be watched repeatedly, evaluated from different perspectives, compared with data obtained from observation or with other visual products, and examined at different times and in different situations.

With visual media tools, providing students with new knowledge is easier, the visualization of abstract concepts is facilitated and the learning process is accelerated by relating this new knowledge to the student's existing knowledge. Consequently, the student's critical thinking and problem-solving skills improve, and her/his success and interest in learning increase (Cennamo, 1993; Erim and Yöndem, 2009; Çelik, 2010; Pekdağ, 2010). Visual media tools have disadvantages as well as advantages, however. If the visual products are not prepared well enough, the student may not understand the content even when motivated for the lesson. In such cases the visual product should be strengthened with different instructional and technological designs.

Studies of students' performance have observed that when visual media tools are used in lessons, students' social skills increase, they contribute effectively to group decisions, their self-confidence grows, their performance improves, and individuals who observe themselves take steps towards self-improvement (Varol, 1995; Tezer, 2008). Consequently, it is fair to say that the use of visual media tools in teaching and training processes can result in positive outcomes for students' academic and social development.

The use of visual media tools in class is an effective method that can help students visualize concepts, see details, relate concepts, improve discussion activities, accommodate different learning styles and increase motivation in the lesson (Maniar et al., 2008). Teaching with visual media tools has such benefits as revealing individuals' honest emotions, creating effective communication and providing flexibility in educational activities (Vural and Zellner, 2010). Using visual media tools in social networks is also beneficial. Access to educational videos through these networks allows students to save time, study wherever they want and support their own learning through student-centred activities; it creates a cooperative learning environment by letting students communicate with each other through social sharing networks, provides a virtually limitless electronic learning environment and increases the effectiveness of learning through access to a great number of videos (Zhang et al., 2006). In a classroom environment, the use of visual media tools can contribute positively to learning in activities otherwise difficult to perform as group activities (Sharples et al., 2009).

Smart mobile communication tools are also used in medical education. Chang and colleagues (2012) used mobile communication tools in a study conducted in Botswana. Medical students were given 3G smartphones loaded with medical content software. A communication centre was set up at the Botswana Faculty of Medicine to communicate with the students and to collect data about patients in the field. The students visited patients in their assigned areas and recorded visual and audio information about their conditions; patients were treated after the recorded information had been transmitted to medical experts at the communication centre for diagnosis. The students were interviewed before the smartphones were delivered. Four weeks after the applications began, the students were interviewed again and a Likert-type scale was developed; seven weeks after the applications, this scale was used to investigate the benefits of smart mobile devices in medical education. The observations within the scope of the study showed that the students' use of, and demand for, smartphones increased, suggesting that the students could manage their own learning.

In many European countries, chemistry education presents problems. One of the most important reasons for this is the negative attitude of students and wider society towards chemistry. The majority of students in high school think that chemistry is a difficult discipline and have difficulty understanding its concepts. It is also evident that students' interest in chemistry decreases in their first year at university. The reason for this decrease might be that the contents of chemistry laboratory classes are boring, out of date and lacking the dynamism that students experience through visual media tools. Lecturers complain about students' inability to relate laboratory classes to chemistry and to put theory into practice. The most common complaints are that chemistry lessons depend on concepts, have a complex structure, lack up-to-date materials and make inadequate connections between high school and university chemistry concepts (De Jong, 2000). These inadequacies can be attributed to frequent changes in chemistry education programmes, inadequate content arrangements, the lack of effort to dispel misconceptions and the fact that many people believe knowledge of chemistry to be unnecessary (Johnstone, 2000).

In video-based chemistry teaching, because they can see and examine the materials repeatedly, students can understand the properties of matter shown in images and more easily explain the concepts of chemistry (Franciszkowicz, 2008). In educational activities in which molecular animation and video animation combinations are used, it has been shown that students can better understand symbols and relate to them by understanding macroscopic and submicroscopic diagrams as concepts, and can create dynamic mental models (Velazquez-Marcano et al., 2004). Students often cannot relate the observations they make in laboratory classes to the concepts that they learn in lessons, because laboratory classes are rich in conceptual knowledge rather than complex, applied activities (Nakhleh, 1994). Students can structure the concepts of chemistry better if laboratories are organized according to students' interests; students can then be expected to take a higher level of interest in chemistry lessons and laboratory classes. Chemistry laboratories can be changed into learning environments in which students enjoy both the contents and the teaching process without drifting away from the focus of the lesson and the learning activities. Laboratories equipped with media tools can help students and teachers to categorize concepts better. Laboratories should motivate students to enjoy lessons, improve their scientific thinking skills and help scientific concepts and progressions to be understood by means of concrete examples (Hofstein and Lunetta, 1982; Hodson, 1993; Aksoy et al., 2008; Azizoğlu and Uzuntiryaki, 2006; Morgil et al., 2009).

Laboratory environments equipped with visual tools can improve students' individual and social learning skills. Performing activities such as presentation, introduction and evaluation with electronic media tools can improve students' perception of, and attitude to, laboratory classes (Hofstein, 2004). Cyber laboratory environments and simulations are effective tools for learning concepts. Learning environments equipped with multiple media tools can contribute to students' performance and success. Videos of chemistry experiments shown in cyber environments can motivate students to conduct experiments and entertain them (Morozov et al., 2004). Encouraging the use of visual media tools in chemistry laboratories can expand students' interest in experiments.

Smartphones that can play visual media, convert visual information into written information and connect to the internet through captured images are used in educational settings. Benedict and Pence (2012) asked students to access web pages of experiment videos and photographs, which the students had previously prepared, using smartphones and two-dimensional barcodes. In this way, by reading the barcodes with their smartphones, the students could reach the experiment videos and photographs covering the points they did not understand during the experiments. The students' attitudes towards chemistry education implemented in this way were investigated with a five-point Likert-type measurement tool. According to the results, the students indicated that they could do group work, visualize concepts more easily and reach experiment materials online through 2D barcodes and smartphones, and that they wanted to use smartphones very frequently in the classroom.

The necessity for performance-based evaluation in laboratories has been emphasized for many years (Chen et al., 2005). Classical tests or multiple-choice tests are usually used to measure performance in laboratories. Measures such as video recordings, sound recordings and observation, however, which are authentic and better geared towards the process, will produce more realistic and accurate results concerning students' performance. If the process moves too fast in direct performance evaluations, some details can go unnoticed. Alternatively, students' performance can be evaluated indirectly by means of visual media tools. Students can record their experiments in the laboratory with visual media tools, evaluate their own performance and obtain other people's evaluations by sharing these recordings through social networks. In this way, they obtain more realistic results for their individual performance by means of comparative evaluations.

Smartphones that can play visual media, record audio and share both on the internet may also be used as assessment tools for giving feedback to students. Nortcliffe and Middleton (2011) gave students visual and audio feedback through smartphones in their research. According to the students' written feedback, they participated actively in the lessons, had positive attitudes to the course and improved their communication skills. It may therefore be claimed that giving students visual and audio feedback is particularly effective.

In the light of the above literature, in this study, first students were encouraged to record their performance with visual media tools and to share these recordings with other people in social networks. Then they were asked to evaluate other students' performance in the social networks. The aim was to measure science students' individual perceptions of the video recordings of the activities done in laboratory classes and shared in social networks by developing an ‘Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory’.

Method

This study was designed as a descriptive study in which the survey technique was used. It investigated undergraduate students' attitudes towards visual media tool use in chemistry laboratory classes.

Study group

The present study was conducted in the Science Education Department at the Faculty of Education, Dokuz Eylül University, Turkey, in the 2010–11 academic year. One hundred and fifteen undergraduate students enrolled in chemistry laboratory classes participated in this study.

Data collection tools

The Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory was used as a data collection tool. The scale was developed as follows.
Item pool creation. To create an item pool, the related literature was reviewed. The opinions of 75 undergraduate students in the Science Education Department about the use of visual media tools in chemistry laboratory classes were gathered through written compositions, in which five questions were asked three weeks after the chemistry experiments were done. An item pool of 61 items was subsequently created.
Acquisition of expert views. The opinions of three experts in science education, one expert in Turkish Language Education and one expert in Educational Sciences were sought regarding the items in the Attitude Scale. Taking their suggestions into consideration, the statements in the scale were revised and organized and a 61-item attitude scale (31 positive and 30 negative statements) was prepared.
Scale type. The scale is a five-point Likert-type scale. The positive items range from 1 = ‘Certainly Agree’ to 5 = ‘Certainly Disagree’. The negative items range from 5 = ‘Certainly Agree’ to 1 = ‘Certainly Disagree’. A trial form was composed in accordance with experts' suggestions and pilot application results.
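A minimal sketch of this scoring scheme (an illustrative helper, not code from the study; the function name and the convention that negative items are mirror-coded are assumptions):

```python
def score_item(response: int, positive: bool, points: int = 5) -> int:
    """Score one Likert response on a 1..points scale.

    Negative items are mirror-coded (1 becomes 5, 2 becomes 4, ...)
    so that all item scores run in the same attitudinal direction.
    Illustrative helper, not taken from the paper.
    """
    if not 1 <= response <= points:
        raise ValueError("response out of range")
    return response if positive else points + 1 - response
```

For example, a response of 1 on a negative item is scored as 5, while the midpoint 3 is unchanged by reversal.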
Pilot test. A pilot test of the scale was conducted with a group of five students four weeks after the chemistry experiments started and they were asked if they had any difficulty understanding the items. After that the items were revised and the scale of 61 statements (31 positive and 30 negative) was prepared.
Process. The scale was tested on 115 undergraduate students of science education at Dokuz Eylül University who were enrolled in chemistry laboratory courses. To test whether the data gathered were suitable for exploratory factor analysis, the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Bartlett's test of sphericity were applied. Depending on the results of these tests, exploratory factor analysis was performed. In addition, the factor scores of the scale were characterized by computing the arithmetic mean, standard deviation, median and the lowest and highest values. The relationships between factor scores were computed with Pearson's correlation coefficient. Because the data gathered in the study were collected from the student sample only once, the Spearman-Brown formula and Cronbach's alpha were used to check the internal consistency and reliability of the scale (Tekin, 1991). The data were analysed with the SPSS program. The results of the factor analysis are given below.
Factor analysis stage. Choosing the method of extraction for a scale and deciding how many factors it will have are the most critical stages of the research (Steger, 2006). The structural validity of the scale is important because conclusions have to be drawn from unobservable variables. For this reason factor analysis is an important tool for measuring psychological structures and for revealing validity problems (Hayton et al., 2004; Henson and Roberts, 2006). Factor analysis is used to obtain a limited number of significant and definable variables from a number of variables which measure the same structure (Büyüköztürk, 2002b, p. 118). Relying on classical measurement theories, factor analysis is widely used to ensure test-score validity (Kahn, 2006). Factor analysis can be defined as the process of revealing new variables called common factors (factorizing), or as the process of obtaining functional definitions of variables using the factor load values of the items. There are many factorizing techniques, and they can be divided into two groups: classical factor extraction techniques and principal components analysis. Principal components analysis is the most frequently used factorizing technique (Büyüköztürk, 2002b, p. 118). It is a factor extraction technique in which it is assumed that the first factor has the maximum variance, and the variances of the other factors are compared with the first (Osborne and Costello, 2005). Principal components analysis analyses all the variation among the factors; it does not isolate common variance, but calculates a variance belonging to every component (Kahn, 2006).

In the present study, to test the structural validity of the scale, the exploratory factor analysis technique was used (Özdamar, 2004). To eliminate items which did not measure the same aims or attitudes, unrotated principal components analysis was applied first, followed by Varimax orthogonal rotation. In scale development studies, the number of factors is usually determined from the eigenvalue belonging to each factor: a factor is selected if its eigenvalue is greater than 1.00. This rule is known as the Kaiser criterion (Büyüköztürk, 2002b; Hayton et al., 2004; Osborne and Costello, 2005; Henson and Roberts, 2006; Steger, 2006) and is specific to principal components analyses (Kahn, 2006). If the loading of an item on a factor is smaller than 0.40, it is assumed that the item is not related to the other items. In real data, the loadings of items retained in a test lie between 0.40 and 1.00; items whose loadings are higher than 0.32 are desirable (Osborne and Costello, 2005). In this study, a factor loading of 0.40 or higher was required. Each attitude item was required to load highly on one factor and have lower loadings on the other factors, with a difference of at least 0.10 between its two highest loadings; items which did not meet these requirements were excluded from the scale (Büyüköztürk, 2002b).
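The item-retention rules just described (a minimum primary loading of 0.40, and a gap of at least 0.10 between an item's two highest loadings) can be expressed as a simple check. The sketch below illustrates the stated criteria; it is not the study's actual SPSS procedure, and the function name is an assumption:

```python
import numpy as np

def retain_item(loadings, primary_min=0.40, cross_gap=0.10):
    """Decide whether one item survives the two retention rules.

    `loadings` holds the item's loading on every factor. The item is
    kept only if its highest absolute loading is at least `primary_min`
    and exceeds the second-highest loading by at least `cross_gap`.
    """
    ranked = np.sort(np.abs(np.asarray(loadings, dtype=float)))[::-1]
    return bool(ranked[0] >= primary_min and ranked[0] - ranked[1] >= cross_gap)
```

For example, an item loading 0.72 on one factor and below 0.20 elsewhere is retained, while an item loading 0.55 and 0.50 on two factors is dropped as a cross-loader.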

Findings and results

While the attitude scale was being developed, the Kaiser-Meyer-Olkin (KMO) coefficient was used to test the sampling adequacy of the application. In this study, the KMO measure of sampling adequacy was found to be 0.904 (Dziuban and Shirkey, 1974; Büyüköztürk, 2002a; Afacan and Aydoğdu, 2006). This value showed that partial correlations among variables were low and the data were suitable for exploratory factor analysis. Similarly, Bartlett's test of sphericity was used to test whether the correlation matrix obtained from the variables in the scale was an identity matrix [χ² = 1887.026, p < 0.001]. This result indicated that the data obtained from the items in the scale were suitable for factor analysis and that the variables could be reduced by grouping.
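Both statistics can be reproduced outside SPSS. The sketch below (using NumPy/SciPy rather than the study's software) implements Bartlett's test of sphericity and the KMO measure from their standard formulas:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Bartlett's test that the item correlation matrix is an identity
    matrix. data: (respondents, items). Returns (chi-square, p-value);
    a significant result supports factorability."""
    n, p = data.shape
    r = np.corrcoef(data, rowvar=False)
    statistic = -((n - 1) - (2 * p + 5) / 6.0) * np.log(np.linalg.det(r))
    dof = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, dof)

def kmo(data):
    """Kaiser-Meyer-Olkin sampling adequacy: sum of squared correlations
    divided by that sum plus the sum of squared partial correlations."""
    r = np.corrcoef(data, rowvar=False)
    inv_r = np.linalg.inv(r)
    scale = np.sqrt(np.outer(np.diag(inv_r), np.diag(inv_r)))
    partial = -inv_r / scale          # partial correlations from inverse R
    np.fill_diagonal(r, 0.0)
    np.fill_diagonal(partial, 0.0)
    r2, p2 = (r ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + p2)
```

On data with one strong common factor, the KMO value is close to 1 and Bartlett's test is highly significant, matching the pattern reported above.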

Factor analysis was applied to the scale in the light of the KMO and Bartlett tests. The factor loadings and the explained variance rates from the factor analysis are given in Table 1, together with Cronbach alpha coefficients and descriptive statistical values. In the factor analysis, the Varimax rotation technique was applied first and items with a factor loading under 0.40 were omitted. After the first analysis the items of the scale fell into 12 sub-factors; however, as the last eight factors contained few items and their factor loadings were very close to each other, the analysis was forced to a four-factor solution (Büyüköztürk, 2002b; Hayton et al., 2004; Osborne and Costello, 2005; Henson and Roberts, 2006; Steger, 2006).

Table 1 Descriptive statistical values, factor loads, explained variance rates and Cronbach alpha reliability scores for the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory
Factors Items Mean Sd rxy(a) Factor loads
(a) rxy: item-total correlations. (b) Includes negative attitude expression items; scores of these items were converted from 1–5 to 5–1.
Interest in lessons and learning (F1) T14-Video Camera Practice in Chemistry Laboratory (VCPCL) should be resumed in chemistry laboratory 3.43 1.39 0.79 0.61
T15-VCPCL should be extended to all laboratory courses 3.19 1.47 0.72 0.68
T18-VCPCL is adequate for chemistry laboratory courses 3.52 1.08 0.50 0.65
T24-VCPCL makes chemistry laboratory courses more amusing 3.55 1.37 0.74 0.65
T36-VCPCL helps me to consolidate chemistry concepts 3.63 1.13 0.58 0.54
T38-VCPCL helps me to evaluate myself during the process 3.69 1.10 0.74 0.73
T48-VCPCL contributes to my individual learning 3.71 1.10 0.70 0.63
T52-VCPCL makes me enjoy chemistry laboratory courses 3.57 1.31 0.75 0.79
T54-VCPCL develops my scientific process skills 3.70 1.08 0.69 0.64
Experimental skill and self-assessment (F2) T4-With VCPCL I don't have to study experiments 3.77 1.16 0.62 0.58
T6-VCPCL encourages me to prepare for experiments 3.92 1.10 0.78 0.75
T8-I can watch experiments performed with VCPCL 4.03 1.20 0.70 0.77
T10-I can see errors with VCPCL 4.03 1.12 0.73 0.75
T16-VCPCL prevents repeat of experiment errors 3.82 1.10 0.65 0.67
T28-VCPCL encourages me to do preliminary research 3.74 1.08 0.68 0.61
T44-VCPCL allows me to identify my friends 3.54 1.25 0.49 0.44
bT47-VCPCL dulls my communication skills 4.03 1.00 0.63 0.58
T60-VCPCL teaches me teamwork 3.89 1.13 0.68 0.64
Behaviour and cooperative learning (F3) bT3-I can't learn chemistry concepts with VCPCL 3.64 1.23 0.49 0.56
bT37-VCPCL doesn't contribute to the cooperative learning process 3.70 1.22 0.50 0.64
bT41-VCPCL reduces my group mates' harmony 3.90 1.09 0.66 0.72
bT43-VCPCL reduces my participation in experiments 4.00 1.04 0.71 0.79
bT49-VCPCL weakens my experimentation skills 4.07 1.06 0.59 0.50
Interest in practising (F4) bT11-VCPCL extends chemistry laboratory course duration 2.30 1.30 0.47 0.57
bT21-VCPCL prevents me from recognizing the technological tools 4.17 0.99 0.38 0.47
bT23-VCPCL reduces my attention to chemistry experiments 3.72 1.17 0.69 0.60
bT45-VCPCL reduces course discipline 3.80 1.22 0.51 0.53
bT59-VCPCL restricts my behaviour on the course 3.21 1.35 0.51 0.75


In accordance with the Kaiser criterion, factors whose eigenvalues were over 1.00 were taken into consideration (Büyüköztürk, 2002b; Hayton et al., 2004; Osborne and Costello, 2005; Henson and Roberts, 2006; Steger, 2006). The eigenvalues were 11.38 for the first factor, 1.99 for the second, 1.82 for the third and 1.26 for the fourth. Because the eigenvalues of these factors were over one, the scale suggested a four-factor structure. The factors were named Factor 1 (Interest in Lessons and Learning), Factor 2 (Experimental Skill and Self-Assessment), Factor 3 (Behaviour and Cooperative Learning) and Factor 4 (Interest in Practising). According to the results of the analysis, the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory accounts for 58.73% of the total variance.
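The Kaiser criterion and the explained-variance figure both follow from the eigenvalues of the item correlation matrix. The sketch below illustrates the rule on generated data; it is an assumption-labelled illustration, not a reproduction of the study's SPSS run:

```python
import numpy as np

def kaiser_factors(data):
    """Return (sorted eigenvalues of the item correlation matrix,
    number of factors retained by the Kaiser criterion (eigenvalue > 1),
    proportion of total variance those retained factors explain)."""
    r = np.corrcoef(data, rowvar=False)
    eigenvalues = np.sort(np.linalg.eigvalsh(r))[::-1]
    retained = eigenvalues[eigenvalues > 1.0]
    return eigenvalues, len(retained), retained.sum() / len(eigenvalues)
```

With two independent underlying traits, exactly two eigenvalues exceed 1.00 and the retained factors account for most of the total variance, mirroring the logic applied to the four factors above.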

Considering the values in Table 1, the factor loadings range between 0.54 and 0.79 for F1, between 0.44 and 0.77 for F2, between 0.50 and 0.79 for F3, and between 0.47 and 0.75 for F4. Items whose factor loadings were over 0.30 were accepted as strong items for the factor structure of the scale (Büyüköztürk, 2002a; Johnson and McClure, 2002; Osborne and Costello, 2005; Karaca, 2006); for the composition of the factor pattern, however, factor loadings between 0.30 and 0.40 were accepted as the minimum limit (Tavşancil, 2002, p. 8). The fact that almost all of the items' factor loadings exceed 0.45 indicates that these items provide a strong factor structure.

Items with low item-scale correlations weaken the factor structure: when item-scale correlations are low, the error rate in the covariance of the factors is high and the factor is weakened. Item variances differ from item to item, and these differences lessen the correlations between items. Gorsuch (1997) notes that item difficulty also shapes factor structure: easy and difficult items tend to form separate factors, with easy items showing positive skewness and difficult items showing higher intercorrelations. As a result of the factor analysis, the item-total scale correlations of the remaining items ranged from 0.40 at the lowest to 0.82 at the highest. The correlation values of the items in each factor range between 0.50 and 0.79 for F1; between 0.49 and 0.62 for F2; between 0.49 and 0.71 for F3; and between 0.38 and 0.69 for F4. According to the item-scale correlation results, the items in the factors of the scale can be said to be strong items.

When the reliability of the variables is low, the variance rates are also low. This is undesirable, as a variable with low variance does not make a valid scale. Researchers should consider the validity of the measured variances when choosing items in exploratory factor analysis (Fabrigar et al., 1999). The most similar items in a scale are good items for the factor: they are reliable, and they measure only one structure (Gorsuch, 1997). The reliability of a scale is important for generalizing changes in the training process correctly, for further research and for making comparisons. Accurate decisions in science are only possible with reliable standardized scales. The reliability of a scale used in social sciences such as education can be checked by the test-retest method, the parallel test method or by examining its internal consistency reliability coefficients. If test-retest or parallel test methods are not used, Spearman-Brown and Cronbach alpha reliability analyses can be performed (Tekin, 1991; Tavşancil, 2002). The Spearman-Brown reliability score of the scale was found to be 0.96 and the Cronbach alpha reliability coefficient was 0.95. Cronbach's alpha was 0.91 for F1, 0.90 for F2, 0.81 for F3 and 0.75 for F4. Internal consistency reliability coefficients over 0.70 indicate that a scale is reliable (Tavşancil, 2002). The difference among the factors' reliability coefficients can be attributed to the number of items in each factor. Özçelik (1992) stated that when a factor with low internal consistency reliability contains a number of items similar to that of the other factors, this sub-factor should be expected to show a higher internal consistency reliability coefficient.
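Both reliability coefficients have simple closed forms. The sketch below (illustrative, not the study's actual computation) implements Cronbach's alpha and an odd-even split-half Spearman-Brown estimate:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (respondents, k) score matrix. Classical alpha:
    k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def spearman_brown(items):
    """Split-half reliability: correlate odd-item and even-item half
    scores, then apply the Spearman-Brown correction 2r / (1 + r)."""
    odd = items[:, 0::2].sum(axis=1)
    even = items[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)
```

On items driven by a single common trait, both coefficients come out close to 1, as with the 0.95 and 0.96 values reported for the full scale.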

When the sub-factors of the scale are examined, it can be seen that the factors with high internal consistency reliability contain larger, and similar, numbers of items. High internal consistency reliability shows that the items measure a single structure and that the factors are related to each other. Table 2 shows the descriptive statistical values of the factors of the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory.

Table 2 Factors' descriptive statistical values for Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory
Sub factors Item number Mean Range Sd Lowest score Highest score
F1 9 31.99 (3.56) 36.00 8.44 (0.94) 9.00 45.00
F2 9 34.77 (3.86) 33.00 7.51 (0.83) 12.00 45.00
F3 6 19.30 (3.22) 20.00 4.18 (0.70) 5.00 25.00
F4 5 17.19 (3.44) 19.00 4.27 (0.86) 6.00 25.00
Total 29 103.25 (3.52) 102.00 20.71 (0.87) 38.00 140.00


In Table 2, values in parentheses were computed by dividing each factor's score by the number of items in that factor. According to Fabrigar et al. (1999), each factor should contain between three and five items at minimum. Factors with one to three items are weak factors; factors with five or more items are desirable (Osborne and Costello, 2005). The numbers of items in the factors of the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory are 9 for F1, 9 for F2, 6 for F3 and 5 for F4. Thus, it can be said that the factors in the scale are strong factors.

A one-way repeated measures ANOVA was conducted to compare the mean scores of Factor 1 (Interest in Lessons and Learning), Factor 2 (Experimental Skill and Self-Assessment), Factor 3 (Behaviour and Cooperative Learning) and Factor 4 (Interest in Practising) in the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory. The 1 × 2 (Group × Factor) repeated measures ANOVA tested for attitude differences between factors. There was a significant effect for factors: Wilks's Lambda = 0.563, F(1, 114) = 28.947, p < 0.001, η²p = 0.437. The ANOVA indicated that Factor 2 (Experimental Skill and Self-Assessment) had a higher mean score than the other factors. Effect sizes were measured through partial eta-squared (η²p), with benchmarks of about 0.01 for a small effect, about 0.06 for a medium effect and 0.14 or above for a large effect (Stevens, 1992). In this context, the proportion of population variance accounted for by the Group × Factor interaction qualifies as large in this study.
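The reported effect size is consistent with Wilks's Lambda: for an effect with a single degree of freedom, η²p = 1 − Λ = 1 − 0.563 = 0.437. A minimal check of this relation and of the Stevens (1992) benchmarks (the function names are illustrative assumptions):

```python
def eta_squared_from_wilks(wilks_lambda: float) -> float:
    """Partial eta-squared from Wilks's Lambda for an effect with a
    single effect degree of freedom: eta_p^2 = 1 - Lambda."""
    return 1.0 - wilks_lambda

def effect_size_label(eta_p2: float) -> str:
    """Conventional partial eta-squared benchmarks (after Stevens, 1992)."""
    if eta_p2 >= 0.14:
        return "large"
    if eta_p2 >= 0.06:
        return "medium"
    if eta_p2 >= 0.01:
        return "small"
    return "negligible"
```

Applying these to the reported Λ = 0.563 gives η²p = 0.437, which falls in the "large" band.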

Discussion and conclusion

The aim of this study was to develop an Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory and to use it to evaluate the attitudes of students of the Department of Science Education. The structure of the present scale was formed by examining attitude scales towards chemistry, the laboratory, science, technology and the use of visual media tools in the related literature. The validity and reliability of the scale were established through factor and reliability analyses. Chemistry, laboratory and science courses have recently been integrated with technology; some countries have converted their science courses into science and technology courses in their education programmes and added technological content. As a term, technology has various definitions; in this study, it covers the use of visual media tools in a chemistry laboratory. Given the widespread interest in visual media tools, there is a need to examine the affective skills related to the use of these tools in laboratories. In response, an attitude scale on the use of visual media tools in the chemistry laboratory was developed to identify effects on students' affective skills. The scale explained 58.73% of the total variance and its Cronbach alpha reliability coefficient was 0.95. The items in the scale were grouped in four factors: Interest in Lessons and Learning (F1), Experimental Skill and Self-Assessment (F2), Behaviour and Cooperative Learning (F3) and Interest in Practising (F4).

An examination of the scales related to chemistry knowledge, chemistry courses or laboratory courses shows that most are of Likert or semantic differential rating type, have been developed for chemistry courses, chemistry laboratories or chemistry combined with other disciplines, and have between four and seven factors in their structures. ‘Interest in chemistry and its applications’ and ‘learning chemistry terms’ are the factors seen most often (Hofstein et al., 2000; Coll et al., 2002; Miller et al., 2004; Nakhleh et al., 2004; Bauer, 2008; Franciszkowicz, 2008; Cheung, 2009). The scale developed from the data of this study likewise has a four-factor structure. Unlike the scale used in this study, however, ‘self-evaluation’ and ‘cooperative learning’ factors have not appeared in the other scales developed in connection with chemistry. Adesoji and Raimi (2004) used a chemistry practice skills and attitude scale in their study, but their instrument uses rating points ranging from four to six and is not of Likert type. Furthermore, although they attempted to measure scientific skills such as generalization, observation, measuring, recording and defining, its factor structure is not explained properly. In the present study, attitudes to the evaluation of the skills indicated by Adesoji and Raimi (2004) were analysed under different factors. This scale differs from Cheung's (2009) scale, which is of seven-point Likert type, and it does not include Cheung's factor ‘evaluating beliefs about chemistry at school’. Moreover, the fact that the scale Cheung (2009) developed contained only 12 items, with three items per factor, makes its structural validity controversial: the literature suggests that a factor's validity is reduced when its number of items is small. According to Fabrigar et al. (1999), scales must have at least three to five items in each factor; factors with one to three items are weak, while a factor with five or more items is satisfactory (Osborne and Costello, 2005).
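The items-per-factor rule of thumb cited above can be stated as a simple check. The sketch below is illustrative only, using the thresholds exactly as reported from Osborne and Costello (2005); the factor labels are hypothetical and do not restate the per-factor item counts of the present scale.

```python
def rate_factor(n_items: int) -> str:
    """Classify a factor by its item count, per the rule of thumb cited above."""
    if n_items <= 3:
        return "weak"          # one to three items: weakly defined factor
    if n_items >= 5:
        return "satisfactory"  # five or more items: satisfactory
    return "borderline"        # four items: between the two thresholds

# Hypothetical factors for illustration; a three-item factor, as in
# Cheung's (2009) 12-item scale, is flagged as weak under this rule.
for label, n in {"three-item factor": 3, "five-item factor": 5}.items():
    print(label, "->", rate_factor(n))
```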

In the chemistry attitude scale (a semantic differential scale) developed by Bauer (2008), anxiety and safety factors were added to the love, enjoyment and interest factors. Similarly, in the survey of chemistry attitudes and experiences by Coll et al. (2002), chemistry benefits, career orientation in chemistry and experience of learning environments were added to attitudes towards chemistry. Coll et al. (2002) defined learning in terms of learning through demonstration, learning in the laboratory, learning through conferences and educational learning and, unlike other chemistry scales, included a learning factor. The present study supports Coll et al.'s (2002) learning-environment dimension through its Interest in Lessons and Learning and Behaviour and Cooperative Learning factors. This study examines the laboratory learning environment and learning experiences on the internet, but does not evaluate anxiety and safety factors; this missing dimension should be addressed in future studies. Chatterjee et al. (2009) examined students' attitudes and perceptions towards open-inquiry and guided-inquiry laboratories. They used a 19-item semantic differential instrument with three parts: attitudes towards the guided-inquiry laboratory were measured with seven items, the open-inquiry laboratory with seven items and both laboratory applications with five items. The present study, however, used only the guided-inquiry laboratory technique, so the items in the attitude scale are grouped under factor structures that reflect that technique. When developing a similar scale in further studies, a variety of laboratory techniques could be used to minimize the confounding effect of the laboratory technique.

Chemistry and chemistry courses are strongly interrelated with technology: most measurements are performed with technological instruments such as digital thermometers and digital pH meters. Video-assisted instructional programmes were used in the traditional education system, and their effects on students' behaviours have been examined in various studies (DeLoache and Korac, 2003; Maniar et al., 2008; Llorens-Molina, 2008; Ullrich et al., 2010). Recently, sharing videos of chemical experiments made by teachers and students through social media tools has given a different meaning to chemistry classes. For this reason, the use of technology in chemistry courses makes it necessary to question students' behaviour. In this context, Miller et al. (2004) developed an attitude scale to measure the comprehension of chemistry concepts with instrumental devices and examined students' attitudes. They examined the ease of use of measuring devices in the chemistry laboratory, the effect of measuring devices on conceptual comprehension and experimental errors, and the experimental conditions related to group dynamics and the chemistry laboratory. Similarly, Franciszkowicz (2008) used a video-assisted training programme in chemistry courses. Using video images captured in chemistry lessons, he examined students' problem-solving skills and conceptual understanding, developed an attitude scale to explain the benefits of video-assisted training, and tracked the number of hits on the internet for the chemistry-related videos. Through this method, students' skills in problem-solving and conceptual comprehension improved. The attitude scale used in that study was of five-point Likert type and explored students' conceptual comprehension and problem-solving skills in video-assisted training; another part of the scale examined students' self-directed learning behaviours during the video-assisted training.
In the same way, Jyun and Hong (2010) examined students' perceptions of chemistry lessons delivered through video clips on the internet, using a five-point Likert-type questionnaire developed by the researchers. The questionnaire consisted of items evaluating chemistry course preferences for the use of video tools and video clip features. Miller et al.'s (2004) work only partly fulfils the purpose of the present study, because their scale only questions the effect of measuring devices on students' attitudes and learning level, whereas the present study questioned the effect of video cameras on students' attitudes and learning practices in the chemistry laboratory. Likewise, Franciszkowicz's (2008) research partly meets the purpose of the present study, but Franciszkowicz (2008) also examined students' problem-solving skills during video-assisted training; in further studies, attitude items on problem-solving could be added or different inventories used. The scale used by Jyun and Hong (2010) is consistent with the factors of the application process in this study but excludes the factors of other scales in which attitudes towards learning, such as love and enjoyment, are examined.

Hofstein et al. (2000) examined students' perceptions of industrial chemistry lessons. The scale that Hofstein (1986) developed for learning chemistry in the classroom environment was updated for industrial chemistry lessons. The questionnaire was of four-point Likert type, consisted of eight factors and explained 53.3% of the variance. The items in the scale were grouped as ‘interest in and desire for chemistry’, ‘interest in and enjoyment of chemistry’, ‘correlating the term chemistry with daily life’, ‘participation in chemistry discussions in a classroom atmosphere’, ‘educational techniques in chemistry courses’, ‘the social effect of chemistry’, ‘the difficulty of chemistry’ and ‘chemistry and laboratory-industry knowledge’. The structure of the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory clearly matches the factors in the study of Hofstein et al. (2000) except for ‘correlating the term chemistry with daily life’, ‘the social effect of chemistry’ and ‘chemistry and laboratory-industry knowledge’; the scale used by Hofstein et al. (2000) sought to question the effect of chemistry on everyday life.

If the scale developed in this study is compared with those developed for science, some similarities emerge. The scales developed for science mostly involve primary-level students. As in the scales related to chemistry and in the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory, attitude items and factors measuring affective features such as love, trust and interest are used in the scales developed for science (Kind et al., 2007; Gokhale et al., 2009; Wang and Berlin, 2010). It has been noted that the scales developed for science tend to question affective features but not science-oriented behaviours (Walczak, 2009). Kind et al. (2007), in addition to cognitive and behavioural features, also examined affective ones. In the present study, apart from attitude items on affective aspects, the Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory also contains items addressing ‘interest in learning’ and ‘science-oriented skills’.

As a result, an Attitude Scale for Visual Media Tool Use in the Chemistry Laboratory with 29 items and four factors was developed: Interest in Lessons and Learning (F1), Experimental Skill and Self-Assessment (F2), Behaviour and Cooperative Learning (F3) and Interest in Practising (F4). The scale measures students' affective, cognitive and behavioural characteristics. It includes affective features such as love, interest and enjoyment, as in the other scales used in the chemistry laboratory, as well as behavioural items questioning the effect of visual media tools on learning in the chemistry laboratory. It also includes attitude items on experimental skill, collaborative learning and self-evaluation, which have not been considered in the factor structures of the scales used in chemistry and science. While the scale was being developed, experiments and video recordings were performed only with guided experiment sheets in the laboratory, so the validity and reliability scores of its items may differ under other laboratory teaching techniques. Since problem-solving, task-based learning and argumentation methods are also used in laboratory applications, the scale developed in this study could be updated with attitude items measuring problem-solving or questioning skills.

References

  1. Adesoji F. A. and Raimi S. M., (2004), Effects of enhanced laboratory instructional technique on senior secondary students' attitude toward chemistry in Oyo Township, Oyo State, Nigeria, Journal of Science Education and Technology, 13(3), 377–385.
  2. Afacan Ö. and Aydogdu M., (2006), The Science Technology Society (STS) Course Attitude Scale, International Journal of Environmental and Science Education, 1(2), 189–201. Retrieved from http://www.ijese.com.
  3. Aksoy G., Doymuş K., Karaçöp A., Şimşek U. and Koç Y., (2008), The effect of cooperative learning method in teaching general chemistry laboratory courses on the academic success of the science department undergraduate and the opinions of students related to teaching with the cooperative learning method (İşbirlikli öğrenme yönteminin genel kimya laboratuar dersinin akademik başarısına etkisi ve öğrencilerin bu yöntem hakkındaki görüşleri), Journal of Kazım Karabekir Education Faculty, (17), 212–227. Retrieved from http://e-ergi.atauni.edu.tr/index.php/kkefd.
  4. Azizoglu N. and Uzuntiryaki E., (2006), Chemistry Laboratory Anxiety Scale. Hacettepe University Journal of Education, (30), 55–62. Retrieved from http://www.efdergi.hacettepe.edu.tr.
  5. Bauer C. F., (2008), Attitude towards Chemistry: A Semantic Differential Instrument for Assessing Curriculum Impacts, Chemical Education Research, 85(10), 1440–1445.
  6. Benedict L. and Pence H. E., (2012), Teaching Chemistry Using Student-Created Videos and Photo Blogs Accessed with Smartphones and Two-Dimensional Barcodes, Journal of Chemical Education, 89(1), 492–496.
  7. Büyüköztürk Ş., (2002a), Factor Analysis: Basic Concepts and Using To Development Scale, Educational Administration-Theory and Practice, 32(2): 470–483. Retrieved from http://www.pegemdergi.net/index.php/KU.
  8. Büyüköztürk Ş., (2002b), Handbook to Data Analysis for Social Sciences (Turkish Title: Sosyal Bilimler İçin Veri Analizi Elkitabi), Ankara: Pegem A Publishing (Turkish Version: Pegem A Yayincilik).
  9. Çelik B., (2010), Use of Videotapes in Piano Education, Journal of Gazi Educational Faculty, 30(3), 785–800. Retrieved from http://www.gefad.gazi.edu.tr/.
  10. Cennamo K. S., (1993), Learning from Video: Factors Influencing Learners' Preconceptions and Invested Mental Effort, Educational Technology, Research and Development, 41(3), 33–45, DOI: 10.1007/BF02297356.
  11. Chatterjee S., Williamson V. M., McCann K. and Peck M. L., (2009), Surveying Students' Attitudes and Perceptions toward Guided-Inquiry and Open-Inquiry Laboratories, Chemical Education Research, 86(12), 1427–1432.
  12. Chang A. Y., Ghose S., Littman-Quinn R., Anolik R. B., Kyer A., Mazhani L., Seymour A. K. and Kovarik C. L., (2012), Use of Mobile Learning by Resident Physicians in Botswana, Telemedicine and e-Health, 18(1), 11–13.
  13. Chen H. J., Chiu M. H., Tsai Y. M., Chen J. T., Chang H. T., She J. L., Chou C. C. and Liu J. C., (2005), The Use of Performance-based Assessment in an Integrated Chemistry Laboratory Program, (Report No. NSC 91-2511-S-002-004 and NSC 92-2511-S-002-019), Taiwan: The National Science Council.
  14. Cheung D., (2009), Developing a Scale to Measure Students' Attitudes toward Chemistry Lessons, International Journal of Science Education, 31(16), 2185–2203.
  15. Coll R. K., Dalgety J. and Salter D., (2002), The Development of The Chemistry Attitudes and Experiences Questionnaire (CAEQ), Chemistry Education: Research and Practice, 3(1), 19–32.
  16. De Jong O., (2000), Crossing the borders: Chemical education research and teaching practice, Royal Society Chemistry Proceedings, 4(1), 31–34.
  17. DeLoache J. S. and Korac N., (2003), Video-based learning by very young children, Developmental Science, 6(3), 245–246. DOI: 10.1111/1467-7687.00279.
  18. Dziuban C. D. and Shirkey E. C., (1974), On the Psychometric Assessment of Correlation Matrices, American Educational Research Journal, 11(2), 211–216. Retrieved from http://www.jstor.org/pss/1161796.
  19. Erim A. and Yöndem S., (2009), The Effect of Model Aided Teaching on Guitar Performance, Dokuz Eylül Üniversitesi Buca Eğitim Fakültesi Dergisi (BEFjournal), 26, 45–55. Retrieved from www.befjournal.com.tr/index.php/dergi/article/download/252/218.
  20. Fabrigar L. R., Wegener D. T., MacCallum R. C. and Strahan E. J., (1999), Evaluating the Use of Exploratory Factor Analysis in Psychological Research, Psychological Methods, 4(3), 272–299. Retrieved from http://psycnet.apa.org/journals/met/4/3/.
  21. Franciszkowicz M., (2008), Video-Based Additional Instruction, Journal of the Research Center for Educational Technology (RCET), 4(2), 5–14.
  22. Gokhale A., Brauchle P. and Machina K., (2009), Development and Validation of a Scale to Measure Attitudes toward Science and Technology, Journal of College Science Teaching, May, 66–75.
  23. Gorsuch R. L., (1997), Exploratory Factor Analysis: Its Role in Items Analysis, Journal of Personality Assessment, 68(3), 532–560. DOI: 10.1207/s15327752jpa6803_5.
  24. Hayton J. C., Allen G. and Scarpello V., (2004), Factor Retention Decisions in Exploratory Factor Analysis: a Tutorial on Parallel Analysis, Organizational Research Methods, 7, 191. DOI: 10.1177/1094428104263675.
  25. Henson R. K. and Roberts J. K., (2006), Use of Exploratory Factor Analysis in Published Research: Common Errors and Some Comment on Improved Practice, Educational and Psychological Measurement, (66), 33. DOI: 10.1177/0013164405282485.
  26. Hodson D., (1993), Re-thinking Old Ways: Towards a More Critical Approach to Practical Work in Science, Studies in Science Education, 22, 85–142. DOI: 10.1080/03057269308560022.
  27. Hofstein A., (2004), The Laboratory in Chemistry Education: Thirty Years of Experience with Developments, Implementation and Research, Chemistry Education: Research and Practice, 5(3), 247–264. Retrieved from http://www.uoi.gr/cerp/2004_October/pdf/06HofsteinInvited.pdf.
  28. Hofstein A., Kesner M. and Ben-Zvi R., (2000), Student perceptions of industrial chemistry classroom learning environments, Learning Environments Research, 2, 291–306.
  29. Hofstein A. and Lunetta V. N., (1982), The Role of the Laboratory in Science Teaching: Neglected Aspects of Research, Review of Educational Research, 52(2), 201–217. Retrieved from http://www.jstor.org/pss/1170311.
  30. Johnson B. and McClure R., (2002), Validity and Reliability of a Shortened, Revised Version of the Constructivist Learning Environment Survey (CLES), Learning Environments Research, 7, 65. DOI: 10.1023/B:LERI.0000022279.89075.9f.
  31. Johnstone A. H., (2000), Teaching of Chemistry – Logical or Psychological? Chemistry Education: Research and Practice in Europe, 1(1), 9–15.
  32. Jyun H. and Hong H., (2010), Students' Perceptions on Chemistry I Class Using YouTube Video Clips, Journal of the Korean Chemical Society, 54(4), 465–470. DOI: 10.5012/jkcs.2010.54.4.465.
  33. Kahn J. H., (2006), Factor Analysis in Counseling Psychology Research, Training, and Practice: Principles, Advances, and Applications, The Counseling Psychologist, 34, 684. DOI: 10.1177/0011000006286347.
  34. Karaca E., (2006), Developing Attitude Scale towards Planning and Evaluation in Teaching Courses (Turkish Title: Öğretimde Planlama ve Değerlendirme Dersine Yönelik Bir Tutum Ölçeği Geliştirme). Dumlupınar University, Journal of Social Sciences (Turkish Version: Dumlupınar Üniversitesi Sosyal Bilimler Dergisi), 16, 213–230. Retrieved from: http://sbe.dpu.edu.tr/16/213-230.pdf.
  35. Kind P., Jones K. and Barmby P., (2007), Developing Attitudes towards Science Measures, International Journal of Science Education, 29(7), 871–893. DOI: 10.1080/09500690600909091.
  36. Llorens-Molina J. A., (2008), Design and Assessment of an Online Prelab Model in General Chemistry: A Case Study, Journal of the Research Center for Educational Technology (RCET), 4(2), 15–31.
  37. Maniar N., Bennett E., Hand S. and Allan G., (2008), The Effect of Mobile Phone Screen Size on Video based Learning, Journal of Software, 3(4), 51–61. DOI: 10.4304/jsw.3.4.51-61.
  38. Miller L. S., Nakhleh M. B., Nash J. J. and Meyer J. A., (2004), Students' Attitudes Toward and Conceptual Understanding of Chemical Instrumentation, Chemical Education Research, 81(12), 1801–1808.
  39. Morgil I., Seyhan H. G. and Seçken N., (2009), Investigating the Effects of Project-Oriented Chemistry Experiments on Some Affective and Cognitive Field Components, Journal of Turkish Science Education, 6(1), 89–107. Retrieved from http://www.tused.org/internet/tused/archive/v6/i1/text/tusedv6i1a8.pdf.
  40. Morozov M., Tanakov A., Gerasimov A., Bystrov D. and Cvirco E., (2004), Virtual Chemistry Laboratory for School Education, Chee-Kit Looi K., Sutinen E., Sampson D. G., Aedo I., Uden L., Kähkönen E. (ed.), in Proceedings of the IEEE International Conference on Advanced Learning Technologies, (pp. 605–608), Joensuu: Finland IEEE Computer Society, DOI: 10.1109/ICALT.2004.1357486.
  41. Nakhleh M. B., (1994), Chemical Education Research in the Laboratory Environment, Journal of Chemical Education, 71(3), 201–205.
  42. Nortcliffe A. and Middleton A., (2012), Smartphone Feedback: Using an iPhone to İmprove the Distribution of Audio Feedback, International Journal of Electrical Engineering Education, 48(3), 280–293.
  43. Osborne J. and Costello A. B., (2005), Best Practices in Exploratory Factor Analysis: Four Recommendations for Getting the Most From Your Analysis, Practical Assessment, Research & Evaluation, 10, 7. Retrieved from: http://pareonline.net/pdf/v10n7.pdf.
  44. Özçelik D. A., (1992), Measurement and Assessment (Turkish Title: Ölçme ve Değerlendirme), Ankara: ÖSYM Publishing (Turkish Version: ÖSYM Yayinlari).
  45. Özdamar K., (2004), Analysis of Statistical Data through PC Programs (5th Press) (Turkish Version: Paket Programlar ile İstatistiksel Veri Analizi (5.baski)), Eskişehir: Kaan Publishing (Turkish Version: Kaan Kitabevi).
  46. Pekdağ B., (2010), Alternative Methods in Learning Chemistry: Learning with Animation, Simulation, Video and Multimedia, Journal of Turkish Science Education, 7(2), 79–109. Retrieved from: http://www.tused.org/internet/tused/archive/v7/i2/text/tusedv7i2a5.pdf.
  47. Petko D. and Reusser K., (2003), Collaborative Video based Teacher Training in a Virtual Learning Environment, Paper presented at 10th European Conference for Research on Learning and Instruction (EARLI), Padova/Italy.
  48. Sharples M., Arnedillo-Sanchez I., Milrad M. and Vavoula G., (2009), Mobile Learning, Small Devices, Big Issues, Technology-Enhanced Learning, 14, DOI: 10.1007/978-1-4020-9827-7.
  49. Steger M. F., (2006), Illustration of Issues in Factor Extraction and Identification of Dimensionality in Psychological Assessment Data, Journal of Personality Assessment, 86(3), 263–272. DOI: 10.1207/s15327752jpa8603_03.
  50. Stevens J., (1992), Applied Multivariate Statistics for the Social Sciences, Hillsdale, NJ: Erlbaum.
  51. Tavşancil E., (2002), Scaling Attitudes and Analyzing Data through SPSS (Turkish Title: Tutumların Ölçülmesi ve SPSS ile Veri Analizi), Ankara: Nobel Publishing (Turkish Version: Nobel Yayın Dağitim).
  52. Tekin H., (1991), Evaluation and Assessment in Education (Turkish Title: Eğitimde Ölçme ve Değerlendirme), Ankara: Yargi Publishing (Turkish Version: Yargi Yayinlari).
  53. Tetko I., Gasteiger J., Todeschini R., Mauri A., Livingstone D., Erl P., Palyuling V. A., Radchenko E. V., Zefirov N. S., Makarenko A. S., Tanchuk V. Y. and Prokopenko V. V., (2005), Virtual Computational Chemistry Laboratory – Design and Description, Journal of Computer-Aided Molecular Design, 19, 453–463. DOI: 10.1007/s10822-005-8694-y.
  54. Tezer M., (2008, May), View of the Instructors and Students on Computer based Video Programs as an Educational Material (Turkish Title: Bilgisayar Tabanli Video Programlarının Eğitim Materyali Olarak Kullanılmasına Yönelik Öğretim Elemanlari ve Öğrenci Görüşleri), Paper Presented at the 8th International Educational Technology Conference, Eskişehir, Turkey: Anadolu University, Fulltext retrieved from http://www.ietc2008.anadolu.edu.tr/online.php.
  55. Ullrich C., Shen R., Tong R. and Tan X., (2010), A Mobile Live Video Learning System for Large-Scale Learning—System Design and Evaluation, IEEE Transactions On Learning Technologies, 3(1), 6–17.
  56. Varol N., (1995), Preparing Video Film and Using Video as Learning Activity (Turkish Title: Öğrenme Etkinliği Olarak Video Kullanimi ve Video Film Hazirlanmasi), The Conference to Vocational & Technical Education (Turkish Version: Mesleki ve Teknik Eğitim Sempozyumu, pp. 396–403), Elaziğ: University of Firat Press.
  57. Velazquez-Marcano A., Williamson V. M., Ashkenazi G., Tasker R. and Williamson K. C., (2004), The Use of Video Demonstrations and Particulate Animation in General Chemistry, Journal of Science Education and Technology, 13(3), 315–323.
  58. Vural Ö. F. and Zellner R., (2010), Using Concept Mapping in Video-Based Learning, Gaziantep University Journal of Social Sciences, 9(3), 747–757. Retrieved from: http://sbe.gantep.edu.tr/∼sbd/index.php/sbd/article/view/318.
  59. Walczak M. M., (2009), Do Student Attitudes toward Science Change during a General Education Chemistry Course, Journal of Chemical Education, 86(8), 985–991.
  60. Wang T. L. and Berlin D., (2010), Construction and Validation of an Instrument to Measure Taiwanese Elementary Students' Attitudes toward Their Science Class, International Journal of Science Education, 32(18), 2413–2428. DOI: 10.1080/09500690903431561.
  61. Zhang D., Zhou L., Briggs R. O. and Nunamaker J. F., (2006), Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness, Information & Management, 43, 15–27. DOI: 10.1016/j.im.2005.01.004.

This journal is © The Royal Society of Chemistry 2012