Integration of open educational resources in undergraduate chemistry teaching – a mapping tool and lecturers' considerations

Yael Feldman-Maggor a, Amira Rom a and Inbal Tuvi-Arad *b
aDepartment of Education and Psychology, The Open University of Israel, Ra'anana, Israel
bDepartment of Natural Sciences, The Open University of Israel, Ra'anana, Israel. E-mail: inbaltu@openu.ac.il

Received 10th October 2015, Accepted 29th December 2015

First published on 29th December 2015


Abstract

This study examines chemistry lecturers' considerations for using open educational resources (OER) in their teaching. Recent technological developments provide innovative approaches for teaching chemistry and visualizing chemical phenomena. End users' improved ability to upload information online enables the integration of various pedagogical models and learning theories. These improvements strengthen the need for up-to-date evaluation tools for educational websites. Building on existing taxonomies, a set of new criteria for the evaluation of online learning materials was developed and used to analyze 100 websites directed towards teaching chemistry. In addition, a questionnaire was circulated among 100 chemistry lecturers from various higher education institutions in Israel, 66 of whom responded. Subsequently, interviews were conducted with 17 of the questionnaire respondents. Our findings demonstrate that most of the chemistry lecturers who were interviewed integrate innovative learning materials found online, such as simulations, videos and exercises, in their teaching, but do not use Web 2.0 tools that enable content sharing and collaborative learning. With respect to the selection of web-based learning materials, we found that the lecturers interviewed tended to select OER intuitively, mainly considering the reliability of the information, pedagogical issues and the visual contribution, while paying less attention to collaborative learning and content sharing.


Introduction

A web-based learning environment offers valuable benefits for chemistry education (Barak, 2007; Merchant et al., 2012; Scherer and Tiemann, 2012; She et al., 2012). These include three-dimensional and interactive molecular visualization tools, the ability to demonstrate scientific phenomena via interactive simulation, advanced methods to search and retrieve chemical information through specialized online databases, and online frameworks to communicate with both peers and experts. These benefits pose intriguing questions for chemistry education research, including the design of learning objects and simulations, best practices for integrating technology, evaluation of teaching and learning, curriculum development and more. However, only about 11% of the research in chemistry education in the past decade was devoted to educational technology (Teo et al., 2014). The development of Web 2.0 technology that allows users to contribute to the content of websites through blogs, wikis and other tools (Greenhow et al., 2009; Schroeder and Greenbowe, 2009), the ease of integrating streaming video in websites (Kay, 2012), and the breakthrough of MOOCs (Massive Open Online Courses) in chemistry (Leontyev and Baranov, 2013) in recent years have intensified the need for research in this field.

Evaluation of open educational resources

The variety of open educational resources (OER) offered to the general public is overwhelming (Zancanaro et al., 2015). One of the main advantages of these resources over printed materials lies in their accessibility and ease of publication. However, while printed textbooks are reviewed and validated by experts for their content, pedagogical value and writing style, some computerized learning materials are uploaded to the Internet without any review process. This situation highlights the need to develop criteria for evaluating online educational information (Maughan, 2001; Graham and Metaxas, 2003; Flanagin and Metzger, 2007; Brand-Gruwel and Stadtler, 2011). Generally, the selection of criteria for evaluation is a complicated and controversial task (Kahveci et al., 2008; Wikan and Molster, 2011). Furthermore, teachers are not necessarily sufficiently aware of the possibility of systematic evaluation of learning materials (Nevo, 1995). Nevertheless, the ability to properly evaluate OER is important for experts, faculty members, tutors, teachers, and communities of learners.

One way to evaluate educational resources found on the Internet is through predefined criterion lists or taxonomies. These are used to map the information in various websites in relation to its content, pedagogical value, visual design, usability, and utilization of the benefits of technology (Tuvi and Nachmias, 2001; Larreamendy-Joerns et al., 2005; Ardito et al., 2006; Evans and Leinhardt, 2008). According to Stake (2004), criteria are relative and assist us in understanding which characteristics should be emphasized during the evaluation process. While many studies focus on how students evaluate online resources (Barzilai and Zohar, 2012; Lucassen et al., 2013; Mandalios, 2013), far fewer examine how professors and tutors evaluate these materials (Tekerek and Ercan, 2012). This research builds on earlier studies of educational website evaluation (Mioduser et al., 2000; Storey et al., 2002; Walraven et al., 2009; Liu et al., 2011), and focuses on OER designed for the undergraduate chemistry curriculum from the point of view of faculty members (e.g., professors, lecturers and tutors). In order to evaluate OER, the pedagogical approach behind every website must also be examined, as described below.

Open educational resources and learning theories

Educational learning theories differ in their pedagogical approaches. Although these theories were developed long before the era of the Internet, they can be used to classify and evaluate educational websites. Three such theories were found to be the most relevant to this study: the behaviorist theory, the cognitive theory and the constructivist theory. According to the behaviorist theory, learning is characterized by a number of actions, such as graded progression combined with repetition, and exercises with feedback that present learners with their own performance (Winn and Snyder, 1996). With computer technology this theory can be applied through exercises accompanied by immediate feedback (Burton et al., 1996). A learning activity based on the behaviorist theory can incorporate several levels of difficulty as well as randomly generated examples for each exercise (Semple, 2000).

According to the cognitive theory of learning, the learning process has separate channels for processing visual and auditory information. Learners engage in active learning by attending to relevant incoming information, organizing selected information into coherent mental representations, and integrating these representations with other knowledge (Mayer, 2005). Furthermore, Mayer (2005) emphasizes that learning through the integration of pictures and text can assist in the transfer of knowledge, since it combines multiple modalities of information, such as the verbal and visual channels, thus facilitating the storage of information in memory and its retrieval at a later time. Such integration is natural in a web-based environment, as the Internet provides the technology to present any combination of simulations, pictures, videos and text in a single website. According to the cognitive learning theory, feedback accompanied by explanation makes it easier to transfer knowledge to new situations. Computer technology exemplifies the cognitive theory through the possibility of using such feedback in learning (Moreno and Valdez, 2005).

The constructivist approach adds to the cognitive approach by distinguishing between understanding concepts and applying them in practice. It therefore promotes learning by experience through problem solving. This type of learning is considered effective since it provides a means to learn in an authentic manner that is analogous to the real world (Lombardi, 2007). According to Jonassen (2000), computer technology serves as a “mind tool” for the learner, i.e., a thinking tool through which the learner gains experience in the field being taught, thus offering an appropriate environment in which to apply the cognitive theory of learning. According to the constructivist learning theory, control over the learning process shifts from the teacher to the student, while learning takes place in context and in collaboration. The Internet platform, especially learner-centered activities enabled by Web 2.0, provides options for creating, sharing, remixing and repurposing content by users (Ullrich et al., 2008), in accordance with the constructivist theory. However, effective use of these tools requires teachers with proper pedagogical and technological knowledge related to the specific content to be taught.

Technological, pedagogical and content knowledge, TPACK

The increasing prevalence of technology in education and the need to improve the integration of technology into teachers' instruction and practice have led to the development of technological, pedagogical and content knowledge (originally TPCK, now known as TPACK), a framework for understanding teacher knowledge of technology integration (Koehler and Mishra, 2009; Chai et al., 2013). The TPACK framework originated from the assumption that today's teachers need to know how to control, use and understand technology, and how to apply it effectively in order to improve their teaching (Mishra and Koehler, 2006). The framework is based on Shulman's (1986) construct of pedagogical content knowledge (PCK) and defines the type of knowledge teachers require in order to advance effective technological integration. Three main components of teacher knowledge are discussed in this framework: content, pedagogy and technology (Koehler and Mishra, 2009). Content knowledge refers to the subject matter to be taught; pedagogical knowledge refers to the practices, processes, strategies, procedures and methods of teaching; technological knowledge refers to familiarity with computers, the Internet, digital video, etc. (Archambault and Barnett, 2010). TPACK signifies the ability of teachers to integrate these three components of knowledge in order to improve their teaching outcomes. Generally, the TPACK framework is seen as a valuable addition to the educational community and currently plays an important role in the integration of technology in teaching (Kopcha et al., 2014). However, effective implementation of technology requires more than TPACK. In particular, several researchers have highlighted the importance of teachers' self-efficacy regarding their ability to integrate technology in the classroom (Blonder et al., 2013; Chai et al., 2013; Blonder and Rap, 2015). In addition, cultural and institutional factors, such as an emphasis on paper-and-pencil tests and inadequate physical or technical conditions, can prevent proper integration of technology in the classroom (Chai et al., 2013).

Research goals

The main goals of this study are to explore the characteristics of OER in chemistry, to understand the different considerations that guide chemistry lecturers when selecting OER for teaching purposes, and to examine how these lecturers use them in practice. The results are expected to guide web designers in constructing effective websites that will meet lecturers' pedagogical needs. Furthermore, this study aims to help chemistry educators exploit the pedagogical benefits of well-designed educational resources in order to improve the integration of technology in higher education in chemistry as well as in other disciplines.

Methods

The methods used in this research were mainly quantitative: we developed and implemented a questionnaire and constructed a criterion system for evaluating OER in chemistry. In addition, we integrated qualitative aspects by conducting semi-structured interviews. The combination of both methods is beneficial here, as the integration of quantitative and qualitative research methods is known to increase the rigor and trustworthiness of the results (Leech and Onwuegbuzie, 2007). In this research, 100 educational websites in chemistry were analyzed in order to update the system of criteria for the evaluation of OER. An online questionnaire on OER usage for teaching purposes and the considerations for selecting OER was circulated among 100 chemistry professors, lecturers and tutors who teach undergraduate chemistry courses at 27 academic institutions in Israel. Of these, 66 responded, and 17 of them were subsequently interviewed in order to extend and validate the questionnaire findings. It should be noted that the website analysis was carried out in parallel with the circulation of the questionnaires. This may be considered a limitation of the current study, since the questionnaire did not include queries about findings that emerged during the website analysis, such as the possible use of Web 2.0, as explained below. However, the semi-structured interviews allowed us to overcome this limitation.

The URLs of the surveyed websites, online questionnaire and interview questions are provided in appendices 1–3 respectively.

Selection of open educational resources

Several definitions of OER exist (McGreal, 2013). Here we follow the definition by Atkins et al. (2007), according to which OER are “full courses, open courseware and content, educational modules, textbooks, streaming videos, tests and assessments, open source software tools, and any other tools and materials used to support teaching or learning.” The present research surveyed 100 websites that contained OER in undergraduate chemistry – approximately half of them focused on general chemistry and the other half on other topics, mainly analytical chemistry and organic chemistry (see Appendix 1 for the list of URLs). These were collected mainly through Google searches using the keywords “General Chemistry”, “Organic Chemistry” and “Analytical Chemistry” during the years 2012–2013. Others were collected through answers to the online questionnaire and links within the collected websites. The search was terminated when recurring links appeared.

Online questionnaire

The questionnaire (see Appendix 2 for details) was developed to examine how chemistry lecturers evaluate OER before implementing them in their teaching. A questionnaire was determined to be the most suitable data collection tool because it allows a large sample to be gathered in a short period of time (Leech and Onwuegbuzie, 2007). The questionnaire was developed by the researchers and then tested and validated by several experts in the fields of chemistry and education. It comprised 11 questions divided into three sections: the first characterized the respondents' background (age, years of experience, etc.), the second examined the respondents' use of OER, and the third examined the criteria used to select OER for teaching purposes. The questionnaire included questions about integrating online resources for learning purposes in general and specifically for teaching chemistry. Most questions were closed-ended, but a few were open-ended. Where relevant, closed-ended questions were accompanied by a five-point Likert-type scale (1 – not at all, 5 – very often).

Interviews with lecturers and tutors

After the completed questionnaires were received, the first author interviewed respondents who had taught undergraduate courses during the past five years at various academic institutions throughout Israel and had indicated on their questionnaire their agreement to be interviewed. The interviews were conducted from March through May of 2013. Fifteen interviews were conducted in person and two by telephone. These interviews included reference to the findings of the website analysis and to the construction of the system of evaluation criteria, both to extend the findings and to provide examples of the subjects that emerged from the questionnaires (see Appendix 3 for details). All interviews were recorded and transcribed. The interviews were analyzed as follows. Initially, target categories were defined based on the interview questions, such as integration of OER for teaching purposes, lecturers' approaches to integrating such materials, use of Web 2.0 technology, and considerations for selecting OER. We then sorted the interviewees' answers according to these categories. In the second stage, we searched the interviewees' answers for recurring themes, i.e., answers that appeared in more than one or two interviews; these were added to the list of categories. Answers that appeared only once were included if they reflected themes that arose during the analysis of the questionnaires or directly addressed an issue discussed in the literature. All authors participated in the analytical process.

Ethical considerations in brief

The research was approved by the ethics committee of the Department of Education and Psychology at the Open University of Israel. The questionnaire was circulated among chemistry lecturers whose contact details were freely available on their institution's official website. Participation was voluntary, and the questionnaire was answered anonymously, except by those who agreed to give their names and email addresses to the research team and participate in the interview stage of the research. Prior to the interviews, which were held several months later, each participant was asked to reconfirm his/her consent to participate and was given the opportunity to withdraw from the study. All interviewees were informed that, in order to maintain confidentiality, no identifying details (e.g., their names, the names of their institutions, or details of the particular courses or study programs under their responsibility) would be published.

Results

Mapping of open educational resources in chemistry

Most of the surveyed websites were in English and focused mainly on teaching general, organic and analytical chemistry. The taxonomies of Nachmias et al. (1998) and of Nachmias and Tuvi (2001) served as the basis for the present research. These included over 100 evaluation variables divided into four dimensions: descriptive (e.g., target population, site developers), pedagogical (e.g., instructional model and means, cognitive demand), representational (e.g., graphics, navigation tools) and communication (e.g., synchronous/asynchronous feedback, link configuration). The list of variables was updated in light of the technological development of the Internet in recent years and in accordance with the responses of the research participants. Variables that were too detailed (e.g., the type of external resources the website relies on, such as textbooks, other websites, "ask an expert", etc.), variables that pointed to rare technologies (e.g., MOO/MUD, virtual environments), and variables that captured common characteristics of websites not expected to provide new insights (e.g., whether the structure of the website is linear, branched or web-like) were excluded from the list. This process resulted in a simple, updated system of criteria for the evaluation and selection of educational resources by chemistry lecturers that can be easily adapted to other fields. The tool was validated by experts in chemistry and education. In addition, a sample of ten websites was tested by two external examiners. Only those variables that the examiners agreed upon were included in the final evaluation tool; variables that the examiners found unreliable were altered or excluded.

Our final evaluation tool comprised 19 criteria that maintained the dimensions of the original taxonomies (see Appendix 4 for details). The descriptive dimension refers to factual details such as affiliation (e.g., academic institution, public organization, government authority, private company, school, teacher or student) and the date of the last update; the pedagogical dimension covers the activities provided by the website in relation to the learning method, e.g., exercises with or without feedback, user-generated content and simulations. In particular, two types of simulations were tested: symbolic simulation and experiential simulation (Gredler, 1996, 2004). The first consists of an interactive demonstration of a system or process during which the learner can change certain parameters that affect the behavior of the system. The second refers to an environment in which the learner plays an authentic role, carries out complex tasks, and learns through creating (e.g., a virtual laboratory simulation). The communication dimension examines whether the website enables the user to communicate with the website's administrator and/or other users, such as the general public, via e-mail, discussion group, chat or comments section. The representational dimension refers to the visualization of information or content in the website. This visualization consists of two-dimensional representations such as pictures, three-dimensional representations such as molecular models, and video representations such as lectures or demonstrations.
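
As an illustration of how such a checklist-style tool could be applied in practice, the sketch below encodes the criteria as a simple data structure and reports which dimensions a given website covers. This Python sketch is purely illustrative and was not part of the research instruments; the criterion labels follow Table 1 and Appendix 4, while the variable and function names and the example record are hypothetical.

# Illustrative sketch only (not part of the original study's instruments):
# the 19-criterion evaluation tool (Table 1 / Appendix 4) encoded as a
# checklist, plus a helper that reports which dimensions a website covers.

EVALUATION_TOOL = {
    "descriptive": [
        "ac/edu", "co/com", "other affiliation",
        "date of last update provided", "updated last year",
        "references", "links to other websites",
        "high school audience", "university audience",
    ],
    "pedagogical": [
        "user-generated content", "exercises with feedback",
        "exercises without feedback", "experiential simulation",
        "symbolic simulation",
    ],
    "communication": ["connection to other users", "links to social media"],
    "representational": ["2D", "3D", "video"],
}

def dimensions_present(observed_criteria):
    """Return the dimensions for which at least one criterion was observed."""
    observed = set(observed_criteria)
    return {dim for dim, criteria in EVALUATION_TOOL.items()
            if observed & set(criteria)}

# Hypothetical checklist for a single website:
example_site = ["ac/edu", "2D", "video", "exercises with feedback"]
print(sorted(dimensions_present(example_site)))
# ['descriptive', 'pedagogical', 'representational']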

Table 1 presents the evaluation tool and consequent mapping of the websites. As is evident, the descriptive dimension and representational dimension were present in all 100 websites that were analyzed. However, the pedagogical dimension was present in only about 30% of the websites and the communication dimension in only about 25% of them.

Table 1 Mapping of chemistry websites by our evaluation tool (%) (N = 100)
Dimension Criteria Appearance on the Internet (%)
Descriptive dimension Website affiliation
ac/edu 48
co/com 23
Other 29
Updating
Date of last update provided 88
Updated last year 39
Links and references
References 20
Links to other websites 31
Target population
High school 31
University 100
 
Pedagogical dimension User-generated content (UGC) 24
Exercises
Option for feedback 26
No option for feedback 12
Demonstrations
Experiential simulation 29
Symbolic simulation 18
 
Communication dimension Other users 24
Social media 22
 
Representational dimension 2D 100
3D 21
Video 34
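
To illustrate how the appearance percentages in Table 1 could be obtained from per-website checklists of this kind, the following Python sketch aggregates one checklist per website over the sample. It is a hypothetical illustration under the assumptions above, not the analysis procedure used in the study.

# Illustrative sketch only: aggregates per-website checklists into the
# percentage of websites on which each criterion appears (as in Table 1).
from collections import Counter

def criterion_percentages(site_checklists):
    """site_checklists: list of sets, each holding the criteria observed
    on one website. Returns {criterion: percentage of websites}."""
    n = len(site_checklists)
    counts = Counter(c for checklist in site_checklists for c in checklist)
    return {criterion: 100.0 * count / n for criterion, count in counts.items()}

# Hypothetical data for three websites (the study surveyed 100):
sites = [
    {"ac/edu", "2D", "video"},
    {"co/com", "2D"},
    {"ac/edu", "2D", "3D"},
]
for criterion, pct in sorted(criterion_percentages(sites).items()):
    print(f"{criterion}: {pct:.0f}%")
# prints (one per line): 2D: 100%, 3D: 33%, ac/edu: 67%, co/com: 33%, video: 33%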


Online questionnaires

The questionnaire was answered by 66 chemistry faculty members from 17 academic institutions (both colleges and universities) in Israel. Of the respondents, 60 were lecturers and the other six were teaching assistants; however, for simplicity we refer to the entire group as lecturers. Approximately half of the respondents (45.5%) teach general chemistry and the rest teach other undergraduate chemistry courses.

Fig. 1 presents the distribution of OER usage for teaching purposes as reported by the questionnaire respondents on a 1–5 Likert-type scale. The central uses of OER were lesson preparation and visual demonstration via online simulations, animations and 3D graphics. Referring students to websites and databases was reported at a medium level of use, while finding extra exercises or quizzes with immediate feedback, suitable for self-learning, was reported at a low level of use.


Fig. 1 Distribution of OER usage for teaching purposes on a 1–5 Likert-type scale (1 = lowest use, 5 = highest use) (N = 66).

The questionnaires were used to collect data regarding the criteria by which a specific website is selected for teaching purposes. Fig. 2 presents the distribution of these criteria. Most of the respondents marked scientific precision, links to other websites, design, and usability as important considerations, while the identity of the website's author was of medium interest. The level of explanation and the date of the last update were found to be less important.


Fig. 2 Distribution of criteria for selecting OER for teaching purposes, on a 1–5 Likert-type scale (1 = lowest importance, 5 = highest importance) (N = 66).

Interview findings

The group of interviewees was diverse in academic seniority, ranging from a tutor with four years' experience in teaching to professors with over 30 years of teaching and research experience. No relationship was found between recurring themes and the respondents' background (age, years of experience, institution, etc.). Therefore, respondents' answers were analyzed as representing the full population of interviewees.

Generally, we found that the level of Internet integration in undergraduate chemistry teaching is high. Out of 17 interviewees, 13 taught general chemistry as well as introductory chemistry courses for freshman students, and all of them reported integration of OER in their teaching. Four other interviewees who taught different courses such as quantum chemistry or organic chemistry reported minor use of the Internet for teaching purposes.

Pedagogical strategies and the integration of online resources for teaching purposes

All interviewees use OER as supplementary tools for teaching. Interviewee No. 2 said: “The Internet is an auxiliary tool and not the main one; I am not looking for communication with other people”. Interviewee No. 9 explained: “I don't know how we managed before; the Internet upgraded all learning material and lectures, but one must know how not to integrate too much. [I] integrate new things but still believe in the traditional classic type of learning supplemented by modern applications”. All the interviewees stated that they found teaching materials through a search engine, mainly Google. The search was need-based (online demonstration or textual explanation), and conducted through keywords such as the name of a certain chemical reaction or the name of a specific topic. The strategy of this search was explained by interviewee No. 1: “When I search for something, I focus on the goal: If I am looking for an explanation, I go to the place that gives explanations. If I search for questions, I look for questions”. Interviewee No. 2 added: “I don't have enough time to waste… [I] use the method of trial and error […] and if it works, it works”.

The main use of OER for teaching purposes was to prepare lectures integrated with links to online demonstrations. This was done through video clips that presented experiments, especially those that appear in YouTube, and also by specific presentations through animations and simulations. Seven of the interviewees used online resources to find questions and problems on a specific topic that could be given to students. Interviewee No. 13 stressed that the use of the Internet to prepare exams was not only for exploring new ideas but also for comparing the level of the exam to other academic institutions around the world: “This gives me feedback on the kind of level that exists in the world and from which I can also get new ideas”.

Six interviewees operate a course website, which is used to direct students to enrichment websites, drill-and-practice activities, bulletin boards and forums. Five interviewees operate a forum, mainly before an exam. Two interviewees noted that they encourage their students to enter the forum and answer questions posted by their peers. On the other hand, Interviewee No. 12, who operates an active forum, stressed that he encourages the students to ask factual questions; he claims that the natural sciences are based on facts, and it is therefore difficult for students to hold a discussion. From the interviews we found that 13 interviewees encourage their students to use a forum, but do not advocate the use of Web 2.0 technologies that allow collaboration or the addition of content, such as a wiki or blog. The reasons for this vary. Interviewee No. 6 said: “I don't know whether I would integrate [the use of a blog], it depends on the effort and the added value. [This] sounds like something that could not be done in a lesson, and [if this is] not [in the framework] of a lesson it is less relevant for me”. Interviewee No. 7 thinks that this technology is not suitable for freshman students. Interviewee No. 2 noted that she is not familiar with the technology. In this regard, all the interviewees accorded low importance to communication with website managers or other users, or to communication via social networks. Seven interviewees referred to guidance in the integration of learning technology and expressed a wish to participate in training sessions.

Fifteen interviewees were asked for their opinion of open lectures, MOOCs, etc. uploaded by academic institutions or other organizations. The interviewees regarded these positively, but six of them stressed that there is no substitute for a traditional lecture, due to the importance of student–lecturer interaction. Interviewee No. 14 explained: “As a supplementary tool this is excellent, but as the main learning tool it misses the aim”. Interviewee No. 8 added: “In an open lecture on the Internet the student is force-fed the material, and this causes far fewer questions to be asked […] and reduces their ability to participate […]. For me, interactions and discussions between the lecturer and the students are important”. Another example is Interviewee No. 17, who also expressed doubts about open lectures and their influence on the future role of the lecturer: “This is positive, another possibility to view the material in preparation for the actual lecture. It affects the lecturer – if the students see that everything is available online – they won't be interested [in the lecture]. For students this is excellent”. Three interviewees use open lectures to learn new material and think they are excellent as supplementary material. Interviewee No. 12 thinks that, since courses given around the world do not necessarily overlap, online video clips from another university or a private lecturer should be used only under the guidance of the lecturer, if at all.

A number of advantages and disadvantages of using OER for teaching purposes emerged from the present research on chemistry lecturers. One advantage is the added value of the Internet for chemistry teaching, expressed in its visual illustrative capabilities, demonstrations of experiments, and the dynamic effects provided by computerized simulations. For example, it is possible to give a tangible sense of a model or idea that cannot be observed directly, to dynamically present the different stages of a developing process, and to access simulated experiments that cannot be carried out in the classroom. Nevertheless, five interviewees noted that online simulation of experiments cannot substitute for the laboratory.

The main disadvantage raised by all the interviewees was the inaccuracies prevalent in many websites. Such inaccuracies immediately downgrade the reliability of the information, which is the main criterion for selecting OER. In addition, four interviewees see a disadvantage in links to other websites, since these make it possible to jump from one topic to another and may, to some extent, interrupt in-depth learning. Language is another limitation: four interviewees claimed that the fact that most of the websites are in English poses difficulties for Israeli students.

Criteria for selecting OER

We found that all the interviewees examined educational websites according to their own scientific knowledge. As Interviewee No. 16 stated: “I am the reviewer of the website”. For six interviewees, the identity of the website's author is an important factor for determining the website's reliability. The website of a recognized university or of a large commercial company with a well-known name was considered by most of the lecturers preferable to a personal website or the website of an unknown organization.

According to our findings, the date of the last update and the information sources are of minor importance in determining website reliability. Three of the interviewees who referred to this criterion claimed that the date of the last update was not important because the content and the syllabus do not change over time. Interviewee No. 1 thought that one may evaluate information by cross-referencing it with textbooks, and she was the only one who claimed that even students could evaluate reliability in this way. Seven of the interviewees referred to the evaluation of information by students; five of them noted that students do not have sufficient knowledge to evaluate the reliability of chemical information. Six interviewees referred to the reliability of Wikipedia, but their views differed – half of them claimed that Wikipedia is not a reliable source, and half claimed that it is. One of the lecturers defined it as “the constantly changing encyclopedia which is never static”.

Interviewees were asked about website usability, a criterion that was found to be important at the questionnaire stage. However, from the answers received we were unable to determine accepted criteria for characterizing usability. To summarize, we show that lecturers intuitively use most of the criteria included in the evaluation tool developed in this research. The main criteria were affiliation and information presentation. With regard to content reliability, no agreement was reached on a criterion that could reflect reliability, and it was evident that lecturers rely mainly on their own professional knowledge to evaluate educational websites. Criteria related to Web 2.0 technologies that offer additional collaborative learning and communication possibilities, such as editing or adding content by users or communication with the website's managers, were considered unimportant by the interviewees. They explained that in face-to-face teaching there is no need for communication of this kind.

Discussion

Practice of learning theories in chemistry OER

The mapping and evaluation of the different educational websites in chemistry was done through the construction of a criterion system. This system was based on existing taxonomies (Nachmias et al., 1998; Nachmias and Tuvi, 2001), reduced from over 100 to 19 evaluative criteria, while retaining the division into the categories and dimensions of the original taxonomy. Collectively, the 100 websites analyzed in this study covered all of the evaluative categories (see Table 1). This analysis presents examples of the various possibilities for learning with OER. The descriptive dimension demonstrates the diversity of educational websites, including academic, commercial, private and public ones. These websites offer interactive learning environments by allowing representation, communication and experimentation via various learning approaches appropriate for different learning theories, types of learners and content levels.

Many websites integrate exercises with synchronous feedback; some of them present topics in a graded manner (simple questions followed by more complicated ones). These website types implement the characteristics of the behaviorist theory of learning. As shown by Winn and Snyder (1996), a computerized learning environment enables the delivery of immediate feedback and the construction of information presentations in a graded manner. About 40% of the lecturers interviewed in our study agreed with this approach, stating that immediate feedback on an exercise is an important aspect. The cognitive theory of learning is manifested in educational websites that provide both verbal and visual explanations of a specific topic using a combination of text and graphics, as well as animations and three-dimensional simulations. According to Mayer (2005), learning delivered through a number of channels (verbal and visual) can help learners to establish understanding, to transfer knowledge to new situations, and to facilitate the storage and retrieval of information.

Other websites exhibit the characteristics of the constructivist theory through collaborative and cooperative learning that enables dialogue among learners and between learners and teachers. Ullrich et al. (2008) explained that in this way the learner constructs knowledge through interaction with other learners, in an approach that integrates the rich information found online.

Chemistry OER and TPACK

The mapping of educational websites carried out in the late 1990s and early 2000s found only a few websites with characteristics of collaborative learning (Mioduser et al., 1999, 2000; Tuvi and Nachmias, 2001). In our study, conducted during the years 2012–2013, this number had increased: about 25% of the websites integrated collaborative aspects and video podcasts, which are central parts of Web 2.0 technology. It should be noted that integrating Web 2.0 technology in teaching is time-consuming and may require high levels of technological, pedagogical and content knowledge, in addition to the ability to apply them constructively (Mishra and Koehler, 2006). This may explain why chemistry lecturers are not aware of, or do not pay much attention to, these new pedagogical possibilities. Archambault and Barnett (2010) claimed that it is difficult to measure and separate the different TPACK domains, except for the technological domain, which is clearly separable. It is possible that chemistry lecturers lack only the technological knowledge of Web 2.0 applications and therefore cannot exploit their pedagogical advantages. Specially-designed intervention programs can be of assistance here, as demonstrated by Blonder et al. (2013): when teachers are given the opportunity to develop concrete technological skills in a supportive environment of both colleagues and experts, integrating new technologies in class becomes much more probable. Indeed, a few interviewees expressed their interest in such programs should they become available at their institutions.

An interesting finding of the present research is the widespread use, by all interviewees, of OER based on Web 1.0 technology, such as online simulations and videos, without any directed intervention program. The chemistry lecturers surveyed for this study had thus naturally gained the TPACK required for integrating such technology in education, which they use for various purposes such as representation, concretization, experimental exemplification, and 3D interactive graphics. They claimed that Internet use allows them to explain complex models in more detail and to cope with students' difficulties in understanding subjects and concepts in chemistry. In addition, the questionnaire findings show that lecturers make extensive use of simulations, and many interviewees explained that computerized three-dimensional and dynamic presentations help them explain complex models. This finding is in line with Barak (2007), who argued that the major contribution of computer technology lies in the possibility of providing detailed explanations of models in chemistry, and with Merchant et al. (2012), who showed that 3D virtual reality-based instruction is effective for enhancing students' chemistry achievement.

Video podcasts and MOOCS

One of the main technological advancements of recent years is the widespread use of video podcasts, i.e., video files distributed in digital format through the Internet (Kay, 2012). Chemistry lecturers who were interviewed for this research reported that over the past four to five years they have integrated video podcasts into their lectures, mainly from YouTube. According to Kay (2012), this increase is due to the ease of playing online video podcasts, which developed with the advent of such websites. In addition, the last few years have seen a breakthrough in the use of MOOCs, as many academic institutions have started to upload free courses online (Leontyev and Baranov, 2013). Chemistry lecturers saw a great advantage in presenting video podcasts during a lecture and have used them to demonstrate experiments in lectures on theory, or to present lectures given by overseas lecturers. Although there is a general consensus about the advantage of breaking down the barriers of time and space made possible by these podcasts, several interviewees stressed that their use is justified only for the purpose of exemplification. Most lecturers expressed concern about the spread of MOOC technology and its influence on the academic world, particularly with regard to the potential threat to their role as teachers and the lack of lecturer–student interactions that are an integral part of face-to-face teaching. Indeed, many researchers have addressed the disruptive effects of MOOCs on higher education (de Langen and van den Bosch, 2013; Ng'Ambi and Bozalek, 2015; Rhoads et al., 2015).

Central criteria for using OER in chemistry

According to this study, the central criteria for selecting OER are generally associated with the intended use of the resources. According to Stake (2004), the adoption of different criteria depends on needs, means and personal preferences. Correspondingly, website reliability emerged from the interviews as the most important criterion for selecting OER. An attempt to define criteria that reflect reliability according to the affiliation of the website with a recognized institution, such as a university or a well-known commercial company, appeared insufficient: 16 of the 17 interviewees noted that they evaluate the information themselves, and only one lecturer claimed that he did not know how to test reliability. The interviewees explained that they acted this way because errors can be found in every website. Indeed, learning materials are disseminated online without being subjected to the same kind of strict and rigorous examination as textbooks (Graham and Metaxas, 2003; Flanagin and Metzger, 2007). The need to create specific criteria for the evaluation of online information has been recognized by several researchers (Flanagin and Metzger, 2007; Brand-Gruwel and Stadtler, 2011), but our findings show that although such criteria are important, they cannot replace examination of the content by the users themselves. Therefore, the use of OER for teaching depends strongly on the lecturer's content knowledge.

Considering the other dimensions of the evaluation criteria, it appears that the interviewees account for most of the criteria relating to the pedagogical and representational dimensions, but disregard the communication dimension. According to Nevo (1995), learning materials can be selected intuitively or systematically. Nevo studied the work of school teachers and found that they were not aware of the possibility of systematic evaluation and therefore did not use it during the selection process. Similarly, we found that chemistry lecturers do not carry out a systematic evaluation of websites according to an external system of evaluative criteria. Instead, they select learning materials according to prior knowledge based on their personal experience. The search for learning materials is not conducted within the framework of a specific website, but by searching for the relevant topic with a search engine. We found that lecturers tend to judge a website by the way it presents important topics, even if they do not define this formally. Intuitively, lecturers account for several specific criteria such as scientific precision, reliability and the type of simulation or exercise. On the other hand, considerations related to communication among users and the editing of content materials are generally ignored.

In addition, we found that website design was strongly correlated with Internet use for teaching purposes. However, analysis of website design was not part of this research; it is a broad field that requires further research in order to clearly define the components of this criterion and its contribution to web-based chemistry education.

Conclusions

Chemistry lecturers make use of online resources for teaching purposes mainly as a supplementary tool for frontal teaching and for assigning homework. Opportunities to stimulate students' interest and to simplify the presentation of complex topics are the main aspects that influence the selection of websites by chemistry lecturers. These applications are based on Web 1.0 technology, i.e., tools that make it possible to present complex models using three-dimensional graphics and interactive simulations. Our results show that lecturers select online learning materials intuitively, according to personal criteria and the purpose for which these materials are being used. Their personal system of criteria is largely consistent with the criterion system developed in this research, as it addresses most of the considerations for selecting OER. The most important aspect missing from the personal system is the option for collaborative learning facilitated through communication with other users and social media platforms. These findings strengthen the reliability and validity of our system of evaluative criteria for summarizing the considerations used by lecturers when selecting OER for teaching purposes, on the one hand, and, on the other, highlight the absence of advanced online communication as a pedagogical tool in undergraduate chemistry teaching.

The use of Web 2.0 tools such as content sharing and social network communication facilities is not as common as the use of Web 1.0 technology. Our results suggest that lecturers have the TPACK required to integrate Web 1.0 technology in their teaching, but may require well-designed intervention programs in order to gain the TPACK required for effectively implementing collaborative learning in their classes. Furthermore, adequate time must be allocated for lecturers and tutors to become familiar with new technologies and their pedagogical benefits in order to use them effectively (Buckenmeyer, 2008).

Web technology is a dynamic field that is constantly evolving. Designing a website in an attractive and user-friendly way, and making the advantages of its pedagogical approach clearly apparent to teachers, faculty members and tutors, particularly in science, is a challenge for web developers that deserves separate, comprehensive research. Our mapping tool presents a step forward in defining the important components of OER required for improved integration of technology in chemistry teaching as well as in other disciplines.

Appendix 1: URLs of open educational resources

The open educational resources were retrieved during 2012–2013. It is possible that some of them have since been edited, and some may no longer be accessible.

1. http://www.facebook.com/pages/General-Chemistry-Help/145801432150993

2. http://www.youtube.com/playlist?list=PL53ED067BE448053E

3. http://www.khanacademy.org/science/chemistry

4. http://www.facebook.com/#!/pages/General-chemistry-MFU/132089493524753?fref=ts

5. http://cwx.prenhall.com/petrucci/medialib/media_portfolio

6. http://cwx.prenhall.com/petrucci/

7. http://www.chm.davidson.edu/ronutt/che115/che115_2.htm

8. http://www.chm.davidson.edu/vce/index.html

9. http://www.jce.divched.org/

10. http://www2.stetson.edu/~wgrubbs/datadriven/resources.html

11. http://srdata.nist.gov/gateway/gateway?keyword=chemistry&rddesc=desc

12. http://kinetics.nist.gov/kinetics/index.jsp

13. http://phet.colorado.edu/

14. http://en.wikibooks.org/wiki/General_Chemistry

15. http://antoine.frostburg.edu/chem/senese/101/index.shtml

16. http://www.chem1.com/acad/webtext/virtualtextbook.html

17. http://cnx.org/content/col10264/latest/

18. http://cnx.org/content/col10263/latest/

19. http://chemwiki.ucdavis.edu/Wikitexts/UC_Davis

20. http://www.chemcollective.org/

21. http://www.meta-synthesis.com/index.html

22. http://ocw.mit.edu/courses/chemistry/5-111-principles-of-chemical-science-fall-2008/index.htm

23. http://ull.chemistry.uakron.edu/genchem

24. http://www.academicearth.org/courses/general-chemistry

25. http://edelsteincenter.wordpress.com/courses/general-chemistry/

26. http://www.tannerm.com/

27. http://academic.tv4u.co.il/Front/Newsnet/reports.asp?reportId=285611

28. http://www.webelements.com/

29. http://www.chemguide.co.uk/index.html

30. http://www.ptable.com/#

31. http://nhscience.lonestar.edu/biol/bio1int.htm

32. http://www.chemmybear.com/stdycrds.html#GenChem

33. http://group.chem.iastate.edu/Greenbowe/sections/projectfolder/flashfiles/reaction/bonding1.swf

34. http://www.wisc-online.com/objects/ViewObject.aspx?ID=AP13004

35. http://www.wellesley.edu/Biology/Concepts/Html/dilutions.html

36. http://programs.northlandcollege.edu/biology/Biology1111/animations/index_page_for_animations.htm

37. http://www2.nl.edu/jste/bonds.htm

38. http://www.chembio.uoguelph.ca/educmat/chm19104/chemtoons/chemtoons.htm

39. http://chemed.chem.purdue

40. http://www1.chem.leeds.ac.uk

41. http://www.youtube.com/user/UCBerkeley?feature=watch

42. http://www.educator.com/chemistry/general-chemistry/ow/

43. http://www.educator.com/chemistry/goldwhite/

44. http://www.files.chem.vt.edu/chem-ed/index.html

45. http://www.mhhe.com/physsci/chemistry/essentialchemistry/

46. http://hyperphysics.phy-astr.gsu.edu/hbase/chemical/chemcon.html#c1

47. http://www.youtube.com/playlist?list=PL5FC675C558EF707D

48. http://www.chem.ualberta.ca/~plambeck/che/p101/

49. http://chemistry.about.com/od/generalchemistry/u/basics.htm#s7

50. http://msumgenchem.blogspot.co.il/

51. http://www.bc.edu/schools/cas/chemistry/undergrad/genexp.html

52. http://ocw.metu.edu.tr/course/view.php?id=99

53. http://www.chem.wisc.edu/content/genchem-laboratory

54. http://www.chem.umass.edu/genchem/chem112/112index.html

55. http://www.dartmouth.edu/~chemlab/index.html

56. http://jan.ucc.nau.edu/~jkn/Labs.html

57. http://www.colorado.edu/chem/genchem/

58. http://www.uta.edu/chemistry/undergraduate/the-uta-chemistry-channel.php

59. http://butane.chem.illinois.edu/decoste/Chem103/index.html

60. http://www.learnerstv.com/index.php

61. http://group.chem.iastate.edu/Greenbowe/sections/projectfolder/animationsindex.htm

62. http://www.tau.ac.il/~phchlab/index.html

63. http://www.youtube.com/playlist?list=PLD5581AFFC708BE46

64. http://www.dnatube.com/categories/35/Analytical-chemistry

65. http://www.chem.ufl.edu/ugrad/labanalytical.shtml

66. http://www.organic-chemistry.org/

67. http://www2.chemistry.msu.edu/faculty/reusch/VirtTxtJml/intro1.htm

68. http://www.chemistry-blog.com/category/general-chemistry/

69. http://bchsgenchem.wordpress.com/

70. http://nghsapchem.wordpress.com/general-chemistry-questions

71. http://che1315gilbert.blogspot.co.il/

72. http://www.chem.ox.ac.uk/vrchemistry/

73. http://chemistry2.csudh.edu/homework/hwintro.html

74. http://web.mst.edu/~gbert/links.html

75. http://wps.prenhall.com/esm_bruice_essentials_2/110/28209/7221752.cw/index.html

76. http://www2.chemistry.msu.edu/faculty/reusch/VirtTxtJml/intro1.htm

77. http://pollux.chem.umn.edu/8021/Lectures/

78. http://www.rsc.org/learn-chemistry/resource/listing?searchtext=&displayname=students&filter=all&fAudience=AUD00000002&fMediaType%252525252cMediaType%252525252cMediaType=MED00000003&MediaType=MED00000011&sfMediaType%2525252cMediaType%2525252cMediaType=MED0

79. http://www.p-forster.com/IINDEX.htm?/index.htm

80. http://www.p-forster.com/IINDEX.htm?/index.htm

81. http://highered.mcgraw-hill.com/classware/selfstudy.do?isbn=0073047872

82. http://www.organic-chemistry.org/reactions.htm

83. https://wiki.ch.ic.ac.uk/wiki/index.php?title=StudentWiki:Contents

84. https://undergrad-ed.chemistry.ohio-state.edu/

85. http://www.chemtube3d.com/

86. http://www.chemtopics.com/

87. http://dwb4.unl.edu/

88. http://www.elmhurst.edu/%E2%88%BCchm/demos/index.html

89. http://www.chem.uiuc.edu/clcwebsite/demos.html

90. http://www.chemtutor.com/

91. http://www.scs.illinois.edu/schiffer/index.html

92. http://chem.usc.edu/resources/chemlinks.html

93. http://www.whfreeman.com/Catalog/static/whf/acsgenchemhome/

94. http://wps.prenhall.com/esm_mcmurry_chemistry_4/9/2408/616516.cw/index.html

95. http://www.mpcfaculty.net/mark_bishop/Chemistry_2.htm

96. http://www.shodor.org/UNChem/basic/nomen/index.html#prob1

97. http://www.chem.uiuc.edu/clcwebsite/OLresources.html

98. http://chemicalelements.com/

99. http://www.chemicool.com/

100. http://www.wwnorton.com/college/chemistry/gilbert2/chemtours.asp

Appendix 2: online questionnaire

This questionnaire was sent to 100 chemistry lecturers via email. Responses were anonymous.

Part A: Survey population (age, education, years of experience)

1. Age:

 • 20–30

 • 31–40

 • 41–50

 • 51–60

 • 61–70

2. Institution: __________

3. What is your role at the University? __________

4. Gender: Female/Male

5. Which courses have you taught in the past 5 years? __________

6. Experience in research (number of years)

 • 0–5

 • 6–10

 • 11–20

 • Over 21

7. Experience in teaching (number of years)

 • 0–5

 • 6–10

 • 11–20

 • Over 21

Part B: Level of use of online resources for teaching purposes

8. On a scale of 1 to 5 (1 – not at all, 5 – very often) how often do you use online resources for teaching purposes?

 • Curriculum development – lesson preparation 1—2—3—4—5

 • Demonstration from online resources (simulation, pictures, video, 3D) 1—2—3—4—5

 • Referring students to online resources 1—2—3—4—5

 • Referring students to chemistry databases 1—2—3—4—5

 • Other 1—2—3—4—5

 • If you chose ‘other’ please elaborate: __________

Part C: Use of online resources

Questions 9–11 are intended only for users of open educational resources

9. On a scale of 1 to 5 (1 – not at all, 5 – very often) how important are the following criteria when evaluating and choosing open educational resources:

 • Scientific accuracy 1—2—3—4—5

 • Visualization (website design) 1—2—3—4—5

 • Usability 1—2—3—4—5

 • The level of explanation 1—2—3—4—5

 • Links to other websites 1—2—3—4—5

 • Date of last update provided 1—2—3—4—5

 • Identity of the author 1—2—3—4—5

 • Other 1—2—3—4—5

 • If you chose ‘other’ please elaborate: __________

10. The research literature has created taxonomies which serve as tools for evaluating open educational resources. These taxonomies focus on assessment of information, pedagogy, and visualization. Do you use these kinds of evaluation tools? If so please describe an example:

__________

11. Please list two open educational resources that you use for teaching purposes and explain why or how they advance your teaching goals.

__________

Appendix 3: semi-structured interview

Background

1. Which courses do you teach?

2. How many years of experience do you have in teaching?

3. How many years of experience do you have in research?

4. How long have you been integrating technology and open educational resources in your teaching?

Making use of the Internet

5. How do you find online materials for teaching purposes?

6. One of the main problems with online learning materials is that today, anyone can upload content to the Internet. In your opinion, what are the most important criteria for evaluating the reliability of a scientific website in general and specifically in the field of chemistry (for example: references, date of last update provided, whether students can evaluate information)?

7. Can you please give two examples of open educational resources that you have used in the past, one that you were satisfied with and another that you were not satisfied with?

8. How do you integrate educational resources in your teaching?

9. In your opinion, what are the most central criteria one should consider when choosing a specific open educational resource for teaching purposes?

10. Do you think there is an added value in using open educational resources? If so, what is it? If not, please explain why not.

11. Can you please describe your teaching strategy?

12. For this question we presented an open educational resource to the interviewee and asked:

One challenge in curriculum development is that different educators have different approaches to teaching. In your opinion, is the open educational resource presented to you a good supplementary tool for teaching chemistry? Why? If you could, what would you change in this website and why?

13. Do open educational resources enable learning that is not possible with traditional teaching?

14. Do you think the way that content is presented in a website (for example: visualization/audio) is important? Please explain why.

15. Do you think the learning strategy of the website (personal learning/collaboration) is important? Please explain why.

16. Web 2.0 technologies enable collaborative learning as well as adding or editing content by the users (e.g. wikis, blogs, and forums). What is your approach with regards to teaching with this kind of technology?

17. During our research we developed an evaluation tool that maps open educational resources. In the next part of the interview I will introduce you to some of the criteria for evaluation.

a. In your opinion, is there anything missing from the list of criteria presented to you?

b. Are there criteria that should be removed?

c. Which criteria from the tool we presented would you use in order to choose an open educational resource for teaching?

18. Do you think that educators should be trained to integrate open educational resources in their teaching?

Appendix 4: evaluation tool

This tool comprises 19 criteria that can be used to evaluate open educational resources. Please check all of the appropriate boxes.

Descriptive dimension: refers to factual details

Website affiliation

□ ac/edu

□ co/com

□ Other

Update status

□ Date of last update provided

□ Updated last year

□ References

□ Links to other websites

Target population

□ High School

□ University

Pedagogical dimension: presents the activities provided by the website in relation to the learning method

□ User-generated content (UGC)

Exercises

□ With feedback

□ Without feedback

Demonstrations

□ Experiential simulation: the learner takes part in performing complex tasks and learns through creating (for example, by building a city, or a model in chemistry)

□ Symbolic simulation: allows learners to explore the behavior of a system through a demonstration (for example, in a simulation of the heart's activity, the learner can change a feature that alters the course of this activity)

Communication dimension: examines whether the website enables connection with other learners

□ Connection to other users (e.g., through an interactive forum or a feedback/comment option)

□ Links to social media

Representational dimension: the way in which data, information and knowledge are presented in the website

□ 2D (pictures, illustrations, figures)

□ 3D (models)

□ Videos
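
When applying the tool systematically to a sample of websites, it can be convenient to record each site as the set of criteria it satisfies. The sketch below is a minimal illustration in Python rather than part of the published instrument; the dictionary keys, criterion labels, and helper function are assumptions introduced here for demonstration only.

```python
# Illustrative sketch (not part of the published instrument): the 19 criteria
# of the mapping tool encoded as a dictionary keyed by dimension, plus a
# helper that summarizes which criteria a given website meets.

MAPPING_TOOL = {
    "Descriptive": [
        "ac/edu affiliation", "co/com affiliation", "other affiliation",
        "date of last update provided", "updated last year",
        "references", "links to other websites",
        "target: high school", "target: university",
    ],
    "Pedagogical": [
        "user-generated content", "exercises with feedback",
        "exercises without feedback", "experiential simulation",
        "symbolic simulation",
    ],
    "Communication": [
        "connection to other users", "links to social media",
    ],
    "Representational": [
        "2D (pictures, illustrations, figures)", "3D (models)", "videos",
    ],
}


def summarize(checked):
    """Return, per dimension, the subset of criteria a website satisfies."""
    return {dim: [c for c in criteria if c in checked]
            for dim, criteria in MAPPING_TOOL.items()}


if __name__ == "__main__":
    # Hypothetical website: an academic simulation site with 3D models.
    example = {"ac/edu affiliation", "symbolic simulation", "3D (models)"}
    for dimension, met in summarize(example).items():
        print(f"{dimension}: {met}")
```

Recording sites in this way makes it straightforward to tally how many websites in a sample satisfy each criterion or dimension.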

Acknowledgements

This research was supported by the Research Fund of The Open University of Israel (grant no. 501045).

