Line up, line up: using technology to align and enhance peer learning and assessment in a student centred foundation organic chemistry module

Barry J. Ryan *
College of Science, Dublin Institute of Technology, Cathal Brugha St., Dublin 1, Republic of Ireland. E-mail: barry.ryan@dit.ie; Fax: +353 1 402 4495; Tel: +353 1 402 4379

Received 30th December 2012, Accepted 6th February 2013

First published on 25th February 2013


Abstract

This paper describes a case study in which three technologies were used in combination to align student learning and assessment. Multiple choice questions (MCQs) were central to all three. The peer learning technologies, Personal Response Devices (a.k.a. Clickers) and PeerWise (http://peerwise.cs.auckland.ac.nz), were implemented to achieve scaffolded, self-directed independent learning that aligned with the assessment methodology through the creation, analysis, answering and discussion of multiple choice questions. Personal response devices enhanced involvement in in-class activities, whilst PeerWise provided structure and support for independent student learning through defined outside-class activities. An associated technology, online MCQs hosted through a secure virtual learning environment, was used as an aligned assessment methodology. The rationale behind this case study, its implementation and its evaluation are described and discussed. Finally, the potential widespread applicability of this aligned, technology enhanced learning and assessment methodology is outlined, along with suggestions and guidelines to aid practitioners wishing to implement a similar approach.


Research question

How effective is the use of technology in assisting peer learning, and in aligning learning and assessment, in a first year, large class, foundation organic chemistry module?

Introduction

Development of an active, engaging and aligned learning environment can be a difficult task for academics; however, the selective inclusion of appropriate technology can enhance student involvement and improve alignment between the learning activities and the method(s) of assessment. It is important that the incorporation of technology does not detract from the pedagogy; instead, it should add to the teaching approach (Watson, 2001). Ideally, the pedagogy and technology should sustain a symbiotic relationship that benefits both the student cohort and the academic (Dunne and Ryan, 2012).

Enhancing learning through technology integration

One such appropriate pedagogy is social constructivism; in this teaching approach the students, facilitated by the academic, work together to build on their existing knowledge and bridge the gaps in their understanding (Palincsar, 1998). Biggs (2002) outlines the principles of a constructivist, aligned curriculum and suggests that the learner “constructs meaning through the learning activities” in a suitable space fostered by the academic. Through the careful use of technology, this space could be a technology enhanced lecture hall or an online virtual learning environment.

Aligning learning and assessment

Assessment is an inescapable fact of education. Although it cannot be removed entirely from a curriculum, subtle changes can result in positive outcomes not only for the student, but also for the academic. Correct alignment of the learning outcomes with the assessment, the assessment strategy itself and the quality of feedback provided to students can all influence students' overall perception of assessment (Gibbs and Simpson, 2004). The view of students is often “what do I have to do to pass the exam?” or “is this topic/concept on the exam?”, which chimes with the commentaries on assessment by Boud (1998) and Gibbs and Simpson (2004).

Academics should endeavour to address this prevalent student opinion by carefully selecting appropriate learning activities and aligned assessments that correctly, and fairly, appraise a student's attainment of the learning outcomes. Without correct alignment the student will question the need for certain topics/group-work/assignments within a course; this questioning can lead to disinterest, lack of motivation and, ultimately, disengagement (Astin, 1999).

Technology enhanced assessment

The use of technology to enhance an assessment can range from very simple automated scoring, through instant feedback provision upon assessment completion, to complex simulation environments that adapt to the participant's level of understanding (Tippins, 2011). Careful integration of an appropriate technology into an assessment can be beneficial to the student and the academic. Students benefit from an alternative assessment approach; for example, e-portfolios can be used to document a student's learning journey in a more accountable and creative manner (Wickersham and Chambers, 2006). A more common example is the use of MCQs with instantaneous scoring and automated feedback enhancing the student assessment experience (Higgins and Tatham, 2003). A technology enhanced assessment, however, need not be restricted to a right/wrong or closed answer model. The use of technology can allow sophisticated questions, with complex answers, to be asked and discussed. For example, assessments based on Web 2.0 technologies (blogs, wikis, web hosted discussion fora, etc.) allow students, and academic moderators, to delve deep into a topic, promoting meaningful social knowledge construction either synchronously or asynchronously (Grosseck, 2009). The ability to engage with an assessment, either individually or collaboratively, anytime and anywhere promotes freedom and autonomy and gives the responsibility for learning back to the student (Bates, 2011). The profit for the academic is a reusable, adaptable and engaging assessment; however, technology enhanced assessments are not a panacea for all assessment issues. The initial learning curve for the academic, the unbalanced workload in terms of resource preparation and the development of appropriate assessment rubrics are all hurdles to be overcome (Tyagi and Kumar, 2011).

Purpose of this research

Case study population

In this case study the effect of integrating three types of aligned technology into the teaching and assessment of a first year, second semester, foundation organic chemistry module was investigated. The module was delivered to a mixed class of students (Level 6, Certificate and Level 8, Honours Degree, based on the Irish National Framework of Qualifications) for two hours per week over a twelve week semester. The module also entailed two hours of aligned laboratory work, and the module assessment weighting was split evenly between the lecture and laboratory components. The module's primary aim was to allow the students to develop their understanding of the nomenclature, classification, structure and properties of common organic compounds. Additionally, students investigated the fundamental reactions and syntheses of organic compounds, leading to a comprehension of the underlying reaction principles at a theoretical and practical level and, subsequently, an ability to predict simple organic reactions. The module was contextualised to the various groups within the class (pharmaceutical, food and nutraceutical) and followed on from introductory chemistry modules delivered concurrently and in the previous semester.

Rationale for change

After a number of deliveries of this module, several recurring issues became apparent. The students, although they engaged with in-class and out-of-class non-graded written activities (e.g. individual written worksheets) throughout the semester, did not perform as well as expected in the traditional terminal exam. Furthermore, although in-class activities were carried out, the overall level of social knowledge construction (e.g. pair-sharing or group work) was poor, as the students could not see the point in carrying out the activities. There was some ad hoc social knowledge construction around the outside-class activities (e.g. over coffee or during breaks in class); however, this was limited to a small number of the class. Finally, it was difficult to pace the lectures appropriately: the students were receiving an aligned, concurrent basic chemistry module, which meant some topics required additional time and others less. Gauging overall student prior knowledge or current understanding was challenging for this large (n = 139) and mixed-background class.

Pedagogical change: technology enhanced peer learning and assessment

In order to address the deficiencies listed above, the module was redesigned in line with best quality assurance practices within the Institute. As per standard Institute practice, students who had just completed the module provided feedback on its strengths and weaknesses. Inclusion of student input into the redesign of a module is important: Barnett and Coate (2005) note that students must be actively engaged in curriculum development in order for positive outcomes to be achieved within the student population. This is most effectively achieved by including students as integral parts of curriculum (re)design and as key drivers of the “living curriculum” (Barnett and Coate, 2005, p. 2). Student feedback, along with personal and colleague observations, provided the foundation upon which to build the redesigned module. Three major module changes were enacted, mirrored by the integration of three new and aligned technologies (see Table 1 and Fig. 1).
Fig. 1 Schematic of the interrelationships between the three technologies, their uses and the activities associated with each.
Table 1 Comparison of the old and new learning and assessment approaches described in this case study. The learning outcomes addressed by the new approach and the rationale behind the changes are also indicated
Component Old method New approach Learning outcomes Rationale
In class activities Paper based, lecturer led discussion. Technology based (Clickers), peer debate and student centred discussion. Be capable of working effectively in pairs and larger groups.

Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties.

Understand the main organic chemical reactions and the reaction mechanisms underlying these processes.

Clicker-based MCQs probed student understanding on topics ranging from simple theoretical concepts, through correct classification and nomenclature identification, to higher order reaction prediction, in an environment which promoted active, student centred learning.

The use of technology encouraged peer interaction, allowed immediate collation of student responses and provided a visual representation of the overall ‘trend’ of responses, which could be used to initiate a student centred discussion and clarify areas of misconception (a minimal collation sketch follows Table 1).

Independent learning Recommended reading lists supplemented with minimal activities. Scaffolded online (PeerWise), asynchronous learning activities with peer feedback. Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties.

Be capable of applying the basic skills in organic chemistry to problem solving.

Students designed MCQs and provided feedback for their peers based on topics covered in the preceding lectures. The online activity provided structure for independent student learning; typically, student questions focussed on areas that required additional independent learning. Students engaged asynchronously, with minimal lecturer assistance, and fostered a supportive online class community.
Assessment One and a half hour exam. Students answered two questions from a choice of four. Contributed 80% of the module assessment weighting. Exam replaced with three small-stakes MCQs and one high-stakes MCQ distributed evenly over the course of the semester. Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties.

Understand the main organic chemical reactions and the reaction mechanisms underlying these processes.

Be capable of applying the basic skills in organic chemistry to problem solving.

Questions were both text and image based; this permitted assessment of theoretical, conceptual and nomenclature understanding. Students were provided with immediate feedback and their score upon completing each MCQ. This allowed students to structure their learning path and, with the assistance of the feedback, focus on areas of misconception. In this way students could develop their understanding both before and after an assessment.

Equal distribution of the CA MCQs reduced student stress associated with terminal exams and ‘chunked’ the learning, providing students with an opportunity for deep learning. These MCQs were aligned to the ‘chunked’ curriculum, whereas the high-stakes MCQ was synoptic in nature.
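
To make the ‘immediate collation’ and trend display described in Table 1 concrete, the short Python sketch below tallies a set of Clicker responses and prints a simple response distribution of the kind used to seed the post-question discussion. This is a minimal illustration only: the hard-coded responses and text bar chart are assumptions, since in practice the vendor's receiver software collates the responses and renders the chart.

```python
# Minimal sketch: collate Clicker responses and display the overall 'trend'.
# The response list is illustrative; real responses arrive via the Clicker
# vendor's receiver software, not this code.
from collections import Counter

responses = ["B", "B", "A", "C", "B", "D", "B", "A"]  # one answer per group
tally = Counter(responses)

for option in "ABCD":
    count = tally.get(option, 0)
    pct = 100 * count / len(responses)
    print(f"{option}: {'#' * count:<10} {pct:5.1f}%")
```

A display such as this, shown at the ‘reveal’, lets the class see at a glance where misconceptions cluster before the student led discussion begins.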



Pedagogical evaluation methodology

Pedagogical evaluation followed best ethical practices and conformed to the Institute's Research Ethics Guidelines (DIT Research Ethics Committee approval number: 65/10). The data collected took several forms: an anonymous multiple choice questionnaire (n = 130), an independent academic facilitated discussion forum (n = 15), an anonymous evaluation sheet (n = 120), an anonymous standard institute module review form (n = 44) and a personal reflective diary (n = 1). All data were collected once the students had completed the module, with the exception of the reflective diary, which was recorded on an on-going basis. The reflective diary recorded ‘informal’ discussions with students, personal observations and comments; students were asked for verbal consent to allow the researcher to record an interesting or relevant point raised during an informal discussion. Qualitative data were coded into several key themes and sub-themes based on researcher interpretation. Data triangulation was carried out during qualitative theme coding to ensure that only valid themes were investigated and that the examples and findings were based on feedback from as broad a student base as possible.

Limitations

This study was carried out at a single institution, focusing on a single module. Additional studies can be carried out to investigate the applicability of this approach in other education settings and levels.

The researcher was also the lecturer involved in delivering both the theoretical and practical elements of this module. Pedagogical evaluation data were collected anonymously where possible (written reflections or online survey) or by an independent colleague (discussion forum); however, student and participating researcher bias cannot be totally discounted.

Pedagogical evaluation results

The data collected were classified into the general themes discussed below, covering both positive and negative aspects of the student learning experience (see Table 2).
Table 2 A summary of the module evaluation described in terms of the five themes for the three aligned technologies, indicating both positive and negative aspects of the student learning experience
Positive aspects Negative aspects
Alignment to assessment
Clickers Familiar format that mimicked the CA format.

Prepared students for graded CA.

Immediate and appropriate feedback on areas of misconception.

Limited number of Clickers; did not map onto the individual graded CA approach.
PeerWise Students developed an appreciation for the MCQ format.

A ‘safe’ learning space where learning from mistakes was encouraged and not penalised.

Question quality assurance requires constant peer and regular lecturer moderation.
MCQ CA Small-stakes MCQs:

 Viewed as learning events, not just assessments.

 Built student confidence in their abilities over the semester.

 Students became familiar and comfortable with the assessment technology.

Spread the workload over the semester and improved preparedness for the high-stakes MCQ.

High-stakes MCQ:

 Less stressful than a traditional terminal exam.

Engagement
Clickers Cited in 75% of the anonymous responses as being one of the top five things about the module.

Broke the lecture into smaller sections, which aided in maintaining student attention.

Allowed students to become active and involved in the lecture hall.

Anonymous involvement; ‘Safe’ learning space.

Students perceived their involvement as important to class and peers.

Novelty factor can wear off if overused.
PeerWise A high level of interaction was noted:

71% of registered students engaged to some level; 60% of students asked the minimum number of questions (3) and 66% answered the minimum number (3).

Can distract students from other elements of the module and/or other modules.

Small assessment percentage weighting in comparison to other CA elements.

MCQ CA Positive attitude to the assessment strategy was reflected in all forms of student evaluation and in the high interaction rates; 88% average participation in the small-stakes MCQs and 96% in the high-stakes MCQ.

Students saw the small-stakes MCQs as a challenge with a double reward (improved understanding and assessment marks).

The regularly occurring small-stakes MCQs sometimes clashed with CA from other modules. For example, MCQ1 was attempted by 97% of the class and MCQ2 by 94%, whilst MCQ3 was attempted by only 73%; this was due to a clash between MCQ3 and CAs from other modules.
Gamification
Clickers Provided both a challenge and reward where students worked as a team to solve problems.

Viewed as game based learning through group based activities.

Can distract from the learning outcome of the Clicker question as students focus on the game rather than the content.
PeerWise Inherently game based (e.g. score keeping, rewards for attainment of selected criteria). Can become addictive to the detriment of other elements of the module as some students attempt to achieve the highest score within PeerWise.
MCQ CA Promoted personal sense of achievement and pride.
Peer learning
Clickers Assisted in the promotion of peer learning:

60% of students preferred to work in groups for Clickers activities.

65% of students stayed ‘on task’ during the activity.

One third of students preferred to work alone and not engage in collaborative learning.

Poor experience of group work discouraged some students from participating in peer learning activities.

PeerWise Noted, in the majority, as being very beneficial to peer learning.

Students preferred to answer peer questions rather than ask.

The largest number of questions posted by an individual student was 11, whilst the largest number of questions answered was 525.

The final number of questions in the database was 564.

Additional peer-provided feedback and further question discussion were noted in 422 cases within the database.

Students that were unfamiliar with group work and peer learning struggled initially.

An additional log-on was required to gain access to the PeerWise course, which can discourage student participation.

Some ‘lurking’ observed.

Passive nature of a small number of students noted; they were unwilling or unable to take responsibility for their learning.

Student responsibility for learning
Clickers 90% of students commented that they were more focussed, or concentrated more, when there was a Clicker in-class activity.

90% of students described the post-Clicker discussion as being the most important aspect of a Clicker activity.

Provided guidance for independent student learning activities.

Potential for student distraction during discussion time, particularly in large classes. Discussion was initially lecturer led but moved to being student led as student confidence and appreciation of the pedagogy increased.
PeerWise A familiar feel, similar to other social networking sites, appealed to students and allowed them to feel comfortable assisting each other to learn.

Tagging of topics assisted question selection, which often focussed on problem areas.

Student interaction, and engagement, within PeerWise does not necessarily result in improved student responsibility for learning.
MCQ CA Allowed students to take control of their assessment, as each student could sit their uniquely randomised assessment at any time within a one-week window. Students must have online access to complete the MCQs; where this is not available, suitable facilities (e.g. a computer laboratory) must be pre-booked for students, which removes student autonomy.
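
The ‘uniquely randomised assessment’ noted in the final row of Table 2 can be reduced to a reproducible random draw from a question bank, which most VLEs perform automatically. The Python sketch below is a minimal illustration under stated assumptions: the bank size, paper length, function name and the use of the student number as a seed are all hypothetical, not the VLE's actual mechanism.

```python
# Hypothetical sketch of drawing a 'uniquely randomised' MCQ paper from a
# question bank. Seeding with the student number makes each student's draw
# unique and reproducible; bank size and paper length are illustrative.
import random

def draw_paper(bank, student_number, n_questions=20):
    rng = random.Random(student_number)   # per-student reproducible draw
    return rng.sample(bank, n_questions)  # unique subset, in random order

bank = [f"Q{i}" for i in range(1, 101)]   # stand-in for a 100-question bank
print(draw_paper(bank, "D12345678")[:5])  # first five questions of one paper
```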


Alignment to assessment

The central purpose of the redesigned module was to implement an aligned learning and assessment strategy. This was heavily influenced by the use of technology, in both the learning and assessment elements. As such, it was crucial that the students became comfortable with the technologies and at ease in their use as assessment tools. Additionally, their continual use provided peace of mind to the academic that the technologies could be relied upon in an assessment situation. Students developed their understanding, as well as their confidence, over the course of the semester through engagement with the peer learning technologies, which resulted in more prepared and relaxed students attempting the graded continual assessment (CA) components. Student comments included “They [Clicker activities] were like a mini-test every week, without the stress” and “after using PeerWise I learnt that I had to take my time and investigate each option before selecting and this helped me in my graded MCQs”; one student commented that overall “the pressure is less as you have practiced and prepared so much with Clickers, PeerWise and the small MCQs, you know you're ready for it [high stakes MCQ]”.

Engagement

Engaging students inside, and particularly outside, the classroom can be difficult to achieve (Summerlee, 2010). One method is to provide aligned activities for students to work on either alone or, as in this case, in groups. The technologies used in this case study provided a way for the students to engage with the learning activity with minimal additional academic workload. Student–student engagement was central: students engaged with each other and with the content through questioning, answering, commenting on and discussing MCQs. The introduction of technology into the module had an overwhelmingly positive effect on students; this was mirrored in the data collected by all methods. In particular, students enjoyed the use of Clickers in the classroom, although perhaps some of this enjoyment could be attributed to a novelty factor. Many students simply stated that Clickers “added some fun to the classroom” and that using Clickers was “something different” compared to a more traditional didactic lecture. Care must be taken to use technology enhanced learning activities where appropriate and not to overuse them. For example, in this case study Clicker MCQs were based on the topics discussed in the previous fifteen minute section and provided a safe environment for the students to use their knowledge to answer a question. Students appreciated this and commented on how Clickers encouraged them to actively use their knowledge; one student commented “I like doing Clickers exercises in class, it keeps me engaged. It's good when you first hear the lecture and then put it into practice”. Furthermore, the student anonymity afforded by the technology allowed quieter students to engage without fear or embarrassment, and reassured students that every opinion counted: “I felt like my response was important to the class”. Finally, the ‘anytime, anywhere’ nature of the asynchronous PeerWise learning and assessment activities suited the lifestyles of the students, many of whom juggled part-time work with college studies, and was well received in the module evaluation.

Gamification

Console based gaming has developed rapidly in recent years, with several companies offering high resolution, interactive and engaging games (for example, Nintendo Wii, Sony Playstation, Microsoft Xbox; Prakash et al., 2011). Although educational activities could never compete on the same entertainment level as these consoles, in this case study students did note associated feelings of challenge, empowerment and reward upon completing either a single MCQ or an entire set. For example, students noted that the Clickers introduced a challenge and reward system into the classroom: “Clickers made class more exciting as we wanted to get the right answer”. Furthermore, many students recommended including additional Clicker ‘games’ (e.g. class A vs. class B) in the suggestions section of the anonymous evaluation sheet.

PeerWise inherently contains gaming elements, ranging from simple score keeping to rewards for the attainment of selected criteria. Echoing video games, the format encourages the user to continually engage with the content, leading the user on to the next question and deeper into the ‘game’ and the subsequent learning spiral. Student comments reflected this: “PeerWise felt like more of a game than an exercise!” Students felt proud of their achievements within the different ‘games’; for example, their group answering a Clicker quiz correctly, receiving a reward badge in PeerWise or achieving a good score in an online MCQ assessment. Some students became very involved in the ‘game’ and the attainment of the ‘reward’, which subsequently led to deep independent learning. For example, within PeerWise the reward for some students was the peer rating attached to each question; students competed to have the highest rated questions. Additionally, and in line with other social media platforms, users could follow people whose contributions they liked. As PeerWise is anonymous, students followed students whose questions (style or content) they liked. Again, this was an indication of status within the PeerWise environment: “I spent a lot more time reading about the topics than I did for other modules because I was trying to come up with really good questions. I liked to get a good rating for my questions and have people follow me. It meant that they enjoyed my questions and learnt something from them”. The importance of peer learning was also a common theme noted in all forms of the student module evaluation.

Peer learning

Students enjoyed the ability to discuss relevant topics with each other both inside and outside class. Once correctly facilitated by the lecturer, peer learning took place naturally in either environment as students worked through a problem together. One student comment noted the benefit of working with peers: “different opinions were introduced which opened my mind to new ways of thinking about a topic”. However, the majority of student comments focussed on the benefit of the asynchronous peer learning enabled by PeerWise. Within PeerWise students worked and learned ‘together’ asynchronously; although an apparent contradiction, students benefitted from almost constant peer support through feedback, comments and additional online assistance (grading of questions, tagging of topics, etc.). One student comment succinctly summarised the benefits of PeerWise: “It was like having a teacher with you at home when you logged on”. Students actively supported each other in question design, creation and feedback, and additional commentary was often provided in an attempt to clear up any misunderstandings about a question. A small number of students questioned the benefit of working with peers; although a mix of learning styles will be present in every class, this preference for solo work may be a hangover from the second-level education system which most of these students experienced. In that system many students are ‘spoon-fed’ information by their teacher, with little time provided for peer discussion or constructive learning (Scharle and Szabo, 2000). These students have simply not experienced the social constructivist pedagogical paradigm and may be unwilling to try it. Typically, these students did not take responsibility for their learning; instead, they passively relied on the academic to provide all the relevant information.

Student responsibility for learning

The redesign of the module around technology enhanced learning and continual assessment components appealed to most students and encouraged them to become responsible and autonomous learners. Across all forms of module evaluation, students noted that the technology enhanced learning activities were an engaging way to apply their knowledge in a safe environment. As part of these activities the student groups actively took responsibility for their learning by assisting each other through the construction, answering and discussion of multiple choice questions. Students commented that in-class discussion and online feedback assisted them in developing their personal understanding and highlighted areas that required additional, independent and self-directed learning.

PeerWise provided clear evidence of students taking responsibility for their learning. PeerWise was run as a student-centred and student controlled online environment. Although only a small assessment weighting (4%, on a sliding scale) was attached to minimal engagement (asking six questions and answering six questions), the majority of students interacted far beyond the minimum; PeerWise instead acted as a place for students to interact with each other and allowed peers to assist each other in their learning. Students often created questions for their peers on topics that they themselves struggled with; to create these questions, students carried out independent learning to deepen their understanding. This approach of focussing on problem areas allowed students to take responsibility for their learning in small, defined blocks. The use of technology offered a dissemination method for a student's study, a way to showcase their learning and to assist others who were struggling with similar problems. One student comment highlights the development of student responsibility: “It [PeerWise] makes you think for yourself, not having a lecturer always telling you the answer shows us how much we actually know and can do on our own”.
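
As a practical aside, a sliding-scale weighting of this kind reduces to a simple capped calculation when entered into a gradebook. The Python sketch below is a minimal illustration under stated assumptions: the study specifies only the 4% weighting and the six-question engagement minima, so the linear scale, the even split between asking and answering, and the function name peerwise_ca_percent are hypothetical.

```python
# Hypothetical sketch of a sliding-scale PeerWise CA calculation.
# Only the 4% weighting and six-question minima come from the case study;
# the linear scale and even asking/answering split are assumptions.

def peerwise_ca_percent(asked, answered, minimum=6, max_weight=4.0):
    """Return a CA percentage between 0 and max_weight for one student."""
    asked_credit = min(asked, minimum) / minimum        # capped at 1.0
    answered_credit = min(answered, minimum) / minimum  # capped at 1.0
    return max_weight * (asked_credit + answered_credit) / 2

print(peerwise_ca_percent(asked=11, answered=525))  # 4.0, minima fully met
print(peerwise_ca_percent(asked=3, answered=6))     # 3.0 under these assumptions
```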

Discussion

As outlined in the rationale, the reasons behind the module changes were several-fold; however, the common theme was to improve the learning experience of the students. This was achieved through aligned learning and assessment (e.g. linking lecture content to the subsequent technology enhanced learning activity and associated assessments), improved organisation (e.g. scaffolding of the content and use of technology to highlight areas of misconception), improved assessment feedback (e.g. formative feedback after completing an MCQ), blended learning (e.g. increased use of the Virtual Learning Environment and other online resources) and the use of engaging technologies (e.g. Clickers and PeerWise).

Despite a positive evaluation by students and observed improvements in student engagement with, and preparedness for, the redesigned assessment strategy, students will always be heavily influenced by how a module is assessed. Ramsden (1992, p. 187) noted that “from [the] students' point of view, assessment always defines the actual curriculum”, and in many ways this is still true for the redesigned module described here. Although the majority of students appreciated the technology enhanced, aligned learning activities and the re-designed assessment strategy, a small number of students remained ‘slaves’ to the assessment. One student comment, collected in the anonymous survey, highlighted this: “The lack of an end of module exam really influenced my lecture attendance; I didn't attend as many lectures as I would have normally if there was an exam at the end”.

Overall, in this study at least, student responses indicated a positive experience of the re-designed module: engaging in technology enhanced activities both inside and outside class, identifying their learning gaps and using social technologies to take ownership of their learning: “Initially I thought this module was going to be impossible until I saw the breakdown of the module; how we were going to learn and be assessed. This made it much more do-able”. With correct alignment of the curriculum, through suitable learning activities, to the assessment (and not the other way around), students were encouraged not to see the assessment solely as the principal outcome of the module. An important aim of assessment is to “engage students in intellectually challenging tasks that are realistic and relevant in the context of a discipline” (Webster, 2007, p. 2). The academic must define suitable assessments that seek to uncover the student's true understanding of the module and achievement of the learning outcomes. By maintaining a level of challenge, reality and relevance in the assessment, its benefit will be more obvious to the student. In this study several students commented on the challenge (and reward) of MCQ based learning and assessment. The level of challenge and reward must align to the standard of the student: nobody would play a game that was too easy or too hard; there must be scope for success. MCQs are often considered a low level assessment, based on fact regurgitation; Gibbs (1992, p. 10) outlined the potential issues with such an assessment:

Assessment systems dominate what students are oriented towards in their learning… students often recognise that what is really necessary is to memorise

To avoid this situation, careful learning activity and assessment design is required. In this study students commented that they could not answer the assessment MCQs just by learning the notes; they had to apply their knowledge. The inclusion of higher order skills in the MCQ design can elevate this learning and assessment method above fact regurgitation and a memory game. Encouraging active student participation in the MCQ process, through question and feedback design, can further heighten the cognitive processes used by the student and subsequently deepen their learning (see Table 3 and the sketch that follows it).

Table 3 Examples of a student generated (Sample MCQ 1; hosted on PeerWise) and a lecturer generated (Sample MCQ 2; formed part of a small-stakes MCQ) multiple choice question. The percentage of students that selected each answer option is noted, as is the feedback the student received after completing the question. The PeerWise site also allows students to comment on questions, and the comments associated with the example question are included
Sample MCQ 1 What is the major product of the dehydration of 2-methyl-2-butanol?
Answer options 2-Methyl-1-butene 2-Methyl-2-butene The reaction will produce equal amounts of 2-methyl-1-butene and 2-methyl-2-butene 2-Methyl-butane
Peer responses (16%) (50%) (18%) (16%)
Feedback In dehydration reactions, a molecule of water is eliminated from an alcohol molecule by heating the alcohol in the presence of a strong mineral acid. A double bond forms between the adjacent carbon atoms that lost the hydrogen ion and hydroxide group. The major product will be the most highly substituted alkene; i.e. the product with the fewest H substituents on the double bonded carbons (based on Zaitsev's rule).
Peer comments ‘Good question and explanation’.

‘That tricky Zaitsev!! Good question’.


Sample MCQ 2 Fill the gaps in the following sentence: There are _ sigma and _ pi bonds in the H2C=C=CH2 molecule
Answer options 4, 2 6, 2 2, 2 2, 6
Student responses (10%) (83%) (2%) (5%)
Feedback In this molecule the outer C atoms are each bonded to 2 H atoms by single (sigma) bonds, and each outer C atom is joined to the middle C atom by a double bond that contributes one further sigma bond, making the total number of sigma bonds equal to 6. Each double bond also contributes one pi bond, giving the 2 pi bonds that join the middle C atom to the outer C atoms.
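
To illustrate the mechanics behind Table 3, the following minimal Python sketch models a four-option MCQ with a single correct answer and automated feedback on completion, mirroring Sample MCQ 1. The MCQ class and its grade() helper are illustrative assumptions; PeerWise and the Institute's VLE each store and grade questions in their own internal formats.

```python
# Minimal sketch of a four-option MCQ with automated scoring and feedback,
# modelled on Sample MCQ 1 in Table 3. The data structure is an assumption,
# not the PeerWise or VLE question format.
from dataclasses import dataclass

@dataclass
class MCQ:
    stem: str
    options: list   # four answer options, one correct
    correct: int    # index of the correct option
    feedback: str   # shown to the student after answering

    def grade(self, choice):
        """Return (is_correct, feedback) for a single response."""
        return choice == self.correct, self.feedback

q = MCQ(
    stem="What is the major product of the dehydration of 2-methyl-2-butanol?",
    options=["2-Methyl-1-butene", "2-Methyl-2-butene",
             "Equal amounts of both alkenes", "2-Methyl-butane"],
    correct=1,
    feedback="The major product is the most highly substituted alkene "
             "(Zaitsev's rule).",
)
print(q.grade(1))  # (True, '...') -- the option chosen by 50% of peers
```

Per-option feedback, as used in the graded CA MCQs, would simply replace the single feedback string with one string per option.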


The importance of feedback featured heavily in all sources of student module evaluation and is consistent with the work of Higgins et al. (2002), which noted the positive impact feedback has on students in higher education. Higgins and colleagues noted that the modern student in higher education is highly motivated and will actively seek feedback as a means to improve their understanding of the content and to engage with their subject in a ‘deep’ way. In this study feedback was available through many avenues from two primary sources: the lecturer and peers. Initially the students depended heavily on lecturer feedback; however, with time and experience students became accustomed to providing and receiving peer feedback. In this study the relationship between lecturer and student group evolved from Wood's (1987, p. 242) symbiotic relationship, where ‘the teacher and student collaborate actively to produce a best performance’, to a more student-centred collaboration with the academic acting as facilitator and background moderator.

Recommendations for practice

Redevelopment of a module to align learning activities and assessment takes time. The inclusion of technology may speed up this process and should not be considered an inhibitory factor (Dunne and Ryan, 2012). A mixture of free and purchased technologies is outlined in this case study; however, there are free or cheaper alternatives to the purchased technologies (e.g. Socrative (www.socrative.com) could substitute for Clickers, and Moodle could substitute for a purchased VLE). Whichever technology is chosen, there will be a learning curve associated with it. The academic should ensure that they are comfortable with each technology before introducing it to the class. This is particularly important when dealing with graded assessments for large classes; any problems (e.g. a non-functional MCQ) will be magnified, resulting in frustrated students and additional stress for the academic. Table 4 describes the practical use of each technology and Table 5 outlines some technology specific recommendations for practitioners interested in applying this approach, based on findings from this study.
Table 4 Suggested uses, formats, continual assessment weighting and academic tasks associated with implementing the aligned technology enhanced learning and assessment approach described in this case study
Activity Timing Duration Format Student anonymity Continual assessment weighting (%) Academic tasks
a Note: The remaining 50% continual assessment is based on laboratory work, written laboratory reports and a laboratory skills examination.
In class activities

(Clickers)

Every second or third lecture. 30–45 minutes MCQ based.

Four option answers and one correct answer per question.

Peer debate and student led discussion.

Student responses were anonymous to non-group peers and the lecturer. 0 Preparation of MCQ slides addressing specific learning outcomes. Distribution of Clickers at start of class. Facilitation of student led discussion before and after revealing the correct answer. Collection of Clickers at the end of each class.
Independent learning

(PeerWise)

Ongoing throughout the semester Typically 1–2 hours per week MCQ based.

Four option answers and one correct answer per question.

Peer provided feedback and commentary.

Student responses were anonymous to peers. The lecturer could review individual student responses. 4 Creation of a PeerWise course within the PeerWise website. Enrolment of students into the PeerWise database. ‘How to use PeerWise’, ‘How to write appropriate MCQs’ and ‘How to provide suitable peer feedback’ workshop facilitation. Support screencasts are also available on the PeerWise website. Online moderation and monitoring. Calculation of student continual assessment percentage.
Small-stakes assessment MCQs Distributed evenly throughout the semester 1 hour per quiz MCQ based.

Four option answers and one correct answer per question.

Lecturer provided feedback.

Student responses were not available to peers. The lecturer reviewed and monitored individual student responses. 3 × 7 Preparation of MCQ database based on best practice that addresses the specific learning outcomes from the preceding lectures and independent learning activities (Roberts, 2006). Generation of specific feedback for each answer within the database to enhance student understanding. Testing the MCQ and feedback database. Release and closure of the MCQ as per pre-determined assessment schedule. Calculation of student continual assessment percentage.
High-stakes assessment MCQs During the last teaching week of the semester 1 hour MCQ based.

Four option answers and one correct answer per question.

Lecturer provided feedback.

Student responses were not available to peers. The lecturer reviewed and monitored individual student responses. 25 As per ‘Small-stakes assessment MCQs’. An additional duty is student invigilation during the MCQ.


Table 5 Recommendations for practice based on this case study
Clickers Decide before the introduction of Clickers as a learning activity:
 1. How the Clickers logistics will be addressed:
 (a) Students purchase a Clicker each: this works best if the Clicker will be used in a number of modules over the student's studies; however, students can forget to bring them to class, which may disrupt the lesson plan.
 (b) Students borrow a Clicker on long-term loan from the institute's library: a cheaper alternative for the student; however, this requires a large initial financial outlay on the institute's behalf.
 (c) A Clicker is allocated to small student groups during class: this approach was used in this case study and worked well. Distribution and collection of Clickers at the start and end of class can take some time; however, over the course of the semester this improved as students became used to the set-up.
 2. How the Clickers will be used in the classroom:
 (a) Clickers can be used individually or in groups (as in this study).
 (b) Clickers can be used as part of an assessment approach or anonymously (as in this study). In this study students preferred the anonymous approach, citing that they could make mistakes and not feel embarrassed.
 3. How the Clickers will enhance the learning experience:
 (a) Detailed statistical analysis can be carried out on each question by the Clicker software; however, by far the most important aspect noted in this case study was the ‘reveal’, where the correct answer was shown, the bar chart of student responses was displayed and the question (and all answer options) was discussed by the students. The instantaneous data collation and display afforded by the Clicker technology should be used to enhance the learning experience.
After the introduction of Clickers as a learning activity:
 1. It is recommended not to overuse Clicker activities. In this study a Clicker activity took place every second or third lecture, and this maintained student enthusiasm for the learning activity.
 2. Supplement Clicker activities with additional appropriate activities (e.g. paper based structure drawing).
 3. Mix up the type of Clicker activity on a regular basis (e.g. Class A vs. Class B, males vs. females).
PeerWise Before the introduction of PeerWise as a learning activity:
 1. Create a ‘course’ within the PeerWise website. Upload all students to this site, using their student numbers as unique identifiers within the system. As interaction is anonymous within PeerWise, these unique identifiers can be used to monitor student activity and engagement, particularly if these form part of the assessment criteria (as in this case study).
 2. Before students begin to generate questions and feedback, it is recommended to hold a tutorial session on appropriate MCQ and feedback preparation. This will ensure a higher standard of question (and feedback) in the database.
After the introduction of PeerWise as a learning activity:
 1. Over time the database will develop and students can begin to take ownership of this environment. However, a low level academic presence is suggested; student activity can be monitored and any administration queries (e.g. forgotten passwords, questions flagged as inappropriate, etc.) can easily be dealt with within the PeerWise environment.
 2. Detailed analytics of student interaction can be downloaded as a spreadsheet file for further analysis, particularly if engagement forms part of the assessment (a minimal analysis sketch follows this table).
Online MCQs Before the introduction of online MCQs as an assessment activity:
 1. Preparation of good quality, challenging online MCQs takes time. Provision of feedback with each MCQ enhances the student learning experience, but again takes time. Ensure a suitable development timeline is put in place. The time input is front-loaded and once the quizzes are set up they can be automatically graded by the hosting VLE.
 2. It is important to test each question within each quiz, preferably in conjunction with a colleague, to ensure the question is suitably challenging, the distracter options are appropriate and the correct answer is indicated in the database.
 3. Question sets can be written collaboratively with other academics if the module is co-taught, and this will reduce the workload. Question sets can be written in a word processing package (e.g. Microsoft Word), reviewed and then transferred to the VLE. It is essential to double check that no errors occur during the transfer.
After the introduction of online MCQs as an assessment activity:
 1. Invariably at least one typo will make it through the academic question screen, and students should be encouraged to report these glitches once they are discovered. Most VLEs allow for question editing and score correction after a quiz has been completed.
 2. In this study, student questions posted in PeerWise that were of a suitable standard were included in the high-stakes MCQ. This encouraged students to write high standard questions during their learning activities.
 3. Additional questions can be deposited into the question database on an ongoing basis by the academic, for example, based on questions discussed in class. This will keep the database alive and relevant to the student cohort.
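
As flagged in the PeerWise recommendations above, the downloadable analytics spreadsheet can be processed programmatically when engagement carries an assessment weighting. The short pandas sketch below is a minimal illustration: the filename and column names (questions_written, questions_answered) are assumptions, and the actual export headers should be checked before use.

```python
# Minimal sketch: check PeerWise engagement minima from an analytics export.
# The CSV filename and column names are assumptions; inspect the real export.
import pandas as pd

MIN_ASKED, MIN_ANSWERED = 6, 6  # minima used in this case study

df = pd.read_csv("peerwise_export.csv")
df["met_minimum"] = ((df["questions_written"] >= MIN_ASKED) &
                     (df["questions_answered"] >= MIN_ANSWERED))

print(f"{df['met_minimum'].mean():.0%} of students met the engagement minima")
print(df.sort_values("questions_answered", ascending=False).head())
```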


Conclusions

In this study, three technologies were used in an aligned and orchestrated manner to enhance the student learning and assessment experience in a foundation organic chemistry module. The technologies used in the learning activities encouraged students to work collaboratively and socially to construct their knowledge. These learning activities were aligned to the continual assessment methodology, which was also technology based. Student evaluation of the technology integration was, in the majority, positive; from an academic perspective, increased engagement and student responsibility for learning were observed. Overall, students enjoyed learning with their peers in a safe, technology enhanced environment.

Acknowledgements

The author gratefully acknowledges the assistance of Dr Julie Dunne, School of Food Science and Environmental Health, Dublin Institute of Technology, in the preparation of this work. In addition, seed funding from the Learning, Teaching and Technology Centre, Dublin Institute of Technology, was used to support the initial use of Clickers and subsequent project funding permitted redevelopment of this module.

Notes and references

  1. Astin A. W., (1999), Student involvement: a developmental theory for higher education, J. Coll. Stud. Dev., 40, 518–529.
  2. Barnett R. and Coate K., (2005), Engaging the Curriculum in Higher Education, Berkshire, UK: Open University Press.
  3. Bates T., (2011), Understanding Web 2.0 and its Implications for E-Learning, in Lee M. J. W. and McLoughlin C. (ed.), Web 2.0-based e-Learning: Applying Social Informatics for Tertiary Teaching, Hershey, PA: Information Science Reference (an imprint of IGI Global), pp. 21–42.
  4. Biggs J., (2002), Aligning the Curriculum to Promote Good Learning, Constructive Alignment in Action: Imaginative Curriculum Symposium, LTSN Generic Centre.
  5. Boud D., (1998), The role of self-assessment in student grading, Assess. Eval. Higher Educ., 14, 21–30.
  6. Dunne J. and Ryan B., (2012), Harnessing technology to make learning (and teaching) more fun, in Proceedings of the International Conference on Engaging Pedagogy, Dublin December 14th.
  7. Gibbs G., (1992), Improving the Quality of Student Learning, Bristol: TES.
  8. Gibbs G. and Simpson C., (2004), Conditions under which assessment supports students' learning, Learn. Teach. Higher Educ., 1, 3–31.
  9. Grosseck G., (2009), To use or not to use web 2.0 in higher education? Procedia Soc. Behav. Sci., 1, 478–482.
  10. Higgins R., Hartley P. and Skelton A., (2002), The conscientious consumer: reconsidering the role of assessment feedback in student learning, Stud. Higher Educ., 27, 53–64.
  11. Higgins E. and Tatham L., (2003), Exploring the potential of multiple-choice questions in assessment, Learn. Teach. Action, 2, 1–12.
  12. Palincsar A. S., (1998), Social constructivist perspectives on teaching and learning, Annu. Rev. Psychol., 49, 345–375.
  13. Prakash E., Wood J., Li B., Clarke M., Smith G. and Yates K., (2011), Games technology: console architectures, game engines and invisible interaction, in Prakash E. (ed.), Proceedings of the 4th Annual International Conference on Computer Games, Multimedia and Allied Technology, pp. 103–108.
  14. Ramsden P., (1992), Learning to Teach in Higher Education, London: Routledge.
  15. Roberts T. S., (2006), The use of multiple choice tests for formative and summative assessment, in Tolhurst D. and Mann S. (ed.), Conferences in Research in Practice in Information Technology, vol. 52, Eighth Australasian Computing Education Conference (ACE2006), Hobart, Tasmania, Australia.
  16. Scharle A. and Szabo A., (2000), Learner Autonomy: A Guide to Developing Learner Responsibility, Cambridge: Cambridge University Press.
  17. Summerlee A. J. S., (2010), Challenge of engagement inside and outside the classroom: the future for universities, in De Corte E. and Fenstad J. E. (ed.), From Information to Knowledge; from Knowledge to Wisdom, London: Portland Press, pp. 67–78.
  18. Tippins N. T., (2011), Overview of Technology Enhanced Assessments, in Tippins N. T. and Adler S. (ed.), Technology-Enhanced Assessment of Talent, CA, USA: Jossey-Bass (a Wiley imprint), pp. 1–19.
  19. Tyagi S. and Kumar K., (2011), Web 2.0 for teaching, learning and assessment in higher education: a case study of universities in Western Uttar Pradesh (India), Int. J. Libr. Inf. Sci., 3, 230–241.
  20. Watson D. M., (2001), Pedagogy before technology: re-thinking the relationship between ICT and teaching, Educ. Inf. Technol., 6, 251–266.
  21. Webster H., (2007), The Assessment of Design Project Work, CEBE Briefing Guide, 9, pp. 1–10.
  22. Wickersham L. and Chambers S., (2006), ePortfolios: using technology to enhance and assess student learning, Education, 126, 738–746.
  23. Wood R., (1987), Measurement and Assessment in Education and Psychology, Lewes: Falmer Press.
