Barry J. Ryan*

*College of Science, Dublin Institute of Technology, Cathal Brugha St., Dublin 1, Republic of Ireland. E-mail: barry.ryan@dit.ie; Fax: +353 1 402 4495; Tel: +353 1 402 4379
First published on 25th February 2013
This paper describes how three technologies were utilised in combination to align student learning and assessment as part of a case study. Multiple choice questions (MCQs) were central to all these technologies. The peer learning technologies, Personal Response Devices (a.k.a. Clickers) and PeerWise (http://peerwise.cs.auckland.ac.nz), were implemented to achieve scaffolded, self-directed independent learning by the students, which aligned to the assessment methodology through creating, analysing, answering and discussing multiple choice questions. Personal response devices enhanced in-class activity involvement, whilst PeerWise provided structure and support for independent student learning through defined outside-class activities. An associated technology, online MCQs hosted through a secure virtual learning environment, was used as an aligned assessment methodology. The rationale behind this case study, its implementation and evaluation are described and discussed. Finally, the potential widespread applicability of this aligned, technology enhanced learning and assessment methodology is outlined, along with suggestions and guidelines to aid practitioners wishing to implement a similar approach.
Academics should endeavour to address this prevalent student opinion by carefully selecting appropriate learning activities and aligned assessments that correctly, and fairly, appraise a student's attainment of the learning outcomes. Without correct alignment the student will question the need for certain topics/group-work/assignments within a course; this questioning can lead to disinterest, lack of motivation and, ultimately, disengagement (Astin, 1999).
Fig. 1 Schematic of the interrelationships between the three technologies, their uses and the activities associated with each.
Component | Old method | New approach | Learning outcomes | Rationale
---|---|---|---|---
In class activities | Paper based, lecturer led discussion. | Technology based (Clickers), peer debate and student centred discussion. | Be capable of working effectively in pairs and larger groups. Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties. Understand the main organic chemical reactions and the reaction mechanisms underlying these processes. | Clicker based MCQs probed student understanding on topics ranging from simple theoretical concepts, through correct classification and nomenclature identification, to higher order reaction prediction, in an environment which promoted active, student centred learning. The use of technology encouraged peer interaction, allowed immediate collation of student responses and provided a visual representation of the overall ‘trend’ of student responses, which could be used to initiate a student centred discussion and clarify areas of misconception.
Independent learning | Recommended reading lists supplemented with minimal activities. | Scaffolded online (PeerWise), asynchronous learning activities with peer feedback. | Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties. Be capable of applying the basic skills in organic chemistry to problem solving. | Students designed MCQs and provided feedback for their peers based on topics covered in the preceding lectures. The online activity provided structure for student independent learning; typically, student questions focussed on areas that required additional independent learning. Students engaged asynchronously, with minimal lecturer assistance, and fostered a supportive online class community.
Assessment | One and a half hour exam. Students answered two questions from a choice of four. Contributed 80% of the module assessment weighting. | Exam replaced with three small stake MCQs and one high stake MCQ distributed evenly over the course of the semester. | Be capable of correctly drawing and interpreting the structures of organic functional groups and understanding their associated chemical properties. Understand the main organic chemical reactions and the reaction mechanisms underlying these processes. Be capable of applying the basic skills in organic chemistry to problem solving. | Questions were both text and image based; this permitted assessment of theoretical, conceptual and nomenclature understanding. Students were provided with immediate feedback and their score for each MCQ completed. This allowed students to structure their learning path and, with the assistance of the feedback, focus on areas of misconception; in this way students could develop their understanding both before and after an assessment. Equal distribution of the CA MCQs reduced the student stress associated with terminal exams and ‘chunked’ the learning, providing students an opportunity for deep learning. The small stake MCQs were aligned to the ‘chunked’ curriculum, whereas the high stakes MCQ was synoptic in nature.
The researcher was also the lecturer involved in delivering both the theoretical and practical elements of this module. Pedagogical evaluation data were collected anonymously where possible (written reflections or online survey) or by an independent colleague (discussion forum); however, student and participating researcher bias cannot be totally discounted.
 | Positive aspects | Negative aspects
---|---|---
Alignment to assessment | |
Clickers | Familiar format that mimicked the CA format. Prepared students for graded CA. Immediate and appropriate feedback on areas of misconception. | Limited number of Clickers; did not map onto the individual graded CA approach.
PeerWise | Students developed an appreciation for the MCQ format. A ‘safe’ learning space where learning from mistakes was encouraged and not penalised. | Question quality assurance requires constant peer and regular lecturer moderation.
MCQ CA | Small stake MCQs: viewed as learning events, not just assessments; built student confidence in their abilities over the semester; students became familiar and comfortable with the assessment technology; spread the workload over the semester and improved preparedness for the high stakes MCQ. Large stake MCQ: less stressful than a traditional terminal exam. | —
Engagement | |
Clickers | Cited in 75% of the anonymous responses as one of the top five things about the module. Broke the lecture into smaller sections, which aided in maintaining student attention. Allowed students to become active and involved in the lecture hall. Anonymous involvement; a ‘safe’ learning space. Students perceived their involvement as important to the class and their peers. | Novelty factor can wear off if overused.
PeerWise | A high level of interaction was noted: 71% of registered students engaged to some level; 60% of students asked the minimum number of questions (3) and 66% answered the minimum number (3). | Can distract students from other elements of the module and/or other modules. Small assessment percentage weighting in comparison to other CA elements.
MCQ CA | Positive attitude to the assessment strategy reflected in all forms of student evaluation and in the high interaction rates; 88% average participation in the small stakes MCQs and 96% in the high stakes MCQ. Students saw the small stakes MCQs as a challenge with a double reward (improved understanding and assessment marks). | The regularly occurring small stakes MCQs sometimes clashed with CA from other modules. For example, MCQ1 was attempted by 97% of the class and MCQ2 by 94%, whilst MCQ3 was only attempted by 73%. This was due to a clash between MCQ3 and CAs from other modules.
Gamification | |
Clickers | Provided both a challenge and a reward where students worked as a team to solve problems. Viewed as game based learning through group based activities. | Can distract from the learning outcome of the Clicker question as students focus on the game rather than the content.
PeerWise | Inherently game based (e.g. score keeping, rewards for attainment of selected criteria). | Can become addictive to the detriment of other elements of the module as some students attempt to achieve the highest score within PeerWise.
MCQ CA | Promoted a personal sense of achievement and pride. | —
Peer learning | |
Clickers | Assisted in the promotion of peer learning: 60% of students preferred to work in groups for Clicker activities, and 65% of students stayed ‘on task’ during the activity. | One third of students preferred to work alone and not engage in collaborative learning. Poor experience of group work discouraged some students from participating in peer learning activities.
PeerWise | Noted, in the majority, as being very beneficial to peer learning. Students preferred to answer peer questions rather than ask them. The largest number of questions posted by an individual student was eleven, whilst the largest number of questions answered was five hundred and twenty-five. The final number of questions in the database was five hundred and sixty-four. Additional peer provided feedback and further question discussion was noted in four hundred and twenty-two cases within the database. | Students who were unfamiliar with group work and peer learning struggled initially. An additional log-on is required to gain access to the PeerWise course, which can discourage student participation. Some ‘lurking’ was observed. The passive nature of a small number of students was noted; they were unwilling or unable to take responsibility for their learning.
Student responsibility for learning | |
Clickers | 90% of students commented that they were more focussed, or concentrated more, when there was a Clicker in-class activity. 90% of students described the post-Clicker discussion as the most important aspect of a Clicker activity. Provided guidance for independent student learning activities. | Potential for student distraction during discussion time, particularly in large classes. Discussion was initially lecturer led but moved to student led with increased student confidence and appreciation of the pedagogy.
PeerWise | Familiar feel of other social network sites, which appealed to students and allowed them to feel comfortable in assisting each other to learn. Tagging of topics assisted question selection, which often focussed on problem areas. | Student interaction, and engagement, within PeerWise does not necessarily result in improved student responsibility for learning.
MCQ CA | Allowed students to take control of their assessment, as each student could sit their uniquely randomised assessment at any time within a one week window. | Students must have online access to complete the MCQs. This may not be available if suitable facilities (e.g. a computer laboratory) are not pre-booked for students, and this removes student autonomy.
PeerWise inherently contains gaming elements, ranging from simple score keeping to rewards for attainment of selected criteria. Echoing video games, the format encourages the user to continually engage with the content, leading the user onto the next question and deeper into the ‘game’ and the subsequent learning spiral. Student comments reflected this: “PeerWise felt like more of a game than an exercise”! Students felt proud of their achievements within the different ‘games’; for example, their group answering a Clicker quiz correctly, receiving a reward badge in PeerWise or achieving a good score in an online MCQ assessment. Some students became very involved in the ‘game’ and the attainment of the ‘reward’, which subsequently led to deep independent learning. For example, within PeerWise the reward for some students was the peer rating attached to each question; students competed to have the highest rated questions. Additionally, and in line with other social media platforms, users could follow people they liked. As PeerWise is anonymous, students followed students whose questions (style or content) they liked. Again, this was an indication of status within the PeerWise environment: “I spent a lot more time reading about the topics than I did for other modules because I was trying to come up with really good questions. I liked to get a good rating for my questions and have people follow me. It meant that they enjoyed my questions and learnt something from them”. The importance of peer learning was also a common theme noted in all forms of the student module evaluation.
PeerWise provided clear evidence of students taking responsibility for their learning. PeerWise was run as a student-centred and student-controlled online environment. Although a small assessment weighting (4% on a sliding scale) was attached to minimal engagement (ask six questions and answer six questions), the majority of students interacted far beyond the minimum. PeerWise acted instead as a place for students to interact with each other and allowed peers to assist each other in their learning. Students often created questions for their peers on topics that they themselves struggled with. To create these questions, students carried out independent learning to deepen their understanding. This approach of focussing on problem areas allowed students to take responsibility for their learning in small, defined blocks. The use of technology offered a dissemination method for a student's study, a way to showcase their learning and assist others who were struggling with similar problems. One student comment highlights this development of student responsibility: “It [PeerWise] makes you think for yourself, not having a lecturer always telling you the answer shows us how much we actually know and can do on our own”.
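The exact shape of the ‘sliding scale’ linking engagement to the 4% weighting is not specified here, so the sketch below assumes a simple linear pro-rata scale up to the six-question minimum; the function `peerwise_mark` and its parameters are illustrative inventions, not part of PeerWise or the module's actual marking scheme:

```python
def peerwise_mark(asked: int, answered: int,
                  target: int = 6, max_mark: float = 4.0) -> float:
    """Hypothetical linear sliding scale: full marks once a student has
    both asked and answered `target` questions, pro-rata below that."""
    asked_frac = min(asked, target) / target
    answered_frac = min(answered, target) / target
    # Weight asking and answering equally; over-engagement on one
    # activity does not compensate for under-engagement on the other.
    return round(max_mark * (asked_frac + answered_frac) / 2, 2)

print(peerwise_mark(6, 6))   # 4.0
print(peerwise_mark(3, 12))  # 3.0
```

A capped, two-sided scale like this mirrors the intent described in the text: the weighting rewards minimal engagement without letting prolific answering mask a failure to pose questions.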
Despite a positive evaluation by students and observed improvements in student engagement with, and preparedness for, the redesigned assessment strategy, students will always be heavily influenced by how a module is assessed. Ramsden (1992, p. 187) noted that “from [the] students' point of view, assessment always defines the actual curriculum”, and in many ways this is still true for the redesigned module described here. Although the majority of students appreciated the technology enhanced, aligned learning activities and the re-designed assessment strategy, a small number of students remained ‘slaves’ to the assessment. One student comment, collected in the anonymous survey, highlighted this: “The lack of an end of module exam really influenced my lecture attendance; I didn't attend as many lectures as I would have normally if there was an exam at the end”.
Overall, in this study at least, students reported a positive experience of the re-designed module; they engaged in technology enhanced activities both inside and outside class, identified their learning gaps and used social technologies to take ownership of their learning: “Initially I thought this module was going to be impossible until I saw the breakdown of the module; how we were going to learn and be assessed. This made it much more do-able”. With correct alignment of the curriculum, through suitable learning activities, to the assessment (and not the other way around), students were encouraged not to see the assessment solely as the principal outcome of the module. An important aim of assessment is to “engage students in intellectually challenging tasks that are realistic and relevant in the context of a discipline” (Webster, 2007, p. 2). The academic must define suitable assessments that seek to uncover the student's true understanding of the module and achievement of the learning outcomes. By maintaining a level of challenge, reality and relevance in the assessment, its benefit will be more obvious to the student. In this study several students commented on the challenge (and reward) of MCQ based learning and assessment. The level of challenge and reward must align to the standard of the student: nobody would play a game that was too easy or too hard; there must be scope for success. MCQs are often considered low level assessment, based on fact regurgitation; Gibbs (1992, p. 10) outlined the potential issues with such an assessment:
“Assessment systems dominate what students are oriented towards in their learning… students often recognise that what is really necessary is to memorise”
To avoid this situation, learning activities and assessments must be designed carefully. In this study students commented that they could not answer the assessment MCQs just by learning the notes; they had to apply their knowledge. The inclusion of higher order skills in the MCQ design can elevate this learning and assessment method above fact regurgitation and a memory game. Encouraging active student participation in the MCQ process, through question and feedback design, can further heighten the cognitive processes used by the student and subsequently deepen their learning (see Table 3).
Sample MCQ 1 | What is the major product of the dehydration of 2-methyl-2-butanol? | | |
---|---|---|---|---
Answer options | 2-Methyl-1-butene | 2-Methyl-2-butene | The reaction will produce equal amounts of 2-methyl-1-butene and 2-methyl-2-butene | 2-Methyl-butane
Peer responses | 16% | 50% | 18% | 16%
Feedback | In dehydration reactions, a molecule of water is eliminated from an alcohol molecule by heating the alcohol in the presence of a strong mineral acid. A double bond forms between the adjacent carbon atoms that lost the hydrogen ion and hydroxide group. The major product will be the most highly substituted alkene, i.e. the product with the fewest H substituents on the double bonded carbons (Zaitsev's rule). | | |
Peer comments | ‘Good question and explanation’. ‘That tricky Zaitsev!! Good question’. | | |
Sample MCQ 2 | Fill the gaps in the following sentence: There are _ sigma and _ pi bonds in the H2C=C=CH2 molecule. | | |
---|---|---|---|---
Answer options | 4, 2 | 6, 2 | 2, 2 | 2, 6
Student responses | 10% | 83% | 2% | 5%
Feedback | Each terminal C atom is bonded to two H atoms by sigma bonds and to the central C atom by the sigma component of a double bond, making six sigma bonds in total. Each of the two C=C double bonds also contains one pi bond, giving two pi bonds linking the central C atom to the outer C atoms. | | |
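The sample questions above share a fixed shape: a stem, four options, one correct answer, feedback shown after answering, and a tally of responses. A minimal sketch of such a structure is given below; this is an illustrative data model, not the actual PeerWise or VLE schema, and the class and method names are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class MCQ:
    stem: str
    options: list          # four options, matching the module's format
    correct: int           # index of the correct option
    feedback: str          # displayed after answering, as in Table 3
    responses: list = field(default_factory=lambda: [0, 0, 0, 0])

    def record(self, choice: int) -> bool:
        """Tally one student response; return whether it was correct."""
        self.responses[choice] += 1
        return choice == self.correct

    def response_percentages(self) -> list:
        """Response spread, like the 'Peer responses' row in the tables."""
        total = sum(self.responses) or 1
        return [round(100 * n / total) for n in self.responses]

q = MCQ(
    stem="There are _ sigma and _ pi bonds in the H2C=C=CH2 molecule.",
    options=["4, 2", "6, 2", "2, 2", "2, 6"],
    correct=1,
    feedback="Each C=C contributes one sigma and one pi bond; with four "
             "C-H sigma bonds this gives 6 sigma and 2 pi in total.",
)
q.record(1)
q.record(0)
print(q.response_percentages())  # [50, 50, 0, 0]
```

Keeping the feedback alongside the question in one record reflects the design principle discussed above: the feedback is authored with the question, not bolted on after marking.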
The importance of feedback featured heavily in all sources of student module evaluation, consistent with the work of Higgins et al. (2002), which noted the positive impact feedback has on students in higher education. Higgins and colleagues noted that the modern student in higher education is highly motivated and will actively seek feedback as a means to improve their understanding of the content and to engage with their subject in a ‘deep’ way. In this study feedback was available through many avenues from two primary sources: the lecturer and peers. Initially the students depended heavily on lecturer feedback; however, with time and experience, students became accustomed to providing and receiving peer feedback. The relationship between lecturer and student group evolved from Woods' (1987, p. 242) symbiotic relationship, where ‘the teacher and student collaborate actively to produce a best performance’, to a more student centred collaboration with the academic acting as facilitator and background moderator.
Activity | Timing | Duration | Format | Student anonymity | Continual assessment weighting (%) | Academic tasks
---|---|---|---|---|---|---
In class activities (Clickers) | Every second or third lecture | 30–45 minutes | MCQ based. Four option answers and one correct answer per question. Peer debate and student led discussion. | Student responses were anonymous to non-group peers and the lecturer. | 0 | Preparation of MCQ slides addressing specific learning outcomes. Distribution of Clickers at the start of class. Facilitation of student led discussion before and after revealing the correct answer. Collection of Clickers at the end of each class.
Independent learning (PeerWise) | Ongoing throughout the semester | Typically 1–2 hours per week | MCQ based. Four option answers and one correct answer per question. Peer provided feedback and commentary. | Student responses were anonymous to peers. The lecturer could review individual student responses. | 4 | Creation of a PeerWise course within the PeerWise website. Enrolment of students into the PeerWise database. ‘How to use PeerWise’, ‘How to write appropriate MCQs’ and ‘How to provide suitable peer feedback’ workshop facilitation. Support screencasts are also available on the PeerWise website. Online moderation and monitoring. Calculation of student continual assessment percentage.
Small stake assessment MCQs | Distributed evenly throughout the semester | 1 hour per quiz | MCQ based. Four option answers and one correct answer per question. Lecturer provided feedback. | Student responses were not available to peers. The lecturer reviewed and monitored individual student responses. | 3 × 7 | Preparation of an MCQ database, based on best practice, that addresses the specific learning outcomes from the preceding lectures and independent learning activities (Roberts, 2006). Generation of specific feedback for each answer within the database to enhance student understanding. Testing the MCQ and feedback database. Release and closure of the MCQ as per the pre-determined assessment schedule. Calculation of student continual assessment percentage.
High stake assessment MCQ | During the last teaching week of the semester | 1 hour | MCQ based. Four option answers and one correct answer per question. Lecturer provided feedback. | Student responses were not available to peers. The lecturer reviewed and monitored individual student responses. | 25 | As per ‘Small stake assessment MCQs’. An additional duty is student invigilation during the MCQ.
Note: The remaining 50% continual assessment is based on laboratory work, written laboratory reports and a laboratory skills examination.
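The weightings in the table can be cross-checked with a few lines of arithmetic; the sketch below simply totals the MCQ-related components (values taken from the table; the variable names are illustrative) and confirms they leave the 50% reserved for laboratory-based continual assessment:

```python
# Continual assessment weightings from the table above (percent).
weights = {
    "PeerWise engagement": 4,
    "Small stake MCQs (3 x 7%)": 3 * 7,
    "High stake MCQ": 25,
}

mcq_total = sum(weights.values())
lab_total = 100 - mcq_total  # laboratory work, reports and skills exam

print(mcq_total, lab_total)  # 50 50
```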
Clickers

Decide before the introduction of Clickers as a learning activity:

1. How the Clicker logistics will be addressed:
(a) Students purchase a Clicker each: this works best if the Clicker will be used in a number of modules over the student's studies; however, students can forget to bring them to class and this may disrupt the lesson plan.
(b) Students borrow a Clicker on long-term loan from the institute's library: a cheaper alternative for the student; however, this requires a large initial financial outlay on the institute's behalf.
(c) A Clicker is allocated to small student groups during class: this approach was used in this case study and worked well. Distribution and collection of Clickers at the start and end of class can take up some time; however, over the course of the semester this improved as students became used to the set up.
2. How the Clickers will be used in the classroom:
(a) Clickers can be used individually or in groups (as in this study).
(b) Clickers can be used as part of an assessment approach or anonymously (as in this study). In this study students preferred the anonymous approach, citing that they could make mistakes and not feel embarrassed.
3. How the Clickers will enhance the learning experience:
(a) Detailed statistical analysis can be carried out on each question by the Clicker software; however, by far the most important aspect noted in this case study was the ‘reveal’, where the correct answer was shown, the bar chart of student responses displayed, and the question (and all option answers) discussed by the students. The instantaneous data collation and display afforded by the Clicker technology should be used to enhance the learning experience.

After the introduction of Clickers as a learning activity:

1. It is recommended not to overuse Clicker activities. In this study a Clicker activity took place every second or third lecture, and this maintained student enthusiasm for the learning activity.
2. Supplement Clicker activities with additional appropriate activities (e.g. paper based structure drawing).
3. Mix up the type of Clicker activity on a regular basis (e.g. Class A vs. Class B, males vs. females).
PeerWise

Before the introduction of PeerWise as a learning activity:

1. Create a ‘course’ within the PeerWise website. Upload all students to this site, using their student numbers as unique identifiers within the system. As interaction is anonymous within PeerWise, these unique identifiers can be used to monitor student activity and engagement, particularly if these form part of the assessment criteria (as in this case study).
2. Before students begin to generate questions and feedback, it is recommended to hold a tutorial session on appropriate MCQ and feedback preparation. This will ensure a higher standard of question (and feedback) in the database.

After the introduction of PeerWise as a learning activity:

1. Over time the database will develop and students can begin to take ownership of this environment. However, a low level academic presence is suggested; student activity can be monitored, and any administration queries (e.g. a forgotten password, questions flagged as inappropriate, etc.) can be easily dealt with within the PeerWise environment.
2. Detailed analytics of student interaction can be downloaded as a spreadsheet file for further analysis, particularly if engagement forms part of the assessment.
Online MCQs

Before the introduction of online MCQs as an assessment activity:

1. Preparation of good quality, challenging online MCQs takes time. Provision of feedback with each MCQ enhances the student learning experience, but again takes time. Ensure a suitable development timeline is put in place. The time input is front-loaded; once the quizzes are set up they can be automatically graded by the hosting VLE.
2. It is important to test each question within each quiz, preferably in conjunction with a colleague, to ensure the question is suitably challenging, the distractor options are appropriate and the correct answer is indicated in the database.
3. Question sets can be written collaboratively with other academics if the module is co-taught, and this will reduce the workload. Question sets can be written in a word processing package (e.g. Microsoft Word), reviewed and transferred to the VLE. It is essential to double check that no errors occurred during the transfer.

After the introduction of online MCQs as an assessment activity:

1. Invariably at least one typo will make it through the academic's question screening, and students should be encouraged to report these glitches once they are discovered. Most VLEs allow for question editing and score correction after a quiz has been completed.
2. In this study, students' questions posted in PeerWise that were of a suitable standard were included in the large stakes MCQ. This encouraged students to write high standard questions in their learning activities.
3. Additional questions can be deposited into the question database on an ongoing basis by the academic, for example based on questions discussed in class. This will keep the database alive and relevant to the student cohort.
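The ‘reveal’ step recommended for Clicker activities (collate the responses, display the spread, then hand the discussion to the students) can be sketched in a few lines. This is an illustrative stand-in for the vendor's Clicker software, not a description of it; the `collate` and `reveal` helpers are invented for the example:

```python
from collections import Counter

def collate(responses):
    """Tally Clicker responses per answer option (A-D)."""
    return Counter(responses)

def reveal(responses, correct="B", width=20):
    """Print a simple text bar chart of the response spread, marking the
    correct option, to seed the post-question student discussion."""
    tally = collate(responses)
    total = len(responses)
    for opt in "ABCD":
        n = tally.get(opt, 0)
        bar = "#" * round(width * n / total) if total else ""
        mark = " <- correct" if opt == correct else ""
        print(f"{opt}: {bar} ({n}/{total}){mark}")

# Ten hypothetical group responses to one question.
reveal(list("BBABDCBBAB"))
```

The point of the chart is pedagogical rather than statistical: a visible cluster on a distractor is the cue to open the student led discussion described in the guidelines above.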
This journal is © The Royal Society of Chemistry 2013