

Nottingham e-assessment case study

Page history last edited by PBworks 16 years, 4 months ago

(Moving from OMR to CBA for summative exams)

Authors: Simon Wilkinson, Heather Rai




1. Why did you use this e-learning approach?

The University of Nottingham Medical School wished to switch from an optical mark recognition (OMR) method of scoring objectively marked questions to a computer-based assessment (CBA) solution for two reasons. The first was the time pressure of marking ever-larger cohorts: the time taken to mark using OMR scales linearly with cohort size, and with a limited number of scanners this was affecting the scheduling of multiple examinations. The second impetus came from a desire to create more realistic questions. Teaching staff wished to incorporate full-colour microscope slides and high-resolution radiographs, which are difficult and expensive to reproduce accurately on paper, and to use interactive question types such as drag-and-drop labelling and image hotspots that are not possible on paper.



2. What was the context in which you used this e-learning approach?

The University of Nottingham runs two medical degree programmes: a five-year undergraduate degree and a four-year accelerated graduate-entry degree. The combined number from both programmes is approximately 1,900 students, with approximately 1,000 staff registered to use the virtual learning environment. The graduate-entry cohort is based at Derby and is predominantly taught through problem-based learning across 11 modules. The undergraduate cohort completes most modules from a total choice of 59, and both groups then integrate to study the final 13.
Both programmes use a wide variety of assessment forms, including groupwork, essays, projects, OSCEs, OSLERs and objectively marked questions. Historically, the true/false/abstain question type dominated this last form of assessment, but recently there has been a drive towards more reliable question types such as multiple choice and extended matching. These were administered on machine-readable paper sheets which were then scanned through OMR hardware. The main problem with this form of assessment is that the scanning time is directly proportional to the size of the cohort. In addition, unclear changes to answers on the paper form are difficult to process and frequently require human intervention, which again slows the process.
Anticipated problems in moving from an OMR method to a CBA approach revolved around two central issues. The first was finding computer labs large enough to examine the cohort sizes of the medical programme; the solution adopted was to run two back-to-back 'sittings' of a particular cohort in the same lab. The second concerned the additional opportunities for plagiarism afforded by the move online. Staff were worried about: 1) students being able to see other students' monitors, and 2) students being able to access forbidden materials during the examination, such as Google or notes on a pen drive.


3. What technologies and/or e-tools were available to you?


The University of Nottingham has been using some form of online assessment since the early 1990s. A large question bank was written first in Authorware and then in a web-based system developed in 1999 under the TLTP3-86 project. In 2003 the Medical School began development of a next-generation system called TouchStone, and by 2004 the question bank had been fully transferred into this new assessment platform. At an institutional level there was no overall steer regarding computer-based assessment: some departments used Question Mark Perception, others Test Pilot, and others just the assessment capabilities built into WebCT or Blackboard. Very few departments were using these online assessment systems for high-stakes summative assessment. With no clear lead, large question banks already in place and good staff/student familiarity, TouchStone was the platform chosen to support a computer-based assessment approach within the Medical School.



4. What was the design?


The initial design three years ago was primarily concerned with exam delivery online: working with academics to get questions into the system and to set the permissions correctly so that only authorised students could sit each assessment. Building on initial successes, the design has expanded considerably to create a complete electronic workflow supporting online summative exams. The workflow runs from question writing and item storage, through review and editing with a change audit system in place, providing external examiner access, conducting standards setting, and making adaptations for disabled students, to exam delivery and reporting. Many different stakeholders are involved: academics creating the questions, administrators performing room booking and timetabling, subject matter experts conducting standards-setting reviews, external examiners, disability experts and IT support personnel. The aim of computerising the whole workflow is to enhance quality. Because the system runs on a web server, the process can be distributed to any client computer and the data backed up nightly. If any key stakeholder is ill or unable to work, it is easy for another member of staff to log into the system and obtain the information, and there are no problems with work being held on different people's personal computers that may or may not be password protected.



5. How did you implement and embed this e-learning approach?


Half-day summer workshops for staff training were arranged within the Medical School, alongside a comprehensive online help system with separate areas for staff and students. One-to-one training was given to key administrators, tailored to their roles. Reports of successes and failures were circulated to the main curriculum committees, and student questionnaires were collected for non-summative exams (http://www.nottingham.ac.uk/nle/about/touchstone/ideal/usability/derby_touchstone_2007.html).


The number of people sitting each assessment was gradually increased to test the load on the system. Initially the system was tested with a few resit students; then a simulated load was tried with around 30 volunteers starting a paper within a few seconds of each other. The next test was with 90 students on the Graduate Entry Programme, then 130 for a first-year exam, and finally a large cohort of 170 students taking their Advanced Clinical Experience (ACE) exams. At each stage the response times of the system were noted.
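The staged ramp-up described above can be sketched in code. This is a minimal, hypothetical illustration, not TouchStone's actual test harness: `fetch_first_page` merely simulates a student's initial request (in a real test it would be an HTTP GET against the exam URL), and the cohort sizes mirror those used at Nottingham.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_first_page(student_id):
    """Stand-in for a student's first request to the exam server.

    In a real load test this would be an HTTP request; here it just
    sleeps briefly so the timing logic is runnable on its own.
    """
    start = time.perf_counter()
    time.sleep(0.01)  # simulate server processing time
    return time.perf_counter() - start

def staged_load_test(cohort_sizes=(30, 90, 130, 170)):
    """Ramp concurrent 'students' up in stages, recording response times."""
    results = {}
    for n in cohort_sizes:
        # All n clients start their paper within moments of each other.
        with ThreadPoolExecutor(max_workers=n) as pool:
            latencies = list(pool.map(fetch_first_page, range(n)))
        results[n] = {
            "mean": statistics.mean(latencies),
            "max": max(latencies),
        }
    return results

for n, stats in staged_load_test().items():
    print(f"{n} clients: mean {stats['mean']*1000:.1f} ms, "
          f"max {stats['max']*1000:.1f} ms")
```

Noting the mean and worst-case response time at each stage, as the Medical School did, reveals whether latency grows gracefully or degrades sharply as the cohort size approaches a full lab.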


The two problems anticipated in question 2 were addressed as follows:

  1. A large computer lab holding 150 students was found in the School of Computer Science. Larger numbers can be seated by running two cohorts of students and using a second, smaller computer lab simultaneously.
  2. An 'exam desktop' was created. This used group policies in Windows XP to block every application apart from Internet Explorer and every web site apart from TouchStone. It also restricts access to all drives and system settings.
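The case study does not publish the actual policy set, but the standard Windows XP mechanism for this kind of lockdown is the "Run only allowed Windows applications" (RestrictRun) policy. The fragment below is an illustrative sketch of that mechanism only; the values are hypothetical, and restricting Internet Explorer to the single TouchStone site would be configured separately (for example via a proxy).

```reg
Windows Registry Editor Version 5.00

; Hypothetical 'exam desktop' lockdown, illustrating the RestrictRun
; mechanism; not the actual Nottingham configuration.

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"RestrictRun"=dword:00000001
"NoDrives"=dword:03ffffff

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer\RestrictRun]
"1"="iexplore.exe"
```

In practice such settings would be deployed through a Group Policy Object applied to the exam-lab accounts rather than imported by hand, so they take effect only when a student logs in with an exam account.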




6. What tangible benefits did this e-learning approach produce?




As with a 'traditional' assessment life-cycle, involving external examiners to review the academic standard of a paper is equally important when moving to a computer-based approach. TouchStone has been designed specifically with this in mind: it is possible to create special login accounts for external examiners and to specify explicitly which examiners should review which papers. The advantage of this approach is two-fold: 1) login details can easily be emailed to externals, who then have immediate access to the paper(s), and 2) system administrators can check whether an external has reviewed a paper by a particular deadline.


Another advantage is that there is now a single place of storage for questions, past papers, student profiles (year of study, photo, marks, etc.) and exam results, and that these are interconnected: a student profile links to the papers that student has taken, which in turn link to the marks for the cohort on each paper.


Marking time is greatly reduced with online exams. In March 2007, over two days, 330 students each sat Paper I and Paper II of the Advanced Clinical Experience (ACE) module. Had OMR been used, these exams would have taken a total of around 10 hours to scan; using the CBA approach outlined here, it took about 2 seconds to bring up each of the two reports. This means that marks can be made available to staff as soon as the exam is finished (and can even be monitored during the exam) and can be viewed on screen or exported into Excel, CSV or XML format. Item analysis can be run immediately after the exam to identify potential problem questions, and the marks for each question for every student can also be exported if further item analysis is necessary.
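The per-question export makes classical item analysis straightforward. As a hedged sketch (the report format and function names here are hypothetical, not TouchStone's), the two standard statistics are an item's facility (proportion of candidates answering correctly) and its discrimination (the point-biserial correlation between the item and each candidate's total score):

```python
def _pearson(x, y):
    """Pearson correlation; returns 0.0 when either series is constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    if vx == 0 or vy == 0:  # e.g. an item everyone answered correctly
        return 0.0
    return cov / (vx * vy) ** 0.5

def item_analysis(scores):
    """Per-item facility and discrimination from exported marks.

    scores[s][q] is 1 if student s answered question q correctly, else 0.
    Facility is the proportion correct; discrimination is the
    point-biserial correlation between the item and the total score.
    """
    totals = [sum(row) for row in scores]
    report = []
    for q in range(len(scores[0])):
        item = [row[q] for row in scores]
        report.append({
            "facility": sum(item) / len(scores),
            "discrimination": _pearson(item, totals),
        })
    return report
```

A question with very low facility or near-zero (or negative) discrimination is flagged for review, which is exactly the "potential problem questions" check described above, now possible minutes after the exam ends rather than days later.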


It has also been possible to gain flexibility in timetabling, as there is no longer a bottleneck of hundreds of papers needing to be fed through one OMR machine. This meant that one exam could be brought forward by one week this year, providing more time to review marks. A great deal of paper has also been saved with the move away from OMR, along with printing time, as each OMR answer sheet needs to be printed individually.




Switching to online assessment has opened opportunities for the use of multimedia and greater interactivity in exams. Labelling questions (example.gif) involve the student dragging labels into placeholders over an image or diagram. This is pedagogically similar to a multiple choice question with a diagram, but the lower cognitive load allows the student to engage more with the subject matter. Hotspot questions (example.gif) are again based on an image, but the student is required to place a marker on an area of the image defined by the question's author. Both of these question types are worth highlighting as they are not easily possible on paper. The other advantage of image hotspot questions is a reduced probability of getting the answer correct by guessing: instead of being presented with five discrete options, the candidate can click on literally any pixel of the image.


In fact any kind of image is much easier and cheaper to reproduce on screen than on paper, even for the more conventional question types. The Medical School tried printing radiographs (x-rays) as greyscale images on 600 dpi laser printers but found the resulting quality insufficient, rendering questions ambiguous; as a consequence, radiographs were removed from OMR-based exams. Modern visual display units have no such problem rendering the subtleties, however, and radiographs have now been re-introduced into many CBA exams, proving particularly useful in the clinical part of the medical curriculum.


In a field as complex as medicine, working collaboratively in teams, potentially interdisciplinary ones, is important. However, when working in such teams there is always the danger that changes are accidentally overwritten. TouchStone keeps an automatic change log that can easily be interrogated to see the last changes to a question. Also, in a similar way to MS Office, the system monitors and prevents concurrent editing: a second author is warned about the situation and provided with a read-only version of the question.
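TouchStone's internals are not published, so the following is only a minimal sketch of the lock-on-open idea described above, with every class and method name hypothetical: the first author to open a question holds an edit lock, and any other author opening the same question is served a read-only copy until the lock is released.

```python
class QuestionBank:
    """Sketch of 'first editor locks, second author gets read-only'."""

    def __init__(self):
        self._locks = {}  # question_id -> author currently editing

    def open_for_edit(self, question_id, author):
        holder = self._locks.get(question_id)
        if holder is not None and holder != author:
            # Another author is already editing: warn, serve read-only.
            return {"mode": "read-only", "locked_by": holder}
        self._locks[question_id] = author
        return {"mode": "edit", "locked_by": author}

    def close(self, question_id, author):
        """Release the lock when the holding author finishes editing."""
        if self._locks.get(question_id) == author:
            del self._locks[question_id]

bank = QuestionBank()
print(bank.open_for_edit("q1", "alice"))  # {'mode': 'edit', 'locked_by': 'alice'}
print(bank.open_for_edit("q1", "bob"))    # {'mode': 'read-only', 'locked_by': 'alice'}
```

A production system would also expire stale locks (e.g. after a browser crash) and record each saved edit in the change log so that overwrites can be traced and reversed.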




It has also been found to be simple to accommodate some disabilities, such as dyslexia and some visual problems (see Disability Support in CBA). Students who prefer a non-white background (some dyslexic students find it considerably easier to read text on a coloured background) can change the presentation of the exam to suit their needs, and this preference follows them no matter where they sit the exam (or take formative tests at home). Text size can also be altered.
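Because the exam is delivered as a web page, such presentation changes amount to swapping stylesheet rules per student. The CSS below is purely illustrative: the class names are invented and TouchStone's actual selectors are not documented here. A stored preference would add one of these classes to the page body.

```css
/* Hypothetical per-student presentation preferences (names invented).
   A stored preference adds one of these classes to <body>. */
body.theme-cream { background: #f5ecd7; color: #222; }
body.theme-blue  { background: #d8e6f3; color: #111; }
body.text-large  { font-size: 125%; }
body.text-xlarge { font-size: 150%; }
```

Because the preference lives on the server with the student's profile, it applies identically in the exam lab and at home, which is exactly the portability described above.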


Students have also reported that they found the online exam 'cleaner' (www.nottingham.ac.uk/nle/about/touchstone/ideal/usability/derby_touchstone_2007.html). Incompletely rubbed-out answers on an OMR form frequently cause the scanner to raise a multi-answer warning; this takes time to correct, as the member of staff administering the scanning has to re-scan manually. Online, the exam paper uses radio buttons, check boxes and dropdown menus that unambiguously store only one answer. Students can change their answer selection as many times as they choose before finally submitting the form. The other advantage, especially with radio buttons, is that only one option can be selected for a multiple choice question: the interface constrains the permitted interactions, whereas on paper candidates sometimes mark too many options, which causes problems when scanned.



7. Did implementation of this e-learning approach have any disadvantages or drawbacks?


This approach is inherently more risky than OMR, as there are more things that can potentially go wrong than with paper. A successful examination is reliant on more parts of the chain working: servers, client computers, the network and power. If any of these is unavailable at the right time, the exam will not run. Large computer labs are also required to seat a large cohort of students simultaneously, and these need either enough distance between screens or physical barriers to prevent plagiarism.


Powerful servers are also required to cope with a large number of students taking an exam, especially at the start of the exam when they all send a request for the first page almost simultaneously. A backup server is also very useful so that, if there is a problem with the main server, the backup can be used instead and the exam does not need to be rescheduled. The configuration of these primary and backup servers is complex and requires specialists; the salaries of such personnel could be considered a financial drawback.


Some anxiety was seen in students, especially for their first online exam, though it is predicted that this will rapidly decrease as their familiarity with the system grows. Some anxiety has also been seen amongst staff. Although online exams should be organised by the institution in much the same way as paper-based exams (apart, obviously, from the mode of delivery), the unfamiliarity of the online approach and a perceived lack of confidence in IT abilities can lead some staff to be shy of the new system. This can mean more responsibility for the organisation of exams falling to IT staff, who would not normally be involved in these stages.



8. How did this e-learning approach accord with or differ from any relevant departmental and/or institutional strategies?


It is the Medical Education Unit's role to lead the departmental strategy with respect to assessment. The unit contains staff who specialise in medical education (including assessment) and staff whose responsibility it is to program and run the online assessment system, and who therefore lead the departmental strategy. These members of staff liaise with staff in the schools within the Faculty of Medicine to run online assessment.


The MEU started summative online exams three years ahead of the university's central IT team. When the university's Information Services began looking at software for summative online assessment, TouchStone was considered as an option for adoption by the university but was beaten into second place by Questionmark Perception at the final stage. The Medical School has continued with TouchStone because a large number of questions are already in the system, staff are trained to use it, and it has been designed around the question types commonly used in medicine.



9. Summary and Reflection


The experience at Nottingham suggests a need for greater involvement of all key stakeholders in online assessment from the outset. Experience has shown that some of the usual stakeholders in exams take a step back when IT becomes involved, perhaps due to the new language this entails (servers, clients, logins, etc.).


The growth every year in the number of summative exams taken online reflects the growing enthusiasm for this format and the success of its use so far. The process has been a steep learning curve, but one climbed in a series of small steps rather than huge leaps, and this approach is definitely recommended. Testing is very important at every stage to minimise stress and possible failures, though there are often surprises.


On reflection, not much would be done differently with respect to the IT. Most issues could not have been foreseen before they occurred, and rigorous testing has kept these to a minimum.


More research is needed within the field of CBA to investigate the new interactive question types (image hotspots, drag and drop labelling) mentioned here for their validity and reliability. The current authors also have a hypothesis that the drag and drop labelling question could be beneficial to dyslexic candidates because it reduces working memory load.





Comments (2)

Anonymous said

at 2:14 pm on Jul 10, 2007

Simon - a few points that you made in your excellent presentation could maybe come out more strongly here:

- The fact that this offers no clues whereas multi choice allows the possibility of a correct guess

- Your theory that drag and drop is better for dyslexics than labels referenced on the answer sheet (have you thought of talking to TechDis about this?)

- the unambiguity of radio buttons as compared to rubbing out in pencil

- the audit trail of changes to questions

- previous spend on colour printing. Can you get hold of any figures on how much you used to spend?

- the time comparison. You cited 5 hours going down to 2 seconds. Can you give concrete examples of how the time has changed for a particular cohort size and how this scales up to total time saved?

Anonymous said

at 9:50 am on Jul 13, 2007

Comments addressed.
