When lockdown hit just as 35,000 students across 180 countries were preparing to take exams in 600 exam centres, the University of London, like many other institutions, had to overhaul its assessments, and quickly.
Dr Linda Amrane-Cooper, director of strategic projects and head of the Centre for Distance Education, explains how COVID-19 caused a radical shake-up in how the University delivered exams and discusses how it might affect the future. Here she shares how the team moved assessments online…
The challenge and how we overcame it
Lockdown prompted us to set up an assessment transformation project to deliver alternatives to 110,000 paper-based examinations in just three months for 35,000 University of London students based in over 20 different time zones.
To make all that happen, we created an extended assessment team of 40+ members, meeting daily to iron out logistics and coordinate key stakeholders, and we ran a number of practice exams among ourselves to test the new systems.
We eventually opted for three assessment formats:
- Online, time-limited, proctored (invigilated) examinations where this is essential
- Online, time-limited, unseen, closed book examinations
- Online, open-book assessments over a longer window (several days or one week)
How to stop cheating
We looked to our regulator, the OfS, for guidance, and adopted a ‘no detriment’ policy. Regular meetings of the QA teams allowed the required new regulations to be developed, reviewed by stakeholder groups and passed through our academic committee. Where we would previously have had regulations covering, for example, equipment allowed in the exam hall, we had to be explicit about what behaviour was expected when taking an exam at home.
We use live invigilators to monitor students during conventional exams, so in moving online we explored other proctoring options. In addition to un-invigilated, fixed-time and longer open-book assessments, we looked at AI video proctoring (where changes in the camera image that might indicate a student leaving the room, or another person joining them, are picked up by the software) and recorded video proctoring (a webcam recording of the candidate for later viewing and checking).
Finding a commercial partner
The bulk of our examinations ran through our Moodle VLE platforms. But to meet professional body expectations, some examinations required invigilation.
Managing volume and a short lead time (two months) nudged us to find a partner with established systems capable of managing data protection issues, procurement, stability and scalability, as well as interfacing to UoL systems and data management, at an acceptable cost. With support from UoL’s own digital services provider CoSector, we teamed up with existing partner Janison to deliver some of our assessment via their online systems.
Making sure it works
To help prepare students, and to test the new online assessment systems, we ran a variety of practice activities. In some cases, we issued prompt questions like ‘tell us about your day’, asking learners to submit a short response; these tasks made clear that we were testing the system, not the students.
For many level-four students, we provided a low-stakes short assessment paper on a core module, giving them the opportunity to get used to the system, and achieve an academic outcome in a low-risk environment. Most students engaged with the practice tests, which was great, and the majority were able to access the practice test papers, or test area, and upload responses.
We’ve learned a lot. There have been particular in-country challenges related to access and broadband and we’ve been able to fine-tune and provide more detailed support, including country-specific follow-on tests.
To help manage resources, we staggered start and end times and made sure that several large exams wouldn’t take place at the same time – this was also to ensure that our VLEs (we use more than one) could cope with the uploads.
With exams just completed, we now know that the vast majority of our students, about 93% overall, engaged with their exams. For those who didn’t, the reasons were often virus-related, such as personal or family illness affecting their ability to sit the exams. Overall, our VLEs survived the surges in demand, particularly as students uploaded files of scanned handwritten or typed exam answers.
Our marking teams have been working in new ways and we are learning a lot about the digital management of 100,000+ student submissions. Understandably, students needed much more support this summer, using the student enquiry portal, phone and email to get in touch – in their thousands. Students emailed to share feedback and log queries, but many others got in touch simply to send an email copy of their answers as a back-up.
With thousands of students undertaking different types of exams on varied programmes, on different platforms, the demands on staff have been huge. Many have been available on the student enquiry system from very early in the morning until very late at night to cover our students’ time zones, and the assessment cycle has shifted to overlap with our active recruitment period, placing more pressure on the teams.
But knowing that we were able to meet the needs of our students this summer, giving them an opportunity to demonstrate their achievements on this huge scale, has left us all with a sense of pride.
We’re currently running a huge evaluation project with student surveys, looking at student outcomes (how they’ve done compared to previous years), behaviour on and access to the platforms, attendance and the general experience. Response rates from students are high, which is giving us lots of useful data for future planning.
Lockdown has prompted an entirely new way of working, and there is now a much greater recognition of the opportunities presented by the digital space. Hopefully we’ll be able to take these learnings forward to continually improve the student experience.
For more examples of how innovation during lockdown is inspiring long-term change in higher education, take a look at the learning and teaching reimagined initiative.