Over the summer, many young people found themselves at the mercy of a system of assessment they didn’t understand. Plenty had their plans for the future thrown into disarray. The long-term impact could be a significant shift in the way students view data.
Universities are about to meet a cohort of students for whom algorithms and data-driven decision-making are potent and personal issues.
You could argue this is nothing new. As a society, our interaction with systems, platforms and organisations has been impacted by the use of large datasets, algorithms, machine learning and artificial intelligence (AI) for years – but I’d argue that, until recently, use of our data hasn’t been particularly tangible.
Sure, we know that corporations and governments use data to shape their services. We see Spotify suggesting songs based on previously played tracks, and we know Amazon recommends products ‘people like you’ bought - but it’s hard to engage with an issue when the processes are so opaque, the experience so varied, and the implications unclear.
Unlike our entertainment choices, the stakes when it comes to exams are high. If a student’s future is at the mercy of a system that appears unjust, that’s traumatic – especially when it comes after six months of uncertainty.
At the same time, senior managers in education know the smooth running of the sector requires the use of data. Without it, universities would be impossible, chaotic and unfair. But data and people interact in complex ways, so there must be trust in both the processes involved and the people in positions of power. Understanding this, Jisc has produced codes of practice around the use of data analytics in a learning setting, most recently relating to issues of wellbeing and mental health.
Working in partnership
Transparency is key. Do students know what data is collected on them and how it is used? That’s a good first step - but there are further opportunities to involve learners in the design and implementation of data-informed systems and services. This gives them agency in the process. It also gives students an opportunity to develop their awareness of the ways in which data is used in modern life, better preparing them to play an active role in their learning and beyond.
We know from the results of a Jisc survey, published this week, that there’s work to be done here: of more than 20,500 university students surveyed, only 36% agreed that their organisation had told them how their data was collected and used, and just 17% said they got the chance to be involved in decisions about digital services. If we’re able to involve the staff who will implement such systems and services, and the students who are affected by them, in the design process, so much the better. Jisc’s mini-MOOC on AI and ethics might be helpful when considering how to do this.
Stories and data
The recent controversy around exam grading wasn’t all focused on data and algorithms; the energy in the debate came from human experiences. These individual stories could be the narratives that stick, especially when considering outcomes for disadvantaged students, and the impact on individuals. And, for all the fallout, let’s not be too hasty about a comprehensive rollback.
The University of Gloucestershire has been exploring how it can combine analysis of the data generated by student interaction - gathered with students’ permission - with key systems and activities to watch for early warning signs of learners getting into difficulty. This enables informed discussions as part of the university’s pastoral role.
There’s a delicate balance to be achieved, but combining the use of analytics with strong relationships and careful interventions can - when done right - prove a useful tool to support a positive student experience. Given that regular contact with students in physical spaces will be unreliable over the coming months, universities and colleges need a wide range of tools to ensure the wellbeing of students.
After many students’ experiences this summer, I hope universities will look at ways to work collaboratively to use their data ethically and with transparency while also considering lived experiences. Data and stories are close in nature, and we need to hear both to inform decisions.