Lessons learned from six years of learning analytics at The Open University
Three key lessons learned from The Open University’s award-winning use of data.

It’s a common assumption with learning analytics (LA) that simply getting hold of the data is enough, and that the magic will happen all by itself. But that’s certainly not the case.
Start small
Fully integrating learning analytics into an organisation takes a lot of time and effort, and small steps are a great way to start.
Back in 2014, when we launched our first learning analytics dashboards at The Open University, we spent a lot of time investigating what was working for our organisation, what wasn’t, and where we could do better. If the data showed, for example, that students were struggling particularly during a certain week or on a certain module, we would focus on what was different about that week or module: could the learning design be improved? Were students dealing with external factors? This kind of exercise may seem like a small, niche investigation, but it has wide-ranging implications and is an essential step in implementing LA.
Similarly, the OU takes a very flexible approach to which modules students can take within their degree pathways. This sometimes means that student A, for example, could do really well in module 1 but then struggle with module 2. Spotting such patterns allows teachers to consider how they might better advise on module pathways, or improve transitional content.
Through our use of learning analytics, we can follow students’ journeys, while the data is fed back to teachers and those responsible for curriculum design to see what can be done to improve the overall teaching and learning experience.
This small start allowed us to develop our use of LA without overwhelming staff or students, and now this is business as usual – we call it ‘Analytics4Action’ and it’s one of the things I’m most proud of.
Use evidence-based research to help shape the narrative
There will always be some sceptical members of staff during a technological transformation. And that’s understandable – many education professionals have seen technology come and go throughout their careers, sometimes causing more harm than good.
The best way to make sure that learning analytics doesn’t fall into the same trap is to show real-world evidence of where it has worked and made a genuine impact.
In the good old days before COVID-19, when we could physically meet in the same room, we would bring teachers together with other members of staff across the organisation – those in libraries, and anyone who worked with student data – and we would literally sit down and look at the data dashboards on screen together. We’d talk through the patterns we noticed, discussing what was working and what wasn’t going so well. It was a great way to understand which data was useful to these staff members in improving services and informing their decision-making.
It also meant we were – crucially – able to contextualise what the data was telling us. For example, if there was a drop-off in student engagement during a certain week, a teacher might explain that they had given the students a break, and so instead of reflecting a crisis, the data would in fact indicate normal fluctuations in students’ learning journeys. By bringing these perspectives together, we create a dialogue, and the data starts to tell an understandable story within the context of our organisation.
Our learning design team also continuously works with staff to ensure they have the data knowledge they need.
By keeping up with training and professional development, the idea is that staff eventually become self-sufficient in their use and interpretation of data. There are always early adopters who are keen to innovate, but the vast majority of people will only be convinced once they see that it works. Using evidence-based research and keeping all relevant members of staff involved throughout implementation can go a long way towards smoothing the transition.
Celebrate your successes
Because implementing a resilient learning analytics programme takes time and a lot of effort, it can be easy to continuously focus on the next step and forget to celebrate your successes along the way.
But these celebrations are essential to keep the motivation and momentum going, because the big things don’t happen overnight. They are the result of months, often years, of cumulative effort.
It’s easy, too, to compare yourself and your organisation to those ahead of the curve. For instance, universities like Monash in Australia are doing great things with analytics, but that doesn’t negate the amazing, if smaller, projects happening elsewhere.
It’s important to remember that even though there will likely be some, or many, institutions ahead of you, all progress is good progress, and should be celebrated. A good example from my own experience is that recently The Open University won the DataIQ Award for Best Predictive Learning Analytics. And that journey was almost six years in the making.
It’s easy to forget all the little successes that led up to that big one, but when you look back, you can see that an amazing amount of ground has been covered in that time, made up of many, many smaller projects. So yes, celebrate those wins – they’re worth it.
To hear Bart discuss more lessons he’s learned through implementing learning analytics, book your place at Data Matters, running online from 26-27 January 2021. You can also learn more about harnessing the power of your data.