You’ll undoubtedly have read one of the many reports and articles on the power of data and analytics for further and higher education (HE) organisations, but you may not be clear on what needs to happen for your own university or college to realise that potential.
One of the key activities now underway is the development of a strategy for data innovation in our sector. This was recommended by the Higher Education Commission in its From Bricks to Clicks report on data and analytics in HE, and is to be led by us together with the Higher Education Statistics Agency (HESA) and Universities UK (UUK).
We have begun to consider what this might look like, asking what we can learn both from within the sector and from the approaches of companies such as Google and Amazon. The result was a horizon scan report, which has fed into our think tank discussions with sector bodies including our M5 Group partners, the Quality Assurance Agency (QAA), HESA, UUK, institutional planners, and leading data analytics vendors Blackboard and Civitas Learning. In this blog I want to run through some of these findings and look to the future.
What’s holding us back, and what can we do about it?
To arrive at a cohesive data and analytics strategy for HE, we first need to understand the challenges that organisations currently face.
Challenge: addressing the reporting burden
The Higher Education Data and Information Improvement Programme (HEDIIP), coordinated by HESA, found that 93 organisations were collecting data about students through 525 separate data collections, made up of a mixture of statutory returns and data required for accreditation by professional bodies. The time and effort that we all put into collating these data sets and making returns could undoubtedly be spent more usefully and productively.
HESA’s Data Futures programme aims to develop and pilot a simplified model that will still meet reporting requirements without sacrificing data quality.
Challenge: moving from reactive to proactive decision-making
In research and education we are used to a culture of annual cycles, and much of our reporting and decision-making is based on these practices.
Our horizon scan report highlighted the potential of data analytics to support a culture of real-time decision-making, e.g. through in-year data collection alongside the current system of annual returns. Learning analytics, by contrast, is inherently real-time, and is already starting to foster a culture of proactive decision-making. For example, we hear of lecturers modifying their courses or their delivery as student attainment and feedback data come in.
Challenge: total cost of ownership and benchmarking
Institutions often have blind spots about their operating costs, for example due to decision-making taking place in silos, as we found in the development of our Financial X-Ray tool.
Financial X-Ray gives institutions a standardised way to analyse the total cost of ownership of services, and benchmark these costs against those of their peers. We know from IT leaders that this approach has already yielded significant insights into previously hidden costs, such as power and cooling, and given institutions improved leverage over suppliers. We believe that this approach has massive potential to transfer outside of IT.
What does the future hold?
That’s the now. What we need for the future is a vision for data-driven decision-making that the sector can embrace.
Delegates from institutions and sector agencies at our recent think tank on data-driven decision-making felt that we should have:
One data collection, with a universally useful data model, supporting benchmarking and a predictive analysis capability.
What does this mean in practice?
Delegates felt that three strategic developments would move the sector forward significantly:
- One data collection – the ideal would be for institutions to collect data once and once only, and transform the collected data for its various intended audiences. This might be done more efficiently by a trusted broker acting on behalf of the sector, rather than by individual institutions.
- A universally useful data model – data collections should be useful and used both by institutions and the bodies that they are required to report back to. Ideally, institutions would only gather and report data that they would use in their own planning processes.
- Benchmarking and predictive analysis capability – the group felt that this would be most effective if driven by institutions’ own “enlightened self-interest”, rather than as an addition to the already onerous burden of regulatory reporting.
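To make the "collect once, use many times" principle above concrete, here is a minimal illustrative sketch. The record fields, audiences, and functions are entirely hypothetical (they are not a real HESA schema or return format); the point is simply that a single canonical collection can be transformed into different views for different consumers, instead of running a separate collection for each one.

```python
# Hypothetical sketch: one canonical student record, transformed into
# audience-specific views rather than collected separately each time.
# Field names and audiences are illustrative, not a real reporting schema.

canonical_records = [
    {"id": "s001", "name": "A. Student", "course": "BSc Physics",
     "year": 2, "attendance": 0.91, "postcode": "AB1 2CD"},
    {"id": "s002", "name": "B. Learner", "course": "BA History",
     "year": 1, "attendance": 0.78, "postcode": "EF3 4GH"},
]

def statutory_view(records):
    """Anonymised subset of fields suitable for an external return."""
    return [{"course": r["course"], "year": r["year"]} for r in records]

def planning_view(records):
    """Richer internal view supporting the institution's own planning."""
    return [{"id": r["id"], "course": r["course"],
             "attendance": r["attendance"]} for r in records]

# Each audience receives a derived view of the same underlying collection.
print(statutory_view(canonical_records))
print(planning_view(canonical_records))
```

In practice the derivation logic would sit with the trusted broker mentioned above, so institutions submit the canonical record once and the broker produces each required view.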
Of course, there are significant technical and cultural challenges in moving from current reporting processes to real-time data feeds. However, the delegates felt that the benefits of a real-time approach outweighed them. In many cases, useful data to support teaching and learning could be collected without human intervention. For example, Professor Stephen Heppell has developed Learnometer devices that gather data about the environment in classrooms, lecture theatres and exam halls, including light, noise and pollution levels.
There was also discussion about the potential for data analytics to inform the emerging Teaching Excellence Framework (TEF) and to feed into the next Research Excellence Framework (REF) exercise. Delegates felt this could be hugely important in informing – not supplanting – human thought processes and decision-making. You’ll be hearing more on this as we put together a response on how learning analytics could support TEF as part of the ongoing consultation.
What do you think? Have your say
As our joint project to devise a data innovation strategy for the sector starts to take shape, these and other more detailed points raised by the think tank will provide invaluable input.