How data can be used to fuel a collaborative co-design process for effective digital services.
When it comes to understanding and improving student services, data can give us a lot of information. But it can’t tell us everything.
We also need to be able to contextualise and interpret that data, using it as a starting point for more human investigation.
Using a co-design model for service development can help support this dual approach to problem-solving.
Co-design is the idea of bringing people together to identify a challenge, then working together to solve it. Most organisations - including universities - have a lot of data silos, and these tend to be the biggest barrier to collaboration. It’s not that people don’t recognise that problems exist; rather, there’s often no shared agreement on what the problem actually is.
For me, that’s what co-design is all about: breaking down those barriers. The model also helps focus different departments or staff members on the issue at hand, and on how best to solve it.
Data for a shared understanding
If data is siloed in different departments, the same data can be subject to different interpretations. For example, marketing may come to a different conclusion from the admissions team, so the response to that data often varies.
This can be beneficial, as diverse insight is valuable, but it can also dilute or fragment the overall response, making it difficult to agree on a cohesive course of action, or even on what the issue is in the first place, because you’ve got silos of people responding to their own interpretation of the challenge.
What we want to encourage through co-design is an agreed identification of the issue, followed by playing to the strengths of individual teams as part of a holistic response.
For example, recently, the university’s catering service found that regularly updating menus embedded in web pages was taking a lot of work and so they wanted to go back to using PDFs. But before making this change, we needed to understand what the impact would be on users.
To test how we could best design the menus, we ran an experiment: we replaced a single embedded menu with a PDF and monitored how that resource performed and how users interacted with it. The data showed a much lower engagement rate on that page and an increase in bounce rate, telling us that users were either not clicking through to the page at all or, if they were, exiting very quickly. To understand why, we spoke to student users, and as we predicted, they confirmed that the format of the page was affecting their decision to use it.
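To make the comparison concrete, here is a minimal sketch of the kind of before-and-after calculation involved. The session counts and the simple definitions of engagement rate and bounce rate are illustrative assumptions, not real figures from the experiment:

```python
# Hypothetical before/after analytics for the menu page experiment.
# All numbers are illustrative, not real university data.

def page_metrics(sessions, engaged_sessions, single_page_exits):
    """Return (engagement rate, bounce rate) as percentages of sessions."""
    engagement_rate = 100 * engaged_sessions / sessions
    bounce_rate = 100 * single_page_exits / sessions
    return engagement_rate, bounce_rate

# Baseline period: menu embedded directly in the web page.
before = page_metrics(sessions=1000, engaged_sessions=620, single_page_exits=310)
# Experiment period: the same menu served as a PDF link.
after = page_metrics(sessions=1000, engaged_sessions=270, single_page_exits=590)

print(f"before: engagement {before[0]:.0f}%, bounce {before[1]:.0f}%")
print(f"after:  engagement {after[0]:.0f}%, bounce {after[1]:.0f}%")
```

The point is not the arithmetic, which is trivial, but that this comparison alone cannot say *why* engagement dropped; that question is answered by talking to users.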
In this instance, data gave us a starting point for our investigation. It provided the groundwork for understanding what the issue was - lack of engagement with a resource – and helped us ask the right questions of users to be able to solve the problem.
It’s all about context
Earlier in my career, a lot of my work was about educating people on interpreting and understanding data analytics, because it’s very easy to make assumptions about people’s behaviour when they engage with a digital resource such as a webpage. To get a reliable understanding of what that behaviour means, a thorough understanding of the resource is also required.
Context is essential to reliable analytics.
A great thing about co-design is that it allows space for this contextual education. When assessing how well a product or service is doing, the first step is usually to look at the data. This can provide a quantitative assessment of whether there’s a problem or not, but it can’t tell us why. It’s often possible to make a very well-informed guess based on the information provided by the data, but a lot of the time there are user behaviours that won’t be explainable through data.
That’s where being able to understand analytics is essential. Part of this contextualising process is talking to students: we can analyse quantitative data that shows us which resources are being used and when, and then qualify this data by speaking to students. This helps us contextualise what we’re seeing and understand why these resources are being used in this way. From there it’s much easier to understand what changes need to be made to the resource or its delivery.
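As a rough sketch of the quantitative half of this, usage logs can be aggregated to show which resources are used and at what times of day. The resource names and log entries below are hypothetical:

```python
# Minimal sketch: aggregating page-view logs by resource and by hour,
# to see which resources are used and when. Entries are hypothetical,
# each a (resource, hour-of-day) pair.
from collections import Counter

views = [
    ("catering-menu", 12), ("catering-menu", 12), ("catering-menu", 12),
    ("library-opening-hours", 9), ("exam-timetable", 9),
    ("library-opening-hours", 17),
]

by_resource = Counter(resource for resource, _ in views)
by_hour = Counter(hour for _, hour in views)

print(by_resource.most_common(1))  # most-viewed resource with its count
print(by_hour.most_common(1))      # busiest hour with its count
```

A tally like this tells us *what* is happening (the catering menu peaks at lunchtime, say), but it takes a conversation with students to learn *why*, which is exactly the blend the co-design process is after.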
This is particularly helpful for digital services, but the process can help inform non-digital services as well.
The data gives us a starting point for this enquiry – it can tell us if there are problems or concerns, and where in the process or resource they are, but it’s then up to us as people to understand why. It’s very much a blended approach, and essentially it comes down to being humble enough to say, ‘I don’t know everything, and neither does the data,’ and work out a way to use both together, as a holistic approach to problem-solving.