The University of Gloucestershire's associate director of library technology and information (LTI) and university librarian asks: “If you’re responsible for teaching or mentoring students, how do you know if they’re embracing student life, living well and staying on track with their learning? Is it enough to simply ask them? What if they just say ‘I’m fine’?”
At the University of Gloucestershire, learning analytics is helping staff dig below the surface for answers. The university wanted staff to have high-quality information about students’ engagement with their learning so they could have better conversations with students about their progress, and it wanted to give students good-quality feedback that helps them take more control of their own learning.
For these reasons, the university has started on a learning analytics programme that’s putting student data under the spotlight and enabling staff to spot patterns and anomalies that can be investigated.
As a result, they’re starting to have better-directed conversations that might make the difference between a student failing and a student flying:
“Basically, a student says ‘I’m fine’ and the tutor says ‘Well, not really’ and shows them the graph.”
That graph, created via the learning analytics solution, might show their attendance at lectures has dropped off, that they’re not using the virtual learning environment (VLE) very much or they haven’t handed in an assessment on time since the start of term.
Pioneers of learning analytics
The University of Gloucestershire was one of the first to get involved in our learning analytics project. It’s one of the group of universities and colleges that has been working with us since 2016 to co-design our learning analytics service, shaping and piloting analytics tools that are built around the needs of higher and further education.
The group is also helping us develop flexible, ongoing support that we can provide to help institutions prepare their staff, students and digital infrastructure and then implement their own learning analytics programmes.
Learning analytics has the power to transform teaching and learning
After the initial co-design phase, the University of Gloucestershire began piloting its own learning analytics project in three schools during 2018-19.
Jisc’s learning analytics specialists worked with the university to brief and train key staff and make sure its digital infrastructure was in good shape for a learning analytics project.
We worked with the university to get its data cleaned and flowing effectively into the learning data hub, a cloud-based storage system that’s at the heart of the Jisc learning analytics service. Because the hub is hosted in the UK and EU, it’s a place where institutions can keep their data secure while staying compliant with the General Data Protection Regulation (GDPR).
Learning analytics also requires staff and students to work in new and different ways, so effective communication is a vital and ongoing part of any programme. We helped the university create a clear policy describing what data will be collected, why, and how it will be used. (In Gloucestershire’s case, the data sources are VLE use together with background and achievement data from the student record system, as well as library and attendance data.)
We also helped to put together a student guidance document describing the legal and ethical safeguards that are in place, and this makes it clear that analytics is being used solely to improve learning experiences and support better outcomes; it will never be used for assessment or in a punitive way.
Embedding learning analytics
The university runs learning analytics workshops and events for staff and students and it sends out regular email updates. We’ve supported the communications plan by visiting the university to talk to staff and students. For example, we demonstrated how the data explorer dashboards deliver quick visualisations that show patterns in the engagement data.
This is all part of a detailed communications and development plan to make sure people are comfortable about the programme and have the skills and confidence to take part.
With strong support from the top, the university has made a good start with embedding learning analytics.
“The data created by learning analytics give us insights into when, where and how students are engaging – or not engaging – with the various activities and services that support their learning. We hope that will enable students to track and reflect on their own style, pattern and pace of learning.
We also hope it will enable staff to identify more quickly and accurately patterns of engagement where some students are learning well whereas others are showing signs of disengaging, so that we can provide the right support for each student to help them succeed,”
Stephen Marston, vice-chancellor, University of Gloucestershire
Giving students more control over their own learning
Study goal records 2,000 unique student users per day
The University of Gloucestershire sees attendance as a vital marker for engagement, so they’ve made study goal the preferred choice for students to sign in to lectures and classes.
Study goal is our student app, designed to make it easy for students to register attendance as well as to benchmark their performance against peers and set their own goals, so they can take more control over their own learning.
During our visits to the campus we ran sessions showing students how to use it to make sign-in simpler.
By the end of the 2019 summer term study goal was recording 2,000 unique student users per day; at the same time, the Jisc data explorer tool was recording 150 unique data explorer users per day, out of around 300 staff who teach regularly and perform the role of personal tutor.
Staff got to grips quickly with identifying early warning signs and began to get in touch with students to talk about additional support. Staff and students are offering some positive feedback:
“The checking in process is much smoother/faster”
“…it gives us a guide on how we’re progressing and how much we’re engaging. Whether the score given on the study goal app is a good or bad one, personal tutors can still assist in improving this and encourage the student to reach their full potential”
“Students seem very happy to engage with it and some are already becoming competitive about the points they’ve earned on the app”
Adrian Long, academic subject leader: religious, philosophical and historical studies
“The user stats are encouraging,”
says James Hodgkin.
“But there’s always a need for more communication because some people still don’t know about it or have the wrong idea. Comms will never be ‘done’.”
Even so, there’s evidence that the simple act of getting students to ‘check in’ boosts attendance.
In the pilot schools, uptake of the check-in facility has been running at 80-90% which, James says, is “major”. And, anecdotally, staff have been noting improvements in attendance.
“We have fully integrated learning analytics within our tutor portal, allowing our personal tutors and module tutors to benefit from bespoke data to inform their critical work in supporting our learners.
Students often find it hard to calibrate how they are doing in real time and learning analytics ensures we intervene at a point when changes can make a real difference to outcomes”
David James, dean of academic development
RAG ratings: impact of attendance on attainment
“We’re starting to drill into exactly how much impact attendance has on attainment. While almost everyone agrees that there is a positive impact, we want to quantify it.”
The university is also working on the RAG (red, amber and green) ratings in data explorer to make sure they’re accurate and robust, so staff are confident they have strong evidence when they meet with students.
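To make the idea concrete, a RAG rating of this kind is typically a simple mapping from a combined engagement score to a traffic-light flag. The sketch below is purely illustrative: the weights, thresholds and signal names are assumptions for the sake of the example, not the actual algorithm used in Jisc's data explorer.

```python
# Hypothetical sketch of how a RAG (red/amber/green) rating could be
# derived from engagement signals. Weights and thresholds are
# illustrative assumptions, not Jisc's actual data explorer logic.

def engagement_score(attendance_rate, vle_activity_rate, on_time_rate):
    """Combine three engagement signals (each in 0.0-1.0) into one score."""
    weights = {"attendance": 0.5, "vle": 0.3, "assessment": 0.2}
    return (weights["attendance"] * attendance_rate
            + weights["vle"] * vle_activity_rate
            + weights["assessment"] * on_time_rate)

def rag_rating(score):
    """Map a 0.0-1.0 engagement score to a traffic-light flag."""
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "amber"
    return "red"

# Example: strong attendance, but low VLE use and late submissions.
score = engagement_score(0.9, 0.2, 0.3)  # 0.45 + 0.06 + 0.06 = 0.57
print(rag_rating(score))                 # prints "amber"
```

Tuning the weights and thresholds against real attainment outcomes is exactly the kind of calibration work the university describes: the flag is only useful in a tutor conversation if it reliably reflects genuine disengagement.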
They’re also looking at ways to feed in more data. For example, adding reading lists could provide a more nuanced picture: if students are reading recommended texts at the appropriate time, this shows they’re engaging with their learning even when their attendance is patchy for what might be wholly justified reasons.