What you need to know
Learning analytics refers to the measurement, collection, analysis and reporting of data about student progress and how the curriculum is delivered.
Students using digital resources and systems generate data that can be analysed to reveal patterns that predict success, difficulty or failure, enabling teachers - and students - to make timely interventions.
Importantly for this guide, these metrics can also support a more accurate, data-informed approach to curriculum design.
Already, colleges and universities are using dashboards (explained further in the dashboards section of our data visualisation guide) which provide at-a-glance evidence of resource usage and student engagement.
This makes possible a more fluid, dynamic response to curriculum development. Essentially, you don’t need to wait for a distant course review to make necessary changes.
Measures derived from this data can also support the institution’s submission for the Teaching Excellence Framework (TEF), be used in preparation for inspection or form part of routine quality assurance.
The benefits of analytics for staff, students and institutions are becoming clearer, but our research also argues for care over how data is interpreted and presented to staff and students. This is an area where we are still gaining understanding, particularly of the ethics of using student data.
Why analytics matter
With the benefit of analytics, large higher education institutions can spot differences in the amount of teaching and assessment used to deliver learning outcomes of the same credit value across different courses and disciplines. Equally, this kind of data analysis can reveal examples of over-teaching or over-assessment.
A brief overview of learning outcomes can reveal similar discrepancies. Often you find that some learning outcomes on modular courses are assessed multiple times in different parts of the course whilst others are not assessed at all.
Even a review of course information in a virtual learning environment (VLE) can be instructive. Imagine how useful it would be next time you are designing a unit of learning to know which learning activities have been used the most, which have resulted in high achievement, and which presented the greatest difficulty.
Access to this kind of information enables you to improve the design of your programme, course or unit of learning year on year - and to track the impact of any changes you make.
What the experts say
“There is an increasing number of studies using control groups that show that retention and other measures of student success can be positively influenced by the use of learning analytics.”
Niall Sclater, consultant in learning analytics
Be inspired: case studies
Ulster University – analytics improve achievement in higher education
Ulster University has used analytics to bring forward the timescale for re-approval of programmes which fall below the institutional or sector average. An example is benchmarking programme outcomes against the sector average for student employment.
The process may be prompted by analytics but it is still a positive one. Staff are encouraged to view any early re-approval and revalidation as a design opportunity that will bring improvements for themselves and their students.
“A new design is likely to bring about better results so it makes sense to concentrate on programmes identified by analysing this kind of data.”
Paul Bartholomew, pro-vice chancellor (education), Ulster University
Salford City College – analytics embed digital learning
A dashboard application at Salford City College illustrates at a glance how staff are supporting their students with digital learning. This data enables the college to suggest more cost-effective ways of delivering the curriculum.
One of the college’s seven strategic aims is to develop a quality and performance management system that measures the impact of digital learning, provides data that can feed into individual learning plans and enables early intervention when students are at risk.
The college is also considering setting key performance indicators for staff to inform the college’s continuing professional development (CPD) programme. Achievement rates from a badge system for digital capabilities are already collated and published for staff in the college’s VLE, Canvas.
“The college is still on a journey towards a fully digital approach to learning and teaching, but the potential is there. We have at least made a start by showing staff how analytics can make a difference to the way they work.”
Deborah Millar, formerly director for digital learning and information technology services, Salford City College
University of Huddersfield – analytics prevent assignment bunching
University of Huddersfield staff are asked to bring average marks from their assignments along to staff development workshops. This helps reveal patterns such as ‘assignment bunching’, when too many assignments are set in the same time period. Interrogating the data has also shown that marks achieved at the end of the year can be 10% lower than at other times.
The university’s work on analytics was aided by a project in 2013 which explored the use of assessment data to improve student performance. Its recommendation was that assessment analytics can play an important part in learning and teaching, but that students need support in interpreting data about their performance.
“Assessment and feedback is a highly emotive and therefore a sensitive issue for students…. Simply providing the data in the form of a dashboard is unlikely to be effective unless students are offered training in its interpretation and accessible strategies to act upon it.”
Evaluating the Benefits of Electronic Management (EBEAM) project report (pdf), University of Huddersfield
- Sign up for our learning analytics service, email firstname.lastname@example.org
- Check your code of practice against our code of practice for learning analytics
- Follow our blog on effective learning analytics
Watch our Digifest 2017 video on learner data and learning analytics: