A Google search for ‘improving student experience’ brings up around 30,000 hits. When I did one recently, the front page showed items in scholarly journals and national news media, on Jisc’s own website and on those of several UK universities. This is a subject that really matters, especially as competition for students becomes more and more intense.
I’m working with Jisc on a number of projects focused on learning analytics and ways to use data about students and their activities to help universities and colleges improve educational processes and provide better, more timely support to learners. This aims to boost attainment and enable students to get more from their time in education.
But any programme of learning analytics involves collecting data, some of it sensitive, so institutions that opt to go down the learning analytics route must be sure to do so in a completely transparent way, working to clear policies and with specific objectives in mind.
We’ve put together a code of practice on learning analytics, clarifying the responsibilities that universities and colleges have when they embark on learning analytics. It provides guidance on the key legal, ethical and logistical issues that they are likely to face.
My top ten tips
Here are my top ten tips to help institutions stay safe when using learning analytics to improve the experience of their own students:
Consult learners and staff
You’ll need their buy-in, so consult learners and staff on the objectives, design, development, roll-out and monitoring of learning analytics.
Be clear about responsibility
Decide who has overall responsibility for the legal, ethical and effective use of learning analytics.
Allocate specific responsibility for data collection, anonymisation (when necessary), analytics processes, resulting interventions, and stewardship of all relevant data.
Be open and transparent
Define your objectives and be clear on what data is necessary to achieve them.
Make sure staff and students understand the data sources, the purposes of the analytics, the metrics, who has access to the analytics, the boundaries around usage, and how the data will be interpreted. Give clear information on the processes involved or make the algorithms transparent to them.
Ask for consent when necessary
Students should normally be asked for their consent to personal interventions arising from your analytics, either during enrolment or subsequently. Sometimes, though, legal, safeguarding or other circumstances mean they can’t opt out of such interventions; if so, those circumstances must be clearly stated and explained.
If the institution's existing arrangements don’t cover new learning analytics projects it may be necessary to carry out privacy impact assessments and to obtain additional consent. If a student doesn’t consent, any potential adverse consequences of opting out must be explained clearly.
Take a rigorous approach to privacy
Access to student data and analytics must be restricted to those with a legitimate need to view them.
Where data is to be used anonymously, make sure you can prevent:
- Identification of individuals from metadata
- Re-identification of individuals by aggregating multiple data sources
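One common way to test for the re-identification risk described above is a k-anonymity check: any combination of quasi-identifying attributes shared by fewer than k individuals could allow someone to be singled out by combining data sources. The following is a minimal sketch, not a complete anonymisation process; the field names and the value of k are purely illustrative.

```python
from collections import Counter

def k_anonymity_violations(records, quasi_identifiers, k=5):
    """Return combinations of quasi-identifier values shared by fewer
    than k records; such small groups risk re-identification."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return {combo: count for combo, count in groups.items() if count < k}

# Illustrative records: age band, postcode area and course could each
# seem harmless alone, but together may identify a single student.
records = [
    {"age_band": "18-21", "postcode_area": "BS1", "course": "Physics"},
    {"age_band": "18-21", "postcode_area": "BS1", "course": "Physics"},
    {"age_band": "22-25", "postcode_area": "CF10", "course": "History"},
]

violations = k_anonymity_violations(
    records, ["age_band", "postcode_area", "course"], k=2
)
# The lone History student forms a group of one, below the k=2 threshold.
```

Records flagged this way would need further generalisation (for example, broader age bands) or suppression before release.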
If you’re using ‘sensitive data’ as defined by the Data Protection Act (DPA) 1998, you’ll require additional safeguards, and possibly additional consent, before granting requests from external bodies to share data.
Use valid data and analytics processes
Monitor the quality of your data and analytics processes so that students are comfortable with learning analytics and understand that the process is beneficial.
Make sure that:
- Inaccuracies are identified and minimised
- Implications of incomplete datasets are understood
- The optimum range of data sources is selected
- Spurious correlations are avoided
All algorithms and metrics should be understood, validated, reviewed and improved by qualified staff.
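A first step towards understanding the implications of incomplete datasets is simply measuring completeness before any analytics run. Here is a minimal sketch of such a check; the field names and example records are invented for illustration.

```python
def completeness_report(records, required_fields):
    """Report the fraction of records missing each required field,
    so gaps in the dataset are visible before analysis begins."""
    total = len(records)
    report = {}
    for field in required_fields:
        missing = sum(1 for r in records if r.get(field) in (None, ""))
        report[field] = missing / total
    return report

# Illustrative student activity records with a missing VLE login count.
records = [
    {"student_id": "s1", "vle_logins": 12, "attendance": 0.9},
    {"student_id": "s2", "vle_logins": None, "attendance": 0.7},
]

report = completeness_report(records, ["vle_logins", "attendance"])
# Half the records lack VLE login data; attendance is fully populated.
```

A report like this helps qualified staff decide whether a metric built on a sparsely populated field should be trusted at all.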
Give students access to their data
Enable students to access the learning analytics performed on their data easily, and to obtain copies of this data in a portable digital format.
They have a legal right to correct inaccurate personal data held about them, and they should normally be able to view the metrics and labels attached to them.
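‘Portable digital format’ in practice often means something as simple as a JSON export of everything held about one student. The sketch below illustrates the idea; the record structure is an assumption, not a real system's schema.

```python
import json

def export_student_data(student_id, records):
    """Gather all analytics records held about one student and
    serialise them as JSON, a portable digital format."""
    held = [r for r in records if r["student_id"] == student_id]
    return json.dumps({"student_id": student_id, "records": held}, indent=2)

# Illustrative analytics records for two students.
records = [
    {"student_id": "s1", "metric": "risk_score", "value": 0.3},
    {"student_id": "s2", "metric": "risk_score", "value": 0.8},
]

payload = export_student_data("s1", records)
# payload contains only s1's records, ready to hand to the student.
```

Crucially, the export includes the metrics and labels generated by the analytics, not just the raw data collected.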
Enable positive interventions
Be clear on what circumstances would cause your institution to offer additional support. Similarly, students may have obligations to act on the analytics presented to them – if that’s the case, communicate the fact clearly to them.
Minimise adverse impacts
Analytics can never give a complete picture of an individual’s learning and may sometimes ignore personal circumstances. So take steps to ensure that trends, norms, categorisation or any labelling of students doesn’t create bias, reinforce discriminatory attitudes or increase social power differentials.
Design analytics systems and interventions carefully to ensure that:
- Students maintain appropriate levels of autonomy in decision making relating to their learning
- Opportunities for ‘gaming the system’ are minimised
- Students don’t respond to the monitoring of their activity by refusing to engage
- There aren’t adverse impacts from giving students and staff information about performance and likely attainment
- Staff have a working understanding of legal and ethical practice
Steward data well
Data for learning analytics must comply with existing institutional data policies and the DPA and should be:
- Kept to the minimum necessary to deliver your objectives
- Processed in the European Economic Area (EEA) or, if elsewhere, only in accordance with the DPA
- Retained only for appropriate, clearly defined periods
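‘Clearly defined periods’ implies retention rules an institution can actually enforce. As a minimal sketch, a retention schedule can be expressed as data and applied mechanically; the record types and periods below are invented examples, and real periods must come from institutional policy and legal requirements.

```python
from datetime import date, timedelta

# Illustrative retention schedule; actual periods are a policy decision.
RETENTION_PERIODS = {
    "vle_activity": timedelta(days=2 * 365),
    "attendance": timedelta(days=365),
}

def is_expired(record_type, collected_on, today):
    """True if a record has passed its defined retention period."""
    return today - collected_on > RETENTION_PERIODS[record_type]

def purge(records, today):
    """Keep only records still within their retention period."""
    return [
        r for r in records
        if not is_expired(r["type"], r["collected_on"], today)
    ]

records = [
    {"type": "attendance", "collected_on": date(2013, 1, 1)},
    {"type": "attendance", "collected_on": date(2014, 6, 1)},
]

kept = purge(records, date(2014, 9, 1))
# The 2013 record is past its one-year period and is dropped.
```

Running a purge like this on a schedule turns the retention policy from a statement of intent into routine practice.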
Remember that students can request that any personal data used for, or generated by, learning analytics is destroyed or anonymised, with the exception of certain clearly specified data fields required for educational or statutory purposes, such as grades.
Keep up to date
You can find out more about the work we are doing to support institutions with their learning analytics – and in particular about our new free basic learning analytics system – on our effective learning analytics project page.