What can colleges do to protect homeworking students from harmful online content?
Like thousands of other students, 16-year-old Ryan is studying remotely while his college campus is shut during lockdown.

This fictitious student is researching the Tudor period and wants to know the story behind Anne Boleyn’s famed decapitation. When he searches for “beheading”, up pops a string of unsavoury links referencing modern-day ISIS videos and a detailed eye-witness description of an execution.
Such potentially traumatic discoveries would be avoided if Ryan were working on campus, where web filtering protects him from accessing harmful, inappropriate and illegal content.
But while almost the entire student cohort is studying from home, can colleges still fulfil their duty under Ofsted’s safeguarding guidelines to prevent learners from stumbling across illegal content?
The short answer is yes, provided they use college-owned laptops. Things are more complicated, and less certain, if learners work on their own devices, particularly hand-held ones.
Now, as the government prepares to distribute laptops to disadvantaged college learners, I have pulled together these web filtering guidelines for the colleges that will be responsible for allocating the devices. Following them is especially pertinent at this time, as disadvantaged learners are more likely to be vulnerable.
What should be blocked?
For a start, any filtering solution must include block lists from both the Internet Watch Foundation (IWF) and the Counter Terrorism Internet Referral Unit (CTIRU), and the software manufacturer should be a member of the IWF.
Besides the mandatory categories covering illegal content, filtering solutions for under 18s should also provide non-mandatory categories, such as gambling, pornography and social media, which can be selected from a list by the institution’s administrator. This capability means administrators can very easily set up policies differentiating between what is appropriate for staff and for students, and even between students of varying ages or taking different subjects.
Administrators should also be able to set allow and deny rules manually, which provides fine-tuning: for example, allowing YouTube, which contains many educational clips, while blocking gaming and live-feed sites.
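To illustrate how category policies and manual overrides might combine, here is a minimal Python sketch; the role names, domains and category lookup are purely hypothetical, not features of any particular filtering product:

```python
# Purely illustrative: real filtering products expose these settings through
# an admin console, not code. All names and domains here are hypothetical.

BLOCKED_CATEGORIES = {
    "student_under_18": {"gambling", "pornography", "gaming", "live-feed"},
    "staff": {"gambling", "pornography"},
}

ALLOW_LIST = {"youtube.com"}          # allowed despite its category
DENY_LIST = {"blocked-game.example"}  # placeholder domain

def is_blocked(domain: str, category_of: dict, role: str) -> bool:
    """Manual allow/deny rules take priority; otherwise fall back to
    the category the vendor assigns to the domain."""
    if domain in ALLOW_LIST:
        return False
    if domain in DENY_LIST:
        return True
    return category_of.get(domain) in BLOCKED_CATEGORIES.get(role, set())

categories = {"youtube.com": "video", "bet.example": "gambling"}
print(is_blocked("bet.example", categories, "student_under_18"))  # True
print(is_blocked("youtube.com", categories, "student_under_18"))  # False
```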
Similarly, picking a solution that can filter search terms means, for example, that words such as Essex and Sussex can be allowed while the word sex on its own is blocked.
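A word-boundary match is one simple way a product might implement this; the short Python sketch below is illustrative only:

```python
import re

# Illustrative only: the standalone word "sex" is blocked, but place names
# such as Essex and Sussex, which merely contain those letters, are allowed.
BLOCKED_TERMS = [re.compile(r"\bsex\b", re.IGNORECASE)]

def search_is_blocked(query: str) -> bool:
    return any(pattern.search(query) for pattern in BLOCKED_TERMS)

print(search_is_blocked("history of Essex and Sussex"))  # False
print(search_is_blocked("sex"))                          # True
```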
Reporting activity
Filtering solutions for under 18s must also include the ability to report suspect internet activity, usually through an email alert to named staff, which is particularly useful if devices are used off campus. What the receiving member of staff does with the alert should be outlined in the institution’s safeguarding policy.
As a minimum, reporting should include user credentials, and the time and the type of activity, such as requests to reach blocked websites. Ideally, the identity and location of the device should be reported, too.
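As an illustration of the kind of record such an alert might carry, here is a short Python sketch; the field names are assumptions rather than any product’s actual schema:

```python
from dataclasses import dataclass, asdict
from datetime import datetime

# Hypothetical alert record: the minimum is who, when and what; device
# identity and location are the 'ideally' extras mentioned above.
@dataclass
class FilterAlert:
    username: str        # user credentials
    timestamp: datetime  # time of the activity
    activity: str        # type of activity, e.g. request to a blocked site
    url: str
    device_id: str = ""  # ideally: which device...
    location: str = ""   # ...and where it was

alert = FilterAlert(
    username="r.example",
    timestamp=datetime.now(),
    activity="request to blocked website",
    url="blocked-site.example",
    device_id="LAPTOP-0042",
)
print(asdict(alert))  # the content that would be emailed to named safeguarding staff
```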
Public networks
During lockdown, learners will almost certainly be using public internet service providers such as Virgin Media or BT. These providers do filter illegal content based on the IWF and CTIRU lists, but privately owned machines aren’t usually as well supervised and are more susceptible to users finding ways around these controls.
Bearing in mind the limitations above, it would be sensible to install web filtering software directly on college-owned laptops used by learners, whether they are studying on or off site. The alternative is a cloud-based service through which all internet traffic can be directed from any device.
What about hand-held devices?
It’s far more difficult to control access to illegal or inappropriate content on a smartphone or tablet, because these devices can connect to the internet through means other than a browser.
Colleges that choose to provide tablets to learners should consider using mobile device managers, which allow the owner (the college) to determine which apps can be installed.
This gives control over apps that access the device’s GPS location, camera or contacts list, and which could be used to groom, stalk, harass or bully.
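The sketch below illustrates the sort of allow-list check an MDM enforces; the package names and permission labels are hypothetical:

```python
# Hypothetical sketch: only college-approved apps may be installed, and any
# sensitive permission beyond those already signed off blocks the install.

APPROVED_APPS = {
    "com.college.vle": {"network"},
    "com.example.notes": {"network", "camera"},  # placeholder package names
}

SENSITIVE_PERMISSIONS = {"location", "camera", "contacts"}

def can_install(package: str, requested: set) -> bool:
    if package not in APPROVED_APPS:
        return False
    extra = requested - APPROVED_APPS[package]
    return not (extra & SENSITIVE_PERMISSIONS)

print(can_install("com.example.notes", {"network", "camera"}))    # True
print(can_install("com.unknown.game", {"location", "contacts"}))  # False
```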
Policy and procedure
Colleges have no control over whether web filtering software is in place on devices privately owned by students or their family members and used for study off-campus.
On campus, a bring your own device (BYOD) policy should cover ‘acceptable use’, alongside more specific acceptable internet use and security policies, which users must agree to before using college facilities. Such policies and guidance could be amended to cover any use of institutional systems, or any use for institutional purposes, even off site; and, at a time of public stress, good behaviour online is to be encouraged.
If the web filtering system flags an incident, perhaps multiple attempts to access banned sites, there must be an agreed process for dealing with such problems. When is it acceptable to investigate an issue, and how will that investigation be conducted?
Finally, while some of these measures might seem a bit ‘big brother’, filtering solutions are not about catching people out. Rather, the aim is to protect users and help them avoid accidental access to material that could prove traumatic.
Remember that requests to look at banned content are not always deliberate; they may result from pop-ups, malicious emails, misdirected links, or simply typos.
Jisc’s web filtering framework will be updated by early summer 2020, with a more diverse and comprehensive offering. Find out more about our cyber security services.