Keeping personal data secure has been a legal requirement since the 1984 Data Protection Act. But that creates a paradox, because to secure the computers and networks that store and carry personal data, we need to collect additional personal data about how those systems are used and abused.
The General Data Protection Regulation (GDPR), which comes into effect in May 2018, at last recognises that paradox and provides a clear framework to resolve it. This is particularly important for research and education networks, such as the national Janet Network, which Jisc owns and operates.
Because one of our main roles is to support innovation, we can’t adopt the preventive approach of listing all known uses and blocking everything else as “presumed hostile”. Instead, we concentrate on identifying security problems quickly and dealing with them effectively. That requires us to record a lot of information about use of the network and services because, although most of it relates to legitimate use and will never be looked at, we can’t know in advance which may relate to malicious or risky activity that needs investigation.
What the law requires
The GDPR states that protecting network and information security is a legitimate interest of a range of organisations, including operators of networks and computer systems.
Relying on this legitimate interest to process personal data involves a three-step test: the interest must be legitimate, the processing must be necessary (there is no less intrusive way to achieve the interest), and the risk to individuals must not outweigh the benefit to the organisation. Unless all three steps are satisfied, the processing can't proceed.
There seems little doubt that protecting security is a legitimate interest. Not only is this now explicit in the law, but data protection regulators have recommended further clarifying and extending the security activities that are permitted.
Necessity might seem tricky to demonstrate when we know that most log entries will never be accessed. However, there is no way to determine in advance which events relate to malicious activity, or which users will be the targets of attacks, so we cannot record only those.
If, however, an organisation was to collect logs without having a process to use them, or to keep them beyond the point when they would have any value for investigation, then that would raise doubts about necessity. A consistent set of logs, with clear processes for when and how each of them will be used to protect and improve security, seems to satisfy the requirement for minimum intrusiveness.
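To make the retention point concrete, here is a minimal sketch of how a documented retention policy might be expressed in code. The log sources and retention periods are hypothetical examples, not recommendations; your own documented processes should determine the values.

```python
from datetime import datetime, timedelta

# Hypothetical retention policy: each log source keeps entries only as long
# as they remain useful for investigating security incidents.
RETENTION = {
    "firewall": timedelta(days=90),
    "webserver": timedelta(days=30),
}

def expired(source: str, entry_time: datetime, now: datetime) -> bool:
    """Return True if an entry from `source` is past its documented retention."""
    return now - entry_time > RETENTION[source]

now = datetime(2018, 5, 25)
expired("webserver", datetime(2018, 4, 1), now)  # → True (past 30 days)
expired("firewall", datetime(2018, 4, 1), now)   # → False (within 90 days)
```

A simple table like this, kept alongside the documentation of why each period was chosen, is also useful evidence that retention has been considered rather than left open-ended.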
Keeping these logs does create some risk to users of the network or system, not least that the log files themselves may be involved in a security breach. The balancing test requires us to ensure that the security benefits – most of which apply to all users – justify that risk.
Logs will contain personal data, so must be kept secure; however, the same files also contain information that could help someone attack systems, so we should be keeping them secure already. The risks involved in processing can be reduced by installing programs to do the initial inspection of logs to pick out events that need further investigation by humans. The sheer volume of data is likely to make this a practical necessity.
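The automated first pass described above can be sketched very simply: software scans every line, and humans only ever see the ones flagged for investigation. The line format and detection patterns below are invented for illustration; real deployments would use their own log formats and threat indicators.

```python
import re

# Hypothetical indicators of events needing human investigation.
SUSPICIOUS = [
    re.compile(r"authentication failure"),
    re.compile(r"port scan detected"),
]

def triage(lines):
    """Return only the log lines that match a known indicator."""
    return [line for line in lines if any(p.search(line) for p in SUSPICIOUS)]

logs = [
    "10:01 203.0.113.5 GET /index.html 200",
    "10:02 198.51.100.7 authentication failure for user alice",
    "10:03 203.0.113.5 GET /about.html 200",
]
flagged = triage(logs)  # only the authentication-failure line survives
```

The two benign lines, which relate to legitimate use, are never shown to a person at all: the machine does the intrusive reading.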
Finally, the risks can be reduced by only linking events to individual people after analysis has confirmed that they indicate a security problem. For the Janet Network, this analysis happens automatically. We don’t have access to the identities of individual users of the network, but we pass confirmed incidents on to the affected organisations to deal with individuals if necessary.
The GDPR recognises that this type of processing contributes to reducing risk. Having organisations deal with their own users and security problems also means we rarely need to share that information with others. However, we share other information about how to avoid, detect or resolve security problems: this increases the benefits that result from processing, so contributes to satisfying the balancing test.
What our members should do
Many of the same considerations will apply to the information security activities of our members and other organisations.
The most important thing is to ensure that your logs are fit for purpose. Check that you know, and have documented, when and how you’ll use them to prevent, detect and investigate security incidents. Do you have enough information? Are you keeping logs for the right length of time? Do you have everything you’ll need to interpret them? Are timestamps derived from a consistent, reliable source?
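Some of these fitness checks can themselves be automated. The sketch below assumes, purely for illustration, that each log entry carries an ISO 8601 timestamp with an explicit timezone; the checks for parseability, timezone and ordering are the kind of thing an exercise should verify for your own formats.

```python
from datetime import datetime

def check_timestamps(entries):
    """Return a list of problems found; an empty list means the log looks consistent."""
    problems = []
    previous = None
    for i, entry in enumerate(entries):
        try:
            ts = datetime.fromisoformat(entry["time"])
        except (KeyError, ValueError):
            problems.append(f"entry {i}: missing or unparseable timestamp")
            continue
        if ts.tzinfo is None:
            # Without a timezone, events can't be correlated across systems.
            problems.append(f"entry {i}: timestamp has no timezone")
            continue
        if previous is not None and ts < previous:
            problems.append(f"entry {i}: out of order")
        previous = ts
    return problems

good = [{"time": "2018-05-25T10:00:00+00:00"},
        {"time": "2018-05-25T10:05:00+00:00"}]
check_timestamps(good)  # → []
```

Running checks like this routinely, rather than discovering the gaps during a real incident, is exactly what the exercises in the next paragraph are for.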
Exercises are an ideal way to test this. Take a typical incident report you might receive and work through what your documented processes dictate. This can identify gaps where you need to collect more information or, conversely, reveal that you are keeping information whose likely benefit is too low to justify the risk of collecting and storing it.
Review how much of your security processing can be done with pseudonyms such as IP addresses. The GDPR recognises the benefits of having a separate process to link these to individuals, and making that linking the last step in an investigation, possibly subject to separate approval, helps to protect privacy and comply with the law.
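One common way to work with pseudonyms during analysis is a keyed hash: the same address always maps to the same token, so patterns remain visible, but linking a token back to a person requires the key and whatever approval process guards it. This is a minimal sketch under that assumption; real key management and re-identification controls are deliberately omitted.

```python
import hashlib
import hmac

# Hypothetical key: in practice this would be generated and held under
# the organisation's key-management and approval processes.
SECRET_KEY = b"replace-with-a-properly-managed-key"

def pseudonymise(ip: str) -> str:
    """Stable pseudonym for an IP address: same input, same token."""
    return hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256).hexdigest()[:16]

token = pseudonymise("203.0.113.5")
```

Analysts can count, correlate and alert on tokens; only a confirmed incident triggers the separate, logged step of resolving a token to an identity.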
Make sure your systems, tools, processes and people all reflect the importance of keeping security-sensitive information secure. Access to logs, either directly or via security tools, should be restricted to authorised, trained people. And always be on the look-out for opportunities to automate processes: reducing the need for humans to look at personal data improves data protection and saves time and effort.
Finally, when you are fixing your own security problems, think how you might use what you learn to help others within your organisation and the wider community. Sharing the benefits of your security activities makes us all more secure, and more compliant with data protection law.