On 23 January 2018 the House of Commons Science and Technology Committee took evidence from Elizabeth Denham, the UK’s Information Commissioner, on the use of algorithms in decision making. This inquiry’s published purpose is to examine the increasing use of algorithms in public and business decision making, to assess how algorithms are formulated, the scope for error or correction and the impact they may have on individuals – and their ability to understand or challenge decisions. Vanessa Barnett, Consultant Solicitor at Keystone Law, provides an account of Elizabeth Denham’s contribution to the inquiry.

Elizabeth Denham, as the Information Commissioner, is tasked with enforcing the General Data Protection Regulation (‘GDPR’) and giving guidance on how businesses can comply. Algorithms, machine learning and artificial intelligence are central to, and are becoming more important in, digital business. Easily one of the more challenging aspects of the GDPR is the set of rules around profiling and automated decision making which apply to the use of algorithms, machine learning and artificial intelligence. The Information Commissioner’s Office (‘ICO’) in the UK has been trying hard to understand and develop best practice in this area – see, for example, the ICO guidance ‘Big data, artificial intelligence, machine learning and data protection,’ which was updated in September 2017.

As ever, Elizabeth Denham showed that she is on top of the brief – overall, she believes that the GDPR gives her the tools to police compliance in this area, but she called for greater resources and expertise. In terms of the sectors where algorithms are most relevant and interesting (and where data subjects are at the most risk), she singled out health, housing, employment and justice. All are areas where an automated decision, without further human intervention, could have a material impact on a data subject. She also made the point that sector regulation sits alongside data protection law, and her view is that regulators need to be brought together so that all the various oversight roles put the data subject first.

Elizabeth Denham is most excited about the Centre for Data Ethics, and the ICO’s work with the Alan Turing Institute. The desire there is to create a framework for the explainability of algorithms, machine learning and artificial intelligence. Her view is that publication of training data is not necessary, but that businesses must publish sufficient information for the data subject to understand how an algorithmic conclusion has been reached.

One of the items Elizabeth Denham was keen to stress is that explainability is a general requirement under the GDPR, which sits within what she feels is the most important change in the GDPR: the obligation of accountability. She summarised the concept of accountability as: identify the risks to the individual, mitigate those risks, do the impact assessments, notify the individuals and live by the concept of privacy by design ‘baked in.’ She was keen to hammer home that this means “putting the citizen or customer at the centre.” In terms of how that translates to algorithms, Elizabeth Denham was positive and aspirational, but did not quite give the type of answer that lawyers feel they can rely on: she said she does not believe that opaque algorithms are impossible to explain, and that she would know a good explanation when she saw it. We all know that the ICO is a big fan of the layered notice, and this was again endorsed in this area.

The Committee then asked how the ICO would audit in this area. In terms of audits generally, Elizabeth Denham said the approach would be to start with “I want to see your commitment to data protection.” She elaborated: is it a boardroom issue? Is the DPO someone senior and independent? Where are the policies and training? Have there been breaches? Have there been impact assessments? She continued: if people are asking what a DPO is, if there is no access to the Board, if there is no training – then there is a problem. She said that this would signify that data protection was not “evergreen” or “embedded” within the business.

There were some questions about the auditability of algorithms, machine learning and artificial intelligence, and Elizabeth Denham readily admitted that this requires specialist resources (and that the ICO had started to look at secondments to get upskilled in this area). She felt that there is not necessarily a “shelf life” for an algorithm such that it needs review every X years; rather, review is an ongoing obligation and is very context specific.

What is very clear from Elizabeth Denham’s appearance is that the ICO is not bamboozled by algorithms, machine learning and artificial intelligence – it is excited by them. It might not be an expert yet, but it has intentions in that direction – so each business using tools in this area would be wise to focus on explainability and to dive more deeply into the ICO’s existing guidance. Got an internal ethics board yet?

This article was first published by Digital Business Lawyer

This article is for general information purposes only and does not constitute legal or professional advice. It should not be used as a substitute for legal advice relating to your particular circumstances. Please note that the law may have changed since the date of this article.