The Court of Appeal ruling in this case has been long awaited, following the decision at first instance in favour of the Chief Constable. The Claimant argued that the use of automatic facial recognition (“AFR”) technology was unlawful in that it interfered with his Article 8 right to respect for his private life. He also alleged that it breached the Data Protection Acts of 1998 and 2018.
Allowing the appeal in part, the court found that the use of the AFR system did constitute an interference with Article 8 rights and was not "in accordance with the law", but that its use could be proportionate if a sufficiently narrow local policy were framed.
AFR is a new technology being piloted by the police, to assess whether two facial images depict the same person. “AFR Locate” is used at large gatherings. The police hand out leaflets and advertise on social media so as to alert the public to the use of AFR. Prior to the event, police will upload and process a ‘watchlist’ of:
(a) persons wanted on warrants;
(b) individuals who are unlawfully at large;
(c) persons suspected of having committed crimes;
(d) missing persons;
(e) individuals whose presence at a particular event causes particular concern;
(f) persons simply of possible interest to police for intelligence purposes; and
(g) vulnerable persons.
The watchlist database of 400–800 people is converted into biometric data unique to each face. AFR Locate processes live CCTV footage to detect human faces, converts the images of those faces into biometric templates, and looks for a match with the templates on the watchlist. It can process up to 50 faces per second, and up to half a million faces can be scanned at a single event and compared with those on the watchlist.
If AFR Locate makes a match, it will alert its operator and display the two images for comparison. No action is taken unless the human operator is satisfied of the need for it; a decision is then made as to whether to approach the potentially matched person. If no match is detected, the software automatically deletes the images within 24 hours. The biometric template is deleted immediately, and the CCTV feed and the details of those matched are deleted after 31 days.
The interference with the Article 8 right was not in accordance with the law. The local South Wales Police ("SWP") policy left wide areas of discretion as to who could be placed on watchlists and where AFR might be deployed. As a result, the data protection impact assessment prepared by SWP did not meet the requirements of s.64 DPA 2018.
The Court also held that SWP had not done all that it reasonably could to fulfil the Public Sector Equality Duty (PSED) under s.149(1) of the Equality Act 2010. Because AFR is a novel and controversial technology, all forces intending to use it in future should satisfy themselves that everything reasonable has been done to ensure that the software does not have a racial or gender bias.
Proportionality is key: a fair balance has to be struck between the rights of the individual and the interests of the community. AFR Locate may therefore lawfully be used in future, so long as the police frame an appropriate policy narrowing those areas of discretion, undertake adequate data protection and equality impact assessments, and put in place a more detailed policy document.
SWP will now need to draft a compliant policy document which must take into account the provisions of the Data Protection Act 2018.
The data protection impact assessment prepared by SWP did not meet the requirements of s.64 DPA 2018, because it proceeded on the basis that Article 8 was not infringed. Similarly, the equality impact assessment prepared by SWP did not demonstrate compliance with s.149(1) of the Equality Act 2010.
Speaking for the South Wales Police, Chief Constable Matt Jukes welcomed the judgment, saying: “I am confident this is a judgment that we can work with”.
South Wales Police have already committed to adjusting their policies in line with the judgment and to working alongside the Surveillance Camera Commissioner, the Home Office, the College of Policing and the National Police Chiefs' Council to address the issues raised, so that they can continue to make use of live facial recognition and biometric technologies.
The use of facial recognition will continue to draw public interest, and there remains significant opposition to its use for any purpose. Liberty has called for a ban on the technology, with Megan Goulding, a lawyer for Liberty, commenting: "The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties … it is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned."
Police forces interested in deploying facial recognition face a difficult task in justifying the use of such technologies, balancing the potential benefits against the risks identified by the Court of Appeal and the potential to violate data protection, equality and human rights laws.
Steps that need to be taken are as follows:
- Draft a policy to narrow the breadth of discretion in relation to the selection of those on watchlists, and the locations where AFR may be deployed, so that AFR Locate may be said to be in accordance with the law;
- Undertake an adequate data protection impact assessment;
- Undertake an adequate equality impact assessment in compliance with s.149(1) EA 2010; and
- Put in place a more detailed policy document which complies with s.42(2) DPA 2018.
If you would like to know more about this matter, please speak to your contacts at Plexus Law:
Andrew Steel, Partner
T: 0161 245 7965 | M: 07557 232 419 | E: email@example.com
Amy Olive, Legal Assistant
T: 0161 244 6927 | E: firstname.lastname@example.org
Whilst we take care to ensure that the material in this Case Alert is correct, it is made available for information only, and no representation is given as to its quality, accuracy, fitness for purpose, or usefulness. In particular, the contents of this Case Alert do not give specific legal advice, should not be relied on as doing so, and are not a substitute for specific advice relevant to particular circumstances. Plexus Law accepts no responsibility for any loss which may arise from reliance on information or materials published in this Case Alert.