Facial recognition and the law – are we one step closer to a sci-fi dystopian future?

September 16, 2019

In 2019, facial recognition is a common feature of private technology, but what does the law say about using it in public spaces?

The law on using facial recognition and other biometric technologies in public spaces has survived its first legal challenge, after the High Court rejected a judicial review on 4th September 2019.

The judicial review was brought by Mr Edward Bridges, from Cardiff, against South Wales Police’s use of facial recognition technology at several events and locations since 2017.

Mr Bridges alleged that his image was captured (but not stored) on two of these occasions, and that this amounted to breaches of the Data Protection Act 2018, the GDPR and his human rights.

Facial recognition does not breach data protection law

In Mr Bridges’ case, the High Court did not find breaches of the Data Protection Act, nor of the GDPR. It reasoned that while the captured images did amount to biometric data – a category requiring particularly careful processing – that processing was strictly necessary for law enforcement purposes.

South Wales Police was able to justify the use of this technology with the following evidence:

  • The rollout of the technology was entirely transparent and had significant public engagement and consultation beforehand,
  • The facial recognition technology was used only for a limited time and over a limited footprint, and
  • The technology was only used at events that had previously been the scenes of disorder, bomb hoaxes and/or violent protests.

Because South Wales Police had so carefully considered the use of facial recognition technology before deciding to go ahead with the project, it was clearly better placed to justify its position when challenged under the law. This consideration is likely to have taken place as part of the Force’s Data Protection Impact Assessment (DPIA) prior to starting the facial recognition project.

The Information Commissioner’s Office (ICO), the UK’s independent authority that upholds information rights in the public interest, has extensive guidelines for how to conduct a DPIA. This ensures that projects which may have a high risk to public data protection interest are properly considered prior to starting.

Clearly, if South Wales Police had not been so careful in its consideration of using facial recognition technology, the High Court may have ruled the other way.

… but what about Human Rights?

As part of his case against South Wales Police, Mr Bridges also claimed that his human rights had been breached – specifically, his right to respect for private life under Article 8 of the European Convention on Human Rights.

He claimed that this right had been infringed because the use of facial recognition technology was neither in accordance with the law, nor necessary or proportionate in the instance of its use.

While the High Court did agree that the use of the technology interfered with his Article 8 rights, it found that South Wales Police’s use of the technology was necessary and proportionate, for many of the reasons we’ve outlined above.

As such, the use of facial recognition technology in this case met the requirements of the Human Rights Act because the actions were subject to sufficient legal controls.

What does this ruling mean for other cases?

This ruling goes to show that even law enforcement and crowd protection at big events require thorough consideration and appreciable preparation – all to keep us safe. It also demonstrates the level of protection that completing a good DPIA can afford a business or institution with potentially big data protection risks.

While this case does set a good precedent for the continued use of advanced technologies in public spaces, the field is certainly not without its detractors and challengers, and it may be that Mr Bridges will decide to take his case to the Court of Appeal.

Fortunately for us, it also looks like the use of facial recognition technology for targeted advertising – as depicted in the 2002 Tom Cruise film Minority Report – has been kept at bay for a little while longer.

No two cases of data protection are the same. If you’re concerned, call us on 0333 2400 944 for a free 15 minute consultation or use the contact form below.

Importantly, Amgen Law has provided this Insights article for information only and nothing in it should be construed as legal advice. However, if you would like to discuss any of these issues further, or a legal matter that is affecting you, please get in touch with us directly using the form below.

    Note: a version of this Insights article was published on LinkedIn pulse on 16th September, 2019, written by Jonty Gordon of Amgen Law.