SecurityBrief Asia - Technology news for CISOs & cybersecurity decision-makers
UK privacy watchdog 'deeply concerned' about live facial recognition
Wed, 17th Jul 2019

The UK's Information Commissioner Elizabeth Denham has cast a harsh light on how any organisation that uses facial recognition to identify people poses a threat to citizens' privacy, and why those organisations must comply with data protection laws.

The privacy watchdog says that any police force or organisation that uses live facial recognition (LFR) technology - in which crowds can be scanned and compared against databases for matches in mere seconds - is processing personal data.

South Wales Police and the Met Police are two organisations that have been trialling LFR technology. While police are generally trying to identify people linked to criminal activity, they are also processing biometric data belonging to thousands of innocent people.

“That is a potential threat to privacy that should concern us all,” says Denham.

The Information Commissioner's Office (ICO) has been monitoring how police use LFR trials. Police have been fully cooperative and the ICO understands the practical benefits of the technology, but there are still 'significant' privacy and data protection issues that are not being addressed.

Denham says she remains deeply concerned about LFR technology's rollout. She wants to see demonstrable evidence that the technology is necessary, effective and proportionate to the privacy intrusion it imposes.

“There is also public concern about LFR; it represents a step change from the CCTV of old. There is also more for police forces to do to demonstrate their compliance with data protection law, including in how watch lists are compiled and what images are used. And facial recognition systems are yet to fully resolve their potential for inherent technological bias; a bias which can see more false positive matches from certain ethnic groups.”

Tamara Quinn, a partner at international law firm Osborne Clarke, comments:

“There's a lot of excitement around the use of face recognition systems. While the potential benefits are considerable, businesses must also consider the risks that arise from deploying face recognition systems and take appropriate steps to comply with the law. Facial recognition and video surveillance are covered by a complex web of regulations which isn't easy to navigate, plus there is reputational risk if companies aren't seen to be taking privacy seriously.

“With the ICO promising to pay closer attention to private organisations that use facial recognition systems covering public areas, businesses should act now to ensure that their software doesn't break the law. This can include reassessing the use of external cameras overlooking the street, public parking or other communal spaces. As well as making sure that their systems comply with strict legal requirements, companies should be looking at their contracts with external suppliers of these systems, to make sure that they have strong legal protections in place.”

While the courts examine how to construct a framework that safeguards privacy - notably in the case of R v Chief Constable of South Wales Police - any force that aims to deploy LFR must consider a range of concerns. These include:

•    Carrying out a data protection impact assessment and updating it for each deployment, because of the sensitive nature of the processing involved in LFR, the volume of people affected, and the intrusion that can arise.
•    Producing a bespoke 'appropriate policy document' to cover the deployments; it should set out why, where, when and how the technology is being used.
•    Ensuring the algorithms within the software do not treat the race or sex of individuals unfairly.
•    Submitting data protection impact assessments to the ICO for consideration, as UK law enforcement organisations are advised to do, with a view to early discussions about mitigating risk.