updated on 03 September 2019
The use of facial recognition software (FRS) in security and monitoring was thrust into the spotlight when the Mayor of London, Sadiq Khan, took issue with a London developer over its installation at a King’s Cross site. In this post we consider the privacy and data protection issues raised by integrating FRS into security systems, an issue currently before the courts.
Human rights group Liberty has commented on the recent dispute, describing it as “a disturbing expansion of mass surveillance”. The group has also brought a legal challenge against South Wales Police’s use of FRS to scan large crowds. The software used by the force works by mapping the facial contours of subjects and matching them against images retained on a ‘watch list’. This system brings matters of accuracy, discriminatory algorithms and personal data acquisition to the fore.
The Information Commissioner’s Office issued a statement on the police force’s use of FRS: “Legitimate aims have been identified for the use of live facial recognition. But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology.”
Accordingly, it is recommended that a data protection impact assessment be undertaken before any such software is used, that a bespoke policy document be drafted, and that algorithms be checked for bias.
R (Bridges) v South Wales Police
A key test case on the use of FRS is currently before the High Court in Wales. The case arises from South Wales Police’s use of FRS, the lawfulness of which the claimant, Ed Bridges, has questioned. The case has been heard and judgment is awaited.
Liberty, acting for Mr Bridges, challenges the practice on three grounds: that it breaches the right to respect for private life under Article 8 of the European Convention on Human Rights; that it fails to comply with data protection legislation; and that the force has not complied with the Public Sector Equality Duty under the Equality Act 2010.
Particular attention has been given to the watch list images against which FRS compares the facial maps it captures; notably, there are no guidelines as to how those images are sourced and stored. The Information Commissioner’s Office has intervened in the case to highlight this concern.
Developments
In spite of the pending legal case and concern from regulators, South Wales Police has moved forward with plans to equip officers with FRS software on mobile devices.
A series of significant questions arises in relation to the FRS programme, each of which may have a bearing on whether it is lawful.
All of these factors will bear on whether the use of FRS can be seen as a proportionate means of achieving the aim of preventing and detecting crime. They will also be significant in determining the risk that the potential invasiveness of FRS poses to the liberty of citizens. The determination of the Bridges case is therefore the most significant awaited ruling on the use of FRS to date: it is likely to define the scope of permissible use and to provide guidance for the integration of further developments in the technology.
Suneet Sharma runs the Privacy Perspective Blog and is an aspiring lawyer and legal writer.