On October 31, 2019, the UK Data Protection Authority, the Information Commissioner's Office (ICO), published an opinion on the use of live facial recognition ("LFR") by law enforcement: The use of live facial recognition technology by law enforcement in public places (the "Opinion").
The ICO recommends a statutory and binding code of practice, issued by government and modelled on the Surveillance Camera Code.
The Opinion is primarily for police forces and other law enforcement agencies using live facial recognition technology (LFR) in public spaces, explaining how to comply with the provisions of the DPA 2018. It may also be a useful resource for those with an interest in the capabilities of LFR technology and its potential applications for law enforcement. Opinion at 4.
The ICO explains how LFR is used by law enforcement authorities (How is LFR technology used by law enforcement in public places? Opinion at 6), what the legal requirements for the use of the technology are (What are the legal requirements under Part 3 of the DPA 2018? Opinion at 7), and when LFR is lawful (What does ‘lawful’ mean in this context? Opinion at 9). Particular focus is given to the basis for processing:
In any case, law enforcement organisations will always need to articulate their lawful basis for processing in a sufficiently clear, precise and foreseeable manner to be able to justify the processing. Opinion at 9.
Consent is not a viable basis because it is “highly unlikely that individuals, including those not on a watchlist, will be able to provide valid consent.” Opinion at 10. The ICO “therefore expects the police and other law enforcement bodies to rely on s35(2)(b), ie, that the processing is ‘necessary for the performance of a task carried out by a competent authority’.” Opinion at 11.
The ICO recommends privacy by default and by design (Opinion at 13) and reminds the authorities that a Data Protection Impact Assessment (DPIA) is required here: LFR always involves “sensitive processing,” and a DPIA is required when “processing is likely to result in a high risk to the rights and freedoms of individuals.” Opinion at 14.
The ICO also reminds authorities of the need for this processing to be strictly necessary and of the need to be mindful of proportionality (Id.) and effectiveness (Opinion at 16). The ICO also highlights the need to eliminate inherent bias in LFR (“The Commissioner remains concerned about the potential for inherent technical bias in the way LFR technology works.” Opinion at 19).
The ICO points out that
an appropriately governed, targeted and intelligence led deployment of LFR may meet the threshold of strict necessity for law enforcement purposes. An example is where LFR is used to locate a known terrorist suspect or violent criminal in a specific area. Such a targeted use for those kinds of significant law enforcement purposes is likelier to be proportionate to the potential intrusion into individuals’ rights and freedoms. In contrast, the blanket, opportunistic and indiscriminate processing, even for short periods, of biometric data belonging to thousands of individuals in order to identify a few minor suspects or persons of interest is much less likely to meet the high bar contemplated by the DPA 2018. In the Commissioner’s Opinion, this is particularly the case if the offences are low level and there may be other less privacy intrusive options available. Opinion at 21.
For more information: Francesca Giannoni-Crystal