On 18 June the ICO published a blog post and opinion on the use of live facial recognition technology (LFR) in public places. These apply to both public and private sector organisations, but do not cover the use of LFR for law enforcement purposes. The opinion identifies the following key issues with LFR:
- the governance of LFR systems, including why and how they are used;
- the automatic collection of biometric data at speed and scale without clear justification;
- a lack of choice and control for individuals;
- transparency and data subjects' rights;
- the effectiveness and the statistical accuracy of LFR systems;
- the potential for bias and discrimination;
- the governance of watchlists and escalation processes;
- the processing of children's and vulnerable adults' data; and
- the potential for wider, unanticipated impacts for individuals and their communities.
The opinion notes that it is not the ICO's role to ban technology, but to raise awareness of data protection law, explain how it applies, and monitor and enforce compliance. It summarises the law and how it applies to LFR, then sets out the key requirements for controllers:
- The controller must identify a specified, explicit and legitimate purpose for using LFR in a public place.
- The controller must identify a valid lawful basis and meet its requirements.
- The controller must identify and meet conditions for processing special category data and criminal offence data.
- The use of LFR must be necessary and should be a targeted and effective way to achieve the controller’s purpose.
- The controller must demonstrate that they cannot achieve their purpose by using a less intrusive measure.
- The use of LFR must be proportionate.
- The LFR system should be technically effective and sufficiently statistically accurate.
- The controller should address the risk of bias and discrimination and must ensure fair treatment of individuals.
- The controller must be transparent and provide clear information about how they are processing personal data.
- The controller should undertake a data protection impact assessment (DPIA).
- The DPIA must consider the risks and potential impacts of the processing on data subjects' rights.
- The controller must comply with the data protection principles.
When using LFR for surveillance, controllers must:
- ensure the use of watchlists complies with data protection law; and
- where there is collaboration with law enforcement, ensure roles and responsibilities are clear, with appropriate governance and accountability measures in place. All parties must meet the requirements of the UK GDPR and the Data Protection Act 2018.
When conducting a DPIA, controllers:
- should undertake a DPIA in accordance with the annex to the opinion before starting to use LFR; and
- must consult the ICO if the DPIA indicates that the use of LFR would result in a high risk that the controller cannot mitigate.
The ICO opinion does not address artificial intelligence (AI) in detail, but flags it as an issue and directs readers to the ICO's separate guidance on AI and data protection. See the March 2021 issue of DWF Data Protection Insights for our report on the ICO's AI toolkit, which is designed to help organisations using AI to comply with data protection law.

On 21 June the European Data Protection Board and the European Data Protection Supervisor published a joint opinion calling for a general ban on using AI for automated recognition of human features in publicly accessible spaces, including recognition of faces, gait, fingerprints or voice. It will be interesting to see how the EU law-making institutions and the ICO react to this opinion.
If you are using or considering using facial recognition technology, please contact one of our data protection specialists for advice on how to do this in compliance with the law, including support in conducting the necessary DPIA.