Immature biometric technologies could be discriminating against people, the Information Commissioner's Office warns.

16 November 2022

On 26 October 2022, the Information Commissioner's Office (ICO) published two reports warning organisations to assess the risks that using biometric technologies for emotion analysis poses to the public and to organisations themselves.

The reports also reflected on more traditional biometric technologies and looked ahead to potential new uses of biometrics, together with their benefits and risks, in the coming years.

What is biometric emotional analysis technology?

Biometric emotional analysis technologies process data such as gaze tracking, sentiment analysis, facial movements and expressions, gait analysis, heartbeats and skin moisture. They rely on collecting, storing and processing a range of personal data and, in some scenarios, special category data, and this processing is far riskier than that of traditional biometric technologies, which are used simply to verify or identify a person.
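To make the data types above concrete, the sketch below is a purely hypothetical illustration: every class, field and function name is invented for this article and describes no real product, library or ICO guidance. It shows the kinds of signals an emotion-analysis pipeline might ingest, and why their mere presence can trigger a higher compliance burden such as a Data Protection Impact Assessment.

# Hypothetical sketch only: all names below are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BiometricSample:
    subject_id: str
    gaze_vector: Optional[tuple[float, float]] = None       # gaze tracking
    facial_action_units: Optional[dict[str, float]] = None  # facial movements and expressions
    gait_features: Optional[list[float]] = None              # gait analysis
    heart_rate_bpm: Optional[float] = None                   # heartbeat
    skin_conductance: Optional[float] = None                 # skin moisture

# Fields that, depending on the purpose of the processing, may amount to
# special category biometric data under Article 9 UK GDPR.
POTENTIAL_SPECIAL_CATEGORY_FIELDS = {
    "facial_action_units",
    "gait_features",
    "heart_rate_bpm",
    "skin_conductance",
}

def likely_needs_dpia(sample: BiometricSample) -> bool:
    """Return True if the sample carries signals that may be special
    category data, flagging that a Data Protection Impact Assessment
    should be completed before processing."""
    populated = {name for name, value in vars(sample).items() if value is not None}
    return bool(populated & POTENTIAL_SPECIAL_CATEGORY_FIELDS)

On this (illustrative) logic, a sample containing only a subject identifier would not trip the check, whereas one populated with facial action units or heart-rate data would: it is the presence of such signals, not their accuracy, that drives the compliance analysis.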

The ICO’s warning

The ICO is concerned that incorrect analysis of data could expose people to a range of risks, including assumptions and judgments about vulnerable individuals that could subsequently lead to discrimination. The underlying algorithms have not yet been developed far enough to detect emotional cues reliably, resulting in systemic bias and inaccuracy.

Deputy Commissioner Stephen Bonner stated:

"If you’re using this to make important decisions about people – to decide whether they're entitled to an opportunity, or some kind of benefit, or to select who gets a level of harm or investigation, any of those kinds of mechanisms… We're going to be paying very close attention to organisations that do that. What we're calling out here is much more fundamental than a data protection issue. The fact that they might also breach people's rights and break our laws is certainly why we're paying attention to them, but they just don't work."

Comment

Notably, this is the first time the regulator has issued a blanket warning on the ineffectiveness of a new technology. The intervention may reflect how sensitive this kind of biometric data is: it can capture even subconscious behavioural responses. Any organisation using biometric data in this way should proceed with extreme caution, scrutinise the technology being used and conduct a Data Protection Impact Assessment if it has not already done so.

The ICO is not alone in its concerns: an independent review led by Matthew Ryder QC (June 2022) (1) called for an urgent assessment of the legal framework surrounding biometric data, whilst a study by Cambridge University researchers (June 2022) argued that biometric emotional analysis technology has 'no scientific basis' (2).

The ICO has confirmed that it will publish new guidance on biometric technology in spring 2023. Watch this space: we will publish further insights on this topic in due course.

  1. 'The Ryder Review: Independent legal review of the governance of biometric data in England and Wales' (June 2022)
  2. 'Does AI Debias Recruitment? Race, Gender, and AI's "Eradication of Difference"' (June 2022)

If you'd like to discuss any aspect of biometrics use or points raised in this article, get in touch with one of the authors below.

We would like to acknowledge Isaac Chulu Chinn's contribution to this article.
