Consumer Trends 2024: The use of facial recognition in the Retail sector

23 January 2024
The deployment of Facial Recognition Technology (FRT) has come under increasing scrutiny and there is no sign of this trend easing in 2024, particularly as the uptake of this technology grows. 

FRT to combat shoplifting

In the UK, the use of FRT in the retail sector has been under the spotlight, following the UK government's launch of its Retail Crime Action Plan towards the end of 2023. Under this plan, which has the backing of several large retailers, retailers are advised to send CCTV footage of incidents and an image of the shoplifter to the police as quickly as possible after an offence has been committed. The police would then run the footage and/or image through the Police National Database using FRT in order to aid efforts to identify and prosecute offenders. Smaller businesses have also reportedly been pushing ahead with plans to implement FRT amid rising levels of shoplifting - an issue that is costing major retailers millions of pounds.

Challenges relating to the invasive nature of FRT

However, concerns have been raised over the highly invasive nature of FRT, particularly as it relies on the use of biometric data, which is classified as sensitive or 'special category data' under the General Data Protection Regulation (GDPR) and therefore requires a higher degree of protection. In December 2023, the UK Information Commissioner's Office issued a statement acknowledging that FRT "can bring benefits in helping to prevent and detect crime, but relies on processing large amounts of sensitive personal data. That is why the law places a high bar for its usage: its use must be necessary and proportionate, and its design must meet expectations of fairness and accuracy".

Biometric data is defined under the GDPR as personal data that results from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual which allow or confirm the unique identification of that person. These characteristics can include fingerprints, voice or facial images. In the context of facial recognition, the use of FRT serves as the 'technical processing' and is described by the European Data Protection Board as a two-step process: (1) the collection of the facial image and its transformation into a template (i.e. a digital representation of the distinct characteristics of the face); followed by (2) the recognition of this face by comparing the template with one or more other templates.    
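The EDPB's two-step description can be illustrated with a minimal, hypothetical sketch. In a real system, step (1) would use a trained model to derive a template from a facial image; here, the feature values are supplied directly as placeholders, and step (2) is shown as a simple cosine-similarity comparison against a threshold - all names and numbers below are illustrative assumptions, not any vendor's actual method.

```python
import math

def make_template(face_features):
    """Step 1 (simplified): turn extracted facial measurements into a
    normalised template vector. Real systems derive these features from
    an image with a trained model; here they are supplied directly."""
    norm = math.sqrt(sum(x * x for x in face_features))
    return [x / norm for x in face_features]

def match(template_a, template_b, threshold=0.95):
    """Step 2: compare two templates using cosine similarity and decide
    whether they are likely to represent the same person."""
    similarity = sum(a * b for a, b in zip(template_a, template_b))
    return similarity >= threshold, similarity

# Toy feature vectors standing in for model-derived facial embeddings
probe = make_template([0.4, 1.2, 0.8, 0.1])
enrolled = make_template([0.41, 1.19, 0.79, 0.12])
is_match, score = match(probe, enrolled)
```

The choice of threshold in step (2) is exactly where the accuracy and proportionality questions discussed below arise: a lower threshold produces more matches but more false positives, a higher one the reverse.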

Unlike an address or telephone number, it is impossible for individuals to change their unique characteristics, such as their face. This naturally increases the risk profile of using biometric data in the context of FRT, particularly if the data were to be compromised in the event of a data breach. In addition, FRT has been reported to be more likely to misidentify non-white faces when compared to white faces, creating the potential for discriminatory outcomes. A study from MIT reinforced this finding, stating that all commercially available facial recognition technologies tested produced higher error rates on non-white faces.

The potential over-policing or monitoring of innocent people is a primary concern of many activist groups opposing the use of FRT and calling for the technology to be banned by the police and private companies in the UK, warning of the risks of an 'enormous expansion of the surveillance state'.

Risk management in FRT

Under the GDPR, retailers would need to undertake a data protection impact assessment prior to using FRT, given the novelty of the technology and that the processing activity is likely to result in a high risk to the rights and freedoms of individuals. In particular, this assessment would need to identify the legal basis relied on for the processing activity under the GDPR and contain a robust assessment of the necessity and proportionality of deploying FRT for achieving the retailer's aims and objectives (whether this is for security purposes or otherwise). It would also need to assess the potential risks posed to individuals, e.g. discriminatory effects, and the measures that would be implemented to address those risks.    

Given that retailers are likely to source FRT from third parties, it is critical for retailers to also undertake thorough due diligence on any FRT that is sourced. For instance, how does the FRT ensure statistical accuracy? How does it minimise or avoid false positives and false negatives? To what extent has the FRT been tested to identify any design flaws or deficiencies in the training data that could lead to bias or discriminatory outcomes? Does the provider of the FRT undertake regular and systematic evaluation of the FRT's algorithmic processing to ensure the accuracy, fairness and reliability of the results? These are just some of the issues that would need to be addressed by retailers and their FRT providers as part of their overall risk management process.
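One way the false-positive, false-negative and bias questions above can be made concrete is to compute error rates per demographic group from labelled evaluation results. The sketch below is a hypothetical illustration, assuming the evaluator has records of the form (group, system's match decision, ground truth); it is not a description of any specific provider's testing regime.

```python
from collections import defaultdict

def error_rates_by_group(results):
    """Compute false-positive and false-negative rates per demographic
    group. Each record is (group, predicted_match, true_match)."""
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
    for group, predicted, actual in results:
        c = counts[group]
        if actual:
            c["pos"] += 1           # genuine match pair
            if not predicted:
                c["fn"] += 1        # system missed a true match
        else:
            c["neg"] += 1           # non-match pair
            if predicted:
                c["fp"] += 1        # system wrongly flagged a match
    return {
        g: {
            "false_positive_rate": c["fp"] / c["neg"] if c["neg"] else 0.0,
            "false_negative_rate": c["fn"] / c["pos"] if c["pos"] else 0.0,
        }
        for g, c in counts.items()
    }

# Hypothetical evaluation records: (group, predicted_match, true_match)
records = [
    ("group_a", True, True), ("group_a", False, False),
    ("group_a", True, False), ("group_a", True, True),
    ("group_b", True, True), ("group_b", False, False),
    ("group_b", False, False), ("group_b", False, True),
]
rates = error_rates_by_group(records)
```

A marked gap in error rates between groups - as the MIT study described above found for non-white faces - would be exactly the kind of result a retailer's due diligence should surface and require the provider to address.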

FRT as a wider social issue

Even if retailers can get comfortable that their proposed use of FRT can be undertaken in compliance with the GDPR, there is a wider social angle as to whether FRT should be deployed in the first place; especially given the increased scrutiny by activist groups and the increased awareness by consumers of the risks involved. This consideration extends beyond the use of FRT for security purposes and into more novel use cases being developed, such as in-store consumer monitoring for marketing purposes and for analysing consumer purchasing behaviour.

Further, in light of the Post Office scandal, public confidence in reliance on technology for crime prevention and law enforcement is under heightened scrutiny. Retailers need to be confident that any technology they deploy for such purposes is robust, and that any issues related to performance and quality are properly investigated.

If you have any questions or would like to discuss any of these topics and what they mean for you and your business, please get in touch with our Consumer sector and Data Protection & Cyber Security experts. 

This insight was authored by Stewart Room, Tughan Thuraisingam and Stephen Kewley.
