DWF Data Protection Insights February 2020

28 February 2020
Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised.

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)

ICO draft guidance on AI auditing framework

The ICO has launched a consultation on its draft guidance on the AI auditing framework.

The draft guidance contains advice on how to understand data protection law in relation to artificial intelligence (AI) and recommendations for organisational and technical measures to mitigate the risks AI poses to individuals. It is intended to provide a solid methodology for auditing AI applications and ensuring that they process personal data fairly.

It is divided into 4 parts:

  1. Issues that primarily relate to the accountability principle. This requires you to be able to demonstrate that your organisation complies with the data protection principles and includes:
    • Data protection impact assessments (DPIAs), which are legally required for processing personal data using AI
    • Controller/processor responsibilities

    DWF is developing a data accountability tool which will enable organisations to assess their compliance with the accountability principle, store their compliance evidence and receive recommendations for improvement.  Please contact JP Buckley or a member of the DWF data protection team if you would like to arrange a demo.

  2. The lawfulness, fairness, and transparency of processing personal data in AI systems, including:
    • lawful bases for processing personal data in AI systems
    • assessing and improving AI system performance
    • mitigating potential discrimination to ensure fair processing
  3. Security and data minimisation
  4. How to facilitate the exercise of individuals' rights in relation to their personal data in your AI systems, and rights relating to solely automated decisions, including:
    • how you can ensure meaningful human input in partly-automated decisions
    • meaningful human review of solely-automated decisions

The guidance flags that if you use personal data to train your AI models, you're a controller of that data, so you must comply with data protection law and put in place appropriate technical and organisational measures to meet the data protection principles, including data protection by design and by default.

It is important to note that, as well as needing a lawful basis for the main processing purpose ("deployment"), if you're also going to use an individual's personal data to train the AI system ("development"), you need a separate lawful basis for that, and you must be transparent about this use of the data.

Because the data generated by the processing is likely to be based on statistically informed inferences rather than facts, you need to make this clear in your records, which must set out the source of the information and the AI system used to generate the inference.
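
By way of illustration only (the ICO does not prescribe a format), a record of a statistically informed inference might capture fields along the following lines; the structure and names are our own hypothetical sketch, not part of the guidance.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class InferenceRecord:
        """Hypothetical record distinguishing an inference from a fact."""
        data_subject_ref: str   # pseudonymised reference, not a direct identifier
        inference: str          # e.g. "likely to be a high credit risk"
        source_data: str        # the source of the information the inference is based on
        ai_system: str          # name and version of the AI system that generated it
        generated_at: datetime  # when the inference was produced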

The guidance makes it clear that a zero-tolerance approach to risk is unrealistic and not required, so the emphasis is on identifying, managing and mitigating the risks in 3 key ways: preventative, detective and corrective.  The guidance refers to "trade-offs" between privacy and other competing rights and interests and emphasises the importance of assessing the trade-offs and striking the right balance.

The consultation closes on 1 April, following which the ICO will finalise the guidance.
 
In a webinar on 26 February, the ICO referred to its previous work on AI:

  • it is working with three "sandbox" participants to provide support and advice on AI projects; and
  • it has published its "Project ExplAIn" guidance on how organisations can best explain their use of AI to individuals.

The speaker said that organisations which are processing or intend to process personal data using AI should refer to both sets of guidance.

EDPB draft guidelines on personal data and connected vehicles

The EDPB has published draft guidelines on processing personal data in the context of connected vehicles and mobility-related applications.  The draft guidelines are open for consultation until 20 March.

The draft guidelines focus on personal data processing in relation to the non-professional use of connected vehicles.  They consider 3 categories of personal data: (i) processed inside the vehicle; (ii) exchanged between the vehicle and personal devices connected to it (e.g. the user's smartphone); and (iii) collected in the vehicle and exported to external entities (e.g. insurers) for further processing.

The EDPB states that a connected vehicle and every device connected to it is 'terminal equipment' for the purposes of the ePrivacy Directive, which is implemented in the UK by the Privacy and Electronic Communications Regulations (PECR).  This means that consent (to the GDPR standard) is required for storing or accessing information in the terminal equipment.

The EDPB has identified 3 categories of personal data warranting particular attention:
 
  1. Geolocation data – the places a person visits can reveal a great deal of personal information about them, including their place of work and residence, interests, religion and sexual orientation.  Accordingly, controllers must not collect location data unless it is absolutely necessary for the processing purpose.  Users must have the option to deactivate geolocation at any time and the period for which the data is stored must be limited and defined.
  2. Biometric data (e.g. fingerprints, eye movements or pulse) – this may be used to enable access to the vehicle or to the driver's profile settings and preferences.  The guidelines state that the use of biometrics should not be mandatory and that the biometric data should be stored locally in encrypted form (see the illustrative sketch after this list).
  3. Data revealing criminal offences – while speed does not by itself indicate a crime, if it is combined with location data it could do so.  If it does, the data can only be processed under the control of official authority or where the processing is authorised by EU or member state law which provides appropriate safeguards for the rights and freedoms of data subjects.
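
To illustrate the recommendation in point 2, here is a minimal sketch of storing a biometric template locally in encrypted form.  It assumes Python's cryptography library; the file name and template are hypothetical, and real key management would use the vehicle's secure hardware keystore rather than an inline key.

    from pathlib import Path
    from cryptography.fernet import Fernet

    # Illustration only: in practice the key would be held in secure
    # hardware, not generated inline alongside the data it protects.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    template = b"<biometric template bytes>"  # e.g. a fingerprint template

    # Encrypt and store on the local device only - the plaintext
    # template never leaves the vehicle.
    Path("driver_profile.bin").write_bytes(cipher.encrypt(template))

    # Later: decrypt locally to match against a fresh scan.
    stored_template = cipher.decrypt(Path("driver_profile.bin").read_bytes())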

The draft guidelines provide advice based on the familiar data protection principles, focusing on how to comply with them in a connected vehicle context.  Here's a brief overview of the advice:

  • Controllers must ensure that all processing has a valid legal basis, is for specified, explicit and legitimate purposes and is not further processed in an incompatible way. 
  • Controllers must ensure that technologies deployed are configured to respect individuals' privacy by applying the principle of data protection by design and default.  Technologies should be designed to minimise data collection, provide privacy-protective default settings and ensure that data subjects are well informed and have the option to easily modify configurations associated with their personal data (see the illustrative sketch after this list).
  • Before the processing, the data subject must be informed of the data controller's identity, the processing purpose, any third parties who will receive the data, the storage period and the data subject's rights, together with all the information required under the GDPR.
  • Controllers must facilitate data subjects' control over their data during the entire processing period by implementing specific tools which provide an effective way to exercise their rights.
  • Controllers must implement measures to guarantee the security and confidentiality of the personal data they process.
  • If the controller shares personal data with a third party, there must be a legal basis for this.
  • If personal data is transferred outside the EEA, an appropriate safeguard is required.
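
By way of hypothetical illustration of privacy-protective default settings, a connected vehicle's driver profile might default every optional data flow to off, leaving the data subject to opt in; the names below are our own invention, not part of the guidelines.

    from dataclasses import dataclass

    @dataclass
    class VehiclePrivacySettings:
        """Hypothetical profile: every optional data flow starts disabled."""
        geolocation_enabled: bool = False   # user can activate or deactivate at any time
        share_with_insurer: bool = False    # transfers to external entities need an opt-in
        usage_analytics: bool = False       # no telemetry unless switched on

    # A new driver profile starts in the most protective configuration;
    # the data subject can then modify each setting individually.
    settings = VehiclePrivacySettings()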

EDPB evaluation of the GDPR

The EDPB has contributed to the EU Commission's review and evaluation of the GDPR.  The EDPB's position is that the application of the GDPR in its first 20 months has been successful.  Although there are concerns and challenges, the EDPB is examining possible solutions to overcome them, and has concluded that it would be premature to revise the GDPR at this time.


ePrivacy update

On 21 February the Croatian Presidency of the EU Council published its proposal for the new ePrivacy Regulation, which was originally intended to take effect at the same time as the GDPR but has been delayed by deadlock in the decision-making process.  The proposed text states that service providers whose website content or services are accessible without direct monetary payment and are wholly or mainly financed by advertising may rely on legitimate interest for placing tracking cookies, provided the user has been given clear, precise and user-friendly information about the purposes of the cookies and has accepted such use.

It is not clear what "accepted" means – it could mean that the user must have consented, but that would appear to contradict the statement that the service provider can rely on legitimate interest, as these are alternative lawful bases.  If the intention is that legitimate interests would suffice, commentators have suggested that the EU Parliament is unlikely to accept this (one MEP has called it a "dePrivacy" version), in which case the deadlock will continue.

The proposal provides that where the data subject is a child or the information contains special category data (e.g. information about their health, race or sexual orientation), the individual's rights would override the service provider's legitimate interest, so it would not provide a lawful basis.  It also states that legitimate interest would not provide a lawful basis for profiling.

The EU Council Working Party on Telecommunications and Information Society will discuss the proposal in March, so we'll provide an update in the March edition of DP Insights.  In the meantime, in the UK the Privacy and Electronic Communications Regulations (PECR) continue to apply, so consent (to the GDPR standard) is required for cookie use.  Following Brexit, the UK will not be bound by the new ePrivacy Regulation, so it is for the government to decide whether and, if so, how to implement it.
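
As a rough sketch of what the PECR consent requirement means in practice (the cookie names and helper below are hypothetical), a server should only set non-essential cookies once GDPR-standard consent has been recorded:

    def set_cookie_headers(consent_given: bool) -> list[str]:
        """Build Set-Cookie headers; non-essential cookies need prior consent."""
        # Strictly necessary cookies are exempt from the consent requirement.
        headers = ["Set-Cookie: session=abc123; Secure; HttpOnly"]
        if consent_given:
            # Analytics/marketing cookies only after a clear affirmative act,
            # i.e. consent to the GDPR standard.
            headers.append("Set-Cookie: analytics_id=xyz789; Secure")
        return headers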

Enforcement action

Relationship between data protection law and human rights law

While we're now familiar with the GDPR and the Data Protection Act 2018, it's important to remember that Article 8 of the European Convention on Human Rights provides a right to respect for one's "private and family life, his home and his correspondence".  This article has recently been applied in two cases.

A Dutch court has ordered the immediate halt of an automated surveillance system for detecting welfare fraud because it violates human rights.  

Turning to the UK, where the government is reported to be accelerating the development of robots in the benefits system, the chairman of the House of Commons Work and Pensions Committee said: “This ruling by the Dutch courts demonstrates that parliaments ought to look very closely at the ways in which governments use technology in the social security system, to protect the rights of their citizens.”

The European Court of Human Rights has decided that the Police Service of Northern Ireland's (PSNI) indefinite retention of a drink driver's biometric data (fingerprints and DNA profile) violated Article 8 of the Convention. The Court stated that the UK is one of the few Council of Europe jurisdictions to permit indefinite retention of DNA profiles.  Because the UK had given itself such an extensive power, it should have put proper safeguards in place.  Because the PSNI was only empowered to delete biometric data in exceptional circumstances, the applicant could not request deletion on the basis that retention was no longer necessary.  This failed to strike a balance between public and private interests. 

Brexit preparation

The UK government position

In his speech of 3 February, Boris Johnson stated: "We will restore full sovereign control over our borders and immigration, competition and subsidy rules, procurement and data protection".  While he didn't provide any detail about what this might mean, it suggests that UK data protection law may start to diverge from EU law.

While at present there is no clear indication of whether/how UK data protection law will change, commentators are concerned that any relaxation would damage the possibility of the European Commission making an adequacy decision in respect of the UK.  An adequacy decision would mean that EEA organisations could continue to transfer personal data to the UK, without the need for an additional safeguard.  If the UK does not receive an adequacy decision, a different safeguard would be needed, such as standard contractual clauses or (for intra-group transfers) binding corporate rules.

The EDPS position

On 24 February the European Data Protection Supervisor (EDPS) issued an Opinion on the EU-UK future data protection relationship.  The EDPS makes three main recommendations in relation to the envisaged partnership:
  1. ensuring that the security and economic partnerships are underpinned by similar commitments to respect fundamental rights, including adequate protection of personal data;
  2. defining priorities where arrangements for international cooperation should be concluded in matters other than law enforcement, in particular for the cooperation between public authorities, including EU institutions; and
  3. assessing the issue of onward transfers of personal data.

With regard to the assessment of adequacy, the EDPS emphasises the following points:

  • the adoption of an adequacy decision is subject to specific conditions and requirements and, should the Commission present a draft adequacy decision, the EDPB should be involved appropriately and in a timely manner; and
  • given the specific situation of the UK, any substantial deviation from EU data protection law that would result in lowering the level of protection would constitute an important obstacle to an adequacy finding.

Finally, the EDPS recommends that the EU take steps to prepare for all eventualities, including the possibility that no adequacy decision is adopted within the transition period, that no adequacy decision is adopted at all, or that a decision is adopted only in relation to some areas.

As we said in our January DP Insights, while the situation remains uncertain, you can put your business in the best possible position to adapt to any future requirements by being clear on what personal data you are transferring to which countries, and what is being transferred to you, and by ensuring that safeguards which meet the current requirements are in place.

Industry news

Facial recognition

The use and risks of facial recognition technology have been in the headlines this month.  In October 2019 the ICO concluded its investigation into how police use live facial recognition technology (LFR) in public places.  The investigation found there was public support for police use of LFR, but also that there needed to be improvements in how police authorised and deployed the technology if it was to retain public confidence and address privacy concerns.  Earlier this month, the Metropolitan Police started to use facial recognition at a shopping centre.  While researchers have cast doubt on the accuracy of the technology, the Metropolitan Police Commissioner has responded by criticising their research and defending use of the technology to protect the public from crime.

While a previous draft of the EU AI policy suggested a 5-year moratorium on the use of facial recognition, the updated version (discussed below) has dropped this proposal, putting the onus on member states to decide how to address the issue.  While the UK is no longer a member state, a Private Member's Bill is currently before the House of Lords which, if passed, would make the use of facial recognition a criminal offence and require the government to commission a review of the technology.  As a Private Member's Bill, this is unlikely to become law, but it demonstrates the controversy surrounding facial recognition.  Further afield, the Canadian data protection authorities have announced an investigation into an AI provider over concerns about its use of facial recognition.

If you're considering introducing facial recognition in your organisation, you will need to conduct a DPIA and take legal advice to ensure that you have a valid lawful basis for using the technology.  While the controversy surrounding its use for crime prevention indicates that this may be difficult to demonstrate, involving experts and applying the data protection principles from the start of the project, e.g. by ensuring that the system's design and features (such as camera specification) maximise accuracy, can make it a better proposition to the public.  Of course, DWF will be able to support you with any such projects, so please contact us if you would like to talk to one of our data protection specialists.

EU data strategy and AI policy

The European Commission has unveiled its new data strategy and AI policy options.  Its objectives are to ensure that "the EU becomes a role model and a leader for a society empowered by data" and becomes a world leader in AI systems that can be safely used and applied.

Given the continuing uncertainty about the relationship between UK data protection law and EU law following the expiry of the transition period, it is not known what impact (if any) this will have on the UK.

You can read more about the Commission's proposals here.
