
DWF Data Protection Insights April 2021

30 April 2021

Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised.

This month's highlights include:

  • An update on the European Commission's draft adequacy decisions in relation to the UK;
  • The European Commission's proposal for an AI Regulation; and
  • DCMS's plans for new Smart By Design legislation to protect smart devices.

Regulatory guidance, campaigns and other news from the Information Commissioner's Office (ICO), the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS)

EDPB guidance and news

UK adequacy decision update
EDPB publishes opinion on the European Commission's draft adequacy decisions in relation to the UK. Read our overview of the key points >

If you would like advice on any aspect of international data transfers, including those between the UK and the EEA and between the UK and other countries, please contact one of our data protection specialists.

EDPB statement regarding the exchange of personal data between public authorities
The European Data Protection Board (EDPB) has adopted a statement regarding the exchange of personal data between public authorities under existing international agreements.  Read our overview of the key points and how this affects public authorities in the UK >

If you would like our support in reviewing your international data sharing agreements in accordance with the EDPB statement, please contact JP Buckley or one of our team of specialist data protection lawyers.

EDPB guidelines on the targeting of social media users

On 22 April the EDPB published the final version of its guidelines on the targeting of social media users, which were previously published for consultation in September 2020.  The stated aims of the guidelines are to:

  • clarify the roles and responsibilities of the social media provider and the "targeter", i.e. the person or organisation who uses social media to promote its commercial or other interests to social media users;
  • identify the potential risks for the rights and freedoms of individuals; and
  • tackle the application of key data protection requirements, such as lawfulness and transparency, the requirement for data protection impact assessments (DPIAs), etc.

If you use social media to target advertising or other promotional messages to users and you would like our advice on the impact of data protection law and these guidelines, please contact one of our specialist lawyers.

EDPS/AEPD Paper on 10 anonymisation misunderstandings

The European Data Protection Supervisor (EDPS) and the Spanish supervisory authority (AEPD) have published a joint paper setting out 10 misunderstandings related to anonymisation.  As the title suggests, there is a lot of confusion about when data has truly been anonymised, in particular confusion with pseudonymised data, which is still personal data within the scope of the GDPR.  The 10 misunderstandings and their key points are:

1. Pseudonymisation is the same as anonymisation

This is incorrect – "pseudonymisation" means "the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person". The use of "additional information" can identify the individuals, which explains why pseudonymous personal data is still personal data.

Data which is truly anonymous cannot be linked to specific individuals, so does not fall within the scope of GDPR.
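The distinction can be illustrated with a minimal Python sketch. The record fields, names and token scheme below are purely hypothetical: the point is that the lookup table is the "additional information", so the output remains personal data under the GDPR for anyone able to access that table.

```python
import secrets

# Illustrative records only; the fields and names are invented for this sketch.
records = [
    {"name": "Alice Smith", "postcode": "SW1A 1AA", "score": 71},
    {"name": "Bob Jones", "postcode": "M1 2AB", "score": 64},
]

def pseudonymise(records):
    """Replace the direct identifier with a random token.

    The returned lookup table (token -> name) is the "additional
    information": it must be kept separately and secured, and anyone
    holding it can re-identify the rows. The output is therefore
    pseudonymised personal data, not anonymous data.
    """
    lookup = {}
    out = []
    for rec in records:
        token = secrets.token_hex(8)
        lookup[token] = rec["name"]
        out.append({"id": token, "postcode": rec["postcode"], "score": rec["score"]})
    return out, lookup

pseudo_rows, key_table = pseudonymise(records)
# pseudo_rows carry no name, but key_table links each token back to one.
```

True anonymisation would require discarding the lookup table *and* ensuring the remaining fields (here, postcode and score) cannot single anyone out.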

2. Encryption is anonymisation

Encryption is not an anonymisation technique, but it can be a pseudonymisation tool.  The key needed to decrypt the data can be "additional information", as referred to in misunderstanding number 1.
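A toy sketch makes the point concrete. The XOR "cipher" below is for illustration only and is not real encryption, but the logic holds for any reversible scheme: whoever holds the key can recover the identifying data, so the ciphertext is pseudonymised personal data rather than anonymous data.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy reversible transform for illustration -- NOT real encryption.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"secret-key"                       # the "additional information"
ciphertext = xor_cipher(b"Alice Smith", key)

# Without the key the ciphertext does not directly identify anyone, but
# applying the key reverses it exactly -- so the data is not anonymised.
assert xor_cipher(ciphertext, key) == b"Alice Smith"
```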

3. Anonymisation of data is always possible

It may not be possible to anonymise data while retaining a dataset that is useful for a specific processing activity, for example where the data relates to a small number of individuals or includes specific details which make those individuals easy to identify.

4. Anonymisation is forever

New technical developments and the availability of additional data may make it possible to re-identify data which was previously anonymised.

5. Anonymisation always reduces the probability of re-identification of a dataset to zero

While a robust anonymisation process reduces the risk of re-identification below a certain threshold, zero risk may not be possible.  The acceptable risk level depends on several factors, including the mitigation controls in place, the impact on individuals' privacy if the data is re-identified, and the motivation and capacity of an attacker to re-identify the data.

6. Anonymisation is a binary concept that cannot be measured

The risk of re-identification is rarely zero – there are degrees of anonymisation. Any robust anonymisation process will assess the re-identification risk and continue to manage and control that risk.
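One common way to measure a degree of anonymisation is k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values. The sketch below uses an invented dataset of (postcode district, age band) pairs; a group of size 1 means someone can be singled out, so larger k indicates lower (but rarely zero) re-identification risk.

```python
from collections import Counter

# Illustrative quasi-identifiers only: (postcode district, age band).
rows = [
    ("SW1", "30-39"), ("SW1", "30-39"), ("SW1", "30-39"),
    ("M1", "40-49"), ("M1", "40-49"),
    ("EH1", "20-29"),   # a group of one: this record is easily singled out
]

def k_anonymity(rows):
    """Return k, the size of the smallest equivalence class.

    k = 1 means at least one record is unique on its quasi-identifiers,
    so re-identification risk is high; raising k (by generalising or
    suppressing values) reduces, but does not eliminate, the risk.
    """
    return min(Counter(rows).values())

print(k_anonymity(rows))  # 1: the EH1 record is unique
```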

7. Anonymisation can be fully automated

While automated tools can be used during the anonymisation process, expert human intervention is needed to analyse the original dataset, its intended purposes, the techniques to apply and the reidentification risk of the resulting data.

8. Anonymisation makes the data useless

A proper anonymisation process can keep the data functional for a given purpose.  While personal data must not be kept in a form which permits identification of data subjects for longer than necessary for the purposes for which the personal data is processed, anonymising the data may provide a solution, if the anonymised dataset still contains useful information.

9. Following an anonymisation process that others used successfully will lead our organisation to equivalent results

Organisations need to tailor their anonymisation processes to the nature, scope, context and purposes of their data processing, as well as the likelihood and severity of the risks to the rights and freedoms of individuals if the data is re-identified.

10. There is no risk and no interest in finding out to whom this data refers

Re-identification of data subjects could have a serious impact on their rights and freedoms.  Re-identification in a seemingly harmless context may lead to inferences about the individual, for example their political beliefs or sexual orientation, which are subject to additional protection as special category data.

Anonymisation of data is a useful tool, but, as the misunderstandings outlined above illustrate, it is sometimes used incorrectly or confused with pseudonymisation.  In the March 2021 issue of DWF Data Protection Insights we reported on the ICO's plans to update its guidance on anonymisation and pseudonymisation, so of course we will report further once this guidance is published.  In the meantime, if you would like any advice about using anonymisation and/or pseudonymisation correctly, please contact one of our specialist data protection lawyers.

ICO guidance and news

ICO blogpost: the UK Government’s digital identity and attributes trust framework

On 21 April the ICO published a blogpost on the UK Government’s prototype digital identity and attributes trust framework, which was published for consultation in February 2021.  The framework sets out draft rules and standards for organisations which intend to provide or use digital identity verification products and services, which DCMS says "are set to revolutionise transactions", e.g. buying a house, opening a bank account or buying age-restricted goods online or in person.  In the blogpost, the ICO:

  • acknowledges that a digital identity system with strong governance and effective data protection safeguards can help improve public access to digital services and reduce security risks;
  • highlights that accountability for the way that personal data is processed must be present from the outset;
  • welcomes the decentralised approach that the framework proposes, which provides a strong foundation for a data protection by design approach that must be embedded across the system; and
  • recommends that the following measures be put in place:
    - robust governance and clear accountability;
    - boundaries around who controls personal data and how it is gathered and used;
    - effective measures to address the data protection risks that relate to data minimisation and purpose limitation; and
    - appropriate technical and organisational security measures to protect the personal data held in the system.

As always, we will monitor the framework's progress and provide updates in future issues of DWF Data Protection Insights.

ICO blogpost: How the ICO Innovation Hub is enabling innovation and economic growth through cross-regulatory collaboration

On 20 April the ICO published a blogpost on the ICO Innovation Hub's participation in the Financial Conduct Authority’s (FCA) Virtual Women’s Economic Empowerment TechSprint, providing advice and expertise on real life applications of data protection law.  They identified three key themes:

1. Build in accountability

Teams required advice on their obligations under the accountability principle of the UK GDPR and advice on how they could comply. Adopting a data protection by design approach from the outset and carrying out data protection impact assessments for high risk processing operations are key.

2. Personal data vs special category data

It’s vital to be aware of the general prohibition of the processing of special category data under the UK GDPR unless an Article 9 condition for processing applies, in addition to identifying an applicable lawful basis under Article 6.

3. It’s not all about consent

Consent must be freely given, meaning that consent requests must be separate from other terms and conditions. There are also issues around consent given by vulnerable individuals, for example those under duress. Other lawful bases such as legitimate interests may be more appropriate, depending on the proposed solution.

While this blogpost only provides a brief overview, it does provide a useful reminder of some key data protection issues for organisations to bear in mind at all times:

  • Ensure that you are able to demonstrate your compliance with data protection law (accountability principle);
  • Comply with the data protection by design principle by considering data protection at the start of every project;
  • Carry out a data protection impact assessment for high-risk processing operations;
  • Be clear on what personal data you are processing, to identify whether it includes any special category data. If it does, ensure that an Article 9 condition applies, as well as an Article 6 lawful basis;
  • Consider the most appropriate lawful basis. Remember that consent is not the only basis, and legitimate interests may be more appropriate; and
  • If relying on legitimate interests, conduct a legitimate interests assessment to weigh your organisation's interests against the rights and freedoms of the relevant data subjects, particularly when they include children.

Please contact one of our data protection specialists if you would like advice on how to implement any of these points in your organisation.

ICO enforcement

The ICO has not published details of any enforcement action under the GDPR or the Privacy and Electronic Communications Regulations (PECR) during the last month.  It has, however, published a number of decision notices under the Freedom of Information Act (FOIA) against public authorities which failed to deal with requests for information within the required timeframe.


Industry news

DCMS plans for new Smart By Design legislation to protect smart devices

On 21 April DCMS announced plans for new cyber security laws to protect smart devices, including phones, watches, cameras, speakers, televisions and doorbells. The key points are:

  • Customers must be informed at the point of sale of the duration for which a smart device will receive security software updates;
  • A ban on manufacturers using preset universal default passwords which are easy to guess; and
  • Manufacturers will be required to provide a public point of contact to make it simpler for anyone to report a vulnerability.

The press release states that the government intends to introduce legislation as soon as parliamentary time allows.  We will of course monitor developments and continue to update you in future issues of DWF Data Protection Insights.

On the same date, DCMS published its response to the call for views on consumer connected product cyber security legislation.  This response provides some background information to DCMS's plans.

European Commission publishes proposal for AI Regulation

The European Commission has published a proposal for harmonised rules on artificial intelligence (AI). Read our overview of the key points >

If your organisation is proposing to use AI, you will probably need to conduct a data protection impact assessment (DPIA) to identify any risks to individuals and how to mitigate any such risks. Please contact one of our data protection specialists for advice on whether a DPIA is required and, if so, support in conducting the DPIA and addressing its findings.

CDEI publishes AI blogs

On the subject of artificial intelligence, the CDEI (the Centre for Data Ethics and Innovation, which is part of DCMS) has published three blogs on AI assurance, which deal with:

  • The need for effective AI assurance – this discusses the risks which must be managed and explains why an effective AI assurance ecosystem is required;
  • User needs for AI assurance – this considers different user needs for AI assurance and the potential tensions which arise between conflicting user interests; and
  • Types of assurance in AI and the role of standards – this explores different types of assurance in more detail and considers the role of standards in an assurance ecosystem.

The CDEI is asking for input from individuals and organisations that are developing or adopting AI systems, as well as those developing assurance tools or working on similar issues in AI assurance, to identify areas where clarity and consensus around AI assurance could bring significant benefits.

Post-Brexit transition

We are closely monitoring the progress of the EU adequacy decisions (see EDPB publishes opinion on the European Commission's draft adequacy decisions in relation to the UK above).  It has been reported that, while senior Commissioners support the decisions, some EU member states oppose them.  Member states would need a qualified majority (55%) to block the decisions, so no single member state has a right of veto.

We will of course report on any developments and their impact on EU-UK data transfers in future issues of DWF Data Protection Insights.
