
DWF Data Protection Insights March 2021

31 March 2021
Here is our round-up of the month's top data protection stories, together with practical advice on how to address the legal issues raised.

This month's highlights include:

  • Adequacy decisions updates – the EU decisions about the UK's adequacy and the process for the UK to make new adequacy decisions;
  • News from different sectors about increased scrutiny and regulation of Artificial Intelligence, plus the new ICO toolkit; and
  • Reports on enforcement action against organisations which transfer personal data abroad without the required safeguards.

Find out more on the different topic areas below:

Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/European Data Protection Supervisor (EDPS)

UK Government and ICO sign Memorandum of Understanding on procedure for future adequacy decisions
On 19 March the Secretary of State for Digital, Culture, Media & Sport (DCMS) and the ICO signed a Memorandum of Understanding setting out the procedure for future adequacy decisions.  While the UK has adopted the European Commission's pre-Brexit adequacy decisions, the UK government will determine any future UK adequacy decisions.  The Secretary of State is responsible for making such decisions, but DCMS must first consult the ICO.

The DCMS announcement states 'the UK government intends to expand the list of adequate destinations in line with our global ambitions and commitment to high standards of data protection. Doing so will provide UK organisations and international partners with more straightforward and safer mechanisms for international data transfers.'

ICO AI and data protection risk mitigation and management toolkit
The ICO has released an alpha version of an AI and data protection risk mitigation and management toolkit for consultation.  The toolkit is designed to:

  • help to identify and mitigate the data protection risks AI systems create or exacerbate;
  • help developers think about the risks of non-compliance with data protection law;
  • reflect the ICO’s internal AI auditing framework and AI and data protection guidance; and
  • provide further practical support to organisations auditing the compliance of their own AI systems.

The toolkit comprises an Excel spreadsheet containing sections covering governance, contracts and third parties, training, data protection risk management, lawful basis, trade-offs, statistical accuracy, discrimination, security & integrity, transparency, data minimisation, individual rights and human review.

Each of these sections:

  • identifies the relevant risks and how AI can create or exacerbate the risk;
  • provides fields for the user to identify the risk level and current status;
  • sets out practical steps to take to address the risks; and
  • provides additional fields for the user to record intended actions, outstanding actions, the action owner and the completion date.
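As a rough illustration only (this is not part of the ICO toolkit, and the field names below are our own, not the toolkit's column headings), a single risk-register entry of the kind the spreadsheet captures could be modelled as:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one entry in an AI data protection risk register,
# loosely mirroring the fields the ICO toolkit's spreadsheet provides.
# All names here are illustrative assumptions, not the ICO's own.
@dataclass
class AIRiskEntry:
    section: str                  # e.g. "Lawful basis", "Data minimisation"
    risk_description: str         # how AI creates or exacerbates the risk
    risk_level: str               # user-assessed level, e.g. "low"/"medium"/"high"
    current_status: str           # current state of mitigation
    practical_steps: list = field(default_factory=list)   # steps to address the risk
    intended_actions: list = field(default_factory=list)
    outstanding_actions: list = field(default_factory=list)
    action_owner: str = ""
    completion_date: str = ""

# Example entry for the data minimisation section
entry = AIRiskEntry(
    section="Data minimisation",
    risk_description="Training pipeline retains more personal data than needed",
    risk_level="medium",
    current_status="Under review",
    practical_steps=["Review retention periods", "Pseudonymise training data"],
    action_owner="DPO",
)
```

A structure like this simply makes explicit the per-section fields the toolkit asks users to complete: the risk, its assessed level and status, practical mitigation steps, and the action-tracking fields.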

The ICO intends to publish a beta version of the toolkit in the summer following initial feedback and further technical development, and then continue to keep it updated.  While the toolkit looks like a useful tool, it is by necessity generic and high-level. If you would like tailored advice on a specific AI project, please contact one of our data protection specialists.

ICO guidance for the use of personal data in political campaigning
On 9 March the ICO published guidance for the use of personal data in political campaigning.  While the guidance is only of direct relevance to organisations which conduct political campaigns, it provides a useful reminder of the key points of data protection law, which are also relevant to running marketing campaigns, including:

  • Identify the legal status of the parties involved and their relationship, e.g. controller to processor, controller to controller or joint controllers.  This will help you to identify the parties' legal obligations and responsibilities.
  • Identify whether you need to pay the data protection fee and, if applicable, pay the correct fee.
  • Identify what personal data you are processing.
  • Ensure that you are able to demonstrate your compliance with data protection law (the accountability principle) including embedding data protection by design principles, putting in place appropriate technical and organisational measures and conducting data protection impact assessments (DPIAs) when required.
  • Comply with the purpose limitation, data minimisation and storage limitation principles.
  • Identify the lawful basis for each processing activity, process the data in a way which individuals expect and be clear, open and honest with individuals about how you use their data (lawful, fair and transparent processing).  For example, think carefully before using profiling, data analytics, micro-targeting or automated calling systems.
  • Identify whether you are processing any special category data.  If so, identify the additional lawful basis required and if necessary put in place an appropriate policy document (required under the DPA 2018).
  • When collecting personal data, whether from the individual or a third party, ensure that you respect data subjects' right to be informed by providing them with the information required under GDPR (as implemented in the UK as the UK GDPR).  The guidance provides some useful suggestions for how to provide this information when you are collecting the data in different ways, e.g. face to face, using an online survey or quiz, or via a mobile app, as well as advice about buying or renting lists of contact details.
  • Ensure that you use profiling lawfully.
  • Be clear on whether your messages are service communications, market research or direct marketing.
  • Read and follow the ICO/EDPB guidance on online advertising, cookies, adtech, real-time bidding and social media.

If you would like tailored advice on the data protection aspects of running a direct marketing campaign, including the application of the Privacy and Electronic Communications Regulations (PECR), please contact one of our specialists.

ICO plans for updating its anonymisation guidance
On 19 March the ICO announced plans to build on its Data Sharing Code of Practice (see the December 2020 issue of DWF Data Protection Insights for an overview of the Code) by updating its guidance on anonymisation and pseudonymisation, which will cover the following topics:

  • The relevant legal, policy and governance issues;
  • Identifiability – including guidance on managing re-identification risk;
  • Pseudonymisation techniques and best practices;
  • Accountability and governance requirements, including data protection by design and DPIAs;
  • How anonymisation and pseudonymisation apply in the context of research;
  • Privacy enhancing technologies (PETs) and their role in safe data sharing;
  • Technological solutions – exploring possible options and best practices for implementation; and
  • Data sharing options and case studies – supporting organisations to choose the right data sharing measures in a number of contexts, including sharing between different organisations and open data release.

The ICO will be publishing and consulting on this guidance over the coming months, so we will provide updates in future issues of DWF Data Protection Insights.  If you would like tailored advice about your data sharing arrangements, please contact one of our specialist lawyers.

ICO sandbox update
The ICO has published its reports on the last three projects from the beta phase of its sandbox.  These address:

  • More efficient data sharing between public and private sector organisations, aimed at improving road safety;
  • The development and enhancement of an existing multi-agency data platform to reduce violent crime; and
  • A housing quality project.

The ICO reported that the next phase of the sandbox is in progress, focusing on:

  • Complex data sharing in the public interest; and
  • Innovations linked to the issues raised by the ICO's Children's Code.

Ofcom and the ICO publish joint plan for tackling nuisance calls
Ofcom has published a plan developed jointly with the ICO for tackling nuisance and scam calls.  The plan provides an update on progress made in the following key areas:

  • taking targeted action against people or companies that are not following the ICO’s and Ofcom’s rules;
  • raising awareness of and tackling Covid-19 scams and continuing to support the work of Stop Scams UK;
  • working with telecoms companies to review and improve how they disrupt and prevent nuisance calls;
  • working with other regulators and enforcement agencies to identify opportunities to prevent nuisance calls and scams; and
  • sharing intelligence with others, including international partners and enforcement agencies.

Digital Regulation Cooperation Forum (DRCF) publishes its first annual plan of work
The DRCF was formed by the ICO, the Competition and Markets Authority (CMA) and the Office of Communications (Ofcom) in July 2020, and the Financial Conduct Authority (FCA) will become a full member from April 2021. It is intended to ensure a greater level of cooperation, given the unique challenges posed by regulation of online platforms. On 10 March it outlined its priorities for the coming year, which will focus on three areas:

  • responding strategically to industry and technological developments, including algorithms (see the January 2021 issue of DWF Data Protection Insights for our report on the CMA consultation on algorithms), service design frameworks, artificial intelligence, digital advertising technologies and end-to-end encryption;
  • developing joined-up regulatory approaches to the interrelation between data protection and competition regulation, and the Age-Appropriate Design Code and the regulation of Video-Sharing Platforms and Online Harms; and
  • building shared technical and analytical skills and capabilities.

EDPB guidance and news
Following its virtual plenary meeting on 9 March, the EDPB has published the following items:

Draft UK adequacy decisions
The EDPB reported that it discussed the draft UK adequacy decisions and that it will thoroughly review the draft decisions, taking into account the importance of guaranteeing the continuity and high level of protection for data transfers from the EU.  It has been reported that the EDPB will deliver its opinion in April, and the EU hopes to adopt the adequacy decisions at the end of May or the beginning of June.  This will mean that 'the bridge', which permits transfers from the EEA to the UK to continue on an interim basis, will need to be extended from its initial expiry date of 30 April, but extension until 30 June was envisaged in the Trade and Cooperation Agreement.

It should be noted that various commentators have expressed concern at the UK government's stated intention to diverge from GDPR, for example by granting adequacy decisions to additional countries (see Government plans to diverge from GDPR below and UK Government and ICO sign Memorandum of Understanding on procedure for future adequacy decisions above), so organisations should continue to plan how to deal with data transfers from the EEA to the UK if the decisions are not adopted, or if they are subsequently invalidated.

EU concludes adequacy talks with South Korea
On the subject of adequacy decisions, on 30 March the European Commission announced that it had successfully concluded adequacy talks with the Republic of Korea.  The EDPB now needs to issue an opinion on the Commission's adequacy finding, and representatives of the EU member states need to approve it, before the adequacy decision can be finalised.  Once that happens, organisations in EEA member states can transfer personal data to South Korea without an additional safeguard.  As discussed above, the UK will not be bound by this adequacy decision, but may decide to make its own decision in respect of South Korea.

Statement on the draft ePrivacy Regulation
The EDPB broadly welcomed the agreement on the negotiation mandate by the Council as a positive step in the finalisation of the ePrivacy Regulation, but raised a number of concerns:

  • The current situation regarding the obtaining of consent to data processing for websites and mobile apps should be improved by giving back control to users and addressing "consent fatigue".  Browsers and operating systems should be required to have a user-friendly and effective mechanism allowing controllers to obtain consent.
  • In relation to the processing and retention of electronic communication data for law enforcement and safeguarding national security purposes, the draft Regulation cannot deviate from the EU Charter of Fundamental Rights or recent case law on targeted data processing and retention.
  • Practices which make access to services and functionalities conditional on a user consenting to the storing of information, or access to information stored in their terminal equipment ("cookie walls"), should be prohibited, so that users can accept or refuse profiling.
  • The exceptions to the general prohibition on personal data processing need to be narrowed down to specific and clearly defined purposes, which should be explicitly listed.
  • Oversight of privacy provisions should be entrusted to supervisory authorities under the EU GDPR, to support consistency and guarantee a level playing field in the Digital Single Market.

The EDPB also referred to ongoing discussions on the further processing of electronic communications metadata or data collected through cookies and similar technologies on the basis of compatible purposes, which it considers risks undermining the ePrivacy Regulation.  It supports the approach previously taken, based on a general prohibition of such processing, subject to narrow exceptions and consent.

While the ePrivacy Regulation, once finalised, will not be directly applicable in the UK, UK organisations which process the personal data of individuals in the EU will have to comply with it in respect of such processing, and it is possible that the UK will update the Privacy and Electronic Communications Regulations (PECR) in line with the Regulation.  We will of course monitor developments and continue to update you in future issues of DWF Data Protection Insights.

Draft Guidelines on Virtual Voice Assistants (VVAs)
These draft guidelines are open for feedback until 23 April 2021.  They refer to VVAs as services that understand voice commands and execute them or mediate with other IT systems, acting as interfaces between users, devices and online services. VVAs have access to a large amount of personal data e.g. commands, browser and search history, and can use biometric identification and profiling. Consequently, the EDPB states that they are subject to GDPR, and are 'terminal equipment' within the meaning of the ePrivacy Directive.

The guidelines cover the most relevant compliance challenges and recommendations for how to address them, including:

  • determining the lawful basis for processing;
  • consent;
  • transparency;
  • purpose limitation;
  • retention;
  • data minimisation;
  • security;
  • processing children's data and special category data;
  • accountability; and
  • providing mechanisms to allow users to exercise their data subject rights.

While EDPB guidelines do not bind organisations in the UK, the ICO has stressed their continuing importance, as they indicate how organisations can comply with the GDPR and ePrivacy Directive, which have both been implemented into UK law.

Final version of the Guidelines on Connected Vehicles
These focus on the processing of personal data in relation to individuals' non-professional use of connected vehicles.  While the final version refines the draft Guidelines in some respects, the key points from the draft have not changed.  Please click here to read our article about the draft Guidelines.

Joint EDPB-EDPS opinion on the proposed Data Governance Act (DGA)
The opinion on the proposed DGA made the following recommendations:

  • In relation to the DGA's general aim of fostering the availability of data by increasing trust in data intermediaries and strengthening data-sharing mechanisms across the EU, it should make it clear that the DGA will not change data protection law or affect the level of protection of individuals' personal data.
  • Concerning the aim of promoting the availability of public sector data for reuse, the opinion recommends aligning the DGA with the existing rules on the protection of personal data laid down in the GDPR and the Open Data Directive (also known as the PSI Directive) and clarifying that the reuse of personal data held by public sector bodies may only be allowed if it is grounded in EU or Member State law.
  • In respect of sharing of data among businesses and allowing personal data to be used with the help of a ‘personal data-sharing intermediary’ the opinion highlights the need to ensure prior information and controls for individuals, taking into account the principles of data protection by design and by default, transparency and purpose limitation.  It must be clear how service providers will enable data subjects to exercise their rights.
  • In relation to the aim of enabling the use of data for altruistic purposes, the DGA should define this concept more clearly.  Data altruism should be organised so that it allows individuals to easily give/withdraw their consent.

As in the case of the ePrivacy Regulation, the DGA will not be directly applicable in the UK, but UK organisations will need to comply when processing EU citizens' personal data, and it is possible that the UK government could enact similar legislation.

Enforcement action

ICO enforcement
The ICO has continued to focus its enforcement action on breaches of the Privacy and Electronic Communications Regulations (PECR), fining two organisations a total of £330,000 for sending text messages without consent.  One of the organisations was seeking to exploit people who are financially vulnerable as a result of the pandemic.  The ICO's actions provide a continuing reminder of the importance of ensuring that direct marketing campaigns are conducted in compliance with all relevant data protection law, including PECR, which is sometimes overlooked.

EU supervisory authority enforcement – international transfers
The Spanish data protection supervisory authority (AEPD) has recently imposed its highest fine to date (€8.15 million) on a telco for multiple breaches of GDPR.  €2 million of the fine was imposed because the telco's service provider (data processor) had used a sub-processor in Peru without putting in place any contractual obligations to transfer the data in compliance with GDPR.

The Bavarian data protection authority (DPA) has declared the use of a US email marketing service by a Bavarian data controller impermissible because, while the controller entered into standard contractual clauses (SCCs) with the service provider, it did not put in place the supplementary measures required under the Schrems II decision.  The DPA did not impose a fine, in part because the final version of the EDPB's guidance on supplementary measures has not yet been published.

Both these decisions provide a reminder of the importance of putting in place appropriate safeguards when transferring personal data to a country outside the UK and EEA which does not have an adequacy decision.  While the EDPB guidance on supplementary measures has not yet been finalised, the draft guidance was published last year - see the November 2020 issue of DWF Data Protection Insights for our overview.  The ICO has described the EDPB draft guidance as a 'useful reference' until the ICO issues its own guidance, and has issued UK versions of the EU SCCs.

Given all the changes to international transfers caused by the Schrems II decision, the EDPB draft guidance and Brexit, this is a complex area, so please contact one of our data protection specialists if you require advice on how to transfer personal data in compliance with the law and relevant guidance.
Industry news

DCMS announces national AI strategy
On 12 March the Department for Digital, Culture, Media & Sport (DCMS) announced a new national AI (artificial intelligence) strategy, which will focus on:

  • Growth of the economy through widespread use of AI technologies
  • Ethical, safe and trustworthy development of responsible AI
  • Resilience in the face of change through an emphasis on skills, talent and R&D

While the announcement does not refer to the link to data protection law, the references to 'ethical' and 'responsible' AI indicate that AI use must comply with the law.  In the May 2020 issue of DWF Data Protection Insights we reported on the ICO's guidelines on explaining decisions made with AI, which were developed with the Alan Turing Institute.

It's worth flagging that the Intellectual Property Office has issued a call for views on the relationship between AI and intellectual property, and the Trades Union Congress has published three reports about the use of AI in employment relationships, showing that all aspects of AI are being scrutinised, and data protection law forms part of a bigger picture.

If your organisation is proposing to use AI, you will probably need to conduct a data protection impact assessment (DPIA) to identify any risks to individuals and how to mitigate any such risks.  Please contact one of our data protection specialists for advice on whether a DPIA is required and, if so, support in conducting the DPIA and addressing its findings.  If you require advice about the intellectual property or employment law aspects of AI, please get in touch with your usual DWF contact, who will refer you to the most appropriate specialist.

DCMS publishes Cyber Security Breaches Survey 2021 report
DCMS has published its report on the results of its 2021 Cyber Security Breaches Survey.  The report's key findings are:

  • the risk of cyber security breaches is heightened by the pandemic;
  • securing digital environments is currently more challenging, as organisational resources are diverted to facilitating home working for staff;
  • fewer businesses are taking the recommended security measures, including using security monitoring tools to identify abnormal activity and up-to-date anti-virus software;
  • 39% of businesses have experienced a cyber security breach/attack in the last 12 months;
  • the most common breaches or attacks were phishing emails, followed by instances of others impersonating their organisation online, viruses or other malware including ransomware; and
  • 47% of businesses have staff using personal devices for work, but only 18% have a policy on how to use those personal devices for work ("BYOD" or "Bring Your Own Device" policy). Only 23% have a policy covering home working.

If you would like our help drafting your organisation's cyber security policies, including a home working policy or a BYOD policy, or updating them to reflect different ways of working during the pandemic, please contact one of our data protection specialists.

Post-Brexit transition

Government plans to diverge from GDPR
While the prime minister and other government ministers have previously indicated that the UK may reform its data protection laws following Brexit, it has now been reported that Oliver Dowden, the Secretary of State for DCMS, has expressed an intention to rebalance the rules to open up greater economic opportunity without watering down protection.  The government is currently seeking a new Information Commissioner to replace Elizabeth Denham, and the role specification emphasises the importance of striking this balance.

Watch our recent webinar…

Acquired and Latent Data and Cyber Risk: Due diligence, compliance and mitigation for both existing and acquired businesses

DWF data protection, cyber security, corporate and insurance experts explored the dynamics of data and cyber risks, looking at what can be done to reduce risk levels and the roles played by insurance policies for warranties and indemnities.


Click here to watch a recording of the webinar
