This month's highlights include:
- the ICO's pseudonymisation guidance and draft research guidance; and
- two cases showing the courts' cautious approach to data protection claims.
Webinar: Personal Data Transfers: A practical guide to the UK International Data Transfer Agreement (IDTA) and Addendum
In the January 2022 issue of DWF Data Protection Insights we reported that the ICO had published: the international data transfer agreement (IDTA), the international data transfer addendum to the new EU standard contractual clauses (SCCs) for international data transfers (Addendum), and a document setting out transitional provisions.
On Monday 28 March 2022, join our international data transfer specialists in our latest webinar to understand the practical and organisational steps you will need to take when using the UK IDTA and Addendum.
Podcast: Third-Party Cyber Risk
Two of our data protection specialists, Mark Hendry and Tughan Thuraisingam, joined FTI Consulting to discuss key issues and trends in third-party data protection and cyber risk, offering insights into this topical area along with the practical advice they would give to business leaders today to mitigate third-party risks.
The pandemic has accelerated digital decentralisation – opening up a wider ecosystem of partners, workers, suppliers and customers. Whilst this acceleration has helped drive organisational efficiencies, it has also expanded potential attack surfaces and exposure to risk. Most organisations are aware of the significant threat posed by third party data protection and cyber risks, but many struggle to implement an adequate risk mitigation strategy.
Click here to listen to the podcast.
Regulatory guidance/campaigns/other news from the Information Commissioner's Office (ICO)/European Data Protection Board (EDPB)/ European Data Protection Supervisor (EDPS)
ICO guidance
ICO call for views: Anonymisation, pseudonymisation and privacy enhancing technologies guidance chapter 3 (pseudonymisation)
The ICO has published a consultation draft of the third chapter of its anonymisation, pseudonymisation and privacy enhancing technologies guidance. We have previously reported on the publication of the first two chapters, which cover an Introduction to Anonymisation and Identifiability.
Chapter 3 focuses on pseudonymisation, in particular:
The key differences between pseudonymisation and anonymisation, including:
- Pseudonymisation is a security and privacy risk management measure, but pseudonymised data is still personal data and accordingly subject to data protection law.
- The status of data can change depending on who holds it, e.g. if your organisation can identify individuals from pseudonymised data using a key or other separate information, that data is personal data in your hands, but it may not be identifiable in the hands of another organisation which does not have access to that key, in which case it may be anonymous data for that organisation. However, you must not assume that this is the case; you need to consider the possibility and likelihood of identifiability.
- How pseudonymisation can reduce risk, including how it can help to:
- mitigate certain risks of data processing, including risks identified when conducting a data protection impact assessment (DPIA) or a legitimate interests assessment (LIA);
- comply with the data protection principles of data protection by design and security of processing;
- reduce the risk of harm to individuals that may arise from personal data breaches, or even prevent security breaches becoming personal data breaches.
- How pseudonymisation can help to allow personal data to be processed for other purposes, e.g. general analysis or processing for a compatible purpose, by providing an appropriate safeguard.
- The Data Protection Act 2018 (DPA) re-identification offences and the relevant defences.
- How to approach pseudonymisation, including:
- Defining your goals: what do you intend to achieve?
- Identifying the risks and the measures you need to implement.
- Deciding which technique (or set of techniques) is most appropriate.
- Deciding who does the pseudonymisation, e.g. your own organisation or a processor.
- Documenting your decisions and risk assessments.
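To illustrate the approach the ICO describes, here is a minimal Python sketch of one common pseudonymisation technique: replacing a direct identifier with a keyed hash (HMAC-SHA256). This is an illustrative assumption on our part, not a technique mandated by the guidance; the key point it demonstrates is that the secret key must be held separately from the pseudonymised dataset, since anyone holding both can re-identify individuals.

```python
import hmac
import hashlib

# Hypothetical example: in practice the key would be stored separately from
# the pseudonymised dataset (e.g. in a key management service), so that the
# dataset alone cannot be linked back to individuals.
SECRET_KEY = b"example-key-held-separately"

def pseudonymise(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 pseudonym."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Jane Example", "postcode": "AB1 2CD"}
pseudonymised_record = {
    # Stable pseudonym: the same name always maps to the same value,
    # so records about the same person can still be linked for analysis.
    "subject_id": pseudonymise(record["name"]),
    "postcode": record["postcode"],
}
```

Because the pseudonym is deterministic, records can still be linked for analysis, but (without the key) they cannot be traced back to a named individual. As the guidance stresses, the output remains personal data in the hands of anyone who holds, or can obtain, the key.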
The ICO announcement states that it will continue to publish draft chapters for consultation at regular intervals, including the following:
- accountability and governance requirements in the context of anonymisation and pseudonymisation, including data protection by design and DPIAs;
- anonymisation and pseudonymisation in the context of research;
- privacy enhancing technologies (PETs) and their role in safe data sharing;
- technological solutions; and
- data sharing options and case studies to demonstrate best practice.
We will report on the publication of these chapters in future issues of DWF Data Protection Insights. In the meantime, please contact one of our privacy specialists if you would like advice on any aspect of anonymisation, pseudonymisation or privacy enhancing technologies.
ICO consultation on draft guidance on research
The ICO has published a consultation draft of guidance on the research provisions in the UK GDPR and the DPA 2018.
While these provisions are currently under review as part of the Government’s proposal to reform data protection, the ICO states that it is important to develop guidance on the current legislation to support organisations using personal data for research purposes.
The guidance covers the provisions that make reference to three broad types of research-related purposes for processing personal data:
- Archiving purposes in the public interest: to ensure the permanent preservation and usability of records of enduring value for general public interest, as distinct from the long-term retention of records for business or legal purposes.
The guidance recognises that some archiving in the public interest will be carried out by bodies with a specific legal obligation to archive records of enduring value for the general public interest, such as the National Archives, National Records of Scotland, and the Public Record Office of Northern Ireland. Public bodies such as local authorities may also have a public task set out in statute to maintain archives of records. However, some such archiving may also be carried out by private or third sector organisations, such as museums.
- Scientific or historical research purposes: this includes research, technological development and demonstration carried out in traditional academic settings, commercial settings and by charities.
- Statistical purposes: processing where the main objective is to generate statistics. This may be done by public authorities and bodies with a statutory obligation to produce and disseminate official statistics, such as the Office for National Statistics. But it is also much broader than this, and it may also be carried out by private or third sector organisations.
The guidance covers:
- a non-exhaustive list of activities and features that are indicative of this kind of processing;
- examples to help to identify whether the processing is in the public interest;
- the data protection principles;
- the lawful basis for the processing, which may be public task (where applicable), legitimate interest or consent;
- conditions for processing special category data and criminal offence data;
- reusing data for a different purpose;
- data subjects’ rights and the relevant exemptions; and
- safeguards, including data minimisation and pseudonymisation.
If your organisation processes personal data for any of these purposes and you would like advice on how to ensure that your processing complies with data protection law and guidance, please contact one of our privacy specialists.
Enforcement action
ICO enforcement
The ICO has continued to impose fines for breaches of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), including the following:
- a fine of £200,000 for a home improvements company for making more than half a million unsolicited calls to individuals registered with the Telephone Preference Service (TPS);
- a fine of £85,000 for a motor insurance provider for sending 29,970,419 unsolicited direct marketing emails and text messages without valid consent. The recipients had obtained quotes, but had not been given the ability to opt out of direct marketing, so the company was not able to rely on the 'soft opt-in' under PECR. The ICO's penalty notice also contains a reminder that legitimate interests cannot be used as a lawful basis under PECR; and
- a fine of £50,000 for a company which sent 752,425 unsolicited direct marketing text messages advertising loans without valid consent.
These fines show that the ICO is continuing to focus its enforcement action on breaches of PECR, rather than breaches of GDPR/UK GDPR, and provide a reminder of the importance of:
- screening marketing lists against the TPS before making marketing calls; and
- ensuring that the soft opt-in is used correctly, by giving the individual the ability to opt out at the time of collecting their personal data, and in each marketing message.
The new Information Commissioner, John Edwards, began his term on 4 January, so it will be interesting to see whether he changes the ICO's enforcement priorities. We will of course monitor developments and report in future issues of DWF Data Protection Insights.
Industry news
High Court applies Morrisons ruling on vicarious liability for employee's breach of data protection law
The High Court has applied the Supreme Court's decision in the Morrisons case, holding that a local council was not vicariously liable for its employee's misuse of personal data, because she was 'on a frolic of her own'. See our report on the Morrisons case for a reminder of the facts and the decision.
An employee of the local authority's social services department accessed the claimant's personal data and disclosed it to the claimant's ex-husband, who was now in a relationship with the employee. The employee had access to the relevant files, but was not working on a matter concerning the claimant. The disclosure made the claimant concerned for her safety, as she had made a complaint to the police about her ex-husband's domestic abuse, which the police had shared with the council due to child safeguarding concerns.
The employee pleaded guilty to an offence under the Computer Misuse Act 1990 and was sentenced to three months' imprisonment, suspended for 12 months, plus community service. The court decided that, in accessing and sharing the claimant's records, the employee was pursuing her own agenda and not furthering the council's work. This meant that she was 'on a frolic of her own', so the council was not vicariously liable for her actions.
Stadler v Currys Group Limited [2022] EWHC 160 (QB)
This is another case where the judge indicated that a minor breach of data protection law did not give rise to significant (if any) compensation.
The claimant (S) returned a smart TV to the defendant (C) for repair. It was beyond economic repair, so C gave S a voucher to buy a replacement. C sold the TV to a third party company, which sold it on to another customer. S's personal data was not wiped from the TV before either sale. The subsequent buyer used S's Amazon Prime account to make a purchase in the sum of £3.49. C compensated S for the £3.49 and gave him a £200 voucher as a goodwill gesture.
S brought a claim in the High Court for damages (including aggravated and exemplary damages) of up to £5,000 for misuse of private information, breach of confidence, negligence and breach of data protection law. C applied to dismiss the claim.
The judge dismissed the claims at common law (misuse of private information, breach of confidence and negligence) and the claim for aggravated and exemplary damages. He allowed the claim for breach of data protection law to continue, but ordered that the claim be transferred to the County Court and indicated that the Small Claims Track appeared appropriate.
You can read the full case report here.
National Data Strategy Forum sets out priorities for next six months
The National Data Strategy Forum (co-chaired by DCMS and techUK) has published its priority work streams for the next six months:
- Unlocking the power of data for everyone everywhere: making private and third sector data more usable, accessible and available across the economy, while protecting people’s data rights and private enterprises’ intellectual property.
- Trust in data: building public support for trustworthy data use, so that data can be harnessed to unlock societal benefits and improve lives.
- Data reform: in September 2021 DCMS published a consultation on post-Brexit reform of data protection law. Read our summary of the key points here, and a public sector-specific summary here.
- Net Zero: explore how to harness the power of data to meet the UK government's net zero ambitions and use data collection and analysis to support environmental sustainability by enhancing the energy efficiency of supply chains and production.
- Measuring the data ecosystem: mapping stakeholder activity across the wider data ecosystem in the UK and testing the feasibility of creating an openly available resource to show how the work of different organisations fits together.
You can read more about each work stream here. The top one to watch is probably the government's proposed reform of data protection law. This update states that the government plans to publish its response to the consultation in spring 2022.