This month in review:
Our key news has to be the National Cyber Security Centre's cyber risk update for critical infrastructure and services, which we covered in our blog here. We've included a summary below in our cyber security section. Don't miss reading it – it is crucial.
Focusing in on some specific areas:
- AI use – following last month's review of ChatGPT, new guidance and questions have been issued on live facial recognition and generative AI, urging developers and users alike to take practical steps to identify and address data protection risks, put mitigations in place, and monitor ongoing compliance throughout use and decommissioning.
- For all employers: Workforce Monitoring – a last call for our webinar – register on the link below!
- M&A / Corporate transactions – in the first of an occasional series of Practical Insights, our team looks at some topical matters on M&A due diligence, and how we overcome challenges.
- Our events – Workforce Monitoring Webinar on 16 May – covering employment and data protection issues and trends – register here
- Data Protection Finance Group – we were delighted to host the April meeting of this industry data protection group in our London office – here's an overview of the topics covered. If you're practising data protection in finance, please contact Tughan Thuraisingam to find out more.
Our contents this month:
- General updates
- Adtech and direct marketing
- AI and Innovation
- Cyber and ransomware
- Data transfers
- Public sector
Back to top >
This month we wanted to start sharing some practical insights from our work, to give you a flavour of what we do and how we resolve issues. Our two Practical Insights this month relate to the M&A process.
Practical Insight 1 – Due Diligence
Shervin Nahid commented on LinkedIn that compliance tick-boxing for data protection does not work in an M&A context:
"I recently assisted my corporate colleagues with a sale of a business by a long-standing client. From that experience, it was surprising to see such a narrow understanding of a data protection issue being applied in an M&A context from the other side.
The buyer’s lawyers had done some very limited paper-based data protection due diligence, and had concluded that because no DPIAs had been completed, there was a need for an indemnity in the transaction document.
In the context, there were a few issues with this:
- They had not determined which processing activity might have required a DPIA, which weakened their negotiating position.
- The contract protection was focussed on the presence of the compliance document itself, rather than what the DPIA achieves, e.g., Article 35 document compliance tick box, rather than the identification and treatment of potentially unknown risks.
- It appeared as if the data protection colleagues on the buyer’s side had not been fully joined up with their corporate colleagues, as the indemnity was positioned in the transaction document along with other proposed indemnities, which were all subject to a de minimis and a time-cap (i.e. the date of completion). The time-cap meant the #ICO would need to enforce against the Target and cause losses for the lack of a DPIA, all done prior to completion (which was targeted for one week’s time).
Needless to say, I advised my corporate colleagues on the implication of leaving this indemnity in and the negotiation benefit it could offer by ‘conceding’ it, given its drafting was largely ineffective to the purpose it was seeking to achieve (i.e. to transfer the unknown risks to the sellers)."
DWF Solutions: we're always looking to apply the right scrutiny lens to M&A and Insolvency transactions. Get in touch with Shervin or the team if you'd like to know more.
Practical Insight 2 – Warranties and Indemnities
Shervin also commented from his extensive deals experience on warranties in M&A transactions:
"Continuing the theme of data protection considerations in an M&A context, I often see data protection warranties that are either unhelpfully vague, poorly drafted, or otherwise fail to achieve one of their key practical benefits: revealing disclosures.
I have previously commented around data protection due diligence (DD) in its current form often being unfit for its purpose. Often DD responses are populated by the management team, or other members of the deals team that may not be the most appropriate point of contact from a data protection perspective (recognising that there are often several reasons for this – e.g. the transaction is confidential to the target's workforce), which means detailed and transparent responses are not always provided.
Once the deal progresses to the negotiation of the key transaction document, there is the potential threat of a warranty claim being brought against the sellers/warrantors. This means making effective disclosures becomes important, as that is the mechanism to limit risk and avoid absorbing potential liabilities post-completion.
When acting on the buy-side – think about breaking down your warranties to the key data protection issues, and scrutinise the disclosures made against each warranty and ensure they are sufficiently granular to help your client clearly understand the potential risk and liability exposure it is absorbing.
When acting on the sell-side – our role as advisors is to help clients articulate the issues they are concerned they could be in breach of, or simply do not know about. This is achieved by providing an open and transparent level of detail, to ensure no future warranty claim can be brought.
Of course, we are also increasingly seeing data protection indemnities in the agreements that aim to protect against any losses resulting from potential non-compliance more easily (e.g. indemnities do not require the buyer to prove the breach like with warranties). In such cases, the disclosures will still help buyers understand the potential key priority areas for remediation, often post-completion."
EU – EDPB – New data protection guidance for small businesses
In December 2020, the European Data Protection Board ("EDPB") announced its 2021-2023 strategy for ensuring the protection of personal data. One of its key initiatives was to provide further guidance on key provisions of EU data protection law, including engaging with SMEs to help them through their specific challenges. The EDPB put this into effect on 27 April 2023 by launching its data protection guide for small business, comprising various tools and tips, in the form of videos, infographics and interactive flowcharts, to assist SMEs in complying with data protection legislation.
EU – EDPB – Final guidance published on the right to access
Following a public consultation early last year, on 28 March 2023 the EDPB formally adopted new guidelines on the data subject's right of access. The guidance provides clarifications on how the right of access applies to different situations, the extent of its scope, how a data controller should respond to a request and identification of manifestly unfounded or excessive requests.
We have considerable experience in the whole data subject rights area, from process design and optimisation, through to dealing with complex and contentious requests. Please get in touch with JP Buckley to discuss more.
EU – EDPB – GDPR enforcement against non-EEA entities
Under the EU GDPR, each Member State is required to provide a supervisory authority ("SA") that is responsible for protecting fundamental rights and freedoms. Each SA is given powers to monitor and enforce GDPR compliance within the EU and EEA. However, difficulties arise where entities fall within the territorial scope of the GDPR but are located outside the EEA and have no designated SA within the EU or EEA.
The EDPB has published the final report of a study, prepared by Milieu Consulting, which analyses the ability of SAs to enforce their powers against such entities. As a brief overview:
- SAs are required to exercise their powers on their national territory, so it is not clear whether this precludes the initiation of legal proceedings in another Member State or in a third country.
- Where SAs are not exercising 'public powers', they may be acting as a 'private person' in 'civil and commercial matters'. In theory, SAs may exercise their powers beyond the EEA territories, but this may not mean those third countries will be receptive.
- Enforcement of EU SA decisions in California and UK courts may be difficult if not impossible in a reasonable timeframe and at affordable cost.
- Responses identify legal instruments to support the enforcement of GDPR against third-country entities, including international agreements and Memorandums of Understanding.
- There are possibilities to improve international data protection cooperation, with electronic communications and the use of EEA intermediaries seeming the most effective.
- There is potential to negotiate on choice of jurisdiction and applicable law where the SA acts in 'civil and commercial matters'.
- Obstacles to international cooperation include lack of practice, shortcomings in legal framework and problems in producing evidence.
EU – EDPB – New guidance on personal data breach notifications under the GDPR
On 28 March 2023, the EDPB adopted 'Guidelines 9/2022 on personal data breach notifications under GDPR', which clarify how controllers should notify breaches that occur at non-EU based establishments. The guidance provides that entities with a representative in a Member State cannot rely on the one-stop shop system; instead, the controller should notify each supervisory authority as applicable. Further, non-EU controllers that fall within the territorial scope of the GDPR are bound by the same notification obligations as those established in the EU. Likewise, processors will still be obliged to notify controllers of a breach without undue delay.
Whilst the UK is not bound by the EDPB guidelines, the ICO has stated entities should still consider them as helpful guidance to clarify certain issues.
UK: Update on the Data Protection and Digital Information (No.2) Bill
On 17 April 2023, the Data Protection and Digital Information (No.2) Bill had its second reading in the House of Commons. The Bill aims to simplify the complexities of the UK GDPR and tailor it more specifically to UK business operations – we've written previously on the benefits and risks of this. Key changes include:
- a reduction in cookie pop ups;
- changes to the DPO regime and amendments to documentation requirements from records of processing to DPIAs;
- an increase in fines for nuisance calls and texts from £500,000 to the greater of £17.5 million or 4% of group global annual turnover;
- creation of a statutory board for the ICO; and
- a new framework to simplify the digital process of verifying identity.
EU: When is data pseudonymised?
On 26 April 2023, the General Court of the European Union issued its judgment in Case T‑557/20 Single Resolution Board ("SRB") v European Data Protection Supervisor ("EDPS"). The case arose from SRB sending shareholders and creditors an electronic form to complete, the responses to which SRB shared with consulting firms. Before sharing the responses, SRB replaced the name of each respondent with a code. This led to five complaints to the EDPS from respondents concerned that they had not been informed their responses would be shared with third parties.
The General Court found SRB had not transferred personal data because the consulting firms that received the data had no means of deciphering the codes. Therefore, the consulting firms were not able to identify, directly or indirectly, any of the respondents who had submitted the forms. The court also commented "it cannot be ruled out that personal views or opinions may constitute personal data" and that this will be decided by examining "whether, by its content, purpose or effect, a view is linked to a particular person".
This has an impact on many areas of data protection, not least of course because pseudonymised data is still personal data, whereas anonymised data is not.
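The coding technique at the heart of the SRB case can be illustrated with a minimal sketch: names are replaced with random codes, and the name-to-code key stays with the disclosing party, so the recipient alone cannot re-identify respondents. This is an illustrative example only; the field names and function are our own, not drawn from the judgment.

```python
import secrets

def pseudonymise(records, key_store):
    """Replace each respondent's name with a random code.

    The name-to-code mapping is written to key_store, which remains
    with the disclosing party; only the coded records are shared.
    (Illustrative sketch -- field names are hypothetical.)
    """
    coded = []
    for record in records:
        code = secrets.token_hex(8)
        key_store[code] = record["name"]
        coded.append({"code": code, "response": record["response"]})
    return coded

# The disclosing party retains key_store; the recipient sees `shared`.
key_store = {}
shared = pseudonymise(
    [{"name": "Alice", "response": "Concerned about valuation"}],
    key_store,
)
```

Whether the shared records count as personal data in the recipient's hands turned, in the General Court's analysis, on exactly this point: the recipient held no means of deciphering the codes.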
Adtech and direct marketing
Back to top >
Please see our articles on the Data Protection and Digital Information (No.2) Bill above and AI and innovation in the next section.
AI and innovation
Back to top >
UK - Live facial recognition technology
The ICO has published a blog discussing the use of live facial recognition ("LFR") technology in the retail and surveillance sectors, in light of its investigation into Facewatch. Facewatch is a security company that provides LFR technology to the retail sector with the aim of reducing theft offences. The system scans the faces of individuals and alerts the store when a 'subject of interest' (an individual who has a criminal record of theft) has entered the premises.
Whilst the ICO agreed that Facewatch had a legitimate interest in using an individual's personal data in this way, it made recommendations to improve its data protection compliance. Facewatch made these improvements, which included focusing on repeat offenders or individuals who had committed significant offences, appointing a data protection officer and implementing additional protections for those classified as vulnerable. The ICO did not take any further action against Facewatch but did warn private sector organisations this was not a blanket decision to be applied to all cases of LFR technology and that each case will be considered individually to ensure it complies with the data protection principles.
DWF Solutions: we regularly advise on new and developing technology to build in data protection by design and default considerations at the beginning of development and through a product's lifecycle.
UK – £100 million of funding to develop AI technology in the UK
The Prime Minister and Technology Secretary have announced £100 million of funding to set up the 'Foundation Model Taskforce' – a team who will be responsible for the rapid development of the safe and lawful use of artificial intelligence in the UK. This is in addition to the £900 million of funding for a new supercomputer and AI Research Resource to help fund AI innovation, and is a big step towards the UK's ambition to become a science and technology superpower by 2030, as announced in March 2021.
The Taskforce's work on AI has huge potential to transform the healthcare industry, where it could speed up diagnoses, drug discovery and development, and the education sector, where it could free up teachers' time to enable them to focus on delivering quality teaching to students. As we see in other articles this month, getting data protection right in AI is crucial.
UK – Generative AI – initial questions for developers and users
With AI technology developing at such a rapid rate, the ICO has published eight questions for developers and users of generative AI systems to ask themselves to ensure they understand exactly how the technology uses personal data. The questions are:
- What is your lawful basis for processing personal data?
- Are you a controller, joint controller or a processor?
- Have you completed a DPIA?
- How will you ensure transparency?
- How will you mitigate security risks?
- How will you limit unnecessary processing?
- How will you comply with individual rights requests?
- Will you use generative AI to make solely automated decisions?
These simple questions, if analysed and responded to properly, will start to give a baseline to work from towards considerate and lawful AI use. They don't represent all that is needed, so if you'd like advice on this, please get in touch with one of us. Remember that even publicly available personal data is subject to data protection laws.
UK – FemTech in the data protection spotlight
The Information Commissioner announced at a conference that the ICO will be directing its attention to the 'FemTech' market and will be "auditing them, and getting them to change any practices that are non-compliant".
The 'FemTech' industry comprises companies that provide software or technology products relating to women's health and wellness. This often involves processing large quantities of special category personal data, particularly concerning reproductive health. This announcement forms part of the ICO's new 'agile' initiative, which will focus on "areas of vulnerability, targeting… intervention [where] that has the greatest impact".
To ensure your FemTech product is aligned with regulatory requirements, please get in touch to discuss how we can assist.
UK - ICO highlights its work on the UK's Covid apps
The ICO highlighted the recent work it has been conducting 'outside of the spotlight' on the NHS Covid apps throughout the UK to demonstrate its commitment to continuously protect individuals' privacy rights. The ICO stated it had worked closely with the Department of Health and Social Care and Welsh Government at all stages of the app's lifecycle – from design to decommission, which took effect on 27 April 2023 due to a fall in the number of users. It had also provided privacy advice on the apps used in Scotland (Protect Scotland) and Northern Ireland (StopCovidNI), both of which were decommissioned last year. The ICO emphasised that its work in these areas was imperative in ensuring data protection legislation did not hinder innovative technological developments needed to protect the health of the UK.
This is a reminder that decommissioning and data retention/deletion processes are needed for all types of data when apps, services or products reach the end of their life. Please ask us for more details.
Cyber and ransomware
Back to top >
UK: National Cyber Security Centre threat level warning
On Wednesday 19 April 2023 the National Cyber Security Centre (NCSC), part of GCHQ, issued an unprecedented warning about threats to UK critical infrastructure and services posed by Russia-aligned cyber attackers. The scope of what we consider in the UK to be critical infrastructure and services is narrower than in the EU, where there is a parallel regime, and we urge all our clients and contacts to review Stewart Room's article and take the necessary steps to confirm appropriate protections are in place.
If you would like a review or our expert advice, whether on training for or preventing an attack, or should you suffer an attack or breach of some kind, our DWF Breach Counsel service will provide all the support you need. Please ask us for more information.
Back to top >
EU: EDPB: Update on the 101 data transfer complaints
Tughan Thuraisingam commented on this as follows on LinkedIn: "…the European Data Protection Board issued a report on the outcome of the work of the Task Force ("TF") that was set up to look into the 101 complaints filed by noyb.eu against #website operators using third party tools that transferred data to the US in the aftermath of the CJEU Schrems II judgment.
The report sets out the common positions of the members of the TF and contains information on the outcomes of the first cases concerned.
Key points to note are as follows:
- Encryption by the importer is not a suitable measure if the importer has legal obligations to provide the cryptographic key.
- Anonymization of data elements such as the IP address is not a suitable measure where the anonymization takes place only after all the data has been transferred to the third country.
- Website operators must undertake a compliance check of the third party tools that are integrated on their websites to meet the #accountability principle under the GDPR; not being able to demonstrate how transfers take place could lead to a breach of A.5(2) and A.24(1) GDPR.
- A decision of a website operator to integrate and use third party tools (e.g. social media plugins or analytical tools) and use it for specific purposes (e.g. analysing the behaviour of the website visitor) is regarded as determining "purposes and means" and therefore a #controller activity (the degree of liability vis-à-vis the third party must be determined on a case-by-case basis, taking into account the different functions and options the tool provides)."
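The TF's point on IP anonymisation can be illustrated with a short sketch: the address must be truncated to its network prefix before the data leaves the operator's systems, not after transfer. This is our own illustrative example, not drawn from the TF report; the function name and the /24 and /48 prefix lengths are assumed choices.

```python
import ipaddress

def anonymise_ip(ip: str) -> str:
    """Zero the host portion of an IP address prior to any transfer.

    IPv4 addresses are truncated to their /24 network and IPv6
    addresses to their /48 network, so the transferred value no
    longer identifies an individual device.
    (Illustrative sketch -- prefix lengths are our own assumption.)
    """
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)
```

Run before the data is sent, `anonymise_ip("203.0.113.42")` yields `"203.0.113.0"`; applying the same step only after the full address has reached the third country is what the TF flags as unsuitable.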
Back to top >
UK – Call recording
On 18 April 2023, the ICO issued a reprimand to Surrey Police and Sussex Police for recording more than 200,000 phone calls without the individuals' knowledge. The incident arose from the use of an app, which members of both police forces downloaded to their work mobile phones. The app, unbeknown to the officers, recorded all incoming and outgoing calls, which resulted in over 200,000 phone calls, likely to be with victims, witnesses and suspects, being recorded and saved without their knowledge.
The reprimands were issued instead of a £1 million fine to both police forces in light of the ICO's public sector approach. Both police forces have until 18 July 2023 to address the ICO's recommendations, which include: ensuring data protection is embedded into processes by design and default; providing data protection guidance to staff members using the app; ensuring existing policies and procedures adequately deal with data subject rights; and reviewing the content of data protection training.
UK – Freedom of Information (FOI) toolkit
The ICO has launched its third topic in its Freedom of Information ("FOI") toolkit, which focuses on assisting public authorities to identify and deal with vexatious FOI requests. It comprises five modules for public authorities to self-assess their current FOI performance and identify areas for development.
DWF Solutions: if you require any assistance or tailored advice on any of the topics mentioned in this article, please get in contact with the Data Protection & Cyber Security team.