DWF Data Protection Insights March 2023

12 April 2023
Here is our round-up of the top data protection and cyber security stories for March 2023, together with practical advice on how to address the legal issues raised.

This month in review:

The AI Pendulum: March saw a surge of interest in ChatGPT, along with excitement (and fear) about its progression and application. Yet with those developments came regulatory scrutiny and a ban in Italy. AI is a polarising subject, but the cause for concern keeps returning to data protection basics – what data is being used, for what purpose, why and by whom. We cover commentary on that, together with updated guidance, below.

Focusing in on a few specific areas: 

  • Employment – don't miss our Workforce Monitoring Webinar in May. Sign up here and learn more below. We also make recommendations about training your workforce, following an NHS 111 call centre advisor's unlawful use of personal data and the resulting fine. 
  • Moving to cyber topics, Mark Hendry covers a range of trends, including the new US National Cybersecurity Strategy; we also consider AI operations and general guidance.
  • In the public sector, we look at the release of Matt Hancock's WhatsApp messages, updated Freedom of Information (FOI) guidance and much more. 

Our events

Monitoring your workforce: An employer's guide to monitoring in the new world, 16 May 2023, 13:00 - 14:00

With the pandemic acting as a catalyst for homeworking on an unprecedented scale, the appetite for monitoring worker activity has increased. Whilst many organisations are still working out their strategy, it seems certain that hybrid working is here to stay, in one form or another. 

Our upcoming webinar covers employment and data protection issues and trends – register here.

General updates

UK Government publishes new draft Data Protection and Digital Information Bill

On 8 March 2023 the UK Government published draft legislation, known as the Data Protection and Digital Information (No.2) Bill (DPDI Bill), to amend the UK GDPR and other aspects of the wider data protection regime including direct marketing and cookies. The headlines from the announcement are covered in our article on the new Data Protection & Digital Information Bill and include:

  • Introducing a business-friendly, cost-effective framework, which reduces the amount of "paperwork" required to demonstrate compliance. For example, the proposed law includes a requirement that only organisations whose processing activities are likely to pose "high risks" to personal rights and freedoms (such as health data processing) need to keep records of their processing.
  • Ensuring data adequacy and wider international confidence in the UK's data protection standards. The proposed law introduces a new "data protection test" for assessing adequacy and clarifies that transfer mechanisms lawfully entered into before the new Bill comes into effect will continue to be valid.
  • Providing greater confidence around processing personal data without consent. Under the proposed legislation, consent would not be required for online trackers placed: (i) to collect statistical information in order to make improvements; (ii) to install necessary security updates on a device; or (iii) to locate an individual in an emergency (illustrated in the sketch after this list).
  • Increasing confidence around the use of AI technologies in automated decision-making. The Bill introduces a new definition of automated decision-making, namely a decision based solely on automated processing with no meaningful human involvement, and adds that profiling will be a relevant factor in assessing whether there has been meaningful human involvement in a decision.
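
By way of illustration only, the sketch below shows how a website might gate non-exempt trackers behind consent if these proposed exemptions became law. The purpose categories and function names are our own assumptions, not terms taken from the Bill.

```python
# Illustrative only: a simplified consent gate reflecting the DPDI Bill's
# proposed exemptions for online trackers. Category names are assumptions.

# Purposes the draft Bill would exempt from the consent requirement.
EXEMPT_PURPOSES = {
    "statistical_improvement",   # collecting statistics to improve the service
    "security_update",           # installing necessary security updates
    "emergency_location",        # locating an individual in an emergency
}

def may_set_tracker(purpose: str, user_has_consented: bool) -> bool:
    """Return True if a tracker placed for this purpose may be set."""
    if purpose in EXEMPT_PURPOSES:
        return True                  # no consent needed under the proposal
    return user_has_consented        # all other purposes still need consent

if __name__ == "__main__":
    print(may_set_tracker("statistical_improvement", user_has_consented=False))  # True
    print(may_set_tracker("advertising", user_has_consented=False))              # False
```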

The DPDI Bill contains only a limited number of changes from its original 2022 version, so it will be interesting to see how it progresses through the law-making process.

DWF Solutions: To analyse the likely changes and their impact in advance, in the UK and elsewhere, please get in touch with one of our data protection specialists.

ICO Children's Code: consultation on draft guidance for services ‘likely to be accessed’ by children

The ICO Children's Code applies not only to information society services (online services such as websites, apps, games and social media, abbreviated to "ISS") aimed at children, but also to ISS likely to be accessed by children. The ICO has published draft guidance on how to assess whether children are likely to access a service, together with a consultation on the draft. The guidance comprises three sections:

FAQs, including:

  • What does the code say about when services are likely to be accessed by children?
  • What is an adult-only service and how do you decide if children are likely to access it?
  • Do you need to assess whether children are likely to access your existing adult-only service?
  • If a child only accesses your website's age-gating page, are you within the Code's scope?
  • How do you demonstrate your decision-making?
  • If you assess that children are likely to access your service, what do you need to do?

List of factors to consider:

  • The number of child users of your service, and the proportion of total UK users or total UK children that this represents.
  • Any research evidence available, such as existing evidence of user behaviour.
  • Information on advertising on your service targeted at children.
  • Information on complaints received about children accessing or using your service.
  • Whether your service contains content, design features and activities children are interested in.
  • Whether children are known to like and access similar services.
  • How you market, describe and promote your service, e.g. are there toys or other products targeted at children associated with your services?
  • Any other research evidence such as academic, independent and market research; or research relating to similar providers of ISS.

Case studies: the guidance includes case studies relating to online dating, pornography, games and social media.

DWF Solutions: The Children's Code is fairly complex, as it contains different guidance for ISS likely to be accessed by different age groups. If you would like guidance on whether the Code applies to your business and how to comply with the Code, please contact one of our data protection specialists.

DPO enforcement action in Europe

The European Data Protection Board (EDPB) has launched its coordinated enforcement action for 2023, the second initiative to be undertaken under its Coordinated Enforcement Framework (CEF). 

The 2023 action under the CEF will focus on the role of data protection officers (DPOs). Twenty-six Data Protection Authorities (DPAs) from across the European Economic Area (EEA) will participate. The initiative aims to determine whether DPOs occupy the position and wield the resources required by the GDPR (Articles 37-39) in order to aid compliance with data protection law and facilitate the effective protection of data subjects' rights. 

The coordinated enforcement action will be implemented through questionnaires and, in some cases (presumably where the questionnaires or other analysis highlight potential deficiencies), formal investigations. Results will be analysed in a coordinated fashion and DPAs will determine the need for further national supervision and enforcement. Aggregated insights will be developed to enable follow-up at the EU level, including a report by the EDPB.

DWF Solutions: DWF's Data Protection & Cyber Security team regularly advises clients on topics including the designation of DPOs, reporting structures, and the operationalisation of data protection governance, risk and control structures. Get in touch if you would like to have a conversation about how we can help.

ICO shares resources to help designers embed data protection by default

The ICO has shared resources to help technology professionals implement privacy by design in their websites, apps and other technology products and services, including at the kick-off, research, design, development, launch and post-launch stages. The guidance highlights the legal requirements that organisations must comply with and emphasises the value of implementing privacy by design.

The ICO's recommendations include:

  • Ongoing collaboration between stakeholders in privacy discussions;
  • Mapping what personal information the product needs (see the sketch after this list);
  • Considering privacy throughout design activities; and
  • Factoring privacy into launch plans.
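
As a purely illustrative aid to the "mapping" recommendation above, the sketch below records, for each product feature, what personal information it needs, why, and how long it is kept. The structure and field names are our own assumptions rather than an ICO template.

```python
# Illustrative only: a minimal data map for a hypothetical product feature.
# The structure and field names are assumptions, not an ICO-prescribed format.
from dataclasses import dataclass

@dataclass
class DataMapEntry:
    feature: str          # product feature that uses the data
    data_items: list      # personal information the feature needs
    purpose: str          # why the data is needed
    lawful_basis: str     # e.g. consent, contract, legitimate interests
    retention: str        # how long the data is kept

data_map = [
    DataMapEntry(
        feature="account sign-up",
        data_items=["name", "email address"],
        purpose="create and secure the user's account",
        lawful_basis="contract",
        retention="life of the account plus 12 months",
    ),
]

for entry in data_map:
    print(f"{entry.feature}: {', '.join(entry.data_items)} ({entry.lawful_basis})")
```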

Adtech and direct marketing

ICO and ACMA agree to work together to protect people from unwanted calls and messages

The ICO has signed a Memorandum of Understanding (MoU) with the Australian Communications and Media Authority (ACMA) to formalise their commitment to work together to protect people from unwanted nuisance calls and spam messaging. The MoU sets out how the authorities will:

  • Continue to share experiences and best practice;
  • Cooperate on specific projects of interest; and
  • Share information and intelligence to support their regulatory work.

AI and innovation

ChatGPT, a polarising landscape: considerable uptake of Generative AI but regulatory concern 

ChatGPT saw considerable use this month, but as more emerged about its learning approach and use of data, concerns started to surface – read more in Stewart Room's article in Forbes.

AI and data protection: updated ICO guidance

The ICO has published updated guidance on artificial intelligence and data protection, which has been restructured in line with the data protection principles. The key updates are:

  • Additional guidance on what to consider when carrying out a DPIA (data protection impact assessment).
  • How the transparency principle applies to AI.
  • How to ensure lawfulness when using AI, including new guidance on:
    - AI and inferences, i.e. making guesses or predictions about an individual, or categorising or profiling them based on correlations between datasets;
    - affinity groups (groups created on the basis of inferred interests); and 
    - special category data.
  • The guidance on statistical accuracy has been moved into a new chapter.
  • Additional guidance on the application of the fairness principle, including:
    - how it applies to AI, with a non-exhaustive list of legal provisions to consider;
    - the difference between fairness, algorithmic fairness, bias and discrimination;
    - high-level considerations when evaluating fairness and inherent trade-offs;
    - processing personal data for bias mitigation, including technical approaches to mitigating algorithmic bias (see the sketch after this list);
    - how solely automated decision-making and relevant safeguards are linked to fairness; and
    - key questions to ask when considering Article 22 of the UK GDPR, which covers automated individual decision-making, including profiling.
  • Further guidance on data protection fairness considerations across the AI lifecycle, from problem formulation to decommissioning, including:
    - why fundamental aspects of building AI, such as underlying assumptions, abstractions used to model a problem, the selection of target variables or the tendency to over-rely on quantifiable proxies, may have an impact on fairness; 
    - the different sources of bias that can lead to unfairness; and
    - possible mitigation measures.
  • An updated glossary setting out technical terms used in the guidance.
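
By way of illustration only (and not drawn from the ICO guidance itself), the sketch below computes one common algorithmic fairness measure, the demographic parity difference, i.e. the gap in positive outcome rates between two groups. The function and variable names are our own assumptions.

```python
# Illustrative only: demographic parity difference, one simple measure of
# algorithmic fairness. A large gap in positive outcome rates between groups
# can signal bias that may need mitigation. Names here are assumptions.

def positive_rate(outcomes: list[int]) -> float:
    """Proportion of positive (1) decisions in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

def demographic_parity_difference(group_a: list[int], group_b: list[int]) -> float:
    """Absolute gap in positive decision rates between two groups."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

if __name__ == "__main__":
    # Hypothetical automated decisions (1 = approved, 0 = declined) by group.
    group_a = [1, 1, 0, 1, 1, 0, 1, 1]   # 75% approved
    group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # 37.5% approved
    gap = demographic_parity_difference(group_a, group_b)
    print(f"Demographic parity difference: {gap:.3f}")  # 0.375
```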

Dark Patterns or Deceptive Design Patterns – new guidance from the EDPB

When the design of an interface nudges users towards a particular action or selection, this is known as a dark pattern (or deceptive design pattern). The EDPB has been tracking this practice and has published guidance on the topic, setting out best practices and recommendations to keep these interfaces from infringing the EU GDPR. 

Pro-innovation Regulation of Technologies Review: Digital Technologies

On 15 March the UK government published a report on the outcome of Sir Patrick Vallance’s review on pro-innovation regulation for digital technologies. The ICO also published its response. The report makes a number of key recommendations:

  1. The government should work with regulators to develop a multi-regulator AI sandbox within the next six months.
  2. The government should announce a clear policy position on the relationship between intellectual property law and generative AI to provide confidence to innovators and investors.
  3. Facilitate greater industry access to public data, and prioritise wider data sharing and linkage across the public sector, to help deliver the government's public services transformation programme.
  4. The government should bring forward the Future of Transport Bill to unlock innovation across automated transport applications.
  5. Unlock the innovation potential of the drone sector.
  6. The ICO should update its guidance to clarify when an organisation is a controller, joint controller or processor for processing activities relating to AI as a service (AIaaS), including guidance on when providers can reuse personal information for improving their models.
  7. Reduce cost and time delays for companies seeking launch licences for small spacecraft, making the UK space sector a more attractive location for investment.
  8. Amend the Computer Misuse Act 1990 to include a statutory public interest defence that would provide stronger legal protections for cyber security researchers and professionals.

In its statement, the ICO says it will continue to prioritise work in this area, including guidance on personal data processing relating to AIaaS, and looks forward to discussing the recommendations with its Digital Regulation Cooperation Forum (DRCF) partners and the Government.

A White Paper on AI then followed on 29 March. As these items show, AI is currently a key focus for the UK government as it seeks to boost innovation and economic growth. 

DWF Solutions: Contact one of our privacy specialists for support on designing and implementing AI solutions, including conducting any DPIAs required and mitigating any risks identified.

Cyber and ransomware

Ransomware trends

Read Mark Hendry's article on recent ransomware trends here, discussing the decreasing percentage of victims paying ransoms, how threat actors set the ransom value, and how this could impact you.

French regulator fines e-scooter company for its use of geolocation data

An e-scooter rentals company operating in France, Italy and Spain has been fined by the CNIL (French data protection regulator) for its disproportionate use of geolocation data. However, it didn't stop there. Read Tughan Thuraisingam's article here.

Former 111 call centre advisor fined for illegally accessing medical records

An individual working as a service advisor at an NHS 111 call centre has been found to have unlawfully accessed and obtained the medical records of a child and his family in breach of Section 55 of the Data Protection Act, for which he was fined £630 with a victim surcharge and court costs totalling £1,093.

The personal records were accessed without consent or a legal reason to do so, and were used to contact the data subjects with accusations and threats. Employees and others accessing data may not always realise that they are not entitled to use data as they wish – thus highlighting the importance of training the workforce. 

DWF Solutions: Get in touch with our data privacy specialists about training your workforce – we have a range of options available. 

New US National Cyber Security Strategy

The White House has released a new National Cybersecurity Strategy outlining plans to improve the United States' cybersecurity posture over the next four years. The strategy identifies five strategic pillars:

  1. Defend critical infrastructure
  2. Disrupt and dismantle threat actors
  3. Shape market forces to drive security and resilience
  4. Invest in a resilient future
  5. Forge international partnerships to pursue shared goals

Whilst heavily US-centric, the strategy itself, as well as its ongoing implementation, can be expected to have global impact. Objectives within the strategy include those relating to the establishment of cybersecurity regulations for critical national infrastructure, improving the speed and scale of intelligence sharing, support for national requirements for security of personal data in line with NIST standards, and invigorating cyber research and development, including preparation for a post-quantum computing future.

It also sets out appreciation and understanding over the benefits of emerging technologies, such as artificial intelligence and the Internet of Things, and articulates the need to understand and respond to the risks associated with their adoption.

National Cybersecurity Strategy 2023 is available here.

Data transfers

EDPB Opinion on the EU-US Privacy Framework

The European Data Protection Board (EDPB) has adopted an Opinion on the draft EU-US Privacy Framework. If the Framework is adopted by the European Commission, it will provide a mechanism for organisations caught by the EU GDPR to transfer personal data to organisations within the USA that sign up to the Framework, replacing the EU-US Privacy Shield, which was invalidated in the Schrems II case. The EDPB's Opinion noted substantial improvements compared with the Privacy Shield, but set out a number of points which require clarification, including:

  • certain exemptions to data subjects' rights may be too broad;
  • the lack of clarity about the application of the principles to processors;
  • the lack of specific rules on automated decision-making and profiling; 
  • further guarantees should be provided in respect of onward transfers;
  • bulk collection of data may require additional safeguards;
  • the practical functioning of the redress mechanism requires clarification; and
  • the complex layout and missing definitions make the Framework difficult to understand.

While the Framework still faces some obstacles, including the opposition of the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (see DWF Data Protection Insights February 2023) and privacy activist Max Schrems, the EDPB's broadly positive opinion is a step towards its adoption.

DWF Solutions: If you would like advice on any aspect of international transfers under the EU GDPR and/or the UK GDPR, including the required transfer risk/impact assessment or putting a safeguard such as standard contractual clauses in place, please contact one of our specialists below.

EDPB Guidelines on the interplay between Article 3 EU GDPR (territorial scope) and the data transfer requirements

The EDPB has finalised its guidelines on this interplay, which is an important and often misunderstood area. The guidelines set out when an international transfer takes place, as well as new recommendations for controllers to adopt when the data exporter is a data processor. 

EDPB Guidelines on using certification as a transfer mechanism

An increasing number of certification schemes are appearing, and the EDPB has helpfully updated its 2018 guidance on the topic to reflect this growing use – see the latest guidance here.

Public sector

ICO statement on the reporting of WhatsApp Messages sent by government ministers during the pandemic

The ICO statement, issued following the Telegraph's reporting of WhatsApp messages sent by Matt Hancock during the COVID-19 pandemic, highlights the balance which must be struck between protecting people's personal information and journalism in the public interest, all of which is underpinned by the importance of freedom of expression in society.

Although the ICO did not see this as a matter for it to pursue, it took the opportunity to issue a reminder about the risks that the use of private messaging apps or channels brings, especially in relation to transparency within government.

FOIA and EIR: ICO to prioritise complaints with significant public interest

In DWF Data Protection Insights November 2022 we outlined the ICO's consultation on how it prioritises complaints under the Freedom of Information Act (FOIA) and the Environmental Information Regulations (EIR). The ICO has now announced a new approach to prioritising complaints where there is significant public interest. The announcement states that, following feedback from both the consultation and engagement sessions with stakeholders, the public interest criteria have now been clarified and refined to provide clear guidance and expectations about what constitutes significant public interest, e.g. if the issue is likely to involve large amounts of public money, or the information may significantly impact vulnerable groups.

The updated criteria for prioritising complaints are:

  • Is there a high public interest in the information requested? Does it raise a new, unique or clearly high profile issue that we should look at quickly? Indicators of this may include whether:
    - the case is subject to significant media interest, e.g. there are existing news reports related to the subject matter in the public domain;
    - the case concerns an issue that is likely to involve a large amount of public money in the context of the size of the public body involved, e.g. a local council contract for provision of services across its whole area or a nationwide central government spend; and
    - the requester needs the information to respond to a live and significant public consultation, and the time for achieving resolution is reasonable to inform the decision-making process.
  • Are any groups or individuals likely to be significantly affected by the information requested? This may include information:
    - which covers policies, events or other matters that potentially have a significant impact on vulnerable people or groups;
    - that has a high potential impact or harm on a proportionately substantial number of people in relation to the information requested; and
    - that may directly affect the health or another personal issue of the requester or others, meaning they need a swift resolution, e.g. it may impact on treatment or is about a live court case.
  • Would prioritisation have significant operational benefits or support those regulated? For example, is the request:
    - novel, or could provide the basis for guidance or support for other regulated bodies; or
    - a part of a round robin request or otherwise linked to other requests or appeals?
  • Does the requester have the ability and desire to use the information for the benefit of the public? This may include where the requester has:
    - a clear aim of raising awareness around a topic of significant public interest and the means or contacts to do so; or
    - access to a suitable platform to allow the public at large to use the requested information to scrutinise the decisions made in the public sector.

Freedom of Information and EIR: ICO publishes new case studies

The ICO has published two case studies:

  1. How proactive disclosure of information can benefit request handling. 
  2. An example of how to restore compliance with statutory Freedom of Information (FOI) and EIR timescales.

BAU FOI requests

The first case study relates to a London Borough's approach to "business as usual" (BAU) FOI requests. The Borough issues prompt replies without the formality of a full FOI response when:

  • they can direct the requester to publicly-available information; or
  • information is readily available and disclosable, and it appears that a BAU response would satisfy the requester.

The ICO states that:

  • it is quicker to direct someone to publicly available information than to provide an individual copy of that information every time someone requests it;
  • informal BAU responses can be a quick and effective way of dealing with information requests; and
  • organisations must ensure this approach does not result in any lowering of standards in the time taken to issue the response, or in the quality or amount of information provided.

Reducing the ICO's information requests backlog

The second case study explains how the ICO dealt with its backlog of information requests that had built up during the pandemic. The ICO's actions included:

  • Creating a late cases project team.
  • Dedicating other colleagues' overtime to late cases.
  • Seconding colleagues from across the ICO to incoming cases.
  • Asking former team members to mentor those secondees.
  • Holding twice-weekly "Request Queries" sessions to collaborate and share best practice.
  • Introducing new processes, including:
    - producing weekly reports to identify requests that were overdue or due shortly;
    - a tier system based on the complexity of requests to allocate them appropriately;
    - a queue cap to limit the number of requests a team member could work on at any one time;
    - establishing clear routes for escalating delayed responses to senior management;
    - increasing understanding of the team's work across the ICO; and
    - cooperating with the communications team and other departments to identify planned activity likely to prompt information requests.

DWF Solutions: DWF's privacy specialists have extensive experience in dealing with complex rights requests, from acting as external advisors on points of law and practice through to providing end-to-end managed services. 

Click here to read more or contact one of our team for information about how we can help you to manage rights requests under the EU GDPR, UK GDPR, FOIA and the EIR.

For advice on any aspect of Data Protection & Cyber Security, please contact one of our specialists. 

We would like to acknowledge the contribution of Sophie Broome to this article.
