
DWF Data Protection Insights February 2023

07 March 2023

Here is our round-up of the top data protection stories for February 2023, together with practical advice on how to address the legal issues raised.

The month in review: 

  1. For everyone across all sectors - cyber: in commenting on a recent case, the UK Information Commissioner stated that "the biggest cyber risk is complacency, not hackers".  We've prepared a range of free cyber security tools to raise your awareness of the preparation, immediate impact and aftermath of an attack, and the steps you can take now.  Don't miss out on your opportunity to obtain copies below. 
  2. For the consumer and public sectors: taking the time to properly consider the data protection basics before you progress technological developments is this month's takeaway. Whether it relates to biometric identification technologies in school canteens, the use of AI in the public or private sector, or another deployment of technology – you should reduce your likelihood of scrutiny, challenge and disruption by getting the data protection basics right.  
  3. For the financial services sector: we've published a separate article on ransomware and Adtech in this sector where we cover the risks and then the best practices to mitigate them.  

We also discuss the latest on the UK's Data Protection and Digital Information Bill, the Online Safety Bill and much, much more… 


Our events

DWF Tech and Data Leaders Forum, 9 February 2023

A full house of data protection professionals and in-house counsel enjoyed a varied, insightful and detailed review of several topical data protection concepts at our recent event. Starting with Stewart Room's assessment of risk in the UK vs the EU (see "How's Data Protection Doing In Your Country?" below), the audience then listened to a detailed description of the Adtech ecosystem and our assessment of practical tips to navigate it. We also spoke about the integration of data protection regimes for entities just joining a group of companies (a necessary, but little-discussed topic), the best tips during the first few hours of a data breach or cyber attack, and an update on international transfers. Don't miss our next events in April, anticipated to be an in-person Tech and Data Leaders Forum and a webinar regarding employee monitoring.  

Free cyber security tools

DWF Solutions: DWF's Data Protection and Cyber Security team has launched a suite of free cyber security tools. Please see the Cyber and ransomware section.


General updates

How's Data Protection Doing In Your Country?

Our Global Head of Data Protection, Privacy & Cyber Security, Stewart Room, recently wrote an article How’s Data Protection Doing In Your Country? (forbes.com) in which he looks at the drivers for data protection and how the law stands on elements such as consumer action, privacy activism and corporate governance, and reflects on where the UK currently stands in relation to the EU.

UK Data Protection and Digital Information Bill updates

The Data Protection and Digital Information Bill has been subject to significant delays following its first reading on 18 July 2022. Since the latest update in October, in which it was confirmed that the UK GDPR will be replaced, there has been little communication from the Government, causing uncertainty about the UK's future data protection regime. 

The Government recently announced its newly created Department for Science, Innovation and Technology, listing among its 2023 priorities to "deliver key legislative and regulatory reforms to drive competition and promote innovation, including the Data Protection and Digital Information Bill…"

Despite some briefings to the contrary the week before, in an interview with Morning Tech UK, Michelle Donelan (Secretary of State for Science, Innovation and Technology) confirmed that the Data Protection and Digital Information Bill will be brought back to Parliament during the week commencing 6 March 2023. The Bill's page on Parliament.uk shows the first stage as "In Progress", rather than proceeding to the second stage as it had previously. We understand that the Bill has been redrafted; however, there is currently no clarification as to how the Government intends to simplify the UK's data protection regime. It will be interesting to see whether the Bill contains minimal changes or whether it has been drafted again from scratch using models from other laws around the world (as had been suggested previously).

We are monitoring developments closely and anticipate activity on this around 9 or 10 March, with the Bill's return to Parliament. We will report on any further announcements and their implications as they happen. 

DWF Solutions: If you are considering your regional or global data protection compliance strategy, let us know: we can help with and guide you through those strategic decisions. 

UK Online Safety Bill Update

The Online Safety Bill completed its second reading in the House of Lords on 1 February. If it becomes law, the Bill will:

  • establish a new regulatory regime to address illegal and harmful content online;
  • impose legal requirements on search engine and internet service providers, including those providing pornographic content; and
  • confer new powers on the Office of Communications (Ofcom), enabling it to act as the online safety regulator.

During the second reading, the House of Lords discussed proposed changes to the Bill, including:

  • the use of age verification measures;
  • the responsibility of social media platforms to tackle illegal content;
  • protection for vulnerable adults online; and
  • the promotion of harmful content in social media algorithms. 

The Bill will now proceed to Committee stage. We will monitor developments and update you in future issues of DWF Data Protection Insights.

ICO guidance for games developers on protecting children

The ICO has issued guidance for games developers to ensure that games comply with the Children's Code. The ICO recommends that developers:

  • identify if players are under the age of 18 and discourage false declarations of age;
  • ensure that games are not detrimental to children’s health and well-being, by including checkpoints and age-appropriate prompts to encourage players to take breaks;
  • turn off behavioural profiling for marketing by default;
  • if a child chooses to opt into receiving ads, implement measures to control or monitor product placement, advertising, or sponsorship arrangements;
  • discourage the use of 'nudge techniques' to encourage children to make poor privacy decisions; and
  • review the marketing of social media competitions and partnerships to children, and avoid encouraging children to create social media accounts out of fear of missing out on rewards.

DWF Solutions: By way of a reminder, the ICO's Children's Code applies to information society services likely to be accessed by children in the UK, not just services specifically directed at children. If you would like advice on how to comply with the Code, please contact one of our privacy specialists.

EU court decision on the dismissal of a DPO

The European Court of Justice (ECJ) has provided a ruling in a case where a data protection officer (DPO) was dismissed due to the risk of a conflict of interest. Article 38(6) of the GDPR provides that the DPO may fulfil other tasks and duties, and the controller or processor shall ensure that any such tasks and duties do not result in a conflict of interests. The ECJ noted that it is the responsibility of the controller or processor not to entrust a DPO with performing tasks or duties which could impair the execution of their DPO functions. In particular, the DPO cannot determine the objectives and methods of a personal data processing activity.

This ruling is not binding on the UK courts, although the UK GDPR contains equivalent provisions regarding DPOs. The Data Protection and Digital Information Bill, which is currently on hold, contained provisions changing the requirement to appoint a DPO and replacing it with another role for UK compliance purposes.

We are monitoring developments and will report in future issues of DWF Data Protection Insights. If you would like advice about the rules governing the appointment and role of the DPO, please contact one of our specialist lawyers.


Adtech and direct marketing

Ransomware and Adtech in the Financial Services sector

Read our article about Ransomware and Adtech in the Financial Services sector.

ICO fines company £200,000 for calls to people registered with TPS

The ICO has fined an appliance service and repair company £200,000 for making 1,752,149 unsolicited direct marketing calls to subscribers registered with the Telephone Preference Service (TPS). This shows that the ICO is continuing with its policy of enforcing the Privacy and Electronic Communications Regulations (PECR) and provides a reminder about the importance of screening call lists against the TPS.

DWF Solutions: Please contact one of our privacy specialists if you would like advice on how to conduct direct marketing in accordance with the law. This is a complex area because it requires an understanding of both the UK GDPR and PECR, which in turn have different rules for corporate and individual subscribers.


AI and innovation

Big tech social media platforms: liability for user content

Shervin Nahid, Senior Associate in DWF's Data Protection and Cyber Security team, wrote on LinkedIn about two significant cases in the US Supreme Court relating to the big tech online social media platforms and the content published on their sites. The cases centre on the platforms' legal liability in respect of content posted by their users, which touches on the frequently debated issue of freedom of speech versus the spread of damaging material. One of the key factors for consideration is the social media sites' argument that the platforms are neutral, which allows for free speech and avoids unintended consequences, e.g. censorship. However, algorithms that recommend content based on inputs such as viewing history have the potential to cause very significant harm, and the two cases concern exactly this: social media's potential influence on two terrorist attacks.

The development of algorithmic technology is remarkable. If that same technological innovation can be deployed to address the potential for significant online harms, e.g. through further advancements in content moderation techniques, that would be beneficial. In the meantime, it will be interesting to see how the US Justices rule, given that many have not yet expressed views on the matter.

See Shervin's original LinkedIn post here

Facial recognition technology in schools

The ICO has published a letter it sent to a local council about the use of facial recognition technology (FRT) to manage 'cashless catering' in schools. Read Stewart Room's article here: Facial Recognition In Schools: Clever Tech. Bad, Bad, Bad Implementation (forbes.com). It underlines the need to get the basics right in data protection matters.  

ICO blog on use of AI by local authorities

On 19 January 2023, the UK ICO published a blog post on its sample-based findings regarding the use of AI by UK local authorities. The sample comprised 11 such local authorities.

As set out in the blog, the ICO's analysis covered the providers of the solutions concerned, who stated "that the processing is not carried out using AI or machine learning but with what they describe as a simple algorithm to reduce administrative workload, rather than making any decisions of consequence."

This time, the ICO's analysis did not turn to the more controversial areas of AI deployment, e.g. public healthcare or law enforcement. These areas are already addressed in the ICO's existing guidance and findings. Consequently, it would be inappropriate to conclude that the use of AI by public authorities is in any case 'off the hook'.

Notably, the UK ICO concluded with respect to all authorities assessed "that there is meaningful human involvement before any final decision is made on benefit entitlement." The existence of human involvement may indeed help organisations remain outside the scope of Article 22(1) of the GDPR, and is a significant contributing factor when considering safeguards during a Data Protection Impact Assessment.

Implementing privacy by design is vitally important in the use and deployment of AI tools, as is consideration of their limits and, therefore, the degree of human interaction required with the results. Consequently, designing compliant AI tools requires the involvement of your organisation's privacy officers and/or sound prior legal advice.

In the foreseeable future, the compliance threshold may rise for a range of solutions, whether through the engagement of regulators as part of the UK's AI Strategy or through the upcoming Data Protection and Digital Information Bill.

Overall, given the current economic conditions and the constant pressure on public sector pay, there is real demand for GDPR-compliant tools that lessen administrative burdens. However, to fully reap the economic opportunities and manage the associated risks, the providers and users of AI technologies should take into account the regulatory considerations for the circumstances in which their tool is used.

We regularly advise clients:

  • on assessing the legal compliance risks of particular solutions, helping to close the compliance gaps with safeguards fit for their products/services, and
  • on integrating privacy into their organisations, including training and fostering the involvement of privacy teams at the right stages of business decision-making.

DWF Solutions: If you require legal assistance or advice regarding AI and its implications, or would like to discuss any of the points raised above, please contact one of our Data Protection & Cyber Security team.


Cyber and ransomware

Cybersecurity incidents: suite of free materials and services

If you are a General Counsel, part of a General Counsel's team, or responsible for data protection, we have put together a suite of free value-add materials and services to help you prepare for and navigate through a serious cybersecurity incident:

  • Legal Risks White Paper – it identifies the legal risks that can be triggered by an incident, or by incident response itself, with timelines and impacts.
  • Playbook for General Counsel – it guides you through the issues that you need to cover if an incident occurs.
  • DWF RAPID Incident Response Readiness Assessment – this is an online self-assessment tool that measures your readiness to deal with an incident.
  • Ransomware and Data Extortion Awareness – a training session for you and your team covering the technical and legal issues, and response strategies.

Please contact one of our Data Protection and Cyber Security specialists for more information.

See also Ransomware and Adtech in the Financial Services sector above.  


Data transfers

EU-US transfers update

Following our previous reports on the progress of the EU adequacy decision for the US, on 14 February the European Parliament raised a number of objections to the current framework, which relies on an Executive Order from President Biden, and urged the European Commission not to grant adequacy to the US on that basis. The objections cited were: 

  • The intended Data Protection Review Court (DPRC) is not sufficiently independent or impartial, partly due to it being part of the executive rather than the judiciary. 
  • Decisions made by the DPRC will not be made public or available to complainants. 
  • "The redress process provided by the Executive Order is based on secrecy and does not set up an obligation to notify the complainant that their personal data has been processed, thereby undermining their right to access or rectify their data". 
  • The redress process does not provide for an avenue for appeal in a federal court and therefore, among other things, does not provide any possibility for the complainant to claim damages. 
  • Proportionality and necessity are long-standing and key elements of the EU data protection regime. However, the Executive Order's definitions of these terms are not in line with their definitions under EU law and will instead be interpreted under US law. 
  • The Executive Order can be amended by the current US President or any other at any time, therefore its meaning and application is not clear or future-proof, creating uncertainty around its future validity.  
  • Unlike other jurisdictions benefiting from an adequacy decision from the European Commission, the US does not have a federal data protection law in place.

In its conclusions, the Committee reiterated its resolution of 20 May 2021, in which it called on the Commission "not to adopt any new adequacy decision in relation to the US, unless meaningful reforms were introduced, in particular for national security and intelligence purposes".

Furthermore, in its conclusion the Committee stated: "the EU-US Data Privacy Framework fails to create actual equivalence in the level of protection" and "calls on the Commission to continue negotiations with its US counterparts with the aim of creating a mechanism that would ensure such equivalence and which would provide the adequate level of protection required by Union data protection law and the Charter as interpreted by the CJEU". Given the above, it "urges the Commission not to adopt the adequacy finding". 

A full Parliament vote on the Resolution is anticipated in the upcoming months, however even if the Resolution is passed, it will not be binding on the European Commission with regards to its adequacy decision.

The EDPB also published its findings on 28 February 2023. Whilst the EDPB noted the improved protections in the framework over the previous regimes, it considered that a range of topics still require clarification and explanation. Addressing them, it considers, would improve the overall clarity and comprehensiveness of the framework, increase its resistance to challenge (which has already been mooted by Max Schrems) and improve its implementation. The concerns range from the complex structure of the framework materials, to appropriate controls for onward transfers, clarity as to the expected standards for emerging technology such as AI, the appropriate degree of oversight, the EDPB's position that US intelligence agencies' use of EU personal data should be subject to prior authorisation by an independent authority, and the need for close monitoring of the approach as it is implemented and throughout its life.  


Public sector

FOIA publication schemes: ICO report

The ICO has published a report called 'Publication Schemes: a snapshot of compliance' which examines public authorities' compliance with the publication scheme requirements of the Freedom of Information Act 2000 (FOIA).

FOIA requires public authorities to proactively publish information in accordance with the ICO Publication Scheme. The report is based on a sample of public authorities across different sectors, and its key findings include:

  • 75% of public authorities had adopted the model Publication Scheme, but only 25% had published the information that the ICO would expect; 
  • compliance varies between sectors. For example, 100% of the universities had a Publication Scheme, but only 25% of schools and 5% of medical practices; 
  • of the schemes the ICO could see had been reviewed, a third had not been reviewed for more than five years; 
  • only 12% of public authorities had evidence of publishing datasets; and 
  • there was evidence to suggest that small authorities struggled disproportionately to comply with the Publication Scheme.

The report ends with the ICO's recommendations:

  • consult the ICO guidance on FOIA Publication Schemes to make sure you are aware of the legal requirements; 
  • pay particular attention to the requirements regarding dataset publication. Many public authorities may have overlooked their legal obligations to publish datasets released in response to FOIA requests; 
  • put in place a process so that your scheme is regularly reviewed and maintained. Consider the seven classes of information covered by the publication requirement and how regularly the kind of information you hold in these classes changes over time; and 
  • review the ICO’s new definition document relating to your sector, to check whether you are publishing all the examples of information that could be in your scheme.

DWF Solutions: Please contact Jay Mehta for advice on FOIA compliance, including how to meet the publication scheme requirements.

DHSC draft guidance on NHS England’s protection of patient data

The Department of Health & Social Care (DHSC) has published draft guidance setting out the measures NHS England will undertake to protect the confidentiality of patient data following the transfer of NHS Digital's statutory functions to it on 31 January.

The guidance states that NHS England will be required to adopt the same statutory data protections that had been implemented by NHS Digital, plus the following additional measures:

  • governance, scrutiny and accountability structures to protect patient data;
  • processes for obtaining independent advice on data protection;
  • procedures governing internal access to data, including processes for reviewing, assuring and scrutinising internal requests for access;
  • arrangements for engaging with key stakeholders;
  • implementation of specific technical measures and controls;
  • safeguards that will apply when NHS England enters into arrangements with third party processors; and
  • transparency and reporting obligations.

DHSC states that the final form of the guidance will be published 'within a reasonable timeframe'.

IPT publishes decision regarding MI5's failure to protect personal data

The Investigatory Powers Tribunal (IPT) has published its decision in a case brought by the privacy organisations Liberty and Privacy International against MI5. The IPT found that MI5 had breached the Regulation of Investigatory Powers Act 2000 and the Investigatory Powers Act 2016 by failing to comply with safeguards regarding the acquisition and holding of personal data held in bulk datasets and bulk communications data.

See also ICO blog on use of AI by public authorities under AI and innovation above.


For advice on any aspect of Data Protection & Cyber Security, please contact one of our specialists.
