This month in review:
This month we have a wide range of topics for you to consider, ranging from the battle to achieve a successful EU to US data transfer model, through to a detailed look at AI regulation and next steps, plus the latest enforcement cases.
Focusing in on some specific areas:
- All sectors but particularly Technology – AI use – leading again this month is the now ever-topical AI. Whether it is for facial recognition in law enforcement where there are new guidelines from the European Data Protection Board (EDPB), or for a deeper review of ChatGPT – read on!
- Consumer sector – direct marketing – the ICO continues its enforcement action whilst also providing video content regarding compliant approaches to direct marketing. If in doubt in this complex area, it's worth checking the latest guidance.
- Public sector – assessing disclosures and whether they are reasonable and also whether your DSAR compliance is up to scratch are the two focused topics we assess this month.
- Our events – catch up on our Workforce Monitoring Webinar which we held on 16 May, covering employment and data protection issues and trends. You can view the slides and recording here.
- General updates
- Adtech and direct marketing
- AI and Innovation
- Cyber, breach and ransomware
- Data transfers
- Public sector
UK – update on the ICO and the Data Protection and Digital Information (No.2) Bill
On 23 May 2023, John Edwards delivered his opening remarks at the European Parliament's Committee on Civil Liberties, Justice and Home Affairs (LIBE).
The speech began by emphasising the importance of maintaining strong links with international colleagues, particularly as society is becoming more interconnected than ever. Mr Edwards named a number of Data Protection Authorities across Europe that he would be meeting with, demonstrating the shared ambition of collaborative working. The speech highlighted that, despite the ICO no longer being part of the European Data Protection Board (EDPB) and no longer having the formal mechanisms for consistency in the application of GDPR available, other forums provide opportunities to collaborate and explore the solutions available to tackle the, often shared, challenges.
Whilst recognising that fines have their place (including referencing the recent TikTok and Clearview AI fines), Mr Edwards expressed the view that enforcement for the ICO is not just about issuing more and larger fines, but rather it is keen to utilise the spectrum of regulatory responses to achieve success in the form of positive outcomes for individual rights.
He went on to say that the Data Protection and Digital Information Bill is an important part of this – an "evolution, not revolution". Mr Edwards explained that the Bill intends to achieve an improved level of data protection and will preserve the ICO's independence whilst increasing its accountability, meaning that the ICO is able to support the Bill as it currently stands. He also gave an insight into the key concerns raised by the ICO, which were addressed during the consultation process. The resulting amendments include: introducing safeguards to enhance the transparency and accountability of the process by which the Secretary of State approves or rejects statutory codes, limiting this to only the most significant matters; ensuring that the exercise of subject access rights remains free of charge; maintaining the UK's adequacy by following the GDPR model for international transfers; and retaining human oversight of automated decision-making.
EU – Key dates for new data protection-related legislation
The IAPP has created a timeline setting out the new primary EU regulations and initiatives concerning and relating to privacy, together with a brief status update for each. Where an initiative has not yet been finalised, the status update reflects the current position. It acts as a visual representation of the various updates expected for 2024 and beyond, and the range of topics covered makes it an essential reference tool. When considering these laws and proposals (as with any other compliance matter), do always test whether UK law, EU law, or both apply to your organisation's processing activities.
EU – update on the scope of DSARs
A recent judgment handed down by the Austrian Federal Administrative Court has provided some clarity as to the scope of a data subject access request and the wording of Article 15 of the EU GDPR.
Specifically, at paragraph 45, it is confirmed that: "the right to obtain from the controller a copy of the personal data undergoing processing means that the data subject must be given a faithful and intelligible reproduction of all those data." It goes on to say that, where the provision of such personal data is essential to enable the data subject to exercise effectively the rights conferred on them, the controller should provide copies of extracts from documents or even entire documents or extracts from databases which contain, inter alia, those data, taking into account the rights and freedoms of others.
It also confirms that, at Article 15(3) of the EU GDPR, "information" relates exclusively to the personal data of which the controller must provide a copy (see paragraph 53).
From a UK perspective, this is at odds with the ICO's guidance, which indicates that the right is only to the personal data itself and not to copies of documents containing that personal data.
ICO fines two businesses a total of £180,000 for making unlawful marketing calls
The ICO has fined Ice Telecommunications Ltd £80,000 and UK Direct Business Solutions Limited £100,000 for making 480,000 unlawful marketing calls to businesses that were signed up to the Telephone Preference Service (TPS), which resulted in over 120 complaints. The investigation found both companies had made repeated and persistent calls to businesses, with some callers being described as rude and argumentative, despite multiple warnings from the TPS.
Andy Curry, Head of Investigations at the ICO, stated "these fines are another clear message to companies flouting the law – we will take action to ensure the public and UK businesses are protected and legitimate businesses complying with the law do not lose out".
ICO launches a new video aimed at helping small and medium-sized businesses navigate electronic communications law
The ICO has released a video aimed at helping small and medium-sized businesses understand the requirements of UK data protection legislation around direct marketing communications. The ICO has also provided a training resource, 'information governance for your small business', which is based on the internal training delivered to ICO staff members and has been adapted for use by small businesses. It is currently in a testing phase and the ICO encourages users to leave feedback, which it will use to develop and finalise the resource in due course.
The video and training resource can be accessed here.
UK to ban financial sales calls
On 3 May 2023, the Government announced its new plan to tackle fraud, which "now accounts for over 40% of crime, costs [the UK] nearly £7 billion a year and [these] proceeds are funding organised crime and terror". The Government's new plans include:
- A ban on so-called 'SIM farms' (technical devices that allow criminals to send bulk scam texts);
- A ban on cold calls on all financial products so that anyone who receives them will know it is a scam;
- Working with Ofcom to stop number 'spoofing' (where criminals impersonate UK numbers and trick people into thinking they are speaking with legitimate businesses);
- Working with tech companies to make it as easy as possible to report fraud online;
- Increased work with international partners and the UK's intelligence community to identify and disrupt fraudsters overseas;
- Looking at giving banks more time to process payments to allow suspicious payments to be investigated and stop people from falling victim to fraudsters;
- The launch of a new National Fraud Squad led by the National Crime Agency and the City of London Police; and
- £30 million will be invested in a state-of-the-art reporting centre which will be operational later in the year.
EU – EDPB guidelines on facial recognition in law enforcement
The European Data Protection Board (EDPB) adopted the final version of its Guidelines on facial recognition technologies in the area of law enforcement (the “Guidelines”) on 17 May 2023. The EDPB notes that it “understands the need for law enforcement authorities to benefit from the best possible tools to quickly identify the perpetrators of terrorist acts and other serious crimes” but that “such tools should be used in strict compliance with the applicable legal framework and only in cases when they satisfy the requirements of necessity and proportionality.”
The Guidelines consist of the main body of guidance, and three annexes which include:
1. a template for assessing the severity of the interference with fundamental rights caused by facial recognition technology;
2. practical guidance for authorities wishing to implement facial recognition technology; and
3. a set of hypothetical scenarios and considerations.
As recent news headlines remind us, facial recognition for these purposes is contentious and requires careful consideration well before deployment, even in non-law enforcement settings.
UK – CMA to investigate AI
The Competition and Markets Authority (CMA) is opening an initial review of competition and consumer protection considerations in the development and use of AI foundation models.
According to the press release on 4 May 2023, this initial review will:
- explore how AI foundation models and their use is developing;
- examine what opportunities and risks these scenarios could bring for competition and consumer protection; and
- produce a report which sets out its findings and gives guiding principles to support competition and protect consumers as AI foundation models develop.
The announcement notes that, to ensure that innovation in AI continues in a way that benefits consumers, businesses and the UK economy, the Government asked regulators, including the CMA, to consider AI's development and deployment against five overarching principles:
(1) safety, security and robustness;
(2) appropriate transparency and explainability;
(3) fairness;
(4) accountability and governance; and
(5) contestability and redress.
EU – In-depth: ChatGPT and content generators
Artificial intelligence and its impact on society, work, law and the economy has recently featured heavily in public debate. This is not an abstract topic, as many might assume: content generators, such as the extremely popular ChatGPT, are already changing business at a remarkable rate and becoming widely used tools. Given the speed at which these tools have been popularised, there are as yet no suitably comprehensive legal regulations governing the related issues of cybersecurity or copyright protection. Lawmakers have not yet had time to react to this new technology.
The use of content generators
ChatGPT is a language model based on machine learning algorithms that leverage large text datasets to train the model to recognise language patterns and dependencies. This raises questions in the context of personal data protection and the possibility of using such generators at work and in business. The first statements and recommendations of entities that assess the consequences of using tools of this type have started to appear. For example, in Poland, a scientific and didactic consortium issued recommendations on the use of content generators for universities. Some of these recommendations are worth setting out here, since they are also helpful for business and other entities.
Firstly, content generators (e.g. ChatGPT) have been defined as applications that can generate content in the form of text, images or other material that may appear to be human-made. The consortium takes the view that the person using a content generator should be responsible for any data published as a result of using the tool; however, users should not be regarded as authors of the content produced in this way. A person who uses such a tool should cite it as a source in their references. In addition, the use of content generators carries a high risk of biased results, susceptibility to flawed sources, and errors in the answers such tools provide. It is therefore necessary to improve the qualifications of the people (e.g. employees) using content generators and train them to critically assess the results these tools produce.
First reactions of European regulatory authorities
Questions related to the use of ChatGPT are also beginning to be asked by personal data protection supervisory authorities in the European Union. In March this year, the Italian data protection authority issued a ruling that blocked the use of ChatGPT in the country due to a possible violation of data protection laws. However, the blockade was lifted one month later (in April 2023) after OpenAI assured the Italian DPA of its commitment to improving transparency around the use of personal data and guaranteed that appropriate steps would be taken to address the concerns raised. Regulatory authorities in other countries are also increasing their interest in ChatGPT. Germany's Data Protection Chief has stated that a ban on ChatGPT in Germany could be considered, while the Spanish Data Protection Agency has announced the opening of an investigation into ChatGPT. This matter has also been discussed by the European Data Protection Board, which launched a dedicated task force to foster cooperation and to exchange information on possible enforcement actions taken by European data protection supervisory authorities.
Many questions about the protection of personal data in the context of the use of ChatGPT remain open. It is unclear how and on what legal basis models of this type obtain personal data for training. There are ongoing discussions about whether a separate category should be created for content generators under the proposed Artificial Intelligence Act (AI Act). We expect that the concerns should, and will, be addressed in the near future. In a conversation with MIT Technology Review Magazine, the European Data Protection Supervisor (EDPS) Wojciech Wiewiórkowski warned that he sees tools of this type as a threat on the scale of the Cambridge Analytica scandal. Further legal steps should therefore be taken as soon as possible to ensure safe and ethical use of content generators.
EU – CJEU decision regarding the compensation threshold for GDPR infringement
The Court of Justice of the European Union (the CJEU) has confirmed that not every "infringement" of the provisions of the GDPR confers the right to compensation for a data subject. Furthermore, the CJEU stated that there is no "threshold of seriousness" in determining whether compensation for non-material damage should be awarded.
Background of the case
The case before the CJEU concerned personal data processed illegally for statistical extrapolation. The Austrian Postal Service (Österreichische Post) used an algorithm to generate statistical data about the likely political affinities of persons resident in Austria. The claimant had not given his consent to the processing of his personal data, yet an affinity with a certain Austrian political party was attributed to him. Even though this information was not communicated to any third party, the claimant decided to go to the national court. The main reason was that being assigned a possible interest in this particular party caused him "great upset, a loss of confidence and a feeling of exposure". The claimant sought €1,000 in compensation. The case reached the Austrian courts and the Austrian Supreme Court referred questions to the CJEU.
The right to compensation
The first point related to the question whether Article 82(1) of the GDPR means that the mere infringement of its provisions is sufficient to confer a right to compensation. The CJEU carried out a literal and contextual interpretation of this article, which indicated that there are three conditions for the right to compensation. The first, "suffered damage", is apparent from the wording of the article and the recitals of the GDPR relating to it. The other two are: the existence of an infringement of the GDPR, and a causal link between the damage and the infringement. Significantly, those three conditions are cumulative, meaning that all three must be met to confer a right to compensation. As a result, the CJEU confirmed that the mere infringement of the GDPR is not sufficient to confer a right to compensation on an individual; damage and a causal link must also be established for a claim to succeed.
Threshold of seriousness
Under Austrian law, non-material damage gives rise to a right to compensation only if such damage reaches a certain "threshold of seriousness". However, the CJEU stated that under the GDPR compensation for non-material damage is not subject to a minimum "threshold of seriousness", as there is no scale of reasonable compensation in the GDPR. Article 82(1) of the GDPR should therefore be interpreted as precluding a national rule or practice that makes compensation subject to such a threshold. In addition, the CJEU emphasised that national procedures for GDPR compensation claims cannot be more burdensome than those governing similar domestic claims.
This case is the first of a number of similar ones concerning compensation under Article 82 of the GDPR that are pending before the CJEU. In the near future, we can expect further rulings and, perhaps, the development of a consolidated line of judgments in this area. Again, this is a very different position from that reached in recent UK case law.
EU to US data transfers
On 11 May 2023, the European Parliament voted to adopt a resolution on the adequacy of the protection provided by the EU-US Data Privacy Framework (Framework). The EU Parliament's key concerns with the Framework are:
- The principles in the Executive Order 14086 (Order) are not consistent with those in EU law and legitimate national security objectives are open to amendments and expansion by the US President with no obligation to update the public or EU Member States;
- The EU Parliament shares the European Data Protection Board's concerns over the failure of the Order to provide sufficient safeguards where bulk data is collected, namely the lack of independent prior authorisation, clear and strict retention rules, 'temporary' bulk collections and stricter safeguards for dissemination of bulk data;
- The underlying problem is the surveillance of non-US persons under US law and inability of EU citizens to seek effective judicial redress – EU and US citizens should have equal rights and privileges;
- Decisions of the Data Protection Review Court would not be made public or available to the complainant – therefore the person bringing a case would not be informed about the substantive outcome and the decision would be final (subject to being 'secretly' overruled by the US President);
- Remedies available for commercial matters are largely left to the discretion of companies which can use alternatives such as dispute resolution or privacy programmes;
- The Framework could be invalidated by the Court of Justice of the European Union if adopted in its current form, which would lead to a "continuing lack of legal certainty, further costs and disruption for European citizens and businesses"; and
- Unlike all other third countries that have received an adequacy decision from the EU, the US lacks its own federal data protection law.
The EU Parliament has therefore called on the EU Commission not to adopt the adequacy decision and to continue its negotiations with the US to create a mechanism which would ensure an adequate level of protection equivalent to that provided by EU law. The search for adequacy mechanism number 3 continues.
Police disclosures to the public
The ICO published a blog on 9 May 2023 which covers the questions police must ask when considering sharing personal information with the public, particularly considering necessity and proportionality. It seems that the catalyst for this was the decision by Lancashire Police to include personal information in media statements during the search for Nicola Bulley.
The blog recognises that the law specifically considers the challenges law enforcement organisations may face using personal data, and asks police to consider the risks of disclosing information alongside the valuable role disclosure can play in an investigation.
The specific questions police must ask themselves are:
- Is sharing this information necessary? The police must consider the importance of sharing something sensitive against alternatives to disclosure that would also achieve their objectives.
- Is sharing this information proportionate? The police must balance any harm or detriment that may come from sharing information with the objectives, and make sure this does not outweigh what they are trying to achieve.
- Will there be an impact on others? The police should note that sharing information about one individual may also have an effect on the privacy of their family and friends.
- Are you recording the decision making? As the Lancashire Police case demonstrates, the ICO expects the police to set out how they reached a decision to disclose information.
- What is the Data Protection Officer’s view? The DPO will be ideally placed to draw together their knowledge of data protection, as well as their understanding of their organisation’s operating processes, to help consider the balance between privacy, the public interest, and the interests of all those involved.
ICO takes action against two public authorities for failing to respond to data subject access requests
The ICO has reprimanded two councils for repeatedly failing to respond to data subject access requests (SARs) made by the public. Stephen Eckersley, Director of Investigations at the ICO, warns "other organisations should take note that [the ICO] will act if they fail to meet their legal obligations when responding to SARs".
Both councils are required to take steps to ensure that data subjects receive their personal data within the statutory period of one month (or up to three months where the request is complex or the individual has made multiple rights requests). They must also ensure they have adequate staff resources in place to respond to SARs on time and continue to implement effective measures to address the outstanding requests. The ICO has asked both councils to provide details of the remedial actions taken in light of the recommendations within six months.
SARs are time-consuming to get right, but we have great experience in dealing with bulk requests and weaponised requests.
If you require any assistance or tailored advice on any of the topics mentioned in this article, please get in contact with the Data Protection & Cyber Security team below.
Article authors: JP Buckley, Kelly Marum, Sophie Broome, Gerard Karp, Paulina Nowak