
Newsletter Tech-Data November-December 2024

29 January 2025

The Data & Media Department presents the legal news of interest from November and December 2024 in this issue of Tech-Data.

In this issue

Latest news - technologies

Publication in the OJEU of Directive (EU) 2024/2853 of the European Parliament and of the Council of October 23, 2024 on liability for defective products.

The new directive repeals Council Directive 85/374/EEC of July 25, 1985, which was no longer suited to the transition to a digital economy. Its new civil liability rules are designed to take better account of the fact that many products today feature digital functionalities and that the economy is becoming increasingly circular.

Key elements:

  • Digital economy: the new directive extends the definition of “product” to include digital manufacturing files and software. Online platforms can also be held liable for a defective product sold on their platform, in the same way as any other economic operator if they are acting in this capacity.
  • Circular economy: when a product is repaired and upgraded outside the control of the original manufacturer, the company or person who modified the product should be held liable.
  • Disclosure of evidence: the right to redress is simplified by ensuring that an injured person seeking redress before a national court may request access to the relevant evidence in the manufacturer's possession in order to substantiate their claim.
  • Products purchased from manufacturers established outside the EU: under the new rules, in order to ensure that consumers are compensated for damage caused by a product manufactured outside the EU, the company importing the product or the representative of the foreign manufacturer established in the EU may be held liable for the damage caused.
  • Burden of proof: where the injured consumer is faced with excessive difficulties in proving the defectiveness of the product or the causal link between its defectiveness and the damage, a court may decide that the claimant is only required to prove the likelihood that the product was defective or that its defectiveness is a probable cause of the damage.

This directive must be transposed by December 9, 2026.

Termination of an IT contract through the customer's fault and compensation for the loss

In a decision dated November 22, 2024, the Paris Court of Appeal issued a noteworthy ruling, of a kind rarely seen, in a dispute between a construction-sector company and a software integration service provider concerning the existence of a loss arising from the deployment of IT projects.

In this case, the service provider was to deploy an IT solution dedicated to the construction sector (an ERP, or Enterprise Resource Planning integrated management software) at its client's site. Dissatisfied with the service provided, the customer sued the service provider.

In view of the facts, the judge considered that the integrator had fully fulfilled its obligation to advise and inform its lay client. Such an outcome is relatively rare.

The contract was then terminated through the exclusive fault of the client, which is even less common.

However, although the claimant obtained payment of its invoices from the court, it was denied compensation for lost earnings.

In the end, the IT service provider did not entirely win the case, which also illustrates how difficult it is to prove a loss in IT litigation.

€240,000 fine imposed by the CNIL for the unauthorized creation of a database of 160 million contacts

In a decision dated December 5, 2024, the CNIL imposed a penalty of 240,000 euros on the company KASPR for failure to comply with obligations laid down by the GDPR. This fine, which has been made public, is accompanied by a formal notice requiring the company to bring its practices into compliance with the regulations. The decision was taken in cooperation with the other European data protection authorities.

KASPR is accused of operating a paid extension for the Chrome browser, enabling its customers to access the professional contact details of people whose profiles they consult on LinkedIn. To build up its database of some 160 million contacts, the company collected information not only from LinkedIn, but also from other online platforms, such as domain name directories. This data was then used by its customers, notably for commercial prospecting or identity.

The CNIL received several complaints from people canvassed by companies that had obtained their contact details via the KASPR extension. Following an investigation, the authority's restricted committee found that these practices violated several provisions of the GDPR, thereby justifying the sanction:

  • A breach of the obligation to have a legal basis (Article 6 of the GDPR)
  • A breach of the obligation to define and comply with a data retention period proportionate to the purpose of the processing (Article 5(1)(e) of the GDPR)
  • A breach of the obligation to provide transparency and information to individuals (Articles 12 and 14 of the GDPR)
  • A breach of the obligation to comply with requests to exercise the right of access (Article 15 of the GDPR).

Italian Data Protection Authority fines OpenAI for training ChatGPT on personal data without authorization

The Italian Data Protection Authority (Garante per la Protezione dei Dati Personali - GPDP) has fined OpenAI, developer of ChatGPT, 15 million euros for the way the generative artificial intelligence chatbot handles personal data.

The fine comes almost a year after the Authority identified various breaches of the GDPR in the way ChatGPT used personal data to train its model.

The Authority stated that OpenAI had failed to notify it of a security breach that occurred in March 2023, and that it had processed users' personal data to train ChatGPT without an adequate legal basis. It also accused the company of breaching the principle of transparency and its information obligations towards users. Finally, the Authority criticized OpenAI for failing to implement age verification mechanisms, with the resulting risk of exposing children under 13 to responses inappropriate to their degree of development and self-awareness.

The Authority has ordered OpenAI to carry out a 6-month institutional communication campaign on radio, television, in newspapers and on the Internet.

The content, to be agreed with the Authority, will promote public understanding and awareness of how ChatGPT works, in particular with regard to the collection of user and non-user data for generative artificial intelligence training and the rights that data subjects may exercise, including the rights to object, to rectification and to erasure.

Finally, as OpenAI established its European headquarters in Ireland during the course of the investigation, the Italian Authority, in accordance with the so-called one-stop-shop rule, forwarded the case file to the Irish Data Protection Authority, which has become the lead supervisory authority under the GDPR, so that it can continue the investigation into ongoing breaches that had not ended before the European subsidiary was opened.

Personal data news

Second version of EU code of practice for general-purpose AI models published

The second version of the EU code of practice for general-purpose AI models is now available. It is twice as long as the first and contains KPIs for each compliance measure.

Publication of the list of participants in the code of practice on general-purpose AI models

The European Commission has shared the list of participants in the development of the code of practice on general-purpose AI models. It should be noted that a third of the participants are individual experts, but the Commission has so far refrained from publishing their names for reasons of personal data protection.

French Ministry of the Interior put on formal notice by the CNIL over video surveillance software using real-time facial recognition

In December 2024, following the Disclose revelations, the CNIL served formal notice on the Ministry of the Interior and six French communes for non-compliance in the use of BriefCam video surveillance software, particularly with regard to real-time facial recognition.

Although the Ministry has not widely activated the facial recognition function, the technology integrated into the software has been used on an ad hoc basis as part of a judicial investigation. However, with a few strictly regulated exceptions, real-time facial recognition in public spaces is prohibited in France.

This case highlights a growing tension between evolving security technologies and a strict legal framework, particularly under the French Data Protection Act and the GDPR. Human rights and civil liberties advocates have welcomed the CNIL's intervention, seeing it as a safeguard against a potential drift towards a society of generalized surveillance.

Legal professionals saw it as a sign that the current legal framework was inadequate to meet the challenges posed by surveillance technologies. At the heart of the debate is the proportionality of the means used in relation to public security objectives. A specific legal framework seems necessary to clarify the limits on the use of facial recognition in France, particularly as AI develops and the relevant European regulations are implemented.

Dutch data protection authority fines Netflix 4.75 million euros for failure to provide information on personal data collected

The Dutch Data Protection Authority has fined Netflix 4.75 million euros, on the grounds that between 2018 and 2020, the company did not give its customers sufficient information about what it does with their personal data.

The regulator considered that Netflix had breached the GDPR by not making the information in its privacy statement sufficiently clear and by providing insufficient information to consumers who questioned the company about the data it collected about them.

This investigation by the Dutch Authority followed a complaint filed by Noyb (None of Your Business), the Austrian non-governmental data protection organization founded by Max Schrems.

EDPB adopts opinion on the use of personal data for the development and deployment of AI models

On December 18, 2024, the European Data Protection Board (EDPB) adopted an opinion on the use of personal data for the development and deployment of AI models.

This opinion examines:

  1. when and how AI models can be considered anonymous,
  2. whether and how legitimate interest can be used as a legal basis for developing or using AI models, and
  3. what happens if an AI model is developed using personal data that has been processed unlawfully.

It also considers the use of first-party and third-party data.

The opinion was requested by the Irish Data Protection Authority with a view to seeking regulatory harmonization at European level. In order to gather input for this opinion, which deals with rapidly evolving technologies that have a significant impact on society, the EDPB organized a stakeholder event and held an exchange with the EU AI Office.
