ChatGPT’s ‘hallucination’ problem hit with another privacy complaint in EU


OpenAI faces another privacy complaint in the European Union. This one, filed by the privacy rights nonprofit noyb on behalf of an individual complainant, targets the inability of its AI chatbot, ChatGPT, to correct the misinformation it generates about individuals.

The tendency of GenAI tools to produce flatly false information is well documented. But it also puts the technology on a collision course with the bloc’s General Data Protection Regulation (GDPR), which governs how regional users’ personal data can be processed.

Penalties for non-compliance with the GDPR can reach up to 4% of global annual revenue. More importantly for a resource-rich giant like OpenAI: data protection regulators can order changes to how information is processed, so GDPR enforcement could reshape how generative AI tools can work in the EU.

OpenAI has already been forced to make some changes after early intervention from the Italian data protection authority, which briefly forced the local shutdown of ChatGPT in 2023.

noyb is now filing the latest GDPR complaint against ChatGPT with the Austrian data protection authority on behalf of an anonymous complainant (described as a “public figure”) who discovered that the AI chatbot had given them an incorrect date of birth.

Under the GDPR, EU citizens have a range of rights relating to information about them, including the right to have inaccurate data corrected. noyb contends that OpenAI does not meet this obligation with regard to its chatbot’s output. The nonprofit said the company refused the complainant’s request to correct the incorrect date of birth, responding that it was technically impossible for it to do so.

Instead, it proposed filtering or blocking data on certain prompts, such as the complainant’s name.

OpenAI’s privacy policy states that users who notice the AI chatbot has generated “inaccurate factual information about you” can submit a “correction request”, including by emailing its dsar contact address. However, it caveats this line by warning: “Given the technical complexity of how our models work, we may not be able to correct the inaccuracy in every case.”

In that case, OpenAI suggests users instead request that it remove their personal information from ChatGPT’s output entirely, by filling out a web form.

The problem for the AI giant is that GDPR rights are not à la carte. European citizens have the right to request rectification. They also have the right to request the deletion of their data. But, as noyb points out, it is not up to OpenAI to choose which of these rights is available.

Other elements of the complaint focus on GDPR transparency concerns, with noyb arguing that OpenAI is unable to say where the data it generates about individuals comes from, nor what data the chatbot stores about people.

This is important because, again, the regulation gives individuals the right to request such information by making a so-called subject access request (SAR). Per the complaint, OpenAI failed to adequately respond to the complainant’s SAR, disclosing no information about the data processed, its sources or its recipients.

Commenting on the complaint in a statement, Maartje de Graaf, data protection lawyer at noyb, said: “Inventing false information is in itself quite problematic. But when it comes to false information about individuals, the consequences can be serious. It is clear that companies are currently unable to ensure that chatbots like ChatGPT comply with EU law when processing data about individuals. If a system cannot produce accurate and transparent results, it cannot be used to generate data about individuals. It is the technology that must comply with legal requirements, not the other way around.”

noyb said it is asking the Austrian DPA to investigate the complaint over OpenAI’s data processing, and urging it to impose a fine to ensure future compliance. But it added that it is “likely” the case will be handled via EU cooperation.

OpenAI faces a very similar complaint in Poland. Last September, the local data protection authority opened an investigation into ChatGPT following a complaint by a privacy and security researcher who also found he was unable to get incorrect information about him corrected by OpenAI. That complaint likewise accuses the AI giant of failing to comply with the regulation’s transparency requirements.

The Italian data protection authority, meanwhile, still has an open investigation into ChatGPT. In January, it issued a draft decision finding that OpenAI had violated the GDPR in several ways, including through the chatbot’s tendency to produce false information about people. The findings also address other crucial issues, such as the lawfulness of the processing.

The Italian authority gave OpenAI one month to respond to its findings. A final decision remains pending.

Now, with another GDPR complaint filed against its chatbot, the risk of OpenAI facing a string of GDPR enforcement actions across different member states has increased.

Last fall, the company opened a regional office in Dublin, aiming to reduce its regulatory risk by routing privacy complaints through the Irish Data Protection Commission, thanks to a GDPR mechanism intended to streamline oversight of cross-border complaints by funneling them to a single lead authority in the member state where the company has its “main establishment”.
