Critics claim Paris is using the 2024 Games to introduce Big Brother CCTV

France’s National Assembly is due to pass a law on Tuesday ahead of the 2024 Olympics in Paris. Article 7 is the most controversial aspect of this law, as it will allow the use of AI video surveillance to detect abnormal behavior. Human rights organizations and the French left condemned the measure.

The wide-ranging law that the French National Assembly is due to adopt on March 28, ahead of the Paris 2024 Olympic Games, will allow shops to open on Sundays, establish a health center in the department of Seine-Saint-Denis (north-east of Paris) and allow the French state to run background checks on future accreditation holders. However, Article 7 of the law is particularly controversial: it stipulates that AI video surveillance may be used, on an experimental basis, to ensure the security of the Olympic Games. Human rights groups say the use of this technology will set a dangerous precedent.

During the preliminary phase, Article 7 was adopted with the votes of the presidential majority, the French right-wing party The Republicans and the far-right National Rally. The New People’s Ecological and Social Union (NUPES), a coalition of left-wing parties, opposed it. The article will allow algorithm-based video surveillance technology to be used, on a trial basis, to secure large-scale “sporting, recreational or cultural events”.

“A total attack on the right to privacy”

“Algorithmic video surveillance is a new form of technology that uses computer software to analyze images captured by surveillance cameras in real time,” explains Arnaud Touati, a lawyer specializing in digital law. “The algorithms used in the software are notably based on machine learning technology, which allows AI video surveillance, over time, to continue to improve and adapt to new situations.”

Proponents of the technology claim it can anticipate crowd movements and spot abandoned luggage or potentially dangerous incidents. Unlike traditional video surveillance, the analysis is fully automated and handled by algorithms – which, according to the technology’s proponents, limits human error.
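To make the “abandoned luggage” use case concrete, here is a minimal, illustrative Python sketch of the kind of rule such a system might apply downstream of a computer-vision model. Everything here is invented for illustration – the object IDs, the frame format and the stationarity rule are assumptions, not details of any real deployment, which would operate on detections produced by trained vision models rather than hand-written dictionaries.

```python
def detect_abandoned(frames, stationary_frames=3):
    """Return the IDs of objects that stay in the same position for at
    least `stationary_frames` consecutive frames.

    `frames` is a list of dicts mapping object ID -> (x, y) position,
    one dict per video frame. These stand in for the per-frame
    detections a real vision model would emit.
    """
    streaks = {}   # object ID -> (last seen position, consecutive-frame run)
    flagged = set()
    for frame in frames:
        for obj_id, pos in frame.items():
            last_pos, run = streaks.get(obj_id, (None, 0))
            run = run + 1 if pos == last_pos else 1
            streaks[obj_id] = (pos, run)
            if run >= stationary_frames:
                flagged.add(obj_id)
        # Forget objects that left the scene, so a returning object
        # starts a fresh streak.
        streaks = {k: v for k, v in streaks.items() if k in frame}
    return flagged

# A bag ("bag1") sits still for four frames while a person moves away.
frames = [
    {"bag1": (10, 20), "person1": (11, 20)},
    {"bag1": (10, 20), "person1": (15, 22)},
    {"bag1": (10, 20), "person1": (30, 25)},
    {"bag1": (10, 20)},
]
print(detect_abandoned(frames))  # → {'bag1'}
```

The sketch also hints at the critics’ point: the system’s output depends entirely on thresholds and categories chosen by its designers – what counts as “stationary”, and for how long, is a human decision baked into the code.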

“As France presents itself as a champion of human rights around the world, its decision to legalize AI-powered mass surveillance during the Olympics will lead to an all-out assault on the rights to privacy, protest, and freedom of assembly and expression,” Amnesty International said in a statement after the article was passed.

A herald of future video surveillance across Europe?

Katia Roux, a technology and human rights specialist at the NGO, explains that the technology raises many concerns. “Under international law, legislation must respect the strict principles of necessity and proportionality. In this case, however, the legislator has not demonstrated this,” she says. “We’re talking about assessment technology, which must evaluate behaviors and categorize them as risky so that action can be taken afterwards.”


“This technology is not legal today. In France, experiments have been conducted, but not within the legal framework that this law proposes to create,” she said. “Nor is it legal at the European level. It is even coming up in discussions in the European Parliament on the regulation of artificial intelligence systems. The legislation could therefore also violate the European regulation currently being drafted.”

“By adopting this law, France would become the champion of video surveillance in the EU and would set an extremely dangerous precedent. It would send an extremely worrying signal to countries that might be tempted to use this technology against their own population,” she continued.


One fear is that the seemingly cold and infallible algorithm may in fact contain discriminatory biases. “These algorithms will be trained on datasets decided and designed by human beings. They can therefore incorporate the discriminatory biases of the people who designed and developed them,” Roux says.

“AI CCTV has already been used for racist ends, notably by China for the targeted surveillance of the Uyghurs, a Muslim minority in the country,” explains Touati. “Because ethnic minorities are underrepresented in the data fed to the algorithms during training, there are significant discriminatory and racist biases. According to an MIT study, while the facial recognition error rate is 1% for white men, it is 34% for black women.”

Touati, however, wants to see the glass half full. “The use of AI CCTV at events of this magnitude could also expose the algorithms’ discriminatory, misogynistic and racist biases by identifying people from ethnic minorities as potential suspects at a frequency too high to be accurate,” he explains.

Asked by members of the left-wing opposition coalition NUPES about the type of people targeted by AI video surveillance, French Interior Minister Gérald Darmanin replied: “Not (those who wear) hoodies.” The French government believes that the limits set by the law – no facial recognition, data protection safeguards – will be sufficient to prevent discriminatory practices.

“We have put safeguards in place so that calls for tender are reserved for companies that comply with a certain number of rules, including hosting data on national territory and complying with the CNIL (Commission Nationale de l’Informatique et des Libertés, the independent French regulator that ensures data protection law is applied to the collection, storage and use of personal data) and the GDPR (General Data Protection Regulation, the EU’s data protection law),” says MP Philippe Latombe, a member of the pro-European, centrist party Mouvement Démocrate. He co-signed an amendment with the National Rally requiring that the tender give priority to European companies. “Clearly, we don’t want it to be a Chinese company processing the data in China and using it for other purposes.”

“We are not reassured by the government’s guarantees. In reality, no amendment can fix it: this technology is, in itself, problematic and dangerous for human rights,” says Roux. “It will remain so until a serious assessment has been carried out, until the necessity and proportionality of its use have been demonstrated, and until a real debate has taken place with civil society stakeholders on this issue.”

Sporting events as testing grounds for technology

Although the Olympic Games are clearly the intended showcase, the experiment can begin as soon as the law takes effect and will run until December 31, 2024, four months after the end of the Paralympic Games. It could therefore apply to a wide range of events, starting with the Rugby World Cup, which runs from September 8 to October 28.

Opponents of AI video surveillance fear that its initially exceptional use will eventually become commonplace. After all, sporting events have often served as testing grounds for policing, security measures and new technologies. The 2012 London Olympics, for example, led to the widespread adoption of video surveillance in the British capital.

“We are afraid that this exceptional period will become the norm,” explains Roux, who adds that voice recognition technology, deployed on a trial basis during the 2018 World Cup in Russia, has since been used to suppress opposition.

Finally, Amnesty International fears that video surveillance will eventually lead to biometric or voice surveillance. “Facial recognition is just a feature waiting to be activated,” Roux explains.

The 2024 Olympic Games Act has not yet completed its legislative journey. Following Tuesday’s formal vote in the National Assembly, the text will undergo several modifications and will go back and forth between the Assembly and the Senate, which had previously amended it, until the two chambers agree to adopt it.

Peter O’Brien of Tech 24 contributed to this article.

This article has been translated from the original in French.
