Texas Attorney General Ken Paxton has filed a lawsuit against Meta over Facebook’s facial recognition practices, his office announced Monday. The news was first reported by The Wall Street Journal, which notes that the lawsuit seeks civil penalties of hundreds of billions of dollars. The lawsuit alleges that the company’s use of facial recognition technology, which it has now discontinued, violated state privacy protections regarding biometric data.
A press release announcing the lawsuit alleges that Facebook stored millions of biometric identifiers contained in photos and videos uploaded by users. Attorney General Paxton says Facebook has leveraged users’ personal information “to grow its empire and reap historic windfall profits.”
“Facebook will no longer take advantage of people and their children for the purpose of profit at the expense of their safety and well-being,” Paxton said in a statement. “This is yet another example of Big Tech’s deceptive business practices and it must stop. I will continue to fight for the privacy and safety of Texans.”
Meta did not immediately respond to TechCrunch’s request for comment.
The lawsuit alleges that Facebook misled the public by concealing the nature of its practices and that the Texans who used the app were oblivious to the fact that Facebook was capturing biometric information from photos and videos. It also alleges, without providing further context, that users were unaware that Facebook disclosed users’ personal information to other entities that further exploited them.
“Facebook has often failed to destroy collected biometric identifiers within a reasonable time, exposing Texans to ever-increasing risks to their well-being, safety and security,” the lawsuit states. “Facebook has knowingly captured biometric information for its own commercial benefit, to train and improve its facial recognition technology, and thereby create a powerful artificial intelligence apparatus that reaches every corner of the world and ensnares even those who intentionally avoided using Facebook services.”
In November 2021, Meta announced that it was shutting down its facial recognition system on Facebook and would no longer automatically identify registered users in photos and videos. It also said it would remove more than a billion individual facial recognition patterns as part of the shutdown. But Texas officials asked Meta to keep that data for its investigation, likely delaying a full system shutdown.
This isn’t the first time Meta has faced legal action over its facial recognition practices. Last March, Facebook was ordered to pay $650 million for violating an Illinois law aimed at protecting state residents from invasive privacy practices. This law, the Biometric Information Privacy Act (BIPA), is a powerful state measure that has tripped up tech companies in recent years. The lawsuit against Facebook was first filed in 2015, alleging that Facebook’s practice of tagging people in photos using facial recognition without their consent violated state law.
As a result of the ruling, 1.6 million Illinois residents received at least $345 each under the California federal court’s final settlement decision. The final figure was $100 million higher than the $550 million Facebook proposed in 2020, which a judge ruled insufficient. Facebook turned off its facial recognition auto-tagging feature in 2019, making it opt-in instead and addressing some of the privacy criticisms echoed by the Illinois class action lawsuit.
A $650 million settlement would be enough to significantly impact any normal business, but Facebook shrugged it off, much as it did the FTC’s record $5 billion fine in 2019, which followed the agency’s investigation into the social media giant’s privacy practices.
The new lawsuit in Texas shows that sweeping privacy laws could have a significant impact not just on Meta’s operations, but on the practices of all major tech companies. In recent years, a series of lawsuits has accused Microsoft, Google and Amazon of breaking similar laws by using people’s faces to train their facial recognition systems without express consent.