Democratic senators on Thursday introduced a bill that would hold Facebook, YouTube and other social media companies responsible for the proliferation of vaccine lies, bogus medicine and other harmful health claims on their sites.
Co-sponsored by Democratic Senators Amy Klobuchar of Minnesota and Ben Ray Luján of New Mexico, the Health Misinformation Act targets Section 230 of the Communications Decency Act, a provision that in most cases protects platforms from being held accountable for what their users post.
The bill would strip companies of this legal shield if their algorithms promote health misinformation during a public health crisis. The change would not apply if the erroneous information were displayed in a chronological feed.
The legislation leaves it to the US Department of Health and Human Services, which is responsible for declaring public health emergencies, to define what constitutes health misinformation.
“These are some of the biggest and wealthiest companies in the world and they need to do more to prevent the spread of deadly vaccine misinformation,” Klobuchar said in a statement. “The coronavirus pandemic has shown us how deadly misinformation can be and it is our responsibility to take action.”
She cited a recent poll by the nonprofit Kaiser Family Foundation showing that two-thirds of unvaccinated people believe in myths about COVID-19 vaccines, such as the baseless claim that vaccines cause disease.
Tensions over the role of social media in spreading fraudulent claims about COVID-19 vaccines have come to a head as stalling vaccination rates and the rise of the Delta variant threaten to prolong the pandemic.
US Surgeon General Vivek Murthy warned last week that misinformation about COVID-19 was an “urgent threat.” The White House called out Facebook in particular, saying it needs to do more to tackle bogus anti-vaccination messages.
President Biden said on Friday that social platforms were “killing people,” although he later walked back that comment and said he meant that people who spread vaccine misinformation online are irresponsible.
Facebook hit back, accusing the administration of “pointing fingers.” The company said it has removed more than 18 million pieces of COVID misinformation, shown authoritative information about COVID and vaccines to more than 2 billion people, and that its own research shows that “vaccine acceptance among Facebook users in the United States has increased.”
CEO Mark Zuckerberg told The Verge website on Thursday that he was “fairly confident” that the company had been “a positive force here.”
This week, YouTube announced that it will begin adding panels to certain health-related videos with links to “authoritative” sources and will highlight videos from those sources in search results on certain health-related topics.
But critics say social media companies need to go further. The Center for Countering Digital Hate, a nonprofit that combats disinformation, says just 12 people are responsible for 65% of anti-vaccine posts shared on social media, and has criticized Facebook, YouTube and Twitter for failing to remove them completely from their platforms. (The platforms have removed some accounts belonging to a few of the dozen, but none has removed them all.)
Klobuchar said the 25-year-old Section 230 allows digital disinformation to proliferate.
“The law – which was intended to promote online discourse and allow online services to thrive – now distorts legal incentives for platforms to respond to digital misinformation about critical health issues, like COVID-19, and leaves those who suffer harm with little or no recourse,” Klobuchar’s office said in a press release about her bill.
The White House also said it was “considering” whether to amend Section 230 to tackle misinformation about COVID.
The legal shield has been criticized by lawmakers on both sides of the aisle who say it has become obsolete now that technology platforms play such a dominant role in society.
Democrats say Section 230 allows social media companies to shirk responsibility for harmful content such as disinformation and hate speech, while Republicans say it gives platforms cover to censor conservatives. (There is little public evidence showing that platforms treat conservatives more harshly than others.)
However, trying to hold platforms accountable for health misinformation may face challenges on First Amendment grounds, as such content likely falls into the category of “lawful but awful” speech, said Eric Goldman, a law professor at Santa Clara University.
“If health misinformation is constitutionally protected, there’s really little Congress can do about it,” he said. “Removing Section 230, which is a liability shield, does not expose a [social media] service to any new liability, because the Constitution will fill in the protection.”
Editor’s Note: Facebook and YouTube owner Google is one of NPR’s backers.