Senators introduce a bill to limit the harmful effects of social networks on young people

Sponsors say there is bipartisan support in the House and Senate.

Lawmakers on Wednesday introduced a bipartisan bill aimed at protecting children from the potentially harmful effects of social media.

The bill, sponsored by Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., comes after Congress held five hearings in recent months on the dangers social media poses to children and teens ages 16 and younger, including one at which a whistleblower testified against Facebook, now Meta, about internal documents showing the tech giant prioritized profits over the mental well-being of children.

Although the senators would not comment on the bill’s likelihood of passage, they stressed at a press conference Wednesday that it has bipartisan support in the House and Senate.

“What we’re doing in this bill is empowering these kids and their parents to take back control,” Blumenthal said.

The Kids Online Safety Act of 2022 includes five major elements:

  • Social media companies would be required to provide privacy options, the ability to disable addictive features, and the ability to opt out of algorithmic recommendations such as suggested pages or videos. The bill would also make the strongest privacy protections the default.
  • The bill would give parents tools to track time spent in the app, limit purchases and help combat addictive use.
  • The bill would require social media companies to prevent and mitigate harms to minors, including self-harm, suicide, eating disorders, substance abuse, sexual exploitation and products that are illegal for minors, such as alcohol.
  • Social media companies would be required to hire a third party to conduct independent audits quantifying the risks to minors, the company's compliance with the law and whether it is "taking meaningful steps to prevent such harm."
  • Social media companies would be required to provide children's data to academic and private researchers, who would use it to further study what harms children on social media and how to prevent those harms.
“Social media platforms have proven they’re not going to self-regulate. That’s why we’ve worked hard to make sure it’s a safer environment,” Blackburn said Wednesday.

Dr. Dave Anderson, a clinical psychologist at the Child Mind Institute, said the bill sits at the sensitive intersection of technology and public policy.

“I think politicians take what we know from science and say, ‘How can we incorporate these safeguards?’” Anderson said.

He said social media algorithms have evolved to show children only what engages them rather than a variety of viewpoints, a dangerous shift for children with mental health issues.

Meta, which owns Instagram and Facebook, had no comment on the legislation, but a spokesperson pointed to the company’s announcement in December that it was taking a tougher approach to recommendations for teens, including nudging them toward different topics if they dwell on the same subject for too long.

A Snap Inc. spokesperson said the company devotes significant time and resources to protecting teens and offers tools that let children report behavior and opt out of location services, as well as resources for parents.

TikTok updated its policies on Tuesday to “promote safety, security and well-being on TikTok,” according to a press release written by TikTok’s head of trust and safety, Cormac Keenan.

TikTok and Twitter did not respond to ABC News’ request for comment.

ABC News
