Molly’s death showed that social media companies cannot be trusted to protect children
This has taken a long time and the Bill has rightly been the subject of intense scrutiny and debate. I would like to thank the Telegraph for its relentless campaigning on duty of care, which has been crucial in getting us to this point. Many diligent ministers, such as Michelle Donelan, the current Secretary of State, and courageous civil society activists, such as Baroness Beeban Kidron, also made this historic legislation a reality.
As the bill finally passes, in some ways it feels like we’re crossing the finish line after a long race. But the reality is that we are only at the starting line. This process is only just beginning, and Ofcom needs to come out of the blocks quickly to catch up with tech companies that are 20 years ahead of it.
Nor should we underestimate the task facing Ofcom, which will come up against the most powerful and influential companies on the planet. There is a risk that technology companies will attempt to skew the evidence base and embroil the regulator in appeals and litigation to protect their business models.
Faced with such powerful companies, Ofcom may be tempted to start cautiously, with an aversion to risk. That would not be enough. We need an effective, proactive regulator that can drive change quickly and make social media fundamentally safe by design. Ofcom must move fast to put things right.
The regulator must be a robust oversight body, guided by the public interest and not by pressure from Silicon Valley. To be effective, the new regime must examine the fundamental design of social media platforms and the key decisions about how they operate.
For example, at Molly’s inquest, Meta’s head of health and wellbeing policy, Liz Lagone, said that many of the posts my daughter saw on Instagram were considered “safe” by the company. It is clear that tech giants like Meta cannot be trusted to act as judge and jury of what is harmful, which is why we need an impartial but rigorous regulator to intervene.
The problem, of course, lies not only in the content on social media, but also in the role played by the tech companies’ powerful algorithms, which decide what each user sees and can steer them towards harmful content.
It wasn’t just that there was self-harm and suicide content on Instagram and other apps, but that their algorithms swept up thousands of such posts and funneled them to Molly. It wasn’t just what Molly saw, but the sheer volume of it, that we believe helped to overwhelm her.
If algorithms are to be allowed to target children as young as 13 with content, companies must be able to demonstrate that they are not spreading harmful or even life-threatening messages.
Throughout this process, Ofcom must never lose sight of the most important aspect of its work: protecting children from entirely avoidable harm. Young people like Molly are sitting in their rooms today, scrolling on their phones through feeds curated by these tech companies. This regulation must ensure that they are not drawn into distorted realities where suicide and self-harm are dangerously normalized.
If this act does not prevent children and teenagers from being pushed down deadly rabbit holes like Molly was, it will have failed.
Ian Russell founded the Molly Rose Foundation in Molly’s memory to campaign on suicide prevention.