Facebook faces criticism over sexual comments on Instagram photos of minors


Instagram, owned by Facebook, is under heavy criticism for not removing accounts that post photos of children in bathing suits or skimpy clothing, photos that draw hundreds of sexualized comments from sick individuals who feel free to use Mark Zuckerberg’s platform to share their interests with like-minded perverts.

The Guardian reports that the photo-sharing app Instagram, owned by Facebook, is failing to remove accounts on its platform that post photos of children in swimsuits or partially clothed, photos that receive hundreds of sexualized comments. Facebook (now known as Meta) claims it takes a zero-tolerance approach to child exploitation, but accounts that were flagged as suspicious remained on the platform and were deemed acceptable by the company’s automated moderation system.

Mark Zuckerberg talks about Instagram (AFP/Getty)

A researcher reported an account posting photos of children in sexualized poses. Instagram responded the same day, saying that due to “high volume” it was unable to view the report, but that its technology had determined the account “is likely not against our community guidelines.” The account remained online with over 33,000 followers.

The accounts are said to be often used as “breadcrumbs,” where those looking for child sexual abuse images post technically legal images to lure other internet predators, who then move into private messaging groups to share even more extreme content.

Andy Burrows, head of online safety policy at the NSPCC, said the accounts are like a “showcase” for paedophiles. He commented: “Companies should proactively identify this content and then remove it themselves. But even when it is brought to their attention, they judge that it is not a threat to children and that it should stay on the site.”

Facebook said it has strict policies against content that sexually exploits or endangers children and removes such content when notified. A spokesperson commented: “We are also focused on preventing harm by banning suspicious profiles, preventing adults from messaging children they are not connected with, and defaulting users under 18 to private accounts.”

Breitbart News recently reported that Facebook’s training materials for the company’s content moderators instruct them to “err on the side of an adult” when they do not know the age of an individual shown in a photo or video suspected of being child sexual abuse material (CSAM).

Facebook head of safety Antigone Davis told The New York Times that the policy is intentional and attempts to address the privacy concerns of those who post sexual images of adults. “Online child sexual abuse is heinous,” Davis told the paper, noting that the company uses a rigorous review process that flags more potential CSAM than any other company.

Read more at the Guardian here.

Lucas Nolan is a reporter for Breitbart News covering free speech and online censorship issues. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com



