Oversight board criticizes Meta for preferential treatment


Meta, the owner of Facebook and Instagram, was harshly criticized on Tuesday by a company-appointed oversight board for its policies that give celebrities, politicians and business partners special treatment compared to the vast majority of its users.

Under a program called cross-check, people with high follower counts were able to say and share things on Facebook and Instagram that would otherwise have been quickly removed for violating company policies, according to the oversight board, which Meta created to rule on thorny questions related to freedom of expression, human rights and content moderation.

“The board is concerned about how Meta has prioritized commercial interests in content moderation,” the board said in a statement. The cross-check program, it said, “provided additional protection for some users’ expression.”

The oversight board recommended that Meta overhaul its cross-check system by “drastically” increasing transparency about who is on the program’s VIP list and by hiding their posts while they are under review. Meta should also prioritize speech that is “of particular public importance,” it added. The recommendations of the board, which includes around 20 academics, human rights experts and lawyers, are not binding.

The report underscored the power social media companies wield in deciding which posts to keep, which to delete and how to handle specific accounts. Twitter, Facebook, TikTok and others have long faced scrutiny for making one-sided decisions about content on their platforms that can influence political debates and societal issues.

Elon Musk, the new owner of Twitter, is now in the spotlight for how his social media service will moderate content. Twitter had created policies to keep misinformation and hate speech off the platform, but Musk said he believed in unfettered speech and dropped enforcement of some of those policies.

Meta has scaled back its social networking business in recent months amid criticism of toxic content on its platforms. Mark Zuckerberg, the company’s chief executive, has instead prioritized building the immersive digital world of the metaverse. Meta has spent billions of dollars on this shift, although it is unclear whether consumers will embrace metaverse-related products. The company recently laid off more than 11,000 employees, or around 13% of its workforce.

“Meta’s Oversight Board’s call for a major overhaul of content moderation rules will create more fairness, a level playing field, and hold VIP users to the same high standards as the common user,” said Brian Uzzi, a professor at the Kellogg School of Management at Northwestern University. “To avoid chaos, there should be one rule to rule them all.”

Nick Clegg, Meta’s vice president of global affairs, said Tuesday that Meta created the cross-check system to prevent posts from being mistakenly removed in cases where a wrongful takedown could have an outsized impact. He said the company would respond to the oversight board’s report within 90 days.

The oversight board began investigating the cross-check program last year after its existence was reported by The Wall Street Journal and the whistleblower Frances Haugen. The board strongly criticized the company last year for its lack of transparency about the program.

On Tuesday, the oversight board found that the cross-check program gave high-profile users additional review by a human moderator before their posts were taken down for violating the company’s terms of service. The board criticized the company for its lack of transparency and for the “unequal treatment” afforded to the most influential and powerful users of Facebook and Instagram, at the expense of its human rights responsibilities and company values. Meta took up to seven months to reach a final decision on content posted by an account in the cross-check program, according to the report.

Mr. Zuckerberg had pushed for the creation of the oversight board so that his company would not be the only entity involved in content moderation decisions. Since the board began hearing cases in the fall of 2020, it has issued a number of objections to Meta’s actions on content.

In 2021, the board recommended that Meta restore postoperative breast photographs that the company’s automated systems had removed for nudity. The photos, which Meta restored, had been posted by a Brazilian Instagram user promoting a breast cancer awareness campaign. The board criticized Meta’s reliance on automated systems to remove posts.

The board also considered Meta’s banning of former President Donald J. Trump from Facebook and Instagram after the January 2021 U.S. Capitol riot. In May 2021, the board said Meta should review its decision to ban Mr. Trump, adding that the company did not have proper systems in place to issue a permanent suspension of the former president.

Mr. Trump had been part of the cross-check program. The board criticized Meta for not being “fully forthcoming” in its disclosures about cross-check, including the number of accounts enrolled in it.

Mr. Clegg has since said that Meta will decide by January whether to restore Mr. Trump’s accounts.

Thomas Hughes, the board’s director, said Tuesday’s report was “an important step in the board’s ongoing efforts to bring greater accountability, consistency and fairness to Meta’s platforms.”

Other social media companies have sought to replicate Meta’s oversight board system. After Mr. Musk took over Twitter, he said he planned to form a “content moderation council.” He did not follow through on that plan, accusing activists and investors of pressuring him to follow Meta’s model.

Meta also faces the prospect of not being able to display personalized ads in the European Union without first receiving user consent. Rulings approved by a European data protection body this week would require the company to allow Facebook and Instagram users to opt out of ads based on personal data collected by Meta, according to a person with knowledge of the ruling.

A final judgment, which can be appealed, is expected to be announced next month by Irish authorities, who serve as Meta’s lead data privacy regulator in Europe because the company’s European headquarters are in Dublin.
