Twitter limits content moderation tools ahead of midterm elections


Twitter Inc., the social network being redesigned by new owner Elon Musk, has frozen some employees’ access to internal tools used for content moderation and other policy enforcement, limiting staff’s ability to crack down on misinformation ahead of a major US election.

Most people who work in Twitter’s Trust and Safety organization are currently unable to alter or penalize accounts that violate rules on misleading information, offensive posts and hate speech, except for the most serious violations that could lead to real-world harm, according to people familiar with the matter. Those cases were prioritized for manual enforcement, they said.

Read more: Inside Twitter’s Chaotic First Weekend Under Elon Musk

People who were called on to enforce Twitter’s policies during Brazil’s presidential election had access to the internal tools on Sunday, but only in a limited capacity, according to two of the people. The company still relies on automated enforcement technology and third-party contractors, one person said, though higher-profile violations are typically reviewed by Twitter employees.

San Francisco-based Twitter declined to comment on new limits placed on its content moderation tools.

Twitter staff use dashboards, known as agent tools, to take actions such as banning or suspending an account deemed to have violated policy. Policy violations can be flagged by other Twitter users or detected automatically, but acting on them requires human review and access to the dashboard tools. Access to those tools has been restricted since last week, the people said.

The restriction is part of a broader plan to freeze Twitter’s software code, preventing employees from making changes to the app during the transition to new ownership. Typically that level of access extends to hundreds of people, and it was initially cut to around 15 people last week, according to two of the people, who asked not to be named discussing internal decisions. Musk closed his $44 billion deal to take the company private on Oct. 27.

Read more: Elon Musk now owns Twitter. Here’s how the platform could change

The reduced moderation capacity has raised concerns among employees on Twitter’s Trust and Safety team, who believe the company will be short-staffed on policy enforcement in the run-up to the US midterm elections on Nov. 8. Trust and Safety employees are often tasked with enforcing Twitter’s misinformation and civic integrity policies, many of the same policies former President Donald Trump routinely violated before and after the 2020 election, the company said at the time.

Other employees said they were concerned that Twitter would curtail researchers’ and academics’ access to its data, and about how the company would handle foreign influence operations under Musk’s leadership.

Bloomberg reported an upsurge in hate speech on Twitter on Friday and Saturday, including a 1,700% increase in the use of racist slurs on the platform, which at their peak appeared 215 times every five minutes, according to data from Dataminr, an official Twitter partner with access to the full platform. The Trust and Safety team did not have the access needed to enforce Twitter’s moderation policies during that time, two of the people said.

Read more: As Elon Musk buys Twitter, the right celebrates

Yoel Roth, Twitter’s head of safety and integrity, posted a series of tweets on Monday addressing the rise in offensive posts, saying very few people were seeing the content in question. “Since Saturday, we’ve been focused on addressing the surge in hateful conduct on Twitter. We’ve made measurable progress, removing more than 1,500 accounts and reducing impressions on this content to nearly zero,” Roth wrote. “We’re primarily dealing with a focused, short-term trolling campaign.”

Musk tweeted last week that he has so far made “no changes to Twitter’s content moderation policies,” though he has also publicly said he believes the company’s rules are too restrictive and has described himself as a free-speech absolutist.

Internally, employees say, Musk has raised questions about a number of policies and focused on a few specific rules he wants the team to review. The first is Twitter’s general misinformation policy, which penalizes posts containing lies about topics such as election results and Covid-19. Musk wants the policy to be more specific, according to people familiar with the matter.

Read more: What to know about Trump’s Twitter ban, now that Elon Musk owns the platform

Musk has also asked the team to review Twitter’s hateful conduct policy, the people say, specifically a section stating that users can be penalized for “targeted misgendering or deadnaming of transgender individuals.”

Either way, it’s unclear whether Musk wants the policies rewritten or the restrictions removed entirely.

Contact us at letters@time.com.