According to insiders at Twitter who spoke with Technoutility, the company is struggling to protect its users from trolling, state-coordinated disinformation, and child sexual exploitation following layoffs and changes made under the ownership of Elon Musk.
Academic data and user testimony support these claims, with hate speech thriving under Musk’s leadership. Trolls are said to be more emboldened, harassment is intensifying, and accounts following misogynistic and abusive profiles are rising.
Former and current Twitter employees have told Technoutility’s Panorama program that a chaotic working environment has made it difficult for the company to maintain features intended to protect users from trolling and harassment. Some features created by the team responsible for safety measures, such as nudge prompts, have been removed. Musk, meanwhile, reportedly has bodyguards shadowing him at all times.
Internal research from Twitter has suggested that these measures reduced trolling by 60%. But insiders say nobody is maintaining this work anymore, likening the platform to a building that looks fine from the outside but is “on fire” inside.
- Twitter has yet to respond to Technoutility’s request for comment.
- Concerns have also been raised that child sexual exploitation is increasing on Twitter and is not being adequately reported to law enforcement, and that targeted harassment campaigns aimed at curbing freedom of expression, along with foreign influence operations, are going undetected.
- Exclusive data from the investigation also shows a rise in misogynistic online hate targeting this Technoutility journalist, and a 69% increase in new accounts following abusive profiles. Accounts targeting rape survivors have also become more active since the takeover, indicating that they may have been reinstated or newly created.
As a journalist reporting on disinformation, conspiracy theories, and hate, I am no stranger to online abuse. Last year, however, I noticed that the abuse I received on Twitter was steadily decreasing. Then, in November, I realized the trend had reversed, and the abuse was worse than before.
My observations were confirmed by a team from the International Center for Journalists and the University of Sheffield, which has been tracking the hate directed towards me. Their data revealed that the abuse I received on Twitter had more than tripled since Mr Musk took over, compared with the same period the previous year.
While all social media platforms are pressured to tackle online hate and harmful content, Twitter has put this issue on the back burner.
To investigate, I travelled to San Francisco, where Twitter’s headquarters are located, to speak with an engineer responsible for the computer code that makes Twitter work. To protect his identity, he asked to be called Sam.
Sam painted a picture of chaos, saying that from the inside, Twitter resembles a building where everything is on fire. While the facade appears fine, he explained, nothing works properly. He blames the chaos on the massive disruption in staffing: at least half of Twitter’s workforce has been laid off or chosen to leave since Musk took over. As a result, people from other teams have had to take on extra responsibilities, creating more risk and more opportunities for things to go wrong.
Sam believes many of the earlier features still exist, but the people who designed and maintained them have left, leaving them unmaintained. Many elements of the platform are broken with nobody taking care of them, he adds, which results in inconsistent behaviour.
Sam believes Mr Musk’s lack of trust in Twitter employees is the root cause of this disorder. He says that Musk brought in engineers from his other company, Tesla, and asked them to evaluate Twitter engineers’ code over just a few days before deciding who to lay off. According to Sam, it would take months to understand code of that complexity.
He claims that Mr Musk’s lack of trust is reflected in the level of security that he surrounds himself with, revealing that there are always at least two bulky, tall, Hollywood-style bodyguards with him, even when he goes to the restroom.
Sam believes Mr Musk’s primary concern is money, citing as examples the layoffs of cleaning and catering staff and an attempt to sell the office plants to employees.
Lisa Jennings Young, a former head of content design at Twitter, was among the team responsible for implementing features to protect users from online abuse. Before Mr Musk acquired Twitter, the platform had long been plagued by trolling, but Jennings Young claims that her team had significantly reduced the problem. Internal research from Twitter, which Technoutility has seen, supports this assertion.
“While we weren’t perfect, we were making progress all the time,” Jennings Young said in her first public comments since leaving Twitter following the Musk takeover.
Jennings Young and her team were responsible for developing several new features, including Safety Mode, which automatically blocks abusive accounts, and labels applied to misleading tweets. They also created the AI-based “harmful reply nudge,” which alerts users before they send tweets containing trigger words or harmful language.
According to Twitter’s research, which Technoutility has seen, the “nudge” and other safety tools have effectively reduced online abuse.
Since taking over Twitter, Elon Musk has expressed his priorities as making the social media company profitable and championing freedom of expression. However, insiders have expressed concerns that under Musk’s leadership, protecting users from harm, including child sexual abuse and state-sponsored disinformation campaigns, has been de-prioritized.
Ray Serrato, who worked in a team specializing in tackling state-sponsored disinformation, left in November due to a lack of clear vision for user protection under the new leadership. He notes that his team’s capacity has been minimized, and many experts who covered special regions or threat actors are no longer part of the team.
Similarly, an insider called Rory, who was part of a team tackling child sexual exploitation (CSE), expressed concerns about the drain of expertise under Musk’s leadership. He notes that his team was cut from 20 people to around six or seven after the acquisition, making it challenging to keep up with the workload. He worries that there are now fewer people with the knowledge to effectively escalate concerns about this content with law enforcement.
Twitter claims to have removed 400,000 accounts in one month to help make the platform safer. However, Rory worries that with fewer people to escalate concerns about offending content, many offending users can easily set up new accounts.
Despite attempts to contact Musk via email, tweets, and even a Twitter poll, there has been no response. More than 40,000 users voted in the poll, with 89% saying Musk should do an interview.