Twitter to reinvest in trust and safety amid child abuse claims
X, the social media company formerly known as Twitter, is reportedly hiring 100 people as part of a new effort to focus once more on content moderation — specifically to tackle child sexual exploitation (CSE) on the platform.

In a blog post, X said the site has “strengthened its policies and enforcement to tackle CSE” and is actively taking action against users who distribute such content and the networks of users who engage with it.

That includes building a “trust and safety centre of excellence” in Austin, Texas, where the new in-house content moderation “agents” will be based.

But X is still attempting to distance itself from the severity of the issue on its platform, with the post noting “X is not the platform of choice for children and minors”, given that users aged 13-17 “account for less than 1% of our daily US users”.

Analysis: Advertisers to remain cautious

The move comes as US senators are set to grill social media company CEOs on Wednesday over CSE concerns.

Meta’s Mark Zuckerberg, X’s Linda Yaccarino, TikTok’s Shou Zi Chew, Snap’s Evan Spiegel and Discord’s Jason Citron will testify as Congress debates a bill, the Kids Online Safety Act (KOSA), which would establish guidelines for protecting minors on social media.

Last week, Snap became the first social media company to publicly support KOSA, putting additional pressure on its competitors to act.

US senators Dick Durbin and Lindsey Graham, who are leading the panel, said in a joint statement in November that “Big Tech’s failure to police itself at the expense of our kids cannot go unanswered”.

According to James Sigrist, paid social account director at the7stars, the trust and safety hires are likely to be viewed positively by advertisers.

However, he told The Media Leader: “Due to the vague wording around when this will be implemented, as well as the timing of the announcement just ahead of the US Senate hearing on CSE, confidence may not be fully restored to advertisers just yet.

“Until we learn the scale of this team’s role in removing additional harmful content, beyond the imperative removal of CSE content, advertisers will remain cautious about including X in future activations.”

Charlotte Powers, head of digital at sibling agency Bountiful Cow, agreed: “How can a team of just 100 people monitor the millions and millions of tweets uploaded globally on to the platform on a daily basis?”

Difficulty in hiring top talent

While X is hiring trust and safety staffers to address CSE specifically, the site has been inundated with an even wider variety of moderation threats since Elon Musk bought Twitter in October 2022.

The platform has seen a sharp rise in misinformation and disinformation, hateful speech including antisemitic posts (some of which have been reposted by Musk himself) and, most recently, the spread of AI-generated pornographic images of celebrities such as Taylor Swift.

One ex-Twitter employee, speaking to The Media Leader on the condition of anonymity, said the company’s need to rehire trust and safety moderators was “inevitable”, given advertisers’ concerns over brand safety. However, the team would likely struggle to hire top talent, including rehiring anyone who was previously let go.

“Trust and safety people are very cautious and methodical. Those people are not going to want to work for Musk,” the source said. “Nobody in their right mind that is serious about trust and safety would work in that space, especially after what Musk did to [former trust and safety head] Yoel [Roth].”

In late 2022, Musk helped stoke a harassment campaign against Roth, falsely accusing him of supporting child exploitation. The firestorm forced Roth to move out of his home to avoid potential threats to his life.

“[Musk] had no concept of the importance — from a regulatory perspective, from a safety perspective etc — of the work that some of the [trust and safety] teams did, particularly around misinformation,” the source added. “And I think now what’s happening is the elections are coming and he wants to get their shit in place before[hand].”

Another potential reason for the latest move could be a renewed desire for X to begin monetising adult content on its platform. While doing so would likely cause a further exodus of advertisers, the new revenue stream could help make up the shortfall.

The company had previously considered developing the site into something of an OnlyFans competitor in 2022, but the lack of capacity to remove illegal content related to CSE stymied the effort.