‘Blood on your hands’: 5 takeaways from US social media hearing

(Left to right) Chew, Zuckerberg, Yaccarino, Citron and Spiegel attended the hearing (Credit: Tom Williams, Anthony Quintano, James Tamim/VMA Media, Kimberly White, TechCrunch — Wikimedia Commons)

On Wednesday, US senators took turns excoriating five social media CEOs for alleged harms they have brought to children and teens.

“Discord has been used to groom, abduct and abuse children. Meta’s Instagram helped connect and promote a network of paedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially ‘sextort’ young victims.

“TikTok has become a ‘platform of choice’ for predators to access, engage and groom children for abuse, and the prevalence of [child sexual abuse material] on X has grown as the company has gutted its trust and safety workforce,” said Dick Durbin, who chairs the judiciary committee, in his opening statement.

He continued: “Their design choices, their failures to adequately invest in trust and safety, and their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”

“You have blood on your hands,” added Lindsey Graham. “You have a product that’s killing people.”

Meta’s Mark Zuckerberg and TikTok’s Shou Zi Chew took the brunt of the senators’ ire. Snap’s Evan Spiegel, X’s Linda Yaccarino and Discord’s Jason Citron were spared in comparison.

The general implication: regulation is coming.

Here are five takeaways from the hearing.

Enforcement remains a hurdle

All of the CEOs testified that they do not allow users under the age of 13 to use their platforms and that they have policies to restrict harmful content from being seen by users under 18.

However, there is clearly a lack of reliable enforcement mechanisms. When asked by Mike Lee about sexually explicit content on Meta’s platforms, Zuckerberg stated: “We don’t allow sexually explicit content.”

“How is that going?” shot back Lee. His point was that, regardless of the progress made by social media companies in trust and safety efforts, enforcement remains a hurdle, especially given the sheer volume of child sexual abuse material (CSAM) online.

To Meta’s credit, it does report a tremendous amount of CSAM to the National Center for Missing & Exploited Children. Facebook reported 22m examples of CSAM in 2021, with an additional 3.3m reported by Instagram.

Taking a swipe at Apple and Google, Zuckerberg suggested that app stores should take on more responsibility by requiring parental consent when apps are downloaded by children, as opposed to requiring parental consent on an app-by-app basis.

Zuckerberg’s public apology

In a surprising moment of the hearing, Josh Hawley repeatedly asked Zuckerberg to apologise to the families in attendance who blame platforms like Facebook and Instagram for driving their children into depression and suicide.

“I’m sorry for everything you have been through. No-one should have to go through the things that you have suffered,” Zuckerberg offered. “This is why we’ve invested so much and are going to continue doing industry-wide efforts to make sure that no-one has to go through [that].”


According to reports, some parents in attendance complained that the hearing included a lot of talk but not enough action from social media companies.

The criticism echoed remarks made last year by Baroness Beeban Kidron, who helped lead the effort to pass the Online Safety Act in the UK.

The companies by and large did not agree to support further regulatory legislation, but said they would be open to working with legislators on shaping policy.

Notable exceptions included Snap, which previously announced its support for the Kids Online Safety Act, and X, which said it supports the Stop CSAM Act.

Not enough transparency on trust and safety

Thom Tillis asked each CEO how many people work in their trust and safety divisions.

Meta has 40,000 people on its team; TikTok also has 40,000 globally; X has 2,300; Snap has 2,000; and Discord has “hundreds of people”, but Citron noted that the platform’s user base and workforce are smaller.

It is not clear whether all of these workers are internal staff; some may be employed by third-party companies hired to handle content moderation.

The senators criticised the companies for what they saw as underinvestment in this area. While the companies have collectively invested billions in trust and safety efforts, it is only a fraction of their total revenue.

Xenophobic questions to Chew continue

Tom Cotton repeatedly questioned whether Chew is a member of or otherwise has close ties with the Chinese Communist Party.

Chew replied: “Senator, I’m Singaporean. No.”

TikTok has been accused of sharing US and European user data with parent company ByteDance in China — something that national security officials have warned could then be accessed by the Chinese government.

The platform has repeatedly denied that the Chinese government accesses user data and has invested billions in building data centres in the US and Ireland to segment and firewall the data it collects.

However, this week The Wall Street Journal revealed that TikTok is still struggling to prevent China-based employees from accessing US user data. Chew disputed the report during the hearing.

X relatively unscathed

Yaccarino avoided significant scrutiny from senators despite X being a hotbed for hateful and inappropriate content since its acquisition (as Twitter) by Elon Musk in October 2022.

In a prepared statement, Yaccarino emphasised that X is “not the platform of choice for children and teens”.


“Children under the age of 13 are not allowed to open an account. Less than 1% of the US users on X are between the ages of 13 and 17, and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve,” she explained.

Yaccarino referenced recent investment in trust and safety through the creation of a new 100-person team focused on addressing CSAM.

But, as Charlotte Powers, head of digital at Bountiful Cow, recently asked: “How can a team of just 100 people monitor the millions and millions of tweets uploaded globally on to the platform on a daily basis?”

That, it seems, is the billion-dollar question facing every social media company.
