
Meta’s plans reveal its lack of care for women

Opinion

Meta is enabling a toxic culture within our society, writes Reach’s online safety editor.


In the same week that the Online Safety Bill passed into law, the Home Office launched a campaign pushing back against Meta’s plans to add end-to-end encryption to its Facebook and Instagram direct-messaging (DM) services.

It was also the week in which I, as online safety editor for Reach, the largest commercial publisher in the UK, received an update on an ongoing police investigation I am supporting a colleague with. This particular case centred on the unsolicited sending of pornographic content to one of our young female journalists. The video was so extreme that police told our journalist it would be a criminal offence to share it with security. The conduit for this online sexual assault? Facebook Messenger.

In theory, the video material should never have seen the light of day on Meta’s platforms. But the reality of Zuckerberg’s safety agenda seems palpably different to the platitudes provided to government when the topic of safety is raised.

And while the new Online Safety Bill offers strengthened safety expectations for children and teens, those who tick over into adulthood on their 18th birthday can expect very little in the way of change to their safety online.

Shouting into the void

This particular incident was reported to me in late July. Two months ago. The response at Reach to something like this is to encourage reporting to the police as well as to Meta. We also support the person reporting it, offering digital and physical security measures, psychological safety and wellbeing support, and regular check-ins. A year or so ago we would have had a contact at Meta whom we could approach about specific situations. That privilege disappeared as Meta started to downgrade news on Facebook and Instagram. Now, if something happens, we have to send a plea for help off into a black hole, usually never to hear about it again.

For all of its shiny journalism safety guidance and pledges to keep people safe, Meta’s track record is pretty poor. It doesn’t allow explicit content on its public-facing site. But Messenger and DMs? That’s more of a grey area.

Last year the Centre for Countering Digital Hate (CCDH) published research demonstrating Instagram’s lack of action against abusers when women reported sexual images and videos sent via DM. Case in point: the profile that sent the content to our journalist is still active, and the message and the content itself are still sitting in her inbox.

If I look at the platforms reported to Reach by journalists via our online safety reporting system, Meta’s products make up the majority of the reports (which is pretty impressive considering the current reputation of Elon Musk’s X/Twitter).

If I look at the categories of online harm reported by our staff, the instances of sexual harassment and sexual violence are on the rise. Just last week I had another report of an explicit video sent to a different female journalist via WhatsApp. Another of Meta’s encrypted services.

Now, with its plan to offer end-to-end encryption, the platform will presumably trot out ‘user privacy’ as another shield against any kind of transparency or interrogation of its response to the harassment, abuse and grooming of users aged over 18. It puts another obstacle in front of our already over-stretched police forces. It makes identifying and addressing perpetrators of abuse even more difficult.

I already regularly encounter apathy about whether reporting to Meta achieves anything. Journalists who have received threats or abuse online are often reluctant to report it. They feel nothing will happen. Nothing will be done. They will never hear about it again, so what’s the point?

I get it. I have reported such abuse multiple times, on my own behalf or someone else’s. And much more often than not, I hear nothing back. But I keep reporting, because if I don’t, how can we prove the lack of accountability? We have to take action to hold others accountable for inaction.

Facilitating abuse

The way we react to online harm is defined not only by who we are as individuals, but also by our past experiences, our beliefs and our current state of mind. It can have serious consequences, including a genuine and reasonable sense of threat to physical safety and an impact on daily quality of life.

By providing a conduit for such content, offering no transparency over what happens to reports made by users and now planning to introduce end-to-end encryption, Meta is essentially facilitating a toxic culture within our society.

In a society where young men idolise Andrew Tate and misogyny is reportedly on the rise in our schools, should Meta not be doing absolutely everything in its vast power to challenge these behaviours? Should it not educate our young people about the consequences? Give our police everything they need to identify perpetrators of digital sexual abuse? Encourage and fund intervention programmes before these abusers feel powerful enough to take their actions offline? Does Meta not have a significant responsibility which it appears to be absolutely refusing to accept?

I am fed up, angry and sick of feeling powerless. In my role at Reach, we can offer support. We can offer security. We can offer counselling. But we cannot single-handedly stop the scourge of unsolicited sexual content. Meta absolutely needs to do more to identify abusers and see them held accountable, not offer them further protections.


Dr Rebecca Whittington is online safety editor at Reach.