
Cleaning up social media: is decentralising content moderation the answer?


Mastodon and Bluesky are offering a future of the internet where users are placed in charge of their own content moderation. Is it workable for big social media platforms?


In the weeks since Elon Musk took over Twitter, one of the biggest areas of contention on the platform has been his management of content moderation.

Twitter’s trust and safety team has been hollowed out and its safety chief resigned; harmful speech has surged across the platform as the company has begun to lean heavily on automation for moderation; and advertisers have been advised that Twitter now amounts to a high-risk opportunity for brands.

Addressing concerns, the European Union delivered a warning to Twitter last month that if the company does not get its house in order, it may well be banned from operating on the continent.

It remains to be seen whether Musk’s attempts to rewrite the rules of social media content moderation will end up driving away brands and users in the long run, but other developments in the past several years have shown that existing policies at companies like Twitter, Facebook and Instagram have often failed to adequately protect users from harmful content.

Case in point: the death of 14-year-old Molly Russell, which has prompted the UK Government to draft an Online Safety Bill to regulate social media companies more holistically.

Wisdom of crowds

Meanwhile, the increasingly popular Twitter alternative Mastodon and Twitter co-founder Jack Dorsey’s in-development competitor Bluesky appear to be betting on Web3: a future of the internet where content is not controlled and moderated by large social media companies, but where users are placed in charge through a decentralised model.

When users sign up for Mastodon, for example, they join a specific, independently owned and operated server that is part of the wider Mastodon network. That server, like every server on the platform, is responsible for setting its own rules and moderating its own users.


Proponents of decentralised online systems argue that they place power back in the hands of users. For instance, users on platforms without a central governing body would be unlikely to receive algorithmic content recommendations, which could dramatically reduce the amount of harmful content they see, or at least grant them more control over it.

As Dorsey has said of Bluesky: “It’s a competitor to any company trying to own the underlying fundamentals for social media or the data of the people using it”.

For social media companies, the benefit is that decentralisation punts the responsibility for moderation back onto users themselves. A decentralised network, such as one built on blockchain technology, could also allow for a validated record of all comments made, which could serve as a useful tool in platforms’ moderation strategies.

As Richard Parboo, UK country manager for blockchain-based ad fraud prevention start-up Adwatch, explains: “The data is collected and through a hashed transaction it is saved in the blockchain so that there is a record of this digital footprint, and they know who the owner is at all times and where it is saved so that no one can manipulate it.”

Parboo added that Adwatch has previously been asked by platforms like Google to “look at the possibility of validating comments on a platform”.
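What Parboo describes amounts to an append-only, hash-chained log of comments. The sketch below illustrates that idea in Python; it is illustrative only, not Adwatch’s actual system (which is not public), and the CommentLedger class and its methods are invented for the example. A real deployment would write to a distributed ledger rather than an in-memory list.

```python
import hashlib
import json
import time

class CommentLedger:
    """An in-memory, hash-chained record of comments (illustrative only)."""

    def __init__(self):
        self.chain = []

    def record(self, author: str, text: str) -> str:
        """Append a comment; each entry's hash covers the previous entry's hash."""
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {"author": author, "text": text, "ts": time.time(), "prev": prev_hash}
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.chain.append(entry)
        return entry["hash"]

    def verify(self) -> bool:
        """Recompute every hash; any tampering breaks the chain."""
        prev_hash = "0" * 64
        for entry in self.chain:
            body = {k: v for k, v in entry.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if entry["prev"] != prev_hash or entry["hash"] != hashlib.sha256(payload).hexdigest():
                return False
            prev_hash = entry["hash"]
        return True

ledger = CommentLedger()
ledger.record("alice", "This looks like misinformation to me.")
ledger.record("bob", "Source? I could not find one.")
print(ledger.verify())  # True until any stored entry is modified
```

Because each entry’s hash covers the previous entry’s hash, silently altering any recorded comment invalidates every entry that follows it, which is what makes such a record tamper-evident.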


Others are not so optimistic that decentralisation will work out well for social media companies. Ashley MacKenzie, founder and CEO of media blockchain company Fenestra, argued to The Media Leader: “‘The wisdom of crowds’ is a pleasant enough concept but I see few places where it truly works as a mechanism, and know of no use case where it is used to moderate sensitive content categories.”

He noted that decentralised communities usually develop dominant voices that stand out from the crowd and can warp outcomes. MacKenzie cited Reddit as one such example, where communities may tangle with volunteer moderators, and which populates its subreddits and comment sections (known as threads) in order of popularity by default. The result is informal user moderation of bad content through the “downvoting” of posts and comments, but also relatively little space for disagreement with majority views. Reddit declined to comment.

If in doubt, go to Wikipedia…

MacKenzie also described Wikipedia as “perhaps the one globally workable example” of decentralised content moderation, though he caveated that it still faces challenges despite having “complex systems to counter” them.

A spokesperson for Wikimedia Foundation, the charity which operates Wikipedia and other related platforms, described to The Media Leader its overarching content moderation framework. Like other decentralised content models, the Foundation itself does not write, edit, or determine what content is included on Wikipedia or how that content is maintained.

Rather, nearly 300,000 monthly active volunteer editors do, and they have “created a system of collaborative content moderation to protect Wikipedia’s neutrality and reliability.”

Wikipedia’s core content policies require that content remain neutral, verifiable, and attributed to reputable sources. Articles must also meet “notability requirements” in order to remain published.

The whole of Wikipedia receives 350 edits every minute, requiring a hefty amount of legwork to ensure accuracy. Its policies are overseen and upheld by Wikipedia administrators: a group of more senior volunteer editors who are granted oversight tools and are themselves elected by the broader volunteer editor community.

Wikipedia’s moderation practices include: volunteers regularly reviewing a real-time feed of edits to monitor for inaccuracies or issues in articles; editors “watching” articles that may be more prone to negative edits and receiving updates when those articles change; editors applying “a level of protection” to an article to restrict public editing under certain circumstances (though editors aim to keep as many articles open as possible); and using artificial intelligence (“bots”) to spot and revert “many common forms of negative behaviour on the site”.
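To make that last mechanism concrete, here is a toy sketch in Python of automated patrolling against Wikipedia’s public Recent Changes feed. It is illustrative only: it flags edits that delete a large amount of text using a crude size heuristic, whereas Wikipedia’s real anti-vandalism bots (such as ClueBot NG) rely on trained classifiers and far richer signals. The 2,000-byte threshold and function names are invented for the example; the API endpoint and parameters are the standard public MediaWiki ones.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_recent_changes(limit: int = 50) -> list[dict]:
    """Fetch the latest edits from the public Recent Changes feed."""
    params = {
        "action": "query",
        "format": "json",
        "list": "recentchanges",
        "rctype": "edit",
        "rcprop": "title|ids|sizes|user",
        "rclimit": limit,
    }
    resp = requests.get(API_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["recentchanges"]

def flag_large_removals(changes: list[dict], threshold: int = 2000) -> None:
    """Print edits whose net size change is a deletion beyond `threshold` bytes."""
    for change in changes:
        delta = change["newlen"] - change["oldlen"]
        if delta < -threshold:
            print(f"FLAG: {change['title']} ({delta:+d} bytes) by {change['user']}")

if __name__ == "__main__":
    flag_large_removals(fetch_recent_changes())
```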

“Part of what makes this system of moderation work well is that the site is radically transparent in sharing where information comes from,” said the spokesperson. “Everything from the way an article has evolved over time, the article talk page where editors discuss changes to an article, to administrator noticeboards – where more senior volunteers discuss potential issues on the site – is all publicly viewable.”


The spokesperson also noted that Wikipedia’s decentralised moderation model allows for direct and constant interaction among hundreds of thousands of people from different cultures and points of view, inviting them into often-difficult conversations on controversial topics in search of “more civic solutions to disagreements” in service of informing the public.


However utopian that may sound, the spokesperson added that “it’s important to note that Wikipedia operates in a fundamentally different way from other platforms online. It’s an encyclopedia, so its purpose is also different from primarily social-driven sites.”

While MacKenzie may be impressed by Wikipedia’s model, he remains unconvinced that it can be applied to other services.

“I don’t believe that decentralised models are viable alternatives to government-led regulation of content in western democracies. As such Mastodon et al offer no better solution and, indeed, will almost certainly make it worse by deepening the dark parts of the web where anonymity breeds abuse.”

Mastodon does not currently offer display advertising across its platform, though depending on an individual server’s rules, there is nothing to prevent users from posting promotional material, or brands from interacting directly with potential consumers.

But to MacKenzie, it seems unlikely that any advertiser will view the decentralised moderation on offer as less of a brand risk than moderation elsewhere.

“If advertisers worry about Twitter, they will never embrace decentralised models.”
