
This is the worst media employee ever. But he can’t be fired
Opinion: 100% Media 0% Nonsense

The most important decisions in publishing are being taken by someone with no morals and no accountability, warns the editor-in-chief.


Let me tell you a story about the most cynical, amoral and irresponsible person who has ever held down a job in media publishing or broadcasting.

Some of his decisions have helped cause harm to millions of people across the world, particularly young people. He operates in secrecy and is therefore not accountable to anyone. And he’s seen as so important to a modern media business’s operations that he can’t be tamed, let alone fired.

This person’s name is Al Gorithm.

And he’s the absolute worst.

Online safety depends on what Al Gorithm decides

“This Osama Bin Laden guy seems popular. Let’s make him even more popular!”

Time and again we see a moral panic in digital media over some hateful content going viral. Over the weekend, Apple and Disney were among a list of entertainment giants to pull their advertising from Twitter (X) because their ads were appearing next to pro-Nazi content.

Leaving aside the question of why these advertisers were still on Twitter (are they actually measuring their ROI over there?), we see Al Gorithm’s handiwork at play again.

Last week, TikTok videos with the hashtag #lettertoamerica had been viewed millions of times, after videos featuring a letter written by Osama bin Laden began appearing on Monday. Al Gorithm dominates how TikTok users consume videos; users tend to watch whatever Al gives them rather than searching for content themselves. That gives Al a huge amount of power and responsibility. But does he care whether his content is dangerous or harmful?

Absolutely not. Al Gorithm has no morals or common sense; if he sees signs that content is “highly engaging”, he’ll do his best to promote it.

Why do we let Al do whatever he wants?

It would be nice to ask Al how he comes up with these decisions. How does he ensure that teenage girls are not being actively shown content about self-harm because they might have searched for it previously?

But Al is not your regular editor. Whereas anyone is free to criticise me for writing this column (and they do!), Al is never anywhere to be seen. His CEO can always deny direct knowledge of what he did and promise to “do better”. But the same CEO has a fiduciary responsibility to their shareholders to make as much money as possible. So there is no incentive to tell Al to do anything other than what maximises engagement, which is what maximises audience.

How Al Gorithm actually works, meanwhile, is “proprietary information” and a trade secret. This matters because accountability in publishing has been effectively sucked out of the ecosystem as more people consume content by algorithm.

But isn’t there a clear public interest in knowing how Al Gorithm does his job? Why does society have to keep accepting it if he continually makes decisions against the public interest?

It seems society is more interested in protecting Al than it is in protecting actual people.

‘Al Gorithm’ is not real. But the people who made him are

Of course, the truth is that Al is an idiot. He literally has no brain. But he’s amazing at doing what he’s told.

And that’s the real point: an algorithm is just a set of rules, a recipe of instructions to be carried out by a machine. It doesn’t matter how complex those rules are: behind every algorithm are real people who consciously decided what the rules should be.

That means every time there’s a brand safety scandal on social media, every time one kind of content is favoured over another, or every time the definition of “engagement” changes, there are people behind the scenes who are responsible. They wrote the rules, after explaining how those rules work to a CEO or commercial boss. A multi-billion-dollar business doesn’t just ‘happen’ because computers and internet cables allow us to post whatever we want online.
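To make that concrete, here is a deliberately simplified, hypothetical sketch in Python of what such a rule might look like. It is not any platform’s actual code; the post fields and weights are invented for illustration. The point is that the “decision” to favour the most engaging content is a choice a person typed in, and could just as easily be typed differently.

```python
# A hypothetical, toy ranking rule -- not any real platform's code.
# It illustrates the column's point: the "algorithm" is a rule people wrote.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    watch_time_seconds: float

def engagement_score(post: Post) -> float:
    # A person chose these weights; a person could choose different ones,
    # for example down-ranking content flagged as harmful.
    return post.likes + 3 * post.shares + 0.1 * post.watch_time_seconds

def rank_feed(posts: list[Post]) -> list[Post]:
    # Show the most "engaging" posts first, regardless of what they contain.
    return sorted(posts, key=engagement_score, reverse=True)
```

Nothing in that rule knows or cares what a post actually says; it only counts what it was told to count. Change the weights, and Al’s behaviour changes overnight.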

Al Gorithm doesn’t just happen; he was created, he is constantly monitored, and he can be deleted.

And so people behind the algorithms should be publicly accountable, just as any publication editor is responsible for their content.

Time to come out

It’s a wonderful thing that online platforms can harness and organise so much content and give so much choice for consumers and advertisers.

But when did it become acceptable to remove the accountability for what is published by a media owner, just because the content creators are not professional journalists or broadcasters?

It’s time Al Gorithm was given a real name and a real face.


Omar Oakes is editor-in-chief of The Media Leader and leads the publication’s TV coverage.

‘100% Media 0% Nonsense’ is a weekly column about the state of media and advertising. Make sure you sign up to our daily newsletter to get this column in your inbox every Monday.
