The year ahead for media and AI

Sponsor content: Publishers will need AI to help them truly understand the far-reaching effects that changes to their ad experiences have on their visitors, writes Ezoic’s John Cole

It was obvious from the turnout alone that The Year Ahead, hosted by Mediatel, was an event the industry would be abuzz about, and for good reason: 2018 is being viewed as the year we face down disruption, corruption, and destruction.

All these things were on the docket for the panel as they discussed an uncertain and bold new future for the state of media.

I had the pleasure of kicking off the event with my thoughts and predictions, which largely centred around the role that AI might play in all of these things. That was my first and most important prediction.

I thought the 2017 news of Google’s AlphaGo – which learned how to play the ancient Chinese game of ‘Go’ – was a great example of how we can expect to see AI disrupt our space in the future. Here’s why.

AlphaGo defeated the world’s best Go players and showed that it had approximated human intuition – it couldn’t simply brute-force all the possible moves. AlphaGo learned using neural networks loosely modelled on the human brain, something many computer scientists hadn’t expected for at least another decade. This type of learning offers far-reaching benefits for complex and difficult problem-solving at scale.

Machine learning is something I’ve had the pleasure of being deeply involved with for the past eight years. Ezoic is a machine learning platform for digital publishers. Our system learns from website visitors over time, taking into account how different variables impact revenue for the publisher and user experience for the visitor. We then use everything the system has automatically collected to improve each session.

Because of this, we get asked a lot: “What have you learned from these machines that are constantly testing and learning?”
We’ve been doing this for a few years now across thousands and thousands of websites… and do you know what the number one thing we’ve learned is?

When you treat visitors (or readers) differently, they have better experiences.

It’s true, and we all know it, but unfortunately, as we have learned over the past eight years, it is a complex equation to solve. If you attended the event Mediatel hosted last April at the Haymarket Hotel, where Ezoic’s head data scientist, Dr. Greg Starek, shared our findings on the correlations between visitor engagement and overall session revenue, you’ll remember there is unprecedented value in solving this riddle. Better experiences translate into longer, more valuable user sessions, and that ultimately means a better bottom line for publishers.

This is where we return to machine learning. Applying AlphaGo-style logic to this problem is the only way to truly understand how to treat each visitor differently. Publishers have to learn from visitor behaviour over time by allowing machines to test and adapt different variables across user sessions, discovering which experiences work best for which types of user. At scale, humans simply cannot do this effectively (we’ve tested this).
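The test-and-adapt loop described above can be sketched as a simple multi-armed bandit, which balances trying each experience (exploration) against serving the one that has earned the most so far (exploitation). Everything here is illustrative: the variant names, reward rates, and epsilon value are assumptions for the sketch, not Ezoic’s actual system.

```python
import random

class EpsilonGreedyBandit:
    """Learn which page-experience variant earns the most per session."""

    def __init__(self, variants, epsilon=0.1, seed=42):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.counts = {v: 0 for v in variants}    # sessions served per variant
        self.totals = {v: 0.0 for v in variants}  # cumulative reward per variant

    def choose(self):
        # With probability epsilon, explore a random variant...
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.counts))
        # ...otherwise exploit the best observed average (unserved variants
        # score infinity, so each gets tried at least once).
        return max(self.counts, key=lambda v:
                   self.totals[v] / self.counts[v] if self.counts[v]
                   else float("inf"))

    def record(self, variant, reward):
        self.counts[variant] += 1
        self.totals[variant] += reward

# Simulated sessions: hypothetical per-session success rates for two layouts.
true_rate = {"dense-ads": 0.05, "light-ads": 0.03}
bandit = EpsilonGreedyBandit(list(true_rate))
for _ in range(5000):
    v = bandit.choose()
    bandit.record(v, 1.0 if bandit.rng.random() < true_rate[v] else 0.0)

best = max(bandit.counts, key=bandit.counts.get)
print(best, bandit.counts)
```

In practice a system like the one described would weigh many variables at once rather than two fixed layouts, but the core trade-off (keep testing, while sending most traffic to what works) is the same.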

With much being said about Google joining the ranks of the ad blockers, now that Chrome will block ads on sites that fail its Abusive Experiences Report, publishers will need the likes of AI to help them truly understand the far-reaching effects that changes to their ad experiences have on their visitors and revenues.

This ecosystem is not always fully understood in the short term, which means publishers will need to rely on objective data more than ever. That is something humans cannot process very efficiently, and something machines already do really well.

As the platforms continue to struggle with ‘bad content’ issues inside Facebook, Twitter and Google, machines will also learn to monitor and filter content in this realm. With hundreds of hours of video being uploaded every minute, I expect machines will have to be tasked with policing this for us.

I doubt the genie goes back in the bottle here, so we have to get better at identifying things like race hate and other putrid forms of content. This was hotly debated among the panel, and for good reason. I expect we will see technology step in during 2018 to assist with a problem that may still get worse before it gets better.

Lastly, on technology: ad tech will likely consolidate even further in 2018, with fewer intermediaries (thanks to ads.txt). This might mean that native ads finally hit the bottom (and I mean this quite literally) of the clickbait ladder. I think it’s possible we’ll see one or two companies in that space fold this year as the race to the bottom finally reaches it.

These innovations in AI and technology are coming thick and fast, whether that’s web pages designing their own layouts based on AI predictions of user intent or a non-human taxi driver. Our job, as leaders, is to manage that change, grasp the opportunities opening up in these exciting areas, and learn to get the most out of machines in our businesses.

I’m hopeful that there are few industries as well-equipped as ours to do just that.

John Cole is chief customer officer of Ezoic, the headline sponsor of Mediatel’s Year Ahead 2018
