
‘Do something and see what happens’: Why we’ve been doing test-and-learn wrong

Opinion

Marketing effectiveness best practice needs an evidence-based approach that uses a variety of measurement techniques, rather than relying on spurious digital metrics.


There is a mountain of marketing effectiveness best practice out there. The likes of Les Binet and Peter Field, the Ehrenberg-Bass Institute for Marketing Science and, more recently, EffWorks have all given our industry evidence-based effectiveness best practices.

While there are inevitably differences of opinion, all of these thought leaders agree that response-driving media is most effective when used alongside brand-building that focuses on reach and on creating emotional connections with the audience.

So it is a relief to see that more businesses are adopting these principles, at least at the chief marketing officer level. Indeed, Nick Manning alludes to this when he argues that the online advertising bubble has burst.

Test-and-learn initiatives

But underneath the C-suite, there is another set of “best practices”, often sitting within digital and performance marketing-focused teams. These digital best practices are often the output of test-and-learn initiatives.

Having worked in digital since the early noughties, I’ve been involved in countless test-and-learn programmes. Unfortunately, I have regularly seen these initiatives prove counterproductive. Internally built learnings often encourage businesses to overrule effectiveness best practice in favour of the data coming from digital testing. Many businesses believe, often incorrectly, that they are unique and that industry best practice is therefore less applicable to them.

I suspect this is one of the reasons that the editor-in-chief of this good website has said that digital media will need to be pushed into delivering better advertising.

What usually powers the digital test-and-learn agenda is spurious digital metrics. Attribution, clickthroughs, bounce rates, likes and shares have all been debunked as useful metrics by many industry leaders, yet it is success against these very metrics that businesses are “learning” from.

Most test-and-learn programmes won’t identify what actually drives these metrics, because to do that you would need to conduct a test rooted in the scientific method, such as a control-vs-exposed experiment (sketched below). These are rare.

Instead, testing is more like “let’s do this new thing and see what happens”. It is no more scientific than primitive man dancing for rain and believing the dance brought the rains that followed. As any scientist will tell you, correlation is not causation.
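For contrast, here is a minimal sketch, in Python, of what a genuinely scientific read-out could look like: a two-proportion z-test comparing conversion rates between an exposed group and a randomly held-out control. The function name and every figure are illustrative assumptions, not data from any real campaign.

```python
# Hedged illustration: a control-vs-exposed read-out via a two-proportion z-test.
# All numbers are invented for the example.
from math import sqrt

def lift_test(conv_exposed, n_exposed, conv_control, n_control):
    """Relative incremental lift and a z-score for an exposed-vs-control split."""
    p_e = conv_exposed / n_exposed                    # conversion rate, exposed
    p_c = conv_control / n_control                    # conversion rate, holdout
    lift = (p_e - p_c) / p_c                          # relative lift vs baseline
    # Pooled standard error under the null hypothesis of no difference
    p_pool = (conv_exposed + conv_control) / (n_exposed + n_control)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_exposed + 1 / n_control))
    return lift, (p_e - p_c) / se

# 50,000 users saw the ads; 50,000 were randomly held out and did not.
lift, z = lift_test(conv_exposed=1150, n_exposed=50_000,
                    conv_control=1000, n_control=50_000)
print(f"relative lift: {lift:.1%}, z-score: {z:.2f}")  # 15.0% lift, z ≈ 3.27
```

Because exposure is randomised, any difference between the groups can be credited to the advertising itself, rather than to whatever else happened to be going on that week.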

Attribution nightmare

Ecommerce businesses have used attribution heavily since this form of data first became available around 20 years ago. Fast forward to now and attribution is regularly debunked (Binet has discussed “the attribution nightmare”). Over that time, attribution has arguably become even less accurate, yet it remains a vital metric for many businesses. Crucially, it still powers many of the “learnings” that businesses adopt from test-and-learn.

What “works” in attribution usually involves targeting customers who are highly likely to buy. This is because attribution does not recognise the difference between baseline sales (sales that would have happened anyway) and incremental sales (sales that were actually driven by the advertising).

For established businesses, there are almost always more baseline sales than incremental sales. This creates a huge measurement challenge.
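A toy calculation makes the scale of the problem clear. The figures below are invented for illustration; the baseline share is an assumption that, in reality, only a holdout experiment or a model can estimate.

```python
# Illustrative only: attribution credits every tracked sale to the ads,
# even when most of those sales would have happened anyway.
attributed_sales = 10_000    # sales the attribution tool credits to the campaign
baseline_share = 0.80        # assumed share that would have occurred without ads
incremental_sales = attributed_sales * (1 - baseline_share)
print(incremental_sales)     # 2000.0 -- a fifth of what attribution reports
```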

It goes without saying that any test-and-learn programme worth its salt should be trying to identify the drivers of incremental sales. This is hard to do, especially for brands with large levels of active demand, because for big businesses advertising is one of many factors that influence total sales. Such organisations need to take special measures to separate the impact of advertising from real-world variables as diverse as the weather, competitor pricing and Brexit.

If you don’t control for those variables, their impacts (both positive and negative) will erroneously be credited to the ads being tested.
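The simulation below illustrates the point, with invented numbers throughout: weekly sales are regressed on ad spend twice, once naively and once controlling for two confounders. Real media mix models add adstock, saturation curves and many more covariates; this is only a sketch of the principle.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 104                                    # two years of weekly data

temperature = rng.normal(12, 6, n)         # weather (confounder)
rival_price = rng.normal(50, 5, n)         # competitor pricing (confounder)
# Spend is itself influenced by the weather: more activity in warm weeks.
ad_spend = 50 + 3 * temperature + rng.normal(0, 10, n)

# Simulated sales: a large baseline, confounder effects, and a true ad effect of 2.0
sales = (500 + 10 * temperature + 4 * rival_price
         + 2.0 * ad_spend + rng.normal(0, 30, n))

# Naive model: sales ~ spend only. The weather's effect leaks into the ad coefficient.
X_naive = np.column_stack([np.ones(n), ad_spend])
naive = np.linalg.lstsq(X_naive, sales, rcond=None)[0][1]

# Controlled model: confounders included as covariates.
X_full = np.column_stack([np.ones(n), ad_spend, temperature, rival_price])
controlled = np.linalg.lstsq(X_full, sales, rcond=None)[0][1]

print(f"naive ad effect: {naive:.2f}, controlled: {controlled:.2f}, true: 2.00")
# Typically prints a naive estimate well above 2 and a controlled one close to it.
```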

What is effective in the real world, outside the digital silos, is far more complex than the digital data and learnings often indicate. In the real world, you need to positively influence customer shopping behaviour in order to be effective. This is much harder to measure, but far more valuable when the insights are accurate.

A new model

To achieve these higher-level insights, brands need to adopt an evidence-based approach that uses a variety of measurement techniques.

For evidence-based best practice, brands might employ measurement tools such as media mix modelling, incremental lift measurement, brand tracking, customer surveys and attribution data. When these methods concur, we can be most confident in the findings. At Bicycle, we call this a “mixed measurement model”.
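As a sketch of how that triangulation might work, assume each method yields a comparable ROI estimate for the same channel. The numbers, method labels and the agreement threshold below are all illustrative assumptions, not Bicycle’s actual model.

```python
# Hedged sketch of a "mixed measurement model": compare ROI estimates
# from several methods and flag any that disagree with the consensus.
from statistics import median

estimates = {                      # illustrative ROI estimates, same channel
    "media_mix_model": 2.1,
    "incremental_lift_test": 1.9,
    "customer_survey_proxy": 2.4,
    "attribution": 4.8,            # typically inflated: credits baseline sales
}

consensus = median(estimates.values())
for method, roi in estimates.items():
    verdict = "agrees" if abs(roi - consensus) / consensus <= 0.35 else "outlier: investigate"
    print(f"{method:24s} ROI {roi:.1f}  {verdict}")
```

Where one method diverges sharply from the rest, as attribution does here, that divergence is itself a finding worth investigating.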

This approach unashamedly borrows from another area where making the right judgement is important: justice. In a court of law, you would never place full faith in one piece of evidence alone — and marketers should follow this same lead. Using digital metrics alone is akin to a jury hearing witness statements but ignoring forensic evidence.

So, as an industry, we haven’t really been doing a true version of test-and-learn. Instead, we’ve been doing “do something and see what happens”. Unfortunately, very often the results are driven by things that no-one realised were being tested.

This is an uncomfortable truth that we need to accept if we are to make marketing more effective.


Aidan Mark is media science and strategy director at Bicycle
