Performance planning: how to do ‘test and learn’ with 70:20:10

Testing shouldn’t be a ‘nice-to-have’ — it should be an intrinsic part of your media plans with protected time and investment.


Welcome back! If you’ve been following along at home, you will have amassed numerous tables, charts, templates, and approaches to plan and optimise your media activity.

Following on from our last article on testing roadmaps and frameworks, it seems prudent to discuss how this testing fits into marketing plans.

A ‘test and learn’ approach will be most effective when baked into your overall media approach.

Testing shouldn’t be a ‘nice-to-have’ or something that gets done when you’ve got a bit of spare time; it should be an intrinsic part of your media plans with protected time and investment.

Quite often the person, people, or team allocating budget are different from those carrying out the testing, which is why it's imperative that marketing functions work as one.

Don’t take my word for it though; we have a special guest for this instalment in the form of Lenga Ball, client account director at The Kite Factory, to explain the ‘70 20 10’ testing approach.

Testing has always been a vital part of progress, especially in the last few years as we’ve seen monumental shifts in consumer behaviours, media consumption habits, and platform capabilities.

These fluctuations in what we know to be true of the landscape have really brought the scope of testing and the need for innovation to the fore. However, it's not without risk.

When running a 'new' (and hopefully successful) marketing activity, be it a new platform, an alternative format, or a different audience, there is always the potential for your hypothesis to be proven wrong and for performance to decline. So how do we benefit from testing while ensuring we still deliver on the required KPIs?

The 70:20:10 planning principle provides a framework for controlled, sustained development. It works like this:

70% of your media budget goes into Brilliant Basics. These are your bread-and-butter activities: tried and tested channels and activations that can be relied upon to perform consistently, such as your top-performing audiences, creatives, channels, days or dayparts. They form your solid 'base' performance.

20% of spend is focused on Effective Enhancements: the evolution of your Brilliant Basics activity within core channels by reaching new audiences, using new placements, or testing different creatives or calls to action. This element serves to improve the performance of your 70% activity.

10% of spend is utilised on New Innovations: testing entirely new channels, platforms and products, ensuring campaigns don't get 'stale'.

Ringfencing 70% of spend for 'banker' activity enables confident performance forecasting and gives marketers and stakeholders alike a degree of comfort.

Conversely, spending 30% on enhancements and innovation means there are always learnings that can be fed back into planning, unlocking constant evolution in your campaigns.

As the performance of the 20% and 10% activities is proven, the successful approaches can be shifted from the enhancement and innovation classifications into Brilliant Basics, growing your core performance programme.

If you’ve been following the performance planning series and have played along at home, you should have a testing heatmap where you’ve outlined your testing priorities based on scale of benefit, ease of implementation, and cost implications. You can use your heatmap to assign budgets on a 70:20:10 basis to create a thorough testing roadmap or plan.

Channel: Paid Search

                               Q1         Q2         Q3         Q4
Media budget                   £80,000    £140,000   £50,000    £150,000
Brilliant Basics (70%)         £56,000    £98,000    £35,000    £105,000
Effective Enhancements (20%)   £16,000    £28,000    £10,000    £30,000
New Innovations (10%)          £8,000     £14,000    £5,000     £15,000

(Each quarter in the template also has a blank 'Testing' column for recording which tests will run.)
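The arithmetic behind a table like this is simply the quarterly budget multiplied by each ratio. As a minimal sketch (the function and variable names are mine, not from the article or any tool it mentions):

```python
def split_budget(quarterly_budgets, ratios=(0.70, 0.20, 0.10)):
    """Split each quarter's media budget on a 70:20:10 basis.

    Returns a dict of {quarter: {classification: amount}},
    rounded to whole pounds.
    """
    labels = ("Brilliant Basics", "Effective Enhancements", "New Innovations")
    plan = {}
    for quarter, budget in quarterly_budgets.items():
        plan[quarter] = {label: round(budget * r)
                         for label, r in zip(labels, ratios)}
    return plan

# The quarterly budgets from the table above
budgets = {"Q1": 80_000, "Q2": 140_000, "Q3": 50_000, "Q4": 150_000}
plan = split_budget(budgets)
# e.g. plan["Q1"]["Brilliant Basics"] gives the Q1 'banker' allocation
```

The same function could take different ratios per channel if your heatmap suggests a channel is mature enough to shift weight from innovation back into basics.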


When building your roadmap it’s important to consider how much you’ll need to spend in order to gain statistically significant learnings – you may need to extend the time period of your test in order to ensure there is enough performance data available to confidently gauge the outcome.
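One rough way to gauge how much volume a test needs is a standard two-proportion sample-size calculation: given a baseline conversion rate and the smallest lift you'd care about, it estimates how many visitors each arm needs before a result is statistically distinguishable from noise. A minimal sketch using only the Python standard library (the conversion rates, significance level, and power below are illustrative assumptions, not figures from the article):

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_group(p1, p2, alpha=0.05, power=0.8):
    """Approximate visitors needed per arm to detect a shift in
    conversion rate from p1 to p2 with a two-sided z-test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_b = NormalDist().inv_cdf(power)          # statistical power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# e.g. baseline 2% conversion, hoping the test lifts it to 2.5%
n = sample_size_per_group(0.02, 0.025)  # ~13,800 visitors per arm
```

Note how sensitive the answer is to the size of the expected lift: detecting a jump from 2% to 3% needs only about a quarter of the traffic that 2% to 2.5% does, which is exactly why small-effect tests often need their time period extended.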

It’s also important to factor in workloads and resourcing, especially if you’re testing asset-heavy factors such as landing pages or creative which will need to be built before the test can take place.

As you learn, your priorities may change and your roadmap may evolve; this is allowed and encouraged!

If the 70:20:10 approach tickles your fancy, identify the categorisation of each of your tests so you can appropriately allocate budget using Lenga’s table above.

And remember: Always fail fast. If a test shows zero promise or potential, then stop the test and try something else. Unsuccessful tests are just as insightful as successful ones!

Niki Grant is search director at The Kite Factory. Check out her previous instalments of Performance Planning, a guide for marketers and media planners to handle performance media planning and budget optimisation and her other columns for The Media Leader.
