
The false consensus effect

Were the Brexit polls the death knell for survey research, asks Zenith’s head of insight Richard Shotton

The failure of the research agencies to predict the result of the referendum has led to questions about the value of polling. However, it would be a shame if brands believed research was pointless.

First, the predictions weren’t dire. Of the six polls conducted the day before the referendum, the average prediction was that 51% would vote Remain: three percentage points out.

Most brands would be satisfied with this level of accuracy. The problem is that a referendum is a winner-takes-all situation, and the predictions were on the wrong side of the dividing line. Luckily for brands, this stark divide doesn’t occur in sales.

So if dismissing research isn’t the real lesson, then what is? The most important takeaway for advertising lies in the reaction of agency staff: shock followed by disbelief. Many felt unable to understand why 17 million people voted Leave. It’s a concern that so many couldn’t understand the motivations of more than half the consumers in the country.

The false consensus effect

The lack of understanding isn’t just anecdotal. Psychologists have long known about the false consensus effect, the tendency to assume our feelings and behaviours are shared by others.

Matthew Dunn, a psychologist at the University of Sydney, conducted an experiment in 2011 which quantified the scale of this bias.


He asked 974 elite athletes to estimate the prevalence of drug taking in their sport. Dunn found that recent drug users estimated 45% of their competitors also cheated, whereas non-users put the figure at just 12%. The athletes were projecting their own behaviour onto others.

Ad agency staff aren’t immune. We asked staff to estimate the percentage of the population who own an iPhone, and cut the data according to whether the respondent owned one themselves. The result? iPhone owners thought half the population owned one; those who didn’t put the figure at only a third.

Popping the agency bubble

So what can we do? There are two solutions. First, more research, not less: not the occasional expensive, formal research project, but the universal uptake of fast and frugal insight approaches. That could be as simple as interviewing consumers in their homes, spending a day listening in at a call centre, or working in-store for a week.

Or it could be a bespoke approach. For a recent brief into incontinence we wanted to help the planners understand the target audience. We had no budget so we used a technique we call ‘method planning’.

Over a weekend we texted the planners at random times. Each time they received a text they had to stop what they were doing and get to a toilet within two minutes. This helped the planners understand the experience of the target audience: not only the inconvenience but the sense of being a burden to one’s family.

Diversity in agencies

But will that be enough? When I spoke to the creative director Dave Trott, he suggested a second approach. The problem in his eyes was not a lack of empathy but that agencies are staffed entirely with university graduates.

As he said: “Advertising was more in touch with the mass market when youngsters were hired from the mail room, straight out of school, and progressed through the agency. They didn’t have to observe ordinary people through a two-way mirror, because they were ordinary people.”

Trott’s concern that agency staff are not reflective of the country is a genuine one: when we surveyed agency staff, we found that more people read the Guardian than the Sun, more shopped at Waitrose than ASDA, and more drank Peroni than Carling.

Perhaps a culture of trying to understand the consumer is not enough – we need to complement this with a more diverse agency make-up.

Richard Shotton is head of insight at Zenith – follow him on Twitter: @rshotton

Simon Redican, CEO, Publishers Audience Measurement Company, on 04 Jul 2016
“The British Polling Council produced an independent report into why the research companies predominantly called the 2015 General Election wrong. The single biggest issue it identified was unrepresentative samples. This is an issue with an awful lot of marketing industry research, where the assumption that online research is somehow more up to date masks the fact that it is often self-selecting, and hence non-representative. It is why the published media industry continues to make a huge investment in rigorously recruited, nationally representative samples to produce data for NRS and our new AMP audience measurement.”
http://www.theguardian.com/politics/2016/mar/31/pollsters-improvements-2020-general-election
