
Life in a filter bubble

Algorithmic filtering is now orchestrating our entire lives. Helen Rose asks what impact this will have on brands.

As 2017 dawns, Brexit looms over Britain and Donald Trump is soon to be inaugurated as the next president of the United States. Many columns have already been given over to the surprising results of 2016’s major political events. But were they really as surprising as they have been made out to be?

While the public is now aware of the false facts and fake news circulating on social media, the way we consume media also seems to have played a part. Many people remain oblivious to the “filter bubble” that their online media consumption, in particular, has been reduced to.

The continual advancement of algorithms has made major players like Google and Facebook experts at serving only personally relevant content and at predicting what consumers are interested in based solely on their previous browsing activity. If you don’t believe us, just have a look at your own Facebook ad data.

This model of prediction is a core long-term strategy for many brands. The benefits of highly targeted ad campaigns are clear: consumers get to enjoy content they are already interested in, and brands are able to efficiently target the audiences of most value to them.

Win-win. Or is it?

While it’s true that we have always consumed media that interests us as a result of cognitive bias – we are primed to notice and like content that feels relevant – consumer choice is no longer so freely exercised by the individual.

The traditional process of selection – browsing a newsstand for a particular paper, for instance – was usually one that allowed a consumer to see everything on offer and then discount what didn’t appeal.


Prediction engines now constantly refine a model of each individual consumer based on what they click on, what they like, what their friends have shared and so on. This ultimately alters the information a consumer could ever come across, leading to an entirely different process of forming opinions.

The term “filter bubble” was coined by Eli Pariser in his book of the same name, first published in 2011. Five years ago, Pariser wrote about how excessive personalisation on the internet – whether in a Google search or a Facebook newsfeed – had dangerous political and social implications, not least its heavy toll on serendipitous discovery.

Even earlier, in 2001, scholar Cass Sunstein wrote that selectivity may eventually trap us inside our own “information cocoons”.

The filter bubble is therefore not a new phenomenon. However, 2016 was the year in which the effects of excessive personalisation became impossible to ignore.

According to the Pew Research Center, over 60% of US adults get their news from social media, and 40% look for news updates solely on Facebook. This leaves many exposed to just one side of each news story – one that is likely to reflect the beliefs they already hold, and those held by most of the people they interact with on a daily basis.

In fact, Facebook’s own survey found that users have on average five friends with similar political views for every one friend with differing views.

This effectively creates a platform of self-serving content and an ever-stronger effect of self-validation: “echo chambers” in which views go unchallenged, and in which both political and societal opinions become increasingly entrenched and polarised.

Individuals in these “echo chambers” share content that resonates with them. But as Ryan Milner, a communication expert at the College of Charleston in South Carolina, puts it: “the thing about resonance is, it doesn’t have to be tied to actual reality.”

So if these decisions are not grounded in truth or reality, how can companies use this data to personalise fact-based content and purchase options?

Much has been said about news consumption and political views, but this ultimately has a knock-on effect. Algorithmic filtering is now orchestrating our entire lives.

It defines which adverts and brands people are exposed to, the content they can consume, and what they share. There is a clear impact on consumer serendipity and brand discovery, with much wider implications for both general online behaviour and offline consumption.

There’s no denying that personalised models of targeting have many benefits, both for consumers, who receive more relevant content, and for the brands trying to reach them more effectively.

However, a greater understanding is needed of what both brands and consumers are missing out on: the potentially “undiscoverable” information and content that cannot crack the filter bubble.

Helen Rose is head of insight at the7stars
