Why Should We Care?

Surveys across the world show that what the public believes about their countries is often far from the truth: life is better than they think. People rate the performance of their societies much more poorly than the reality warrants. This has consequences for how people live, vote and treat one another.


A SYSTEM OF DELUSION
– WHY WE’RE WRONG ABOUT EVERYTHING


WRITTEN BY PROFESSOR BOBBY DUFFY
DIRECTOR OF THE POLICY INSTITUTE, KING’S COLLEGE LONDON
____

 

The gaps between public perceptions and reality on a wide range of social issues are often extraordinary. This has been demonstrated by more than 100,000 interviews in up to 40 countries, conducted for a collection of studies by Ipsos MORI and explored in my book, The Perils of Perception.

For example, people in Britain think that 19 per cent of teenage girls get pregnant each year – when it’s only 1.4 per cent.  Italians think that 26 per cent of their population are immigrants, when the reality is around 10 per cent.

The French think 30 per cent of their population are Muslim, when it’s around 8 per cent.  Across 30 countries, only 15 per cent of people think their national murder rate is down since 2000, when it is actually down substantially in the vast majority.

All the best evidence, including a review of over 1 million children, suggests there is no link between vaccines and autism in healthy children. But six in ten people across 40 countries either think there is a link or are unsure.

 

“The temptation is to cry ‘post-truth’, blaming our increasingly sensationalist media, social media and tribal politicians. But this is not a new phenomenon.”

Bobby Duffy, Director of the Policy Institute, King’s College London

 

The temptation is to cry ‘post-truth’, blaming our increasingly sensationalist media, social media and tribal politicians.

But this is not a new phenomenon. Similar misperceptions have been measured all the way back to 1940s America: our delusions apply across time periods, countries and issues. The studies that I’ve run over the past fifteen years – a period when we’d expect the effects of our changed information environment to have most taken hold – reinforce this view of stubbornly consistent errors. For example, in every survey I’ve done, Americans and Brits think immigration is roughly twice its actual level.

The stability of our misperceptions points to a key conclusion: there are multiple drivers of our delusion, based on two groups of effects that interact: ‘how we think’, our many biases and faulty mental shortcuts; and ‘what we’re told’ by the media, social media and politicians. It’s a ‘system of delusion’ with many feedback loops.

How We Think

There are myriad effects on the ‘how we think’ side of the equation, but I’ll just pick out four of the key ones here.

First, one of our most important biases is our natural focus on negative information. There is an evolutionary element to this.  Negative information tends to be more urgent, even life-threatening: we needed to take note when we were warned by our fellow cavepeople about a lurking sabre-toothed tiger – and those who didn’t were edited out of the gene pool.

Our brains therefore handle negative information differently and store it more accessibly, as shown in a number of experiments that track electrical activity in subjects’ brains. We react more strongly to negative images, like mutilated faces or dead cats, and process them with different intensity in different parts of the brain.  We are therefore very attuned to bad news and a sense of threat in news stories and speeches by politicians, for example, on crime or terrorist attacks.  We focus more on this negative information, and this exaggerates the scale of the risk or issue in our thinking.

 

“We focus more on this negative information, and this exaggerates the scale of the risk or issue in our thinking.”

Bobby Duffy, Director of the Policy Institute, King’s College London

 

Second, we also have a faulty view of change: in particular, we’re susceptible to a false sense that everything is going downhill. We naturally suffer from what social psychologists call ‘rosy retrospection’: we literally edit out bad things from our past, on everything from our poor exam results to our less-than-perfect holidays.

Again, this is not a dumb fault in our brains: it’s good for our mental health not to dwell on past failings or challenges. But it has the unfortunate side-effect of making us think the present and future are worse than our memories of the past. We don’t only exaggerate the scale of crime, for example; we also tend to think it’s getting worse even when it’s not.

Third, we suffer from what social psychologists call “emotional innumeracy” when estimating realities: when answering questions about the facts, we are sending a message about what’s worrying us as much as trying to get the right answer. Cause and effect run in both directions, with our concern leading to our misperceptions as much as our misperceptions creating our concern.

This has the critical implication that myth-busting, correcting misperceptions solely with facts, will always have limited impact – because it misdiagnoses part of the reason for our error.  Our perceptions of reality are partially driven by our emotional reactions.

Finally, some of our biases depend on our pre-existing views, through directionally motivated reasoning. For example, people in the US have utterly divergent views of the extent of gun deaths, depending on whether they are Republicans or Democrats. Around 80 per cent of Democrats (correctly) say that guns kill more people in America than knives or other forms of violence – but only 27 per cent of people who identify as strong Republicans say the same. The same reality, seen entirely differently depending on your existing political view.

Bobby Duffy's book explores the latest research into the media and decision science.

What We’re Told Is Important Too

As well as our own biases, there are actors in politics, the media and social media that achieve the reaction they desire by, for example, emphasising vivid, negative, stereotypical stories precisely because we tend to be influenced more by these than accurate but dry statistics. This is then reinforced in feedback loops of achieving political results, and increasingly instantaneous ratings of popularity, viewing figures, clicks, shares or likes.

These interactions between ‘how we think’ and ‘what we’re told’ effects are not examined as systematically as they should be. Almost all existing analysis tends to focus on one side or the other: on our fallible human brains or an information environment that leads us astray. This reflects our human need for simplicity and solutions: we want to see problems as caused by one thing or another, providing a clear focus for blame and a single answer. We therefore miss the real issue – that we live in a system that, by default, breeds delusion from multiple sources. This delusion needs to be countered by more accurate, balanced and solutions-focused stories that people can also engage with.

The conclusion from all the misperceptions work I’ve done is not that facts and balance are useless. An understanding of factors such as emotional innumeracy, motivated reasoning and rosy retrospection is vital for understanding why we see the world the way we do, and how important our biases and identity are. But these are not inviolate effects, working identically across the population and issues, creating automatons immune to reason and incapable of changing their minds.

People are more varied and complex, and we need to move to a more balanced view, where the value of high quality journalism is recognised as not sufficient, but certainly essential.

Want to Know More?

The Perils of Perception explores the gap between people’s perceptions and reality across more than 40 countries and 200,000 interviews.