Forget Fake News: Why We’re Wrong About Nearly Everything


People are often incredibly wrong about key social and political realities in their countries, as I explore in my book, Why We’re Wrong About Nearly Everything, which draws on over 100,000 interviews across up to 40 nations, including the U.S.

For example, people in the U.S. think that 24 percent of teenage girls give birth each year—when it’s only 2 percent. Americans think that 33 percent of their population are immigrants, when the reality is around 14 percent—and that 17 percent of the population are Muslim, when it’s around 1 percent.

Across 30 countries, only 15 percent of people think their national murder rate is down since 2000, when it is actually down by 29 percent.

Surveys of the general public in the U.S. show Americans are generally a pretty happy bunch, with 9 in 10 saying they’re very or rather happy. But that’s not our impression of our fellow citizens: We think only half of other Americans would say they’re happy.

All the best evidence, including a review of over 1 million children, suggests there is no link between vaccines and autism in healthy children. But 4 in 10 Americans think there is, or they are not sure.

The temptation is to cry “post-truth” and entirely blame our increasingly sensationalist media, social media, and tribal politicians for misleading us and bringing us down. But this is not a new phenomenon. Similar misperceptions have been measured all the way back to 1940s America: Our delusions apply across time periods, countries, and issues.

The stability of our misperceptions points to a key conclusion. There is no single cause. Instead it is a “system of delusion,” based on two groups of effects that interact: “how we think,” our many biases and faulty mental shortcuts; and “what we’re told” by the media, social media, and politicians.

There are myriad effects on the “how we think” side of the equation, but I’ll just pick out four of the key ones.

First, one of our most important biases is our natural focus on negative information. There is an evolutionary element to this. Negative information tends to be more urgent, even life-threatening: We needed to take note when we were warned by our fellow cavepeople about a lurking saber-toothed tiger—and those who didn’t were edited out of the gene pool.

Our brains therefore handle negative information differently and store it more accessibly, as shown in a number of neuroscience experiments that track electrical activity in subjects’ brains. We react more strongly to negative images, like mutilated faces or dead cats, and process them with different intensity in different parts of the brain. We are therefore very attuned to bad news and a sense of threat in news stories and speeches by politicians, for example, on crime or terrorist attacks. We focus more on this negative information, and this exaggerates the scale of the risk or issue in our thinking.

Second, we also have a faulty view of change: In particular, we’re susceptible to a false sense that everything is going downhill. We naturally suffer from what social psychologists call “rosy retrospection”: We literally edit out bad things from our past, on everything from our poor exam results to our less-than-perfect holidays.

Again, this is not a dumb fault in our brains: it’s good for our mental health not to dwell on past failings or challenges. But it has the unfortunate side effect of making us think the present and future are worse than our memories of the past. We don’t only exaggerate the scale of crime, for example; we also tend to think it’s getting worse even when it’s not.

Third, we suffer from what social psychologists call “emotional innumeracy” when estimating realities: Whether we’re consciously aware of it or not, when we answer questions about social realities we are sending a message about what worries us as much as we are trying to get the right answer. Cause and effect run in both directions, with our concern leading to our misperceptions as much as our misperceptions creating our concern.

This has the critical implication that simplistic myth-busting, correcting misperceptions solely with facts, will always have limited impact—because it misdiagnoses part of the reason for our error. Our perceptions of reality are partially driven by our emotional reactions, not cold-eyed arithmetic.

Finally, some of our biases depend on our pre-existing views, through directionally motivated reasoning. For example, people in the U.S. have utterly divergent views of the extent of gun deaths in the U.S., depending on whether they are Republicans or Democrats. Around 80 percent of Democrats (correctly) say that guns kill more people in America than knives or other forms of violence—but only 27 percent of people who identify as strong Republicans say the same. The same reality, seen entirely differently depending on your existing political view.

As well as our own biases, there are actors in politics, the media, and social media that have vested interests in pushing a particular worldview at us, through a distortion of the facts, or just outright lying. I examine a number of examples and their connection to our misperceptions in the book, from politicians across the spectrum, but just to pick out one here, from President Donald Trump’s address to the National Sheriffs’ Association at the White House in Feb. 2017:

The murder rate in our country is the highest it’s been in 47 years, right? Did you know that? Forty-seven years… the press doesn’t tell it like it is. It wasn’t to their advantage to say that.

But there was a good reason the press didn’t say that—because it wasn’t true. It is, however, effective in emotionally connecting to his target audience by playing on human biases—our focus on negative information and our tendency to think that things are getting worse.

Politicians, media, and social media achieve the reaction they desire by, for example, emphasizing vivid, negative, stereotypical stories, precisely because these influence us more than accurate but dry statistics. Politicians, journalists, and content creators understand this intuitively, because they are human too (despite what some think). They are subject to the same biases as the rest of us, so even where this is not part of a dastardly plan, their own delusions drive their messages. These messages are then reinforced by feedback loops: political results, and increasingly instantaneous measures of popularity in viewing figures, clicks, shares, or likes.

It’s getting ever more important that we fight back—because our new information environment presents an accelerating threat to a reality-based view of the world. While we’re no more wrong than we were in the past, our certainty in our faulty perceptions is fueling further polarization, splitting countries into fragments that see even the basics of social and political realities entirely differently.

The early days of the creation of the internet, and then social media platforms, were filled with hope about their power to inform and connect. We largely ignored the systemic risk that they would do the opposite. We were insufficiently focused on how our biases and heuristics would interact with this new information environment. We were blinded by the technological advances and forgot the flawed, motivated, and manipulative (in short, the human) aspects of how we produce and consume information in practice.

This is not just about the relatively small concept of truly “fake news,” or even the greater threat from the deliberate spreading of disinformation. Beyond these there are much wider effects from the extent to which we can filter and tailor what we see online, and how this is increasingly done without us even noticing or knowing it. Unseen algorithms and our own selection biases interact to increase the risk of splintering our collective understanding of the world into individual realities.

But in all this, we need to counter the sense that all is in terminal decline or already lost. Hope is essential to encourage action—and a vital defense against extremists who say things are so bad we need to rip it all up. This is not the same as saying that everything is perfect. But we need to be deeply suspicious of those playing on our biases to undermine our hold on reality and convince us that we are living in a new dystopian era. That really is fake news.

Bobby Duffy is Director of the Policy Institute at King’s College London and author of Why We’re Wrong About Nearly Everything.
