Data in Policy Debate: Skepticism, Confirmation Bias, and Mindsets

Our Data in Policy Debate series has focused on how data about the same question can be framed to lead us toward two very different conclusions.

This framing can be used to manipulate us into partisan camps.

What’s curious, though, is how often we aren’t manipulated by deceptively-framed graphs. That is, we do not simply change our minds whenever we see data that seems to contradict what we think. This is not unreasonable: it should take time and effort to unseat a belief that has been developed with time and effort.

But it’s not the case that we’re simply (and nobly) storing away the new framing on the “other side” of a scale. Indeed, knowing that politically-charged graphs often have cherry-picked data means we’re prepared to be skeptical and approach them with deep scrutiny.

Let’s use a previous example to illustrate: the two graphs below seem to imply different conclusions about gun ownership and murder. Look at each one and quickly assess your gut reaction:

About which graph were you more skeptical, and which less?

Does your level of skepticism correlate with your political leanings? That is, were you more prone to accept the graph that seems to support your current opinion, and prone to reject the one that seems to challenge it?

Most people will find, upon close inspection, that they’re more prone to accept as true a graph that seems to support what they believe, and more prone to hunt for flaws in a graph that seems to challenge it. For the challenging graphs, we closely scrutinize the framing, the source, and the methodology. We identify alternative explanations for the correlation. We can actually be quite good at critical thinking when we’re primed to be.

Mindsets and Partisanship

It’s common to believe that education and exposure to neutral reporting will go a long way toward healing the partisan divide in America: if we’re all armed with the same unbiased analysis, reasonable people should reach the same conclusions.

Unfortunately, that’s just not true. People with a great deal of education and people with very little are both prone to quickly accepting evidence that seems to support their views and rejecting evidence that seems to challenge them. This is called confirmation bias: the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses, while giving disproportionately less attention to information that contradicts them.

Confirmation bias, not a lack of education, is the root of the mindless sort of partisan politics. In fact, exposing people to facts that contradict their beliefs usually causes them to dig in and hold those beliefs more firmly than before.

“So how do we change people’s minds?” Removing confirmation bias from a single person, much less millions, is incredibly difficult; it’s part of human nature. Becoming more aware of the bias, and working within a set of rules as we try to understand politics (and other things), is a start: it’s how scientists minimize the effects of their own biases.

We need to make sure we’re looking to ourselves first: confirmation bias gets all of us sometimes. When you notice yourself emotionally rejecting or emotionally accepting a framing, ask: "Why am I attached to this being true or not true?"

But to help others, we must change the priming. Throwing facts out in front of people to prove them wrong primes them to be skeptical and look for ways to undermine or reject the argument. Another method is needed.

More on that later.