Changing our mind about some deep-seated belief requires effort, especially when the evidence overwhelmingly goes against what we think to be true. I’m not referring here to the deliberate manipulation of people through dubious means, such as large-scale propaganda campaigns, but to people’s everyday beliefs about what they see as true and false. We can view this on a broad scale, such as climate change, vaccines or politically charged topics like immigration, or we can apply it to education and learning in reference to so-called learning myths.
We can, of course, argue that evidence itself is biased, and this is true to some extent. But when the evidence is consistent across studies, has been scrutinised by others, is generally found to be both internally and externally valid, and generalises beyond the initial experimental environment, it should become harder for us to cling to beliefs that run counter to it.
However, even when evidence meets these standards, we may still dismiss it because it doesn’t support our beliefs, our agenda or some wider aspect of our self-concept.
So why do we continue to reject the evidence?
The backfire effect suggests that when presented with evidence that is inconsistent with our thinking, we actually reinforce our erroneous beliefs. For example, providing facts to people who think vaccinations are unsafe can actually reduce vaccination uptake rather than increase it.
The backfire effect may simply be a symptom of inherent change aversion. Generally, people don’t like change, which is why advertisers and marketers are always coming up with new ideas to make you buy their brand rather than a competitor’s. We may, of course, be drawn to the novelty of a new product, but many of us will still revert to our previous choices over time.
Change is personal
But there is something intensely personal about changing a belief that goes to the core of who we are. This may explain our tendency to vote for the same political party throughout our lives: we cling to what we know and rarely deviate from it. Furthermore, we might even hold a false belief in the continuity of political parties, assuming, for example, that the UK Conservative party under Margaret Thatcher was ideologically the same as the one currently under Johnson (and this holds for all political parties over time).
Describing ourselves as a conservative, liberal, socialist or whatever says something about our beliefs at a very personal level. We therefore defend this view because not to do so would feel like self-sabotage. This is perhaps why, in educational terms, self-described progressives and traditionalists (see my blog here on the prog/trad debate) rarely agree with each other, even in the face of overwhelming evidence.
Our self-concept (our identity or personal ideology) can derail efforts to assimilate new information into current schemas, resulting in assimilation bias (see #14), but there is also an emotional element to changing our minds. When we are presented with evidence that runs counter to our beliefs, the brain treats the new information in the same way it would an external threat. As a result, we defend ourselves against what we unconsciously view as a personal attack, in much the same way the body triggers the fight-or-flight response.
How common is the backfire effect?
Nyhan doesn’t think the backfire effect is all that common, and argues that people do update their beliefs based on factual information. Why, then, are opinions polarised? Exposure need not lead to backfire, and opinions do change over time. However, even after accurately assimilating new information, people may only temporarily alter their beliefs, reversing them later because the new information still isn’t consistent with their self-concept. The new information may therefore suffer from shallow processing because of this inconsistency.
I discussed the illusion of explanatory depth in #13. The illusion of explanatory depth holds that we have less knowledge about a topic than we think we do, a bias that’s revealed when we are asked to explain a specific phenomenon. People who are more likely to reject new evidence are also more likely to believe their knowledge is deeper than it is: they have no need for evidence because they assume they know it all anyway.
Indeed, this study found that people who were most confident about their knowledge of welfare programs in the United States in reality had far less knowledge than those who were less confident. We can see a similar bias in the so-called Dunning-Kruger effect, although this paper suggests the effect might not be real.
Take a step back
Being aware of our biases, taking a step back and critically examining our beliefs are all part of a concept known as metacognition. This involves an awareness of how humans learn, the processes involved, and how and why these processes might derail our learning. It also includes elements of assimilation and error correction: identifying errors in our thinking and proactively working to fix them.
The problem here is that we assume we can do this simply through exposure to evidence and facts, when in reality we will struggle to jettison incorrect beliefs and replace them with more accurate ones. Similarly, expecting others to alter their beliefs because we’ve supplied them with new evidence is a very naive assumption, one that highlights our own biases as well as those of others.