We have more information at our fingertips than ever before. But with all this knowledge flying around, why do so many people still hold on to beliefs that have been proven false? The answer lies in how our brains work, in social dynamics, and in the way we process information.
The Continued Influence Effect
One of the biggest reasons misinformation sticks is the continued influence effect: even after we learn that something isn't true, we tend to keep relying on it. Once an idea takes root in our minds, it becomes part of the way we understand the world, even if it's later debunked.
Why? Because our brains love a good story. When corrections come in, they often disrupt the neat narrative we’ve created around the original information. If that correction doesn’t come with a better, more complete story, we’re likely to fall back on what we already know, even if it’s wrong.
I remember when I used to teach an educational psychology course to undergraduates. When we got to the topic of reinforcement versus punishment, I'd make a point of warning them that they likely already had preconceived notions about what these terms meant. Most students thought negative reinforcement was just another term for punishment, which couldn't be further from the truth. I would stress that if they didn't actively work to shift those misconceptions, they'd miss an essential aspect of behaviorist psychology. Even when I provided clear examples showing that negative reinforcement involves removing an unpleasant stimulus to strengthen a behavior (like a car's seatbelt alarm shutting off once the belt is buckled), some students still found it hard to let go of their initial misunderstanding. And a few would inevitably miss those questions on the exam, a perfect (if unfortunate) example of how deeply the continued influence effect can run.
The Familiarity Trap
You’ve probably heard the phrase,
“If you repeat a lie often enough, it becomes the truth.”
This isn't just a catchy saying; it's backed by psychological research. The more we hear something, true or not, the more familiar it feels, and that sense of familiarity makes us more likely to believe it.
This is known as the illusory truth effect. Even when we know a statement is false, repeated exposure can make it feel true. Our media landscape, with its endless cycle of news and social posts, amplifies this effect. It’s easy for false information to spread and gain traction, simply because we see it so often.
Social Identity
Beliefs are more than just ideas—they’re part of our identity. When certain beliefs are tied to our social groups, letting them go can feel like a betrayal. Changing your mind isn’t just about admitting you were wrong; it can mean risking your place within your community.
This deeply ingrained need to stay aligned with our group has evolutionary roots. For most of human history, being ostracized from the group sharply reduced one's chances of survival. Because survival depended on social bonds, tendencies to conform and maintain group cohesion were strongly selected for over millennia. This helps explain why social identity and group beliefs are so powerful: rejecting them can trigger primal fears of isolation, even when those fears are no longer as relevant today.
Research in social psychology suggests that people are less likely to accept corrections when those corrections threaten their core affiliations. Cognitive dissonance plays a big role here, too. When new information clashes with what we already believe, it creates mental discomfort. To avoid that discomfort, we're more likely to dismiss the new information or find ways to justify our original belief.
How to Effectively Debunk Myths
Knowing why we hold on to misinformation can help us tackle it better. Here are a few strategies that work:
Fill in the Gaps: Don’t just tell someone they’re wrong—give them a better explanation. People need a story that makes sense and fits with what they already know.
Avoid Repeating the Myth: The more you repeat a falsehood, the more familiar (and believable) it becomes. Instead, focus on emphasizing the truth without constantly restating the myth.
Use Trusted Voices: If a belief is tied to social identity, it helps when corrections come from within the group. Messages from trusted insiders are more likely to be accepted because they don’t feel like a threat.
Moving Forward
Changing false beliefs isn’t as simple as just throwing facts at people. It takes empathy, patience, and smart communication. Psychological biases like the continued influence effect and the illusory truth effect show us that facts alone aren’t enough. People need stories that fit their worldview and don’t make them feel attacked.
Creating content that’s relatable and respectful can make a real difference. We need to engage with people in a way that respects their emotions and social ties. Only by addressing the cognitive, emotional, and social aspects of belief can we create an environment where truth wins out over repetition and group loyalty.