Certainty Transfer Effect
We’ve been told, often by very confident people, that humans are wired to crave certainty. It’s presented as a kind of evolutionary hangover, like a craving for sugar or the urge to explain traffic patterns with astrology.
It’s an appealing story. But I wonder if we’ve been too quick to accept it. What if the real story is less about biology and more about the world we’ve built around ourselves? What if certainty isn’t some ancient craving we inherited, but more like a habit we picked up from living inside systems that reward it?
That’s the question I’d like to explore, because once you see how modern tools and institutions quietly teach us to expect certainty, it gets a little easier to understand why so many people lose their minds when reality refuses to cooperate.
Here’s a question that’s been rattling around in my head: what if the tools we use to understand the world don’t just shape what we know, but how we expect to know it?
If you asked someone in 1400 for a precise five-day weather forecast, they’d either laugh, offer you some extremely local superstition involving sheep, or assume you were an escaped wizard. Certainty wasn’t something you expected from reality. You lived alongside uncertainty, like an unruly neighbor who occasionally borrowed your rake and never returned it.
Then came the machines. Beautiful, reliable, obedient machines. Once we saw that kind of engineered certainty, it was easy to believe we could apply the same thinking to everything else: economies, ecosystems, election results. The tools worked so well they trained us to expect certainty from everything.
I’ve started thinking of this as the Certainty Transfer Effect, or CTE for short, which feels appropriate, since the kind of CTE I’m talking about also seems to scramble our cognitive functioning over time. One too many hits from confidently wrong predictions, and suddenly we’re all walking around with epistemic concussions, unable to tell the difference between knowledge and performance.
The thing is, we didn’t just inherit this craving for certainty. We absorbed it, like secondhand smoke from every economic forecast, management report, and 24-hour news cycle we’ve ever inhaled. The tools didn’t just show us the world; they trained us how to expect the world to behave. And once certainty became the default aesthetic of expertise, admitting doubt started to feel like a confession, rather than a virtue.
Certainty Feedback Loop
If this was just about technology, that would be interesting enough. But technology doesn’t operate in isolation. It shapes culture, which shapes institutions, which shapes incentives, until you end up with a whole epistemic ecosystem where uncertainty is treated as something shameful.
The loop, roughly, looks like this:
We build tools that simplify complex reality into tidy predictions.
We build institutions that select for people who project confidence.
We build media platforms that reward bold certainty over thoughtful ambiguity.
We build schools that test for correct answers, not insightful questions.
We train the public to associate uncertainty with incompetence and confidence with truth.
The result is a kind of cultural allergy to ambiguity. We crave forecasts that feel like guarantees. We elevate pundits who deliver confident takes on events they barely understand. We mistake the performance of certainty for actual expertise, even though history suggests that the people most certain about the future are, quite often, the first to be surprised by it.
What Happens When Certainty Collapses?
This feels like an especially important question to sit with right now.
The past few years have delivered what you might call a masterclass in epistemic humility: a pandemic, political upheaval, and economic weirdness, all unfolding inside systems that turned out to be far more complex and interconnected than our tidy models had assumed.
The predictable result? Certainty collapsed, and in the vacuum, we saw two reflexes kick in:
Some people lost trust entirely, deciding that if the system was wrong once, it must always be lying.
Others doubled down on certainty, attaching themselves to anyone willing to offer a confident explanation, no matter how far-fetched.
Neither response leaves much room for curiosity, and without curiosity, the whole project of learning gets a lot harder.
If this argument holds water (and I hope you’ll tell me if you think it doesn’t), the path forward isn’t just “teaching critical thinking”; it’s reconsidering the design of the tools and institutions that shape how we understand the world in the first place.
What might that look like? I have a few ideas, but these are more invitations than answers:
What if economic forecasts highlighted their own uncertainty, making the error bars so visible that they became part of the story? (There’s a toy sketch of what that could look like just after this list.)
What if news platforms actively surfaced expert disagreements, instead of trying to package every topic into a tidy narrative?
What if educational systems prized the ability to ask a great question, instead of only rewarding the ability to recall a single answer?
What if political debates rewarded intellectual honesty, celebrating candidates who admitted complexity, rather than penalizing them for it?
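To make that first idea concrete, here’s a minimal sketch of a chart that draws the uncertainty band loudly and treats the point estimate almost as an afterthought. Everything in it is invented for illustration: the eight-quarter horizon, the growth numbers, and the widening 90% interval are all hypothetical, and it assumes numpy and matplotlib are available.

```python
# A toy illustration, not a real forecast: all numbers below are invented.
import numpy as np
import matplotlib.pyplot as plt

quarters = np.arange(8)                 # eight quarters ahead (hypothetical)
point = 2.0 + 0.1 * quarters            # made-up point forecast, in percent
spread = 0.4 + 0.5 * quarters           # made-up uncertainty, growing with horizon

fig, ax = plt.subplots()
# Draw the widening interval first, filled and prominent...
ax.fill_between(quarters, point - spread, point + spread,
                alpha=0.3, label="90% interval (the real story)")
# ...and demote the point estimate to a thin dashed line.
ax.plot(quarters, point, linewidth=1, linestyle="--", label="point forecast")
ax.set_xlabel("quarters ahead")
ax.set_ylabel("GDP growth (%)")
ax.legend()
plt.show()
```

The reversal of visual emphasis is the whole trick: most forecast charts lead with the line and bury the band, and this one does the opposite.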
None of these would fix everything. But they might nudge us toward a culture where uncertainty feels less threatening, where it’s recognized as a natural and necessary part of living in a world that is, by its nature, full of surprises.
The Real Work of Curiosity
Evolution doesn’t work by predicting the future, but by adapting to it. Science doesn’t progress by being right all the time, but by being willing to be wrong in ways that teach us something new. Democracies don’t thrive because they always choose the perfect leader, but because they can survive imperfect ones.
If I’m right about any of this, certainty addiction isn’t inevitable; it’s a side effect of living inside systems that forgot how much wisdom begins with the words “I don’t know.”
The good news is that if this is a design problem, it’s one we can redesign our way out of, provided we’re willing to ask better questions about how we got here.
Speaking of which: does this argument hold up for you? Or do you see a different explanation for our cultural love affair with certainty? I’d love to know what you think, because if there’s one thing I’m certain about, it’s that this is a conversation worth having.

