The Bayesian Mind
Or, How a Dead English Minister Accidentally Explained Why Your Brain Gets Stuck
Two researchers in Amsterdam recently made a discovery that would have mystified Thomas Bayes, the 18th-century English minister whose mathematical theorem remains relevant in innumerable fields that did not exist when he formulated it. Yang and van Vugt weren't thinking about Bayes at all when they wired up a group of volunteers to EEG machines in their lab. They weren't thinking about depression, either, at least not in the way most people think about depression—as sadness, as darkness, as the absence of joy.
They were thinking about decisions. Simple ones. The kind of mundane choices that fill our days without us noticing: Is this number higher or lower than the last one? Which of these two options is more likely to be correct? The sort of thing a computer could solve in microseconds, but that human brains—those magnificent, maddening biological computers—turn into elaborate productions involving electrical storms and chemical cascades.
What Yang and van Vugt discovered wasn't dramatic. It wasn't the kind of finding that makes headlines or wins Nobel Prizes. But it was the kind of discovery that, once you see it, changes how you see everything else. Like learning that the stock market isn't really about companies, or that professional basketball isn't really about height. The researchers found that people who were depressed—not clinically, catastrophically depressed, just moderately, persistently blue—made decisions differently than everyone else.
Not worse decisions. Not irrational decisions. Just slower ones.
Their brains, when presented with new information, took longer to incorporate it. Much longer. It was as if the mental machinery that normally hums along, quietly updating our beliefs about the world based on fresh evidence, had downshifted into first gear. The information came in. The brain registered it. But then... nothing. Or rather, something so slow it might as well have been nothing.
Yang and van Vugt called this "reduced drift rate," which sounds about as exciting as watching paint dry. But what they had stumbled upon was something far more profound: a mechanistic explanation for one of humanity's oldest puzzles.
Why can't some people just snap out of it?
The Mathematics of Misery
To understand what the Amsterdam researchers had discovered, you need to understand something about how decisions actually work inside your skull. Most people imagine decision-making as a kind of internal debate—the angel on one shoulder arguing with the devil on the other, until one side wins. But that's not how it works at all.
Decision-making, according to the best current science, is more like evidence accumulation. Your brain starts with some initial hunch—statisticians call this a "prior"—and then gradually collects data points until one option clearly outweighs the others. It's a bit like filling up two buckets with water, one drop at a time, until one bucket gets heavy enough to tip the scale.
This process has a name: drift diffusion. And it has a mathematical foundation that traces back to Thomas Bayes, the minister who died in 1761 without knowing he had invented one of the most powerful tools in modern science.
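The drift-diffusion picture can be sketched in a few lines of code. This is a minimal, illustrative simulation, not Yang and van Vugt's model or parameters: evidence starts at zero and random-walks toward one of two decision thresholds, and the "drift rate" sets how fast it moves on average.

```python
import random

def drift_diffusion(drift, threshold=1.0, noise=0.05, dt=0.01, max_steps=100_000):
    """Simulate one drift-diffusion decision.

    Evidence accumulates from 0 toward +threshold ("yes") or
    -threshold ("no"); `drift` sets the average speed of accumulation.
    Returns (choice, decision_time_in_steps).
    """
    evidence = 0.0
    for step in range(1, max_steps + 1):
        evidence += drift * dt + random.gauss(0.0, noise)
        if evidence >= threshold:
            return "yes", step
        if evidence <= -threshold:
            return "no", step
    return "undecided", max_steps

random.seed(42)
# Average decision time for a healthy vs. a reduced drift rate.
fast = sum(drift_diffusion(drift=0.5)[1] for _ in range(200)) / 200
slow = sum(drift_diffusion(drift=0.1)[1] for _ in range(200)) / 200
print(fast, slow)  # the lower drift rate takes markedly longer, on average
```

Lower the drift parameter and the same noisy evidence takes far longer to tip either bucket, which is exactly the signature the Amsterdam EEG study reported.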
Bayes' theorem is deceptively simple. It says that your belief in something should change when you get new evidence. Start with what you think you know. Add what you just learned. Update accordingly. It's the logic that powers Google's search algorithms, Netflix's recommendations, and the GPS in your car. It's how meteorologists predict hurricanes and how doctors diagnose diseases.
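The update rule itself fits in one function. Here is a toy calculation with made-up numbers, just to show the mechanics: start with a prior, weigh the evidence under each hypothesis, get a posterior.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1.0 - prior)
    return numerator / marginal

# "It will rain today": prior belief of 30%.
# Dark clouds appear on 80% of rainy mornings, 20% of dry ones.
posterior = bayes_update(prior=0.30,
                         p_evidence_given_h=0.80,
                         p_evidence_given_not_h=0.20)
print(round(posterior, 3))  # 0.632 -- the belief roughly doubles
```

Start with what you think you know, weigh what you just saw, update. That single step, repeated, is the whole engine.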
And, if Yang and van Vugt's results are reliable, it's how psychologically healthy people navigate life.
But here's the thing about Bayesian updating: it only works if you're actually willing to update. If you're stuck—if your priors have hardened into something resembling concrete—then all the evidence in the world won't help you.
This, the researchers realized, might be what depression actually is. Not sadness. Not hopelessness. Not even the absence of pleasure. But stickiness. The inability to revise your beliefs about the world, even when the world keeps offering you reasons to revise them.
The Thermostat That Wouldn't Turn On
Consider Sarah, a composite of patients described in the psychiatric literature. Sarah believes she's unlikable. This isn't a casual self-doubt—it's a core conviction, as solid and unexamined as her belief that the sun rises in the east. When Sarah goes to a party and someone compliments her dress, her Bayesian updating system should, in theory, adjust her self-assessment upward by some small amount. Evidence received: someone likes something about me. Prior belief: people don't like me. Updated belief: maybe people don't dislike me quite as much as I thought.
But Sarah's system doesn't update. The compliment bounces off her like light off a mirror. She explains it away—the person was just being polite, or they felt sorry for her, or they wanted something from her. Her prior belief remains untouched, as pristine and unchanging as the day she first formed it.
Yang and van Vugt's EEG data showed this process in real time. When presented with new information that should have shifted their decision-making, the depressed participants' brains moved like syrup in winter. The neural signatures that normally accompany belief updating—the electrical patterns that say "wait, maybe I was wrong about this"—were muted, delayed, or absent entirely.
It was like watching a thermostat that could sense the temperature but couldn't turn on the heat.
The Ecology of Belief
Once you start seeing the world through Bayesian eyes, certain things become impossible to unsee. Take the coach who can't adjust his game plan when it's clearly not working. Or the investor who doubles down on a losing stock rather than admit he was wrong. Or the parent who continues using the same failed discipline strategy because "that's how I was raised."
These aren't moral failures. They're updating failures.
The most successful people—in business, in relationships, in life—tend to be ruthless updaters. They hold their beliefs lightly, ready to revise them when new evidence arrives. The poker player who folds a strong hand when the betting pattern suggests he's beaten. The entrepreneur who pivots when the market doesn't respond as expected. The friend who apologizes when she realizes she misunderstood the situation.
These people aren't smarter, necessarily. They're more Bayesian. They've maintained what you might call cognitive plasticity—the ability to let their minds change when the world changes.
But for millions of people, this plasticity has gone missing. Depression, anxiety, PTSD, addiction—all of these conditions, viewed through the Bayesian lens, start to look like updating disorders. The brain gets stuck in a loop, replaying the same predictions and reaching the same conclusions, regardless of what reality keeps trying to tell it.
Take anxiety, for instance. An anxious person might believe that airplane travel is dangerous. Statistically, this is false—you're more likely to be struck by lightning than die in a plane crash. The anxious person knows this statistic. They've heard it a thousand times. But their Bayesian updating system has gone offline. Each safe landing doesn't update their belief about flying. Each uneventful flight doesn't reduce their fear. The evidence accumulates, but the belief remains frozen.
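One simple way to model that frozen belief is to temper the evidence before it enters the Bayesian update. The sketch below is an illustration of the idea, not a clinical model; the `weight` parameter and all the probabilities are invented for the example. At full weight, twenty safe landings demolish the belief that flying is dangerous. At a tiny weight, the same twenty landings barely dent it.

```python
def weighted_update(prior, lik_h, lik_not_h, weight=1.0):
    """Bayes update with likelihoods tempered by `weight` in [0, 1].

    weight=1.0 is full updating; weight near 0 effectively
    ignores the evidence (the 'stuck' regime).
    """
    lik_h = lik_h ** weight
    lik_not_h = lik_not_h ** weight
    num = lik_h * prior
    return num / (num + lik_not_h * (1.0 - prior))

def after_flights(n, weight):
    """Belief that 'flying is dangerous', revised after n safe landings."""
    belief = 0.90  # strong starting conviction
    for _ in range(n):
        # A safe landing is far likelier if flying is safe (illustrative numbers).
        belief = weighted_update(belief, lik_h=0.50, lik_not_h=0.999,
                                 weight=weight)
    return belief

print(after_flights(20, weight=1.0))   # collapses toward zero
print(after_flights(20, weight=0.02))  # still above 0.85 after 20 safe flights
```

The evidence stream is identical in both runs. Only the willingness to weigh it differs, which is the whole point.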
Debugging the Soul
If this view of mental illness is correct—if conditions like depression and anxiety are really updating disorders—then it suggests something radical about treatment. Therapy isn't about talking through your feelings or processing your trauma, though it might involve both of those things. Therapy is about debugging. It's about identifying the stuck beliefs and finding ways to get them unstuck.
Cognitive-behavioral therapy, the most widely used and researched form of psychotherapy, suddenly makes perfect sense through this lens. CBT doesn't try to make you feel better directly. Instead, it asks you to become a scientist studying your own life. What do you believe about yourself and the world? What evidence supports those beliefs? What evidence contradicts them? Can you design experiments to test your assumptions?
It's Bayesian updating, but with training wheels.
Exposure therapy works the same way. If you're afraid of dogs because you believe they're dangerous, exposure therapy doesn't try to convince you that dogs are safe. Instead, it presents you with a carefully controlled stream of evidence—friendly dogs, calm interactions, positive experiences—until your belief updating system has no choice but to revise its position.
Even mindfulness meditation, which might seem unrelated to statistical inference, can be understood as a form of Bayesian training. When you sit quietly and observe your thoughts without judgment, you're learning to hold your beliefs more lightly. You're practicing the art of noticing your mental models without being trapped by them.
The Speed of Recovery
This leads to an unusual but potentially useful way of thinking about mental health: not as the presence of happiness or the absence of suffering, but as the rate at which you update your beliefs.
How quickly do you notice when your assumptions about the world no longer match reality? How readily do you revise your self-concept when you receive new information about yourself? How often do you change your mind?
These might sound like abstract questions, but they have practical implications. In Yang and van Vugt's experiment, the people with faster "drift rates"—those whose brains quickly incorporated new evidence—were not only less depressed but better at navigating uncertainty. They made decisions more efficiently. They adapted to changing circumstances more readily. They were, in a word, more functional.
This suggests that recovery from depression, anxiety, and other mental health conditions might not be about reaching some final destination—a state of permanent happiness or unshakeable confidence. Instead, recovery might be about restoring motion to a system that has become stuck. It's about getting your Bayesian updating system back online.
The implications extend beyond individual therapy. If mental health is really about updating speed, then the environments we create matter enormously. Organizations that punish people for changing their minds will be filled with people who stop changing their minds. Cultures that treat intellectual flexibility as weakness will produce cognitively rigid populations. Schools that reward students for having the "right" answers rather than for revising wrong ones will graduate students who are afraid to revise.
The Minister's Legacy
Thomas Bayes died in 1761, and his theorem wasn't even published until after his death. He had no way of knowing that his mathematical insight would eventually power the algorithms that translate languages, recommend movies, and diagnose diseases. He certainly couldn't have imagined that it would offer a new way of understanding human suffering.
But perhaps he would have appreciated the irony. Bayes was, after all, a minister—someone whose job it was to help people change their lives. He spent his career trying to move people from one set of beliefs to another, to help them see the world differently than they had before.
What Yang and van Vugt discovered in their Amsterdam laboratory was that this process of belief revision—this fundamental capacity to update our mental models based on new evidence—might be the very thing that breaks down in mental illness. And restoring it might be the key to healing.
The depressed brain isn't broken because it's sad. It's broken because it's stuck. And the path to recovery isn't about finding happiness—it's about finding motion. It's about returning to a state where the mind can do what minds are supposed to do: take in new information, revise old beliefs, and keep updating its model of the world until it matches the world as it actually is.
In the end, mental health might be nothing more complicated than maintaining the ability to change your mind. Which would have pleased Thomas Bayes, who spent his life in the business of changing minds, even if he never knew he was also in the business of changing the future of psychology.
The brain that can update is the brain that can heal. And the mind that stays in motion is the mind that stays alive.