I wasn’t 50 pages into Derek Parfit’s Reasons and Persons before I felt the ground give way under a very old assumption: that rationality is the surest guide to living well.
Parfit doesn’t shout. He doesn’t rant. He simply lays out a set of cases—calmly, carefully, with the kind of philosophical precision that makes you nod along right up until you realize that everything you believe about acting rationally is now lying in pieces on the floor.
And when it broke, I knew exactly who to call.
Not literally, of course. But David Deutsch’s work on fallibilism, the idea that all knowledge is conjectural and open to revision, was the only intellectual scaffolding I had sturdy enough to make sense of what Parfit had just done.
Because here’s what Parfit shows:
Sometimes, the very act of trying to do what’s best for you ensures that things will go worse.
The Rational Trap
Parfit introduces the idea through what he calls the Self-Interest Theory: the notion that rational action is defined by doing what maximizes your own well-being.
Sounds airtight, right? It’s the invisible backbone of everything from Econ 101 to life-coach Instagram.
But then come the cases.
The writer Kate, for example, who drives herself to exhaustion chasing perfection in her books. She knows this makes her less happy. She also knows that if she didn’t care so much, her life would feel hollow. She’s acting irrationally by her own metric—and she’s right to do so.
Or the self-interested man who gets stranded in the desert because no one trusts his promises—not because he broke them, but because he never does anything that doesn’t benefit himself. His perfect rationality is why he’s left to die.
These aren’t exceptions. Parfit argues they’re common. The Self-Interest Theory, when followed too perfectly, can sabotage the very outcomes it was designed to secure.
He calls this indirect self-defeat. I call it brutal.
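Here’s a toy model of that mechanism. It’s mine, not Parfit’s, and every payoff in it is invented: suppose each deal you honor costs a little now but keeps the partner around, while grabbing the bigger immediate payout burns that partner for good. The never-self-denying strategy wins every round it gets to play, and still ends up with less, because the rounds dry up.

```python
# Toy illustration of indirect self-defeat. All payoffs are invented.
# Honoring a deal: pay 2 now, gain 5, and the partner keeps dealing with you.
# Grabbing the most for yourself: gain 6 once, and that partner never deals with you again.

def lifetime_payoff(never_self_denying: bool, partners: int = 20, rounds_each: int = 10) -> int:
    total = 0
    for _ in range(partners):
        for _ in range(rounds_each):
            if never_self_denying:
                total += 6   # take the locally best option...
                break        # ...and lose this partner for every future round
            total += 5 - 2   # honor the deal: smaller payoff, partner stays
    return total

print("never self-denying:", lifetime_payoff(True))   # 20 partners x 1 round  = 120
print("keeps its promises:", lifetime_payoff(False))  # 20 partners x 10 rounds = 600
```

The numbers don’t matter. What matters is that the loss comes from the policy, not from any single choice.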
But then—Deutsch.
Fallibilism to the Rescue
David Deutsch, in The Beginning of Infinity, builds his worldview around a simple but devastating insight: all knowledge is fallible. We are error-prone creatures in a world of infinite unknowns. The goal is not to be right. The goal is to be less wrong, more often.
In that light, Parfit’s paradox isn’t a breakdown. It’s a case study.
Why does the Self-Interest Theory fail? Because it presumes certainty. It assumes that we can reliably know what’s best for us—and that rationality is about following that knowledge wherever it leads.
But as Deutsch would say, you don’t know what you don’t know. You can’t. So any model that depends on certainty—especially about your own flourishing—is brittle from the start.
And here’s where the whole thing comes alive.
Parfit shows us that being “never self-denying”—always doing what seems best for you—is not just occasionally unwise. It’s systemically flawed. It makes you untrustworthy, inflexible, alienated. It eats your future to feed your present confidence.
Deutsch helps us see why that’s not surprising. If rationality is a living process—not a finished state—then it must sometimes direct us to override what we think is best. To act in ways that violate our current best guesses because we’ve learned, probabilistically, that sticking too rigidly to those guesses backfires in predictable ways.
That’s not irrational.
That’s Bayesian.
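One familiar way to see this outside philosophy (my analogy, not Parfit’s or Deutsch’s) is the explore/exploit trade-off from reinforcement learning. An agent that always acts on its current best estimate can lock onto an inferior option and never find out; an agent that deliberately overrides that estimate a small fraction of the time does better in the long run. The sketch below uses made-up payoff probabilities.

```python
import random

# Two options with hidden payoff rates (made-up numbers; the agent can't see these).
TRUE_PAYOFF = {"safe": 0.4, "better_but_untried": 0.6}

def average_reward(epsilon: float, rounds: int = 10_000, seed: int = 0) -> float:
    """Agent acts on its current best estimate, except that with probability
    `epsilon` it deliberately tries something else and updates its beliefs."""
    rng = random.Random(seed)
    estimates = {arm: 0.0 for arm in TRUE_PAYOFF}   # starts knowing nothing
    counts = {arm: 0 for arm in TRUE_PAYOFF}
    total = 0.0
    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.choice(list(TRUE_PAYOFF))         # override the best guess
        else:
            arm = max(estimates, key=estimates.get)     # act on the best guess
        reward = 1.0 if rng.random() < TRUE_PAYOFF[arm] else 0.0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running average
        total += reward
    return total / rounds

print("never overrides its best guess:", round(average_reward(epsilon=0.0), 2))
print("overrides it 5% of the time:   ", round(average_reward(epsilon=0.05), 2))
# With these invented numbers the first agent typically gets stuck near 0.4,
# while the second ends up near 0.6.
```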
Bayesianism: The Art of Being (Productively) Wrong
Bayesian reasoning doesn’t care what your current belief is. It cares how you update it. You start with a prior. You get new evidence. You change your mind. The elegance of Bayesianism is that it welcomes error—because it’s built for revision.
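Here’s what that loop looks like when you write it down. This is a minimal sketch: the hypothesis and every number in it are mine, invented for illustration; only the update rule itself is Bayes’ theorem.

```python
# Minimal Bayesian update: prior belief + new evidence -> revised belief.
# The hypothesis and all numbers are invented for illustration.

def bayes_update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    return p_evidence_if_true * prior / p_evidence

# Hypothesis: "always doing what seems best for me leads to the best outcome."
prior = 0.70                   # starting confidence
# Evidence: a plan that looked optimal backfired.
p_backfire_if_true = 0.10      # backfires rarely if the hypothesis is right
p_backfire_if_false = 0.60     # backfires often if it is wrong

posterior = bayes_update(prior, p_backfire_if_true, p_backfire_if_false)
print(f"belief before: {prior:.2f}  after one backfire: {posterior:.2f}")
# belief before: 0.70  after one backfire: 0.28
```

One piece of disconfirming evidence doesn’t refute the belief; it just shifts the weight. The model is built to be wrong, and to get less wrong.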
So what do you do when, like Kate, your strongest desire leads to pain?
A naïve rationalist says: “That’s irrational. Stop.”
Parfit says: “That’s rational irrationality.”
Deutsch says: “That’s how all progress works.”
And the Bayesian says: “Interesting. Looks like we’re updating the model.”
What emerges is a worldview that treats contradiction not as failure, but as data. Kate’s suffering isn’t an error in the system—it’s a clue that her model of well-being might be oversimplified. Her choice to continue anyway is not hypocrisy. It’s adaptation.
Ever found yourself doing what you knew was “irrational”? Let me know in the comments.
This is what Parfit gave me: a philosophical x-ray of how being too rational can ruin your life.
This is what Deutsch gave me: a framework for understanding why that’s not a bug, but a natural consequence of living in an uncertain world.
And Bayesianism? That’s the bow that ties it all together. A mindset that says: expect your theories to break, and build them so they break smart, not brittle.
So the next time you find yourself acting in a way that violates your best-laid plans, your values, your internal cost-benefit analysis—pause.
You might not be failing.
You might just be updating.