After publishing Trapped Priors, I posted a link to it on HackerNews, where I received a thought-provoking comment (from user: bulatb) that perfectly encapsulates the hardest part of pursuing truth: it’s inefficient, often lonely, and rarely rewarded. Here’s the comment, followed by my reflections.
The hardest part of this is fighting Goodhart's law. You have to really be the skeptic that you say you are, not just declare yourself a skeptic and take the day off. You have to care about the truth of your positions, not how good it feels to hold them. Not how rational you feel for being such a skeptic.
Which is expensive. It's inefficient at producing confidence from information, and efficiency is fitness. Why do the work when you could just...not? But feel as sure as if you had?
There's also rarely a reward for caring. Few people choose truth over tribe. If all you get for challenging a core belief among your group is punishment, less opportunity and more cold shoulders, without even changing their minds, why even try?
You'd have to care about the truth to an irrational degree.
We've known a bunch of ways to be more rational for thousands of years, but who wants to use them? It's just extra work. The methods in the article are simple and effective—but they're mostly good for leaving you upset, a lot, because nobody cares and they want you to stop.
This comment touches on some of the most challenging aspects of pursuing truth in a world that often values certainty, tribe, and efficiency over rigorous skepticism and uncomfortable truths. Let me unpack a few of the key points:
Goodhart’s Law and the Cost of Skepticism
It is absolutely true that skepticism is not just a badge you wear—it’s a discipline, and one that requires ongoing effort. It’s tempting to declare oneself a skeptic and then shortcut the work of actual scrutiny, falling prey to the same cognitive traps skeptics claim to avoid. Fighting this tendency means embracing the inefficiency described—being willing to question not just others’ assumptions but your own, even when it feels redundant, unproductive, or deeply unsettling.
But that inefficiency has value beyond confidence-building. While efficiency is indeed fitness in many domains, the ability to entertain doubt and update beliefs can yield fitness in unexpected ways. It’s a slower, more deliberate form of adaptation, but it enables resilience when the environment changes or when hidden variables reveal themselves. Skepticism might feel costly in the short term, but it’s an investment in long-term epistemic survival.
Truth vs. Tribe
The observation about the social cost of pursuing truth is painfully accurate. Humans are tribal creatures, and challenging core beliefs—even with the best intentions—can often lead to punishment rather than persuasion. It’s a valid point: if the only reward for questioning the group is ostracism or hostility, why even try?
The answer, as suggested, lies in caring about truth to an “irrational degree.” It’s not a rational choice in the sense of maximizing immediate rewards; it’s a value-driven one. Some of us are compelled to prioritize truth even when it isolates us, because the alternative—embracing comfortable falsehoods or unearned certainty—feels like a betrayal of something essential.
That said, there’s room for nuance here. Choosing truth over tribe doesn’t always mean burning bridges or confronting every belief head-on. Sometimes it means knowing when to engage and when to preserve relationships, finding subtle ways to plant seeds of doubt, or recognizing that some battles aren’t worth fighting. Caring about truth doesn’t mean sacrificing every other value—it means navigating the tension between them with intention.
The Emotional Toll of Rationality
I also take the point that these methods can leave you upset, because they make the world feel less stable and more isolating. Knowing a better way to think doesn’t always make life easier—it often makes it harder, because it removes the comforting illusions that most people take for granted. It can feel like a thankless task to care deeply about truth when the social, emotional, and cognitive incentives push us to do the opposite.
But the discomfort isn’t without purpose. Being upset is part of the process of growth, and that discomfort can spur creativity, humility, and a deeper understanding of ourselves and others. It reminds us that truth-seeking is not just about arriving at better answers—it’s about learning to live with uncertainty, complexity, and the limits of our own understanding.
Why Try?
Ultimately, the question isn’t just “Why even try?” but “What kind of person do you want to be?” For those who care about truth to an “irrational degree,” the pursuit itself is the reward. It’s not about changing the minds of others or reaping tangible benefits; it’s about aligning your actions with your values, even when it’s inconvenient, costly, or lonely.
And while it’s true that many people don’t care about truth in the same way, there are others who do. Building connections with those people—finding or creating your own “tribe of truth”—can make the journey less isolating. Together, we can create spaces where questioning is encouraged, where discomfort is embraced as part of growth, and where caring about truth isn’t a lonely pursuit.
What do you think? Is the pursuit of truth worth the costs, or do our social and emotional instincts lead us to more balanced paths? I’d love to hear your perspective.
Goodhart’s Law states: "When a measure becomes a target, it ceases to be a good measure." In other words, when you start optimizing for a specific metric, the metric often loses its value because the process shifts from genuinely achieving the goal to simply manipulating the numbers.
For example, if "being skeptical" becomes a goal rather than a practice, the focus can shift to appearing skeptical—raising doubts performatively—rather than doing the hard work of critical evaluation. This performative skepticism may satisfy the "skeptic" label but fails to align with genuine truth-seeking. Goodhart’s Law reminds us to focus on the underlying values (truth, understanding) rather than letting proxies (confidence, efficiency, recognition) drive our behavior.
Super interesting, thank you. It’s worth considering how these truths affect the market, and society as a whole.
If I recall correctly, according to the Myers-Briggs Type Indicator, upwards of 50% of people base their beliefs on how those beliefs make them feel. I’d have to look back at the source material to know the exact measured percentage, but it’s high.