Explore the fascinating phenomenon of dysrationalia - when intelligent people consistently make irrational decisions - and discover why our brains are wired to win arguments rather than find truth.

Smart people are incredible at building logical-sounding arguments for whatever they already want to believe; it's like having a brilliant lawyer in your head who's always working for the wrong side.
Built in San Francisco by Columbia University alumni
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Lena: You know what's been bugging me lately? Everyone talks about being "logical" like it's the holy grail of thinking, but I'm starting to wonder - what if some people just aren't wired that way?
Miles: Oh, that's such a fascinating question! And here's what's wild - there's actually a term for this. Psychologists call it "dysrationalia" - basically, when smart people consistently make irrational decisions despite having perfectly good intelligence.
Lena: Wait, so you're telling me someone can ace their SATs but still fall for obvious scams?
Miles: Exactly! Think about it - we've all met that brilliant friend who believes in conspiracy theories, or the PhD who gets duped by get-rich-quick schemes. Intelligence and rational thinking aren't the same thing at all.
Lena: That's mind-blowing. I always assumed smart equals logical.
Miles: Right? But here's where it gets really interesting - our brains might actually be designed to win arguments rather than find truth. So let's dive into why our supposedly rational minds seem so determined to fool us.
Miles: So here's the thing that blew my mind when I first learned about it—our brains basically operate like they have two different systems running simultaneously.
Lena: Two systems? What do you mean?
Miles: Think of it this way. System One is like your brain's instant reaction mode—it's fast, automatic, and runs on gut feelings. System Two is more like your careful, deliberate thinking mode—slower but more accurate.
Lena: Okay, so like when I immediately think "that person looks trustworthy" versus when I actually sit down and analyze whether I should lend them money?
Miles: Exactly! And here's the kicker—System One is in charge most of the time. It's making snap judgments about everything from whether that car is going to hit you to whether you like someone you just met.
Lena: But that makes sense for survival, right? If a tiger's charging at you, you don't want to spend five minutes weighing your options.
Miles: Absolutely. The problem is, System One doesn't know when to step back. So it's making the same lightning-fast judgments about complex decisions that need careful thought. Like when you're buying a house or choosing a career path.
Lena: So we're essentially running Stone Age software on modern problems?
Miles: That's a brilliant way to put it! And it gets worse—System One is incredibly confident. It doesn't send you a little notification saying "Hey, this is just a guess." It feels like certainty.
Lena: But what about emotions? I mean, we're not just thinking machines, right?
Miles: Oh, this is where things get really fascinating. Emotions aren't just along for the ride—they're actually driving the bus most of the time. There's incredible research showing that people with brain damage affecting their emotional centers can't make good decisions, even when their logic is perfect.
Lena: Wait, so emotions actually help us decide?
Miles: In many cases, yes! But here's the twist—emotions from completely unrelated situations leak into our current decisions. If you're feeling sad about your dog being sick, you might suddenly become more willing to sell your belongings for way less money.
Lena: That sounds crazy. Why would sadness make me a bad negotiator?
Miles: Because sadness triggers this unconscious goal to change your circumstances, to get rewards quickly. Your brain is basically saying "I need something good to happen now" without you even realizing it.
Lena: So my emotions are making financial decisions behind my back?
Miles: Essentially, yeah. And anger does the opposite—it makes you more optimistic about risks, more likely to blame individuals rather than situations. Fear makes you see danger everywhere. Each emotion comes with its own set of invisible biases.
Lena: This is terrifying. Are we just emotional puppets?
Miles: Well, here's the thing—once you know this is happening, you can start to catch it. But most people have no idea their mood from this morning's traffic jam is influencing their afternoon business decisions.
Miles: You know what's really dangerous though? The smarter someone is, the better they get at justifying their emotional decisions after the fact.
Lena: Oh no, so intelligence actually makes this worse?
Miles: In some ways, absolutely. Smart people are incredible at building logical-sounding arguments for whatever they already want to believe. It's like having a brilliant lawyer in your head who's always working for the wrong side.
Lena: I can see that. I've definitely caught myself doing mental gymnastics to justify impulse purchases.
Miles: And here's the really sneaky part—the more educated someone is, the more likely they are to fall for certain types of misinformation, especially if it aligns with their political beliefs. They're not using their intelligence to find truth; they're using it to win internal arguments.
Lena: So knowledge becomes a weapon against ourselves?
Miles: Exactly. There's this phenomenon called the Dunning-Kruger effect where people with a little knowledge become overconfident. But even people with a lot of knowledge can become overconfident in areas outside their expertise.
Lena: Like a brilliant surgeon who thinks they understand economics?
Miles: Perfect example. And the scariest part? The more confident someone feels, the less likely they are to question their own reasoning. Confidence and accuracy often move in opposite directions.
Lena: That's unsettling. So how do we know when we're being overconfident versus appropriately confident?
Miles: That's the million-dollar question, isn't it? Most of us are walking around completely unaware of our own blind spots.
Lena: Okay, but surely when we get together with other people, we can help each other think more clearly, right?
Miles: Oh, you'd think so, but groups often make these problems exponentially worse. There's this thing called groupthink where everyone's so focused on harmony that they stop questioning bad ideas.
Lena: Like when everyone in a meeting agrees with the boss's terrible plan because nobody wants to rock the boat?
Miles: Exactly! And it gets worse with something called the bandwagon effect. When we see other people doing something, our brains interpret that as evidence it must be right, even when we have no other information.
Lena: So we're basically following the crowd off a cliff?
Miles: Sometimes, yeah. And here's what's really wild—emotions are contagious in groups. If one person starts feeling anxious about a decision, that anxiety spreads through the whole team, affecting everyone's judgment.
Lena: I've totally seen this happen in brainstorming sessions. One person's negativity just kills all the creative energy.
Miles: Right! And the opposite happens too. Groups can get caught up in collective overconfidence, where everyone's feeding off each other's enthusiasm and nobody's playing devil's advocate.
Lena: So we need that person who's willing to be the voice of reason, even when it's uncomfortable?
Miles: Absolutely. But here's the problem—groups naturally suppress dissent. The people who might offer the most valuable perspective often stay quiet because they don't want to be seen as negative or disloyal.
Lena: Alright, this is all pretty depressing. Please tell me there are ways to fight back against our own brains.
Miles: There absolutely are! The first step is just awareness. When you know these biases exist, you can start catching yourself in the act.
Lena: Like what should I be watching for?
Miles: Well, anytime you feel really certain about something, that's actually a red flag. Ask yourself—what would change my mind? If you can't think of anything, you're probably in bias territory.
Lena: That's smart. What else?
Miles: Pay attention to your emotions before making decisions. If you're feeling angry, sad, or even really happy, maybe wait before making that big choice. Sleep on it, literally.
Lena: And I bet talking to people who disagree with us helps too?
Miles: Absolutely, but here's the key—you have to genuinely listen, not just wait for your turn to argue. Try to understand their perspective so well you could argue their side.
Lena: That sounds hard but valuable.
Miles: It is hard! But there's also this technique called "considering the opposite." Before you commit to a decision, force yourself to argue for the opposite choice. What evidence supports that view?
Lena: So basically, become your own devil's advocate?
Miles: Exactly. And when you're in groups, someone needs to be assigned the role of skeptic. Make it safe and even rewarded to poke holes in ideas.
Lena: I love that—making skepticism part of the process instead of seeing it as negativity.
Lena: So here's what I'm struggling with—if emotions and intuition are so unreliable, should we just try to be purely logical about everything?
Miles: That's such a great question, and here's the paradox—pure logic might actually be impossible and even counterproductive for humans.
Lena: What do you mean impossible?
Miles: Well, logic needs premises to work from, and where do those premises come from? Usually from values, experiences, and yes—emotions. Even deciding that "logic is good" is itself a value judgment.
Lena: So we can't escape the emotional foundation entirely?
Miles: Exactly. And here's the thing—emotions often contain wisdom that pure logic misses. Your gut feeling that someone isn't trustworthy might be picking up on micro-expressions and behavioral patterns your conscious mind hasn't processed yet.
Lena: Right, like when something feels "off" but you can't put your finger on why?
Miles: Precisely! The key isn't to eliminate emotions and intuition—it's to know when to trust them and when to double-check with more careful analysis.
Lena: So it's about using both systems strategically?
Miles: Absolutely. For routine decisions, System One is usually fine. For important, complex, or unfamiliar decisions, you want to slow down and engage System Two. The trick is knowing the difference.
Lena: And recognizing when our emotions might be hijacking the process?
Miles: Right. Sometimes that "logical" argument we're making is just our emotions in disguise, wearing a fancy suit of reasoning.
Lena: As we wrap this up, I'm realizing that maybe the question isn't why some people don't think logically—maybe it's how any of us manage to think clearly at all!
Miles: That's exactly right! We're all working with the same flawed hardware. The difference is just awareness and practice.
Lena: So what's the takeaway for our listeners who want to make better decisions?
Miles: I'd say start small. Notice when you're feeling really certain about something, especially if it's emotionally charged. That's your cue to slow down and ask some questions.
Lena: And remember that being smart doesn't make you immune to these biases—it might even make some of them worse.
Miles: Exactly. Intelligence is a tool, but like any tool, it can be used skillfully or clumsily. The goal isn't to become a thinking machine—it's to become more aware of how our wonderfully complex, beautifully flawed minds actually work.
Lena: I love that. We're not trying to transcend our humanity; we're trying to understand it better.
Miles: Perfectly said. So to everyone listening—be curious about your own thinking, be gentle with yourself when you catch these biases, and remember that the smartest thing you can do is admit you might be wrong.
Lena: Thanks for taking this journey with us today. We'd love to hear about times you've caught your own biases in action, so feel free to reach out and share your stories. Until next time, keep questioning everything—including your own certainty.