If a predictor knows your next move, do you take one box or two? Explore why this puzzle splits philosophers and how to choose when both sides feel right.

Newcomb’s Paradox is a clash between causal and evidential rationality in which each side finds the other totally nonsensical: one side asks what their actions say about the world, and the other asks what their actions do to the world.
A deep dive into Newcomb's Paradox: why some people struggle to be content with either choice, and why others don't understand the struggle at all.


Created by Columbia University alumni in San Francisco
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Nia: I was reading about this logic puzzle from the sixties called Newcomb’s Paradox, and Eli, it’s actually wild. It’s been over sixty years and we are still fighting about it. Even a survey of two thousand philosophers found they’re split almost down the middle on what to do.
Eli: It’s the ultimate "agree to disagree" problem. You’ve got two boxes: one open with a thousand dollars, and one opaque mystery box that might have a million or nothing. But here’s the catch—a super-reliable predictor already put the money in based on whether they thought you’d take one box or both.
Nia: Right, and if you take both, you likely get just the thousand. If you take only the mystery box, you likely get the million. It sounds so simple, yet one side says taking both is the dominant strategy—the money is already there, so you can't change it—while the other says that’s just leaving a million dollars on the table.
Eli: Exactly, it’s a clash between causal and evidential rationality where both sides think the other is being totally nonsensical. Let's explore how these two logical frameworks lead to such a massive breakdown in mutual understanding.
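The two frameworks in the dialogue can be made concrete with a quick expected-value calculation. This is a minimal sketch, not part of the episode: it assumes a hypothetical predictor accuracy `p` and the standard payoffs ($1,000 visible, $1,000,000 in the opaque box), and shows why the evidential camp one-boxes whenever `p` is high.

```python
def expected_values(p: float, small: int = 1_000, big: int = 1_000_000):
    """Expected payoff of each choice under evidential reasoning,
    assuming (hypothetically) the predictor is correct with probability p."""
    # One-boxers get the big prize exactly when the predictor foresaw one-boxing.
    one_box = p * big
    # Two-boxers always keep the small prize, and get the big one only
    # when the predictor wrongly expected them to one-box.
    two_box = small + (1 - p) * big
    return one_box, two_box

one, two = expected_values(p=0.99)
print(one, two)  # with a 99%-accurate predictor, one-boxing dominates in expectation
```

The causal camp rejects this framing entirely: at the moment of choice the boxes are already filled, so whatever the opaque box contains, taking both yields exactly $1,000 more. The two analyses disagree because they condition on different things, which is why neither side can see the other as rational.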