
Discover why ordinary people outperformed intelligence agencies in Tetlock's groundbreaking forecasting research. Can prediction truly be learned? This NYT bestseller reveals the cognitive toolkit that helps superforecasters beat experts - even those with classified information at their disposal.
Philip E. Tetlock and Dan Gardner are the bestselling authors of Superforecasting: The Art and Science of Prediction, a groundbreaking exploration of decision-making and probabilistic forecasting. Tetlock, a Wharton School professor and political psychology expert, revolutionized our understanding of prediction through his decades-long research, including the landmark Good Judgment Project, which showed that trained non-experts could outperform intelligence analysts. Gardner, an award-winning journalist and author of Risk and How Big Things Get Done, brings a razor-sharp narrative style to translating complex research into actionable insights.
Their collaboration merges Tetlock’s academic rigor—honed through affiliations with the University of Pennsylvania and his Grawemeyer Award-winning work—with Gardner’s knack for distilling data-driven stories. The book, spanning psychology, behavioral economics, and cognitive science, reveals how "superforecasters" combine humility, curiosity, and analytical frameworks to improve accuracy.
Featured in The Economist, NPR, and The Wall Street Journal—which called it “the most important book on decision making since Kahneman’s Thinking, Fast and Slow”—Superforecasting has been translated into 20 languages and endorsed by figures like former U.S. Treasury Secretary Robert Rubin. Tetlock and Gardner’s work remains essential reading for professionals in finance, policy, and technology seeking to navigate uncertainty.
Superforecasting explores how ordinary people achieve extraordinary prediction accuracy through structured reasoning, probabilistic thinking, and continuous belief updating. Authors Philip Tetlock and Dan Gardner analyze findings from the decade-long Good Judgment Project, revealing techniques like "Fermi-izing" complex questions and using "dragonfly forecasting" to synthesize multiple perspectives. The book challenges the myth of expert infallibility while providing actionable strategies for improving foresight.
This book suits professionals in finance, policy analysis, and strategic planning who need to assess risks, plus anyone interested in decision-making psychology. Leaders managing uncertainty and self-improvers seeking to reduce cognitive biases will find practical frameworks like probabilistic forecasting and belief updating invaluable.
Yes – its evidence-based approach to geopolitical, economic, and technological forecasting remains vital in our AI-driven era. The 2025 reader gains tools for navigating misinformation and rapid change, and the tournament finding that superforecasters beat intelligence analysts by roughly 60% still stands a decade after publication.
Foxes aggregate diverse viewpoints, express probabilities precisely, and revise forecasts frequently. Hedgehogs rely on single ideological frameworks, make bold yes/no predictions, and resist updating beliefs. Tetlock's research shows foxes consistently outperform hedgehogs across 20 years of geopolitical forecasting.
This US government-funded study (2011-2015) demonstrated that crowdsourced forecasters using structured techniques beat CIA analysts with classified data. Top performers achieved 60% greater accuracy than professionals by applying belief updating, Fermi problem-solving, and collaborative error-checking.
Named after physicist Enrico Fermi, this technique breaks complex questions into smaller, answerable components. A forecast of pandemic spread, for example, can be decomposed into separate, more tractable estimates of its contributing factors.
This method reduces overconfidence and enables granular probability assessments.
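As a minimal sketch of Fermi-style decomposition, consider the classic "piano tuners in a city" question: each factor below is an easy sub-estimate, and the product gives the overall answer. All numbers are hypothetical placeholders, not figures from the book.

```python
def fermi_estimate(factors):
    """Multiply independent sub-estimates to get an overall estimate."""
    result = 1.0
    for _name, value in factors:
        result *= value
    return result

# "How many piano tuners work in this city?" decomposed:
factors = [
    ("city population", 1_000_000),
    ("households per person", 1 / 2.5),
    ("share of households with a piano", 0.05),
    ("tunings per piano per year", 1),
    ("tunings one tuner can do per year", 1 / 1000),
]
print(round(fermi_estimate(factors)))  # a rough count, ≈ 20
```

Each sub-estimate can be debated and refined on its own, which is exactly how the technique reduces overconfidence: errors in individual factors are exposed instead of hidden inside one intuitive guess.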
Superforecasters revise predictions 2-3x more frequently than average forecasters. They treat opinions as "hypotheses in need of testing," systematically adjusting probabilities as new data emerges. This counters confirmation bias – a key reason why 72% of experts underperform basic prediction algorithms.
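Treating an opinion as "a hypothesis in need of testing" can be made concrete with Bayes' rule: each piece of evidence shifts the forecast by however much the evidence discriminates between the hypothesis being true or false. The likelihood values below are illustrative assumptions, not numbers from the book.

```python
def update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

belief = 0.30  # initial forecast: 30% chance the event happens
# A new report arrives that is twice as likely if the event is coming:
belief = update(belief, p_evidence_if_true=0.6, p_evidence_if_false=0.3)
print(f"revised forecast: {belief:.2f}")  # roughly 0.46
```

The key habit is that the revision is incremental and proportional: moderately diagnostic evidence nudges the probability rather than flipping it to 0 or 1, which is the opposite of confirmation bias.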
Inspired by the dragonfly's multi-lens eye, this approach synthesizes contradictory perspectives into a unified forecast: practitioners gather competing viewpoints and weigh them against one another rather than settling on a single narrative.
This prevents single-narrative thinking while maintaining decisiveness.
"Strong views, weakly held" Advocates conviction in current forecasts while remaining open to disconfirming evidence.
"The fox knows many things..." Highlights the predictive advantage of eclectic knowledge over ideological rigidity.
"Forecast, measure, revise" Encapsulates the core three-step improvement cycle used by top forecasters.
Groups beat individual superforecasters by 23% by pooling information and challenging one another's reasoning.
The book details structured collaboration techniques like prediction markets and Bayesian updating pools.
Some critics argue its methods have limitations outside formal forecasting tournaments.
However, 85% of corporate users report measurable decision-making improvements from adopting its frameworks.
While Kahneman explores cognitive biases broadly, Tetlock provides specific tools for improving predictions. Superforecasting offers more actionable frameworks like Fermi decomposition and belief calibration scales, making it preferred by practitioners needing operational methods over theoretical insights.
Feel the book through the author's voice
Turn knowledge into engaging, example-rich insights
Capture key ideas in a flash for fast learning
Enjoy the book in a fun and engaging way
Beliefs are hypotheses to be tested, not treasures to be guarded.
Good judgment depends on knowing things, but also on how one thinks.
Our minds rush to judgment and resist changing course, even when wrong.
If it feels true, it is.
Break down key ideas from Superforecasting into bite-sized takeaways to understand how superforecasters think in probabilities, update their beliefs, and improve with feedback.
Distill Superforecasting into rapid-fire memory cues that highlight key principles of probabilistic thinking, belief updating, and intellectual humility.

Experience Superforecasting through vivid storytelling that turns forecasting lessons into moments you'll remember and apply.
Ask anything, pick the voice, and co-create insights that truly resonate with you.

Built in San Francisco by Columbia University alumni
"Instead of endless scrolling, I just hit play on BeFreed. It saves me so much time."
"I never knew where to start with nonfiction—BeFreed’s book lists turned into podcasts gave me a clear path."
"Perfect balance between learning and entertainment. Finished ‘Thinking, Fast and Slow’ on my commute this week."
"Crazy how much I learned while walking the dog. BeFreed = small habits → big gains."
"Reading used to feel like a chore. Now it’s just part of my lifestyle."
"Feels effortless compared to reading. I’ve finished 6 books this month already."
"BeFreed turned my guilty doomscrolling into something that feels productive and inspiring."
"BeFreed turned my commute into learning time. 20-min podcasts are perfect for finishing books I never had time for."
"BeFreed replaced my podcast queue. Imagine Spotify for books — that’s it. 🙌"
"It is great for me to learn something from the book without reading it."
"The themed book list podcasts help me connect ideas across authors—like a guided audio journey."
"Makes me feel smarter every time before going to work"

Get the Superforecasting summary as a free PDF or EPUB. Print it or read offline anytime.
Here's something that should shake your confidence: in 2011, a retired government employee from Nebraska with no special credentials started outpredicting CIA analysts, celebrated pundits, and sophisticated prediction markets on questions like whether Russia would annex Ukrainian territory. Bill Flack wasn't lucky - he was part of a revolution in how we think about the future. When over 20,000 volunteers participated in a government-sponsored forecasting tournament, researchers discovered something remarkable: the ability to see the future isn't a mystical gift or the domain of credentialed experts. It's a learnable skill, built on specific thinking habits that anyone can develop. The catch? Most of us - especially the experts we trust most - are doing it completely wrong.

In 1956, a renowned medical specialist diagnosed Archie Cochrane with terminal cancer. Cochrane, a physician himself, accepted this death sentence without question and began planning his final months. The specialist was completely wrong. What's terrifying isn't just the misdiagnosis - it's that neither man questioned it. They fell into what psychologists call the certainty trap, where our minds rush to judgment and then defend those judgments against all evidence.

This trap has killed countless people throughout history. George Washington's physicians bled, purged, and blistered him to death in 1799, supremely confident in treatments that were actually hastening his demise. For centuries, doctors rarely questioned methods that ranged from useless to lethal. As one medical historian put it, they were "like blind men arguing over the colors of the rainbow."