
In "May Contain Lies," finance professor Alex Edmans reveals how our biases make us vulnerable to misinformation. Featured in the Wall Street Journal and endorsed by Adam Grant, this guide introduces the "ladder of misinference": an essential toolkit for navigating today's deceptive information landscape.
Alex Edmans is the author of May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases – And What We Can Do About It. He is a globally recognized finance professor and behavioral economics authority.
Edmans is a Professor of Finance at London Business School and an MIT-trained Fulbright Scholar. He merges academic expertise—honed through roles at Wharton and Morgan Stanley—with groundbreaking research on corporate decision-making, data interpretation, and ethical business practices.
His work, including the Financial Times Business Book of the Year Grow the Pie: How Great Companies Deliver Both Purpose and Profit (translated into nine languages), challenges conventional wisdom on profit-driven strategies and misinformation.
Edmans’ TED talks on trust and social responsibility have amassed nearly 3 million views. His contributions to the Wall Street Journal and Harvard Business Review underscore his influence.
A Fellow of the British Academy and advisor to global institutions, he bridges rigorous analysis with actionable insights for leaders navigating complex information landscapes.
May Contain Lies examines how biases distort our interpretation of stories, statistics, and studies, leading to misinformation. Finance professor Alex Edmans uses real-world examples, like the Deepwater Horizon disaster and a fraudulent wellness guru, to illustrate how people confuse anecdotes for evidence. The book provides tools to critically analyze data relationships and causality, helping readers make informed decisions.
This book is essential for professionals, policymakers, and anyone navigating today’s information-saturated world. It’s particularly valuable for those seeking to improve critical thinking skills, discern factual accuracy in media, or avoid manipulation by biased narratives. Educators and students will also benefit from its frameworks for evaluating evidence.
Yes—Edmans combines rigorous analysis with engaging storytelling, making complex concepts like causal inference accessible. The book’s actionable advice, such as identifying “ladder of misinference” biases (e.g., mistaking data for proof), offers lifelong tools for smarter decision-making. Case studies, like a wrongful conviction due to confirmation bias, underscore its real-world relevance.
Edmans details four key biases: conflating statements with facts, facts with data, data with evidence, and evidence with proof. He demonstrates how these distortions fuel misinformation, using examples like the 10,000-hour rule’s misapplication and diets based on flawed studies.
The book advocates skepticism toward emotionally charged claims and teaches readers to scrutinize sources, sample sizes, and causal relationships. For instance, Edmans critiques the “side of a bus” fallacy—assuming bold claims (like political slogans) are factual without verification.
Notable examples include BP’s Deepwater Horizon oil spill (highlighting leadership overconfidence), a wrongful 20-year imprisonment due to confirmation bias, and the rise and fall of the Cambridge Diet. These illustrate how misinformation spreads when biases override evidence.
Edmans emphasizes analyzing relationships between data points, not just individual statistics. He introduces frameworks to distinguish correlation from causation and assess study limitations, helping readers avoid pitfalls like survivorship bias or cherry-picked data.
This concept describes the progressive errors in reasoning: treating opinions as facts, isolated facts as comprehensive data, data as validated evidence, and evidence as irrefutable proof. Edmans shows how each step amplifies misinformation risks.
Yes—Edmans dissects how Malcolm Gladwell’s popularized “10,000-hour rule” (from Outliers) oversimplifies original research. The book argues that deliberate practice alone doesn’t guarantee expertise, highlighting flawed interpretations of Anders Ericsson’s studies.
The book warns against relying on charismatic leadership narratives or cherry-picked KPIs. For example, it critiques companies that prioritize short-term metrics over long-term sustainability, using Enron-style failures as cautionary tales.
The book's key lines underscore its central theme of skeptical, evidence-based inquiry.
Amid AI-driven misinformation and polarized media, Edmans’ tools help navigate complex issues like climate debates or healthcare trends. The book’s focus on causal reasoning is crucial for evaluating emerging technologies and policy claims.
Break down key ideas from May Contain Lies into bite-sized takeaways to understand how biases distort the stories, statistics, and studies we encounter every day.
Distill May Contain Lies into rapid-fire memory cues that highlight Edmans' framework for separating anecdote from evidence and correlation from causation.

Experience May Contain Lies through vivid storytelling that turns Edmans' lessons on misinformation into moments you'll remember and apply.
Ask anything, pick the voice, and co-create insights that truly resonate with you.

Built in San Francisco by Columbia University alumni

Get the May Contain Lies summary as a free PDF or EPUB. Print it or read offline anytime.
A young Australian woman named Belle Gibson built an empire on a lie. She claimed to have cured her terminal brain cancer through natural methods, turning her story into a bestselling cookbook and a wellness app that hit 200,000 downloads. She earned nearly half a million dollars in eighteen months, and Apple featured her prominently at the Apple Watch launch. There was just one problem: Belle never had cancer. Not even a mild case. Her entire story was fabricated.

What's truly unsettling isn't that Belle lied; it's that so many believed her without question. Sophisticated technology companies, major publishers, and countless media outlets never bothered to verify her medical records. Why? Because her story told us what we desperately wanted to hear: that willpower and lifestyle changes could conquer our most terrifying diseases. This is confirmation bias at work: our tendency to embrace claims that align with what we want to believe while dismissing evidence that challenges our worldview.