Why Case Studies Matter
Case studies bring critical thinking concepts to life by showing how they work in real situations. By examining actual scenarios where biases influenced decisions or logical fallacies derailed discussions, you can develop better pattern recognition and learn to apply these concepts in your own life.
Each case study includes:
- Background context: The setting and circumstances
- Key decisions: Critical moments where thinking patterns mattered
- Bias analysis: Identification of cognitive biases and logical fallacies
- Alternative approaches: How critical thinking tools could have helped
- Lessons learned: Takeaways for similar situations
Case Study 1: The Theranos Deception
Medical Technology Fraud (2003-2018)
Background
Elizabeth Holmes founded Theranos claiming to revolutionize blood testing with technology that could run hundreds of tests from a single drop of blood. The company reached a $9 billion valuation before being exposed as fraudulent, leading to criminal convictions and the company's collapse.
Key Players & Decisions
Elizabeth Holmes (CEO)
Made increasingly deceptive claims about technology capabilities, manipulated demonstrations, and misled investors and patients.
Investors
Invested billions without proper due diligence, swayed by charismatic presentations and prestigious board members.
Walgreens Partnership
Rushed into partnership without adequate technical verification, prioritizing speed to market over safety.
Media & Public
Initially celebrated Holmes as a visionary, contributing to hype without sufficient skeptical analysis.
Cognitive Biases & Fallacies Identified
Halo Effect
Holmes's Stanford background and Steve Jobs-like image influenced how people perceived her claims about the technology.
Confirmation Bias
Investors and partners focused on positive indicators while ignoring or dismissing red flags and whistleblower warnings.
Authority Bias
Prestigious board members (Henry Kissinger, George Shultz) lent credibility despite lacking relevant medical expertise.
Bandwagon Effect
As more investors and partners joined, others felt pressure to participate rather than miss out on the "revolution."
How Critical Thinking Could Have Helped
Scientific Method
Demanding reproducible demonstrations and independent verification of technology claims.
Occam's Razor
Preferring the simplest explanation: if established companies with far greater resources had not achieved similar results, the likeliest explanation was that the technology did not work as claimed.
Probabilistic Thinking
Assessing the low prior probability that a college dropout, working without peer review, had revolutionized a field dominated by established players.
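Probabilistic thinking of this kind can be made concrete with Bayes' theorem. The sketch below is a toy calculation: every probability in it is an invented assumption for illustration, not a measured value.

```python
# Illustrative Bayesian update: how likely is a claimed breakthrough to be
# real, given that it has not survived independent peer review?
# All probabilities below are made-up priors, for illustration only.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(claim true | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Assumed prior: revolutionary lab-on-a-chip claims rarely pan out.
prior = 0.05
# Evidence: no peer-reviewed validation after a decade. A real breakthrough
# would likely have been validated by now; a fake one would not.
p_no_review_if_real = 0.10
p_no_review_if_fake = 0.90

posterior = bayes_update(prior, p_no_review_if_real, p_no_review_if_fake)
print(f"P(real | no peer review) = {posterior:.3f}")
```

Even with a generous prior, the absence of independent validation drives the estimate down sharply, which is the intuition behind "extraordinary claims require extraordinary evidence."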
Lessons Learned
- Verify extraordinary claims: Revolutionary breakthroughs require extraordinary evidence
- Separate person from product: Charismatic leaders can still promote flawed or fraudulent ideas
- Value relevant expertise: Prestigious names don't substitute for domain knowledge
- Listen to whistleblowers: Insider warnings often reveal important information
- Demand transparency: Legitimate scientific breakthroughs welcome peer review
Case Study 2: The 2008 Financial Crisis
Global Economic Collapse
Background
A combination of deregulation, risky lending practices, and complex financial instruments led to a housing bubble that burst in 2008, triggering the worst economic crisis since the Great Depression.
Cognitive Biases in Action
Recency Bias
Manifestation: "Housing prices have gone up for years, they always go up"
Impact: Ignored historical cycles and warning signs of unsustainable growth
Groupthink
Manifestation: Industry consensus that complex securities had eliminated risk
Impact: Discouraged dissenting voices and critical analysis of new financial instruments
Overconfidence Effect
Manifestation: Belief that mathematical models had eliminated market uncertainty
Impact: Excessive risk-taking based on flawed confidence in predictive models
Incentive-Caused Bias
Manifestation: Short-term bonuses encouraged risky behavior regardless of long-term consequences
Impact: Systematic distortion of judgment due to misaligned rewards
Critical Thinking Failures
Lack of Independent Analysis
Rating agencies had conflicts of interest; banks did not properly assess risks; regulators relied on industry self-reporting.
Ignoring Base Rates
Historical data on housing crashes and economic cycles were dismissed as "this time is different."
Straw Manning Critics
Warnings about bubble conditions were dismissed as "doom and gloom" rather than seriously analyzed.
Failure to Update Beliefs
Even as warning signs accumulated, few participants updated their risk assessments.
Alternative Approaches
Better critical thinking could have helped through:
- Devil's advocate processes: Systematic challenging of consensus views
- Scenario planning: "What if housing prices fall by 20%?"
- Historical analysis: Learning from previous bubble patterns
- Independent verification: Third-party risk assessment without conflicts of interest
- Stress testing: Scientific method applied to financial models under adverse conditions
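The "what if housing prices fall by 20%?" question lends itself to a simple stress test. The sketch below runs a hypothetical three-loan portfolio through several price-drop scenarios; all loan figures are invented for illustration.

```python
# Toy stress test: how many mortgages go underwater if housing prices fall?
# All balances and home values below are hypothetical.

def stress_test(loans, price_drop):
    """Count loans pushed underwater (balance > collateral value) by a price drop."""
    underwater = 0
    for balance, home_value in loans:
        stressed_value = home_value * (1 - price_drop)
        if balance > stressed_value:
            underwater += 1
    return underwater

# (outstanding balance, current home value) -- hypothetical loans
portfolio = [
    (180_000, 200_000),  # 90% loan-to-value
    (150_000, 200_000),  # 75% loan-to-value
    (195_000, 200_000),  # 97.5% loan-to-value
]

for drop in (0.0, 0.10, 0.20):
    n = stress_test(portfolio, drop)
    print(f"price drop {drop:.0%}: {n} of {len(portfolio)} loans underwater")
```

The point of the exercise is not the numbers but the discipline: forcing the adverse scenario to be computed at all, rather than dismissed as "doom and gloom."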
Case Study 3: The Mars Climate Orbiter Mission Failure
NASA's $125 Million Mistake (1999)
Background
NASA's Mars Climate Orbiter was lost during orbital insertion due to a unit conversion error. One team delivered thruster impulse data in imperial units (pound-force seconds) while the other expected metric units (newton-seconds), causing the spacecraft to approach Mars at far too low an altitude and break up in the atmosphere.
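A minimal sketch of this failure mode, showing how tagging values with explicit units turns a silent error into a loud one. The unit strings, values, and function names here are illustrative, not NASA's actual interface.

```python
# Sketch of the Mars Climate Orbiter failure mode: one component emits
# thruster impulse in pound-force seconds; the consumer needs newton-seconds.
# Tagging each value with its unit makes the mismatch detectable.

LBF_S_TO_N_S = 4.44822  # 1 pound-force second in newton-seconds

def emit_impulse_lbf_s():
    """Ground software reporting impulse in pound-force seconds (imperial)."""
    return ("lbf*s", 15.0)  # hypothetical reading

def consume_impulse(value):
    """Navigation software that requires newton-seconds and checks the tag."""
    unit, magnitude = value
    if unit == "N*s":
        return magnitude
    if unit == "lbf*s":
        return magnitude * LBF_S_TO_N_S  # explicit, auditable conversion
    raise ValueError(f"unexpected unit: {unit}")

impulse_n_s = consume_impulse(emit_impulse_lbf_s())
print(f"{impulse_n_s:.2f} N*s")
```

Without the tag, the consumer would silently treat 15.0 as newton-seconds, an error of roughly a factor of 4.45, which is essentially what doomed the spacecraft.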
Human Factors Analysis
Technical Assumptions
Lockheed Martin engineers assumed NASA would convert their imperial units to metric, while NASA assumed they were receiving metric values.
Communication Breakdown
No formal verification process existed to ensure both teams were using the same measurement system.
Warning Signs Ignored
The navigation team noticed trajectory anomalies but attributed them to other factors rather than investigating thoroughly.
Cognitive Biases & Errors
Confirmation Bias
Teams explained away discrepancies in ways that confirmed their existing assumptions about the mission's progress.
Assumption Error
Both teams made unstated assumptions about what the other team was doing without verification.
Inattentional Blindness
Focus on other mission aspects led to overlooking the fundamental unit mismatch.
Normalization of Deviance
Small trajectory deviations became accepted as "normal" rather than investigated as potential problems.
Critical Thinking Solutions Applied
Systematic Verification
Implementing formal verification processes using scientific method principles to test assumptions.
Questioning Assumptions
Regular "assumption audits" where teams explicitly state and verify their working assumptions.
Red Team Analysis
Independent teams specifically tasked with finding flaws and challenging the main mission approach.
Lessons for Any Project
- Make assumptions explicit: Write down what you're assuming others are doing
- Create verification checkpoints: Regular points where teams confirm they're aligned
- Investigate anomalies: Don't explain away unexpected results too quickly
- Use independent checks: Have someone outside the project review critical interfaces
- Document everything: Clear specifications prevent misunderstandings
Interactive Analysis: Your Turn
Case Study Challenge
Read the scenario below, identify the cognitive biases at work, and suggest better approaches:
The Failed App Launch
Scenario: TechStart Inc. spent two years and $2 million developing a social media app called "ConnectNow." The CEO, Jake, was convinced it would revolutionize how people communicate because his friends loved the beta version. Despite warnings from the marketing team that market research showed limited interest, Jake insisted on launching because "Apple didn't do market research for the iPhone." The app gained only 500 users in its first month and was shut down after six months.