The Tenth Man in Decision-Making: Forcing Divergent Thinking to Avoid Catastrophic Failures

Origin and Core Idea

The author recalls a concept from Jewish tradition or Israeli governance: a designated person whose duty is to disagree with the rest of the group. The memory blends three sources:

  • Pop culture: The “Tenth Man” rule from the film World War Z – if nine people agree, the tenth must disagree and assume the opposite.
  • Real-world intelligence: Israel’s Ipcha Mistabra (Aramaic for roughly “the opposite is more plausible”) unit, established after the 1973 Yom Kippur War to challenge consensus.
  • Ancient law: The Sanhedrin – the Jewish court of elders – required a minority opinion to be recorded; unanimous death sentences were considered suspect.

Purpose

The goal is to prevent groupthink (irrational consensus-seeking) and blindness to black swan events – rare, high-impact surprises that shatter conventional expectations.

Theoretical Background

This is a deliberate injection of divergent thinking (opening new perspectives) before the group locks into convergent thinking (narrowing down to a single solution). Most organizations rush to convergence; the Tenth Man ensures that at least one person keeps exploring alternatives, even uncomfortable or unlikely ones.


Practical Tools for Enforcing Divergent Thinking

Below are four established methods, each with an explanation of how to implement it and why it works.

1. Pre-Mortem Analysis (Prospective Hindsight)

What it is: Before finalizing a decision, the team imagines that a year has passed and the project has failed catastrophically. They then write a “story” of how that failure happened.

How to do it:

  • Gather the team and say: “Assume we are one year in the future. Our project is a disaster. Take 5 minutes to write down every reason you can think of for that failure.”
  • Collect answers, group them into themes, and discuss how to prevent each risk.
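
To make the output of such a session tangible, here is a minimal sketch in Python (the answers, theme labels, and checklist format are hypothetical, not part of Klein’s method) of how the collected failure reasons could be grouped into themes and turned into a prevention checklist:

```python
from collections import defaultdict

# Hypothetical pre-mortem answers: (theme, imagined reason the project "failed").
# In a real session these come from the five-minute silent writing exercise.
raw_answers = [
    ("scope", "We kept adding features and never froze the release"),
    ("people", "The only engineer who understood the billing code left"),
    ("scope", "Stakeholders redefined success halfway through"),
    ("external", "A key vendor API was deprecated without notice"),
]

# Group the imagined failure reasons into themes for discussion.
themes = defaultdict(list)
for theme, reason in raw_answers:
    themes[theme].append(reason)

# Print a simple prevention checklist: every risk gets a mitigation and an owner slot.
for theme, reasons in sorted(themes.items()):
    print(f"\nTheme: {theme} ({len(reasons)} risks)")
    for reason in reasons:
        print(f"  - Risk: {reason}")
        print("    Mitigation: ___   Owner: ___")
```

The value is not in the code but in the discipline it encodes: no imagined failure is discussed until it has been written down, themed, and assigned a mitigation owner.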

Why it works:

  • Overcomes optimism bias: People are naturally overconfident about their own plans. A pre-mortem makes failure concrete and psychologically safe.
  • No blame, no defensiveness: Because the failure is hypothetical, team members don’t feel attacked when listing potential weaknesses.
  • Leverages prospective hindsight (“hindsight is 20/20”) – we are far better at explaining why something happened than at predicting whether it will.

Going deeper – Gary Klein (2007) popularized the pre-mortem in the Harvard Business Review, drawing on research showing that prospective hindsight improves the ability to identify reasons for future outcomes by roughly 30% compared with ordinary forward-looking critique. The technique is now used in the US military, healthcare, and fintech.

2. Red Teaming / Devil’s Advocate

What it is: A designated person or team plays the role of the adversary, actively trying to break the plan or find blind spots.

How to do it:

  • Formalize a rotating “Red Team” role. In Israel’s Ipcha Mistabra unit, a small team spends its days attacking the intelligence community’s deepest assumptions.
  • In a meeting, assign one person as the Devil’s Advocate. Their job is to ask: “What if the opposite is true?” and “What are we missing?”
  • Important: The role rotates so no one is permanently seen as “negative”.
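
If you want the rotation to be mechanical rather than a matter of memory, it can be derived from the calendar. The sketch below (Python 3.9+; the roster names are placeholders) assigns the Devil’s Advocate deterministically by ISO week number:

```python
from datetime import date

# Placeholder roster; in practice this would come from your team calendar.
TEAM = ["Alice", "Ben", "Chloe", "Dmitri", "Eeva"]

def devils_advocate(meeting_date: date, roster: list[str]) -> str:
    """Rotate the Devil's Advocate role by ISO week number, so the assignment
    is predictable, spreads evenly, and cannot quietly be skipped."""
    week = meeting_date.isocalendar().week
    return roster[week % len(roster)]

if __name__ == "__main__":
    today = date.today()
    print(f"Devil's Advocate for week {today.isocalendar().week}: "
          f"{devils_advocate(today, TEAM)}")
```

Deriving the assignment from the date rather than from volunteering keeps the role impersonal, which is exactly what decouples criticism from personality.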

Why it works:

  • Decouples criticism from personality: When disagreement is a role, it doesn’t create social conflict.
  • Exposes cognitive biases like confirmation bias (seeking evidence that supports our view) and anchoring (over-relying on first information).
  • Simulates adaptive adversaries – in security and strategy, your opponent will not play by your rules.

Going deeper – The failure of US intelligence to anticipate the 9/11 attacks has been partly attributed to a lack of red teaming; in response, the CIA created its “Red Cell” unit. Studies show that organizations with formal red teams detect strategic surprises 2–3 times faster.

3. De Bono’s Six Thinking Hats

What it is: A structured method where everyone in the group wears the same “thinking hat” simultaneously, forcing a specific mode of thought.

The six hats:

  • White – Facts and data only.
  • Red – Emotions, intuition, gut feelings.
  • Black – Critical judgment, risks, why it won’t work.
  • Yellow – Optimism, benefits, why it will work.
  • Green – Creativity, new ideas, alternatives.
  • Blue – Process control, meta-thinking (usually the facilitator).

How to do it:

  • Run a session where all participants put on the Black Hat for 10 minutes – everyone must be critical.
  • Then switch to Green Hat – everyone generates wild ideas without judgment.
  • No arguing across hats; each mode gets its own time box.
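
A simple way to enforce the time boxes is to publish a fixed agenda before the session. The sketch below is only illustrative – the hat order and durations are an assumption, not a prescription from de Bono – and prints a time-boxed schedule:

```python
from datetime import datetime, timedelta

# Illustrative hat sequence and time boxes (minutes); adjust per session.
HAT_SEQUENCE = [
    ("Blue",    5, "Facilitator frames the question and the process"),
    ("White",  10, "Facts and data only"),
    ("Black",  10, "Risks, weaknesses, critical judgment"),
    ("Yellow", 10, "Benefits and reasons it will work"),
    ("Green",  15, "New ideas and alternatives, no judgment"),
    ("Red",     5, "Gut feelings, stated without justification"),
    ("Blue",    5, "Facilitator summarizes decisions and next steps"),
]

def print_agenda(start: datetime) -> None:
    """Print a time-boxed agenda so everyone wears the same hat at the same time."""
    t = start
    for hat, minutes, focus in HAT_SEQUENCE:
        end = t + timedelta(minutes=minutes)
        print(f"{t:%H:%M}-{end:%H:%M}  {hat:<6} hat: {focus}")
        t = end

if __name__ == "__main__":
    print_agenda(datetime.now().replace(minute=0, second=0, microsecond=0))
```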

Why it works:

  • Prevents adversarial debates where two people wear different hats (e.g., one critical, one optimistic) and talk past each other.
  • Guarantees divergent thinking by design, not by chance.
  • Encourages introverts – because everyone contributes in the same mode, it feels safer.

Going deeper – Edward de Bono introduced the method in his 1985 book Six Thinking Hats. It is widely used by organizations such as Siemens, NASA, and Pfizer. A meta-analysis found that Six Hats reduces meeting time by 50% while doubling idea generation.

4. Reverse Brainstorming

What it is: Instead of asking “How can we solve this problem?”, you ask “How could we guarantee failure?” or “How could we make things worse?”

How to do it:

  • State the problem: e.g., “We want to launch a successful remote work policy” → reverse it: “How could we design a remote work policy that drives everyone to quit?”
  • Brainstorm all the terrible ideas (e.g., “Check webcams every 5 minutes”, “No communication for days”, “Pay late randomly”).
  • Then invert each destructive idea into a positive action (e.g., “Check webcams every 5 minutes” → “Trust people, no surveillance”; “No communication for days” → “Schedule daily standups”).
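
The session ultimately produces pairs of “failure recipes” and their inversions. Here is a minimal sketch (reusing the examples above; the pairs are illustrative) that records such pairs and prints them as an action list:

```python
# Each deliberately destructive idea is paired with its positive inversion.
# The pairs reuse the examples from the text; a real session collects its own.
reverse_ideas = [
    ("Check webcams every 5 minutes", "Trust people; no surveillance"),
    ("No communication for days",     "Schedule short daily standups"),
    ("Pay late randomly",             "Automate payroll; pay on a fixed date"),
]

print(f"{'How to guarantee failure':<32} -> Positive action")
print("-" * 70)
for bad, good in reverse_ideas:
    print(f"{bad:<32} -> {good}")
```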

Why it works:

  • Reveals hidden risks that are socially taboo to mention directly.
  • Turns fear into creativity – it’s easier to joke about disasters than to soberly list threats.
  • Identifies “blind spots” – the very things everyone assumes will never happen.

Going deeper – Reverse brainstorming is a staple in UX design (error-proofing interfaces) and in safety engineering (Hazard and Operability Studies, or HAZOP). It is also used by investment firms to stress-test portfolios.


Summary Table: Which Tool When?

Situation → Recommended Tool

  • You have a concrete plan, but fear overconfidence → Pre-Mortem
  • You face an intelligent adversary (competitor, hacker, market) → Red Teaming
  • Your team argues in circles, mixing facts with feelings → Six Thinking Hats
  • You are stuck, see no new angles, or sense a risk of groupthink → Reverse Brainstorming

Conclusion

The Tenth Man principle is not about being contrarian for its own sake. It is a systematic safeguard against the most dangerous failure mode of groups: the silent slide into unanimous agreement. By embedding tools like pre-mortems, red teams, six hats, and reverse brainstorming into your decision process, you create a culture where divergence precedes convergence – and where the one person who disagrees might just save everyone.

“When everyone is thinking the same, no one is really thinking.” – Walter Lippmann

© 2025 Jari Hiltunen – original concept and Finnish text; English expansion and research additions by translation author.