Superforecasting: The Art and Science of Prediction — Summary

Context

Forecasting is the process of making predictions about the future based on analysis of past and present data.

Introduction

Superforecasters are people who make consistently and significantly better predictions than the rest of the population.

  • Temperament: Cautious, Humble, Non-Determinist.
  • Cognition: Open-Minded, Intelligent, Curious, Reflective, Numerate.
  • Analytical: Pragmatic, Dragonfly-Eyed, Probabilistic, Thoughtful Updaters, Intuitive Psychologist.
  • Work Ethic: Growth Mindset, Resilience.

Reading Style & Tone

The book is thoughtful and accessible. It’s also SLOW. It’s boring and directionless for the first 100 pages. After that it gets more interesting.

Book Summary by Chapter

  1. Focus your time and effort on forecasts that are rewarding.
  2. Unpack problems: expose assumptions, catch mistakes, and correct biases.
  3. Consider the broader category before inspecting the case details.
  4. Revise your beliefs often, in small increments, to reduce the risk of under- or over-reacting to new information.
  5. Find merit in opposing perspectives.
  6. Reject the fantasy of certainty and learn to think in terms of uncertainty.
  7. Avoid boasting or waffling. Aim to be humble and prudently decisive.
  8. Learn from experience in success and failure.
  9. Use precise questioning to bring out the best in others, and let others bring out the best in you.
  10. Try, fail, analyse, adjust. Try again.
  11. Question everything. Including this list.

Chapter 1 — Focus and Triage

Ask good questions. Forecasting is part of many professions:

  • Economists.
  • Meteorologists.
  • Policy and Law Makers.
  • Intelligence analysts.
  • Fund Managers.
  • Business analysts.
  • Scientists.
  • Statesmen.
  • Defence Force.
  • Management Consultants.
  • Technologists.

Example

The Arab Spring of 2010–2012 — with uprisings and revolutions across the Arab world — was sparked by the self-immolation of a single man, a Tunisian street vendor.

Chapter 2 — Simplify Problems

Modern medicine and science are VERY recent. So recent that there are people still alive who lived in a time when this wasn’t the case.

Example

The world didn’t have randomised medical trials until 1946, after World War II. That was only 3 generations ago.

  • System 1 is our autopilot — fast, instinctive and emotional.
  • System 2 is our consciousness — slow, deliberate and rational.

The best strategy is to use both systems together.

When people don’t understand things, they usually make up explanations. Scientists are different — they always consider the possibility that their explanation is wrong. This runs counter to human nature. Confirmation bias: we don’t like evidence that contradicts our beliefs.

Chapter 3 — Clarity in Inside and Outside Perspectives

To evaluate forecasts for accuracy, we have to understand what the forecast says. This is harder than you may think.

Example

Many political forecasts about Israeli–Palestinian relations don’t provide a timeframe. Without one, a forecast can never be judged right or wrong, which makes it useless.

Example

A meteorologist tells you there’s a 70% chance of rain. If it doesn’t rain, you may think the forecast was wrong. But a 70% chance of rain also means a 30% chance of no rain, so a single dry day doesn’t prove the forecaster wrong — probabilistic forecasts can only be judged across many predictions.
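
Tetlock’s forecasting tournaments judge this kind of forecast with the Brier score: the average squared gap between the stated probability and what actually happened, measured over many forecasts. A minimal Python sketch (the ten-day example is my illustration, not the book’s):

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and outcomes.

    forecasts: probabilities (0.0-1.0) assigned to "the event happens"
    outcomes:  1 if the event happened, 0 if it didn't
    Lower is better: 0.0 is perfect, and always saying 50% scores 0.25.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)


# A forecaster who says "70% chance of rain" on ten days, seven of which
# turn out rainy, is well calibrated even though three forecasts "missed".
forecasts = [0.7] * 10
outcomes = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
print(brier_score(forecasts, outcomes))  # 0.21
```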

Chapter 4 — Update Your Beliefs

You can both over-react and under-react to new information.

Example

The US invaded Iraq in 2003 on claims that the Saddam Hussein regime was hiding weapons of mass destruction. US intelligence analysts were SO SURE those weapons were there.
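
The remedy the book recommends — revising beliefs often, in small increments — is essentially Bayesian updating. A minimal sketch with made-up numbers (my illustration, not the book’s):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the updated probability of a claim after one piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)


# Start at 60% confidence. New evidence arrives that is twice as likely
# to appear if the claim is true (0.4) as if it is false (0.2).
belief = 0.60
belief = bayes_update(belief, p_evidence_if_true=0.4, p_evidence_if_false=0.2)
print(round(belief, 2))  # 0.75 -- a measured step up, not a leap to certainty
```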

Chapter 5 — Guesstimates

Accurate forecasting is an unintuitive process. Making an estimate with inadequate or incomplete information — a Fermi estimate, named after physicist Enrico Fermi — is a useful skill taught to science and business students alike.
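
The classic Fermi exercise asks how many piano tuners work in Chicago; the point is to break an unanswerable-looking question into parts you can roughly guess. A sketch with purely illustrative numbers (my assumptions, not the book’s):

```python
# Fermi-style estimate: how many piano tuners work in Chicago?
# Every input below is a rough, illustrative assumption.
population = 2_700_000                   # people in Chicago
people_per_household = 2.5
households = population / people_per_household

share_of_households_with_piano = 1 / 20
pianos = households * share_of_households_with_piano
tunings_needed_per_year = pianos * 1     # assume one tuning per piano per year

tunings_per_tuner_per_day = 4
working_days_per_year = 250
tunings_per_tuner_per_year = tunings_per_tuner_per_day * working_days_per_year

estimated_tuners = tunings_needed_per_year / tunings_per_tuner_per_year
print(round(estimated_tuners))  # about 54 -- the right order of magnitude
```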

Chapter 6 — Remove Uncertainty

The wisdom of the crowd: ask a group of people to predict something and they will give you a variety of answers. Is that bad? No. The average of their estimates is often a good approximation of the truth.
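
A quick way to see why averaging helps: if each guess is the truth plus independent error, the errors partly cancel when you combine them. A small simulation sketch (my illustration):

```python
import random

random.seed(42)
true_value = 100.0

# Each person's guess is the truth plus their own independent error.
guesses = [true_value + random.gauss(0, 20) for _ in range(1000)]

crowd_average = sum(guesses) / len(guesses)
average_individual_error = sum(abs(g - true_value) for g in guesses) / len(guesses)

print(abs(crowd_average - true_value))  # error of the crowd average: typically well under 2
print(average_individual_error)         # a typical individual is off by roughly 16
```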

Example

In prehistoric times, and for much of written history, there was no need to solve Fermi problems. Your concerns were far simpler and more immediate:

  • I heard something. Is this a threat? Yes=Run! No=Relax. Maybe=Caution.
  • I’m hungry. Can I eat this? Yes=Yum! No=Discard. Maybe=Caution.

Example

If today has a 70% chance of rain and it doesn’t rain, that doesn’t mean the forecast was wrong. 70% chance of rain also means 30% chance of NOT rain.

There are two kinds of uncertainty:

  1. Being uncertain about things that are knowable.
  2. Being uncertain about things that are unknowable.

Chapter 7 — Prudence vs Decisiveness

There’s no simple, step-by-step method for good forecasting, but there are practices that help:

  • Break the question down into smaller components.
  • Identify what is known and what is unknown.
  • Detail your assumptions and biases.
  • Consider the outside view, and frame the problem not as unique but as part of a larger event.
  • Look at how your opinions match or differ from others.
  • Dragonfly eyes — construct a unified vision. Describe your judgement as clearly and concisely as you can.
  • You can under-correct or over-correct.
  • When confronted with new information, we may stick to our beliefs. Opinions can be more about self-identity than the event itself.
  • Emotional investment makes it hard to admit you’re wrong.
  • Once people publicly take a stance, it’s hard to get them to change their opinion.
  • It’s hard to distinguish important from irrelevant information.

Chapter 8 — Growth From Failure and Success

Some people think they are who they are and can never change or grow. Because they believe they can’t, it becomes true — a self-fulfilling prophecy.

Chapter 9 — Managing Teams

Example

The Bay of Pigs Invasion in 1961 was poorly planned and executed, and the Kennedy administration lost credibility. But the same team handled the Cuban Missile Crisis in 1962 far better.

It’s possible for a group to change their decision-making process for the better.

There is no need to search for perfect candidates; a motivated team can learn to change.

  • Disadvantages: Teams can make people lazy.
  • Advantages: People share information when they work in teams and offer more perspectives. With many perspectives to combine, aggregation becomes more powerful.

Chapter 10 — Superteams

Superteams operate best outside of hierarchical structures. But business and government organisations are hierarchical.

Chapter 11 — The Black Swan and Time

Traps:

  • Thinking that what you see is all there is.
  • Forgetting to check your assumptions.
  • Not paying enough attention to the scope of a question.

Example

Will the Assad regime fall in Syria this year? The answer depends heavily on the scope of the question — a forecast for the next three months is very different from one for the next twelve.

Chapter 12 — Accountability

People like being told what they want to hear, and dislike being told what they don’t. Often influencers and strongly voiced opinions are more powerful than data analysis.
