Philip Tetlock

Gaining insight into foresight

Intro

We make predictions about the possible outcomes of certain actions in order to inform our decision-making. This seems like an effective process until you realize that most of us are unable to accurately foresee the outcomes of our choices. It was psychologist Philip Tetlock who demonstrated that, generally, the accuracy of our predictions is no better than chance, which means that flipping a coin is just as good as our best guess. Tetlock also realized that certain people are able to make predictions far more accurately than the general population. He dubbed these people “superforecasters”.

Tetlock is a psychology professor and researcher who is fascinated by decision-making processes and the attributes required for good judgment. His career has had a major impact on decision-making processes worldwide, as his discovery of superforecasters has enabled him to uncover the attributes and methodologies necessary for making accurate predictions. Through consultations and workshops, Tetlock and his colleagues have been working to improve decision-making by promoting the qualities necessary to accurately foresee the outcomes of certain decisions.

“In one of history’s great ironies, scientists today know vastly more than their colleagues a century ago, and possess vastly more data-crunching power, but they are much less confident in the prospects for perfect predictability.”

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

Innovative Ideas

Superforecasters – How a select few people can accurately forecast future outcomes

Tetlock’s career has centered on the assessment of good judgment. This research interest led him to discover that the predictions most people – including experts – make about future outcomes are usually not significantly better than chance. In other words, they may as well have guessed. Because we base our decisions on forecasts, these findings call into question the accuracy of our decision-making. This is especially troubling for people like policymakers, whose decisions affect entire populations. In the same study that yielded these somewhat sobering findings, however, Tetlock noticed that a few experts stood out from the crowd and demonstrated real foresight. After publishing this study in 2005, he spent years attempting to uncover what sets these “superforecasters” apart.1 Research into superforecasters was conducted by The Good Judgment Project, an initiative Tetlock founded with Barbara Mellers, a colleague at the University of Pennsylvania.2 This research demonstrated that the key attributes of a superforecaster are teamwork, thinking in terms of probabilities, drawing knowledge from a variety of sources, and a willingness to own up to mistakes and take a different approach.3
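It is worth unpacking what “accuracy” means for probabilistic forecasts. As a minimal, illustrative sketch (not taken from the article, and using hypothetical forecasts), the snippet below computes the Brier score – the scoring rule used in Tetlock’s forecasting tournaments – which measures how far forecast probabilities fall from what actually happened. A forecaster who always answers 50/50 scores 0.25; sharper, well-calibrated forecasts score closer to 0.

```python
# Illustrative sketch: Brier scoring of probabilistic forecasts.
# Forecasts are probabilities between 0 and 1; outcomes are 1 (happened) or 0 (did not).
# Lower scores are better.

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and observed outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecasts for three yes/no questions that all resolved "yes" (1).
coin_flipper = [0.5, 0.5, 0.5]
confident_forecaster = [0.9, 0.8, 0.7]
outcomes = [1, 1, 1]

print(brier_score(coin_flipper, outcomes))          # 0.25 - no better than chance
print(brier_score(confident_forecaster, outcomes))  # ~0.047 - much closer to reality
```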

“Forecasters who see illusory correlations and assume that moral and cognitive weakness run together will fail when we need them most.” 

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

Superforecasters have proven so impressive at forecasting future outcomes that they have outperformed highly trained intelligence analysts with access to classified information that the superforecasters lack.4 In their 2015 book, Superforecasting: The Art and Science of Prediction, Tetlock and his co-author Dan Gardner trace patterns in forecasting throughout history. They give examples of successful and unsuccessful decision-making processes, none more diametrically opposed than two American operations: the raid on Osama bin Laden’s compound, which employed red teams and statistical risk assessments before the operation went ahead, and the Bay of Pigs invasion, which was undone by a failure to employ targeted questioning.5

“When the scientist tells you he does not know the answer, he is an ignorant man. When he tells you he has a hunch about how it is going to work, he is uncertain about it. When he is pretty sure of how it is going to work, and he tells you, “This is the way it’s going to work, I’ll bet,” he still is in some doubt. And it is of paramount importance, in order to make progress, that we recognize this ignorance and this doubt. Because we have the doubt, we then propose looking in new directions for new ideas. The rate of the development of science is not the rate at which you make observations alone but, much more important, the rate at which you create new things to test.” 

― Richard Feynman, quoted by Philip Tetlock in Superforecasting: The Art and Science of Prediction

Tetlock and his team have concluded that, while not everyone can become a superforecaster, we are all capable of improving our judgment.6 Although the Good Judgment Project’s research has come to a close, the Good Judgment initiative continues to offer consulting services and workshops to companies worldwide. By identifying the attributes shared by successful forecasters and the methodologies that allow for accurate forecasting, Tetlock and his team at Good Judgment help companies foster these skills among their employees, enabling more adaptive decisions that support the company’s success. Their consulting services are also particularly valuable for policymakers, who need to anticipate the global consequences of their decisions.7

“Foresight isn’t a mysterious gift bestowed at birth. It is the product of particular ways of thinking, of gathering information, of updating beliefs. These habits of thought can be learned and cultivated by any intelligent, thoughtful, determined person.”

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

Historical Biography

Tetlock, who was born in Canada, attended university in his native country, at the University of British Columbia, where he completed his undergraduate degree in 1975 and his Master’s degree in 1976.8 He went on to do his doctoral studies at Yale, where he obtained his Ph.D. in psychology in 1979.9 Since then, Tetlock has taught courses in management, psychology, and political science at the University of California, Berkeley, the Ohio State University, and the University of Pennsylvania, where he is a current faculty member.10 Broadly, his research focuses on the evaluation of “good judgment” and the criteria used to assess judgment, bias, and error.11

“In describing how we think and decide, modern psychologists often deploy a dual-system model that partitions our mental universe into two domains. System 2 is the familiar realm of conscious thought. It consists of everything we choose to focus on. By contrast, System 1 is largely a stranger to us. It is the realm of automatic perceptual and cognitive operations—like those you are running right now to transform the print on this page into a meaningful sentence or to hold the book while reaching for a glass and taking a sip. We have no awareness of these rapid-fire processes but we could not function without them. We would shut down.” 

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

Tetlock’s primary research interest, the question of what constitutes good judgment, is also his claim to fame. He coined the term “superforecaster” to refer to individuals with particularly good judgment, who are able to foresee future outcomes far more accurately than the average person. The Good Judgment Project was first developed as an entry into a geopolitical forecasting tournament hosted by the Intelligence Advanced Research Projects Activity (IARPA).12 Despite facing impressive competitors, The Good Judgment Project won the tournament. The concept of “superforecasters” was developed by The Good Judgment Project and is arguably its best-known discovery. Tetlock describes the profiles of various superforecasters and the attributes they share in the book he wrote alongside Dan Gardner, Superforecasting: The Art and Science of Prediction. Released in 2015, it became a New York Times Bestseller and brought the concept into the mainstream, making it accessible to behavioral economists and the general population alike.

“The fundamental message: think. If necessary, discuss your orders. Even criticize them. And if you absolutely must—and you better have a good reason—disobey them.”

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

The government-funded research of the Good Judgment Project has since evolved into a public platform called Good Judgment Open, which recruits talented people to be trained as superforecasters.13 Good Judgment also maintains a global network of superforecasters who offer analytical services, and companies can enroll in virtual workshops to boost their forecasting capabilities.14

To develop The Good Judgment Project, Tetlock worked alongside Barbara Mellers, a professor of psychology at the University of Pennsylvania. Her research focuses on decision-making, specifically on variables that are often excluded from rational models of decision-making, such as emotions and the effects of context.15 Tetlock has also collaborated with Dan Gardner, who works at the University of Ottawa’s Graduate School of Public Policy and International Affairs.16 In addition to lecturing on risk, forecasting, and decision-making, Gardner offers consulting services to help people become better decision-makers; his clients include none other than the Canadian Prime Minister, Justin Trudeau.17 Gardner has also worked as a journalist and author,18 and he co-wrote Superforecasting: The Art and Science of Prediction with Tetlock.

“Fuzzy thinking can never be proven wrong. And only when we are proven wrong so clearly that we can no longer deny it to ourselves will we adjust our mental models of the world—producing a clearer picture of reality. Forecast, measure, revise: it is the surest path to seeing better.” 

― Philip Tetlock in Superforecasting: The Art and Science of Prediction

Further Resources

Superforecasting: The Art and Science of Prediction

Tetlock and Dan Gardner’s 2015 collaborative book on prediction examines why, while most people’s predictions are only slightly better than chance, certain people seem to possess genuine foresight. Superforecasting is an informative, well-researched book that remains highly accessible. It was both a New York Times Bestseller and an Economist Best Book of 2015. Jason Zweig of The Wall Street Journal called it “the most important book on decision making since Daniel Kahneman’s Thinking, Fast and Slow,” which, in the field of behavioral economics, is very high praise indeed.

Philip E. Tetlock on Forecasting and Foraging as Fox

In this hour-long interview, Tetlock offers insight into what people look for in a forecaster – everything from reassurance to entertainment – and what makes a good forecaster, which takes more than just intelligence. He covers a variety of topics, including the qualities he looks for in a good leader, whether it is becoming more difficult to make predictions about the world, and what we are able to infer from political speeches.

Superforecasting seminar

This talk by Tetlock accompanies his 2015 book, Superforecasting: The Art and Science of Prediction. As in the book, Tetlock describes why a select few people seem to be able to make accurate predictions about the future – people he refers to as “superforecasters”.

Expert Political Judgment: How Good Is It? How Do We Know?

The title of this 2005 release asks the question on all of our minds. Tetlock aims to provide an answer by analyzing the predictive methodologies of experts and identifying those who are most successful at accurately forecasting future events. What he found is that a person who is knowledgeable across a variety of areas tends to be a better forecaster than a person with deep but extremely narrow expertise.

References

  1. Superforecasting by Philip E. Tetlock and Dan Gardner. Penguin Random House. https://www.penguinrandomhouse.com/books/227815/superforecasting-by-philip-e-tetlock-and-dan-gardner/
  2. About Superforecasting. Good Judgment. https://goodjudgment.com/about/
  3. See 1
  4. See 1
  5. See 1
  6. See 2
  7. See 2
  8. Tetlock – Management Department. Wharton, University of Pennsylvania. https://mgmt.wharton.upenn.edu/profile/tetlock/
  9. Philip E. Tetlock. University of Pennsylvania School of Arts and Sciences. https://www.sas.upenn.edu/tetlock/bio
  10. See 8
  11. See 8
  12. See 2
  13. See 11
  14. Services. Good Judgment. https://goodjudgment.com/services/
  15. Barbara Mellers. University of Pennsylvania Arts & Sciences. https://psychology.sas.upenn.edu/people/barbara-mellers
  16. Dan Gardner | About. DanGardner.ca. https://dangardner.ca/mnt/volume_tor1_01/www/spark/dangardner.ca/web/about
  17. Dan Gardner. University of Ottawa. https://socialsciences.uottawa.ca/public-international-affairs/people/gardner-dan
  18. See 16

About the Author

The Decision Lab

The Decision Lab is a Canadian think-tank dedicated to democratizing behavioral science through research and analysis. We apply behavioral science to create social good in the public and private sectors.
