
Superforecasting: The Art and Science of Prediction


Book written by Philip Tetlock and Dan Gardner

Book review by Rick Howard

Bottom Line

Hall of Fame Candidate: I recommend this nonfiction book for the Cybersecurity Canon Hall of Fame.

Review

Out of all the capabilities in the infosec community that have improved over the years, the one essential skill that hasn’t moved forward is calculating risk. Specifically, how do we convey risk to senior leadership and to the board? 

In my early network defender days, whenever somebody asked me to do a risk assessment, I would punt. I would roll out my "qualitative heat map" risk assessments and my three levels of precision--high, medium, and low--and call it a day. Along with many of my peers, I would tell myself that predicting cyber risk with any more precision was impossible; that there were too many variables; that cybersecurity was somehow different from every other discipline in the world, and that it just couldn't be done.

We were wrong of course. 

The book that changed my mind on the subject was "Superforecasting: The Art and Science of Prediction," by Philip Tetlock and Dan Gardner. Dr. Tetlock is quite the character. He's one of those scream-and-shake-your-raised-fist-at-the-TV-because-they-have-no-idea-what-they-are-talking-about people. He would watch news channels like CNN, FOX, and MSNBC, where the hosts would roll out famous pundits to give their opinions on some topic because, once in their lives, they had predicted something correctly. It didn't matter that all the predictions they'd made since were wrong. The news programs would still bring them on as if they were Moses coming down from the mountain to present their wisdom. Dr. Tetlock thought that they should have to keep score. I always thought that when pundits came on, the viewer should see their batting average rolling across the chyron on the bottom of the screen: "These pundits have made 73 correct predictions out of 1,000 tries in the last year. Maybe you shouldn't listen too closely to what they have to say."

And then Dr. Tetlock decided to test his idea. Working with IARPA (the Intelligence Advanced Research Projects Activity), he devised a forecasting tournament with three communities of forecasters: the intelligence community, the academic community, and a group I call the soccer moms. The soccer moms weren't really soccer moms; they were just regular people with time on their hands who liked to solve puzzles. According to the Washington Post, he then had them forecast answers to over 500 really hard questions like

- Will the Syrian President, Bashar Hafez al-Assad, still be in power in six months' time? 

- Will there be a military exchange in the South China Sea next year? 

- Will the number of terrorist attacks sponsored by Iran increase within one year of the removal of sanctions? 

Out of the three communities, the soccer moms came out on top. They beat the tournament's control group by 60%, beat the academic teams by between 30% and 70% depending on the school (MIT and the University of Michigan were two), and outperformed the intelligence groups. But Tetlock also discovered a subset of the soccer moms: the superforecasters. By the end of the four-year tournament, these superforecasters had outperformed the soccer moms by over 60% and could also see further out than the control group. "Superforecasters looking out three hundred days were more accurate than regular forecasters looking out one hundred days." 

And these superforecasters don't have superpowers either. They are intelligent, for sure, but not overly so. They are not all card-carrying members of Mensa, and they are not math nerds. But by following a few guidelines, they can outperform random Kentucky-windage guesses. These are the traits that security practitioners can adopt to create their own cybersecurity forecasts:

1: Forecast in terms of quantitative probabilities, not qualitative high-medium-lows. Get rid of the heat maps. Embrace the idea that probabilities are nothing more than a measure of uncertainty. Understand that just because the probability that something will happen is 70%, that doesn't mean it's a lock (see Secretary Clinton in the 2016 U.S. Presidential campaign).

2: Practice: Make a lot of forecasts and keep score using the Brier score (invented by Glenn W. Brier in 1950). The score rewards two things: calibration and resolution. Calibration is how well your stated probabilities match reality over many forecasts (when you say 70%, it should happen about 70% of the time; otherwise you are overconfident or underconfident). Resolution rewards decisiveness: when you confidently say something is going to happen and it does, you score better than if you had hedged at 50/50. (There is a small sketch of the scoring math after this list.)

3: Embrace Fermi estimates (outside-in vs. inside-out forecasts). Outside-in means looking at the general case first before you look at the specific situation. The Italian-American physicist Enrico Fermi was a central figure in the invention of the atomic bomb, and he was renowned for his back-of-the-envelope estimates. With little or no information at his disposal, he would often calculate a number that subsequent measurement revealed to be impressively accurate. He would famously ask his students things like "estimate the number of square inches of pizza consumed by all the students at the University of Maryland during one semester." He understood that by breaking a big, intractable question down into a series of much simpler, answerable questions, we can better separate the knowable from the unknowable. When setting a forecast, use a range that you are 90% sure contains the right answer. The surprise is how often good probability estimates arise from a remarkably crude series of assumptions and guesstimates. 

4: Check your assumptions: Tweak them, abandon them, seek new ones, and adjust your forecast from there.

5: Dragonfly eyes: Consume more evidence from multiple sources. Construct a unified vision of it. Describe your judgment about it as clearly and concisely as you can, being as granular as you can be.
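To make the Brier score from guideline 2 concrete, here is a minimal sketch in Python. It is not from the book: the forecasts and outcomes are made-up numbers, and the score shown is the common yes/no form of the metric, the mean squared difference between the probability you assigned and what actually happened (0.0 is perfect; always guessing 50/50 earns 0.25; lower is better).

```python
# A minimal Brier score sketch (hypothetical forecasts, not data from the book).
# For yes/no questions: the mean squared difference between the probability you
# assigned and the outcome (1 = it happened, 0 = it didn't). Lower is better.

def brier_score(forecasts, outcomes):
    if len(forecasts) != len(outcomes):
        raise ValueError("need one outcome per forecast")
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Example: four forecasts a practitioner might have logged over a quarter.
forecasts = [0.70, 0.20, 0.90, 0.50]  # stated probabilities that each event occurs
outcomes = [1, 0, 1, 0]               # what actually happened

print(f"Brier score: {brier_score(forecasts, outcomes):.4f}")  # prints 0.0975
```

Logging forecasts this way is what makes calibration and resolution visible over time: you can group all of your 70% forecasts together and check whether they really came true about 70% of the time.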

The point of all of this is that it's possible to forecast the probability of some future, mind-numbingly complex event with precision. If the soccer moms can accurately predict the future of the Syrian President, surely a bunch of no-math CISOs, like me, can forecast the probability of a material impact due to a cyber event for their organizations. That's risk forecasting.

Tetlock spends time talking about how the U.S. government has not done this kind of thinking in the past:

Massive Intelligence Failures

- WMD in Iraq: 20 years of war based on the "slam dunk" assertion that these weapons existed in Iraq when they didn't.

- Vietnam War: 10 years of war based on the widely held belief that if South Vietnam fell, the entire world would fall to communism like dominoes. Leaders didn't treat it as a probability; they thought it was a sure thing.

- Bay of Pigs: President Kennedy's political disaster, caused when the planners didn't reconsider the probability of success after the plan changed at the last minute.

- Is Osama Bin Laden in the Bunker? 

Tetlock describes a scene in one of my favorite movies, 2012's "Zero Dark Thirty," starring Jessica Chastain. The CIA director, Leon Panetta (played by the late, great James Gandolfini), is in a conference room asking his staff for a recommendation on whether or not Osama Bin Laden is in the bunker. He's looking for a yes-or-no answer. One of his guys says that he fronted the bad recommendation about WMD in Iraq, and because of that failure, they don't deal in certainties anymore; they deal in probabilities. Which is the right answer, by the way, just not a very satisfying one. They go around the room and get a range of probabilities from 30% to 80%. Chastain's character breaks into the conversation and says that the probability is 100%: "OK fine, 95%, because I know certainty freaks you out, but it's a 100%." Which is the wrong answer, by the way. The probability was never 100%, no matter how sure she was of her evidence.

Tetlock interviewed the real Leon Panetta about that meeting and the subsequent meeting with President Obama about the decision to send special forces into Pakistan to get Osama Bin Laden. When the President went around the room with his staff, he also got a range of probabilities. His conclusion, though, after reviewing those recommendations, was that his staff didn't know for sure. Therefore, it was a fifty-fifty chance, a toss-up, on whether or not Osama Bin Laden was in the bunker. Which is the wrong conclusion, by the way; the evidence pointed to something much stronger than even odds. He ultimately made the right call, but he could just as easily have erred on the side of caution.

Tetlock also describes criticism of his superforecasting approach from his colleague Nassim Taleb, the author of "The Black Swan: The Impact of the Highly Improbable," published in 2007. Taleb says that forecasting is impossible because history is controlled by "the tyranny of the singular, the accidental, the unseen and the unpredicted." According to New York Times journalist Gregg Easterbrook, Taleb argues that "Experts are charlatans who believe in bell curves, in which most distribution is toward the center — ordinary and knowable. Far more powerful, Taleb argues, are the wild outcomes of fractal geometry, in which anything can happen overnight." Taleb says that "What matters can't be forecast and what can be forecast doesn't matter. Believing otherwise lulls us into a false sense of security." Acknowledging the argument, Tetlock says that "The black swan is therefore a brilliant metaphor for an event so far outside experience we can't even imagine it until it happens."

Case in point: if we do some first-order, back-of-the-envelope calculations, we know that in 2021 the press reported on some 5,000 successful cyber attacks on U.S. companies. We also know that there are roughly six million commercial companies in the United States. Doing the outside-in forecast, a U.S. company had about a 5,000-in-6,000,000 chance of getting breached in 2021, or ~0.0008. That's a really small number. By Taleb's definition, though, the experiences of those 5,000 companies were black swan events: significant, impactful events that were not very likely to happen at all. 
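As a quick check on that outside-in arithmetic, here is a tiny sketch using the review's round numbers. Both figures are the review's approximations, not precise data, and the result is only the general-case starting point before you adjust inside-out for your own organization's specifics.

```python
# Outside-in base rate from the review's rough 2021 numbers (approximations only).
reported_breaches = 5_000     # press-reported successful attacks on U.S. companies in 2021
us_companies = 6_000_000      # rough count of commercial companies in the United States

base_rate = reported_breaches / us_companies
print(f"Outside-in base rate: {base_rate:.4f} (~{base_rate:.2%})")  # 0.0008 (~0.08%)
```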

Tetlock's response to Taleb is that there is probably a set of problems that are too hard to forecast, but he says that is largely because the forecasting horizon is too long. For example, it's tough to forecast who will win the U.S. Presidential election in 2028 (six years from the time of this writing), but you could do well with the U.S. Congressional elections in 2022 (three months out).

That said, Taleb's solution to black swan events is not to try to prevent them but to try to survive them. He says resilience is the key. For example, instead of trying to prevent a giant meteor from hitting the earth, the question is how you would survive one. In the cybersecurity context, instead of preventing Panda Bear from breaching your organization, what would you do to ensure that your organization continues to deliver its service during and after the attack? 

The Cybersecurity Canon Project is full of books that talk about how to calculate cyber risk.

  • "How to Measure Anything in Cybersecurity Risk," by Douglas W. Hubbard and Richard Seiersen
  • "Measuring and Managing Information Risk: A Fair Approach," by Jack Freund and Jack Jones
  • "Security Metrics: A Beginner’s Guide," by Caroline Wong
  • "Security Metrics: Replacing Fear, Uncertainty, and Doubt," by Andrew Jaquith

They are fantastic primers for how to think about probability and risk in a cybersecurity context. The problem with each, though, is that I kept waiting to get to the last chapter titled, "And This is How You Put It All Together." Alas, none of them have that chapter and I'm on the lookout for the book that has it.

That said, this book, "Superforecasting," is the first book you should read on the subject, and it's a must-read for all cybersecurity practitioners. It will do more to make you a better cyber risk forecaster than any other book I have come across.

Source

"Superforecasting: The Art and Science of Prediction,” by Philip E. Tetlock and Dan Gardner, 29 September 2015, Crown.

References

"Author Interview: 'Security Metrics: A Beginner’s Guide’ Review'," by Rick Howard, The Cyberwire,  the Cybersecurity Canon Project, Ohio State University, 2021.

Book Review: 'How to Measure Anything in Cybersecurity Risk," by Steve Winterfeld, the Cybersecurity Canon Project, Ohio State University, 2021.

Book Review: 'Measuring and Managing Information Risk: A FAIR Approach',” by Ben Rothke, the Cybersecurity Canon Project, Ohio State University, 2021.

Book Review: 'Security Metrics: A Beginner’s Guide’ Review," by Ben Smith, the Cybersecurity Canon Project, Ohio State University, 2021.

"Book Review: 'Security Metrics: Replacing Fear, Uncertainty and Doubt," by Rick Howard, The Cybersecurity Canon Project, Ohio State University, 2021. 

"BOOK REVIEW: SUPERFORECASTING,” BY SCOTT ALEXANDER, Slate Star Codex, 4 February 2016.

Fermi Estimations,” by Bryan Braun, 4 December 2011. 

"How to Measure Anything in Cybersecurity Risk," by Douglas W. Hubbard, Richard Seiersen, Published by Wiley, 25 April  2016. 

"How to predict the future better than anyone else,” By Ana Swanson, 4 January 2016.

"Measuring and Managing Information Risk: A Fair Approach," by Jack Freund and Jack Jones, Published by Butterworth-Heinemann, 22 August 2014. 

"‘Mindware’ and ‘Superforecasting’," By Leonard Mlodinow, 15 October 2015.

"Pundits are regularly outpredicted by people you’ve never heard of. Here’s how to change that,” By Sam Winter-Levy and Jacob Trefethen, The Washington Post, 30 September 2015. 

"Security Metrics: A Beginner’s Guide," by Caroline Wong, Published by McGraw-Hill Companies, 10 November 2011. 

"Security Metrics: Replacing Fear, Uncertainty, and Doubt," by Andrew Jaquith, Published by Addison-Wesley Professional, 1 March 2007. 

Superforecasting: Summary and Review," by HowDo, 16 June 2021. 

"The Black Swan: The Impact of the Highly Improbable," by Nassim Nicholas Taleb, Published by Random House, 17 April 2007.

Zero Dark Thirty Meeting Scene,” YouTube, 1 July 2019. 

We modeled the Cybersecurity Canon after the Baseball and Rock & Roll Halls of Fame, except it's for cybersecurity books. We have more than 25 books on the initial candidate list, but we are soliciting help from the cybersecurity community to grow that list much larger. Please write a review and nominate your favorite.

The Cybersecurity Canon is a real thing for our community. We have designed it so that you can directly participate in the process. Please do so!
