Cybersecurity first principles: risk assessment.
By Rick Howard
Jun 15, 2020

CSO Perspectives is a weekly column and podcast where Rick Howard discusses the ideas, strategies and technologies that senior cybersecurity executives wrestle with on a daily basis.


Note: This is the sixth essay in a planned series that discusses the development of a general-purpose cybersecurity strategy for all network defender practitioners, whether they come from the commercial sector, government enterprise, or academic institutions, using the concept of first principles.

We are building a strategy wall, brick by brick, for an infosec program based on first principles. The foundation of that wall is the ultimate and atomic first principle:

Reduce the probability of material impact to my organization due to a cyber event.

That’s it. Nothing else matters. This simple statement is the pillar on which we can build an entire infosec program. 

The first four bricks we put on that pillar were zero trust, intrusion kill chains, resilience, and DevSecOps. Going back to our atomic first principle statement though, you will notice that it includes the word “probability.” That probability is really the risk assessment statement for our organization. In order to reduce the probability of material impact to my organization due to a cyber event, I have to first calculate what the existing risk probability is.

Probability (risk assessment) is scary.

For me though, “probability” sends chills down my spine. It invokes the nightmares I’m still having, thirty years later, about barely getting through that probability and stats course we all had to take in college. There is a reason I am a computer science guy and not a math guy. Math is hard, and as I remember it, solving probability and stats problems involves a mix of the dark arts, a little voodoo, the sacrifice of a small woodland creature or two, and some division. If somebody mentions probability and stats, there’s a good chance you’ll catch me running at full speed in the opposite direction.

In my early network defender days, whenever somebody asked me to do a risk assessment, I would punt. I would roll out my qualitative heat map risk assessments and my three levels of precision (high, medium, and low) and call it a day. Along with many of my peers, I would tell myself that predicting risk with any more precision was impossible; that there were too many variables; that cybersecurity was somehow different from every other discipline in the world and it couldn’t be done.

We were wrong of course. 

Superforecasting.

The book that changed my mind on the subject was "Superforecasting: The Art and Science of Prediction,” by Philip Tetlock and Dan Gardner. Dr. Tetlock is quite the character. He’s one of those scream-and-shake-your-raised-fist-at-the-tv-because-they-have-no-idea-what-they-are-talking-about people. He would watch news programs like CNN, FOX, and MSNBC where the host would roll out famous pundits to give their opinion on some topic because, once in their lives, they predicted something correctly. It didn’t matter that all the predictions they’d made since then were wrong. The news programs would still bring them on as if they were Moses coming down from the mountain to present their wisdom. Dr. Tetlock thought that they should have to keep score. I always thought that when pundits came on, the viewer should see their batting average rolling across the chyron on the bottom of the screen: “These pundits have made 73 correct predictions out of 1,000 tries in the last year. Maybe you shouldn’t listen too closely to what they have to say.” 

And then Dr. Tetlock decided to test his idea.

Working with IARPA, he devised a test using three control groups: the intelligence community, the academic community, and a group he called the soccer moms. The soccer moms weren’t really soccer moms; they were just regular people with time on their hands who liked to solve puzzles. According to the Washington Post, he then had them forecast answers to some really hard questions like:

  • Will the Syrian President, Bashar Hafez al-Assad, still be in power in six months’ time? 
  • Will there be a military exchange in the South China Sea in the next year? 
  • Will the number of terrorist attacks sponsored by Iran increase within one year of the removal of sanctions? 

Out of the three communities, the soccer moms outperformed the academics and the intelligence community by as much as 30%. And there were a few analysts who outperformed them all by large margins whom Tetlock called the superforecasters. There are many reasons for their success, according to Tetlock, but mostly it comes down to ruthlessly admitting your own cognitive bias and not being afraid to change your mind as new evidence comes in.

The point to all of this is that it is possible to forecast the probability of some future and mind-numbingly complex event with precision. If the soccer moms can accurately predict the future of the Syrian President, surely a bunch of no-math CISOs, like me, can forecast the probability of a material impact due to a cyber event for their organizations.

Cybersecurity risk books and experts to help clarify.

For cybersecurity specifically, I recommend two Cybersecurity Canon Hall of Fame books:

  • Measuring and Managing Information Risk: A FAIR Approach by Jack Freund and Jack Jones
  • How to Measure Anything in Cybersecurity Risk by Douglas W. Hubbard and Richard Seiersen

One key takeaway from both of these books is that the network defender community has a limited understanding of probability, based on their own intro probability and stats course in college. We remember that to get a probability, we have to count things: we count the number of times something happens and then divide it by the total number of times it could have happened. And that is one reason why most network defenders say that evaluating cybersecurity risk is impossible. What things do you count? How do you know how many times those things could have happened? And there are so many things to count in cybersecurity; which things do you choose? While all of these questions are valid to a point, probability is much more than counting. 
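That counting definition of probability, the one we all memorized in college, fits in a few lines of Python. The incident figures below are made-up numbers for illustration, not real breach data:

```python
# The frequentist view of probability: count the times something happened,
# divide by the times it could have happened.
# These numbers are illustrative only, not real incident data.

incidents_observed = 3      # e.g., material breaches observed across peer firms
total_firm_years = 500      # firm-years of observation in the sample

frequentist_estimate = incidents_observed / total_firm_years
print(f"Estimated annual probability: {frequentist_estimate:.3f}")
```

The trouble, as the essay notes, is that in cybersecurity nobody agrees on what to put in the numerator or the denominator, which is exactly why this is not the only legitimate way to state a probability.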

For a more useful description of probability, we should turn to Dr. Ron Howard, the father of decision analysis theory. If the CISO says there is a 20% chance of some bad cyber thing happening next year, Dr. Howard would say that number is the current and most precise mathematical representation of what the CISO knows. He says, 

“Don’t think of probability or uncertainties as the lack of knowledge. Think of them instead as a very detailed description of exactly what you know.” 

And you don’t have to count things to make that assessment.

The right risk question.

From my own experience, I’ve learned that to assess the risk of any organization, you have to ask the right question, and it must have three components. First, there needs to be a precise quantitative probability, not some mushy qualitative guess like high, medium, or low. But a probability of what? You can’t just tell the boss that there is a 20% chance that some bad cyber thing is going to happen. Even though you are using probabilities, that’s still not precise enough. 

What exactly is bad? Not everything in cyberland is important. You don’t have enough resources to protect everything. We should be focusing on what is material to the organization. In the first essay of this first principle series, I took a definition of material from the team at Datamaran that I like:

“A material issue can have a major impact on the financial, economic, reputational, and legal aspects of a company, as well as on the system of internal and external stakeholders of that company.”

But that still isn’t enough. You can’t just tell the boss that there is a 20% chance that some material bad thing is going to happen sometime in the future. Of course that is true. The future is a long time. If you wait long enough, some material bad thing is going to happen. To complete the triad, risk questions have to be time-bound. 
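One way to keep yourself honest about the triad is to treat a risk statement as a record with three required fields. This sketch is my own illustration, not anything from the books above; the class, field names, and example values are invented:

```python
from dataclasses import dataclass

@dataclass
class RiskStatement:
    """A well-formed risk question needs all three components."""
    probability: float     # quantitative, e.g. 0.20 -- not "high/medium/low"
    impact: str            # what counts as material to this organization
    horizon_years: int     # the time bound

    def __str__(self):
        return (f"There is a {self.probability:.0%} chance of "
                f"{self.impact} in the next {self.horizon_years} years.")

# Example values are illustrative only.
stmt = RiskStatement(probability=0.20,
                     impact="a material cyber event",
                     horizon_years=3)
print(stmt)
```

If any of the three fields is missing, you don’t have a risk question the board can act on; you have a vibe.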

First steps.

In the essay, “Metrics and risk: All models are wrong, some are useful,” published in March, I said: 

The question we need to answer for the board, then, is this: what is the probability that a cyber event will materially impact the company in the next three years? Answer that question and then board members can decide if they’re comfortable with that level of risk or if they need to invest in people, process, or technology to reduce it.

How do you answer that seemingly impossible question? Let’s start with a simple but useful model. Let’s poll the infosec team. Using techniques described in the books mentioned in this essay, have the team build consensus on what the probability range is with 95% confidence. In other words, no matter what the upper and lower limits are, they are 95% confident that the actual probability falls within that range. And don’t be concerned if the range is wide. This is just the first step. Like I said, it is a simple but useful model. In future essays in this first principle series, I’ll discuss how we can add complexity to our forecasting models by including Monte Carlo simulations, latency curves, and other improvement techniques designed to narrow the range with confidence. But by taking this first step, you have a probability you can take to the board.
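To make the Monte Carlo idea concrete, here is a minimal sketch of one common approach: treat the team’s 95% confidence range as the 95% interval of a normal distribution and sample from it. The bounds, the normality assumption, and the 30% threshold are all my own illustrative choices, not the essay’s:

```python
import random

# The team's consensus: 95% confident the true probability of a material
# cyber event in the period lies between these bounds (illustrative numbers).
lower, upper = 0.05, 0.35

# Treat the interval as the 95% range of a normal distribution:
# mean at the midpoint, 95% of the mass within +/-1.96 standard deviations.
mean = (lower + upper) / 2
sigma = (upper - lower) / (2 * 1.96)

random.seed(42)
trials = 100_000
# Clamp samples to [0, 1] since probabilities can't leave that range.
samples = [min(max(random.gauss(mean, sigma), 0.0), 1.0)
           for _ in range(trials)]

# One question a board might ask: how often does the forecast exceed 30%?
exceed = sum(s > 0.30 for s in samples) / trials
print(f"Mean forecast: {sum(samples) / trials:.3f}")
print(f"Chance the probability exceeds 30%: {exceed:.1%}")
```

The point is not the specific distribution; it is that a range plus a simulation lets you answer follow-up questions that a single high/medium/low label never could.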

Risk assessment as a first principle 

Our atomic first principle is this:

Reduce the probability of material impact to my organization due to a cyber event.

It goes without saying that if this simple statement about probability is the base for everything we are doing in our infosec program, then we have to have a way to calculate it. Conducting risk assessments, or forecasting probability, becomes an essential strategy on the same level of importance as zero trust, intrusion kill chains, resilience, and DevSecOps. We are placing all of those bricks on that first pillar to give it strength.

What I am advocating for is an expanded understanding of probability, one that goes beyond what you learned in your probability and stats course back in college. Take the lead from Dr. Ron Howard and determine a metric, a probability, that precisely explains to the board what you do know about the security posture of your organization, something more meaningful than a squishy high, medium, or low guess. Consider that risk assessments are nothing more than forecasting the probability of something material happening to your organization, and start developing your superforecasters now.

If soccer moms can do it for the Gordian knot of international geopolitics, I think it is possible for us network defenders to do it in support of our infosec first principle wall. Whether we like it or not, we are all going to have to confront our fear of probability and stats and figure this out. So strap in. And you might want to collect a few small woodland animals, just in case.

Recommended reading.

"How to Measure Anything in Cybersecurity Risk,” by Douglas W. Hubbard and Richard Seiersen, Published by Wiley, 25 July 2016, Last Visited 30 March 2020. 

Materiality in a nutshell,” by datamaran, Last Visited 30 April 2020. 

"Measuring and Managing Information Risk: A Fair Approach,” by Jack Freund and Jack Jones, Published by Butterworth-Heinemann, January 2014, Last Visited 30 March 2020.

Metrics and risk: All models are wrong, some are useful,” By Rick Howard, CSO Perspectives, the CyberWire, 30 March 2020, Last Visited 30 June 2020.

"Pundits are regularly outpredicted by people you’ve never heard of. Here’s how to change that,” By Sam Winter-Levy and Jacob Trefethen, The Washington Post, 30 September 2015, Last Visited 30 June 2020. 

Super Prognostication II: Risk Assessment Prognostication in the 21st Century,” by Rick Howard and Dave Caswell, 2019 RSA Conference, 6 March 2019, Last Visited 30 March 2020.

"Superforecasting: Even You Can Perform High-Precision Risk Assessments,” By Rick Howard, David Caswell, and Richard Seiersen, Edited by Deirdre Beard and Benjamin Collar. 

"Superforecasting: The Art and Science of Prediction,” by Philip E. Tetlock and Dan Gardner, 29 September 2015, Crown, Last Visited 30 June 2020.

"The Cybersecurity Canon – How to Measure Anything: Finding the Value of ‘Intangibles’ in Business,” Book Review by Rick Howard, Cybersecurity Canon Project, Palo Alto Networks, 19 July 2017, Last Visited 30 March 2020. 

"The Cybersecurity Canon: How to Measure Anything in Cybersecurity Risk,” Book Review By Steve Winterfeld, Cybersecurity Canon Project, Cybersecurity Canon Hall of Fame Winner, Palo Alto Networks, 2 December 2016, Last Visited 30 March 2020.

"The Cybersecurity Canon: Measuring and Managing Information Risk: A FAIR Approach,” Book Review by Ben Rothke, Cybersecurity Canon Project, Cybersecurity Canon Hall of Fame Winner, Palo Alto Networks, 10 September 2017, Last Visited 30 March 2020. 

The Foundations of Decision Analysis Revisited,” by Ronald Howard, Chapter 3, 060520 V10, last visited 20190117.