CSO Perspectives (Pro) 8.21.23
Ep 109 | 8.21.23

Cybersecurity risk forecasting.

Transcript

Ted Wagner: A risk model like the FAIR model, with the added benefit of quantification using revenue and vulnerability data, enables business leaders and security professionals to start speaking the same language, which is really the key, and that language is risk management. We can really have an impact on business decisions and on how we reduce risk within a business context.

Richard Fazzini: We're going to look back on the cybersecurity industry and say to ourselves, I can't believe we were deploying all this technology with no understanding of how it could effectively manage risk to a business.

Myrna Soto: Cyber risk has been a significant challenge for corporate boards because there is a lack of quantifiable metrics that can be equated to the concept of integrated risk management.

Bob Zukis: Once they have the ability to put this in economic terms, then it changes the game in terms of how they start to mitigate these risks. Not every firm is doing this, the vast majority are not yet, but the leaders are. They're understanding --

Kevin Richards: Without a financial representation of cyber, we're not doing our job.

Rick Howard: That was a collection of quotes taken from veteran cybersecurity thought leaders in order of appearance. Ted Wagner, our own CyberWire hash table subject matter expert, and Richard Fazzini, Myrna Soto, Bob Zukis, and Kevin Richards, all taken from the Cyber Risk Solutions website earlier this year.

Unidentified person: Oh yeah!

[ SOUNDBITE OF AUDIENCE LAUGHING ]

Rick Howard: And fans of this show know that I have been trying to get my hands around how to calculate cyber risk for over a decade now. I read all the best books on the subject: Superforecasting: The Art and Science of Prediction, by Tetlock and Gardner; How to Measure Anything in Cybersecurity Risk, by Hubbard and Seiersen; and Measuring and Managing Information Risk: A FAIR Approach, by Freund and Jones, all Cybersecurity Canon Hall of Fame inductees. I've interviewed most of the authors for either the Cybersecurity Canon Project or the CyberWire, and some of them are friends of mine, Richard Seiersen and Jack Freund. Richard and I even presented together on the subject at the RSA Conference a few years back, and Jack reviewed the chapter on risk in my book, Cybersecurity First Principles: A Reboot of Strategy and Tactics, available now on Amazon or wherever you get your books. And up to now, I felt like we were all just a bunch of rebels shouting into the wind and not gaining much traction. But I think that's starting to change. It feels like the InfoSec community is beginning to move in our direction. My indicator for this positive change is that I'm starting to see security vendors incorporate some of these ideas into their products. Specifically, I found two of them, Cyber Risk Solutions and ProcessUnity. So, hold on to your butts.

Arnold from "Jurassic Park": Hold on to your butts.

Rick Howard: I'm going to talk to these vendors to see what's driving the change. My name is Rick Howard, and I'm broadcasting from N2K Cyber's secret Sanctum Sanctorum studios, located underwater somewhere along the Patapsco River near Baltimore Harbor, Maryland, in the good old U.S. of A. And you're listening to "CSO Perspectives", my podcast about the ideas, strategies, and technologies that senior security executives wrestle with on a daily basis. I talked to Fred Kneip early this summer, 2023. He is the founder and former CEO of a company called CyberGRX, a security vendor that uses many of the risk forecasting ideas I've talked about on this show and explained in my book, like superforecasting techniques, Fermi estimates, and Bayes' algorithm, to help its customers assess third-party risk. I said former CEO because, just as we were doing this interview, Fred was putting the finishing touches on a merger with another company called ProcessUnity, where, as of this broadcast, Fred is the new president. The merger combines ProcessUnity's third-party risk management platform with CyberGRX's global risk exchange. The problem that ProcessUnity solves for its customers is streamlining third-party vendor risk assessment. For example, at N2K, we have hundreds of sponsors that buy ad packages for our shows, like AWS, Rippling, and Expensify, just to name three. All of them are potential material cyber risks. How do I, as the CSO, evaluate each of them in terms of material cyber risk to N2K? On the flip side, N2K sells subscription services, in the form of CyberVista training packages and CyberWire Pro services, to enterprise customers around the world. How do their CSOs evaluate whether N2K is a material cyber risk to their organizations?

Fred Kneip: So CyberGRX is a third-party cyber risk management platform built on the concept of a one-to-many exchange. What we're recognizing is that companies are part of a growing ecosystem of vendors and suppliers they rely upon to deliver their core offering, and no one has the capacity to go out and evaluate the risk that exists across that whole ecosystem. And historical approaches, like sending out a bunch of questionnaires, are just not scalable. So why don't we do that once, in a high-quality, thorough way? Then that data resides in our exchange, where it can be shared or accessed multiple times. That's the one-to-many exchange. GRX stands for Global Risk Exchange.

Rick Howard: So give me some detail about how you do this. A company like, let's say, the CyberWire, that's where I work, they fill out the information, and then what happens?

Fred Kneip: Yeah, so think of it as there being two parties to any of these interactions. There are those who are consuming the data, typically the customer, and those who are providing the data, typically the service providers. And interestingly, almost everyone out there is both, so it's a bidirectional concept in that sense. Take CyberWire, for example. Your customers would say, okay, I need to understand your cybersecurity posture; we're going to send you questionnaires. Or alternatively, if you're on the exchange, I can access that data. It's in a standard, structured format that I know how to read quickly, and you can respond very quickly. It actually accelerates your deal cycle, interestingly. Then you flip to the other side: you have third parties you rely upon, like the video conference service you're using today, or others where you think, hey, wait a minute, I might be putting sensitive information there; I should do some risk assessment myself. So you can actually play on both sides of the exchange, and many of our largest customers are bidirectional.

Rick Howard: What you guys are doing there is interesting. It kind of fits in the middle of what I understand risk forecasting is. There's this concept of outside-in analysis and inside-out analysis. And outside-in, you correct me if I'm wrong, is just what is the generic risk of doing something, anything. And then inside-out is all the things you're going to add more evidence based on how the individual company protects itself. But you guys are kind of in the middle there, right? You're doing an outside-in but with some very specific data.

Fred Kneip: We're pairing up inside data, provided by other customers who tell us how they use these third parties, with the outside view of, okay, here's what we can see about this company. But that's just the first step. That's the inherent risk. The real magic, or the thing we're really most proud of, is what we call our residual risk forecasting. Think about the same scalability problem. A customer comes to us, we go through the inherent risk mapping, and they say, great, I've got 500 critical third parties I want to focus on. Well, only 200 of them are on our exchange already. Those are the ones you'd know, AWS, Google, Azure, whatever it might be. The other 300 now have to go through the process of joining our exchange so we can get all that data. Well, that takes time. Sometimes it's easy, sometimes it's not.

Rick Howard: Yeah.

Fred Kneip: So in the interim, what we've done is we've said, okay, let's try and predict how those other 300, or whatever number of companies, are going to respond to the standard CyberGRX questionnaire. And we're going to use our 14,000 completed assessments to build that prediction. In a very simple model, what we do is take a new company you've given us, whatever company X is, and find all the externally visible data that we can. So to your point on externally visible: how many employees do they have? What's their revenue? What industry are they in? What region are they in? What does our partner RiskRecon say about them? What does our partner Recorded Future see on the dark web? Do we see evidence of potential leakage, et cetera? We bring all that data together and build what we call an external profile. Then we go back to our 14,000 companies and ask, who else has a profile that looks as similar as possible to this one? We build a cohort of maybe 100 to 200 companies that look as similar as possible, and we look at how they responded to each individual question on our questionnaire. Sometimes they all answer the same way, which gives us pretty high confidence that, okay, we're going to be able to predict that this company will answer the same way. And sometimes they're all over the place, and we're like, all right, we have no idea how they're going to answer it. So we'll have varying levels of confidence in our ability to forecast answers to every single question in the questionnaire. We've gotten up to 91% predictive accuracy. On average it's probably around 80%, but in certain sectors we can get over 90%, which is pretty powerful. You give me 1,000 third parties, and I can tell you with 90% confidence how they're all going to respond to a couple hundred questions about their cybersecurity profile. That can be a very powerful tool.
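Fred's cohort method can be sketched in a few lines of Python. Everything below is invented for illustration: the company names, two-feature profiles, questionnaire items, and the simple distance metric are stand-ins for the much richer external data (industry, region, RiskRecon and Recorded Future signals) he describes.

```python
from collections import Counter

# Hypothetical assessed companies: an external profile of
# (employees, revenue in $M) plus their known questionnaire answers.
assessed = {
    "AlphaCo":  {"profile": (1200, 300), "answers": {"q1_mfa": "yes", "q2_backups": "yes"}},
    "BetaCorp": {"profile": (1000, 250), "answers": {"q1_mfa": "yes", "q2_backups": "no"}},
    "GammaInc": {"profile": (900, 220),  "answers": {"q1_mfa": "yes", "q2_backups": "yes"}},
    "DeltaLLC": {"profile": (5000, 900), "answers": {"q1_mfa": "no",  "q2_backups": "yes"}},
}

def predict_answers(profile, k=3):
    """Predict questionnaire answers for an unassessed company from its
    k most similar assessed peers, with a per-question confidence score."""
    # Rank assessed companies by squared Euclidean distance between profiles.
    cohort = sorted(
        assessed.values(),
        key=lambda c: sum((a - b) ** 2 for a, b in zip(c["profile"], profile)),
    )[:k]
    predictions = {}
    for question in cohort[0]["answers"]:
        votes = Counter(c["answers"][question] for c in cohort)
        answer, count = votes.most_common(1)[0]
        # Confidence is simply the fraction of the cohort that agrees.
        predictions[question] = (answer, count / len(cohort))
    return predictions

# A new company that resembles the three smaller firms on the exchange.
print(predict_answers((1100, 280)))
```

When the cohort is unanimous the confidence is 1.0, and when the cohort splits the confidence drops, which mirrors the varying-confidence behavior Fred describes.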

Rick Howard: Well, the big epiphany I've had this last year is, you know, it used to be when I was trying to calculate risk, most security practitioners like us would have said, you know, we need all this precision. We need to know exactly what the number is. And what I've learned is that is not the case at all. We just need a good enough answer. So to your point, when you guys bring in a company, you're going to guess, and it's going to be pretty accurate most of the time, even if it's off a little bit, it's probably close enough to make the decisions that the security team is making about that company.

Fred Kneip: And that conflict is there, that precision-versus-speed concept. Everyone wants to say, I want every single detail updated daily on this company; that's as good as it gets. But if you have 10,000 third parties, that's incredibly expensive and impossible to deliver. So let's use the inherent risk to take your 10,000 and focus on the top 500. Then we can use our predictive risk profiles to help you identify, here are the 10 that you might want to really dig into, because they can really hurt you and we're nervous about them. Risk management is not about risk elimination.

Rick Howard: Yeah.

Fred Kneip: It's about focusing, prioritizing, where do I get the greatest risk reduction for dollar invested? And we help people streamline and focus their time and their energy to where we think the greatest reduction happens.

Rick Howard: You all know that I'm a giant fan of the MITRE ATT&CK framework. We've done several episodes of this podcast on it, and it features prominently in the Intrusion Kill Chain Prevention chapter, Chapter 4, of my book. My fantasy world is that I want the tools in my deployed security stack to forecast the likelihood that any of the 150-plus nation-state attack sequences tracked in the ATT&CK wiki, like Ferocious Kitten or Mustang Panda or Nomadic Octopus, are in my network. That should be simple enough, right? We know the attack sequences of each of these campaigns, the specific tactics, techniques, and procedures that each campaign uses across the intrusion kill chain. We should know the exact prevention and detection controls designed and deployed in the security stack to defeat each of the campaigns. If Nomadic Octopus has, say, 100 steps in its attack campaign, and the security stack is alerting on one of the steps, then it's likely not the hackers behind Nomadic Octopus. But if the security stack is alerting on 80 of the 100 Nomadic Octopus steps, then it's extremely likely you have the Nomadic Octopus hackers inside your network. Here's Fred.

Fred Kneip: And one of the things a standard data set enables us to do, to answer your question even further, is go beyond just saying, okay, here are the controls they have and don't have, good luck. They have SSO and they have backups. Good. What we've done is we've taken the MITRE ATT&CK framework and mapped it to our question set, so we can evaluate all these known attack paths. Do they have the controls in place that could have stopped each of those attacks? We can do a kill chain analysis of, here's how that attack happened. Nope, they could have stopped it right at the inception. This one would have gotten through, but, you know, segmentation of the network would have stopped them here, whatever it might be. And so you can map our controls to each of those different steps in a kill chain and identify where there is potential risk and what the most commonly exploited gaps are, et cetera, in an industry. So we can find, okay, here are the controls they have or don't have, but here are the ones that really matter.
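The coverage idea Rick and Fred are circling can be sketched very simply: score each tracked campaign by the fraction of its known techniques the security stack is currently alerting on. The technique IDs below are placeholders, not the real TTP lists for these campaigns.

```python
# Hypothetical ATT&CK technique IDs per campaign, for illustration only.
campaign_ttps = {
    "Nomadic Octopus": {"T1059", "T1071", "T1105", "T1566"},
    "Mustang Panda":   {"T1047", "T1071", "T1204", "T1566"},
}

def campaign_likelihood(alerting_ttps):
    """Score each tracked campaign by the fraction of its known
    techniques currently firing in the security stack."""
    return {
        name: len(ttps & alerting_ttps) / len(ttps)
        for name, ttps in campaign_ttps.items()
    }

# The stack is alerting on three techniques today.
scores = campaign_likelihood({"T1059", "T1071", "T1105"})
print(scores)  # Nomadic Octopus scores 0.75, Mustang Panda 0.25
```

A high score (Rick's 80-of-100 case) suggests a specific campaign is likely active; a low score (alerting on one step) suggests the overlap is coincidental. A production version would also weight techniques by how distinctive they are, since many campaigns share common TTPs like phishing.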

Rick Howard: I'm so happy that you're talking about the MITRE ATT&CK framework. Not enough of us use it as a tool to defend our networks. Most of us are doing the passive defense kind of things. You know, we put controls in place for any kind of generic adversary, but hardly any of us, and there are lots of reasons for it, put specific controls in for, let's say, Wicked Spider, because, I don't know, it's hard somehow. But what's clear to me is that we know how Wicked Spider operates across the kill chain, and if we know that, why wouldn't you put those controls in place?

Fred Kneip: As new threats evolve, what we can do is we can then map them out. Either MITRE maps them and we'll just take that from them or we'll do it ourselves. We have a team that focuses on that. And then you can now go back and look at your ecosystem, if that's 1, 10, 100, 1,000 third parties, say, who of these is potentially susceptible to this attack? And let me index that to my inherent risk. Who do I care about and is susceptible to this attack? And so one of the other major issues that occurs in third-party risk management is that point-in-time conflict. I did an assessment of them back in March. I never thought about this type of attack, and now that's the number one ransomware attack that's happening. Well, our data is dynamic. It's on the exchange. It's constantly being updated by our customers. And you can go back and --

Rick Howard: Bayes' algorithm is the underlying math theory that makes superforecasting techniques work, and it's designed to handle these dynamics. To put it into terms that CSOs can understand, you start out with a basic guess about the probability of material impact due to a cyber event in the near future. Let's say your first guess is 20%. Over time, you gather more and more evidence, either about the outside security landscape, like a new log4j type of attack in the wild, or about an improvement to your internal security stack, like maybe you just implemented single sign-on for the entire organization. And then you adjust the initial forecast up or down depending on what happens. Bayes' algorithm allows you to make continuous adjustments to your risk forecast as your specific situation changes.
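The mechanics of that update can be written down directly. The likelihood numbers below are invented; the point is how a 20% prior moves up with bad outside news and back down after an internal improvement.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Update a probability with one piece of evidence via Bayes' rule."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

# Start with the initial 20% forecast of material impact.
p = 0.20

# Evidence 1: a new log4j-style exploit is active in the wild. Assume
# (invented numbers) we'd see this 60% of the time in years where we
# suffer material impact, and 30% of the time otherwise.
p = bayes_update(p, 0.6, 0.3)

# Evidence 2: we rolled out single sign-on org-wide. Treat completed SSO
# as evidence that lowers the relative odds of impact (again invented).
p = bayes_update(p, 0.3, 0.5)

print(round(p, 3))  # → 0.231
```

The forecast rises to about 33% after the exploit news and settles back near 23% after the SSO rollout, which is exactly the continuous-adjustment behavior described above.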

Fred Kneip: And it goes even -- maybe I got a little weedy on you here, but we all go back to the inherent risk questions of how do you use this company? If you use them, you need constant uptime, then actually you're pretty worried about a DDoS attack because if that company goes down, you have a problem. So we'll actually overweight DDoS-relevant controls if you say that that's important to you. I don't care if they crash. It's a law firm. I don't care if their website crashes. But man, if that data gets out, the data loss, now I'm very focused. Okay, we'll overweight those controls. And so we actually prioritize the risks or the areas of concern based on how you tell us you use that third party. It's not just a generic ABC score. It's actually this is how it scores for you and your use case.

Rick Howard: So let me double down on that. The people that submit their info to your platform, they can say, you know, we've updated the Wicked Spider control set. Instead of 100 controls, we now have 150. And they can just feed it in automatically. So when they interact with some customer down the line, that customer is getting the up-to-date controls.

Fred Kneip: The more commonly someone is sharing their CyberDirects assessment, the more incentive they have to keep it up-to-date and as fresh as possible. Because, you know, I take ADP as an example. They're sharing it every day. And so they want that to be reflective of their security. And people typically are improving their security versus reducing it. So they want to keep that data fresh. The way a CyberDirects platform works is a third party, the provider of data, can update their assessment at any time. I can finish it. And like a week after I did it the first time, I put in a whole new endpoint security program. I can log back in. I can update those controls. They'll revalidate. And now I present as such. And it's a new data set.

Kevin Richards: Kevin Richards, I'm the president of CyberRisk Solutions.

Rick Howard: I met Kevin this past summer, 2023, at the ChiCyberCon conference in Chicago. He gave a talk on the current state of CRQ, Cyber Risk Quantification, and how his company can help. I asked him if he agreed with me that CRQ was just starting to gain traction in the InfoSec community.

Kevin Richards: Yeah, I mean, it only took them 30 years. But that's a fantastic thing. It's interesting, there are a number of people talking about it. And then I'm sure you saw there was a blurb that Gartner put out that said basically that a lot of people are going to spend a lot of time on this and they're going to fail miserably and get frustrated and quit. And then there was a swell on LinkedIn that basically said CRQ is dead.

Rick Howard: I think Kevin is saying a couple of things here. First, Gartner is probably right that a lot of InfoSec practitioners will get very excited about this initially and then learn down the line that it's a lot harder to do than they thought. Kind of similar to the Gartner hype cycle, where we all get excited about some new idea and climb the peak of inflated expectations, but then fall into the trough of disillusionment as the initial hype subsides. Second, the claim that cyber risk quantification, CRQ, is dead is also true in the sense that the way most of us have been doing this for the past 30 years, with qualitative risk assessments in the form of heatmaps, should have been dead many years ago, or at the very least not used as the primary means to present risk to the business. There are many ways to collect evidence in the superforecasting kind of way and have it inform your internal Bayes' algorithm to update your risk forecast. And Kevin has a great analogy for this, an analogy that explains the purpose of doing CRQ in the first place: what's the best way to reduce the risk of material impact to your organization? Here's Kevin.

Kevin Richards: I was trying to think of a good analogy and I was trying to explain this to my mother-in-law who is 87 years old. I came up with this analogy and, Rick, I'm curious of your thoughts on it, because CRQ to me, the analogy is trying to make a fire. And there's a lot of ways you can make a fire. I could do the -- I've got a stick, you know, with a piece of wood. I could go out and grab a torch and just use the propane torch. There's a spark with the flint. All of them can be used to start a fire. So there's a lot of approaches that can be used to come up with a number. And then I started thinking about, well, what's the purpose of the fire? Am I trying to keep me warm? Am I trying to cook food? There's an edge where fire turns from constructive to destructive. I mean, you can look at this analogy and then it's like, well, what is it we really want to accomplish with these numbers? This is the point of your first principles, right? This is number one.

Rick Howard: Yeah.

Kevin Richards: You know, we've been doing something very poorly, because we keep throwing money at it and our losses keep going up. So what we're looking for out of this financial analysis is, how do I make better decisions on how and where I spend? And thinking about, if I do spend on that thing, whether it's, you know, zero trust architecture, or whole disk encryption, or a next-gen endpoint stack, does it actually move the needle? Does it reduce my exposure? Does it make my financial footprint smaller? Does it protect my balance sheet better? And that's really what we're trying to look at, which is the outcome, not necessarily how I created the fire.

Rick Howard: Like most security practitioners, when we think about probability and risk, we think we need to measure all the things with high precision, and that's one way to start a fire, okay?

Kevin Richards: Right.

Rick Howard: But it's a really hard way to do it.

Kevin Richards: Right. That's taking the stick and the piece of wood and spinning it for an hour and a half and hoping that it starts on fire.

Rick Howard: That's exactly right. Yeah, and my big epiphany last year, when I was talking to my boss about this, was, you know, I went up to him and I said, we think our risk for ransomware, or whatever it was we were talking about, is about 20% in the next three years. And my point in telling him that was that I wanted him to dedicate resources to really get in there and dive in, spend a year of his time, the IT team's time, and everybody else's time collecting all the data and doing a real deep-dive assessment. And the question he immediately asked after I put forward all that was, what do you think the difference is going to be after that work? From the initial assessment you just gave me, after a year of doing that deep dive, how many points are you going to be off? And I went, oh, well, maybe two or three probably, maybe ten. He goes, I don't need it.

Kevin Richards: Right.

Rick Howard: He said, okay, I can make a decision with what you told me right now. So that was my epiphany.

Kevin Richards: Exactly. And that's really it. And I think most people that are getting -- I don't know, I love your camera or whatever because my gray hair seems to have gone away, which is fantastic. But, you know, we've been around so long, and we understand how things were built and how it's made, and the engineering part of our background kicks in where we want that nuance and we want to get all of that detail. And not to say that it doesn't matter, because it does absolutely matter. But for the purpose that, you know, you're talking about and the way we talk about it, if I walk in and say you've got a 20% chance of some ransomware happening over the next 12 months, that's good enough for the opening salvo. Now, when we talk about it internally, we take it one more step than that, because the duration of the ransomware matters or the number of records lost matters. If I lose 100 records, that's bad but manageable. If I lost 100 million records, well, that's a whole different conversation.
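Kevin's point that magnitude matters as much as probability can be made concrete with a toy expected-loss calculation. The event probability comes from the conversation above; the severity split and per-record cost are invented for illustration.

```python
# A severity-aware view of a single forecast: the 20% chance of a
# ransomware event is split across outcome sizes, because losing 100
# records and losing 100 million records are different conversations.
P_EVENT = 0.20
COST_PER_RECORD = 150  # dollars; an invented rule-of-thumb figure

# Given an event happens, how likely is each magnitude of loss? (invented)
severity_given_event = {100: 0.70, 100_000: 0.25, 100_000_000: 0.05}

expected_loss = sum(
    P_EVENT * p * records * COST_PER_RECORD
    for records, p in severity_given_event.items()
)
print(f"${expected_loss:,.0f}")  # prints "$150,752,100"
```

Notice that the rare 100-million-record tail dominates the number, which is why the duration and record count of an incident, not just its probability, drive the board conversation.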

Rick Howard: I try to frame that in terms of what's material to the business. Losing 10 records, like you said, probably not material. Losing all the records, that might be material. And it's dependent on your organization, right?

Kevin Richards: Right. And that definition of materiality is dynamic and temporal and amorphous. So, you know, we've been talking about this a lot, and I'm sure you internally and then with all of your community that you talk to, we've got these pending SEC cyber transparency and reporting requirements that are out any day now.

Rick Howard: Kevin and I did this interview a few weeks before Commissioner Jaime Lizarraga of the United States Securities and Exchange Commission, the SEC, made the announcement on 26 July 2023 about public company requirements for both incident reporting and governance regarding material cyber events that could affect a company's valuation and profitability, such as intellectual property loss, business interruption, increased cost of capital, or reputational damage.

Kevin Richards: And they've said if an event is material, then you must disclose it on an 8-K within four days, and all the associated components of that. Nowhere does the SEC really define what material is. I did some research on this. I spoke at a conference a couple weeks ago. The overriding definition of materiality was written by Thurgood Marshall on the Supreme Court in, like, the late '70s.

Rick Howard: Get out, really?

Kevin Richards: Yeah, it's wonderful. And basically what it says is this is information that a reasonable investor would find important in making an investment decision.

Rick Howard: That still seems a little foggy.

Kevin Richards: A little bit. That's like a 100,000-foot view of, you know, it's bad.

Rick Howard: According to the Harvard Law School Forum on Corporate Governance, the landmark judicial definition of materiality was crafted by Supreme Court Justice Thurgood Marshall in 1976. Yes, that Thurgood Marshall. For you youngsters out there, he was the guy that argued and won Brown v. Board of Education, which helped end segregation in U.S. public schools. He founded the NAACP Legal Defense and Educational Fund, and he was the first African-American Supreme Court Justice, serving for 24 years. He's a big deal, a true American, and one of the country's champions for civil rights. He wrote in TSC Industries v. Northway that a fact is material if there is a substantial likelihood that a reasonable shareholder would consider it important in deciding how to vote, or a substantial likelihood that the disclosure of the omitted fact would have been viewed by the reasonable investor as having significantly altered the total mix of information made available. Whew, that is a broad definition. Kevin has been at X-Analytics now for almost three years, helping his customers find a better way to assess, monitor, design, manage, and communicate cyber risk strategies. But he's been in the industry as a consultant doing similar things for a long time. So I asked him to describe his view of the current state of the industry in reporting cyber risk.

Kevin Richards: It's both great and terrifying. When I talk to corporate boards, and I spend a lot of time with the National Association of Corporate Directors and the Digital Directors Network, cyber is in the top three of the agenda for every one of those boards, whether it's at the full board or in a risk committee or an audit committee. So it's very, very top of mind. But right next to that, there was a recent study where the National Association of Corporate Directors polled their 26,000 director members. Seventy-three percent of the respondents said they did not understand what the CSOs were telling them. They knew it sounded important, but they didn't understand what they were actually being told. They couldn't put it in the right context next to other enterprise risks. And so there's still some confusion, not on the buzzwords. I mean, they've heard of ransomware and data breach, and they know good/bad, but they don't know how good or how bad. If I can put it to someone in a financial context, for example: you've got $118 million of unaddressed cyber risk.

Rick Howard: Yeah.

Kevin Richards: Okay, now I can instantly understand.

Rick Howard: I'm going to say you're describing the future state, which we're not at yet, because people like me, CSOs, we've struggled with this. Okay, we don't get it. I've gotten away with creating a heatmap and saying, really scary stuff high and to the right, please give me money and I'll try to fix it. And sometimes that works and sometimes it doesn't. But we never give the leadership a chance to evaluate whether or not that's within their tolerance. What you're describing, the future state, is giving them a number and saying, are you okay with that? You know, there's a 20% chance that you're going to lose $100 million this year. And if you're Amazon, okay, I can live with that. But if you're CyberWire, that would destroy us.

Kevin Richards: And that's really the point of the conversation, which is, you know, so you say what's the state of the board understanding? First you have to give it to them in a vocabulary that they can understand, and then you have to engage in that conversation in a way that those leaders engage in every business risk. The point of this is not to drive the risk to zero.

Rick Howard: No

Kevin Richards: Because it can't be done unless we decide to turn everything off, which is a different problem. So how do we get this in a construct, just like we think of everything else? If I'm a manufacturer and I decide to go put a brand new billion-dollar plant in the middle of a country that has political unrest, there's a billion-dollar exposure right there. Now, they might feel like they're going to make $5 billion the first year or $10 billion the next year, so they make that risk-reward decision all the time. So this isn't about the absence of risk. It's about giving it in a way that's actually actionable. That's a bad way of saying it, but making it an actionable decision.

Rick Howard: What we're doing here is weighing options to buy down risk. My good friend Joe O'Brien, the founder and principal of Alchemy, a consulting company, coined the phrase for me this summer, and I told him that I was totally stealing it for my own use. If, indeed, the ultimate cybersecurity first principle is reducing the probability of material impact due to a cyber event to your organization in the next few years, there are a number of strategies you could choose that logically follow, like zero trust, intrusion kill chain prevention, resilience, automation, or workforce development. But the way you prioritize the strategies you choose to implement is by forecasting the potential risk reduction for each, along with the costs associated with deployment, and choosing the option that delivers the biggest bang for the buck.
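Once you have forecasts, that prioritization is a simple ranking: risk reduction per dollar spent. The strategy names come from the list above; the reduction and cost figures are invented for illustration.

```python
# Forecast risk reduction (in probability, e.g. 0.08 = 8 percentage
# points) and deployment cost per strategy. All figures are invented.
strategies = {
    "zero trust":                      {"risk_reduction": 0.08, "cost": 2_000_000},
    "intrusion kill chain prevention": {"risk_reduction": 0.06, "cost": 750_000},
    "resilience":                      {"risk_reduction": 0.05, "cost": 400_000},
    "automation":                      {"risk_reduction": 0.03, "cost": 250_000},
}

def rank_by_bang_for_buck(options):
    """Order strategies by forecast risk reduction per dollar spent."""
    return sorted(
        options.items(),
        key=lambda kv: kv[1]["risk_reduction"] / kv[1]["cost"],
        reverse=True,
    )

for name, o in rank_by_bang_for_buck(strategies):
    per_million = o["risk_reduction"] * 100 / o["cost"] * 1_000_000
    print(f"{name}: {per_million:.1f} points of risk bought down per $1M")
```

With these made-up numbers, resilience tops the list even though zero trust buys down the most risk in absolute terms, which is exactly the cheaper-with-better-ROI trade-off Rick raises a few lines later.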

Kevin Richards: So the exercise that we walk a lot of people through is we start thinking about the digital value that's at play. And the idea here is to start from the top down as opposed from the bottom up. And there are things that move the needle a whole lot more than others. You know, I think about data protection strategies. I think about encryption technologies. A boring one, asset inventory, vulnerability management.

Rick Howard: I agree with you on this. So, like, we could roll out a zero trust program, let's say, and it might be extremely expensive. We'll buy down risk with that. Or we can look at another strategy, resilience, abandon zero trust and say, we're just going to try to survive it. And we can now, if we do this risk forecasting correctly, we can say, well, you know, resilience looks like it's a lot cheaper and we get a better return on investment for that. Right? So we can make those kind of decisions now or at least present them to the board.

Kevin Richards: As I talk to the business leaders, you can see the lights turn on. You know, you can see their eyes get wide. They're like, finally. The biggest obstacle to risk forecasting is cyber practitioners.

Rick Howard: That's so true. That is so true, yeah. Okay.

Kevin Richards: You know, the business operators, they want this. They're asking for this. They're yearning for it. We did a bit of research on this and it was really, really quite fascinating because it was a beautiful, perfect bell curve. About 25% of the CSOs we talked to were ready today to take these financial numbers to their leadership. Another 25% said they would never show financial numbers to their leadership. And then there was everyone else that was like, I'm probably one to three years away. I need to understand a little bit more so that I can feel comfortable that when I present it, I don't get my lunch handed to me. And it was just a beautiful distribution, and it was three points. There was nothing in between. And the way I viewed that 25% that said we'll never do this: they're going to become dinosaurs, and they will become extinct.

Rick Howard: When I heard you say that at the Chicago ChiCyberCon conference in your presentation, I was one of the guys who raised their hand: I'm not sending these numbers up there yet. But I have a caveat, because I want to spend the next year talking individually with each board member and saying, this is what we're looking at, this is what we're thinking, to get them on board and make sure I can describe it to them before I ever bring it into a group setting and say, here's the direction.

Kevin Richards: So what we're seeing is there is an absolute journey in introducing this the right way to business leadership and boards of directors. It's not like we just flip the switch on and it magically works.

Rick Howard: Exactly.

Kevin Richards: We're seeing that you probably need one to two sets of quarterly conversations, probably one-on-one, or maybe with two people as part of a subcommittee of a committee, to introduce the notion, build the vocabulary, and build confidence in the exchange of the conversation. And by the way, it's not, do I trust the numbers? It's, how am I articulating them, and what do they mean?

Rick Howard: Exactly.

Kevin Richards: Because most people don't understand probabilities and statistics.

Rick Howard: Exactly right. Yeah. Well, I'm one of those people you're talking about because, you know, I took Probability and Stats 101 in college and barely got through it by the skin of my teeth. My understanding of probability is counting or predicting the colored marble coming out of an urn of colored marbles: what's the probability of that? But really, if you think about what we're trying to do here, it's a measure of uncertainty.

Kevin Richards: It is.

Rick Howard: Right? And not, you know, the number of blue marbles coming out of an urn. That is really hard for most CSOs and, I would expect, for a lot of people who don't do this on a day-to-day basis.

Kevin Richards: It is. It is. And once you can get beyond that a little bit, you know, it's not a big leap. It really isn't. It's probably a quarter-turn on the screw, if you think about it in the notion of a Ferrari mechanic.

Rick Howard: Yeah.

Kevin Richards: This isn't about an overhaul of the engine. It's just a little bit of openness, and then you get the aha moment. And I think you got that aha moment when you started looking at the Bayesian superforecasting principles. It actually works. And I don't need to rub a stick against a piece of wood for 17 hours to start the flame.

Rick Howard: Yeah.

Kevin Richards: I can actually just walk over with the propane torch, start the darn fire right there, and be done with it, and then not celebrate the fact that I made fire. Which is important, don't get me wrong. But what I do with the fire is more important. And so if I circle back around to, you know, is CRQ dead? No. There are a lot of ways to make the flame, but that's not the point. The point wasn't to make fire. The point was to cook food, or keep someone warm, or heat up a house, or whatever it was. We need to spend more of our time focusing on, how do I take the fire and make my CSO job easier, better, stronger, faster? And engage the business in the right way to make better cyber risk decisions. And if I can get to fire in two hours, why would I possibly spend 18 months?
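The Bayesian superforecasting approach Kevin alludes to boils down to starting from an outside-in base rate and updating it as evidence arrives. Here's a minimal sketch of the update mechanics; every number in it is a hypothetical assumption, not real breach data:

```python
def bayes_update(prior, p_evidence_given_true, p_evidence_given_false):
    """Return P(material breach | evidence) via Bayes' rule."""
    numerator = p_evidence_given_true * prior
    denominator = numerator + p_evidence_given_false * (1 - prior)
    return numerator / denominator

# Hypothetical outside-in base rate: 15% of peer firms suffered a
# material breach over a comparable window.
forecast = 0.15

# Evidence 1: a pen test finds an exploitable path to crown-jewel data.
# Assume such a finding is three times as likely in firms that go on to
# have a material breach (60%) as in firms that don't (20%).
forecast = bayes_update(forecast, 0.60, 0.20)

# Evidence 2: the org then deploys strong recovery/resilience controls.
# Assume breached firms were less likely to have had them (25% vs 55%).
forecast = bayes_update(forecast, 0.25, 0.55)

print(f"updated forecast of material impact: {forecast:.0%}")
```

Each new piece of evidence nudges the outside-in base rate up or down, which is the "quarter-turn on the screw" rather than an engine overhaul.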

Rick Howard: I'd like to thank Fred Kneip, CyberGRX's founder and president of the newly merged company ProcessUnity, and Kevin Richards, Cyber Risk Solutions president, for coming on the show to help us understand this quickly evolving cybersecurity idea of risk forecasting. It's clear to me from this discussion and from other shows we've done here on this podcast and from the book I wrote that the InfoSec community is trying to find the most advantageous way to buy down risk, not completely eliminate it. At this moment, though, most security executives are struggling with not only how to forecast risk, but also how to present the risk in a way that business leaders can understand. But as Kevin said, the writing is on the wall for those security leaders who don't figure this out. Those who are sticking to qualitative risk assessments in the form of heat maps will be extinct in the very near future. With the U.S. Securities and Exchange Commission, the SEC, establishing new rules about cybersecurity governance and reporting regarding materiality, the old way of doing cyber risk quantification, or CRQ, may not be completely dead yet, but it's just a matter of time. To help, Chapter 6 in my book is a how-to guide on forecasting risk for your organization, and security vendors like ProcessUnity and Cyber Risk Solutions are starting to roll out products to facilitate these calculations. So that's a wrap. Don't forget you can buy copies of Cybersecurity First Principles: A Reboot of Strategy and Tactics. Order it now at Amazon or wherever you buy your books. The audio version just dropped, so you can now read the hard copy version, read the digital version with your Kindle, or listen to the audio recording on your phone. And finally, we'd love to know what you think of this podcast. Send email to cyberwire@n2k.com. 
Your feedback helps us ensure we're delivering the information and insights that help keep you a step ahead in the rapidly changing world of cybersecurity. We're privileged that N2K and podcasts like "CSO Perspectives" are part of the daily intelligence routine of many of the most influential leaders and operators in the public and private sector, as well as the critical security teams supporting the Fortune 500 and many of the world's preeminent intelligence and law enforcement agencies. N2K Strategic Workforce Intelligence optimizes the value of your biggest investment, people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. N2K Cyber's "CSO Perspectives" is edited by John Petrik and executive produced by Peter Kilpe. Our producers are Liz Ervin and senior producer Jennifer Eiben. Our theme song is by Blue Dot Sessions, remixed by the insanely talented Elliot Peltzman, who also does the show's mixing, sound design and original score. And I'm Rick Howard. Thanks for listening.