Dec 22, 2023

CyberWire Live - Q4 2023 Cybersecurity Analyst Call.

Join Rick Howard, N2K's CISO, Chief Analyst and Senior Fellow, and his guests Merritt Baer, Field CISO at Lacework, and Caroline Wong, Chief Strategy Officer at Cobalt, for an insightful discussion about the events of the last 90 days that will materially impact your career, the organizations you’re responsible for, and the daily lives of people all over the world.

Rick Howard: Hey, everyone, welcome to the CyberWire's "Quarterly Analyst Call." My name is Rick Howard. I'm the N2K CSO and the CyberWire's chief analyst and senior fellow. I'm the host of the "CSO Perspectives" podcast and the author of a recently published book based on the podcast called "Cybersecurity First Principles." But, more importantly, I'm also the host of this program, the CyberWire's "Quarterly Analyst Call," where we try to unpack some of the most important cybersecurity news stories from the past quarter. And I'm happy to say that I'm joined by two members of the CyberWire "Hash Table." First is my very good friend, Merritt Baer, the field CISO at Lacework. Merritt, I'm so glad you came on the show this time. Say hello.

Merritt Baer: Hi, folks. Yes, great to be here. Thanks, Rick. We've known each other for years. I'm at Lacework now. I'm field CISO, so talking to lots of customers, CISOs, in the field, as the name implies. Coming from over five years at AWS in their Office of the CISO, so a deputy CISO role there. And, before that, U.S. government in all three branches doing security work. So, looking forward to our conversation today. And thanks, as always, for having me.

Rick Howard: Excellent. And our second guest is Caroline Wong, the chief strategy officer at Cobalt.io. I guess just Cobalt, huh, Caroline? She is also a Cybersecurity Canon Hall of Fame author for her book "Security Metrics: A Beginner's Guide." Caroline, welcome. This is your first appearance with the CyberWire. Thank you.

Caroline Wong: I am so excited to be here. I started my information security career in 2005 leading security teams at eBay and Zynga. Most recently, I've been with Cobalt. We're a security testing company.

Rick Howard: Excellent. So, everybody, this is our 17th show in the series. And, as always, there's been a lot of things going on this past quarter that we could have discussed, like the U.S. DHS Healthcare Cybersecurity Strategy, or we could have talked about the big data breaches at Okta and 23andMe, or we could have even talked about how the U.S. FCC proposes a return to net neutrality. But we're going to start with Merritt's topic on the new SEC rule about material reporting. So, Merritt, what's going on here?

Merritt Baer: Yeah, I wanted to actually kind of break your mold, you know, unsurprisingly, but instead -

Rick Howard: Wait, you're breaking the format? We're two minutes in. Come on.

Merritt Baer: Instead of picking like, you know, an issue or an article, I wanted to take some current issues, including SEC requirements around security disclosures and also the EU's new AI Act and kind of break down why we should care about these and how it will play out for security practitioners.

Rick Howard: Perfect.

Merritt Baer: Yeah. I think that so frequently we're getting news as if it's disconnected from our real world. And I also think that, as security practitioners, a lot of times there are questions around how other folks are incorporating this into what they're doing. So, I wanted to take a chance to talk through where I see some of these really hitting home for practitioners to need to think about and for the business to have to care about.

Rick Howard: All right. So, what's the first thing on your mind then?

Merritt Baer: All right. So, let's start with the SEC since we are -- you know, that was the first one you introduced here. I think that, if folks haven't been tracking with it, a few months ago the SEC unveiled rules for cybersecurity. And they had been toying with a couple of different prescriptive possibilities, but what they ended up coming out with was the idea that anyone who's bound by SEC guidelines, so public companies that have U.S. presence, will need to disclose material cybersecurity incidents within four days, will also need to attest annually in their 10-Ks around having a security process for incident detection and reporting, and having board oversight and expert -- and the level of expertise that they have on their board around security. So, these have now -- they just went into effect yesterday.

Rick Howard: Yeah, yesterday.

Merritt Baer: Yeah. They have now I think -- and I know you're going to dig in from a different angle into some of this, but they have definitely changed I think some of the calculus for CISOs around how to think about relationship to the SEC. And, also, in my view, one -- the first thing it's going to do is change sort of the order of operations in firefighting. So, if you are a CISO, generally speaking, it was not your top priority to go consult your legal and maybe CEO and maybe CRO or other folks the minute that you had a breach. You go contain it, you go figure out the level and you're just like in the trenches. Right? Well, now, with this four-day clock, I think we're going to see folks incorporating business decision making earlier into those determinations. We saw the first report actually come out yesterday from an impacted company based in Denver, VF Corporation. So, it's being played out in real time in terms of how these will look. And even in their filing yesterday, they said, "We don't know the full extent of it, but we anticipate that it has a likelihood of materiality." And, so, they disclosed. Well, to me, that means that like - there's a couple things here. One, they had to go through - they had to change their internal processes to be able to incorporate some of these determinations under that time threshold. Two, they are dealing with an ongoing threat possibly - by all accounts, it's a ransomware attack - that is maybe still something they're trying to contain. So, I think we're going to see folks really drifting toward tools and capabilities that allow them to not only decrease their likelihood of bad days and like detect threats and configure correctly and -- you know, but also to be able to grab root cause as fast as possible. So that, as they're running against this, you know, reporting threshold, they're able to have some confidence that they have gotten the bad actor out before their reporting requirement hits.

Rick Howard: I think what's really - I think that's really interesting about that, Merritt, is that, you know, before the new SEC ruling, most of us, most CISOs didn't really think about materiality too much. You know? That was something those business guys did over there. Right? And now -

Merritt Baer: Yeah.

Rick Howard: All of us are scrambling to see what that really means and how that affects our business. Right?

Merritt Baer: Yeah, it's a really good point. In fact, you know, these filings in theory are nothing new. And we see, for example, the SolarWinds, you know, charges are resulting from previous requirements that just had to deal with like reporting material events, whether they were security related or not. These new requirements are more pointed and put in this four-day reporting requirement. And, so, I think, yes, we're certainly seeing -- and, you know, one of the things that I have done is gather - the Wall Street Journal asked me a couple days in like, "What will material look like?" And I -- as I heard myself say out loud, "I think it'll end up being an industry standard," I thought, "We should probably put this together." So, I drafted a framework. We can link to it in the show notes. But, you know, this is a living set of calculations. I think we -- when I got folks in the room, it was like, okay, there are certainly going to be things on one end of the spectrum that may be like, you know, a knock at the door of an internet scanner that never makes its way into your environment. Right? That may be clearly, or quite clearly, immaterial. Then there might be things on the other end of the spectrum. You know, regulated data is impacted, sensitive and proprietary information is impacted, a data exfil, an outage for the company, a known bad actor, you know, using malware that we recognize. There are all these things that may otherwise like seem clearly material. And then there's the gray area in between. Right? What if they are exfiltrating fully encrypted data that is not known to be sensitive? What if -- you know, what if, what if, what if. And, so, I think as we whittle down that decision making, it's going to be contextualized and folks will need to take this framework and make it their own. But there is some level of like - I think the SEC was not specific about what material means, other than a reasonableness standard around shareholder right to know, in part because they think it's, you know, subjective as well, like it depends on the context here and also depends on, you know, what that industry's stakes are for shareholders.
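
For listeners who want to picture what that kind of triage might look like in practice, here is a minimal sketch in Python. The factor names, weights, and thresholds are illustrative assumptions only, not the framework Merritt describes, and any real materiality call belongs with counsel and the business.

```python
from dataclasses import dataclass

@dataclass
class IncidentFacts:
    """Observable facts an IR team can usually establish early on."""
    regulated_data_impacted: bool = False    # e.g., PII/PHI/PCI scope confirmed
    proprietary_data_impacted: bool = False  # trade secrets, source code, etc.
    data_exfiltrated: bool = False           # confirmed outbound data movement
    exfiltrated_data_encrypted: bool = True  # was the stolen data usable?
    business_outage: bool = False            # customer-facing or revenue impact
    known_threat_actor: bool = False         # recognized malware / actor TTPs
    perimeter_scan_only: bool = False        # "knock at the door" that never got in

def materiality_screen(facts: IncidentFacts) -> str:
    """Rough three-way triage: immaterial / gray area / likely material.

    A conversation starter for the disclosure discussion, not a legal determination.
    """
    if facts.perimeter_scan_only and not any([
        facts.regulated_data_impacted, facts.data_exfiltrated, facts.business_outage,
    ]):
        return "likely immaterial - document and move on"

    strong_signals = sum([
        facts.regulated_data_impacted,
        facts.proprietary_data_impacted,
        facts.data_exfiltrated and not facts.exfiltrated_data_encrypted,
        facts.business_outage,
        facts.known_threat_actor,
    ])
    if strong_signals >= 2:
        return "likely material - start the four-day disclosure conversation with legal"
    return "gray area - escalate to legal and the business for a judgment call"

if __name__ == "__main__":
    # The gray-area example from the discussion: confirmed exfiltration of
    # fully encrypted data that is not known to be sensitive.
    print(materiality_screen(IncidentFacts(data_exfiltrated=True)))
```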

Rick Howard: Caroline, I want to come to you in a second here, but I want to remind the audience that if you have questions about any of this, go ahead and feel free -

Merritt Baer: Oh, yeah.

Rick Howard: To input them into the questions -

Merritt Baer: We -

Rick Howard: Bar there. Okay?

Merritt Baer: We are here live in our real human form. So, take advantage.

Rick Howard: So, Caroline, what do you think? You -- the big strategy thinker at your company. What is your consensus on what's going on here?

Caroline Wong: So, my biggest takeaway is I love the transparency. There have been situations over the last several years where major breaches happened and all of the valuable information that we, as an industry, could have learned from was locked away and forbidden for anyone to look at. I think this is different. And while it will be uncomfortable for organizations that have to do this kind of thing, I think over time it's actually changing the culture of the security industry to talk about bad things that happen. With regards to materiality, you know, I am so excited about Merritt's framework. You know, fundamentally, we're talking about a classic model that has to do with probability and impact, both of which are highly subjective. I actually like that there is this materiality concept that CISOs and CSOs are being forced to think about. I think this aligns security in a different way with the business than historically leaders have been forced to do.

Rick Howard: That's a really good point. Right? That's a very good point. It might force us to talk to each other as opposed to what we've been doing in the past 30 years or so. Right? So, Merritt -

Merritt Baer: Hopefully not, but yeah.

Rick Howard: Maybe. We'll see. We'll see if that goes --

Merritt Baer: No, I mean, like, hopefully, we have been talking to each other. But, yeah, I think it raises the stakes. I think it raises -- I mean, like just as a business. You know, transparency is great for learning, but transparency in front of your regulator is scary. You know? So, I think it will provide some higher indications that the business has to care and, also, you know, that the CISO has to frame the kinds of considerations in business terms. And I think we were drifting toward that. And it's kind of an interesting vote from the SEC saying - you know, they're a regulator, right, so, on some level, they're saying, "We're not seeing the investments in security done by private market forces that we want to see." So, I'm curious to see, as it plays out, you know, whether we will get net benefits. Because, obviously, it's not intended to be a market-crippling effect. You know what I mean? Like this is supposed to -- as Caroline points out, like it's supposed to be a healthy thing. Will it be? You know, like we -- I think folks are watching and we hope to validate whether it's an effective way to change the calculus of where those investments go. And, also, my sort of second point on this, which kind of leads into the AI element, is I think what they are trying to get at is you don't have to be perfect, but you have to have a process.

Rick Howard: Right.

Merritt Baer: You need to -- you know, like we're not going to tell you what that process has to be, but, when you attest -- so, in addition to that four-day disclosure, remember there are those two other new rules around disclosures that say you need to have an incident response, you know, discovery and reporting process. And then you also need to have, you know, attestations around how the board oversees your risks and what their level of expertise is. That -- you know, that tracks with a lot of what I have seen in successful and mature security shops, which is like we're not going to tell you exactly what this has to look like, but we're going to tell you that it needs to be mechanized.

Rick Howard: So, Merritt, let's transition before we run out of time here the -

Merritt Baer: Okay.

Rick Howard: To the second point on the new EU regulation on artificial intelligence, because it all ties -

Merritt Baer: Yeah.

Rick Howard: Together. So, what's going on there?

Merritt Baer: It does. So, once again, I want to kind of highlight how even when there are new requirements, hopefully, your business kind of has these muscle groups in place to incorporate them. And the security shop is part of that business that has those. So, in this case, the EU -- and, by the way, the U.S. has had glimmers. We've had, you know, EOs and other attempts to kind of come out with a framing around AI, but we haven't had this type of dispositive regulation. The EU has put out an AI Act that limits certain uses of AI. So, things like, you know, use of biometrics for law enforcement or social scoring using AI. They put in what they consider to be sort of safeguards around it. And the fines are really significant. They are seven percent of global turnover or 35 million euros. So, like really significant fines. Which, again, you know, will grab the attention of businesses. And here, too, I think we see a distinction between sort of like -- or, you know, a helpful conversation around public policy goals and then also what it will look like in a security shop to actually take some of the restrictions here, like the difference between your business using AI and something that is a deliverable of your industry, and then also the AI that you're using in a security shop and how you secure those models. So, I think it's worth just highlighting how folks are using it - and, you know, for example, Lacework uses a lot of heuristics that are contextualized. So, like we don't call that AI because it's not GenAI. This isn't the kind of stuff that we've been seeing that is the kind of problematic stuff that I think this is trying to get at. But we certainly see correlation or like a need to disentangle what the business is seeking to use AI for and then how security shops are approaching security of AI, the use of AI for security goals and other interesting, you know, practicalities here. So, I'm seeing some folks put guardrails around what employees can do. I'm seeing some folks come up with ethical and other sort of frameworks around how they're going to allow AI. But, a lot of times, like the devil is in the details around how you are using it and what data and how you're training it and what those inputs are. We saw some escapes where folks were able to just figure out what the queries were. And that was indicative in itself.

Rick Howard: Caroline -

Merritt Baer: So, I think we're seeing this all over.

Rick Howard: Caroline, I'm going to come to you with a question about is anybody using AI yet. Before we do, though, let's put up the poll. Emily, can you bring up Merritt's first poll question? Caroline, I'm coming to you on this. You know, we all got started, you know, last year when ChatGPT came out. Is it yours -- do you see your customers using AI to do anything, like code generation, or anything like that?

Caroline Wong: Customers are absolutely using it. One of the most popular uses, which is not as scary as some others, is simply chatbots, customer service requests. We actually get quite a few requests to pen test AI software, primarily chatbots, but LLMs as well. One of the things that I think is really interesting about this AI bill is that folks actually started drafting it in 2021, and it's not likely to come into actual effect much earlier than 2025. However, I think it's super important for CISOs to be aware of it. And here is the reason. Boards are going to ask CISOs what they think about AI. And no matter what your stance is, I do think it's important for us to be informed and to take a stance and to make recommendations.

Rick Howard: Let me throw the answers up to the poll here. Emily, I don't know how to display that. I hope we can figure that out. I'm looking at the screen, though. It says, Merritt, 57% say yes, they're using it, 28% say no, and 14% are considering it. Does that match what you're seeing, Merritt, when you're talking to people out there in the wild?

Merritt Baer: It's kind of interesting, yeah. So, this question actually was whether they're using it specifically in code generation -

Rick Howard: Right.

Merritt Baer: And/or review. And I think that actually this -- it tracks more closely with reality than I thought it would, which just shows that your viewers are probably more in the guts of the org. Because, a lot of times, I think if you asked, you know, an executive at a company if they were using it, they might think, "No, of course not, we would have to put guardrails around that." And it's like, in reality, a lot of folks are - unless you have some, you know, magic solution that blocks it everywhere all the time, folks are using it. And I think that's fine. My personal view is that we should be using, you know, all the tools we have, but appropriately and with the kinds of -- you know, the -- one of the biggest concerns is just whether the outputs are even accurate.

Rick Howard: It's a good question.

Merritt Baer: So, you know, I think using it for review is, you know, a productive use because, of course, there you would already have humans supervising closely. And then also I'm seeing it used for training. So, I was watching some folks posting on Twitter/X about how they'll generate fake logs so they can teach themselves how to filter appropriately and find true positives, and some of the cool ways that it actually can lend itself to a security shop just as you figure out how to write rules or find detections and how to validate, you know, what your providers are doing for you and other things. So, I thought that was interesting. I think that most shops have folks who are doing it whether they realize it or not.
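
As a rough illustration of the training idea Merritt mentions, here is a small, self-contained Python sketch that fabricates fake authentication logs with a few planted "true positives" and then applies a naive filter. The field names and the detection rule are made up for the example; a real shop would tune this against its own log schema and detection content.

```python
import random

random.seed(7)  # reproducible fake data for practice runs

USERS = ["alice", "bob", "carol", "dave"]
NORMAL_IPS = ["10.0.0.5", "10.0.0.9", "10.0.0.12"]
SUSPICIOUS_IP = "203.0.113.77"  # documentation range, stands in for a "novel" source

def generate_fake_logs(n: int = 200) -> list[dict]:
    """Mostly benign login events, with a handful of planted anomalies."""
    logs = []
    for i in range(n):
        logs.append({
            "seq": i,
            "user": random.choice(USERS),
            "src_ip": random.choice(NORMAL_IPS),
            "result": "success" if random.random() > 0.05 else "failure",
        })
    # Plant three "true positives": successful logins from the odd source.
    for seq in random.sample(range(n), 3):
        logs[seq]["src_ip"] = SUSPICIOUS_IP
        logs[seq]["result"] = "success"
    return logs

def naive_detection(logs: list[dict]) -> list[dict]:
    """Toy rule: flag successful logins from an IP we have never baselined."""
    return [e for e in logs if e["src_ip"] not in NORMAL_IPS and e["result"] == "success"]

if __name__ == "__main__":
    events = generate_fake_logs()
    hits = naive_detection(events)
    print(f"{len(hits)} suspicious events out of {len(events)}")
    for hit in hits:
        print(hit)
```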

Rick Howard: Caroline, I got a question from my own boss in the chat section, right, from Simone Petrella. She says, "Do you see any potential risks of companies choosing to take less aggressive actions assessing the maturity and resilience of their cybersecurity programs or posture so they can avoid reporting it in filings?" That's an interesting question, yeah.

Caroline Wong: So, I think that's real and that happens. Sometimes people don't want to look because, if they were to look, they would be forced to report. That being said, I do think that, for the majority of what the SEC is having folks do, I expect the CISOs are already doing that. The CISOs are already saying, "Let's have a program. Let's have a great program." And then they say, "Can I have some money for this program?"

Rick Howard: Yeah.

Caroline Wong: And then the business says, "No, we're going to accept the risk." And, so, my hope is that it actually doesn't change so much of what the CISO is doing, but they just get more because the business understands that it's important materially.

Rick Howard: Well, I think that we could talk about this topic for the next five hours.

Caroline Wong: Yeah.

Rick Howard: Right? But we're going to have to -

Merritt Baer: Well -- and I do want to -

Rick Howard: Go ahead.

Merritt Baer: Hang on, a real quick point there. I think that while they haven't -- you know, the SEC hasn't affirmatively put a requirement that you go look for everything, you know, we may see some enforcement action where folks didn't look attentively enough. And I think that's a real possibility. But I also think like you have to remember that, aside from the SEC, you not looking at your own environment for security incidents is a liability in itself just to your organization. So, you know, like we're stacking up incentives, but there are already some at play here to do security. Right?

Rick Howard: Well, it's a fantastic topic and - but we're going to have to leave it there. Let's turn to Caroline's topic. She wants to talk about one of the best cybersecurity books of the year. Yeah, there it is. So, tell us about it, Caroline. Why did you like this book so much?

Caroline Wong: I love this book. And the number one reason is because it's super funny. When was the last -

Rick Howard: All right.

Caroline Wong: Time you picked up a 400-page security textbook and, while reading it, you literally laughed out loud? It came out in January 2023 from Addison-Wesley. The other thing I like about it and the other reason it's so darn funny is because it's real talk; there's no BS in this book. This is all deep thinking in plain language. It's talking about very important concepts and there's no ego, there's no elitist "I'm more technical than you" vibe to it. This is really just an honest look at things. And I love that. So, the book is called "Cybersecurity Myths and Misconceptions: Avoiding the Hazards and Pitfalls that Derail Us." And there are dozens and dozens of myths and misconceptions in this book. I just want to share a handful of my favorites.

Rick Howard: Oh, yeah, let's do it.

Caroline Wong: Macs are safer than PCs. Linux is safer than Windows. Passwords should be changed often. Everything can be fixed with blockchain.

Rick Howard: Oh, yeah.

Caroline Wong: We can solve all of our problems with big data. And there is an entire chapter on cognitive biases, about two dozen of them: action bias, omission bias, survivorship bias. But one of the things that I love most about this book is the last section is all about finding hope.

Rick Howard: What does that mean finding hope? What -- I love the sentiment of it. Okay? What were the authors getting to when they said that?

Caroline Wong: Yeah. So, there are dozens and dozens of myths and misconceptions that are discussed in this book. And, at the end of the book, we've got some meta myths and some meta recommendations. One of those meta myths is that cybersecurity is doomed, because sometimes, working in this industry, this is how it feels. Sometimes it feels really hard and it just feels like we're failing and stuff is getting harder. However, the book makes an excellent point, which is that far too little attention is paid to how much progress we've actually made in protecting people and devices. And, so, it talks about there's an opportunity. Instead of focusing on the negative and blaming users for our human faults, why not empower people and why not offer cybersecurity that's actually well suited for helping people? And, so, there is both kind of a sarcastic and a funny bent to this book. But there's also a tremendous amount of hope.

Rick Howard: So, I will say that regular listeners to this show know that I'm part of the Cybersecurity Canon Project where we try to find the most important cybersecurity books out there. And the Canon has this book squarely in its sights as a potential hall of famer going into the new year. Merritt, let me go to you. I don't know if you've read the book or not, but do you have a favorite myth that you see all the time that you always have to tell people, you know, that's not really a true thing? Do you have a favorite that you'd like to highlight here?

Merritt Baer: Sure. I -- so, I feel like I'm -- so, full disclosure, I have not read the book. But I appreciate this glowing review. And I think it's great when folks are doing, you know, the good work of like not creating more myths and misconceptions. I think one of the things that I see among CISOs is just like the idea that they've bought a vendor and now they're set. And I say this working for Lacework, which helps folks all the time, but like you have to make it meaningful. You have to action it, you have to operationalize, you have to prioritize. Like it's no replacement for those muscle groups and mechanisms that all security shops need to build. So, you still have to, you know, have - it's a different conversation when you're able to do cloud security, for example, when you're able to do infrastructure as code and you're able to do, you know, better prioritization of alerting because you're getting heuristics from us, for example. But like it doesn't mean that the security work goes away, it just takes different -- you know, higher forms. And I think that sometimes when I'm talking to CISOs they feel a little -- and I'm not usually talking about Lacework, I'm usually talking about whatever they claim that they have that's their silver bullet. They'll be like, "We have, you know, Orca, so we're set." And then I'll say, "Well, do you have CloudTrail always turned on?" "No." "Well, you're not getting the visibility that you think you're getting then." You know what I mean? Like it's -

Rick Howard: Yeah.

Merritt Baer: Still you have to - and to do CloudTrail always on, you have to have organizational constructs that force that to be a setting that all of your builders have on. And, to do that, you have to ingest and then turn on the integrations. You know, like -- and even then, once you have your, you know, Lacework findings, you've got to go take action on them. And that is just the kind of thing that we can help with. And I actually see possibilities for AI helping with like, you know, rather than rules that are like "you must sanitize inputs," that they might actually, you know, draft code that would allow you to, you know, jump into that faster. And, you know, like there's ways that we can help ourselves with tech, but the tech will not save us.
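
To make the "is CloudTrail actually always on?" question concrete, here is a minimal sketch using boto3 against a single AWS account. It only checks for a multi-region trail that is currently logging, assumes credentials are already configured in the environment, and is not how Lacework or any particular product does it.

```python
import boto3

def cloudtrail_always_on(session: boto3.session.Session) -> bool:
    """Return True if at least one multi-region CloudTrail trail is actively logging."""
    ct = session.client("cloudtrail")
    trails = ct.describe_trails(includeShadowTrails=True).get("trailList", [])
    for trail in trails:
        if not trail.get("IsMultiRegionTrail"):
            continue
        status = ct.get_trail_status(Name=trail["TrailARN"])
        if status.get("IsLogging"):
            return True
    return False

if __name__ == "__main__":
    session = boto3.session.Session()
    if cloudtrail_always_on(session):
        print("At least one multi-region trail is logging.")
    else:
        print("No multi-region trail is actively logging - visibility gap.")
```

An organization-wide version of the "organizational construct" Merritt alludes to would typically pair an organization trail with a service control policy that denies actions like cloudtrail:StopLogging and cloudtrail:DeleteTrail, so builders cannot quietly switch the visibility off.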

Rick Howard: So, Caroline, the poll - the audience poll says that 25% of the audience has already read the book, and 50% says they are going to. That's a pretty rousing bit of support for a book that just came out in 2023. Is that what you were expecting?

Caroline Wong: I was not expecting it to be that high. I'm delighted that it is. Also, today is Tuesday, December 19th. I just checked on Amazon; you can order this and get it Prime delivered to you overnight between 4:00 a.m. and 8:00 a.m. tomorrow. So, for any folks who might still have a little bit of holiday shopping to do, you know, Amazon Prime, get the book. The four authors --

Merritt Baer: Do you work for that author or something?

Caroline Wong: I'm just obsessed. Right? I'm just obsessed. And, so, here's the folks who wrote it. You know, I have a deep appreciation for great books. And I have a deep appreciation for great learning. This book is written by four doctors, not of the medical type, including a 35-year faculty member from Purdue, my good friend Spaff; security research analyst at Carnegie Mellon CERT, Leigh Metcalf; cybersecurity leader at the NSA, Josiah Dykstra; and award-winning artist and writer Pattie Spafford. So, really, really good stuff. I think, fundamentally, one of the things that I worry about - you know, the three of us were chatting as we were getting started for today's discussion and one of the things we talked about is the role of parents of children. And when I think about my kids and the things that they're going to face in their lifetimes, I'm worried that they're going to face disinformation. And I want to empower them to think critically. And that's really the biggest takeaway from this book: think critically, examine things, examine things over and over and over again because, when time passes, things change and things that we used to believe were true may actually not be anymore. So, yeah, huge fan.

Rick Howard: So, we got a question from an audience member. Merritt, let's see what you got for this. This username is JoeNotExotic. And he says, "Why are there so many myths and misconceptions in cybersecurity? Who do they serve?" Any idea -- any thoughts about that?

Caroline Wong: Yeah.

Merritt Baer: Well, you know, I think there is a little bit of a drive to be sexy in the security industry. So, if you go to like a con, you will see all these talks about like, you know, edge case scenarios. And I'm not saying that those don't matter. Obviously, most security people are geeks who love kind of being creative around like, okay, what'll happen if I put a terabyte of data in the name space or what can I, you know, break, what will stay the same, what can I -- but, at the same time, like folks spend all of this time on zero days and worrying about it, when like the number one vector is always going to be credential compromise. Like they're walking in through your front door. And it's just around like the unsexy stuff that it takes. Again, like it's that security work. And there can be sexy ways of getting to that information, you know, triangulating around like does this person usually behave in this way, are these credentials being used in novel patterns. So, those are the kinds of things that shops like us try to -- I mean, like surface for you. But, ultimately, like it's not the -- what's it? It's horses not zebras a lot of the time. And, so, since it's JoeNotExotic, you will understand. But like, you know, folks I think have an attraction to the stuff that feels really novel and sexy. And those are important for us, as practitioners, to be aware of. But, at the same time, a lot of what we need to be doing not only is focusing on those kind of ordinary, but workable things, but it's also even those edge cases will often get caught by having good mechanisms. So, if your trouble ticketing does force escalations, you're going to catch, you know, bad behavior, regardless of how it happens, if you've refined your tuning.
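
A toy version of the "are these credentials being used in novel patterns" idea Merritt describes, purely for illustration: build a per-user baseline from historical login events and flag logins that fall outside it. The event fields and the baseline logic are hypothetical assumptions; real detection products triangulate far richer context than this.

```python
from collections import defaultdict

def build_baseline(history: list[dict]) -> dict:
    """Per-user baseline of the login hours and devices seen historically."""
    baseline = defaultdict(lambda: {"hours": set(), "devices": set()})
    for event in history:
        baseline[event["user"]]["hours"].add(event["hour"])
        baseline[event["user"]]["devices"].add(event["device"])
    return baseline

def is_novel(event: dict, baseline: dict) -> bool:
    """Flag a login whose hour AND device are both outside the user's baseline."""
    profile = baseline.get(event["user"])
    if profile is None:
        return True  # never seen this account log in before
    return (event["hour"] not in profile["hours"]
            and event["device"] not in profile["devices"])

if __name__ == "__main__":
    history = [
        {"user": "alice", "hour": 9, "device": "laptop-alice"},
        {"user": "alice", "hour": 10, "device": "laptop-alice"},
        {"user": "alice", "hour": 14, "device": "phone-alice"},
    ]
    baseline = build_baseline(history)
    # A 3 a.m. login from an unknown device is worth a second look.
    print(is_novel({"user": "alice", "hour": 3, "device": "vps-unknown"}, baseline))
```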

Rick Howard: That's one of my pet peeves. You know, like you said, Merritt, a lot of our community, we like to focus on the stuff we don't know, you know, the "oh, my goodness, I had to spend a lot of resources finding the new vulnerability, the new exploit." When we have all those other -- we have all this intelligence, let's just say from the MITRE ATT&CK Framework, about how adversaries operate across the intrusion kill chain. And most of us don't have prevention and detection controls -

Merritt Baer: Oh, yeah.

Rick Howard: In place for even the known stuff. So, why would you --

Merritt Baer: I mean, we --

Rick Howard: If we own them?

Merritt Baer: See logs --

Rick Howard: Yeah.

Merritt Baer: For [inaudible 00:31:03] happening all the time, still we see --

Rick Howard: Yeah.

Merritt Baer: [inaudible 00:31:05] happening all the time. So, we see all the known packages. You know?

Rick Howard: Yeah.

Merritt Baer: So, yeah, exactly.

Rick Howard: So, Caroline, I'll go back to you with the same question. Why are there so many myths and misconceptions in cybersecurity and who do they serve? What's the -- what do you think there?

Caroline Wong: One of the things that I've observed about some cybersecurity people over the years is that some cybersecurity people kind of have this vibe where it's like, "I am right and you are wrong."

Rick Howard: I have never done that.

Caroline Wong: And I -

Rick Howard: I have never done that ever.

Caroline Wong: I'm not talking about any of the people right here. But just, you know, close your eyes for a moment, think about all the cybersecurity people you know, and I'm willing to bet that, you know, a good percentage of those people have this like "I'm right and you are less right" vibe to them. And I'm not a huge fan of that. I think that the world and cybersecurity are incredibly complex, and our brains just try to understand stuff. You know? And we come to conclusions. And, sometimes, you know, it's fascinating to me how much about cybersecurity seems to be about belief. You know? I think somebody made a comment on the book in one of the reviews that says, "You know, when we're talking about opinions and beliefs, you know, that's really for religion and politics."

Rick Howard: Yeah.

Caroline Wong: You know, when it comes to cybersecurity, let's try and stick to the facts. You know, we've got another question in the chat from Gustine, I think, I'm not sure if I'm pronouncing that name correctly, "Should cybersecurity play a bigger role in monitoring and remediating disinformation campaigns? Asking that while mindful that we're entering an election year." I think, as cybersecurity professionals, we have an obligation to protect the integrity of data. And I think we're always searching for source attributions. So, I think that, you know, cybersecurity people, we have access to a lot of data. And when we see that data is inaccurate, is misinformation, is being used for false propaganda, we have an opportunity to point that out.

Rick Howard: I think it's an excellent question and an excellent topic. Merritt, let me ask you about this, right, because I think that the infosec community, people like us, we're probably uniquely qualified to understand what misinformation is and how it can be used against us. But I know I've been cautioned against speaking about that in other companies that I belong to because it's so politicized. So, what do you think about that? Do we need to get involved in that discussion?

Merritt Baer: Yeah, I think it depends again - not to sound too lawyery, but I think it depends on the business and the context. But I certainly, you know, have worked on this. I worked on it at AWS with like what fraud and bad behavior we were willing to tolerate or not, how we could notice it. There's overlap with security because these bad actors generally are not putting down their own credit card. But, at the same time, there are some questions just around, you know, acceptable use and other things that we're seeing. You know, CISOs have to take responsibility for around not just sort of IT security, but around the kind of like safety and security and acceptable use and trust and safety of the business. So, I think it is an issue that is ongoing for most security executives. I think we do a mix of sort of like raw security controls around it, like, you know, having folks who are not among your customers be able to leave a customer review, or other things that are kind of like a mix between IT and content. But I think we're seeing CISOs in general drift towards being able to not just secure the IT, but help secure the business, make security conscious decisions around content. And, of course, help steer the entity itself toward, you know, its own goals and what brings better outcomes.

Rick Howard: So, Caroline, we got another question from another audience member, username BeenThereDoneThat. I love that name. Right? He or she says, "Why is this book so important?" Why do you -- I may rephrase it. Okay? Why do you think this book is so important, especially right now, this year? What do you think about that?

Caroline Wong: I think it is so important because it really encourages us all to think critically. There is an introduction to the book by Vint Cerf.

Rick Howard: Yeah.

Caroline Wong: And what Vint says is, "Among the most powerful of defensive tools is critical thinking." This book is all about understanding -

Rick Howard: Right.

Caroline Wong: How to think more critically about risks in cyberspace. This takes work. It's not a free lunch. One of the things that I've been trying to impress upon my young children, I've got two below the age of 10, is, you know, you go and you ask Siri a question, you go and you type a question into the Google search bar, what comes back to you may or may not be true. And I am sort of desperate for them to understand that concept. You know? And now we're in this next level of it where, you know, folks who are closer to technology, folks who are in cybersecurity, we understand fundamentally how some of this stuff works. And, so, we know that if I type a question into an AI, whatever it says back is just -- it came from the data that went into the learning model. But I expect that there are tons of folks who are using AI who sort of believe it's the truth. And that is not the case.

Rick Howard: Well, Caroline, that - this topic is a fantastic one. I'm so happy when anybody brings a book to this show. All right? So, thank you for doing that. But it's time to move on to the last topic of the show. All right? I want to talk about -- I want to get back to the SEC that Merritt was talking about, that is the Securities and Exchange Commission. And, this past quarter, they filed civil fraud charges against the SolarWinds CISO [inaudible 00:37:15], yeah, I know, I know, and the company he works for. But, before I ask you guys some questions about that, let me lay down some history. So, this -- the hack happened some time in 2019, maybe a little bit before. So, this is pre-pandemic. The hackers behind the Dark Halo attack campaign compromised the SolarWinds network and poisoned the company's Orion product. Then a year later, in early December 2020, FireEye was the first company to disclose that state sponsored hackers broke into their network through the SolarWinds back door and stole the company's Red Team penetration testing tools. And then, a couple of weeks later, just before Christmas, SolarWinds disclosed the breach in an SEC filing. Right? So -- and then fast forward three years to this year, July 2023, the SEC adopted the new disclosure rules that Merritt was talking about, and they went into effect on Monday. But, back in October, they made an extraordinary move. They charged the company SolarWinds and the CISO, Tim, with fraud, claiming that SolarWinds had internal control failures from at least October 2018, the initial public offering of the company, through at least its December 2020 announcement that it was the target of a massive, nearly two-year-long cyberattack, and that SolarWinds and Brown defrauded investors by overstating SolarWinds' cybersecurity practices and understating or failing to disclose known risks. Wow, that is a big swing. Right? And, so, before I ask you two to weigh in on that unprecedented move, here's a couple of facts that we should consider. First, this SEC case is a civil case and not a criminal case. So, nobody's going to jail. But court costs and follow-on job opportunities for Tim, these are at stake because this is probably going to stretch on for years. The current CEO, Sudhakar Ramakrishna, was not the CEO at the time of the incident. He came in after Kevin Thompson left. But neither CEO has been charged by the SEC, which I find just crazy. And, by the way, Tim Brown was not the CISO at the time of the incident either. He got that title afterwards. And, by the way, he was not a director of SolarWinds, meaning that he's not on the board and not vested with any typical legal or fiduciary responsibilities that board members have. And he was not an officer of the company, meaning he was not appointed by the board. And, as you two know, directors and officers are protected against bad business decisions they make; they aren't personally liable for reasonable mistakes in judgment and business decisions that turn out to be bad for the corporation, as long as they were reasonably informed, acted in good faith and believed their action to be in the best interest of the company. Tim Brown was and is a simple VP employee. He doesn't have any of those normal protections typically given to the executive team, like directors and officers insurance, that would cover court costs and lawyer salaries. Now, I happen to know that SolarWinds is standing behind Tim Brown and covering all those legal fees. 
But this is probably not going to be true in other corporations going forward. So, let me stop talking. Merritt, let me come to you first. Are you as gobsmacked about this as I am? I'm just -- I just can't believe this is what happened.

Merritt Baer: Yeah, I think it's pretty surprising. You know, I think it's interesting, we saw with Joe Sullivan the move to do criminal liability in the Uber case, and now with this one with civil liability. I think it is influencing CISOs' sense of self. And I think, you know, there will be some negative impacts of that. I think, hopefully, there will be some positive ones, which are, for example, that we do get more board members that have security expertise, that we do get more CISOs at the actual board table with D&O insurance. You know, like that folks will have more awareness of this being -- you know, coming with C-suite liabilities and needing C-suite capabilities. But I think it's certainly a line in the sand. And I think that, you know, most CISOs are somewhere in between sort of like IT and security and the business. And this really squarely puts them in a position where they need to have characterized it correctly to the business. And I think that -- you know, the other thing that jumped out at me with the charging [inaudible 00:41:54] was that they raise all of these times that Tim's own staff kind of came to him and pointed out problems. And I was like, "Yeah, I mean, they're supposed to do that." And we're supposed to like -

Rick Howard: Yeah.

Merritt Baer: That -

Rick Howard: We get -- yeah.

Merritt Baer: We take that into account. And it doesn't mean that they get everything that they think the entity -- you know, the company or entity should do. It just means that they should raise them. I want to -- like my -- one of my concerns is just that it will stifle internal criticism because it could be surfaced in an investigation. You know what I mean? That I want entities to be vocally self-critical of their own shortcomings. But, if it means that it might turn up in an SEC investigation, I worry that they might not be willing to do so.

Rick Howard: I -- that's my -- kind of my concern, too. But, Caroline, let me go to you for what your take is on this.

Caroline Wong: So, I think the number one takeaway for anyone who's in a CISO position at a public company is you've got to ask the question, "Am I covered by D&O insurance?" And, if you're not, then that's an opportunity to address it. I do think that the SEC is trying to make a big splash. I think they're trying to make SolarWinds an example. And, while I generally expect that CISOs are doing the right things, their security teams are doing the right things, and it's oftentimes the business who's not actually investing in the security program appropriately, my understanding is that one of the things that was called out by the SEC about SolarWinds is that some of their public-facing security statements are not accurate. So, SolarWinds and any organization on the planet that sends -- that sells to an enterprise organization, you know, an enterprise organization wants to buy your thing, you need to tell them that you're secure. If you're telling them that you are, it should probably be the truth. So, here's the thing about security policy. I think that one of the decisions that security practitioners have to make is do I write policy in line with the practice so that my organization is compliant with the policy, or do I write it to be aspirational so I can encourage better behavior over time. The problem with an aspirational policy is that, in practice, you are out of compliance. So, I think the SEC is trying to do a good thing. I do think that there are some folks who are getting blamed a little more than they really should and sort of suffering as a result of the SEC trying to make such a big example out of this. But I'm really hoping that it turns into a stick for the CISOs to use to say, "Give me more money."

Rick Howard: Well, here's my hot take, Caroline. I think I may disagree with you a little bit, right, because it -- so, now, I'm not a lawyer, but Merritt is. Maybe she can weigh in here and tell us how to think about this. And, I confess, I haven't read in detail the civil complaint, but I don't understand how the SEC could reach into a company like SolarWinds, past the board, past the officers, like these two CEOs, two layers deep in the leadership hierarchy, and charge somebody like Tim for repeatedly violating the anti-fraud disclosure and internal controls provisions of the Federal Securities Law by not disclosing vulnerabilities that the company knew could lead to a hack. Merritt, help me out here. How is this possible?

Merritt Baer: So, I am not this kind of lawyer. And I'm not anyone's lawyer. I happen to have a law degree. But I think, you know, some of the things that you've pointed to are part of that, you know, like be -- not having protections that we might expect for other executives. But I also think that, in this case, they're sort of testing to see what works and, as Caroline said, sort of trying to make a splash. I also think that, you know, this was an unusual case for them to have picked because while -

Rick Howard: Yeah.

Merritt Baer: While I obviously see the impact of SolarWinds', you know, insecurity, because, you know, supply chain is such a huge issue, this was a really painful one, Orion was everywhere, still resides in a lot of systems, this was also a protracted like two-year nation state level, most likely Russia backed, attack. Like I pity the CISO or VP, whatever Tim's title was, who walks into this situation and expects that -- you know, that they will be able to make it perfect. I think that, again -- you know, my armchair lawyer guess at why they're going after him is just because he - you have that, you know, fraud element. It's not just that they weren't doing enough. It's that they lied about the scope and duration and depth of the issue. And, so, I think -- you know, just remember that like Martha Stewart didn't go to prison for tax evasion, she went to prison for lying to federal agents. You know, like the government doesn't like being lied to.

Rick Howard: Well, Caroline, let me come to you there, because I agree that the SEC's rules make for better transparency. I totally get that. But I -- my -- part of the reason I'm upset about this is they went after the CISO. Right?

Caroline Wong: Yeah.

Rick Howard: I'm here to tell you that the CISO in no company ever has the power to make disclosure decisions for the company. Right? In the best case, he has input to the decision made by the board and the officers. In the worst case, I would say -

Merritt Baer: He or she or they.

Rick Howard: He or she, yes. Pardon me. They're not even in the room for most cases.

Caroline Wong: Yeah.

Rick Howard: And, so, that's the reason I have a problem with this decision. I don't know, what do you think?

Caroline Wong: Yeah, I do think that there was a miss in terms of who is the individual or who are the individuals that are responsible and accountable. And, in the majority of security programs that I've observed, the CISO is in a position to say, "I see a risk. It is my duty to inform you about a risk. But it's actually not my job to accept that risk or not. That's someone else's job." And, so, I do think that it's the wrong person.

Rick Howard: Yeah, that's what I think, too. That's why I'm so upset about it. Right? And for -

Merritt Baer: Although I will say like -

Rick Howard: Yeah.

Merritt Baer: I don't know, because I don't know the inside of SolarWinds' processes, but, you know, having -- so, those kinds of decisions were things that we did at Amazon, and with my customers now at Lacework, not just at like quarterly board meetings. Like those are things that you need to be doing in your weekly metrics meetings. Like, "Why did it take us two hours to notice this?" "Was this a known risk or an unknown?" "Had we already accepted some -- you know, where did this threshold come from?" "Why are" -- you know, like -- and those are, I think, the kinds of behaviors that we do see rewarded just with outcomes, and that now I think the SEC is striving to reward based on process.

Rick Howard: It was -

Caroline Wong: Yeah.

Rick Howard: Go ahead, Caroline.

Caroline Wong: I do think that one of the fundamental concepts that the SEC is really intending to zero in on is that of risk tolerance, and to what extent there exists a delta between what the risk tolerance should be and what it actually is - you know, this is all relatively subjective; we're all trying to figure out exactly how to analyze and calculate materiality. But, if an organization should be a little more or a little less risk tolerant than is actually, you know, observed via the actions they take, the decisions they take, the investments they do and do not make, that I think is at the core of this. And I do think that a CISO plays kind of an inform-and-consult type of role when it comes to who determines the actual risk tolerance of the company, what risks we're actually going to tolerate. And I don't think that's the CISO.

Rick Howard: Yeah. And I don't think the community is in agreement on this either. If you just look at the audience poll, it's roughly 50/50. Okay, some think that the SEC missed the target, others think they were right on. But let me change gears a bit, Merritt. Okay? What are the implications of this kind of thing? Do you -- I mean, the first one I'd like to ask you about is, "Do you think there's going to be a chilling effect on the CISO position going forward?"

Merritt Baer: Do you mean fewer people are going to want to be a CISO? Or do you think they're going to have to just like be careful? What do you mean by chilling effect?

Rick Howard: Yeah. Well, yeah, for people like us who have been CISOs, right, do we want to take another CISO job if we -- if we're going to get sued because we made a mistake in judgment because we didn't agree with our team about what was material to the business? That's what I'm worried about.

Merritt Baer: Yeah, I hear you. I think -- so, in my view, like the role of CISO is a slog for a lot of reasons. And it's also really interesting and exciting for a lot of reasons. Right? And I think that, again, it'll just depend on your lens. So, on the one hand, like one of the things that is tough about being a CISO is that like you are an escalation point. So, you're going to have pagers, you're going to have -- you know, like there are already a lot of day-to-day negatives around taking this job. I think this one, while it's a little bit new on possibly like a con of taking the job, I also think it can be a reason that they get a bigger seat at the table or that, you know, folks in the business have to pay more attention. It's going to -- so, I was listening to actually Joe Sullivan on a different podcast recently, where he said, "Because of these liability standards, I think every CISO is going to want to report everything just to cover themselves." And the CEO and the CRO, and whomever, and CFO are going to be the ones saying, "Uh, do we have to?"

Rick Howard: Yeah.

Merritt Baer: "Like where can we draw those?" And I thought that was interesting because I don't actually know that that is true. But I do think that we will be seeing much more - like his view was that the legal and security folks are going to be risk averse and want to disc -- over -- like, you know, when in doubt, disclose, and that folks on the business side are going to be much more conscious of having to do that. Okay, fine. If that is the dy -- or whatever dynamic this becomes as it plays out, I think folks will have to - like the CEO is going to all of a sudden have a much bigger stake in what happens. And, so, you know, an enforcement action that looks punitive is going to look really bad. And that is what you have to weigh against your public disclosures. You know, we don't -- technically speaking, so far, we haven't really seen a ton of penalty in stock price for companies that disclose a data breach. They get a dip at first and then they generally like even out and do fine. And maybe that's because they've reinvested in security and have more -- I can tell you, as someone who works in a security company, that they often then invest a whole lot more after. But you know what I mean? Like it's possible that these dynamics are just kind of adjusting the ebb and flow of where we assign responsibility, where we assign blame and then where we invest and how we are choosing to spend our attention in dollars.

Rick Howard: So, Caroline, let me come back to you on the chilling effect of the CISO position. We got a question from an audience member, Braden Santo. He says, "As CISO burnout continues to plague our industry, what strategies can organizations implement to support CISOs in managing stress and maintaining job satisfaction, especially since you can get sued for fraud now?" All right? So, that's my problem.

Caroline Wong: You know, I have to agree with Merritt, which is to say the CISO job is not an easy job. It's never been an easy job. But I wonder a little bit if the CISO talent situation is such that you simply have more people who want to be CISOs than CISO roles that exist. And, so, it's actually just a really funny thing. Like how hard is this job going to get? There are people out there who are like, "I don't care. I want to be CISO," for one reason or another. And I still think the number of those people is more than the number of CISO jobs that are out there. So, there's this fascinating kind of talent supply and demand situation. That being said, maybe it will change the profile. You know? Are we going to get people who really want to disclose? Are we going to get people who, in some cases, you know, insist on disclosing and, in other cases, in order to climb the corporate ranks, are going to, you know, kind of do what, you know, higher executives tell them to do, even if it's not exactly aligned with the accuracy of the data that they have at their disposal? So, I think it will change the profile. I don't know what it is with people in this industry. CISO has been -

Merritt Baer: We've been --

Caroline Wong: So hard --

Merritt Baer: I mean, you're --

Caroline Wong: For such a long time and people still really want to do it.

Merritt Baer: Yeah.

Caroline Wong: I don't want to do it. But lots of people want to be CISOs.

Merritt Baer: I mean, I think it does have like the kind of - it's the kind of industry that people get into because they wanted to go do the hardest thing.

Rick Howard: Yeah, that's the --

Merritt Baer: And I'm speaking --

Caroline Wong: Totally.

Merritt Baer: For myself.

Caroline Wong: That is correct.

Merritt Baer: But, also, I think it's true. Rick, what do you think?

Rick Howard: Well, I -- let me bring it around to a positive note. I'd love to get your two opinions on this. Right? Because of all this extra pressure now, all right, does it mean that the CISO position will ultimately be lifted to the company officer position that we've all thought it should be at already? I mean, we all -- the title "CISO" is just a title. We're not really a senior executive staff or anything. But, because of these new rules, does that mean the senior leadership elevates the CISO position to be part of the executive staff? Do we think we're going to see that this year or down the road? Caroline, what do you think?

Caroline Wong: I don't think we're going to see it, to be perfectly honest. But I'll tell you what I do think we're going to see. I think we're going to see bigger CISO salaries. Across tech, in the year 2023, like everyone's salary went down except for CISOs. CISO salaries went up. So, I don't think we're going to get that seat at the big table. But I do think there's going to be more money.

Rick Howard: Merritt, how about you?

Merritt Baer: I think your question is leading because, at some companies, they are already at the chief -

Rick Howard: Yeah, but it's not that many. I mean, Fortune -- a handful of Fortune 500 companies. Right?

Merritt Baer: Yeah.

Rick Howard: That's about it.

Merritt Baer: Like, I mean, Amazon's CISO reports to the CEO -

Rick Howard: Yeah, but -

Merritt Baer: You know? And I think -

Rick Howard: Come on. Yeah, that's one thing -

Merritt Baer: Yeah, but Amazon is also like 70 companies and it's whatever portion -- so, I mean, it's not insignificant. I don't think it's a given that we should accept that this isn't already, at least to some extent, the state of play today. I, also, think that some CISOs find value in having other parts of the business be tied to them. So, you know, the most common I see is that they either report into legal or to the CTO, so like the chief legal or general counsel or to the CTO. And those actually mean that they have to like tie their fortunes with security. And that can be frustrating, but it can also be a mover for those folks. So, like the CTO who needs to do infrastructure that works is now getting told by their security person like, "Okay, are you willing to accept this risk? Because if we're not deploying with infrastructure as code and we're not using ephemeral credentials and if you end up hard coding in your - like here are the things you're accepting by allowing your devs to run wild with this" -- you know, like they have now tied it in in a much more direct way to someone else's equities. And the same with legal, right, where they're accepting, you know, other risks and responsibilities. So, I think -- I'm not actually convinced that it is entirely good or entirely bad to have it one way or another. I think it has to work for that organization and the level of - you know, of regulated data, the level of scrutiny that they get from regulators, the, you know, number of employees and how they delegate responsibilities. There's a lot of things that work in different organizations. But I think -- generally speaking, I think these will be harbingers of more attention, weight and responsibility obviously to the CISO role. And I also think -- I agree with Caroline that like we're not going to run out of CISOs because folks want to work on this stuff. And, also, that, you know, we see more and more that CISOs can take different forms and have different strengths. And, so, I -- you know, that's also true for folks listening. You know, there's no pass go, collect $200. Like you work on something and you get good at understanding how systems work, and then you get better at understanding how successful security shops work. And then, one day, you're in a position where it makes sense for you to take the helm.

Rick Howard: Well, ladies and gentlemen, we are at the end of our time. All right? This has been a fantastic conversation. So, on behalf of my colleagues, Merritt Baer and Caroline Wong, thank you guys for participating. And we'll see all of you at the next CyberWire "Quarterly Analyst Call." Thanks, everybody.

Merritt Baer: Thank you so much, Rick. Thanks, folks.