Sep 28, 2023

CyberWire Live - Q3 2023 Cybersecurity Analyst Call.

There is so much cyber news that, once in a while, all cybersecurity leaders and network defenders should set aside some time to consider exactly which developments from the last few months have been the most impactful. Join Rick Howard, the CyberWire’s Chief Analyst, and our team of experts for an insightful discussion about the events of the last 90 days that will materially impact your career, the organizations you’re responsible for, and the daily lives of people all over the world.


Rick Howard: Hey, everyone. Welcome to the CyberWire's Quarterly Analyst Call. My name is Rick Howard, I'm the N2K CSO and CyberWire's Chief Analyst and senior fellow. I'm also the host of two CyberWire podcasts, "Word Notes" and "CSO Perspectives". And I just recently published a book based on the "CSO Perspectives" podcast called "Cybersecurity First Principles: A Reboot of Strategy and Tactics." But more importantly, I am also the host of this program, the "CyberWire's Quarterly Analyst Call", an exclusive for CyberWire Pro subscribers, one of the perks, you might say, of being a CyberWire Pro subscriber. And I'm happy to say that I'm joined by two brand new members of the CyberWire "Hash Table". This is their first appearance. So, everybody, be nice. The first is Chris Hughes, the president and co-founder of Aquia, a developer-centric consulting service that helps mostly federal agencies secure their digital transformation, and the author of a new book called "Software Transparency: Supply Chain Security in an Era of a Software-Driven Society". Chris, I'm about halfway through it, and it's excellent. And I can't wait to talk about it. So, say hello to everybody.

Chris Hughes: Hi, everyone. I'm excited to be here. I'm looking forward to the conversation. Thanks so much.

Rick Howard: And our second brand new "Hash Table" member is Vik Arora. He's the former CSO at the Hospital for Special Surgery in New York City. And he's also a contributing author to the book "AI and Cybersecurity Handbook for Healthcare Boards". Vik, welcome.

Vikrant Arora: Thank you, Rick. I'm happy to be here.

Rick Howard: All right. So, everybody, this is our 16th show in the series where we try to pick out the most interesting and impactful stories from the past 90 days. And, you know, try to make sense of them. And as always, there have been a lot of things going on. We could have discussed the Cisco acquisition of Splunk where, you know, on my bad days, say Tuesdays and Thursdays, I say to myself, "Hm, isn't Cisco the place where great companies go to die?" But, you know, on my good days, Mondays and Wednesdays, I say to myself, "What a great acquisition for an already amazing security orchestration platform." Or we could have talked about a slew of nation-state continuous low-level cyber conflict attacks from the likes of China, Iran, North Korea, India, Russia -- and get this -- Belarus. Or we could have talked about how international law enforcement agencies have had some success this quarter disrupting cyber adversary groups, like the FBI taking down the criminal marketplace called Genesis Market, like the Ukrainian police taking down a bot farm influence operation, and Ukraine's Security Service disrupting a network of illicit fund transfer sites in Kyiv, Kharkiv, Rivne, and Sumy. And like the joint Polish-US takedown of LolekHosted, a kind of criminal-to-criminal marketplace used for a variety of criminal activities. And finally, like the international coalition of the US Justice Department, the FBI, France, Germany, the Netherlands, Romania, Latvia, and the United Kingdom taking down Qakbot, a Russian cybercrime botnet that targeted critical industries worldwide. But we're going to start with a story that has dominated the news for almost an entire quarter, the MOVEit hack attacks, all right, and we're going to approach it from two angles.
The first is if you're a commercial company with a software product like MOVEit, how do you mitigate the risk for your customers, and second, if you're a customer of that kind of software, how do you protect yourself? So, Chris, this was your topic, how about you laying it out for us and tell us what the key points are?

Chris Hughes: Yeah, definitely. This is a really interesting one. We've seen a lot of, you know, conversation in the industry around software supply chain attacks over the last several years, obviously. But this is an interesting one because it's leading to a nationwide class action lawsuit that's been filed against Progress Software, which is the maker of MOVEit. And by the latest estimate, it's been recognized that over 16 million individuals were impacted by this data breach across, you know, I think it was 1,600 organizations.

Rick Howard: Sixteen million. Holy crap.

Chris Hughes: Sixteen million, yeah. Yeah, so it kind of emphasizes the spiderweb effect of the software supply chain attack where, you know, they compromise this one piece of software, this one vendor, and it has this massive kind of sprawling spiderweb impact across many organizations, hundreds of organizations, tens of millions of users. And basically what's happening now is this law firm, I don't want to try to butcher the name, but it's a consumer rights law firm that has brought forward a lawsuit against the organization, basically claiming that they've been negligent when it comes to protecting their software, producing software securely, and effectively notifying potential victims that have been impacted, particularly focusing on the design and maintenance of the software. You know, I thought this was interesting because we've seen a lot of discussion lately from organizations like CISA and others, you know, calling for secure-by-design, secure-by-default software. We've seen NIST publish their Secure Software Development Framework. So, a lot of discussion around software supply chain security. And here we have this, you know, this popular vendor used by, I think it's tens of thousands of organizations around the world, get hit with this incident that impacts them and now has this massive impact across tens of millions of people, and leads to a class action lawsuit. So, it's definitely bringing the spotlight to software supply chain attacks and kind of the downstream impact they can have across society.

Rick Howard: So, Chris, you read through the material, does it appear to you that those guys were being that negligent? I know that we don't have the inside details but does it sound like they were, you know, asleep at the switch there, or is this just some eager lawyers trying to take advantage of the situation?

Chris Hughes: Yeah, I'll be honest, I'm kind of interested to see how it shakes out because, you know, I don't have the inside details, but from what I can tell, they did have a vulnerability in their software. But as we know, who doesn't? Right. Who doesn't have a vulnerability that can be exploited out of their software? So, I'm curious what their claim of negligence is and then how it's going to play out when we look at things like, are they compliant with SOC 2 or ISO, you know, what kind of compliance frameworks have they complied with. And if so, when they've, you know, kind of done their due diligence on that front, you know, where's the gap? Like what did they do that's negligent? Was it, you know, their vulnerability disclosure practices and communicating things to customers downstream? You know, where did they go astray? Because I think it's going to set the tone for other software suppliers and product, you know, manufacturers, like, here's what you need to be doing, and if you don't, here's the risk. And then it kind of opens up the other angle of that, which is, you know, if I'm following X, Y, and Z, do I have safe harbor, how am I protected as an organization and a supplier from these kinds of lawsuits in the future as well?

Rick Howard: Well, the question I had though, so you know, we're what? Amazon released AWS in 2006 so we're just under 20 years into cloud services where most of us -- most of our commercial organizations, most of our government organizations, and most of our academic institutions -- are in some form software developers at this point, right? So, Chris, how do you protect yourself from this kind of thing? Are there things you can do internally that will make this kind of thing not a big problem or -- I mean, how do you think about that as a software developer yourself?

Chris Hughes: Yeah, I mean, I think -- I hate to say, it depends and there's no silver bullet. But as you know, from reading the book that you wrote recently, like there is no silver bullet, there's a lot of different things you need to do. Obviously, the first step is, like, you know, adopting secure software development frameworks and practices like OWASP SAMM or BSIMM or, you know, NIST's SSDF if you're producing software. But if you're consuming software downstream, you need to have those lines of communication with your suppliers so they understand when a vulnerability comes out if you're impacted. And maybe take mitigating measures like trying to segment the software, kind of review who has access to it, what data is stored there, and be prepared to respond accordingly. But it's a tricky situation because there's no one single answer to this kind of problem. And, you know, malicious actors have realized they can tug at one organization, or tug at one vendor, one supplier, and have this massive impact across the ecosystem. There's just economies of scale that are too appealing to them. So, I think we're going to keep seeing this happen.

Rick Howard: So, Emily, can you throw up the poll question? For those of you on the call, Emily is the Director of Marketing here at N2K. So, she's moving the dials and pulling the levers behind the scenes here. So, put the poll question up for everybody. And, Chris, this one has come to you: do you think that companies can use insurance to buy down the risk here? Is that something they can do just in case something like this pops up in the future?

Chris Hughes: Yes, it's definitely one of the recommendations that I make and, you know, obviously other folks make when it comes to best practices, but it also -- you know, there's some interesting activity there where some software, you know, or insurance firms, I should say, are starting to kind of have exemptions for nation-state activity, for example. So, what happens if you're a software supplier and a nation-state is involved, and now the insurance company maybe will or won't cover you? It creates a really interesting dynamic and scenario in that case. So, I think, you know, having insurance obviously is a great step. But, you know, like anything, it will come down to the fine print of what it will or won't cover and what kind of protections you have. But when you talk about negligence, you know, having insurance is obviously going to help you mitigate the risk of facing a lawsuit for being negligent as a supplier or a manufacturer.

Rick Howard: So, Vik, you know, working in the medical field, you're involved in not only protecting the enterprise but you guys are developing your own software too. Is this something that hit your radar screen when you were the CSO of that medical organization?

Vikrant Arora: Yeah, not directly but like Chris was alluding to, that's the problem with SaaS providers and vulnerabilities in their products that are internet-facing and that are popular, such as MOVEit, or Dropbox, or Accellion. Even if you're not using it, chances are that your third party or fourth party or somebody is using it. So, it becomes really difficult to reach out to all those parties and find out if they are impacted, what data you have with them, what's the extent of the compromise. It becomes a very tedious exercise, a lot more difficult than assessing an impact on your organization. In our case, we were not impacted but we went through months and months of literally reaching out to all the vendors and getting a response from them. This is just a replication of Log4j and SolarWinds, reaching out to everybody who's connected with you.

Rick Howard: Emily has reminded me that we're using a different platform this time than we have in the past. The poll itself is actually on the right side, okay, it's the third icon down from the bottom. It looks like a little graph, right? And we have three poll questions listed there. So, that's where you will answer those questions. And you can actually see the answers there. So, the question was, "Does your organization buy insurance to reduce the material impact of a cyber class action lawsuit?" And 60% of the audience said, yes, they do. So, that's a pretty high number. I didn't think that was going to be something that everybody did. I didn't even realize insurance companies were doing that. Chris, do you have experience with this at all? Is that something you normally say, "Oh, yeah, just go buy that insurance from insurance company X"?

Chris Hughes: Yeah, I think more and more people are starting to seek it out as a kind of risk mitigation measure. But the coverage, you know, from my experience, the coverage truly varies, you know. And then we've been seeing some big insurers, you know, especially in the EU, come out and even claim that, you know, cyber is uninsurable, I think was the quote that they used. It's definitely a unique situation because it's just such a systemic problem for us to get our hands around. When we look at, like Vik pointed out, the number of organizations involved, the number of vendors, suppliers, individuals, it's just a major, major problem.

Rick Howard: Well, don't get me started about what I think the insurance agencies are in terms of cyber, okay, that's another five-hour podcast that we can do. Let me go to the questions from the listeners. This one's from McCauliflower Culkin, which is a fantastic name, Chris. So, this is his or her question. "What are some things that software suppliers can do to mitigate the risk of" -- we've been talking about insurance, you talked about adhering to various compliance standards. Is there anything specific we can talk about?

Chris Hughes: Yeah, I mean, I think when we talk about software supply chain security and suppliers, we've seen a big emphasis, like I talked about recently, around NIST and their Secure Software Development Framework in particular, which, you know, rather than reinventing the wheel, which, you know, the government often doesn't do that, they tend to reinvent the wheel, this one went out and cross-mapped to existing frameworks like OWASP SAMM, the Software Assurance Maturity Model, or BSIMM, the Building Security In Maturity Model. You know, existing frameworks around secure software development, so SSDF is great for a start. And I think having controls in place to have that vulnerability disclosure program and communication process to your downstream consumers and being transparent can go a long way. You know, here's what happened, here's what we know, here's what we're doing about it. And keep that communication flowing so that people don't claim negligence or lack of transparency or communication. Because we've seen other incidents where the vendors aren't forthcoming about this even happening or exactly what happened or who was impacted. So, I think in addition to insurance, you know, that line of communication, adhering to those emerging frameworks around secure software development can go a long way toward kind of mitigating the risk, at least from a lawsuit perspective, and then also it just builds trust with your software, you know, consumers and your customers. You know, just being open and honest and transparent about the situation can go a long way.

Rick Howard: I got a question from Phil Neray in the chatbox, in the chat room. But Phil, I'm going to answer that later in my part, I'm going to talk specifically. His question was about the MGM stuff and the SEC's rules about filing 8-K reports for this kind of thing. I'm going to talk about that in my part. So, just hold that. We've got another question, Chris, for your point here from Khaleesi's Fourth Dragon. "Are there protections in place, you know, from the government, let's say, either state, city, or feds, are they being discussed to protect vendors from this kind of liability? Because it does seem like, you know, anybody can get in this kind of mess if they're not too careful."

Chris Hughes: Yeah, actually if you go look at the latest national cybersecurity strategy that was published, this is where they start discussing the term and concept of safe harbor, of providing safe harbor to, you know, manufacturers and suppliers. Because if, as a society, you know, as government entities are going to start to call on these folks to adhere to more rigorous standards and compliance frameworks and things of that nature, there has to be some kind of protection in place to say, hey, I've done all of the things you've asked. I've done my due diligence. But as we know, things will still happen. So, there has to be some kind of safe harbor or protection. So, if you go dig into that national cybersecurity strategy, you'll see that they're talking about, not only shifting the onus to software suppliers and manufacturers rather than downstream consumers and customers, but they're also, you know, conversely discussing how to provide safe harbor for those software suppliers and manufacturers. You know, what that looks like is still TBD. But I think there's at least a good amount of discussion going on publicly and dialogue around this topic. So, I think we'll see it keep evolving.

Rick Howard: Well, Vik, you know, as well as I do, you just pay attention to what's going on in government and cybersecurity a lot. There's always a lot of talk about this kind of thing but the chances that something substantive comes about that protects companies, that usually takes years to do, right? What do you think? What's -- are you going to make a forecast for me about whether or not we'll see a law that protects companies from this kind of thing?

Vikrant Arora: I love making forecasts. Nobody comes back and checks them unless they are wrong.

Rick Howard: We're going to come back, you know, next year and say, "Vik got this wrong." All right.

Vikrant Arora: I do have a directional trend that I'm seeing. It took a long time for medical device manufacturers -- the same thing in the healthcare space -- who have been struggling with finding out what software, firmware, and hardware is on the medical devices. Most are legacy, unsupported, third-party software. And finally, in March of last year, the FDA released a new guidance, they call it the Refusal to Accept. They always had it there, but they added some more clauses to their Refusal to Accept, saying that if the medical device manufacturer has not followed the practices that Chris was talking about, the SDLC, and is not looking out for vulnerabilities when the device is in production, and on top of that, is not providing patches for vulnerabilities that are actively exploited, it squarely puts the burden on the device manufacturers. And if they don't meet that, they will not be able to get their product to market. So, I see a similar trend happening for other SaaS providers because it's the same -- under the hood, it's the same thing, it's a bunch of software playing together. Things like SQL injection apply to everybody, whether it's a MOVEit-type company or a medical device that has a database. So, I do see it as a trend in that direction. And I'm watching that very closely.

Rick Howard: So, Vik, it seems to me like, you know, protecting yourself from a lawsuit like this, a class action lawsuit, is very similar to how you protect yourself from compliance violations, right? And you tell me if I got this wrong, but you basically establish what you think you will do in terms of a crisis, demonstrate that you did that thing during the crisis, and you should be okay. I was listening to an interview with the Uber CSO who got into hot water like last year or right before the pandemic started because, you know, they had a breach and it looks like they didn't follow their own best practice. So, if you can do those things, Vik, is that your experience, that if you demonstrate that you followed your plan, you should be okay here in terms of class action lawsuits?

Vikrant Arora: As long as the plan is robust and based on industry best practices and what the courts call what a reasonable person would follow. So, as long as the plan is robust and you have documentation to show the plan is operationalized -- I'm not a lawyer, but being a reasonable person, I can say that should be okay.

Rick Howard: Well, being a reasonable person and a lawyer, those two things don't go together, I don't think, right? And I know we have lots of lawyers listening to the podcast, that was not a shot at you guys, I'm sorry. Chris, the last word for this topic, for your side of this argument, anything you want to convey to the audience about this particular piece?

Chris Hughes: I definitely think, you know, depending on who's listening, where you are in the software supply chain. If you're a supplier, definitely kind of re-evaluate, you know, your software development practices, your lines of communication with customers, your vulnerability disclosure programs and processes. Like Vik talked about, your overall kind of cybersecurity strategy and incident response plan, things of that nature, to kind of protect your organization, you know, from these kinds of activities moving forward. And if you're a consumer, start to ask these questions of the people that you're buying software from, you know, because you need to know, are they following these frameworks, what kind of processes do they have in place, and even bake these things into your procurement or acquisition language when you work with a vendor or a supplier. You know, so you can hold them accountable if you need to.

Rick Howard: Well, that's a great transition from the first side of the argument, which is I'm in a company that makes software and we're getting hit by a class action lawsuit, to the other side, Vik, which is your topic: as a customer purchasing services from software companies, how do you protect yourself from being a victim of this kind of thing? So, take it away, it's all yours.

Vikrant Arora: Yeah, thanks, Rick. And at a high level, what I wanted to do was open the conversation by separating the software supply chain attacks from the zero-day vulnerabilities that get exploited with SaaS providers. There's a lot of media hype and they all get lumped together as if these are all software supply chain attacks. If we really want to be careful and effective -- in common practice, it's considered that a software supply chain attack is when you have software and it's getting updates and instead of getting legitimate updates, you end up getting malware from the vendor. The vendor's distribution channels have been compromised and you're getting bad water from the well, for lack of better words.

Rick Howard: Like SolarWinds, right? Or Log4j, right?

Vikrant Arora: Log4j, SolarWinds, or even 3CX, and even NotPetya, when the Russians infiltrated the update channel, it's the same thing. But what we are seeing an increase in is where you have a public-facing website such as MOVEit and there's a vulnerability with that, a zero-day vulnerability, and that's getting exploited, and I as a consumer have data in that environment, and that's getting compromised because the vendor did not patch or was not careful enough to create code that is secure. I mean, any code can be broken into, but I just wanted to separate that. And I've been busy for the past three or four months chasing so many of these. We started off at MOVEit, going through the whole list of vendors. And thankfully, after Log4j we had to create like a complete vendor inventory, set up procedures in place to reach out to them, make them contractually obligated to respond to queries from us. And then it was Microsoft; thankfully, it was not as widespread as MOVEit. There were only 25 customers that were impacted and it was a flaw with their webmail system that they quickly fixed. They're not calling it a zero-day, I'm still waiting to see what they end up calling it. And then we had Ivanti; they made acquisitions like MobileIron, and their public-facing software was also being exploited. So, that's what's been keeping me busy and I just wanted to make that separation to begin with.

Rick Howard: So, you have a bunch of ideas about how consumers of these kinds of services can protect themselves. So, what's your most impactful one that you would recommend to this audience?

Vikrant Arora: So, essentially, it all starts with a good business associate agreement, that's what we call it in health care; if you have a vendor that's dealing with patient information, make sure you have a business associate agreement. If it's not a BAA, make sure you have a contract. The contract needs to have the data security terms that you expect a vendor to comply with. That's like table stakes; you've got to have that whether it's software that you're using or a SaaS provider. After that, I strongly recommend, and I think we have moved away from it a little bit, threat modeling. People have gotten used to implementing static controls -- hey, I put in a firewall, put in an EDR -- and think that you're good. I'm asking everybody to do a threat model because the threat coming from a software supply chain attack is very different from a SaaS vendor getting compromised. So, do your threat modeling, figure out what your true risks are, and then implement controls that minimize that risk. For example, for a software supply chain attack, making sure the vendor is digitally signing their code is a good control, but this control doesn't apply to something like MOVEit. In the case of MOVEit, after you have done your threat modeling, you want to make sure you have principles of zero trust. You're only giving access to people who need access to that environment. The other thing you should do, you should limit the amount of data that is stored there, and that's a huge operational flaw. We end up using technologies for purposes beyond their intent. It's a file transfer solution. It's not file retention software. If you keep retaining data from years and years, then, in fact, the impact is going to be a lot worse. And the last thing I would recommend is to always, obviously, assume breach, because criminals will always get in if they're really interested.
So, make sure you have standard business continuity and disaster recovery procedures and incident response plans, both internal and on the side of the vendor. And you test them, at least for your crown jewels.
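[To make Vik's point about retention concrete: a file transfer solution shouldn't become a retention system, and a simple retention sweep can enforce that. The Python sketch below is a minimal illustration only; the file names, dates, and the 30-day window are hypothetical assumptions, not details from the MOVEit case.]

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: transferred files should not linger past 30 days.
RETENTION = timedelta(days=30)

def stale_entries(entries, now):
    """entries: iterable of (name, last_modified datetime) pairs.
    Return the names of files that have exceeded the retention window."""
    return [name for name, mtime in entries if now - mtime > RETENTION]

now = datetime(2023, 9, 28, tzinfo=timezone.utc)
entries = [
    ("q2-claims-export.csv", datetime(2023, 5, 1, tzinfo=timezone.utc)),   # months old
    ("todays-transfer.csv", datetime(2023, 9, 27, tzinfo=timezone.utc)),   # fresh
]
print(stale_entries(entries, now))  # → ['q2-claims-export.csv']
```

In practice this kind of sweep would run against the transfer platform's file listing API and feed a delete-or-archive workflow, so a compromise of the platform exposes only recent transfers rather than years of accumulated data.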

Rick Howard: So, I like that laundry list of things to do, right, all those will have some impact on reducing risk. Vik, what do you think the -- which one would you do first? What's the most impactful, do you think?

Vikrant Arora: Threat modeling and risk assessment.

Rick Howard: I understand it, right, what you said. I understand exactly what you're protecting against, right, yeah, I totally get that, yeah. Chris, I'm going to come to you as the resident software developer here. One of the tools of the future that we really don't have yet that could combat supply chain risk -- not the thing that Vik was talking about, but supply chain risk -- is some form of SBOM implementation, Software Bill of Materials. Can you maybe explain -- I'll put you on the spot here -- explain what SBOMs are and why this might help in this regard.

Chris Hughes: Yeah. I'll do a little SBOM 101, I guess, and hopefully, I don't butcher it for the SBOM experts out there. So, think of an SBOM as an inventory of your -- and we've heard, you know, for a long time we need software asset inventory, right? But when we think of that, we think of like a product, like Microsoft Office or whatever, insert X. An SBOM essentially is a more granular inventory of a piece of software that shows you all the components that are in that software, whether you're talking proprietary first-party type of code or third-party open-source software. For example, we've seen, you know, now modern code bases are 60 to 80 percent open-source software components. So, an SBOM is going to show you, you know, for example, do we have Log4j in this piece of software? You know, and then when we talk about the software supply chain conversation, we see this, you know, this need for transparency. As a software supplier, I should be able to tell you what's in my software that I'm selling you. As a consumer, when I have an incident like Log4j, without an SBOM, without an inventory of what's actually running in my enterprise beyond just the name of a product but the actual software components that are inside of it, I can't possibly respond because I have no idea where it's running in my ecosystem. I have no idea what suppliers are using it in their products. And we've seen the SBOM conversation, it initially kind of got traction in an agency called NTIA, and then from there, it moved over to CISA, the Cybersecurity and Infrastructure Security Agency. That effort is led by a fellow named Dr. Allan Friedman and some others over there. There's two major SBOM formats. There's CycloneDX, which is, you know, from the OWASP Foundation, and then there's SPDX, the Software Package Data Exchange.

Rick Howard: I just want to say I'm pretty impressed you pulled all those acronyms over there. So, that's good. On a cold question that I did not warn you about. Nicely done.

Chris Hughes: SPDX is from the Linux Foundation, and that one is actually an ISO standard now too. So, you have these -- I don't want to say two competing standards, but two industry-standardized formats for SBOMs. And a lot of tools can kind of read both or create both, you know, at least increasingly so they can. So, as a software supplier, me creating this artifact, say I sell you a piece of software, I also provide this inventory of what's in the software to you in the form of an SBOM. And if I'm a consumer, I write into my acquisition and procurement language, etc., as well as asking the vendor, hey, I want to buy your software but I also want an SBOM so I understand what's inside the software, what known vulnerabilities are associated with the components in the software, you know, what components are used and are they from another nation that I don't trust, for example. You know, it gives you that transparency of the granular inventory of a piece of software that we just haven't had historically. It's kind of amazing that we've gotten as far as we have.

Rick Howard: Yeah, it really is.

Chris Hughes: Without anyone saying, "Hey, I want to know what's in the software." Like they just buy it and use it but they never actually understand what's inside of it.
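[For readers who want to see what Chris is describing concretely: below is a minimal sketch, not a production tool, of a CycloneDX-style SBOM fragment and a few lines of Python answering the "do we have Log4j in this piece of software?" question against it. The component names and versions are illustrative assumptions; real SBOMs are generated by build tooling.]

```python
import json

# A tiny CycloneDX-style SBOM fragment (hypothetical application contents).
SBOM_JSON = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "log4j-core",
     "group": "org.apache.logging.log4j", "version": "2.14.1"},
    {"type": "library", "name": "jackson-databind",
     "group": "com.fasterxml.jackson.core", "version": "2.15.2"}
  ]
}
"""

def find_component(sbom: dict, name: str) -> list[dict]:
    """Return every component in the SBOM whose name matches."""
    return [c for c in sbom.get("components", []) if c.get("name") == name]

sbom = json.loads(SBOM_JSON)
for c in find_component(sbom, "log4j-core"):
    print(f"{c['group']}:{c['name']}@{c['version']}")
    # → org.apache.logging.log4j:log4j-core@2.14.1
```

This is exactly the query an organization could not answer at scale during Log4Shell without SBOMs: the product name alone never revealed which components were inside.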

Rick Howard: So, I want to be clear that SBOMs aren't really a thing yet, they're more of an idea with developing standards, and we're probably years away from having it, but it's the solution in the future. But, Vik, I want to make it clear. SBOMs protect against supply chain attacks and you said at the top that that's not what this thing is, right, this was exploitation of the software, right? So, I want to direct the audience over to the next poll. It is, "Why do you think exploits against file-sharing sites are surging?" So, you guys can answer that question right now. But, Vik, what do you have to say about all that?

Vikrant Arora: That's exactly it, because if I'm consuming something as a service, I'm not directly interacting with the software. I'm expecting the vendor to make sure that all the software is properly designed and protected. I just need to make sure that when my audience or my stakeholders are accessing that service, there's proper role-based access control, and the data is accessed on a need-to-know basis. And, of course, I also want to put in controls like, for example, in the case of MOVEit, having an IPS or a firewall and looking at those logs can help you after the compromise to determine if there's a deviation from normal behavior and react sooner to it. The other recommendation I will make is that in the case of zero-day patches and vulnerabilities, usually, the first patch is never enough. So, keep an eye out as soon as the first patches come out and tread with caution because sooner or later, another patch comes out, because the vulnerability or the extent of it is not fully known within days, sometimes it takes weeks. So, stay on top of it. Don't get tired. It is tiring, but make sure you have enough energy to keep following it through to the end and not leave it halfway there.
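[Vik's suggestion of watching logs for deviations from normal behavior is, at its simplest, a baseline-and-threshold check. The Python sketch below illustrates the idea with a z-score over hypothetical hourly request counts to a file-transfer endpoint; real IPS/WAF analytics are far more sophisticated, but the core idea of comparing current activity against a learned baseline is the same.]

```python
from statistics import mean, pstdev

def is_anomalous(history, current, z_threshold=3.0):
    """Flag the current count if it deviates from the historical baseline
    by more than z_threshold standard deviations."""
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        # Perfectly flat baseline: any change at all is a deviation.
        return current != mu
    return abs(current - mu) / sigma > z_threshold

# Hypothetical hourly request counts during normal operation.
baseline = [120, 135, 110, 128, 140, 125, 130, 118]
print(is_anomalous(baseline, 131))   # a normal hour → False
print(is_anomalous(baseline, 2400))  # a mass-download spike → True
```

A check like this, run against the transfer platform's access logs, is what lets a defender notice a MOVEit-style mass exfiltration hours rather than weeks after it starts.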

Rick Howard: And just as an aside, I'll recommend a book for everybody: "This Is How They Tell Me the World Ends" by Nicole Perlroth. It is the history of the exploitation market in cyber, from the early days in the early 2000s until modern times. If you're interested in how all this happens on the underground, that is an excellent, excellent book. And the answer to the poll, Vik, and I don't know if this surprises you or not, is that most of the audience, 87%, says it's because these sites are exposed to the internet and store sensitive information, and that's the main problem with this. I thought it was going to be a lack of user awareness and education. I don't know. What did you think it was going to be?

Vikrant Arora: That's what I would have expected. That's what hackers and criminals are seeing and that's what they are going after. I mean, it's like the third file transfer product that has been compromised. Accellion had it previously, and they're Kiteworks now; then GoAnywhere was exploited; now it's MOVEit. We're just waiting for the next shoe to drop.

Rick Howard: Yeah, I agree. So, we've got a question from Miss Piggy's Dimples. I'm going to assume Miss Piggy is a female. She asks, "What are some of the steps customers can take to proactively manage the risk of zero-day vulnerabilities being exploited in commonly used third-party software?" I think we talked a little bit about this, but can you put a fine point on it, Vik?

Vikrant Arora: Yeah. Again, make sure that these products are being used exactly how they are intended. Do not store data in them if it's a file transfer service. Make sure you have enough contractual provisions with the vendor so they alert you to what the impact is because, at the end of the day, it's your responsibility. And in terms of prevention, you've got to make sure you have intrusion prevention and web application firewalls in front of them. Even if it's a zero-day attack, some of these newer technologies can detect deviations in patterns and block them. That was the whole idea behind intrusion prevention and web application firewalls, that we're moving away from signature-based detection. So, you have a small chance that a zero-day will get prevented. Make sure they're protected. And then you have the response that I already talked about.
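[Editor's note: to make Vik's point about moving away from signatures concrete, here is a minimal sketch of the kind of deviation check an IPS or anomaly-based WAF performs: learn a baseline of normal traffic volume and flag anything far outside it, with no signature for any specific exploit. The thresholds and traffic numbers are invented for illustration.]

```python
from statistics import mean, stdev

def is_anomalous(baseline, observed, threshold=3.0):
    """Flag an observation that deviates from a learned baseline.

    baseline: a list of, say, requests-per-minute counts taken from
    normal traffic. observed: the current count. Returns True if it
    sits more than `threshold` standard deviations from the baseline
    mean -- no signature of any particular exploit required.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(observed - mu) > threshold * sigma

# Normal traffic hovers around 100 requests per minute...
normal = [97, 103, 99, 101, 100, 98, 102]
print(is_anomalous(normal, 101))   # ordinary load, within baseline
print(is_anomalous(normal, 5000))  # a mass-exfiltration-style spike
```

Real products build far richer baselines (per-endpoint, per-user, request shape, not just volume), but the design choice is the same one Vik describes: model normal behavior and alert on deviation, which gives you a fighting chance against a zero-day that no signature has ever seen.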

Rick Howard: So, we've got a question from Well-Endowed Penguin. Well-Endowed Penguin. I think this is going to generate some controversy. So, the question is, "Will AI help threat actors develop more zero-day exploits?" And, Vik, what do you think about that?

Vikrant Arora: The short answer is no. Here is why I think so. In spite of all the hype we have about AI solving everything and solving all problems across all industries, the way at least the current models work, they are only reproducing what they have learned up until a certain point and resurfacing that knowledge in a very structured manner for easy and quick consumption. By definition, a zero-day exploit is something that was not known. So, it's very difficult for AI in its current stage to come up with a new piece of information. The models are very good at reproducing previously known information in a nice manner, but a zero-day essentially means nobody knows about it. And I think it still requires humans to some degree to figure out what's not known. AI is, in my opinion, not there yet. So, it couldn't cause the surge.

Rick Howard: Chris, I'd be interested to hear what the resident application developer thinks about that answer. Are you going to come down the side of Vik or do you disagree with him?

Chris Hughes: No, I mean, I kind of agree in the regard that like as he pointed out, the models are trained on pre-existing knowledge so often, you know, it's going to use information that it knows exists. But that said, this is an evolving space, it's moving really quickly. And I think it's, you know -- I think as defenders, we need to be just as quickly looking to use AI for defensive purposes as malicious actors are for offensive purposes. If not, you know, they're going to find a way to really manipulate it and weaponize it, and we're not going to be prepared to kind of defend appropriately.

Rick Howard: Well, let me be the contrarian here then because I don't think Vik is right here. So, I'll just say that upfront, Vik, right? It's not that they're going to come up with a new way to exploit code, but they can take existing methodologies like finding, you know, buffer overflows, let's say, and apply it to all the software that we don't know exists out there. That's where AI is going to come in handy is how do I write the buffer overflow for MOVEit, again, when nobody thought about it? I think that's going to be a fairly easy achievement for the ChatGPT models. I don't know. Tell me if I'm crazy, Vik.

Vikrant Arora: No, that I agree with. So, let me separate the two. Just like it's allowing security practitioners to scale more and do things faster, it will allow criminals to do the same. So, rather than being able to exploit one thing across one ecosystem, they can do it at scale. But what I was saying was that it cannot come up with a new zero-day.

Rick Howard: Yeah, that's probably true. Yeah, I agree with that. The last question from Barbie Breath. I think we answered a bit of this already, Vik, but, "What do you ask the vendors who are impacted here? What discussions are you having with them?"

Vikrant Arora: So, for vendors who were impacted, first and foremost, we want them to provide us with what the impact to our business was. Based on that, I need to go out and meet my regulatory requirements, I need to notify my customers, so can you please clearly determine what the impact is? That's the first order of business. The second is to give them room to remediate and come out of the crisis, because probably everybody is working 24/7. But the other thing I ask for, once things have settled down, is a complete root cause analysis, do you know exactly how it happened, so that they can close the telemetry gap; a complete forensics report of the situation at hand; and then being transparent and consistent with their communication throughout the process. So, I'm not asking "what's the update", they should be transparently communicating with us throughout the incident; at the end, give us the complete RCA and forensics; and in the beginning, provide us with a clear impact and adjust it as they go along. Those are the three things I normally ask for.

Rick Howard: So, we're going to close this topic. I think we covered both sides: Chris talked about the people that make this software, and your side was how you protect yourself against it. What's the takeaway, what's the Twitter line, what's the X line that listeners should know from all this discussion? I'm giving that to you, Vik.

Vikrant Arora: That's -- I don't know. I don't have a Twitter type of line just yet, but if there is one thing I would want to say, it is, like I said in the beginning, threat modeling.

Rick Howard: Understand the problem. Okay. Threat modeling sounds so hard but understand where the vulnerabilities are. I totally agree with that, right, because it's -- you outlined a subtle difference between the supply chain side and this other kind of risk. So, I totally agree with that. So, good stuff there. Which brings us to the third topic which is my topic and I'm going to talk about the US Securities and Exchange Commission's new rule on cyber materiality and reporting, they passed it this last July. It applies to all SEC registrants reporting under the Securities Exchange Act of 1934. Who knew that act was that long ago, right? It goes into effect right before Christmas of this year. At the top of the show, I mentioned that I published a book this past spring, and in it, I made the case for what I thought was the absolute cybersecurity first principle. Let me just lay it out here. Vik, this is my Twitter line for the absolute cybersecurity first principle. Reduce the probability of material impact. See what I did there? Material. Due to a cyber event in the next three years. And all of us can argue later if you think I nailed it or not. But assume that I did for a second. You'll notice that materiality figures strongly within it. And so, I don't know if I should feel proud that the SEC's current thinking strongly aligns with my own or if I should feel entirely misguided by the fact that a bunch of non-cyber experts like the SEC think that materiality is as important in cyber as I do. I'll let you guys be the judge of that. And like I said, in the "CSO Perspectives" podcast this summer, according to the Harvard Law School Forum on Corporate Governance, the landmark judicial definition of materiality was crafted by Supreme Court Justice Thurgood Marshall in 1976. Yes, that Thurgood Marshall. 
And for you youngsters out there that never heard of that guy, he was the guy that argued Brown versus Board of Education, the case that helped end segregation in US public schools. He founded the NAACP Legal Defense and Educational Fund and he was the first African American Supreme Court justice, serving for 24 years. He's a big deal. A true American. And one of the country's champions for civil rights. But he wrote in TSC Industries versus Northway that a fact is material if there is a substantial likelihood that a reasonable shareholder would consider it important in deciding how to vote, or a substantial likelihood that the disclosure of the omitted fact would have been viewed by the reasonable investor as having significantly altered the total mix of information made available. Hooh! That is a broad definition and it's something that all of us network defenders are going to have to contend with going forward. And as if on cue, the big noisy hacks of the summer were MGM Resorts International and Caesars Entertainment getting hit by Scattered Spider, or maybe another, newer group called Star Fraud, who, by the way, some people think were part of the crew behind the Colonial Pipeline attacks, as well as Clorox getting hit with a different kind of ransomware attack. This has all caused the leadership of those companies to file 8-K reports with the SEC. And for those that don't know, an 8-K report is a report of unscheduled material events or corporate changes. The report notifies the public of events including acquisitions, bankruptcy, the resignation of directors, or any kind of unusual changes in the fiscal year. And it's hard to know if the new SEC ruling caused this reporting by MGM, Caesars, and Clorox, even though the new rule doesn't go into effect until December. But you have to wonder, did this new rule cause this enthusiasm in reporting transparency to the SEC?
So, Chris, let me start with you, do you think this new SEC rule is a good thing or a bad thing?

Chris Hughes: I definitely think it's a good thing. We've seen, you know, increasingly -- you kind of touched on this in your opening remarks about, you know, companies being a software company or not. And I think we've seen this increased digitalization of every business, you know, that's using software and technology to draw business value and deliver value to customers and stakeholders. And coincidingly, I think we need that level of accountability and transparency when it comes to cybersecurity oversight of the organization having a robust cybersecurity strategy in place, having the right people with the right expertise in place, and then obviously disclosing things, you know, that would be of interest to shareholders, which includes an impact to their organization's technological environment. So, I think it's definitely a good thing in my opinion.

Rick Howard: So, that's the poll question that I want everybody to answer now, okay? Is this new SEC rule a good thing or a bad thing? And, Vik, that's the same question that I want to ask you, what's your opinion of this?

Vikrant Arora: I totally agree with Chris. I think it's definitely a step in the right direction. I always look at things from a time-scale standpoint. First, there is innovation in technology, and then a few years later, security and privacy are able to catch up to the innovation and prevent threats and risks to those innovations. And finally, regulation catches up to security and privacy, which is already behind innovation. So, I think it's a step in the right direction. We need to strive to define what the impact of a cyber incident is. That is needed to justify our investments in cybersecurity, that is needed to justify our responses to our stakeholders. So, I definitely think it's a step in the right direction. And it will foster the right level of conversations within all organizations.

Rick Howard: Well, as my role here, let me be the contrarian in the group, right? For 30 years, people like us have thought, geez, we should make organizations tell us if they've been hit. But I can make a pretty strong argument that there aren't that many attacks out there. I mean, it feels like there's a bunch of attacks because they hit the headlines every day. But in 2021, the FBI said that self-reporting companies reported about 5,000 material breaches. Now, there are about 6 million organizations within the United States, so 5,000 divided by 6 million, that's a really small number. Now, you can say there's a bunch of organizations that didn't report, I don't know, so raise it up to 25,000, even 100,000, that's still a small number. So, I don't know if reporting is the solution that we need here. But maybe. Okay. If you look at the poll results from our audience, they totally believe that I'm in the wrong there, 71% think the new SEC rule is a good thing. So, I guess that's just the way we're going to have to go with that, right? We did get a couple of questions about this from listeners that are on the call today. Let me see if I can find the question box. There it is. The first one is from Phil Neray from Gem Security, and it fits right in with what we're talking about. He asks, "What are your thoughts on the different approaches that MGM and Caesars used in how they filed their SEC cybersecurity disclosures?" Vik, Chris, any thoughts about that? It seems odd that those two big organizations would immediately file because they never would have in the past, but I could be in the minority there.

Vikrant Arora: Yeah, I haven't analyzed or looked at what they have filed, but I think that's what the regulation is designed to do, or that's what it's forcing the industry to do. So, it's in line with that. I don't have a strong opinion because I haven't looked at their filings closely --

Rick Howard: Yeah, I haven't read them either but it just struck me as odd that both of those big organizations did it. But, I don't know, Chris, any thoughts there?

Chris Hughes: Yeah. I'm with you guys. I want to go and dig into it, but I wonder if it's a preemptive activity with the recently adopted rule, to try to, you know, seem forward-leaning or more transparent, something along those lines perhaps.

Rick Howard: Yeah, I think so. Phil asks about more details versus fewer details in the filing. I haven't read any details so I don't know the answer. All right. But I think that what you're getting at, Phil, is that nobody knows what's important here. With the definition of materiality that Thurgood Marshall gave us, man, you could cover a lot of ground with very sparse reporting, or, you know, 100 pages of detail. So, I think lawyers are going to be spending lots of time figuring out what that means as they wrestle with this question going forward. Vik?

Chris Hughes: Yeah -- oh, go ahead, Vik. I was just going to say real quick I agree. I think that's a very broad definition and then the term reasonable, you know, shareholder, for example, what a shareholder finds reasonable in terms of being a material incident. Obviously, that's very subjective. People have different perspectives and different views on what may or may not be material or be of interest to them from a shareholder perspective. So, it's super subjective.

Vikrant Arora: Yeah, and it's also a lot of perception management. As a consumer, I'm not reading everything or trying to figure out whether it's material or not. Consumers, which are a big part of MGM's and Caesars' business, if they get a feeling that the organizations have been forthcoming and trying to disclose whatever they knew from the get-go, that influences their perception significantly. People remember the handling of the Equifax breach very differently from the handling of the Home Depot breach. And the difference was mostly in how the CEOs communicated about it and how transparent they were. So, I think it might be a step in that direction where they're managing their perception in addition to handling the specific software incident.

Rick Howard: We have another question from Nathan Altman in the same ballpark but with a slightly different take. Nathan asks, "Does mandatory reporting like 8-K reports reduce valuable cyber collaboration between government agencies and private companies?" You know, the public-private partnership is something we've been trying to make better since the early days. Does this put a wet blanket on that collaboration? Chris, what do you think?

Chris Hughes: Actually, I don't know why it would. I think if anything, it shows the organization is being more transparent, more forthcoming with information that occurred in the incident, and then honestly, it kind of broadens the aperture to bring the consumer, the shareholder into the conversation too, rather than just, you know, the federal government and the organization. You've now brought people who make investments in financial decisions based on these organizations' performance into the conversation too and let them have a kind of peek behind the curtain of what's happening in the organization and what may have been of interest to them and how they should allocate their capital accordingly.

Rick Howard: I totally agree with that. Vik, would you have something to add to that? Or because I'm reading your facial expression.

Vikrant Arora: I agree with that.

Rick Howard: Yeah, I think that's pretty good. So, my takeaway, you know, the X takeaway that I asked Vik and Chris for on their topics, is this: this is something that we've been asking the government to do for us for, I don't know, 30 years. Well, we got it, all right, and no one's really sure how it's going to play out or what's going to manifest from it. More to tell. I'm sure it's going to pop up going forward from here. But it is a really interesting development. And like I said, we're going to be paying lawyers lots of money to figure this out in the near future, I'm sure. We're going to wrap this whole thing up with some general-purpose questions. We always ask the audience to send in questions when they register before the show starts, so let's start with the question from Rootin-Tootin Putin. Chris, it's specifically for you. He said, "When I heard that you were coming on the show, I was excited." He said your book "Software Transparency" has been on his bookshelf to read for a while. He asks, "What's the thumbnail? Why should I feel excited about reading that book?"

Chris Hughes: If I had to put it in a bottle, I guess, and summarize it, I would say that this is an increasingly used attack vector for malicious actors, and they've realized the efficiency of it. And if we don't get our hands around it or understand it as organizations, enterprises, security professionals, we're going to be caught blindsided. So, I tried to lay out in there the history of these kinds of attacks, what the future looks like, what regulatory activities might be in play, and then what kinds of technical and procedural solutions can be used to mitigate the risk too. So, I think you'll find it interesting.

Rick Howard: Well, like I said at the top, I'm about halfway through and it's really good. So, I recommend it. I'm not even done with it yet so well done, Chris. We had a second question from the King of Dairy Queen. Okay. So, and not to be outdone by the first question but he wants to know about Vik's book, "Trustworthy Technology and Innovation in Health Care." That's not the title of it, all right, it's something else.

Vikrant Arora: Yeah, the title of the book is -- it's called "The Cyber and AI Handbook for Healthcare Boards". And this particular book is designed to be a reference guide for healthcare executives and boards so that they can better understand cyber risk, and not only that, they can use cyber risk to influence technology decisions such as AI and duly fulfill their responsibility. The book draws on more than 150 years of joint experience from a diverse group of healthcare and security practitioners under the leadership of Sherry Dugal, who's the series editor, with Taylor & Francis as the publisher.

Rick Howard: Well, I did not know about the book before we came into the show, so it's on my to-read shelf, Vik, so congratulations to you for getting that done. The third question we get is from Kiss My Axe, let me pronounce that correctly because I can get into a lot of trouble there. This one's directed at me. It says, "I heard you present at the DreamPort Security Conference last week on first principles. You mentioned that you have added an additional first-principle strategy since the book was published, a strategy that didn't make it into the book, something called workforce development. Can you explain what that is and why it didn't make it into the book?" And this is great, guys, because I want to get your take on it anyway. Amazingly, I wrote a book on cybersecurity first principles and didn't even touch the idea of the workforce that's going to deploy all of these great ideas that I had in my book. Strategies like zero trust and intrusion kill chain prevention and resilience and automation, and a bunch of tactics, we mentioned some of them here, SBOMs for instance, as figured out in the book. But as we were finishing the book -- we launched the book at RSA, okay, the conference in the spring -- as I was flying home, I happened to watch the movie "Moneyball". Are you guys fans of the Brad Pitt movie? Okay. I love the movie, and I love the book by Michael Lewis. He's the guy that wrote "The Big Short" and "The Blind Side", and a bunch of other things you've heard of. Okay. But he was talking about how the Oakland A's general manager, Billy Beane, back in 2001 managed his low-salary team. Their payroll for that year was like $40 million. Their nemesis, the New York Yankees, had a $120 million payroll. So, they really couldn't compete in buying players. All right. And that year, their three superstars got poached by big high-payroll teams, so they had nobody on their team going into the 2002 year.
So, he refocused his hiring plan based on one first principle, by the way. The one metric that he used was get on base. Not fielding, not pitching, not home runs, he wanted players who could get on base regularly. And since no other major league baseball team used that metric to hire players, he could get those players relatively cheap. So, you know, instead of paying $7 million for a superstar shortstop, he could pay a fraction of that for three players who had really high on-base percentages, essentially building the superstar in the aggregate. And long story short, they won one more game that season than they did the previous year, and they've been to the playoffs 10 of the last 20 years. So, clearly, that system works. And the takeaway that I got from watching the movie was, we don't do that in cybersecurity hiring. We don't build teams, we hire superstars. Right. We hire superstars with no idea about how to make the team better. Usually, we send people to training as a perk. You know, Kevin did a good job on that project last year, let's send him out to Black Hat as a reward. But with no thought about how we might make the team better. So, if you choose a strategy from my book, let's say it's zero trust, you should be hiring people who can help you accomplish the tactics that will buy down risk for your organization. And what that means is you maybe don't need the superstar when you're hiring. You can go get the brand new college graduate with no experience yet but who is hungry to solve problems, and give them the one or two tasks that would improve your ability to deploy cybersecurity. Anyway, I was figuring all this stuff out after I published the book. So, that will definitely be in edition two if I ever get around to writing it. But, Chris, let me come back to you, tell me I'm crazy here, that we should be using a Moneyball approach for building our cybersecurity teams.

Chris Hughes: No, I like what you said a lot. And if you read books about DevOps, for example, we'll talk about the hero syndrome of having that one individual that does a lot of heroics to keep things functioning and successful and that's not sustainable, right, you need a team of people. And I like what you were kind of pointing to, it's like not only spreading the wealth in terms of the workforce and hiring a broader team or a more robust team versus one individual as a superstar, I think we also as an industry often try to buy tools, not realizing that people need to operate, maintain, and manage these tools to make them effective. So, having, you know -- maybe you don't have the shiniest, latest vendor or software solution, etc. But you have a team that's capable and knowledgeable, and understands your organizational environment and your requirements, and can actually be impactful. So, I think that's definitely a good approach. I like it a lot.

Rick Howard: So, Vik, I know you and I hired lots of people in our careers, right, but what do you think about this? I always had that one question at the end of the interview, you know, you ask the potential employee, you know, if they're a good fit culture-wise, do they have the general knowledge that you want to do the job? But the question I always ask at the end is what are you running at your house because if you haven't built a Linux box on your own, you're not smart enough to be on my team. It's not that you have to know Linux, okay, I just need to know that you're hungry enough to go figure things out. And I wonder if you have that kind of similar thing you do when you interview people for your jobs.

Vikrant Arora: Yeah, absolutely. And I think that is the right approach, when you hire somebody based on their analytical and problem-solving capabilities. Like you said, are they hungry to learn more, what's their character? If you hire for those in addition to the skills you need, and not just for the skills you need, you can go a lot farther, because most of the foundational cybersecurity fundamentals have not changed. I mean, encryption, role-based access control, now we call it zero trust, at the end of the day it's the same thing, public key cryptography. So, having that basic foundational knowledge, coupled with the right personality traits, can take you a lot farther than hiring a god-like person for threat research who's been reverse engineering for the past 20 years and is only one of 30 individuals in the country. That's just not scalable, let alone affordable. So, I agree with that. Hire people for personality and foundational knowledge, and let them grow. And the person who grows in the company also ends up being a lot more loyal to the company because you are then part of that person's personal growth story. And there's nothing more valuable than being that, as a mentor, as a manager, as an organization, allowing a human being to grow rather than just hiring successful people and hoping they'll make you even more successful.

Rick Howard: I think it's definitely a change in mindset too because now you can hire the kid coming right out of college. And you can grab those government workers transitioning out of the government space who want to try cyber as their new career. They have no experience, but they have all this potential, this hunger to be good at their jobs, and you don't give them the entire zero trust program, you give them the one piece, SBOMs say, that we need to figure out and let them go with it, as long as they're not afraid to try new things. Chris, you were nodding, I think you agree with me on that, right?

Chris Hughes: Yeah, I mean, as we all know, it's a very complex ecosystem to do cybersecurity right, no one individual can do it all. Of course, you need someone that has that big strategic picture and view, but it's going to take a team that's tackling this from multiple angles and have various areas of expertise in deep domain knowledge and different things. So, yeah, it's spot on.

Rick Howard: So, if I ever get around to writing the second edition of the book, we will definitely have a chapter for workforce development and Moneyball just so I can talk about Brad Pitt and, you know, give me an excuse to do that, which I love. So, we are at the end of this, everybody. On behalf of my colleagues, Chris Hughes and Vik Arora, thanks for participating and we'll see you all at the next CyberWire Quarterly Analyst Call which is around December. So, Vik, Chris, thanks, guys. And we'll see you next time. Thanks, everybody.

Chris Hughes: Take care, everyone.

Vikrant Arora: Thank you.