Jun 29, 2023

CyberWire Live - Q2 2023 Cybersecurity Analyst Call

There is so much cyber news that, once in a while, all cybersecurity leaders and network defenders should stop, take a deep breath, and consider exactly which developments were the most important. Join Rick Howard, the CyberWire's Chief Analyst, Simone Petrella, President at N2K Networks, and Alan Berry, CISO at Centene, for an insightful discussion about the events of the last 90 days that will materially impact your career, the organizations you're responsible for, and the daily lives of people all over the world.

Transcript:

Rick Howard: Hey, everyone. Welcome to the CyberWire's "Quarterly Analyst Call." My name is Rick Howard, the N2K CSO and the CyberWire's chief analyst and senior fellow. I'm also the host of two CyberWire podcasts, "Word Notes" and "CSO Perspectives." And I just recently published a book based on the "CSO Perspectives" podcast called Cybersecurity First Principles: A Reboot of Strategy and Tactics. But more importantly, I'm also the host of this program, the CyberWire's "Quarterly Analyst Call," normally reserved for CyberWire Pro subscribers. But this month, Emily Bradford, the producer of this show, and I, we've opened it up to everybody so they can get a taste of this thing. And I'm happy to say that I'm joined by two new guests to the CyberWire Hash Table. They've never been on the show before. The first one is Alan Berry, the CISO at Centene, and he's having some security technical problems on his end since he's the CISO there. They're not letting him use GoToWebinar, so we're having to have him on the call as a phone caller. So like you said, Alan, it is your best side. Say hello to everybody.

Alan Berry: Thanks, Rick. It's a pleasure to be here and hello, everybody.

Rick Howard: And we have Simone Petrella, president at N2K, and a note for all the listeners out there. Simone is my boss, so you all better be on your best behavior, okay? I'm just saying. Hey, Simone.

Simone Petrella: Hey. How's it going, Rick? The listeners can send me a note separately about you later.

Rick Howard: And they most likely will. So this is our 14th show in the series where we try to pick out the most interesting and impactful stories from the past 90 days and try to make sense of them. And since the last show, there's been a boatload of things going on. We could have discussed the US Securities and Exchange Commission, the SEC, filing a Wells Notice naming the SolarWinds chief financial officer and chief information security officer over the 2020 supply chain attack that involved a compromise of the company's Orion software platform. So just for those who don't know, a Wells Notice indicates that the SEC believes that these two may have broken US federal securities laws in the form of cybersecurity disclosures. Or we could have discussed the Europol takedown of the EncroChat encrypted phone network that included, get this, 6,558 arrests. And the seizure of 740 million euros in cash, 150 million euros worth of physical assets, 103 tons of cocaine, 163 tons of cannabis, and 3.3 tons of heroin. That's right, I said tons. They also got 971 vehicles, 271 homes, 923 weapons, 83 boats, and 40 planes. So I think that the three of us have chosen the wrong profession. I don't know what you guys think about that.

Simone Petrella: You know, I've also said, Rick, that I chose a profession where I don't get kickbacks and this is evidence that I chose wrong.

Rick Howard: Exactly right. Or the third story we could have talked about is the decision by the New Jersey court that ruled that Merck may be entitled to a payout from their insurers following the 2017 NotPetya attack. Merck's insurers disputed a payout of about $1.4 billion on the basis of the hostile warlike action exclusion clause that they have in their policies. But the New Jersey appeals court said that the exclusion clause should not apply to a nonmilitary affiliated company despite the nature of the attack's origin. And you can expect appeals later on down the road. This is not over yet, right? But, Simone, we have something else entirely in mind for your topic. What is your first most impactful cybersecurity story for this quarter?

Simone Petrella: Yeah, so I would love to point out and shift out of the commercial sector for a second. The Department of Defense rolled out, over nine years in the making, an update to their DOD cybersecurity workforce strategy, which includes their update to a manual for DOD Directive 8140. And for those who are not familiar with what DOD 8140 is, it is the DOD directive that provides the qualification requirements for DOD military, civilian, and contractor personnel to serve in noted cybersecurity functions and roles. And that has been something that has been sort of hanging out there. It was literally last updated in 2015, and there has been a movement around how is it going to become a more role-based requirement. How are they going to tackle the 25% vacancy rate that exists across the DOD? And this particular strategy as it has rolled out is laying out some items to address four specific challenges. The first is that DOD components don't have consistent criteria to identify their workforce requirements and this aims to actually fill that in for them.

Rick Howard: I'm shocked. Shocked, I say. Shocked that that doesn't happen, okay.

Simone Petrella: As a side note, you know, I started my career in the DOD. I have been through I think three different versions of attempts to classify cyber jobs in the DOD through massive spreadsheets. So I'm not going to- I'll reserve judgment on what that means, but yes, that is a very shocking revelation. The second is that they are using this to identify better ways to find candidates to fill these roles, and hopefully, open the aperture. The third is that they acknowledge that there aren't enough tools that exist in the space today to assess the current capabilities of the employees and the workforce that exist in those roles today. And the fourth is that they are proposing this new strategy to address attrition among the most highly skilled portions of the workforce. 'Cause as you can imagine, the private sector has done a really good job over the last ten years of providing some reverse brain drain for at least two of us out of three. I don't know --

Rick Howard: I've been guilty of that, yeah.

Simone Petrella: Yeah. Although, they --

Rick Howard: "Oh, you work for the government? You should come work for us and make a lot of money." Yeah, that's true.

Simone Petrella: Right. So, you know, interestingly enough, though, there's a couple of things I want to point out about this story as I was tracking it. The DOD chose to develop a framework, and they used definitions of roles that are similar to the NICE cybersecurity workforce framework, but they actually chose, somewhat surprisingly, not to leverage it. They decided to create their own.

Rick Howard: That's so typical of- that's so typical. In my military career, we did that all the time. "Oh, your solution is not good enough for me. We are special. We're not gonna use somebody else's work." Oh, it just kills me. All right, anyway, I agree.

Simone Petrella: Right. But, you know, to their credit, where they have, I think, really accomplished something that takes what is any workforce framework and moved it to the next level is they have put tangible requirements in place. So they've discretely identified each of those roles and then mapped a matrix around what are the foundational experiences, education, or training, or credentials that are required for someone to serve in those roles. So it has kind of advanced how you can execute on it in a way that, you know, the existing NICE framework had not. They probably didn't need to recreate all the roles themselves and their definitions themselves. They could have kind of built upon it but they chose to do it themselves. And I think this is part of the reason I picked this out of the last 90 days as being so critical as a story is the DOD has historically led the charge on the way we think about cybersecurity work and roles. And because of its vast size, I mean, we're talking about, you know, 150,000 personnel across military, civilian, and contractor populations. This often trickles into requirements that really influence the entire industry.

Rick Howard: So, Emily, let's put Simone's poll question up so the audience can jump in here and see what they think. In the meantime, Alan, let me bring you into this, all right? You are a CISO of a Fortune 500 company. Do you guys develop your own list of cyber jobs like the DOD just had or are you using some other framework like the NICE framework?

Alan Berry: Actually, Rick, I use the NICE framework for that very purpose, because I don't want to recreate the wheel. I had the same experience in the military for 26 years that you mentioned where every time we turned around, we paid somebody a lot of money to build a new standard. And we had all these specific DOD standards and then, oh, no, this one's just the Air Force standard, versus the Army standard, versus the Marine standard. And I just think that's crazy. So when we decided to go down this very path and create what I call a career ladder to help everybody on my team understand their roles and responsibilities. And what it takes to become that next level person, that promotion, I just went right to NIST and plagiarized the NICE framework. They've done an amazing job laying it out. They've got dozens upon dozens of roles, hundreds of KPIs, KRIs, KSAs. It's all there. I don't need to recreate that. I just use theirs.

Rick Howard: I was talking to Simone about the NICE framework, yesterday, and I was reviewing it earlier this year, and it's a little bit overwhelming. It is a comprehensive document, everything that you'd possibly think of about any kind of cybersecurity job. It really is amazing. Emily, let's see the results that we have for this. So that's interesting, okay. Yes, it's gonna be better, I guess is the takeaway from this. Simone, did- is that what you thought we were gonna get with this poll?

Simone Petrella: You know, I don't know if I had an idea of where we were gonna land. I think it's really interesting, though, because I think it has a ton of potential. It is creating some new standards. As Alan pointed out, the devil's in the details when you actually really dig into this stuff. So I am surprised that people are not interpreting it as being kind of a default to certs, and on the surface, it doesn't. But when you dig into it, I have a few comments on why I think that might not necessarily be as true as it appears on the surface.

Rick Howard: Well, let's dive into that. What do you think that is? What's going on there?

Simone Petrella: So, you know, the thing that strikes me the most is one of the supporting documents that's part of the framework is this massive matrix. It's pretty cool in that it's robust. It outlines every role that the DOD has defined and then it lays out the foundational, the continuing development, and other requirements for someone to serve in that role. And the biggest change from the preexisting standard was that in the old version it was just: you have a cert, you have a credential. To serve in this function, if you're gonna be an information assurance manager, you have to have a Security+, or a CISSP, or whatever else it is. And if you don't have it, no dice.

Rick Howard: Yeah.

Simone Petrella: They've expanded that in this version. So to meet the foundational requirements, you have three options, one of which you have to fulfill. It's not inclusive; it's, you know, one of three. The first is an education requirement, so through your degree. I will also say, of note, and we can talk about this at length, it still says associate's or higher, so it would still be an associate's or higher degree. Or, second, trainings of what are defined in the DOD 8140 Training Repository. I'll put another caveat there. Everything in that training repository is service-level training requirements or from National Defense University. So it's pretty specific to kind of military and civilian personnel, at that point. And then the third, you can go back and default to a professional credential. So those are your three foundational requirements. So they've included education. They've included training. But I think they're still limiting because the education notes claim that it's all computer science, cybersecurity, engineering, or mathematics degrees. That's what counts. And the training is obviously things that are available through military services or the National Defense University. So definitely available to a huge swath of the DOD population, but I'm not sure. Like I think that it's very easy to default, to still fall back on a certification if those are the limiting options.

Rick Howard: What I think is interesting about this is that we've been talking about Moneyball for hiring cybersecurity people here at N2K for the last few months, right? It's this idea that we're elevating the training requirements for the team, not just the individuals. We want to train the team to be good at specific tasks. Do you see this new DOD plan being able to accommodate Moneyball for cybersecurity hiring or is it still focusing on unicorns and very specific training requirements for the individual?

Simone Petrella: I think it is a step towards helping think about it from a team perspective. But what I think is limiting about it where I think it doesn't quite capture what we've talked about with Moneyball is that you- they are still requiring people to kind of be technical cyber folks, first. Even for some roles that I think would be potentially opportunities on the analytic side, on the vulnerability management side. Where it doesn't have to be someone who comes in with a purely technical background, they can still be incredibly successful. And I think that that makes it hard to sort of use the team-based approach if you're kind of closing the aperture because you're looking for people who already have those skill sets.

Rick Howard: Alan, what's the deal at a Fortune 500 company? Are you guys- are you still- is HR still requiring anybody applying for a cybersecurity job to have a college degree, and a list of 17 certs, and all that, or is- are you guys more open-minded about that?

Alan Berry: I, personally, am much more open-minded about that and actually took the degree requirement out of almost all of my job positions. It's not that it doesn't help. I am personally over-educated with multiple master's degrees. But it's definitely not required for --

Rick Howard: It's still a known thing.

Alan Berry: Still a known thing but it's simply not required for multiple jobs. We all know somebody with, you know, no actual college degree who can out-cyber us all day long. And really it's about recognizing that talent, and that aptitude, and getting them in the right spot to succeed. So we've taken them out, at least moved them to preferred, because, you know, the HR and talent acquisition side still is kind of stuck on this. And we also made sure that the security certs that get noted in there are recommended or preferred versus required. Because we still find, you know, outside consultants come in, and consult with compensation, or HR team, and say, "Well, this entry-level job shouldn't have a CISSP." And it shows up in the system and we have to go clean it out, again. And so it does take a lot of ongoing maintenance and oversight. But we have successfully so far pulled that back so that we can find the right talent, put them in the right position to succeed, and then give them the training they need.

Rick Howard: So we got a couple of questions from the listeners. Simone, this is from user name Too Old for This. "How can this new plan open the aperture to make a dent in those 25% of cyber vacancies?" Well, what do you think about that?

Simone Petrella: I like the irony of where the question comes from since my answer is actually it can tap into younger talent. I think the biggest opportunity here is, you know, if you think about the makeup of the Department of Defense and it's, I mean, by last count, right, isn't the DOD like the biggest employer in the entire US? I think they are --

Rick Howard: Something like that.

Simone Petrella: It's truly the biggest employer. You know, and the majority of the personnel are military service men and women. So I think when it comes to, you know, those that don't have degrees. And their access through these requirements to DOD service training components that can actually meet some of those requirements to those roles. I do think it will actually allow us to tap into our military personnel in a much more scalable way than we've done to fill these roles. And I assume, I don't know, that the, you know, like hypothesis here is that then we could leverage more of our military personnel to fill these roles. As opposed to being overly reliant on civilian, but really more so contractors. As a recovering contractor myself from a million years ago, like I think that there was a huge amount of reliance on contractors to do some of the specialized cyber work. Because the military just didn't have this kind of structure in place. So I think that this kind of lets them tap into their own talent pool and the things that the military does do really well are structure, and train, and provide pathways. It'll be interesting to see where that trickle is, but that's where I think that the biggest opportunity to impact the 25% is. That's just my take.

Rick Howard: Alan, let me throw this next question to you. This is from Joe Not Exotic. "How can this strategy help shift a cultural perspective on recruiting and maintaining talent across the DOD?" I'd change that to the commercial space. What do you think about that in terms of your- what you're trying to do at Centene? Oh, I think I lost him, okay?

Simone Petrella: He was stunned into silence.

Rick Howard: Are you there? He was stunned by that excellent question and the fantastic --

Alan Berry: No, I am here.

Rick Howard: Yeah. Go ahead, Alan.

Alan Berry: And I actually figured out my cyber challenge and I just enabled the web link to actually go live, so I was up [inaudible].

Rick Howard: Yay.

Alan Berry: But how in the [inaudible] then how can this change the environment?

Rick Howard: Yeah.

Alan Berry: I think recognizing, one, that the skill can come from anywhere. You know, one of my side hustles is a nonprofit called CyberUp. And in that program, we take anybody of any age and any phase of their career that wants to become a cyber person, wants to get a career in cyber, and we train them. Now we get them to a Security+ level training because you do need some basis. But we're really looking for aptitude and attitude more than anything. So we've had people that are 18 years old and just graduated from high school. We've had people that are 50 years old and are transitioning from their previous career and want to do something different and new. We really don't care as long as they have the right aptitude. And we all recognize you see the news media articles, the crazy sensationalist numbers of four million skills gap and all this. It's hard to validate those numbers but there is a real gap. There's no doubt about that. What the actual number is, impossible to pin down. So anything we can do to fill those gaps with people that are energetic, and optimistic, and have the right mix of curiosity and paranoia to be in cybersecurity, I'm all for it. I think Fortune 500 companies especially almost have an obligation to take in entry-level people in cyber.

Rick Howard: Yeah.

Alan Berry: 'Cause not every company can do it. But when you're a Fortune 500 company, you can. You can train people into this career field and then they can go work in other companies later in their career and keep spreading the career skills out there.

Rick Howard: So it feels like the DOD is taking a proper step toward helping fix that problem, so that was an excellent topic, Simone. Thanks for bringing it to our attention. Alan, we need to change gears over to your topic for this show. What do you have for us as the most impactful cybersecurity story of the last quarter?

Alan Berry: Well, for me, it's really the SEC. So you mentioned a little bit about that Wells Notice that went out Friday that we only found out about because SolarWinds made it public. The SEC doesn't typically publicize those. But in general, when you think about these SEC rules that are coming out, as a practicing CISO, this is kind of a love-hate relationship. There's a lot of good news here. A responsible disclosure program is a good thing for all of us. There is a certain amount of truth in the collective defense thought process that people like General Alexander have pushed for quite a while now. And there's only one way we're gonna get there. It's gonna have to be managed externally because each individual company is always going to look at it in a very risk-averse manner. I just don't see any way that they'll ever not do that. But if we have it as a program, as a law, as a requirement, at least starting with publicly traded companies, I think there's a lot of goodness for the practitioners in this space. It's still a little scary, though, because this is an independent regulator. So you have the Securities and Exchange Commission. You have the Federal Trade Commission. You have the Federal Communications Commission. Three independent regulators all putting their hands into this particular pie in various ways. And none of them are overseen by any other cabinet agency, if you want to think of it that way. And then you have CISA, which is part of a cabinet agency and does have this new CIRCIA law that went into effect last year. And they have to promulgate rules about this, as well. So there's multiple cooks in this particular kitchen right now all coming up with rules, and --

Rick Howard: I was gonna say, that's gonna cause a lot of confusion, right, Alan? I mean, I agree that these kinds of things are necessary, but like you said, lots of cooks in the kitchen. There's gonna- it's gonna be inconsistent, it feels like. I don't know. What do you think?

Alan Berry: Yeah, and you take that parallel out a little farther, it's really like there's four or five kitchens and four or five different cooks, cook teams, in those kitchens. And then occasionally, a cook moves between them and tells somebody else they're doing it wrong. And so it's really tough and I really hope that under CIRCIA the director of CISA, Jen Easterly, gets a chance to kind of lay a groundwork for everybody. If she can put in some basic ground rules about what's going on, and get them adopted and accepted by these, you know, independent regulators that may or may not really have to pay attention to everything, then I think we would go a long way towards some continuity here. I also think there's some lessons to be learned from other industries. So healthcare, my space, and finance are already heavily regulated, as we all know. I already have to report any kind of cyber incident to the Office for Civil Rights within 48 hours of determining that members were impacted and information was potentially disclosed. Note the word potentially. It's not that I have to have full confirmation. If I have a high indication, I've got to report it.

Rick Howard: Yeah.

Alan Berry: But I don't have to report that it was an exploit of Vulnerability 1, 2, 3, and I don't have to worry about it being in the middle of the fire fight with the bad guy. So I can still protect my company and do my duty to report this out to the proper agency so we can protect the individuals, as well.

Rick Howard: Well, I was thinking about the two people you mentioned there are, you know, the financial folks trying to put regulations on and CISA. They have two different reasons for wanting disclosure, right? You know, CISA wants to get the word out and tell people, here's a way you can protect yourself against this thing that happened to Organization X. Where the financial folks are trying to prevent fraud, right, and trying to give stake- stock, you know, I don't know the right word for it, people that have stock options [inaudible] --

Simone Petrella: Yeah, shareholders.

Rick Howard: Yeah, shareholders, thank you. Okay, senior moment, right? Give them the best options for investing. So is there a conflict of interest there, Alan, do you think?

Alan Berry: Potentially, and also, a conflict of capability.

Rick Howard: Yeah.

Alan Berry: So the SEC does everything very publicly, as an example. They want to report this as an 8-K, which immediately goes on a public website, because that's their purpose. That's their reason for being: to protect the shareholder interest. And so they're not really set up to protect sensitive information. That's not what they typically do unless it's an investigation of some type. So they, right out of the gate, are not really well-prepared to take very sensitive, very timely, and critical, you know, vulnerability kind of information, and then be trusted to protect it. Not that they mean to do anything wrong. I'm not saying that at all. It's just that it's not what they do day in, day out.

Rick Howard: Emily, let's throw the poll question up for Alan's piece here to get a feel for what the audience thinks about this. And Simone, I'll kick it over to you. It feels like the United States is really good at this disclosure stuff. Most states have disclosure laws and it feels like all the cyberattacks are coming against American targets but that's mostly because other countries don't have those disclosure laws. So is that the right way to look at this or am I- is it skewed because I don't pay attention?

Simone Petrella: I mean, I think some of this is cultural and some of it's political. For us, we are, in the grand scheme of things, agency aside, a more transparent nation. And I think we believe in being able to, you know, treat this as a collective security action problem, as we've seen the lines getting blurred between what's national security, what's critical infrastructure, what's not. So I think it's admirable. I think what's challenging in the United States is that these rules are being proposed by independent agencies. And to Alan's point on all the kitchens and all the cooks, this is getting fit in at a federal level on top of something where there's already like 50 state disclosure requirements plus any of the territories. And what's the burden on companies, and who's responsible for kind of reporting those disclosures? What's the definition of material that actually would prompt some of these disclosures? I mean, those are all things that are already being grappled with. Alan, I can't even begin to imagine sort of what your kind of risk calculation is in your company to sort of make some of those determinations. But this just adds more complexity to an already really complex issue. Other countries don't have to deal with 50 states, territories, and, you know, all of these independent agencies.

Rick Howard: That's exactly right. Let's see the results, Emily. What does everybody think about this, okay? Well, I think people agree with you, Simone. They don't want each agency coming up with their own rule set for this.

Simone Petrella: Oh, man. Are we disbanding 50 states? I don't know. This has gotten real controversial.

Rick Howard: I don't think it's gonna go well. I don't think it's gonna go well.

Alan Berry: I think there may be some audience bias built into that, Rick, because most of us are probably security professionals that are listening. And we all have a lack of trust about giving out this information to 50 different locations.

Rick Howard: Well, Alan, let me ask you. Because with the Wells- with the SolarWinds Wells letter, okay, and you mentioned there are other CISOs now being targeted for these kinds of things, being accused of involvement in fraudulent operations. This is a relatively new thing, right? CISOs haven't had to worry about that in the past. But now it looks like, even though most CISOs aren't officers of the company, they're getting dragged up and lumped in with other kinds of fraud operations. And I wonder what you think about that. That was a big sigh.

Alan Berry: Right. Yeah, well --

Rick Howard: Hey, we see your picture. That's awesome.

Alan Berry: Yeah. Hey, I do exist. I'm not just a talking head. Well, I am that, too. Oh, ask that- my brain like fuzzed on you there, Rick, so ask that again and let me make sure I'm answering the right question.

Rick Howard: Well, let me rephrase it and, Simone, you might want to pop in here, too, right? With more precise disclosure laws, or regulations, let's say, does that put CISOs in harm's way that --

Alan Berry: Oh.

Rick Howard: -- even though they're not officers of the company, yeah.

Alan Berry: Yeah, this is going to be an interesting one. So if we look at SolarWinds, we don't know much about it. We only know what SolarWinds put out in public. But they listed the CISO and the CFO on the Wells Notice. They didn't say why. So if we're trying to read the tea leaves, and there's a whole bunch of pundits that are speculating, so now I'm just one of them. But is it because they did something after the hack, where they maybe testified or provided information incorrectly? Or is it about, you know, and you'll hear it called puffery, where somebody says, "Our program is the very best in the world. We're bulletproof," and then they get hacked? And so you can have a shareholder lawsuit against you because of this puffery statement. So is it somewhere in there? But I think the bottom line is all of us that are practicing today are going to be highly concerned about this. Are we now directly in the line of fire for these kinds of actions? And what if a CISO has gone to his company, to his CEO, and said, "Here's the issue. Here's what we need to do," and they say no? Am I still responsible to the SEC even if I've given my best counsel and they said no?

Rick Howard: Yeah, Simone, I'll kick it back to you because we're all- been involved in government service, right, where- which encourages discussion about various courses of action. But we are- we have been trained that once the decision is made, we move out, right? So if the CISO goes to the boss and says, "We need to disclose this?" And the boss says, "No, we're not gonna." What happens then?

Simone Petrella: Yeah, that's a- thanks for the easy question, Rick, right for the lob. I'll --

Rick Howard: I'm not touching that question.

Simone Petrella: I'm gonna deflect for a quick second 'cause the first thing that came to my mind as Alan was talking is, I know this is an SEC action, but was this door opened in the SEC's mind because of what happened earlier this year with the sentencing of the former CISO of Uber? So I realize that was a DOJ action, not SEC, but it's hard for me to decouple those two just because I think it's a trend, or at least that's what it seems to be. Maybe I'm starting to go into like conspiracy theory, but if I had to put my government hat back on and you think about all this service that we've done. Generally speaking, at least when it comes to enforcement actions, you're not off the hook. Because if you are aware that something should have been done and you have a fiduciary duty or obligation to the company, that standard, you're not excused because you were told no. It puts a big burden on that individual 'cause then it's like, well, you have to be a whistleblower. I mean, you know, at some point, you have to --

Alan Berry: Yeah, and then you have to be Mudge, like Twitter.

Simone Petrella: Yeah, exactly. Exactly. So I don't envy that position but I think that from like the enforcement action side, it- there isn't really a lot of grace that's given there.

Rick Howard: Well, I'll push back on you, Alan, because, you know, like I said, most CISOs aren't officers of the company, right? In the best case, they're probably VPs of cybersecurity, and in the worst case, they're directors of the company, right? They don't have that official fiduciary responsibility in their contract, right, or am I wrong about that?

Alan Berry: Well, it would depend company to company. So, you know, I'm an officer of my company, but I still only have so much say. I still work here and I have a boss. A very senior member of our community and I had a discussion when the whole Twitter Mudge issue broke. He and I were discussing this and he had a very strong opinion that the CISO's responsibility is to the company. And if the company chooses to go down a path that they advised against, that doesn't mean you're a whistleblower. So if it's not blatantly illegal or harmful, it's just a decision you don't like, should you then go public with it? But all these Wells Notices, and the prosecution of Joe Sullivan, and some other issues like that, really give us all pause and we just don't know where it's gonna run. We don't know how it's gonna fall out.

Simone Petrella: I would also add --

Rick Howard: Excellent- go ahead.

Simone Petrella: Rick, before you even jump in, like we are also seeing this trend. Where I think more companies are recognizing information security as having an officer title, and an officer function, and set of responsibilities. There is now the, you know, equal push on the expertise that's going to be on boards, the opportunities for CISOs to serve on boards. So whatever, you know, that's also kind of a parallel pressure that's happening.

Rick Howard: That's a good point. So we got an excellent follow-on question from my friend, S.A. [assumed spelling] Miller. This is directed right at you, Alan. She goes, "From Alan's perspective, how best to elevate those discussions to the board level in a way that they understand the real impact of the issues the CISO sees?" What do you think there?

Alan Berry: Well, you know, I've just recently finished a course with Bob Zukis in the Digital Directors Network called the Qualified Technology Executive. And one of the underpinnings of that is it's time for boards to embrace this risk area very, very openly. Similar to what we had to do 20 years ago with financial expertise and everything that drove Sarbanes-Oxley and all that. We need to have it where boards have a cyber committee, a risk committee that's very focused on cyber. Most of us are gonna find ourselves reporting into an audit or compliance committee or some other thing that has existed before, and very few boards are embracing that. But I think more will if these SEC rules go into actual effect sometime this fall. They're gonna realize that and they're gonna get that shareholder pressure. But it really needs to be that level of attention and importance, and it's only really demonstrated by the board when they have a committee for it.

Rick Howard: Well, that's a fascinating topic and more to follow on all this, right? But we're gonna move on to the next topic, guys, to my topic. And my topic for this webinar is a pet peeve of mine. I want to talk about- this is more of a technical discussion about how we as a cybersecurity community name our threat groups. Now I covered this in my recently-published book, Cybersecurity First Principles. But what caught my eye this quarter were two intelligence reports from respected commercial intelligence teams, Trellix's Advanced Research Center and Recorded Future's Insikt Group. Trellix released their June 2023 edition of The CyberThreat Report and Recorded Future released a fascinating analysis of North Korea's cyber strategy. They're both excellent reports and I recommend putting them in your reading queue. We'll put the links in the show notes for everybody. But my pet peeve is how our industry, not just Trellix and Recorded Future, but everybody, conflates our high confidence about attack campaigns with our SWAGs about which country actually did the attacks. And for those of you who don't know, a SWAG, spelled S-W-A-G, stands for a swinging wild-assed guess, okay? So that's what I'm talking about here. And you all know that the infosec community is fond of assigning colorful names to various threat groups and there are many sources of these names. Most security vendors do the bulk of the naming attribution by publishing blogs describing what they discovered with their own security products and services, like Trellix and Recorded Future. One reason for the colorful names is to get attention in the marketplace and some vendors have become famous for their naming taxonomy. Mandiant uses numbers, as in APT1. CrowdStrike uses animal names like Fancy Bear. And Microsoft used to use elements, but they've recently changed their scheme to weather and colors, which doesn't cause any confusion in the naming space at all, I'm sure, right?
And there are many other schemes. But government computer emergency response teams, the CERTs, and law enforcement agencies from around the world publish intelligence reports, too. And most of us have made some effort to follow the standardized vocabulary made possible by the MITRE ATT&CK framework. Now in the MITRE Wiki, you can find intelligence on famous adversary campaign names that we've all heard about in the news like APT1, the Lazarus Group, and Sandworm. You know, there's bazillions more that we could talk about with cool code names like Ferocious Kitten, and Nomadic Octopus, and Wizard Spider. It's the reason I joined cybersecurity, I think, because of the cool code names that we all get to use, right? But the thing about it is they don't attribute adversary groups as in, here are a bunch of people, cyber bad guys, that are behind the activity we're calling Nomadic Octopus. We use group names to identify unique adversary attack patterns across the intrusion kill chain that have been seen repeatedly in the wild. And what I mean by that is when the MITRE ATT&CK Wiki publishes intelligence about Ferocious Kitten, it doesn't normally include information about Kevin, you know, day job Walmart greeter, as the hacker behind the attacks. The Wiki just outlines a set of attack techniques and special procedures observed in the wild that intelligence analysts have grouped together as belonging to the same adversary playbook. Sometimes intel analysts are pretty sure that these named patterns like APT1 originate from a specific government or crime group. Like in the APT1 case, the security vendor, Mandiant, actually hacked back to one of the hacker computers, compromised their computer, and watched them operate in the room in real time. So after that operation, their analysts had high confidence that the hackers behind APT1 were a Chinese military hacking group belonging to the Second Bureau of the People's Liberation Army, PLA, known as Unit 61398.
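As a concrete aside: the adversary-playbook grouping Rick describes is published as machine-readable data, so you can pull a group's technique set yourself. Here's a minimal sketch, assuming a local copy of the enterprise-attack.json STIX bundle that MITRE publishes in its mitre/cti GitHub repository; the function name is ours, but the object types and fields ("intrusion-set", "attack-pattern", "uses" relationships) are standard STIX 2.x as MITRE ships it.

```python
import json

def techniques_for_group(bundle_path: str, group_name: str) -> list:
    """List technique names linked to a named intrusion set in an ATT&CK STIX bundle."""
    with open(bundle_path) as f:
        objects = json.load(f)["objects"]
    # Intrusion sets are the "group" objects (e.g. Lazarus Group, Sandworm Team).
    groups = {o["id"] for o in objects
              if o["type"] == "intrusion-set"
              and o.get("name", "").lower() == group_name.lower()}
    # Attack patterns are the techniques themselves.
    patterns = {o["id"]: o["name"] for o in objects if o["type"] == "attack-pattern"}
    # "uses" relationships tie a group to the techniques in its playbook.
    used = {patterns[o["target_ref"]] for o in objects
            if o["type"] == "relationship"
            and o.get("relationship_type") == "uses"
            and o.get("source_ref") in groups
            and o.get("target_ref") in patterns}
    return sorted(used)
```

Note that nothing in this data identifies people; it is exactly the clustering of observed techniques Rick is talking about.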
But that kind of attribution is an exception to the norm in the commercial space. For the rest of the groups, okay, like Nomadic Octopus, intelligence analysts may have some suspicions that the group hails from Russia but they rarely have irrefutable proof as concrete as the APT1 evidence. The point I'm trying to make here is that for most of this, it doesn't matter which government is behind the attacks. If you know that North Korea is attacking you, who cares? What do you do differently about that? All right, so Alan, I'm gonna toss it over to you, all right? Do you care if it's North Korea or Russia attacking you or do you care about something else? Am I off base on this whole analysis thing?

Alan Berry: Well, I do care, but can we all agree that CrowdStrike has the coolest graphics for their names?

Rick Howard: Yes. Yes, we can.

Alan Berry: George has really done a great job on his graphics.

Simone Petrella: I would agree. I was gonna comment that the calendar that they put out is the sole reason I would not criticize their names because they come up with like a super-cool annual calendar.

Alan Berry: Yeah, it's --

Rick Howard: Exactly right.

Alan Berry: It's the Cybergate pinup calendar. I get it every year and it's like, you know, it's gold. But to your point, Rick, I do really care about the attribution to a degree. So I care because it often demonstrates what is going to happen when I engage with that adversary or what are they going to do with the information that they may have compromised. So a nation state actor, until recently, was generally espionage. So the information probably never sees the light of day. It's used for different purposes and you have a different reaction. Whereas a criminal actor is almost always monetizing. Unfortunately, the last few years, that line has really blurred. It's almost irrelevant these days.

Rick Howard: That's what I'm saying.

Alan Berry: Because a lot of times, the nation state actor is taking the same tools at night, going home, and trying to monetize because, you know, they don't get paid much. So they're trying to increase their own position by using the exact same tools and techniques, and so it really clouds it. Plus, everything's an inference unless you have your APT1 case where they actually hacked back, and opened up webcams, and stuff like that. It's always an inference. Hey, they generally work in Russian time zones and they generally work on Cyrillic keyboards. And they generally don't attack any domains that are .ru, and Georgia, and a couple of other places. Oh, they must be Russian.

Rick Howard: Yeah.

Alan Berry: That's a loose inference.

Rick Howard: Probably not.

Alan Berry: Look what happened at the- after the start of the Ukraine war here where a couple of Russian actors got a little proud and said that they're gonna back Mother Russia. And forgot that half their crew was Ukrainian. That didn't go over so well for them.
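Alan's "loose inference" point can be made concrete with a toy scoring exercise. Everything below is illustrative, the indicator names and weights are invented for the example, but it shows the shape of the problem: each signal is circumstantial and spoofable, so even stacking all of them deliberately leaves you well short of high confidence.

```python
# Illustrative weights for the circumstantial signals Alan lists; by design,
# even all of them together sum to well under any "high confidence" bar.
INDICATOR_WEIGHTS = {
    "working_hours_match_moscow_tz": 0.15,
    "cyrillic_keyboard_artifacts": 0.15,
    "avoids_ru_and_cis_domains": 0.20,   # common ransomware kill-switch behavior
    "russian_language_strings": 0.10,
}

def attribution_score(observed: set) -> tuple:
    """Sum the weights of observed indicators and label the result."""
    score = round(sum(w for k, w in INDICATOR_WEIGHTS.items() if k in observed), 2)
    label = "loose inference" if score < 0.7 else "needs independent corroboration"
    return score, label
```

Every one of those signals can be planted by a false-flag operation, which is exactly why a commercial vendor's country call rarely deserves the confidence of its TTP clustering.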

Rick Howard: Exactly. Emily, let's put the poll question up for the audience. And while that's coming up, okay, here's my point about all of that, right? We conflate the two issues. You know, when commercial cybersecurity vendors attribute the attack campaign across the intrusion kill chain like the MITRE ATT&CK Wiki has, we have high confidence that that is correct. Multiple intel analysts from around the world have looked at those attack patterns and said, "Yeah, that's right for this attack sequence. That's how they do it." But in those same reports, they'll mention that this attack sequence has been associated with the Russian SVR, all right. Which, most commercial cybersecurity firms, besides being lucky once like APT1 and Mandiant, they have no clue if that's Russian or not. They might have some suspicions like you said, Alan, right, like, you know, it's the IP address space or there's Russian words in the code, okay? But if I'm the president of the United States, I'm not launching the nukes based on that flimsy evidence, right? And so what happens, this is why it's a pet peeve of mine. The accuracy that we know about the attack campaigns gets conflated with our attribution to nation state activity, and that's incorrect. There's no way that most commercial cybersecurity vendors know which country these things originate from. We don't have the assets to determine that. If you're a government intel agency, you have the resources 'cause you have human elements in the field. They're collecting intelligence all the time. The only thing the commercial side is doing is collecting network data, right? There's no way that we could know it's the Russians. Let's see what the poll says. Oh, that's interesting. I did not expect that at all. Somewhat confident, okay? So, Simone, do you think I'm wrong here, that the case I'm trying to make about how we attribute nation states is wrong?
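One way teams avoid the conflation Rick is describing is simply to record the two judgments as separate fields, so a report can claim high confidence in the attack pattern without implying the same confidence in the country attribution. A hypothetical sketch follows; the class, field names, and confidence levels are ours for illustration, not any vendor's schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CampaignAssessment:
    campaign_name: str                    # e.g. "Nomadic Octopus"
    pattern_confidence: str               # how sure we are the TTPs cluster together
    attributed_country: Optional[str] = None
    attribution_confidence: str = "low"   # rarely better than "low" for commercial vendors

    def summary(self) -> str:
        # Report the two confidence judgments side by side, never merged.
        line = (f"{self.campaign_name}: {self.pattern_confidence} confidence "
                f"in the attack pattern")
        if self.attributed_country:
            line += (f"; {self.attribution_confidence} confidence it is "
                     f"{self.attributed_country}")
        return line
```

The design choice is the point: the schema makes it impossible to state an attribution without also stating how weakly it is held.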

Simone Petrella: You bring up a really interesting point, and, you know, I worked in the intel community for the better part of a decade of my career. And I think what your pet peeve is, is essentially the marketing, the publicity, of these attacks and these attack vectors. Which --

Rick Howard: Mm-hmm.

Simone Petrella: -- I actually would sort of default to- I think it can be a good thing because, you know, as a former intel analyst, I could sit there and talk about TTPs and certain characteristics of an attack vector all day long. And then I'm like, "It's APT blah, blah, blah," or even, you know, whatever. And then it kind of glazes over and no one's done anything about it. But there is some component of bringing it to life when you give it a name. I think there's just some like nice glossy marketing that maybe has some overall benefits to --

Alan Berry: Tangible.

Simone Petrella: Yeah, like tangible- and people want to take it seriously. But to your point, they end up getting names for attribution that are very allusive to nation states and I think that that is misleading. I think, you know, so I get it.

Rick Howard: That's- you captured it exactly, yeah. That's exactly right. I love the colorful names and I'm totally fine with, you know, Fancy Bear being associated with 100 TTPs that the adversary used against victims. But it is not likely the Russians that did it. It might be. I'm not saying it's- you know, there's a chance that it could be the Russians, right? But the chances that some commercial vendor can attribute that with any kind of high confidence is pretty low, is what I'm- what I--

Simone Petrella: Right, but if I --

Rick Howard: Alan, you were gonna say something.

Simone Petrella: Oh, sorry.

Rick Howard: Yeah, go ahead.

Simone Petrella: No, I was just gonna say like if I could maybe play a devil's advocate here. I think one of the things that we struggle with as security professionals is sometimes advocating to the executive leadership and boards around the overall importance, and how do you put cybersecurity in the context of business objectives. And I'm not saying that it's the right answer but it does help make the case. Meaning if you at a macro level can identify, "We know as this business that we're susceptible to, you know, financially motivated threat actors." And in some cases, depending on our business, maybe some nation states or maybe the lines are blurred. You can make better prioritization and kind of business contextual decisions than you would otherwise. So, you know, to the extent that it can help do that because it makes a colorful component, maybe it's okay.

Alan Berry: I think most companies --

Rick Howard: What do you think, Alan?

Alan Berry: -- care about stopping it more than they care about attributing it. At the end of the day, I need the bad thing to stop and I need some assurance that it's not gonna happen tomorrow or tonight. So that's where most companies are gonna care about. There's only a few companies that have the luxury of caring about the full chain of attribution and everything that goes with it. And so it's just a matter of I think to your point, Rick, it becomes marketing. It's great marketing for all these companies to use these fancy terms and cool graphics, put out the calendar. But the average company just needs the bad thing to stop.

Rick Howard: Well, the point I'm trying to make here is what most of these commercial cyber intel groups do. They'll do a fantastic report and outline in detail what the adversary did against their victim. And then in a throwaway line, they'll say, "And this attack campaign has been attributed to the Russians," which they didn't figure out themselves. They're sourcing some other report and they didn't tell us where they got it from, right? And so it just kind of perpetuates the myth that whatever this attack campaign is came from the Russians. So I'll get off my soapbox, right? This is my pet peeve that bothers me to no end, right? So that's the end of--

Simone Petrella: Are you boycotting the calendar?

Rick Howard: Yeah. No, I love the calendar. I love all that stuff. I just want to debate what it means, that's all, okay? I'm being pedantic, is what that means. So that finishes my segment. Let's finish this up with some general purpose audience questions. The first question we get is from Kate Adam. She's the senior director of security marketing at Juniper Networks. She says, "What kind of certification is most relevant today for the types of attacks happening lately?" That's an interesting question. Simone, you're doing workforce development here at N2K. Does something come to mind there?

Simone Petrella: The first thing I'll say is there are almost no certifications that will stay up to date on every threat attack vector as new ones are developed. But all of them cover the primary, you know, most common ways that we see threat actors kind of tackle things. So it comes down to a matter of degrees. I think at a basic level, whether it's like a Security+ or even like a GSEC certification, that's a good starting point. If you start to get into more specialized technical, I think a Certified Ethical Hacker or an OSCP. But you want to kind of toggle that based on the role and kind of where you fit into it.

Rick Howard: Alan, what do you think?

Alan Berry: Yeah, I like the thought of the OSCP for folks that are actually trying to actively defend against mature, cutting-edge threats, because that whole cert, one, it's extremely difficult to get. I do not have it. It's real. You either succeed or fail when you take the test. And, you know, it's like 36 hours straight of trying to hack five boxes, and get in, and do the thing. And so you really have to learn it from a hacker's perspective. And so certs like that, if you're really in that active threat-hunting mode- or there's several SANS courses in a similar vein for active threat hunting. But your standard CISSP, or GCIH, or GSEC, they're all about just responding to more generic issues, so they're not gonna keep you on a cutting-edge footing, in that sense.

Rick Howard: I'm gonna piggyback on something that Simone said. I would steer clear of very specific training requirements like what did we learn from the latest, you know, cyber espionage attack. Because, like you said, Simone, that changes so quickly. What I think you should be looking for as a cybersecurity professional is a general set of knowledge. So that when you see a new thing, you can apply that knowledge to this new problem and solve it quickly, right? And so I would look for those kinds of generalizations of training. That would be my advice to any newbies out there. We got a second question from Rick Beesley. He's a cloud security principal architect at McKinsey & Company. He asks, "In cloud, if you have robust infrastructure as code, policy as code guardrails, and cloud native security services, do you also value maintaining system security plans and threat models?" And for those of you who don't know, according to NIST, a system security plan is a formal document that provides an overview of the security requirements for an information system and describes the security controls in place, or planned, for meeting those requirements. And according to CSO Online, threat modeling is a structured process through which IT pros can identify potential security threats and vulnerabilities. So that was a lot of explanation. Alan, do you want to take a crack at this one? Okay, do we need system security plans and threat models with all this infrastructure as code stuff that we're building on the fly?

Alan Berry: I'd say wholly yes. And the SSPs, themselves, are kind of an administrative process in a lot of respects. As a federal or state supplier, we have to do SSPs for a lot of our customers or contracts. But they are something that makes sure you've thought through all your steps and your processes. More important, though, would be the threat model [inaudible]. So I call it threat-informed defense. It's a little bit different term but the same idea. I watch my telemetry, my injects all the time. A lot of that comes through e-mail where I can track and trace what actors are trying to come at me on e-mail, what are they trying to do, what access providers are trying to get a foothold with us. And that data allows me to make sure the rest of the tools are properly tuned for whatever that latest threat is. That data allows me to adjust my policies and adjust my SLAs based on those threats so that I know I don't have three weeks to do this. I have maybe three days or three hours to do this because that threat is that significant. We get something like 2,000 vulnerabilities a month published in the CVE register, give or take. So if you've got 60% of those in your environment somewhere, you're talking 40 a day you have to deal with. That's just crazy. So the only way you're going to do that is by understanding the threat against your company, against your systems, your instance, and how you need to model against that threat so you can best prepare for it. Because no plan survives contact, but the plan helps you prepare so that you can adapt on the fly.
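Alan's threat-informed SLA idea reduces to a small triage function: the same CVE gets a very different deadline depending on whether it's actually in your environment and whether your telemetry shows it being actively exploited. A hedged sketch, where the thresholds and the back-of-envelope arithmetic are illustrative, not anyone's real policy:

```python
from datetime import timedelta
from typing import Optional

# Back-of-envelope from Alan's numbers: ~2,000 CVEs/month * 60% present
# in the environment / 30 days ~= 40 to triage per day.
def patch_sla(cvss: float, present_in_env: bool,
              actively_exploited: bool) -> Optional[timedelta]:
    """Return a patch deadline, or None if the CVE doesn't apply to us."""
    if not present_in_env:
        return None                  # not our problem this month
    if actively_exploited:
        return timedelta(hours=3)    # "three hours, not three weeks"
    if cvss >= 9.0:
        return timedelta(days=3)
    return timedelta(weeks=3)
```

The value is in the ordering: threat telemetry, not raw severity score, is what collapses the deadline from weeks to hours.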

Rick Howard: I'd take it in a little bit different direction. I call this intrusion kill chain prevention, right? And the threat model is: we know exactly how about 150 different cyber adversaries attack their victims, if you look at the MITRE ATT&CK Wiki, across the intrusion kill chain. And they have this attack sequence that they typically follow. So the threat model then is putting prevention and detection controls in place for everything possible in that attack chain so that we can defend against all the known adversaries. So that's what I call it. Simone, do you have a thought about this?
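Rick's kill chain coverage idea can be sketched as simple bookkeeping: for each known adversary attack sequence, count the steps where at least one prevention or detection control is deployed. The technique IDs and control map below are illustrative examples, not a real deployment.

```python
def coverage(playbook: list, controls: dict) -> float:
    """Fraction of an adversary's attack sequence with at least one deployed control."""
    covered = sum(1 for technique in playbook if controls.get(technique))
    return covered / len(playbook)

# Hypothetical example: one adversary playbook as ATT&CK technique IDs,
# and a map from technique to the controls we have deployed against it.
example_playbook = ["T1566", "T1059", "T1041"]   # phishing, scripting, exfil over C2
example_controls = {
    "T1566": ["secure email gateway"],
    "T1059": [],                                  # a gap: nothing deployed yet
}
```

Run this across all ~150 tracked adversaries and the gaps tell you where the next control belongs.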

Simone Petrella: I see it as a necessity for stress testing. You have to, you know, put policies in place around infrastructure as code. But then you have to be able to kind of test and stress where you have soft spots so that you can modify and make them secure over time.

Rick Howard: Yeah, I agree with that. I totally agree with that. We got a third question here from Carolyn Crandall. She's the chief security advocate and CMO at Cymulate. This is a little bit different direction from what we've been talking about. She asks, "How is the state of the economy impacting the second half of 2023 plans related to cybersecurity budgets and hiring," right? So it looks like the entire tech industry has taken a hit this next year. Simone, what do you think?

Simone Petrella: Well, the first thing I'll say on the state of the economy, not that I'm gonna pontificate on where we're at. But even with the tech sector cuts, the cybersecurity industry has been relatively immune to many of those cuts. I thought the more interesting result is that we're actually seeing lower attrition because of the state of the economy than we have seen in kind of record years within cybersecurity professionals. Which is great news for leaders who are not having to deal with as much churn as they're used to dealing with. I'm seeing a mixed bag as far as the budgets go. It seems to be very specific to industry and individual company. It's sort of ranging from on hold, to kind of steady state in budget requirements, to slight trims, but nothing egregious. I have seen, though, that there have been some hiring cuts across the board, meaning like we're just choosing not to hire in these positions as opposed to cutting. But that's what I've seen. Alan, I'm curious to see what you're seeing on your side.

Alan Berry: Yeah, similar data. So every CFO knows the one string they can pull that gets a fairly quick change in the numbers is manpower. So hiring slowdowns across the board are just a natural reaction when you have these kinds of financial pressures and we're not gonna be immune to it. We're gonna have the same pressures. It's very difficult to carve out, "Hey, the security team, they get to do whatever they want, but the rest of you guys, suck it up. You're gonna pay the price." That's a tough sell for anybody. So we're all gonna feel it. I like the comment, though, about attrition because I've seen the same thing. At the very beginning of the pandemic, attrition went to zero because nobody knew what was going on. And then when everything got, you know, in this- if you want to call it steady state for a little while, we had the great resignation because everybody could go work remote and salaries were insane. So everybody was jumping ship. Now it's kind of coming back, it's swinging back the other way, 'cause we're like, "Ah, I don't know what's gonna go. I don't know how it's gonna happen." So everybody's, you know, digging in, saying, "Now is not the time for me to take that risk."

Rick Howard: So a little uncertainty out there for- is that what I'm hearing? Okay, everybody's a little let's stay close so that we'll see what happens in the next year or so? We have a question from Dennis Sullivan. He's the CEO of CyVantage. He wants to know, let's see, "What will it take for software vendors to take responsibility for pushing and publishing products that weaken the cyber sphere?" This is an age-old question that's been around for a while. Alan, do you want to tackle that one at all?

Alan Berry: That's got to be public law of some type. It's gonna have to be enforced from the top because why would --

Simone Petrella: I was gonna say the same thing.

Alan Berry: -- [inaudible] responsibility?

[ Multiple Speakers ]

Alan Berry: Yeah, it's got to be just like, you know, recalls for cars and other products, you know, Underwriters Laboratories, or the Good Housekeeping Seal of Approval, or something analogous to that. Because otherwise, companies just aren't incentivized to do that and we're all feeling the pain. I mean, how many people deal with Patch Tuesday? And I don't care- they can say they don't do Patch Tuesday anymore, but it still seems to show up on Tuesday.

Simone Petrella: Yeah. I was gonna say like --

Rick Howard: Simone?

Simone Petrella: -- this is a strict liability thing and it's an age-old question. But if you think about- this is very nerdy of me to say. But if you think about the history of how we've ended up with strict liability laws in torts- man, I feel like I'm back in law school right now. It was only because it wasn't just one individual who got harmed by something that, like, blew up. Either someone gets egregiously harmed or loses their life from something that's directly attributable to the defect, or the harm is so widespread, even if it's not life and limb, that it becomes a massive class action. But in either of those instances, that's when you have the legal code change.

Rick Howard: I agree. Okay, we'll see. I'll be- let's not hold our breath on that one. I don't think it's coming anytime soon. But ladies and gentlemen, we are at the end of this. On behalf of my colleagues, Simone Petrella and Alan Berry, thank you all for participating and we will see you at the next CyberWire "Quarterly Analyst Call." Thanks, guys. I appreciate it.

Simone Petrella: Thank you.

Alan Berry: Thanks, Rick.