Oct 14, 2022

CyberWire Live - Q3 2022 Cybersecurity Analyst Call

There is so much cyber news that, once in a while, all cybersecurity leaders and network defenders should stop, take a deep breath and consider exactly which developments were the most important. Join Rick Howard, the CyberWire’s Chief Analyst, and our team of experts for an insightful discussion about the events of the last 90 days that will materially impact your career, the organizations you’re responsible for, and the daily lives of people all over the world.


Rick Howard: Hey, everyone, welcome to the CyberWire's quarterly analyst call. My name is Rick Howard, I'm the CyberWire's Chief Security Officer, Chief Analyst and Senior Fellow. I'm also the host of two CyberWire podcasts. Word Notes is on the ad-supported side, meaning it's free to anybody, and it's short, usually no more than five minutes. It describes the keywords and phrases that we all find in the ever-expanding alphabet soup that is cybersecurity. The other podcast is called CSO Perspectives, on the pro side, the subscription side, I like to call it the Netflix side. It's a weekly podcast that discusses first principle strategic thinking and targets senior security executives and those that want to be them sometime in their career. But more importantly, I'm the host of this program that's reserved for CyberWire Pro subscribers, and I'm happy to say that I'm joined at the CyberWire Hash Table today by two friends of mine.

Rick Howard: Roselle Saffran, the CEO and founder of KeyCaliber and William MacMillan, the SVP of security product and program management at Salesforce and relatively recently, the CISO for the CIA. Roselle, William, welcome. You guys can say hello.

William MacMillan: Hey, good morning, Rick, good morning, Roselle.

Roselle Saffran: Good morning on the West Coast, good afternoon on the East Coast. Pleasure to be here, thanks for the invite, Rick.

Rick Howard: You bet, I'm glad to have you guys on. This is our 11th show in the series, where we try to pick out the most interesting and impactful stories from the last 90 days and make some sense out of them. This third quarter, as you can imagine, has been full of surprises. Do you guys remember this one? A Google engineer named Blake Lemoine claimed that his artificial intelligence bot, Google's Language Model for Dialogue Applications, or LaMDA, may be sentient. In other words, he claimed that we have reached the singularity. Google fired him for allegedly violating the company's data security and confidentiality policies, but when you read Lemoine's account, LaMDA does pass the Turing Test, so that's interesting.

Rick Howard: We also have the Russia-Ukraine war continuing to rage, where we're all still waiting for the needle to drop on some sort of Russian massively destructive cyber campaign, similar to what they did with NotPetya back in 2017. That hasn't happened yet. Still, the community has seen adversary playbook activity from the likes of APT28, APT29, Sandworm, Dragonfly, TA445, TA471 and a group called Gamaredon; I can never say that one right, I once called it Game We're Done. But Gamaredon. We'll keep an eye on that one for the next show we do. And from the skateboarding dog file: at Defcon in Vegas this past summer, an Australian hacker named Sick Codes jailbroke the computer in a John Deere tractor in order to play the first-person shooter game, Doom.

Rick Howard: I love hackers, right, I love that they do that kind of thing. But, let's get with the show. Roselle, let's start with you. What do you have for us as the most impactful story of the past 90 days?

Roselle Saffran: Yeah, so the one that really caught my attention was this story in NBC News about how cyberattacks against US hospitals mean higher mortality rates, according to a study.

Rick Howard: Yeah, scary.

Roselle Saffran: It's scary and it's sad and I literally almost started crying when I read this story and read about how a newborn may have lost its life, his or her life as a result of a ransomware attack. For me, I'm mission driven and the whole reason why I'm in cybersecurity still is because I feel very strongly about the mission of fighting the good fight. I worked for the Department of Homeland Security because I felt strongly about protecting the government and critical infrastructure organizations. I worked for the Executive Office of the President for the same reason, felt very strongly about defending the White House's network. Then I became an entrepreneur because I saw these issues with the industry and I wanted to be helpful. So, it's upsetting to see that we're at this point.

 I'm on my second cybersecurity startup now. With the first one, I remember I had a slide deck for the investors where I was trying to convince them, look, cybersecurity is important, and here are the reasons: we have these issues with there being theft of intellectual property, theft of financial data. I'd have at the end these bullet points of gloom-and-doom scenarios: there could also be disruption, there could also be destruction. At the time, that seemed very far-fetched, but I included it because I knew there was that potential and that capability was on the horizon. All of us that were in the industry, in the know, and seeing it day to day knew that; this was 2014. Now we're at the point where it's real. This is not just, "Oh, look at these cybersecurity people being crazy," this is real. And this article just brought that to the fore for me.

Rick Howard: That's so interesting, Roselle, because we all have our own doomsday scenarios when someone asks us, "What are you most afraid of?" and I think most of us can come up with four or five really scary things. My go-to one forever was that some nation state would break into a hospital and change the records. Not all the records, just one or two, changing a medicine dosage, and it'd be so hard to track down, you wouldn't be able to do it. That was far-fetched, movie-script territory; I think the famous TV show Homeland even had an episode about something like that. But here we are in 2022 and people are dying because of just run-of-the-mill ransomware attacks. William, I'm wondering about you, you know, CIA guy, right? In your past. I'm sure that came up as a potential problem as you guys went through your day-to-day effort?

William MacMillan: Yeah, absolutely. I mean, it affects everybody. I'll quote General Nakasone here, I've been in a couple of forums where I've heard him say that this criminal cyberactivity, it's national security, right? And this is a great example because you want to be able to trust that your hospital is going to be safe, that that's a safe environment for you and for your loved ones. So it really drives home the point about just how pervasive this has become as our lives are increasingly digitally connected.

Rick Howard: Eliana, let's throw up the poll for the audience. Roselle, let's take it back to you. What should we do? Is there anything different we can do about this kind of threat to our medical systems?

Roselle Saffran: Yeah, I think there's lots that can be done, and part of it is looking at what we haven't done particularly well so that we can do better moving forward. I think as an industry we haven't done a great job of articulating the risk, and we've known for a long time, look, there is real potential for danger. But before it was something tangible like this, it ended up just being a little too abstract for folks that weren't in the industry. Theft of data, theft of identity, all of that, there's nothing you can see tangibly with that. There's nothing tangible that can invoke a visceral reaction the same way that we now can with seeing that lives are lost because of it.

 I think as an industry, we have to do a much better job of articulating the risk and quantifying the risk and getting further away from the technical side of it. We can talk about CVE X and Y all day, but we have to convey our industry to folks outside of the industry. There's lots that we can still do to improve the way that we translate what we know on the technical side to what they understand on the non-technical side. It's a work in progress, but we really need to get there with quantifying the risk and explaining it in a non-technical way where it's clear this is a concern, this is an impactful situation where something needs to--

Rick Howard: Eliana, can you give us the results on the poll? Well, that's really interesting. That's not what I thought was going to happen, there you go. That shows you that our audience is cleverer than I am. You're hitting the two points there that I want to cover, Roselle, is that we as an industry, as a community, as a profession, security profession--

Roselle Saffran: We haven't been talking about it in the right ways.

Rick Howard: Yeah. And when we talk about quantifying risk, we suck at this, right? I've done it myself, I use these qualitative heat maps, these red, green and yellow spreadsheets, basically, where I think the most impactful thing is high and to the right, and I go into a boardroom and say, "This is really scary, give me a gazillion dollars to fix it." It's just not a very good way to do it. There are reams of science that say that's a bad way to convey risk to any decision maker. We need to get better at that. In fact, let me just hype my own show: in the last four episodes of CSO Perspectives, I did a deep dive, top down, on how you can do it better, so go check that out if you want to. William, I cut you off, you were going to say something there.

William MacMillan: No, no, this is just a hot topic right now, risk quantification. I'm in more and more conversations where people are talking about trying to identify the denominator, meaning there's so much out there that we don't even know about that it's challenging to be able to identify all of your risk in that sort of environment. I'm an optimist and I think a lot of the tech that's rolling out nowadays is really starting to help and I've listened to your first episode, Rick, I've got three to go.
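The quantitative alternative to heat maps that the panelists are gesturing at can be sketched as a simple Monte Carlo estimate of annualized loss. This is a hypothetical illustration only: the function name, the incident probability, and the loss bounds are all made-up assumptions for the sketch, not figures from the discussion or from any panelist's methodology.

```python
import random

def simulate_annual_loss(p_incident=0.3, loss_low=50_000, loss_high=2_000_000,
                         trials=100_000, seed=42):
    """Estimate annualized loss for one risk scenario via Monte Carlo.

    p_incident: assumed chance of at least one material incident per year.
    loss_low / loss_high: assumed expert bounds on the loss if it happens.
    Returns the mean annual loss and the 95th-percentile loss.
    """
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        if rng.random() < p_incident:
            # Draw a loss between the expert's low and high bounds.
            losses.append(rng.uniform(loss_low, loss_high))
        else:
            # No material incident this simulated year.
            losses.append(0.0)
    losses.sort()
    mean = sum(losses) / trials
    p95 = losses[int(trials * 0.95)]
    return mean, p95

mean, p95 = simulate_annual_loss()
print(f"expected annual loss: ${mean:,.0f}; 95th percentile: ${p95:,.0f}")
```

Even a toy model like this gives a board two concrete dollar figures to argue about instead of a red square on a heat map, which is the kind of decision-maker-friendly framing the conversation is calling for.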

Rick Howard: [LAUGHS] Okay. There'll be a quiz at the end of this, William. Let me ask you this, though, Roselle. You get to go around and talk to a lot of customers, right? I've talked to a lot of different CISOs from different verticals, and it's my observation that healthcare organizations tend to stuff their CISOs five, six layers down in the leadership bureaucracy, whereas other verticals, say finance or IT or tech, put them further up. I don't know, is that your experience too?

Roselle Saffran: Yeah, I think in general you have industries that are ahead of the curve and industries that are behind the curve. If you look at it from just an overall perspective, ahead of the curve you've got, say, the financial sector and IT and, more recently, retail. I put governments in front of all of them; they recognized the need for devoting resources to cybersecurity early and they're ahead of the curve. Then you look at who's behind the curve, and it's some of these industries that are closest to human lives. It's the healthcare industry, it's oil and gas, and so that's why I think we might be at the point where there needs to be some regulation involved, so that it's not discretionary whether this becomes a major position and a major area of importance for an organization.

 We're in a situation like we were with seatbelts back in the day, when car manufacturers didn't include them. It was inconvenient, it was expensive, and so it wasn't done. Then it became regulated that they had to be included. Then it became regulated that people had to wear them once they were included. I'm not trying to advocate for all of these new regulations coming in, but I think we're kind of at the point where there has to be someone else besides just us saying, "You need to do this." And it's not a compliance game either, because compliance and risk are very different beasts.

Rick Howard: William, if you could wave a magic wand and you get one, you get one regulation that you could impart across the world, do you have an idea what that might be?

William MacMillan: Oh, man.

Rick Howard: Not to put you on the spot. [LAUGHS]

William MacMillan: I am a fan of having to report cyberattacks, ransomware, whatever you want to call these cyber incidents. I do worry a little bit about the details, and I think the details really matter, right? For example, there are some places where legislation is emerging where you've got to report within like two or four hours, and at that point I think you're doing more harm than good. I've never worked for an organization where within two hours you actually know what's going on, right? I'll just say that my magic wand would probably wave and produce something in the sweet spot where you have to report, so that we can start collecting this data across industries, but that's also balanced and recognizes that you have to run the business, you have to respond to the incident; reporting can't be the very first thing you have to do.

 I remember when I was a young student pilot, they used to teach us "aviate, navigate, communicate." In other words, don't crash the plane, don't get lost and then talk to people about what's going on. There's this like, sense of priority with the reporting timelines, that's the part I worry about.

Rick Howard: Well, the United States, compared to other countries, is pretty good at notification laws. I think there are 48, last time I counted, 48 states that have some sort of mandatory tell-people-you've-been-breached laws. They're all of varying quality, but at least that's true. It isn't that way internationally. But Roselle, what's your magic wand? What would you do?

Roselle Saffran: I think there needs to be a base level of security. First, to the point you made about the CISO being completely buried in the org chart: I think the CISO needs to be reporting to the board, and there should be at least a minimal set of security capabilities that are up and running, in-house or outsourced, that the organization has to leverage, so that they're just not in this position where something happens and they're completely on their heels and unable to react in any formidable way.

Rick Howard: I'm going to open it up to the listeners here, we got a question from listener, his log-in name is Swamp Stumper, I love it. His question is, "How does a company begin to address this problem if they are starting from scratch with their cybersecurity program?" How do we do that? Anybody got any ideas?

Roselle Saffran: Well, I would start with the fundamentals, just even getting the basics down. If you look at the NIST CSF functions of identify, protect, detect, respond, recover, just start with identify and protect. With that you're identifying what's critical and you're making sure that it's backed up, especially in the case of ransomware attacks. You get 70, 80% of the way there with being able to respond and recover if you at least have backups. So first step, you identify what's really important, what needs to continue to be operational to make sure everyone's lives are not at stake, and then from there make sure you've got the backups, make sure you have preventive technologies around it. Then you build on top of that and start detecting and building out the response capabilities, but at least get that core in place so that you're not just completely bare.

Rick Howard: I love starting with the NIST cybersecurity framework, that's a very comprehensive list and just putting yourself at the various stages of each of those categories will give you an insight at where you need to concentrate, so I like that as a starting point. William, do you have anything to add there?

William MacMillan: Yeah, it's a good point. I would just add that I think we're making a ton of progress with organizations like CISA making resource kits available to smaller businesses. Sometimes you hear people refer to businesses that are below the cyber poverty line. If you are truly starting from scratch, there's some really great, free information out there if you just look at some of these websites. On top of that, I would say make sure you appoint somebody as responsible for starting to draw up a plan for your organization, right? Accountability becomes important, so make sure there's somebody who's responsible for doing it and that they have access to the business decision makers, to the leaders in the business. You can't just run cybersecurity from the basement; you've got to pay attention to what they're learning and telling you about your cyber risk.

Rick Howard: I'm also a big advocate of getting back to first principles; in fact, CSO Perspectives talks about that a lot, right? Try to figure out what you think you're trying to do with your infosec program. I advocate that what we're all trying to do is prevent material impact on our organization, or at least reduce the probability of material impact. Getting back to the beginning of this question: people are dying in hospitals, and I would say that's extremely material. That's a good topic, I think we could talk about this forever, Roselle, so good job bringing that to the table. William, let's switch over to you. What do you have as the most impactful story of this past 90 days?

William MacMillan: Alright, well, mine is perhaps not the likeliest topic, but it's something that I've been focusing on a lot lately, so this really resonated with me. The topic is the increased need for an evolving security awareness program and an investment in security culture. The particular story that caught my eye is called "Security Awareness Training Must Evolve to Align With Growing eCommerce Security Threats," from Dark Reading in September. What resonated with me about this article is the idea that as more and more companies advance their digital transformation plans, the ways in which we live, work and participate in the economy have moved online in a bunch of ways that expand the attack surface for hackers.

 Let's face it, in 2022, if you're running a business, you're running a digital business, and that means that people are going to interact with your digital business via multiple channels. I liked this article especially as we approach the holiday season, and it highlights how complicated this has all become. It's not enough anymore for security teams to work in relative isolation, just focusing on protecting consumer data and maybe some anti-fraud analytics applied to payment mechanisms. They have to worry about all the different ways that the humans at their company are interacting with multiple digital processes. For me, this all really drives home the idea that the security awareness training at your company had better be keeping up with this. We've all gotten used to thinking of security awareness training requirements as this annual thing that we do at the end of the calendar year.

Rick Howard: Yeah, the "Ah, not that again, let me go through the hour of training."

William MacMillan: Exactly. We can't view it that way, it's not an annual box checking exercise. We have to treat our people like the first line of defense, which means we've got to arm them up with information that's timely, relevant and actionable, it's got to evolve. That leads me to kind of my final point about this article that I really liked. It advocates for what the author calls a "Data safety mindset." I would word that a little differently, I would call it a "security first culture" or as we say in my company, a "trust first culture". Focusing on that type of culture is increasingly important in this era of evolving our programs to think about cyber resilience, which is a huge hot topic that I've been talking to a lot of people about lately. For all these reasons, I really like this article and I think it's a great read for anyone involved in any type of eCommerce, especially at this time of the year, as the holidays are approaching.

Rick Howard: Wait, let me play Devil's advocate on this one, because it depends on what day you ask me about this. On even days I really like security awareness training programs, but on odd days, I'm like, "Are we blaming the victim here?" Why do we have to rely on the first line, like you said? Why can't the infosec teams provide enough protection so they don't have to do that? [LAUGHS] I guess I answered both sides of the equation there. Roselle, do you fall in on either side of that?

Roselle Saffran: Yeah, I'm in the same boat as you with that, where I feel that if we are relying on the end users to be part of the security defense, then we haven't done our job properly as a security program.

Rick Howard: That's way better saying it than I just tried to articulate it. [LAUGHS]

Roselle Saffran: But I mean, I do have my days where I'm like, yeah, the fact is we can't catch everything, and so there has to be this level of awareness. They can't be completely in the dark about it.

Rick Howard: Where I thought you were going with this, William, was physical safety programs. We've all been taught that if we see something that looks unsafe, we should stop it and make sure everybody's doing the right thing. I thought that's what this article was advocating: if you see something unsafe, do something about it. Is that the angle here?

William MacMillan: Well, I think that's a huge part of it, and to be clear, I don't necessarily disagree with what you're both saying about this. For sure our systems have to be implemented in a way where it's not going to fall on one person to be the weak link when the attack comes. I think the bigger meta point for me about this article was, our lives are moving online in digitally connected ways that we just didn't anticipate in the past and I think that creates an individual responsibility, even as citizens, never mind just as an employee of a particular company. I don't think you can be overly passive anymore about your digital life and you should never be blaming the human who was targeted. I think you have an obligation to educate them about the different ways that somebody might try to come at them as some sort of a weak link.

 Let's say somebody's in a shipping department; if you read this article, a lot of it had to do with shipping fraud and eCommerce. If you're interacting with suppliers and you're receiving emails, calls and messages on your chat apps and all of these sorts of things, those are all potential attack vectors. We've just got to arm people up with the awareness of the latest ways that bad actors are trying to exploit that sort of thing.

Rick Howard: Eliana, put up the poll question. Roselle, go ahead.

Roselle Saffran: I completely agree with that. I like drawing parallels with physical security, because it makes cybersecurity more relatable. If you're going into an office building, you know that you don't let the sketchy person in behind you who doesn't have a badge. People have been given training and are aware, and some of it they don't need to be trained on because it's more common sense, because it's physical security. But they're aware that you don't let someone piggyback in after you and you don't just let someone without a badge into the building. On the cybersecurity side, paralleling that, there has to be this understanding and this training so that there is the awareness of, look, you can have that same sort of situation.

 It's not going to be the sketchy guy following you into the building, but it's going to be someone sending you an email with a link that, if you click on it, is going to let them into the building, essentially. I think having that parallel and having that awareness is definitely a big part of it. With that said, as the company owning the building, you have lots of control over that building; you have that whole security system in place. You have security cameras, you have locks on doors, you have badge swipes so people can only have a certain level of access. There are a lot of similarities there, and so the more that people understand that, the better, because at the end of the day, we all know that many adversaries recognize the weakest link is the individual end users in the company, who will let someone just piggyback in after them, essentially.

Rick Howard: Let's put the answers up, Eliana. Let's just take a look at these numbers for a second. Hmm, that's interesting, very interesting.

William MacMillan: I'm impressed with the 29%, that's higher than I thought it was going to be.

Roselle Saffran: Yeah.

Rick Howard: Yeah, that's very good. Here at the CyberWire, we have one of those phishing programs that we pay for. KnowBe4 is one of the sponsors of the CyberWire shows, and we use their services. They send crafted phishing emails to employees to train them on what to look for. I've been at the CyberWire now for over two years, and I've been caught three times, I think. And I know what I'm doing, so I can't imagine how somebody in the HR department, or, William, someone in your company at Salesforce who is not really a techy person at all, will fare. If I don't know better, how are they ever going to know? I don't know, what do you say to something like that?

William MacMillan: Well, I think the devil is really in the details with these things. We have an amazing program and I realize that your first principles approach is good because not all organizations have necessarily the same resource profiles. I'm very lucky to work at a big company with a well resourced security team and we have an amazing security awareness program. We get a bunch of data, we are never judgmental about the people who fall for phishing emails and that sort of thing. Then we're able to fine tune the security education programs based on what the data are telling us. For example, in your case, if we do find that there's a particular department where the people aren't as savvy, then we can focus more resources on them. I think these things are really maturing and I think they're getting pretty useful. But it goes back to this idea, you definitely never want to create a climate where it feels like you're blaming the people.

Rick Howard: Yeah.

Roselle Saffran: Yeah, you don't want it to turn into a gotcha exercise, where you're calling them out.

Rick Howard: Well, that's easier here at the CyberWire. When the CSO is caught three times, it can't really be a gotcha; I can't really blame anybody, right? [LAUGHS]

Roselle Saffran: And then everyone else is going to say, "Well even Rick fell for it, so, I mean..."

Rick Howard: [LAUGHS] Oh. So, let's turn to some questions, William. This one's from Frog Whisperer, another great name. "What if your company operates on thin margins and doesn't have... " Oh, you were talking about this, William, it's on the other side from where you guys are. They don't have a budget for security education, what do you do about something like that?

William MacMillan: Yeah, I think you start small. Start modestly. Again, to my previous comment, appoint somebody. Find a volunteer who's motivated about this stuff and say, "Hey, could you look into the ways that people are coming after businesses like ours?" Bring it up in an all hands; you could just send out some emails. It doesn't have to be a fancy program where you're subscribing to one of these services, which are very good. If you can't afford them, just start small, but do some really rudimentary threat modeling. In other words, don't try to boil the ocean. Look at what sorts of threats are out there coming at your type of organization, and focus on that first. To go back to your heat map idea, try to focus on the glowing red parts of that mental heat map that you can put together with some basic back-of-the-napkin threat modeling.

Rick Howard: Roselle, you're a, would you call it a startup, or small to medium size company? How would you classify your company?

Roselle Saffran: Yeah, we're startup, we're startup.

Rick Howard: How do you approach it there?

Roselle Saffran: Oh, we don't play around with security, I mean. [LAUGHS] This is my background. Despite how tiny we are, we have a security awareness training program. Our VP of Operations has spent a decade-plus on the cybersecurity operations side, and I said, "Look, we need a security awareness training program," and she pulled it together. She also has a threat intel background. We have a whole training program that everyone has to go through when they start. We're a tiny company, but still, we put security first, and we are constantly bringing things to people's attention if something new happens. We'll announce it to the entire team if something big just popped up, so everyone on the team knows about it when it happens, so that there's this continuous awareness. It's not just, "once a year I get reminded that I need to be security conscious."

 We have completely infused it into the culture of the company, but that's my background, we're a cybersecurity company.

Rick Howard: I was going to say, it helps a little bit that the CEO was the cybersecurity person at the White House. [LAUGHS] I guess we have an emphasis there. But here's our second question, it's kind of related there, William: "I work at a company that doesn't have any program at all, so how do you begin one?" Especially if your CEO isn't a cybersecurity person, how do you start?

William MacMillan: Yeah. Well, I would go back to two things that we actually talked about a little bit earlier. One is, there are great resources coming online out there. Start at cisa.gov; I'm an enormous fan of what that agency has done under Jen Easterly's leadership. There are more and more resources for folks that are starting with that lower resource profile. The other thing is making sure that you've got that leadership buy-in. If you have a company that's the opposite of Roselle's company, where they've never really thought about cybersecurity and don't think of themselves as a digital or a cyber company, make sure whoever's going to run with building the rudimentary program has the attention of the business leadership, because otherwise it's never going to get at that cultural angle that we heard Roselle talking about. It really does need to be infused into the security culture at your organization.

Roselle Saffran: Yeah, I would add a couple more resources. SANS has lots of resources, especially their OUCH! newsletter, which is aimed at individuals. Also, the SBA has resources for small businesses in particular.

Rick Howard: What's that acronym stand for again?

Roselle Saffran: The SBA, the Small Business Administration. Yeah, I think they actually have a conference coming up that's just about cybersecurity for SMBs.

Rick Howard: Ah, very good.

Roselle Saffran: Yeah.

Rick Howard: This third question that came in, William, is along the lines of what I was talking about before. They said they have one of those programs that sends phishing emails in, but it irritates them; aren't programs like that counterproductive? How do you get away from annoying your employees? How do you build that culture? What should we be thinking about there?

William MacMillan: Yeah. This is where the implementation details really matter. I remember, there was an era not that long ago, five or six years ago, when some companies were just refusing to do these programs at all because they viewed them as counterproductive. I totally get the sentiment of this question, but there are definitely ways to implement these things where they feel supportive and helpful to the employees. The tone of the message employees get if they accidentally click on a link, and all of those sorts of things, really matters. You can also automate these programs in a way where it's not super time consuming; there's no human intervention where somebody visits their cubicle and says, "We can't believe you clicked on that link."

 It just goes to an info page that explains in a lighthearted tone what happened, with some resources that they can refer to and that sort of thing. I really think it's the atmosphere in which that program is socialized and implemented, and just being really careful never to do any victim blaming; that's not where you want to be going with these programs.

Rick Howard: Good stuff, William, good topic. It's time to move to the third story, this is my story, alright? What I picked for the most impactful story of the last 90 days is the Mudge whistleblower complaint. For those of you that don't know, or don't remember, back in November 2020, in the wake of a high profile hack that compromised the Twitter accounts of some of the most famous people on the planet, including candidate Joe Biden, the then CEO of Twitter, Jack Dorsey, hired Mudge, AKA Peiter Zatko; Mudge is his hacker alias. He hired him as a member of the senior executive staff to beef up cybersecurity and privacy at the company. Now, I don't know if you guys remember Mudge or not, but he's one of the famous early day hackers. He was one of the front men for two notorious hacker groups in the 1990s and early 2000s: L0pht and the Cult of the Dead Cow, or cDc.

 In fact, there's a great book by Joe Menn that came out a couple of years ago about the Cult of the Dead Cow; highly recommended. In the mid-2000s, Mudge went legit in the security world. He worked for BBN, DARPA and Google and ended up in 2017 as the Head of Security at Stripe, a payment processing company. Dorsey hired Zatko away from Stripe in November 2020 and Zatko reported directly to Dorsey. According to Zatko's LinkedIn profile, he wasn't named the Chief Security Officer, and it's unclear if he was an official officer of the company. A year later, in November 2021, Dorsey stepped down as CEO, to be replaced by the then CTO, Parag Agrawal, and according to news reports, Mudge and the new CEO didn't get along very well.

 Two months later, according to Twitter, in a restructuring effort, Agrawal fired Mudge and released the CISO, Rinki Sethi, a friend of mine, who was hired a month before Mudge; that's all background to this. Last month, September, Mudge testified to the US Senate Judiciary Committee about his whistleblower complaint regarding Twitter's poor security posture. And this is unprecedented. No CSO, CISO or risk officer that I know of has ever done this, and the question is, did the complaint warrant such an extreme action on his part? I'm going to ask both of you to opine on this in a second, but let me break down what the complaint actually said. First, spies. Zatko claims that the Indian government forced Twitter to hire specific individuals who were government agents. He also said that Twitter received specific information from a US government source that at least one employee was a spy, probably from China, and that was the giant bomb at the committee hearing.

 But he also said that Twitter hadn't complied with a 2011 FTC consent decree ordering Twitter to beef up its security and ensure bad actors could not access private user information. And, finally, he laid out just how bad Twitter's defensive posture was. Half of its 500,000 servers were using unencrypted software, and about 40% of the employee laptops were not sufficiently protected from outside threats; 30% even blocked software updates with necessary fixes. Thousands of these barely protected laptops had access to Twitter's source code. Finally, out of Twitter's nearly 7,000 total employees, about half had unmonitored access to sensitive internal company software. Then he claimed that he tried to tell the Board about it, but the CEO stopped him, and that Agrawal and other executives censored any damning security data and misled investors and regulators in the process.

 Phew, that's a lot. It's a lot of accusations. The question then is, did it warrant going to the Senate Judiciary Committee to complain? Roselle, I'm very interested to hear your point of view; you've been on both sides of this equation. You've been a security practitioner, and now you're the CEO of your own company. How does this sit with you?

Roselle Saffran: There's so much to unpack here.

Rick Howard: [LAUGHS] I know. I was trying to summarize it and man-oh-man, it's a lot of stuff. Go ahead.

Roselle Saffran: No, that was an awesome summary. I think first of all, talking about the tactical side of it, of X number of machines that were unpatched and all of that, in my mind, that's not cool. It's not cool to divulge that type of information when that's something that can easily get into the hands of hackers all over the world, who now have all this additional intel as to their weaknesses. I definitely have qualms with that being divulged. Every organization faces challenges like that, and part of what his role, I assume, included was trying to improve those numbers. I think, getting into that minutia and making that public, that to me seems like it was more than what was called for.

 In terms of some of those strategic decisions of being cool with espionage happening within the environment, that's kind of at a different level. I have no idea what happened internally, how much he pushed to try to have changes made while he was still there. Alex Stamos had some similar situations, it seemed, when he was at Facebook. In both of those cases, you have very salient companies with lots and lots of influence and lots and lots of power, and the security leaders are seeing that there are issues going on in the company that are not getting addressed. He talked about some of it publicly as well, not in the same way that Mudge had.

 In my mind, if those types of decisions are being made and the security leader is trying to convince the senior leadership in the company to make better decisions along those lines and it's all falling on deaf ears, I can understand why then that side of it ends up being disclosed more publicly.

Rick Howard: Yeah, but going public and filing a whistleblower complaint with the government, those are two different things. It's interesting that you mentioned Alex Stamos. Stamos used to work for Mudge back in the day when they were at a company called @stake, and they had two different approaches to this kind of problem. Stamos at Facebook, when he got let go because of his problems with the Board, he said, "Hey, it was my fault, I didn't convince them, and I stepped out of line." Mudge says, "I'm going to go to the government and complain." I'm going to come to you in a minute, William, but first let's put the poll question up to see what the audience thinks about this pretty simple question of whether or not Zatko should have blown the whistle.

Roselle Saffran: I'm interested to see the results of this.

Rick Howard: What I found by doing the background of the story, the community's divided about half and half on this. Let's see if our audience is the same way. Some feel that Mudge was totally justified and others feel he absolutely was in the wrong about this, so, it'll be interesting to see. William, what's your thought on this? How did it strike you as I went through that story?

William MacMillan: Well, it was a good one, Rick. This thing is a head scratcher, and the approach that I took with this is, first of all, what would I have done? I think that's a pretty normal human response to something like this, to imagine yourself in similar scenarios. More importantly, what can we extract from this for people? What's the value of studying this whole situation? I don't know this gentleman, I've never met him and I've certainly never walked a mile in his shoes. What I struggled with in this whole thing was imagining myself ever getting to the point where I felt like I was at a fork in the road where, if I wanted to do something, I had to go to Congress. That just seems pretty far out there to me.

Rick Howard: Seems like a big step [LAUGHS]

William MacMillan: It does. The thing that felt like a little bit of a missed opportunity is you noted in your intro about this article, that he was prevented from going to the Board by the CEO. I'm just thinking to myself, "gosh, I think I could probably come up with other ways to get to the Board." I don't think the only path to the board is through the CEO. I'm not second guessing him, because like I said, I don't know him, I didn't live his experience, this is his story not mine. I feel like if you do see something and you feel like you've got to say something, then, yeah, but the Congress thing just kind of jumped out at me as really pretty far out there.

Rick Howard: Let's reveal the answers, Eliana, what is everybody saying? This should be interesting. Wow. Wow. Look at our audience. There you go. I think we were leaning towards maybe not doing this, but I think our audience has proven us wrong. Let me just say this about that. In my early days as a security practitioner, I thought I knew everything, I guess, is really the bottom line here. When I couldn't convince leadership to do something that I thought was important, I'd blame them: they're not smart enough to understand the significance of this, or they just don't understand what I'm trying to say. As I've gotten older and done this job for a while, it really comes down to two things. One is, I didn't convey the risk the proper way, so they thought theirs was the right decision to make; that's on me.

Roselle Saffran: That goes back to my earlier point.

Rick Howard: That's exactly right. The second thing is, I conveyed the risk properly but those executives had a gazillion other risks they had to deal with and this one wasn't as important as that one. And that is likely what happens in most organizations, right? You don't hire a CSO and then not pay attention to what they say. Business leaders make decisions all the time about this kind of thing, right? They weigh risk, not just cyber risk, but all kinds of risks every day. Roselle, you know this, you're running a company. Security's important but it's probably not the most important thing every day. That's kind of my take. William, you're looking at me like I have a horn growing out of my head. Are you with me on this or no?

William MacMillan: No, no, I'm with you. I always say that conveying risk is our most sacred obligation on cybersecurity teams, so I'm completely tracking and aligned with what you're saying. At the same time, we have whistleblower protections for a reason, and we do teach people, if you see something, say something. I guess in his estimation, it was just beyond his ability to convey this risk; they weren't taking it seriously enough, and that was the decision he made. I think I would try to avoid getting myself into that situation; that would be my approach to this.

Rick Howard: Well, the only thing that rises to that kind of level is if he thought they were breaking the law somehow. The only thing in the summary I gave, and I may have misrepresented it, was the FTC consent decree, where they should have done something that he thought they hadn't done. That's the closest thing I could come up with to justify a whistleblower complaint. I don't know, what do you think, Roselle?

Roselle Saffran: I mean, the espionage side of it really bothers me.

Rick Howard: Ah, yeah, okay.

Roselle Saffran: Yeah, so if that is not being addressed, that's an issue because that could very well rise to the level of national security. I can see why at that point, if nothing is being done, especially given the breadth of Twitter's influence and its reach. I won't say that it's in this different echelon in terms of its responsibilities, but that does need to be factored into the equation when thinking about the risks involved.

Rick Howard: Well, I mean, I've been in jobs where we knew that there might be people leaning towards espionage. We hired foreign nationals in previous jobs, and so we'd watch them, put controls around them and things like that. Again, going to Congress is a big step, it's a huge step. Like I said, it's never been done before, so we've broken ground here. Something to watch; it'll be interesting to see how this plays out in the future. It's a good topic. I knew it was a little controversial, and based on what our audience says, the three of us are completely wrong compared to what they think. That's perfectly fine. That's it for my segment. Let's finish up with some general audience questions. When you all registered, some of you sent questions in to us, so let's go through some of them.

Rick Howard: First one's from Lorenzo Riccucci; I hope I said that right. "What future development do you foresee for the role of cyber threat intelligence teams in corporate environments?" William, you and I were talking about that before the show started, what do you think about that?

William MacMillan: Yeah, I think cyber threat intel analysts and analytic teams are living in a golden era. I think large organizations are really waking up to the need to pay more attention to the nuance. I think even geopolitical analysis is really on the rise in big organizations, where the general idea, getting back to this whole idea of moving from cybersecurity to cyber resilience, sort of implies paying more attention to the context and the details around you and making those intelligence-driven decisions at an organizational level. I'm really excited about the future role of cyber threat analysis and intelligence analysis in general. I hear more and more companies really interested in investing in this area, and I think it's a great development.

Rick Howard: Roselle, anything from your side?

Roselle Saffran: Yeah, I agree with William. Understanding thy enemy is half of the equation, and it's an important part of it. It's particularly interesting now that we see, essentially, this convergence of some of the nation state actor capabilities being given to these crime syndicates and to these activists. It's going to raise the bar across the board in terms of the sophistication of different types of actors, and being able to track that and incorporate that into operations is very important. I would add, though, that know thy enemy is half of it. The other half is know thyself. That side, I feel, has definitely been neglected in the industry.

 That's part of what I'm addressing with my current company. It's one thing to know, "we've got this snippet of malware and we know it belongs to this threat actor in this province, in this country," but then organizations aren't able to understand their own environments and what's critical that we need to make sure we're protecting and defending to the utmost. That I see as being the other side: we need to beef up our understanding of even our own environments.

Rick Howard: I have two points about this. I'm not sure if this is where it's going to go, but I think this is where it needs to go. We need to be much better at what you were talking about, Roselle: knowing thy enemy. I believe we should be putting prevention controls in place for the attack sequences of all the known adversary groups that are out there. If you look at the MITRE ATT&CK framework, there are about 125 to 150 groups that they track. Those are mostly nation states, and what we're lacking is all the cybercrime groups. I got one estimate from Microsoft last year that there were about 100 cybercrime groups operating on any given day on the internet. In the open source community, we're pretty much blind to all that. I would love to see a focus from the cyber threat intelligence teams on building that intelligence for the community.

 The second thing is a little bit more radical. We're talking about first principles and about how you calculate risk for your organization. I think that question goes to the intelligence teams. They're the ones that know about adversaries. They're the ones that should know how those adversaries operate in the very digital environments that we have out there. They're the organization that is most likely able to calculate the risk. I would love to see cyber threat intelligence teams be given those tasks in the future. Any comment? Go ahead.

Roselle Saffran: I just want to add to that. I absolutely agree that the threat intel side helps to understand that risk, but if you look at risk, it's likelihood times impact, and the likelihood side is where that threat intel comes in.

Rick Howard: Right.

Roselle Saffran: But, still, there needs to be that understanding of that impact.

Rick Howard: I agree with that, alright. Second question from Elijah Bjork. "For someone trying to break into the industry, what is the best strategy?" Alright, so a question from the newbie side. William, what do you think?

William MacMillan: Well, I think just dive into it. I've been amused lately with how many times I've heard the phrase "impostor syndrome." I was speaking at the Texas Cyber Summit recently and some people came up to me afterwards and said, "Well, you know, I feel like an impostor, I don't know what I'm doing and I want to break in." I said, "Man, just jump in"; there's so much online learning material nowadays. In my own company, we have this thing called Trailhead that's available to the public for free, where you can start learning cybersecurity. There's all kinds of stuff like that. Roselle probably knows a ton of resources that are oriented in this space. Just start; convey that interest. If you've got the underlying traits of intellectual curiosity and a passion for this sort of thing, there's never been a better time to break into the industry.

Rick Howard: Roselle, what do you think?

Roselle Saffran: I'm in complete agreement. First of all, I cringe when someone says to me, "I want to get into cybersecurity, so I'm going to get a Masters in cybersecurity."

Rick Howard: [LAUGHS]

Roselle Saffran: Like, "No!" It sounds funny to say, "No, you don't need more education," but I just say, "Look, that's not what you need, there are all these online resources. You start getting up to speed on those, you look at the job descriptions and see which tools they're mentioning specifically, you download the demo versions of each of those tools so you get yourself familiar with them, and just start applying." That's always my advice for people that are trying to break into the industry. Now, on the flip side, I'm also always trying to encourage hiring managers: hire people without a background in cybersecurity. It's so hard to find people that already have the skill set, and at the end of the day, they don't need to have the skill set going in. What they need to be able to do is learn, learn quickly, and be willing to learn.

 Once they get on the job, they're going to have to learn the whole stack that the staff uses, and they're going to have to learn the specific environment anyway. The industry changes so fast, they're going to have to be in this continuous learning mode no matter what. Even if they're starting from scratch, they'll get up to speed in three to six months, if they have the aptitude and the willingness to learn. Then they can really start rolling. There are two sides to it. I mean, we definitely have this workforce shortage, and I think on the hiring side there just needs to be more willingness to bring in people that are straight out of school, or that come from a different career field, but want to get in, and we'll figure it out.

Rick Howard: A couple of points there. William mentioned impostor syndrome. I will just tell you that I've been doing this stuff for 30 years, and I have impostor syndrome every week. If that's stopping you from doing this, don't let it; just dive in. Much of it is making believe that you know what you're doing, and eventually you'll start to know stuff. Second, I'm going to make a big push for my pet project, the Cybersecurity Canon Project: a canon of literature for cybersecurity. It's books; we try to recommend books that you all should have read by now. It's sponsored by Ohio State University and run by practitioners; it's like the Rock and Roll Hall of Fame for cybersecurity books. There's literally a book in there for anything you're interested in. My advice to any newbie is, if you read one book on a subject, you're probably the smartest person in the room on that subject. If you read a second book on it, you're going to be one of the handful of people that are the smartest in the world on it.

 Just start, and don't be afraid. Like Roselle said, you don't need a Master's degree, you just need to be able to solve problems for your boss. I have an interview question I always save for the end whenever I'm trying to hire a new person. You go through the list, are you qualified, blah, blah, blah. At the end, I always ask, "What are you running at your house?" Because if you're not running a Linux box that you've built yourself, you're not smart enough to be on my team. It's not that you have to know how to run Linux; you just have to be smart enough to take a problem you don't know anything about, read about it, work on it, and get to a solution. I'm hiring you to solve problems for me, because I don't want to do it, so I just need you to be good at that kind of thing. You don't need a Master's or a Ph.D.; you just need to be able to learn on your own and solve problems for yourself.

 Phew, I'll get off my soap box. You guys want to comment on any of that?

Roselle Saffran: I would usually include a similar question in interviews, well, similar in a way. I would ask, "What do you do when you don't know something?"

Rick Howard: Oh, yeah, that's more specific.

Roselle Saffran: What I'm listening for is this: if they say, "Well, I'm going to follow the SOPs," or something along those lines, then I know that they're not going to stretch too far. If they say, "Well, I'll ask my colleague," then we know that they're going to be a burden on the other folks on the team. If they say, "Well, I go to Google," or "I look at this resource, or this one, or this one," then I know that they have that willingness to learn something new, that it comes naturally to them. Especially if they already know resources, they've been doing their homework.

William MacMillan: Just to very briefly riff on Roselle's thought there. I like getting to the underlying traits and qualities, that skill set, rather than the specific "Did you grow up playing with this technology or that technology?" We're going to get a more diverse inflow of people, and I think in that diversity you really build very strong programs.

Rick Howard: Ladies and gentlemen, we are at the end of this. On behalf of my colleagues here, Roselle Saffran and William MacMillan. Thank you guys for participating, it was a fantastic discussion. For the audience out there, we'll see you at the next CyberWire quarterly analyst call. Thanks everybody, we'll see you soon.

Roselle Saffran: Thank you.

William MacMillan: Bye.