The shift toward corporate governance.
Rois Ni Thuama: In order for a business to survive and thrive, they need to build in corporate defense and corporate resilience and digital operational resilience. And that's where we're seeing this shift.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today, Ben has an update on the Pegasus spyware story. I ponder the underlying policy issues and the recent controversy at Spotify. And later in the show, my conversation with Dr. Rois Ni Thuama - she is head of cyber governance for Red Sift - on the changing landscape when it comes to governance and how organizations are approaching cyber. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's jump into our stories this week. You are kicking things off with the continuing saga of NSO Group and their Pegasus spyware.
Ben Yelin: Yes, we've certainly been here before. We've talked about NSO and Pegasus several times in the past, I think. Now we have a new New York Times Magazine investigation, which reveals a couple of critical pieces of new information. The first that's of interest to people here in the United States is that the FBI sought to purchase Pegasus software to use for domestic surveillance in the United States. We were certainly not aware that that was the case. This was all happening behind closed doors. And the FBI abandoned the plan to purchase this spyware in the middle of 2021. So that was some sort of policy decision made within the Department of Justice. We don't really have the full details as to what went into it, but certainly the fact that this company and the spyware has become politically controversial had something to do with it.
Dave Bittner: Yeah.
Ben Yelin: The other...
Dave Bittner: Well, and - I'm sorry, but the NSO Group folks were sort of famously crowing that their software couldn't be used for American phone numbers and so on and so forth. They were claiming that that was a condition that the Israeli government had placed on them, right?
Ben Yelin: Right. You know, I think their belief was that - or the FBI's belief was that once the technology exists, you can figure out how to deploy it here and use it as a domestic surveillance tool, certainly because it offers a lot of promise. I mean, we have criminals in this country - and this is the FBI's perspective, not necessarily mine - who are going dark. And we need to have the best tools to try and infiltrate their conversations, whether that's about drug cartels or organized crime or anything like that. And it certainly has been used with great success in other countries for legitimate law enforcement operations. This is how we found the notorious El Chapo in Mexico, who was one of the drug cartel runners there, perhaps the most famous. So it solved, you know, a lot of domestic security incidents across the world.
Ben Yelin: So that's one really interesting element of it. It is - it's interesting that the FBI was engaged in trying to purchase the software for domestic surveillance, particularly because we've seen how the software has been abused in authoritarian countries where it's been used to spy on dissidents, journalists, et cetera.
Ben Yelin: The other really interesting element here is that this has become a tool of diplomacy for the Israeli government. So this is an Israeli company. They're based just outside Tel Aviv. And Israel has been kind of offering this spyware as a carrot in some of its major diplomatic negotiations, including the Abraham Accords in 2020, where they had these breakthrough diplomatic agreements with countries in the Arab world, saying, hey, as part of this deal, have this awesome spyware that this company developed. I mean, let's just say that, you know, the countries that have used that technology haven't always done so for the most - on the up and up, let's say.
Dave Bittner: Yeah.
Ben Yelin: They've used it for rather nefarious purposes. So, yeah, I mean, these are sort of the two elements of the story. We didn't know that this is something that the FBI was interested in. Now we know that the FBI has rejected the purchase of the software. We also know for the first time how Israel has really used this in their diplomatic efforts. It's a - it's something to offer other countries in diplomatic negotiations, despite the fact that, you know, we've seen it used across - you know, across the globe for illicit surveillance operations, including from some of our most authoritarian governments. And the fact that, you know, as you alluded to, America has blacklisted NSO, you know, which NSO says it's suffocating its services. It denies access to some of our American technology that it needs to run its operations. They mentioned Dell computers and Amazon cloud servers here.
Ben Yelin: So it's just a really interesting development. This all comes from a New York Times Magazine investigation. So there's also the interesting element of who leaked this information. How did they obtain it? And what was the motivation for the person who sent this to The New York Times Magazine?
Dave Bittner: So despite the FBI ultimately thinking better of it, had they gone down the path of using this, is this the type of thing that would normally be in bounds for an organization like the FBI? Are there civil liberties concerns here?
Ben Yelin: Oh, there certainly are civil liberties concerns, but our jurisprudence really isn't dependent at this point on the nature of the technology itself. And this is something we've talked about a million times. If you have Fourth Amendment concerns, that all depends. You know, whether you have - whether the government is exceeding its authority under the Fourth Amendment depends on whether a person has a reasonable expectation of privacy. That defines what a Fourth Amendment search is. Then, of course, the search has to be reasonable. And generally, you know, in the absence of a warrant, a search can be reasonable if the security needs of the government outweigh the potential privacy concerns.
Ben Yelin: Under that framework, I don't see any reason why, you know, if we were investigating drug cartels, organized crime, you know, even, you know, gang warfare in American cities, that this technology could not have been justified under the Fourth Amendment. I think, you know, we've seen similar technology, even technology that can break encryption, that can overcome the desire of criminals to go dark. We've seen that be allowed under our Fourth Amendment jurisprudence just because there's no real technological red line - no piece of technology that's off-limits per se. That's just not the way our Fourth Amendment works at this point. A lot of scholars want to move us in that direction, where the amount of Fourth Amendment protection depends on the capabilities of the technology.
Ben Yelin: And because this technology is able to, you know, at least potentially read communications that have been end-to-end encrypted, perhaps that should enhance, in the view of these scholars, the constitutional rights of the users. But jurisprudence hasn't reached that point yet. So even though there are significant civil liberties concerns, there's no, like, red switch where, you know, if this had been used in the country, that's per se illegal.
Dave Bittner: Yeah.
Ben Yelin: I certainly could foresee a world in which it was used. We use our, you know, not just the FBI, but state law enforcement agencies use surveillance technology with these types of capabilities all the time, and they do it legally.
Dave Bittner: Is this the kind of thing that you would expect, if it were put in use, that a warrant would be required?
Ben Yelin: Not necessarily. You know, I think it could have been used warrantlessly if there was a legitimate law enforcement need and, of course, a determination that this didn't violate a reasonable expectation of privacy. The expectation of privacy question gets a little complicated when we're talking about encrypted communications because people have an expectation that, you know, the government isn't going to be able to access those communications. And courts haven't really figured out how to deal with that yet. The one thing you do not have a reasonable expectation of privacy in is the metadata.
Ben Yelin: And we've seen - I think we talked about this in a recent segment - even with, you know, applications like WhatsApp, the government doesn't need a warrant to access metadata. So even if the communications themselves are encrypted, sometimes you can learn a lot just by, you know, who was messaging whom and at what time and, you know, what the duration was.
Dave Bittner: Right.
Ben Yelin: So even if this technology wasn't being employed to spy on the actual communications of its users, it could have been used to glean useful information on its users. And that could have been done warrantlessly.
Dave Bittner: Yeah. All right. Well, interesting that the FBI thought better of it. Maybe, you know, the controversy, the fact that it's, as you said, blacklisted by the U.S. government, maybe it was just a little too hot to handle.
Ben Yelin: Yeah. Maybe they listened to our podcast...
Dave Bittner: (Laughter).
Ben Yelin: ...And that proved to be a warning sign at FBI headquarters in Washington, D.C., at the Hoover Building. And they thought better of it.
Dave Bittner: Yeah, they just didn't want the scrutiny of the "Caveat" podcast. I think that's possible.
Ben Yelin: A lot of people feel that way. You know, I will say it takes a lot because the brochure for Pegasus that was given to American law enforcement agencies said that this technology can turn your target smartphone into an intelligence gold mine. That's hard to turn down. You know, that certainly would be enticing if I'm a law enforcement agent. So the fact that the FBI did end up making this decision is in and of itself quite notable, I think.
Dave Bittner: Yeah. All right. Well, we will have a link to that story from The New York Times Magazine in the show notes. My story also comes from The New York Times, this article written by Kevin Roose, although this story has been covered extensively pretty much everywhere. I want to talk about the hot water that Spotify finds itself in this week with their host Joe Rogan and the heat they've been taking from some of their other artists on the platform, people calling for bans of the platform. Sort of, I guess, the back story here is that talk show host Joe Rogan, who's extraordinarily popular on that platform...
Ben Yelin: Second-most-popular podcast next to "Caveat" in the whole country.
Dave Bittner: (Laughter) That's right. That's right. Oh, what we couldn't do with his download numbers.
Ben Yelin: I know. I know.
Dave Bittner: So, you know, Spotify gave him a large sum of money, generally you see it being reported as $100 million to come to Spotify and be an exclusive. And they did this to really kick off their podcasting area of their platform, to get a lot of press attention, to get one of the...
Ben Yelin: Their podcasting vertical, if you will.
Dave Bittner: Yes, absolutely. And they had great success with that and continue to do so. Joe Rogan is not without controversy. Some of the guests that he brings on his show have all sorts of interesting opinions and takes on all sorts of things, from COVID-19 to UFOs and everything in between. And so this specific case sort of came to a head because recording artist Neil Young, who is a polio survivor, it turns out, got his dander up over some of what he claims is misinformation that Joe Rogan has shared on his program. And Neil Young has pulled his library from Spotify. Joni Mitchell also pulled her library. Some other artists, some other podcasters are doing the same. In the past few days, there's been a bit of a movement of people on social media saying that they're canceling their Spotify accounts. Perhaps most important to the powers that be at Spotify, their stock price took a major hit. It has since rebounded.
Dave Bittner: You know, I guess, around all of this, what I'm most interested for our conversation is sort of the background policy on all of this. Of course, as always happens with these sorts of things, people are claiming censorship. I'm going to go out on a limb and say, we're going to claim this is not that (laughter).
Ben Yelin: Yeah. Right, right.
Dave Bittner: But I want to try to clarify what exactly is this, and what is Spotify's obligations and responsibilities here when it comes to misinformation and the folks they host on their platform?
Ben Yelin: It's a great question, and this is not unique to Spotify. We've seen this hundreds of different times with different platforms, whether it's Facebook or Twitter. You know, they express their company's values, which are, you know, fostering an environment of free speech, creative content, etc. Then you get a user of that platform who posts things that are, you know, defamatory, allegedly dangerous, you know, constitute misinformation. Then there's the inevitable backlash, so maybe other prominent users of that platform say, we're going to leave this platform if, you know, you don't change your policies, if there isn't better content moderation. And then, you know, sometimes, the companies themselves make the decision to lock that person's account, suspend their account, etc. Then that person goes and complains that they've been canceled or whatever the terminology is. So this certainly isn't unique.
Ben Yelin: I think, you know, for our purposes, it's important to see what the relevant legal and policy issues are here. Let's first note that the government is not involved, so this isn't per se a First Amendment issue. Maybe it has to do with the spirit of the First Amendment. But certainly, the government is not compelling Spotify to take any particular action. Joe Rogan himself has not been censored by the government. He hasn't been threatened with arrests or fines by the FCC. He's free to say whatever he wants. You know, he has the type of listenership where if he decided, I'm going to, you know, leave Spotify and the $100 million on the table and start my own podcasting network, a lot of people would go there. He'd make a lot of money. He certainly has that option.
Dave Bittner: Right.
Ben Yelin: Really, this is just sort of the capitalistic system at work. You know, you have Joe Rogan, who has hundreds of thousands, millions of listeners, has one of the most popular podcasts that's out there right now, and then you have pushback by people who are taking their music outside of Spotify's music platform. So if you're a fan of 1970s folk singers like I am - you know, Neil Young and Joni Mitchell - maybe you're moving your services to iTunes. None of that, you know, involves any type of government action. It's all something that's being done within the private sector. So this is certainly - this is not censorship, even though people might say, you know, it violates the spirit of the First Amendment because somebody is being punished for expressing unpopular political opinions.
Ben Yelin: Then the other interesting element of this is, you know, in theory, could Spotify be liable for some of the content that Joe Rogan posts on its platform? And that's a more difficult question. So let's say somebody decided to sue Joe Rogan, you know, seeking damages for the misinformation that he put out related to COVID-19 - you know, family member of somebody who didn't take the vaccine ended up, you know, having a bad result and sued Spotify. You know, generally, if they were just a neutral content platform, they would be protected under Section 230 of the Communications Decency Act. But right, you know, here, it's kind of unclear if they are just that neutral platform, given the fact that they are paying Joe Rogan $100 million to have the podcast on their network.
Dave Bittner: Yeah.
Ben Yelin: So that might cut against their - you know, the notion that they are this neutral platform. I still think they probably wouldn't be held liable, you know, given that the reach of Section 230, as it exists right now, is relatively broad. But it certainly presents an interesting question. I don't think they could simply hide, you know, the way Facebook has or the way Google has by saying, you know, we are not the publisher of this content. We are just the platform. So those are kind of the interesting elements of the story, from my perspective.
Dave Bittner: Yeah, I mean, it seems pretty clear to me that when you're paying someone that kind of money to be exclusive to your platform that you are now a publisher. And so I think what I've seen is criticism of Spotify because the statements they've put out have come at this from the point of view of that of a neutral platform...
Ben Yelin: Exactly.
Dave Bittner: ...Which is understandable, but that's not what they are in this case - in my opinion, anyway.
Ben Yelin: I think you're absolutely right. I mean, what they're - so they released an extensive statement, you know, and it's the type of statement we've seen from basically all of these companies saying, you know, we understand the concern about misinformation. We're going to provide our listeners a link to authoritative COVID information. But, you know, we support free speech. We don't want to censor anybody who's on our platform. We are not the publisher. We're just allowing this content to be - you know, to be posted under the Spotify umbrella. I don't think they can credibly make that claim since they're the ones paying $100 million for the exclusive rights to this podcast. I just don't think they can make that claim with a straight face.
Ben Yelin: You know, now, there are certainly arguments that they can make in, you know, support of their decision to keep Joe Rogan on their platform. They might say, as a company, you know, it is not our position to espouse any values whatsoever. We're giving everybody, whether it's Joe Rogan or anybody else, the ability to post content, to put podcasts on the Spotify platform or to release music. And that's never really the case because Spotify, Facebook, Twitter - they all have some type of content moderation. So it all, you know, gets down to this matter of degree. You can't post literally anything you want on Facebook. You can't post pornography on Facebook.
Dave Bittner: Right.
Ben Yelin: You can't, you know, post threats to injure yourself or somebody else on Facebook or on Twitter. So, you know, I think when these companies try and step back and say, we're not responsible for this content, we don't make content moderation decisions, they really do make content moderation decisions. So it's just a matter of, you know, what counts, from their perspective, as the type of content that they don't believe should be posted on their platform. And I think what Neil Young and Joni Mitchell and some of these other artists are saying is, we don't want to be associated with a platform that's going to be OK with posting, for example, you know, COVID misinformation.
Dave Bittner: Yeah.
Ben Yelin: So I definitely think that's what's happening here.
Dave Bittner: So let me ask you this. Outside of the situation with Joe Rogan, is there a point where the government steps in when it comes to misinformation - and let's use, you know, medical misinformation as an example here - how does that contend with the First Amendment? If I - if someone out there in great prominence is putting something - putting information out there that is just wrong - somebody is saying, you know, hey, here's a cure for COVID-19. Inject yourself with bleach. And that is both wrong and dangerous. At what point does the government stop relying on the free market to keep something like that from spreading around? Or is it - because of our First Amendment, is it fair game for someone to stand on the street corner or the broadcast network and shout that to the world?
Ben Yelin: Yeah. So besides, you know - there are - you can ban false advertising in a way that's constitutional. Our government does that with some of our consumer protection laws. But our First Amendment is incredibly broad. Our courts have said they want to protect the marketplace of ideas. There is no point and basically no level of misinformation out there where the government is going to step in and censor it, put some sort of prior restraint on it. Some people think that's good. Some people think that's bad. I mean, we have probably the most robust set of free speech rights of any Western democracy, and that's what makes our country unique. We protect good speech. We protect offensive speech, for the most part. And we protect speech that might, you know, cause kinetic harm to people, that might cause people serious injury and/or death.
Ben Yelin: But generally, the government has decided, let the marketplace of ideas take care of that speech. If somebody posts, you know, a quack cure for COVID, the hope is that, you know, you'll - within that marketplace of ideas, a more authoritative voice will come in and say, actually, this is B.S. Here are, you know, 100,000 doctors who've reviewed this evidence and says it's bunk. And, you know, people will have enough information to make their own informed decision. Does it always work that way? Absolutely not. But that's kind of the price we pay for having this extremely robust First Amendment, and that's not going to change. I mean, we have now, you know, hundreds of years' worth of precedent saying the First Amendment is incredibly strong. It can - you know, there are some restrictions that can be placed on it in limited circumstances - something that would lead to imminent lawless action, you know, some restrictions on obscenities, you know, things like that.
Dave Bittner: What about hate speech? How does that bump up? I mean - 'cause there are laws against hate speech, right?
Ben Yelin: There aren't laws against hate speech, necessarily. There are hate crimes laws. So if you commit a crime that's motivated by hate, you know, for a particular demographic group, then that can affect, you know, what federal crime you're charged with.
Dave Bittner: I see.
Ben Yelin: But hate speech itself is constitutionally protected as long as it doesn't lead to other lawless action.
Dave Bittner: OK.
Ben Yelin: So, yeah, our First Amendment - it's big. It's bold. It's powerful. And, you know, that's kind of the situation we live in. And I think within that system, it makes it incumbent - it's kind of all of our responsibility as consumers of media - to make sure that the truth comes out one way or another because the government isn't going to step in and solve this problem for us. You know, we have to go out in large numbers and say, you know, this is the information that's true. This is the information that consists of falsehoods. And, you know, you have to confront misinformation with correct information.
Dave Bittner: Yeah.
Ben Yelin: That's just the nature of our system.
Dave Bittner: All right, well, we will have a link to that story in the show notes, and of course, we would love to hear from you. If you have a story you would like us to cover or a question for me or for Ben - probably a question for Ben - you can send us email. It's firstname.lastname@example.org.
Dave Bittner: Ben, I recently had the pleasure of speaking with Dr. Rois Ni Thuama. She is head of cyber governance for a cybersecurity company called Red Sift. And our conversation centers on the changing landscape when it comes to governance and how organizations are approaching cyber. Really interesting conversation. Here is Dr. Rois Ni Thuama.
Rois Ni Thuama: I think what it is is that corporations have been built and have grown on a philosophy of value creation and value generation. And that has loads of merit. Of course, businesses are in the business of making money, and all of that makes sense. But we are seeing this paradigmatic shift in how businesses need to defend themselves. So I think that the value preservation imperative, which has been talked about at a very high level, is now becoming part of the conversation. So whereas before you would have had hyper-specialization - you would have your information security people, you would have your technical experts - now business people with MBAs, lawyers, they all have to understand that actually, in order for a business to survive and thrive, they need to build in corporate defense and corporate resilience and digital operational resilience. And that's where we're seeing the shift, I think, and that's what underpins everything that we're going to see and probably talk about for the next few minutes.
Dave Bittner: Are boards shifting the specific expertise that they're looking for from board members themselves? Are they looking for folks who have knowledge in the cyber realm?
Rois Ni Thuama: So I think two things are happening. Yes, that's No. 1. So they're looking for people who can bridge the gap between the technical and the compliance or the technical and the governance. That's No. 1. And the second thing that we're seeing, and it's partly because of new regulation that's coming in, is that boards themselves are raising their own awareness level so that they can have meaningful conversations about products that they're seeing in the market, about solutions that are imperative for them to be able to have a, you know, robust cybersecurity posture.
Rois Ni Thuama: So, yeah, so I would say it's two things - one, they are bringing in the experts, and two, raising that level. And so I've seen this. The first time I have seen this is in the Digital Operational Resilience Act. So that applies to financial entities, and the scope of that is quite wide. It's banks. It's insurance companies. It's rating agencies. It's audit companies, so, you know, KPMG, PwC, all of those guys need to comply. Reinsurance companies, broker firms, investment companies - you know, if it creates, manages, or transfers money or cryptocurrency, this applies to all of those. It also crucially applies to ICT third-party suppliers.
Rois Ni Thuama: So if you - if any business is supplying software, cloud services, anything like that, they will also need to comply. Now, what's interesting in this is that contained in this new piece of law, which will come in sometime in 2022, they are requiring boards to fully understand. So they - boards won't be able to shift responsibility. They won't be able to say, ah, yes, but we outsourced this or - no, the buck firmly stops with the board under this piece of legislation. And it requires the boards to participate meaningfully in these discussions. And in order for them to do that, it specifically calls out executive training for the board. So, yeah, so two things.
Dave Bittner: Interesting. Does this have boards taking a new look at, like, their errors and omission policies, you know, to protect themselves?
Rois Ni Thuama: I mean, that is a great starting point for boards who are making this calculation. It seems to me at the moment that it's the largest financial - it's the usual. You know, the guys with the deepest pockets are making the first moves because they can. So I imagine a lot of the learnings from that will then be filtered down to, you know, the smaller firms. But yeah, that's a good starting point - and assessing the knowledge of the board at the moment, right? Because you're going to have technical people who won't need to understand the corporate impact, you know, so much. I mean, it's kind of the evolution of where we were going. This isn't to say that 10 years ago there was a misstep, and it's not to say that 10 years ago errors were made. Of course, we could always have done better, but that's the assessment we make with 20/20 hindsight. I think it's a natural evolution of where we're going, and it's sort of the organic growth. And what I mean by that is if you look back 10 years ago, we had limited data on what it meant for firms. But there was a really interesting report published by ENISA, the European Union cybersecurity agency, at the end of October, and they identified some prime threats.
Rois Ni Thuama: Now, what was really interesting - if you looked at the prime threats that they published in October '21, it's the same threats that they've been publishing every year since they started publishing the prime threats. The only thing that has changed is the names. So we used to call malware malicious code. You remember Trojans? So they referred to - yeah, right? - so they referred to Trojans. And ransomware has been a big problem for years, but they were referring to it as rogueware or scareware, right? So the names have changed, but the prime threats have remained the same.
Rois Ni Thuama: So I suppose my point is 10 years ago, a director looking at this didn't have 10 years' worth of data. That's No. 1. Now we do. So when we know better, we do better. And also, what's really interesting is that the courts don't expect you to see around corners, but they do expect you to read the writing on the wall. And if that writing has been on the wall for 10 years, you really need to address those prime threats.
Dave Bittner: You know, one thing we've seen is that insurance companies have been making recalibrations on the policies they're willing to offer, the prices that they're charging for them and also, I think, the demands that they're making on the companies that they're insuring. They're - it seems like there's a lot more scrutiny than there was in previous years. How is that affecting things as well?
Rois Ni Thuama: So this is a really - so one of the ways businesses would have managed risk previously is that they would have simply transferred it. So for 100,000 pounds' worth of policy, we were able to shift the liability and all of the responsibility. Insurance companies are reeling from ransomware. Essentially, they did the sums, and they got them badly wrong.
Rois Ni Thuama: OK. I make the distinction between being wrong and badly wrong like this. Fifty and 50 does not equal 101 - you're wrong, but, you know, you're within a margin of error that's comfortable. But 50 and 50 does not equal purple - you're badly wrong. You just don't understand what it is you're talking about. And I think that the insurance sector has been badly wrong about ransomware, and they're reeling. And there are going to be a couple of companies in 2022 that are going to go to the wall because of this poor assessment of the risk and what they were able to stomach and what they were able to deal with.
Rois Ni Thuama: To mitigate and manage this risk, now what we're seeing is insurance companies are requiring their clients to be a lot more robust. Here's what's interesting - cyber insurance companies, or companies that offer cyber insurance, are now turning away firms who bear all of the hallmarks of weak cyber governance. They're like, you're so weak, you're making yourself a target. And for us, we're not going to stomach that. And that is the equivalent of, you know, you don't leave your keys in your car when you go in for the night. You don't drive drunk. You wear your seatbelt. All of these things that we accept as basics in the real world - there are digital equivalents.
Rois Ni Thuama: So what insurance companies are asking or requiring of their clients now are minimums. So one of the things is multifactor authentication. Another is endpoint protection. The DMARC protocol must be in place. And that's just, like I mentioned earlier, sort of the organic evolution - you know, we have enough data. We know these things mitigate risk.
Rois Ni Thuama: And actually, that's interesting as well, because the ENISA threat report, in its recommendations, has many of the same requirements. So what the ENISA report recommends is in line with what you would see with NIST. It's no different. It's no different to what you might see in the CMMC, the Cybersecurity Maturity Model Certification. And it's no different to what the insurance companies are requiring. So whichever one of these you pick, you're going to need to do multifactor authentication, endpoint protection, DMARC and so on and so forth. So it's kind of a no-brainer for businesses who are looking to manage the risk and to be able to access cyber insurance.
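(A quick illustration of the DMARC minimum mentioned above: a domain publishes its policy as a DNS TXT record at `_dmarc.<domain>`, and an insurer's check largely comes down to whether the `p=` tag actually enforces anything. The record strings below are hypothetical examples for the sketch; a real check would fetch the live TXT record over DNS.)

```python
# Minimal sketch: parse a DMARC TXT record and flag whether its
# policy is actually enforced. Record strings are illustrative;
# a real audit would look up the TXT record at _dmarc.<domain>.

def parse_dmarc(record: str) -> dict:
    """Split a record like 'v=DMARC1; p=reject' into a tag dict."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

def is_enforced(record: str) -> bool:
    """True only if the policy quarantines or rejects failing mail;
    'p=none' is monitoring only, which is the weak posture insurers
    are pushing back on."""
    tags = parse_dmarc(record)
    return tags.get("v") == "DMARC1" and tags.get("p") in ("quarantine", "reject")

if __name__ == "__main__":
    print(is_enforced("v=DMARC1; p=reject; rua=mailto:reports@example.com"))  # True
    print(is_enforced("v=DMARC1; p=none"))  # False: monitoring only
```

The design point is simply that "DMARC in place" is checkable from public DNS, which is why it makes a workable underwriting minimum.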
Dave Bittner: Is there an equivalent of building codes for cyber? In other words, you know, I build a new building, and the building codes require that I have a certain number of exits, that I have exit signs, that I have sprinklers, you know, those sorts of things - fire escapes, all those sorts of things. But that's different than my insurance company saying - offering me a carrot and saying, hey, if you have a fire extinguisher in your kitchen, we're going to give you a discount on your homeowner's insurance - you know, different set of incentives there. Do we have those government-side mandates that are evolving as well?
Rois Ni Thuama: So yes and no. I'm sorry, but this is the situation, so it's yes and no. So if we take, for example, DORA - I like the building analogy, but every building will have doors, right? Because how else are people going to come in and out? If it's more than one level, it will have stairs as well as a lift, because what are you going to do in the event of a fire? You just can't make that kind of blanket statement for the types of hardware and software businesses will deploy. So it's not as simple as saying, look; every business needs to have this, or it needs to have that. I mean, every business has email, so there are certain things where you could say, look; it's a really good idea to have this. But this is where the expression "address reasonably identifiable circumstances" comes in - I mean, that's beautiful, because then that means...
Dave Bittner: (Laughter) Lawyers love fuzzy terms like that, right?
Rois Ni Thuama: Yeah. I will say this in defense of lawyers, you know - because they're given to legal gobbledygook a lot of the time, and I find that kind of painful. But this is plain and very clear: as a business, all we need to do is address reasonably identifiable circumstances that will lead to, you know, malfunction, disruption and so on. OK, this is great, because now you have a very clear thing that you need to do. You need to address reasonably identifiable circumstances.
Rois Ni Thuama: OK, so now you ask, well, what is reasonably identifiable? Well, if you've got the FBI jumping up and down every year with the IC3 report saying these are the most significant threats to your business, that's not just reasonably identifiable, Dave. That's easily identifiable. Jump all over that. Oh, and I would say this as well. The FBI have a really good track record of warning about things. So I'm going to take you back to 2004, when one of the directors of the FBI was putting out press releases and running press conferences like on a loop. And he was saying, look; the amount of mortgage fraud that we're seeing has the potential to become a pandemic. I think those are his words almost precisely - before COVID, in 2004, he used the word pandemic. He understood the interconnectedness of the financial services sector, and he could see the problem that was on the horizon. That was in 2004. Lehman and Bear Stearns didn't collapse until 2008. The FBI got it right, and they got it right years in advance.
Rois Ni Thuama: If I was working for a corporate, the first thing I would be looking to is, what are the FBI saying? They've got access to data that is otherwise unavailable to the private sector, and they have a good track record of getting things really right. So the first thing businesses should be doing is looking at the IC3 internet crime report, and then they should be looking to see, well, what are the biggest threats? Business email compromise, the $26 billion scam - the FBI are tearing their hair out, warning businesses on a loop that this needs to be addressed. So, you know what? Do that. It's reasonably identifiable.
Rois Ni Thuama: OK, what else is reasonably identifiable? Anything in the U.K. that the National Cyber Security Centre pumps out. And you know what? The good news here is, if the FBI are saying that they're seeing it on their side of the pond, the NCSC is saying that they're seeing it on this side of the pond. So now you have two credible agencies who are telling you the same thing. It might be more complex if one agency were saying, well, I'm seeing a lot of this, and the other were saying, well, I'm seeing a lot of that - you know, that would introduce ambiguity. But that's not what is happening. So then you can back it up. Is this something that NIST, the National Institute of Standards and Technology, is recommending? Oh, yeah. Right. They are. Then you've got the Cybersecurity Maturity Model Certification from the U.S. Department of Defense.
Rois Ni Thuama: Now, if you think I get excited about DORA - this is your tick-box approach. This is that building thing you were talking about, that checklist of do this, do this, do this. If the U.S. Department of Defense has gathered all of the brains it has at its disposal, and they've made the determination that in order for them to be robust, they need to do certain things, it seems to me that's a life hack for companies. You don't need to do all of the background research and make your own determination. Just copy the homework of the smartest kid in the class. Do that, you know?
Dave Bittner: (Laughter) Love it. To what degree do you think that, for organizations that are able to successfully implement these things, it will be a competitive advantage? And for organizations that don't, could it be a business killer?
Rois Ni Thuama: Oh, I mean, it is an existential threat. So I'm going to answer your second question first: it's an existential threat. This is what we've seen. We've seen businesses go to the wall six to 12 months after they have suffered a significant breach. Smaller businesses tend not to be able to survive or recover. We also know that, in the U.K., 97% of businesses are small businesses, and that's what's sustaining the economy. Smaller businesses tend to hire more people proportionate to the money that they're bringing in. So they're the lifeblood of the economy. So everybody has skin in the game, and that's really important. So, yeah, it's an existential threat.
Rois Ni Thuama: As to your other point, is this a unique selling point? You bet it is. So this is what I would say. If you have commercially sensitive information and you need to find a law firm, well, the first thing you should be doing is making sure that that law firm is as robust as it reasonably can be. Because if your data is breached, that means your competitor can see it. I mean, you could be dead in the water if information about your business is out there. So, yeah, if I was guiding a client to select a law firm, for example, I would say, well, you need to run supply chain due diligence on that law firm and make sure that, so far as reasonably possible, they have a robust cybersecurity posture. And I wouldn't touch anybody whose cybersecurity posture was weak. And there is no reason that that needs to happen.
Rois Ni Thuama: And then I would also look to this. I would ask the law firm, well - so the law firm itself is robust, but now, what do you require of your supply chain? And this isn't new, by the way. Banks have been doing this for years. There are a lot of large global banks who themselves are leaders - you know, they have the deep pockets. I'll give you one example. J.P. Morgan has a budget of around 600 million a year for cybersecurity, according to their annual report. That's a material sum by any stretch of the imagination. But what you're seeing these large banks do is not only are they making themselves robust, but they're then mandating that anybody who supplies to them does certain things.
Rois Ni Thuama: This is where, if you're looking to NIST or the CMMC, or if you're looking to what the prime threats are according to the FBI or the NCSC, you're going to be covered for all of those things anyway. Because where are these big banks getting their information from? Well, they're looking to these same guys. Because the worry for years - when you go to conferences, people talk about, you know, zero-day vulnerabilities, or they want to know, would that have stopped Stuxnet? And I'm like, why - are you enriching uranium? Why do you want to know about something that's an outlier, that's unlikely to happen?
Rois Ni Thuama: You need to be thinking about the technically artless exploit that is being used on a daily basis, hand over fist, by the Fancy Bears of this world. You know, all of these government agencies - they're not deploying the tough stuff while we're making it easy.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: Just interesting that, you know, the role that cybersecurity is now playing within individual, public and private organizations that it didn't play, you know, 10 to 15 years ago. I think once you have, you know, all these cyberattacks, ransomware attacks on the public and private sectors, companies are now willing to invest in, you know, CISOs. If you're a small business and didn't have one in the past, now's the time to go out there and find some talent.
Dave Bittner: Yeah.
Ben Yelin: I think that's just the nature of the threat. Once the threat becomes more real, and once companies in the private sector realize it affects their bottom line, and companies in the public sector realize, hey, our Department of Health website, when it goes down for a month, that's, you know, going to really hurt our citizens - as happened in Maryland. So, yeah, it's an interesting conversation in that respect.
Dave Bittner: Yeah. It's interesting how cyber is kind of - continues to creep its way up the org chart, you know.
Ben Yelin: Sooner or later, cyber is going to take over everything. There's not going to be a CEO. The cybersecurity professional is just going to be in charge.
Dave Bittner: Right. Right. All right. Well, again, our thanks to Dr. Rois Ni Thuama for joining us. We do appreciate her taking the time.
Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.