Caveat 3.17.22
Ep 117 | 3.17.22

The reluctance to use offensive cyber tools.

Transcript

Liz Wharton: It's not a matter of if but when and how bad, and forcing them or encouraging them to take a look at their systems, to look and say, OK, have we patched for this?

Dave Bittner: Hello, everyone, and welcome to Caveat, the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On today's show, Ben discusses the reluctance on the part of the U.S. and Russia to use offensive cyber tools. I've got the story of the FTC requiring organizations to delete troublesome algorithms. And later in the show, Ben speaks with Liz Wharton from Scythe on the evolving privacy regulatory environment. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump into some stories here. Why don't you start things off for us? 

Ben Yelin: So I'm not sure if my story is comforting, and I'm not sure if it's intended to be comforting, but I kind of got that feeling, coming out of the story. 

Dave Bittner: OK. 

Ben Yelin: So it's by Kim Zetter over at Politico. 

Dave Bittner: Yeah. 

Ben Yelin: And it is entitled "Not the Time to Go Poking Around: How Former U.S. Hackers View Dealing With Russia." And this is about offensive cyber operations. So obviously, there's the actual physical war going on in Ukraine. There's the expectation that that could escalate, particularly if Russia attacks NATO allies. 

Dave Bittner: Right. 

Ben Yelin: We'd be forced to get involved militarily. But then there's also the second part of this war. And really, it's what makes this probably the first major hybrid conflict in the 21st century, and that's the offensive cyber operations. And what I found really interesting about this article is there's a reluctance on the part of both U.S. actors and Russian actors to use the most offensive tools in our cyber arsenal to really inflict the most damage. And I felt like I was reading an article from 60 years ago about nuclear weapons... 

Dave Bittner: Oh, interesting. 

Ben Yelin: ...Because there is the same concept of mutually assured destruction. So as a little bit of background, I mean, we have been engaged in espionage on Russian networks, on their computer systems, for several years now... 

Dave Bittner: Sure. 

Ben Yelin: ...Probably decades... 

Dave Bittner: Yeah. 

Ben Yelin: ...To collect intelligence. Obviously, our intelligence operations have been relatively successful, in that we got a pretty good heads-up that this invasion was coming. But there have been entire units of our government agencies, like the National Security Agency, that have engaged in tactics, at least, to be prepared for cyber warfare. And that includes hacking into Russian systems. So those weapons are available to us. If we were attacked in some capacity, it is well within our capabilities to do something like attack Russia's critical infrastructure. The problem, and the reason that I think we're very reluctant to do that, is they have those retaliatory capabilities as well. 

Dave Bittner: Right. 

Ben Yelin: So if we attack their critical infrastructure and bring down a power grid in a major city or disrupt a water system or a sewage system, they have the capability to do that to us. And not only is that going to have extremely detrimental effects on our own citizens, but that, too, is its own form of escalation that could lead to greater conflict. So we're so fearful about some mistake happening in the skies over Ukraine that escalates this conflict into a full-fledged war between nuclear powers - and I think cyber operations are also involved in that calculation - we don't want to take any actions that the Putin government and Russia would interpret as an escalation. And that would, in the words of Kim Zetter in this article, trigger a reprisal. So this is a very well-sourced interview. She talked with people who've been involved in intelligence gathering and hacking into Russian systems. I just - I thought it was perhaps a little bit comforting to know that the protection we all get from this concept of mutually assured destruction, at least for now, also applies... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...To... 

Dave Bittner: We're sleeping under the warm blanket of mutually assured destruction (laughter). 

Ben Yelin: Sleep well tonight, everybody. We all have nuclear weapons, so we're all safe. Yeah. 

Dave Bittner: Yeah. You know, it's interesting to me how I think going into this conflict, a lot of folks, certainly outside the intelligence community, were assuming that we would see cyber operations leading the way... 

Ben Yelin: Right. 

Dave Bittner: ...That as part of the invasion, they would turn the lights off. They would shut down the networks. And so initially, there was a lot of surprise that that didn't happen and a lot of speculation. You know, why? Why aren't they turning the lights off? Why aren't they shutting down... 

Ben Yelin: Very good question. 

Dave Bittner: ...The cellular networks and so on and so forth? So - and I think in retrospect, it turns out that they needed stuff to be on. We've seen stories about how the Russian forces are facing all sorts of challenges with their communication systems. And so they're relying on cellular networks, old cellular networks... 

Ben Yelin: Right. 

Dave Bittner: ...To communicate with their troops, so it's against their interests to take them down. But the retaliatory thing is, of course, fascinating as well. I think also part of this - my understanding is that part of this is that they want to hold back on revealing capabilities... 

Ben Yelin: Right. 

Dave Bittner: ...As well. 

Ben Yelin: Right. I mean, you have to reserve the sharpest tools in your arsenal for when you would absolutely need them. 

Dave Bittner: Yeah. 

Ben Yelin: You know, there's also this shadow of doubt within our own government that Russia has the same capabilities that we have. We don't know for sure that they could take down our critical infrastructure the way we're pretty sure we could take down their critical infrastructure. 

Dave Bittner: Yeah. 

Ben Yelin: But you don't want to F around and find out, so to speak. 

Dave Bittner: (Laughter) Well, I also wonder - because, you know, again, before this conflict, Ukraine was used as a testing ground by Russia for some of these capabilities. They famously turned off the lights in Ukraine. 

Ben Yelin: Right. 

Dave Bittner: So I wonder to what degree after those sort of demonstrative episodes were our own defenses bolstered, right? And so - and we'll probably never know. 

Ben Yelin: Right. 

Dave Bittner: But... 

Ben Yelin: Hopefully someone's taken care of it out there. 

Dave Bittner: But my sense is they are, you know? 

Ben Yelin: Right. 

Dave Bittner: Like, this has been taken seriously. These threats have been taken seriously. As you and I have spoken many times, it is one of the few things that still has bipartisan support. So in the cyber realm, things can get done, and it seems to me like they have been getting done. 

Ben Yelin: Absolutely. I mean, one thing I will say is if you had talked to us 22 years ago, around Y2K - and there are enough adults in this country who are still in a Cold War mindset - I think there would have been an anticipation that we might engage in a cyber conflict at some point with another world superpower. 

Dave Bittner: Right. 

Ben Yelin: For 20 years, we've been - not sidetracked, but justifiably focused more on counterterrorism operations, which are just very different. 

Dave Bittner: Yeah. 

Ben Yelin: You're cutting off financial resources to sponsors of terrorism. You're trying to intercept communications for the purpose of intelligence. We, until very recently, just haven't really needed to prepare for this type of conflict, so we're relatively new at it. And it's not just the cyber capabilities. It's also diplomatic capabilities. You know, so just like the people that are hired after a ransomware attack know how to deal with a hostage-taker, I think we're just building the capability within our own government to know how to anticipate the next move from our geopolitical adversaries. And I think we're still in a relatively new phase of that. So I'm sort of glad that we've had a little bit of time without there being an actual attack on either country to make sure that our capabilities are in order in case we really need them. And I think we have to anticipate that if Putin feels extremely cornered and he runs out of options and he's - you know, has a feeling of revenge or vengefulness against the West, he might decide to deploy these weapons, and we just - we have to be prepared for it. 

Dave Bittner: It's sort of fascinating that we're - in a way, we're seeing the rules of the road being developed in real time when it comes to these sort of capabilities, right? We have a real conflict in front of us, and so whatever the norms are going to be, accidentally or not, this will be - this will set certain precedents, right? 

Ben Yelin: Absolutely. I don't think we anticipated that we'd need to be engaged in this type of warfare as soon as we are, although many within our government have anticipated that Russia would become more aggressive. And there's always been the potential that we'd have to respond to Russian aggression under Article 5 of the NATO treaty, meaning that if they attack another NATO country, we have to be prepared to use every tool in our toolbox. Even though this is a relatively new capability, you know, some of the same principles that applied throughout the Cold War era still apply. We've been very careful to avoid engaging in any offensive combat in the actual war arena in Ukraine for the same reasons we've been reluctant to use our cyber capabilities, which is just fear of retaliation and not wanting to escalate. And I think that's a very reasonable fear, and people are being prudent. But just like we conducted, you know, nuclear tests in the Nevada deserts during the 1950s and '60s, I think it's prudent for us to hope for the best, but be prepared for the worst. 

Dave Bittner: Yeah. Yeah, absolutely. All right. Well, we will have a link to that story from Politico in the show notes. My story this week comes from Protocol. This is written by Kate Kaye, and it's titled "The FTC's New Enforcement Weapon Spells Death for Algorithms." The headline might be a little breathless (laughter), but the story is quite good. 

Ben Yelin: Anything that says death to algorithms will raise our eyebrows. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: I should put that as a Google alert. 

Dave Bittner: Yeah, it also just makes me - it reminds me that the folks who write the articles are rarely the ones who get to write the headlines. 

Ben Yelin: Yeah, they always hate the headlines. That's not what I meant to say. 

Dave Bittner: Right. So this is a fascinating story. So the FTC, the Federal Trade Commission, recently had a settlement - this was in early March - with WW International. That is the company formerly known as Weight Watchers. And as part of this settlement, Weight Watchers has to destroy algorithms or artificial intelligence models it built using personal information collected through its Kurbo healthy-eating app. And it collected those from children as young as 8 without parental permission. And they also fined the company $1.5 million, and they ordered it to delete the illegally harvested data. So at issue in this particular case is that this Weight Watchers company - they had an app you put on your phone. It would encourage you to eat healthy, presumably to help you lose weight, if that was something you were having trouble with. But we have a rule where anything that has to do with kids online has particular privacy rules. And that's COPPA, the - what is it? - the Children's Online Privacy Protection Act... 

Ben Yelin: Yup. 

Dave Bittner: ...Right? So the FTC found that the WW company here was sort of playing fast and loose with how they approached checking the age of people using this app. So they would allow people to put in any number - they were doing no age verification. But then worse than that, if someone went in and got access to the app by saying they were of age, later on they could change their age to be below what it should be to have access to the app, and the app did nothing. They allowed them to continue using the app. So that's sort of the foundation of what the - what got the FTC's attention, got their hackles up and had them go toward this settlement. But what I find fascinating is this idea that as part of the settlement, they are requiring that this company destroy the algorithms that they were able to create from the harvesting of this data. What do you make of this, Ben? 
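The age-gate failure described here - no verification at signup, and no re-check when a user later edits their age downward - can be made concrete with a small sketch. This is purely illustrative: the function name, the consent flag, and the check itself are hypothetical, not taken from the Kurbo app or the FTC order. COPPA's threshold is age 13.

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


def may_collect_data(birthdate: date, parental_consent: bool,
                     today: Optional[date] = None) -> bool:
    """Return True if data collection may continue for this account.

    Hypothetical sketch: this check must run both at signup AND on
    every later edit to the birthdate - skipping the re-check on
    edits is exactly the gap the FTC flagged.
    """
    today = today or date.today()
    # Compute age, subtracting one if the birthday hasn't occurred yet this year.
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age >= COPPA_AGE_THRESHOLD or parental_consent
```

In this sketch, an account that edits its birthdate to an under-13 value without a verified consent flag immediately fails the check, which would trigger whatever block-and-purge flow the app implements.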

Ben Yelin: It's the first time that we've really seen algorithmic destruction. This has been an idea that's floated out in academic circles and certainly within the FTC among a couple of its commissioners. In the past, we've taken steps short of compelling these companies to destroy their algorithms. So we've levied large financial fines. That's a pretty good disincentive. And we've forced companies to purge ill-gotten data - so data derived from collecting information on children, for example. We've seen that in high-profile examples. The Cambridge Analytica scandal. They mentioned the photo sharing application company Everible - Everal-bum. 

Dave Bittner: (Laughter) Everal-bum. Yeah. 

Ben Yelin: Everal-bum. 

Dave Bittner: Ever album. Ever album. 

Ben Yelin: Ever album. 

Dave Bittner: Ever album. 

Ben Yelin: That makes sense. 

Dave Bittner: We did it, Ben (laughter). 

Ben Yelin: I hope the editors keep that in there 'cause... 

Dave Bittner: Yes. 

Ben Yelin: ...That was a process of discovery for us. 

Dave Bittner: Our listeners get to hear our brains functioning in real time (laughter). 

Ben Yelin: For better or worse. 

Dave Bittner: There you go (laughter). 

Ben Yelin: But I do think that this really opens the door for additional cases that might be more high-profile. This is an especially egregious example because it does involve children. 

Dave Bittner: Right. 

Ben Yelin: And as you talked about, even if you were to explicitly change your age to be under the age that would allow you to use the application, they still allowed you to use it and maintain that data. So it's a pretty open-and-shut case. What happens if there is an algorithm, as we've talked about a million times, that introduces some type of racial or socioeconomic bias? Is that going to be ripe for algorithmic destruction? We certainly don't have precedent of that so far. But if somebody has standing to either sue the government or seek administrative penalties from the FTC, somebody could potentially allege that this algorithm, while, you know, it didn't do something as egregious as collect information from children as young as 8, had these effects of discriminating against people of different races. Maybe that's going to be an instance where we will reinvoke this new concept of algorithmic destruction. 

Dave Bittner: Yeah. 

Ben Yelin: So it's really, you know, putting a flag down on what the FTC's capabilities are. And it's a warning to companies that the FTC can do this if they want to. Nobody's going to stop them, which hopefully will cause companies to look more closely at their algorithms to make sure that they're not violating federal laws. 

Dave Bittner: Yeah. I have questions. How do you... 

Ben Yelin: I might have answers. 

Dave Bittner: (Laughter) Well, they're not - they're more - I think they're mostly rhetorical questions. But how does a company verify that they have destroyed an algorithm, you know? Does somebody from the FTC come and look over someone's shoulder when they hit the delete key, you know? I don't know the answer to that. I suppose there are ways to - I don't - swear under legal penalty that... 

Ben Yelin: Right - under oath... 

Dave Bittner: ...Has been... 

Ben Yelin: ...That you've destroyed the algorithm? 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: I mean that would be - that's what you do right, Ben? - I mean, Mr. Lawyer. 

Ben Yelin: Under oath - yeah. 

Dave Bittner: Yeah (laughter). 

Ben Yelin: I mean, that's probably the best we can do because... 

Dave Bittner: Right. 

Ben Yelin: ...A judge is not going to have the institutional knowledge to know whether an algorithm has been deleted. 

Dave Bittner: Yeah. 

Ben Yelin: They may be able to use their resources to double-check, but certainly they don't have the institutional capabilities to go in and actually make sure that the algorithm has been destroyed. So it might end up being kind of an honor system. The other thing I wonder about is - we've seen instances where the FTC has punished a company, and they make really superficial changes to something. So there's a company engaged in practices that are ripping off consumers, and the FTC levies a fine. Companies might make a very minimal change to comply with the FTC's requirements... 

Dave Bittner: Right. 

Ben Yelin: ...But it might not have a significant impact. 

Dave Bittner: Right. Right. And if you throw a bunch of computer code in front of somebody and say, well, this is different. This is - look, I mean, anyone who - coder... 

Ben Yelin: It's got a lot of zeros and ones in it. It's... 

Dave Bittner: Right. Anyone who knows code would see how totally different this is from the other algorithm - you know. Yeah (laughter). So the FTC is coming at this using COPPA. One of the things that this article points out is that what we're really hoping for is some sort of comprehensive federal privacy law. By the way, they use the term here, algorithmic disgorgement, which I love. 

Ben Yelin: Sounds very Orwellian - yeah. 

Dave Bittner: (Laughter) Yeah. You know, is this a Band-Aid until we get to a point where we have a federal privacy law? 

Ben Yelin: Yeah. Right now we have this patchwork of laws that would justify algorithmic disgorgement, if you will. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: So COPPA is one of them. You can certainly foresee a scenario where HIPAA would be invoked if we're talking about protected health information that's being inadvertently revealed. There are a variety of other federal laws. It would be much easier if we had a broad federal privacy law, which, of course, we don't have. And it would be hard to do something like this without statutory authorization. So theoretically, you could try and make a negligence claim, a common-law negligence claim, saying, you were negligent in handling my PII, for example, and that could be grounds for you to have to destroy your algorithm. That's going to be a much more difficult case to make. 

Dave Bittner: I see. 

Ben Yelin: You have a cause of action under COPPA as it relates to children's online privacy. 

Dave Bittner: Yeah. 

Ben Yelin: And you do in a number of - a limited number of other circumstances. But it's not going to be a - something that's done broadly until we have a federal privacy law that's geared towards what type of information is allowed to be collected under all circumstances. So it doesn't just relate to online privacy for children. It doesn't just relate to health records. But it relates to all information that might be private. There's a uniform federal standard, and the FTC could apply it. But as we know, it's been really hard for Congress to get its act together to pass a federal data privacy law. And so I don't see any reason why that's going to happen in the immediate future, which is why these other laws, the COPPAs of the world, might be a temporary workaround to justify this disgorgement in the meantime. 

Dave Bittner: Yeah. All right. Well, that article is from Protocol - again, written by Kate Kaye. We will have a link to that in the show notes. We would love to hear from you. If you have something you'd like for us to discuss, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: All right, Ben. Well, this week we're mixing things up a little bit, and you are taking the interview duties. You had the pleasure of speaking with Liz Wharton recently. 

Ben Yelin: I sure did. I spoke with Liz Wharton from Scythe on the evolving privacy regulatory environment, and I hope you enjoy the interview as much as I did. 

Dave Bittner: All right. Here's Ben's conversation with Liz Wharton. 

Ben Yelin: So I guess to start, at a really high level, can you tell us what legal framework there is that encourages or compels companies to protect data? What are the federal statutes that govern that? I know that's a very broad question. 

Liz Wharton: I was going to say, take your pick because it highlights one of the ongoing issues that companies face when deciding kind of who's the boss, so to speak, because you have - on the banking side, you have the GLBA. So you have the... 

Ben Yelin: Gramm-Leach-Bliley, yep. I always got that acronym wrong. 

Liz Wharton: Right (laughter). And you have - and you kind of have some upstarts in addition to the federal frameworks in that you have DHS and CISA that are coming in and jumping into the regulatory scheme. But they're hampered. They really don't have authority outside of either the federal government or critical infrastructure. So on the other side of that - and kind of teeing up where I hope we get to talk - is the Federal Trade Commission, the FTC, which, at least over the last several years, particularly with the Equifax breach, actually has some enforcement ability and authority. And then we can also, like I said, talk about or, you know, go into some of the other agencies that have tried to jump in. But the FTC, starting with Equifax and particularly with the Wyndham Hotels case, has really started that enforcement of - hey, you should be doing a better job of protecting your stuff, your data. 

Ben Yelin: What do you think are the limits on FTC's authority? In other words, if we did nothing except empower FTC to have the - basically, the enforcement authority it has now, would that be sufficient? Or do you think we need additional federal law or regulation 'cause there are gaps? 

Liz Wharton: That's one of the challenges of - are you chasing after - with creating additional federal laws, are you chasing after a specific problem? Or are you looking at the frameworks that you already have? My preference and my leanings tend to be towards - let's look at what we already have in place rather than trying to layer something on top of it and then creating this kind of circus where it's like, well, who am I answering to? Am I trying to figure out this? Am I trying to figure out that? And it gets to part of the notice, you know - did you know that you were breaching or that you had these obligations? 

Ben Yelin: Can you explain for the uninitiated what Gramm-Leach-Bliley does and why it applies to cybersecurity and protecting personal information? 

Liz Wharton: So from a high level, what the goals were - it's part of the know your customer, you know, getting into knowing who you're working with, protecting that PII - personally identifiable information. And a lot of people don't realize how deep that can get, that it's a combination of information that banks, financial institutions, accountants, for example - the company I work for, we have to look at, all right, what are we - are we collecting payments? Are we processing things? And if we are, then we probably fall under some of the protections and requirements. And where it picks up - it came into being in, what, 1999? So now it's been 20-plus years. I have to stop to do the math. 

Ben Yelin: Yeah, 22 maybe? Yeah. 

Liz Wharton: Yeah. I was going to say, stop - I don't like to always do the math sometimes. But when you look at all the updates - and back in October, there were some extensions on, really, that continuous - and not letting companies fall back on the - well, we did an audit. We did an assessment. We have these protections so that the data is encrypted, that we're not storing it in different places, that we are looking at who's accessing. And the FTC said, great, OK, let's expand. And they updated the rules and said, all right, it's not just that snapshot. What else are you doing? Are you continually - like, you can't just look at it, like I said, one time and say, yep, we checked the box. Good to go. Nothing else we need to do. It's like, no, you need to continually look at who's accessing that data, making sure that you are using multifactor authentication and, really, falling in line with some of the other stuff we saw with the White House's cybersecurity executive order back in May of last year as well. 

Ben Yelin: So this applies because of the jurisdiction of GLBA. This applies largely to financial institutions. When we're talking about other institutions, you know, you have HIPAA - that applies to health care. You have, as you've talked about, kind of a patchwork of laws depending on the domain. I guess what I'm getting at is there are a lot of companies in the private sector who don't fall under that authority but still might have PII. I mean, they just might have access to personally identifiable information one way or another. So how do we address those gaps? 

Liz Wharton: Well, and that's where you see - one of the questions being, does CISA need more bite to their bark? Do we need to expand that? Or do you look at the FTC and say, well, you have this duty to protect the consumers? And are there aspects - and you saw that - that's where - with the Equifax, they said, hey - also, with - one of my personal favorites is the Wyndham Hotels, where, again, you've been breached multiple times. 

Ben Yelin: Right. 

Liz Wharton: You have failed (laughter). 

Ben Yelin: This keeps happening. Yeah. 

Liz Wharton: Yes. At some point, you are very aware you are not doing something right. And so now we're going to, you know, put you in a financial timeout. Like, we're going to make this hurt a little bit and hopefully use that as a wake-up call. So, again, companies should be aware of - hey, if we've seen this happen multiple times or if we are ignoring something that we know could impact our customers, you know, our - those whose data or those whose systems, those who we should be protecting, then maybe we need to take a hard look. Maybe we need to reorganize our priorities and what we were putting off for a while - let's reassess that. Let's go back and look at it. 

Ben Yelin: So I think that makes a lot of sense when it applies to big institutions. You know, Wyndham, for example, certainly has the financial resources... 

Liz Wharton: (Laughter). 

Ben Yelin: ...That they can figure out how to do proper compliance. I guess to play devil's advocate, how would you apply that to smaller organizations that just might not have the institutional know-how to protect against the next, you know, supply chain attack vector? 

Liz Wharton: Right. You had these tiny companies that were being asked to step up and fight the fight against nation-state-sponsored actors. 

Ben Yelin: Right. Exactly. 

Liz Wharton: And so... 

(LAUGHTER) 

Ben Yelin: Talk about a David versus Goliath story. 

Liz Wharton: Right? And the good news is that at least from a federal level - and I'd love to see some of the states get in on this as well - they're realizing, wow, they don't have the resources. So rather than just yelling at them, going, you should be eating your vegetables, they're making them more appealing. Like all the cookbooks where it's - here's how to hide the vegetables in your cauliflower crust pizza. 

Ben Yelin: Exactly. Deep-fried broccoli with delicious chocolate or something, yeah. 

Liz Wharton: Right? And so you're seeing tool kits. And CISA's developed these and put them out there for small- and medium-sized companies, and a lot of the cybersecurity or information security community is also releasing some of their basic tools. Of course, that also gets into the open source, which - hmm, no problems with open-source software and patching and keeping up with that, right? 

Ben Yelin: Oh, of course not. Yeah. 

Liz Wharton: No, no. 

Ben Yelin: No controversies have arisen from that. Right. 

Liz Wharton: No, not at all. But putting those resources out and making them available so that, you know, you have weapons to defend yourself with. 

Ben Yelin: So I guess that means we have kind of both a carrots-and-sticks approach, if you will. Thinking from the perspective of a stakeholder, whether it's a small company, a large company, even like a municipality, I'm sure they love the carrots and hate the sticks. So I guess my question is, why do you think the sticks are so important? And do you think FTC enforcement authority being able to actually levy fines in the long run is beneficial for our overall national cybersecurity posture? 

Liz Wharton: Unfortunately, it's one of those things that when you have companies doing that risk assessment, doing that decision-making, you know, going through their matrices to say, OK, what gets priority - and particularly, having formerly been at the municipal level, you're balancing these priorities, you're having those conversations. The stick becomes important because it helps bump some of the things up to the top, or it helps shine a light on, hey, we should be doing these continuous assessments. We shouldn't just be checking the box. You look at, perhaps, a municipality that I live near now - they go through, and it's like, well, if they don't get the cyber policies, if they don't take these steps, then they are going to get hit by the ransomware. And it's not a matter of if, but when and how bad. And forcing them to look and say, OK, have we patched for this? Forcing them or encouraging them to take a look at their systems and say, oh - are we impacted by Log4j? Spoiler alert - probably. And... 

Ben Yelin: (Laughter) Yeah. Who isn't, right? 

Liz Wharton: Right? And then having that - so knowing that there's that stick out there that we strongly - or as the FTC - we strenuously encourage you, remember what we did to Equifax, so please do this. Then also putting in there - and here are links to the resources. That is a great approach. 
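The "are we impacted by Log4j?" question usually starts with inventorying log4j-core versions. As a rough, hypothetical triage sketch - version comparison only; a real assessment also has to find shaded and bundled copies of the library - Log4Shell (CVE-2021-44228) and its follow-on CVEs in the 2.x line were addressed as of release 2.17.1:

```python
LOG4J_FIXED = (2, 17, 1)  # 2.x releases before this carry Log4Shell-era CVEs


def is_vulnerable_log4j(version: str) -> bool:
    """Rough triage: True if a log4j-core 2.x version predates 2.17.1.

    Illustrative only - it ignores the 1.x line's separate issues and
    vendor backports, and treats pre-release tags ("2.0-beta9") as
    their base version.
    """
    core = version.split("-")[0]                   # drop tags like "-beta9"
    parts = tuple(int(p) for p in core.split("."))
    parts += (0,) * (3 - len(parts))               # pad "2.14" -> (2, 14, 0)
    return parts[0] == 2 and parts < LOG4J_FIXED
```

Running a check like this across a dependency inventory tends to confirm the "spoiler alert - probably" answer; the real work is the continuous re-checking that comes after.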

Ben Yelin: So looking into the future, we had this new rule that was finalized just a couple of months ago that you referenced from the FTC. What do you see as kind of the next frontier of regulatory action? Like, in the next three to five years, what do you think some of the next steps are to compel companies to take cybersecurity more seriously? 

Liz Wharton: Well - and it depends on which battlefront you want to look at. I mean, you're going to have - CISA's making moves. The NSA is also looking at it - you know, different government agencies are all kind of saying, hey, maybe this should be my responsibility. We need one ring to rule them all. That battle's going to play out. It's been playing out for many years. I mean, when you look at aviation cybersecurity, the FAA is saying, hey - you know, with Boeing and some of the other software vulnerabilities - like, our job is to protect the national airspace. But should it be cyber? You see some of it coming into play from the executive order. So I don't know who's going to find the precious, but I do see that being one of the battles, as well as moving beyond. How do we take that next step from, OK, you checked the box? How do we further encourage companies, nicely or strenuously, to participate or enact the continuous and the automated, so that you're not relying on the one pen test, the one vulnerability assessment - we checked the box this time? Reminding them that it's ongoing - it's not even a marathon. It's just a never-ending hamster wheel of protecting and encrypting, and what steps are you taking? Are you updating? And again, it's that hamster wheel of continuous, continuous, continuous. You're never done. 

Dave Bittner: All right, interesting stuff. Always interesting to get Liz's perspective on things - really insightful and great to have her back. We really appreciate her taking the time for us. 

Ben Yelin: Yes. Liz Wharton is one of my favorite guests. I mean, she's both extremely smart and also has a way of explaining things that I think is very digestible for our listeners. 

Dave Bittner: Yeah, absolutely. You know, I think - a couple things that struck me in listening to the conversation - this whole notion of who has the ultimate authority when it comes to these regulations, you know? It sort of - it points to what we were talking about in one of our earlier stories about, you know, lacking a - an overall federal privacy legislation. You know, we're - we have all these little pieces that we're putting together, and it allows for some finger-pointing. 

Ben Yelin: It does. And it kind of creates a vacuum that's swept up by this patchwork of agencies, patchwork of relevant statutes. It's not the most efficient way to resolve these disputes. 

Dave Bittner: Yeah. It also struck me - you know, I think Liz was right on the money here when she sort of highlighted the complexity of all of this. And I love (laughter) her reference to one ring to rule them all. You know, it's a good callback. But I think we have this desire for that, and we're just not there. 

Ben Yelin: Right. Right. I completely agree. 

Dave Bittner: Yeah. All right. Well, our thanks to Liz Wharton from Scythe for joining us. As we said, always a pleasure to have her on the show. She is @LawyerLiz on Twitter, so do check that out. She is a good follow for sure. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.