Caveat 6.2.22
Ep 128 | 6.2.22

Cybersecurity and data protection at the forefront of legal disputes.


Andrea D’Ambra: It's not if you're going to get attacked, it's when you're going to get attacked. Because there are so many different vectors they can hit you at, right? You've got to protect 360. Plus, you have employees who might do something crazy.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a new Justice Department policy for charging cases under the Computer Fraud and Abuse Act. I've got new guidance from the FTC on student privacy issues. And later in the show, my conversation with Andrea D'Ambra. She is from the Norton Rose Fulbright law firm. We're discussing their research on litigation and the privacy landscape. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump right into our stories this week. Why don't you kick things off for us here? 

Ben Yelin: Mine comes from the Lawfare blog, written by Alvaro Maranon, and it is entitled "Department of Justice Revises Policy for Charging Cases under the Computer Fraud and Abuse Act." It references a new policy manual for the CFAA. If you'll recall, there was a major Supreme Court case in 2021 dealing with the Computer Fraud and Abuse Act, the Van Buren case, and it held that crimes can only be charged under the Computer Fraud and Abuse Act if somebody is in a particular area, either a physical area or a network, where they have no authorization to be at all. Exceeding authorized access is defined narrowly, meaning they are in a section of a computer or a section of a network where they have no authorization to be, not that they're using it for a purpose that was not intended. And so the Justice Department needs to adapt its policies to comply with the Van Buren holding. And also, I think they're expressing some sort of policy preference here to limit prosecutions under the Computer Fraud and Abuse Act. 

Dave Bittner: OK. 

Ben Yelin: So the most important provision of this new memo is to limit the prosecution of cases under the CFAA to cases where there isn't a good-faith security research effort taking place. In other words, there's sort of a carve-out to the Computer Fraud and Abuse Act for instances where somebody is accessing a computer - and I'm quoting from the memo here - solely for purposes of good-faith testing, investigation and/or correction of a security flaw or vulnerability, where such activity is carried out in a manner designed to avoid any harm to individuals or the public, and where the information derived from the activity is used primarily to promote the security or safety of the class of devices. In other words, simply because somebody is on a network or in an area of a network or a computer where they're not supposed to be, the Justice Department, as a policy, will decline to prosecute if they are convinced that that individual is in that area of the network, computer - or the computer for good-faith reasons to promote security. It is security research. 

Dave Bittner: Is good faith a legal term of art? Is that defined itself? 

Ben Yelin: It is largely defined here in the memo. I mean, it's certainly the - I think what you're getting at is there's kind of a slippery slope here where... 

Dave Bittner: Well, yeah, I guess there's - yes, go on. 

Ben Yelin: Every single person will say, oh, I was just doing security research. 

Dave Bittner: Right. 

Ben Yelin: I think the way they're defining it is it has to be research that wouldn't confer any tangible or monetary benefit on the person doing the research and wouldn't cause any bodily harm or economic harm to any other entity. And I think that's relatively easy to measure. I mean, sometimes there's going to be close calls, but I think in most instances, you can tell when somebody is doing good-faith research because they immediately notify companies or organizations of their own vulnerabilities. They don't steal data. They don't sell data. And then you have instances where that's clearly not the case. 

Dave Bittner: Right. 

Ben Yelin: So I think this is actually a line where you can properly delineate between good-faith research and research that someone might be claiming to be good faith, but is done for personal enrichment in some way. 

Dave Bittner: Yeah, this is interesting. You know, I was thinking of my co-host over on the "Hacking Humans" podcast, Joe Carrigan. You know, he's over at Johns Hopkins. And one of his responsibilities there is disclosures to organizations when the researchers at Hopkins find a vulnerability in their systems. Joe sort of leads the charge for reaching out and contacting the organization, you know, where they found the vulnerability. And he shared with me that sometimes they'll get a nasty gram from legal, you know, from the legal department of that company saying, you know, what are you doing in here? This is a violation of the Computer Fraud and Abuse Act and, you know, all sorts of things, cease and desist. Does this kind of try to put an end to that, where - I mean, obviously, you know, the researchers at Hopkins, for example, they're not up to bad things with their research. 

Ben Yelin: So they say. Yeah, I don't know about that Joe Carrigan. 

Dave Bittner: (Laughter) That's right. 

Ben Yelin: Yeah, it does. At least it puts an end to, for the time being, prosecution of those types of cases. 

Dave Bittner: OK. 

Ben Yelin: So I think it's a little murky whether it's actually still legal - and I'm putting scare quotes around the word legal... 

Dave Bittner: Yeah. 

Ben Yelin: ...To access networks or systems where you are completely unauthorized to access them for the purposes of doing research. What this memo says is just that it is current Justice Department policy to avoid prosecution on these issues. And that is the discretion of - that is at the discretion of the Department of Justice. That principle comes from the fact that there are a finite number of prosecutors in this country, and there are an infinite number of potential crimes that are being committed at any moment. 

Ben Yelin: So one thing that any presidential administration has to do is to set priorities. And so they have these types of memos where they describe what type of activity they will charge, they will prosecute, they will try to obtain convictions, and what type of activity is simply not worth their time or their resources. And that's what they're doing here. What that means is if a new administration comes in with different priorities, they could immediately rescind this Justice Department guidance. Or if there is a change in Justice Department leadership, maybe the landscape change - changes, and you have people who seemingly were accessing networks for good-faith research reasons but ended up stealing data or profiting monetarily... 

Dave Bittner: Right. 

Ben Yelin: ...Then certainly this document could be rescinded. But this is just a statement of the Department of Justice of their own values and their own intentions as to what they're going to prosecute. 

Ben Yelin: I'll also say they limited it even more than the good-faith exception. Elsewhere in this memo, they kind of outline instances where it would actually be worth the time and effort and resources of the Department of Justice to prosecute a case under the Computer Fraud and Abuse Act. And for that, they consider things like the sensitivity of the affected computer system, the information stored on that system, whether the damage to the system or the information transmitted raises concerns related to national security, foreign terrorism, etc., whether the activity is in furtherance of a larger criminal enterprise or part of a conspiracy, the impact of whatever the crime is on the victims. 

Ben Yelin: What this language says to me is they just do not have the resources, wherewithal or policy desire to use the Computer Fraud and Abuse Act as a giant hammer to prosecute literally every case where somebody stumbles into a network where they're not authorized to be. Based on the Van Buren decision and based on the Justice Department's own values, they're deciding that they're going to limit the number of cases they're going to prosecute by having this good-faith exception and by, even beyond that, limiting cases where you have victims that have suffered monetarily or have had bodily injuries or instances where national security is implicated. So it's just going to be a much narrower set of cases for the Computer Fraud and Abuse Act. 

Ben Yelin: To me, this means we're never going to see a case like the Van Buren case again. And that makes sense, obviously, since Van Buren won at the Supreme Court. But the type of case where some law enforcement officer has access to the network for law enforcement purposes and they use it to get personal information on an ex-girlfriend or something, I think based on this guidance and these documents, it's far less likely that cases like that will be prosecuted. 

Dave Bittner: To what degree do you think this is a result of a lot of the blowback from things like the Aaron Swartz case? Just real quick, can you describe for our listeners who may not be familiar what that was about? 

Ben Yelin: Sure. So Aaron Swartz was a computer programmer and a researcher. And as part of his work - I believe it was at MIT. 

Dave Bittner: Yeah. 

Ben Yelin: He was a hacktivist. And he was arrested by the MIT police on breaking and entering charges after he went into a closet to download academic journal articles from JSTOR, which is a database used by a lot of lawyers and other professionals to do research. 

Dave Bittner: So there was an unlocked closet that had, I guess, a network access port in it. And he found this closet, and he plugged his laptop into the closet to gain access to MIT's network to download all of this information. 

Ben Yelin: Right. So he claimed that he was doing this as a hacktivist for internet research purposes to expose this vulnerability. But the federal government charged him under the Computer Fraud and Abuse Act. They were seeking a pretty large penalty - $1 million in fines... 

Dave Bittner: Yeah - felony charges. 

Ben Yelin: ...Felony charges, a significant prison sentence. And he couldn't agree to a plea bargain. I mean, they were really bringing the full hammer, the full force of their prosecutorial authority under the Computer Fraud and Abuse Act on this individual, Aaron Swartz, who, at least by all appearances, was engaging in this activity for good-faith reasons. And Aaron Swartz ended up taking his own life. It was incredibly tragic. I think the entire field and the entire industry hasn't been the same since this happened. I think it has caused prosecutors and policymakers to rethink what the purpose is of the Computer Fraud and Abuse Act. 

Dave Bittner: Yeah. 

Ben Yelin: This is where I think we might bring in a little bit of originalism. What did the lawmakers who drafted the Computer Fraud and Abuse Act in the 1980s intend for it to do? And it was to prevent people from trespassing on somebody else's computer and networks to steal information, to enrich themselves or to cause bodily harm. It's pretty clear in the debate on that legislation that that was the purpose. And to extend that to apply to efforts where, seemingly, there has been good-faith research, I think this document kind of explicitly and implicitly acknowledges that that is just a misuse of the authority of the Justice Department. So I don't think you can talk about this document without understanding the context of the Aaron Swartz case. It's been - what? - almost 10 years now? 

Dave Bittner: Yeah, 2013 was when he passed. So yeah, almost a decade. And since then, there's been a couple of attempts at legislation that have been referred to as Aaron's Law. And they've - you know, they haven't made it through. They've stalled in committee. They've, you know - typical, I suppose these days, story of things trying to make their way through Congress. But there have been efforts by some legislators - bipartisan efforts, I'll add - to kind of address this. And so I wonder if this is, in part, a way to respond to those desires, those pushes? 

Ben Yelin: I think it is. I mean, we were just talking before we got on. This policy document should almost be titled after Aaron Swartz, I think, because I think it reflects the Justice Department's recognition that using the Computer Fraud and Abuse Act to seek large penalties on good-faith researchers is - you know, I think you could argue whether it's morally wrong or not. I tend to think it is... 

Dave Bittner: Yeah. 

Ben Yelin: ...But also just a misuse of Justice Department and federal government resources. And it's not an area that the Justice Department wants to exercise authority. So I think it certainly informs what they're trying to do with this policy document. 

Dave Bittner: It's interesting, too, as I think about it - and I'm thinking off the top of my head here - but if you think about everything that's happened in the past decade in terms of the ascension of ransomware, you know, like, federal prosecutors are a lot busier with other things with much greater consequences than they were a decade ago. 

Ben Yelin: Yes. 

Dave Bittner: I think, right (laughter)? 

Ben Yelin: I figure if you are a Justice Department attorney... 

Dave Bittner: Yeah. 

Ben Yelin: ...Who is well-versed in computer crimes... 

Dave Bittner: Yeah. 

Ben Yelin: ...Computer Fraud and Abuse Act cases like this are going to be the least of your concerns. There are just better ways to use the expertise of those attorneys, like the ways you say. I mean, we have ransomware cases that are certainly under federal jurisdiction. And there's a really important policy rationale behind prosecuting those cases, whether they are rogue actors or representatives of nation-states. 

Dave Bittner: Yeah. 

Ben Yelin: And it's not just ransomware. I mean, denial-of-service attacks, what we saw in the 2016 election with interference... 

Dave Bittner: Right, good old-fashioned espionage. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: So those types of things, I think, the Justice Department would say that's more worth their time and resources than trying to use the Computer Fraud and Abuse Act to crack down on good-faith research. 

Dave Bittner: Yeah. All right. Well, it's an interesting story for sure. We'll have a link to that in the show notes. 

Dave Bittner: My story this week comes from the Inside Privacy website. It's a story written by Jenna Zhang and Lindsey Tonsager, and it's titled "FTC Unanimously Adopts Policy Statement on Education Technology and COPPA." So earlier in May, the FTC, the Federal Trade Commission, unanimously adopted a policy statement for educational technology vendors - edtech vendors. And this is reminding them of their duty to comply with the Children's Online Privacy Protection Act - COPPA. And specifically, they're saying that they have a couple of elements here they need to focus on. They include the prohibition against mandatory collection of data. They have some data use prohibitions, some data retention prohibitions and some data security requirements. What do you make of this, Ben? 

Ben Yelin: So first of all, it's encouraging to see this was adopted unanimously. There's a lot of contentious issues that come in front of the Federal Trade Commission. 

Dave Bittner: Yeah. 

Ben Yelin: And it seems like there is widespread recognition that we need to make the most of the tools that we have under the Children's Online Privacy Protection Act, or COPPA. This law was enacted in 2020 - or, I'm sorry, in 2000. So it's now going on 22 years. It was last revised in 2013, and it was designed to prevent the exploitation of children not just from edtech vendors, but from, really, all internet-related entities. 

Dave Bittner: Yeah. 

Ben Yelin: So private companies, organizations, etc. I think there are a couple of reasons why edtech has become a greater concern over the past several years. One is that we had this period of extended online learning, where education technology was more closely ingrained in the everyday lives of students across the country... 

Dave Bittner: Right. 

Ben Yelin: ...And had more access to potentially personal information because students, even to go to school, had to have login info, things that could be exploited by bad actors wishing to sell that type of data. So that's one reason. And I just think these companies have become more ubiquitous, even beyond the period of online learning during COVID, as they've gotten into public schools, private schools, provide tools to students and their teachers. 

Dave Bittner: Right. 

Ben Yelin: So I think that is the motivation behind these regulations. So what do these regulations do? You talked about the prohibition against mandatory collection of data. These companies cannot force students to disclose more personal information than necessary as a condition of student participation in an educational activity - so collect only the bare minimum that you would need from a student to allow them to use the educational tool. 

Dave Bittner: Right. 

Ben Yelin: Data use prohibitions - if the vendors are relying on the school itself for the collection of personal information, they can only use that information to provide an educational service. They can't sell it. 

Dave Bittner: Right. 

Ben Yelin: They can't use it for any commercial purpose - not for marketing, not for advertising. They can't - even if the data is anonymized, the education tech company would be prevented from saying, we have a profile of a 13-year-old student with X demographic characteristics, and we've seen that they display this type of behavior; this is how you should advertise to them. 

Dave Bittner: So it's off-limits. 

Ben Yelin: It is off - it is fully off-limits under these regulations. 

Dave Bittner: OK. 

Ben Yelin: And this means that they can't use any children's personal information as part of any algorithm or commercial database, which kind of cuts against the bread and butter of most of these tech companies who make their money by collecting information like this. 

Dave Bittner: Yeah. And, I mean, is it fair to say this is the reason why so many platforms like Facebook, for example, you know, say you must be over 13 to join our platform to... 

Ben Yelin: Hundred percent. 

Dave Bittner: ...Get them - even though they don't really enforce that in any way, shape or form (laughter). 

Ben Yelin: Yeah, they do not want to get in trouble with COPPA. 

Dave Bittner: Right. 

Ben Yelin: It is sort of funny that we have pretty robust data privacy legislation, but it only applies to people who aren't of voting age. 

Dave Bittner: Funny how that works (laughter). 

Ben Yelin: Yeah. I mean, it shows that we could have a regime of data privacy protections. Certainly, there are downsides to it that we've talked about. But it's an active choice that we don't. It's not like COPPA has been deemed unconstitutional. We see that it really can be enforced, and people can have more confidence in, at least as it relates to their children, their data not being used for inappropriate purposes. 

Dave Bittner: Yeah. 

Ben Yelin: And then on retention policies, they can't retain information longer than necessary to fulfill the purpose of the data collected in the first place. And they have to maintain the confidentiality, security and integrity of children's personal information. Some of this is just in response to what Congress has demanded the FTC do under COPPA. But I think this extends even beyond what's listed in the 2013 changes to the COPPA statute. I think this policy goes further. And I think it's encouraging to see that, unanimously, the FTC recognizes the need to protect this information. 

Dave Bittner: When the FTC does something like this and they put out a statement like this, is there an unspoken element that they're also going to be focusing on enforcing this? 

Ben Yelin: Yes. I mean, I think this is - any time they put out a statement, it's showing that this is some sort of priority. 

Dave Bittner: Right. 

Ben Yelin: And so I don't think the FTC would be afraid to - and they certainly have the authority to do so - to institute civil fines on companies for violating these policies. That is well within their purview as an administrative agency. And by putting out this statement, now companies are on notice that these are the regulations. And you do not want to get in trouble with the FTC for a number of reasons. 

Dave Bittner: (Laughter). 

Ben Yelin: But the fines can be significant. 

Dave Bittner: Yeah. 

Ben Yelin: They can be rather hefty. And some of these edtech companies aren't the Metas and the Twitters of the world. I mean, they're the type of companies that might go bankrupt if they get more than a slap on the wrist from the FTC. 

Dave Bittner: Right. 

Ben Yelin: So yeah, I think this is a statement of their priorities and an exercise of their authority under COPPA. 

Dave Bittner: All right. Well, we will have a link to that story as well in our show notes. We would love to hear from you. If you have something you'd like us to consider for discussion on the show, you can send it to us. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Andrea D'Ambra. She is from the Norton Rose Fulbright law firm. And we're discussing some research that they all recently did on litigation and its effect on the privacy landscape. Here's my conversation with Andrea D'Ambra. 

Andrea D’Ambra: So this report's been going on for 17 years now. It's really the longest-running survey of its kind. And really, we started it as a way to sort of understand what our clients were seeing from their perspective and what trends are sort of emerging in the marketplace. 

Dave Bittner: Well, let's dig into some of the highlights here. I mean, what are some of the things that caught your eye? 

Andrea D’Ambra: I think the most interesting thing about this is that it completely tracks what we're seeing on our side of things. It was very validating. One of the big trends is that cybersecurity and data protection have become, really, very key legal disputes of most concern. So back in 2019, about 8% of our respondents were saying that it was a legal dispute of most concern. That has tripled in 2021 to 24%. And in addition, when people are looking at business exposure and, you know, whether they feel more exposed on cybersecurity and data protection or less, back in 2020, the more-exposed percentage was 44%. In 2021, that went to 66%. And that all came out of the sort of, you know, the people who felt like it was the same or the people that felt like it was less exposed. People who thought it was less exposed, I think, are probably delusional, but... 

Dave Bittner: (Laughter). 

Andrea D’Ambra: ...They went from 6% to 4%. And the people who felt like, you know, they had the same exposure went from 50% to 30%. So really, I think - and I think there's a lot that goes into that. One, it's absolutely true that cyberattacks are on the rise, right? I would say in our practice in 2020, our cyberattacks tripled, both in complexity and in numbers. And then in 2021, they tripled again. I think that that is being played out in the marketplace as well. But the other thing is that, you know, they're getting a lot of attention, right? The Colonial Pipeline hack and other hacks like that have really sort of brought cybercrime and cyber breach to the forefront. 

Dave Bittner: So in recognition of this, what are the folks that you work with - you know, what sort of actions are they taking to protect themselves against this reality? 

Andrea D’Ambra: A number of different things. I will say one of the things that I see almost every one of our clients that has a major data breach do immediately after the data breach is look at their information governance structure and start looking at tranches of data that they can get rid of because you can't steal what you don't have, right? So, you know, they're really trying to clean that up. For years before this, people sort of, in my opinion, paid lip service to information governance, but didn't always follow through on the, you know, actual destruction of data. And there were a couple of reasons for that. One is that, quite frankly, you know, they have a day job, and it's not really exciting for them. But also there used to be - and there is to some extent still - a litigation risk that if you, you know, accidentally destroy data that is relevant to some litigation, you could get sanctioned. That risk has decreased a fair amount. In 2015, they passed rules that essentially made it more difficult to sanction somebody if they had a data loss. And this cyber-risk of, you know, having people steal data has gotten much bigger. 

Andrea D’Ambra: So, you know, I had one client who had data going back to 1998, so they were having to notify former employees back to 1998. And of course, those employees, A - they may not love the company as much as they did when they were working for them, but also, you know, a lot of them were like, well, why do you still have my data? That's so old? 

Dave Bittner: (Laughter). 

Andrea D’Ambra: And it also causes some sort of uncomfortable conversations with the regulators - right? - who are also really sort of getting on the data minimization bandwagon, not only in the EU, where it's been in the news a lot, but also the FTC and SEC have started looking at, you know, how much people are over-retaining data. 

Dave Bittner: Yeah, that's fascinating. I mean, do you suppose that we're really heading into a time here where people are really focusing on keeping their houses in order when it comes to data governance? 

Andrea D’Ambra: I do. I've seen in the last two years much more emphasis on that, not just from the people who have unfortunately suffered some sort of attack, but also from other companies that are really starting to realize this risk. And they're trying to act quickly to, you know, mitigate it as much as they can. Now, there's some data that you have to keep no matter what, but over-retention is something that could be dealt with, and a lot of companies are investing in it. 

Dave Bittner: Yeah, it's - I mean, it's really interesting to track how this has changed. You know, I almost think that we sort of - we went through this period of time where it was so inexpensive - you know, practically free - to store everything, so why not keep it all forever? And then this - you could see it shift as, you know, data became less valuable. And I heard some folks describe it as almost being radioactive. You know, you have too much of it in one place - bad things happen. 

Andrea D’Ambra: Yeah. No, absolutely. I can remember back in - I guess it was 2017 - I did a presentation on IG and the importance of a good IG program. And afterwards, one of the in-house lawyers came up to me, and he said, well, why should I pay you $200,000 to come in and, you know, fix my IG program - not that it would cost that much, but nevertheless... 

Dave Bittner: (Laughter). 

Andrea D’Ambra: But, you know, when, you know, storage is so cheap and, you know, it's not a big deal and, you know, at that time, we were really focused on the litigation risk - right? - that, you know, if you had data around, you might have to look at it in discovery, and the discovery costs are bigger. That's one risk, right? But then there's this other cyber-risk, which is, for most companies, a much more concerning risk. 

Dave Bittner: Can you give us some insights as to what the reality is when it comes to verifying that something has actually been deleted? You know, it's a conversation that I've heard many times where, you know, there's a difference between - a paper document, you know, can be shredded, can be burned. You could verify that it's gone. But sometimes flipping a switch in a database, indicating that something is gone, doesn't necessarily mean that it is or that it can't be gotten back. How much does that complicate things, if at all? 

Andrea D’Ambra: So one of the challenges of working with electronic data is it is both ephemeral and it lasts forever because a lot of times there are so many copies out there. I like to call it the D'Ambra Rule, that if you really need a piece of data, you can't find it; you can't get it. 

Dave Bittner: (Laughter). 

Andrea D’Ambra: And if you don't want a piece of data to be around, it's, like, in 50 different places. 

Dave Bittner: Right. 

Andrea D’Ambra: Because that's always how it seems to be in my practice. But yeah, from a cyberattack perspective, we don't generally see cyberthreat actors going in and restoring deleted data. They're really going in for a quick hit, right? They want to get some data that they can hold hostage and then threaten to publish on the internet and embarrass the company and cause them all sorts of problems so that they can get a ransom paid. You know, they're much more focused, and they want to do as little work as necessary. 

Dave Bittner: You know, faced with these realities and, you know, some of the enhanced regulations that are coming online and so forth, I mean, is there a real-world cost to this, or are organizations finding that tracking this is becoming a cost burden on them? 

Andrea D’Ambra: Oh, absolutely. Cyber insurance, which, you know, many organizations have, has gone up significantly in the last few years. You know, investing in IT infrastructure and good information governance is also a cost. And even sort of the monitoring and putting in place sort of best practices for cybersecurity has a real-world cost. So definitely, there's a lot of cost involved here. I don't think there's a lot of ways to avoid it, though, in our current environment. 

Dave Bittner: What is your advice for that business owner? I'm thinking particularly of small and medium-sized businesses who are trying to strike that balance, you know, between the risk of litigation and spending a reasonable amount on protecting themselves against this. Any tips on how to go about dialing that in? 

Andrea D’Ambra: Well, it's hard - right? - because if you're a small or medium-sized business, you may not have, you know, the ability to put into place all these different security measures. We see a lot of those folks go to what's called a managed security service provider, and they sort of manage the IT security on their end and then interface with the client. We've had problems with them too, though. I mean, these hackers are very smart, right? So, you know, if they can hack into one of these MSSPs, you know, they don't have to do as much work because they can get into lots of different systems. And so there are some challenges there. 

Andrea D’Ambra: But, you know, I think it's just a matter of sort of assessing, you know, what data you have and how it could be leveraged, and, you know, what can you get rid of and then investing in good cyber insurance and understanding sort of what your policy will give you because that can be, you know, a huge help and ensuring that, you know, you've gotten relationships with the people you would have to work with on a cyber incident before you have the cyber incident. The last thing in the world you want to do is meet your forensic investigator and your lawyer when your hair's on fire and, you know, data's running out the door. But, you know, unfortunately, sometimes there's nothing we can do about that. 

Andrea D’Ambra: But we really encourage our clients and everybody else - potential clients - to get engaged before that happens. You know, we have tabletop exercises where we can sort of run through an event and, you know, talk about who's going to be doing what and what's going to be happening, which can be very helpful, and sort of identify some weak points in your whole cybersecurity incident response plan. We also, you know, in some cases, particularly with our larger clients, actually have relationships with them where we get a feed from their security operations center when alerts of a certain level come in - right? - so that they have counsel engaged immediately when a major event comes along. 

Dave Bittner: Where do you suppose we're headed with this? I mean, as you say, you all have been tracking this for nearly two decades now. Do you - is there a direction you see things going? 

Andrea D’Ambra: I think eventually - this is just one woman's opinion - governments are going to end up sort of stepping in and leveraging more resources to prevent cyberattacks and also to try and track down the people who are doing them and punish them - right? - because it is such a cost to and a drag on our economy. That may not be possible. That might be sort of Pollyanna. But, you know, at the end of the day, we have all these companies out there that are really getting hit hard, and the fact of the matter is that no matter how much you spend on cybersecurity, it's not if you're going to get attacked, it's when you're going to get attacked, because there are so many different vectors they can hit you at, right? You've got to protect 360. Plus, you have employees who might do something crazy, whereas our friends, you know - the threat actors, not our friends - but the threat actors really - they can hit you here, and then maybe they try a weak point over here. And they can do a lot of social engineering to get what they want. They're very, very good at this sort of stuff. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: It's amazing how quickly the landscape has changed, not just in the past 10 years, but in the past two to three years. The data from these companies showing significant concern about liability is really remarkable. I think it's from these high-profile cyber incidents that have happened to companies in every sector. But that survey data is crucial to see where the industry is. The level of concern is a little bit higher than perhaps I would have expected. 

Dave Bittner: Yeah. 

Ben Yelin: So I think it's really good that they're doing this work. 

Dave Bittner: Yeah. I mean, it is interesting, isn't it? You and I have talked many times about how things like insurance policies - or the expense of insurance policies - can drive compliance and... 

Ben Yelin: Right. 

Dave Bittner: ...You know, better security practices. And this, I think, tracks similarly. 

Ben Yelin: It does. And she talks about the newfound need for cyber insurance. I think as people start to recognize their risks and their vulnerabilities, they're going to dip into their financial resources to pay for things like insurance. And that might not have happened two or three years ago. 

Dave Bittner: Yeah. 

Ben Yelin: I think it's just how high-profile all of these attacks have been. It's something that seemed very far-fetched when it was even something like the Office of Personnel Management hack, where it only applied to federal employees. Then it was Equifax, and it was, OK, well, that's really most of us. 

Dave Bittner: Right (laughter). 

Ben Yelin: But maybe I didn't seek a credit report, or it's fine if they have that data - there's nothing secretive in there. But when we have something like the Colonial Pipeline incident, where the East Coast of the United States sees gas lines for a period of weeks... 

Dave Bittner: Yeah. 

Ben Yelin: ...Or we have these ransomware attacks on local governments, I think it becomes very real for people. And they're recognizing it in a way that they didn't one to two years ago. So I think the fast-paced nature of this is what's really interesting to me. 

Dave Bittner: All right. Well, our thanks to Andrea D'Ambra from the Norton Rose Fulbright law firm for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.