Caveat 10.20.22
Ep 146 | 10.20.22

The pushback on cyber reporting regulations.

Transcript

Bill Bernard: To me, this is a symptom that we're all concerned about right now and not the problem or the infection that we have to deal with.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a novel court decision on geofence warrants. I've got the story of a $228 million judgment in the first biometrics privacy class action to go to trial. And later in the show, my conversation with Bill Bernard. He's managing director of solutions architecture at Deepwatch. We're discussing industry pushback on cyber reporting regulations. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump right into things here. Why don't you start things off for us? 

Ben Yelin: So my story comes from the Electronic Frontier Foundation, and they write in about a groundbreaking court decision coming from the state of California - a trial court there - relating to geofence warrants. So we've talked about geofence warrants a good deal. 

Dave Bittner: Right. 

Ben Yelin: But as a reminder, it's when you don't have individualized suspicion that somebody has committed a crime. You just want to collect all the metadata from phones that were in a certain geographic location during a certain time period. 

Dave Bittner: Right. 

Ben Yelin: And this has - presents major constitutional concerns because the Fourth Amendment has this particularity requirement. It says in the amendment you have to name either the persons or things to be searched or the things to be seized. It can't be a general warrant. That's why we have a Fourth Amendment in the first place. So that brings us to this case. It was an armed robbery in 2018 in San Francisco. You know I cannot resist a San Francisco-based story. 

Dave Bittner: (Laughter) The pull - the tug to your hometown. 

Ben Yelin: It really is. I mean, it maybe just feels like a disproportionate amount of - disproportionate number of these cases come from San Francisco, or maybe I'm just subconsciously finding them, but... 

Dave Bittner: Right. 

Ben Yelin: So there were basically no leads on this armed robbery, and law enforcement requested geofence data from Google, and Google provided it. There was a bit of an iterative process where, first, Google provides the police with a list of de-identified devices that were in this geographic area. Police have the opportunity to narrow the devices in which they're interested and expand the geographic area to where those devices came from before the crime took place, and then police have the opportunity to further narrow the devices in which they're interested. And that's when Google will deanonymize that data. What that means is there is only judicial involvement before that first step, and that's going to become a major source of concern for this court. 
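[The iterative process Ben describes can be sketched as a three-step exchange. Everything below is illustrative only - the function names and data structures are invented for this sketch and are not Google's actual interface.]

```python
def step1_deidentified_devices(warrant_issued, all_devices, area, window):
    """Step 1: with a judge-approved warrant, the provider returns
    de-identified device IDs seen inside the area during the time window."""
    if not warrant_issued:
        raise PermissionError("initial request requires judicial approval")
    return [d["anon_id"] for d in all_devices
            if d["location"] in area and d["time"] in window]

def step2_narrow_and_expand(anon_ids, selected):
    """Step 2: police narrow the device list (and can widen the area of
    interest). In the case discussed, this happens with no further
    judicial review."""
    return [a for a in anon_ids if a in selected]

def step3_deanonymize(anon_ids, identity_map):
    """Step 3: the provider maps the remaining IDs to subscriber
    identities -- again, without a judge signing off on this step."""
    return [identity_map[a] for a in anon_ids]
```

[The point the court seized on is visible in the sketch: only `step1` checks for a warrant; steps 2 and 3 proceed on the strength of that original authorization.]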

Dave Bittner: OK. 

Ben Yelin: So long story short, they identified this individual by the name of Dawes who committed this - allegedly committed this crime. He's arrested and charged, and they are seeking to use evidence from this geofence warrant for a conviction. 

Dave Bittner: OK. 

Ben Yelin: And Dawes is trying to suppress the evidence... 

Dave Bittner: Ah. 

Ben Yelin: ...And the court granted his motion to suppress this evidence. 

Dave Bittner: Really? 

Ben Yelin: So basically, there are three things that you have to look at on a Fourth Amendment case dealing with geofence warrants, and I think this court offers kind of a roadmap for future cases on this subject. The first is the time frame. Is that time frame overbroad? Here, law enforcement - the San Francisco Police Department - was requesting about 2 1/2 hours' worth of data. And the court said, in that instance, that's OK. Two and a half hours is not overbroad. That's a sufficiently small amount of time. So that's one thing they look at. 

Dave Bittner: Presumably, they knew the exact time that the armed robbery took place. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: So they wanted a little bit - you know, you go a little bit before so you can see people coming in and out of the area. 

Dave Bittner: OK. 

Ben Yelin: So the court here says that 2 1/2 hours is reasonable. 

Dave Bittner: OK. 

Ben Yelin: The next question is the geographic location. So an overbroad search would be one where you include more of an area than is necessary to try and identify an individual who committed a crime. And what the court says here is requesting data from a whole - an entire dense city block in the city of San Francisco is overbroad. And I think that's a really novel finding. What they said here is that there were 13 houses involved in this geofence warrant. And that's going to include possibly 100 people, who are completely innocent, who now have their data swept up under this dragnet, and that's fundamentally unfair from a Fourth Amendment perspective. 

Ben Yelin: So we now have a California court really pushing back against this overbroad geographical area. And in future cases - at least in this district - law enforcement is going to have to go a little narrower when requesting the geofence warrant. It's going to have to be either an individual house or a couple of houses. And that's something that is a change from other cases we've seen in the past. So that is kind of the second thing they look at. The third thing is about the process of getting the warrant itself. And one thing that was deeply concerning to the court here is law enforcement requested the warrant from - or requested the geofence data from Google... 

Dave Bittner: Right. 

Ben Yelin: ...In front of a judge. And that happened once. But there's this further process where Google is going back with law enforcement to get a little bit of information on the devices that were found in the geographic area and then deanonymize the data. And all of that happens without judicial approval. And that goes against Fourth Amendment principles of particularity. 

Ben Yelin: So we now have a case here where in 2 of the 3 factors that courts are going to look at, the San Francisco Police Department wasn't particular enough in describing the location to be searched through this geofence warrant, and the warrant itself was overbroad. The original warrant that authorized Google to turn over de-identified data from that geographic location should not have been extended to include that iterative process where Google is deanonymizing the data. So I think this is going to be a really instructive case across the country for other geofence warrants. We kind of have perhaps a standard settling in of how courts should look at whether these warrants are overbroad. 

Ben Yelin: One thing that's also noteworthy about this case is that California has a pretty robust statute in place, CalECPA, which is the Electronic Communications Privacy Act in California, which is stronger than other states'. So I think there's more of an interest among criminal defendants in the state of California in getting this type of evidence suppressed than there would be in other states just because the sway of that statute is pretty strong. But I do think the way the court approached these issues is certainly novel and something that might be an example to other courts across the country. 

Dave Bittner: So going back to issue No. 2, which was the breadth of the number of houses that they were looking at, suppose - I'm imagining myself, you know, looking at a row of townhouses or row houses or whatever, you know, city housing... 

Ben Yelin: Right. 

Dave Bittner: ...Close together. So imagine those 13 houses, you know, side by side. And rather than asking for the 13, could I start with - I want these two. And then I don't get anything from them. Can I go back to Google and say, all right, let's do... 

Ben Yelin: Let's make it three, yeah. 

Dave Bittner: ...The next two? Or just - can I - you know, you see where I'm going with this? Can I work my way down the row rather than asking for all 13 at once? 

Ben Yelin: Yes. But I think what this court is saying here is you can't ask for two and then, without getting a separate warrant, go back to Google and say, let's expand it to four and six. 

Dave Bittner: I see. So I have to justify it in front of a judge - my process. 

Ben Yelin: Exactly. And that's what was so problematic about law enforcement's relationship with Google here in the first place is they were authorized to perform the individual search. But then all of the back and forth that occurred after that, like - all right, we have some data; it's not quite good enough for us to effectuate an arrest. Can you help us out a little bit? That type of conversation can't exist to the extent that it exists without a separate warrant. And I think that same principle would apply if you're trying to expand the geographic area. There is a real line-drawing problem here because you think, OK, one house is OK. Two houses is OK. I mean, what's the limit? Is it four houses? Is it... 

Dave Bittner: Right. 

Ben Yelin: ...A square footage amount? It's really hard to know where to draw the line, and I think other courts are going to have to wrestle with what is a pretty difficult standard to establish. All we know here is that a dense city block with 13 houses is overbroad. And one of the reasons it's overbroad is technology is so good that you really can be more exacting than that when you're talking about a geofence warrant. I mean, Google knows what they're doing. They can peg you on the second floor of an apartment building. 

Dave Bittner: Well, but to that point, I wonder because, I mean, imagine a high-rise apartment building, right? A GPS location is like looking down on a map and putting a pin in it. But in a high-rise apartment building, that could be 50 units. 

Ben Yelin: I think you've just come up with a plot for season three of "Only Murders In The Building." 

Dave Bittner: (Laughter) Yes. 

Ben Yelin: Can we pitch this to Steve Martin 'cause... 

Dave Bittner: Why not? 

Ben Yelin: Yeah. 

Dave Bittner: It's a lifelong dream of mine to collaborate with him, so sure let's... 

Ben Yelin: Yeah. 

Dave Bittner: What do we have to lose? 

Ben Yelin: And maybe if we just happened to meet Selena Gomez, that's fine, too. Yeah, I think that certainly becomes more difficult. I mean, San Francisco is dense. It's not as dense as New York City. 

Dave Bittner: Right. 

Ben Yelin: If we were talking about New York City, you can have a very narrow geographic location, but it could be a, you know, 30-story apartment building. 

Dave Bittner: Right. 

Ben Yelin: And it would be very difficult. Would that be overbroad? Because you'd be collecting from more than just the innocent people in the 13 houses in this case - probably several hundred innocent people would be captured in that case. So I think that's another part of this line-drawing problem. 

Dave Bittner: So what happens next? I mean, California makes these rulings. How does the - how do courts around the rest of the country look at this and consider this? What is that process? 

Ben Yelin: So this is just persuasive to other courts across the country. In San Francisco, this has precedential value in this trial court, meaning that the San Francisco Police Department, if they want to obtain a geofence warrant, are going to have to narrow their request to make sure it doesn't cover as much of a geographic location. 

Dave Bittner: OK. 

Ben Yelin: But across the country, I mean, so many courts at both the state and federal level are struggling with how to deal with this Fourth Amendment issue on geofence warrants. And I think the approach the court here took could be instructive to those judges, particularly focusing on the lack of specificity as it relates to the location being searched. So I think that could be persuasive. Courts might take a completely separate approach, and I'm sure we'll talk about those approaches when they present themselves. But this was a novel way, at least in my view, of how a court looks at this issue by kind of going through those three factors. And that's why I think it potentially could be persuasive. 

Ben Yelin: You know, there's a downside to this for opponents of geofence warrants. The court - even this San Francisco district court, which was suppressing this evidence - said geofence warrants generally don't, per se, run afoul of the Fourth Amendment. They still could be constitutional if they are particular in what they're requesting. It's just the specifics of this one warrant that were not sufficiently particular, and it was overly broad. So geofence warrants - I think it's just going to be one of those Fourth Amendment issues where we're going to get lots of different cases from lots of different jurisdictions, and it might take a long time for courts to develop some type of sensible standard to how to evaluate these warrants. 

Dave Bittner: Is this on its way to the Supreme Court, do you think? 

Ben Yelin: It certainly could be on a path there. I think it's too early. We're not quite at the point in the process where, say, there's a disagreement among federal circuit courts. That's when you might see the case make it up to the Supreme Court. 

Dave Bittner: I see. 

Ben Yelin: So I don't think we're on an immediate glide path to the Supreme Court, but I do think this is an issue that's ripe for the Supreme Court because there are going to be disagreements among circuits, just 'cause I don't know if there's a natural way to develop a standard here that would be uniform across all of our different judicial systems. 

Dave Bittner: Well, I mean, in the fantasy world we often talk about, where Congress comes up with a national federal privacy law, could that, you know, interrupt this court's - its journey to the Supreme Court? 

Ben Yelin: It could. You know, Congress would have to come up with its own standard that was very particularized, and I think they would be reluctant to do that... 

Dave Bittner: I see. 

Ben Yelin: ...Largely because it is dependent on local factors. A certain number of houses or a certain square footage is going to mean one thing in a dense urban environment... 

Dave Bittner: Right. 

Ben Yelin: ...And a different thing in a rural environment, and it's hard to account for that in a statute. 

Dave Bittner: Right (laughter) right. 

Ben Yelin: I don't know how you would write that... 

Dave Bittner: Yeah. 

Ben Yelin: ...To make it a justiciable standard. 

Dave Bittner: And you lawyers love phrases like preponderance of the evidence and reasonable person would conclude, right (laughter)? 

Ben Yelin: Yeah. Make it as vague as possible so that nobody really knows... 

Dave Bittner: Right. 

Ben Yelin: ...What the standard is. 

Dave Bittner: Right. And then let the Supreme Court figure it out (laughter). 

Ben Yelin: Yeah, exactly. We'll punt it up to these nine individuals who will make all of our decisions for us. 

Dave Bittner: Right. Right. All right. Well, interesting case. I mean, it's fascinating to see the progress we make on these things, right? It happens slowly, but it's happening. 

Ben Yelin: Yeah. I mean, I feel like we run across one of these geofence warrant cases every few months now, and they're kind of building atop one another. It is still a very novel issue, but it is becoming a ubiquitous law enforcement tool. You can see why law enforcement loves it. I mean, I would have loved to have gone through historical crimes and figured out - all right, what were all of the devices? Who were all the individuals that were present at this particular location? 

Dave Bittner: Right. 

Ben Yelin: I mean, that's a very, very compelling law enforcement tool, so they're going to fight hard to be able to use it. 

Dave Bittner: All right. Well, we will have a link to that in our show notes. 

Dave Bittner: My story this week comes from the folks over at Bloomberg Law. This is reporting from Skye Witley, and it's about the first case under Illinois' biometric privacy law to go to trial, which resulted in a $228 million judgment. So there was a class of more than 45,000 truck drivers, led by a gentleman named Richard Rogers, who alleged that he was required to scan his fingerprint to confirm his identity and access facilities at this organization called BNSF Railway Company, and that, in order to basically go about his day-to-day, this fingerprint scanning was required by the company. And he claimed, on behalf of this class, that this violates Illinois' biometric privacy law. And it went to court, and they won. And they were judged - or they were given - what's the word I'm looking for here, Ben? 

Ben Yelin: Damages? 

Dave Bittner: They were granted (laughter) $228 million in a judgment against this organization. Now, the organization plans to appeal. They say, we disagree with and are disappointed by the jury's verdict and think the decision reflects a misunderstanding of key issues. Of course they do (laughter). 

Ben Yelin: Yeah, I'd say that, too, if I lost $228 million. 

Dave Bittner: Right. Right. But this was a five-day trial in front of a jury, and they found that the company, BNSF, had recklessly or intentionally violated BIPA. And BIPA is the Biometric Information Privacy Act that is the law in Illinois. So the jury found they violated BIPA. And with each violation of BIPA, it's a $5,000 fine. They said they violated it over 45,000 times, and that's how the math adds up to this large judgment. What do you make of this, Ben? 
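[The math Dave describes checks out directly. The implied violation count below is derived from the figures reported in the story, not a number stated in the verdict itself.]

```python
# Figures from the story: $5,000 per reckless or intentional BIPA
# violation, and a $228 million total judgment.
fine_per_violation = 5_000
total_judgment = 228_000_000

# Implied number of violations -- consistent with "over 45,000 times".
implied_violations = total_judgment // fine_per_violation
print(implied_violations)  # 45600
```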

Ben Yelin: Yes, there are a couple of really interesting elements here. The first is, it's interesting to see Illinois as the battleground. We know that Illinois was the first state to pass a biometrics privacy law. 

Dave Bittner: Right. 

Ben Yelin: And so a lot of the novel cases are going to come out of Illinois. There are a few unanswered questions here. For one, there is a separate state case currently in front of the Illinois State Supreme Court about whether a company can be fined only for their first violation against one individual or also for every subsequent violation. So that could really determine the true amount of damages here. 

Dave Bittner: That would be quite a swing. 

Ben Yelin: Right, so... 

Dave Bittner: Five thousand dollars or $228 million. 

Ben Yelin: Right. So you can imagine an individual using their fingerprint once. If that's one violation and it's $5,000 to that individual, that might hurt the company, but it's not going to bankrupt them. 

Dave Bittner: Yeah. 

Ben Yelin: But if that person goes to work every single day, now you're talking about 300 violations in a given year or whatever. That's really going to start to add up. So that is still under - something that's still under consideration in front of the Illinois State Supreme Court. 

Ben Yelin: The other question here is the amount of damages. Juries are notoriously fickle when it comes to awarding damages. And oftentimes, an initial judgment that a jury arrives at is going to be partially reversed on appeal. You see this in a lot of cases that have kind of, like, emotional baggage attached to them, where it shocks the conscience. And the jury is just like, give the plaintiffs all the money... 

Dave Bittner: Right. 

Ben Yelin: ...Because this is so offensive. If I were in that position and they collected my biometric data without following the procedures laid out in BIPA, then I certainly would want to be compensated. And so the juries sometimes go a little bit overboard in awarding damages, so I'm interested if, on appeal, the court says, yes, BNSF violated the statute, but they're not liable for $228 million in damages. That's excessive. 

Ben Yelin: So those are two things that we are looking out for. But what this does mean is that organizations both private and public in Illinois are going to have to be far more conscientious about using any type of biometric data. And they're going to have to be concerned about getting sued in civil court and having to pay monetary judgments, and that's the intention of the law. The law is working according to how it's supposed to work, where it's forcing companies to think about - do we really need to collect this biometric data, considering all of the logistical hoops we have to jump through to make it legal in our state? And that might discourage companies from using biometric data. So that could be the long-term consequence here. 

Dave Bittner: To what degree, if any, do you think a judgment like this affects a company that does business in all 50 states, including Illinois? 

Ben Yelin: Well, for something like this, it probably has a bit of a limited impact because if you're in all 50 states and you're using biometric data in the 49 other states, it's not going to be that hard for you to customize the experience of truck drivers in Illinois for... 

Dave Bittner: Right. 

Ben Yelin: ...Those particular circumstances. 

Dave Bittner: So I'm thinking of, like, McDonald's, you know. They operate - they're ubiquitous. And let's just say, hypothetically, McDonald's comes up with a way for their employees to clock in and clock out using their fingerprint or a face scan or something biometric. Would the easiest thing for them be to just come up with a different way to do that in Illinois? 

Ben Yelin: Yeah. If it's effective in all other 49 states that don't have robust biometric privacy laws, it's not that hard to change it for just one state. When you see multiple states coming up with these types of statutes and judgments being awarded in multiple states, maybe it would force a company like McDonald's to reconsider its nationwide practices. I don't think this would be enough. You know, companies have to make changes to accommodate state - individual state laws all the time. 

Ben Yelin: We've talked about circumstances where the logistics are extremely difficult. So when Texas comes up with a statute that regulates social media content moderation, that's very difficult administratively for a large social media conglomerate because it's hard to know which users are in Texas. You're putting together algorithms and certain practices, and it's really hard to tailor those to an individual state. But for something like a practice for providing biometric data, I think it would be OK - acceptable for this company to have some variance across states. 

Dave Bittner: Is this one of those cases that we see with class action suits where - suppose this goes through and the $228 million judgment stands and BNSF is on the hook for that. They pay it. Do the members of the class - is this like - you know, they get a $5 check in the mail or - like, who does the fine go to? 

Ben Yelin: No, the lawyers are going to get some of it. 

Dave Bittner: Yeah. 

Ben Yelin: Unfortunately - I mean... 

Dave Bittner: Right. 

Ben Yelin: ...Lawyers do that. They're the worst. 

Dave Bittner: (Laughter). 

Ben Yelin: I'm not very good at these types of mathematic calculations. 

Dave Bittner: Right. 

Ben Yelin: Maybe you're better at estimating 228 million divided by 45,000 truck drivers. 

Dave Bittner: Yeah. 

Ben Yelin: I think the amount is - if I had to ballpark it, it would probably be a little more significant than those class action lawsuits where - there's a class action against Coca-Cola for a defective can. 

Dave Bittner: Right. 

Ben Yelin: And 30 million people bought Coca-Cola bottles, and it's a judgment for 60 million, and we all get $2. It's... 

Dave Bittner: Yeah. 

Ben Yelin: ...I don't think it's going to be like that. 

Dave Bittner: OK. 

Ben Yelin: It's not going to make all these plaintiffs rich - that's for sure. 

Dave Bittner: Right. 

Ben Yelin: But I think it could be a considerable amount of damages per plaintiff. 

Dave Bittner: Yeah. 

Ben Yelin: If I'm doing the math correctly in my head. 
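[Ben's ballpark works out as follows. The one-third contingency fee is a common rule-of-thumb used here purely for illustration, not a figure from this case.]

```python
total_judgment = 228_000_000
class_size = 45_000  # "more than 45,000 truck drivers"

# Gross share per class member, before fees.
gross_per_plaintiff = total_judgment / class_size
print(round(gross_per_plaintiff, 2))  # 5066.67

# Assume a typical one-third attorney contingency fee (illustrative only).
net_per_plaintiff = gross_per_plaintiff * (1 - 1 / 3)
print(round(net_per_plaintiff, 2))
```

[So, roughly $5,000 per driver before fees - considerably more than a $2 consumer class action check, as Ben suggests.]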

Dave Bittner: Yeah, it's interesting. All right. Well, another one to keep an eye on, right? I mean... 

Ben Yelin: For sure. 

Dave Bittner: Yeah. All right, well, again, we will have a link to that story in our show notes. We would love to hear from you. If there's something that you would like us to consider to cover on the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Bill Bernard. He is managing director of solutions architecture at a company called Deepwatch. And our conversation centered on the pushback we're seeing from industry on some of these cyber reporting regulations. Here's my conversation with Bill Bernard. 

Bill Bernard: The regulations that we have in place today around the world - countries like India, the U.S., other places as well - a large part of them have been building up over time. We have HIPAA. We have GDPR. We even have PCI, all sorts of things along those lines. But I really think in 2021 we saw that huge uptick in cybercrime, expressed most of the time as ransomware. And we saw concerns over paying ransom and things along those lines. We saw governments recognize that they couldn't leave that to the private industry to fix and to deal with. Government was getting involved, whether it was the FBI here in the United States or CISA or etc., etc., all getting very, very involved in this. And so they've realized, of course, that they have no way of measuring their involvement. As a very smart Mythbuster once said, the difference between screwing around and science is writing it down. 

Dave Bittner: Right. 

Bill Bernard: Same is true here. If we don't keep records on cybercrime and how we're dealing with it and those sorts of things, we can't tell how effective we're being. So I think that's been the genesis of it. I think, unfortunately, there's been a lack of consensus on what's the best practice for it. There's also been a problem with multiple departments in multiple different places wanting reporting for their specific purposes. And I think that's what's brought us to where we are today. 

Dave Bittner: What's the push-pull kind of been like here? I mean, I think most organizations would agree that there needs to be a certain degree of reporting. But, you know, everybody has their idea of what's ideal. How do we meet in the middle there? 

Bill Bernard: Yeah. I mean, I think the industry has been - and by industry, I pretty much mean everybody out there with a business or a company - has been sort of skirting the issue a little bit. We've seen just in the past few months a major data breach of consumer data that was hidden and covered up by a ridesharing application company. We've seen other companies try to pass off major ransomware events as an undefined network interruption. We've seen them minimizing this. And so, you know, what did your parents do whenever they caught you doing something, and you tried to lie about it? They started paying way more attention. They started making you report in, do whatever, keep tabs on you. There's a little of that going on here right now, I think. So I think that, yes, government has perhaps overreacted. India, I find, is - has gone to a six-hour reporting requirement. That's a little scary because I'm not even sure you know what happened within six hours' time in a lot of companies. 

Bill Bernard: But peeling all of that back, you know, we're still at the point where we need that data. We need that data. And we need it to inform the public when their data has been compromised. We need it so that the government can, again, try to figure out if they're being effective. Heck, we need it for the government to know who to go after and to see the patterns and to help warn other companies that weren't affected. Because I think one of the most important things for this reporting is about - not necessarily how does that reporting impact the customer - or the company, rather, that's being asked to report, but how do we make the internet and just life in general a little bit safer for all the rest of the companies who could be next? So that's a - you know, that's a big part of it to me that we need to work on. So companies, of course, maybe put themselves in this position. On the other hand, governments have been maybe a little overbearing with some of the requirements. But I think really the funniest part of this whole thing is, to me, this is a symptom that we're all concerned about right now and not the problem or the infection that we have to deal with. 

Dave Bittner: I think that's an excellent point. And one of the things I've noticed that I'm curious for your take on is it seems to me like with this discussion, there tends to be a lack of nuance, and I find the parties talking past each other. You know, if we used India's six-hour reporting requirement just as a - let's use it as a straw man example, you know... 

Bill Bernard: Sure. 

Dave Bittner: ...For the sake of argument. You know, people say, well, six hours - that's way too quick. Like - as you said, how can we possibly know what's going on? Well, I would say, fine, and maybe that's the case. But is it unreasonable to say, if you discover something within six hours, you have to say something's happening here? We don't know what it is, but there's a chance there might be something happening here, and that's it. Like, that doesn't seem unreasonable to me. 

Bill Bernard: I tend to agree. I mean, I think there's an opportunity to think about this like reporting on a traffic incident. I need to know immediately if I should take a different highway into work because there's a jackknifed semi. Right? I need to know that right now. I don't even necessarily need to know it's a jackknifed semi. I just need to know that I-88 heading in toward Chicago is blocked. However, later that day, as I come home and I tune in to the nightly news, it might be nice to get some more details. 

Bill Bernard: I think there's an opportunity here for sort of a two-tier response - a notification - uh-oh, something bad happened - and then the opportunity to say, and now we're going to take X amount of time to get the details and report back. I think that's reasonable. I think companies are very scared for any bad press, though. And I can see them pushing back on that. The thing that I would remind them is the Home Depot is still in business. Target is still in business. TJMaxx, as best as I can tell, is still in business. And, you know, Colonial Pipeline is certainly still in business, right? The list goes on. You know, these are transitory issues in terms of customer sentiment and things like that, you know. So I think that may be a worry that these companies have that isn't as fully baked, perhaps, as they're worried it is. 
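[The two-tier response Bill proposes - an immediate "something bad happened" notice, then a detailed report once the facts are known - could be modeled roughly like this. The field names and the 30-day deadline are invented for illustration; a real statute would define both.]

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class InitialNotice:
    """Tier 1: filed quickly, with minimal detail - the 'jackknifed
    semi on I-88' alert, before anyone knows it's a semi."""
    organization: str
    detected_at: datetime
    summary: str  # e.g. "suspected network intrusion, scope unknown"

@dataclass
class DetailedReport:
    """Tier 2: filed later, once the incident is actually understood."""
    notice: InitialNotice
    root_cause: str
    records_affected: int
    filed_at: datetime

def follow_up_deadline(notice: InitialNotice, days: int = 30) -> datetime:
    """Illustrative rule: the detailed report is due a fixed period
    after detection; the regulation would set the real number."""
    return notice.detected_at + timedelta(days=days)
```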

Dave Bittner: Are there other industries that we can use as an example? You know, I think about things like aviation where, you know, reporting is part of their culture. 

Bill Bernard: Absolutely, and I think, you know, part of this is - I think the public companies especially are nervous because they now have to publicly disclose all of this. I mean, they were supposed to anyway. But the new SEC rules - you know, it's now an 8-K, if I'm remembering the abbreviation correctly - essentially, you have to describe how this is a major material issue for your company. And with, you know, an average ransom payment last year in the $4 million range, if I remember the stats properly, I would imagine most publicly traded companies would have to consider that a material incident. And they just don't want to have to do it, right? They're nervous about what that means to their business. 

Bill Bernard: I think for private companies, one of the interesting things is we're seeing them balk at having to privately disclose to the government as well, which I found interesting, right? Some folks have looked at the Department of Homeland Security's TSA requirements, which have been put forth and rolled back a little bit, and still see them as perhaps onerous because of the number of companies and the number of industry verticals that now qualify as critical infrastructure in the United States. And I think, frankly, part of it is a little bit of what I mentioned before. I could have a company that's subject to HIPAA, PCI, privacy laws in all 50 states plus the territories plus Washington, D.C., the SEC, CISA, the Department of Homeland Security - all of these. I need a small army of lawyers just to figure out who I'm supposed to tell what by when. And that doesn't even get to, did I have the ability to detect the problem in the first place and deal with the problem in the first place, which...

Dave Bittner: Right. 

Bill Bernard: ...I think is really at the root of this issue for many companies. 

Dave Bittner: Well, and you mentioned that earlier, that we're - it seems as though, you know, a lot of people are coming at this - looking at the symptoms and not the true disease here. In your estimation, what is the disease? 

Bill Bernard: I think the disease is that companies are unable to recognize security incidents when they are small and containable, and they don't notice them until the house is burning down, right? If my fire system isn't good enough to tell me that the wastebasket is on fire, and it can only tell me once the drapes have caught fire and the couch has caught fire, the house is a loss. So how do we identify these things earlier on, before they're a material breach, before they're something that I have to put in my 8-K report - things along those lines. I think that's the space to focus on. And there's some industry research out there that backs that opinion up.

Bill Bernard: We at Deepwatch have done some investigation. We've commissioned a report - things along those lines - where we found that almost 40% of companies 1,000 employees and larger don't actually have 24x7 monitoring. We found that almost 100% of respondents would like better, more accurate alerting so that they can deal with things earlier on in the cycle before they become major events. If we got ourselves to that point, I see the probability of having to announce a big breach becomes lower and lower and lower and lower. 

Bill Bernard: You know, I think about, back in 2021, the difference between Colonial Pipeline's breach and a breach that occurred at AmeriGas. AmeriGas lost one whole customer record. And you wouldn't know anything about it - it happened about the same time - unless you happened to see the notification that they sent to - I believe it was the attorney general of New Hampshire. You know, an amazing difference there - we all know what happened with Colonial. 

Dave Bittner: Right. 

Bill Bernard: But - yeah, and that sort of a notification, that sort of a breach - you know what everybody said? Oh, good on them. Bravo. Right? They did it well. The Colonial breach, of course - everybody went, uh-oh, quick, I have to go hoard some gasoline. So I think, at the heart of it, if we were doing a better job, as companies and as an information security industry, at identifying these things before they turn into bigger problems, we wouldn't see the gnashing of teeth over these reporting requirements that we see today.

Dave Bittner: And do you think we could make use of something - you know, the equivalent of building codes for cyber? You know, if I'm putting up a commercial building, I have to have sprinklers. I have to have - you know, my exit doors must swing out and those sorts of things. I don't - you know, my sprinklers are on 24 hours a day. I don't turn them off at night when the employees go home. Is that a possible pathway here? 

Bill Bernard: I mean, I love that concept. I love where you're going with that. I think one of the interesting things with it is you're going to see industry worried about how that stifles growth, especially for the entrepreneur, and how it ratchets up the cost of doing business. Now, I'm not sure I agree with that concern because, frankly, if you're not doing business securely, there's a cost you just haven't run into yet. That would be a very interesting way to go about it. You know, there have been some fits and starts at that. The current administration put out some requirements - what? - a year and a half or so ago. There have been other steps - the Obama administration putting out the NIST CSF, for instance - trying to encourage that, but not making it a requirement the way building codes are. It's a very interesting idea. That could work. I'd be curious to think about that some more and talk about that some more.

Dave Bittner: Yeah. Where do you suppose we're headed, then? I mean, it seems as though, you know, we have to head towards something where we have a little more certainty than what we have now. As you look in your crystal ball, any ideas where we might be going? 

Bill Bernard: Well, I did drop that on the floor the other day, so it's got a crack in it, but... 

Dave Bittner: (Laughter). 

Bill Bernard: I would tell you that I think it's going to be fragmented for a while longer. I think there has to be an aha moment for folks: a data breach is a data breach, and maybe we have to stop treating a health care data breach differently from a PII data breach - differently from, you know, this data breach and that data breach. And I love the fact that the Department of Homeland Security and CISA have started down a path toward some of that. They've started to say, hey, here are some guidelines for everybody to follow. I would love to see them be able to do more of that. I'm just not sure, here in the U.S., that's going to work with all the fiefdoms in terms of who owns HIPAA and who owns this and who owns that. And, you know, again, PCI isn't even part of the government, but it's got its own reporting rules that we've got to worry about.

Bill Bernard: So I think it's going to remain fragmented for a while. The good news, though, is I think we're going to start seeing a little bit more happening internationally. We're going to see a lot of pressure on countries that push to be more aggressive than the generally accepted reporting periods - say, a 6-hour window versus 72 hours. I think we're going to see some of those countries draw that back a little bit as they recognize that people aren't going to come play in their sandbox if they're overly restrictive. So I think there's some of that coming for us, but it's going to be tough for a while. And again, my advice to any company would be to think about how you're going to detect and respond to these things so that you deal with them before you have to report on them.

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Really interesting. I mean, I am kind of surprised that there has been pushback from industry about these reporting requirements. I understand why reporting could potentially be harmful for an individual business. But without this type of reporting, the alternative is less institutional knowledge about cyberthreats, and that hurts everybody.

Dave Bittner: Right. 

Ben Yelin: Or there could be a bigger stick than just a reporting requirement. There could be administrative fines. There could be criminal charges. There could be civil damages for negligence. So I sort of think industry should be a little bit more satisfied with having this limited, so far, to a simple reporting requirement... 

Dave Bittner: Right, be... 

Ben Yelin: ...As administratively difficult as that could be. 

Dave Bittner: Yeah, be careful what you ask for? 

Ben Yelin: Exactly. 

Dave Bittner: Yeah (laughter). All right. Well, again, our thanks to Bill Bernard. He is from Deepwatch. We appreciate him taking the time for us. 

Dave Bittner: And that is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.