Caveat | Ep 131 | 6.30.22

The future of cyber legislation.


Gary Buonacorsi: Stretch as much as you can to get complete visibility, 100% of your environment. You can't protect what you don't know.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses the implications of the Supreme Court's abortion decision for digital privacy. I've got the story of a warrant authorizing the use of biometrics, but prohibiting the demand for a password. And later in the show, my conversation with Gary Buonacorsi. He's Tanium's SLED CTO and chief IT architect. We're discussing the Cyber Incident Reporting for Critical Infrastructure Act. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, before we jump into stories here, we got a little bit of feedback from a listener of ours. This is an anonymous listener who spent part of their career as a U.S. marshal. Now, a couple episodes ago, you and I were talking about bench warrants. And we mentioned that our experience with them was - maybe not personal experience - but our knowledge of them (laughter) was that a bench warrant was generally something where they didn't run you down. You know, if you have a bench warrant out for you, the next time you interact with law enforcement, they'll run your information and they'll say, oh, guess what? You have a bench warrant. I'm taking you in. But it was unlikely that someone was going to come knock on your door. Well, this listener said that in the federal system, if a bench warrant for failure to appear is issued, the U.S. Marshals will come actively looking for you (laughter). And he says in his experience as a U.S. marshal, that would absolutely happen. So just a little clarification there, that our knowledge was incomplete. 

Ben Yelin: All right. Well, excuse me while I go hide in an underground bunker... 

Dave Bittner: (Laughter). 

Ben Yelin: ...In a secure, undisclosed location. 

Dave Bittner: That's right. 

Ben Yelin: Now you'll know why. 

Dave Bittner: Yeah. No, it's good to know. Good to know. 

Ben Yelin: Yeah. 

Dave Bittner: All right. Well, let's jump into our stories this week. We've got a lot to cover. You've got the hot, sticky topic this week. What do you got for us, Ben? 

Ben Yelin: I do not usually enjoy controversial political subjects. Sorry, that's partially a lie. 

Dave Bittner: (Laughter) Anyone who's seen your Twitter feed knows the opposite is true. But go on (laughter). 

Ben Yelin: But I think this is the biggest digital privacy story of our time. 

Dave Bittner: OK. 

Ben Yelin: And I'm talking about the Supreme Court's decision in Dobbs, which, as we're recording, was handed down a few days ago. If you are living under a rock, the court in Dobbs, in a split decision, but with a five-justice majority, overruled Roe v. Wade and Casey v. Planned Parenthood and threw the issue of abortion rights back to the states, saying that there's no constitutional right to have an abortion. It's up to the state legislatures and the people in the several states to choose the legal regime they want for abortion services. 

Dave Bittner: Right. 

Ben Yelin: So the immediate impact of this decision is that several states, I think seven at the time we're recording, have now criminalized abortion. They are allowed to do that after this Dobbs decision has taken place. 

Dave Bittner: And they had trigger laws in place - right? - where... 

Ben Yelin: Exactly. So they had these laws that said, if Roe v. Wade is ever overturned, then this state will make abortion illegal as soon as that decision is handed down. 

Dave Bittner: Right. 

Ben Yelin: Sometimes they have additional requirements like a 30-day waiting period or a - has to be signed off by the attorney general or something. But that's generally how these trigger laws work. 

Dave Bittner: OK. 

Ben Yelin: I would say by the end of the next month or so, the number of states where abortion is either fully prohibited or effectively prohibited at any point during a pregnancy is going to reach upwards of 15 to 20. 

Dave Bittner: OK. 

Ben Yelin: So this is something that's going to affect many people across the country. 

Dave Bittner: Yeah. 

Ben Yelin: The angle that has to do with digital privacy is there are now going to be criminal investigations within these states, both into people seeking abortions and into providers who are providing them illegally in one way or another. And a lot of the evidence that prosecutors in these states are going to use is going to come from our digital footprint. And we leave that digital footprint everywhere. So it's not just the Google searches for abortion clinics, although those can be subpoenaed, but it's private conversations we have with our loved ones. If those could be valuable evidence that somebody is seeking an illegal abortion, that's something that can be subpoenaed through a state court and obtained by a state prosecutor in one of these states. Then there are things like location sharing. One investigative method that we know is frequently used is looking at somebody's historical cell site location information. 

Dave Bittner: Right. 

Ben Yelin: So unless you disable location services on your device, you leave a footprint of all the places that you've been. It's stored on your device, usually in some type of maps application. And that's going to be increasingly relevant if there are criminal investigations based on a suspicion that somebody's seeking an illegal abortion. If a person is seen traveling to another state and their home state prohibits that type of interstate travel - we don't know if that's going to be upheld as constitutional - but certainly, a person's location is going to become extremely relevant. 

Ben Yelin: Then there are things like period tracking apps. I think we've referenced this obliquely in the past. 

Dave Bittner: Right. 

Ben Yelin: But with these tracking apps, you're potentially sharing sensitive medical information. And if you're using one of these tracking applications and it indicates that you missed a period, and then several weeks later your period shows up, that might be evidence in a state court that you're trying to conceal a terminated pregnancy. 

Dave Bittner: So you combine that with, you know, location data that says that you traveled to a state that - where abortion is legal, and suddenly, we've got a case against you. 

Ben Yelin: Yes. And then combine that with your internet search history, which says, where can I obtain an abortion? 

Dave Bittner: Right. 

Ben Yelin: What are - how can I get access, free access to an abortion pill if I live in a state where abortion is illegal? Those are searchable on any public search engine, specifically something like Google... 

Dave Bittner: Yeah. 

Ben Yelin: ...Where we know that they collect a lot of private information for advertising purposes to give you suggestions in your search bar. 

Dave Bittner: Right. 

Ben Yelin: So once you've conducted that search, that information is going directly to Google. 

Dave Bittner: Yeah. 

Ben Yelin: There are other things like license plate readers, facial recognition software systems that might be set up across state borders. So if the state is trying to surveil people leaving and coming into the state, either through airports or across the physical border, that's something that people might need to consider - alternate transportation routes - if people are recognizing where a license plate has been as part of a broader investigation. 

Ben Yelin: All of this is going to be extremely difficult for a layperson. The things you have to do to completely conceal your digital footprint are a series of pretty complicated actions. You'd have to disable your location services, which means that you can't access a bunch of applications that rely on location services, like Google Maps or Apple Maps or whatever. You have to use encrypted messaging applications. So things like iMessage are encrypted, but Apple has the key. They can unlock it. And these conversations are stored in the cloud. And then there are things like using an incognito browser, which is not foolproof in 100% of circumstances, but something, certainly if you're going to be searching for things, that you'd want to do. 

Ben Yelin: The big question to my mind is what are the big tech companies going to do about this? 'Cause they're going to start getting a lot of subpoena requests, warrant requests from these states to obtain this information as states investigate people who have gotten - allegedly gotten illegal abortions. And so far, the big tech companies have been rather quiet on this issue. And I think that's alarmed a lot of pro-choice advocates and electronic privacy advocates who might not be in this for the pro-choice element, but just - they don't want investigations that rely on this strong digital footprint. 

Dave Bittner: Right. 

Ben Yelin: And the - these companies have been a little bit cagey. All of their EULAs say we will comply with lawful law enforcement requests. They'll say we'll push back if the requests are overly broad. But it's hard to know exactly what that's going to mean, and do they have enough of an incentive to push back against these overbroad requests lest they develop adversarial relationships with law enforcement in some of these states? And we're not just talking here about states in the Deep South, like Mississippi. We're also talking about states like Ohio, where abortion is effectively now criminalized. Utah is another one. So states that have areas that are cosmopolitan, big cities, a lot of young professionals working there, and we don't know how the tech companies are going to react when those states request abortion-related data. 

Dave Bittner: Yeah. 

Ben Yelin: So until the companies directly answer these questions - and they have not thus far - I think there's this really open question as to whether the big tech companies are going to shield the privacy of people who are seeking abortions. 

Dave Bittner: Yeah. You know, I've been thinking about this, and one of the things that has crossed my mind is, you know, it's been, what, about 50 years since Roe v. Wade was decided. 

Ben Yelin: Yep. 

Dave Bittner: Right? And so obviously, a lot has changed... 

Ben Yelin: Yes. 

Dave Bittner: ...In that amount of time. Related to that, you remember back in the, oh, gosh, it was probably the late '80s, when Congress suddenly became very interested in the privacy of your video rental store records. 

Ben Yelin: Yes. 

Dave Bittner: Right (laughter)? 

Ben Yelin: Yeah. 

Dave Bittner: Because turns out you could just get someone's video rental store history. And this proved to be problematic for some members of Congress. And quick as a wink, they made it so they (laughter) - they put protection in place for that. 

Ben Yelin: Yeah, day one. I mean... 

Dave Bittner: Right. 

Ben Yelin: ...If you're choosing one of the videos - those of us who remember video stores well, if you're choosing one of the videos behind the curtain in the corner... 

Dave Bittner: (Laughter) That's right. That's right. 

Ben Yelin: ...You're going to be in trouble, yeah. 

Dave Bittner: That's right. So I'm wondering, are we about to experience something like that here, where - kind of a be careful what you ask for thing. I mean, you know, one of the arguments you'll hear about - in the abortion debate is that people of means will always be able to have access to abortions. 

Ben Yelin: Right. 

Dave Bittner: And I think certainly members of Congress can be categorized as being in that category. So what I wonder is if our digital - our - the technology that we surround ourselves with, is that going to make it much harder for folks who are trying to keep something like this quiet - because it would be against their best political interest, are they going to suddenly find themselves under more scrutiny than they would have 50 years ago? And could that trigger, the same way that the video rental store thing did, could that trigger suddenly Congress is very interested in digital privacy? 

Ben Yelin: I think that's possible. I mean, we certainly - there's one current pro-life member of Congress in - from the state of Tennessee who I think has paid for three separate abortions for wives and mistresses in the past. So this is something that certainly does happen. It's not just elective abortion. 

Dave Bittner: Right. 

Ben Yelin: And if there were a member of Congress who doesn't find it politically advantageous to talk about an experience and they have something like an ectopic pregnancy... 

Dave Bittner: Right. 

Ben Yelin: ...That requires them to have an abortion and they have to travel to a state that's maintained abortion rights, yes, that's going to be a major controversy. But I don't think that would galvanize Congress to act, just because there is a very substantial pro-life minority in the House and the Senate right now - not a majority, but enough that it could stop legislation. And I'm guessing that this time next year there will be a pro-life majority in the House of Representatives, at the very least, most likely in the Senate as well. So I find the prospect that it would galvanize members of Congress pretty unlikely. I think they would throw the embarrassed member under the bus... 

Dave Bittner: I see. 

Ben Yelin: ...And say, you know, what this guy did was wrong, but that doesn't reflect on the rest of our movement. We don't want to - in the words of pro-life legislators, we don't want to inhibit states' ability to investigate illegal abortions. 

Dave Bittner: Right. 

Ben Yelin: So, yeah, I don't necessarily see that as something that's going to happen. Now, what blue states might do is introduce new privacy protections to protect people traveling from out of state. So they might say our state courts will not honor a request for data that was obtained within our state's boundaries. 

Dave Bittner: Oh. 

Ben Yelin: That's something that you might see from enterprising states like California, who have already passed in the last several days new protections for abortion rights, and some of the other states where abortion is going to maintain its legality. So I think it's more likely we see action from blue state legislatures than we do from Congress, even though, as you say, this certainly causes the potential for some embarrassment. 

Dave Bittner: Yeah. 

Ben Yelin: Because there is a digital footprint now for literally everything. And if you are hypocritical about what you do in your personal life versus what you espouse publicly, unless you are extremely careful, that evidence is going to exist. And I think we're going to see high-profile cases where that kind of thing is revealed. 

Dave Bittner: Yeah. All right. Well, time will tell. It certainly is a fascinating case in potential unintended consequences, right? 

Ben Yelin: Yeah. I mean, I think so much of this is uncertain based on how the tech companies react and based on the specifics of the laws that are either triggered in place in these states or that are going to pass in the next several months. 

Dave Bittner: Yeah. 

Ben Yelin: Are states going to be ambitious enough to go after residents who travel out of state, and are they going to go after out-of-state providers? I think those questions will - the answer to those questions will shed light on how much of a problem this is for digital privacy, regardless of one's view on the abortion issue. 

Dave Bittner: All right. Well, we will have a link to that story in the show notes. My story this week revolves around the fact that federal agents seized the phone of John Eastman, who was someone involved in the January 6 plan. Ben, how would you describe John Eastman to our listeners? 

Ben Yelin: He was President Trump's campaign attorney who wrote a lot of the legal memos advocating a strategy - a couple of different strategies - one, to have states send alternate electors to Congress so that there wouldn't be a majority of electors for Joe Biden and the election would have to be thrown to the House of Representatives. And he was an advocate of a legal plan to have Vice President Pence refuse to certify the electors from a number of different states. 

Dave Bittner: OK. So we'll have a link to The New York Times story about the seizure of that phone. But what is of particular interest to us here is the actual warrant that they used to seize that phone. I give credit to Kim Zetter, the journalist, for pointing this out. I saw this in her Twitter feed. She pointed out that the warrant authorized authorities to compel Eastman to use his finger or face to unlock his phone but specifically prohibited them from demanding his password. Ben? 

Ben Yelin: We - we're actually seeing this in real life. It's amazing. 

Dave Bittner: (Laughter). 

Ben Yelin: We've kind of previously talked about this in very obscure cases that we find on the internet and not something as high profile as John Eastman. 

Dave Bittner: Right. 

Ben Yelin: But, yeah, this gets back to the Fifth Amendment right against self-incrimination. That applies to testimonial evidence. Courts have generally held that entering a passcode is the content of one's own mind, and there are constitutional questions about whether you can compel somebody to enter that passcode if what they reveal would be particularly incriminating. The other side of that is biometric data, so facial recognition or thumbprints, which courts are more likely to compel because that's not the content of one's own mind. As we've mentioned before, that's more of the equivalent of being forced into a police lineup where your face is out there to be viewed and evaluated. And in those cases, the Fifth Amendment right against self-incrimination doesn't apply because it's not testimonial evidence. 

Dave Bittner: Right. 

Ben Yelin: So I think they are looking to avoid a potential Eastman defense that his Fifth Amendment rights against self-incrimination have been violated, which might slow down the legal process. So this is a case of surveying the legal landscape and dotting the I's and crossing the T's to make sure the warrant is constitutionally sufficient. 

Dave Bittner: My understanding is that Eastman has sued to get his phone back, which, from, you know, from his point of view, makes sense. 

Ben Yelin: Yeah. 

Dave Bittner: So thinking through this and seeing that we're at this place where they were specific in laying all of this out, it got me thinking about protecting yourself, protecting your device from this sort of thing. And of course, my experience is on the iOS side of things, and I like to use Face ID to log into my phone. I find it... 

Ben Yelin: It's easy. 

Dave Bittner: ...Very convenient... 

Ben Yelin: Yeah. 

Dave Bittner: ...And relatively secure. But what do you do? If the police come and say, hey, give us your phone, what do you do? Well, so let me walk through my thought process here (laughter). 

Ben Yelin: OK. 

Dave Bittner: OK. So... 

Ben Yelin: Let's hear it. 

Dave Bittner: So on iOS, for example, if you hold down the power button and either of the volume buttons for about two seconds, that will lock the phone and require your passcode, so Face ID will no longer work. All right. Well, that's great. But I was imagining, what if law enforcement comes to you, and they say, put your hands up? So you cannot touch the phone, right? Put your hands up. They come to you, and they take the phone out of your pocket. 

Ben Yelin: If that's incident to arrest - now, to search the phone, because of Riley v. California, they'd need a warrant. But if they had a warrant to do that and they nab you in an arrest and say, put your hands up... 

Dave Bittner: Right. 

Ben Yelin: ...Then you're going to be out of luck. 

Dave Bittner: Well, perhaps. 

Ben Yelin: Perhaps. 

Dave Bittner: But I did a little digging, though, (laughter). 

Ben Yelin: Oh, OK. What do you got for me? 

Dave Bittner: It turns out that if you summon the little person who lives inside of the phone whose name begins with S and ends with I, whose name I'm not going to say while I'm recording audio... 

Ben Yelin: She's always listening. 

Dave Bittner: She is always listening. If you summon her and you say to her, who does this phone belong to, that will trigger her to answer and say who she believes the phone belongs to. But it also triggers a locking of the phone. 

Ben Yelin: Good to know. 

Dave Bittner: So it requires a password. And you have to do it before a biometric unlocking. In other words, if the police grabbed the phone and immediately held it up to your face and unlocked it, you couldn't then summon her and trigger the command and have it lock. It has to be while the phone is locked. So you know, like, let's say the phone is in your pocket or law enforcement is holding it not in front of your face. You could say this, and it would lock the phone. So... 

Ben Yelin: Wow. We're giving a lot... 

Dave Bittner: News you can use (laughter). 

Ben Yelin: Yeah. We're giving a lot of advice to potential criminals today. That does sound like a better option than what first came to my mind, which is trying to trick the facial recognition by making a funny face, you know, scrunching your eyes and your nose and... 

Dave Bittner: Right. 

Ben Yelin: ...Making your face look like what it does not usually look like when you're trying to unlock your phone. Or before you put your hands up, try and dirty up your thumbs with some dirt so the thumbprint doesn't work. 

Dave Bittner: Or just take your phone out of your pocket, and chuck it across the room as hard as you can (laughter). 

Ben Yelin: Yes. That might cause other problems. 

Dave Bittner: Well - so... 

Ben Yelin: That could be an obstruction thing. 

Dave Bittner: Well, that's what I wanted - that's where I wanted to go next with you. So let's say I do this. Let's say the police say to me, put your hands up. Don't touch your phone. And I summon her, and I put this thing into place that locks the phone. Could they come after me with obstruction of justice? 

Ben Yelin: Huh, that's going to be a really interesting case. 

Dave Bittner: Yeah. 

Ben Yelin: I think it's possible. 

Dave Bittner: Yeah. 

Ben Yelin: I would not be surprised if we saw a case exactly along those lines... 

Dave Bittner: OK. 

Ben Yelin: ...At some point. I'm not aware of one that exists right now, but... 

Dave Bittner: Right. 

Ben Yelin: ...That is - has all of the makings of a fascinating judicial case. 

Dave Bittner: Because I'm actively trying to thwart their access in real time while interacting with law enforcement. That's what it takes (laughter). 

Ben Yelin: I would lean towards that being an obstruction charge. 

Dave Bittner: Right. 

Ben Yelin: But I'm not exactly sure how courts would see it. 

Dave Bittner: Yeah. 

Ben Yelin: What would not be an obstruction activity would be to use some of these privacy protections before you're in a situation where you're getting arrested. So if you look at Kim Zetter's tweets, all of the responses are, this is why I've turned off biometrics on my device. 

Dave Bittner: Right. 

Ben Yelin: Because I'm afraid of the situation where I'm going to be compelled to unlock my phone and there's a lot of personal, potentially incriminating information on there. 

Dave Bittner: Yeah. 

Ben Yelin: So that's the immediate step one should take if one were concerned about such a thing. You can disable facial recognition. You can disable the thumbprint. If you do that ahead of time, it's certainly not obstruction. And then you get into a situation where it's going to be a lot harder to obtain the requisite warrant to get you to unlock the phone, because that has Fifth Amendment implications. So that's an action that somebody could take now to protect themselves from being trapped in that type of situation. 

Dave Bittner: Yeah. All right. Well, it's - boy, it's interesting, isn't it? Interesting times. Interesting times. 

Ben Yelin: It sure is. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah, I mean, I know we've talked about it a lot, this distinction between using a passcode and using biometric data. But it's fun to see it out in the wild, where it actually might have an impact in a pretty important, notorious legal case. 

Dave Bittner: Yeah, absolutely. All right. Well, again, we will have a link to that in the show notes. If there's something you would like us to consider for the show, you can email us. It's 

Dave Bittner: All right, Ben. I recently had the pleasure of speaking with Gary Buonacorsi. He works at Tanium. He is their SLED CTO and their chief IT architect. And our conversation centers around the Cyber Incident Reporting for Critical Infrastructure Act. Here's my conversation with Gary Buonacorsi. 

Gary Buonacorsi: So I think it's interesting. This legislation that was recently signed under the Biden administration came in response to the Colonial Pipeline incident, but I think that was just the most public example of what the vulnerability is from the country's perspective at large. Infrastructure's been one of those areas that's been pretty much neglected, I think, from a cybersecurity perspective for a long, long time. And what that's done is put a lot of our critical infrastructure at high risk. And as recent events in Europe indicate, you never know exactly how the geopolitical environment can change in just moments, right? And from a targeting perspective, infrastructure is one of those areas where countries can be very vulnerable because, as I said, of that lack of investment. 

Gary Buonacorsi: So I think what this recent legislation is designed to do is really help incorporate knowledge sharing amongst providers of infrastructure, much like we did after 9/11 - right? - when 9/11 was a failure of a lot of intelligence agencies to share information. In regards to infrastructure, they're trying to get ahead of that - not only just reporting events that happened, but sharing with other entities that may have similar vulnerabilities what defensive actions they could take and what their posture should be. 

Dave Bittner: So as you look at this legislation, what is your take on it in terms of, you know, what they set out to do? And do you think as written, is it achievable? 

Gary Buonacorsi: You know, that's an interesting question. I was contemplating that before our conversation. And I think it helps in some ways, but it doesn't go far enough in others. It helps in that, to my point earlier, it's kind of mandating the information sharing across different entities. 'Cause in the past, I think if there was an event, you might not share it with others 'cause you didn't want to disclose that you had a vulnerability and maybe open yourselves up to other attacks. But I think that's kind of shortsighted. And I think from the legislation's perspective, the fact that they're mandating sharing of information is a good point. 

Gary Buonacorsi: I think where it doesn't go far enough is that it doesn't get to validating where these entities are from a cyber hygiene perspective in advance of an incident. And I think this is a problem across all of government, actually. It's around the visibility part. And what I mean by that is most entities today don't really have visibility into their environment. And when you get into infrastructure, a lot of these are operational technology - OT - systems, and those are very difficult to protect. They really don't have visibility into what their environment really looks like and where their vulnerabilities exist. I think the legislation could have gone further by mandating, maybe even from a third party, some form of compliance check and some vulnerability assessments that are mandatory. I think a lot of entities out there that provide critical infrastructure for the country don't really have a good plan for how they can test their vulnerabilities and then how they can enforce that compliance. Once they do identify what the plan is, are they in compliance with that plan? 

Gary Buonacorsi: Again, I think it's a symptom of just the way government operates. There's a lot of governance and policy that gets written, much like this legislation. But the compliance and validation part on the back end is what's really, I think, sorely missing in government at large. And specifically, I think it's a big miss on this particular piece of legislation. 

Dave Bittner: You know, you bring up a really interesting point. As a sort of a related aside, my wife and I realized over the course of the pandemic that a large part of keeping our house tidy was dependent on having guests over. And so when COVID (laughter) happened and we weren't having guests over, we noticed things tend to get a little more cluttered. 

Gary Buonacorsi: (Laughter). 

Dave Bittner: I'm wondering, you know, to our conversation today, the fact that these organizations know that they're going to have to be sharing things, that other people are going to have to take a look at things, does that make them get their house in order? 

Gary Buonacorsi: You know, I'm not sure. One of the things the Colonial Pipeline incident brought to attention - not only just to the Colonial Pipeline folks themselves but to the industry at large, those people that have critical infrastructure - is it really just demonstrated how vulnerable they were. So to your point about whether having to share after an incident is going to help them clean their house - I'm not sure it really will. I go back to the fact that there's really no enforcement mechanism in this legislation. You know, if you have to report an event, to me, that's after the fact. I think the better approach is to be proactive before an event. 

Gary Buonacorsi: And so if you do see something in your environment, sharing that in advance with others I think is a positive step. But after the fact, you know, post-mortem analysis, if you will - I'm not sure how effective that's going to be in changing behavior across the industry. I just hope they would do it because they realize the vulnerability is there, and begin to get their house in order way before an event happens. 

Dave Bittner: Is there any sort of time limit put on them that, you know, they must report within X number of days or hours or so forth? 

Gary Buonacorsi: Yeah, I don't remember the exact reporting methodology, but, yes, there is a timeframe in there, and there are types of incidents. So there's a lot of descriptive criteria around what types of events need to be reported and when and to whom. So there is an enforcement mechanism, I guess, from the reporting perspective. But I go back to this: there's really no enforcement mechanism to get them to act preventatively before something happens. If you think about it, it's like leaving your front door unlocked and hoping somebody doesn't walk in, instead of checking your locks and making sure you have a process where you're always constantly locking and reinforcing your front door. If you don't do that up front, you're just waiting for someone to walk in the front door. And then after that, it's too late. 

Dave Bittner: How is industry responding to this? 

Gary Buonacorsi: You know, it's interesting because (laughter) I think the industry is taking this seriously. What I don't think they have today is a real plan on how to proactively get in front of this. And I'll use the power grids, for example. I think that's probably one of the most vulnerable areas. And part of the challenge, from a technology perspective, is how do you protect something so vast and so complicated, run by a bunch of independent entities? If you think about it, the power grid system is really a mesh network of power providers, energy providers. And you're only as strong as your weakest link, right? So you may have 90%, 95% of the people doing a great job at this, but then you have that 5%, or even 1%, or even a fraction of a percent. That's where that vulnerability happens. 

Gary Buonacorsi: And in regards to the electrical grid, it doesn't take a lot. It could have a cascading effect, especially right now. You have the heat of summer going across the United States right now. The power grids are already stretched to the limit. You start breaking into small parts of that, and you could have a cascading effect - a domino effect that could take down entire systems. So I think they're taking it seriously, but just from the complexity of it, I don't think there is a universal approach to how everybody works together to harden their infrastructure. And when you have a bunch of independent operators, that just makes the challenge even more difficult. 

Dave Bittner: What sort of advice are you providing to the folks that you work with in terms of, you know, making sure that they have what they need or they're properly preparing for this? 

Gary Buonacorsi: That's a great question. So I think it starts with the basics. And one of the things I think government as a whole struggles with is the basics, in that they lack visibility. They don't really even know what's in their environment, especially when they get to industrial controls. You know, it's not just your servers, your laptops, your desktops and the systems that your operators work off of. It's really the industrial controls themselves. Everything is connected to a network these days, even your industrial controls. Everything's operated remotely. There's very little hands-on. So when you lack visibility into what your environment looks like, you don't have certainty. And if you don't have certainty and visibility, then you really are just shooting darts at a dartboard. You don't know where your vulnerabilities even are. 

Gary Buonacorsi: So what we advise people - at least what I advise people that I'm talking to - is stress as much as you can to get complete visibility, 100% of your environment. You can't protect what you don't know. And so that's where I stress, for everybody, that visibility is the key. It's the key for every cybersecurity framework, whether you use NIST - the National Institute of Standards and Technology framework - or the CIS - Center for Internet Security - controls, whatever that framework is. It always starts with visibility and knowing what your assets are, where they are and what's running on them. 
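The "you can't protect what you don't know" point can be sketched in code. The following is a minimal, illustrative Python example - all hostnames, addresses and the inventory data are invented, and real asset discovery would come from a network scan or an endpoint-management tool rather than a hard-coded list - showing the core idea: diff what you actually see on the network against what your inventory says you own, and flag the gap.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Asset:
    """One device in the environment (hypothetical fields for illustration)."""
    hostname: str
    ip: str
    kind: str  # e.g. "server", "laptop", "ics-controller"


def find_unknown_assets(discovered, inventory):
    """Return discovered assets whose IP is absent from the known inventory.

    These are the unmanaged devices - the visibility gap the frameworks
    (NIST CSF, CIS Controls) tell you to close first.
    """
    known_ips = {asset.ip for asset in inventory}
    return [asset for asset in discovered if asset.ip not in known_ips]


# Invented sample data: two assets on record, three actually observed.
inventory = [
    Asset("web01", "10.0.0.5", "server"),
    Asset("hmi01", "10.0.1.9", "ics-controller"),
]
discovered = inventory + [Asset("(unnamed)", "10.0.1.42", "unknown")]

for asset in find_unknown_assets(discovered, inventory):
    print(f"unmanaged asset: {asset.ip} ({asset.kind})")
```

In practice the `discovered` list would be fed by live scanning or agent telemetry, but the shape of the check is the same: inventory first, protection second.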

Dave Bittner: Now, my sense is that with critical infrastructure, we seem to be playing a bit of catch-up here. First of all, I mean, do you agree with that? And if so, do you envision a time when we're able to keep pace? 

Gary Buonacorsi: You know, we are playing catch-up. And part of it is the age of the infrastructure itself, right? Some of this infrastructure may be modernized, but other aspects of it are not. And when you put those two things together, the investment that it would take to truly modernize and harden infrastructure is a really big challenge, and I don't think there's adequate funding around that today. So I think we're always going to be playing catch-up. I just think it's one of those things where, as we make investments in infrastructure, we think about that visibility and hardening as part of it. When you do that, hopefully that gap begins to shrink, with the archaic, older infrastructure that's really hard to defend being replaced at a faster pace by technologies that can be hardened and are much easier to defend. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: I think we've kind of heard this message echoed from a lot of different experts, that this is a really important first step. Reporting is good to establish governmental knowledge, institutional knowledge of the threat landscape to critical infrastructure. 

Dave Bittner: Right. 

Ben Yelin: We've seen the real-world effects of ransomware attacks and other cyberattacks on critical infrastructure, and I think all of us expect it to get worse. But reporting - a reporting requirement is only the first step. 

Dave Bittner: Yeah. 

Ben Yelin: I think we're still far from a comprehensive solution, which is to better protect critical infrastructure. That's going to happen at the state level, I think, more prominently in the short run than at the federal level. I think it's a more direct risk to states and localities, even though, of course, the federal government maintains and controls a lot of critical infrastructure. 

Dave Bittner: Yeah. 

Ben Yelin: But yeah, I think we should just see this as the first step and not the end game in terms of protecting critical infrastructure from cyberattacks. 

Dave Bittner: Yeah, it really is one of those, I guess, examples - for me, anyway - where the devil is in the details. Like, as you say, you know, this is a good first step, and we're headed in the right direction, it seems. But there's still a lot of details to work out. 

Ben Yelin: Right, exactly. 

Dave Bittner: All those edge cases, right? Yeah. 

Dave Bittner: All right. Well, again, our thanks to Gary Buonacorsi for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.