Caveat 8.11.21
Ep 89 | 8.11.21

Convergence of government and private sector in national security.

Transcript

Luke Tenery: And I think what's interesting about her background is sort of this convergence of government and private sector. Increasingly, that's highly relevant and of interest to our national security.

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at increased skepticism over the use of gunshot sensing technology. I'm digging into class action lawsuits on the heels of ransomware. And later in the show, my conversation with Luke Tenery. He's a partner at StoneTurn. We're going to be discussing the recent confirmation of Jen Easterly as director of the U.S. Cybersecurity and Infrastructure Security Agency. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some good stories to share this week. Why don't you kick things off for us? 

Ben Yelin: So my story comes from Motherboard by Vice, but it is not Joseph Cox. 

Dave Bittner: What? 

Ben Yelin: It's Todd Feathers. I know. 

Dave Bittner: (Laughter). 

Ben Yelin: So we are betraying our unrequited love on this podcast. 

Dave Bittner: Maybe Joe's on vacation this week (laughter). 

Ben Yelin: He must be, yeah. So the title of the article is "Police Are Telling ShotSpotter To Alter Evidence From Gunshot-Detecting AI." So if you haven't heard of ShotSpotter, it is this technology where, based on artificial intelligence, the government - law enforcement - can supposedly detect where there has been a gunshot. It tries to match the sound that's heard with a location. It's all, you know, mapped to specific coordinates, and it's used to try and solve homicide crimes. 

Dave Bittner: Right. So there are microphones placed around a city. 

Ben Yelin: Right. 

Dave Bittner: And by using triangulation and the - you know, we know how fast sound travels - so by using math, artificial intelligence (laughter). 

Ben Yelin: I was reliably informed there would be no math in this podcast. But... 

Dave Bittner: (Laughter) They can - they allege or they claim that they can pinpoint where something like a gunshot may have taken place. 
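
The math Dave is gesturing at is usually called time difference of arrival (TDOA) multilateration: if the microphone positions are known and each microphone records when it heard the same impulsive sound, the differences in arrival times constrain where the source could have been. The sketch below is a minimal, hypothetical illustration of that geometry - the sensor coordinates, arrival times and least-squares solver are invented for the example, and none of it is ShotSpotter's actual algorithm.

```python
# Minimal, hypothetical sketch of time-difference-of-arrival (TDOA) localization.
# Not ShotSpotter's algorithm: sensor positions, arrival times, and the solver
# are invented purely to illustrate the geometry described above.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second (roughly, at 20 degrees C)

# Hypothetical sensor positions (x, y) in meters, and the time (in seconds)
# at which each sensor heard the same impulsive sound.
sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
arrival_times = np.array([1.978, 2.344, 1.729, 2.175])

def residuals(guess):
    # guess = (x, y, t0): candidate source location and emission time.
    x, y, t0 = guess
    distances = np.linalg.norm(sensors - np.array([x, y]), axis=1)
    predicted_arrivals = t0 + distances / SPEED_OF_SOUND
    return predicted_arrivals - arrival_times

# Solve for the source location that best explains the arrival times,
# starting from the centroid of the sensor array.
initial_guess = [sensors[:, 0].mean(), sensors[:, 1].mean(), 0.0]
result = least_squares(residuals, initial_guess)
x_est, y_est, t0_est = result.x
print(f"Estimated source: ({x_est:.0f} m, {y_est:.0f} m), emitted around t = {t0_est:.2f} s")
```

The part in dispute in the article isn't this geometry; it's the separate classification step - deciding whether the sound was a gunshot rather than, say, a firework - which is what the analysts were reportedly overriding.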

Ben Yelin: That's right. That's right. The problem is there is a human element to this. And what we found out from this article is that police departments are altering some of the data that comes in through this artificial intelligence system. So the hook in this article is an incident in Chicago where a 25-year-old was shot in the head, dropped off at a hospital and died two days later. The police arrested a suspect. And it turns out there was video surveillance of the suspect's car pulling up in the area where the crime was committed. So the police claimed that ShotSpotter was - one of the microphones used as part of the ShotSpotter system detected the sound and location of the gunshots, and that generated an alert for that particular time in that particular location. And they were preparing to bring that evidence into court. 

Ben Yelin: When they looked under the hood, it turns out that an analyst had altered the ShotSpotter data - changing the time and the location and, overriding the algorithms, reclassifying the sound as a gunshot, even though it probably was actually a firework. This was a homicide that took place during widespread protests in the wake of George Floyd's murder. 

Ben Yelin: We know based on the research done in this article that this is not the first time that this has happened. It's happened in other cities that have deployed ShotSpotter technology, where the artificial intelligence is reasonably good. It's not entirely accurate, but sometimes the alert that's generated by ShotSpotter doesn't match the known circumstances surrounding the crime. So police alter the evidence to fit the location and the time of the crime in order to assist their own prosecution. 

Ben Yelin: So the defendant in this case, with the help of his attorney, filed what's called a Frye motion, which is a request for the judge to figure out whether whatever this forensic method is, is actually scientifically valid. And the prosecutors in this case are like, you know what? Forget the ShotSpotter thing. Let's not use that data. We'll try and prosecute him on something else. 

Dave Bittner: So faced with more scrutiny from the judge, rather than face that scrutiny, they dropped that particular evidence. 

Ben Yelin: They did, yeah. And, you know, it seems like there's enough evidence to sustain a conviction without the ShotSpotter data here. That's not going to be true in every case. There are going to be some cases where the only evidence they have is the ShotSpotter data. 

Dave Bittner: Yeah. 

Ben Yelin: So when they - nobody's really looked under the hood, besides the company that manufactured ShotSpotter, to understand exactly how it works. And now we have a bunch of documented incidents all across the country, one they talked about in Rochester, N.Y., where law enforcement are using the incoming data from ShotSpotter as kind of a starting off point before they edit the data to match the circumstances that they want to portray... 

Dave Bittner: Wow. 

Ben Yelin: ...As part of this prosecution. 

Dave Bittner: Yeah (laughter). 

Ben Yelin: So this is obviously a large problem. The question is whether the use of this technology is worth it in the first place 'cause if you have this risk that prosecutors and law enforcement have the option of overriding this data, obviously that's ripe for abuse. Now, that might be justified if ShotSpotter was useful, if the data showed that it was useful in, you know, heightening conviction rates for homicide and for protecting the public. And it seems like from the data presented in this article and data I've seen elsewhere, that it's not really an effective law enforcement tool. In the cities where it's been deployed, there hasn't been a decrease in gun-related crimes. And, you know, that's been replicated all over the country. And even though ShotSpotter says that, you know, the technology has gotten more accurate with time, it just seems like, when you have these circumstances in which the data coming in from the sound sensors doesn't match with - what law enforcement wants to see and wants to portray... 

Dave Bittner: Yeah. 

Ben Yelin: ...We have this large potential for abuse. So I think we're going to start to see many of these clients of ShotSpotter, which are some of the largest police departments in the country, start to show some skepticism about whether to deploy this technology into the future. 

Ben Yelin: And, of course, the unspoken truth here is there's a massive racial disparity in the use and deployment of ShotSpotter technology. We know based on the data that was obtained by Motherboard that, as you would expect, most of these sensors are placed in heavily minority communities. The - they talked about the city of Chicago, which I know relatively well. There are white enclaves they talk about in the north and northwest of the city, where there are no sensors whatsoever, even though gun crime is spread throughout all areas of the city, not just those areas that happen to be highly populated by minority populations. 

Ben Yelin: And, you know, people who are involved in community relations say, you know, this technology hasn't really done anything for us. You know, maybe the police are responding faster. But you know, we're not - it's not improving overall community-law enforcement relations. They come in with this militarized response. It's not exactly helpful in building relationships with residents. So long story short - seems to be a lot of problems with ShotSpotter and particularly with law enforcement trying to manipulate the data to match their preexisting conclusions. 

Dave Bittner: Yeah. What struck me in this article was - they cite an example from 2016 in Rochester, N.Y., where police stopped the wrong car. And they shot the passenger in the back three times. And so there were a total of four gunshots fired. And ShotSpotter data tracked that there were four gunshots. But the police went back to - the police were claiming that this person in the passenger seat had fired at them and they were responding to that. And they went back to ShotSpotter. And it - the way I read this article is that they went back to ShotSpotter and said, hey, could you find a fifth shot there for us? That would... 

Ben Yelin: Yeah. 

Dave Bittner: ...Be really helpful (laughter). And they did. And it's not just - it's not the only example of that where it seems like, you know, police went back and told ShotSpotter what they were looking for. And then the ShotSpotter folks took another look at the data, over-rid (ph) the algorithm. 

Ben Yelin: In scare quotations - yup. 

Dave Bittner: Yeah, over-rid the algorithms and sort of delivered what their client needed to go - to get their prosecution - right? - to get their conviction, I suppose. 

Ben Yelin: Yeah. And in these circumstances, it seems that the company and the Rochester Police Department - and this is not at all suspicious - lost, deleted and/or destroyed the spool and/or other information containing sounds pertaining to the officer-related - officer-involved shooting. What a coincidence. 

Dave Bittner: (Laughter) They also pointed out that, in this case, the police refused the victim's multiple requests to test his hands and clothing for gunshot residue. In other words, he was claiming that he did not fire a shot. And as, you know, anyone who's watched procedural crime TV shows knows, there's likely residue on someone's hands when they fire a weapon - right? - from the gunpowder. 

Ben Yelin: Yes. 

Dave Bittner: And they refused to test him for that. So a lot of things don't add up here. And it seems to me, at the very least - I guess - look. There could very well be great value in ShotSpotter's technology here. 

Ben Yelin: Right. 

Dave Bittner: But it seems to me like they have to clean up their own house and resist the temptation to give their clients what they want. They have to keep themselves separate from these requests to - hey, take a closer look at this data. And don't you think there was another shot there? 

Ben Yelin: Yeah. 

Dave Bittner: Nudge, nudge, wink, wink. 

Ben Yelin: Take another look. Yeah. I think there are, you know, a couple of root problems here. One is that the data that ShotSpotter has produced to defend its own use among these large police departments hasn't been independently fact-checked. The algorithms that they use haven't been independently evaluated. All of this is just kind of what ShotSpotter has claimed in its own marketing materials. So it just - it hasn't received the type of oversight that would lead you to trust the deployment of this type of technology. And as I said, there's no empirical data that it has a - it puts any downward pressure on gun crimes being committed. Frankly, it just kind of seems like a recipe for abuse and, frankly, something that would foster further distrust between the community and law enforcement. 

Dave Bittner: Yeah. 

Ben Yelin: I think it's one of those things that sounds good in theory. 

Dave Bittner: Right, right. 

Ben Yelin: But now that we're seeing how it actually works in practice, including, you know, the abuse we've seen from law enforcement, it just looks far less appealing in practice. 

Dave Bittner: Mm hmm, yeah - interesting to see how this plays out. All right. Well, again, that is from Motherboard by Vice, written by Todd Feathers. We'll have a link to that in the show notes. My story this week comes from The Washington Post. It's written by Gerrit De Vynck. And it's titled "First Came the Ransomware Attacks, Now Come the Lawsuits." I suppose there's nothing terribly shocking about this course of action here. 

Ben Yelin: The lawsuits always come. Something bad happens. Somebody - yeah. 

Dave Bittner: Yeah. So really, what they're documenting here is that in the aftermath of ransomware attacks - and in this case, they particularly are talking about the Colonial Pipeline attack where the pipeline was shut down and we ended up with some fuel shortages along the southeastern United States. Well, you know, lots of people were affected by that, not the least of which were the owners of gas stations. 

Ben Yelin: Yeah. 

Dave Bittner: They - the pumps - they were out of gas. 

Ben Yelin: They dried up. 

Dave Bittner: So if you go for a couple of weeks with no gas, that, of course, is going to hurt your bottom line, your ability to maintain your business. And so these folks are going after people like Colonial Pipeline and saying, you had insufficient cybersecurity, and so we're going to sue you for that. So I'm curious what you make of this, Ben. I mean, is this - I guess what I'm wondering is, in the world of supply chains, to what degree can I go after my suppliers for not being able to provide me with the things I need to run my business? Shouldn't I have insurance for just this sort of thing? Wouldn't that take up some of the slack? Like, where do things generally fall with these sorts of claims? 

Ben Yelin: So to answer your broader question, of course, you can have a cause of action if your supplier - you know, if there's a breach of contract. If they've guaranteed, you know, in some sort of binding legal agreement, that they're going to supply you with something and they don't supply it, you have a cause of action against them, you know. Or if they, you know - if they cause any sort of substantial injury to either, you know, a business or a customer - whether that injury is physical or financial - that's a tort. So they've committed a legal wrong, and they could be held liable in court for that. So that's true from a broader sense. 

Dave Bittner: Right. 

Ben Yelin: To focus more narrowly on ransomware itself, I think the legal landscape is largely undefined, and that's going to be the issue here. Generally, when we're talking about tort cases, you have to prove that a defendant breached the standard of care. This, you know, has long given law students and attorneys great frustration because it's a relatively malleable standard. 

Dave Bittner: (Laughter) It's a bit fuzzy. 

Ben Yelin: It's a bit fuzzy. Basically, you just have to act in a way that would be reasonable to people who are similarly situated. So, you know, if you, let's say, you know, are negligent in how you set up a building and that building crumbles and, you know, people get hurt or die, they'll look at, you know, comparable building constructions, what the custom is in the industry or, you know, whether - maybe there's a statute that, you know, governs the safety standards for buildings. That would be good evidence as to what the standard of care is and whether that standard has been breached. 

Dave Bittner: You built your house out of straw when everyone else was using bricks. 

Ben Yelin: Yeah, I feel like there was a fairy tale fable about that. But... 

Dave Bittner: (Laughter). 

Ben Yelin: But that's a topic for another podcast. So in the realm of ransomware, it's relatively new in a legal sense - again, the legal world moves very slowly - and we haven't really developed what the standard of care is. And I think that leaves companies pretty vulnerable to lawsuits - you know, at the very least, being forced to settle because a potential plaintiff could say, this is what similarly situated companies did to protect their data. This is what, you know, the industry practices. You didn't conform to that practice. That's pretty compelling evidence that there has been a breach of that duty and, thus, a justiciable legal injury. 

Ben Yelin: We're not at that point yet because, you know, I don't think the legal system has recognized exactly what that standard is. But as this article makes clear, you know, you just kind of need to find the right court and the right circumstances. And if you fail - you know, if you fail to do one thing to protect your data, and that's something that a court is willing to recognize, then you are going to be liable to a whole bunch of people, either customers or other businesses. So, you know, I would expect that some of these lawsuits are going to succeed, especially if you find evidence that they violated custom or, you know, some of the more common industry standards. So, you know, I would not be surprised - not all of these cases are going to be successful, but I would not be surprised if some of them are. 

Dave Bittner: Yeah. This article points out that Chris Krebs, who was formerly the director of the Cybersecurity and Infrastructure Security Agency, CISA, he has made the suggestion that perhaps the government could require a certain level of security from companies that work in critical fields, like utilities. 

Ben Yelin: Right. 

Dave Bittner: So regulation, establishing a minimum standard. 

Ben Yelin: Right. And then that would be the standard of care. And if they breach that standard, then you have, you know, the tort. What the article also makes clear is you can develop pretty stringent standards, and then something happens. You know, the old lady who works in accounting accidentally clicks on the wrong link. And, you know, maybe you had incredibly strict security protocols, but you just got unlucky that one day. 

Dave Bittner: Right. Somebody kicked the plug out of the server that protected the system or something. 

Ben Yelin: Right. You know, I think that might still subject you to legal liability if we're coming up with standards. And maybe that's the best way to resolve this problem. But I can see why it's kind of a confusing area of the law. We're not exactly sure what the standard is. It hasn't really been widely adopted in our court system. 

Dave Bittner: Yeah. 

Ben Yelin: And until there is a standard that's adopted, people are just going to keep throwing lawsuits at the wall because, you know... 

Dave Bittner: Why not? 

Ben Yelin: Yeah. They talk about a gas station here who - you know, their business was basically ruined for a month. You know, they lost out on a lot of revenue because their supply was cut off. You know, that really hurt somebody's bottom line. It would be - you know, it would be equitable for them to be able to seek some type of judicial relief here. 

Dave Bittner: Yeah. 

Ben Yelin: And so I think everybody who has suffered because of a ransomware attack, until we have a recognizable judicial standard, is just going to kind of try to throw things at the wall and see what sticks. So I think we are going to see a growing body of lawsuits in this area, as this article describes. 

Dave Bittner: And, you know, I mentioned insurance as a possible backstop for these sorts of losses. And just unrelated to this article, I was looking at some other coverage this morning, and they were saying that insurance policies are going - have been going up by about 400% lately. Cyber insurance policies, the costs of them - I think they were talking about - there was a city, a municipality in Alaska, that had been paying $50,000 a year for their cyber insurance, and this year, when they went to re-up, there was only one company who was interested in covering them, and the bill went up to $200,000. So we're talking about some - we're talking about real money, but I think we're also talking about - as the insurance companies start to have more data, they're able to price these things more realistically. And... 

Ben Yelin: They are properly evaluating risk... 

Dave Bittner: Right. 

Ben Yelin: ...Saying, yeah, this is becoming more likely. It's not only becoming more likely, but we can see how large the damages are. It can happen to anyone. It can happen to a municipality. It can happen to a federal entity. It can happen to a hospital system. It can happen to a private corporation. 

Dave Bittner: Yeah. 

Ben Yelin: But now we know what the potential effects are. And to cover those costs, yeah, you're going to need to pony up some increased premiums. I mean... 

Dave Bittner: Right. And it's interesting to me that that - I mean, that's another pressure point, right? That's another element to make to - for people to react to, to go back to the other participants in the system and say, OK, listen; I can't afford this cyber insurance, right? (Laughter) So what are we going to do here, gang? You know, whether it's regulations or the federal government stepping up or whatever, you know, it's - I guess what I'm saying is none of this happens in a vacuum. And it's all - every - the little... 

Ben Yelin: It's part of the same ecosystem. 

Dave Bittner: Yeah. The push-and-pull from one part will affect the others. 

Ben Yelin: Yeah. 

Dave Bittner: And as you point out, it's just all relatively new in the greater scheme of things. 

Ben Yelin: Yeah. I mean, I will say, it's not new in terms - like, the ransomware aspect of it is new. 

Dave Bittner: Yeah. 

Ben Yelin: But the interplay between legal liability and insurance - I mean, we can look at legal malpractice, which, you know, obviously I'm more familiar with - not personally... 

Dave Bittner: (Laughter). 

Ben Yelin: ...But having had to study professional responsibility... 

Dave Bittner: Right. 

Ben Yelin: But then something like medical malpractice, like - you know, we have some really complicated laws and regulations that go along with that. It's still a very controversial issue. And that obviously relates to the cost of buying malpractice insurance for providers. 

Dave Bittner: Right. Right. 

Ben Yelin: Because it took a while for that field to develop. I mean, what exactly is the standard of care there? Is there a heightened standard of care for medical professionals? What kind of evidence is going to be used in cases? I mean, that has developed in our common law system, but it has taken a while to develop. 

Dave Bittner: Yeah. 

Ben Yelin: And I think, you know, the insurance industry has moved along with the risk of medical malpractice. And I think that's exactly what's going to happen here as we're talking - as we're dealing with ransomware. 

Dave Bittner: All right. Well, good stuff. And again, we'll have links to all of the stories we covered today in our show notes. We would love to hear from you. If you have a question for us, you can send us an email. It's caveat@thecyberwire.com. You can also call in and leave your question. The number is 410-618-3720. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Luke Tenery. He is a partner at StoneTurn. And our conversation centered on the recent confirmation of Jen Easterly as director of the U.S. Cybersecurity and Infrastructure Security Agency. Here's my conversation with Luke Tenery. 

Luke Tenery: Yeah, she served previously in the government. She's a West Point grad, very well-educated in that sense, and spent some time in the private sector, also, at a financial institution. I believe it was Morgan Stanley. And I think what's interesting about her background is sort of this convergence of government and private sector. Increasingly, that's highly relevant and of interest to our national security. Also, interestingly, her role at Morgan Stanley - she worked in a role that uniquely situated her to bring different sorts of operational units, or sources of information, together to kind of support cyberdefense. She led Morgan Stanley's fusion resilience center. And generally, what a fusion center means is organizations - what are typically silos - coming together to share information for the greater good or for defense. And that's very much what CISA is designed to do. It's a collection of resources, in many ways, or a collection of information, or an agenda, to support different areas of government and critical infrastructure. And so in many cases, it's probably a job that she's very well prepared for. 

Dave Bittner: Yeah, and it seems that her nomination was not controversial. She received praise from both Congress and the media alike. Where do you expect we're going to see her taking the organization? Any clues from her past? 

Luke Tenery: Yes, I think it's very much - you know, her experience aligns with the spirit and mission of CISA. So I think, firstly, there's an opportunity for her. Her hire tells me that the government is looking to share important cyberdefense and intelligence information better, both intergovernment but then also with enhanced fidelity, perhaps, with the private sector. With her background, particularly in the financial services arena, she'll carry some level of bona fides and trust as someone who's not just a former official with only government experience and know-how around what's oftentimes the bureaucracy of government, but who likely knows how to get things done in the private sector as well - what information is valuable to the private sector from a cyberdefense perspective, what information is going to be valuable to the government coming, potentially, from the private sector as well. I think that's an important thing to note. The government is interested in, you know, getting visibility into the, you know, overall security of the private sector. And she likely will support that. 

Luke Tenery: And I think CISA - another critical role is, you know, it plays sort of a support and a functional role in helping different aspects of critical infrastructure respond to cyberattacks as well. A perfect example is SolarWinds. Much discussed at this point, and not much new to say about it, but they played a critical role in getting what we would call the TTPs - tactics, techniques and procedures - or indicators of compromise of that attack out to the public. CISA had a critical role in that. And she likely will bolster that with her resiliency background at Morgan Stanley. She'll know very well what it's like to have a high-performing and global resiliency or response function. And so hopefully that means CISA's looking to kind of optimize that even better as it spans, you know, public, private and supporting critical infrastructure. 
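
As a concrete, purely hypothetical illustration of what getting those IOCs out to the public lets defenders do: a common pattern is to compare published indicators - file hashes, domains, IP addresses - against what shows up in an organization's own logs. The indicator values, hostnames and field names below are invented for the sketch and are not tied to any particular CISA advisory format.

```python
# Hypothetical sketch of checking local observations against published
# indicators of compromise (IOCs). All hashes, domains, and log entries
# below are invented; this is not tied to any real CISA advisory format.
published_iocs = {
    "sha256": {"3b1d0f...aaaa", "9f3e7c...bbbb"},        # placeholder hashes
    "domain": {"bad-c2.example", "update-cdn.example"},  # placeholder domains
}

observed_events = [
    {"type": "sha256", "value": "9f3e7c...bbbb", "host": "build-server-01"},
    {"type": "domain", "value": "vendor-updates.example", "host": "workstation-07"},
    {"type": "domain", "value": "bad-c2.example", "host": "workstation-12"},
]

def find_matches(events, iocs):
    """Return the events whose value appears in the published IOC set for its type."""
    return [e for e in events if e["value"] in iocs.get(e["type"], set())]

for hit in find_matches(observed_events, published_iocs):
    print(f"IOC match on {hit['host']}: {hit['type']} = {hit['value']}")
```

In practice, organizations usually feed advisories like CISA's into a SIEM or threat intelligence platform rather than a one-off script, but the matching idea is the same.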

Dave Bittner: Do you think it's noteworthy that she came back from the private sector and decided to serve the nation in this role? 

Luke Tenery: I absolutely do. I think just in general, private sector organizations that are very interested in obtaining high-quality or reliable information from the government and how they can better protect themselves, better defend, understand the landscape of how resources and, candidly, the need to support the bottom line - you know, organizations are very interested in having, you know, almost an advocate or at least someone who understands the challenges of a private organization on the government side. And - but also that - you know, the differences between the types of information that they'll respectively have. You know, she'll likely have some level of clearance and access to intelligence and information requiring clearances that the private sector doesn't. But she'll have a good sense of, you know, how to properly navigate that while still, you know, being a helpful resource to private organizations that are supporting critical infrastructure. So I think that understanding will be really important, so definitely very notable in that sense to, you know, gain trust, likely more with private industry as well. 

Dave Bittner: Yeah, it strikes me that, you know, it could be her ability to switch languages, to be able to know the lingo from both sides, having spent time on both sides. That's going to be a real benefit for her. 

Luke Tenery: Absolutely. There's definitely a lexicon of risk management that one learns in private industry on, you know, understanding the critical assets of a private organization. Private organizations have to make decisions every day in cybersecurity based upon trade-offs and the resources they have available and the level of risk appetite that that organization has. And so, you know, they don't - most private organizations don't have unlimited resources to always invest in cybersecurity. So I think she understands that trade-off. She has likely communicated that in a variety of her roles in an upward way to executive leadership. That will be important for her, too, in her role with other sort of executive leaders or heads - department heads, et cetera, throughout the government as well. So that will serve her well. The lexicon and vernacular that she picked up from the private sector most certainly will benefit her as she kind of works, you know, supporting critical infrastructure by integrating with the government in her role as well as the private sector and supporting their coverage of critical infrastructure. 

Dave Bittner: You know, as you and I are recording this, just this morning, President Biden issued the National Security Memorandum on improving cybersecurity for critical infrastructure control systems. It seems that, if anything, this focus on cybersecurity is accelerating. And I think it enjoys a status as one of the few things in government that really has good faith, bipartisan support. You know, it's non-controversial to be in support of these things. What do you think that means for the role of CISA itself as we go forward, the role it will play within government and the importance it will take on this global stage? 

Luke Tenery: I think it's increasingly notable on a variety of levels, you know, especially as we are reminded with the Colonial Pipeline attack. You know, many people were affected firsthand, you know, by that cyberattack. That was an attack upon critical infrastructure in the energy supply chain, essentially. So I think these issues have kind of hit firsthand on both sides of the aisle. And I think there'll be dynamics of CISA's role there to improve and/or further bolster security and focus around critical infrastructure. I think, also, it probably is sort of a broken record at this point. But - and I weighed in on this in other kind of media opportunities - but the government is behaving like an organization that has been recently and deeply compromised from a cybersecurity perspective. 

Luke Tenery: Make no mistake about it. The SolarWinds attack is still the watershed event that is driving change. The Colonial Pipeline event was just sort of a further driver to cause action in that sense. This is an administration that realizes that, you know, they've been cut pretty deeply, and they're regaining footing from a visibility standpoint, calling into action additional resources. I think this NIST announcement today - and similar to the executive order in the last couple of weeks - is a call to, you know, force different entities into adhering to certain cybersecurity best practices. And hopefully it also means - as is often the issue generally - that more resources and funding are coming to support these efforts. But generally, critical infrastructure is absolutely a focus. Colonial Pipeline was, you know, a significant sting in that respect. And the government is looking to, you know, further bolster the resources there. 

Dave Bittner: Is it your sense that CISA is getting the funds that it needs, the resources it needs? 

Luke Tenery: I think it's certainly getting focus. And I can say, mostly without knowing or deeply looking into, you know, the financials that are public on CISA, it's clear - in my experience, most notably in responding to a variety of global cybersecurity incidents - that the fidelity of information and the value of it has gone up. The reliability of their service has gone up. I was actually very pleased with the near real-time information provided around SolarWinds and other more recent cyberattacks. It allowed organizations to have reliable information on how to protect and mitigate some of the issues. So from my standpoint, combined with the investments in highly qualified people, they are making the right - some of the right choices to date, at least optically, with some of their appointments. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: I mean, she seems pretty good, right? I can't - I think you can't really... 

Dave Bittner: (Laughter) So I think I mentioned in the interview - it's hard to imagine someone better qualified for this job than Jen Easterly. 

Ben Yelin: She's absolutely qualified. And I think one of the points that was made in the interview is that having experience in the private sector is so crucial, you know, because if you're just a government bureaucrat who hasn't actually been in the real world and private organizations... 

Dave Bittner: Right. 

Ben Yelin: ...In these types of consortiums where you have to get a bunch of different stakeholders at the same table, you just don't carry that same level of experience. So I think her nomination was obviously very well-received. 

Dave Bittner: Yeah. 

Ben Yelin: You know, I don't think you can evaluate somebody until the rubber starts to hit the road, and we'll see what happens with CISA over the next several years. But I certainly feel better about her nomination and confirmation after having listened to that interview. 

Dave Bittner: Yeah. Yeah, absolutely. All right. Well, our thanks to Luke Tenery for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the start-up studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.