Caveat 4.7.22
Ep 120 | 4.7.22

Digging deeper into the Civil Cyber Fraud Initiative.


Matt Malarkey: And what we've seen over the decades is that self-attestation has not been working. And so the government is now sort of, I guess, flexing its muscle and sharpening its teeth. And so the introduction of the CCFI is really a reflection of that.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben has the story about fraudulently collected metadata. I've got the story of a class-action suit against SolarWinds. And later in the show, my conversation with Matt Malarkey of Titania. We're sharing insights on the Civil Cyber Fraud Initiative. 

Dave Bittner: While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Let's get things rolling here. Why don't you start things off for us? 

Ben Yelin: So my article this week is a big-picture article. It comes from The Guardian by Johana Bhuiyan. And I like to think of this article as kind of a dissertation or thesis for our podcast. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: Not that our podcast is ending. It will go on in perpetuity. 

Dave Bittner: Right. 

Ben Yelin: But this article was a good hook for sort of all of the things we've talked about as it comes to electronic surveillance and how tech companies and the government can get a hold of your data. 

Dave Bittner: OK. 

Ben Yelin: So the hook for this article - which is entitled "How Can U.S. Law Enforcement Agencies Access Your Data? Let's Count the Ways" - is something that I know you discussed on the CyberWire podcast. There was a hack of Apple and Meta, the parent company of Facebook, where hackers obtained information on users of those services by forging an emergency legal request. That is a mechanism where law enforcement agencies can compel tech companies to hand over location and subscriber information. It's written into our statutes. It's supposed to be used in the case of a life-threatening emergency, something that doesn't leave time to go through your standard judicial process. 

Dave Bittner: Right. 

Ben Yelin: Of course, this was completely fraudulent. There was no such emergency. And these hackers were still able to obtain individual user data. 

Dave Bittner: So let me just back up here for a second. So we have hacker bad guys. 

Ben Yelin: Yes. 

Dave Bittner: They go to a company like Apple or Meta, and pretending to be law enforcement, they say there's an emergency here, and because of that emergency, we need this information. And the big tech companies, believing that they are actually talking with legitimate law enforcement, turn it over. 

Ben Yelin: Exactly. 

Dave Bittner: OK. 

Ben Yelin: So if you are Apple or Meta, you are generally going to be inclined to comply with such a request, largely because it is an emergency. Each of these companies has its own standards for determining whether the request actually constitutes an emergency. For the most part, from their perspective, if they actually believe it's an emergency, they are not going to contest it in a court of law, because it would be bad publicity if there actually were an emergency. If law enforcement needed somebody's location data because that person was a fleeing suspect who was out to commit more crimes and victimize more individuals, and it was Apple or Meta that withheld crucial data, that would be very bad publicity for these companies. 

Dave Bittner: Sure. 

Ben Yelin: But there is this legal mechanism that allows this type of emergency warrantless collection. Not only does it not require a warrant, it doesn't require a subpoena, and it doesn't really require any sort of judicial sign-off. That led to The Guardian exploring all the other ways in which the government and private entities can collect a person's data. And they went through a list of them that I thought was very compelling and all-encompassing. So they started with law enforcement accessing your physical device. They can get a subpoena to access your device. They do need a warrant, because of Riley v. California, to unlock your phone and access the contents on your device, but they still can do that. There are instances, when we're talking about border searches, where appeals courts have ruled that even without any individualized suspicion, the government can unlock your phone and view your data, your content. 

Ben Yelin: There are all different types of law enforcement requests for different types of metadata, whether that's from ICE - Immigration and Customs Enforcement - the FBI or other law enforcement agencies. There are different mechanisms by which law enforcement can obtain information through a bunch of different statutes. So there are administrative subpoenas and warrants, and many of these requests come with gag orders, particularly those relating to national security, when we're talking about something like national security letters. That means the company that receives the request has to hand over the information, and they are not allowed to discuss even the fact that they received the request with anybody, except perhaps their own attorneys. They talk about geofence warrants, which we've talked about a number of times, where law enforcement seeks device information on all the users in a certain place at a particular time, or - something I don't think we've really talked about - a keyword search warrant, which is not individualized. It allows law enforcement to access the information of any person who searched for certain terms or keywords on a search engine within a certain time period. 

Dave Bittner: (Laughter) Right. How to hide a body. 

Ben Yelin: Yeah, exactly. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: That would be the most obvious example. 

Dave Bittner: Right. 

Ben Yelin: But there are all types of examples that aren't quite as obvious... 

Dave Bittner: Yeah. 

Ben Yelin: ...Where it's something like, a bank was robbed, and you use a keyword search warrant to find people who searched for directions to the bank that was robbed. 

Dave Bittner: I see. 

Ben Yelin: And that's kind of a summary of what goes on in the public sector from the government. But then there are data brokers. So we have this entire private sector industry of collecting data. They describe it in this article as a shadowy network of data brokers that operate under the radar but have easy access to our data, such as our location and purchase history. That's very valuable data. They sell it to entities to help target advertising at us. They collect that personal data from our social media profiles, public records and applications. And most of the time, the users themselves don't really know whether a data broker has been collecting information on them. And, you know, that's obviously something that we've talked about on our podcast. 

Dave Bittner: Yeah. 

Ben Yelin: They talk about surveillance tech companies, like Clearview AI. They scrape information from the internet and from social media sites and feed it into algorithms. Sometimes they sell their services to law enforcement, so law enforcement can identify people based on the scraping done by companies like Clearview AI. Amazon's smart doorbell, Ring, is another example. They give special access to law enforcement so that police can monitor and request Ring footage from consumers. I think... 

Dave Bittner: Right. 

Ben Yelin: ...We covered that story. 

Dave Bittner: The panopticon. 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: And then the final thing they talk about here is data sharing - the fact that everything that's collected at the local level, at the state level and at the federal level is shared among each tier of government. So if a local DMV collects information - obviously, they've obtained your driver's license picture. 


Dave Bittner: Right. 

Ben Yelin: But that's something that they might share with the FBI as part of a broader federal investigation. If there is a state prosecution of a crime and somebody has been targeted by some type of FBI surveillance, DHS surveillance or CBP - Customs and Border Protection - surveillance, that can be shared with state entities. And there are data-sharing services from companies like Palantir, which we've also talked about, that create a centralized network of digital records where law enforcement can potentially identify chronic offenders - people who are frequently the target of law enforcement investigations. They are people of interest. And law enforcement partners at all levels, from local police departments to the FBI, can access their information. I guess the point is somebody is always watching. If you think your data is not being collected or obtained, then you're not being creative enough in your thinking. 

Dave Bittner: (Laughter). 

Ben Yelin: Because as we've seen - and this article tells us - there are so many different ways in our legal system and in the practices of our technology companies that can lead to your data getting collected. So I think it was just a really interesting big-picture reminder of what our government and our private industry are capable of doing with our private information. 

Dave Bittner: You know, swinging back around to the original topic here, which is this, you know, Apple and Meta giving away information under false pretenses - you know, them being fooled by fraudsters - do you know, is there any, I don't know, retrospective auditing of this? Does anybody go back after the fact? You know, if law enforcement says, we have an emergency request, is anybody going back and looking at those after the fact to see if they were indeed emergencies? 

Ben Yelin: I think it happens at the individual company level. They have their own internal audits, and I think that probably led to them discovering this in the first place. So there is sort of some internal practice, but there's not really much that law enforcement itself does, nor does the federal government, to my knowledge, really keep track of how often this is happening. I think one of the reasons the story resonated is because the federal government was not aware in time that these hackers were using the power of these emergency requests to go to companies like Meta and Google. So I think all the enforcement mechanisms are internal here. But maybe the government itself, perhaps through the inspectors general of these various agencies - it might behoove them to look through all of these emergency requests to at least try to identify how many of them were legitimate. 

Ben Yelin: Now, the problem with that is you'd have to go through the emergency requests allegedly submitted by every law enforcement agency, whether that's local, state or federal. And so some of that is not under federal jurisdiction. So even if you had the best inspectors general in the world, that's only part of the puzzle. Really, it does depend on the companies themselves. They're the ones who receive these requests. And from their perspective, you know, I don't think it's necessarily in their short-term interest to go under the hood. They don't want to reveal that they've been completely had by hackers on the internet. 

Dave Bittner: Yeah. 

Ben Yelin: I don't think that speaks well of their own security protocols. 

Dave Bittner: I'm just trying to - I mean, I guess I can imagine a defense attorney saying that, you know, the information that was gathered to convict my client was done so under the pretense that there were emergency circumstances. And clearly, Your Honor, ladies and gentlemen of the jury, that was not the case. 

Ben Yelin: Now, that would certainly be a wise strategy for a defense attorney. 

Dave Bittner: Yeah. 

Ben Yelin: Unfortunately for your hypothetical defense attorney, in the vast majority of cases, these are legitimate requests. 

Dave Bittner: Yeah. 

Ben Yelin: And unless there was a story like this, you wouldn't necessarily have any indication that the emergency request was fraudulent. If you did, that would be a great way to get your case thrown out in court. 

Dave Bittner: (Laughter). 

Ben Yelin: Because everything that comes after that would be fruit of the poisonous tree, meaning that evidence would have to be suppressed. 

Dave Bittner: See; I missed my calling, Ben. This is just the kind of creativity that our legal system needs to throw sand in the gears, right? (Laughter). 

Ben Yelin: I know. I mean, the problem is all of the best defense attorneys either work for clients with a lot of money or are in fictional TV shows. 

Dave Bittner: I see (laughter). Right. So they're created by writers who have that creative mindset. 

Ben Yelin: Yeah, we're - if you talk to most defense attorneys, they're just trying to make it through their day... 

Dave Bittner: Yeah. 

Ben Yelin: ...Do their best on behalf of their clients... 

Dave Bittner: Sure (laughter). Sure. 

Ben Yelin: ...Especially public defenders. I mean, you have hundreds of cases... 

Dave Bittner: Yeah. 

Ben Yelin: ...In your portfolio. And, you know, ain't nobody got time for that. 

Dave Bittner: Yeah. Yeah. No, I don't mean to make light of it either, but it's interesting to think about. 

Dave Bittner: All right. Well, my story this week comes from SC Magazine. This is an article written by Derek B. Johnson, and it's titled "Court Denies SolarWinds Bid to Throw Out Breach Lawsuit." Basically, a Texas judge has dismissed claims that the former SolarWinds CEO, Kevin Thompson, was personally liable for deceiving investors about the company's cybersecurity but, beyond that, has allowed a class-action lawsuit filed against the company, its executives and investors to proceed. And this is all in the wake of the 2020 Orion breach. They've also consolidated a bunch of different class-action suits together to make one big one, I suppose. What really attracts my attention here, Ben - and I'm curious for your take on this - is the personal responsibility of people like the chief information security officer or the chief security officer, these folks being on the hook for liability because they did not have proper cybersecurity measures in place. 

Ben Yelin: Yeah, I mean, personal liability in cases like this, when we're talking about defrauding one's investors, comes down to whether you had actual knowledge that what you were saying was wrong. So it seems like from what this Texas judge is saying, the CEO, Kevin Thompson, had general knowledge of SolarWinds' cybersecurity practices but wasn't intimately involved in day-to-day protocols, you know, recognizing their vulnerabilities. He was an overseer, so he didn't have intimate knowledge with what was going on behind the scenes. 

Ben Yelin: The chief information security officer is obviously privy to all of that information. So when he makes representations to investors, he knows or, at least in a legal sense, should have known that these risks existed. Now, we are not at the point in this case where a court has definitively determined the legal liability of anybody here. This was a preliminary hearing on a motion to dismiss. So they dismissed the claim against the CEO. As it relates to the CISO and everybody else involved, they are simply allowing the case to go forward. And what's going to come out in court is whether there's enough evidence to show that these individuals - the CISO and others, people who are involved in the security operations - knowingly misled their investors. And it's going to be up to the finder of fact here - the jury, or the judge if it's a bench trial - to determine how much these individuals knew about their own security vulnerabilities; how much of those vulnerabilities were foreseeable; how much of the hack was a matter of circumstance or bad luck; and how much was negligence. 

Dave Bittner: Right. 

Ben Yelin: But I think the broader lesson here is you can be personally liable if you have actual knowledge of your own vulnerabilities and you have misrepresented your own organization's cybersecurity posture. That means you could be subject to liability. It's still a relatively high standard. There's going to have to be a lot of evidence that comes out in discovery to establish proof. But at least, you know, this case and others have shown us that just because a person has been individually named doesn't mean they are beyond the reach of legal liability. 

Dave Bittner: Mmm hmm. Yeah, I mean, the argument that the plaintiffs are making here is that, for cost-cutting reasons, they didn't have adequate cybersecurity measures in place, that they didn't want to spend the money on it because ultimately, you know, at some point, they would want to sell the company, so they would have, you know, low expenses, which would make the company more valuable, and so on and so forth. There's a part of me that can't help wondering, like, suppose the breach had never happened, right? 

Ben Yelin: Mmm hmm. 

Dave Bittner: Time passes. They go to sell the company, right? Because they have not spent as much as they otherwise would have on cybersecurity, the company is indeed more valuable. All of these shareholders make more money. Everybody's happy. 

Ben Yelin: Yeah. 

Dave Bittner: Right (laughter)? 

Ben Yelin: I mean, that's the dream world here that everybody was foreseeing. 

Dave Bittner: Right. Right. 

Ben Yelin: Unfortunately, the dream reality never really comes into practice. 

Dave Bittner: Right. 

Ben Yelin: And bad things do happen if you have vulnerabilities in your system. 

Dave Bittner: Yeah. I mean, it could've been a hurricane - right? - and they didn't have adequate insurance. It could've - you know, any number of things that they should've insured themselves against or put in place, you know, protections against. This happened to be cybersecurity. 

Ben Yelin: Right. You know, I think the key here is that they held themselves out as authorities on SolarWinds' cybersecurity measures, meaning it's not just that they tried to cut corners and keep costs low to the benefit of themselves and their investors. It's that they assured their investors that they had taken proper cybersecurity measures to protect their data. 

Dave Bittner: I see. 

Ben Yelin: Again, we don't know what the evidence is going to tell us as to whether that's the case. 

Dave Bittner: Right. That's what the plaintiffs are alleging. 

Ben Yelin: That's what the plaintiffs are alleging. 

Dave Bittner: Yeah. 

Ben Yelin: But we do know that you can be held liable if you are in charge of cybersecurity for your organization and you knowingly make false representations to your investors. And I think that's kind of the broader lesson that comes from this case. 

Dave Bittner: Mmm hmm. So if I'm a chief security officer, what does this mean for me? 

Ben Yelin: If you're a chief security officer, that means a couple of things. It means you should not make misrepresentations to your investors about your own cybersecurity practices. It means you should always tell the truth to the best of your knowledge in any public proceeding. You should also do that in any private proceeding, but there are fewer legal consequences... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...If that's not on the record. 

Dave Bittner: Yeah. 

Ben Yelin: And it means that even if it doesn't, in the short term, benefit your organization's bottom line to take robust cybersecurity measures to protect your data, the consequences of not doing so can be severe for the company and can subject you, as a chief information security officer, to personal liability. 

Dave Bittner: Mmm hmm. 

Ben Yelin: So you can get caught. It's not going to happen very frequently. But it's something where you can be held liable in a court of law. So tell the truth about your own cybersecurity practices. Don't mislead your investors. And if you want to avoid any type of liability, then try to prevent your company's information from being stolen by bad cyber actors. 

Dave Bittner: Yeah, yeah. Yeah, lots of things to go into that whole risk analysis, right? 

Ben Yelin: Right, and these are not easy decisions. 

Dave Bittner: Yeah. 

Ben Yelin: It's very expensive to take the types of measures that would prevent a SolarWinds-type incident from happening in the first place. All of us want to cut corners. I mean, that's why we drive five miles an hour above the speed limit. 

Dave Bittner: Yeah. 

Ben Yelin: We think we're not going to get caught. We'd like to get somewhere faster. But life has a way of catching up with you. And in these types of circumstances, where the worst does happen and the cyberattack has caused a lot of pain and hardship for the victims, somebody's got to be held legally liable, at least in theory. 

Dave Bittner: Yeah. 

Ben Yelin: And here, that so far seems to be falling on those who have intimate knowledge of the cybersecurity practices of SolarWinds. 

Dave Bittner: Yeah. Lots of security executives watching this case closely, I would imagine. 

Ben Yelin: Yeah, probably with a tinge of fear in seeing that one of their compatriots here has potentially a legal problem on their hands. 

Dave Bittner: Yeah. All right. Well, again, that is from SC magazine. We will have a link to that and both of our stories in the show notes. We would love to hear from you. If you have a question for us or a topic you'd like for us to cover, you can email us. It's 

Dave Bittner: Ben, I recently had the pleasure of speaking with Matt Malarkey. He's from a company called Titania. And we were talking about the Civil Cyber Fraud Initiative, some efforts going forward in the federal government. Here's my conversation with Matt Malarkey. 

Matt Malarkey: I think in order to sort of better understand what the Civil Cyber Fraud Initiative is, we kind of need to take a step back and look at how we got here. And I think the first thing to understand is that we've seen consecutive government administrations paying more attention to cybersecurity and seeing it as more of a priority, both within the government itself but also within its supply chain. Historically, the approach has been to kind of allow contractors in the U.S. government supply chain to self-attest to their compliance, and what we've seen over the decades is that self-attestation has not been working. And so the government is now sort of, I guess, flexing its muscle and sharpening its teeth. And the introduction of the CCFI is really a reflection of that. 

Dave Bittner: And so can you give us some insights here? What exactly is included in it? 

Matt Malarkey: So through the CCFI, what the DOJ is looking to do is it's, firstly, looking to build resiliency across the U.S. government as well as the private sector. It's looking to hold government contractors and grantees who are doing business with the government accountable to their cybersecurity requirements. And then it's also trying to ensure that those companies and contractors that are adhering to their requirements are not penalized or punished for doing so, so that the playing field is level. 

Dave Bittner: And so what kind of mechanisms will they have at their disposal here to ensure that what they're looking for actually comes to pass? 

Matt Malarkey: Yes, that's a good question. So what they're doing is they're basically enabling the DOJ, in collaboration with other government departments, to start launching, I guess, cases or lawsuits against - in this case, particularly - the private sector, to determine whether or not those contractors have been deficient in their cybersecurity services or products, whether they've been misrepresenting their compliance with their cybersecurity requirements, and/or whether they've been failing to monitor or report any breaches that they've experienced. 

Dave Bittner: And what kind of timeline are we on here for something like this going live? 

Matt Malarkey: Oh, it's already gone live. So it was launched back in October, I believe. And even just recently, I think a couple of weeks ago, there was a case in California - in a federal court in California - where there was a major ruling in favor of the government against a large defense contractor. And that came about as a result of a whistleblower who had alerted the government to the fact that their employer had actually not been complying with their cybersecurity requirements and was knowingly aware of that and yet had continued to do services or perform services for the government. 

Dave Bittner: What sort of recommendations are you and your colleagues making to the folks that you work with in terms of organizations making sure that they are in compliance here? 

Matt Malarkey: So I think the first thing is that any organizations doing business with the U.S. government, the first thing that they need to do is they need to review and understand - and I stress understand - their current contractual requirements. So they need to look back through their contracts and understand, you know, what FAR clauses are included, what DFARS clauses are included and understand what the ramifications of that are. They then need to understand and they need to be prepared to adapt and update their cybersecurity efforts as those requirements change or as new federal requirements are issued. That would be the first thing. 

Matt Malarkey: The second thing is that they need to ensure that their compliance with those requirements is not a static effort. I recall some of the other guests that you've had on this podcast have mentioned the fact that, you know, compliance should be perceived as an ongoing process. It should not be just a one-off thing and one-off tick-box exercise. And so organizations should be performing gap analysis and compliance assessments against their requirements on an ongoing basis. And then finally, I would say that they need to be documenting all of their efforts. They need to have evidence such that they can use this to support assurance claims, but also to help them better understand where they actually need to, I guess, provide or dedicate remediation efforts. 

Dave Bittner: And I suppose it's safe to say that this is the shape of things to come going forward here. I mean, this sort of scrutiny is what we can expect from the federal government from here on out. 

Matt Malarkey: I think so. I think that you're absolutely right, that there's just going to be continued, increased scrutiny. The DOD released a report, I think, just last week where the DOD inspector general stated that cybersecurity within the defense industrial base is being applied inconsistently and that that simply cannot be tolerated. So we're going to see greater focus on cybersecurity within the supply chain. And the DOD has actually been quite forward-leaning in this area through the introduction of the Cybersecurity Maturity Model Certification program. And I recognize that the Biden administration performed a review of that program and has since suggested some updates, which I think have been generally welcomed. But this is going to start putting the onus on government contractors to proactively assess their compliance with their known requirements, remediate any gaps, and perform this as an ongoing process. 

Dave Bittner: In terms of the tone between the contractors and the Department of Defense, has the - has it been collaborative? Has it been adversarial? How are the two sides coming at this? 

Matt Malarkey: By and large, I think it initially kind of rocked the defense industrial base when it became clear that, as part of the CMMC program, all defense contractors who handle federal contract information and controlled unclassified information would be subject to mandated third-party audits. That did not go down very well within the defense industrial base. But I think that there's been a recognition that more needs to be done. And it's been interesting sort of seeing and hearing much of the discourse within the DIB, where some elements are absolutely in favor of increased scrutiny on the supply chain, because that makes, I guess, in a sense, that supply chain stronger. But then there are others who have been saying, well, actually, at what cost? - the idea being that this might actually put many small businesses out of business. So it's been interesting, you know, sort of seeing the discourse and seeing how these differing elements have been engaging with each other. 

Dave Bittner: To what degree has this been retroactive? In other words, I can see if I'm a contractor and I am bidding on a project, you know, one that has not been awarded yet, that I can work all of these things into my proposal. Is this being applied to jobs that have already been won? And I guess what I'm getting at is, has this been put on the contractors as kind of an unfunded mandate? 

Matt Malarkey: No, sir. So it doesn't apply retroactively. It will only apply to new contracts or to contracts that are up for renewal, and any new requirement will be inserted into the contract between the government and the contractor. But it will not apply retroactively. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: I found this interview to actually be somewhat promising. I mean, I think the federal government is taking a more proactive role in trying to ensure cybersecurity best practices among companies that contract with the federal government. 

Dave Bittner: Yeah. 

Ben Yelin: And so I find that encouraging. I think the question as it was indicated in the interview is, how strong are the enforcement mechanisms? 

Dave Bittner: Mmm hmm. 

Ben Yelin: But I think the overall trend of trying to ensure that those who contract with the federal government are not subjecting the general public to significant risk is a promising trend. 

Dave Bittner: Yeah. Yeah, absolutely. It's interesting to see, as you say, some of these standards being raised, you know, by necessity (laughter)... 

Ben Yelin: Right. 

Dave Bittner: ...And proactively, you know? You and I talk often, I guess, about how slowly the wheels turn, particularly at the federal level. So as you say, it's encouraging to see some activity, dare I say action, happening at this level. 

Ben Yelin: Yeah, and it's born out of necessity. It's not just the general risk of cyber incidents. And obviously, now with some of the geopolitical events going on, it's the risk of nation states committing cyberattacks. 

Dave Bittner: Yeah. 

Ben Yelin: I think that threat is very real. The first target of a foreign adversary is going to be our government. The government itself might be difficult to attack. They might have proper security protocols. So the next best thing is those who contract with the federal government and might have access to critical data. And so I think given what's going on in the world, it's wise that we are taking these proactive steps. 

Dave Bittner: Yeah. All right. Well, again, our thanks to Matt Malarkey from Titania for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The Caveat podcast is proudly produced in Maryland at the startup studios of DataTribe where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.