Caveat 5.6.20
Ep 27 | 5.6.20

If there's cyberwar, are there cyber war crimes?


Tarah Wheeler: I tend to define cyberwar as kinetic war carried on by plausibly deniable means. 

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I've got the story of a U.S. appeals court asking why a Facebook encryption order should remain sealed, Ben takes a look at the effects of GDPR and, later in the show, my conversation with Tarah Wheeler. She's a cybersecurity policy fellow at New America, and she is also a well-known and respected author and speaker on cybersecurity topics. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? It's true, too, having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They are out-of-the-check-box kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk. And trust us, friend - they're not pretty. So, again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question, either. 

Dave Bittner: And we are back. Ben, before we get into our stories this week, you've got a little bit of follow-up you wanted to share. 

Ben Yelin: Sure. So we discussed on an episode probably several months ago now about a case making its way up to the Supreme Court coming from the state of Georgia. So Georgia publishes a public version of its state code - basically, all of its state statutes - online. There is another version which includes both the code and legal annotations. So that includes things like notes on relevant court cases. That is owned by a private company, LexisNexis, and LexisNexis has tried to assert an intellectual property right in that annotated code. And the case made it all the way up to the Supreme Court. And this past week, the Supreme Court came down with a 5-4 decision saying that there can be no intellectual property interest in the law. 

Ben Yelin: There is something called the government edicts doctrine, which emanates from a 19th century case. Basically, the court held in that case that no reporter, no media entity nor anybody else could have a copyright claim in a court's opinion. And that has been extended in previous cases and in this case to laws drafted by the state Legislature. Basically, the principle behind this doctrine is nobody can own the law. The law belongs to the people. So this decision came out this week. 

Ben Yelin: I'll note it was a very interesting cross-ideological coalition. Chief Justice Roberts authored the opinion. He was joined by a couple of the conservatives, Justices Gorsuch and Kavanaugh, and by a couple of the liberals, Justices Sotomayor and Kagan. So it's always interesting when you don't have a neat ideological opinion. But the upshot of this is you cannot have a copyright claim in a state code or a state annotated code, and the law does not belong to anybody but us, the people who are subjected to it. 

Dave Bittner: All right. To me, that sounds like a good thing. 

Ben Yelin: Absolutely. Very good, sensible opinion in my view as well. And if you have a chance to - obviously, you probably don't want to read the whole case. But if you have a chance to at least read the syllabus of the case, it will give you a very interesting brief history of the case and the rationale for the decision. 

Dave Bittner: Yeah, I'm curious what the dissenting opinions might be. Like, it's hard for me to imagine a counterargument here. But I suppose that's (laughter) the justices' specialty, right? 

Ben Yelin: Yeah, and there are actually two separate dissenting opinions. They dissented for different reasons, and I won't go into those different reasons - but definitely encourage people to check that out. 

Dave Bittner: Well, let's get into our stories this week. Why don't you kick things off for us? 

Ben Yelin: Sure. So my story comes from The New York Times. It's entitled "Europe's Privacy Law Hasn't Shown Its Teeth, Frustrating Advocates." So we're nearly two years after the enactment of GDPR, the General Data Protection Regulation, in the European Union. And there has been frustratingly little progress in the minds of privacy advocates coming from this very promising, first-of-its-kind law. And there are a number of problems that have plagued the law. So far, only one company has been subjected to fines under the law. It was a $50 million fine levied against Google, which - that's chump change. That's basically what Google makes in a day. 

Dave Bittner: (Laughter). 

Ben Yelin: So that's not much of a disincentive to that company. None of the other big tech companies have faced fines so far. They've been issued warnings. There have been various GDPR-led inquiries against some of their practices, but there have not been enforcement actions. This article mentions a bunch of reasons why that's the case. One is the GDPR allows companies to ask any questions they want about enforcement actions and to get sort of advisory opinions on those questions, and companies seem to have used that provision of the law as a way to sort of kick the can down the road and kind of try to hope the regulators forget about the original problem and move on to something else. You can kind of bore them with a series of questions. 

Ben Yelin: Another huge problem has been the lack of resources. Many European countries have devoted a minuscule amount of their budget to enforcement of the GDPR. There's just a very small number of investigators. Even in countries where a lot of these companies are legally headquartered, which is Ireland, you've just seen very minimal budgetary resources devoted to the GDPR and very few investigators working on fines and claims. What advocates of the GDPR will say is it's still very early. The law is in its infancy. We're going to start to see enforcement actions pick up as investigators are able to do additional work and as more regulators are hired. 

Ben Yelin: But as we start to enter, potentially, a major global recession, which is going to have a big impact on government budgets, it's hard for me to see how there are going to be more investigators once countries are forced to sort of tighten their government belts - especially the countries in the European Union, which is really all of them here, since they don't have the authority to print their own money to get themselves out of a recession. 

Ben Yelin: Couple of broader lessons here - this was supposed to be the model piece of legislation. Companies would finally be held accountable for their data practices, protecting personal information, giving the public a right to their private information. But it's sort of - there's a difference between what the law was supposed to have done in theory and what's actually happened in practice. And I think we'll just have to keep an eye on that, especially as we look toward other entities that have passed similar laws, like the state of California. So, you know, I think there's still hope for the future, but this is just sort of a reality check that enforcement under the law has been very slow. 

Dave Bittner: Yeah, this is fascinating to me for a couple of reasons. I mean, first of all, I remember leading up to this, there was a lot of companies that were sort of holding their breath and taking a wait-and-see approach. You know, how bad is this going to be in terms of enforcement? How is this going to change things? You know, there were some companies who got ahead of the law and were proactive, and there were other companies who chose, as balancing their risk, to be reactive, to say we're going to see how this really plays out and how the enforcement (laughter) - you know. And it seems like maybe they have - they won the bet so far. 

Ben Yelin: Yes. Now, it is worth pointing out that even though fines have not been levied, the GDPR has forced companies to take the sort of preemptive actions you've talked about. So for example, in this article they mentioned that Facebook delayed the release of its dating app. I had no idea Facebook was intending to have a dating app. 

Dave Bittner: (Laughter). 

Ben Yelin: To me, that seems like a bad idea. But that's a topic for another podcast. And that's because Irish authorities had raised questions about its data collection practices. So Facebook went back to the drawing board to try to address those concerns without actually facing a fine. So you can sort of compel enforcement just by being - you know, saying, like, hey, we could levy a fine against you even if we have not so far; you've got to go back and make sure your practices are in compliance with the law. But for the most part, you know, it has been a good bet for most of these tech companies to be like, you know, this is not going to break us. Let's not completely reinvent the wheel. You know, let's take this enforcement action by enforcement action and fight this law on its own terms. And I think you're right - that's become a pretty good proposition for these companies. 

Dave Bittner: Yeah, the other thing that strikes me about this is that - what I've heard with folks I've interviewed in industry is that they want consistency. Set some regulations. Tell me what they are. Allow me to follow them, you know, approach them the way I want to, you know, manage my risk. But consistency is much better than volatility. And I wonder, in terms of setting expectations as we go forward, will they be bringing more enforcement online? Will it continue to run this way? That's something that I imagine these companies really have their eyes on. 

Ben Yelin: Absolutely. And there are a couple of things that I think will undermine the effort to enforce some of the more robust data protections. We talked about the potential for a global recession. Another thing this article mentions is privacy in the age of the COVID-19 epidemic. There is an exception contained within the GDPR about public health emergencies, which allows the sort of contact tracing from applications that we're starting to see in many European countries right now. 

Ben Yelin: Europe was hit especially hard by the coronavirus, although probably, in the aggregate, about as hard as we've been hit here in the United States, but they've suffered a lot of cases and fatalities. And the GDPR provides legal grounds to enable employers and competent public health authorities to process personal data in the context of epidemics. So that's going to be something else, as we're stuck in the age of COVID-19, that might undermine the robust protections of the law. So, you know, in terms of companies that are unsure about whether compliance is going to start to ramp up, whether there's going to be additional capability to levy fines, I think for better or worse, you know, there are a lot of reasons to doubt that that's about to happen. 

Dave Bittner: All right. Well, we'll keep an eye on it as it continues to play out. My story this week - I must admit I put this one in partly because I figured you would get a kick out of it. This one goes a little bit into the legal weeds, but I know I can count on you to explain it to us (laughter). 

Ben Yelin: Oh, yes. Love the legal weeds. 

Dave Bittner: This is a... 

Ben Yelin: Everyone else wants to buy legal weeds killer at their local garden store. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: Yeah. I'm against it. 

Dave Bittner: Yeah, they want their legal weed whacker. 

Ben Yelin: Yeah. 

Dave Bittner: It's - you should put that on your business card, Ben. 

Ben Yelin: I - you've just given me a great idea. 

Dave Bittner: (Laughter) So this is a story from Reuters written by Joseph Menn, and it's titled "U.S. Appeals Court Asks Why Facebook Encryption Order Should Stay Sealed." And this centers around a group of federal appeals court judges who have asked prosecutors why a lower court could seal a ruling that absolved Facebook from having to wiretap a criminal suspect using one of the company's encrypted services. And evidently, the three members of the 9th U.S. Circuit Court of Appeals have some pointed questions for the prosecutors as to why this should stay sealed. There are many layers to this. Can you sort of help unpack what's going on here? 

Ben Yelin: Sure. So I'll start with the basics. The Wiretap Act allows the government to compel telecommunications companies to track criminals in various ways. Now, there's some nuance as to whether that applies to tech companies. There are other authorities along with the Wiretap Act that would force, for example, a Facebook to release information to the government about a potential - the writings of a criminal suspect or that suspect's Facebook posts, Facebook messages. This gets very complicated when we're talking about encrypted applications like Facebook Messenger. 

Ben Yelin: So there was some sort of case where there was a criminal suspect. We don't know much about it because, as this article mentions, the opinion has been sealed. It's been completely redacted. But law enforcement, federal law enforcement tried to make Facebook break its own end-to-end encryption which protects voice calls placed over Facebook Messenger. I'm sure we're seeing many more voice calls and video calls through Facebook Messenger during this pandemic. 

Ben Yelin: The U.S. District Court - federal court came down with a decision saying, for whatever reason - and that reason has not been released; it's been sealed - Facebook did not have to comply with that order. Now, all of these civil liberties groups, like the ACLU and the Electronic Frontier Foundation, are saying, hey, we have a right to see what the rationale of that case was 'cause that can give us insight into how far the government's powers extend in terms of compelling companies to decrypt their own networks or decrypt their own applications, break their own end-to-end encryption. And because this opinion is sealed, they do not know what the legal rationale is, meaning they can't adjust their own strategies. And they can't advise some of the companies that they represent as to how, you know, they can react if they face some sort of subpoena to break end-to-end encryption. So the case made it up to the Ninth Circuit Court of Appeals, based in California. Many of these Facebook, Google cases end up in that circuit. 

Dave Bittner: Is that because that's where those companies are located? 

Ben Yelin: Exactly, exactly. 

Dave Bittner: OK. 

Ben Yelin: So Silicon Valley. This case about whether to unseal the district court opinion was heard at oral arguments - virtual oral arguments, I should mention - in front of a three-judge panel on the Ninth Circuit Court of Appeals. Now, we don't have the opinion itself yet, but they seem to indicate that they were very skeptical of why this case had been redacted, why this case had been completely sealed at the district court level, saying - at least indicating that there is significant public interest here. We should know how these laws are being applied, what the District Court's rationale was, whether that holds precedential value. 

Ben Yelin: What the prosecutors have said is if that district court opinion is released, it will reveal investigative methods used by the federal government. I don't want to sound like a conspiracy theorist. 

Dave Bittner: (Laughter). 

Ben Yelin: How many times have federal prosecutors talked about - oh, this is going to reveal our investigatory methods? We need to keep this information sealed, redacted for national security, for law enforcement purposes. It seems like they sort of fall back on that justification a lot, and so I'm not sure whether this three-judge panel is going to buy that argument. And it's possible that they will unseal that district court opinion and we can actually get information as to why Facebook was not forced to break their own encryption system. 

Dave Bittner: So the argument they're making here - help me understand - is that we don't necessarily need to see the details of the case, but we're entitled to see why the judges in this case made the decisions that they made. 

Ben Yelin: Right. They want to see the legal rationale. That's what the ACLU and the Electronic Frontier Foundation are saying. They are not interested in - well, exactly what techniques did the prosecutor use? They're interested in - OK, how does the Wiretap Act apply to encrypted communications? 

Dave Bittner: And they're saying that, for those details, go ahead and redact those details that could affect national security, for example. You know, block those out. That's not necessarily what we're interested in. 

Ben Yelin: Sure. And that is done in tons and tons of cases. That happens all the time. Well, you know, wiretap applications by definition and by custom are typically sealed. Many of those applications have been subsequently released with confidential or classified information redacted just because the public has an interest in knowing what those legal rationales are. So it seems to me a very equitable solution here would be release the legal rationale but do not release details on the investigatory tactics. Now, what the prosecutors might say is you cannot describe the legal rationale without describing the investigatory tactics. That's fine, and I think the court of appeals will have to balance the public interest in knowing how this law is being applied versus the public interest in having, you know, the Justice Department's digital investigative teams have their methods and sources protected. That's a difficult dilemma to weigh. I'm not saying it's necessarily easy. 

Ben Yelin: You know, my viewpoint would generally be, because there are so many companies out there using end-to-end encryption, it's important for those companies and the public to know exactly in which cases the government could gain access to that information. And that would sort of be the superseding value I would apply here. It's so hard to know, though, because you don't know the facts of the district court case. You know? You don't know the severity of the crime that the suspect has been charged with. We just don't know anything. So you know, it's hard to say from that perspective. I would lean towards transparency as sort of I always do. And it seems like that's, you know, the way that the judges on this three-judge panel were leaning as well based on what we heard at these oral arguments. 

Dave Bittner: Now, do the judges on that three-judge panel - do they get to see all of the information in the sealed case? Are they allowed to look inside there? 

Ben Yelin: Yes, they are. They're allowed to look, and they can determine, based on what they see in that district court opinion, whether to release that information to the public. So they themselves as judges have access to it. And then I'll also mention the three-judge panel could make a decision. Sometimes what happens is the losing party will appeal for a decision en banc, meaning the entire Ninth Circuit Court of Appeals would hear that case. And that's a panel of something like 20 different judges. I don't see that necessarily happening in this case but just something to keep in mind. If the Electronic Frontier Foundation and the ACLU lose this case, they might appeal to have the case reheard en banc. If they lose that appeal, they might appeal all the way up to the Supreme Court, so definitely something for us to keep our eye on. 

Dave Bittner: Yeah. Also, just a quick shout-out - one of the folks mentioned in this article is Stanford cryptography policy expert Riana Pfefferkorn, who we have had as a guest on our show and... 

Ben Yelin: Friend of the pod. 

Dave Bittner: ...You and I are both big fans of. So just a little shout-out to her. 

Ben Yelin: We've had a lot of great guests. She was one of our best. So... 

Dave Bittner: Yeah. 

Ben Yelin: And she was one of the people quoted in this article. So... 

Dave Bittner: Yeah. 

Ben Yelin: ...Definitely read the article if just for her quotes themselves. 

Dave Bittner: (Laughter) All right. Well, those are our stories for this week. If you have a question for us, we would love to hear from you. We've got a call-in number. It's 410-618-3720. That's 410-618-3720. You can also email us your question to 

Dave Bittner: Coming up next, my conversation with Tarah Wheeler. She is a cybersecurity policy fellow at New America. Stay around for that. It is an interesting discussion for sure. But first, a word from our sponsors. 

Dave Bittner: And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this - we've all heard of GDPR whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is. And if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks. Neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses, in a nicely automated way, the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourselves to the wolves, and don't be nibbled to death by ducks. Check out KnowBe4's KCM platform. Go to Check it out. That's And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Tarah Wheeler, someone who I have been looking forward to talking to for quite a while. I actually had the pleasure of meeting her at the RSA Conference this year and we chatted, and she agreed to come on the pod. She is a cybersecurity policy fellow at New America, a well-respected speaker and expert on many things in cybersecurity. Here's my conversation with Tarah Wheeler. 

Dave Bittner: In your estimation, when asked, how do you define cyberwar? 

Tarah Wheeler: That's always the most interesting question I get asked. It's the hardest question because so much of the debate around cyber conflict and cyberwar is based on whether or not it actually is war. So I think one of the reasons this subject matters to me is without a definition of cyberwar, you can't do things like define cyber war crimes or cyber collateral damage. 

Tarah Wheeler: So the work I'm doing at the moment is about how we define cyber collateral damage in health care systems after nation-state-sponsored cyberattacks, such as WannaCry in 2017. And one of the things I started to realize was why it was so hard to define cyberwar - because any of the people who would or could agree with you on the definition have a lot of capital tied up in the notion that warfare is something you do with tanks. And only reluctantly do you get concepts like biological warfare tied into the concept of real war. 

Tarah Wheeler: And one of the ways I've started to define cyberwar begins with an analogy. We know that in the 1700s, British and American soldiers seeded Native peoples' trade goods with smallpox. We know it. We know that this was part of the sustained campaign to depopulate the North American continent for white settlement. And there is no one now who would disagree that that was an act of biological warfare. It's just not the case that you would ever do that. At the time, it was argued that it had no part of warfare; it was not something that was part of the traditional or typical tactics of a kinetic campaign. And since the soldiers weren't physically there when the people were dying or being attacked by these germs, they asserted that they had no part of it being called a war. 

Tarah Wheeler: I don't think anybody would doubt now that the human body is a battleground in biological warfare. And I think a hundred years from now, the question of whether or not a state-sponsored attack that ties up hospital records, that causes people to die for lack of information over whether or not they have drug interactions or are incapable of accessing machines because the power's come down, I don't think there's any question a hundred years from now that we're going to call this cyberwarfare. And yet, right at the moment, it's difficult to get people to circle around the definition of what cyberwar is. The best that I've been able to come up with, partially in a nod to my esteemed elders and partially to make sure they understand what I'm talking about, I tend to define cyberwar as kinetic war carried on by plausibly deniable means. 

Tarah Wheeler: Now, when you define it that way you start to give people the ability to say, ah, I can see how people might view the concept of cyberwar as something that is intended to be hidden and tough to define because that's one of the tactics of its use. So yeah, it's interesting. 

Dave Bittner: Yeah. But that leads me to the question - do you suppose that this reticence to define it is intentional - that it allows nation-states and others who benefit from this their ability to have that plausible deniability to keep things sort of fuzzy? 

Tarah Wheeler: Without any hesitation. And what's more - instead of giving you a historical example, I'll give you a contemporary example that is separated from the major power players in terms of distance, not in time. These are proxy wars that are happening. It's just that instead of using Nicaragua or Vietnam to prosecute policy aims, what we're seeing is people using AWS and Azure, right? We're seeing people using Python scripts as their version of a proxy war. There's simply not a doubt that these attacks are being carried out by nation-states. Who could doubt that at that point? But the question we come around to is, are we trying to not define these things as warfare - exactly as you said - in order to be able to take advantage of the flexibility in tactics and operational leeway that gives us? Yes, absolutely. 

Dave Bittner: You know, there's this notion that I think we see a lot in popular media that we need to be bracing ourselves for some sort of cyber Pearl Harbor or perhaps a 9/11. Do you suppose that's a possibility? Are things going to play out that way? 

Tarah Wheeler: The thing that I have said is - it's kind of funny. When I get asked about this question, inevitably - when is the cyber Pearl Harbor coming? Soon, says expert - is what gets quoted on. 

Dave Bittner: (Laughter). 

Tarah Wheeler: No, I don't think that 9/11 is the example that's going to get used. And I'll tell you why it matters right now. In talking about, quote, "cyber Pearl Harbor," what I have said is that when we have that event that sears what the example of nation-state level cyberwar looks like, it will not look like Pearl Harbor, and it will have its own name. We won't need to compare it to something else because it will be seared into our collective consciousness as its own sui generis event and act, right? 

Tarah Wheeler: We're never going to have to describe the COVID-19 crisis, this global pandemic as - well, it was like that one time we had the Spanish Flu. Remember that time back then? Because this is the thing that has seared itself into our consciousness. So I don't think there's going to be a cyber Pearl Harbor. I think when it happens, it's going to have its own name, and we're not going to need to ask why. We're not going to compare it to anything else. 

Dave Bittner: What is your take on what I would describe as some of the good faith efforts to establish some global norms when it comes to these things? I'm thinking specifically of things like the "Tallinn Manual." 

Tarah Wheeler: "Tallinn Manual" - I am in such awe of the audacity of the group of scholars that got together to put together the "Tallinn Manual." And this is, I think originally, 2014, 2015 is when they did their original work, and then February 2017 was when they put out "Tallinn Manual 2.0." I think the most important thing the Tallinn scholars did when they put together this document, especially in the second version of it in 2017, was to define the difference between cyberwarfare and cyber operations. 

Tarah Wheeler: What they did there was they gave us a more specific way to tell whether or not something rose to the level of nation-state combat or whether it hovered underneath it in the operational territory that didn't quite justify an all-out response under the norms of warfare previously. That's the most important thing that they did. It gives us something of a bright line beyond which something should be considered cyberwar. And so their work is influential, and at the same time, it is voluntary. And these experts were convened, yes, by NATO, but that's not an official NATO document. It's considered a work of international legal experts that can be used to advise policy. It doesn't have the official stamp of anything on it, and that's because people want to have their operational flexibility, I'm going to guess. 

Dave Bittner: You know, it's interesting, I think, domestically, cybersecurity is pointed out as something that has bipartisan support, and that it is unusual in that way, that it hasn't really been overtly politicized the way that many other things have. I'm curious what your perspective is on global leadership on this. Who is in the position to take the lead to try to move things forward internationally? 

Tarah Wheeler: Well, it depends on whether "move things forward internationally" means prosecuting a successful cyberwar or setting up international norms around that process. I guess - I'm serious. Which of those two things is it? 

Dave Bittner: Well, I was thinking to establish norms. Why don't we just begin there? 

Tarah Wheeler: OK, to establish norms. That is - from what I've seen of regulatory frameworks, the European Union is almost certainly set up to be the regulation-giver when it comes to norms in cyber conflict, partially because they deal with internal conflict and the kind of low-level operations even among member states - not to make too controversial a statement on that one - that would let them get that experience. Second, we also see that in cybersecurity in general, the European Union tends to be the regulation-giver, as they were for GDPR a couple of years ago. And it's an interesting thing to see. So I think that we're going to see, for the creation of norms - we saw, for instance, in - I think it was November 2018, the Paris Call for Trust and Security in Cyberspace, and that was one of the answers internationally to how do we create peace in cyberspace. I think that's the place we're going to see it come from. 

Dave Bittner: What do you make of calls I've seen from some people, even former folks in the military, who've said that, you know, we need a fifth branch of our military to deal with cybersecurity, for it to be its own department? 

Tarah Wheeler: I think it's very telling that you're describing speaking to former members of the military. I don't think it's a popular view inside the current military, if only because IT is seen as a support function in all of the branches. I think it's - we're having a hard time explaining that, first of all, the Internet is a real place to people who operate, you know, tank divisions. How do you explain to them that taking down an electrical grid can have the same kind of devastation on noncombatants during a heat wave that anything that they do with the physical movement of troops and materiel might have? And a lot of people, I think, are bound up in the idea that the work that they're doing, the political capital being spent in the careers that they've succeeded in in running a conventional military would be somehow magically seen as less valuable if we acknowledge the existence of this other space. 

Tarah Wheeler: I don't think that's the case, and I think that the smartest people I know in the military are trying as hard as they can to prepare people for the idea that this is a separate place that we're doing battle. The best example I've seen and the most powerful example of that is probably Trident Juncture, the NATO military exercises in late 2018, when, after the first 12 hours of conflict between these 29 member states - or 27 member states, plus two - we saw the entire exercise come to a screeching halt because, as soon as about six keyboards got working, the targeting, navigation and communication systems in the war game itself were no longer operational. The cyberwarriors - they wiped the floor of the North Atlantic with this exercise. They were decommissioned for the rest of that exercise. 

Dave Bittner: (Laughter). 

Tarah Wheeler: And no one's ever released the results of that particular form of attack. Basically, what happened was they - the people running this exercise collaborated to say, we're going to pretend computers don't exist for the next two weeks so that we can make sure that our planes and our boats and our people can operate and communicate and we can get our exercises done. Now, to anybody looking at that, that might have been a sensible decision in the moment because they'd already committed so much materiel and personnel in that moment. For those of us who sit behind a keyboard, we're looking at it and going, why didn't you just pay a little bit more attention to the few keyboards in Tallinn and wherever else that took this exercise down? That tells you that there is an interesting force multiplication factor behind a keyboard that doesn't exist anymore when it comes to sailing ships someplace. 

Dave Bittner: It's interesting to me - even the perception of how we protect ourselves. The distinction between the public sector and the private sector - you know, if I'm running an electrical system - if I'm in charge of that, you know, I rely on the military, the national guard of my country to defend against anyone coming and attacking me physically. But from the cyber point of view, it's really up to me engaging with the private sector, largely, to take care of my defenses there. 

Tarah Wheeler: That is correct. And the problem is, there's not a good answer for that at the moment. We're seeing some interesting work being done on nationalization of resources to keep people alive, but it's not militarily focused. It's focused entirely around this pandemic. And I don't know what lessons are going to come out of this. I am, as a private citizen, concerned that we're going to see the deprecation of a lot of the rights we've enjoyed as American citizens paired with the ability to protect the U.S. better but also paired with the same frank incompetence we've seen on the cyber level from this current administration. 

Tarah Wheeler: And if we see that happening, we're going to actually see the creativity and innovation of the private sector stifled more than it's able to assist. I think that's going to be the place we're going to end up. And I'm working to see that that doesn't happen, as I think many people are who are working hard on this project of making sure that our country - our globe - is safe from bad actors while also acknowledging that some of our tools are not ready-made to handle (ph) for that process. 

Dave Bittner: In your vision, what would be an ideal way that something like a digital Geneva Convention would play out? How would that process work? How would that be something that we could achieve? 

Tarah Wheeler: I think that the Paris Call in 2018 was incredibly powerful. The fact that it took a multi-stakeholder approach is key. And I think that the U.S., for instance, if it wanted to play any part in a leadership role in that kind of digital Geneva Convention, is losing time to be in a leadership role. Eventually, it's going to be a question of simply signing on to what a hundred other nations have done if the U.S. doesn't actively participate in a multi-stakeholder approach. 

Tarah Wheeler: And the problem with the multi-stakeholder approach is that in the U.S., we tend to let the military run our cyber conflict policy. And I think that we already have, frankly, seen what this is going to start to look like. It's going to look like either more people signing onto the Paris Call or a revision of that in future. Much like the "Tallinn Manual" got revised again and again to address current situations, we're going to see the Paris Call revised. Or we'll see something that helps to replace it. And I think that, if the question of U.S. leadership is brought up, at the moment, I don't see a lot of leadership happening there on the international policy stage. 

Tarah Wheeler: We're actually great. And as a private citizen, I'll note that I think CISA's doing a fabulous job right now brokering and handling vulnerabilities equities between the private and public sector in the United States. But that's not a policy position. That's just the CISO of the U.S., basically. So the policymakers in cyberwarfare functionally don't exist in the U.S. Those have been gutted out of the State Department, out of even the military at this point. There's no White House central core around cybersecurity policy that is functional. 

Tarah Wheeler: And I've been to a lot of these meetings at this point, and there's very rarely any U.S. presence at the policymaker level. There are often people with expertise in compliance in U.S. government there, but there's no policy arm of the U.S. government handling this anymore. It's depressing. I don't have a good answer for you. But I guess the question - the answer is I have a good answer for you. I just don't like giving it. 

Dave Bittner: All right, Ben. What do you think? 

Ben Yelin: Very interesting conversation about sort of the state of international policy related to cyberwarfare. And one thing that was sort of unsettling from the interview is just the extent to which the United States has not really been proactive in being involved in this broader conversation about defining cyberwarfare. I think we've lost a lot of time and influence because we've been reluctant to join that effort. 

Ben Yelin: I think another thing that stuck out to me is - you know, we've often talked about what the 9/11 or Pearl Harbor event will be that sort of ignites an interest in cyberwarfare. She made a really interesting point that there's not going to be a 9/11 or a Pearl Harbor event. It's going to be its own unique event. It's going to be unlike, you know, any threat we've ever faced in the past. And that's just the nature of cyberwarfare. It's different. There is going to be, potentially, some physical impact if we talk about things like damage to critical infrastructure, but it's not going to be an experience that will remind us of anything else. And I think that's an important message, particularly to people who are not well versed in the subject of cyberwarfare. 

Dave Bittner: Yeah. You know, one of the things that stuck with me since she and I chatted was - when I brought up to her that some of the folks that I have spoken with - former military people - had suggested, you know, that we even may need another branch of the military - a cyber branch. And she brought up a really interesting insight, which was basically, you know, how interesting that these are all former military people who have that point of view - that it's not the people in the service who are thinking that way. 

Ben Yelin: And she basically said the people in the service look at cyber as kind of an IT issue. And you know, that's in some ways kind of disturbing to hear just as, you know, you and I have followed this closely and realized the potential physical threat of cyberwarfare to all of our well-being. I hope it is taken more seriously, especially as we face threats from state actors, non-state actors and especially as this threat grows in the future. 

Dave Bittner: Yeah, absolutely. Well, again, our thanks to Tarah Wheeler for taking the time for us. You can find her on Twitter. She is @tarah. And then we want to thank all of you for listening to our show. 

Dave Bittner: And of course, we want to thank this week's sponsor KnowBe4. If you go to, you can check out their innovative GRC platform. That's Request a demo, and see how you can get audits done at half the cost in half the time. 

Dave Bittner: Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.