Caveat 12.18.19
Ep 9 | 12.18.19

The cyber resiliency of White House operations.

Transcript

Christopher Whyte: The concerns related to everyday cybersecurity operations pertain very directly to the political agenda of our head executive, the president. 

Dave Bittner: Hello, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, Ben unpacks a recent Capitol Hill hearing on the Crypto Wars. I have a ruling that addresses biometrics and self-incrimination. And later in the show, I speak with Christopher Whyte. He's an assistant professor of homeland security and emergency preparedness at Virginia Commonwealth University. We'll be discussing the notion that cybersecurity in the White House is in disarray. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? And it's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They're out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk. And trust us, friend, they're not pretty. So again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question either. 

Dave Bittner: And we are back. Ben, why don't you start things off for us this week? 

Ben Yelin: Well, our Congress critters are at it again. 

Dave Bittner: (Laughter). 

Ben Yelin: I spent another morning on C-SPAN, as I usually do, as somebody interested in these issues. 

Dave Bittner: You're a wild man, Ben (laughter). 

Ben Yelin: I sure am. Yeah, it was quite a party. But there was a hearing in front of the Senate Judiciary Committee on the Crypto Wars, and there were two punching bags/witnesses who were invited... 

Dave Bittner: Right. 

Ben Yelin: ...One from Apple and one from Facebook, along with somebody who's very prominent in law enforcement, Cyrus Vance, who's the district attorney for Manhattan. And this was a debate about backdoors - their benefits and their detriments. 

Dave Bittner: So encryption - the whole Crypto Wars thing of whether or not the technology companies should be obligated to provide backdoors in the encryption technology they provide. 

Ben Yelin: Right - and whether the benefits derived from such a policy, which is that the government would have access to potentially valuable information, outweigh the obvious detriments, which are not only invasions of privacy but also, potentially, that the backdoor would be open for bad actors - cybercriminals or foreign actors... 

Dave Bittner: OK. 

Ben Yelin: ...Or state actors. So interestingly, the chair of the committee, Lindsey Graham, Republican senator from South Carolina, and the ranking Democrat, Dianne Feinstein of California, were largely in agreement and have been longtime supporters of encouraging/forcing the tech companies to create this backdoor to allow law enforcement to gain access to encrypted devices. Senator Graham went as far as to sort of issue a threat to the witnesses who were there, saying, either you guys come up with a fix to this issue yourselves - we need access one way or another. So either you develop the workaround or we're going to force it on you. And as you can expect, the witnesses pushed back a little bit. Apple and Facebook have said that they are incapable of creating these so-called backdoors while protecting against the threat that bad actors would also get access to those backdoors. So the technology doesn't yet exist, from their perspective, to regulate who comes through that backdoor on those encrypted devices. 

Ben Yelin: Cyrus Vance, the other gentleman who was testifying, obviously is a proponent of creating backdoors. He's on the government's side in the Crypto Wars. But interestingly, he sort of mentioned that, despite having the resources of the Manhattan district attorney's office, he's been unable to unlock something like 80% of the iPhones that they've needed to unlock for law enforcement purposes in New York. He said 82% of the time, devices are locked as opposed to 60% four or five years ago. So you know, sometimes, that will require him to spend a lot of taxpayer money to hire people to try and decrypt those devices. And still, they can have trouble accessing them. 

Ben Yelin: So you know, this is one of those long-running debates that, you know, was not going to be solved at a single hearing. But it was just kind of interesting to see it aired out in public. 

Dave Bittner: Yeah, there's an interesting exchange here from Senator Chris Coons and Jay Sullivan, who is Facebook's privacy chief for its Messenger app. And Senator Coons was kind of - I don't know - pushing him on child pornography, which it seems like so many of these cases end up in that arena because I think it's the worst-case scenario and I guess something that resonates emotionally with everyone. Everyone's against child pornography. And Sullivan from Facebook said, we believe there's no place for these activities on our products. Coons fired back - so you're against child pornography and child abuse? Absolutely, Sullivan said. Thanks for clarifying that, Coons replied - as if there was any doubt. Now - but here's the interesting part for me, is that Jay Sullivan from Facebook made the point that a lot of the detection that they do for these kinds of things on their platform comes from behavior, not from content. 

Ben Yelin: Right. 

Dave Bittner: In other words, they have figured out there is a common playbook that folks like child pornographers follow when they're on a platform like Facebook. And so Facebook doesn't necessarily have to see the data. That data may be encrypted, but Facebook can still figure out - hey, this person is likely up to something that's no good. I thought that was a really interesting insight. 

Ben Yelin: It is. And sort of - that's Facebook's way of saying we have a way of attacking the problem that would prevent the government coming in and having access to encrypted devices or our Facebook Messenger app while still confronting the bad actors that are using our platform for destructive purposes like child pornography. So yeah, I mean, I thought that was also a really interesting point. 
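[Editor's aside: the behavior-not-content approach Dave and Ben describe can be illustrated with a minimal, purely hypothetical sketch. The signal names, thresholds and weights below are invented for illustration only - Facebook's actual detection systems are not public and are far more sophisticated. The point is simply that metadata alone, with no access to message content, can drive a flag-for-review decision.]

```python
# Hypothetical behavioral scoring: every signal here is content-free
# metadata. No message bodies are read, so the approach works even
# when the messages themselves are end-to-end encrypted.

def risk_score(account: dict) -> float:
    """Score an account on invented behavioral signals (0.0 to 1.0)."""
    score = 0.0
    # Rapidly contacting many strangers is treated as one warning sign.
    if account.get("new_contacts_per_day", 0) > 20:
        score += 0.4
    # An adult account whose contacts are mostly minors is another.
    if account.get("minor_contact_ratio", 0.0) > 0.5:
        score += 0.4
    # Freshly created accounts that immediately mass-message.
    if account.get("account_age_days", 365) < 7:
        score += 0.2
    return score

def flag_for_review(accounts: list, threshold: float = 0.6) -> list:
    """Return the IDs of accounts whose behavior crosses the threshold."""
    return [a["id"] for a in accounts if risk_score(a) >= threshold]
```

A platform would then route flagged accounts to human reviewers rather than act on the score automatically - the score is a triage signal, not a verdict.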

Ben Yelin: One thing this article referenced, which I had never seen this metaphor - I thought it was really brilliant - a group of computer scientists and cryptographers wrote a paper on the Crypto Wars. I think some of them were from MIT, so we know they have to be smart. And they said that a proposal to regulate encryption and guarantee law enforcement access feels rather like a proposal to require that all airplanes can be controlled from the ground. And this really spoke to me. Like, obviously, in some circumstances, you'd think it would be great for airplanes to be controlled from the ground. 

Dave Bittner: Right. It's... 

Ben Yelin: If it was 9/11 and airplanes had been hijacked, it's fantastic that the air traffic control tower can take over the controls and land the plane safely. 

Dave Bittner: Or even if a pilot had a medical condition or something like that. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: The movie "Airplane" would've been far less interesting if this technology had existed. 

Dave Bittner: Right? Do I have clearance, Clarence? Right. (Laughter). 

Ben Yelin: Lloyd Bridges would've just brought it right in for a safe landing. 

Dave Bittner: Yes. 

Ben Yelin: But the other side of it is, what if a bad actor gets into that control tower? That makes things much worse because now you have criminals - or potentially terrorists - exploiting a loophole that was created by the government. And you know - and I think that's just a - it was a very eloquent way of talking about the dangers of giving the government this backdoor. I think that was something that was pretty well represented in this hearing from both the representative from Apple and Facebook. 

Dave Bittner: Yeah. And a number of people have weighed in on this. Bruce Schneier, who's well-known in cybersecurity circles, he wrote in a brief - (reading) I don't believe that backdoor access to encrypted data at rest offers an achievable balance of risk versus benefit either. But I agree that the two aspects should be treated independently. 

Ben Yelin: Yes, that's sort of the contrast with accessing decrypted communications in transit. So I think he's saying, like, accessing communications at rest is a better alternative but not an ideal alternative - you know, in terms of weighing the pros and cons, a far better balance than what we commonly see in government proposals, which is allowing access to encrypted communications in transit. Another thing that stuck out at me - and this is from both the article and the hearing - is that users are going to find workarounds to applications and services in which the government has created these backdoors. 

Dave Bittner: Right. 

Ben Yelin: So if people know that there's a backdoor to Facebook messaging, they're going to go to some app created in a foreign country where that capability does not exist, you know? And I think it was General Michael Hayden, former head of the NSA and CIA... 

Dave Bittner: Yeah, yeah. 

Ben Yelin: ...Yeah - who was saying, you know, this is what's been happening in Hong Kong. They've been able to migrate their communications to encrypted applications, those that have been created overseas, that are even inaccessible to the Chinese government, which is really saying something. 

Dave Bittner: Yeah. 

Ben Yelin: And in order to confront that problem, in his view - and I think in the view of the witnesses at this hearing - we pretty much have to become an authoritarian country because we, you know, have to be basically taking control of cyberspace lock, stock and barrel. So even for people who have that type of security background, you can understand, you know, what the risks are. And I think that was very compelling. And you know, just to note - to bring it back to our "Caveat" podcast, this is not something that only a bunch of crazy hippies are talking about. 

Dave Bittner: (Laughter) That's right. 

Ben Yelin: We had the former Secretary of Homeland Security Michael Chertoff on here basically saying the same thing. 

Dave Bittner: Yeah, he doesn't think it's worth it. 

Ben Yelin: Right. The risks just outweigh the benefits. So it's interesting that you have all of these stakeholders, from people who've served inside governments, especially in the infosec national security community and then people from the tech companies, civil liberties advocates all kind of on one side. And then you have the Senator Graham, Coons, Feinstein-type people - and I guess FBI Director Christopher Wray, as well - on the other side. You know, I sort of think the balance of compelling viewpoints is against creating these backdoors. 

Dave Bittner: Yeah. I also think it's worth noting that this is where things stand now. And it doesn't mean that in the future, there may be some technology that we don't have yet that could be a viable solution to this. And just because we can't do it now doesn't mean we shouldn't keep looking for that possible compromise solution or something that achieves what we're setting out to do. It's possible that in the future, it could exist. 

Ben Yelin: Right, absolutely. And I think what Apple was trying to say to Senator Graham in this hearing, you know, because Senator Graham was issuing this threat - basically, if you don't create the backdoor, we're going to mandate it and create it for you - is basically, relax, Senator Graham. We are the innovators here. Right now the technology doesn't exist where we can keep the backdoor completely locked to bad actors while being open to... 

Dave Bittner: Right. 

Ben Yelin: ...The government. But we're innovators. We might find a way to make it happen. If you step in and regulate us, that's probably going to have a greater detrimental effect than just letting, you know, the market work this out. This is something that I think the tech industry will probably take the lead in figuring out because I think, you know, it's in their long-term interest to maintain a good relationship with law enforcement. 

Dave Bittner: Yeah. 

Ben Yelin: And so there's incentive for them to try and come up with a solution to this problem. 

Dave Bittner: All right. Well, the Crypto Wars continue. 

Ben Yelin: They will never end. 

Dave Bittner: (Laughter) My story this week comes from Forbes. This is from Thomas Brewster. Title of the article is "Feds Can't Force You to Unlock Your iPhone With Fingers Or Face, Judge Rules." And this is interesting. So a judge in California has ruled that the police can't force you to unlock your mobile phone with either your face - so face ID - or your finger - your fingerprint. Help me understand here, Ben. How is this different than what we had before, and how is this going to play out in the real world? 

Ben Yelin: So this was a fascinating decision. It did come from a magistrate judge in California, which is a relatively low-level judge. 

Dave Bittner: OK. 

Ben Yelin: So its applicability is somewhat limited even within the state of California, let alone the rest of the country. But the reasoning that this magistrate judge used in her opinion was groundbreaking and potentially could shift the way other courts see this issue of self-incrimination and unlocking devices. I know we've talked about this a lot, but the prevailing legal view is that entering in one's passcode counts as testimonial evidence for the purposes of the Fifth Amendment right against self-incrimination. 

Dave Bittner: Right. 

Ben Yelin: So as a quick review, Fifth Amendment says that a person should not be compelled to testify against themselves. That only applies to testimonial evidence - so you know, in the nondigital world, what you say at a hearing, what you say at a criminal trial, a civil trial. When you're being grilled in front of Congress, you can invoke your Fifth Amendment right. 

Dave Bittner: Yeah. 

Ben Yelin: Traditionally, that doesn't apply in nontestimonial settings, the prototypical example being a police lineup. If someone's like - all right, take your position, you know, in this police lineup; you're going to be number two - you can't go in there and say, sorry, I'm invoking my Fifth Amendment right. Judges have tried to extend that distinction to the digital world, saying that typing in a passcode counts as testimonial evidence. It's the contents of your brain. You know that passcode. You have to enter it in. But using biometric means - so face ID or a fingerprint does not count as testimonial evidence because that's, you know, the equivalent to being in a police lineup. 

Dave Bittner: Right. It's what you are rather than what you know. 

Ben Yelin: Exactly. So that's sort of the prevailing view among courts. What this judge is saying is the effect of unlocking a phone with your fingerprint or your face ID is the exact same as the effect of unlocking one's phone using an alphanumeric password. No matter what you use, the phone's going to be unlocked. The same amount of incriminating information potentially could be available. So this distinction in a practical sense, you know, shouldn't be there as we've tried to apply the Fifth Amendment to the digital world. In her view, as I think most of us agree, the law is always behind the technology. And she's trying to adapt the law to catch up to the technology. She's trying to say there should not be this legal distinction between unlocking a phone with an alphanumeric password and using biometric data because the practical effect is the same. 

Dave Bittner: So in other words, I should be able to declare my intention to not unlock my device. And it won't matter how that device is secured, be it by a password or by some biometric thing. It's basically invoking my Fifth Amendment rights as it applies to unlocking the device, regardless of how that unlocking occurs. 

Ben Yelin: Exactly. And I think this makes a lot of practical sense, especially since most devices are moving away from alphanumeric passcodes. 

Dave Bittner: Right. 

Ben Yelin: Although, you know, many still have them. And they might be - I'm not sure about the comparative security features of each. You probably know more about that than I do. But many phones are using biometric means to unlock. 

Dave Bittner: Right. 

Ben Yelin: And we're in that world. So why should we have diminished Fifth Amendment rights just because technology companies have decided that the best way - most efficient way for us to unlock our phones is through the use of face ID or a fingerprint? 

Ben Yelin: What I like about this decision is that it's practical. It exists in the real world. Where I think the decision is potentially going to be problematic is this is a magistrate judge going outside of the bounds of what has commonly been accepted in this area of the law. And the risk there is that her decision in this California case is going to be overturned by the district court or might run into trouble in, like, a federal appeals court, which means its effect could be rather minimal because it goes against the prevailing wisdom. But the prevailing wisdom has to change somehow. Right? You know, it doesn't always change with a Supreme Court decision. Sometimes it changes with a lower-court decision or a magistrate judge decision. 

Dave Bittner: Well, explain to me how that works. In the legal world - in your world - what - a magistrate judge, who I guess you're describing at a lower level than a district court... 

Ben Yelin: Yes. 

Dave Bittner: ...Judge. 

Ben Yelin: Yep. 

Dave Bittner: A magistrate judge makes this decision. That decision goes out into the world. Is this a situation where now other judges, when making decisions, look at that decision and say, I need to consider that as I make my own decisions? 

Ben Yelin: So great question. So there's this distinction in the legal world between mandatory authority and persuasive authority. So if you're a district court judge in California, any court that's directly above you, like the court of appeals - the Federal Court of Appeals for the 9th Circuit, for example - whatever they say, that's controlling on you. You have to abide by their decision because they are a superior court to you. 

Dave Bittner: Right. 

Ben Yelin: They're the ones who have ultimate authority. If the Supreme Court says something, they obviously have the highest place in the hierarchy on a particular issue. That is mandatory authority binding on lower courts. 

Dave Bittner: I see. 

Ben Yelin: When there's, like, a lateral decision, when you have a decision from a California magistrate judge, how that relates to a magistrate judge, you know, here in Maryland, that's what's called persuasive authority. So a Maryland judge could say, I wasn't sure how to rule on this, but I saw this - a very compelling reasoning from a judge in California, and I'm going to use that to inform my decision. And a lot of times that's how a new legal consensus develops. There's an enterprising judge who comes up with a new rationale, and slowly that's adopted by more judges and eventually is adopted by those that do have mandatory authority, like those higher courts - courts of appeals, state supreme courts and the United States Supreme Court. 

Dave Bittner: So like a snowball rolling downhill. 

Ben Yelin: Exactly. You know, and the other effect it has is - there are a lot of legal academic nerds out there, and they're going to see this decision and will start writing legal treatises or law review articles perhaps arguing that this would be a better approach to deal with the self-incrimination problem related to devices. And, you know, once that gets out into the legal community, judges oftentimes look to these legal academics for guidance on these issues, and they'll cite some of these academic articles. So, you know, us people in academia, we don't have much of an influence on the world, but we do have a minor influence. And that's... 

Dave Bittner: (Laughter) Right. Other than your popular podcasts, right? Yeah. 

Ben Yelin: Exactly. Now, I hope - you know, the highest mandatory authority should be what we say on the "Caveat" podcast. 

Dave Bittner: That's right (laughter). 

Ben Yelin: But apart from that, you know, this to me is sort of the first step in what might be a broader rethinking of how courts address this issue. 

Dave Bittner: Yeah. All right, very interesting. Very interesting indeed. Thanks for unpacking how all that works. 

Ben Yelin: Absolutely. 

Dave Bittner: I think it's - I have new understanding of that. That's very interesting. 

Ben Yelin: Yeah. And I would say, one thing this article mentions at the end is that because of its limited applicability - this is a magistrate court decision in California - if you want to protect the secret information on your device, you should probably still use an alphanumeric password. 

Dave Bittner: Right. 

Ben Yelin: That's sort of the guidance given... 

Dave Bittner: Right. 

Ben Yelin: ...In this article, and I think it's good guidance. 

Dave Bittner: All right. Well, interesting stories this week. It is time to move on to our Listener on the Line. 

(SOUNDBITE OF DIALING PHONE) 

Dave Bittner: Ben, we've got an interesting call from a listener calling in from, I'd say, one of the southernmost points in the United States, and he has an interesting question about some vulnerabilities. 

Ben Yelin: The most southern - the southernmost call in "Caveat" history. 

Dave Bittner: There you go. There you go. So here is our Listener on the Line. 

Nacho: Hello. Love the show. My name's Nacho (ph), and I reside in Laredo, Texas. And I have a question. For zero-days, does a zero-day have to be a malicious thing, or could it be something that you just happen to find to make - like in my case, make the iPhone do something it was not intended to do? What I use it for is, basically, set up a GSM code to force, from the old iPhone to the new iPhone, where you tap the lock key on the side once it sends it to your voicemail. If you get telemarketers calling, tap twice. Whatever GSM code number you gave it sends it to, you know, whatever you want it to send it to. I mean, it's a hack, but it's not a malicious hack. So how would you cover that? Is that considered a zero-day, or is it just considered just a cheap hack? 

Dave Bittner: All right, interesting question. And thank you to Nacho for sending that in. Ben, you want me to take the lead on this one? 

Ben Yelin: Yes. 

(LAUGHTER) 

Dave Bittner: OK. So basically, what he's asking is about zero-days. A zero-day is a type of vulnerability - a very common term in cybersecurity, something we talk about over on the CyberWire all the time. So what makes a zero-day a zero-day? My understanding of the definition is that it is a vulnerability that is unknown to the folks who'd be interested in knowing about it. So, for example, I'm Microsoft, and I have put out in the world the latest version of Windows, and there are certain vulnerabilities in Windows that I know about and that I've decided I'm going to fix. Maybe we're going to patch right away; maybe they're not so serious, so we've got patches in the works. We put information out to our users, and we say, here are some known vulnerabilities; if you're developing software for our platform, here are the things we know about, and here's how we think you should fix them. 

Dave Bittner: A zero-day is a vulnerability that is not known to me. In other words, I'm Microsoft; there's a vulnerability in my software that I don't know about. The danger there is that the bad guys can discover that zero-day and exploit it. They have a period of time between when they discover it and when it becomes publicly known - when I, as Microsoft, or security researchers find out about it. In that time gap, the bad guys can exploit the vulnerability before I can do anything to address it. 

Dave Bittner: So the actual term zero-day comes from the idea that the clock starts running. When the good guys find out about this vulnerability, that's day zero. That's when the clock starts running, and we've all got to figure out what we're going to do about this vulnerability. And so it is a dangerous thing. It is a well-known thing. But to answer Nacho's question specifically, I don't think it has to be malicious. The notion of a zero-day is just that it's unknown. It might be something benign. It could be something, you know, that causes an odd behavior in a piece of software - a crash or some funny, odd behavior that doesn't necessarily make something bad happen. It could still be a zero-day because the developers of the software didn't know it existed. 
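[Editor's aside: a small, illustrative sketch of the "clock" Dave describes. The dates and function below are invented, not drawn from any real disclosure process; the point is only that the exposure window runs from when attackers find the flaw until defenders learn of it, and "day zero" starts the count toward a patch.]

```python
from datetime import date

def exposure_window(attacker_found: date, defender_found: date, patched: date):
    """Return two gaps in days: how long the flaw was exploitable before
    defenders knew about it (the zero-day period), and how long it took
    to patch after disclosure ("day zero")."""
    zero_day_period = (defender_found - attacker_found).days
    patch_gap = (patched - defender_found).days
    return zero_day_period, patch_gap

# Example with made-up dates: attackers find the flaw Jan 1, defenders
# learn of it Mar 1 (day zero), and a patch ships Mar 15.
print(exposure_window(date(2019, 1, 1), date(2019, 3, 1), date(2019, 3, 15)))
```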

Dave Bittner: That said, I would say in general, everyday use, when we're talking about zero-days, they're generally bad things. Bad guys - well, not just bad guys; there are many groups who are interested in zero-days, as you might imagine (laughter). 

Ben Yelin: Including our National Security Agency. 

Dave Bittner: Including - yes, yes (laughter). 

Ben Yelin: Right. Yeah. 

Dave Bittner: So zero-days are quite valuable because if I know a way into your system that no one else knows about, that gives me an advantage. And if I'm a nation-state and I have a way into a particular system, I can have my way with your system without you knowing about it. So, Ben, as you point out, rightfully so, certainly agencies within our government have taken advantage of zero-days. 

Ben Yelin: Absolutely. And, you know, that's part of their mission, is to find those vulnerabilities because, you know, the NSA has both offensive and defensive capabilities. So this would fall into both, actually. You'd try and attack the vulnerabilities of our adversaries, but you also - you know, sometimes the NSA will discover a zero-day from Microsoft before Microsoft knows about it. 

Dave Bittner: Right. Right. 

Ben Yelin: So it goes into both those capabilities. 

Dave Bittner: Right. There are - many companies have bug bounties, where they will pay researchers if they find vulnerabilities and report them to the company. So they put an incentive out there to try to find these things before bad actors do. There are organizations - probably the most well-known is a company out of Israel called NSO Group - that develop tools for governments around the world to be able to get into devices, like iPhones and computers and so on, and they are in the market for zero-days. They will pay big money for zero-days. My understanding is that if you have a zero-day for, say, the iPhone, that is very valuable because there aren't that many of them. It's a relatively secure device. So if you have one, it's a seller's market. 

Ben Yelin: Yeah (laughter). 

Dave Bittner: Yeah. 

Ben Yelin: Not that we're encouraging such a thing, but yeah. 

Dave Bittner: No, no. But I think you're not wrong - if it's just something benign that you or anyone happens to find and take advantage of, I guess technically that's still a zero-day. 

Ben Yelin: (Laughter) Yeah, I think so. 

Dave Bittner: All right. Well, thank you, Nacho, for calling in. That is our Listener on the Line. Coming up next, my conversation with Christopher Whyte. He's an assistant professor of homeland security and emergency preparedness at Virginia Commonwealth University. We'll be discussing some reports that have come out lately about how, when it comes to cybersecurity, the White House very well may be in disarray. But first, a word from our sponsors. 

Dave Bittner: And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this - we've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is. And if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks - neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses in a nicely automated way the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourself to the wolves, and don't be nibbled to death by ducks. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Christopher Whyte. He's an assistant professor of homeland security and emergency preparedness at Virginia Commonwealth University. And our conversations centered on the notion that when it comes to cybersecurity, there's a lot going on in the White House right now, and a lot of it is not good. Here's my conversation with Christopher Whyte. 

Christopher Whyte: The history of cybersecurity in the White House, at least in its current form, goes back to the Clinton administration, the late 1990s. Following the commission reports and a variety of other developments that came out of the Oklahoma City bombing, and our experiences with the Solar Sunrise and Moonlight Maze APT incidents in '96 through '99, we basically put in place a cyber czar position in the White House. This person would be responsible for all cybersecurity policies across the federal government. It was recognized by the Clinton administration, and by the Bush and Obama administrations, that this was such a crosscutting and diverse issue area that it was best for the whole thing to be centralized in one coordinated package in the form of this office of the cybersecurity czar. 

Christopher Whyte: The office in question that we're talking about today, the office of the CISO at the White House, was put in place after some initial breaches of the White House's unclassified network. There have actually been three that I can think of. There have been a couple of other security incidents that have been a little bit concerning, but there have been three major incidents over the last 11 years. I believe the first one was in 2008, where a foreign power seemed to have successfully gotten access to unclassified White House systems. This office that has just been merged into one of the other adjacent offices at the White House, and is experiencing some downsizing, was put in place after the last of those. 

Christopher Whyte: And so in 2008 and 2012, we had incidents where we saw Chinese operatives successfully conduct - I believe it was spear-phishing campaigns in both instances. Somebody clicked on something they shouldn't have, and unclassified systems were breached. The more recent one was a Russian attempt to breach the White House's unclassified systems, with some degree of success that we're not entirely clear about. That was in 2014, and that's what led the Obama administration to commission this office and give it the responsibility of coordinating information security for the White House as a whole, as opposed to relying on the DOD agencies - I think they're under the jurisdiction of DISA - that were responsible for the White House's information security and communications up to that point in time. 

Dave Bittner: Where do we find ourselves today? What changed, if anything, when President Trump came into office? 

Christopher Whyte: This is an open question. All right? There's a lot to unpack here - unfortunately, particularly because of the unusually partisan nature of politics at this time in this country, but also because of the unusual circumstances surrounding several of the major foreign policy incidents linked, in some form or another, to the Trump administration and the central role that cybersecurity has played in those incidents - going back, of course, to Russian interference in 2016. 

Christopher Whyte: The argument that's being put out there by the Trump administration on what has happened here - and let's be clear descriptively on what's happened as well. The personnel in this office have been increasingly - this is from the leaked report from Axios - have been increasingly marginalized over the past several months. They've seen their access to systems and information being reduced over time. There was the removal of an incentives package - the kind of thing you do if you're trying to force a particular subset of your workforce to look for other jobs, basically. 

Christopher Whyte: The most recent thing that's happened has been a merger of the Office of the Chief Information Security Officer with the Office of the Chief Information Officer. So this is an entirely separate entity within the White House. And as that happens, the director of the former, the security office, resigned in protest. 

Christopher Whyte: Kind of the argument that's being put out here by the White House is that this is streamlining and efficiency move. There's a lot of redundancy, they're arguing - a lot of inefficiency, duplicated efforts in having basically two different cybersecurity teams within the White House. That's the broad argument there. And it's - on the face of it, it's the kind of broad argument which could be accurate. It makes some sense - right? - overall improvements to the efficiency and the productivity of White House cyberactivities, putting the capabilities in - more centrally under one set of decision-makers. Sure, that that could absolutely be a reasonable move. 

Dave Bittner: And a president who talks about cutting costs... 

Christopher Whyte: Yeah. 

Dave Bittner: ...And saving the American people money. 

Christopher Whyte: Yeah, that's absolutely true as well. There is a very reasonable argument here to be made that duplication of efforts, particularly in this domain, costs the taxpayer a disproportionate amount relative to what they could be paying for. And I say in this domain specifically, of course, because we all know the value of good cyber talent. 

Christopher Whyte: The folks who are working at the White House in either office - but let's just say the folks who were working in the information security office - they weren't being paid anywhere near what their colleagues in private industry would be getting paid. But the salaries and the incentive packages involved are still robust. These individuals likely made six figures. I believe it was reported that the director made somewhere in the, you know, $130,000 to $140,000 range. And they had, like I said, incentives packages to keep them interested in working there. This is a workforce issue that the government has had a challenge with for 30 years at this point. Right? 

Dave Bittner: And these are career professionals. Right? These are not political appointees. 

Christopher Whyte: These are career professionals who were seconded to this office by the Obama administration. So they were - you're correct. They're career officials, but they were put in place in their current roles by the Obama administration. I don't know... 

Dave Bittner: I see. 

Christopher Whyte: ...If that nuance matters. 

Dave Bittner: Yeah. 

Christopher Whyte: But it is, of course, one of those pieces of nuance that people are suggesting might matter given the current president's interest in removing many of the things that the Obama administration put in place, whether it be policy, legislation or personnel. 

Dave Bittner: Right. There are concerns about how some of these moves might affect the Presidential Records Act. 

Christopher Whyte: Yes. 

Dave Bittner: Can you unpack that for us? 

Christopher Whyte: The broad shape of that issue is that there's a concern, I think particularly given what's being reported in the news related to the recent impeachment investigations, that the Trump team is basically attempting to cover up its tracks. It's trying to make transcripts and records a lot more difficult to find. The argument, I think, with the merger of these two offices - making one redundant and enlarging the other - is that, A, the line of command between, let's say, the chief of staff or the president and those who are actually going to be monitoring information coming in and out of the White House and monitoring White House networks is much more simplified. And, B, they could potentially appoint individuals to the leadership of the remaining information office that would be more or less willing to kowtow to political interests in protecting records. 

Christopher Whyte: That's the broad shape of the arguments being made there. Another part of this - the political side of this - is just the actual removal of individuals. They're being chased out, so to speak, by, again, the removal of incentives packages and the reduction of their access to different systems and different pieces of information. There have been a lot of accusations that this is yet another example of how this administration has created an internal culture of fear and anxiety - that it's trying to push people into becoming loyal, more or less, or leaving the administration or the White House specifically, because there is a clear and present threat to their jobs if they don't. 

Dave Bittner: It's interesting to me because it seems as though cybersecurity is one of the rare things these days in government that has broad bipartisan support. It's recognized as being important and, in many ways, has not been politicized - though certainly, in some ways, it has. So I think it's interesting to see these reports, from within the White House itself, of the politicization of cybersecurity. 

Christopher Whyte: Yeah, absolutely. And it may be that we're talking today about that unique instance in government - although, unfortunately, I don't think it is all that unique - but where the concerns related to everyday cybersecurity operations pertain very directly to the political agenda of our head executive - the president. One of the major concerns that the Trump White House certainly has - and I think, to some degree, legitimately has - relates to leaks of memos, accounts of conversations over and above what you might traditionally expect of leaks heading out the door via conversations with, you know, the White House press pool. 

Dave Bittner: Right. 

Christopher Whyte: There's a concern that documentation and transcripts are now making their way rapidly into the media - relating to, you know, one political issue or another or one policy issue or another - in a manner that at least the administration sees as remarkably unprecedented. And so from that point of view, again, taking steps and making efforts to actually crack down on the potential for information leakage makes some substantial sense, whether or not you agree with it, even as it interferes with the cybersecurity operations of the White House staff. 

Dave Bittner: Is it fair to say, I mean, that that is different from espionage? In other words, leaks from White House staff to the press are different from a nation-state trying to break into White House networks. 

Christopher Whyte: It absolutely is. 

Dave Bittner: Or is it a distinction without a difference? 

Christopher Whyte: I think it - no, it is different. I think it is different. We're talking about, you know, the insider threat here. I suppose the question there is, does this merger and this staff reduction - inevitable staff turnover, I suppose - does this actually reduce the cyber resiliency of White House operations with regard both to foreign adversaries and this, you know, insider threat that the Trump administration still seems particularly concerned about? It certainly does the former. I think it reduces that ability - at least in the near to medium term - even if you buy the argument about redundancy and inefficiency and the need to streamline the whole process, because the White House is going to have to refill staff positions; it is going to have to go through that institutional restructuring process that often happens. I mean, it happened very famously, of course, with DHS after 9/11, where DHS came into existence - 22 different organizations from across the government forced to work together for the first time. 

Christopher Whyte: Even on a small scale like we're seeing here, there's always a need to reconcile job responsibilities, mission statements, that kind of thing. And so even in the near term, I think it's a reasonable argument that the ability of the White House to actually effectively prevent foreign compromise of even just unclassified networks is going to be reduced by this move - again, at least in the near term. On the latter point, I'm actually unclear. I'm unclear whether or not this will actually help that insider threat mitigation mission that the Trump administration has clearly undertaken since the earliest days of their being in office. 

Dave Bittner: Yeah, it's fascinating to me how this has sort of, in my mind, revealed how much of this stuff really just relied on norms, on leaders aligning to norms. In other words, the president's security staff would recommend to the president - you know, Mr. President, here's the things we need to do; this is the type of phone we need you to use; this is the - how we need you to secure your communications. And I think historically, the presidents would go along with that. And here we seem to have a president who wants to do his own thing and leaves us wondering. There's no one who can say no to that. There's no one who has the authority to say no to that. And that could leave us with some national security issues. 

Christopher Whyte: Right, absolutely. I mean, this is not unusual in some senses in that it's not unique to the Trump administration. There's the very famous conversation that Obama had with the Secret Service about the use of his BlackBerry. And they came to a compromise of sorts with regards to the BlackBerry usage. 

Christopher Whyte: The thing that is unusual about the Trump administration, of course, is the unprecedented fashion in which even the most basic information security practices have, at some points in time, been thrown out the window. It's commonly reported, of course, that President Trump still uses a personal unsecured iPhone. There are definitely individuals who have close access to the president who are using unsecured, unaccounted for devices. I mean, we had the recent story about Rudolph Giuliani going into an Apple Store because he couldn't unlock his iPhone. That's a... 

Dave Bittner: Right, right. 

Christopher Whyte: ...Information Security 101 things you shouldn't do right there. And of course, you have a number of accusations, just to cover the kind of media-friendly pieces here of, you know, Jared Kushner and Ivanka Trump and others - again, people with close access to the president - and I think more importantly for our conversation here, people that are constantly physically in the White House and using White House networks using private email addresses and private servers, opening up a range of different threat vectors that could be compromised, could be manipulated, taken advantage of and exploited by foreign adversaries. 

Christopher Whyte: And yeah, the question about norms is a fascinating one. Right? To what degree does this have to be accepted simply because, at some point in time, the executive - i.e., Trump in this case - makes a decision and there's very little that those underneath him can do about it if he's adamant. 

Dave Bittner: Right. 

Christopher Whyte: And then to what degree does this actually become the normal mode of operation? I think it's highly likely that in whatever the next administration is - whether that's in a year or two or whether it's in, you know, five years or so - you'll see, yet again, another restructuring of the institutions of cybersecurity within the executive branch. Whether or not they walk backwards to what we had under the Bush and Obama administrations - with kind of, you know, a cyber czar position running the show but an increasingly - I don't want to say fragmented - increasingly diverse landscape of offices and teams seconded to very specific network defense purposes within the organization as a whole. 

Christopher Whyte: Whether or not that's actually the move that will occur, I don't know. I think, to some degree, it depends on the length of time that this current system will persist - will actually be in place. If this system is in place, interestingly enough, for only (ph) the next year-and-a-half or so, and then the current president doesn't win reelection and someone else comes into office, it's possible that we may not see the negative externalities related to this kind of move - because maybe there's not enough time in that year-and-a-half or so for us to experience the kind of breach that typically prompts public attention on an issue and then makes the political establishment adapt in some way - appoint some new people, put a new office into existence, that kind of thing. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: First of all, very interesting interview. And thank you, Professor Whyte, for coming on "Caveat." The first thing I thought of was that "Seinfeld" episode where George Costanza got offered a job at a playground equipment organization... 

Dave Bittner: Yeah. 

Ben Yelin: ...On the false premise that he was disabled. They found out he wasn't disabled. And so they couldn't fire him, so they found a bunch of ways to make his life miserable so that he would quit voluntarily. 

Dave Bittner: I see. 

Ben Yelin: Seems like that's what's going on with the career professionals within the executive branch... 

Dave Bittner: Yeah. 

Ben Yelin: ...As it relates to cybersecurity... 

Dave Bittner: Right. 

Ben Yelin: ...Which obviously has very negative policy implications for our country. I mean, it is sort of depressing to hear how politics have infiltrated our executive branch when it comes to what were previously nonpartisan issues. You know, I think part of that is inevitable because everything is partisan these days. But also, as the professor mentions, it's largely due to the circumstances of President Trump himself related to Russian interference. You know, that was a cybersecurity issue, so it's something that he's sensitive about. And then, you know, in terms of the latest scandal, the fact that the conversation that he had with President Zelenskiy of Ukraine was put on this server intended to handle the most classified information - that sort of became part of the scandal. So I think some of the personnel decisions might just be reflecting those - I don't want to say paranoia - but... 

Dave Bittner: Mmm hmm - the mindset... 

Ben Yelin: ...Sensitivities, yeah. 

Dave Bittner: The mode in which they are operating... 

Ben Yelin: Absolutely. 

Dave Bittner: ...In this particular administration, yeah. He brought up a really interesting point about how we as a nation express our capabilities when it comes to security in the cyber realm. And it got me wondering - you know, is there a cybersecurity equivalent of an aircraft carrier? 

Ben Yelin: Or like the cyber F-35 bomber that we can - yeah. 

Dave Bittner: Right, right. But an aircraft carrier is an expression of our military might that we can place all around the world and that has real-world implications. It has real-world effects. 

Ben Yelin: A deterrence effect... 

Dave Bittner: Right. 

Ben Yelin: ...Yeah, absolutely. 

Dave Bittner: He's got me thinking. I wonder, is there a cyber equivalent of that? I can't think of one off the top of my head. So listeners, let us know if you think there is one. 

Ben Yelin: Yeah. What can be our future, you know, cyber aircraft carrier? 

Dave Bittner: Right. Is it yet to be built? 

Ben Yelin: It might be. And I don't know that there is an answer to that question at this point. I can't think of one either. 

Dave Bittner: Yeah. All right. Well, thanks to Christopher Whyte for joining us. Really interesting conversation there. Of course, we want to thank all of you for listening. And we want to thank our sponsors KnowBe4. If you go to kb4.com/kcm and check out their innovative GRC platform - that's kb4.com/kcm - you can request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.