Caveat 1.22.20
Ep 12 | 1.22.20

There is no back door.

Transcript

Andrea Little Limbago: I think encryption is a core aspect of maintaining privacy and freedom of expression and could help be foundational to democracies. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I've got a story about Congress struggling to define acts of war in cyberspace. Ben has Apple's response to the DOJ and their request to unlock another iPhone. And later in the show, my conversation with Andrea Little Limbago - she is the chief social scientist at Virtru. She's going to be speaking at the upcoming RSA Conference on the global battle against encryption. And that is the focus of our chat. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be right back after a word from our sponsors. 

Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? And it's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They're out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk, and trust us, friend, they're not pretty. So again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question, either. 

Dave Bittner: And we are back. Ben, I'm going to start things off for us this week. This is a story from The Hill written by Maggie Miller and Laura Kelly. It's titled "Congress Struggles on Rules for Cyber Warfare With Iran." I think, perhaps, Iran is the catalyst for this conversation, but I think it speaks to a broader issue of the fuzziness when it comes to both establishing what the rules of cyberwarfare are and the desire to establish what the rules of cyberwarfare are for a lot of different reasons. This story specifically goes into Senator Ron Johnson, who's a Republican from Wisconsin, along with Senator Gary Peters; he's a Democrat also on the Senate Homeland Security Committee. They are taking a look at this and trying to decide whether or not they need to take a deeper look or explore setting what some of the rules should be here. What are some of the details here, Ben? Can you sort of lay this out for us? 

Ben Yelin: Sure. So you're exactly right that the last couple of weeks in terms of our unrest, if you will, with the nation-state of Iran really is the catalyst for this renewed conversation about what constitutes cyberwarfare. And the question is important for a couple of reasons. One, we need to know whether we're at war because, you know, that triggers a whole bunch of legal implications. There are a lot of powers the government has during a state of war, whatever that state of war is. And, you know, certainly, that's something that Congress wants to define. And then there are, of course, rules of engagement - and that applies to both federal law and international law - what we're allowed to do as a kinetic response to some sort of cyberattack. And it's hard to define because I think, partially, there's a failure of imagination. It's very easy for us to conceptualize, to picture conventional warfare. 

Dave Bittner: Yeah. When there are tanks sitting on your border, you kind of know what you're dealing with. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: When there are tanks and bombs, when buildings are destroyed, when there are casualties, when there are planes in the sky. Cyberwarfare is very different, and it's difficult even for policymakers who are well-versed on these issues to picture exactly what constitutes warfare. One of the suggestions here from Senator Johnson, which I think is, you know, a good starting point, is to focus on attacks on critical infrastructure - so cyberattacks against our water systems, our electrical systems, our information security systems. One reason I think that's a good place to start is, that's the closest digital analog to targets of war that you would see in conventional warfare. 

Ben Yelin: Oftentimes in conventional warfare, a strategy is to destroy infrastructure. And so I think that can certainly be analogized to what a cyberattack against critical infrastructure would be. And I think, you know, in terms of imagining what the consequences were, what type of event would really trigger us to act as if we're at war with an adversary - it would be something that destroys our electrical system, that cuts off cellular phone service to people, that hampers our water utility for some reason. I think because the consequences of those actions are so severe, that's sort of where policymakers are leaning in terms of coming up with that definition. But as this article says and as you laid out, they're having a hard time defining that exactly. And because, of course, this is Congress, they first have to have an argument about who gets to make the decision. 

Dave Bittner: (Laughter). 

Ben Yelin: So... 

Dave Bittner: We all agree it needs to be done, and now let's fight about process, right? 

Ben Yelin: Yeah, not us, right? Yeah. 

Dave Bittner: Yeah (laughter). 

Ben Yelin: Let's punt it over to the Pentagon. 

Dave Bittner: Right. 

Ben Yelin: So the Department of Defense has taken an increased role as it relates to cybersecurity and cyber preparedness. Yeah, the Pentagon back in 2018 elevated U.S. Cyber Command to what's known as a combatant command, so that certainly raised the profile of cyber incidents as it relates to our national defense. So Senator Blumenthal, who's on the same Senate Homeland Security Committee - he's a Democrat from Connecticut - suggested that this really is the prerogative of the Department of Defense. They're the ones who should be defining this policy. Congress certainly has a role, particularly when it comes to oversight. It would be the Department of Defense that would develop the policy, and it would be Congress that figures out whether that policy is workable, that analyzes what the consequences would be of having that particular definition of cyberwarfare. But the Pentagon has not done that up to this point. And I think there is sort of a renewed effort in light of this threat from Iran to know exactly what constitutes cyberwarfare and what the rules of engagement are because - and I know this sounds like hyperbole, but it's not - it's not a question of if we're going to enter into some sort of cyber conflict; it's when. 

Dave Bittner: How much of this is a perception, correctly or otherwise - and I'm coming down, perhaps, on the side of it being incorrect - that cyber is its own thing separate from other types of warfare? In other words, if I have tanks hurling ordnance at you versus dropping it out of airplanes, your buildings are still blowing up, right? And I wouldn't say I've... 

Ben Yelin: One is war, and one is not war, yeah. 

Dave Bittner: One is war and one is not because of where it's coming from. The damage is the damage. I was having a conversation with one of my CyberWire colleagues, and we were talking about the Sony breach. And suppose North Korea had snuck into Sony in the middle of the night when no one was there - so there'd be no human - you know, no casualties - and it had burned down all the filing cabinets. And we knew it was North Korea. 

Ben Yelin: Right. 

Dave Bittner: Would that be an act of war? 

Ben Yelin: Right. I mean... 

Dave Bittner: Interesting (ph), you know? 

Ben Yelin: I completely see what you're getting at. 

Dave Bittner: Yeah. 

Ben Yelin: It's not the mode of the attack; it's the consequences. And when we're talking about things like interrupting air traffic control centers, which would potentially cause planes to fall out of the sky... 

Dave Bittner: Right. Turning off the lights or the power, which could cause people to die from medical situations - you know, all sorts of fallout there. 

Ben Yelin: Absolutely. When the consequences are potentially as severe as conventional warfare, then I think we need to treat the problem as if it were conventional warfare. I favor that kind of consequence-based approach. Now, that doesn't work in all circumstances because whether we're talking about cyberwarfare or all other areas of cyber law and policy, there are going to be some things where you can't quite analogize it to the physical world, you know? 

Ben Yelin: A lot of policymakers would prefer to simply use tort law to govern data breaches. But I think through additional research and through experience, a lot of us have realized that those types of, you know, 18th-century tort principles are not always going to be applicable. And I think that's true when we're talking about cyberwarfare. That's something that policymakers need to consider. But from a broader standpoint, that's why I think starting the conversation around critical infrastructure is so important because that's largely what conventional warfare is about. It's destroying our physical capital and our human capital. And cyberattacks certainly have the potential to do the same. 

Dave Bittner: Yeah. It's also - it seems to me like there's - part of this fuzziness comes because in the cyber domain, there's some fuzziness between an attack and espionage. 

Ben Yelin: Right. 

Dave Bittner: And there's a different response, there's a different understanding, there's a different agreement when it comes to espionage versus warfare. 

Ben Yelin: Yeah. And that exists in the physical world, too. I mean, there are statutes dealing with espionage. Now, the consequences of being a spy inside the United States on behalf of a foreign government or being a spy on behalf of a foreign government in the United States as a United States citizen are very severe. But it's a distinct area of the law from, you know, the laws of war and the laws of armed conflict. 

Dave Bittner: Right. 

Ben Yelin: And I think you could apply those same principles here largely because we're talking about the difference in consequences. Attacks, if they're going to be defined the way it seems like lawmakers want to define them, are going to be defined by their consequences. So if somebody is, for lack of a better word, loitering in our cyber world but not actually causing any overt damage, physical or otherwise, then that's more on the espionage side. But if they are disrupting our critical infrastructure, our power grids, our water systems, and you have an actual, tangible impact on living human beings, that becomes something different entirely. 

Dave Bittner: Yeah. And it's worth noting that there's nothing that says that a cyberattack requires a cyber response. If someone turned off our lights with cyber, we could respond with cruise missiles. 

Ben Yelin: Right, especially if it was a nation-state like Iran. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, it becomes more difficult and more nebulous when we're dealing with either lone-wolf actors or terror groups that aren't necessarily associated with nation-states. 

Dave Bittner: Oh, yeah, yeah. 

Ben Yelin: But yeah. I mean, there are circumstances where it would probably look a lot like conventional warfare. We would deploy weapons. We would probably deploy cyber weapons, but we would also deploy conventional weapons because we're the United States (laughter). And we... 

Dave Bittner: That's the area where we have asymmetry (laughter). 

Ben Yelin: We sure do. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: You know, and I think it's up to these lawmakers to make sure that that asymmetry continues in the realm of cybersecurity. And we're certainly not there yet, largely because the problem is relatively young and we haven't had that national consciousness to determine that the threat of cyberattacks is a real problem that requires the proverbial Manhattan Project solution. 

Dave Bittner: Oh, yeah. That's an interesting way to think about it. 

Ben Yelin: And unfortunately, it may take some catastrophic incidents to actually raise that level of public consciousness. Power went out in my house a couple of weeks ago in the middle of the night. 

Dave Bittner: Oh, yeah? 

Ben Yelin: And just because I'm a nerd about all this stuff, I'm like, huh, what if this is a cyberattack against our critical infrastructure from some nation-state? 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: And I fell asleep, and the power came back on in 20 minutes, but, you know. 

Dave Bittner: (Laughter) All right. It's something that we'll track. I think it's an interesting area, no doubt at all. That is my story this week. Ben, what do you have? 

Ben Yelin: So the crypto wars are back, and the actors are slightly changed but largely the same. 

Dave Bittner: Yeah. 

Ben Yelin: So there was this incident back in December at a naval base in Florida. It was a mass shooting. A gunman murdered three people, injured eight others at this naval base. 

Dave Bittner: Yeah. 

Ben Yelin: The Department of Justice led by Attorney General Barr requested a bunch of information from Apple because the alleged perpetrators were using Apple devices. And Apple responded lawfully. They complied with the requested subpoenas and handed over information. It turns out that one of the devices that was relevant to this investigation was locked. So this is the same problem we had with the San Bernardino shooter. 

Dave Bittner: Mmm hmm, on-device encryption - so all of the data on the device is encrypted. 

Ben Yelin: Exactly. So Attorney General Barr personally asked Apple and its CEO, Tim Cook, to decrypt the device and renewed their policy agenda, the policy push for tech companies to have these so-called back doors... 

Dave Bittner: Yeah. 

Ben Yelin: ...A method of decryption that's available only to the government in these types of circumstances. Apple responded with a very lengthy letter. It's a compelling letter. As I sort of joked to you before this podcast started, it's basically 5,000 words of saying, F you. 

Dave Bittner: In the most polite way possible to the Department of Justice. 

Ben Yelin: Yeah. That's exactly what it was. 

Dave Bittner: (Laughter). 

Ben Yelin: Now, their response was much more nuanced, but that's sort of the general tone I got from it. 

Dave Bittner: That's your professional analysis. 

Ben Yelin: My professional opinion. 

Dave Bittner: Yes, (laughter) OK. 

Ben Yelin: So they reject the first characterization in Attorney General Barr's request, which was that Apple had not been forthcoming in handing over information. And Tim Cook says, we've complied with all of the subpoenas that you've issued; we've handed over troves of information related to this suspect; what do you want? Stop bothering us. The second part of Attorney General Barr's request is this request to unlock the device. And to that, Tim Cook responded with a position that he's had since he's been the CEO of Apple, that these so-called encryption back doors are dangerous. They could be made available to malicious actors. They're not safe for users. There's no such thing as a back door just for the good guys, in his words. 

Dave Bittner: Well, Apple's making the case that they can't unlock this phone, that they don't have access to this encrypted data because they don't have a back door because they don't... 

Ben Yelin: They don't want that... 

Dave Bittner: They're not going to let (ph) a back door in there, right. 

Ben Yelin: ...Back door to exist. So yeah. Then we're back in the territory we were in back in 2015, where the Department of Justice would have to go to court to get Apple to break their own encryption, which was the nature of that dispute that never actually got resolved because it turns out the FBI in that case was able to access the information. 

Dave Bittner: They went to somebody else, yeah. 

Ben Yelin: They went to somebody else. 

Dave Bittner: They found another vendor. 

Ben Yelin: But, you know, I think if the attorney general thinks that it is in the national interest to pursue that litigation, despite Apple's middle-finger letter that they sent back to the Department of Justice, we could be back in court. And perhaps this issue could be settled in this case. I think from Apple's perspective, they think encryption is important not just for user security, but also our national security, to protect our information from malicious actors, whether they be terrorists or Iranian nationals or North Koreans. And I think you sort of fight fire with fire. If it's the Department of Justice saying, we need access to this particular device in the name of public safety, Apple comes back and says, well, in the name of public safety, I don't think that we can create this back door to break our own encryption system. And this is just sort of a stalemate that's going to continue for a while. 

Dave Bittner: We spoke with former Secretary of Homeland Security Michael Chertoff about this, and he came down on the side that it's not worth it, that encryption's too important, that the upsides to encryption on devices are more important than the access that law enforcement would have if they had some sort of back door. 

Ben Yelin: Right. That was our first aha moment on the "Caveat" podcast... 

Dave Bittner: (Laughter). 

Ben Yelin: ...When former Secretary Chertoff said that. I mean... 

Dave Bittner: Yeah. 

Ben Yelin: It was interesting to hear from somebody who not only was our secretary of Homeland Security during a very dangerous time for our country, but who's also been a professional in the field for a generation and somebody who you'd think would be partial to law enforcement, having served in government. 

Dave Bittner: Right. 

Ben Yelin: You know, I would also note that this is not a partisan issue. This request that the attorney general is making here is almost analogous to the request that the Obama administration and its FBI made in the 2015 case. So it's more of an institutional concern within the Department of Justice. One sort of interesting nugget about the story is that the president found out about it, so we did get a presidential tweet asking for Apple to decrypt the device. It was in the tone of, we've done your company a lot of favors. 

Dave Bittner: (Laughter). 

Ben Yelin: One thing you can do for us is to protect our safety and security by unlocking the device. 

Dave Bittner: Our transactional president. 

Ben Yelin: Yes. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: So, you know, that might add a new level of publicity and urgency to the case. But based on the tone of this letter, Apple is not going to back down. I mean, I think this is their most closely held principle as it relates to encryption, that they are not going to create this back door; they're not going to jeopardize users' data, and they're not going to jeopardize our national security. 

Dave Bittner: All right. Well, again - another one that we will continue to follow. 

Ben Yelin: Maybe we'll get some legal resolution on this. That would be nice for us. 

Dave Bittner: All right. Well, the Listener on the Line segment of our show actually has the week off this week. I've decided to give it a week off (laughter). But that doesn't mean that we don't want you to call in with your question. Our "Caveat" call-in number is 410-618-3720. You can also email us at caveat@thecyberwire.com. The best thing you can do is send us an audio file with your question. But also, if you want to just send us a question, you could do that, as well. And perhaps we will use your question on the air. Coming up next - my interview with Andrea Little Limbago. She is the chief social scientist at Virtru, and we are going to be discussing her upcoming RSA Conference presentation about the global battle against encryption. 

Dave Bittner: But first - a word from our sponsors. And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this. We've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is, and if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks. Neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses in a nicely automated way the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourself to the wolves and don't be nibbled to death by ducks. 

Dave Bittner: And we are back. Ben, I recently had a really interesting conversation with Andrea Little Limbago. As I mentioned before, she is chief social scientist at Virtru. You want to talk about somebody who knows her policy stuff - that's Andrea. Always an interesting conversation with her. Like I said, she's giving a talk at the RSA Conference, which is a big cyber conference coming up... 

Ben Yelin: In my hometown of San Francisco. 

Dave Bittner: ...In San Francisco, yep. And the topic of her talk is the global battle against encryption. And that is where our conversation was centered. Here's my talk with Andrea Little Limbago. Can you describe for us how the global community is addressing encryption? 

Andrea Little Limbago: It's interesting. I think it's along the lines of a spectrum with some extremes. And when I start talking about this, usually within our industry, they go, well, you know, the crypto wars have been going on forever. And I say, I think it's important to note that this time is different - and, you know, exactly for the question that you asked - the global environment is very different now than it was during the crypto wars about two decades ago. 

Andrea Little Limbago: And so what's going on now is that governments are actually passing legislation to basically require backdoors into encryption. And whatever that means - it means a variety of things, depending on the country and the government. But the overarching goal is government-mandated access to data when they want it. And so Australia passed a law almost a year ago - actually, it was exactly a year ago right now - exactly for that purpose. And I think the core difference is that when we talk about countries wanting to weaken encryption, we often think about the usual authoritarian suspects, like your China and Russia - and Kazakhstan is one that always pops up. And they absolutely are. And they're contributing and helping export that kind of model. 

Andrea Little Limbago: But it's starting to diffuse into democracies, as well. And that's where I think it's most concerning - obviously, we don't want it to be happening anywhere. I think encryption is a core aspect of maintaining privacy and freedom of expression and could help be foundational to democracies. But the fact is that it is spreading into democracies, and this debate just continues, even within high levels of leadership now within the United States and Canada, as well. I mean, you know, India is discussing these kinds of laws, as well. So it really is a trend that's going on globally for the sake of government-mandated access to data under the auspices of national security. 

Dave Bittner: Can you walk us through the Australian example? I mean, what led them to this point and how are they implementing it? 

Andrea Little Limbago: So it's interesting. You know, about a year and a half ago - I think it was late summer of 2018 - the Five Eyes governments - so the Five Eyes in national security jargon are going to be the U.S., Canada, the U.K., Australia and New Zealand. I think I got all five there (laughter). 

Dave Bittner: Yeah. 

Andrea Little Limbago: Basically, they released a joint statement, coming to a joint conclusion that governments should have access to encrypted data when they need to for national security purposes. And across each of them, initially - I think over the last few years - the link has mostly been to terrorism, and the argument has been that there's basically a dark area where governments have no access to data due to encryption. And so leveraging that, they can pull up some cases that highlight how, had they had access to this encrypted data, they may have been able to dismantle a terrorist group more easily or stop a terrorist attack. And so those have been the guiding arguments for a while. 

Andrea Little Limbago: And so those continued on. And Australia, really, was the one that started taking the lead following that Five Eyes statement that was released. And they passed the Telecommunications and Other Legislation Amendment about a year ago. And so it requires government access to data when they want it. So it's mandated. Probably the most infamous quote that came out of that was when one of their politicians noted that the laws of mathematics come second to the laws of the land. And so... 

Dave Bittner: (Laughter). 

Andrea Little Limbago: ...For any mathematician, you know, it's like no, no that's actually not true (laughter). 

Dave Bittner: Yeah. Yeah. Yeah. Well... 

Andrea Little Limbago: That's not how math works. 

Dave Bittner: I was going to get to that because I saw, you know, that we had the recent hearings on the Hill here in the States. And one of the reactions I saw was along those same lines... 

Andrea Little Limbago: Yeah. 

Dave Bittner: ...Is someone said that it's an attempt to legislate math... 

Andrea Little Limbago: Exactly. 

Dave Bittner: ...Which - but it's an important point. So from a practical point of view, how is it playing out in Australia? Being a player on the global stage, how are they seeing these mandates through and still functioning? 

Andrea Little Limbago: You know, a quick plug for the RSA talk. So my co-presenter will be Lesley Seebeck; she's the first executive director of the Australian National University's cyber program. And so she has been deeply in the weeds on that, so she would answer it much more thoroughly than I can, as she is living and breathing it right now. I don't actually know whether it's even been implemented yet. I haven't heard of any new major cases yet, which doesn't mean that it hasn't happened. It may just not have made its way over. 

Andrea Little Limbago: But at the same time, because of it, there has been enough of an outcry and discussion among the private sector that companies are considering leaving Australia due to this. And so they've got major tech companies, like Atlassian and some others, that have noted that this is going to weaken their ability to provide the services that they need to provide for their customer base. And so it is having an impact in that regard. You know, honestly, it's similar to the GDPR - but the opposite - where the GDPR applies to European Union citizens wherever they are, so it applies to U.S. companies. So it's similar for the Australian law - you know, it applies to any Australian data. And so you can imagine there are many tech companies in the U.S. that do hold Australian data. 

Andrea Little Limbago: And so that's why it's so important to think about these laws as not just staying within their own borders. You know, within a global economic environment, they impact companies globally that have any kind of services or products with those countries. And so it's interesting seeing that. And then that diffusion aspect is one part of it, but what we're also starting to see is more of a criminal underground popping up for encrypted phones and encrypted services. So, you know, it just gets to the point that even if we make it illegal for those specific national security reasons, criminals will still have access to encryption... 

Dave Bittner: Right. 

Andrea Little Limbago: ...And are still going to make their own encrypted technologies and their own phones. I think it was Vice that had a really good article on how criminals had created their own encrypted phones - I think NPC was the company - and were using those for their own communications. And so as governments go against companies that are trying to work within the legal economy, we'll basically be hurting those that are following the law and helping those that are not. 

Dave Bittner: What's the response from cryptographers? Are they throwing their arms up and saying, you know, we're not there yet? 

Andrea Little Limbago: Yeah. So the best analogy I've heard - and I wish I could remember who said it first - there have been various comments from cryptographers who are saying that they now know how climate scientists feel. 

Dave Bittner: Oh, interesting. 

Andrea Little Limbago: Right? And so I think that's a nice analogy. Like, it's a nice way to make it more solid for folks who aren't embedded in the weeds of cryptography, because when you're at a point where the climate scientists are basically, you know, 99 point whatever percent in agreement, that's where the cryptographers are as far as encryption and backdoors and whether that's in the realm of the possible. And by the realm of the possible, I mean making sure that there would be some way to do that without creating a vulnerability that others could exploit. 

Dave Bittner: Right. Right. 

Andrea Little Limbago: So I think that's a good analogy. That's the one that I've started to lean on. 

Dave Bittner: Yeah, I like it a lot. What other things are you covering in the RSA talk? 

Andrea Little Limbago: Basically, we're going to be walking through a little bit of how things are different now than they were, you know, a couple decades ago. And, again, it really is - we're seeing democracy decline across the globe. You know, the '90s were really the forefront of democracy starting to spread quite a bit more with, you know, the fall of the Berlin Wall, the fall of the Soviet Union and so forth. You know, democracy was on the rise. And so those battles were going on during an era when democracy was really starting to gain traction. Today we're seeing democracy decline across the globe for several years in a row. And on top of that, if you look at Freedom House, they put out the Freedom on the Net report every year, which has a variety of metrics for evaluating how free the internet is country by country, and it's the ninth year in a row of decline in that area. And encryption is one component of that decline - you know, I don't want to say that everything in the entire world rests on encryption. But it is a core component of safeguarding data, and it's foundational for securing data. 

Andrea Little Limbago: And the weakening of encryption is used in combination with other kinds of approaches to information control that the generally authoritarian regimes are using, which goes in combination with some of the hacks that they're doing - the disinformation, even some of the hardware that we've increasingly been seeing discussed that is being used for surveillance. It's just part of the broader authoritarian playbook. So we'll talk a little about how encryption fits within that but then go, you know, case by case across the globe, just talking about what's going on. And I think a lot of people will assume that it's something that's only been going on in, you know, China and Russia, or that the DOJ may be talking about it a little bit now but it will never happen in the U.S. But it really is gaining momentum. The United Nations has done a report on this - a couple, actually, over the last few years - and has noted that, since 2015, there really has been a rise in anti-encryption policies being passed across the globe. 

Dave Bittner: How do nations enforce this, something as readily available... 

Andrea Little Limbago: Right. 

Dave Bittner: ...As encryption? How do you crack down on it? 

Andrea Little Limbago: You know, the banning of encryption is sort of a nice way to summarize what's going on, but the reality is - obviously, the devil is in the details. What they're really trying to get, though, is access to data, and so they're not going to be cracking down on everyone and anyone who's using it - although I'll go back to that in a second. 

Dave Bittner: Yeah. 

Andrea Little Limbago: What they're generally doing is, when they want access to certain data, they target that company and mandate getting access to it. So it's very targeted as opposed to opportunistic and widespread. However, that said, there are cases - during the attempted coup in Turkey a couple of years ago, they ended up detaining citizens for using ByLock, which was a popular messaging app in Turkey. And so anyone who was even using it, they associated them with being behind the coup and were detaining them. And so that's the kind of world that we're living in now that I think - that doesn't get covered very much. 

Andrea Little Limbago: And, you know, on the one hand, we don't want to be, you know, saying the sky is falling and that everything is bad. But I think we need to be realistic about what is going on across the globe so we in the U.S. can make policies to counter that and ideally be, you know, that symbol of how to actually do this right as far as protecting data and privacy and create a counter-framework that others will want to emulate, as opposed to this other digital authoritarianism model that is spreading. 

Dave Bittner: And in your mind, what does that look like? If we could build an ideal system here, a model for the world, what would it look like? 

Andrea Little Limbago: Yeah, that would be nice. And, you know, one, I wish I had all the answers. I do not, and there are probably a lot of other people who could really provide the details on this. But for me, one core component is federal privacy legislation. A lot of privacy legislation has cybersecurity components built into it, so in general, I look at them more as data protection regulations. That's what a lot of them are, such as the GDPR. What I would like us to do is, one, learn the lessons from the GDPR about what's worked and what hasn't and actually implement some of those under more of an American design. And a lot of these laws talk about security safeguards, so it'd be great to include that and specify encryption, which is, you know, related to this conversation. 

Andrea Little Limbago: But I would like it to be, you know, much broader. I mean, we're at a point where we actually do have some momentum for a federal data privacy law. And Republicans and Democrats over the last few weeks have both introduced one, which means, you know, one, that both parties are actually interested in something and interested in hopefully solving this problem. So... 

Dave Bittner: Right. 

Andrea Little Limbago: That alone, I think, is enormous, given this time in our country. But it's also that, you know, a lot of corporations have, over the last year, year and a half, really shifted their stance away from self-regulation towards some level of regulation. And a lot of that stems from California's law that comes into effect in early January, but it also, you know, points to a growing shift within the American population towards wanting a federal privacy law. And so there are a range of factors that are getting debated and, you know, hopefully will be included that all point to the notion of individual data sovereignty. So the digital authoritarianism side really talks about cyber sovereignty a lot, which is government-mandated control of the internet and access to data. 

Andrea Little Limbago: And so I'd like to see one that focuses on individual data sovereignty and greater individual control over data. That could be everything from providing technical tools that are more usable, understandable and consumable for users to protect their data, but it also gets into the legal aspects, such as the terms of service that are just way too long for anyone to even understand what they're signing up for. And it gets into, you know, opting in versus opting out and giving the individual greater control, and various kinds of repercussions that may happen if corporations fail to adhere to the law. And I think a lot of that falls under data breach notification, which tends to be part of these kinds of laws, specifying timeframes and what must be provided. But obviously, all of that - ideally, for me - would be influenced by both the private sector and those who are the ones trying to do the right thing and protect the data, with the realities of how hard that actually is. 

Andrea Little Limbago: But also, I'd love the security community to get more involved. I mean, I think what we generally look at as the privacy community - generally, like, for broad stereotypes - are the lawyers. The security community tends to be more the technologists. This is increasingly changing, and there's a growing overlap, which I love to see. I think that's going to keep going on. 

Andrea Little Limbago: But I'd love to see influence from both of those communities to help shape this as well, so that solutions don't emerge that are either not technologically feasible or maybe just impossible for companies to enforce on the privacy side. So we need to find that good balance. And I think the nice thing in the U.S., given that we haven't had one yet, is that we have somewhat of a second-mover advantage to learn from what other countries have done and what has worked, while also having a broader debate across our country about what should be collected and what shouldn't be. What should individuals know? And I think the more that we keep seeing data being collected that we didn't really think was being collected - and by we, I mean the American population - the more interest there is going to be in helping ensure that whatever privacy law is worked on focuses on what's collected, how it's used, how it's protected and how we can empower individuals to better protect their data and incentivize corporations to do the same. 

Andrea Little Limbago: To end on a positive note, which I think is always nice to do - as almost dire as it may seem, as we hear day after day of all the data that is out there and, you know, as the encryption debate takes higher precedence here in the United States, there really is a strong movement, both amongst people and across some governments as well, pushing for data protection regulations. And encryption is going to be a core component of it. And so, you know, the EU already has that. Under the GDPR, there have been fines issued to companies for not properly protecting their data, and in those fines, they note that the companies didn't use encryption. And so the GDPR model - the core data protection model that's out there in the globe right now, until we hopefully create our own - that model is spreading. And so in order to be compliant with the European law, many other countries are also passing similar laws. And so right after the EU passed the law, Japan basically formed an agreement with the EU, and they basically became compliant with one another. So similar protections are in Japan. 

Andrea Little Limbago: In 2020, Brazil has a new law coming into effect, the general law on data protection - very similar to the EU's law, a little bit more specific in some areas. And so we are seeing sort of this counter-movement. And actually, the greatest example that I can think of is Ecuador, which has bought a lot of surveillance technology from China. Earlier this year, there was basically a leak of data on almost all of their citizens. Immediately following that, there was a rush within their government to try and pass a data protection regulation. And so that duality, I think, is really interesting, where on the one hand, you know, Ecuador is a country that we'd place in this section of leaning towards digital authoritarianism. With this leak, the pendulum is now swinging in favor of data protection. And so I think that we've seen so much of a decline in one direction that there really is a counterforce starting to grow. 

Andrea Little Limbago: And so, you know, if we can all do our part to help that counterforce grow and provide those tools and provide the laws and the capabilities to help protect data, I think that that's where a lot of interesting work will be going on over the next year. And so while it's important to be realistic about the global attacks on data and privacy, I think it's really important to note that there is this movement growing and to do what we can to help keep that movement growing and fight for greater data protection and privacy within the various laws and technologies that we have. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: I liked her sense of optimism, first of all. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, it's often easy to become cynical when many Western countries are trying to gain backdoors to encrypted devices and encrypted systems. And I think she offers some hope for people who are supportive of data privacy, which is always good. One thing that really interested me about her interview with you is her explicit connection between the general decline of democracy and these efforts to undermine data privacy, and, you know, I think that's a really apt and important connection to make. And that's something that was compelling about her comparing the landscape now to 20 years ago, when, in the wake of the fall of the Soviet Union, democracy was sort of on the upswing. 

Ben Yelin: You had all these former Soviet republics who were figuring out how to democratize. And unfortunately, we're now in an era where we seem to be sliding back the other way in a lot of circumstances. And I think she's talking about these efforts to create these backdoors as emblematic of that larger trend, and I thought that was a very compelling point. I also loved the analogy between cryptographers and climate scientists. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. 

Ben Yelin: That was sort of the... 

Dave Bittner: That's a good one. 

Ben Yelin: We know what we're talking about. People should be listening to us. And... 

Dave Bittner: Right. 

Ben Yelin: The opinion among people who are in the know is universal. It's just people on the outside who think differently about it. 

Dave Bittner: Right. It strikes me that - with math in particular, that's not just your opinion. 

Ben Yelin: Math is - yeah. 

Dave Bittner: (Laughter). 

Ben Yelin: I learned a long time ago you cannot answer a question on a math test by, you know, stating, my opinion is X. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: As she said, that's just not the way it works. 

Dave Bittner: Right. 

Ben Yelin: So I thought that was very compelling as well. She's a strong advocate for data privacy, and if you happen to make it out to San Francisco for that conference, please go to her presentation. 

Dave Bittner: Yeah. I always enjoy it when I have the opportunity to speak with her - always time well-spent, so... 

Ben Yelin: Absolutely. 

Dave Bittner: Thanks to Andrea Little Limbago for joining us. That is our show this week. We want to thank all of you for listening. 

Dave Bittner: And of course, we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time. 

Dave Bittner: Our thanks to the University of Maryland Center for Health and Homeland Security for their participation - you can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.