Crypto crumple zones — Research Saturday
Dave Bittner: [00:00:03] Hello everyone, and welcome to the CyberWire's Research Saturday presented by the Hewlett Foundation's Cyber Initiative. I'm Dave Bittner, and this is our weekly conversation with researchers and analysts tracking down threats and vulnerabilities, and solving some of the hard problems of protecting ourselves in a rapidly evolving cyberspace. Thanks for joining us.
Dave Bittner: [00:00:26] And now a moment to tell you about our sponsor, the Hewlett Foundation's Cyber Initiative. While government and industry focus on the latest cyber threats, we still need more institutions and individuals who take a longer view. They're the people who are helping to create the norms and policies that will keep us all safe in cyberspace. The Cyber Initiative supports a cyber policy field that offers thoughtful solutions to complex challenges for the benefit of societies around the world. Learn more at hewlett.org/cyber.
Dave Bittner: [00:01:02] And thanks also to our sponsor, Enveil, whose revolutionary ZeroReveal solution closes the last gap in data security - protecting data in use. It's the industry's first and only scalable commercial solution enabling data to remain encrypted throughout the entire processing lifecycle. Imagine being able to analyze, search, and perform calculations on sensitive data, all without ever decrypting anything. All without the risks of theft or inadvertent exposure. What was once only theoretical is now possible with Enveil. Learn more at enveil.com, and be sure to check out Enveil at the RSA Early Stage Expo, Booth 32.
Charles Wright: [00:01:47] The problem is that there's this kind of central tension between governments who want access to data, specifically to communications data.
Dave Bittner: [00:01:57] That's Charles Wright. He's an assistant professor at Portland State University, along with Mayank Varia, a research associate professor at Boston University. They're co-authors of a new paper, "Crypto Crumple Zones: Enabling Limited Access Without Mass Surveillance."
Charles Wright: [00:02:14] Law enforcement, national security kind of purposes. Sort of the same way that they have the lawful ability to, for example, get a wiretap on a traditional telephone line, now they're starting to realize that they would also like access to, you know, encrypted internet calls.
Charles Wright: [00:02:30] And as encryption becomes much more commonplace, and more people realize that we need this in our everyday lives to protect ourselves from cyber criminals and data theft and all these other threats, the way we've built our systems leads to this inherent conflict between our tools, which are built to give you confidentiality and privacy from everyone, and the governments saying they have a legitimate need to get access to this kind of data.
Charles Wright: [00:02:57] And just the way that we've been constructing our systems so far, it's really been an all-or-nothing deal - it's black or white. There hasn't been a way to build a system that allows only legitimate, lawful, warranted access, but at the same time keeps everybody else out. And so we'd like to get towards more of a middle ground. And we think this paper is hopefully a first step toward something like that.
Dave Bittner: [00:03:21] So Mayank, I think we've seen some high profile cases where law enforcement has said that, you know, they would - they need to be able to decrypt things to do their jobs, and for national security and public safety and so forth. And yet, we've also reached this point where encryption isn't really an exotic thing that's difficult to do, so it's become routine for all of us to use encryption on a daily basis. Certainly it's a fundamental part of the Internet and the things we do there. And I think there's been this - sort of like Charles said - there's been this all or nothing that's ended up with people sort of taking one of two sides. And there's the law enforcement side and then there's the encryption side of this debate. Explain to me, how can we reach a middle here? Is this not an all or nothing?
Mayank Varia: [00:04:07] It's a great question. So I think that, you know, the way that cryptographers have traditionally defined the notion of encryption - and it's something that I put on my first day of class lecture slides and most others do too - is encryption as a thing where there are three possible participants in the world. There is the sender of a communication, there's a receiver of a communication, and then there's an outsider who is trying to eavesdrop or somehow gain access to this.
Mayank Varia: [00:04:37] And this is the full state of the picture. There are the two legitimate parties conducting the transaction, who encryption is supposed to permit - from a functionality point of view - to actually be able to send and receive messages reliably. And there's outsiders, who are just universally supposed to be excluded from access to the contents of this message.
Mayank Varia: [00:04:56] And essentially what we are thinking about in this paper is, rather than just considering one category of every possible outsider, to split our concern into maybe two categories of outsiders - and there may even be more. One is the kinds of people who are cyber criminals or, you know, so-called hacker types - people who, I think most people would agree, the goal of encryption is to keep out of being able to access the contents of encrypted data.
Mayank Varia: [00:05:27] And the other is the law enforcement apparatuses of nation-states, for whom the answer to whether they should be able to read the contents of encrypted messages is a lot less clear. I don't know whether the answer is universally yes or universally no. But it might at least be a different answer than the one we give when we think about a cyber criminal as the outsider snooping on encrypted traffic. And so the question that we pose at the start of this paper is: is there some sort of mechanism that we can use when designing an encryption scheme to distinguish between the law enforcement outsider and the cyber criminal outsider?
Mayank Varia: [00:06:08] The specific mechanism that we use in this paper to distinguish between the two is economics. The law enforcement organization may have more resources at its disposal than a cyber criminal, and resources that it is willing to spend in pursuit of some benefit that is not a profit-motivated endpoint, whereas cyber criminals are driven by a profit motive, a risk-reward calculation.
Mayank Varia: [00:06:36] That maybe there are things that is in the public interest to read, but are not worth perhaps the resources on their own. If we can make a sort of cryptographic puzzle that is solvable, but at a very high cost, then a law enforcement entity could judiciously choose when to use this limited capability to recover the contents of a few messages and maybe we can design the system in such a way that it's possible for law enforcement to do this, while simultaneously not being of value for an other kind of outsider like a cyber criminal organization to do so.
Dave Bittner: [00:07:12] And so when we talk about costs, are we talking about dollars and cents, are we talking about time, or a combination of the two?
Mayank Varia: [00:07:20] Essentially, in the way that the current paper is written - which is not necessarily the way that has to be, but just the way that we've currently written it - when we describe costs, what we mean are computing efforts. So the money required to build and operate, you know, the electricity consumption required to operate a computing rig. So it's not, like, money in the sense of dollars flowing from one person to another, but in the sense of resources, you know, computational and energy resources expended in order to recover the contents of the message.
Dave Bittner: [00:07:51] So, Charles take us through conceptually what we're talking about here. I think when many of us think about the ability to decrypt things, the popular notion that's been put out there is key escrow. And this is something different from that.
Charles Wright: [00:08:06] Yeah. We talk a little bit in the paper about escrow, and some of the limitations of that approach. There's been a lot of good work in our community analyzing and looking at some of the issues there, and there's a really great paper from Hal Abelson and a lot of the real rock stars of security and crypto research in the '90s and '00s.
Charles Wright: [00:08:26] One of the major issues with key escrow is that - you know, when we talked earlier about this all or nothing problem we had, where there wasn't really a middle ground - and key escrow falls victim to that, where really it's more on the "all" end of the spectrum, where if you have escrowed keys, the idea there was that, if you're going to encrypt a message, you need to give the authorities some way to open it back up.
Charles Wright: [00:08:49] And the approach there was, well, you take your encryption key, and you also encrypt it with a key that's published by the authorities, or maybe by some trusted third party like your ISP, or your device provider, or something, so that only they could open it up. And you encrypt that key and you send it along with your message. So if anybody else sees it, they can't decrypt the key and they can't open up the message.
Charles Wright: [00:09:14] But the authorized third party - your ISP in this case, or your device provider, or law enforcement, you know, the FBI themselves - they can go and first decrypt your key for the message and then use that to unlock the message. And there's really nothing to stop them from just going and applying that on every single message that they get. And so it requires a lot of trust that it won't be abused or misused, that the authorized users won't get overzealous or won't be corrupt, and also that they will be careful and competent enough to keep this capability - whatever allows them to unlock all those keys - secret, and that they won't lose it and let some third party grab it, because then that third party could go and open up everything from everybody as well.
Dave Bittner: [00:09:58] Right. You end up with this sort of "who watches the watchmen" kind of situation, potentially.
Charles Wright: [00:10:03] Exactly.
Dave Bittner: [00:10:04] So your approach is using what you describe as moderately hard puzzles, and you talk about crumpling puzzles and abrasion puzzles. Describe to us what's going on here, and how it works.
Charles Wright: [00:10:16] The crumpling puzzle is maybe the easier one to get started with. So there the idea is that, normally, when we design an encryption scheme, a cryptographic key is just a long string of bits, and it's randomly chosen from such a huge space of possibilities that it's almost impossible to guess what it would be, and it's virtually impossible to try all combinations to find the right one. So, normally, we're talking, you know, something like two to the 128, two to the 256. These are enormous numbers.
Charles Wright: [00:10:46] So, for comparison, a million is only about two to the 20. And so our approach is, well, what if we shrunk down that space and made it not two to the 128 possibilities, but something smaller that is within the realm of somebody big and powerful being able to pay for all the electricity to do the brute-force search. And so we ran some numbers and created our scheme so that the keys are derived from the original key generated by the application - Skype, or WhatsApp, or Signal Private Messenger, or whatever the encrypted communication app is. Normally, right now, these apps are generating long, random, 128- or 256-bit keys.
Charles Wright: [00:11:26] So we used that as our initial secret to pick one of a much smaller number of keys - so, for example, maybe 2 to the 60 or 2 to the 70. And we do that in such a way that the brute-force search of trying all the possibilities looks a whole lot like the function that's used for bitcoin mining right now. And so then we imagine that a law enforcement agency like the FBI or MI5, or whoever it would be, would go and build hardware that looks a whole lot like a bitcoin miner they can get off the shelf, and do this search through, say, 2 to the 60 or 2 to the 70 possibilities in about the same time - and, most importantly, with about the same amount of electricity - that a current bitcoin miner takes.
Charles Wright: [00:12:09] And so, based on those numbers, we looked at, you know, the best, most efficient bitcoin miner you can go and buy right now, and it looks like a key space of about 2 to the 60 can be searched - if you're buying your electricity in the cheapest region of the US - for about a thousand dollars. And if you crank that up to 2 to the 70 - multiplying by 2 to the 10, which is about a thousand - no surprise that now you're spending about a million dollars on electricity.
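The crumpling idea Wright describes can be sketched roughly in code. This is an illustrative toy, not the construction from the paper: the hash-based derivation, the 2 to the 60 parameter, and the per-guess cost (back-derived from the roughly $1,000 figure quoted above) are all assumptions made for the sake of the example.

```python
import hashlib
import os

CRUMPLE_BITS = 60  # reduced key space: 2^60 possibilities


def crumple(session_key: bytes) -> bytes:
    """Deterministically map the app's full-strength session key into a
    2^CRUMPLE_BITS space, then re-expand it to full length for use with
    an ordinary cipher. Brute force becomes possible, but costly."""
    digest = hashlib.sha256(b"crumple" + session_key).digest()
    small = int.from_bytes(digest, "big") % (1 << CRUMPLE_BITS)
    return hashlib.sha256(small.to_bytes(8, "big")).digest()


# Cost scaling of the brute-force search. The per-guess cost here is
# simply back-derived from the ~$1,000-for-2^60 quote, so it is an
# illustrative assumption, not a measured number.
COST_PER_GUESS = 1_000.0 / 2**60


def brute_force_cost(key_bits: int) -> float:
    """Electricity cost to exhaust a key space of 2^key_bits guesses."""
    return COST_PER_GUESS * 2**key_bits


key = crumple(os.urandom(32))        # full-length key, small entropy
print(brute_force_cost(60))          # 1000.0
print(brute_force_cost(70))          # 1024000.0 (2^10 times more)
```

Note how the two quoted price points fall out of a single per-guess cost: adding ten bits multiplies the search cost by 1,024.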
Dave Bittner: [00:12:37] And so, the notion is that that expense is what's going to be the bottleneck, if you will, for folks to be able to decrypt things.
Charles Wright: [00:12:49] That's one of them. That's the main bottleneck that would limit abuse or misuse of the system by an authorized party like the FBI or MI5 or whoever it is. We also have this bigger gatekeeper puzzle, where you have to solve some really, really difficult problem that we call the abrasion puzzle. And this uses some public key cryptography - we won't go into all the details now. We leveraged a recent attack on some public key crypto, and we think, based on some numbers that we read in the literature and some back-of-the-napkin math that we did, we can make that one cost anywhere between about 150 million dollars up to maybe two billion dollars.
Charles Wright: [00:13:27] And the idea is then that someone like the FBI, who is in charge of national security and criminal investigations, would spend that money upfront to pre-compute a bunch of material that they can then use to solve simpler problems that we bake into the key generation algorithm. And we make that a necessary component for them to derive some secret information for each of those little keys that we're going to use on each message, before they can go and do the brute-force search to get the crumpled key.
Dave Bittner: [00:13:55] So, in other words, the cost of entry to even have access to the simpler puzzles is a big puzzle. And so, that way you're making sure that really only, for example, nation-states would even have access to the simpler puzzles?
Charles Wright: [00:14:10] That's the idea. Yeah. So, without the abrasion puzzle, if we make each message cost, say, a thousand dollars to recover, then if there's some message out there that I think is worth ten thousand dollars, I'm going to spend the thousand and I'm going to profit by nine thousand, right? On the other hand, if we have the abrasion puzzle there, then the total cost to get that one message would be, say, two billion, one thousand dollars. And now that ten thousand dollar message is not worth it anymore.
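The deterrence argument here is simple arithmetic. A minimal sketch, using the dollar figures quoted in the conversation; the function and its structure are hypothetical, just to make the rational-attacker calculation explicit:

```python
# Illustrative figures from the conversation, not measured costs.
ABRASION_COST = 2_000_000_000  # one-time gatekeeper cost
CRUMPLE_COST = 1_000           # marginal cost per message recovered


def worth_attacking(message_value: float, already_paid_abrasion: bool) -> bool:
    """A profit-motivated attacker decrypts only if the message's value
    exceeds the total cost they still have to pay."""
    upfront = 0 if already_paid_abrasion else ABRASION_COST
    return message_value > upfront + CRUMPLE_COST


# A $10,000 message is worth recovering for an entity that has already
# paid the abrasion cost (e.g., a funded law enforcement agency)...
print(worth_attacking(10_000, already_paid_abrasion=True))   # True
# ...but not for a cyber criminal facing the full cost of entry.
print(worth_attacking(10_000, already_paid_abrasion=False))  # False
```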
Dave Bittner: [00:14:34] Mayank, you all have a list of requirements that you think would be necessary to make this feasible, and it's something that key escrow falls short on. Can you take us through what these requirements are?
Mayank Varia: [00:14:48] So, first of all, as Charles was saying earlier, there are some issues with key escrow. Because encryption so far has been this all-or-nothing thing, key escrow effectively gives the government apparatus a skeleton key which gives it access to everything. That does not, on its own, prevent - at least technologically - any kind of massive bulk surveillance. The same escrow key that the government can use to open targeted messages can also open arbitrarily many messages.
Mayank Varia: [00:15:24] And the only limitation on that would be any kind of apparatus that exists within the government to, you know, restrict its use and to prevent fraud and abuse kinds of practices, whereas, in our system, one of the requirements is for this system technologically to prevent sort of bulk, mass-scale surveillance, and it does so with the fact that the crumpling puzzle poses a marginal cost on every single message transacted.
Mayank Varia: [00:15:52] The other requirements we have - which key escrow also does not meet on its own - are that we want to keep the system as simple as possible, both for users to use and for developers to implement. And in particular, that neither of them ever needs any direct line of communication with the government law enforcement apparatus at all. So no sending of escrowed key material, or anything else for that matter; maintain the kind of user workflow that exists today; and minimize the number of new lines of code needed to implement our puzzling techniques.
Mayank Varia: [00:16:26] And the final requirement that we have is to maintain all of the cryptographic best practices we have developed over the course of the last few decades: being able to design schemes that simultaneously provide both confidentiality and integrity - a technique called authenticated encryption, which is very much in use now - and being compatible with other kinds of techniques that we use to protect key material, such as hardware-based systems like hardware security modules, or any other mechanism to protect key material locally. And being compatible with notions like perfect forward secrecy, in order to limit the possible damage to one's privacy that can happen if your computer is ever compromised and falls into the wrong hands.
Mayank Varia: [00:17:14] So we sort of want to limit the ability, technologically, for mass-scale abuse, limit the increase in system complexity required to implement the system and to maintain the system, and finally to maintain cryptographic best practices.
Mayank Varia: [00:17:30] With that having been said, if we step back for a moment and we think about what key escrow was trying to accomplish, at a very high level it was trying to accomplish the same exact goal that I said our system was trying to accomplish, in the sense of when we think about the intended recipients of an encrypted communication, and then all of the various forms of outsiders, it was trying to find a way to distinguish between the government law enforcement outsider and the cyber criminal outsiders. It was trying to find a different way to distinguish between them, right? By possession of this sort of skeleton-key-like material. Whereas we have a different mechanism, which is to distinguish them via economics.
Mayank Varia: [00:18:09] And I put them under the same framework here to make two points. One is that the two ideas can be used together, in which case one would get the strengths of both put together. And the second reason I mentioned this is to say these are just two different ways to distinguish between law enforcement outsiders and every other type of outsider, like a cyber criminal. And maybe there are many different other ways, many other dimensions, in which one can distinguish between these two types of entities, even beyond, you know, our paper or key escrow for that matter. Maybe they could be lumped together as well. And the more ways one has to distinguish, then the stronger such a system might become.
Dave Bittner: [00:18:50] So how do you account for things like Moore's Law and, you know, coming quantum computers where, presumably, the cost of computation is going to go down?
Mayank Varia: [00:19:00] Very good question. We discuss in the paper that Moore's Law is definitely a concern for the approach that we propose here in terms of the economics. One way that we propose to deal with it is to make the strength of these abrasion puzzles and crumpling puzzles tunable over time, so that anyone implementing such a system in practice would continuously increase the size of these parameters to keep up with Moore's Law. That's comment number one.
Mayank Varia: [00:19:34] Number two is to set the scale of these parameters proactively, based on Moore's Law. What I mean by that is: if you think a cost of a thousand dollars is an effective deterrent for a cyber criminal today, and you want to withstand this kind of attack for a period of ten years or so, you should design the parameters of the puzzle so that, even ten years from now, with the advances of Moore's Law, it will still be an effective deterrent.
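That proactive sizing reduces to a one-line calculation, assuming (as elsewhere in the conversation) that computation halves in cost roughly every eighteen months; the exact halving period is an assumption, not a figure from the paper:

```python
import math


def extra_bits(years: float, halving_years: float = 1.5) -> int:
    """Extra key bits needed so a brute-force search still costs as much
    in `years` as it does today, if compute cost halves every
    `halving_years`: each halving is offset by one added bit."""
    return math.ceil(years / halving_years)


# To keep today's deterrent effective for a decade, add about 7 bits
# (e.g., grow a 2^60 crumpled key space to roughly 2^67).
print(extra_bits(10))  # 7
```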
Mayank Varia: [00:20:05] And the third thing I would say in terms of combating Moore's Law comes back to my previous point about combining this economics-based distinguisher with other forms of distinguishing which are not necessarily economics-based, which reduces the dependency of the system upon Moore's Law. So a defense-in-depth approach that combines many different kinds of distinguishers could be one way to handle the concern.
Mayank Varia: [00:20:31] The question about quantum computers is somewhat similar, except it doesn't have so much of the regular inflation rate, so to speak, that Moore's Law has. It seems to be something that might be more of a big cliff that, once quantum computers exist, then that enables a variety of new tasks that were not possible before.
Mayank Varia: [00:20:51] And with regards to quantum computers, I think one thing that the cryptography community and many other communities within computer science have already been thinking about is: what are the kinds of problems that continue to remain difficult, even in the presence of quantum computers? Using those kinds of problems as the basis for crumpling or abrasion puzzles could be a way of making sure that this system withstands even quantum computers. It is true that quantum computers make many problems easier, but it is also the case that there are many problems that quantum computers either don't make much easier at all, or for which we know fairly well how much easier they make them, and so we can account for that in the analysis.
Dave Bittner: [00:21:33] Charles, are there any areas where you sense that perhaps this approach isn't the best approach, or maybe it comes up short? Have you found any areas like that?
Charles Wright: [00:21:44] Well, sure. This system was designed to be used more or less in representative liberal democracies where the government and the law enforcement work for the people. And so you can imagine if we deployed something like this in, say, North Korea, there's really nothing to stop a guy like Kim from going and spending all the resources he wants to track down people that are criticizing him and, you know, let thieves and murderers or whatever run free. That is not at all how we were hoping the system might be used and so, you know, the benefits of having encryption in that case are mostly nullified, and you get all the drawbacks of earlier approaches like key escrow.
Charles Wright: [00:22:24] And so I think that's the major case where I think our ideas really wouldn't help much. I guess it's debatable about about other countries. Maybe it depends on your opinion of the various governments, whether you think it would be an acceptable risk to give them this kind of capability or not. I'm sure there's many, many different opinions out there. It may also be a great difference of opinion on how high the price should be for any particular government. And I think that's, you know, that's a question for society more than for us. We're just hoping to maybe find some mechanism that'll get us to a middle ground that we can have this public debate. We're just saying, hey, it would be cool if we had a dial at all.
Dave Bittner: [00:23:04] Yeah.
Charles Wright: [00:23:05] And then, once we have it, then we can all debate about where - you know, how high up we should - you know, should we dial it up or dial it down? And hopefully there can be some more of a middle ground. Maybe not quite consensus, but a little more agreement than just a black and white issue that tends to divide people and get us riled up against each other.
Charles Wright: [00:23:21] The other case where we really don't provide much help is for a really, really high-value target. And so, as Mayank was saying a minute ago, we provide some provable security for messages that are worth less than the total cost to recover them. And we don't give any guarantees at all for a message whose value is more than that. And so you have a guy like Snowden - you can imagine that a government would want to spend the resources to track him down. You know, maybe candidates for high-level office might also fall into this category. It may not be safe for them to use something like this.
Charles Wright: [00:23:55] And there's probably some other good examples, you know, maybe the CEO of Apple or Google, maybe their messages are worth enough money to motivate somebody spending these kind of resources to go and get them. And I think that's more of an interesting area for future work. I guess my hope right now is that we're providing some technologies that encryption providers can use when they are legally required to provide some sort of mechanism to give the authorities access.
Dave Bittner: [00:24:20] Right.
Charles Wright: [00:24:20] I think if it comes to a point where there are new laws and regulations being put in place, there's a lot of risk in that. And there's a lot of risk to having a situation where the technology continues to move fast and the regulations don't keep up. It's not clear how that might turn out, but there's a lot of risk, I think.
Dave Bittner: [00:24:39] One of the things that comes to mind with me is the granularity of the data. In other words, you know, if for example, if someone got a search warrant to search my house, you know, someone convinced a judge that, you know, I had done something bad and they could come basically search through my house to find what they might find. Well, that warrant covers the whole house, and I could see people saying, well, you know, if we have to spend ten thousand dollars per text message, for example, but we don't know which text message is the one that says, you know, "I'm the murderer." So do you get where I'm going with this? Wouldn't it be nice to be able to spend X amount of dollars and know that I'm going to be able to unlock the whole phone?
Mayank Varia: [00:25:23] I very much like the analogy to a warrant on your house, because I think it is very similar to the inspiration that we have in this paper, which is to say that the warrant for your house is not something that is impossible for government to obtain. It's also something that the US judicial system renders effectively impossible to abuse at mass scale - they cannot simply request a warrant for everybody's house when they have a high-value investigation and just go on a fishing expedition.
Mayank Varia: [00:25:58] So it basically - and the, you know, executing a warrant on the house, even if they did, even if law enforcement somehow magically got a warrant to everybody's house, they wouldn't even be able to execute that, right? Because actually going through and searching your house costs money. It costs time, right? And they have limited resources, and that forces them to be focused in terms of what kinds of things are of most value in terms of, for the amount of time that they have at their disposal, what is the most social good they can do for that. And what are the kinds of cases and what are the kinds of people that are worth pursuing in order to be most worth the time to do so.
Mayank Varia: [00:26:40] And what our work is trying to do is to say maybe we can try to emulate that kind of thought process in the digital domain. So rather than making the marginal cost for law enforcement to recover information zero, and rather than making the marginal cost effectively infinite, can we make the marginal cost somewhere in the middle that it is possible but onerous to recover information? In such a way that it forces law enforcement to be judicious about its use of its limited funds to recover the messages of highest value, but while still actually permitting them to do that in order to deal with high-value investigations. To close the cases on the high-value investigations.
Mayank Varia: [00:27:20] And to your question about whether the cost needs to be per message, the answer is no. So far we've described it, for simplicity, as just these two levels of puzzles: one level of an abrasion puzzle, and one level of a crumpling puzzle. But in the paper we discuss that we need not limit this to just two levels of alternation. We do so in most of the paper just for simplicity of exposition, but one can consider many more levels of marginal cost per... fill-in-the-blank.
Mayank Varia: [00:27:53] So you can have one cost per geographic region, so that, you know, you have to do an abrasion puzzle to even unlock the ability to recover messages in a particular geographic region, and that can be expensive enough that there's limited ability to spend the money recovering messages in regions other than your own.
Mayank Varia: [00:28:11] There can be a cost per particular software product that you want to be able to, as a law enforcement organization, acquire the ability to read contents of. There can be a cost per user that law enforcement targets. There can be a cost per pair of users, or per communication session that is targeted. And finally, a marginal cost per message.
Mayank Varia: [00:28:32] And maybe, to the point that you raised in your question about making sure that maybe law enforcement should have the ability to read a full contents of a person's phone, if that is - if, again, as Charles said, if that is sort of the way that society decides to go about enforcing this - then you could make a high cost, a high marginal cost per user targeted or per phone targeted, but a low marginal cost per message or file targeted within that device so that, you know, you can you can tune these costs as you see fit in order to appropriately handle the tradeoff between giving law enforcement limited access to the contents they need in order to pursue investigations, while simultaneously not giving overreach into more contents than necessary in order to fulfill their obligations.
Charles Wright: [00:29:19] One other thing to add to that - this ties into what we were saying about Moore's Law. Normally, we say that Moore's Law makes computation cheaper over time, so in a way we usually looked at that as kind of a limitation or a weakness of our scheme. But if you look at it from a slightly different perspective - and I think you mentioned, you know, maybe a high-value investigation like a murder case - well, there's no statute of limitations on murder.
Charles Wright: [00:29:45] And so, if I have all these text messages and I, you know, I'm pretty sure within one of these twenty or thirty or a hundred there's going to be an incriminating message that's going to make my case that, yes, this guy did it, now we're going to be able to get him. Well, you know, if Moore's Law makes computation 50 percent cheaper every eighteen months, I can wait three years, and now the cost is only 25 percent of what it was originally. And that may be enough to go back and solve some cold cases for a cost that is bearable for an important thing like a murder investigation. Whereas if we could prevent another 9/11, we would have spent the millions or however much it was in the beginning, but to - maybe for a shoplifting case or, you know, petty vandalism or something, it probably won't ever be worth it.
Dave Bittner: [00:30:32] So where do we go from here? You all have put this out into the world, people are going to absorb it. What would you like to see come from this?
Mayank Varia: [00:30:40] Personally, there are two things I would like to see come from this. The first concerns the current debate about encryption technology and its appropriate role in society. I think we've been stuck as a society on the question of whether it is technologically possible to achieve some sort of appropriate tradeoff between law enforcement access to contents and the right to privacy. I don't think that's the interesting question. And to be clear, I'm not saying the question is uninteresting, or that it is solved in any way by this paper. I think this paper is the beginning of an exploration of how one can balance these two, but not the end of it. It's just the beginning.
Mayank Varia: [00:31:22] But nevertheless, I don't think the question of is it technologically possible is the interesting question here to discuss. I mean, it's not the most interesting question for a society. What's the interesting question, I think, is what is the appropriate role we want as a society for law enforcement to have? And it may very well be the case that we decide as a society that, even if some sort of tradeoff in encryption is technologically possible, that we simply don't want it as a society.
Mayank Varia: [00:31:53] That's a perfectly valid answer - that, you know, the right to privacy, its importance both for individuals and as a social function, the way the right to privacy enables us to be better as a society, may be worth the cost of maybe not closing some of those cases that Charles was describing earlier. And if that is the decision that society makes, then so be it. And then, you know, our paper goes on a shelf never to be used, and that's fine.
Mayank Varia: [00:32:19] And I think that's a totally reasonable answer to this question, and I think this is the more interesting question to debate - one that's currently blocked by that question of, is it technologically possible? Rather than asking, you know, what is technologically possible with encryption, we should be asking, what is the appropriate role that we want law enforcement to have in the digital world, and what is the kind of encryption that we want in order to reinforce whatever that role is?
Charles Wright: [00:32:43] One of the things we talk about - well, we're going to be presenting this in April in London. And so, hopefully, as the rest of the scientific community kind of gets its teeth into this, other people can come up with ideas to make it better, to maybe reduce some of these limitations that we talked about earlier. You know, for example, maybe providing better protection to high-value targets, or coming up with other kinds of puzzles or other kinds of distinguishers between what we're calling a legitimate outsider versus an illegitimate, cybercriminal type of outsider.
Charles Wright: [00:33:15] And I guess at the same time, I like the idea that now we, as the security and privacy community, have a fallback position. In the past, if something really bad happened, we potentially could have been looking at a total ban on encryption, or, you know, some virtually unlimited kind of backdoor coming down either through legislation or through the courts. I think right now there are court cases where people have been held in contempt of court for refusing to disclose a password to decrypt a device, for example. That seems not a great thing for democracy.
Charles Wright: [00:33:49] And so, at least now we have some sort of a fallback mechanism. You know, if we can't have high-strength encryption available everywhere, well, maybe we have something that still provides a lot of protection. In the past, a company like BlackBerry was nearly forced out of India in, I think, 2010, and then more recently WhatsApp was temporarily banned a couple of times in Brazil. I think in both of those cases the ability of people in those countries to use those services with encryption on has been restored. But in the future, we may not always win.
Charles Wright: [00:34:22] And so this gives us a fallback, so that, rather than people in countries like that losing all ability to have encrypted communications, maybe now we have something that the next BlackBerry or the next WhatsApp could use to strike a better balance that might be acceptable.
Dave Bittner: [00:34:41] Our thanks to Charles Wright and Mayank Varia for joining us. The title of their paper is "Crypto Crumple Zones: Enabling Limited Access without Mass Surveillance." We've included a link to the paper in the show notes for this episode.
Dave Bittner: [00:34:56] Thanks to the Hewlett Foundation's Cyber Initiative for sponsoring our show. You can learn more about them at hewlett.org/cyber.
Dave Bittner: [00:35:04] And thanks to Enveil for their sponsorship. You can find out how they're closing the last gap in data security at enveil.com.
Dave Bittner: [00:35:12] The CyberWire Research Saturday is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. It's produced by Pratt Street Media. The coordinating producer is Jennifer Eiben. Editor is John Petrik. Technical editor is Chris Russell. Executive editor is Peter Kilpe. And I'm Dave Bittner. Thanks for listening.