Caveat 4.8.20
Ep 23 | 4.8.20

Heat maps and surveillance temptations.

Transcript

Yehuda Lindell: Had the world gone in a different way and said, we will pay for our internet services and for our social networks with money rather than with our data, then maybe we wouldn't have this problem. 

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this week's show, I've got a fascinating application of anonymized data which shows travel patterns during the coronavirus pandemic. Ben's story is all about surveillance and the temptation to increase surveillance when we're in a situation like this. And later in the show, my conversation with Professor Yehuda Lindell, CEO and co-founder of Unbound Tech. We're going to be talking about government requests for data. What does that mean for your organization? While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be back after a word from our sponsors. 

Dave Bittner: And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things. You can accept it. You can transfer it. Or you can reduce it. And, of course, you might wind up doing some mix of the three. But consider this. Risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk. 

Dave Bittner: And we are back. Ben, why don't you start things off for us this week? What do you have for us? 

Ben Yelin: So my article comes from the publication The Week. And it's an article entitled "The Temptation of Coronavirus Surveillance" by Navneet Alang. Obviously, all of us have coronavirus on our minds. Many of us are isolated and quarantined in our homes, unless we're essential workers. There's a lot to worry about - not just the illness itself, but the economic effects. 

Ben Yelin: But one of the sort of tertiary areas of concern, as identified in this article, is the surveillance - the digital surveillance we are introducing during this epidemic and whether that surveillance will persist after the emergency is over. 

Ben Yelin: We've seen articles talking about broad phone metadata surveillance tracking people's movements. I know we're going to get to one such story later in this podcast. But there have been a lot of other digital surveillance methods that have been used both in this country and around the world. China, of course, which is not a democracy, has been using an application for people who want to go out in public as its coronavirus epidemic abates a little bit. People have to self-identify with a color on their application - red, yellow or green - indicating whether they've been infected, are immune or are ready to go back to the workforce. Democracies like South Korea are using location tracking to do what's called contact tracing - figuring out, for the people who test positive, who those individuals interacted with during the incubation period. 

Ben Yelin: A lot of privacy advocates - even people who are extremely skeptical of mass surveillance - are saying that this is an acceptable use of it. Glenn Greenwald, of Intercept fame, is probably the internet's foremost opponent of electronic surveillance, and I saw him on social media saying, look; in these times, you know, even I am not against unusual surveillance measures to help us fight this pandemic. And when he says it, it certainly carries a lot of weight. 

Dave Bittner: (Laughter) Is that that old, you know, there are no atheists in foxholes joke? 

Ben Yelin: Yeah, I think that's kind of where we are now. There are no civil libertarians during a pandemic. 

Dave Bittner: Right. 

Ben Yelin: Of course, there were no civil libertarians in the 10 days following the September 11 attacks. We put into place a national security apparatus and a surveillance state that persists to this day. And that emergency - it's never really over because we can't conclusively win a war on terrorism, but the acute threat has been over for a while. So the threat here is that we can introduce these tools. Companies and the government can get used to using them. And then once the emergency abates, they will still be in circulation. We will have just sort of accepted them. 

Dave Bittner: Right. 

Ben Yelin: So it's sort of the conundrum of having these measures introduced when they are necessary, like during the pandemic right now while we are quarantined and isolated and wanting to get back to normal life, versus the threat of these surveillance techniques persisting into the future. 

Dave Bittner: Now, is it possible that these measures, when they're put into effect, they could automatically time out, they could automatically sunset? Is that something that's done from time to time? 

Ben Yelin: It is. Sunsetting, though, is not a panacea. And I take that guidance from our experience with the September 11 attacks. The Patriot Act was set up as a piece of legislation that was due to sunset - I think the first sunset date was 2006. It has been reauthorized countless times since then. Most recently, I think certain provisions of the Patriot Act were reauthorized by the Senate as of two weeks ago, and the House is expected to follow suit. 

Ben Yelin: So even if you introduce a sunset provision, the fear is that the government will get used to using these tools, and the general public will see that these tools have been effective in performing their intended goal, which was to slow the spread of this disease. And these reauthorizations will become a matter of course instead of an opportunity for robust debate. You know, I think there's a saying among my more libertarian friends that nothing is as eternal as a new government program. 

Dave Bittner: (Laughter). 

Ben Yelin: Once it's introduced, it is very hard to take it away. And I think that that applies here. 

Dave Bittner: Do you suppose we're going to see a bit of a - I don't know - comparative experiment here between nations like China, which, because of, you know, their form of government, have the ability to say, this is what's going to be done, and you're going to do it, versus what we do, where we have a lot more give-and-take and debate and checks and balances on things like our freedoms, our privacy and so on? It'll be interesting to see how that affects the outcomes and how that affects the conversation going on. If a country like China does much better and fewer people die, can you imagine people saying, you know, maybe in times like this, this whole freedom thing... 

Ben Yelin: This authoritarianism thing... 

Dave Bittner: Right. (Laughter) Right. Exactly. 

Ben Yelin: ...Yeah, doesn't sound so bad. 

Dave Bittner: Yeah. 

Ben Yelin: So a couple things on that. I would say it's more appropriate and useful for us to compare the data to other democracies or other republics across the world, largely because we don't want to be authoritarian. We have political values that we want to keep intact after this emergency ends. But also, on a more practical level, I think all of us would agree that the data and information coming from China is probably less than trustworthy. I think the original death toll in Wuhan, where the virus originated, was listed as something like 2,000, and new data has come out showing that that figure has been called into question. So, you know, it would be more difficult for us to actually evaluate whether their surveillance practices were successful. 

Ben Yelin: Something like South Korea, which is a democracy - that's potentially a model for us. And we may be able to look at countries like that and see, you know, whether these types of surveillance measures actually work. Germany is another one. They have a pretty high rate of cases but a relatively low rate of fatalities. And basically, they've done that through testing and tracing. And I'm sure some of that has involved some of these surveillance tactics. So I think it is useful for comparison. 

Ben Yelin: One other thing I wanted to note - I looked at some public opinion polling. I think it was from the Harris Organization. Overwhelming majorities of Americans are in favor of temporary surveillance practices during these types of emergencies, this emergency in particular. So I think over 70% of people are OK with cellphone tracking for the purpose of tracing contacts. Those are overwhelming numbers. So it's just interesting to see where the public is on this. I think the public, like even some civil liberties advocates, are willing to, at least temporarily, do away with these principles if it helps fight this global pandemic. 

Dave Bittner: Yeah. And I suppose, as with all of these things, it's important to have folks out there being vigilant for when the emergency passes, to be out there shouting from the rooftops, hey, everybody, time to claw these things back. 

Ben Yelin: Yeah. You know, oftentimes, we're just so happy that the emergency has ended, we all go back to work, and we don't necessarily realize that there are certain... 

Dave Bittner: Right. 

Ben Yelin: ...Laws and policies still in place that can be used for more nefarious purposes. One thing this article mentioned is that a lot of the surveillance tools we used for the war on terrorism were used in the United States against First Amendment activities like Black Lives Matter's protests. So if we aren't vigilant and we let the emergency end and just sort of accept the new realities of location tracking and other surveillance tools, then I think we're going to regret it. And so I think it is up to all of us to stay vigilant. 

Dave Bittner: Yeah. Well, it's an interesting story. My story this week is related to that. This comes from The Daily Dot. It's a story written by Mikael Thalen. This was actually sent to us by a friend of the show, Elizabeth Wharton. She's @LawyerLiz on Twitter. She's a former guest on our show. And she brought this one to our attention. 

Ben Yelin: Great friend of the pod. 

Dave Bittner: Yep, yep. The article is titled, perhaps a bit breathlessly, "Terrifying Cellphone 'Heat Map' Shows Just How Much People Are Still Traveling." And the subtitle is, want to know where all those Florida spring breakers are now? This is fascinating to me. This article was sort of triggered by a video that was made using a combination of data tools. The data comes from one of the many companies that collect location data off of cellphones, as you and I have talked about many times here. You can then load that data into one of the many tools that visualize it - in this case, a tool called Tectonix. And what they've done in this video is they zoom in down onto a specific beach in Fort Lauderdale, Fla. Just imagine yourself... 

Ben Yelin: We'll call that Beach Patient Zero. 

Dave Bittner: (Laughter) I mean, imagine yourself at the beach, spring break or the summertime. You look to your left, look to your right as far as the eye can see. You've got a limited sightline there. That's basically what they do here - they're able to go in and lasso a group of location data points on this beach and then go forward in time and watch as these data points distribute themselves around the country. And, boy, from the standpoint of tracking the potential spread of a disease, it is fascinating to see all of these little location pings make their way pretty much everywhere east of the Mississippi. It just spreads out, and off it goes. 

Ben Yelin: It's into our community here in Maryland if you look closely at the map, yeah. I mean, I think all of us were struck a couple of weeks ago - I think it was right before most states were putting in these shelter-in-place orders. But Florida was behind the ball, and their public beaches remained open during the beginning part of this COVID-19 emergency. And there were viral photos and videos of spring breakers on the beach in large crowds. There was one guy who famously said, look; if I get corona, I get corona. And he, you know, looked like your standard spring break partier. 

Dave Bittner: Yeah. His parents must be so proud. 

Ben Yelin: Must be so proud of him. Apparently, he apologized, which, you know, good for him, but... 

Dave Bittner: (Laughter) All right. Well, good for him. 

Ben Yelin: ...He might have already spread the disease. 

Dave Bittner: Yeah. Who knows? 

Ben Yelin: But the problem is we don't have strict borders across state lines in this country. So the governor of Florida or policymakers in Florida could say it's the right of our state to make decisions as to whether to close public spaces. But as this map shows, it has ripple effects across the entire eastern seaboard and into the Midwest. So people end their spring break in Fort Lauderdale, get on the interstate, get on the airplane, go back to their communities and start to spread this disease. And I think this article and this surveillance technique just provide great visual evidence of how quickly just one single beach in Fort Lauderdale can spread this virus. I mean, we're talking about a very small geographic area, relatively speaking, and to have these pink dots all over the country - it's just very striking to look at. And, you know, I think it adds even more justification for why we need to shut down public spaces, because this is how the rate of transmission just accelerates. And it's very disturbing. 

Dave Bittner: What about from the privacy point of view? You know, the folks who provide this data, the company who does the location gathering - their name is X-Mode. They, like most companies in this business, say that the data is anonymized. But of course, the folks who research this stuff say that - I want to say it's, like, 99% of the time - they can deanonymize that data just by correlating different bits of information. If your device is sitting still at a certain location between midnight and 6 a.m., chances are that's your house, right? 

Ben Yelin: Right. And then if every day you're going to, you know, your essential business in Michigan, we're going to know that that's probably one individual. We can get that person's address, and we can get that person's employment information. Then we can take a look at that person's Facebook profile and see that, look, they were in Fort Lauderdale two weeks ago. Either let's publicly shame them or, you know, potentially the consequences could be more severe. States, including Maryland, have introduced criminal penalties for violating the shelter-in-place orders. Now, these things aren't retroactive, so there was no shelter-in-place order when this took place. But you can understand why the threat certainly would be there. And this connects to our previous segment. We heard stories here in Maryland of a guy who got 60 individuals together for a party, refused to disperse it and was arrested for violating the governor's executive order. Now, what if we just swept up the cellphone metadata from that particular house on that night and tracked those people across this geographical area? We could get a lot of personal information about them, even if the data was completely anonymized. And I think that's a very realistic proposition. So I'm all for publicly shaming people in the right circumstances... 

Dave Bittner: (Laughter). 

Ben Yelin: But I just think we really have to think about the consequences here. And not to blame X-Mode - I mean, I think they're providing a very useful service. They've also used their tools to track movements across cities that have had significant outbreaks. So they noticed a large cessation of social activity when they were tracking Rome between the middle of February and the middle of March, and they compared that to Seattle, where there was a little bit of social distancing going on but not to the same degree. And that's very useful information for the public and for policymakers. It's just, you know, like we said - once this tool is introduced, we have to be very vigilant that it not be abused. 
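
As an aside for readers, the home-inference trick Dave mentions (a device parked at one spot between midnight and 6 a.m. is probably a home) takes remarkably little code. Here is a minimal sketch in Python; the record layout, device ID and coordinates are made up for illustration and are not drawn from X-Mode's actual data format.

```python
# A minimal sketch of the re-identification idea discussed above: given
# "anonymized" location pings for one device ID, guess the home location
# as the most common spot seen between midnight and 6 a.m. The field
# names and sample data are hypothetical, for illustration only.
from collections import Counter
from datetime import datetime

# (device_id, latitude, longitude, ISO timestamp) - made-up sample records
pings = [
    ("abc123", 39.2904, -76.6122, "2020-03-20T02:14:00"),
    ("abc123", 39.2904, -76.6122, "2020-03-21T03:40:00"),
    ("abc123", 39.2904, -76.6122, "2020-03-22T01:05:00"),
    ("abc123", 26.1224, -80.1373, "2020-03-14T14:30:00"),  # daytime, Fort Lauderdale
]

def likely_home(records, device_id):
    """Return the most frequent overnight (midnight-6 a.m.) location."""
    overnight = Counter()
    for dev, lat, lon, ts in records:
        hour = datetime.fromisoformat(ts).hour
        if dev == device_id and hour < 6:
            # Round coordinates to ~100 m so nearby pings cluster together.
            overnight[(round(lat, 3), round(lon, 3))] += 1
    return overnight.most_common(1)[0][0] if overnight else None

print(likely_home(pings, "abc123"))  # -> (39.29, -76.612), probably "home"
```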

Dave Bittner: What about - I can imagine a scenario like you say, that person who threw the party just because. But we've also heard stories of preachers, ministers, church leaders who've said, no, we're going to have our services on Sunday mornings, and we're going to go ahead and do that despite the prohibitions from the governor or the state. What if you were able to use this sort of data to say, OK, here's everybody who was at this event? Guess what, everybody - you are all now under a mandatory quarantine. You're not allowed to leave your house. Everybody else can go to the store. Everybody else can go for a walk. But we've got you pegged here as being at this location. Here are the consequences. 

Ben Yelin: Yeah, that part is very disturbing to me. I mean, you can be very supportive of public health measures, but when you start to infringe on people's First Amendment rights, it's a major area of concern. And most of our surveillance laws, including the Foreign Intelligence Surveillance Act, prohibit surveillance activities based solely on First Amendment-protected events. So you can't just go and surveil a single church or a mosque, for example, under that law. So that's where, you know, I think things could get potentially very dangerous. And I worry about it when we're talking about activities that are protected by the First Amendment. 

Dave Bittner: Yeah. Interesting. All right. Well, lots of interesting stories this week. It's time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: Our Listener on the Line this week is someone named Karen (ph). She wrote in and asks, Ben and Dave, I've been hearing references to force majeure in some of the conversations regarding legal obligations in the midst of the coronavirus pandemic. Could you explain what this means? That's a good question. 

Ben Yelin: It's a fantastic question. This is, of course, a French term, as many legal terms are. It means superior force, and it is a concept in contract law. It's less prevalent in our common law system, descended from our English ancestors, and more common in other European countries that have more of a civil law system. But the basic idea is that contracts can be declared null and void if there is some sort of superior, superseding event. Usually, you know, we're thinking about acts of God - hurricanes, floods, earthquakes - or, you know, crazier things like a nuclear explosion or solar storms. Where this becomes very acute during our current emergency is that there are going to be a lot of contracts that cannot be performed because of it. A lot of businesses are closed, and they're not going to be able to fulfill their contractual obligations. 

Ben Yelin: So the question becomes, in these millions of contracts, does force majeure apply? Can one party get out of the contract simply because there was this act of God? And there are going to be mounds and mounds of litigation on this very question. Generally, in our system, it's incumbent upon the parties to a contract to explicitly write out what those acts of God would be. You know, so it's usually part of standard contract clauses. Some of them probably do contain language about a global pandemic; many of them do not. And so we're going to see a bevy of lawsuits related to exactly how to interpret the force majeure rule. 

Ben Yelin: One thing I will note is there is a common law concept called frustration of purpose that is a more common way for parties to get out of a contract here in our legal system. And that's when the purpose of the contract becomes null and void because of superseding events. The example I always use in my class is there are a lot of people who rented out their homes and apartments as Airbnbs for the 2017 Inauguration assuming that Hillary Clinton would win that election. And, you know, all of their friends in liberal areas like New York and California - they expected them to come and stay in those Airbnbs. She did not win the election. The purpose of those contracts became null and void. And that actually could have been a justification - because that purpose had been frustrated - for one party to back out of that contract. So I think that's more the concept we'll see here, where companies will argue that a frustration of purpose - not necessarily an act of God, but the inability to perform because the purpose of the contract became null and void - justifies one party or the other getting out of a contract. 

Dave Bittner: Will this come down to, you know, individual, per-contract negotiations? I guess what I'm getting at is, because this was officially declared a national emergency, does that take some of the ambiguity out of whether or not force majeure, for example, can be invoked? 

Ben Yelin: I really don't think it does, just because of the way our legal system works. It is court by court, case by case. You can take precedents from other cases. But first, we would have to see those cases get brought, right? So there has to be a first wave of cases where courts start to interpret force majeure provisions, or the lack of such provisions, in the context of this pandemic. We don't have, the way a lot of European countries do, a government body that can step in and make an interpretation for all of the courts. There's something called the Uniform Commercial Code in the United States, which covers a lot of these transactions. Potentially, that could be amended, but that process is quite cumbersome and could take a long time. And organizations are probably going to want to litigate these claims in relatively short order so that they're not out a lot of money because of the impossibility of performing a contract. So I'm less convinced that we're going to get sort of a ruling handed down from on high saying, all right, across the entire United States, force majeure now covers global pandemics. I would not anticipate that we'd see something like that. 

Dave Bittner: You know, I remember after 9/11, I had friends who were in the commercial real estate business. And they were saying it was fascinating to see the legal wranglings taking place after that event - you know, people arguing over whether it was one event or two. The two towers falling - were those separate events or one? Because that would affect the payout from the insurance policies. 

Ben Yelin: Yeah. I mean, I've read about those pieces of litigation, and they went on for years. Some of them are probably still going on because there were so many insurance claims made. So I could see that happening here. Businesses that have had to shut down and have been unable to provide services, suppliers who have had their supply chains interrupted - they're going to get sued, because that's just what happens in our legal system. People are going to want to recover damages. And this is litigation that I think we could see go on for a very long time. 

Dave Bittner: All right. Well, thanks to our listener, Karen, for sending that in - very interesting topic of discussion. We would love to hear from you. Our "Caveat" call-in number is 410-618-3720. That's 410-618-3720. You can also send us an audio file with your question or an email. It's caveat@thecyberwire.com. 

Dave Bittner: Coming up next - my conversation with professor Yehuda Lindell, CEO and co-founder of Unbound Tech. We're going to be discussing government requests for your data. 

Dave Bittner: But first, a word from our sponsors. So let's return to our sponsor KnowBe4's question. How can you see risk coming, especially when that risk comes from third parties? After all, it's not your risk - until it is. Here's step one. Know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendor security risk requirements. You'll not only be able to prequalify the risk; you'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you'll get this in an effective automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. And we thank KnowBe4 for sponsoring our show. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with professor Yehuda Lindell, CEO and co-founder of Unbound Tech. We discuss government requests for your data and what that means for your organization. Here's my conversation with professor Yehuda Lindell. 

Yehuda Lindell: I mean, when I think about accessible customer data, I think about the fact that with the growth of SaaS and a lot of our information being in the cloud or with external providers, that information is vulnerable. We've seen multiple cases of breaches where someone isn't necessarily going after me, but I get caught up in a different megabreach. One of the great examples, although it's not that recent, is that one of the administrators at Dropbox was using their admin password also for LinkedIn. And when LinkedIn was hacked - that was a long time ago, indeed - that person's password was then used to breach Dropbox and actually get a lot of information about a lot of customers. So that's part of the danger of the SaaS world: attackers don't necessarily even need to be targeting you, and you can get caught up in one of these very large breaches. 

Yehuda Lindell: It's also similar in terms of the silent subpoena issue. In the old days, if the government wanted to get your information, they had to come at you with a subpoena, which is actually fine. As long as they have a subpoena, we think that that's OK. But you do want to know about it. You do want to know that you're under investigation or what the situation is. And once we have outsourced our data to a cloud or to a SaaS provider, then that subpoena and that investigation can go on without you even knowing about it. 

Dave Bittner: Well, when we're talking about accessible customer data, can you give us sort of an overview of the different types of that data and the different ways and places where that plays out? 

Yehuda Lindell: You can break it into a number of categories. The one that most people actually think about is the one that I'm least concerned about (laughter). People think about things like your credit card number. That actually, you know, is a relatively solvable issue. It's a pain when your credit card number is stolen, but nothing personally bad has happened to you - nothing personal that is damaging about you has been revealed to the world. So that category, which I would call the security category, is one that I'm much less concerned about. Then there's the whole other category of what we'd call private data or confidential data - data which is personal, relating to you. And here, once again, I'll split this into two subcategories. 

Yehuda Lindell: One is things that you actually don't think are too private or personal - the things that you post readily online to your friends on Facebook or other places. I don't actually have Facebook, but a large percentage of the world does. It could be your dog's name, your teachers in school, your favorite musical instrument, your favorite sport. I think you can already guess where I'm going with this. All of that information is also the same information that's used by organizations to identify you when you've forgotten your password or you need additional verification to do a bank transfer or things like that. So actually, that personal information, although not really considered secret by you - when that falls into the hands of an attacker, they can use it to do identity theft. They can use it to steal your account. And so in that sense, it's also problematic. No one's specifically targeting me. I'm not interesting enough. No one really wants to know my dog's name. It's not important at all. But given all that information, they can sell it to people who then go ahead and actually make money off of it. 

Yehuda Lindell: And the final category is the actual private data. This can be medical data. I think we've all talked about concerns like - if my insurance company knows that I have a relative with some disease, will that affect my ability to get insurance? And these things are not imaginary. But there are also things that some people think less about, such as location data. I actually think location data is the most sensitive data out there. I've seen real location data multiple times. Now it's actually quite easy to do - if you use Android, then you can just go and look at what Google knows about where you've been. And by just looking at location data, I know where you live, where you work, where you shop, who you go out with, who you meet. I can, of course, correlate different locations. If I find that two specific people have been in the same location multiple times, then I can assume they're meeting each other. It actually says almost everything about you, and that is something that really, really is a huge risk. And a lot of organizations actually have location data, and that's a huge risk in my opinion. 
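
The co-location correlation Lindell describes - two people repeatedly in the same place at the same time are probably meeting - is just as simple to sketch. Again, the record layout and sample values below are hypothetical, assuming pings already rounded into coarse place-and-hour cells.

```python
# A rough sketch of co-location inference: if two device IDs repeatedly
# appear in the same place-time cell, flag them as probably meeting.
from collections import defaultdict
from itertools import combinations

# (device_id, rounded_lat, rounded_lon, hour_bucket) - made-up samples
sightings = [
    ("dev_a", 39.290, -76.612, "2020-03-20T19"),
    ("dev_b", 39.290, -76.612, "2020-03-20T19"),
    ("dev_a", 39.305, -76.600, "2020-03-22T20"),
    ("dev_b", 39.305, -76.600, "2020-03-22T20"),
]

def likely_meetings(records, min_count=2):
    """Count how often each pair of devices shares a place-time cell."""
    cell_to_devices = defaultdict(set)
    for dev, lat, lon, hour in records:
        cell_to_devices[(lat, lon, hour)].add(dev)
    pair_counts = defaultdict(int)
    for devices in cell_to_devices.values():
        for pair in combinations(sorted(devices), 2):
            pair_counts[pair] += 1
    return {p: n for p, n in pair_counts.items() if n >= min_count}

print(likely_meetings(sightings))  # -> {('dev_a', 'dev_b'): 2}
```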

Dave Bittner: So what sort of mitigations can people take to protect themselves here? 

Yehuda Lindell: It's very difficult (laughter). It's very difficult. Had the world gone in a different way and said, we will pay for our internet services and for our social networks with money rather than with our data, then maybe we wouldn't have this problem. I do want to say that this is not something that any individual person can really do, but there really is, I think, a big opportunity for services to say, you can choose to pay with your money rather than with your data. If that possibility does exist, then I think people should choose it. 

Yehuda Lindell: Beyond that, there's a certain choice that I personally make - I like to give my data to organizations that want my money first and my data second. Now, I'm not naive; I don't think that Apple doesn't also want my data and isn't also monetizing my data, but Apple's primary business model is my money. Microsoft's primary business model is my money. So I don't use Google Cloud and Google Docs; I use Microsoft and Office 365. And I don't use Android; I use Apple. And that way, I try to support the business model that says, I want to pay with my money. That might sound a little bit strange - I want to give away my money - but it's actually not that much money. And I think this is where I want the world to go. 

Yehuda Lindell: Beyond that, what I think everybody can do in the short term, in the immediate term, is to have a separation of powers. So, for example, I don't need to connect my Waze to my actual identity, and therefore I don't. I don't use Gmail, but if I do log into a Gmail account because I need to access something else, then I'll make sure to log out of it straightaway. If I always leave my Gmail open, then I would suggest having one browser where I'm using my Gmail - let's make it Chrome - and then using another browser, say Safari, for everything else. That way, I'm not giving Google access to all of my tabs and everything that I'm doing online, and I'm not connecting that to my Gmail account and, furthermore, to my real-world persona via my cellphone and other things. 

Yehuda Lindell: So if it's possible to sort of separate things out, then that's No. 1. No. 2 is every time you allow an app on your phone to track your location, or every time you give it permission to access your microphone or your camera and other things, you have to understand that there is a risk involved. So do I need the weather app to always know where I am, or only when I'm using the app? In my opinion, it's only when I'm using the app, and therefore I don't give my weather app permission to access my location at all times. 

Yehuda Lindell: So before we actually allow or give these permissions, let's think about whether they're actually needed. Do I have to wait three seconds for my weather app to work out where I am when I open it? Yes. I think I can live with waiting those three seconds. It's OK. These are sort of immediate measures that people can take, beyond all the standard things like turning off tracking cookies and other privacy-preserving settings that are becoming more and more popular. Apple has implemented quite a lot of them, and they're doing a good job of that. 

Dave Bittner: When it comes to things like storing your data in the cloud - thinking of protecting yourself against things like you mentioned, like the silent subpoenas - what about the use of encryption? How much protection does that provide? 

Yehuda Lindell: So encryption is a basic. You always have to do it. And the amount of protection it provides depends on what you're doing. I'll give you a couple of examples. If I'm using a cloud backup or another, you know, service like that, then the strong cryptographic key is actually held in the cloud, because I want to ensure that my data can never get lost. And therefore the strength is really based on the strength of a password. And if someone comes with a subpoena, there's a very likely case that they'll be able to get to my data. 

Yehuda Lindell: I don't specifically want to point out iCloud because I don't want to make a mistake here - I'm not exactly sure of the specific details of the backup - but the backup is essentially one, in most cases, where even if you lose your password, the cloud provider can help you to recover your data. If that is the case, then they can do that without you. So if you lose your password and you don't lose your data, then you know that actually you're in trouble. That is not the case, by the way, on my Mac. On my Mac, when I encrypt my disk, I get a very clear message saying, if you lose your password, it's your problem - and that's a good thing. That's a sign that it's actually well protected. 

Yehuda Lindell: Then there are other things like WhatsApp. WhatsApp actually has end-to-end encryption, which means that every time I start exchanging messages with someone, a pair of keys is generated locally on both devices, and that means that there is no way of going to WhatsApp's servers and getting all of my data - unless I say that I want to back it up, like in iCloud. So actually, I don't back up my WhatsApp in iCloud, because I want to keep the protection of end-to-end encryption. So if it's end-to-end encrypted, it's very, very secure. They would actually have to come directly to you, which is what we think the situation should be. I don't know many people who are against truly targeted surveillance with a subpoena when the police have a specific reason to think that Person A is a danger. We don't want this mass surveillance. That's what we're really against. 
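
To make the "pair of keys generated locally on both devices" idea concrete, here is a minimal sketch of a Diffie-Hellman-style key agreement using the Python `cryptography` package. This shows only the core primitive; WhatsApp's actual system, the Signal protocol, layers much more on top (identity keys, prekeys, ratcheting).

```python
# A minimal sketch of the core idea behind end-to-end encryption: each
# device generates its own key pair locally and derives a shared secret
# that never touches the server.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

alice_private = X25519PrivateKey.generate()  # stays on Alice's device
bob_private = X25519PrivateKey.generate()    # stays on Bob's device

# Only the *public* keys are exchanged; the server never sees a secret.
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Stretch the raw shared secret into a symmetric message key.
message_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"chat",
).derive(alice_shared)
print(len(message_key))  # 32-byte key, derived independently on each device
```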

Yehuda Lindell: When you're looking at the cloud, what you really want, if you're an organization rather than an individual, is to make sure that you have as much control over your keys as possible. So there is this notion of bring your own key, where I bring the key and put it in the cloud. It's very nice, but once you've brought the key and given it to someone else, you've lost all control, and a rogue administrator can access the key - and this has happened in the past. And law enforcement can access that key and get your data. If you keep control of your key yourself - by encrypting locally and then uploading, or by having a system which enables you to have control of your keys at all times - then you are much less vulnerable. So that's certainly a much better practice. 
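
Here is a minimal sketch of the keep-control-of-your-key approach Lindell recommends: encrypt locally, upload only ciphertext, and never hand the key to the provider. The `upload_to_cloud` function is a hypothetical stand-in for any real storage API.

```python
# A minimal sketch of client-side encryption: the key stays local, so a
# subpoena served on the cloud provider yields only ciphertext.
from cryptography.fernet import Fernet

def upload_to_cloud(name, blob):
    """Hypothetical stand-in for a real cloud storage call."""
    print(f"uploading {name}: {len(blob)} encrypted bytes")

key = Fernet.generate_key()  # keep this locally (HSM, key manager, etc.)
ciphertext = Fernet(key).encrypt(b"sensitive customer records")
upload_to_cloud("records.bin", ciphertext)

# Decryption requires the locally held key, which never left the premises.
plaintext = Fernet(key).decrypt(ciphertext)
assert plaintext == b"sensitive customer records"
```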

Yehuda Lindell: So on an organizational level, I think that people have to do much better risk analyses of what the possible scenarios are and how they can protect themselves, rather than just saying, oh, I'm in Amazon (ph), and I turned on encryption - everything's fine. That is not good enough. 

Yehuda Lindell: And another thing which I think is really important is that there are two types of organizations when it comes to security. There are those who are most interested in checking all of the boxes but aren't really necessarily looking at the actual security ramifications. And there are others that really want security, and checking the boxes and doing all the standards and the certifications just helps them understand whether they're heading in the right direction - but they really want security. Today, to be in the first camp is just inexcusable. Organizations today have to be in the second camp. They have to understand that a security failure is actually a threat to your business. A huge breach can have very big ramifications. It can be loss of your own IP; it can be loss of customer trust. It can be huge fines under GDPR and other regulations that are out there. 

Yehuda Lindell: And so I would strongly recommend that people start taking it seriously, analyze what their risks are and make sure they're following best practices for encryption and signing and the other cryptographic and security protections that are out there. And understand that having professionals in-house and thinking about how you keep control of your own assets is very important. It's good to rely on the cloud for what it encrypts, but to rely only on that and say, OK, encryption is encryption, without thinking about where the key is and who can access it - that's being very naive, to say the least. 

Dave Bittner: All right. Lots of interesting stuff there. Ben, what do you think? 

Ben Yelin: Yeah. One thing that stuck out to me about the general tenor of his interview is the more conveniences we see on a given website or a given service, the more likely it is that the best security protocols are not in place. So if an application, for example, wants us to turn on our location tracking for convenience purposes, we should be more suspicious that they're going to collect our personal data. 

Ben Yelin: Another thing that stuck out to me is which data is particularly valuable. I would have thought, just like I think the conventional wisdom is, that our credit card numbers - that's something that we should be most fearful about getting into the hands of the wrong people because... 

Dave Bittner: Right. 

Ben Yelin: ...They can dip into our financial resources. But some of the information we share on social media - and this is a great point that he brought up - is exactly the kind of information that's used for security questions. Which street did you grow up on? Who was your best friend growing up? And because we overshare on social media, I'm probably referencing my best friends, if I'm the average person, several times throughout my Facebook interactions. That information is going to be available to nefarious actors, and they could get into our bank accounts or health care portals. So that was something I hadn't really thought about that much, and it was particularly interesting to me. 

Dave Bittner: Yeah. I'm curious - what's your take on this notion of privacy canaries as a sort of workaround for government requests for data, when a company's not allowed to say that they got a request? You know, some companies will put up a webpage that basically says, as long as this webpage is up, we have not received any, you know, requests for data. And if that page goes away, it's sort of a backhanded way of saying that may not be true anymore, you know (laughter)? 

Ben Yelin: Yeah. We're not saying we've gotten requests... 

Dave Bittner: Right. 

Ben Yelin: ...But we're sort of saying it. I mean, I actually think it's pretty useful from the consumer's perspective because... 

Dave Bittner: Yeah. 

Ben Yelin: ...I talk about things in some of my courses, like national security letters, where they actually put gag orders on the companies that are recipients, so... 

Dave Bittner: Right. 

Ben Yelin: ...They're legally forbidden from discussing - in some cases even with their attorney, although that's begun to change - whether they've received this administrative subpoena, let alone what information is contained within it. So these canaries are probably the best option a company has for revealing to its consumers whether they've handed data over to the government. I think it's one of those things that's not necessarily full transparency, but it's better than nothing. 
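
For the curious, monitoring a canary is straightforward to automate. Below is a minimal sketch that fetches a canary page and alerts if the statement disappears; the URL and statement text are hypothetical placeholders, and a real monitor would also verify cryptographic signatures and publication dates.

```python
# A minimal sketch of a warrant canary monitor: fetch the page on a
# schedule and alert if the canary statement is gone or unreachable.
import urllib.request

CANARY_URL = "https://example.com/canary"  # hypothetical canary page
CANARY_TEXT = "we have received no government requests for user data"

def canary_alive(url=CANARY_URL, statement=CANARY_TEXT):
    """Return True if the canary page still contains the statement."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            page = resp.read().decode("utf-8", errors="replace")
    except Exception:
        return False  # page gone or unreachable: treat as a dead canary
    return statement in page.lower()

if not canary_alive():
    print("Canary missing - the provider may have received a request.")
```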

Dave Bittner: Yeah. All right. Well, our thanks to Professor Lindell for joining us. And, of course, we want to thank all of you for listening. That is our show. 

Dave Bittner: We want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.