Caveat 6.17.20
Ep 33 | 6.17.20

Building public trust.

Transcript

John Ackerly: You can have privacy and actionable intelligence that can be used to help save lives. It is not some ideal. It's possible today.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hi, Dave. 

Dave Bittner: On this show, I've got the story of law enforcement using enhanced stingray technology. Ben explains why IBM is no longer pursuing facial recognition technology. And later in the show, my conversation with John Ackerly. He's the CEO of Virtru Corporation. We'll be discussing protecting our right to privacy in the midst of the COVID-19 pandemic. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: Well, Ben, let's start off with some follow-up this week. I got a couple of kind notes from some listeners, both having to do with - why don't you describe what we were discussing? It had to do with the lock screen on a defendant's phone. 

Ben Yelin: Yeah. It was a case in which a court had to determine whether there was a search when law enforcement turned on the phone for the purpose of taking a picture of a lock screen. And we did get some very friendly criticism of some of the things we said, which I'll let you describe. 

Dave Bittner: OK. Well, the first one came from a listener. They wrote, in the most recent "Caveat" and the daily podcast, Ben refers to the FBI both taking the photo of the defendant's lock screen and pressing the two buttons to take a screenshot. It seems to me that pressing the two buttons to take a screenshot on a third party's phone is absolutely no use to the person wanting that image since the image remains on the device. It would seem to me that the issue at hand is really the FBI taking a picture of the lock screen with a different device. I'd say this listener is right on with that criticism (laughter). 

Ben Yelin: Yeah. I guess it was a clumsy attempt at a metaphor. And this listener definitely got us. 

Dave Bittner: Yep, yep. 

Ben Yelin: But the listener is absolutely right. Obviously, the device in that case would remain in the hands of the person from whom law enforcement is trying to obtain information. 

Dave Bittner: Yeah. 

Ben Yelin: So, yeah, it would be taking a picture of what you see on a lock screen after you turn on the device. 

Dave Bittner: Right. And to that point, we had another listener write in, and they wrote, gentlemen, I think you missed a point about what could possibly show up on a lock screen when someone opens a phone or turns it on. There could be incriminating information in terms of notifications and other pop-ups that could be considered evidence. Good point. 

Ben Yelin: Yeah, also a good point. I think I had mentioned that, you know, perhaps, like, his alias was listed there. I think we also referenced pictures that might be on your front page. But, you know, absolutely what could really be revealing is the notifications you get. If, you know, I left my phone in the other room and came back three hours later and looked at the notifications I received, I could probably get a pretty good idea of my interests, my affiliations, my friends, my enemies. 

Dave Bittner: (Laughter). 

Ben Yelin: So yeah. 

Dave Bittner: You should be ashamed of yourself, Ben (laughter). 

Ben Yelin: I would be not as ashamed of myself as I am for apparently messing up the segment so badly, but ashamed nonetheless. So, yes, to be more precise, I think it is notifications that one would be concerned about... 

Dave Bittner: Yeah. 

Ben Yelin: ...As well as potentially your lock screen picture. 

Dave Bittner: Right. 

Ben Yelin: But probably most likely it's going to be those notifications. 

Dave Bittner: I think it's worth noting, at least on iOS - and I don't know what the case is on Android, but on iOS, you can sort of dial in how much information those notifications will show depending on whether or not the phone is unlocked. For example, I have my device set where if I get a Twitter notification, if you just activate the phone, it'll just say Twitter notification. But if I activate the phone and it unlocks with Face ID, it'll say Twitter notification and then actually show the tweet. So you can dial in how much it shows depending on whether or not it knows it's you. 

Ben Yelin: Right. So, you know, you could have your iMessage be from coconspirator, let's go commit crimes. 

Dave Bittner: Right (laughter). 

Ben Yelin: And if you don't adjust your setting, that message is going to show up on your lock screen. 

Dave Bittner: Right, right. 

Ben Yelin: If you do adjust your settings, it'll just say iMessage from coconspirator. 

Dave Bittner: Right, right. 

Ben Yelin: So that's, you know, something - if you want to protect your private information on your device, that's a very easy step you can take, not having those messages revealed on the lock screen. It does mean that when you wake up in the middle of the night and receive a text message, in your sleeping stupor, you will have to unlock your device with facial recognition or some other method instead of just hastily gazing at your phone, so there's that. 
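
To spell out the logic being described here, a minimal sketch in Python - a toy model only. The setting values mirror iOS's Show Previews options ("Always," "When Unlocked," "Never"), but the function and parameter names are illustrative, not Apple's API:

```python
# Toy model of what a lock screen reveals about a notification.
# "always" / "when_unlocked" / "never" mirror iOS's Show Previews options;
# this is an illustration, not Apple's actual implementation.

def lock_screen_text(app: str, content: str, unlocked: bool, show_previews: str) -> str:
    """Return what the lock screen would display for one notification."""
    if show_previews == "always" or (show_previews == "when_unlocked" and unlocked):
        return f"{app}: {content}"       # full preview - content exposed
    return f"{app}: 1 notification"      # only the app name - content hidden

# A locked phone photographed by an investigator reveals only the app name...
print(lock_screen_text("iMessage", "let's go commit crimes", False, "when_unlocked"))
# ...but the same notification after a Face ID unlock shows everything.
print(lock_screen_text("iMessage", "let's go commit crimes", True, "when_unlocked"))
```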

Dave Bittner: Right. All right. Well, thanks to both of those listeners for sending in those comments. We do appreciate it. Let's move on to our stories. Ben, why don't you kick things off for us this week? 

Ben Yelin: So this one really caught my eye. It is from the news source The Verge. And the headline reads "IBM Will No Longer Offer, Develop or Research Facial Recognition Technology." This was revealed in a letter written to Congress by the CEO of IBM, Arvind Krishna. They sent it to Democratic members of both the House and the Senate. And they said that they oppose the use of facial recognition technology, not only the technology that they've done research and development on but technology offered by other vendors. And the reason they are coming out with this broad opposition is the potential for mass surveillance, racial profiling and violations of human rights and freedoms. And they said that they're writing this letter now and they're instituting this policy in the wake of the protests we've seen over the past couple of weeks that emanated from the George Floyd killing in Minneapolis. 

Ben Yelin: And so I think this is a really profound announcement. To my mind, it's the first of its kind in the industry: IBM really wants to press the pause button. Before we know more about facial recognition technology and its biases and its potential to identify people on racial terms, gender terms, et cetera - before we have a full understanding of that, we need to start a national dialogue to think about what our values are, what the purposes of this type of surveillance are and how we will protect people's privacy and civil rights. And until we can do that, in the mind of IBM, we are going to not only cease our use of this technology, but we're going to call on other companies as well to abandon facial recognition. 

Ben Yelin: So I think this is a really groundbreaking announcement, and we'll see if other companies will follow. And I think one thing we've seen in the wake of these protests is there has been a reckoning on the part of corporate America as it relates to things like systemic racism. So, you know, every company from Apple to Subway sandwiches and everyone in between is putting up some announcements, you know, talking about their values related to what we've seen in the protests. And this is the most concrete step I've seen a major corporation take in trying to reconsider its policies. 

Dave Bittner: Yeah, it's interesting to me. Obviously, IBM being such a big player in the space, also, of course, a household name, for them to come out and say this is certainly noteworthy. It made me think of Amazon, who is another big player in the space. I suppose there are two ways they could go. They could either say, we join you, IBM, in putting a pause on this and reconsidering this technology, or they could go the other way and say, hey, hey, more business for us. 

Ben Yelin: Right, woohoo (ph) full market share on facial recognition. They're obviously not the only ones who might be able to take advantage of this from an economic standpoint. 

Dave Bittner: Right. 

Ben Yelin: We talked about Clearview AI in our podcast earlier this year, an up-and-coming facial recognition company that scrapes from social media accounts, which obviously caused some controversy and some lawsuits. You know, this isn't the first time that IBM has tried to address some of the biases in its artificial intelligence and facial recognition software. They released a public dataset back in 2018 to help all the industry players reduce bias and make the data more reliable. Now, one thing this article mentions is that while that effort was laudable, they were also found in a separate case to have taken a million photos from Flickr, the photo sharing website, without the users' consent. But nevertheless, you know, I think they have been industry leaders on this. 

Ben Yelin: Another thing that I thought of is we are going to see some criminal court cases in which the key form of evidence has been facial recognition. So, you know, we saw you looting at a protest. We scraped your photo from social media, matched it up against a photo from the scene of the crime, and we used that evidence to arrest you and put you on trial. If I'm a defense attorney, I bring this article to that proceeding and I say, there is a huge problem with the reliability and potential for bias as it relates to facial recognition technology. And my evidence for that is, you know, one of the largest and oldest companies in this country as it relates to any type of computing or information technology has put out this announcement saying we're putting a pause on facial recognition technology. It's not as reliable as it should be. It could lead to false arrests, false convictions. I think if I am a defense attorney, this is going to be a very valuable weapon in those cases. 

Dave Bittner: All right. Well, I suppose that reflects this moment that we're in, doesn't it? 

Ben Yelin: Yeah, absolutely. This is something that certainly would have surprised me, you know, three or four weeks ago before all of this took place. But I think kind of we're all considering our own role in our own way in perpetuating a system. And IBM, you know, I think is recognizing its role, and they're, you know, trying to effectuate change in an area that they control. 

Ben Yelin: And I think the data is - you know, there have been enough studies on artificial intelligence and facial recognition for us to know that there are biases involved. There are biases in the algorithms. There was that famous study where I think it was the ACLU that ran photos of members of Congress against a database of 25,000 public mug shots, and the software falsely matched 28 of them even though those members of Congress had never been arrested. 

Dave Bittner: Right. Right. 

Ben Yelin: So, yeah, I mean, they're recognizing the problem, and I think they're being very proactive, even if it hurts their market share, in trying to come up with a solution before we get deeper and deeper into this world of using facial recognition, particularly for law enforcement purposes. 

Dave Bittner: It'll be interesting to see who gets on board as IBM puts the industry on notice. This will be an interesting one to follow for sure. 

Ben Yelin: Yeah. It's funny. You know, if you're Amazon, do you care more about the social pressure or do you care more about your market share in facial recognition? And I think that's actually a pretty difficult question. Compare a company like Clearview AI with IBM. IBM is a multifaceted company that does a million different things; Clearview AI is more limited. So if Clearview were to limit their facial recognition technology and take a step back, they'd probably be killing their own business, whereas IBM can afford to do this because, you know, they still build all different types of software and hardware. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: So I think it's a little easier for them from that perspective. 

Dave Bittner: Yeah. My story this week, this comes from Motherboard and written by Joseph Cox. The title of the article is "Agencies Spending Millions on 'Crossbow' Spy Tech, an Upgraded Stingray." Now, Ben, you and I love our stingrays on this show. 

Ben Yelin: Love our stingrays, yep. 

Dave Bittner: (Laughter) Yeah. It is something that I'd hazard to say we have a little bit of a fixation on, but justifiably so, because it fits right into the ongoing topics we cover here. 

Ben Yelin: Absolutely. We've been talking about this one for years. I think one of our earliest CyberWire segments probably discussed stingray devices. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: So this is, you know, something that we've been focused on for a while. 

Dave Bittner: Yeah. So this article outlines how some federal agencies and some military agencies have purchased a device called a Crossbow, which evidently is an upgrade, a new version of the stingray device. The stingray, of course, is the cellphone tower simulator that's used to catch the IMSI - the international mobile subscriber identity - from your cellphone. Basically, law enforcement will set one of these devices up, turn it on, and it pretends to be the local cell tower. Your cellphone then logs on to that device, and through that log-on, law enforcement can track you or know if you are in a particular area. So there's this information they can gather. 
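
As a rough sketch of the mechanism Dave describes - handsets simply camp on the strongest tower they can hear, with no way to verify it's legitimate - here is a simplified Python model. It's for intuition only; real cell selection and the Crossbow's actual behavior involve far more than raw signal strength, and the names here are invented:

```python
# Simplified model of why an IMSI catcher works: handsets attach to the
# strongest base station they can hear and cannot authenticate it.

from dataclasses import dataclass, field

@dataclass
class Tower:
    name: str
    signal_dbm: int                # higher (less negative) = stronger signal
    rogue: bool = False
    captured_imsis: list = field(default_factory=list)

def attach(phone_imsi: str, towers: list) -> Tower:
    """The phone camps on the strongest tower it can hear."""
    best = max(towers, key=lambda t: t.signal_dbm)
    if best.rogue:
        best.captured_imsis.append(phone_imsi)   # identity swept up silently
    return best

towers = [
    Tower("carrier_cell", signal_dbm=-95),
    Tower("crossbow_sim", signal_dbm=-60, rogue=True),   # overpowering signal
]

# Every phone in range attaches - bystanders included, not just the target.
for imsi in ["310150123456789", "310150987654321"]:
    attach(imsi, towers)

print(towers[1].captured_imsis)   # both IMSIs captured by the rogue tower
```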

Dave Bittner: Evidently, the folks who make these devices have a newer version of it. And I believe it's the ACLU that, through the Freedom of Information Act, has found that several agencies have been purchasing these devices. These are not insignificant purchases either. They're hundreds of thousands of dollars, in some cases millions of dollars, for these devices. One of the developments is that these devices can use 4G. Previous ones, I believe, were limited to some of the older protocols, so they would force the device to downgrade its signal to those older protocols - something phones do automatically when a newer network isn't available; that fallback is part of how your cellphone works. But I guess these new ones are capable of using 4G and the enhanced capabilities that come with that. What do you make of this, Ben? 

Ben Yelin: So there are a couple of things that are particularly problematic here. For one, any stingray device can sweep up personal information of people who just happen to be in the area the device is targeting. 

Ben Yelin: Another thing this article mentions is that these types of IMSI catchers potentially interfere with somebody's ability to make emergency 911 calls. So obviously, that has a very real-world effect. 

Ben Yelin: And then the ubiquitousness of these devices, as reflected in this article, is really eye-opening. So some of the entities that have purchased Crossbow devices include the Department of the Army, Department of the Navy, U.S. Special Operations Command and the United States Marshals. And that is in addition to state and local law enforcement agencies across the country who have purchased these devices. So as you said, it's not like this is, you know, a pilot project for one local police department. This item is on the market. It's being sold to some of the largest military institutions and law enforcement agencies in this country. 

Ben Yelin: This is another thing where transparency is a particular concern as well. In this article, there's a quote from a U.S. Marshals spokesperson who gives you the nondenial denial, saying, we will not reveal the methods we use to catch, you know, dangerous criminals. 

Dave Bittner: Right. 

Ben Yelin: So we're not going to tell you whether or not we have purchased these devices. But I think people who are cellphone users, which is most of us, should have a right to know whether the military or law enforcement has these capabilities to trick our devices into transmitting personal information. You know, I just think the public has a right to that information. And so there's something particularly problematic that the only way we find out about this is through FOIA requests. 

Ben Yelin: And another thing this article mentions which I find fascinating is law enforcement agencies find this technology so valuable that when, you know, a criminal defendant and his or her attorney has tried to get more information on stingray surveillance, local law enforcement is willing to throw out those particular cases so they don't have to reveal their methods. So you can understand why it's so incredibly valuable as a law enforcement tool if they're willing to go to such an extreme measure. 

Dave Bittner: Yeah. It still boggles my mind that the FCC approved this device, that the FCC approved a device that pretends to be a cellphone tower, that actively interferes with such a fundamental part of our communication system. Yes, it's for law enforcement, and law enforcement, I suppose, has made their case. But I just don't - that part I still scratch my head about, that the FCC would approve something like this. 

Dave Bittner: And the other part that I find interesting - I saw some speculation on Twitter during some of these protests that - and again, I'll emphasize speculation here - that these devices don't pass along emergency announcements that are made through the cellular system. So, for example, in this particular case, they were saying some of the municipalities would put out a message that said, curfew goes into effect at 9 p.m. tonight, right? 

Ben Yelin: Right. 

Dave Bittner: And there were people on the street who said, I never got that message. Some people got the message. I didn't get the message. And it was pointed out that these devices may not pass on those messages. Because these devices aren't part of the actual network that sends out alerts like these, a message that goes out while they're active doesn't get passed through them, so it goes undelivered. And is there a potential issue, is there a potential liability, if an emergency message is sent out and, because of the use of this device, people are unable to receive that message when they otherwise would? 

Ben Yelin: Yeah, it's a huge issue. I mean, you know, I've spent a lot of time studying denial-of-service attacks on public safety answering points - 911 centers. Those are supposed to be actions taken by malicious actors, cybercriminals. But the effect of what these governments are doing with these devices is pretty similar to what a denial-of-service attack on a 911 center can actually do, which is inhibit emergency information from going either from the government to users of a device or from users to the government. So we're in a pretty dark, disturbing place where, you know, we're facing pretty severe tangible effects of putting this technology in place. People aren't able to get emergency alerts, potentially. People aren't able to dial 911. 
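
To make the speculation about missed alerts concrete, here is a tiny Python sketch of the failure mode. It assumes, as the speculation does, that a cell-site simulator does not re-broadcast the network's emergency messages; real Wireless Emergency Alerts ride on cell broadcast, and whether any given simulator relays them is exactly the kind of detail that isn't public:

```python
# Toy model: emergency alerts are broadcast by legitimate network cells, so a
# phone camped on a simulator that doesn't relay them never sees the alert.

real_network_cells = {"carrier_cell_A", "carrier_cell_B"}

phones_camped_on = {
    "alice": "carrier_cell_A",
    "bob": "crossbow_sim",        # attached to the rogue tower
    "carol": "carrier_cell_B",
}

def broadcast_alert(message: str) -> list:
    """Return who receives a cell-broadcast alert sent via legitimate cells."""
    return [who for who, cell in phones_camped_on.items()
            if cell in real_network_cells]

print(broadcast_alert("Curfew goes into effect at 9 p.m. tonight"))
# ['alice', 'carol'] - bob never gets the curfew message
```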

Ben Yelin: And, you know, I think there's a values question at play here. Is it worth it for us - you know, for however many criminals we are able to catch using this technology, is it worth it to prevent people from making 911 calls, to sweep up innocent individuals' private device information and, you know, as you say, to potentially prevent them from receiving emergency communications? Is it all worth it? You know, I think that's a value judgment that policymakers are going to have to contend with, you know, both legislators and judges. Now, I will say it's really hard to make those policy judgments when you don't have full and complete information on how these devices work and how often they are employed. And that's why I think transparency is such an important part of this story. 

Ben Yelin: I think one of the quotes in this piece says, the public, judges and lawmakers cannot provide effective oversight without basic information about the capabilities of this new military-grade equipment. You know, we see this all the time. Whether it's stingray technology or aerial surveillance or cell site location information, you'll have law enforcement agencies say, we're not going to confirm whether we're using this technology, and we're not going to tell you anything about it because, you know, we don't want to reveal our methods. You know, I think the public has a right to know, even if they don't know all the details, what type of surveillance and what type of tools law enforcement is willing to use. 

Dave Bittner: Is it possible that someone could go after them using that avenue? Could someone, either an individual or an organization like the ACLU, could they say that the fact that these devices may be interrupting someone's ability to make a 911 call or receive an emergency notification - that's worth a lawsuit here? 

Ben Yelin: I don't think so. I mean, you know, I think these law enforcement agencies probably have lawyers that will protect them from lawsuit. You know, law enforcement generally has pretty wide latitude, as long as it's not violating people's constitutional rights, to engage in effective criminal surveillance. So I don't think there would be a proper cause of action, even if, you know, one of these detrimental effects took place. 

Ben Yelin: Yeah, that gets into a broader question. You know, sometimes a law enforcement action can do more harm than it does good. I'm sure we've all read the news recently and understand the concept of qualified immunity, where it's much harder to sue law enforcement for potential injuries if it's in the course of their investigative work or law enforcement duties. So I think that same principle is really at play here. 

Dave Bittner: All right, well, that is our collection of stories for this week. We would love to hear from you. If you have a question for us, we have a call-in number. It's 410-618-3720. You can also send us a message at caveat@thecyberwire.com. Send us your question, and perhaps we will answer it on air. 

Dave Bittner: Ben, I recently had the pleasure of speaking with John Ackerly. He is the CEO of Virtru Corporation. Our discussion centered on the notion of protecting our privacy in the midst of the COVID-19 pandemic. Here's my conversation with John Ackerly. 

John Ackerly: I think the relevant history really starts on September 11, 2001. So I was the lead technology policy adviser and was actually in the West Wing when the twin towers were hit and was really at the front lines of the response to those terror attacks. And the Bush administration and the president were actually very focused on data privacy issues from the campaign in Austin and, obviously, all the way through the aftermath of September 11. And it was a very interesting time and a lot of lessons learned about how to respond and the importance of transparency in that context. 

Dave Bittner: And one of the things, obviously, that came out of that was the Patriot Act. I'm curious. When you were in the midst of that, when you were in that advisory role at the highest levels of our nation, what was your own impulse there? When it came to privacy, when you were in the heat of all those discussions, how did you process that yourself? 

John Ackerly: So that's a great question, and I will never forget going into the West Wing on September 12. I worked with the National Economic Council under Larry Lindsey, and he said to us that, you know, by coming into work at that time, we had an obligation and we had the standing to potentially lean against the wind of changes that could be made in the context of what would be a fight against terrorism but where there might be unintended consequences. And so it was in that context that there was a lot of back-and-forth around what came to be known as the Patriot Act. 

John Ackerly: And without going into too much detail, 'cause I think we should really talk about COVID, let's just say that going into those initial discussions with the Department of Justice, there were a lot of priorities and a lot of requests that had been made over the preceding decade and a half, and decisions were made around the scope of the response that were very controversial and were not fully, you know, debated in the public eye. And I think when you fast-forward 20 years, you know, we are paying the price for that in terms of public trust, in particular around data privacy issues. 

John Ackerly: And a lot, by the way - you know, a lot of positive things came out of the response, too. And, you know, we'll always be proud of the fact that it wasn't a political discussion. There were no polls being taken around popularity. It was, in most respects, an authentic impulse to move quickly in a time of uncertainty. But it just reinforced over time, and I think we're feeling it now, the importance of a transparent process, because if you are going to put systems in place that people will adopt, people have got to trust them. And so that's what we're dealing with now. 

Dave Bittner: And we find ourselves now in another state of emergency. How does your experience from back then inform your perspective on the things you're seeing today, the conversations that are taking place now when it comes to privacy? 

John Ackerly: In a way, there is a very big difference, right? And drawing too many parallels between September 11 and today really is not helpful. But I do think that, you know, one of the key similarities is there is a very important role for the federal government and policymakers to play in publicly stating principles - first principles about how to enable federalism and how to, you know, build public trust. So, for example, in this context, there has been no public statement that when it comes to health surveillance and track and trace, the individual should be at the center of control - right? - with true, verifiable assurance that data is only used for its intended purpose, for specific periods of time. And that kind of leadership, I think, would accelerate our ability to respond. 

John Ackerly: Twenty years ago, there was a lot of centralized action. There wasn't transparency. I worry that on topics of privacy and data - data being a vital tool that we can use to get back to opening the economy quickly - we don't have that kind of central clarity or that transparency. So I think there's a lot that the government could do today to accelerate things for us to get out of phase zero and into phase one and phase two. 

Dave Bittner: How would you envision that sort of messaging coming out of the federal government? And specifically this federal government, the administration we have now and the situation we find ourselves in with Congress, who could lead that charge? 

John Ackerly: Well, look; I think that the natural place to start is with the CDC, but have it be in coordination with the White House. And I hate the term Bill of Rights. It gets overused all the time. And it didn't work very well from a privacy perspective with the Obama administration, but I do think a clear set of principles about how to institute effective track and trace, and contact tracing specifically, would be very useful. 

John Ackerly: I think there is almost a false choice being presented - and this is a generalization - between the centralized systems that have been put in place to some positive effect in South Korea and in China, but where participation is really mandated centrally, and a privacy-centric approach which is fully decentralized and relies on proximity-based approaches built around Bluetooth. So you have, on the one hand, approaches that are very invasive and concerning from a privacy and civil liberties standpoint and, on the other hand, approaches that have limited utility. 
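
As a sketch of what a fully decentralized, Bluetooth proximity-based approach can look like, here is a simplified Python model loosely patterned on the Apple/Google exposure notification design - the key size, rotation schedule and derivation used here are illustrative stand-ins, not the published specification:

```python
# Decentralized Bluetooth contact tracing, minimally sketched: each phone
# broadcasts short-lived pseudonyms derived from a secret daily key, and
# matching happens on-device - no central server learns who met whom.

import hashlib, hmac, os

def new_daily_key() -> bytes:
    return os.urandom(16)          # generated on, and never leaves, the phone

def rolling_id(key: bytes, interval: int) -> bytes:
    """Pseudonym broadcast over Bluetooth for one short time interval."""
    return hmac.new(key, f"RPI-{interval}".encode(), hashlib.sha256).digest()[:16]

# Alice's phone broadcasts; Bob's phone stores whatever pseudonyms it hears.
alice_key = new_daily_key()
heard_by_bob = {rolling_id(alice_key, i) for i in range(100, 110)}

# If Alice tests positive, she publishes only her daily key. Bob's phone
# re-derives her pseudonyms locally and checks for an intersection.
exposed = any(rolling_id(alice_key, i) in heard_by_bob for i in range(144))
print("possible exposure:", exposed)   # True - and the match stayed on-device
```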

John Ackerly: And I think there is important work today going on across many companies where there is a path forward where you can both give the public confidence and control over their data but encourage more sharing that combines things like proximity with sensitive health information that would make for a much more effective response. And so that's where the federal government and the CDC in particular can really lead here, and that's what I think is missing. 

Dave Bittner: And I suppose - I mean, there's a real human factor to the messaging here as well in that I can imagine a message put out that, you know, as a citizen, you know, do your part. If you're comfortable, here are the tools we put in place. Here's how we're going to protect your privacy. It's all out here. It's - you know, here are the privacy experts who've looked it over and said that it's good to use. But if you want us to start opening up our country again, we need everybody to step up and do their part. 

John Ackerly: I think that's such an important point, and I'm very happy that you raised that. You know, it was not perfect, but one of the empowering things coming out of September 11 was the campaign around see something, say something - right? - getting people, you know, focused on how they can help keep others safe, in addition to themselves. And I think in this context, you know, really, a message around share something - share your most sensitive data and save someone - like, with that public call to action, I think you can get the American public fired up about sharing their data to contribute to the response. And that's the kind of call to action that I think the country's really yearning for. And with the right kind of controls in place, we can do that. 

Dave Bittner: What about building in things like sunsetting for these sorts of things? You know, I think it's a common criticism that when these sorts of things are put in place, they tend to stay around, for better or for worse. 

John Ackerly: I think that's, again, a great point. And, you know, over the past 20 years, whether it's government or the private sector or the two working together, there has not been a reason to really have a lot of trust just because someone who, for a period of time, is in charge of a department says, hey, trust us; we will only use it for this purpose. The fact of the matter is, when you have access to data, the urge to reuse it - often in ways that the person in charge at that time thinks are beneficial to his or her mission - to me, that is just such a powerful impulse. And the problem is administrations change. People change. And data can be weaponized. So I think it's incredibly important to recognize that just taking a trust-us mentality is not going to work. You have to actually put in place cryptographic controls so that data can actually be revoked after it is shared with a third-party system. And that is very possible today. 
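
To illustrate the kind of cryptographic control Ackerly is pointing at - revocable access to data even after it has been shared - here is a minimal Python sketch of envelope encryption with a policy service. This is a generic pattern under stated assumptions (it uses the third-party cryptography package, and all names are invented), not Virtru's actual implementation:

```python
# Sketch: data is shared only as ciphertext; the decryption key stays with a
# policy service that can log every access and revoke the key after sharing.
# Requires the third-party package: pip install cryptography

from cryptography.fernet import Fernet

class PolicyService:
    """Holds data keys so access can be audited and revoked at any time."""
    def __init__(self):
        self.keys, self.audit_log = {}, []

    def register(self, data_id: str) -> bytes:
        self.keys[data_id] = Fernet.generate_key()
        return self.keys[data_id]

    def request_key(self, data_id: str, org: str) -> bytes:
        self.audit_log.append((org, data_id))         # who accessed what
        if data_id not in self.keys:
            raise PermissionError("access revoked")   # ciphertext is now inert
        return self.keys[data_id]

    def revoke(self, data_id: str) -> None:
        del self.keys[data_id]

svc = PolicyService()
key = svc.register("health-record-1")
ciphertext = Fernet(key).encrypt(b"proximity + test result")  # what gets shared

# A health agency can read the record while the owner's policy allows it...
print(Fernet(svc.request_key("health-record-1", "health-dept")).decrypt(ciphertext))

# ...until the owner pulls access, and every copy of the ciphertext goes dark.
svc.revoke("health-record-1")
# svc.request_key("health-record-1", "health-dept")  # would raise PermissionError
```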

John Ackerly: And I think due to the lack of federal leadership, companies like Google and Apple are put in the uncomfortable position of actually having to set those kinds of trade-offs. And Google and Apple are saying, trust us; we are not going to flip a switch and suddenly centralize control of the data behind the capabilities they are planning to build into operating systems. 

John Ackerly: There is just sort of this weird place where there is not clarity. And what is going to have to be put in place for these kinds of surveillance systems to be instituted in the U.S. context, where there's got to be voluntary opt-in versus coercion, is that kind of verifiable control, where you can see what organizations are accessing your data and for how long, and where you are ultimately able to revoke access to that data. I think that's going to be the path forward. 

John Ackerly: So the important point is that the approach where you can have privacy and actionable intelligence that can be used to help save lives is not some ideal. It's possible today. And what is really important is that there is federal leadership in coordination with the private sector to drive very quickly to put those kinds of mechanisms in place. It really ties back to that verifiable trust and control. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: So I think the point that resonated most with me is that it's up to the federal government to take a leadership role. You know, we need to lay out some principles. The key principle is we want to protect everybody's health and safety, and we want to do so in a way that protects individual rights. 

Ben Yelin: And I like that he invoked the see something, say something slogan that we all used in the aftermath of 9/11. I think that really did raise public awareness of our own role in protecting everyone's safety. And so, you know, a similar public messaging effort where we encourage people not to be afraid of, you know, using applications to help us with contact tracing by saying, sacrifice a little of your personal privacy, potentially share your information so that you can save lives - you know, as he said, without that public messaging, it's up to the private sector to kind of determine what direction the surveillance takes, and they're not subject to the same type of oversight that our government is. And I think people will be more reluctant to voluntarily opt in to these programs without that sort of messaging and leadership coming from the federal government. 

Dave Bittner: Yeah. 

Ben Yelin: So I think, you know, the first step, as he says, is for the federal government to lay out principles. What are the values underlying what we're trying to do here? What's the balance that we're trying to strike? And why is it so important that we all engage in contact tracing even if it potentially would sacrifice some of our privacy? 

Dave Bittner: And the notion that this is a temporary sacrifice, if you will, for us to make - that, yeah, it's important in this moment to give up a little bit of our privacy, but we also need to be vigilant that it doesn't become forever. 

Ben Yelin: Yeah. You made that point in the interview, I believe, that as we saw with the Patriot Act and other surveillance methods, once the government obtains these powers, it is very reluctant to give them up. And so I think that's absolutely a legitimate concern about any track-and-trace or contact-tracing operation. It's sort of hard to put the genie back in the bottle. So I think that needs to be part of public messaging as well. You know, the government needs to set out very clear timelines. We're going to do this for a period of six months, and it will be renewed as conditions on the ground warrant. So, you know, maybe in six months things are as bad as they are now. Maybe in six months we have a miracle vaccine or, you know, we're all drinking bleach or something and the disease has been cured. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: But, you know, I think we just need some clarity as to why we're doing this - the values underlying this action, what the need is and why the need is so acute. And in the absence of that leadership, it's just going to be hard to get millions of people to voluntarily opt in to contact tracing. And the only way contact tracing works is to get millions of people to opt in. 

Dave Bittner: Right. 

Ben Yelin: Otherwise, you know, it doesn't help if there are only 10 people in whatever Apple and Google have set up for contact tracing. That's just not going to be very effective. 

Dave Bittner: Right, right. Right. Well, our thanks to John Ackerly from Virtru for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. And, of course, we want to thank all of you for listening. 

Dave Bittner: Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.