Policy is a mess.
Fred Cate: We want great service. We want it highly personalized and we want it just when we want it. And the only way to do that is with a fair amount of data.
Dave Bittner: Hello, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: We've got a good show this week. I've got the story of the IRS giving up on location data. Ben has the story of a man wrongfully arrested due to a facial recognition match. And later in the show, my conversation with Fred Cate. He's vice president for research at Indiana University and author of the book "Bulk Collection: Systematic Government Access to Private-Sector Data." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's kick things off here. Why don't you start us off? What story do you have for us this week?
Ben Yelin: So my story is actually very disturbing. It comes from The New York Times, and the headline is "Wrongly Accused by an Algorithm." And the story is about a man in Michigan named Robert Williams who was arrested back in January for a crime he did not commit. And the evidence used against him was based on faulty facial recognition software.
Ben Yelin: So just to give a very brief backstory, there was a robbery at a fancy store in Detroit, Mich. There was some sort of grainy surveillance footage that was obtained from the store and sent to some sort of expert - a risk reduction expert - who examined the footage. She captured a grainy image and sent that to the Detroit Police Department. They ran it through their database and found a possible match with this individual.
Ben Yelin: Now, the software that they used comes with a warning that says a match is not probable cause for an arrest - it's simply the first step in an evidentiary chain toward arresting somebody.
Dave Bittner: Right.
Ben Yelin: But the Detroit Police Department didn't heed that advice. They assumed that they had probable cause. They arrested this gentleman on his front lawn in front of his wife and two children. He was taken to detention. He was booked, fingerprinted and held for something like 12 hours. And they presented him with pictures, basically saying, is this you, you know, kind of intimating that obviously it is him; he committed the crime. And the guy said, no, look at the picture; it's clearly not me. And the cops sort of looked back at him and said, you know what? You're right.
Dave Bittner: (Laughter) The whole...
Ben Yelin: And it turns out that the whole thing - yeah, this whole thing was based on a faulty artificial intelligence system. We know based on numerous academic studies, including some done by government entities like NIST, that facial recognition software is less reliable as it relates to African Americans and other racial minorities just because the sample used to build up a database isn't as large. You don't have as many faces, you know, to help train the artificial intelligence.
Ben Yelin: So we know that it's not particularly reliable, and the fact that the law enforcement in this case pretty much only used the evidence obtained through this facial recognition software to really put this innocent man through a traumatic experience is really eye-opening. And it seems like this is sort of the first time that something like this has happened so publicly, and I think it's a real moment of reckoning as it relates to facial recognition software.
Dave Bittner: I have a number of issues I want to discuss with you on this, one of which is why was he arrested? Why was he - we're talking about a shoplifting charge here, a nonviolent crime. Why would the police come handcuff him and haul him away as their first course of action? Why didn't they come and have a conversation with this man and say, hey, we have this picture of somebody who may be you. Hey, where were you on this day? Hey - like, why is the first thing hauling him in, keeping him overnight, locking him up?
Ben Yelin: I mean, that's a super loaded question. I think, clearly, they should not have done that. You know, you're not supposed to arrest somebody until you have probable cause. What they did here is show this individual's driver's license photo to the so-called loss prevention contractor hired on behalf of the store, who had viewed that grainy surveillance footage. She identified him in that photo. Obviously, she did so falsely. And that investigator, that contractor, didn't make herself available for an interview. But that seems to me to be very shoddy evidence of probable cause, especially when law enforcement themselves had seen the photo, had seen that it was grainy and should have known themselves that facial recognition is particularly questionable and unreliable as it relates to racial minorities.
Ben Yelin: The other thing I should mention is the kicker of the story is kind of heart-wrenching. This gentleman actually had an alibi because he recorded himself singing a song on Instagram. He was in his car. And this occurred at the exact time of the robbery of the store. He was singing "We Are One" by Maze and Frankie Beverly, which has lyrics that say, I can't understand why we treat each other in this way, taking up time with the silly, silly games we play. Quite a kicker for the story for what this gentleman went through. Not to get sappy and emotional here on you, Dave.
Dave Bittner: No, no, I think it all aligns. So, again, I'm trying to understand the legal process here. Would the police have had to have gotten an arrest warrant to bring him in?
Ben Yelin: Not necessarily, if they had probable cause. Apparently, they called him at his office first to try to get him to submit himself voluntarily to the police, and he wasn't there - he didn't answer. So they went to his house to arrest him. You don't always need a warrant signed by a magistrate judge to effectuate an arrest if you can show probable cause that the person committed a crime. And that determination was, obviously, improperly made in this case.
Dave Bittner: The other thing that strikes me as chilling is this gentleman is sitting in the room with these two police officers. Everybody in the room agrees he's not the guy. And he asks them, am I free to go? And they say no. They keep him in custody. They release him on a thousand-dollar bond. He has to come back, appear in court. The prosecutor moves to dismiss but without prejudice. Now, explain to me what that means. That means he could be charged again?
Ben Yelin: That's exactly what it means. So dismissing a case with prejudice means it's over for good - the prosecution is permanently barred from bringing the same charge against the defendant again. But in this case, the case was dismissed without prejudice, meaning that if some other piece of evidence turned up indicating that this gentleman had committed the crime, they could instigate further criminal proceedings against him.
Ben Yelin: And yeah, I mean, it's really heart-wrenching. He had to pay a thousand-dollar bond. He had to wait in the rain for 30 minutes for his wife to pick him up, all when it was very clear that he was innocent. So, you know, you ask, how could this possibly happen? It's clear from the photos and, really, from what the law enforcement officers said that they knew it wasn't him in the picture. You know, I think this gets to biases in our criminal justice system and, you know, some real institutional problems that go beyond the narrow issues brought up by this case.
Dave Bittner: Yeah.
Ben Yelin: But you're absolutely right. I mean, there was no justification. The case should have been dismissed immediately, with prejudice. The fact that it was not, given the exculpatory evidence here, is pretty indicative of some deep rot in some of these police departments and in our judicial system.
Dave Bittner: And he had to hire an attorney. He has the expense that comes with that. His attorney, Victoria Burton-Harris, was quoted as saying that her client was lucky. She said, "He's alive. He's a very large man. My experience has been, as a defense attorney, when officers interact with very large men, very large black men, they immediately act out of fear. They don't know how to de-escalate a situation." Now, in this case, it seems like there was no violence. There was no - I mean, it seems like...
Ben Yelin: Thank goodness, yeah.
Dave Bittner: Right. Right. I mean, you know, despite his frustration, it seems like Mr. Williams was compliant, despite the absurdity of the situation.
Ben Yelin: Yeah. I mean, you just shudder to think of, you know, what could happen. We've seen this happen all over the country with deaths incident to arrest. This was an arrest. This was an arrest on a person's front lawn, in front of his family - and we know that those interactions can turn violent - all for a complete misidentification based on obviously flawed software that should not be used in this manner.
Ben Yelin: They interviewed the companies that developed the facial recognition software that was used in this instance. And they said, well, you know, our technology is not intended to be used to make a final determination of somebody's culpability; you have to do other police work, in other words.
Dave Bittner: Right.
Ben Yelin: So, you know, maybe you find a match here, but you also have to, you know, maybe pull this person's cell site location data to see if they were at the location of the crime when it happened or interview other witnesses. And they just didn't do that other police work here. So this person has reached out to the American Civil Liberties Union.
Ben Yelin: The ACLU is obviously very interested in some of the civil liberties issues around facial recognition technology and artificial intelligence, and so they've taken a great interest in the case. They've sent a legal document to the police department kind of demanding answers as to what happened. And I'm wondering if the police department might be subject to some civil litigation on false imprisonment, which is a tort in our legal system. That's very hard to prove and is almost never successful because law enforcement officers kind of get the presumption of good faith.
Dave Bittner: (Laughter) Perhaps not in this moment.
Ben Yelin: Yeah, exactly. Exactly. Yeah. I mean, the fact that this story comes to us during this very fraught moment in our history is also kind of eye-opening as well. I think the context is very important here. We now have this story, which is heart-wrenching. We have the decisions that we talked about last week and the week before from some of the large tech players, like Amazon and Microsoft, saying they're pausing their use of facial recognition technology for the foreseeable future, at least for the next year. So perhaps we really are going to get a reckoning on the use of this technology in law enforcement matters.
Dave Bittner: Yeah. I would hope that the publicity that this case will bring, this article in The New York Times, you know, should be spread around to law enforcement organizations around the country - the peril that they potentially face by trusting facial recognition software too much.
Ben Yelin: Absolutely. And one thing this article does is it kind of puts a human face on this problem. You know, sometimes we read about these things and they're very abstract, you know. Like, it's just hard to really conceptualize. But hearing a particular man's story, you know, somebody who had this traumatic thing happen to him in front of his family, I think just makes it more real and kind of highlights the urgency of the problem.
Dave Bittner: Yeah.
Ben Yelin: And so I think that was very well done by The New York Times.
Dave Bittner: All right. Well, interesting story, for sure. My story this week comes from The Wall Street Journal. It's titled "IRS Used Cellphone Location Data to Try to Find Suspects," and it's written by Byron Tau. This is an interesting one. It's about how it came out - through some disclosures to the office of Senator Ron Wyden, a Democrat from Oregon, I guess during a routine briefing some IRS officials gave to the senator - that the IRS had contracted with a company that provides location data, primarily for marketing purposes, and had used this data to try to put together cases on folks that they might be after.
Dave Bittner: And to be fair, we're talking about folks who might be up to money laundering, cybercrime, drug cases, organized crime - those sorts of things. But what's particularly interesting about it is that the information that this company provides is anonymized. It doesn't give the name of the person. It doesn't give the person's cellphone number. But what it does do is it provides a series of locations. And as you and I have discussed here before, if I have a series of locations, generally, it's not that hard for me to figure out who you are...
Ben Yelin: Right.
Dave Bittner: ...Because, eventually, you're going to go home, you're going to go to your office, and if you keep doing that...
Ben Yelin: Although, these days, I guess most of us are not going to our offices. So...
Dave Bittner: Right. Right.
Ben Yelin: Yeah.
Dave Bittner: But the chances are the place where your mobile device is parked through the night...
Ben Yelin: Yes.
Dave Bittner: ...And if it happens time and time again, chances are that's where you live (laughter), right?
Ben Yelin: Absolutely, yeah.
Dave Bittner: And so what the IRS could do is, for example, if there was a suspicious deposit made at an ATM, they could take that location data, correlate it with other connected data for a particular device, figure out who that likely was and then use that information in their case. Now, what's interesting about that is that, because they don't have names and cellphone numbers, according to this article, attorneys at government agencies have concluded that this is not a violation of the Fourth Amendment. This is very interesting, don't you think, Ben?
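To make that re-identification step concrete, here is a minimal sketch of the kind of correlation being described - assuming a feed of anonymized (latitude, longitude, timestamp) pings keyed by opaque device IDs. The device IDs, coordinates, grid size and time window below are all illustrative; this is not Venntel's actual data format or the IRS's actual method.

```python
# Hypothetical sketch: infer a device's likely home from repeated overnight
# dwells, then shortlist devices present near a known event (e.g., an ATM
# deposit). All identifiers, data and thresholds are illustrative.
from collections import Counter, defaultdict
from datetime import datetime

def likely_home(pings, night_hours=range(0, 6)):
    """Return the grid cell where a device most often appears overnight."""
    cells = Counter()
    for lat, lon, ts in pings:
        if ts.hour in night_hours:
            # Round coordinates to ~100 m grid cells so repeat visits cluster.
            cells[(round(lat, 3), round(lon, 3))] += 1
    return cells.most_common(1)[0][0] if cells else None

def devices_near(all_pings, event_cell, event_time, window_hours=1):
    """Shortlist device IDs seen in the event's grid cell around its time."""
    hits = set()
    for device_id, pings in all_pings.items():
        for lat, lon, ts in pings:
            cell = (round(lat, 3), round(lon, 3))
            if cell == event_cell and abs((ts - event_time).total_seconds()) <= window_hours * 3600:
                hits.add(device_id)
    return hits

# Example: an anonymized feed keyed by opaque device IDs.
all_pings = defaultdict(list)
all_pings["device-7f3a"] = [
    (39.2904, -76.6122, datetime(2020, 6, 1, 2, 15)),  # overnight ping
    (39.2904, -76.6122, datetime(2020, 6, 2, 3, 40)),  # same spot again -> likely home
    (39.2956, -76.6149, datetime(2020, 6, 2, 14, 5)),  # near the ATM at deposit time
]

suspects = devices_near(all_pings, (39.296, -76.615), datetime(2020, 6, 2, 14, 0))
for device_id in suspects:
    print(device_id, "likely home cell:", likely_home(all_pings[device_id]))
```

The point of the sketch is that neither step requires a name or phone number: the overnight dwell gives you an address, and an address plus a timestamped event is usually enough to identify a person.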
Ben Yelin: It sure is. So it, of course, hearkens back to the Carpenter case, which we've discussed ad nauseam on this show.
Dave Bittner: (Laughter).
Ben Yelin: I hope our listeners aren't tired of us talking about Carpenter. But it is, you know, the foundational case in this area. And that case held that law enforcement needs a warrant, based on probable cause that a crime has been committed, to obtain cell site location information from an individual. Now, what government agencies have taken from that case and have kind of made into their own interpretation is that that type of probable cause determination is not required when you're just obtaining anonymized data. And, you know, I actually think that is a pretty reasonable interpretation of the decision in Carpenter.
Ben Yelin: You know, the Carpenter case was individualized. They had a suspect. They tried to figure out whether that suspect was at the location of various robberies. So it was, you know, very specific to that individual. Here, they're just kind of getting a bucket of anonymized location data and doing, you know, investigative work to match that data to individuals. So I certainly think that is a defensible view of the law. You know, a couple other things about this article that really interested me - one, it didn't work (laughter).
Dave Bittner: Right.
Ben Yelin: So the office that engaged or that purchased the service from Venntel, which is the name of the company, said they let the subscription lapse after it failed to locate any targets of interest during the year it paid for the service. So I think it's maybe somewhat comforting for people who'd be concerned about the use of this technology that the IRS was not really able to do much with it.
Ben Yelin: And, you know, one thing that kind of stuck out to me towards the end of this article - they mention that The Wall Street Journal reviewed federal contracting records, and apparently, the IRS paid only $20,000 for access to this platform, which they mention is roughly the cost of a single login to the service. So at least as it relates to this company, the use of this was not particularly widespread. It was sort of a pilot...
Dave Bittner: Right.
Ben Yelin: ...To see whether this type of investigative technique, purchasing this data, could be effective. It seems as if it's not, although I guess there's a lot that we don't know about it.
Dave Bittner: Yeah. It's interesting, too, for example, the Department of Homeland Security could use these subscriptions to a service like this for immigration enforcement. They could look for underground tunnels or other, you know, illicit border crossings by using this data just to track where phones are going, without necessarily knowing the user's identity.
Ben Yelin: Right. I mean, that's what's probably most concerning about this article is - maybe some of us don't have sympathy for tax cheats (laughter). It depends on kind of your personal preferences as to the moral dimensions of that crime.
Dave Bittner: Right.
Ben Yelin: But it means that other government agencies could use this data for more nefarious purposes. And because we know, as you said, that anonymized data, if it's analyzed by a human being, is very likely to not end up being anonymized, that presents a lot of problems. You mentioned the Department of Homeland Security. Immigration and Customs Enforcement certainly comes to mind. Even the use by federal law enforcement agencies like the FBI for criminal matters - if the IRS, which is the Criminal Investigation Unit, is charged with tax code violations is willing to use this software, you know, it kind of seems to reason to me that other agencies would be willing to use it as well. So, you know, we're at this point where it seems like the Supreme Court might have to address this particular issue. We'd have to have a case make it through the lower courts, and for that to happen, we need somebody's arrest to be based on the use of this anonymized data so that they would have standing. And so because, you know, there were no arrests, at least in this pilot experiment here, I think we're a long way from that case. But eventually, the Supreme Court has to decide whether it will extend its reasoning in the Carpenter case to anonymized GPS data that's sold on a bulk level brought by private contractors.
Dave Bittner: Yeah. All right. Well, we will include a link to this article, of course, and also, we would love to hear from you. If you have a question about privacy policy, send us a note. It's at caveat@thecyberwire.com. You can also call in and leave a message at 410-618-3720. We would love to hear from you.
Dave Bittner: Ben, I recently had the pleasure of speaking with Fred Cate. He is the vice president for research at Indiana University. He's also author of the book "Bulk Collection: Systematic Government Access to Private-Sector Data." Really interesting conversation - here's my talk with Fred Cate.
Fred Cate: I think privacy is all about not being surprised. In other words, it's some sense that you either control your information or you know what others are doing with your information if they control it.
Dave Bittner: And so where do we find ourselves today?
Fred Cate: Well, I mean, generally speaking, we don't have a lot of privacy in that sense, in that we all use technologies every day that we have very little idea what they're doing with our data. And in many cases, that's not their fault. In other words, they're telling us. It's just we don't care enough, I mean, to be honest. So you know, when you poll people, they say overwhelmingly how much we all care about privacy. But then when you say, how many of you have actually done something to protect privacy, like, have you changed a setting on a browser, have you changed the setting on your phone to protect privacy, it turns out it's actually a pretty small number. But there's an additional complexity, and that is a lot of data uses are not well disclosed, and they're not well understood. So, for example, location information that my phone might broadcast - I think we all have some sense that our phones have location information. We know there's a location - something we can change in settings. We know if we go to a map, it shows us where we are on the map, so we must have some sense that there's location. But I think most people don't give much thought to how much information about my location is being transferred and to whom is it being transferred.
Dave Bittner: Yeah, and I think there was a lot of surprise when folks found out, you know, their mobile service providers were selling a lot of that information. And it strikes me that we're kind of in this era now where - and perhaps it's naive of me to think that it's ever been any other way, but I guess what I keep thinking of is just because you can doesn't mean that you should. And we're generating all this information that people find value in the ability to sell it, and I guess personally, I find myself scratching my head from time to time thinking, who thought it was a good idea to gather up and sell this information? But here we are.
Fred Cate: Yeah. I mean, I think you're right. I certainly agree with you. I think some of it is really that we've been focused so long, in our legal systems and in the way we talk about this, on data collection. But in reality, it's probably data use where most of us are actually both most uninformed and most likely to be surprised because, you know, we usually provide data for a pretty good reason. Like, I provide a credit card so that I can buy something, or I go to my doctor, and I tell my doctor something so that my doctor can treat me. What we don't expect is that the data's going to be reused in some completely unexpected way. And obviously, it's hard to define what's expected because each of us might have slightly different expectations. But, again, I think we've seen this a lot in marketing. It's not really a surprise that if you buy something from a retailer, the retailer then offers you a warranty on it - that they're using that information to offer you something related. It's probably more of a surprise if they sell that data to an entirely separate third party to do something that you never even contemplated as part of the transaction.
Dave Bittner: Yeah, and I think a lot of us, you know, had this sense, this sort of promise that was given to us at the outset of a lot of online retailing and so forth. You know, this would be great because we're only going to get ads for the things that we're interested in. And again, personally, I thought, well, that's a good idea. Don't waste my time, you know, putting ads in front of me for things I might not be interested in. But then it seems as though we've sort of crossed the line where so many of the things that are put in front of us, we can't help feeling a little creeped out by it.
Fred Cate: Yeah, I think that's right. And I think also it reflects again that distinction between sort of what we think we know and what we really know or our lack of understanding. So, you know, most people I think really probably prefer targeted ads. In other words, I'd rather see ads that are relevant to me than ads that aren't relevant to me. But again, I don't want to see ads that are too relevant to me. I don't want an ad that says, I see your recent blood test results show you have a, you know, low vitamin C.
Dave Bittner: (Laughter).
Fred Cate: We're going to send you some vitamin C. That feels a little creepy. On the other hand, the other side feels creepy as well. I can't stand when I see ads and I'm like, why do you think this is relevant to me? Like, you know, a beard trimmer - I don't have a beard, so why are you sending me this ad, and who in my family do you think has a beard? And so it's hard. I mean, I'm actually one of the few people who feels a tiny bit of sympathy for people who use data in the commercial environment because, on the one hand, we want great service, we want it highly personalized, and we want it just when we want it. And the only way to do that is with a fair amount of data.
Dave Bittner: What are your insights on the policy side of things? Where are areas that you think need some movement, need some adjustment?
Fred Cate: I think policy is actually, in many ways, probably the most important area here. And maybe it's because I do policy that I would say that. But, you know, the technologies, we run out and we adopt almost without thinking. You know, a new iPhone, you go buy it. A new piece of software, a new app, you download it. And so I think the notion that technology is really going to be our protection here has so far not worked. And part of the reason for that is because even when technology protects us, we will make bad decisions and go around those protections. And, you know, we saw this when Internet Explorer first introduced its malicious software warning. You know, your address bar would turn red, and it would say, do not go to this website; this is known to be a source of malicious software. And we know from monitoring behavior that most people click through it. They say, no, I want to go anyway.
Dave Bittner: Right.
Fred Cate: And so I think policy is where we should be focused because what we're really saying is, what should be the rules, the principles and then the ways those principles are implemented that tell people, here are things you can do, and here are things you cannot do with other people's information - or cannot do without getting explicit consent if you're going to do something unexpected. So I use a credit card to buy something. Of course you're going to have to contact my bank. You shouldn't have to tell me that. Don't give me a notice that states the bleeding obvious. On the other hand, using that information to do something completely unexpected might be something that we either prohibit or we say, no, now you really do have to give clear, explicit notice. These are the types of conditions, these are the types of rules that policy can put in place.
Fred Cate: And right now, I think policy is a mess. Almost everywhere around the world, we either have too much of it, which I think is the case in Europe - we find organizations, you know, tied up with complexities in terms of complying with GDPR. And GDPR is not just - you know, we thought it was going to be the umbrella policy, but it's not. It's supplemented by many other laws, both Europe-wide and national. So we have a sort of overly complex or overly regulated situation. And, you know, in the U.S. and in many other parts of the world, we have what might be an under-regulated situation, where ambiguity is able to be legally exploited by, you know, law enforcement or by businesses.
Dave Bittner: Yeah. And certainly, I mean, here in the states as we see, things are kind of going state by state rather than seeing something come from the feds. How do you see that playing out long term? Do you think that we can sustain that or will we have to see some sort of federal umbrella policy?
Fred Cate: Let me say, I've been thinking for two decades we would see a federal umbrella policy...
Dave Bittner: (Laughter).
Fred Cate: ...And I've been wrong for two decades. So you probably shouldn't pay any attention to what I think on this subject at all. Of course we should see a federal policy. I mean, you know, none of these issues makes sense to deal with state by state. You know, the internet does not respect state boundaries, and very little commerce respects state boundaries. But so far, the federal government has been hamstrung, and right now it's hard to imagine it being any less hamstrung. And so what we end up with is a de facto form of national regulation that emerges from whatever is the most restrictive state. And generally speaking, that's California in most instances.
Dave Bittner: Is there a built-in, systemic disconnect where the rate of change that's built into our political system, our policy system, our ability to execute change, is mismatched with the reality in the tech world? Is that a reasonable thing to say?
Fred Cate: Oh, it's totally reasonable, and it's completely true. But let me say, that's true in almost everything. I mean, there are very few instances where law gets ahead of a problem. Law, or our political system, is always playing catch-up. You know, if there's a recession, we're trying to lessen its effect. If there's a health outbreak, we're trying to catch up to the actual biology of that outbreak. Technology may present some slightly more exacerbated challenges, in that they happen on a bigger scale, and they may happen faster than many other types of changes we see. But I think we shouldn't expect law to get ahead; we should expect law to set frameworks that are adaptable to new challenges. And so, you know, in an ideal world, you shouldn't have to change the law that frequently. It's the way the law's applied that will change as the technology changes.
Dave Bittner: What about where we find ourselves at this moment with the encryption wars? I've seen recently that some apps, like Signal, have said that if certain legislation goes through in the U.S., they may have to pull out of the market. And we hear, you know, talk about how outlawing encryption isn't going to keep people from using encryption. What insights do you have in that debate?
Fred Cate: You know, I don't know that I have any worthy insights. I actually chaired the National Academies' most recent task force that looked at questions about encryption and particularly law enforcement access. And...
Dave Bittner: Yeah.
Fred Cate: You know, one of the things that it's hard to get around, but it's also hard to figure out what to do with, is, again, that we are a global system. And so, you know, the U.S. sort of famously adopted export controls applicable to certain encryption technologies. And then we discovered that, you know, those encryption technologies were built into almost every product - you know, word processing and communications and operating systems - and that people were unwittingly violating the law just by getting on an airplane and going someplace with their laptop or their phone.
Fred Cate: So we over time have sort of backed away from that approach towards encryption and instead have focused on it more as sort of a law enforcement or a national security issue. You know, under what conditions should the government be able to get access? The challenge there is almost any condition in which the government gets access, particularly if it needs it fast - and, you know, the example always used is a kidnapped child. I think we would all be enormously sympathetic to - in that situation, if you had the iPhone that said where is that kidnapped child, you would like to be able to get into it. But to make a tool that gets you into it quickly, that necessarily means you're going to let other people in as well because there's just no way. I mean, there's no tool. There's no approach. There's no technology that we've seen so far that only lets good guys in through holes in encryption.
Fred Cate: And then we have to add the complexity of - not all governments are good guys. Even good governments occasionally act for bad reasons. And so do we really want - even if we all agree that maybe there should be access in the case of the kidnapped child, is there a way to cabin that so we're not also using it to get information on political opponents or on dissidents or on political protesters? These all come together.
Fred Cate: And then we do have this fact that there will always be a workaround, you know? There'll always be - if I can't buy the technology in the U.S., I'll buy it someplace else. If I can't download the software here, I'll download it somewhere else. And so in that sense, I think we're not getting any closer towards clarity on this issue other than appreciating that it's maybe a harder issue than certainly law enforcement - but I think also civil libertarians - originally thought that it was.
Dave Bittner: With the situation we're in right now with the global pandemic and COVID-19, I've seen commentary from many people saying that one thing we need to be mindful of is that, in our effort to combat this and the need to gather data - that we don't inadvertently end up with a COVID-19 equivalent of the Patriot Act, something that perhaps has good intentions at the outset but then lingers for long beyond when the actual danger has passed. I'm curious; what are your insights when it comes to that?
Fred Cate: Well, I totally would agree with that. I don't actually see enormous risk of that right now. In other words, so far, most of our laws - certainly including in Europe - have proven flexible enough to deal with, you know, the issues we've seen from the pandemic so far. Now, I am a little concerned. We're starting to see some proposals out of Europe for laws that would specifically restrict tracking, for example. Well, that seems unwise to me. We're fighting a pandemic that we don't really fully understand, and this does not seem the time to adopt a legislative ban on the use of a tool that might be helpful, as opposed to letting regulators and companies work together - hopefully with consumers involved in that loop - to say, what types of tracking would be appropriate? Can we track just for the purpose of doing contact tracing, so that if someone you've interacted with in the past 14 days turns out to have COVID, we can let you know, or we can tell you, you need to now self-quarantine?
Fred Cate: I think those are the types of things that seem reasonable if we could put in place sufficient protections on either side, if you will, as, you know, guard rails so that these don't then become anti-immigration tools or anti-protest tools or something else. And so I do worry always about, you know, the either misuse of technology or the overuse of technology. But I also worry about the reverse, which is sort of the shying away from technologies that could be the very thing that could get us out from quarantine right now.
Dave Bittner: What sort of advice do you have for the engaged citizen who wants to keep up on these sorts of privacy issues? What sort of tips do you have? What are the best sources and ways for them to stay up to date?
Fred Cate: Fortunately, there are a lot of them, and unfortunately, there are a lot of them. I mean, it's the challenge of having lots of information. Much of it's really terrific. So first of all, I would say the popular press has done quite a good job, including, you know, podcasts like this and others where you don't have to just be focused on privacy to find really interesting privacy stories. I mean, I read the New York Times and the Washington Post every day, and it's rare that there's not a privacy story in each of them. So I think that's one place to start.
Fred Cate: Another is there are a lot of more dedicated sources. It's funny. I actually worry about getting overwhelmed with information, and so I tend to be pretty sparing in what I subscribe to. But on the other hand - I mean, for example, in a related area, cybersecurity, which I think is critically important to privacy because obviously, if you can't secure the data, you can't protect the privacy regarding its use - you know, there are wonderful sources. You know, Bruce Schneier, the famous cybersecurity guru, publishes a newsletter every three weeks or once a month. And that, for me - that's great. It captures, you know, three or four weeks of news and puts it all together in kind of a readable format.
Fred Cate: And then, of course, there are websites like the Center for Democracy and Technology or the Electronic Frontier Foundation, the ACLU of North Carolina - of Northern California. I should never make that mistake again.
Dave Bittner: (Laughter).
Fred Cate: The ACLU generally - I mean, these are all really good sources of data. Even the Federal Trade Commission has quite a good website on privacy issues, you know, that rise to the national level. I do think the issue about use, as opposed to just collection, is really important, and I think we have been myopically focused on collection. Even the Fourth Amendment - privacy as applied to the states through the 14th Amendment - the Supreme Court has interpreted it to focus only on collection. So, like, once the government collects data from you, if it has a lawful purpose, it can do anything it wants, even if what it later wants to do is unlawful. And that makes no sense. Like, none of us think that way. You know, again, we tell a lawyer, we tell a doctor, we tell a friend something, and there's an implied or sometimes explicit promise it won't be reused in inappropriate ways. So I think that's one really key point here.
Fred Cate: And then a second one is I do think COVID-19 is really showing us, once again, the importance of that balance between letting data be used, under appropriate conditions and appropriate controls, for things that make our lives better. And if that means getting people back to work or getting us out of our houses, that could have tremendous value. The problem comes down to trust - you know, is there policy in place, and is the policy effective, so that we don't worry about the data being misused, we don't worry about something untoward happening with it. That could include the data being stolen in a breach, or it could include the government coming in and asking for all of it because it suddenly wants to do something previously unconsidered with it.
Fred Cate: And then just the last thing I'd say, which is - I think there are a lot of tools we have not really explored. We've so relied on notice that nobody reads and consent that nobody really appreciates they're giving that we've not explored other types of tools. And one which I've become particularly interested in recently - I don't have an ownership stake; I don't make any money off this - is things like data review boards. You know, if you think about it, all health research - all research involving humans in this country - is overseen, by law, by institutional review boards: boards that bring together members of the community with members of the research organization and then oversee these types of research studies. Are they safe? Are they appropriate? Are they worth the risk?
Fred Cate: What about if we did something similar around data uses - you know, if we said, look; you've got hard choices to make - should you collect data on your employees so that you can track them in the event of the pandemic breaking out again? And instead of just making that decision yourself or just your lawyers making it, how about if you actually had a data review board that would include people from outside of the company or outside of the government agency or outside of the organization as well as, potentially, people inside? And this would lead to a more thoughtful discussion. You would get some perspectives broader than just your own. You would document your decision, which might be useful if things go wrong later, and might certainly be relevant to a regulator trying to figure out if you acted recklessly or you acted, you know, in willful disregard or if, in fact, you thought you were doing the right thing but maybe you just didn't get the balance right.
Fred Cate: So it seems like there are lots of other tools - I don't want to suggest data review boards are the only one - that, with a little creativity, could move us beyond notice and consent. You know, if I see another ballot initiative in California that says notice and choice - for God's sake, who's got time to read those notices? I mean, nobody reads them. And so we just click yes because we want to move ahead. And I think we've got to apply the same creativity to the policy side as we've been applying to the technology side.
Dave Bittner: All right. Ben, what are your thoughts here?
Ben Yelin: Very much enjoyed hearing from Professor Cate, and it makes me curious to read the book. I'm sure we'll put information on that book in the show notes as well. He's reiterating a lot of themes that we've talked about many times on this podcast but that still remain unaddressed. One thing he talked about is how, obviously, decisions we make about privacy should be made at the policy level by our policymakers. And the real gap we have in this country is that federal policymakers have just not taken the lead on data privacy legislation, and that leaves this sort of bizarre situation where you don't have any federal guidance, and companies are forced to abide by the strictest state laws - generally, you know, California and New York, the ones that we've mentioned.
Dave Bittner: Yeah.
Ben Yelin: The upshot of that to me is that there's really not as much small-d democratic accountability. If you're a Wyoming person who's not happy about a tech company's data policies, those policies are developed because of a law that was passed in California by legislators that you did not elect. So I think that's problematic. And I think, as the professor said, it's why, you know, data privacy is such a federal problem. There are virtually no online interactions that are contained within states, and there's very little commerce that's contained within states, as he said. So I think that's what I took away most from the interview.
Dave Bittner: Yeah. All right, well, our thanks to Fred Cate for joining us, for taking the time. Again, the book is "Bulk Collection: Systematic Government Access to Private Sector Data."
Dave Bittner: That is our show. We want to thank all of you for listening. We want to thank this week's sponsor KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. Request a demo and see how you can get audits done at half the cost in half the time. The Caveat podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.