Caveat 10.30.19
Ep 2 | 10.30.19

Privacy and biometric data.

Transcript

Elizabeth Wharton: How do we protect this information? And while we can't perhaps opt out, we can at least have some say in how it's used. 

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast, where we discuss online privacy, surveillance, legal cases and policy battles that affect our daily lives. I'm Dave Bittner from the CyberWire, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hi, Ben. 

Ben Yelin: How you doing, Dave? 

Dave Bittner: We've got some interesting stories to share and, later in the show, my interview with Elizabeth Wharton. She's an attorney, and she's also vice president of operations and strategy at security company Prevailion. She's joining us to talk privacy and biometric data. We want to remind you that while this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: We've got a lot to share this week, but first, a word from our sponsors, KnowBe4. And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things - you can accept it, you can transfer it, or you can reduce it. And of course, you might wind up doing some mix of the three. But consider - risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk. 

Dave Bittner: And we are back. Ben, why don't you kick things off for us this week? 

Ben Yelin: Sure. So I spent a Wednesday morning on C-SPAN, as I sometimes do, watching... 

Dave Bittner: (Laughter) Living the dream, Ben - living the dream. 

Ben Yelin: I am living the dream. It was a three-hour hearing of the House Judiciary Committee about the Foreign Intelligence Surveillance Act, parts of which are up for reauthorization this fall. A few members of the Department of Justice and the National Security Agency were there to testify on behalf of some of these programs, and they are calling for all of these programs to be extended permanently. So there are a few provisions of that law that are set to expire. I'll mention a couple of them briefly and then focus on one that is of particular interest to me. 

Dave Bittner: OK. 

Ben Yelin: One of them is what's called the roving wiretap provision. So FISA has a provision in the law where if a target of surveillance tries to evade the government's surveillance techniques by, say, switching phone providers, the government can authorize what are called roving wiretaps. You don't have to get a separate FISA order to conduct surveillance if this person changes the device that they're using. 

Dave Bittner: So the wiretap follows the person, not the device or the address? 

Ben Yelin: Exactly. 

Dave Bittner: OK. 

Ben Yelin: So that's up for reauthorization. 

Dave Bittner: OK. 

Ben Yelin: The business records provision allows the government to apply to the Foreign Intelligence Surveillance Court for an order compelling the production of business records - driver's license records, apartment leasing records. This kind of used to be known as the library records provision because, I think... 

Dave Bittner: Oh, yeah. 

Ben Yelin: Yeah. What stuck out to people is that they could get a subpoena to obtain your library records. Mine would be extremely boring. But, you know, I'm sure some people have interesting library records. 

Dave Bittner: (Laughter) Sure. 

Ben Yelin: A lot of children's books for me. The lone wolf provision - you know, most of FISA is designed around foreign powers, so either the big bad guys like Russia and Iran or terrorist organizations like al-Qaeda. But we also want to be able to surveil lone wolves, people who aren't part of any ideological group. And so this provision allows the government to do that, and so that's set to expire as well. 

Ben Yelin: The main one, and sort of the central topic for this hearing, is the call detail records program. This is a program you might remember from the Edward Snowden leaks in 2013. The domestic communications carriers were routinely handing over call detail records of all of their domestic customers to the National Security Agency to be available for data mining. So if the NSA had a reasonable, articulable suspicion that they could find evidence of terrorist activity in these call records, they could search them. Now, this is just metadata - so it's the number making the call, the number receiving the call, the duration of the call. 

Dave Bittner: Mm-hmm. Time of day - that sort of thing. 

Ben Yelin: Exactly. But, you know, depending on who you're calling, that could be pretty personal information. 

Dave Bittner: Right. 

Ben Yelin: Congress reauthorized this program in 2015. They changed it a little bit so that now the government has to go to the Foreign Intelligence Surveillance Court to get authorization to get these records for an individual selector - so, like, for an email address or a telephone number. And they can obtain those records from the company directly. 

Dave Bittner: Right. 

Ben Yelin: So it's the company that actually holds the records. 

Dave Bittner: So rather than NSA gathering up all this information and keeping it themselves, it's now the job of the phone companies to do that, and NSA has to get permission to go into that... 

Ben Yelin: Exactly. Now, you know, pick your poison - who you'd rather have holding your call detail records, the government or the telecommunications company. I think most people would probably say the telecommunications companies. So this obviously became very contentious in the wake of the Snowden disclosures. In the past couple of years, we found out that, routinely, the NSA has been collecting unauthorized data - data that they didn't mean to collect and that is not authorized under this program. 

Ben Yelin: We've also sort of learned gradually that this has not been a very effective tool in fighting terrorism, especially as our adversaries have moved to more advanced technology than simple, you know, call detail records. And so as a result, the NSA itself, in a highly unusual step, actually recommended that this program be discontinued. They recommended that to the Trump administration. The Trump administration, led by the former director of national intelligence, Dan Coats, said that they wanted permanent reauthorization of this program. So this was going against the recommendations of their own National Security Agency. 

Ben Yelin: What they say is, even though this program is not successful now, you never know which threats are going to present themselves in the coming years, so we might as well have this power at our disposal. And what was interesting to me about this hearing is that most members of the committee were just not buying that explanation. 

Dave Bittner: Really? 

Ben Yelin: I mean, I think - to paraphrase, you don't give somebody a loaded gun and expect them never to shoot it. And we're giving the NSA this incredibly powerful weapon to collect call detail records even if they don't have probable cause that a person is an agent of a foreign power or committing a crime - just on reasonable suspicion that you're going to get foreign intelligence information. And particularly the Democratic members on this committee were very skeptical of the NSA and its justification for continuing the program. The upshot of all of this is it is now unclear whether this program will be reauthorized. 

Dave Bittner: Despite the NSA suggesting that they don't want it or need it anymore. 

Ben Yelin: Right. Now, the representative of the NSA who was at this hearing said, even though we're not going to use this program - the call detail records program - because we haven't figured out a way to run it efficiently, we still want the legal authority if the case may present itself. 

Dave Bittner: I see. So - oh, interesting. So they're not saying, do away with the authorization. They're just saying, even though we're authorized, we may not find it useful to do. 

Ben Yelin: Right, and I think the chairman of the House Judiciary Committee, Jerry Nadler - a guy you've probably seen in the news a lot recently - was just very skeptical of that argument and said, this is potentially a large invasion of privacy. I mean, call detail records - if you're calling your, you know, therapist at 3 in the morning or, you know, you're calling a sex hotline or an abortion doctor or something, that's a pretty detailed description of a person's private relations. 

Dave Bittner: Right. 

Ben Yelin: And I think in Chairman Nadler's view, if we're going to do something like that, it better be narrowly tailored to achieve the government's objective, and it actually has to work in terms of being a counterterrorism tool. Some Republicans on the committee also expressed, I think, almost some shock at the fact that this program was still in existence and that the National Security Agency was still pushing for it. Other Republican members seemed to favor permanent reauthorization. 

Ben Yelin: Generally, these things tend to get reauthorized. You know, a disadvantage for proponents of these types of programs is a lot of the information that would justify them is classified, so they don't want to say anything publicly at a congressional hearing - but they might be able to have greater influence behind closed doors. Still, my guess is that the call detail records program as it exists now may not be reauthorized by the end of this calendar year. 

Dave Bittner: You know, the one insight I have on this is - I've interviewed several people from NSA over the years, including some pretty high-level folks. And they've made the point that when, for example, information is gathered accidentally, it is a royal pain in the butt for the people who gathered it up. 

Ben Yelin: Absolutely. 

Dave Bittner: There's a whole series of things they have to do and have to document, and it really throws kind of a monkey wrench into their day-to-day operations. So particularly that accidental gathering - they really try to avoid it, because it kind of ruins your day with all the things you have to do to mitigate it. 

Ben Yelin: Absolutely. I mean, every NSA program has what are called minimization procedures - so getting rid of data that you were not authorized to collect - and I've heard the same thing. It's a pain. The analysts there are always - in my view, at least - acting in good faith. It's not like they were seeking to collect unauthorized records. Whatever system they were using failed them. 

Dave Bittner: Right. 

Ben Yelin: And that creates a lot of extra work for them, and time is a finite resource. I mean, what's potentially useful about the call detail records program is that it added some color to a general intelligence picture. Which target was calling whom - you know, mapping that out to different hops. So who was Person A calling? If they're calling Person B, who is Person B calling? And so the time should be used to actually conduct that analysis, not towards purging records. And I think because it became such a burden, the NSA said, it's just not worth our time and effort anymore. 
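A minimal sketch, in Python, of the two-hop "contact chaining" Ben describes - assuming call detail records are reduced to a simple caller/callee edge list. The numbers, record format and function names here are illustrative only, not the NSA's actual tooling.

from collections import defaultdict

# Toy call detail records as (caller, callee) pairs. Real CDRs carry
# metadata such as timestamps and call duration - but not call content.
cdrs = [
    ("A", "B"), ("A", "C"),
    ("B", "D"), ("C", "E"),
    ("D", "F"),
]

def build_contact_graph(records):
    """Map each number to every number it called or was called by."""
    graph = defaultdict(set)
    for caller, callee in records:
        graph[caller].add(callee)
        graph[callee].add(caller)  # treat contact as undirected
    return graph

def contact_chain(graph, seed, hops=2):
    """Return every number within `hops` links of the seed selector."""
    frontier, seen = {seed}, {seed}
    for _ in range(hops):
        frontier = {c for n in frontier for c in graph[n]} - seen
        seen |= frontier
    return seen - {seed}

graph = build_contact_graph(cdrs)
print(sorted(contact_chain(graph, "A")))  # ['B', 'C', 'D', 'E']

Here, starting from selector "A", the first hop reaches B and C, and the second hop reaches D and E - the kind of two-hop network mapping Ben is describing. 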

Dave Bittner: Yeah. All right. Well, we'll keep an eye on it, see how it plays out. My story this week, I have to say, is a little lighter. It's actually kind of - it seems like it might be straight out of a soap opera, but... 

Ben Yelin: This is totally going to be on a "Law & Order: Special Victims Unit" episode in the next year. I'm willing to guarantee that.

Dave Bittner: This is from The Daily Beast, written by Julia Arciga, and the headline is "Husband Ordered to Pay Almost $500,000 After Bugging Tobacco Heiress Wife's iPhone." So a gentleman by the name of Crocker Coulson - great name - he is the chairman of a performing arts school in Brooklyn. 

Ben Yelin: Sort of sounds like a Faulkner character, doesn't it? 

Dave Bittner: It really does. 

Ben Yelin: Yeah. 

Dave Bittner: I mean, again, it is right out of a soap opera. 

Ben Yelin: Absolutely. 

Dave Bittner: That is a soap opera name, so hats off to him. As I said, he's the Brooklyn Music School chairman, and he has been ordered to pay his ex-wife $200,000 in compensatory damages, $200,000 in punitive damages and $41,500 in statutory damages, which works out to $100 for each of the 415 days he had access to her phone. So for over a year, he was essentially bugging her phone. And what happened was these folks were in the midst of a divorce, and his ex-wife's divorce lawyer discovered that he had been spying because, while going through financial records, he found a payment for a piece of software called OwnSpy, which lets you listen in on conversations on other people's phones. 

Ben Yelin: Yeah. You got to keep those PayPal accounts private... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Those PayPal transactions. If you don't want to get caught, what a shame to have that $50 charge show up in your PayPal bill. I mean, it's like you get a - you know, a bill for $50 that turns into a bill for $200,000... 

Dave Bittner: Yeah. 

Ben Yelin: ...Or whatever it is. 

Dave Bittner: Let me go out on a limb here, Ben, and say maybe the better advice would be not to bug your ex-wife's phone. 

Ben Yelin: Yeah, don't do that. That is certainly the best piece of advice. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: For one - so that violates a couple of federal statutes, but most notably the Electronic Communications Privacy Act, which says that you cannot surveil somebody else's private device. There are a lot of different exceptions to that act, particularly as it relates to government surveillance. But this is a pretty clear violation, and that's reflected in the statutory damages. 

Dave Bittner: And there are no family exemptions for that? For example, obviously, you know, a husband and wife. But what about children? 

Ben Yelin: So I think there are exemptions because, technically, it's the parent who actually owns that device. You have to be 18 or older to sign the contract. 

Dave Bittner: I see. 

Ben Yelin: So the child could be using the phone, but the parent is actually the purchaser of the device. I assume here the wife - the heiress - it was her phone. I would presume because of the outcome of this case that the husband - his name was not on that contract. 

Dave Bittner: Right. 

Ben Yelin: They were not consolidating their phone minutes. 

Dave Bittner: Did not get her permission (laughter). 

Ben Yelin: Exactly. 

Dave Bittner: Right, right. 

Ben Yelin: And so without her permission, I mean, you are trespassing into somebody else's private phone conversations. To give you an idea of how egregious the court thought this was, $200,000 in punitive damages is a lot. 

Dave Bittner: Is that right? 

Ben Yelin: Yeah. You know, punitive damages are intended to be used only in the most egregious of circumstances. We're saying not only should you compensate the victims for their, you know, emotional and financial losses, but we're going to tack on more money just as a punishment to you for engaging in this type of conduct because we want to prevent other people from doing crazy things like monitoring their wife's cellphone for over a year. 

Ben Yelin: So to me, this just says that the court took this surveillance very seriously. I mean, $500,000 is a lot of money to pay. I don't know how much the chairman of a performing arts school makes, but probably less than a tobacco heiress, I would guess. 

Dave Bittner: Now, explain to me what's the difference between the punitive damages and the statutory damages. 

Ben Yelin: So the statutory damages are monetary damages for violating the Electronic Communications Privacy Act. The punitive damages are levied by the court, so it can either be a jury, if this was a jury trial, or the judge, if the judge was the finder of fact. And they'll levy those damages not on the basis of any statute but on the basis of their own interpretation as to whether there should be additional punishment, so to speak, based on the egregious nature of the conduct. 

Dave Bittner: I see. 

Ben Yelin: And so $200,000, in that context, is a lot. Like I said, I mean, we see punitive damages when there's, you know, exploitative behavior. A lot of the punitive damages cases are when, you know, corporations have some policy or hidden fee that rips off consumers and compensatory damages wouldn't be enough to convey the message that this is something that's wrong. So when you see punitive damages, it's almost always a bad sign for a defendant. 

Dave Bittner: Yeah. All right. Well, I guess the lesson here is that if this is something you're thinking about doing, not only is it the wrong thing to do, but it could be an expensive thing to do. 

Ben Yelin: Yeah, that's - $500,000 is a lot of money. And you may think you're only paying $50 for spying software, but it's going to cost you a lot more if you're caught, so. 

Dave Bittner: Yeah. All right. Well, it's time to move on to our Listener on the Line. 

(SOUNDBITE OF PHONE DIALING) 

Dave Bittner: This week, we've got a call from a gentleman named Russell (ph). He says he's from Utica, New York. And Russell has some privacy concerns that are, I suppose, more common than you might think, especially as folks are living more of their lives online. We'll let Russell describe it in his own words. Here's Russell. 

Russell: Hi. This is Russell. And I'm wondering how much I need to be concerned with people monitoring my cameras and microphones on my laptop. Can the FBI actually listen in, or is that just something that happens on TV shows and movies? Thanks. 

Dave Bittner: All right. So, Ben, what do you think? 

Ben Yelin: So certainly, you understand the fear, right? I mean, nothing scares us more than the government hacking into our computers. 

Dave Bittner: Something you see in Hollywood movies. 

Ben Yelin: Exactly. 

Dave Bittner: Right. 

Ben Yelin: Spying on whatever it is we do with our laptops. 

Dave Bittner: Right. 

Ben Yelin: So I understand the fear. There have been a limited number of judicial cases dealing with this. 

Dave Bittner: Really? 

Ben Yelin: Oftentimes, it's the FBI going to seek a warrant to conduct these searches for some sort of serious intelligence or criminal justice investigation purpose. And courts have been very, very reluctant, for obvious reasons, to grant warrants to allow the FBI to do this. The bigger concern from my perspective is private hackers who possess the technological capabilities and don't face the same sort of restrictions that the government does on conducting this type of surveillance. 

Dave Bittner: Right. There's no oversight of the bad guys. 

Ben Yelin: Exactly. 

Dave Bittner: Right. 

Ben Yelin: So it's really not that much of a threat when it comes to law enforcement, because law enforcement agencies have only tried it a limited number of times. 

Dave Bittner: I guess it's not worth the effort. They know that it's a tough thing to get permission to do. 

Ben Yelin: Absolutely. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, this is as much of an invasion on one's privacy as you can possibly think of, meaning you're going to have to have a pretty persuasive justification to a judge that you need to conduct that type of surveillance. 

Dave Bittner: I see. 

Ben Yelin: And it has been done in a couple of circumstances, but it's a very, very high bar. You'd have to have probable cause that you'd catch somebody basically committing a crime on their webcam. You know, when it comes to protecting yourself from hackers, obviously, always have your latest security updates installed, first and foremost. If you're super paranoid, you know, put a little towel in front of your webcam. 

Dave Bittner: Cover it with a Band-Aid. 

Ben Yelin: Yeah, exactly. Those are... 

Dave Bittner: I've seen people do that. 

Ben Yelin: ...Foolproof methods. 

Dave Bittner: Yeah, yeah, yeah, yeah. There are little sliding windows you can put in front of them - you see people give them away at trade shows, and you can find them at your local electronics store. 

Ben Yelin: Yeah. So I found out that this listener isn't the only person who has this fear. James Comey, among other people, has said that he has something that covers his webcam when he's on his laptop computer. So you're not alone, listener. I wouldn't worry as it relates to law enforcement unless you are literally in the commission of a crime. 

Dave Bittner: You've got bigger things to worry about, probably... 

Ben Yelin: Exactly. 

Dave Bittner: ...In that case. All right, well, thanks to Russell for calling in and asking that question. Coming up next, we've got my interview with Elizabeth Wharton. She's an attorney, and she's also VP of operations and strategy at security company Prevailion. She's joining us to talk about privacy and biometric data. 

Dave Bittner: But first, a word from our sponsors, KnowBe4. So let's return to our sponsor KnowBe4's question. How can you see risk coming, especially when that risk comes from third parties? After all, it's not your risk until it is. Here's step one. Know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendors' security risk requirements. You'll not only be able to prequalify the risk. You'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you get this in an effectively automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Check it out. 

Dave Bittner: And we are back. Ben, I recently had the pleasure of speaking with Elizabeth Wharton. She is an attorney, and she is also one of the founders of a security company called Prevailion, and she joins us to talk about biometrics. Here's my conversation with Elizabeth Wharton. 

Dave Bittner: When you look around and you sort of establish what the state of things is when it comes to biometrics and how that's affecting people's privacy, how do you describe that? 

Elizabeth Wharton: It depends on the context you're looking at. I mean, on the one hand, everybody's uploading pictures to Facebook and other social media and tagging themselves, while recognizing that there are benefits to being able to use your fingerprint to start your laptop. You can bypass long security lines at the airport using your face as your passport - similarly, customs lines, if you're coming in and out of the US. So you've got this whole mix of competing interests, and at the same time, states are jumping in and saying, hey, wait a minute, we missed the boat on some other privacy protections. We're not going to miss the boat this time. 

Dave Bittner: And so what are we seeing in terms of actions from individual states? 

Elizabeth Wharton: Well, they're going in, and just as with a lot of stuff - for example, data privacy laws in general, where there is no federal data breach standard in law - what we're seeing at the state level is they're saying, hey, we've created our patchwork here; let's create a patchwork for this. Even Maryland just jumped in the game, passing legislation that treats biometric data as PII - same as what you saw with HIPAA and some of the others - so that it's protected, be it facial scans or otherwise. And I'm curious to see where some other states are going to go, because I feel like it becomes a game of one-upmanship - OK, you did this? Well, here, hold my beer. Watch what we're going to do. 

Dave Bittner: And classifying it as PII - what are the practical implications of that? 

Elizabeth Wharton: Well, it brings in additional protections. In the cases you're seeing - for example, in Illinois - with biometric data, your facial recognition information, the measurements of your face, you have to have a plan. Same as with other PII, you have to have someone in charge. You have to say, who can view this information? How can it be used? How is it being stored? And how is it being destroyed as well - is it six months? Is it two years? You can't just keep it on somebody's laptop and email it out to whoever asks for it. 

Dave Bittner: Now, what about broad surveillance systems? I'm thinking, you know, close to where we are. Baltimore certainly has a system of video cameras in place. We've heard about these sorts of things at airports, where they're checking who might be on one list or another by using facial recognition. 

Elizabeth Wharton: Yeah, well, you have airlines - I believe it's Delta - going in and saying, hey, you don't need your boarding pass and your driver's license; just sign up for the facial scan. CLEAR is also using biometric data. I went through BWI just this week, and they offered, hey, for five minutes of your time, we'll get you set up in CLEAR, and you'll be able to just breeze right through - no more invasive lines and scans and all that. And I thought, all right, interesting. 

Dave Bittner: Yeah, that is interesting - sort of trading one thing for another, perhaps. 

Elizabeth Wharton: Yes, and where it also gets to be fun is there's a difference between private entities collecting this information, where you're voluntarily offering it, and involuntary collection - be it license plate readers, a different kind of data. At the same time, if it's the government or local law enforcement collecting it, they're going to have different obligations and protections than a private entity - a company, a store - tracking your face, and perhaps reading your expressions, as you go through and stand in front of one display as opposed to another, or what's catching your eye. 

Dave Bittner: Yeah. It makes me wonder what options are available in terms of opting out. This may seem like a really basic question, but is there anything that keeps me, from a legal point of view, from wearing some sort of disguise - from putting on a fake mustache and, you know, sunglasses and a hat, a false beard or something like that? 

Elizabeth Wharton: It depends on who's collecting the information and how it's being used. And I'm curious to see how states are going to handle that. Just because someone wants to wear glasses - maybe they don't need them - are they using them to evade data collection? Is it considered a mask? Who's to say, if I have a nose job, whether it's because I'm evading something or just because of an awkward-shaped nose, in my opinion? 

Dave Bittner: Right. Did you get your eyebrows done or have a mole removed or something like that that could throw these scanners off? 

Elizabeth Wharton: Yeah. And the flip side is, when does it become acceptable to say, let's check medical records for this, or makeup? I mean, you look at choices in appearance - and I've seen certain things advertised as, this will help defeat certain systems - but when you're talking about the protests going on in Hong Kong, at what point is altering your appearance to avoid detection protected as free speech or, in their case, protest? Well, what if it's the bank on the corner that was using facial recognition instead of your ATM card, that kind of thing? What happens to that information? Can law enforcement pull it and use it for whatever? 

Dave Bittner: Yeah, it also strikes me that we've seen these studies that show that facial recognition systems in particular have a much lower degree of accuracy when it comes to minority groups. 

Elizabeth Wharton: Yeah. I mean, they're biased. Exactly. At what point do we rely on the AI when the algorithms behind it weren't created perfect? And it's not like - should I have to go get a tattoo added or removed because somebody stole my identity? 

Dave Bittner: Do you suppose we're headed in some direction where we're going to establish exactly what our rights are and are not when it comes to this sort of thing? 

Elizabeth Wharton: If not - I mean, the states are starting to take it upon themselves to do this, ultimately to avoid the confusion of, well, I'm compliant with this state or this country. And what happens when you're traveling abroad? I mean, you look at what happened in the EU with the right to be forgotten - apparently, it only applies to folks in the EU, according to a recent court decision. 

Elizabeth Wharton: So similarly - well, it's OK for a company operating in Georgia to collect this information, but not for one in Maryland. And I think as state legislators and Congress start looking at some of these issues, especially with how they're looking at Facebook, they're going to have to make some decisions about, as you said, how do we protect this information? And while we can't perhaps opt out, we can at least have some say in how it's used.

Dave Bittner: Yeah, because your biometric information is more or less forever. 

Elizabeth Wharton: Yeah. When you start looking at what's covered - in Maryland, it's not just facial recognition; it's voice recognition. It's your genetic material, as well as any unique biological characteristics. Well, what happens when the next state over doesn't capture the same information? Again, it's not like you can change your fingerprint easily. 

Dave Bittner: I also can't help wondering about - you know, let's say there was a state that had certain rules when it came to this sort of thing. But let's say I'm shopping at a retail establishment that has some sort of centralized security system - you know, multiple stores send their signals back to one place. How do they filter out who's from where and what they can do in one place versus another? I can see this getting complicated. 

Elizabeth Wharton: Well, ask - I believe it's Home Depot and Lowe's that are currently going through this, at least in the state of Illinois, because in their case, they were using facial recognition as part of their security system. And not having a plan in place for who can access what information is potentially going to trip them up in those states. And the lawsuits to that effect are nothing new - some of them go back to 2016, 2017. But how do you do that? 

Elizabeth Wharton: And what do you do with stores that might have, as you noted, a central database where they're keeping some of this information? Their marketing center could be in California, whereas most of their stores are on the East Coast. What happens when you get those conflicting rules? And also, as with any great policy, what happens when people don't follow it? 

Dave Bittner: Right. How do you enforce the regulations? What's your enforcement regime, I suppose? 

Elizabeth Wharton: Well, enforcement's only as good as the follow-through. I mean, as we see with ransomware and similar data breaches, you know, we had a great policy for this, we checked the box, but somebody either ignored it or the system failed. And this is a little bit different than a bank account number or credit card number; I mean, this is your biometric information. 

Dave Bittner: Right. What are your recommendations for organizations who want to get ahead of this, who may want to use this sort of technology, but they want to make sure that they're doing so within the proper legal boundaries? Where should they begin? 

Elizabeth Wharton: Well, the easiest place is looking at, why am I collecting this information, and what am I doing with it? It's the same thought process that they're going to have to go through with any of their customer data - that while it might be great to have certain beacon location data on when certain customers are coming into their store, you need to know who has access to this information. Are we collecting it for business purposes? 

Elizabeth Wharton: Are we collecting it for privacy or other, you know, purposes? Basically, defining what you are collecting, why you are collecting it, how you are protecting it, who has access to it and who is destroying it. The same thought process and risk analysis you'd apply to any other piece of information - this needs to fall into that same framework. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: Very interesting. One thing that always sticks out to me on topics like this, and I think Elizabeth articulated it quite well, is this lack of voluntariness. We might, in the abstract, be able to avoid biometric scans. You know, maybe I opt out here, and I opt out there. Sure. Delta right now is the only one that will scan my face as my airline ticket; maybe I fly United. In the long run, that's not going to be the case. And you know, unless you want to be a literal shut-in who lives in the woods, your biometric data is going to get exposed. It's a part of your daily life. They're going to read your license plate. They're going to have access to your license plate photos. Somebody has access to your fingerprint. And so I think what's difficult from a consumer's perspective is there really isn't a meaningful opportunity to opt out of the collection of biometric data... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which makes it, I think, even more problematic that the government would not need a traditional warrant to get access to this data. 

Dave Bittner: I recall someone saying recently - it stuck with me. They said, what it comes down to at a certain level is you either participate in society or you don't. 

Ben Yelin: Right, that's exactly it. I guess you could say part of the cost of participating in societal affairs, the cost of traveling, the cost of walking around a city that has persistent video surveillance or aerial surveillance, is you lose a measure of privacy. Most of us will never notice. The government's not going to try and get access to my fingerprints or to my facial biometric data. But I think we have to recognize that that's the sacrifice we're making. And as technology continues to evolve, there's no longer going to be any meaningful opportunity for people to opt out of sharing their biometric data. It's almost going to be collected as a matter of routine. And you know, that rubs a lot of people the wrong way. 

Dave Bittner: I can't help wondering if, overall, it ratchets up the level of anxiety that we all have - this notion that we're always being watched, even when we're just walking down the street minding our own business, that there are cameras on every corner collecting our data. Does that lead to an overall feeling of wariness, of anxiety? It's something I wonder about. 

Ben Yelin: I wonder about it, too. I mean, I would love to see some, like, public opinion data on this. My just sort of anecdotal impression is that it's just not something people think about very much because you're not really going to come into contact with the consequences, you know. 

Dave Bittner: Right. 

Ben Yelin: Maybe after you're arrested for committing a crime, you know, local law enforcement will reveal at trial that they collected biometric data; they were able to, you know, geolocate you based on your cellphone's GPS, you know. 

Dave Bittner: Right, right. The security camera footage from the McDonald's you had breakfast at that morning was collected. 

Ben Yelin: Exactly. 

Dave Bittner: Sure, sure. 

Ben Yelin: But until you're at that trial, it's just not something that one really notices. You know, I worry about it in the long run. If it is something that people start to think about, then it could have a real chilling effect on people's expression, people's rights of association. I mean, if you knew that there was going to be persistent surveillance wherever you went, maybe you'd be more reserved about going to a favored religious institution or another group that might be potentially, you know, publicly disfavored. And that would, I think, be really bad. Now, of course, the other side of this is convenience. I love the fact that I turn on my iPhone, it reads my face, and it turns on. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: I don't have to exhaust my thumbs by typing in a passcode. 

Dave Bittner: Yeah, get the blisters on this - on your thumbs. Yes, absolutely. 

Ben Yelin: Yeah. It's - you know, that's excellent. 

Dave Bittner: Right. 

Ben Yelin: Most of us probably would value that convenience because the effects on our personal privacy are theoretical and tangential. 

Dave Bittner: All right. Well, it certainly is a lot to think about. We want to thank Elizabeth Wharton for joining us. And we want to thank you for listening. 

Dave Bittner: And of course, we want to thank this week's sponsor, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time. 

Dave Bittner: Our thanks to the University of Maryland Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.