Caveat 8.11.22
Ep 137 | 8.11.22

Is privacy included?

Transcript

Jen Caltrider: I can't tell you how many privacy policies I read that crow at the top of their privacy policy, we will never share or sell your data without your consent. We care about your privacy. I mean, every company says that, right? And then you keep, like, reading and digging into their track record. And you're like, holy cow. Like, too many companies just collect as much personal information as they possibly can. You know, it's really hard to trust these companies these days when it comes to privacy.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance law and policy podcast. I'm Dave Bittner, and joining me is my co-host Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben has the story of how a geofence warrant for data from 2015 might be the deciding factor in a death penalty case. I've got the story of the FBI compelling someone to open an app with their face. And later in the show, Carole Theriault interviews Jen Caltrider from Mozilla's Privacy Not Included initiative. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got a lot to cover this week. Why don't you start things off for us? 

Ben Yelin: So this is a story about geofence warrants in a case that dates back to 2015. So in 2015, in Kansas City, Mo., the police department was investigating a spate of crimes. They believed these crimes were connected. So there was some cocaine dealing, an armed robbery and a murder. They tried to pin this on two individuals - who ended up being indicted in 2018 - but they didn't quite have the evidence necessary to obtain convictions for the homicides. And the DOJ is seeking the death penalty in the case, if they're able to obtain a conviction and the post-conviction determination that the death penalty is warranted. 

Dave Bittner: Wow. 

Ben Yelin: So they need some type of definitive piece of evidence that the suspected perpetrators were at the scene of the crime when it occurred. And that brings us to geofence warrants, which, as we know, are requests - or more like demands - that Google, or whichever company, turn over data on all of the devices that were in a particular area at a particular time. And the data that the Kansas City Police Department is seeking here dates all the way back to 2015. 

Ben Yelin: We don't know exactly what the data is going to show, but there are at least suspicions that the data will indicate that the suspects were among the people identified at the scene of the rather gruesome crimes that happened in 2015. And if the use of these geofence warrants is upheld, and that turns out to be the deciding factor both in determining guilt or innocence and in determining whether the death penalty will apply, then we could see perhaps the first case in this country of somebody being put to death at least partly on account of this type of digital surveillance... 

Dave Bittner: Wow. 

Ben Yelin: ...Which is really remarkable. So an interesting angle to the story here is that for all Google users who obtained their accounts after 2020, they are enrolled in a provision that automatically deletes all of their location data after 18 months. 

Dave Bittner: I - yeah, OK. That's where I was going, so go on. Go on. 

Ben Yelin: Yeah. So that applies to users who obtained accounts after 2020. 

Dave Bittner: OK. 

Ben Yelin: For anybody - like most of us - who opened our Google accounts prior to 2020, the default option is that our location data is maintained forever. 

Dave Bittner: That's a long time. 

Ben Yelin: Yeah, meaning it follows us through our travels and tribulations over a period of perhaps a decade or more. 

Dave Bittner: Yeah. 

Ben Yelin: Now, you can opt out of having this data collected. You can opt in to the option of having it scrubbed automatically every 18 months or... 

Dave Bittner: You should totally do that (laughter). 

Ben Yelin: As you should do, especially if you are committing crimes. 

Dave Bittner: Right. 

Ben Yelin: But that's not something that most people have the wherewithal to go about doing, especially if you've been indicted and have been behind bars. 

Dave Bittner: Right. 

Ben Yelin: So really, Google, in trying to defend themselves for providing the data in this case, is saying, after 2020, we don't keep permanent records of location information. These are automatically deleted after 18 months. But what they're not telling you is that that provision doesn't extend to old users, people who opened their accounts prior to 2020. So there are two things that really opened my eyes about this story. One is we're talking about the death penalty here. I mean, this is the most serious consequence of a criminal case. 

Dave Bittner: Right. 

Ben Yelin: And you never know which piece of evidence is going to be the determining factor that leads to somebody getting the death penalty. But the fact that it's location data obtained seven years ago, I think, is really eye-opening. The fact that Google's ability to collect where you are at a particular time - having a dossier on your location history - could mean the difference, at least for a couple of individuals, between life and death. And that's something I don't think we've talked about before - I certainly don't think I've seen it before. And then the other really interesting element of this is that people who might have heard about Google's new policy on deleting old location data records might not have realized that it doesn't apply to people who have been users since before 2020. 

Dave Bittner: Yeah. 

Ben Yelin: And most of us aren't on Gizmodo or aren't on Motherboard by VICE 24/7 and don't realize that we have an affirmative obligation to opt out of this data collection. So it just really caught my eye. 

Dave Bittner: Yeah. I mean, is this the kind - I'm thinking of a geofence warrant in general and the kind of evidence that someone would present in court to a jury, to a judge, whatever. It's not - a geofence doesn't say this person did the crime. It just says this person was at this location at this time where there was a crime committed. So... 

Ben Yelin: It is circumstantial. Absolutely. 

Dave Bittner: Right. So is it one of those things that just - it's one of the building blocks to help remove reasonable doubt? 

Ben Yelin: Exactly. It is a piece of circumstantial evidence. A geofence warrant, in and of itself, would never be enough to convict somebody. 

Dave Bittner: OK. 

Ben Yelin: Because - I guess I should take that back. If a crime was committed in an isolated area and you were the only person identified in that geofence warrant, you could probably convince a jury beyond a reasonable doubt that that person committed the crime. But otherwise, yeah, it's an investigative method that's supposed to augment other sources of evidence. And here they already have some evidence, at least for the other crimes committed as part of this crime spree, including things like DNA evidence, which are more probative. So a geofence warrant is just adding to the picture. People kind of talk down circumstantial evidence. Circumstantial evidence isn't the best evidence, but it can still be extremely compelling. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, if I were a cop and pulled you over and didn't catch you drinking, but your breath smelled like alcohol and there were empty bottles on the ground, that would be pretty convincing. I could put two and two together. So just because circumstantial evidence isn't direct evidence, I don't think we should pooh-pooh it and say that it wouldn't be a deciding factor. We don't know exactly what a juror - jurors are human beings like the rest of us - would glom onto in considering somebody's innocence or guilt, or potentially in considering whether somebody deserves the death penalty. 

Dave Bittner: I guess I'm thinking of geofencing from two different angles here, which - and I'm probably describing this inelegantly - but as primary evidence versus supporting evidence, right? 

Ben Yelin: Right. 

Dave Bittner: In other words, if my whole case is built on geofence data, well, that's one thing. And I would say that in my mind - my opinion - you know, that would be kind of shaky because it is so circumstantial. But if I had a mountain of other evidence and I went to Google and said, hey, was this person at this place at this time? And that's just another, you know, another brick in the wall... 

Ben Yelin: Right. Going to get Pink Floyd in my head. 

Dave Bittner: There we go. That's a little different. I guess it's a different use case. And I'm not - and my initial reaction is, I guess I'm more OK with that than it being a primary bit of data. But is that - I mean, is that a flawed line of thinking in my case, legal boy? 

Ben Yelin: I don't think it's necessarily a flawed line of thinking. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: But I will say that the use of the data is concerning just because the records themselves are so vast, going back so many years, and because this is something that's completely unprecedented. I mean, never before have we had the type of technology that would allow law enforcement to pin us at a certain location at a particular time in the past. 

Dave Bittner: Right, for the past decade. 

Ben Yelin: Exactly. 

Dave Bittner: Yeah. 

Ben Yelin: You know, there were things like surveillance cameras, but that's just not the same as doing a data dump of a single cellphone tower or figuring out everybody who was at a particular location at a particular time. So I think those are the elements that are particularly concerning, whether it is the deciding piece of evidence or not in a given case. And like I said, sometimes you never know what the deciding factor is going to be for a juror. This could be the piece of evidence that pushes things over the top. You need a unanimous jury verdict, and maybe there is one person on the fence. And this person - not to use a pun - saw the geofence warrant, and that tipped them one way or another. And that could be the difference between somebody getting a needle in their arm or not. 

Ben Yelin: So whether or not it's justified in this case, depending on the other types of evidence that are obtained, I think it's just concerning that when the stakes are at their absolute highest, we're relying on a pretty pervasive surveillance technique that we haven't really grappled with and that our legal system is still wrestling with. I mean, we don't really have firm rules on the legal standard for collecting geofence data. We've seen somewhat conflicting case law on it. So that's why it's concerning for me. 

Dave Bittner: Yeah. 

Ben Yelin: I just think for those factors. 

Dave Bittner: That's fascinating. All right. Well, we will have a link to that story in the show notes. 

Dave Bittner: My story this week also comes from Forbes. This is from Thomas Brewster, and it's titled, "The FBI Forced a Suspect to Unlock Amazon's Encrypted App Wickr With Their Face." So evidently, there was a gentleman who the FBI - well, it always comes down to child sexual abuse material, doesn't it, Ben? 

Ben Yelin: Right. I mean, so many of the cases in this area of the law have to do with CSAM, just because, for good reason, law enforcement is very aggressive in investigating the use of child pornography. So that's where you get a lot of these types of searches. 

Dave Bittner: Right. So there was a gentleman who the FBI had evidence to believe was involved in this sort of thing - sexual abuse material of young girls. And part of the investigation had to do with folks who were in a Wickr group, which is an encrypted chat group. So the FBI got a search warrant to grab this person's mobile device. And it's worth mentioning that this person had a history with this sort of thing. This was not his first encounter with law enforcement when it came to this type of material. So the FBI grabs this person's mobile device, and the device is unlocked. However, the Wickr app was using face ID to add an extra level of security on that specific app. So the phone is unlocked; the app is not. The FBI then went and got a warrant to compel the suspect to unlock the app using biometric information - his face. And that, according to this article, is new. What do you make of this, Ben? 

Ben Yelin: Yeah. So this could be the case that gets us some sort of final disposition on this compelled decryption question. It's a small chance that this is going to be the case. But a man can dream, right? 

Dave Bittner: (Laughter). 

Ben Yelin: So we've talked about this dichotomy in the legal world where the Fifth Amendment right against self-incrimination, against testifying against yourself, applies only to so-called testimonial evidence. And most courts have held that something like a passcode is testimonial evidence. It is the content of one's own mind. And forcing somebody to type in that passcode would be compelling a person to testify against themselves in violation of the Fifth Amendment. Most courts have held that biometric data is not testimonial evidence. And the analog we see in the non-digital world is always a police lineup. You have no constitutional right to refuse to be in a police lineup because you're not revealing the contents of your own mind. And most courts have held that that applies to face ID or thumbprints - those types of things. 

Dave Bittner: Right. 

Ben Yelin: A couple of courts have disagreed. And we've seen cases in state courts in California and Idaho that have said that biometric data is actually testimonial evidence. If we see enough of these disagreements, maybe the Supreme Court will finally weigh in and give us a definitive decision on what counts as testimonial evidence for the purpose of the Fifth Amendment. The Department of Justice still adheres to a policy that it does not need a warrant to compel somebody to present their face for face ID or their thumbprint to unlock a device. And you can see what the consequences are here. I mean, it really cuts against the advantages of using an encrypted messaging application. You think by communicating over Wickr or another type of application, that the contents of your communications are going to remain private. I mean, that's why people use these robustly encrypted messaging applications. 

Dave Bittner: Right. 

Ben Yelin: But this is sort of an end-around to that encryption. Even if the company itself doesn't hold a key to get access to these communications, even if the government doesn't have a backdoor to access these communications, they now have an alternative method - find the person, get them to unlock their device, and then get them to reveal their face or their thumb or their fingerprints to open the messaging application. 

Dave Bittner: Right. 

Ben Yelin: The upshot of this is we now know some actions that people can take to protect their communications across these encrypted applications. And Mr. Brewster mentions these at the end of the article. Use a good passcode - something like a six-digit code - to unlock your device. And then disable face ID, or any other type of biometric unlock, for these encrypted messaging applications. Most people don't do that. It is nice to not have to type in your passcode to use an application, in the interest of time and convenience. But now that we know that the DOJ and most state and local law enforcement agencies are using this as an investigative technique, I think that might motivate people to change their practices. 

Dave Bittner: Yeah. And so if you were to use a complex passcode rather than face ID, then, as things stand today, the government would not be able to compel you. Or would it just depend on which court you end up in - you roll the dice - or are we pretty settled on that these days? 

Ben Yelin: In terms of compelled decryption for biometric data? 

Dave Bittner: No, no. For a passcode. 

Ben Yelin: Oh, for a passcode. We're pretty settled on that being testimonial evidence. 

Dave Bittner: OK. 

Ben Yelin: There have been a few cases in state courts that have hinted that that might be in question. But for the most part, we've seen that passcodes are considered testimonial data... 

Dave Bittner: I see. 

Ben Yelin: ...Meaning it's protected by the Fifth Amendment. And I think that's pretty clear 'cause it is the content of one's own mind. Now, there's an end-around to that, which is the so-called foregone conclusion doctrine, where if the information was going to be discovered anyway, then a person can't use the Fifth Amendment as a defense for refusing to unlock their device using a passcode. But for now, yeah, it is widely considered constitutionally protected under the Fifth Amendment that you do not have to reveal your passcode to unlock a device. 

Dave Bittner: All right. Well, we will have a link to that story in the show notes. And, of course, we would love to hear from you. If there's a topic you'd like us to cover on the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, it is always a pleasure to welcome Carole Theriault back to the show. And this week, she has interviewed Jen Caltrider. She is from Mozilla's Privacy Not Included initiative. Here's Carole Theriault speaking with Jen Caltrider. 

Carole Theriault: Well, listeners, do I have a treat for you today. We have Jen Caltrider. She's Mozilla's Privacy Not Included head honcho. Thank you for taking the time to be on the show 'cause I can tell from the output on the site, Privacy Not Included, that you guys are busy cats over there. 

Jen Caltrider: Yeah, yeah. There's a lot of privacy problems in the world today. 

Carole Theriault: I couldn't agree more. So maybe we should start at the top. So for those listeners who don't know about Mozilla's Privacy Not Included project, could you give us a quick overview? 

Jen Caltrider: Yeah, sure. So back in 2017 - which seems like "The Land Before Time" these days, but it was only, like, six or seven years ago - a lot of connected devices were starting to become more prevalent in people's homes. You know, people were getting smart speakers and robot vacuums and fitness trackers and everything. And when we looked around at Mozilla - you know, Mozilla really cares about privacy. We're a nonprofit with a mission that focuses, in part, on trying to protect privacy on the internet. And we didn't see anywhere that average consumers could find out, before they bought a product, what the privacy and security concerns of a connected device or a connected app were. So almost on a whim, we kind of said, well, let's try and create a buyer's guide for people to help explain that. You know, without a lot of resources, we kicked it off, and we were just curious if people would even care. You know, there are websites that review products on features and reliability and things like that, but nothing on privacy and security. So we gave it a shot, and we found that people liked it. 

Jen Caltrider: You know, everybody says they want to protect their privacy, but when it comes to what they can do, there - it's a lot harder to know. Since 2017, we have been reviewing the privacy and security of connected devices. We've moved into doing apps as well. We've gone from kind of just trying to state the facts to being a little more opinionated to help people understand, hey, what this company's doing is really bad. And, you know, maybe you should find some other product to use if you care about privacy, to, hey, we have a Best Of list now and we have a Creep-O-Meter, where people can rate how creepy they find products. And we have a Privacy Not Included warning label that, you know, when you land on a product, if it has that, that's us kind of saying, hey, you know, we'd be wary of using this product because your privacy might not be protected. 

Carole Theriault: It's really cool how far and wide your project has gone because you do things for the smart home, you do toys and games, you do entertainment, you do wearables, you do health and exercise, pets, video call apps, dating apps. I mean, you really run the whole gamut. This must be a massive workforce here. 

Jen Caltrider: If only. We're a very small team, actually. We're a team of two. There's two of us - myself and... 

Carole Theriault: Wow. 

Jen Caltrider: ...Misha Rykoff, who's my fellow researcher. And we do all the reviews of the products. And we approach our reviews of the products like a consumer would. We kind of want to tell people, what can consumers find out before they buy a product to know if it's private or secure? Because you don't want to get home with it and then start setting it up and be like, oh, yeah, once you connect this, we're going to collect all your data. And so we approach it like that, and we look at what's available publicly. You know, we read privacy policies and public documentation and news articles about the company. We email the address listed in the privacy policy for privacy-related questions to see, do they get back to us? You know, if they do, do they answer our questions? You know, can we tell if they are using strong encryption to protect your data? Can we tell if they have a way to manage security vulnerabilities? 

Jen Caltrider: And so the two of us - you know, we just spend all our time kind of digging in and looking at that. We do have lawyers that come in at the last minute and review everything to make sure we aren't going to say anything incorrect or that, you know, might get us sued. But for the most part, it's just Misha and I with our heads down, doing the kind of research an average consumer could do if they had eight hours a day to do this and a bit of knowledge. 

Carole Theriault: I am absolutely gobsmacked that there's only two of you doing all this work. That is a testament to your skill and passion, let me tell you. Now, in your research, do you often find a disconnect between what is being said on the website and what is said inside the privacy agreement, for example? 

Jen Caltrider: You mean how companies say they protect your privacy and then what they actually do? 

Carole Theriault: Yep. 

Jen Caltrider: Yeah. 

Carole Theriault: Exactly. 

Jen Caltrider: Indeed. I can't tell you how many privacy policies I read that crow at the top of their privacy policy, we will never share or sell your data without your consent. We care about your privacy. I mean, every company says that, right? And then you keep, like, reading and digging into their track record and what data they collect and how they share that data. And you're like, holy cow. Like, you know, everybody says they care about privacy, but there's a lot of, like, show-don't-tell here. 

Jen Caltrider: And too many companies just collect as much personal information as they possibly can 'cause that's very valuable to them. They use it for targeted advertising and personalization, sharing it with business affiliates and selling it in some cases. And they take the data they have on you, and they go out to other third parties, like social media sites or data brokers or public sources, and they collect even more data about you. Because the more they know about you, the more they can keep you addicted to the app or target you to buy more products - to sell you more things. And so it's - you know, it's really hard to trust these companies these days when it comes to privacy, and it's sad. I'm a little jaded. 

Carole Theriault: I don't blame you one bit. I just am super glad you do what you do. Now, little problem - we've almost run out of time, and we haven't touched upon your research into mental health apps, which I think is fascinating. So I'm going to invite you on next time so we can discuss this. And in the meantime, listeners, go check out Privacy Not Included. Go see the devices that you have in your house and see how they stack up against others. And if your device isn't listed, you can actually fill in a request so that it gets reviewed. Isn't that right, Jen? 

Jen Caltrider: Yeah, we have a form there where you can submit requests for reviews. We obviously can't review everything. We wish we could. And so we try and focus on what we know people will like. So please let us know what you're interested in. And because we can't review everything, you know, even just reading a couple of reviews of similar things will give you some ideas of what to look for, what questions to ask. You know, it's just - hopefully, we're helping people understand a little bit more of the concerns they should have and how they can approach it so they can just shop a little bit smarter. 

Carole Theriault: Couldn't agree more. Now, maybe you can tell us about your latest research. What were you looking into and what did you find? 

Jen Caltrider: We just got done reviewing mental health apps, and they were a particularly creepy space for us. They were, like, probably the creepiest things I've ever reviewed, these mental health apps. In part because they collect such a huge amount of personal information on you. You know, what's your mood? How often are you seeing a therapist? What your OCD triggers are, your eating disorder triggers, what symptoms you're having, what meds you're taking - all this really personal data that they collect. 

Jen Caltrider: Also, you know, with the mental health crisis that's exploded over the past year, there's hundreds of millions of dollars kind of flowing into this space. And so the companies are growing really quickly. And they care about making money right now more than they do about protecting privacy, even though they say they care about privacy. And so in our experience, just reviewing these mental health apps over these past couple of months, you know, we read the privacy policies. And of the 31 companies that we emailed asking our questions - at the email address listed in their privacy policy - after a month and a half, only three companies had actually responded to us. 

Carole Theriault: Wow. Are you kidding me? 

Jen Caltrider: Yeah, they just didn't respond to our questions. And then one of the companies we followed up with wasn't really happy, because, you know, we called them out for some questionable privacy practices. And they're like, well, why didn't you reach out to us before you launched? And we were like, we did, three times, at the email address in your privacy policy. And their response was, oh, well, the person that monitors that left in March and we haven't replaced them. And I'm like, well, that's not showing that you care about privacy. 

Jen Caltrider: Another company was unhappy with us, and they wrote a post defending themselves. And in the post they said, oh, Mozilla got everything wrong because they tried to infer our business practices from our privacy policy. And I had to laugh because I'm like, well, how else is a person supposed to infer your business practices around privacy other than from your stated business practices around privacy? 

Carole Theriault: Yeah. 

Jen Caltrider: You know, it feels like, to me, that it's almost a game for these companies to write these privacy policies that too often are vaguely worded and have wiggle room. There's, like, five privacy policies - there's a privacy policy and a privacy notice and an addendum for the EU and an addendum for California. And they don't make it easy for consumers to actually stop and understand them. 

Jen Caltrider: And one of the things that really got me when I was reading all this was going back to that, we'll never share or sell your data without consent. Well, what does consent look like, right? Like, downloading and registering to use the app, I think, too often might count as consent, which most people are like, no, that's not consent. I just want to use your app. You better ask me first before you give my personal data to Facebook. But, you know, based on my reads of privacy policies, a lot of the time, consent is as simple as, you downloaded the app, you registered, you've given us your consent to do this. And then, you know, you read, well, how do I withdraw consent? And a lot of times it's like, in order to withdraw consent, you must delete the app. 

Carole Theriault: Yeah. 

Jen Caltrider: You're like, that's not great. Even if consumers do go in and read privacy policies - which, please do. You know, I'm a nerd, and I know they suck. And it's hard to read them, but it's also a good exercise. But there are just so many questions that you walk away with where things aren't clear. Maybe even if you read it, it sounds good at the top. But then you actually get down into the depths of it, and you're like, hold on. I saw a privacy policy that said, we'll never share your data for advertising purposes. And then down below it said, here's the advertisers we share your data with. And I'm like, wait a minute, you know? So it's tricky. It's really tricky. 

Carole Theriault: Do you think the value that we're putting on so-called big data is what's causing this problem? 

Jen Caltrider: Oh, yeah, absolutely. I mean, you know, back in 2017 when we started, your fitness tracker tracked your steps and maybe your heart rate, and your smart speaker would listen for, you know, a couple of commands. But now we have apps that are collecting our conversations with therapists. We have fitness trackers that can tell your emotional state and whether you're drunk or not. So the amount of data that can be collected now is enormous - and, like, well, this is the data economy. If we're going to make money, this is how it's done. But, you know, there's just so much personal information that's out there now, so much more, that people are going to, I think, rather quickly go from, nothing bad's happening - I'm not feeling any repercussions from all this data sharing - to, holy crap, how do they know this about me? And now what do I do? How do I take that back? 

Jen Caltrider: And so, you know, I feel like in any kind of justice movement, there's a tipping point. And we're not quite at that tipping point with privacy, but I feel like it's getting awfully close when your car can know everywhere you've been, and your mental health apps can know your emotional state, and Facebook is developing algorithms to know as much personal information about you as it can so that, you know, people all over the world can target you with ideologies. It's getting really scary. And, you know, now's the time to kind of make a difference, I feel like. 

Carole Theriault: Sometimes we have to end on a more sobering note but an important one. We've just been talking with Jen Caltrider. She is the lead at Privacy Not Included, a Mozilla Foundation project. 

Dave Bittner: All right, Ben. Interesting stuff, huh? 

Ben Yelin: Yeah. This is a really cool tool - and good for Mozilla for introducing it - that really puts some power back into the consumers' hands. They can do their own research, which is a very dangerous phrase, but I think useful in this context... 

Dave Bittner: (Laughter) That's right. 

Ben Yelin: ...To figure out how private your data is on a particular application or device. And the database is constantly growing. It's going to expand as new products come on the market. And I think it's an example of, if our policymakers aren't going to solve our problems for us, we have to be proactive in solving these problems for ourselves... 

Dave Bittner: Yeah. 

Ben Yelin: ...So doing more research on exactly how private our data is in individual circumstances. And I'm glad Mozilla has provided a tool to do that. 

Dave Bittner: Yeah, absolutely. All right. Well, again, our thanks to Carole Theriault for bringing us this interesting story. And, of course, thanks to Jen Caltrider from Mozilla for joining us as well. We do appreciate both of them taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.