Exploring the cultural values of personal privacy.
Stuart Thompson: But it's too late. Like, you can't get your data back. You don't know where it's gone. You don't know who has access to it. You can't delete it. You can't request it. There's no method for you to find out anything about what's known about you.
Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner and joining me is my co-host Ben Yelin from the University of Maryland's Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hi, Dave.
Dave Bittner: On this week's show, I have a story about our own state of Maryland trying to crack down on ransomware, Ben shares a New York Times story about facial recognition software and, later in the show, my conversation with Stuart Thompson, also from The New York Times. We're going to be discussing his recent article "Twelve Million Phones, One Dataset, Zero Privacy." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll be back after a word from our sponsors.
Dave Bittner: And now a few words from our sponsors at KnowBe4. You know compliance isn't the same thing as security, right? How many times have we all heard that? And it's true, too. Having checked the legal and regulatory boxes won't necessarily keep the bad actors out. They're out-of-the-checkbox kinds of thinkers. But what about compliance itself? You've heard of legal exposure and regulatory risk, and trust us, friend, they're not pretty. So again, what about compliance? We'll hear more on this from KnowBe4 later in the show. It's not a trick question, either.
Dave Bittner: And we are back. Ben, before we jump into this week's stories, we got some feedback from a listener - wrote me over on Twitter. His name is Tom (ph). And he said, Dave, fan of your podcasts. Question on today's "Caveat" podcast. It sounded like the caller question was regarding revenge porn. You both said that there's nothing that can be done. I'm not sure that's entirely accurate anymore. I believe a number of states have passed anti-revenge porn laws. Is it possible to have that confirmed by Ben and amended if true? Also, possibly make mention of the different groups, such as Badass Army that work to help victims of it. Keep up the great work across all these shows. Enjoy them all - from Tom.
Dave Bittner: So, yes. Ben, you and I had a conversation. We were talking about revenge porn and this notion of whether or not someone steals photos or shares photos that were shared with them. But there is some nuance. Can you clarify here?
Ben Yelin: So first of all, thank you for writing us, as always. Everybody is welcome to do that on your social media platform of choice. Tom is correct. There are actually 41 states that have some version of revenge porn statutes on the books. The statutes vary in terms of the punishment. So some of them prescribe injunctions, where the person who posts the revenge porn is forced to take the images down. Formal warnings, takedown notices, other infringement notices - that varies across states as well. One thing that's notable, and I think it gets to this listener's question, is that revenge porn has a very specific definition. And not to, you know, get too X-rated on our G-rated podcast...
Dave Bittner: (Laughter) Right.
Ben Yelin: But revenge porn - the way it's defined in most states has to do with depicting sexual acts.
Dave Bittner: Explicit stuff.
Ben Yelin: Explicit stuff.
Dave Bittner: Yeah. Yeah.
Ben Yelin: Yeah. Well, let's just leave it at that. Which means that posting, you know, racy photos of your ex-girlfriend or ex-boyfriend, while ethically questionable to say the least, wouldn't count as revenge porn for the purposes of these statutes, even though, obviously, that image could be used for blackmail, for some of the same purposes that people use revenge porn for. I think part of it is lawmakers have to draw the line somewhere, and the line they've drawn is around explicit acts.
Dave Bittner: So for example, if someone posted a picture of a woman in a bikini - not revenge porn.
Ben Yelin: In most states, it is not.
Dave Bittner: What if she were nude?
Ben Yelin: Then in most states, that would be revenge porn.
Dave Bittner: OK.
Ben Yelin: Because they talk about showing private body parts...
Dave Bittner: I see.
Ben Yelin: ...As another part of what defines a typical revenge porn statute.
Dave Bittner: OK.
Ben Yelin: You know, I think, like, if you wanted to look for guidance, most websites' - social media websites' - terms of service would probably be a good place to start. You know, you can post bikini photos on Facebook; you can't post naked pictures. And I think that's the general reasoning behind these statutes.
Dave Bittner: Yeah.
Ben Yelin: There's also a lot of variety in terms of what we would call the mens rea, or the criminal intent, of the person posting revenge porn. So states often look at whether a person actually intended to harm the reputation of the person they were depicting. Some states don't take that into consideration; the simple act of posting the image itself violates the statute.
Dave Bittner: Ah, that's interesting.
Ben Yelin: And then, obviously, consent is a huge element of it.
Dave Bittner: Right. Right.
Ben Yelin: So most states have elements where if there has been explicit or implied consent by the person whose images are being posted, that does not count as a violation of the statute.
Dave Bittner: And also, I think, as Tom points out, there are organizations - one that I know of that he mentions here calls themselves the Badass Army. They are on Twitter if you search for Badass Army. And they help people who've been victims of this sort of thing - you know, guide them through what their options are, provide support. From everything I've seen, it's a good organization out there doing good work to try to help people who find themselves victims of this sort of thing.
Ben Yelin: Yeah. You know, whether it violates the law or not, when you're a victim of any type of online harassment like this - and that's what this is; this is harassment...
Dave Bittner: Yeah.
Ben Yelin: ...It's good to know that there are resources out there.
Dave Bittner: Yeah. All right. Well, Tom, thank you for sending in your kind note and for asking us for some clarification. Always good to do that.
Dave Bittner: Let's move on to our stories. I'll kick things off for us this week. My story comes from Delmarva Now, which is a publication that covers the Maryland-Delaware-Virginia area. And this is about Maryland legislators who are introducing a ransomware bill that's going to tighten the rules when it comes to ransomware, increase some penalties, just sort of try to bring this into focus. This was written by Wesley Brown from Capital News Service. So we've got some Maryland legislators here - Senator Susan Lee, a Democrat from Montgomery County, sponsored the bill. And it sounds to me like what they're doing is just trying to put the word out to criminals that even just possession of ransomware would be a crime in the state of Maryland.
Ben Yelin: So this would be a radical change in the law around ransomware in Maryland. The way the current statute works, it's only illegal to actually use ransomware. This statute, if enacted, would criminalize the simple possession of ransomware, unless the person possessed it for research purposes. So if you were in an academic setting and you were trying to study different types of ransomware, that would not be a violation of the statute.
Ben Yelin: But as we do in other areas of the law, oftentimes we criminalize the possession of something because the threat that the person will use it is particularly dangerous. That's why we prohibit, in many instances, the possession of certain types of firearms, even if a person never uses the firearm. So we're criminalizing somebody for simply possessing something that can be very dangerous. And Maryland has experience, obviously, in dealing with the dangers of ransomware.
Dave Bittner: Right.
Ben Yelin: Most famously, Baltimore City was the victim of a ransomware attack back in May of last year - crippled city systems for several months. Took me a while to get my water bills.
Dave Bittner: (Laughter).
Ben Yelin: It was not fun to get a bill three months later for $300. So obviously, this is a very serious problem.
Dave Bittner: Yeah.
Ben Yelin: And the key with this piece of legislation - and I think this is something that Senator Lee would agree with - is that it's a deterrent. You want to make it as undesirable as possible, so that people who are even considering possessing ransomware realize that their actions would violate this law and that they would be subject to pretty significant penalties. I think the statute calls for something like a $10,000 fine or five years in prison as a maximum penalty. So it's a misdemeanor, but it's a serious misdemeanor.
Dave Bittner: Yeah.
Ben Yelin: So we are creating this deterrent effect.
Dave Bittner: A couple of things noteworthy here - one, just a shoutout to one of your colleagues and certainly friend of the CyberWire, Markus Rauschecker. He testified on this bill.
Ben Yelin: He sure did. Yeah. He is a friend of the show, also a fellow employee at the Center for Health and Homeland Security. Markus testified on this last year. The bill was not enacted in the 2019 legislative session. It was all Markus's fault.
Dave Bittner: (Laughter).
Ben Yelin: Just kidding, Markus. So we're back here in 2020. They have made some changes to the bill to make it more palatable. And from what I heard from Markus, the hearing went very well. And there seems to be broad and increasing support for this type of legislation, largely because not only Baltimore City but also the smaller city of Salisbury, Md., suffered a ransomware attack. So there's just more of an awareness of how much of a danger this is to our localities. And this is a proactive measure to create a deterrent.
Ben Yelin: Another thing worth noting - I didn't know if you were going to mention this - is that a couple of other states - and this is what Markus mentioned in his testimony - Michigan and Wyoming have experimented with this approach. The approach is so new that we don't actually have any quantitative research on whether this is an effective deterrent to cyberattacks. That remains to be seen.
Dave Bittner: Well, let me play devil's advocate on that. A Maryland law would affect citizens of Maryland, right? Do we think that the people who attacked the city of Baltimore were Marylanders? I don't.
Ben Yelin: Probably not.
Dave Bittner: What I'm getting at is, how much of this ransomware is even coming from within the United States?
Ben Yelin: So I agree with your devil's advocate take. That's what lawyers do.
Dave Bittner: (Laughter).
Ben Yelin: Lawyers play devil's advocate. A couple of things, though - that sort of principle is applicable to a lot of different state laws.
Dave Bittner: OK.
Ben Yelin: You could say, why pass a state law on anything? Somebody from Virginia could come in and, you know, engage in this illegal behavior and then go beyond our jurisdiction. We wouldn't be able to catch them.
Dave Bittner: OK.
Ben Yelin: So it's still a deterrent factor just because the state of Maryland is making a statement that this type of behavior is criminal in nature. So sometimes just simply stating that the possession of something is criminal itself acts as a deterrent.
Dave Bittner: That ne'er-do-well who's thinking about - their bank account is almost empty and weighing their options and saying, you know, I have some computer skills; maybe I'll just spray out some ransomware here. They may see this and think twice.
Ben Yelin: Absolutely. Yeah. I mean, the deterrent effect, I think, is something that, you know, we use for all criminal statutes, even in instances where the perpetrators are beyond our jurisdiction. I think the state of Maryland is just doing what it can in the absence of a federal law that criminalizes ransomware, in the absence of some sort of, you know, international law. We can only control what happens within our own borders. And so this is a start. And it goes beyond, you know, the tangible effects of the statute - I don't think we have an estimate of how many people per year would be prosecuted. It's about the intangible, which is sending a message that possession of ransomware, even if it is not used, is in and of itself both wrong and potentially dangerous.
Dave Bittner: So stay in school, kids.
Ben Yelin: Stay in school, kids.
Dave Bittner: (Laughter).
Ben Yelin: There are better ways to make money.
Dave Bittner: Right. Right.
Ben Yelin: Betting on sports - no, just kidding.
Dave Bittner: (Laughter) All right.
Ben Yelin: Don't possess your ransomware.
Dave Bittner: Yes, absolutely. All right. Well, that is my story. Ben, what do you have for us this week?
Ben Yelin: So my story comes from The New York Times. It got a lot of play over the weekend - written by Kashmir Hill. It's entitled "The Secretive Company That Might End Privacy as We Know It." It's about a company called Clearview AI, started by a gentleman named Hoan Ton-That, a 31-year-old Australian. The way the application he created works is that you can take a photo of anybody on the street. And using the technology's facial recognition software, it can match that person's photo to publicly available information. So if that person has posted social media pictures, or if that person is featured in a YouTube video - if their photo is anywhere on the internet - it can be matched to the photo taken on the street.
Ben Yelin: And what was previously unknown about this relatively small startup company is that 600 law enforcement agencies across the United States have started to make use of this technology, have used Clearview AI to help solve crimes. As you can guess, it's an incredibly effective crime-fighting tool.
Dave Bittner: Right.
Ben Yelin: They mentioned an instance - I believe it was in South Carolina - where two individuals got into a fight in a park. A bystander took a video of the fight. The police were trying to find the perpetrators. They used Clearview AI, matched the person's face with publicly available information, got that person's name and were able to effectuate an arrest. So that side of it's very good.
Ben Yelin: The other side of it, of course, is frankly very disturbing. We've talked a lot about technologies that are invasive of privacy, that would create a perpetual surveillance state. And tell me if you disagree - I don't think we've ever come across a technology potentially as invasive as this, largely because it's facial recognition. It means the thing that's most personal to us, our face, can be photographed in public, be matched up to publicly available information online and be used, potentially, by law enforcement.
Dave Bittner: Right. So for years, for decades, we have been OK with our image being captured when we're out and about in public.
Ben Yelin: Right.
Dave Bittner: Surveillance cameras, which I think we all agree, for protecting retail places, for security...
Ben Yelin: Sure. Security - yeah.
Dave Bittner: All reasonable. And we're, I think, overall mostly comfortable with that. Taking that to the next level, where, as we are out and about, something is gathering up a log of where we are - I'm just imagining, you know, walking into my local fast-food joint and walking up to the counter and having the person behind the counter say, oh, welcome back, Mr. Bittner. And quite (laughter)...
Ben Yelin: And they actually mentioned that in this article. They say, you know, when these guys came up with this startup, they were brainstorming ways to use it. And one of their suggestions was, well, why don't we give it to hotels? They can use security footage of a person walking into the hotel and then make them feel very at home and welcome. When they get to the front desk, you'd say, hello, Mr. Bittner.
Dave Bittner: Yeah.
Ben Yelin: Good to see you again. And...
Dave Bittner: Home - at home and welcome is not the feeling that I'm going with here, but all right.
Ben Yelin: There was - I was sort of following the reaction to this article on social media just kind of as an active Twitter user myself. And more so than any other surveillance articles that I've seen recently - even The New York Times expose that we talked about a couple of weeks ago and that we're going to talk about in our interview segment today - there was just this sort of depressed, angry reaction to the fact that this technology exists, the fact that it's used by law enforcement agencies across the country. And there was sort of a tone of resignation among the founders of Clearview AI. Mr. Ton-That basically said social media sites themselves are scraping users' images. You know, Facebook does it all the time.
Dave Bittner: Right.
Ben Yelin: It's not us that invented the use of facial recognition software. We are just sort of augmenting this tool. And then there was this other kind of Orwellian quote. So one of the investors - one of the early investors in the startup is a guy by the name of David Scalzo - founded Kirenaga Partners. And he was interviewed as part of this article. He dismissed concerns about Clearview making the internet searchable by face, which was sort of the logical conclusion of where this would all be going if Clearview AI expanded. And he said that he's come to the conclusion that, quote, "because the information constantly increases, there's never going to be privacy. Laws have to determine what's legal, but you can't ban technology. Sure, this might lead to a dystopian future or something, but you can't ban it."
Ben Yelin: I mean, that's sort of a really eye-opening and frankly shocking statement of resignation, in my view. And, you know, it's sort of up to the general public whether they want that viewpoint to be the viewpoint of our policymakers. And in the absence of robust federal data privacy legislation, Mr. Scalzo's view is the prevailing view - it's what we've decided to accept as a society. There is no privacy out there. We can have laws here and there that protect personal privacy and identifiable information. But technology is limitless. And whether the technology in question sounds dystopian or not, it doesn't make sense to ban it. And so I think as a society, we're going to have to kind of reckon with that viewpoint.
Dave Bittner: All right. So a couple things I want to ask you about - first of all, there is no expectation of privacy when you're out and about in public...
Ben Yelin: Right.
Dave Bittner: ...Correct?
Ben Yelin: That is correct. Yep.
Dave Bittner: So we know that we're overall probably pretty comfortable with that.
Ben Yelin: Yes.
Dave Bittner: So I'm thinking about the earlier example you were talking about of the two gentlemen who got into a tussle in...
Ben Yelin: It was Indiana, by the way. I got the state wrong.
Dave Bittner: All right.
Ben Yelin: My apologies to the great Palmetto State.
Dave Bittner: Indiana wants me. Lord, I can't go back there. Yes.
Ben Yelin: Exactly.
Dave Bittner: So two gentlemen get in a tussle in the park. They use the video to find them. How is this not a faster version of a police officer canvassing door to door with a description saying, you know, it was a guy with bushy eyebrows, a mustache and a sports jersey. Do you recognize this guy? This is just that, but faster.
Ben Yelin: Yeah. I mean, I think the whole issue is that it's faster, and it's instantaneous.
Dave Bittner: Right.
Ben Yelin: So something that used to take hours of police work, you know, a lot of monetary resources, human capital, staff resources, now can take place with the click of an iPhone camera.
Dave Bittner: Casually.
Ben Yelin: Right.
Dave Bittner: Without - with no...
Ben Yelin: Which means that it can...
Dave Bittner: Yeah - no downside.
Ben Yelin: Yeah. And it's so easily done and replicated. I think sometimes the only thing protecting our personal privacy in the past was the fact that it took a lot of work to conduct this type of surveillance.
Dave Bittner: Yeah.
Ben Yelin: When it doesn't take a lot of work and it's cheap and it doesn't cost a lot of money or resources, then that type of surveillance is going to be in mass use.
Dave Bittner: Yeah.
Ben Yelin: So the fact that it is easier to identify people using this technology in and of itself is the reason why it's so dangerous for personal privacy, in my view.
Dave Bittner: Yeah. You know, when I've talked to law enforcement about these sorts of things - I remember specifically having a conversation with a chief of police about license plate scanners. And his point was, well, don't you want that criminal to be tagged? Don't you want - I'm trying to make your family safer. If that criminal drives by one of our scanners and we see there's an outstanding warrant for that person, we're going to go get that bad guy. And I'm thinking with this sort of thing, don't you want to know when that sexual predator gets too close to that elementary school? Don't you want that information? That's the argument from law enforcement, and I think they would say it's compelling.
Ben Yelin: It is compelling. I absolutely agree with it. One thing they mentioned in this article - and this was actually gleaned from Clearview's sales presentation - is that the app has helped identify some really bad people. Somebody accused of sexually abusing a child - that person appeared in the mirror of somebody else's gym photo, which is almost a stroke of luck.
Dave Bittner: (Laughter) It's, like, out of a TV show.
Ben Yelin: It is. Yeah.
Dave Bittner: Zoom in. Enhance.
Ben Yelin: Exactly.
Dave Bittner: (Laughter).
Ben Yelin: Yeah. It's a "Law & Order" episode...
Dave Bittner: Right (laughter).
Ben Yelin: ...Waiting to happen. A person behind a string of mailbox thefts in Atlanta - that one - do we need Orwellian technology for mailbox theft? I don't know.
Dave Bittner: (Laughter).
Ben Yelin: But for something more serious - you know, a John Doe found dead on an Alabama sidewalk. Absolutely, from law enforcement's perspective, any tool they can have to help solve and prosecute crimes is useful not only for law enforcement but for the rest of us.
Dave Bittner: OK. So in your view, what's the balance we could strike with this?
Ben Yelin: I mean, that's the billion-dollar question. I think - and I know we've talked about this before - the legal protections and personal privacy protections need to be made robust in order to catch up with the technology. You have to try to keep things in equilibrium, because we can capture billions of people's faces just by strangers taking iPhone pictures on the street. The laws have to protect personal privacy in a way that's just as robust. And whether that's Congress passing, you know, a law like the CCPA - the California Consumer Privacy Act - that protects personal privacy, or it's the courts stepping in and saying something like the Fourth Amendment, which protects us against unreasonable searches and seizures, has to be extended to cover this type of technology that previously would have required some sort of invasive search or seizure of a person's property or private life. So the laws have to be adjusted to keep up with these rapid changes in technology. Now, that doesn't happen, for a variety of reasons. Legal bodies are not startups, in case you haven't noticed.
Dave Bittner: (Laughter) They tend to move at a different pace.
Ben Yelin: At a snail's pace. Yeah.
Dave Bittner: Yes. Yes.
Ben Yelin: Sort of a snail versus the cheetah thing here. So that's one part of the problem. And, you know, from a policymaker's perspective, law enforcement is in their ear, too, saying, yeah, we know that this technology potentially does sound Orwellian, but here are all the bad people we've locked up because of what Clearview AI has done.
Dave Bittner: Right.
Ben Yelin: And if you can get in a lawmaker's ear on that, it's almost hard to turn this down as a potential resource. I think I was struck by the outrage I saw in my social media feeds by people who otherwise just aren't outraged about this type of thing.
Dave Bittner: Yeah.
Ben Yelin: So maybe by taking it a step too far, you know, it will raise that public consciousness so that Congress or the courts step in and say, we have to extend privacy protections to match the scale of the technology that exists.
Dave Bittner: Well, and how important that organizations like The New York Times are out there - are bringing these sorts of things to light.
Ben Yelin: Yes. I subscribe mostly for the crossword puzzles.
Dave Bittner: (Laughter).
Ben Yelin: But yeah, I mean, they've done some very, very important work in this area. This was investigative work in and of itself.
Dave Bittner: Yeah.
Ben Yelin: The reporter was trying to track down the founder of Clearview AI for several months. The founder was getting very good at evading questions and had set up, I believe, a fake LinkedIn profile for himself. He listed himself on LinkedIn as somebody named John Good, which is, you know...
Dave Bittner: Little on the nose (laughter).
Ben Yelin: ... A half-step better than John Doe.
Dave Bittner: Right. Right.
Ben Yelin: But I think, you know, eventually, they agreed to speak with the author of this article just 'cause I think...
Dave Bittner: They - yeah.
Ben Yelin: ...Having an article posted without their input would be worse than...
Dave Bittner: Right.
Ben Yelin: ...One with their input.
Dave Bittner: All right. Well, certainly, another interesting piece of work from The New York Times. We'll have a link to that in the show notes. It is time to move on to our Listener on the Line.
(SOUNDBITE OF DIALING PHONE)
Dave Bittner: We had a caller call in this week with an interesting, somewhat specific question. Here is this week's Listener on the Line.
Unidentified Person: Hello. I'm calling from Virginia, and my job is in the field of digital forensics. I wanted to know, what are the legal requirements for obtaining a private investigator's license when performing digital forensics? Thanks.
Dave Bittner: All right. Ben, I know you probably have the answer to this right off the top of your head, right?
Ben Yelin: Yes. No, that is absolutely not true.
Dave Bittner: (Laughter).
Ben Yelin: It's a great question.
Dave Bittner: Yeah.
Ben Yelin: I did a little research in response to that question. I know this is probably the least satisfying legal answer people ever get, but it really does depend on the state. So some states, like Texas, do require a private investigator's license for most types of computer forensics specialists. Other states, like California, generally do not require the license because the nature of the work by computer forensics experts is so fundamentally different.
Ben Yelin: Computer forensics examiners are not engaged in the type of activity that falls within the purview of the definition of private investigators. You know, sometimes they're like professional engineers conducting experiments - they may be doing things like chemical testing. And this isn't just true for forensic technology; it applies to all different types of scientists. That work may not be connected to any particular investigation, and we don't want to dissuade those types of experts from either testifying in court or continuing their forensic research. And so that was the basis of the California law.
Ben Yelin: I would note that that California law seems to descend from a state court decision. I was reading an article on this. The state department that has jurisdiction over this issue didn't quite know how to answer the question you presented, but ultimately, based on California court precedent, it said that the work done by computer forensics experts falls outside the jurisdiction of the Private Investigator Act.
Ben Yelin: So I would recommend that you look up your own state statute on what's required of private investigators and whether a forensics expert has to get one of those private investigator's licenses. There are a lot of resources out there that have done surveys of the scope of private investigator licensing laws across all 50 states. So I would encourage you to check those out.
Dave Bittner: All right. Interesting stuff. And thank you for sending in that question. We would love to hear from you. If you have a question for us, you can call and leave a message. Our number is 410-618-3720. That's 410-618-3720. You can also email us an audio file. That would be at caveat@thecyberwire.com. Send in your audio file, and perhaps we will use it on the air.
Dave Bittner: Coming up next, we've got my interview with Stuart Thompson from The New York Times. He's here to talk about his article "Twelve Million Phones, One Dataset, Zero Privacy" - that bombshell article, Ben, that you and I have discussed previously. Really interesting conversation, so stick around for that.
Dave Bittner: But first, a word from our sponsors. And now back to that question we asked earlier about compliance. You know, compliance isn't security, but complying does bring a security all its own. Consider this - we've all heard of GDPR, whether we're in Europe or not. We all know HIPAA, especially if we're involved in health care. Federal contractors know about FedRAMP. And what are they up to in California with the Consumer Privacy Act? You may not be interested in Sacramento, but Sacramento is interested in you. It's a lot to keep track of, no matter how small or how large your organization is, and if you run afoul of the wrong requirement, well, it's not pretty. Regulatory risk can be like being gobbled to pieces by wolves or nibbled to death by ducks - neither is a good way to go. KnowBe4's KCM platform has a compliance module that addresses, in a nicely automated way, the many requirements every organization has to address. And KCM enables you to do it at half the cost in half the time. So don't throw yourself to the wolves, and don't be nibbled to death by ducks.
Dave Bittner: And we are back. Ben, I recently spoke with Stuart Thompson. He is one of the co-authors of The New York Times article "Twelve Million Phones, One Dataset, Zero Privacy" - a bit of a bombshell article you and I spent some time discussing recently. Here is my conversation with Stuart Thompson.
Stuart Thompson: A source came to us after reading some of the Privacy Project work that we've been doing this year - basically shining a spotlight on privacy and technology again. You know, that's been a topic for a while, but we've come back to it. And, you know, they had access to this data and were concerned with what they saw and wanted someone to shed light on it and really give a sense of the volume and scale of the collection that's going on. And this is, you know, mobile phone locations - tracking where you go and what you do in your life. And it's not really treated as a serious or concerning thing in the industry, and the sources were concerned and wanted somebody to argue for change, and that's what we tried to do.
Dave Bittner: What can you tell us about the dataset itself? I know your source wanted to remain anonymous. So, of course, we'll respect that. But the dataset itself, how unusual is it among the people who trade in this sort of thing?
Stuart Thompson: It's pretty common. It's a huge industry. We're talking about millions and millions of dollars, lots of investment going on. And, you know, we counted probably around 70 companies trading in this data in some capacity or working around it. There's probably more than that. You know, it's probably not too unfair to say it's a shadowy business. They're not very public-facing. They're not household names, although some of the household names do participate in location tracking, like, you know, Facebook and Google. But there's a lot of companies you've never heard of, like Teemo and Fidzup and Cuebiq and Factual - these kinds of companies that, you know, are not household names.
Stuart Thompson: And they've been collecting this data for, you know, a decade. Since the App Store pretty much launched and allowed companies to get access to GPS data on your phone, they've been collecting it. And one of the more popular companies that people probably know is Foursquare. If you were like me in the early App Store days, you might have been checking into a business and trying to be the mayor of your local Starbucks or something. Little did we know at the time that while that has some utility as sort of a social network kind of thing, what's actually way more valuable is turning that data into business insights, and that's what's been going on for a decade.
Stuart Thompson: But we've never really been able to see what that looks like. You know, we kind of have an idea of tracking, and we kind of think that we understand what tracking is like for targeted advertising. But to actually see, like, a huge map of the nation's capital with so many dots that it completely fills the screen is a different thing entirely.
Dave Bittner: Can you walk us through how you approached this dataset? I mean, how did you get started? How did you decide where you were going to begin?
Stuart Thompson: Yeah, the data is actually really simple, which is maybe another reason it can circulate around so easily. It's just - you know, think of it like an Excel spreadsheet, if you've ever used that, or, you know, a simple table. It's got latitude, longitude, date, time, a user ID and the duration of their stay in one place in seconds. And, you know, with just a couple of simple tools, you can start mapping that out. So we built a couple of different dashboards and things that let us analyze and filter it. And basically, what we were trying to do is assess risk. We more or less came up with a list of things, like, what might be risky in here? Unlike previous releases of data, including the stuff that The Times has reported on, which is sort of the foundation for a lot of our reporting, this data included Washington, and that raised some national security questions, like, is the Pentagon in there? Are the nation's spies being tracked? Are congressmen being tracked? Are government buildings excluded from this or not?
Stuart Thompson: And we basically found, no matter where we zoomed in or how we filtered, there were data points somewhere. So we just started looking - you know, we zoomed into the Pentagon, and we thought, OK, maybe there won't be any dots here because it's a secure facility, but that wasn't the case at all. There were hundreds and hundreds of phones being tracked. The one exception is maybe the CIA building, which had no pings inside of it, but it didn't really matter because there were pings in the parking lot, and those led back to people's phones and sort of gave clues to the IDs, so that's what we used instead.
Dave Bittner: Help me understand here. Once you have established who an ID is associated with, I mean, is that really the key to being able to track an individual?
Stuart Thompson: Yes. So the data includes an ID, and if you look at any individual point, that's not that useful. Like, a point in a house, you know, might give you something, but if you can't connect that to a bunch of other points, then it doesn't give you very much. But it's useful for marketers to know where individual people go, so they attach an ID to it. And if you pull out all the points associated with one ID, you basically get a bunch of points around the city, and it catalogs where that person goes.
Stuart Thompson: And the most common cluster of location pings was usually their house, because their phone is just sitting at their house most of the time - at night or when they're home eating dinner - and it's just pinging their location. There's a huge cluster right at their house. And the second most common cluster is at their workplace. It's also where they spend a lot of time. They'll be in meetings and having lunch in the break room, whatever, and their phone is also pinging a lot there. So those two points are actually super revealing of identity.
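To make the analysis Stuart describes concrete, here is a minimal Python sketch of that home/work clustering, run over made-up pings shaped like the schema he outlines (latitude, longitude, timestamp, user ID, dwell time). The sample coordinates, the grid rounding and the night/day cutoffs are illustrative assumptions, not details from the Times analysis.

from collections import Counter
from datetime import datetime

# Hypothetical rows shaped like the schema described in the interview:
# (latitude, longitude, ISO timestamp, user ID, dwell time in seconds).
pings = [
    (38.9051, -77.0363, "2019-10-01T02:14:00", "abc123", 28800),  # overnight
    (38.9052, -77.0362, "2019-10-02T01:40:00", "abc123", 25200),  # overnight
    (38.8977, -77.0363, "2019-10-01T14:05:00", "abc123", 3600),   # midday
    (38.8976, -77.0364, "2019-10-02T15:20:00", "abc123", 5400),   # midday
]

def grid_cell(lat, lon, precision=3):
    """Round coordinates to a coarse grid cell (roughly 100 m at 3 decimals)."""
    return (round(lat, precision), round(lon, precision))

def top_clusters(rows, user_id):
    """Guess home and work for one ID: the cell with the most overnight
    pings is 'home'; the cell with the most working-hours pings is 'work'."""
    night, day = Counter(), Counter()
    for lat, lon, ts, uid, _dwell in rows:
        if uid != user_id:
            continue
        hour = datetime.fromisoformat(ts).hour
        cell = grid_cell(lat, lon)
        if hour >= 22 or hour < 6:
            night[cell] += 1
        elif 9 <= hour < 17:
            day[cell] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

home, work = top_clusters(pings, "abc123")
print("likely home cell:", home)
print("likely work cell:", work)

The design mirrors his observation: there are no names anywhere in the data, just one ID's points, yet two counters are enough to produce a likely home and workplace pair.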
Stuart Thompson: And the data doesn't have, you know, a name. It doesn't have a phone number. Like, a lot of people ask us, can you look up my phone number in there? It doesn't work like that. You know, the industry calls it anonymous, and that's really fundamental to their business plan - to be like, you know, don't worry about us, don't regulate us too hard; it's anonymous data. And that's true in a way, because there aren't really any identifiable features in the data itself, other than the location. But if you think about where you go - you probably travel from home to your workplace, and how many people make that journey every single day?
Dave Bittner: Right.
Stuart Thompson: And that's the kind of data that's in here. So we can look at a house and back up from there and look at public information that we have access to that shows their name - who owns the house. And then, you know, people post stuff online. They have a LinkedIn profile. They have a Twitter account - whatever. There's a newsletter; somewhere, some place on the internet, there's some breadcrumb that mentions your name and your workplace. And really, that's a huge spotlight on who you are, potentially, and one or two other data points is enough to feel very confident about who it is.
Dave Bittner: As you combed through the data, were there any particular individuals you were able to key into that you found particularly chilling or unsettling at how easy it was to track someone?
Stuart Thompson: Yeah. So we basically started looking at the data thinking, can we deanonymize it? And you're looking at basically just a bunch of points - just a bunch of dots on a map. It's not really too scary. But the first time I was successful in tying a dot to a person - by looking at a house and a workplace and a couple of other places this person went, to confirm who they were - it was really scary. It was freaky. Like, I got goose bumps. I, like, sat back in my chair, took a deep breath. I think I went for a walk. It turned the data from something that's just a spreadsheet and a bunch of points on a map into, like, someone's diary. It felt pretty invasive at that moment to turn something that's, you know, innocuous and sold and traded for profit into, you know, something very revealing about someone's life.
Stuart Thompson: So it didn't have to be a prominent person to feel chilled by what we were seeing, but then there were prominent people in there. So we listed in the story a couple of people - senior Defense Department officials, Secret Service agents. We found one Secret Service agent - who we believe is a Secret Service agent - you know, following Trump around on a high-profile weekend with a foreign prime minister visiting Mar-a-Lago.
Stuart Thompson: And, you know, you see the path in the story, and it's like, whoa, that's pretty crazy. (Laughter) But you have to think, we started out being like, maybe this is fully airtight. You know, maybe there's no tracking at all of people within a literal arm's reach of the president. But that's not the case; everyone was sort of involved in this.
Dave Bittner: Now, the people who are buying and selling these datasets, how restrictive are they in terms of who they will sell to? If I went to one of these companies and I said, hey, I want to buy a bunch of this data, how hard would it be for me to get my hands on it?
Stuart Thompson: That's a great question, and we can only sort of speculate based on what companies say. We talked to some insiders, former employees and so on. But the truth is, there's nothing legally stopping them from selling to anybody. So you can imagine - and, you know, the former CEO of Foursquare says he was turning down million-dollar deals to sell batches of data. Now, that's easy for Foursquare to do 'cause they have a big business doing other kinds of analysis. But if you're a startup, a small company with, you know, 12 people, you have an incentive to make some money, and there's nothing legally stopping you from selling this. And, you know, we had a former employee of one of the companies tell us, you know, contracts like this go for millions of dollars.
Stuart Thompson: So yeah, I mean, the companies say they work with trusted businesses. They have a vetting process that includes, you know, references. You have to have a plausible business case. Those are the kinds of parameters that companies tell us about. But the truth is, there's no way to really say that that's all that happens. Of course, companies, you know, want to say that, and it's hard to evaluate because there's no obligation for them to report who they sell to. Companies won't tell you where they get data from or where it goes. And we can piece it together by looking at some of the big players that, you know, analyze the data and, you know, boast about the analysis they're able to do. But it's a black box in a real way.
Dave Bittner: What's the reaction been to the story? I'm thinking, specifically, have you heard anything from anyone, say, for - on Capitol Hill?
Stuart Thompson: So Congress seems extremely distracted right now.
Dave Bittner: (Laughter) That's a fair point, yes.
Stuart Thompson: So we haven't seen, you know, a bunch of hearings triggered right away. But I think there's definitely a number of senators who are interested in privacy and have put forward their own privacy bills. And, you know, this kind of thing, I think, adds to the sense of unease that people have. And when privacy returns to the forefront, we expect some sort of federal privacy bill - definitely within the next couple of years. I hope this year, but it's an election year, so you never know. But there's definitely going to be a federal privacy law coming, and this kind of exposure is really important for people to understand the scale of what's going on in an industry that pretty much operates invisibly, like, in the background of your phone.
Dave Bittner: How has this changed your attitude to this sort of thing? Do you have a different perspective having been through the work that you've done here on how you interact with your own mobile devices?
Stuart Thompson: Oh, yeah. I'm a nut now. I'm totally freaked out.
Dave Bittner: (Laughter).
Stuart Thompson: My editor was joking that I was going to file this story by carrier pigeon because I would be living in the woods somewhere. But yeah, I mean, when you see it - and, you know, when I started on the Privacy Project, when they first told me, like, oh, we're going to do a series on privacy, I was like, OK. Like, I remember writing about privacy, like, a decade ago. And I was like, I don't know if I really care - like, maybe, you know, they give you a screen and you consent to it, so what's the problem? I think, like, we're all brainwashed a little bit to think that, like, the status quo is the only way it can work.
Stuart Thompson: And when you start - like, when I started looking at this data, I just got - I felt really sort of trapped, like I couldn't escape the industry if I wanted to. And a lot of people, they don't want to right now. But, you know, maybe after seeing this story or maybe after seeing some more stuff, they might feel like they don't want to participate. But it's too late. Like, you can't get your data back. You don't know where it's gone. You don't know who has access to it. You can't delete it. You can't request it. There's no method for you to find out anything about what's known about you. And, you know, you might change your mind down the road if you don't care, like I did, you know.
Stuart Thompson: I used to use Foursquare. I used to think location services were a cool thing on a phone. It's like, what a cool additional piece of technology we have access to now, where you can, like, monitor your life. But it's extremely valuable to businesses, and there's, you know, no oversight. There are very few legal limitations on what they can do with it. So yeah, I'm totally freaked out. I turn off my location services at all times on my phone, for anything that wants them. And yeah, I mean, it's scary. But I know a lot of people don't do that, and they're not totally equipped to deal with it.
Stuart Thompson: The other thing I'll say is, like, we published our story, which did great and got a lot of attention, but the second-most-read piece in the whole series wasn't our expose on the national security threats; it was how to protect your phone. It was, like, the three steps to protect your phone. And people want to know that stuff. And I think it's the responsibility of the companies, like Apple and Google, to make that much easier to adjust, because right now we need to publish an article that gives you three complicated steps and a bunch of videos to protect your phone, when it should be much easier than that.
Dave Bittner: What are your thoughts on potential mitigations for this sort of thing? Is this something where we should see regulations or legislation? Do you have any insights there?
Stuart Thompson: It's complicated, the solutions that Congress can put forward. I think, generally, I'd love to see more transparency. Like, I'd like companies to have to, you know, publish where the data goes and what they do with it. If Congress looks at this issue and makes a federal privacy law and the end result is longer privacy policies, it's going to be a total disaster. And so far, a lot of the laws that get put forward are, like, essentially more notice and more consent, and that's not really effective. I mean, we've seen that - you might consent to share your location with an app without knowing where that location is going, and it shouldn't really be up to you to have to figure all that out.
Stuart Thompson: So I'd really like to see more pressure put on companies. In a tangible way, you know, you could really put a lot of limitations on them, like the timeframe that you can keep the data for. The use cases you can use it for is another one, borrowing from the European privacy law, the GDPR. So if you state that you want to use it for advertising, you then can't use it for hedge fund analysis or something; it has to be for the strict reasons that you state upfront. And yeah, I mean, companies right now can keep the data forever - that's, you know, a privacy risk for sure. So it'd be nice to have some limitations on that.
Stuart Thompson: And there's people arguing that this stuff just shouldn't exist at all. There's the use case where, you know, you give your location to an app and it might use it within the app environment, but what we're reporting on are companies that sit inside that app, have nothing to do with the app and then use it, basically, as, like, a mining tool to collect your location data. And maybe - that seems to me like something that maybe should stop.
Dave Bittner: All right. Ben, what do you think?
Ben Yelin: Well, it's just great to hear from Stuart, first of all. You know, we spent a lot of time on this podcast and I've spent a lot of time in my own brain wrestling with the implications of this article. So it was just nice to hear from him and also nice to hear that he, too, has been wrestling with this. What was interesting to me is that he was put on this Privacy Project not being somebody who was obsessed with digital privacy. He came at it from a perspective that I think a lot of people come at it with, which is we all press I agree to the terms and conditions.
Dave Bittner: Right.
Ben Yelin: If I have nothing to hide, why - you know, why should I be scared? And then he did this extraordinary research project and realized that there are a lot of reasons to be disturbed - not only the violations of personal privacy, but some of the things he uncovered about dangers to national security, like phones pinging inside the Pentagon and outside Langley, the CIA headquarters. And so it took him doing this extensive research to get to the place where he is now, where he turns off location services on all of his applications.
Dave Bittner: Right.
Ben Yelin: And, you know, his colleagues are joking about him filing the story from the woods.
Dave Bittner: Yeah. Yeah.
Ben Yelin: And I think sort of the logical leap to make from that is, the more all of us got into the weeds on how invasive this technology is and how little transparency there is, the more it would concern all of us. So sometimes it's just a matter of - I wouldn't necessarily use the word ignorance, but just not understanding...
Dave Bittner: Yeah. Resignation is the word I think of. That...
Ben Yelin: Yeah. A lot of it is resignation because we - and I know we've talked about this, too - we like having location services for all of the conveniences it provides, and it does provide a lot of conveniences. And another point I thought was very meaningful that he made is, yes, we do have the opportunity to consent to location sharing within each application, but unless you do the type of research that he's done, most users not only have no idea where that information is going, they shouldn't really be expected to know where it's going. It should not be up to the users of these devices to figure out the dangers of sharing their location. Just, like, if you think of, you know, a nondigital product...
Dave Bittner: Right.
Ben Yelin: ...It should not be up to the users, you know, to figure out what, you know...
Dave Bittner: (Laughter) Right. I shouldn't have to keep a Geiger counter at my house to test the foods I bring into my home for radioactivity, right? (Laughter).
Ben Yelin: That is a perfect example. Mine was going to be, you know, if your dishwasher stops working, it shouldn't be up to you to figure out what...
Dave Bittner: Right. Right.
Ben Yelin: ...Cog in the machine is causing that problem to exist.
Dave Bittner: Right.
Ben Yelin: But I like your example better.
Dave Bittner: Yeah. Right. We have general rules about radioactivity in consumer products. We agree that's a bad thing (laughter).
Ben Yelin: Exactly. And so this would just be another type of consumer protection. He shares my skepticism that Congress is going to act quickly on this, even though there's some interest. But I think the place he wants to start is laws around transparency - just so that people, if they choose to be aware of where their information is going, have the option to be aware. They may not choose to be aware, but at least give them that option.
Dave Bittner: All right. Well, our thanks to Stuart Thompson for agreeing to speak with us. Certainly an interesting conversation; in my estimation, an important one as well. So we want to thank him for taking the time. And of course, we want to thank all of you for listening.
Dave Bittner: And we want to thank this week's sponsor, KnowBe4. If you go to kb4.com/kcm, you can check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time.
Dave Bittner: Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.