Your own data used against you.
Scott Giordano: Today's anonymous data may be tomorrow's deanonymized data.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. This episode is for September 16, 2020. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, I have the story of adult websites and their reaction to deepfakes. Ben describes a ruling from the 9th Circuit Court that has Fourth Amendment implications. And later in the show, my conversation with Scott Giordano. He's VP and senior counsel, Privacy and Compliance at Spirion. And he's going to be discussing the surprising ways your data can be used against you and how you can protect yourself and those who matter most.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's kick things off with some stories this week. You've got an interesting one. What do you have for us?
Ben Yelin: Yeah, so a major court case came out last week relating to the now-defunct call detail records program. I feel like a broken record on this 'cause I know we've talked about it ad nauseam, but there keep...
Dave Bittner: (Laughter).
Ben Yelin: ...You know, we keep getting these news developments, and I don't want to leave our listeners hanging.
Ben Yelin: So I'll give as brief a background as possible. We had this phone metadata program where the federal government as a matter of course was collecting nearly all domestic phone records - the metadata - so who made the call, who received the call, the duration of the call. This was authorized pursuant to the USA Patriot Act, Section 215. It was one of the programs uncovered by Edward Snowden. Congress got very angry once this program was uncovered - at least how the program was being carried out at the time - so they essentially ended it.
Ben Yelin: But before they ended it, somebody had been arrested, potentially based on evidence obtained through this call detail records program. And that person was named Basaaly Saeed Moalin. And he is the subject of this 9th Circuit case. So he is a Somali immigrant who came to the United States and was accused of providing material support to a terrorist group - an alleged terrorist group in Somalia.
Ben Yelin: And some of the evidence that was used, particularly in later stages of the investigation, was from these call detail records. After the NSA and the FBI did some investigative work, they found out that his phone number was making calls to a suspected terrorist overseas. And so that was at least part of the reason why he was arrested and convicted. He was actually - the original conviction was seven years ago. For a bunch of procedural reasons, it took until 2020 for the case to be resolved.
Ben Yelin: So what we got from the 9th Circuit is that the phone metadata program as it existed prior to 2015 is very likely to be unconstitutional. They did not make a definitive ruling because they said that it wouldn't really have mattered whether this evidence was suppressed or not because the conviction would have been upheld even absent this evidence. But they did muse at length about the constitutional problems with Section 215 and the phone metadata program.
Ben Yelin: And that's the first time a court at this level has really grappled with the constitutional issues. We've seen district court cases that have done so at the lower level, and we've seen appeals court cases that have made rulings on narrower grounds, such as standing and whether the statute itself authorizes this type of collection. But we've never seen courts go deeply into the constitutional issues. And that's what this court did here.
Ben Yelin: So to sum it up really briefly, the precedent case for all of this metadata collection is Smith v. Maryland, which applied what's called the third-party doctrine to electronic communications. Basically, what that decision held is that if you voluntarily convey information to a third party, like a phone company, you forfeit your reasonable expectation of privacy and, therefore, you don't have any Fourth Amendment protection. Now, that case was about a single individual and a local police department that placed a pen register recording the numbers that individual dialed.
Ben Yelin: Obviously, technology has changed significantly since 1979 when this case was decided. And what this court is saying is technology has changed in such a sufficient manner that we can no longer use Smith as precedent when we're talking about bulk metadata, that metadata reveals so much more now than it did back then. And they give some examples. They also cite some other cases, including from the Supreme Court, where justices have talked about how much information - private information - you can glean from this type of bulk metadata.
Ben Yelin: So in the view of this court, when you look at the totality of the circumstances, they think that the Smith precedent should not apply in situations like this. In their view, this program is very likely unconstitutional. And if it still existed today, it's possible that we'd see it struck down by the Supreme Court or by the 9th Circuit itself. So certainly a landmark decision.
Dave Bittner: Now, since the program has already been done away with, I mean, is this kind of a - I don't know - a period on the end of that sentence?
Ben Yelin: Yeah, I feel like this is the final nail in the coffin of the Section 215 program. We had already had both the NSA itself and all sorts of intelligence experts saying that this was largely an ineffective counterterrorism program. I know we've talked about this - that phone metadata is just not that useful of a tool anymore. Maybe it was back in the ancient days of the 2000s.
Dave Bittner: (Laughter).
Ben Yelin: But, you know, these days, people communicate on - via secure applications. Nobody makes phone calls. I basically - I call my parents, and that's about it.
Dave Bittner: (Laughter) Right, right. Yeah.
Ben Yelin: So it's just not that effective of a counterterrorism tool. And from what the court is saying here, the fact that it's ineffective means that you can't justify the invasion of privacy. So I think this really is the proverbial death knell for this program.
Ben Yelin: I don't think, you know, it was going to come back as it existed prior to 2015. I don't think Congress was going to reauthorize it to that extent. But I think this just sort of puts the icing on the cake.
Dave Bittner: Yeah. Interesting that on Twitter, Edward Snowden himself was crowing about this, sort of raising his flag in victory.
Ben Yelin: Yeah, he was doing a little chest thumping. And you know what?
Dave Bittner: (Laughter).
Ben Yelin: Whether you like Snowden or not, I think he deserves to chest thump here. I mean, the only way we know about the existence of the call detail records program was because of the Snowden leaks. And it really did cause dramatic change both in Congress and among the courts. So it was a success for the country.
Ben Yelin: I mean, I think Snowden has his own motives. He thinks, you know, well, maybe Donald Trump will pardon me now. This program has been shown to potentially be unconstitutional; I've been vindicated. But he certainly has a right to claim some victory here.
Dave Bittner: Yeah. The conversations I've had with folks in the intelligence community who are not pro-Snowden - they'll make the point that, given the way he released the information and the sheer amount of it, the subtext seems to be that perhaps lives were lost in the intelligence community, which is a, you know, interesting counterpoint to that.
Ben Yelin: Yeah, always important to remember. I've heard the same thing. You know, I've had students who have been in the intelligence community. And they've just - they get a seething look on their face when they talk about Snowden. And so I'm sympathetic to that as well.
Ben Yelin: And just because he can claim vindication as it relates to this program, it does not necessarily mean he can claim vindication for all the documents he released, and we're talking about volumes and volumes of documents.
Dave Bittner: Yeah.
Ben Yelin: I wanted to touch on one other thing about this case quickly, if I could.
Dave Bittner: Sure.
Ben Yelin: The 9th Circuit also said for the first time that the government has to give notice to potential criminal defendants that it has used this type of surveillance technique. Previously, courts and Congress have said that notice has to be given to criminal defendants when a warrant has been issued or, in the intelligence context, when there has been a FISA warrant or an authorized FISA search. But never has this notice requirement been applied in situations like this, where no warrant was issued and where it was very unclear whether this was even a Fourth Amendment search to begin with, particularly if you believe the precedent from Smith v. Maryland.
Ben Yelin: So that might be the biggest tangible impact of this case - that we have, for the first time, a court saying that when some sort of foreign intelligence technique is used, even if a warrant is not obtained, the defendant has to be given notice of this type of surveillance so that they can attack it at trial. And a lot of legal scholars, including our boy, Mr. Orin Kerr...
Dave Bittner: (Laughter).
Ben Yelin: ...Have already written on this and talked about the broad implications and what this means going forward. So just certainly something to look out for.
Ben Yelin: That's something I don't think people were expecting to see from this case. And, you know, now, at least in future proceedings of the 9th Circuit, you're going to have to have this type of surveillance revealed, even if it does not emanate from a warrant, to criminal defendants or terrorist targets.
Dave Bittner: All right. Well, interesting developments, for sure.
Dave Bittner: And I'm going to pivot now and go from the important topic of constitutional law to porn.
Ben Yelin: Woohoo.
Dave Bittner: (Laughter) Yes. And...
Ben Yelin: They're both adult subjects, but in very different ways.
Dave Bittner: Well, that's true, I suppose. This is a story from WIRED, written by Matt Burgess over at WIRED U.K. And the title of the article is "Porn Sites Still Won't Take Down Nonconsensual Deepfakes." And it's the deepfakes angle here that I think is particularly interesting for us.
Dave Bittner: And the reality is that you have these porn sites online, which are very popular, of course - some of the most popular sites on the internet, period. And I guess a lot of them allow people to upload videos. And more and more, the problem is that people are uploading deepfakes. For those who aren't familiar - just a quick review.
Dave Bittner: Deepfakes are when you're able to use computer technology to map the face of one person onto the body of another person. And the software has gotten to the point now where it's very convincing and easy to do. So it doesn't cost a lot of money, doesn't take a lot of time. Our computers have gotten so powerful and so fast that this is not out of reach of folks who set out to do this sort of thing.
Dave Bittner: And so the issue is that many of these porn sites - well, they make money off of people watching the videos. The videos are surrounded by ads, and so on and so forth. So they're interested in generating page views. And so people can log on and see these deepfakes that may involve their favorite celebrities - the article mentions folks like Emma Watson, Natalie Portman and Taylor Swift - and that generates page clicks. And evidently, these sites are slow to take down the videos, and that has a lot of people upset that there really isn't a legal framework here to speed this up.
Dave Bittner: What do you make of this, Ben?
Ben Yelin: Yeah. I mean, to me, it's just a sad story in a lot of ways. I mean, it's very exploitative of the individuals who are depicted in deepfakes. And that's obviously the biggest problem.
Dave Bittner: Right.
Ben Yelin: But just the fact that you have all these - and it's generally guys who have sad enough lives that they need to fantasize about their celebrity crushes.
Dave Bittner: (Laughter).
Ben Yelin: And the fact that the technology exists for them to do so in pornographic videos is just kind of depressing to me.
Ben Yelin: But, yeah. I mean, the lack of legal recourse here is very problematic. There are a few potential avenues. So you can file a defamation suit. That's generally not going to be successful when we're talking about high-profile celebrities, just because they're in the public eye. There's a more stringent standard, in terms of protecting First Amendment rights, when we're talking about people who are very public figures. It might work, you know, in some of the more extreme examples where the people being depicted are not celebrities but noncelebrities - even people like young YouTube stars. You might have a better defamation case there.
Ben Yelin: And there are some other causes of action. They're generally not successful, and they're very onerous - you know, you'd have to have a good attorney to prevail. So you're really kind of out of options if you're being depicted here.
Ben Yelin: It is very exploitative. It does hurt a person's reputation. And that's just sort of the least of the problems there.
Dave Bittner: Yeah, and the takedown process is not fast. The good thing for these sites to do would be to err on the side of taking these down - if someone complains about a user-submitted video, it comes down quickly.
Ben Yelin: Right.
Dave Bittner: And then figure it out later. Let people argue or duke it out while the video is not online anymore. If there's ever an area to err on the side of caution, well, this would be one of them.
Ben Yelin: Yeah. One - another thing I worry about related to that point is once it goes up on one site, then you start to get into darker areas of the web where...
Dave Bittner: Right, right.
Ben Yelin: ...They might not be as amenable to requests to take down this information.
Dave Bittner: Yeah.
Ben Yelin: And so these videos can spread like wildfires. I hate to use that metaphor given what's happening in my home state right now.
Dave Bittner: Yeah. Yeah.
Ben Yelin: But these videos can spread very quickly. So it's not just the high-profile websites that they mention here. It's that, you know, it can end up in some of the darker corners of the internet.
Dave Bittner: Yeah. Well, I was curious to see what was going on in terms of policy and legislation. So I was doing some digging around, and I found an article from the folks over at Malwarebytes. This is something they published back in January about some of the laws and proposals that are making their way around the country. Some states - they said California, Virginia and Texas - already have deepfake laws. There are proposals in Massachusetts, New York and our own state, Maryland. Interesting to me that the Maryland bill primarily focuses on election fraud, which, hey, politicians looking out for themselves. Imagine that.
Ben Yelin: Of course.
Dave Bittner: (Laughter) Right.
Ben Yelin: Yeah.
Dave Bittner: But one thing that struck me was they listed some of the federal deepfake legislation before Congress. I cannot let this pass. One of them is called the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act. What does that spell out?
Ben Yelin: We got an acronym, baby.
Dave Bittner: (Laughter) What does it spell out, Ben? It spells out DEEPFAKES.
Ben Yelin: DEEPFAKES.
Dave Bittner: (Laughter).
Ben Yelin: You know me and my acronyms.
Dave Bittner: There's nothing that a - I guess a congressional intern likes better than being able to come up with a name for something that spells out the thing that it is, right?
Ben Yelin: In my second life, I just want to have the job of the person who develops congressional acronyms. I mean, a lot of the laws we talk about on here - USA Patriot Act, USA Freedom Act - those are acronyms, which some people don't realize.
Dave Bittner: (Laughter) Are they really?
Ben Yelin: This is an especially good one.
Dave Bittner: Yeah.
Ben Yelin: So credit for that, at least.
Dave Bittner: Yeah.
Ben Yelin: A lot of states have laws that relate to the political realm. They don't want, you know, the words of politicians to be twisted in a way that misrepresents what they've actually said. So there's that viral video that went around a couple of years ago - I think it was Jordan Peele who did a sketch where, using deepfake technology, they made it look like Obama was saying something that he never actually said. And it was...
Dave Bittner: Right.
Ben Yelin: ...Really well done and very effective. And I think, you know, that's a major fear that could interrupt our political process, especially when you have malicious foreign actors who would benefit from doing such a thing.
Ben Yelin: So, you know, because the technology is easy to use and it's cheap, it certainly could allow fake political information to get out there. That's a very serious problem. I think just as serious is the exploitative nature of deepfake pornographic videos.
Dave Bittner: Yeah.
Ben Yelin: So I think it's incumbent upon states to address both problems. And it seems like some of them are sort of starting to get there, but we're still in our very early stages.
Dave Bittner: Yeah. And it seems as though when these things do come up for vote, it seems like they get broad bipartisan support. Everybody seems to be on board.
Ben Yelin: Yeah, absolutely. My philosophy is eventually everything becomes polarized.
Dave Bittner: (Laughter).
Ben Yelin: But at least for right now, yeah.
Dave Bittner: You're not the least bit cynical, are you, Ben?
Ben Yelin: I'm not cynical at all.
Dave Bittner: (Laughter).
Ben Yelin: But you know what? On the positive side, at least for right now, I do think there really is bipartisan support for doing away with these videos, or at least giving a cause of action to the individuals who are depicted in them. So I guess that's one positive to take away from this.
Dave Bittner: All right, well, those are our stories for this week.
Dave Bittner: We would love to hear from you. If you have a question for us, you can call in. It's 410-618-3720. You can also send us an email to caveat@thecyberwire.com.
Dave Bittner: Ben, I recently had the pleasure of speaking with Scott Giordano. He is VP and senior counsel for Privacy and Compliance at Spirion. And our discussion centered on some of the ways that your own data can be used against you, as well as ways you can protect yourself and your loved ones. Here's my conversation with Scott Giordano.
Scott Giordano: Here's the problem with mobile devices - and, by the way, with the applications that really power them: we have so many choices available to us to do all kinds of wonderful things, but we don't really see what's going on behind the scenes. We don't understand where our personal data is being sent, who's using it, why they're using it, how they're using it, with whom they're sharing it. There's so much we don't know, and that's the problem. And that casts a shadow over everything that we do with mobile devices and, soon, really everything that connects to the internet, whether it's an internet-connected refrigerator or thermostat or fish tank, as the case may be. It's a problem for all of those things.
Dave Bittner: Well, can you give us some examples of, in day-to-day life, the types of information that folks are turning over over the course of their day?
Scott Giordano: A great place to look is in lawsuits, and I'll give you a couple of really crisp examples. The first is Zoom - Zoom as in the videoconferencing service we've all been using probably for the last four or five months. There was a lawsuit recently filed against Zoom because, among other problems, Zoom was sharing information automatically with Facebook. And this was true even if you didn't have a Facebook account. So you have all this shadow data floating around out there. It was being used without your explicit permission, and that's a problem.
Scott Giordano: Same thing with the Ring doorbell - the security doorbell that, again, we're all well-versed with. There were a lot of problems with the security piece of it, but also with the privacy piece, in the sense that there were trackers in the app that the Ring doorbell uses, and those were sending personal information to all kinds of third parties.
Scott Giordano: And these are things, again, that you and I really can't tell what's going on. We don't know what information is being shared, and we're trusting implicitly all of these devices. And it's really not a good practice to get involved in.
Dave Bittner: Yeah, I saw, you know, in the past year or so, there was the revelation that - I believe it was a weather app that was...
Scott Giordano: Yeah.
Dave Bittner: ...Gathering up a lot more information than folks thought it would.
Scott Giordano: Yes, yes, the Weather Channel app. And, in fact, there was a lawsuit about that. I don't know if that has been settled out of court or it's still pending. But, yes, the weather app that's powered by IBM, as they like to say. And I hate to say it, but it's the app I use to check the weather here in Seattle. That was selling personal information without users' permission.
Dave Bittner: Now, what about the flip side of this? I mean, the providers of these apps, of these services, certainly, organizations like Facebook - they'll say, well, when you signed up for our service, you agreed that we are entitled to gather this information. You clicked that box. You said, I agree, and so we're good here.
Scott Giordano: Well, we're not good here, and here's why - because ordinary people have no idea what's really being done with their data behind the scenes. And, in fact, Google was sanctioned for this by the French data protection authority, the CNIL, because they had one box, and they had about 20 different processes. So you had one box to check. You couldn't differentiate among those different processes or uses and say, well, I want you to use my data for this but not for this. It was, as I like to joke, one box to rule them all.
Scott Giordano: And that is not really consent. If users are consenting to anything and everything with one box, you haven't really given them much of a choice except not to use your product at all. And that kind of defeats the purpose of why we have the internet and all these great products.
Dave Bittner: Have we seen any meaningful shifts in this? I mean, with things like GDPR, with the CCPA out of California, are they really moving the needle?
Scott Giordano: They are. They are. And I've worked on many projects getting ready, in fact, for GDPR, and it really changed the culture. It changed the dynamics of these organizations, where they had to, perhaps for the first time, think about what personal data they had, what they were doing with it, with whom they were sharing it, how they were using it - all these questions. And these are questions that traditionally you haven't really had to answer except when there was a breach and some kind of exfiltration of data.
Scott Giordano: Then the question is, well, what did we have, and what were we doing with it? And that doesn't bode well whenever there's an investigation by an authority and they ask you why you had the data and you can't tell them. And that was one of the problems with the Equifax case in the U.K. When the U.K. data protection authority asked them why they had certain data, they couldn't answer. And so that was not well received, to say the least.
Dave Bittner: Is there still this culture among these companies that - I don't know - just because we can, we will, you know, when it comes to gathering data, that all data has value and so because storage is cheap and we have access to it, we might as well vacuum it up, and maybe it has value to sell to someone else?
Scott Giordano: Yeah, that's a very traditional American culture, American view, certainly very Silicon Valley - the idea of move fast and break things. And the problem is that in that process, it's our own personal data, our own lives that are suffering as a consequence.
Dave Bittner: Can you give us some examples of, you know, when we're talking about data that's been anonymized? We hear that a lot, that - relax. It's OK. The data has been anonymized. No one can trace it back to you. But it's a little more complicated than that.
Scott Giordano: It is, because today's anonymous data may be tomorrow's deanonymized data. And Bruce Schneier, who I'm sure many of your listeners follow or at least have heard of, wrote about this in one of his books, where he talked about the Kinsey studies on human sexuality from the middle of the last century and how all of the participants who were anonymous then could be easily deanonymized today.
Scott Giordano: And so it really raises a larger question: if the study you're doing today on human sexuality is anonymous, but next week or next year a data science breakthrough happens and, suddenly, everyone's deanonymized, how does that change the dynamic? And, unfortunately, I don't think everyone's thinking about this. They're thinking about, what can we do with the data we have? How can we monetize it? And that's, in and of itself, not bad, but you always have to think about the externalities. How is this going to impact real people?
Dave Bittner: You know, we're seeing reports that Apple, for example, in the next version of iOS, their mobile operating system, is going to be alerting users more overtly when apps try to gather information. It'll tell you, this app is trying to gather this bit of information about you - are you going to allow that? It seems like that could be a move in the right direction.
Scott Giordano: It is. It is. And that should be the model going forward: as a consumer, I'm not just giving consent - I'm giving informed consent, which really is the touchstone for consent. And certainly, that's how it's being done under the GDPR. Consent under that regulation has to be informed. People have to know what they're consenting to. So, yes, this model that Apple is adopting is great. I think it's long overdue, but I'm glad that they're doing it.
Dave Bittner: Do you think we're going to see that going forward? I mean, is it possible that companies will see embracing privacy as a competitive advantage?
Scott Giordano: I do. I see popular companies, again, like Apple, using that as a marketplace competitive advantage. Unfortunately, there are plenty of companies out there, though, that see your data as the whole reason they created the app or the device, and all they want to do is get it and monetize it, however they can. They're not too concerned about the external effects on individuals.
Dave Bittner: Where do you suppose we're headed? I mean, I hear a lot of people saying that what we really need is some sort of, you know, federal legislation - some regulations at the federal level so that we don't have a patchwork of state-by-state regulations. Do you think that's a possibility?
Scott Giordano: I don't. I think that train left the station five, 10 years ago. Right now, if you look at maybe the last two years of legislation, about 35 states have updated their cybersecurity and privacy regulations. California is the most well-known, but New York actually has a much lower threshold. It applies to anyone in the U.S., not just certain companies that have certain revenues. Anyone that even looks at the personal data of a New York resident is subject to the New York SHIELD Act.
Scott Giordano: So we're seeing that this is really being done at the state level - that things that the federal government should've done are now being done in the states. And it's probably best to leave the states to do it.
Dave Bittner: And, again, where do you suppose we're headed with this? Do you feel like we're headed in the right direction? Is the will of the public to have these things clamp down, that - are people saying, hey, we've had enough of this?
Scott Giordano: I think so. I think the popularity of the proposed update to the CCPA, the CPRA, which is now on the California ballot - I talked to one of the authors at RSA. He said that the polling for it was the highest of any proposed ballot initiative in history. So I think that bodes well for Californians. And I'm hoping that other states will take note of this and start using the same model for their residents.
Dave Bittner: What about the potential reach of these laws? You know, one of the things about GDPR that famously caught people's attention was that its reach extends beyond the EU - if you're anywhere in the world and you're handling the data of people in the EU, it applies to you. The CCPA is similar. Is that a way to sort of - I don't know - extend the grip of these regulations so that it doesn't necessarily matter where you are, that they can affect you either way?
Scott Giordano: I think it's a good model, and here's why. Say that you are a company in the EU and you're marketing directly to California residents, or residents of any particular state. Would it be fair for those individuals not to have the benefit of California law to protect them? Because, certainly, that's the view of the European Union - that the GDPR protects EU data subjects around the world. So why not use that same model? I think it's the way to go. I think it really forces us to think about what we're doing with someone's personal data and how it needs to be treated with respect.
Dave Bittner: What do you suppose equilibrium looks like when it comes to these sorts of things? Where - if we were to find a balance between the appropriate use of our data and people's privacy, in your mind's eye, can you imagine what that might look like?
Scott Giordano: Yeah. I think it looks like being able to provide individuals with a list, in bullet-point form, of exactly what you're doing with their personal information, and having them understand that these are the things you're doing in exchange for whatever you're giving them - whether it's free email or a free weather app or a free anything. You're giving them a very clear idea.
Scott Giordano: And we have this in consumer law already, where you're required to give certain information to consumers, whether it's about warranties or it's about recalls or what have you. Same thing for personal information. There should be an information sheet that gives you a very concise view of exactly what's being done with your data. And that's something that should be available at any website or, if it's an app, going to the about section of the app and finding out exactly what these limitations are. But that should be something that's just issued with every device or every app. That way, you understand exactly what you're getting yourself into.
Dave Bittner: What about providing some sort of granularity to the users to be able to go through some sort of menu and say, hey, I'm willing to share this, but this is off limits, and, you know, perhaps I'd even be able to pay a monthly fee to not - to run the version of, let's say, Facebook, for example, that isn't gathering all my information?
Scott Giordano: I think that is a great way to go. The problem is it requires some effort on the part of end users, and I don't know if they're willing to tolerate the effort. I think for folks like your audience, they'd certainly be willing to go and check the boxes that they want to apply and say, fine, I'll pay for the rest, but I'll view your advertisements, I'll allow you to use my data, et cetera. And so it really gives you a lot of control.
Scott Giordano: On the consumer side, you have to put in some effort to really understand what is being done with your data and what you can do to protect yourself. So it's incumbent upon consumers and not just the vendors of the smart devices or the apps.
Dave Bittner: Looking at it from the other side of things, for the organizations that may be gathering data, I mean, what sort of insights do you have for them? How should they be viewing - how should they be respecting the data that they're gathering?
Scott Giordano: I think the first thing to remember is that you're gathering data from real people, not from machines or other things like that. At the end of the day, there's a real person behind that data, and what you do with it has an impact on their lives. I think that, unfortunately, individuals tend to get involved in things on the internet that are just terrible for personal data. But there's no stopping these folks.
Scott Giordano: And I'll give you a great example that I'm still shocked by, five years later - the whole Ashley Madison hack. For the benefit of your listeners, this was a website where you could go on and allegedly look for someone to have an affair with, OK? So put aside whether that's a smart idea.
Scott Giordano: The idea that that data is not going to get monetized - it's very naive to think that that organization is not going to do something with that data. The whole business model may be to get that data, rather than any fee they may be charging you every month. And consider the idea that perhaps the people who set that up are not just a bunch of businesspeople. It could be an intelligence agency. It could be a crime ring. I mean, there's no end to it.
Scott Giordano: But consumers have to take a little more responsibility on what they're getting involved in on the internet. And we're still seeing the repercussions of that hack today, where people are being blackmailed and so forth. So it's on the consumer side to be skeptical and to be maybe a bit cynical, like a privacy lawyer, and not just on the side of the app developers.
Dave Bittner: You know, I think a lot of us feel a little let down. You know, when things started moving online, when retailers and many of the businesses and services that we use went online, there was kind of this hopeful promise that, hey, this is going to be great for everybody. You're only going to see ads for things that you're interested in. And you're - we're not going to waste your time with ads for things that you would never purchase. And to a lot of us, that sounded like a great idea.
Dave Bittner: But it seems as though we've gone way beyond that, where, you know, how many of our friends and family now - they talk about how so many of their interactions are just creepy, where they feel as though, I had a conversation with someone about something, or I did a little bit of searching for something online, and now that search or that conversation is following me around, and I feel like I'm being pestered to buy that thing no matter where I go online.
Scott Giordano: It's frustrating, especially if you've already bought the thing in question and it still follows you around...
Dave Bittner: (Laughter) Right, right.
Scott Giordano: ...Which drives me bonkers. So...
Dave Bittner: Right. I don't need a second car.
Scott Giordano: Yeah, or a second of a lot of things.
Dave Bittner: Right, right.
Scott Giordano: And this drives me bonkers 'cause sometimes it'll follow you from years ago. It just seems like there's a certain degree of either laziness on the part of advertisers or an inability to really fulfill that idea of precise targeting. I would love to get ads that really understand what I'm looking for, but there's not a lot of intelligence or thought behind these ads, and it escapes me why that's the case. If you're going to ask for my personal data, at least do something smart with it. But, in fact, they're just doing the same stuff that we've been seeing for the past 10 years.
Scott Giordano: And so I think it's incumbent upon the advertising industry, which is this entire invisible industry - very few people really understand how that ecosystem works for internet advertising. They need to step up to the plate and say, how can we make the ads, which support a lot of the things that we have, a better fit and a little less creepy and intrusive? 'Cause I agree there's a fine line between, hey, that's spot on and, wow, how did they know that? That is not cool.
Dave Bittner: Right.
Scott Giordano: And I think we've all had those moments.
Dave Bittner: What's your advice for folks out there who want to take a more active role in protecting their privacy? Do you have any tips for some of the ways they can best go about that?
Scott Giordano: Well, first is a healthy dose of cynicism - that if you're getting something for free, that there's going to be data of yours that's going to be sold. And it may come from things you didn't expect. We talked about the Weather Channel app earlier.
Scott Giordano: But take things that are gaining, shall we say, notoriety in a negative way - perhaps things that are very common, that people, even children, participate in, and now we have stories that the data is going to be sold or shared with the Chinese government, et cetera. You really have to have a very cynical view of everything that is on your phone.
Scott Giordano: In fact, if we all pull out our phones right now and look at the hundred or so apps, how many of those do we really use? We maybe use 10 at the most. Why not get rid of things you're not using - things that have your personal information but that you're not getting anything out of? You downloaded it once, you played around with it, and you forgot about it. I think that kind of very proactive, if-I-don't-need-it-I'm-getting-rid-of-it viewpoint is the best way to go with this. And that's great for apps and for mobile devices.
Scott Giordano: But think about everything else we have in our lives, all the Internet of Things. Routers - they're one of the worst offenders. It's not easy to update a router to make sure you have the latest security. So that's a problem in and of itself, beyond having our personal data taken from an app and having apps talk to one another when they shouldn't. You have this whole other issue of, how do you make sure these devices understand what you want out of security?
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: Yeah, another very informative conversation. This hits on a lot of themes we've heard from some of our previous guests - that the amount of information being collected from us without our consent goes beyond anything it's ever been, and lawmakers and policymakers have just been very slow to address it.
Ben Yelin: You know, I think one positive I took away from the interview is the long arms of the European Union and California have really started to effectuate change in this area, which is good, especially in the absence of federal action. And Scott seemed to think that this is something that can be done at the state level. I still would prefer, I think, from a compliance perspective to have federal privacy legislation just so you can have some level of uniformity.
Ben Yelin: But definitely a really interesting interview, and I was very glad to hear from Mr. Giordano.
Dave Bittner: We'd like to thank all of you for listening. That is our show.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.