There is a trust gap.
Chris Pedigo: The No. 1 way that consumers protect their privacy today is by deleting cookies, which really doesn't do the trick, and it actually breaks things when you try to log back into a site.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. This is Episode 42 for August 19, 2020. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, I've got a new project from the EFF that hunts stingrays, Ben describes a lawsuit accusing Zoom of misrepresenting their encryption and, later in the show, my conversation with Rande Price and Chris Pedigo from Digital Content Next. They're a trade association for digital publishers. We're going to be discussing the challenges of aligning the data that websites and apps collect with what consumers expect. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, why don't you start things off for us this week? What do you have for us?
Ben Yelin: Sure. So my article is from The Washington Post's "Cybersecurity 202" newsletter - "Zoom Sued by Consumer Protection Group for Misrepresenting its Encryption Protections." So our favorite pandemic-based application is in a little bit of legal trouble. They're not in financial trouble. Hopefully, smart investors out there got in on the ground floor in January. But they are in some legal trouble now.
Dave Bittner: Well, they can afford to hire good lawyers, right (laughter)?
Ben Yelin: That's exactly right. Their stock has certainly risen over the past several months. And I think they're going to need some good lawyers. This is a lawsuit that was brought in the Superior Court of the District of Columbia, so the equivalent of what a state court would be. And it is a claim under the District of Columbia's Consumer Protection Procedures Act, basically saying that Zoom was engaging in fraudulent trade practices. The allegation is that Zoom was misrepresenting its encryption capabilities. They have long branded themselves as having end-to-end encryption. They put it in a bunch of their reports and publications. It was advertised on their website - you know, it was a feature of their service that they publicized. Turns out, as probably many of our listeners know, that they did not have end-to-end encryption. That was just a misrepresentation. They used a lesser form of encryption - TLS, transport encryption - which meant that Zoom itself could potentially have access to a Zoom conversation. And because, you know, the end-to-end encryption wasn't in place, it left conversations vulnerable to bad actors, hackers, potentially.
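The distinction Ben is drawing can be shown with a toy model. This sketch uses a deliberately fake XOR "cipher" purely for illustration (never for real cryptography): with transport encryption like TLS, the relaying server holds a key and can read the plaintext before re-encrypting it for the next leg; with end-to-end encryption, only the two participants share the key, and the server relays ciphertext it cannot decrypt.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy XOR "cipher" for illustration only -- never use for real crypto.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meeting at noon"

# Transport encryption (TLS-style): each leg is encrypted to the server,
# so the relaying server holds a key, decrypts, and sees the plaintext.
alice_server_key = secrets.token_bytes(len(message))
on_the_wire = xor_cipher(alice_server_key, message)
seen_by_server = xor_cipher(alice_server_key, on_the_wire)
assert seen_by_server == message  # the provider can read the call

# End-to-end encryption: only Alice and Bob share the key; the server
# relays ciphertext it has no key for.
alice_bob_key = secrets.token_bytes(len(message))
relayed_by_server = xor_cipher(alice_bob_key, message)
assert xor_cipher(alice_bob_key, relayed_by_server) == message  # only Bob reads it
```

In the first model the provider is a decrypting middlebox; in the second it is a blind relay, which is what "end-to-end" promises.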
Ben Yelin: So the allegation here is that Zoom misrepresented their end-to-end encryption capabilities, and because consumers in the District of Columbia, where this lawsuit has been initiated, were relying on that false end-to-end encryption guarantee, Zoom was basically defrauding its consumers. So a couple noteworthy things about this lawsuit. First, D.C. is one of the few jurisdictions in the country that allows third-party groups to bring lawsuits on behalf of consumers. So usually this would be brought either by the consumers themselves in a class-action lawsuit or by the attorney general within a jurisdiction. D.C. actually allows nonprofit organizations to represent consumers in a consumer protection lawsuit. So that's what's happening here. So that's sort of one interesting element of this.
Ben Yelin: The other is, I think we've known for some time - probably since around April, and that's when everybody kind of got into this world of Zooming, when Zooming became a verb that was part of our regular vernacular - I think we've known that they misrepresented themselves as providing end-to-end encryption. In fact, they admitted it. They admitted the mistake. They sought to rectify that first by guaranteeing end-to-end encryption on their paid service, which did not go over well. So now they claim to be guaranteeing end-to-end encryption on their free service as well. But it sort of took lawyers a while to get all of their ducks in a row to come up with a plausible cause of action. And just based on reading the civil complaint here, it seems like the attorneys from this consumer watchdog group in D.C. have done their homework and this is a pretty compelling lawsuit.
Dave Bittner: Yeah, I mean, this is interesting because, is this just a case of perhaps the folks in the marketing and PR department being a little overzealous and, I don't know, overselling what they thought was going on? Maybe they had a miscommunication with the engineers. What do you think?
Ben Yelin: Yeah, I mean, there are a couple possibilities here. One is that, as you say, maybe the technologists were not in charge of the marketing, and end-to-end encryption is something that the marketing team knows that consumers want but they didn't have an understanding of exactly what that is. What the civil complaint alleges, and I think this is very compelling, is that end-to-end encryption has a very specific meaning. It is defined in federal standards. So if you advertise that, you know, knowledgeable consumers will know exactly what that means and they will justifiably rely on that characterization. So, you know, I think what Zoom's original excuse was, was, well, technically it's not what the industry would refer to as end-to-end encryption.
Dave Bittner: (Laughter).
Ben Yelin: We were using that as a term of art to mean that, you know, the transmission of our conversations was very secure.
Dave Bittner: Right. I can't help thinking about mobile providers and unlimited data.
Ben Yelin: Yeah, exactly, exactly. Unlimited data, and then the last 15 seconds of the commercial is unlimited doesn't actually mean unlimited.
Dave Bittner: Right (laughter).
Ben Yelin: There are a million different exceptions.
Dave Bittner: (Laughter) Right.
Ben Yelin: So yeah, I mean, I think that's very possible here. One thing that I think does not bode well for Zoom - they would probably prefer to plead ignorance here and say, this was a mistake on the part of our marketing team. We made a mischaracterization. It was not intentional. We weren't trying to defraud the consumers of our products. Reading through the complaint, it seems like Zoom should have been aware that this was a misrepresentation. They were informed by privacy advocates and cybersecurity professionals saying, you advertise yourself as providing end-to-end encryption and you do not provide that. And there's a lot of written documentation that Zoom, which is headquartered in California, was provided that information. So it's going to be difficult for them to plead ignorance.
Ben Yelin: If I had to guess, I think it's possible that Zoom would try to settle this case. If the lawsuit went through and the consumer protection advocacy group was successful, it would be about $1,500 in damages to every D.C. resident who made a non-business Zoom call in the relevant time period. So that's, you know, that's certainly going to add up. So it seems pretty clear here that they did make a misrepresentation, they did make a mistake. Perhaps it would make sense for their bottom line to try and settle the suit before they get sued in other courts across the country.
Dave Bittner: In a case like this where you have a product that is mostly free - you know, I would suspect that most of the people who are using Zoom are not paid customers - how do you come up with a number, a dollar number, for what people have suffered if they're not paying for the product?
Ben Yelin: So oftentimes, it's not a scientific calculation on the damages suffered. It's often based on some sort of custom and legal precedent - like, this is the general level of damages that we assign based on fraudulent practices related to cybersecurity. So it's usually not as specific as it is for other types of damages. It also depends on the severity of the allegations.
Ben Yelin: So what's happening in this case is the civil complaint is saying that the plaintiffs deserve what are called treble damages, which is basically triple the amount that would normally be given for a standard, we-didn't-really-know-what-we-were-doing fraud case. And what that seems to indicate is that Zoom was aware or should have been aware that they were fraudulently representing their product so that they should be - basically they should be punished above and beyond what compensatory damages would otherwise indicate, if that makes sense.
Dave Bittner: Yeah, no, it does. And of course, you know (laughter), maybe the lawyers have a boat payment coming up, right?
Ben Yelin: Yes, yeah, those yachts won't buy themselves, my friends.
Dave Bittner: (Laughter) I hate to be cynical, but it's just so easy to be.
Ben Yelin: You can always be cynical about lawyers. I will say, Zoom obviously has the capability to hire the best lawyers in the business considering how well they've done as opposed to the rest of us in this country during the pandemic.
Dave Bittner: Right (laughter).
Ben Yelin: But I also think that the consumer advocacy group in Washington, D.C., is going to have very good legal representation as well. Because this is a relatively stringent consumer protection law in the District of Columbia, there's a long history of D.C. cases related to fraudulent trade practices. So there are a lot of experienced attorneys there. So it's going to be really interesting to see how this case proceeds going forward. It was just filed yesterday. So I think it should be some time before we actually have any resolution here.
Dave Bittner: All right. Well, yeah, interesting indeed, one to keep an eye on. My story this week is from Zack Whittaker over at TechCrunch. And he writes, "A New Technique Can Detect Newer 4G Stingray Cellphone Snooping." Now, Ben, listeners of this show know that you and I love our stingrays (laughter).
Ben Yelin: Love our stingrays.
Dave Bittner: We just can't get enough of them.
Ben Yelin: We've metaphorically been stung by the stingrays.
Dave Bittner: That's right. And so just a quick review, a stingray is a cell tower simulator. And what it does is it allows law enforcement to basically activate this device. It pretends to be a cellphone tower, and then anyone who has a cellphone in the area where the stingray is, their cellphone is going to attempt to log on to this device, which is just part of how cellphones work. And from that, law enforcement can learn a lot about the cellphones that are in the area - their locations and whose cellphones they might be. And there's information that's exchanged as part of the handshaking routine that cellphones normally have with towers.
Dave Bittner: But these stingrays have been very secretive. The folks who make the stingrays, Harris Corporation, they have these things all wrapped up under nondisclosure agreements. So they're sort of cloaked in mystery. More and more information has come out about them. Actually, we've had some kind listeners write in who are former law enforcement who have given us some insights into how they work, so I appreciate everyone who's written in to try to help us understand more how they work.
Dave Bittner: But what this story is about is how some folks at the Electronic Frontier Foundation have come up with an open source project that they're calling the Crocodile Hunter, and it can detect cell site simulators. So if you're running this software - and you need some hardware and some software; it's not something that's easy to do, you do need to have some equipment to be able to do this - but if you're set up to do this, you can run this open source software and you can detect stingrays. And they have done that. They've gone into some areas and they've detected some stingrays. Now, before we go on, yeah, let's discuss this name Crocodile Hunter, Ben (laughter).
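Broadly, a detector like this watches the 4G base stations visible to a software-defined radio and flags ones that look out of place. A minimal sketch of that kind of heuristic - the field names, thresholds, and example cell IDs here are illustrative assumptions, not EFF's actual code - is to compare each observed cell against a database of known towers and flag unknown IDs or implausibly strong signals (a simulator is often much closer to the phone than a real tower would be):

```python
def flag_suspicious_cells(observed, known_cell_ids, max_plausible_dbm=-40):
    """Return IDs of observed cells that are absent from the known-tower
    database or broadcasting at a suspiciously strong signal level."""
    suspicious = []
    for cell in observed:
        unknown = cell["cell_id"] not in known_cell_ids
        too_loud = cell["signal_dbm"] > max_plausible_dbm
        if unknown or too_loud:
            suspicious.append(cell["cell_id"])
    return suspicious

observed = [
    {"cell_id": "310-410-0001", "signal_dbm": -85},  # known tower, normal strength
    {"cell_id": "310-410-9999", "signal_dbm": -30},  # unknown AND very loud
]
known = {"310-410-0001"}
print(flag_suspicious_cells(observed, known))  # ['310-410-9999']
```

A real detector looks at many more properties of the broadcast, but the core idea is the same: an honest tower should match public records of where towers are and how they behave.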
Ben Yelin: So yeah, we have to talk about this name. I was a big fan of the Crocodile Hunter. It's been 14 years now. So, you know, maybe a little bit too soon to be confronting this tragedy, but...
Dave Bittner: Yeah. The backstory is Steve Irwin was killed by a stingray. He was doing the things that he does, he was in all sorts of adventures with wild animals.
Ben Yelin: Yeah, dangerously confronting wild animals in their own habitats.
Dave Bittner: Right. And he died because a stingray's barb hit him in the heart. So this...
Ben Yelin: So why would you - yeah.
Dave Bittner: I just - it strikes me as being a little tasteless maybe (laughter).
Ben Yelin: Slightly tasteless, but also, I don't get it. Why would you name the software that is trying to detect and ultimately defeat stingray technology after a person who was defeated by those very stingrays? It just seems like, wouldn't you want - like, if you were the stingray trying to defeat the crocodile, then maybe you'd name your enemy the Crocodile Hunter, if that makes sense.
Dave Bittner: (Laughter) Right. Right. Right.
Ben Yelin: It just seems like this is the reverse of what it should be named here.
Dave Bittner: Yeah, well, nobody asked us (laughter).
Ben Yelin: Nobody asked us. It's a clever name, I just don't know that it really works. And it's certainly the first thing that caught my eye about this. We love our friends at the Electronic Frontier Foundation, but I feel like they're kind of setting themselves up for failure here.
Dave Bittner: Yeah (laughter). And also just having affection for Steve Irwin and how much entertainment he provided, it just - I don't know. It just feels a little funny, a little icky.
Ben Yelin: I'm still not over it, Dave. He was so good. He was so good.
Dave Bittner: He was. He was.
Ben Yelin: But - and I apologize to our listeners for that digression there - it is a very interesting open source technology, open source software. As you said, it's not like a lay person could get their hands on this and detect stingrays. You'd have to have some level of sophistication. You'd have to have a certain type of hardware and software to actually make this work. So it's not something that's widely available. But I think it could shed light on the prevalence of stingray devices, particularly within given geographical areas. As you've said, stingray policies are extremely secretive. They're protected by these nondisclosure agreements. Law enforcement obviously does not want to divulge their investigatory methods, so they're very cagey about whether they use stingray technology. It's very difficult to litigate these cases. There have been some successful lawsuits against the use of stingray technology, including one in the state of Maryland, but it's very difficult to litigate just because information is so scarce.
Ben Yelin: So what you could get from something like this is perhaps some sort of dashboard that shows the prevalence of stingray devices in a particular area, even if, you know, not many of us could actually have this Crocodile Hunter on our own personal devices. We might have information on how often it's being used, which police departments across the country are using it most prevalently, whether there are any sort of biases in the use of these stingrays, whether they're in neighborhoods that are full of minorities, et cetera, people who have faced historical discrimination. So I think that could be particularly illuminating and why I think this is a worthy project from the Electronic Frontier Foundation.
Dave Bittner: Yeah, maybe lift that veil of secrecy that stingrays have enjoyed so much.
Ben Yelin: Yeah, absolutely. And that's sort of what the Electronic Frontier Foundation does in general is to sort of unmask surveillance techniques that fly under the radar. And because surveillance is so secretive, you know, as opposed to other enforcement tactics, the technology moves so quickly that it just takes a while for the activist community to realize exactly what new gizmos law enforcement is using on a daily basis. So to just get access to that information I think could be particularly valuable to the general public.
Ben Yelin: And, you know, I think the way to get this into the public consciousness is to make use of this, develop some sort of report that says, look at the prevalence of this technology. Look how frequently it's being used in these particular neighborhoods. Is this a technology you want to entrust to your local law enforcement division, you know, especially given newfound skepticism towards certain law enforcement officials in this country? So I think that that could definitely be valuable.
Dave Bittner: I still scratch my head and wonder how the FCC ever allowed these things to operate. Just the whole notion of everything that they do just seems to run counter to what you want to do when you're trying to run a communications network in a nation. But, I guess, I don't know, that's why I'm not on the board of the FCC, right?
Ben Yelin: Well, we should get you on the board of the FCC, first of all.
Dave Bittner: (Laughter) Let's start a campaign. Yeah, that would go well.
Ben Yelin: Dave for FCC.
Dave Bittner: (Laughter) Right. I'm just the guy they want (laughter).
Ben Yelin: Agencies are very deferential to law enforcement, especially when you get high-profile cases that say, you know, we were after a serial rapist or serial murderer and the only way we were able to catch them was tricking their cellphone into revealing, you know, identifying information. So it's just - it's hard to go against the wishes of law enforcement. That's kind of why these surveillance technologies subsist. It's hard to argue against it when you have these high-profile cases where you catch the bad guys.
Dave Bittner: Yeah. All right, well, we'll have a link to the story. Again, that's Zack Whittaker over on TechCrunch - interesting stuff for sure. We would love to hear from you. If you have a question for us, we have a call-in number. It's 410-618-3720. That's 410-618-3720. You can also email us. It's firstname.lastname@example.org.
Dave Bittner: Ben, I recently had the pleasure of speaking with Rande Price and Chris Pedigo. They are from a trade association called Digital Content Next, and they work with digital publishers. And certainly top of mind for them are some of these challenges of trying to align the data that websites and apps collect with what consumers expect and, more and more, demand. Here's my conversation with Rande Price and Chris Pedigo.
Rande Price: The line of questioning here is really to understand consumers' expectations and how well the industry is aligning with those expectations. And from where we sit, Digital Content Next is the trade organization for premium publishers. And these publishers have, you know, a direct relationship, a one-to-one relationship with consumers. But again, we wanted to check in with consumers, across the industry as a whole, about what data is collected and what their expectations are for that data collection. We've done similar studies about Facebook and Google and consumers' understanding of exactly what they're collecting and how they're using it. And so we thought this was a good opportunity for us to check in once again.
Rande Price: And what we found here was that the majority of consumers understand that data is being collected overall, and their expectations about the primary website - so I go on a website and that website is collecting data to help improve the site overall and to, you know, protect me from fraud and malice - those top expectations are aligned with consumers. So consumers do expect - 59% - they do expect websites to collect data to help protect them against fraud and malicious activity, 55% to help improve the website and navigation overall, and also, you know, there's a high expectation in terms of collecting data for subscriptions. So I've logged in, I've signed on - there's a high correlation that, once I'm a subscriber, they will be collecting data and doing this kind of for my benefit.
Rande Price: Where we see the expectations don't align is once that data goes out of house. So once we have outside vendors using that data to either retarget for advertising or use that data or sell that data, that's where consumer expectations completely fall off. That's what consumers don't expect and I'd say are not informed about the transparency of when that data is being sold and used outside of that first-party relationship.
Dave Bittner: Can you describe to me sort of what the ecosystem is like out there in terms of the folks who are providing these ads that are a big part of the economic engine of the internet? I mean, are there different levels of the folks who are using best practices and, you know, following the rules and doing the right thing? And then does it branch off from there as some folks on different layers aren't respectful of people's wishes?
Chris Pedigo: The sort of digital ad ecosystem is really quite vast. There are obviously two big players in that ecosystem. Facebook and Google really sort of dominate the landscape. But there's a myriad of companies that perform a lot of different roles, from providing simple analytics to making sure the ad is served to measuring the ad. To Rande's point about consumer expectations, I think most consumers understand that advertising is a key funder of the web. What they don't expect is for companies like Google and Facebook to be collecting data about them as they go around the web. And that's where the CCPA comes in - it really tries to give consumers control over the sale of their data. And the way they define sale is data that is flowing from the first party - from the website - to a third party for sort of independent use.
Chris Pedigo: If you look at the digital ad ecosystem, you've got lots of companies that aren't necessarily taking that data to build a profile. There's not a sale of that data. So that, you know - sort of analytics company or measurement company is collecting data in line with what a consumer would expect. You know, they're providing a service in the app or on the website, but they're not taking that data and reusing it elsewhere. Where I think CCPA is helpful and giving consumers more control is that it is giving consumers control over how companies like Google and Facebook might take that data, build a profile about them and use it to serve advertising elsewhere on the web.
Dave Bittner: And what specifically is included in CCPA that helps us here?
Chris Pedigo: So there's a number of rights that you get under CCPA if you're a California resident. One is the sort of general right to know what data has been collected about you. You can request to have your data deleted. But I think the big one is, you can exercise your right to not have your data sold. So it's called the Do Not Sell right. And in that scenario, you click a button on the publisher's site. Or there may be, in the future, some sort of extension in the browser or some sort of browser control to let websites know your preference. But once you've activated that Do Not Sell right, your data can still be used for a number of first-party purposes - for fraud protection, for recommendations while you're on the site, for anything that the first party sort of uses to make that service better for you - that can still happen. But it's data flowing to companies like Google and Facebook that cannot happen anymore.
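The "browser control" Chris anticipates here has since taken shape as the Global Privacy Control proposal, which sends a `Sec-GPC: 1` request header that California's CCPA regulations treat as a valid opt-out-of-sale signal. A minimal sketch of how a site might check for such a preference - the cookie name is a made-up example of a site's own Do Not Sell button state, not any standard:

```python
def do_not_sell_requested(headers: dict) -> bool:
    """Return True if the request carries a 'do not sell' preference,
    either via the Global Privacy Control header (Sec-GPC: 1) or via a
    hypothetical site-specific opt-out cookie set by a Do Not Sell button."""
    if headers.get("Sec-GPC", "").strip() == "1":
        return True
    return "ccpa_opt_out=1" in headers.get("Cookie", "")

print(do_not_sell_requested({"Sec-GPC": "1"}))            # True
print(do_not_sell_requested({"Cookie": "session=abc"}))   # False
```

When the check returns true, the site would keep first-party uses (fraud protection, on-site recommendations) but suppress the data flows to third parties that count as a "sale" under the statute.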
Dave Bittner: So in terms of the consumers out there, I think, like a lot of folks out there, at the outset when online advertising began and there was this opportunity to customize ads based on your interests, I think, like a lot of people, I thought, well, that's a good idea. I'm on board with that. I would rather see ads for things that I'm interested in rather than ads that have nothing to do with anything I'm interested in. But it seems as though we've kind of overshot that. A lot of the places we visit, you find that the ads start following you around, and it's a little bit creepy. Are there best practices out there for organizations to walk that fine line between collecting appropriate data and not making the users uncomfortable?
Rande Price: Well, I think that's where we would really look to, you know, aligning with expectations, and with that comes transparency. So, you know, I think one of the questions and concerns consumers had when ads were following them around was, how are they doing that? So they weren't informed on kind of what was going on in the back end and what data was being collected and what data was being used and sold and used for targeting. So, you know, one of the things we discuss is that idea of transparency and informing consumers about what is actually happening on the back end - of course in consumer language. Now, you know, when we looked at the things that consumers were doing, for those that were aware that they could opt out - I mean, when we did this study, only half of the consumers even knew they had an option to opt out of data collection. So again, that's something that needs to be discussed and needs to be talked about in layman's terms with consumers, that there is this option. You know, as Chris said, there's an option in CCPA to say, do not sell my data. So those options need to be really clear for consumers.
Rande Price: And when we talked to consumers in this study, for those that are aware of opting out, about what action steps they were using, most of those weren't really helping them in protecting their data. The idea of deleting their cookies - that also doesn't, you know, protect them against online tracking. And turning on Do Not Track - I mean, it's well known that many companies ignore those do-not-track signals, so their data is still being collected. So a lot of the action steps that they currently are taking also aren't helping. So when consumers take these steps and then they still see those ads kind of following them around, it is a little disconcerting to them as to why this is all happening.
Dave Bittner: Yeah. I mean, it strikes me that perhaps there's a trust gap here where even the advertisers who are doing their best to do the right thing get lumped in with the folks who are out there just vacuuming up data and selling it to the highest bidder. I wonder if there's a - I don't know - a PR campaign that needs to be put out there or even, you know, some sort of seal of approval or something that if you're on a site where folks agree to do the right thing, that they could say that they're doing so.
Chris Pedigo: Yeah. It's a really good point about trust. I think we've been sort of following this for years now. The unbridled collection of data really does creep out consumers, right? And it's not just that you're seeing ads pop up all over the place, but there's a, you know - a lot of consumers that are wondering if it impacts the kinds of offers they're getting - credit card offers or, you know, whether it impacts their credit score, all that kind of thing. And what's interesting is consumers really don't have a lot of ways to meaningfully sort of impact that data collection to opt out. And so what we've seen them do, like Rande referred to, the No. 1 way that consumers protect their privacy today is by deleting cookies, which really doesn't do the trick, and it actually breaks things when you try to log back into a site. Your...
Dave Bittner: Right.
Chris Pedigo: ...Login cookie's gone. But what's interesting is that a lot of consumers have started downloading ad-blocking software, right? So they're just blocking ads altogether. They're sort of throwing the baby out with the bathwater. In the US, I think it's around - Rande, correct me if I'm wrong, but I think it's around 20 to 25% of consumers who have downloaded ad-blocking software. In Europe, it's closer to 50%, and in Asia it's 60% to 70%, you know, depending on country. And the consumers downloading this ad-blocking software are primarily younger consumers - so future customers, essentially, the next generation. That trust gap is starting to have meaningful consequences for not just the bad actors but for good actors, people that are using data in responsible ways.
Dave Bittner: Yeah. It's interesting to me because I am one of those folks who has an ad-blocker. And so quite often I'll go to a website, and a window will pop up and it says, hey, we see you're using an ad-blocker. You know, please disable it so we can provide this content. And I guess the disconnect there for me is that I'm actually OK looking at the ads. I would love to have an option where they could show me the ad, turn off the tracking.
Chris Pedigo: Yep.
Dave Bittner: Right. But we don't have that fine control. It's - we have a hammer. You know, it's either on or off.
Chris Pedigo: Right.
Dave Bittner: And I find that to be kind of frustrating because I would like to support these sites by viewing the ads.
Chris Pedigo: Yep. And again, that's where CCPA comes in: if you activate your Do Not Sell right, the publisher can still show you ads. It's just that data about you can't flow to third parties for, you know, building profiles and all that stuff that would creep you out.
Dave Bittner: Is the value of the data that they're selling when they collect it and then sell it to third parties - does it have so much value that it's hard to resist, that extra little bit of income is alluring?
Chris Pedigo: I think about it more in the sense that a single publisher doesn't really have leverage in the marketplace to say to all the third parties out there, particularly the big ones like Google and Facebook, we're not going to share this data about the consumer. Really, the ad marketplace today is so heavily dependent upon data that if a single publisher were to pull back and say no, we're not going to make that data available, their ads just wouldn't sell, or they certainly wouldn't sell at the rate they are today. And it's no secret that, you know, the media business isn't swimming with cash right now. So...
Dave Bittner: Right.
Chris Pedigo: ...A single publisher really has no leverage with a company like Google or Facebook. And you see that time and time again. So in some ways, the CCPA could be helpful in sort of resetting the marketplace and driving up the value of, you know, simple contextual advertising.
Dave Bittner: Where do you suppose we're headed with this? If you can imagine a state of equilibrium between, you know, the advertisers and the consumers, where do we end up in an ideal situation?
Chris Pedigo: Well, I think, you know, Rande's piece here is really important, talking about consumer expectations. Because I think even brands, advertising brands, want to be in line with what consumers expect. They want to be making consumers happy, right? Happy consumers are more likely to buy their products. In terms of the equilibrium, I think what you're seeing is a lot of countries and now states like California passing privacy laws to give consumers control over how their data is sold and how it's shared. Europe, you know, a couple of years ago passed the General Data Protection Regulation. GDPR is still sort of in its early days in terms of being enforced, but it tries to walk back the mass data sharing and give consumers control. CCPA is along those lines. It's trying to give consumers control so they can manage how their data is flowing around the web. What's interesting with CCPA is that it really is only supposed to apply to California residents, but you've got a number of companies that are applying it nationwide. Now, the coronavirus has impacted the agenda and timing of a lot of state legislatures around the country. But I expect, as they continue to meet, that more and more states will pass laws similar to CCPA. And so, essentially, we don't have a federal standard here in the U.S. for consumer privacy, but it does feel like the CCPA will become the de facto standard. And so that consumer control over their data, I think, will become the new sort of normal for advertisers.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: It was really interesting. You know, I think maybe I would have guessed this, but consumers do have a superficial understanding of where and how their data gets collected on the internet and on various websites. It's sort of the level you would expect your mom and dad to have.
Dave Bittner: Yeah.
Ben Yelin: If you go to a website and you give them your email address, you know, I think we all would have an expectation that they're collecting that information and using it. But what I think most of us do not intuit is that it's being used by third-party providers, or the extent to which that's happening. So there is this mismatch between consumer expectations and what's actually happening with consumer data as it traverses these third-party data collection companies. And, you know, I think that puts the onus, as your interviewee said, on policymakers. What the CCPA has tried to do is line up consumer expectations with what's actually happening. And the first step is just giving consumers two things - the opportunity to opt out and the opportunity to have notice of where their data is going. So I do think this is a policy problem. But the study is very worthwhile. I think it confirms what I would have suspected, but it's just interesting to see it backed up by data.
Dave Bittner: Yeah. I have to admit that this is a bit of a bugaboo for me that, you know, I'm sure this happens to you, too. You know, you'll go to a website or something. You want to check something out. Let's say it's a news website. And something pops up, and it says, hey, we see you're using an ad-blocker, right? If you want to see this content, please disable your ad-blocker. Whitelist us so we can...
Ben Yelin: Yep.
Dave Bittner: ...Show you this because this is how we make our money. And my response to that - and I wish there was a way I could respond in the moment to that - is OK, look, fine. It's not the ad I have a problem with...
Ben Yelin: Right.
Dave Bittner: ...Right? It's the 50 trackers that you have installed, that's what I have a problem with. So let's make a deal here. You show me the ad, but turn off the trackers. If you do that, I'm fine looking at the ad.
Ben Yelin: I'll see your stupid ads. Absolutely.
Dave Bittner: Right (laughter). Right. Right. But that's not an option. And it bugs me that they pretend like that's the issue.
Ben Yelin: Right.
Dave Bittner: That, oh, you're blocking ads. No, I'm blocking all the tracking. That's the part I have a problem with. You want to put an ad on your website? More power to you.
Ben Yelin: Right. We know you have to pay the bills. That's fine. We know...
Dave Bittner: Right.
Ben Yelin: ...That you have to have advertising, otherwise I'd be giving you my credit card because I'd still want to read your article. But yeah, that's right. We're not reliably informed about the fact that we are being tracked as consumers. And that's really, you know, sort of the bread and butter of consumer privacy legislation - if these companies are not going to reveal that in their standard whitelisting page on a website, then it's incumbent upon policymakers - and California has done this, the European Union has done this to a certain degree - to require companies to make that disclosure to us in order to do business within those jurisdictions. And as we've seen, once companies have to adopt these disclosure policies for California, that's going to become the nationwide standard because they don't want to change what their whitelisting notice says so that it's different in all 50 states across the country. So yeah, I thought it was a really interesting interview and sort of an eye-opening view of the mismatch between what actually happens in terms of our data getting collected and consumer expectations.
Dave Bittner: Yeah. Well, our thanks to Rande Price and Chris Pedigo from Digital Content Next for joining us. That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.