Caveat 3.16.23
Ep 164 | 3.16.23

A blueprint of the CCPA.

Transcript

Eric Cole: Small doctors' offices or small places that gather patient information - you're much better off just outsourcing the client management piece and not trying to do it yourself.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben talks about a case of an overbroad warrant for Ring doorbell data. I look at software liability and whether it may see increased scrutiny from the Biden administration. And later in the show, my conversation with Dr. Eric Cole from Theon Technology. We're discussing the impact from the rollout of CCPA, the California Consumer Privacy Act. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. We've got some good stories to share this week. Why don't you kick things off for us here? 

Ben Yelin: So my story comes from Politico. It's entitled "The Privacy Loophole in Your Doorbell." I was alerted to the story from a student - or former student of mine on LinkedIn, who tagged me in a post on this. So kudos to her and any other students who want to tag me on LinkedIn with some important articles. 

Dave Bittner: And kiss up to your professor (laughter). 

Ben Yelin: Exactly. Always willing and able to accept such invitations. 

Dave Bittner: Right on. 

Ben Yelin: So this article is about an incident with a man named Michael Larkin, a business owner in Hamilton, Ohio. He got a call from local law enforcement saying that they were looking for footage from one of his Ring cameras. He has 21 Ring cameras outside of his home, which I believe is also his business. 

Dave Bittner: Wait (laughter). That - so he's a Ring enthusiast. 

Ben Yelin: He is really a Ring enthusiast. 

Dave Bittner: I'm trying to think where I would put 21 Ring cameras in my house. 

Ben Yelin: My guess is it's just, like, one of those very large properties... 

Dave Bittner: Yeah. 

Ben Yelin: ...Where there are a lot of different doors and windows... 

Dave Bittner: OK. 

Ben Yelin: ...And maybe he has a warehouse or something. 

Dave Bittner: Fair enough. 

Ben Yelin: I was curious about that as well, but I'm just taking the story as it comes to me. 

Dave Bittner: Yeah. OK. 

Ben Yelin: So - but they said that he did have 21 Ring devices. 

Dave Bittner: Wow. 

Ben Yelin: So law enforcement were conducting a drug-related investigation on a neighbor. And so they asked Mr. Larkin for two hours of video to see if they could find any, quote, "suspicious activity." So between 5 and 7 p.m. on this date in October, give us that video. Larkin cooperated, gave them video, and it turns out on that video, there was some pretty suspicious footage. A guy was going up and down the street multiple times, 12 times in that two-hour time frame. I cannot think of a legitimate reason somebody would drive up and down the same block... 

Dave Bittner: Yeah. 

Ben Yelin: ...Twelve times. So that's pretty suspicious. Right? 

Dave Bittner: OK. 

Ben Yelin: It turns out that once law enforcement had that level of suspicion, they wanted to go access additional information. And to do so, they asked a judge for a really broad set of records from Larkin's Ring devices. The company Ring, which is obviously an Amazon product - so Amazon received a warrant from a local judge requiring them to submit footage from more than 20 cameras on Mr. Larkin's property, regardless of whether Mr. Larkin was willing to share data from those cameras himself. So this is certainly problematic. It's 20 cameras and nearly unlimited private home security footage. Some of the cameras are pointed inside Mr. Larkin's house. And though Amazon claims that they didn't hand over any data from those interior Ring cameras, it certainly suggests the possibility that that could happen under other warrants. 

Ben Yelin: So policymakers have taken notice of how broad this warrant is and how it's generally problematic in the age of mass digital surveillance. Senator Edward Markey of Massachusetts, a Democrat, talked about how he's been warning about this for years - how there just aren't proper security protocols for Amazon Ring devices, and how the law, in terms of the particularity requirement in a warrant, just hasn't caught up to the facts. One thing that an Electronic Frontier Foundation attorney points out here is that usually, when you're trying to obtain a warrant for something physical, like information that's contained in somebody's house, it's for a limited purpose. There is a limited, confined space. So the warrant gives you the right to search this room at this hour. 

Dave Bittner: Right. 

Ben Yelin: It's not, you know, weeks and months of footage from 20 cameras that can show all of the intimate activity in one person's house. 

Dave Bittner: I'm thinking like, hey, listen, we think something might be going on at your neighbor's house. We're getting a warrant to search yours. 

Ben Yelin: Right. 

Dave Bittner: Like, go on. 

Ben Yelin: Right. 

Dave Bittner: Yeah. 

Ben Yelin: So if you get a - like, a physical warrant to search somebody's house, you go in and search it once, and then that warrant has expired. 

Dave Bittner: OK. 

Ben Yelin: And that's why we have our particularity requirements. 

Dave Bittner: OK. 

Ben Yelin: Not to get too much into our legal history, but that was the purpose of the Fourth Amendment, was to avoid general, overbroad warrants, where if a king or his subjects had suspicion that somebody had contraband in their house, without any real level of specificity, they could just go in and raid the person's house and see what they found. And that kind of seems like the equivalent of what's happening here. Stored video footage is governed by data privacy laws. 

Dave Bittner: OK. 

Ben Yelin: If you've listened to our podcast before, you'll know that there aren't really any federal data privacy laws. Most of the laws exist at the state level. We have a few state-based data privacy laws - California, Virginia, a couple of others. 

Dave Bittner: What state did this take place in? 

Ben Yelin: This took place in Ohio... 

Dave Bittner: OK. 

Ben Yelin: ...Which does not have such a data privacy law at the time of this recording. So that doesn't offer much protection. The proposal in Congress that would have clarified that this information belongs to the Ring user failed at the end of the last session, and its prospects remain unclear. 

Ben Yelin: Then there's the question of Ring's or Amazon's responsibility in all of this. Ring has a symbiotic relationship with police. Police realize that Ring data is extremely valuable in a time of high crime. These privately owned cameras are generating really valuable surveillance. And so police really want data from these Ring devices. From Ring's perspective, from Amazon's perspective, they are donating or giving Ring devices to local law enforcement. 

Dave Bittner: Yeah. 

Ben Yelin: And local law enforcement are giving them to individuals to put up in their houses. 

Dave Bittner: In exchange for access. 

Ben Yelin: In exchange for access, exactly. And it seems like, at least according to this article, Amazon doesn't have a great history of playing hardball with these government requests. 

Dave Bittner: Unlike, say, Apple. 

Ben Yelin: Exactly. So they mention the Apple-FBI 2015 kerfuffle here. And I think it's relevant because there is a way, if you're willing to take the high-profile hit and go against law enforcement, that, as a company, you can fight a warrant saying, this crosses the line. This is too broad. 

Dave Bittner: Yeah. 

Ben Yelin: Amazon does not seem to have done that. They complied fully with the request at issue here. And though they say that they don't always give the government exactly what it requests in every circumstance, certainly this example would suggest otherwise if it's something that's replicated nationwide. So I think this highlights the need for, once again, a federal data privacy law to clarify that Ring footage belongs to the user - it's the user's property - and also some type of governance as it relates to obtaining warrants for this. I mean, that's the most disturbing part of this to me. Even if you had a robust data privacy law, a local judge signed off on this warrant, and there might need to be federal guidance for these judges about how broad such a warrant can be when we're talking about Amazon Ring devices. 

Dave Bittner: Yeah. A couple things come to mind here for me. So, first of all, the way the story is laid out, this neighbor was cooperating with law enforcement. So... 

Ben Yelin: Yes, he was. 

Dave Bittner: ...Why get the warrant? Was there a point where he said, listen, police officers, enough is enough? And I've given - you know, I don't have any more time for you? Or - you see where I'm getting here? Like, he's giving them what they asked for. He's being cooperative. 

Ben Yelin: I think what they anticipated is that once he found out how much data they were requesting, he would have told them to stuff it. 

Dave Bittner: OK. 

Ben Yelin: So... 

Dave Bittner: Yeah, sure. I mean, but - yeah. 

Ben Yelin: I'm not sure if they actually did ask him, like, hey, can you turn over hours' and days' worth of data from your 21 Ring devices. 

Dave Bittner: Right. And the stuff inside your house (laughter). 

Ben Yelin: Right. The ones that are pointing inside your bedroom... 

Dave Bittner: That's weird. 

Ben Yelin: ...That might, you know, record some personal activities. I don't think he was ever in a position, at least according to the story, to make that decision. I think what happened is law enforcement, anticipating that he would resist that broad of a request, went directly to Ring itself and said, we know that you have the data, you keep it somewhere in the cloud, and we're going to get a judge to compel you to turn over that data, regardless of Mr. Larkin's own willingness to hand it over himself. That's what's disturbing about this: it's Mr. Larkin's Ring device. He presumably purchased it. It's his property. So obviously, things that happen outside of the walls of his property are in public. He doesn't have as much of a reasonable expectation of privacy in that information. 

Dave Bittner: Right. 

Ben Yelin: But certainly, some of the cameras are pointed inward and might reflect things that happened within the curtilage of his property. 

Dave Bittner: Yeah. 

Ben Yelin: So the fact that, despite that seemingly significant privacy interest, the judge just rubber-stamped this warrant is really eye-opening to me. 

Dave Bittner: I think about, you know, my own situation. I live in a townhouse community. And so when I go out in the morning to get in my car and, you know, start my day, I can think off the top of my head, there are three Ring cameras that I walk by every day - right? - that are just doorbells pointing out at the - you know, the common area. And sometimes I wave to them as I go by (laughter), right? But that's kind of where we are. And I don't so much have a problem with that because, as you say, those public areas, I don't - you don't necessarily have an expectation of privacy. But when we're talking about the cameras inside your home or your office or the ones that are not pointing towards public spaces, I just - that seems to me to be a different kettle of fish here. 

Ben Yelin: Yeah. And I'm also thinking - I mean, townhomes are one thing. But if you're in, like, a rural property, where even the outside of your house doesn't really qualify as a public space - like, if you live on... 

Dave Bittner: Right. 

Ben Yelin: ...Several acres of property and the entrance to your house isn't public but you have a Ring camera, that introduces its own set of problems in terms of an expectation of privacy. 

Dave Bittner: Yeah. 

Ben Yelin: You know, the flip side of this is what law enforcement would say, that perhaps there would be a positive effect on people's behavior, knowing that there are so many Ring devices in an area that can record potentially criminal activity. 

Dave Bittner: Oh, boy (laughter). That's a - (laughter). 

Ben Yelin: It's a slippery slope. And there's also... 

Dave Bittner: Yeah. 

Ben Yelin: ...No proof of it. I mean... 

Dave Bittner: Yeah. 

Ben Yelin: If you were to draw a simple graph... 

Dave Bittner: Right. That's just... 

Ben Yelin: ...Of Ring devices on one axis and - or, I guess, two lines, Ring devices and violent crime rates over the past several years... 

Dave Bittner: Well... 

Ben Yelin: ...Both of them would be going up in tandem. 

Dave Bittner: So let me ask you this, because you used a turn of phrase earlier in your description here that I want to check you on. You said, in a time of high crime. Is this a time of high crime? Don't we have historically low actual crime levels right now? 

Ben Yelin: So yeah, I mean, it depends on the time frame that you're referencing. 

Dave Bittner: OK. 

Ben Yelin: Certain types of violent crime have gone up in the past few years. Although, there's some indications that maybe it's going back down. Knock on wood. 

Dave Bittner: Yeah. OK. 

Ben Yelin: Compared to, like, the late '80s and the early '90s, we're living in a low crime period. 

Dave Bittner: I see. 

Ben Yelin: But, I think, since most people don't really have conscious memories of the late '80s and early '90s - no offense. 

Dave Bittner: Hey, watch it. Watch it, Ben, watch it. What do you mean most people? 

(LAUGHTER) 

Dave Bittner: Those are some of the best times of my life, Ben (laughter). 

Ben Yelin: I know, I know. Well, I would say there's a certain generation that remembers those times. 

Dave Bittner: Right (laughter). 

Ben Yelin: But now there are a couple of generations who do not. 

Dave Bittner: Yeah. 

Ben Yelin: It certainly seems to the millennials and the Gen Zers out there that crime has increased in a way that's unrecognizable. So even if it's not rates as high as they were in the late '80s and '90s, it's still something that is eye-opening to people who haven't lived in this sort of environment before. 

Dave Bittner: So here's my other thought on this, is that, you know, obviously, the Ring doorbell, I would hazard to say, is the most popular of these devices... 

Ben Yelin: It absolutely is, yes. 

Dave Bittner: ...Because it's easy. It's got a good app. It just works. 

Ben Yelin: Looks cool, yeah. 

Dave Bittner: It's relatively inexpensive. Right, it has - it checks all those boxes. But it's not the only one out there. And so I wonder, first of all, if Ring had an option where all of your footage, when it was uploaded to their cloud, was encrypted and Ring did not have the key to that encryption, in a warrant case like this, would the judge have demanded that the owner of the Ring cameras provide the decryption key? 

Ben Yelin: I mean, now we're talking about the Apple-FBI scenario. 

Dave Bittner: Right. 

Ben Yelin: So one or two things could happen. Amazon could say, we'll try to do that. We'll try to break our own encryption. That's going to take a lot of manpower. 

Dave Bittner: Yeah. 

Ben Yelin: So they'd probably resist it, at least to some extent. Or they could say, we're just - we can't do that. That's impossible. That's beyond the realm of our capabilities. 

Dave Bittner: Right. 

Ben Yelin: I think, for something like - I guess they were investigating drug-related activities, which is serious. But it's also not the San Bernardino terrorist attacks of 2015. I mean, this doesn't... 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: ...Implicate ISIS or other international terrorists. So the stakes are somewhat lower. 

Dave Bittner: Yeah. 

Ben Yelin: And I think that Amazon, if it really wanted to hold itself up as, you know, a privacy conscious company, could do something like that. They could say that all of the data that goes into the cloud is encrypted. And you have the key as the user. We don't have the key. And if the government comes to us, we'll tell them, sorry. That data... 

Dave Bittner: Yeah, we cannot unlock it. 

Ben Yelin: Exactly. 
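[Editor's note: The model Dave and Ben are describing - footage encrypted on the user's device with a key the provider never holds - can be sketched in a few lines. This is a toy illustration, not production cryptography: the SHA-256 counter keystream stands in for a real authenticated cipher such as AES-GCM, and all names here are hypothetical. The point is key custody: the ciphertext the cloud stores is unreadable without the user-held key.]

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream from the key plus a counter
    # (a toy stand-in for a real cipher such as AES-GCM).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, footage: bytes) -> bytes:
    # XOR the footage with the keystream before upload; the cloud
    # provider only ever stores this ciphertext.
    return bytes(a ^ b for a, b in zip(footage, keystream(key, len(footage))))

decrypt = encrypt  # XOR stream ciphers are symmetric

# The user generates and keeps the key; it never leaves the device.
user_key = secrets.token_bytes(32)
clip = b"frame data from camera 7"
stored_in_cloud = encrypt(user_key, clip)

assert stored_in_cloud != clip                     # provider can't read it
assert decrypt(user_key, stored_in_cloud) == clip  # the key holder can
```

[A real implementation would use an authenticated cipher with per-clip nonces, but the custody property is the same: a warrant served on the provider yields only ciphertext.]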

Dave Bittner: And you have to go after - and so that's a whole different - I mean, that's a whole different discussion that we've had here many times about whether the government can compel you to turn over a password, basically. 

Ben Yelin: Yeah. Whole other discussion - I don't think it's relevant here because, as we mentioned, there is this positive relationship between Ring and Amazon and local law enforcement. And it's so mutually beneficial that I don't think that relationship is going to be broken by one high-profile overbroad warrant like the one we have here. 

Dave Bittner: Do you think anything could come out of this? 

Ben Yelin: (Sighing) I don't know if you could hear that audible sigh. 

Dave Bittner: (Laughter). 

Ben Yelin: I mean, if the million other anecdotes that we've talked about on this podcast and that other people have talked about didn't inspire Congress to finally get a federal data privacy law across the finish line, is this going to be the one that pushes it forward marginally? I don't think so. The only difference I could see it making is it could inspire states to enact their own privacy laws if they have not already. So if there was a major uproar in Hamilton, Ohio, based on this incident, then perhaps the Ohio state legislature and their governor could enact CCPA-type legislation to help remedy this problem. I don't anticipate that happening. 

Dave Bittner: Could there be guidance to judges - like, a higher court could say, hey, you know, knock it off? 

Ben Yelin: They absolutely could do that. 

Dave Bittner: Yeah. 

Ben Yelin: One problem with this particular case is that if Larkin were to challenge it, he doesn't have standing, because they weren't looking for evidence of illegal activity on Larkin's part. He wasn't criminally charged. 

Dave Bittner: Oh, boy. 

Ben Yelin: So there is a... 

Dave Bittner: Oh, boy. 

Ben Yelin: I know. I mean, there's an invasion of privacy. And he could claim standing on the grounds that his Fourth Amendment rights, his constitutional rights, were jeopardized as a result of this warrant. But... 

Dave Bittner: Yeah. 

Ben Yelin: ...It's difficult because it's hard to identify what the actual injury was here. 

Dave Bittner: Right. Just all that footage of him, you know, going to get a bagel in his boxer shorts. Right? 

Ben Yelin: Right. How bad can it be? It wasn't released to the public. Like, is that something that's justiciable? 

Dave Bittner: OK. 

Ben Yelin: That's a whole other question. I mean, I would argue if I was his attorney that it is justiciable because any violation of constitutional rights, even a de minimis violation, should confer standing. But that's not always how courts see the issue. And they might think, well, let's wait for a case where somebody is actually arrested based on this data and let's have that arrested person challenge it. Then they would obviously have standing. 

Dave Bittner: I see. 

Ben Yelin: The proper remedy for this from a higher court would be to disallow this evidence in a future criminal proceeding. I mean, it would be an exclusionary rule thing, which wouldn't have curtailed the original warrant. So once that data is collected - I mean, if you consider the harm being that private data was collected, nothing a higher court could do at this point would ameliorate that harm. 

Dave Bittner: Right. Right. Wow. All right. Well, that is an interesting one for sure. 

Ben Yelin: Sure is. 

Dave Bittner: We'll have a link to that story in the show notes, of course. Moving on to my story this week - this comes from the folks over at the Lawfare blog. This is an article written by Jim Dempsey, and it's titled "Cybersecurity's Third Rail: Software Liability." I found this to be a fascinating article. 

Dave Bittner: Of course, Ben, as we know, the Biden administration recently released their national cybersecurity strategy. And this article points out that in that strategy, they seem to be taking aim at software liability. And up till now, liability for the folks who make software has been pretty much, I guess - I don't know - hand-waved away by the EULA. Is that a fair way to say it? 

Ben Yelin: Right, right. Contracted away, saying, you can't hold us liable for however defective our software is. 

Dave Bittner: Right. In agreeing to use this software, you agree that no matter how bad it gets, it's not our fault. So the Biden administration seems to be saying that that's not going to fly anymore. They are coming after three elements, according to this article. 

Dave Bittner: They're - they want to prevent manufacturers and service providers from disclaiming liability by contract. That's what we just talked about. They want to establish a standard of care. That sounds good to me. And they want to provide a safe harbor to shield from liability the companies that do take reasonable measures to secure the products and services. So I think they're trying - it's not letting the perfect be the enemy of the good, I suppose. Is that a fair way to say it? 

Ben Yelin: Right. You don't have to have perfect cybersecurity practices. We've seen a lot of these safe-harbor provisions show up in various pieces of state legislation - Utah, Connecticut, Ohio. I was actually just working on a proposed bill here in Maryland dealing with that exact subject. And the idea of those safe-harbor provisions is that it would be an incentive for companies to take reasonable measures. 

Ben Yelin: The reasonable measures standard is a really hard standard to apply because how do you determine what reasonable is? And I think that's going to get into how the Biden administration tries to turn these broad policy goals into practice. It's really hard to figure out what a standard of care is. Oftentimes we talk about things like NIST guidelines. 

Dave Bittner: Right. 

Ben Yelin: But the NIST guidelines themselves say this is not a checklist. These are - this is a general guidance document. 

Dave Bittner: And yet... 

Ben Yelin: And yet. Yeah, exactly. 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: And then, you know, the other documents that might provide more precise guidance for things like HIPAA or other federal statutes only apply in limited circumstances. 

Dave Bittner: I see. 

Ben Yelin: So that - it's just going to be really difficult for courts to establish what that standard of care is. And what's going to end up happening is you're going to have battle - the battle of the attorneys for each individual case, where the attorneys figure out which expert witnesses they can bring in and try to argue that their company either was or was not complying with these best practices. So that's one really interesting element of it. I think the first element of the strategy is the most achievable and also, I think, normatively, the best... 

Dave Bittner: OK. 

Ben Yelin: ...Which would be preventing manufacturers and service providers from disclaiming liability by contract. As they note in this Lawfare piece, that was a very common practice in the pre-digital age... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Talking 100 years ago. 

Dave Bittner: Yeah. 

Ben Yelin: Auto companies did that, and all different types of manufacturers did that. They said, by purchasing this product and signing this piece of paper, you will not hold us liable. And courts, going back a century, have applied strict liability to the manufacture of products: you are liable whether or not you were negligent, and whether or not you tried to have the customer sign away liability by contract, based on the fact that you manufactured that product. There's kind of an unequal bargaining relationship between that manufacturer and the user. You're putting a product on the market that's potentially going to be used by millions of people, and most end users are not going to have the resources to bring an individualized lawsuit challenging the specific manufacturing defects that exist in a product. So I think that's a really admirable goal that the Biden administration is laying out as part of this policy. 

Dave Bittner: And what are the pathways that the Biden administration has to pursue this? 

Ben Yelin: So they are most likely going to use their authority under the FTC to compel administrative action, civil fines or potentially criminal sanctions in some circumstances, but using that enforcement authority for these consumer practices. Beyond that, there really isn't that much authority that the federal government has. I mean, it's just the agencies that have preexisting enforcement power that are going to be able to issue sanctions against companies that end up violating this policy. 

Dave Bittner: I guess I'm wondering, do they - will he - will they have to go through Congress? 

Ben Yelin: I think on an individualized scale, they will not have to go through Congress. They're going to have to go through the federal court system... 

Dave Bittner: OK. 

Ben Yelin: ...Because companies will challenge whatever FTC action - or any other federal agency, for that matter. If they try to obtain a civil penalty, there's going to be a court case. But for individualized penalties or sanctions, that's not something that would be in the purview of Congress. So that's kind of to the advantage of the administration. They can do something without the cooperation of what's a pretty hostile Congress... 

Dave Bittner: Right. 

Ben Yelin: ...On this - at this point. 

Dave Bittner: Is this inevitable? I mean, this article points out that other verticals have gone through this, and this is where it generally lands. Is this part of the maturation of the software industry? 

Ben Yelin: Yeah. I mean, I think it is. I think it's bringing the software industry in line with other manufacturers. And the manufacturers themselves will say, we can't have this cloud of liability over us 'cause it's going to stifle innovation. It's going to hurt product development. We're going to be more reluctant to bring products to market because we're going to be so fearful of lawsuits. And that's certainly a legitimate point. But that's what car manufacturers said a hundred years ago. 

Dave Bittner: Right, right. 

Ben Yelin: And we have a body of law in this country that holds companies accountable for the products that they manufacture, at least holds them responsible if they act in a way that doesn't conform to the standard of reasonable care. So I do think in some ways this is inevitable because it's a body of law that already exists in other sectors. 

Dave Bittner: I see. All right. Well, we will have a link to that article. Again, it's from Lawfare. It's written by Jim Dempsey. Jim Dempsey happens to be a lecturer at UC Berkeley Law School, and he's a senior policy adviser at the Stanford Cyber Policy Center.

Dave Bittner: Ben, I recently had the pleasure of speaking with Dr. Eric Cole. He's from an organization called Theon Technology, and we talked about the impact of the rollout of CCPA, the California Consumer Privacy Act, how that's going to affect businesses and government entities and everybody in between. Here's my conversation with Dr. Eric Cole. 

Eric Cole: CCPA is basically new regulation from the state of California. It stands for California Consumer Privacy Act. And because the United States doesn't currently have a federal privacy law, CCPA is really viewed, in most people's minds, as the de facto national standard on how to protect consumer information and what's required of companies to implement proper security measures. 

Dave Bittner: And where do we stand in terms of the timeline and rolling it out for it to actually become active and have teeth? 

Eric Cole: To me, it started rolling out in January of 2023. So it's in place, and we already have attorneys contacting us from both sides about lawsuits involving it. And it will continue to be rolled out throughout the year. But it's one of those where we tell organizations, if you haven't taken a look at it, you definitely need to, because there's a lot of control that is given to the consumer. In the past, if somebody went to your website and provided information, you essentially owned the information. Now it flips: the consumer still owns it, and you have to be very careful - and often get permission - in how you use it or how you distribute it. 

Dave Bittner: And so how does that affect folks who are doing business here in the U.S. and, indeed, globally? 

Eric Cole: Because it's a California law, it technically only impacts California residents. So if you're a very regional business that only has customers in Washington, D.C., or Boston or New York, and you can guarantee that you don't have anybody who's a resident of the state of California, then you wouldn't be impacted by it. But I think we all realize guaranteeing that is a pretty hard thing to do. Take even a gas station: if somebody from California, with a California driver's license, rents a car in Boston and you take their information, or they purchase something from you, you're still covered under CCPA. So while it technically only covers one state in the United States, it is really a national and international standard, because if you're dealing with anybody from California - which is a pretty high probability for large companies - then you're going to be impacted by this law. 

Dave Bittner: Well, you mentioned that you're - you've been reached out to by folks who are looking at lawsuits. What sort of things are they pursuing? 

Eric Cole: In a lot of cases, it's information that's collected about consumers and then used or sold in other areas. So, for example, what is common practice is, depending on the settings of your web browser, when you visit a website, it can actually pull personal information. So a lot of websites can pull my name, my phone number, my address. I don't know if you've ever had this happen, but you're visiting a website, and you're looking up a new automobile, or you're looking to make a big purchase, and 10 or 15 minutes later, you either get an email or a phone call from a local dealership. And you're sitting there going, how in the world did they do that? I didn't provide any information. But people don't realize your browser is providing a lot of that information, and that has always been considered sort of customary and OK. Well, CCPA says it's not. So if you're pulling that information from the browser or the settings or elsewhere, and the visitor happens to be a resident of the state of California, then you can actually be getting yourself into a lot of trouble and potentially face fines or other penalties associated with it. 

Eric Cole: The other thing where this happens a lot is where there are plug-ins in websites. So for example, you might go and you want to see a doctor. And you put your zip code into a hospital website so they can recommend a local doctor or a local specialist. That's often using a Google plug-in. But the hospital is passing your information to Google so it can run the search. Once again, customary practice, and how most websites were developed. CCPA says that's not OK. That's not acceptable. So what we're finding is a lot of these customary practices of how websites and our technology have been developed - everyone didn't like it, but we sort of accepted it - CCPA is now coming in and saying, that's not OK. And not only that, but it's against the law, and you can have lawsuits brought against you for doing it. So it's really disrupting a lot of companies, technologies and websites. And a lot of large organizations are getting themselves in trouble without even realizing it. 
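[Editor's note: The data flow Dr. Cole describes - a site embedding a third-party search widget and forwarding the visitor's input along with identifying data - can be sketched as follows. The endpoint and field names are hypothetical; the point is which fields leave the first-party site, and that a server-side proxy can strip them.]

```python
# Hypothetical embedded-widget flow: the hospital site builds a request
# to a third-party search provider on the visitor's behalf.
THIRD_PARTY_SEARCH = "https://search.provider.example/v1/query"  # hypothetical endpoint

def build_widget_request(zip_code: str, visitor: dict) -> tuple[str, dict]:
    # Customary pattern: the visitor's identifiers ride along with the query,
    # so the third party learns who searched for what.
    params = {
        "q": f"specialists near {zip_code}",
        "client_ip": visitor["ip"],          # identifying
        "cookie_id": visitor["cookie_id"],   # identifying
    }
    return THIRD_PARTY_SEARCH, params

def build_proxied_request(zip_code: str, visitor: dict) -> tuple[str, dict]:
    # Privacy-conscious alternative: the site proxies the search itself and
    # strips identifying fields before anything reaches the third party.
    return THIRD_PARTY_SEARCH, {"q": f"specialists near {zip_code}"}

visitor = {"ip": "203.0.113.7", "cookie_id": "abc123"}
_, leaky = build_widget_request("21201", visitor)
_, clean = build_proxied_request("21201", visitor)

assert "cookie_id" in leaky and "cookie_id" not in clean  # same query, less leakage
```

[Either way the search still works; the design choice is whether the third party receives the visitor's identifiers along with it.]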

Dave Bittner: Well, you know, CCPA has been coming for a while now. So I think it's fair to say, you know, it wasn't a surprise. How have organizations in general done when it comes to being ready for this? 

Eric Cole: So you're right. It wasn't a surprise. We saw this train coming down the track for a long time. But I think the issue was having accurate information about where the data actually flows. So when companies that we work with went in and said, OK, CCPA is coming, they would go in and look at their data store, make sure the database is protected, make sure that's secure. But what they didn't do is go back and look at how their websites were built and developed. They didn't go back and look at a lot of the front-end code on public websites and things like that, because nobody thought that was really dealing with personal information. Everyone went back to the databases - protected the databases, controlled the databases, secured the databases - and assumed that because the code had been running for three, five or seven years, or was considered customary practice, it was OK. Everybody ignored that piece of it. And now a lot of the websites, how the websites engage with third parties, and a lot of the Google add-ons that are in a lot of websites are causing companies a lot of issues, because they never considered or thought that that would negatively impact their ability to meet CCPA. 
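The front-end audit Dr. Cole says companies skipped can be approximated with a simple scan for resources loaded from domains other than your own. A minimal sketch, assuming a regex pass over static HTML is enough for a first look (real audits would also need to inspect dynamically injected scripts); all domains below are hypothetical.

```python
# Minimal sketch of a front-end audit: flag scripts and images a page
# loads from third-party hosts. Hypothetical domains and HTML.
import re
from urllib.parse import urlparse

def third_party_hosts(html: str, first_party: str) -> set:
    """Return external hosts referenced in src attributes."""
    hosts = set()
    for url in re.findall(r'src=["\'](https?://[^"\']+)["\']', html):
        host = urlparse(url).hostname or ""
        if host and not host.endswith(first_party):
            hosts.add(host)  # resource loaded from someone else's domain
    return hosts

page = '''
<script src="https://www.hospital.example/app.js"></script>
<script src="https://analytics.vendor.example/tag.js"></script>
<img src="https://pixel.tracker.example/p.gif">
'''
# Flags the analytics tag and tracking pixel, not the first-party script.
print(third_party_hosts(page, "hospital.example"))
```

Each flagged host is a place where visitor data may be leaving the first party, which is exactly the inventory the database-only reviews missed.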

Dave Bittner: Well, how about for consumers themselves? I mean, are they going to see positive outcomes from this? Will there be any noticeable changes when they're doing the things they do online? 

Eric Cole: To me, I think the big, noticeable change will be an awareness that whoever you give your data to really has to protect it. And what I like about CCPA is it really switched the mindset. Before CCPA, most companies' mindset was, once you go to our site and you register or enter information, it's our data. The data belongs to us. We can use it. We can sell it. We can do what we want with it. But now what CCPA says is, no, it's still the consumer's data. It still belongs to them. And you still have to get their permission to use it. So immediately, I don't think a consumer is going to notice, all of a sudden, wow, this is so different than it was before. But in terms of their level of protection - and the number of phishing attacks and other attacks where their information is exposed far more widely than they realize - over time, I think they'll see a positive impact in how they're targeted and who they're targeted by. 
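The mindset shift Dr. Cole describes - the consumer still owns the data, so every use needs recorded permission - can be sketched as a consent check that gates each purpose. This is a hypothetical minimal model, not any specific CCPA-compliance library.

```python
# Sketch of consent-gated data use: the consumer's record carries the
# purposes they agreed to, and every use checks that set first.
# All names and purpose labels are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ConsumerRecord:
    name: str
    email: str
    consents: set = field(default_factory=set)  # e.g. {"service", "sale"}

def use_data(record: ConsumerRecord, purpose: str) -> bool:
    """Only proceed if the consumer granted permission for this purpose."""
    return purpose in record.consents

alice = ConsumerRecord("Alice", "alice@example.com", {"service"})
print(use_data(alice, "service"))  # True: she agreed to service use
print(use_data(alice, "sale"))     # False: no permission to sell her data
```

Under the pre-CCPA mindset the `sale` check simply wouldn't exist - possession of the record implied permission for every purpose.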

Dave Bittner: Do you see there being any unintended consequences here? I mean, I'm thinking, you know, an example just at a nuisance level. When GDPR came into effect, you know, suddenly we started getting all these pop-ups about cookies and cookies, you know (laughter)? And you click through. And you get on with your life. How does CCPA shape up when it comes to those sorts of things? 

Eric Cole: That's a great point. You're going to see a lot of that, where now, if you have data or information or records that you haven't used in six months or nine months or a year, you're going to get a lot more of, hey, do you still want us to keep the information? Do you still want us to record it? Are you sure you still want to be in our database? And then if you don't reply - because most people ignore those types of messages - you're going to be deleted. And I've seen this happen. If you're in a situation where you go to a doctor once a year or once every 18 months, now, because you ignored the request - you didn't say it was OK to keep your data, which they were required to ask - you go back to the doctor, and you have to fill out all those forms again because... 

Dave Bittner: Oh, no (laughter). 

Eric Cole: They had to delete your record. Yeah. So I think you're going to see that area where people don't realize that these doctors' offices and banks and others are in a hard place, where if you ignore them like we used to, you're going to create a lot more work for yourself. 
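The re-consent flow described above - records untouched for a retention window trigger a "do you still want us to keep this?" request, and silence leads to deletion - can be sketched as a small state decision. The retention and reply thresholds here are hypothetical assumptions, not figures from CCPA itself.

```python
# Sketch of the retention/re-consent flow described above.
# Thresholds are hypothetical; CCPA does not mandate these exact windows.
from datetime import date, timedelta

RETENTION = timedelta(days=365)    # e.g. ask after a year of inactivity
REPLY_WINDOW = timedelta(days=30)  # how long the consumer has to reply

def next_action(last_seen: date, asked_on, today: date) -> str:
    """Decide what to do with a record: keep it, ask for re-consent,
    wait for a reply, or delete it after silence."""
    if today - last_seen < RETENTION:
        return "keep"
    if asked_on is None:
        return "send_reconsent_request"
    if today - asked_on > REPLY_WINDOW:
        return "delete"            # no reply: the record must be removed
    return "await_reply"

today = date(2023, 3, 16)
print(next_action(date(2023, 1, 1), None, today))              # keep
print(next_action(date(2021, 1, 1), None, today))              # ask again
print(next_action(date(2021, 1, 1), date(2023, 1, 1), today))  # delete
```

The awkward doctor's-office scenario is the `delete` branch: an 18-month gap between visits blows past both windows, so by the next appointment the record is gone and the forms start over.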

Dave Bittner: Oh, that's interesting. What about for those small and medium-sized business owners, you know? How can they check to make sure that they're where they need to be when it comes to this? 

Eric Cole: I mean, really, the best bet for small- to medium-sized businesses is to outsource the processing. So if you're going to be storing consumer information or credit cards or anything personal or PII or anything that fits into that category - don't run your own database. Don't run your own systems. Don't run your own servers. There are so many software-as-a-service offerings out there that can run this in the cloud and manage it. And then they handle all of the regulation on your behalf. That's just a much easier, cheaper, quicker way of doing it. So, like, small doctors' offices or small places that gather patient information, you're much better off just outsourcing the client management piece and not trying to do it yourself. 

Dave Bittner: Where do you suppose we're headed here? I think a lot of folks are frustrated that we don't have a federal law on the horizon. Certainly, there have been other states that have put their own laws in place. What do you see coming here? 

Eric Cole: And so to me, there's sort of one of three scenarios. The most likely scenario, I think, is that Congress is going to sit back and sort of see over the next couple of months how CCPA rolls out and how companies handle it. And to me, the most likely outcome is, with some minor changes, they're probably going to just go in and make that the national standard. A lot of security professionals and folks who have spoken on the Hill or in closed-door meetings seem to feel like that's the tone Congress is taking - saying, let's use this as a test, see how it works. And then most likely, this is sort of going to be the framework for what's going to be our national policy. So that, to me, is the most likely. 

Eric Cole: The second is to continue what we've been doing, where they just sit back and basically let California be the national standard. So they don't officially make a national standard. They just sort of let CCPA do its thing, and most companies in the United States have to follow it anyway. Once again, that's not the best option. And then the third one, which I hope doesn't happen, is they rewrite it, create a whole new standard, and that creates chaos. Because remember, most large U.S. companies had to deal with GDPR because they had international clients. Then we had CCPA. And it's one of these situations where we're now just spending so much money on regulation that we're not actually implementing proper security. And it hurts us. To me, all these different regulations that you push on these companies - what they do is take a security budget that's supposed to be focused on protecting your data from compromise, and now most of it goes to meeting regulatory requirements. And now a lot of the fundamental vulnerabilities that should be fixed aren't. So in some cases, it actually creates a bigger problem for us if we're not careful. 

Dave Bittner: Well, in that first scenario that you describe, you know, the feds using California as sort of the building block - in your estimation, is that a solid foundation on which to build? 

Eric Cole: Where we are now, yes. I think the better option would have been, a year or two ago, to just adopt GDPR. It was close enough. Most companies were following it. It would have been the easier, quicker option. But because we didn't, and we allowed CCPA to happen, and now here we are, I think our next best option is to do that. Because if we then go in and, like I said, do yet a third regulation that's different from GDPR and different from CCPA, we're just creating way too much unnecessary confusion. GDPR and CCPA are good enough. They cover the foundational items. Let's just adopt those and move on. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: Yeah. I mean, it's been really interesting following the rollout of CCPA because I think it'll give indications as to how these privacy laws will work across the country as they're introduced at the state level and potentially at the federal level. And there have been growing pains. I mean, they had to put an initiative in front of the voters to correct some of the errors in the original drafting of CCPA. And, you know, there have been other stumbling blocks in getting the law up and running. But I think those of us who are interested in digital privacy are rooting for its success because it's the pioneer law in this field. 

Dave Bittner: Yeah, this stuff rolls out slowly (laughter). 

Ben Yelin: Yeah. 

Dave Bittner: All right. Our thanks to Dr. Eric Cole again. He's from Theon Technology. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.