Caveat 7.13.23
Ep 179 | 7.13.23

Laws, lawsuits, and privacy.

Transcript

Dan Frechtling: Seventy-five percent of the world's population is under one form or another of comprehensive privacy laws and that number was only 20% of the world in 2020.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my cohost, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: Today, Ben looks at proposed legislation coming out of Massachusetts that could ban the sale of cell site location data. I've got the story of comedian and author Sarah Silverman suing OpenAI over copyright infringement. And later in the show, Ben's conversation with Dan Frechtling, CEO of Boltive, a company that provides publishers and ad exchanges tools to monitor and audit their programmatic ads. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.

All right, Ben. Lots going on this week. Why don't you kick things off for us. What do you got?

Ben Yelin: So this is a big story I first saw in the Wall Street Journal from Byron Tau, who writes a lot on these surveillance issues. And it's about a proposed law in the state of Massachusetts that would institute a widespread ban on companies buying and selling cell phone location data.

Dave Bittner: Hm.

Ben Yelin: And this would be a first of its kind law across the country. Several states have enacted versions of data privacy laws that put limits on the sale of cell site location information and other personal data. The Supreme Court weighed in in the Carpenter case and said that the government needs a warrant in most circumstances to obtain historical cell site location information. But this is the first time we've seen a state consider a law that's this potentially broad.

Dave Bittner: Hm.

Ben Yelin: So what would this law do? It's called the Location Shield Act. It would sharply curtail the practice of collecting and selling location data drawn from mobile phones in Massachusetts. So for the government, there would be a warrant requirement for law enforcement access to this data, which would kind of conform the state law to the Carpenter case. And I think more importantly and more significantly, it would ban data brokers from buying and selling location data about state residents without court authorization.

Dave Bittner: Hm.

Ben Yelin: There are a limited number of exceptions that I think are designed to benefit consumers. So they can- these companies can retain your location data for things like weather applications, Amazon deliveries, that sort of thing.

Dave Bittner: Okay.

Ben Yelin: But it's a pretty broad ban, or at least the proposal is, on these data brokers who buy and sell location information against the professed wishes of the users, the people who have these devices. So there are a lot of progressive organizations in the state of Massachusetts who are fighting for this law, including the Massachusetts chapter of the ACLU.

Dave Bittner: Hm.

Ben Yelin: And the sponsor of this bill is a Democrat representing a Boston suburb. She's the majority leader in the state senate, so somebody with a little bit of influence and power. And she believes that this piece of legislation has a very good chance of being enacted before the end of this year.

Dave Bittner: Hm.

Ben Yelin: There is opposition, of course, 'cause there's a lot of money involved in this.

Dave Bittner: Right.

Ben Yelin: There's pretty well-organized opposition and I don't want to paint this opposition in a corner that this is all about the money. I think there are legitimate reasons why this law could potentially be too broad and limit too much activity that ends up hurting instead of helping the consumer.

Dave Bittner: Hm.

Ben Yelin: So there's this group called the State Privacy and Security Coalition. They testified before a joint committee of the Massachusetts legislature in opposition to this bill. Their representative, Andrew Kingman, said that while they support heightened protections for particular types of personal information, including location data, they think there should be a better alternative to a wholesale ban on this data.

Dave Bittner: Yeah.

Ben Yelin: That the definition is very broad and it doesn't give the users any type of choice. There's no opt-out provision. It's simply prohibited.

Dave Bittner: Right.

Ben Yelin: So if a user benefits from companies having their location data, and we've talked about many of the ways that users benefit from that feature, then they're out of luck because it just simply would be outlawed in the state of Massachusetts. I also think that the industry is gonna have a problem here complying with this Massachusetts law given that the 49 other states wouldn't have such bans in place.

Dave Bittner: Mm-hmm.

Ben Yelin: So they'd have to alter their practices, their collection practices, their EULAs, etc., to account for this new Massachusetts law. That happens sometimes. We've seen it in other contexts. We know that Illinois has a unique law on collecting biometric information and companies have had to alter their trade practices and their EULAs to comply with that, as well. But certainly you could understand why they would be opposed to that.

Dave Bittner: Yeah.

Ben Yelin: And then one last kind of interesting wrinkle here is the abortion rights element of this.

Dave Bittner: Hm.

Ben Yelin: So a lot of abortion rights activists in the wake of the Dobbs decision in 2022 overturning Roe v. Wade.

Dave Bittner: Right.

Ben Yelin: Believe that governments in states that have banned abortion could track people through their cell phones as they travel out of state seeking an abortion. And when you combine that with other similar types of threats to people's civil liberties that they mention, like digital stalking and national security, I think in the mind of this legislature, the risk is just too large that the collection of this information is gonna be abused.

Dave Bittner: Hm.

Ben Yelin: It seems kind of out of place for Massachusetts to be raising that concern here. I certainly understand the abortion angle to this, but abortion is legal in Massachusetts and this law won't really have extraterritorial application. So it's not like the Massachusetts legislature can control the type of information that's available to be sold in Tennessee or whichever state has passed one of these anti-abortion laws.

Dave Bittner: Right, right.

Ben Yelin: So this is just definitely something we're gonna have to follow. It seems like this has a very good chance of passing but it would be a very significant law because it would be the first of its type across the country.

Dave Bittner: Yeah. It's interesting, the opposition here that you mention. They seem to want to give consumers the ability to opt out of sale rather than having a broad prohibition. To me, that seems like a nonstarter. I mean, if they really believe it, then let consumers opt in. If they believe that the benefits are clear and that consumers really want this, right? That consumers really want to have their locations tracked because of the benefits of it, then let them opt in.

Ben Yelin: Right.

Dave Bittner: Don't make them opt out. But, of course, they're not gonna do that because there's just too much money at play. There's information --

Ben Yelin: There's just too much money in it, yeah.

Dave Bittner: Yeah, too valuable.

Ben Yelin: I mean, you can think of this as kind of a sliding scale. This Massachusetts ban would be the most extreme version of restrictions on location data.

Dave Bittner: Yeah.

Ben Yelin: Then at the next level, you could have an opt in, where the default is companies aren't allowed to buy and sell your data but you could opt in to it. Then there would be the opt out, which is the approach that Connecticut has taken, as well as a number of other states, states controlled by Republicans in the legislature and by Democrats.

Dave Bittner: Mm-hmm.

Ben Yelin: And then there's the do-nothing approach, which is still the law of the land in several states, where we have really no restrictions.

Dave Bittner: It's the approach that stands the test of time.

Ben Yelin: It sure does, and certainly has stood the test of time federally.

Dave Bittner: Right.

Ben Yelin: So even since the Carpenter decision in 2018, we now have this whole new frontier of companies buying and selling data. And in many cases, it's not just local police departments who are purchasing this data for law enforcement purposes but it's federal agencies. So the FBI, the Department of Homeland Security, ICE, and they're doing this obviously without any judicial authorization. So, you know, it means that if these agencies have the money, they can use this data for surveillance, which certainly cuts against the spirit of the Fourth Amendment.

Dave Bittner: Yeah.

Ben Yelin: And is something that Congress probably should take up but it hasn't or they've kind of gone at it in fits and starts and haven't been able to actually enact anything into law. So I think in light of that inaction, you see states getting out in front of this issue and it looks like Massachusetts is gonna be the trailblazer on this.

Dave Bittner: That's interesting. I wonder, if you're a company like Verizon, a big wireless provider, this is data that you naturally collect through the course of business, even if you're not selling it. You're collecting it just as part of providing your service. As someone travels and their phone pings tower to tower, that information gets logged. If you're a Verizon of the world, how do you deal with state borders, the states that border Massachusetts? Suddenly I'm pinging this tower, and then I move a hundred yards in this direction and now I'm pinging the tower across the border, because radio signals don't obey borders.

Ben Yelin: Don't see boundaries, yeah, I mean.

Dave Bittner: Right, right. So, you know, it's just an interesting puzzle to me when those lines, from a practical point of view, could be fuzzy. I wonder how they're gonna approach that.

Ben Yelin: Yeah, I mean, it would be really hard to enforce. Massachusetts borders a number of different states.

Dave Bittner: Yeah.

Ben Yelin: It actually borders, if my geography is correct, New Hampshire, Vermont, Connecticut, Rhode Island, and New York.

Dave Bittner: Right.

Ben Yelin: So that's a lot of opportunities to run into this issue where somebody is physically in Massachusetts but they are pinging a tower in one of these other states.

Dave Bittner: Mm-hmm.

Ben Yelin: So how do you adjudicate these disputes? Is it based on where the individual is physically located or on which tower they ping? And does that create some type of bizarre situation where somebody outside of Massachusetts happens to gain additional rights over their location? Just because they're pinging a cell phone tower in Massachusetts?

Dave Bittner: Right, right.

Ben Yelin: Even though their state hasn't passed any type of law or restriction?

Dave Bittner: Yeah.

Ben Yelin: Yeah, I mean, that could certainly present some complications. That's the problem with doing this at the state level, generally.

Dave Bittner: Hm. Mm-hmm.

Ben Yelin: I mean, it would certainly be better in terms of compliance to have some uniformity, and you could only really do that through federal legislation. Unless every state decided to jump off the bridge at the same time, but that rarely happens.

Dave Bittner: Right, right.

Ben Yelin: States think very differently from one another. State legislatures have all different types of legislative sessions, some that span months, some that span years. So even practically, it's just hard to do that. But this is what happens when you have federal inaction. States try to dip their feet in the water and it causes all of these potential territorial disputes that are really hard to adjudicate.
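To make the adjudication question above concrete, here is a minimal sketch in Python of how the two possible rules, device location versus tower location, could treat the same ping. Everything in it is hypothetical: the bounding box is a crude stand-in for the Massachusetts border, and the Location Shield Act itself does not prescribe any such mechanism.

from dataclasses import dataclass

@dataclass
class LocationEvent:
    device_lat: float   # where the phone's own GPS places the person
    device_lon: float
    tower_lat: float    # where the cell tower that logged the ping sits
    tower_lon: float

# Hypothetical bounding box very roughly covering Massachusetts (illustrative,
# not real boundary data): (lat_min, lat_max, lon_min, lon_max).
MA_BOUNDS = (41.2, 42.9, -73.5, -69.9)

def in_massachusetts(lat, lon):
    lat_min, lat_max, lon_min, lon_max = MA_BOUNDS
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

def covered_by_hypothetical_ban(event, rule):
    # rule == "device": the record is covered if the person is physically in MA.
    # rule == "tower": the record is covered if the ping landed on a MA tower.
    if rule == "device":
        return in_massachusetts(event.device_lat, event.device_lon)
    if rule == "tower":
        return in_massachusetts(event.tower_lat, event.tower_lon)
    raise ValueError(f"unknown rule: {rule}")

# Someone standing just over the New Hampshire line whose phone pings a
# Massachusetts tower gets opposite answers under the two rules.
event = LocationEvent(device_lat=42.95, device_lon=-71.45,
                      tower_lat=42.60, tower_lon=-71.30)
print(covered_by_hypothetical_ban(event, "device"))  # False
print(covered_by_hypothetical_ban(event, "tower"))   # True

Under these assumed coordinates, the same record is inside or outside the ban depending on which rule controls, which is exactly the kind of dispute a state-by-state approach invites.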

Dave Bittner: Yeah. No, this is interesting, and I suppose, from my point of view anyway, it's good that Massachusetts has got their eye on this. I don't see much of a downside. In my mind, this is long overdue. To me, it's established that the ability to track someone's location the way that they can with cell phone data is an overreach by the companies, and, in my opinion, goes too far when it comes to violating our privacy. So to me, this is a good thing that Massachusetts is taking a look at this and, hopefully, putting some limits on it, and I --

Ben Yelin: Pretty profitable, though. I mean, if you're one of these companies that obtains this data, this has become a revenue source, and if at least one state is gonna cut off that source of revenue, they're gonna fight it.

Dave Bittner: Yeah, poor babies.

Ben Yelin: They're probably not gonna fight it saying, "We're Verizon. We're gonna lose money." They're probably gonna fight it by forming the Coalition to Protect Freedom or the State Privacy and Security Coalition, if you will.

Dave Bittner: Right.

Ben Yelin: To fight against something like this.

Dave Bittner: Somehow they'll find a way to argue that this is keeping us from protecting the children, right?

Ben Yelin: Yes. As Helen Lovejoy said, "Always think of the children. Won't somebody think of the children?"

Dave Bittner: Right, right. All right, well, you know, it's a story of perhaps some hope, right?

Ben Yelin: It is, yeah. I mean, I'm curious to see if the industry through these trade associations has the clout in a very liberal state like Massachusetts to take this down.

Dave Bittner: Yeah.

Ben Yelin: It's gonna be a really intense lobbying effort. And I think it would be a good sign for the privacy and civil liberties community if Massachusetts can overcome that type of opposition and actually enact this into law. Then you'd have to go through the court system. I mean, I could see a concern under the Dormant Commerce Clause if the Supreme Court or federal courts determine that this in some way interferes with interstate commerce.

Dave Bittner: Hm.

Ben Yelin: Or if they view it as kind of a protectionist action on the part of Massachusetts, then you could potentially run into those types of issues. I don't necessarily think that's gonna be a problem.

Dave Bittner: Right.

Ben Yelin: But I'm just trying to foresee how this might become an issue in federal courts if it gets there.

Dave Bittner: Yeah.

Ben Yelin: And I'm sure these trade associations would throw everything at the wall to see what sticks in order to prevent a type of law like this from actually going into place.

Dave Bittner: Yeah. It crossed my mind that, you know, I wonder if we could see a rush of people suddenly spinning up VPNs to make it look like their location was in Massachusetts. But I guess it really wouldn't apply 'cause we're talking about wireless mobile devices hitting towers. So it's different than, you know, your home computer's IP address and that sort of thing.

Ben Yelin: Right, and where you can use ExpressVPN or whatever to disguise where you are, yeah.

Dave Bittner: Right, exactly, yeah. All right, well, we will keep an eye on that one. That's interesting, for sure. My story this week comes from The Verge. This is an article by Wes Davis and it's titled, Sarah Silverman is Suing OpenAI and Meta for Copyright Infringement. Of course, Sarah Silverman is the well-known, popular comedian and author. And --

Ben Yelin: She's very funny. I hope that if this story gains legs that she'd be willing to come on our podcast and do a routine, talk about --

Dave Bittner: There you go.

Ben Yelin: -- copyright law. But also, you know, regale us with some of her funny comedy. So it's an open invitation, Sarah.

Dave Bittner: There you go. There you go. Yeah, I have enjoyed her work, as well. And what's at the center of this case is that she published a book titled The Bedwetter. What she's claiming is that OpenAI's systems, ChatGPT, have sort of vacuumed up the contents of her book from what they're describing as shadow libraries. Those are, as they describe in this article, websites like Bibliotik, Library Genesis, Z-Library, and others, which are basically, you know, black market libraries. They get access to books, they make PDFs of them, and you can have access to these books for free without paying for them, without your local library having purchased a copy, or without going through whatever the proper channels are to make the book available. So she's claiming that OpenAI gained access to the contents of her book through one of these services, one of these online systems, and that is a copyright violation. And the fact that a service like ChatGPT, when asked, can summarize her book, that that is a copyright infringement. I have thoughts on this and I have to say I question some of this, but --

Ben Yelin: Me, too, yeah.

Dave Bittner: -- I'm wondering where you land on this, Ben.

Ben Yelin: So I can really see both sides of it. I mean, this is her intellectual property.

Dave Bittner: Right.

Ben Yelin: And they are taking the actual text of her book, putting it into this generative language model, and it's spitting out a summary. I mean, at least in theory, somebody could read the summary instead of reading the book itself and that might cause Sarah Silverman to lose money. That's kind of the essence of a copyright violation.

Dave Bittner: Okay.

Ben Yelin: On the other hand, I'm failing to see why this doesn't qualify under fair use, or at least I wouldn't say I'm failing to see it. I would say I certainly question whether this would be a type of fair use. Because when we search something on Google, for example, it displays pictures, information, that might be copyright protected.

Dave Bittner: Right.

Ben Yelin: But it's- it carries an exception under our copyright law because it's not presenting the information in and of itself. It's just kind of a conduit for us to access that information.

Dave Bittner: Yeah.

Ben Yelin: I'm not sure how analogous the situation is to a Google search. But I also know I could probably get a decent summary of The Bedwetter by doing a Google search and going on to one of these sites myself that she mentions in this lawsuit.

Dave Bittner: Right.

Ben Yelin: So I'm not sure why the fact that it goes into this ChatGPT AI model makes that much of a difference. I guess because it's automated, you can replicate it a million times. It doesn't require a user to actually step through those hoops, but --

Dave Bittner: Yeah. It seems to me like the core of what they're going after here is that OpenAI did not purchase a copy of the book.

Ben Yelin: Right.

Dave Bittner: That's the crux of this, I think.

Ben Yelin: Right, right.

Dave Bittner: So what if OpenAI went to their local library and borrowed a copy of the book and had ChatGPT ingest it from that?

Ben Yelin: Right, or, you know, there are a bunch of different ways we can get summaries of books without buying them or reading them.

Dave Bittner: Right.

Ben Yelin: You know, Amazon has the rights to these books, but they frequently have summaries. Or your friend buying a book, reading it, and then summarizing it for you. I think one of the reasons this has become a case is because ChatGPT is good enough at what it does that the summaries are quite accurate. And even though they get some details wrong, it is in many ways just a replication of the copyrighted text.

Dave Bittner: Yeah.

Ben Yelin: So I understand why that would be a concern. It's too accurate and too detailed of a summary that it might dissuade people from purchasing the book. Because ChatGPT has stolen this information without paying for it and has translated it into something that's so easily digestible for the consumers.

Dave Bittner: Yeah.

Ben Yelin: But I do think that we have to look at the broader implications of this lawsuit. It's funny that this is about Sarah Silverman but there are a lot of other plaintiffs here. And there have been many other cases filed alleging copyright violations from ChatGPT and its competitors.

Dave Bittner: Yeah.

Ben Yelin: And we could get into a situation where a lot of these services are hamstrung by copyright claims, where they are cut off from access to certain corners of the Internet if they have not purchased access to that information, including things like copyrighted works, books, music, etc. And that would certainly limit the usefulness of ChatGPT, and that would have consequences for the industry. So I think ChatGPT and similar companies are really gonna go to bat here to try and litigate this and argue that this is fair use. That this is not just blatant copying without attribution or without payment for the work cited here.

Dave Bittner: Yeah.

Ben Yelin: And I think this will really be an interesting test case on the future of ChatGPT.

Dave Bittner: Let me let you in on a little secret here, Ben, that is probably not so much of a secret. But as you know, as the host of the CyberWire podcast, one of the things that I do is interview a lot of book authors because people write books on cybersecurity. And they want to have that book reviewed, and promoted, and do an author interview on the CyberWire because we have a big audience and it helps promote the book, right?

Ben Yelin: You're like the Jimmy Fallon of the cybersecurity industry. Get them on the couch, interview them about their new book, yeah.

Dave Bittner: Well, yes. And it's all good. You know, I like to think it's a virtuous circle. We help spread the word, and people get educated, and all that kind of stuff. But the simple fact of the matter is that, as much as I would like to, I do not always have time to read the entire book, right? We get so many books sent to us. Books are long. I have a lot of work to do, a lot of research, so I don't always have time to read the book. So sometimes what I will do instead of reading the book is read reviews of the book, right?

Ben Yelin: Mm-hmm.

Dave Bittner: So if I can find, let's say, half a dozen reviews of the book, that'll give me a summary of the book. It'll give me what people thought was interesting about the book, what flaws people found in the book. And that can be good enough research for me to do a thoughtful interview with the author, right? So what that leads me to is, what if the chatbot is trained on reviews of the book and not the book itself?

Ben Yelin: Yeah, then, I mean, a review is certainly fair use, so then you're not illegally accessing information and putting it into your model. I think that would make a big difference in terms of this particular lawsuit.

Dave Bittner: Yeah.

Ben Yelin: I think the circumstances of this lawsuit are relatively explicit, in that it's narrowed to this very particular circumstance where, you know, you have these steps that outline how the data sets have these kind of illicit origins.

Dave Bittner: Right.

Ben Yelin: And that's not gonna be the case with something like summarizing a review just because a review is I guess not really a reproduction of the copyrighted work. It's just that, a summary.

Dave Bittner: Yeah.

Ben Yelin: And so I think that would make a huge difference.

Dave Bittner: Yeah, interesting. I think, big picture here, my thought is that this highlights how much catching up our copyright law needs to do. I personally think that copyright terms are too long. I don't understand why we get such long terms. I mean, the reason is Mickey Mouse.

Ben Yelin: Right.

Dave Bittner: But, for example, why is copyright so much longer than say a patent?

Ben Yelin: Right.

Dave Bittner: And I think it's just gotten too long, beyond its original use, you know. Copyright was originally about fair use, not just the protection of the authors, and --

Ben Yelin: Right. I mean, you have to balance the protection of the financial interest of the author with having, and I don't know how to put this in a way that doesn't sound kind of corny, a marketplace of information where people can read creative works, discuss them, and have them easily accessible to the public.

Dave Bittner: Yeah.

Ben Yelin: So you have to balance those competing interests and I think this lawsuit and ones like it are gonna test the boundaries of those conflicting values in court. And it's just something that's gonna go on for a while because I think it is an existential threat for both the authors and for ChatGPT.

Dave Bittner: Mm-hmm.

Ben Yelin: The stakes are very, very high here. If it turns out that ChatGPT becomes good enough to reproduce the book, that it's just as useful and entertaining for a random person to read the ChatGPT summary as it is to read the book itself, then that's a major threat to people like Sarah Silverman.

Dave Bittner: Right.

Ben Yelin: Conversely, if the court rules in the other direction, that's a major threat to OpenAI and ChatGPT because it's gonna cut off their access to a lot of useful inputs that would go into making their service more comprehensive.

Dave Bittner: Right.

Ben Yelin: So the stakes here are really, really high. I'm very curious to see what this federal court does and I suspect that this is gonna be a long litigation. They're already trying to establish a class for the purpose of a class action lawsuit, so it's certainly gonna be a bunch of different plaintiffs. And now that more of these companies are popping up, it's gonna be a lot of different defendants, so this is gonna be very complex litigation.

Dave Bittner: Yeah. All right, that's another one we'll have to keep an eye on, see how it plays out, but yeah, it's interesting. It's a fascinating one, for sure. All right, we will have a link to all of these stories in our show notes, and of course, we would love to hear from you. If there's something you'd like us to discuss on the show, you can e-mail us. It's caveat@n2k.com.

Ben, you recently had a really interesting conversation with Dan Frechtling, who is CEO of Boltive. They're a company that provides publishers and ad exchanges with tools to monitor and audit their programmatic ads. That's what they do, but the conversation with Dan is really a lot more far-reaching when it comes to some policy things. Interesting stuff. Here's Ben's conversation with Dan Frechtling.

Dan Frechtling: Well, there's quite a bit going on, Ben, both at the state level and in the private sector with regard to the lawsuits that we're seeing, frankly, that are coming up. And 2023 has been a banner year. It's five years since the passage, or really the effective date, of GDPR and the effective date of CCPA in California. And I think we're gonna continue to see states innovating with their legislatures. I don't think we're gonna see a national law. It's not likely to happen this year, and if it's not happening this year, it's definitely not happening in an election year next year. So it's going to continue to be up to the states, and they are not shirking the task. We have five US state laws taking effect this year, five comprehensive laws, more if you consider sectoral laws, like those related to healthcare. And we've also seen five, soon to be six, more laws passed this year for future years, and the legislative sessions aren't even over yet. So we're seeing the states really lead, we're seeing lawsuits come in areas particularly around healthcare and video, and there's new legal theory, some of it decades old, being applied to the privacy problem. So it's an immense amount of activity. I don't think it's an exaggeration to say there has never been a regulatory change in history that has happened this fast, simultaneously, really around the world, as what we're seeing with privacy now. Seventy-five percent of the world's population is under one form or another of comprehensive privacy laws, and that number was only 20% of the world in 2020.

Ben Yelin: Wow, I guess I didn't realize how drastic that change has been. Yeah, I mean, I think 2022 was the closest we were gonna get to some type of federal data privacy regulation, and for a variety of parochial reasons, it didn't happen at the end of the last legislative session. And I think you're right that the action is gonna happen at the state level. Are there any recently enacted state statutes, or ones under consideration, that have caught your eye? I saw that you had posted something on a Florida child protection law. Is there anything else besides that that's caught your eye recently?

Dan Frechtling: Yeah, there has been. So the Florida Digital Bill of Rights is very interesting. What's happening in Florida and Texas, because they're red states, the fact that they're taking privacy so seriously, is a pretty big deal, and I think a pretty good sign in terms of it being a bipartisan issue. What we're seeing in Connecticut is quite interesting because they've already passed a comprehensive law. It's a very good law, but they're continuing to innovate on it, and that's a theme we may want to come back to. What they've done is they have a good comprehensive privacy law, but this year, they have passed some amendments that address health and children's data simultaneously, and that's a bit unusual. Sometimes you have, like in Washington State, My Health My Data, very much a health-focused law, or the Age-Appropriate Design Code Act in California, very much a children's data-oriented law. Well, Connecticut is embracing both, protecting women's health and protecting children's data all in one bill, expanding on the CTDPA, as it's called, which is Connecticut's law. So that to me is the most interesting recent law. I did just mention a second ago My Health My Data in Washington State, which sounds like a health bill, a health law, but it really goes beyond that because of the way that health is defined. It includes nutrition. It includes fitness, wellbeing, other things that would traditionally be beyond a HIPAA definition of health. And it's not just residents of Washington State that are protected, it's consumers. And that means that cloud services housed in Washington State like AWS and Azure, right --

Ben Yelin: I was just gonna mention, yeah, I mean, there's one very prominent company headquartered in Washington State, yeah. How do you feel about that integrated approach that you've seen with the Connecticut law and the Washington law? Where you're bringing some of these disparate subject matters like health data privacy and children's privacy under the umbrella of a broader data privacy law? What do you think about that approach?

Dan Frechtling: I think it's a good sign because it reminds us that the states are much more agile when it comes to keeping up with current events than the federal level is. So when the Dobbs decision came down last year, which overturned Roe v. Wade and opened up a whole series of new issues around women's health, the states took action on that. And contrast that with the federal government, where even the federal privacy laws that we've been able to pass, you know, the Electronic Communications Privacy Act, 1986, that's nearly 40 years old, yet it hasn't been updated for e-mail, right, computers, mobile phones, the Internet. So given the slow, glacial pace of federal laws compared to the way that states can catch up as we see generative AI create new privacy issues, I mean, good luck with the federal government protecting citizens. My money would definitely be on this continued state innovation.

Ben Yelin: Yeah, I know we're starting to address it here in Maryland. I've talked to legislators who are looking at other states and seeing what's being done about this generative AI problem. It's six months old and we don't have really any framework to deal with it. And we don't want the consequences to spiral out of control. In terms of this state-federal distinction, obviously the critique is, especially from the private sector, you don't have a level of uniformity. Compliance becomes difficult. You know, there's certain issues around preemption. How do you see those issues and how would you respond to a critic of kind of the state-by-state approach?

Dan Frechtling: Well, you can have both. You can have both because if you have a federal law that sets a floor and doesn't preempt states, then you can have the best of both worlds, at least from a consumer protection standpoint. Meaning you're covering all 50 states with at least a baseline protection, and then you're allowing individual states to go further to the degree that they need to. It's not the best outcome for the chambers of commerce and businesses because it can be quite hard to match all those state laws. But it's hard to match all the sales tax laws. It's hard to match all the insurance laws that vary by state. Breach notification laws went down the same path that we're seeing right now: they started in California, and over 15 years, every state passed its own breach notification law. And we seem to have made it through there. So I think the critics are right that it does make things more complicated. The patchwork makes things more complicated for businesses, but it's in the interest of consumers, who are the ones who have been on the receiving end of these invasions. They have been suffering from the lack of protection. The best outcome is gonna be to have a federal law, when we get to it, and the states in the meantime. And by the way, I was born and raised in Maryland, and with Virginia getting the head start on its privacy act, I really do hope Maryland catches up.

Ben Yelin: Yeah, we've had a hard time. We've had a version of it proposed in I think three or four consecutive legislative sessions, and for a variety of reasons, it hasn't made it across the finish line, but we're gonna keep trying. Will you talk a little bit about the area of children's privacy online? I know there's been some administrative action on it at the federal level and then some state statutes. But just your general lay of the land, what the major issues are and how states are dealing with that.

Dan Frechtling: Sure, yeah. So children's data protection has been a neglected corner of law until recently. COPPA, from 2000, is the only federal law addressing targeted advertising at children, and it requires verifiable consent, but it only covers children under 13 years of age. And it has lagged innovations in online technology, so in 2013, it was extended to cookie trackers and geolocation data. But this is, you know, back to how the states are leading: CCPA in California increased the age of consent to 16 years, so between age 13 and 16, there's some authorization required, and it added different forms of what's considered personal information. Now the California Age-Appropriate Design Code Act, which I mentioned a moment ago, mirrors what's going on in the UK to require businesses to estimate with some level of certainty how old a visitor is, how old a child visiting their site is, to go a little bit further. And that protects children up to age 18. So we're kind of seeing the bar literally raised from 13, to 16, to 18, and that's, I think, necessary. At the federal level, we've got COPPA 2.0, CTOPPA it's also called, which is trying to do some of those same things, and KOSA, the Kids Online Safety Act, trying to do some of those things but not having as much luck for the reasons we just mentioned. But where I will credit the federal government is after Biden's State of the Union address in February, where he exhorted Congress to get behind a ban on online ads targeting children. Congress still hasn't acted, but the FTC certainly has, with, you know, roughly $500 million in fines to Epic Games, and then recently Edmodo, an educational company, Amazon with its Alexa recording and storing children's voices, and Microsoft's Xbox Live. So we're seeing the FTC step in, as Lina Khan said it would. Until there's a federal law, the FTC will act within what it views as its rulemaking authority to protect kids, as Biden asked for. So that's been an interesting thing. We're just about halfway through the year and we've seen, you know, at least four major FTC actions around children's data.

Ben Yelin: Which is pretty unprecedented. I mean, we have not seen this in previous years. This seems to be a Lina Khan innovation during her tenure at the FTC, which I think is certainly promising. What do you see as sort of the next frontier in data privacy? So we talked about generative chatbot AI. What are some other issues that are not quite on the public's radar yet but that are on your radar in terms of future privacy regulation?

Dan Frechtling: You know, I don't- oh, gosh. With so much going on right now, Ben, I don't even look that much further ahead. It's a medium-by-medium story, of course, right? So the other thing the FTC has done, and California has done in its enforcement sweeps, is to target apps, right, apps which behave very similarly to websites when you look at the SDKs within apps operating the same way as trackers do. So that's gonna unfold more this year. If we ever get to the metaverse, and as we record this, it's a couple of days after Apple announced its new product, Vision Pro, or whatever they're calling it, there will be privacy issues, 'cause every new medium has its own privacy issues, as we learned with the Xbox case that was just settled this year. As we look out to what's happening maybe a year from now, my interest is in what private enterprise does. Because in Q3 of next year, we have Google finally saying for sure, without a doubt, they're going to deprecate third-party cookies in the Chrome browser. So that whole tracking mechanism that has been around, it will be the 30th anniversary of Lou Montulli inventing the cookie, will see the sort of demise of the third-party cookie, which Lou never intended. But that to me is the question: what is the private sector going to do? We saw it with Apple and App Tracking Transparency. Now we're seeing it with Chrome and the deprecation of cookies. How is that gonna change the landscape? 'Cause that's gonna have a pretty profound impact on e-commerce and online advertising.

Ben Yelin: Yeah, and I think frequently the private sector fills in those gaps when they see it within their interest for the bottom line and that's one of the ways we get privacy protections. I mean, Apple has tried to sell themselves as the company that will protect your data. That's given them a competitive advantage. Beyond what we've talked about so far, is there anything else that we didn't address that you think is worth mentioning before we wrap up?

Dan Frechtling: Yeah, I think what's going on with video privacy is quite interesting. This is something that's driven by a law called the VPPA, the Video Privacy Protection Act. And as you may know, the origins there were with Robert Bork's Supreme Court nomination, where he was defeated, not approved by the Senate. One of the things that was used to attack his character was getting a list of his rentals, back when there were videotapes.

Ben Yelin: I remember, yep.

Dan Frechtling: Yeah, right, right? And so that law --

Ben Yelin: Video rental history, yep.

Dan Frechtling: Right, and so it was not because of any sympathy for Robert Bork. It was because everyone in Congress said, "If they can do it to him, they can do it to me." So they quickly passed the Video Privacy Protection Act so that you could no longer go and request somebody's video rental history. But this class of companies called videotape service providers, which we pretty much assumed had gone away with the closure of the last Blockbuster, now has been applied to streaming video. And the presence of even a single pixel sharing data on a video page has been alleged, or asserted, to be in the scope of this videotape law, because pixels reveal things about your viewing, my viewing, when we visit that page and that pixel is firing. So this is a subject for the courts right now to decide, 'cause some of the argument is, well, this data isn't really that personal if you're just sharing what the page title was that somebody visited. But if there's more data that's shared beyond that, like the events that somebody did, if they filled out a form, if they provided survey information, that's a harder argument to make. And in some ways it almost doesn't matter, Ben, 'cause it's created enough churn and dust that certain law firms are sending warning letters to companies, which is essentially extortion, saying, "If you don't pay us $10,000, we're going to take this to court." And it's not uncommon for there to be a pixel on a page hosting video even if you're not a streaming company, so many companies are having to cave because they don't have the legal resources to respond. So I think that's interesting. It does show a little bit of overreach, and it's part of the reason why we see some of these laws prohibit a private right of action to enforce them. But I think it has an influence, nonetheless, because it gets into all kinds of allegations about wiretapping violations, and interception, and stuff that sounds really, really nasty, when the companies who are involved were completely unaware of any wrongdoing on their part.
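As a rough illustration of the mechanism Dan describes, here is a minimal Python sketch of the kind of request a tracking pixel on a video page might fire. The endpoint and parameter names are made up for this example; real pixels vary by vendor, but the basic pattern, a tiny GET request carrying page and event metadata to a third party, is the same.

from urllib.parse import urlencode

TRACKER = "https://tracker.example.com/p.gif"  # hypothetical third-party endpoint

def pixel_request(page_title, visitor_id, events=None):
    # The pixel is just a tiny image; the data rides along in the query string.
    params = {
        "pt": page_title,    # title of the page hosting the video
        "uid": visitor_id,   # identifier tied to the visitor, often a cookie value
    }
    if events:
        params.update(events)  # extra signals, e.g. form fills or survey answers
    return TRACKER + "?" + urlencode(params)

# Sharing only the page title...
print(pixel_request("Watch: Season 2 Finale", "abc123"))

# ...versus sharing the viewer's actions as well, which is where the VPPA
# arguments get sharper.
print(pixel_request("Watch: Season 2 Finale", "abc123",
                    {"form_submitted": "newsletter", "survey": "completed"}))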

Ben Yelin: Right, and not to get too much into the legalese here, but then you get into issues with the third-party doctrine, where you've inadvertently but voluntarily shared data with a third party, and if they receive a subpoena, they will comply with it, no warrant required. Which is gonna be unfortunate for the consumer if law enforcement gets involved.

Dan Frechtling: That's true, yeah. Very good point on that, yeah.

Dave Bittner: All right, interesting conversation, Ben. I really enjoyed that one. I think the big take home seems to be that, for the moment, the action is really at the state level when it comes to this stuff.

Ben Yelin: Yeah, I mean, we really struck out at the federal level at the end of last year.

Dave Bittner: Mm-hmm.

Ben Yelin: There was hope for a federal data privacy law, and for a bunch of complicated reasons, including the parochial interests of the former Speaker of the House, who represents the state of California, it didn't happen. California had this strong CCPA data privacy law, and she, along with other members of the California delegation, was worried about this federal law being weaker and yet still preempting that state law. We struck out at the federal level. And I think kind of thematically, this goes with our first story today. It's a vacuum --

Dave Bittner: Right.

Ben Yelin: -- that the states have had to fill up with their own policies.

Dave Bittner: Yeah.

Ben Yelin: And there are certainly advantages to that. States are the laboratories of democracy, but there are disadvantages. It's hard to establish uniformity. Compliance is really an issue for companies in the private sector. So there are certainly plusses and minuses, but I think, yeah, we're recognizing the reality that the action on this, at least for the time being, is gonna be at the state level.

Dave Bittner: I really enjoyed the part that Dan brought up about the video privacy law, you know, that law going all the way back to Judge Bork, making it illegal to get people's video rental lists, and how people are applying that law to tracking pixels with videos online. I think it's just an interesting example of a legacy law with a long tail that people are experimenting with to try to apply in this new digital realm.

Ben Yelin: Yeah, I mean that's what's so funny. If there's one theme of our podcast, generally, it's that a lot of litigation on the most modern, contemporary forms of technology is based on either case law or statutes from the pre-digital world. Which is just weird. It's a weird way of having a legal system.

Dave Bittner: Right. Right, yeah. It's like, I don't know, automotive law based on rulings about horses and buggies.

Ben Yelin: Exactly, exactly. Very well done metaphor.

Dave Bittner: Thank you, thank you. All right, well, again, our thanks to Dan Frechtling from Boltive for joining us, really interesting stuff and we do appreciate him taking the time.

That is our show. We want to thank all of you for listening. We'd love to know what you think of this podcast. You can e-mail us caveat@n2k.com. N2K strategic workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. The show is edited by Elliott Peltzman. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.