Laws outpace compliance programs.
Cinthia Motley: The challenge that companies have is that they've had their privacy and cybersecurity compliance programs, but the emergence of all these laws, it's outpacing their compliance programs. So they're finding themselves in a very fast-moving environment.
Dave Bittner: Hello, everyone, and welcome to "Caveat", the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hey, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today, Ben has the story of a federal judge blocking an Arkansas law limiting minors' access to social media. I've got the story of a court-ordered blocking of websites at the DNS level. And later in the show, my conversation with Cinthia Motley and Sean Buckley of Dykema. We're discussing the evolving legal challenges and compliance issues associated with some of the common data collection practices. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, we've got some good stuff to talk about this week. Why don't you kick things off for us here?
Ben Yelin: So my story comes from the PBS NewsHour. This is my PBS NewsHour voice.
Dave Bittner: Okay.
Ben Yelin: And it's about a judge blocking a new Arkansas law requiring parental approval for minors to create social media accounts. So this law, which was passed earlier this year, was set to go into effect September 1st. And the federal judge issued a preliminary injunction against the law right before it went into effect on August 31st.
Dave Bittner: Okay.
Ben Yelin: So the way the law was supposed to work is that social media companies, and there was a lot of confusion about what counted as a social media company, it had to have more than $100 million in annual revenue, and its primary purpose had to be various social media functions, so like interacting with peers and posting news articles. It's all defined in the statute, but I think it's defined in kind of a sloppy, inconsistent way, and that's one of the reasons that the state of Arkansas is going to have trouble here prevailing in this case. Companies like X, Meta -- I know, I'm using the new names of these companies just to annoy you personally.
Dave Bittner: Thank you.
Ben Yelin: And the other usual suspects have these new requirements to verify the age for any new users that sign up within the state of Arkansas.
Dave Bittner: Okay.
Ben Yelin: Every user, whether they are a minor or not, instead of just doing what we normally do, which is check the box that says I'm at least 13 years of age, I'm at least 18 years of age, would have to submit some type of verification, whether that's a driver's license or some other form of government-issued ID. They would send that to some sort of third-party entity that would review whether it was a proper form of identification. So this is going to be pretty difficult to enforce for a number of reasons. Weirdly, this law doesn't apply to Snapchat, and the judge in this case asked the litigant for the state of Arkansas why it does not apply to Snapchat, and he couldn't really give a good answer.
Dave Bittner: That does seem odd.
Ben Yelin: It's very odd, and I'll talk about why that's relevant in a legal sense in just a moment.
Dave Bittner: Okay.
Ben Yelin: I think that gets to the legal problem with this law is that it's really void for vagueness. I'll also note this is not the first law of this type across the country. There are other laws that have popped up. Utah has one of them. Texas is experimenting with something that sets up some type of age verification requirement. Most of these sites require you to be at least 13 years of age, but in Arkansas, this would apply to minors. So anybody under 18 would have to get express parental consent in order to be allowed to use that platform. This is an inhibition on free speech, both for minors, because minors, though they don't have full First Amendment rights in all cases, certainly have some level of First Amendment rights, and you would be blocking them from engaging in important national conversations. And then there is also the free speech effect that this would have on adults, people who don't have government-issued ID or who want to remain anonymous would now be forced to submit their ID. That might cut against their anonymity, and that might stifle their ability to participate in free speech activity. If you are suppressing free speech under our case law, if it is a content-neutral restriction on free speech, meaning you're not trying to punish the speaker for specifically what that speaker is saying, then the government has to have a pretty darn good reason for that law to go into effect. That's opposed to what we call strict scrutiny, where if it is a content-based restriction, the government has to have a darn good reason, more than a pretty darn good reason.
Dave Bittner: Okay.
Ben Yelin: The legal language for strict scrutiny is a compelling state interest, and for what we call intermediate scrutiny, it is an important governmental interest. The judge here basically says, I don't need to decide which level of scrutiny applies here. We know that protecting kids from the harms of social media, and there are various studies that are cited as part of the legislative record of this Arkansas law, we know that that is an important governmental interest. The question is whether the means of achieving that interest are properly tailored to fit the ends of that interest. And when you have a law that's this vague, and it makes it difficult for people to know how and whether they can comply with the law, that's when you run into constitutional problems. The fact that there's no way for the court to determine whether this is the least restrictive means of achieving the objective of protecting kids from social media, that's, I think, why this law was at least preliminarily struck down. And the reason for that is the definition of social media, as I mentioned, is vague enough that it's very unclear which of the actual social media networks this applies to. For example, this would not apply to YouTube, because they're primarily a video service. If you ask me --
Dave Bittner: Have they never read the comments?
Ben Yelin: I was just about to say that. That seems incredibly silly to me, because if you're talking about the smut that is poisoning teenagers' minds in this country, I think it's just as much YouTube as it is Facebook or Meta. But the way they wrote the statute, it has this sort of vague test of, is the primary function of the website social networking? And I'm not sure that that is a meaningful distinction. It's hard to discern whether that would be the least restrictive means of effectuating the policy goal, when you put it like that, when the term is so vague that you don't know whether it applies to Snapchat, when you know that it doesn't apply to Google or YouTube, which to me, if you are legitimately interested in protecting kids from the worst stuff on the internet, I think that YouTube would absolutely be at the top of the list. And then there are the rights of the social media companies themselves. So social media companies that violate this statute would have faced a $2,500 fine for each violation. The law also would have prohibited social media companies and third-party vendors from retaining users' identifying information after they've been granted access to a social media site. And so this puts at least some cost burden on the social media sites, and I think at least incidentally, or a little bit, threatens the free speech rights of the services themselves. So ultimately, because this is a law that is threatening free speech, and the state has not shown that it is using the least restrictive means to effectuate its policy goal by having this kind of vague, poorly defined piece of legislation, I think it's absolutely appropriate that this federal judge put the brakes on it. I think Arkansas is going to have to revise the law, make it more clear what actually counts as a social media site. You have to refine the definition beyond this sort of primary purpose test. I think until they do that, this law is going to be, as we say in the constitutional law world, void for vagueness.
Dave Bittner: Help me understand this, what did you say, least restrictive?
Ben Yelin: Right.
Dave Bittner: Help me understand that, because initially that sounds to me like trying to prove a negative.
Ben Yelin: Yeah, I mean it is one of those very legalistic terms. It means that this is the way of achieving that goal that puts the least burden on people's free speech rights. And I think what the court is saying, and frankly what the litigants are saying -- the litigants are part of a trade organization called NetChoice. That includes all of the major companies, TikTok, Facebook, X, et cetera. What they would say is there are ways to effectuate these policy goals that would have much more of an incidental, smaller burden on people's free speech rights. Exactly what those methods would be, those least restrictive methods would be, I don't think the litigants or the judge have the obligation to describe those. I think they're just saying here, this law is too overbroad. If you're going to restrict people's free speech rights, even if you have a compelling reason to do so, you need to find a way that both the companies and the consumers know exactly what they need to do to comply with the law. And the way the law is drafted, at least the thinking here, is that consumers and the companies, there's going to be a lot of confusion as to whom the law applies, to which companies the law applies, et cetera.
Dave Bittner: I know you and I, we often joke about the breathless summoning of the need to protect the children. Right.
Ben Yelin: Somebody please think of the children.
Dave Bittner: Right. And it's because it's quite often used as an excuse to accomplish other things. Do we feel as though the legislators in this Arkansas case are coming at this in good faith, that their intention actually is to protect minors from some of the ills of social media?
Ben Yelin: I don't want to cast any aspersions on whether they're doing this in good faith. That's not my place.
Dave Bittner: Yeah.
Ben Yelin: I think there is legitimate concern about the effect of social media on kids.
Dave Bittner: Right.
Ben Yelin: The US Surgeon General just came out with a report talking about the negative effects. There have been a sufficient number of studies that show some of the effects on mental health, for example. So this is a problem that really does exist. I think the issue is, it's really over-encompassing if you are confronted with this problem to first go to a full age restriction on the platform. So in other words, the problem of smut on social media doesn't necessarily have to do with the platforms themselves or the daily use of those platforms for finding information on video games or connecting with friends. It's about the negative content that people find on those platforms. So I think a better way to go about this would be to at least protect minors from this type of harmful content. And the companies have found through algorithms, basically they've done a decent job of protecting minors from some of the worst content. Now it's certainly not a foolproof method. Sometimes something innocuous seeming like a British kid playing video games and getting millions of views on YouTube Kids might actually be the type of content we don't want our kids to see because maybe they'll use language that's threatening to people's gender identity or sexual orientation. So it's not always easy to tag which content is the type of content we need to restrict. But I just think you can agree that they have good faith, they've identified a real societal problem, but that good faith belief doesn't necessarily justify this sort of gating effect or gating strategy where you're blocking off an entire service to minors and potentially to adults if adults cannot fulfill those identification requirements. That's just dropping an anvil on a tiny little ant, although maybe that metaphor is a little too extreme, but I think you get what I'm saying.
Dave Bittner: To what degree, if at all, does this relate to some of the efforts we've seen states make to restrict access to pornographic websites, the Pornhub of the world, to keep minors off of those platforms, which I think is certainly non-controversial in theory?
Ben Yelin: Yeah.
Dave Bittner: But the states have been unsuccessful at that as well for similar reasons.
Ben Yelin: Yeah. So adult pornography is protected First Amendment speech.
Dave Bittner: Right.
Ben Yelin: However smutty it is, it is protected under our First Amendment. Now there are things that are not protected under the First Amendment, one of them is child pornography.
Dave Bittner: Ben, you keep using that word smutty, it's very judgmental of you.
Ben Yelin: It is very judgmenty. Judgmental. I really should use something else. How about --
Dave Bittner: It's a loaded word. Icky?
Ben Yelin: Yeah.
Dave Bittner: Okay.
Ben Yelin: Something like that. Something with a y at the end.
Dave Bittner: Right. One man's smut is another man's treasure.
Ben Yelin: Yeah, and that does sound like something a Supreme Court justice would say.
Dave Bittner: Right.
Ben Yelin: So those type of websites do have some level of First Amendment protection. I think when we're talking about categories of First Amendment-protected speech, pornography, while it is protected, would be at the lower end of the spectrum, I think, especially when we're discussing minors. We do want to protect minors from the worst pornographic indecent websites. I think that's a legitimate societal interest, even within the confines of the First Amendment.
Dave Bittner: Yeah.
Ben Yelin: But there are other things that social media websites or social media services present that certainly fall under kind of greater First Amendment protection, or present a greater First Amendment concern, even for minors. If you're shutting off political conversations or preventing people from being able to connect with their neighbors and their peers, that is an inhibition on First Amendment activity even for minors. So things that you might be able to reasonably prevent from a First Amendment perspective, like granting minors access to pornography, you know, I think you don't have that same type of justification when we're talking about a blanket ban on all social media services.
Dave Bittner: I see. Yeah.
Ben Yelin: Yeah.
Dave Bittner: I imagine too that there's a compelling case to be made for teenagers who may be seeking out answers or help with a lot of the problems that teenagers face and may not want their parents to know that they're looking for that sort of stuff.
Ben Yelin: Right. You think of people with sexual orientation, gender identity questions whose parents might not be approving.
Dave Bittner: Right.
Ben Yelin: Things like suicide and self-harm. I think it is in a societal interest to have resources, just my opinion, available to minors of a certain age without requiring the consent of their parents. Now that's a live public policy dispute. I think there's a good faith disagreement on that.
Dave Bittner: Right.
Ben Yelin: But even so, coming up with this blanket ban where you're just putting up a giant gate in front of all social media services, many of which might not only be useful to children but might be First Amendment protected information, is just too large of an action, too vast of an action to fulfill Arkansas' goal of protecting children. And if Arkansas really wanted to protect children, they'd have a better targeted law that actually attacked all forms of social media that were harmful to children, including things like YouTube and Google. They would make the law more narrowly tailored to the most objectionable content on all of those websites. And the fact that they were not able to do that, I think, is why this law ended up failing in court.
Dave Bittner: So has this judge given the Arkansas legislators a bit of a roadmap for how to come at this a second time?
Ben Yelin: Yeah. They're gonna take another bite at the apple. I think the legislature could, first of all, appeal this preliminary injunction up to the Court of Appeals. The Federal Court of Appeals, they very well might succeed there. The Federal Court of Appeals might vacate that injunction and put the law back into place. If that doesn't succeed, I think there is a roadmap here for Arkansas to really put in the work that they should have put in in the first place to make this law as narrowly tailored as possible. I think lawmakers have to know that any inhibition on free speech is gonna earn you the watchful eye of the judicial branch. So the more carefully you draft these laws, the more well-documented your decisions are, the better you're gonna do in court. And I think Arkansas has learned that lesson the hard way.
Dave Bittner: All right. Well, it'd be an interesting one to keep an eye on to see, how many times do Arkansas legislators have to go back to the well here? And will they ultimately be successful in accomplishing what they're setting out to do here? Or is it not possible?
Ben Yelin: Yeah. I mean, I think so far, it seems to me very difficult to achieve this balance of a constitutionally acceptable version of this type of restriction, but I think it's certainly within the realm of possibility. And some state, if it's not Arkansas, is gonna try and pull it off.
Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes. My story this week comes from a website called IS Preview, which I think is a little bit of a pun off of ISP.
Ben Yelin: I see what they did there.
Dave Bittner: Yes, exactly. It's an organization out of the UK. It's an article written by Mark Jackson, and it's titled Quad9 Founders on the Dangers of Global DNS Blocks by Rights Holders. So Ben, are you at all familiar with a service called Quad9?
Ben Yelin: I was not until I read this article.
Dave Bittner: Okay. So Quad9 is a domain name service provider, and just real quick, your DNS providers, they're the ones who convert the plain -- I'm gonna say plain English and be provincial here -- but the plain English version of a website name, for example, Google, and the DNS provider takes that word that you put in there, say google.com, and converts it to an IP address, and that's how you go and find the website.
Ben Yelin: Right.
Dave Bittner: So most folks, when they set up their internet service at home, they'll use whatever domain name server comes automatically with their service, and your ISP, your internet service provider, typically provides that, and most people just go with whatever they're provided with, and that works out fine. However, there are some third-party DNS providers. Quad9 is one of them. They're called Quad9 because their DNS name server is 9.9.9.9.
Ben Yelin: Very clever. Four nines.
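To make that concrete, here's a minimal sketch of what pointing your lookups at a third-party resolver like Quad9 actually looks like. It assumes the third-party dnspython package; the domain being queried is just an example.

```python
# Minimal sketch: ask Quad9's public resolver (9.9.9.9) to translate a
# hostname into IP addresses, instead of using whatever resolver your ISP assigns.
# Assumes the third-party "dnspython" package (pip install dnspython).
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)  # ignore the system's default DNS settings
resolver.nameservers = ["9.9.9.9"]                 # point lookups at Quad9

answer = resolver.resolve("example.com", "A")      # request the A (IPv4 address) records
for record in answer:
    print(record.to_text())                        # prints the resolved IPv4 address(es)
```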
Dave Bittner: Yeah. But lots of people go with other ones. Google has a public DNS server, CloudFlare has one. Part of the rationale for doing that is some of these third-party ones might be faster, they may be automatically blocking sites that are known to be malicious or providing malware, so lots of different reasons you could do that. What has happened here is that the folks who are running Quad9 were hit with a lawsuit from Sony who wanted Quad9 to stop resolving domain names of sites that were providing pirated music, pirated content. So let's say music, movies, that sort of stuff.
Ben Yelin: Right. The type of stuff I could have gotten on LimeWire in the mid-2000s.
Dave Bittner: Exactly. Exactly. The golden age of content piracy.
Ben Yelin: Oh, I miss it -- that is, if that type of thing had been done legally.
Dave Bittner: Right, right. So Quad9 went up against Sony in this lawsuit and they lost. Quad9 lost. And so Sony has successfully had this takedown order with Quad9. The judges agreed that Sony had an interest in protecting their rights to music and other content that they owned. Quad9 is appealing the decision here. But what I want to talk about is the broader issue here. This particular article is kind of a Q&A with some of the folks who run Quad9 about some of the issues of doing this. And they're making the case that perhaps the DNS level of the Internet is not where we want to be blocking things. That it is a type of censorship, that it is an undue burden on an organization like Quad9, which happens to run on grants and sponsorship from other organizations that find the service they provide to be a good one. Many other organizations provide DNS service either as part of another service that they provide, like being an ISP, or, if you're someone like Google who provides just about everything, why not be a DNS server as well?
Ben Yelin: Right.
Dave Bittner: So Quad9's making the point that this is a burden on them. It provides extra expense. It can degrade the quality of their service. But they also make the case that they really shouldn't be the people doing this kind of censorship. I'm curious what your thoughts are on this, Ben.
Ben Yelin: So help me with this, Dave. Can we think about sort of the perfect metaphor here in the non-digital world? So would this be like blocking a bus that takes you on a road to a particular location because you're prohibited from attending that location?
Dave Bittner: That's interesting.
Ben Yelin: Yeah. I'm just trying to help me/our listeners to conceptualize this problem. Because it seems to me that you might have some intellectual property interest here, but as you said, this isn't the method to attack that intellectual property interest. This is censorship.
Dave Bittner: Yeah.
Ben Yelin: If somebody is trying to use your creative works to make money under our understanding of copyright law, that's one thing, but this is about access to it, correct?
Dave Bittner: Well let's go back to the thing we always love to test these things on, which is pornography.
Ben Yelin: Yeah.
Dave Bittner: So let's say you got your friendly neighborhood sex shop, right? They're selling all the things that sex shops sell, and they're doing good business. And as you say, I decide as an adult of legal age that I want to go and shop in this establishment. And so I get on the bus and I say, take me to the establishment. Or maybe I get in an Uber or a cab and I say, take me to the establishment. And they say, what's the address? Where is it? And I give them the address and they go, oh, we cannot take you there.
Ben Yelin: Absolutely not. Yeah.
Dave Bittner: So I think that's a legit analogy that the blocking of that site, the blocking of that shop should take place at the entrance to the shop.
Ben Yelin: Right.
Dave Bittner: In other words, if they don't want to allow anyone who's under 18 to come in, or they don't want to allow anyone in who's been a problem in the past or whatever, they can do that at the front door. But it shouldn't be up to the transportation providers to do that blocking.
Ben Yelin: Yes. I think that's exactly what troubles me about this. Now, stepping back a second, I think people would be skeptical of this because of course Quad9 wouldn't like this because this cuts against their service and their revenue and their entire business model. I get that.
Dave Bittner: Right.
Ben Yelin: But I think the concern here is that you're doing the censorship at the routing level and you can understand a slippery slope where maybe all the major internet service providers collude together and start blocking access to certain domains. And I know we talked a lot about the First Amendment in our first segment here, but that could have a chilling effect on people's First Amendment rights if they know that no matter what they say or what they do, nobody is going to be able to have access to their content.
Dave Bittner: Right.
Ben Yelin: And I just think that's a little bit extreme. I think our legal system is generally pretty good at adjudicating intellectual property disputes and I just don't think we need this type of tool prescribed by the courts. I think it frankly reflects a misunderstanding on the part of our court system as to exactly what's going on here. I think if they understood it properly, as this is just a routing method and not any sort of endorsement of the content, then perhaps this case would have come out differently. I think it's possible they just really didn't fully understand what was happening here.
Dave Bittner: Yeah, and just to be clear here, and shame on me for not saying this earlier, but this did take place in Germany. Quad9 is a Swiss company that operates in Germany, so I am not familiar with all the exact rules and so on and so forth when it comes to having these things happen in Germany and international law and all those sorts of things. So that is an element that's at play here.
Ben Yelin: Having once watched a Bundesliga German soccer match, I consider myself an expert on German constitutional law.
Dave Bittner: There you go.
Ben Yelin: No, I know they have an equivalent First Amendment right. It's frankly not as strong as ours. There are certain categories of speech that are not protected in Germany.
Dave Bittner: Right.
Ben Yelin: I think we can guess what some of those are based on historical factors.
Dave Bittner: German history, yeah.
Ben Yelin: But I think the same spirit still certainly applies.
Dave Bittner: Yeah, and as you say, some of the other ISPs, or I should rather say the DNS providers, are concerned that Sony was using this as a test case and coming at Quad9, who compared to the Googles of the world, they're underfunded.
Ben Yelin: It's an easy target, yeah.
Dave Bittner: Right. They don't have the legal team that someone like Google would have to fight something like this. The concern is that if Sony can get rulings against Quad9, then they start going after the bigger providers, and a rights holder like Sony finds this to be an effective hammer against the problem of people sharing their copyrighted content.
Ben Yelin: Yeah, my guess is they tested this in Germany, then you can test it in kind of the broader European Union and in our legal system. And our legal systems, while they are different in a lot of ways, I think some of the same principles still apply, so I think it's probably a smart move on the part of Sony to use this as a test case. Now that they've succeeded here, I think the cat's out of the bag and we could see them try it in other jurisdictions.
Dave Bittner: It is interesting to go back to our analogy, because this is like Sony, rather than suing the porn shop we were talking about, suing the bus service.
Ben Yelin: Or Uber, yeah.
Dave Bittner: And being successful in saying that you can't take people to this location, rather than going after the illegality of the operation of the location itself. It seems to me like Sony should just shut down or go after the website. And I understand, it's harder to go after a secretive, shadowy website that will go from one bulletproof host to another, when you can go after the public-facing organization like Quad9.
Ben Yelin: Right.
Dave Bittner: So there's a practical consideration here.
Ben Yelin: Yeah. I mean, I also think you have, and they mention this in the article, there are a lot of good reasons why somebody would use a third-party DNS provider. And now, if this lawsuit becomes widely accepted, whether it's in Germany or elsewhere, it would probably end up being such a threat to the business model of these third-party DNS providers that they might go out of business. So without any of these providers, you might not be able to get the benefits, like additional malware filters or helping users avoid DNS-related bugs.
Dave Bittner: Right.
Ben Yelin: So I think that's certainly a policy consideration as well.
Dave Bittner: I also wonder how ultimately effective this is, because a DNS provider is providing the translation of a website name to an IP address. Well, if the DNS provider doesn't provide that translation, that IP address is still out there.
Ben Yelin: It still exists. Yeah. Right. In our metaphor, the sex shop still exists. It's still there.
Dave Bittner: Right.
Ben Yelin: There are other ways to get to it.
Dave Bittner: Right. And in the community of folks who enjoy shopping at those places, word is going to get around about where the location is.
Ben Yelin: Right.
Dave Bittner: And I imagine the same thing here. If you're someone who's into pirating audio or video or whatever it is, you're going to be on some forum somewhere, and they'll just share the IP address.
Ben Yelin: Exactly. That's it. 8chan or whatever.
Dave Bittner: Yeah.
Ben Yelin: I mean, that's certainly valid. I don't know how this is going to be possible to fully enforce.
Dave Bittner: Right.
Ben Yelin: It just seems like an ill-considered judicial decision to me, but I guess that's what this court has done here.
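To make the enforcement point concrete, here's a minimal sketch of why a resolver-level block is easy to sidestep: it only removes the name-to-address translation at that resolver, while the server itself stays reachable to anyone who learns its IP address some other way. The hostname and IP address below are placeholders, not real sites.

```python
# Sketch of why a DNS-level block is leaky: the block only interferes with
# turning a name into an address. The server is still online, so anyone who
# learns its IP out of band (a forum post, a different resolver) can connect
# directly. The name and address here are placeholders, not real sites.
import socket

BLOCKED_NAME = "blocked-example.invalid"   # hypothetical name a resolver refuses to answer for
SHARED_IP = "203.0.113.10"                 # hypothetical IP passed around out of band

# Step the block interferes with: resolving the name.
try:
    print(socket.gethostbyname(BLOCKED_NAME))
except socket.gaierror:
    print("resolver gave no answer for", BLOCKED_NAME)

# Step the block does not touch: connecting once you already know the address.
try:
    with socket.create_connection((SHARED_IP, 443), timeout=5) as conn:
        print("reached", conn.getpeername())
except OSError:
    print("connection failed (placeholder address)")
```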
Dave Bittner: Yeah. Well, we'll see. Quad9 has not given up the fight, and it seems as though they've rallied support from some of the other providers here. So hopefully, we'll see how it plays out. All right. We will have a link to that story in the show notes, and of course, we would love to hear from you. If there's something you'd like us to consider for the show, you can email us. It's caveat@n2k.com. Ben, I recently had the pleasure of speaking with Cinthia Motley and Sean Buckley. They're from a company called Dykema, and we were discussing some of the legal challenges when it comes to compliance issues associated with some of the common data collection practices that are out there, things like ad technology, you know, the things that we're all dealing with day-to-day. Here's my conversation with Cinthia Motley and Sean Buckley.
Cinthia Motley: So the legal landscape in this area, it's an emergent area of law, and we are getting a tremendously aggressive regulatory landscape that continues to expand monthly, daily. And it's all driven by, you know, advancements in technology and companies' use of that data. So we have several state laws already on the books, California being at the forefront of this, as well as some emerging state laws that are following the same trend. So the challenge that companies have, Dave, is that, you know, they've had their privacy and cybersecurity compliance programs, but the emergence of all these laws, it's outpacing their compliance programs. So they're finding themselves in a very fast-moving environment. That's where we find ourselves, and that's a lot of what you end up seeing in the news, not just with, you know, data breaches, which involve, obviously, the protection of data, but also, you know, privacy compliance requirements.
Dave Bittner: And Sean, where does this put us on sort of both sides of the equation? I mean, you've got the people out there who are trying to do advertising and online commerce, and then the consumers as well. Is it fair to say there's a bit of a tension between those parties right now?
Sean Buckley: Absolutely, right? So we, you know, as lawyers, sit on both sides of that table sometimes, right, the media company, maybe the advertiser, and occasionally on the consumer front. We joke with some of our clients that, with respect to this tracking technology regulation, here in the U.S. it was kind of the wild, wild west, you know, five years ago, where unless you were doing business in Europe under GDPR, you could do quite a few things, and there weren't a lot of guardrails. And you fast-forward to today, and it's an entirely different landscape, where you have consumers who are increasingly aware of their privacy rights, and you have states and the federal government passing laws and enforcing existing laws on the books, right? So the Federal Trade Commission, you know, with the health breach notification rule action against GoodRx, which they had never actively pursued previously, that was kind of a wake-up call to quite a few people when it came across, along with the California Sephora action. It was a shot across the bow, like, wake up, you know, they're not playing around. They're going to come after you for, you know, failing to disclose, failing to get the appropriate consents, whatever it might be.
Dave Bittner: And how are the ad tech companies responding here?
Sean Buckley: You know, there are some industry consortiums, right, the IAB, the NAI, that are working together to find a solution similar to what they did in Europe when GDPR passed. Passing consent strings is a very technical process, and it kind of requires all the players to work together to be able to read and to pass those signals from an advertiser or publisher website. But they are taking it seriously, because it's driven from the biggest players out there, right? The Googles, the Metas, know that they have big targets on their backs and know that they're going to be the test cases, because it's unlikely that a regulator is going to go slap the hand of, you know, a small-town Main Street advertiser running a Facebook ad that Facebook allowed to happen. They're going to go directly after Meta to do that. So you're seeing some interesting compliance methods, right. And a bunch of companies, especially in July, pushed out new policies when CPRA enforcement came into effect, to kind of restructure some of what they were doing and reclassify themselves and make further disclosures of, hey, we're processing your data in this manner, we're doing this under the law.
Dave Bittner: Cinthia, when we look at the regulatory regime here for privacy in the US, I mean, it strikes me that most of the action has happened at the state level. First of all, I mean, do you agree with that assessment? And where do the feds stand when it comes to this?
Cinthia Motley: Well, at the state level, there are definitely a lot of, as I mentioned before, emerging data privacy laws to address some of the things that Sean was discussing. Right? Data privacy, you know, especially in digital ad tech, has existed. Now it's just that more guardrails are being put around it, with a very expansive definition of, you know, personal information to even include an IP address, cookies, pixels, things that you ordinarily wouldn't think of as personal information. So to answer your question, states are catching up with that. And there's also a strong push at the federal level for a federal privacy act, but it's still subject to partisan politics. Last year there was a big push. It looks like there was some bipartisan support. Some things are still moving. So what's happening is that federal regulators like the FTC, and the SEC, which just came out with its cybersecurity rule, are basically addressing it, you know, at that level and going back to say, as Sean was saying, even now you are required to not have deceptive business practices, and more of that enforcement. At the federal level, it's still subject to, you know, lobbying power as well as bipartisan support to get something passed that would kind of unify a lot of these laws that are currently at the state level.
Sean Buckley: There are major lobbying efforts by these media companies to get a federal law that preempts these state laws, this, you know, growing patchwork of increasing complexity. And right now, they're kind of getting hung up on that exact preemption issue, you know, that the California contingent, they don't want to give up some of those rights that are in there, even at a national level. So that's kind of where they're getting hung up. But all of them, you know, would prefer a federal law that preempts these state laws.
Cinthia Motley: Yeah. And to that point, also to Sean's point, a big hangup is the private cause of action. Right. So there's strong lobbying around that, and around preemption still allowing the states to, you know, enforce their own state-level data privacy protection laws, because there's still tremendous lobbying power there as well. So those are some of the biggest hangups that we see with these emerging laws at the federal level as well.
Dave Bittner: You know, I think for a lot of years there was kind of a sense of resignation among consumers when it came to having their information gathered and being tracked online and all that sort of thing. Do you sense that there's more of a groundswell now that folks are deciding that this is worth fighting for?
Cinthia Motley: I think so. And I think it's important to look at our friends across the pond. Right. So you could say that with GDPR, you know, the EU-wide law that started back in 2018, there was this massive wave, and quite frankly, it's become the model law around the world. A lot of the elements of the California CCPA come from GDPR. But I think there's a shift, to answer your question, Dave, in that in the EU, data privacy has been considered a fundamental human right. In our U.S. laws, you know, when you think of personal information, you think credit card information, Social Security numbers, financially driven law. And I think now with this emergence, we're realizing more that, you know, they can't just take your data and then sell it and then pass it around to other entities without your knowledge or disclosure, maybe not necessarily consent yet, you know, at all levels, but there's more awareness to say, I have something to say about my data. And I think that's the awakening that we're seeing. And it's definitely reflected in the regulatory trends. And then again, at the federal level, the American Data Privacy and Protection Act, trying to kind of be the omnibus, if you will, of all of these, is certainly taking a lot of those GDPR elements and embedding them into U.S. law as well.
Dave Bittner: Sean, where do you suppose we're headed here? I mean, as you look towards the future, any predictions on where things might land?
Sean Buckley: So it's going to be interesting because the Internet, as we know it, is going to change. Right. So we are used to having free websites, right? Go there, get whatever you want. And the fundamental exchange in that was you go to a publisher's website, that publisher sells ads. Right. And so the ads are targeted towards you. Right. If I don't have a cat, I'm not going to buy cat food. Cat food company doesn't want to pay for advertising to be shown to me. There's an exchange of information. Right? They use my IP address and cookies as they sync it with some of the big data providers and say, oh, you know, Sean doesn't have a cat. You know, let's not serve him cat food ads. But with the increase in these regulations and consent mechanisms and the ability of consumers to essentially not provide their personal information or opt out for those ads, right, those ads are less valuable, and that publisher has to find a way to cover the difference in order to stay profitable. And so if they're not going to get your IP address when you go to their website, well, they need to get it somewhere. And so you're actually seeing an increase in roadblocks of you have to sign in. Right. You have to create an account. And in that, instead of just providing an IP address, which is a string of numbers, I'm now having to give my actual email address, my first name, last name, whatever it might be. So you're actually getting directly identifying information, because that publisher can then use it, you know, in the sale of ads to these demand-side platforms. And so you're seeing kind of a change in that. And I think it's only going to increase; you know, websites have to stay profitable. And if we're cutting off the flow of information, you know, they're going to get it somehow, and some of them are going to close and business models are going to change. But it's going to be a lot harder to just go to a website, you know, read your content. You're going to have to provide some information somehow.
Dave Bittner: You know, I think like a lot of folks, probably decades ago now, you know, I was on board with the notion of having targeted advertising. This made perfect sense to me. You know, why not put ads in front of me that are for things that I've demonstrated an interest in? I think where people draw the line is sort of how it's become almost creepy and the degree to which we feel like they're overdoing it, you know, tracking us around with our location data and things like that, and our inability to dial it in. I can't help thinking, you know, all those years before advertisers had these capabilities, companies were still in business. They made profit by showing us ads on TV or on the radio. And it seems as though they've become used to this method of being able to hyper track us. And I guess I question the notion that there's no turning back from that or dialing it down.
Sean Buckley: Data is a drug almost, right, to marketing teams. If you are going to spend a million dollars and you're going to run your ad on the Super Bowl, right, you know the demographics of what that is. Right. Nielsen will have those. That's pretty consistent. You know, if you're going to buy on local news, right, you might be kind of shouting into a room and hoping that, you know, a percentage of those people are going to buy your product. But you're really not going to be able to tell how effective that million-dollar spend is and your return on investment. And so the data game, right, has allowed marketers to say, all right, if I spend a thousand dollars on Meta's platform, I'm going to be able to see which one of these came to our website, which one of these checked out, which one of these actually converted. And that is the drug that these, you know, marketing teams have gotten used to, saying, yeah, we used to be able to advertise on radio and TV, and you still can, right. But especially with connected TV, right, there's still a lot of data there.
Dave Bittner: Right.
Sean Buckley: But the days of just saying, all right, I'm going to hand a million dollars to my ad agency and hope my sales increase, you know, CMOs at these major companies, right, have to justify their jobs and their spend and, you know, to say, hey, our advertising is working. And, you know, data has allowed that to happen in ways that, you know, you couldn't do, you know, 15 years ago. And so that's a hard drug to give up.
Dave Bittner: Ben, what do you think?
Ben Yelin: Sometimes I just feel sorry for the compliance folks, because you're confronted not only with a patchwork of federal laws, but also now all of these state laws are popping up that are issuing new restrictions. And it's making a lot of these compliance officials and lawyers unnecessarily rich. So as a lawyer, I approve of it. But I also feel sorry for them because their job is just getting increasingly difficult.
Dave Bittner: Yeah. And I wonder, where is the breaking point where there is no choice but to have federal legislation happen because people are screaming so loudly that it is just so impractical to abide by all of these different state standards?
Ben Yelin: I don't know when we're going to reach that tipping point. I would have thought we'd been there already, but clearly we're not.
Dave Bittner: Yeah, absolutely. All right. Well, our thanks to Cinthia Motley and Sean Buckley from Dykema for joining us. We do appreciate them taking the time. That is our show. We want to thank all of you for listening. N2K's workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. This show is edited by Trey Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.