
Protecting or policing?
[ Music ]
Dave Bittner: Hello everyone, and welcome to "Caveat", N2K CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hey there, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On today's show, Ben discusses the UK's controversial new Online Safety Act. I've got a look at automated license plate scanners expanding into schoolyards. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. [ Music ] Alright, Ben, we've got a lot to cover this week. You want to kick things off for us?
Ben Yelin: Sure. So, the world of cyber and data privacy and surveillance news always seems to follow me wherever I go.
Dave Bittner: Yeah [laughs].
Ben Yelin: And I was on vacation, as you know, last week in the United Kingdom. And sure enough, our news story, our first story for this episode, comes from the United Kingdom. So, they enacted a new Online Safety Act. It went into effect about a week ago. And this Act mandates that any site accessible in the UK -- it doesn't have to be a UK website, just something you can visit while you're there -- including social media, search engines, music, and things like adult content, has to enforce age checks to prevent children from seeing harmful content. So, what counts as harmful content? The obvious stuff: pornographic content, anything that encourages suicide, self-harm, that sort of thing.
Dave Bittner: Mm-hmm.
Ben Yelin: I think those are things that all of us would agree we want to keep away from children. Then there's priority content that is harmful to children. So, content that is abusive on the basis of race, religion, sex, sexual orientation, disability, or gender reassignment, anything that incites hatred against people on the basis of those protected characteristics, bullying content, which just seems very vague to me and kind of ripe for potential abuse.
Dave Bittner: [laughs] See, there's half the internet [laughs].
Ben Yelin: Right. And I was just saying to you before we started recording that, like, I could be accused of bullying on various social media sites when I'm arguing with people.
Dave Bittner: Yeah.
Ben Yelin: And then there's this kind of catch-all third category of harmful content, which is any content that presents a material risk of significant harm to an appreciable number of children in the UK, as long as that harm doesn't flow from the potential financial impact, the safety or quality of goods featured in the content, or the way in which a service featured in the content may be performed.
Dave Bittner: Okay.
Ben Yelin: So, this has become very controversial. The UK doesn't have the type of robust constitutional free speech protections that we have. I think this law, if it were enacted in the United States, would be thrown out on First Amendment grounds.
Dave Bittner: Mm.
Ben Yelin: Any potential restrictions on speech that are permissible have to further a compelling governmental interest and would have to be the narrowest means of achieving that interest. And when you talk about regulating content or blocking access to content based on vague concepts like bullying, and even hate speech, obviously none of us like hate speech, but in the US, that is constitutionally protected speech.
Dave Bittner: Mm.
Ben Yelin: So there's been a bit of an outcry to this. A lot of the social media sites and their founders and owners have written letters and have signed petitions saying that the UK Online Safety Act is overly restrictive. It's forcing people oftentimes to upload identification and send that identification to third-party vendors in order to view content. And there are provisions in the law that allow for the UK government to restrict content entirely. So a lot of internet sleuths have been logging onto VPNs, putting themselves in the United Kingdom, and kind of testing out how the law works.
Dave Bittner: Oh.
Ben Yelin: And so they've posted content on the Israel Gaza war. Some of that's ended up being blocked, depending on what images are being used. And other types of what we would consider to be free speech have been blocked.
Dave Bittner: Mm.
Ben Yelin: The way around this for people in the UK is the use of VPNs to put themselves in other countries. And one of the things that we've seen, and you probably could have guessed it, is the most downloaded applications over the last week in the UK have been VPN platforms.
Dave Bittner: Right.
Ben Yelin: Which is really interesting. I think this is just another example of something that's very well-intentioned, preventing children from accessing harmful content, that ends up really frustrating users and causes a really big backlash. And even though right now this is centered on the UK, it has an impact on us in a number of ways.
Dave Bittner: Do you have any sense for how much the enforcement of these rules is proactive versus reactive? You see where I'm going with that?
Ben Yelin: Right. Like, if they are trying to --
Dave Bittner: Are they responding to reports that -- like, reactive would be, I post something with hate speech, you report it and the platform takes it down.
Ben Yelin: Right.
Dave Bittner: Proactive would be, the platform is actively scanning for hate speech without anyone else's intervention.
Ben Yelin: I think, at least the way the law is written, is it allows for both. But I think there has been, based on the reporting I've seen, there has been some proactive enforcement. It's not just reactive. I also don't think that distinction ends up meaning a whole lot, because eventually things that are borderline are going to get reported and it's up to the UK governance bodies, whatever the relevant secretary is, to enforce it. But I do think they are doing a certain level of proactive enforcement here to make sure that the law is being followed as it's intended.
Dave Bittner: So what if your Facebook -- like, because I just think about how often Facebook ignores people's -- or not even ignores, disallows people's reports of all kinds of, you know, unattractive speech, let's call it. Right? You know?
Ben Yelin: Right.
Dave Bittner: And so does this mean that an organization like Facebook is going to err more on the side of, you know, when in doubt, throw it out, that sort of thing?
Ben Yelin: That's the concern, is that even though maybe under a technical interpretation of the law, a specific type of content would be allowable, it's the threat of government action that's going to have a chilling effect on these social media websites operating overseas. So the punishments under this law are very severe. It would be fines of up to 10% of global revenue, which is a lot.
Dave Bittner: Mm-hmm [laughs].
Ben Yelin: We're talking about Facebook, that's probably like half the budget of the United Kingdom. Not seriously, but it's a large amount of money.
Dave Bittner: [laughs] Right.
Ben Yelin: That's one of the punishments. And then their other kind of prophylactic measure would be to take down content on websites entirely, which feels less like something that a Western democracy would do.
Dave Bittner: Mm.
Ben Yelin: Again, I'm sympathetic because I understand the need to protect children from harmful content, and the methods that we currently use here in the United States aren't always effective. I will say, you know, this comes in the wake of our Supreme Court holding just last month that Texas's age verification procedures are constitutional, that they're narrowly targeted enough to protect against minors accessing certain websites. But the category of websites in that case was much narrower. I think what makes the UK law something that the tech community is up in arms about is just how broad potential restrictions are when we talk about things like hate speech and bullying. So I think that's where people are taking the most offense.
Dave Bittner: Are people taking their legal avenues to try to fight this?
Ben Yelin: So far, we haven't seen any direct lawsuits. We've seen threats of lawsuits in the English -- or, the UK legal system. So that's something that is certainly coming down the pike. As somebody who's not an expert in the UK legal system, although I can sometimes play one on the podcast, it's just hard to prognosticate how those cases will go, just because they don't have that same type of robust First Amendment law.
Dave Bittner: Yeah.
Ben Yelin: But I think, so far, the companies have kind of done everything beyond filing lawsuits. So they've written, they've complained, there's a petition that's going around that has 350,000 signatures that was produced by leaders from websites like Gab, which is a known conservative right-wing site in the United States. And kind of a key opposition figure in the UK, Nigel Farage, who people probably remember from the Brexit stuff, and who's now the leader of a nascent party in the UK called the Reform Party, has promised to repeal this law if he ever gets into power. So it's become kind of a form of political controversy as well.
Dave Bittner: Do we end up with a group of haves and have-nots of, you know, basically this isn't that bad for anyone who can afford a VPN?
Ben Yelin: Yeah, I mean it kind of is. And VPNs, for most people, are decently affordable.
Dave Bittner: Mm-hmm.
Ben Yelin: You pay a monthly fee that's not exorbitant in the slightest. It also kind of -- the fact that VPNs exist and there's not really any effort that I can see in the UK to restrict the use of VPNs means that there's just an easy workaround for this law. But that's just the natural result of not having global governance structures, where if you don't like the way one country is enforcing its laws and restricting speech, then VPNs are always an option. You can just put yourself in another country. And until we -- we probably won't any time in the near future, but until we have global governance of these types of issues or some type of international agreements, I think we're going to see exactly what we're seeing right now. And it's not just the UK where this has happened. When Florida earlier this year implemented a new law restricting access to online pornography, there was a huge spike in searches for VPNs there. And people were just situating themselves in other states. So, yeah, I mean the VPN element of this is fascinating to me because it's just such an easy workaround. But yeah, it does create a potential haves and have-nots problem. I don't think VPNs are prohibitively expensive for people who are regular users of the internet, but it's certainly an added cost, which is something we never want to do in the spirit of free speech. We want equitable access to these platforms.
Dave Bittner: Yeah. Have there been any of the major players threatening to pull out of the UK market?
Ben Yelin: Yeah, so Gab was one that explicitly threatened to pull out of the UK market. I think there have been a couple of other sites or other platforms that have said that, at least as this law is being enforced right now, it might not be practical for us to continue operating in the United Kingdom. So, Gab is really the one example where they've explicitly threatened it so far, but I think the other companies have kind of implied it in their letters that they're implementing age checks to block children from seeing harmful content, but if we get into a situation where they are over-enforcing this law and taking down content that is not actually offensive, then the lever that these companies might pull is to take away their services in the UK.
Dave Bittner: Yeah.
Ben Yelin: Again, that's a problem a VPN also solves for you. But you are cutting yourself off from a pretty significant market if you take yourself out of the UK. So it's certainly a risk for the companies.
Dave Bittner: Well, and one person's bullying is another person's spirited debate. Right? Like, you know, you said yourself, you've been involved in some online exchanges that could probably be categorized as bullying, but I imagine in the heat of it, you wouldn't have categorized it as that.
Ben Yelin: Yeah, and not to get all philosophical, but I just think one of the, like a key essence of the right to free speech is to have provocative and sometimes offensive content. Now, like, this is a law about restricting access to that content by children. But one of the things that Texas case talked about, and always the concern, is that if you have these burdensome verification procedures that end up blocking a significant amount of content, that doesn't only affect children, it affects adults as well. And so that blocks access to a potential marketplace of ideas. And there's just like a kind of checkered history of restricting content on any platform because it's offensive to certain people.
Dave Bittner: Right.
Ben Yelin: And I think that's a slippery slope that those of us in the United States can appreciate. So, again, certainly like probably most of the intended content that's being targeted under this law consists of things that I personally would find offensive.
Dave Bittner: Right.
Ben Yelin: I do think it's more the spirit of free speech and the First Amendment in the United States as being violated here.
Dave Bittner: It seems to me like it also gives people a nuclear option. In other words, you know, you and I are having a spirited online debate and we both agree that we're having a spirited online debate. And then suddenly I feel like I'm losing the debate and I'm bent out of shape about that. And so I report the debate to Facebook or whoever's hosting it and say, I'm being bullied. And then, kaboom, it's gone [laughs].
Ben Yelin: And then there's -- [laughs] Yeah, there's some trigger word on there that --
Dave Bittner: Right. Right. Right.
Ben Yelin: Yeah. I would hate to see a situation like that, especially because when you understand the intention of this law, I don't think UK policymakers are wishing to use the full force of the government to stop contentious political conversations on these platforms. But that incidentally may be what they're doing.
Dave Bittner: Mm-hmm.
Ben Yelin: And so I think it's incumbent upon the UK, if the backlash to this law gets even more severe, and upon other countries that are looking into this type of age verification or content moderation, to try and figure out less restrictive alternatives -- ones that don't require age verification through third-party applications, that sort of thing, and that are instead focused on having the companies adjust algorithms, or other less restrictive means that would keep children from viewing the most offensive content but wouldn't prevent adults from accessing offensive content on these platforms.
Dave Bittner: To what degree do you think US lawmakers are keeping a close eye on this?
Ben Yelin: It's funny. So the president of the United States was in Scotland and he met with the British prime minister, and he got a question about this law. And basically, his only frame of reference was like, well, they're not taking down Truth Social, so I'm fine with it. Like, it's not a big deal.
Dave Bittner: [laughs] Right.
Ben Yelin: Besides the president himself, you have to think that US policymakers are paying attention here, both to the law itself and to the backlash.
Dave Bittner: Mm.
Ben Yelin: When you see petitions being signed, when you see outcries from these US-based platforms, I think you can see the political peril in trying to institute these regulations. And if you go beyond what Florida and Texas did, which is just instituting age verification measures for pornography and not anything else, I think that's one thing, but this is a signal that if you start to veer beyond that and have the government and not the platforms themselves coming up with hate speech policies, that you're going to see the type of backlash the UK is seeing right now. So I think this is something that US policymakers have to pay attention to.
Dave Bittner: Yeah. All right, well we will have a link to that story in the show notes. [ Music ] My story this week comes from the folks over at "The Record", and this has to do with license plate readers, which is a shiny object for us, right? We just can't --
Ben Yelin: Yeah, one of our favorite topics.
Dave Bittner: [laughs] Cannot resist a good license plate reader story. Let me ask you this, Ben, before we dig into the story, have you noticed the proliferation of the Flock license plate readers around town? Would you know one if you --
Ben Yelin: Are those the ones that have caught me speeding on the interstate?
Dave Bittner: No, no, no. I guess, would you know one if you saw one?
Ben Yelin: I don't think I would.
Dave Bittner: Okay, so they are conspicuous in that it's a very simple installation. It's a -- and it's all in black. So you have the black pole that comes out of the ground, you have a little camera that looks kind of like a ring doorbell camera, and then a solar panel at the very top, and that's all there is. So the way that they work is, obviously they get their power from the solar panel, there's a built-in battery so it can work when the sun goes down, but then it has a cellular connection and it's just keeping track of vehicles that are coming by and sending them to Flock's mothership for processing and all that kind of stuff. But I have seen them. Once you know what to look for, I see them popping up all over the place.
Ben Yelin: All right, well when I drive out of here today, I will be having my eyes set on Flock Safety cameras.
Dave Bittner: Yeah [laughs]. So, Flock is looking to expand their presence into schools, school parking lots in particular, the pickup and drop off loop.
Ben Yelin: Mm-hmm.
Dave Bittner: Right? And so they're partnering with a company called Raptor Technologies, which focuses on safety in schools. And it looks like their first foray here is into what they're calling dismissal management systems. So, you know, at the end of the school day when the bell rings and all the kids run out to get on their school buses.
Ben Yelin: Right.
Dave Bittner: And these days, many, many kids are being picked up by their parents to be driven home because, you know, our precious little snowflakes cannot walk [laughs].
Ben Yelin: [laughs] Or take the school bus, yeah.
Dave Bittner: That's right.
Ben Yelin: Or we live too close to the elementary school to qualify for the school bus.
Dave Bittner: Yes.
Ben Yelin: That's true for some of us.
Dave Bittner: But that's -- Yeah, that's my point, is that if you are in walking distance, and I'm using air quotes, chances are you're getting a ride because, again, precious little snowflakes. Have I mentioned, Ben, that I used to walk to school uphill both ways in the snow?
Ben Yelin: Both ways, barefoot in the snow?
Dave Bittner: [laughs] Yes, yes, absolutely. So. So what they're talking about are security methods for the schools. Because Flock's system, not only does it read license plates, it can recognize the make and model of a car. So let's say you drive into the school parking lot, it'll read your license plate number. But in your case, it would also recognize that you're driving a late model Rolls Royce and keep track of all that. What they're talking about here is matching students to vehicles. So little Johnny comes out of the school and it's keeping track through the combination of Flock's technology and Raptor's technology, so I'm assuming there's some facial recognition stuff going on here.
Ben Yelin: There would have to be, if a vehicle is not involved in one way or another, because if you're monitoring bus stops and walking routes, an ALPR alone is not going to be sufficient.
Dave Bittner: Well, I think what they're talking about here is particularly -- or, specifically, for the kids who are getting a ride home in a car.
Ben Yelin: Right.
Dave Bittner: So it could tell if, you know, a kid's getting into the car that the kid always gets into.
Ben Yelin: Right.
Dave Bittner: But more importantly, it can tell if the kid gets into a car that it hasn't seen before, that it doesn't recognize, and it could put a red flag on that.
Ben Yelin: Right.
Dave Bittner: What's your initial response to that possibility?
Ben Yelin: Well, there are a couple of potential concerns. One is for the privacy rights of students and families. And in the absence of any type of opt out provision, just by sending your kid to school, you're subjecting them to a mass surveillance system. The claim, at least in this article, is that the company has a database of over one billion license plates. And once you have a license plate, if you were to have a law enforcement agency, and I think several thousand of them use this technology, that wants to do an investigation, it becomes much easier now that you're part of that database. So that's certainly one concern I have. From a legal, constitutional perspective, I mean, your rights, the rights that you have as a public school student are diminished in a sense in the United States, for the purpose of protecting everybody else's safety and not interrupting school procedures while kids are in school or safe pickup and drop off. So it's not like if this were instituted outside of a private business or even at some type of adult workplace, I think it would present different issues, but because we are dealing with schoolchildren, I think there is a significant interest on behalf of school systems in ensuring safe pickups and drop offs. So I'm sympathetic to it. I would just worry about potential invasion of privacy and what this might do to vulnerable students who, you can think of situations where there's some type of domestic violence situation and they're not going home with one of the parents, they have a safe extended relative who's picking them up instead. That information gets relayed to a principal when that student and that student's caretaker wouldn't want that information revealed to a principal. Or, you know, situations with divorced parents, like you could see it becoming an issue there. 
But I also, as a parent myself, like if there were some system to ensure that my kids always got home safely and that I could track them and make sure that they were getting in the right vehicle, like my parent sense, I think, is different than my privacy advocate sense on this one.
Dave Bittner: Yeah. The folks at Flock Safety have put up a blog post with, they list five ways that this will help school campus safety. So why don't we run through them together and see if we agree with them?
Ben Yelin: All right.
Dave Bittner: The first one is they say streamlining dismissal with automated vehicle matching. So that's pretty much what we were just talking about, that they're going to match license plates to who is authorized to pick up a child.
Ben Yelin: Little Johnny got in the right car.
Dave Bittner: Yeah. And I don't have any immediate problems with that. They're talking about preventing unauthorized access to campus. So someone drives on to the school campus who's known to be a troublemaker by their -- is already in some database. Right?
Ben Yelin: Yeah, I'm cool with that one to be honest. Yeah.
Dave Bittner: Yeah. Yeah.
Ben Yelin: I mean, we do that, like, people in Home Depot who've been caught stealing and have some type of trespass action against them, like there are facial recognition tools that Home Depot uses that they can kick you out of the store.
Dave Bittner: Right. Right.
Ben Yelin: If we're going to use it there, I'm fine using it at school systems.
Dave Bittner: Yeah. They say for supporting emergency response investigations. So they're saying like in a crisis situation, if you had a missing student or a security threat that, again, you know, them being able to access their vast network of data would help law enforcement establish what's going on more quickly. I think that's plausible.
Ben Yelin: Reasonable, yeah.
Dave Bittner: Yeah. Yeah. They're talking about deploying scalable safety infrastructure fast. This is just basically them saying that these are easy to install and they're inexpensive and, you know, instead of paying for infrastructure and trenching and cabling, you just pay us a low, low monthly fee [laughs].
Ben Yelin: Right. Exactly. Or, you know, each school has one safety officer. We were considering putting two in there, but instead of doing that we're going to give you this, you know, automated license plate reader system, this Flock system.
Dave Bittner: Right. Right. And then their fifth is creating safer school zones beyond campus. So they're talking about having these zones of, they call them corridors, or safe corridors, for student travel. Again, the ubiquitous surveillance blanket [laughs].
Ben Yelin: It gets -- Yeah. That gets a little more suspicious to me when it's not directly on school grounds, because people could be engaged in a lot of private, First Amendment protected activity within close proximity to school districts that they just don't want those who control the databases to know about.
Dave Bittner: Right.
Ben Yelin: If there's a familial association or religious association or a political association that's discoverable through one of these networks, I think that would be problematic.
Dave Bittner: Yeah. So I guess we are cool with three out of the five. I'd say one out of the five is really just kind of marketing. [laughs] And maybe that fifth one of the being wrapped up in the warm embrace of surveillance is the one that makes our eyebrows lift up, right?
Ben Yelin: Yeah. I would also love to see, at some point, like some actual empirical research literature on this. Like, in the aggregate, is this making schools safer or not? Oftentimes I'm surprised when I see studies on various safety methods where it's like this is something that we've been doing for a long time and the literature says that it actually doesn't have a major impact on keeping schools safe.
Dave Bittner: Yeah.
Ben Yelin: So I'm just curious. At some point, when this has been instituted and they start to do, like, controlled studies with schools that have Flock systems and schools that don't and comparing safety records, that would be really interesting to me. But we're not there yet.
Dave Bittner: Yeah. I also wonder, like, given the moment that we're in right now, what sort of restrictions a community would want to put on the sharing of this information. And I'm specifically thinking about, it's one thing to share with local law enforcement, but it's another thing to share with ICE.
Ben Yelin: Right.
Dave Bittner: Right?
Ben Yelin: Or, and this is the high-profile elephant in the room, Texas authorities reportedly performed a nationwide search combing through 83,000 Flock cameras to track down a woman who they claimed had self-administered an abortion, which is illegal in Texas.
Dave Bittner: Right.
Ben Yelin: And that search had reportedly included a probe of footage collected in states where abortion is legal. It's like, okay, that, you know, starts to raise some eyebrows.
Dave Bittner: Mm-hmm.
Ben Yelin: However cool this technology is, when it's being applied in that setting to do that type of surveillance, no matter what your views are on abortion, it certainly is threatening to a person's privacy. If they're searching through 83,000 Flock cameras to find an image of your license plate, I think that could also potentially have the type of chilling effect that we've talked about. If you know that wherever you go you're going to be caught by a Flock camera and there might be some search where they're trying to locate where you've been, a history of your last known location or whatever, that could prevent you from engaging in activities that other people might find distasteful or offensive.
Dave Bittner: Right.
Ben Yelin: And you never want to see that.
Dave Bittner: Yeah, to just be able to track your associations.
Ben Yelin: Right. And I'd also -- I've got to invoke Carpenter here.
Dave Bittner: Mm.
Ben Yelin: One thing that Carpenter said is having -- and they didn't lay out specific parameters, but historical cell site location data, where you're painting a picture of where somebody was over an extended period of time, that should require a warrant. Because that is a search that is broad, it's deep, and it's involuntary. Would all those things not apply in this situation as well? People are not opting in to being caught by these Flock cameras because they're so ubiquitous. That lends credence to the breadth of the search, and I think you could certainly see some parallels there.
Dave Bittner: Mm-hmm. Yeah. Yeah, that's interesting. Drop the kid off down the street and let him walk the last quarter mile, right?
Ben Yelin: Yeah. And now you're just going to seem suspicious though. Then the Flock people are going to be on you even more. They're going to be like, what is this person trying to hide?
Dave Bittner: [laughs] Yeah, yeah. I have to say, like, what, you know, the subversive side of me, when I started noticing these Flock cameras and what they look like and I started seeing them around town, unlike, you know, some of the, let's say, like red light cameras and things like that that are built very robustly, I think because they know there are people out there who would be adversarial to them, you could take out one of these Flock cameras with a baseball bat [laughs].
Ben Yelin: Oh, you totally could. Yeah, they look pretty flimsy.
Dave Bittner: Yeah, yeah. I mean, just backing into one of the poles would have it, you know, looking at the sky instead of --
Ben Yelin: Now, we are not encouraging anybody to engage in any illegal --
Dave Bittner: No, no, no, no. But if civil disobedience is your thing, [laughs] -- So funny.
Ben Yelin: If you want to be subversive without doing anything illegal, you can also just, you know, flip it the bird or something.
Dave Bittner: There you go.
Ben Yelin: Yeah.
Dave Bittner: There you go. Or get yourself a spicy vanity license plate [laughs].
Ben Yelin: Vanity license plate. That's what you really need to do. Yeah.
Dave Bittner: Right. Right. Right. No, you always see the folks on the cybersecurity side, you know, trying to come up with some sort of, like, a zero-day vulnerability for these cameras, where if they view the wrong thing and then try to, you know, use optical character recognition to convert it, is there some way you could get them to crash remotely?
Ben Yelin: Yeah.
Dave Bittner: You know, that sort of thing. So. All right. Well, again, we'll have a link to this story in the show notes. Very interesting. It'll be -- This is one that's going to be interesting to track as these cameras become more ubiquitous, because they're already ubiquitous [laughs].
Ben Yelin: They sure are.
Dave Bittner: You know, what comes out of them? What, good or bad? You know, because if we can rescue a kid who's been abducted, good news.
Ben Yelin: Absolutely.
Dave Bittner: But if the -- like we've just outlined, there could be lots of potential problematic uses of this as well. So, interesting to see how, where we come out on that on balance, and how our communities respond to that. So we'll have a link to that story in the show notes, and of course we would love to hear from you. If there's something you'd like us to consider for the show you can email us. It's caveat@n2k.com. [ Music ] And that's "Caveat", brought to you by N2K CyberWire. We would love to hear from you. We're conducting our annual audience survey to learn more about our listeners. We're collecting your insights through the end of this summer. There's a link in the show notes. Please take a moment and check it out. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tre Hester. Peter Kilpe is our publisher. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening. [ Music ]

