Caveat 6.27.24
Ep 223 | 6.27.24

High-tech tales of Law and Order.

Transcript

Dave Bittner: Hello, everyone, and welcome to "Caveat," N2K CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hi there, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: On today's show, Ben has the story of a bipartisan effort to penalize platforms for hosting harmful deepfakes. And I've got the latest on cell-site simulators. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben. We've got some good stuff to cover here this week. You want to kick things off for us?

Ben Yelin: Sure. So I was planning on covering a couple of decided Supreme Court cases this week, but the two that I'm looking at are the consolidated NetChoice cases --

Dave Bittner: Mm-hmm.

Ben Yelin: -- from Florida and Texas about those laws -- state laws regulating social media companies. And then the lawsuit filed by Missouri against the federal government for big tech censorship.

Dave Bittner: Mm-hmm.

Ben Yelin: Unfortunately, the Supreme Court does not abide by our schedule and those opinions, as of press time, have not been released. But rest assured they will be released sometime this week so we will cover it on our next show.

Dave Bittner: I think you should send the Supreme Court a strongly-worded letter that --

Ben Yelin: I -- I think so, too. Yeah. I mean, if I were to send them a strongly-worded letter, maybe this wouldn't be the subject I would -- I would lead with, but you know?

Dave Bittner: Good point.

Ben Yelin: It would be the most bipartisan and non-ideological letter I could send.

Dave Bittner: Yeah. That's an interesting thing to ponder. I'm not going to put you on the spot here and -- and demand an answer from you, but what an interesting thing. If you were given ten minutes in front of the Supreme Court to just pontificate, you know, what would the -- or make your case. You know? If anybody was given that opportunity, what would you talk about? Right?

Ben Yelin: I remember actually -- this is a very quick diversion but --

Dave Bittner: Yeah.

Ben Yelin: -- when I was doing a semester program in Washington when I was in college, we got to meet Justice Alito --

Dave Bittner: Oh!

Ben Yelin: -- and they insisted you could not ask any questions about his personal life, what he likes to do in his free time, what his favorite books are.

Dave Bittner: Uh-huh.

Ben Yelin: I was kind of more interested in that. Like, I know he's not going to say anything substantive to a bunch of students --

Dave Bittner: Right.

Ben Yelin: -- so if I asked him what his views are on the major questions doctrine, he'd just, you know, give, like, a rambling answer that didn't actually give me any hint to how he would decide cases.

Dave Bittner: Yeah.

Ben Yelin: So maybe he would have talked about the fact that he's a Phillies fan. But I digress.

Dave Bittner: Right. Right.

Ben Yelin: I decided to cover the story of a bipartisan proposal in Congress to address deepfakes on popular platforms.

Dave Bittner: Mm! Mm-hmm.

Ben Yelin: And we first have to take a moment to admire the acronym here. This is truly -- truly a classic. You and I track these acronyms. It's one of our hobby horses.

Dave Bittner: Uh-huh.

Ben Yelin: And I just -- I couldn't wait to show this one to you. Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks -- the "Take It Down Act."

Dave Bittner: Oh, wow!

Ben Yelin: Wah-wah-wah!

Dave Bittner: They're going for distance. Right?

Ben Yelin: Yeah. They really -- they really drew that one out.

Dave Bittner: I saw one earlier this week and I can't for the life of me remember what it was for now, but it was -- I swear I'm not making this up -- it was called the "HOOHA Act."

Ben Yelin: Wow!

Dave Bittner: Yeah. All right.

Ben Yelin: Yeah. I would love to hear the story behind that one.

Dave Bittner: Yeah. I don't know, but that's what it was.

Ben Yelin: So this is a bipartisan proposal -- Ted Cruz is actually the lead sponsor of it -- and there are six Democrats and six Republicans who have signed on. And this bill does a number of things related to deepfakes. Social media platforms and other sites that host user-generated content would be required to have procedures in place for removing, quote, non-consensual intimate imagery, including deepfakes, within 48 hours of a victim's request, as long as the victim makes a good-faith request that they can back up with some semblance of evidence.

Dave Bittner: Hmm.

Ben Yelin: Enforcement for this provision would fall under the purview of the Federal Trade Commission, so this would be like an unfair and deceptive trade practice.

Dave Bittner: Hmm.

Ben Yelin: So that's one element of the law. It would also criminalize individuals for spreading non-consensual pornographic images or deepfakes through interstate commerce. And the reason they use that interstate commerce language is so they can have federal jurisdiction.

Dave Bittner: Oh, I see.

Ben Yelin: Otherwise it would be a state issue. But because it's the internet, of course it's going to be interstate commerce. So I think this bill actually has some legs, and I'm kind of curious whether this is going to pass before the end of this legislative session. The reason I think they've drafted this in a way that improves its chances of passing is that they have made this bill very narrowly tailored to the specific topic of harmful deepfakes. They have two sets of criminal penalties: one for deepfakes depicting children under the age of eighteen, and another for non-consensual images depicting adults. But there are carve-outs for free speech principles, especially when we're talking about images depicting adults -- so for things like parody, or if that adult has willingly exposed themselves in a public or commercial setting, so that's not really a deepfake. It's just a photo of a -- of a naked adult.

Dave Bittner: Hmm.

Ben Yelin: And then there are exceptions for academic research and law enforcement tactics to help identify the perpetrator. So if a state or federal law enforcement agency is in possession of some of these images or is searching for them on a search engine, that activity will not be criminalized. My one concern was the sort of Section 230 implications here.

Dave Bittner: Yeah.

Ben Yelin: If these companies would be held liable under this statute for having these images on their website, the statute kind of reaffirms Section 230 and says the companies themselves can't be held liable in civil court for any content moderation decisions they make vis-a-vis these images. What the enforcement mechanism comes down to is -- are they complying with these victim requests? And because that would be done under the portion of the FTC regulations that deal with unfair and deceptive business practices, I think that would pass the -- the smell test, the constitutional and -- and statutory muster to avoid these Section 230 problems. You know I'm usually a huge cynic about Congress doing anything --

Dave Bittner: Right.

Ben Yelin: -- and I will never change my opinion on that. But I have a feeling about this because I think there is a real desire to address the proliferation of deepfake images. Congress has addressed this in the -- to a limited extent in the past few years, although not in a way that would be this direct. We've seen a bunch of state legislatures start to address this issue. I know this is something that is certainly up for debate here in Maryland, so I kind of have a good feeling about this. I think by the end of the year we might see this, either as a standalone measure being enacted or part of some sort of larger legislative package.

Dave Bittner: It's interesting to me, and we -- we covered this in the past few days over on The CyberWire, and one of the elements that was interesting is that this isn't the only bill that's out there to try to address this. There was a bill introduced by Senators Maggie Hassan, a Democrat from New Hampshire, and John Cornyn, a Republican from Texas. They had a bill that would make it a criminal offense to share deepfake pornographic images and videos without consent. And then there was one from earlier this year from Dick Durbin, who's a Democrat from Illinois. He had a bipartisan bill that allowed victims of non-consensual deepfakes to sue people. Durbin couldn't get a -- a floor vote on the bill. Senator Cynthia Lummis said that it was too broad in scope and would stifle American technological innovation, which is, you know --

Ben Yelin: That's what they always say. So this -- this question, though, that you bring up about the Durbin bill is really interesting, and that's why we can't have a private right of action.

Dave Bittner: Yeah.

Ben Yelin: So the senators who proposed this particular bipartisan bill addressed that by saying that it's basically too cumbersome for most normal people to sue one of these big tech companies.

Dave Bittner: Mm-hmm.

Ben Yelin: It's time intensive. It costs a lot of money. Even if you're part of some type of class action, then your portion of, you know, whatever the damages are is going to be so minimal that it's just not worth your time. So I think in their minds the proper approach is this notification and the requirement that the image be taken down within 48 hours. That gives quicker and easier relief to the vast majority of people who just don't have the resources to initiate a lawsuit. And so I think in that respect this bill is -- is promising. You know, I think this bill kind of combines principles in the Hassan/Cornyn bill and also in the Durbin bill, save for that private right of action provision.

Dave Bittner: Mm-hmm.

Ben Yelin: But I think this kind of reflects the broad consensus on what the best policy would be to limit the spread of these deepfake images which are just becoming more and more ubiquitous.

Dave Bittner: Yeah. On that private right of action element, you know, we just recently saw Vermont's governor veto a data privacy bill for that very reason. Vermont was going to have one of the strongest data privacy bills and they were going to allow the private right of action. The bill passed their House, what was it -- 139 to 3. But the governor vetoed it saying, you know, it's too much. So we'll see if Vermont can override that veto or not.

Ben Yelin: Yeah. It's really interesting. I mean, there are interest groups aligned on either side of this here. I think the industry is adamantly against this private right of action.

Dave Bittner: Mm-hmm.

Ben Yelin: I think it would be very cumbersome for them to have to respond to all of these lawsuits when they see themselves as, you know, we're not the ones who are spreading these images and, you know, it would just be a waste of our time and resources. But I guess the positive or the pro argument for private right of action is it has the potential to give the most amount of monetary relief to victims of the distribution of these images. Part of me thinks, you know, the real problem is that the images exist in the first place. So taking them down is really the policy solution that we want. It would be nice to financially reward the victims --

Dave Bittner: Right.

Ben Yelin: -- or to have some type of declaratory judgment in favor of the victims. But, really, our interest is in trying to take these non-consensual images off of the internet.

Dave Bittner: Mm-hmm.

Ben Yelin: And if we can do so, I do think that is the most efficient avenue here. It's also the way you garner bipartisan support. You know, Ted Cruz -- despite the fact that he's a lawyer himself -- is an opponent of bills that expand the private right of action because he thinks it would lead to just over-litigation --

Dave Bittner: Right.

Ben Yelin: -- and it would only benefit the trial lawyers. And you know what? There's something to that. For all you people who complain in the comments on our website about our political bias, this is me -- praising Ted Cruz.

Dave Bittner: Wow! Mark the moment, ladies and gentlemen.

Ben Yelin: Yeah. We must mark this in the -- the annals of our podcast history.

Dave Bittner: Right. Right. Yeah. You're going to be wearing your Ted Cruz t-shirt next. Right?

Ben Yelin: Yeah! Yeah! Absolutely! Go down to Texas and introduce myself. I'd love to. If I were to meet him, I'd hope it would be in Cancun because that just sounds like a much nicer vacation.

Dave Bittner: Just bump into him on the beach.

Ben Yelin: Yeah.

Dave Bittner: Yeah, why not?

Ben Yelin: During a terrible ice storm.

Dave Bittner: Right. Right. Yeah. One thing that this reminds me of, that is -- is not a perfect analogy, but I wonder about, kind of, unintended consequences. Right? So think about over on a platform like YouTube, where you have copyright infringement issues. Okay? And so let's say that I post a song that is an original composition of mine, and you reach out to YouTube and say, hey, I want to put a copyright strike against this -- this song. Right? And you're doing it in bad faith and you're just dead wrong about it. There's no penalty for that. Right? There's no penalty for either YouTube or you for falsely claiming a copyright strike. And so obviously deepfakes are different and -- and have greater consequences for someone personally than a copyright strike. But -- I hesitate when something like this is put in place where you have, you know, 48 hours to respond. And -- and I don't -- I haven't read the bill, so I wonder if there are safeguards in place to keep people from abusing the power of this bill, right, just to either be a nuisance or a troll or, you know, who knows what.

Ben Yelin: So they just put the -- the typical lawyerly language in there --

Dave Bittner: Okay.

Ben Yelin: -- that they require the request to be made reasonably and in good faith. The requirement is that the individual submit a brief statement that the identifiable individual has a good-faith belief that any intimate visual depiction identified under the relevant clause of this bill is non-consensual, including any relevant information for the covered platform to determine the intimate visual depiction was published without the consent of the identifiable individual. So it has that language in there saying that it has to be a good-faith request.

Dave Bittner: Or what?

Ben Yelin: Or the --

Dave Bittner: Nothing, right?

Ben Yelin: -- the service doesn't have -- right! I mean, or the service doesn't have to take down the image and therefore they would not be subject to sanctions from the Federal Trade Commission.

Dave Bittner: And I guess I'm saying that, certainly in a deepfake situation, it's probably better -- well, I -- I would say it is better to err on the side of taking down the images if there's any chance that it's, you know, the kind of things that these things are. But you -- you see where I'm coming from.

Ben Yelin: I totally do, and I think there's always the risk of that.

Dave Bittner: Yes.

Ben Yelin: I think this is just the easiest solution to it.

Dave Bittner: Right.

Ben Yelin: Requiring this statement of a good-faith belief is about as -- as good as we can do.

Dave Bittner: Mm-hmm.

Ben Yelin: If you can think of a better solution to try and thread that needle, I'm sure Ted Cruz would love to answer your phone call.

Dave Bittner: No, no, I guess what I'm -- but that's not what I'm saying. What I'm saying is, I think everything that they're putting in place here is fine, but I -- but I think it would be good to have, in addition to all of this, some kind of teeth to say if you repeatedly file false takedown requests, then there could be a misdemeanor or, you know, some penalty for doing that in bad faith, because it seems to me like that's not overtly part of the legislation.

Ben Yelin: I think that's right. I think that's right. I -- I think that's a reasonable concern. There's no way to properly identify or penalize somebody. I mean, I guess there would be a way, but I think it would be hard to identify somebody who's just doing this for trolling purposes.

Dave Bittner: Yeah.

Ben Yelin: And the mechanism to do that in this bill is to question the good faith of that person.

Dave Bittner: Right.

Ben Yelin: So you'd have evidence of bad faith if a person continually was submitting these requests and, on investigation on the part of these companies, the requests turned out to be frivolous. But yes, there is no real punishment for the person making the complaints and I think that's definitely a potential weakness here.

Dave Bittner: Yeah.

Ben Yelin: I'm not sure if there's a good legislative solution for that, whether you would qualify that as, like, a federal misdemeanor. It's also hard to know -- it's hard to read a person's mind and to know whether they actually think that there are a bunch of deepfake images on the internet that need to be taken down. Whether they even have that as a subjective relief -- belief, even if it's not objectively reasonable.

Dave Bittner: Yeah.

Ben Yelin: So it'd just be very hard to enforce. But I agree with you. I mean, I think that's the worry here: you're going to get a bunch of -- of trolls who would seek to take down images that are not non-consensual pornographic images.

Dave Bittner: Right.

Ben Yelin: So I like the -- the structure of this bill that you have 48 hours for the company to go through a review before they would be subjected to any type of penalty or enforcement action from the FTC. You just hope that the requests don't get so overwhelming that these companies are unable to respond in the statutorily limited time period.

Dave Bittner: Yeah.

Ben Yelin: And that's -- that's a concern as well. But I think this is sort of their bite at the apple here of how to solve this problem. It's not going to be perfect but I think in their view it's -- it's worth trying as a way to -- to give people a course of action if they see a -- a non-consensual image of themselves online.

Dave Bittner: Yeah. I think one of the perils here is that, as you say, if they find themselves overwhelmed, the tech companies could turn to automation. And that's where -- again, when we look to the copyright situation, that's where we find, you know, some of the absurdities, where someone plays a -- a piece by Beethoven on their piano -- which obviously is in the public domain. Right? They record it at home.

Ben Yelin: It's been enough time. Right?

Dave Bittner: Yeah. And -- and they get a takedown notice from Sony Classical Music, you know, because it pattern matches one of their recordings of one of their -- one of their performers playing a classic bit of Beethoven. You know? So you end up with absurdities like that.

Ben Yelin: Yeah. I would be worried about it, and they probably will end up using artificial intelligence tools which, even if they work 90, 95% of the time, you're going to get those funny counterexamples of something that was labeled as a non-consensual deepfake that was --

Dave Bittner: Yeah.

Ben Yelin: -- just, like, a normal photo of a person smiling in public.

Dave Bittner: Yeah.

Ben Yelin: That's kind of the nature of artificial intelligence the way it is now. I just don't think it's advanced to the extent that we could 100% rely on it, and I think for something like this you really should have human eyes. But I agree that I don't think there's anything in the language of the legislation -- I did read it, although maybe I missed it -- that forbids companies from doing this type of review through the use of artificial intelligence, so they very well might resort to that.

Dave Bittner: Right. All right. Well, it's an interesting story, for sure. I -- I'm with you. It's good to see some progress here and -- and the fact that this could actually make it through Congress is, like I said, a little beacon of hope. Right?

Ben Yelin: Yeah! My one caveat, so to speak --

Dave Bittner: Uh-huh.

Ben Yelin: -- is that, even though there are seven months left in this legislative session -- which would seem like a long time to us -- Congress really isn't going to be in session that much the rest of the year.

Dave Bittner: Mm-hmm.

Ben Yelin: I think they're probably going to be done with their business by early October prior to the election, and then there's the lame duck session and they have to worry about budget stuff. So it could just be a -- an issue of time constraints. We've seen that as a problem in the past. But I think, given the bipartisan support for this, there's certainly a good chance that this gets across the finish line.

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes, and we will be right back after this message from our show's sponsor. All right. We are back. Ben, my story this week is something that is irresistible to us here. It's about the cell-site simulators -- or Stingrays as they are known -- and the folks over at the EFF have published a report looking at the next generation of cell-site simulators, these Stingray devices. Before we jump in here, Ben, do you think you can take a run at describing what these -- what the functionality of these devices has been?

Ben Yelin: Sure. So many local law enforcement agencies use Stingray or cell-site simulator devices. Basically the way these work is that they trick cell phones into sending a signal that reveals their physical location. So it acts as a cell phone tower. Your phone is always looking to connect to the nearest or most convenient cell phone tower. And a cell-site simulator is actually run by law enforcement, but they trick phones into believing this is the nearest cell phone tower. Is that an adequate explanation?

Dave Bittner: Yeah. I think so. And so it's a way of geofencing an area to see if we have someone that we're interested in and we want to track their comings and goings. And if we have a signature of their cell phone -- we know how it communicates, does its handshaking with the cell phone towers -- we can use this to track them. I guess the other way you can use it is to just cast a wide net and see who comes through --

Ben Yelin: Right!

Dave Bittner: -- any particular area.

Ben Yelin: And that's sort of the problem with this from a legal perspective: it does cast a pretty wide net. And because of the Carpenter decision, we know that courts look disfavorably on the warrantless collection of historical cell-site location information.

Dave Bittner: Mm-hmm.

Ben Yelin: And there was a case -- a ground-breaking case here in Maryland -- that held that the use of Stingrays or cell-site simulators required a warrant.

Dave Bittner: Yeah.

Ben Yelin: There has been some disagreement among state and federal courts on whether there should be a warrant requirement for the use of cell-site simulators, so there's no universal rule on it. But that's really the concern: it's generally warrantless use where there's no individualized suspicion, where you're not trying to track a specific suspect.

Dave Bittner: Yeah. I guess the other noteworthy thing about this technology is that law enforcement has been conspicuously secretive about it. Right? Like, there have been cases where if someone tries to uncover the fact that a Stingray device was in use as part of law enforcement making their case, law enforcement will drop the case rather than reveal that this was what they were using.

Ben Yelin: Right. It's so weird, isn't it?

Dave Bittner: Yeah.

Ben Yelin: Like, that they are okay potentially letting a guilty person go free if they do not have to reveal the details. It must be so valuable to their departments that they're fine just letting go of that litigation which kind of makes you even more suspicious of how they're using it and how effective it actually is.

Dave Bittner: Right. Well, does it mean that they don't want the scrutiny? That they want this to be mysterious? You know, they didn't want -- because of exactly what happened in Maryland, as you described. The requirement for a warrant certainly slows them down and makes these less useful if you're not allowed to fire it up whenever you want and you have to get a warrant first. That makes your investment less valuable in your day-to-day law enforcement.

Ben Yelin: Totally. And the big risk is that if there's some misconduct in these prosecutions because of the use of cell-site simulators, then it allows any enterprising defense attorney to reopen hundreds, if not thousands, of cases, which obviously the state is not going to be pleased with. And that happens in other contexts. If there's some state's attorney who's been abusing his or her power and that's uncovered several years later, then every single person who was tried in a court of law by that state's attorney is going to challenge the circumstances of their conviction.

Dave Bittner: Right.

Ben Yelin: So that's their other concern is if it's discovered that they did some shady stuff and that shady stuff was part of the reason why a bunch of criminals were convicted, they don't want those convictions to be challenged.

Dave Bittner: Yeah. So this article from the EFF, the Electronic Frontier Foundation, is kind of an update on the newest generation of these devices. And they highlight devices from an organization called Jacobs Technology. And they found that the Massachusetts State Police had awarded a contract worth nearly a million dollars to Jacobs Technology for this technology. Interestingly, it includes -- you buy one of these and you get yourself a Chevy -- a Chevy Silverado, Ben, because --

Ben Yelin: It comes with a -- with a free Silverado?

Dave Bittner: Well, kind of. It's integrated into the Silverado. And, again, to the secretive nature of this, the Silverado -- they install a false roof on the Silverado to hide the antennas that this requires to function.

Ben Yelin: I mean, I'm sure it doesn't sound weird to people who are in law enforcement.

Dave Bittner: Right!

Ben Yelin: We are not, so it just sounds like extra suspicious to me.

Dave Bittner: Yeah.

Ben Yelin: Yeah.

Dave Bittner: Yeah. These new systems are capable of operating on 5G networks, as well as some of the older cellular standards. My recollection of part of how this works is that all of our cellular devices are backwards compatible. In other words, if you can't get a 5G connection, your phone will look for an LTE connection.

Ben Yelin: Right.

Dave Bittner: If it can't get a --

Ben Yelin: Ever notice that those never work, though? It's, like, when we actually had LTE we could get reasonable service.

Dave Bittner: Right.

Ben Yelin: But if we connect these days to an LTE tower -- nothing. You get nothing.

Dave Bittner: Well, so it can fall back to even some of the older standards. And -- and some of the older standards have been sunsetted so that they don't even work anymore. But as you go back with each generation of cellular standards, the security gets weaker and weaker. So part of what goes on here is that if this tower simulator can trick your phone into use -- using one of the lower-level standards, then it can take advantage of the fact that it has weaker security than, say, modern 5G does. Right?

Ben Yelin: Yeah.

Dave Bittner: Very clever.

Ben Yelin: Very, very clever.

Dave Bittner: Yeah.

Ben Yelin: I mean, they're running a multi-million dollar industry here and they are improving the products. And it seems like if this is -- this one company is selling equipment to thirty-five state and local police departments, and over three hundred departments globally, they must be doing something right. And I think it's because they're innovating here.

Dave Bittner: Yeah. So what's the danger here? I mean, you -- you -- people wonder what's the big deal? Right? And one of the things that this article points out, that we've talked about before, is that this has the potential to interfere with emergency services. If you're in the zone where one of these devices is being used and you call 9-1-1, the call may not go through.

Ben Yelin: Right. Because you're routed to the cell-site simulator instead of actually to the 9-1-1 call center.

Dave Bittner: Right.

Ben Yelin: Yeah, I mean, I think that's certainly a risk. I tend to focus less on the practical risks, which is just not as much in my wheelhouse --

Dave Bittner: Yeah.

Ben Yelin: -- and more on the legal risks here -- or the civil liberties risks, to be more accurate -- which is that you're capturing nearby phones of unrelated individuals who are not suspected of any crime, but who can be identified by their location in this method that -- that's warrantless. And unlike historical cell-site location information, the data is not sitting with a private communications company. It's sitting with a law enforcement agency.

Dave Bittner: Mm-hmm.

Ben Yelin: So you just have this data. They don't have to request it. They don't have to get a subpoena to go to your cell phone provider. They just have information on where you were at a -- at a particular time because your phone has been tricked into submitting your location to this cell-site simulator. So that seems to be the biggest risk to me. But, yeah, I mean the fact that this could potentially interfere with 9-1-1 calls seems really, really bad.

Dave Bittner: Yeah.

Ben Yelin: And is that a risk worth taking on behalf of these law enforcement departments? It seems to be, given the fact that we know that so many departments are -- are spending a good deal of money on this and that companies are innovating in ways that make cell-site simulators more effective.

Dave Bittner: I also wonder, even -- we talk about with requiring a warrant --

Ben Yelin: Right.

Dave Bittner: -- which, in many of the conversations you and I have, for me anyway, that to me is a good standard. You know, there's a lot of these personal privacy things where if you get a warrant then I think that's reasonable.

Ben Yelin: Right.

Dave Bittner: But what I wonder, as you were talking about, you know, Maryland has a warrant requirement. But there's a difference between having a warrant to listen to Ben Yelin's phone conversations. Right?

Ben Yelin: It wouldn't be that interesting, but sure.

Dave Bittner: Right. Lots of sports talk. You know? But that's different than getting -- than the warrant covering Ben Yelin's entire neighborhood, including the conversations that you might be having.

Ben Yelin: Yeah, so, I mean, just to be clear here, when we're talking about cell-site simulators, there's no content of conversations that's being recorded.

Dave Bittner: Right. It's the location information.

Ben Yelin: It's the location information.

Dave Bittner: Yeah. Thank you for making that --

Ben Yelin: Now that, in and of itself, may or may not be potentially revealing, depending on your perspective.

Dave Bittner: Right.

Ben Yelin: The only perspective I necessarily care about here is the Supreme Court's perspective --

Dave Bittner: Hmm.

Ben Yelin: -- where they say that cell-site location information in the aggregate is of a nature that it's so deep, it's so broad, and it's so involuntary that it does invoke Fourth Amendment protections. And if that's true for collecting that information directly from the telecommunications companies, then I think it's extra true given that this is law enforcement collecting the data themselves.

Dave Bittner: Yeah.

Ben Yelin: So I just think Carpenter is kind of the north star case here for how I think courts should see this issue. The problem with Carpenter is there aren't a lot of very clear, delineated lines.

Dave Bittner: Mm-hmm.

Ben Yelin: So in that case it was historical cell-site information lasting for a period of seven plus days. What happens if it's eight days? What happens if it's four days?

Dave Bittner: Right.

Ben Yelin: Courts have been arguing about these edge cases across the country over the last six years since the Carpenter decision was released. But I think the principle really holds true where a person does have a -- a reasonable expectation of privacy in the whole of their own movements. If there's just one cell-site simulator, maybe that's not that much of a problem. But if they are positioned everywhere so that they are capturing a significant amount of data, then I think that does tend to invoke those Fourth Amendment protections. Another analogy -- and -- and I'm sorry to be so Maryland-centric on this --

Dave Bittner: Yeah.

Ben Yelin: -- was the Baltimore spy plane.

Dave Bittner: Oh, yeah.

Ben Yelin: So there's a plane, if you remember --

Dave Bittner: I do, yeah.

Ben Yelin: -- you could hear it buzzing in -- in Baltimore. It flew around in concentric circles around downtown during the daytime. It would take real-time images. A group of activists calling themselves "Leaders of a Beautiful Struggle" brought a federal case on it. It went to the Fourth Circuit Court of Appeals, and that Court of Appeals held that this was a Fourth Amendment violation and it -- because it was analogous to Carpenter --

Dave Bittner: Hmm.

Ben Yelin: -- that it captured -- it could potentially capture the whole of a person's movement because it's taking all of these real-time photos. So I -- I -- I think you're in that realm of when the collection becomes large enough that, at least hypothetically, you could track a person's physical movements over a period of time --

Dave Bittner: Yeah.

Ben Yelin: -- that's when the Fourth Amendment button turns on. And I think it's been proven in court that that's what federal courts and state courts have started to think when it comes to these cell-site simulators. So I think they have to be deployed very strategically so that law enforcement aren't just putting them at every major intersection or every major busy neighborhood so that a person's movements could be tracked over a significant period of time.

Dave Bittner: Right. They become like speed cameras.

Ben Yelin: Exactly! Which, of course, those already do exist.

Dave Bittner: Right. Right.

Ben Yelin: But you and I would never get caught on one of those screeners -- speed cameras. Right?

Dave Bittner: No, there's -- well, so, there's one in my neighborhood right now with -- like, there's a crosswalk, you know, to the local elementary school.

Ben Yelin: How many tickets have you gotten?

Dave Bittner: You know what? I've never gotten a speed -- ah! I don't think I've ever gotten a speed -- no, actually that's not true. I've gotten one speed camera ticket.

Ben Yelin: I've gotten like twenty of them, by the way.

Dave Bittner: But I think our county where we -- where we live and where -- well, where I live and where we're recording here, they actually post the location of the speed cameras, where they're going to be. That's, like, part of the -- the local legislative agreement that they made with the county council. It's, like, okay, you want to have these speed cameras, and they're for -- they put them in school zones. But you have to let the -- let the public know where they're going to be every week.

Ben Yelin: I think that makes a huge difference.

Dave Bittner: So they do that --

Ben Yelin: Because then people have an expectation --

Dave Bittner: Right. But what if you did that with Stingrays? Right? What if there were a requirement that if you're going to use one you had to have a public notice. Right?

Ben Yelin: I think that it would put any Stingray program on firmer constitutional ground, but it would also make Stingrays way less effective.

Dave Bittner: Yeah.

Ben Yelin: Because then a criminal would be, like, all right. For this drug deal, let's go to the -- let's go to the location where there are no Stingray devices.

Dave Bittner: Yeah. Let me pull up my Stingray RSS feed and see where they are this week because you know somebody would -- would open source a Google map that would have the -- where they are this week. The other thing that rubs me the wrong way about all this is -- is the degree to which the FCC turns a blind eye to this, how they just -- you know, this is -- someone introducing a device whose function is to interfere with radio signals.

Ben Yelin: Right!

Dave Bittner: Right? Like --

Ben Yelin: Which they would freak out if it was anybody besides law enforcement doing that.

Dave Bittner: Exactly.

Ben Yelin: Yeah.

Dave Bittner: And -- and the -- the experts I've spoken to, I've asked specifically about this question and they have said that the FCC allows law enforcement and the military to do these sorts of things that they -- they generally, you know, turn -- turn a blind eye to it and allow it to happen. But I don't know -- it just -- it just irks me a little bit.

Ben Yelin: Yeah. And in some sense I understand that law enforcement and the military are special in a lot of ways.

Dave Bittner: Right.

Ben Yelin: But, yeah, it also kind of rubs me the wrong way, too --

Dave Bittner: Yeah.

Ben Yelin: -- that they're -- there's just kind of this tacit understanding that we'll look the other way --

Dave Bittner: Yeah.

Ben Yelin: -- when they would freak out if anybody else was covertly messing with radio communications.

Dave Bittner: Right. Sure. I want to set up my pirate radio station and they're on me. They're doing a fox hunt and having me shut down in no time at all.

Ben Yelin: Yeah. The FCC would be coming to your house.

Dave Bittner: That's right. That's right. All right. Well, we will have a link to this story from the EFF in our show notes. And, of course, we would love to hear from you. If there's something you'd like us to cover on the show, you can email us. It's caveat@n2k.com. And that is "Caveat," brought to you by N2K CyberWire. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly-changing world of cybersecurity. If you like our show, please share a rating and review in your podcast app. Please also fill out the survey in the show notes, or send an email to caveat@n2k.com. We're privileged that N2K's CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams while making your teams smarter. Learn how at N2K.com. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tre Hester. Our executive editor is Brandon Karpf. Peter Kilpe is our publisher. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.