Caveat 12.12.24
Ep 243 | 12.12.24

Living in the shadow of AI borders.

Transcript

Petra Molnar: It's an interesting moment in time right now to try and get a sense of what the governance regime looks like. And, from a border technology perspective, you know, there is not a lot of lawmaking. But I think we can even extend that into the conversation around tech governance generally.

Dave Bittner: Hello, everyone, and welcome to "Caveat," N2K CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.

 

Ben Yelin: Hello, Dave.

 

Dave Bittner: On today's show, Ben has an update on the effort to ban TikTok in the United States. I've got the story of the NYPD using citywide surveillance to help catch the UnitedHealthcare CEO killer. And, later in the show, my conversation with Petra Molnar, Harvard faculty associate, lawyer and author of the newly released book "The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. [ Music ] All right, Ben, lots to share this week. Why don't you start things off for us here?

 

Ben Yelin: So, I'm not sure it's like really gotten into people's heads that TikTok might be banned in a month and a half in the United States.

 

Dave Bittner: Okay.

 

Ben Yelin: And we got a really critical update on that from the District of Columbia Circuit Court of Appeals, which upheld the law that was passed earlier this year that would ban TikTok unless it is purchased by a U.S. entity. So, unless ByteDance divests from the company, ByteDance being a company that, at least allegedly, is controlled by the Chinese government.

 

Dave Bittner: Right.

 

Ben Yelin: So, the way the law works is that TikTok will be banned in app stores as of January 19th. If Apple or Google has the application in their app stores after that time, they will face significant penalties. Of course, the question is whether this law is going to be enforced in the first place. And I think that's something that we'll get into. The president at that time, who will still be Joe Biden -

 

Dave Bittner: By like one day. Right?

 

Ben Yelin: By one day. Yeah.

 

Dave Bittner: It's going to be like one day.

 

Ben Yelin: Has the option -

 

Dave Bittner: Yeah.

 

Ben Yelin: Of, if he thinks that there's a chance for divestment from ByteDance, then he can extend the deadline by 90 days.

 

Dave Bittner: Oh.

 

Ben Yelin: And, obviously, they picked that January 19th date for that purpose.

 

Dave Bittner: Okay.

 

Ben Yelin: So, this case went directly to the D.C. Circuit Court of Appeals. I know we covered it when we were talking about oral arguments and it seemed like TikTok might fare poorly. And, it turns out, that they did fare rather poorly. This is a three-judge panel, two Republican appointees, one Democratic appointee. The decision was written by Senior Judge Douglas Ginsburg, who I think we mentioned last time, a Supreme Court nominee briefly in the 1980s before he was caught for, gulp, smoking marijuana.

 

Dave Bittner: Ohh!

 

Ben Yelin: How dare he?

 

Dave Bittner: Well, things change.

 

Ben Yelin: Yeah. So, we're talking about, you know, 35-some-odd years ago and -

 

Dave Bittner: Right, right.

 

Ben Yelin: Our standards have drastically shifted.

 

Dave Bittner: Yeah.

 

Ben Yelin: But he wrote this opinion and basically said that Congress did its homework. If I had to boil down the crux of the decision here, that's what he's saying. In order to pass any restriction on speech, there is going to be a heightened level of scrutiny on that law. Now, there is a question as to whether intermediate scrutiny applies or strict scrutiny applies. If it is a content-neutral restriction on speech, generally, intermediate scrutiny applies and the government would have a better chance of winning that case.

 

Dave Bittner: Example, please?

 

Ben Yelin: So, a content-neutral restriction on speech would be nobody is allowed to stand outside a private residence with a bullhorn at 3:00 in the morning.

 

Dave Bittner: I see.

 

Ben Yelin: It has nothing to do with what you're saying. You could be reading the phone book or you could be talking about Israel and Palestine.

 

Dave Bittner: Right.

 

Ben Yelin: A content-based restriction on speech is the opposite where you are restricting the speech based on its content. We've never had a case like this and I think it's actually legitimately unclear whether this is a content-neutral or content-based restriction on speech. It's content neutral in the sense that all of TikTok would be banned. So, whether you're making videos on your favorite musicians or you're making videos on the Chinese government and their spying techniques, if the law goes into effect, that will be disallowed no matter what the speech is. But there are other elements to this law. And one of the justifications is that we're worried about China kind of having malign influence and through their influence over ByteDance messing with the algorithm so that people get more pro-Chinese content. And, if that allegation is true, that would implicate a content-based restriction on speech, right, since that's specifically about China, specifically about some type of content. What the court does here is say, "We're just as confused as you are as to which level of scrutiny applies. So, just for the sake of argument, we are going to apply strict scrutiny. Let's apply the strictest judicial standard on this law and see what happens. We're not saying that strict scrutiny should apply, but we're just going to go for it."

 

Dave Bittner: "We're going to take the hard route."

 

Ben Yelin: The hard route here.

 

Dave Bittner: Right. Okay.

 

Ben Yelin: In order to pass strict scrutiny, the government has to have a compelling interest and the means of achieving that interest have to be narrowly tailored to achieve that goal. Basically, there have to be no other viable alternatives. The government here has a compelling interest. We don't want China to have private information on the millions of Americans who use TikTok and we don't want the Communist Party of China to influence us through manipulation of the TikTok algorithm. So, that portion of strict scrutiny I think is pretty well established. The real question is, "Are there other ways we could achieve this goal?" And I think the court is persuaded that there really aren't any other ways. This is a very specific, narrowly targeted law. It only targets this application. It doesn't target any other social media sites that are not primarily owned by Chinese companies. People have a lot of other avenues, not only for speech generally, but for the type of content that's posted on TikTok. You could use Instagram Reels or Facebook Reels. And they are persuaded that there is no other way to achieve the goal of protecting our national security other than a full ban; everything else would be completely impractical. They look at a bunch of other alternatives. There were discussions between the government and TikTok about having this national security agreement where data would be housed in the United States and they'd get a third party, Oracle, to audit the data to make sure none of it was getting sent to China. The court here, as was the case in Congress, is unconvinced that that would be workable. There are too many workarounds that would allow our data to go to the Chinese government. So, therefore, this law does satisfy strict scrutiny. The government - or Congress put together a very robust record based on classified briefings that they have received that this is a real danger.
And, as a result, as of now, it appears as if TikTok is going to be outlawed in the United States by January. So, I know you're going to ask what the next step is here. I'm getting really good at predicting your next questions.

 

Dave Bittner: Am I that predictable? All right, go ahead.

 

Ben Yelin: So, I'll go there. The plaintiffs in this case, which are a bunch of TikTok users and organizations like the ACLU, obviously, along with TikTok itself and ByteDance, they are going to appeal this to the United States Supreme Court. What the Supreme Court could do is issue a temporary injunction against enforcement of the law while they hear the full case on the merits. What that would mean is we would be going past that January 19th deadline. So, let's say the Supreme Court weighs in on January 3rd and says, "We're going to temporarily put a hold on enforcement of this law, we're going to have full oral arguments and a decision over the summer," that would be hugely significant because, if that injunction is issued, the decision then falls on President-elect Trump if the law is upheld. And Trump, despite trying to ban TikTok when he was last president in 2020, seems to be against it now. It's always hard to know what his view is on a particular subject. He has kind of gone back and forth on this. He is a self-described China hawk, but, also, he made promises to a lot of his constituents that he was not going to allow TikTok to be banned, including some major investors into his campaign. So, my guess is, if there is that injunction and if the law is upheld, Trump might choose to simply not enforce it. Meaning his Department of Justice would not pursue civil or criminal penalties against the app stores for including TikTok in their app stores. And that could be kind of a way to nullify this law without Congress having to overturn it. That itself would create a lot of practical difficulties. It's also possible Trump could try and convince Congress to reverse itself, which seems unlikely since, yes, we're going to have a new Congress, but they're not that different than the previous Congress. And -

 

Dave Bittner: Right.

 

Ben Yelin: They have this whole record of why TikTok is a significant danger. So, we don't know exactly what's going to happen here. I'm surprised that there isn't more of a panic among our Gen Z and Gen Alpha followers that their beloved application might actually be banned. But that's where we are here with a month or so to go until that January 19th deadline.

 

Dave Bittner: All right, I have a couple questions.

 

Ben Yelin: All right, let's do it.

 

Dave Bittner: Do we know - when the ban goes into effect, you specifically said the app comes off of the app stores. Does the app disappear off of people's mobile devices and do they lose access to the app's central servers?

 

Ben Yelin: I believe the way it would work practically is it's enforced through the app stores, but, eventually, the old versions of TikTok would become obsolete for one reason or another. Like, eventually, an update is required -

 

Dave Bittner: Yeah.

 

Ben Yelin: And if you're never able to secure that update, then the app is going to be insecure or might otherwise not qualify for use on your device. So, I think - I don't - I'm not 100% sure about this. I don't think they're going to go into your device and delete the app for you.

 

Dave Bittner: Right.

 

Ben Yelin: I think the way it's going to be enforced is through bans on this application being downloaded through the app store and they're going to hope that just kind of takes care of itself over time.

 

Dave Bittner: Huh, okay, that's interesting. The other thing I've seen a lot of comment and concern about with this is how much of the back story of this is wrapped up in classified briefings.

 

Ben Yelin: Sure.

 

Dave Bittner: And people saying the senators involved - senators, Congress people involved saying, "We've been briefed. Trust us, it's bad."

 

Ben Yelin: Yeah.

 

Dave Bittner: When you have a decision like this with the Circuit Court judges, do they get to review the classified information or do they simply have to trust the people who tell them, "Trust us, it's bad"?

 

Ben Yelin: I don't believe they are privy to the same classified information that Congress is receiving. And I don't think they need to be privy to it to adjudicate this case. They, through judicial precedent, are supposed to give deference to findings on national security by both the Executive Branch and Congress. That's the way these cases have worked in the past. If there is a robust record and if Congress has sufficiently convinced the court that they've done their due diligence, even if the information is classified, I think it is customary for the courts to accept those findings.

 

Dave Bittner: Okay. And we're relying on Congress to do that oversight?

 

Ben Yelin: We are relying on them to do that oversight. And one of the things that Judge Ginsburg said in his decision is this was a long process. There is a period of review before this legislation was passed going back months and even years. This is a bipartisan process. It's not like Republicans narrowly passed this law and it was signed by a Republican president. We have a split Congress. The law was signed by a Democratic president. So, that's more persuasive to him that this isn't some sort of partisan witch hunt that -

 

Dave Bittner: Right, yeah.

 

Ben Yelin: Congress really did consider the depth of this problem. And, without knowing 100% of the details, they get deference as to a determination on threats to our national security.

 

Dave Bittner: It's interesting to me, I'm just - again, I'm thinking about oversight and I guess fairness because let's say, for example, I think you mentioned like the ACLU is one of the plaintiffs here, well, they don't have access to the information that they're trying to counter. Right?

 

Ben Yelin: Sure.

 

Dave Bittner: When it comes to national security, you just have to - and I get it. You know, I mean, there's practical limitations here.

 

Ben Yelin: The ACLU will never accept the "just trust us" standard.

 

Dave Bittner: Right.

 

Ben Yelin: And, you know what, more power to them -

 

Dave Bittner: Right.

 

Ben Yelin: Because there has to be somebody out there who will always say, "I don't trust the government to tell me the truth."

 

Dave Bittner: Right.

 

Ben Yelin: Because it's - the government has lied to us in the past many times. So, I'm glad that they are joining this litigation. That's not generally how it works in court, though. But, yeah, I mean, if you were to look at it completely objectively, it is kind of questionable and unfair that we are just deferring to evidence that none of us have seen.

 

Dave Bittner: Yeah.

 

Ben Yelin: We are trusting members of Congress. I think the court is persuaded by the process that there were these classified briefings, that it was bipartisan, that they did a long investigation, that this wasn't something that popped up in two to three weeks, that this is something that had been building for a long time. That lends credence to the idea that this is well thought out. But -

 

Dave Bittner: Okay.

 

Ben Yelin: That's all we have. Of course, the ACLU could be right and the government could be lying to us. It's just - it's very hard to know.

 

Dave Bittner: Yeah, yeah. And I guess that's a - you know, it's one of those time will tell kind of things because we know that, in the past, the government has lied to us, thanks to the work of folks like the ACLU and the media. Right?

 

Ben Yelin: Yeah. And, you'll note, there are a lot of comments in opposition. We'll post an article in the show notes. There's a lot of comments in opposition from organizations like the ACLU and the Knight First Amendment Institute at Columbia University and a lot - other electronic privacy groups saying that despite what the court says this is an unconstitutional burden on free speech on behalf of TikTok itself and on behalf of its users. But they lost in court. So, now it will go up to the Supreme Court. I have no idea how the Supreme Court would rule on this on the merits. It's such a unique and novel case. We've never tried to do anything like this in the past.

 

Dave Bittner: Yeah.

 

Ben Yelin: So, my guess is as good as anybody else's how this would fare at the Supreme Court. We'll just have to see if it gets there. Now, the Supreme Court could just reject the case. They could choose not to hear the case and then the law would go into effect January 19th and Trump would have to pull some kind of maneuver when he becomes president to resurrect this application. It's possible he figures something out, but -

 

Dave Bittner: If he wants - if he even wants to. Right? I mean -

 

Ben Yelin: If he wants to, yeah.

 

Dave Bittner: Yeah, he might not want to touch it.

 

Ben Yelin: That, again, is very unclear. And one of the funny things in this article is like all of these academics and experts on both the law and the kind of social dynamics of TikTok are like, "Well, we don't really know what's going to happen because we don't know which Trump we're going to get."

 

Dave Bittner: Right.

 

Ben Yelin: "Are we going to get the China hawk or are we going to get the guy who promised to his younger fanbase during this election season that he wouldn't allow this popular application to be banned?"

 

Dave Bittner: Right.

 

Ben Yelin: And he's switched his opinions on things in the past.

 

Dave Bittner: Yeah.

 

Ben Yelin: In some ways, I think that's been good for him politically, just kind of forget that you are reversing yourself and do the thing that's more popular in this moment. I think that's actually a pretty good political instinct. But it does mean that it's hard to figure out ahead of time exactly what he's planning on doing.

 

Dave Bittner: Right, yeah. He has - he's definitely - is it fair to say that over the years and during his past administration he has successfully weaponized his unpredictability?

 

Ben Yelin: Totally.

 

Dave Bittner: Yeah.

 

Ben Yelin: And it's actually worked well for him -

 

Dave Bittner: Right.

 

Ben Yelin: In a bunch of different circumstances including, and, you know, I might get pushback from our Democratic listeners here, in foreign policy. I actually think it's - his unpredictability has ended up working to our geopolitical advantage. If you look at something like North Korea, like we did almost sort of normalize relations with them because Trump is such a loose cannon he went off joking about Kim Jong Un's -

 

Dave Bittner: Falling in love.

 

Ben Yelin: What was he saying about him?

 

Dave Bittner: [inaudible 00:17:41] in love.

 

Ben Yelin: Yeah, "little rocket man."

 

Dave Bittner: Right.

 

Ben Yelin: And I think all of these potential foes of the United States are like, "We don't know if this guy is serious. He seems like the type of dude that might set off nuclear weapons. Maybe we have to sit down and take him seriously." So, I think that type of loose cannon behavior has actually worked to the country's advantage in a limited number of circumstances.

 

Dave Bittner: Interesting, yeah. All right. Well, the clock is ticking, it's TikToking.

 

Ben Yelin: It's TikToking. Ah, that was right there for the taking, Dave.

 

Dave Bittner: Oh, so obvious -

 

Ben Yelin: Yeah.

 

Dave Bittner: So obvious and, yet, so good. All right. We'll have a link to the story in the New York Times. My story is also from the New York Times this week. And, actually, this comes by way of your professorial man crush -

 

Ben Yelin: Yeah.

 

Dave Bittner: Orin Kerr, Professor Orin Kerr.

 

Ben Yelin: Sort of friend of the pod.

 

Dave Bittner: Sort of friend, yeah.

 

Ben Yelin: He one time said that he listened to our segment and enjoyed it. Of course, that was when we were talking about his book. But -

 

Dave Bittner: Ah, okay.

 

Ben Yelin: I'll take it.

 

Dave Bittner: Yeah, well, Orin, if you're listening, we would love to have you as a guest. The door is always open.

 

Ben Yelin: We will drop everything -

 

Dave Bittner: Yeah.

 

Ben Yelin: I promise.

 

Dave Bittner: If you want to see Ben squee with joy -

 

Ben Yelin: Yeah.

 

Dave Bittner: Come, please. As a gift, come allow Ben to speak with you. That would be wonderful. So, actually, I saw this on social media. Orin was talking about this article. This is an article from the New York Times and they are talking about the attempts by the NYPD to track down the UnitedHealthcare CEO shooter and the degree to which the NYPDB - the NYPD is using this huge video surveillance network that just blankets New York City and that they have access to.

 

Ben Yelin: Can I just briefly, before we get into the serious details of this, go through two kind of humorous elements of it? The first is that the person who's been wearing a mask throughout the entire time that he's been on the run from police apparently lowered his mask at a youth hostel because the clerk at the hostel was flirting with him. And that's the only reason why we have an image of his face. And then the second thing that just really got me was they have video of him going into Port Authority in New York City and the quote from law enforcement was like, "Well, we have video of him going in, but not of him going out. So, we are assuming at this point that he got on a bus." I was like, "Wow, guys, that is some good detective work. You are making the big bucks."

 

Dave Bittner: Right, right. Well, over on the old site that we do not name -

 

Ben Yelin: A site that shall remain nameless.

 

Dave Bittner: That's right. Orin Kerr says, and I'll quote him here, he says, "I take the view of some Fourth Amendment scholars to be that this camera system should be considered illegal, although I'm curious if those who think that believe its use should be allowed with a warrant - and, if so, what a particular warrant for this would look like." So, Ben, let's start there. We - the NYPDB - the NYP - I keep wanting to say "NYPD Blue" because somehow it's just programmed in my mind -

 

Ben Yelin: It's a great show, yeah.

 

Dave Bittner: To connect those two words together. The NYPD has access to this network of cameras that are public and private cameras. They have the - they have access to feeds to all of these. And some folks have a problem with that. What do you make of Professor Kerr's question here?

 

Ben Yelin: I think it's a great question. There really are two questions here. The first is whether what's going on, this type of mass surveillance around New York City, qualifies as a Fourth Amendment event, if you will, a search or a seizure.

 

Dave Bittner: Right.

 

Ben Yelin: And that gets at whether this individual has a reasonable expectation of privacy. Generally, when you are in public view, you do not have a reasonable expectation of privacy. And I think that's kind of the letter of the law at this point. If you are peering into people's homes or if you are listening in on people's private spaces, that triggers Fourth Amendment protection. But if you are out in public where you can be seen by law enforcement officials as well as by surveillance cameras, you have forfeited that expectation of privacy. Where it gets a little tricky is when we have this type of mass surveillance where it's not just one or two cameras, it's a network of cameras. It's a network of cameras that's running 24/7 that's following the whole of a person's movement.

 

Dave Bittner: Right.

 

Ben Yelin: There are questions as to whether the fact that this is all encompassing makes this a search for Fourth Amendment purposes. And I think there are unsettled views on this. The Fourth Circuit in a case dealing with our home state and my former home city of Baltimore City evaluated a program where there was a low-flying plane that circled the city and took real-time photos -

 

Dave Bittner: Yeah.

 

Ben Yelin: During daylight hours. The case was called Leaders of a Beautiful Struggle v. Baltimore Police Department. And, in that case, the Fourth Circuit held that this was an unconstitutional search and seizure because this was similar enough to Carpenter v. United States where we were unconstitutionally following the whole of a person's movement, which is impermissible without a warrant. Now, that's the second question. Let's say this is a search and a warrant is required. What in the world would that warrant look like? That's the best question Kerr asks here. Typically, with a warrant, you need some level of particularity. So, you have to describe the place to be searched, the things to be seized, et cetera.

 

Dave Bittner: Right.

 

Ben Yelin: It's hard to have that level of particularity. The only level of particularity we have is that we want video evidence on this person, but we don't know where he's been, where he's going, how many cameras have been triggered. It gets to the point where it starts to feel like a general warrant that is not limited by particularity, which is required in Fourth Amendment jurisprudence. So, I have to say, I don't know how you would draft an applicable warrant here. Good thing there are better lawyers out there than me who are probably figuring out a way to do it. But you have this kind of catch-22 where the particularity relates to the individual, but the Fourth Amendment requires you to describe the place to be searched or the things to be seized. And neither of those I think you could really narrow down here given that we have almost no information on where this guy is. So, that's why it's so interesting. I don't know how you would draft a warrant in these circumstances.

 

Dave Bittner: Huh. Yeah, I mean, they talk about the city has over 60,000 cameras that the police have access to and they say that there are currently hundreds of detectives working on the case going through hundreds of hours of video to get the little snippets that could be usable. You know, I guess it's notable that a case this high profile triggers this larger response. Right? There are sadly probably people who get killed every week in New York City and it doesn't trigger this kind of response.

 

Ben Yelin: Yeah, this was a high profile killing and it wasn't -

 

Dave Bittner: Right.

 

Ben Yelin: It was - it happened in broad daylight and it was done in cold blood. I mean, it was just shooting an executive as he exited a hotel. So, that's obviously high profile enough that they are using all of their resources to get to the bottom of this. And we potentially still have a suspect here who is at large and armed and dangerous. So, there is that element of it, which is not always true when we're talking about homicides. If it's something that's limited to a particular area or a person doesn't have the means to escape, then it can be easier. But when we're talking about this guy has been seen all over the city and now potentially has been seen, if the NYPD sleuths are correct, getting on a bus, leaving Port Authority, he could be anywhere in New Jersey, Connecticut or really anywhere else around the country. So, we have to use all of the law enforcement tools at our disposal to conduct this really mass search. It's unclear in Fourth Amendment jurisprudence where the line is between, well, he's out in public, so he doesn't have a reasonable expectation of privacy versus there are - how many cameras did you say in New York City?

 

Dave Bittner: Sixty thousand.

 

Ben Yelin: Sixty thousand. So, it's just impossible to have any sort of privacy when you're out and about. And that's something that could not have been contemplated by our founding fathers when they drafted the Fourth Amendment.

 

Dave Bittner: Right.

 

Ben Yelin: It's hard to know where you draw the line there.

 

Dave Bittner: Right. And when - like when you talk about a warrant in the pre-camera days, it would - correct me if I'm wrong, it would not have been possible to have a warrant that says, "I want to search every public place in the city."

 

Ben Yelin: That would not meet the particularity requirements.

 

Dave Bittner: Right.

 

Ben Yelin: Yeah.

 

Dave Bittner: But, basically, that's what we're doing.

 

Ben Yelin: That is exactly what we're doing. Yeah. I mean, if you look at the old British cases, it was always the king via his subjects requested a general warrant to go into Dave's house and see what they can find. They didn't suspect that you had anything or they didn't put on the record that they had any suspicion. It was just, "Let's go to his house and whatever's there, you know, we'll prosecute him on it."

 

Dave Bittner: Right.

 

Ben Yelin: And that I think bears some similarity to what's happening here. It's like it's so broad that it's every single camera in the city. And that goes against the spirit of our Fourth Amendment, which calls for that level of particularity. But like take away the legalese, if we have a high-profile shooting done in broad daylight in cold blood, don't we kind of want to have this network of surveillance cameras at our disposal? Isn't it better for society? I don't think there's a great answer to that question. You have to balance the costs and benefits of it. But -

 

Dave Bittner: Yeah.

 

Ben Yelin: It's a really, really tough question.

 

Dave Bittner: I wonder if the whole notion of a reasonable expectation of privacy needs to be revisited because -

 

Ben Yelin: That's one of my hobby horses.

 

Dave Bittner: Is that right?

 

Ben Yelin: It is.

 

Dave Bittner: Well, like we were talking about the thing over Baltimore with the airplane, it seems to me like one of the issues there or maybe the key issue there was the ability to rewind, to have basically this DVR -

 

Ben Yelin: Yeah.

 

Dave Bittner: Of life. And, so, if you're not expected - you're not expected to have privacy in public, but that was in a time when you couldn't just go anywhere and rewind everything that happened in public. You know, like we've talked about before, there'd have to be some sort of effort, you'd have to have - you'd have to have -

 

Ben Yelin: Like every -

 

Dave Bittner: Somebody tailing somebody or -

 

Ben Yelin: You'd have to put a law enforcement officer on every city block -

 

Dave Bittner: Right.

 

Ben Yelin: And outside every building.

 

Dave Bittner: Right.

 

Ben Yelin: You can't conjure up 60,000 law enforcement officials to stand at street corners and still have a functioning New York City Police Department. So, it would be impractical. I hope the Supreme Court, which hasn't taken any Fourth Amendment cases, gives us some clarity on this because right now it's really unclear. The only clue we have is from Carpenter where they said, "Well, a collection of historical cell site location information going back as much as seven days, that does trigger Fourth Amendment protection." But there's no clarity on how that would apply in a bunch of different circumstances. And I think it's incumbent upon the Supreme Court to clarify that for us. So, Supreme Court, if you're listening, take up a Fourth Amendment case, settle this question for us and we will be pleased to discuss it on our podcast.

 

Dave Bittner: There you go. There's a couple other interesting tidbits from this article that I think are worth sharing. The folks from the police department say that their camera network can use search terms, in this case things like "backpack" or "bicycle," but they said that those types of queries turn up so many false hits that they're not particularly useful. And they also said that facial recognition software was not helpful because they do use that, but it is based on booking photos -

 

Ben Yelin: Right.

 

Dave Bittner: Taken at arrests. So, they're not accessing those "for sale" public databases that we've talked about.

 

Ben Yelin: Or even in other states where you can use facial recognition based on driver's license records, which is a much easier way to do it. Obviously, if you're using it based on previous arrests, it's not going to work for someone who's never been arrested.

 

Dave Bittner: Yeah.

 

Ben Yelin: So - but that's kind of the downside of caring about the Fourth Amendment impact on the use of facial recognition software.

 

Dave Bittner: Yeah, interesting. All right. Well, I mean, this one is going to play out. It's certainly - I think it's been fascinating to see the way that this has captured the public's imagination and really been sparking a lot of discussion about related things. Right?

 

Ben Yelin: Exactly.

 

Dave Bittner: Health - the state of health care in the United States is people's -

 

Ben Yelin: Right. I mean, and, ultimately, we do have a very terrible tragedy here in that -

 

Dave Bittner: Right.

 

Ben Yelin: Somebody lost his life. And I think perhaps our tone should first and foremost reflect that, that this is a tragedy.

 

Dave Bittner: Yeah.

 

Ben Yelin: And I can understand for everybody involved why they'd want to use every tool at their disposal to solve this heinous crime.

 

Dave Bittner: Yeah, absolutely. All right. Well, again, we'll have a link to that story in the show notes. And we would love to hear from you. If there's something you'd like us to consider for the show, you can e-mail us, it's caveat@n2k.com. And, if you happen to be Professor Orin Kerr, you will go straight to the front of the line. [ Music ] All right, Ben, I recently had the pleasure of speaking with Petra Molnar who is a Harvard faculty associate, lawyer and author of the newly released book "The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence." A really interesting conversation. Here's Petra Molnar. [ Music ]

 

Petra Molnar: I always like to start off by saying that, you know, I'm not a technologist, I am trained as a refugee lawyer and an anthropologist. But, seven years ago, I barely knew what an algorithm was. You know, I was on Wikipedia learning about artificial intelligence, but I wasn't really thinking about technology. And, back in 2018, a colleague of mine and I came across the fact that the Canadian government was using algorithmic decision making in our immigration system without any conversation publicly. And we wrote a report about this and it kind of opened my mind up to this whole other area of how technology is changing the way that people on the move are experiencing migration. And then I started working on this and it slowly grew into a global project. And, six years later, the book was born.

 

Dave Bittner: Well, let's dig into some of the details here. And I suppose before we get to the artificial intelligence elements of this, can you kind of level set for us, I mean, what - when we're talking about the U.S./Mexico border and the folks who are migrating through that border, what do you think our listeners need to know about the reality of that situation?

 

Petra Molnar: Yeah, I think that's a good place to start. There are thousands of people crossing the border, you know, every day. And it is a situation that has been happening for many years where people are exercising their internationally protected right to asylum. But they're increasingly coming up against unregulated and high-risk technologies that are being experimented with at the border. And not just at the U.S./Mexico border, but globally as well. And, from a legal perspective, there's so much experimentation that's happening at the border with a variety of different technologies, whether that's drones, automated decision making, biometrics, but also more draconian-type projects like robo dogs and AI lie detectors. But, right now, we don't have a lot of law or governance at all to kind of draw some guardrails or red lines under some of these extremely high-risk pieces of technology.

 

Dave Bittner: Well, when you call something high risk, what is it about that that puts it in that category for you?

 

Petra Molnar: So, if you look at the kind of impacts of this technology, I think that's where we need to start. What is this really doing to people and their legal cases? We know that technology is a lens through which to understand how power operates in society. Right? And we know that immigration systems are very opaque, very discretionary. And there's a vast power differential between people on the move and the decision makers. And if we're importing technology into the mix, again, whether it's algorithms or surveillance, without having much governance and law to, again, regulate a lot of this, the concerns are that it's going to exacerbate a lot of the discrimination that's already a daily reality for people who are part of this system. Not to mention the kind of growing surveillance dragnet that we've been seeing at the U.S./Mexico border, but also other places, too, that I cover in the book. Oftentimes, the surveillance technology is introduced as a deterrence mechanism. So, states will say, "Let's bring in more technology to prevent people from coming." But that actually doesn't work in reality. Right? Researchers have shown that deterrence doesn't work because people who are desperate will continue to come. What's going to happen is they are going to take increasingly more risky routes to try and get to safety. And there has been a near tripling of deaths at the U.S./Mexico border since the introduction of smart border technologies. You know, while writing this book, I went to some of the memorial sites of people who died in the Sonora Desert trying to reach safety. And it's really disturbing to think that there's so much technology that's being invested in rather than perhaps even using some of this investment for other things, whether that's, you know, supporting lawyers, supporting communities or even trying to make the system more fair and transparent.

 

Dave Bittner: Well, help me understand, and please forgive my ignorance here, when someone is crossing the border, and let's use the U.S./Mexico border as an example, someone coming in from Mexico to the United States, what rights do they have in terms of - what umbrella are they under in terms of their rights when it comes to this sort of surveillance?

 

Petra Molnar: So, we can start off with the rights that are available to you and me and everyone on this planet. There is a system of rights in place that's governed by international law and also domestic law that allows every person that's at risk of persecution to come to a country and claim asylum or ask for refugee status. And that is what a lot of people do at the U.S./Mexico border. And, again, they're exercising their internationally protected right to do so and also to try and avail themselves of a system of adjudication that exists in the United States. But with surveillance technology and all sorts of technology that's now at the border, this is where there is a bit of a tension here because so much of this technology is unregulated. It's introduced without a lot of public scrutiny or accountability. A lot of us find out about it because we're either researchers or journalists or podcasters who are interested in what's happening at the border. But there isn't this kind of concerted effort for public transparency and accountability around what's even happening. And there's not a lot of law to regulate technology, especially in places like the border, you know, and also examples around, for example, technology that's used in refugee camps and other spaces of humanitarian emergency. And, so, when we look at it from a legal perspective, it's really scary to see that there are just not a lot of guardrails put in place to prevent this kind of unbridled, what some people call, techno-solutionism that we are seeing across the world's borders, not just at the U.S./Mexico frontier.

 

Dave Bittner: How much of this technology is being used for reliably identifying people? You know, your repeat offenders, if you will, you know, people who have been - tried to come across the border, been caught, sent back, come back. Is this a way to know who's coming across and how reliable is the technology at doing so?

 

Petra Molnar: That's actually a really good question because that's often how this technology is presented, you know, that we need more surveillance, more tools to tell us who is coming. But, the thing is, a lot of this technology doesn't even work the way that it's intended. You know, even when you look at, for example, facial recognition technology, that's been now used at the U.S./Mexico border through this application that's called the CBP One Facial Recognition App, which is now something that everybody has to download and use on the Mexico side if they want to claim asylum in the United States. There's been critiques upon critiques now of this application because it doesn't actually work. It doesn't work very well on people who are darker skinned, sometimes you need a lot of, you know, bandwidth to be able to even download it and use it on your phone. So, there is also this gap between how the technology is presented as being this kind of piece of efficiency and this problem solver, but it actually doesn't even work like that in practice. And sometimes it actually creates even more problems and more inefficiencies in the system.

 

Dave Bittner: As you were researching for this book, were there any things in particular that surprised you that were unexpected?

 

Petra Molnar: So many. You know - and if you choose to read the book, you'll see that I've had some pretty surreal experiences. I think what surprised me the most, Dave, was the tremendous power that the private sector has in the conversations around what we innovate on and why. And, by that, I mean I think I just didn't realize - even as someone who's been working in migration issues, you know, since like 2008, I didn't really realize that there is this huge border industrial complex that's worth billions of dollars. Right? And it is this kind of industry that has grown up around the border where private sector entities get to say, "Well, if migration is a, quote/unquote, 'problem,' we have the solution." And the solution is not more lawyers or more psychosocial support or actually helping people in their home communities and addressing the root causes of migration so people don't have to migrate, right, instead, the solution is a robo dog or a drone or an AI lie detector. So, it's - for me, it was really important to pay attention to the kind of normative power of the private sector that really gets to set the stage of what we innovate on and why and who gets to imagine what kind of world we are building.

 

Dave Bittner: What is the legislative status when it comes to these sorts of technologies and the border? You said there - a lot of things are still undecided.

 

Petra Molnar: Yeah, that's right. And, if I put on my lawyer hat, it's an interesting moment in time right now to try and get a sense of what the governance regime looks like. And, from a border technology perspective, you know, there is not a lot of lawmaking. But I think we can even extend that into the conversation around tech governance generally. You know, we have privacy legislation in some jurisdictions and there have been some moves globally, such as the European Union's recently ratified act to regulate artificial intelligence, or the AI Act. It just came into force in August this year. But even something like that - and it's a big piece of lawmaking. Right? Like it tries to encapsulate AI from toys to the border and everywhere in between, it just, unfortunately, does not go far enough to protect the human rights of people who are often on the margins of society. And I don't just mean people on the move and refugees, but also people who are in the criminal justice sector or people who are applying for welfare, for example. It's been disappointing to see, you know, even the AI Act kind of fall flat on this. And I think it's also - again, if we're looking at it globally speaking, that act could have set a precedent for other countries, like the United States and Canada and Australia, to regulate in a much stronger way. And right now there is not an incentive to do so, once again, I think because a lot of private sector interests are the ones who are driving policymaking and who are setting the stage on how we even think about drawing some guardrails or some red lines under some of this tech that I think makes a lot of people uncomfortable.

 

Dave Bittner: What would you say to the folks, for example, in the U.S. Border Patrol, you know, who - they are tasked with enforcing the limits at the border? And I can imagine them saying, you know, "This technology is a force multiplier for us. It lets us do more with less and more effectively do our jobs." It seems to me like, I mean, that's a legitimate argument from their point of view.

 

Petra Molnar: Well, what I would say to that is that we can't lose sight of the fact that there are real human beings who are at the center of all of this, right, and people who have had oftentimes to flee really difficult circumstances. I think we desperately need more humanity in the kind of conversations that we're having about the border generally, but especially when it comes to new technologies because, again, human migration has been with us - you know, since time immemorial. And, also, we're likely going to be seeing more and more migration in the coming decades as well. And, so, perhaps the conversation really needs to be about if people are going to be migrating, how do we make sure that the system is fair, transparent and accountable rather than opaque, discretionary and full of high-risk technology.

 

Dave Bittner: Where do you suppose we're heading here? I mean, we have recently re-elected President-elect Donald Trump, I think we have a good idea of where he stands when it comes to the U.S./Mexico border. What do you think that portends for the future when it comes to these technologies?

 

Petra Molnar: Well, you know, I don't have a crystal ball, but I do anticipate that there is going to be more and more technology that's introduced at the border as a result of this upcoming presidency. And, you know, we've already seen signals for this, right, with the kind of space that's being made for private sector companies, you know, in the Oval Office. Of course, Elon Musk comes to mind, but there will be other players as well in there that are going to be kind of jostling for power and influence and making a lot of money off of it, too.

 

Dave Bittner: If we look at the global situation here beyond the U.S./Mexico border, is this technology being applied throughout the world?

 

Petra Molnar: Absolutely. This is really a global story. And even though, you know, my book starts in Arizona and it's a space that I've really had the privilege of working in for a while now, it's only one piece of this story. And there are technologies that are being tested out and introduced in refugee camps in Greece, for example. I spent three years in Greece trying to understand this. There are different types of digital identity systems in East Africa, in Kenya in particular. There are also different types of technologies that are used to automate immigration systems in Canada and Australia. Again, it's become this kind of - some people call it an AI arms race that I think we're seeing everywhere in every sector, but it also plays out this way in immigration.

 

Dave Bittner: You know, it's such a charged issue and people have strong opinions and there's - it just seems to me like there's a lot of emotional baggage that comes with this issue because it's so polarizing. I'm curious, do you have a message that you want to share for folks? You know, with the expertise you have on this issue, the research that you've done for this book, being a lawyer who's in this world all the time, are there any words of wisdom you have for folks to better understand what's going on here from your perspective as an insider?

 

Petra Molnar: Yeah. And thanks for the opportunity to reflect on that. I mean, you know, I think, ultimately, it is about that point that I tried to make earlier about not losing sight of our common humanity and the fact that so many of us have a migration story of our own. And the fact that migration has been politicized to such an extent is actually a political project on the part of, you know, different administrations, not just in the United States, but all over the world, kind of pitting ourselves against one another rather than kind of holding space for complexity and the complications of being a human being here. And the other thing I will say, too, is, you know, I think perhaps, you know, you're getting the sense that I think the border and migration spaces are important to understand in their own right because, again, they're opaque, they're discretionary, they're very high risk. But, the thing is, border technology doesn't just stay at the border. You know, I mentioned these robo dogs, they were announced in a press release by the Department of Homeland Security in February of 2022. And I was literally on the sands of the Sonora Desert visiting some memorial sites of people who passed away there when this announcement was made. That was one of the surreal moments that I mentioned earlier. But, a year after that, in May of the following year, the New York City Police Department, on TikTok no less, announced that they want to now be piloting robo dogs on the streets of New York to keep New York safe. One was even painted white with black spots on it, like a Dalmatian. There is clearly this bleed-over of border technology that happens in other spaces of public life and I think that's why we need to pay attention to it. [ Music ]

 

Dave Bittner: Ben, what do you think?

 

Ben Yelin: I think this goes well with the first couple of episodes we did after the election where we're looking at the practical impact on the topics that we cover of a second Trump administration.

 

Dave Bittner: Right.

 

Ben Yelin: He has promised mass deportations and there are technological tools that did not exist during his first term, which only ended four years ago. We didn't have the type of LLM models that we have now through artificial intelligence.

 

Dave Bittner: Right.

 

Ben Yelin: And just computing always improves over time. So, he will have tools at his disposal that will aid in the effort to deport people - undocumented people in the United States. And I think it's really important to study the implications of all that.

 

Dave Bittner: Yeah. All right, well, our thanks to Petra Molnar for joining us. Again, the book is titled "The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence." [ Music ] And that is "Caveat," brought to you by N2K CyberWire. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly changing world of cybersecurity. If you like our show, please share a rating and review in your favorite podcast app. Please also fill out the survey in the show notes or send an e-mail to caveat@n2k.com. We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams while making your teams smarter. Learn how at n2k.com. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tré Hester. Our executive editor is Brandon Karpf. Peter Kilpe is our publisher. I'm Dave Bittner.

 

Ben Yelin: And I'm Ben Yelin.

 

Dave Bittner: Thanks for listening. [ Music ]