Caveat 2.9.23
Ep 160 | 2.9.23

Surveillance and the threat to press freedom.


Carrie DeCell: Twenty-two members of El Faro's newsroom were subjected to hundreds of spyware infections using NSO Group's Pegasus spyware, which is spyware that can be installed on smartphone devices surreptitiously and remotely. So a smartphone user would have no way of knowing at the outset that their phone had been infected with this surveillance technology.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses the debate around nationwide bans on TikTok. I have the story of encryption technology that may improve online privacy. And later in the show, Carrie DeCell, a senior staff attorney at the Knight First Amendment Institute at Columbia University, joins us to discuss the threat of malicious surveillance technology and how that could affect press freedom around the world. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Before we jump into our stories this week, we've got some follow-up from a listener named Steve, who writes in and says, I have a question about the recent interview about the Google algorithm case before the Supreme Court. Would the approach used by Twitter, where users have a choice between content from just those we follow or the algorithm feed, be a defense against such a charge? In the Twitter operation, users, by default, see the algorithm feed, but have the ability to only see posts from accounts they follow. By having the choice for follow-only, is that a protection, or only if the follow is the default and opt-in for the algorithm by the user? What do you think, Ben? 

Ben Yelin: First of all, this is a great question. As soon as I saw this question, I was like, yeah, this person has really identified an issue here. So the case of Gonzalez v. Google, which we talked about, which is going to be heard next year... 

Dave Bittner: Yeah. 

Ben Yelin: ...That concerns YouTube, where the default is an algorithm. 

Dave Bittner: Right. 

Ben Yelin: If you are logged in to Google while you are using YouTube, it is collecting information about the videos you watch by default without you having to opt in. As this listener notes, Twitter, the way it's structured right now - I think the default when you log in is to go to the, quote, "for you" tab, which is an algorithm. 

Dave Bittner: Yep. 

Ben Yelin: But you do have the choice of using the, quote, "following" tab, which is what every decent Twitter user should be using. 

Dave Bittner: (Laughter). 

Ben Yelin: And that's the reverse chronological order of tweets from people that you follow. 

Dave Bittner: Right. 

Ben Yelin: Every user has the choice to go into that following tab, and the choice is very clear. 

Dave Bittner: Yeah. 

Ben Yelin: You know, it would be very apparent to any Twitter user. So I would surmise that, from Twitter's perspective, that actually might be a reasonable defense in this case, that the algorithm is there to enhance the user experience. But the user does have some agency in opting out, the way they don't as it relates to YouTube. Now, what Google might argue is if a person were concerned about the algorithm, they could watch videos incognito, and, you know, nothing - no information would be collected, and the algorithm wouldn't lead them to different and darker places. 

Dave Bittner: (Laughter) Daugh (ph), Ben, you don't actually believe that's true, do you? 

Ben Yelin: I know. You know me - the inherent optimist, right? 

Dave Bittner: Right, right. 

Ben Yelin: So I do think that it does make a difference. It's - I can't be sure in a legal sense 'cause it's such a novel issue. 

Dave Bittner: Yeah. 

Ben Yelin: But just my instinct is the fact that while the default in Twitter is to have this algorithmic timeline, the choice to go back to the reverse chronological order is so readily apparent that I think it might reflect better on Elon Musk and his operation there than the Google case at this point. 

Dave Bittner: Yeah. 

Ben Yelin: But this is certainly something I'll be interested in following more, especially if this kind of thing comes up at oral argument, whether the issue of opt in on behalf of the user ends up being a determining factor in this case. 

Dave Bittner: Right. It seems to me like if you had that be part of the onboarding process, you sign up for a new account, and it says, hey, which would you prefer? I guess, what's in it for the provider? Like, what's in it for YouTube to give you the option to have a non-algorithmically driven YouTube experience other than if... 

Ben Yelin: Yeah, I mean, nothing right now. But - yeah. 

Dave Bittner: But if doing so provided them some sort of legal shield, there you go. 

Ben Yelin: Exactly. And if that was their... 

Dave Bittner: Because you know most people would not opt for it, probably. 

Ben Yelin: Right. 

Dave Bittner: (Laughter). 

Ben Yelin: I mean, that's something that we mention all the time is that the algorithm is very convenient. And if you are not being led to, you know, ISIS terrorist videos, for the most part, it's quite useful for the average user. 

Dave Bittner: Right. 

Ben Yelin: I mean, I can't tell you how many great rabbit holes I've gone down just because I happen to search one thing, and then I get some amazing suggestions that I would have never come across naturally. 

Dave Bittner: Right. 

Ben Yelin: And I think that's true for most people on both YouTube and Twitter. There are accounts that you probably would not have discovered on Twitter if that algorithm had not existed in the first place. 

Dave Bittner: Yeah. 

Ben Yelin: So, yeah, I mean, I certainly think that could add an interesting wrinkle into this case. 

Dave Bittner: All right. Well, our thanks to our listener, Steve, for sending that thoughtful question in. And of course, we would love to hear from you. Our email address is 

Dave Bittner: All right, Ben, let's jump into our stories this week. Why don't you start things off for us here? 

Ben Yelin: So there's this debate going on nationwide about banning TikTok. We've seen it happen in a number of states. Governors have issued bans on TikTok from the devices of state employees. And there's now a debate, at least in Congress and among federal agencies, about whether there should be a ban among federal employees themselves or, perhaps more drastically, there should be a nationwide ban on TikTok in general so that it would be prohibited from app stores - Google Play and the Apple App Store. That would be a pretty radical action for Congress to take. And I think a really good perspective on this is a New York Times op-ed, which is the inspiration for this segment here, by a guy named Glenn Gerstell, who's a senior advisor at the Center for Strategic and International Studies. And he served as general counsel at the NSA and Central Security Service from 2015 to 2020. 

Dave Bittner: OK. 

Ben Yelin: The crux of this article is that he believes that there is a justification, at least a reasonable argument, for banning TikTok. We know that China passed a law in 2017 giving the government access to the data of any Chinese company without, obviously, any sort of legal process for collecting that data. The parent company of TikTok is ByteDance, a Chinese company. And they could at least be compelled to hand over extensive personal information on the app's American users. And then there's, of course, the potential for disinformation - that the Chinese government could demand propaganda go out across TikTok's service among American users. 

Dave Bittner: Right. 

Ben Yelin: So this is a very legitimate concern. But at least in the view of this op-ed author, this is more of a Band-Aid, a Band-Aid that would have pretty radical downstream effects that doesn't really address the root problem, which is the vulnerability of our personal data in general. So TikTok presents its own security concerns because of its relationship with the Chinese government. That's legitimate. But the bigger problem is that there is no federal law or federal policy controlling what happens to our data in the first place, whether, you know, there are any regulations on applications about sending data overseas or selling data to data brokers or at least giving users the ability to opt out of data being sold, et cetera - everything you'd want to see from a federal data privacy law. So it just seems like from the perspective of this author - and I think I really agree with this - that there's a broader problem that goes beyond TikTok. And by focusing on TikTok itself, despite the fact that there are specific national security concerns there, we're kind of missing the forest for the trees. 

Dave Bittner: You know, as we're recording this, we just went through the news cycle with the Chinese balloon, the spy balloon. 

Ben Yelin: Yeah, it was quite a news cycle. 

Dave Bittner: It was. It was. And - but I saw several comedians, you know, the late-night comedians, say that, you know, the - I guess the setup for the joke was the Chinese would say, it was just a weather balloon. We don't need a balloon to spy on Americans. We have TikTok for that. 

Ben Yelin: Right, exactly. 

Dave Bittner: I saw several versions of that joke (laughter), right? 

Ben Yelin: Which is 100% true. I mean, and we know, based on some pretty compelling evidence, that TikTok does present this risk. I think the statistic I saw is that 2 out of 3 teenagers in the United States are on TikTok. And it's, I think, one of the fastest-growing social media applications. It's certainly become the most prominent. As a quick side note, at the press conference at the White House - I think it was the White House press secretary or maybe a representative from the Pentagon - they were talking about the Chinese balloon incident. 

Dave Bittner: Right. 

Ben Yelin: And somebody said the Department of Defense has a, quote, "tick-tock" of what happened with the Chinese balloon incident. What they were actually describing was, like, a play-by-play of everything that happened with the Chinese balloon over 72 hours - so not a literal TikTok post - and that caused some funny media confusion. 

Dave Bittner: Oh, interesting. 

Ben Yelin: But more to the point, yeah, I mean, I think, too, we have this application that is owned by a Chinese company, that's vulnerable to both espionage and misinformation and that's used by 2 out of 3 American teenagers. So we could address that problem by doing something that, frankly, causes a bunch of constitutional problems, and that would be banning TikTok. And that would kind of stop the bleeding in this very narrow sense without actually solving the broader problem, which is we don't have any proper federal regulation on our data. 

Dave Bittner: And former President Trump tried to do just that, right? He tried to ban TikTok and was unsuccessful. 

Ben Yelin: He was unsuccessful. Yeah. I mean, I think it's important to recognize the First Amendment issues at play here. And this is something that's mentioned in the article. So the government can institute time, place and manner restrictions on speech. A ban on TikTok would be what we would refer to as a content-neutral restriction on speech. 

Ben Yelin: And courts don't look as unfavorably on content-neutral bans - so bans that apply no matter what the speaker is saying - as they do on content-based restrictions. So something where they said you can't talk about U.S. military posture in Ukraine on any social media application, that would be a constitutional red flag. 

Ben Yelin: I still think the government, under its First Amendment case law, would have to show that this type of restriction serves an important governmental interest and that the means were at least pretty closely tailored to whatever that interest was. And I'm not sure that they would be able to fulfill the second prong of that test. It doesn't have to be the least restrictive means of achieving that goal because, for you legal nerds out there, we're operating under intermediate scrutiny here, not strict scrutiny. I'll let the non-law people just ignore that sentence. 

Dave Bittner: I was just - I was going to say the same thing, Ben. I was going to mention that. But I'm glad you did. 

Ben Yelin: Yeah, exactly. That's a little Easter egg for the lawyers out there. 

Dave Bittner: Right. Right. 

Ben Yelin: But basically, you have to have a pretty good reason for instituting a ban like this. And the ban has to be pretty closely related to that policy objective. And, you know, I'm just not sure how closely a ban on TikTok is associated with the objective of overcoming Chinese espionage or disinformation when there are a whole bunch of other applications that either have already spun off or will spin off that might be just as vulnerable to those types of bad outcomes. 

Ben Yelin: So I do think there is a potential First Amendment concern here that hasn't really been addressed. And certainly, you'd be taking a risk in passing a federal statute banning TikTok that you'd run into First Amendment problems. And then there's the political problem. I think this is something that certainly impacted what happened with President Trump a few years ago. We can't overlook the fact that TikTok has a constituency. I mean, just try going into any high school in America and saying we're going to ban, you know, this service that two-thirds of you, two-thirds of you students, use to communicate with your friends. 

Dave Bittner: But they're not old enough to vote, Ben. They're not old enough to vote. 

Ben Yelin: Yeah, but their parents can vote. And not only that, as loath as I am to admit, a lot of grown adults also use TikTok, and they're a constituency, as well. It's just not going to be an easy thing to do. If it was some obscure platform, obscure service, then that's a no-brainer. You get rid of it. You know, there are probably other applications that could fulfill that role. At this point, you know, would we all migrate to Meta and just do the type of short videos there that are pretty much equivalent to TikTok? Would we migrate over to Google and just have YouTube do these types of shorts that are equivalent to TikTok? I don't think we're quite at that point yet. I think TikTok has really saturated that market. So you would really face a political backlash. So with these kind of pitfalls in mind, why don't we address the broader problem - at least, that's what this op-ed author is saying - and actually come up with a comprehensive set of federal data privacy regulations? That kind of seems to be the answer to a lot of the problems we identify on this podcast. It's certainly easier said than done. 

Dave Bittner: Well - so let's say that the allegations are true, that the Chinese government is using TikTok to gather information, just for argument's sake, right? 

Ben Yelin: Sure. 

Dave Bittner: And, of course, the Chinese say that it's not. And suppose we banned TikTok, is there any information that's available on TikTok that the Chinese government couldn't gather by other means? I mean, this information is freely available on the free market. You are... 

Ben Yelin: To data brokers, exactly. 

Dave Bittner: Right. Any of us could - with our credit card in hand, could go out and purchase probably anything that we wanted about anyone with no - nothing in the way of us doing so because there is no privacy law. So would it make a difference? 

Ben Yelin: I'm not sure it would make a difference. Now, the fact that there is this direct line in Chinese law from ByteDance to the Chinese government makes it more concerning than other applications that don't have Chinese parent companies. But, yeah, do I have any doubt that the so-called private information that's posted on TikTok is readily available elsewhere for purchase or already on the dark web? No. There are no guarantees. And that's why I think this op-ed is really well argued - that until we address this broader problem with a data privacy law, banning TikTok just might be more trouble than it's worth. I think that's pretty compelling. 

Dave Bittner: All right. Well, we will have a link to that story in our show notes, of course. My story this week comes from the folks over at The Guardian. It's written by Alex Bellos, and it's titled "Can a New Form of Cryptography Solve the Internet's Privacy Problem?" Fascinating story here and the case they lay out. I'm going to just read some of this because I think it lays out the technology really well. 

Dave Bittner: They use the hypothetical story of a student they call Rachel, and they say Rachel is a student at a U.S. university who was sexually assaulted on campus. She decided against reporting it, and this article points out that fewer than 10% of survivors do. What she did do, however, was register the assault on a website that's using novel ideas from cryptography to help catch serial sexual predators. So this is an organization called Callisto, and it lets the survivor enter their name in a database together with details about the assailant, such as their social media handle or phone number. All this stuff is encrypted, meaning that all the identities of both the survivor and the alleged perpetrator are anonymous. And this article points out, if you hacked into the database, there's no way to identify either party. But if the same perpetrator is named by two people, the website registers a match and then triggers an email to two lawyers. Each - and what's better than one lawyer, Ben? 

Ben Yelin: Yeah. Absolutely. 

Dave Bittner: Two lawyers. Each lawyer receives the name of one of the survivors, but not the name of the perpetrator. The lawyers then contact the survivors to let them know of the match and offer to coordinate further action, should they wish to pursue it. So what this technology allows people to do is to find out if someone else is in the same boat that they are without revealing who they are, who the perpetrator is or any of that private information. But it allows a case to be gathered and made while still preserving the privacy of the different parties. And this article says that the technology is under an umbrella that they call PETs, which is privacy-enhancing technologies. And there's a lot of work being done in this area. What do you make of this, Ben? 
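[Editor's note: The matching mechanism described above can be illustrated with a toy sketch. This is not Callisto's actual protocol, which uses more sophisticated cryptography; the class names, the server-side secret, and the digest scheme here are invented purely for illustration.]

```python
import hashlib

# Toy illustration of a matching escrow: NOT Callisto's real protocol.
# Each entry stores a deterministic keyed digest of the perpetrator
# identifier (so two reports of the same person collide) alongside a
# sealed record for the survivor that only counsel could later open.

SITE_PEPPER = b"server-side secret"  # hypothetical fixed secret


def perp_digest(identifier: str) -> str:
    # Deterministic keyed hash: the same perpetrator identifier always
    # maps to the same digest, but the digest alone does not reveal it.
    normalized = identifier.strip().lower().encode()
    return hashlib.sha256(SITE_PEPPER + normalized).hexdigest()


class MatchingEscrow:
    def __init__(self):
        self.entries = {}  # digest -> list of sealed survivor records

    def submit(self, perp_identifier: str, sealed_contact: bytes) -> bool:
        d = perp_digest(perp_identifier)
        self.entries.setdefault(d, []).append(sealed_contact)
        # A match fires once two independent reports name the same digest.
        return len(self.entries[d]) >= 2


escrow = MatchingEscrow()
escrow.submit("@assailant_handle", b"<encrypted contact A>")
matched = escrow.submit("@assailant_handle", b"<encrypted contact B>")
print(matched)  # True -> time to notify counsel for each survivor
```

The key property is that the database holds only digests and sealed records, so a breach of the database reveals neither party; the match condition fires only when two independent submissions name the same assailant.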

Ben Yelin: So first, I should say that the - this company was founded by somebody that I went to elementary school with. 

Dave Bittner: Really? 

Ben Yelin: So I have a nice personal connection here, and I just really admire the service that Callisto provides. And at least somebody in my elementary school class went on to have a successful career. So... 

Dave Bittner: (Laughter). 

Ben Yelin: ...That is certainly encouraging. 

Dave Bittner: Very nice. 

Ben Yelin: Beyond that, I think the promise of this technology is that it can be replicated in a variety of different fields. So this field is particularly useful because we're enabling victims of sexual assault to possibly identify a perpetrator. And since research shows that most perpetrators of sexual assaults on college campuses commit multiple assault offenses, it really could cut down on the incidence of sexual assault if you're able to identify these perpetrators. And that's a really positive outcome to help address this really serious problem. But PETs are also being used to address other problems. So they talk about - I think it was a study in Switzerland where there was a PETs-oriented database where you could enter information on certain medical conditions without identifying any personal information. It still used this type of cryptography. And that allowed medical researchers at four independent teaching hospitals to conduct kind of broad data analysis on a large class of patients, and that, obviously, can be useful for medical research. 

Ben Yelin: We've seen PETs pop up in industries including insurance, marketing, machine learning, cybersecurity, cryptocurrency and government. So this article talks about how the United Nations launched a PET lab, which, they note kind of in a mocking tone here, has nothing to do with the welfare of domestic animals. 

Dave Bittner: (Laughter). 

Ben Yelin: It's instead a forum for national statistical offices to find ways to share data across borders while protecting the privacy of their citizens. And so this is a really promising form of technology, and this is just such a vivid example of where it can be useful. I mean, there's a reason that only 10% of people report sexual assaults, and that's because it's a very difficult experience if you're going through it alone and you've had this really traumatic experience and don't know where to seek help. 

Dave Bittner: Right. 

Ben Yelin: So the fact that this technology exists, I think, is just a really important advance. 

Dave Bittner: Yeah. I've done a number of interviews over the years with the folks over at a company called Enveil, which - their thing is fully homomorphic encryption, which is - it's a type of cryptography where you can run analytics on the encrypted data without decrypting it. So you can get answers from your encrypted data without revealing what the actual encrypted data is. And my understanding is that it's very computationally intensive. And so, you know, it's only in recent years that it's been practical to use, but they're making great strides in it. So... 

Ben Yelin: Yeah. 

Dave Bittner: ...It's really a fascinating area where you could prove that something is so without revealing the fundamental information about it. I guess at the root of this, you - at some point, there has to be a level of trust - right? - that if something says that something is so, but then you say prove it - right? - and you say, well, no, it's encrypted. I can't share the information with you. But trust me. Here's the mathematical proof as to why it is so. 

Ben Yelin: Yeah. 

Dave Bittner: Right? 

Ben Yelin: One thing that's interesting about this is that this entire field came from kind of a game theory study by a Chinese computer scientist back in the 1980s named Andrew Yao, who asked the following question - is it possible for two millionaires to discover who is richer without either one revealing how much they are worth? And the reason this is kind of game theory-esque is that I think, intuitively, we would think that's impossible. If you're not revealing personal information, how can you find out who's richer? But if you're just kind of exchanging packets of anonymized information with one another, using randomness to hide the exact numbers, at the end of it, both millionaires will be satisfied that they know with relative certainty which one of them is richer. And I think that's kind of the principle at play here. That was obviously before the type of cryptographic tools that we have now. But it's interesting that this comes from such a dated theory. 
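[Editor's note: Yao's original 1982 millionaires' protocol can be rendered as a toy sketch. Hedged: the tiny RSA key, the small prime, and the bounded wealth range below are illustrative assumptions, not a secure implementation.]

```python
import random

# Toy rendition of Yao's 1982 millionaires' protocol. Alice has wealth
# i, Bob has wealth j, both in 1..N_RANGE. At the end, we learn whether
# i >= j without either side directly revealing their number.

N_RANGE = 10

# Alice's insecure demo RSA key pair (tiny primes, for illustration only)
p, q = 1009, 1013
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))


def alice_decrypt(c):
    return pow(c, d, n)


def millionaires(i, j):
    # Bob: pick a random x, encrypt it under Alice's key, send k - j.
    x = random.randrange(2, n)
    k = pow(x, e, n)
    m = k - j

    # Alice: decrypt m + u for every possible wealth u, reduce modulo a
    # prime, then bump the entries corresponding to wealth above her i.
    prime = 541  # small demo prime
    z = [alice_decrypt(m + u) % prime for u in range(1, N_RANGE + 1)]
    z = [z[u] if u < i else (z[u] + 1) % prime for u in range(N_RANGE)]

    # Bob: his own slot still matches x mod prime exactly when i >= j.
    return z[j - 1] == x % prime


print(millionaires(7, 3))  # expect True  (Alice is at least as rich)
print(millionaires(2, 9))  # expect False
```

The randomness (Bob's x) hides the actual values: Alice sees only an offset ciphertext, and Bob sees only a list of masked residues, yet the comparison still comes out correct.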

Dave Bittner: Yeah. Yeah, it talks about how you could have - like, banks could use this to find out if they're - like, two different banks are being defrauded by the same person but they could do it in a way that they wouldn't break financial data protection laws. You can imagine how this could be useful to - for, as you mentioned earlier, medical facilities to be able to share information or track things without breaking HIPAA laws. I'm thinking in the cybersecurity realm that you could have - organizations could share that they've suffered a data breach without revealing who - you know, who did the breach or what... 

Ben Yelin: Absolutely. 

Dave Bittner: ...The breach was. You know, that just - if you put it into a system like this and if you get a match, then you know that there's a pattern here without - because organizations are reticent to reveal that they've even had a data breach, much less what happened and who it could have been, you know, and that sort of thing. So all kinds of applications for this. An interesting technology to track for sure. Obviously, the - you know, the case of sexual assault on college campuses is very compelling and really makes the case clearly. But there's all kinds of things you can use this for. 

Ben Yelin: Absolutely. 

Dave Bittner: Yeah. So this article wraps up and says that this Callisto software or platform is available at 40 universities in the U.S. And the plan is to roll it out nationwide. So it seems to me like a good thing. 

Ben Yelin: Very promising. Yeah. 

Dave Bittner: Yeah. All right. Well, we will have a link to that in the show notes. Interesting article for sure. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Carrie DeCell. She is a senior staff attorney at the Knight First Amendment Institute at Columbia University. And we were discussing the threat that malicious surveillance technology poses to press freedom around the world. Here's my conversation with Carrie DeCell. 

Dave Bittner: So what prompted our conversation today is that you and your colleagues at the Knight First Amendment Institute at Columbia University have an event coming up. It's titled Spyware and the Press. What prompted the creation of this event? 

Carrie DeCell: That's right. It's coming up on February 8. We'll be hosting it at the Columbia Journalism School. And the event will focus on a lawsuit that the Knight Institute recently filed against spyware manufacturer NSO Group on behalf of 18 journalists and other members of the digital news organization El Faro, which is based in El Salvador but is really one of the leading independent news organizations in the Americas. 

Dave Bittner: And what is behind the lawsuit here? What exactly went on? 

Carrie DeCell: The lawsuit challenges numerous spyware attacks using NSO Group's Pegasus technology against the staff of El Faro over the course of about a year and a half. So between about June of 2020 and November of 2021, 22 members of El Faro's newsroom were subjected to hundreds of spyware infections using NSO Group's Pegasus spyware, which is spyware that can be installed on smartphone devices surreptitiously and remotely. So a smartphone user would have no way of knowing at the outset that their phone had been infected with this surveillance technology. And Pegasus then gives its operators effectively full control of and access to all information stored on the device. 

Carrie DeCell: The Pegasus operator can monitor your texts, your phone calls, your whereabouts using GPS location tracking. It can turn on your microphone to record surrounding sounds or turn on your camera to take photographs. And it has been reported that Pegasus can also enable access to cloud-based accounts that are accessible through the smartphone. So it's really an all-encompassing surveillance tool. And these 22 members of El Faro that I mentioned were the victims of repeated Pegasus infections on their smartphones. And many of these infections coincided with major news investigations that the journalists were conducting at the time, focused on corruption within the Bukele administration in El Salvador, on gangs within El Salvador and many other hard-hitting stories that these journalists bravely pursue day in and day out. And the spyware attacks against them seem clearly to have been in retaliation for their courageous, independent reporting. And they really impacted these journalists' ability to continue reporting in this way. So we're seeking to hold the manufacturer of the spyware, NSO Group, accountable for these attacks. And we filed the lawsuit that I mentioned in U.S. court at the end of 2022 to do just that. 

Dave Bittner: And what do you hope the outcome to be here? When you say holding them accountable, where do - how do you hope this plays out? 

Carrie DeCell: Yeah. So I mean, first and foremost, we want this lawsuit to demonstrate that spyware manufacturers that rely heavily on the U.S. infrastructure of U.S. technology companies - in this case, Apple - in order to deliver their spyware to victims' phones around the world, that they can be held liable for violating U.S. law in U.S. court. And in this case, we're talking primarily about the Computer Fraud and Abuse Act, an anti-hacking law. But one of the most important pieces of relief we're seeking through this lawsuit is an order requiring NSO Group to disclose the client in this case. So who ordered the spyware attacks against our plaintiffs? 

Carrie DeCell: In addition to the incredibly sophisticated technology that NSO Group offers its clients, it also promises secrecy. It seeks to assure clients that the use of this technology cannot be traced back to them, and therefore, they can't be held accountable for hacking into the phones of journalists and, you know, in other instances, of human rights activists, of lawyers, of political opponents around the world. So we want to demonstrate not only to the client in this case, but to people - well, government entities around the world that would use NSO Group's technology, that they, too, may be uncovered through litigation like the lawsuit we filed here. 

Dave Bittner: And I think it's worth mentioning that NSO Group themselves have said that, you know, they only license this technology to people who agree to their terms of service and indeed, that they've taken away the ability for people to use it who they've found using it against their terms of service. But despite NSO saying that, I think your case is one where we've seen plenty of examples where that's simply not true. 

Carrie DeCell: That's exactly right. NSO Group has been saying that for years, saying that they sell this incredibly invasive technology responsibly, but the technology has been used time and again against law-abiding members of civil society. And at some point, NSO Group has to be considered on notice that this is happening. And in some instances - in Mexico, for example, some victims of Pegasus attacks have later been reinfected with Pegasus, you know, a couple of years later. So NSO Group appears to be continuing to sell its spyware to governments and clients that have used it improperly in the past. And so we haven't seen evidence that its due diligence procedures have improved much, if at all. 

Dave Bittner: Yeah. You know, I think it's easy for those of us here in the U.S. to kind of take some of the freedoms that we enjoy - when it comes to the press - for granted. But of course, that's not true around the world. It's interesting to me that you're bringing this suit here in the U.S. It's - what is the state of press freedom in Central America, where this activity is alleged to have happened? 

Carrie DeCell: Well, certainly in El Salvador, I would say it's not good at the moment, and the same is true in other parts of Central America, in Guatemala and elsewhere. Many of the journalists who were the victims of these spyware attacks at El Faro have also been subjected to massive online harassment campaigns. In some instances, they have been physically surveilled - you know, been followed as they're walking home. And El Faro itself has been subjected to baseless investigations by the government. So they're under extreme pressure in El Salvador as they continue to carry out their very important mission. 

Carrie DeCell: I want to emphasize, also, that the attacks against El Faro in El Salvador do harm the U.S. readers of El Faro in the United States. There are hundreds of thousands of people in the United States that read El Faro online. It's a digital newspaper. And when these spyware attacks are conducted against journalists, it has a severe chilling effect on the work that they do. It deters sources from speaking with them. It deters outside writers from publishing with El Faro. It deters advertisers from advertising with El Faro. And that puts an incredible amount of pressure on the news organization. It also has just meant that these journalists have had to spend a significant amount of time responding to these attacks, trying to shore up their digital security. And that's time taken away from their important investigative efforts. So the readers of El Faro in the United States have suffered real harms as a result of these attacks, as well. 

Carrie DeCell: And just one final point, NSO Group has said that its Pegasus software cannot be used against U.S. phone numbers or within the United States, but there are many U.S. citizens who are working abroad who use non-U.S. phone numbers. One of those U.S. citizens is a plaintiff in our case, Roman Gressier, who is working for El Faro and living in Central America now. But it was reported that a reporter for the New York Times, while writing a book and reporting on Saudi Arabia, was also the victim of a Pegasus attack. So just because, you know, NSO Group has tried to at least claim that there are some boundaries on the use of this technology that would protect the territory of the United States, it doesn't mean that this technology is not significantly harming U.S. interests. 

Carrie DeCell: Sorry to carry on, but the - I just wanted to mention one more thing, which is that the technology really depends - the effectiveness of this technology depends on its transmission through popular computer services and apps. So Apple and WhatsApp have also sued NSO Group for abusing their software and services to create exploits or vehicles to deliver Pegasus to the targeted phones around the world. So they clearly themselves feel harmed by the proliferation of this kind of spyware. 

Dave Bittner: NSO Group is certainly the most well-known of the providers of this sort of thing. Is it safe to say that there are others out there, though? 

Carrie DeCell: Yes, yes. There certainly are. NSO Group seems to have been the most successful in selling this spyware around the world. And its spyware is particularly sophisticated. But it's definitely not the only spyware manufacturer. There are others around the world, some in Cyprus, some in Germany, others in Israel, and, you know, other kind of hacking operations cropping up in India, for example. So the threat is really growing. And we hope that this lawsuit can be at least one small step toward addressing that threat. 

Dave Bittner: How do you suppose that we could establish international norms for this sort of thing, to have, you know, accepted practices that this is something that is not done? 

Carrie DeCell: Yeah, a lot of folks are working very hard to try to establish those norms. So people at Amnesty International, at Access Now, at the Citizen Lab, David Kaye, the former United Nations special rapporteur on freedom of opinion and expression - you know, people have been working very hard to convince international organizations to put into place - you know, in the first instance, really, a moratorium on the sale of commercial spyware until a regulatory framework regulating the sale and use of this kind of spyware can be put into place. I think it's worth, you know, noting that it's, really, a big question as to whether or not the kind of spyware like Pegasus, which is truly kind of all-encompassing in its capabilities, can be brought into line with human rights norms and privacy expectations. But to the extent that's possible, you know, there's certainly no regulatory framework that's currently in place that does an effective job of mitigating the risks of the use of this spyware around the world. 

Dave Bittner: I see. Before I let you go, can we talk a little bit about the event that's coming up here? I mean, you have quite an impressive panel lined up. 

Carrie DeCell: Yes, yes. So the event will be moderated by Sheila Coronel from the Columbia Journalism School. And it will feature Carlos Dada, the named plaintiff in our case, who's the co-founder of El Faro and was the victim of many, many spyware attacks using Pegasus. So he will speak to his own experience as a victim of these spyware attacks and his experience leading a newsroom through this harrowing ordeal. And then Ronan Farrow has been doing tremendous reporting on NSO Group and on Pegasus specifically, and he wrote a piece about the El Faro journalists who are the victims of the spyware attacks at issue in our lawsuit, and so he will be appearing on this panel as well. And so will I. I'll bring the legal perspective. 

Dave Bittner: Yeah. Overall, what is the state of press freedom around the world? Is it - is freedom on the march, or are we in a state of decline these days? 

Carrie DeCell: I'm afraid to say it certainly, you know, seems to be that press freedom is declining all around the world. We hear reports - you know, organizations like the Committee to Protect Journalists are reporting about journalists being imprisoned and murdered in many countries. I mean, the murder of Jamal Khashoggi stands out in many of our minds, particularly folks, you know, from the Washington area. But Jamal Khashoggi is an example of how the use of spyware like Pegasus can lead to incredible physical danger. It's been reported that Pegasus was used against his family members and close associates in the period leading up to his kidnapping and murder. So I think that case illustrates, really, the severe threat posed by this kind of spyware. But journalists and the press are facing increasing threats all around the world. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: I'm a big fan of the Knight Institute. And, you know, we've talked about Pegasus and the NSO Group and the chilling effect it can have on both media sources - which is, I think, the focus of your interview here... 

Dave Bittner: Yeah. 

Ben Yelin: ...And government actors. I just think it's a very worthwhile project. And the event that you discuss, I think, is taking place kind of as this podcast is being released. 

Dave Bittner: That's right, yeah. 

Ben Yelin: But I encourage people to certainly check it out. I'm sure there's going to be a recording. It sounds like it's going to be very interesting. 

Dave Bittner: Yeah. We'll have a link to that event in the show notes as well. Again, our thanks to Carrie DeCell for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.