Caveat 9.8.22
Ep 140 | 9.8.22

Edward Snowden and whistleblower ethics.


Robert Carolina: You have to really search your soul and say, am I really onto something here, or is this becoming a self-fulfilling prophecy? Because anyone who takes that step has reached the conclusion - I know better.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben looks at Cloudflare's decision to cut service to Kiwi Farms. I've got the first report from the U.S. Congressional Research Service on the metaverse. And later in the show, Ben's conversation with Robert Carolina, a senior fellow with the Information Security Group at Royal Holloway, University of London. They're discussing the legacy of Edward Snowden. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some good stories to share this week. Why don't you start things off for us here? 

Ben Yelin: So the big one in the technology world is what happened over Labor Day weekend with Cloudflare and the site Kiwi Farms. So Cloudflare - I think all of our listeners would be familiar with them - they are a web infrastructure and security services provider. They support the back end of websites. They prevent your website from being the victim of a DDoS attack... 

Dave Bittner: Right. 

Ben Yelin: ...For example. Then there's the site Kiwi Farms, which is an outgrowth of 4chan and 8chan. It's basically a bunch of internet trolls. They go after marginalized individuals, including people who are autistic, people who are struggling with gender, sexuality things. They've led harassment campaigns against trans rights activists, other individuals, and their harassment has been linked to a series of suicides. So they're really, really bad people, to put it mildly. 

Dave Bittner: Yeah. And what seems to be their motivation? Are they just out there to be mean for the sport of it? 

Ben Yelin: I kind of get the impression that that's, like, 90% of it. I think they have a political agenda, which is, we're against political correctness, we're against some of these, like, modern social movements. But really, they're just a bunch of angry people online who are fine engaging in harassment campaigns. 

Dave Bittner: And they feel a certain degree of anonymity and immunity, I suppose. 

Ben Yelin: Yeah. I'll also state - yes, they do. I mean, they can stay anonymous. I think it's important to state that this - so sites like Kiwi Farms can serve as a community for a lot of people who are otherwise marginalized and don't have interaction with other members of society, which is really dangerous. I mean, sometimes, the only community people feel like they belong to is an online community, and what if your online community is a bunch of trolls who are organizing harassment and targeting of transgender people and autistic people, that sort of thing? 

Dave Bittner: Wow. 

Ben Yelin: So there is an activist campaign - there's one prominent trans activist who is spearheading this - to get Cloudflare to cut services to Kiwi Farms. And this weekend, Cloudflare announced that they would do that. Their CEO issued a long statement where he really expressed great regret and sorrow about making this decision, basically saying, as a provider of back-end support for website services, we do not want to be involved at all in basically any type of content moderation. It's basically, we're free speech absolutists, and the absolute last thing we would ever want to do is cut off service to a site because of the content of their speech. I think in these circumstances, he felt compelled to do it because there were instances of specific threats being leveled at individuals. This one trans activist who led the campaign ended up having to go to Ireland or Northern Ireland, and she really felt a danger for her own life and had to report... 

Dave Bittner: You say she had to go there. You mean she went there out of fear for her own safety? 

Ben Yelin: Exactly. 

Dave Bittner: Wow. 

Ben Yelin: She's a U.S. citizen who went overseas because she was fearful for her own safety because of this harassment campaign. And when she was overseas, she had to call law enforcement to let them know about credible threats against her because people from Kiwi Farms were able to track her location somehow. 

Ben Yelin: So as a result of the fact that there was a specific charge of harassment, that there was some type of imminent threat, Cloudflare felt forced to make this decision. They've only done something like this a couple of times in the past. They cut services to the Unite the Right group in the wake of the 2017 Charlottesville protests. But it's just this very interesting scenario of big tech companies who have these very deeply held principles about not moderating content, fostering an atmosphere of free speech, but then they're up against these edge cases where people are making very specific threats. And it's almost impossible for them morally, legally, etc., to stay silent to this type of harassment. And it just shows that you can wax poetic about not policing speech, about not moderating any sort of content, not being politically biased, but there are going to be instances like this because the internet is a big place. And there are a lot of bad people out there who want to harass people. 

Dave Bittner: Right. Well, can we - let's - help me unpack this a little bit, so - clarify my own understanding. I mean, I would imagine that Cloudflare, for example, would not support a website that was posting, you know, child sexual abuse material, right? And... 

Ben Yelin: No, they... 

Dave Bittner: ...Would they say that, well, that's illegal? 

Ben Yelin: Yes. 

Dave Bittner: So you can't do that. 

Ben Yelin: And that's always a special carve out... 

Dave Bittner: Yeah. 

Ben Yelin: ...Even under our First Amendment laws. So I think they would refuse to support something like that. That's true. But in all types of closer cases, you know, I think in one instance, they withdrew their services from the neo-Nazi Daily Stormer website. And there was another 8chan troll haven in 2019 that mass shooters were using, and they were using that site to distribute racist manifestos that motivated mass shootings. And they cut services to that. But beyond that, I mean, there are a lot of other really, really objectionable things that they're fine supporting because that's their business. That's their business model. 

Dave Bittner: Can you - help me understand here, I mean, this whole notion of a free speech absolutist, right? I mean, even our Constitution is not absolute when it comes to free speech. 

Ben Yelin: It's not. Yeah. I mean, there are certain things that fall outside free speech protection - false advertising, incitement to imminent lawless action. So speech that not only is threatening but also would cause somebody to imminently cause physical harm to somebody else. Those are just two examples. Certain types of obscenity, profanity all fall outside First Amendment protection. But First Amendment protection is pretty robust. So you see a lot of people online say things like, well, the First Amendment doesn't protect hate speech. It absolutely does protect hate speech. It protects all different types of speech that many of us would find completely objectionable. 

Ben Yelin: So when people say - who know what they're talking about, say that they're free speech absolutists, I think they're saying that they're willing to go out on a limb and defend all different types of, quote-unquote, "bad speech" in order to protect the value of a free and open society. I just think eventually they're going to run into the same issue that Cloudflare runs into where you will end up - it's almost just a law of nature. You will end up being confronted with something so objectionable that's causing such kinetic harm that you're going to have no choice but to go against your own values. And that's exactly what Cloudflare did. They say that it's a dangerous decision - that the decision they made to cut services is dangerous for them, and it's one that they're not comfortable with. And they didn't even want to attribute it to the activist campaign because they're so ashamed of it. They just said they're taking this action because there is an imminent threat to somebody's life or physical safety. 

Ben Yelin: But we are going to reach that point because there are a lot of - I mean, Kiwi Farms is not alone in the data sphere among people who are willing to cause harm to others, who are willing to dox and stalk and swat and do all different types of terrible things to people. So I just think it kind of shows the limit, even of these companies that say privacy is a value, and free speech is a value. There are always going to be cases where you're going to have to go against your values. I've rarely seen companies - and there are a few of them - but it's just a real rarity that when you're faced with a circumstance like this, you're able to actually practice the very values that you preach to the fullest extent possible. 

Dave Bittner: Now, what does this mean for Kiwi Farms? Or - are they - can they shop around and find someone else to provide the service for them? Or is this it for them? 

Ben Yelin: Yeah. I mean, they sure can. All we know from them is they posted a statement on Telegram saying that this decision was made without any discussion - all they got was a suspension notice. They're denying that there was a threat to life or health and safety. Yeah, they can find another provider for back-end services. Again, the website is going to work. They're just going to be vulnerable to denial-of-service attacks. 

Dave Bittner: Right. 

Ben Yelin: So for all intents and purposes, they've kind of made themselves a target for people who want to institute cyberattacks. So they're probably going to be screwed in that respect. But yeah, I mean, if you - somebody wants to create a Cloudflare alternative that serves the 8chan community and Kiwi Farms, the market isn't going to stop them from doing so. It just might take a while to develop. And oftentimes, these alternatives just end up being kind of poor imitations of the original. I mean, you think of something like Truth Social. Like, that sounds good in theory as an alternative to Twitter. It's something that former President Trump created so that he could practice free speech on his platform. But they've... 

Dave Bittner: Right. 

Ben Yelin: ...Run into a whole bunch of problems. One of them is people don't want to be on a website or on a social media platform that doesn't regulate speech because then you just got a bunch of trolls and harassers and... 

Dave Bittner: Right (laughter). 

Ben Yelin: ...You know, neo-Nazis. 

Dave Bittner: It just doesn't work. 

Ben Yelin: It just doesn't work. 

Dave Bittner: It just doesn't work. 

Ben Yelin: Exactly. 

Dave Bittner: No. You cannot - I mean, you cannot have a forum without any kind of moderation. It just doesn't - I mean, it's a great idea. Or maybe great is not the right word for it. It's a seductive idea, right? 

Ben Yelin: Yes. 

Dave Bittner: But it just doesn't work. 

Ben Yelin: It just doesn't work. And... 

Dave Bittner: Yeah. 

Ben Yelin: ...I think more companies are going to be faced with situations like this where they're kind of - their back is against the wall, and they're forced to admit that we can't practice what we preach all of the time. And Apple's done it. Meta's done it. All of these companies that pride themselves on fostering a community of free speech, whether it's a pressure campaign or not, or whether it's protecting life and safety, it's going to happen. And so I just think it's a wake-up call to any company that thinks it's going to be the foremost champion of free speech and a company that can stay in the good graces of the rest of the corporate community. It's - as you say, it's just - it's kind of impossible. 

Dave Bittner: Yeah. What about the legal side of things? I mean, does - is there any peril for Cloudflare if someone like Kiwi Farms - as you say, you know, their harassment leads to suicides - is Cloudflare in anybody's, you know, crosshairs because of that? 

Ben Yelin: They're largely protected because of Section 230 of the Communications Decency Act. 

Dave Bittner: I see. 

Ben Yelin: I mean, they are a platform, so they have legal immunity because they aren't participating in this - they're just facilitating this type of speech by providing back-end services. They're not actually committing the speech themselves. So they were at very - a very low risk of legal liability because of Section 230. I wouldn't say there's zero risk, but I would say it's a low risk. I think it's more reputational harm and not wanting to be responsible for someone's actual death or suicide. I mean, I think that adds a whole new level of fear to some of these companies. 

Dave Bittner: Does getting involved with this in this way - I mean, does cutting off Kiwi Farms make them more liable to some legal peril because now they've made an editorial decision? And could someone argue that by making an editorial decision, they're no longer simply a platform? 

Ben Yelin: No. I mean, it's the same way that, at least as the law currently exists, social media platforms can regulate their own platforms and prevent all different types of speech. They're private organizations. They can choose with whom to do business. You could call them hypocritical for saying, oh, well, they stopped services to Kiwi Farms. So what about this other site where there was this type of hate speech directed at these individuals? That's an interesting theoretical argument, but because they're protected by 230 and they're not acting as a publisher, they're acting as a content platform, they're still pretty well protected by Section 230. 

Dave Bittner: I see. All right. Well, I see why - I'm chuckling because of how universal it is that these edge cases are what, you know, sets the pace, right? I mean... 

Ben Yelin: Yeah. I mean, maybe we should call this Yelin's law or Bittner's law... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Where any company that asserts its free speech principles will eventually run into a situation where a really bad person or bad group of individuals makes them abandon their principle. It is going to happen. 

Dave Bittner: Yeah. 

Ben Yelin: I think there are certain things that are certain in life - death, taxes and bad people on the internet. 

Dave Bittner: (Laughter) That's right. That's right. 

Ben Yelin: You're not going to stop any of those things from happening. 

Dave Bittner: Yeah, absolutely. All right. Well, we will have a link to the story about that in the show notes. My story this week is actually a report that came out from the Congressional Research Service. They just released their first report covering the metaverse. And it's called "The Metaverse: Concepts and Issues for Congress." Before we dig in here, Ben, can you give us a little background on the Congressional Research Service and the - exactly where they fit in? 

Ben Yelin: Sure. So they're housed under the Library of Congress, and they write policy papers explaining issues, theoretically to members of Congress, but more so to staff members and to the general public. So if you have a question on basically any policy issue, there's a good chance that Congressional Research Service has written a paper on it. So I know when I first started teaching classes, they were an invaluable resource. If I wanted to know about the legal history of this type of surveillance or that type of surveillance, one of the first places I always check is the Congressional Research Service. They're great writers. I know people who work for them. They have analysts who write on pretty much every topic under the sun. So they're a very reliable resource. 

Dave Bittner: And do they try to be nonpartisan in the work they do? 

Ben Yelin: Absolutely. So they are nonpartisan. They write papers based on the facts and based on their own research. 

Dave Bittner: Yeah. 

Ben Yelin: So they are not affiliated with any political party. As far as I'm concerned, I've never seen one - a Congressional Research Service paper that's espoused any type of ideology. I mean, they're pretty much straight down the middle. And that's why they're - one of the reasons why they're so trustworthy. I'll also say that they write about conflicts and concepts that aren't normally covered in mainstream media sources. So if you wanted to know about how the Federal Register works, that's something you would turn to the Congressional Research Service for... 

Dave Bittner: I see. 

Ben Yelin: ...As opposed to something like a hot-button political issue like abortion. They're far less likely to write on that than just things that only the nerds in government know about. 

Dave Bittner: Right, right. So filling a valuable purpose here, filling in some of those content gaps. 

Ben Yelin: Exactly, exactly. 

Dave Bittner: Yeah. Yeah. So this report digs into the metaverse, which, of course - I think it's fair to say there are lots of folks out there who think this is going to be the next big thing. Some folks are saying that, you know, the company formerly known as Facebook, who has gone so far as to name the company Meta after the metaverse, is really betting the farm on this. 

Ben Yelin: Right. I mean, the metaverse is the new thing. I've seen the little cartoon version of Mark Zuckerberg. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: It's not that great, to be honest. I'll just throw that out there. It kind of looks like early Second Life technology, but... 

Dave Bittner: Yeah. Well, it's got - it'll come along. It'll come along. 

Ben Yelin: Yeah, they're working on it. 

Dave Bittner: This report is very interesting. I mean, it goes into exactly what the metaverse is, some of the technologies that support the metaverse. They talk about things like augmented reality, mixed reality, virtual reality and then even so far as what they refer to as a brain-computer interface, which I guess is when - you know, not having to put a pair of goggles on to see the things in the metaverse, to actually wire into your brain itself and close your eyes and just be there (laughter). 

Ben Yelin: As someone who's kind of scared of virtual reality - I mean, I've put on those little headsets and had monsters chasing me and it, you know, kind of creates a pit in my stomach. I'm not going to be an early adopter of the metaverse. I'll wait till everybody else has had their experience with it. 

Dave Bittner: (Laughter) Come on. Come on, Ben. All the cool kids are doing it. 

Ben Yelin: I know. Don't peer pressure me. 

Dave Bittner: (Laughter) That's right. That's right. Everybody else is there. OK. So they go into some of the things that Congress would need to consider, some of the things they think will come up, perhaps. And it's a lot of stuff that we talk about here - content moderation, data privacy, market power and competition, the digital divide. These are all important issues as we head into this new possibility here, yes? 

Ben Yelin: Yeah. I mean, you mentioned a couple of really interesting ones. So content moderation, first and foremost - just as there are offensive people, as we said in our last segment, on the internet, there are going to be offensive, bad people in the metaverse who are going to try and do very bad things. 

Dave Bittner: Well, remember, we saw this thing - one of the early things, I think, when Facebook first released one of their beta versions, there were women who were reporting that folks were coming up to them in the metaverse and physically assaulting them. You know... 

Ben Yelin: Right. It was harassment. 

Dave Bittner: ...Getting too - yeah, sexual harassment within the metaverse. Who could have predicted, Ben? Who could have predicted it? 

Ben Yelin: Exactly, exactly. I mean, we should have a new law for this as well. 

Dave Bittner: Right, right. 

Ben Yelin: Users are going to come in contact with virtual beings who do awful things to them. It seems like the analysts, at least that they talked to for this report, say that any effective moderation in the metaverse is going to be virtually impossible. It's just going to require the type of real-time enforcement that is impossible. It just can't exist. 

Dave Bittner: Yeah. 

Ben Yelin: They talked to the founder of Second Life who said that they never developed identity systems that would enable strong governance. So perhaps eventually, they would be able to adopt content moderation practices. But at least in the early days of the metaverse, there would certainly be a lot of potential for abuse. Data privacy is another one. People submitting personal information, which is, as they say in this report, the seminal asset in the digital economy, that's going to be a problem with the metaverse because a lot of people are going to be giving valuable data to a largely unregulated market, so to speak, a virtual market. 

Ben Yelin: Another one that really interests me is the digital divide. If the metaverse becomes something that's widely adopted and it's where people are making contacts, making real-life connections, making arrangements for job interviews, if it starts being used in private schools and then public schools, then we could have a situation where the metaverse is something like broadband, where... 

Dave Bittner: Right. 

Ben Yelin: ...Some people have access to it and some people don't. And that becomes a real problem of equity. And there's really not a policy solution to it. I think we're a long way from that being the case. I think we look at the metaverse now the way we looked at Facebook when it first came out in 2004, 2005. Like, what is this? 

Dave Bittner: Yeah. 

Ben Yelin: This thing is never going to get a billion users. 

Dave Bittner: Well, even the internet, you know, when that was new, and it was just the - it was that first wave of, you know, nerds like you and me who were - you probably weren't around. 

Ben Yelin: No, I was pretty young. 

Dave Bittner: I was. (Laughter) I was there. 

Ben Yelin: You were there. You were there at the founding. 

Dave Bittner: Believe me. It was the first wave of nerds like me and my college roommates who were on the internet and how - but going through that process of trying to establish norms, you know? And... 

Ben Yelin: Right. You're trying to govern space that, almost by definition, is ungovernable. There are no jurisdictional boundaries. There isn't, like, a preset value in terms of rules and regulations that are just going to exist because it's been willed into existence. It is kind of the Wild, Wild West. And I don't think policy - I mean, we know that policymakers and lawyers and judges are always behind the times when it comes to technology. 

Dave Bittner: Right. 

Ben Yelin: I think they're not even close to answering some of these policy questions. I think this CRS report is really probably the first time that even staff members for members of Congress have thought about the policy implications of the metaverse. So I'll say that I really hope they read this. It's well-written. It's compelling. And I think these are issues that eventually Congress and state legislatures are going to have to work through. 

Dave Bittner: Is it, by its nature and perhaps even necessarily, a reactive process? 

Ben Yelin: Yeah. I mean, we're going to - it's going to be reactive because we're going to get some of these horror stories like you talked about, a few people facing abuse or sexual harassment... 

Dave Bittner: Right. 

Ben Yelin: ...Or there being arbitrary content moderation policies that cause a political controversy. So it is all going to end up being reactive, and it's just so early in the existence of the metaverse that we haven't even gotten to the reactive part yet. 

Dave Bittner: Right. 

Ben Yelin: I mean, when the only articles on problems in the metaverse are on sites that, you know, people who don't follow this stuff would never end up on in a million years... 

Dave Bittner: Yeah. 

Ben Yelin: ...It's - we're just a long way away from those types of conversations. 

Dave Bittner: I hate to be unkind, but I'm just imagining, you know, the members of Congress, you know, sitting there with goggles on - you know, the octogenarian members of Congress with their goggles on, trying to make heads or tails of this new world. 

Ben Yelin: What is going on? Yeah. 

Dave Bittner: And, you know, I mean, I'm sure some of them will take to it and adapt to it and love it and be right on top of it. But I guess this is another one of those examples where... 

Ben Yelin: Maybe it's not great that we're led by a bunch of 80- and 90-year-olds? 

Dave Bittner: Right, right. 

Ben Yelin: Yeah. 

Dave Bittner: And - but also that the velocity of change in this technical world, it might be a mismatch with how - the foundational framework of how some of our systems are built. 

Ben Yelin: Absolutely. We're not built for quick, adaptive changes. 

Dave Bittner: Yeah. 

Ben Yelin: That's not how the policy world works - certainly not how Congress works. 

Dave Bittner: Right. 

Ben Yelin: It's usually multiple years in the making and a lot of fits and starts before anything happens. 

Dave Bittner: Yeah, yeah. All right. Well, as you say, it's an interesting report and a good read. So we'll have a link to that in the show notes. If that's something that you're interested in, do check it out. We would love to hear from you. If there's something that you would like for us to cover, you can email us. It's 

Dave Bittner: Ben, it is always a pleasure to have Robert Carolina join us. He's been on the show a couple of times. He is a senior fellow with the Information Security Group at Royal Holloway, University of London. And you had the honors this time of speaking with Robert Carolina, and your conversation focused on Edward Snowden. Here's Ben and Robert Carolina. 

Robert Carolina: I think the challenge is an ethical approach to disclosure involves balancing a series of related problems. The first test - I suppose the first thing for anyone who wants to disclose or feels they should be disclosing is whether or not they've spotted evidence of wrongdoing. In other words, are - is the person really convinced that what they've spotted is wrong with a capital W? There's a lot of times we see things we don't like or we think it might be wrong, or there's something - the smell test, it just doesn't quite pass. But is that really enough, I mean, if you're going to go down this path of disclosure to a journalist? Because, before we take a next step, let's remember that for people who work in - particularly in high-security situations or even just under a civil law nondisclosure agreement - you know, sort of like - I don't want to call it a low-security situation, but it's - the stakes are lower, certainly. When you move outside of the official whistleblowing channel, the official policy, and you say, I'm going to disclose to a journalist, you know, you're taking - it's an enormous step. 

Ben Yelin: Right. 

Robert Carolina: You're taking onto yourself this view that - you know, forget what the official policy is. Forget what the official channels are. Forget what the law says. Forget what my contract says or whatever it is. I know better than somebody else what the public needs to know, and now I'm going to take that giant step. So it's a big step. But I think before taking it, the - a threshold question is, have I actually spotted something that is so terribly wrong that I'm even prepared to contemplate that? 

Ben Yelin: Right. Right. 

Robert Carolina: So, yeah, I mean, I keep plowing down this road, but, I mean, there's one important thing, Ben, that of course, you and I have to remind everybody, and that is what we're talking about right now, going outside of official channels, for a lot of people, that's not just sort of taking a calculated risk that they might get sued in a civil court; for a lot of people, that step is committing a crime. 

Ben Yelin: Yes, it's committing a crime, either getting yourself locked up in this country or, in the case of the person we're about to discuss, living in a foreign country for almost a decade as a result of one's actions. So let's bring that full circle because people can probably anticipate where this is going. We now have... 

Robert Carolina: Probably from the headline on (laughter)... 

Ben Yelin: Exactly. We're not super subtle here. But as if this were a law school exam, I would like you to apply that legal test that you developed to the situation with Edward Snowden in 2013. How does the balancing test look, in your view? 

Robert Carolina: Well, OK, I'll refrain from making the conclusion until I lay out the case. We begin with who is or who was Edward Snowden years ago when the disclosures took place. I mean, who he is now, that's - I'm not going to look at that. But who was he? All right. He was a system administrator, and he had root access to a whole lot of systems that held an enormous amount of classified information about - apparently about what the NSA does and how they do it and everything else. I haven't seen anything, I haven't heard anything that suggests that he had a lot of know-how or a lot of training in how intelligence products are used. 

Ben Yelin: Right. 

Robert Carolina: You know, who uses them? Who consumes them? And how? I haven't really heard a lot of anything that suggests that he had a whole lot of training in the law that governs intelligence projects, and I haven't heard or seen any evidence that suggests he was significantly trained or had a good viewpoint of how these programs or projects fit with U.S. defense interests or multilateral defense interests. 

Ben Yelin: And that's pretty important because you and I both know - and I think hopefully our listeners have picked up on this - Fourth Amendment jurisprudence, especially when it comes to national security, is extremely complicated, and a lot of factors go into whether something is, quote-unquote, "unconstitutional." And I think a point you made in your presentation is he was just not in a position to know, at least to the best of our knowledge. 

Robert Carolina: Well, I don't see any evidence that he was in a position to know, and I just don't see him having the chops to, in this case, even spot with clarity the existence of wrongdoing. 

Ben Yelin: Right. 

Robert Carolina: Now, it didn't pass the smell test. He didn't like it. He felt wrong about it. So maybe holding him to a standard of, you know, is he - does it have to be illegal in order to be immoral? Well, not necessarily. But if we shift the argument from legality to morality, I also haven't really heard a lot of evidence, in listening to interviews with him, for example, as to how he - what informed his moral compass of understanding not just that something was wrong but trying to gauge how wrong it was. 

Ben Yelin: Right. 

Robert Carolina: He felt it was very, very wrong. So all right. But let's move on to the next side of this thing, which is how would he assess public interest in the disclosure? How would he - was he in a position to figure out the - let's call it the legitimate public need to know? That might be a different - a better way to put it because public interest - oh, yeah, I'm interested in a lot of things. But genuine, legitimate need for the public to know things that are going on, things that he felt were bad, that's one thing you got to balance. And what's on the other side of this little teeter-totter? What kind of harm might be caused if I disclose it? Now, in this case, you know, he's obviously a very smart individual, very clever individual. And, again, I don't see anything in his constitution, in his makeup, in his training, in his interviews or anything else that leads me to believe that he's got sort of a deep insight into what is the legitimate public need to know certain types of things. And, equally, I don't see anything in his training or background that suggests he would be able to calculate what kind of damage could be done with these disclosures. 

Ben Yelin: Right. Right. 

Robert Carolina: I think the worst aspect of all of it comes up in something that he says in the famous documentary that they filmed at the time of the disclosure, "Citizenfour." Interesting film - it's really interesting to watch. They're in his hotel room in Hong Kong, I think it is. And he's got the journalists in the room with him, and they're videoing him. They're doing all this kind of stuff. It's really - you know, it's a really tense moment. And the journalists, it's like for the first time, they begin to realize just how much data they're being given. 

Ben Yelin: Right. Right. 

Robert Carolina: You know, it's like, this is a mountain of stuff. It's like - it's worse than, you know, a kid in a candy factory. I mean, you know, it's like - I think it probably began to dawn on them that, oh, my God in heaven, this is, like, way more than we could have ever hoped for. And one of the journalists asks the question. He says, well, how much of this can we publish? And here's the great quote. And I want to - I went, and I actually looked at the published shooting script, you know, where they took the lines because I didn't want to misquote him. And the quote that I pulled from this was - Snowden says, I'll leave, you know, what to publish on this and what not to publish to you guys; I trust you to be responsible on this. OK. So what's Snowden's solution? Does he dribble out breadcrumbs like Mark Felt? No. He says here's the whole ball of wax. Here's the big ball of string. Here's the entire thing. I'm going to chuck it out the door. I'm going to put it in your lap, and I'm just going to say, hey, it's your problem. You figure it out. 

Robert Carolina: We talk a lot in security about things like disclosure, responsible disclosure in terms of, like, finding vulnerabilities and all that kind of stuff. I think that if someone wants to take on the mantle or wants to try to claim the moral high ground for being what I would call a good or a just whistleblower, I think at the very least, they need to be able to satisfy these criteria - that they can accurately identify something that is wrongdoing - they understand just how wrong it is and why - that they are able to assess the legitimate public interest in finding out what the wrongdoing is, and they're able to balance that against the harm that is being done. And then whatever they disclose has to fit within that balancing structure. I think that's ultimately what makes a whistleblower who can at least, you know, sleep well at night. 

Ben Yelin: So I always have to be the annoying lawyer and turn into the devil's advocate here. And I promise I will do this briefly because you just made a very compelling case. But I think this is just a point of contention here. After Snowden disclosed this information, two years later, Congress passed the USA Freedom Act, which was a pretty wide-reaching reform proposal of our surveillance state. They also made some minor revisions to the FISA Amendments Act. And more broadly, we've now had a nine-year debate on the value and the extent to which we want to have such a pervasive national security surveillance state. So given all of that, that's something of value that he provided us. How does that affect your analysis, if at all? 

Robert Carolina: Well, first, let me violently agree with everything you just said. 

Ben Yelin: (Laughter). 

Robert Carolina: I agree that the disclosures prompted an important debate, and frankly, that debate was long overdue. Many of us who work in the field of cybersecurity had suspected for a very long time that things basically worked this way. So it didn't surprise me, and there were a lot of people who weren't surprised. But I guess what did surprise me was how surprised the public policymakers were and how surprised a lot of other people were. So I agree with that side of this, Ben. However, even though the disclosures prompted a long overdue debate on these types of issues, I'm not persuaded that prompting that debate would only happen after disclosing everything that was disclosed. I think there probably could have been a lot less that might have gotten people to move on this. 

Robert Carolina: I don't know. I might be wrong. I'll freely concede that. But I don't think that the good that has come from this necessarily outweighs the potential harms that could have been created. And I don't think - I just don't think it makes for what I would think of as responsibility in the mind of the whistleblower. And, you know, to be brutally honest - and I was this honest with the students I was talking to as well - piece of advice No. 1, don't become a whistleblower. Don't do it. That's just a basic, simple piece of advice. Don't do it. Right or wrong, lauded as a hero or thrown in jail, you know, even if you win the sort of whistleblower lottery and everyone applauds you, there's a very real chance that your career is over and that your life is over because there will be a large number of people who are going to, like, lift you up on their shoulders, and they're going to applaud you, and they're going to say, oh, what a great person you are, and then you turn around with your CV, your resume, and you say, oh, you have any jobs going? Oh, gosh, look at the time. 

Ben Yelin: Yep. 

Robert Carolina: You know, it's - I think it's a very, very potentially self-destructive act. So if someone's going to cross that threshold, obviously, one of the series of reforms that have come through, I understand - although you're the expert on this; I'm not - is that particularly within government offices, there are a lot of, let's say, official channels for directing whistleblowing concerns. 

Ben Yelin: Absolutely. More so than there used to be. 

Robert Carolina: And Snowden knew about those, and he said, but I didn't think they were effective. It's like, well, that's just kind of a way of saying you didn't get the result you wanted. That's not the same thing as they weren't effective. You know, right or wrong, it's still hard to tell. But for anyone sort of, like, navigating that point of saying, you know, I'm prepared to suffer in a civil sense, in a nongovernment operations sense, I'm prepared to be sued by my employer. I'm prepared to be fired. I'm prepared to, you know, become a pariah in my industry or whatever it is. You know, if you're prepared to take that kind of step, you have to really search your soul and say, am I really onto something here, or am I - is this becoming a self-fulfilling prophecy? Because anyone who takes that step has reached the conclusion, I know better. I know better. I know better than my boss. I know better than the person who runs the company. I know better than the person who runs the agency. I know better than the inspector general. I know better than all these people in the line ahead of me who are supposed to be experts at this sort of thing. But I know better. You really need to be careful before you go down this path because sometimes blowing the whistle itself is a harmful thing. 

Ben Yelin: Right. Right. 

Robert Carolina: And you really need to search your soul before you go down this path. But, ultimately, Ben, this is the problem of talking about subjects like this. I don't think there's a simple answer to it. I - you know, whenever you look at hard ethics questions like this, I don't think there is a simple answer to it. 

Ben Yelin: Yeah. I mean, I think that certainly rings true. And I think we might have some of our listeners who would agree with you and some who would think Snowden and those who came before him were making a morally courageous decision. And I think that's something we'll have to leave up to our listeners. But I want to thank you for, really, a great interview and for your insight. And glad you were able to join us. Robert Carolina, thank you. 

Robert Carolina: Thank you. 

Dave Bittner: Interesting conversation, Ben. What were some of the take-homes for you? 

Ben Yelin: I like that he has developed a framework for judging releases of privileged or classified information, that it's not arbitrary, and you can separate it from one's own politics. You can actually look at it analytically and think about the nature of the information being revealed, the qualifications of the person who is revealing that information, whether they're qualified to understand it and qualified to reveal it, and to whom they are revealing that information. And I think you can make qualitative, normative judgments about the next Edward Snowden if you have this framework. If you realize that somebody is doing it responsibly, if they've gone through proper channels and haven't gotten a sufficient response, then you can have a different normative judgment than somebody who just throws the paper on the ground and says, I've had enough of this. I'm going to give this to a gossip journalist, and we'll see what happens. 

Dave Bittner: Right. 

Ben Yelin: So I appreciate Robert having that - trying to put that framework together. I thought it was a very well-done framework. 

Dave Bittner: Yeah. It's fascinating to me, as we get a little distance from that whole situation with Edward Snowden, that, you know, there was a period of time when it seemed as though how you felt about the Snowden revelations was almost a litmus test for your thoughts on privacy and government surveillance and all that kind of stuff. I remember, you know, a lot of folks who I know, people I respect, who said it's not so much what Edward Snowden did; it's the way he did it - which I think is interesting because I'm not sure if he'd done what he did in a different way, it would have been possible to have the outcome he was looking for. 

Ben Yelin: A hundred percent. 

Dave Bittner: Does that make sense? (Laughter). 

Ben Yelin: Yeah. I mean, if you do go through proper channels and you have a conversation with your superiors and it's never released in The Guardian newspaper and... 

Dave Bittner: Right. 

Ben Yelin: ...You know, splashed across social media and they have segments on John Oliver's show, it's never going to get into the cultural zeitgeist. So it's not always as easy as saying, oh, just bringing this up to your superior, and we'll deal with it internally. But I do think what Robert has done here is at least come up with a framework of judging these situations beyond, do I like the person who's revealing classified information, or do I not? 

Dave Bittner: Right. 

Ben Yelin: I just think that is a valuable exercise. 

Dave Bittner: Yeah, absolutely. All right. Well, again, our thanks to Robert Carolina for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.