Caveat 6.30.21
Ep 84 | 6.30.21

Can individual liberties keep up with technology?

Transcript

Andrew Hammond: Can some of these ideas about individual liberties and individual rights, which reach back to Enlightenment ideas, can they respond quickly enough to technology which is just changing exponentially?

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben has the story of an appeals court decision protecting Big Tech companies from responsibility for acts of terrorism. I look at a French surveillance company whose executives may find themselves in hot water. And later in the show, my conversation with Andrew Hammond, historian and chief curator at the International Spy Museum. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some interesting stories to share this week. Why don't you start things off for us? 

Ben Yelin: Yeah. And I'll note, Dave, that our stories this week kind of have a common theme, which I think is an interesting tack for us. 

Dave Bittner: Right. 

Ben Yelin: I'd almost recommend it going forward. 

Dave Bittner: Just kind of worked out that way. 

Ben Yelin: It did, yeah, entirely by coincidence. 

Ben Yelin: So my story comes from, originally, a Twitter account I follow, a gentleman named Gabriel Malor. And he's someone I rely on to post interesting appeals court decisions on a variety of topics. And he alerted me to a decision from the 9th Circuit, which is on the West Coast, that holds that the Communications Decency Act, specifically Section 230, largely protects the Big Tech platforms Google, Twitter and Facebook from lawsuits claiming that they assist international terrorists, specifically ISIS, by allowing their platforms to be used by ISIS members and thus bear direct or indirect responsibility for these terrorist attacks. 

Ben Yelin: So this is actually a consolidated case. There are three separate lawsuits here coming from three separate plaintiffs, each of whom had a family member that was killed in a terrorist attack. One of them was killed in France, one of them in Turkey and one of them in the San Bernardino attack here in the United States. 

Ben Yelin: So the Anti-Terrorism Act doesn't just impose criminal and civil liability on terrorists themselves, which it does, but it also includes a provision that was added in recent years, about five years ago, that potentially imposes criminal and civil liability on anybody who gives substantial support to or sponsors international terrorism. It's - it was enacted as part of the Justice Against Sponsors of Terrorism Act of 2016. So that includes secondary civil liability for aiding and abetting or conspiring to commit international acts of terrorism. 

Ben Yelin: What the plaintiffs are alleging here - and their cases are slightly different, but they're basically alleging the same thing - is that Google or Facebook or Twitter are secondarily responsible for these acts of terrorism because they allowed this content to be posted on their platforms and because their algorithms promote additional content coming from these terrorist organizations. So, for example, Google, when you search for ISIS or ISIS-related material, will give you a mosaic of recommended searches for additional information on that topic. 

Dave Bittner: Right, right. 

Ben Yelin: They might lead you to additional terrorist recruitment videos. 

Dave Bittner: I was going to say I think YouTube is most famous for this - potentially taking you down a path that most people probably would be better off not going down. 

Ben Yelin: Yeah, the proverbial rabbit hole. Like, oh, I see you're interested in this type of terrorism. 

Dave Bittner: Right (laughter). 

Ben Yelin: Can I interest you in this other type of terrorism? 

Dave Bittner: (Laughter) Right. 

Ben Yelin: So these three plaintiffs brought a case basically trying to hold these companies liable. And the court of appeals here - the decision is complicated, but it largely shields these tech companies from liability because of Section 230 of the Communications Decency Act. That section says that companies cannot be held liable for the content that's posted on their platforms. That's very basic to what the Communications Decency Act and Section 230 do. 

Dave Bittner: Right. 

Ben Yelin: And that's something we've talked about a million times. 

Dave Bittner: Yeah, and that's sort of like the phone company can't be held responsible for conversations you have over their network. 

Ben Yelin: Exactly. So that's a long-standing principle. Where this gets interesting is things like an algorithm. So what these plaintiffs are alleging is it's not the content that's posted on these websites. It's actually the creation of the tech companies themselves - the algorithms, the mosaics that are created by the use of search engines or searching tools - that that's actually content created by the tech companies themselves, and they should be held secondarily liable for that content. 

Dave Bittner: Can you just quickly unpack what secondarily liable means? That secondarily term - what does that mean? 

Ben Yelin: So it means that you can be held liable as a cause of terrorism even if you did not commit the acts of terrorism yourself. 

Dave Bittner: I see. 

Ben Yelin: So, of course, the terrorists themselves are liable, but there can be other entities that also assume liability for the same incident, even though, as far as I know, Google employees themselves aren't the ones actually committing the acts of terrorism. 

Dave Bittner: OK, sure. Makes sense. 

Ben Yelin: Section 230 of the Communications Decency Act can either be interpreted in a limited sense, applying only to the content that somebody else posts on a platform, or it can have this broad interpretation where it also shields companies from liability for algorithms or any of these search engine optimization-type things that the tech companies create - that these are creations used for their own self-enrichment, they're not built for the purposes of fomenting acts of terrorism, and that these are things that are universally applicable. So the algorithms haven't been specifically designed to direct traffic to ISIS websites or whatever. 
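To make that content-neutrality point concrete, here is a minimal, entirely hypothetical sketch of the kind of "related items" logic being described - a generic recommender that ranks catalog entries purely by word overlap with a query, with no knowledge of what any of the words mean. The catalog strings and the function name are invented for illustration and are not drawn from any platform discussed in this episode.

from collections import Counter

# Invented placeholder catalog -- not real data from any platform.
CATALOG = [
    "history of cryptography lecture",
    "cryptography puzzle walkthrough",
    "cooking show season finale",
    "history documentary trailer",
]

def related(query, catalog, k=2):
    # Score each item by how many words it shares with the query,
    # then return the top k. The rule knows nothing about topics.
    q_words = Counter(query.lower().split())
    def overlap(item):
        return sum((Counter(item.lower().split()) & q_words).values())
    return sorted(catalog, key=overlap, reverse=True)[:k]

print(related("history of cryptography", CATALOG))
# -> ['history of cryptography lecture', 'cryptography puzzle walkthrough']

Fed any query, the same neutral scoring rule surfaces whatever happens to be similar - which is the property the plaintiffs say amplifies terrorist content and the companies say makes the algorithm a general-purpose tool.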

Dave Bittner: Right. 

Ben Yelin: They are created... 

Dave Bittner: That's just what they do (laughter). 

Ben Yelin: Exactly. They are created for, so to speak, like, a secular purpose... 

Dave Bittner: Right, right. 

Ben Yelin: ...Which is, we want to make more money. And just incidentally... 

Dave Bittner: We want to amplify engagement. 

Ben Yelin: Exactly. 

Dave Bittner: Unfortunate side effect of that from time to time is terrorist recruitment. 

Ben Yelin: Right. 

Dave Bittner: (Laughter). 

Ben Yelin: If it just happens that people are recruited and commit acts of terrorism causing mass destruction and death, then, you know, we were just trying to make a little bit of money. What's the problem with that? 

Dave Bittner: (Laughter) Right. We have a responsibility to our stockholders. 

Ben Yelin: Exactly. So the majority on this panel of the 9th Circuit basically held that Section 230, even when it comes to algorithms, largely shields these companies from liability. 

Ben Yelin: There's a really interesting concurrence from one of the judges saying, basically, under the precedents of this circuit and, to the extent that there are precedents, from the Supreme Court, this case is decided correctly. That liability shield does exist. We are bound as an appeals court by that precedent. But we need to reconsider our Section 230 jurisprudence going forward because companies probably should be held responsible. You know, it probably is a wise policy for companies to bear some civil liability, at least, for these types of algorithms where they end up illuminating and directing traffic to these destructive websites that can cause acts of violence or death. 

Ben Yelin: This is something you'll see in a lot of decisions, where a concurrence will say, we're bound by precedent, but this is what I would like to see happen in the future. So it's both a call, I think, to members of Congress who could clarify the Communications Decency Act to not have this broad liability shield when we're talking about algorithms. 

Ben Yelin: But it's also potentially a call to the - either the full Court of Appeals for the 9th Circuit, other federal courts of appeals and the U.S. Supreme Court, saying, I want to introduce this alternative theory of Section 230 where it's - the shield is limited to somebody else posting content on that platform. The shield should not apply as broadly when we are talking about an algorithm that's created by the companies themselves that is directing this traffic. 

Ben Yelin: And I'm suspecting that perhaps this concurrence in the long run might lead to some sort of policy change, whether that happens through the courts or whether that happens through Congress. But it's a fascinating decision. It goes through the history of liability under this anti-terrorism statute and the amendments made by the 2016 Justice Against Sponsors of Terrorism Act. 

Dave Bittner: What happens next now that this decision has been released? Does it go somewhere else from here? Is it settled? What are the machinations that occur next? 

Ben Yelin: So imagine you are taking a six-hour road trip. We're at the part of the road trip where you have closed your garage door and driven up the street. So that's my way of saying we are very early in the process... 

Dave Bittner: OK. 

Ben Yelin: ...Because this appeals court decision was about motions to dismiss by the companies themselves. And they actually denied some of the motions on various issues that we did not discuss. 

Dave Bittner: OK. 

Ben Yelin: So some of these motions to dismiss are now going to go back to the district court level. These are consolidated cases, so the outcomes at the district court level could be disparate. It could come back up to the Court of Appeals for the 9th Circuit. They would rely on this precedent from the decision here, but it could take a while for the case to come back up to the 9th Circuit. 

Ben Yelin: Once any case makes it back to the 9th Circuit, you could have another decision. That decision could be reviewed en banc by the full 9th Circuit. So you know how frustrating it is to wait so long for any resolution to these issues. 

Dave Bittner: (Laughter) Right. The slow pace is a - both a feature and a bug, right? 

Ben Yelin: Yes, it sure is. It is. I think saying it moves at a snail's pace is a little too generous. 

Dave Bittner: Right, and you're unfairly besmirching snails (laughter). 

Ben Yelin: Yeah. Snails are much faster than our judicial system... 

Dave Bittner: Right. 

Ben Yelin: ...In some circumstances. 

Dave Bittner: Right. 

Ben Yelin: I do think the reasoning here is going to be applicable going forward, obviously in the 9th Circuit, but now this can be a defining case in other circuits, where they really took a look at this issue in a bunch of different contexts 'cause we're talking about a bunch of different instances of terrorism. So I think this will have some precedential value. But as far as getting some sort of final resolution, either on these three particular cases or a final resolution on the issue as a whole, yeah, we're going to have to sit and wait. 

Dave Bittner: Yeah, interesting, I guess, signaling here - right? - of - interesting messaging coming out of this decision. 

Ben Yelin: It's fascinating messaging. 

Dave Bittner: Yeah. 

Ben Yelin: And I think it's a decision that's going to be cited going forward because I think it does help sort of define the scope, as this court sees it, of Section 230 in anti-terrorism cases. 

Dave Bittner: All right, well, we'll have a link to the tweet that kicked off your investigation there. It actually hasn't been - as you and I record this, there hasn't been a whole lot of coverage of this decision yet. So you've really been digging into the case itself. 

Ben Yelin: Yeah, yeah. I've had to do some reading. It's hard to believe. 

Dave Bittner: (Laughter) Wow. 

Ben Yelin: It's a 167... 

Dave Bittner: It's like you're back in law school (laughter). 

Ben Yelin: I know - 167-page case. So it's relatively dense. But, you know, for those of you who enjoy legalese, it's a really fascinating decision to read. 

Dave Bittner: Yeah. All right. Well, we'll have a link to that where you can track it down there. 

Dave Bittner: My story this week comes from the MIT Technology Review. This is an article written by Patrick Howell O'Neill, and it's titled "French Spyware Bosses Indicted for their Role in the Torture of Dissidents." And this is kind of fascinating and, as you mentioned at the outset of our show, kind of along the same lines of what you were talking about here. 

Dave Bittner: This is about a company in France. The - originally, the company was named Amesys. They later changed their name to Nexa Technologies. 

Ben Yelin: Always good to rebrand after you've sold some of your spyware to... 

Dave Bittner: Happens all the time. 

Ben Yelin: ...State sponsors of terrorism, yeah. 

Dave Bittner: Happens all the time. 

Dave Bittner: So what's happened here is that the Paris Judicial Court has indicted some of the leaders of this company, claiming that they sold their technology to Libya and Egypt over the past decade or so, and that led to a crushing of opposition, torture of dissidents and other human rights abuses. 

Dave Bittner: This, I think, is fascinating. You know, I do a weekly segment over on the "Grumpy Old Geeks" podcast, and we have sort of a dark running gag over there that no one ever gets punished for anything in tech. You know, like, doesn't matter what - you can destroy democracy, and no one goes to jail, right? No one goes to jail. 

Ben Yelin: Right. 

Dave Bittner: And in this case, there may be some consequences for some folks here from the powers that be over in France. What's your take on this, Ben? 

Ben Yelin: It's a really interesting story. I'll first say that I'm limited by the fact that I am unfortunately not an expert in the human rights statutes at issue here coming from the French government. 

Dave Bittner: Right. 

Ben Yelin: So I am speaking rather generally and not from an area of specific expertise here. But these are really serious allegations. So one of the allegations stems from the sale of the spyware to Gadhafi, Moammar Gadhafi, back in 2011, which was during the - really one of the early days of the Arab Spring. And the allegation is that the use of the software led directly to spying on dissidents and eventually led to their torture. 

Ben Yelin: And they were able to testify. People who were tortured were able to testify in a case in 2014 that helped connect the dots from the spyware to the torture itself. And the same with a separate sale of the spyware to Egypt under Sissi in 2014. 

Ben Yelin: I think it's really interesting, just like our previous story, that there's perhaps some limitation instigated by the judicial system, in this case in France, on the principle that you can do whatever you want to make a buck, if that makes sense. In the case we just discussed, it's can we maximize our algorithm? Can we design an algorithm in a way that makes us, the executives of these tech companies, as rich as possible, no matter what the consequences are for society at large? 

Dave Bittner: Right. 

Ben Yelin: And here, it's can we, under relevant French law, sell this spyware to these countries and these leaders who are pretty clearly cruel to political dissidents... 

Dave Bittner: Yeah. 

Ben Yelin: ...Without facing any consequences? And I think what this indictment signals is there is a limit to how much you can do as a tech company or as a tech platform to make a buck if your actions are going to lead to some type of societal harm. So, I mean, I think that's a really interesting lesson. Whether it continues in this case I think is under question because it's just an indictment that's in the preliminary stages of consideration... 

Dave Bittner: Right, right. 

Ben Yelin: ...In these French courts. 

Dave Bittner: One of the things they point out in this article is that at the time these deals were made, there were no specific prohibitions in French law against these sorts of sales. And the French government did not approve or disapprove of the sale. And by doing nothing, it allowed the sale to move forward. 

Dave Bittner: What - I guess what I'm thinking of is that, you know, like, here in the United States, we have rules against doing business with certain nations. 

Ben Yelin: Right. 

Dave Bittner: And there are things that you cannot export, or you must get permission before you share certain technologies. Famously, back in the '90s, there were certain types of encryption that were considered to be munitions, and you could not export them. I remember there's a story of a gentleman who famously tattooed one of the encryption algorithms on his arm and then traveled internationally to try to make a point that these restrictions were silly and not really enforceable. 
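For context on why those export restrictions struck people as unenforceable: the core math of a public-key scheme like RSA fits in a handful of lines. What follows is a toy sketch with deliberately tiny, insecure numbers, purely for illustration - it is not claimed to be the specific algorithm from the tattoo story, which isn't identified here.

# Toy RSA -- wildly insecure numbers, shown only to illustrate how compact the math is.
p, q = 61, 53                        # small primes; real keys use primes hundreds of digits long
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent: modular inverse (Python 3.8+)

m = 42                               # "message" encoded as a number smaller than n
c = pow(m, e, n)                     # encrypt: c = m^e mod n
assert pow(c, d, n) == m             # decrypt: c^d mod n recovers m
print(n, e, c)

With numbers this small the scheme is trivially breakable, but scaling up to real key sizes barely changes the amount of code - which was the point being made about treating the algorithm itself as a munition.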

Ben Yelin: And the algorithm - the tattoo fit on his arm? I'm a little surprised. 

Dave Bittner: I don't know. 

Ben Yelin: Must have been a talented tattoo artist. 

Dave Bittner: Yeah, small text or large arms - I don't know. 

Ben Yelin: Yeah. 

Dave Bittner: But either way, you know, it was just an interesting statement from somebody who kind of knew better. 

Dave Bittner: Let's say you're the head of a tech company here in the U.S. and you are a company who handles surveillance technology. What's your take on this story? 

Ben Yelin: My take is I should be careful who I sell it to. And I should definitely involve legal counsel because, yes, in the United States, based on various sanctions that we have and, frankly, on the statute that we just mentioned in our previous story, you can be held liable for materially supporting either terrorist organizations or, in this case, state sponsors of terrorism. So there are consequences, and there are limits. 

Ben Yelin: In terms of what happens in this case, because it's an indictment, the judges have to decide whether the case can proceed to criminal court or whether it would be dismissed based on a lack of evidence. And what one of the experts here said is these cases are notoriously difficult to prosecute for a number of reasons. 

Ben Yelin: Sometimes the evidence itself is a little unclear. Sometimes, as in France, the relevant regulations or requirements are confusing enough that a court might decide that there isn't enough meat on the bones of these statutes to prosecute a case. And corporations can exert political pressure on either judges or politicians to shield themselves from liability in these cases. I mean, they still carry a lot of political influence. 

Dave Bittner: Right. 

Ben Yelin: So I think anybody who's overly optimistic about holding these companies accountable, I think you need to kind of curb your enthusiasm, so to speak... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...And wait to see whether this case actually proceeds. But I do think, you know, if you work for a tech company in the United States and you see a case like this, I think this is a warning sign that you need to have some level of care in sales and procurement in order to make sure that you're not doing business with either terrorist organizations or state sponsors of terrorism that are enemies of your own country. 

Dave Bittner: Yeah. All right, well, we will have a link to that story from the MIT Technology Review in our show notes. 

Dave Bittner: Of course, we would love to hear from you. If you have a question for us or a story you think we should cover, you can send us an email. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Andrew Hammond. He is the historian and chief curator at the International Spy Museum. Here's my conversation with Andrew Hammond. 

Dave Bittner: Today, we are talking about this notion of freedom versus security online and in the real world. You know, you wrote this interesting piece with your co-author, Richard Aldrich, about Obama, the NSA and U.S. foreign policy. Can we just start there with a little bit of level-setting about some of the backstory of how we have managed that tension between security and freedom? 

Andrew Hammond: I think it's a question that goes to the heart of debates about modern democracy and modern liberalism, the security of the state, the security of the enterprise, the umbrella under which people within that state live, and then also their own individual rights, their own individual freedoms, their own sphere, where they can live their lives out without anyone, you know, looking over their shoulder telling them what to do. 

Andrew Hammond: So think about the American Revolution. That was in some ways a response to what was perceived as overleaning government, government leaning too heavily on the citizens or reaching too much into their lives. So if you think about the technological changes that have taken place since then, you know, I think one of the main tensions is, can some of these ideas about individual liberties and individual rights, which reach back to Enlightenment ideas, can they respond quickly enough to technology which is just changing exponentially? 

Andrew Hammond: So if you think about that journey over time, we could think about - during the Civil War, we could think about Abraham Lincoln suspending habeas corpus. We can think about World War II or the period just afterwards, the House Un-American Activities Committee, McCarthyism. There's always a tension within American history, which is one of the things I'm most interested in, between individual freedom and between the freedom of groups and government. 

Andrew Hammond: If there is no government, if there's no American state, then in many ways, that's a guarantor of freedom. But how much power should that state have? Quite often, people that run the government think that they need more power. And quite often, people that are not in the government think that the government should have less power. So how does all of that cash out? How does all of that play out in the end? 

Dave Bittner: Well, in your writing, you've brought up a couple of events that really brought this to the fore, you know, as we were, I guess, hot and heavy into the digital age, and, you know, some of these information-gathering organizations, like the NSA, the CIA, were - they were capable of gathering massive amounts of information in a way that perhaps hadn't been available to them before. 

Dave Bittner: And we had this combination of things. We had, of course, the Snowden revelations. But then we also had the revelations that the United States was taking particular interest in, for example, Angela Merkel's telephone. So we had this notion that, you know, friends don't spy on friends, as they said. I mean, these two revelations led to a lot of conversation in this area. Can you kind of unpack the conversations that took place when these came to the fore? 

Andrew Hammond: Yeah, I think some of the conversations that took place with the - say, with the Snowden leaks - I mean, I think one of the main things there, just looking at it as the - if you think about Edward Snowden, think about the types of positions that he had. You know, he wasn't, like, a super high-level person, but he sat at a strait, a place where a lot of information flowed through. So I think that that's quite interesting. Or if you think about Bradley Manning, you know, we're not talking Major General Bradley Manning or Colonel Bradley Manning. We're talking about someone pretty far down the military hierarchy. 

Andrew Hammond: So I think that just given the volume of information that people that are trying to - you know, intelligence agencies, militaries that are trying to make sense of all of this information, there's huge flows of it coming in. And it's - I guess it's easier to hack into in many ways because the amount of people that are exposed to it can be much greater, and the volume of information that's passing through can be much greater. 

Andrew Hammond: I think some of the other bits - the bits about Snowden was that, I mean, sure, states have spied on their citizens historically. But now, I mean, just think about an iPhone. You're carrying around one of the greatest intelligence-gathering devices ever devised in human history. Just think about the apps that you have if you've got an iPhone - your banking, your location, the food you're buying. You go to Whole Foods and you scan because you're a Prime member. People can probably deduce from what you're buying at Whole Foods, they can probably have insights into your education level, you know, where you may live, your age, a whole variety of different things. 

Andrew Hammond: So the potential of the information and the volume of information is just - is huge. I don't think that Western societies have confronted some of the implications of the information age for the way that they do business, which is traditionally pretty slow, pretty labyrinthine and very bureaucratic. We're in a very different kind of an information space, and I don't think that government 2.0 has necessarily arrived as of yet. 

Andrew Hammond: And on the spying with friends, I mean, I think that's a really interesting one because, you know, this is something historically that has always happened. Britain spied on America in the runup to World War I. The Americans spied on the Japanese during peace negotiations in the 1920s. The Germans spied on their Italian allies during World War II. So this goes on all the time. You can almost say that this is the default position. 

Andrew Hammond: I think that very rarely you have this unique constellation of events that create a place where this doesn't happen. So one of the places that it's said not to happen is with the special relationship between Britain and America and within the Five Eyes community. So they are said not to spy on each other, but historically they sometimes have. 

Dave Bittner: Yeah. 

Andrew Hammond: So we've got a very unique kind of setup of affairs. But I don't want to make it sound like, oh, well, you know, I've researched this, and I'm a historian, and, I mean, of course, you know, I'm just pouring cold water on every concern that everyone has because, you know, I just know more. It's not about that. It's just if you just look at the evidence, states historically have spied on each other, apart from very rare occasions. 

Andrew Hammond: Think about the number of your listeners who have been in a relationship or a marriage - one of the main ways those break down, apart from communication, is through trust. And trust is a very difficult thing to maintain over a long period of time. So just like it is for couples, how many people have not just looked over the shoulder of what a partner was doing online? How many people have not wondered what their text messages are saying? So just as between couples that share a house, share a mortgage, have the same kids, just like trust can be difficult for them, it can also be very difficult for allies like Germany and America. 

Andrew Hammond: That's not to say that - I'm not justifying it. I'm not excusing it. I'm just explaining it like this... 

Dave Bittner: Yeah. 

Andrew Hammond: ...Does happen and can happen and will happen. 

Dave Bittner: It strikes me as being kind of like, you know, "Casablanca," where I'm shocked - shocked - to find out that there's espionage going on here, right? I mean (laughter)... 

Andrew Hammond: I think you should just assume that espionage is going on. 

Dave Bittner: (Laughter) Right. With the Biden administration, how do we see their approach being? I mean, I think it's fair to say things were fairly chaotic throughout the Trump administration. And so the Biden administration has the opportunity to sort of set their own level here. Are there any signals coming out of that administration so far? 

Andrew Hammond: I think the first thing to say would be just, you know, because I'm appearing on behalf of the Spy Museum, you know, we try to steer clear of politics. So I don't want to get too much into Trump and Biden and... 

Dave Bittner: Sure, sure. 

Andrew Hammond: ...And so forth. 

Andrew Hammond: I mean, I think there's a number of different ways of looking at the Biden administration. There's looking at Biden's backstory. So he was Obama's vice president for eight years. And the article that you mentioned at the beginning of our chat was about Obama and Snowden and the NSA. 

Andrew Hammond: So at the moment, because we're not that far into the administration, it's a little bit like reading the tea leaves. You know, so we can look at, he was vice president. Well, but he could be his own man. He could be very different as president. We could look at some of the things that Biden - some of his positions in the past, some of the things that he's espoused. We can look at the types of people that he's put into positions of authority in the various intelligence agencies. We can also look at the legacies that he's inherited at those various agencies, their processes and so forth. 

Andrew Hammond: I think at the minute, there's a few shoots coming through the air. There's a few seeds that I see growing. But I wouldn't want to say too much more until I see more of the leaves coming through. 

Dave Bittner: Yeah. That's fair enough. I mean, is this a fundamental tension upon which our democracy is built? Is it, you know, not so much a bug as a feature of the system? To have this notion of freedom and security in tension, is that necessarily a bad thing? 

Andrew Hammond: I don't think it is a bad thing, because - I mean, let's take the term freedom. Like, in American history, it has meant different things to different people at different points in time or even just more generally. So in the Civil War - James McPherson's great book "Battle Cry of Freedom." In that book, basically the Confederates and the Union Army's soldiers were fighting each other, killing each other, but they both said they were doing so on behalf of the same term. So we've clearly got an example where different ideas of freedom are at play. 

Andrew Hammond: And this has been a constant feature of American history. Like, let me give you this really interesting experiment that I'd done when I was doing some of my research. There's an archive at Brown University of extremist propaganda and literature. And I went there, and I looked through some of the, like, super right-wing archives and some of the material. And then I went and looked at the opposite extreme. I looked at a lot of the stuff on the, like, far left. What one term do you think both of those groups used continually in their newspapers and their magazines? Which one term? Freedom. 

Dave Bittner: Right. I was going to say, I'm going to go out on a limb here and guess it was freedom. 

Andrew Hammond: So almost by definition, you know, freedom is just part of the furniture, it's part of the vocabulary of the American experience. But it can get cashed out in very different ways. So, as always with something that's a feature of the system, it's not a final destination. It's more a journey that America is on about what the nature of freedom is. And it can definitely mean different things to different people at different points in time. 

Andrew Hammond: And security is one of the main complicating factors there - right? - because some people think that the nation is more secure when the state has more power. And, you know, I'm not taking a position (unintelligible), but some people think that government should just do the least possible amount of things. 

Andrew Hammond: Yeah, so I think that those two debates sometimes take - you know, are a little bit divorced from the realities of the international system. And I think that some - so, for example, some kind of, like, super minimalistic state that doesn't have a military, that doesn't have intelligence agencies - it's not very practical for the type of world that we're living in at the moment. But on the other hand, we don't want, you know - if anybody's ever read any of the dystopian novels - "1984," all of them, you know, "Fahrenheit 451" - you know, we certainly don't want that kind of state power either. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: Have you been to the Spy Museum, Dave? 

Dave Bittner: I have, yes. In fact, for those who aren't familiar, it's a museum in Washington, D.C. And they recently relocated, I'd say - what? - about a year before the pandemic shut us down. They greatly expanded. And it is - if you're coming to D.C., I'd put the Spy Museum on your must-visit list of museums, especially if you have kids. It's really a lot of fun. 

Ben Yelin: It is. It's a very interactive museum. I went before they moved to their new location. But that's the first thing I would say. Highly recommend the museum. 

Dave Bittner: Yeah. 

Ben Yelin: Really interesting conversation. I think one thing that stood out to me is how much more we understand about surveillance in the last 10 years, starting with the Snowden revelations and then, you know, with the additional revelations that we weren't just spying on our own citizens, but we were using surveillance techniques on our allies, you know, people like German Chancellor Angela Merkel. 

Dave Bittner: Right. 

Ben Yelin: And I think that just kind of increased the awareness of the power of surveillance, how it's so pervasive and we don't know what we don't know. The fact that all of this information was leaked leads us to believe that there's a lot of information out there on surveillance practices that hasn't been leaked that perhaps is more intrusive than anything we've heard about thus far. But I thought it was a really interesting conversation. 

Dave Bittner: Yeah. Well, our thanks to Andrew Hammond from the International Spy Museum for joining us. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.