
Safeguarding against disinformation.
Barbara McQuade: Our country cherishes First Amendment freedoms, as do I. And it's something that is highly regarded on the left and the right. Without First Amendment free speech, we lose our ability to speak out against our government. But I think that there are people who use the C-word, censorship, in an effort to silence all critics of anything that anybody says, and it's carte blanche to be able to say anything they want to say.
Dave Bittner: Hello everyone and welcome to "Caveat", N2K CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On today's show Ben discusses a fascinating case dealing with middleware and adversarial interoperability. I swear it actually is fascinating. He's got the story of a leak of Australians' biometric data. And later in the show, author Barbara McQuade discusses her new book, "Attack From Within: How Disinformation is Sabotaging America". While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben. Let's jump right into it here. You have an interesting story to share this week, don't you?
Ben Yelin: Yes so this is not literally called Zuckerman v. Zuckerberg, but that's basically what's happening in this case.
Dave Bittner: Zuck v. Zuck?
Ben Yelin: Zuck v. Zuck. Yeah. And it really is a fascinating case from a legal perspective and also just what the broader implications would be if the court ruled in a certain way.
Dave Bittner: Okay.
Ben Yelin: So to give you a little background, Zuckerman is a professor at the University of Massachusetts and is interested -- he's an advocate, he's interested in an open internet. And he's putting together this tool. It's called Unfollow Everything 2.0. So, a different individual by the name of Barclay had previously created a tool called Unfollow Everything. Basically, it's a tool, interoperable with Facebook, that allows you to click a button and it will unfollow everything, for kind of two purposes. It'll allow you to see an uncluttered newsfeed, unburdened by all of the groups and individuals that you follow and are friends with. And it's also a social science experiment. So, some of the data would go back to the University of Massachusetts and allow them to do some analysis on how unfollowing everything actually changes your browsing habits, changes your overall sensibility in life, all those sorts of fun things.
Dave Bittner: Your mood.
Ben Yelin: Totally.
Dave Bittner: Right, okay.
Ben Yelin: So Meta sued Unfollow Everything.
Dave Bittner: The original one.
Ben Yelin: The original one, started by Louis Barclay --
Dave Bittner: Okay.
Ben Yelin: -- a few years ago. So this is 2.0, and instead of releasing this to the public, Mr. Zuckerman is looking for a declaratory judgment from a federal court in California to say that this type of action, this type of service is legal. Because of the previous lawsuit, he was concerned that if he just released the software publicly, that he would be subject to a suit, and that ends up being a major part of this case.
Dave Bittner: Okay.
Ben Yelin: So, this concerns so-called middleware -- basically, a type of application that is compatible with, but adversarial to, the goals of a big tech company. There are a lot of kind of benign examples of middleware, so things like ad and pop-up blockers that make our browsing experience better. That wasn't started by Google when they created Chrome; somebody else started it, and it makes the Chrome browsing experience better.
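To make the middleware idea concrete, here is a minimal sketch of how a filtering tool like an ad blocker can sit between you and a site: a browser-extension content script that hides page elements matching user-chosen selectors. The selectors below are hypothetical placeholders, not any real site's markup, and this is not the actual Unfollow Everything code.

```typescript
// A browser-extension content script that filters a feed by hiding
// elements that match user-chosen selectors. The selectors are
// hypothetical placeholders, not any real site's markup.
const blockedSelectors: string[] = [
  "[data-ad]",        // hypothetical ad container
  ".sponsored-post",  // hypothetical sponsored content
];

function hideBlockedElements(root: ParentNode): void {
  for (const selector of blockedSelectors) {
    root.querySelectorAll<HTMLElement>(selector).forEach((el) => {
      // Hide rather than delete, so the page's own scripts don't break.
      el.style.display = "none";
    });
  }
}

// Feeds load content dynamically, so re-apply the filter whenever
// new nodes are added to the page.
const observer = new MutationObserver(() => hideBlockedElements(document));
observer.observe(document.documentElement, { childList: true, subtree: true });
hideBlockedElements(document);
```

The same mechanism that hides an ad can hide an entire news feed, which is why a tool like this can be interoperable with a platform while remaining adversarial to its advertising business.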
Dave Bittner: Yeah.
Ben Yelin: The reason this is adversarial in terms of interoperability is that it would hurt Facebook's bottom line. If you unfollow everything, you don't get their curated news feed, which brings you some very important targeted advertising that makes them billions of dollars. So, Mr. Zuckerman decided to sue Meta and Zuckerberg, asking for this declaratory judgment, saying that this type of activity is legal. And his legal theory of the case turns on a little-known provision of Section 230 of the Communications Decency Act.
Dave Bittner: Huh.
Ben Yelin: So we've talked about the Communications Decency Act and pretty much everyone else has, mostly about one subsection, and that's subsection C1. This is the so-called 26 words that changed the internet. "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." So that's --
Dave Bittner: Do you have that memorized?
Ben Yelin: I do not have that memorized.
Dave Bittner: Okay.
Ben Yelin: I probably should.
Dave Bittner: I was going to be very impressed there for a second, Ben.
Ben Yelin: It's kind of like --
Dave Bittner: And-or disturbed.
Ben Yelin: The nerdy kids, instead of memorizing the Pledge of Allegiance, will just memorize section 230c1.
Dave Bittner: Yeah.
Ben Yelin: So there's actually more to section 230. The first thing I'll discuss is section c2a. That states that no provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to certain material. And the material that's covered in this particular section is material that is obscene, lewd, las -- I don't even know what that word is.
Dave Bittner: Lascivious.
Ben Yelin: Lascivious.
Dave Bittner: Yeah.
Ben Yelin: Filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.
Dave Bittner: Okay.
Ben Yelin: So hold that provision in the back of your head for just a second.
Dave Bittner: And that's basically a -- you have permission to filter?
Ben Yelin: Yeah, you have permission to filter content.
Dave Bittner: Yeah.
Ben Yelin: To prevent your users from seeing what you determine to be inappropriate information.
Dave Bittner: Okay.
Ben Yelin: And you can restrict pretty much anything. The material doesn't have to be constitutionally unprotected. You can restrict what would otherwise be constitutionally protected First Amendment speech, if that makes sense. Then there's this other very little-known section, and I think most legal scholars have claimed that there just has not been any case law on this, and this is section c2b, and it says, "No provider or user of an interactive computer service shall be held liable on account of any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph 1." Now, I want you to keep that paragraph 1 in your head, too, because we're going to come back to that.
Dave Bittner: Okay.
Ben Yelin: In a way that's kind of hilarious.
Dave Bittner: All right.
Ben Yelin: So what Zuckerman is arguing is that this section, c2b, should immunize him against any potential lawsuits from Meta. And his argument is as follows. He would qualify as a provider of an interactive computer service in the way that the Communications Decency Act defines the term interactive computer service, which is an access software provider that provides or enables computer access by multiple users to a computer server, sure. And it would also enable Facebook users who download Unfollow Everything 2.0 to restrict access to material that they and Zuckerman find objectionable. The purpose of this tool is to allow users who find the entire news feed objectionable, or who find specific sequencing of posts objectionable within their news feed to turn off that feed. So it is a very novel theory. It's a novel theory in that it has not been tested before. It's unclear whether section c2b would actually apply to somebody like Zuckerman. He's claiming that Unfollow Everything 2.0 qualifies as an interactive computer service. There's no real indication whether the court would agree with him that this is an interactive computer service. And it's also -- you'd have to go into a little bit of legislative history here. I think what Congress was trying to prevent is some third party getting sued because they helped another company take down inappropriate content, right?
Dave Bittner: I see.
Ben Yelin: So if the third-party provider, let's say that was a big tech company with a social media platform, was desiring to take down certain content, any company that gave technical assistance in doing so could not be liable under the statute. You are immunized from liability. And so that's essentially his argument.
Dave Bittner: Huh. Is this a letter of the law versus spirit of the law situation?
Ben Yelin: Yeah, the funny thing is we don't really know what the spirit of the law is because it's never really been tested. That's what I think the spirit of the law is, is to immunize the people who help social media companies take down smut, if that makes sense.
Dave Bittner: Okay.
Ben Yelin: But that's just never been tested in court. I mean, I haven't studied the legislative history. I don't know why this particular provision is in here. But yeah, I think you could properly read that as the spirit of this law. As for legal scholars, the article we're going to post in the show notes comes from Techdirt, by Mike Masnick, and he reached out to a bunch of his favorite legal scholar friends, and they were all kind of like, Eh, interesting, but, like [laughter] I'm not sure that this theory is going to take hold in federal court. It's untested, it's new. He said reactions were kind of all over the map. What would happen if Mr. Zuckerman's argument is accepted is that websites would no longer be able to take legal action against these middleware providers who are providing technical means for people to filter or screen content on their feed. Now, it's important, as he says in this article, to know that this does not bar the companies themselves from putting in technical measures to block such tools. It just immunizes third parties who are assisting social media companies in taking down content. One question I have is the adversarial nature of the relationship.
Dave Bittner: Right.
Ben Yelin: I think what the law was intended to protect is somebody cooperating with Meta to help them remove that offensive content, not this sort of adversarial thing where what Zuckerman is attempting to do here goes against the wishes and the monetary interests of Meta. But the upshot of all of this is that this could end up being a Trojan horse embedded in Section 230 that would protect adversarial middleware, or these adversarial interoperable applications that Facebook does not want to grant access to and wants to sue in a court of law to prevent them from gaining unauthorized access to their site. There have been a lot of suits along these lines, but they've all been suits brought under the Computer Fraud and Abuse Act about unauthorized access. Nobody, to my knowledge or the knowledge of anybody who has been researching this, remembers a case where the main claim is based on this obscure part of section 230. So I'll take a breath there.
Dave Bittner: [Laughter] What about the fact that Zuckerman is coming after this proactively? Is that unusual and how will the court respond to that?
Ben Yelin: It's not necessarily unusual, but I think it's very problematic for Mr. Zuckerman.
Dave Bittner: Okay.
Ben Yelin: So you need standing in a court of law. You have to have a particular concrete interest in the case. That means you have to have suffered some type of injury in fact. Mr. Zuckerman has not operationalized this service. It's not in existence yet. This is a preemptive lawsuit to get a declaratory judgment saying that he is allowed to do this. Now, the reason I think he might be able to claim standing is that he could say that an injury would be certainly impending if he were to bring this service online. And the proof he has is that the previous guy who tried to do this got sued as soon as he did it.
Dave Bittner: Right.
Ben Yelin: So, unlike a lot of other cases where the injury is speculative, I think you could make a very reasonable claim here that an injury would be certainly impending, which is the constitutional standard. Now, what Mr. Masnick said, Mike Masnick who wrote this article, is that he remembers seeing in intellectual property cases that courts are very reluctant to issue these types of declaratory judgments. They want to see what the injury is first. So that's one kind of obstacle in this case, this obstacle of standing. So even before they get to the merits, this would potentially be a reason why this case is thrown out. There are a couple other interesting wrinkles here. Meta and companies like it have taken legal action against data scraping for the past decade or so.
Dave Bittner: Right.
Ben Yelin: We've seen it in very high profile cases. With the AI revolution and many of these big tech companies setting up their own AI platforms that are going to be scraping data from other parts of the internet, they might not be as gung-ho against data scraping.
Dave Bittner: Ah, right.
Ben Yelin: They might be trying to back off the lawsuits, claiming that data scraping is, you know, some sort of crime of moral turpitude because it could come back to bite them.
Dave Bittner: Right, all of a sudden they need the data to train their AI models, so it turns out data scraping isn't that bad.
Ben Yelin: Exactly. But if you're defending this lawsuit while you're also involved in another lawsuit saying data scraping is good when it applies to our artificial intelligence service, then that's going to be a very bad look for the company. Now, they don't necessarily care. Companies make opposite legal arguments all the time. That's the nature of being a lawyer, is, you know, sometimes you have to turn the facts and the law to your favor depending on the circumstances, but I think it would certainly hurt their reputation. There's one other hilarious wrinkle in here.
Dave Bittner: Okay.
Ben Yelin: I'm going to bring you back to the relevant language of section c2b of this statute. It says that, "No provider or user of an interactive computer service shall be held liable on account of any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in Paragraph 1." That material, the very material they are almost certainly referencing, is not in Paragraph 1. It is in Paragraph A. In all previous litigation, what seems to be a typo that was just missed in the drafting of the legislation has just kind of been glossed over, but that could be a live issue here. If Meta wants to take this statute literally, Paragraph 1 is kind of the 26-words-that-changed-the-internet thing, the segment of the statute that we already read. It doesn't refer to the list of objectionable content that is in Paragraph A -- the obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. But really, this provision, c2b, wouldn't make sense if it was referring to Paragraph 1. It only makes sense if it's referring to Paragraph A. So that's kind of another major wrinkle in this lawsuit, is that all of this becomes scrambled and jumbled if we take that typo literally. I don't know what the language is. We take it literally or seriously when it --
Dave Bittner: Yeah.
Ben Yelin: -- probably should not be taken literally or seriously.
Dave Bittner: How much leeway does a judge have to say, Look, obviously this is nonsensical, this is a typo, we're going to interpret it this way, or is a judge obligated to kick it back to the lawmakers to fix the typo?
Ben Yelin: They are not obligated. The case I think back to was a 6-3 Supreme Court decision about the Affordable Care Act. Technically, the law, and this was a drafting error, said that subsidies were only available to exchanges established by the states. A lot of states chose not to establish healthcare exchanges, so the federal government stepped in and established their own healthcare exchanges. So if you read that provision literally in the ACA, subsidies for insurance would not have been available to people who bought insurance through one of those federally managed markets. And that would have really destroyed the entire law, because most states, especially red states, didn't establish their own exchanges. They used the federal system. So, like, a significant portion of people wouldn't have been eligible for those much-needed subsidies. And basically the Supreme Court in that case said, We're not going to take that provision literally. Like, the statute here isn't a death sentence. You have to look more broadly at the purpose of the statute. But, yeah, that was a heavily criticized decision. The two justices who joined the liberal justices in that decision no longer -- or I guess Chief Justice Roberts was in the majority. But the other conservative, Anthony Kennedy, has since left the court, and you now have a much more conservative court. So they might tend to lean more to that literalist interpretation, which would throw this case into kind of a mess of uncertainty.
Dave Bittner: Wow. I will note, just as a little aside that is related to this, that on the occasions that I do use Facebook, I use a plugin called Facebook Purity, which does some filtering of my feed. And what's interesting, you talked about kind of the adversarial nature between Facebook and a plugin like this. Every now and then, Facebook throws up a little pop-up that says, Hey, we notice you're using Facebook Purity. Here are all the things you're missing out on. You don't want to miss out on these things, right? You're missing out on the things, all the things.
Ben Yelin: All this click bait, yeah.
Dave Bittner: You're missing the things, you know.
Ben Yelin: It says like, the Muppets want to meet Dave Bittner.
Dave Bittner: Right, right.
Ben Yelin: And then you click on it and it's like, yeah, single women in your area want to meet up.
Dave Bittner: Right, right. So it's interesting to me that here's another example of some sort of filtering mechanism existing, Facebook knowing it exists and considering itself to be in an adversarial relationship with it, but for whatever reason, they have not gone after this particular supplier.
Ben Yelin: Maybe it's not worth it to them.
Dave Bittner: Yeah.
Ben Yelin: I mean, litigation is very expensive. It also potentially could be bad publicity. And then there's that other factor where if we're going to start being concerned about data scraping, you know, let's make sure that doesn't come back to bite us.
Dave Bittner: Right.
Ben Yelin: So those are the potential factors why they wouldn't have initiated a lawsuit. It just also -- it could be as simple as there are a billion different plugins that have adversarial relationships with our site. We only have the capability and the desire to sue the big ones, the ones that would really cut against our profit margins that we make from having these curated news feeds.
Dave Bittner: Yeah. If you were a betting man, how do you think this would play out?
Ben Yelin: It's really tough because there are so many obstacles to even getting this case heard on the merits. My guess is either this is thrown out due to lack of standing or potentially failure to state a claim just because this theory is untested and I think a court could hold that this is an improper reading given the legislative history and just given the exact words in the statute. I could see a federal court saying this is an improper reading of that language and therefore this isn't stating a valid claim. Now, Zuckerman has also put forth a bunch of other claims saying, All right, if you don't buy my section 230 argument, here are a bunch of other arguments. Maybe they take up one of those other arguments.
Dave Bittner: Right.
Ben Yelin: But if I were to lean one way or the other, you know, it depends on which judge they've drawn, and I don't know who that is, but this is the Northern District of California; they deal with all the big tech cases. Are you going to be the one who wants to foster a novel theory here that could change the way middleware and adversarial interoperability work for the entire capital-I Internet? I don't know if any judges are that ambitious, or whether they want, on shaky grounds, to get on the bad side of these big tech companies, who would probably freak out if Zuckerman won this case. Because they have done a pretty good job in the past of shutting down middleware products that interfere with what they see as the main purpose of their service.
Dave Bittner: Yeah. As a law professor, though, you love this, right?
Ben Yelin: Oh, yeah. I mean, this is a popcorn GIF moment, 100%.
Dave Bittner: [Laughter] Right.
Ben Yelin: The lawsuit was just filed, so we're in the very early stages of this.
Dave Bittner: Yeah.
Ben Yelin: Somebody I went to high school with is lead attorney for Zuckerman.
Dave Bittner: Oh.
Ben Yelin: So I have a very quasi-personal connection to this case.
Dave Bittner: Oh, that's interesting. Small world.
Ben Yelin: Yeah, yeah, we'll see how it goes. It's something I'm definitely going to follow. It's only for the nerdiest of observers, because it's very, very in the weeds.
Dave Bittner: Right.
Ben Yelin: I apologize to those of you who are not entertained by adversarial interoperability in middleware, but for those of you that are, this is the Super Bowl of cases.
Dave Bittner: There's something for everyone in here, including a typo.
Ben Yelin: Absolutely.
Dave Bittner: All right, well we will have a link to that story in the show notes. My story this week comes from Wired. This is about a breach of a facial recognition system. This is something we've touched on here before, Ben. This was a facial recognition system used in Australia, used in bars and clubs, and it's an Australian company called Outabox, or as they would say, Adabox.
Ben Yelin: Yeah, I was about to see if you were going to do the accent.
Dave Bittner: Boy, I do it poorly. It was put in place during COVID-19. This was to check people's temperature, to make sure they weren't running a fever before they could go into the bar. And then also somehow it was tracking people to try to determine if they perhaps had a gambling problem.
Ben Yelin: If you've been to one too many gambling joints --
Dave Bittner: Right.
Ben Yelin: -- it'll flag and say, We've seen this face too much, can't let them in.
Dave Bittner: Right, so you would go to this kiosk, it would scan your face, recognize who you were, I guess shoot a little infrared beam at your forehead to take your temperature, and then let you into the bar or club or not. And that's what it did. And evidently, you know, it functioned well throughout COVID. So now there's a website that has been spun up that's called "Have I Been Outaboxed", which will reveal if your data was compromised. Evidently the collection of data was improperly secured, it was hanging out there in the open, and someone vacuumed it up and is, you know, making it available -- or perhaps that's not the best way to say it -- is making it searchable, so you can see if your data was part of the data that was hanging out there for anyone to gather up. So far, there have been no signs that anyone has been using this data. So this is one of those cases where a researcher discovers that data has been improperly secured and it's just been kind of sitting out there. Nobody was looking for it. Nobody was actively going after it. So there's no way to know if anybody actually has it or not.
Ben Yelin: Yeah, there's kind of this backdrop to this case of whether this was done by a malign foreign actor. And there's really no evidence of that.
Dave Bittner: Right.
Ben Yelin: Like, we don't think this was China or Russia.
Dave Bittner: Right. It does have some, you know, interesting information. Obviously it has your facial biometrics. It has driver's licenses, club memberships, contact details, and gambling activities. And an interesting little wrinkle in this is that after this breach, the Australian police did arrest a gentleman in Sydney who was trying to blackmail people related to the incident.
Ben Yelin: Going on "Have I Been Outaboxed" and saying, Hey, just saw your name pop up here, looks like you have a major gambling problem, would sure be a shame if I contacted your employer and --
Dave Bittner: Right.
Ben Yelin: -- told them about your time at the slot machines.
Dave Bittner: Right, exactly.
Ben Yelin: Yeah, I mean, this is an interesting story because I think it shows the concerns of unregulated facial recognition software and what can happen when we're not careful, especially some of these COVID-era programs that have been set up that, at least in this case, they weren't properly secured.
Dave Bittner: Right.
Ben Yelin: If you want to think about US examples, I don't remember seeing anything like this set up outside private establishments like bars and restaurants here. I just don't think -- we didn't take COVID as seriously here as they did in Australia and other places.
Dave Bittner: Right.
Ben Yelin: So we just kind of told people to wear masks and stay six feet apart and we'll keep our bar at 25% capacity or whatever.
Dave Bittner: Right.
Ben Yelin: But one place we do do this now is airports. They screen our faces, match them up with our driver's licenses to ensure that we are who we say we are. And hopefully they do a better job securing that data at the TSA than the company that made this software did. Because as you mentioned, it's not just releasing the biometric data. It's a lot of personal information about these individuals. And now we know that it's been used for blackmail purposes. So those are the negative effects of having this unregulated facial recognition system. I think we have to start thinking about these things when we decide to use facial recognition and biometrics. This is a very present, serious risk. It should factor into any cost-benefit analysis of employing a facial recognition program.
Dave Bittner: Yeah, I think it's also noteworthy, I mean, something I just infer from this that hasn't been outright stated, but my inference is that this data was not encrypted in any way. In other words, it was collected, it was stored, and what was hanging out there for someone to stumble upon was unencrypted data, which is how the website where you can search it is able to tell you whether your data was included. So that basic security step of encrypting data at rest wasn't even taken in this case with information that is personal --
Ben Yelin: Right.
Dave Bittner: -- very important.
Ben Yelin: Right,
Dave Bittner: Yeah.
Ben Yelin: Yeah, encrypt your data, people.
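For listeners who want to see what that advice looks like in practice, here is a minimal sketch of encrypting records before they are stored, using AES-256-GCM from Node's built-in crypto module. The record fields and in-memory key handling are illustrative assumptions, not a description of Outabox's actual system; in production the key would live in a KMS or secrets manager, separate from the data.

```typescript
// Encrypting a record before it is stored at rest, using AES-256-GCM
// from Node's built-in crypto module. Field names and key handling are
// illustrative assumptions only.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Assumption: in production this key comes from a KMS or secrets
// manager and is never stored next to the data it protects.
const key = randomBytes(32); // 256-bit key

interface EncryptedRecord {
  iv: string;   // per-record nonce
  tag: string;  // GCM authentication tag
  data: string; // ciphertext
}

function encryptRecord(plaintext: string): EncryptedRecord {
  const iv = randomBytes(12); // 96-bit nonce, the recommended size for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

function decryptRecord(record: EncryptedRecord): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(record.iv, "base64"));
  decipher.setAuthTag(Buffer.from(record.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(record.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

// What lands on disk is ciphertext; without the key, a leaked dump is unreadable.
const stored = encryptRecord(JSON.stringify({ name: "patron", licence: "ABC123" }));
console.log(decryptRecord(stored)); // round-trips back to the plaintext record
```

With a scheme like this, whatever sits at rest is ciphertext, so a leaked database dump is useless without the separately stored key.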
Dave Bittner: Right. It'll be interesting to see if Australia makes -- if there's any reaction to this, if there becomes some sort of regulatory oversight requiring that data at rest be encrypted in a situation like this.
Ben Yelin: Yeah, or if it causes local law enforcement agencies in Australia or around the world to think twice about employing facial recognition tools, because now you have kind of the horror story example and you know it could happen to anybody. When you're talking about everybody who was scanned by a bar and restaurant during the COVID era in Australia, that's a lot of people. A lot of people like to go to bars and gamble. So that's a significant amount of data that's been compromised.
Dave Bittner: Yeah. All right, well, we will have a link to that story in the show notes. Again, that is from Wired. [ Music ] Ben, you recently had the pleasure of speaking with Barbara McQuade. She is the author of the book, "Attack From Within: How Disinformation is Sabotaging America". Barbara McQuade has quite a resume here. She's a professor at Michigan Law but also served as the US Attorney for the Eastern District of Michigan. She was appointed by President Obama, and she was the first woman to serve in that position. Here's Ben with Barbara McQuade.
Barbara McQuade: Although I'm a professor at Michigan Law School now, I started my career as a prosecutor and spent most of it working in national security. And during the time I was working as a national security prosecutor, I saw the threat to national security evolve from Al Qaeda, to ISIS, to cyber intrusions, to Russia. And now that I teach law, I teach a course in national security, and a growing area has been disinformation. It started with Russian disinformation. I have my students read Robert Mueller's report on Russia's interference in the 2016 election. And it's really evolved, in my view, from an external threat to an internal threat. I think now we see a really significant threat to our national security coming from American sources, hence the name of the book, "Attack From Within". And I think that there are people in this country who don't care about truth so much as advancing their own political agenda, personal agenda, or profit agenda.
Ben Yelin: So let's address the elephant in the room here, which is that what you identify as misinformation or disinformation, others will identify as, maybe as Kellyanne Conway once said, alternative facts. And efforts such as yours to call out disinformation are met with the likes of people like Elon Musk who say that your real goal is censorship. And so I just want off the bat to give you a chance to respond to people who I think would be making that argument.
Barbara McQuade: Yeah, this is -- I'm so glad to get this question because I think that our country cherishes First Amendment freedoms, as do I. It's something that is highly regarded on the left and the right. Without First Amendment free speech, we lose our ability to speak out against our government. And so I cherish our First Amendment rights of free speech. But I think that there are people who use the C-word, censorship, in an effort to silence all critics of anything that anybody says, and it's carte blanche to be able to say anything they want to say. Of course, the First Amendment has some limitations on it, like all fundamental rights. And so the Supreme Court has held that rights of free speech, like other fundamental rights, may be limited when the limitation is narrowly tailored to achieve a compelling governmental interest. And so it's for that reason that you can't yell fire in a crowded theater, because you might be creating chaos or havoc or danger to other people. It is a crime to threaten to kill somebody and to communicate a threat in interstate commerce. It is a crime to engage in a conspiracy with somebody, even though it is communicated by speech. It is a crime to commit fraud, even though your fraud is committed by speech. So those who claim that the First Amendment is absolute either don't know better or they do know better and they're lying. So this idea of, you know, who's the truth police anyway, what is truth, there's no such thing as truth -- you're playing a very dangerous game. Because although we can all have our own opinions and we can have our views on, you know, eternal truths like the meaning of life and other things, there are such things as facts. I spent my career as a prosecutor in court, where there are facts and you have to use evidence to support your factual assertions. This idea that truth is unknowable and there are no facts is something that is used in Putin's Russia -- the idea that there are no facts, truth is for suckers, truth is non-existent, everything is political spin, everything is PR, and if you seek truth, then you're naive. And instead, what you should focus on is, you know, maximizing your profit and disengaging from politics because everybody's corrupt anyway. I think there is such a thing as truth. I think there are such things as facts. And I think that we should not be duped into thinking that any effort to focus on truth is itself an act of censorship.
Ben Yelin: Why do you think, and I appreciate everything you said there, why do you think Americans in particular are susceptible to that type of argument, that sort of Elon Musk, anybody who tells you about disinformation favors censorship. Why do you think we as a country are particularly vulnerable to that? What are those -- what are the factors there?
Barbara McQuade: Yeah, I mean, you know, it sounds good. I think if you have not spent a lot of time reading First Amendment law, you know, the First Amendment itself says, "Congress shall make no law". So it sounds pretty absolute, but it's not. I also think that we find ourselves in a moment in history when there is a lot of pushback, I think largely because of efforts to polarize American society through Russian disinformation, through domestic disinformation -- to polarize people, to point at others as scapegoats for, you know, the problems in our country, whether those people are immigrants or people of color or the LGBTQ community. And so with DEI programs and concerns about political correctness -- cancel culture is, you know, one of the new phrases -- I think it is easy to lump all those things together, and to get people riled up about their perceived rivals and enemies in this world, feeling like you can't say anything about them or, you know, you will be shunned or you will be a criminal. And so lumping any effort to address disinformation in with all of that has, I think, some popular appeal when you try to divide the world into only two teams. You know, there's the red team and the blue team, and everything my opponent says is bad and everything I say is good. You know, that's an old debater's trick called the either-or fallacy. I'm going to convince you that there's only two sides in this world, and the other side is so bad and so untenable because they're, you know, communist, radical, whatever I want to call them, that you should be on my side. And anybody who speaks out against one of those other sides is going to be persecuted. So I think it is consistent with that line of thinking that any effort to limit speech is all about censorship and preventing you from speaking out against that which is politically correct. I think that's where this idea comes from. But I think we need to have the nuanced appreciation that it's a different thing when it comes to, you know, as you said, what Kellyanne Conway called alternative facts -- I mean, absolute falsehoods about the number of people at Donald Trump's inauguration. It's a verifiable fact that you can determine by looking at photographic evidence. Whether an election was stolen -- we know from lawsuits, we know from audits that Joe Biden won the 2020 election. And yet there are many people who continue to persist in false claims that the election was stolen. The earth is not flat. You know, who's to say that the earth is flat? Well, there are photos from outer space, and I believe in their credibility that the earth is round. There is scientific evidence to suggest that. And so I think the idea that, you know, there's no such thing as truth is very convenient to people who want to manipulate others.
Ben Yelin: What strikes me is that efforts at disinformation have been extremely successful. I mean, you look at just polling, how many people have these false beliefs. So I'm wondering if you could talk a little bit about effective disinformation tactics. What you sort of think of the person you picture as the greatest purveyor of disinformation. I have mine. I suspect many of our listeners and our guests might share that person, but what are those techniques that you recognize that you talk about in your book?
Barbara McQuade: Yeah, so a lot of these tactics are ones that have been around for decades. And although the means for communicating them have changed with social media, I think many of the same techniques are still in use. You know, Hitler, Stalin, Mussolini used these techniques. One of them is this idea of declinism -- that things are awful in society, we are a country in decline, our country was once great and now things are terrible. Hitler used this after World War I to talk about the great state of decline of Germany and the great shame of the German people. And then, you know, we hear from Donald Trump today, for example, talk about "American carnage" and "a nation in ruins", you know, ignoring the fact that crime is down, unemployment is down, wages are up. There are a lot of, you know, really good things going on in the country, but instead, the narrative is again and again and again how awful things are. Because if things are awful, then, one, we can scapegoat and blame other people for that and cast someone as the enemy that we need to work against and hate. And so in Hitler's Germany, of course, that's the Jewish people. In United States society today, you know, maybe it's immigrants. "Immigrants are animals, they are vermin, they are poisoning the blood of America." We see the same thing about people of color and Muslims, or the LGBTQ community, you know, who want to groom your children for pedophilia. If we can have these Other people that we push aside and say those are the bad people, we can blame them for this decline in society. And then, of course, desperate times call for desperate measures. And so Hitler was able to persuade people that it was necessary to suspend the constitution to push back against the dire circumstances in which Germany found itself. And, you know, even Donald Trump himself said regarding the 2020 election that, you know, when such an awful thing as a stolen election happens, we have to take desperate measures, including terminating the Constitution. So these are really the same techniques we've seen throughout history being used again.
Ben Yelin: I guess the difference, you know, we've seen these techniques throughout history, the difference now is of course the internet, which brings us to this podcast, which is about law and policy of cybersecurity, privacy, surveillance, that sort of thing. So can you talk about the role the internet has played in fostering disinformation, how it's kind of changed the game, even though the techniques are the same?
Barbara McQuade: Yes, I think the greatest example of this comes from Robert Mueller's report, which I assign to my students, about the Internet Research Agency's work to spread disinformation using social media -- Facebook, Twitter (now X), YouTube. And it was achieved by creating false personas online, creating accounts with names that portrayed themselves as members of various affinity groups in the United States. And the goal was to sow division among these groups, to destabilize American society, to get us fighting with each other and ignoring Russia on the world stage so they could do what they wanted to do, you know, invade Ukraine, annex Crimea, destabilize NATO. And so, you know, they would portray themselves as, for example -- there was an account called Blacktivist. Many months before the election, Blacktivist cultivated a lot of African American followers who thought that Blacktivist was one of them. There was another group called United Muslims of America. There was a group called Tennessee GOP. There was one called Heart of Texas. And they all portrayed themselves as being grassroots Americans with a particular affinity or identity. And then they would do things to stoke divisions, to say outrageous things, to get people looking at that group and say, Look at how outrageous those people are. They say all of these horrible things. Well, it's because they weren't those people at all. They were some, you know, Russian operative in a hoodie somewhere in a boiler room in Moscow. Or, just before the 2016 election, Blacktivist -- again, with followers thinking that this was a Black political activist -- wrote things like, You know, Hillary Clinton has never done anything for our community. We should send her a message that we should not be taken for granted by not showing up at the polls for her. We should stay home on election day and send an important message. So we'll never know how many people heeded that call, but if even only a few did in a swing state that was decided by a narrow margin, you could see that it could have tremendous influence on the outcome of the election. And so in this way, we now have technology that can spread messages instantly to millions of people and can do so anonymously.
Ben Yelin: I'm always amazed at the precision of the Internet Research Agency. I mean, I lived in Baltimore City at the time. We had gone through the Freddie Gray unrest, and the memes that you saw that were so specific to the circumstances, that through the Mueller report we realized were the result of Russian disinformation -- it was just really striking to me. I want to get to sort of the solutions you propose in your book, but I also wanted to get your thoughts on section 230 of the Communications Decency Act and the debate around whether it is the obligation of social media companies to police content to protect against the spread of disinformation. And then these laws that we've seen at the state level in Texas and Florida, where these companies are now prohibited, at least within those states depending on what happens at the Supreme Court, from discriminating against content based on viewpoint. And that could really allow the facilitation of false information, so I wanted your thoughts on that.
Barbara McQuade: Really interesting issues. So section 230 of the Communications Decency Act, passed in 1996, comes at a time when the internet -- at least social media -- is really in its infancy. And, you know, if you read the words, it reads almost like a children's story. The internet is a wonderful thing where people can connect and work on, you know, creative projects together. It's lovely, and it is used in that way, right? I mean, we're all communicating via the internet right now, and it's incredible that we can do these things. But there is a sinister side that has grown up around social media that we now see that we couldn't see in 1996. And so maybe the law needs to change. I don't know that we need to completely end all immunity for social media platforms. I think that could be very difficult because these platforms have so many, you know, millions and millions of users. It might be difficult to monitor all of that. But I think there are some things where we could have limitations on the immunity for things that the platforms themselves are responsible for generating. So, for example, when they take paid ad content, you know, maybe they should be responsible for things that appear in paid ad content, even if not for everything that's on the internet. They're being paid for it, so maybe that's something where they should be responsible and could be legally liable for something that is harmful or otherwise violates the law, is defamatory, or includes something that's threatening or harassing or something like that. I think we could require things like disclosure of who is paying for ads on the internet, and if they fail to do that on social media, they could be liable for civil fines in that way. So I think there are some bases for removing immunity from social media platforms. We know that aside from content, algorithms are responsible for some of the harm online. You know, that Facebook whistleblower, Frances Haugen, a former data scientist there, revealed that Facebook was engineering algorithms that were designed to push content that would outrage us. So, you know, we now have a hate button and we can hate content, and that's what the algorithms would rate the most highly and push to the top of our feeds, because they knew that that led to more engagement. We'd spend more time on the platform when we're excited or angry, and that means more eyeballs on the ads, and they make more money on the ads. And so, you know, why couldn't we, even if social media platforms get immunity for the content that they post there, require some limitations on the algorithms? And if they violate those things, make them legally responsible for that. Maybe we could also prohibit them, as you said, from sending very targeted messages to us based on where we live or what we do or who we are. Maybe we have rules surrounding how they scrape our private data and then sell that to others, whether that's for advertising or whether that is for political consultants, so that we can be micro-targeted with ads. You know, I've been bombarded lately with ads for Detroit Tigers baseball caps. How do they know? I am a huge baseball fan. I've been going to games. I like to wear baseball caps, and I've been getting so many targeted ads about that. And so, you know, why can't we control our own data? There are some bills pending in Congress along those lines, but I think that's something, again, that could be a carve-out from the immunity that is given to social media platforms under Section 230.
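To make the algorithmic point concrete, here is a minimal sketch of the kind of engagement-weighted feed ranking being described. The post fields and weights are hypothetical illustrations for this sketch, not Facebook's actual, proprietary ranking system.

```typescript
// A toy engagement-based feed ranker: posts are scored by weighted
// reaction counts and sorted highest-first. Fields and weights are
// hypothetical, chosen only to illustrate the incentive being described.
interface Post {
  id: string;
  likes: number;
  comments: number;
  shares: number;
  angryReactions: number;
}

// Hypothetical weights: reactions that signal strong emotion are
// assumed to predict more engagement, so they get larger multipliers.
const WEIGHTS = { likes: 1, comments: 4, shares: 8, angryReactions: 10 };

function engagementScore(p: Post): number {
  return (
    p.likes * WEIGHTS.likes +
    p.comments * WEIGHTS.comments +
    p.shares * WEIGHTS.shares +
    p.angryReactions * WEIGHTS.angryReactions
  );
}

// Rank the feed: highest predicted engagement first.
function rankFeed(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => engagementScore(b) - engagementScore(a));
}
```

If the heaviest weights sit on outrage-adjacent reactions, a ranker like this optimizes for engagement rather than accuracy, which is the dynamic McQuade is describing.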
Ben Yelin: And then beyond those potential technological or legal solutions, in terms of cultural changes, I mean, what would you like to see happen to help break the spell of disinformation?
Barbara McQuade: Yeah, I think one thing we need to do is to get our arms around media literacy. So the things we've been talking about so far are things on the supply side of how we might regulate social media to prevent so much disinformation from reaching us. But I think on the demand side, there are some things we can do to make us more resilient when we receive disinformation. And one is this idea of media literacy. In Finland, they have introduced media literacy into their schools because in light of their proximity to Russia, they have been bombarded with disinformation from Russia for decades, you know, trying to reduce the power of NATO and reclaim some of their former Soviet satellite countries. And so media literacy in the schools, you know, teaching students things like there may be fake news out there. It may be that someone creates a false newspaper that looks like a real one, but it's called something like, you know, in Detroit, we have the Detroit News and the Detroit Free Press. They create something called the Detroit Tribune, right? And it doesn't even exist, but it looks pretty good. And then it pushes stuff out there on social media and says, "The Detroit Tribune is reporting X, Y, and Z." Well, maybe you do a little research to find out if the Detroit Tribune is a real thing before you take what they report at face value. Learning that the headlines don't always reflect accurately what is in the story of a newspaper. Looking for a second source before you assume some fact to be true. I once fell for some misinformation when I read a story that Patrick Mahomes had said he would not play another down for the Kansas City Chiefs until they changed their name to something that was not offensive to Native Americans. I thought that was true and I tweeted that to everybody I knew. You know, it turns out that was false. And --
Ben Yelin: As a 49ers fan, I wish that would have been true.
Barbara McQuade: [Laughter] Yeah.
Ben Yelin: But, nevertheless.
Barbara McQuade: No such luck. No such luck for you. But it was false. And, you know, shame on me, because when I looked back at the story after I realized it was false, it did come from an ESPN account, which looked pretty legit. But when I read a little closer, it was from an ESPN "Sprots Center" account, not SportsCenter. So it was, you know, a fake designed to look pretty good. So, you know, teach people these things so that they can spot them. And, you know, we can do that in our schools, but we could also do it for adults through civic organizations and faith communities. There are a lot of great lifelong learning programs out there, virtual learning programs, podcasts like this one, that could help people to build that media literacy. But I also think, culturally, there are things that we can do as a society to recognize the importance of truth and how it is under assault. And that is, I think, that all of us need to show some restraint and mutual respect to people who are our political and ideological rivals. And so I think that there is sometimes this temptation when there are people out there stoking division on social media, you know, trolls who will argue with you. And it turns out they're just bots who are picking a fight with you and, you know, berating you because of your gender, your race, your whatever. And they're just there to stoke division. And so, you know, learning to not take the bait when we see these things, to not argue with bots, because that is what ramps up the heat in all of these exchanges. And so I think those arguments have spilled over into real life. And sometimes we view our neighbors with skepticism or frustration because they have views that are different from ours. And I think that we need to peel back why it is we have certain strong opinions and ask ourselves, What are the facts that we are basing those conclusions on? And if it's simply because that's what the Fox News host told me, you know, maybe they ought to peel back and look for the evidence and the facts upon which they are basing those conclusions. And all of us should, right? Sometimes we jump to conclusions because we read something on social media and we believe it to be true, when in fact, if we look for the evidence there, it's really missing. And so I think that building that resilience requires our own media literacy, but also our own sort of tolerance and respect for those who might hold different political views.
Ben Yelin: I think that's a very positive message to end on, and I agree that it's incumbent upon all of us. We're all susceptible to it. I know both you and I are users of X, formerly Twitter, and spending an hour on there, I think we break our own rules frequently. So I appreciate that as a message. Is there anything else that we didn't discuss that you would like to get out there in this interview?
Barbara McQuade: Just one last comment, and that is this, that, you know, our democracy really relies on an informed electorate, and we can't be informed if we don't have accurate information. I think we should see it as our own patriotic duty to seek out accurate information so that when we cast a vote, we are casting the one that is in the best interest of the civic virtue of our country and not just our own personal self-interest.
Ben Yelin: Well, thank you, Professor Barbara McQuade. The book is "Attack From Within: How Disinformation is Sabotaging America". And thank you so much for joining us today.
Barbara McQuade: Thank you, Ben. Great to be here. [ Music ]
Dave Bittner: Ben, that was a really interesting interview. What a great guest.
Ben Yelin: Yeah, she was a big get for us. I was very pleased to talk with her. I mean, the book is incredible. I think she's visionary in her approach to this, and being able to speak with her about potential concrete solutions towards the end of the interview was good, because I think it's easy to identify the problem, but it's harder to figure out what we do about it.
Dave Bittner: Yeah.
Ben Yelin: And so I highly encourage you to read the book. I think it's going to change the way people think about misinformation and disinformation and how we can confront it going forward.
Dave Bittner: Our executive producer, Jennifer Eiben, said she wants Barbara to run for president.
Ben Yelin: Yeah, no comment. [Laughter] I'll just say I think she would be a great candidate.
Dave Bittner: And by the way, I have to say, Ben, as an aside, I have to praise you. Because Barbara McQuade brought up the shouting fire in a crowded movie theater argument, and you kept your mouth shut.
Ben Yelin: I was so close.
Dave Bittner: You were an excellent host, you were polite, you did not interrupt her, you did not correct her, you let -- in the spirit of a good conversation, you just let it go, and I realize how difficult that must have been for you, Ben.
Ben Yelin: It was a show of real restraint on my part. And she's also, like, a tenured law professor who worked as a US attorney in the Obama administration.
Dave Bittner: Right.
Ben Yelin: If she wants to claim that fire in a crowded theater is actually a thing when I think it isn't, it is her right to claim that. She's earned it.
Dave Bittner: Is it fair to say that she massively outranks you as an attorney?
Ben Yelin: Yes, it is absolutely fair to say that.
Dave Bittner: Okay, so there you go, so it makes sense.
Ben Yelin: Yeah.
Dave Bittner: You would let her have the final word on that.
Ben Yelin: I have no problem admitting that.
Dave Bittner: All right, well, all kidding aside, our sincere thanks to Barbara McQuade for joining us. It was quite a privilege for us to have her on the show. Again, her book is titled "Attack From Within: How Disinformation is Sabotaging America". It is well worth your time. [ Music ] And that's "Caveat", brought to you by N2K CyberWire. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly changing world of cybersecurity. If you like the show, please share a rating and review in your podcast app. Please also fill out the survey in the show notes or send an email to caveat at N2K.com. We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector. From the Fortune 500 to many of the world's pre-eminent intelligence and law enforcement agencies, N2K makes it easy to optimize your biggest investment - your people. We make you smarter about your teams while making your teams smarter. Learn how at N2K.com. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tre Hester. Our executive editor is Brandon Karpf. Peter Kilpe is our publisher. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening. [ Music ]