Caveat 6.16.21
Ep 82 | 6.16.21

Private companies are not subject to the First Amendment.

Transcript

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben reviews the Supreme Court's decision on the Van Buren case. I've got the story of a U.S. state pushing Google to be classified as a public utility. And later in the show, my conversation with Robert Nelon. He's a partner at the national law firm Hall Estill. And we are discussing Facebook's decision to uphold their ban on former President Trump's account. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, we got some good stories to share this week. But before we do, just a quick note for our listeners, we are back in studio today. 

Ben Yelin: I'm looking at your face right now, Dave. 

Dave Bittner: (Laughter) I apologize. I know that can be traumatic. But, no, for the first time since COVID began, we are back here together in our studio, and I have to say, it's delightful. We are both fully vaccinated or, as they say, maxxinated (ph). 

Ben Yelin: We are maxxinated. 

Dave Bittner: Although you got your antibodies the hard way (laughter). 

Ben Yelin: Yes, I did. I'm a supervaxxed (ph) individual, having had COVID and my two shots. I'm practically invincible. 

Dave Bittner: There you go. 

Ben Yelin: So, yeah, no, but it feels wonderful to be back in person, to be with you. And I guess - nobody would trust us with a PSA, but if we were to put one out, we'd say be vaccinated. You can go do the things you love, like being in studio... 

Dave Bittner: There you go. Yeah. 

Ben Yelin: ...With your co-host. 

Dave Bittner: No, it's really great. All right. Well, let's jump into some stories. And Ben, you have what I think it's fair to say is the big one this week (laughter). 

Ben Yelin: I have the big one. If you're like me, and you are most certainly not, you were waiting by your computer many weekdays around 10 a.m. on SCOTUSblog, waiting for the Supreme Court to hand down a major decision on the Computer Fraud and Abuse Act. This was Van Buren v. United States. And they finally did. We got a decision, a 6-3 decision from the Supreme Court last week that sided with Mr. Van Buren and vacated his conviction. So for a little bit of background, Van Buren was a police officer in Georgia. He had access to a law enforcement database for license plates. He was permitted to access this database as a police officer, but somebody who turned out to be an undercover cop tried to pay him $5,000 to get into that database and search it for personal purposes. I think the undercover cop was trying to get him to stalk somebody, basically. And it was an undercover cop, so Mr. Van Buren was caught. Not only was he violating his law enforcement agency's own policies, but he was charged under the Computer Fraud and Abuse Act. The Computer Fraud and Abuse Act - it dates back to 1986, and it has a couple of key provisions. The first is basically your standard anti-hacking provision. It's illegal to hack into somebody else's computer or network if you're not authorized to be there. The second, more ambiguous provision is this exceeds authorized access provision. So you are not allowed to exceed your authorized access. And the basis for this case is whether Mr. Van Buren exceeded his authorized access. He was allowed to be in this database, but did he do something in this database that he was not allowed to do? What Van Buren was arguing was that you can't be exceeding authorized access unless you're in some sort of area - a file, a folder, a network - that you're not supposed to be in. So, for example, let's say he had access to the law enforcement network, but there was a shared drive folder that was locked, and, you know, he was able to hack into that folder. 

Dave Bittner: Right. 

Ben Yelin: Even though he had authorized access to be on the network, he wasn't authorized to be in that folder. So that's what Van Buren was arguing - that just simply using the database that you're allowed to be in for nefarious purposes - that's not a violation of the federal statute. 

Dave Bittner: I see. 

Ben Yelin: What the government was saying is it is. So if your purpose for using the database violates your organization's terms of service, that itself can be a violation of the Computer Fraud and Abuse Act because you are exceeding authorized access. So the Supreme Court reviewed this and, in a 6-3 decision, sided with Van Buren, saying that exceeds authorized access only applies to situations where somebody is in an area of a network or a computer where they are not allowed to be. It was an interesting ideological breakdown. So the decision came from the newest justice on the Supreme Court, Justice Amy Coney Barrett. She was joined by the two other Trump appointees, Justices Gorsuch and Kavanaugh, along with the court's three liberal members, Breyer, Sotomayor and Kagan. That's an interesting mix. 

Dave Bittner: It really is. 

Ben Yelin: And then the dissenting opinion was Justice Thomas, joined by Justice Alito and Chief Justice Roberts. So it's a very groundbreaking decision. A lot of it was this textualist argument - I'm not going to bore you with it because it's really, really bizarre legalese - which sort of depended on the definition of the word "so." 

Dave Bittner: So of course it does. 

Ben Yelin: Yeah. So the law says you can't exceed authorized access to obtain information that you're not entitled "so" to obtain - and I'm paraphrasing. And the question is whether "so" referred to the previous portion of the law that talked about illegally accessing a database or whether it was sort of its own clause. And they looked at the text, the legislative history, Black's Law Dictionary. That's how they technically made the decision. And I know that that's not satisfying because this is a very important policy question, and we are having this philosophical debate on the meaning of a two-letter word that we use in every other sentence. 

Dave Bittner: Well, can you read the tea leaves and unpack what you suspect they're actually getting at here? 

Ben Yelin: So what they're getting at - and they do mention this explicitly in the decision - is if we were to have the government's interpretation, where you could be charged based on the purpose of using a database that you are otherwise authorized to use, that would open the floodgates to a lot of criminal liability for things that all of us do every day. So, for example, if you put false information on your Tinder profile - you might have access to Tinder, but you'd be using it for a nefarious purpose. You'd be violating their terms of service. So if you said you were 6-foot-3 and you were really six feet, that could potentially be a violation of the Computer Fraud and Abuse Act if the government's interpretation was correct. If you used your employer's computer to go on Facebook, and that's not allowed by, you know, whatever contract you signed with your employer, your acceptable use policy, that could have led to criminal liability under the Computer Fraud and Abuse Act. And the court made a, I think, practical policy argument that that would unjustly expand the scope of the statute beyond the intentions of those who drafted it back in 1986, when it was supposed to create criminal liability basically for hacking or getting into somebody's computer network when you weren't supposed to be there. 

Dave Bittner: Worth noting, too, that back in 1986, we're pretty much talking about dial-up there, you know? Right? 

Ben Yelin: Yes. "WarGames." These were the dark ages of online communications. So, I mean, that's another major aspect of this case, is the law hasn't really been updated in the past 30 years or so - 35 years. What would have been really nice is for Congress to go in and clarify what exceeds authorized access means. They can still do that and could have done that. But in the absence of Congress having acted, it was up to the Supreme Court to make this interpretation. And so they made it. The dissent tries to make the other textual argument, basically saying, my reading of the two-letter word is right and yours is wrong, which is fine. And another interesting item they brought up - the majority talks about how if we have this definition of the Computer Fraud and Abuse Act that the government is asking for, all types of normal activity are going to be criminalized. What the dissent is saying is, look, that's true even with your definition of exceeds authorized access under the Computer Fraud and Abuse Act. Justice Thomas gave what I think is kind of a funny but interesting example. Let's say your employer did not allow you to access the games folder on your computer, and you went and played FreeCell and solitaire. That would be, under the majority's definition, exceeding authorized access because you were in an area of the computer you were not allowed to be in. I don't think we're going to see many criminal prosecutions for playing FreeCell, although I'd love to be a fly on the wall at the prosecutor's office for that one. Right? But it is sort of interesting to realize that no matter what, this law could potentially be overly broad. I think what the majority's decision does is narrow the realm of criminal liability as much as is practically possible with the language of the law that does exist. So long story short, you are not going to be criminally charged by the federal government for using your work computer to go on Facebook. And that's probably a relief to the millions of us who have committed that heinous act. 

Dave Bittner: So really - I mean, is the practical effect of this that it sort of narrows the possibilities, the instances, through which prosecutors might try to throw the Computer Fraud and Abuse Act at people for things that they've done? 

Ben Yelin: Yeah. And that actually has very serious implications. A couple of people brought up to me in the past week the situation of Aaron Swartz, who - I think it was maybe eight or nine years ago... 

Dave Bittner: Right. 

Ben Yelin: ...Was really - I think he was an MIT student doing just kind of journalistic research - was charged with a violation of the Computer Fraud and Abuse Act because the government was basically unhappy with what he had found. 

Dave Bittner: Right. 

Ben Yelin: And... 

Dave Bittner: They threw the book at him. 

Ben Yelin: They threw the book at him. They were harassing him with this statute. 

Dave Bittner: Yeah. 

Ben Yelin: And he ended up taking his own life. And, you know, I think this was an example of the overzealous prosecution under this statute that went unmentioned in the majority's opinion. But I think that was sort of the subtext here, is you don't want the federal government to throw the book at somebody because, you know, they're using a database that they have access to for some sort of prohibited purpose. 

Ben Yelin: And I'll also say, you know, one of the points the majority opinion made is it's going to be very hard for any prosecutor or judge to go in and deduce, in many circumstances, what was somebody's actual purpose in exceeding that authorized access. Do we want the courts involved with, well, was he searching this database for personal reasons? Is there a legitimate business purpose? That's very hard to adjudicate, and it's much better to have what Justice Barrett called a gate-up, gate-down approach, where either you're in an area you're not supposed to be or you're not. And that's the deciding factor in the decision. 
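
For readers who think in code, the gates-up-or-down idea maps neatly onto access control. Here's a minimal, purely illustrative Python sketch - every name in it is hypothetical, not taken from the opinion or any real system - contrasting the majority's location-based test with the purpose-based test the court rejected:

```python
# Purely illustrative sketch - hypothetical names, not the court's text or
# any real system - of the two competing readings of "exceeds authorized access."

ALLOWED_AREAS = {"officer": {"license_plate_db"}}  # which areas each user's credentials open

def gates_up_or_down(user: str, area: str) -> bool:
    """Majority's reading: the only question is whether the gate to this
    area is up for this user. Motive never enters into it."""
    return area in ALLOWED_AREAS.get(user, set())

def purpose_based(user: str, area: str, purpose: str) -> bool:
    """Government's (rejected) reading: even inside a permitted area,
    access for an improper purpose would 'exceed' authorization."""
    return gates_up_or_down(user, area) and purpose == "official_business"

# Van Buren's search: a permitted area, but a personal purpose.
print(gates_up_or_down("officer", "license_plate_db"))           # True -> no violation
print(purpose_based("officer", "license_plate_db", "personal"))  # False -> a violation under the rejected reading
```

Under the first check, Van Buren's search passes because the gate to the database was up for him; only the second, rejected check would have turned his bad motive into a federal crime.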

Ben Yelin: So certainly a fascinating decision. The breakdown to me was particularly interesting. I think most digital privacy advocates and people who, you know, certainly you and I trust on this issue, were very pleased with the decision and with Justice Barrett's reasoning. It was a very interesting read. 

Dave Bittner: Yeah. Yeah, absolutely. And we'll have a link to one of the many articles that have covered (laughter) this if you want to dig into it in more detail. But, yeah, certainly a big story this week. 

Dave Bittner: All right. Well, let's move on to my story. This is an interesting one. It came from Reuters, article written by Diane Bartz. And it's titled "Ohio Sues to Have Google Declared a Public Utility." The state of Ohio recently asked a court to declare Google a public utility. And basically, what it sounds like they're after is forbidding Google from giving preferential treatment to its own products. 

Dave Bittner: In the lawsuit, they say that Google is used for nearly 90% of internet searches and has 95% of the search share on mobile devices. And they're accusing Google of responding to certain search requests in a way that prioritizes Google's products, even if other responses would give better answers. Google, of course, is saying, no, we wouldn't (laughter) - we don't do this. They're not seeking any money from Google. But they're asking the court to require Google to refrain from prioritizing its own products. Can you give us a little of the backstory here and unpack this for us, Ben? 

Ben Yelin: Yeah. This is a really interesting request from the Ohio attorney general, who's the person instigating this request. So a lot of things come with regulating something as a utility. It basically allows the state to exert a lot of control over the utility. I mean, most utilities are private organizations. But, you know, the state has significant regulatory power over a gas company or a water utility because they recognize that that utility serves some sort of public purpose, and they want to regulate the market so that that utility doesn't completely screw over its consumers. 

Dave Bittner: Right. 

Ben Yelin: That seems to be the impetus behind what the Ohio attorney general is seeking to do here, saying Google now is ubiquitous. It's - really has a monopoly on the search engine market. Even though that, you know, might not be technically true, I think we can agree that... 

Dave Bittner: From a practical - I mean, you know, it's - Google has become the verb. Let me Google that. Right. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: Exactly. Exactly. And they are steering their users to their own products. We do regulate utilities so that utilities can't do that. If you want to get a smart thermostat, for example, in most states and jurisdictions, the utility can't tell you, you know, which of those products to buy. 

Dave Bittner: Right. 

Ben Yelin: They might, you know, encourage you to buy a certain product. But the state does have a role in regulating that. 

Dave Bittner: Yeah. Or, like, I can get electricity from a green supplier of electricity, you know, in my community. And they don't have to run new cables to my house, right? 

Ben Yelin: Yeah. Exactly. 

Dave Bittner: Like, they use the existing infrastructure, and basically, behind the scenes, some paperwork happens and people get credits and get charged and so on and so forth. 

Ben Yelin: But it's the same wires. 

Dave Bittner: Exactly. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah, yeah, yeah. 

Ben Yelin: To me, this seems like it's a bit of a stretch. And I'm kind of sympathetic to Google's argument here, which is, you know, we are designed to give people the most relevant and helpful results. 

Dave Bittner: Right. 

Ben Yelin: We don't need the government coming in and regulating what our users and customers see. If our users are unhappy, they either don't have to use Google, or they can use a different search engine, or they can change their preferences so that they're seeing different types of advertisements. We want to preserve that freedom. 

Ben Yelin: As far as I'm concerned, there's never been a successful lawsuit trying to get a private company like this in the technical realm to be regulated as a utility. We have seen a lot of litigation against these Big Tech companies, and this just might be sort of the next frontier. You know, we've sued these companies or the government has tried to instigate lawsuits against these companies for breaking our intellectual property laws, for monopolies, et cetera, et cetera. And this is kind of a different avenue of going after one of these companies. I don't think it's going to be successful. It seems to me like it's kind of a public relations - not stunt, but kind of a - an eager attorney general saying, I'm looking out for you. I'm willing to fight on the public's behalf by going against this tech giant behemoth. 

Dave Bittner: Right. 

Ben Yelin: And that might earn this attorney general some good political will. But I don't see the lawsuit succeeding. 

Dave Bittner: Now, you know, there's been calls from a variety of parties to break up some of the big companies like Google so that they can't exert this sort of dominant force over search. I mean, could this be a step along the way in that journey? Could this be, you know, another bit of evidence where, I don't know, from a federal level, they can say, look. Even the attorney general of Ohio said that something needs to be done. 

Ben Yelin: I mean, I don't think that that's - and it certainly has no precedential value in court. Even if they did get a favorable court decision here, it is a state court that would be overseeing this. So this would not be applicable to the federal government. 

Dave Bittner: I see. 

Ben Yelin: During a lawsuit where we're talking about breaking up the Big Tech companies, I don't think judges are going to be sympathetic to, look. Ohio decided to regulate Google as a utility. But I do think it's part of a broader theme of action being taken at all levels of government to cut against the supreme power of these tech companies. 

Ben Yelin: Different politicians and different justice departments, the U.S. Department of Justice and attorneys general across the United States, have their own sometimes differing reasons for going after these tech companies. I do think it's all part of the general theme that they are too big and too powerful and prioritize their own services at the expense of everyone else. And because Google has become a verb and because it dominates this market, that is a real negative for consumers. I think that broad effort is going to continue. And this is just kind of a small avenue in which one attorney general is trying to advance that cause. 

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes. We would love to hear from you. If you have a question for us, you can leave us a message at 410-618-3720 or you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Robert Nelon. He's a partner at the national law firm of Hall Estill. And we are discussing Facebook's decision to uphold their ban on former President Trump's account. Here's my conversation with Robert Nelon. 

Robert Nelon: As everyone knows, former President Trump was a frequent user of social media, especially Twitter and Facebook, to get his message out to his supporters. After the election, when President Biden won, Trump, of course, was unwilling to concede and began claiming that the election had been stolen and was pretty vocal about it and a frequent tweeter and poster on Facebook and probably other social media platforms as well. 

Robert Nelon: As we approached January 6, when the election was to be certified in Congress, he became even more frequent and was encouraging supporters to come to the Capitol and try to stop the steal, as he called it. And his social media posts began, even more so than normal, to raise alarm among the owners of the platforms as well as many in the public that he was advancing potentially harmful speech, perhaps even inducing violence or inciting violence. 

Robert Nelon: And as it turned out, on January 6, a large group of Trump supporters stormed the Capitol. And during the time that the violence was going on, Trump was praising his supporters, calling them patriots and telling them he loved them on social media. And Facebook took that as a violation of its community standards and temporarily banned him from Facebook on January 6 and extended the ban on January 7, and then indefinitely suspended former President Trump from that platform. 

Dave Bittner: Can we just sort of set out some definitions here because I think there's a lot of confusion, particularly among, you know, folks who aren't steeped in the policy side of things? You hear a lot of people throwing around words like censorship. Where do we stand with that? A private company like Facebook - where does censorship apply to them? 

Robert Nelon: A private company like Facebook or Twitter is not subject to regulation under the First Amendment, which applies only to government actors. So Facebook is free to allow whatever speech it wants or prohibit whatever speech it wants. Facebook does have its terms of service that define what can and cannot be published on that platform. And the problem that Facebook ran into in this case was not a First Amendment violation because it's - like I said - not regulated by the First Amendment. But Facebook's problem was that its terms of service, at least in the view of the oversight board, were not clear enough, specific enough, and there was no provision in their terms of service for the indefinite suspension that Facebook imposed on the former president. But there's no censorship as such. There's no First Amendment violation, because Facebook simply is not subject to regulation under that amendment. 

Dave Bittner: I don't know. Is it reasonable to say that someone like the former president of the United States enjoys a special status? Being the former president of the United States, should he be allowed more leeway than a regular citizen? Or I suppose in this nation where we say we believe in rule of law, should it be applied equally? 

Robert Nelon: I think it should be applied equally. Obviously, there are influencers who, by reason of their public position, either elected position or just notoriety, have a louder megaphone than the average citizen does. And so we have to be careful about how we permit speech from the very powerful influencers to make sure that they are not taking undue advantage of their position of influence. But in theory, the rules of the road, the terms of service that a social media platform would apply, ought to be equally applied to everyone. 

Dave Bittner: What do you make of Facebook's oversight board here? First of all, just the fact that Facebook has set up this oversight board at all and then the process by which they made their decision - do you have thoughts on that? 

Robert Nelon: It's an interesting concept to have an oversight board. Obviously, with a social media platform like Facebook that has billions of users, and, I don't know, untold billions of posts every day, there isn't any effective way, despite hiring thousands of monitors to look at posts - there's no way they can monitor everything. And so things are likely to escape attention, or there can be big issues in terms of proper application of their community standards that having an independent board take a look at it is an interesting concept. I'm not sure that there are any other social media platforms that have such an oversight board, but it's a unique feature of Facebook, and it gives them perhaps some credibility when they make decisions about who can speak and who cannot speak on their platform. 

Dave Bittner: Thinking of Facebook's self-interest, I mean, is it possible that - you know, we talk a lot on this show about Section 230 and how it sort of - it shields these companies from many of the things people say. Do you suppose that they were concerned that the former president saying the things he was saying - you know, what many people were saying, inciting these folks who came down on the Capitol - could that have extended to Facebook themselves? Could they have found themselves the target of any sort of liability for his actions? 

Robert Nelon: I think under Section 230, which immunizes the social media platform for any publication where they did not create the content themselves - Section 230 would immunize Facebook for any post by anybody, whether it incited violence or not. But Facebook could well have been concerned that the appearance that they were permitting harmful speech of some kind could be detrimental to their business model. If you really wanted to get cynical, you could say Mark Zuckerberg was trying to pass the buck to somebody else about making decisions on what could or could not be posted on Facebook. I'm sure there are many reasons why they opted to create an oversight board, but probably the least of them was concern that there would be legal liability, because of Section 230. 

Dave Bittner: Where do you come down on Section 230? Do you think it is ready for some sort of reform, or do you think it's good the way it stands? 

Robert Nelon: I think it's good the way it stands. Section 230 came along because of the problem in the early days of the internet that there really was not, as people quickly learned, an effective way to moderate harmful speech on the internet. And as a broad First Amendment principle - while the First Amendment does not protect harmful speech such as falsity or incitement to imminent violence - as a general proposition, you can't prohibit speech in advance. That's considered a prior restraint. 

Robert Nelon: The remedy under the First Amendment, if you're looking at government actors, is to provide a remedy for harmful speech after the fact - defamation suit or whatever it may be. But you can't ban the speech in advance. Section 230 came along because it became impractical for social media platforms. As the number of users rapidly increased, they couldn't monitor speech. And there were a series of cases in which the federal courts split over the liability of the platform for, say, defamatory speech. Could you hold the platform responsible? 

Robert Nelon: And the opinion in Congress - and pretty uniform opinion in Congress at the time - was, the internet would not work if the social media platform could be held liable for content posted by somebody else over whom they had no control, other than after the fact to take something down. And even that process, if you ultimately, like we do today, have millions, if not billions, of posts every day, there's no practical way to monitor that and to moderate it and to remedy that, except in the most extreme circumstances. And if the platforms could be liable to everybody who's offended by the post of somebody other than the platform owner itself, we'd be swamped with litigation against the media platforms, and the internet would cease to work as we know it. 

Dave Bittner: You know, as we've witnessed the rise of social media and this shift to our digital online economy, how's the First Amendment doing? Is it standing the test of time? Where do we stand with it? 

Robert Nelon: Well, the First Amendment is obviously subject to very new and different challenges than what we might have had back in the days when newspapers were the only voice that you had. And even with the advent of radio and television, there were technological limits on how much speech there was. And it was easier to apply First Amendment principles to the speech. 

Robert Nelon: With social media, where anybody and everybody can say whatever they want to, generally, it poses real challenges for the First Amendment. And, again, the First Amendment would apply to the speaker and not to the platform. But because of the volume of speech, and the fact that on social media there is a greater difficulty in moderating that speech and regulating misinformation or disinformation, outright false speech, harmful speech that we might generally characterize as hate speech, finding definitions and criteria to apply the First Amendment becomes extremely difficult. 

Dave Bittner: Are the courts meeting that challenge? I mean, are we seeing - as the cases make their way through, what's your perspective on how things have been interpreted? 

Robert Nelon: Oh, I think generally the courts are doing as well as we could expect the courts to do. The law always moves behind technology. It takes a while for the legal system to catch up with the technology and understand how the law should apply to a particular scenario or set of events. But generally speaking, the courts are doing a good job of dealing with the issues that are coming up. 

Robert Nelon: But the courts are relying in great part on the platform owners and the users to moderate their own speech and make sure that they're abiding by the law. If you're a speaker, make sure that your speech remains protected by the First Amendment. If you're the platform, do the best that you can to make sure people are complying with your standards, your terms of service, to avoid significant legal issues. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: It's so interesting. I have mixed feelings on it. I understand why, you know, some people are upset that a politician can be largely permanently censored on the biggest social media platform, and thus is unable to get his message out to millions of followers. You know, I can understand why it appears to people that Facebook is making a statement of its own by prohibiting him and elongating this suspension until 2023. 

Ben Yelin: I also completely understand why Facebook did it. Their terms of service state very clearly that our platform will not be used to foment violence. And what happened on January 6 - and, you know, I know you guys got into this in the interview - is, Facebook and all other social media services were used to advertise this event. You know, even though the posts didn't explicitly call for violence, I think the tech platforms are reasonably saying that they could be interpreted as an impetus to cause violence. 

Dave Bittner: Right. 

Ben Yelin: And I think more importantly, this is not something that just happened in the past and is no longer an issue. This wasn't a one-time violation where, you know, former President Trump is going to come out and say, look. I made a mistake. I got a little hotheaded. 

Dave Bittner: Right. Right. 

Ben Yelin: I'm only going to use Facebook, you know, to give some commentary on national news stories. 

Dave Bittner: He had a pattern of being provocative, to say the least. 

Ben Yelin: Yeah. And, you know, for those of us who are following the failed blog that he had, From the Desk of Donald Trump, he has not moderated his postings. He hasn't changed the tone of his posts. He's still insisting, falsely, that the 2020 election was stolen. 

Dave Bittner: Right. 

Ben Yelin: From various reporting, he seems to think that he's going to be reinstated in office sometime this year because of... 

Dave Bittner: Because of reasons. 

Ben Yelin: Because of reasons. 

Dave Bittner: (Laughter). 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: And he's still saying the same types of things that seemed to foment violence on January 6. 

Dave Bittner: Right. 

Ben Yelin: So because the message is still there, I think the oversight board from Facebook understood that context and made that decision accordingly. He's in his mid-70s. I don't think he's going to change. This is who he is. 

Dave Bittner: Sure. 

Ben Yelin: Do I think that in 2023 or 2024, when they reconsider this decision, he's going to be a changed person who found religion or something? I highly doubt it. 

Dave Bittner: Yeah. 

Ben Yelin: So my guess is that this suspension, which is now a semi-permanent suspension, is going to be a permanent suspension. 

Dave Bittner: Yeah. That's the part that left me scratching my head a little bit, is why Facebook put a time limit on this. You know, why not make it permanent? Are they hedging their bets, or are they, you know, ducking some arrows that they suspect will be shot at them? What do you make of that? 

Ben Yelin: I think it's an effort to have a fair due process. In their decision extending his suspension, they wrote out what their criteria were. I think they want to be very clear to all their other users that certain actions come with certain consequences. And we have to have some sort of uniform rules. To say that you're going to suspend somebody forever when that's not really something that was contemplated beforehand... 

Dave Bittner: Yeah. 

Ben Yelin: So, you know, there wasn't a policy in December 2020 saying politicians who say X are going to be permanently removed from this platform. 

Dave Bittner: Right. 

Ben Yelin: Because that wasn't there, I think they are trying to be fair and saying, we will analyze this in its current context. We will try and have some sort of uniform application of standards. But if the president continues the rhetoric that he has thus far yet to abandon, we're not going to go back on our decision just because of the passage of time. 

Dave Bittner: Yeah. I suspect, too - I mean, you know, it is not outside of the realm of possibility that he could run again, and he could win. 

Ben Yelin: Absolutely. 

Dave Bittner: So what does - you know, if they were to put a permanent ban on him, and he was president again, that would put Facebook in a bit of a sticky pickle where, you know, now he is once again the president of the United States. And that's a different value proposition for de-platforming him, right? 

Ben Yelin: It is. Although I'll note they de-platformed him, as did Twitter. Now, granted, it was the lame-duck period. 

Dave Bittner: Right. 

Ben Yelin: But he was still president. 

Dave Bittner: Yeah. 

Ben Yelin: So I think, you know... 

Dave Bittner: That's a good point. 

Ben Yelin: ...That's the precedent now that says the president isn't completely immune from our terms of service. 

Dave Bittner: Right. 

Ben Yelin: I think what Twitter has said is, we're going to give him some extra slack as president because his tweets have some sort of value to the public, knowing what this very powerful person is thinking, what policies he's considering. 

Dave Bittner: Right. 

Ben Yelin: But that type of deference isn't endless. And... 

Dave Bittner: (Laughter) Right, right, right, right. 

Ben Yelin: ...He obviously pushed all of these platforms to their limits. 

Dave Bittner: Yeah. 

Ben Yelin: It's not an easy decision for the platforms. There are a lot of people who are very angry. There are a lot of people in Congress who want to haul Mark Zuckerberg in front of a committee and yell at him for de-platforming the putative leader of the conservative movement in this country. 

Dave Bittner: Right. 

Ben Yelin: But, you know, I think they are trying to come up with something that balances, you know, their desire to foster a platform of free speech and public debate versus, we don't want to be responsible for people barnstorming the Capitol (laughter) and injuring police officers. 

Dave Bittner: Yeah. 

Ben Yelin: So you do have to strike a balance there. 

Dave Bittner: Yeah. Absolutely. All right. Well, once again, our thanks to Robert Nelon for joining us. We do appreciate him taking the time for us. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.