Caveat 3.24.22
Ep 118 | 3.24.22

Why the EARN IT Act matters to you.


Aron Solomon: So essentially, the EARN IT Act was something that was first proposed in 2020. And they realized that it was a tire fire, and we thought we'd never see it again. But it came back in 2022 (laughter).

Dave Bittner: Hello, everyone. And welcome to "Caveat," the CyberWire's privacy, surveillance law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today Ben shares a recent federal district court case on geofencing. I've got the story of the FCC proposing new cyber breach disclosure rules. And later in the show, I speak with Aron Solomon from Esquire Digital on why the EARN IT Act matters. 

Dave Bittner: While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we've got some good stories to share this week. Why don't you start things off for us? What do you got? 

Ben Yelin: It's my man, the professor at the University of California, Berkeley, Orin Kerr. 

Dave Bittner: (Laughter) The object of your intellectual desires. 

Ben Yelin: The object of my deepest, darkest intellectual desires. 

Dave Bittner: (Laughter) His dreamy opinions on legal matters have you dotting your I's with hearts. 

Ben Yelin: He really is. He really is a great writer... 

Dave Bittner: (Laughter) Yeah. Yeah. 

Ben Yelin: ...And shapes a lot of my thinking on the Fourth Amendment, even though we don't always agree ideologically. 

Dave Bittner: OK. 

Ben Yelin: And he still has a standing invitation to come on to the show if he ever listens. 

Dave Bittner: Yeah. 

Ben Yelin: So I will drop literally everything. 

Dave Bittner: (Laughter). 

Ben Yelin: So this is about a federal case that was decided recently in a district court in Virginia. It's the case of United States v. Chatrie, and it concerns geofence warrants. So Mr. Chatrie is accused of robbing a bank. He went into a bank in Richmond, Va., with a note that said, give me all your money. The bank complied, gave him $195,000. Law enforcement had basically no leads on who committed this crime. So they asked Google for information on which cellphones were in the area at that particular time, and that's known as a geofence warrant. The legal rules on obtaining that information aren't exactly settled at this point, but Google, as a matter of its own policy, requires a warrant before it will turn the data over. 

Dave Bittner: OK. 

Ben Yelin: So a magistrate judge signed that warrant, allowed Google to share that geofencing information with law enforcement. There were 19 people in the area at the time that the crime took place. And doing a little more analyzing and intelligence work, they narrowed it down to one or two people, eventually down to one suspect, Mr. Chatrie, who was arrested and charged with armed robbery. 

Dave Bittner: OK. 

Ben Yelin: So he seeks to suppress this evidence, saying that this geofence warrant is a violation of his Fourth Amendment rights. And the court agrees with him and says that this warrant likely does violate his Fourth Amendment rights, but in this particular case, they're allowing the criminal charges to go forward under what's called the good-faith exception. Good-faith exception basically means this is an unsettled area of the law, and what law enforcement did and were trying to do is within the reasonable bounds of what they thought the law was at the time they sought this warrant. 

Dave Bittner: Oh. 

Ben Yelin: So they were doing in good faith what they thought they were allowed to do under the Fourth Amendment. 

Dave Bittner: So there's enough precedent, enough has happened that law enforcement could look at this and say, this is within legal boundaries. 

Ben Yelin: Exactly, at least for now. 

Dave Bittner: Right. 

Ben Yelin: Now, this case might get appealed. It's... 

Dave Bittner: This could be the case that changes that. 

Ben Yelin: It could be the case that changes that, or there could be other cases that change that, where law enforcement's behavior is a little more egregious - particularly if it involves another company that, unlike Google, doesn't require a warrant. If law enforcement gets that information warrantlessly, that might make for more of a constitutional challenge. 

Dave Bittner: I see. 

Ben Yelin: So the good-faith exception is kind of disappointing here because we don't settle some really confusing areas of the law. The first is whether a geofence warrant is a search in the first place. So for there to be a Fourth Amendment search, you have to have a violation of one's reasonable expectation of privacy. Generally, when you share anything with a third party, like Google, you lose your expectation of privacy in that information. That's the third-party doctrine. But in the Carpenter case, we saw that if you don't really have a choice - you know, if it's something as banal as turning on your phone - then you haven't really voluntarily shared that information with a third party. That's a little bit muddled in this case. Professor Kerr goes into kind of excruciating detail on whether this disclosure was actually voluntary. There are a lot of points at which you can opt out of location sharing on Google. You can also do things like put your phone in airplane mode or turn off your phone. 

Dave Bittner: Or leave it at home when going to rob a bank. 

Ben Yelin: Exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: You know, so in Carpenter, the justification was, well, you need to - you need your phone to engage in everyday life. 

Dave Bittner: Yeah. 

Ben Yelin: Everybody has a cell phone. You need it to engage in business. 

Dave Bittner: Right. 

Ben Yelin: I think it's less clear whether you need Google Maps to live your everyday life. You could use a different application. You could print out from MapQuest on your home PC how to get to the bank that you want to rob. 

Dave Bittner: (Laughter) Right. Right. 

Ben Yelin: You don't necessarily need to use those Google services. So in Professor Kerr's view, this may not be a constitutional search, which would mean that a warrant would not be required in those circumstances. Then there's the question of, if it is a search, what kind of probable cause do you need? And that's where things get particularly complicated here. The judge in this case seems to indicate that the government needs probable cause as to each individual who was in the area that is being searched. And Professor Kerr doesn't agree with that. He thinks that's too stringent of a standard. That's not how we engage in most Fourth Amendment searches. For the most part, you have authorization to search a particular place for a particular thing. And no matter what you find at that place while you are searching for that thing, that is a legal search. And the fruits of that search can be used for a future prosecution. 

Dave Bittner: Can I dumb it down a little and ask you a question? 

Ben Yelin: Absolutely. 

Dave Bittner: So let's say that, you know, this - a bank robbery occurs, and police are canvassing the area, and they're talking to a guy who happened to be standing on the street corner. And he says, oh, I saw, you know, Bob was standing out in front of the bank earlier today. And so now the police say, well, we better go talk to Bob. That's probable cause to go talk to Bob, right? 

Ben Yelin: Absolutely. Yeah. 

Dave Bittner: So isn't this similar to that, in that basically Google is the one who's saying, we saw Bob in this area. Is this - shouldn't that be probable cause to go knock on Bob's door and say, hey, Bob, what were you up to today? 

Ben Yelin: Yeah. But they only know that Google has that information because they already issued the warrant. 

Dave Bittner: Yeah. 

Ben Yelin: Whereas, in your example, no warrant has been issued. You're just talking to a person out on the street. 

Dave Bittner: OK. 

Ben Yelin: So that would be - in your hypothetical, that would be probable cause to go get the warrants. But... 

Dave Bittner: Oh, the warrant to talk to Bob. 

Ben Yelin: Exactly. 

Dave Bittner: OK. 

Ben Yelin: But you - or to search Bob's house or Bob's car or... 

Dave Bittner: OK. 

Ben Yelin: But in this circumstance, you know, what's unclear is whether they had probable cause to seek that Google geofence warrant in the first place... 

Dave Bittner: Oh, I see. 

Ben Yelin: ...Certainly, as it relates to all 19 individuals who were within this particular area, the cops don't have probable cause. It's not really that likely that any one of these 19 people was involved in the robbery. But as Professor Kerr says, that shouldn't be the standard. If you have enough information that you're going to find some evidence of a crime in this particular area, that should be sufficient in most circumstances to surveil that area to get information. So that's the question on the warrants. 

Ben Yelin: There's this separate question on particularity. You have to particularly describe the place to be searched or the things to be seized. And this gets particularly confusing in this case because the government has to establish probable cause that each phone that appears in the geofence is involved in a crime. And that would end up sort of defining the particularity requirement. In the judge's view, the government needs to articulate a warrant so that no innocent person has their phone included in the geofence. That's just not practically possible. 

Dave Bittner: Right. 

Ben Yelin: Just by the nature of geofencing, you're not going to be able to do that. Otherwise, geofencing wouldn't be a useful investigative technique. It's just not something that's entirely plausible, but it's a capability that, from a policy perspective, we probably want law enforcement to have for particularly heinous crimes. And so it would be impractical to demand that level of particularity as it relates to every geofence case. So I think this leaves us at a rather unsatisfactory conclusion. This particular case is allowed to continue because law enforcement was acting in good faith. But we don't have an answer to these really difficult Fourth Amendment questions as to whether this counts as a search in the first place. And if it does count as a search, what type of probable cause and particularity is required? 

Dave Bittner: Well, can I swing back to Bob again? - 'cause... 

Ben Yelin: You and Bob - yeah. 

Dave Bittner: (Laughter). 

Ben Yelin: You must have a vendetta against Bob. 

Dave Bittner: Bob's a shady character in my mind. But let's say Bob is just standing outside of the bank, you know, (laughter) smoking a cigarette, you know, taking a break or whatever on his way home - had nothing to do with the robbery. Bob is completely innocent. But, again, the guy standing on the street says, hey, I saw Bob standing in front of the bank around about the time when the robbery occurred. Do we think it's reasonable for the police to go want to have a conversation with Bob? How do they know if he's a person of interest before they talk to him? His merely being there makes him interesting. And how does the geofence not make every person who was in that area at that time also interesting? See where I'm going? 

Ben Yelin: I do see where you're going. I think you could make a plausible claim. And that's kind of a cousin of what Professor Kerr is arguing, that you can't really have that level of particularity with a geofence... 

Dave Bittner: OK. 

Ben Yelin: ...Because any - it's very unlikely that only one person is going to be within a particular location at a given time. 

Dave Bittner: Right. 

Ben Yelin: We're rarely that lucky. So it's going to have to be an instance where you have 19 people, like you did here, and you have to investigate every single one of them to rule suspects out. I think, in a normal investigation, it's reasonable to follow these types of leads - and we do that all the time in other types of law enforcement investigations - to figure out who was at a given location at a particular time. For example, you don't need a warrant in most circumstances to get the sign-up sheet at an event if that might give an indication as to who was at an event where a crime took place. So I think you are on to something here. And I think what Professor Kerr is trying to argue is that requiring that level of particularity just wouldn't be practical. And it would defeat the point of having this law enforcement tool in the first place, because we're rarely going to get circumstances where you're asking Google for information as to who was in a particular area and you're only expecting one person to be in that area. This is not that point in the investigation. This is the point where you're searching for any kind of lead whatsoever. And there might as well not be geofencing as a technique at all if it's going to be limited by strong particularity requirements. 

Dave Bittner: All right. Well, as you said, that's interesting for sure. All right. We'll have a link to that article over on Lawfare, written by Orin Kerr. My story this week comes from JD Supra. It's actually an article written by the folks over at the law firm Cooley - sort of a summary of how the SEC, the Securities and Exchange Commission, has voted to propose new rules for cybersecurity disclosure and incident reporting. So at a recent meeting, Renee Jones, the director of the SEC's Division of Corporation Finance, said that in today's digitally connected world, cyber threats and incidents pose an ongoing and escalating threat to public companies and their shareholders. 

Ben Yelin: True. 

Dave Bittner: So what they are proposing is enhanced regulations for public companies over which the SEC has jurisdiction, that they would report about material cybersecurity incidents on Form 8-K. We're going to come back to that word material in a second (laughter). 

Ben Yelin: Yeah. It's very important. 

Dave Bittner: They're going to require periodic disclosure regarding their policies and procedures to identify and manage cybersecurity risks, management's role in implementing cybersecurity policies, board of directors' cybersecurity expertise, updates about previously reported material cybersecurity incidents. And then there's also sort of a technical thing. And they're going to require them to present them in a particular sort of markup language that they use for these sorts of things. Let's start with that word, material. They're going to require material cybersecurity incidents to be reported. Does that raise your antenna there as a lawyer for being a fuzzy word? 

Ben Yelin: Yeah. I mean, among lawyers, materiality is maybe second only to reasonable care as one of the vaguest... 

Dave Bittner: (Laughter) OK. 

Ben Yelin: ...Legal terms that's impossible to suss out. 

Dave Bittner: Right. 

Ben Yelin: So the SEC, in trying to define this word, is using case law. And they actually cite two cases - TSC Industries and Basic v. Levinson. 

Dave Bittner: OK. 

Ben Yelin: And the definition that emerged from those cases is, information is material if there is, quote, "a substantial likelihood that a reasonable shareholder would consider it important" in making an investment decision, or if it would have significantly altered the total mix of information made available. I'm not sure that that's going to be a specific enough definition of materiality just because there's a lot that's vague in there. Seemingly, when you're talking about publicly traded companies, nearly any piece of information could be of use to a shareholder. You never can really quite anticipate the reasons why somebody would want to purchase a stock or not, or why somebody would want to sell it. So there are going to be doubts about the critical nature of the relevant information. But I think what the SEC is trying to do here is come up with their good-faith attempt at a definition. 

Dave Bittner: Yeah. 

Ben Yelin: It's not going to be perfect. And I think, more often than not, the question of materiality is going to be relatively clear. In 99% of cases, you ask: Is it information that would reasonably change the behavior of a shareholder? Or is it information concerning some type of critical data - a piece of data that was stolen, altered, accessed or used for an unauthorized purpose? 

Dave Bittner: Yeah. 

Ben Yelin: I think, in most cases, we'll know whether there has been a material incident or not. 

Dave Bittner: This article had an interesting example that got me thinking. They said, what if a company has a policy with their backups, their data backups, that after a certain amount of time, data backups get destroyed? And they discovered that they had a backup sitting on the shelf that did not get destroyed in the time period that it was supposed to get destroyed. So this data backup was sitting on the shelf, contrary to their policy. Should they be required to report that? There was no data leak. It didn't affect anything. They - once it was discovered, they destroyed the data. I suppose you could make the argument that it points to some sloppiness on the part of the people taking care of their data security. But it's... 

Ben Yelin: Yeah. That's where I would lean. 

Dave Bittner: Yeah. But it's an interesting question, isn't it? 

Ben Yelin: It is. I mean, and that's why materiality is such a hard standard. You wouldn't necessarily, at first blush, think of that as a material breach because, as you said, there's no data being stolen. It's just something that was intended to be destroyed that has not been destroyed. But if you are a shareholder, that might give you useful information on how well a particular company complies with its own data retention policies. And that might change your behavior as an investor. And under their definition, that would be material. So there are kind of these confusing questions. It's hard to think of something that really wouldn't be material in any sense of the word when we're talking about a cybersecurity incident. There are going to be examples, but I think they're relatively rare. And what the SEC, I think, is trying to argue in promulgating these regulations is, we'll know it when we see it. Based on our case law and based on our practices, we have a relatively good idea of what material means in this context. So even if the definition itself is unclear, I don't think they believe it's going to present any significant problems. 

Dave Bittner: Yeah. You know, one of the things they point out here - there was a dissent from Commissioner Hester Peirce. And Peirce said in her statement that this exceeds the SEC's limited role - said, quote, "flirting with casting us as the nation's cybersecurity command center, a role Congress did not give us." So it's an interesting question. Is this the type of thing that the SEC should be involved with? In my mind, I would say, yes, because risk is risk. 

Ben Yelin: Right. 

Dave Bittner: And cyber risk is just a subset of risk. So why not? 

Ben Yelin: I completely agree with you. I think this does fall under the domain of the SEC. They are designed to protect consumers and to prevent financial markets from collapsing, or there being some type of nefarious behavior within our financial markets. I think, when we're talking about corporate finance, cybersecurity is just as big of a risk as other types of potential risks to our financial system. 

Dave Bittner: Yeah. 

Ben Yelin: And so I think that's well within their purview. I understand the point that's being made in this dissent. I just don't - I don't agree with it. It's - this is an emerging risk. Public issuers are going to have to contend with the fact that cyber incidents are happening. They're happening to big organizations and small organizations. We are currently engaged in an international conflict where cyber warfare is a distinct possibility. And it's completely fair on the part of investors to want to know more about how issuers are managing these risks. So I certainly think it's within the purview of the SEC. That would be my view on it. 

Dave Bittner: Yeah. I tend to agree with you. All right. We will have a link to that story in the show notes. We would love to hear from you. If you have something you'd like us to discuss here on the show, you can email us. It's 

Dave Bittner: All right, Ben. I recently had the pleasure of speaking with Aron Solomon. He is from a company called Esquire Digital. And we are discussing the EARN IT Act, which is back and better than ever (laughter). 

Ben Yelin: Yes, much to the chagrin of civil liberties advocates. 

Dave Bittner: (Laughter) That's right. So here's my conversation with Aron Solomon. 

Aron Solomon: So, first of all, the EARN IT Act - very few people actually know the name (laughter) because it's a long name. Let's do that first. It's the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2022. So after a name like that - and it's even got a longer title - to establish a National Commission on Online Child Sexual Exploitation Prevention, and for other purposes. So that's why they gave it... 

Dave Bittner: Right. 

Aron Solomon: ...An acronym like the EARN IT Act because that is super, super long. 

Dave Bittner: They love their acronyms in D.C., don't they? 

Aron Solomon: They do. So essentially, the EARN IT Act was something that was first proposed in 2020. And they realized that it was a tire fire. And we thought we'd never see it again. But it came back (laughter) in 2022, essentially. That's kind of the very high-level Reader's Digest version of it. And the reason people say, well, what's wrong with the EARN IT Act? - it's because it's going to empower every United States state and territory to create sweeping new internet regulations by stripping away the kinds of protections that Section 230, the only surviving piece of the Communications Decency Act, provides for. 

Dave Bittner: You know, on its face, I think people would say, well, the notion of preventing child sex abuse - that sounds like a good idea. And that is what the advocates of the EARN IT Act certainly lead with. First of all, what is the case that they're making, and where do you feel they're coming up short? 

Aron Solomon: So I agree. It is a very good thing to protect against child sexual abuse. There's no doubt about it. And I think we would all agree with that. But the problem is... 

Dave Bittner: Yeah. 

Aron Solomon: ...Is that this is going to be, if it ends up getting passed, an extremely overreaching act, which is not going to just deal with issues like child sex abuse. It's going to potentially have your emails and your direct messages and other kinds of communications read. It's surprising to those who kind of espouse internet privacy that something like this has gotten so far. To go back to the child abuse piece of it, it seems like something that could have been a very narrowly crafted tool to do that is instead being replaced with a mallet that's going to do a lot of other things and may or may not also catch that, too. 

Dave Bittner: So what is in there that has civil liberties folks concerned? 

Aron Solomon: So what's in there that has civil liberties folks concerned is essentially that the EARN IT Act can create a massive new surveillance system run by private companies, and it can roll back a lot of the security features that we have come to expect. So we all realize that over the past few years - let's say since 2016 - a lot of these companies, like Facebook, have been in the public spotlight. And public trust in these companies has certainly decreased. The EARN IT Act essentially creates a foundation for private actors - the Facebooks of the world, if you want to call them that - or states to scan every message that gets sent online and then to report violations. And it could be anything, not just online messages. It can be websites, backups and cloud photos. 

Aron Solomon: One of the important things within Section 230 is that legally, it meant that a site like, let's say, Twitter was not itself the one communicating a message - it was essentially republishing things that you and I might decide to tweet. So states are going to be allowed to pass whatever kinds of laws they want to hold companies like Twitter liable, as long as they can find a way, with the EARN IT Act, to say that it somehow relates to online child abuse. So that kind of overreaching is what online privacy people fear. 

Dave Bittner: And what about the use of encryption? I mean, that's a key element of this as well, yes? 

Aron Solomon: Oh, my gosh. It absolutely is. So at a time when a lot of us are looking for more private services - you know, people thought that Telegram was going to be the next big thing (laughter) until they realized that it was run by somebody who's basically in the pocket of the Russian government. And I say that as somebody who's not in any way a conspiracy theorist, but it's very obviously that kind of tool. So when it comes to end-to-end encryption or other types of encryption - and people should understand, this is stuff like Signal and even your iPhone's iMessage, WhatsApp and anything that runs on AWS, Amazon Web Services - the EARN IT Act is going to help law enforcement punish companies when they deploy these kinds of encrypted technologies. That's super, super scary. It can theoretically go as far as they want it to go, which is, I think, really something that the average person who uses the internet hasn't thought about enough, because it's been done politically in a way that hides those aspects of it. 

Dave Bittner: I'm curious that the EARN IT Act sort of kicks things down to the states. I mean, something is - with the global reach of the internet, you know, and internet without borders, it strikes me that something like that would be very difficult to manage, if not impossible. 

Aron Solomon: Sure. I agree 100%. And again, the day that this all started was in late 2016. Because what the elections did is draw concern - and we see today that some of this was very valid - about Russian interference in the elections and Russian interference with our online systems. So we started to look at companies like Google, Apple, Microsoft, Facebook, Twitter, et cetera, and their role in moderating content. And as I said before, Section 230 kind of allowed them some leeway in what they were able to do when it came to content moderation. 

Aron Solomon: I remember it was Ted Cruz - I think it was back during the first version of this bill. Senator Cruz said that Section 230 should actually only apply to politically neutral platforms, so not the Twitters, which he thought were very, very left wing because of who they were trying to moderate, right? So here's the reason - long answer to your short question. I'm one of the people who is predicting, at a minimum, a red wave during the midterm elections, maybe something more akin to a red tsunami. As not only the federal balance of power shifts toward Republicans, but more and more states do as well, then it makes perfect sense to take something away from the federal government's aegis, if you want to say that, in Section 230 and put it back to the states. And the vast majority of states are going to create something where people say, whoa, we didn't know that EARN IT would do this - and it will. 

Dave Bittner: How could something like that play out, though? I mean, I'm thinking of myself. You know, I live in Maryland, which is a, you know, a left-leaning state. I suspect, you know, something like this would not - our legislators would not go for something like this. But if a neighboring state did, you know, Pennsylvania or West Virginia, you know, one of our neighbors, what does that mean from a practical point of view in terms like, would I not be able to send a Signal message to someone who doesn't live in the same state as me? Are we talking about things that simple and straightforward? 

Aron Solomon: I think in some ways, we are, and it's really a fantastic question. So if the EARN IT Act passes and folks like the Electronic Frontier Foundation - the EFF, right? - are correct in what they anticipate might happen, we're going to be living within a surveillance state. And at the small-s state level, I think your example is a really good one. You could be living in a state, Maryland, that says we're going to adhere to the protections under Section 230. And you could be communicating with someone in - I don't know - pick a state, Texas, since Texas is always in the news, where they're like, no, actually, every single thing is being monitored. And you, the person in Maryland, may eventually be committing a Texas crime in sending that message. 

Aron Solomon: Especially, come on, like, so many of the messages that we send to each other online, whether it's in an encrypted format or not, are subject to misinterpretation. I mean, I used to live in China. I knew that every message that I was sending, even on things I was allowed to use, like Chinese applications, was being watched. So you would be really careful about not doing something, especially with a language difference, that might be interpreted in a certain way. Is that the way that we want even, quote-unquote, "safe messaging" to happen in the future? Well, I mean, theoretically, it could happen with the EARN IT Act. 

Dave Bittner: You know, one of the things that I've seen pointed out about the EARN IT Act is that while it's, again, on its face, aimed at protecting children, a lot of folks who will be affected are adults and particularly sex workers, you know, folks who are involved with legal adult content. You know, I remember a few years ago when platforms like Craigslist, you know, removed their personal ads because they felt like they could - they were liable, you know, suddenly for folks who are sex workers and folks who are using it for that. I mean, is this taking that to the next level? 

Aron Solomon: It's taking it five levels down. And again, you know, when people hear the term sex worker, we're not just talking about someone who's doing legal sex work, advertising their services. We're talking about people who decide to post subscription content to things like OnlyFans. So again, the strategy behind the EARN IT Act is that it's going to let private companies do this dirty work of mass surveillance. And that's going to mean that somebody in that position who has federal protections now - whether their content is on Tumblr, which, by the way, took all sex-related content off its site, I guess probably a year and a half ago, or on OnlyFans, or any place used by somebody who decides that part of what they want to do with their life is monetize this kind of content - won't be able to monetize it and won't be able to use it. 

Dave Bittner: And I suppose then, I mean, the fear is that it pushes that stuff underground where people don't have the sort of legal protections that they have when it's, you know, a public facing. 

Aron Solomon: Yes. And we've got to think about this within the broadest scope of the way it's done. So if you look at something like OnlyFans, which a lot of people have probably heard of, when it comes to paying for content that you can get on OnlyFans, you know, it's done with payment methods that are very traditional, right? So somebody could decide, I want to see the pictures of this person, and they could use a regular credit card or some other kind of payment transfer. And it's relatively safe because OnlyFans is very well known. So this affects all parts of that transaction. It takes money out of the pockets of the people who decide that they want to sell this type of content. And it also makes purchasers much more vulnerable when things like this go underground - you've got, you know, potential scams and theft of accounts or crypto or whatever the case is. It turns something into the Wild West that just doesn't need to be that. 

Dave Bittner: So where do we stand now? As you and I are recording this, where does this legislation stand in terms of where it is, you know, making its way through the machine? 

Aron Solomon: So the way - I always think back to the great "Schoolhouse Rock," how a bill becomes a law, you know. 

Dave Bittner: Yeah. 

Aron Solomon: I won't pain our listeners by singing it. 

Dave Bittner: (Laughter). 

Aron Solomon: But basically, where we are right now is that, even though opponents hope and pray that it won't go any further, the bill has passed committee. So it's got a lot more steam than it had a couple of years ago, especially given, as I said before, where the United States is as a nation politically and what's kind of expected to happen later this year. I would not at all be surprised if this fully passes and becomes law. And then we're going to find that it's going to be something that the courts are going to have to deal with. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: The zombie lives. I mean, it's just crazy. It's crazy to me, given the opposition we saw to the EARN IT Act in 2020... 

Dave Bittner: Yeah. 

Ben Yelin: ...Among online activists, that it's being reintroduced. I think your interviewee made a great point that the goals here are noble. I think everybody who's involved in opposing EARN IT realizes that protection of children online is an extremely important governmental interest that might even justify some radical action on the part of our federal government, but that the provisions of this particular bill go too far in the other direction. 

Dave Bittner: Yeah. 

Ben Yelin: So I can understand why there is this widespread opposition, even though we are dealing with CSAM. It's something that everybody from right-wing groups to civil liberties groups to left-wing groups seems to oppose because it would have potential effects on censorship of information or all different types of second-order effects, and certainly would threaten the security of our communications as well. 

Dave Bittner: Yeah. 

Ben Yelin: So I appreciated the interview. And, I mean, even though I think many of us would rather not be talking about the EARN IT Act again, we are, so... 

Dave Bittner: Yeah. It's important that it has attention drawn to it so it can be debated. All right. Well, again, our thanks to Aron Solomon. He is from Esquire Digital. We do appreciate him taking the time for us. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.