
Who’s the boss of AI?
[ Music ]
Dave Bittner: Hello, everyone. And welcome to "Caveat," N2K CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my cohost Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: And joining us once again is our N2K CyberWire colleague and editor of the Caveat newsletter, Ethan Cook. Ethan, welcome.
Ethan Cook: Hey guys. Good to be back.
Dave Bittner: Our topic today is federal preemption. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, gents. Let's jump right in here. So Ethan, you have written up quite a set of notes for us to discuss preemption here today. Maybe I'll start with you. Give us a little insight into this journey as you took on this topic.
Ethan Cook: Yeah. So, you know, for a little behind the scenes look, after every deep dive we internally meet and talk about what the next thing is that makes sense to cover. Right? And there was a weird thing going on in Congress that we haven't seen in a little bit. It's not that it hasn't happened before, and it's not that it hasn't been, you know, popular before, but it's not necessarily the flashy thing to talk about, and it had a lot of impact. And that was a moratorium going through the house at the time -- it has since gone through and failed in the senate -- but the whole point of it was to use federal preemption to ban all state -- not ban. Negate all state AI laws that have been passed already and prevent any other ones from being passed for the next 10 years. Which, when you say that out loud, sounds really extreme and kind of convoluted, and the antithesis of what honestly both parties kind of put forward, which is that we value states having opinions. There's a value there. No one really ever says states shouldn't have any say over a matter. And this whole moratorium felt completely the opposite. It felt like a complete removal of power, and it gained a lot of notoriety very quickly. And, you know, as we were talking, Ben pointed out that while this is a kind of off-the-beaten-path subject, it had a lot of relevance and was worth looking into. And since doing my research and writing this up, I would agree. This is a very interesting topic, and I think it's going to set up a larger conversation about what AI policy looks like in the next five years.
Dave Bittner: Ben, what do you make of this?
Ben Yelin: So first of all, I want to thank you guys and our audience for indulging us, because this is really nerding out on a legal concept. But I hope to illustrate why I think preemption is such an important topic, because a lot of what we discuss on this podcast are laws and policies, and in case you haven't noticed, Congress, though it's been a little bit more productive this year, is slow to address societal problems. It's polarized. It is famously inefficient. It has arcane rules that prevent even the routine passage of small pieces of legislation -- things like requiring unanimous consent in the senate for most things and having a 60 vote threshold for most things to get through the senate. So that leaves the states, and the states generally are well positioned. The Constitution gives states police powers to protect the health, safety, and welfare of their citizens. As long as states are not violating the federal Constitution or their own state constitutions, they can pass laws on anything. And they do. I think I mentioned in one of our previous episodes that Maryland just officially labeled the orange crush as our official cocktail.
Ethan Cook: Wow. I didn't know that. I did not know that.
Ben Yelin: Always on top of the critical issues here.
Dave Bittner: Okay.
Ben Yelin: So what makes preemption so interesting is that it's the federal government taking the keys away from state governments and their ability to experiment with laws and regulations, especially on topics like artificial intelligence, where things are rapidly developing and state legislators and legislatures need to be nimble and able to develop fast acting policy solutions to address problems that are impacting their citizens. And the reason that this AI provision, which ultimately failed, as you said, raised my eyebrows is that it would have handcuffed the states from responding to new developments in AI. We've seen federal preemption in the past, and we're going to talk about this, Ethan, in a bunch of different areas. This is not an entirely new topic. One thing I think we have not seen is preemption in an area where Congress hasn't really done anything itself. There are no comprehensive federal AI regulations. We've had executive orders from the past two presidential administrations; not much of that is binding policy. So it would be, all right, the federal government hasn't passed any AI regulation, state governments can't pass any AI regulation. So who's going to protect us? And I think that just creates a really interesting dynamic, and I think it's worth discussing not just in the context of AI, but everything from cybersecurity to data privacy. States pass a lot of legislation on these topics that is tailored to the specific desires of the electorates of those states. And federal preemption is a way of cutting off those efforts at the kneecaps. And so I think it's really important to discuss.
Dave Bittner: So before we dig into the details of this AI policy, can we touch on some of the history here? I mean, what is there in the Constitution that gives the feds the power to preempt the states?
Ben Yelin: So this comes from the supremacy clause, which says that federal laws are the supreme law of the land. The caveat to that, so to speak, is that Congress is limited to its enumerated powers. So Congress can only pass laws pursuant to article one, section eight of the Constitution, which has a list of things Congress can do. Because this was written in the 1780s, a lot of the items listed in article one, section eight seem kind of silly. But there are a few things that still apply. Raising and supporting armies. Protecting intellectual property. Those are domains of the federal government. The kind of catch all that's been used to justify a lot of federal action is Congress' ability to regulate interstate commerce. So if Congress wanted to step in and regulate artificial intelligence, it could say -- and Supreme Court precedent would back it up -- that because this has a substantial effect on interstate commerce, Congress has the power to regulate it. So when you're in an area where Congress has an enumerated power, because of the supremacy clause, whatever Congress does usurps or supersedes the actions of state governments. That's where the notion of preemption comes from. What the Supreme Court has held is that Congress has to be pretty specific and explicit about preempting state action. There's something called implied preemption, where Congress passes a bunch of laws regulating something and courts can infer that Congress intended to occupy the field. And that's happened in a number of different areas. Regulation of food and drugs. ERISA insurance regulation stuff. Then there are some areas where there is explicit preemption, where it's not implied but actually written into the law that upon passage of this bill, states are prohibited from acting in this particular area. And that generally has been held to be constitutional in most circumstances. And that's what this AI provision would have done. It would have been explicit preemption. At least the house version of it.
Dave Bittner: Yeah. All right. Well, let's dig in to the specifics of this. Ethan, what can you tell us about the history of this AI preemption? Any insights on its origin?
Ethan Cook: Yeah. So as many people are probably aware, there is a bill that just recently got passed -- the big beautiful bill, as the thing has been called. Right?
Dave Bittner: Yeah.
Ethan Cook: And it is our reconciliation bill for this year, and it has gotten a lot of feedback, both good and bad, over the past couple weeks. And probably the most infamous moment it had was when it passed the house the first time in a 215 to 214 vote. Very, very narrow. And there was a lot of criticism, specifically of the Republican lawmakers who had the majority, that this bill came together very quickly. It was not read through. It was voted upon and passed to the senate without really any debate or discussion. And one of the things that happened after it went to the senate was people started breaking it down and actually looking at what was in this thing, and they discovered this moratorium. And it instantly gained a lot of push back from people -- on both sides of the aisle, by the way. This was not just democratic push back -- basically saying that this is not what we're about. We don't have anything in place federally to cover our bases, and we're taking away state efforts. Obviously there were a lot of state lawmakers who came out who were very [inaudible 00:10:08] about this. And the only real argument in favor was basically saying, yeah, we are pulling this because we currently have too many state laws and it is creating confusion. You have one state requiring AI developers and AI deployers to do one thing, and then you have another state doing the exact opposite -- or, you know, maybe it's a similar set of requirements, but the way they go about it is different, so companies have to do X, Y, and Z and it gets really confusing. And what this whole process is doing is causing us to lose both economic advancement and technological innovation. That was the crux of the argument for why we should preempt state laws here. And then I think maybe the goal was that we would eventually pass something at the federal level that would legislate this.
And, you know, bring some guidance. Then, while it was in the senate, there was a good back and forth with senator Ted Cruz, who was rewriting it to make it legal within the senate's version of the bill, and after it was approved by the senate parliamentarian, Republicans pretty much unanimously pulled support for the moratorium. It overwhelmingly failed in the senate and was killed before the bill went back to the house, and the reconciliation bill was eventually passed without it. So while the moratorium is not law and has failed, I do think it opens a conversation about what this means for federal AI policy, because the fact that it made it this far, and the fact that it was debated this heavily, I do think is worth noting.
Dave Bittner: All right. I tell you what. Let's do a quick break here for a word from our sponsor. We'll be right back. [ Music ] All right. We are back. Wasn't there some back and forth between Marjorie Taylor Greene and Ted Cruz? I think this is one of the things that she kind of said she [inaudible 00:12:19].
Ben Yelin: It's frankly to her credit that she admitted it and was like, "Look. I'm a realist. I know that members of Congress aren't reading 1,000 page bills in their entirety." Now, I think they should be briefed by their staff on everything that's in the bill, but yeah. She basically said, "I voted yes on this without knowing about this AI regulation moratorium. If it comes back to the house and that's still in the bill, I'm going to vote against it." And she was joined by other very conservative lawmakers. Senator Marsha Blackburn of Tennessee, for example. Tennessee has taken a lead in regulating certain criminal uses of AI, things like deep fake pornography, and I think she was concerned that that type of regulation would have been preempted. So Marsha Blackburn was working with Ted Cruz, who was kind of the pro preemption senator on this subject. They temporarily came up with a compromise where, instead of there being a blanket moratorium, they would condition broadband funding to the states. This bill provides broadband funding to help set up rural broadband across the country, and states would only be eligible for that broadband money if they did not regulate artificial intelligence. So it was kind of the carrot instead of the stick approach.
Dave Bittner: Right. Like the old 55 mile an hour speed limit.
Ethan Cook: Or the classic drinking age. If you bring it to 21, we won't slash your funding. If you keep it under, you can -- you can, but you just won't get the money.
Ben Yelin: Right. And that sort of approach was held to be constitutional in those two circumstances. But then President Obama, through Obamacare, tried a different tactic in terms of conditioning funding. They passed the Affordable Care Act with this pretty large expansion of Medicaid, and they said to the states, "You either accept this expansion of Medicaid or you will lose all of your Medicaid funding." So not just the new funding, but all existing funding. Your state won't have Medicaid. And the Supreme Court said that that was too coercive. It was the federal government basically putting a gun to the head of the states. So I don't know how courts would have seen this. I think this is somewhere between the drinking age and Medicaid. I don't think it's on either polar end. But that's kind of where the constitutional question would have been: is this policy overly coercive? Is it forcing states to do something basically completely against their will? But then I think Marsha Blackburn kind of started to pull away from that deal, and I think Cruz realized he didn't have the votes. They actually put up an amendment during consideration of the bill in the senate to strip out this AI preemption provision. It passed 99 to 1, which is just very interesting. I think it was one of those things where it was going to be out of the bill anyway, so what's the point in supporting this provision?
Dave Bittner: Right.
Ben Yelin: It's not like it's politically popular.
Dave Bittner: It wasn't worth fighting over.
Ben Yelin: Yeah.
Dave Bittner: I saw at one point he tried to cut it down to 5 years instead of 10.
Ben Yelin: Yeah. I mean, I think he was [inaudible 00:15:45] like four different things. They were going back in negotiations to try and make it work. It was approved by the parliamentarian. It was then unapproved by the parliamentarian. It was then reapproved by the parliamentarian. So it went back and forth through multiple iterations in the senate, and ultimately it just never gained the support that it needed. And so at the end of the day, this provision didn't pass. You know, I think from our perspective it's worth noting what kind of state regulations this would have preempted. States have started to take action on AI policy: setting up governance structures for AI, promulgating rules on which AI tools can be used by various government agencies, doing inventories of AI systems in state government offices. Certainly, in the criminal realm, restrictions on deep fake pornography. California passed a law restricting the use of artificial intelligence in political advertising. So depending on the version of this moratorium that passed, all of those laws would have been declared null and void because of this preemption provision. And that would have had pretty severe consequences. And I think the under covered part of this is that AI is developing so rapidly. We are so far from where we were in 2022, when we first learned about generative AI and LLMs and first started screwing around with ChatGPT. Think about how far we've come in those three years. If you look out on the 10 year horizon, which was the original proposal, who the hell knows what's going to happen. We could be controlled by AI overlords. They could be commandeering all of our cars and driving them into rivers. We have no idea what problems are going to arise. And so I think --
Dave Bittner: Or benefits.
Ben Yelin: Or benefits. Or benefits. Yeah.
Dave Bittner: We could all be sitting around the pool with lovely cocktails with little umbrellas in our glasses enjoying.
Ethan Cook: Orange crush cocktails in Maryland. It is our official state cocktail.
Dave Bittner: With our AI assistants, you know, doing all of our work for us. Right?
Ben Yelin: Yeah. I just think the big point here is that if there are bad consequences, the federal government is not going to solve them in a timely manner, because they never solve these problems in a timely manner. So we're reliant on state action. So the federal government would have just handed us as consumers a double body blow: we are not going to regulate AI, and you're not allowed to regulate AI, and as a result it goes completely unregulated.
Dave Bittner: It seems upside down to me that the feds are basically saying, "You know what? We haven't been able to do anything in this space and so we insist that you not be able to do anything either."
Ben Yelin: It is. I think there's a dynamic in which it's not just that we haven't; I think this new administration does not want to. And I think that was really evident when, within the first like two weeks, Trump rescinded Biden's executive orders about having safeguards for AI. And there's been a lot of language coming out of that administration -- whether it's J.D. Vance in Europe saying that Europe needs to pump the brakes on its regulations and open itself back up to AI developers, or the new policies being put in place. I think one of Trump's executive orders was all about not just rescinding Biden's safeguards, but also how we can promote greater economic innovation regarding AI here. There's been a very clear message that regulation and safeguards on AI are hampering innovation, and that removing those will enable the U.S. to continue its technological dominance. There is a lack of desire for federal AI regulation. And I think that's actually fine in theory. I could understand: if you're OpenAI's Sam Altman, you do not want to be subject to 50 different regulatory regimes. I think that's completely rational. What's irrational to me is that you would impose those restrictions without setting some type of federal floor. And that's what's generally been done when we're talking about federal preemption. So the federal government says, "We're going to regulate food and drug labeling." These are the things that have to be on the labels of the food we eat and the drugs that we take, and states are therefore not permitted to make additional requirements beyond that, because we've occupied that field of regulation. At least in that circumstance, the federal government had set a floor that I think was a minimally acceptable floor. So if you're going to have a uniform standard, at least that standard was something. Right? I think the problem with this regulation is that the standard would have been nothing. It's institutionalizing the wild wild west despite the deep uncertainty about where AI is going to take us in the next 10 years.
Ethan Cook: I think there's another wrinkle to that. It's not just about, you know, institutionalizing the wild west, which I would agree with. I also think there's a dynamic of -- to your point earlier, Ben, states are nimble. They can pass things faster. They can be more adaptive to what's going on and what their constituents are saying and things along those lines. But the federal government has proven over the past 20 years to be very ineffective at passing almost anything. And I have a really big concern about, okay, even if we set the floor, is that floor actually impactful with how fast AI is changing? Like, sure, we pass that floor. Say we pull a bunch of state laws together, see what's working, what's not working, what we can all agree upon, what we can't. Pass that. Great. And in five years -- we saw from 2022, in three years, almost four, it changed completely, and what we need now as opposed to years ago is completely different. What does it look like in five years? And can we realistically expect, at a federal level, to pass multiple floors that are actually impactful?
Ben Yelin: Yeah, I think that's exactly right. From a political perspective this also creates problems, because you have members from some states who have already passed regulation who don't want the federal government to preempt their states. And that ends up killing legislation that would provide a reasonable floor. That's what happened with data privacy. There was a proposed federal data privacy law at the end of the congressional session in 2022. There was a chance that that data privacy bill was going to pass in the lame duck session. Nancy Pelosi was still speaker, and she put the kibosh on it because California had already passed the CCPA, and if they had passed this federal law, that federal law would have preempted the stronger protections that existed in the California law. So that's the political wrinkle here: there are always going to be members of Congress from states who have policies that exceed that congressional floor. I think the only way to make it work is a grand coalition where the floor established by the federal government is something -- a compromise -- agreed upon by people who want more regulation and people who want less regulation. I think that's the only way it would ever work, and that in some ways seems like a pipe dream. I don't know if we can ever actually find that perfect equilibrium. But because of the threat of federal preemption, even politicians like Nancy Pelosi, who would usually be favorable to regulation, kill regulatory bills because they're worried about preempting stronger state laws.
Dave Bittner: Isn't there a tension here -- because you have on the Republican side of the aisle the desire for as little regulation as possible. Right? A strong belief in that. And certainly this presidential administration believes that, as Ethan pointed out. But on the other hand, there's also a strong desire for states' rights. Right?
Ben Yelin: Yeah. Absolutely. A strong professed desire for states' rights. And I think in most other circumstances, people with conservative leanings would say that states are better situated to regulate policy. I mean, that was the line on abortion politics for 50 years: that states have different political needs, priorities, and situations, and they are best situated to set policies, not the United States Supreme Court. I do think there is a lot of truth in that. Now, it's not 100% foolproof. People hear states' rights and they think about civil rights in the 1860s, the 1960s.
Dave Bittner: Right.
Ben Yelin: Obviously all of us believe that in those circumstances the federal government was right to step in and supersede states' rights claims with things like the Civil Rights Act and the Voting Rights Act. But I think generally, at least in the abstract, there is something critically important about giving states flexibility. And that's the axiom about states being laboratories of democracy. Yes, it's experimenting with different policies to see if they work. But it's also being able to tailor policies to the unique political coalitions that govern these states. And it's kind of more democratic in a small D sense. State legislators are closer to the people. They are elected from smaller districts. People, I think, have more of a chance to personally interact with their state legislators. So there's a strong theoretical grounding to this, even though it's been misused in the past.
Dave Bittner: Let's do a quick break here for a word from our sponsor. We'll be right back. [ Music ] All right. We are back. Ethan, this was not popular with the general public. Right? I mean, is it fair to say the majority of U.S. citizens are kind of looking at AI with at least one raised eyebrow?
Ethan Cook: Yeah. I would say that the moratorium was overall unpopular, which is why it really did not get any support in the senate once it was finally discovered and fleshed out as a concept. You know, I think no matter how you sell this, I don't think it gains popularity. You go to the average person: "Hey, should the federal government prevent states from passing any AI laws?" Most people would probably say no.
Dave Bittner: Right.
Ethan Cook: Or let's even flip it on its head. "Hey, do you think AI makers should have regulations imposed on them?" Most people would say yes. Right? I don't think there's any way you can really sell removing this without the huge part of having a floor established that makes it popular. You know, I think there is a wrinkle to this that I am waiting to see play out over the next couple months slash next couple years, which is this. With AI being this kind of untold promise -- it could be everything great, it could be everything bad, it could be everything in between -- I think part of federal lawmakers' inaction is a paralyzing fear that over regulating any specific thing will cause the industry to collapse or cause companies to escape and go somewhere else. There is a fear from an economic perspective. There is a fear from a technological innovation perspective. And I think the last one, which probably does not get as much attention as it deserves, is a fear from a military perspective. I mean, OpenAI just signed $100 million of contracts with the Department of Defense to help them create their own AI platform. Right? If OpenAI ups and decides to leave and go, let's say, to Canada, or name another country, that has a huge impact on our defensive capabilities. And while AI's not there yet, with how fast it's advanced in a couple years, I think it will be there. And potentially scaring off developers, as Europe quote unquote has done, is I think a major fear on both sides of the aisle.
Ben Yelin: Yeah. I mean, that is the other side of this -- what you've just talked about. It's also geopolitical. We don't want our adversaries in China to be the innovators on AI. I mean, for a while it seemed like the Chinese had done better than us at developing AI tools. I just think there are going to be regulations that protect consumers without dooming the entire industry to being over regulated and therefore failing in the United States. And I think state legislatures are well situated to consider the pluses and minuses of those regulations and to balance those equities. Are they always going to get it right? No. But they're nimble enough that if they don't get them right, they can come back the next month, the next year, and change policies in a way that I just don't think Congress is capable of doing. So I kind of come back to this idea that even if you are fearful of over regulation, you'd rather have the nimble, flexible, varied approaches of state governance than whatever you would get from an inert federal Congress. Even despite all of the factors that we've discussed.
Dave Bittner: Can we go up a level and just talk about the place that we are in, where something like this gets inserted into this gigantic mega bill? Rather than having consideration standing on its own, we end up with things like this, where you have a legislator like Marjorie Taylor Greene saying, "I didn't know this was in there." Right? I mean, that just shouldn't be. It makes everything harder. Right? And yet here we are.
Ethan Cook: Yeah. So most state legislatures have single subject rules meaning like if you have a bill about the Chesapeake Bay you can't hide an AI provision in there.
Dave Bittner: Right. Right.
Ethan Cook: Like it has to all relate to a single subject. One of my long time hobby horses is that the reason Congress can't have single subject bills is that it's handicapped by its arcane senate rules. Congress has to do a bunch of things every year. Most notably, it has to pass appropriations bills. There are also authorization bills that have to pass, like the big military defense authorization bill. And the way the senate rules are set up, any one senator, let alone the minority party, can just kind of indefinitely delay things. So if you demand a vote on cloture, then you're entitled to 30 hours of debate precloture, and then you have to get 60 votes to continue on most pieces of legislation. And then you've got 30 hours of debate after cloture. Then you balance that with the fact that senators don't want to be there all the time debating. They want to be back home. They have to fund raise. They're off raising money. And the clock really starts ticking quickly. In state legislatures none of this is a problem. I follow the Maryland state legislature, where they have single subject bills and they are fully capable of passing 20 bills in a matter of minutes. But it's the senate rules that prevent Congress from having that power. And I think all of this is a downstream effect of those rules. So my hobby horse, my long time political goal, is to reform these senate rules so that Congress can operate more like a state legislature. State legislatures are just as democratic. They reflect the will of the voters. We can even still have a 60 vote threshold. But just get rid of all of these unanimous consent requirements and time requirements that prevent the senate from acting on a sufficient amount of legislation to cover the needs of the federal government. And that's my soapbox.
Ben Yelin: It's a fair soapbox to be on, by the way.
Dave Bittner: I mean is it fair to say there are those who would say that those slow downs are a feature, not a bug?
Ethan Cook: Yeah. Totally. And people do say that. Like, oh, shouldn't we have more time to debate? But it would be great if they were actually up there debating for all 30 hours. That would be great. If you turn on CSPAN2, that's not what happens.
Dave Bittner: Yeah.
Ethan Cook: They're usually in a quorum call which is just the senate floor sitting empty while one presiding officer is doing their crossword puzzle waiting for the clock to hit those 30 legislative hours. So I believe that the senate can still be the so-called cooling saucer that slows down the passions of the house. There are still ways to do that without having these time requirements and requiring unanimous consent of 100 people to do anything.
Dave Bittner: Right.
Ethan Cook: I think there are ways to reform the institution while still allowing it to check the excesses of the house of representatives.
Dave Bittner: So, Ethan, this is dead for now. This AI provision is dead for now. Any sense of whether those who lost on this one -- we've been talking about Ted Cruz, for example -- are going to try to come back and fight another day with this?
Ethan Cook: I would be shocked if they didn't. I think there is too much money behind AI that would actively like to pump the brakes on some of the state legislation going on. I think there is too much military incentive. We are a defense oriented country. We spend a lot of money on defense, and AI has a lot of potential there. I think that is a huge driver, and there are big people behind it. I think we have an administration that's friendly to deregulation, so that's a huge win for that argument compared to the previous one. And I think the underlying factor that's also there is that midterms are coming. They will be here in a year, you know, a year and some change. Historically speaking, election wise, midterms typically flip back to the other party no matter who's in power. So if you go based off historical data, it should flip back to the Democrats. Now, that's not guaranteed. Things happen. But that's historically what does happen. And there is, I feel, a big urgency right now among Republican lawmakers, who currently have a majority in both chambers, a majority on the Supreme Court, and obviously control of the White House, to get as much passed as possible within this two year period before any chance of disruption occurs. And I would not be shocked if we see something come through. I wouldn't be shocked if it was this summer, before the legislative session breaks. But I would not be shocked if we saw something proposed again by the end of the year, and certainly something being proposed next year. I don't think it's going to be as egregious as this one was, because I think pretty much everyone was like, "This feels a little -- 10 years is a lot of time. That feels like a little too much."
But I wouldn't be shocked if we saw a three year or a four year or a five year moratorium, with more language that supports -- to call out the show name -- some caveats in there that allow for greater provisions on, let's say, deep fakes, the argument being, "Okay, that has nothing to do with actually hampering AI. That's about how we use it, not the application and development." Right? I think there would be more things along the lines of restricting what risk assessments, governance requirements, and compliance requirements can be imposed on a state level, and how this would impact specific industries, whether that be healthcare or finance or defense. I think those are probably the things that are going to be targeted. I would be really shocked if this isn't attempted again, but I do not think it will be a single bill. I think it's going to be stapled into something again, with an attempt to pass it largely without people being aware of it, as was very clearly attempted this time.
Dave Bittner: Yeah. I guess the other side is that now, you know, the Marjorie Taylor Greenes of the world have their radar up. They're going to be more vigilant. Right?
Ethan Cook: Yeah. I would be shocked if people aren't paying a little bit more attention to AI for the next couple of months. I think behind the scenes there are going to be a lot of conversations related to this and what the next steps are. What should people be watching for? Etcetera. One dynamic I'm curious to see play out over the next couple of months, in both chambers -- and to your point, Ben, about the slowness of how Congress operates in general -- is what happens if this takes longer than I think. If it takes, let's say, a year and a half, how far has AI come in that time period? What have states done since then, and does the bill have to change because of it? A year and a half doesn't sound like a lot of time, but that's pretty much half the lifespan of AI since it came onto the scene, and look how much it's changed in just that amount of time. I think that could fundamentally change what that bill and that moratorium look like.
Dave Bittner: Yeah. It's hard to imagine legislation that doesn't end up being reactive or, you know, obsolete by the time it's passed.
Ethan Cook: Especially with AI advancing at the pace it does. You know.
Dave Bittner: Right. Right. All right. Any final thoughts, Ben, before we wrap things up this week?
Ben Yelin: Should I stand on my soapbox again and talk about senate rules? No. [laughs] I think I've covered it enough. And I think Ethan's prognostication is accurate. I'll note that they will have a couple more opportunities for these reconciliation bills, which are not subject to the senate filibuster, but I think they blew their one opportunity to sneak this into the bill without people noticing. So they won't be able to get away with that again, which I think Ethan touched on well.
Dave Bittner: Yeah. It's interesting to me. Like, we noticed it. You know, those of us covering the cyber beat were talking about it from the very beginning, as people on this beat do.
Ethan Cook: Seeing it and I was waiting for it to pop up in to the general news and it took a week or two before people really started kind of like, "What is this?"
Dave Bittner: Right. Exactly.
Ben Yelin: I guess what you're saying is that Marjorie Taylor Greene does not read my tweets.
Dave Bittner: Right. Not a regular listener of the CyberWire, I guess. Well, what are you going to do?
Ben Yelin: She's always welcome. Welcome on here any time.
Dave Bittner: Yeah. Maybe she will be now. You never know. You never know. All right. Well, Ethan Cook is the editor of our "Caveat" newsletter. Ethan, thank you so much for joining us here again this week. We look forward to having you back again soon. [ Music ] And that is "Caveat" brought to you by N2K CyberWire. We'd love to hear from you. We're conducting our annual audience survey to learn more about our listeners. We're collecting your insights through the end of this summer. There's a link in the show notes. We do hope you'll check it out. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tre Hester. Peter Kilpe is our publisher. I'm Dave Bittner.
Ben Yelin: I'm Ben Yelin.
Ethan Cook: And I'm Ethan Cook.
Dave Bittner: Thanks for listening. [ Music ]