Caveat 5.2.24
Ep 216 | 5.2.24

The building blocks of AI governance.

Transcript

Cameron Kerry: Technologies like artificial intelligence have what's being referred to now as sociotechnical components. These are not purely technological issues like, you know, the design of a mechanical part or a computer chip. They do have broader social and economic implications. And, you know, the system of standards development has some ability to deal with that.

Dave Bittner: Hello everyone, and welcome to "Caveat," N2K CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hey, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: On today's show, Ben and I discuss the case of an AI-generated recording leading to criminal charges here in Maryland. I've got the story of proposed legislation to improve the safety of water systems. And later in the show, Cameron Kerry of the Brookings Institution shares his report, "Small Yards, Big Tents: How to Build Cooperation on Critical International Standards." While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. [ Music ] All right. Ben, we've got a lot to cover here today. You want to kick things off for us?

Ben Yelin: Sure. So, this is a story from our home state of Maryland, but the story went national. And I'm not surprised that it went national because it's really a first of its kind in terms of the impact of AI-generated sound recordings. So, to back up a little bit, back in January, there was a major scandal at a Baltimore County high school. Baltimore County is the suburban county surrounding Baltimore City. And it was in the community of Pikesville. Pikesville has a lot of African American students and also a lot of Jewish students. So, we're just sort of laying the groundwork here for what ended up being a massive scandal.

Dave Bittner: Just to connect this personally, my in-laws live in Pikesville.

Ben Yelin: Ah, okay. We're raising the stakes even more here.

Dave Bittner: Right.

Ben Yelin: So, the principal of this high school, a guy by the name of Eric Eiswert, was accused by faculty members and in social media posts of making racist and antisemitic comments in an audio recording. That recording had been circulating on social media. The recording, you can listen to it, it's online, is patently offensive, basically using racist stereotypes, antisemitic stereotypes, as bad as you can possibly think. He was put on non-paid administrative leave during an investigation. Baltimore County Police contacted forensic experts, and, long story short, it was determined that this recording was fake and had been generated by artificial intelligence. The twist to the story is, the person who created this fake recording --

Dave Bittner: Allegedly.

Ben Yelin: Allegedly. Yes. Sorry, we have to be very careful and you're right to say that. This is all just according to an indictment. It's a guy by the name of Dazhon Darien. He is the former athletic director of Pikesville High School. He allegedly was upset about some personnel matters. I think his contract wasn't getting renewed. It was kind of minor stuff, not something that should have led to creating a false recording of antisemitic and racist comments. But they were able to trace this recording to Mr. Darien by obtaining a warrant. There was a warrant put out for his arrest. He was at BWI Airport attempting to board a flight with a loaded gun. Not a good idea. So, --

Dave Bittner: Wow.

Ben Yelin: -- that raised flags as it should from TSA. They did a little background search on him and discovered that there was this outstanding warrant. So, he is being charged by the Baltimore County State's Attorney, the district attorney in Baltimore County. There are a number of charges, none of them particularly serious. The largest prison sentence for the charges that have been presented so far is six months, and that's for basically interrupting school proceedings. There are also charges for impersonation, fraud, and basically the equivalent of blackmail because he was using this as retaliation for employee-employer disputes. So, this is a first of its kind story. We've never seen anything like this where somebody weaponized the use of AI to create an entirely false recording that led to, not just anger against this principal, but death threats. He had to have law enforcement stationed outside his home based on something that he did not say. When the story first came out, before we knew that this had been AI generated and orchestrated by Mr. Darien, there was a big public outcry among the community in Pikesville. A lot of activists who grew up in the Baltimore area, including very prominent ones, went on social media and said, "This guy has to be fired," referring to the principal, "For these comments." And of course, the principal never uttered the words that were heard in that recording. So, there are a couple of angles to the story that are interesting. I think we're going to be talking about this story for a long time. The first is that, I think Mr. Darien is being undercharged here considering the impact that he had. And that has to do with the lack of criminal statutes relating to the fraudulent use of artificial intelligence. Maryland doesn't have anything in the law books that would lead to a potential criminal charge here. 
And I know members of the state legislature have already started brainstorming about how they're going to address this in the next session. A lot of states have criminalized the use of AI, but only in limited circumstances, so things like election disinformation and the use of revenge pornography. California was the first state to ban revenge porn. But this is not revenge pornography and this is not election disinformation, this is ruining somebody's life but it doesn't fit neatly within those proposed statutes. So, I think it's incumbent upon policymakers to try and figure out a creative way to address this problem where the only available charges we have here are relatively minor. Interrupting school proceedings carries a six-month sentence. You'd have to think, since this is, I believe, Mr. Darien's first criminal offense, that even if he is convicted, they might end up suspending that sentence. And so, I think it's incumbent upon policymakers to try and figure out a legislative solution to this. One thing I notice is that, there is a federal effort through introduced legislation in Congress that would ban the use of artificial intelligence for "malicious purposes". And they lay out a bunch of examples of malicious purposes. None of them necessarily cover what happened here, but I think that could be a good starting point, a good benchmark where, if you have a proper definition of malicious purposes and if you define it in a way that it covers the conduct that we saw here, then I think we would have justice in a case like this where a guy's life was ruined due to the use of artificial intelligence in this audio recording.

Dave Bittner: Right. We've theorized that this was going to happen. Right. The fact that this happened, I think, doesn't surprise anyone who's been following this technology. This was one of the things that we all imagined, --

Ben Yelin: Yeah, absolutely.

Dave Bittner: -- that deepfake audio and eventually video would be used to put words into someone's mouth. And now it's happened.

Ben Yelin: And the fact that it's happened has a couple of consequences. First, anybody who actually did utter words that were this offensive, they can properly, and I think reasonably, claim that, oh, this could be artificial intelligence. That's going to be the new excuse for criminal defendants who end up being charged either under these statutes or any other statute.

Dave Bittner: Well, I mean, it's a whole new category of doubt that is now reasonable.

Ben Yelin: Totally. That's exactly what it is. We've introduced reasonable doubt. Now, yes, we live in the artificial intelligence era, but it is still in its infancy. Most recordings of human beings are actually not generated by artificial intelligence. But now, I think criminal defendants, or people who are just under suspicion of saying offensive things, can say, well, look what happened in Pikesville, it was artificial intelligence. And that's going to make prosecuting these cases, or suspending people from schools, or whatever the punishment is, harder to mete out, since you're introducing that level of reasonable doubt. But I also think this instance will lead all of us, or at least it should lead all of us, to be more cautious about believing what we hear in audio and what we see in video. When we see politicians say things that do not sound like something a politician would say, I think we have to think back to this and remember that these tools are not only available, but they're very accessible. They're either free or relatively cheap, and it's just going to become part of our national discourse. Because this is the first kind of high-profile instance that doesn't have to do with elections or revenge pornography, I think, no matter what, this is going to be a groundbreaking case.

Dave Bittner: I think it's also worth noting that, prior to all this happening, my understanding is that this principal, Mr. Eiswert, was a respected member of the community, you know, as the principal of a prominent high school in a prominent community. And then this recording came out, and it's fair to say it really upended that community. It turned things upside down, not only for his world, but for that microcosm that was the Pikesville High School community. It was a bombshell.

Ben Yelin: It absolutely upended the community. I mentioned the fact that he was facing threats, that he had to have 24/7 police protection around his house, that he went several months on leave. He has not worked at the school since this recording, which we now know was created through artificial intelligence, was released on social media. And your only hope is that he can recover some semblance of a normal life. But he is really the victim here. And we'll see, you know, I'm sure on a platform and at a time of his choosing, what he has to say in response to all of this as somebody who was the victim of this really vicious- I don't even know what to call it. It's not a prank, it's --

Dave Bittner: No, it's an attack.

Ben Yelin: It's an attack. Yeah, it's absolutely an attack. It's a form of defaming somebody. So, I certainly think there could be a civil lawsuit here under defamation, which would also be groundbreaking because it would be a civil case related to the use of artificial intelligence. There haven't been too many of these types of civil cases, at least around defamation, libel, et cetera. So, certainly, you could see something like that. There's not really a criminal statute that covers defaming somebody, so I think it's the criminal code that you have to adjust to account for these types of scenarios. And I know that the Maryland State legislature is going to start working on these issues, and they'll hope to have a bill drafted for the next session that would prevent something like this from happening in the future.

Dave Bittner: Yeah. Well, you'd think that this having happened will give them more incentive and more solid ground with which to move forward on something like that. It's no longer theoretical.

Ben Yelin: Yeah. It's not abstract and I think that's very important. I think it seems very pie in the sky until it isn't. We've seen that in basically everything else we've talked about, like, oh, cyber attacks on critical infrastructure, that sounds boring. And then, ah, Colonial Pipeline happens and we're sitting in gas lines. So, I think, once something is made real, it increases the political pressure on our policymakers to do something about it. And I think now they have the ammunition of this case. They might even name a piece of legislation after the principal who became the victim here. That's just me guessing.

Dave Bittner: That's a dubious distinction, right?

Ben Yelin: Yeah. Now, if they can create an acronym with his last name, now we're talking.

Dave Bittner: Right. Right.

Ben Yelin: That would take some extra work.

Dave Bittner: Yeah. Yeah. So, I have a crazy thought that crossed my mind, and I will acknowledge up front that it is silly before I say it, okay, but stick with me here. So, Maryland is a two-party consent state, right, which means that, if I'm going to record audio of a conversation between the two of us, I must get your consent before I record that audio.

Ben Yelin: That is correct. Linda Tripp learned this the hard way.

Dave Bittner: So, I cannot secretly record our conversations. So, what crossed my mind was, could you somehow invoke that by creating an audio recording of someone without their knowledge, even if it isn't really them? Right. You see where I'm going here?

Ben Yelin: Yeah. I don't think you can possibly implicate two-party consent laws here because there's just no- there isn't actually another party. Now, you can get anybody's voice through public recordings. So, I'm sure there was some Zoom session with the parents where Mr. Eiswert, the principal here, gave a great speech about --

Dave Bittner: Exactly. Yeah.

Ben Yelin: Yeah. So his voice was easily accessible. I was actually talking with my students about this and a couple of students said, you know, "This is why I don't put my voice anywhere online, I'm concerned about it being used for deepfakes." I'm like, "Oh, God, we have a podcast.

Dave Bittner: Welcome to my world. Yeah.

Ben Yelin: Anybody can access our voices at any time."

Dave Bittner: It's true.

Ben Yelin: Not to give you any ideas.

Dave Bittner: That's right.

Ben Yelin: But, yeah, the fact that he was a principal, I'm sure there are YouTube clips of him. All you have to do is input that into a free service available through many of your favorite AI providers and they can generate any verbal content from that clip through artificial intelligence.

Dave Bittner: And it's shockingly good.

Ben Yelin: Yeah. It's very good at what it does. And this is kind of the follow-up to another thing I think we addressed, where President Biden was alleged to have made calls. It wasn't actually President Biden, that's the kicker. But "President Biden" made calls to voters in New Hampshire telling them it wasn't necessary to vote in the primary. That was generated through artificial intelligence. That, I think, you could prosecute if you have a statute related to election interference. But this is just an entirely new problem, and you have to create a statute that's specific enough that it doesn't criminalize First Amendment protected activity, so things like parody videos that are trying to make some type of political statement, but broad enough that it can anticipate scenarios where AI is harmful. So, here, we kind of got lucky that it happened at a school because we have statutes on the books about interrupting school proceedings. Granted, it's only a six-month sentence, but at least it's something. If we didn't have that and we didn't have these instances of extortion here, then who knows what the state's attorney for Baltimore County would've been able to use to effectuate criminal charges? And I think the state's attorney said it better than anybody else: "I'm going to go down to Annapolis and figure out a legislative solution to this problem."

Dave Bittner: Wow.

Ben Yelin: And I hope he does.

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes.

Ben Yelin: Can I just say one thing, --

Dave Bittner: Sure.

Ben Yelin: -- by the way, before we finish that story? I said that he had a loaded gun; that is not confirmed. I don't want to get sued myself for defamation. It was a gun, I do not know if it was loaded or not.

Dave Bittner: So he's trying to go through airport security with a gun?

Ben Yelin: With a gun, yeah.

Dave Bittner: Okay. All right. Yeah, absolutely. Well, Ben, there's two stories here I wanted to get your take on. These are both stories having to do with critical infrastructure and, you know, efforts to protect it from technology and some of the threats that we see on the horizon. This first story is about US Representative Rick Crawford, a Republican from Arkansas, who has teamed up with Representative John Duarte, a Republican from California, and they have introduced a House resolution that is looking to protect water and wastewater systems against cybersecurity threats. The bill proposes to establish a water risk and resilience organization under the EPA to develop risk and resilience requirements for drinking water and wastewater systems. What do you make of this, Ben?

Ben Yelin: I think this is a really interesting piece of legislation and I think it's really promising. As you said, the bill would create this task force and resilience organization. It's under the umbrella of the EPA. The EPA already does work analyzing risk and building in resiliency. Sometimes, that has to do with resiliency to adverse weather events. This is just another threat that the EPA has to wrestle with, these cyber incidents against critical infrastructure.

Dave Bittner: Yeah. I was going to ask you that. You know, why EPA and not one of the other- why not Homeland Security, you know?

Ben Yelin: I think it's because the EPA has regulatory authority over our water systems.

Dave Bittner: Ah, I see. Okay.

Ben Yelin: So, it's within their purview to promulgate these regulations.

Dave Bittner: So they could have teeth?

Ben Yelin: Exactly. And so, you'd have this organization, and it would already be under the structure of the EPA. Once it's in the administration, it would be tasked with proposing regulations and implementing those regulations to enhance the cybersecurity resilience of our water system. I think this threat is another one that probably seems abstract to people, but the current EPA administrator, Michael Regan, and the National Security Advisor, Jake Sullivan, have mentioned in committee hearings that they're concerned about critical infrastructure, particularly water systems, as a vector for intentional or unintentional disruption, whether it's some type of terrorist attack, a cyber attack against our water, or some type of computer system failure that affects our water system.

Dave Bittner: Let me ask you this, Ben, I'll throw you a curveball: where do you get your water from where you live? Do you know?

Ben Yelin: I live in Baltimore County, and actually it is the Baltimore City Department of Public Works that is in charge of water services for Baltimore County.

Dave Bittner: Yeah. Same for me here. I live in Howard County, which is another Baltimore suburb. But, yes, I'm old enough to remember people referring to city water versus well water. Right?

Ben Yelin: Yeah.

Dave Bittner: Like, when I was a little kid, we were living in the neighborhood when they ran city water through it. So --

Ben Yelin: Did it taste different?

Dave Bittner: I don't know. I was six probably. I mean, well, it got fluoride, right.

Ben Yelin: Yeah. Fluoride's good.

Dave Bittner: So, you know, we had a well and we had septic, and it was a big deal that the county was running city water through the neighborhood. I remember them digging up the front yard to pull out the septic tank and disconnecting the pumps that were down in the basement to pump the water out of the well, you know, all that kind of stuff. But my point in asking you that is, I think most people, or many people, don't really think about where their water comes from. Safe drinking water is something that most of us here in the US- and of course, there are well-publicized exceptions.

Ben Yelin: Flint, Michigan.

Dave Bittner: Yes. Most of us don't even think twice about the fact that the water that comes out of our spigots is safe and basically unlimited.

Ben Yelin: We only notice it in two circumstances. One, when it's not available to us, like if there's a water main break. Or two, I am a native Californian, and there are a lot of droughts there. And when you have droughts, there are voluntary or mandatory water restrictions. Those are unpleasant. You're instructed to limit your showers to whatever it is, under five minutes. As someone who enjoys taking long hot showers, that's a sacrifice to make. But, yeah, it's something we take for granted. And this is why it worries me. I mean, I think there are significant vulnerabilities with our water systems. This is another example where I think the United States, even relative to other countries, has some advantages, in that there are redundancies. We have a lot of distinct water systems, and, I'm actually not an expert in this, but they're probably all on different computer systems and run different software, so it would be hard to institute some type of centralized attack that brought down all of the water systems in the United States. But certainly, for each individual water system, there are a lot of vulnerabilities. And I think it is certainly a wise idea to get the federal government involved in figuring out ways to harden these systems against cyber incidents.

Dave Bittner: It's interesting because, I think many view that as being simultaneously a feature and a bug, you know, in that, yes, the water system is diffuse, it is diverse --

Ben Yelin: Decentralized. Yeah.

Dave Bittner: Decentralized. But on the flip side, that means you have a broad and diverse attack surface. And there are water systems, like we were describing for us, that handle entire communities, entire cities, and the suburbs that surround them. And my understanding is, there are water systems that handle small townships, you know, a few thousand folks' homes.

Ben Yelin: Totally. Yeah.

Dave Bittner: And so, think about the available funding for a small system like that to protect itself from a cyber attack. I think this sort of federal attention is welcome, and hopefully it will come with support and funding for the systems to protect them, because, you know, you know how much fun an unfunded mandate is for a small operator like that.

Ben Yelin: Yeah. Ask somebody who works for local government how much they love unfunded mandates, they'll tell you.

Dave Bittner: Right. Right. So, I mean, getting back to this specific bill, what is the process for this to go through? I mean, I understand, right now, it's under review.

Ben Yelin: Yeah. So, it has been given to two committees. Both committees have jurisdiction over this, the Transportation and Infrastructure Committee in the House and the Energy and Commerce Committee. I don't know if they have hearings scheduled on this bill. This is Congress and we have to remember that the vast majority of bills that get proposed go nowhere. They die in committee or they never receive a committee hearing at all. So, usually, when you're introducing a bill in Congress these days, it's to raise awareness about something. And maybe you can get some of it tucked into a major omnibus must-pass spending bill at the end of the fiscal year.

Dave Bittner: Well, I mean, like you and I have talked about, these things, I think, become a lot easier for legislators to consider when there have been real-world examples. And one of the things the reporting has highlighted on this story is, you know, there was an Iranian group that went after a Pennsylvania water facility. So, for the people who are pushing this through, it's not just theoretical. Right?

Ben Yelin: Yeah, not at all. And I think the elephant in the room here is China. There was testimony by the FBI director about the scale of Chinese offensive cyber operations. Their capabilities have been enhanced over the past several years, and we are concerned that they're going to get to the point where they could propagate a broad-scale attack on our water system. So, that's actually why I think this is appropriate for federal action: this is an international cybersecurity concern. It has to do not just with protecting infrastructure here at home, but there's a foreign policy element to it as well. So, would I bet on this bill passing prior to the end of this Congress? Not necessarily. Although, it is encouraging that this is proposed and sponsored by two Republicans. Generally, Republicans are not fans of the EPA or regulations, so you're kind of off to a good head start here.

Dave Bittner: Yeah. Before we wrap up this segment here today, I want to get your take on another story. The Department of Energy put out an initial assessment report looking at the benefits and risks of artificial intelligence when it comes to critical energy infrastructure. It's an interesting report and, like I say, it's an initial assessment, which I guess means it's a first shot at this sort of thing. What is your take on it, Ben?

Ben Yelin: So, it's a really interesting report. I think sometimes we overemphasize the fears attendant with artificial intelligence and underemphasize the benefits. And it's good that this initial assessment really covers both. So, the AI benefits that they mentioned in the report: improved operational awareness, predictive maintenance, resource exploration, and improving system efficiency and response capabilities. Those are all things that, if you operate critical infrastructure, are very promising. And we're already seeing some of these tools deployed. But then, there are some significant risks. They identified four risk categories: unintentional failure modes due to things like bias or misalignment; adversarial attacks, so that's either from some sort of foreign adversary or domestic cybercriminals; hostile applications, so model-based attacks and autonomous control concerns; and then the AI software supply chain getting compromised. If we're overly reliant on it and supply isn't able to meet demand, then we will have set up a reliance on these systems while not being able to actually use them. So, the recommendations from the report are that the Department of Energy should deepen engagement with the energy sector, taking into consideration these benefits and risks, and continue to build updated assessments throughout 2024 and beyond. They did work with energy sector stakeholders and subject matter experts for this report, and the plan is to continue working with them going forward. And also, there is broad alignment with the federal government's policies on artificial intelligence. The president issued an executive order, implemented through the Office of Management and Budget, focusing on enhancing AI safety, privacy, and equity. And I think the goals identified here align with the goals identified in that executive order.

Dave Bittner: It seems like this is one of the ways that that executive order is bearing fruit. Right?

Ben Yelin: Yeah. This is just one sector where the general themes expressed in that executive order are being realized. So, there's nothing here that has significant teeth. I mean, this report in and of itself doesn't promulgate any regulations or change policies in any way. I think it's just good for improved situational awareness for the Department of Energy and for companies that operate critical infrastructure. And I think outlining these risks hopefully is further incentive for some of these companies to get involved in this engagement process, to figure out the best ways to mitigate these risks so that we can still make the best use out of these technological tools.

Dave Bittner: One of the things that strikes me about this report is that the tone is very much one of collaboration, right. There's a lot of acknowledgement that there needs to be public-private cooperation and collaboration. It strikes me that the tone of this report is not the government coming down from on high and saying, this is the way it shall be. I mean, obviously there are cases where that has to be the way it is, but it seems to me like there's a good faith effort here from the government acknowledging that a lot of the knowledge and a lot of the effort to make something like this workable is going to come from industry.

Ben Yelin: Yeah. I mean, I think they have to rely on industry. The government itself operates a substantial amount of the critical infrastructure in this country, but the private sector controls the majority of it. So, you can't effectuate change without the cooperation of the private sector and you don't want the private sector coming in as they have when regulations have been proposed at the state and federal level saying this is premature, this is going to increase costs on consumers, you know, this is the proverbial heavy hand of the government stifling innovation. I think an effort at collaboration and public-private partnership, the purpose of that is to stop this type of adversarial attitude from some of these companies. And so, I think it's an important first step in that regard.

Dave Bittner: All right. Well, we will have links to both of these stories in the show notes. Interesting stuff, for sure. [ Music ] Ben, I recently had the pleasure of speaking with Cameron Kerry. He is from the Brookings Institution. And we are discussing a recent report they put out. It's called "Small Yards, Big Tents: How to Build Cooperation on Critical International Standards." Here's my conversation with Cameron Kerry. [ Music ]

Cameron Kerry: The main things I think were the increased activity on the part of a number of governments, but particularly the United States, the European Union, and China, to focus on technical standards development for artificial intelligence, for emerging technologies like quantum, like biotech. And as part of that, to increase the government's focus on standards development, you know, in ways that can be both helpful and harmful to the system of standards development. And in the context of work we are doing on trying to align international approaches to standards development, it was, you know, important to address how to make sure that governments do no damage, but, you know, succeed in upping the standards game and, you know, do that in ways that hopefully will align together.

Dave Bittner: Can you give us some insights on how this has traditionally been done? I mean, when we look back historically, are we talking about agreements, are we talking about treaties? You know, how does the business of standards get executed?

Cameron Kerry: Sure. Look, I think a lot of this will be familiar to listeners of CyberWire from the systems that have governed the technology of the internet for 30 years. So, organizations like the W3C, the Internet Engineering Task Force, the IEEE, ICANN, and a number of others. These are technical standards organizations, engineers, businesses, and other stakeholders working together to develop technical standards, specifications that, you know, can be used widely. It's been a successful model, both for governance in the technology space, the internet is a great example of that, and also in the marketplace, in promoting, you know, reliability, interoperability, and safety across a variety of sectors. And it has succeeded through the work of these stakeholders coming together and doing things in a way that's technology driven, industry driven, and can work across the board to, you know, promote those values.

Dave Bittner: Well, let's dig into the report itself. What are some of the key recommendations from the report?

Cameron Kerry: The key question in the approach is: is it important for governments to get involved? It is right for governments to get involved. And I think there are similar concerns across the board, US, China, European Union, and many other governments, about the importance of standards to the development of these key technologies, technologies that are important to future economic development, important geopolitically and to fostering the success of AI and critical technologies. So, it's right to do that, but, you know, also important that governments don't do this in a way that undermines, you know, what has made the system of standards development successful, the bottom-up quality of the system. And, you know, top-down requirements from governments that don't have a good understanding of the technology can do damage to the system. And I think we have seen that, for example, in the way that China has tried to, you know, game the system in areas like 5G and 6G technology, aimed more at, you know, benefiting its national champion companies, you know, than at the broader success of the technology. So, we need to preserve that. But, you know, we also need to recognize that technologies like artificial intelligence have what's being referred to now as sociotechnical components. These are not purely technological issues like, you know, the design of a mechanical part or a computer chip. They do have broader, you know, social and economic implications. And, you know, the system of standards development has some ability to deal with that. For example, you know, one of the first ethical standards frameworks for artificial intelligence came out of the IEEE, the international electrical engineering organization. And so, there is, I think, the ability to deal with these issues, but it is important that standards development organizations broaden their base, broaden the voices that are included.
So, many of the recommendations focus on how to accomplish that. The reality is that, you know, if we're going to preserve, you know, a bottom-up, science and technology-driven system, it's going to have to fall on the standards development organizations to up their game in terms of the outreach that they do when it comes to developing standards in areas like artificial intelligence. That's going to mean reaching out to civil society and to, you know, other organizations, to small business, to help channel inputs from those organizations. It's going to mean doing more publication of standards that, you know, have broad social significance. I use, as an example of that, the recently issued ISO/IEC standard on artificial intelligence risk management. It will also be important in the course of developing standards like that for standards organizations to, you know, do more outreach and more publication, provide more information about the process of standards development, where they are, what's under consideration. A lot of that can be done, you know, without damaging the business model that standards development organizations have, where, you know, they rely on licensing, on membership for the revenue that supports the standards development. I also believe that governments have a vital supporting role, and much of that will need to come through funding. In the United States, in particular, the National Institute of Standards and Technology does a terrific job when it comes to standards development, both in stakeholder outreach and all the workshops, educating people about standards without actually being the standards development organization itself. But, you know, NIST is acquiring enormous additional responsibilities when it comes to artificial intelligence and other technologies. We need to increase the funding for NIST. We need to increase funding to be available to support small business participation, civil society participation in standards development.
And we need to broaden the international cooperation in working together, as the US and the EU have done in their US-EU Trade and Technology Council discussions across a broad range of technology issues. That has included, in AI, sitting down, identifying standards and priorities, and a scientific collaboration that could be done for so-called pre-standardization work. And that kind of work needs to broaden to include other like-minded governments.

Dave Bittner: You know, you mentioned kind of some of the social aspects of this and it makes me think about how you could expect there to be a natural tension between adversaries. You know, the United States and China, we could see there being differences there. But there are cultural differences in, for example, the way the United States and the European Union approach privacy. And I'm curious, you know, how much of an effort like this is technical and how much of this is diplomacy?

Cameron Kerry: Well, it's a piece of both. But look, I think there are certainly differences with China. I talked about some of the concerns about China's approach in some standards development. But, you know, China also brings some things to standards development. There are a lot of capable scientists and engineers. And certainly, the interviews that I have done with participants in the standards development process have found that, you know, while there are exceptions, China's approach to international standards has not succeeded, either in the sort of relatively democratic, participatory process of standards development organizations, in getting its standards adopted, or in the marketplace. And when its standards are adopted by an organization, they are often not adopted in the marketplace, which is ultimately the measure of success of any given standard.

Dave Bittner: I see. So, where do you suppose we're headed here? I mean, particularly, as you mentioned, you know, the importance of emerging technologies like artificial intelligence, are you optimistic that we're headed in the right direction?

Cameron Kerry: Well, I think there's a lot still to be seen. I'm particularly concerned that, in some respects, the European Union, in its model for standards development, is following a little bit the China model. The Artificial Intelligence Act substantially adopted a proposal by the European Commission to give the commission authority to issue standards requests to European standards organizations to develop standards. So, you know, there are European standards bodies that develop standards at the European Union level, and the commission can request so-called European harmonized standards. It's had that authority for a while, but this would enlarge that authority for AI by also permitting the European Union, if it doesn't like the standards that are adopted by the European standards development organizations, to adopt what are called common specifications, essentially, standards by commission fiat. And, you know, I think we'll see how the standards organizations in Europe do and how the commission does, whether they adopt and work with international standards organizations or they go ahead and adopt their own European standards. Now, they have said, "Look, the Artificial Intelligence Act also protects rights." That's absolutely true, but that's not unique to Europe. You know, President Biden's Artificial Intelligence Executive Order and, very recently, the Office of Management and Budget guidelines applying that to federal agencies talk about uses of artificial intelligence that affect safety and rights. That's very parallel to what's in the EU Artificial Intelligence Act. So, I think there's common ground there. And as I talked about earlier, you know, the standards development process is capable of incorporating a look at things like rights and values, particularly if the base of participation is broader. [ Music ]

Dave Bittner: Ben, what do you think?

Ben Yelin: It was a really interesting interview. I was pleased to see the report. They issued some concrete recommendations that I know you got into with Mr. Kerry. I'll also note that Cameron Kerry is the younger brother of 2004 Democratic presidential candidate, former Secretary of State, and current Climate Envoy John Kerry. You can hear it in his voice. I think for those of us who've been following politics for a long time, that John Kerry Massachusetts accent is indelible.

Dave Bittner: Right. Right. Both of those gentlemen did not fall far from the same genealogical tree, I guess, right?

Ben Yelin: Yeah. They've mastered that Massachusetts accent, for sure.

Dave Bittner: Yeah. Yeah. I thought it was a really interesting conversation. And I really appreciate Mr. Kerry for taking the time for us. Once again, that was Cameron Kerry from the Brookings Institute. And the report is titled, "Small Yards, Big Tents: How to Build Cooperation on Critical International Standards." We will have a link to that in our show notes. [ Music ] That is our show. We want to thank all of you for listening. A quick reminder that N2K Strategic Workforce Intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our executive producer is Jennifer Eiben. This show is mixed by Tre Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening. [ Music ]