Caveat 5.23.24
Ep 219 | 5.23.24

A mission to fortify software supply chains.

Transcript

Anil John: We wanted to make sure that we helped catalyze a set of, you know, products and capabilities that these companies would have and make available to the marketplace that, you know, provided visibility into the software supply chain and connected that visibility to, you know, potential vulnerabilities that exist as well.

Dave Bittner: Hello, everyone, and welcome to "Caveat", N2K CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner. And joining me, as always, is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hey there, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: On today's show, Ben has the story of the FBI charging a Wisconsin man with the distribution of AI CSAM. I've got the story of Scarlett Johansson at odds with OpenAI. And later in the show, Melissa Oh and Anil John from the Department of Homeland Security's Science and Technology Directorate talk about how they're working to advance software supply chain resiliency. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. [ Music ] All right, Ben, we've got some good stuff to cover here today. You want to start things off for us?

Ben Yelin: Sure. So, I have a really interesting novel case that came down this week. And the story I'm using is from 404 Media written by Samantha Cole. That's the outlet that our friend of the pod, Joseph Cox, is at now. I guess our one-way friend of the pod.

Dave Bittner: He doesn't know it, but.

Ben Yelin: Yeah, he doesn't know he's a friend of the pod, but he is. Great source of news on basically any of the topics that we cover on this podcast.

Dave Bittner: Yeah. 404 Media is really killing it over there. Like, you know, just sort of an all-star group, an Avengers team of journalists --

Ben Yelin: It's totally an Avengers team.

Dave Bittner: -- they assembled, so.

Ben Yelin: And they've teamed up with another outlet called Court Watch. And together, they unearth kind of obscure legal filings, which, whew, trying to dig up obscure legal filings is some of my favorite sleuthing in the world. So, I admire them for doing that. But this is about a Wisconsin man, his name is Steven Anderegg, and he was indicted this week by the FBI. The allegation states that he used Stable Diffusion, which is a text-to-image generative AI model, to create realistic images of prepubescent minors. So he basically created child pornographic images, or CSAM. And the charge claims that he distributed these images via interstate commerce, which is why it's a federal crime, and that he shared the images with a real 15-year-old boy. Not only did he share the images with him, but he described his process for creating the images and sent him several of these images through Instagram direct messages. It was Instagram that alerted the National Center for Missing and Exploited Children, NCMEC, that CSAM was being distributed. They traced it back to Mr. Anderegg, and he has been arrested and charged. So, I don't know if you've used this sort of text-to-image generative AI. I've never used it, Stable-

Dave Bittner: I have, yes.

Ben Yelin: -Diffusion. I mean, there are brilliant, fun, enjoyable, fruitful ways to use it, but obviously, it has this danger that you can create images that violate the law, and that's exactly what happened here.

Dave Bittner: I mean, obviously, the folks who run these systems try to put guardrails on them, so I'm guessing that- I would love to see the prompts that this person used to get around those guardrails because I suspect he was pretty creative.

Ben Yelin: Me too. I actually read through the indictment and they did not have the prompts, which is kind of what I was looking for. It basically said that he was so explicit in his prompts and so specific about what he was looking for that it was very obvious which images he was trying to generate. But I don't know how it got around their guardrails, and I'm sure there's some head scratching going on at Stable Diffusion's headquarters to figure out how to prevent this type of thing from happening. So, this is a really interesting case because the way the FBI is charging this is as if images are being circulated of actual children. Now, there's the crime of sharing these images, which are prurient and illegal, with the 15-year-old, I think we can certainly all understand that, but transmitting these images over the internet is a crime in the sense that the FBI is considering this to be child pornography. And this would be the first case to test that theory. We've had a lot of cases that are sort of quasi artificial intelligence, like deepfakes, for example, and other cases where you have adults, maybe young adults, who are portraying children. And courts have kind of gone in multiple directions on those issues, generally holding that you can still create a market for child pornography by having things like deepfakes and adults posing as children circulating on the internet, and therefore that type of activity should be illegal. And I would guess that that would be an appellate court's justification in this case if they decide that these charges should stand. I think they would say that, by generating these images, even though they don't depict real actual children, you are creating a market for child pornography such that you trigger these CSAM laws. And I think that'll be a really interesting and novel argument. I'm curious to see how it does in court.

Dave Bittner: So, let's go down this horrible, you know, despicable path together. We've talked about this before here. My understanding has always been that the core of the laws against child pornography is there to protect children from abuse. That is, it is the abuse itself that we're trying to prevent, and so, by trying to eliminate the market for this sort of thing, that will slow down and hopefully eliminate the people who are doing these horrible things to children. So, when you create an image like this and there are no children involved, is that different? And do the same laws apply?

Ben Yelin: I think from the FBI's perspective, and the deputy attorney general, Lisa Monaco, was quoted on this, they don't see the distinction. You know, you can look at it in a meta sense and say that, when you're looking at CSAM images, the vast majority of people have never seen that individual person, so it's less about that individual person and more about the image itself. I guess you could surmise that a good portion of people who have received images created through generative AI might not know that they're looking at images generated through AI. And I don't really know what legal significance that has. So, to pull up this quote by the principal deputy assistant attorney general: "Today's announcement sends a clear message: using AI to produce sexually explicit depictions of children is illegal, and the Justice Department will not hesitate to hold accountable those who possess, produce, or distribute AI-generated child sexual abuse material." And Deputy Attorney General Monaco said, "Technology may change, but our commitment to protecting children will not. The Justice Department will pursue those who produce and distribute child sexual abuse material no matter how that material was created." Put simply, CSAM generated by AI is still CSAM. That's the theory.

Dave Bittner: I hate the fact that I've- I don't want to sound like I'm coming down on the side of the horrible people here, right, but that's an-

Ben Yelin: Someone's got to do it.

Dave Bittner: -untested theory. It's an untested-

Ben Yelin: That's a lawyer's job, Dave.

Dave Bittner: Right. Well, but so that's an untested theory so far.

Ben Yelin: So far, it is an untested theory.

Dave Bittner: So let me ask you this. Suppose somebody writes a short story about their sexual relationship with an underage child. Is that illegal?

Ben Yelin: I don't know whether it would be under this theory. What if the, so-

Dave Bittner: Does a short story fall under First Amendment protection?

Ben Yelin: I think it would, because I don't think it qualifies as pornography; it would have, like, literary value. And I don't think it carries the same weight as images or a video. So, yes, I think a story would be different in that context. I think it would probably be legal. I don't see how you could criminalize that under the First Amendment, even though, of course, it would be morally wrong to do so.

Dave Bittner: Sure. So, if we play out, you know, this absurd scenario, and we've got a spectrum with the written word on one end and totally photorealistic AI-generated imagery on the other end, and we travel between those two things, suppose I wrote- I don't want to say that. Suppose someone-

Ben Yelin: Someone. Just to be clear, it's not you.

Dave Bittner: No. Well, it gives me the chills to think about it. Suppose someone wrote a story and they illustrated it with stick figures. You know, I'm trying to like- at what point-

Ben Yelin: That's a really interesting question.

Dave Bittner: At what point does it become the imagery that is problematic?

Ben Yelin: See, I think stick figures portraying a story doesn't have the same effect of creating a market for images of actual children because it's so fundamentally distinct. Generative AI is good enough that the images are realistic, and it does create the market. It furthers an interest. It will get more people interested in procuring child pornography, which, from a policy perspective, is very, very bad.

Dave Bittner: And therefore, the abuse of children because there is the demand for this.

Ben Yelin: Because there is the demand. The theory is that this creates additional demand because the images are so realistic, in a way that I think stick figures would not. But, you know, I love going down the spectrum. What about something like anime?

Dave Bittner: Right. Right.

Ben Yelin: Because that's realistic-ish. It's not stick figures, but it's very clear that the images are not good depictions of actual human beings. So, I guess it's up to our court system and the Justice Department itself to draw that line.

Dave Bittner: I mean, it's the old- who was the Supreme Court justice who talked about-

Ben Yelin: I know it when I see it.

Dave Bittner: Yes. It comes down to that, right?

Ben Yelin: I want to say it was Justice Potter Stewart. Someone's going to realize I was wrong on that and send us an angry letter, so please do.

Dave Bittner: You're going to lose your street cred on, you know, your ability to quote things.

Ben Yelin: Robert Carolina's going to come on and be like, "I don't trust you guys anymore. You got that quote wrong."

Dave Bittner: Exactly. He's making a list.

Ben Yelin: Yep. He's the Santa Claus. He is making a list and checking it twice.

Dave Bittner: That's right. That's right. It's fascinating. I mean, how many times, you know, it is these things at the edges, these terrible things that we use to define, you know, what the norms are going to be and what the legal applications are going to be?

Ben Yelin: Yeah. I mean, that's really how our legal system works: it's all based on judicial precedent, which is generally really helpful because most cases reflect facts from previous cases. Then you've got a novel case like this and you have to kind of create new precedent. It's not like there's some written code out there that definitively resolves this issue the way it would in, say, a civil law system like you have in most of continental Europe. You're reliant on judicial precedent, and this is precedent that, to my knowledge, just has not been created yet. So, there are a lot of things that judges can look at. Certainly, the Justice Department will be part of these proceedings during the prosecution and will make its case as to why this qualifies as CSAM for the purpose of our anti-CSAM statutes. I'm sure the defense will be hiring some of the best lawyers in the country to try and argue otherwise. So, I'm very interested in seeing what happens with this case. This individual is still in custody. I think he's working out a bond agreement so he'll be released, and then I assume this case will come to trial. So, we have a long way to go. I guess it's his detention hearing that's scheduled for later this week. So, unfortunately, we're not going to get a resolution to this case for a long time, but I'm very curious, once we see briefs, how this case develops.

Dave Bittner: What about Stable Diffusion, the company who generated this image? It existed on their servers, right?

Ben Yelin: Yeah.

Dave Bittner: What's their argument going to be that they're not liable for this?

Ben Yelin: I think they have a pretty good Section 230 immunity argument.

Dave Bittner: Would it be Section 230?

Ben Yelin: Yeah. Because they are just the platform here. It was the individual who was writing the prompts that led to those images. And we want to give Stable Diffusion the ability to curate its content as it sees fit, and that's why, from a policy perspective, we decided to provide them immunity from any decisions they make regarding material to restrict. I think any lawsuit against Stable Diffusion would probably be barred by Section 230.

Dave Bittner: Interesting. I wonder if it'd be in their best interest- you know, we've heard about these systems like, you mentioned the National Center for Missing and Exploited Children, you know, they have these automated systems that can look at images, although, I suppose they're using like -- so they're using hashes of existing images. And in this case, it's a completely original creation, so it would be hard to detect in order to report, you know, someone. And also, I mean, what if you accidentally stumbled onto something? You know, you are doing a totally legitimate image generation and the AI hallucinates and- you know, your request for, you know, beautiful people bathing on the beach at the local, you know, nude beach who are all over 21 and it just hallucinates and puts a bunch of kids in there.

Ben Yelin: It's a scary thought. I think our law already accounts for that because we criminalize distribution, so presumably somebody would not be distributing it, and we criminalize possession. But generally, you have to take some action to establish possession like saving it on your hard drive or downloading the image.

Dave Bittner: I see. So you're going for delete, delete, delete, delete.

Ben Yelin: Delete, delete, delete. I mean, when in doubt, report it to the authorities, I would say. At least, the authorities within these existing- like you would flag it for the AI company or for Instagram or for whomever so that they can go through their process and report it to law enforcement.
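A quick aside on the detection systems Dave mentions: matching pipelines like the ones platforms run against NCMEC's databases compare a fingerprint, or hash, of each uploaded image against fingerprints of previously identified material. Here is a minimal sketch of that idea in Go; the file name and hash value are hypothetical, and production systems use perceptual hashes (e.g., PhotoDNA) that survive resizing and re-encoding, rather than the exact cryptographic hash used here to keep the sketch self-contained.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"os"
)

// knownHashes stands in for a database of fingerprints of previously
// identified material. The value below is a made-up placeholder.
var knownHashes = map[string]bool{
	"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08": true,
}

// fingerprint computes an exact SHA-256 hash of a file's bytes. Real
// matching systems use perceptual hashes so that trivially altered
// copies still match; SHA-256 is a simplification for illustration.
func fingerprint(path string) (string, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(data)
	return hex.EncodeToString(sum[:]), nil
}

func main() {
	h, err := fingerprint("upload.jpg") // hypothetical uploaded file
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if knownHashes[h] {
		fmt.Println("match: route for human review and reporting")
	} else {
		// A wholly novel AI-generated image always lands here: no prior
		// fingerprint exists, so hash matching alone cannot flag it.
		fmt.Println("no match in known-hash database")
	}
}
```

The else branch is the gap under discussion: because a freshly generated image has no prior fingerprint, hash matching cannot catch it, and detecting novel material falls back on classifiers and user reports.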

Dave Bittner: Yeah. All right. Well, I have to keep an eye on this one. I mean, it's just disturbing all around and such an interesting thing to see how it's going to play out.

Ben Yelin: Yeah. It is disturbing. Like, if you look at the fact pattern of this case, it's scary and disturbing, but it's going to create a really interesting legal precedent as it develops. And I think that's why it's worth us looking into it and following this case as it continues.

Dave Bittner: All right. Well, we will have a link to that story in the show notes. Again, that's from the folks over at 404 Media. And their work is definitely worth checking out if you haven't already.

Ben Yelin: For sure.

Dave Bittner: My story is a bit lighter. How could it not be?

Ben Yelin: Yes. Hard to get much heavier than our previous story.

Dave Bittner: Right. Right. And this has to do with the movie star Scarlett Johansson. Still doesn't return my emails. She was-

Ben Yelin: I'll ask your wife about that.

Dave Bittner: Yeah. Well, all right. I thought you were going to ask her about that like because you're good friends with Scarlett Johansson.

Ben Yelin: Now, do you mean her in the Scarlett Johansson sense or her in the 2013 movie sense, when Her, the character, was portrayed by Scarlett Johansson?

Dave Bittner: Well, we're going to get to that.

Ben Yelin: We're going to get to it, aren't we?

Dave Bittner: So, evidently, Scarlett Johansson was approached by the CEO of OpenAI, Sam Altman, to license her voice to be one of the virtual assistants with OpenAI's tools. Now, as you mentioned, Scarlett Johansson famously portrayed an artificial intelligence in the movie Her, which -- what'd you say, 2013, was it?

Ben Yelin: I think so, yeah. Starring Joaquin Phoenix.

Dave Bittner: Right. Right. So, interesting film. A little unnerving, I thought, the scenes I've seen from it. But she plays a very convincing AI. And what brings this to focus is that, OpenAI recently had a press event where they unveiled the most recent version of their AI, which was extremely conversational. People describe it as being flirty. And there's very little latency, very little delay in it responding to you.

Ben Yelin: Can I just say it was a little bizarre?

Dave Bittner: Yeah.

Ben Yelin: Did you watch it?

Dave Bittner: I did.

Ben Yelin: Did you not think it was just very weird? The way they were conversing with her and the way she was conversing with them, it was just, I guess, not surprising, but just like a little bit eye-opening. Like, it gave me the heebie-jeebies a little bit.

Dave Bittner: That's interesting because, you are not the only person who's said that to me. And I would say, of the people I've talked to about it, definitely the majority felt creeped out by it. I did not. I did not. And I understand why you would be or could be, but it did not creep me out. For some reason, I just thought, oh, isn't this interesting? This is the next phase of where this is going.

Ben Yelin: I mean, I think the distinction is how conversational it is. Like, with our Alexa devices and with Siri, it feels very computerized. We give them the prompt, they think for a second and give us like a very computerized answer. The weather today will be 59 degrees. Whereas, like this actually felt like you were having a conversation with somebody.

Dave Bittner: Yeah, exactly. So, evidently, OpenAI had reached out to Scarlett Johansson about using her voice, and she had considered it, but ultimately, she said no. But then, two days before they came out with this demo, they reached out to her again and said, "Please reconsider," and again, she said no. And then they did this demo, and lots of people who watched the demo were like, "Whoa,-

Ben Yelin: That is Scarlett Johansson.

Dave Bittner: -- that sounds just like Scarlett Johansson." And on top of that, Sam Altman, OpenAI's CEO, posted a one-word tweet on X, formerly Twitter, that was just the word "her."

Ben Yelin: Big mistake, Altman.

Dave Bittner: Right?

Ben Yelin: You should not have done that.

Dave Bittner: Right. So, Scarlett Johansson has written them a stern letter saying, "Knock it off. Please don't use my voice." OpenAI is saying, "Oh, we weren't using your voice, this was another actress. We can't tell you which actress because of privacy, but it was another actress who just"-

Ben Yelin: Let's just say it was- yeah.

Dave Bittner: It happened to sound like you, I suppose. I think some people could interpret it that way. But meanwhile, they have ended the availability of the voice that allegedly sounded like Scarlett Johansson. So, it seems to me like there's some tomfoolery going on here, Ben. And I'm curious what your take is on this.

Ben Yelin: Can we get to the boring ending of this first so we can get into the actual interesting portions of the nitty gritty?

Dave Bittner: Okay.

Ben Yelin: The boring ending is that, there isn't much legal recourse for Scarlett Johansson here.

Dave Bittner: Well, yes. That's part of what I wanted to discuss here.

Ben Yelin: Yeah. So there's an article, I think, that you're using for the segment where they talked to a bunch of legal experts and looked at various statutes, and you really don't have a developed copyright law as it relates solely to people's voices. This is really a gap in our laws because voices are very distinctive. Some voices, like mine, it doesn't really matter, because the vast majority of people who would hear my voice have no idea who I am, and it's not a defining feature, it's not something that's deeply personal to me. But someone like Scarlett Johansson is so defined by her voice. She's done voice acting. I mean, she played this prominent role in that movie. So, it is sort of bizarre that the law hasn't caught up with that. We have a patchwork of state laws. Even California, which you'd think would have a developed body of law on this considering it's the hub of the entertainment industry, doesn't really have a relevant statute. So, she's kind of out of good legal options, which is why I think she's taking this public. She released this statement on X that described, exactly as you did, what happened here and how she went through this process of nicely rejecting them a couple of times, and they went ahead with a voice that sounds eerily similar to hers. So I think it's public shaming. I mean, she is a very well-liked celebrity, so I think the public shaming kind of worked in this case; it got them to shut down the system until they can replace it with a voice that doesn't sound so much like Scarlett Johansson. I'm thinking more of, like, a John Krasinski, real casual hangout vibe, but less like the sort of deep AI that you could fall in love with in Her. Although, I'm sure a lot of people might fall in love with the John Krasinski voice too, but that's who I'm picturing as the next voice of my conversational AI buddy.

Dave Bittner: When I was reading this story, I was thinking about impressionists, you know. In the old days, I mean, Rich Little was probably one of the most famous impressionists.

Ben Yelin: Very unfunny, but a very good impressionist.

Dave Bittner: Yeah. I agree with you. It is funny how, when you look back on his act, he really wasn't that funny.

Ben Yelin: No.

Dave Bittner: He was a good impressionist.

Ben Yelin: He just was good at impersonating people.

Dave Bittner: But then like, I think today, someone like Frank Caliente-

Ben Yelin: Caliendo.

Dave Bittner: Caliendo, I'm sorry, a very well-known and amazing mimic.

Ben Yelin: A little funnier. Still, I wouldn't put him in my top 10 comedians, but, yep.

Dave Bittner: And then- but I think you also see it on Saturday Night Live regularly.

Ben Yelin: Including by Scarlett Johansson who did a great impersonation of the State of the Union response-

Dave Bittner: Now, isn't that interesting?

Ben Yelin: -of Senator Katie Britt of Alabama.

Dave Bittner: Yes, she did. So, I mean, I think, for comedians, you could say that that falls under fair use for parody and that sort of thing. But what if OpenAI just stuck to their guns and said, "No, I mean, you know, it's not you, I don't know what you're talking about. I suppose someone could think that it kind of sounds like you."

Ben Yelin: But it's like, "Yeah, a lot of voices sound the same and we based it on somebody else." I'm sure they could probably identify that person if they needed to, if they're being truthful and say, "Listen to this person, that's who we modeled it after." That's why I think there just isn't much of a legal case here. And that's why she took it public because there isn't much that she could do to compel them to take the system down.

Dave Bittner: Do you think this could be part of copyright reform, a better protection for this sort of thing?

Ben Yelin: Yeah. I could see it being wrapped up in, like, a larger policy package relating to voice impersonation. In a way, this is like a bizarre corollary to the case we talked about here in the Baltimore area, where there was the deepfake made of the principal saying racist, anti-Semitic things. That was a voice appropriation case, obviously very different; OpenAI never purported that Scarlett Johansson was hanging out with you while you talked to your new chatty AI assistant. But, I mean, I think the similarity is that we have this gap in our current laws dealing with mimicking people's voices. I think things like fair use certainly still apply, which is why parody would be protected, but something like this is different. I mean, Scarlett Johansson derives value from her voice, just as the New York Times derives monetary value from its journalism. It's an asset that they have; it's an asset that she has.

Dave Bittner: I think it's interesting when you look at some of the guardrails that some of these large language models have put on themselves. For example, if I ask it to pretend like you're Martin Short and write five insults about, you know, my good friend Ben Yelin, it will come back and say, "I'm sorry, I cannot pretend to be a celebrity or a known person." I say, "Okay, write me five jokes in the style of Martin Short that are about my good friend Ben Yelin," then it will do it.

Ben Yelin: That's the first thing I'm going to do after we're done recording. I'm interested to see how he would roast me.

Dave Bittner: That's right. That's right. But my point being that, if I said to my assistant, "I would like your voice to be in the style of Tom Hanks or James Earl Jones or Scarlett Johansson or," why wouldn't it do that?

Ben Yelin: Yeah. I mean, I think, I feel like it's-

Dave Bittner: Cookie Monster, right?

Ben Yelin: Right. I mean, I think that should be the same thing. I mean, I do think it is a form of intellectual property because these voices are distinctive and they carry their own value. So you're kind of co-opting Tom Hanks' voice or Cookie Monster's voice without paying them for a service that they're really providing. I mean, that's the purpose of copyright law. Correct?

Dave Bittner: Yeah.

Ben Yelin: So, you know, you'd get these very tough cases where you'd have to go deep into discovery to figure out, well, was it actually based on Scarlett Johansson? You'd have to look through OpenAI's email records and subpoena, you know, everything that they've said to one another in the past three years to see if their story holds up. And I have a feeling that nobody would want to go through that litigation. So, it would be good for Congress or even state legislatures, particularly like California which has the capacity to do something like this, to really lay out the rights people have in the appropriation of their voice.

Dave Bittner: Yeah. Interesting. All right. Well, we will have a link to that story in the show notes. And of course, we would love to hear from you. If there's something you'd like us to consider for the show, you can email us, it's caveat@n2k.com. [ Music ] Ben, I recently spoke with Melissa Oh and Anil John. They are both from the Department of Homeland Security's Science and Technology Directorate. And we are discussing some of the work that they and their colleagues are doing to help advance software supply chain resiliency. Here's my conversation with Melissa Oh and Anil John. [ Music ]

Melissa Oh: For context, as part of the Department of Homeland Security's Silicon Valley Innovation Program, we work with the startup community a lot and identify areas of partnership between startups and DHS. When coordinating with the Cybersecurity and Infrastructure Security Agency, CISA, within DHS, there was a significant amount of energy and recognition that the software supply chain is an area that we need to put a lot of focus on. And CISA approached us, SVIP, to find ways to improve the adoption of, and encourage more use of, the software bills of materials that they're championing. And so, as part of SVIP, we decided that that was an important area for us to work with them on. We conducted an ideation workshop to identify what some of those areas of concern were. And that's why, in partnership with CISA, we put out the call to work with startups in this area. Anil, you might actually offer some additional flavor to that, so feel free to do that.

Anil John: Absolutely. Thank you, Melissa. And I think the only piece that I would add would be that it was important that, in the solicitation, we targeted the development of capabilities that actually served multiple communities: a software developer, you know, sitting in front of their IDE, gaining visibility into the components of software that they're integrating into their build; a system administrator who has a responsibility to understand, you know, what capabilities, what components of software, what SDKs, what other pieces of software might exist within the platforms and technologies that they're managing; and obviously, at the enterprise level, having broad visibility into the software assets that exist as well. All of those things were important to both us and to CISA. And those are some of the things that ended up being reflected in the solicitation that went out.
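To make the "visibility connected to vulnerabilities" idea concrete, here is a minimal sketch of what a consumer-side SBOM tool might do: read the component list out of a CycloneDX-format SBOM and look each component up in the public OSV.dev vulnerability database. The input file name and the slimmed-down field sets are illustrative assumptions, and this is a sketch of the general pattern, not any SVIP portfolio company's actual product.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// cdxSBOM models just the slice of a CycloneDX JSON SBOM needed here.
type cdxSBOM struct {
	Components []struct {
		Name    string `json:"name"`
		Version string `json:"version"`
		Purl    string `json:"purl"` // package URL, e.g. pkg:golang/...@v1.2.3
	} `json:"components"`
}

// osvQuery is the request body for OSV.dev's /v1/query endpoint.
type osvQuery struct {
	Package struct {
		Purl string `json:"purl"`
	} `json:"package"`
}

func main() {
	raw, err := os.ReadFile("sbom.cdx.json") // hypothetical SBOM file
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var doc cdxSBOM
	if err := json.Unmarshal(raw, &doc); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// For each component, ask OSV for known vulnerabilities. The purl is
	// assumed to carry the version, as CycloneDX purls typically do.
	for _, c := range doc.Components {
		var q osvQuery
		q.Package.Purl = c.Purl
		body, _ := json.Marshal(q)
		resp, err := http.Post("https://api.osv.dev/v1/query",
			"application/json", bytes.NewReader(body))
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			continue
		}
		var result struct {
			Vulns []struct {
				ID string `json:"id"`
			} `json:"vulns"`
		}
		json.NewDecoder(resp.Body).Decode(&result)
		resp.Body.Close()
		fmt.Printf("%s@%s: %d known vulnerabilities\n",
			c.Name, c.Version, len(result.Vulns))
	}
}
```

This is essentially the loop a developer's IDE plugin, a sysadmin's scanner, or an enterprise inventory tool would run at different scales: enumerate components from the SBOM, then correlate each against a vulnerability feed.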

Dave Bittner: And so, where do we stand right now? How much progress has been made, Melissa?

Melissa Oh: So, our startups are making a lot of great progress. They finished Phase 1. As part of that Phase 1, they transitioned Protobom, which is the SBOM translation tool, to OpenSSF, the Open Source Security Foundation, and that's exciting for us. And as far as where they're at now, many of them have started their Phase 2. And the cohort of companies, as Anil mentioned, they're building out those commercial capabilities to provide those software visibility tools, with Protobom baked in and with the capability to provide those services and solutions to end users, but also to encourage and ensure that the open source aspect of this solution continues to be maintained, used, and adopted.

Anil John: And I think that's an important, you know, part of the way that we structured the solicitation itself and how the companies that we selected are working in general, because we wanted to make sure that we helped catalyze a set of, you know, products and capabilities that these companies would have and make available to the marketplace that, you know, provided visibility into the software supply chain and connected that visibility to, you know, potential vulnerabilities that exist as well. But we also wanted to make sure that, beyond just the products and the capabilities that these companies were working on, we also did something that benefited the broader software development and software security ecosystem as well. And that's where the other part of our solicitation came in: we required the companies that were working on all of these products to work together as a cohort in order to develop what Melissa noted, Protobom, a translation capability between the two major flavors and standards of SBOM. And we've been very fortunate in that Protobom, that SBOM translation SDK, has now been accepted by the OpenSSF, the Open Source Security Foundation, which is part of the Linux Foundation, as a globally available open source module, available not just to the companies from our portfolio that worked on it, but to software security companies globally as well, that allows for easy translation across SBOM formats.
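For a sense of what an SBOM "translation SDK" involves: the two major formats Anil refers to are SPDX and CycloneDX, which express the same underlying facts (components, versions, relationships) in different document shapes. The toy sketch below illustrates only the translate-through-a-neutral-model idea behind such a tool; it is not Protobom's actual API, and the handful of fields shown is a deliberate simplification.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// neutralDoc is a toy stand-in for the format-agnostic model a
// translation SDK keeps internally (a real one also carries hashes,
// licenses, the dependency graph, and much more).
type neutralDoc struct {
	Components []component
}

type component struct {
	Name, Version string
}

// fromSPDX maps an SPDX JSON document's "packages" array into the
// neutral model.
func fromSPDX(raw []byte) (neutralDoc, error) {
	var spdx struct {
		Packages []struct {
			Name        string `json:"name"`
			VersionInfo string `json:"versionInfo"`
		} `json:"packages"`
	}
	if err := json.Unmarshal(raw, &spdx); err != nil {
		return neutralDoc{}, err
	}
	var doc neutralDoc
	for _, p := range spdx.Packages {
		doc.Components = append(doc.Components, component{p.Name, p.VersionInfo})
	}
	return doc, nil
}

// toCycloneDX serializes the neutral model as a CycloneDX "components"
// array.
func toCycloneDX(doc neutralDoc) ([]byte, error) {
	type cdxComponent struct {
		Type    string `json:"type"`
		Name    string `json:"name"`
		Version string `json:"version"`
	}
	out := struct {
		BOMFormat  string         `json:"bomFormat"`
		Components []cdxComponent `json:"components"`
	}{BOMFormat: "CycloneDX"}
	for _, c := range doc.Components {
		out.Components = append(out.Components, cdxComponent{"library", c.Name, c.Version})
	}
	return json.MarshalIndent(out, "", "  ")
}

func main() {
	spdxInput := []byte(`{"packages":[{"name":"openssl","versionInfo":"3.0.13"}]}`)
	doc, err := fromSPDX(spdxInput)
	if err != nil {
		panic(err)
	}
	cdx, err := toCycloneDX(doc)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(cdx))
}
```

The payoff of routing everything through one neutral model is that supporting N formats takes N readers and N writers, rather than a bespoke converter for every ordered pair of formats.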

Dave Bittner: Melissa, as we go forward here, how do you envision this scaling? What does it look like? What place does it take in the security ecosystem going forward?

Melissa Oh: That's a great question. You know, I think this is just the first step in advancing the efforts to ensure software developers and suppliers are incorporating SBOMs into what they're providing to end customers. So, as part of that first step, you know, I think it's about energizing the use of it, the broader adoption of it. I think there's also, you know, this acknowledgement and recognition that we need to ensure open source security is a priority. And so, I think, you know, we're going to look for more ways to partner with industry on how to do that more and more. And so, hopefully, this is just a catalyst to improving our overall software supply chain transparency and visibility, and hopefully others take note of that across the government as well as the broader ecosystem.

Dave Bittner: Are we envisioning a particular size organization that this would be most suited for?

Anil John: Not really, because, I think, you know, all of the capabilities that are on the roadmap to be built and refined are going to become available as products in the market that can be consumed by, you know, a company of any size that has the ability to, you know, buy a product and use it to basically assess its software security needs. So, we're not looking at any particular size of company; it should be usable by a company of any size.

Dave Bittner: I see. How do people get involved with this? If someone's interested and wants to find out more, what's the best way to do that?

Anil John: I think one way that we would recommend: we are actually having a demo week. SVIP has our annual demo week, when we bring together all of the companies in our portfolio, including obviously the companies that have been, you know, working on this. So, on, you know, May 22, there will be an opportunity for anybody in the DC metro area to come join us for free, get a demo of those products, and talk to the companies directly. And obviously, as these companies have products that become broadly available and they successfully graduate from SVIP, we are happy to share that information as well. And if there are, you know, other government agencies and other partners who have an interest in this type of technology and what we are doing, they should feel free to reach out to us, and we'd be happy to chat with them on how best to partner and work together to move this ecosystem forward.

Dave Bittner: Melissa, this strikes me as being another good example of the importance of these kinds of public-private partnerships. You know, we have DHS, as you say, we have CISA, and we have these organizations that you're partnering with to make these products available, to see this through. Could you speak to that element of this? I mean, it seems to me like this really is the shape of things to come.

Melissa Oh: Yeah. I love working in the public-private partnership model. You know, I think it goes a long way to providing this collaborative model for bringing about good solutions, with both the private sector and the public sector working together, for sure. More to come in these ways. You know, for the Silicon Valley Innovation Program in particular, you know, our mantra is to basically energize and mobilize the startup community from the emerging tech space to work on hard problems within the government, in particular DHS. And we're finding a lot of great energy from the tech sector wanting to support the mission in many ways. And in the case of cybersecurity and software supply chain security, it's crucial to everybody, and it is definitely an area of further growth that we're looking to partner in more and more. As Anil mentioned, the best way is to be on the lookout for opportunities where we're putting out calls and solicitations; we certainly hope to encourage more participation from startups in our efforts going forward. [ Music ]

Dave Bittner: Ben, what do you think?

Ben Yelin: It's really great to talk to people who are actually involved in this fight. I mean, I think we talk about it very theoretically on this show, so a great get for you guys. They're doing great work in that office. And, obviously, I'm pro improving the resiliency of anything, so.

Dave Bittner: You know, it was a little bit of an eye-opener for me too because, I think for most of us, when we think about DHS, I don't know, I mean, I guess mostly what I think about is getting through the airport, right? That's-

Ben Yelin: It's a big department, though. I mean, they do a lot of things. And I think resilience is a word in the emergency management and homeland security industry that applies to a whole bunch of different threats. And you have to have not a threat-specific office; you have to have an all-hazards office to figure out what systems and whatnot need to be resilient to protect us against bad outcomes. So, I think it fits very well within that office.

Dave Bittner: Yeah. Well, again, our thanks to Melissa Oh and Anil John from DHS's Science and Technology Directorate for joining us. We do appreciate them taking the time. [ Music ] And that's Caveat brought to you by N2K CyberWire. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly changing world of cybersecurity. If you like our show, please share a rating and review in your podcast app. Please also fill out the survey in the show notes or send an email to caveat@n2k.com. We're privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams while making your teams smarter. Learn how at n2k.com. This episode is produced by Liz Stokes. Our executive producer is Jennifer Eiben. The show is mixed by Tre Hester. Our executive editor is Brandon Karpf. Peter Kilpe is our publisher. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening. [ Music ]