Caveat 6.15.23
Ep 176 | 6.15.23

IT companies sweep in to aid Ukraine.

Transcript

Bilyana Lilly: The whole idea that companies now can be considered combatants and can be direct targets in the conflict is a very significant point for discussion at the moment.

Dave Bittner: Hello everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: Today Ben has the story of the US government declassifying a report on consumer privacy. I've got the story of deep fakes entering US politics. And later in the show, my conversation with Dr. Bilyana Lilly, discussing her co-authored paper for CyCon: "Business @ War: The IT Companies Helping to Defend Ukraine." While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.

Dave Bittner: All right, Ben. We got a lot to cover this week. Why don't you start things off for us here.

Ben Yelin: So my story was posted really everywhere, but we're using a story from Wired: "The US Is Openly Stockpiling Dirt on All Its Citizens." [Laughter] Getting right to the point there.

Dave Bittner: Yeah.

Ben Yelin: And this refers to a report released by the Office of the Director of National Intelligence. The report was drafted in 2022, January of 2022, but it's just being released now due to requests from members of Congress, including Senator Ron Wyden.

Dave Bittner: So it's been declassified?

Ben Yelin: It has been declassified.

Dave Bittner: Okay.

Ben Yelin: This report was drafted by an advisory committee of senior officials. We don't know exactly who those senior officials are. Their names have been redacted in the report. But generally, they are former intelligence officials. And the report is about the extent to which government agencies are acquiring commercially available data that is open source and using it for law enforcement purposes. So the report, in its executive summary I think, makes two things very clear. One is that publicly available data that third parties collect on us through the course of our ordinary Internet browsing is extremely valuable for both intelligence purposes and for law enforcement purposes. The information is vast. Most people leave a very clear trace of their activities on the Internet, and it reveals a lot of valuable information about us. Of course, that's going to be something of great value to law enforcement. The other side of it is this presents major civil liberties concerns. And that's one of the things that this report acknowledges. Under the Fourth Amendment to the United States Constitution, we have the right to be free from unreasonable searches and seizures. And so we have a procedure that governs the government collecting data on our stuff, on our property, and it requires a warrant issued based on probable cause by a neutral magistrate; that's very well established. The problem here is that when we're talking about commercially available information, all of those procedural steps are simply stomped over. We don't have any existing standards to deal with simply purchasing publicly available information, even though that information can be just as personal and revealing as information obtained via warrant or via subpoena. So that's really the main concern here. They issue a series of recommendations about how the intelligence community and law enforcement agencies should handle the acquisition and treatment of this information. For one, they recommend that the intelligence community develop a multi-layered process to catalog, to the extent feasible, the information that IC elements acquire. They admit this is going to be a complex undertaking. I think one of the most eye-opening aspects of this entire report was that they really don't have sufficient data on the extent to which this information is being used by agencies, or which agencies have used it and are using it. I think in their words, "The intelligence community can't understand and improve how it deals with this information unless it knows exactly what agencies are doing with it." The second recommendation is that the intelligence community should develop standards and procedures for this collection, governing and requiring regular reevaluation of acquisition and use decisions. So just a little internal oversight on it. And the third recommendation they include is that the intelligence community should develop more precise guidelines to identify and protect sensitive information that implicates privacy and civil liberties concerns. And they have a bunch of suggestions in the report that get to the specificity of these more broad recommendations. So their conclusion is that this data is very powerful for intelligence and for law enforcement, but it's increasingly sensitive for individual privacy and civil liberties, even though the data is anonymized. As we know, you can deanonymize it pretty easily if you put the puzzle pieces together.

Dave Bittner: Right.

Ben Yelin: And therefore, basically, in the recommendation of the senior advisors, the intelligence community needs to act. They need to define policies to govern acquisition and treatment. And this is certainly something that not only the intelligence community is going to be looking at, but members of Congress, who've already issued a series of proposals on this, I think, are going to have to take a second look, given the severity of what we see in this report.

Dave Bittner: And then, of course, this was prompted by a request from Senator Ron Wyden, who sort of takes the lead on a lot of this consumer privacy stuff.

Ben Yelin: Yeah. So Senator Ron Wyden is at the center of every single story that relates to data privacy, vis-à-vis Congress and the Federal Government.

Dave Bittner: Yeah.

Ben Yelin: So he is- he was the person who demanded that this report be declassified. And I think he will be at the forefront of any reform efforts, including legislation that's already been introduced in Congress to set some type of standard around the collection of this information. I think the obvious solution, if there were the political will, is to require the equivalent of a warrant for any information that, in the absence of this commercially available data, would require a warrant. So anything that is somebody's persons, papers, effects, physical property, et cetera. Any data to which a person has a reasonable expectation of privacy. The federal government could step in and say, for Fourth Amendment purposes, there should be no distinction between purchased data and data that we can obtain via warrant and via subpoena. Is that actually going to happen? I certainly have my doubts for a number of reasons. One, as this report admits, the intelligence gathered through this data is extremely valuable. And I think the law enforcement community and the intelligence community don't want to give up these capabilities so easily. That would be very cumbersome for them. It would make their investigations more difficult. That certainly filters down to members of Congress, who will hear from the intelligence community saying, hey, we stopped XYZ incidents because of commercially available data that we received from app or Internet browsing history. And then, you know, the general point I always make is that inaction is the norm when it comes to Congress.

Dave Bittner: Yeah.

Ben Yelin: Last session, I think many of us were cautiously optimistic about comprehensive data privacy legislation, and for a variety of reasons, including some that were very parochial, it didn't happen. Speaker Pelosi, at the time, didn't want the federal government to preempt California's robust data privacy law. And she was the Speaker of the House at the time and had a lot of influence. That's another thing that just tends to happen in Congress. It's much easier for something not to happen than it is for something to happen.

Dave Bittner: I can't help thinking that this is- we're dealing with what to do with the horse once it's left the barn, rather than preventing the horse from leaving the barn. Right? Like, why don't we come at the source of this? As you say, if we had some sort of federal data privacy law that restricted this information from being gathered and aggregated and sold to begin with, then we don't have this problem. But until we do that, it seems to me, like, the government could always find a way to separate themselves from those gathering the information. In other words, let's say they're buying it from a data broker now. Maybe they could buy it from a contractor who bought it from the data broker who bought, you know --

Ben Yelin: Right.

Dave Bittner: You see where I'm going with this?

Ben Yelin: They'd separate themselves as much as they can from the actual data collection.

Dave Bittner: Right.

Ben Yelin: To become so diffuse that it's, like, well, you know, who's to say that we had any active role in getting our hands on this information. It was just kind of --

Dave Bittner: Yeah.

Ben Yelin: -- yeah.

Dave Bittner: It was posted in the public square where, you know [laughter]

Ben Yelin: Which it is. I mean, it is open --

Dave Bittner: Right.

Ben Yelin: -- source data that we're all in one way or another willfully giving all these companies and third party data brokers.

Dave Bittner: Yeah. I'd put air quotes around willfully.

Ben Yelin: Oh, totally.

Dave Bittner: You know, I, you know, we're not- I'm not saying, hey, track me. You know, none of us are, and yet that's what's happening.

Ben Yelin: Nobody is aware of the extent, or very few people are aware. I'm sure our listeners are among the few in the crowd who are aware. [Laughter] But most people have no idea of the extent of the information that's being collected on us just in the course of ordinary activities, the apps we download. You know, should I allow Panera to view my location so it's easier for me to locate a sandwich shop? Sure, absolutely.

Dave Bittner: Yeah.

Ben Yelin: That's a rich collection of data that just with a click of a button you are providing. So there's the fact that people don't really know what data is being collected. And then the fact that even the intelligence community itself isn't really aware. They don't have any institutional knowledge of the extent of the collection that's happening inside the government. So we're just flying completely blind here, both as consumers and as government agents. That's why I think the first step really is some proper oversight here, where there's mandatory reporting of key statistics in terms of which agencies are using this data, something measuring the volume of the data being collected, that's certainly a valuable first step. But as you say, I mean, one of the problems with all of these issues is by the time anyone is actually paying attention beyond Ron Wyden.

Dave Bittner: Right. [Laughter] Right.

Ben Yelin: So, you know, other members of Congress, the general public, the Wall Street Journal, Wired, by the time they're paying attention, the, well, not to mix metaphors here, but the cat is already out of the bag. We're already doing this at a very large scale.

Dave Bittner: Yeah.

Ben Yelin: And this makes me think of what's happening in the world of AI, where we're having these kind of high level conversations about the ethics of it, and how far is this really going to go? All of those really important conversations, but in the meantime, the technology is proliferating. More and more people are using it, and we might blow our chance to actually get a handle on it, and prevent the absolute worst outcomes. So it's just- it's certainly something that worries me that our legal system, and even our media is just behind the reality of what's already happening.

Dave Bittner: Is it plausible that members of Congress may be being lobbied by federal law enforcement agencies to say, hey, look, you know, we don't want to lose this capability?

Ben Yelin: I think they absolutely could be lobbied by some of these intelligence agencies. I also think it's because this type of data collection is profitable for the companies collecting it and for third-party brokers. I mean, you get a sense that there's a lot of money in this. And when you have an advantage for the companies collecting the data, for the brokers who buy it and sell it, and for the government who gets valuable intelligence, like, how do we break that chain where everybody's incentive is more data collection? The only way to do it is to raise a giant fuss. And that's what members of Congress are trying to do here. But it's just so hard to know how to raise a sufficient fuss so that something actually happens. That's what's so frustrating.

Dave Bittner: Right.

Ben Yelin: Because, you know, we see so many stories like this. This is one of the more high profile ones, because we're getting an actual report from the Director of National Intelligence, but it just kind of falls by the wayside in the midst of other news stories. It's off the radar of most members of Congress; it's just not something that they're spending a lot of time thinking about. So it's just really hard to effectuate change. And I think that's something that's frustrating for people who are in this world and people who care about data privacy.

Dave Bittner: It seems to me, like, we need, you know, an agent of chaos, a, you know, a John Oliver to come in and buy up a bunch of this information about members of Congress. You know, post a map showing their every movement and [laughter]

Ben Yelin: He, like, literally did that though. I think he had a segment --

Dave Bittner: He threatened it. Yeah. He threatened it, but somehow, like, he didn't follow through.

Ben Yelin: He didn't reveal the names.

Dave Bittner: Right.

Ben Yelin: Yeah.

Dave Bittner: Right. Right. And so I think he was, I mean, to me, that was a shot across the bow. And who knows, maybe he got brushed back, you know, either by HBO's legal team or members of- who knows, you know. But it seems to me, like, it's going to take something like that, where Congress gets affected personally in some sort of very embarrassing, visceral kind of way, in a bipartisan nature.

Ben Yelin: Right.

Dave Bittner: Right. Where they're all exposed, then that will get their attention. But until then, I think it just hasn't risen to a level that gets them to really focus.

Ben Yelin: I mean, you've used the example in the past of how Congress quickly passed a law, you know, regulating the extent to which the government could search people's video rental records?

Dave Bittner: Right. And libraries.

Ben Yelin: And libraries.

Dave Bittner: Yeah, checking out library books.

Ben Yelin: There are a couple of issues. One I see is that we're just so numbed by scandal that it doesn't pack the same punch that it used to, just given everything we've gone through in recent political history.

Dave Bittner: Right.

Ben Yelin: And just our media landscape is different. There isn't, like, a Walter Cronkite, that's going to just read the news that everybody's going to trust on this. Everybody gets their news from different sources. It's very ideologically biased. And even if there was some bipartisan expose, so much of its impact would depend on how it's being covered. And if it's being covered as, like, a political hit job by John Oliver --

Dave Bittner: Right.

Ben Yelin: -- then maybe it has the reverse effect. And it motivates lawmakers to hunker down even more and be resistant to giving in to this mob.

Dave Bittner: Right. Right.

Ben Yelin: So I don't mean- I don't want to be a pessimist, but it is just- it's hard to see a realistic path to getting out of this doom loop where we keep finding out about these practices that invade our privacy and civil liberties, and we keep seeing inaction from Congress to actually do something about it.

Dave Bittner: Yeah. Being pessimistic about Congress is an evidence based approach, right? [Laughter]

Ben Yelin: It sure is. Yes, it sure is.

Dave Bittner: All right. Well, it's an interesting report, for sure. And as you said, it's been covered in a lot of different places. We'll have a link to the coverage from Wired here in our show notes. But this is getting a lot of attention all over the media. So, yeah, definitely worth reading and checking out. My story this week comes from the Washington Post. This is an article from Aaron Blake. And it's titled "DeSantis Ushers in Our Fake-Images-in-Politics Nightmare." We have been talking about this, we've been warning about this. I think we all knew this day would come.

Ben Yelin: It's happening.

Dave Bittner: I don't know. I mean, I guess if you were going to place your bets, it would have been realistic to say that this round of- this election season would be the one where we would see AI generated images used in political ads. And --

Ben Yelin: Yeah. This is going to be that election. You know, the article here talks about an ad, I believe, from the super PAC of Ron DeSantis. But that's certainly not the first time that AI-generated images have been used in an ad.

Dave Bittner: Right.

Ben Yelin: So I don't think this is something that's specific to any candidate or any party. The weapon is now out there, to the extent that it's a weapon used against political opponents.

Dave Bittner: Yeah.

Ben Yelin: And talk about a cat that we're not going to be able to put back in the bag.

Dave Bittner: Right. So the details here are that, according to this report, the DeSantis campaign spun up an ad, a TV ad, and it has images of Donald Trump and Anthony Fauci hugging. And, of course, as we know, Governor DeSantis from Florida is running against former President Trump for the Republican nomination for president. And this is a very heated competition. These two are going at each other. There's lots of name-calling. You know, the gloves are off.

Ben Yelin: Sure.

Dave Bittner: [Laughter] So the DeSantis campaign would think this image would be to their advantage, because among folks in the Republican Party, folks on the right, of course, Dr. Fauci is not popular.

Ben Yelin: Nope.

Dave Bittner: To say the least. And so to have an image of former President Trump, you know, literally embracing Dr. Fauci, which leads to the figurative notion of an embrace, could be damaging to former President Trump. And this is being done so blatantly in a high-level campaign here. You know, they weren't trying this out at the lower levels, right? They went straight to a high-level campaign, and they're doing this without shame, or [laughter] they're just going for it. Here we are.

Ben Yelin: Yeah, I mean, I think there are also some distinctions with this particular ad, versus other instances where we've seen AI generated images in political advertising. So here the images look, I mean, I've seen them, they look more realistic. It's certainly plausible. If you look really closely, you can tell that it's AI generated.

Dave Bittner: Yeah.

Ben Yelin: Cause Fauci kind of looks like a cartoon version of himself. But if you're not looking closely, it not only looks realistic, but, like, in people's minds, Trump and Fauci did have hundreds of press conferences with one another.

Dave Bittner: Right.

Ben Yelin: They were on the same team. So it's very plausible that at least to an average voter, that they would have hugged at some point. So it's not something that, you know, most people would just assume is fake. Like, one of the things they mentioned in this article, Ron DeSantis, riding an actual rhinoceros.

Dave Bittner: [Laughter] Right. Right.

Ben Yelin: Donald Trump put out an ad to that effect, trying to prove that DeSantis is a RINO, a Republican in name only.

Dave Bittner: Right.

Ben Yelin: That's obviously false. Or an ad that Trump put out, which had AI-generated voices of Ron DeSantis, Elon Musk and others, including a mock devil, Adolf Hitler (spelled incorrectly), and Dick Cheney.

Dave Bittner: Okay.

Ben Yelin: Like, that was obviously such absurd parody that nobody would actually think that that was representing something realistic.

Dave Bittner: Right.

Ben Yelin: So this ad is the first time that it's, like, it's something plausible, it's something that's well done, and there is no transparency. So there's no disclaimer at the bottom of the page saying this is AI generated, this image is AI generated. In fact --

Dave Bittner: To your knowledge, is there anything within the rules of political campaigns that would require any sort of disclosure? You know, I remember when it happened that candidates had to start saying, you know, I'm Joe Blow, and I approve this message.

Ben Yelin: Yeah.

Dave Bittner: You know, at the beginning or end of every ad. Is there anything in there that requires any, you know, notice of accuracy in image generation? [Laughter] As the words are leaving my mouth, I realize how absurd they sound. [Laughter]

Ben Yelin: I don't think so. I mean, there are regulations on what you can put in political ads.

Dave Bittner: Okay.

Ben Yelin: You mentioned one of them; that was from the Bipartisan Campaign Reform Act from the early 2000s. There are restrictions on using military images. So if you have served in the military, and you have pictures of yourself in a military uniform, you have to put a disclaimer saying the use of military images doesn't imply an endorsement from the Department of Defense, et cetera, something like that.

Dave Bittner: Okay.

Ben Yelin: As far as I know, there are no rules about AI that exist yet. I think the federal government is just starting to address this problem.

Dave Bittner: Right. Right.

Ben Yelin: And they're addressing it in a very broad sense, so I don't know when it's going to get granular enough that we can actually regulate political advertisements. I mean, I think a good enterprising lawmaker could draft a bill that says, for federal races over which the federal government has jurisdiction, if you're going to use AI-generated images, there has to be very conspicuous text at the bottom saying this image is AI generated.

Dave Bittner: Yeah. Well, and to that point, this article does point to some previous reporting from the Washington Post about how, evidently, Representative Yvette Clarke, who's a Democrat from New York.

Ben Yelin: Yes.

Dave Bittner: Did sponsor some legislation that would require disclosure of AI-generated content in political ads.

Ben Yelin: So I guess to my point, she was that enterprising lawmaker.

Dave Bittner: [Laughter] Right.

Ben Yelin: She did file this bill about a month ago. But as this article points out, it's going to be very hard for this bill to move in Congress. She's part of the political minority. And Republicans have generally been more reluctant to embrace any regulation on political speech.

Dave Bittner: Yeah.

Ben Yelin: Especially in campaign advertising. So, yeah, I mean, I don't think that's going anywhere. I mean, you do have a couple of options if you feel that you've been the victim of AI generated images.

Dave Bittner: Right.

Ben Yelin: The very unlikely solution would be some type of defamation lawsuit. But given the high legal standard there, I don't think that would succeed in this context. The other is, I'm not sure if this is fighting fire with fire, but getting the word out there that these are fake AI generated images. And there is now this tool on Elon Musk's Twitter. And not to focus too much on Twitter. But Twitter is largely where the political conversation happens.

Dave Bittner: Right.

Ben Yelin: I think other platforms are better for influencers and pop culture, that sort of thing. But Twitter is where political figures interact. And there's this new feature, Community Notes, where you can put something in context or fact check something and that fact check can be endorsed by community members. And there would be a little disclaimer below a tweet with that AI imagery that says this is generated by AI. That's what happened here. So under this video now is one of those community notes saying these hugs didn't actually happen.

Dave Bittner: Right. That doesn't do any good if the ad runs on your local TV affiliate?

Ben Yelin: Most certainly not.

Dave Bittner: Yeah.

Ben Yelin: So the local TV affiliate is the biggest problem. I mean, you can put disclaimers on things like YouTube videos, but, yeah. I mean, running on cable news or local TV is going to be a different problem entirely.

Dave Bittner: What happens if we- let's take this to the absurd extreme here. What happens if one of these candidates posts a video of their opponent, and they literally put words in their mouth. They have some kind of deep fake and, you know, they have them saying something that is the opposite of something that they would say. Is that just, you know, grounds for a lawsuit or what happens then?

Ben Yelin: Well, that's a great question. It could potentially be grounds for a lawsuit, though it's very difficult to prove. The First Amendment considerations are quite strong given that this is political speech, even though it's something that's false. I don't know how a legal challenge would fare. There's also the pressure that you could put on individual news stations to pull advertising, which has sometimes worked in the past for something that's so fundamentally misleading. You go to the news sources themselves and say, you're undermining faith in our elections by running this advertisement. And occasionally, that works if you put enough pressure on them. So that can happen too. But I don't think we're far off at all from things, like, deep fakes showing up in political advertising. I mean, AI generated voice is getting so much better than it used to be.

Dave Bittner: Yeah.

Ben Yelin: Even just a few months ago. It's coming. And I don't think as a society, we're prepared to deal with it, especially since we know that false information can proliferate so quickly across social media. It's something that's definitely going to be a problem.

Dave Bittner: Well, think about the, you know, during former President Trump's initial campaign, the- what was the- what's the name for the tape when he was in the bus? You know, the --

Ben Yelin: The Access Hollywood tape?

Dave Bittner: The Access Hollywood --

Ben Yelin: Yeah.

Dave Bittner: -- tape, right? So in a situation like that, where he was not on camera, you know, it was a remote, like, a wireless microphone --

Ben Yelin: Right.

Dave Bittner: -- recording. So you don't have any issues with the video part of it, like, his lips don't have to match what he's saying. Even just a few years after that happened, it's the wild west with catching someone on a hot mic; you could have them say anything.

Ben Yelin: Right.

Dave Bittner: And obviously, you know, the Access Hollywood tape didn't keep President Trump from becoming president.

Ben Yelin: Right.

Dave Bittner: But --

Ben Yelin: I mean, you could ruin somebody's political career if it's believable.

Dave Bittner: Right.

Ben Yelin: You know, the other side of the coin is it adds a layer of plausible deniability for the candidates, because even for something they did say they could say, that could have been AI generated.

Dave Bittner: Right.

Ben Yelin: We saw that with Elon Musk in a lawsuit in his capacity as CEO of Tesla, where he made some- we- didn't we have a story on this? He made some representation at a public presentation?

Dave Bittner: Yeah.

Ben Yelin: And then implied to the court that that might have been AI generated, even though, like, actual people were there and saw it. [Laughter] So --

Dave Bittner: So just the ability to inject doubt?

Ben Yelin: Yeah. It's injecting reasonable doubt into the minds of voters and just making our political process that much more confusing and uncertain, which is not great.

Dave Bittner: [Laughter] Oh, interesting times, interesting times.

Ben Yelin: It sure is.

Dave Bittner: Oh, all right. Well.

Ben Yelin: Long sigh. Yeah.

Dave Bittner: It's so hard [laughter].

Ben Yelin: Yeah.

Dave Bittner: It's just so, yeah. You know, I mean, it's a great example of how hard it is to keep up with technology. Right? How hard it is for policy and our political system to keep up with technology. They run at different tempos.

Ben Yelin: Right, right, right. No, I completely agree.

Dave Bittner: Yeah.

Ben Yelin: And the technology is always two or three steps ahead of our political system and our legal system. And we are frequently seeing the consequences of that. [Music]

Dave Bittner: Yeah. All right. Well, we will have a link to these stories in the show notes. And of course, we would love to hear from you if there's something you'd like us to consider for the show. You can email us. It's caveat@n2k.com.

Dave Bittner: Ben, I recently had the pleasure of speaking with Dr. Bilyana Lilly. We discussed a paper that she co-authored for CyCon. It's titled "Business @ War: The IT Companies Helping to Defend Ukraine." Here's my conversation with Dr. Bilyana Lilly.

Bilyana Lilly: We chose to write on this topic back in September of last year because my co-authors and I noticed how there were several companies that were in the news and their contributions to Ukraine were discussed. And a lot of the cybersecurity community has been writing about the cyber dimension of the war in Ukraine, and they have covered the contributions of foreign companies. And one of the reasons why Ukraine managed to stay online for that long is because of the contributions of foreign IT companies. But we couldn't find a single study or article that systematically analyzed the contributions of different companies, categorized them, and also assessed the risks and benefits for the different companies from providing these contributions. And I got really curious about this topic. And I also wanted to understand, why are companies helping? Why now? Because this whole massive support was so overt and so instantaneous; as soon as the war started, companies openly started providing support free of charge. And even before Ukraine was invaded by Russia through conventional means, there was already conflict happening in cyberspace. And some companies had already been providing assistance for months prior to the conventional invasion. So I was very curious to know, what was the risk assessment of those companies? Why did they decide to help? They were aware of the risks, but despite that, why did they decide to assist so overtly?

Dave Bittner: Can we discuss a little bit of the historical precedent here? I mean, I think about, you know, going back to World War II here in the US, when we had manufacturers who would shift from making cars and trucks and things like that to being on a wartime footing and making tanks and jeeps and those sorts of things. Are there any parallels here, or is this really a new domain?

Bilyana Lilly: Because of the nature of this conflict, because this was a country, a neighbor of Russia, a country that is not a part of NATO and not a part of the European Union, I think this is a rather unprecedented scenario. And we can draw parallels with past conflicts, but they will be limited. And I think in recent years, in recent conflicts, there hasn't been any support provided at that scale. So I would say that, from that perspective, the support that has been offered has been unprecedented.

Dave Bittner: And what specifically are we talking about in terms of the types of support being offered here?

Bilyana Lilly: So we looked at a number of companies, about 20. And we only covered publicly disclosed contributions and support, not assistance that has been provided under non-disclosure agreements. And we categorized the different contributions into three categories, hardware, software, and cyber services, in the simplest way possible. We also reached out to the companies that we identified in our paper, and I have to publicly thank those that responded. We had a whole questionnaire and then followed up with additional questions. And some companies were very helpful and provided really clear answers. Some of them had slightly different categorizations of their assistance. For example, they would also include raising awareness about certain threats in cyberspace to Ukraine, but also to their clients and partners outside of Ukraine. So we didn't include that category. We also didn't include social media companies. I just came back from CyCon, which is NATO's conference that takes place every year in Estonia, and I now know that social media companies have been incredibly helpful as well. And that's another category that I hope someone looks at, using our research as a baseline and continuing to expand on it. And I think that's another category that we should consider when we examine the support of foreign companies to Ukraine going forward.

Dave Bittner: What are the diplomatic elements in play here? I mean, obviously, the United States has its own diplomatic interests in this war. So if I'm an independent company here, and I'm headquartered in the US, do I check in with the government before I offer up my help?

Bilyana Lilly: It's a really good question. So from my conversations, or our conversations with different companies, when we asked them about their decision-making process, I don't think the US government came up. I would imagine that was a consideration. But the two main considerations that the companies had at the time were, first, should they exit Russia? And then, should they aid Ukraine? So two decisions here. First, the exit strategy was one decision category, and then the second is, should we provide assistance and what are the risks and benefits from doing so? But also, although the question of whether the companies consulted with the US government didn't come up, when we looked at the companies that have provided assistance to Ukraine since the beginning of the war, most of them are US based. And the United States was very clear in its position that it is supporting Ukraine and is condemning Russia's actions. So the companies more or less followed US foreign policy.

Dave Bittner: And did any of them to your knowledge, suffer any real consequences here? You talk about pulling out of Russia. Was that a big market for some of these companies?

Bilyana Lilly: For some of them, yes, but not as big as, for example, China, if we're going to look forward to potential other conflicts. More significant threats that the companies have faced include intellectual property theft and a much higher risk to the confidentiality, integrity, and availability of their data and communications in Ukraine. For some of them, employees have faced harassment by Russian-aligned actors, most likely on behalf of the Russian government, or at least actors that have pledged allegiance to the Russian government. We also had a clear threat signaled from the Kremlin, when the Kremlin said last October that commercial satellites will also be considered participants in the war and could be targets for the Russian government.

Dave Bittner: As we look forward here, how does this inform the future of warfare?

Bilyana Lilly: I think we're going to draw lessons from this conflict for years to come, if not decades. So far, what I can say from our research is that a lot of the assistance was provided ad hoc, and companies didn't have a playbook. A lot of them took significant risk when they intervened in the beginning and started providing assistance. So what we're learning is that the provision and coordination of assistance could be streamlined and facilitated a lot better if we had playbooks and prepared in advance with something like, for example, multi-stakeholder incident response playbooks that we create similarly to the incident response playbooks that we have today. But instead of just listing the types of scenarios for different cyber incidents within an organization, we also consider scenarios for an actual conflict that an entire country can face, and then what types of tools, products, and services the country, or the specific organization within the country, would need to withstand the particular onslaught in cyberspace that they may face from an aggressor like Russia. And I'm specifically looking at countries, like, Georgia, like, Belarus, that are closer to Russia and may benefit from having such multi-stakeholder incident response playbooks. Another area where we're learning a lot: the conflict has accelerated critical discussions about rules of engagement in cyberspace, specifically for private sector companies, specifically in conflict. Because we have international humanitarian law, but IHL pertains to states; it doesn't cover the private sector. And now companies are starting to discuss among themselves and with governments, what are the responsibilities of these companies during conflict? Who is going to pay for the services? For how long are they supposed to provide assistance? How are they going to coordinate that assistance? What protection can they expect and from whom? So all of these questions are now being discussed on the sidelines of the Munich Security Conference and CyCon. We had several very informative, closed-door discussions on the topic. And I think there's a lot of interest from the private sector and different governments to address these questions. But the topic is so complex that I don't think there is a playbook that's been formulated yet, but it is being formulated as we speak.

Dave Bittner: What are some of the concerns that have been brought up here? I mean, it seems to me, like, we introduce a bit of fuzziness here, quite a bit of fuzziness here where it used to be that, in my mind anyway, these lines were more distinctly drawn. You know, your federal government, your military, they have responsibilities for these things. And the private sector could assist, but it was through the military, through the federal government. As we blur these lines, what are some of the potential hazards here?

Bilyana Lilly: Definitely physical harm, or harm to the infrastructure of a company, to its employees. The whole idea that companies now can be considered combatants and can be direct targets in the conflict is a very significant point for discussion at the moment. Another question is, how are companies expected to provide assistance free of charge? Some of the licenses that were issued to Ukraine a year ago, a year and a half ago, are going to expire soon. Who is going to pick up that bill? For how long are companies expected to support Ukraine? Those are significant considerations. And, Dave, honestly, I can't blame those companies, because they're not governments, and we shouldn't expect them to act like governments. I realize there's a corporate responsibility element to this, and there is some element of altruism. And there are humans operating these institutions, but they're profit-making institutions. Their job is not to protect citizens; their job is to make profit. So I don't think we can expect them to be supporting Ukraine indefinitely. Governments should take that responsibility. Also because the war wasn't started by a company; it was started by a political decision and by the Kremlin. So the entities that are supposed to respond to that war are political institutions as well.

Dave Bittner: Where do you expect this to head then? Are we going to see frameworks drawn up? Will this become more formalized, or will we have rules of the road going forward?

Bilyana Lilly: I really hope so. And I think because of the momentum that we've gathered so far, with all these discussions that are taking place, I think we will have at least a general playbook and rules that would guide the behavior of the private sector in conflict going forward. It is going to take a few more months to generate that playbook because a lot of stakeholders need to be consulted. But I think we will see the formulation of better rules of engagement in cyberspace for private companies in conflict. We'll see, hopefully, a definition of what an aggressor and a victim means, because in this particular conflict, it wasn't that difficult to see. Russia unilaterally without provocation invaded Ukraine and committed unspeakable atrocities. But what if we have a conflict where the aggressor and the victim or the attacker and the victim aren't as clear, and it's harder for the international community to pick sides? So I think we'll have- those discussions are taking place at the level of the International Red Cross, NATO, the European Union, on the sidelines of CyCon and the Munich Security Conference, and the Aspen Institute and others. And I think we will see in the near future, the formulation of guidelines that will assist companies with making these decisions.

Dave Bittner: We've certainly seen a lot of support from our allies with these efforts, and it's really been a global effort here. But what about our adversaries? How do you suppose they're viewing all of this?

Bilyana Lilly: Oh, for Russia, this is, like, for them, this is fantastic, because they can now say that all of NATO is against them, and that the reason why Russia invaded Ukraine is because of NATO's aggressive policies. And now they clearly point the finger at us and say that we are all working together, and we're a consolidated aggressor against Russia, and they're basically fighting all of NATO in Ukraine. [Music]

Dave Bittner: What about China? I mean, you know, sort of on the sidelines, but more aligned with Russia than us.

Bilyana Lilly: China is wise enough to try to preserve a level of neutrality. Although, of course, they're not neutral, but they're trying to not directly be involved in the conflict as much as they can.

Dave Bittner: Ben, what do you think?

Ben Yelin: That was really fascinating to learn about the extent and really the heroism of some of these companies who have stepped up in a way that's not necessarily profitable to them and maybe not sustainable in the long run.

Dave Bittner: Right?

Ben Yelin: But to do something that's really aiding in this effort, and I think your comparison was apt, that businesses during World War II suspended some of their normal operations to contribute to the war effort. But it's something that's not going to be sustainable, as your interviewee said, because eventually it's the governments that are going to need to get involved and take action.

Dave Bittner: Yeah.

Ben Yelin: But it was certainly really interesting.

Dave Bittner: Yeah. I always enjoy my conversations with her. And, you know, one thing it makes me wonder about is kind of where the lines are drawn when it comes to our expectations of the national defense. Right? In other words, you and I, you know, we're sitting here on the mainland of the United States of America, and we feel like certain protections come from certain organizations, and those lines are very clear.

Ben Yelin: Right.

Dave Bittner: You know, the military defends our borders.

Ben Yelin: Right.

Dave Bittner: And we have an Army, and an Air Force, and a Navy, and all those kinds of things. So this notion that we have this realm, this cyber realm, where we are also relying on the private sector to do a lot of the heavy lifting, I think that's an interesting [music] reality.

Ben Yelin: Yeah. It's interesting and it's novel.

Dave Bittner: Yeah.

Ben Yelin: And I think there are some pretty broad implications to it.

Dave Bittner: Yeah, absolutely. All right. Well, again, our thanks to Dr. Bilyana Lilly for joining us. We do appreciate her taking the time.

Dave Bittner: That is our show. We want to thank all of you for listening. We'd love to know what you think of this podcast. You can email us at caveat@n2k.com. N2K Strategic Workforce Intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. This show is edited by Elliot Peltzman. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.