Where oh where in the world is that information?
George Tziahanas: So data sovereignty, right, is this concept that, you know, for a long time, we thought of the digital world as a world without borders. And actually what we're seeing now is national borders are starting to mean something again, even in the digital world.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hi, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today Ben has the story of a raid on an independent journalist who leaked unaired clips from Fox News. I've got the story of a DC court ruling that AI-generated content is not eligible for copyright protection. And later in the show, my conversation with George Tziahanas of Archive360. We're talking about what happens when information could be housed anywhere in the world. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, let's dig into our stories here. Why don't you kick things off for us?
Ben Yelin: So my story this week comes from the Substack of journalist Kim Zetter, who is one of the top journalists in the field of cybersecurity and national security.
Dave Bittner: Yeah.
Ben Yelin: We certainly have learned a lot from her work over the years.
Dave Bittner: Sure.
Ben Yelin: And I was actually alerted to the story on LinkedIn. I've had people, fans of the show, perhaps, start just tagging me on these types of stories.
Dave Bittner: Nice.
Ben Yelin: So please continue to do that. It, you know, makes it so it's less work for me to find articles. So thanks, everybody, for that. So this is about a guy named Tim Burke who got in trouble with Florida law enforcement because he leaked unaired clips from Fox News, specifically from the Tucker Carlson show. So Tucker Carlson, of course, is no longer with Fox News, but the relevant material was leaked in 2022.
Dave Bittner: Okay.
Ben Yelin: Basically what happened is Tucker Carlson did an interview with Ye, the artist previously known as Kanye West.
Dave Bittner: Okay.
Ben Yelin: And he presented the interview as this guy is a normal guy. He deserves to be listened to. But it turns out there are a bunch of unaired clips that didn't make it into the segment where Ye is acting like a maniac and saying things that are wild and anti-Semitic.
Dave Bittner: Oh.
Ben Yelin: So somehow, and we'll get into this, this journalist Tim Burke got access to these unaired portions and leaked them to Vice News. Vice News published them, and Florida law enforcement started an investigation into Burke for violating the Computer Fraud and Abuse Act, the anti-hacking statute, and they raided Tim Burke's home and took a bunch of his equipment. So it was one of those middle-of-the-night raids. They obtained a warrant to do it, accusing him of violating the CFAA. And now Tim Burke is going to try to suppress that evidence when and if he goes to trial. He has not been charged. This was simply a search done on suspicion of criminal activity. A little background on Tim Burke, first of all, which I think is relevant: he actually wasn't working at a major publication when the story took place. He wasn't really working for any publication. So there's the question: if we have special federal protections for journalists, where there's a heightened standard for targeting journalists, was he a journalist in these circumstances? I think he argues, and other digital rights advocates have argued persuasively, that of course he's a journalist. Anybody who's doing this type of investigative work, who is making news by talking to sources, etc., whether or not they're working for an actual publication, is a journalist and deserves those protections.
Dave Bittner: Right.
Ben Yelin: Tim Burke was actually the guy who co-wrote the story -- this is more than 10 years ago now -- on Manti Te'o, the football player, and his alleged fake girlfriend, which was like one of the most crazy, interesting, wild pieces of journalism I've probably seen in my lifetime.
Dave Bittner: So my girlfriend lives in Canada kind of story?
Ben Yelin: Yeah.
Dave Bittner: I'm not familiar with it.
Ben Yelin: Basically, there was -- I mean, we can go on a little tangent here because it's so interesting.
Dave Bittner: Quickly.
Ben Yelin: But basically, this guy had claimed that his grandmother and his girlfriend had died within the same football season. It became a national story.
Dave Bittner: Oh.
Ben Yelin: His grandmother actually had died. The girlfriend was fake.
Dave Bittner: I see.
Ben Yelin: I think what Manti Te'o argues is that he never realized the girlfriend was fake. I think he improperly claimed that he had met her, but there was this other guy who was obsessed with this football player, who pretended to be a woman and used, like, a stock photo online.
Dave Bittner: Oh, right, this is coming back to me now. I do, yeah, yeah, I have a vague recollection of this.
Ben Yelin: So yeah, I mean, this is something that Tim Burke did in a past life, meaning he is a pretty darn good journalist.
Dave Bittner: Right.
Ben Yelin: So let's look at the Computer Fraud and Abuse Act angle here, because as fun as it is to reminisce about Manti Te'o, I think there's an important legal story here.
Dave Bittner: Right.
Ben Yelin: The question is whether Burke violated the CFAA. And if you remember from the Van Buren case that we've talked about, exceeding authorized access requires you to be somewhere you're not allowed to be. It is this gate-up, gate-down approach. If you breach that gate, that's a violation of the statute. Burke says he doesn't remember how he obtained the information that he ended up sending to Vice News. I don't know if I believe that entirely.
Dave Bittner: Yeah.
Ben Yelin: But that is what he said.
Dave Bittner: Okay.
Ben Yelin: So it turns out that Kim Zetter did some investigative work based on court filings. And what appears to have happened is that somebody who provided news tips to Burke in the past found a username and password for a demo account on a website used by broadcasters called liveu.tv. So this website provides transmission services to TV and radio broadcasters and others so they can send live feeds from the field into production offices.
Dave Bittner: Okay.
Ben Yelin: So that used to kind of be done by satellite. This is a way so that you don't have to do it by satellite.
Dave Bittner: Right.
Ben Yelin: Burke found a publicly available username and password for a demo account. He found it, or his tipster found it, on a web page belonging to a CBS Radio affiliate in Tennessee. Again, this was posted publicly. So to me, there's no hacking here. There's no unauthorized access. There's no exceeding authorized access.
Dave Bittner: Because it's a demo account.
Ben Yelin: It's a demo account.
Dave Bittner: Right.
Ben Yelin: And he didn't even hack anything to get that username and password to that demo account. It was publicly available on a website. This is journalism.
Dave Bittner: Right.
Ben Yelin: This is somebody searching the internet to find a resource, a source.
Dave Bittner: Let me ask you this. So let me interrupt --
Ben Yelin: Sure.
Dave Bittner: -- and say, suppose, you know, you walk out of the studio here today, and you inadvertently leave behind a scrap of paper that has your username and password for your email. If I log on to your email using that scrap of paper you left behind, my understanding is that would be a violation of the Computer Fraud and Abuse Act because I am not authorized to access your email account.
Ben Yelin: That is correct. Now, there's a very important distinction, and this gets into the legalese here.
Dave Bittner: Okay.
Ben Yelin: Which I know is kind of annoying, but it's about whether a reasonable person would think that somebody was trying to keep that information private.
Dave Bittner: Okay.
Ben Yelin: So with you discovering my password, a reasonable person would believe that I was trying to conceal it, that it was an accident that I left it in your office.
Dave Bittner: Right.
Ben Yelin: And so that would be a violation of the statute.
Dave Bittner: Same thing as in the physical world, right? If I leave my front door unlocked, that doesn't mean that someone can waltz into my house.
Ben Yelin: Exactly, exactly.
Dave Bittner: Right.
Ben Yelin: So we're using a reasonable person standard, meaning we have to look at it kind of objectively: what would a reasonable person think under similar circumstances?
Dave Bittner: Okay.
Ben Yelin: If I saw a login to a demo account on a public website, you know, this is maybe a tough question. I wouldn't think that that website was trying to shut me out from being able to access it.
Dave Bittner: No, no, of course not.
Ben Yelin: Especially because they are publishing demo accounts. Now, it's unclear why the CBS affiliate in Tennessee was posting demo account login information for the service, but they did, and there's no indication that liveu.tv is trying to keep its feeds private. I just don't think that's necessarily a standard practice of theirs. They probably want people to have access to demo accounts so that people purchase their services. So I think that's very relevant here, and that's what distinguishes it from, oops, I left my password behind. You have access to it. You didn't break down the gate to get that password, but it's still a violation of the statute because a reasonable person would know it's a username and password, something I would want to keep private.
Dave Bittner: Right.
Ben Yelin: So the CFAA violation or potential violation was the entire reason for this raid by law enforcement. And even though Burke has not yet been charged with a crime, I think certainly, you could argue that his civil liberties were violated. They confiscated a bunch of his recording equipment.
Dave Bittner: Yeah.
Ben Yelin: All different types of stuff that he would need to practice journalism. And that certainly violates the spirit of the federal laws that protect journalists from these types of First Amendment violations. So it's just a really interesting, multi-layered story. And I'm hoping, whether law enforcement declines to file charges or whether this plays out in court, that the Van Buren interpretation of the CFAA holds, that this was not an instance of exceeding authorized access, and that therefore Burke should not be punished under the law.
Dave Bittner: We've had a couple of stories, you know, in this same vein recently. We had the one with the, you know, the small town newspaper that got raided -- I don't know -- last week or the week before.
Ben Yelin: Yeah, I mean, we did that story last week.
Dave Bittner: Yeah.
Ben Yelin: That was really the hook here for Kim Zetter: everybody's paying attention to that story, and this story out of Florida is quite similar. It's still a law enforcement attack -- a search and seizure targeting a journalist.
Dave Bittner: What does it take to tamp down on this? In other words, I presume -- well, I guess the folks at Fox News probably complained to law enforcement and made their case, and law enforcement had to convince a judge to approve the raid.
Ben Yelin: Right, they did. Yeah, and Fox News sent a cease and desist to Vice and to another media resource who was planning to publish some of this material.
Dave Bittner: Okay.
Ben Yelin: So they got their legal department on top of it. Even while their legal department was busy defending lawsuits from Dominion Voting Systems, they also had time to address this. Yeah.
Dave Bittner: So what does it take to -- I mean, is it going to take the police getting their hands smacked? Are judges going to have to demand more scrutiny? What step along the chain is going to make people take a closer look at this? Because it seems to me like, right now, the judges are going along with this, certainly in these two cases, without perhaps the scrutiny it deserves.
Ben Yelin: So I only say this half jokingly: it's us yelling about it on a podcast. That is really what changed the situation in Kansas, where you had a prosecutor who, I think, got out over his skis to obtain this warrant, and who ended up revoking the warrant and returning the seized materials to the journalists of the Marion County Record, which we talked about last week.
Dave Bittner: Right.
Ben Yelin: I think the only reason that happened is there was a national outcry. Every media source in the country signed documents, letters, saying that this was an egregious violation of the First Amendment, that it threatened the rights of journalists. We had that terrible situation where the 98-year-old woman who was the publisher of that paper ended up dying, I think, proximately because of this raid. And because of the national outcry, they revoked that warrant and returned the material to that news source. And what Kim Zetter is saying here is that perhaps Florida prosecutors should do the same and return all of Burke's seized materials.
Dave Bittner: Right.
Ben Yelin: I think maybe it shouldn't take a public outcry. But if both prosecutors and judges at every level realize that people care a lot about journalism and the First Amendment and the right of journalists, reporters, etc., not to be threatened by the government pursuing illegal raids -- once there's kind of a public awareness of that, that could really change behavior. So I think the public outcry is kind of the mechanism here, at least in the short term, for how we're going to hold these law enforcement officials, these judges, accountable for unjust actions.
Dave Bittner: To what degree do you suppose this is political? In other words, you know, this happened in Florida. We know what the political leanings are in Florida. Certainly, journalists have a lower standing among people on the right than people on the left. That's the reality that we live in. You know, the accusation of fake news and all that sort of stuff. I mean, ideally, we would think that law enforcement and judges would be above that. But does that -- does that at all come into play here? Does Kim Zetter address that? Or is that a -- is it a subtext here? What do you think?
Ben Yelin: I think it is more subtext. There's no evidence that this was politically motivated.
Dave Bittner: Okay.
Ben Yelin: I don't think Tim Burke himself is, like -- I mean, he's leaked to liberal websites, but I don't get the impression that he himself is some kind of radical leftist.
Dave Bittner: Right.
Ben Yelin: Who is subjecting himself, willingly or not, to the watchful eyes of Florida law enforcement. I think they got the complaint from Fox News, and they investigated it. And I just think they had an incorrect interpretation of the Computer Fraud and Abuse Act, which they used unjustly to obtain this warrant. So there certainly is a subtext. This is Florida. This was Fox News.
Dave Bittner: Yeah.
Ben Yelin: I don't know how much Fox News wants to defend Tucker Carlson at this point.
Dave Bittner: Right.
Ben Yelin: Considering he's moved on but --
Dave Bittner: Right, right.
Ben Yelin: -- that's neither here nor there. I think it is subtext and not something that's really out in the open either in other stories about this or in Kim Zetter's piece on Substack.
Dave Bittner: It's also interesting to me that they went after the journalist rather than the person or entity who originally posted this to the file-sharing site. Like there seems to -- was there any effort to find out who was responsible for that?
Ben Yelin: I mean, it certainly doesn't seem like it. There's nothing in this piece that indicates they went after anybody here except for Burke and the people who worked with him. So yeah, I mean, we had multiple characters in the story. It wasn't just Burke who --
Dave Bittner: Killed the messenger.
Ben Yelin: Right, right. He simply obtained this publicly available login information from a tipster he had relied on in the past. The tipster is anonymous, and Burke is not anonymous, because he was associated with the publishing of these articles. I think both Vice News and the other outlets who used this material said it was the result of an investigation by Tim Burke. So I think law enforcement is going after the person they know is involved in the story, rather than people they have no proof were involved in what happened here.
Dave Bittner: Is this an example of the Computer Fraud and Abuse Act being outdated and needing an update, or is this an example of a good law being used in a bad way?
Ben Yelin: I think it's more the latter.
Dave Bittner: Yeah.
Ben Yelin: I mean, I certainly was encouraged by the Van Buren case and by Justice Barrett's opinion, which I think protects the Computer Fraud and Abuse Act from being used to abuse journalists and others by making it a crime to log into somebody else's Facebook account, for example. I mean, that was kind of the parade of horribles that Supreme Court justices were warning about when they narrowed the interpretation to this gate-up, gate-down approach: did you have a right to be there in the first place? So I just think this was a misapplication of the law. The way our legal system works, if this ever were to come to trial, a good attorney for Mr. Burke, and the court, would have to look at that Van Buren decision and realize that this instance certainly wasn't in the spirit of that decision.
Dave Bittner: Yeah.
Ben Yelin: So I do think this is an example of just misapplying the relevant law. Certainly, any changes to these types of statutes -- we're talking about a computer fraud statute from the 1980s -- they can always use updating.
Dave Bittner: Right.
Ben Yelin: When you have a statute that's as old as I am, you know that it's getting old and gray and bigger and balder. So, you know, you need to update it to fit modern times.
Dave Bittner: Yeah.
Ben Yelin: But I think this is less of a story of the statute itself and more just a misapplication of it.
Dave Bittner: Okay. Well, we will have a link to this story in the show notes. And I'll just reemphasize here that, in my opinion, anything Kim Zetter writes is worth your time. In fact, hers is one of a handful of newsletters that I pay to subscribe to because I really find her stuff high value and definitely worth checking out. So we'll have a link to that in the show notes. My story this week comes from Bloomberg Law, and it's about a DC court ruling that says AI-generated art lacks copyright protection, which, I suppose, Ben, means it's in the public domain.
Ben Yelin: Yeah, it is. We can use it. We can sell it.
Dave Bittner: Yeah.
Ben Yelin: We can make a fortune off this AI-generated art. You know, I have to say AI-generated art has kind of impressed me so far. I'm following a Twitter account that's based on -- or an X account, if you will.
Dave Bittner: Right.
Ben Yelin: That is AI based. It's somebody who puts Donald Trump in like famous historical photos. It makes him look natural. So like here's Donald Trump, you know, with Moses during the Exodus from Egypt.
Dave Bittner: Right, right.
Ben Yelin: It's pretty good. Like they've -- AI --
Dave Bittner: So it's subtle.
Ben Yelin: It's very subtle, exactly. I'm not sure if this is somebody who's a fan of Donald Trump or just somebody who's trying to be funny. I find it humorous.
Dave Bittner: Yeah.
Ben Yelin: But yeah, I've been kind of impressed with how good AI-generated art has been so far. And now, at least according to this one district court judge in DC, it is in the public domain because, at least to the extent there's a body of law on this, we've never recognized copyright protections for art that wasn't created by humans.
Dave Bittner: Yeah, so this article points out that this was Judge Beryl A. Howell of the US District Court for the District of Columbia, who basically confirmed a US Copyright Office decision denying copyright registration to a piece of AI art generated by computer scientist Stephen Thaler, who claimed it was eligible for copyright protection. This is interesting because it pivots, or centers, on this notion that something that can be copyrighted has to have a human involved in making it. And what made me laugh about this is that one of the cases they cited was one where they denied copyright to a monkey who took a selfie, if you remember that one.
Ben Yelin: It's so funny. So many of the precedent cases they use for AI have to do with animals.
Dave Bittner: Right.
Ben Yelin: Because it's like the -- I think the analogy is that it's this non-sentient being who isn't aware of its own existence. So that's the similarity between animals and AI. You know, what's kind of funny about this is I think the guy who brought this lawsuit, Thaler --
Dave Bittner: Yeah.
Ben Yelin: I think he was kind of trying to make a point here. He admitted in all the filings that he played no creative role in coming up with these images. He didn't paint it.
Dave Bittner: Right.
Ben Yelin: He didn't design it. He might have decided what the inputs were, but it isn't his work of art. And I think we have to keep a watchful eye out for future cases where that isn't clear. Maybe an artist designed something and incorporated AI to augment that design.
Dave Bittner: Right.
Ben Yelin: Or put it in a certain style. I think that's where it's going to get complicated and where you're going to have a new body of case law under the federal Copyright Act.
Dave Bittner: Yeah.
Ben Yelin: But I think this case was just not the one where it was going to happen, because you have a plaintiff who's admitting -- I'm thinking of that online meme with the old guy saying, he admit it -- I think he's just admitting here that he played no role in the creation of this work.
Dave Bittner: Yeah. They do point out that, earlier this year, the Copyright Office did grant limited copyright registration for an AI-assisted graphic novel, which I think is interesting because it seems inevitable to me that where this is going to go is, at some point, it's going to be a matter of degrees.
Ben Yelin: Right.
Dave Bittner: Right? Like if I have an AI-generated background in my piece of art, and I paste a smiley face in the middle of it, is that now copyrightable?
Ben Yelin: Yeah, your creation was the smiley face.
Dave Bittner: Because I added the smiley -- right, right.
Ben Yelin: Yeah, I think it's going to become a line-drawing exercise. And maybe we're going to get some legal standard, where it's, like, whether a reasonable person would think this was created by a human being or through artificial intelligence. I don't know exactly what that standard is going to be, but I think you're right that we are going to have to draw the line somewhere. Maybe it's going to be: was the majority of the artwork designed by a human being?
Dave Bittner: Right, more than 49%.
Ben Yelin: Yeah. And like how are you going to get a jury to determine that or a judge to determine that? It's really murky.
Dave Bittner: Yeah.
Ben Yelin: I mean, I just think this is such a new area of the law that we don't really have a reliable body of law to draw from here. Certainly Congress or state legislatures could get involved -- I'm not holding my breath when it comes to Congress.
Dave Bittner: Right.
Ben Yelin: But state legislatures could define what counts as AI-generated material for copyright purposes. That would be the best way to do this: develop some positive law around it. But until then, I think judges are just going to have to do some guesswork and figure out, is this actually a creative work done by a human being? Or is this the equivalent of a monkey taking a selfie?
Dave Bittner: So I have a couple of questions here.
Ben Yelin: Yeah.
Dave Bittner: A couple of scenarios. So could I copyright the prompt that I use to generate the AI image?
Ben Yelin: See, I think you'd have a much better case doing that because that's your creation.
Dave Bittner: Right.
Ben Yelin: The art itself isn't your creation, but presumably you developed that prompt. You developed the input.
Dave Bittner: Yes.
Ben Yelin: So I think you might have a copyright interest in that.
Dave Bittner: Okay.
Ben Yelin: I'm trying to think of what a metaphor for that would be in the physical world -- you know, you putting into some type of music software, play these chords, these piano chords.
Dave Bittner: Right.
Ben Yelin: You still would have been the creative genius behind it, not the system that ended up spitting out those chords. So I think perhaps you would have a copyright interest in it.
Dave Bittner: Well, that leads perfectly into my next question, which is what about kind of the inverse of this? So my son has been sending me AI-created songs that are sung in the voice of Frank Sinatra. Okay, so someone took Frank Sinatra's voice, and they trained an AI on it. And they've been having this AI-generated voice of Frank Sinatra sing popular songs, right, songs that came out way after Frank Sinatra passed away.
Ben Yelin: Right. Frank Sinatra singing Beyoncé or something.
Dave Bittner: Exactly, exactly. So could the fact that these are AI-created protect from copyright holders coming -- or the descendants of Frank Sinatra who want to protect his interest, right, could the fact that it's AI-created protect someone from people coming after them?
Ben Yelin: Yeah, I mean, it kind of blows your mind a little bit, doesn't it?
Dave Bittner: Yeah.
Ben Yelin: Because you have two creative works here. You have the song itself that they're drawing off of, which certainly there are copyright protections attached to that.
Dave Bittner: Right.
Ben Yelin: I don't think there's any question there. And then you have the voice. Frank Sinatra's voice is a very distinct, specific thing.
Dave Bittner: Yeah.
Ben Yelin: So I'm not sure, especially if you were to be making money and selling this music based off his voice, then I think you're using his creative outlet, his voice, which to me seems like it also would be a copyright violation. So again, we don't have any sort of guiding law on this because it's so new. But my instinct would be that there are kind of two sets of copyright protections at play there: the drafter of the original song and then the voice itself.
Dave Bittner: Couldn't they say it's parody?
Ben Yelin: If it were parody, you could say it's parody.
Dave Bittner: Right, but, I mean, couldn't I just say that the notion of Frank Sinatra singing a Beyoncé song is so absurd as to be parody?
Ben Yelin: Yeah, I mean, that would probably be your best argument in that circumstance.
Dave Bittner: Yeah.
Ben Yelin: It would be like kind of what Weird Al does.
Dave Bittner: Right.
Ben Yelin: Right, yeah.
Dave Bittner: Exactly.
Ben Yelin: And in that sense, maybe it would be protected. But let's say you were genuinely trying to have Frank Sinatra sing Beyoncé songs because you thought it was beautiful.
Dave Bittner: Yeah.
Ben Yelin: And you tried to produce an album based on that. Then I think you'd run into copyright problems. Even though it was AI-generated, you're using, in my mind, two copyrighted works: the song itself and Frank Sinatra's beautiful New York voice.
Dave Bittner: But does copyright cover the sound of someone's voice?
Ben Yelin: I mean.
Dave Bittner: So I'm a Frank Sinatra impersonator. How good can I be before I get in trouble?
Ben Yelin: Yeah, I mean, what's different about AI is it's trained on his actual voice, so it's not somebody imitating it.
Dave Bittner: Right. So I'd be trained on his actual voice.
Ben Yelin: Yeah, I mean, you raise a good point here, but like --
Dave Bittner: Yeah, right?
Ben Yelin: And this gets into the metaphysical. Is it really his voice? I mean, it's trained on his voice. There's no other human being trying to imitate it.
Dave Bittner: Right.
Ben Yelin: It might be saying -- the artificial intelligence might have Frank Sinatra saying words that he actually said in the voice that he actually said them.
Dave Bittner: Yeah.
Ben Yelin: So yeah, I mean, I think you're getting beyond my capability to adjudicate that dispute.
Dave Bittner: Come on, come on, lawyer boy.
Ben Yelin: Yeah. That's why I just throw my hands up and say let's let the jury decide. I'm going to go, you know, I'm going to go take a long lunch.
Dave Bittner: Exactly, it's undetermined, yeah, yeah. But I mean, it really points to the fact that these are interesting times. And these are unanswered questions that we're going to have to address.
Ben Yelin: There are going to be so many test cases because eventually someone's going to try and make money off of this.
Dave Bittner: Right, right. That's where the rubber hits the road.
Ben Yelin: Yeah, if I have Frank Sinatra doing the full Taylor Swift Eras Tour setlist, and I sell it on the Apple Music Store for, whatever, $12.99 --
Dave Bittner: Right.
Ben Yelin: -- then I think you're going to run into many copyright problems.
Dave Bittner: And what if I say it's this AI voice that I've created, and his name is Hank Binatra?
Ben Yelin: Right. So is that --
Dave Bittner: Right.
Ben Yelin: Is that kind of the Weird Al track, or is that you're ripping off two copyrighted works? I just don't know.
Dave Bittner: Yeah?
Ben Yelin: I really don't know.
Dave Bittner: All right. Well, time will tell.
Ben Yelin: I know it's not a satisfying answer.
Dave Bittner: No, it's not, but it's fascinating to think about, right? And, you know, yeah, we're -- this is what we're in for. This is what we're going to see. This genie is not going back in the bottle.
Ben Yelin: No, it certainly is not. And you know, if you want to be the test case, I encourage you to create that music. We might be saying your name on this podcast because your name is going to be attached to a test case here.
Dave Bittner: Right, all the way to the Supreme Court.
Ben Yelin: Exactly. So have fun with that.
Dave Bittner: Yeah. All right, well, we will have a link to that story. Again, that's from Bloomberg Law. We'll have a link to that in the show notes. Ben, I recently had the pleasure of speaking with George Tziahanas. He's from an organization called Archive360, and we're talking about where your data is stored and the degree to which that matters. It's really a fascinating topic. Here's my conversation with George Tziahanas.
George Tziahanas: So data sovereignty, right, is this concept that, you know, for a long time, we thought of the digital world as a world without borders. And actually, what we're seeing now is national borders are starting to mean something again, even in the digital world. It probably started, if you had to put a point on it, when the US passed the CLOUD Act, which codified something that was happening in the court system here in the US, where the DOJ was trying to compel Microsoft, in that case, to produce information on a foreign national, where the data was held in a foreign data center but under the custody or control of Microsoft. That was working its way through the court system when the US basically said, yeah, the US can compel anybody who has custody or control of information, even if it's in a foreign jurisdiction, to provide it back to the DOJ. And that was pretty much enough for a whole bunch of countries globally to start thinking about the sovereignty of their data.
Dave Bittner: So given that, where do we find ourselves with different nations approaching this?
George Tziahanas: Right, and so one of the ways I've tried to explain this, and the way we work with our customers, is: think about it not from a national security perspective anymore, but from a national interest perspective. So this isn't really just focused on highly secret, national security-related information. We're talking about things around the national interest more broadly. So maybe information and systems that are associated with infrastructure, the energy sector, the financial services sector, or the privacy of the citizens of that nation state. You see this happening across Europe, in countries like Germany and France. You see this happening in the Middle East, in the Kingdom of Saudi Arabia and Dubai. You see this, obviously, in Asia and China. And here in the US, a concept called controlled unclassified information has been introduced by the US government as well.
Dave Bittner: And what does that mean? Can you define that term for us?
George Tziahanas: Right, so all of these states that are introducing some concept around the sovereignty of data are establishing a data classification scheme that says this class of information is subject to some sort of restriction around national boundaries, or around how that information has to be protected and secured within a set of systems. And these, again, are fairly broad classes of information. They could pertain to designs of infrastructure. They could pertain to the critical market and financial services-related information that's moving. This could be data that pertains to government programs that isn't classified but is still something that the US, and these other jurisdictions, want to make sure is protected.
Dave Bittner: Now, what about for cloud providers or for folks who provide backup services? You know, I could see the benefit of having your data distributed, not just different data centers around the nation, but around the world. Does this come into conflict with their needs?
George Tziahanas: It does, and it's actually very interesting, because some of these laws are now taking a different form -- and this is why I say look at it from a national interest perspective, and not just a security perspective. A number of these countries are taking the position that not only does that information have to stay within the borders of that jurisdiction, but that the custody and control of this information must be managed within that country and by nationals of that country. France has done this, and Germany has done this. So in France, they're proposing that no entity with a foreign ownership structure, or with foreign actors, can manage these environments for about 600 companies. Germany has effectively established a government-approved cloud for this type of information that will be managed by T-Systems, which is, obviously, part of a German company. If you look at how the Middle East is trying to deal with this, those states might not be large enough on their own to justify a cloud provider coming in. So they're investing -- basically giving the cloud providers an investment to come and provide these great services and these great technologies in country. But again, that's a way for them to make sure that information stays in country with a level of control, because they're making these investments and effectively forming a joint venture.
Dave Bittner: Yeah, I think that notion of national interest versus national security is really fascinating and, you know, a really sharp observation. I think that's a really interesting way to look at things. It's a great insight.
George Tziahanas: Thanks.
Dave Bittner: So for organizations who are concerned with this, what sort of questions should you be asking your providers?
George Tziahanas: Well, I think the first thing you really want to start asking yourself as a customer is: what, if any, of the information that I'm working with is potentially subject to a data sovereignty requirement, right? Or if you're in the US, if you're working for the US government, or you're a contractor with the US government, what types of information might be subject to the CUI classification? And that's the first step, right, because you kind of have to understand that. Then the second piece is: if I'm using cloud providers, do I have an option to put certain classes of information in different jurisdictions? Now, many of the large cloud providers have already started to do this, and they were doing it even before some of this became a real hot topic. The large cloud providers have instances of their clouds in many of these different jurisdictions, and because of what we see around data sovereignty, there are more of those in-country, if you will, or in-region deployments going on. So with your service providers and your cloud providers, you need to make sure that you're able to take that application, that workload, whatever it is, and get it into the right data center from those cloud providers, and make sure that that's happening.
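For listeners who manage cloud workloads themselves, here is a minimal sketch of what "getting it into the right data center" can look like in practice, using the AWS boto3 SDK in Python to pin a storage bucket to a specific region. The bucket name and region choice are hypothetical illustrations, and region pinning alone does not satisfy every sovereignty regime George describes; it is just the basic residency mechanism.

```python
# Minimal sketch: pinning stored data to a chosen jurisdiction with boto3.
# Assumes AWS credentials are configured; the bucket name is hypothetical.
import boto3

# Create the client against the region where the data must reside.
s3 = boto3.client("s3", region_name="eu-central-1")  # Frankfurt, Germany

# Outside us-east-1, S3 requires an explicit LocationConstraint; this is
# what actually keeps the bucket's objects in the chosen region.
s3.create_bucket(
    Bucket="example-sovereign-records",
    CreateBucketConfiguration={"LocationConstraint": "eu-central-1"},
)

# Objects written to this bucket are now stored in eu-central-1 data centers.
s3.put_object(
    Bucket="example-sovereign-records",
    Key="records/citizen-data.json",
    Body=b'{"subject": "EU resident", "classification": "sovereign"}',
)
```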
Dave Bittner: Does encryption come into play here at all? Does it make a difference? You know, is it okay to store something in another nation if we presume they're not going to be able to access it?
George Tziahanas: So far, that doesn't appear to be the case. There's one interesting way to look at this, though, which is, you know, if you are a company, and you are concerned about this -- concerned about the long arm of a foreign jurisdiction, the long arm of the US, the long arm of some other country -- encryption is actually an important piece of this. Because if you, as a customer, can maintain the keys for the encryption, and the cloud provider, the service provider, does not have the keys to that data, it's almost irrelevant if that cloud provider gets a subpoena or a regulatory request to provide that data, in the sense that they can't provide it in a form that's going to be useful without going to the customer who's maintaining the keys. That's a really important attribute of how you should be thinking about structuring and managing this data, and it gives the customer control -- at least to know if a country is coming after that data.
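That customer-held-keys idea is concrete enough to sketch. Below is a minimal illustration in Python using the open-source cryptography library's Fernet recipe: the customer generates and keeps the key, encrypts client-side, and hands the provider only ciphertext, so a subpoena served on the provider yields nothing readable without the key holder. The upload_to_provider function is a hypothetical stand-in for any storage API; this illustrates the principle, not any particular vendor's product.

```python
# Minimal sketch of customer-held-key encryption ("bring your own key").
# Assumes: pip install cryptography. upload_to_provider() is a hypothetical
# placeholder for whatever cloud storage API is actually in use.
from cryptography.fernet import Fernet

def upload_to_provider(blob: bytes) -> None:
    # The provider only ever receives opaque ciphertext.
    print(f"provider stored {len(blob)} opaque bytes")

# 1. The customer generates the key and never shares it with the provider;
#    in practice it would live in the customer's own KMS or HSM.
key = Fernet.generate_key()
cipher = Fernet(key)

# 2. Encrypt client-side before the data ever leaves the customer's control.
record = b"data subject to a sovereignty requirement"
ciphertext = cipher.encrypt(record)
upload_to_provider(ciphertext)

# 3. A provider compelled by subpoena can hand over only ciphertext;
#    recovering the plaintext requires the customer who holds the key.
assert cipher.decrypt(ciphertext) == record
```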
Dave Bittner: Where do you suppose we're headed here? You know, we have -- I hear folks on the policy side talk about this notion of a splinter net, you know, that we're going to see more and more nations putting up sort of virtual walls around themselves out of their own interest. What does the future hold here, do you suppose?
George Tziahanas: Yeah, I wrote an article several months ago on this, and I used the analogy of a digital Iron Curtain, right? It's descending kind of around countries. Obviously, it's not like an East Bloc versus West Bloc situation. But that's definitely what we're seeing, and, again, if you view this in the context of national interest, I suspect it's going to continue. How far it goes, I'm not sure. But there's definitely going to be some form of digital curtain, if you will, around a lot of countries and a lot of workloads.
Dave Bittner: Ben, what do you think?
Ben Yelin: It was a really interesting conversation. I love this notion of data sovereignty -- that it's not national security, necessarily, but it's just in the national interest to protect our own data sovereignty. I think we've seen that reflected, maybe not as data sovereignty exactly, but with the European Union trying to protect its data from the US surveillance state. And that's been the cause of the Schrems cases and the need to keep renegotiating these data-sharing agreements. So that's where our surveillance laws really come into play in a really fascinating way.
Dave Bittner: Yeah. All right. Well, again, our thanks to George Tziahanas from Archive360 for joining us. We do appreciate him taking the time. That is our show. We want to thank all of you for listening. N2K's strategic workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team, while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. This show is edited by Tré Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.