Caveat 12.8.22
Ep 152 | 12.8.22

Follow the money along the blockchain.

Transcript

Alex Iftimie: We have seen some early signs that this activity is helping.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben explores the potential legal and policy issues surrounding the new AI chatbot, ChatGPT. I've got a case which questions the legality of livestreaming the police. And later in the show, Alex Iftimie from Morrison Foerster discusses how the Department of Justice seized nearly half a million dollars in ransomware payments made to a North Korean hacking group that targeted U.S. medical providers. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben. Before we dig into our stories this week, we've got some excellent follow up from some of our listeners, here. 

Ben Yelin: We have great fans, don't we? 

Dave Bittner: (Laughter) We do, indeed. 

Ben Yelin: Yeah. 

Dave Bittner: We do indeed. And one of the things I like best is they both inform us and set us straight (laughter). 

Ben Yelin: The most important thing. I mean, that's why I married my wife - she sets me straight when I'm wrong, and... 

Dave Bittner: That's right. 

Ben Yelin: ...Makes me a better person. And I feel the same way about our listeners. 

Dave Bittner: That's right. So we got a nice note from someone named Micah, who was responding to our story about GitHub Copilot, which is sort of the AI that's helping people on GitHub complete their code. Micah's correspondence is too long to read here. But to summarize, basically, what he suggests is that GitHub Copilot has been a useful tool for software engineers since it was first released to the public, but there have been legal implications around its usage. Its auto-completion of code examples verbatim from books, including local variable names, raises the question of whether this violates the author's intellectual property rights. However, it could also be seen as a positive example of machine learning. Ultimately, it is hoped that GitHub Copilot will continue to improve and remain a useful tool in the future. What do you think here, Ben? 

Ben Yelin: It was a really interesting email. One of the things this listener talked about was there are books that are written by pioneers in coding, and they will write down some of their code examples in the book. Now, the book itself is... 

Dave Bittner: Yeah. 

Ben Yelin: ...Copyrighted. But if you input the code into this GitHub artificial intelligence, Copilot, then it's going to be used to suggest coding solutions to whichever problem you're trying to solve. And I can see how that would at least feel like an intellectual property violation. 

Dave Bittner: Yeah. 

Ben Yelin: It is somebody's intellectual work that, at least conceivably, is being appropriated for profit. 

Dave Bittner: Right. 

Ben Yelin: But the other aspect of this email, which I think is really interesting, is Copilot's really useful. 

Dave Bittner: (Laughter). 

Ben Yelin: People who code for a living really like it. 

Dave Bittner: They do. Yeah, they do. We got several other notes from people who are saying, like, yeah, I get it, but, boy, is this thing nice. 

Ben Yelin: Yeah. And it's one of those things where, if you can come up with some framework that protects intellectual property rights, I don't think you want to throw out the baby with the bathwater, if that makes sense. 

Dave Bittner: Yeah. 

Ben Yelin: And maybe because this is an open-source tool and it's something that's generally improving institutional knowledge of coding, at least from a policy perspective, maybe that's more valuable in the long run than somebody's intellectual property rights. Easy for me to say because I'm not the person who developed those lines of code... 

Dave Bittner: Right, right. The thing I... 

Ben Yelin: ...But it's just - it's a tough dilemma. 

Dave Bittner: Yeah. The thing I wonder about is - a couple things. So if I write a book, and I have coding examples in there, if I say to my dear readers, here's a coding example, aren't I implying that - go use this? Right? I mean, isn't that kind of what I'm saying? 

Ben Yelin: Yes. But in other contexts, if you use that example verbatim and pass that work off as your own... 

Dave Bittner: Yeah. 

Ben Yelin: ...You could get into legal trouble. 

Dave Bittner: Sure. 

Ben Yelin: That, you know, in an academic setting would be plagiarism... 

Dave Bittner: Yeah. 

Ben Yelin: ...If it's uncredited. But if you are trying to make some sort of money off of that code, then, potentially, you could face legal liability under intellectual property law. And that's kind of what's happening. I mean, people write code for fun. My 9-year-old nephew does, and God bless him. But people also do it for their employment. 

Dave Bittner: Right. 

Ben Yelin: And somebody is making money through it. 

Dave Bittner: Yeah. 

Ben Yelin: So I see what you're saying. Maybe there is sort of an implication that, if it's in a book, you can take it and use it. 

Dave Bittner: Well, in this example, I'm saying - when you frame something in your book as an example to teach people - let's go down the pathway here. What if it were a mathematics book - right? - and I have examples of how to solve a math problem. 

Ben Yelin: If that was put into an algorithm... 

Dave Bittner: Yep. 

Ben Yelin: ...That helped solve math problems online, probably nobody would notice, and therefore it wouldn't be an intellectual... 

Dave Bittner: Right, right. 

Ben Yelin: ...Property violation. I mean, again, it's - I think to me it's about a specific functionality that you are creating through code. The code... 

Dave Bittner: Yeah. 

Ben Yelin: ...Does something, and you're the person who's figured out how to do that thing in an economical way, in a way that's valuable enough that somebody was willing to publish it in a book. 

Dave Bittner: Right. 

Ben Yelin: And in that sense, I could see why it would cause intellectual property problems. There's no... 

Dave Bittner: Yeah. 

Ben Yelin: ...Clear answer here because you're right - it is convenient. There are a lot of instances in which things are made available in literature, and other people use them online. And it's not a copyright violation. 

Dave Bittner: Yeah. 

Ben Yelin: But I just think we have to recognize this potential problem while also understanding the level of convenience that coders are getting through this software. 

Dave Bittner: Right. And I suppose, I mean, something like GitHub Copilot would make me less likely to go to the bookshelf and look for that book that had the coding examples. I wouldn't need to do that, so I might not need to buy that book. 

Ben Yelin: Right. As long as the very first person who reads that book inputs that code into the machine learning and that becomes part of the predictive algorithm... 

Dave Bittner: Yeah. 

Ben Yelin: ...But somebody had to put that original code in the system in the first place. 

Dave Bittner: Right? 

Ben Yelin: Yeah. But that's another really interesting point. 

Dave Bittner: Yeah. All right. Well, our thanks to Micah for sending that in to us. We got another kind note from a listener named Steve, who writes, (reading) in the U.S., medical equipment is controlled and approved by federal agencies, and this approval includes a stipulation that no updates or changes can be made. When managing these devices, we are legally prohibited from applying the patches. When the system creator wants to add patches, they need to go through the approval process again for that gear. So when running security, we rely on isolation of these systems. When I worked in health care eight years ago, we had systems running all the way back to Windows 95 unpatched on the network that had to be isolated to only the required communications, but no patches or changes allowed. 

Dave Bittner: That's a really interesting insight. 

Ben Yelin: Yes, it is. And it seems like a big problem to me. I understand it because this is a heavily regulated industry for good reason. You don't want a pacemaker that's going to malfunction inside somebody's body. That would have very dire consequences. 

Dave Bittner: Right. 

Ben Yelin: But as a result of that regulation, they're not able to easily patch the software, and that's also a problem. You have to weigh the risks of under-regulation versus the risks of overregulation to the point that you're using Windows 95 software. 

Dave Bittner: Right (laughter). 

Ben Yelin: And that's really a problem that Congress should solve. I mean, they're the ones who should consider all these issues. I'm hoping that the PATCH Act, which we discussed on our last episode, helps to address this shortcoming. But it's just really interesting to have a listener describe how this works... 

Dave Bittner: Yeah. 

Ben Yelin: ...In the real world. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: No, I appreciate it. And Steve, thank you for sending that in to us - some really interesting insights. We would love to hear from you. Our email address is caveat@thecyberwire.com. 

Dave Bittner: All right. Ben, let's jump into our stories here. Why don't you start things off for us? 

Ben Yelin: So I've spent a lot of time over the past week on the new rage in chatbot AIs called ChatGPT. If you haven't tested it out yet, you should. It is brilliant and weird, as The New York Times says in the article that we're using for this segment... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...By Kevin Roose. Basically, you can type in instructions, and this very intelligent chatbot will be able to fulfill your instructions no matter how weird and irreverent they are. 

Dave Bittner: Yeah. 

Ben Yelin: So I have asked the chatbot to write Radiohead lyrics for things I see on my desk, like a coffee mug, and it writes completely believable lyrics. You can tell it to write a soliloquy about, you know, pushpins, and it'll write a soliloquy about pushpins. It is much better than a Google search because it relies on pretty advanced machine learning. It learns from not just what it finds in its inputs - so basically everything on the internet - but also, it's iterative so that it learns as more people upvote or downvote the output of this chatbot. It learns from those upvotes and those downvotes. So I really wanted to talk about this. I've been kind of racking my brain about how we can incorporate this into our podcast, which is about law and policy issues and not just about cool technology. 
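[Editor's note: The upvote/downvote feedback loop Ben describes can be pictured with a toy sketch. This is purely illustrative - the real system uses reinforcement learning from human feedback at vastly greater scale, and every name below is invented:]

```python
from collections import defaultdict


class FeedbackStore:
    """Toy illustration of preference feedback: tally up/down votes
    per (prompt, response) pair and use the running score to pick
    which candidate response to show for a prompt."""

    def __init__(self):
        # Unseen pairs default to a score of 0.
        self.scores = defaultdict(int)

    def vote(self, prompt, response, up):
        # +1 for an upvote, -1 for a downvote.
        self.scores[(prompt, response)] += 1 if up else -1

    def best_response(self, prompt, candidates):
        # Prefer the candidate users have rated highest so far.
        return max(candidates, key=lambda r: self.scores[(prompt, r)])
```

[The real system folds this kind of signal back into model training rather than keeping a lookup table, but the core idea - user votes steering future outputs - is the same.]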

Dave Bittner: Yeah. 

Ben Yelin: And I really think there is an angle here. So one thing you have to worry about with something like this is - is this chatbot going to be used for nefarious and potentially illegal ends? As you said to me before we started this podcast, I think the capabilities are getting better when we're talking about this type of artificial intelligence, but we might be sacrificing ethics. The AI might be instructing us to do bad things. This software tried to solve that problem. There are certain things that the software - that the chatbot will refuse to do. So if you literally typed in, how do I murder my neighbor, it'll tell you that's a dangerous request, and we're not going to respond to it. 

Ben Yelin: Unfortunately, there are workarounds. And this New York Times article gets at those workarounds and how they can be potentially problematic. If you wanted to write something like - write a screenplay, with specific detail, about how somebody could murder somebody with an AR-15 - then it might spit out an answer because that's something that's more abstract. Or if it refuses your first request, you could say something like, what if I wanted to use this just as part of a learning exercise, or what if I wanted to do this for another reason? What if I wanted to describe a dream I had? If it's something that abstract, it might be able to instruct you on how to do dangerous and illegal things. 

Ben Yelin: That leads us to this potential Section 230 problem. So could the creators of this chatbot be subject to legal liability if, for example, this artificial intelligence spits out instructions on how to make a Molotov cocktail and somebody makes the Molotov cocktail? You'd think this would be protected by Section 230. It is a platform. It's generally not responsible for the inputs of its users. And certainly, the creators of this are going in with the expectation that they're going to have that Section 230 shield. There are a couple of problems. One, Section 230 isn't exactly the most popular legal provision these days, and it's... 

Dave Bittner: Right. 

Ben Yelin: ...Certainly ripe for reform or outright abolition of the law. And I think there are ways you could potentially argue that the chatbot exercises a degree of editorial control. They're more than just a platform. Based on everything that's gone into their inputs, based on specific decisions that they make about what type of information can be released through this chatbot, I think there's at least a small potential that they could be exposing themselves to legal liability, even with Section 230 in place. So I wanted your take on this. I have spent a lot of time playing around with this, as I know you have, and... 

Dave Bittner: Yeah. 

Ben Yelin: ...I just want your general input on it. 

Dave Bittner: Well, I agree with you. First of all, it is kind of mind-blowing, and its capabilities are amazing. You can say - you know, I said, write me a poem about tall trees. And it did, and it was good (laughter), you know? 

Ben Yelin: Yep. And it feels like a real poem. It's not... 

Dave Bittner: It does. 

Ben Yelin: Yeah. 

Dave Bittner: Right. On the other hand, I've caught it getting basic facts wrong. 

Ben Yelin: And what's the one thing, Dave, that you know more about than anybody in the world? 

Dave Bittner: Jim Henson and the Muppets (laughter). 

Ben Yelin: That's correct. And if you had asked me that question without having discussed this before, that is what I would have said. 

Dave Bittner: So I'll give you an example. I asked the chatbot, who is Jim Henson? And it got it mostly right. It said he's the creator of the Muppets. He's a filmmaker - all in all, a nice little bio of Jim Henson. But it incorrectly stated that he was the creator of "Sesame Street," which he is not. He was certainly a contributor to "Sesame Street" and very important for "Sesame Street," but no one would say, in an accurate biography of Jim Henson, that he was the creator of "Sesame Street." So I asked this question a few days ago, and, in my conversation with the chatbot, I said, Jim Henson was not the creator of "Sesame Street," and the chatbot responded and said, Jim Henson was the creator of the Muppets. OK, fine. That's true. So a couple days later, I asked it again, who is Jim Henson? I got back a differently worded response with most of the same information, but it still said, with confidence, that Jim Henson was the creator of "Sesame Street." Now, this brings up a criticism. 

Ben Yelin: Must be like nails on the chalkboard to you, I mean... 

Dave Bittner: It is. It is. 

Ben Yelin: Yeah. 

Dave Bittner: Yes (laughter). But it brings up what I think is a fair criticism, which is that this system is so good at seeming to know what it's talking about, right? It was designed to trick you. It's designed to look good, to sound good, to be convincing as an entity with which to interact. But, as we mentioned, ethics - and, for that matter, plain facts - seem to come second. 

Ben Yelin: That's going to - I mean, that is going to get in the way of somebody's high school or college essay. 

Dave Bittner: Right. 

Ben Yelin: 'Cause that's really the future of this, and it's more of an ethical problem than it is a factual problem. But you could say, you know, write an essay comparing this Shakespeare character to maybe this more modern character in a book you've never heard of. And the chatbot would be able to do that. 

Dave Bittner: Yeah. 

Ben Yelin: Getting facts wrong is certainly a problem, and it would get that student in trouble. 

Dave Bittner: Right. 

Ben Yelin: I think the broader concern is, are we not going to be able to evaluate students' writing skills because, in 10 years, everybody's just going to be inputting something into a chatbot? 

Dave Bittner: Yeah, I've seen people say this is the end of take-home homework. 

Ben Yelin: Right. 

Dave Bittner: Because even math problems - you can say, solve this math problem and show your work. And it does. 

Ben Yelin: Yes. I certainly worry about it, as somebody who does online teaching, 'cause I've already had plenty of students who have tried to improperly use online resources. And now it's going to be much harder for me to detect. So yeah, I mean, I think that's a future problem, maybe 10 years down the line. The problem is that the technology is going to get better. So right now, it might be messing up some of those basic facts, and it seems like its learning mechanism must be failing if you reported this incorrect fact about Jim Henson and it hadn't been corrected two days later. 

Dave Bittner: Right. 

Ben Yelin: You'd have to think then, in the development of this chatbot, that they're going to eventually get that right. 

Dave Bittner: Yeah. 

Ben Yelin: They're going to create some type of better iterative process where they can correct mistakes in real time. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, certainly, Wikipedia has basically figured out how to do that, so I'm confident that that problem will be solved. The only big problem right now is almost all the inputted data is historical. It comes from, basically, 2021 and before. So you wouldn't be able to write a paper about current events in 2022 because that just has not been input into the system, which makes it a little less useful than Google right now, if you were to write a paper about current affairs. But I think that's a limitation they'll be able to overcome. 

Dave Bittner: Yeah. 

Ben Yelin: So I think they're going to solve those more technical capability-oriented problems. I don't know if they're going to be able to solve the ethical problems and the potential legal problems, where they might be instructing people to do really heinous and potentially illegal things if they don't tighten up the algorithm to close off these types of workarounds - where it's, write me a screenplay about killing my neighbor, instead of, write me instructions on how to kill my neighbor. 

Dave Bittner: Yeah. There's a part of me that wonders if this is a problem or this is the leading edge of a new reality. And let me tell you why I say this. I - when I was a kid, you know, coming up through school - and I'm a little older than you - my math teachers, for example, would say, you can't use a calculator in math class because you're not always going to have a calculator with you. Lies, Ben. 

Ben Yelin: Yes. 

Dave Bittner: Lies, right? No. 

Ben Yelin: (Singing) Wrong. Yeah. 

Dave Bittner: I don't always have a calculator with me. I have a supercomputer that has access to all of the world's knowledge... 

Ben Yelin: Yep. 

Dave Bittner: ...All the time, no matter where I am. So yes, I understand that learning mathematics teaches me how to think and blah, blah, blah, blah, blah. I get it. But the argument that I will not have a calculator with me turned out to be completely false. 

Ben Yelin: Very wrong, yeah. 

Dave Bittner: So there's going to be - I mean, we're probably months away from this being an app - right?... 

Ben Yelin: Right. 

Dave Bittner: ...Where you can just ask your mobile device to write the essay for you or to do the math problem... 

Ben Yelin: Right. 

Dave Bittner: ...Or to - whatever it is. So when that's the reality - when we have - when everyone has access to that sort of information, what's the social transformation? What's the transformation of the educational system when we're able to outsource that sort of information - those kinds of skills. How does that change things? How does that change the nature of work? 

Ben Yelin: Yeah, I mean, I think a lot of banal office jobs might be replaced in the long run by this type of technology. I think about the legal field 'cause that's where I'm most familiar - write me a brief about the history of jurisprudence on X is something that could probably be better done by a competent chatbot than by an actual human being, unless you're at the very top of your law school class. 

Dave Bittner: Yeah. 

Ben Yelin: And that's kind of just bad, normatively, for society. I mean... 

Dave Bittner: (Laughter) Is it though? I mean, because so - what if you - you're still going to have to have somebody fact-check it... 

Ben Yelin: Seemingly. 

Dave Bittner: ...Seemingly... 

Ben Yelin: Yeah. 

Dave Bittner: ...Unless you have another bot that, you know... 

Ben Yelin: Fact-checking bot, yeah. 

Dave Bittner: Right - now, who watches the watchmen. But is someone's time better spent fact-checking than doing the original writing - the drudgery of the original research and writing? 

Ben Yelin: Yeah, I mean, maybe. And I think people have said what I just said with all different types of technology. Like, if we automate this, all these people are going to lose their jobs, and it's not going to be a real human being doing this. And, you know, there have certainly been some pains when that's happened with other types of technology, but I think most of us would agree that that has been a worthwhile tradeoff because we've made our own lives easier. 

Dave Bittner: Right. 

Ben Yelin: I've heard people say that this is going to be the biggest technological innovation since the iPhone. I'm not sure that's wrong, having experimented with this. For example, one thing I inputted before our show today was, write a podcast transcript about cybersecurity laws. I kind of think it did a better job than we did in producing this show. 

(LAUGHTER) 

Ben Yelin: It was that good. 

Dave Bittner: (Laughter). 

Ben Yelin: I just hope we, ourselves, don't get automated. 

Dave Bittner: Right. 

Ben Yelin: Yeah. 

Dave Bittner: Right. 

Ben Yelin: I have to, you know, add a little bit more color and personality so that I, myself, can't be replaced by a chatbot. 

Dave Bittner: Well, but this - you could say, write this in the style of Ben Yelin, and it'll do it. 

Ben Yelin: It will do it if I was well-known enough that I'd be included in the inputs in the chatbot system. You know, I've actually done that with more prominent writers, analysts, political writers. You know, you can pick any columnist that you like to read - let's say it's David Brooks of The New York Times - and say, write a David Brooks column about humidifiers. I'm just looking at things I see around the room - and... 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: ...It would do a great job at writing a David Brooks column about humidifiers. I know I'm kind of extolling the virtues of this while also worrying about the longer-term implications, and I actually think both of those instincts are valid. 

Dave Bittner: Mmm hmm. Yeah. How do you know what's real and what's not - what's artificially generated or not? Because this is creating things on the fly in a unique way each time, what are the tells? How are you going to know? Right now - you're a teacher, right? You have plagiarism tools at your disposal to find out if your students are not doing the right thing. What are the tools going to be to prevent this? Is that possible? Or does the future mean we assume that everyone has access to this and we adjust how the world works because of it? I don't know how to answer that. 

Ben Yelin: I think that's possible. 

Dave Bittner: Yeah. 

Ben Yelin: I think it's possible in the long run. Again, they're going to have to perfect some of the imperfections... 

Dave Bittner: Yeah. 

Ben Yelin: ...And you've identified one of them. But they have smart enough people that I think that's eventually going to happen. 

Dave Bittner: I did see a really interesting example of this, where there was someone who was a coder - a software developer - who had a family member, I believe, who had trouble with written communications. This was someone who was a small-business owner - had, like, a landscaping business - fine with interpersonal communications. This is not someone with a developmental disability... 

Ben Yelin: Right. 

Dave Bittner: ...Or something like that. Just really struggled with written communications. And so what they did was they created an automatic pathway for this person to send an email to a Gmail account. The Gmail account would then feed the contents of that email into this chatbot and ask the chatbot to return a properly worded, polite business email. 
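[Editor's note: The pipeline Dave describes could be sketched roughly as below. This is a hypothetical reconstruction - `ask_chatbot` stands in for whatever chatbot backend the developer actually wired up, the prompt wording is invented, and the real Gmail plumbing (IMAP in, SMTP out) is omitted:]

```python
def build_prompt(raw_text: str) -> str:
    """Wrap the sender's rough draft in an instruction for the chatbot."""
    return (
        "Rewrite the following message as a polite, properly worded "
        "business email, keeping the meaning unchanged:\n\n" + raw_text
    )


def rewrite_email(raw_text: str, ask_chatbot) -> str:
    """ask_chatbot is injected as a callable so the chatbot backend
    can be swapped out (or stubbed in a test)."""
    return ask_chatbot(build_prompt(raw_text))
```

[In the full setup, a script watching the Gmail inbox would call `rewrite_email` on each incoming message and mail the polished result back to the customer.]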

Ben Yelin: That's really useful. I mean, that's going to be really helpful. 

Dave Bittner: Right. 

Ben Yelin: Yeah. 

Dave Bittner: Right. And so this person, who, again, struggled with just the written word, now, in their daily business goings-on, has beautifully written interactions with their customers. That's really helpful. 

Ben Yelin: It is. And then you worry about, well, what happens when you get into the real world and you're meeting somebody face to face, and they realize that this was all a facade. You really don't have social skills. 

Dave Bittner: Well, but that's - but that was part of the point here. The person who did this said this person does have social skills. 

Ben Yelin: They just can't express it in writing, yeah. 

Dave Bittner: They're just - yeah, writing is the thing that is their problem. If you're face-to-face with this person, you would have no idea that they had a writing issue. So isn't that fascinating? 

Ben Yelin: It is. Yeah. I mean, I think the potential here is limitless, if we can improve accuracy - things like instant medical diagnosis or, you know, ways of developing a good Thanksgiving recipe without doing the type of specific Google search you'd have to do, like, give me a Brussels sprouts recipe. You could say, concoct the perfect Thanksgiving dinner, and that's something that would be really awesome. 

Dave Bittner: Yeah. 

Ben Yelin: I just - with these limitless opportunities comes some level of trepidation and risk. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: I guess that's a big part of it - isn't it? - that when we're faced with something that has the vast capabilities that this seemingly has - and I don't know about you, but I'm not 100% sure how much of it is just pulling the wool over my eyes - right? - how much does it really know, and how much of it is just a really good con man, you know? Is it convincing me that I really need to buy that monorail? You know, I don't know. And so that makes me uneasy. 

Ben Yelin: Is the chatbot Lyle Lanley? 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: It just makes me uneasy. 

Ben Yelin: Well, it put North Haverbrook on the map, so... 

Dave Bittner: (Laughter). That's right. It's more of a Shelbyville idea (laughter). 

Ben Yelin: It's more of a Shelbyville idea. 

Dave Bittner: All right, well, I think you and I could probably talk about this all day. 

Ben Yelin: We could. I just saw somebody post on Twitter - here's a Ben Shapiro monologue attacking Joe Biden for falling into a vat of soup. 

Dave Bittner: OK (laughter). 

Ben Yelin: And it sounds like Ben Shapiro, and it's really funny. 

Dave Bittner: Yeah. 

Ben Yelin: So, I mean, it's certainly been a week's worth of fun entertainment and interactions for me and my friends. 

Dave Bittner: Yeah. Yeah. Just be careful what you ask for, right? 

Ben Yelin: I know, I know. 

Dave Bittner: But it's so exciting, too. 

Ben Yelin: I know. 

Dave Bittner: That's the thing about it. It's darn fun. 

Ben Yelin: It's a land of contrasts. 

Dave Bittner: (Laughter) That's right. 

Dave Bittner: All right. So my story this week comes from The Washington Post. This is an article written by Rachel Weiner, and it's titled, "Threatened With Jail for Live-Streaming Traffic Stop, He Sued." And this is about a gentleman named Dijon Sharpe, who was part of a traffic stop. And during the traffic stop, he opened Facebook Live and started streaming the traffic stop. And the police officer was not pleased with this. He said, in the future, if you want to Facebook Live, your phone is going to be taken from you. And if you don't want to give up your phone, you'll go to jail. And Mr. Sharpe questioned this, wondering whether it was the law. He did not believe this was the law. And this case is going before the U.S. Court of Appeals for the 4th Circuit. 

Dave Bittner: Now, this is something you and I have talked about before - about your right to record police when they're doing their day-to-day business - things like traffic stops or people's interactions with them. But this case of streaming has not come up yet. And in this article, they're saying this is pretty interesting, and this is going to have some consequences here. What do you think about this, Ben? 

Ben Yelin: It's a fascinating legal issue, and I think the couple of experts they spoke to for this article were correct in saying this is a type of novel issue relating to new technology, where there's not an easy resolution. So there are really two constitutional issues here. There is the Fourth Amendment right against unreasonable searches and seizures. So that relates to whether law enforcement could actually seize your device or search it, which would naturally put an end to the Facebook Live livestream. And then there's the First Amendment implication. I mean, this is a - would seemingly be a First-Amendment-protected activity. You know, this is your right of speech, your right of expression. You're filming this interaction to make a type of political argument that maybe there's something wrong with standard police interactions in certain communities, and that would be a First-Amendment-protected activity. 

Ben Yelin: I think there are a couple of ways that law enforcement might end up winning this case in front of the 4th Circuit. The first is, from a Fourth Amendment perspective, generally, law enforcement has a lot of leeway if there is a valid traffic stop. So if they pull you over for having a busted taillight, they can generally search your car without having separate probable cause that something is amiss. If they notice the smell of marijuana or a broken beer bottle on the floor of the passenger seat, they could be justified in searching and potentially effectuating an arrest. So cops do have a lot of leeway in traffic stops generally, and that might help them in these particular circumstances. The other is, if we have a First Amendment problem or really any constitutional concern, that's not the be all and end all. And I'm not going to use the fire in a crowded theater thing, 'cause you know how much I hate that. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: The better way of explaining it is you can actually abridge people's constitutional rights if there is a, quote, "compelling state interest" and if that - the means of achieving that interest are narrowly tailored to the ends. And it's possible you could make an argument that, when we're talking about the compelling needs of law enforcement to maintain order, to protect against criminal activity, that might be seen as a compelling interest that would allow some inhibition on First Amendment rights. So I don't think - you know, even though, personally, I think, instinctively, we want to give people the ability to record these interactions, and it makes sense under the First and Fourth Amendment that they should be able to do that, I think there are ways that law enforcement still might win this case at the Fourth Circuit. 

Dave Bittner: Yeah, one of the things they point out in this article is that police are making the case that it could put them in danger from people knowing where they are when the traffic stop, let's say, is happening. I guess I have a hard time with that argument because, presumably, the police car is going to have its lights on... 

Ben Yelin: Right. 

Dave Bittner: ...Saying, look, it's us. It's the police. Here we are, you know? 

Ben Yelin: Yeah. Unless it's an undercover operation, yeah... 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: ...I think it should be pretty obvious. 

Dave Bittner: Right. But the other thing is - I guess I'm trying to split the hairs here - the difference between the right to record police, which is pretty well established, and the right to stream police, which is what is at issue here. What's the difference? 

Ben Yelin: So one of the major differences in this case is the fact that it's a traffic stop. That's one of the distinguishing factors here. Traffic stops, according to decades of court precedent, are extremely - are just more dangerous than standard on-the-street stops. The person has a getaway vehicle at their disposal... 

Dave Bittner: Right. 

Ben Yelin: ...So they can easily drive away and avoid law enforcement. A car could potentially be used as a weapon. So because of the inherently dangerous nature of traffic stops, I think courts are more willing to tip their hand in favor of law enforcement. In terms of livestreaming versus recording, I mean, I think it gives an extra capability to the person being pulled over because it's happening in real time, and the person can represent the interaction as this is what actually happened. Nobody can suspect me of improperly editing this video to make law enforcement look bad because I'm livestreaming in real time. 

Dave Bittner: Right. 

Ben Yelin: And so that might make life easier in terms of a type of argument. I mean, think of the most high-profile cases of police misconduct and how every single second of those videos mattered in terms of the context. And I think livestreaming is more likely to give you that capability and prevent the type of selective editing that we've seen in certain cases. 

Dave Bittner: So let's say I'm walking down the street, minding my own business, and I come across a traffic stop. And I pull out my phone, and I point my phone at the - and I'm on - you know, I'm on the other side of the street, right? 

Ben Yelin: Right. 

Dave Bittner: I'm 40 feet away, and I aim my phone at the police officer going about their business. What happens next? Does the police officer say, hey, pal, are you recording... 

Ben Yelin: Bittner. 

Dave Bittner: ...Or are you streaming (laughter)... 

Ben Yelin: Yeah. 

Dave Bittner: ...right? 

Ben Yelin: Put away that phone. 

Dave Bittner: (Laughter). 

Ben Yelin: I think, the way that the law is now, your First Amendment rights are generally well protected. I mean, we have enough case law that it is legal to film the police from a First Amendment perspective. I think what makes this case particularly vexing is it's the person that's getting pulled over, so they are involved in the law enforcement interaction. So I don't think you would have any problem as a bystander. They might ask you to stop. 

Dave Bittner: Right. 

Ben Yelin: They might do more than ask you. They might say, put away that phone, or there'll be consequences. And no matter what your constitutional rights are, you might be inclined to just not do that and get yourself in trouble. 

Dave Bittner: Yeah. 

Ben Yelin: But yeah, I mean, I think the distinction is that you're not in that dangerous weapon of a moving vehicle. 

Dave Bittner: Right. 

Ben Yelin: And it's the traffic stop that makes this particularly dangerous, so I think that's where - that's going to be the deciding factor in this case - is how to apply it to a traffic stop, where, in other cases, we've given a lot of leeway to law enforcement for these particular types of searches. 

Dave Bittner: Hmm. All right. Well, we'll keep an eye on that one for sure. Again, that is from The Washington Post, and we will have a link to that in the show notes. 

Dave Bittner: Again, we would love to hear from you. You can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Alex Iftimie from Morrison Foerster. We're discussing the background of the government's response on the Department of Justice seizure of nearly half a million dollars in ransomware payments that were made to a North Korean hacking group that was targeting U.S. medical providers. Here's my conversation with Alex Iftimie. 

Alex Iftimie: In July, DOJ announced that they seized $500,000 from two virtual currency wallets used by North Korean ransomware actors. And we have the benefit in this case of a DOJ civil asset forfeiture complaint - essentially, the court documents that the Department of Justice is using to recover the funds that they were able to recover that tell an interesting story about a relatively new and rare North Korean ransomware group that targets the health care sector. It highlights some of the different ways victims respond to these kinds of attacks and also sheds some light on the ways that the FBI is able to track criminals using tools to analyze blockchain transactions. And so there's a lot to digest and dissect in what we've learned over the last couple months. 

Dave Bittner: Well, can you walk us through? What are some of the details here? 

Alex Iftimie: What we know is as follows. Last year, in May 2021, a Kansas hospital suffered a ransomware attack which crippled that hospital's access to some servers. Those servers were ones that were really important to them. They were servers that host the hospital's X-rays, CAT scans and MRIs. And the ransomware attackers left behind a ransom note that essentially said the hospital had 48 hours to pay, or the price would double. The hospital ultimately, in about two weeks' time, made the decision that they did have to pay. And they paid approximately $100,000 in bitcoin in two installments, and the attackers gave the hospital the decryption keys to unlock their servers. 

Alex Iftimie: We also know that that Kansas hospital contacted the FBI following the attack, and that, as a result of that notification, the FBI was able to identify what was previously an unknown ransomware - malware called Maui. The FBI then started to track the funds that the hospital had paid in bitcoin. The audience may be familiar with these bitcoin transactions. Every transaction is registered on the bitcoin blockchain. And using those investigative tools, the FBI was able to find a virtual currency exchange, where the ransomware payment was deposited to, and, in August of 2021, obtained records from that currency exchange through legal process. The FBI was then able to track payments to a second virtual currency exchange and ultimately identified that those accounts were accessed by a Hong Kong-based IP address that law enforcement presumed to be some money launderers who assisted North Korean cyber actors in cashing out bitcoin ransom payments into fiat currency. So that's - you know, that's one of the entities that we learned about. 
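The hop-by-hop tracing Iftimie describes - every payment's outputs are public on the ledger, so investigators can follow funds until they come to rest somewhere subpoenable, like an exchange - can be sketched in miniature. The sketch below is a toy illustration only: the ledger, spend index, and all names in it are invented for this example, and real chain analysis relies on full node data, commercial tooling, and address-clustering heuristics far beyond a simple graph walk.

```python
# Toy sketch of "following the money" on a public transaction ledger.
# All data structures and names here are hypothetical, invented to show
# the basic idea: each transaction's outputs are public, so a payment
# can be traced hop by hop until funds stop moving.

from collections import deque

# Hypothetical ledger: txid -> list of (destination_address, amount_btc)
LEDGER = {
    "ransom_tx": [("addr_A", 2.5)],
    "tx_2": [("addr_B", 2.4), ("addr_change", 0.1)],
    "tx_3": [("exchange_deposit_1", 2.4)],
}

# Hypothetical spend index: address -> txid that later spent its funds
SPENT_BY = {
    "addr_A": "tx_2",
    "addr_B": "tx_3",
}

def trace(start_txid):
    """Walk the transaction graph from a starting payment and collect
    the terminal addresses where funds came to rest."""
    terminals = []
    queue = deque([start_txid])
    seen = set()
    while queue:
        txid = queue.popleft()
        if txid in seen:
            continue
        seen.add(txid)
        for addr, amount in LEDGER.get(txid, []):
            next_tx = SPENT_BY.get(addr)
            if next_tx is None:
                # Funds not spent onward: a resting point investigators
                # could pursue with legal process (e.g., an exchange account).
                terminals.append((addr, amount))
            else:
                queue.append(next_tx)
    return terminals

if __name__ == "__main__":
    print(trace("ransom_tx"))
```

In this toy run, the ransom payment ends at a small change output and a deposit address at an exchange - the point where, as in the Kansas case, legal process against the exchange can put a name to the account.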

Alex Iftimie: The second is a Colorado medical provider that also made a ransom payment to this group - approximately $120,000 in bitcoin. And a similar story here - the FBI was able to follow the money, essentially, and trace the transactions from this medical provider through to the criminal group, although one notable difference here is that, it appears, based off of the court documents, that the FBI went to this Colorado medical provider rather than the medical provider voluntarily disclosing the incident to law enforcement. 

Dave Bittner: You know, we hear about folks who are up to this - you know, using things like tumblers to try to hide the transactions or obfuscate them. I mean, to what degree does this speak to the FBI and the Department of Justice being able to unwind these things, and, you know, does that put the criminals on alert that perhaps it's not as easy to launder the money as they thought it was? 

Alex Iftimie: David, I think that's exactly right. I mean, this case, like other ransomware recoveries - and there have been a number of them this year alone - they continue to highlight that the Department of Justice really does have the ability to follow the money on the blockchain - that there is - that anonymity is not foolproof, even when you're talking about these virtual currencies. And the court documents in this case show how the FBI, step by step, was able to follow the money from the victims through to the accounts where they were held and, in this case, able to - the investigators were able to use legal process to track down who owned these accounts and ultimately to seize the funds and get that money back into the hands of victims, as opposed to the crooks who are trying to - you know, to steal these funds for their own purposes. 

Dave Bittner: Now, when you say they're able to use, you know, legal processes, I mean, is this an international effort? Is this, you know, dealing with our allies to be able to track these sorts of things down? 

Alex Iftimie: It is something that involves a lot of international partners. So criminals are wise enough to know that, if they leave these assets in the United States, they're more likely to be caught or - you know, or leave them in the possession of U.S.-based entities. And so they often try to move these funds to jurisdictions that they think are going to be more favorable. Increasingly, we're seeing that U.S. law enforcement has been able to partner with law enforcement in other countries to get records from those countries - to get records from virtual currency exchanges or other institutions in those countries and ultimately to effect arrests. We have seen law enforcement track down people responsible for laundering money, people responsible for developing the malware and deploying ransomware against victim networks. 

Alex Iftimie: And so all of this is part of a strategy of the department to raise the costs of engaging in this activity and hopefully to deter those who would think about entering the ransomware space from doing so. The reality is that a lot of - there are a lot of well-educated individuals who see this as cost-free activity in certain countries. And the goal of the Department of Justice's work is to really see - make those individuals see that there are costs to their activities and to make them choose another life, hopefully, before they get too far down the path. 

Dave Bittner: What insights can you share with us in terms of the Department of Justice's choices in how they went about this? The - you know, the fact that they didn't announce the civil enforcement action - you know, those sorts of things. What does that speak to, in terms of how the DOJ is choosing to come at these events? 

Alex Iftimie: What I think it shows is that the Department of Justice isn't just interested in investigating crime that has already occurred and charging that type of crime and hopefully finding the people responsible and holding them accountable. Certainly, that is an important function for the Department of Justice, but it's not the only one. And what we've seen here is that they've adopted a strategy to disrupt these actors from being able to use their capabilities - from being able to take advantage of their ill-gotten gains and to not just play defense, but also to play some offense. And some of those capabilities include, as you note, using search warrants and court orders to seize stolen funds. 

Alex Iftimie: They've also been used to seize domains, command and control servers, other infrastructure that's used by these cybercriminals with a goal of disabling their ability to conduct these types of attacks against the U.S. private sector and really the global private sector. And we have seen the Department of Justice use these kinds of seizure tools to take down global botnets, which - some of them are controlled by state-sponsored groups, others by criminal groups. We've seen them use search warrants coupled with other technical tools to remove malware from victims' systems to help victims avoid the type of compromise from these sophisticated actors that the government is concerned about. And we've also seen, like we saw in this particular ransom case, the use of search warrants and civil asset forfeiture authorities to seize the cryptocurrency wallets, the stolen property and, where possible, to return it to victims. 

Dave Bittner: Do you suppose that this is making a dent? I mean, you know, this was - we're dealing with North Korea here, who, of course, you know, sort of has a unique position on the global stage. What do you think here? I mean, is this just low-hanging fruit, or is this - does this make a real difference on the global stage? 

Alex Iftimie: I think time will tell. It's - that is the key question. We have seen some early signs that this activity is helping. We have seen on the margins that fewer victims are paying ransoms, at least as a percentage of the victims that are being attacked. We have seen that the amount that is being paid by victims, on average, has gone down as well in recent months. The data is still out, though. I think we need to see a longer trend to really get a sense for what impact this - these government actions are having. And frankly, there may also be some other variables that are at play for why we're seeing a reduction. 

Dave Bittner: In this particular case, what about those health care organizations that got hit here? I mean, was it ultimately a happy ending for them? Did they get their funds back? Do we know? 

Alex Iftimie: Well, they are in the process of getting their funds back. The government has filed motions to forfeit those funds and to give them back to the victims. That process usually takes some time. The reality is, even if you get the money that you paid in ransom back, that really is only part of the cost of these incidents. There are impacts in terms of operational disruptions. We don't know what the impact of these systems being down was on patients who needed care from these medical institutions. And so there are costs that - where it's really hard to ever make victims whole from these ransomware attacks. And so part of what we need to do as a society, really, is to prevent these attacks from happening in the first place and to make organizations more resilient to these attacks, ensuring that they have the redundancies and the backup systems in place that, if you are the victim of one of these attacks, allow you to recover quickly and get back on your feet. Because the reality is, the ransom - the value of the ransom paid is only a small fraction of the ultimate cost of these incidents to the victims who experience them. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: It's really a fascinating story. I kind of am a little encouraged by the capabilities of the Department of Justice to access the blockchain and to block the financial transactions involving cryptocurrency. This isn't easy. And when you have - we're talking about a scale of millions of ransomware attacks, and they are increasing exponentially. Eventually, that's going to exceed the capabilities of the Justice Department, even in high-profile cases involving foreign actors. So, you know, I think we're going to have to add resources to the computer crimes division of the Department of Justice, but I found it to be interesting and encouraging. 

Dave Bittner: Yeah. Yeah, absolutely. 

Dave Bittner: All right. Well, again, our thanks to Alex Iftimie from Morrison Foerster for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.