Caveat
Ep 106 | 12.23.21

Considering policy on ransomware payments.

Transcript

James Gimbi: When we're talking about whether or not victims are going to continue to pay, I think that on net they're absolutely going to continue to pay. And it's a little counterintuitive - right? - because it seems like people are going to follow rules. And generally, they may. But the problem is that, as a policy lever, bans really are only going to work well when the penalty outweighs the benefits of whatever behavior you're trying to prevent.

Dave Bittner: Hello, everyone, and welcome to Caveat, the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin, from the University of Maryland's Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses the expansion of the so-called Glomar response. I look at Google's email-scanning policies - and later in the show, James Gimbi, director at technical advisory firm MOXFIVE, on why he believes banning ransomware payments is bad policy. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's kick things off here. We got a little quick follow-up here from a listener. This came to us via Twitter. And this is from @brentdstewart. He wrote us and said, following the CDA Section 230 discussion on Caveat, providers like Facebook don't just show friends' updates in historical order. They intermix content, like Republican or Democrat fringe posts, to enhance stickiness. If they are exercising editorial control, does that still qualify for immunity? Ben, what do you think? 

Ben Yelin: Great question - goes into the heart of the Section 230 debate. Generally, Section 230 covers the activities of companies like Facebook whether they are simply the facilitator of posts or whether they're exercising some sort of editorial control - the way they are in the circumstances this person is describing. What's really interesting about this is Mark Zuckerberg went in front of Congress earlier this year and suggested that Section 230 be reformed so that it doesn't apply to Facebook or any other company's algorithms - so the idea being that Facebook is exercising some form of editorial control just by having the algorithm. They're moving posts around. They're illuminating posts that are particularly controversial, that might spark some sort of reaction. So that's an exercise of editorial control. And therefore, in his view, you know, Facebook should be held liable in those circumstances. 

Dave Bittner: Is that like, you know, Br'er Rabbit saying, don't throw me in the briar patch? 

Ben Yelin: That's exactly what it is. 

Dave Bittner: (Laughter). 

Ben Yelin: I mean, he knew that the political pressure was coming. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: He knew that Facebook was in the wringer from all sides of the political spectrum about Section 230, with, you know, people on the political right saying that it's - you know, it's not fair that Facebook censors conservative voices; they should be held liable - and people on the left saying they're fostering, you know, disinformation about vaccines. They're, you know, inspiring insurrections against our government. They need to be held accountable. Zuckerberg, knowing that, went in and said, all right, I have a solution that, you know, is an interesting half-step - it will preserve our immunity for everything else. But as it relates to algorithms, you know, sure, why not? You know, and I think... 

Dave Bittner: (Laughter) OK. 

Ben Yelin: ...A couple of lawmakers saw right through that. The ones that are proposing some of the more significant 230 proposals are saying, let's not be reeled in by Zuckerberg here. I think what he's trying to do is satisfy the need for some sort of reform while only carving out a narrow exception to Section 230. So opening up liability for the use of algorithms would probably be a step in the right direction. But Facebook makes a lot of other editorial decisions. And depending on the contours of the proposal, I'm not sure that those would be covered. So the answer to your question - generally, right now they are immune from suit under Section 230. But we'll see what happens once Congress gets its hands on some reform proposals. 

Dave Bittner: All right. Well, thank you to our kind listener for sending in that thoughtful question. Of course, we would love to hear from you. You can email us. It's caveat@thecyberwire.com. 

Dave Bittner: All right, Ben, let's jump into some stories. Why don't you start things off for us? 

Ben Yelin: So mine comes from the Lawfare Blog - frequent source of ours - from Christina Koningisor. And it's about secrecy creep. So I'm sure you've heard of the so-called Glomar response. You send a FOIA request to the CIA, the FBI, any three-letter agency saying, I want information on X, and they don't want to tell you information on X. They will send you a very curt reply saying, we can neither confirm nor deny the existence of X. I have to admit, I wasn't too familiar with the origin story of the Glomar response - it comes from the late 1960s, when the CIA discovered a sunken Soviet ship. And the CIA and the rest of our government apparatus built a separate ship outfitted with a giant claw. 

Dave Bittner: (Laughter). 

Ben Yelin: I am totally serious about this. 

Dave Bittner: As you do. 

Ben Yelin: I'm picturing the ones at the arcade to grab the stuffed animals. 

(LAUGHTER) 

Dave Bittner: Yeah. 

Ben Yelin: You just move the little joystick and press the button. And it... 

Dave Bittner: I don't know why the image of Dr. Doofenshmirtz just came in my mind. 

(LAUGHTER) 

Ben Yelin: Yeah. No, you press the button, and the Soviet ship comes up, and you've got some valuable intelligence. 

Dave Bittner: Yeah. 

Ben Yelin: So obviously, this, you know, led to the CIA coming up with some sort of cover story - they convinced Howard Hughes, an eccentric billionaire at the time, to claim he built the ship to mine valuables from the sea floor. 

Dave Bittner: Ah, OK. 

Ben Yelin: And so they named the vessel the Glomar Explorer, which was, of course, a misdirection. So that became the preferred nomenclature for this so-called Glomar response. You see it all the time when we're talking about federal agencies. What's happened now is the Glomar response is filtering down to state and local law enforcement agencies. And according to this author - and I completely agree with it - that's particularly problematic. The reason it's used at the national level is because we're talking about national security. You know, there are constitutional provisions, particularly Article 2, saying that the president of the United States is the commander in chief, protecting the country from enemies, foreign and domestic. You know, there is a national security role for the government that would more justify this type of secretive response. 

Ben Yelin: The other difference with the federal government is that we have inspectors general inside these agencies who can do reports, you know, providing oversight on how these programs are working. And Congress meets for the whole year. They have oversight committees. They can make sure that - or at least in theory - that these agencies aren't abusing their authority. 

Dave Bittner: Right. 

Ben Yelin: None of that really exists at the state level. So you don't have those strong inspector general departments. And frankly, you know, state legislatures are a little bit weaker. I know here in Maryland, they only meet for three months. That's true in other states as well. So you don't have that same type of oversight. But all of these state and local law enforcement agencies, particularly - and this is why it's relevant to us - when they're asked about the use of modern surveillance technology, they are giving the Glomar response. So... 

Dave Bittner: Where are - what does that mean? Where are they - what's the typical response? 

Ben Yelin: We can neither confirm nor deny the existence or the use of that piece of technology. 

Dave Bittner: And how is that the get-out-of-jail-free card for, you know, a - like a Freedom of Information request at the state level? 

Ben Yelin: Well, I mean, it depends on the state. Sometimes it might not legally be an impediment to collecting that information. Sometimes, you know, the state will have a statutorily protected interest in the information. But at the very least, it ties it up in litigation. And litigation takes a long time. That delay might push the oversight and discovery of such a surveillance program, you know, beyond the point where it would be useful to the public. So that's one particular concern. You know, also, it's going to be much harder to file a lawsuit. The only lawsuits we've seen that have been successful against mass surveillance practices have been where people are able to establish standing. And to establish standing, you have to show that it's certainly impending that you yourself were the victim of some type of modern surveillance tool. 

Ben Yelin: You know, the federal government - and if there's a firestorm of controversy about something - sometimes will admit that they use a certain surveillance technology. And that might enable a plaintiff to establish standing. You know, at the state level, if state agencies are constantly using this Glomar response, that can prevent people from being able to properly allege in court they're the victim of these surveillance practices. And so that could have - I know this is an overused term, but a chilling effect on constitutionally protected rights. You know, if you don't know what surveillance tools your own state and local governments are using, that might make you less likely to, say, show up to a protest. If they're using artificial intelligence, facial recognition technology, license plate readers and all you're getting when, you know, the media is going to these local agencies is, can neither confirm nor deny, you might think twice about your own First Amendment-protected activity. 

Ben Yelin: And the other element of this is, you know, at the national level, we have news agencies - The New York Times, The Washington Post. They will dig into these things. 

Dave Bittner: Right. 

Ben Yelin: If they discover an illicit surveillance method, you'll get some articles. You and I will probably talk about them. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: If it happens in Bumble-you-know-what, Pennsylvania... 

Dave Bittner: (Laughter) Right. 

Ben Yelin: ...You and I - it's probably not going to come on our radar. And it's - you know, local media has really kind of been decimated over the past half century. 

Dave Bittner: Yeah. 

Ben Yelin: So it's less likely that we're going to get accountability in that respect. So I just thought it was a really interesting exploration of how Glomar, which started in the national security context at the federal level, is now filtering down to these state and local agencies. 

Dave Bittner: How do we push back on this? Is this - I mean, do you - is this the kind of thing where you call your local legislator and say, hey, I'm not OK with this? 

Ben Yelin: Yeah. I'm actually much more optimistic about calling one's local legislators. I really do think they listen to their constituents. First of all, they generally have fewer constituents than a federal representative. 

Dave Bittner: Right. 

Ben Yelin: And it's just easier to get laws passed at the state level. You don't have to go through, you know - generally, these legislatures aren't as polarized. You don't have to go through the wringer of our Congress critters where there's a filibuster and a bunch of these other veto points. Legislators are less controlled with an iron fist by leadership like the speaker of the House or the president of the Senate. So you are able to actually get things done. And I think the most effective thing here is to go to a state legislature and say, you can stop these types of Glomar responses by statute, passing something that says the government has to divulge this information if it receives the equivalent of a FOIA request. And we've seen laws like that passed in different jurisdictions across the country. I know New York City has done that in a couple of circumstances. So it can be done. And this is something where I do think contacting your legislator might actually make a big difference. 

Dave Bittner: All right. Well, we will have a link to that story in the show notes. My story this week comes from Forbes. This is written by Thomas Brewster, and it's titled "Google Scans Gmail and Drive for Cartoons of Child Sexual Abuse." Child sexual abuse material - CSAM is the shortened version of that - has certainly been in the news a lot - I think most recently because of the controversy surrounding Apple's plans to search for it. And, of course, Apple has dialed down those plans. 

Ben Yelin: In response to our podcast, I'm sure. 

Dave Bittner: (Laughter) Yes - and only our podcast. Yes. But this article has to do with the degree to which Google scans both their email service, Gmail, and their Google Drive service for this sort of material. It's not surprising to anyone that they scan for this sort of thing. What this article points out that I think is interesting is that they're scanning for not just photographs, not just videos, but artistic depictions of... 

Ben Yelin: Cartoons. 

Dave Bittner: Cartoons of these sorts of things - and, of course, let's just say at the outset that it's hard to imagine anything more horrific than child sexual abuse material, right? 

Ben Yelin: Right. 

Dave Bittner: We can all - I think that's something that everyone can agree on. What's interesting here is that someone's account got flagged for potentially having these sorts of images. This article describes it as digital art or cartoons of underage boys engaged in sexually explicit conduct or engaging in sexual intercourse. But no one got arrested. No charges have been filed because when they dug into this, they discovered that this was an artist. This was a legitimate artist - someone who had won art awards, who was recognized for his art. This article does not reveal that person's name because they've been charged with no crime. 

Ben Yelin: Right. 

Dave Bittner: They point out that the laws against this sort of thing outline that they have to prove that the relevant images were obscene or lacked serious literary, artistic, political or scientific value. So... 

Ben Yelin: Comes from a Supreme Court case, by the way. 

Dave Bittner: OK, go on. 

Ben Yelin: Miller v. California... 

Dave Bittner: OK. 

Ben Yelin: ...Is where that test comes from. 

Dave Bittner: So I just want to unpack this a little bit, Ben - because I remember a couple - I don't know - decades ago - this is probably in the mid-to-late '90s, when computer graphics were starting to come into their own. And that was a world that I was involved with. And there was concern about - when it came to things like child sexual abuse materials, that people were able - were starting to be able to create photorealistic images of this sort of thing, but without any children being involved. 

Ben Yelin: Right. 

Dave Bittner: Right? And so the question was, what's the legality of that? If these laws are there to protect children - and rightfully so - from this sort of abuse, where are the lines? And so I wanted to unpack that with you. Is - where does our Constitution come down on this? Am I - is - I don't want to say am I because that gives me the creeps. 

Ben Yelin: Yeah. 

Dave Bittner: Is someone allowed to think about these sorts of things? Is that legal? 

Ben Yelin: It is. So child pornography itself - actual child pornography - is not protected under our First Amendment. That's one of the few carve-outs under our First Amendment. It enjoys no First Amendment protection. You can be prosecuted for distributing or using child pornography, obviously. 

Dave Bittner: Right. 

Ben Yelin: But anything that does have some sort of artistic value - if it doesn't involve the exploitation of actual children, then that generally is covered by our First Amendment. I think the rationale behind this carve-out for sexually explicit material of children is about the exploitation of children. I think the thinking is, if this is art, if this is a cartoon, you're not actually exploiting children. There were no actual children used to, you know, make these depictions. And that's true also when you have adult actors who are depicting children in sexually explicit material. I don't know how satisfying an answer that's going to be because I think a lot of people would still find the distribution of those types of things morally wrong. 

Dave Bittner: Yeah. 

Ben Yelin: But I think that is the way that the law sees them. You know, we have a very robust First Amendment. We want to protect the free exchange of ideas. We want to have a robust marketplace of ideas. It goes too far when you're exploiting children. But in all other contexts, you know, the Supreme Court has been very deferential to anything that might be incredibly controversial but have some sort of artistic, literary, political value, especially judged against, you know, contemporary values. 

Ben Yelin: So if it's something that's not so far out of the realm - and then they use a community standard. If it's something that the average person in your community would not find, you know, completely explicit, you know, beyond the line, something that should be restricted, these things are generally protected. I understand why Google is trying to go after this because I think these images - no matter what the law says - can still be exploitative, can still be damaging to children. 

Dave Bittner: Right. 

Ben Yelin: So just because the law is the way it is, that does - we're not casting, you know, a normative view on that particular legal doctrine. 

Dave Bittner: Yeah. This article also points out that Google releases a transparency report every year, and it says in the first six months of 2021, they found more than 3.4 million pieces of potentially illegal content in 410,000 reports that they passed on to - I believe these all go to the National Center for Missing and Exploited Children. And that was up from 2.9 million in 365,000 reports in the previous six months and double that from the six months before that. So... 

Ben Yelin: I've heard a theory on this, by the way. 

Dave Bittner: OK. 

Ben Yelin: Which is - and this was actually - I just had a student write about this in a paper. 

Dave Bittner: Yeah. 

Ben Yelin: But it has to do with the pandemic. More people are home and not only have more time to unfortunately create images like this but have more time to view them and distribute them. 

Dave Bittner: Interesting. 

Ben Yelin: Which is a disturbing theory but one I've certainly seen in some literature and scholarship. 

Dave Bittner: Yeah, yeah. So I - it's funny because it's hard to know: is Google placing their users under more scrutiny, or are there simply more things to be found? Hard to know. 

Ben Yelin: Yeah, I mean, I would guess it's really the latter. You know, Google, along with all the rest of these companies - most of them have reporting requirements to the National Center for Missing and Exploited Children. 

Dave Bittner: Right. 

Ben Yelin: You know, they are looking for these so-called hash values where you have an image that's been tagged as depicting child pornography. So they have a pretty robust system to get that reported. 
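To make the hash-value idea Ben describes concrete: providers compare a fingerprint of each uploaded file against a list of fingerprints of known abusive images. The sketch below, in Python, shows only the simplest exact-match version of that idea - production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the digest shown here is a placeholder, not a real list entry.

```python
import hashlib

# Placeholder stand-in for the hash list a clearinghouse such as NCMEC
# distributes to providers. This digest is illustrative only.
KNOWN_BAD_HASHES = {
    "0f343b0931126a20f133d67c2b018a3b",
}

def digest_of_file(path: str) -> str:
    """Compute a file's SHA-256 digest, reading in chunks to bound memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def should_report(path: str) -> bool:
    """True if the file's digest matches a known-bad entry."""
    return digest_of_file(path) in KNOWN_BAD_HASHES
```

Exact hashes only catch byte-identical copies - a single edited pixel defeats them - which is why perceptual hashing carries the real weight in practice.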

Dave Bittner: Yeah. 

Ben Yelin: So I really do think it's unfortunately a demand problem rather than, you know, Google going above and beyond to try and secure that information. 

Dave Bittner: Yeah. And I guess the take-home here, part of it, is that everyone needs to be aware that when you're using these services - any of these services, not just Google, be it email or any of these online file storage services - your files are being scanned for this sort of thing. That is a routine thing that's done these days. 

Ben Yelin: Absolutely. And, you know, I think that's obviously for a very legitimate law enforcement purpose. But yeah, people should be aware of that. It's something that these companies can do, and it's something that they do do. And, you know, whether it invades our privacy beyond our expectations - that's certainly a reasonable question. But you can't argue with the government's interest through this nonprofit, the National Center for Missing and Exploited Children, or the company's interest in keeping this type of material off of its, you know, networks. 

Dave Bittner: Right, right. All right. Well, again, the article is written by Thomas Brewster. That's over on Forbes, and we will have a link to that in our show notes. 

Dave Bittner: Ben, I recently had the pleasure of speaking with James Gimbi. He is director at the technical advisory firm MOXFIVE. And our conversation centers on his belief that banning ransomware payments is bad policy. Here's my conversation with James Gimbi. 

James Gimbi: Obviously, the government has found itself in a situation, especially post-Colonial Pipeline, where they realize they may need to be playing a bigger role in cybersecurity generally and also in ransomware. And so policymakers are looking for opportunities to step in and make some sort of positive impact. And policymakers tend to use analogous ideas when they're setting policy in an arcane domain, which makes sense, right? You try to use what you understand to address what you may not understand as well. So they'll say, well, this sounds like negotiating with terrorists, which we don't do in the physical world, or at least try not to. So we won't do it here. Or they'll say, like in the case of the governor of Missouri a few weeks back, they want to think of it like breaking into a house or breaking into a vault. It's a crime if you try to jiggle my door handle, so why isn't it a crime to look at my source code or whatever? 

James Gimbi: I think that the fundamental problem is that there's really not a workable metaphor for cybersecurity, right? So there's not a familiar way, something that makes sense to most people to express this brand-new defense discipline that every firm, every government, every organization has to find a way to, if not perfect, at least be proficient at. And those defenses are going to be subject to attacks from anywhere on the planet at any time, at scale, with virtually no cost and really no realistic chance of being caught. So it's a brand-new thing, and it's going to require a different approach from policymakers. 

Dave Bittner: You know, the things that we see being floated out here really rest, as I believe you've pointed out, on two major elements: victims will stop paying, and threat actors will stop ransom operations if they can't get any money. But you've pointed out that this is basically flawed thinking at its root? 

James Gimbi: I think so, yeah. So, you know, to quickly restate those assumptions: the idea that, first of all, bans are going to cause victims to stop paying, and then secondly, that if the victims stop paying, threat actors are going to peel back the effort. First, when we're talking about whether or not victims are going to continue to pay, I think that on net they're absolutely going to continue to pay. And it's a little counterintuitive, right? Because it seems like people are going to want to follow rules, and generally they may. But the problem is that, as a policy lever, bans really are only going to work well when the penalty outweighs the benefits of whatever behavior you're trying to prevent. 

James Gimbi: So as a result of a ban, I expect that a number of well-resourced and well-advised firms who may have otherwise paid may decide not to. But the first problem is that that's a pretty significant sample bias, right? There's an awful lot of firms who aren't that well resourced and who aren't going to have that kind of advisement. And the second piece is, of course, the second-order effects coming out of this. To me, it seems that a ban is likely to mean more pain for the less-resourced players, or more pain for more victims, or perhaps a more serious and more sophisticated threat. Meanwhile, the bad guys are still making very, very easy money. 

Dave Bittner: Now, it's a really interesting point, you know. I'm reminded of - I heard a story of a town near me that has some public swimming pools, and they decided that at the end of the summer, it was easier for them to drain their pools into the local sewer system and pay the fine, rather than do it the correct way, which ultimately would cost them more money. And I think, you know, here I am, I'm doing exactly what you said we shouldn't do, which is use metaphors, but that's what that reminded me of. 

James Gimbi: Absolutely. I mean, I think that's - we've all seen that become part of a calculus. And the funny thing is it can affect the calculus for other actors as well. So, you know, I've talked before about the way that these bans can influence incentives for every player in the space - the threat actor, the victim, but also law enforcement. So, you know, one of the things that I'm concerned about is the impact to prosecutorial incentives, where if the penalty is going to be significant enough to seriously impact how a firm is going to decide whether or not to pay a ransom to bring their entire business back online, then it's also going to be significant enough to get the attention of prosecutors. And government attention is, of course, a very limited resource. So there's this thought that prosecutors will divert attention away from more constructive avenues to pursue. And, you know, I would hope this wouldn't happen. But, of course, I'm worried about the potential for harassing victims as well. 
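The calculus Gimbi describes can be sketched as a toy expected-cost comparison. Every figure below is invented purely for illustration - the point is only that when enforcement is unlikely and downtime is expensive, the expected penalty barely moves the decision.

```python
# Toy model of a victim's pay/don't-pay decision under a payment ban.
# All figures are hypothetical, chosen only to illustrate the incentive.

ransom = 500_000                     # demanded payment
downtime_cost_per_day = 250_000      # revenue lost per day offline
days_down_if_refuse = 14             # slow rebuild from backups
days_down_if_pay = 2                 # faster recovery with a decryptor

ban_fine = 1_000_000                 # hypothetical statutory penalty
p_enforcement = 0.05                 # chance the violation is prosecuted

cost_if_pay = (ransom
               + days_down_if_pay * downtime_cost_per_day
               + p_enforcement * ban_fine)
cost_if_refuse = days_down_if_refuse * downtime_cost_per_day

print(f"expected cost if paying:   ${cost_if_pay:,.0f}")     # $1,050,000
print(f"expected cost if refusing: ${cost_if_refuse:,.0f}")  # $3,500,000
```

With these made-up numbers, the expected penalty (0.05 x $1,000,000 = $50,000) is dwarfed by the downtime the payment avoids - which is Gimbi's point that a ban only bites when the penalty term dominates the benefit of paying.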

Dave Bittner: Yeah. So what options do we have then? I mean, do you have other ideas for what may lead to some success when fighting ransomware? 

James Gimbi: I do. I think there's a lot of opportunity there. So the good news is there's been a lot of thinking by a lot of smart people in the space. So while there's not a consensus on whether or not ransom payment bans are a good idea or a bad idea, there's a lot of other consensus that is available to policymakers. So, you know, we've seen - for those who might be familiar, the Institute for Security and Technology has this ransomware task force, which is composed of law enforcement and civil society, private practitioners, academia, and they've identified quite a few - dozens, in fact - of policy recommendations that can be put into play. And they all push and pull on different mechanisms. Some of them, I think, are more promising than others. 

James Gimbi: I think that the thing that we need the most attention on is really the diplomatic angle, where we need to try to find a way to remove the safe haven from the equation. And the reason I'm saying that is because ransomware is just so insanely profitable for these threat actors. The research I've seen has pointed to north of a 95% profit margin. That's a heavy incentive. And when you're dealing with, you know, basically no chance of being caught and no chance of physical harm, there would have to be a massive, I would say, technological change to affect those incentives without having local enforcement. So to restate, I think that the most important piece has to be getting some sort of attention on the safe haven nations. 

Dave Bittner: You know, at the outset, we mentioned that part of this is the way that our system is set up in terms of our policymakers themselves, you know. And I think we always think of legislation as trailing behind the advances in technology, even advances in society. It tends to be reactive. Are there ways that we can do a better job from that point of view - a better job of bringing our legislators up to speed, of giving them the information they need to try to come at this in a more timely way? 

James Gimbi: You know, I'm glad you brought that up because it's actually something I'm pretty optimistic about. So, you know, just kind of quickly giving some context from my background. Before MOXFIVE, I had played a few different roles in consulting capacities, but I also had the opportunity to work in the Senate for a year as a policy adviser through a program called TechCongress. And the mission of that particular program is to do exactly what you're talking about - to try to bring expertise to policymakers. And that's one small piece of a very rapidly growing pie, so to speak, of public interest technologists and folks at the intersection of public policy and technology finding ways to positively influence and make themselves available to decision-makers in the executive branch and in the legislative branch and in international organizations as well. 

James Gimbi: This is something where we saw such a small group of people who fit that intersection even just five, six years ago, and now we're really starting to see a huge, robust community with specialized programs at universities and fellowship programs all over state and local governments. It's really something where I think a lot of positive change is happening. 

Dave Bittner: Is your sense that the policymakers themselves recognize that this is an area that they need some assistance with? 

James Gimbi: I think that is a growing realization. You know, in Congress, there was a little bit of resistance to it, I think, in the beginning. But the members who did take advantage of bringing on technical expertise are really starting to see that pay off in the quality of what they're able to put out in terms of legislative and oversight impact. And we're starting to see demands from the legislative branch to the executive branch for hiring technologists for policy positions. And we're starting to see the executive branch really, again, embrace technologists as a part of the policy process more and more proactively. There's now a dedicated program at the FTC. There's a lot of growing interest within the White House itself. And this is something that has spanned multiple administrations, starting perhaps with, you know, the great work done by 18F and the U.S. Digital Service. 

Dave Bittner: Yeah. I mean, it is an area that has unusual, for these days, bipartisan support. You know, everybody wants to do better with this. 

James Gimbi: That's right. Absolutely. 

Dave Bittner: Yeah. 

James Gimbi: And, you know, from my own experience working on the Hill, pretty much everything I touched was in some capacity a bipartisan issue. And talking with other policy advisers from the technical space in the policy world, that's not an uncommon story. 

Dave Bittner: Do you have any words of wisdom, any advice for folks who find themselves in the situation of being hit with ransomware? I'm thinking of, you know, just weighing their options, going through that decision process. Should we go down - should we explore paying the ransom or not? 

James Gimbi: That is always going to be a very tricky business decision. And unfortunately, there's not a blanket answer that I think can be confidently thrown out, aside from to say, you need to make sure that you're getting the input required from folks who are familiar with these processes. You don't want to be going into this sort of situation alone. We strongly advocate the use of a technical adviser. We strongly advocate for the use of external breach counsel. And, you know, these are professionals who walk through this every day as part of their job and can really help navigate the inevitable speed bumps that come up during a breach. And then, of course, there's - to kind of get ahead of it, the prevention piece. One thing that I wish we could wave a wand and do away with is exposed RDP to the internet. 

(LAUGHTER) 

Dave Bittner: Right. 

James Gimbi: You know, if we were able to get rid of that overnight, I think you'd see a drop-off in the steep double digits of ransomware cases, at least for a little while. 
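For context on Gimbi's RDP point: "exposed RDP" means a Windows Remote Desktop service answering on TCP port 3389 directly from the internet, which attackers locate with mass scanning and then brute-force or exploit. Below is a minimal sketch of that kind of reachability check - the address shown is a reserved documentation placeholder, and you should only probe hosts you're authorized to test.

```python
import socket

def rdp_exposed(host: str, timeout: float = 3.0) -> bool:
    """Return True if TCP port 3389 (RDP) completes a handshake from here.

    A completed connection suggests RDP is reachable from this vantage
    point; it does not prove the service is vulnerable, only exposed.
    """
    try:
        with socket.create_connection((host, 3389), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 203.0.113.10 is a reserved documentation address, not a real host.
    print(rdp_exposed("203.0.113.10"))
```

The fix Gimbi is gesturing at is equally simple in principle: put RDP behind a VPN or gateway so that a check like this fails from the open internet.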

Dave Bittner: Ben, what do you make of that? 

Ben Yelin: One thing that's interesting to me is it's really hard to develop policy right now because best practices in dealing with ransomware are still being worked out - it's such a live issue. Colonial Pipeline just happened, you know, nine months ago. 

Dave Bittner: Yeah. 

Ben Yelin: And we're still evaluating and formulating the response to that, both at a government level but also on an individual business level. So when the government steps in and tries to enact regulations, even if right now the thinking is, you know, we don't want people to pay the ransom - and generally, there's a wide spectrum of views on that particular issue - it's just really hard to enact that into some sort of policy because this is very dynamic. This isn't like other areas of the law - you know, I think he mentions in the interview, opening up a doorknob is very different. We have... 

Dave Bittner: Yeah. 

Ben Yelin: You know, there's something that's very settled about trespass law related to opening a doorknob. But this is still so new that any regulation, you know, almost by its nature is going to seem somewhat premature. So I think that's something that people who have lawmaking authority always have to remember. 

Dave Bittner: Yeah, absolutely. All right. Well, again, our thanks to James Gimbi for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.