Caveat 6.1.23
Ep 174 | 6.1.23

What would it take to change a law?

Transcript

Caleb Barlow: The more tools out there we have vacuuming up the internet, the harder it will be to propagate this type of illicit material.

Dave Bittner: Hello, everyone, and welcome to Caveat, the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner, and joining me is my cohost, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: Today Ben has the story of a massive fine being levied against Meta for violating European Union privacy laws. I've got the story of New York taking a stand on AI when it comes to hiring. And, later in the show, we're joined by Caleb Barlow from Cylete. He's going to grill Ben on a variety of cyber-related policy issues, so make sure you stick around for that. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.

All right, Ben. We've got some good stories to share this week. Why don't you start things off for us here.

Ben Yelin: So mine comes from the New York Times technology section, by Adam Satariano. And this is about a fine levied against Meta, the parent company of Facebook. It's a record 1.2 billion euro fine, which equates to about $1.3 billion, from Ireland's Data Protection Commission for a violation of GDPR, the statute that's now been in place in the European Union for about five years. And, as part of this legal finding, Meta has also been ordered to stop transferring data from Facebook users in Europe to the United States. So this is one of the most significant rulings since the advent of GDPR. It is the largest fine to have been levied under that statute so far. And it is a serious allegation of violating European Union data protection rules. So, just for a little bit of background -- and I'll talk about in a second why this came out of Ireland, which is an interesting element of this -- the regulators have alleged that Meta failed to comply with a 2020 decision by the European Court of Justice, Schrems II, which found that Facebook data shipped across the Atlantic was not sufficiently protected from American spy agencies. So we've tried to codify these data sharing arrangements between the United States and the European Union.

Dave Bittner: Right.

Ben Yelin: Makes the internet run more smoothly. It's good for business. It's good for transactions. We don't want to have siloed social networks where you have a Facebook Europe and a Facebook United States. It wouldn't be as fun or as interesting. The problem is that the European Union, I think with relatively good reason, doesn't trust our data protection practices. And their best evidence is some of our draconian or allegedly draconian surveillance measures. So they've mentioned some of what we talked about on this podcast just last week, the collection of data under Section 702 of the FISA Amendments Act.

Dave Bittner: Right.

Ben Yelin: Or from something like XKEYSCORE, which was done pursuant to Executive Order 12333, known in the industry as 12 triple 3. With this type of evidence, the European Court of Justice has said that we need to have proper measures in place before we can have this type of data sharing. So the Biden administration has been in negotiation with our counterparts in the European Union to try and come up with a data sharing agreement that can satisfy some of the concerns that were raised in Schrems II, but those talks are all preliminary. That hasn't been codified yet. We've talked about some of the provisions there, including one that gives non-US persons a right of legal action against the United States for allegedly misusing data, but that has yet to be enacted. So we're in this kind of gray area where we know we want to have a borderless method of data transfer, we have the European Union being distrustful of the United States and its surveillance practices, and now a company like Meta, who wants to foster free and open data exchanges, is being held accountable with a really significant penalty here, 1.2 billion euros.

Dave Bittner: Yeah.

Ben Yelin: So to answer what I know is going to be your next natural question -- where does this go from here? Meta is starting an appeals process, so it is unclear if they're actually going to pay these fines. This is going to be a potentially lengthy legal process, and the entire case could be rendered moot if we come up with a data sharing agreement that satisfies the EU, the European Court of Justice, and Max Schrems, the Austrian activist who has been the plaintiff in all of these lawsuits. So it's just a really interesting, groundbreaking decision, and it certainly caught my eye.

Dave Bittner: It's interesting that Meta seems to be the frequent flyer here when it comes to these violations, right? I mean, both in the EU and in the US, they make these agreements, they get fined, they agree; and then they don't seem to stick to their agreements. They keep being brought back to the table, time and time again. It's just the way they do business.

Ben Yelin: I think it's the way they do business. I don't want to sound too conspiratorial, but, one, like, they're an easy target. A lot of people just do not like Mark Zuckerberg and think Facebook has been bad for society.

Dave Bittner: Yeah.

Ben Yelin: They also have a president of Global Affairs named Nick Clegg.

Dave Bittner: Okay.

Ben Yelin: Nick Clegg used to be the leader of the Liberal Democrats in the UK Parliament. Now, the Liberal Democrats in the UK aren't what we would call liberal Democrats here; they're a kind of centrist, more libertarian-leaning party. And they were in a coalition government with the Conservatives under David Cameron. There was a lot of residual anger at the Liberal Democrats for being part of that coalition and helping that government institute austerity policies. So that created some bad buzz around Nick Clegg, especially in left-leaning circles. So I don't want to say that that's the reason we're having these legal decisions. It's certainly not. But I do think, if we're trying to determine why Meta has become such a target, that could at least be a factor.

Dave Bittner: Now, you mentioned Ireland. What's the significance there?

Ben Yelin: So, yeah. The Ireland aspect of this is kind of fascinating to me. There's a provision in GDPR that requires regulators in the country where a company has its European Union headquarters to enforce GDPR. It just happens that Ireland is home to the European headquarters of Meta, TikTok, Twitter, Apple, and Microsoft.

Dave Bittner: Right.

Ben Yelin: And because of the way GDPR is structured, they are the ones handling these cases. I would guess it's similar to the role that Delaware plays in our legal system, where there are maybe clearer sets of precedents and you know what to expect from the judges. So it might be advantageous for that reason, or for tax reasons, for these companies to have their European Union headquarters in Ireland.

Dave Bittner: Right.

Ben Yelin: And so Ireland has taken on kind of a disproportionate role in meting out these fines. I think Irish authorities are somewhat frustrated with this arrangement. They were overruled by a board made up of representatives from several EU countries on this particular ruling. The board insisted on imposing this fine, which is not something that Irish authorities had tried to enforce themselves. And I think this worries Ireland because they want these companies to remain very present in the European Union and still headquartered in Ireland. It's a source of jobs. It's a source of economic growth. So, yeah. I mean, I think that is a very interesting angle of the story, for sure.

Dave Bittner: $1.3 billion, can we calibrate that? Or have we reached the point where we're talking about real money with a company the size of Meta?

Ben Yelin: I mean, it's a pretty hefty amount of money even for a company like Meta.

Dave Bittner: Yeah.

Ben Yelin: And these fines start to pile up. I mean, a billion here, a billion there, and it starts to add up to real money. They were just fined $390 million by regulators under GDPR in January for forcing users to accept personalized ads as a condition of using Facebook, which is a violation of GDPR. They were fined last November for a separate data leak. That was $265 million. So this is going to have an impact on Meta's bottom line, enough that they're going to have an incentive to try and change some of the policies that are causing these lawsuits to be successful. Some of that is going to be them working with our government and with EU authorities to come up with the type of Safe Harbor or Privacy Shield successor provisions that can pass muster on both sides of the Atlantic.

Dave Bittner: Yeah.

Ben Yelin: And so I think they're going to be -- they're at least going to try to be heavily involved in that process.

Dave Bittner: Do you have any sense on where the Biden administration is with that? And is that on a good healthy trajectory?

Ben Yelin: It's still being negotiated. The outlines of a deal -- the preliminary agreement -- were announced last year by President Biden and the President of the European Commission. But the deal is still being negotiated and finalized. We don't necessarily have a timeline on that. It could be several months; it could be a year. I just think it's uncertain how far they are from coming up with something final. So, in the meantime, I think it means there's significant legal risk for some of these big tech companies in continuing to move data between the US and the European Union, because we are kind of in legal limbo until not only we have a new agreement but that new agreement survives review by the European Court of Justice without being invalidated.

Dave Bittner: Well, and --

Ben Yelin: So we have to go through those two steps before these companies can be sure that they can transfer data across the Atlantic without being subject to these risks.

Dave Bittner: What about the very real possibility that President Biden does not serve a second term and someone else comes in who has very different ideas about these things? To what degree would this be subject to the desires and whims of a new administration?

Ben Yelin: Oh, 100 percent. I mean, President DeSantis or Trump or whoever could come in and burn this up on day one, if they wanted to.

Dave Bittner: I see.

Ben Yelin: And that would have an impact. I mean, it would have a practical impact. But it doesn't make a difference in terms of how the European courts enforce GDPR. So, in the absence of that agreement, we're going to see crackdowns on data transfers. And that's going to hurt the internet. It's going to hurt the economies of both the US and the European Union. But it is certainly something that could happen. I mean, if there are ideological differences with this approach --

Dave Bittner: Right.

Ben Yelin: -- then we could certainly see it invalidated with a new presidential administration. Nothing here is etched into stone.

Dave Bittner: Yeah. So it is an area of potential volatility.

Ben Yelin: Absolutely.

Dave Bittner: Yeah.

Ben Yelin: A lot of volatility and a lot of risk.

Dave Bittner: All right. Well, we will have a link to that story in the show notes. My story this week also comes from the New York Times. This is an article written by Steve Lohr, and it's called A Hiring Law Blazes a Path for AI Regulation. And this is about New York City, which is adopting rules that put some limits on how companies use AI when hiring people. This caught my eye because I think we've heard a lot about algorithmic filtering when it comes to job applications, right? Like, the first level you've got to get past if you want anybody to look at your resume is the algorithm.

Ben Yelin: Right.

Dave Bittner: And whether or not we call that AI, you know -- I don't know. Is that simple filtering? Or, you know.

Ben Yelin: You've got to weed out the applications that no human would ever accept, and you might as well have machines do it.

Dave Bittner: Right. But the flip side is that that can put some people at a disadvantage. It can also put people at an advantage who know how to game an AI or game some sort of automated system. You know, you can put stuff in your resume, check all the boxes in your resume to get to that second level and then rely on your wit and guile to explain why, well, I didn't actually get that degree. But --

Ben Yelin: Right.

Dave Bittner: But if you didn't list the degree on the resume, you never would have gotten past that step anyway.

Ben Yelin: I mean, this will be familiar to a lot of people who have used automated resume collection systems, including the federal government's USAJOBS, where they will list what the requirements are for the position. You don't just submit any resume. You make sure that those keywords are in there.

Dave Bittner: Right.

Ben Yelin: So if they're looking for someone with five years of experience in XYZ, you put "five years of experience in XYZ" in the first line of your resume, because you want to make it past that filter.
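As a quick illustration of the keyword screening Ben describes, here is a minimal, hypothetical Python sketch. The keywords, resume text, and function name are invented for the example; real applicant-tracking systems are considerably more elaborate.

    # Hypothetical sketch of naive keyword screening in an applicant-tracking filter.
    REQUIRED_KEYWORDS = {"five years of experience", "incident response", "python"}

    def passes_keyword_filter(resume_text: str) -> bool:
        """Return True only if every required keyword appears in the resume."""
        text = resume_text.lower()
        return all(keyword in text for keyword in REQUIRED_KEYWORDS)

    resume = "Five years of experience in incident response and Python automation."
    print(passes_keyword_filter(resume))  # True -- the resume echoes the posting's own terms

A resume that described the same experience in different words would be filtered out, which is exactly why candidates mirror the posting's language.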

Dave Bittner: Right, right. So this article lays out some of the requirements here. It requires companies that are using AI software in hiring to notify candidates that an automated system is being used. It requires companies to have independent auditors check the technology annually for bias. Candidates can request and be told what data is being collected and analyzed, and companies will be fined for violations. How does that sound in principle there, Ben?

Ben Yelin: In principle, it sounds great. It applies to any company that has workers in New York City, which is a lot of different companies. So even if it's a multinational corporation, if they have employees who work in New York City, it would apply to them as it relates to those employees. So it certainly seems promising at first blush. But, as we always say, the devil is in the details.

Dave Bittner: Right.

Ben Yelin: There are kind of two sources of opposition to this law. There are advocates who have said that it doesn't go far enough, that it was watered down. This was a statute that was passed in the waning days of the Bill de Blasio administration, toward the end of 2021. And in the process, over the past couple of years, it's been whittled down. Some of the stronger enforcement provisions were taken out. And then there are those in the business community who say, of course, that this is unnecessarily burdensome -- that AI, or whatever you want to call it, whether it's AI or just algorithms, is an important tool for expediting the review of applicants, and this is going to cut against our bottom line. So there is a lot of opposition on both sides here.

Dave Bittner: Yeah. This article quotes Alexandra Givens, who's the president of the Center for Democracy and Technology, which is a policy and civil rights organization. And she says, What could have been a landmark law was watered down to lose effectiveness. And the article goes on to say that the law defines an automated employment decision tool as technology used to substantially assist or replace discretionary decision-making, and that the rules adopted by the City appear to interpret that phrasing narrowly so that AI software will require an audit only if it is the lone or primary factor in a hiring decision or is used to overrule a human. So, yeah. That seems pretty narrow.

Ben Yelin: Yeah. That is really defining it narrowly.

Dave Bittner: Yeah.

Ben Yelin: It really is. I mean, it's very hard to prove that AI was the lone or primary factor in a hiring decision. That's going to be very hard to prove.

Dave Bittner: Yeah. So, given that, I wonder to what degree this is an effective tool. Do we look at this as a first step, as a sign of which way the breezes are blowing? Or will this ultimately be ineffective?

Ben Yelin: I think it is a first step. It's laying the groundwork for similar statutes in other states that are interested in this type of regulation. But the way this law is being interpreted now, the way the regulations are written, it is going to be relatively toothless, because most organizations use automated software for that sort of filtering effect.

Dave Bittner: Right.

Ben Yelin: But it's still a human making final hiring decisions. And that would be a defense against any potential fine levied by the City. You could say this wasn't the primary reason we hired this person. We hired them for these reasons. We just used this AI software to filter out applications that were going to go to the bottom of the pile anyway.

Dave Bittner: Yeah.

Ben Yelin: So, yeah. I worry that this does render this law relatively toothless. I think even the activists will admit it's better to have this law on the books than to have nothing. At least there's a skeptical eye on the use of AI software in hiring or algorithmic software in hiring.

Dave Bittner: Right.

Ben Yelin: But, yeah. I mean, I do think the rules adopted by the City will limit the applicability of this law and its enforcement mechanism, for sure. The other thing to note is that the law is looking out for discrimination. There have been a lot of allegations, well supported by the data, that certain groups are disproportionately disadvantaged in the hiring process through the use of these algorithms. The law mentions bias by sex, race, and ethnicity. But, as Ms. Givens points out, it does not bar discrimination against older workers or those with disabilities. And that is kind of a blind spot when it comes to the civil rights of those individuals.

Dave Bittner: Yeah.

Ben Yelin: So, again, that's another shortcoming of the statute here.
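To make the bias-audit idea discussed above a bit more concrete, here is a minimal, hypothetical sketch of the kind of impact-ratio arithmetic such an audit might perform. The group names and counts are invented, and the actual New York City rules spell out the required categories and methodology in far more detail.

    # Hypothetical sketch: compare selection rates across groups and compute impact ratios.
    applicants = {"group_a": 200, "group_b": 150}   # candidates screened by the tool, per group
    selected   = {"group_a": 50,  "group_b": 15}    # candidates the tool advanced, per group

    selection_rates = {g: selected[g] / applicants[g] for g in applicants}
    best_rate = max(selection_rates.values())
    impact_ratios = {g: rate / best_rate for g, rate in selection_rates.items()}

    for group, ratio in impact_ratios.items():
        # An impact ratio well below 1.0 (a classic rule of thumb is 0.8) suggests the
        # tool may be disproportionately screening out that group.
        print(f"{group}: selection rate {selection_rates[group]:.2f}, impact ratio {ratio:.2f}")

In this invented example, group_b's impact ratio comes out to 0.40, the sort of disparity an auditor would flag; whether that triggers any consequence under the law is a separate question, as Ben notes.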

Dave Bittner: Yeah. Interesting. All right. Again, well, we will have a link to that article in the show notes. It's from the New York Times written by Steve Lohr. Again, we would love to hear from you. If there's something you'd like us to discuss on the show, you can email us. It's caveat@thecyberwire.com.

Ben, we're going to do something a little different this week. We've invited Caleb Barlow to join us. He is the CEO of an organization called Cylete. He has a lot of experience with cyber policy issues, has run some large organizations and companies, and has some really interesting perspectives to share with us. So let's welcome Caleb Barlow to the show. Well, let's just jump in here, Caleb. I know you had some topics you wanted to cover with both of us, but mostly Ben. So why don't we kick things off?

Caleb Barlow: I think you can weigh in.

Ben Yelin: I'm ready for it. Throw them at me, Caleb. Let's see how this goes.

Caleb Barlow: So, yeah. I want to throw down a little bit here. As an operator listening to this show, one of the things that oftentimes comes up -- and I think you guys do a great job of kind of sarcastically rolling your eyes a little bit, at least to the extent you can on a podcast -- is the Computer Fraud and Abuse Act. Let's talk about that nightmare again, right? You know, Ben, you talked recently about an opportunity you had to influence some legislation in Maryland. And I've had the opportunity, when I was working for large companies, to lobby on Capitol Hill and go and testify in front of Congress, and even the unique opportunity once to testify in front of the United Nations.

Ben Yelin: Oh, wow.

Caleb Barlow: And the point is you can get -- you can influence these things. You can get things changed. Now, it takes forever.

Ben Yelin: Yes.

Caleb Barlow: But, you know, what I wanted to do is throw out some of these things we deal with and really poke at, well, what would it take? Could it even be done to change some of these things down the road?

Ben Yelin: Let's do it.

Caleb Barlow: So let's start with our good friend, the Computer Fraud and Abuse Act.

Ben Yelin: Ah, yes. Yeah.

Caleb Barlow: One of our favorites, right. And I think the challenge for cybersecurity researchers in particular is -- I don't know the exact date of this law, but I'm pretty sure it dates back to when we were all kids and watching WarGames, right?

Ben Yelin: Yeah. It actually dates back to the year I was born. I do not want to age myself, so I'm not going to reveal that year. But I am as old as the Computer Fraud and Abuse Act. So it's old.

Dave Bittner: So I was probably in my 20s.

Caleb Barlow: Dave and I were a little older, but we'll go with that. You know, the challenge with the Computer Fraud and Abuse Act, right, is that you get into this kind of exceeds-authorized-access definition.

Ben Yelin: Ah, yes.

Caleb Barlow: And here's the other thing: what's a computer nowadays, right? It was one thing when we were all worried about, you know, kind of the plot of WarGames -- Matthew Broderick breaking into the computer that simulated nuclear holocaust -- and I think we all kind of get that you probably shouldn't do that. But the challenge we have nowadays is that we not only have computers, we have automobiles, we have devices, we have thermostats all connected to the internet. And, in a lot of cases, we need to be able to test these things. And I can give you a few examples of where this is really challenging. Like, I had a client years ago that was building a programmable logic controller. I won't get into what it controlled, but let's just say it controlled really important things. And they'd had these things out in the market for decades. They had no idea where they all were, but, more importantly, their big concern was they had lots of these things out there with a default username and password. On one hand, bad guys can scan for that all day long. All they have to do is go use Shodan, figure out where the devices are, and then go try the default password. On the other hand, security researchers are forbidden from doing that -- or at least under the definition we all accept as the law. So what do you think would be required to update the Computer Fraud and Abuse Act?

Ben Yelin: I think we would need something like the bill that was proposed about a decade ago now by a couple of representatives, one of them Zoe Lofgren, who represents Silicon Valley, and that is to grant special dispensation to researchers. Of course, this traces back to the tragic Aaron Swartz story.

Caleb Barlow: Yes. So why don't we talk about that --

Ben Yelin: Sure.

Caleb Barlow: -- because I think that's a really, really important context of where this law has gone particularly awry.

Ben Yelin: Yes. Yeah. So I don't know if you want to get into the story of it. Or, I mean, I can.

Caleb Barlow: I mean, the basic issue here -- and keep me honest on the story -- was you had a computer researcher who was pursued aggressively by the Justice Department and, I believe, the school he was attending as well --

Ben Yelin: Yeah. MIT.

Caleb Barlow: -- because he might have gone a little too far. And he ended up unfortunately taking his own life.

Ben Yelin: Right. I mean, I would -- I don't use this word lightly. He was being harassed by the Department of Justice.

Dave Bittner: They're making an example of him.

Ben Yelin: They were making an example of him. And he was doing what I think most of us would agree was scholarly research. And him taking his own life wasn't necessarily the natural extension of what the Justice Department had done, but it was certainly part of a broader story. So that was really the impetus behind these proposed laws. You had a government bringing disproportionate charges against somebody because the Computer Fraud and Abuse Act is far too broad. They're using language in the law that's very difficult to interpret, that was drafted in a predigital age, and throwing the book at this person for reasons that -- I don't want to cast aspersions on the Department of Justice, but I'm not entirely sure they were the most legitimate reasons. So what would you need to do to amend the Computer Fraud and Abuse Act? Well, for --

Caleb Barlow: So do you imagine, like, some sort of license for computer researchers?

Ben Yelin: Yeah, yeah. It'd be something like that. Or you could just exclude basic terms-of-service violations from the CFAA for individuals who are engaged in good faith academic pursuits or research, who are doing work trying to protect network security. It wouldn't be easy. There are financial interests aligned against this. A lot of the big tech companies and some of the big manufacturers really like the gates-up-or-down approach the Supreme Court took in the Van Buren case for the CFAA, because it makes their lives easier when there's litigation. It's a much simpler standard for them to adjudicate. So there are reasons they are interested in maintaining the status quo. I also don't think -- just to get back to an earlier point -- that this is something where you could just go into Congress and adopt this change overnight. There's a lack of understanding. It's been ten years since we had the Aaron Swartz situation. In Congress, the natural order of things is inertia. So how would you actually inspire some type of change like this? I think that really is the interesting question here.

Caleb Barlow: Well, you know, one of the things we have seen, though -- and this is more your swim lane than mine -- is kind of updates and clarification from the Justice Department on what they're willing to prosecute. They've taken a bit of a step back on some of these obscure use cases -- like, oh, your company said that the systems at work are only for use at work, and you happen to be on Facebook.

Ben Yelin: Right.

Caleb Barlow: We're not going to prosecute you because of that. Right. I think some clarification has occurred there. But, you know, there's certainly a need to get much more specific, especially now that, you know, what is a computer, right?

Ben Yelin: Right.

Caleb Barlow: It's everything nowadays, right? So, you know, what's unauthorized access to a thermostat? I don't know what that looks like in a lot of cases.

Ben Yelin: In my house, that's just my wife touching it. But yes.

Caleb Barlow: I agree. I mean, that would work in my house.

Dave Bittner: Oh, boy.

Caleb Barlow: Especially like the pool heater, right? That's unauthorized access. I'm coming after you with CFAA.

Ben Yelin: We're going to get so much trouble for this. But, yes. Yeah.

Dave Bittner: But what about -- I mean, doesn't that sort of mean the current situation leaves us up to the whims of whichever particular set of people is in charge of the Justice Department?

Ben Yelin: That's exactly what I was going to say. So one of the things the Supreme Court justices brought up in the Van Buren case -- the majority of justices were talking about this parade of horribles where you would have political prosecutions based on really incidental intrusions into somebody's network. And the opposing justices said, well, that's an exaggerated parade of horribles; the Justice Department has instituted these prosecutorial policies, and we're not going to prosecute some of these edge cases. But that is up to the whims of individuals in the Justice Department. Whatever your political persuasion, think of your foremost political enemy winning the presidency and being able to appoint an Attorney General and a Deputy Attorney General and all sorts of positions in the Department of Justice. And imagine that person getting to exercise prosecutorial discretion. The only way to take that away from the Department of Justice is to pass a statute in Congress. So I just think you can't rely on the Justice Department to make what you would consider favorable prosecutorial discretion decisions, because those are, by nature, indefinite. We have changes in administration.

Caleb Barlow: But the point is, like, I think the point you're making is that this is probably due for clarification. And there probably is a middle ground of how we could get there by being a little more specific on, you know, what these educational and research pursuits look like.

Ben Yelin: Right, right.

Caleb Barlow: So your whole point on prosecutorial discretion brings up, you know, another one of my favorite laws to talk about, which is the False Claims Act.

Ben Yelin: Ah, yes.

Caleb Barlow: Which, interestingly enough, goes back to the Civil War. And there are a couple of very interesting things about using this law to prosecute cyber crimes. What this is really being used for is to say, hey, XYZ Company, you didn't have adequate security controls in place when you were selling or providing goods to the US government in their purchasing process.

Ben Yelin: Right, right.

Caleb Barlow: Now, what is most interesting about this is, first of all, this was obviously written way before computers existed. The second thing, though, is it has a lot of teeth, including very strong whistleblower provisions and things like that. But, as much as the government is touting this as how they're going to enforce that everybody has the proverbial fire alarms installed -- and you don't get to be a victim if the fire alarms or the sprinklers weren't working -- on the other hand, there have been almost no prosecutions under this law, as much as it's being touted as a tool. So what's your thought on this, both from the perspective of how old it is, but also: is this a hammer in search of a nail?

Ben Yelin: So a couple of things. On how old it is, I can do you one better than the False Claims Act. I mean, the --

Caleb Barlow: You tell me.

Ben Yelin: The issue at stake in the Apple v. FBI case in 2015 was based on the All Writs Act, which was enacted in 1789. So that makes the False Claims Act seem modern by comparison.

Caleb Barlow: It's just amazing.

Ben Yelin: It is amazing. I mean, the fact that we were adjudicating a dispute between our law enforcement agency and one of the foremost big tech companies in the world based on a Revolutionary-era statute is really silly, if you just step back and think about it. If you were to explain that to somebody who had never studied law, it would seem particularly ridiculous. I think the government has used the False Claims Act more as a weapon to recover money that it thinks it's entitled to. And the fact that, when somebody sues under the False Claims Act, not only does the government get to recover money but, depending on the circumstances, the individual or party suing also gets a portion of that money -- I do think it's an incentive structure that could be promising if we want to hold companies that don't have proper cyber hygiene accountable. But I worry about it. I don't worry as much that it's outdated, just that we're applying it in sort of an arbitrary way that doesn't necessarily fit modern circumstances.

Caleb Barlow: Well, I mean, I think that's one of the more interesting things with this, right, is that, you know, we're reaching a point in kind of our cyber history, if you will, where we understand now what good looks like in terms of cyber hygiene.

Ben Yelin: Right.

Caleb Barlow: And, in contrast, we really know what bad looks like. So, much like the analog of the fire alarms and the sprinklers needing to work if you own a building: if you don't have basic endpoint protection, if your network isn't segmented, if you don't have security logging turned on, and you're a company of any size -- if you're a victim, it's your own fault at that point, right?

Ben Yelin: Right, right.

Caleb Barlow: And, you know, what I think is interesting, both in the usage of the False Claims Act as well as even things like HIPAA where, you know, you do see fines under HIPAA; but the fines are really, really, really small.

Ben Yelin: Right, right.

Caleb Barlow: You know, you fine a multibillion dollar hospital $10,000, right, because they didn't have this tool in place.

Ben Yelin: They're nominal damages.

Caleb Barlow: Exactly. Right. Like, it seems to me like this is an area -- and maybe this happens on the civil side of things -- where the law needs to catch up to basically hold people accountable when they've knowingly done nothing.

Ben Yelin: Right. I think there are also better ways to do that. The False Claims Act is part of, like, a sticks approach, and I do think there's some promise in a carrots approach. I've seen this done at the state level, where there's some sort of liability shield for a company that has, in good faith, tried to institute proper security practices. Now, that leads to potentially a lot of litigation. You have expert witnesses for both sides arguing about what best practices are and whether they satisfy whatever standard the state legislature comes up with. And, also, it's not very equitable, since most of the organizations that can afford to ensure compliance aren't your standard mom-and-pop shops. So, with those caveats aside -- not to, you know, throw in the name of our podcast there -- I do think that approach potentially offers more promise. It's less punitive. It provides an incentive structure and could cut down on litigation, which is a win-win for everybody. And I say that --

Caleb Barlow: So maybe switch that one more to the positive.

Ben Yelin: Yeah.

Caleb Barlow: Okay. So, Dave, are we ready to throw Ben a tough one?

Dave Bittner: Oh, yeah. Let's -- I've been waiting my whole life.

Caleb Barlow: I mean, those are two softballs, right? I've saved this one for you.

Dave Bittner: Say, Ben. This is why I brought Caleb on the show -- because his knowledge of this stuff far exceeds my own. So we're bringing out the big guns for you here.

Caleb Barlow: Okay. Here comes a really tough one, right?

Ben Yelin: All right. Let's do this.

Caleb Barlow: So child pornography is obviously bad.

Ben Yelin: Yes.

Caleb Barlow: I think we can all agree on that.

Dave Bittner: We're all on the same page.

Ben Yelin: Yes.

Caleb Barlow: You know, the challenge with child pornography laws is they prohibit not just disseminating it but even storing it.

Ben Yelin: Right.

Caleb Barlow: And this is a case where some unintended consequences have occurred. So if you're a security company that recognizes that oftentimes bad guys work through images --

Ben Yelin: Right.

Caleb Barlow: -- and that a lot of bad things can happen in images, not just child pornography but lots of other things can happen in --

Ben Yelin: That could subject you to legal liability.

Caleb Barlow: Correct.

Ben Yelin: To some extent.

Caleb Barlow: So what's happened in this case is that, although the tooling exists to go well beyond facial recognition -- to understand what's really going on in an image, where the image came from, whether it's potentially copyrighted, whether it's from someone else -- there is a lot of reservation about doing any security work around anything that has anything to do with an image, because you are going to inadvertently collect child pornography, especially if you're doing things like dark web image collection. So, for example, let's say you're a company that's contracted with a few banks to go out and look for places where people are selling stolen checks or stolen credit cards. Oftentimes, those things will be advertised on the dark web with pictures -- the image from the bank, the image of the details, the image of the credit card number. And a lot of security researchers and companies are scared to death of grabbing that, because they know their collection is inadvertently going to also grab child pornography.

Ben Yelin: Right.

Caleb Barlow: You know, so this is a tough one because I can see both sides of it. But we are handicapping our ability to solve a lot of other crimes because we don't want to accidentally be storing child pornography.

Ben Yelin: Can we just get into the politics of this for a second before going to the merits of it?

Caleb Barlow: Sure. I told you this was not going to be an easy one.

Ben Yelin: It's not an easy one. Did you watch the confirmation hearings for now Supreme Court Justice Ketanji Brown Jackson?

Caleb Barlow: I did not. I am not -- unlike others on this podcast, I am not a SCOTUS nerd.

Ben Yelin: Yes. So, as a proud SCOTUS nerd, I spent an inordinate amount of time watching the lines of questioning. And the Republican members pilloried her because she had been a district court judge, and, in her sentencing, the allegation was that she issued light sentences for child pornographers -- that there were certain sentencing guidelines and that she would always either be on the low side of them or depart from whatever the sentencing recommendation was. What she tried to explain during the hearings is that possession laws are outdated, because you used to add counts of possession based on the number of images that were collected, which made sense when you, like, actually had to go out and collect images or put a certain number of --

Caleb Barlow: In the days of film.

Ben Yelin: Exactly, exactly. When we're talking about files -- let's say you download one zip file that you eventually realize has thousands and thousands of images -- the prosecutors wanted to expand sentences based on the number of images. And she basically said that that was an outdated view of the law that didn't properly reflect the digital age. And she got pilloried for it. I mean, for a little while, I thought that was going to be problematic for her confirmation.

Caleb Barlow: Interesting.

Ben Yelin: And I actually happen to think that she's right on the merits of this, but the result is that you are lessening the sentences of people who have either been convicted of or have pled guilty to possession of child pornography. So it is a really difficult issue. You know, you always want to be on the side of researchers and people who have a good faith objective to protect security.

Caleb Barlow: Here's the other side of this, right. If these researchers do inadvertently find this, they can get it to law enforcement, and maybe something can be done about it. The problem today is no one wants to scan for this stuff. No one wants to touch it. So part of my worry in this is that there could be lots of places where these images exist that no one's going to catalog or know about, because no one wants to look for them.

Ben Yelin: Right. I mean, that's -- that's an area where you could see real legislative action, where you --

Dave Bittner: So are we talking about, like, a good Samaritan kind of thing? You know, someone falls ill on an airplane, and so they say, Is there a doctor in the house? And a doctor helps the person. That doctor has certain protections under, like, Good Samaritan rules, right? That doctor did not set out to do medical work when they got on that plane. Similarly, if a researcher stumbles across something and does the right thing by alerting law enforcement or whatever, do there need to be some sort of protections in place if that researcher can show in good faith what they were up to? Am I making any sense?

Ben Yelin: Yeah. I mean, it would be --

Caleb Barlow: That seems to make a ton of sense.

Ben Yelin: It would be a kind of liability shield where, if, within a certain amount of time, you came across the images and properly reported them to law enforcement, you would have a complete shield from liability. You still might see organizations fearful, you know. Let's say you unknowingly possess the image and you didn't report it within the requisite timeframe to law enforcement; if you were still subject to civil and criminal penalties, that might still disincentivize you from engaging in this type of research. But I think that would be a step in the right direction.

Dave Bittner: What if I'm an AI company, and I'm vacuuming up every single image on the internet? I'm doing it in an automated way, and -- as we talk about over and over again -- there's no way I can do this at scale and have a human look at every image to make sure there's nothing objectionable. Do they deserve any sort of shielding, or do they get what's coming to them because it's too much to do at scale?

Caleb Barlow: That is a great question. That is a great question.

Ben Yelin: I don't want to bring up another elephant in the room, but that gets into some Section 230 stuff for me.

Dave Bittner: Yeah.

Ben Yelin: Because I think most big tech platforms currently do a pretty good job of rooting out the absolute smut on the internet.

Dave Bittner: Right.

Ben Yelin: All of this -- all of that is done through algorithms. And one of the purposes behind Section 230 is we don't want to punish a Twitter or a Google for its effort to root out smut on the internet because, you know, they missed one or two images.

Dave Bittner: Right.

Ben Yelin: That's one of the main justifications of that Section 230 liability shield.

Caleb Barlow: Well, and I think what Dave highlights here is exactly where we're headed, right, where you're going to have these massive tools that are vacuuming up all kinds of data. And you can add AI on top of this to try to detect what's going on. I mean, there are tools out there that are very good at determining, is this explicit material, or is this just an advertising site for underwear, right? Believe it or not, those tools actually exist, and, from what I understand, the efficacy is quite high. But you're going to occasionally have a failure or two. And maybe there's some level of liability shield here that can be developed, because I think this is not just about protecting the researchers. I think something like this is also recognizing that, the more tools out there we have vacuuming up the internet, the harder it will be to propagate this type of illicit material.
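One common approach behind the kind of scanning tools Caleb describes is perceptual hashing: known illicit images are reduced to compact hashes, and a scanner compares new images against that list instead of retaining the images themselves. Here is a minimal, hypothetical Python sketch using the open-source Pillow and imagehash packages; the hash value, threshold, and function name are placeholders, and production systems such as PhotoDNA are far more sophisticated.

    # Hypothetical sketch of perceptual-hash matching against a blocklist of known bad images.
    from PIL import Image      # third-party package: Pillow
    import imagehash           # third-party package: imagehash

    # In practice the blocklist would come from a clearinghouse; this hash is a placeholder.
    KNOWN_BAD_HASHES = {imagehash.hex_to_hash("e0f0e0c0c0e0f0f0")}
    MAX_DISTANCE = 8  # number of differing bits tolerated before calling it a match

    def flag_image(path: str) -> bool:
        """Return True if the image at 'path' is perceptually close to any known bad hash."""
        candidate = imagehash.phash(Image.open(path))
        return any(candidate - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)

A workflow that flags matches and reports them, rather than storing the underlying material, is the kind of thing the liability-shield idea discussed here has in mind.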

Dave Bittner: Right. I wonder, too, for example -- and, I mean, this is an edge case, so let's take it as such. But say any of us are out there poking around on the internet, and we accidentally, innocently stumble across an objectionable image, right? This used to happen a lot more in the early days of the internet, when you were poking around image folders, you know, before the web and all that kind of stuff. Oh, what's in here? Ah! I didn't want to see that. But the very act of viewing an image, even accidentally, means that image is on your computer, and you're breaking the law. And you wouldn't have known you were breaking the law until you downloaded the image and viewed it. Now, I don't know whether anyone has ever been prosecuted for accidentally stumbling upon an image. But you see where I'm going -- there's kind of a catch-22 here. There's a lack of nuance.

Ben Yelin: Yeah. I mean, I think that's why possession shouldn't be a strict liability crime. When we're talking about child pornography -- in the legal parlance, and this is going to annoy both of you -- you should absolutely have a mens rea requirement, where there's some sort of intentionality. You have to have a criminal state of mind.

Caleb Barlow: Well, and let's face it, too. There's also a big difference between "I accidentally viewed it once" and "I viewed it 20 more times, and I distributed it."

Ben Yelin: Right. Exactly.

Caleb Barlow: Your computer's going to know that too.

Dave Bittner: Right. Sure.

Ben Yelin: Not only is the computer going to know that but, like, our legal system is well set up to make those determinations. Was this somebody who accidentally stumbled upon an image, deleted it, reported it to law enforcement, etc.? Or is this somebody who was distributing it and opening the file repeatedly? Our legal system adjudicates those types of issues all the time to determine a criminal state of mind. So I think it would be well within its capabilities to do that in these types of cases.

Caleb Barlow: All right. Let's talk about something where there are no laws today.

Ben Yelin: Ooh. Okay.

Caleb Barlow: It's an edge case where there's a lot of research going on. So, you know, one of the best ways today to disrupt adversaries is to hack back.

Ben Yelin: Right. Hack them back.

Caleb Barlow: And, of course, you know, let's face it. Amongst, you know, listeners here, this is happening quite actively today.

Ben Yelin: Yeah.

Caleb Barlow: And I'm not talking so much about somebody breaking into the bad guys' systems -- you know, kind of traditional hacking. Where this is happening today is, oh, maybe I run a cloud service, and I see that a few adversaries are using my cloud service. I'm not going to turn it off. I'm going to watch what they're doing and gather as much intelligence as I can, and basically provide their infrastructure. Or, in some cases, it may involve direct adversarial engagement where, you know, maybe I'm a researcher, and I'm in forums pretending to be something I'm not, and I'm providing help and guidance, all in exchange for learning more about what these people are doing. And, in many cases -- in fact, I would love to say in most cases -- this is being done, and that data is being shared with law enforcement or other threat intel researchers. But I think this is another area where there's just a lot of loosey-goosey stuff going on right now, because, at the end of the day, the legal standard here is: are the bad guys based halfway around the world really going to come back and sue you? Probably not. Right. But, at the same time, there's a whole list of potential laws getting violated in these cases. And I think this is an area where, as this type of function grows, we probably need to establish some guidelines and/or -- back to our earlier comment -- maybe some level of licensure for who's allowed to do this type of work and why.

Ben Yelin: Yeah. There have been academics who've put out papers on, like, coming up with a legal framework for hacking back -- cyber self-defense, if you will. I remember one of them -- I'm trying to remember exactly where I read this -- compared it to a principle, I believe in criminal law, called the shopkeeper's privilege. Actually, I think it's probably tort law; don't quote me on that. But if somebody steals something in your store -- if you reasonably suspect that somebody's stolen something -- you have the right to search them on their way out without being implicated for something like false imprisonment. And our legal system, through common law, through years of precedent, has well established that exception. So your defense against a false imprisonment claim for detaining someone would be the shopkeeper's privilege. Maybe you have something like that here: an affirmative defense for hacking, that it was an act of cyber self-defense. I think our legal system could establish, either through case law or by statute, some type of framework where, if you are sued for some type of intrusion into another network, you'd at least have an affirmative defense saying, I was doing this for cyber self-defense; this wasn't hacking back as an aggressor. I'm not trying to propagate the Wild West here. I am legitimately interested in protecting myself and my network.

Caleb Barlow: Yeah. I actually think the more important thing with this isn't even so much being worried about ending up in the legal system if somebody gets caught doing it. I think it's the court of public opinion, which often doesn't realize that this type of activity goes on. And when you read a lot of these, like, big takedowns from the Justice Department, there are two things that always enter your mind. One, how did they pull that off?

Ben Yelin: Right.

Caleb Barlow: And, you know, I think they've started to reveal a little bit more that, you know, some of these actions involve a little bit of offense. But the second thing is there's always an acknowledgment that, oh, and partners were involved. And, of course, none of the partners are mentioned, which is the way you want that, right.

Ben Yelin: Yeah.

Caleb Barlow: But I do think -- you know, what I worry about, and what I think the security research community worries about, is not so much ending up in legal trouble but this getting slanted sideways by a reporter who maybe doesn't understand the full context, and it ending up in some sort of hit piece on a security company, when the public just doesn't understand the efficacy and how much of this type of activity is actually going on.

Ben Yelin: Right. And they would have no understanding of the context of it. This isn't just, like, a digital vigilante doing this for the hell of it. Like, this is some type of --

Caleb Barlow: Well, those people are out there too.

Ben Yelin: They're out there too. Yeah. But, in these circumstances, it's somebody with a good faith interest.

Dave Bittner: Is this a digital stand your ground law?

Ben Yelin: Oh, God. Let's not go there. Yeah.

Caleb Barlow: But, you know, okay. I mean, the funny thing is, Dave, there is a certain aspect to that, right. I mean --

Dave Bittner: Right. I say it half jokingly.

Caleb Barlow: No. But I actually think there's something to this, right. I mean, the reality is, when you're working incident response, more often than not, you go in, something bad's happened, you explain it to the executives, and you move on, right? However, what security researchers love is that 1-in-100 client that's like, Oh, hell no. Not on my watch. Whatever you need to do to figure out who these people are -- oh, you want to let this attack keep going? I'm fine with that. Bring in all your tools, all your research, all the logging. Let's go figure this out, right. And the reality is --

Ben Yelin: Braveheart style. Yeah. Just -- yeah.

Caleb Barlow: The reality is, though, guys, that's where the needle really moves -- when you run across that client that's willing to say, Not on my watch, and take a little risk.

Dave Bittner: Right. But, as a security researcher, to what -- to what degree are you willing to take on that risk?

Caleb Barlow: A hundred percent.

Dave Bittner: And is that risk and liability transferred to you?

Ben Yelin: And just like you said, it's not just legal liability. It's --

Dave Bittner: Right.

Ben Yelin: You don't want to be on the cover of --

Caleb Barlow: You don't want to be on 60 Minutes explaining how, you know, in your hosting environment, you knowingly let somebody's bad guy server run for a year.

Ben Yelin: Right, right. Exactly. Exactly. Yeah.

Dave Bittner: All right, gentlemen. Well, this was great fun, so much so that we're going to have to do this again.

Ben Yelin: We would love to have you back, Caleb. I swear.

Caleb Barlow: Dave -- Dave just loves the fact that I called you a SCOTUS nerd because he wouldn't dare to do that. But I say that -- I say that in true respect for what you do.

Ben Yelin: Let's just say I may or may not be logging in to SCOTUSblog every Thursday at 10 a.m. like a proper SCOTUS nerd, and I am proud of it. And if there are other SCOTUS nerds out there, wear it proudly on your sleeve. That's what I have to say.

Caleb Barlow: I think you need a T-shirt.

Ben Yelin: I think so too. Yeah. Maybe we can -- maybe we can manufacture some of those, Dave.

Caleb Barlow: CyberWire SCOTUS Nerd.

Ben Yelin: Yeah. Exactly. That could be a new vertical for us.

Dave Bittner: There we go.

Ben Yelin: The SCOTUS Nerds.

Dave Bittner: Corner the market on SCOTUS Nerd T-shirts.

Ben Yelin: But, Caleb, I love your questions. They're thought provoking. We'd love to have you back and do this again.

Caleb Barlow: Well, let's also mention this, right? I mean, there are a lot more of these things that drive operators nuts. And, you know, if they've got more questions, just drop us a note on LinkedIn, and let's talk about it some more.

Ben Yelin: Absolutely. Hundred percent.

Dave Bittner: All right. So the first of many. We're going to make this a regular thing here. Caleb Barlow is the CEO of Cylete. Caleb, thank you so much for joining us.

All right. Well, that is our show. We want to thank all of you for listening. We'd love to know what you think of this podcast. You can email us at cyberwire@n2k.com. Your feedback helps us ensure we're delivering the information and insights that help keep you a step ahead in the rapidly changing world of cybersecurity. N2K strategic workforce intelligence optimizes the value of your biggest investment: your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our senior producer is Jennifer Eiben. This show is edited by Elliot Peltzman. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.