Having your CISO's back.
Paul Kurtz: When you look at the regulatory authorities that are out there today, the SEC does make a fair amount of sense because it can cover publicly traded companies and, in this case, they've also covered companies in the investment business as well.
Dave Bittner: Hello, everyone, and welcome to "Caveat", the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner and joining me is my cohost Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: Today Ben has the story of Google moving to end geofence warrants. I've got the ongoing debate on whether or not to pay the ransom. And later in the show, we're joined by Paul Kurtz, Chief Cybersecurity Advisor, and field CTO at Splunk with his perspective on how the CISO and board view the new SEC cyber disclosure regulations. While this show covers legal topics, and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, we've got some interesting stuff to share this week. You want to kick things off for us here?
Ben Yelin: Sure. So, my story comes from TechCrunch and is by Zack Whittaker. And it's kind of a follow-up story, an unexpected follow-up story to our episode last week. We talked about a Fourth Circuit case dealing with the sticky issue of geofence warrants and whether that passes constitutional muster under the Fourth Amendment. One of the things we discussed is that in the absence of any prevailing legal standard, we kind of rely on Google as the chief collector of our location information to set the policy for us through their terms of service. And that's what they've done. What came out after we recorded last week is that Google is moving to end geofence warrants, which is pretty remarkable and they have the power to do it given their market share in this space. So, the big headline here is that Google will allow users to store their location data on their devices rather than on Google servers. And this will have the effect of ending the practice that allowed law enforcement agencies across the country to go to Google and say, "Hey, can you give us all of the devices that were in this particular area at this particular time?" If that's stored now on individual devices, not on the servers, it's impossible for law enforcement to get. So, they didn't explicitly say that they were ending the use of geofence warrants, I think that might have been a little too big of a pill to swallow and maybe that's not something they want to share publicly necessarily. So, the way they couch it is that they're going to give users more power to control their own data. And I think this is very significant. Now, other companies, they may or may not follow suit but I think it's important to note how big of a player Google is in this space. Not only are they -- do they operate all Android devices but most of us use Google Maps so they know where we are at all times.
Dave Bittner: Right. They're the biggest provider of mapping services out there. I guess, Google Maps is the --
Ben Yelin: Google Maps is the big thing, exactly. And so, I think it can be understood how big of a decision this is. What I'm waiting for is the type of pushback that we may or may not see from law enforcement agencies at both the state and federal level. Geofence warrants have increased exponentially over the past several years. Google received I believe over 11,000 geofence warrants the last year that it was measured in 2020. And that was about a quarter of all the legal demands that Google received. So, if you can think about all of the legal demands, subpoenas, requests under the Stored Communications Act, having a quarter of them be these geofence warrants, which are not particularized, I think is really startling. And I think in the long run, it may have been bad publicity for Google to know that the government has access through Google to this tool that might be used to get us in trouble with the law. So, my question is how does law enforcement react? If they say this is going to inhibit our ability to solve crime, we're in a high-crime environment, especially in these major cities, here are a bunch of specific examples where a geofence warrant has led to a conviction where it otherwise might have not led to a conviction. And I wonder if either that would force Google to voluntarily backtrack on this decision, which I doubt, or if it motivates Congress to pass legislation requiring Google to maintain that data on its own servers, which would be, to put it mildly, incredibly controversial but I don't think that type of thing would be out of the question.
Dave Bittner: Really? Interesting. Yeah. I -- hm. So, a couple of things here. I mean, before we dig in too deep, I mean, geofencing, just to lay out what that is in case folks need a quick refresher here. I mean, this is when -- and correct me if I'm wrong, this is when law enforcement comes to an organization like Google and they say we want to know everybody who was in this place, in this range of times based on the location pinging that their devices have done to you.
Ben Yelin: Exactly. And there's no individualized suspicion. There's no -- at this point, you don't have an identified suspect. That would be a very different subject. What's happening here is simply we want a bulk collection of all the devices that were in this area at this particular time. And it's been used to solve crimes as petty as, you know, theft from supermarkets to as serious as the January 6th attacks. So, it's pretty widely used by both state and federal agencies, which is why this decision I think is so significant. And it is something that is unique to Google. One of the things they note in this article is that Apple claimed in 2022 that it had only received 13 geofence warrants. The reason they received such a tiny fraction of what Google received is that Apple "does not have any data to provide in response to geofence requests because this data relies on -- or resides in users' devices, which Apple cannot access." So, I think Google is trying to become Apple in this realm. And whether this really inhibits the work of law enforcement I think is the key question here. I certainly think we could see some type of proposal that requires these big tech companies to retain location data at their servers for some limited time, maybe it's 45 days, maybe it's a year just as a crime-fighting measure given how effective this tool has been. I think you would get massive pushback from some of the civil libertarians in Congress, organizations like EPIC, and the Electronic Frontier Foundation. And I think Google is kind of responding preemptively to those types of criticisms here. I think they've -- maybe expressing a mild form of guilt that they've been part of this form of mass data collection.
Dave Bittner: Are you projecting, Ben?
Ben Yelin: Maybe I am projecting on my friends at Google.
Dave Bittner: So, the fallback here perhaps for law enforcement is that they would still have access to cell tower location data, right?
Ben Yelin: They would. But to get cell tower location data per the Carpenter decision, you need a warrant with individualized suspicion based on probable cause.
Dave Bittner: Ah, so that's a much higher burden.
Ben Yelin: A much higher burden. So far, Google had required the government to produce a warrant for them to turn over geofence data, but the government never needed individualized suspicion because, by definition, when you're asking for a geofence warrant, you're asking for everybody's information, not just one person's information. So, cell site location information usually pertains to an individual, and per the Carpenter decision, that requires a warrant based on probable cause, which is difficult to obtain, especially when you're at the beginning of an investigation and you don't have any leads. It's just one of those things that's going to make the life of law enforcement agencies that much more difficult.
Dave Bittner: So, at the risk of getting in the weeds here, why is it that the Carpenter decision made it so that cell tower location data requires that individualized suspicion but a geofence from someone like Google does not?
Ben Yelin: Well, I guess, I should be clear here, so things like tower dumps, which are kind of the very close cousin of geofence warrants, still might be permitted. I think there is some disagreement across jurisdictions. So, tower dumps are getting the cell site location information from everybody within a particular area at a particular time. I think the Carpenter decision makes it more difficult because we have this Supreme Court finding that there's something unique about cell site location information. The unique thing about cell site location information vis-à-vis something like Google Maps is the lack of voluntariness. So, all of us use our devices, we have to use them to participate in modern life. We're not really consenting to any company collecting our data here.
Dave Bittner: We're heading towards the EULA, aren't we, Ben? We're heading towards the EULA.
Ben Yelin: Yeah. But I think what the Supreme Court says, I think they've actually kind of dismissed the idea that you can EULA your way out of this, so to speak, by saying you need a cell phone to participate in modern society. As soon as you turn on your device, it pings a tower, so you're not really making any conscious choice to forfeit your reasonable expectation of privacy. Whereas for something like Google Maps, you are making that choice. I mean, you do have to agree to Google Maps' terms and conditions, and it's just much more obvious to an average consumer that when you're using mapping technology, rather than just making a call on your cell phone, your phone company is going to have an idea of your location. It's just much clearer to the average non-tech-savvy consumer. So, I don't think this necessarily means that all tower dumps are unconstitutional, I just think because of the Carpenter decision, there is this increased level of protection for historical cell site location information.
Dave Bittner: Yeah. All right. Well, I mean, this seems to me to be a good thing for individual privacy.
Ben Yelin: I think so. And I think for Google's reputation as a company that is privacy conscious and, you know, I think perhaps their view in the long run is that this will be better for their bottom line.
Dave Bittner: Yeah. All right. Well, we'll take it. We'll see how it plays out, right? Yeah. All right. We'll have a link to that story in the show notes. My story this week comes from our friends over at the Washington Post, specifically, Tim Starks, who writes the Cybersecurity 202 at the Post and also is a regular over on the CyberWire podcast with me. In light of the recent story, as you and I are recording this here today, we got word that the ransomware group Rhysida has posted about one and a half terabytes of data on their darknet leak site that affects Insomniac Games. They're the folks who make Spider-Man 2, and they have a game that they're working on with the Wolverine character. And basically, what happened is Rhysida hit Insomniac Games with ransomware. Insomniac Games said, "We're not paying the ransom." And Rhysida said, "Okay. We're going to publish your stuff." And that's what's happened. Rhysida has stuck to their word.
Ben Yelin: Can I just say something really quickly on this?
Dave Bittner: Sure.
Ben Yelin: They said that they were able to obtain the next 12 years' worth of sensitive commercial and strategy documents, Slack screenshots, and personnel files that would reveal the studio's release slate for 12 years. I had no idea that these companies -- and maybe it's not universal for all these companies -- do that level of planning. I'm frankly kind of impressed. I don't know what I'm going to be doing in 12 days from now, let alone 12 years from now.
Dave Bittner: Right, exactly. Yeah, what's your 12-year plan? I don't know.
Ben Yelin: Try not to die.
Dave Bittner: I'm not going to -- yeah, I don't know what I'm having for lunch today. I don't know what my 12-year plan is. No, well, you know, I think a lot of that has to do with just the scope of these games now, you know, they're bigger than movies.
Ben Yelin: Absolutely.
Dave Bittner: And the amount of effort that goes into creating something like this I guess you need to have that sort of timeline, you know. I imagine a lot of it too is just lining up investment and getting all your ducks in a row, you know, I'm sure some of it is aspirational.
Ben Yelin: Right, right.
Dave Bittner: But that leads me to the story in the Washington Post from Tim Starks and his colleagues there, where they surveyed a bunch of cybersecurity professionals to see whether they favor or oppose a ban on paying the ransom. Any guesses as to where this landed, Ben?
Ben Yelin: Well, I have the answer unfortunately in front of me. So --
Dave Bittner: Okay. Well, according to the survey, 74% --
Ben Yelin: We should give our users -- or our listeners just a quick second to think up their own answer and then see if it aligns with the actual number.
Dave Bittner: Yeah. I'm trying to think what I would have thought. I guess this tracks with me. So, here's the answer: 74% of the cyber experts that the Post surveyed oppose a ban on paying ransoms.
Ben Yelin: That's in line with what my expectations would have been as well. Yeah.
Dave Bittner: Why do you -- why would you have thought that it would go this way?
Ben Yelin: I think I've just talked to enough individuals who have been impacted by ransomware events. Specifically in my work, you know, it's local governments who are caught so flat-footed that they're being presented with some of the worst options imaginable, and they don't want to be hemmed in by any type of ban like this, given how desperate a situation it becomes when you have to recover that data. All circumstances are different, and I think that's what some of these experts are getting at here. But, yeah, I mean, it's more just kind of a general vibe I've gotten from talking to people in this space.
Dave Bittner: Yeah. This article goes through some of the pros and cons, you know, the arguments for and against. They say arguments against a ban include the risk of penalizing the victims. And they point out especially folks in critical infrastructure, just taking away their options, you know, if you're supplying water for a town and paying the ransom means that the water gets turned back on, you got something you got to weigh.
Ben Yelin: Right. I mean, those are the kind of conversations people are having. It's never -- like it's always easy in the abstract to say, "Oh, I would never pay a ransom." Like you're encouraging the cyber criminals. But until you've been in that situation -- now, granted, paying the ransom doesn't guarantee anything. But it's just -- it's you have to kind of put yourself in those shoes.
Dave Bittner: Yeah. There's a worry that it would drive the activity underground, that people would still find a way to pay the ransom but now they'll just have to hide it from regulators, from, you know, their investors, all that sort of thing.
Ben Yelin: I'm just going to Venmo you for Fantasy Football.
Dave Bittner: Right, exactly.
Ben Yelin: For an amount of $1 million.
Dave Bittner: Right. And then also people question if this is the government's role here. Does the government have a place to enforce a ban like this? And that's an interesting question.
Ben Yelin: That is an interesting question. I'm kind of more torn on that question. I sort of think they do, just given how vast an impact it can have, especially on governments but also large companies and the private sector. Like it does have downstream effects on the economy. And so I think the government has a sufficient interest in that. If you take the Constitution at its word that the government can regulate interstate commerce, I think a big ransomware attack -- if you were to make an assumption that government policy could stop ransomware attacks, I think it's within reason that the government would have a role here. That's my view on it.
Dave Bittner: And then turning to some of the arguments in favor of the ban, obviously if you can't pay the ransomware people and they can no longer get their money, then they're going to move on to something else, theoretically, right? They're in this for the money and if there's no money, why bother?
Ben Yelin: Right, right.
Dave Bittner: But there are some other interesting things here. I mean, they point out that it could encourage better security practices. If you take away the option of paying the ransom, then that means you have to build a stronger wall around your organization or put better things in place to prevent it in the first place.
Ben Yelin: That's compelling. You know, the one pushback I could see to something like that is even companies that follow NIST standards and all the security guidelines are still the victims of attacks; it can still happen as cybercriminals get more sophisticated. So, yes, it might incentivize better security practices, but that's not a foolproof answer here.
Dave Bittner: Yeah. It kind of reminds me of, you know, that old chestnut of we do not negotiate with terrorists.
Ben Yelin: Right.
Dave Bittner: Right?
Ben Yelin: You should probably negotiate with terrorists sometimes. Yeah.
Dave Bittner: I mean, I think, you know, it's probably more accurately said we don't negotiate with terrorists except when we do.
Ben Yelin: Right.
Dave Bittner: And I think a ban would be that as well. There would be exceptions. And it's one of the things they point out here is how do you decide what entities get exceptions, you know, do public companies -- are they banned from paying but a private organization or a critical infrastructure provider is allowed to pay? You know.
Ben Yelin: Yeah, that gets into sticky territory there. Then where do you draw the line? What if it's like a public-private partnership? You know, how do we -- I know the government has attempted to just -- to define critical infrastructure but there are some things that are sort of on the line between being critical infrastructure and not critical infrastructure. So, I think that might cause more problems than it would be worth.
Dave Bittner: Yeah. There's an interesting nuance here that they pointed out in the article that I think is worth considering. They say that sometimes the payments help identify the attackers. So, when you make that payment, you know, you can follow the money. You can find out who was this, where did it go, how did they cash it out, you know, how did they try to launder it. Those sorts of things can help with law enforcement and even national security in figuring out who's behind some of this stuff, which is interesting.
Ben Yelin: Yeah. It's interesting too. I mean, that's never a situation you want to be in, but I guess there are some positives that make it slightly easier to identify your attacker.
Dave Bittner: Yeah.
Ben Yelin: You know, I think there are ways to still have a proper incentive structure without going for a full ban. So, they suggest in this article prohibiting insurance companies from covering ransom payments. We've talked about that as a proposal. And then requiring public disclosure of payments, which seems to me just to be like an unnecessary shaming of organizations. Like, let's put you in the town square and beat you with a stick for being a sucker and paying the ransom here, which I don't necessarily think is helpful, especially in situations where you don't know what an organization -- what type of decision an organization has to make and it might not be fair to parade them out in public and use them as an example of somebody who didn't use proper security protocols.
Dave Bittner: I mean, that's an interesting thing. And I'm thinking of the ways that that meshes with some of the disclosure requirements that we're seeing. In fact, you know, our conversation later in the show with Paul Kurtz is about that very thing, you know, some of the SEC's requirements on disclosure. So, if you have to disclose a breach, are you allowed to be coy about whether or not you paid the ransom, right?
Ben Yelin: Yeah, I mean, I think that becomes a really interesting question. Yeah. Maybe you can just plead the Fifth in your filings, just say -- this is a joke, by the way, you can't actually do that. But just say I will neither confirm nor deny if I paid the ransom.
Dave Bittner: I always think about with this sort of thing what -- how different would it be if we did not have cryptocurrency as a payment enabler?
Ben Yelin: Maybe we'll find out in a couple of years when the entire cryptocurrency industry collapses. Although I believe that something will take its place. That might be a kind of quasi-cryptocurrency. But that's a good question. I mean, that's what's enabling all of this. We used to have a system where money could be easily tracked. You know, in the old days, it was tracking the serial numbers on dollar bills, and now we have more sophisticated ways of tracking financial transactions. That's the dream of cryptocurrency, to take these transactions off the grid, away from the prying eyes of the government. But these are the consequences. It does become harder to track. And it does make it harder for us to get leads on who's perpetrating these attacks.
Dave Bittner: Yeah. All right. Well, there are a lot more details of this poll, that again Tim Starks and his colleagues at the Washington Post did so we will include a link to that in the show notes, do check it out. It is well worth your time. [ Music ] Ben, it is my pleasure to welcome to the show Paul Kurtz. He is the Chief Cybersecurity Advisor and field CTO at Splunk. And he's going to talk with us today sharing his perspective on how the CISO and the board of directors view the new SEC cyber disclosure regulations. My pleasure to welcome to the show, Paul Kurtz. So, we've had this interesting new set of rules come from the SEC when it comes to cyber disclosure. Paul, I would love to start off with just kind of getting a little bit of the lay of the land from you. What was your response when you saw these rules coming down the pike from the SEC?
Paul Kurtz: Well, I thought it's probably one of the most significant developments we've seen in some time in really holding companies to task on the status of their cybersecurity efforts. Previously, you know, there were a few 10-K filings that specified, you know, some cybersecurity data, but really the SEC had not done anything to direct greater compliance and disclosure related to it. It just went into effect yesterday.
Dave Bittner: So, Paul, why do you suppose that the SEC is the appropriate organization here to take this on?
Paul Kurtz: It's a really good question, Dave. When you look at the regulatory authorities that are out there today, the SEC does make a fair amount of sense because it can cover publicly-traded companies, and in this case, they've also covered companies in the investment business as well. And you already have a normal reporting structure for 10-Ks and quarterly filings. And so, that makes it a worthy channel in order to disclose cyber events. If you look across the regulatory space for others who might have that kind of jurisdiction, they really don't exist. And then I think I'd be wrong if I didn't say that, you know, the chairman at the SEC is very much an activist on cybersecurity. And for some time he has wanted to, if you will, bring greater disclosure into publicly traded companies so that investors have greater insight into how cyber attacks might be affecting their investments.
Ben Yelin: Hey, it's Ben here, Paul. First of all, it's good to be talking with you. My question is how do you get buy-in, especially in some of these C-suites on having cyber expertise on boards or giving enhanced power to CISOs, how do you get buy-in from senior leadership in organizations to convince them that this is a serious problem worth addressing just given resource constraints?
Paul Kurtz: Yeah. Well, I think the SEC guidance really helps bring that buy-in because now it is hard for boards of directors to, so to speak, sidestep, and, if you will, not see disclosure as an issue of importance. Previously for boards of directors, you know, it was approving budgets for cybersecurity, a lot of to-and-fro on whether or not companies need to spend money on cybersecurity. But now that's all changing. And I think the CISO now has, if you will, more heft in the ability to influence boards of directors that their compliance status needs to measure up to reality. By that I mean if you have companies that are complying with, you know, all the various guidance that's out there, you'll also need to be able to back that up with real technology. It can't just be a statement of compliance that is not backed up with capability. And in fact, the case of SolarWinds, where the CEO -- or excuse me, the CISO -- was held to task and actually is a defendant, I think will underscore the importance both for boards of directors and obviously for CISOs of wanting to make sure that reality matches whatever compliance guidance they might be putting together.
Ben Yelin: I guess I feel like a lot of CISOs, their pleas fall on deaf ears. And there's been just a problem with not listening enough to the CISOs. I've kind of heard that in the public and private sectors. And I guess my worry is it's going to take an enforcement action for organizations to take this seriously. So, I guess my question is can you give us like the worst consequences you can imagine for an organization for not being prepared to comply with these? Let's create some fear in some of our listeners here. What's the worst-case scenario for an organization if they do not try to address this problem adequately?
Paul Kurtz: I think the worst-case scenario is that the SEC steps in and finds somebody not to be in compliance with the regulation and names him as a defendant in a case. And we've already crossed that threshold. In the case of SolarWinds, the CISO has been named as a defendant and is going to be held to account. So, that is the worst-case scenario in terms of a CISO. What that means in terms of the company itself is a little more opaque. But I would think that going forward, given that the SEC regulation is now in place that we're going to see more action on the part of the SEC to hold companies to account as well.
Dave Bittner: Paul, I'm curious for your insights on practical things that folks should be doing now. Both the CISO and if I'm somebody sitting on a board, what sort of boxes should I be checking off or things -- lists that I should be making to make sure that my organization is doing the right thing here and not running afoul of the SEC?
Paul Kurtz: Yeah. The security infrastructure that you have in place has to match up with what you say you are doing in order to secure yourself. So, that's kind of point one. In other words, the realities have to match. You have the appropriate security controls in place with the supporting technology and you're regularly understanding the status of your systems, in effect, you know, having that mission control, that ability to have real insight in real time on the status of your capabilities. This requires really the CISO and the board to come together. In other words, it's not necessarily all on the CISO, it is not necessarily all on the board, so we're going to have to see or I think what we will see is boards of directors and CISOs working more closely together to make sure that they have a program, a process in place in order to handle disclosures relating to hacks. And I think we've seen this just in the past 24 hours with VF Corp disclosing that they've in fact had a breach where you can see that there's a level of coordination between the CISO and the company itself as to what they're ready to disclose as far as the hack that occurred a couple of months ago.
Dave Bittner: And I'm curious, Paul, in the past, I don't know, a year or so, maybe a little more than that even, it seems to me like there's been a necessary elevation of the role of the CISO. And I think a lot of people complain that, you know, CISOs aren't truly members of the C-suite yet, that there's still some work to be done there. But it seems to me like there's really been a recalibration and an emphasis, both on the role the CISO but then also making sure that your board of directors has the necessary expertise when it comes to some of these technical issues. Do you think my line of thinking there is on track? Has that been your experience as well?
Paul Kurtz: Yeah. I do think, Dave, you are on track on that, in that, you know, there is greater coordination and understanding between the CISO and the boards of directors. I will say -- and I was at a conference a couple of weeks ago in Charlotte and met with a lot of CISOs -- CISOs are concerned. They're concerned because, with the new guidance, there's a lot of responsibility that they hold regarding the security of the enterprise. And with the SolarWinds CISO being held to account and becoming a defendant in an SEC action, it puts a lot of pressure on the CISO. And you could argue that the net effect might discourage individuals from taking the job as CISO because of the risk, because, if you will, the personal risk. That all being said, I think there is an opportunity because of the action of the SEC with regard to the SolarWinds CISO, in that it does give a CISO more leverage with the board. The leverage to ensure that budgets are fulfilled and they actually have the capabilities in order to defend the enterprise. And so, I think we're in a new space here: CISOs under greater risk but possibly at the same time having more leverage.
Dave Bittner: Paul Kurtz is the Chief Cybersecurity Advisor and field CTO at Splunk. Paul, thank you so much for taking the time for us today.
Paul Kurtz: Thank you. I appreciate it. [ Music ]
Dave Bittner: And that is our show. We want to thank all of you for listening. A quick reminder that N2K strategic workforce intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at n2k.com. Our Executive Producer is Jennifer Eiben. The show is edited by Tré Hester. Our Executive Editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.