Caveat 9.21.23
Ep 188 | 9.21.23

Make your mark CISOs.

Transcript

Karen Worstell: How do you manage your overall technology digital footprint in such a way that you would know what was happening in that environment? And are you doing the basic blocking and tackling that keeps that risk to an acceptable level?

Dave Bittner: Hello, everyone, and welcome to "Caveat", the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hey, Ben.

Ben Yelin: Hello, Dave.

Dave Bittner: Today, Ben discusses a Michigan case dealing with persistent unmanned drone surveillance. I've got the story of a California judge blocking a law aimed at increasing the online safety of kids. And later in the show, my conversation with Karen Worstell of VMware discussing how CISOs can make their mark with the new SEC rules. While the show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. All right, Ben, let's jump right into our stories here. You want to kick things off for us?

Ben Yelin: So I wanted to talk today about unmanned aerial vehicles or drones. This is a case that comes out of Michigan. It's currently before the Michigan Supreme Court. And I was alerted to it by the Electronic Frontier Foundation. They filed an amicus -- friend of the court -- brief arguing that this type of persistent surveillance is unconstitutional. So this is in Long Lake Township. Sounds beautiful. I have no idea where it is. They hired private operators to repeatedly fly drones over Todd and Heather Maxon's home to take aerial photos and videos of their property as part of a zoning investigation. I'll note that --

Dave Bittner: Wait, zoning?

Ben Yelin: Yeah. It would be one thing if they were like, you know, cooking meth in their backyard.

Dave Bittner: Right. They've got 100 acres of fine marijuana growing or something.

Ben Yelin: Yeah.

Dave Bittner: But zoning? Okay.

Ben Yelin: It seems very weird that they would do this for a zoning investigation.

Dave Bittner: All right.

Ben Yelin: Now, I have no idea what that entails. Maybe there is a crime of moral turpitude that just relates to zoning. I have trouble imagining what that would be.

Dave Bittner: Could be. Could be.

Ben Yelin: The city, Long Lake Township, is arguing that the Maxons here -- spelled M-a-x-o-n. I guess I mispronounced it the first time. Go with Maxon.

Dave Bittner: There you go.

Ben Yelin: The Maxons don't have a reasonable expectation of privacy in their backyard. Therefore, there is no Fourth Amendment search. And therefore, it is proper that this township paid to have these warrantless drone surveillance searches conducted over a long period of time. This is based on a lot of case law going back to the 1980s. So in order for there to be a search for Fourth Amendment purposes, somebody's reasonable expectation of privacy has to be violated. And there are all these cases in the '80s that dealt with aerial surveillance. Usually, those cases did deal with people growing drugs in their backyard. And the Supreme Court said, look, that's in public view. Theoretically, anybody could see and take high-resolution photos from 10,000 feet above your house. Then you get a decent view of what you were growing. You shouldn't have an expectation of privacy. Therefore, no search. Therefore, no warrant required. What EFF, the Electronic Frontier Foundation, is arguing on behalf of the Maxons is that this kind of search is fundamentally different than the previous aerial surveillance cases. And that gets to a broader point about how as technology changes, it's appropriate to, in the view of EFF, disregard some of this outdated case law. Because back in the day, yes, it was possible to do aerial surveillance.

Dave Bittner: Right.

Ben Yelin: But it was prohibitively expensive for a local township to fly a plane in the sky. It was far more resource-intensive. It was something that would require a sophisticated law enforcement operation. That's similar to the difference between tracking cell site location information today and tracking somebody's real-time location 30 or 40 years ago.

Dave Bittner: So is the idea here that in the past it was kind of self-limiting just because of the expense and hassle of doing it?

Ben Yelin: Exactly. And that's a very common constitutional argument now from civil liberties advocates. The reason it might have been acceptable in the 1980s is that, as you say, there was this kind of natural limiting factor of resources on the part of these small towns. And now we have drones. Anybody can buy them. They're very inexpensive. They're very easy to fly and operate. Our next-door neighbor, who's 11 years old, flies a drone. And I'm always checking to see whether he's flying it over my property and my curtilage. So at least theoretically, any 11-year-old kid who has an Amazon account could do some type of drone surveillance on somebody's house.

Dave Bittner: Right.

Ben Yelin: And because of the ubiquitousness of this technology and how easy it is to procure, the argument is aerial surveillance has fundamentally changed in the last 40 years and as a result, the law should fundamentally change.

Dave Bittner: Can I ask a rookie question here?

Ben Yelin: Yeah, there are no rookie questions when it comes to the law.

Dave Bittner: Well, wait till you hear my question then.

Ben Yelin: All right.

Dave Bittner: So can you explain to me the difference, if any, between surveillance and a search?

Ben Yelin: Sure. So a search in layman's terms would be to seek to find something. I think that's how we would all understand it if you didn't have the misfortune of going to law school. Law school ruins normal words, ruins nomenclature.

Dave Bittner: Okay.

Ben Yelin: Because search for legal purposes is defined very specifically as either a physical trespass, which is not happening here, or a violation of one's reasonable expectation of privacy.

Dave Bittner: Okay.

Ben Yelin: So whether there is a physical trespass or not, if the government violates somebody's reasonable expectation of privacy, that is a search, no matter what type of surveillance, what type of form that surveillance takes.

Dave Bittner: Okay.

Ben Yelin: So the argument here on behalf of the Maxons and on behalf of EFF is that people should have a reasonable expectation of privacy from drones, casual, cheap drones flying over their property and taking pictures. Interestingly, one of the cases they cited was a Fourth Circuit federal case from our home area here in Baltimore. A bunch of activists in Baltimore got together and created a group called Leaders of a Beautiful Struggle, which, how are you going to oppose them in a law, in a legal case?

Dave Bittner: Sure.

Ben Yelin: And they challenged this aerial surveillance program that existed in Baltimore City. For several years, Baltimore City would fly manned aircraft -- it wasn't unmanned -- at a relatively low altitude. It would just fly kind of in circles around downtown Baltimore and would take real-time photos. This was a crime-fighting measure.

Dave Bittner: Right.

Ben Yelin: You couldn't identify people's faces. They were just kind of little dots. But if you saw a dot go from a crime scene to a house, you get a pretty good idea of how to find a person. So this was challenged in federal court, and the Fourth Circuit held that this was kind of analogous to what happened in Carpenter, the Supreme Court case, which held that because the nature and quality of historical cell site location information is so fundamentally different, that's how we should approach all technology cases. So the fact that in the past aerial surveillance might not have qualified as a search has no bearing on the present where we have the type of technology that an aircraft can go up and take real-time photographs every split second to get this kind of encyclopedic knowledge of the happenings in Baltimore City. I think that's very similar to what's at stake in this Michigan case. Is the court going to adhere to this old precedent from the 1980s about aerial surveillance, or are they going to do what courts have started to do more regularly and reconsider past doctrines in light of modern technology?

Dave Bittner: All right. I have a couple questions here. We have talked about cases involving pole cams before. So I'm curious. My recollection with that is that if I have a piece of property, I have no expectation of privacy. So somebody can sit across the street in their squad car or unmarked squad car and keep an eye on my property. Then I put up a fence to increase my expectation of privacy, right?

Ben Yelin: Yeah.

Dave Bittner: So to get above that fence, now law enforcement has put a camera up on a pole. And my recollection is that the courts considered the possibility that that was different, as an attempt to circumvent my specific effort at privacy. A pole cam is different than someone sitting and watching something that's in plain view.

Ben Yelin: Right, right. And you've exhibited a subjective expectation of privacy in that scenario by building that wall and building that fence.

Dave Bittner: Right. Now my understanding of the law when it comes to drones is that they fall under the authority of the FAA. And because they're considered little tiny aircraft, I'm not allowed to restrict a drone from flying over my property, because the air is controlled by the FAA, not me. It's the same way that I can't prohibit a commercial airliner from flying over my property -- if only we could.

Ben Yelin: Right. This can be loud if you're near the airport.

Dave Bittner: Don't get me started. But I can't prohibit a drone from flying over my property for the same reasons. Does that change if that drone is taking photos?

Ben Yelin: I think it does fundamentally change. First of all, the government is involved here. So if it's just a private person flying a drone over your property, you'd have a tough time arguing that it's some type of nuisance on your property, because the FAA regulates that airspace.

Dave Bittner: Okay.

Ben Yelin: Which is a little weird, frankly, that a neighbor can fly a drone over your property and there really aren't any significant legal consequences. You probably just have to talk to your neighbor.

Dave Bittner: Yeah.

Ben Yelin: It is different when the government is involved. That's the Fourth Amendment implication here.

Dave Bittner: I see.

Ben Yelin: What happened is that this town paid to have a private company fly a drone over this property and take photos. And they did so without a warrant. So when it's the government doing it, presumably for -- I'm not sure if this is a civil case or a criminal case. I would guess it's probably a civil case if we're talking about zoning.

Dave Bittner: Right.

Ben Yelin: But since the government's doing it, that's why the Fourth Amendment is implicated here.

Dave Bittner: What about satellite photography? I mean, we've got very high resolution, you know, one-meter resolution from satellites. Where does that fall in here?

Ben Yelin: I mean, this is where it gets really sticky.

Dave Bittner: Yeah.

Ben Yelin: One of the reasons the Supreme Court had this doctrine that developed in the 1980s and the 1990s about not having a reasonable expectation of privacy in your backyard is that satellite technology existed. And as satellite technology continued to evolve, the pictures are becoming clearer and clearer. I mean, look at Google Street View or whatever the overhead Google view of your home is.

Dave Bittner: Right.

Ben Yelin: You know, they were able to adequately capture the congratulations on my preschool graduate sign in my backyard. That's pretty darn sophisticated.

Dave Bittner: Yeah.

Ben Yelin: You know, I think even if we expect a satellite to be taking photos, there is some type of limitation on that. They are not taking multiple photos in real time. Usually it's rather isolated. They're taking photos either on an as-needed basis or it's not the type of thing where there is a photo being taken every single second and a person can zoom in and then you have this chronicle, this encyclopedia of what's happening at somebody's house.

Dave Bittner: Right.

Ben Yelin: That's generally not how things work with satellite technology, but that's what's happening here with the drone is it's real time constant monitoring. So it's just a little more involved than overhead.

Dave Bittner: A little more invasive.

Ben Yelin: More invasive than that type of surveillance. Now, how that matters from a legal perspective, I don't know. I have to say, I mean, this is a really difficult issue for me. I can understand why somebody does not have an expectation of privacy in the outdoor curtilage around their house.

Dave Bittner: Right.

Ben Yelin: It is something that people can see into, planes at low levels. I mean, that's what the Supreme Court has said.

Dave Bittner: Yeah.

Ben Yelin: It's up to the Supreme Court to determine that this type of surveillance -- first the Supreme Court of Michigan -- that this type of surveillance is just fundamentally different and merits a reconsideration of that legal doctrine. And I don't know if they are going to be able to do that in this case.

Dave Bittner: I mean, what if I just get a tall ladder and I'm on the neighboring property, you know, tall ladder and a camera with a long lens, same thing, right?

Ben Yelin: Well, you'd be a creep, first of all.

Dave Bittner: I'd be what?

Ben Yelin: You'd be a creep.

Dave Bittner: Well, yes. Yes. That's beside the point. Right.

Ben Yelin: Yeah, I mean, I guess there are more limitations in that. I think the law recognizes that snooping neighbors as long as they don't trespass on your property is just a reality of living in a neighborhood.

Dave Bittner: So the crux of this is that it's the government.

Ben Yelin: Yes.

Dave Bittner: Is that right? I mean, that's why this matters.

Ben Yelin: That's why this matters for Fourth Amendment purposes. That's correct.

Dave Bittner: Okay.

Ben Yelin: You could have a cause of action potentially for your neighbor snooping on you, but it's really hard to do because that's going to require that they've kind of pierced your property in some way, either through a physical trespass or they have the type of technology that most people can't procure where they're looking inside your house, a place where you have that expectation of privacy. But yeah, since the government is involved here, we have Fourth Amendment implications. And I'm just curious to see if courts at the state level follow the path of Carpenter, where, yes, in the past, this type of third-party surveillance, as took place with cell site location information, might not have been a search, but it's a search now because of how pervasive the technology is. And I'm wondering if state Supreme Courts are going to follow on a similar path for all different types of technology. We're talking about drones here, but I think it has wider applicability. As new technology that's more invasive gets introduced, are courts willing to consider doctrines that were formed far before this technology existed? And I think that's why this case is so interesting.

Dave Bittner: Yeah. It strikes me as being potentially at least a very fuzzy line. Are we in like pornography, I know it when I see it kind of thing, case by case?

Ben Yelin: That really does come from a Supreme Court opinion. And it was really one of the dumbest lines that's ever been written by a Supreme Court justice.

Dave Bittner: Okay.

Ben Yelin: Because really, it is your job to come up with a dividing line. And I know it when I see it is not really an easily justiciable dividing line.

Dave Bittner: Right. But that's the thing. I mean, that's what makes this hard.

Ben Yelin: It's really hard. Yeah, that's what makes this hard. I mean, the way courts have distinguished it in the past is if it's a technology that's widely available, then that cuts against somebody's reasonable expectation of privacy. So there was a case, Kyllo, a famous Fourth Amendment case from 2001, where the government used this type of rare infrared technology to measure heat emanating from somebody's house.

Dave Bittner: Oh, yeah, yeah.

Ben Yelin: And they ruled that that was a search because people wouldn't expect that somebody has infrared readers on the outside of their house.

Dave Bittner: Right.

Ben Yelin: That was beyond reasonable expectations. But here, I mean, the fact that everybody knows or should know that that drone technology is out there, that it's relatively cheap to procure, that anybody can buy it, maybe that really does cut against that reasonable expectation of privacy. And to get really meta here and a little bit deep, that's what's wrong with the reasonable expectation of privacy test in general, is that the government can play kind of an active role if they go out and say, well, guess what, guys? We have ubiquitous drone technology. We're going to fly them over your houses. That would cut against a person's reasonable expectation of privacy, and that would be the government playing a role in diminishing Fourth Amendment protection for people just by announcing that they have this ubiquitous technology. And I don't know why we're content in giving the government that power. I mean, previous Supreme Court justices have analogized it to what would happen if the government says, all right, we're going to open all your letters that come into your mailbox.

Dave Bittner: Right.

Ben Yelin: That would certainly cut against your reasonable expectation of privacy. If you heard that on the news, you would have no reasonable expectation of privacy, and it would be the government that would be cutting against that expectation of privacy. And so should we have this entire test on whether there's been a Fourth Amendment search based on something that the government itself can play a role in, if that makes sense? And I think that's something that needs to be reconsidered. There are a lot of good academic papers out there about reconsidering this reasonable expectation of privacy test. This comes from a case called Katz in the 1960s, and I'm one of those people who thinks that perhaps this doctrine does need to be reconsidered.

Dave Bittner: What, in your mind, would be a good solution here, a next generation expectation of privacy guideline?

Ben Yelin: It's hard to find an idea that solves every problem. The one that's come closest for me is a technology-based approach, where courts try to maintain the same level of Fourth Amendment protection that existed prior to the technology's introduction. So whatever level of Fourth Amendment protection you had in your backyard or the curtilage of your property should be maintained in spite of the fact that we now have ubiquitous unmanned aerial vehicles. So whatever the law needs to do to sustain that same level of constitutional protection, the law should do. And that goes in the opposite direction as well. If a criminal uses some type of technology to evade capture by law enforcement, then we should grant more powers to law enforcement to restore that balance as well. For you legal nerds out there, this is called equilibrium-adjustment theory. And among all the theories about how to handle this question of what is a search, I think that's the most compelling, but there really are no perfect answers here. And hopefully we get a lot of smart minds who end up making it to the Supreme Court who can revisit this test, because it creates a lot of these types of situations where you have an unworkable, vague legal standard. It's really hard to adjudicate. And I think the Michigan Supreme Court is going to struggle with this just like courts have over the past several years as this new technology has been introduced.

Dave Bittner: What does the word curtilage mean?

Ben Yelin: It's like the -- yeah, that's another legal term of ours. But it's basically the area adjacent to your house that is part of your property.

Dave Bittner: Okay.

Ben Yelin: So if you have a front yard, a backyard, if the sidewalk is part of your property according to whatever the property records are, that's the curtilage of your house.

Dave Bittner: Okay.

Ben Yelin: Yep.

Dave Bittner: All right. Well, this is an interesting one for sure.

Ben Yelin: Absolutely.

Dave Bittner: As we so often say, Ben, we'll keep an eye on this one and we will.

Ben Yelin: Absolutely.

Dave Bittner: We'll have a link to this in the show notes. Yeah, interesting stuff. All right. Well, my story this week comes from the Washington Post. This is a story written by Cristiano Lima and it is about a judge blocking a California law that's meant to increase the online safety of kids. California had an age-appropriate design code and this judge says that that probably violates the First Amendment. Can you unpack this for us, Ben? What's going on here?

Ben Yelin: Yeah, so California passed a law, the California age-appropriate design code, as you say. It has a couple of requirements. One is that platforms themselves would have to vet their own products before they release them to the public. They would have to vet their products to make sure that whatever they were offering doesn't harm kids and teens. And then the law, I think the less controversial portion of the law would require platforms to enable stronger data privacy protections by default for younger users. So on applications that are geared toward younger users, there would be a requirement on behalf of these platforms to have built-in default stronger privacy protections. This first provision about vetting products before their public release has a couple of First Amendment issues attendant to it. One is that it's kind of vague. I don't know exactly what goes into vetting one's product. It kind of reminds me of a scene in The Office where Michael is worried about the sustainability of his paper company that he opened.

Dave Bittner: Right.

Ben Yelin: And his financial advisor tells him, like, yeah, you're pretty much screwed. And Michael's like, can you just crunch the numbers again? And the guy presses return on his computer and says, crunch. I crunched the numbers. That's kind of what I'm imagining here.

Dave Bittner: Yeah.

Ben Yelin: Like, yep, we vetted it again, and it's still fine.

Dave Bittner: Right.

Ben Yelin: What the platforms would argue is that under Section 230 of the Communications Decency Act and under the First Amendment, the platforms themselves have a right to determine which content shows up on their platforms without having to do this type of internal vetting. That they have the right, both by statute and, by implication, the Constitution, to make those decisions for themselves and to sell the products that they want to sell, subject to other federal laws. So for things like child pornography, obviously, there's a role for the government to play in preventing that from circulating. But for most other types of content, it could be an inhibition on what these platforms publish, and that is a prior restraint on speech, which is almost always a per se constitutional violation. So that's really the trouble with the law here. This is not just a California problem.

Dave Bittner: Yeah.

Ben Yelin: The California law was passed with bipartisan support. This is not a partisan issue. A lot of states, both red states and blue states, are trying to institute these laws. There's a federal law that has not passed but has been proposed to protect child online privacy. And they all kind of run into the same problem: if you are proscribing certain content on the Internet, especially as it relates to accessibility for adults, that could be an inhibition on the First Amendment rights of these platforms. As we know from recently announced retiree Mitt Romney, corporations are people, my friend. So they still do have First Amendment rights.

Dave Bittner: Right.

Ben Yelin: And I think that's going to be a problem for all these laws.

Dave Bittner: How does this differ from, for example, my cable TV provider allowing me to have parental controls on what shows my kids have access to?

Ben Yelin: Well, that's granting the user controls. Now, in all of these contexts, I think these platforms have been pretty good about giving the parent or the guardian or whomever the mechanisms to control content that goes to their kids. But that's all a power that the user has and not a power that the company would have or that the government is imposing on the company. So I think it's completely different and entirely acceptable to give those options to people to use those locking features, age verification if you are implementing them on your own devices. I think what's happening here is that the government is perhaps coercing these companies into censoring content writ large before it goes to anybody's device. And that's the real constitutional issue here.

Dave Bittner: You know, we often talk about how many efforts there are in some of the legislation we see that we kind of joke and roll our eyes at the attempts to think, who's going to think about the children. Right?

Ben Yelin: Yeah.

Dave Bittner: Because I think quite often that's rolled out as a framework or a way to get something across the finish line even if they're not in good faith actually attempting to do that.

Ben Yelin: Right.

Dave Bittner: It seems to me in this case they are in good faith trying to do that.

Ben Yelin: Yeah.

Dave Bittner: Is that a fair assessment?

Ben Yelin: I think it's totally fair. I mean, this is not just an overactive legislature in California. Similar laws have faced these difficulties in Arkansas and Texas. So it's something that everybody is trying to do. I mean, I think of course you should recognize the societal interest we have in protecting kids from harmful content on the internet.

Dave Bittner: Right.

Ben Yelin: But the First Amendment presents all these types of difficult problems. There is no First Amendment prohibition on hate speech despite what you might read on your social media account. That's really difficult. It's a really difficult societal problem. But we have to balance the impact it would have to censor a type of speech versus the value that you would get in that type of censorship. And in this country we've almost unanimously, especially in the last 50 or so years, come down on the side of more speech, more freedom to put out content that you want to put out, not subject to pressures from the government. So I think that's kind of where I suspect the law will settle here. And that's bad news for people who are trying to get these kids' privacy laws across the finish line.

Dave Bittner: So we're putting the effort in the parents' laps to be the filter for what their kids do and do not see.

Ben Yelin: That's going to be the impact of this. Now that presents its own concerns. If you see this as a societal problem, parents aren't perfect, as we all know. And they either might not have the resources or wherewithal to protect their kids from stuff that they see on social media sites.

Dave Bittner: Yeah.

Ben Yelin: I caught my daughter watching a Mr. Beast video where there was an F-bomb. She's almost seven.

Dave Bittner: Living in your household, I'm sure she's never heard that word before.

Ben Yelin: Exactly. Certainly, when I realized that this was her video saying that, I was like, oh, man, I really should do something to make sure she's not watching something like this. And I consider myself a pretty attentive parent.

Dave Bittner: Yeah.

Ben Yelin: So it's not easy, especially when your kids are just growing up around this stuff. So I can see why policymakers want to make it easier for parents. But you just run into this First Amendment buzzsaw, and there's not much you can do about it at that point.

Dave Bittner: At the risk of opening up a can of worms here, I'm reminded of what we're seeing in communities across this great nation of ours where folks are trying to restrict what goes in libraries and ban books and so on and so forth. I mean, it seems to me that that is coming at the problem similarly to the way that this law did by restricting the provider, which would be the library, rather than relying on the parents to be in control of what the kids bring home and consume from the library. Am I off base with my analogy here?

Ben Yelin: No, you're not off base. Just a couple of things I'll point out. One, there have been a lot of lawsuits related to these so-called book bans.

Dave Bittner: Yeah.

Ben Yelin: So you're not the only person that sees the First Amendment problem here.

Dave Bittner: Okay.

Ben Yelin: And also, it's a little bit different because public libraries are public. So there's less of this problem of the government coercing private companies and preventing them from making decisions as to what content to put out there.

Dave Bittner: I see.

Ben Yelin: Libraries, I think, should have a degree of independence, but they just don't because they are government entities.

Dave Bittner: At the end of the day, they're funded by whoever's in office.

Ben Yelin: We the taxpayers.

Dave Bittner: Yeah.

Ben Yelin: Yeah.

Dave Bittner: All right. Interesting stuff. Again, we will have a link to that story in the show notes that is from the Washington Post. We would love to hear from you. If there's something you'd like us to discuss on the show, you can email us. It's caveat@n2k.com. Ben, I always look forward to having a conversation with our next guest here. It's Karen Worstell. She is from VMware. And our conversation today centers on CISOs and how they will interact with some of these new rules we've seen from the Securities and Exchange Commission when it comes to reporting breaches that have -- what's the word I'm looking for here? Material impact. Here's my conversation with Karen Worstell.

Karen Worstell: In a nutshell, I think what the SEC has tried to do here is what I would equate to a Sarbanes-Oxley for cybersecurity. And that is they're attempting to create a reporting system that has consistency and is centralized, so that investors and shareholders have the information they need about risk, particularly in the area of cyber risk, for the purposes of making investment decisions. That is triggered by, of course, what we all know has been happening over the last decade, as the number of major incidents has really expanded and the concern that the threat environment is continuing to accelerate. It has some very interesting implications, but we'll get into that. I think the thing that's really important to remember is they are asking us to disclose material cyber incidents within four days of determining that the event is material. So they're not asking us to report every cyber incident. There has to be a determination of materiality to the company, and that can include a lot of things that are important for us to talk about. And they also want visibility into the company's processes and its approach to cyber risk management, and that needs to be disclosed on the 10-K filing. So these are not just simple reports. These are SEC filings that need to take place, either on the 8-K or the 10-K for U.S. companies; foreign registrants would file those in a different way, but they would still be participating. So it affects anybody who is a registrant under the SEC.

Dave Bittner: You bring up that word material. What sort of wiggle room does that give in-house counsel to decide whether or not something needs to be disclosed?

Karen Worstell: It's not prescribed, right? There isn't a formula or a prescriptive way for somebody to say, if it hits this threshold, it's material for you. They have to look at this from a number of different angles. There are certain accounts in the chart of accounts for a company that, if those accounts were affected, that could be a material thing. But there's also the question of impact. And this is where it gets kind of interesting, because when I'm determining a material impact, there's the impact to my company, whether that might be our ability to operate, or it could be to the people whose information I hold. So there were just some interesting breaches reported in the news in the last few days. And as we look at those, let's just take a payment processor who has a significant security breach, and none of the merchants are able to clear transactions or make sales for a significant period of time. Those merchants had losses. Those losses associated with their inability to conduct business are part of the determination of materiality in some ways. I mean, it could be like, not that you're necessarily directly responsible for their losses, but they have a stake in whether or not your security is good. And if you have an outage and they come back to you and say, I need you to make me whole for the losses that I've incurred, then we've got potentially a class action lawsuit, et cetera, et cetera, when those kinds of losses are involved.

Dave Bittner: Does this represent an opportunity for security leadership within an organization to know what needs to be done?

Karen Worstell: Yeah, and again, it's not prescriptive. The SEC hasn't come back and prescribed anything, and this is why it's so much like Sarbanes-Oxley. So I rely on the experience that I had in conducting sort of a security overhaul for a company in the early years of Sarbanes-Oxley. This is one of those situations where companies have to look at, what is this asking me to be able to do? And then say, what would it take for me to be able to do that? What this is really asking a company to be able to do is to say that their cyber risk management processes are good enough that they would be aware if there was any kind of an incident or a series of incidents that could result in a material impact. So then I have to ask the question, what would it take for a company to be able to say, yes, I absolutely am able to do that? That means you've got visibility of your compute environment and your third-party relationships to such a degree that you would know if something was happening. And that is a tall order. A lot of companies don't have, for example, a data inventory. They don't have a very clear understanding of their entire compute environment. They don't necessarily have control over the cloud environment, where things are being spun up and spun down in a really rapid fashion. This is going to take some introspection on the part of the company to say, if that is the intention here and that is what I'm trying to do, can I do that today? The answer in many cases is going to be no. And then it's a question of, how far away am I from that? What do I have to do to get there? And I think that has the opportunity to overhaul a lot of the way we've approached cyber risk management in the past, because we've gotten super focused on cyber risk management as an incident detection and response capability and what it takes to keep the bad guys out.
What this is asking us is, how do you manage your overall technology digital footprint in such a way that you would know what was happening in that environment? And are you doing the basic blocking and tackling that keeps that risk to an acceptable level? And it's, I think, going to overhaul the way we look at security practices, and it's going to become much more aligned to things like the ISO 27000 information security management system processes and quality management for the digital environment, as opposed to a very narrowly focused cyber security thing that has in recent years been very focused on cyber war and that kind of jazz.

Dave Bittner: What's your assessment of how the SEC has dialed this in here? Do you feel as though this is going to move the needle without being too burdensome?

Karen Worstell: There's a ton of discussion in the SEC document about how burdensome this could be. I haven't fully digested that myself. I'll just tell you based on what I saw from Sarbanes-Oxley. We leaned into Sarbanes-Oxley in a big way at the company that I was at at the time. We were like, we're going to do this the right way, because it was important for us. We had a big merger coming up and it was important for us to have no deficiencies in security. So we just leaned right into it and said, what will it take for us to do this the right way? And we did it. We discovered how much work there was to do to bring the whole IT environment into a well-managed, quality-managed set of processes and techniques. But when we did it, we started deploying defect-free code. Our 7911 outage call that we ran six days a week went away. Our developers got their lives back. We could provision accounts and deprovision accounts in a matter of minutes as opposed to a matter of weeks. We knew what was in our environment and we knew what was happening. We had an intrusion detection and response capability that ran to six sigma level tolerances. That was the payout for doing it the right way. All we did was cyber risk management the way the SEC is asking us to do it now. Is it a heavy lift? It can be. Is it worth it? Yes. I think the payoff in terms of the way the company is managed, its visibility, its confidence, and its ability to provide assurance to shareholders as well as stakeholders is probably worth the effort. And I would really encourage companies to take a look at it that way.

Dave Bittner: Ben, what do you think?

Ben Yelin: Yeah, really interesting stuff. I think if you're a CISO, this is a new world for you. You're not used to these types of rules. There may be concepts you're being introduced to here that have never been a primary focus in your world of cybersecurity. So I think it's incumbent upon all of us to make sure that these CISOs have situational awareness, that there's knowledge sharing, that people are helping each other figure out how to make sense of these regulations. So I appreciated the conversation.

Dave Bittner: It's interesting to me how, for years now, CISOs have been asking for more of a seat at the table with the board. A lot of folks have said that chief information security officers were, in many organizations, a part of the C-suite in name only.

Ben Yelin: Right.

Dave Bittner: But I think as we see these types of rules come into play, I don't know, maybe it's a double-edged sword for the CISOs. Be careful what you ask for because you're going to get that seat at the table, but you have to --

Ben Yelin: You're going to be held responsible for what happens.

Dave Bittner: Right, exactly. You have to navigate some of these new rules. So interesting.

Ben Yelin: Yeah.

Dave Bittner: All right. Well, our thanks to Karen Worstell for joining us and helping us understand it. Again, she is from VMware, and we do appreciate her taking the time. That is our show. We want to thank all of you for listening. We want to remind you that N2K Strategic Workforce Intelligence optimizes the value of your biggest investment, your people. We make you smarter about your team while making your team smarter. Learn more at N2K.com. Our senior producer is Jennifer Eiben. This show is edited by Trey Hester. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Ben Yelin: And I'm Ben Yelin.

Dave Bittner: Thanks for listening.