Caveat 6.2.21
Ep 80 | 6.2.21

The normalization of data leaks.

Transcript

Chris Strand: Having businesses required to adhere to some form of cybersecurity measure, from a data perspective, would be a great thing to do.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On this week's show, Ben looks at Florida's new social media censorship bill. I look at the history of predictive policing in Chicago. And later in the show, my conversation with Chris Strand from IntSights. We're discussing whether the normalization of data leaks could drive further need for federal data privacy regulation and combat complacency. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump into some stories this week. We got some good ones. Why don't you start things off for us? 

Ben Yelin: So mine comes from the great state of Florida. This is a story quite literally involving a Florida man, so I'm sure the meme appreciators on social media will be happy with us. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: And the particular article I'm looking at comes from the technology section of The Washington Post. And this is about a bill signed into law this past week by the governor of Florida, Ron DeSantis. And the bill bars social media companies from doing a couple of things. They can't block political candidates who are active candidates for public office; otherwise, they can face massive administrative penalties. 

Ben Yelin: I'll just mention that there is a very famous political resident of the state of Florida who, in a very high-profile way, was banned from various social media platforms. 

Dave Bittner: (Laughter) Who could we be thinking of (laughter)? 

Ben Yelin: Yeah, so I don't think that provision of the law was quite a coincidence. 

Dave Bittner: Yeah. 

Ben Yelin: We'll come back to that. 

Dave Bittner: OK. 

Ben Yelin: In a broader sense, the law gives a cause of action to any Florida resident to sue a Big Tech company if they think their content moderation practices are unfair. And they can sue them for, you know, pretty large gobs of money. I think it's something like a $500,000 potential cause of action. If you're Twitter or Facebook, that's probably not going to break the bank too much. But, you know, it is no small amount of money, especially if you're being sued multiple times by every single internet troll in the state of Florida. 

Dave Bittner: Right. 

Ben Yelin: Governor DeSantis, along with a lot of his colleagues in the Republican Party, have criticized Big Tech platforms for their content moderation practices, alleging that they are biased against conservatives. And so this bill kind of addresses that problem by just taking a big baseball bat and swinging it around wildly... 

Dave Bittner: (Laughter) Right, right. 

Ben Yelin: ...If I was to use the metaphor. 

Dave Bittner: So let me get this straight. So what this law does is - let's say Facebook decides that someone is violating their policies, and this someone is - happens to be someone who's running for public office. If they ban that person from Facebook, Florida can then fine them, according to this article, a quarter-million dollars per day. 

Ben Yelin: Yes. Yes, they can. So it's supposed to be a strong disincentive for these platforms to ban active political candidates. I'll note, by the way, that President Trump himself - for those of you who didn't quite get our reference earlier - is that famous resident of Florida. 

Dave Bittner: Yeah. 

Ben Yelin: He was banned after he was a candidate for public office. So ironically, this law wouldn't have even applied to him. The thinking is if he runs again in 2024, or if there's another candidate, they would not be allowed to be deplatformed no matter what they said on a social media platform, lest the social media platform have to pay a significant fine. So it's basically a ban on taking down an active political candidate's account during an election season. 

Dave Bittner: Yeah. You know, I take issue with this notion - and we hear this from a lot of folks over on the Republican side - that the social media companies are censoring them. If you look at any - just any random day, if you look at the top articles being shared on Facebook, I mean, it is almost guaranteed that - I don't know - 8 out of 10 of them... 

Ben Yelin: Ben Shapiro, Ben Shapiro, Ben Shapiro. 

Dave Bittner: (Laughter) Right. Eight out of 10 of them are right-leaning articles. So this notion that they're not able to get the word out and share their information and points of view just doesn't stand up to the facts. 

Ben Yelin: No. I mean, I think what they're upset about is some high-profile incidents... 

Dave Bittner: Yeah. 

Ben Yelin: ...Most notably the president getting deplatformed after January 6. 

Dave Bittner: Sure. 

Ben Yelin: Now, what the platforms would say is that they were reacting to the president fomenting violence that led to an insurrection against our government and five people being killed and a hundred police officers being injured. 

Dave Bittner: Right. 

Ben Yelin: But what a lot of Republicans saw was, you know, in their minds, a further effort to silence the voice of prominent conservatives. And this is very unfair in their view. Now, the problem from their perspective, from advocates of this law's perspective, is that this is almost certainly blatantly unconstitutional. 

Dave Bittner: (Laughter) OK. 

Ben Yelin: I would be surprised if this law wasn't - well, I'll put it this way. If there's not an injunction stopping the enforcement of the law within the next month or so, let's just say I'll be extremely surprised. 

Dave Bittner: OK, but unpack it for us. What about this law makes it blatantly unconstitutional? 

Ben Yelin: So there are really two things. The first is the First Amendment issue. You are removing the First Amendment rights of these corporations. And, yes, per the Supreme Court, corporations are people, my friend, and they have constitutional rights, and they have the ability to moderate their platforms as they see fit. That is their mode of operating their business, according to a lot of court precedent. And, you know, as some of the trade organizations with an interest in this piece of legislation have argued, the companies absolutely retain First Amendment rights to regulate the content on their platforms as they see fit. 

Dave Bittner: OK. 

Ben Yelin: So that's the first important issue. The second one is the issue of preemption. So states cannot enact laws that conflict with federal laws because the Constitution says that the federal Constitution and federal laws are the supreme law of the land. 

Ben Yelin: So, you know, the federal government has enumerated powers. One of those powers is to regulate interstate commerce. And in the process of regulating internet commerce, the United States Congress has enacted things like the Communications Decency Act, which sets policies for these very types of disputes. Section 230 says that a platform cannot be sued for its content moderation decisions. They are immune from lawsuit. Because the federal government has occupied that field of the law, Section 230 of the Communications Decency Act would preempt the state law and render it inoperative. 

Ben Yelin: So those are really two issues in which this law is really destined to fail in the courts. And again, I'd be very surprised if we didn't see it struck down in the very near future. 

Dave Bittner: Well, I mean, I think it's no secret that Florida Governor DeSantis has his sights set on a potential presidential run. I mean, could this be mostly just a performative kind of thing to get the word out that this is - you know, this is something that's important to me from a policy point of view, and this is the kind of thing I will pursue if you choose me to be your candidate? 

Ben Yelin: Yeah, I think this is entirely performative. He's trying to distinguish himself in a field of politicians, you know, including people like Ted Cruz and Josh Hawley, as someone who's, A, very loyal to former President Trump but, B, antagonistic to Big Tech platforms, which is, you know, their big boogeyman at the moment. 

Dave Bittner: Right, right. 

Ben Yelin: So I suspect - and I have no reason to believe that Governor DeSantis isn't a smart guy. I suspect he probably knows that this law is unconstitutional. It's going to get struck down in federal court. But this is a performative action. It shows his potential supporters, both in Florida and around the country, that he's willing to take these types of drastic actions. 

Dave Bittner: Right. 

Ben Yelin: He's willing to stand up to their enemies, which are these Big Tech platforms who are, in his view, silencing conservative voices. 

Dave Bittner: Yeah. 

Ben Yelin: And, you know, I will say I think he's - you know, to put on my political analyst hat here, I think he's sort of hedging his bets that former President Trump won't run again in 2024 and that perhaps he'd be the natural heir to President Trump. And there is some evidence that he's gaining steam in Republican circles. And so this is a way for him to kind of stand out and say, I've actually taken on the Big Tech platforms. I signed a piece of legislation into law. It has real meat behind it. There are going to be administrative fines. Individual citizens in Florida can sue the tech companies as part of this law. And that's a way of him distinguishing himself. 

Ben Yelin: Now, I will say it bears mentioning - I don't know if you noticed in this article, there is an exception in this law. 

Dave Bittner: (Laughter) Yes, the exception jumped - leapt off the page at me. Share with us, Ben, what the exception would be, a very - I don't know - Florida-specific exception. 

Ben Yelin: Let's just say there's a certain industry in Florida whose bad side the state government does not want to get on. 

Dave Bittner: Right. 

Ben Yelin: And that is the theme park industry. 

Dave Bittner: (Laughter) Yes. There's a certain mouse who has a lot of political influence in the state of Florida (laughter). 

Ben Yelin: He sure does. I'm not a theme park guy myself, but, you know, there are lots of people who go to Disney World multiple times a year, which certainly brings in a lot of money for the state of Florida. 

Dave Bittner: Yeah. 

Ben Yelin: The same for Universal Studios, which is owned by Comcast. Both Comcast and Disney have internet platforms, and therefore they would have been subject to some of the restrictions of this law and liable to be sued for their content moderation decisions. However, the Florida state Legislature thought that this might be dangerous to important political players in their state, so they carved out an exception in the legislation saying this bill does not apply to companies that own theme parks in the state of Florida. As one trade association member said, if this were actually some sort of moral problem, if this went beyond just political posturing, we would not be exempting companies that operate a theme park. And it's not like, you know, Disney and Comcast are mom-and-pop stores here. 

Dave Bittner: Right, right. 

Ben Yelin: They're just as powerful as Google and Facebook, probably more powerful within the state of Florida. 

Dave Bittner: Yeah. 

Ben Yelin: So it was a pretty transparent move, I would say, to come up with this carveout for those companies. It kind of stinks to high heavens. 

Dave Bittner: (Laughter) OK. Well, you know what? I mean, got to hand it to Governor DeSantis. I mean, people are talking about this. We're talking about this, right? So he's getting the word out that these are the things that are important to him. And if you, you know, if you take him to the national level, to the federal level, you know, maybe he can bring these things to the nation. And so it's - I mean, it's an effective messaging technique, right? 

Ben Yelin: Absolutely, it is. I think it's very effective for him politically. He has put his name on the map. You know, I think it's important for us and legal scholars out there to recognize that this is performative. But if I were to just sit back and say, is this going to help DeSantis in a Republican primary, you bet it will. 

Dave Bittner: Yeah. 

Ben Yelin: This is something that the voters care about. So, you know, I can certainly understand it from that perspective. 

Dave Bittner: Yeah. All right, well, we will have a link to the article in the show notes. Again, that's from The Washington Post. 

Dave Bittner: My article this week - my story is quite different from yours. 

Ben Yelin: Yeah. 

Dave Bittner: And, boy, I got to tell you, Ben, this is quite an article here. This is over from The Verge. It's called "Heat Listed," and it's written by Matthew Stroud. And it follows the story of a man named Robert McDaniel, a middle-aged gentleman who lives in Chicago. 

Ben Yelin: Careful, Dave. He's five years younger than me, so... 

Dave Bittner: (Laughter) I know. I'm sorry, Ben. 

Ben Yelin: ...If you want to call him middle-aged... 

Dave Bittner: I know. Every time I tread there, I understand it's a bit of a sensitivity for you, Ben. 

Ben Yelin: Yup. 

Dave Bittner: (Laughter) He's no spring chicken, though (laughter). 

Ben Yelin: There you go, yup. 

Dave Bittner: He's a man who's earned some wisdom of years. How about we say that? 

Ben Yelin: There you go. 

Dave Bittner: There you go. So this gentleman was minding his own business at home one day, and he got a knock on the door. Happens to live with several of his siblings at his grandmother's house in an area of Austin that I would say is - how would you describe this neighborhood, Ben? 

Ben Yelin: Area of Chicago, which is called Austin. 

Dave Bittner: I'm sorry. Thank you for the correction. Yep, yep. I misspoke. 

Ben Yelin: Not to be that guy. 

Dave Bittner: Yep. 

Ben Yelin: But yeah, I mean, it is an extremely poor area of Chicago that's been blighted by a half-century of bad policy decisions, redlining, the building of large expressways to more wealthy suburbs. 

Dave Bittner: Right. 

Ben Yelin: So it's a place that has really suffered economically over the past 50 or so years. 

Dave Bittner: Yeah. About 10% of Chicago's murders took place in this area. So... 

Ben Yelin: And it certainly does not represent 10% of the population. So it's... 

Dave Bittner: Right. 

Ben Yelin: That's a high proportion. 

Dave Bittner: Right. So Mr. McDaniel is - he's running into police, and they're hassling him and giving him a hard time. And again, he's not doing anything. He's committed no crimes. 

Ben Yelin: He never did anything, yeah. 

Dave Bittner: Right. 

Ben Yelin: He's not been suspected of a crime. He has not - he's not been the victim of a crime, so he hasn't complained to police that he's in any sort of personal danger. 

Dave Bittner: Right. 

Ben Yelin: He literally has not done anything. 

Dave Bittner: But his neighbors notice that the police are following him around, and that bolsters the theory in the neighborhood that he's working with the police - that he's a snitch. Well, eventually it gets to the point where his neighbors, who think he's a snitch, shoot him. 

Ben Yelin: Twice, I believe. 

Dave Bittner: Twice, yeah. 

Ben Yelin: Yep. 

Dave Bittner: He gets shot twice. Again, he's done nothing wrong. (Laughter) Right? He's - it's just so bizarre, this story, Ben. 

Ben Yelin: It's really ruined his life. I mean, it was a single incident in 2013 - and this article just came out, so we're talking about eight years ago - where he was identified as part of this algorithmic system put in place by the Chicago Police Department. Now, there are a lot of predictive policing tools that make predictions based on location. And that has all sorts of its own inherent problems. There are certainly racial biases involved in that. 

Dave Bittner: Right. 

Ben Yelin: But at least it's not based on any individual. So... 

Dave Bittner: Yeah. 

Ben Yelin: ...You know, crimes are more likely to take place on this corner; therefore, we should send, you know, more patrol officers to this neighborhood. 

Dave Bittner: Right. 

Ben Yelin: That's one thing. 

Dave Bittner: Right. 

Ben Yelin: This is not that. This is - we've gotten it down to such a granular, individual level that this person, who has not had any major police interactions, based on his characteristics - and I realize that's an extremely loaded term here... 

Dave Bittner: Yeah. 

Ben Yelin: ...Is likely to be involved in a fatal shooting. The irony of all of this is he hadn't been involved in any violent interactions until this happened to him... 

Dave Bittner: Right. 

Ben Yelin: ...After which, for the next eight years, his life was largely ruined. His neighbors were suspicious of him. He was being followed by law enforcement. Being on such a list where you're a suspect has consequences beyond just local law enforcement. You know, things like Immigration and Customs Enforcement, they mention in this article, could use that data to support a - some sort of deportation order. So this certainly isn't a consequence-free decision the Chicago Police Department made. 

Dave Bittner: Mm hmm. 

Ben Yelin: The other thing that's just so disturbing about this is the algorithm itself is just a big black box. They asked a member of the Chicago Police Department about it and how they make these types of determinations. And at least from what this article said, the police officer kind of shook his head and said, oh, you know - let's just say it's very likely that this guy is going to be in trouble. 

Dave Bittner: Right. 

Ben Yelin: That's unacceptable. 

Dave Bittner: If you're on this list, you're on this list for a reason. 

Ben Yelin: Yeah. 

Dave Bittner: It was... 

Ben Yelin: And that's unacceptable. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: You know? If you are going to use this type of tool, you have to be extremely transparent about how you are arriving at your decisions. Otherwise, the public is going to lose confidence. And, you know, let's just say the Chicago Police Department is kind of already on thin ice among many members of the community there. 

Dave Bittner: Yeah. 

Ben Yelin: And this certainly doesn't help. So I think it's a - it's the ultimate warning about the use of personalized predictive policing tools based on algorithms... 

Dave Bittner: Right. 

Ben Yelin: ...Enough so that some of the other police departments that have used this in the past, like the Los Angeles Police Department, have just discontinued this practice... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which is probably a good idea. 

Dave Bittner: This article, I mean, it really encapsulates it here. It says, it predicted a shooting that wouldn't have happened if it hadn't predicted the shooting. Right? 

Ben Yelin: Such a great way of putting it. I mean, it just completely, you know, squares the circle. Like, law enforcement, by creating this tool, were the ones that created these violent conflicts. 

Dave Bittner: Right. 

Ben Yelin: So it did more harm than it did good. And I think, especially without having the transparency to know what goes into these formulas, they are just simply indefensible, in my view. 

Dave Bittner: Yeah, yeah. Yeah. And they point - the article points out some interesting additional things about this sort of predictive policing - you know, that on the surface, it can sound like this is a really good idea. You know, hey, we have this hot spot in our community, and so we should send more police there to try to prevent crime. Well, that sounds like a reasonable thing to do. But one of the unintended... 

Ben Yelin: Who wouldn't support that? Yeah. 

Dave Bittner: Yeah. One of the unintended consequences can be if you send more police to an area, you're likely to get more arrests there. And now you have more arrests there, so that place is now more of a hot spot, so now we have to send more police there. And you end up with this feedback loop. 
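
To make the dynamic Dave just described concrete, here is a minimal, purely illustrative sketch in Python - not the Chicago Police Department's actual model, whose internals the article describes as a black box, and with entirely hypothetical numbers. It assumes two neighborhoods with identical true offense rates and a small initial gap in recorded arrests; because patrols follow the arrest data and arrests follow the patrols, the recorded gap widens every year.

```python
# Purely illustrative sketch of the patrol/arrest feedback loop -
# NOT any police department's actual algorithm; all numbers hypothetical.

TRUE_RATE = 0.05      # identical underlying offense rate in BOTH areas
arrests = [12, 10]    # area 0 starts with slightly more arrests on record

for year in range(1, 6):
    # "Predictive" step: patrols follow the data, so the area that looks
    # hotter by recorded arrests gets the bulk of the patrols.
    hot = 0 if arrests[0] >= arrests[1] else 1
    patrols = [80, 20] if hot == 0 else [20, 80]

    # Enforcement step: more patrols mean more offenses observed and more
    # arrests recorded - driven by patrol presence, not by any difference
    # in behavior, since TRUE_RATE is the same in both areas.
    for area in (0, 1):
        arrests[area] += round(patrols[area] * TRUE_RATE * 10)

    print(f"year {year}: recorded arrests = {arrests}")
```

Running it shows the gap in recorded arrests growing from 2 to more than 150 over five years, even though nothing about the underlying behavior differs between the two areas - the data "confirms" the allocation that produced it.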

Ben Yelin: Not to mention that those types of arrests can lead to violent interactions over rather menial matters. So they mentioned a lot of high-profile incidents here. You know, we've reached the one-year anniversary of the George Floyd incident in Minneapolis, where an interaction ended in somebody getting killed over the alleged use of a counterfeit $20 bill. When you have this type of policing that goes into these so-called high-crime areas with lots of patrol cars looking for evidence of a crime, you increase the chances for these types of interactions, which, we have seen in the past, can become deadly. I don't think you can really overstate the consequences of this type of decision here. 

Dave Bittner: Yeah. I just can't stop thinking about this. And one of the things I've been thinking about and wondering is, how would I respond to this, you know? My circumstances are very different from Mr. McDaniel's, right? I am a white man living in the suburbs... 

Ben Yelin: Yep. 

Dave Bittner: ...In a middle-class neighborhood that doesn't have the challenges that his does, right? So if the police showed up at my front door and knocked on the door and said, Mr. Bittner, we've run this algorithm, you know. And we think that you're either going to shoot someone or be shot. How would I respond to that (laughter)? Like, what... 

Ben Yelin: There is no good way to respond. I mean, most normal people would shut the door on law enforcement and say, this is ridiculous. There's no way of knowing that. 

Dave Bittner: Right. 

Ben Yelin: But when you get into the circumstances of life, you know, that this individual got in, that's not really an option. You can't just tell the police off. And when you're in a neighborhood like his, as this article mentioned, where neighbors sit on porches, they talk to one another... 

Dave Bittner: Right. Right. 

Ben Yelin: ...You get into this really intense danger that you're going to be identified as somebody who cooperates with law enforcement. You know, you're a snitch, basically. 

Dave Bittner: Well, I also think if I found the police following me around or hanging out outside of my work, I mean, I suppose I would have the resources to lawyer up and get them off of my back - resources which, again, Mr. McDaniel doesn't have at his disposal. 

Ben Yelin: Exactly. It's not easy for someone to call up the ACLU and say, hi, I'm looking for a staff attorney to take on this case. 

Dave Bittner: Right. 

Ben Yelin: Like, most people don't have the resources to do that. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: And that's what's so painful about this: the very people who most need protecting don't have the resources to get that type of legal protection, which is why this is, really, a tragic story. And like I said, we're eight years into this happening. And this person's life has essentially been ruined. The same community caretaker who came to his house in 2013, who's now a prominent member of the community - a former felon himself who was in jail for several years on drug-related crimes - came back to this guy's house and said, I want to give you opportunities. You know, I want to work with you, give you a, you know, $12-an-hour paid job, get you out of trouble. And what this guy said was, I'm not interested. I'm distrustful. And you can certainly understand that perspective. 

Dave Bittner: Yeah. 

Ben Yelin: So it's really just a tragic story all around. 

Dave Bittner: Yeah. I recommend you check the article out. There's a lot of detail here that, obviously, we haven't had time to cover on this show. But it is interesting. I suppose, you know, the good news, if there is any, is that this program, and others like it, has been discontinued. But I think it's really a cautionary tale, because it seems as though, if anything, we're seeing more and more of these types of algorithms get spun up. And I think, as citizens, we need to be vigilant - we have to be mindful of them, you know? (Laughter) Bad things happen. And even if it's not going to affect you, it's not the kind of decision-making you want to have in your community - in my opinion, anyway. 

Ben Yelin: Absolutely. I think that's 100% true. 

Dave Bittner: All right. Well, again, we'll have the link to that in the show notes. We would love to hear from you. If you have a question for us, you can call in. Our number is 410-618-3720. You can also email us. It's caveat@thecyberwire.com. Ben, I recently had the pleasure of speaking with Chris Strand. He's from an organization called IntSights, and our discussion focused on the notion of the normalization of data leaks, how we're kind of getting used to them, and whether that should drive the need for more federal data privacy regulation. Here's my conversation with Chris Strand. 

Chris Strand: In terms of the lay of the land, OK - I guess the State of the Union type of thing here - we've observed, and I've personally observed, living and eating and breathing within this industry for - gosh, I don't even want to say how long - a normalization of data leaks. It's based around data exposure and data leaks. It's a little bit of both. Every time we're reading in the press, you know, another data leak or exploit, et cetera, being reported, there's a certain amount of apathy that goes along with that. And I notice that there's a parallel item happening within the industry that I think is contributing to this from a data perspective, in terms of measuring risk on data and the like. And that's that both data owners and custodians - in layman's terms, data owners being us, the individuals and consumers who have data, and also the businesses that are utilizing that data - have become distracted recently, especially over the last year. They've become distracted. 

Chris Strand: And there's a related piece - they've lost the sense of the value of this data. And it's not on purpose. I'm not pointing the finger at anyone. But in terms of, again, the State of the Union here, in terms of what's going on, there's this assumption that data has already been stolen. I've talked to a number of people on the consumer side and the business side who will say, oh, yeah, it's out there already anyway. But they fail to realize how incredibly important and valuable their information has become over the last few years, especially recently, because the pandemic has escalated the value of personal data - health care data in particular - to the nth degree. And so that sort of trend is troubling because the value of that data is vastly underestimated. And both individuals and organizations also fail to realize the threat landscape to that data - i.e., not just the liability in terms of what that data can come back and bite us with, but the vast data footprint that that data lives within. 

Chris Strand: So that downstream liability of data leakage is not fully understood. You might join into something for convenience, like we all do. We say, well, I have to give over this data because I want to be part of this social media platform. But once we do that, we often fail to realize the downstream effect of it. And, of course, businesses do the same in that respect: the way they're using that data for commerce, and the downstream effect of that data, is also often underestimated in terms of the potential liability down the road - that data, you know, being used in a different way, especially a malicious way, but also how the value of that data is changing and how it can be used downstream in different ways. 

Dave Bittner: Well, how do we go about calibrating, then? I mean, how do we properly determine what the value of data is? 

Chris Strand: Well, there's many ways. So being a - I guess a recovering audit and compliance professional... 

Dave Bittner: (Laughter). 

Chris Strand: ...I would say that one of the best ways is to lean that data up against some sort of framework. And that could be a data privacy law or a mandate or a cybersecurity framework that is chiefly involved in protecting data. But again, that's easier said than done. And that's why I say recovering auditor - because I had to stop being an auditor, plain and simple. I was tired of telling people, oh, yeah, just do this, and everything will be fine, and you'll be in compliance, because people start to roll their eyes when you talk about that. And you start to feel like a visit to the dentist for most of your customers. 

Dave Bittner: (Laughter). 

Chris Strand: So yeah. So, I mean, drilling in on that a little bit more, I think that one of the ways we can remedy this is to start to recognize or move towards some of the industry standards. Take the U.S. in particular. The way to solve this problem would be to have some kind of national data privacy program and framework that people can actually measure the value of their data against - and also how they're supposed to be protecting that data. Again, that's much easier said than done. 

Chris Strand: We can look towards examples of that to avoid the problems we would have with the distributed nature of data privacy - again, with the U.S. as an example, all the different states' privacy laws that are in force or in motion. Almost every state is passing or thinking about passing those laws. If we can kind of work around that and look towards national models, I think that would be one of the best remedies - take something that's been done at a national level that focuses on data, focuses on protecting that data and actually provides the underlying controls, so that there's some skin in the game in terms of actually achieving or measuring that. Because it's easy to say, like I said - I kid around about it - but as an auditor, we'll often say, oh, yeah, just apply this framework. That doesn't always work, because you don't know what to do. Some frameworks are so large and overbearing and intimidating that businesses can't even take the first step to apply them to anything. They're not prescriptive, right? 

Chris Strand: So if we can pick some of the prescriptive ones, even ones that live within the, you know, let's say the regulatory compliance realm, it gives us a start to actually apply the way we're using data against some prescriptive controls. And there's a million of those frameworks. I won't get into them all. But there are some prescriptive frameworks out there, and I'd be disingenuous if I didn't mention the PCI DSS. I'm not an advocate for them - and this is the only time we'll bring an actual regulatory framework into the discussion - but it's a data security program, and it's a great prescriptive way of actually just following a bunch of rules to position yourself so that you can show that you're protecting data. And again, those prescriptive steps are what I'm getting at. And I think that's one step we could take to get a little closer to solving that problem. 

Dave Bittner: You know, as you and I are recording this, the president just put out an executive order on cybersecurity. One of the things that caught my eye in there is this notion of kind of an NTSB for cyber - a push towards requiring notifications when these sorts of things happen so they can be investigated. Do you think things like that and other things within this executive order are going to push us in the right direction? 

Chris Strand: Well, it's definitely a step in the right direction. The idea of introducing reporting requirements - requiring notification, et cetera - from a cybersecurity perspective is a great step forward. Of course, the supply chain is getting an awful lot of attention. I filled up my entire day yesterday just talking about this. But, you know, the recent executive order again builds on the supply chain executive order from February. And I think this is what it does - it brings focus on those rules of what you need to do. Because, again, there aren't a lot of concrete rules across all the different data security and cybersecurity mandates that businesses can point to and say, oh, I definitely have to follow this - unless they fall into a specific sector. Like, they're in the energy sector, and they happen to fall under the NERC CIP classification for measuring cybersecurity controls, or under one of the other authoritative bodies like FERC, or HIPAA HITECH within health care under HHS, and that sort of thing. These executive orders coming from this level, coming from the federal level, are almost like a gradual or a soft sell to bring attention to it, but they also start listing some of the cybersecurity measures that we need to take, whether it be reporting with notifications or measuring the risk to your supply chain, such as in the executive order. I think it's Order 14017. That's - I wrote down the order number... 

Dave Bittner: (Laughter). 

Chris Strand: ...Because everybody always asks me that. They say, what's the actual supply - what is that executive order number? Getting back on topic, the supply chain executive order brings attention to the need to measure risk - to actually perform a risk assessment as part of your security hygiene exercises - in terms of proving that you have the mechanisms or controls in place to show decent cybersecurity posture against attacks on your supply chain. So, again, all of these are - yeah, I mean, they're definitely a step in the right direction. And if anything good is coming out of all the craziness that we've seen against the supply chain in the energy sector over the last couple of days, I think this is it: at the federal level, there's more talk about it. There's more emphasis to actually release something - release executive orders that bring more attention to how we're measuring cybersecurity posture. 

Dave Bittner: You know, something you alluded to is this notion that people are fatigued when it comes to this stuff. And I would say even - particularly among consumers, I think there's a sense of resignation. It's already out there. What am I going to do? And even - when a data breach happens, you know, one of the first things you hear an organization say is, you know, we were hit by sophisticated international threat actors. You know, there was nothing we could do. And so, you know, people roll their eyes and they sigh and they - but they get on with their lives. How do we break that pattern? How do we get people actively engaged in their privacy from both levels, you know, from the businesses and from the consumers so we can really take a run at this? 

Chris Strand: That's a tough one because there's different things that we're going to have to do on either side of the fence there. With businesses and with consumers, it's a different approach. I think with consumers, we can help to educate a little bit more. I mean, as cybersecurity professionals and thought leaders, we can definitely do as much as - we can scream as loud as we want. But, I mean, sometimes individuals will listen, sometimes they won't. But I think that it gets back to that - again, you know, when I talk about the apathy within this world, I think it gets back to educating users on the value of their data. And organizations can do a good job of this in terms of, let's say, making user agreements more descriptive and not doing things sort of under the covers or under a parent company and things like this. And what I mean by that is, you know, you sign up for one thing, and then you sign up for something else to share your data, and you're told, oh, this is covered as part of the first thing that you signed up for. 

Chris Strand: I mean, this type of mentality or technique in terms of getting buy-in from customers or consumers to share their data needs to change. How we're going to change that, that's a whole other story. But I think that would be one way - and it's through education. There could be other campaigns. I mean, at a federal level, maybe we make this a requirement for businesses, just like all the executive orders that are focusing on the supply chain and how we're supposed to protect that. We could do the same with data and individual consumer privacy - it could be done at the state level or from a federal perspective - but, in essence, to educate users more on the value of their data so they understand the repercussions of sharing that data further downstream. 

Chris Strand: And then on the business side particularly, it's similar, but having businesses required to adhere to some form of cybersecurity measure from a data perspective would be a great thing to do. Again, it's going to be a trip because we're going to have to pick that universal data protection standard that would require businesses to fully disclose how they're sharing information and how they're using that information - even encouraging, and again I come back to it, a data impact risk assessment. This is something that a lot of businesses will or maybe won't do, depending on what sector they're in, what they're covered by and what the requirements are in terms of using that data. But at this point, even globally, there are universal data usage requirements that every company would have, especially if it's working on a global scale. So I think it could help to, again, through those frameworks, encourage business use parameters to align with data use, because a company needs to do specific things with data. 

Chris Strand: And they're going to do whatever they can do with that data - not that I'm saying businesses are intentionally ignoring the privacy of the data that they're using. It's more a case of they don't really know that they're doing anything wrong with that data. So, again, if they have some kind of framework to fall back on so they understand at least their business as usual - so they get a sense of how they're supposed to be using that data - well, that can then help them compare, or lean that up against, a data privacy mandate or a data security mandate so that they can see if they're in or out, or how far in or out they are, in terms of the use of that data - if they're using it properly, et cetera, et cetera. 

Dave Bittner: All right, Ben, what do you think? 

Ben Yelin: Really interesting conversation. I mean, I think it's harder for us to become complacent now. I mean, this interview takes place in the context where, on the East Coast, we had a ransomware attack that took down an oil pipeline... 

Dave Bittner: Right. 

Ben Yelin: ...And caused gas shortages. So even if you are not a news consumer, this is something that is going to affect you one way or another, and you're going to have to start paying attention. Another interesting thing he said, just from a policy perspective, is you can apply something like GDPR where there is a top-down structure, but it gives flexibility to its member states. You can apply that in kind of a micro and a macro sense. As a country, we could do that with data privacy laws, which give states like California, which might want to have, you know, more stringent protections, the ability to do so. It's not one-size-fits-all, but it does offer some level of uniformity. And then this is something that can also be applied, you know, within individual businesses. So, yeah, I thought it was a really interesting interview. 

Dave Bittner: Yeah. Yeah. All right. Well, our thanks to Chris Strand for joining us - interesting conversation. And we do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.