Caveat 9.15.22
Ep 141 | 9.15.22

Prioritizing data, security, and observability.

Transcript

Tim Eades: Data is critical to obviously being accessed by the user and by the application. It's all over the place, and it's hard to secure.

Dave Bittner: Hello everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law, and policy podcast. I'm Dave Bittner and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a court decision on standing in data breach cases. I've got the story of the city of Baltimore spending nearly a million bucks to upgrade their stingray. And later in the show, Tim Eades from vArmour - we're talking about why all security professionals need to prioritize data security and observability. While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, why don't you start things off for us this week? We've got a lot to cover. 

Ben Yelin: Yeah, so I have a very important case that came down the pike. I read about it from the IAPP website, which is the International Association of Privacy Professionals - kind of a think tank on digital privacy issues based in D.C. And they talked about a very interesting case coming out of the 3rd Circuit Court of Appeals dealing with the issue of standing in data breach cases. 

Dave Bittner: OK. 

Ben Yelin: So I'm going to try and do this without getting into the deep legal weeds, just the shallow legal weeds here... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Because I know some of this stuff is not going to be super exciting. But because the Constitution of the United States says that courts can only hear cases and controversies, the Supreme Court has interpreted that to mean that you actually have to have some stake in the case. They won't hear cases that are theoretical or based on some threatened future harm. You actually have to have a concrete, particularized interest in the case. So generally, the test for standing is that a person has to have suffered some sort of actual injury. In legalese, that's called injury in fact. 

Dave Bittner: OK. 

Ben Yelin: There has to be a connection between the alleged action and the injury. So whomever you're suing, they actually have to have caused your injury. And there's this factor called redressability, meaning the court actually has to be able to redress the harm that came from the defendant's action. So this becomes really interesting in data breach cases because frequently, when there's been a data breach, the harm hasn't been particularized yet as soon as the breach happens. But the victim still wants to sue whomever was negligent with their data. 

Dave Bittner: Oh. 

Ben Yelin: So let's say you're a federal government employee. You hear about the OPM hack. You don't know if your particular data has been stolen, has been - somebody's been trying to steal your identity. You don't know if you've suffered any sort of monetary injury, but you're still really PO'ed at OPM for... 

Dave Bittner: Right. 

Ben Yelin: ...losing your personal information. 

Dave Bittner: You want to make a case that you have been harmed. 

Ben Yelin: Exactly. Exactly. So the Supreme Court, in 2021, made standing in these types of data breach cases much more difficult to establish. It was a case called TransUnion LLC v. Ramirez, and they said that the threat of future harm does not provide standing for a damages claim in a data breach case. So that seemed to be a big blow to plaintiffs who want to get some type of legal relief after their data has been stolen. So in comes the 3rd Circuit Court of Appeals, which is based in New Jersey, Pennsylvania - also the Virgin Islands, in case you ever want to travel there. 

Dave Bittner: That's interesting (laughter). 

Ben Yelin: Yeah. 

Dave Bittner: There's a side story that we won't take the time to explain there... 

Ben Yelin: Exactly. 

Dave Bittner: ...But I'm sure there's history (laughter). 

Ben Yelin: I don't know who decided to draw the maps in that way... 

Dave Bittner: Right. 

Ben Yelin: ...But it is kind of interesting. But the 3rd Circuit had to consider a data breach case in light of that new Supreme Court test. And they developed a three-part test of their own that I think does pave the way for plaintiffs to bring these claims, even if they haven't already suffered a particularized injury. So first, you have to establish that the risk of future identity theft or fraud is sufficiently imminent. So it's something that can't be unduly speculative. It has to be a type of harm that is almost certain to happen. But the key is it doesn't have to have happened yet. So if somebody's personal information has been stolen and it's likely to end up on the dark web, and you can prove that it's likely to end up on the dark web and be used by cybercriminals, then you can satisfy that part of the test. 

Ben Yelin: Second, the harm has to be concrete. And when they look at concreteness, they're looking at how that has been traditionally defined in American courts. It has to be some type of harm sufficiently analogous to harms long recognized at common law - so things like the disclosure of private information, which has been categorized at common law with our English ancestors as a concrete injury, which is promising for the plaintiffs here. And then finally, the - a court will look at whether the plaintiff had alleged separate harms, in addition to the substantial risk, that would qualify as concrete. 

Ben Yelin: So for example, in this case, the plaintiff had already experienced a type of injury, in that she suffered emotional distress and was able to show that she incurred significant therapy costs from having to deal with the outcome of this data breach. So the court ended with I think what is a broad statement that indicates a potential path forward for plaintiffs here - that given that intangible harms like the publication of personal information can qualify as concrete because plaintiffs cannot be forced to wait until they have sustained that harm before they can sue, the risk of identity theft or fraud constitutes injury in fact. 

Dave Bittner: That's interesting. 

Ben Yelin: Yeah. So the real upshot here is, at least in this circuit, there is a path forward if you've been the victim of a data breach. Just because you can't come to court and prove that you've already suffered some type of financial harm, that doesn't preclude you from bringing a case. Great day for lawyers out there who are now going to be able to bring a whole different class of cases in this circuit. 

Dave Bittner: Is this kind of like a - I don't know - I think of like a defamation sort of thing, where you can say there's, you know, potential loss of income because you said something bad about me or, you know, that sort of thing. 

Ben Yelin: Right. It's a threat of reputational harm. 

Dave Bittner: Right. Right. 

Ben Yelin: In defamation cases, you actually have to show in court that it's likely or likely enough that that harm would actually occur, which is why they consider things like, who is the speaker? How significant is the speaker's reach? How significant is the ability that the person accused of defamation would be able to actually tarnish somebody's reputation? I think the same things are going into consideration in this case. It's how likely is it, based on the particular circumstances, that there is going to be a concrete and particularized injury? But what's significant is you don't have to prove that the injury has already happened because that's really hard to prove, especially in the early stages of a data breach. 

Ben Yelin: Just the nature of a data breach means that it kind of goes into the indefinite future because the data is out there. It's no longer protected. So whomever gets a hold of the data - whether it's the individuals who obtained it in the first place, or whether it's sold on the dark web, you don't know when it's going to be used. So I think if this court hadn't developed this test, you could see a scenario where somebody is just sitting around and waiting for there to be some type of monetary harm that they can prove in court. And I don't think courts want to force that to have to happen, which I think is the rationale behind this case here. 

Dave Bittner: What do you make of it? I mean, is this good? 

Ben Yelin: I do think it's good. I mean, from the perspective of - if you think that people who have suffered data breaches should be entitled to some judicial relief, especially if the fiduciary - the person holding the data - was negligent in some way. After that TransUnion Supreme Court case, it really seemed like there wasn't a path forward. Now, again, this is only one circuit, and circuits have kind of been all over the place on this issue. It's a pretty big circuit. It's not New York, and it's not D.C., but when you have some big mid-Atlantic states where a lot of media members live - like New Jersey, Pennsylvania, Delaware, where the Third Circuit is located - I think it could get enough attention that it might be persuasive to other circuits, particularly the way they developed this three-part test. I think tests are persuasive because you can actually have a way of measuring whether a plaintiff has satisfied their requirements. 

Ben Yelin: So it's preliminarily very significant and, I think, a positive step. We'll have to see if other circuits adopt it and if the Supreme Court, which is not very friendly to plaintiffs in any kind of civil suit - for decades now, they've been making it harder for people to sue by creating more particularized requirements for things like standing - whether the Supreme Court is just going to say this is not compliant with our own case law. You went rogue here. We're not going to allow this to happen. But this is a really, really important first step and a potential path forward for these types of cases. 

Dave Bittner: Hmm. I can't help wondering what kind of can of worms this opens because, I mean, now this is crossing that line of being able to claim damage on the potential, not the actual. 

Ben Yelin: Right. So... 

Dave Bittner: Isn't that a fundamental element of where we've been before this? 

Ben Yelin: Yeah. I think the way the court is defining the test is that you don't have to have suffered harm to be able to allege some type of particularized injury if the circumstances are such that that injury is overwhelmingly likely. So when you have a case where the hacker is unknown, and you don't know whether your information is going to be released - you don't know what data has been compromised - it's going to be very hard to bring a suit, even under this test. But here, a specific group in this case had taken credit for the hack, it was undisputed that sensitive data was taken, and the plaintiff alleged, based on research that she commissioned from a cyber intelligence firm, that her data was going to be published or already had been published on the dark web. 

Ben Yelin: And another factor - and they looked at previous court cases from other circuits - is that these particular hackers had already shared compromised information on the dark web in the past. So there was some, at least, basis to believe that they would do so again. So you can't - this isn't going to be a case where every single person who gets a notification saying, hey, your data has been breached can go into court the next day and win a big lawsuit. 

Dave Bittner: I see. 

Ben Yelin: As much as the plaintiff's lawyers and the trial lawyers would like that to be true, that's not the case. It's going to have to be dependent on the facts at hand - whether it's a hacker that's high-profile enough that people know about it, whether you can actually prove that your information was stolen and whether you can hire some type of forensics team to see whether your data is out there on the web already - those types of things. So it's not going to be a free-for-all where, you know, ooh, one of the credit agencies was breached, and I got a credit report from there six years ago. I'm going to get a million dollars. Like, that's not what's happening here. 

Dave Bittner: I see. All right. Well, that is an interesting development. We will have a link to that story in the show notes. My story this week comes from the Baltimore Banner, which I will note is Baltimore's newest news publication. 

Ben Yelin: It's great. If you live in the Baltimore area - and they are not paying us to say this - it's a great source of news for local criminal justice stuff and politics in Baltimore. 

Dave Bittner: Yeah, it's good to see a new organization taking a run at local news because, obviously, you know, that's been hit hard... 

Ben Yelin: For sure. 

Dave Bittner: ...This past decade or so. So this is a story by Justin Fenton. And I know you're going to roll your eyes, Ben, because this is one of our shiny objects here on the "Caveat" podcast. The Baltimore police are poised to ramp up cell phone tracking with purchase of new $920,000 device. 

Ben Yelin: Stingray - ding, ding, ding, ding, ding. 

Dave Bittner: (Laughter) The Baltimore police, they're getting themselves a new stingray - a shiny new stingray. So this is interesting. Evidently, the old stingray that they had wasn't capable of simulating 5G networks. And, well, that wouldn't do (laughter). 

Ben Yelin: It's like when an actual stingray loses its stinger. They cut it off when it goes to the aquarium. It's lost its impact - its sting. 

Dave Bittner: Yeah. So the police representative, Lieutenant Habib Kim, told the Baltimore Banner that it's been 10 to 15 years since they've upgraded their stingray, and so they have a new stingray coming along here. I thought this would be a good opportunity for us to maybe take stock here of where we stand with these cell-site simulators. Ben, just for folks who might be recent listeners - we haven't talked about this in a while - what exactly do these things do? 

Ben Yelin: Sure. So they simulate a cell phone tower, tricking every phone within a geographic area into transmitting its location data. Your phone thinks it's locating the nearest cell tower, but it's actually transmitting to a simulated cell site - the so-called stingray device. I should note that they're generally called cell-site simulators. It's like calling tissues Kleenex. 

Dave Bittner: Right. 

Ben Yelin: StingRay was one brand of it, but they actually are using a different brand now, I believe. 

Dave Bittner: Yeah. 

Ben Yelin: So there are a lot of constitutional issues here. Law enforcement historically never got a warrant to obtain data from stingray devices, and it was a very effective law enforcement tool 'cause you could peg somebody's location pretty clearly if you had information on their cell phone without getting a warrant. So there was a very prominent case that made it all the way up to the Maryland Court of Appeals, which is the highest court in Maryland - at least it is now. They're thinking of changing their name, Maryland voters... 

Dave Bittner: (Laughter). 

Ben Yelin: ...As you'll see on your sample ballots. But they ruled in 2016 that the Fourth Amendment precluded the use of these cell-site simulators without a warrant. And the state legislature passed a bill in 2020 that limits the use of the technology, including prohibiting law enforcement from using it to obtain the content of communications and spelling out specific criteria for when law enforcement is actually able to make use of it. There are now restrictions. You need a warrant, according to the Court of Appeals, and it can only be used under a set number of enumerated circumstances. 

Dave Bittner: Right. 

Ben Yelin: So it has to be for things that are somewhat more serious. It's still a very intrusive surveillance technique, especially when you combine it with some of the other surveillance methods used by Baltimore police. I just finished rewatching "The Wire"... 

Dave Bittner: (Laughter). 

Ben Yelin: ...And that show was last aired almost 15 years ago now. 

Dave Bittner: Right. Right. 

Ben Yelin: But so much of it was about the capabilities of the Baltimore - I know it's a fictional show, but it was very realistic. 

Dave Bittner: Yeah. 

Ben Yelin: But so much of it was the capabilities of Baltimore police - Baltimore law enforcement - in intercepting communications and getting knowledge on their targets through the use of creative surveillance tools. And this is just one of them. I mean, you can see how it would be very useful for something like a robbery or a home invasion, where maybe you want to see whether an individual suspect who's done a spree of robberies is going to commit the next robbery. You track their - get a warrant to track their cell phone, figure out where they are, which cell phone towers or cell-site simulators they're transmitting data to and take the investigation from there. 

Ben Yelin: So it's not as unrestricted as it once was, and I think that's in large part due to the court decision and the effort of state legislatures. And there are still some civil liberties concerns. They talked about how the city council president in Baltimore was concerned about safeguarding the use of these simulators, and the deputy police commissioner said that the equipment is being kept behind two separate biometric doors. It can't be stolen. It's in a secure garage, and two officers are needed to use the device. It's kind of like turning the nuclear keys on a submarine. 

Dave Bittner: (Laughter). 

Ben Yelin: And there's also now reporting requirements on annual usage that have to go to the Governor's Office on Crime Prevention, Youth and Victims Services. Those aren't public reports, but certainly it's useful information for the executive branch in Maryland. So it's very interesting that they're purchasing new devices to try and upgrade and modernize. It shows that they still find this to be a valuable law enforcement tool, even despite these legal restrictions. 

Dave Bittner: Right. I mean, a million bucks, right? That's not chump change. 

Ben Yelin: It's not. I mean, there are a lot of things a municipality could do with a million bucks. Air conditioning in city schools - that would be nice. 

Dave Bittner: Oh, you... 

Ben Yelin: I twisted the knife there. 

Dave Bittner: You old lefty (laughter). 

Ben Yelin: I know. 

Dave Bittner: You couldn't help yourself, could you, Ben? 

Ben Yelin: Sorry if that's overly political, if you think that Baltimore City schools should not have air conditioning. 

Dave Bittner: (Laughter) No, I'm with you. I just think your rhetorical style there, Ben, perhaps leaves a little to be desired. 

Ben Yelin: I know. Leave all of your complaints in the reviews for our show. 

Dave Bittner: Right. 

Ben Yelin: And we'll take them to heart. 

Dave Bittner: It's caveat@thecyberwire.com. 

Ben Yelin: Yep, exactly. 

Dave Bittner: You know, the Baltimore Police say that they use this not just for bad folks out there - they cite an example of a person who was threatening to harm themselves. And they were trying to locate this person, and the cell company wasn't able to determine where they were. And because the police didn't have a 5G version of this device, they weren't able to track the person in time, and the person had taken their life. That's a story. 

Ben Yelin: Yeah. 

Dave Bittner: (Laughter) Right? 

Ben Yelin: Well, yeah. There are going to be examples like that, for sure. 

Dave Bittner: Right. Right. 

Ben Yelin: And I think that does - at least seemingly - justify spending money on a device like this... 

Dave Bittner: Yeah. 

Ben Yelin: ...Especially if it has been valuable in those types of scenarios. You have things like Silver Alerts, Amber Alerts, where you might know something about the identity of the person missing or the person who was taken - a child, for example. And certainly, a cell site simulator would be useful in those circumstances. So it's not like some other surveillance tools where it seems so out of the ordinary and abnormal and disproportionate to the actual threats. You do understand why there's some use for this. And when it comes to Baltimore - I always mention this - I mean, Baltimore does have a very serious crime problem. 

Dave Bittner: Yeah. 

Ben Yelin: And so I understand from the police department's perspective, trying everything that they can to help alleviate that problem. This is just another tool. And 5G technology is now, you know, old enough that you really are going to have to upgrade everything to be interoperable with 5G devices. So it makes sense that they would do that in these circumstances. 

Dave Bittner: The thing that gets my goat about this - the thing I have trouble getting past - is how the FCC is deferential to law enforcement when it comes to these devices, right? You and I can't go out there with some kind of rogue transmitter that's going to cause interference for every cellphone user within, you know, X number of yards and, basically, make them dump their calls, right? 

Ben Yelin: Right. 

Dave Bittner: But, you know, what if I'm calling 911, and they fire one of these things up and it dumps my call, and now - you know, you can see the problem there. And this is where, to me, the FCC should be saying, I'm sorry, what? 

Ben Yelin: Yeah. 

Dave Bittner: Your device does what? But I've asked people at - you know, at the FCC or former FCC folks about this directly. And they've told me that this is a case where the FCC is generally deferential to law enforcement and the DOJ and the military. When it comes to these sorts of things, they feel as though public safety outweighs the FCC's mandate to enforce this kind of thing. I don't feel comfortable with that. And maybe that's just a me thing, but... 

Ben Yelin: I don't think it's just a you thing. I certainly understand the perspective from the FCC. You don't want to be seen as crossing local, state or federal law enforcement. 

Dave Bittner: Right. 

Ben Yelin: They're coming to you saying, we need this to solve crimes. In every one of these cop TV shows, somebody goes to the relevant agency, and it's a bureaucratic mess. 

Dave Bittner: Right (laughter). 

Ben Yelin: And they can't get the information they need. The heroic detectives are saying, just... 

Dave Bittner: You've got blood on your hands, Jenkins (laughter). 

Ben Yelin: Exactly. Dig through those files. And there is something to that. I mean, I - if there were a scenario where - like the one you described - where somebody was threatening self-harm and we weren't able to get information because the FCC exerted its jurisdictional power, I think a lot of people would be upset. I'm not saying it's right. I'm just saying... 

Dave Bittner: Yeah. 

Ben Yelin: ...That's the perspective, I think. 

Dave Bittner: Yeah. All right. Well, we will have a link to that story in the show notes. Again, that's from the Baltimore Banner. Definitely worth checking out. We would love to hear from you. If there's a story you'd like us to cover, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Tim Eades. He is from vArmour, and we were talking about his notion that security professionals need to set some priorities when it comes to data security and observability. Here's my conversation with Tim Eades. 

Tim Eades: So as we look at data security, we have to look at kind of like the evolution of the problem. So, you know, 10, 15 years ago, we would all look at application security. And about five, seven years ago, we started talking about workload security. And now we're talking about data security. But let's break that apart a little bit. So one of our customers at vArmour is a great example. Five, six, seven years ago, they had 35 workloads per app. But increasingly, the workloads have been disaggregated from the application. So now they've gone from 35 workloads per app to 65 workloads per app. Same number of apps - in this case 4,100 - but there's more and more workloads. 

Tim Eades: And the same principle is actually happening to data. Data is now more disaggregated from the app and more distributed than ever before. You can have data residing in AWS, in Snowflake, in a Hadoop cluster. And remember, data is served, in this particular example, in two particular ways. You know, the access - you know, is Dave accessing this piece of data in Snowflake, and then he's accessing the data inside a Hadoop cluster? And the application is also using that data to perform a service, you know, to you, the user, and eventually to other apps, right? So you've got app to data, and user to data. And data is more distributed than ever before. So as that data traverses the infrastructure to get served up to the user or to the app, things get complicated because it goes through middle systems. It goes through middleware, like IBM MQ Series or Kafka - all middle systems. And as it goes through that, sometimes it loses its headers, and that creates this new attack surface as it loses its - some of its address book. 

Tim Eades: So data is more distributed. Data is critical to, obviously, being accessed by the user and by the application. It's all over the place, and it's hard to secure. Because one of the problems you have is classification of the data itself at the object level. Then you have to know, OK, it's going to traverse the infrastructure. So then you have all these different types of things. It's a difficult problem to solve. Some people say, well, then I'm going to encrypt it. OK, that's great. But you still need to search inside it. So then you have technologies like, you know, homomorphic encryption. But, you know, it's a difficult problem to solve. It's a critical problem to solve because if you're going through - digital transformation is completely impossible without understanding data security. 

Dave Bittner: How much of this, if any, is because we're in this world now where data storage is so inexpensive as to be practically free? So in my mind, that leads to a bit of a packrat mentality. 

Tim Eades: Yeah, people are storing more and more data. It's just - it's like the water table just goes up and up and up. A bit like the national debt, you know? It just goes up and up and up. And, you know - and it's being stored in more and more and more locations. And, you know, part of that is driven by regulators, who are saying, hey, you need to store data for longer periods of time. Another challenge, obviously, is that it's a natural thing, like you said - because it's so cheap, you will store it. And then the disparity of it. Like I said, it's in so many different locations - in a large organization, especially - knowing what it is, where it is, who should access it, who should not have access to it. And then, as the application pulls it, will it see it? Can it be authenticated? 

Tim Eades: Think of your attack surface. Think of ransomware as it relates to data security. If you have data in all these different locations, you need to understand what it is - you know, what's the classification of it? You know, who should access it? It's based on classification. There are all these companies, like Dimitri's company out there, BigID, doing cataloging. You know, it's a real challenge. And the attack surface is there. It's real. It can get compromised. The regulators have caught on. So if you talk to some of the leading guys at some of the banks, they now have what they call MRAs - matters that require attention - and MRIAs - matters that require immediate attention. There are more and more regulations coming out around data security, data provenance, data dependency mapping. And so those types of regulations are coming, and you're going to see a lot of acceleration in the deployment of data security technologies. 

Dave Bittner: Help me understand how this works at scale. So in other words, if I double my amount of data, does my effort double? Does it quadruple? Is it halved? What's that relationship? 

Tim Eades: Well, you know, in the old world, you would say if you have double - if the - the more data you store, does your workload double with it? 

Dave Bittner: Yeah. 

Tim Eades: And you would probably say yes. But in the world now of automation and some of the tools that are available, you know, there's a great company called Okera that does data classification, and it's all SaaS-based technology. It's really, really smart. There's a way to do this through automation and simplicity. And I think that's also, you know, the way forward because, you know, you've got to abstract the complexity of the problem from the user so that the time to value and the automation of value is something that can be realized very quickly. And so this is actually one of the benefits of the pandemic. I think a lot of technology companies - just like every company - were sat at home behind Zoom calls, and knowing that they weren't going to do as many in-person customer meetings, lots and lots of people really focused on simplicity and intuitiveness and everything else. And certainly that's been a constant drive at vArmour. But I see, across much of my portfolio, that the drive for simplicity, intuitiveness, SaaS-based delivery and automated value really can solve this problem. 

Dave Bittner: Do organizations who are starting up now have an advantage - that they're - you know, they're not bringing along legacy baggage? 

Tim Eades: Organizations as security companies or new companies? 

Dave Bittner: Well, I'm just thinking of a company who, in their day-to-day operations, have to be stewards of data and do so securely. As you say, you know, through the pandemic, we've - there's been a shift. And there's a lot of modern tools that are designed from the outset to handle this. 

Tim Eades: Yeah. And so the way I look at it, if you're born now as a new company, and you're - I don't know - a digital bank - it's Dave and Tim's Digital Bank - and you're running everything in a cloud, your ability to leverage cloud-native technologies and more intuitive technologies and more automated technology is a lot, lot easier. If it was Dave and Tim's Bank, which has been around for 40 or 50 years, and we have, you know, data centers from 30 years ago, this problem is complicated, and it's difficult, and it takes a long time to solve. And you've got to work out how much time you spend just trying to solve that problem instead of investing in the future of, you know, becoming a digital bank. So, yeah, I think you're right. If it's a new enterprise, you can solve it easier and quicker if you're using clouds and cloud-native technologies. If you're one of those organizations - like the majority in the world - which are hybrid and have legacy technologies too, there's a road there. You have to start down the road. The regulators will tell you to start down that road. But doing it right in public cloud first is the right place to start. 

Dave Bittner: Where do you suppose we're headed with this? If you look towards the horizon, what do you see? 

Tim Eades: I look towards the horizon around data security as - it's both scary - on one side, from an attack surface perspective, the lack of knowledge and the lack of classification, the lack of security around it is a little scary to me. On the other side, I do look at the new technologies that are coming along to address it, which are scaling and are doing it in a new and automated way, like I said. On the flip side, I guess the real driver behind it is also that the regulators are really pushing for it and pushing for better knowledge and observability around who is accessing their data and who's not and who should. I look forward to the future with nervousness on one side. And the - if you're, like, the angel on one side - you know? - you're like, this is going to be great because, you know, we've got great tools and we've got great regulators pushing everybody to deploy it. On the devil side, the attack surface is enormous and very exposed. 

Dave Bittner: And is that where we stand today? I mean, when you look at the organizations that you work with, to what degree are they on top of this? What's the state of things? 

Tim Eades: Well, it starts with recognizing the problem. And so I think over the last, you know, three or four years, it's become more and more of a recognized problem. It wasn't three or four years ago that the data was that distributed, that the data was that exposed because it wasn't classified correctly, wasn't cataloged correctly. That wasn't there four years ago. No way. So now I think the recognition - you know, the understanding of the problem is there. It's more broadly known. It's - I wouldn't say it's understood. But you see, you know, chief data officers, chief data security officers - so it's emerging. It's recognized by the regulators, like I keep saying. But I think it's something that over the next five, seven years, this is going to be a topic that's not going to go away. It's not going to go away. It's going to get more to the top of everybody's mind. And, like I said, 15 years ago, you talked about app security, and everybody would do readouts to their board of directors, with a CISO program, about, hey, let me tell you about my app security and where I'm vulnerable in my apps. Now, we'll talk about data security for the next decade. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: It was a really interesting conversation. I hadn't thought as much about the observability issue, where you don't know where your data goes, especially if you're using some type of cloud service. It goes into the ether, and you still have some level of responsibility over that data. And I think it's going to change the culture within organizations when they realize that - especially for those using things like legacy systems - that this data is getting out there and they don't know where it is. And they don't have eyes in the sky on it. 

Dave Bittner: Yeah. 

Ben Yelin: So I thought that was a really interesting interview. 

Dave Bittner: Yeah. All right. Well, our thanks to Tim Eades for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.