Caveat 1.5.23
Ep 155 | 1.5.23

Inside the largest data breaches in world history.

Transcript

Andrew Hollister: So I don't think the fact that this breach was censored, in and of itself, was probably particularly surprising. It would, perhaps, have been more surprising if it wasn't censored.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a groundbreaking Wisconsin court case relating to files stored in Dropbox. I revisit the legality of AI-generated code. And later in the show, Andrew Hollister from LogRhythm discusses the Shanghai National Police data exposure incident and whether we may ever know the full details of one of the largest data breaches in world history. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, let's jump in. We got some good stories for the new year. Why don't you start things off for us, here? 

Ben Yelin: So my story I originally discovered through Professor Orin Kerr, imaginary friend of this podcast... 

Dave Bittner: Right, right. 

Ben Yelin: ...Who linked me to the case. And of course, you came in with it as your story, and we had to switch things around. 

Dave Bittner: Yeah. 

Ben Yelin: But it really is a groundbreaking case from the state of Wisconsin, and it relates to files stored in a Dropbox account in the cloud. So I'll give a - kind of a brief summary of what happened in the case, and then we can talk a little bit about the interesting legal findings here. Basically, there was this guy who worked for Taylor County in Wisconsin. He was a deputy sergeant for their police department. And television producers for a show I've never heard of, called "Cold Justice," were working with the county on producing some type of documentary or true-crime drama based in Taylor County. 

Dave Bittner: OK. 

Ben Yelin: And they agreed - the county government agreed to give the show files related to a single murder. But this deputy sergeant took it upon himself to send files, without county authorization, on a couple of unrelated murders, which violates the IT policy both for the county and for law enforcement, and is potentially a public corruption charge. I mean, you can be charged with abusing your office and revealing files that are essentially private. 

Dave Bittner: OK. 

Ben Yelin: So he had saved files in Dropbox. It was a Dropbox account that was started with his work email, so that's what's sort of interesting about this case. He used his work email address to open up this Dropbox account. Once he opened it, he was the only one who had the password. I think they said he shared it with his girlfriend... 

Dave Bittner: OK. 

Ben Yelin: ...Maybe one or two other people, but he had not shared it with the county. He was the only person who regularly checked in on the account. This was a mixture of his personal files and some of his work files. The only complicating factor, of course, is that it was started with a county email address. So there was a suspicion that this guy, whose name was Bowers - there was a suspicion that he was the one who leaked this material in an unauthorized manner to these television producers. And as part of the investigation, somebody in the county IT department did a password reset on Bowers' Dropbox account. 

Dave Bittner: Ah. 

Ben Yelin: So, you know, you don't have the password for it directly, but you know he started it with his work email. So you say, hey, I forgot my password. They send you the password reset. Sure enough, he was keeping those files in there. They continued the investigation, and it turned out he had emailed the files, and he was charged. So he is seeking to suppress this evidence because he says that he has a reasonable expectation of privacy in information that he has stored in that personal, private Dropbox account. What the county is saying is that this is a violation of the 2007 and 2012 IT policies, which basically state that you should not have a reasonable expectation of privacy in anything saved on a county device or done on a county network. 
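(For readers who want the mechanics of that reset spelled out, here is a minimal Python sketch of a generic email-based password reset flow. It is not Dropbox's actual implementation, and the account name and helper functions are hypothetical; the point is only that whoever controls the mailbox on file can complete a reset and take over the account.)

    # Illustrative sketch of a generic email-based password reset -- not
    # Dropbox's actual flow. Account name, token handling, and "mailbox"
    # are all hypothetical simplifications.
    import secrets

    ACCOUNTS = {"bowers@county.example": "original-password"}  # hypothetical account
    RESET_TOKENS = {}  # token -> account email; real services add expiry, one-time use
    MAILBOX = []       # stand-in for the employer-administered work inbox

    def send_email(address: str, body: str) -> None:
        # Stand-in for mail delivery. If the address is a work mailbox, the
        # employer's IT department can read everything that lands here.
        MAILBOX.append((address, body))

    def request_password_reset(account_email: str) -> None:
        token = secrets.token_urlsafe(32)
        RESET_TOKENS[token] = account_email
        send_email(account_email, f"Reset token: {token}")

    def reset_password(token: str, new_password: str) -> bool:
        # Possession of a valid token is the only proof of identity required.
        email = RESET_TOKENS.pop(token, None)
        if email is None:
            return False
        ACCOUNTS[email] = new_password
        return True

    # Demo: no old password needed -- reading the work inbox is enough.
    request_password_reset("bowers@county.example")
    token = MAILBOX[-1][1].split(": ")[1]
    assert reset_password(token, "investigator-chosen-password")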

Dave Bittner: Yeah, that's pretty standard, I would say, for most workplaces, right? 

Ben Yelin: Absolutely, yeah. 

Dave Bittner: Yeah. 

Ben Yelin: What the court is saying here - and I think this is unquestionably the correct decision, in my view - is that this was not county property. This was not on a county network. This was not on a county device, although we know, with Dropbox, you can certainly access it on any device, but it... 

Dave Bittner: Right. 

Ben Yelin: ...Wasn't specifically on a device. Really, this is the equivalent of somebody's container - a locked container full of papers and effects. And if this were an actual, physical, locked container with somebody's papers and effects, that would clearly require a warrant under the Fourth Amendment - probable cause and a warrant. And that's what the equivalent is here. So we now have precedent from this one state that, in these particular circumstances, where you're saving something in the cloud and you've exhibited that subjective expectation of privacy by having it password protected and not sharing it with anyone, you have that expectation of privacy, and the government is going to need a warrant to obtain it. So they are going to suppress this evidence. 

Ben Yelin: One interesting element about this case that Orin Kerr pointed out is, even though this would seem somewhat obvious, there really hasn't been a lot of case law relating to this. Basically, the assumption has been, among these cloud-computing services, that if they got this type of request, it'd better come with a warrant 'cause, otherwise, they're not just, willy-nilly, going to hand over somebody's private data that's stored in their account. That would be not only bad PR, but it would just subject them, potentially, to breach-of-contract lawsuits, etc. So it's just unlikely you would ever get a case because it would also - if you are seeking to obtain a warrant, that's going to lead to kind of a timeliness issue. Usually, you're trying to get access to information before somebody tries to destroy it and, quote, "misplace it". 

Dave Bittner: (Laughter). 

Ben Yelin: So going through the whole warrant process can be cumbersome. So it's just - it was unlikely that we were going to get a clear-cut case of something like this, and we did. And now, even though this is technically only valid in the state of Wisconsin, I think this is something that we'll largely see adopted nationwide. 

Dave Bittner: So let me ask you this. Let's bring it to the real world, here. Let's say you and I are sitting here in our CyberWire studios, and I bring in a safe that I have purchased with my own money. And inside the safe, there are some papers. Does my employer have the right to go through that safe without - to pick the lock on that safe without my permission or without a warrant? 

Ben Yelin: No, they do not. I think that's something that's echoed in this case. It's not about whether it physically takes place on county property. And certainly, Mr. Bowers probably accessed these Dropbox files on a county computer. It's about that expectation of privacy. And there are ways you can evaluate whether somebody has that expectation of privacy and whether that expectation is reasonable. If you brought a safe in here, and you were the only one who knew the combination, that's pretty darn good evidence that you had a subjective expectation of privacy. Now, the fact that you brought it into the office is somewhat questionable. I would have recommended you just keep it at home. 

Dave Bittner: (Laughter) Right. So my judgment may be off, but... 

Ben Yelin: Right. But from a legal perspective, you have locked that safe. You are the only one who has the combination to open it. Nobody else has access to it. There is no employer policy that says CyberWire has the right to access Dave Bittner's... 

Dave Bittner: I see. OK. 

Ben Yelin: ...Devices on our property, so there's just no evidence that you wouldn't have an expectation of privacy. 

Dave Bittner: But let's contrast that with my business email account, for example, where I'm the only person who has the password for that. In fact, you know, it has multifactor authentication, so I have both the password and the hardware key. Is this a case where company policy has clearly been spelled out that IT has the right to reset that account and access whatever's in it? 

Ben Yelin: Yeah, absolutely. I mean, that's where the expectation of privacy analysis begins - is... 

Dave Bittner: Yeah. 

Ben Yelin: ...There's a reason your employer has that written out in a policy. Even though you're not going to read it, it does tell you... 

Dave Bittner: (Laughter). 

Ben Yelin: Well, you might. Most people wouldn't read it. 

Dave Bittner: No, no, you're right. You're right (laughter). 

Ben Yelin: It tells you, I mean, usually explicitly, you don't have an expectation of privacy in anything you do on this email account... 

Dave Bittner: Right. 

Ben Yelin: ...Because, ultimately, it is within our dominion as an organization. It is not yours. I think most people kind of understand that instinctively, but that's why companies make that clear. There was no such explicit agreement relating to Dropbox. You know, the only question in the case was whether this was kind of an extension of his email account... 

Dave Bittner: Right. 

Ben Yelin: ...Because he was using his work email address. 

Dave Bittner: Right. 

Ben Yelin: And what the court said is that it's not. Just because you use your work email address, that doesn't lessen your expectation of privacy, given that nothing in the, you know, IT policy that you signed governs something like cloud storage on a - with a personal password, personal key. 

Dave Bittner: Right, right. And I could see - I'm just trying to puzzle through this. I mean, suppose I used my work email address to access my online medical records. I wouldn't expect that my employer would be able to claim rights to view my medical records simply because I used my work email address as the - to access that. 

Ben Yelin: I think that's a great example. I mean, that is not going to be - that's a relevant factor, whether you used your work email address. And certainly, if I were setting up a Dropbox to illicitly send files to TV producers, I'd probably go ahead and use my Gmail. 

Dave Bittner: Right. 

Ben Yelin: But that's not ultimately the deciding factor. It really comes down to an analysis of that subjective expectation of privacy and whether that expectation is reasonable. And you kind of have to look at the totality of the circumstances. When we're talking about something like health records, you know, that's protected by HIPAA. You can't contract that out in an IT policy and just say, for the purpose of our devices and our networks, we supersede federal law related to protected health records. 

Dave Bittner: Right. 

Ben Yelin: I mean, that's something that's not going to hold up in court. So you would have a reasonable expectation of privacy in that, even if you used your work email account. I think the only thing that an IT department can do with its policies is claim dominion, ownership, access to an email account and all of its contents or anything that happens on - anything that takes place on a device solely controlled by the organization. So if he had only accessed this Dropbox account using the county's laptops or county mobile devices, perhaps that would have been a different story. He would have had a lessened expectation of privacy. There is no evidence in this case that he did that. So he was really seeking to conceal the contents of those files. 

Dave Bittner: Could the police force be in some kind of legal peril for accessing his Dropbox account without permission? Are they running afoul of the Computer Fraud and Abuse Act by doing this? 

Ben Yelin: Oh, they probably are fine for a couple of reasons. There's - qualified immunity is one of them. 

Dave Bittner: Of course, of course (laughter). 

Ben Yelin: The other is a good-faith exception under the Fourth Amendment. Basically, like, if case law is unclear on something, they're not going to hold law enforcement accountable for doing something when there was no precedent prior to this case in Wisconsin saying they couldn't access the Dropbox contents. 

Dave Bittner: Hmm. OK. 

Ben Yelin: And the state is already kind of being punished here because the evidence obtained from the Dropbox files and anything that came from it can't be used against this deputy sergeant, meaning he's still going to be in the department. I mean, maybe they fired him for other reasons... 

Dave Bittner: Right. Right. 

Ben Yelin: ...But he is not going to be prosecuted for breaching the public trust here. 

Dave Bittner: So what happens now with this? How does - Wisconsin makes this decision. How does it spread across the nation, potentially? 

Ben Yelin: So there's going to be some law review articles about this. And this is the type of thing that - let's say a case comes up with a similar fact pattern in Oklahoma. And a judge is going to say to his or her clerks, hey, is there any case law across the country - a situation where somebody had a Dropbox account with a work email address and - what have other courts said? That's going to be persuasive authority. You know, somebody would have to have a completely different perspective on the issue, as a judge, in order to come down with a different conclusion than this case. But they certainly have the right to do that. This is only controlling within the court system in Wisconsin. 

Dave Bittner: I see. 

Ben Yelin: But the way the case is argued, at least to my eye, seems like it would be pretty persuasive. And like I said, it's just not that common that cases like this are going to come up, for some of the reasons we talked about. 

Dave Bittner: Right. 

Ben Yelin: So this really could be the groundbreaking precedent. I mean, we saw that as it related to the content of email communications in the Warshak case, which was a 6th Circuit case - never made it up to the U.S. Supreme Court, but it just became the prevailing standard for whether people have a reasonable expectation of privacy in the content of their emails, even though emails are stored on third-party servers. 

Dave Bittner: Right. 

Ben Yelin: That's the other element here - we've talked a lot about the third-party doctrine on this podcast - the idea that if you willingly hand information over to a third party, you lose an expectation of privacy in that information. What the court's saying is that the doctrine really doesn't apply here because, for all intents and purposes, you're not actually handing any information to Dropbox. They do keep a backup version on their servers of everything that's stored there, but they also advertise - and this was all quoted in the case - your files are safe, even from us. We don't have access to them. 

Dave Bittner: Right. Right. 

Ben Yelin: So you are actually maintaining that expectation of privacy. I think that's very persuasive to judges here. 

Dave Bittner: All right. Well, interesting for sure. We will have a link to that story in the show notes. My story this week comes from the folks over at IEEE Spectrum. The IEEE is an electrical engineering organization, very well known and well respected, and their article here is titled "Ownership of AI-Generated Code Hotly Disputed: A Copyright Storm May Be Brewing for GitHub Copilot" - article written by Rina Diane Caballar. Ben, you and I have touched on this before, and this saga sort of continues here. There is a class-action lawsuit that has been filed against GitHub, its parent company Microsoft, and OpenAI over Copilot. And the plaintiffs are claiming that, basically, it is pirating open-source software and violating open-source licenses. Now, can we jump in here with just a little descriptor of, when we're talking about open-source software, what exactly we mean? 

Ben Yelin: So there are rules governing open-source software. I mean, there are licensing rules. 

Dave Bittner: Right. 

Ben Yelin: Generally, open source, of course, is acceptable and preferred in some circumstances, as long as it's not made up of information that is otherwise protected under our intellectual property laws. So you can't just, for the purpose of creating open source, cobble together a bunch of copyrighted information, feed that into an algorithm or anything else, and spit it out as open-source software. So I think that's what we talk about when we talk about violating the licensing regime related to open-source software. 

Dave Bittner: So this Copilot functionality within GitHub - they have terms and conditions there where, I guess in a classic EULA way, they're saying it's up to the users to keep an eye on this. Obviously, the people who are bringing this class-action suit don't agree with that. They point to a case - Google v. Oracle. And they say, in that case, taking the names of methods but not the functional implementation was OK - you're replacing the functional content but still keeping some of the template. They have a quote here from Kit Walsh, a senior staff attorney at the Electronic Frontier Foundation, who argues that training Copilot on public repositories is fair use. She says fair use protects analytical uses of copyrighted work. Copilot is ingesting code and creating associations in its own neural net about what tends to follow and appear in what contexts, and that factual analysis of the underlying works is the kind of fair use that cases involving video game consoles, search engines and APIs have supported. 
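(As a loose illustration of the "what tends to follow" associations Walsh describes - a toy stand-in, not Copilot's actual neural network - here is a tiny bigram model over code tokens in Python.)

    # Toy bigram model over code tokens: a crude stand-in for the statistical
    # associations Walsh describes. Copilot uses a large neural network; this
    # only illustrates the idea that training extracts co-occurrence
    # statistics rather than storing source files verbatim.
    from collections import Counter, defaultdict

    corpus = [
        "def add ( a , b ) : return a + b",
        "def sub ( a , b ) : return a - b",
    ]

    follows = defaultdict(Counter)
    for line in corpus:
        tokens = line.split()
        for prev, nxt in zip(tokens, tokens[1:]):
            follows[prev][nxt] += 1

    def suggest(token: str) -> str:
        # Most common token observed after `token` in the training data.
        counts = follows.get(token)
        return counts.most_common(1)[0][0] if counts else "<unknown>"

    print(suggest("return"))  # -> "a": a learned association, not a copied file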

Ben Yelin: That's why I kind of struggle with this. I mean, I think Kit Walsh, the senior staff attorney at EFF, makes a reasonable argument that this is fair use. I mean, fair use generally boils down to anything that's not going to lead to somebody making a profit off somebody else's work - so fair use would be, say, reproducing academic materials for learning purposes. It really boils down to factors that are almost a little metaphysical here and are going to be really hard to judge. 

Dave Bittner: Right. 

Ben Yelin: So what Walsh says is it boils down to, quote, "how much Copilot is reproducing from any given element of the training data" and whether it encompasses creative expression that is copyrightable. That's just so hard to trace. I mean, Copilot says it, you know, uses advanced practices in traceability. But how can you find that, like, discrete line of code that violates copyright in thousands, millions of lines of code that go into this open-source software? 

Dave Bittner: Right. 

Ben Yelin: It just seems like that would be really difficult to uncover. But it's also not like the textbook definition of fair use because somebody is going to profit off what's created through Copilot. I mean, that's what just - that's the reason this is a really difficult issue. I'm kind of curious to see what happens in court here 'cause I don't think there is a clear side one way or the other. 
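(One blunt way to approach the traceability problem Ben raises - a sketch under simplifying assumptions, not how Copilot's traceability features actually work - is to index the training corpus by hashed, whitespace-normalized snippets and check generated output against that index. The repo name and license below are hypothetical.)

    # Sketch: flag generated lines that reproduce training data verbatim.
    # A simplifying illustration only -- real provenance tooling must handle
    # near-duplicates, trivially common idioms, and license metadata, none of
    # which this toy index addresses.
    import hashlib

    def fingerprint(line: str) -> str:
        # Normalize whitespace, then hash, so cosmetic edits don't hide a match.
        return hashlib.sha256(" ".join(line.split()).encode()).hexdigest()

    TRAINING_INDEX = {
        fingerprint("for i in range(10): print(i)"): "github.com/example/repo (GPL-3.0)",
    }

    def check_output(generated: list[str]) -> list[tuple[str, str]]:
        hits = []
        for line in generated:
            source = TRAINING_INDEX.get(fingerprint(line))
            if source:
                hits.append((line, source))
        return hits

    print(check_output(["x = 1", "for i in range(10): print(i)"]))
    # -> [("for i in range(10): print(i)", "github.com/example/repo (GPL-3.0)")]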

Dave Bittner: The thing that I struggle with and where I think folks are tiptoeing around one of the core issues is - and as you say, it's sort of metaphysical - which is, is a computer capable of being creative? And I think a lot of people don't want to acknowledge that perhaps, under the hood, these neural networks are being creative. And what I mean is, if I'm an artist and I go to my local museum - I go down to Washington, D.C., and spend an afternoon at the National Gallery, and I decide that I'm going to look at all of the Picassos. 

Ben Yelin: Right. 

Dave Bittner: And then I go home, and I create a piece of art that is obviously heavily influenced in the style of Picasso. Is that plagiarism? I'm being inspired by a great artist, and I'm using that person's art to inspire my own work, but my own work is original. 

Ben Yelin: Yeah. I mean, that's where this gets really difficult. I think - you know, as far as I know, you are not a computer. 

Dave Bittner: Ha ha ha (laughter). 

Ben Yelin: Yeah, I know - or it just hasn't been revealed yet. 

Dave Bittner: That's right. 

Ben Yelin: It's a really difficult question. I mean, you are using your own intellectual capabilities in that example to extrapolate from your inspiration and create something that's your own. 

Dave Bittner: Right. 

Ben Yelin: That's not really happening in the context of GitHub here, or Copilot. 

Dave Bittner: Isn't it? 

Ben Yelin: Well... 

Dave Bittner: See? That's what it... 

Ben Yelin: I mean, I just don't... 

Dave Bittner: And when is it? If we're saying it's not, when is it, right? 

Ben Yelin: Yeah. I mean, that's really the million-dollar question here. 

Dave Bittner: Yeah. 

Ben Yelin: Is the output of what's being put into this algorithm - is that the type of creative inspiration that's equivalent of you drawing your own Picasso? Or is it just regurgitating copyrighted information that's already gone in? Like, you - I feel like you have to answer that question somewhat philosophically. And, like, I don't - I just don't know how that's going to work in the judicial context. 

Dave Bittner: Well, let's be even more specific about this - 'cause what if I were an artist of collage, right? 

Ben Yelin: OK. 

Dave Bittner: So my original art is made up of going through magazines, books, artworks, and cutting out existing bits of art and assembling them in my own way, right? So if I do a bit of collage and I use someone else's artwork in there, is that a problem? 'Cause I think that's probably more along the lines of what we're talking about here. 

Ben Yelin: I think you would have copyright problems in that context. 

Dave Bittner: You think? 

Ben Yelin: Yeah. 

Dave Bittner: It's kind of like - what? - rappers sampling, you know, music and stuff like that, right? 

Ben Yelin: Right. 

Dave Bittner: We've - yeah. 

Ben Yelin: I mean, 'cause then you'd be passing off somebody else's copyrighted material as your own. And, you know, if this was your fourth-grade school project and you're making a collage, who cares, right? 

Dave Bittner: Yeah. 

Ben Yelin: But if this is - you know, you want to produce this collage and sell this as a piece of art, then I think you do owe acknowledgments, monetary consideration, etc., to the people who actually created those images. I think the purpose of our intellectual property law is to foster an environment of creativity, where people can reap the fruits of their own creative work. 

Dave Bittner: Right. 

Ben Yelin: And if you're looking at the spirit of that, it just seems like somebody has actually created the code here. That was the intellectual work. There's not - is the computer engaging in its own intellectual work in spitting that out and turning it into something else? To me, it just doesn't seem like it does. But I'm open to being persuaded on this issue. This would be a good time to write in to our show, actually, if you think we're way off base here. 

Dave Bittner: Well, but, you know, I think this is interesting because people are really passionate about this. They are taking sides, and there is spirited discussion and I think good-faith, interesting arguments from both sides. And so as I say, what I'm sensing is that there's something about this that I think, at our core, makes some people uncomfortable. The notion that a computer - that an AI system could express genuine creativity I think puts a lot of people on edge. And I get it. 

Ben Yelin: Yeah. 

Dave Bittner: It doesn't - for whatever reason, it doesn't bother me the way that it bothers a lot of other people, but I certainly understand their concerns. 

Ben Yelin: Yeah, I mean, we talked about this in the context of ChatGPT. It's - there's a certain level of creativity that AI uses. And, you know, if I said, write a Shakespeare soliloquy about coffee mugs, like... 

Dave Bittner: Right. Right. 

Ben Yelin: Again, there's that question of - are you appropriating somebody else's creativity and using that to turn a profit? And I think it's really unclear when we're talking about a computer doing it and how traceable it is to the original copyrighted code, and that's going to be something that's really hard to identify. It's like, you put a million pieces of, you know, woodchip to set up mulch on somebody's yard. How do you go about, you know, finding out which was the mulch that's stopped the erosion of your garden? This is a terrible example. 

Dave Bittner: (Laughter) I'm with you, Ben. I'm with you. 

Ben Yelin: I know. 

Dave Bittner: Keep going. Come on, land the plane. Land the plane (laughter). 

Ben Yelin: No, I'm just going to jump out and activate my parachute. 

Dave Bittner: So, I think I've said this before, and I wonder. For a lot of folks who are creative people - and people who write code are creative people; artists are creative people - they're seeing these artificial intelligence systems, you know, coming into their lanes. And I can't help thinking this must be how portraiture artists felt when photography came about. 

Ben Yelin: Right. 

Dave Bittner: That they're looking at this new technology, and they're thinking to themselves, who is going to sit to have their portrait painted when you can press a button and snap a photo, and there you go. 

Ben Yelin: Right. 

Dave Bittner: You know, the whole family, right? 

Ben Yelin: Right. 

Dave Bittner: It's that - to me, I think that's an interesting comparison of a technological advancement that - and it's not that we don't have portraiture artists anymore. 

Ben Yelin: Right. 

Dave Bittner: We do, but they're not the primary way to have your image captured anymore. 

Ben Yelin: Right. I think that's a pretty apt comparison. And I don't know the history of the law there, but I'm sure, just like this, that kind of took a while to develop. 

Dave Bittner: Mmm hmm - so to speak. 

Ben Yelin: Yeah, exactly. 

Dave Bittner: (Laughter). 

Ben Yelin: Terrible dad joke there. 

Dave Bittner: And we'll leave it there. We'll leave our listeners... 

Ben Yelin: Before this gets way worse. 

Dave Bittner: Yeah, we'll leave our listeners shaking their heads ruefully at our dad jokes. All right. Well, we will have a link to that article over at IEEE Spectrum. We would love to hear from you. If there's something you'd like us to discuss on the show, you can email us. It's caveat@thecyberwire.com. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Andrew Hollister. He's from a company called LogRhythm, and we were discussing the recent data exposure incident involving the Shanghai National Police. Turns out, as you know, there are a lot of people in China. And so... 

Ben Yelin: Although they are actually losing their place as the most populous country in the world as we speak... 

Dave Bittner: Oh, is the... 

Ben Yelin: ...To India. 

Dave Bittner: India is outstripping? 

Ben Yelin: I read that last night. 

Dave Bittner: Oh, interesting. All right. All right. Well, still, a huge data breach - certainly one of the largest in history. And my conversation with Andrew Hollister centers on the potential impacts of that and also, with China's policy, whether we'll actually know how bad it was. Here's my conversation with Andrew Hollister. 

Andrew Hollister: Yeah, so I think it kind of came to light around the end of June, when a previously unknown individual put up for sale all of this data - a staggering 23 terabytes of PII, or personally identifiable information, from China, potentially belonging to some 1 billion people, as you say. And I think one of the remarkable things here is really the breadth of information that was actually contained in this cache of data. We would typically see, you know, an email address, possibly a password, a name and a date of birth or something like this. But in this case, it really had almost everything you could possibly think of - names, addresses, birthplaces, phone numbers, even through to things as sensitive as criminal records - all of it associated both with Chinese nationals and even some foreign nationals who might have visited during the past few years. So both in scale and in breadth, it's quite a remarkable leak. 

Dave Bittner: And where do we suppose this data came from? What was the original source? 

Andrew Hollister: So it appears to have been accessed from an unsecured police database in Shanghai. And, again, the details are a little bit cloudy, but it's believed to have happened because a dashboard for managing that database was left open to the internet without a password. So this is obviously information that's come from, you know, essentially, a government source that was collecting that information on behalf of a government organization. 
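(As a generic illustration of the kind of exposure described - the hostname below is hypothetical, and this is not the actual Shanghai infrastructure, whose details remain unconfirmed - a Python probe for a management dashboard that answers without credentials might look like this.)

    # Generic illustration: a management dashboard that returns 200 OK with no
    # credentials supplied is reachable by anyone on the internet. Hypothetical
    # hostname; not the actual infrastructure from this incident.
    import urllib.request

    def is_open_dashboard(url: str) -> bool:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.status == 200  # 200 with no auth supplied: wide open
        except Exception:
            # Auth challenges (401/403), timeouts, and DNS failures land here.
            return False

    print(is_open_dashboard("http://db-dashboard.internal.example:5601/"))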

Dave Bittner: Can you help us understand, how does this fit in with what we know about the way that China does collect information on their citizens? 

Andrew Hollister: In a way, Chinese surveillance of their citizens is not really a speciality of mine, but, I guess, in general terms, we know that there's a great deal of surveillance that is done within China as a society. We see that in the media. We see that in reports. I think the surprising thing here was both the breadth of the surveillance that this reveals and the - I suppose revelation's perhaps a bit of a strong word - the confirmation that that data is, at least in some form, drawn together, collated and made accessible as a complete dataset, versus sitting in, you know, kind of discrete stores of data. I think the surprising thing was all this data was collected together in one place - evidently not for the benefit of the hacker who took the data away, but for the benefit of that organization, to be able to get access to a very broad set of data about individuals in one source. 

Dave Bittner: And how does this inform our own thoughts for how our data is collected? I mean, I think about, you know, advertisers, data aggregators, all those sorts of things we talk about all the time here in the West. Is it really that different for us than the types of things we saw from this data breach itself? 

Andrew Hollister: Yeah, I think it's a good question. And, you know, it's obviously a concern. It's been a concern of privacy campaigners and others for, I guess, a decade or more now. The ability to draw different datasets together and imply certain things from them, and the ability to collect data for one purpose and have that data perhaps used for many other purposes, has been something that's obviously occupied the minds of government regulators, both in Europe and in the United States. And, I think, probably notably, the Chinese government has also been occupied with the same issues around the personal information protection law, which passed in China, I think, about six or eight months before this particular breach. But those things are very often focused on commercial rather than public use of data, if you like - its use by commercial entities versus its use by government organizations. And certainly, that appears to be the case here. 

Dave Bittner: When we think about the response of the Chinese government - and by that, I mean this has really been tamped down. The very fact that this data breach happened has been removed from social media, from local reporting in China. That in itself is noteworthy. 

Andrew Hollister: Yeah, I think it is. I guess my view would be that social media seems to be fairly routinely censored for one thing and another in China in general. So I don't think the fact that this breach was censored, in and of itself, was probably particularly surprising. It would perhaps have been more surprising if it wasn't censored. And clearly, it provides some level of insight into the fact that Chinese organizations, be they commercial or governmental, wrestle with the same kinds of challenges as Western organizations, which we're perhaps more commonly used to seeing compromised by these kinds of attacks. They're not immune to it. We're perhaps more used to considering China and other geographies as the source of these kinds of acts rather than as being affected by them themselves. But I think this pretty well illustrates that, wherever you are in the world, whatever the geopolitical situation, you are vulnerable to these kinds of breaches. 

Dave Bittner: I wonder - or I can't help wondering if this contributes to what I sense is kind of a growing sense of resignation when it comes to these sorts of things. You know, we hear about data breaches. We hear about our data being vacuumed up by various organizations. And I think for a lot of folks, they feel as though there's not a whole lot they can do about it. And so, you know, there's almost a feeling like, well, it's all out there. So, you know, what else is there to do? 

Andrew Hollister: Yeah, I know. It's a tricky question, isn't it? And you read some of the reports about the number of billions of records that have been breached over the course of, I guess, the last 10 to 20 years, and there are multiple records per human being on the planet, in aggregate. So I think there is some sense of resignation, on an individual level, that my data's out there somewhere. Looking at it, I guess, more from a commercial or organizational level - and whether this is resignation or not, I don't know - it has been accepted for some time, in the cybersecurity industry at least, that it's when, not if, an organization will suffer a breach of some kind or another. And, you know, I think this underlines the fact that, irrespective of your geography, that principle stands. 

Andrew Hollister: And then really the question is, if that's the case, what can we do about it? Are we going to stand back and throw up our hands in despair, or do we take the attitude: well, I know attackers will come after the information that I hold - what are the steps I can take to quickly identify that and at least minimize the impact of that attack? 

Andrew Hollister: And I commented some time ago on a breach that happened at a hotel chain, where a number of millions of records, if I remember correctly, were stolen. And the same organization got targeted again about 12 or 18 months later, and it detected and responded to that second breach much more quickly, to the extent that many, many fewer records were actually breached. And I think that gives us a good illustration: organizations can make progress against these kinds of breaches, but it takes effort. It takes investment. You know, of course everybody has a cybersecurity program. Of course everybody's trying to protect against this kind of activity. But the actions we take in response to it, the actions we take in terms of doing the basics, are very, very significant and make a difference over time. 

Dave Bittner: All right. Ben, what do you think? 

Ben Yelin: Really interesting stuff - I mean, a combination of the secretive nature of the Chinese government... 

Dave Bittner: Right. 

Ben Yelin: ...And just the ripple effects that this has had worldwide and the fact that we have so little information in that sense makes it kind of terrifying, in that, you know, we have something that's certainly going to go beyond China's borders, but we know so little about what happened. So, yeah, I thought it was a really interesting interview. 

Dave Bittner: Yeah. All right. Well, our thanks to Andrew Hollister from LogRhythm for taking the time to join us. We do appreciate it. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.