Lessons learned from environmental legislation.
Bret Cohen: Businesses have been largely unregulated with their data, and they're figuring out new ways to use it. And they need some guidance on how they should be regulated.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hey, Dave.
Dave Bittner: On this week's show, Ben shares news on the ability of the government to search your electronic devices at the border. I have a story about Google drawing the attention of HHS for gathering medical patient data. And later in the show, my interview with Bret Cohen. He's president and CEO at Tier 1 Cyber. He's got some interesting insights on some of the parallels between data security and privacy laws and environmental legislation. We want to remind you that while this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. We'll get started in just a moment.
Dave Bittner: But first, a word from our sponsor, KnowBe4. And now a few thoughts from our sponsors at KnowBe4. What do you do with risk? We hear that you can basically do three things. You can accept it, you can transfer it or you can reduce it. And, of course, you might wind up doing some mix of the three. But consider: risk comes in many forms, and it comes from many places, including places you don't necessarily control. Many an organization has been clobbered by something they wish they'd seen coming. So what can you do to see it coming? Later in the show, we'll hear some of KnowBe4's ideas on seeing into third-party risk.
Dave Bittner: And we are back. Ben, why don't you kick things off for us this week?
Ben Yelin: So we had a big case in one of our areas of great interest, which is border searches.
Dave Bittner: Yeah.
Ben Yelin: So obviously, the Fourth Amendment prevents unreasonable searches and seizures. There's an exception that's been recognized by the Supreme Court and other courts relating to searches at the border just because that represents sort of a special need beyond traditional law enforcement.
Dave Bittner: Right.
Ben Yelin: We don't want bad people or contraband to be coming into this country. So at least the feeling in the past has been we shouldn't have as stringent requirements against searching either individuals themselves, their clothes, their body, et cetera.
Dave Bittner: Even U.S. citizens we're talking about here.
Ben Yelin: Yes. This applies to U.S. citizens and U.S. persons.
Dave Bittner: Right.
Ben Yelin: And specifically, the doctrine arose in the context of contraband - so whether we're bringing back drugs or items that otherwise would not make it through customs. This has become complicated when we start to discuss electronic devices. So a group of 11 plaintiffs filed a suit against the Department of Homeland Security saying that their Fourth Amendment and First Amendment rights were violated when, in a number of circumstances, Customs and Border Protection would compel people to enter their passcode to get access to their smartphone or their personal device.
Ben Yelin: The named plaintiffs in this case are U.S. persons, U.S. citizens. Customs and Border Protection asked one of the named plaintiffs to unlock her phone. This woman had a religious objection to having her phone unlocked because she and her daughters wear headscarves, and some of the photos contained pictures of her daughters without headscarves, which goes against her religious beliefs, so she didn't want that agent to be able to see those pictures. She and the other plaintiffs also asserted that this violated their rights against unreasonable searches and seizures.
Ben Yelin: So a district court judge in Massachusetts held that suspicionless searches at the border violate the Fourth Amendment, so the government has to prove some sort of individualized suspicion to search a device. And the way the prevailing law had worked prior to this decision is there were sort of different rules for two different types of searches. For a routine search - which is, you know, sort of just a cursory glance at somebody's phone, not actually downloading any data but just combing through somebody's pictures and social media posts - that did not require any individualized suspicion. A nonroutine search did require reasonable suspicion, which, as we've talked about before, is a standard below probable cause, but it still means you have to have a reason to want to look at that device.
Ben Yelin: What this court held is that for all searches of any electronic device at our border crossings, the government needs to show reasonable suspicion. So unless they can prove that, you know, they have some reason to think that you have evidence of contraband - illegal items - on your phone, unless they can prove that in court, then anything gleaned from that device is going to be inadmissible in a future criminal trial. And the court granted declaratory and injunctive relief, meaning Customs and Border Protection and Immigration and Customs Enforcement are prohibited for now from doing suspicionless searches of electronic devices at our border crossings.
Dave Bittner: Now, what does this mean from a practical point of view? If I'm coming back from vacation or business travel and I'm coming through the border and that border security agent decides that they just don't like the look of me, by this rule, they can't make me have a bad day. They can't search my device.
Ben Yelin: So it's always interesting talking about the practical impacts of this.
Dave Bittner: Yeah.
Ben Yelin: If you or I were questioned by a Customs and Border Protection agent and the guy in a scary-looking uniform told us to unlock our device, we'd probably unlock it. You know, if they discovered that I had text messages about all the great drugs I was able to get ahold of during my vacation to Colombia or whatever...
Dave Bittner: Right.
Ben Yelin: ...And the government tried to prosecute me, because of this decision, they would suppress that evidence because this Customs and Border Protection agent didn't have any individualized suspicion. They had no reason beyond, you know, a mere hunch that there would be evidence of criminal activity contained on that device. So I always wondered, like, is it realistic that somebody who's questioned - you know, after a 12-hour plane ride, they're at an airport.
Dave Bittner: Right.
Ben Yelin: A Customs and Border Protection agent comes up to them, and they're like, give us your passcode. We need to check your device. Somebody is going to be, like, well, I'd like to cite the recent...
Dave Bittner: Right.
Ben Yelin: ...District court case of...
Dave Bittner: Yes. Exactly.
Ben Yelin: ...Alasaad v. Nielsen, which, by the way, is the name of the case. So, you know, there's that real-world element to it. Certainly, somebody who did say that is now fully within their rights to do so.
Ben Yelin: There is some sort of complicated procedural element to this. While the declaratory relief here - the declaration that these types of suspicionless searches violate our constitutional rights - has nationwide applicability, the injunction, which is the order that would prevent Customs and Border Protection from instituting these searches - it's not entirely determined yet whether that will apply nationwide. There's going to be a separate hearing to make that determination because the plaintiffs in this case who sued didn't include in their briefs whether they want a nationwide injunction. Although because these were 11 different plaintiffs who came in from all corners of the world and at 11 separate ports of entry into the United States, it seems to me that it would make sense to have some nationwide applicability.
Dave Bittner: One of the people who I believe is one of the plaintiffs - but if not, was certainly mentioned when it came to this case - was a journalist who was coming through and got his device searched and was basically given a hard time by - I believe it was an ICE agent - about social media posts that were critical of the government. And that was part of why this person joined this case. That's my recollection of it. To me, that sort of thing is more chilling than anything else. He was slowed down. He was hassled - you know, given a stern talking-to by someone in authority. He was not arrested, but he was detained.
Ben Yelin: Right, and, you know, that could have a chilling effect on his First Amendment activity as a journalist. As I recall from - and I've, you know, certainly read about the case you're talking about. I'm not sure if they were one of the plaintiffs in this case.
Dave Bittner: Yeah.
Ben Yelin: But they didn't have any suspicion that, you know, there was anything incriminating on this journalist's device. They just knew that this person had been critical of the government.
Dave Bittner: Well, and that's not illegal.
Ben Yelin: No, not yet.
(LAUGHTER)
Ben Yelin: Well, we'll see if that changes by the time this episode airs.
Dave Bittner: OK.
Ben Yelin: But yeah. I mean, you can see how that would have a chilling effect.
Dave Bittner: Right.
Ben Yelin: And I think one thing this sort of brings up is how much information is actually contained on these personal devices. This is something that the Supreme Court recognized in an unrelated case, Riley v. California. We're not just carrying around a collection of physical material. We're carrying around a detailed record of our entire life - our religious and political associations.
Dave Bittner: Yep. Conversations with friends and family that...
Ben Yelin: Exactly.
Dave Bittner: ...Have nothing to do with our crossing of the border.
Ben Yelin: Absolutely. So, you know, it's very different than physical searches, than taking apart your suitcase and checking your baggage.
Dave Bittner: Right.
Ben Yelin: We're talking about something that's far more personal and intrusive, and that's something that the Supreme Court has recognized. So I think that informed the district court judge in this case - that personal devices deserve this heightened level of protection just because of how much information is contained within them.
Dave Bittner: Where does this likely go next?
Ben Yelin: So my guess is that the government is going to appeal. First, there's going to be a hearing about whether the injunction will be nationwide. The Trump administration in general, led by Attorney General Barr, has been adamantly opposed to nationwide injunctions. They're making sort of a nationwide effort in a whole bunch of different cases to prohibit these types of nationwide injunctions. If there is an injunction, I think the government would almost certainly appeal to the Court of Appeals. We would see if the Court of Appeals would issue a temporary stay on this decision pending what they decide.
Ben Yelin: So it's possible that, you know, today you have the right to not have a suspicionless search of your device at the airport, but if there's some sort of temporary stay issued by a court of appeals, that right might be eliminated for a certain amount of time. But it would go up to that court of appeals. It would be heard, probably, by a three-judge panel of appeals court judges, and they'll consider the record and the facts that went into this district court decision.
Dave Bittner: Would it likely head to the Supreme Court?
Ben Yelin: It's possible. So, you know, usually, you'd have to see a circuit split to get a case to come to the Supreme Court. That's more of a general rule of thumb; it's not always the case. The Supreme Court has ruled on border searches specifically in the context of the Fourth Amendment - how it qualifies as a special needs search. So, you know, because they've weighed in in the past, perhaps this would be another opportunity for them to weigh in. But I'd say that no matter what happens at the court of appeals level, there's no guarantee the Supreme Court would grant certiorari on a case like this.
Dave Bittner: All right. I suppose for folks who advocate privacy and so on, this is an important win.
Ben Yelin: It sure is. I mean, this is a case where the plaintiffs worked with the ACLU and the Electronic Frontier Foundation. They have some of the best privacy lawyers in the country, and, you know, so this is a big accomplishment - a big win for them and for privacy and civil liberties advocates.
Dave Bittner: All right. Well, my story this week has to do with Google and some of the attention they have drawn from the Department of Health and Human Services, specifically their Office for Civil Rights. Turns out that Google has a project called Project Nightingale, which...
Ben Yelin: Like, why do they have to choose the scariest, most random-sounding project name? That sounds like a bizarre military operation.
Dave Bittner: (Laughter) So I suspect that they are referencing Florence Nightingale, who, I believe, is credited with being, really, the founder of modern nursing. So I think that's probably what they're tying into.
Ben Yelin: I see. OK.
Dave Bittner: At any rate, Google teamed up with Ascension, which is a healthcare organization. It's a hospital system, specifically a Catholic hospital system. They operate in 21 states. And Google teamed up with them to gather patient data, and Google says that they're analyzing this patient data to allow the folks at Ascension to better serve their patients, which sounds like a reasonable endeavor. It sounds like something that you would want to do for the good of people.
Dave Bittner: But as you and I have talked about before, we do have this thing in the United States called HIPAA, which covers privacy when it comes to your medical data. And that is what HHS is interested in - whether this partnership could perhaps be in violation of HIPAA. What do you make of this, Ben?
Ben Yelin: Yeah. It's a really interesting case. I mean, first of all, Project Nightingale is pretty intensive. They're including patient data. It's not just, you know, completely anonymized data. The article claims that it includes names and birth dates.
Ben Yelin: And the goal, according to Google and Ascension, is to help deliver more targeted medical treatment. That language is important because the way HIPAA works - health systems and hospitals are able to give HIPAA-protected information to third parties if those third parties are going to use that information for a clinical purpose or to support clinical activities.
Ben Yelin: So if they were to use this information to further medical research or for some other reason that would assist with patient care, then it would be HIPAA-compliant. Where it would not be HIPAA-compliant is if Google were collecting this information for nonclinical purposes, such as, give me all of the people who have, you know, heart conditions...
Dave Bittner: Right.
Ben Yelin: ...And we'll put Lipitor ads...
Dave Bittner: Right.
Ben Yelin: I keep using the same company. I should think of a different example.
Dave Bittner: (Laughter) Yeah, but that's the thing here, right? I mean, if - I suppose if it were some data analytics company that we - no one had ever heard of that wasn't in the business - the primary business of selling ads, probably people wouldn't notice. They'd say, oh, this is fine. Ascension is partnering with someone to analyze the data and deliver better healthcare. But when it's Google, it kind of makes us raise our eyebrows and think, well, they certainly have an interest in vacuuming up data for their own selfish purposes.
Ben Yelin: Right. And that's - you know, I would say that's one of two reasons why this seems so suspicious. The fact that it's Google makes all the difference in the world because virtually every single person in this country sees Google advertisements. So if we see that our personal medical information is being given to Google as a third party, we're probably not that concerned with whether they intend to use it for clinical purposes. They have it. And that, in and of itself, is going to be a concern.
Ben Yelin: The other concern is that doctors and patients were not notified of this partnership. Now, in some ways, that makes sense. The hospital system and Google came to an arrangement, and they probably wanted to prevent blowback from patients and doctors about invasions of privacy. But they still wanted to conduct this work to develop more robust medical records. So...
Dave Bittner: But isn't that like if I really want to have a chocolate chip cookie - so instead of asking my mom if it's OK...
Ben Yelin: You just take it.
Dave Bittner: ...I just take the cookie (laughter).
Ben Yelin: Yeah. As somebody who has a young child, that strategy does not work...
Dave Bittner: OK.
Ben Yelin: ...For them.
Dave Bittner: Right. Right.
Ben Yelin: Yeah. So I think that's sort of one of the reasons why it caused so much alarm. So...
Dave Bittner: Yeah.
Ben Yelin: You know, one thing that's worth noting is these types of relationships exist in other contexts. Other hospital systems - I think this article mentioned the Mayo Clinic - have entered sweeping partnerships. Mayo Clinic entered a partnership with Google to store its data in the cloud and to use Google's analytics tools to analyze clinical information. In that case, Mayo Clinic insisted that the data would be anonymized, which seems not to have been the case when we're talking about Ascension Health.
Ben Yelin: So you know, this is something that's not wholly out of the ordinary, and it's authorized under the law with this clinical use exception, which is a - I wouldn't call it a loophole 'cause I think it - it's a legitimate purpose. We would want these third-party companies to use what they bring to the table, which is, you know, analytical capabilities, technological capabilities to...
Dave Bittner: Right.
Ben Yelin: ...Aid in clinical outcomes.
Dave Bittner: Yeah.
Ben Yelin: But it would be nice to get informed consent from patients and doctors.
Dave Bittner: Yeah. Yeah, that's an interesting tension there because they - obviously, when you go for informed consent, you're less likely to get the number of people that you'd get otherwise.
Ben Yelin: Right. And the data is only going to be useful - you know, it's going to be more and more useful the larger sample size you can get. But then again, you know, this article mentions that 150 Google employees have access to this treasure trove. I mean, we're talking about a hospital system that's in 21 states. I'm guessing we're talking about millions of health records.
Ben Yelin: So, you know, the fact that 150 Google employees have access to those records, including names and birth dates - you know, I could see why somebody would be like, I support clinical research. I support using analytical tools to improve medical outcomes, but this is just a step too far. You know, perhaps it's not worth it, even if there are medical benefits from this type of relationship, to have all of this data taken without informed consent from the patients.
Dave Bittner: Yeah. Well, I mean, it's interesting. The Wall Street Journal originally broke the story about this. And I suppose the system is working in the way it should in that the Department of Health and Human Services is taking a look at this, and that's how we want things - these things to play out.
Ben Yelin: Absolutely. So that's very promising. HHS is analyzing this relationship to see if there is evidence of a HIPAA violation. That's what regulatory agencies are for. I don't know exactly how the Wall Street Journal got their hands on this information because it seemed to have been secretive and not shared with Ascension employees. I'm wondering if it came from some sort of anonymous leak. But yes, I mean, as soon as HHS got ahold of this information from the media, they immediately announced that they were going to conduct an investigation, which I think is promising. Like I said, that's what HHS is for - preventing violations of our federal statutes.
Dave Bittner: Right. All right. Well, that's certainly one to keep an eye on - see how that one plays out. It's time to move on to our Listener on the Line.
(SOUNDBITE OF DIALING PHONE)
Dave Bittner: Our Listener on the Line this week is Tony (ph) from Buffalo. He calls in, and he has this question for us.
Tony: Hi, this is Tony from Buffalo. If one of my friends or relatives sends me his laptop to fix, would it be legal for me to download his personal documents, like his tax returns or financial records?
Dave Bittner: That's an interesting question. Ben, what do you make of this?
Ben Yelin: So to be a nice person, you probably just should not do that. But...
Dave Bittner: Yeah, not good to be poking around on someone else's device (laughter).
Ben Yelin: No. And I would say, you know, Tony from Buffalo, whether this is you or a friend of yours...
Dave Bittner: Right. Asking for a friend, yeah (laughter).
Ben Yelin: Yeah, asking for a friend. If you're given a computer for a limited purpose, I would not download somebody's personal information.
Dave Bittner: Right.
Ben Yelin: However, because this is a law and policy podcast, I'll answer this in the context of what's called the Computer Fraud and Abuse Act, which prohibits unauthorized access to computers. The way the Computer Fraud and Abuse Act works is there can be criminal or civil penalties if - even when you've been granted access to a device for a limited purpose - you glean information from that device that goes beyond that limited purpose. So you know, if you work in an IT department and somebody gives you their laptop to fix some sort of hardware issue and you download their personal files, that would be a Computer Fraud and Abuse Act violation.
Dave Bittner: Now, I suppose if that laptop belonged to the organization, if that was your work laptop, all bets are off, right? I mean, that belongs to them, yeah.
Ben Yelin: Yes, that's completely different if it's not your personal laptop because I'm sure the nature of the agreement with your employer is that they retain access to the information on their computer.
Dave Bittner: Yeah.
Ben Yelin: They have the right to check it at any time. But when we're talking about a personal device, I think this would run afoul of the Computer Fraud and Abuse Act. The other question related to this is, does the act apply to somebody's personal device? So originally, the Computer Fraud and Abuse Act only applied to government computers. The act was amended to account for the dot-com boom in the 1990s to include any device, whether private or public, that is connected to the internet. So you know, it's sort of interesting that perhaps the rules would be different for a device that did not have a wired or wireless connection to the internet.
Dave Bittner: That's interesting.
Ben Yelin: But, you know, I'm not sure how a court would come down if, let's say, somebody's device had access to the internet and had, you know, their TurboTax files on there, but they happened to be in a place where there was no Wi-Fi, and so they weren't connected at that moment.
Dave Bittner: Yeah. Yeah. What about these cases where folks have taken their computers in to have them repaired at, you know, the local computer shop and those repair people stumble across illegal things - you know, like child pornography or something like that - and they feel an obligation to report that to the police? How does that play into all this?
Ben Yelin: According to the Computer Fraud and Abuse Act, they would still be potentially criminally or civilly liable because that would exceed the authorization given to them to look at that device. In the real world, there would probably be some sort of process where you could be a whistleblower and give that information to the government.
Dave Bittner: Right.
Ben Yelin: And you could do so anonymously, without worrying about the government pursuing violations of the Computer Fraud and Abuse Act. You know, we talk about things like inevitable discovery; that could be something that a court could use. They could say, whether it was a guy at the computer repair shop who came across this or anybody else, because this child pornography was contained on this device, it was inevitable that it was going to be discovered. So therefore it can be admissible in court. But from a technical standpoint...
Dave Bittner: (Laughter).
Ben Yelin: ...If that was a connected device, that would be a violation of the Computer Fraud and Abuse Act.
Dave Bittner: Yeah. I heard one just in the past couple weeks where there was - a woman had taken her - I believe, her iPhone into the Apple Store for some work. She went to the Genius Bar.
Ben Yelin: Was it Rudy Giuliani? Or...
Dave Bittner: I do not believe it was.
Ben Yelin: OK.
Dave Bittner: No, no. This was outside of his technical support needs. But on her device were some private, personal, intimate photos of herself, and somehow the person at the Genius Bar saw these photos and copied them and posted them online. I believe the person at the Apple Store quickly lost his job. But way out of bounds there, right?
Ben Yelin: Yeah. I mean, that is a facial violation of the Computer Fraud and Abuse Act because you're abusing what your limited access was for, which was to fix that device. And the act is a data privacy protection law. It's limited in its applicability, but it is one of the strongest data privacy protection laws the federal government has on the books.
Dave Bittner: All right. Well, thank you to Tony for calling in with the question. Coming up next, we've got my interview with Bret Cohen. He is from Tier 1 Cyber. He's got some interesting insights on some of the parallels between data security and privacy laws and environmental legislation.
Dave Bittner: But first, a word from our sponsors, KnowBe4. So let's return to our sponsor KnowBe4's question - how can you see risk coming, especially when that risk comes from third parties? After all, it's not your risk - until it is. Here's Step 1 - know what those third parties are up to. KnowBe4 has a full GRC platform that helps you do just that. It's called KCM, and its vendor risk management module gives you the insight into your suppliers that you need to be able to assess and manage the risks they might carry with them into your organization. With KnowBe4's KCM, you can vet, manage and monitor your third-party vendor security risk requirements. You'll not only be able to pre-qualify the risk. You'll be able to keep track of that risk as your business relationship evolves. KnowBe4's standard templates are easy to use, and they give you a consistent, equitable way of understanding risk across your entire supply chain. And as always, you get this in an effectively automated platform that you'll see in a single pane of glass. You'll manage risk twice as fast at half the cost. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. Check it out.
Dave Bittner: Ben, I recently had the pleasure of speaking with Bret Cohen. He is the president and CEO at Tier 1 Cyber, and our conversation centered on some of the interesting parallels - things I never really thought about before - about data security and privacy laws and some of the things we've learned along the way from environmental legislation. Here's my conversation with Bret Cohen.
Dave Bittner: When I've been talking to folks who are focused on the privacy side of cybersecurity, when it comes to things like data retention, I've heard people say that we need to change our mindset from this thought that we should be collecting and keeping all sorts of data - that rather than data being valuable, maybe we should think about data as being radioactive, that it has potential danger to it. If we get too much of it in one place, bad things can happen. Do you think there's something to that argument?
Bret Cohen: Yeah. No, absolutely. You know, I think we're at a point where companies are looking for guidance on what to do with all of this data. You know, businesses have been largely unregulated with their data, and they're figuring out new ways to use it. And they need some guidance on how they should be regulated in this area, and consumers need that guidance as well.
Dave Bittner: We're over a year now into GDPR, and California has their own data privacy law that they're spinning up here. What do you think we're seeing, from that point of view - the effect that that's had on how companies are handling storage and retention, and even their relationship with data and privacy?
Bret Cohen: Sure. I think companies are still trying to figure that out, quite frankly. You know, in particular, kind of medium-sized businesses - those businesses in the, you know, $50 million in annual revenue to, you know, $200 million, $250 million range - they're not sure exactly how to handle their data, how GDPR really applies to them or doesn't apply to them. And, you know, with CCPA coming on board here shortly, they are equally as confused, quite frankly, which is exactly why, you know, there's been a push for national privacy legislation.
Dave Bittner: Do you think we're headed in that direction? Do you think we'll see national legislation for privacy?
Bret Cohen: Absolutely I do. I mean, it seems like on a weekly basis, there's a different bill proposed. I believe in the last week or two there was - not a bill proposed, but The New York Times was reporting that they're working on a bipartisan proposal. Now, I don't necessarily see it happening, you know, obviously, this year or even next year for various, you know, political reasons and distractions, if you will. But certainly, I think in 2021 in particular, we're primed for it. Businesses are ready for it, and it's something that I think we all deserve.
Dave Bittner: It's interesting because some people I've spoken to are skeptical that Congress will be capable of putting something together like this - that the dysfunction, their inability to pass anything right now, will keep them from being able to do anything and that we'll have to rely on the states. Do you think that's a possibility?
Bret Cohen: In the short term, yes, but certainly not in the long term. I mean, here is really where the parallels between environmental law and privacy law kind of align. You know, environmental law, 50 years ago, really needed a bipartisan effort. There was a lot of debate about whether or not something could actually get done, but it did get done, you know? They created a system. They went out there - kind of out on a limb, if you will. But there was a real need for it - a need for it for companies' sake, for competitive sake.
Bret Cohen: And, you know, I think we're going to see that on the privacy front as well. There's going to be a real demand for it, a need for it - not only for consumers' sake but, in particular, for businesses. And as we're seeing with GDPR and CCPA - I think, actually, those laws will drive us to a national privacy standard because there are still aspects of those laws that are undefined, quite frankly. So I think once those are on the books and businesses have a little bit of experience trying to comply with them, that will make this much more obvious and needed as we move forward.
Dave Bittner: Yeah. It's interesting to me that it does seem like that rare bit of policy interest that has genuine bipartisan support. It's noncontroversial, these notions of privacy and security. It seems like both sides can get behind that.
Bret Cohen: Absolutely. You know, I think there is obviously a big business interest from some of the larger companies in getting this done and getting this done in their manner. But, you know, I think in particular, the small- and medium-sized businesses really need this, and if we can create a level playing field, that will certainly benefit all. And I think it's up to our legislators to demand that from the larger companies and from all the different stakeholders to get something through that really is truly bipartisan.
Dave Bittner: What are some of the specific challenges that face those small- and medium-sized businesses when it comes to dealing with these issues?
Bret Cohen: I think it's a matter of resources to put to it, you know? If you have all these different standards, lots of different state standards, and they're all - they vary in different ways, then it makes it very expensive and complicated for midsized businesses to try to figure out how exactly to comply with them.
Bret Cohen: And then even with the CCPA, for example, it has a provision in it that companies have to have reasonable information security controls and protocols in place. You know, exactly what does that mean? How are medium-sized businesses supposed to figure out what cybersecurity framework they're supposed to follow? How do they limit their liability? They want to do well. They want to do right by their clients and consumers, but they're having an awful hard time - or they will have an awful hard time - figuring out exactly how to comply with vague standards.
Dave Bittner: What are some of the lessons that we can take from environmental policy of days past and even how things function in that zone these days? What are some of the lessons we can use for that when it comes to cybersecurity?
Bret Cohen: I think one of them is this concept of unintended consequences that we saw with environmental law. Some of the environmental laws were very broad in scope, and they didn't necessarily take a holistic point of view when they were initially passed. So for example, there would be a regulation for clean air, and it'd be very strict and stringent, and a company would try to comply. But complying would create more hazardous waste, and then we didn't necessarily pass laws at the right time to regulate the hazardous waste.
Bret Cohen: So, you know, kind of applying that to the data privacy world - you know, what information will be protected? How will it apply to different-size entities? And in particular, I am interested in the liability portion of all of these proposals. You know, what if companies do comply, but they have a cyber event, a breach? Because as we know, it's difficult to protect against these things, difficult to create a foolproof system. So if they are breached, you know, does that mean that they still have liability? What assurances can we give midsize businesses? A lot of these liability issues have kind of evolved and been dealt with in the environmental context and can teach us a lot of lessons for data privacy.
Dave Bittner: The more you sort of go through the issues there, I think it really is a fascinating analogy - that you have both local issues - you know, if I have a factory that inadvertently lets something, you know, spill into a local river, well, that's an issue. But also, it's the kind of thing that can cross borders. If something is released into the air, if you have pollution or something like that, then - well, then you've got, you know, your neighbors, your national neighbors, that could have something to say about that. And those are very similar things to the kinds of things we're dealing with in the cyber space.
Bret Cohen: No, absolutely. And, you know, back to the liability point - for data privacy, do we create kind of a strict liability system where, if a company does get breached, they still have to fix the consequences and pay penalties? Or do we create some sort of safe harbor, where we lay out standards such that, if companies comply with them and meet this kind of reasonable threshold, they're exempt from liability? They might still have to fix some of the problems, but they've basically done everything they can do.
Bret Cohen: You know, on the environmental side, we have a little bit of both. We have some aspects that are strict liability. You're responsible no matter what. And we have other aspects where, if you pass certain standards, you still have to clean it up, per se, but you don't necessarily have penalties associated.
Dave Bittner: What are your recommendations for those small- and medium-sized businesses who are trying to navigate the environment right now kind of hanging in there while these things shake out? Any words of wisdom for them?
Bret Cohen: I would recommend that they do the best they can do. I mean, that sounds like an obvious point, but, you know, I think a lot of these new regulatory systems promote this kind of reasonable compliance standard. So you have the technical aspects, which are fairly straightforward to follow, quite frankly. But then when it comes to reasonable, you know, information security standards, what exactly should they follow?
Bret Cohen: Interestingly enough, this is somewhat analogous to what the Department of Defense is doing - the Pentagon is implementing a new cybersecurity certification. It's called CMMC, the Cybersecurity Maturity Model Certification, and they're requiring it for all of their contractors. It basically sets forth protocols - different controls that companies have to have in place - and certifies them at different levels. So, you know, back to your question - I think these small and midsized companies can basically pick a framework. Maybe it's CMMC, maybe it's NIST, maybe it's ISO. There are, you know, lots and lots of different frameworks. But pick a framework, do the best you can do to comply with that, and that should be - you know, I won't say good enough, but that's a good standard and a good starting point for them in trying to navigate these seas as this area becomes more known.
Dave Bittner: All right, some interesting things there, Ben. What do you think?
Ben Yelin: Yeah, definitely interesting. I'm maybe not as bullish as Mr. Cohen is on the chances for national data privacy legislation. I do think there is a desire out there, as he said, on the part of users and companies for a national standard. Maybe I just have a lack of institutional trust in Congress to solve a really important issue like this...
Dave Bittner: (Laughter).
Ben Yelin: ...Especially when you're dealing with potential landmines. He talked about liability as one of them. And, you know, you're going to get a lot of angry lobbyists and constituents if companies are going to be held strictly liable for data breaches. That's something that might frighten a member of Congress. I might not be quite as bullish as he is on the chance for a national standard.
Dave Bittner: Right.
Ben Yelin: You know, what's interesting to me is that in some ways the CCPA might turn into the de facto national standard the way GDPR turned into the de facto international standard because if you're forced to comply with the CCPA, whatever practices you're going to change to comply with that, in the context of California, you're necessarily going to change those for the other 49 states.
Dave Bittner: Yeah.
Ben Yelin: So it's almost like a race to the top, which is better than what we often see, which is the race to the bottom.
Dave Bittner: (Laughter).
Ben Yelin: You know, there's like - there's a reason why all the credit card companies are located in South Dakota, right?
Dave Bittner: Right, right.
Ben Yelin: Yeah. And why all businesses are incorporated in Delaware. So, you know, I think that aspect was interesting. The constitutional nerd in me, which always comes out, especially on this podcast...
Dave Bittner: (Laughter).
Ben Yelin: ...Got me thinking about - when he was making the comparison between environmental laws and data privacy, got me thinking about the Commerce Clause and why the founders included it in the first place. You know, our federal government was supposed to be one of limited powers, enumerated powers. But one of those powers was to regulate interstate commerce, and the reason is, interstate commerce was an issue of national applicability. So we needed to have uniform national standards that we could conduct business across states. And I think in the context of both environmental legislation and data privacy you can see why that's so important.
Dave Bittner: Yeah.
Ben Yelin: The winds can flow from Virginia into Maryland; the water can flow from Pennsylvania into Maryland.
Dave Bittner: Right.
Ben Yelin: And so these are issues that are crosscutting across state lines, and so it would make sense from a constitutional perspective for Congress to be empowered to come up with a solution.
Dave Bittner: One of the things that I found really interesting about what Bret had to say was about unintended consequences. And it reminded me of this funny thing that I think happened with GDPR, where, you know, you have a right to be forgotten. Well, if you have a right to be forgotten and there had to be verification that you've been forgotten, how do they verify to you that you've been forgotten if they've forgotten you, right? (Laughter).
Ben Yelin: That sounds like quite a tongue twister there.
Dave Bittner: Well, yeah. I mean, if - so how do you maintain the data that documents that someone has been forgotten if they've requested that their data be forgotten, right? (Laughter).
Ben Yelin: Yeah, it's a bit of a Catch-22...
Dave Bittner: It is.
Ben Yelin: ...Because that very request is going to be on the record.
Dave Bittner: Right. Exactly.
Ben Yelin: Yeah.
Dave Bittner: Exactly. So, you know, that - to me, that's the sort of conundrum that you could end up with these sorts of things, and that's an interesting point Bret made.
Ben Yelin: Yeah. And all of these laws have unintended consequences.
Dave Bittner: Yeah.
Ben Yelin: You know, usually you discover them several years down the line. This is not limited to data privacy.
Dave Bittner: Right.
Ben Yelin: All federal and state laws end up having some sort of loophole, and that's why, you know, they're amended in the future. I think one thing that's unique to data privacy is - because this is such a relatively new area - states are supposed to be our laboratories of democracy.
Dave Bittner: Right.
Ben Yelin: So California is our laboratory. We're going to look to them to see if their legislative language was clear enough for compliance and if there are those unintended consequences. And you know, those are lessons that the federal government and the other 49 states can take. But, you know, to extend the metaphor, California is a very large lab. You know, it's, like...
Dave Bittner: Right.
Ben Yelin: ...The largest lab in the science wing of your local college. So, you know, if...
Dave Bittner: (Laughter) Right, right. They're looming over everything because of their scale.
Ben Yelin: Exactly.
Dave Bittner: Yeah (laughter).
Ben Yelin: You know, so if there's, like, a massive chemical explosion in that lab...
Dave Bittner: Right.
Ben Yelin: ...You know, those chemicals are probably going to leak into the Nevada room and Oregon room and into the Arizona room.
Dave Bittner: Yeah (laughter). I think that creaking sound you hear is us stretching the metaphor beyond its breaking point (laughter).
Ben Yelin: Yes, that's actually true. We should stop before it gets worse.
Dave Bittner: All right. Well, let's leave it there. Our thanks to Bret Cohen from Tier 1 Cyber for joining us. Really interesting insights there. That is our show.
Dave Bittner: We want to thank all of you for listening, and we want to thank our sponsor, KnowBe4. Go to kb4.com/kcm and check out their innovative GRC platform. That's kb4.com/kcm. You can request a demo and see how you can get audits done at half the cost in half the time. Our thanks to the University of Maryland's Center for Health and Homeland Security for their participation. You can learn more at mdchhs.com.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.