No fruit of the poisonous tree.
Ben Wright: To tell you the truth, I can't say that the consumers have benefited a great deal from the kind of standard that we've been enforcing with respect to large organizations for the past 20 years.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's law and policy podcast. I'm Dave Bittner. And joining me is my co-host Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, I've got a story about warrants for Google keyword searches. Ben looks at a case involving mobile device passwords and Miranda rights. And later in the show, Ben speaks with Ben Wright. He's a professor at the SANS Technology Institute. They're going to be talking about setting performance standards for security and the associated liabilities.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's jump in with our stories here. Why don't you kick things off for us?
Ben Yelin: Sure. So I'd say 50% of the time, this is the Joseph Cox show...
Dave Bittner: (Laughter).
Ben Yelin: ...And the other 49%, it is the Orin Kerr show.
Dave Bittner: Yeah.
Ben Yelin: And that's where I got my story today - he's a professor of law at Berkeley. I'm a big fan of his. The love is so far unrequited. But if he wants to come on this podcast...
Dave Bittner: So he's the object of your affection - right? - your professional affection (laughter).
Ben Yelin: Exactly. And he has a permanent invite to join us on "Caveat."
Dave Bittner: Right.
Ben Yelin: But he flagged a really interesting case out of the state of Nevada. And this is about the Fifth Amendment, Miranda rights and iPhone passcodes. So just to give a little bit of background, the person who was charged with a crime here, a guy by the name of Travis Mickelson, shot at a group of Sikh men at a gas station pretty much unprovoked. It ended up being classified as a hate crime because it was alleged that he was biased against them because of their religion.
Ben Yelin: So Mr. Mickelson fled the scene. Eventually, law enforcement were able to track him down at his home. They asked Mr. Mickelson, before they gave him his Miranda rights, to unlock his phone. And he had one of those devices where you trace your finger in a certain pattern so...
Dave Bittner: Right.
Ben Yelin: ...That it unlocks the phone. And he did so. The police said that they asked him for that just so that they could put the phone on airplane mode to preserve his battery. Oldest trick in the book - don't fall for it, people.
Dave Bittner: (Laughter) Like that matters. I mean, your phone runs out of battery, it's not like it gets wiped clean.
Ben Yelin: Exactly, exactly. And you know what? It's probably better for you not to give up your passcode than, you know, it is for you to be at a police station and have your phone run out of battery - just going to throw that out there.
Dave Bittner: Right, right.
Ben Yelin: So anyway, after they got access to his phone, they were able to obtain a warrant and found some incriminating messages and used that to arrest him, prosecute him and convict him. He's now challenging his conviction, arguing that the evidence should be suppressed because he wasn't read his Miranda rights.
Ben Yelin: So a quick refresher - the Miranda rights come from the Fifth and Sixth Amendments to the United States Constitution. The Sixth Amendment grants you the right to counsel even if you can't afford one. We're not focused so much on that here. The Fifth Amendment says - and you've seen this in your favorite police procedural shows.
Dave Bittner: Dun, dun (ph)
Ben Yelin: You have the right - yeah, exactly.
Dave Bittner: (Laughter).
Ben Yelin: You have the right to remain silent. Anything you say can and will be used against you in a court of law. And what that comes from is the Fifth Amendment right against self-incrimination.
Ben Yelin: Now, that only applies to so-called testimonial evidence. So other types of evidence, including, as they say in this case, consenting to a search, aren't covered by the Miranda warning. You need not be Mirandized in the normal course of any sort of law enforcement stop if law enforcement asks and you consent to have any part of your belongings searched.
Ben Yelin: And that seems to be the reasoning here. The passcode, according to this court, is not inherently incriminating. The person never disputed that it was his device. And at least according to this court, the officer didn't ask Mr. Mickelson for his passcode in order to find incriminating information. He merely wanted him to put it on airplane mode. So what the court holds here is that no warrant was required, that this was a proper search and that it was acceptable to compel this person - or, rather, to ask this person - to unlock the phone without giving him his Miranda warnings.
Ben Yelin: So it's an interesting case. The reasoning isn't particularly detailed, and I think there are a lot of missing parts here. So I'd like to see what happens, you know, if this case is appealed to a higher court. But it's certainly one of the few cases I've seen that's discussed, you know, unlocking a smartphone in the context of Miranda rights.
Dave Bittner: Now, suppose his passcode was, I hate Sikhs. Like (laughter), that could be material evidence, right?
Ben Yelin: Yeah, absolutely. Or...
Dave Bittner: (Laughter).
Ben Yelin: ...You know, we've talked before about the home screen picture. That could have evidence on it. Some people receive text messages on their home screen. So unlocking the device could grant access to something like that, you know? And anything that's at least written on the device, you could seemingly argue, is testimonial information.
Ben Yelin: You know, I think what law enforcement says here and what the court agrees with is that this sort of request doesn't count as an interrogation. An interrogation is defined under Supreme Court precedent as any words or actions on the part of the police that the police should reasonably know are likely to elicit an incriminating response from the suspect. And I think what the court is saying here is that a cellphone passcode isn't that type of evidence. It is not inherently incriminating. It's the equivalent of letting somebody search your pockets or search your car. Just the fact of consenting to that type of search isn't consenting to something that is inherently incriminating.
Ben Yelin: Now, this to me seems like a bit of a legal fiction because all of this led to law enforcement being able to obtain a warrant to access the contents of his device. But, you know, I think what the court is saying here is everything here was done legally, so there is no fruit of the poisonous tree.
Dave Bittner: I see.
Ben Yelin: Yeah, and I think that's why they came down the way they did on this issue.
Dave Bittner: All right, yeah. That's interesting, for sure. I guess now we'll see how this plays out in other cases now that this one's been decided.
Ben Yelin: Yeah. So we've seen a lot of Fifth Amendment cases as it relates to compelled decryption. And as we've talked about, courts have come to different conclusions as to whether unlocking one's device counts as testimonial evidence.
Ben Yelin: We haven't really seen it until this case in the context of the Miranda warning itself. Law enforcement can collect testimonial evidence if they properly Mirandize somebody. That's without question. So I think the issue here is that this was all done before the person was read his Miranda rights.
Ben Yelin: I would say, you know, for practical purposes, law enforcement should probably, just to be on the safe side, read the person their Miranda rights...
Dave Bittner: Yeah. Why not?
Ben Yelin: ...Because they will always be on firmer legal footing. But it seems here that having somebody unlock their phone with a passcode isn't the type of, you know, custodial interrogation that invokes the Fifth Amendment and, thus, the Miranda rights.
Dave Bittner: All right. Well, my story this week comes from CNET. It's written by Alfred Ng. The title of the article is "Google is Giving Data to Police Based On Search Keywords, Court Docs Show."
Dave Bittner: So basically, what's happening here is that back in August, the police arrested a gentleman named Michael Williams, who is associated with R. Kelly, the famous singer.
Ben Yelin: Troubled, to say the least, yup.
Dave Bittner: (Laughter) Yes.
Dave Bittner: And the allegations are that Mr. Williams set fire to a witness' car in Florida. The investigators did their work, and they traced him to the arson. But one of the things they did was they sent a search warrant to Google, and they requested information on users who had searched the address of the residence close in time to the arson.
Dave Bittner: So this raises some interesting issues here, doesn't it, Ben?
Ben Yelin: It sure does. And Google responded by providing the IP addresses of people who searched for that victim's address.
Ben Yelin: Now, Google says that this is not a standard practice of theirs. They evaluate every government request on a case-by-case basis, trying to balance the government's law enforcement interests against invasion of privacy.
Dave Bittner: Right.
Ben Yelin: And, you know, they also said something like, these types of reverse warrants only come up 1% of the time. That is not a satisfactory answer, in my view.
Dave Bittner: (Laughter).
Ben Yelin: To me, this is - you know, we've talked about geofence warrants, where you get all of the cellphones that were in a particular area. This, to me, seems more problematic than that because this isn't just metadata - something like location, whether a certain person was pinging a certain cellphone at a particular time. This is something that might be inherently more personal, and that's a Google search. And when we're talking about content, I think that does merit additional Fourth Amendment protection. And I think a warrant should've been required in this case. And I think certainly Google, but also the court, has to answer for that.
Ben Yelin: I mean, you can see the logical extension of allowing these types of reverse warrants on search history. What about finding everybody who searched Black Lives Matter rally at City Hall on X dates...
Dave Bittner: Right.
Ben Yelin: ...And finding out who went based on that? I mean, these are the types of things that we have to be on guard against. So I think Google's explanation here, to me, was not satisfying. And, you know, I think we're going to have to look very carefully at these keyword warrants going forward.
Dave Bittner: So a warrant was - to be clear, they did get a warrant. And they presented Google with that warrant. Do we fault the judge for going along with the warrant - an overly broad warrant?
Ben Yelin: In my opinion, yes. Now, the warrant itself is sealed, so we don't have access to it, and, you know, we probably won't until this case goes further in our judicial system. So we're going on relatively limited information here.
Ben Yelin: But, you know, the Fourth Amendment has this particularity requirement where you have to describe the place to be searched and the persons or things to be seized. And this seems to me, just like geofence warrants in some cases, to violate that particularity requirement. You don't have an individual suspect. You're casting an extremely wide net that's going to encompass a lot of innocent people to potentially find relevant information. And you're going to be collecting perhaps an inordinate number of IP addresses.
Ben Yelin: Now, we don't know how many IP addresses had searched for this particular property. But you can understand how, in other contexts, that might lead to the collection of hundreds of different IP addresses. So to me, that would signal that the warrant is overbroad and doesn't meet that particularity requirement.
Ben Yelin: And the particularity requirement is in there because - not to get too historical here, but, you know, one of the reasons we have a Fourth Amendment is to protect against these general warrants...
Dave Bittner: Yeah.
Ben Yelin: ...Which is what we saw among our legal ancestors in the United Kingdom, where the king would say, go search this man's house and see what you can find.
Dave Bittner: Right (laughter).
Ben Yelin: This is obviously...
Dave Bittner: Don't come back until you've found something (laughter).
Ben Yelin: Exactly. Dear sire, yeah. Yeah, I mean, this is obviously different. But to me, it has that feel a little bit. So I can see why some of these privacy advocates and policy groups were really taken aback by this decision, and they're putting a lot of pressure on Google.
Ben Yelin: Google hates when these stories come out because they, A, want to maintain a good relationship with law enforcement...
Dave Bittner: Right.
Ben Yelin: ...And B, you know, want to assure their customers that their data is being protected. So this is something that's going to be a hit to Google's public reputation.
Ben Yelin: But, you know, I think this certainly raised the eyebrows of attorneys who will perhaps seek legal recourse if we see more of these keyword warrants in the future.
Dave Bittner: I guess I'm having trouble feeling too bad about the specificity issue here because in this case, we know an arson was committed, right?
Ben Yelin: Yes.
Dave Bittner: We know where the arson was committed. We know the approximate time the arson was committed. Given that we know all of those specific things, it seems to me reasonable to go to a judge and say, Your Honor, these are the things we know; we would like to get some more information based on these specific things. From a law enforcement point of view, I guess I'm having trouble with the notion that that would be overly broad because who else are you going to get? You're going to get, you know, oh, there was a pizza delivery guy who searched for that address. I mean, how many people are going to search for a specific address within a certain number of hours? That doesn't strike me as being overly broad. What am I missing here?
Ben Yelin: So first of all, you always have to look at - you know, it's beyond just this one case 'cause it could have precedential value. So you have to think about, if we allow these types of keyword warrants, that could include a situation where there was a broader time period, a longer time period. Maybe there were searches of multiple addresses. Maybe there were more personally intrusive searches that were being conducted.
Dave Bittner: I see.
Ben Yelin: So there's that aspect of it.
Ben Yelin: Also, I mean, without specific information about the warrant itself, we don't know what the time period was. We don't know the exact breadth of the warrant. And for a number of reasons, there could have been people who were searching this address who are completely innocent, just the way that a lot of people whose cellphones are captured through geofence warrants just happen to be in the wrong place at the wrong time.
Dave Bittner: Right, right.
Ben Yelin: And we have to protect those people's constitutional rights as well.
Dave Bittner: I see.
Ben Yelin: So I completely understand your viewpoint, and I think that's the argument that law enforcement made. They're not just saying, you know, anybody who searched house in Florida, we're going to obtain their IP address.
Dave Bittner: Right, right.
Ben Yelin: Like, it was this specific house. But to me, you still have particularity problems where you don't have a criminal suspect. And at least from the public information we have, we don't know how many people actually searched for that address. We don't know how large that universe is. So it's very possible that the search can be overbroad.
Dave Bittner: The other thing this reminds me of is the comparison to a public library and how librarians have been vigilant about protecting the things that people go to the library to search for. You know, I'm imagining - because all this information is there - if I went to my local public library and I went to the stacks and I pulled out...
Ben Yelin: What's that? We haven't been allowed at those in eight months.
Dave Bittner: (Laughter) I know. I know.
Dave Bittner: But say I go to the stacks and I pull out the local phone book that has, you know, names and addresses, and I look it up that way - I suspect most people would have an issue with someone looking over their shoulder for that. You know, what are you looking for in that card catalog? What books are you looking for in the public library? So I find it helpful to compare looking at your Google search history to having someone shadow you as you go through, you know, the public library to try to find information for yourself.
Ben Yelin: Yeah, although perhaps the better metaphor would be, let's get the fingerprints of everybody in the library on a certain day who took out this one particular book. Now, you know, in reality, how many people are taking out a phone book in a library these days?
Dave Bittner: Yeah.
Ben Yelin: But at least you can imagine that that could capture a lot of innocent people.
Dave Bittner: Right, right, right. You could see, like, oh, Bob, why were you taking out a book on the symptoms of venereal diseases, you know? Like, you know what I mean? Like, there are things...
Ben Yelin: Yeah.
Dave Bittner: ...That an innocent person could get caught up in, as you say, you know, that have nothing to do with the potential bad person that the police or law enforcement is looking for.
Ben Yelin: Yeah. You know, I just think that these types of reverse warrants - where you're not searching a particular person or the possessions of a particular person, but you are searching a universe of people to try to get evidence about a crime - geofencing and these types of warrants introduce that problem.
Ben Yelin: And I don't know that there's an easy solution to it. I mean, obviously, this is a very effective law enforcement tool. It worked in this case.
Dave Bittner: Right.
Ben Yelin: But, you know, perhaps our state legislatures and the federal government, as they have done to a certain extent with geofence searches, are going to have to come up with some sort of justiciable standard here.
Dave Bittner: Yeah. All right, well, interesting story for sure. It is time to move on to our Listener on the Line.
(SOUNDBITE OF PHONE DIALING)
Dave Bittner: Ben, we got a message. This actually came to us over at LinkedIn from one of our listeners, a gentleman named Jason (ph). And a short question here, but an interesting one. He says, hi, guys. Would a backyard count as in the public for Fourth Amendment rights?
Ben Yelin: So, Jason, it is a great question. Generally, when we're talking about broader searches and seizures, anything that's within the curtilage of a home counts as a house for the purposes of the Fourth Amendment.
Ben Yelin: But there's actually a specific Supreme Court case from the 1980s that really cuts against that idea. There was a case called California v. Ciraolo. And in that case, this individual was growing marijuana in their backyard. And the police, at a relatively low altitude, flew a surveillance plane and took real-time photos of this person's backyard. They caught the person growing marijuana, and he was arrested and convicted.
Ben Yelin: And what the Supreme Court said is you don't have a reasonable expectation of privacy in your backyard from people who are up in the sky. Basically, if you...
Dave Bittner: (Laughter).
Ben Yelin: ...You know, take a vertical line from your backyard going up into space...
Dave Bittner: Right.
Ben Yelin: ...There's a diminished expectation of privacy because, you know, at least according to their rationalization, this is public airspace - anybody could be flying a plane up there. That reasoning would be specious even now, and it's further complicated by the fact that people have drones and all sorts of things.
Dave Bittner: Yeah.
Ben Yelin: But it was especially questionable, to say the least, when it was decided. And it was a narrow majority - a 5-4 decision - that said people don't have a reasonable expectation of privacy in their backyard as it pertains to aerial surveillance. But that is still precedent.
Ben Yelin: So, you know, if you are growing drugs in your backyard, you have to do a better job of concealing it from...
Dave Bittner: OK. So when I saw this question, my line of thinking as I was exploring it in my own mind was, you know, it'd be one thing if my backyard backed up to a highway, right? So I'm there, you know - I don't know - playing football with my friends, and people are driving by in their cars, and everybody can see us, you know? So I would suspect that there's no reasonable expectation of privacy there. I know my backyard backs up to a highway. Lots of people can see me, and that's the reality of it.
Dave Bittner: But if I built a 10-foot fence around my backyard, which still faces the highway, wouldn't I have therefore increased my expectation of privacy? So I guess I was thinking, if somebody came up, you know, on the shoulder of the road and got up on a ladder so they could look over my 10-foot fence, does that change things at all?
Ben Yelin: It does, I mean, because so much of this is measured by your subjective expectation of privacy. That's the...
Dave Bittner: Right.
Ben Yelin: ...First prong of the test.
Dave Bittner: OK.
Ben Yelin: So the more you do to conceal your property, the greater the privacy interest is. And we go back to the seminal case on this, which is Katz v. United States. I don't know if you know about this, Dave. There used to be a thing called phone booths, where...
Dave Bittner: (Laughter) Isn't that the thing Superman used?
Ben Yelin: Yeah, something like that.
Dave Bittner: (Laughter).
Ben Yelin: You'd go in, and you'd close the door behind you. And that, you know, was an act to conceal the audio of your conversation.
Dave Bittner: Yeah.
Ben Yelin: So, you know, I think the mistake that the defendant made in this case was not doing enough to conceal their drugs in their backyard. You know, the reason that I think this ruling was particularly controversial and why I personally have problems with it is, you know, this individual could have done everything they could to conceal what they were growing in their backyard. They could have built that 10-foot fence. But that doesn't protect them from aerial surveillance.
Ben Yelin: And I just think it's a total legal fiction that, you know, the public has access to this airspace. While that's technically true, very few people have airplanes. And it's not fair to expect this individual to anticipate that somebody is going to be taking pictures of their backyard from a low altitude. So...
Dave Bittner: Interesting.
Ben Yelin: But that's just my personal opinion on this case. The Supreme Court disagreed with me. So what else is new?
Dave Bittner: Ah, well, what do they know, right?
Ben Yelin: Yeah, exactly.
(LAUGHTER)
Dave Bittner: All right. Well, our thanks to Jason for sending that in. We would love to hear from you. Our call-in number is 410-618-3720. You can also send us an email to caveat@thecyberwire.com. Or you could do what Jason did, which is reach out to us on LinkedIn. Ben and I are both there as well.
Dave Bittner: All right. So, Ben, you recently had the pleasure of speaking with Ben Wright. He is a professor at the SANS Technology Institute. And you guys discussed setting performance standards for security and associated liabilities - interesting conversation. Here's Ben and Professor Ben Wright.
Ben Yelin: You lay out a proposal for a new legal standard of performance for cybersecurity by any large enterprise in the United States. I'm wondering if you can run us through what the current standard is for people who aren't familiar with it. What do most courts look at when trying to evaluate those claims?
Ben Wright: Authorities have stated a number of different kinds of standards for cybersecurity over the years. But the most common standard is reasonable security, such that an organization is expected to take reasonable measures to protect the data and networks it's responsible for and to make sure that security compromises do not happen.
Ben Wright: I believe that reasonable security is not necessarily the best way to define what large, qualified organizations are doing in the United States today to deal with cybersecurity. And so I propose a different standard, and that would be professional teamwork.
Ben Yelin: So could you explain a little bit what that would mean? What does a professional teamwork standard look like in practice?
Ben Wright: I analogize it to what a hospital does. A hospital is a large organization that has a team of professionals working on problems. For instance, patients show up with COVID-19, and our society has long said that there's not an expectation that a hospital can guarantee that the patient will not die or will not suffer as a result of having that disease and bringing that disease into the hospital. Our society has long said that what the hospital is expected to do, at a very general, high level, is provide a professional teamwork level of care to deal with all the problems that patients have. And our society has long recognized that if a hospital is legally liable every time a patient dies, then hospitals will go out of business.
Ben Wright: And I view cybersecurity at large organizations today as looking a lot like a team of experts, a team of professionals dealing with very difficult problems where the outcome can be negative. There can be a breach. There can be a compromise of security. There can be a ransomware compromise, for example.
Ben Wright: And our society often seems to have this expectation that large organizations are going to be perfect, even though society recognizes hospitals can't be perfect. I argue that large organizations - insurance companies, banks and credit bureaus like Equifax - can't be expected to be perfect. Instead, they should be expected to be throwing professional resources at the problem, fighting the good fight, but with an explicit recognition that we're going to lose sometimes.
Ben Yelin: Let me ask you this question this way. I guess the easier question would be what a large organization would have to do to comply with your standard. I think what might be more interesting is what kind of behavior, if you were in a deposition and you heard that an organization was engaging in some type of behavior vis-a-vis cybersecurity, would lead you to believe that they should be liable? What is some of that behavior that would introduce more risk, according to your standard?
Ben Wright: Ignorance. Well, ignorance is not quite the right word. Failure to show up. Failure to seek the right people with the right resources to address the level of the problem as it evolves. So ignoring a problem is a good example of something that should lead to liability.
Ben Wright: So that's sort of like a patient arrives in the ambulance, and the patient has COVID-19 or some other kind of problem, and the hospital just doesn't send anybody out there to the ambulance to bring the patient in. The hospital doesn't have nurses, doctors and so on to examine the patient. The hospital just goes on break and...
Ben Yelin: (Laughter) Right.
Ben Wright: ...People just go home, and they don't do anything.
Ben Wright: Or another example would be something malicious where they just maliciously say, we want that person to die, so we're just not going to treat them, or we're going to, you know, inject them with some kind of poison.
Ben Yelin: I guess what I'm getting at - and forgive us 'cause we're both lawyers, but lawyers are always looking for, you know, where the line is. So where is the line between, you know, being malicious - you know, with your hospital metaphor, not sending a nurse out to get somebody coming in on an ambulance or not doing anything, in other words - versus showing the proper level of professional responsibility? Or is there a line? Is it something that would be more adjudicated on a case-by-case basis?
Ben Wright: Well, it's very much adjudicated on a case-by-case basis. And reasonable care, which is the more common standard, is not all that quantitatively different...
Ben Yelin: Sure, yeah.
Ben Wright: ...From professional teamwork. But I argue that there is a qualitative difference between reasonable care and professional attention, especially where professional attention or professional teamwork is defined explicitly with the recognition that perfection will not be expected.
Ben Yelin: Right.
Ben Wright: And so our society has long explicitly recognized that perfection out of the medical community is not expected. On the other hand, in practice, what I so often see with respect to breaches at Equifax, Target stores and other large organizations is an expectation of perfection. You're a large organization. You were not perfect. You must be punished. And we don't hear that type of talk with respect to hospitals and doctors.
Ben Wright: And thus I'm suggesting this change of standard. I'm making more of a suggestion that relates to the emotions, the political feelings, around what is expected of our organizations.
Ben Yelin: So to play a little devil's advocate, what would you say to a consumer who, you know, suffered from the Equifax breach, from the Target breach, who was concerned that this standard did not hold large corporations properly accountable? In other words, with your standard, what's in it for the average consumer?
Ben Wright: I would say that for the average consumer, it's rare that they get very much...
Ben Yelin: That's true, yes.
Ben Wright: ...Out of these fines, this political posturing by regulators and politicians, these class-action lawsuits. So I see these settlements...
Ben Yelin: Right.
Ben Wright: ...From Equifax - these class-action settlements, class-action settlements with Yahoo and so on. And as a lawyer...
Ben Yelin: Everybody gets a dollar, yeah.
Ben Wright: Yeah. I dig and I dig and I dig, and I read and I read. And I say, am I going to get anything out of this? Oh, well, yeah, actually, you, the consumer, you're going to have to go compile a whole bunch of information and file a whole bunch of documents, and you don't know what the outcome of all this is. And maybe if you're lucky, you're going to get $10 or something like that. And so to tell you the truth, I can't say that the consumers have benefited a great deal from the kind of standard that we've been enforcing with respect to large organizations for the past 20 years.
Ben Wright: Furthermore, I believe that the consumers have been somewhat misled by what they sense out of the legal and political community. The consumer has been led to believe that you're OK, you're secure, so long as you don't get a notice in the mail that says you've been breached.
Ben Yelin: Right, right.
Ben Wright: And that's wrong.
Ben Yelin: Right.
Ben Wright: You as a consumer should fully be aware that your data has been compromised a zillion different ways, and you should be very alert to all kinds of problems like identity theft completely regardless of whether you got a notice from a class-action settlement, regardless of whether you've got a payment related to a class-action settlement, regardless of whether there was a fine imposed against the company, regardless of whether the company was berated in the media by politicians and so on.
Ben Wright: So I actually believe and argue that if our society, at more of an emotional level, explicitly acknowledges that what's expected of large organizations is something equivalent to what a hospital does, then consumers start thinking about big organizations like Equifax and Target stores differently from the way they're led to think through all the political posturing and class-action lawsuits and regulatory fines that we see today.
Ben Yelin: So from a practical policy perspective, let's say one of our listeners believes strongly in your legal theory and thinks that we should move away from a pure reasonableness framework. What sort of step should your average consumer take as it relates to talking to a state legislator or federal legislator? Like, what would a practical step be for somebody who believes in this?
Ben Wright: I'm not sure your average consumer has a lot of impact on these kinds of things. I honestly think that this is a very esoteric kind of topic, and it's not the type of thing that a consumer, a taxpayer, a voter is likely to get very much traction on by talking to their congressman. You know, write a letter to your congressman. OK. Right. I don't know. I'm just being very honest. I just don't think that the average consumer is going to have very much of an impact on this one way or another. Write a tweet. OK. Send out a tweet. That doesn't change the importance of the topic. But I don't think this is an average consumer-voter topic.
Ben Yelin: Right. Goes straight into the shredder when it gets to that congressional office, which I know we've all seen. So I guess in terms of your next step, how do you change the sphere of influence around this? And how do you make this into something beyond just a very interesting academic theory and something that, you know, you'd really see into practice?
Ben Wright: I teach a class at the SANS Institute where many qualified cybersecurity professionals at large organizations from all over the world come to be trained on how to manage legal issues in cybersecurity. Many of these people are very influential in writing policies for their organizations, in explaining to boards of directors and higher management, what are we doing? What standards do we set for ourselves? How do we measure what we do? They're also very influential in writing contracts with their corporate customers and their corporate suppliers and vendors and so on.
Ben Wright: And I teach these people that very commonly a better standard to be stating within the enterprise is a standard of professional teamwork. And when you tell the board of directors our standard of performance is not reasonable care, our standard of performance that we set for ourselves is professional teamwork, that changes the conversation in the board of directors.
Ben Wright: So this reasonable care standard often feels more like a technical thing. When you read regulators defining reasonable care, they often speak in terms of taking measures, like installing a firewall and having multifactor authentication. I believe that in large, well-managed organizations, the best topic is not, what technology did I buy or what technology did I install?
Ben Wright: The best topic is what people have I hired? What qualifications do we have for those people? What training and ongoing professional advancement do we have for these people? What kind of a professional environment is created for these people so that they can work as a team like a group of professionals in a hospital dealing with a pandemic, dealing with very difficult problems where it's acknowledged some of these patients are going to die?
Ben Yelin: Right.
Ben Wright: But what do we do in order to do the best job? Oh, well, we need to go buy this certain kind of ventilator. Yeah, ventilators are important, but that's not necessarily the essence of what a hospital really needs to be doing in order to provide the best kind of care in a very difficult epidemic.
Ben Yelin: So it seems like so much of this is about a cultural change at a micro level. It's just getting organizations to think differently about cybersecurity practices within their own organization.
Ben Wright: That's where I can be more influential. I'm not very influential in persuading state legislatures. Legislatures, they write all kinds of laws. I try to read them, but I'm not influential on them. I'm not influential on the Federal Trade Commission and the way that they interpret their laws. But I can be influential on cybersecurity professionals who come take my class, and I can be influential on a few of my clients who come to me for advice.
Ben Wright: I'll give a good example of how I've witnessed professional teamwork emerge recently.
Ben Yelin: Sure. Sure.
Ben Wright: Six months ago, large organizations had cybersecurity experts. They had a whole bunch of policies. The policy said, we're going to use this kind of computer. We're going to use this kind of network. We're going to have this kind of logon. We're going to have this kind of monitoring of what's going on.
Ben Wright: And then overnight, employees were sent home - thousands of employees sent home from large organizations. And overnight, we have these thousands of employees doing sensitive work, trying to use their home computer, trying to use their home router at the same time that the child is trying to go to online school.
Ben Yelin: Yeah, yeah.
Ben Wright: And while all this happened, the organization had a policy about how we manage people working and so on. The policy was worthless. It was out of date, and they did not know how to write a policy for work from home.
Ben Wright: So question - did all the cybersecurity teams say, well, I don't know what to do; I - you know, we just don't have a policy on this, and it's going to take us two or three weeks to write a new policy? Is that what the cybersecurity team did? No, it's not. They did the same kinds of things that the professionals at hospitals started doing when COVID patients started showing up.
Ben Wright: They didn't know exactly what to do. They didn't have a perfect handbook or a textbook out there to tell them exactly how to deal with COVID patients. But they didn't throw up their hands and say, I don't know what to do.
Ben Wright: What they did as professionals is they tried to understand the patient as best they could. They rapidly learned everything they could from all over the world about how COVID patients were being treated in Italy and Germany and China and so on. And they used their best professional instincts to try to guess, we'll try this. We'll try this drug. We'll try this a little later. We'll do this kind of thing, this other kind of thing.
Ben Wright: I believe that the cybersecurity teams at large organizations, when the pandemic hit, behaved in a way that was very analogous to what doctors and nurses were doing at the same time as COVID patients were coming into hospitals.
Ben Yelin: And it seems that's what we'd want to encourage - that type of agility in the face of adversity, the ability to solve problems and not just, you know, check boxes that you're complying with FTC regulations or something like that.
Ben Wright: I do believe that cybersecurity has changed at large organizations in the almost 20 years that I've been teaching this class and working in this field. And I believe that we can come to look at cybersecurity work at large organizations through much more of a lens of professionalism, rather than the traditional lens of viewing cybersecurity as a vocational project that involves installing technology and just checking boxes that I use this technology and that technology.
Dave Bittner: All right. Boy, I enjoyed this conversation, Ben.
Ben Yelin: The battle of the Bens, yeah.
Dave Bittner: I know, right? Ben versus Ben. Let me just start off by saying there's nothing more interesting than when two lawyers go at it in conversation, right (laughter)?
Ben Yelin: If by interesting you mean painfully boring, but, you know, I hope we've gone against that reputation here.
Dave Bittner: No, I loved it. I loved it, not least because I am on board with Professor Wright here about the notion that using public health as an analogy for cybersecurity is useful.
Dave Bittner: I don't know if you remember. I had a conversation with Richard Clarke, you know, well-known for his post-9/11 White House - Richard Clarke, you know...
Ben Yelin: Yeah.
Dave Bittner: ...Security gentleman. And I posed the question to him of whether public health was a good comparison, and he said no (laughter).
Ben Yelin: Interesting.
Dave Bittner: He flat-out disagreed with me. But that said, I am on board with Ben Wright on this - I think it's very useful. So I found this conversation really interesting from that side of things.
Ben Yelin: Yeah. I mean, my perspective, which I think came out in the interview, and it was not meant to be confrontational, is just about line-drawing. Where - how are we going to know that practitioners have violated the standard of care that Professor Wright proposes? And with that lack of clarity, you know, it's going to be really hard to litigate these types of cases.
Ben Yelin: I think the public health metaphor was a good metaphor. And it's certainly true that you take a holistic approach in the hospital. Everybody's trying their best. As long as you are working as a team, you know, I think the legal system will look upon you favorably.
Ben Yelin: But on the other hand, there are a lot of medical malpractice suits, and you are measured against what a similarly situated doctor or nurse or EMT would've done in that situation. You'd have to come up with a really compelling reason, and perhaps Professor Wright did, as to why that same standard should not apply when we're talking about cybersecurity.
Dave Bittner: Yeah. I mean, what I think about often is that, again, with public health, you can do everything right. You can wash your hands. You can clean surfaces. You can, you know, cough away from people. But every now and then, you're still going to get a cold, right? You know, and I feel like that's a useful thing to keep in mind when it comes to cybersecurity - that, yes, you want to try to protect yourself as well as possible. But I think you have to be careful to not let the perfect be the enemy of the good.
Ben Yelin: I think so. And, you know, I am enticed by the idea that we should be less punitive - that's my general philosophy in life - and that we should be geared toward, you know, making sure that every organization is making a good-faith effort to protect their network.
Dave Bittner: Yeah.
Ben Yelin: So that part of it was very compelling to me.
Dave Bittner: Yeah. All right, well, again, our thanks to Ben Wright from the SANS Technology Institute for joining us. Great interview, Ben. Thanks so much to both of you for that.
Dave Bittner: That is our show. We want to thank all of you for listening.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producers are Kelsea Bond and Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.