Caveat | Ep 99 | 10.21.21

The Merman Borgnine Conjecture aka cybersecurity ethics.


Robert Carolina: Are we on the right road? Yes. Are we very far down that road? I'm not convinced that we are.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner. And joining me is my co-host, Ben Yelin from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: On today's show, Ben shares the story of the Missouri governor who's threatening legal action against a reporter who found a flaw in the state's website. I've got the story of a facial recognition conference in Washington, D.C. And later in the show, we're joined once again by Robert Carolina. He's a lawyer living in the U.K., and our conversation focuses on ethics in cybersecurity. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got a lot to cover this week. Why don't you start things off for us here? 

Ben Yelin: My story comes from the Missouri Independent. You know I read that every single morning. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: The story is about the governor of Missouri, who's threatening legal action against a reporter for the St. Louis Post-Dispatch. 

Dave Bittner: Yes. 

Ben Yelin: So what happened - and I know you've talked about this on the CyberWire podcast - a reporter with that newspaper alerted the state that Social Security numbers of state employees, including teachers and administrators, were vulnerable to public exposure because of a flaw on the website of the Missouri Department of Education. 

Dave Bittner: Right. 

Ben Yelin: Basically, you couldn't clearly see people's Social Security numbers, but it was visible in the HTML source code. 

Dave Bittner: Yeah. And this is a website where you could go and check the - basically the resume of teachers, check their credentials. 

Ben Yelin: Right. So checking their resumes and credentials - great. That's certainly a valuable public service. Revealing their Social Security numbers - not so great. 

Dave Bittner: Right. 

Ben Yelin: That's a major violation of the public trust. 

Dave Bittner: Right. 

Ben Yelin: So the reporter for the St. Louis Post-Dispatch was extremely responsible about this and notified the relevant state agency of this security flaw. 

Dave Bittner: Before he reported on it. 

Ben Yelin: Before he reported on it. 

Dave Bittner: Yeah. 

Ben Yelin: Which, you know, is exactly what the book tells you to do in these types of situations. And the newspaper did not publish the story until the department was able to fix the problem. Of course, after they did fix the problem, they did publish the story. 

Dave Bittner: Right. 

Ben Yelin: And that raised the ire of the governor of Missouri, Governor Mike Parson. He is threatening legal action against this reporter, calling the reporter a hacker and referring the case to the Cole County prosecutor, which is the relevant county where this took place, and also asking the Missouri State Highway Patrol to investigate, saying basically that this administration stands against hackers, people who try and steal personal information. 

Ben Yelin: I don't think it would be too hard for our listeners to agree that this was a wrongheaded move by the governor because the reporter in this situation was extremely responsible. He did not publish the article until the vulnerability had been patched, and he did not publish the article before notifying the relevant state agency. 

Dave Bittner: Right. 

Ben Yelin: The Missouri governor is arguing that this qualifies as hacking. This is unauthorized access into the state system. I think that argument is - how do you say bogus? 

Dave Bittner: (Laughter) I figured you were going to pull out one of your fancy legal words like specious or - you know (laughter)? 

Ben Yelin: I could do that, but I think bogus - I was going to use, you know, maybe a derivation of a four-letter word. 

Dave Bittner: I see. OK. So bogus is good (laughter). 

Ben Yelin: But, yeah, we want to keep our - the G rating on this podcast. 

Dave Bittner: Right. Right. 

Ben Yelin: And the reason it's bogus is largely because of the decision in Van Buren v. United States. Now, he would be prosecuted under a Missouri anti-hacking statute, but that statute's very similar to the Computer Fraud and Abuse Act. 

Dave Bittner: Yeah. 

Ben Yelin: So my guess is the jurisprudence on that statute would be the same. 

Dave Bittner: OK. 

Ben Yelin: And in that statute - in that case, rather - in the Van Buren case, as we've talked about, the Supreme Court said you can only be prosecuted if you are in an area for which you are not authorized to be, so an area that is closed off to the public. 

Dave Bittner: OK. 

Ben Yelin: It's this gate up-gate down approach. 

Dave Bittner: OK. 

Ben Yelin: If you breach that gate, that's hacking... 

Dave Bittner: OK. 

Ben Yelin: ...And you can be prosecuted. Here, everything that this reporter discovered was public. It was all on this website. It was HTML code. It's something that was clearly put on the website in error by the relevant Missouri Department of Education. 

Dave Bittner: Right. 

Ben Yelin: But it was up on the website, so it was public. 

Dave Bittner: And to be clear here, I mean, as many folks in infosec have pointed out, basically all you have to do to see this is hit F12 on your keyboard, which is view source. 

Ben Yelin: Right. 

Dave Bittner: And that's where the Social Security numbers were revealed. So it wasn't as if you went to the website and the Social Security number was there for anyone to look at. But it's routine to be able to look at a website's source code. This is not generally hidden. This is - it's there to be seen. 
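The distinction Dave is drawing - data that never appears on screen but sits in plain view in the page source - can be sketched in a few lines of Python. The markup below is purely hypothetical (it is not the actual Missouri Department of Education page), but it shows how a value tucked into a comment or a hidden field is invisible to the rendered page while remaining fully public to anyone who views the source:

```python
import re
from html.parser import HTMLParser

# Hypothetical page, for illustration only. The SSN never renders in the
# browser window, but it is still sitting in the raw HTML anyone can view.
PAGE = """
<html><body>
  <h1>Educator Credentials</h1>
  <p>Name: Jane Doe</p>
  <p>Certificate: Valid</p>
  <!-- debug: ssn=123-45-6789 -->
  <input type="hidden" name="ssn" value="123-45-6789">
</body></html>
"""

class VisibleText(HTMLParser):
    """Collects only the text nodes a browser would actually display.

    Comments and tag attributes never reach handle_data(), which is
    exactly why the SSN below is invisible on screen.
    """
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

parser = VisibleText()
parser.feed(PAGE)
rendered = " ".join(parser.chunks)

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
print(SSN.search(rendered))   # nothing sensitive in the displayed text
print(SSN.findall(PAGE))      # the raw source exposes it anyway
```

Viewing a page's source (via the developer tools or view-source) shows exactly the raw `PAGE` string above - no break-in required, which is the point both hosts are making.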

Ben Yelin: Right. If we're going to extend the gate up-gate down metaphor, the gate is up, and whatever you're trying to view through the gate is clearly visible. All you have to do is, you know, extend your eyelids a little bit. 

Dave Bittner: Right. OK. 

Ben Yelin: I completely ran that metaphor into the ground. 

Dave Bittner: (Laughter). 

Ben Yelin: But, yeah, I mean, everybody has an F12 key. 

Dave Bittner: Right. 

Ben Yelin: So they can view the source code. This was not an unauthorized access into a state system. The Post-Dispatch, the relevant newspaper, of course, is standing by its reporter. They posted a statement saying the reporter did the responsible thing by reporting his findings to the Department of Education and that this person's not a hacker. A hacker is somebody who breaks into a system with malicious or criminal intent. 

Dave Bittner: Yeah. 

Ben Yelin: And here, there was no such break-in, no breach of security, no malicious intent. I think we have all the evidence we need that there was no malicious intent because he brought it to the attention of the agency. 

Dave Bittner: Yeah. Let me just stop you here for a quick aside because I know we'll get letters. 

Ben Yelin: We get letters. 

Dave Bittner: There are multiple - of course, of course, of course, Ben, there are multiple definitions, often contradictory definitions, of the word hacker. 

Ben Yelin: Yes. 

Dave Bittner: Everything from someone who is bad at playing golf. But in computers, hacker generally can mean someone who is very skilled with computers, someone who's very clever and is capable of doing things with computers that mere mortals are not. That is often the industry definition, the definition shared by professionals. And then there is the also accepted definition of someone who gains access to systems often but not always without permission. And that is the definition the governor was using when he used the term hacker here. 

Dave Bittner: I only go down this pathway - and you can tell I'm a little frustrated to even have to do it - because lots of people are pedantic about it in the computer world. And as I said, if I don't mention that, we'll get letters. 

Ben Yelin: Right. You're absolutely right. 

Dave Bittner: (Laughter). 

Ben Yelin: And I think even under that second definition, this is not hacking because it wasn't unauthorized access. 

Dave Bittner: Right. 

Ben Yelin: It was viewing something that was public. 

Dave Bittner: Right. 

Ben Yelin: Anybody can view it. They weren't - the department was not able to effectively conceal this information from anybody who has an F12 key. 

Dave Bittner: Yeah. 

Ben Yelin: So it doesn't qualify under that definition of hacking. 

Dave Bittner: Yeah. I tell you one thing that I'm still scratching my head about is why did the - first, how did this get raised to the level of the governor's attention to the point where the governor of the state of Missouri felt he needed to hold a press conference about this? At no point did, you know, cooler heads prevail and someone say, listen, Gov; you probably - this is not a good idea. You probably don't want to do this, right? 

Ben Yelin: I don't get it. 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: One of the reasons I don't get it is, you know - nothing against the St. Louis Post-Dispatch, but how many people would have actually read that article? And how public a story would that actually have been? This becomes a local story that generates very little interest if it's simply published in the St. Louis Post-Dispatch and the governor doesn't come out and recommend criminal referral. 

Dave Bittner: Yeah. Yeah. Yeah. The statement practically writes itself - officials at the school board have thanked the Post-Dispatch for their responsible disclosure, and we don't feel that any information has been leaked. Everybody wins. 

Ben Yelin: Yeah. I'd skip over that article and go right to the, you know, sports section... 

Dave Bittner: (Laughter). 

Ben Yelin: ...And see how the St. Louis Blues are doing. 

Dave Bittner: Right. Right. 

Ben Yelin: Because it's just not - it's not really interesting information. 

Dave Bittner: Yeah. 

Ben Yelin: This is kind of the perfect example of the Streisand effect, where you're drawing - in an effort to conceal something, you're drawing attention to it. 

Dave Bittner: Yeah. 

Ben Yelin: And I think that's exactly what happened with the governor here. I mean, I think, you know, to be fair to the governor, he wants to disincentivize citizens and journalists within the state of Missouri from gaining unauthorized access to state systems. 

Dave Bittner: Yeah. 

Ben Yelin: But that's just not what he's doing. 

Dave Bittner: No. And he's doubled down. He has not backed off. He has doubled down. I think, you know - I think the opportunity here is for the prosecutors to let the governor say what the governor wants to say, let some time pass and then choose not to prosecute, right? 

Ben Yelin: I think that's what's going to happen. 

Dave Bittner: (Laughter). 

Ben Yelin: I think we're going to get - you know, several weeks or months down the line, this county prosecutor will say, we don't have enough evidence to, you know, initiate legal proceedings. 

Dave Bittner: Yeah. Yeah. 

Ben Yelin: And we'll all forget about this. But it certainly was a curious move by the governor to be so public about this, especially when the facts here, to my mind, are relatively obvious and one-sided. 

Dave Bittner: Yeah. I don't know if this governor is, you know, termed out or whatever. But, boy, if I were running against him, I know where I'd start my campaign ads, right (laughter)? 

Ben Yelin: Fun fact - he is a governor - he was the lieutenant governor, and he took over because the previous governor had to resign as part of a sex scandal. So... 

Dave Bittner: Oh. Oh, my. 

Ben Yelin: Yeah. When you fall into that position, you know, I would play it super conservative. 

Dave Bittner: (Laughter). 

Ben Yelin: Just enjoy the fact that you've become governor, and don't try and rattle cages too much. 

Dave Bittner: Yeah. Yeah. All right. Well, yeah. I suspect this one will just fade away. But I think it's a good reminder because, you know, here's the other aspect of this: it's such an easy target. It's so easy for us to, you know, punch up at some of these politicians for not knowing about technical issues. 

Ben Yelin: Right. 

Dave Bittner: Right? They - and this isn't helpful. 

Ben Yelin: Right. 

Dave Bittner: This makes it so much easier to do that, to say, see. Look. Look. See, here's another one. 

Ben Yelin: You'd think they would try and at least have an adviser who could tell them, look; this is not the type of issue for which you should hold a press conference. 

Dave Bittner: Right. 

Ben Yelin: You're in the wrong here. And this is - this action was not criminal. We should not be referring it to county prosecutors. Just let it go. 

Dave Bittner: Yeah. 

Ben Yelin: And seems like such an adviser was not present in the office this week. 

Dave Bittner: (Laughter) That's right. That's right. All right. Well, we will have a link to that story in our show notes, of course. 

Dave Bittner: My story this week comes from Business Insider, and it's written by Caroline Haskins. And it's titled "I Attended a Top Surveillance Conference in Washington, a Bizarre Experience in which Industry Insiders Lamented Being Under Attack." Now, Ben, of course, facial recognition is a common topic for you and I here on this show. So there was a surveillance technology event. It's called connect:ID, and it was in Washington, D.C. 

Ben Yelin: First of all, where was our invitation to this conference? 

Dave Bittner: I know, right? 

Ben Yelin: If you would've just, you know, invited us and given us a nice breakfast or lunch, we wouldn't have to comment on this article. But... 

Dave Bittner: (Laughter) Oh, right. OK. So what you're saying is we could totally be bought off by just a delicious meal. 

Ben Yelin: Yeah. I mean, the meal would have to be really good. 

Dave Bittner: (Laughter) OK. Sure. Sure. Some good swag. 

Ben Yelin: Yeah. 

Dave Bittner: Yeah. Yeah. Maybe a T-shirt - I attended the surveillance conference and all I got was this lousy T-shirt. 

Ben Yelin: I got was this lousy T-shirt - yeah, exactly. 

Dave Bittner: And my face scanned for... 

Ben Yelin: Yeah. 

Dave Bittner: ...Forever time. It was funny. I was thinking about this conference. Like, as everybody walks in, does it say, welcome, Bob Smith; welcome, Jane Doe (laughter)? 

Ben Yelin: Yeah, exactly. Exactly. Yeah, we know exactly who you are. 

Dave Bittner: Right. 

Ben Yelin: No need for name tags here. Yeah. 

Dave Bittner: Right. Right. Exactly. Right. Nobody has to wear a name tag. 

Dave Bittner: But the reporter here, Caroline Haskins, describes the event - and I think it's very interesting - from the point of view of how the folks who sell facial identification, the folks who are in this business, perceive themselves and how they're reacting to other organizations' perceptions of them. 

Dave Bittner: There's a quote here from John Mears, who is the chairman of the International Biometrics and Identity Association. He says facial recognition is under attack, and he encourages attendees to educate the public and lawmakers about misconceptions. He said - he listed several myths, such as face technology is all the same, face recognition use cases are similar and should all be banned and - wait for it, Ben - facial recognition tech is biased and disadvantages people of color. 

Ben Yelin: Yikes. 

Dave Bittner: (Laughter) I know. So, you know, what do you make of this? I guess if you have a group of people who are like-minded - and of course they're in the business of promoting this; their paychecks depend on the acceptance of this technology - it's not surprising what angle they're going to come at this from. But it's interesting to me that they label it misinformation rather than something we need to address and perhaps do better at or fix. 

Ben Yelin: Right. I mean, I understand it from a human perspective. You know, everybody's got to eat. They're trying to sell these systems. 

Dave Bittner: Right. 

Ben Yelin: And they're frustrated that people are very distrustful of facial recognition systems. We've seen that reflected in public opinion polling. You know, what I would have done, armed with that information, is try to either improve the algorithmic formulas to address some of these problems, to make sure that these systems are not racially biased, or, you know, have a more effective public relations campaign to try and educate the public as to the benefits of facial recognition systems. 

Ben Yelin: You know, I think we have to keep in mind that this event is very insular. It's made up of people who are on the inside, people who work for these companies and people who work for agencies who employ this technology. And so I think they are kind of cut off from the layman's view about this technology. And I think it's - because they work in it every day, I think it's almost humorous to them that normal people see this as, quote, "dystopian sci-fi." 

Ben Yelin: There was apparently a slide presented in one of the presentations where somebody showed images from "The Terminator," "RoboCop," "Blade Runner," "Minority Report," saying, you know, everybody thinks that we - our technology belongs in one of those movies. It's now taken on that cultural significance. But the real world isn't science fiction, and that's just not how these systems work. 

Ben Yelin: So yeah, my take on it is I do think you're in a very insular community. I think once you're in that community and you're constantly talking to other people who are part of the industry, you're talking to vendors or you're talking to local law enforcement offices, Customs and Border Protection, where they're in this world, they're either selling or buying these systems, you're cut off from the public, the common perception as to what these systems are. 

Ben Yelin: But the thing is, the public perception matters because ultimately we're going to - you know, facial recognition evidence is going to be used in legal proceedings, and there's going to be a jury of one's peers. And that jury is not going to be made up of representatives from Clearview AI. Let's put it that way. So ultimately, you do need to build trust with the public. 

Ben Yelin: And, you know, I think they can let off some steam at a conference, but ultimately it's incumbent upon the industry to change these practices. If you are going to employ facial recognition, you have to address some of these racial disparities and not try to sweep that under the rug. 

Dave Bittner: Yeah, they talk about a document that came from Clearview AI, who, of course, has been in the news for their facial recognition systems. And it's a document they released back in August that was titled Facial Recognition Myths. And Clearview said facial recognition technology is not surveillance because it happens after the fact, not in real time. 

Ben Yelin: Man, talk about shifting the goalposts there. 

Dave Bittner: Yeah. And I don't - I mean, to me, that - I don't even think that's true. 

Ben Yelin: No, it's really not. There's nothing about - I mean, surveillance is surveillance, whether it's forward-looking or, you know, it employs data collection from the past. That's true in a practical sense and true in a legal sense. I mean, going back and looking at somebody's cell site location information - that's looking at the past, but that's also very clearly surveillance. 

Dave Bittner: Yeah. 

Ben Yelin: Yeah. 

Dave Bittner: But I also think of, like, the use case - let's say security in a retail store, right? I mean, that's going to be real time. I mean, what's the point of these systems - if I'm a retailer and I want to know, is this person walking into my store trouble, is this someone who has been caught stealing from my store before or from the store down the street or, you know, who knows where, right? 

Ben Yelin: Right. 

Dave Bittner: And as I'm saying this, I'm getting sort of the creeps about it because - of all the things we talk about here. But real-time information on that - that's what I want. 

Ben Yelin: Right. Right. 

Dave Bittner: I want to know. I don't want to know after the fact. You know, oh, let's go through and see. Was that person who - you know, that's not the goal of any of this sort of stuff. 

Ben Yelin: Yeah. 

Dave Bittner: The faster, the better. 

Ben Yelin: Yeah. That certainly seemed to be a misplaced comment. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, I do think I've been to a lot of these types of conferences. And I - most of the conferences I've been to have been in the context of emergency management and homeland security. There is a lot of jargon thrown around. There is a lot of letting off steam. There are a lot of things that are said that, you know, people wouldn't repeat publicly outside the context of a conference. 

Dave Bittner: Sure. 

Ben Yelin: So I'm somewhat sympathetic there. You know, they probably didn't know that a Business Insider reporter was going to be, you know, writing a story about this. And I get that. 

Dave Bittner: Yes, and they had their guard down. 

Ben Yelin: Right. That's why, you know, they decided - somebody decided to put up dystopian sci-fi movie images in a PowerPoint presentation. 

Dave Bittner: Yeah. 

Ben Yelin: But there is the broader issue here, which is, are you going to get defensive and try and blame the public for their own misconceptions? Or are you going to recognize the problem - that you have surveillance technology, and it is surveillance technology, that is flawed, and unless it is ameliorated is going to break the public trust? Are you actually going to go out and try to address those problems? 

Dave Bittner: Yeah. 

Ben Yelin: And you know, we weren't at the whole conference. Maybe there were some panels where people did express some of those viewpoints. But it's certainly discouraging reading this article. 

Dave Bittner: Yeah. You know, my take on this, too, is that, like, I don't think all facial recognition technology should be dismissed out of hand. I think there are useful purposes for it in law enforcement. But as it just comes up over and over again, like, my take is, get a warrant. 

Ben Yelin: Right. 

Dave Bittner: Right? Convince a judge that this is something you need to do. Make your case in front of a judge, and then, OK. Have at it. But I am not a fan of, you know, police cars driving around with video cameras on them and tagging everyone they see. I just have a problem with that. 

Ben Yelin: Right. Right. I think most people's objections with this are the fact that these technologies are employed and used by law enforcement without any individualized suspicion. It's not just facial recognition. I mean, how many episodes have we spent talking about the Baltimore spy plane... 

Dave Bittner: Right. 

Ben Yelin: ...Or other forms of surveillance where information is collected in bulk, it's not based on individualized suspicion, and it's a dragnet? You're picking up images of hundreds of thousands of people who are completely innocent. 

Dave Bittner: Yeah. 

Ben Yelin: And until you wrestle with that problem - you know, I think there are proper law enforcement justifications for using this technology. But until you wrestle with that problem, I don't think it's worthwhile to, you know, sit back and complain that the public has all these misconceptions... 

Dave Bittner: Yeah. 

Ben Yelin: ...In my view. 

Dave Bittner: Yeah. 

Ben Yelin: If they were to invite me to the conference and give me, you know, some good swag and... 

Dave Bittner: (Laughter) You should be a speaker, Ben. 

Ben Yelin: I know. I know. 

Dave Bittner: (Laughter). 

Ben Yelin: Then maybe we'll feel differently. 

Dave Bittner: They'd invite you once (laughter). 

Ben Yelin: Yes, exactly. I don't think I would get the repeat invite. That's for sure. 

Dave Bittner: Right. Exactly. I would not be bringing him back (laughter). 

Ben Yelin: Yeah. I think I'd get negative reviews... 

Dave Bittner: Right. Right. 

Ben Yelin: ...On my presentation. 

Dave Bittner: Fair enough. All right. Well, we will have a link to that story. Again, it's from Business Insider, written by Caroline Haskins. Interesting stuff, for sure. 

Dave Bittner: All right, Ben. I recently had the pleasure of speaking with Robert Carolina once again. He is an attorney living over in the U.K. And our conversation focuses on ethics in cybersecurity. Here's my conversation with Robert Carolina. 

Robert Carolina: This conjecture came about when I was asked to write about ethics in cybersecurity for - you know, for CyBOK, which is a subject you and I have spoken about before. 

Dave Bittner: Yeah. 

Robert Carolina: I was writing about law, and they said, you should also write about ethics. Well, the Merman Borgnine conjecture is this. It comes from Ethel Merman and her marriage to Ernest Borgnine. And when she - and they were married for, I think, all of about 45 days. 

Dave Bittner: Right. Right. 

Robert Carolina: An infamously tempestuous relationship - they were both, let's say, people of very strong opinions, and it just did not last at all. Well, when Ethel Merman went to write her autobiography, it came to the chapter which was entitled "My Marriage to Ernest Borgnine." And then you turn the page, and there's nothing. It just goes straight to the next chapter. 

Dave Bittner: Right. 

Robert Carolina: So for me, the Merman Borgnine conjecture is this: if you're going to write about cybersecurity ethics, there's really nothing worth writing about. And so I seriously considered turning in a first draft of CyBOK that had a section on ethics with nothing underneath it and then moved immediately on to Section 14. That was plan A. 

Dave Bittner: (Laughter). 

Robert Carolina: I abandoned plan A because I had a feeling that the people who commissioned CyBOK would not have a good sense of humor about the Merman Borgnine conjecture. Indeed, some of your listeners will have to look up, who is Ethel Merman, or who was Ethel Merman? They might remember Ernest Borgnine. 

Dave Bittner: Right. 

Robert Carolina: But when they do, I hope they will look up "SpongeBob SquarePants" because now the character of Mermaid Man will be even funnier. 

Dave Bittner: (Laughter). 

Robert Carolina: So anyway, that's a little Easter egg for whoever wants to look it up. 

Dave Bittner: Right. Right. 

Robert Carolina: So then I went to plan B. And plan B was, OK, well, let's write something about ethics. Well, what can you find? For that matter, why did I come up with the Merman Borgnine conjecture? Why did I think there was nothing worth writing about? And that is because I really searched out there in the marketplace for what I would describe as any kind of guidance material that would really help a cybersecurity practitioner answer really difficult questions. And frankly, there wasn't really very much at all that I thought was valuable. 

Robert Carolina: Now, don't get me wrong. There were a lot of ethics statements out there that - because for some reason, there - you know, there's not just one organization that purports to represent security practitioners. There seem to be lots and lots of them. They just grow up like - they sprout up like weeds in the back garden or something like that. 

Robert Carolina: But most of them that had ethics statements - the ethics statements - they had, these - you know, these kind of, like, bromides, like, you know, all right. Well, Principle 1 - first, do no wrong, or don't be evil, or, you know, your job is to protect this or, you know, protect - and it's like - but nothing in there that you could really use as a teaching tool, certainly nothing in there that I could use to advise someone who is trying to figure out what's the ethical thing to do in a certain circumstance - you know, things like comply with the law, be aware of all legal obligations. And that was the other - and, of course, some of the older codes - that's all they focused on was the law. 

Dave Bittner: Right. 

Robert Carolina: They weren't focusing on ethics. They were focusing on the law. If you go to even older codes, the law that they're most focused on is copyright. Don't steal software. So the oldest ones were don't steal software. 

Robert Carolina: Then, the more recent ones were don't spy on people because data protection, and then comply with law. Well, great. How do I navigate some really difficult problems on that answer? You don't. And that really concerned me. Let me tell you the reason I'm concerned. 

Dave Bittner: Yeah. 

Robert Carolina: I'm concerned because security practitioners live in a world - cybersecurity practitioners live in a world where they operate using a special set of skills. And no, that's not a callback to Liam Neeson. I mean, it's just... 

Dave Bittner: (Laughter). 

Robert Carolina: It's like being an airline pilot or a physician or a surgeon. You know, it's a very special set of technical skills. 

Robert Carolina: Secondly, people work outside the glare of public supervision. If you're going to do a job as a cybersecurity professional, you're very often in a dark room someplace without anyone looking over your shoulder. Your client might not even see what you're doing, or whatever you're doing is invisible, and there's no one in the community who can see you as you do it. 

Robert Carolina: Third thing - people who do cybersecurity are placed in a really unique position of trust. Very often a security practitioner, especially if they're working in-house, will be given privileged access to a whole lot of systems. And once that happens, a very uncomfortable thing begins to happen. And that is the practitioner is put in a position of asymmetric power with respect to their client, with respect to their employer. And that's just a fancy way of saying, if you've got the keys to the kingdom and someone honks you off, you can say, well, you know, if you don't do what I say, I'll delete all your stuff, or you'll never find it again. 

Dave Bittner: Right. 

Robert Carolina: You know, there's two ways to have - there's two ways to get yourself in a situation of ransomware. One is to be hit with - you know, with a Trojan horse that comes onto your machine. The other is to have an unethical cybersecurity practitioner who decides they're going to hold all your data for ransom. 

Dave Bittner: Before we had cybersecurity people, were there other people in organizations who were in a similar sort of situation, someone whose capabilities perhaps, you know, outstripped what they should have been? 

Robert Carolina: Well, it's not so much outstripped what they should have been, I mean, because there's a lot of people who work in society who are in a position where they could have asymmetric power over clients. You look at any of the traditional professions - lawyers, medical doctors, for that matter, electricians. You know, what - I mean, what do all these groups have in common? And that is they're doing something - they're providing a service to people who don't really understand how the service works. You're dealing with clients who don't necessarily know a good practitioner from a bad practitioner. And you're dealing with a circumstance - if somebody doesn't have a strong ethical compass, they can really do a lot of harm to members of the public. 

Dave Bittner: Coming up as a lawyer, I mean, in your training, in your - you know, getting your law degree, to what degree - how often were ethics discussed? Is this a part of the curriculum, you know, to get where you are today? 

Robert Carolina: I studied law in the United States back in the late 1980s, and I went to Georgetown Law. And at that time, the American Bar Association mandated that every person getting a legal education in the U.S. had to study - had to take a specific course on professional responsibilities - legal ethics. It wasn't a gigantic course, but it was a course. And we studied, you know, the legal canons of responsibility, ethics, all that kind of good stuff. For that matter, we were required to take the multistate - what is it? - the multistate ethics exam or something like that. 

Dave Bittner: Yeah. 

Robert Carolina: And if you want the short version of legal ethics to try to get you past the ethics exam, rule No. 1 - don't steal money from clients. There's a lot of well-developed ethics guidance in the legal profession. And let's be candid. It's because lawyers are really in a position where it's really easy to steal money from clients. It's really easy to mess them over, OK? The more you find this kind of asymmetric power balance between professional and client, my hypothesis is, the more you find regulation in place. 

Robert Carolina: Same with medical practitioners - I mean, who can do more damage to a human client, human patient than a medical practitioner? I'm not saying that's what they do. I'm not saying that's what they're trying to do. I'm saying that in the past and in cases where people don't have an ethical guidance, don't have some kind of a compass, terrible things can happen. 

Robert Carolina: Now, are cybersecurity practitioners - do they have that much asymmetric power? Well, not necessarily. But then again, the power that they do have is significant. I'll let you in on something. One of the reasons I'm so hot about this - OK? As a practicing lawyer who advises people in relation to cybersecurity matters, I've had more than a few circumstances over the past few decades where clients have, in effect, been held to ransom by an unethical practitioner. It happens. 

Dave Bittner: Yeah. 

Robert Carolina: I don't know how often it happens because invariably, organizations - particularly small- or medium-sized enterprises - they would rather just go along to get along and try to get past the individual who's making their life a bit difficult. But it's a terrible spot for people to be in. 

Robert Carolina: So - now, again, the practitioners that I've dealt with over the years, the cybersecurity practitioners that I've dealt with over the years have almost universally been good people. I've been happy to deal with them. I've been proud to work with them, and just - it's been wonderful to support them, and some of them think very deeply about these kinds of things. But that's not everybody, OK? There are some bad people out there. 

Robert Carolina: The ones I'm most worried about, though, the thing that gets me exercised about this, are practitioners who want to do the right thing, but they're genuinely conflicted in terms of understanding what the right thing is. 

Robert Carolina: So when I went to look for usable codes of ethics, I'll tell you one that I did find, which was really useful. And I have to give them full credit for this. This is the Association for Computing Machinery, the ACM - their Code of Ethics and Professional Conduct. They completely revised that thing in 2018. I mean, the previous version was, like, 1992. The ACM has been around for a long, long time. And they had a code of ethics that was out in 1992. They revised it in 2018. What's the difference between 1992 and 2018? The 1992 code - you'd search a long time to find the word internet anywhere in there. 

Dave Bittner: Right. 

Robert Carolina: You get to 2018, and the overwhelming majority of every conflicting problem they're talking about has to do with online and connected systems. 

Robert Carolina: And so to their credit, they seem to have invested a tremendous amount of effort in coming forth with a code that's specifically built around a skill set. OK. You people who do computing machinery for a living - you know, computer engineers, software developers, all those kind - anyone who joins the ACM, think about their skill set. If you're going to use this very important skill set, which can be used to harm people, which can be used to help people, here's our guide. Great. It's wonderful. It's a lot of case studies on that sort of thing. 

Robert Carolina: The other one that I found a few years ago when I wrote this thing was the CREST Code of Conduct for CREST Qualified Individuals. Now, that one came out of the U.K. The organization CREST was originally developed as a self-regulatory body for people doing penetration testing. And that was interesting because their code of conduct was built around not - in part around a specific set of skills, but also in part around a business process. So there's more in that code that talks about business engagement and being good to your client and all that kind of stuff. The ACM code doesn't really seem to dig into those types of issues too terribly much - the sorts of things that working professionals worry about every day. 

Robert Carolina: For instance, in law school, one of the biggest things was, can you sue your client to collect a bill? Is that a conflict of interest? Turns out, it's not a conflict of interest if you're trying to collect your bill - so surprise, surprise. 

Dave Bittner: (Laughter). 

Robert Carolina: And... 

Dave Bittner: So say the code of ethics written by lawyers, right (laughter)? 

Robert Carolina: Well, exactly. Well, you know, you - I hate to say it, but, you know, law doesn't come free, man. 

Dave Bittner: Right. Right. Sure. Sure. 

Robert Carolina: You know, it's like, I wish it did. I wish it did. You know, so - and this is the other problem. Ethics don't come for free, either. 

Dave Bittner: Yeah. 

Robert Carolina: And now we hit a serious systemic challenge. And again - all right, time for controversy. There's no money in ethics, by which I mean people don't get rich talking about and developing codes of ethics - at least no one I've met. I mean, if they are, please give me a call. I want to find out what they're doing. Because I think this is a perception issue, and I would love cybersecurity practitioners to look at this a little bit differently. 

Robert Carolina: I think a lot of people perceive ethics as a threat to their ability to provide services because everyone says, oh, do you want this to be an ethical profession? Oh, yes, yes. We'll vote for ethics. OK, well, let's sit down and actually write very specific rules about what's allowed and what's not. Oh, well, you see, now people start to get a little bit nervous. Why? A code of ethics reduces degrees of freedom. 

Dave Bittner: To what degree - what I would describe as the relatively recent professionalization of cybersecurity, you know, drifting away from what I would say - you know, the first 20 years or so when we had these rock star practitioners who no one understood what they were doing or how they did it and, you know - so they could pretty much operate any way they wanted to. But I think we're at a stage now where there is professionalism. And so with that comes more standardization. 

Dave Bittner: First of all, do you think I'm on the right track there? Do you think that is the reality of where we are? 

Robert Carolina: I think you're on the right road. I'm just not sure how far down that road we are. I mean, people have been talking about the professionalization of cybersecurity for the entire 30 years I've been involved with it. 

Dave Bittner: Yeah. 

Robert Carolina: And everyone keeps saying, oh, we're making progress. Well, I think progress has been limited, in part because people have been so busy trying to do cybersecurity, in part because trying to take a very careful focus on how to develop a code of conduct or a code of ethics takes an incredible amount of time, in part because the infrastructure itself is continually changing and, you know, it's hard to develop good case studies if the problems keep changing, and in part because of just the massive influx of people into the space. 

Robert Carolina: You know, 25 years ago, the people I was meeting who were doing cybersecurity, they tended to fall into a couple of odd categories. Either, oh, this is the person who used to work for the government. Oh, oh, oh, oh, oh, OK. So they did - yeah, yeah, one of those guys. Oh, OK, great. I get it. And who's this person? Oh, that's the person who developed a cryptographic product - cryptographic products because they're a crypto expert. Oh, one of those types - who's that person? Oh, that person's kind of odd. He got his start doing computer security for a large school system in North America back before anyone knew that students might be hacking in to try to change their grades. Oh, that's interesting. And what about this other person? Well, that person used to be a deputy sheriff in wherever and was put in charge of - so I mean, they were all sort of one-offs. 

Dave Bittner: Yeah. 

Robert Carolina: They were all one-offs. And now there's a huge number of people coming into the space, and it takes a little while to settle, I think. 

Dave Bittner: Is some sort of, you know, certification or standards bodies - is that a solution here? Do we need a cyber AMA or a cyber FDA or, you know, something like that to get us on an even playing field? 

Robert Carolina: Well, that's an interesting question because that immediately leads to this problem - how do we define a cybersecurity profession? What is it exactly? And you can see this when you look through CyBOK, by the way. One of the reasons CyBOK has so many different sections is because there are so many things people do that are labeled as or perceived as the practice of cybersecurity - you know, product development, penetration testing, risk assessment, risk management, cryptography development, applied cryptography. I mean, you know, it goes on and on and on and on and on. 

Robert Carolina: Now, here's the thought question. Here's the really hard thought question. Let's assume for a moment that you're a government that wants to regulate the cybersecurity profession, right? How do you do any type of professional regulation? Well, you license people. OK, I get that. In order for the licensing to make sense, how do you encourage people to get licenses, or how do you know who has to get a license? Answer - you define in law a series of things that no one is allowed to do unless they have a license. Here's the question - what goes on that list? 

Dave Bittner: Yeah, well - and, you know, what I'm thinking of as you're describing this is, for example, where I live here in the states, if I want to have someone come in and do electrical work in my house, they have to be a licensed electrician. But if I want to do that work myself, that's allowed. No problem. It's fine. 

Robert Carolina: Yep. 

Dave Bittner: Right (laughter)? 

Robert Carolina: Yep. Well, it's because... 

Dave Bittner: So what does that mean? 

Robert Carolina: Well, I mean, a lot of these things - I mean, law is the same way. If you want to advocate in a courtroom on behalf of another person for money, you have to have a license. 

Dave Bittner: Yeah. 

Robert Carolina: If you want to represent yourself, you know, with some weird exceptions, there's no rule against that. Or in some edge cases, if you want to volunteer to represent someone without being paid, there are circumstances where that's allowed if you're not a lawyer as well. So, yeah, this idea of, are you doing it for yourself or are you doing it for others is one way to slice that. 

Robert Carolina: But I mean, when I first posed this question to someone more than a decade ago, I had one thing that I think immediately comes to mind, and that's penetration testing. And that's because it seems to me that's a task undertaken by cybersecurity practitioners that, if it goes wrong, has the opportunity to create disproportionate harm to a client. If someone doesn't know what they're doing, they can really screw up a client very, very badly. So I thought that would probably be a good candidate for licensure and regulation first. 

Robert Carolina: And I find it interesting that one of the more successful professional bodies in the U.K., in fact, was CREST. And they were, in fact, started specifically with the idea of regulating - self-regulating penetration testers. Now, they've branched out into other areas as well, and you can read all about them. 

Robert Carolina: But - so, yeah - but this idea of what would be on the prohibited list - are you going to say risk assessment? I don't think so. I mean, because you got - that's just a business process. Reporting on risk is something that people with an MBA do every day. Well, OK. So what's it going to be? 

Robert Carolina: And other people have suggested to me that on the list should be specifying security systems. OK, well, what do you mean by a security system - any IT with a security component? So actually, that list of what is not allowed is, to me, the more interesting question that I very rarely see addressed. 

Dave Bittner: Is there a desire within the industry to address this? Is there even recognition that this is an issue? 

Robert Carolina: I think there's a growing recognition that it's an issue, but I think that the recognition is very uneven. 

Robert Carolina: Well, let me highlight another area that I think may have caused or contributed to the problem. And that is a lot of people who work in cybersecurity learned their trade, learned their tradecraft, while working in government service. And there's nothing wrong with that. I celebrate those folks. I know a lot of them. You know, they worked for a security service or for the police or whatever it happens to be. And they were taught, you know, again, a very special set of skills. 

Dave Bittner: Yeah. 

Robert Carolina: I find that when cybersecurity practitioners - at least in the U.K., where I'm most familiar - work within a government agency of some type, those government agencies tend to have extremely rigorous, very thought-provoking and very thoughtful frameworks that they use to answer questions like, what should we do, or what should we not do? In fact, some of the most rigorous analyses I've ever seen have come from reports I've had from people who work in, well, you know, the big, super-secret agencies, OK? 

Dave Bittner: Right, right. 

Robert Carolina: We all know who they are. 

Dave Bittner: Right. 

Robert Carolina: I've been very impressed with the focus that they bring. I think the challenge comes that when someone who's learned all of their tradecraft working in public service then leaves, they have all of their tradecraft they bring with them. They know how to do the things. What they no longer have is a large pyramid of decision-making and recourse to guidance and, you know, a senior person of this level or a senior minister of that level or, you know, a warrant from this organization or - you know, all of that sort of surround has disappeared. 

Robert Carolina: So some of them - I mean, if they go and work for, let's say, one of the big international accounting firms, you know, one of the - when I was in - when I first studied accounting, it was the Big Eight. Now I guess it's the Final Four (ph). 

Dave Bittner: (Laughter). 

Robert Carolina: You know, if you go and work in that type of environment, because they are regulated professionals, they already have an enormous surrounding framework that they have to use to assess what they will or will not do for a client or to a client, for that matter. 

Robert Carolina: When they go away into other types of organizations that don't have that regulated surround - the sadness I feel, the tragedy I feel, is for the people who are looking for answers to what should I or shouldn't I do, but find themselves working in an organization where no one seems to have a really good answer to that. Oh, you know, we go to Jim, we go to Sally. They always seem to have a good answer. But you kind of get the impression that maybe they're making it up as they go along. And I think that's a tragedy. I want those people to have clear guidance, and I'm hoping that we're moving closer to a day when they will. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: First of all, good to have a friend of the show overseas. I'm glad we were able to speak with him again. 

Dave Bittner: Yeah. 

Ben Yelin: I think ethics is just completely underdiscussed in the context of cybersecurity. I think that problem's particularly acute among lawyers. We talk about what the law is and not about, you know, what's ethically proper and, you know, what are ways, from a public policy perspective, to make sure that the use of surveillance tools and cybersecurity capabilities is in the public interest. 

Dave Bittner: You're a professor. I mean, you teach the next generation. Is this something you touch on? 

Ben Yelin: It is. And I think ethics is, you know, going to start to be more of a part of a broader cybersecurity curriculum. It took a long time to integrate law and policy into cybersecurity curricula. I think we've finally started to penetrate that in master's programs. You'll see classes on the technology side, and then you'll have to take a course relating to law and policy. And in more and more programs across the country, I think cyber ethics is going to be a required course as well. And I think it should be. 

Ben Yelin: You know, we have ethics courses for all different types of domains and practices. Mr. Carolina talked about ethics in the legal context. You know, you have to take an ethics exam in most states to qualify for the bar. In the state of Maryland, you had to go to a full eight-hour training in Annapolis - I know from experience - where they tell you, don't steal money from clients, and say that in about, you know, 20,000 separate ways. 

Dave Bittner: Right. 

Ben Yelin: But I do think it's critically important, especially to maintain the credibility of the industry, to know that, you know, people are keeping the public interest in mind, thinking about ethical ways of handling other people's data in the interest of protecting their privacy and ensuring their security. 

Dave Bittner: Yeah. All right, well, again, thanks to Robert Carolina for joining us. Always a pleasure to have him on the show. I always look forward to chatting with him. Always leaves me thinking about the things we discussed long after the conversation is done. So always a treat, and I hope folks in the audience enjoy his presence as well. 

Dave Bittner: That is our show. We want to thank all of you for listening. 

Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.