8th Layer Insights 7.30.24
Ep 48 | 7.30.24

What About Ethics?

Transcript

Perry Carpenter: Hi. I'm Perry Carpenter and you're listening to "Eighth Layer Insights." Imagine you've just discovered a major security flaw that could expose millions of users' data. Do you report it immediately, risking your company's reputation and stock price? Or do you keep quiet, potentially saving jobs, but leaving customers vulnerable? We live in an age where a single wrong technology based decision can upend lives and topple corporations. Cybersecurity professionals are the digital guardians, but as the stakes grow higher and higher and the challenges more complex, we are faced with a pressing question. What ethical principles should guide those tasked with protecting our digital world? Doctors have the Hippocratic oath. Lawyers are bound by bar associations. But when it comes to cybersecurity there is no single agreed upon oath or creed that everyone recognizes. So now think of another scenario. Imagine you're a pen tester. You're hired to find weaknesses in a company's security. But during your authorized attack you stumble upon evidence of illegal activity by company executives. Do you report it and deal with the complexity and the anger from a customer? Or do you stay silent, allowing potential crimes to continue? Think about it. When ethical hackers discover vulnerabilities in major platforms they face tough choices. How and when should they disclose flaws? Their decisions could affect millions of users. Or think about ransomware attacks on hospitals during the pandemic. Should those working with the hospital advise their clients to pay the ransom and potentially save lives even if it means funding cyber criminals? These aren't just hypotheticals. They represent the daily reality for those on the front lines of our digital defenses. And the stakes are nothing short of the trust that underpins our entire digital society. So how do we navigate this ethical minefield and how do we balance security with privacy, transparency with confidentiality? Well, to help us think about this today we're joined by Ed Skoudis, coauthor of the book "The Code of Honor: Embracing Ethics in Cybersecurity." Ed and his coauthor Dr. Paul Maurer have taken on the huge task of crafting a comprehensive ethical framework for the field of cybersecurity. They draw from age old wisdom and contemporary challenges. And from that they've developed what they call the cybersecurity code. It's a moral compass for a digital age. In our conversation we explore the ethical complexities of being a cybersecurity professional, the challenges associated with writing and adopting a code of ethics, and a whole lot more. And so on today's show get ready to boot up your moral compass, turn on your philosophical flashlight, and journey through the jungle of right and wrong in cyberspace. Welcome to "Eighth Layer Insights." This podcast is a multidisciplinary exploration into the complexities of human nature and how those complexities impact everything from why we think the things that we think to why we do the things that we do and how we can all make better decisions every day. This is "Eighth Layer Insights." Season five. Episode eight. I'm Perry Carpenter. Welcome back. Okay. Before we jump into the interview I wanted to let you know that I have a new book coming out October 1. It's called "Faik: A Practical Guide to Living in a World of Deep Fakes, Disinformation, and AI Generated Deceptions." Oh. And the fake in the title is spelled F-A-I-K. Yeah. See what we did there? 
Again that's faik spelled F-A-I-K, a practical guide to living in a world of deep fakes, disinformation, and AI generated deceptions. I'm really excited about this book because it is the first book I've written that serves the general public. So this is all about helping anyone who picks it up level up their understanding of AI, to see how AI-generated content can be weaponized and why we fall for digital manipulations. It also helps people learn how to think like a hacker. And most importantly of all, it has practical steps and strategies that anyone can follow to help protect themselves and their loved ones from sophisticated online scams, especially those AI-generated ones. And so if you want to learn more, check out the links in the show notes, or you can go to this book is faik, F-A-I-K, dot com. That's thisbookisfaik.com. Okay. Now let's jump into our interview with Ed Skoudis.

Ed Skoudis: My name is Ed Skoudis. I am the president of the SANS Technology Institute College. A lot of people don't realize that SANS has a college, but we do, with over 2,000 students in it, about half undergraduate and half graduate. So that's something I do that I really love doing. I'm also very involved with SANS ranges. I have a team called Counter Hack. We build the NetWars range for SANS that we offer at various conferences such as SANS conferences as well as some of the BSides events, and the RSA Conference sometimes hosts NetWars. My team also builds the Holiday Hack Challenge, which is our big free challenge that about 20,000 people play every year. It's available year round. So if you ever want to build your skills in a wonderful cyber range, you can go to sans.org/holidayhack. Play that. I'm also the original author of SANS' number one selling course, SANS Security 504, which is on incident handling and hacker attacks. And I wrote SANS' foundational course on penetration testing. I'm a SANS Fellow. I'm on a few boards of directors today. And I just love all things cybersecurity. I love getting out to events and seeing people, providing some coaching and input, and just really enjoying the cybersecurity community. I really love helping people, you know, build their skills, build their capabilities, sort through some tough issues, and you know I do my best to try to make the world a safer, more secure place through that. And to that end I am the recent coauthor of a new book on cybersecurity ethics. The book title is "The Code of Honor: Embracing Ethics in Cybersecurity." And it was just recently released. I am the coauthor of it. Dr. Paul Maurer is the other coauthor. He and I worked on it for about a year and a half. And it was just released. So that's me and my background.

Perry Carpenter: Fantastic. And that's an amazing background. So before we get in and talk about the book, given all the SANS activities that you do, you're very focused on helping people level up their skill sets and preparing the workforce. You know, it's not very often we get somebody who is that focused on workforce development for as many people as you are. So I'd love to spend just a minute or two getting your thoughts on, you know, this thing that everybody keeps calling the cybersecurity skills gap. From your perspective, when you hear that phrase, what is it? And then what do we need to do to fully address that, or address it better than we have been over the past decade that we've been talking about it?

Ed Skoudis: Absolutely. The cybersecurity skills gap, I mean, it is a big problem. I talk to folks in the commercial sector, in the government space, in the military, and they can't find enough really solid people with hands-on-keyboard skills. Now there are some detractors out there. It's a point of some controversy, surprisingly. Some people say that the problem's not as bad as it seems. You know, there are estimates that we need between 600,000 and a million more cybersecurity practitioners in the United States alone. Those numbers I think are pretty accurate based on what I'm seeing. Others, you know, throw rocks at that and say, "No. It's fewer than that." But I think it's undeniable that it's a problem. And if you're trying to hire and retain really good talent, you know this. I mean, we see it again and again. So SANS has done a lot on this. I've tried to support the various SANS initiatives on workforce development. But notice I said earlier hands-on-keyboard. There's a lot of people that are maybe book smart or maybe policy and procedure smart, and there's nothing wrong with that, but practitioner smart, the ability to make computers do things and secure them by configuration, by forensics, by penetration testing, by detailed SOC analysis, those are the positions we need to fill better and faster.

Perry Carpenter: Yeah. Well, and getting to that, though, one of the biggest hurdles for people in cybersecurity that I see is still that entry-level job. So if there's one thing that people need to do right now to address that at the front end, so that those entry-level jobs are actually within reach of somebody with a little bit less experience, how do -- how do organizations prepare for the fact that sometimes they're becoming their own worst enemy? At least if I'm going to frame it that way. And you can correct me if I'm wrong. Maybe that's a good question. Are they sometimes their own worst enemy in that entry-level job, where they could take somebody that's got raw talent and then find a way to get them the hands-on-keyboard experience? Or should we be setting up college programs better so that that's a given? Or is there some other step that's being missed? Because if that number keeps growing the way it is, then it means we're doing something wrong at one level of the chain that could potentially be addressed, and we could alleviate that at least a little bit. Right?

Ed Skoudis: Yeah. I agree 100%. You know, it's the classic problem that is often framed this way. You'll see a job requisition and you read it and it's an entry-level job requisition. You know, somebody who's, you know, maybe a starting-out SOC analyst or somebody that's brand new doing cyber defense. The salary is in line with what you'd see for somebody who's starting out here. And you read through it and it looks like a really good job rec until you get to three years' experience needed. Well, gee. That's not entry level anymore. Right? And how do you get that experience? And I think it's the responsibility of organizations as well as the individuals themselves. And from an organizational responsibility there's many things that can be done. One is internships. I've had interns working for my company and for me for about 20 years now. And that is a great way to help mint new cybersecurity talent. You bring somebody on board. Now I tell my interns, "I will pay you, but I expect you to work." I'm not big on free internships. I expect you to work. I expect you to produce. And I will help you to do that, but interns -- and my interns have gone on to do some really wonderful things. I'm super proud of them. And some of my interns still work for me. I hired them full time after they did a good internship. So I think organizations can do outreach and build up interns. I have many friends that also have interns as well and I'm happy to see them do that. So internship is one thing. Also writing true new-to-cyber job requisitions and not saying you need three years' experience in cyber to be a brand new person starting out in cybersecurity. That just doesn't make any sense. So you need to let your HR people know. Maybe it's not the cybersecurity organization itself that is causing that big problem of requiring three years' experience for somebody with no experience; it's what happens when what the cybersecurity team needs to hire gets translated through the HR process. That's where the disconnect comes up. So looking at what your job rec's saying and helping to direct HR better so that you can get new people on board. We have to make a significant effort as an industry to get new people on board. SANS has done a lot with new to cyber, with classes and webcasts and such. And that gets us to the individual. I talked about what the organization needs to do, internships and interacting with HR in a more detailed fashion to pull in new people. But for an individual, if you're new to cyber there's a lot of stuff you can do. And one of the most important things I think is for you to play capture-the-flag events. Now some of them are very advanced and you'll probably get wiped out pretty quick. That's fine. Look for introductory capture-the-flag events and don't expect to do great on them. Just start and play and have some fun, maybe a few laughs. Maybe make some friends that are working on the same CTF as you. And you might say, well, what CTF could I start on? Obviously you want something free. You'd like something high quality. You'd like something fun. I'm sorry. I don't want to turn this into an advertisement for something we give away, but the Holiday Hack Challenge. The SANS Holiday Hack Challenge. My team spends thousands of hours building it and we give it all away. Thankfully SANS sponsors it. So SANS lets us keep the lights on while we're working on it. But it is an absolutely free challenge available to all year round. It's competitive in December and January so that you can win prizes.
But if you want to play in March or August or whenever, you can just play it and just have fun with it. There's a Discord community. So if you get stuck you can ask questions. The SANS Holiday Hack is that free thing that we give to everyone, hoping especially that new-to-cyber people will participate. But beyond that you can go to CTFtime. So capture the flag, CTF, CTFtime, I believe it's dot org. They have a lot of different examples of capture-the-flag events that happen, you know, all around the calendar. And, you know, all different time zones. And you can find ones that are geared toward people who are brand new. So building those skills, putting that on your resume to say, "Hey, I played the SANS Holiday Hack Challenge and I completed it." Or, "I got 30% of the way through." Or whatever, or the other CTFs that you might participate in, that's a way that you can show that you have interest and excitement and drive and motivation, by participating in these, especially the free CTFs.

Perry Carpenter: Yeah. And especially the -- you know, the CTF mindset is one of curiosity and determination and just kind of puzzle solving. And I think that shows a certain skill set that is very advantageous in cybersecurity. So I think that's really good advice.

Ed Skoudis: It has helped me. It has helped me so much. I've been playing CTFs for about 28 years myself, just playing them. I would play, you know, at DEF CON, the big hacker conference, many, many years, for about a decade. I started writing my own CTFs I would say about 23 years ago, and I started building a team to build CTFs, meaning things like NetWars cyber ranges and the Holiday Hack Challenge, back in 2010. So that's almost 15 years ago. So CTFs have been with me really since the start of my cyber career, in helping me build my own skills, in helping me create CTFs to help others build their skills, and then in fact even building my Counter Hack team, which was created just to -- because I had this idea 25 years ago. What an amazing way to learn and build skills. Real skills. Well, CTFs, you know, some people write them so that they're whimsical and fun and weird, and that's good. But if they don't give you actual skills that you can use on the job, that's bad. We try to write them so they're fun and whimsical and weird, entertaining, but they also give you actual real world skills that you can use on the job in a very measurable fashion. And that's what I think you want to look for. And, like I said, 25 years ago I said, "I think there's a big space for this. Let's start doing it at SANS. And then -- " And not only doing it for SANS events, but doing it for other events. Like I said earlier, BSides, RSA Conference, and so forth. And invest heavily in that.

Perry Carpenter: Yeah. So I think that's an interesting and yet fun and awkward segue into the topic. So if we're addressing the cybersecurity skills gap from the organizational point of view, there's a mental framework of what are we actually looking for that we then make actionable in the way that we build job requisitions and the way that we build entry pathways. If you're a learner, you have a mental framework. You have curiosity. You involve yourself in community and you involve yourself in things like CTF exercises, and then that becomes actionable skill sets. One of the other mental frameworks that's really important in cybersecurity is of course the way that we embrace and think about and actually apply ethics in what we do. So that gets to your book. Every book has a purpose that the author is thinking about and framing in their mind. Sometimes there's an inciting incident behind why the book went from the author's mind into words on a page and then physically bound and put out into the world. What was the thing that made this a book that needed to be written, from your perspective?

Ed Skoudis: Wow. That is a wonderful question and I have a couple of different threads that kind of came together to make this book. From my personal perspective, I taught at SANS for 25 years. You know, in teaching a class on incident handling and hacker attacks and then teaching a class on penetration testing, I would get similar questions emailed to me after class. Sometimes it'd be in class. Most of the time it'd be after class. You know, independent of class. Somebody took a class a year ago or a month ago, whatever it was. And the questions were often ethically complex. Please help me. I need advice. Like one of the very common questions I would get is, I work as a cyber defender for this given organization. We just passed our audit. Or maybe a PCI audit. We just passed it. And, Ed, I know we should not have passed. I don't know if the auditor made a mistake or they didn't listen or who knows. I don't want to blame anybody. But we're not as safe and secure as we're supposed to be to meet the standard, and I don't know if management understands that or if management's glossing over it. What do I do? That's one question that comes up. Another question. We've been breached. I think we have a responsibility to disclose this breach, but my management says we don't. What do I do? Or another one. I've discovered a zero day vulnerability. I've interacted with the vendor. They don't think it's that important. What do I do? So it's a bunch of that, just over, you know, 20 years of people saying, "Hey. What do I do?" in all these different ethical dilemmas. That was my perspective. So I've been interested in this stuff because, you know, every year I would get a dozen or more emails like that over the space of 20 years. The second thread that led to the creation of this book is Dr. Paul Maurer. Dr. Maurer is president of Montreat College, which is down in North Carolina. And Dr. Maurer has built -- has been building a cybersecurity program at Montreat, primarily at the undergraduate level. And I found out about his work there about 10 years ago and I went down to a couple of conferences that he hosted down there. I also was added to his advisory committee, especially on the curriculum. How can we build a curriculum that is good for cybersecurity practitioners at the undergraduate level? This was before I became the president of the SANS college or anything like that. Anyway, Paul was talking about this and, you know, building great people with their bachelor's degrees in cybersecurity, emphasizing the need for building character and having solid ethics. And he somehow got on the radar of some folks at the NSA, and the NSA said, "We need cybersecurity practitioners to have a great sense of ethics and character." And so the NSA asked Dr. Maurer, "Can you build a curriculum for us and we will fund it?" And Paul said, "I can do you one better. What if we did a book? It could be a curriculum, but if we had a book that would make it not just for the college classroom, but more for the industry generally. Let's not make it a textbook. Let's make it a practical real world book with ethical dilemmas in cybersecurity and scenarios and practical advice for how to navigate them." And the NSA said, "Great. We would like that." And then Paul reached out to me and said, "Ed, are you interested?" Now given my previous 20-plus years of trying to help people tease through ethical dilemmas, I thought this is a way that we can do it at a scale bigger than I can do myself.
So those are the two threads that kind of came together to create this book, me answering these questions for a couple of decades, Paul building a program for undergraduates over 10 years and then getting the NSA's interest saying part of -- so this will tie it to our earlier conversation. Part of the issue with the knowledge gap with the, you know, workforce development problem is not only do we need people to be skilled with fingers on keyboard, we need them to be trustworthy too. Right? If you have a bunch of people who are ethically compromised or you know ethically dangerous you don't want to skill them up into how to do penetration testing better or, you know, give them access to this environment that has such sensitive information. So that's the thing I liked about what Paul was talking about, Paul Maurer, is -- because I was always about, hey, let's skill people up and that will fix the problem. He's like, "You want to skill up the people, but a crucial skill is how to make ethical decisions." And it's like I'm in. I'm in. And we worked for two years to create the book that tries to address that.

Perry Carpenter: Yeah. That's amazing. Well, and two years on a book in the technology industry is a long time because we have a whole bunch of quick cycle books being written. And I think one that stands the test of time is a -- you know, an evergreen book that's had that amount of thought, research, and due diligence put into it. When we talk about it, what were some of the most interesting finds that came from the collaboration? Because you'd had, you know, a couple decades of thinking about it from your perspective. Paul had had a long period of time thinking about it from his perspective that was, you know, mostly academic, but also had some real world tendrils out there I'm sure. As you came together were there any interesting insights that kind of struck you as like, "I've never thought about it that way?" Or here's an interesting way to frame something that never came to mind.

Ed Skoudis: Oh yeah. I mean the book is chock full of that kind of stuff. What we try to do is to create essentially a centralized, fairly simple, fairly straightforward code. In fact the code is called, you know, "The Cybersecurity Code." The book's got this. And we make available for free not the book, but the code itself. And it's a series of principles upon which you make ethical decisions. And one of the first principles, I will treat all people with dignity and respect. That is to say cybersecurity fundamentally has to be about people. And, you know, taking care of people, defending people, and not mere technology or simply money. It's about defending people. I will seek the best interest of others is another one of our principles. So it was wrestling with my coauthor Paul to come up with those principles. And when we originally worked on this, you know, we had a dozen principles, maybe 15 principles. And we're like, are these two the same thing? Maybe they're just different facets of the same thing. So to try to condense it down, we even argued over what some of the principles would be. I had some ideas for additional principles I'd like to see that my coauthor disagreed with and we argued for days about that. Now I can give you two examples. And one of them is this. I thought that one of our fundamental principles, you know, in cybersecurity ethics, should be protecting the privacy of others. My coauthor didn't agree with that. And I said, "No. I really -- " And we argued back and forth. Here's why I think it's essential. He's like, "Well, maybe that's just cultural. You think that because of the culture. If it was a different culture, would they care about privacy as much?" Even in our own culture, if you go back 100 or 200 years, privacy wasn't as big a thing as we treat it now. So we argued over that and I'm happy to say I won. I won. We have as our eighth principle, I will protect and respect the privacy of others. Because it's so easy when you're doing cybersecurity work to inadvertently or even purposefully violate that. And that causes all kinds of problems. So I won that argument. And this is -- you know, I'm boiling down, you know, probably 6 or 10 hours of argument into 30 seconds. So that was one that I won. Another one that I didn't win is liberty and freedom. And I argued that, you know, that should be a fundamental tenet of ethics in cybersecurity, that we should work hard to ensure that we give people freedom. And he said that's just cultural. You say that because you're an American. It is not universally held around the world. So I lost that. I mean it was an impassioned argument on both sides back and forth, and we do not have that as one of our principles, which is fine. I mean I get it. He convinced me just like I convinced him about privacy. So those were kind of the things that we really thought hard about. And then the other thing, Perry, is the order, because many times these principles come into conflict with each other. So generally speaking the earlier ones are the ones to pay more attention to and try to adhere to better than the later ones. And our first principle is I will treat all people with dignity and respect. The last of our eight principles, I will protect and respect the privacy of others. You could see how those would come into conflict. You know, hey, I have to look at the contents of this file so that I can treat other people with appropriate respect. Which one do I take into account? So balancing these things, that's the real tricky part.

Perry Carpenter: Yeah.

Ed Skoudis: And that's why every chapter in the book has a case study that we go over saying this kind of thing happened. And they're all based on real world stuff that I've seen in my career. This kind of stuff happened. How do you dissect that? How do you apply our principles to that? And then every chapter ends with another case study and a series of questions that the reader has to go through and can maybe even discuss, you know with a small group or something like that. So I mean so those are the examples. But really, you know, it comes up and we're pushing on each other like practically speaking how does somebody deal with this. What does that mean? And then to kind of push on each other well if we accept your principle, doesn't that imply this and that and these other things? It was a great debate [inaudible 00:27:01].

Perry Carpenter: Yeah. Well, it does get to that what are the consequences of having this principle in place and what are the unintended consequences that could -- so you almost have to threat model your own principles in that.

Ed Skoudis: You do. And you know what it almost reminded me of? In fact we cited it a little bit in the book. Do you remember Isaac Asimov's rules of robotics? So, you know, you've got the first rule of robotics. You know, a robot won't hurt a human. You've got your second rule of robotics, a robot has to obey orders unless that conflicts with the first rule. And then the third rule of robotics, you know, a robot won't let itself come to harm unless that conflicts with the first two rules. Etcetera. It felt like that because, I mean, look. He wrote "I, Robot" and some of the Foundation series on the inner conflicts between these three rules of robotics. So we came up with, you know, eight principles for cybersecurity ethics, and seeing them interplay with each other and break each other and then trying to figure out how do you turn that into still a good decision. You know, that was -- that was the interesting and hard part of the book. And I'm real happy with the way it came out. I do hope it will help people.

Perry Carpenter: After the break, the conclusion of our interview with Ed Skoudis. Welcome back. One of the nagging questions I have and I'm sure that people that are listening have is we've mentioned explicitly two of the eight right now. Can you just rattle off the eight real quick so that that nagging question gets out of listeners' minds?

Ed Skoudis: Yes. So first of all you can download them all. We've got a single page and you can download them with, you know, nice pretty font and all of this stuff. You can even sign off on this code of ethics. If you go to montreat, M-O-N-T-R-E-A-T, dot edu and then just slash cybersecurity dash code.

Perry Carpenter: Okay. And I'll put a link to that.

Ed Skoudis: Montreat dot edu, cybersecurity dash code. Yep. So here we go. You ready?

Perry Carpenter: Yes.

Ed Skoudis: Here's the eight. I will treat all people with dignity and respect. I will seek the best interest of others. I will strive to recognize, take ownership, and appropriately communicate my mistakes and exercise patience toward others who make errors. I will be honest, trustworthy, and above reproach in my actions and communications. Now I know what you're thinking. We'll come back to it in just a minute, but I know what you're thinking. Okay? I will not be a lone wolf, but will instead work collaboratively with my peers and superiors. I will endeavor to exercise patience, wisdom, and self control in all situations. I will not steal and will do everything within my power to prevent theft in all its forms. And last, but not least, I will protect and respect the privacy of others. Now what you might be thinking -- and I've heard this challenge from a lot of people as we talk about this book. Well, those aren't just cybersecurity centric. Those are just general principles for ethics.

Perry Carpenter: Yes

Ed Skoudis: That's not a weakness of this book. It's a strength. I mean, Paul and I spent a lot of time combing through cybersecurity codes of ethics. You know, there's one associated with ISC2. There's one associated with GIAC. Many of the certifications have them. And we looked at them and tried to boil them down to a common thing. We also looked at various codes of ethics from other professions, like the Hippocratic oath. We looked at the Geneva Conventions associated with war. We looked at all kinds of different ethical sets of principles and kind of boiled them down. And this code that we put in the book could be applied to other fields. But what the book does is it takes those principles, carefully fought over and wrestled over, and says how do we apply them to pressing issues in cybersecurity. And that, I think, is not a weakness of the book, that these principles can be applied elsewhere. It's a strength of the book.

Perry Carpenter: Yeah. Yeah. Absolutely. I agree with that. As I was listening to that I was like, well, that's just being a good human. Right? A good citizen. But the question that it's answering is how do I apply being a good citizen, or digital citizen, to the field that I work in right now. And start to ask the questions about, if this is a principle that I want to align my character with, how does that play out in the day-to-day decision making that I'm doing as I get hands to keyboard and have to do an investigation on somebody or have to react to an incident? What are the dilemmas that I might find myself in that could be maybe in conflict with one of those principles, where I could either give way and do something easy, or I could hold to integrity and make a harder decision? Maybe by bouncing it off one of those other principles and having a collaborative discussion with somebody so that I can wrestle with the things that need to be done, and I can be transparent about my decision making, document the things that are going on, and then go back and do whatever the collaborative decision is on that. So I think it makes a lot of sense to approach it that way.

Ed Skoudis: You summarized it perfectly. I mean that's an excellent summary of the book. I could not have said it better myself.

Perry Carpenter: So in that -- flesh out for a second then like what do I expect as I open one of the chapters and start to see a principle as it's expressed in that. You mentioned there's some case study elements.

Ed Skoudis: Yes.

Perry Carpenter: There's some interactive elements of, you know, maybe discussions that could be had or thought experiments that could be had. How do you wrestle the reader through that in a chapter?

Ed Skoudis: So we have an introductory chapter that's just, hey, here's what this book is about. But then the structure of the book is one of these principles per chapter. So, I will treat all people with dignity and respect. We talk a little bit at the beginning of the chapter about why that is so. Where does that come from? Why is that important? We usually at the beginning of a chapter have a scenario where there is some sort of conflict, where it's like, what do I do here to address that issue? So there's usually a scenario right at the beginning, or if we feel some more definitional work is needed up front, there's some more definitional work and then the scenario, and then why. Why is this so? And then how do I apply that? How do I practice that? How can I integrate that into my day to day decision making? How might it interplay with others of these tenets that could prioritize it or lower its priority based on what happens in the real world? And then every chapter has a nice conclusion and then this application, the interactive element that you talked about, Perry, where we give you a scenario. And some of these scenarios are one or two or three pages long, but they're exactly the kind of things that I see me and my friends dealing with in cybersecurity. They're all real world scenarios. Well, we sort of dress them up so you can't say, "Oh. This is that organization." If you're an astute reader with some cybersecurity background you can say, "This looks an awful lot like that case." Or the other case. And we did try to write it to be relatively timeless. I mean, in this fast-changing field it's impossible to be completely timeless, but we tried to write it on ethical issues that have manifested themselves over the last decade or more, maybe in different ways, but they keep coming back. You know, the privacy one. If you just think about the violations of privacy that have been done by various organizations, often social network organizations, I won't name specific names, but there's some of them that are more egregious than others, you'll read some of our scenarios and say, "Oh. That sounds like this with a little bit of that added in." When we -- on a couple of occasions in the book when we felt somebody did something very good ethically, we actually do call them out by name. Hey, here's an example of something good that you could hold up. For example, Dan Kaminsky. The work that he did with DNS, when he found that fundamental flaw in DNS almost 15 years ago now. Time flies. And how he properly disclosed that flaw, worked with vendors. I mean, he had such a high sense of ethics and doing things the right way to make the world safer and more secure. So we actually have, you know, a few paragraphs in there about what Dan did back in the day and how that can be a model for the rest of us. Unfortunately Dan passed away a few years ago. He was quite an amazing person. I don't know if you ever had a chance to meet him, Perry, but --

Perry Carpenter: I never met him. I saw his influence everywhere. And definitely saw the outpouring of support on LinkedIn and other networks when he passed.

Ed Skoudis: Yeah. And there's other examples that we put in the book like that. Like here's somebody who had, you know, a complicated situation. Here's how we saw them work their own way through it in an ethical fashion, a very ethical fashion, and what we can learn from that.

Perry Carpenter: Yeah. So practical application on things like responsible disclosure. What about really, really dicey, controversial situations? Like, friends at the NSA that were kind of the impetus behind some of the book have had some fairly catastrophic disclosures that have happened with -- well, I don't want to name names and get into all the controversies behind that, but we've seen several issues with quote unquote "whistleblower" events and other tranches of data that have been put out in the public sphere. How do you wrestle somebody through whether that is ethically done, and whether the purpose behind that is pure, whether it's self-motivated, whether there's other factors? And then, you know, what would your -- well, I'll leave it at that. I mean, that's a complicated question. Right?

Ed Skoudis: It's a very complicated question and there are no easy answers on it. So I will say a few things, though, to that end. First, the NSA did not give us any edits or suggest any changes whatsoever. The book is Paul's words and my words. We did work with an author, Matt Linton, who helped us create the book. So there was a lot of involvement back and forth and back and forth on that. But the NSA had no editorial input or control, which was great. And they didn't want it. They -- I mean, from one perspective they thought enough of Paul and me to just say, "Go ahead and do that." And whatever happens happens. And, you know, that's good. They're not distributing the book. It's actually, you know, coming out completely, you know, separate from them. So that's good. You could also say, "Well, gee. They knew Paul and me and they knew we wouldn't do anything that would cause them much damage." So that's why they decided to work with us. I mean, that's just maybe a more cynical view of things. There might be some truth to it. I mean, obviously they don't want to do something that they think is going to hurt themselves. But when it comes to things like the whistleblower stuff and some of the events that have happened in the past, the book doesn't talk about them explicitly by name, but it discusses the, hey, if you're in an organization that is doing something that you feel to be unethical, here is a process you can apply to try to resolve that. And it starts internally. You know, raising something with your coworkers or your management. Going up. Finding an appropriate ally inside your organization. Maybe it's even -- and the book goes through this. Maybe it's even reaching out to the legal team within the organization because they might have different sensitivities than, say, the business people in your organization. Right? For whatever your business might be, whether it's the NSA or some vendor or something like that, it talks about the importance of having a mentor within your organization and outside of your organization. Now for the outside-the-organization mentor you still have to follow NDAs of course. You don't want to violate any of those. But to bring up hypothetical situations. And it talks about how to find a good mentor and build that relationship with a mentor so you can bounce ideas, so you're not just doing this on your own, you know, as a lone wolf. And then it does talk about how sometimes things break down inside an organization, and if you feel strongly enough about this, and you've interacted with your mentor inside the organization and outside the organization, and you can't find an ally inside the organization who will do what you have carefully considered to be the right thing, then you might have to rely on the whistleblower route. But the book says prepare to have your life blown up, because it's a game changer. We're not trying to scare you. We're not trying to say don't do this. But you have to be very judicious in how you approach that because everything changes for you. So the book goes through that, you know, and I think it's useful. I don't think we say anything in that context that is, like, stunning or surprising, but if you've not gone through it yourself, or had friends like I have who have had to go through that, maybe you wouldn't think about all those steps and how to apply those steps in a structured order. And the book does provide that.

Perry Carpenter: Okay. You know just a couple more questions before we end off. Is there a principle or a story within the book that you are like really, really excited to get out or happy that it's finally going to be put in print and out in the world?

Ed Skoudis: I -- oh gosh. There's a lot of them, you know. One of them is, you know, the discovery of a zero day vulnerability that the vendor doesn't think is that important. And then how to handle that well and appropriately so that people can look at what you did and say you're not an ambulance chaser. Right? You're not just doing this to get money. But you're also not an arsonist. Right? How do you -- how do you properly do that? And we talk about some scenarios that will look familiar to those who've studied this industry a lot. And I'm happy to have that out. And I think another real important point about the book is it's a structured approach to ethics. Here's a certain set of principles for how to be a good person. How can you in a structured way apply that to various scenarios like responsible disclosure, like, you know, what do I do when my organization, you know, is making false claims about its own security? Etcetera. Etcetera. Etcetera. And I mean, I think in this age of AI and concerns about the safety of AI, and a lot of people getting frustrated with their organizations who are not necessarily seeing eye to eye with them on safety within their own organization, I don't want to name names, but controversies are big and, Perry, you're all over this stuff, I do think it's worthwhile to mention this is not an IT ethics book. It's a cybersecurity ethics book. So there are some bigger IT ethics issues, especially with AI and so forth, that the book doesn't directly tackle. Our focus is on cybersecurity. Cybersecurity practitioners. Cyber defenders. Pen testers. SOC analysts. Digital forensics. Incident response. Those are the people this is written for. And it's written for practitioners as well as leaders. There is the bigger issue of IT ethics, which we don't take on just because there's so many more things. The other thing is it's focused on the civilian space, meaning commercial, educational, governmental. It does not get into the military aspects of this. Now I spent a lot of time working on the military aspects of this, but that's a whole different can of worms from an ethical perspective that the book does not touch, because again we're trying to keep it to a couple hundred pages, make it very practical. And if you do IT it just doubles the size of the book. If you do military it triples the size of the book. And also the military one would make a bunch of it not applicable to many of our potential readers.

Perry Carpenter: Yeah. I did just look up the page count. 224 pages. So that's about 200 pages of actual text and then indices and everything else after that. So very digestible. Very approachable for anybody. All right. So the last -- one last real question and then a couple just fun ones.

Ed Skoudis: Sure.

Perry Carpenter: So the last real one is, what's your biggest hope for the book now that it's out in the world? Do you want to see it picked up and used in classrooms? Do you see conferences around it? Are there training events? What's your big picture hope for where this goes?

Ed Skoudis: My biggest hope -- I mean, all the stuff that you said would be very nice. But my biggest hope is that practitioners would get the book and read it, and then it sits on their shelf so that when an ethical dilemma comes up, an explicit decision that has to be made, maybe they pick it up and flip to a couple of pages in it and just review it. I do hope it helps people make better decisions. Even just getting your mindset into an ethical framework. Because sometimes you don't even realize an ethical decision is before you. And you make a decision, but you don't even realize that you made a decision based on ethics. Just getting your mind primed for that, that's my big hope. I do hope that it is used in classrooms. We put together a pretty simple curriculum associated with it. So it lends itself to professors doing that. I also want to point out, you know, we're publishing this through Wiley, which is a great publisher. They do a great job with that. None of the proceeds from the book come to me or to my coauthor Paul. They go to pay for cybersecurity education for people. Yeah. So they go to people who are studying at an undergraduate level to be cybersecurity practitioners.

Perry Carpenter: That goes straight back to the beginning of the discussion. Right? Dealing with the skills issues.

Ed Skoudis: Exactly. So I make no money on the book. All of my -- any proceeds for me will be donated to educating cybersecurity people. I think that's an important thing and just sort of the nature of how we worked through this book. And it's the same with Paul.

Perry Carpenter: Okay. All right. So then a couple fun ones just since we have a couple minutes left. So these would generally be like ice breaker questions, but I'm an idiot and I always put these at the end. Given the amount of research and work that you do I'm sure you have tons of Chrome browsers or whatever your browser of choice is. You know just tabs and tabs open. If somebody were to without context look at your browser history what would be the most awkward thing to explain?

Ed Skoudis: Gosh. I have very disparate interests. I mean obviously I'm interested in technology. I take a fascination with AI. But I think the thing that people would say that's a little weird is I'm really interested in like the history of culture. You know, looking back 500 or 1,000 years, how did we get here? I think we're in an epochal time right now and this is -- it's the end of something and the beginning of something. And it's huge. So, you know, I get fascinated with little pieces of time like 1451 through 1455. A guy named Johannes Gutenberg had invented the printing press and had this idea that he was going to start printing books. What happened with that? How did he get there? How did we get there? So I mean I have a lot of stuff that I've looked at that. So the Gutenberg press was a fascinating thing for me over the last six months. Or, you know, I really got interested in how classical music was created. You know, how did we get from 1,000 AD of very simple chants and lyres and stuff to say Beethoven? And what was the process by which that came about? And why did it take so long and why was it so late compared to other forms of classical art? So people would look at that and say, "You're a weirdo." There you go. So there you go. Yes.

Perry Carpenter: No. That's really cool. All right. One emoji you either overuse or would just rather get wiped off of the ASCII set of available emojis.

Ed Skoudis: Oh, I think the overused emoji -- I mean so I often will do like the point up one all the time. I agree. I will often use -- I don't even know what the emoji's called, but you know the one that's like cringe that's got the little face and like teeth are like -- I do that one a lot. And then even like that plus plus the scream emoji. You know, the scream. Those are three that I way overuse.

Perry Carpenter: Okay. And the very last question. A book cybersecurity related or not that you recommend to anybody.

Ed Skoudis: Oh gosh. I mean that's a hard question. I read a lot. I have so many books. Okay. Yeah. I've got one for you. Just if you look at the books that I've read in the last six months everybody should read this. "The Mysterious Disappearance of Rudolf Diesel." It's a history based thing. In 1913 the guy who invented the diesel engine disappeared and died. Ish. You need to read the book.

Perry Carpenter: Okay. I'm intrigued now.

Ed Skoudis: It's so much fun. It's so much fun. You will not be disappointed. It's like a thriller and mystery kind of thing. It's one of the most fun and entertaining real world factual books that I've read in the last six months.

Perry Carpenter: Every time a cursor blinks on a screen, a cybersecurity professional somewhere is making a decision that could impact millions of people. When it comes to cybersecurity we can't afford to treat ethics as some kind of vague philosophical concept. Instead we need to recognize that our ethics form a shield to help protect our future. This conversation with Ed reminded me that in this world of ones and zeroes there is a critical third element. Human conscience. So as we close out, think about this. True security isn't just about impenetrable defenses. It also has to include uncompromising ethics. Security is more than just code and networks. It's about people. It's about our shared values and the choices we make, both individually and collectively. And with that, thanks so much for listening, and thank you to my guest Ed Skoudis. I've loaded up the show notes with more information about Ed, links to his book, and a few other resources you're guaranteed to enjoy. Oh. And I also put in a couple links where you can preorder my book "Faik: A Practical Guide to Living in a World of Deep Fakes, Disinformation, and AI Generated Deceptions." The release date is set for October 1, but when it comes to the world of publishing, preorders are everything. So please consider preordering right now. If you haven't yet, please go ahead and subscribe or follow wherever you like to get your podcasts. And please tell someone else about the show. That helps us grow. If you want to connect with me, feel free to do so. You can find my contact information at the very bottom of the show notes for this episode. Eighth Layer Insights branding was created by Chris Machowski at ransomwear.net, that's W-E-A-R, and Mia Rune at miarune.com. Our Eighth Layer Insights theme song was composed and performed by Marcus Moscat. Until next time, I'm Perry Carpenter signing off. [ Music ]