SpyCast | Ep 559 | 10.4.22

“Sure, I Can Hack Your Organization” – with Eric Escobar (Part 1 of 2)

Transcript

Andrew Hammond: Hi, and welcome to "SpyCast." I'm your host, Dr. Andrew Hammond, historian and curator here at the International Spy Museum in Washington, D.C. "SpyCast's" sole purpose is to educate our listeners about the past, present and future of intelligence and espionage. Every week, through engaging conversations, we explore some aspect of a vast ecosystem that looms beneath the surface of everyday life. We talk to spies, operators, mole hunters, defectors, analysts and authors to explore the stories and secrets, tradecraft and technology of the secret world. We are "SpyCast." Now sit back, relax and enjoy the show.

Andrew Hammond: Today, I'm up to over a hundred thousand felonies. If you were to look at what I was able to do and the number of users I was able to compromise, from my perspective, it's the coolest job in the entire world. This week's guest is professional hacker Eric Escobar. Eric has legally compromised, well, almost everything from health care and banking to technology and critical infrastructure through to amusement parks and next-generation military aircraft. Listen in for Part II next week. In Part I, we touch on what keeps Eric up at night, thinking like a professional hacker, hardening your attack surface - i.e., protecting yourself and your information - and plain English explanations of important cyber concepts like kill chains and zero-days. Hint - they're not the names of heavy metal bands. If you're a fan of the podcast, I would greatly appreciate it if you could leave us a kind review on Apple Podcasts. Make sure to check out this week's show notes for resources to learn more. Thanks for listening, and enjoy this week's episode.

Andrew Hammond: I was just wondering, just to start off, Eric - so you're a professional hacker. You attempt to compromise all different types of networks, from the military through to amusement parks. I guess one of the first questions that I had, just when I was thinking about this interview - you've seen quite a lot. Is there anything that keeps you up at night? Is there anything in the wee, small hours where you're like, that one really, like, scares me? 

Eric Escobar: You know, the ones that really keep me up at night are anything to do with critical infrastructure, which is, you know, obviously, Colonial Pipeline and all the havoc that that caused. Those are the ones that really just keep me up at night for a couple of reasons. I mean, really, if you look at any of our traditional, you know, different internet uses - Amazon, you know, Google, Apple, like, all these different services - what's the worst that's going to happen? You might lose some files. You know, you might need to recover from a backup. You know, your information might get out there. But with all the critical infrastructure, there's the potential for loss of life, which is way worse than anything that can happen in the cyber realm. So those are the ones - like, watching any critical infrastructure get compromised is really the thing that keeps me up at night because, you know, lives are in the balance, lives are on the line. 

Eric Escobar: And we do a lot of testing for critical infrastructure. And I've seen computers and machines that have been online, and not been taken offline, longer than I've been alive. So when you think about how often you have to reboot your machine, and it's like - well, these haven't been rebooted in my lifetime. So it's really interesting to see those types of things because, you know, they interact with really big, expensive hardware. And so there's a Catch-22 that happens where you can't really take the machine offline to do maintenance on it because it's critical infrastructure. So then how do you test it to make sure that a hacker can't take it offline, or how does maintenance get done on it, right? So to answer your question, critical infrastructure is what really keeps me up at night because of the actual physical harm that it can do in the world. 

Andrew Hammond: Wow. And before we met today, I mentioned that some of our listeners are involved in this business - some of them are involved in intelligence business and some of them are just people on the street that love a good spy story or that are trying to get up to date with what's happening in the world. So just to give them a better understanding of what we're talking about here, at the Spy Museum, we have a shard from the Aurora Generator Test in 2007, which basically is a test to prove that a piece of code can affect the physical world. And basically, to cut a long story short, they blew up a generator. So something that's intangible can affect the tangible world. So that's ultimately what you're talking about. Is that correct? 

Eric Escobar: Yeah, that's my actual job is doing exactly that. Not - like, not too dissimilar, a couple weeks ago, we compromised a - what's it called? - an oil refinery. So that same exact, like, hey, we're able to access, you know, industrial control systems. And if we touch the wrong computer, if we do something wrong, things go boom. And so that's why it's my fear 'cause exactly that - that code can affect the real world in those - you know, in those circumstances. 

Andrew Hammond: OK. Wow. And how did you get into this business, Eric? How did you end up - because your background - actually, like our current intern's father, who's a civil engineer, your background's in civil engineering. So tell us a little bit more about that transition. 

Eric Escobar: Yeah, absolutely. So when I was in high school, I was like, man, I want to do some engineering. I'm good with math, good with science. So I did, like, the survey of all the different types of engineers that there were. There's, like, a class that my high school offered. And so it was either computer engineering, or it was civil engineering. I was like, you know what? I don't want to be behind a desk all day on a computer, so I'm going to go into civil engineering and build real things out in the real world. So I go to school, get a master's degree in civil engineering, get my - you know, I'm a registered civil engineer still in the state of California, you know, in the United States. And I started working for a couple of years, and they kept sending me out to these very remote places. And I was like, hey, I'm getting married. I'm going to have a wife and kids here soon. I can't be, like, out in all these, you know, remote places doing random work out in the field. And so then as luck would have it, my college roommate, his dad was, you know, head of security for a large company in Silicon Valley. And he goes, you know what? You got the mind for this. Do you want to - like, I'll replace your engineering salary if you just want to give this a go. And so I looked at - you know, I talked it over with my soon-to-be wife. And I was just like, so how are we feeling about this? And she's like, well, I mean, no time like the present. So I made the hop from civil engineering into the security industry, and I never looked back. I still do some random, you know, engineering stuff on the side if there's a unique problem that arises. But yeah, it's a weird, winding path. And it all comes back to who you know and the connections that you make. And you never know how they're going to re-form or impact you later on in life. 

Andrew Hammond: So before this interview today, I was at the National Cryptologic Museum, which is reopening after a refurb, and in the early days of American cryptography, they didn't test so much for what people knew at the time. They tested for particular ways of thinking. So I was just wondering if you could tell us a little bit more about - what are some of the similarities between the way that civil engineers think and systems engineers and people that are working in your field think? Like, what are some of the things that carry over that mean that if you've got that, if you think in that particular way, you'll be good in this field? 

Eric Escobar: Yeah. I think it's - it just comes down to problem-solving and enjoying a good puzzle. You know, and when you're a civil engineer, you're looking for, hey, I need to get water from this dam to this hydroelectric facility. Or I need to build this building, and it needs to hold a thousand people. And so you're looking at - you know, you have constraints of time, resources, you know, budget, all of that stuff, right? And so you're trying to solve that problem of - how do I achieve this objective with the limited constraints that I have? And that's exactly what hacking is. In the same regard of, hey, I'm trying to, you know, compromise your active directory, compromise your oil refinery, break into this aircraft, and I have a very limited amount of information. How do I achieve this objective? So really it's just problem-solving and always loving a good puzzle. 

Andrew Hammond: So - and with this field that you're in and being a professional hacker, is there any space for people like me and Erin, people that are more humanities type folk, or is it still a preponderance of people that are, like, math, engineering, that type of mind? 

Eric Escobar: No. I mean, honestly, it's the full spectrum. The amount of people that we have on our team - so my background's civil engineering, right? If you were to take a survey of our team and say, what are your backgrounds? And we have several English majors, one of our best hackers used to be an RV salesman. We have a Ph.D. in physics. So we all kind of meander and find our way into this career through - you know, everybody's story's completely unique, just like mine is. But a lot of us come from the humanities aspect of it. And really the aspect there that I think is most interesting is the ability to communicate to your clients who are running these systems because yes, I'm very technical, but if I can't explain something in such a way to somebody who is maybe not as technical as I am, then my complete job has failed and fallen apart if I can't communicate that accurately. And so I always make the joke to my wife, who's an English major, that I never thought that I would be writing this - like, I went into engineering so I wouldn't have to write a word a day in my life. And now I read, you know, several thousand pages of reports as I QA them for our team. And then I have to write several hundred page reports a week for our clients. So, yeah, the humanities, it's definitely one of those things, just the ability to communicate, the ability to pull from historical, you know, precedent and all that stuff, pretty much - the way that I see it for this industry is everybody has a unique skill set that is, you know, wildly in need, even if you don't quite realize it yet. 

Andrew Hammond: One of the things that I find really interesting about that industry is that a lot of the barriers to entry that exist for other fields are different in the cybersecurity realm. So, for example, if you want to be on the Supreme Court, you have to have been to law school. To be on the Supreme Court, you don't technically have to have been to one of three or four law schools in the whole country, but it's pretty much a certainty that you're only going to get onto it if you've been to one of those three or four law schools, if we look at it historically. But it seems to me that the barriers to entry are different in this - in the field that you're in. And it also seems like - I could be naive or idealistic - but it seems like it's more meritocratic in that sense because it's like, you turn up in a kitchen as a chef. People don't really give a monkey's where you went to chef school. They're going to judge you based on what you cook for them. So I wonder if you could just talk about that a little bit for us. I find that really interesting about the field that you're in. 

Eric Escobar: Yeah, you hit the nail on the head. It's complete meritocracy as far as - if you're good at what you do and you can communicate it, you'll rise to the top. And that's absolutely what I love about it. And it's kind of funny because I have several of my friends who - they see - Eric, you know, what - you know, this is what you do? Wow. You know, you get to work from home? That's awesome. You know, the pay's great. That's even better. And I'm like, you know what? You can do it, too. And so I have a good friend of mine who was once a pastor, now turned hacker - and same thing. He's driven, he wants to do the work, he loves solving puzzles, and so he can apply all of his communication skills and all of his other soft skills to the actual technical aspect of it. 

Eric Escobar: My wife's college roommate is - has a master's degree in biomedical engineering, and I was like, you know what? You could also do this too. And so sure enough, I keep trying to get, you know, all my friends and family - and it's kind of a joke now where it's like, oh, Eric's going to try and recruit you, huh? But that's exactly it. The barrier to entry - you don't need a four-year college degree to do this. You know, there are - when I started, there was no such thing as a cybersecurity program. And so if we did hire anybody, the closest match potentially would be a computer science degree. 

Eric Escobar: But really, what we care about most, at least on our team, is your ability to communicate and your ability to solve problems and your ability really just to think on your feet. And those are - you know, it's harder from one aspect because you can't teach it. You can't just decide, I want to do this and be incredibly successful at it. You might have a four-year college degree in computer science. Heck, you might even be a Ph.D. in computer science. But if you aren't going to be able to work a problem, think on your feet and communicate properly, it doesn't really matter what your credentials are. You can't teach it. A lot of this is something that you have to, you know, innately have as a part of your personality. 

Eric Escobar: So from one end of the spectrum, it's awesome because somebody who could be good at it in, you know, a year, you could show them everything that they need to know, and they'll be off to the races. But on the other aspect of it, there are some things you just can't teach, you know, just like any other profession or any other field. And so in that regard, I absolutely love it because if somebody shows a proclivity to it, if somebody shows that, hey, they're willing to invest the time to learn a new skill, you know, the sky's the limit. We need more adversarial testers. We need more computer science folks from all walks of life, right? So, yeah. 

Andrew Hammond: And just thinking about this historically, when you see things like this in the past, quite often there's then a movement towards professionalization and certification and codification. And then those barriers get reestablished where, if you want to get into field X, you'll have to have ticked all of these boxes. Do you see that in tech, or do you ever think that that could really take off? Or do you just think that it wouldn't really work for this particular field? 

Eric Escobar: Yeah, you definitely see it now. So you see now - that now there are degree programs or certifications, certification bodies, you know, all these different things to, like you say, try to make it more professional, try and put a suit and tie on a hacker, right? I think to a degree, yeah, that - they will, you know, shut out potentially some, mostly because if you're trying to hire for - you know, if you have a job opening for, you know, hacker for hire, and you see that you have a hundred and - you know, a hundred potential employees, and 50 of them have a four-year degree in computer science or, you know, security, if you're trying to just find a way to filter that down, you might just filter by that and you might lose a bunch of great candidates. 

Eric Escobar: But if you're a human and you're trying to make sense of, how do I, you know, stack this - you know, how do I sort and filter this stack of resumes in front of me? - that might be a way that there could be some gatekeeping. But I even still feel like with that, that there are ways to break into the industry even if you don't have that four-year degree, even if you don't have that going forward, just because, like I said - or like you said, it's a meritocracy. And if you have skills, if you have ability, you'll eventually find your way to a place that's going to appreciate and want and need those skills. 

Andrew Hammond: And when did you first realize that you had the chops to do this or - not just the chops. I don't want to embarrass you, but you've - you know, you went on to become and still are a very successful hacker. Like, when did you realize that, wow, this is somewhere where I can excel as opposed to just, yeah, I guess I'll be able to keep a roof over my head and, you know, stay out of jail and stuff like that? Yeah, when did you - when did it dawn on you that, you know, this is somewhere where you could distinguish yourself? 

Eric Escobar: You know, I don't think it has yet. Have you ever heard of... 

Andrew Hammond: OK. 

Eric Escobar: ...Have you ever heard... 

Andrew Hammond: That's good. 

Eric Escobar: ...Of imposter syndrome? 

Andrew Hammond: Yeah, I've got it. 

Eric Escobar: Everybody in this field - I shouldn't speak for everybody - but I would say if you surveyed this field, everybody feels like - I feel like they're an imposter to a degree. And for those in your audience listening, imposter syndrome is where you feel as if, like, man, is somebody going to figure out that I don't know what I'm doing? There was one time my wife - you know, she walks in my office, and she's like, are you just Googling how to do something for your job? I'm like, absolutely. And she's like, what if your coworkers, you know, found out or, you know, like, you know, wouldn't that be kind of funny? I'm like, oh, no, we all - like, we're all Googling all the questions. Nobody knows it all. And so really, to answer your question, like, I - like, some people might look at me and be like, wow, Eric is a great hacker. He compromises and breaks into all these large companies. And then I have the people that I look up to. I'm like, oh, my gosh, like, you could never call me a hacker compared to, you know, these individuals that I've met and these individuals that I know. Like, they're the real deal. I'm just an imposter here. So I really don't think it's quite hit. I mean, it does pay the bills, don't get me wrong. And I don't think I'm going anywhere any time soon. But, you know, you're - what's the saying? If you're the smartest one in the room, you're in the wrong room. And I don't think I've ever been (laughter) in a room where I've been... 

Andrew Hammond: (Laughter). 

Eric Escobar: ...The smartest person. So, yeah, hopefully that answers that question, but I really just - yeah, every day feels like I'm an imposter to a degree. 

Andrew Hammond: And in this field, as well, how much of it depends on current knowledge, and how much of it just depends on this way of thinking and this skill set? So, for example, if you - say you went into a - God forbid, say that a - let's not say you. Let's say a hacker went into a coma for 10 years and then woke up. There's different technology, different problems, but a lot of the underlying fundamentals are the same. How difficult would it be to get back up to speed? Is that, like, you need to start all over again, or is it just, OK, you know how to think the right way, now it's just the case of a slightly different technology or slightly different code? 

Eric Escobar: I think you could take any person who is adept at solving challenges with constraints, and they could get up to speed in this job - in a year, be able to talk the talk, walk the walk, and in two years, be able to hold the conversation in a room of professionals, and nobody would've known that you'd never touched a keyboard a day in your life. So realistically - my view anyways - is that it is not about the tools, it's not about, you know, how the systems interact and operate. It just comes down to being able to, you know, think on your feet. It comes down to being able to work through a problem with those constraints. And anybody with that mindset, I think that they could go into a coma for a hundred years, wake up and still have that same mindset of, like, OK, I may not know anything. I may take some time to get up to speed. But it would not be, like, the nail in the coffin of like, oh, I waited too long. Like, this has all gotten away from me. 

Eric Escobar: Because technology changes so rapidly - like, I went on paternity leave last year. And so, you know, I'm not hands on keyboard for, like, three months as, you know, taking care of kids and taking care of the family. And then I get back to it. I'm like, whoa, look at all these new attacks. This is really cool. You know, look at all these things that are now available, and look at all these things that previously, you know, we had no capability to test. And now, you know, oh, wow, we bypassed full-disk encryption on the laptop. That's incredible. So it really - I think it is just - if you're a problem solver, you could do this job, no problem. Doesn't matter when you decide to pick it up. 

Andrew Hammond: And full disclosure for our audiences, Eric was talking there - both Erin and I were on Craigslist looking for apartments in California because I think we're both going to have a career change coming up quite shortly (laughter). 

Eric Escobar: Please do. And I don't live in the cool part of California. I live in a place called Fresno, which is the agricultural, like, you know, capital of the world, so... 

Andrew Hammond: Heartland. 

Eric Escobar: Yeah, it is - I see way more cows than I do waves. 

Andrew Hammond: It's "Grapes of Wrath" country, right? 

Eric Escobar: Not wrong. 

(LAUGHTER) 

Andrew Hammond: So this is really, really fascinating. Just to take a step back, Eric, tell us a little bit more about Secureworks - like, the company that you work for - and tell us what you do there. So we know you're a professional hacker, but help us understand the connection between you and Secureworks. What does your company do? 

Eric Escobar: Yeah, absolutely. So, gosh, that is - I feel like that's a loaded question, right? Like, our - I'm sure all of our marketing team and sales team are looking at me like, come on, say the right things, Eric. 

Andrew Hammond: (Laughter). 

Eric Escobar: But essentially, Secureworks is a security company. And we have a bunch of different departments within our company that all, you know, take care of one aspect of security. So I work on what's called our SWAG team, or Secureworks Adversary Group - that's our acronym, is SWAG, which is kind of cool. And basically, we're the adversarial team. Clients come to us and say, please try and break into us, tell us how you broke into us, so we can patch it before a nation-state, you know, or another threat actor is able to break into them. And so that's why it's the coolest job in the entire world from my perspective, because on any given day, I'm committing several thousand felonies if I didn't have permission to do what I'm doing. Today, I'm up to over a hundred thousand felonies if you were to look at what I was able to do and the number of users I was able to compromise. And so as far as, like, looking at the broader part of Secureworks, I'm in the adversarial section where we attack our clients, you know, to try to make them more secure. 

Eric Escobar: But what's kind of neat is that we have a bunch of other, you know, divisions, I guess, is the best way to put it, within our company that do different aspects of security. So we have our incident response team. So basically, if your company were to get breached and find out, oh, no, you know, you've been breached, you can call us. Our guys will parachute in and basically say, hey, we're going to evict the threat actor, find out how they got in, patch the hole and make it so that your company can function again, right? If there's ransomware, how do we recover from backups? Is there, you know, potentially a recovery key somewhere? So that's incident response. So I break in. Incident response responds when somebody like me that's not friendly breaks in. 

Eric Escobar: And then we also have our CTU team, our Counter Threat Unit. Counter Threat Unit - they're responsible for seeing, what does the adversarial landscape look like? What are nation-states doing? What tools and techniques are being used by other threat actors that aren't friendly, you know, out in the wild? And then can we take what we've learned from there and apply it to our defensive products so that we're able to make sure that an incident never happens 'cause we catch it before it does, right? So you can think of them as, like, you know, the researchers in the field sampling all the things that are bad, taking it back home and writing, you know, different definitions to be able to catch any of that malware going forward. Or, you know, it doesn't have to be malware. It could be, more often than not, how threat actors operate and, you know, their operating principles. 
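The "definitions" Eric describes - rules the Counter Threat Unit writes so defensive products can catch known malware or threat-actor tradecraft going forward - can be sketched as simple indicator matching. Everything below (the hashes, the domain, the event field names) is invented for illustration; this is not Secureworks tooling or real threat intelligence.

```python
# Invented indicators of compromise (IOCs) - placeholders, not real intel.
KNOWN_BAD_HASHES = {"9f86d081884c7d65", "2c26b46b68ffc68f"}
KNOWN_BAD_DOMAINS = {"update-check.example-c2.invalid"}

def scan_event(event):
    """Return a list of (indicator_type, value) matches for one telemetry event."""
    hits = []
    if event.get("file_hash") in KNOWN_BAD_HASHES:
        hits.append(("hash", event["file_hash"]))
    if event.get("dns_query") in KNOWN_BAD_DOMAINS:
        hits.append(("domain", event["dns_query"]))
    return hits

events = [
    # Benign-looking event: unknown hash, internal DNS lookup.
    {"file_hash": "aaaa000011112222", "dns_query": "intranet.local"},
    # Event matching both a known-bad hash and a known-bad C2 domain.
    {"file_hash": "9f86d081884c7d65", "dns_query": "update-check.example-c2.invalid"},
]
for e in events:
    print(scan_event(e))
```

Real-world detection content is far richer than set membership - it also encodes behavior (how threat actors operate, as Eric notes) - but the pipeline shape is the same: field research distilled into machine-checkable rules.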

Eric Escobar: And then we have our flagship product, which is Taegis. Taegis is like - it's an XDR platform. XDR - extended detection and response - is basically a fancy way of saying it is your enterprise way to monitor how threat actors are, you know, potentially trying to pivot into your network, how they're trying to access your network and, you know, what does that look like, and can they catch that threat actor before - you know, before they're able to do anything? So it's kind of a fun cat-and-mouse game that we - you know, all those different divisions play against one another because incident response is like, oh, man, like, how could we find out, you know, what you're doing in your network? And, you know, there's always a cat-and-mouse game that goes with our Taegis platform of, hey, can we bypass our own security product, right? And so it's a fun game to go back and forth, and, like, OK, you know, we bypassed it here. Then they patch it, and then they can detect it and, you know, just going back and forth. 

Eric Escobar: But really, it makes everybody sharper on our team and same thing with Counter Threat Unit. We're pulling in stuff that's being used in the wild so we can see, hey, what is - you know, what are threat actors and nation-states and other, you know, adversarial groups - what are they doing? What do we see? So that's in a very - like, that is a very quick and concise, you know, summary of what we do. But it's really fun because you get it from all different angles. You get to see what's happening in the - basically, cyberspace, on the internet. 

Andrew Hammond: Wow. And how do you spell that, Taegis? 

Eric Escobar: I should know this - T-A-E-G-I-S. 

Andrew Hammond: So this is, like, a model for just protecting a network? Is that correct? 

Eric Escobar: Not just protecting a network, it's protecting your (laughter)... 

Andrew Hammond: Oh, sorry, not just protecting network, yeah (laughter). 

Eric Escobar: Yeah, so it's - I did spell that right. I had to look it up just to make sure. Yeah, so it essentially looks at your network holistically and basically says, not just, hey, what is happening to the server, but do we notice weird patterns? Do we see machines that are maybe not connected to other machines? Do we see authentication attempts that shouldn't be coming from certain hosts? It does a wide range of different things to look, not just at one single endpoint device - not, hey, was this one computer compromised? - but for evidence of compromise throughout your entire network. 

Eric Escobar: Because, oftentimes, if I'm going to break into - like, to get a little bit technical, if I'm going to break into your network, I typically don't like to use malware. I typically don't like to use, you know, some tool that's going to get captured. What I typically do is find a way to gain credentials - you know, someone's username and password - and then I use their user account to basically do everything throughout the rest of their network. So there would be no malware to find 'cause I'm using their network and their accounts as they should be used and finding vulnerabilities and weaknesses in their permission and authentication schemes. That, you know, is basically undetectable 'cause I'm not using malware. And so there's a lot of pattern matching, a lot of, you know, really technical stuff on their side of the house, you know, to prevent and discover things that are anomalies, so to speak. 
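Eric's point - that credential misuse leaves no malware to find, so defenders must spot behavioral anomalies instead - can be illustrated with a toy baselining sketch. All usernames, hostnames, and events here are hypothetical, and real platforms use far more sophisticated statistical and behavioral models; this only shows the core idea of flagging a valid account authenticating somewhere it never normally goes.

```python
from collections import defaultdict

# Authentication log from a "known good" baseline period: (user, host) pairs.
BASELINE_EVENTS = [
    ("alice", "mail-01"), ("alice", "files-01"), ("alice", "mail-01"),
    ("bob", "build-01"), ("bob", "files-01"),
]

# Fresh events to screen, e.g. from today's logs.
NEW_EVENTS = [
    ("alice", "mail-01"),         # normal for alice
    ("alice", "domain-ctrl-01"),  # alice never touches the domain controller
    ("bob", "build-01"),          # normal for bob
]

def build_baseline(events):
    """Map each user to the set of hosts they historically authenticate to."""
    baseline = defaultdict(set)
    for user, host in events:
        baseline[user].add(host)
    return baseline

def flag_anomalies(baseline, events):
    """Return events where a valid account touches a host outside its baseline."""
    return [(u, h) for u, h in events if h not in baseline.get(u, set())]

baseline = build_baseline(BASELINE_EVENTS)
print(flag_anomalies(baseline, NEW_EVENTS))  # [('alice', 'domain-ctrl-01')]
```

Note that every flagged event used legitimate credentials - exactly why, as Eric says, there is "no malware to find" and detection has to come from network-wide pattern matching rather than endpoint scanning.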

Andrew Hammond: OK, when you say a security company, you mean cybersecurity. Is that correct? 

Eric Escobar: Cybersecurity company. 

Andrew Hammond: Yeah. 

Eric Escobar: Although we do do - so as part of our adversarial testing, we also do physical security as well. So I've, you know, done the whole secret agent, break-in, clone badges, you know, go in at night, pick the lock and all that stuff as well. 

Andrew Hammond: Oh, you have? Wow. OK. 

Eric Escobar: Yeah. 

Andrew Hammond: We'll be right back after this. 

Andrew Hammond: One of the things that I was - that I'm interested in is, you know, with this field - you know, like, "SpyCast" is on the CyberWire network now. And we've done traditional intelligence espionage and people kind of get that, more or less, OK, that's over here. And then they sort of get cyber. They're like, OK, that's computers. That's over there. I'm increasingly interested in the places where they overlap. And it seems that, you know, a lot of people are like, OK, well, the NSA - like, that's an area where, you know, both of them overlap. And other than that, it gets a bit fuzzy. I'm not sure about it. 

Andrew Hammond: But, you know, when you hear the term InfoSec - like, information security - I mean, that's what - a lot of what intelligence agencies do. Or when you were speaking about, like, breaking in without using malware, it's - like, intelligence agencies as well, they - I mean, sure, you can do some kind of brute force attack and get information, but if you scream out that you've just done something, then they're going to go in, change all their codes and do a whole bunch of countermeasures to try to protect themselves against what you've just committed against them. So I don't want to say that both of them collapse into one another, but it just seems really interesting to me, all of the places that they overlap. And I don't know if I've ever read a book or something that adequately explains that overlap. But do you have any thoughts about that? 

Eric Escobar: Yeah. So InfoSec - like, I think all the - like, the industry terminology is always kind of funny 'cause you say, oh, I'm in InfoSec. And everybody's like, I don't know what that means. And it's like, that's really fair. And so when you think about it, you know, you expand it out, information security. And so a lot of people are like, oh, so you safeguard, you know, the typical things - right? - your health data, your financial data, your - you know, all these different things that you think of when you think of, like, oh, my online accounts is what is being safeguarded. Well, it's interesting when you think about it. You know, so you mentioned, you know, ways that they overlap. Really just information - you know, if you're a spy agency, if you're a nation-state and you're trying to discern information, there's a lot of guesswork - a lot of educated guesswork - that goes into that. 

Eric Escobar: And so an example that I always kind of like to think about realistically - if you look at, say, the United States political landscape - totally not a hot-button issue. If you are a foreign, you know, nation and you're trying to understand, hey, what - you know, what are the political parties, you know, angling to do? What's going on here? Well, think if they were able to break into, say, the, you know, manufacturer of, like, flags - right? - of little American flags that get waved around at campaign rallies. Well, if you knew how many orders of each of those flags are going to respective, you know, different political campaigns and parties and all that stuff, well, now you've built up - just with that information of orders of flags, if you're able to compromise a small manufacturing place, now you know all the ordering, all the processing information of how that goes, typically logistics of who, how, where and why those flags are going to be in that position. You typically know how many are in the war chest or how many people they're expecting at a campaign rally, right? 

Eric Escobar: And so there's - it's one of those things that it's information security 'cause you don't necessarily know how the information is going to be used. You know, you might have a threat actor that breaks in, trying - to that same flag company - trying just to steal, you know, email addresses so that they can send out, you know, phishing emails just willy-nilly. Or you might have a nation-state trying to compromise that same flag factory for the purpose of trying to divine, what does the political landscape look like in the United States for the upcoming midterms? There's a lot of hypotheticals. And then there's a lot of, like, you know, where things actually overlap, like you said, with NSA and other intelligence agencies. 

Andrew Hammond: And even for - it seems to me that even for - like, for someone like you that's in the private sector, this is still part of your world because the companies and so forth that you're doing this penetration testing for, this hacking for, it seems to me that quite a few of them will be trying to protect themselves against nation-state actors like Russia and China and hacker groups that are affiliated with intelligence agencies from those countries. So, I mean, that's quite interesting, as well. It seems to me that whether, you know - you don't have a choice in the matter, almost, because nation-states have a large amount of resources. They can put manpower to a problem for decades and decades, theoretically, or even longer. So people like you are up against, this - these foreign intelligence agencies. It's not, like, a matter of choice. It's just - it just is. That's quite interesting to me. 

Eric Escobar: Yeah. And that's - I mean, you hit the nail on the head - is - the way that I always like to think about it, is that if I said, hey, Andrew, you know what? I'm going to send 12 special forces operators to come break into your house. And if I said that and then, you know, it got plastered all over the news - oh, my gosh, can you believe Andrew? He got compromised 'cause 12 Navy SEALs kicked in his door. Everybody would be like, well, yeah, it's a normal person against 12 well-trained Navy SEALs. Of course that's going to happen. 

Andrew Hammond: What do you expect (laughter)? 

Eric Escobar: But realistically, in the cyber domain, it's even worse because you have nation-states that are funded with millions and billions of dollars potentially targeting, you know, a small company, a medium-sized company, even a large company. Even if you look at a large company and you said, hey, 12 Navy SEALs, kick your door - your way in the door, you know, a lot - you know, the news, the - you know, the media apparatus would be a lot more friendly, saying, oh, well, yeah, no one would expect that they should be able to withstand an attack against a nation-state. But that's what we're asking everyone to do. I mean, that's what we're asking you and I to do every time we're trying to protect our email, every time we're trying to use encryption for anything - login passwords to Facebook, Instagram, all your social media accounts. All of these things have to be able to defend themselves against, you know, the latest and greatest technology threat actors and, you know, the equivalent of the digital Navy SEALs. And that's exactly it, is that it's - is that we are having to, you know, do this, not by choice, but because this is the state of the world. 

Eric Escobar: And not only - you know, to break down the analogy even more of, like, 12 Navy SEALs kicking in your door, they can do that from their respective countries. They don't even have to, like, get out of bed, you know, to potentially perform that attack. Whereas if they were, you know, physical, actual operatives, they would have to. And so that's just the reality of where we live, is that, you know, all of this information is constantly being attacked 24 hours a day, seven days a week from, you know, like, hacking groups that are built out of teenagers, right? The most recent hack, I think, of Uber was tied to Lapsus$, which is a bunch of teenagers, right? I could be totally wrong on that. I'm pretty sure - I think that's right, but the analogy stands up. It could be anybody. It could be a nation-state. It could be a bunch of teenagers across the world. 

Eric Escobar: So it is - it's one of those things that when you frame it in that mind, you're like, yeah, that's a really hard problem because it turns out countries have a lot of resources that if they want to break in somewhere, they can apply hundreds of people, potentially, to focus on one problem, you know, to put in all that brainpower into - in trying to break in. 

Andrew Hammond: I like that analogy, the Navy SEALs. I was also just thinking that the Navy SEALs can't break your door down while eating a bag of Cheetos, but a hacker overseas can, right (laughter)? 

Eric Escobar: Absolutely. It's funny. There's been several times where it's like, oh, I'm making dinner or, you know, like, got to watch the kids right now, you know, before - if they woke up from their nap early. So it's like, wow, I'm breaking into a Fortune 500 company while, like, hanging out with my 4-year-old. 

(LAUGHTER) 

Andrew Hammond: That's funny (laughter). And this is where this term APT comes from - right? - Advanced Persistent Threats. That's a nation-state that can just throw relatively infinite amounts of money and time at a problem. 

Eric Escobar: Yeah. And it could be a nation-state. It could be combinations of nation-states. It could be really well-resourced, you know, threat actors. So it doesn't necessarily have to be a nation-state. But yeah, advanced persistent threat - and they're typically named so if - you know, a lot of different threat actors, they have, you know, similar processes. They have similar techniques, similar tools. And so you can kind of aggregate those. So, like, what our CTU team would do is they'd basically say, OK, there is this hacker group that we don't know anything about, or, you know, we don't necessarily know, like, oh, this is who they are. But we can tell from their attack pattern and, like, what they're doing that this is probably a similar group, and they might have some crossover. But it doesn't necessarily need to be a nation-state - definitely well-resourced, though, and definitely professionals in the field of what they're doing. 

Andrew Hammond: Wow. And one of the things that I wanted to ask as well was, can you break down this term kill chain for us? I've heard this, like, used quite a lot, and I know that in the realm of cyber it's quite important. And for some of our listeners, this will be, you know, something that trips off the tongue. But for others, they'll be, what the heck are they talking about? So what's a - what's the kill chain? 

Eric Escobar: Yeah. So I'll give you a brief example with a story of a test that we recently did. So kill chain in, like, a one-sentence thing is basically how you're able to achieve your objective, how you're able to compromise somebody from the beginning to the end. So if you're reading a book, it's just a quick story, a quick narrative of - how were you able to do it? So for, like, one of our tests, we're trying to break into this medical facility, and we're trying to break into it from the public internet. So just, like, any other internet user has the same level of access as we do, and they give us, hey, here's our target computers to break into. We found, hey, there's a page publicly available. It says, have you forgotten your password? Click here to reset it. You only need to answer some security questions. So we found a list of users on LinkedIn, and we compared them to social media profiles such as Facebook, Instagram, Twitter, Snapchat, all these publicly available social media things. And we found some of the questions were things you can probably look up on social media. So one of them was, like, favorite superhero - found the person's Facebook page, and instantly - obviously it's Batman. And so it went from there, and so we had one of their security questions already. The next question was their maternal grandmother's maiden name. So something that seems pretty abstract until you stumble upon an obituary that contains that same information. 

Eric Escobar: So now we're able to reset this user's password. We reset their password. We're able to log into their VPN, so the way that they remotely access their company. And then from there we're able to impersonate them on the network. We're then able to access a file share on their network. And now from the public internet, we're accessing a server within their internal corporate network. Turns out that file server had, you know, some vulnerabilities with it, and we were able to basically access a more secure server. So we're able to go into - from one file server into a more secure server, which contained the entire company's username and password database. And so I was then able to extract all that information sitting from the public internet. So that's essentially what a kill chain is - all the different steps that you use to achieve that objective of whatever the client wants or however a company was compromised. So does that make sense? Hopefully, I explained that OK. 
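The engagement Eric walks through can be sketched as a simple model of a kill chain: the attack reaches its objective only if every link holds, so a defender who stops any single step breaks the entire chain. This is a minimal, illustrative Python sketch - the step names paraphrase the story above, and the function is hypothetical, not a real tool.

```python
# Illustrative sketch (not a real tool): the kill chain from the story
# above, modeled as a list of links the attacker must complete in order.
kill_chain = [
    "harvest employee names from LinkedIn",
    "answer security questions from social media and obituaries",
    "reset the user's password",
    "log in to the corporate VPN as that user",
    "access an internal file share",
    "pivot to a more secure server",
    "extract the username/password database",
]

def attack_succeeds(defenses_stopped):
    """The attack reaches its objective only if NO link was stopped."""
    return not any(step in defenses_stopped for step in kill_chain)

# With no defenses in the way, the attacker runs the whole chain.
assert attack_succeeds(set())

# Blocking even one middle link (say, the VPN login) breaks the chain.
assert not attack_succeeds({"log in to the corporate VPN as that user"})
```

This is the point Eric makes to clients later in the episode: the report lists the key steps, and stopping the attacker at any one of them would have prevented the compromise.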

Andrew Hammond: I think so. So it's like going over a bridge and then at each stage of the bridge, there's the potential to be stopped or the potential not to complete your journey. And you have to keep completing every 50 meters to get to the end. And the kill chain is just if you can stop them over the length of the bridge, getting to the end of the bridge then - does that make sense, or is that not a good analogy? 

Eric Escobar: Yeah, yeah, yeah. That's very similar of just - you're trying to find a path to achieve your objective. And for us, the kill chain is all the different steps that you achieve that objective. And then what's nice is that when we generate a report for our clients, we basically say here are, like, you know, the 10 or 15 key steps. Had you stopped us at any point along the way in these steps, then you would have potentially stopped us from the compromise. And then - you know, so that's like a chain if you think about, like, a physical chain. But then it's - also can be more like a web from the standpoint of, like, there's more than one way to - you know, to potentially compromise. And so there's all the, you know, additional kill chains potentially and how those stem and weave. But yeah, that's the nail on the head is - where can you get stopped along that path - along that path of compromise? 

Andrew Hammond: And help us understand a little bit more as well about hardening the attack surface. So that's one of the terms that I've heard. How do you harden an attack surface? Break that down for our listeners. 

Eric Escobar: Yeah. So say you're just a standard computer user, right? You have your - just your standard laptop. And let's talk about, like, hardening your laptop or - you know, this sounds like a really, oh, we got to harden, you know, secure, you know, batten down the hatches kind of thing. And really, it's not that dissimilar from just, like, if you're in standard, you know - large companies try to harden their systems just like you could harden your laptop. So, hey, maybe the password that you use to log into your laptop, maybe that's just, you know, a four-digit code. Well, if you're trying to harden it and make it harder for somebody to get in, instead of having a four-digit code, maybe use a sentence that's, like, 15 characters long. So it's easy to type in your keyboard. That would be one way that then I couldn't just potentially guess what your password is, you know, if it's four zeros in a row. Other things that you might do is, hey, I'm not going to, you know, potentially connect to, like, public Wi-Fi or if I am, I'm going to use something like a VPN to protect my internet traffic as it leaves my computer. Other things that, like - trying to harden yourself might be something physical. I'm not going to leave my laptop in my backpack in the back of my car when I go to the grocery store, right? So it doesn't have to just be in the - you know, the digital domain. And there's a lot of things like that. Like, just enabling something like multifactor authentication, which is, like, if you log into your bank, you're logged into something else where you get, like, a text message, or you have to, you know, hit a button on your phone. 
Just adding those simple things is hardening, you know, your attack surface, is limiting your attack surface, so that if I'm trying to break into, say, your Facebook, your Gmail, your Instagram and there's a second-factor authentication, I would need to - I would basically need to steal your phone in order to, you know, get that second-factor authentication. And same thing if you're using a hard, unique password that I couldn't just guess - well, good luck then. There's something else that I don't know. So that's really all that it is, is a really simple concept of just - you're limiting the way that somebody like me is going to be able to easily break into you and just creating more and more barriers of difficulty. 
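Eric's advice to swap a four-digit code for a sentence-length password can be made concrete with a back-of-envelope guess-space calculation. A minimal sketch, assuming pure brute-force guessing over the raw character space (real password cracking uses smarter wordlists, so treat the numbers as illustrative):

```python
import math

# Back-of-envelope sketch: brute-force guess space of a 4-digit PIN
# versus a 15-character passphrase drawn from a-z plus space.
def guess_space(alphabet_size, length):
    return alphabet_size ** length

pin = guess_space(10, 4)          # 0000-9999: 10,000 possibilities
passphrase = guess_space(27, 15)  # 27^15, roughly 3 x 10^21

print(f"PIN guesses:        {pin:,}")
print(f"Passphrase guesses: {passphrase:,}")
print(f"Passphrase space is ~10^{math.log10(passphrase // pin):.0f} times larger")
```

The passphrase is still easy to type, but the search space grows by about seventeen orders of magnitude - which is what "hardening" means in practice: each layer (long unique passwords, multifactor authentication, a VPN on public Wi-Fi) multiplies the attacker's cost.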

Andrew Hammond: OK. So it's almost like - it seems it's almost like defense in depth. Rather than one huge wall, like in the "Game of Thrones," it's just - here's, like, two dozen walls where I'm going to make it really difficult for you to do this, and it's probably going to be easier for you just to go somewhere else and make your life easier. 

Eric Escobar: And that's the - so there's always an analogy I like to - or it's more of a joke of - you're camping with your buddy. You and your buddy are at your campsite, and a bear stumbles into your campsite. And you start putting your shoes on. And your buddy leans in and he goes, there's no way you're going to outrun that bear. And he goes, I don't need to outrun that bear. I just need to outrun you. 

Andrew Hammond: (Laughter). 

Eric Escobar: And that's exactly what it is. 

Andrew Hammond: I like that one. 

Eric Escobar: Hackers are lazy. We're opportunistic. And, you know, we're not going to struggle and try and crack the hardest server if, you know, the next server over is going to be something that is old and outdated and easy to compromise. We're always going to go find the path of least resistance. And so in that same case, if you are a hardened target, if you're a target that has multi-factor authentication, unique passwords for everything, and long passwords, I'm not necessarily going to go after you. I'm potentially going to try and find another way either into your system through somebody else or I'm, you know, just going to leave you alone all together. And so that's really all that it is, is just adding - you know, it's like layers of security, right? So it doesn't have to be, like you said, one big wall. But, hey, little incremental steps that you could do just to make it - my life harder as a hacker. 

Andrew Hammond: OK. And just before we move on from these definitions - which are really, really helpful, by the way. Thanks so much for doing this and indulging me. Zero-days - this is the last one. Help us understand what zero-days are. 

Eric Escobar: Yeah. So zero-days - the quick definition of it is it's basically a vulnerability or an exploit in a system that nobody knows - that, you know, no company is aware of. And so the reason it gets the term zero-days is it's days since it was discovered. So say a vulnerability is found in Windows. And it's been, you know, a certain number of days since it's been discovered - so it's been 10 days, it's been, you know, 11 days, it's been, you know, three months. So how many days have passed since it's been discovered - how long has it been out in the wild? And zero-days are at zero because they are out in the wild and nobody knows about them, potentially. 

Eric Escobar: And so the reason that zero-days are so, you know, I guess, like, mythical or, you know, so scary is because you could be, you know, using a fully patched iPhone. And that fully patched iPhone - completely up to date, all the security stuff, you know, technically as secure as an iPhone could be - if it has a zero-day in it, that means that a threat actor or an attacker potentially has access to it, even though it's been completely patched, completely updated and has all the latest security definitions. And that's what makes it so scary, is that you don't even know that you're vulnerable because you don't even - you know, 'cause nobody else in the world, other than the attacker, potentially, knows that this vulnerability exists. And so that's why it's called zero-day, 'cause it hasn't even, you know, basically been released. Nobody's aware of it. 

Eric Escobar: And that's the reason that they're scary is because, again, you - like, Apple recently patched a couple zero-days where, hey, they found out iPhones are being actively exploited and - against fully patched, updated, you know, devices. And so they had to release - you know, once they discovered it, then they released patches and, you know, all that stuff to update your phone, which is why you should always keep your devices up to date. But that being said, that's basically the simple definition of it, is just something that is not known to the rest of the security community. 
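The naming Eric explains - the count is how many days the vendor has known about the flaw, and therefore how many days they have had to patch it - can be sketched with a toy calculation. The dates below are invented purely for illustration:

```python
from datetime import date

# Toy illustration of the "zero-day" naming. All dates are made up.
def days_vendor_has_known(discovered_by_vendor, today):
    """Days the vendor has had to ship a patch."""
    return (today - discovered_by_vendor).days

# A flaw the vendor learned about 10 days ago is, by this counting,
# a "10-day": they have had 10 days to respond.
assert days_vendor_has_known(date(2022, 9, 20), date(2022, 9, 30)) == 10

# A flaw being exploited in the wild before the vendor learns of it at
# all is a zero-day: the vendor has had zero days to patch.
assert days_vendor_has_known(date(2022, 9, 30), date(2022, 9, 30)) == 0
```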

Andrew Hammond: And how do these things come to light? Like, with zero-days, is there malicious actors out there that are just specifically hunting down zero-days, or is it someone stumbles across it that works for a company, puts it on the darknet and says, you know, I'm offering this for this amount of bitcoin or something, send it to this address? Help the average person on the street understand how these things come to light, or - not come to light for everybody 'cause the whole point is that you're - you get access to this before other people know about it, so you can take advantage of it. So help us understand how these things, like, bubble up and come to the surface. 

Eric Escobar: Yeah, so there are dedicated researchers that they spend all of their time, you know, looking for very specific vulnerabilities into very specific systems. That's not everybody. That's not how all zero-days are discovered. But what's interesting is that there's a term called bug bounties, where basically, companies say, hey, if you find a zero-day, if you find a vulnerability, by chance or because you're a researcher, in any of our systems, we'll pay you a certain amount of money per level of the vulnerability to report it to us and let us know. So I think, you know, Apple has some crazy, like, $2 million bug bounty so that if you did find a zero-day in the most up-to-date, you know, iOS and you report it to them, you get, you know, several hundred thousand dollars for sure. And I think maybe up to a million is the most that's ever been paid out. 

Eric Escobar: So companies will pay to say, hey, if you find it, report it to us and, like, all above board, all - you know, we'll send it to you. You're not a criminal. You are more than allowed to try and find this stuff. And if you report it to us and do responsible disclosure, you know, completely, you know, please let us know. You know, make the - you know, the world a more secure place. Sometimes, you just stumble into them. So there's been several, you know, websites, applications that I've looked at. And, you know, you get into, like, a weird edge case where you're like, oh, if I just don't put a username in this field and hit submit, it logs me in as an administrator. Well, that's a vulnerability. And was I trying to do anything nefarious? No, not necessarily. It could have just been an accident, but that's technically a zero-day. So it's - you know, there's researchers - it spans the whole spectrum of researchers who are dedicated to, like, only looking at certain platforms for high-paying bounties. And then there are, you know, just people that stumble across a vulnerability. 

Eric Escobar: And just 'cause it's a zero-day doesn't mean that it is, you know, actually weaponized or anything. It might be like, oh, I notice that there's a flaw in this application. So maybe the zero-day doesn't actually get me any, like, really great access or really great ability to do something, but still, nobody knows about it. And if it helps you as a part of your kill chain, then, yeah, that could be kind of a scary zero-day. But not all zero-days are, like, and then we got access to all of these text messages just from his phone number. But, yeah. Does that make sense as far as, like, the ranges of what's out there? 

Andrew Hammond: It does, yeah. That's really helpful. And tell me if I've understood this properly. So one of the ways that I have thought about this in the past is a zero-day is like Buckingham Palace, where you can go around and make sure that every single window is closed, but if one window out of 15,000 is not closed and no one knows that it hasn't been closed, then the whole palace is potentially vulnerable if someone knows where that one window that hasn't been closed is. Is that - would that be a good analogy? 

Eric Escobar: That's pretty spot on as far as - I always tell my clients, look, I have the easy job. I need to find one way in. You have the hard job. You have to, you know, basically make sure all of those windows are all completely closed. In the past, it used to be exactly like that, of, like, hey, you find one window open, game over, you've completely taken over the entire thing. Different applications, different websites, different, you know, physical devices like iPhones and Android phones - they're starting to implement - or not starting, they have implemented additional security layers and security features so that - say you were to compromise an iOS app or an app on an iPhone. You know, there's things like the secure enclave, to get really technical, that keep, you know, things like keys and private data secure on those devices. 

Eric Escobar: So, you know, for some networks, you know, there are some times where if you get a zero-day on the network or if you're able to compromise that network, you have the keys to the kingdom and you can run around all of Buckingham Palace, you know, scream at the top of your lungs, and you are good to go. And then there are others - you know, and a lot of the time it's larger companies that have, you know, different layers of - you know, of security aspects in place that make it so that, hey, maybe you got in through the entryway, but you're never going to make it down the hallway into, you know, the bedroom chambers or something like that. 

Andrew Hammond: And in the context of Buckingham Palace, the kill chain would be everything that's trying to stop you getting into the queen's - sorry, RIP - the king's bedroom. 

Eric Escobar: Yeah, the king's bedroom. 

Andrew Hammond: It would be the fence. It would be the electronic security system. It would be the dogs. It would be the security team. It would be the windows. It would be the material of the windows. It would be the sensors in the hallways. All of those things are trying to stop you getting through to the end. 

Eric Escobar: Yeah. And so the kill chain in this perspective is all the different things that an attacker did. So, you know, did they - like you said, did they bypass the motion sensors? Did they jump the gate? All the different things that they were able to do, that if any one of them had worked properly and kept out the attacker, you know, that kill chain wouldn't exist. The kill chain is all the things that were breached along the way. 

Andrew Hammond: And in the context of your job, you would try to get into Buckingham Palace. And then when you got in, you would say, here's how I got in and here's how you need to harden the attack surface? 

Eric Escobar: Yep. And that's exactly it. And it's funny because you bring up the Buckingham Palace example, but the thing that I always tell our clients is, hey, look, I can steal user passwords all day. I can access file shares all day. Tell me what your crown jewels are and what keeps you up at night. And so when you say Buckingham Palace, it's funny, because I always tell our clients, tell me what your crown jewels are, and that's what I'll go steal. And I'll tell you exactly how I stole them so that you can, you know, block every aspect of that kill chain so that if somebody like me were to come back, all those things have been patched, blocked, updated or remediated in some way. 

Andrew Hammond: It would be funny if the royal palace reached out to you and said - and you said, you know, watch your crown jewels - the crown jewels. 

Eric Escobar: The actual crown jewels, Eric. All right. Challenge accepted. Let's go. 

(LAUGHTER) 

Andrew Hammond: Thanks for listening to this episode of "SpyCast." Go to our webpage, where you can find links to further resources, detailed show notes and full transcripts. We have over 500 episodes in our back catalog for you to explore. Please follow the show on Twitter at @INTLSpyCast and share your favorite quotes and insights or start a conversation. If you have any additional feedback, please email us at spycast@spymuseum.org. I'm your host Dr. Andrew Hammond, and you can connect with me on LinkedIn or follow me on Twitter at @spyhistorian. This show is brought to you from the home of the world's preeminent collection of intelligence- and espionage-related artifacts, the International Spy Museum. The "SpyCast" team includes Mike Mincey and Memphis Vaughn III. See you for next week's show.