8th Layer Insights
Ep 36 | 8.29.23

Blending Awareness, Social Engineering, and Physical Penetration Testing -- A Conversation with Jayson E. Street


Perry Carpenter: Hi, I'm Perry Carpenter, and you're listening to "8th Layer Insights". Picture this. It's a bustling office on a Monday morning, and you're walking down the hall, coffee in hand, and you nod to a familiar face, but for some reason, you just can't place it. That person's going in the opposite direction, so you quickly push them from your mind and forget. Fast forward a few hours, you're now in an emergency meeting. Confidential files have been accessed, systems have been compromised, and yeah, that familiar face, it's actually a master of social engineering, using trust, human interaction, and confidence as their tools of infiltration. This, dear listener, is the thrilling and often overlooked realm of the human side of security. Now, I'm sure this isn't for you, but you do know that lots of people will casually plug in USB drives that they find somewhere, or that they pick up at a conference, or there are coworkers who respond to an email request or a vendor request without double-checking the details. It's in these everyday moments that the line between security and vulnerability gets blurred. But as you probably know, these vulnerabilities stem from our very human nature, our desire to trust, our desire to help, or any number of symptoms of the fast-paced, distracted world that we live in, where double-checking every single detail seems just too tedious. So how do we learn to navigate in this world where there are so many risks that are constantly coming at us, at our families, our coworkers, and society as a whole? To help us decipher this complex tapestry of human psychology and security, we're joined today by Jayson E. Street. Jayson is a little bit different than many of the other people in the awareness space, or in the red teaming, or the physical penetration testing space. What Jayson does is artfully and skillfully combine social engineering and physical penetration testing with a security awareness and a human improvement mindset. 
He crafts experiences that don't just expose weaknesses, but also transform those experiences into powerful lessons that organizations and individuals can take away and get something positive from. And so, as we prepare to venture into stories of undercover operations, cunning manipulations, and eye-opening revelations, I urge you to reflect on how many times we might have been unsuspecting participants in someone else's game. And so, on today's show, we explore social engineering and physical penetration testing with the goal of raising awareness in our workforce. Welcome to "8th Layer Insights". This podcast is a multidisciplinary exploration into the complexities of human nature and how those complexities impact everything from why we think the things that we think to why we do the things that we do and how we can all make better decisions every day. This is "8th Layer Insights", Season 4, Episode 6. I'm Perry Carpenter. Welcome back. As I mentioned in the intro, today's guest is Jayson E. Street. One of the really cool things that I like about Jayson is that not only is he bold and able to do all of the really interesting and fun penetration testing things that we see on TV and even in spy movies and things like that, but Jayson's purpose behind the kinds of things that he does and consistently gets away with is really about improving the human side of things. He sees penetration testing as a security awareness exercise. The really cool thing about Jayson is that he will physically penetrate organizations. He will send phish. He will do all the things that a red team will do on the physical side, but then he goes in and does cleanup after the fact. He goes in and talks to the employees that he bypassed or tricked. He encourages them. 
He tells them what they did right, and he wants to leave them with a positive experience as much as that can happen, but at the same time, get across the point that bad things do happen in organizations and that humans are a critical layer of defense. So with that, let's go ahead and get to the interview. What I want to talk about is really, I think, where your passion is. Security awareness and pen testing and the intersection between those two. I looked at the outline of what you're going to be talking about in your post-DEF CON training, and you mentioned the current state of security awareness. I know I've got in my mind what that is, but I'd love to get from your perspective what's working and what's failing.

Jayson E. Street: I think the biggest problem that we're facing when we're doing security awareness is we're trying to drive things toward compliance and toward people following rules instead of educating people on threats and explaining it in ways that they understand, because, we'll say nine times out of ten, they're not going to willfully do something that they know is harmful to their career. And they're also not going to willfully do something that they think they may lose their job over. Now the point is, and the problem we've had in our industry, is we don't really tie in security to their job security, and we don't explain information security related to things that they have common knowledge and common ground in. One of the biggest things that I talk about is how we approach information security, because I go all over the world, and I've broken into so many different kinds of places, government facilities, it's like state treasuries, banks, hotels, just everywhere in all different parts of the world, and it's not about culture, it's about humans. I remove the culture, I look at how humans operate, I learn how people expect things to be, and then I use that against them. And when we talk to our users and we try to talk to them about being security aware, we're talking to them like, don't open up attachments, or these are the kinds of cyber threats. And that is so intangible, and that is so far removed from their daily life, and that's not going to work. See, I drive a car, or I drive my motorcycle, I'm pretty good at driving it, I do well, and I handle it well. I don't know actually how my motorcycle gets forward motion. There's like this big rotary track or something. I think there's still a chain, I'm not sure, I haven't seen it. It doesn't make me an idiot.

Perry Carpenter: Right. Right.

Jayson E. Street: It does what I want it to do. I don't need to understand how my spark plugs work in my car to work my car, but I do know that I should lock my car when I go to a gas station, when I'm going in to go get something, even though it's slightly inconvenient, turning my car off and then shutting the doors, it's still something I should do. I understand that I need to be situationally aware when I'm on the freeway to not just worry about where I'm going, but seeing if someone is behaving in an unsafe manner that might pose a risk to me.

Perry Carpenter: Yeah.

Jayson E. Street: And so that is how we talk to our users. We tell them, hey, you wouldn't open up a package that was a cardboard box left on the hood of your car when you came out from work that just said Bob from accounting, because that looks suspicious. I mean, it says it's from Bob from accounting, but why would he leave that box on your hood? Why would Bob from accounting send you an email with an attachment on it that you weren't expecting and that he hadn't communicated to you he was going to be sending? It's like you wouldn't let someone who, when you're trying to do your work or you're at a stoplight, say, hey, I want to put this in your car. It's like, it's going to be fine. I need to plug this in. It's going to increase your gas mileage. No. Then why will you let someone come to your desk and say that they need to plug in a USB drive or a device to work on your computer? And you don't give car keys to just anybody when you're going out. It's like, oh yeah, I'm your mechanic. It's like, I need your car keys and I'm going to go do this work. Well, that's fine when you went to the mechanic and they required access.

Perry Carpenter: Yeah.

Jayson E. Street: But how many IT administrators, how many people that you don't know say, I need your password and do you share that? Because you wouldn't just give them the car keys because they're responsible, you think, for that car, but you're responsible for that car, especially if it is a company car. Well, if it's a company computer, it's just like a company car. See, I am talking all about proper security and security policies to follow for a computer, but not once do I really mention the computer aspect of it. I let them relate to that information and realize how it ties in to their work computer.

Perry Carpenter: So you've alluded to this before, but I think the way that you're thinking about awareness from an end goal perspective is really important. So I'm really a big advocate of anytime you're trying to think about anything, you have to understand what the actual goal is. And the goal in so many circumstances for security awareness isn't really just that somebody be "aware". It's that there's some kind of behavioral action. There's reduction of risk. There's lots of other things that come with that. So for you, when you think about like a first principles type of approach, what is the ultimate goal that comes to your mind when you say, I want somebody to be aware of something?

Jayson E. Street: When I talk about that, and the only thing I think that we need to really teach, and then everything follows through afterwards, and it's just gravy, is we need to create and foster situational awareness, not security awareness, not computer awareness. We need people to understand in their daily lives when they're not at work, when they're at home, when they're at the mall, that they have an increased sense of situational awareness where they look at, they see something suspicious, and instead of just letting it go out of their mind and go, okay, I'm sure it's going to be okay, we let them act on it. We give them the empowerment to act on it and say, if you see something, say something. If you think it's something that's suspicious, that is your brain telling you that something is off the pattern. You need to respond to that, and here are the methods that we give you for when you're at work and that happens, and we will respond, and we will not penalize you, we will not make fun of you, we will not ridicule you no matter how many false alarms you send us. We will be thankful for those because it's that one that's not a false alarm that could save us, and that is what you increase. Employees are never going to care about your data, but if you can get them and educate them on why it's important to secure their Wi-Fi at home and patch their routers at home and patch their phones and their phone devices and check the privacy settings for their kids on TikTok and Snapchat and see what they're doing on those apps and seeing what the security risks are on some of the things that they do online and some of the scams that are out there on eBay and Facebook marketplace and stuff, when you educate them on those things, guess what? They still don't care about your data.

Perry Carpenter: Right.

Jayson E. Street: But that situational awareness that they start learning and foster from home, they naturally just bring it to work. If they wouldn't fall for it now at the house, they're not going to fall for it at the job. That's what you want. You don't want them to look like they're having to adhere to a policy. You want them to start actually thinking differently and becoming more situationally aware, just period.

Perry Carpenter: I love that. There was a phrase you said earlier, which was if you want to give somebody awareness by having them get robbed, you're the guy to call. So whenever you engage in something like that, so when you're doing a pen test, can you describe a typical pen test the way that you run it and like what the end goals are and what some of the steps are?

Jayson E. Street: Yes. People always get surprised when I get on like the scope calls and we start talking about what my services are. I'm not there to compromise anything. I'm not there to find your vulnerabilities. I don't care if you're PCI compliant or Sarbanes-Oxley, whatever old white dude you've got to comply to. It's like, I don't care. I'm there just to be the worst possible thing at the worst possible moment at the worst possible time. I'm not trying to test you for compliance. I am trying to create a horrible event that happens in your location. And the difference is that on day one of just the talking before the engagement even starts, I've explained to the client, I will get caught at some point during the engagement. Even if it is the third day, even if it is after I've been successful and broken in and gotten everything that I was supposed to get, I will make sure that I get caught. I guarantee you, I will get caught. I literally had to power down a computer that was doing business at the time behind a bank line and unplug it and start walking out of the bank before the teller realized he should be questioning me, and then I gave him the win, because I will make sure I get caught. The problem is, I'm not there to exploit. I'm there to educate. To me, and I say this all the time, it's my little tagline, it's like it's about education, not exploitation. I am not doing a pen test. I am not doing a red team. I am doing a security awareness exercise, because the hard part of my job is not the breaking in part. That is usually too easy. I've got it on video: from walking in the front door of a bank that was actually technically closed to compromising the first machine took 15 seconds. I don't have a problem getting in and compromising machines. That's not the hard part. After I finished compromising that site, I will leave the site for two minutes. That two minutes for me is my signal to say I successfully escaped. 
I was able to get out and not be detected. One time, I had to duck into this hair weave store in this other country. It was hilarious because a couple from, I think Lagos, were eating lunch and invited me to have lunch with them. Very nice people.

Perry Carpenter: Nice.

Jayson E. Street: But I still escaped. Then I go back and here's where the work is, because this is not what red teams do or pen testers do. I then go back and I talk to every single person that I compromised right then and there. And that's when the social engineering starts, because I've got to turn an event where they let me do something bad. By the time I'm done, they've got to feel positive about it.

Perry Carpenter: Yeah.

Jayson E. Street: They've got to understand that I wasn't there to test them. I was there to teach them. That they didn't receive a loss, they received a lesson. I get them to understand that, hey, it's like your company was not trying to do a gotcha. They were trying to show you what a real-world attack is like, because if you don't really know what it looks like, how could you be prepared for it? And they're so concerned about your safety and the security of their data that they wanted you to see what these kinds of attacks can be like. I talk to them about how tellers are trained for when people come in to rob banks. I'm just one of those kinds of live tests. I explain to them that I was caught in this other area and this is how they caught me. They were doing a great job. I give them something to look up to, not all the things to look down on.

Perry Carpenter: Yeah.

Jayson E. Street: And so I educate each one of those people and I make them feel positive about it. I make them feel not that they were doing something bad or they were getting caught or they're going to get their names reported, because I don't record any of the names of people who failed. I show them those things and then I make it a positive experience and I make it a teachable moment. That's what I call it, a teachable moment for them. I don't file reports. Usually for a client, I will do a PowerPoint presentation where they can record it and actually use it as a security awareness video for their employees based on the stuff that I found working with them. When I show phishing attacks, I show the most horrific, horrible, guaranteed-to-click phishing attacks with their employees. I will go and do OSINT. One of the requirements that I set for myself is that I will spend no more than two hours on Google and your company's website to attack you. I will not use Maltego. I don't use Recon-ng or SpiderFoot. I don't use any kind of tools. I will use Google starting out, and then I will find out as much as I need to rob you. It usually works. One hour and 45 minutes was the longest it took, in a financial institution in Kingston, Jamaica, for me to be successful. That was the longest I've ever taken, and that was actually focusing a lot, because usually I'm playing video games or doing other stuff because I have ADHD while I'm actually trying to rob them.

Perry Carpenter: Yeah.

Jayson E. Street: And so I show them that, and then I create the phishing attack, but I build the email around news events. I will use things that are happening in their location, in their area. I used the real-life murder of two girls that had happened in their area of the city and used that as a way to get them to click a link, and I show that, and it's horrible. I would never send it. It's like I would never -- and I tell people, it's like if you're going to send a phishing email internally to your employees, there's only one way that you can do that. You do it with minimum reward and you better have a way to manage expectations. So if I was going to send something internally to an employee, if I was hired by a client to say, we need you to do a phishing attack, I am literally going to find, in less than a half-mile radius of headquarters, a restaurant that I think they would frequent, or a cafe that's inside the lobby of the building, and I would send them an email saying, hey, we're going to give you a $5 gift card for being in the region and being in this office, or something like that. Something innocuous like that. And the reason why is because every person that clicks that link, they will get notified that they clicked the link. They will have to go through retraining. They will get their name on a -- because this is internal testing, so it's different than what I would do. I say they should get reported; you need to put that on there as a strike. You need to learn better, make it a teaching opportunity. But then, after all that is done, you give them a $5 effing gift card for the place that you sent them the phish for. You let them know, because they were expecting something. 
There was a company that did a -- literally at the height of the uncertainty and doubt and stuff in 2020, December 2020 during the holidays, they sent an email, a phishing test to employees saying that they won a $1,000 bonus or something.

Perry Carpenter: That was cruel. Yeah.

Jayson E. Street: I guarantee if you did not have a malicious insider in your company before, I guarantee you have hundreds now.

Perry Carpenter: Yeah. Yeah. Talk about being tone deaf.

Jayson E. Street: Yeah.

Perry Carpenter: I saw stuff like that too. Unfortunately, every security awareness vendor that has phishing simulation tools has created a tool set and then somebody can use that in a totally tone deaf, stupid way.

Jayson E. Street: Oh yeah.

Perry Carpenter: And that's really, really unfortunate, and it reflects really badly on the vendor community as well.

Jayson E. Street: A hundred percent.

Perry Carpenter: I love what you're saying there, though. I'm wondering when you're doing your scoping before somebody signs a contract, do you ensure that there's not going to be any kind of punitive effect on the people that are there? Is that kind of one of your standard things? Sounds like it's at the forefront of your mind. You don't want people to get fired because of the fact that they're human.

Jayson E. Street: That is mandatory.

Perry Carpenter: Yeah.

Jayson E. Street: One of my favorite stories of me, you know, never getting a client again, I was asked to break into a telecom company in another country, and boy did I. It's like I got into every floor of their headquarters. It's like I would get into a place and then it was supposed to be all secured. I went through security. I bypassed their lobby security and metal detectors and stuff. And then each floor was locked from the elevator lobby, and they were like, you can't get in here. So it's like, okay, I'm here. Okay. We'll go to accounting. The accounting is like, you know, that's where all our money is. Like, okay, I'm inside accounting. Where do you want me to go next? Like, okay, go to IT, because I think someone there will know you're not supposed to be there anyway and you won't be able to succeed.

Perry Carpenter: Yeah.

Jayson E. Street: Okay. I've compromised your IT department area. Should I go to the executives next? And they're like, okay, no, no, no, we're good. And the CEO was a little upset with that. It's like, I'm like, you shouldn't be upset. It's like, these are findings. These are not something that's bad. These are something that you can now know that you can improve on.

Perry Carpenter: Right.

Jayson E. Street: He's like, well, I want you to do a phishing test now. I want you to do a phishing. And he said this, because I'm a stickler for scope and I will make you pay for the scope if you be a jerk about it. So his scope specifically was I could pick one to 100 people to do a phishing test on. That's what he said. It's like, pick one or 100 people. I don't care. But do a phishing test to our internal employees, and you have to be successful. And I want the names of the people who fail. And I was like, no. What would that serve? It's like, I can give you a general masking of how many people click. No, I need to know who did it so we can reeducate them. Like, you know, and I'm like, okay, okay. I will do it your way. And I will see what kind of lesson we can learn. Two days later, I think it was two or three days later, I have to do the closing, you know, the exit interview with them.

Perry Carpenter: Yeah.

Jayson E. Street: And I tell him, and for the results of the phishing attack, I went with the scope that you gave me. I chose one person. It was you, the CEO. And within 12 hours, you clicked the link because you thought it came from a speaker at a conference you went to three months before for this telecom company. And you thought that he was inviting you to join their board of directors for this new initiative that they were forming. And you clicked the link to see what the website looked like. I had a server running, and this is all your information, because you clicked it. And so what we learned from this is not anything that could help your employees. It was the fact that without proper education, anyone can be susceptible to a phishing attack. How you go about educating them and teaching them is what's more important. Am I ever going to get hired by this person again? No. Am I okay with that? Yeah.

Perry Carpenter: Perfectly. And I love that, because I get people all the time and I see discussions online with people saying, you know, what do you do with repeat clickers in phishing campaigns? You know, do you have like a three strikes you're out policy and you fire people? And I tell people all the time, unless you're willing to hold the CEO accountable to whatever policy you set up, it's a fundamentally broken policy.

Jayson E. Street: Exactly.

Perry Carpenter: And if the CEO clicks on that three times, are you going to fire him? Probably not. You're going to say, what is some mitigating technology we can put in place, processes, training, something else. And if you're going to give that benefit of the doubt and that grace to a CEO, give it to everybody else. Otherwise, you've created a class system.

Jayson E. Street: I will tell you straight out, if your CEO and your executives do not follow your security policy, you do not have a security policy.

Perry Carpenter: Right.

Jayson E. Street: Because every person that reports to your CEO, they're like, well, she's able to do that, so I should be able to do that. And everybody that reports to those people is going to be like, well, he's able to do that, so I'll be able to do that. And then, by the time it's done, you've only got Bob in the mailroom paying attention to what he's supposed to be doing, because everybody else thinks that they can be the exception. Now, the other thing on the other side of that coin is, I think my response is, if a driver who has responsibility for a company van crashes a van three times, and that's cost the company over $100,000, they're probably going to lose their job. If you've got an employee that can click an email link and cost the company $350 million, I think they should lose their job. I think on the first one, no harm, no foul. Hey, you clicked the link. This was a test. It's like you clicked it. You need to go through an extra hour of training, and we're going to home in on why this is so important and why you need to be very vigilant. And I only count this within a year, so one click in a year is no problem.

Perry Carpenter: Yeah.

Jayson E. Street: And the second time, you tell them, okay, this was bad. We're going to have you go through training again. But after training, for three months, we're going to put your email address in our email gateway system so your email goes to an allow-list digest. You're not going to get email directly to your inbox. You're going to have to allow every single email you think is not suspicious before you're allowed to even look at it and click it. You're going to have to vouch for it for three months. And then, within a year, if they do it a third time, you fire them. It is literally just a basic understanding that when employees realize that this will impact and affect their ability to feed and clothe themselves or their loved ones, they will then start taking it seriously. But when they see information security as a janitorial service, the people who reformat and re-image a laptop because they did something wrong on it, they're not going to take it seriously.

Perry Carpenter: This would be fun to drill down on just from a -- let's try to figure out some best practices here, because I like your idea about sending it to a digest, because what that does is it intentionally gets somebody to slow down and to enter more of a System 2, logical type of thinking.

Jayson E. Street: Exactly.

Perry Carpenter: In so many circumstances, though, I think that most companies, even though that would accomplish the goal, would say, that's going to slow down business so much that we're not going to do it. So at that point, is it still the employee's fault because the infrastructure wasn't set up right, or is it a shared responsibility? How does that work? Because I think at some point, we have to realize that people are still just human, and distraction, urgency, fear, you know, all these other things come into it. And at the same time, the fact that somebody clicks on something that was designed to be clicked on is more of a technology problem than a human problem. The fact that the human was the last line of defense there means the secure email gateway failed because it didn't detect something nasty that came through, and only then did it become a human problem. When they clicked it, you have endpoint protection platforms, you have EDR, you have everything else that just wasn't up to the task. And nobody's throwing out the vendors that make those. They're saying, oh, the employee clicked. So they were the "weakest link" in that, when in reality they were a link and every link failed.

Jayson E. Street: Well, there's a couple of things I want to address on that one.

Perry Carpenter: Yeah.

Jayson E. Street: First of all, humans are never the weakest link. We need to be honest in our industry. They're the least invested in.

Perry Carpenter: I love that.

Jayson E. Street: Okay, that is the thing. It's not that they're the worst problem. They're the least invested in. If we invested in our technology the way we invest in our employees to get them to understand what their job functions are when it comes to security, we would be running Snort boxes with base rules on a Cisco PIX firewall with a basic ACL and wondering why we're getting pwned every other day, because we are not investing any money or taking any time to actually properly inform and educate them. If we are going to educate our employees to make sure they function and do the jobs that are required of them, the biggest fundamental flaw is that we're not acknowledging and realizing that information security is one of their roles and part of their job responsibility. That is one of the fundamental flaws.

Perry Carpenter: Yeah.

Jayson E. Street: That is a standard fundamental ability. They are part of the security team on day one. They're not apart from the information security team. They are part of it. They are the biggest IDS system that you're going to have available to you. If you're not training that and fixing the baseline and improving the signatures, that's on you. Also, this whole thing about, I keep hearing about technology and more of these blinky boxes and then the human is going to click the link. That's the problem. Your technology is not there as a wall. Okay, I think we've established already that walls don't really work that well. Okay? It's like the technology should not be a wall. Your technology is there as a safety net. You are supposed to give your trust and education and understanding of the responsibilities to your employees. They're there to be your first line of defense. And then the technology is supposed to be relied on as the safety net if they make a mistake. The problem is we gear everything toward the technology. So when the technology fails, and it will fail just like a human fails, then the human being unprepared, uneducated, underfunded on figuring out what they're supposed to do, they're naturally going to fail.

Perry Carpenter: Yeah.

Jayson E. Street: Then we act surprised about why this happens. So no, what we need to do is understand that an Amazon delivery driver needs to drive and deliver so many packages a day. Amazon doesn't get upset when they obey the speed limit, when they use proper turn signals, when they stop on red before they can turn right. Part of doing business is understanding that their employees have to operate their machines in a safe manner, because in the long run, that will save more money than if they have to keep replacing equipment, or something horrible happens and then they lose more money. Target lost $350 million from an email from another company. That was a lot of money. That's a lot of vans.

Perry Carpenter: Yeah.

Jayson E. Street: And the whole point is that we're talking about it because those people didn't realize that just clicking on that could cause that kind of damage or that it was their responsibility to be more careful with that, but we're not telling them that they need to operate their equipment safely. We're not giving them that urgency. We're not training them on it, and yes, doing security will always slow things down. That's why networking hates us so much. I have never gotten a birthday card from a network engineer at a place that I've worked at, and that's okay, because they're all about making things go faster, and we're always going to be about slowing it down. We slow down traffic every day when it comes to dealing with IDS systems or IPS systems or firewall filtering or stateful inspection. We're already slowing things down. The problem is when it comes to the human layer, it has to be slowed down perceptibly.

Perry Carpenter: Yeah.

Jayson E. Street: Because when it's being slowed down imperceptibly with the network, and how fast electronics are, it's not as noticeable. But your executive management needs to understand what a five-minute breach is: someone clicked a link in a phishing email, realized they made a mistake, and then called information security so they could start incident response, so it got detected after five minutes. That's a five-minute breach. It's going to cost them probably a week's worth of full-time employee hours to get that fixed, to get that remediated, which is a lot better than a five-month breach that will cost a company millions and millions of dollars.

Perry Carpenter: Yeah.

Jayson E. Street: We need to let the executives know, because executives are smart. I hate this whole Dilbert thing of executives being stupid. They're extremely intelligent. We're just not talking to them in the way that they understand. If you show them how you can mitigate risk and prevent loss of income and loss of revenue based on certain precautionary measures, and you show them metrics every month on things that you're stopping and things that you're doing, they will respond to that, and they will invest accordingly. I know so many people that go, yeah, see, we had no breaches, we had no incidents, so if you give us another $3 million for next year, we'll make sure nothing happens again. Show them numbers. How many attacks did you receive? How many machines have been patched on your internal network? How many assets do you have? How many assets are you monitoring? How many incidents had to be investigated and then turned out to be nothing? Some of those numbers they may not fully understand, or they may go right over their heads, but it's something tangible that they can see and respond to. When we communicate that way, then executives will understand when you come to them and say, yes, we are slowing these things down just a little bit here, because a breach that would cost us half a billion dollars would slow us down far more. Those breaches are becoming more and more frequent. So we can't say that, oh, it's not going to happen to us.

Perry Carpenter: More of our interview with Jayson E. Street after this. Welcome back. It's obvious that you really understand the importance of investing in humans and you see the clear issue that for decades we've been putting all of our faith and our dollars and effort in things that blink.

Jayson E. Street: The blinky boxes now have blue lights instead of red lights. So that means like, you know, 2.0. So I'm like, whoa.

Perry Carpenter: Right. Yeah. And I think it's because ultimately, and I even say this as a vendor, we just always want to turn the problem over to somebody else. Not that humans are a "problem", but we want to say, oh, this is solved: I can write a check for a couple of hundred grand, I can take this appliance, I can plug it in, and then I can dust off my hands, versus any time you're dealing with humans and culture and all the personal, you know, inner dynamics of that, it's a little bit messier. Right? It's a little bit more unpredictable. And it's not just a binary type of thing.

Jayson E. Street: Right.

Perry Carpenter: And you're never just fully done. Can you talk a little bit about stories and experiences? Is there a favorite story or two that you have, or maybe one that you've not told a lot before, that comes to mind?

Jayson E. Street: I'm known for accidentally robbing the wrong bank and stuff and getting away with it. And everybody treats that like it's something cool, when, if you actually look at it at face value, I screwed up really badly. That was a fail. I mean, let's be honest.

Perry Carpenter: Yeah.

Jayson E. Street: I was out of scope. I didn't pay attention. That was on me. It was great that I didn't end up in prison, but that was on me.

Perry Carpenter: Right.

Jayson E. Street: My biggest success story, though, is this one time in January 2020 in the before times, I was onsite breaking into a place. I broke into the same place a year before. And the year before, I went through them like butter.

Perry Carpenter: Yeah.

Jayson E. Street: None of the other consultancies that had tried to do a physical for them had managed to get into the floors and get into their actual operating office areas and stuff. And the guy who was our point of contact found me at his desk when he came back from a conference meeting, sitting in his chair, going, I'm done with this part of the assessment. Let me tell you what I did. And by the way, here's one of your employees' badges. They're probably going to want this, because getting into your floors was tricky; they were segmented, which was really neat, right up until you can steal a badge. I showed them these things, and they took it seriously. They listened to my findings. I communicated them in a way that they understood, and they realized the dangers of the vulnerabilities that I showed them. The CEO, in their yearly meeting, where he only has one hour out of a year to talk to all his employees, spent over 15 minutes talking about nothing but the importance of security awareness and adhering to security policies and how they're part of the security team. The CEO showed them, I think this is important, so therefore, you have to think this is important. I come back the next year, and I look different. And it doesn't really matter. A lot of people say, well, they probably recognized you from the year before. Some of them, maybe. The receptionist was brand new. She didn't remember me from last year. She wouldn't let me just walk in like I knew where I was going, even though I knew exactly where I was going. Then I asked to go to the restroom, because I always ask to go to the restroom, because I always get lost on the way to the restroom for some reason. I go in. I get lost. I turn right instead of left, where the restroom was. I turn right down the corridor into the office area. I compromise two machines. So technically, I'm successful.
A lady in an office with a glass window -- the whole front of the office was a glass wall -- sees me, and I can read body language and facial expressions. So I knew that she thought I was suspicious. She immediately picked up the phone, and she kept looking at me as she was calling, as she was standing up. So I knew, and I started walking out. And as soon as I got back to the lobby, the guy I was meeting was already on his way down the corridor, because the receptionist had already told him that I deviated from the path; she had a camera in the hallway that showed that I did not go into the bathroom. And I am telling you, the best feeling of my life was that over the next two days, in every section that I went into, I got caught.

Perry Carpenter: Yeah.

Jayson E. Street: I did not have to try to get caught. I got caught by them, by employees who didn't know who I was, employees who were just suspicious. And I was successful in every section; I did manage to compromise at least one machine. But that part almost doesn't matter, because you know why? That is still a five-minute breach. Even if I was successful right before the person in front of me stopped me, the breach has now been discovered. So now there's a five-minute window on responding. Because let's face it, we need to stop trying so hard to prevent breaches if we're not willing to invest the money in how to detect them quickly and then respond to them effectively. Because that is what's going to save a company. It is not having the biggest wall. It's having the people that are manning those walls, detecting the breach as soon as it occurs, and being able to respond effectively so it creates the least impact to the company. It's all about detection and response now. That is what's going to save a company, not just a straight defense of keeping it from happening. So every single section, I got caught without having to try. Not once did I have to work at trying to get caught. And to me, if you're not rooting for your client to succeed, you suck. To me, this was the best freaking engagement I was ever on, because I did my job. I've heard red teamers brag, for some reason, about how they come onto a job the next year and they were able to pop the company with the same vulnerability and the same exploit. And I'm like, you're bragging that you suck, that you were so ineffectual at explaining to your client why it was imperative that it get fixed that they didn't fix it, and you were able to do it again? That's sad. My biggest success in my life was that I was able to effectively communicate the findings the year before, and my clients improved. I'm not there to try to find out where they fail.
I'm trying to verify that they're learning and that they have things in place, and to find out where there are gaps that they can improve on. We are so stuck in this industry on red team toxic masculinity, on punching people in the face to see if they've got a plan, that we're not realizing that I am your advocate, not your adversary. The only reason the red team exists is to make the blue team better. I am there to help you.

Perry Carpenter: Yeah.

Jayson E. Street: And helping you is to show you, sometimes, yeah, your baby's ugly, but here's how we can improve it. I'm always going to be there rooting for the blue team. The red team only exists to make the blue team better, to show them where these vulnerabilities are and to give them a voice they don't usually get, because the blue team is probably already saying a lot of the things that we are finding, but they're not getting traction. And so I tell them, look, I know what you're responsible for. I know you know some of these things are flawed. I know that you need improvement on these things. I'm going to be that voice for you. I'm going to be that advocate for you, to let upper management know that these things need to be repaired, that these things need to be fixed. Not that you failed on these. Not that you were not great. No, to say, hey, you've got a very capable team. They understand there are some flaws. They need the funding to get these things fixed. And that's how you handle the situation. So, yeah, my biggest success was getting caught all the time. It was great.

Perry Carpenter: Do you think that that's the most successful company that you've seen so far as far as learning from their past mistakes?

Jayson E. Street: Hands down.

Perry Carpenter: That's awesome.

Jayson E. Street: I've robbed state treasuries. I've robbed research facilities, government facilities. I've robbed hotels, financial institutions. It's like all different kinds of businesses, web-based businesses, everything, in all different parts of the world. And I am telling you, it's always the same when it comes to management. They all want you to do something, and then as soon as you're showing them that it's so easy that you could actually get in, they're like, oh, well, okay, then, we got it. Oh, that's just a problem here. Oh, that's it. And they want to find ways to not understand.

Perry Carpenter: Yeah.

Jayson E. Street: Or they want to find out who to blame for it. In information security, we face this mythology that we're there to eliminate risk. At no time in the history of information security has your job responsibility ever been to eliminate risk. Your job is to mitigate as much risk as you can. And you are going to have risk left over. And then you need to go to management with the rest of that risk, and you need to have a plan to tell them, we can offset this much risk based on, you know, our blinky boxes or SLAs or vendors or other places. And then at the end of the day, here's this much risk left. How much risk are you willing to tolerate, you know, and remain viable? And how much are you willing to invest to help us mitigate this much more risk or offset this much risk? But at the end of the day, you are connected to the internet. There is still going to be risk. Your job is never to eliminate it. Your job is to give your company and your executive and your management team the proper information and the instruments to make the changes and the decisions they need to mitigate as much, offset as much, and accept as little as possible at the end of the situation.

Perry Carpenter: Well, and anybody that thinks they've eliminated 100 percent of any risk is probably fooling themselves in a lot of ways.

Jayson E. Street: Yeah. Look at every freaking vendor out there with "HackerProof." And I'm like, it's why we drink. I drink Diet Pepsi, but still, it's why we drink.

Perry Carpenter: I drink Diet Coke.

Jayson E. Street: I'm not holding it against you.

Perry Carpenter: I still enjoy talking to you.

Jayson E. Street: Yeah.

Perry Carpenter: So I guess in the few minutes that we have left, I got a couple questions that I want to ask. Number one is advice for people who are just now entering cybersecurity and are thinking about taking this kind of career path, whether that's physical penetration testing, security awareness, the blend of the two that you do. What do you think the path in is? And is this something that people can get into now? And what's the skill set in the community like?

Jayson E. Street: I would say I think one of the biggest things that people need to understand is, yes, you can get into information security whenever you want; whatever time in your life you want to start getting into it, get into it. We talk about diversity a lot, but usually it's gender or race. We don't realize there are so many different kinds of diversity: people coming from a poor economic background, people coming from an affluent background, people coming from the hospitality industry or the restaurant industry, from teaching, from law enforcement, or from the military.

Perry Carpenter: Yeah.

Jayson E. Street: Every single added voice that we can get from a different path than ours adds to the picture, adds to the solutions, because it creates a different viewpoint, a different facet for us to explore and see something through. The most important part is that we need to be way more accepting of diversity in all its aspects. Especially gender and racial diversity, but also people from every walk of life, from every different kind of lifestyle, because they have a different story and a different perspective and see different risks that we would never acknowledge or see -- and not because we don't want to, but because we can't be aware of them.

Perry Carpenter: Yeah.

Jayson E. Street: That is an important part. I say any person should please try to join information security. But another key thing is, don't look for the most profitable career path. You need to understand that sometimes in information security, if you want to dive into it, there is a risk of just grinding until it becomes too much, until it's all you do. So it had better be something that you enjoy. If you're doing it because it's a good paycheck and it's a good career choice, that's awesome. If you're doing it from nine to five, you're part of the 90 percent of the people in our industry that make the internet run, and kudos to you. But if you're one of the ones that wants to do it because you love it, know that you'll be doing it a lot, and you don't want to burn out on it, and you want to have a good work-life balance. And also, when it comes to trying to get into pen testing or social engineering or anything like that, you need to understand on day one, it's not about you. It's not about what you can break. I have very low self-esteem. I have very bad impostor syndrome. Every time I start an engagement, 30 minutes before I step out of the car door, I am a wreck, because I'm just like, okay, this is where everybody figures out I'm a fraud. Here's where it's going to go wrong. I don't know what I'm really doing. I've never really known what I've been doing. I've just been lucky. This is just luck. I'm just in the right place at the right time. I can't believe people keep believing that I can actually do these things. I tell myself all these lists of horribleness and stuff, and I try to self-gatekeep, but at the end of the day, it doesn't matter, because did I provide value to the client? Did I help them become more secure?
We keep thinking that we have to hurt them to show value, that we have to find a flaw to show value, that we have to show a vulnerability to prove that we did something. It doesn't always start out as something malicious. It starts out with something in ourselves where we're saying, well, I need to prove myself. I need to prove that they're spending money on the right person to break in. I've got to go in and beat them up as much as possible so they realize that they spent their money wisely. Instead of realizing I'm there to validate their security, I'm there to validate what they did right and help find the things that they need to improve on. Not the things that they did wrong, but the things that they need to improve on, the things that they need to focus on more that they may not have realized. I need to be that separate set of eyes that sees things a little bit differently than the defenders do. And I need to give them that value and that information so they can improve. And if I don't break in the first day, or if I don't get root or domain admin, or if I don't get the keys to the kingdom during the pen test, I didn't fail. My client did great.

Perry Carpenter: Yeah.

Jayson E. Street: That is going to be awesome. And so we need to understand more about letting the client know where they're doing right and how they're doing right and not trying to find all our validation from their mistakes. It's like the validation should be from our skill set.

Perry Carpenter: Yeah, that's a really cool perspective shift that I don't hear a lot. I think that there's an intellectual curiosity, and the fun of the game is part of this. But the fact that you're able to stand back a little bit and be rooting for the client and saying, they did really great right there; I wasn't able to do that; I wonder, if I try this, whether they're going to be just as good. That's totally different than, I'm just going to go in and own them; let's see where they fail. So I love that. All right. Then the last question for me is, are there any other, for lack of a better word, urban legends or misconceptions about cybersecurity or the social engineering side of things that you like to dispel?

Jayson E. Street: I think one of the biggest urban legends out there is the extremism, where it's like, oh, you can't click on any link, oh, you can't open up any attachment, oh, you can't function. And no one talks about the mitigations that you can put in place. I've literally suggested and implemented a plan where HR and marketing had separate computers at their desks, on a DSL line, where a specific email address they used for external sources would deliver, so they could open those messages on a computer that did not touch the company network.

Perry Carpenter: Yeah.

Jayson E. Street: And then there was a secured FTP method so that, once an attachment was scanned, they could bring it onto the network. Those are the kinds of things where we mitigate without expecting them to fail, and without telling them what they can't do. We are showing them how they can do it safely and be more aware of what they're doing. And there's this whole thing where a social engineer has got to be someone who's lying, or someone who's sketchy or unethical. No, I never did any of this breaking into networks and stuff until I started getting paid for it.

Perry Carpenter: Yeah.

Jayson E. Street: I started out on a totally different path. One of the things that I always get is people saying that social engineers are happy about lying to their clients, or happy about lying and making people do something that they didn't want to do. And that is the farthest thing from the truth. I am very aware of the lies that I tell and the things that I'm telling. And one of the main things that I do when I talk to a person that I've compromised, afterwards, is apologize right off the bat and explain what I did and why it was a bad thing. But I've robbed a building in a wheelchair before. It's like I'm a horrible person. Not because I'm really a horrible person, though some people would describe me that way.

Perry Carpenter: Right.

Jayson E. Street: But I am portraying a horrible person. We've established that I'm trying to rob you, so we know where my moral code is. I'm robbing you. So being in a wheelchair shouldn't be any more surprising than if I showed up with a machine gun. I'm a bad person doing something bad. And people think that social engineers get away with what they get away with because they have the confidence of knowing that they're not going to go to jail. I worked on a gang task force. Okay? I arrested a lot of people. Not one of those criminals said, I committed the crime knowing I was going to get caught and go to jail.

Perry Carpenter: Right.

Jayson E. Street: No one can be more confident than a criminal. Okay? Because they're like, oh, yeah, we're going to get away with it. They come in there reeking of self-confidence, because they think they can get away with the robbery and make off with the payday. That's why they're doing it.

Perry Carpenter: Yeah.

Jayson E. Street: It's like, criminals are stupid or they wouldn't be criminals; they'd be in white-collar crime, where the real money's at.

Perry Carpenter: Right. So is there anything that you wish that I had asked, maybe that you wish that you'd been asked in interviews before, but for some reason everybody is just so blind that they've not thought to ask it?

Jayson E. Street: I would say I think the question would be what I would like to see more of from our industry.

Perry Carpenter: Okay, yeah. So what would you like to see more of?

Jayson E. Street: I would like to see more from information security and the hacking industry and the community in general just a little bit more empathetic realization that servers don't get their feelings hurt when you pop them with MS08-067, but people do get their feelings hurt when they click on that link and find out that it was wrong. There are people on the other side of that Twitter handle that you disagree with, people who aren't your friends, who are still going through things, still having issues. And you need to understand that even though they can be an opponent, you don't have to treat them as less than in order to win.

Perry Carpenter: Yeah.

Jayson E. Street: And I think if we could understand more about the empathy and the struggles of what our employees have to go through on a day-to-day basis, the things that they face and the pressures they're under doing their jobs, trying to stay employed, then maybe we'd take a little bit more time to be understanding and to approach them in a way that is better suited for them to learn to be better. Instead of forcing them to comply, we'd show them how it helps them to be part of the team. We want to be right instead of being kind. And we don't understand sometimes that we can be both.

Perry Carpenter: That's a fantastic statement to end on. Well, I hope you enjoyed that interview with Jayson E. Street. It's really, really interesting to hear Jayson's stories and to hear the heart and the passion behind the work that he does. There's really no one else out there doing this kind of work, that I'm aware of, in the same way and with the same mindset that Jayson has. And I think we could use more of it. One of the really cool things about Jayson is that he's very, very good about sharing the mindset and the resources that he has. And so I urge you to check out his website, check out his YouTube videos, and more. There's a wealth of information out there. As I alluded to earlier, I, as well as my son and my goddaughter, just got back from a class that Jayson taught after DEF CON, where he talked about all the different strategies and tools that he uses. So if you get a chance to take that class next year, or if he starts to offer it again in the intervening months, I encourage you to do it. It's really, really eye-opening. So with that, thanks so much for listening. And thank you to my guest, Jayson E. Street. If you want to learn more about Jayson, I've loaded up the show notes with a ton of other resources for you. If you've been enjoying "8th Layer Insights" and you want to know how you can help make this show successful, there are just a couple of ways that you can do so, and both are really important. First, go ahead and take just a couple of seconds to give us a five-star rating and leave a short review wherever you can. Apple Podcasts is a great way, but you can also engage in conversation in the Q&A section on Spotify. So I encourage you to do that if you're a Spotify user. All of these help people who stumble on the show understand that this show is worth their most valuable resource, their time. Another way that you can help us is by telling someone else about the show.
What I've seen over and over and over again is that when it comes to podcasting, word of mouth referrals are really the key to making a sustainable podcast. Also, if you haven't yet, please go ahead and subscribe or follow wherever you like to get your podcasts. If you want to connect with me, feel free to do so. You can find my contact information at the very bottom of the show notes for this episode. This show was written, recorded, sound designed, and edited by me, Perry Carpenter. Cover art and branding for "8th Layer Insights" was designed by Chris Machowski at ransomware.net and Mia Ruhn at MiaRuhn.com. The "8th Layer Insights" theme song was composed and performed by Marcus Moskatt. Until next time, I'm Perry Carpenter, signing off.