Hacking Humans 8.30.18
Ep 14 | 8.30.18

Red teaming starts with research.

Transcript

Justin White: [00:00:00] You can't trust somebody just because they act the part. You know, there's a lot of good liars out there who will look you in the face and tell you a bald-faced lie. And you will absolutely believe it.

Dave Bittner: [00:00:12] Hello, everyone and welcome to the CyberWire's "Hacking Humans" podcast, where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire and joining me, as always, is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:33] Hi, Dave.

Dave Bittner: [00:00:33] As always, we've got some interesting stories to share. And later in the show, we've got Joe's interview with security researcher Justin White, who shares his experiences doing red teaming and penetration testing. But before we get to all of that, a quick word from our sponsors at KnowBe4.

Dave Bittner: [00:00:51] So how do you train people to recognize and resist social engineering? There are some things people think. Test them. And if they fall for a test scam, fire them. Or other people say if someone flunks the test, shame them. Instead of employee of the month, it's doofus of the day, or maybe you pass out a gift card to the one who gets the A-plus for skepticism in the face of phishing. So how 'bout it? What do you think, carrots or sticks? What would you do? Later in the show, we'll hear what the experts at KnowBe4 have to say. They're the sponsors of this show.

Dave Bittner: [00:01:28] And we are back. Joe, what story do you have to share with us this week?

Joe Carrigan: [00:01:32] My story comes from Zeljka Zorz over at Help Net Security. There's a cloud security company called Avanan that's warning about a new phishing attack they're calling PhishPoint. The targets of this attack are users of Microsoft's Office 365 product. And Avanan is saying that 10 percent of their customers who use Office 365 have seen this attack come through. The user receives an email containing a link to a SharePoint document. That's why it's called PhishPoint.

Dave Bittner: [00:01:59] OK.

Joe Carrigan: [00:01:59] Get it?

Dave Bittner: [00:01:59] Yeah.

Joe Carrigan: [00:01:59] Very clever.

Dave Bittner: [00:01:59] Right.

Joe Carrigan: [00:02:01] But the email looks identical to a standard SharePoint invitation to share.

Dave Bittner: [00:02:07] Yeah.

Joe Carrigan: [00:02:08] Avanan, as you might expect, says that their platform protects against it. But then they go on to say you can also protect yourself with two-factor authentication and, of course, phishing training because the link in the email is what's malicious.

Dave Bittner: [00:02:21] Just an FYI - I mean, SharePoint is Microsoft's file-sharing framework, I suppose. Behind the scenes...

Joe Carrigan: [00:02:27] Right.

Dave Bittner: [00:02:27] ...You can set up a SharePoint site where you can have folders to share documents with your colleagues and so on.

Joe Carrigan: [00:02:34] So if you and I want to work on something together, I create a SharePoint site. And in that site, I upload a document. And then you and I can collaborate on writing that document. It's a very common workflow in a lot of companies to say, there's a document in SharePoint; let's work on it together. If you're somebody who uses Office 365, you're probably somebody who also uses SharePoint. So if you receive this email, it will not look out of place to you. This could be a very compelling phishing attack. The only thing that's going to be different is the URL in the link. And you can't see the URL because it's masked with text, but the URL in the link is malicious.

Dave Bittner: [00:03:09] So can you hover over the URL and just see...

Joe Carrigan: [00:03:11] Yeah, you can hover over it.

Dave Bittner: [00:03:12] Right.

Joe Carrigan: [00:03:12] And you can take a look at it.

Dave Bittner: [00:03:13] Which, of course, is harder on mobile, as we've talked about.

Joe Carrigan: [00:03:16] Absolutely.

Dave Bittner: [00:03:16] Yeah.

Joe Carrigan: [00:03:17] A lot harder on mobile. We've talked a lot in the past about how the UI on mobile makes it easier for phishers to get around what would otherwise be something obvious on a desktop system because...

Dave Bittner: [00:03:28] Right.

Joe Carrigan: [00:03:28] ...The real estate on the UI is prime...

Dave Bittner: [00:03:30] Yeah.

Joe Carrigan: [00:03:30] ...All of it. So you take away the information, and people can't see that they're going to a phishing site or a malicious site.

Dave Bittner: [00:03:35] Yeah, you can do it. But it's harder to - there are extra steps.

Joe Carrigan: [00:03:39] Right.

Dave Bittner: [00:03:39] And it's not as easy to do, and I think lots of people don't even know how to do it on their mobile devices.

Joe Carrigan: [00:03:44] Yeah.

Dave Bittner: [00:03:45] So I guess - I mean, a couple things at play here. They're using the fact that, No. 1, there are a lot of people who use Office 365.

Joe Carrigan: [00:03:52] Right.

Dave Bittner: [00:03:52] And so there are a lot of people who are using Office 365 who are also using SharePoint.

Joe Carrigan: [00:03:57] Yeah. It's a...

Dave Bittner: [00:03:58] It's just that numbers game.

Joe Carrigan: [00:03:59] It's a big pool...

Dave Bittner: [00:03:59] Yeah.

Joe Carrigan: [00:04:00] ...Big pool of people.

Dave Bittner: [00:04:01] And again, as you say, if you get something that looks like a legit SharePoint file-sharing notification, at first glance it'd be easy for you to click through.

Joe Carrigan: [00:04:09] Right. The question is, are you expecting this? At some point in time, you kind of have to slow the business process down and you have to say, am I expecting this? Let me make a phone call. Who would've shared a document with me? I don't know what you do in this case. I mean, if you hover over the link and you see that it's not your SharePoint server, then you're done. Bob's your uncle, right? You've identified this successfully. If you just click on the link, there's no guarantee that it's going to where you think it is.
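Joe's hover check can also be done in software. Below is a minimal sketch in Python of the same idea - pulling every link out of a message body and comparing it against the SharePoint domain an organization actually uses. The tenant domain and sample HTML are hypothetical, and this illustrates the principle only; it is not how Avanan's product works.

```python
# Minimal sketch of the "hover check" done in software: pull every link
# out of an email's HTML body and flag any that don't point at the
# SharePoint domain you expect. Domain and sample HTML are hypothetical.
from html.parser import HTMLParser
from urllib.parse import urlparse

EXPECTED_DOMAIN = "contoso.sharepoint.com"  # hypothetical tenant

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def suspicious_links(email_html):
    parser = LinkExtractor()
    parser.feed(email_html)
    flagged = []
    for link in parser.links:
        host = urlparse(link).hostname or ""
        # Allow the expected domain and its subdomains; flag everything else.
        if host != EXPECTED_DOMAIN and not host.endswith("." + EXPECTED_DOMAIN):
            flagged.append(link)
    return flagged

sample = '<a href="https://evil.example.net/doc">Open in SharePoint</a>'
print(suspicious_links(sample))  # ['https://evil.example.net/doc']
```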

Dave Bittner: [00:04:33] Right. An additional element to this is that it's difficult for an organization to blacklist all things related to SharePoint because that is something that people need to use to get their work done.

Joe Carrigan: [00:04:43] It is.

Dave Bittner: [00:04:44] So they're taking advantage of that as well.

Joe Carrigan: [00:04:46] Right.

Dave Bittner: [00:04:47] How can folks protect themselves from this, Joe?

Joe Carrigan: [00:04:49] Just be vigilant. I don't know that two-factor authentication will protect you 100 percent against something like a session-jacking attack. At the individual contributor level, it all comes down to vigilance and diligence...

Dave Bittner: [00:05:00] Right.

Joe Carrigan: [00:05:00] ...And making sure that where you're going is where you think you're going.

Dave Bittner: [00:05:03] All right. Well, that's something to look out for. My story this week is all about putting trust in the things around us that we interact with every day and things that we consider benign.

Joe Carrigan: [00:05:13] I trust this chair and believe it to be benign.

Dave Bittner: [00:05:16] (Laughter) Right until the moment when one of the legs gives out and tosses you onto the ground. Well, similarly, I think most of us probably trust our USB cables.

Joe Carrigan: [00:05:26] Yes.

Dave Bittner: [00:05:27] I have them sprinkled all over my house, where I work.

Joe Carrigan: [00:05:30] Yep. I just bought two new ones yesterday.

Dave Bittner: [00:05:30] Right.

Joe Carrigan: [00:05:30] They arrived yesterday.

Dave Bittner: [00:05:33] We use them for charging our devices. That's probably what they get used for most - charging our mobile devices. These days, more and more laptops are coming with USB-C cables. So there's that sort of push to standardization, where you don't have, you know, one-off, dedicated, brand-specific charging cables. We use these cables, and we don't think about it. We don't think the potential is there for the cable itself to be a vector for bad things to happen.

Joe Carrigan: [00:06:01] That's right.

Dave Bittner: [00:06:02] But, of course, researchers have looked into this.

(LAUGHTER)

Dave Bittner: [00:06:06] And folks at an organization called Security Research Labs, they've published some research on what they call BadUSB. They were talking about taking these benign USB devices and attaching hardware to them, so you have some sort of logic chip, some sort of, you know, extra bit of electronics in there...

Joe Carrigan: [00:06:26] Right.

Dave Bittner: [00:06:26] ...That can do things.

Joe Carrigan: [00:06:27] Yeah.

Dave Bittner: [00:06:27] That can pretend to be a keyboard, for example.

Joe Carrigan: [00:06:30] Yeah, absolutely. There's a tool out there called a Rubber Ducky that looks like a USB thumb drive.

Dave Bittner: [00:06:36] Yeah.

Joe Carrigan: [00:06:37] But this thing is actually a programmable keyboard interface. So you can write some code and have the device execute it once it's plugged into a computer.
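Part of what makes a device like this effective is that operating systems trust anything that enumerates as a keyboard. One defensive angle is simply noticing when a new input device appears. Here is a rough sketch for Linux, assuming the third-party pyudev library is installed; a real monitor would alert rather than print.

```python
# Rough sketch: watch udev events on Linux and report whenever a new
# input device (e.g., something claiming to be a keyboard) is attached.
# Assumes the third-party pyudev package; may need elevated privileges.
import pyudev

context = pyudev.Context()
monitor = pyudev.Monitor.from_netlink(context)
monitor.filter_by(subsystem="input")

for device in iter(monitor.poll, None):
    if device.action == "add":
        name = device.get("NAME") or device.sys_name
        print(f"New input device attached: {name}")
```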

Dave Bittner: [00:06:44] Now, what attracted my attention to this was a recent story about what a different set of researchers are calling USBHarpoon.

Joe Carrigan: [00:06:52] USBHarpoon, interesting.

Dave Bittner: [00:06:53] And this is a USB cable. Actually, Kevin Mitnick is in on this...

Joe Carrigan: [00:06:58] OK.

Dave Bittner: [00:06:58] ...Who's a well-known hacker and, hopefully, a future guest on this show.

Joe Carrigan: [00:07:03] OK.

Dave Bittner: [00:07:03] Trying to line up an interview with him. Some other researchers, Olaf Tan and Dennis Goh - they're from the RFID Research Group - and Vincent Yiu of SYON Security. So they collaborated based on an idea from another gentleman on Twitter who goes by the handle @_MG_. And so this notion here is to take a cable - a regular charging cable - if you imagine, I think, you know, the white Apple iPhone charging cable...

Joe Carrigan: [00:07:28] Right.

Dave Bittner: [00:07:29] Right, as generic as it comes.

Joe Carrigan: [00:07:30] Yeah.

Dave Bittner: [00:07:31] Take a cable like that, and make a cable that looks exactly like that but has this malicious payload built in. And these folks went out and did it.

Joe Carrigan: [00:07:40] Yeah, it's actually pretty easy if you know how to do it. There are microprocessors or microcontrollers out there that are remarkably small and remarkably powerful. AVR, Atmel - I guess now they're Microchip. They got purchased a couple years ago. You know, they're the ones that power the Arduino board, which is an open-source embedded system platform. But they also make more powerful microcontrollers than the one that's on the Arduino, and they are remarkably cheap.

Dave Bittner: [00:08:06] Yeah, so these folks - basically, they had a run of these imitation cables made. And, boy, do they look just like a regular Apple charging cable. If you saw one of these lying on a desk, there would be nothing to make you think there was any sort of malicious payload involved with this cable. It just looks like a regular old cable.

Joe Carrigan: [00:08:27] You got me worried about these two cables I purchased yesterday, Dave.

Dave Bittner: [00:08:29] Well, I mean, it's an interesting point. How do you know?

Joe Carrigan: [00:08:32] I don't know.

Dave Bittner: [00:08:32] What's the chain of custody of something you buy on Amazon?

Joe Carrigan: [00:08:37] Yeah, and that's exactly where I bought these.

Dave Bittner: [00:08:38] Yeah.

Joe Carrigan: [00:08:38] I have no idea where these things came from.

Dave Bittner: [00:08:41] Right. Is there a device you can plug them into to verify that they're not up to anything?

Joe Carrigan: [00:08:45] There is. There's a USB protocol analyzer, and I actually have one at my office.

Dave Bittner: [00:08:48] OK.

Joe Carrigan: [00:08:48] So I think before I use these - and I haven't used them yet - I'm going to plug them into the USB protocol analyzer and see what I can get.

Dave Bittner: [00:08:54] Just make sure they're not up to no good.

Joe Carrigan: [00:08:55] Right, exactly.
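Short of a hardware protocol analyzer, you can at least inspect what a plugged-in device claims to be. Here is a sketch using the third-party pyusb library (with a libusb backend) that flags any interface registering as HID class 3. A plain charging cable shouldn't enumerate as anything at all, so a "cable" that shows up as a keyboard deserves scrutiny.

```python
# Sketch: enumerate attached USB devices and flag any HID (class 3)
# interfaces, the class a keyboard-injection device would claim.
# Assumes the third-party pyusb package plus a libusb backend; reading
# descriptors may require sufficient privileges on some systems.
import usb.core

HID_CLASS = 0x03

for dev in usb.core.find(find_all=True):
    for cfg in dev:
        for intf in cfg:
            if intf.bInterfaceClass == HID_CLASS:
                print(f"HID interface on device "
                      f"{dev.idVendor:04x}:{dev.idProduct:04x}")
```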

Dave Bittner: [00:08:56] Now, are you familiar with the notion of USB condoms?

Joe Carrigan: [00:09:00] I am. I am. This is something you get to protect yourself against malicious USB outlets. So let's say you're at an airport, and somebody has sat down and replaced the USB outlet in the wall or under the chair. And now when you plug in your Android or iPhone device to charge it, the outlet is malicious and it attacks your phone. So what you do is you get what's called a USB condom, which just passes through the two power connectors so that all you get is charging. A USB connector on the inside has four wires - two of them are power, and two of them are data. So this thing just eliminates the data wires, and all you get is power.

Dave Bittner: [00:09:36] OK.

Joe Carrigan: [00:09:37] And that's how it protects you against that.

Dave Bittner: [00:09:39] Right, but these guys...

Joe Carrigan: [00:09:43] (Laughter).

Dave Bittner: [00:09:43] ...Have come up with the idea - again, this is @_MG_ on Twitter. He says bad USB cables wouldn't be complete without bad USB condoms.

Joe Carrigan: [00:09:53] (Laughter) This...

Dave Bittner: [00:09:54] So he jokes that he's tempted to get a run of these made for the vendor area at the next security conference.

Joe Carrigan: [00:10:01] Right.

Dave Bittner: [00:10:02] So...

Joe Carrigan: [00:10:02] Here's something that looks - that is designed as a security product but will actually compromise your phone.

Dave Bittner: [00:10:07] Does the opposite of what it says. So you think you are being secure by using this device. I mean, what a misdirection, right?

Joe Carrigan: [00:10:14] Yeah, this is brilliant. I think that's a phenomenally great idea. Oh, wow, I'll be more secure if I use this USB condom. Well, the USB condom has malware on it.

Dave Bittner: [00:10:24] (Laughter) The joke's on you.

Joe Carrigan: [00:10:25] Right.

Dave Bittner: [00:10:26] Yeah. I don't know how I feel about this. I don't know how I feel about the good guys making up these - I mean, research is research of course. But I think this is one of the dark sides of our industry...

Joe Carrigan: [00:10:36] Yeah.

Dave Bittner: [00:10:36] ...Is this impulse to, I'll show everybody.

Joe Carrigan: [00:10:39] I've heard that from a number of other researchers. And I often say that the easiest part of this job is just saying here's something bad that can happen. And then you wait around, and you go, see? That's (laughter)...

Dave Bittner: [00:10:49] Right, right, right. But to actually go out and spend the money to have something manufactured - obviously that's one thing for research purposes. But the idea that someone would then distribute them to prove a point, I find that troubling.

Joe Carrigan: [00:11:02] Oh, yeah. That's probably unethical (laughter).

Dave Bittner: [00:11:05] Yeah, it crosses a line there.

Joe Carrigan: [00:11:06] You can make it and say, hey, here's a proof-of-concept device. But, yeah, don't distribute them.

Dave Bittner: [00:11:10] Yeah.

Joe Carrigan: [00:11:10] That - you should not do that.

Dave Bittner: [00:11:11] All right. Well, interesting stories this week. It's time to move on to our Catch of the Day.

(SOUNDBITE OF REELING IN FISHING LINE)

Dave Bittner: [00:11:20] All right, Joe. This one was sent in by a listener whose name is Peter. This was actually a fax that was sent to his father. So already we're targeting someone in their later years.

Joe Carrigan: [00:11:32] Right. I am already suspicious because when was the last time you received a fax, Dave?

Dave Bittner: [00:11:37] It's been a while.

Joe Carrigan: [00:11:38] Right.

Dave Bittner: [00:11:39] But I am not as old as (laughter) the people - I think a lot of older people have fax machines. I don't know.

Joe Carrigan: [00:11:46] My father has a fax machine.

Dave Bittner: [00:11:48] Yeah, so does mine.

Joe Carrigan: [00:11:49] And it's a three-in-one like fax, scanner, printer.

Dave Bittner: [00:11:51] Yeah. Exactly, exactly.

Joe Carrigan: [00:11:53] And his isn't hooked up. And he gave me his old one. And I actually have a fax machine. I can send faxes, but I do not receive them.

Dave Bittner: [00:11:59] Right. All right, well, we've changed the names to protect the innocent here. So we're just going to say that the target of this is named Mr. Johnson (ph). And the folks who sent this out are claiming to be from the U.K. So of course, I'm going to read this using a ridiculous British accent.

Joe Carrigan: [00:12:14] Very good.

Dave Bittner: [00:12:15] (Reading in British accent) Dear Mr. Johnson, my name is Keith Oliver. I'm a partner at Peters & Peters Solicitors LLP, a law firm based in United Kingdom. Though this transaction might sound unrealistic and apprehensive, but I have the requisite experience to handle said. I decided to contact you after a series of attempts to locate the relatives of the deceased. There is an unclaimed permanent life insurance policy insured for 10,950,777 United States dollars with a top life insurance company in Abu Dhabi, UAE. The policyholder was one of my personal clients - late engineer Arthur Johnson (ph), who worked with the energy company in Abu Dhabi. He died in a ghastly car accident in London seven years ago.

Dave Bittner: [00:13:06] Since his death, no one has come forth for the claim, and all my efforts to locate his relatives have proved abortive. The insurance company code stipulates that all unclaimed insured permanent policies must be turned over to the abandoned property division of the state after seven years. In view of the fact that you share the same last name and nationality with the deceased, I solicit for your consent to partner with me for the claim of this policy benefit as the beneficiary of the claim. The cost of changing the policy to your name will be beard - beard - it says beard - will be beard by the attorney. If you consent to the above request, all proceeds will be processed on your behalf.

Dave Bittner: [00:13:48] With your approval, I propose the sharing of the proceeds as follows - 45 percent each and 10 percent to charity organizations. This transaction is 100 percent risk-free as there will be no violation of any civil or criminal laws. I have all the necessary documentation to expedite the process in a highly professional manner. I will provide all the relevant documents to substantiate your claim as the beneficiary. And it may take up to 30 working days from date of your receipt of your consent. Kindly note that this transaction is strictly confidential and shall not be shared with a third party without my approval. Many thanks for considering my request, and I look forward to hearing from you soon. Please contact me via email. Your earliest response to this matter would be highly appreciated. Sincerely, Keith Oliver, head of private client attorney.

Joe Carrigan: [00:14:40] We have already violated Mr. Oliver's confidentiality agreement by reading this email here on this podcast.

Dave Bittner: [00:14:48] What do you think, Joe?

Joe Carrigan: [00:14:49] Well, a couple of things - first off, why does a British lawyer give me an amount in dollars - in American dollars?

Dave Bittner: [00:14:56] Right. Yep, yep.

Joe Carrigan: [00:14:57] Why not pounds?

Dave Bittner: [00:14:58] A very specific amount.

Joe Carrigan: [00:14:59] Right. Do you think that's maybe a conversion rate?

Dave Bittner: [00:15:00] Ten million - almost $11 million. Yeah, it could be.

Joe Carrigan: [00:15:04] There are a couple of missing articles early on. And as you pointed out, he said beard. This is the same scam that's been going on for years - it's the Nigerian prince scam, just with something else. And, oh, my favorite part is that he wants to give 10 percent to charity.

Dave Bittner: [00:15:20] You know, they're really - these are good people.

Joe Carrigan: [00:15:22] Yeah, they are good people.

Dave Bittner: [00:15:23] They are generous, good people looking out for charity, absolutely.

Joe Carrigan: [00:15:26] Right.

Dave Bittner: [00:15:27] I wonder too if because of the - I don't know - that Americans are, I think by default, sort of enamored with Brits - we trust them, you know?

Joe Carrigan: [00:15:36] Well, most Americans are. I don't trust them.

Dave Bittner: [00:15:38] (Laughter) Yeah, other than you. You trust no one, right - other than you.

Joe Carrigan: [00:15:42] Particularly the British.

Dave Bittner: [00:15:44] Yeah, and those of us who enjoy doing ridiculous British accents. I wonder if that's part of it too - that a British solicitor couldn't possibly be up to no good.

Joe Carrigan: [00:15:52] (Laughter) Right. Actually, if I met a British solicitor, my inclination would be to trust that person, right? But this is not a British solicitor.

Dave Bittner: [00:15:58] No, no. Who knows what would happen? Obviously, a scam targeting the elderly - an old-school scam with kind of a new twist.

Joe Carrigan: [00:16:07] Right.

Dave Bittner: [00:16:08] All right. Well, that is our catch of the day. Coming up next, we've got Joe's interview with Justin White. But first - a message from our sponsors at KnowBe4.

Dave Bittner: [00:16:20] Let's return to our sponsor KnowBe4's question - carrots or sticks? Stu Sjouwerman, KnowBe4's CEO, is definitely a carrot man. You train people, he argues, in order to build a healthy security culture. And sticks don't do that. Approach your people like the grown-ups they are, and they'll respond. Learning how to see through social engineering can be as much fun as learning how a conjuring trick works. Hear more of Stu's perspectives in KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable too. Sign up for Cyberheist News at knowbe4.com/news. That's knowbe4.com/news.

Dave Bittner: [00:17:08] And we are back. Joe recently spoke with Justin White. He's a security consultant and penetration tester. Here's Joe's conversation with Justin White.

Joe Carrigan: [00:17:20] What's your objective when you're red teaming?

Justin White: [00:17:22] So red teaming has a lot of different meanings depending on context and who you talk to. For me and my circle, whenever we say red teaming, we really just mean an objective-driven security assessment and penetration test that's very broadly scoped, which usually - not always, but usually - includes physical break-in as well and almost always includes exploiting the human aspect, meaning I can make phone calls, call up your employees, pose as somebody else. But ultimately, I'm just trying to break into your system - maybe just one system in particular, maybe the business as a whole. But I can do it through sort of any reasonable means short of driving a tank through the middle of your conference center.

Joe Carrigan: [00:18:11] (Laughter) All right, so let's focus on that human aspect. When you're trying to get in somewhere, how do you start that process? What do you do first?

Justin White: [00:18:18] Anytime we're going to approach a red team, especially whenever we're going to be talking to people, we need to sound legitimate. So it always starts with research. It's mostly open-source intelligence gathering. There's a lot of tools we use to sort of scrape - things from Google, all of the different social media - Facebook, LinkedIn. And basically, we're just trying to aggregate all this information about the company as well as about some key individuals or at least key departments that we may try to exploit. So it always starts with usually about two days of just research and not actually interacting with the target at all.
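As a toy illustration of the kind of aggregation Justin describes, here is a sketch that harvests email addresses from a single public page. Real engagements use purpose-built OSINT tools across many more sources; the URL below is a placeholder, and this should only ever be run against targets you're authorized to assess.

```python
# Toy OSINT sketch: fetch one public page and harvest anything that
# looks like an email address. Real engagements aggregate many sources
# (search engines, social media, breach data) with dedicated tooling.
import re
import urllib.request

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def harvest_emails(url):
    with urllib.request.urlopen(url) as resp:
        page = resp.read().decode("utf-8", errors="replace")
    return set(EMAIL_RE.findall(page))

# Placeholder target; only run against sites you're authorized to assess.
print(harvest_emails("https://example.com/contact"))
```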

Joe Carrigan: [00:18:57] And then once you have all that research, you take the next step, I imagine, which I would guess is calling them and trying to elicit more information. Is that right?

Justin White: [00:19:05] Generally, it depends. Sometimes we may be focusing more on a physical thing, at which point we'll actually try to just walk up to the security guards. If we can't slip by them miraculously right off the bat, we'll try to walk up and maybe just try to act stupid, act sort of lost, confused or, hey, I'm waiting on my buddy; he should be coming down in a minute - maybe try to engage them, observe things more close up. What I typically try to do is, you know, find something to chat about. And if you can just get people talking, inadvertently they will give you information that you can use later to your advantage.

Justin White: [00:19:42] So whether I'm doing it in person or on the phone, I'm always just trying to get little bits of information from different individuals that I can take and pivot to other individuals or other places. And I can use that to my advantage to sound more convincing that I am who I say I am or that I'm here to do what I said I'd do because, you know, I know this name. I know about this thing that's going on at the company. Did you see what happened at the holiday party? That was crazy. You know, I've got little anecdotes like that to tell to make myself just sound more legitimate.

Joe Carrigan: [00:20:19] So you start by gathering little tiny pieces of information. And then you kind of aggregate that up, just standard open-source intelligence like you were talking about. And then at that point, is it time to try to actually start the penetration, try to get in?

Justin White: [00:20:32] Well, eventually. You know, this is a timeboxed thing we do. Unfortunately, unlike a real attacker, we don't have practically unbounded time. You know, I've got maybe two weeks, at best maybe four weeks, to try to accomplish my goals. So, yeah, at some point, prepared or not, you have to try to escalate and accomplish some objectives. And how fast you move really just depends on how well you're able to gather intelligence and come up with some sort of strategy 'cause if you just walk in somewhere or call somebody up really not knowing what your pretense is, who you're pretending to be or exactly what you're trying to get out of them, you're probably not going to get anywhere. You really can't wing it at all.

Joe Carrigan: [00:21:14] What have you found to be the most difficult thing to get by? What's the hardest thing to overcome when you're trying to convince somebody that you're somebody you're not?

Justin White: [00:21:22] Usually, it comes down to they're just following the rules. I'm sorry, sir. I really want to help you, but our policy is this and this. And, you know, that's a good thing and a bad thing. It's a good thing for the company. It's a bad thing for me. But what it also tells me is that companies really need to be sure their policies are sensible because employees, for the most part, will follow policies. However, sometimes we find that their policies have gaps in them, and it's possible, even with employees just following the policies that exist, to extract information from them. So you have to have good policies. And generally, well-trained employees will follow them, unfortunately for me.

Joe Carrigan: [00:22:08] And that makes your job harder. So I like to ask everybody that does social engineering. What's the one tip you would give to people to make them less susceptible to these kind of attacks?

Justin White: [00:22:17] Really, it's just important that you follow company policies and procedures. And I mentioned that earlier. But as a realist, I know that sometimes we're inclined to bend - it's human nature. That's why this whole social engineering thing works in the first place. It's human nature to want to help people. So whenever I educate people about these things, I try not to sell it as absolutes, as in, under absolutely no circumstances are you to violate, you know, this rule, this rule or this rule. Because in reality, there are situations that necessitate helping someone out in a way that bends the rules a little bit.

Justin White: [00:22:59] So the advice I always give them is, do you absolutely know who you're talking to? If someone calls you up that you don't know, you're absolutely not breaking the rules for them. If somebody calls you up and they sound like someone you're pretty sure you know, don't break the rules for them 'cause you're not certain of who they are. For the most part, unless they are there in person and you physically recognize who that person is, absolutely don't bend the rules.

Justin White: [00:23:24] And then you have to even get a little bit more diligent than that because what if this is somebody that you do know, but they just got fired, and they're asking you to badge them into the office so they can go in there and, you know, steal things or unplug the servers or, you know, just whatever to - you know, just 'cause they're angry? So basically, what I say is if you know the situation and you know the person well and you can help them in some way, do so.

Justin White: [00:23:56] But be diligent about, you know, knowing exactly who it is you're talking to. You know, know the situation. Know that that person didn't just get fired, which means before you help them out, say, hold on just a minute. Call up your boss or whoever it is who can also verify their story from another angle. That way, you're not just taking their word for it. And then maybe let them in. But do your diligence. Don't just let them in and turn them loose in the office.

Justin White: [00:24:22] You say you left your badge at your desk? Would you mind walking me to your desk and letting me verify that your ID badge and such and such is indeed here? Show me your computer. Show me that you can log into the domain if it's a situation where you're not sure of what's going on. Stick to the policy. When you don't, do your diligence. Follow up and stay on it. See things to the end. And whatever you do for them - say you let them into the office - make sure they're not, you know, running through filing cabinets and going crazy.

Joe Carrigan: [00:24:56] Have you ever come across something that just surprised you, that you were surprised at how easy something is?

Justin White: [00:25:02] Oh, yes, actually. This was - you know, I can't chalk it up to anything but pure luck. There was one business that we were trying to break into. And they had tightened down their security based upon a physical penetration test that had happened the previous year. So they decided to just go all out and decide that nobody was getting in there who didn't belong. So we came in there figuring, OK, in situations like this where we don't figure we're going to be able to tailgate or we're going to be able to, you know, talk our way past guards, we're going to have to find some surreptitious way to get in there. We're going to have to pick open a side door or find an open window we can crawl through, something like that.

Justin White: [00:25:44] It just so happens it was on the first day - first day - of the assessment that we were actually on site. We had done our couple of days of prep, doing all the research and everything. First day on site, my co-worker drops me off near the front door in the car park. And I'm walking up to the door. And they had what they called a poor man's mantrap. So instead of a revolving door that you badge into and that rotates just enough for one person to go through, they had two sets of doors, and you had to badge through each one of them to get in.

Justin White: [00:26:19] Just so happens there were two employees walking out, spaced the same distance apart as the doors, 'cause the doors were about 15, 20 feet apart. So I managed to tailgate past the first one and acted like I was putting my badge up to the reader on the second one when the second employee walked out. I'm inside the building. There's still a guard desk. However, just completely by luck, the guard had turned around because somebody in the back room had asked him a question. He spun his head around, and I managed to walk right past the check-in point. After that, everything was easy. We accomplished all of our goals on the first day because all of their security relied on that perimeter security.

Joe Carrigan: [00:27:02] OK, Justin White, thank you very much for joining us.

Justin White: [00:27:04] Thank you again, Joe. Pleasure to be here.

Dave Bittner: [00:27:07] All right, lots of good take-homes there from that interview. You know, I think the thing that was most interesting to me that I haven't heard a lot of other people talk about is this notion of your company having good policies.

Joe Carrigan: [00:27:20] Right. That was one of the points that I actually put down, too, is that his job is a lot harder when people follow those policies.

Dave Bittner: [00:27:26] Right. And he said, you know, well-trained employees tend to follow the policies, but you have to have policies that aren't full of holes.

Joe Carrigan: [00:27:33] Correct. Like the example he talked about, where he got access and completely won the penetration testing gig just because he compromised the one layer of security they had, which was the perimeter. They said, we're going to put all of our eggs in the perimeter basket. We're going to have a great perimeter, and it's going to be very hard for someone to get in. And it sounds like it is very hard for someone to get in. But by luck, Justin got in.

Dave Bittner: [00:27:57] Well, and I wonder, too. Does that give everyone inside a false sense of security that surely if anybody has gotten in here, they must be legit, so my guard is going to be down?

Joe Carrigan: [00:28:07] I am 100 percent certain that it produces exactly what you're talking about...

Dave Bittner: [00:28:11] Yeah.

Joe Carrigan: [00:28:11] ...The false sense of security.

Dave Bittner: [00:28:13] Thanks to all of you for listening, and thanks to our sponsors at KnowBe4. They're the social engineering experts and the pioneers of New-school Security Awareness Training. Be sure to take advantage of their free phishing test, which you can order up at knowbe4.com/phishtest. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu.

Dave Bittner: [00:28:37] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our coordinating producer is Jennifer Eiben, editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:28:54] And I'm Joe Carrigan.

Dave Bittner: [00:28:54] Thanks for listening.