FC: [0:00:00] You can have Fort Knox, but if the people don't have the culture of security, then they'll let you through the gate.
Dave Bittner: [0:00:06] Hello everyone, and welcome to the CyberWire's "Hacking Humans" podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me once again is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [0:00:27] Hi, Dave.
Dave Bittner: [0:00:28] Later in the show, Carole Theriault returns. She's got an interview with a physical pen tester who goes by the name Freaky Clown. But before...
Joe Carrigan: [0:00:36] That's scary.
Dave Bittner: [0:00:36] (Laughter) It is. I'm a little intimidated. But before we get into all of that, a quick word from our sponsors at KnowBe4.
Dave Bittner: [0:00:46] Have you ever been to security training? We have. What's it been like for you? If you're like us, ladies and gentlemen, it's the annual compliance drill, a few hours of PowerPoint in the staff break room. Refreshments in the form of sugary doughnuts and tepid coffee are sometimes provided, but a little bit of your soul seems to die every time the trainer says next slide. Well, OK, we exaggerate, but you know what we mean. Stay with us. And in a few minutes, we'll hear from our sponsors at KnowBe4 who have a different way of training.
Dave Bittner: [0:01:21] And we are back. Joe, why don't you kick things off for us this week? What do you got for us?
Joe Carrigan: [0:01:26] So, Dave, sometimes bad practice can be just as effective as social engineering.
Dave Bittner: [0:01:32] (Laughter) What do you mean by effective?
Joe Carrigan: [0:01:33] Or maybe I should call this accidental social engineering.
Dave Bittner: [0:01:36] OK, go on.
Joe Carrigan: [0:01:37] So we have a story from an anonymous listener.
Dave Bittner: [0:01:39] All right.
Joe Carrigan: [0:01:39] Now, when I say anonymous listener, I know who this listener is. I've talked to this listener multiple times.
Dave Bittner: [0:01:44] OK.
Joe Carrigan: [0:01:44] So I consider this...
Dave Bittner: [0:01:45] Legit listener.
Joe Carrigan: [0:01:46] I'm not going to tell you who he is.
Dave Bittner: [0:01:48] All right, fair enough.
Joe Carrigan: [0:01:49] I will tell you it's a he, obviously, 'cause I just said he. But...
Dave Bittner: [0:01:52] (Laughter) You're slowly giving away this person's identity one slip-up at a time.
Joe Carrigan: [0:01:56] (Laughter) Exactly.
Dave Bittner: [0:01:56] All right. Go on.
Joe Carrigan: [0:01:57] They're going to figure it out.
Dave Bittner: [0:01:58] Yeah.
Joe Carrigan: [0:01:59] So it's a story from work. He got an email at work that claimed his employer was doing a file update.
Dave Bittner: [0:02:04] OK.
Joe Carrigan: [0:02:05] And the email asked that the form be filled out. And it was asking for personal information such as your Social Security number, license plate number, insurance policy number, your driver's license number and your date of birth.
Dave Bittner: [0:02:16] That's a lot.
Joe Carrigan: [0:02:17] That is a lot.
Dave Bittner: [0:02:17] (Laughter).
Joe Carrigan: [0:02:18] Everyone in the office got the email.
Dave Bittner: [0:02:20] OK.
Joe Carrigan: [0:02:21] And our listener said, hold on, it might be fake. I'll go check.
Dave Bittner: [0:02:27] All right, good listener, good listener.
Joe Carrigan: [0:02:28] Right. He says we have him so paranoid about phishing....
Dave Bittner: [0:02:31] So he's giving us credit.
Joe Carrigan: [0:02:33] Right.
Dave Bittner: [0:02:33] All right.
Joe Carrigan: [0:02:34] ...That...
Dave Bittner: [0:02:34] We'll take it.
Joe Carrigan: [0:02:34] Right.
Dave Bittner: [0:02:34] (Laughter).
Joe Carrigan: [0:02:36] He actually takes the time like we often advise. He goes out, and he finds out from the sender - yes, this is a legitimate email.
Dave Bittner: [0:02:42] Right. All right, well done.
Joe Carrigan: [0:02:43] Please fill out the form and send it back.
Dave Bittner: [0:02:44] Well done.
Joe Carrigan: [0:02:45] Right. So he took a minute to go and make sure that they weren't getting phished, which is great.
Dave Bittner: [0:02:50] Yeah, but there's more to the story.
Joe Carrigan: [0:02:52] But there's more to the story, right?
Dave Bittner: [0:02:53] (Laughter).
Joe Carrigan: [0:02:53] This doesn't end here.
Dave Bittner: [0:02:55] OK.
Joe Carrigan: [0:02:56] Someone in the company fills out the form with all this data, and then when they replied to it, they hit reply all.
Dave Bittner: [0:03:03] Of course they did.
Joe Carrigan: [0:03:04] And they managed to send it to everyone in the company who had received the original email. It gets better, Dave.
Dave Bittner: [0:03:11] OK, go on.
Joe Carrigan: [0:03:13] Among the recipients of the original email were some external addresses - Yahoo and Gmail addresses. So now not only has this person sent their own data out to everyone in the company, they've actually sent it outside of the company, presumably to employees who have been using external email addresses as their business email address.
Dave Bittner: [0:03:34] Oh.
Joe Carrigan: [0:03:34] The system here didn't fail. The email did exactly what it was supposed to do. It's the people who used the email that failed.
Dave Bittner: [0:03:41] Right. Well, take this one apart. What happened?
Joe Carrigan: [0:03:43] So here are the problems with this. Whoever wrote the first email sent a mass email to everyone in the organization by placing all the names in either the to field or the cc field.
Dave Bittner: [0:03:55] Right.
Joe Carrigan: [0:03:55] OK. Here's my advice - if you're ever sending out mass emails like this, the best practice is to put everybody in the bcc field, the blind carbon copy field.
Dave Bittner: [0:04:06] Right.
Joe Carrigan: [0:04:06] And then put your own address in the to field.
Dave Bittner: [0:04:09] OK.
Joe Carrigan: [0:04:10] OK. The blind carbon copy field means that when I bcc somebody - and it's not apparent in all these interfaces - let's say I bcc you and send the message to a bunch of people. They will see that the message was sent to them, but they won't see that the message was sent to you.
Dave Bittner: [0:04:26] Right.
Joe Carrigan: [0:04:26] So if I put everybody in the bcc field that I'm sending this mass mailing out to, nobody else sees any of the email addresses that it was sent to. And when somebody hits reply all, which they will do...
Dave Bittner: [0:04:40] (Laughter) There's always someone.
Joe Carrigan: [0:04:40] There's always someone.
Dave Bittner: [0:04:41] At least one person...
Joe Carrigan: [0:04:43] Exactly.
Dave Bittner: [0:04:44] ...Who has a twitchy reply finger and just can't help themselves.
Joe Carrigan: [0:04:48] They can't help themselves, exactly.
Dave Bittner: [0:04:50] Right.
Joe Carrigan: [0:04:50] When they hit reply all, the only person that it goes back to is the person that sent it or the person in the to address, which is also the person that sent it. So that's what I recommend.
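Joe's bcc advice maps directly onto how mail libraries build messages. Here's a minimal sketch using Python's standard library - the addresses and server names are placeholders, not anything from the show:

```python
import smtplib
from email.message import EmailMessage

def build_mass_mail(sender, subject, body):
    """Build a mass mail the way Joe recommends: the sender's own
    address goes in the To field, and the real recipients are kept out
    of the headers entirely. They are supplied later as SMTP envelope
    recipients - which is exactly what bcc does - so a reply-all goes
    back only to the sender."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender          # visible To field is the sender itself
    msg["Subject"] = subject
    msg.set_content(body)
    # Deliberately no Bcc header: recipient addresses never appear in
    # the message anyone receives.
    return msg

# Hypothetical usage - server and addresses are invented:
# staff = ["alice@corp.example", "bob@corp.example"]
# msg = build_mass_mail("hr@corp.example", "File update", "...")
# with smtplib.SMTP("mail.corp.example") as s:
#     s.send_message(msg, to_addrs=staff)  # envelope-only delivery
```

Because the recipient list exists only in the SMTP envelope, a twitchy reply-all finger can only ever hit the address in the To field.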
Dave Bittner: [0:04:58] Yes. OK.
Joe Carrigan: [0:04:59] Second - never use an external email address for your business email. It has a number of problems. No. 1, it means your business communications go to an outside third-party service that you don't pay for and therefore probably shouldn't trust - someone like Yahoo or Gmail.
Dave Bittner: [0:05:17] Right.
Joe Carrigan: [0:05:18] I mean, you can go out and you can buy the Gmail - I'm not saying that they're not trustworthy.
Dave Bittner: [0:05:22] Yeah.
Joe Carrigan: [0:05:22] It's just that that's not under your control.
Dave Bittner: [0:05:25] Right.
Joe Carrigan: [0:05:25] If you pay for a service, that's under your control.
Dave Bittner: [0:05:27] It's outside the moat.
Joe Carrigan: [0:05:28] Exactly. You should pay for the service for all your people to have emails, and your company policy should be that no corporate communications will be sent to external emails, maybe with the exception of HR emails that need to be sent...
Dave Bittner: [0:05:41] Yeah.
Joe Carrigan: [0:05:41] ...Before or after someone is employed.
Dave Bittner: [0:05:43] Right. So you have that uniformity there.
Joe Carrigan: [0:05:44] Exactly.
Dave Bittner: [0:05:45] And you know everybody's playing by the same rules.
Joe Carrigan: [0:05:47] Right.
Dave Bittner: [0:05:48] Yeah.
Joe Carrigan: [0:05:48] But generally speaking, business communications should be done to business emails that you control. And finally, this is something that might be a little more complex for the average user to implement, but a great idea would be to implement digital signatures for your email. So if you have a digital signature for your HR person, then when the HR person sends this out, people can see the signature, see that it's valid and that it came from the validated user. This requires some technical expertise, not only from the person in HR but from the people reading the email. And even if you have these signatures, it's still fine to go and check with the sender. It's a great idea, and you will be more secure, but it does have some overhead associated with it.
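To make the digital signature idea concrete, here is a toy sketch using textbook RSA with deliberately tiny numbers - purely illustrative, and nothing like the real S/MIME or PGP machinery a company would actually deploy:

```python
import hashlib

# Toy RSA keypair (the classic 61 x 53 textbook example). Real email
# signing uses S/MIME or PGP with large keys and proper padding; this
# only illustrates the sign-then-verify flow Joe describes.
N = 61 * 53      # public modulus (3233)
E = 17           # public exponent
D = 2753         # private exponent: (E * D) % 3120 == 1

def sign(message: bytes) -> int:
    """HR's side: hash the message, then apply the private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, D, N)

def verify(message: bytes, signature: int) -> bool:
    """Recipient's side: undo the signature with the public key and
    compare against a fresh hash of the message."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == digest
```

Anyone holding the public values (N, E) can check a signature, but only the holder of D could have produced it - the property that lets staff confirm an email really came from HR.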
Dave Bittner: [0:06:27] Right. Right. All right. Well, hats off to our listener who did the right thing by taking that extra step to check out to see if this was legit.
Joe Carrigan: [0:06:35] Yep. And thanks for telling us about it, too.
Dave Bittner: [0:06:37] (Laughter) And as for the other person, shame on you (laughter).
Joe Carrigan: [0:06:39] (Laughter) Well...
Dave Bittner: [0:06:41] I mean, obviously, it wasn't intentional.
Joe Carrigan: [0:06:43] Right. No. And that's...
Dave Bittner: [0:06:43] And that's the thing, right? That's the thing.
Joe Carrigan: [0:06:45] That's the thing. A lot of these things are not intentional.
Dave Bittner: [0:06:47] Yeah.
Joe Carrigan: [0:06:47] You know, this is essentially a data breach, what's happened. It's a very small data breach of one person. But this information has left the company network.
Dave Bittner: [0:06:54] Right. That's a lot of personal information.
Joe Carrigan: [0:06:57] Yeah.
Dave Bittner: [0:06:58] Yeah. All right. It's interesting stuff. Well, Joe, my story this week - this comes from Trend Micro. And this is about some criminals who are using malicious memes to communicate with malware.
Joe Carrigan: [0:07:11] Ah.
Dave Bittner: [0:07:11] Now, are you familiar with steganography?
Joe Carrigan: [0:07:14] I am familiar with steganography.
Dave Bittner: [0:07:15] OK. Describe for our listeners what steganography is.
Joe Carrigan: [0:07:17] Steganography is something that predates encryption. And it is a way of hiding a message in something that looks innocuous. Now, the message is not necessarily encrypted. In the very first example of steganography or the very early examples - I don't know if this is the very first. But in fact, you can read about this in "The Code Book" by Simon Singh, which is an excellent book on the history of cryptography. Generals in an army would take a piece of wood, and they'd carve a message into the piece of wood. And then they'd take wax and cover the board in wax and carve another message in the wax. And the person receiving it knew that the message in the wax was meaningless, to melt the wax off and read the message that was carved in the wood.
Dave Bittner: [0:07:53] I see. So if someone intercepted this piece of wood, they would think that the message carved in the wax, perhaps, was the message...
Joe Carrigan: [0:08:00] Right.
Dave Bittner: [0:08:00] ...Throwing them off the trail.
Joe Carrigan: [0:08:02] And then they'd send it on their way because they wouldn't want to disturb the message because once you collect the intelligence, you still want the enemy to act on it, right? But then you can anticipate it. And if you actually managed to scrape off the wax, you destroyed the original message. And when somebody received the hidden message, they'd already know that somebody knew the contents of the hidden message.
Dave Bittner: [0:08:21] Very clever, very clever. Yeah.
Joe Carrigan: [0:08:23] Yeah, a very good system.
Dave Bittner: [0:08:23] Well, this is along those lines. In this case, some bad guys are using memes to, basically, function as command and control servers.
Joe Carrigan: [0:08:33] Ah.
Dave Bittner: [0:08:34] Now, a command and control server - why don't you describe for us what that is, Joe?
Joe Carrigan: [0:08:37] So when you have a botnet - which is a bunch of computers that have some malicious software on them and are under your control - those computers have to be able to communicate some way to get their orders...
Dave Bittner: [0:08:50] Right.
Joe Carrigan: [0:08:50] ...Take, for example, a distributed denial of service botnet. You could just build a DDoS botnet to attack a single target.
Dave Bittner: [0:08:58] Yeah.
Joe Carrigan: [0:08:59] But that's not very useful.
Dave Bittner: [0:09:00] Right.
Joe Carrigan: [0:09:00] If I'm going to do this as a service, I need to be able to be paid. And then - who do you want me to DDoS?
Dave Bittner: [0:09:05] Right.
Joe Carrigan: [0:09:05] And then I need to be able to tell all those malicious bots, here's who I want you to attack next. In order to do that, I need a command and control server.
Dave Bittner: [0:09:12] Right. Here are your marching orders...
Joe Carrigan: [0:09:13] Exactly.
Dave Bittner: [0:09:13] ...So they check in with the server to see what they are ordered to do next.
Joe Carrigan: [0:09:18] Right. And the bots will be written so that they check in with the command and control server periodically. And even if the command and control server goes down, they have backup command and control servers available to them.
Dave Bittner: [0:09:26] So what the folks at Trend Micro discovered was that there was some malware that had been installed on some people's machines. And they're not sure how the malware got installed. But in order to get their commands, instead of reaching out to a dedicated command and control server, they would check a Twitter account.
Joe Carrigan: [0:09:45] Ah, OK.
Dave Bittner: [0:09:47] And they would...
Joe Carrigan: [0:09:47] That's also very common.
Dave Bittner: [0:09:47] Yeah. And they would look at this Twitter - specific Twitter account. And on this Twitter account, there were memes. And they were using steganography, hiding the commands within the graphic images of the memes.
Joe Carrigan: [0:10:00] Aha.
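The hiding technique Dave is describing can be sketched with the textbook least-significant-bit scheme: flip only the low bit of each pixel byte, which leaves the image looking unchanged. This is an illustration of the general idea, not the actual encoding Trend Micro documented; the command string and the fake "image" are invented:

```python
def embed(pixels: bytearray, command: str) -> bytearray:
    """Hide a command in the least-significant bit of each pixel byte.
    A terminating NUL byte marks the end of the hidden message."""
    out = bytearray(pixels)
    bits = []
    for byte in command.encode() + b"\x00":
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(out):
        raise ValueError("image too small for message")
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite the low bit only
    return out

def extract(pixels: bytes) -> str:
    """Recover the hidden command: collect low bits, stop at NUL."""
    data = bytearray()
    byte = 0
    for i, p in enumerate(pixels):
        byte = (byte << 1) | (p & 1)
        if i % 8 == 7:
            if byte == 0:
                break
            data.append(byte)
            byte = 0
    return data.decode()
```

The malware only needs the `extract` half; to anyone else, the meme is just a meme, because each pixel value changes by at most one.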
Dave Bittner: [0:10:01] And what's particularly useful about this is, you know, it's - Twitter, of course, is a legit social network.
Joe Carrigan: [0:10:07] Right.
Dave Bittner: [0:10:08] And you're going to want to, generally, give your employees or your network access to Twitter. It's a useful tool for business.
Joe Carrigan: [0:10:14] Right.
Dave Bittner: [0:10:15] So it's - you can't just say, well, let's block Twitter. Right? And traffic going out to Twitter doesn't look suspicious.
Joe Carrigan: [0:10:21] No, it doesn't.
Dave Bittner: [0:10:21] Someone going out to Twitter and taking a look at a graphic, there's nothing that raises a red flag about that. It's not like you're...
Joe Carrigan: [0:10:27] This is very creative, actually.
Dave Bittner: [0:10:28] ...Not like you're going out and hitting some server in, you know, some backwater in Russia or something, right? (Laughter).
Joe Carrigan: [0:10:33] Right, some weird URL or something that is...
Dave Bittner: [0:10:36] Right.
Joe Carrigan: [0:10:36] It looks like a bunch of random characters. And generally speaking, those are the ones they buy for command and control servers because nobody else is going to buy them.
Dave Bittner: [0:10:44] Right. Right.
Joe Carrigan: [0:10:44] It's not memorable.
Dave Bittner: [0:10:45] So the folks at Trend Micro witnessed this malware installed on machines reaching out to this Twitter account, looking at these images, which then gave them commands. In this case, the commands were to do screen captures.
Joe Carrigan: [0:10:56] Aha.
Dave Bittner: [0:10:57] So it would say, capture screen. Send it here. And all that was hidden within the graphic images. Since then, Twitter has closed down the account that was serving as the command and control location. But...
Joe Carrigan: [0:11:10] That doesn't mean they don't have another account...
Dave Bittner: [0:11:11] Yeah.
Joe Carrigan: [0:11:11] ...Up on Twitter.
Dave Bittner: [0:11:12] Right, exactly. It's the sort of the classic whack-a-mole game.
Joe Carrigan: [0:11:14] Yep.
Dave Bittner: [0:11:15] So an interesting use of a useful public service...
Joe Carrigan: [0:11:19] Right.
Dave Bittner: [0:11:19] ...Hiding something in plain sight.
Joe Carrigan: [0:11:21] Yeah. No, we see this a lot in command and control. They'll connect to IRC channels because there are actually still IRC servers out there. For anyone else who remembers communicating on IRC like it was the...
Dave Bittner: [0:11:30] Yes.
Joe Carrigan: [0:11:30] ...New cool thing...
Dave Bittner: [0:11:31] I do.
Joe Carrigan: [0:11:31] They're still out there. And, you know, actually, a lot of developers and other communities still use IRC. But they put IRC clients in their malware to go out and check in with the IRC channel on the right server and get their command and control stuff from it. You could do it with something as simple as putting it on Pastebin. There's all kinds of ways you can use these public services that are available, like Twitter, Pastebin, IRC, to control these botnets. It's very difficult to stop them.
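The pattern Joe outlines - a bot fetching ordinary public content from Twitter, Pastebin or IRC and scanning it for operator instructions - comes down to a simple parse step once the page is fetched. The `__cmd:` marker below is an invented convention purely for illustration:

```python
import re
from typing import Optional

# A bot in this scheme periodically fetches a public page and scans it
# for a command disguised as ordinary content. Only the parsing is
# shown; a real bot would fetch the page over plain HTTPS first.
CMD_PATTERN = re.compile(r"__cmd:(\w+)")

def find_command(page_text: str) -> Optional[str]:
    """Return the first embedded command in fetched text, or None."""
    match = CMD_PATTERN.search(page_text)
    return match.group(1) if match else None
```

To a network monitor, the fetch itself is indistinguishable from a user browsing the same service, which is why Joe says this kind of command and control is so difficult to stop.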
Dave Bittner: [0:11:56] All right. Well, those are our stories for this week. Joe, it's time to move on to our Catch of the Day.
Joe Carrigan: [0:12:01] All right.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [0:12:05] Our Catch of the Day this week was sent in by a listener named Kyle. And he starts off and says, warning. There are two risque photos in this text message conversation thread. Let me just say, Kyle, yes, there are.
Dave Bittner: [0:12:20] He says, oddly enough, at least two of my co-workers received these text messages a few days before me with the same photos. It looks to be a chat bot with the endgame of a webcam site that asks for credit card info. Now, there are a couple of things about this. First of all, the brilliance in Kyle's responses cannot go unnoticed.
Joe Carrigan: [0:12:39] Right (laughter).
Dave Bittner: [0:12:39] So let me describe to you - the first thing in this is, indeed, a photo, appears to be of a - well, it's definitely a young lady.
Joe Carrigan: [0:12:47] Yes, it is.
Dave Bittner: [0:12:48] I cannot see her face because, well, why would that be important in something like this?
Joe Carrigan: [0:12:52] (Laughter).
Dave Bittner: [0:12:53] She is wearing a T-shirt. However, she has pulled the T-shirt up from her waist, and she's holding it in her teeth...
Joe Carrigan: [0:13:00] Right.
Dave Bittner: [0:13:00] ...Revealing everything underneath - well, not everything underneath because she's wearing a black bra.
Joe Carrigan: [0:13:06] Right.
Dave Bittner: [0:13:06] So we're not seeing everything.
Joe Carrigan: [0:13:09] Right.
Dave Bittner: [0:13:09] But we're seeing enough. There's certainly - for regular, red-blooded men like you and me, Joe...
Joe Carrigan: [0:13:14] This would get my attention.
Dave Bittner: [0:13:15] It would get my attention, yes (laughter). So a fetching picture that was sent out - and so below this picture is Kyle's first response. And, Joe, I'll allow you to read Kyle's response.
Joe Carrigan: [0:13:29] Just send me the virus link.
Dave Bittner: [0:13:30] Virus? No, lol.
Joe Carrigan: [0:13:33] Just send me the virus link.
Dave Bittner: [0:13:36] Hey, I'm in your area. What are you up to? I'm in town for a few nights if you'd like to meet tonight.
Dave Bittner: [0:13:42] Just send me the virus link.
Joe Carrigan: [0:13:43] (Laughter) Just send me the virus link.
Dave Bittner: [0:13:44] (Laughter) I like the no nonsense...
Joe Carrigan: [0:13:47] Right.
Dave Bittner: [0:13:47] No - and I think this is a brilliant way to respond to these sorts of things.
Joe Carrigan: [0:13:51] Yeah, he sees this and - you know, I get these kind of followers on Instagram. And they send me a message. And I'm just like...
Dave Bittner: [0:13:57] Right.
Joe Carrigan: [0:13:57] Maybe I should say that next time. Just send me the virus link.
Dave Bittner: [0:13:59] Yeah. No, I love it (laughter). I love it.
Joe Carrigan: [0:14:01] Right.
Dave Bittner: [0:14:02] Yeah. All right. Well, Kyle, thanks for sending this in. This is a good one. All right.
Dave Bittner: [0:14:06] Well, coming up next we've got Carole Theriault. She's joining us with an interview with a physical pen tester who goes by the name Freaky Clown. But first, a word from our sponsors at KnowBe4.
Dave Bittner: [0:14:20] And now back to that question we asked earlier about training. Our sponsors at KnowBe4 want to spring you from that break room with new-school security awareness training. They've got the world's largest security awareness training library. And its content is always fresh. KnowBe4 delivers interactive, engaging training on-demand. It's done through the browser and supplemented with frequent, simulated social engineering attacks by email, phone and text. Pick your categories to suit your business. Operate internationally? KnowBe4 delivers convincing, real-world, proven templates in 24 languages. And wherever you are, be sure to stay on top of the latest news and information to protect your organization with KnowBe4's weekly CyberheistNews. We read it, and we think you'll find it valuable, too. Sign up for CyberheistNews at knowbe4.com/news. That's knowbe4.com/news.
Dave Bittner: [0:15:20] And we are back. Joe, it is good to welcome back to the show our U.K. correspondent, Carole Theriault. This week, she has got the first of a two-part interview, actually. Here's Carole.
Carole Theriault: [0:15:32] You guys are in luck today because I have something pretty juicy - so juicy, in fact, that we've decided to divide it into two parts. Now, this is Part 1. And here you guys will meet professional physical hacker Freaky Clown. Already, you're interested, right? Now, he'll explain to us what he does for companies who want to test their physical defenses. And he will reveal some of the tricks he gets up to to bypass typical security measures. This is fascinating insight, and it's more than popcorn-worthy. Listening to how he studies and looks at a target company might just get the rest of us to shift our thinking a little bit. Maybe if we take the time to think more like a predator, we might just get some valuable insight into some of our inherent weaknesses. Anyway, here's Part 1.
Carole Theriault: [0:16:22] FC, thank you so much for joining us on Hacking Humans. Now, before we start, do you want to tell our lovely listeners about the origins of your name, FC?
FC: [0:16:33] Yeah, sure. FC stands for Freaky Clown.
Carole Theriault: [0:16:36] Freaky Clown...
FC: [0:16:38] Yeah.
Carole Theriault: [0:16:38] That's unusual.
FC: [0:16:40] It's not the name my mother gave me.
Carole Theriault: [0:16:42] (Laughter).
FC: [0:16:43] That's much worse. No, Freaky Clown came from a nickname I had when I was a kid. And at the time, I was getting into computers and stuff. And I needed a hacker alias. Freaky Clown just sort of went into that really easily. I had several others at the time, but that one became popular and has stuck with me ever since.
Carole Theriault: [0:17:00] So the name chose you, almost.
FC: [0:17:02] Yeah, pretty much. It was - it's been with me now for, like, 35 years, something like that. So I can't get rid of it. I can't shift it.
Carole Theriault: [0:17:11] So you've shortened it to FC.
FC: [0:17:12] So I shortened it to FC because, yeah, generally, if you're in a meeting with some sort of high-power people, they don't like saying Freaky Clown. They prefer FC.
Carole Theriault: [0:17:21] Your company website Cygenta talks about you focusing on socio-technical knowledge. Can you give us a bit of insight of what that means?
FC: [0:17:33] We understand that security is not just one part, right? We don't just do cybersecurity because cyber is just one of the three areas that you need in order to have security. In a company, you need to understand that there's a physical element, which is one of my specialties. And then you've got the human side, the socio-technical side, the sort of crossover between how humans use technology and how they interact with it so they can be more secure in their use of it because if you've got a large company that says they're secure, if they're not good in the cyber technical section, if they're not good in the socio-technical side, if they're not good in the physical side, it doesn't matter.
FC: [0:18:14] It all falls apart because if you spend, like, you know, a hundred million pounds on all of the technical controls on securing your company and you've put in loads of training for people, if I can still walk into your building and steal all the servers, it's all for naught. And so - and the same with any of those other bits. You can have Fort Knox. But if the people don't have the culture of security, then they'll let you through the gate. If those two parts are working and your cybersecurity is terrible and people can just get in over the internet, then, again, you're kind of screwed.
Carole Theriault: [0:18:45] So that's really interesting. So you say you focus on the physical aspect of that. Do you mean literally breaking into a company and breaking into a server room, for instance?
FC: [0:18:55] Yeah. So my speciality at Cygenta is the technical and physical. So I do all of the, like - the really cool hackery stuff you all know about. But I also focus on the physical aspects of security. So in my career, I've broken into literally thousands of banks. Yeah. It's a great part of my job. I get to physically break into places. I get to steal stuff - all with permission, obviously.
Carole Theriault: [0:19:19] Yes. I was just going to say for our listeners that this is from a white hat perspective, isn't it?
FC: [0:19:24] Yeah. So people come to us as a company and say, OK. Look. We understand that we need some sort of security assessment. Will you come and use the same tools, techniques and tradecraft that criminals are using nowadays in order to break into our company with our permission, and then tell us how you did it? So that's where we come in. We sort of help companies become more secure by breaking into them.
Carole Theriault: [0:19:46] OK. You must have some great stories. Maybe you can share a few.
FC: [0:19:51] Where to even start? There's the time I got asked to steal a helicopter. That was quite cool. The stealing of a gold bar - that was pretty cool.
Carole Theriault: [0:19:58] OK. Whoa, whoa, whoa. Stealing a helicopter - you're going to have to tell us. So in trying to break in or steal something from a company with permission, presumably, only a tiny, few people know about this going on. And most people are in the dark. But there must be all these obstacles in the way to whatever the goal is.
FC: [0:20:18] So yeah. You'd think so. So how it all normally works is a company will come to us and ask us to do this assessment. And we will insist that the fewer people who know about it, the better. So this is, generally, someone who's in charge of the security or some board-level member. It depends what we're trying to test. If we're testing the executive board, they may not even know that this test is going to go on.
Carole Theriault: [0:20:42] Right.
FC: [0:20:44] You know, I've got into a couple of places where I've found letters saying that the board were being fired. So that was quite interesting - because they didn't know the test was going on. So yeah, the fewer people who know that the assessment is going on, the better, because it makes it more real world.
Carole Theriault: [0:20:59] And I guess your findings are much more useful then.
FC: [0:21:02] The point with this type of assessment is we'll always get in. So in the last, like, 20-odd years I've been doing this, I've always got in - a 100 percent record. There have only been two times where it's gone wrong. And that's not because of something I've done or not done but because someone in the loop hasn't wanted the assessment to happen. So let me give an example of this. I was asked to do a series of High Street banks. It was a ridiculous amount. I think it was something like eight a week in different parts of the country. So, like, every day, breaking into a new bank, then going to the next one and doing the same thing.
FC: [0:21:36] But one particular area manager didn't want this to happen. He thought it was an attack against his job. And so what he did was he subverted this whole process. And he went and told everyone. And so I turn up to do an assessment on one of these sites. And I go in, and I give them a little patter about how I need to speak to the manager, et cetera. And I get ushered aside to a waiting room. And it's a little bit unusual because of the story that I've given them. I should be being sort of ushered through quite quickly. So I'm waiting in this reception area. And there's a guy sitting next to me. And he's moaning about the mortgage that he's got with this bank and how they're screwing him over and how he's having a terrible time of it.
FC: [0:22:16] And frankly, I don't care. I'm just a little bit concerned that I'm sitting here waiting in this reception. And then 20 minutes go past. I'm still not seen. And then all of a sudden, blue, flashing lights everywhere. Armed police surrounded the bank, come running in. And I sort of looked at the guy. I'm like, dude, this is for me. And he's like, I thought I was having a bad day. So it turns out that this guy had tipped off his branches about this attempted thing. And the staff had got really nervous about it and thought I was genuinely trying to rob the bank. So they bypassed their processes, and they called the cops. And then the cops turn up. And I'm like, dude, sorry. I'm trying to rob this bank but not really. So that was an interesting job.
Carole Theriault: [0:23:00] Certainly not a boring day.
FC: [0:23:01] There's never a boring day at my job, which is nice.
Carole Theriault: [0:23:05] (Laughter) So maybe you can tell us about an example of where it went right. Like, how do you bypass the people? - so just a typical example. Say you're going into a bank. You've got the receptionist there. So yeah, I just want to walk through to kind of see how many people are actually involved in trying to get from point A to point B.
FC: [0:23:21] Yeah. So the first thing you want to do is have as little interaction with people as possible. The more people you are interacting with, the more likely you are to get caught. So...
Carole Theriault: [0:23:32] Right.
FC: [0:23:32] The first thing you want to do is make sure you're minimizing that. So if there's a reception staff, maybe try and find a way to bypass it without ever talking to them. That's going to be so much quicker to get in. But one of the first things that we do is we do a lot of reconnaissance work. And so, you know, this is like a day or maybe a week before the assessment. We'll go and just look at the site, look at the people, figure out, you know, their dress code, because one of the important things is dressing like the people that you're going to go and infiltrate.
Carole Theriault: [0:24:03] Right.
FC: [0:24:04] So for example, if you're going into, like, just a normal, everyday company, they're probably, you know, business casual. If you're going into, like, an investment bank, they're going to be in suits and ties. You know, are they wearing, like, really smart shoes? Are they wearing expensive suits? Do you need, like, to go and get a handmade suit to fit in? Otherwise, you're going to stand out.
Carole Theriault: [0:24:24] You'll stand out in your Marks & Spencer garb.
FC: [0:24:27] (Laughter) Exactly. I often hear a lot of people, they dabble in social engineering. They're like, oh, just get a high vis jacket, and you can walk in anywhere. That's, frankly, nonsense because if you're trying to break into an investment bank and you're wearing a high vis jacket...
Carole Theriault: [0:24:42] Yeah.
FC: [0:24:42] You're going to stand out. Like, obviously, you'll stand out. But they'll pick up on certain things. So for example, one client, you can't go on specific floors unless you're wearing a tie. They actually have a drawer of ties that they will lend you if you don't have one. So...
Carole Theriault: [0:24:56] Oh, lordy. I bet you they're based in the U.K.
FC: [0:25:01] And it's stuff like that. So you wouldn't be able to get away with certain things. You'd have to really dress the part that you are playing. There are times that a high vis jacket is going to come in handy. You know, if you're pretending to be a BT engineer or something like that, or an electrician or a delivery driver, then that's great. But most of the time, it's a fallacy that you'd be doing that. You know, you're not going to pretend to be the pizza delivery guy turning up because if you do that, you're going to get stopped at reception. They're not going to let just a pizza delivery guy go up on the floor where they hold the gold bars.
Carole Theriault: [0:25:38] You've bypassed reception, let's say. You've managed that. Check one. Done. You have come in looking the part so you fit in very well with the culture within the organization you're trying to break into. What's next?
FC: [0:25:51] So the important thing is to get a map of the place. This is really easy today because every major building, right, every large building will have a fire map. So if you just walk into the reception of any building, you can just go in and have a look. You could probably take a photograph or, in some cases, you can probably just rip it off the wall and walk out. No one's ever been arrested for stealing a map. (Laughter). So you can go in and get, like, a really good layout of the building. And it's really important that you remember the layout because at some point probably, you're going to have to do some running. It's a very energetic exercise sometimes, you know? So often you'll find that you're confronted by some security.
Carole Theriault: [0:26:32] And so therefore, you don't want to be a rat in a maze. Right? You definitely want to know the layout so you know where the exits are so you can get out of Dodge.
FC: [0:26:39] Exactly. I mean, I remember one bank actually in London, I managed to find a secret stairwell that wasn't on a map, actually. I was looking through cupboards, and I found this really tiny, thin cupboard. And I opened it, and there was a spiral staircase going out. I later found out that it was actually used by the high-end execs to sneak in and out of the building without their employees knowing. So it went all the way down to, like, their car park where they parked their Ferraris and stuff.
Carole Theriault: [0:27:06] I'm sure those exist in lots of companies. So you've now got the layout of the building. You're in. And, what, you're armed with a letter? So you're basically going to meet someone. You've set up a meeting?
FC: [0:27:15] No, no, no, no. No. So generally what we get is a letter of authorization, and this basically is a letter that just says, you know, this guy is here doing this assessment. And it's signed by whoever's in charge of the assessment, and here's some contact details.
Carole Theriault: [0:27:30] Get-out-of-jail-free card.
FC: [0:27:32] Basically a get-out-of-jail-free card. And I've never had to use one 'cause I generally don't get caught. So it's quite cool. But there's an extra little step that I always put in, which is I'll make a fake version of that letter so that if I ever get caught, I can produce the fake letter because then that tests their policies. Because it'll always say on there, you know, don't just use your phone to ring these numbers; go to your internal address book and look up the number of the person. Because what I do on the fake ones is I'll put, you know, the guy's proper name and then my mate's telephone number.
Carole Theriault: [0:28:08] Right.
FC: [0:28:09] And so what will happen is they'll try and verify that I'm supposed to be there, look at the letter, find the telephone number and just ring it. And that will be my work colleague, and they'll be like, yeah, he's fine. Let him go.
Carole Theriault: [0:28:21] That's so sneaky, FC.
FC: [0:28:25] (Laughter). I've never had to use it, but I taught that trick to a friend of mine, and he actually used it once. He turned up at a site and he got caught by a security guard, and they phoned me and they were like, are you, like, this guy? I was like, yeah. I'm the CEO. He's supposed to be doing a thing. Let him in. It's all fine. Just leave him alone to do his job. And so they let him in. (Laughter).
Carole Theriault: [0:28:49] Wow.
FC: [0:28:49] There's always another step that you can take.
Carole Theriault: [0:28:50] Well, there you have it, part one of our interview with Freaky Clown. I certainly never considered thinking about a company in this way, and maybe some of you out there were just like me. So hopefully this helped give a different perspective on how to think about security. This was Carole Theriault for "Hacking Humans."
Dave Bittner: [0:29:11] Boy, Freaky Clown is the guy you want to chat with at a cocktail party, isn't he?
Joe Carrigan: [0:29:15] Yeah. He is.
Dave Bittner: [0:29:16] (Laughter).
Joe Carrigan: [0:29:16] You know what? I am so disappointed that no one has ever asked me to steal a helicopter.
Dave Bittner: [0:29:22] Yeah.
Joe Carrigan: [0:29:22] I now have a life goal, Dave.
Dave Bittner: [0:29:24] (Laughter).
Joe Carrigan: [0:29:26] Also, I'll need a secret staircase.
Dave Bittner: [0:29:28] Don't we all?
Joe Carrigan: [0:29:29] Yeah.
Dave Bittner: [0:29:30] Don't we all?
Joe Carrigan: [0:29:31] Yeah. That's a great interview. I love listening to these stories. I love listening to these physical penetration testers talk about how they get in. I'm amazed that he has a 100 percent success record that he's never screwed up. He must be remarkably good, unless someone sabotaged him, of course...
Dave Bittner: [0:29:44] Yeah.
Joe Carrigan: [0:29:45] ...Like he told in that story. I love the idea of the fake get-out-of-jail-free card.
Dave Bittner: [0:29:50] Right.
Joe Carrigan: [0:29:50] You know? I might just start carrying these around...
Dave Bittner: [0:29:54] (Laughter).
Joe Carrigan: [0:29:54] ...Just on purpose. You know?
Dave Bittner: [0:29:55] (Laughter) Right. Right.
Joe Carrigan: [0:29:55] And just see if I can get some (laughter)...
Dave Bittner: [0:29:58] Yeah. Here's the one from McDonald's. Here's the one from Walmart. Here's the one from - you know, just as you go around shoplifting, as you do.
Joe Carrigan: [0:30:05] Right. (Laughter).
Dave Bittner: [0:30:07] But when you get caught, you know - I'm sorry. I'm just testing your security...
Joe Carrigan: [0:30:09] I'm testing your security. Here's...
Dave Bittner: [0:30:11] ...Here are my papers.
Joe Carrigan: [0:30:11] Here's my paper.
Dave Bittner: [0:30:11] Right.
Joe Carrigan: [0:30:11] And it's on Walmart letterhead, which I could just print up with an inkjet printer, I'm sure.
Dave Bittner: [0:30:16] Good stuff.
Joe Carrigan: [0:30:17] It is good stuff. That is a great interview. I'm really looking forward to part two.
Dave Bittner: [0:30:20] And, of course, always great having Carole Theriault join us. And yeah, looking forward to part two of that interview. And that is our podcast.
Dave Bittner: [0:30:27] We want to thank our sponsors KnowBe4, whose new-school security awareness training will help you keep your people on their toes with security at the top of their mind. Stay current about the state of social engineering by subscribing to their Cyberheist News at knowbe4.com/news. Think of KnowBe4 for your security training. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu.
Dave Bittner: [0:30:55] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben; our editor is John Petrik; technical editor is Chris Russell; executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [0:31:12] I'm Joe Carrigan.
Dave Bittner: [0:31:13] Thanks for listening.