Rachel Tobac: [0:00:00] Can you help me? Four most powerful words in social engineering - can you help me? And if you can use their lingo, you're in.
Dave Bittner: [0:00:09] Hello, everyone, and welcome to The CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from The CyberWire. And joining me, as always, is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hi, Joe.
Joe Carrigan: [0:00:29] Hi, Dave.
Dave Bittner: [0:00:30] As always, we got some interesting stories to share. And later in the show, we welcome Rachel Tobac. She's the CEO of SocialProof Security. But before we get to all that, a quick word from our sponsors, the good folks at KnowBe4.
Dave Bittner: [0:00:46] So what's a con game? It's fraud that works by getting the victim to misplace their confidence in the con artist. In the world of security, we call confidence tricks social engineering. And as our sponsors at KnowBe4 can tell you, hacking the human is how organizations get compromised. What are some of the ways organizations are victimized by social engineering? We'll find out later in the show.
Dave Bittner: [0:01:14] And we are back with some interesting stories to share. Joe, you're up first this week. What do you got for us?
Joe Carrigan: [0:01:19] Well, last two weeks, Dave, I've kind of had very dark stories.
Dave Bittner: [0:01:22] Yes, you have (laughter).
Joe Carrigan: [0:01:22] So I'm going to talk about something that's not so dark and maybe lighten it up a little bit.
Dave Bittner: [0:01:27] All right.
Joe Carrigan: [0:01:27] But if you think of a police sting as something - you generally think of that as something that takes a lot of time to set up, right?
Dave Bittner: [0:01:33] Right.
Joe Carrigan: [0:01:33] You may require the cooperation of someone on the inside of the operation. You may have to acquire some contraband to use as a prop, and I'll bet that's a lot of paperwork.
Dave Bittner: [0:01:43] (Laughter) Lots of skulking about.
Joe Carrigan: [0:01:45] Right. You have to - yeah, you have to have run the long con. So some smart police have come up with a way to social engineer people with outstanding warrants. So here's the con. The mark is told that they've won a prize of some kind.
Dave Bittner: [0:01:57] Now, when you say mark, what are we talking about here?
Joe Carrigan: [0:01:59] We're talking about the person with the outstanding warrant.
Dave Bittner: [0:02:01] OK.
Joe Carrigan: [0:02:01] It's old con-artist speak, I think.
Dave Bittner: [0:02:03] OK.
Joe Carrigan: [0:02:04] I acquired the word from Harry Anderson's book.
Dave Bittner: [0:02:06] Yeah, yeah, yeah, yeah. No, I was curious what - who were they targeting here? So the police are after folks who haven't showed up for a court date or something...
Joe Carrigan: [0:02:13] Right.
Dave Bittner: [0:02:13] ...Like that.
Joe Carrigan: [0:02:13] They may have an outstanding warrant for a DWI or - you know, generally, when a small, local police department does this, they're going after nonviolent offenders. But...
Dave Bittner: [0:02:20] Gotcha.
Joe Carrigan: [0:02:20] Don't jump ahead yet, Dave, (laughter)...
Dave Bittner: [0:02:21] Oh, I'm sorry. Go ahead.
Joe Carrigan: [0:02:23] ...Because I've got a really good one at the end of this.
Dave Bittner: [0:02:25] All right.
Joe Carrigan: [0:02:26] So here's the con. The person with the outstanding warrant is told, you've won a prize, but you got to show up in person to claim the prize...
Dave Bittner: [0:02:32] Oh.
Joe Carrigan: [0:02:32] ...Right? So that's step one. And they may be told with a phone call. They may be told with a letter. They may be told with email. They may be told on social media. But they say, you've won a prize; come on down. So when they show up at the facility - it's usually, like, a hotel or something. And when they get there, there's, like, cameramen with cameras and maybe a news reporter going, how do you feel about winning this?
Dave Bittner: [0:02:52] (Laughter) Right. The old Publishers Clearing House balloons and stuff. Yeah.
Joe Carrigan: [0:02:55] Exactly. Right.
Dave Bittner: [0:02:55] Right, right.
Joe Carrigan: [0:02:56] And then they walk in. And at the front desk, there's - you know, it's just a table set up with, like, four or five people sitting at it. Of course, everybody sitting at the table is a police officer. And they say...
Dave Bittner: [0:03:06] Out of uniform.
Joe Carrigan: [0:03:07] Right. Out of uniform.
Dave Bittner: [0:03:07] Right.
Joe Carrigan: [0:03:08] They're just dressed, you know, in street clothes. They're not - yeah, not uniformed police.
Dave Bittner: [0:03:11] Right.
Joe Carrigan: [0:03:12] And they validate the person's identification...
Dave Bittner: [0:03:15] (Laughter).
Joe Carrigan: [0:03:15] ...By saying, well, what's your name? All right. You're on the list. Yeah. Good, you're on the list. Let me see some ID to make sure it's you.
Dave Bittner: [0:03:21] Right.
Joe Carrigan: [0:03:22] And everything you'd think that a contest would need to do to validate your ID.
Dave Bittner: [0:03:25] Right. Makes perfect sense.
Joe Carrigan: [0:03:27] Right. And then they say, all right, your prize is in that room. Go into that room. And an officer escorts them, you know, not looking like an officer, looking like just some guy in a suit. And they're like, it's going to be great. It's going to be great. You're going to be so happy. And they walk into the room, and that's where the uniformed police officers...
Dave Bittner: [0:03:43] (Laughter).
Joe Carrigan: [0:03:43] ...Are that place them under arrest...
Dave Bittner: [0:03:45] Right.
Joe Carrigan: [0:03:45] ...And tell them that they're being arrested for an outstanding warrant (laughter).
Dave Bittner: [0:03:49] Yeah, wah-wah-wah (ph).
Joe Carrigan: [0:03:49] And that's the con.
Dave Bittner: [0:03:50] Right, right, right.
Joe Carrigan: [0:03:50] Now, there are videos of this on YouTube, and they're great to watch, I think.
Dave Bittner: [0:03:55] Yeah, and, well, they've been at this for a while, right? This is something that the police continue to do - and have been doing for a while - 'cause it works.
Joe Carrigan: [0:04:01] It does. In 1985, the U.S. Marshals went after some people in Washington, D.C., that had outstanding warrants. And this was great. They told them, you've won two tickets to a Redskins game, plus transportation to the game. Plus, you get to go to a pregame party at this location. So when they showed up, there were agents dressed like Washington Redskins cheerleaders...
Dave Bittner: [0:04:21] Oh.
Joe Carrigan: [0:04:21] ...Right? And they hugged the winners - in quotes, you know...
Dave Bittner: [0:04:23] Right.
Joe Carrigan: [0:04:23] ...These people - and that was essentially a frisk...
Dave Bittner: [0:04:26] (Laughter).
Joe Carrigan: [0:04:26] ...Right? - to make sure that they didn't even...
Dave Bittner: [0:04:29] Right. So the bad guy is thinking, oh, this is delightful. I'm getting hugged by a Redskins cheerleader. My dreams have come true...
Joe Carrigan: [0:04:35] Right, and meanwhile they're...
Dave Bittner: [0:04:35] ...Not knowing that they're actually being patted down to see if they're armed.
Dave Bittner: [0:04:39] All right. I love it.
Joe Carrigan: [0:04:40] Then they go to the next room. They get arrested. This whole thing works on greed, right? If you're somebody who has an outstanding warrant, chances are you know you have an outstanding warrant.
Dave Bittner: [0:04:48] Right.
Joe Carrigan: [0:04:48] So you're trying to maintain a low profile.
Dave Bittner: [0:04:50] You're looking over your shoulder.
Joe Carrigan: [0:04:51] Right. That desire to keep a low profile is overcome by the desire for free stuff.
Dave Bittner: [0:04:56] Right. Every time.
Joe Carrigan: [0:04:57] (Laughter).
Dave Bittner: [0:04:57] I love it. All right. Well, that's a good one. My story this week comes from the folks at SentinelOne. That's a security company. It's an article they published. It's called "The Weakest Link: When Admins Get Phished. OSX.DUMMY." So imagine you are a member of a public Slack or a Discord group or, you know, any one of those social messaging systems. And you're part of a group that is there for Mac tech support. So you are someone who works in tech support. Maybe you're an admin for a Mac network, and this is a place where you frequent. And folks are there. They help each other out, and you happen to reach out for help, let's say, for something with - you're having some network performance issues.
Joe Carrigan: [0:05:37] Right.
Dave Bittner: [0:05:37] And just when you think that you're out of luck, you get a direct message from one of the group's team admins. And that person has a solution. And the solution says, we've been aware of this issue for a while, and we've developed a way around it. You'll need to download and execute this tool with privileges to solve the problem.
Joe Carrigan: [0:05:55] (Laughter) It's a trap.
Dave Bittner: [0:05:56] We'd welcome your feedback on how it goes or if there's anything we need to do to improve the fix. All right? So you download it. Like, this is the solution to all my problems, right?
Joe Carrigan: [0:06:07] Right.
Dave Bittner: [0:06:07] Here I am. I'm in a group of trusted people. They've earned my trust. We've helped each other out over who knows how long. So I will download this script. I will execute it. And bam, they've got me. I've given them admin privileges.
Joe Carrigan: [0:06:19] Because you had to run it with privileges.
Dave Bittner: [0:06:21] That's right.
Joe Carrigan: [0:06:21] Yeah.
Dave Bittner: [0:06:22] That's right. So one of the points that they were making in this article was that admins are vulnerable, not dumb.
Joe Carrigan: [0:06:30] Right.
Dave Bittner: [0:06:30] Right? This is not a case - I mean, these are smart people - but it's not a case of them being stupid. It's a case of, like we say week (laughter) after week, people having earned their trust.
Joe Carrigan: [0:06:42] Yeah. Being conned is not an indication of lack of intelligence. It's an indication of being a human.
Dave Bittner: [0:06:47] Yeah.
Joe Carrigan: [0:06:48] And these admins are probably under a lot of pressure from their customers, you know, their customers being internal employees and other people. They're running a network, and the network might not be operating well.
Dave Bittner: [0:06:58] Right.
Joe Carrigan: [0:06:58] And here's somebody who has presented them with the fix - right? - on a platter.
Dave Bittner: [0:07:03] Who among us hasn't done this? You know, you have a problem. Something's wrong with your computer. You go out and search for the solution in an online forum.
Joe Carrigan: [0:07:10] Right.
Dave Bittner: [0:07:10] ...Because chances are someone else has had the same problem.
Joe Carrigan: [0:07:13] Absolutely.
Dave Bittner: [0:07:14] And someone smarter than you, or more experienced than you or just who got to it before you has the solution.
Joe Carrigan: [0:07:20] Yeah.
Dave Bittner: [0:07:20] I've certainly done that many, many times.
Joe Carrigan: [0:07:21] I've done that too. In fact, I've done that with some recent problems with my home computer. But the way I mitigated this risk was whenever I downloaded something, the very next place I went was to VirusTotal.com.
Dave Bittner: [0:07:33] Yeah.
Joe Carrigan: [0:07:33] And I uploaded it there.
Dave Bittner: [0:07:34] Well, to that point, the package that people would download scored a zero on VirusTotal.
Joe Carrigan: [0:07:39] Really?
Dave Bittner: [0:07:39] Yeah. Yeah. Now, I'm - I suspect that's probably been updated since then. But - yeah.
Joe Carrigan: [0:07:44] There's always a first infection.
Dave Bittner: [0:07:45] Well - exactly. So there was someone who responded to this on Twitter. His name is Remco Verhoef. And he said that maybe to protect users, it would help to, first of all, display admin badges for team admins so that someone can't pretend to be an admin...
Joe Carrigan: [0:08:01] Right.
Dave Bittner: [0:08:01] ...When they aren't, and also to disable the ability of non-admins to send direct messages to users. So you can only send messages in public, in the clear - right? - because why did this need to be a private message? It shouldn't have been.
Joe Carrigan: [0:08:13] Because if everybody saw it, they would expose it.
Dave Bittner: [0:08:15] Right.
Joe Carrigan: [0:08:16] As, hey, whoa, whoa, whoa - don't do that. Don't do that. So...
Dave Bittner: [0:08:18] But if it was legit, you'd want as many people to be able to find out about it as possible.
Joe Carrigan: [0:08:22] Right.
Dave Bittner: [0:08:23] Right? So there's an interesting flag to be aware of, that why would someone be sending a private message?
Joe Carrigan: [0:08:29] Yeah.
Dave Bittner: [0:08:29] Yeah.
Joe Carrigan: [0:08:30] That's a good point.
Dave Bittner: [0:08:31] Yeah. So it's an interesting one, something to keep an eye out for if you're using these sort of networks for teaming up with folks to solve problems - just something to keep in the back of your mind, that not everyone there is who they say they are, and some people may be up to no good.
Joe Carrigan: [0:08:44] That's right. Good advice for life.
Dave Bittner: [0:08:46] Yeah (laughter).
Joe Carrigan: [0:08:47] (Laughter) Right?
Dave Bittner: [0:08:48] All right, Joe. It's time for our Catch of the Day.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [0:08:54] What do you have for us this week?
Joe Carrigan: [0:08:55] Our Catch of the Day comes from a listener in Dublin. It's Philippe (ph).
Dave Bittner: [0:08:59] Philippe.
Joe Carrigan: [0:09:00] And he writes, hi, guys. I'd like to suggest something interesting from Europe for your Catch of the Day.
Dave Bittner: [0:09:05] All right.
Joe Carrigan: [0:09:06] So it turns out, a month ago, Philippe's iPhone X was stolen while he was in London on a business trip. He immediately locked his phone with the Find My Phone feature of iCloud.
Dave Bittner: [0:09:17] Right.
Joe Carrigan: [0:09:17] I'm not an Apple user, so I might be using all these terms wrong. But (laughter)...
Dave Bittner: [0:09:20] Yeah. That's all right. I won't hold it against you. But yes, you can log into iCloud and basically lock the phone.
Joe Carrigan: [0:09:25] Brick the phone so nobody can use it.
Dave Bittner: [0:09:27] Yep.
Joe Carrigan: [0:09:27] Right. He said he had some financial information on there. But the criminals were not able to unlock the device because it was secured and then locked with the iCloud feature.
Dave Bittner: [0:09:35] Right. Right. He was using Face ID too, so...
Joe Carrigan: [0:09:37] Right. Face ID.
Dave Bittner: [0:09:37] Yep. Yep.
Joe Carrigan: [0:09:38] So he has attached an email he received from scammers two months later. And I'm going to read the email to you now.
Dave Bittner: [0:09:45] All right.
Joe Carrigan: [0:09:45] OK? This is allegedly from Apple Support.
Dave Bittner: [0:09:49] Right.
Joe Carrigan: [0:09:49] And it reads as follows - hi, customer. Your Apple ID will be disable because of some violated policies. It's important - comma. Action required on your account. The following changes to your Apple ID where on July (laughter) 10, 2018. We've noticed that your account information appears to be invalid and unverified. We need to verify your account information in order for you to keep continue using your Apple ID account. Please open the attached file and verify your Apple ID before 24 hours, or your Apple ID will be disable. Sincerely, apple support.
Dave Bittner: [0:10:33] Apple Support with a lowercase A (laughter).
Joe Carrigan: [0:10:34] This whole thing is replete with capitalization and punctuation errors (laughter).
Dave Bittner: [0:10:38] Yeah, it's hard to describe to our listeners. For a company who is known for their meticulousness when it comes to design...
Joe Carrigan: [0:10:45] (Laughter) Right.
Dave Bittner: [0:10:45] ...This is laughable.
Joe Carrigan: [0:10:47] Yes. It is laughable.
Dave Bittner: [0:10:48] So obviously a scam here.
Joe Carrigan: [0:10:51] Obviously a scam.
Dave Bittner: [0:10:51] Right?
Joe Carrigan: [0:10:52] I mean, I wish we could show our listeners the capitalization. In the first sentence, Apple ID is all lowercase. At the end of it, it's Apple with a capital A, and ID is capitalized - both letters.
Dave Bittner: [0:11:03] Right.
Joe Carrigan: [0:11:04] The sentence ends with a comma instead of a period.
Dave Bittner: [0:11:06] (Laughter) Right. Right. Some sentences have no punctuation whatsoever.
Joe Carrigan: [0:11:10] So - yeah.
Dave Bittner: [0:11:11] Yeah. Yeah.
Joe Carrigan: [0:11:11] And there's - it's one incredibly long run-on - this is just...
Dave Bittner: [0:11:14] Right.
Joe Carrigan: [0:11:14] If I got this, I'd be like, oh, jeez.
Dave Bittner: [0:11:16] Yeah.
Joe Carrigan: [0:11:17] You guys got to try harder (laughter).
Dave Bittner: [0:11:18] Now, it's interesting they were able to track him down. If you plug an iPhone into a Mac, it will give you some information about the phone. It'll give you the phone number. It'll give you however the person named the phone. So if - you know, if Philippe had said - you know, had named it Philippe's phone and included his last name, it's certainly plausible that they would have enough information to start trying to track him down. They could cross-reference some things.
Joe Carrigan: [0:11:42] Right.
Dave Bittner: [0:11:43] And maybe that's how they were able to reach out to him.
Joe Carrigan: [0:11:45] Could it be also that these two things are unrelated? That this is just - you know, he lost his phone, and that's unfortunate - but then he suddenly gets this email from a scammer trying to get him to run this app just 'cause.
Dave Bittner: [0:11:54] Yeah, yeah. Fortunately, Philippe is a cybersecurity professional.
Joe Carrigan: [0:11:57] That's right. So this did not work on Philippe.
Dave Bittner: [0:12:00] His radar was up. And he did not fall for it.
Joe Carrigan: [0:12:05] This one did not slip through his crap detector.
Dave Bittner: [0:12:07] That's right. Well, thank you, Philippe, for sending this in. And, of course, we love it when folks send in things. So if you have something that you think would be good for our Catch of the Day, please go ahead and send it to us. You can find our contact information on the CyberWire website. That's thecyberwire.com. And that is our Catch of the Day. Coming up next, we've got my interview with Rachel Tobac. But first - a message from our sponsors at KnowBe4.
Dave Bittner: [0:12:32] And now we return to our sponsor's question about forms of social engineering. KnowBe4 will tell you that where there's human contact, there can be con games. It's important to build the kind of security culture in which your employees are enabled to make smart security decisions. To do that, they need to recognize phishing emails, of course. But they also need to understand that they can be hooked by voice calls - this is known as vishing - or by SMS text, which people call smishing. See how your security culture stacks up against KnowBe4's free test. Get it at knowbe4.com/phishtest. That's knowbe4.com/phishtest.
Dave Bittner: [0:13:20] And we are back. Joe, I recently had a great conversation with Rachel Tobac. She is the CEO of SocialProof Security. She is well-known in the world of social engineering security. And in fact, she's demonstrated her social engineering skills in a number of Social Engineering Capture the Flag competitions. So here's my conversation with Rachel Tobac.
Rachel Tobac: [0:13:41] I discovered social engineering in the past five or so years. My husband actually - he's a cybersecurity researcher. And he was the person that went to DEFCON. He went first. And he actually gave me a call and told me that I needed to buy a ticket and come to DEFCON and watch the Social Engineering Capture the Flag because he could see me doing it one day. And I didn't believe him at all. How is that related to teaching, which is what I was doing at the time, and being a community manager? I was completely nontechnical at the time. But I had a background in applied behavior analysis, which is basically using persuasion to help people. And I used it to help people as a special educator, a community manager, a user researcher - many different things, but not in the cybersecurity world. So, you know, he kind of figured out that that was going to be what I should do.
Dave Bittner: [0:14:27] So you go to your first DEFCON as an observer first. Did you get the bug? Did you say, yeah, this is me?
Rachel Tobac: [0:14:33] I went to DEFCON, made my fake badge and basically just hunkered down and sat in the front row of the Social Engineering Capture the Flag. And I watched people in the soundproof booth making their calls. The people that I saw, they did a great job. But they kept getting voicemail after voicemail. If you've ever been to the Social Engineering Capture the Flag, it's really tough. It takes a lot of luck for people to even pick up the phone. It happens. The calls that I was watching were happening on a Saturday. And so to get somebody like an employee of a company to pick up their phone on a Saturday, that's hard in and of itself. And so that year, I ended up watching a lot of people get voicemails.
Rachel Tobac: [0:15:06] But the calls that did go through were so impressive. And I just kept thinking, this is the exact same thing that I do when I get our bills lowered for Comcast or when I call Verizon and try and merge accounts. Or, you know, like, I'll call my pet insurance company and try and get information about my account without authenticating first. And I never knew that that had a name. And after watching SECTF at DEFCON, I realized that it did. And I applied the next year. And Chris Hadnagy, who leads the entire village, he took a chance on me.
Dave Bittner: [0:15:35] Yeah, Chris was actually the first guest on this podcast. He's an interesting guy. So you win two years in a row. And how does that transition into deciding this is what you want your career to be?
Rachel Tobac: [0:15:46] It was kind of organic. So what happened is I ended up getting second place two years in a row. The top two win. And I started talking with people about DEFCON. And people were like, you know, you really should start telling your story. You should start getting some talks. And as soon as I got off the winner's podium at DEFCON my first year, an organization called Women in Security and Privacy - WISP, a nonprofit that helps advance women in the fields - they actually approached me. And they're like, we want you to join our board, which was a huge break for me because I was able to learn from people in the field. They took me under their wing. And the more I told people about my story, the more they realized that I should start giving some talks and let people know my experience because it was pretty nonlinear. And I didn't really come from the InfoSec world, so maybe I can lend a different type of viewpoint.
Rachel Tobac: [0:16:32] So I started giving talks. And the more talks I would give, the more people would come up to me afterwards and say, you know, this talk, I would like you to give it at my work. Can you do that? I was like, I guess. So I started thinking, if I'm going to do that, I should probably form an LLC. So that's what I did. And it kind of happened organically in that way, where it started with, you know, just giving talks at people's work. And I was kind of sharing my story, making sure that people understand that maybe you don't look like everybody that you've seen in cybersecurity, or maybe you've gone to a conference before, and you walked around, and you felt like you didn't belong.
Rachel Tobac: [0:17:03] But, you know, I felt like I didn't belong before, and I actually do. So there are many different things you can do in InfoSec. And I think it helps people to understand that even if there's people that might not look like you right now or sound like you right now, there might be in the future. Or maybe you could help kind of start that process. So it kind of started with getting other women involved in cybersecurity and then expanded from there to pen tests, OSINT work.
Dave Bittner: [0:17:27] Now, do you find that there's a mismatch between people's perception of the technical side of cybersecurity and the human and social engineering side? People come and knock on people's doors all the time and say, you need this technical solution; we're going to protect all your data. But I suspect they don't get as many people knocking on their door for the types of services you provide.
Rachel Tobac: [0:17:47] Yeah. I haven't had to do a lot of door knocking. I've been pretty lucky. Pretty much every time I give a talk, there's a line of people who are like, I didn't realize this is a threat. You've just changed my understanding of our threat model. Can you come in next week? And I think we're at kind of like a golden age of social engineering, where I do have a lot of luck in the time that I came in here, where people don't know a lot about what can be done, how they can protect their employees. And we're really just starting to see the media show examples of how their employees got tricked, or we're just starting to really understand that the majority of cyberattacks do start with social engineering. That's not something that we've understood for very many years.
Dave Bittner: [0:18:23] Do you think that the emphasis on social engineering from the bad guys' point of view - could that be the result of sort of a maturation of the defenses on the technical side? If it's harder to get in using traditional technical attacks, they have to shift their tactics and go after the human side of things.
Rachel Tobac: [0:18:41] That definitely could be it. If I said that that was definitely true, I'd be lying 'cause there's definitely no way for me to know exactly that that's the case. But that's a guess of mine. I do think that a maturation in technical security makes it more complicated. But I think that it actually might be that people have always been using social engineering. It's just that now we realize it. In the past, people might not have been really privy or understanding just how many people are walking through their door, giving them calls, reaching out to their employees over social media. It just wasn't widely realized as a thing. I think it could be both of those things. But I think it's also that people are seeing it more in the media. They understand it more as a threat now. And it might not actually be happening any more than it ever was. It's a pretty old tactic. But we know more about it now, so we see it more.
Dave Bittner: [0:19:28] Now, when you go in and present to people and do trainings and so forth, what do you witness in terms of people having aha moments? Are there things that people generally don't know about that you sort of shine a light on?
Rachel Tobac: [0:19:40] Depends on the type of team that I'm working with. If I'm working with a red team, there's a lot of laughter (laughter). Like, they're like, oh, yeah, that's something that we've wanted to try for a while. And working with them to craft some of their next red team exploits, that's always fun. But if I'm working with, say, a finance organization at a big tech company or something like that, there are a lot of moments where employees will stare and then go, oh, no. And I see them take their phone out. And they start scrolling through their social media, deleting posts.
Rachel Tobac: [0:20:09] And I think that's one of the most important things, is that there are small things that you can do to understand how much information you're putting out into the world. I think sometimes people don't realize, oh, my gosh, I did share that information of where I just traveled or I did tweet about how I purchased that thing on Amazon? And that really could be used against me. I need to be aware of that. So, yeah, people start to take notes. They start scrolling through their phone, going through their email. And they come up to me after the talks, and they're like, can you help me figure out what to scrub on my Instagram (laughter)? And that's, like, one of the most fun things to do after a talk, is just sit with five people and help them figure out what to scrub or walk through, here's what I would do with this information. Here's how I would build trust with you knowing that Alex (ph) is your best friend at work. Here's what I would pretend to say over the phone.
Dave Bittner: [0:20:57] What do you suppose people's biggest weaknesses are when it comes to having these methods used against them?
Rachel Tobac: [0:21:02] Humans in general have an inherent willingness to trust people who sound like they know what they're talking about. So if I come from a place of authority, I don't have to ever raise my voice or sound mean. My voice sounds exactly like I'm talking to you right now. And if I can come from a place of that authority, then people are more likely to comply because they're trusting. And I think that's a part of human nature. We want to be empathetic. If the person on the other end of the line is saying, hey, work with me here, you're likely to do so. And that's not a bad thing. You know, I wouldn't want our culture to get to a point where we can't have that empathy or that EQ or that trustworthiness with each other.
Rachel Tobac: [0:21:36] But the thing that I try and highlight with my trainings is we need to be politely paranoid. We don't need to question every single thing that happens in our lives. But if we notice something feels a little off, if we have that feeling of this feels weird, I don't - now they're asking questions that are a little more personal, we don't have to feel crazy for stopping the conversation mid-conversation and saying, you know what? You're starting to ask me some stuff that I don't want to answer. I'm not allowed to answer that.
Rachel Tobac: [0:22:00] And I think one of the biggest challenges is that human beings are reciprocal. We know this from Cialdini's principles of persuasion. They're reciprocal, and they rely a lot on commitment and consistency. And so if you start telling me information over the phone, you're highly unlikely to stop if it's already been 20 minutes. But I help kind of give people a different way out, give them some phrases that they can say so that they don't need to continue or they don't need to feel crazy. They're not bad people if they need to question things or people walking through their doors.
Dave Bittner: [0:22:27] And what do you find in terms of the effectiveness of training? How effective is it? Are people receptive? And what sort of tools do you provide them with, as you say, to not walk around being completely paranoid but also have...
Rachel Tobac: [0:22:40] (Laughter).
Dave Bittner: [0:22:40] ...You know, having their guard up, I guess dialing it in in an appropriate sort of way?
Rachel Tobac: [0:22:45] Every organization that I work with is slightly different. Now, if I'm working with a bank, a large financial institution, they need to be a little bit more paranoid than maybe a mom and pop shop, right? Their threat model's a little different. So we tend to work with them and make it specific to who they are and the types of calls that they might get, emails they might get, people who might try and walk through their door.
Rachel Tobac: [0:23:02] We found that the most effective thing to do is to actually put people in the place of a hacker for at least 30 minutes - give them a target, give them a chance to think about, what information would I look for? How would I go after this person? What phone number would I call? Oh, I would find it from their business card on their Instagram. Oh, got it. I should probably scrub mine. It's really abstract if we just give kind of high-level tips. But if we actually have somebody sit in the place of the hacker and think about what they would do, what a criminal might do, they're much more likely to carry that over into their personal and professional lives and actually be able to think constructively about what they want to change in their life to be more effective when securing their data, information, computers, all of that. We've seen it be extremely effective, more effective than just phishing campaigns alone or, you know, slide decks. I think that's challenging. It's just kind of death by PowerPoint. If you can actually have someone try it, that's where the real change happens. That's where you actually see that click-through rate go down and the phishing rate go down.
Joe Carrigan: [0:24:01] That was a great interview, Dave.
Dave Bittner: [0:24:02] Thank you.
Joe Carrigan: [0:24:02] I think she's correct about the twofold reason we're seeing the increase in social engineering, both being that we're noticing it more and that it is increasing because the technology's getting better. I like to look at a lot of these things as economic problems and that people are now the weakest link and the most cost-effective means of getting into an organization. So...
Dave Bittner: [0:24:20] Right. As the other methods become more expensive because the defenses are better...
Joe Carrigan: [0:24:24] Right.
Dave Bittner: [0:24:24] ...Good old-fashioned cons still work.
Joe Carrigan: [0:24:27] Exactly. And we need to inoculate ourselves. And that's kind of the mission of this podcast, right?
Dave Bittner: [0:24:32] Right. Yeah.
Joe Carrigan: [0:24:32] So...
Dave Bittner: [0:24:32] Interesting.
Joe Carrigan: [0:24:33] ...We're doing good for the world here.
Dave Bittner: [0:24:35] (Laughter) Rachel was great. I - and I really enjoyed the conversation with her, learned a lot.
Joe Carrigan: [0:24:39] Couple more things - one, sometimes when I buy things online, the website says, share this purchase on your social media accounts. And she talked about it. Why would I ever want to share a purchase I make on a social media account?
Dave Bittner: [0:24:52] I guess other than bragging.
Joe Carrigan: [0:24:54] Yeah.
Dave Bittner: [0:24:55] You know, look at this new Ferrari I just bought.
Joe Carrigan: [0:24:57] Yeah. I guess.
Dave Bittner: [0:24:57] (Laughter).
Joe Carrigan: [0:24:58] I'm not going to tell anybody I just bought a new Ferrari. First off, I'm never going to buy a Ferrari.
Dave Bittner: [0:25:01] (Laughter) You're a Porsche man (laughter)?
Joe Carrigan: [0:25:03] I'm a Toyota guy mostly (laughter).
Dave Bittner: [0:25:05] I see. I'm with you. Got you. All right. Very good (laughter).
Joe Carrigan: [0:25:07] Maybe a Mazda guy when I get a convertible for my midlife-crisis car.
Dave Bittner: [0:25:10] OK. Sure.
Joe Carrigan: [0:25:11] But I really like the term politely paranoid. I might personally live at a level of paranoia that some might describe as unhealthy.
Dave Bittner: [0:25:17] (Laughter).
Joe Carrigan: [0:25:17] But I do like the politely paranoid. You know, it's OK to stop the conversation. And finally, the thing I really like about her is the think like an attacker, think like a hacker, think like a criminal.
Dave Bittner: [0:25:29] Yeah.
Joe Carrigan: [0:25:29] I have a great example of this.
Dave Bittner: [0:25:30] All right.
Joe Carrigan: [0:25:30] And this is kind of a rather crass story. But we...
Dave Bittner: [0:25:34] (Laughter) Go on.
Joe Carrigan: [0:25:35] My family was on the road one day.
Dave Bittner: [0:25:36] Yeah.
Joe Carrigan: [0:25:37] And we're driving. This is up in western Maryland. And there is - on the side of the road, there's a school bus for sale. And my wife goes, we could buy that school bus and turn it into a camper.
Dave Bittner: [0:25:47] Right.
Joe Carrigan: [0:25:48] And I go, camper? Just think of all the kids we could abduct with that school bus.
Dave Bittner: [0:25:54] (Laughter).
Joe Carrigan: [0:25:54] Right?
Dave Bittner: [0:25:55] Right. Right. Right (laughter).
Joe Carrigan: [0:25:55] Because that's my thought process, right?
Dave Bittner: [0:25:57] (Laughter).
Joe Carrigan: [0:25:57] Why is it OK for someone to sell a school bus? I could just buy a school bus...
Dave Bittner: [0:26:01] Right.
Joe Carrigan: [0:26:01] ...Follow a school bus on its route and then precede that school bus by a couple of minutes and pick up kids.
Dave Bittner: [0:26:07] Right (laughter).
Joe Carrigan: [0:26:08] I mean, I'm not going to do that. I'm absolutely not going to do that.
Dave Bittner: [0:26:10] Sure.
Joe Carrigan: [0:26:11] That would be terrible and horrible.
Dave Bittner: [0:26:12] Right.
Joe Carrigan: [0:26:13] But that's the way I think like an attacker.
Dave Bittner: [0:26:15] Yeah.
Joe Carrigan: [0:26:16] And sometimes that offends a lot of people.
Dave Bittner: [0:26:18] (Laughter).
Joe Carrigan: [0:26:19] And I think that is a huge problem - that we don't think like attackers, and we think that we're bad people for having these thoughts. Having the thought of the opportunity - that that could be used to abduct children - does not make you a bad person. What makes you a bad person is actually doing that.
Dave Bittner: [0:26:35] Right.
Joe Carrigan: [0:26:35] Right? But thinking, hey, that's a vulnerability - how many times have - when your kids rode a school bus, did the school bus ever show up with a bus number just written on a piece of paper stuck in a window?
Dave Bittner: [0:26:45] Oh, sure. The bus breaks down, the bus driver's sick. Yeah.
Joe Carrigan: [0:26:48] Yeah.
Joe Carrigan: [0:26:49] It's a perfect opportunity. So...
Dave Bittner: [0:26:50] You know, it's funny. I remember when I was a kid. And you remember - did you have safeties, kids who were...
Joe Carrigan: [0:26:54] Yeah - yes...
Dave Bittner: [0:26:54] ...Sort of deputized to be - (laughter) to be extra - you know, they had a special little sash that they wore. Yeah.
Joe Carrigan: [0:27:00] Yeah, they loved that sash.
Dave Bittner: [0:27:00] Yeah. We had that too. But I remember one of my friends was a safety. Evidently, I was not chosen, eligible, whatever.
Joe Carrigan: [0:27:07] Right. Neither was I.
Dave Bittner: [0:27:08] I'm still bitter about it. But...
Joe Carrigan: [0:27:09] (Laughter).
Dave Bittner: [0:27:11] But the safeties were actually trained as to what license plate numbers the county buses had - so like, the prefix on the buses. So this is, you know, however many years, decades ago.
Joe Carrigan: [0:27:23] Right.
Dave Bittner: [0:27:23] There were - people were thinking about this.
Joe Carrigan: [0:27:24] Well, that's good.
Dave Bittner: [0:27:25] Yeah.
Joe Carrigan: [0:27:26] I am - I - actually, I'm very glad to hear that because that answers my question.
Dave Bittner: [0:27:29] (Laughter) Yeah. Right.
Joe Carrigan: [0:27:30] How do I know that that's the right bus?
Dave Bittner: [0:27:31] Yeah. Yeah.
Joe Carrigan: [0:27:33] But there was never a safety at my bus stop.
Dave Bittner: [0:27:35] OK.
Joe Carrigan: [0:27:36] But that was high school, so...
Dave Bittner: [0:27:37] Yeah. They kind of lorded over us. So...
Joe Carrigan: [0:27:39] Yeah.
Dave Bittner: [0:27:39] Yeah (laughter).
Joe Carrigan: [0:27:40] In elementary school, I never rode a bus to elementary school because the school was right behind my neighborhood. So we just walked to it.
Dave Bittner: [0:27:45] Oh, interesting. I only rode a bus to elementary school.
Joe Carrigan: [0:27:47] Really?
Dave Bittner: [0:27:47] Yeah. Yeah. The limo drove me the rest of the time. Anyway...
Joe Carrigan: [0:27:52] (Laughter).
Dave Bittner: [0:27:52] (Laughter) So our thanks to Rachel Tobac for joining us. She is on Twitter, @RachelTobac. And the name of her company is SocialProof Security. We appreciate her taking the time with us. That is our show. Thanks to everybody for listening.
Dave Bittner: [0:28:05] And of course, thanks to our sponsors at KnowBe4, the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can order up at knowbe4.com/phishtest. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu.
Dave Bittner: [0:28:29] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our coordinating producer is Jennifer Eiben. Editor is John Petrik. Technical editor is Chris Russell. Executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [0:28:45] I'm Joe Carrigan.
Dave Bittner: [0:28:46] Thanks for listening.