Hacking Humans 5.31.18
Ep 1 | 5.31.18

Social Engineering works because we're human.

Transcript

Christopher Hadnagy: [00:00:02] I don't think it's fair to say that if you fall for this vector that you're stupid. I think that social engineering works on us because we're human.

Dave Bittner: [00:00:19] Hello, everyone, and welcome to The CyberWire's Hacking Humans podcast, where we take a look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Joe, welcome. Good to see you, as always.

Joe Carrigan: [00:00:40] Good to be here, Dave.

Dave Bittner: [00:00:41] We've got some great stories to share today. Later in the show, we've got an interview with Christopher Hadnagy. He's the author of the book "Social Engineering: The Art Of Human Hacking." But before we get to all that, a quick word from our sponsor, the fine folks at KnowBe4. So how do you train people to recognize and resist social engineering? Here are some things people think. Test them, and if they fall for a test scam, fire them. Or other people say, if someone flunks the test, shame them. Instead of employee of the month, it's doofus of the day. Or maybe you pass out a gift card to the one who gets the A-plus for skepticism in the face of phishing. So how about it? What do you think, carrots or sticks? What would you do? Later in the show, we'll hear what the experts at KnowBe4 have to say. They're the sponsors of this show.

Dave Bittner: [00:01:36] All right. We are back, and each of us has an interesting story this week. Joe, why don't we start off with yours? What do you have for us?

Joe Carrigan: [00:01:42] All right. Mine comes from the Anti-Phishing Working Group, the APWG. They have released their report for the fourth quarter of 2017. And one of the findings was that there has been a large increase in phishing sites that use HTTPS, the protocol that encrypts traffic between your browser and the website.

Dave Bittner: [00:02:00] Right.

Joe Carrigan: [00:02:01] The percentage of the sites that are using this is 30 percent, whereas last year, it was only 5 percent. So that's a six times increase.

Dave Bittner: [00:02:08] So walk us through the - what exactly HTTPS is just to review what that means.

Joe Carrigan: [00:02:14] Well, from a user standpoint, it's the feature of your web browser that puts a little green lock in the address bar of your web browser. Like I'm looking at Chrome right now.

Dave Bittner: [00:02:24] Right.

Joe Carrigan: [00:02:24] And there's a little padlock - green padlock, and it says secure. I can click on it. And I can view all kinds of site information about the certificates. And the APWG conducted an informal survey and found out that 80 percent of users thought that the green padlock meant the site was secure.

Dave Bittner: [00:02:41] Right.

Joe Carrigan: [00:02:42] But that's not true because maliciously registered domain names can easily get an HTTPS certificate. So there's some user training that needs to go on. It's important for people to understand what the padlock symbol means. First, it's a feature of your web browser, right? The software on your computer - the padlock - is telling you that the web browser believes the connection to the website is secure and that the website has a valid certificate. That's it.

Dave Bittner: [00:03:12] So that means that the connection between my computer and whatever server I'm connected to, rather than being clear text...

Joe Carrigan: [00:03:20] Right.

Dave Bittner: [00:03:20] ...It is encrypted, and that encryption has been verified as being valid and secure.

Joe Carrigan: [00:03:26] Correct. So, you know, years ago - back in the early 2000s, late '90s even - we would tell people, when you're going to a website to do any kind of shopping, make sure that the little green padlock is there - actually, back then, it wasn't even green. It was - on Mozilla - or not Mozilla, I guess then it was called Netscape.

Dave Bittner: [00:03:46] Right.

Joe Carrigan: [00:03:47] Dating myself. But it was a little padlock that would be open if your site was not secure, and then it would be a closed padlock with a yellow background that would be there if your site was secure.

Dave Bittner: [00:03:58] Right.

Joe Carrigan: [00:03:59] And that was how you'd tell. And we'd say, don't enter your credit card information unless the site has that padlock. And that kind of evolved, I guess, socially to, you know, the site's secure if the padlock is on. And then the padlock went through the user interface evolution to become the big green thing you see up top, so it's more present, more visible. You don't have to go hunting for it. It's right there next to the URL. But basically, the padlock means that it's very unlikely - and when I say very unlikely, I mean really, really, really unlikely - that some third party is eavesdropping or changing the data in transit. It does not mean that the website is safe.
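
A minimal sketch in Python of roughly what the browser is doing behind that padlock: it checks that the connection is encrypted and that the certificate chains to a trusted authority and matches the hostname - and nothing more. The hostname here is just a placeholder, and this is an illustration rather than anything discussed in the episode.

```python
import socket
import ssl

def inspect_tls(hostname: str, port: int = 443) -> None:
    """Open a TLS connection the way a browser would and print what was verified."""
    context = ssl.create_default_context()  # uses the system's trusted certificate authorities
    with socket.create_connection((hostname, port), timeout=5) as sock:
        # wrap_socket validates the certificate chain and the hostname;
        # it raises ssl.SSLCertVerificationError if either check fails.
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
            print("Protocol:", tls.version())
            print("Issued to:", dict(item[0] for item in cert["subject"]))
            print("Issued by:", dict(item[0] for item in cert["issuer"]))
            print("Valid until:", cert["notAfter"])
            # None of this says the site itself is trustworthy -- a phishing
            # domain with its own valid certificate passes every check here.

inspect_tls("example.com")  # placeholder hostname
```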

Dave Bittner: [00:04:37] I think I reflexively look for that lock, particularly, like you say, if I'm entering some credit card information or filling out any kind of form, I look for that lock to make sure that it is a secure connection. So I guess that's not a bad thing. But the point here is not to be falsely put at ease by the fact that it's there.

Joe Carrigan: [00:04:56] Right. Exactly. No, what you're doing is 100 percent correct. You should look for that lock even before you start entering any personal information. Before you even do that, you should make sure that you trust this website.

Dave Bittner: [00:05:07] So the bad guys are taking advantage of the fact that we are used to seeing that lock.

Joe Carrigan: [00:05:12] Correct.

Dave Bittner: [00:05:13] And many of us incorrectly think that that lock means that everything's fine.

Joe Carrigan: [00:05:18] And that's right. That's what everybody - oh, there's some kind of picture in my mind, something where somebody is going, everything's fine.

Dave Bittner: [00:05:25] (Laughing) Right. Right.

Joe Carrigan: [00:05:27] I don't know.

Dave Bittner: [00:05:28] Yeah. But it turns out that the bad guys can get their hands on a valid certificate.

Joe Carrigan: [00:05:34] Very easily.

Dave Bittner: [00:05:34] And they can generate that lock, and that's what throws you off.

Joe Carrigan: [00:05:38] That's right.

Dave Bittner: [00:05:39] So how do we protect ourselves against this?

Joe Carrigan: [00:05:41] Awareness, awareness, awareness. Really, that's the only thing. You know, and it's an excellent question because, you know, to the average user, to the layman, the only thing that they can do to protect themselves is first to become aware of what this means and what the difference between trusting the site and trusting the connection is. I guess aside from that, in an organization, you can have some whitelisting software that doesn't allow your users to go to any site other than what's on the whitelist. But that's kind of difficult to maintain.
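
The whitelisting Joe describes boils down to deny by default. Below is a minimal sketch, assuming a small, hypothetical list of approved domains; real products handle much more (central management, proxies, redirects), but the core check looks roughly like this.

```python
from urllib.parse import urlparse

# Hypothetical allowlist; a real deployment would manage this centrally.
ALLOWED_DOMAINS = {"example.com", "intranet.example.org"}

def is_allowed(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    # Permit the listed domains and their subdomains; deny everything else.
    return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)

print(is_allowed("https://example.com/login"))   # True
print(is_allowed("https://examp1e.com/login"))   # False -- not on the list
```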

Dave Bittner: [00:06:09] Right. All right. It's an interesting story and certainly something to look out for. So, Joe, my story this week has to do with holidays and events.

Joe Carrigan: [00:06:20] My favorite (laughter).

Dave Bittner: [00:06:22] You're a fan of holidays and events?

Joe Carrigan: [00:06:25] I don't know.

Dave Bittner: [00:06:26] Barbecues, cookouts.

Joe Carrigan: [00:06:27] Actually, I'm a big fan of barbecues and cookouts. Holidays...

Dave Bittner: [00:06:31] These are more - talking more about national holidays and I guess high-profile events. In this case, some of the bad folks out there are taking advantage of the fact that GDPR is upon us.

Joe Carrigan: [00:06:44] Right.

Dave Bittner: [00:06:44] And GDPR is the European Union's General Data Protection Regulation.

Joe Carrigan: [00:06:48] Yes.

Dave Bittner: [00:06:49] And that's a big deal. And so some of the folks who are out there doing phishing are using GDPR as part of their phish bait. In this particular case, they are targeting Apple users, and they're threatening them with account suspension. So, as we've talked about, there's a call to action saying, you know, if you don't do this right away, we're going to suspend your account, and you won't be able to use all of those nifty Apple things that you're used to using. And so this phish looks like a legitimate email from Apple. But if they follow the link that they're given, they're sent to an account rescue site which, of course, is there to actually extract their credentials and other personal information.

Joe Carrigan: [00:07:31] Sure.

Dave Bittner: [00:07:32] It's legitimate looking, but it's not actually from Apple. One of the things they're asked to do is to update their payment details, of course.

Joe Carrigan: [00:07:41] Let me guess. Let me guess what this gathers.

Dave Bittner: [00:07:43] Right. Right. Credit card information. So once they're done, they're asked to click a button that's labeled unlock, and that sends all their information that they've entered right to the scammers.

Joe Carrigan: [00:07:52] Right.

Dave Bittner: [00:07:53] The researchers over at Trend Micro say this is more sophisticated than the usual run of phish bait, but there are a few indications that it's not on the up and up. First of all, some of the recipients haven't even been Mac users. So...

Joe Carrigan: [00:08:06] Yeah. They're just - OK, so that says they're just sending it out to everybody they have on a list, right?

Dave Bittner: [00:08:11] Right. So, you know, if you're a Windows user and you get an email saying to log in to your Apple account, well, that's a tell that something's off.

Joe Carrigan: [00:08:18] Right.

Dave Bittner: [00:08:19] And then also, the URL itself is off. It's not actually an Apple site. It's something that looks similar to an Apple site. You know, we've talked about this where they can replace the L in the word apple with a one or something like that. So at a quick glance, it looks like it's legit, but it's not. Awareness here seems to be the recommendation, organizations reminding their employees about these events.
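
Here is a rough sketch of the "replace the L with a one" check Dave describes, assuming apple.com is the domain you expect. Real lookalike detection is considerably broader - Unicode confusables, punycode, extra subdomains - so treat this as an illustration of the idea, not a complete defense.

```python
from urllib.parse import urlparse

EXPECTED = "apple.com"
# Digits that pass for letters at a quick glance: 1 -> l, 0 -> o.
LOOKALIKES = str.maketrans("10", "lo")

def looks_like_but_is_not(url: str, expected: str = EXPECTED) -> bool:
    host = (urlparse(url).hostname or "").lower()
    if host == expected or host.endswith("." + expected):
        return False  # genuinely the expected domain
    # Flag hosts that become the expected name once lookalike digits are swapped.
    return host.translate(LOOKALIKES) == expected

print(looks_like_but_is_not("https://apple.com/account"))  # False
print(looks_like_but_is_not("https://app1e.com/account"))  # True -- lookalike
```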

Joe Carrigan: [00:08:41] Do you think this is a real threat to organizations or just maybe to individual users?

Dave Bittner: [00:08:45] Well, I think it could be both because I think a lot of organizations are dealing with GDPR. So - and in terms of the social engineering angle, I mean, I guess anything that raises the legitimacy - the perceived legitimacy of some sort of phishing email...

Joe Carrigan: [00:09:00] Right.

Dave Bittner: [00:09:00] ...Is - it's trouble. And it increases their chances of being effective.

Joe Carrigan: [00:09:05] This GDPR - and this is not unique to GDPR, so don't think I'm saying this as a mark against GDPR because anything like this is going to have the same effect - presents an opportunity for these phishers to go after people.

Dave Bittner: [00:09:21] Right.

Joe Carrigan: [00:09:21] You know, it's something that's in the news every day. People are hearing about it all the time. And if I send you an email that says you've got to step up for GDPR, it's going to be already in your mind, right? And that's kind of what's going on with the psychology of this. I got to find something in your thought process that is going to trigger some action that I want you to take.

Dave Bittner: [00:09:42] Yeah. And GDPR has big penalties if you don't comply. So...

Joe Carrigan: [00:09:46] Right. But people should realize those penalties are for companies, not for individuals.

Dave Bittner: [00:09:50] But in this case, I think they're using GDPR as the excuse to say, hey, you've heard about this thing that's coming along. Help us out here. We need you to update your information so that we can be compliant. So that's why we're asking you.

Joe Carrigan: [00:10:04] Right. It's just - again, like we say always, awareness of what's going on.

Dave Bittner: [00:10:08] Yeah. They're being opportunistic.

Joe Carrigan: [00:10:10] They definitely are being opportunistic.

Dave Bittner: [00:10:12] Taking advantage of something that's already top of mind for folks. And like we said at the top, you know, we do see it with holidays. So the Fourth of July or Thanksgiving or New Year's Day or Christmas - people take advantage of all those things being in people's minds, and they use that as the hook for the phishing. All right. So again, something to be aware of - the bad guys tie their bait into these events, and they use that familiarity to make their emails seem more legit. And so as always, you just need to be suspicious, especially of emails that have links. And when in doubt, ask for help. All right, moving on. It's time for our Catch of the Day. All right. What do you got, Joe?

Joe Carrigan: [00:11:00] This comes courtesy of Rachel Tobac from Twitter, who is the CEO of SocialProof Security. You can find her on Twitter at @RachelTobac. And she posted a picture of a text message exchange between two users, and she has obfuscated who they are. It's a very interesting exchange. The first message comes in, and it says, hey, I know you don't know me, but many years ago, I used to have your number. I'm trying to log into an old account that is still tied to this phone number, but it's telling me that it will send me a verification code. I'd like to know if it'd be OK with you if I request the code and then you could just text it back to me. If not, that's totally fine. Right. And this person says, OK.

Dave Bittner: [00:11:47] So they responded and said...

Joe Carrigan: [00:11:48] They responded and said, OK, this is the phone holder.

Dave Bittner: [00:11:51] Right.

Joe Carrigan: [00:11:52] And the person on the other end says, thank you so much. I just requested it. And then the person who holds the phone sends back the verification code. And the person texting - the one who asked for the code - says, you're a lifesaver. Thank you so much, and sorry for bothering. The phone holder replies, welcome.

Dave Bittner: [00:12:12] So what's going on here?

Joe Carrigan: [00:12:13] So you don't really know what's going on here.

Dave Bittner: [00:12:15] Right.

Joe Carrigan: [00:12:16] You know, this is not enough information to tell you what's actually happening. It is certainly plausible that this is somebody who used to hold the number that the receiver of these messages now holds. But it's also entirely possible, and I would say even more likely because of my lack of faith in humanity, that this is a scammer who is trying to use the person who currently holds the phone to break into an account. They have all the information to break into the account - they have the phone number, they have the username, they have the password - but they don't have the verification code. Perhaps the original account holder has given up their phone number, and this person who's receiving these messages now has it. But if you're receiving these messages, you have no way to verify that the person sending you that text message has a legitimate reason to access the account he's trying to access.

Dave Bittner: [00:13:04] Right. You don't know who this person is.

Joe Carrigan: [00:13:05] You don't - you have no idea who this person is. In fact, what's interesting is the very first sentence this person says is, hey, I know you don't know me.

Dave Bittner: [00:13:13] Right.

Joe Carrigan: [00:13:14] Right.

Dave Bittner: [00:13:15] I guess one of the important points here is that being sent a verification code is a common part of two-factor authentication.

Joe Carrigan: [00:13:21] Yes, it is.
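
For a sense of why that code is worth guarding, here is a minimal RFC 6238-style sketch of how a time-based verification code is generated. The scam above involved a code sent by text message rather than an authenticator app, but the role is the same: whoever can read the current code holds the "something you have" factor. The shared secret below is a made-up example.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, step: int = 30) -> str:
    """Generate the current time-based one-time code for a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step          # 30-second time window
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# Made-up shared secret, for illustration only.
print(totp("JBSWY3DPEHPK3PXP"))
```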

Dave Bittner: [00:13:22] What strikes me about this is how casual it is.

Joe Carrigan: [00:13:26] Yeah. Yeah, it is. This seems very casual. Honestly, Dave, we don't know if this is legitimate or not, right? This could be a legitimate request.

Dave Bittner: [00:13:33] I'm with you though that it probably isn't.

Joe Carrigan: [00:13:35] But it probably isn't.

Dave Bittner: [00:13:37] But the casual nature of it and playing on everyone's desire to be helpful.

Joe Carrigan: [00:13:42] Right.

Dave Bittner: [00:13:43] So you don't know me. I used to have this number. Well, you know, that's plausible.

Joe Carrigan: [00:13:48] Right.

Dave Bittner: [00:13:48] But the way that they wrap it up and say, hey, if not, that's totally fine. In other words, you know, no big deal if you can't help me. But, boy, if you could be helpful, that'd be great.

Joe Carrigan: [00:13:57] I think that last part is key to the deception.

Dave Bittner: [00:13:59] Yeah, I agree. I agree. So it's an interesting cautionary tale there that if someone tries to get you to turn over a code like this, beware.

Joe Carrigan: [00:14:09] Yeah, I wouldn't do it.

Dave Bittner: [00:14:11] Just say no.

Joe Carrigan: [00:14:11] Just say no.

Dave Bittner: [00:14:12] Let them go at it a different way.

Joe Carrigan: [00:14:13] That's right. Like, try to get in touch with the actual website hoster.

Dave Bittner: [00:14:17] All right, Joe, that's a good one.

Joe Carrigan: [00:14:18] And that is our Catch of the Day.

Dave Bittner: [00:14:20] So coming up next, we have my interview with Christopher Hadnagy, who's the author of the book "Social Engineering: The Art Of Human Hacking." We'll have that in just a moment. But first, another message from our sponsors, KnowBe4.

Dave Bittner: [00:14:37] Let's return to our sponsor KnowBe4's question. Carrots or sticks? Stu Sjouwerman, KnowBe4's CEO, is definitely a carrot man. You train people, he argues, in order to build a healthy security culture. And sticks, don't do that. Approach your people like the grown-ups they are, and they'll respond. Learning how to see through social engineering can be as much fun as learning how a conjuring trick works. Hear more of Stu's perspectives in KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable too. Sign up for Cyberheist News at knowbe4.com/news. That's K-N-O-W-B-E-4.com/news.

Dave Bittner: [00:15:25] Joe, earlier this week, I had the opportunity to speak with Christopher Hadnagy. He's the author of the book "Social Engineering: The Art Of Human Hacking." You and I have both read this book.

Joe Carrigan: [00:15:34] Yes. Well, I'm about halfway through it right now. But it's a good book.

Dave Bittner: [00:15:36] An interesting conversation, covered a lot of ground. Here's my conversation with Christopher Hadnagy. Why don't we just start off with some high-level stuff here? I mean, how do you define social engineering?

Christopher Hadnagy: [00:15:49] That's a good starting point. So I define social engineering as any act that influences a person to take an action that may or may not be in their best interest. And I use a broad definition because I don't always think that social engineering is negative.

Dave Bittner: [00:16:03] Well, take us through that. Can you give us some examples? What are the good and the bad?

Christopher Hadnagy: [00:16:07] So I think when we look at how we communicate with our family, our children, our spouse, our bosses, our clergy, our therapists, the people that we meet every day, the way that we communicate with them on a daily basis really can fall into certain aspects of social engineering. Do we build rapport? Do we use authority? What parts of influence do we use? Are we manipulative? What chemicals are released in our brain that make us feel certain emotional content towards that person, whether it be positive emotions or negative emotions? And when we look at that, we get a clear beginning to understand how people make decisions. You know, I use a silly example when I talk about this. But for any of you guys out there that have daughters, you may have at one point in time had a princess tea party where you had makeup and nails being done. And you say, how would that decision ever be made? Well, yeah, you can say the easy answer is, well, she's my daughter and I love her. But it's not that simple because there's a lot of people that I really care for but I would still say no to that request. It's a series of physiological and psychological reactions in the human brain and body that allow a person to make a decision to say yes. So when we analyze the good side, we begin to understand how scam artists and con artists throughout the centuries have worked on humans by using these very same philosophies and principles to get them to say yes to something that they shouldn't.

Dave Bittner: [00:17:33] And this is really, I suppose, a modern name that's been put on something that's been around as long as we've been human.

Christopher Hadnagy: [00:17:40] Yeah, basically. I mean, you look at how humans have interacted throughout the written record of human history, and you could see all these things occurring, and it's just the medium in how they're delivered, right? I mean, phishing has always been there, but we didn't have email, you know, a hundred years ago, 200 years ago, a thousand years ago. So when we look at how it worked back in written form, it was different, right? We look back in the early 1900s at some of the greatest con men in history, and they used very much of the same philosophies as we see in scam artists today and people who say they're the IRS or Microsoft that want to steal money - very same philosophies, but the medium of delivery was much different.

Dave Bittner: [00:18:23] Now, when it comes to cybersecurity, what do you find - in terms of social engineering, what do you find most noteworthy or most interesting?

Christopher Hadnagy: [00:18:31] Well, when we talk about social engineering now from a security perspective, then we have to focus on the four main vectors - right? - so phishing, which is email-based attacks; vishing, which is voice phishing; SMShing, which is SMS phishing; and then impersonation, which is the people who impersonate either an employee or law enforcement and break into a building for some other kind of crime. And when we look at those four different vectors - right now, I think it was last year's Verizon DBIR report states that over 91 percent of all breaches had some element of phishing involved in them. So that's not really shocking to us because we've heard this before, but what I think is shocking to me is seeing how much of an increase there is in vishing and impersonation vectors. We didn't see that much of it just a few years ago, but now we're seeing the phone being used, especially with things like VoIP being so cheap and easy to set up. We see a lot more vishing than we ever have and a lot more impersonation, especially of law enforcement, which is, to me, a very shocking part of this attack vector.

Dave Bittner: [00:19:37] Why do you suppose we're seeing this increase? Could it be as simple as, because it works?

Christopher Hadnagy: [00:19:40] Well, I think that is part of the answer. But I also think that when it comes to vishing, let's say, I think it's cost of execution with potential payout times risk of getting caught, right? So when you do that kind of like a mathematical equation, vishing looks pretty attractive. You could set up a VoIP server for almost nothing. The risk of getting caught, especially if you're calling from a country that doesn't really care about American law - who cares? And the potential payout could be in the billions of dollars. So all of that means, hey, it's a pretty sound investment for an attacker. When it comes to impersonation, I'm actually a little bit at a loss now, to be honest with you. We collect news stories from around the globe on these vectors, and I am constantly amazed to see how many people are being caught impersonating law enforcement. And the only thing I can think of there is, it's a form of deviancy because a lot of the crimes that are committed with them are violent crimes or high-cost crimes - high-end theft or violent crimes or sex crimes, so the motivators for those types of things are much different.

Dave Bittner: [00:20:47] And I've certainly heard of an uptick of people getting calls, people pretending to be from the IRS or from a sheriff's office and saying, you know, there's a warrant out for you, and unless you pay us this amount of money right away, you know, we're going to send somebody over there, and you're going to be hauled away in handcuffs.

Christopher Hadnagy: [00:21:02] Yeah. Yeah. We - I actually got a couple of those just a few months ago.

Dave Bittner: [00:21:07] Are there any aspects of social engineering that you think go unappreciated or underreported? This is stuff that works on us but flies under our radar.

Christopher Hadnagy: [00:21:16] Yeah. You know, when you first said that, I thought of - one of the vectors that kind of disturbs me the most is what they call the grandma-grandpa scam. So someone calls your grandparent, and they say, hey, Grandma. It's Chris. Listen; I'm in prison in Mexico. I was down on the bachelor party with my buddies, and I got picked up. I got too drunk. I don't want my wife to find out. You know, she'll kill me. You know, can I borrow 5K for bail? And soon as I get home, I'll pay you back. I just don't - they stole my wallet. I don't have access to my ATM, blah, blah, blah - give a big excuse. And grandma hears the voice but rationalizes, well, he said he was drunk; he's in prison; I don't want him to get in trouble with his wife. And she follows the instructions to go to CVS or Walmart and Western Union $5,000 to, you know, some random person. That, to me, is something we don't hear a lot of, but it happens way too often. And it's a pretty nefarious vector because the poor folks that are falling for this generally are taking money from their life savings that they need to live or for medical care or for other things to help out their grandkids, and then to find out it was just a scam.

Dave Bittner: [00:22:24] Yeah. Yeah. Just as an aside, I got a letter from a listener who said that his elderly father had been convinced to go to a local CVS store and buy iTunes gift cards to - $5,000 worth in total - several trips. And he was scratching his head as to - you know, he said, my father is not a stupid person; he's not gullible, I wouldn't have thought, and yet, they made him take - or they convinced him to take several trips, spend thousands of dollars. He said, the other thing that troubled me was, no one at CVS raised an eyebrow and said, why is this elderly person buying thousands of dollars' worth of iTunes gift cards?

Christopher Hadnagy: [00:23:05] Yeah. You know, I think you bring up a valid point. So one of the things that I've always disliked about - let's say - the infosec industry is, you know, they favor bumper stickers or T-shirts that say something like, there's no patch for human stupidity. And I don't think it's fair to say that if you fall for this vector, that you're stupid. I think that social engineering works on us because we're human. Now, maybe that particular vector would not work on you or on me, but there is a vector that would work on you or me. I'm an Amazon junkie, so the phish that tend to make me double take or the ones that actually have gotten me in the past had been related to Amazon orders or Amazon accounts or Amazon discounts. So it's just finding the right emotional trigger at the right time for the right person that can then make any human fall for a vector. So why did this elderly gentleman fall for that? I don't know, you know? But maybe whatever was said to him on the phone triggered an emotional response in him where his brain allowed him to make a yes decision on something that he definitely should've said no to. But it's just finding that trigger. And that's what these attackers gamble on - is that they'll be able to find that trigger for even just a small fraction of people. But the payout - think about that. And they may have tried that a hundred times, but now that gentleman fell for it, and there's a $5,000 payoff for them - and an untraceable payoff.

Dave Bittner: [00:24:30] Right. Now, you bring up a really good point, which is, with these tendencies so hard-wired into us, with all of us being human, how much does awareness and training help?

Christopher Hadnagy: [00:24:41] Oh, it helps a ton. So I make a comparison to something like learning how to box or a martial art or how to fight. The first time - if you ever took a lesson, the first time you went into the dojo or the gym, anyone in the - the 8-year-old over there in the back could've wiped the floor with you. And then you jump forward six months or a year later, and muscle memory has kicked in to the point where you're blocking hits you didn't even see coming. You're fast on the bag. You know exactly how to move and duck and hit. And it's the same when it comes to these type of vectors. Awareness gives you that muscle memory that allows you to say, hey, that phish doesn't seem right, or, wow, this phone call feels a little tricky, or, that request doesn't sound proper. And you learn how to block, duck, take the hit and move forward without being a victim.

Dave Bittner: [00:25:31] You know, for the people in our audience who are responsible for security, how can they use social engineering to their advantage? How can they use it for good?

Christopher Hadnagy: [00:25:39] For those who are involved in having to influence the decisions in their company, I think there's a couple of things that could be done. First, security is always best when it's a team effort and when it's a culture, as opposed to when it's an adversarial relationship. So I find all too often with our clients, the security folks are big into I'm going to shame you if you messed up or, you know, the classic IT guy that comes in and everyone hates him because he's so condescending. And that doesn't work. But if security is a team effort, if they view you as someone there to help protect them, someone interested in their welfare, someone actually looking out for them, then security becomes more of a pleasure point as opposed to a pain point. So utilizing those same communication skills to build rapport, and using influence to, you know, internally "sell" - quote, unquote, air quotes - the security program, I think, will go a long way in making it more acceptable.

Dave Bittner: [00:26:40] So, Christopher, your book "Social Engineering: The Art Of Human Hacking" - I understand you have an update in the works that's going to be coming out soon.

Christopher Hadnagy: [00:26:47] I do. So that book was my first book, and it's something I've been very proud of. But after looking at it, I said, wow, these stories are eight years old; there's a lot of things in there that have been updated, a lot of science that has been renewed and different. So I decided to take on the project of rewriting it. The version two is called "Social Engineering: The Science Of Human Hacking," and it covers a lot of scientific studies that involve a lot of the psychology and physiology of different things that we spoke about here in this interview. And that's coming out in June. It will encompass the same kind of outline but a lot more spin on security awareness and how to use social engineering and - as a pen tester, as a professional and as a security-awareness person.

Dave Bittner: [00:27:34] So interesting stuff, huh?

Joe Carrigan: [00:27:35] Yeah. That was a great interview with Chris. I've got a couple of things I want to note. Once again, I would like to say that, yes, law enforcement, the IRS - they'll never call you. If the sheriff's office has a warrant for your arrest, they will show up. They will come and get you, and they don't call you because they don't want to tip you off.

Dave Bittner: [00:27:48] Right.

Joe Carrigan: [00:27:48] ...So that you then become a flight risk. They just show up and take you into custody.

Dave Bittner: [00:27:52] Chris brings up a lot of good points. I particularly like the point that we need to be careful to not just place blame on people for being human, for making human mistakes that could happen to any of us.

Joe Carrigan: [00:28:03] I thought that was a salient point in this interview in his statement that, generally, as security people, we tend to be a little more condescending, and how could you be so foolish to have fallen for that? And he's 100 percent correct. There are things out there that will make us vulnerable. We're a little more suspicious, but there are triggers that will work for us.

Dave Bittner: [00:28:22] Yeah. All right, Joe. Well, as always, thanks for joining us. I look forward to talking to you next week.

Joe Carrigan: [00:28:27] I look forward to it, too, Dave. Thanks.

Dave Bittner: [00:28:28] And that is our podcast. Thanks to our sponsor KnowBe4, whose new-school security awareness training will help you keep your people on their toes with security at the top of their mind. Stay current about the state of social engineering by subscribing to their Cyberheist News at knowbe4.com/news. That's K-N-O-W-B-E, the number four, dot com. Think of KnowBe4 for your security training. The Hacking Humans podcast is a production of Pratt Street Media. Our coordinating producer is Jennifer Eiben. Editor is John Petrik. Technical editor is Chris Russell, and executive editor is Peter Kilpe. Special thanks to the Johns Hopkins University Information Security Institute for their participation in the show. We are proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Thanks for listening. I'm Dave Bittner.

Joe Carrigan: [00:29:22] And I'm Joe Carrigan.

Dave Bittner: [00:29:23] We'll see you next week.