Hacking Humans 5.16.19
Ep 49 | 5.16.19

Live at KB4CON 2019

Transcript

Dave Bittner: [00:00:03] Well, hello, everyone, and welcome to a special live edition of the CyberWire's "Hacking Humans" podcast. We are coming to you from KB4-CON here in Orlando, Fla. Just so everybody can hear that we actually have a live audience out here, how about a round of applause? Can we hear it?

Audience: [00:00:18] (APPLAUSE)

Dave Bittner: [00:00:24] All right. Well, of course, we want to thank our sponsors, KnowBe4.

Dave Bittner: [00:00:26] This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me, as always, is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:44] Hi, Dave.

Dave Bittner: [00:00:45] We've got some great stories to share. We've also got a special guest. Later in the show, we're going to welcome Kevin Mitnick to the stage. He's one of the most well-known hackers in the world, and he's also the chief hacking officer at KnowBe4. Before we get to our stories, I want everyone to welcome, to the "Hacking Humans" podcast, our special guest, Stu Sjouwerman. He is the founder and CEO of KnowBe4. Stu, welcome to the show.

Stu Sjouwerman: [00:01:09] Glad to be here.

Dave Bittner: [00:01:10] How about a round of applause for Stu.

Audience: [00:01:11] (APPLAUSE)

Dave Bittner: [00:01:15] He made all of this possible. Stu, I want to thank you for hosting us here today at KB4-CON. Thank you for sharing your stage with us, sharing this audience with us today. We do have an exciting announcement to share with everyone here today. Thanks to the audience growth that we've had over the past year that we've been doing the "Hacking Humans" podcast, thanks to the support of KnowBe4, today we are happy to announce that "Hacking Humans" has been renewed for a second year.

Audience: [00:01:42] (APPLAUSE)

Dave Bittner: [00:01:43] So we're going to be around.

Audience: [00:01:48] (APPLAUSE)

Dave Bittner: [00:01:48] Thank you very much, Stu.

Stu Sjouwerman: [00:01:50] You're welcome. By the way, do you see the unfair advantage? They have scripts. I don't.

Dave Bittner: [00:01:54] Yeah.

Unknown: [00:01:55] (LAUGHTER)

Dave Bittner: [00:01:55] Yeah, sorry (laughter).

Stu Sjouwerman: [00:01:56] (Laughter).

Dave Bittner: [00:01:59] All right, well, we've got some great stories to share this week. Joe, I'm going to kick things off today. And I want to start off by getting a sense from the folks in the audience here. How many of you have been around long enough to remember what a 900-number is? Let's see a show of hands - 900-number.

Audience: [00:02:18] (APPLAUSE)

Dave Bittner: [00:02:18] All right, terrific. Hands down. Now, the contrast of that - how many have no idea what a 900-number is? Anybody? All right, a few of you out there. So back in the, I want to say, '80s...

Joe Carrigan: [00:02:29] '90s.

Dave Bittner: [00:02:30] ...Or so - '90s, back when long-distance phone calls actually cost money, the phone company came up with something called a 900-number. And that's where you could call this number and get some sort of service, and they would charge you a per-minute fee - everybody's laughing when I said service.

Unknown: [00:02:48] (LAUGHTER)

Joe Carrigan: [00:02:50] What kind of service is it?

Dave Bittner: [00:02:51] You all have used 900-numbers before, so - perhaps it was a psychic or a dating service or something else.

Joe Carrigan: [00:02:59] (Laughter).

Dave Bittner: [00:03:00] So you call up and they would charge you a per-minute fee to be connected on the phone. And this was very interesting when it first started. A lot of people lost a lot of money. And a lot of folks who knew the peculiarities of the telephone system would take advantage of that to be able to use 900-numbers for free. Kevin Mitnick's coming up here later. He might have a thing or two to say about that, perhaps. Well, there's a scam that is going around right now. This is an article from The Washington Post, and it's called "'Do Not Call Them Back!': The FCC Warns Of Late-Night Scams." So what happens is you're at home in the middle of the night - you are minding your own business, fast asleep - and your phone rings, but it only rings once, and then they hang up. So you're woken up, time passes, your phone rings again, one ring, hangs up. How many people would call that number back? If that happened to you in the middle of the night, how many people would call that number back? Nobody admits to doing that. All right, fair enough, fair enough. But I would say that we are probably an above-average crowd when it comes to knowing how to respond to these. I think a lot of folks - Joe, I mean, how would you feel? If you get a call in the middle of the night, it's usually not good news.

Joe Carrigan: [00:04:12] Yeah, that's usually the worst thing that - the worst-case scenario that you can imagine. The first thing you go to is, oh, some family member has some kind of serious problem...

Dave Bittner: [00:04:20] Right.

Joe Carrigan: [00:04:20] ...May have passed away.

Dave Bittner: [00:04:22] Right.

Joe Carrigan: [00:04:22] We have older parents, right? That's the first thing that I would go to.

Dave Bittner: [00:04:26] Right. So what's happening is - with this scam is that people are calling from other countries, from Mauritania and Sierra Leone, and they have country codes of 222 and 232. And so you will see that number on your phone, and it looks like a domestic call from the U.S. But if you call them back, in fact, you are connecting to the international equivalent of a 900-number. And that's the scam. Sometimes they will leave a message that says either it's an emergency, please call me back, or they'll leave a message that says, congratulations, you've won something. Either way, if you return the call, that clock is running, and they're charging you X number of dollars per minute. The story goes on to say that if you find yourself a victim of this, one of the things you can do is contact your phone provider, and they will, perhaps, back out the charges. And if you don't have any luck there, you can file a complaint with the FCC.
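
A rough sketch of the screening idea behind the FCC's warning, assuming you keep your own short list of country codes associated with one-ring callback scams (the 222 and 232 codes are the only details taken from the story above; the function name and everything else here are illustrative):

```python
# A minimal sketch (not the FCC's guidance verbatim): flag a missed-call number whose
# leading digits match a country code known for one-ring callback scams before dialing back.
# The country-code list is illustrative, not exhaustive.

from typing import Optional

RISKY_COUNTRY_CODES = {
    "222": "Mauritania",
    "232": "Sierra Leone",
}

def looks_like_one_ring_scam(number: str) -> Optional[str]:
    """Return a warning string if the callback number starts with a risky country code."""
    digits = "".join(ch for ch in number if ch.isdigit())
    # On a U.S. caller ID these can appear without a leading + or 011, so "222 xxx xxxx"
    # passes for a domestic area code at a glance.
    for code, country in RISKY_COUNTRY_CODES.items():
        if digits.startswith(code) or digits.startswith("011" + code):
            return f"Caution: {number} may be an international premium-rate line in {country}."
    return None

if __name__ == "__main__":
    for missed_call in ["+222 36 12 34 56", "(410) 555-0123"]:
        print(missed_call, "->", looks_like_one_ring_scam(missed_call) or "no obvious red flag")
```

The point isn't the code itself - it's that a missed call whose leading digits merely resemble a domestic area code deserves a second look before you return it.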

Dave Bittner: [00:05:26] Stu, I wanted to touch base with you on this. It seems like there's a lot of different things in play here that you all deal with. And first, that emotional component - you're woken from a deep sleep, wondering if something's gone wrong.

Stu Sjouwerman: [00:05:40] Yeah, call it - relate it to what we heard this morning from Apollo (ph). Your attention has suddenly been focused on something negative, you worry, you stop your normal, rational thought. And you might just grab the phone and dial that number. And here's your 10 bucks a minute that starts clicking in.

Dave Bittner: [00:06:00] Right, right. I also think it's fair to say that if you're woken up, you might not have your wits about you like you would during the day.

Joe Carrigan: [00:06:07] Right, yeah, you might not be thinking clearly...

Dave Bittner: [00:06:09] Yeah, thinking a little...

Joe Carrigan: [00:06:10] ...Because you'd just gotten up...

Dave Bittner: [00:06:11] Being a little foggy.

Stu Sjouwerman: [00:06:12] At 2 a.m.

Joe Carrigan: [00:06:13] At 2 a.m., right, yeah.

Stu Sjouwerman: [00:06:14] Yes.

Joe Carrigan: [00:06:14] And the other thing is that this is a numbers game, you know? Even if nobody in the audience responds - right? - if they make a thousand calls and 10 people call back, that's a successful ratio, right?

Dave Bittner: [00:06:26] Right. The other technique that this article points out is what's called neighborhood spoofing. And I suspect many of us have seen this. This is - certainly in the past few months, I've seen this on my phone plenty of times, where someone will - the people who are calling will imitate the prefix from your neighborhood, so you are more likely to think that this is someone local and not just a random call from out of the blue. It makes it more likely that you're going to answer that call. Let me get another show of hands here. How many people - if the phone rings on your mobile device and you do not know who it is, how many people just let that go to voicemail or don't answer at all? Almost everybody. Isn't that interesting, the way that the use of the telephone system has changed, that it's as much a screening system now as anything? If I don't know who you are, you're going to voicemail first.
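
For what it's worth, the neighborhood-spoofing heuristic Dave describes can be written down in a few lines. This is a hedged sketch, not how any carrier actually implements call screening; the phone numbers and contact list are made up for illustration:

```python
# A minimal sketch of the "neighbor spoofing" heuristic: an incoming number that shares
# your own area code and exchange (the first six digits) but isn't in your contacts is
# treated as suspicious.

def normalize(number: str) -> str:
    digits = "".join(ch for ch in number if ch.isdigit())
    return digits[-10:]  # keep the last 10 digits (drops a leading country code like 1)

def is_neighbor_spoof_candidate(incoming: str, my_number: str, contacts: set) -> bool:
    incoming_n, mine = normalize(incoming), normalize(my_number)
    if incoming_n in {normalize(c) for c in contacts}:
        return False  # a known contact - let it ring
    return incoming_n[:6] == mine[:6]  # same area code and prefix as me, but unknown

if __name__ == "__main__":
    my_number = "410-555-0100"
    contacts = {"410-555-0111", "202-555-0199"}
    for caller in ["410-555-0142", "410-555-0111", "212-555-0177"]:
        verdict = "suspicious" if is_neighbor_spoof_candidate(caller, my_number, contacts) else "ok"
        print(caller, "->", verdict)
```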

Joe Carrigan: [00:07:18] Yeah, if I get a phone call from somebody I do know and they show up in my contacts and I'm wondering why they didn't text me to tell me they were going to call first...

Unknown: [00:07:25] (LAUGHTER)

Joe Carrigan: [00:07:26] That's kind of how I think of it.

Dave Bittner: [00:07:28] There you go.

Joe Carrigan: [00:07:28] I'm like this is rude.

Dave Bittner: [00:07:29] Yeah.

Unknown: [00:07:31] (LAUGHTER)

Joe Carrigan: [00:07:31] But I think I would be much more likely to answer one of those neighbor calls if it happened on my home phone because, growing up, everybody in my neighborhood had the same area code and prefix. And now in my neighborhood at home, I know that a lot of the businesses around my neighborhood have the same area code and prefix. But on my mobile phone, I think that doesn't work as well because it's not associated with a geographical area - it's not a wired switching network. It's a cellular network.

Dave Bittner: [00:07:59] Right, right. Some of us are even old enough to remember when you could make a local phone call without entering an area code.

Joe Carrigan: [00:08:04] Yes.

Stu Sjouwerman: [00:08:05] Yes, I prefer, these days, to have these things go to voicemail because Google is so very friendly to transcribe what they say, and transcriptions are hilarious.

Unknown: [00:08:17] (LAUGHTER)

Dave Bittner: [00:08:19] Yeah, that's interesting. My wife and I have actually - we still maintain a landline because there are just enough legacy folks that we - that have that number - from, like, our mortgage company or something like that - and so we keep that landline, but it doesn't even ring in the house. It goes straight to a Google Voice account, which then transcribes it and texts it to our phones. And the nice thing about that is we both get it on our phone, so we're not going to miss a message. You take away that, oh, did you take a message, or who called? And, of course, nobody calls that number anymore, so I don't know why I still keep it, but there you go.

Joe Carrigan: [00:08:55] You can give it out to all the affinity programs...

Dave Bittner: [00:08:57] Yeah, yeah (laughter).

Joe Carrigan: [00:08:58] ...So that when they sell your data, it's useless.

Dave Bittner: [00:09:00] Right, absolutely. All right, well, that's my story. Joe, what do you have for us this week?

Joe Carrigan: [00:09:04] So we've all heard of the IRS scams - right? - where the IRS is calling you and they're saying, you're in deep trouble, mister or missus, you owe us money. Well, it looks like things are changing. The FTC issued a report last month - earlier this month, rather - no, it was in April - that said that these guys are shifting a little bit, and they're not imitating the IRS anymore; they're shifting to imitating the Social Security Administration. So it's pretty much the same scam. And there is an article in The New York Times from Ann Carrns that talks about this this week. But to give you an idea of the scale of this thing, when the IRS scam was running at its peak - and that was during 2015 to 2016, a 12-month period - they scammed people out of $17 million.

Dave Bittner: [00:09:46] Wow.

Joe Carrigan: [00:09:47] OK? In the last 12 months, the Social Security Administration scams have already scammed people out of $19 million, including $6.7 million in February and March alone. And if you look at the FTC report, the graph for the Social Security scam is like a hockey stick. It's way high. But what's interesting is that they also graph the IRS scams, and that's just going down. So it's as if the scammers have learned what the problem is, that the IRS scams are not working anymore. And now they're hitting people up with the Social Security Administration scams.

Stu Sjouwerman: [00:10:23] Bad guys use ML and AI, too.

Joe Carrigan: [00:10:26] Right, they do. So maybe there's something there involved here. Out of the people who reported the scams, only 3.4% said they had actually lost money. And the median loss for the scam was $1,500.

Stu Sjouwerman: [00:10:42] Wow.

Joe Carrigan: [00:10:43] So that means that half the people in this scam lost more, and half the people lost less. That is four times the typical loss for fraud - the loss for one of these scams usually lands around 350 bucks, and this is four times that much. Of course, the most common vector, the most common thing they asked for - any guess what they wanted from the people they were scamming? Gift cards. That's what they wanted. I have to say this - I was giving a presentation a couple of weeks ago, and I said, no government agency takes payment in gift cards.
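
A quick aside on the numbers Joe is quoting: the FTC figure is a median, which matters because a handful of very large losses can pull an average far above what the typical victim lost. A toy calculation with made-up loss amounts (not FTC data) shows the difference:

```python
# A toy illustration of why the report quotes a median: one outsized loss drags the mean
# way up, while the median still describes the "middle" victim - half lost more, half less.

from statistics import mean, median

losses = [200, 300, 350, 400, 500, 1500, 1500, 2000, 3000, 60000]  # made-up dollar amounts

print("mean loss:  ", mean(losses))    # skewed upward by the single $60,000 loss
print("median loss:", median(losses))  # the middle of the distribution
```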

Unknown: [00:11:18] (LAUGHTER)

Joe Carrigan: [00:11:20] And not a lot of legitimate businesses take payment in gift cards. I know that there are VPN services that, if you want to go out and buy them, will take payment in gift cards. That's a legitimate business. But generally speaking, businesses don't take payment in gift cards, right? Cryptocurrency was a distant second - not a close second at all. So what do they say when they call? How do they get people hooked - right? - what do they say? They say...

Dave Bittner: [00:11:45] Yeah.

Joe Carrigan: [00:11:45] ...Your Social Security number's been suspended, right?

Dave Bittner: [00:11:48] Yeah, yeah.

Joe Carrigan: [00:11:49] And they say that, we've noticed some suspicious activity. Or they say your Social Security number's been involved in a crime. And I can relate to that because last year my Social Security number robbed a liquor store.

Audience: [00:11:59] (LAUGHTER)

Joe Carrigan: [00:12:02] And that was a mess to clean up.

Audience: [00:12:04] (LAUGHTER)

Joe Carrigan: [00:12:06] But the first thing they tell people - and this is an interesting point - is they say, you're about to have all your assets in your bank frozen, so go withdraw all the cash from your bank right now before it gets frozen. So if people are falling for this, they go out and they withdraw all their cash. And there's no reason to do that. Their accounts haven't been frozen; there is no risk of it, of course. But now they have cash, and now it's easier to convince these people to convert it into gift cards...

Dave Bittner: [00:12:33] Right.

Joe Carrigan: [00:12:34] ...And send them the codes.

Dave Bittner: [00:12:35] Right.

Joe Carrigan: [00:12:36] Of course, the report goes on to talk about how older people are the ones who get scammed, mainly because they have access to the accumulated wealth of their lifetimes. And a lot of times they might be in the early stages of dementia or Alzheimer's or some other disease that may make them much more vulnerable to this kind of attack.

Dave Bittner: [00:12:54] Yeah.

Joe Carrigan: [00:12:54] It's an unfortunate target, but it is the truth.

Dave Bittner: [00:12:58] Yeah. Stu, I'm curious for your insights on this. I mean, what is the attractive part for the bad guys about these government types of scams? I think of the permanence of something like your Social Security number, which - if you can change it at all, it is not an easy task to do. How does that play into the psychology of what the scammers are trying to do here?

Stu Sjouwerman: [00:13:24] Yeah, ultimately, it is either a perceived benefit or it is preventing a negative consequence. And usually, preventing the negative consequence impinges more than, hey, you've won the lottery. So you focus someone's attention on that particular type of, oh, no, I don't want that to happen - especially, you know, people that are easier to social engineer. So the bad guys are doing this very methodically. And they run this kind of scam by the numbers, too. They iterate through these scams until they find a particular vector and a particular scenario that works. And they're smart, unfortunately, and they literally go where the money is.

Dave Bittner: [00:14:16] When you all are tracking these sorts of things at KnowBe4, what does the trend curve look like? So does one scam start to fall off before another one picks up, or does - can one just come from out of the blue and suddenly that's the hot new thing? Is there a pattern there?

Stu Sjouwerman: [00:14:36] There are patterns. It depends a bit on what you're looking at. At KnowBe4, we get a couple of - ten, twenty thousand emails a day that get reported by users through the Phish Alert button. And so we see everything that makes it through the existing mail filters. And that gets reported if you guys allow that, by the way - this is an opt-in program. So we analyze these, and we use Phish Alert to do the analysis, by the way. We analyze these tens of thousands of emails that are coming in, and we certainly see patterns. We see types of attacks coming in and out. At the moment, for instance, we see three-stage platform attacks where the email gets sent through, for instance, Office 365; the landing page lives in a Microsoft environment; and then where they get dropped is also a Microsoft-type landing page or payload. So the problem is that those platforms are already whitelisted by your endpoint security. So we see these things coming in relatively early.
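
As a rough illustration of the triage problem Stu describes - phishing whose every stage rides on a platform the perimeter already trusts - here is a hedged sketch of scoring a user-reported email. It is not KnowBe4's Phish Alert or PhishER logic; the hosting domains, keywords and sample message are assumptions made for the example:

```python
# A hedged sketch: pull the URLs out of a user-reported email and flag links that live on
# broadly trusted cloud platforms (which filters often allow through) but carry
# credential-lure hallmarks. Domains, keywords and the sample message are illustrative.

import re
from email import message_from_string

URL_RE = re.compile(r"https?://\S+")
TRUSTED_HOSTING = ("sharepoint.com", "onedrive.live.com", "blob.core.windows.net", "forms.office.com")
CRED_HINTS = ("login", "verify", "password", "invoice", "payroll")

def triage(raw_email: str) -> list:
    msg = message_from_string(raw_email)
    body = msg.get_payload() if not msg.is_multipart() else ""
    findings = []
    for url in URL_RE.findall(body):
        host = url.split("/", 3)[2].lower()
        on_trusted = any(host.endswith(d) for d in TRUSTED_HOSTING)
        hinted = any(h in url.lower() for h in CRED_HINTS)
        if on_trusted and hinted:
            findings.append((url, "trusted platform + credential-lure keywords"))
    return findings

if __name__ == "__main__":
    reported = (
        "From: helpdesk@example.com\n"
        "Subject: Action required\n\n"
        "Your mailbox is full. Re-verify here: "
        "https://contoso-files.sharepoint.com/sites/x/login-verify\n"
    )
    for url, reason in triage(reported):
        print("FLAG:", url, "->", reason)
```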

Dave Bittner: [00:15:53] Interesting. All right, well, those are our stories for this week. It is time for our favorite part of our show, and that is our Catch of the Day.

[00:16:02] (SOUNDBITE OF MUSIC)

Joe Carrigan: [00:16:03] I love this part.

[00:16:06] (SOUNDBITE OF REELING IN FISHING LINE)

Dave Bittner: [00:16:07] Our Catch of the Day this week was sent in by a listener. This is a bit of a classic Catch of the Day. There's been lots of variations of this one. But we thought we'd have some fun with it this week, and we would let our special guest, Stu Sjouwerman, take the honors and read our Catch of the Day. So, Stu, whenever you're ready.

Stu Sjouwerman: [00:16:27] Hey, as you can see, there is no need for me to introduce myself to you because I don't have any business with you. My duty, as I'm mailing you now, is just to assassinate you.

Unknown: [00:16:40] (LAUGHTER)

Joe Carrigan: [00:16:42] Seems legit.

Stu Sjouwerman: [00:16:43] If you don't comply, I have to do it, as I have already been paid for that. But I have to ask you this question - what is the problem you have with your friend that made him hire us to kill you?

Unknown: [00:16:57] (LAUGHTER)

Stu Sjouwerman: [00:16:58] Now, do you want to live or die? As someone has paid us to kill you, get back to me now. If you're ready to pay some fees to spare your life, 3,800 bucks is all you need to spend.

Unknown: [00:17:11] (LAUGHTER)

Dave Bittner: [00:17:12] All right.

Unknown: [00:17:13] (APPLAUSE)

Stu Sjouwerman: [00:17:14] What a deal.

Dave Bittner: [00:17:20] First of all, Joe, why did you hire someone to kill me?

Joe Carrigan: [00:17:24] Never mind.

Unknown: [00:17:25] (LAUGHTER)

Dave Bittner: [00:17:27] All right. Let's unpack this real quick here. Stu, what's going on here? What's the - what are all the elements that are involved in something as absurd as this, yet still, I suppose, successful enough to be...

Stu Sjouwerman: [00:17:42] Well, obviously, you unpack this and the very first thing is you want to prevent this very negative consequence.

Unknown: [00:17:49] (LAUGHTER)

Joe Carrigan: [00:17:52] I can't think of a more negative consequence.

Stu Sjouwerman: [00:17:53] No. So shock value.

Dave Bittner: [00:17:57] Right.

Stu Sjouwerman: [00:17:58] Instant - oh, my God. Is this real?

Dave Bittner: [00:18:00] Right.

Stu Sjouwerman: [00:18:01] And then, how do I pay, and et cetera, et cetera. It's an extreme example of getting people to worry so much that they just stop thinking. All the stress levels go way off the scale, and you're no longer able to rationally decide.

Joe Carrigan: [00:18:18] You're short-circuiting the thought process.

Stu Sjouwerman: [00:18:20] Yes.

Dave Bittner: [00:18:20] Yeah. It's interesting, too, that it says your friend has put this hit out on you but doesn't have any details about that friend. So I imagine everyone starts thinking, like, yeah, I probably have a friend who would...

Unknown: [00:18:36] (LAUGHTER)

Dave Bittner: [00:18:36] Like...

Joe Carrigan: [00:18:36] And my friends know who it is.

Dave Bittner: [00:18:38] Yeah.

Stu Sjouwerman: [00:18:39] You know, if you break it down, there is this concept of the OODA loop. That's observe, orient, decide and act. And this is something they train fighter pilots in. Top guns literally are trained on the OODA loop. The bad guy is essentially trying to bypass and short-circuit the OODA loop. Instead of observe, orient, decide and act, what they want you to do is observe and act. The orient and decide steps are taken out...

Dave Bittner: [00:19:12] Right.

Stu Sjouwerman: [00:19:12] ...And that's typically what this kind of thing does.

Dave Bittner: [00:19:15] Yeah. Yeah. All right. Well, that is our Catch of the Day. It is time to move on, and we want to welcome our special guest to this special edition of the "Hacking Humans" podcast. He is one of the most well-known hackers in the world, and he is KnowBe4's chief hacking officer. Please join me in welcoming to the stage Kevin Mitnick.

Audience: [00:19:34] (APPLAUSE)

Kevin Mitnick: [00:19:43] Excellent. Good to be here. Thank you. So talk about phone phreaking...

Dave Bittner: [00:19:49] You want to jump right into phone phreaking? All right. All right. Let's go. Let's go.

Kevin Mitnick: [00:19:53] That was actually - cool. Well, back in the day, when I was on the other side of the fence, I took control of a lot of the Bell operating companies in the United States because it was analogous to taking over root DNS servers. So back in the day, I could take a dial-up number - because that's what we used back in the '80s and '90s - and I could basically call forward it to my system that would simulate the login to whatever operating system they're going to log into. I'd be able to capture the credentials like a credential harvesting - what we call a credential harvesting phish. And that was because I had the power through the phone network to actually forward the number that the target was dialing to myself. And then I could capture any credentials, so then I could obviously log into the system. That's how easy it was back in the day.

Dave Bittner: [00:20:39] Just to give us some perspective - back then, were the phone companies not looking for this sort of thing where - did they just not think enough people - were you ahead of them enough that they weren't looking for these sorts of incursions?

Kevin Mitnick: [00:20:54] Well, actually, what gave us the ability to compromise the phone company - they used pretty decent security back then, dial-back security. So the systems would have to dial you back, or you'd have to have a second - not two-factor authentication, but a second password. But because we were able to use telephone pretexting, which is a very well-known form of social engineering, we were able to pretend to be that insider and actually get the insiders inside the Bell operating companies to help. For example, hey, I'm out in the field. I'm having problems dialing into, like, COSMOS, which was a system back in the day. I'm dialing into this dial-up number. You know, can you walk me through the process? And that sort of thing - and they would be very helpful because if you know the terminology, if you know the lingo and you sound like a true insider, they believe it.

Dave Bittner: [00:21:43] Yeah.

Kevin Mitnick: [00:21:43] And that's what made it so powerful back in the day. If you can control the switches where all the calls are placed through, that's an immense amount of power to control anything that used dial-up technology back in the day.

Dave Bittner: [00:21:55] When you were a kid growing up, what is your first recollection of, dare I say, malicious use of social engineering? Like, to get something that someone didn't want you to have - do you have a first memory of, oh, this works?

Kevin Mitnick: [00:22:10] Well, not particularly, but I knew it was with the phone company because another student in high school really impressed me by showing me all these tricks he could do with the phone. It was kind of like magic. And what I did is I kind of listened to him call up different departments of the phone company, pretending to be a technician or somebody who had the authority to have that particular type of information. And I was just amazed at what this guy could do. So I essentially learned this on my own, pretty much through trial and error. So at the end of a year or two, I had the ability to basically call - I knew the departments in Ma Bell probably better than the employees. I knew who to call, who I had to be, what information I was after. And if you knew their internal numbers, you had that credibility because their internal numbers were secret.

Kevin Mitnick: [00:23:03] So if you were calling on a specific number and you spoke the proper lingo, you'd get the information 90% of the time. It's the same method that private investigators use to do what they call skip tracing and that sort of thing - they have teams of investigators that use the same type of tradecraft. Well, they used to, before it was criminalized, because when I started doing this, there were actually no laws against pretexting the phone company. That didn't happen until after the HP fiasco. I forget - I think it was 10 or 15 years ago - where someone on the board of HP hired a PI, and the PI used telephone pretexting to get the cellphone records of the other board members to identify who the leaker was. And when that all blew up for HP, they actually federally criminalized pretexting utility companies. So that was a common methodology that private investigators used - to identify where somebody lives, to do locates and that sort of thing.

Dave Bittner: [00:24:02] Does it surprise you that, all these years later, that the phone system is still an active avenue for these sorts of scams?

Kevin Mitnick: [00:24:13] It doesn't surprise me. I mean, even today, as I sit here, telephone pretexting works quite well when trying to attack an organization. The cellular mobile operators have come a long way. They're much better in doing authentication. It used to be just having the target's last four digits of their Social Security number to authenticate. Now they have you put in, you know, passwords. They send you a text message to your phone where you have to verify a PIN before they'll even talk to you at customer service. So they've definitely come a long way, the Bell Operating Companies. But organizations are commonly still pretexted as we sit here, and that is a very strong form of social engineering because we get instant compliance. So if I can call somebody up at the company, pretend to be from IT, call somebody that I know is not technically astute, have them enter one command into their computer, and they don't understand what they're entering but they believe it's going to fix a problem - and then you get instant access. And that, in some cases, is much better for the attacker than waiting for someone to open up an email.

Dave Bittner: [00:25:20] How much do you think having an innate sense of empathy has helped you in the work that you do - being able to sense what's going on on the other side of that phone line when you're trying to work your way and influence someone to do what you want them to do?

Kevin Mitnick: [00:25:36] You know, it's total improv. So basically, you're sizing up whether the target is hesitant - whether they're asking questions, you know, asking questions where you can kind of sense they're not comfortable, or they're questioning your identity or questioning the need - and in some cases, you would just back out of that request and go to some other target. But you kind of have to go where the conversation goes - kind of like what Apollo Robbins was discussing earlier today in his keynote: you have to set the story and make sure that the target is going to believe that story and is going to cooperate with that end result.

Kevin Mitnick: [00:26:15] Case in point, I was hired to test a large Canadian retailer and found out they used a cloud HR provider, found out the cloud HR provider did not register the domain - call it payroll.ca. Basically, I registered it. I became the proud owner. I basically cloned the site, so now payroll.ca looked exactly like payroll.com, got SSL certificates from Let's Encrypt, so it looked very real. And then here was the phone pretext. So I'm calling up a director of HR who I found on LinkedIn, pretending to be somebody from IT, saying, hey, we're standardizing on the .ca TLD, which is Canada, and asked her, hey, are you logged in to payroll.com? Oh, yeah. Well, can you go ahead and log out for me real quick? I need you to try something. Can you go to payroll.ca? She goes, yeah, it's asking me for my login and password. Yes, can you please log in? She logged in. It actually redirected her and logged her into the real payroll.com, and I said, OK, great. From now on, use payroll.ca, and if you have a problem, just call the help desk. Meanwhile, I had her credentials - no two-factor authentication - and had access to all the HR data. It literally took five minutes on the phone. The longest part of the attack was waiting for DNS to propagate. And this was literally six months ago, six to nine months ago, so it's pretty fresh.
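
A defensive takeaway from that payroll.ca story: it is cheap to enumerate the obvious top-level-domain variants of a vendor you depend on and see which ones already resolve, so you can register the stragglers yourself or at least watch them. A minimal sketch, using a hypothetical vendor domain; resolution alone proves nothing malicious, it just tells you a lookalike exists:

```python
# A minimal defensive sketch: enumerate obvious TLD-swap variants of a vendor domain
# and check which ones already resolve in DNS. "payroll-example.com" is a placeholder.

import socket

def tld_variants(domain: str, tlds=("com", "net", "org", "ca", "co", "io")) -> list:
    base = domain.rsplit(".", 1)[0]
    return [f"{base}.{tld}" for tld in tlds if f"{base}.{tld}" != domain]

def resolves(domain: str) -> bool:
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    vendor = "payroll-example.com"   # hypothetical vendor domain
    for candidate in tld_variants(vendor):
        status = "already registered/resolving" if resolves(candidate) else "does not resolve"
        print(f"{candidate}: {status}")
```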

Joe Carrigan: [00:27:37] We're seeing a lot of stories that come out now about these two-factor authentication social engineering sites, and what you're talking about is a perfect example of that where you can just ask people for the credential that you need, and...

Kevin Mitnick: [00:27:51] Well, you don't want to ask someone for their credentials. That's...

Joe Carrigan: [00:27:54] I mean, on the website, on the website, you're...

Kevin Mitnick: [00:27:56] Oh, yeah. That's how you want to do it - you want to put the person in the position of entering their credentials...

Joe Carrigan: [00:28:01] Right.

Kevin Mitnick: [00:28:01] ...Where it's a believable story.

Joe Carrigan: [00:28:03] Right. And then they enter the two-factor token key, whatever that is - the number that comes up at the time...

Kevin Mitnick: [00:28:09] Well...

Joe Carrigan: [00:28:09] ...Whether it be text or whether they have a pre-shared key.

Kevin Mitnick: [00:28:12] In this particular case, they didn't use two-factor. That's what made it work so well. So I didn't have to use two-factor to log in, so that's what made it so much easier. If there was two-factor, I would have had to change up the attack. Then I probably would have had her type something into her machine or, you know, essentially launch a payload for me.

Joe Carrigan: [00:28:29] Change the webpage, though - right? - and ask for the two-factor code. And...

Kevin Mitnick: [00:28:32] Yeah, you can pop something up. You know, please download this, you know, plug-in. That's something new. You know, go ahead and do it. It's going to make things run faster.

Joe Carrigan: [00:28:42] Right.

Kevin Mitnick: [00:28:42] Right, and I can kind of size up - when I'm calling a target up on the telephone, I could size up whether they're going to cooperate, usually, in the first 30 seconds.

Joe Carrigan: [00:28:50] Really?

Kevin Mitnick: [00:28:50] Yeah because I kind of - I identify kind of their tone of voice, whether they seem cooperative, whether they're technically astute and would understand going to, like, a command shell - what that means. Usually, you know, you don't call an IT person because they're going to know better.

Joe Carrigan: [00:29:07] Right.

Kevin Mitnick: [00:29:07] You call somebody else that's not going to know better. And that way you get your foot in the door of the network. Yeah, so - but these telephone pretexting attacks are still around. I mean, it's more focused because it doesn't scale.

Dave Bittner: [00:29:19] Right.

Kevin Mitnick: [00:29:19] Phishing is scale, right? And telephone pretexting is really, you know, I would say, ultra-focused, like a spear-phishing attack.

Dave Bittner: [00:29:29] Do you ever think about what you yourself might be vulnerable to? If someone were going to come after you, what's your kryptonite?

Kevin Mitnick: [00:29:36] Well, I always worry about being attacked because I'm not invulnerable myself. I do a lot of traveling around the world. I was recently in China, speaking for Tencent. I remember when I was there I felt uncomfortable leaving my laptop in the hotel room - right? - because I'm sure, with the capabilities and tradecraft that the intelligence agencies have, they could easily compromise my machine through hardware and potentially put some sort of hardware implant on my machine. And I always realized that, unless I take my laptop to the shower with me, you know, in some particular cases I could always be compromised, so I always worry about it. So what I do to mitigate that risk is, minimally every year, I buy a new machine and reload it from scratch.

Stu Sjouwerman: [00:30:25] And he sends me the bill.

Kevin Mitnick: [00:30:26] And I send Stu the bill.

Unknown: [00:30:26] (LAUGHTER)

Kevin Mitnick: [00:30:26] In fact, when I - actually, when I flew back from China - I called Stu earlier. Hey, can I order the laptop a bit earlier this year?

Unknown: [00:30:36] (LAUGHTER)

Stu Sjouwerman: [00:30:37] And I said sure.

Kevin Mitnick: [00:30:42] Yeah, so I'm definitely concerned about that. I mean, there's malicious - well, obviously, malware payloads that are extremely difficult to detect, no matter if you have AV or the more advanced EDRs these days. You know, I do this for a living, as I try to develop payloads for testing that get by EDRs. And in most cases, we're able to do it. So if I could do it, the bad guys could do it. And that's what worries me. Do I have an implant, whether it's software or hardware, on my machine? And that scares me, right? But unfortunately, I have to use the technology - well, fortunately, I have to use it.

Joe Carrigan: [00:31:27] Right.

Kevin Mitnick: [00:31:27] Yeah.

Joe Carrigan: [00:31:28] I don't know that I would take my hardware out of the country - to a different country, for that. I think I would buy scrap hardware, maybe.

Kevin Mitnick: [00:31:34] Well, it actually happened to me. Just a quick story - I was in Bogota, Colombia. And I remember I did a presentation for a newspaper out there called El Tiempo. And I went out to dinner, went back to the hotel room, put my key card in, and it was a yellow light. And I know, usually with these types of locks, a yellow light means it's locked from the inside. So we go down to the reception, get a new key, come back up, yellow light. And I did this three times. Finally, I had security come with me. They used their card. It opened the room. I look in the room; everything looks normal. Prior to flying to Bogota, I had a friend of mine replace the hard drive in my MacBook. Back then in the MacBooks, you actually had to pull out the keyboard - it was like a whole operation to change your hard drive. So I had her replace it with a hard drive which had more capacity, formatted it, set up all my applications, flew to Bogota. This happened, all right? So then I didn't think anything of it. I just go, that's a weird fluke. But, hey, I guess they have cheap locks in Colombia. So...

Joe Carrigan: [00:32:38] So you dismissed it.

Kevin Mitnick: [00:32:39] I dismissed it.

Dave Bittner: [00:32:40] Right.

Kevin Mitnick: [00:32:40] So I fly back into Las Vegas, where I was at the time, and then the hard drive, the new one, was giving me some issues. So I was going to replace it and go get a new one. So I asked the same friend if she could just swap it out. So she's in my apartment, with the entire laptop disassembled on my table, and she goes, why were you in your laptop? I go, what are you talking about? She goes, well, the screws I put in, I put them in very loosely so I could, you know, remove the hard drive again if I had to. These are super tight. Then it hit me like a ton of bricks. Somebody was in my room, obviously, taking the hard drive out and cloning it.

Dave Bittner: [00:33:15] Wow.

Kevin Mitnick: [00:33:16] And I was, like, super angry over the whole thing. So in that particular case, that was a big wake-up call that, in foreign countries, you know, you could definitely be a target.

Joe Carrigan: [00:33:25] Yeah.

Dave Bittner: [00:33:26] Yeah. We're going to take a few questions from audience members. We've got some mics set up here at the front of these two rows. So if you have some questions, please make your way up. And we would love to hear from that. Instead of having runners, we've got the mics set up here on the stand. So please don't be shy. Come on around. And if you have a question, we would love to hear it. In the meantime, Kevin, where do you think we're headed? Do you think we are gaining ground on this? Do you think the word is starting to get out? What do you see in the near future?

Kevin Mitnick: [00:34:01] Well, I like what you guys are doing - which, in fact, is educating people about the scams. Last year, we invited Frank Abagnale to keynote at the conference. And Stu and I had breakfast with him. And one of the key takeaways from that breakfast - what Frank said, which I totally agree with - is that the way to protect people from being scammed is to educate them about the scam. Unfortunately, there are so many different scams he can't possibly do that at scale, right? So what I think is important is to do as much user education and training about the class of these types of scams, how they work, so people can recognize similar ones - and then actually doing what we do at KnowBe4, which is attacking the user using the same type of tradecraft. So when they fall for it, then instead of ransomware, they're, you know, obviously going to get some training, and you actually try to inoculate that person against that type of attack. So I think that's important for phishing, for pretext phone calls, for even physical security.

Dave Bittner: [00:34:59] Right.

Kevin Mitnick: [00:35:00] Because we could use social engineering as a part of gaining physical access to your facility.

Dave Bittner: [00:35:05] Right, right.

Kevin Mitnick: [00:35:05] So I think it's critical.

Dave Bittner: [00:35:07] Yeah, I think we have a question over here. Sir, go ahead.

Unidentified Person: [00:35:10] Hi, Kevin.

Kevin Mitnick: [00:35:12] Where are you? OK.

Unidentified Person: [00:35:13] A question regarding two-factor authorization, specifically like Google Authenticator, and how easy that would be for, like, a hacker, you know, to bypass or, I don't know, somehow duplicate that on another telephone or something like that.

Kevin Mitnick: [00:35:31] Yeah, I mean, I actually do a demonstration tomorrow that, if you're not using FIDO or U2F technologies, it's pretty trivial. It's not really bypassing Google Authenticator; it's really fooling the user into going to a particular domain which we proxy the user through and are able to steal what they call the session key. Because when you're interacting with a website, for example, and you have to authenticate, it creates a session key. So every time you're visiting other parts of the site, you know, you're maintaining state. Otherwise, you'd have to log in for every page you connect to, which is not, you know, usable.

Kevin Mitnick: [00:36:08] So the guy that developed this tool, if you want to Google it, is Kuba. He developed this tool called Evilginx - E-V-I-L-G-I-N-X. And that's actually the tool that demonstrates how this attack works. The only solve for this attack is using something like a YubiKey or Google's Titan USB key. And of course, educating people that, even though they use two-factor authentication, if the websites that they're using do not offer U2F - and there's a lot that don't - that's where the user education and training is important...

Unidentified Person: [00:36:45] OK.

Kevin Mitnick: [00:36:45] ...To educate people about this class of attack.
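
To make the session-key point concrete: after one successful login - password plus one-time code - a site typically hands the browser a random token, and every later request is trusted on that token alone. That is why a proxy that relays the login and captures the token never needs to see the 2FA code again, and why origin-bound keys (U2F/FIDO) resist the trick. A toy sketch, not Evilginx, with made-up demo credentials:

```python
# A toy illustration of session keys: after one login, the server issues a random token
# and every later request is authenticated by the token alone - no password or OTP re-check.
# Demo users and codes are fabricated.

import secrets
from typing import Optional

USERS = {"alice": {"password": "correct horse", "otp": "492817"}}   # made-up demo data
SESSIONS = {}                                                        # token -> username

def login(username: str, password: str, otp: str) -> Optional[str]:
    user = USERS.get(username)
    if user and user["password"] == password and user["otp"] == otp:
        token = secrets.token_hex(16)
        SESSIONS[token] = username
        return token            # from here on, this token is the only secret that matters
    return None

def fetch_payroll(token: str) -> str:
    username = SESSIONS.get(token)
    if username is None:
        return "401 Unauthorized"
    return f"200 OK: payroll data for {username}"   # note: no second 2FA challenge here

if __name__ == "__main__":
    token = login("alice", "correct horse", "492817")
    print("victim's session token:", token)
    # Anyone holding the token - including a proxy that relayed the login - gets the data:
    print(fetch_payroll(token))
```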

Unidentified Person: [00:36:47] That seems...

Dave Bittner: [00:36:48] All right. Let's jump back over here. Sir, go ahead.

Unidentified Person: [00:36:52] So as someone who has a certainly high profile, do you have recommendations that we could pass along to other high-profile individuals in our organization to kind of lower their risk profile or, you know, lower their risk surface?

Kevin Mitnick: [00:37:04] Well, I think - like, for example, celebrities; I get calls from celebrities that are actually worried about being compromised. So they have to take, you know, extra precautions - for example, enabling two-factor authentication, in some cases using virtual machines to use the internet rather than their host machine. I get these calls from time to time where we have to change the way a person behaves using technology and using the internet, using their mobile phone, using their computer. And it becomes where it's a lot less convenient, but at the same time, it's much more secure. So that's what we do to help people that are more likely to be targeted because of their status.

Dave Bittner: [00:37:49] All right. Let's switch back over here. Go ahead.

Unidentified Person: [00:37:51] Hey. How you doing, Kevin? Kind of a story with a short question. I'm noticing more that the pattern is - like, we're protecting ourselves, but the people responsible for our information are more likely to leak it. So for example - and for the record, I didn't do this - I noticed last year that at the bar, if you want to charge something to your room, you just need a last name and the room you're in. And I happened to call before we came through today, or yesterday, to ask about my room. They just needed to know, like, the day I was arriving and my last name, and they would provide the room after I explained to them that, oh, my wife's afraid of heights; she doesn't want to take the elevator very high. Any way we can get a, you know, low-level room? And I just wanted to confirm that. They gave me my room number. So in that scenario, obviously anybody could just go to the bar now with the room number of Kevin Mitnick and charge (laughter), you know, Kevin a drink on his tab - which I wouldn't do, but what....

Unknown: [00:38:58] (LAUGHTER)

Unidentified Person: [00:38:59] I mean, if you want to, that's fine.

Kevin Mitnick: [00:39:01] Well, all the hotels around the world have what they call non-registered guests, or incognito. So once you tell registration or the front desk manager that you want to be considered an unregistered guest - which is what celebrities do - there's a big red flag on their machine, or on their computer rather, that says not to give any information out. So you hope that that works to solve the situation.

Unidentified Person: [00:39:24] All right. Well, if John...

Kevin Mitnick: [00:39:25] But that doesn't stop somebody from following you to the room, though.

Unidentified Person: [00:39:27] Sure, sure. So if Johnson has a bunch of drinks on his tab - my bad.

Unknown: [00:39:31] (LAUGHTER)

Dave Bittner: [00:39:33] All right. Let's go over here, and then we'll finish up over here. We've got two more questions, quickly. Go ahead, sir.

Unidentified Person: [00:39:39] So Kevin, you mentioned earlier a little bit about Ma Bell company, the exploits available through them back in the day and, an older term that people don't hear too much anymore, phreaking. There's still a lot of organizations, even medical organizations, insurance companies, that request that you send them information not by email but through fax. And I was wondering, in your opinion, what is the vulnerability there, even today, when sending faxes to a number that you believe is to an insurance company? When you're sending a fax, is it possible for that to kind of be listened in on, a duplicate fax created at a different location? And do you know, in the industry, are there any efforts to move away from fax? Because I think I speak for a lot of people here - we're tired of fax, man.

Audience: [00:40:24] (APPLAUSE)

Joe Carrigan: [00:40:26] (Laughter) Yeah.

Kevin Mitnick: [00:40:26] Well - sending faxes. So first of all, it's a target-rich environment if you can compromise some of the fax machines these days, because you can get all the previous communications out of those machines. And in a lot of cases, you compromise a company's wireless network. I just had a recent case of this, where we were able to connect to all the printers and all these types of devices like fax machines, and they use default credentials, which gave us access to all that particular data. The concern about, you know, fax machines, of course, is could there be a redirection - you know, where an attacker could manipulate the number that you're calling and forward it off? Now, there's been some new research. When I used to do this, I used to access a switch of a phone company and do call forwarding, and then I'd be able to have my target be redirected to me.

Kevin Mitnick: [00:41:20] Now there's newer ways of doing this, and I suggest that you guys Google this guy, Karsten Nohl - he's the one that actually developed the BadUSB attack, which is pretty much common knowledge. But he's done a lot of research into what we call Ma Bell's SS7 network. And what they're able to do - he demonstrated this at the Chaos Computer Club in Germany, I think, two years ago - is how they were able, through accessing SS7, to basically redirect the target number to themselves. And that was pretty much incredible, because SS7, which used to be a closed network just for the Bell operating companies, is now pretty much an open network, which means they don't have all the security controls in place. So they were able to manipulate those security controls to get a target's location, GPS location, and they were also able to manipulate it to, essentially through SS7, forward the target to a number of their choice. So Google it. It's actually a pretty interesting presentation at the Chaos Computer Club by Karsten Nohl.

Dave Bittner: [00:42:27] All right. Last question, sir.

Unidentified Person: [00:42:29] Hello, Kevin. So I'm leaving the country later this month, and I'm taking my notebook computer with me because I might need to connect to networks back here in the States. What do I do to the notebook before and after the trip?

Stu Sjouwerman: [00:42:41] Oh, can I jump in on that one?

Kevin Mitnick: [00:42:42] Yes, yes.

Unknown: [00:42:43] (LAUGHTER)

Stu Sjouwerman: [00:42:44] We actually just came out with a module that exactly addresses that - safe travel for road warriors, 15 minutes, has a checklist. What do you do before, what do you do during, what do you do after? You can print the checklist in a PDF. There's quite a shopping list of things that you should be doing. And Kevin and I, we created that module together. It's extremely simple to, you know, step through it, print it, and then you go through the list. You'd be surprised.

Kevin Mitnick: [00:43:16] And realize, when you're crossing borders - like, if you go into Canada, their customs agents could demand the PINs to your mobile devices, they could demand the passwords to your computers, and if you don't comply - I believe there was a guy that was actually sent to prison for failure to comply with giving up the credentials. And so now you're put in a position, when you're traveling, that any of the border agencies in those countries have free access to all the data on your devices, which might be of concern if anything is under a nondisclosure agreement and that sort of thing.

Kevin Mitnick: [00:43:50] So one of the recommendations we make in the training course is, when you're traveling, you might consider setting up a travel machine, where you're not taking all your data - just the data that's necessary for what you need to do on the road and that sort of thing. But definitely keep in mind, when you're crossing into different jurisdictions, the laws change, and you could actually go to prison for not providing your credentials. Or in some cases - most cases - they'll just take your equipment. You don't provide us your password? That's fine, sir. We're taking your equipment. We're going to try to do an analysis. We'll send it to you in about six months to a year.

Dave Bittner: [00:44:25] Well, folks, that is all the time we have. Thank you all for joining us for this special live version of our "Hacking Humans" podcast. How about a round of applause for both Kevin and Stu?

Audience: [00:44:39] (APPLAUSE)

Dave Bittner: [00:44:39] We want to thank all of you for joining, and we hope, of course, that you will log into that podcast app, and you'll subscribe to both "Hacking Humans" and our CyberWire daily podcast. Thanks all for coming today. Thanks so much.

Audience: [00:44:50] (APPLAUSE)

Dave Bittner: [00:44:53] That is our show. Of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training.

Dave Bittner: [00:45:09] Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.

Dave Bittner: [00:45:17] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik. Our technical editor is Chris Russell. Our staff writer is Tim Nodar. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:45:35] And I'm Joe Carrigan.

Dave Bittner: [00:45:36] Thanks for listening.