On this Valentine's Day edition of Hacking Humans, Joe and Dave examine romance scams, including the sad tale of a woman bilked out of hundreds of thousands of dollars. There's a silly, non-murdering catch of the day, and Dave interviews Max Kilger from UTSA on the six motivations of bad actors.
Max Kilger: [00:00:00] You know, it's really important to build a better, more comprehensive understanding of the relationship between people and digital technology.
Dave Bittner: [00:00:10] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:29] Hi, Dave.
Dave Bittner: [00:00:41] We've got some fun stories to share this week. Later in the show, we've got my interview with Max Kilger from UTSA. That's the University of Texas at San Antonio. He's going to talk about the six motivations of bad actors.
Dave Bittner: [00:00:41] But first, a quick word from our sponsors at KnowBe4. And now a word from our sponsor, KnowBe4 - consider, if you will, how interesting it would be to talk with a world-famous hacker, one of the originals who cut his teeth on close-up magic and has now gone straight, teaching people to recognize hacks, scams and misdirections before they hit their organization. Stick around. And in a few minutes, we'll tell you how you can.
Dave Bittner: [00:01:11] And we are back. Joe, you know, we record these shows a few days before they actually are published. But this show is scheduled to be published on Valentine's Day.
Joe Carrigan: [00:01:20] It is.
Dave Bittner: [00:01:21] So we thought we would focus on Valentine's Day sort of romance scams and so forth. I have to say, you certainly went all out this morning decorating the studio here. It's full of beautiful red roses and boxes of chocolates. And not only that, but Joe is wearing a candy-apple red tuxedo today...
Joe Carrigan: [00:01:37] Yes. I am.
Dave Bittner: [00:01:37] ...In celebration of Valentine's Day. Looking very dapper there. I don't know where one gets a tuxedo like that, but not everyone can pull it off. But on you, somehow, it works.
Joe Carrigan: [00:01:46] I had it tailor-made, Dave.
Dave Bittner: [00:01:47] It's very nice, very nice.
Joe Carrigan: [00:01:49] Thank you.
Dave Bittner: [00:01:49] Well, why don't we jump in with our stories here? Joe, what do you have for us?
Joe Carrigan: [00:01:52] So I have this article from the Better Business Bureau that actually talks about the anatomy of a romance scam and how these things work. And they generally have some very common features. And the BBB has broken them down to four phases of attack.
Dave Bittner: [00:02:07] OK.
Joe Carrigan: [00:02:07] And the first phase is contacting the victims.
Dave Bittner: [00:02:09] Sure.
Joe Carrigan: [00:02:09] Of course. They've got to reach out at first. They will create some kind of fake profile. They could do it on social media, like Facebook, or they could do it on a dating site, using a stolen credit card, and actually set up what looks like a legitimate profile. But then they're going to quickly try to move you off of that platform of communication to try to communicate with you via email or texting. Why do you think that is?
Dave Bittner: [00:02:57] Maybe it's harder to trail them? I suppose the actual dating site systems may have things in place to try to protect people, and so you want to get off that platform.
Joe Carrigan: [00:02:57] That's part of it, that they want to get off the platform that actually looks for their activity. Also, the scam is going to become pretty evident when the charge gets challenged on a credit card and then that site's going to get shut down or that profile's going to get shut down. But if they've already moved you off to email or texting then they've already gotten you isolated from the platform.
Dave Bittner: [00:02:57] Right.
Joe Carrigan: [00:02:58] Isolation is kind of a common theme that we're going to see. The next step - or next phase, rather, is grooming. They are going to learn everything they can about the victim's life. And this stage can go on for months. Some scammers will send small gifts. They'll send daily text messages or direct messages on Facebook, or Twitter or something. The grooming also focuses on, once again, isolating the victims from their friends or family so that they don't have help when making decisions. In other words, they're going to make these decisions in a vacuum without talking to people. They're going to say things like, I don't think that your brother likes me. They're going to try to - or, I don't think that friend has our best interests at heart. Right? They're going to say...
Dave Bittner: [00:03:37] Just keep this between the two of us.
Joe Carrigan: [00:03:39] Keep this between the two of us. This is one of the red flags that you tell your kids, right? If somebody ever says let's let this be our little secret then that's one of the first things to let you know that you should be talking to somebody about it, right? Right? (Laughter).
Dave Bittner: [00:03:50] Yeah. True.
Joe Carrigan: [00:03:51] So anyway, this phase of grooming goes on for months. And then phase three is the sting, right? This is where they cash in. And the scammer will ask for money, usually on an emergency basis or a plane ticket to finally meet. If the victim sends money, the scammer will always find ways of asking for more money. Right? They've struck oil, and they're going to pump this well until it's dry.
Dave Bittner: [00:04:12] Yeah.
Joe Carrigan: [00:04:13] And the BBB was saying that there have been incidents where these scams have become dangerous, and victims have unwillingly been pulled into money laundering operations, drug trafficking operations. And in a few cases, they've even been convinced to fly to foreign countries, where they've been kidnapped and held for ransom.
Dave Bittner: [00:04:30] Wow.
Joe Carrigan: [00:04:31] ...Which is terrible.
Dave Bittner: [00:04:32] Yeah.
Joe Carrigan: [00:04:32] Absolutely, just appalling. At some point in time, most people catch on and they realize it's a fraud, or they wind up getting bilked out of all their money, which is a sad situation. But then the fraud will continue. And it continues in a way that we've talked about before. There will be some kind of follow-up scam. Like, the law enforcement scam - hey, I can help you get your money back. If you send me some fees, I can chase this guy down and get some of your money back. That's just another scam. It's the - what is it, the sunk cost fallacy?
Dave Bittner: [00:04:58] Yeah.
Joe Carrigan: [00:04:58] You know, maybe if I can go after this...
Dave Bittner: [00:05:00] Yeah.
Joe Carrigan: [00:05:00] ...If I can get my money back. No. You've lost the money. It's gone. Don't try to get it back.
Dave Bittner: [00:05:04] My dad used to say throwing good money after bad.
Joe Carrigan: [00:05:06] Exactly. That's a great way to say it.
Dave Bittner: [00:05:07] Yeah.
Joe Carrigan: [00:05:08] Or - and this is insidious - the scammer may sometimes admit that the original relationship was a scam but that they actually fell in love. Right?
Dave Bittner: [00:05:17] Really?
Joe Carrigan: [00:05:17] Yes. They'll have the audacity to do that. And then the scam will continue as before.
Dave Bittner: [00:05:22] Wow.
Joe Carrigan: [00:05:22] So the question is, how do you protect yourself? And actually, there's a really useful website from the Australian government called scamwatch.gov.au. They have an entire section dedicated to romance scams. And they say, obviously, if you can spot this early on, you're better off. Right? So if you can spot a fake profile, that's your best defense. So look for these markers. Do a Google image search on the profile picture, and if it comes up as a bunch of different people, or is a different person than the person you're looking at, you know that they just pulled that person's picture and put it onto a fake profile.
Dave Bittner: [00:05:53] Right.
Joe Carrigan: [00:05:54] Right?
Dave Bittner: [00:05:54] (Laughter).
Joe Carrigan: [00:05:54] And we've seen that happen.
Dave Bittner: [00:05:55] Yeah.
Joe Carrigan: [00:05:56] We had the military officer who had all of his images stolen. And somebody was absolutely impersonating this guy...
Dave Bittner: [00:06:02] Yeah.
Joe Carrigan: [00:06:02] ...Using his name and image.
Dave Bittner: [00:06:04] I'll tell you - my wife probably once a week gets a friend request on Facebook that is one of these scams.
Joe Carrigan: [00:06:10] My wife gets them frequently too.
Dave Bittner: [00:06:12] Yeah, it's a military guy, you know, or a guy in a military uniform - a handsome man in a uniform.
Joe Carrigan: [00:06:17] Right.
Dave Bittner: [00:06:17] Yep. Once a week, she probably gets one.
Joe Carrigan: [00:06:19] Yeah, it's terrible.
Dave Bittner: [00:06:20] Yeah.
Joe Carrigan: [00:06:21] My wife, like your wife, probably, just ignores them or reports them.
Dave Bittner: [00:06:23] Yeah.
Joe Carrigan: [00:06:24] Look for incorrect location information. You know, like, if somebody says Baltimore and something doesn't match up, it indicates that they're just using city names and states. Or they're in the wrong state. They want to meet with you despite the fact that they're maybe in Arizona and you're in Maryland. Someone who's not from the United States might not know that that's a very faraway place, so it doesn't make any sense. Make sure the physical attributes in the description match the picture. They're going to have physical attributes in their description. And if it says they have blue eyes, and you look at the picture, and the person has brown eyes, that should set off a flag.
Dave Bittner: [00:06:57] Oh, yeah.
Joe Carrigan: [00:06:57] Look for really broad specifications about what this person is looking for. In other words, I don't care anything about you. I just want to hook up. I just want to meet with you. That's another telltale sign. They're just casting a wide net - right? - and, of course, broken English - if you see broken English.
Dave Bittner: [00:07:15] Right.
Joe Carrigan: [00:07:16] And then once you've already started communicating, watch for the warning signs. Like, they try to quickly move you off another platform to communicate. We talk about - that happens very early. And then there are inconsistencies in their story. Those - every one of those inconsistencies should be a red flag to you.
Dave Bittner: [00:07:29] Yeah.
Joe Carrigan: [00:07:30] It's tough, Dave.
Dave Bittner: [00:07:31] It is.
Joe Carrigan: [00:07:32] You know, you can protect yourself if you're vigilant.
Dave Bittner: [00:07:34] Well, you got to get the word out there, I think, to your friends and family and folks who might be susceptible to this sort of thing.
Joe Carrigan: [00:07:39] Exactly - vulnerable. Tell them to listen to this show.
Dave Bittner: [00:07:42] (Laughter) That's right. So my story this week is actually an example of this. This comes from the folks at AARP, who I would like to point out I am not quite yet old enough to be a member of. But, boy, I'm heading that way.
Joe Carrigan: [00:07:54] Right. Me too.
Dave Bittner: [00:07:58] So AARP, they had a story about a woman they called Amy - that's not her real name. Amy found herself alone in her mid-50s. She'd had a troubled marriage. She'd had an abusive husband who had died from cancer. And some time had passed after he died. And she decided it was time to put herself out there. So she signed up for match.com. And she had a pretty straightforward pitch. She said looking for a life partner - successful, spiritually minded, intelligent, good sense of humor, enjoys dancing and traveling - no games. So she went on a few dates. She met some people. Nothing really clicked. But then the system came back and informed her that it had a 100-percent match for her. This was a handsome, fit, silver-haired man in his early 60s. He lived less than an hour away. And so she followed up. And this is when things started with Duane.
Joe Carrigan: [00:08:51] Never trust a guy named Duane.
Dave Bittner: [00:08:55] We're going to really hit the Duane contingent of our listeners there, Joe.
Joe Carrigan: [00:08:58] Sorry.
Dave Bittner: [00:08:59] So Duane wrote back. He had a long message describing his life, said he was a computer systems analyst from California. He grew up in England. So imagine the accent, Joe - the accent. He wrote her detailed romantic descriptions of how he imagined their first meeting. She actually did a little bit of digging. She found his LinkedIn profile. It was sparse, but it was there.
Joe Carrigan: [00:09:20] Right.
Dave Bittner: [00:09:21] It seemed legit.
Joe Carrigan: [00:09:22] So he's got a bigger footprint than just some rando.
Dave Bittner: [00:09:25] Right. He told her that he traveled a lot for work, that he was currently working in Malaysia. And one day after exchanging notes with him for months, she came home, and she found a beautiful bouquet of flowers and a note that said, my life will never be the same since I met you, love, Duane. And so she feels like she's falling for this guy, right?
Joe Carrigan: [00:09:44] Right.
Dave Bittner: [00:09:45] Well, soon enough, Duane starts asking her for money. And he assured her that he was financially secure. He had a large trust fund in England. But he just needed a little bit of help getting some components out of customs for his job. He was having some issues with some bank accounts. And could she just wire him some money just to help out? He'd pay her back as soon as he returned.
Joe Carrigan: [00:10:05] Sure.
Dave Bittner: [00:10:06] And so she wired him $8,000.
Joe Carrigan: [00:10:08] Wow.
Dave Bittner: [00:10:10] Well, this is how it begins, right? So Duane strings her along and...
Joe Carrigan: [00:10:13] That's a big first hit, though - eight grand.
Dave Bittner: [00:10:16] Yeah. He strings her along, promises that they're going to meet - can't wait to meet her. But the requests for money keep going on and excuse after excuse for delaying their inevitable face-to-face meeting.
Joe Carrigan: [00:10:28] Right.
Dave Bittner: [00:10:28] So finally, Amy's sister-in-law, who'd been kind of, you know, keeping track of all this and was concerned, sent Amy a link to an episode of, of all things, "The Dr. Phil Show." And for those of you who may not be in the U.S., Dr. Phil is a TV psychologist, I guess, or counselor.
Joe Carrigan: [00:10:47] I don't know what his doctorate is in.
Dave Bittner: [00:10:48] Yeah, yeah. It's probably, I don't know, economics or something - unrelated. But anyway, it's Dr. Phil. And people come on his show, and he gives them advice on how to live a better life. And Dr. Phil had a couple of guests on his show - two women who'd been scammed, and their stories looked all too much like Amy's.
Joe Carrigan: [00:11:07] Right.
Dave Bittner: [00:11:08] So at this point, Amy starts using reverse image search on Google, which you and I have talked about before, on the pictures that Duane had sent her. And sure enough, the photos were obviously of a real person, but that person was not Duane. It was someone else who, as we've spoken about before, was completely unaware that he was involved in this scam...
Joe Carrigan: [00:11:26] Correct.
Dave Bittner: [00:11:26] ...At all. So over the next few weeks, she continued to unravel the scam. But in the end, she had been tricked out of more than $300,000.
Joe Carrigan: [00:11:34] Oh, my God. That's heartbreaking.
Dave Bittner: [00:11:36] Yeah. She owned her home. Well, you know, when her husband passed away, she'd gotten some money, and she owned her home. And this scammer knew that. And in some of the early discussions, they'd talked about their finances and so on. She contacted the FBI. She learned that there was very little that they could do to help get the money back. But one of the things that this article from AARP points out is that this is really a multifactored trauma that she went through. Obviously there's the loss of the money.
Joe Carrigan: [00:12:02] Right.
Dave Bittner: [00:12:03] But also, there's the loss of the love.
Joe Carrigan: [00:12:05] Right.
Dave Bittner: [00:12:06] And there's the feeling of being fooled.
Joe Carrigan: [00:12:07] Yes.
Dave Bittner: [00:12:08] And the victims - they blame themselves. But quite often, their friends and their family blame them as well.
Joe Carrigan: [00:12:13] Right, right. How could you be so stupid...
Dave Bittner: [00:12:15] Right.
Joe Carrigan: [00:12:16] ...Is the thing that we hear. And we've said this time and time again that this is not something that is an indicator of your intelligence. This woman is in a situation where her husband has passed away. Her marriage wasn't that great.
Dave Bittner: [00:12:28] Right.
Joe Carrigan: [00:12:28] You know, so she's been conditioned all of her life - or all of her marriage, rather - to be set up for this kind of scam, almost.
Dave Bittner: [00:12:36] Right. She's a prime target because of the...
Joe Carrigan: [00:12:37] She's a prime target. Exactly.
Dave Bittner: [00:12:37] ...Unfortunate circumstances she's experienced.
Joe Carrigan: [00:12:37] I mean, she's got this guy who, for the first time in a long time, is somebody that's nice to her.
Dave Bittner: [00:12:45] Yeah.
Joe Carrigan: [00:12:45] And unfortunately, it was a scammer who just absolutely abused that.
Dave Bittner: [00:12:49] Yeah. And one of the things it said in this article - I think it was one of the FBI agents who said love is the most powerful drug there is.
Joe Carrigan: [00:12:55] Yeah.
Dave Bittner: [00:12:56] And I think there's something to that. So, you know, we talk about you have to have empathy for these folks. But I think that's a part that's overlooked quite often, is part of the loss is - you're not a fool for falling in love, right? I mean, the - you know, the heart wants what the heart wants. Yes.
Joe Carrigan: [00:13:12] Right.
Dave Bittner: [00:13:12] Could she have been more careful? Sure.
Joe Carrigan: [00:13:13] Sure.
Dave Bittner: [00:13:14] But that feeling for her was real.
Joe Carrigan: [00:13:16] Right.
Dave Bittner: [00:13:16] And the loss and the suffering of that is real. So I guess one of the other lessons here is just be careful. If folks you know fall victim to these things, don't pile on.
Joe Carrigan: [00:13:26] Yeah, don't jump on them.
Dave Bittner: [00:13:27] Try to help them, you know.
Joe Carrigan: [00:13:28] Yeah.
Dave Bittner: [00:13:28] But they're...
Joe Carrigan: [00:13:28] And try to be a little more empathetic...
Dave Bittner: [00:13:29] They're going through a lot. Yeah.
Joe Carrigan: [00:13:31] ...'Cause there is something out there that will fool you, trust me.
Dave Bittner: [00:13:39] Yeah, that's right. That's for sure. All right. Well, those are our stories, Joe. It's time to move on to our Catch of the Day.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:13:42] Joe, our Catch of the Day - this is actually something that I came across when I was searching around for some of the romance scams knowing that we were going to be doing an episode today thanks to Valentine's Day.
Joe Carrigan: [00:13:54] Yep.
Dave Bittner: [00:13:55] And this is an online profile that someone had put in on one of these dating sites, and the profile goes like this. The profile name is, I won't murder you.
Dave Bittner: [00:14:08] My self-summary - I'm a fun-loving guy and a self-starter who has absolutely no interest in committing murder. I'm looking for love, companionship or just that one lovely evening. And rest assured, that one lovely evening will absolutely end with you back at your house safe and sound. Let me take you into my magical world of not murdering anyone ever for any reason. What I'm doing with my life - I'll tell you this right upfront - certainly not murdering anyone...
Joe Carrigan: [00:14:36] (Laughter).
Dave Bittner: [00:14:36] ...Least of all you.
Joe Carrigan: [00:14:38] Right.
Dave Bittner: [00:14:38] Beyond that, mostly digging.
Joe Carrigan: [00:14:42] Digging what?
Dave Bittner: [00:14:43] I'm guessing shallow holes in the ground.
Joe Carrigan: [00:14:46] Right.
Joe Carrigan: [00:14:46] This sounds like something somebody who was going to murder me would say.
Dave Bittner: [00:14:50] You think?
Joe Carrigan: [00:14:50] Yeah.
Dave Bittner: [00:14:51] Yeah.
Joe Carrigan: [00:14:52] This is diametrically opposed to our Catch of the Week from two weeks ago - or last week, rather...
Dave Bittner: [00:14:57] (Laughter) Right.
Joe Carrigan: [00:14:57] ...Where the guy said he was going to murder you, (laughter) right?
Dave Bittner: [00:14:59] Yeah, yeah. This is good. So obviously somebody's having a little bit of fun here, but sort of the opposite of what we were talking about earlier in the show.
Joe Carrigan: [00:15:09] Right.
Dave Bittner: [00:15:10] But be careful out there, folks. You never know who you're going to meet...
Joe Carrigan: [00:15:13] Yeah.
Dave Bittner: [00:15:14] ...Online.
Joe Carrigan: [00:15:14] If I was a woman, I would never respond to this guy. I wonder how many people he got responding to his not-murdery (ph) profile.
Dave Bittner: [00:15:21] (Laughter) I don't know. Yeah, I don't know. It's funny. It's like all those people who get involved with folks in prisons.
Joe Carrigan: [00:15:26] Right.
Dave Bittner: [00:15:26] You know, what do you - people want to save people, I guess, or...
Joe Carrigan: [00:15:28] I guess.
Dave Bittner: [00:15:29] ...Help them out. So all right. Well, that's our Catch of the Day - a fun one this week. Coming up next, we've got my interview with Max Kilger. He is from the University of Texas at San Antonio. He's going to be talking about the six motivations of bad actors.
Dave Bittner: [00:15:42] But first, a word from our sponsors at KnowBe4. And now back to that message from our sponsor, KnowBe4. It can take a hacker to know a hacker. Many of the world's most reputable organizations rely on Kevin Mitnick, the world's most famous hacker and KnowBe4's chief hacking officer, to uncover their most dangerous security flaws. Wouldn't it be great if you had insight into the latest threats and could find out what would Kevin do? Well, now you can. Kevin and Perry Carpenter, KnowBe4's chief evangelist and strategy officer, will be running a webinar to give you an inside look into Kevin's mind. You'll learn more about the world of penetration testing and social engineering with firsthand experiences and some disconcerting discoveries. In their webinar, you'll see exclusive demos of the latest bad-guy attack strategies, find out how these vulnerabilities may affect your organization and learn what you can do to stop the bad guys. In other words, what would Kevin do? Go to knowbe4.com/hackinghumans to register for the webinar. That's knowbe4.com/hackinghumans, and we thank KnowBe4 for sponsoring our show.
Dave Bittner: [00:16:57] Joe, I recently had the opportunity to speak with Max Kilger. He is the academic director of critical technology studies at UTSA. That's basically a program that prepares people for a career in the intelligence community, and he spoke to us about the six motivations of bad actors. Here's my conversation with Max Kilger.
Dave Bittner: [00:17:17] Well, let's dig into the list here. There's six motivations of bad actors. Take us through - what are we talking about?
Max Kilger: [00:17:24] Sure. So there are six of them, and I'll list them off first - money, ego, entrance to social group, cause, entertainment and status. And a good way to remember this is just to remember the word meeces (ph). If you remember the old Hanna-Barbera cartoon with the mice...
Dave Bittner: [00:17:43] Right (laughter).
Max Kilger: [00:17:43] Pixie and Dixie. And Jinks the cat was chasing them...
Dave Bittner: [00:17:45] Right, right, right.
Max Kilger: [00:17:45] ...Saying, I hate meeces to pieces.
Dave Bittner: [00:17:47] I like meeces to pieces. Yes, I remember exactly. Yes (laughter).
Max Kilger: [00:17:52] Yeah, exactly right. And it's actually also a play off the old FBI counterintelligence term MICE, which stood for money, ideology, compromise and ego, which are the standard reasons why it was thought that people betray their country.
Dave Bittner: [00:18:08] I see. Well, let's dig into them one by one there. Take us through the list.
Max Kilger: [00:18:12] Sure. So money is, obviously, the first one - motivation for malicious online actors. And that's the one that you see in the headlines all the time with financial assets and bank breaches, credit card breaches and things like that. So that's a very major motivation for malicious, online actors. It has been for some time. And it will be for some time to come. The second one is ego. And ego has been around a long time since the early days of the hacking community, where, basically, this feeling of, I've, basically, challenged and won the - over the machine. There are these technical obstacles. And I'm very clever. And I've overcome them. And so that's sort of the second motivation for individuals to, basically, exhibit online, malicious behaviors.
Dave Bittner: [00:19:02] Kind of bragging rights, I suppose.
Max Kilger: [00:19:04] Bragging rights but it's also just this feeling of accomplishment. It's like...
Dave Bittner: [00:19:08] I see.
Max Kilger: [00:19:08] I pitted myself against the machine, and I won.
Dave Bittner: [00:19:12] Got you - all right, what's next?
Max Kilger: [00:19:14] The next one is entrance to social groups. So, of course, you have hacker groups, hacking communities. And you just can't walk up to them and say, hey. I'd like to join you guys - doesn't really work like that. So, basically, you have to demonstrate some skill and expertise. So you may write a particular exploit or piece of malware and show it to them and say, hey. Look at this, you know, piece of malware or this exploit. And they go, oh, yeah. That is pretty cool. That's pretty clever. OK. You - we'll take that exploit, and you can join the group.
Dave Bittner: [00:19:51] (Laughter) Right. Right. So you get to be a member of an exclusive club.
Max Kilger: [00:19:55] That's correct, sir.
Dave Bittner: [00:19:56] Yeah. What's the next one?
Max Kilger: [00:19:58] The next one - the fourth one is cause. And that one is, basically, the one you think about in terms of hacktivism and political causes, basically using the web to promote a specific ideological, political, cultural or social cause.
Dave Bittner: [00:20:16] And next.
Max Kilger: [00:20:17] Entertainment - this is one that, in the early days of the hacking community, was fairly popular. Hackers are fun-loving, prank-pulling, mischievous folks. And so in the old days, they loved doing that. And then this motivation for malicious, online actors kind of disappeared for a couple of decades. But in the last, say, five to seven years, it's come back because of a number of things, including sort of a large number of naive users now on the internet. So there are lots of soft targets. There are a lot of different ways to, basically, taunt and torment people. And so entertainment has kind of made a comeback.
Dave Bittner: [00:20:59] And the last one.
Max Kilger: [00:21:00] Then the last one is status. That is your status in your local hacking group and your regional hacking group, nationally and internationally, depends upon your skill and your expertise in a specific technical area, like networking or information security or operating system kernels or hardware and things like that. So status might be a motivation for a malicious act, just to say, oh, look. I was able to breach this firewall. I was able to defeat this security system on this phone, things like that.
Dave Bittner: [00:21:34] Now, I suppose these motivations don't have to be isolated from one another. Do you find that certain ones tend to pair together more than others?
Max Kilger: [00:21:45] Well, that's kind of interesting - not as much as you think. Typically, people who have similar motivations tend to group together. And also, they tend to be sort of - oh, this is a primary motivation. Whether it's money, whether it's entertainment, whether it's ego, they tend to be fairly primary. And you will see some secondary motivations. So you'll see them paired a bit. But often, there's one that's very strong. And then there's sort of a secondary one that's much weaker.
Dave Bittner: [00:22:14] Now, I guess to that point, do you see - folks who are grouped together, is there disdain for people who are motivated by other things?
Max Kilger: [00:22:23] Yeah, that's correct. And, actually, in the very old days in the hacker community, what happened was individuals who were interested in hacking for money, basically, stealing credit cards or things like that, were looked down upon by the other members of the hacking community. That's, actually, kind of passed now. And now we have this huge cybercrime community. And so there are a large number of cybercrime groups. And so the old days are kind of gone.
Dave Bittner: [00:22:53] Now, for folks who are looking to protect themselves against these sort of things - and I'm thinking particularly when it comes to things like social engineering. I mean, how does your work inform those defenses? What kind of advice do you have for people?
Max Kilger: [00:23:07] It's the usual advice that you're going to hear sort of from other sources. You know, it's the usual stuff. Don't open things from people you don't know. Whenever I - for example, I receive a notification, say, from a bank or from a company that I deal - do business with, I'll never click on the email or click on the links. I'll always just go to the company's website and navigate to find whatever offer or whatever piece of information or whatever action I have to do so that I never really try to react to anything that's either a URL in your email or an attachment that you have to open. So that's a pretty good piece of advice. And another one, of course, is to put some sort of defensive system on your machine, whether it's one of the local antivirus companies' stuff. Use VPNs. Don't use public networks - the usual things like that. Although, putting, say, antivirus and anti-malware software on your machine, it's good. But none of them are perfect. They're always going to let some things slip through. So just be aware. It's a kind of warm and fuzzy feeling, but don't get too cozy.
Dave Bittner: [00:24:19] Yeah. I mean, it's interesting to me. For you, as someone with a, you know, deep knowledge of social psychology, you kind of sit at that intersection between the human side and the technical side.
Max Kilger: [00:24:31] Yeah, I think that's really fascinating. That's what really started my passion many years ago. I sort of realized that - how much it was going to change society and the way people interact and also the importance that it had to fields like information security and cybersecurity. And I spent a number of the first years when I was doing this sort of wandering the halls of Washington, trying to say, hey. Look. You should probably pay attention to this human element of cybersecurity. It's really important. It can help you sort of look into the future to see what emergent threats might be coming. It can help you sort of get out of your defensive huddle and become more competent and confident that you can meet incoming threats. But I have to tell you it was a very tough sell in the early days.
Dave Bittner: [00:25:22] Do you suppose that folks are starting to catch up and get the word? I mean, it seems, certainly, that with all the phishing attacks that we've seen lately, that the social engineering side of things has really come kind of front and center.
Max Kilger: [00:25:34] Yeah, you're - that's, actually, a very good observation. And I'm, actually, happier because people are beginning to seriously consider some of the human elements and components in cybersecurity. And even in the cybersecurity field, there - you know, now you have threat hunters, who, basically, go out and look for threats and profile them and try and track what they're doing. And so that's, actually, encouraging because you're really using the human components in profiling to basically help protect your organization. The thing I always tell people is, hey. Look. You know, it's really important to build a better, more comprehensive understanding of the relationship between people and digital technology because once you acquire that better understanding, you can begin to look out into the future and begin to say, oh. Over here, that looks like that could be a potential, emerging threat. And once you begin to do things like that and develop future scenarios, you can, basically, inform policymakers. You say, hey. Look. We think this might be coming from this direction, and it looks pretty serious. And then policymakers can make the decision to put resources against that potential threat. And you're not always being placed in sort of a reactive position.
Dave Bittner: [00:26:54] Interesting stuff, Joe.
Joe Carrigan: [00:26:55] It is interesting. I give talks from time to time to groups of people in the community. Most recently, I gave a talk up at Hagerstown Community College to a group of small business people on cybersecurity. And we talk about why people hack. And I touched on all these different reasons. And I really like Max's meeces (laughter).
Dave Bittner: [00:27:13] Yeah. It's easy to remember.
Joe Carrigan: [00:27:15] Yeah. As an old fan of cartoons...
Dave Bittner: [00:27:17] (Laughter).
Joe Carrigan: [00:27:17] You know, that sticks with me. But my point in these lectures is that we used to cite all these different reasons. But now the biggest concern is the financially motivated hacker. The reason is that while these other things - ego, entrance, cause, entertainment, status - are still factors in why people act this way, they make up a very small minority of cases. And those tend to be more random, less focused crimes of opportunity. Money is such a motivator for these criminals that they have, actually, built out businesses around it.
Dave Bittner: [00:27:48] Right.
Joe Carrigan: [00:27:49] And they have a very efficient industry. It's an industry now...
Dave Bittner: [00:27:53] Yeah.
Joe Carrigan: [00:27:53] ...To go ahead and exploit people and get their money.
Dave Bittner: [00:27:56] Right. It's not just the hobbyists...
Joe Carrigan: [00:27:57] It's not just the hobbyists. It's not...
Dave Bittner: [00:27:59] ...With their kit computers in the garage.
Joe Carrigan: [00:28:01] Yeah. It's like - imagine, you know, years from now that what you're looking at at Google now - or Amazon or Apple - has some kind of counterpart that's an illicit operation that is as big as those operations. I don't think it'll ever get that big. But, you know, you're looking at a late start-up for these kinds of businesses...
Dave Bittner: [00:28:18] Right.
Joe Carrigan: [00:28:18] ...Where they're, actually, starting to make money. And they're being very profitable at it. I agree with Max's statement that the focus on the human element of security is a long time coming and that it's really good that we're finally just now starting to get onto it. But we really should've been getting onto it a lot earlier.
Dave Bittner: [00:28:33] Yeah. It's interesting. Why do you think that is?
Joe Carrigan: [00:28:35] I think because we've all focused on the technology. And maybe the technology has actually gotten better. We've talked about this...
Dave Bittner: [00:28:40] Sure.
Joe Carrigan: [00:28:40] ...Before. The technology's, actually, gotten good enough that...
Dave Bittner: [00:28:42] Yeah.
Joe Carrigan: [00:28:42] ...That the humans are now the easier target - the economic forces push the attackers to work on the humans.
Dave Bittner: [00:28:46] Right.
Joe Carrigan: [00:28:46] But, you know, if we'd spent time anticipating this problem - and that's kind of the issue is we tend to be a little more reactionary than we should be. We should be trying to anticipate problems and get out in front of them before they happen.
Dave Bittner: [00:28:57] Yeah, yeah.
Joe Carrigan: [00:28:58] But that's hard to do...
Dave Bittner: [00:28:59] Yeah.
Joe Carrigan: [00:28:59] ...Especially on a global scale.
Dave Bittner: [00:29:00] Well, thanks to Max Kilger for joining us. And thanks to you for listening. That is our show this week.
Dave Bittner: [00:29:05] We want to thank our sponsors KnowBe4, whose new-school security awareness training will help you keep your people on their toes with security at the top of their mind. Do check out their webinar with hacker extraordinaire Kevin Mitnick. Go to knowbe4.com/hackinghumans and register for the webinar. That's knowbe4.com/hackinghumans. And we thank KnowBe4 for sponsoring our show.
Dave Bittner: [00:29:29] Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu.
Dave Bittner: [00:29:37] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:29:52] I'm Joe Carrigan.
Dave Bittner: [00:29:53] Thanks for listening.
Copyright © 2019 CyberWire, Inc. All rights reserved. Transcripts are created by the CyberWire Editorial staff. Accuracy may vary. Transcripts can be updated or revised in the future. The authoritative record of this program is the audio record.
KnowBe4 is the world's largest security awareness training and simulated phishing platform that helps you manage the ongoing problem of social engineering. Their new-school security awareness training platform is user-friendly and intuitive. It was built to scale for busy IT pros who have 16 other fires to put out. Learn more at KnowBe4.com.