Separating fools from money.
Lily Hay Newman: [00:00:00] Some people who do email scamming and get rich off of email scamming are really vocal about that in Nigeria and take pride in it and show off their wealth. And there's kind of a sense that if people are dumb enough to get tricked then they don't deserve to have their money, and the scammers sort of outwitted them and outsmarted them and deserve to have it.
Dave Bittner: [00:00:27] Hello, everyone, and welcome to The CyberWire's Hacking Humans podcast, where each week we look behind the social engineering scams, phishing schemes, and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm David Bittner from The CyberWire. And joining me, as always, is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:47] Hi, Dave.
Dave Bittner: [00:00:48] As always, we've got some good stories to share. And, later in the show, we'll have my interview with Lily Hay Newman from Wired. She's discussing her recent article on Nigerian scammers. But before we get to that, a quick word from our sponsors, our friends at KnowBe4.
Sponsorship: [00:01:07] Step right up and take a chance. Yes, you there. Give it a try and win one for your little friend, there. Which were the most plausible subject lines in phishing emails? Don't be shy. Were they, a., My late husband wished to share his oil fortune with you? Or, b., Please read - important message from HR? Or, c., A delivery attempt was made? Or, d., Take me to your leader? Stay with us, and we'll have the answer later. And it will come to you courtesy of our sponsors at KnowBe4, the security awareness experts who enable your employees to make smarter security decisions.
Dave Bittner: [00:01:47] And, we're back. Joe, we've got some fun stories today.
Joe Carrigan: [00:01:51] Yes, we do.
Dave Bittner: [00:01:52] Let me get started with this one. This is an actual true story from a friend of mine - actually, a former professor of mine from college. He was hired to test the security of various physical systems, let's call them. This was not long after 9/11, and if you think back, there was an air of heightened security all around. And one of the places he was hired to test was airport security. So here's what they did. They hired someone who had significant physical disabilities. And this was someone who was in a really high-tech wheelchair.
Joe Carrigan: [00:02:25] Right.
Dave Bittner: [00:02:26] Right? So imagine Stephen Hawking, that sort of thing. It's got a respirator on the back of the thing. This person has a lot of tubes hooked up to him. He's a very thin, you know, sort of frail-looking person because of the situation that put him in that wheelchair. So they hire that person, buy him a plane ticket and send him through the line. But they gave him a little something extra to take with him to test the security. Strapped to the side of his wheelchair, they put a colostomy bag...
Joe Carrigan: [00:02:56] Right.
Dave Bittner: [00:02:56] ...Filled with chocolate milk.
Joe Carrigan: [00:02:58] (Laughter) OK.
Dave Bittner: [00:02:59] And, just to make sure there wasn't any confusion as to what they were up to, they put a hand grenade in the bag full of milk.
Joe Carrigan: [00:03:05] It was inert, right?
Dave Bittner: [00:03:06] It was inert, indeed. Yes.
Dave Bittner: [00:03:09] So they put an inert hand grenade. So it's something that's going to create a big, you know...
Joe Carrigan: [00:03:15] Something that should be obvious what it is. Right?
Dave Bittner: [00:03:18] That's right. That's right. It's not - there's not a lot of nuance to the metallic signature that this thing will have in this colostomy bag. And they have the bag, you know, sort of hosed up to him, with lots of tubes and beeping things and so forth. But this is part of it. So they put this person in line, and they have someone else with him - a nurse, a handler, someone who's very friendly - to sort of interact with the security people and help get him through the line. So imagine this situation. You've got someone here who is unable to be disentangled from his metallic wheelchair. That's the story that they spun up here, because if they disconnect him from all of the things that he's connected to, he will not survive.
Joe Carrigan: [00:03:59] Right.
Dave Bittner: [00:03:59] So they send him through the line, and then just for good measure, they put several people who were also on this team behind him in line to kind of put the heat on with the security people. Right? So they've got people - come on, I've got a plane to catch. You know, why you taking all the time with this guy? You know, I'm going to be late to my plane. So these people are hired - you know, part of the game to be jerks...
Joe Carrigan: [00:04:19] Right.
Dave Bittner: [00:04:20] ...To try to move things along. So they send him through. And the folks, the security folks, do their scanning and everything. And, sure enough, what do you know? The hand grenade got through.
Joe Carrigan: [00:04:30] Sure.
Dave Bittner: [00:04:31] No problem at all.
Joe Carrigan: [00:04:32] Sure.
Dave Bittner: [00:04:33] So let's walk through this. So there's a bunch of different social engineering techniques that were going on here.
Joe Carrigan: [00:04:38] Yes. I will say this, though. The crew of people behind him yelling may have been part of the test but not really part of the social engineering, because I believe they were trying to emulate actual angry passengers.
Dave Bittner: [00:04:52] Yeah. Well, I think the social engineering component of that is you're trying to put more pressure on the people whose job it is to test these things. That you're trying to...
Joe Carrigan: [00:05:00] That's right. You want to get them to hurry it along.
Dave Bittner: [00:05:02] Right.
Joe Carrigan: [00:05:02] The bigger part of this social engineering piece is that here's a frail guy who has already been through a lot of stuff. Let's not hassle this guy any more than we need to.
Dave Bittner: [00:05:11] What are the odds that this person is up to no good?
Joe Carrigan: [00:05:14] Right.
Dave Bittner: [00:05:14] So you've got that coming into play. So there's social engineering there, saying, you know, well, you know, this person probably is not going to cause any trouble so let's give him a cursory check. He has this wonderful person with him, this nurse, you know? Well, what are the odds that we have a problem here?
Joe Carrigan: [00:05:30] Right.
Dave Bittner: [00:05:31] But, of course, the test was to see how easy it could be to get something through here. And this is the kind of thing where if some bad guys wanted to get something onto a plane, well, this worked.
Joe Carrigan: [00:05:42] Who's going to want to touch a colostomy bag?
Dave Bittner: [00:05:43] Right.
Joe Carrigan: [00:05:44] You know?
Dave Bittner: [00:05:44] Right. So you're dealing with a yuck factor.
Joe Carrigan: [00:05:46] Right. Exactly.
Dave Bittner: [00:05:47] So you've got sympathies. Right? People have a natural - they don't want to put this person through any further indignities.
Joe Carrigan: [00:05:54] Yep. You've got the yuck factor, like you're talking about. And you've got the pressure factor behind you of people yelling at you. And all that makes for something that pushes, essentially, a weapon through the security process.
Dave Bittner: [00:06:05] Right. It made it through without a hitch.
Joe Carrigan: [00:06:07] Right.
Dave Bittner: [00:06:07] So an interesting lesson there, you know? A little different from some of our cyber stories, but certainly an interesting exercise of lots of combined social engineering techniques.
Joe Carrigan: [00:06:16] Yep.
Dave Bittner: [00:06:16] These folks were successful.
Joe Carrigan: [00:06:17] I would consider that a physical penetration test.
Dave Bittner: [00:06:19] Yep. Absolutely.
Joe Carrigan: [00:06:20] You know?
Dave Bittner: [00:06:20] All right. So what do you got for us this week?
Joe Carrigan: [00:06:22] So I have something on one of my favorite topics, Dave, and that's passwords and breaking them.
Dave Bittner: [00:06:26] OK.
Joe Carrigan: [00:06:27] And it seems that the norm is that password use and password choosing, I guess you could say, is risky and lazy at best.
Dave Bittner: [00:06:36] Right. We reuse passwords, period.
Joe Carrigan: [00:06:38] Right. We're reusing passwords, or we're slightly modifying a password and then reusing that.
Dave Bittner: [00:06:42] Right.
Joe Carrigan: [00:06:42] So some researchers at Virginia Tech published a paper back in March of this year called "The Next Domino to Fall: Empirical Analysis of User Passwords across Online Services," where they document their password analysis. What they showed was that more than 16 million password pairs can be cracked with just 10 guesses. And, by password pair, they mean username and password.
Dave Bittner: [00:07:11] Right.
Joe Carrigan: [00:07:11] So if I know your username, like, let's say in a breach that's happened, I get your email address. Well, chances are, you will use that email address across multiple sites.
Dave Bittner: [00:07:19] Sure.
Joe Carrigan: [00:07:20] So if I see the same email address, and I know - or even just suspect - that you reuse passwords, or that you slightly modify them and then reuse them, then I can guess your password in fewer than 10 guesses. That's what these researchers are saying.
Dave Bittner: [00:07:36] Wow.
Joe Carrigan: [00:07:36] One of the points that stood out for me in the paper was that they collected 497 million passwords that were hashed. And they were just hashed - they weren't salted, just plain hashes. And they broke 460 million of them in a week.
Dave Bittner: [00:07:51] Wow.
Joe Carrigan: [00:07:51] That's 92 percent of half a billion passwords they cracked in a week. OK? So that alone speaks volumes to the problem.
Dave Bittner: [00:08:01] Right.
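The unsalted-hash problem Joe describes can be sketched in a few lines: without a per-user salt, everyone who chose the same password ends up with an identical hash, so one dictionary pass cracks all of them at once. This is an illustrative sketch (the email addresses and wordlist here are made up), not the study's actual methodology.

```python
import hashlib

def md5_hex(password: str) -> str:
    # Unsalted hash: the same input always yields the same digest.
    return hashlib.md5(password.encode("utf-8")).hexdigest()

# Three users from a hypothetical breach; two picked the same weak password.
leaked = {
    "alice@example.com": md5_hex("metallica"),
    "bob@example.com": md5_hex("hedgehog"),
    "carol@example.com": md5_hex("metallica"),
}

# One pass over a dictionary cracks every matching account simultaneously.
dictionary = ["123456", "qwerty", "metallica", "starwars"]
lookup = {md5_hex(word): word for word in dictionary}

cracked = {user: lookup[h] for user, h in leaked.items() if h in lookup}
print(cracked)  # alice and carol both fall to the single entry "metallica"
```

With a random per-user salt, alice's and carol's hashes would differ, and the attacker would have to rehash the whole dictionary once per user instead of once total.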
Joe Carrigan: [00:08:01] The study was demonstrating the social engineering value of understanding that people reuse passwords or just slightly modify them. Virginia Tech then provided the anonymized data set to Dashlane, and Dashlane researchers noticed the prevalence of something called password walking, which is the practice of using letter and number combinations that are next to each other on the keyboard. So, you know, you think of one, two, three, four, five, six, or, Q, W, E, R, T, Y.
Dave Bittner: [00:08:29] Right. QWERTY. Yeah.
Joe Carrigan: [00:08:29] Right. You don't really think of one, Q, A, Z, X, S, W, two. 'Cause if you look at that as a password string - 1qazxsw2 - you're going to say, hey, that looks kind of random. But if you look at a QWERTY keyboard and just go down the first column and back up the second column, there's your password.
Dave Bittner: [00:08:44] I see.
Joe Carrigan: [00:08:44] Right? And Dashlane noticed a prevalence of that, as well.
Dave Bittner: [00:08:47] Interesting.
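To make "password walking" concrete, here's a rough sketch of a checker that flags passwords tracing adjacent keys on a QWERTY keyboard. The adjacency map is a simplified, partial assumption about the layout, not a complete model and not Dashlane's actual detection logic.

```python
# Simplified QWERTY adjacency map (partial, for illustration only).
ADJACENT = {
    "1": "2q", "2": "13qw", "q": "12wa", "w": "23qes", "a": "qwsz",
    "s": "awedxz", "z": "asx", "x": "zsdc", "e": "wsdr", "d": "serfcx",
    "3": "24we", "4": "35er", "r": "34edft", "5": "46rt", "6": "57ty",
    "t": "56rfgy", "y": "67tghu",
}

def is_keyboard_walk(password: str) -> bool:
    """True if every consecutive pair of keys is physically adjacent."""
    p = password.lower()
    if len(p) < 4:
        return False
    return all(b in ADJACENT.get(a, "") for a, b in zip(p, p[1:]))

print(is_keyboard_walk("1qazxsw2"))  # the down-one-column, up-the-next walk
print(is_keyboard_walk("qwerty"))    # the classic top-row walk
print(is_keyboard_walk("hedgehog"))  # not a walk
```

A cracking dictionary can enumerate all such walks up to a given length cheaply, which is why these passwords fall fast despite looking random.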
Joe Carrigan: [00:08:48] Also, the report noted lots of people used brand names, as well as the names of cultural and musical icons. So, like, "Pokemon" and Metallica and "Star Wars" were in there. These are remarkably easy to guess. They fall very quickly to just a basic dictionary attack because the word Metallica is going to be in every single password dictionary.
Dave Bittner: [00:09:05] Right.
Joe Carrigan: [00:09:06] Some passwords were obviously created out of frustration. They noticed the use of some profanity (laughter).
Dave Bittner: [00:09:12] Of course. Naughty words and phrases.
Joe Carrigan: [00:09:14] Exactly. The researchers noted that it's difficult for people to remember passwords for each of their 150-plus online accounts, plus their business accounts and everything. And the solution most people arrive at is to just reuse or modify passwords. And that's an untenable situation.
Dave Bittner: [00:09:31] Right. Right. For example, I can see, you know, someone decides that they want to use the word hedgehog as their password.
Joe Carrigan: [00:09:37] Right.
Dave Bittner: [00:09:37] And then they'll use hedgehog Facebook to log into Facebook, and hedgehog Twitter to log into Twitter and hedgehog Bank of America to log into Bank of America. Well...
Joe Carrigan: [00:09:45] Those are all going to get cracked, probably, in under 10 guesses, according to the Virginia Tech guys.
Dave Bittner: [00:09:49] Yeah. If I get one of them, it doesn't take a rocket scientist to figure out what the pattern might be for other log in...
Joe Carrigan: [00:09:55] Absolutely.
Dave Bittner: [00:09:55] ...Places. Right?
Joe Carrigan: [00:09:56] Yeah. If I break your Twitter password and find out that it's hedgehog Twitter then I'm going to guess the first guess for Facebook is hedgehog Facebook.
Dave Bittner: [00:10:04] Right.
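Joe's hedgehog scenario can be sketched as a tiny variant generator: given one cracked password, an attacker spends a 10-guess budget on the obvious site-name substitutions first. The mangling rules below are illustrative assumptions, not the exact rules from the Virginia Tech paper.

```python
def candidate_guesses(known: str, known_site: str, target_site: str,
                      budget: int = 10) -> list[str]:
    """Generate up to `budget` guesses for target_site, starting from a
    password already cracked at known_site, via simple mangling rules."""
    base = known.replace(known_site, "").replace(known_site.capitalize(), "")
    candidates = [
        known,                                # straight reuse
        base + target_site,                   # swap in the new site name
        base + target_site.capitalize(),
        target_site + base,
        base + target_site + "1",
        base + target_site + "!",
        (base + target_site).capitalize(),
    ]
    # Deduplicate while preserving order, then cap at the budget.
    seen, out = set(), []
    for c in candidates:
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out[:budget]

print(candidate_guesses("hedgehogtwitter", "twitter", "facebook"))
```

A handful of rules like these is all it takes to turn one breach into many compromised accounts, which is the paper's core point about modified-password reuse.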
Joe Carrigan: [00:10:05] Right? So this harkens back to one of my points of evangelism that I repeat over, and over and over again. And that is: use a password manager. Use a password manager and set all of your passwords to random, 20-character passwords at a minimum. And I've gone over this before, and I say this at every talk I give. You don't take on the monumental task of going in and changing all your passwords at once. You just start using a password manager. And, over time, as you log into sites, you look and see if each one is in your password manager. If it isn't in your password manager, you add it, and you change the password the next time you log in.
Dave Bittner: [00:10:42] Right.
Joe Carrigan: [00:10:42] Just start modifying it as you use it, and this will have the effect of the websites that you use most frequently, which are probably the most important to you, will get changed first. And the websites that are least important to you that you don't use that often will get changed later down the road. So it automatically prioritizes this for you.
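The random, 20-character passwords Joe recommends are exactly what password managers generate internally. A minimal sketch using Python's standard secrets module, which draws from a cryptographically secure random source:

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation
    using a cryptographically secure random source (not random.choice,
    which is predictable and unsuitable for secrets)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different every run; never meant to be memorized
```

Since the manager stores and fills these, there's nothing to remember and nothing to walk across the keyboard.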
Dave Bittner: [00:11:00] And, in effect, it'll bulletproof you from these sort of social engineering attacks.
Joe Carrigan: [00:11:05] Well, it'll bulletproof you from the social engineering attacks. That's right. They won't be able to use that approach because you'll just be using some random set of characters - numbers, letters, uppercase, lowercase, all that. But you're still going to be vulnerable to a brute force attack. Even so, it's going to be very, very hard to randomly guess a 20-character password, even with an MD5 hash, which is a very fast - and therefore very weak - hash for password security.
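Joe's claim that 20 random characters hold up even against a fast hash like MD5 checks out with back-of-the-envelope arithmetic. The guess rate below is an assumed figure for a serious GPU rig, not a measured one:

```python
import math

ALPHABET = 95      # printable ASCII characters
LENGTH = 20
GUESS_RATE = 1e11  # assumed: ~100 billion MD5 guesses/sec on GPU hardware

keyspace = ALPHABET ** LENGTH
seconds_for_half = keyspace / 2 / GUESS_RATE  # expected time to find one
years = seconds_for_half / (60 * 60 * 24 * 365)

print(f"{keyspace:.2e} possible passwords")           # 3.58e+39
print(f"~{years:.2e} years on average to crack")      # astronomically long
print(f"~{math.log2(keyspace):.0f} bits of entropy")  # ~131 bits
```

Compare that with a dictionary word plus a site name, which falls in single-digit guesses: the difference is entirely in the entropy, not the hash.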
Dave Bittner: [00:11:28] So I guess this sort of shifts to that scenario we joke about where if you and I are being chased by a bear, I don't have to outrun the bear. I just have to outrun you.
Joe Carrigan: [00:11:35] Right.
Dave Bittner: [00:11:36] So the bad guys will go after the weak passwords first, presumably.
Joe Carrigan: [00:11:40] That's right. They'll pick off the little, weak, sick ones.
Dave Bittner: [00:11:41] (Laughter) Right. Exactly. All right. Joe, it's a good one. Protect yourself from those social engineering attacks - password managers, random long strings.
Joe Carrigan: [00:11:49] Yep.
Dave Bittner: [00:11:50] All right. Joe, it's time for our catch of the day.
(SOUNDBITE OF FISHING POLE REELING)
Dave Bittner: [00:11:55] This week's catch of the day comes from friend of the show, Graham Cluley. He had a - something he put up on Twitter. He said, sheesh, my dumb email filter shoved this email from JPMorgan in my spam folder. Imagine if I hadn't spotted it. I could've missed out on millions.
Joe Carrigan: [00:12:11] (Laughter).
Dave Bittner: [00:12:12] I'm going to read the email here. I was really tempted to read Graham's part with a British accent because, of course, Graham is...
Joe Carrigan: [00:12:19] Right.
Dave Bittner: [00:12:19] And he loves it when I do that, so...
Joe Carrigan: [00:12:22] (Laughter).
Dave Bittner: [00:12:22] But I resist. So here's the letter. (Reading) Dear Sir or Madam, I am the operational manager in account management section in charge of credit and foreign bills of JPMorgan Chase bank here in USA. I helped a customer purchase security bonds worth $6,500,000 in the capital market. The customer dies in an accident in testate with no one to succeed his estate. Been the one that handled his financial affair for the last eight years, the private firm where the funds is presently lodged after I liquidated the security bonds from the various investment is simply waiting for me to present the next of kin. I am prepared to place you in a position to instruct the bank to release the deposit to you. Please use my private email account to contact me if the proposal is of interest to you. Regards, Mr. David Kent, JPMorgan Chase bank USA, borough of New York City.
Joe Carrigan: [00:13:17] Graham, you're going to be rich.
Dave Bittner: [00:13:20] He's going to - Yeah. Yeah.
Joe Carrigan: [00:13:21] What are you going to do with all that money, Graham?
Dave Bittner: [00:13:23] I hope he shares it with his friends here in the USA. So pretty straightforward.
Joe Carrigan: [00:13:29] Right.
Dave Bittner: [00:13:29] I mean, pretty obvious. I mean, this is about as classic of...
Joe Carrigan: [00:13:33] I don't have a copy of this. How did they spell intestate?
Dave Bittner: [00:13:37] I-N-space-T-E-S-T-A-T-E.
Joe Carrigan: [00:13:41] (Laughter) I thought they might have spelled it as more than one word because it is one word...
Dave Bittner: [00:13:46] Ah, OK.
Joe Carrigan: [00:13:47] ...Meaning without a will.
Dave Bittner: [00:13:48] Oh. All right. Yeah. So, you know, about as straightforward as a Nigerian email scam is...
Joe Carrigan: [00:13:54] Right.
Dave Bittner: [00:13:54] ...Without actually talking about a Nigerian prince, so...
Joe Carrigan: [00:13:57] Right. It's coming from a very reputable source - JPMorgan Chase, right?
Dave Bittner: [00:14:02] That's right.
Joe Carrigan: [00:14:02] You've heard of JPMorgan Chase.
Dave Bittner: [00:14:03] Absolutely.
Joe Carrigan: [00:14:04] You've never heard of some guy in Nigeria...
Dave Bittner: [00:14:06] No.
Joe Carrigan: [00:14:06] ...Who claims he's a prince.
Dave Bittner: [00:14:07] And if anybody was going to have $6.5 million sitting in the bank...
Joe Carrigan: [00:14:11] They'd keep it at JPMorgan Chase.
Dave Bittner: [00:14:12] It would be JPMorgan Chase, right?
Joe Carrigan: [00:14:14] Right.
Dave Bittner: [00:14:15] So all - yeah. I mean, legitimacy here - you can see how someone could possibly fall for this, but...
Joe Carrigan: [00:14:21] The English is so broken.
Dave Bittner: [00:14:22] Yeah.
Joe Carrigan: [00:14:23] It's obviously not somebody from New York.
Dave Bittner: [00:14:25] And as Graham pointed out, this got automatically routed to a spam filter, so...
Joe Carrigan: [00:14:29] Good work, spam filter.
Dave Bittner: [00:14:29] Yeah. Bravo to whatever email service he's using that they recognized it and routed it there. So - all right. That is our catch of the day. All right. Coming up next, we'll have my interview with Lily Hay Newman from Wired. But first, a message from our friends at KnowBe4.
Sponsorship: [00:14:51] And what about the biggest, tastiest piece of phish bait out there? If you said A, my late husband wished to share his oil fortune with you, you've just swallowed a Nigerian prince scam, but most people don't. If you chose Door B, please read important message from HR, well, you're getting warmer, but that one was only No. 10 on the list. But pat yourself on the back if you picked C, a delivery attempt was made. That one, according to the experts at KnowBe4, was the number one come-on for spam email in the first quarter of 2018. What's that? You picked D, take me to your leader? No, sorry. That's what space aliens say. But it's unlikely you'll need that one unless you're doing "The Day The Earth Stood Still" at a local dinner theater. If you want to stay on top of phishing's twists and turns, the new-school security awareness training from our sponsors at KnowBe4 can help. That's knowbe4.com/phishtest.
Dave Bittner: [00:15:55] Joe, earlier this week, I had the opportunity to speak with Lily Hay Newman. She's the security staff writer at Wired. And she recently authored a story for them, and it was titled, "Nigerian Email Scammers Are More Effective Than Ever." Here's my conversation with Lily Hay Newman. I think for a lot of us who've been at this for a while, who've been around throughout the years - from the beginnings of email, the Nigerian prince scam was probably one of the first scams that we may have seen, certainly related to email. Can you take us through the history and evolution of these scams coming from Nigeria?
Lily Hay Newman: [00:16:27] They are very classic and kind of embedded in popular culture, mainstream thought, I feel. People reference the Nigerian prince scam kind of all the time just casually. Basically, the scams are very consistent, and they noticeably have not evolved that much. They definitely have in some ways. And some of the techniques are improved or changed or updated. But what's most impressive about them, I find, is that they're really based off a very simple confidence man hustle that dates back even farther than email. You know, truly, the classic schemes that are sort of feeding on urgency, feeding on compelling story, a simple story - not too many details, and just going after a big population. And you only need a few people to get tricked. So all of it is very classic, very sort of elegant, simple premise. Obviously, the Nigerian prince scam itself, which now I would think most people would be wise to that specific one because it is so well-known, is a foreign royal or, you know, someone who claims to have some sort of, like, royal mystique or status in their community or their country reaches out and says, hey, you know, I'm trying to move a lot of money out of my country. If you'll pay to help me do a wire transfer, you know, you'll give me some money so that I can then move a lot of money, that would really help me because I'm - you know, I'm in a bad situation or something like that. And so it's just that simple idea of it's sort of a celebrity or someone important asking you for help and you don't have to do very much - maybe doesn't raise alarms that it would be sort of all your money or something like that. It's just sort of a good amount of money. And you really want to help and you really want to do something quickly, and then you're going to have the promise of all this money back. So all the scams kind of work that way. They just don't necessarily use that hook of, like, a foreign royal person or important person. 
But they're all just operating on this premise of sort of give a little, get a lot.
Dave Bittner: [00:18:52] And one of the things you pointed out in your story was how they're targeting small businesses.
Lily Hay Newman: [00:18:56] Right. So I would say that that's the biggest change in the past few years. This concept of business email compromise has really grown. And there's always been - businesses have money, people have always been trying to steal the money. It's not - so again, it's not that much of a technological innovation or something. It's just that these types of scams - the scammers realized, oh, we can do a lot of the same stuff and just sort of tailor it to business, and it'll be effective there, too, and the money could be even greater. And one of the smartest things they do is they'll tailor emails to appear to be a real vendor or someone that a business really could conceivably contract with and just send a professional-looking invoice. They figure out - try to map out who in an organization is the right person to send this email to, or they'll send, you know, five of them or something, trying to find the people who handle financials within the company. And again an invoice that looks pretty legit and is asking for, you know, the company to pay a reasonable amount of money, like this person would approve every day, yeah, it's the same urgency of, well, this is what I'm supposed to do. I pay bills all the time. And if we don't pay the bill, the company gets in trouble, so of course I would pay the bill. But instead, you're actually wiring money into the scammer's hands.
Dave Bittner: [00:20:23] Yeah. One of the interesting points you made in the article was that these folks aren't technically sophisticated, but they have a tremendous amount of patience.
Lily Hay Newman: [00:20:31] Yeah. There's a lot of patience. There's a lot of faith in the hustle, basically, that if they work hard enough on refining it and if they cast a wide enough net, that they will get some people to pay out, that they will trick some people. So there's really sort of a maturity (laughter) to, you know, the concept that there's a lot of restraint and just sort of it'll work. Let's - we just keep doing what we're doing. And then the other interesting thing about the scammers is that many of them come out of these Nigerian sort of crime syndicates sometimes called confraternities that are sort of mafia-like gangs. There's a whole cultural growth out of these groups. There's a whole lifestyle that's part of it. Some people who do email scamming and get rich off of email scamming are really vocal about that in Nigeria and take pride in it and show off their wealth. And there's kind of a sense that if people are dumb enough to get tricked, then they don't deserve to have their money. And the scammers sort of outwitted them and outsmarted them and deserve to have it. Yeah, so there's just this whole community element, cultural element, and that has allowed the infrastructure for the scamming to kind of build up all over the world because it starts with these sort of insular groups in Nigeria. But then as people live their lives and emigrate around the world, the scams kind of move with them, and they're recruiting money mules. They're recruiting people in all different places to carry out different parts of the schemes. So it's really built up this sort of international infrastructure of how the scamming works. And then it's just coming from everywhere and coming from all sides. And though most of the easy ones are going to get blocked by your spam filter and stuff, the fact that it's spread out all over the world does make it difficult for email providers to keep up with the blocking because it's just coming from everywhere. 
And the emails look really legit.
Dave Bittner: [00:22:44] And why Nigeria? Has the Nigerian government turned a blind eye to these folks?
Lily Hay Newman: [00:22:49] I think that is part of it. As the international law enforcement community has attempted to respond to this, there have been issues at times getting the Nigerian government to cooperate with, you know, apprehending suspects themselves or allowing extradition. But there is traction to maybe kind of turn the tides on this. I just reported this week on an announcement from the Department of Justice that they had completed an international operation to arrest 74 scammers, some in Nigeria, some in the U.S. and a few in other countries. And they're sort of doing a lot of international collaboration and trying to gain steam on doing this. And in that case, there was cooperation within Nigeria. So it's getting there, but for many, many years, this has just been unchecked. And there weren't a lot of consequences and just free money.
Dave Bittner: [00:23:56] What are the recommendations for people to protect themselves against this? Is it a matter of technical solutions, or does training come into play?
Lily Hay Newman: [00:24:02] I think for businesses, some of the things that are helpful are requiring at least two people to sign off on big transactions over a certain amount. I mean, in practice, for businesses doing tons of transactions every day and paying tons of bills, it probably does add some friction into the system. But when you have a second person look at something, they might immediately say, something's weird about this, or they might be the person who says, we don't contract with that person, we don't contract with that company, I don't know what this is. So getting a second check is always helpful. And, in a less codified way, that's a great tip for individuals: when you're feeling caught up in something and you're feeling the urgency, it may feel like there's no time and you have to act right then, but you really can take a few seconds and message a friend or call someone and say, what do you think of this? Do you think I should do this? And just getting that gut check, sometimes just verbalizing it to someone else - you hear yourself and you think, oh, wait, no, this is a scam.
Dave Bittner: [00:25:11] Right. Yeah. How many times do all of us look back and say, what was I thinking (laughter)?
Lily Hay Newman: [00:25:16] Right, exactly. So yeah, maybe it's kind of embarrassing, but it's definitely worth it versus having your money taken. So things like that are really helpful. There are also a lot of controls businesses can implement in their email to make it a little more obvious when something might be fishy. Another way that the scammers will try to initiate something is to send emails that pretend to come from a higher-up in the company, like an executive or somebody's boss. And generally, they're using email addresses or email domains that look like the company email address but are actually a little bit different in some way, because the scammer doesn't actually control the real email. What businesses can do in that case is implement controls on their email so that any address that isn't internal to the company gets a little flag that it's not internal. Much of the time, when you're emailing with a vendor or something, you know they're not part of the company, so that's fine, and you just ignore the flag. But if something's flagged as external and it looks like it's from your boss, that can tip you off that maybe something's weird there. But it gets tough because if a scammer or a phisher can compromise a real email address - if they can guess a password, or someone reused a password that was exposed in a leak and it's online somewhere - sometimes they can be sending the emails from a legitimate address, and that protection wouldn't apply. So it gets really hairy, but there are steps people can take to at least help and kind of minimize the risk as much as possible.
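The external-sender flagging Lily describes can be sketched roughly as follows. The company domain is hypothetical, and the lookalike check (a small edit-distance threshold) is a simplified assumption about how such a control might spot spoofed domains like "examp1e.com":

```python
COMPANY_DOMAIN = "example.com"  # hypothetical company domain

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def flag_sender(from_address: str) -> str:
    """Return a warning tag for external or lookalike sender domains."""
    domain = from_address.rsplit("@", 1)[-1].lower()
    if domain == COMPANY_DOMAIN:
        return ""  # internal mail, no flag
    if edit_distance(domain, COMPANY_DOMAIN) <= 2:
        return "[EXTERNAL - LOOKALIKE DOMAIN]"
    return "[EXTERNAL]"

print(flag_sender("boss@example.com"))    # internal: no flag
print(flag_sender("boss@examp1e.com"))    # one character off: strong warning
print(flag_sender("billing@vendor.net"))  # ordinary external mail
```

As she notes, a flag like this helps only when the attacker is spoofing; mail sent from a genuinely compromised internal account passes the check.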
Dave Bittner: [00:27:00] All right. Interesting stuff.
Joe Carrigan: [00:27:02] I thought that was a great interview. One of my takeaways from this is that these guys in Nigeria seem to live by the adage, or the principle, rather, never hesitate to separate a fool from his money - right? - and that that's perfectly fine.
Dave Bittner: [00:27:15] Right.
Joe Carrigan: [00:27:15] You asked him for the money. He sent it to you.
Dave Bittner: [00:27:18] It's an interesting, I guess, moral construct, you know, that if you're - it's not my moral failing for taking the money from you. It's your failing for being gullible enough to give it to me.
Joe Carrigan: [00:27:30] Right. And I kind of disagree with that, but...
Dave Bittner: [00:27:31] Sure.
Joe Carrigan: [00:27:32] ...I don't kind of disagree with that (laughter).
Dave Bittner: [00:27:35] Yeah. Yeah.
Joe Carrigan: [00:27:36] I totally disagree with that.
Dave Bittner: [00:27:36] Yeah.
Joe Carrigan: [00:27:36] But, you know, they do have a different moral construct, it seems.
Dave Bittner: [00:27:39] Yeah, absolutely.
Joe Carrigan: [00:27:40] I also really liked her advice of talking it over with somebody. She called it a gut check. I think that's great. When you start saying something that you're thinking and it sounds stupid - and this happens to me a lot, right? Like, I'll be talking to someone and I'll be saying something and I'll go, wait a minute. This is just a terrible idea, right? And it doesn't have to be with anything like this. It can be - it can be, like, with our plans for the day. It could be something as simple as that. And just the fact of dumping that out of your mouth and saying it to somebody can let you know - by hearing yourself say something, that just lets you know that this is a profoundly bad idea.
Dave Bittner: [00:28:13] (Laughter) Right, right. All right. Well, that is our podcast. Thanks for listening. And a final word about KnowBe4, who sponsored our show. They've got new school security awareness training, and their platform is user friendly and intuitive. It scales from 50 to 500,000 users and was built for busy IT pros who have 16 other fires to put out. Try their free phishing test at knowbe4.com/phishtest. That's K-N-O-W-B-E the number four dot com slash phish test. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. The Hacking Humans podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our coordinating producer is Jennifer Eiben, editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:29:10] And I'm Joe Carrigan.
Dave Bittner: [00:29:11] Thanks for listening.