Hacking Humans 2.27.20
Ep 87 | 2.27.20

The art of cheating.

Transcript

Tim Sadler: [00:00:00] People are unpredictable. They break the rules. They make mistakes. And they're easily hacked. 

Dave Bittner: [00:00:05]  Hello, everyone, and welcome to another episode of the CyberWire's "Hacking Humans" podcast. You know this show. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: [00:00:27]  Hello, Dave. 

Dave Bittner: [00:00:27]  We've got some good stories to share this week. Later in the show, I speak with Tim Sadler. He's from Tessian, and we're going to be talking about the human element of cyber, specifically some phishing schemes. But before we get to all of that, a word from our sponsors at KnowBe4. 

Dave Bittner: [00:00:42]  So what's a con game? It's a fraud that works by getting the victim to misplace their confidence in the con artist. In the world of security, we call confidence tricks social engineering. And as our sponsors at KnowBe4 can tell you, hacking the human is how organizations get compromised. What are some of the ways organizations are victimized by social engineering? We'll find out later in the show. 

Dave Bittner: [00:01:06]  And we are back. Joe, last week, my story was about a strange voicemail that was being sent out. What we figured out was it was probably from scammers... 

Joe Carrigan: [00:01:15]  Right. 

Dave Bittner: [00:01:15]  ...And that the scammers had been scammed. 

Joe Carrigan: [00:01:16]  Hacked, yes. 

Dave Bittner: [00:01:17]  Someone had gone in and hacked their outgoing audio file that... 

Joe Carrigan: [00:01:21]  Right. 

Dave Bittner: [00:01:22]  ...They were using, right? Well, we heard from another listener who verified that that was indeed the case. 

Joe Carrigan: [00:01:27]  Oh, OK. 

Dave Bittner: [00:01:27]  And in fact, this listener sent us a YouTube video that was made by a gentleman named Jim Browning, and he runs a YouTube channel where he goes behind the scenes and basically scams the scammers or reveals what they're up to. And in this YouTube video, he does exactly what we described last week. He replaces the outgoing audio file that the scammers are using. In fact, I have a bit of that audio file for us to listen to. So we'll play that right now. This is the file that he replaced their outgoing message with. 

Joe Carrigan: [00:01:59]  Excellent. 

Unidentified Person: [00:02:01]  After this. Thank you. 

Jim Browning: [00:02:01]  I decided to replace it with this. 

Computer-generated Voice #1: [00:02:05]  Hello. This is an automated message being sent from scammers from India. They were going to try to scam you by claiming that you would get a refund from your computer maintenance company. However, this is just a scam, so that they can access your computer and will try to get money from your bank account. If you ever get a message like this, it is always a scam. 

Computer-generated Voice #2: [00:02:27]  Please do not mention that I have changed their message, but if you would like to waste their time, you can speak to these scammers by pressing 1 on your telephone keypad. 

Joe Carrigan: [00:02:37]  (Laughter). 

Dave Bittner: [00:02:37]  All right, so that's the new outgoing message. And what's especially fun about this video is that the scammers don't realize that their outgoing message has been changed... 

Joe Carrigan: [00:02:47]  Right. 

Dave Bittner: [00:02:48]  ...But they're wondering why no one's calling in anymore (laughter)... 

Joe Carrigan: [00:02:52]  Right. Yeah. 

Dave Bittner: [00:02:53]  ...Why no one's pressing 1, why their percentages have bottomed out. 

Joe Carrigan: [00:02:58]  Right. They're going to catch on eventually, I think. 

Dave Bittner: [00:03:00]  Yeah. And they do. They do, but it takes a while. 

Joe Carrigan: [00:03:03]  Good. 

Dave Bittner: [00:03:03]  Yeah. 

Joe Carrigan: [00:03:03]  I mean, every one person that gets that message is a person that they don't have the opportunity to scam. And if that sits in there for an hour, think of the number of calls you can make with the auto dialer in an hour. 

Dave Bittner: [00:03:14]  Yeah. Well, and this video goes into that. It's really an interesting look inside a scam call center. We'll have a link for the YouTube video in the show notes here. We've reached out to Jim Browning. We're hoping to get him on the show to come on and describe the types of things that he's discovered and the things he does to try to thwart these scammers. So interesting video. I highly recommend you check it out for a look behind the scenes on this stuff. And thanks to our listener for sending this to us and verifying our suspicions from last week. 

Dave Bittner: [00:03:44]  Joe, what do you have for us this week? 

Joe Carrigan: [00:03:46]  Dave, I have a story from Reddit. This is actually from r/India... 

Dave Bittner: [00:03:50]  OK. 

Joe Carrigan: [00:03:50]  ...Which is the self-described - the official subreddit for India. 

Dave Bittner: [00:03:54]  OK. 

Joe Carrigan: [00:03:54]  And this comes from about a year ago, and it is called "The Art of Cheating Travelers at Dhabas." Now, a dhaba is a roadside restaurant. The post is by a Reddit user named deepsmahesh. That's his username. 

Dave Bittner: [00:04:05]  OK. 

Joe Carrigan: [00:04:06]  I don't know if that's actually his real name, but that's what he's called on Reddit. He was traveling from Hyderabad to Bangalore, which is a north-to-south trip right in the center of the country, pretty much. And he was doing this on a weekday in a bus, and that's about an 11-hour bus ride from my cursory Google searches. 

Dave Bittner: [00:04:23]  OK. 

Joe Carrigan: [00:04:24]  Right? Around 11 p.m., the bus stops at this roadside restaurant, this dhaba, for dinner. And before deboarding (ph), the driver informs everybody that there's some kind of problem with the bus and it may take an hour before they resume the journey. So the author of this post says he goes down, and he goes and grabs a table at the restaurant. And there are no menu cards anywhere. It's a very informal kind of restaurant. And he begins looking at what other people are eating. And then a waiter comes over and says, chapati or rice? 

Dave Bittner: [00:04:52]  OK. 

Joe Carrigan: [00:04:53]  And that's like a bread - chapati. 

Dave Bittner: [00:04:55]  OK. 

Joe Carrigan: [00:04:55]  So he looks around, and he sees somebody else is eating chapati and egg curry. And he says, that looks good; I'll have that. So he orders it. His food comes very quickly - in about two minutes. And he's about halfway into his meal, and his tablemate, who helped him decide, gets his bill, right? And the waiter comes over and gives him a piece of paper - not a proper bill, but says, here's your - you know, here's your bill. And he sees him paying 120 rupees for his meal. 

Dave Bittner: [00:05:16]  OK. 

Joe Carrigan: [00:05:18]  After some time, he finishes his meal, and he gives the waiter 120 rupees, thinking that's what it was. They ordered the same thing. They both had the chapati and the egg curry. And the waiter goes, sir, it's 150 rupees, and hands him a piece of paper with 150 rupees written on it. He inquires about the guy. He says, this guy had the same thing. He paid 120 rupees. And the waiter goes, no, no, he paid 150 rupees; you're mistaken. Not wanting to make a scene, he pays the extra 30 rupees. 

Dave Bittner: [00:05:42]  How much money are we talking about here? For those of us, like me, who are unfamiliar with rupees, do you have any sense for what the exchange rate is? Is this... 

Joe Carrigan: [00:05:50]  It's a... 

Dave Bittner: [00:05:50]  ...A couple bucks more or... 

Joe Carrigan: [00:05:51]  It's 45 cents more... 

Dave Bittner: [00:05:53]  OK. 

Joe Carrigan: [00:05:53]  ...Which means to me that for dinner, you're paying, like, about 2 1/2 bucks. 

Dave Bittner: [00:05:57]  Yeah. 

Joe Carrigan: [00:05:58]  And the other guy paid maybe 2 bucks... 

Dave Bittner: [00:06:00]  OK. 

Joe Carrigan: [00:06:00]  ...For dinner. But the economies are very different, right? 

Dave Bittner: [00:06:03]  Sure, sure. 

Joe Carrigan: [00:06:03]  It's a completely different situation. 

Dave Bittner: [00:06:05]  Yeah. 

Joe Carrigan: [00:06:05]  So he goes over and he grabs a cup of tea from a tea stand. And a few minutes later, his waiter comes out and is smoking a cigarette. And he says to the waiter, hey, you want to have some tea with your cigarette? And the waiter says, sure. And then they exchange some brief conversation. And he says, you know, it was beautiful the way you swindled me out of 30 rupees. And the waiter laughs and goes, you noticed, right? So he doesn't even try to hide it. The author says, won't the owner be angry if he finds out you're overcharging the customer? Because the owner sits at the counter of the dhaba. And the waiter laughs, and he says, no, everyone in the staff is in on it. It's - part of the skimmed money actually goes to the owner. 

Dave Bittner: [00:06:41]  (Laughter). 

Joe Carrigan: [00:06:43]  He says, aren't you afraid the customers will create a scene? And he goes, well, let me tell you something. Travelers are the easiest people to cheat. They're always in a hurry. They are anxious because they're in a new environment. And they won't create a scene. The best part is you won't ever see them again if they do cause a ruckus, right? The guy says, well, that makes sense. He says, but how do you stop yourself from getting too greedy, right? I mean, you could have charged me 200 rupees instead of just 150 rupees. He says, there's a manager for every four waiters. He oversees their activities, and action is taken if somewhat mischievous activities are undertaken by the waiters, and it has led to termination. So if waiters overcharge too much, they'll get fired. But if they overcharge just a little bit, it's fine. It's almost accepted. The author says, what are the factors you look for when you're going to overcharge a customer? And he says, before we get into that, can you buy me some cigarettes and another cup of tea? 

[00:07:33]  (LAUGHTER) 

Joe Carrigan: [00:07:34]  And the author goes, sure. He says, you know, I'm getting good information. He says, the first thing we always look for is we always look for anyone wearing fancy clothes. That's a high-value customer. And he goes - and he points to the author. He says, look at yourself. You're wearing shorts and fancy slippers, right? The guy's wearing Crocs. Sometimes us waiters fight to serve a guy like you, he says. 

Joe Carrigan: [00:07:53]  The second thing we look for is the number of people traveling together. Ideally, if someone is alone or traveling in a group of more than four people, these guys have the least likelihood of arguing, right? Just think about that. If you're alone, you know, you're by yourself, you've got nobody to back you up when you start a ruckus. From a social aspect, you're just by yourself and you're going to start arguing with a waiter, you know, you're going to envision, like, two of his buddies coming over going, no, no, this is how it is, and then you're going to look like you're wrong. 

Dave Bittner: [00:08:20]  Right. You're the stranger there, too, so... 

Joe Carrigan: [00:08:21]  Right. Exactly. 

Dave Bittner: [00:08:22]  Yeah. 

Joe Carrigan: [00:08:22]  You're the stranger. But in groups of four people or more, that's when you start looking like a jerk to your peers. So you won't do it. But generally, if there's two or three of you, the waiters are less likely to try it. I find that very interesting. And the waiter says, these are not hard-and-fast rules. You just have to get a feel for it, right? Then you adapt to the conditions. He says, there was one customer who was well dressed, but he was very cautious. He carried his laptop bag with him. And he went into the hotel, and he was asking for the price before he ordered. He says, you never scam a customer like that, right? That guy established the price first, and that was it. 

Joe Carrigan: [00:08:53]  So now the bus is getting ready to board. And before everything is over, the guy asks one more question. He says, aren't you afraid that I might complain? He says, this is the situation wherever you find travelers. It doesn't matter if it's a bus or railway station; we're not the only ones scamming the customers. Don't blame the tiger because the deer are weak, is what he says. 

Dave Bittner: [00:09:11]  (Laughter) That's interesting. 

Joe Carrigan: [00:09:13]  Right? 

Dave Bittner: [00:09:13]  Yeah. 

Joe Carrigan: [00:09:14]  And then the author gets on the bus, and he continues his trip. It was just an interesting exchange. What's also interesting is the very first reply on this, the very - the most upvoted reply is, somebody says the con starts with the bus itself. The bus did not stop there by chance or accident. Most likely, there were no repairs that needed to be done on the bus, and the bus driver is getting a cut - right? - which makes sense. 

Joe Carrigan: [00:09:37]  I traveled once abroad with a bus tour in Ireland, and we made specific stops around the area. And, yes, we knew that the bus driver had made arrangements with these people to drop us off at these different places. It was part of the tour. We were going to these different places. There's a couple wool mills. There's, like, the Triona wool mill in Donegal, where we stopped. And then we stopped at the Blarney wool mills down in, I think, Cork. I think it's all the way in Cork. But it's not an uncommon practice for these things to happen. 

Joe Carrigan: [00:10:06]  But we actually asked our bus driver and said, is there some kind of relationship you have with these places? He says, oh, yeah, we have a relationship. But, you know, the benefit is, because we bring so many customers in, we get really good prices for, like, hotels and meals and things. So it is a package deal, and there is a relationship. I don't really have a problem with this going on in India, either. I mean, this probably is exactly what happens. People do need to eat on the ride. It's an 11-hour bus ride. Somebody is going to get hungry. There is absolutely no reason to not stop at a place and have a relationship with those people. 

Dave Bittner: [00:10:36]  Right. And I suppose if there are no prices... 

Joe Carrigan: [00:10:39]  Right. 

Dave Bittner: [00:10:39]  ...If there isn't a menu with listed prices, I can imagine you being at the whims of the proprietor of the place, who sizes you up and tries to see what the customer can bear. 

Joe Carrigan: [00:10:52]  Yeah, exactly. And from an American standpoint, I wouldn't even argue over a meal for 150 rupees, for $2.50. 

Dave Bittner: [00:11:00]  Right. 

Joe Carrigan: [00:11:00]  I wouldn't even bat an eye at that. 

Dave Bittner: [00:11:02]  Yeah. That's interesting. It's an interesting look inside of that sort of thing. And I like that phrase - that you don't blame the tiger because the deer is weak. 

Joe Carrigan: [00:11:11]  Right. Yeah. 

Dave Bittner: [00:11:12]  Yeah. That's fascinating. My story this week is a weird one. 

Joe Carrigan: [00:11:16]  A weird one. 

Dave Bittner: [00:11:17]  A weird one. 

Joe Carrigan: [00:11:17]  That's good. 

Dave Bittner: [00:11:19]  (Laughter) So imagine, Joe, that you are an expectant mother, or perhaps you have... 

Joe Carrigan: [00:11:27]  I kind of look like an expectant mother (laughter). 

Dave Bittner: [00:11:30]  Well, then suppose you're a newborn. Joe, we could all afford to lose a few pounds. 

Joe Carrigan: [00:11:35]  Yes. 

Dave Bittner: [00:11:35]  Right? 

[00:11:35]  (LAUGHTER) 

Dave Bittner: [00:11:36]  So suppose you're a - you've got a newborn at home, or you're very close to giving birth. And you're poking around on Facebook, and somebody pops up, and they say, hey, I live near you, and I'm a professional photographer, and I'm trying to build out my portfolio, trying to move into this area of taking photos of newborns and babies and expectant mothers and that sort of thing. I would love to take some photos of you. I will do it for free in exchange for building up my portfolio. 

Joe Carrigan: [00:12:09]  OK. 

Dave Bittner: [00:12:09]  And you say to yourself, well, that sounds reasonable. 

Joe Carrigan: [00:12:12]  Yep. 

Dave Bittner: [00:12:13]  Sure. Come on over, and let's get some pictures taken of the new bundle of joy. Well, that is what happened, but it took a turn for the weird. It was a woman named Juliette Parker. This is in Washington state. And she has been charged with second-degree assault and attempted kidnapping. Prosecutors say she was only posing as a photographer, and then she drugged a mother in order to try to steal her baby. So she came to the house several times, did some photo sessions, but there were some odd things that were happening when she came over. She seemed to avoid touching anything. And if she did touch things, she would wipe them down, trying to get rid of her fingerprints. 

Joe Carrigan: [00:12:57]  Really? 

Dave Bittner: [00:12:59]  She would only sit on the floor. She wouldn't sit on furniture. 

Joe Carrigan: [00:13:03]  For fear of leaving a hair follicle there. 

Dave Bittner: [00:13:05]  Could be, could be. Also, when she was taking pictures of the baby, she would take photos, but then she would also take a bunch of selfies with the baby. 

Joe Carrigan: [00:13:15]  That's kind of weird. 

Dave Bittner: [00:13:16]  It is very weird, right? And the police speculate that maybe she was trying to set up a time-stamped trail of photos of herself with the baby. So in other words, you know, she's found later on with this baby, and she'd be able to say, I've been with - this is - I've known - you know, I've been with this baby for weeks for... 

Joe Carrigan: [00:13:33]  Yeah, this is my baby. 

Dave Bittner: [00:13:34]  Right. 

Joe Carrigan: [00:13:35]  Right. 

Dave Bittner: [00:13:35]  Exactly. Exactly. 

Joe Carrigan: [00:13:36]  Here's some pictures of - took of me and him last month. 

Dave Bittner: [00:13:38]  Yeah. So evidently, she did this with a bunch of different people, but there was one woman that she seems to have focused on. It was a woman named Elysia Miller - had her over several times to take pictures. And the last time that she came over, she brought her teenage daughter with her, which is interesting in that she already has a child. 

Joe Carrigan: [00:13:59]  Right. 

Dave Bittner: [00:14:00]  ...But also brought over three different types of cupcakes - three different varieties of cupcakes. And evidently, the - or allegedly, I suppose, is the correct way to say it. This woman pressured the woman who had the newborn to eat one of these specific batches of cupcakes. And she did and started feeling sick... 

Joe Carrigan: [00:14:22]  Really? 

Dave Bittner: [00:14:22]  ...Started feeling woozy, threw up a bunch of times. At this point, the photographer woman left. And later on, the woman who had the baby noticed that her keys were missing to the house... 

Joe Carrigan: [00:14:33]  Really? 

Dave Bittner: [00:14:34]  ...Called the photographer, said, my keys are missing. The photographer said, oh, the - I found the keys out on the lawn of my house and sent the keys back over with a friend, an unknown man. So this woman who was not feeling well - the victim in all of this who had the baby - she called 911, went to the hospital, spoke to the sheriffs. Her symptoms align with the date rape drugs, the - you know, I don't know the technical names. 

Joe Carrigan: [00:15:02]  Roofies. 

Dave Bittner: [00:15:02]  The roofies, yeah, yeah - the kinds of things that try to, you know, make you woozy and so forth. And the police are making the case - they're alleging that this photographer was out there basically shopping around to try to kidnap a baby. And evidently, they've interviewed an ex-boyfriend who said that this was something that she talked about - that kidnapping a baby was something she was willing to do as a last resort. 

Joe Carrigan: [00:15:27]  Now, why didn't that boyfriend raise a red flag at that point in time? 

Dave Bittner: [00:15:31]  I don't think their relationship lasted very long after that. 

[00:15:33]  (LAUGHTER) 

Dave Bittner: [00:15:35]  My sense is that (laughter) - yeah, he's - I believe he's described as an ex-boyfriend. 

Joe Carrigan: [00:15:42]  Right. 

Dave Bittner: [00:15:42]  So I don't believe the relationship lasted very long after that. 

Joe Carrigan: [00:15:46]  You know what we should do? We should kidnap a baby. Yeah, I think we should see other people. 

Dave Bittner: [00:15:52]  (Laughter) Right, right. Exactly. 

Joe Carrigan: [00:15:54]  (Laughter). 

Dave Bittner: [00:15:54]  Right. You know what would be fun to do? You know, I don't really have any plans this weekend. Let's go kidnap someone else's baby. 

Joe Carrigan: [00:15:59]  (Laughter) Right. 

Dave Bittner: [00:16:00]  Oh, look at the time. 

Joe Carrigan: [00:16:01]  (Laughter). 

Dave Bittner: [00:16:03]  Right. Obviously, this is disturbing and horrible. 

Joe Carrigan: [00:16:07]  Yeah, absolutely. 

Dave Bittner: [00:16:08]  I'm just thinking about that vulnerability because many people fell for this. This woman went to many people's homes... 

Joe Carrigan: [00:16:16]  Right. 

Dave Bittner: [00:16:17]  ...And actually took photos, gave them the photos. I think, in terms of trying to screen for this thing, in retrospect, it's easy to say, oh, there were a lot of red flags - the way that she acted when she came to someone's house. 

Joe Carrigan: [00:16:29]  Yeah. There are a lot of red flags in retrospect. But, you know, I might be willing to chalk those up to someone just being OCD. 

Dave Bittner: [00:16:36]  Or just - yeah, just a little peculiar, you know? 

Joe Carrigan: [00:16:38]  Yeah, just being a little peculiar. I don't know that that would set me off. 

Dave Bittner: [00:16:41]  Right. And you and I have talked about - I choose to think the best of people, so I don't think I would automatically - you know, if someone was reaching out or offering up on Facebook... 

Joe Carrigan: [00:16:50]  Yeah. 

Dave Bittner: [00:16:51]  ...And said they are looking to build their portfolio, well, that sounds reasonable to me. 

Joe Carrigan: [00:16:55]  The one thing that does - that would be a red flag for me is she was taking selfies with the baby. 

Dave Bittner: [00:16:59]  Yeah. 

Joe Carrigan: [00:17:00]  That's unusual. 

Dave Bittner: [00:17:00]  That is unusual. 

Joe Carrigan: [00:17:01]  There's no reason for a photographer to do that, I don't think. 

Dave Bittner: [00:17:04]  I also wonder, would you be better off meeting at a neutral location rather than inviting this person to your home? 

Joe Carrigan: [00:17:11]  I don't know. 

Dave Bittner: [00:17:12]  You got this situation where she tried to steal the keys, allegedly. 

Joe Carrigan: [00:17:15]  Yeah, I don't know, 'cause if you go to a neutral location, how does that affect the idea of taking a picture of the kid? I mean, the kid needs to be a comfortable... 

Dave Bittner: [00:17:24]  Right, right. 

Joe Carrigan: [00:17:25]  The baby needs to be comfortable and... 

Dave Bittner: [00:17:26]  That's true. 

Joe Carrigan: [00:17:26]  ...The best place to do that is at home. 

Dave Bittner: [00:17:28]  Yeah. 

Joe Carrigan: [00:17:28]  Yeah, I don't know. That's not really something I would look at as a fear for this, you know? It wouldn't - I wouldn't - that wouldn't have set off a red flag. 

Dave Bittner: [00:17:35]  Yeah. Tell the photographer that you want to meet at the local police station. 

Joe Carrigan: [00:17:39]  Right, yeah (laughter). I don't know how you protect yourself against this other than when somebody comes over and says, you really got to try these cupcakes, I'd be like, no. 

Dave Bittner: [00:17:48]  (Laughter) I mean, I suppose you could follow up, make sure they have a website, ask for some references. 

Joe Carrigan: [00:17:53]  Well, they're not going to have references if they're starting up. 

Dave Bittner: [00:17:56]  Yeah. 

Joe Carrigan: [00:17:56]  Right? I mean, that's the whole thing. This is a brilliant scheme. 

Dave Bittner: [00:18:00]  Right. You get what you paid for. 

Joe Carrigan: [00:18:02]  Right, exactly. 

Dave Bittner: [00:18:03]  Yeah. And how fortunate that before she got away with what she was, again, allegedly up to... 

Joe Carrigan: [00:18:08]  Right. 

Dave Bittner: [00:18:09]  ...It didn't go the way that she planned, and now she's in trouble with the law. 

Joe Carrigan: [00:18:14]  That's right. 

Dave Bittner: [00:18:14]  So... 

Joe Carrigan: [00:18:14]  As well she should be. 

Dave Bittner: [00:18:16]  Yeah. Hopefully, justice will be done. All right. Well, that is my story for this week. 

Dave Bittner: [00:18:21]  It is time to move on to our Catch of the Day. 

[00:18:24]  (SOUNDBITE OF REELING IN FISHING LINE) 

Dave Bittner: [00:18:27]  Joe, you came up with our Catch of the Day this week. Do you want to describe to us what's going on here? 

Joe Carrigan: [00:18:32]  Yes. This is a text exchange between a scammer and somebody who decides they're going to waste a little bit of the scammer's time. And this comes from littlethings.com. Dave, why don't you play the part of the scammer this time? And I will play the part of the recipient of the first message. 

Dave Bittner: [00:18:50]  All right. It starts like this. 

Dave Bittner: [00:18:53]  Hello. My name is Harris, and I represent a Fortune 500 company. I would like to interest you in a chance to earn $5,000 a month in the comfort of your own home. If you're interested, give this number a call, and we can discuss something. 

Joe Carrigan: [00:19:06]  Congratulations. You have successfully subscribed to the interesting facts of the day. Here's your daily fun fact. A human's cells replace themselves over seven years, so every seven years, you're essentially a clone of yourself. To unsubscribe from interesting facts of the day, please send unsub IFOTD. 

Dave Bittner: [00:19:25]  Unsub IFOTD. 

Joe Carrigan: [00:19:27]  Congratulations. You have unsubbed from the interesting facts of the day. We'll be sorry to see you go. We are currently in a phase of survey and testing. Would you like to take part in our survey? Send yes or no. 

Dave Bittner: [00:19:39]  No. 

Joe Carrigan: [00:19:40]  You have selected no problem. Thank you for your patience. 

Dave Bittner: [00:19:43]  What? 

Joe Carrigan: [00:19:43]  On a scale of 1 to 10, how would you rate our fun facts? 

Dave Bittner: [00:19:47]  Wot? How do I stop this? 

Joe Carrigan: [00:19:49]  You have successfully subscribed to ways of twerking with keywords WOT. Our current database only has Miley on it. Would you like a tutorial on the Miley way of twerking? Reply yes or no. 

Dave Bittner: [00:19:59]  No. 

Joe Carrigan: [00:20:00]  You have successfully subscribed to naughty otters. Here's your free daily otter picture. And then it's a picture of an otter in, like, some underwear. To unsubscribe from this service, please reply no otters. 

Dave Bittner: [00:20:14]  No otters. 

Joe Carrigan: [00:20:15]  You have successfully unsubscribed from naughty otters. Would you like to subscribe to naughty dogs? Please reply yes or no. 

Dave Bittner: [00:20:21]  No. 

Joe Carrigan: [00:20:22]  You have selected no problem. Here is your daily naughty dog picture. And it's a picture of a dog, like, cooking. 

[00:20:28]  (LAUGHTER) 

Dave Bittner: [00:20:30]  Dude, you got to stop this. I've got a business to run. I can't be checking my phone for [expletive] like this. 

Joe Carrigan: [00:20:36]  Thank you. You have keyed in the daily secret word, [expletive]. 

[00:20:42]  (LAUGHTER) 

Joe Carrigan: [00:20:42]  As a reward, we'll be providing you with 80 free naughty dog pictures for the next 24 hours. 

Dave Bittner: [00:20:48]  If I ever find you, I will rip out your throat and go ham on your [expletive]. 

Joe Carrigan: [00:20:53]  Congratulations. You have keyed the keyword, ham. Here is a picture of some sexy ham. (Laughter) And it's a picture of a woman, like, eating a ham with her pinkie extended. 

Dave Bittner: [00:21:03]  [Expletive] you. 

Joe Carrigan: [00:21:07]  (Laughter) That is awesome. 

Dave Bittner: [00:21:08]  That is a good one. 

[00:21:09]  (LAUGHTER) 

Dave Bittner: [00:21:12]  It's so satisfying to waste these people's time, right? (Laughter). 

Joe Carrigan: [00:21:15]  Yeah, it is. 

Dave Bittner: [00:21:16]  The frustration is palpable, isn't it? 

Joe Carrigan: [00:21:17]  Yes, it is. 

Dave Bittner: [00:21:18]  It really is. All right. Well, that is our Catch of the Day. 

Dave Bittner: [00:21:21]  Coming up next - my conversation with Tim Sadler from Tessian. We're going to be discussing the human element of cybersecurity and phishing schemes. 

Dave Bittner: [00:21:29]  But first, a word from our sponsors at KnowBe4. And now we return to our sponsor's question about forms of social engineering. KnowBe4 will tell you that where there's human contact, there can be con games. It's important to build the kind of security culture in which your employees are enabled to make smart security decisions. To do that, they need to recognize phishing emails, of course, but they also need to understand that they can be hooked by voice calls - this is known as vishing - or by SMS texts, which people call smishing. See how your security culture stacks up against KnowBe4's free test. Get it at knowbe4.com/phishtest. That's knowbe4.com/phishtest. 

Dave Bittner: [00:22:15]  And we are back. Joe, I recently had the pleasure of speaking with Tim Sadler. He is from a company called Tessian. And we discuss the human element of cybersecurity, along with some details on some phishing schemes. Here's my conversation with Tim Sadler. 

Tim Sadler: [00:22:30]  I think, for a long time, when we've spoken about securing people, we've always defaulted to training and awareness rather than thinking about how we can use technology to take the burden of security away from people. So I think there's a challenge at the moment in that humans are unpredictable. They break the rules. They make mistakes. And they're easily tricked. And that's what's leading to so many data breaches today that are ultimately caused by people and human error. 

Dave Bittner: [00:22:55]  And so the bad guys, knowing this, have adjusted their tactics. 

Tim Sadler: [00:22:59]  I think that's right. I mean, if you think about email for an organization, it is an open gateway. So it is one of the only pieces of infrastructure an organization has where anybody can send anything into an organization without pre-approval. And I think that's one of the reasons why we're seeing such a high level of threat around phishing, spear-phishing, business email compromise, those kinds of attacks. It is the - really, the entry point for every attacker that wants to get into an organization today, and it's so effortless to execute one of these scams. 

Dave Bittner: [00:23:30]  So what kind of things are you tracking? What are some of the specific campaigns that are popular these days? 

Tim Sadler: [00:23:36]  So I think, you know, we see everything from the well-known trends like the fact that, you know, it's tax season and the W-9 form scam - so attackers putting malicious attachments in emails trying to get people to open them because, you know, it's tax season, and that's something that everybody is watching out for. 

Tim Sadler: [00:23:56]  And then some of the more interesting things that we're seeing specifically are around attackers scraping LinkedIn data to automate attacks based on people moving jobs. So a new joiner to an organization will - you know, is - may have a higher propensity to be duped by a phishing scam. They won't know the protocol that an organization has in place. So we're seeing a lot of attacks that come through when people are new to an organization. It's maybe in their first or second week, and then they'll receive a spear-phishing email pretending to be the CFO or pretending to be the CEO, trying to dupe them into doing something and, again, use those techniques of deception and urgency on emails. 

Dave Bittner: [00:24:40]  Now, what about some of the more targeted campaigns - you know, things like spear-phishing, even - you hear it referred to sometimes as whaling, where they're targeting high-level people within organizations? 

Tim Sadler: [00:24:51]  Yeah. I - you know, I think the rise in spear-phishing and these highly targeted campaigns is largely to do with the fact that we put so much of ourselves online now. So I read a stat recently. Over 150 million U.S. workers have a LinkedIn profile, which is an astonishing statistic. And what that means is it's trivial for any attacker to understand who the C-suite of any organization are, and then it's trivial to then emulate or spoof those identities on email. So what we're seeing is that it's really, really easy to pull off these scams. 

Tim Sadler: [00:25:26]  And actually, you can - for attackers, it is fairly scalable to do this. You can build a LinkedIn scraper. You can be pulling names. And you can be automating the purchase of domains that look like legitimate domains but, in fact, aren't. And then you can automate the sending of those emails into organizations. And, you know, the rewards from doing this kind of thing can be enormous for attackers. So I read about that charity in the U.K. this morning who fell victim to a spear-phishing scam where they lost almost a million dollars over three transactions. So it is a huge, huge payoff for these attackers when they actually - you know, they get their target to do the thing they want them to. 

Dave Bittner: [00:26:06]  What are your recommendations for organizations to best protect themselves? 

Tim Sadler: [00:26:10]  So I think, you know, it does start with awareness. You have to make sure that employees are aware that their inbox is dangerous. And they need to pause, if only for five seconds, just with every email they get and do some basic checks. So check, who is this email from? Does the domain look legitimate? 
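
As a rough illustration of the basic check Tim describes - does the sender's domain look legitimate, or is it a near-miss of one you trust? - here is a minimal Python sketch using only the standard library. The trusted-domain list and the similarity threshold are made-up values for the example, not anything from Tessian.

from difflib import SequenceMatcher

# Hypothetical list of domains this organization actually owns.
TRUSTED_DOMAINS = {"example.com", "example.co.uk"}

def sender_domain(address: str) -> str:
    # Return the domain portion of an email address, lowercased.
    return address.rsplit("@", 1)[-1].lower()

def looks_like_spoof(address: str, threshold: float = 0.85) -> bool:
    # Flag senders whose domain is a near-miss of a trusted domain
    # (e.g. examp1e.com) but not an exact match.
    domain = sender_domain(address)
    if domain in TRUSTED_DOMAINS:
        return False
    return any(
        SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED_DOMAINS
    )

print(looks_like_spoof("ceo@examp1e.com"))   # True - one character swapped in
print(looks_like_spoof("friend@gmail.com"))  # False - not similar enough to a trusted domain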

Tim Sadler: [00:26:28]  But really, what is extremely difficult is, for most organizations today, their entire security strategy is reliant on their employees doing the right thing 100% of the time. So if you are only relying on security training and awareness, there are going to be things that creep through. There are going to be attacks that are successful. And in the same way that organizations use advanced technology to secure their networks and secure their devices, we believe that organizations today need to be using advanced technology to secure their people. 

Dave Bittner: [00:26:58]  Well, how does that technology play out? What sort of things are you describing here? 

Tim Sadler: [00:27:03]  In order to secure people - so again, we come back to this point that people are unpredictable. They break the rules. They make mistakes, and they're easily hacked. A system needs to understand the normal patterns of behavior that a person exhibits on email in order to understand what looks like a security threat and what looks like a normal email. So what organizations can do is they can use a platform - like Tessian, for example - that uses machine learning to analyze historical email patterns and behaviors to understand, on every incoming email, does this email look legitimate or not? And that's something that we've pioneered and we use and is much more effective than some of the traditional approaches, which use rules or policies to control the flow of inbound email. 

Dave Bittner: [00:27:48]  You know, it reminds me of a story that a colleague of mine shared with some friends who work for a nonprofit. And they got an email from the chief financial officer, who had just gone on vacation, and it said, I know; I realize I'm out of town, but I need you all to transfer this large sum of money, and I need it done immediately; you know, please don't let me down. And to a person, they all said, this is the last thing in the world this person would ever say or do. And that tipped them off to the problem. It sounds like - I mean, that's a similar thing to how you're coming at this from a technological point of view or looking - making sure that the behavior isn't anomalous. 

Tim Sadler: [00:28:22]  Yeah, that's exactly right. We use machine learning in the way that it's been applied to other fields - for example, credit card fraud detection. You look at their normal spending patterns and behaviors on card transactions, and then you use that intelligence to then spot the fraudulent transactions. And that's what we're doing. We're looking at normal email behavior in order to spot the fraudulent email behavior. And in the same way that you would try and train a person to look out for the unusual aspects of an email that may give a clue as to whether it's a phishing email or not, you can train a machine-learning algorithm to do the same. 

Tim Sadler: [00:28:58]  Now, the difference and the advantage to doing this is that a machine-learning algorithm can traverse millions and millions and millions of data points in a split second, whereas a human is only going to have a limited number of data points that they can remember or they can go back to in their mind. 
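
Tessian's actual models aren't described in detail here, so the sketch below is only a toy stand-in for the approach Tim outlines: score each incoming email against the recipient's historical patterns, much as card-fraud detection scores a transaction against past spending. A real system would use far richer features and a genuine machine-learning model; this one just tracks how often a sender has been seen before, and every address in it is hypothetical.

from collections import Counter

class SenderHistoryModel:
    # Toy model: the only "feature" is how often a sender has mailed this person before.
    def __init__(self):
        self.sender_counts = Counter()
        self.total = 0

    def observe(self, sender: str) -> None:
        # Record one historical (known-good) email from this sender.
        self.sender_counts[sender.lower()] += 1
        self.total += 1

    def anomaly_score(self, sender: str) -> float:
        # 0.0 means a very familiar sender; 1.0 means never seen before.
        if self.total == 0:
            return 1.0
        seen = self.sender_counts[sender.lower()]
        return 1.0 - (seen / (seen + 1))  # shrinks toward 0 as history accumulates

model = SenderHistoryModel()
for past_sender in ["cfo@example.com"] * 40 + ["vendor@partner.example"] * 5:
    model.observe(past_sender)

print(model.anomaly_score("cfo@example.com"))   # low score - long history with this sender
print(model.anomaly_score("cfo@examp1e.com"))   # 1.0 - lookalike address, never seen before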

Dave Bittner: [00:29:13]  Where do you suppose we're headed with this? As you look towards the future and this problem with email continues to be an issue, do you suppose the types of things that you're offering here are going to become just a standard part of doing business? 

Tim Sadler: [00:29:26]  I think it's critical that organizations today realize that their security strategy cannot be reliant on training people to do the right thing 100% of the time. And again, it comes back to - at the beginning of my career, I was working for one of the world's largest banks and saw a massive problem, and that is that banks spend millions of dollars on securing their networks and devices using advanced technology, but they completely neglect the security of their people. So instead, they're relying on training them to do the right thing 100% of the time. And that, obviously, doesn't work. 

Tim Sadler: [00:29:58]  I saw people who would send highly sensitive information to completely the wrong person. They would email documents to their personal email account, or they would fall for phishing scams. So we thought this was a huge problem that needed solving, and that's why we built the product that we're building today - because we believe that in the same way you have a firewall for your network and you have an EDR platform for your devices, we believe you need a human-layer security platform to protect your people. 

Dave Bittner: [00:30:25]  All right. Interesting stuff. Joe? 

Joe Carrigan: [00:30:27]  Yeah. A couple things stick out to me. One, your inbox is dangerous, and Tim does a really good job of describing why that is. He calls it an open gateway because anyone - literally anyone - can use your inbox. 

Dave Bittner: [00:30:40]  Yeah. 

Joe Carrigan: [00:30:40]  That there - and he's right. There's nothing else... 

Dave Bittner: [00:30:42]  That's crazy (laughter). 

Joe Carrigan: [00:30:43]  Yeah. When you think about it that way, it doesn't make sense. That shouldn't be the case. 

Dave Bittner: [00:30:47]  I was thinking about this 'cause I recently - you know, you get private messages on Twitter. 

Joe Carrigan: [00:30:51]  Right. 

Dave Bittner: [00:30:52]  And I have my Twitter configured so that if you're not someone who I also follow... 

Joe Carrigan: [00:30:57]  Right. 

Dave Bittner: [00:30:57]  ...They get filtered and they don't automatically come through. I have to approve them. 

Joe Carrigan: [00:31:01]  Yeah, I think that's the way it is. The only thing you can say is, I don't want anything from anybody I don't follow, or, go ahead and allow it, but then they have a request. 

Dave Bittner: [00:31:08]  Right. It seems like that would be a reasonable thing over in the email world. But... 

Joe Carrigan: [00:31:12]  Yeah. 

Dave Bittner: [00:31:13]  ...As we know, email is ancient. 

Joe Carrigan: [00:31:15]  It is. It's a very old and terrible system... 

Dave Bittner: [00:31:18]  (Laughter) That we're stuck with - right, right. 

Joe Carrigan: [00:31:19]  ...That we're stuck with for some reason, and we can't - there's got to be a new way to do this. You know, somebody out there listening... 

Dave Bittner: [00:31:24]  Maybe the next generation will just stop using it, and that'll be that. 

Joe Carrigan: [00:31:29]  That's the new million-dollar idea. 

Dave Bittner: [00:31:30]  Right, right. 

Joe Carrigan: [00:31:31]  If you are relying on security training alone, Tim makes this point that you're going to get compromised, and that is a hundred percent true. You need a holistic security plan or solution that at least includes technology, policy and user training. You know, your users have to be trained on both the technology and the policy. These business email compromises - a lot of them can be stopped by good policy. You know, like, when the CFO says, I need you to transfer this money now, if your corporate policy is to place a phone call back to the CFO, then that will stop a lot of these. 

Dave Bittner: [00:32:07]  Get multiple people to sign off on things or things like that. 

Joe Carrigan: [00:32:09]  Multiple people sign off on things - exactly. 

Dave Bittner: [00:32:11]  Yeah, yeah. 

Joe Carrigan: [00:32:11]  Those kind of things can go a long way towards stopping these kind of compromises. 
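A sketch of the kind of policy Joe describes - an out-of-band callback plus dual sign-off before a large transfer goes out - could be written down as explicit rules like this. The field names and the $10,000 threshold are hypothetical, purely for illustration.

from dataclasses import dataclass, field

@dataclass
class TransferRequest:
    amount_usd: float
    requested_by_email: str
    callback_verified: bool = False               # someone phoned the requester back
    approvals: set = field(default_factory=set)   # names of people who signed off

def may_execute(req: TransferRequest, dual_approval_threshold: float = 10_000) -> bool:
    # Allow the transfer only if the email request was verified out of band
    # and, above the threshold, at least two different people approved it.
    if not req.callback_verified:
        return False
    if req.amount_usd >= dual_approval_threshold and len(req.approvals) < 2:
        return False
    return True

req = TransferRequest(amount_usd=250_000, requested_by_email="cfo@example.com")
print(may_execute(req))                 # False - nobody has called the CFO back yet
req.callback_verified = True
req.approvals.update({"alice", "bob"})
print(may_execute(req))                 # True - callback done and two sign-offs recorded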

Dave Bittner: [00:32:15]  Right. 

Joe Carrigan: [00:32:15]  I also found it interesting that he's saying attackers are targeting people who are new to an organization on LinkedIn. I don't want to sound like I'm praising the attackers, but that's actually pretty smart. You go into a new organization, you're really not familiar with things. If you have a lot of responsibility there and they give you the keys to the kingdom right away, then you are a very vulnerable target... 

Dave Bittner: [00:32:34]  Right. 

Joe Carrigan: [00:32:35]  ...For these scraping attacks. 

Dave Bittner: [00:32:37]  You don't want to disappoint your new co-workers. 

Joe Carrigan: [00:32:38]  Right. 

Dave Bittner: [00:32:39]  Somebody makes a request via email. 

Joe Carrigan: [00:32:41]  Yep. 

Dave Bittner: [00:32:41]  You may not know who that person is. 

Joe Carrigan: [00:32:43]  Right. 

Dave Bittner: [00:32:43]  Yeah. 

Joe Carrigan: [00:32:43]  Exactly. My recommendation for this is when you change jobs, don't update your LinkedIn right away, you know? Wait for a couple of months until you get settled in, then go ahead and update your LinkedIn. One hundred fifty million U.S. workers are on LinkedIn. In January of 2018, the U.S. labor force was 160 million. That's a large percentage. 

Joe Carrigan: [00:33:03]  Now, not everybody on LinkedIn is still in the workforce. Like, I have connections with people who I know have retired. I actually have some connections I know have passed away on LinkedIn, but their LinkedIn accounts are still there. So they're probably in that 150 million, but that's a very large percentage... 

Dave Bittner: [00:33:19]  Yeah. 

Joe Carrigan: [00:33:20]  ...Of people that have professional positions here in the U.S. that are on LinkedIn. 

Dave Bittner: [00:33:25]  It is unusual to search for someone on LinkedIn and not have them come up. 

Joe Carrigan: [00:33:28]  Yes. 

Dave Bittner: [00:33:28]  More likely than not to find somebody there. 

Joe Carrigan: [00:33:30]  Right. My kids are both on it. 

Dave Bittner: [00:33:32]  Thanks so much to Tim Sadler for joining us. He is from Tessian. 

Dave Bittner: [00:33:37]  And we want to thank all of you for listening to us. 

Dave Bittner: [00:33:39]  And, of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. You can take advantage of their free phishing test at knowbe4.com/phishtest. Think of KnowBe4 for your security training. We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. 

Dave Bittner: [00:34:02]  The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: [00:34:15]  And I'm Joe Carrigan. 

Dave Bittner: [00:34:16]  Thanks for listening.