Lisa Forte: [00:00:00] Everything needs to be corroborated. Anything that you rely on has to be corroborated by some other source.
Dave Bittner: [00:00:07] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast. Joe, I know what you're asking. You're asking, what is this show? Well, let me tell you what this show is. This is the show where, each week, we look behind the social engineering scams. What else?
Joe Carrigan: [00:00:18] Well, I didn't know you were going to throw it to me, Dave.
Dave Bittner: [00:00:22] The phishing schemes, what else?
Joe Carrigan: [00:00:24] Criminal exploits...
Dave Bittner: [00:00:25] Criminal exploits.
Joe Carrigan: [00:00:25] ...That are making the headlines and taking a heavy toll on organizations around the world.
Dave Bittner: [00:00:28] That's right. I'm Dave Bittner from the CyberWire, and who the heck are you?
Joe Carrigan: [00:00:31] I'm Joe Carrigan from the Johns Hopkins University Information Security Institute.
Dave Bittner: [00:00:34] OK. Now that we've got that out of the way, we've got some good stories to share this week. And later in the show, Carole Theriault speaks with Lisa Forte. She's from Red Goat Cyber, and she's going to be talking about how her experiences as a police officer inform her perspective on the human factors of cybersecurity.
Dave Bittner: [00:00:51] But first, a word from our sponsors at KnowBe4 - so who's got the advantage in cybersecurity - the attacker or the defender? Intelligent people differ on this, but the conventional wisdom is that the advantage goes to the attacker. But why is this? Stay with us, and we'll have some insights from our sponsor KnowBe4 that puts it all into perspective.
Dave Bittner: [00:01:19] And we are back. Joe, before we get to our stories, we got a kind letter...
Joe Carrigan: [00:01:23] Yes, we did.
Dave Bittner: [00:01:24] ...From a listener. And this is actually a letter someone wrote out...
Joe Carrigan: [00:01:28] Handwrote.
Dave Bittner: [00:01:28] ...A letter. It is handwritten, very nicely done, sent all the way from Germany.
Joe Carrigan: [00:01:32] Who does this anymore?
Dave Bittner: [00:01:33] Well, a true gentleman does this, Joe.
Joe Carrigan: [00:01:35] Yes, that's correct.
Dave Bittner: [00:01:38] (Laughter) That's who does this. Forgive me, to the author of this letter, if I get your name wrong, but I believe it's Majeet (ph) - is a way to pronounce it. Again, apologies in advance. But he says, greetings from Germany. I'm an Egyptian computer science master's student at the University of Konstanz, Germany. I'm an avid listener of your outstanding podcast "Hacking Humans," and I practically told everyone I study and work with about it. Love the show.
Dave Bittner: [00:02:01] Well, we love you...
Joe Carrigan: [00:02:03] (Laughter) Yes. Thank you...
Dave Bittner: [00:02:04] ...Majeet.
Joe Carrigan: [00:02:04] Thank you very much for the kind words.
Dave Bittner: [00:02:05] Yes. Yes. He says, I have a curious question, if I may. You gentlemen mentioned quite a few stories about email hijacking.
Joe Carrigan: [00:02:11] We do.
Dave Bittner: [00:02:11] And the aftermath is oftentimes financially disastrous.
Joe Carrigan: [00:02:14] It often is, yes.
Dave Bittner: [00:02:16] I understand that two-factor authentication is effective, of course, but are solutions like digital signatures considered in the corporate world? If I'm the only person who owns a private key that I sign my emails with, then it's computationally infeasible for anyone to masquerade as me, right? Thanks a bunch, and keep up the good work. P.S. - love the accents.
Dave Bittner: [00:02:38] What do you think, Joe?
Joe Carrigan: [00:02:39] Yes, that's true. When you sign an email with your private key and you put a digital signature on it, there's no way for someone to forge that. However, my concern is that the interface is a - just a little icon next to the email that says it's been signed. Now, I can click on that, look at the digital signature and everything, but if somebody else signs it with another valid certificate and is impersonating you, it might still appear to be signed.
Dave Bittner: [00:03:01] Oh, I see.
Joe Carrigan: [00:03:01] It's definitely a good measure that should be enacted. I use it on my email at JHU.
Dave Bittner: [00:03:06] Oh, really?
Joe Carrigan: [00:03:06] Yep, I do. Chris Venghaus, who's my - one of my co-workers, he uses it as well. And I know that there are places where, if the culture of the organization is such that you don't sign an email, people will ignore it. But that's really the point, is that you really have to have a culture like that, and that's kind of difficult to implement, whereas two-factor authentication is something you can implement more easily. My suggestion is do both, you know? 'Cause once you implement the certificate, it's transparent to the user.
Dave Bittner: [00:03:33] Right. And I guess with two-factor, it's all on you.
Joe Carrigan: [00:03:36] Right.
Dave Bittner: [00:03:36] But with these digital signatures, everybody's got to be in on the game.
Joe Carrigan: [00:03:40] Yes.
Dave Bittner: [00:03:41] You need buy-in, and that's hard to get with something as open as email.
Joe Carrigan: [00:03:44] I was talking with somebody the other day, and they said they don't even like using email. And my comment to them was, yeah, email's terrible.
Dave Bittner: [00:03:50] Yeah.
Joe Carrigan: [00:03:50] It's - you know, it's a decades-old system that - security was never considered in the development of this protocol.
Dave Bittner: [00:03:56] Yeah. We're just stuck with that legacy.
Joe Carrigan: [00:03:57] Yeah.
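The signing idea Joe describes can be sketched with textbook-sized RSA numbers. This is a toy illustration only: real email signing uses S/MIME or PGP with far larger keys and proper padding, and every number and helper name below is invented for demonstration, not taken from any real mail system.

```python
# Toy sketch (not production crypto) of how a digital signature binds
# a message to one private key: only the holder of d can produce a
# signature that the public exponent e will verify.
import hashlib

# Tiny textbook RSA key pair: n = p*q, e public, d private.
p, q = 61, 53
n = p * q              # 3233
e = 17                 # public exponent
d = 2753               # private exponent (e*d ≡ 1 mod lcm(p-1, q-1))

def digest(message: bytes) -> int:
    # Hash the message down to a number smaller than n.
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

def sign(message: bytes, priv: int) -> int:
    # Only the private-key holder can compute this.
    return pow(digest(message), priv, n)

def verify(message: bytes, signature: int, pub: int) -> bool:
    # Anyone with the public key can check it.
    return pow(signature, pub, n) == digest(message)

msg = b"Please wire $10,000 to account 1234."
sig = sign(msg, d)

print(verify(msg, sig, e))              # True: genuine signature checks out
print(verify(msg, (sig + 1) % n, e))    # False: any altered signature fails
```

This also illustrates Joe's caveat: verification only proves the message matches *some* valid key pair. If an attacker signs with their own valid certificate, the math checks out; it's the reader who has to notice whose key it was.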
Dave Bittner: [00:03:58] All right. Well, again, thank you for sending in your kind letter. We do appreciate it. And we always love hearing from our listeners, so please don't hesitate to reach out if you have a question for us. Joe, let's get to our stories this week. What do you have for us?
Joe Carrigan: [00:04:11] Well, Dave, you know what tomorrow is, right?
Dave Bittner: [00:04:14] Go on.
Joe Carrigan: [00:04:15] It's Valentine's Day.
Dave Bittner: [00:04:16] Uh-oh.
Joe Carrigan: [00:04:16] So that means romance scams.
Dave Bittner: [00:04:18] It means I try to avoid my wife for an entire day.
Dave Bittner: [00:04:23] What did you get me? Uh, I'll be right back.
Joe Carrigan: [00:04:27] Right. My article comes from the BBB, the Better Business Bureau, and specifically from the BBB of Nebraska, South Dakota, Kansas Plains and Southwest Iowa, so it's kind of a Midwest-focused region of the Better Business Bureau. And a man named Jim Hegarty is the CEO of this organization. They're talking about romance scams, and the BBB has a scam tracker that reports on scams in the service area of this region of the BBB, and they are telling some stories in this article. One woman from Papillion, Neb., was scammed out of four grand - $4,000. There was another woman from Omaha who was taken for $25,000. And they have a third victim in here from Wichita, Kan. - $14,000.
Dave Bittner: [00:05:11] Wow.
Joe Carrigan: [00:05:11] But the big story in here takes the cake. There is one woman from Nebraska, and they don't name her in this thing, which is probably good, but she worked with the Better Business Bureau on an investigation for her scam because she fell in love with a guy she met on match.com. All right? And it was a typical romance scam. They talked on the phone every day over the course of five months, and she video chatted with him on FaceTime. Right? So she had seen this guy.
Dave Bittner: [00:05:36] Right.
Joe Carrigan: [00:05:37] And everything about him seemed real and sincere. And over time, guess what happened? She came to trust him. And Hegarty says that this is very common among the victims. There is often a very long grooming period before these guys ask for money. It's sad to tell the story, but the woman sent the scammer money for several plane tickets for him to come visit Nebraska. And something always would come up, and he wouldn't be able to do it. Now, I know what we're thinking, right? I send you money for a plane ticket so you can come see me, and then you say, oh, something came up; I can't come. And then I say, you coming out to see me? You go, yeah, send me money for a plane ticket. And I say, use the money I sent you last time.
Dave Bittner: [00:06:13] Right.
Joe Carrigan: [00:06:15] That's what I would say.
Dave Bittner: [00:06:16] Yeah.
Joe Carrigan: [00:06:16] Or that's what I think I would say.
Dave Bittner: [00:06:17] Right.
Joe Carrigan: [00:06:18] But this woman did not say that. She continually sent him money. He claimed his paychecks were being held up because of taxes...
Dave Bittner: [00:06:23] OK.
Joe Carrigan: [00:06:25] ...And that he'd pay her back after everything was cleared up, right?
Dave Bittner: [00:06:28] Right.
Joe Carrigan: [00:06:28] So this is just another thing that is a great example of how they can provide a plausible problem that could be a long-time horizon problem.
Dave Bittner: [00:06:36] Yeah.
Joe Carrigan: [00:06:37] Tax problems can last for years.
Dave Bittner: [00:06:39] Sure.
Joe Carrigan: [00:06:39] When she finally had had enough, she sent him money for a plane ticket to come to Nebraska for Christmas, and she waited at the airport for hours and he never showed up. In all, this woman lost $400,000.
Dave Bittner: [00:06:52] Wow.
Joe Carrigan: [00:06:53] It's a lot of money. She lost all of her liquid assets. She took out a mortgage against her home, which had already been paid off, right?
Dave Bittner: [00:07:00] Oh, no.
Joe Carrigan: [00:07:00] And she lost $170,000 out of her investment portfolio. And here's what really - I don't know if this angers me or if this sickens me. I can't tell you. The woman is a widow. That's why they took advantage of her. It's just awful that this happened to this poor woman.
Dave Bittner: [00:07:15] Yeah.
Joe Carrigan: [00:07:15] The article mentions that consumers can also report these kind of scams to the Federal Trade Commission. And they've noted that these scams are growing. In 2017, there were just under 17,000 reports. In 2018, there were over 22,000 reports. And last year, 2019 - 38,000 reports, almost 39,000 reports.
Dave Bittner: [00:07:34] So over two years, it nearly doubles.
Joe Carrigan: [00:07:36] It nearly doubled over two years. It's more than doubled.
Dave Bittner: [00:07:38] Now, I wonder how much of that is increased reporting because of awareness. That could be a factor.
Joe Carrigan: [00:07:43] I wonder that, too. I tend to think it might actually be a combination of maybe increased reporting, but also increased activity.
Dave Bittner: [00:07:50] Right.
Joe Carrigan: [00:07:50] Because we also have seen many times in these stories that people who do get scammed are embarrassed to report it.
Dave Bittner: [00:07:55] Right.
Joe Carrigan: [00:07:56] I think that this happened to this woman - I don't know the story, but here are some of the common themes that we've seen in these stories. One, this woman may have been isolated, right? Like, she didn't have anybody she could talk to about this situation, and she wasn't mentioning it to people. Nobody was asking her about this.
Dave Bittner: [00:08:10] Right.
Joe Carrigan: [00:08:10] Nobody's saying, hey, Mom, what's going on with your boyfriend? Have you met him in person yet?
Dave Bittner: [00:08:15] I could imagine even a child might not even think to ask their parent...
Joe Carrigan: [00:08:19] Yeah, exactly.
Dave Bittner: [00:08:20] ...Their widowed parent if they're having any sort of romantic relationship at all.
Joe Carrigan: [00:08:24] You're absolutely right. It's a good question for children to ask their parents if their parents are widowed or single or - for whatever reason.
Dave Bittner: [00:08:30] Yeah.
Joe Carrigan: [00:08:30] You know, if they're in the dating game...
Dave Bittner: [00:08:32] You know, just - are you lonely? Are you seeing anyone, you know?
Joe Carrigan: [00:08:34] Right.
Dave Bittner: [00:08:34] So that's - something that simple could...
Joe Carrigan: [00:08:36] Yeah, check on them.
Dave Bittner: [00:08:36] ...Reveal that.
Joe Carrigan: [00:08:37] Yup. Start asking the questions. Are you sending this guy money? Does he have a pile of excuses? Because when this woman eventually confronted her scammer, he had another series of excuses lined up, ready to go.
Dave Bittner: [00:08:49] Yeah. I'm surprised at the boldness of using, you know, FaceTime to connect that way. We don't hear that very often.
Joe Carrigan: [00:08:56] No, we don't. I hope she took screenshots...
Dave Bittner: [00:08:58] Yeah.
Joe Carrigan: [00:08:58] ...So that they have a picture of the guy.
Dave Bittner: [00:09:01] I bet she didn't. Because why would you? If someone - if you trust someone and you - you know, I don't know. Although it's hard to say. You know, I can imagine if - also, if you're - if you're in love with someone, you might want to have a picture of them that you could look at...
Joe Carrigan: [00:09:13] Yeah.
Dave Bittner: [00:09:13] ...You know, all the time.
Joe Carrigan: [00:09:14] Yeah, exactly.
Dave Bittner: [00:09:15] Yeah. So, well, I wish her well. I hope they are able to get some of this back. Unfortunately, I doubt they will. But it's a good lesson for all of us.
Joe Carrigan: [00:09:23] Yeah.
Dave Bittner: [00:09:23] And we've said it many times here. Check in on your family and your friends, particularly those folks who might not have a lot of people around them. And sometimes these questions are uncomfortable, but you got to ask them.
Joe Carrigan: [00:09:35] Yup.
Dave Bittner: [00:09:35] You just got to figure out a nice, tactful way to broach those subjects.
Joe Carrigan: [00:09:39] Or do what I do and just be blunt.
Dave Bittner: [00:09:43] Sad ending, but cautionary tale.
Joe Carrigan: [00:09:45] Yup.
Dave Bittner: [00:09:46] My story this week actually comes from BuzzFeed News. The title of it is "These Fake Local News Sites Have Confused People for Years. We Found Out Who Created Them." I realize there's no small bit of irony that a story about fake news might be coming from BuzzFeed.
Joe Carrigan: [00:10:00] BuzzFeed (laughter).
Dave Bittner: [00:10:01] Right. But this is - this does seem to check out. It's caught my eye. I saw it come by, actually, on Twitter. A security researcher pointed this out. So a lot of us rely on Google Alerts...
Joe Carrigan: [00:10:14] Yes.
Dave Bittner: [00:10:14] ...To keep track of news items, things we're interested in. Google has this service, Google Alerts, where you can put in topics that you're interested in. And when those things bubble up in the news, Google alerts you...
Joe Carrigan: [00:10:25] Yup.
Dave Bittner: [00:10:26] ...And you can follow those stories. Well, for the past couple of years, there have been a bunch of stories that have been bubbling up on Google Alerts that are from local online news publications that, it turns out, are not real. They are fake publications. The news they're publishing, most of the time, is plagiarized and old. So a story from a year ago will bubble up into Google Alerts as new, and it comes from a seemingly legitimate online news source. In other words, if you went to this page...
Joe Carrigan: [00:10:59] Right. It would look like a news source.
Dave Bittner: [00:11:00] Right, right. For example, you know, you and I, we live in - Howard County is the county we live in in Maryland.
Joe Carrigan: [00:11:05] Correct.
Dave Bittner: [00:11:05] So it would be, you know, the Howard County Journal (ph)...
Joe Carrigan: [00:11:08] Right.
Dave Bittner: [00:11:08] ...Or something like that. And at first glance, it looks like a legit news service. There's well-written articles, but most of them are plagiarized. So some people noticed this and were trying to figure out why these old stories were coming up on their Google Alerts and why Google wasn't catching them.
Joe Carrigan: [00:11:26] Right. Why is Google not vetting their news sources for Google Alerts?
Dave Bittner: [00:11:30] Correct. And BuzzFeed followed up on this. And one of the things, of course, they explored was that we live in this time when there's this thing called fake news where people are trying to put out disinformation.
Joe Carrigan: [00:11:43] Right.
Dave Bittner: [00:11:43] It's the old Russian playbook of not necessarily changing your mind about something but just making you feel uncertain about things...
Joe Carrigan: [00:11:51] Right.
Dave Bittner: [00:11:51] ...Wondering what is the truth and getting that feeling of uncertainty and anxiety. Often, that is the goal.
Joe Carrigan: [00:11:57] Yep.
Dave Bittner: [00:11:57] So BuzzFeed explored that, but they found out in the end - Joe, guess what this was all about.
Joe Carrigan: [00:12:03] I'm going to take a wild guess here. This is about ad revenue.
Dave Bittner: [00:12:06] Ding, ding, ding, ding, ding, ding, ding - yes. In the end, it was about making money.
Joe Carrigan: [00:12:11] Right.
Dave Bittner: [00:12:11] So they actually traced it back to some folks who are proprietors of these sites. They were spinning up bunches of these sorts of sites and filling them with...
Joe Carrigan: [00:12:23] Plagiarized material.
Dave Bittner: [00:12:23] ...Plagiarized material.
Joe Carrigan: [00:12:24] Selling ads on those sites when they hit the Google search engine.
Dave Bittner: [00:12:28] Exactly.
Joe Carrigan: [00:12:28] Or if there's a way you can query Google to find out what people are most interested in for the news alerts. I wonder if there's some angle of that. Like, if I look and I see that people are really interested in, say, Calvin Ball, right?
Dave Bittner: [00:12:41] Right, who's our county executive.
Joe Carrigan: [00:12:43] Our county executive.
Dave Bittner: [00:12:44] Yep.
Joe Carrigan: [00:12:44] And then if there's some way I could find that out, then I could start reposting old articles about Calvin Ball winning the election, which happened about a year ago, right?
Dave Bittner: [00:12:52] Yeah. So one of the questions BuzzFeed looked into is, why were these getting past Google's filtering?
Joe Carrigan: [00:12:58] Yes.
Dave Bittner: [00:12:58] Because Google, of course - it's a big part of their business is keeping that stuff out of there. Google doesn't want these things running because it makes people feel like they're less confident in Google.
Joe Carrigan: [00:13:08] Yeah.
Dave Bittner: [00:13:08] So it seems as though these news sites looking so legitimate, not drawing attention to themselves, just looking - and plus the fact that they're small-town publications.
Joe Carrigan: [00:13:17] Yeah, that's right. The effort that Google would have to go through here would be pretty big if you consider all of these different small towns. They could probably have some kind of plagiarism system...
Dave Bittner: [00:13:28] Yeah.
Joe Carrigan: [00:13:28] ...That, you know, says, here's a new article. Let's look at all the old articles and see if this is just a plagiarized article.
Dave Bittner: [00:13:34] Right.
Joe Carrigan: [00:13:34] Every academic institution has that capability.
Dave Bittner: [00:13:37] Now, evidently, Google has terminated some of the AdSense advertising accounts that belong to the people who are...
Joe Carrigan: [00:13:45] Oh, really?
Dave Bittner: [00:13:45] ...Doing this sort of thing. Yeah.
Joe Carrigan: [00:13:47] That's a significant step.
Dave Bittner: [00:13:48] And Google says they've tweaked some of their settings to try to do a better job with this. It's a tough thing to know how to advise people to protect themselves against this. This is someone taking advantage of the way that a fully automated system works...
Joe Carrigan: [00:14:03] Right.
Dave Bittner: [00:14:04] ...To put information in front of people that is stale...
Joe Carrigan: [00:14:08] Yeah.
Dave Bittner: [00:14:08] ...Out of date and yet carrying this ad load, and that's how they make the money.
Joe Carrigan: [00:14:14] Right.
Dave Bittner: [00:14:15] So I suppose - I mean, vigilance is...
Joe Carrigan: [00:14:17] Yeah, vigilance.
Dave Bittner: [00:14:17] One thing that I have the habit of doing is any news story that's been shared...
Joe Carrigan: [00:14:22] Right.
Dave Bittner: [00:14:22] Like, I'm going through Twitter, and someone says, check out this article. First thing I do is check the date on it.
Joe Carrigan: [00:14:28] Right.
Dave Bittner: [00:14:28] The publication date.
Joe Carrigan: [00:14:29] Yeah.
Dave Bittner: [00:14:29] Make sure that it's not a story from a year ago or even six months ago.
Joe Carrigan: [00:14:33] Yep.
Dave Bittner: [00:14:33] So I think that's a good standard thing to do.
Joe Carrigan: [00:14:36] And check the source to see if it's someone you know and trust.
Dave Bittner: [00:14:39] So I suppose, you know, be on the lookout for this sort of thing. And just as knowledge, shining a light on it will hopefully help people...
Joe Carrigan: [00:14:47] Yeah.
Dave Bittner: [00:14:47] ...Be more aware of it. Check to make sure that that news source is a real news source. If it's a publication you've never heard of, just take a couple minutes and check the date. You know, it's easy, really. If you copy and paste any part of any article - you know, copy a paragraph from an article. Load it into - just do a Google search of that entire paragraph.
Joe Carrigan: [00:15:08] Right.
Dave Bittner: [00:15:09] And if it's plagiarized...
Joe Carrigan: [00:15:10] It'll show up.
Dave Bittner: [00:15:11] ...It'll come - it'll take you to the original article most of the time.
Joe Carrigan: [00:15:14] Right, which is why I'm surprised that Google doesn't do this, because they already have the information that they can check again.
Dave Bittner: [00:15:19] Yeah. I suppose it's hard for them to differentiate, though, because there are sites out there that are aggregators. Some of my favorite sites in the world are - do a lot of aggregation of other people's news (laughter).
Joe Carrigan: [00:15:30] Right, right. But, you know, Google doesn't need to give you an alert of five different stories of the same thing. Like, let's say the AP releases a story.
Dave Bittner: [00:15:38] Yeah.
Joe Carrigan: [00:15:38] Right? And then every major paper in the area will pick up that story.
Dave Bittner: [00:15:42] Right. But I think that's how it works to their advantage that they're using old stories, because it will already have flowed through Google's checking.
Joe Carrigan: [00:15:50] Right.
Dave Bittner: [00:15:50] It's coming up as new because it's no longer in the, hey, let's check these against each other...
Joe Carrigan: [00:15:56] Right.
Dave Bittner: [00:15:56] ...Algorithm that Google probably has.
Joe Carrigan: [00:15:58] Yeah. I think Google - there is a technical solution for this for Google, though.
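One plausible shape for the technical solution Joe mentions (our illustration; the source doesn't say what Google actually does) is shingle-based duplicate detection: break each article into overlapping word n-grams and measure set overlap. Production systems use scaled-up variants such as MinHash, but the core idea fits in a few lines.

```python
# Minimal sketch of duplicate/plagiarism detection using word
# "shingles" (overlapping n-grams) and Jaccard similarity.
# An illustration of the general technique, not any real pipeline.

def shingles(text: str, n: int = 3) -> set:
    """Return the set of lowercase word n-grams in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets, 0.0 (disjoint) to 1.0 (identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "The county executive won the election by a wide margin last November."
copied   = "The county executive won the election by a wide margin last November."
fresh    = "Local schools will close early on Friday due to the winter storm."

print(jaccard(shingles(original), shingles(copied)))  # 1.0: verbatim copy
print(jaccard(shingles(original), shingles(fresh)))   # 0.0: unrelated story
```

This is also why Dave's paste-a-paragraph-into-search trick works: a verbatim paragraph shares nearly all its shingles with the original article, so an index built on them surfaces the source immediately.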
Dave Bittner: [00:16:01] Yeah. Well, and they claim that they're doing better with it. But again - something to look out for. All right. Well, that is my story this week. Of course, we'll have links to our stories in the show notes so you can check those out. It is time to move on to our Catch of the Day.
[00:16:14] (SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:16:17] Our Catch of the Day comes from a listener named Zach (ph). He calls it an American prince scheme. And this one comes from Warren Buffett, Joe.
Joe Carrigan: [00:16:26] Warren Buffett?
Dave Bittner: [00:16:26] Warren Buffett. And he is...
Joe Carrigan: [00:16:29] He's one of the richest men in America.
Dave Bittner: [00:16:30] He is one of the richest men in America - dare I say, the world.
Joe Carrigan: [00:16:34] Maybe.
Dave Bittner: [00:16:35] It says - I'm just going to read it. The subject is Your Donation Fund. And he says, (reading) hi. My name is Warren Buffett, a philanthropist, the CEO and chairman of the Berkshire Hathaway.
Joe Carrigan: [00:16:47] The Berkshire Hathaway.
Dave Bittner: [00:16:49] (Reading) I believe strongly in giving while living. I have one idea that never changes my mind - that you should use your wealth to help people around the world. And I've decided to donate this amount through our Sunshine Lady Humanitarian Grants Program to give out $5.5 million - 5,500,000 united dollars - to randomly selected individuals worldwide. On - what are united dollars? Does that mean they're stuck together? They're...
Joe Carrigan: [00:17:15] Yes.
Dave Bittner: [00:17:16] ...Glued together?
Joe Carrigan: [00:17:16] They're all wrapped up...
Dave Bittner: [00:17:17] Maybe just a rubber band.
Joe Carrigan: [00:17:18] ...In bundles.
Dave Bittner: [00:17:19] OK. (Reading) On receipt of this email, you should count yourself as one of the lucky individuals. Well, that's us, Joe.
Joe Carrigan: [00:17:24] That's right.
Dave Bittner: [00:17:25] (Reading) Your email address was chosen online while searching at random. Kindly get back to me at your earliest convenience. I need receiver's name, country, home address, phone number, male or female so I know your email address is valid. I wait to receive your reply once you read this email. Happy holidays in advance. Warm regards, Warren Buffett. There's no limit to the generosity of these wealthy people in the world these days, Joe.
Joe Carrigan: [00:17:48] Yeah. I think one of the things Warren Buffett has said is he doesn't give money away to people.
Dave Bittner: [00:17:52] Oh, is that right?
Joe Carrigan: [00:17:53] Yeah, that he gives it away to organizations, and that giving money directly to people, he believes, isn't good for them.
Dave Bittner: [00:17:58] Yeah.
Joe Carrigan: [00:17:58] So.
Dave Bittner: [00:17:59] All right. Well, there you go (laughter).
Joe Carrigan: [00:18:00] Right.
Dave Bittner: [00:18:02] Well, thank you, Zach, for sending that in. That is a good one. And that is our Catch of the Day. Coming up next, Carole Theriault returns, and she's speaking with Lisa Forte from Red Goat. And their discussion centers on Lisa's experiences of being a police officer and how that informed her perspective on the human factors in cybersecurity.
Dave Bittner: [00:18:26] But first, a word from our sponsors. Now let's return to our sponsor's question about the attacker's advantage. Why do the experts think this is so? It's not like a military operation, where the defender is thought to have most of the advantages. In cyberspace, the attacker can just keep trying and probing at low risk and low cost, and the attacker only has to be successful once. And as KnowBe4 points out, email filters designed to keep malicious spam out have a 10.5% failure rate. That sounds pretty good. Who wouldn't want to bat nearly .900? But this isn't baseball. If your technical defenses fail in 1 out of 10 tries, you're out of luck and out of business. The last line of defense is your human firewall. Test that firewall with KnowBe4's free phishing test, which you can order up at knowbe4.com/phishtest. That's knowbe4.com/phishtest.
Dave Bittner: [00:19:26] And we are back, Joe. Always great to have Carole Theriault back. This week, she is speaking with Lisa Forte. She is from Red Goat. And they are going to be talking about Lisa's experience as a police officer - some really interesting stuff that Lisa did with her time in that career. And they're going to be talking about the human factors in cybersecurity. Here's Carole Theriault.
Carole Theriault: [00:19:48] So today, I'm going to introduce you to Lisa Forte. She is a partner at Red Goat, a U.K.-based consultancy that specializes in cybersecurity testing, training and war-gaming. Now, Lisa is one of those rare humans that has worked both sides of the fence. What I mean by that is she's certainly big in the private sector as a cybersecurity expert, but she has also worked with the U.K.'s primary Cyber Crime Unit. So I thought she'd be the perfect person to speak to about what scams we have labeled as hot and rising for 2020 and what behaviors or actions she can advise us to take to help us be a little bit safer out there. Lisa, thank you so much for speaking to me today on "Hacking Humans."
Lisa Forte: [00:20:31] Thank you for having me.
Carole Theriault: [00:20:33] How about we get started with your fascinating background, like your time with the cops? What did you do there? When did that all start?
Lisa Forte: [00:20:39] So I was probably - I probably joined, like, five, six years ago. And I have to be honest. When I joined the police, I was probably a little bit naive. I grew up in a sort of pretty privileged bubble, I guess, in the south of the U.K. And it was sort of an abrupt reality check for me when I got to see sort of the darker side of society, I guess. Yeah, so it was a bit of a massive reality check. And I had to grow up pretty fast. But I think it was a good experience overall. And it's definitely left me a bit wiser - maybe a bit more paranoid than I ought to be, but wiser for it.
Carole Theriault: [00:21:15] I hear you on that. Now tell me, were you interested in cybercrime before you went and worked with the authorities, or was it the other way around?
Lisa Forte: [00:21:22] So actually, I worked for a private company before I went to the police that used to put armed guards on ships to protect them from pirates, believe it or not. And...
Carole Theriault: [00:21:33] Wow.
Lisa Forte: [00:21:33] Yeah, so it's a bit of a crazy...
Carole Theriault: [00:21:34] You should write a book.
Lisa Forte: [00:21:35] Bit of a crazy story. And so we used to do a lot of OSINT on how the pirates were targeting ships and how they knew which ships were the best ones to target. So I was always interested in that sort of OSINT side of security. And so it was kind of a natural progression, I guess, from there.
Carole Theriault: [00:21:53] And so when you left the police, you would've been shaped so much more differently than most of us - right? - because you had been dealing with, like, as you said, all, like, the horrors of the cybercrime underworld. So you must have come away maybe being more paranoid, but also way more knowledgeable. Tell me what are some of the growing threats you see ahead of us in 2020? So what things do you think that we should just keep an eye on?
Lisa Forte: [00:22:18] So I think kind of - there's two sides to this. On the one hand, I think deepfakes are going to be an enormous problem, and not just from the sort of democracy side of things, although I'm sure they can have a huge effect on our freedom to vote and our freedom of religion and all the rest of it. But if you look at even legitimate videos - so for example, the video of Elon Musk, who was smoking, and Tesla's stock price plummeted almost instantly as soon as that video was released. Now, obviously, that video was genuine. But if it wasn't genuine, you could seriously affect stock prices of major companies by doing things like that. So I think the problem with deepfakes is that they have an ability to really affect a lot of things. And as you may recall, sort of linked to deepfakes is this idea of voice skins. I don't know if you've heard of this.
Carole Theriault: [00:23:09] Oh, I haven't heard that term. Tell me.
Lisa Forte: [00:23:11] So voice skins is essentially where the attackers will mimic the voice of the CEO of the company, for example. And this happened to an energy company here in Europe. And they had a German CEO, and his voice was cloned in a very similar way to a deepfake, really. And they called up one of the employees, and they asked them to transfer basically 200,000 pounds. And it sounded exactly like the CEO. So, of course, the employee thought...
Carole Theriault: [00:23:40] Wow.
Lisa Forte: [00:23:40] ...OK, yeah, of course I'm going to do it. So I think, you know, that takes vishing - this sort of voice attack - to a whole new level.
Carole Theriault: [00:23:48] Yeah. And these - there's a growing number of services like that. So I know of Lyrebird, where you can literally put in audio of people so you can - I could literally put in audio from this recording of you and hope that it comes out with something that approximates your voice. Now, you know, I would do that for a joke, but I guess people would do that for nefarious purposes.
Lisa Forte: [00:24:08] Yeah, definitely. And I don't think there are as many artifacts to be picked up on in an audio recording as you might be able to do with a deepfake video. As you watch it and play it back, you might pick up things that aren't quite right. That's going to be a lot harder to do with audio.
Carole Theriault: [00:24:23] OK. So I find them very scary, particularly if you think about where we are. You know, there's the U.S. presidential election this year. And, you know, it's scary for a lot of people that they might be duped. So what can people look for to try and stay ahead of the game?
Lisa Forte: [00:24:37] One of the main things that I was taught in the police which I never really did, in all honesty, beforehand was that everything needs to be corroborated. Anything that you rely on has to be corroborated by some other source. And I think we as a society need to start thinking, if I post this on social media, I am responsible for fact-checking and ensuring that this is quality content before I share it to my followers, because otherwise, I'm proliferating something I haven't even looked into. So I think that's a really big part of what we need to start doing.
Carole Theriault: [00:25:13] Yeah. Wouldn't it be wonderful if social networking sites would prevent you from forwarding an article unless you actually opened it and read it? 'Cause I suspect loads of people share stuff based on headline alone.
Lisa Forte: [00:25:25] They probably don't even open it.
Carole Theriault: [00:25:26] Yeah.
Lisa Forte: [00:25:26] They probably just see the post and think, oh, I'll retweet that.
Carole Theriault: [00:25:30] Exactly. But I guess you'd get a lot fewer shares that way. OK. So that's really interesting. So you're saying people should be responsible with whatever they post online. And one of the ways they can do that is to verify it and validate it with two independent sources of information.
Lisa Forte: [00:25:45] Yeah. And I think there are other things that indicate to us that we might be receiving a phishing email or some fake news or, you know, a malicious story or something, because all of these stories - they're always going to sound too good to be true. And phishing emails are the same. They always sound too good to be true. And they always sort of evoke some sort of emotional response from you. And I think if you start feeling like that when you read something, it's probably worth going and checking it out.
Carole Theriault: [00:26:12] And what about with deepfakes and voice skins? What you were saying earlier - that sometimes you can see something - a glitch or something in a deepfake that can give you a warning. So basically, if the hair is standing on the back of your neck and something doesn't feel right, what are the things that you might want to look for?
Lisa Forte: [00:26:28] So researchers have seen that in deepfakes, sometimes the individual won't blink normally because the AI hasn't had a lot of imagery of that person with their eyes closed. So for that reason, they can't make it blink. But I think even if you forget the actual video, if you just take the content, you know, does it sound odd, out of character? Does it sound too good to be true? Is it playing into your biases too much? And I think if you think that's true, then probably, you ought to go and question it.
Carole Theriault: [00:26:59] I'm just nervous. I'm nervous about tomorrow because, obviously, deepfakes are going to get better, right? And they're going to get more subtle and, in some cases, more sinister. And we're going to have to keep our wits about us out there.
Lisa Forte: [00:27:15] There is a bit of a silver lining to this at the moment.
Carole Theriault: [00:27:17] Oh, thank gosh (laughter).
Lisa Forte: [00:27:18] So - well, it's not that much of a silver lining, to be honest. But it's a little bit...
Carole Theriault: [00:27:21] OK, gray lining.
Lisa Forte: [00:27:22] It's a gray lining. So Deeptrace recently found 15,000 deepfake videos, and they actually realized that 96% of those videos were porn. And most of those were female celebrities' faces put onto pornographic videos. So what do humans do when they create a new technology? That's what they employ it for. So at the moment, it's not too much of a threat.
Carole Theriault: [00:27:48] That makes me feel really sorry for those who get cyberbullied out there.
Lisa Forte: [00:27:51] True. And I think this is kind of the other side to all of this stuff. Reality also can become plausibly deniable, because if you look at the recent Prince Andrew interview with the BBC - there was that photograph, and he claimed that photograph had been faked. And it looked pretty conclusive that it hadn't been faked and is, in fact, real. But because these things exist, it also gives people the opportunity to say, well, that wasn't me; that was fake.
Carole Theriault: [00:28:16] Yeah.
Lisa Forte: [00:28:16] So you can also deny things that did happen. So it sort of ruins a lot of trust in society, I think, overall.
Carole Theriault: [00:28:24] Well, one of the more cheery interviews I've done.
Lisa Forte: [00:28:28] That's what you get from me.
Carole Theriault: [00:28:30] No, thank you so much, Lisa, 'cause these are really, really important points, because the world is changing and we need to keep up with it. So thank you for giving us a hard slap of reality on today's show.
Lisa Forte: [00:28:42] Thank you so much for having me.
Carole Theriault: [00:28:43] This was Carole Theriault for "Hacking Humans."
Dave Bittner: [00:28:47] All right. Interesting conversation.
Joe Carrigan: [00:28:49] Yeah. Very interesting. I don't know that I'd call what she describes a silver lining, especially not for the women who are victimized by these deepfakes. But, yeah, I do agree that right now, the vast majority of the energy is being spent on producing porn.
Dave Bittner: [00:29:04] Yeah.
Joe Carrigan: [00:29:04] So I think that plausible deniability is probably the biggest problem with the advent of deepfakes - that we're going to see elected officials going, nope, nope, nope, that wasn't me; that was fake.
Dave Bittner: [00:29:15] Who you going to believe...
Joe Carrigan: [00:29:16] Right.
Dave Bittner: [00:29:16] ...Me or your lying eyes?
Joe Carrigan: [00:29:18] Right. I agree that deepfakes are going to be a big problem, but that's a few years away, I think. What I will say is I'm not worried about deepfakes impacting the 2020 election, but I am worried about them impacting the 2024 election. One of the very interesting things that I hadn't considered about deepfakes is what Lisa described with Elon Musk - it's a great way to make money with a short-sale scam. I short a bunch of stock of some company.
Dave Bittner: [00:29:42] Right.
Joe Carrigan: [00:29:42] And then I release a fake video of their CEO doing something stupid or saying something damaging about the company. The market panics. I buy back the stock at a lower price, and I profit.
Dave Bittner: [00:29:55] Yeah.
Joe Carrigan: [00:29:56] That's a very real scenario and a very easy-to-implement scenario.
Dave Bittner: [00:30:00] Yeah. And I wonder if it's something that, for example, the SEC needs to aim its enforcement at if it becomes a problem. I haven't seen any reports of people overtly doing that, but it certainly has to be on their radar.
Joe Carrigan: [00:30:11] No, I don't think this has been done yet, but it is something that's going to happen. And I think the SEC should be paying attention to this, and it should be on their radar, just like you said. Let me ask you a question, Dave. Are you buying what Lisa was talking about with voice skins?
Dave Bittner: [00:30:25] You know, we covered that a few episodes ago.
Joe Carrigan: [00:30:28] We did.
Dave Bittner: [00:30:28] And I remain skeptical.
Joe Carrigan: [00:30:30] You remain skeptical.
Dave Bittner: [00:30:31] I do. The story she mentions got a lot of coverage, and a lot of that coverage concluded that this is what the attackers were doing. But again, I remain skeptical. I think it's easier to just employ a person, an actor...
Joe Carrigan: [00:30:47] Right.
Dave Bittner: [00:30:47] ...Who's good at mimicking someone.
Joe Carrigan: [00:30:49] Yep.
Dave Bittner: [00:30:50] That's going to be quicker. It's going to be cheaper. You're going to be more nimble in your ability to generate responses.
Joe Carrigan: [00:30:57] Right.
Dave Bittner: [00:30:57] So is it possible? Yeah, probably. But I think that's probably not what's going on yet.
Joe Carrigan: [00:31:04] If a voice skin is something that I can put between my mouth and somebody else's ear, then maybe.
Dave Bittner: [00:31:10] Right.
Joe Carrigan: [00:31:10] But I ought to look more into this voice skin. I'll do some research.
Dave Bittner: [00:31:12] Yeah, we're just - I don't think we're there yet.
Joe Carrigan: [00:31:15] No.
Dave Bittner: [00:31:15] And, actually, I spoke with a researcher, oh, about two weeks ago who works in this area.
Joe Carrigan: [00:31:21] Right.
Dave Bittner: [00:31:21] And he was saying exactly that - that that's not what's happening yet. The bad guys are going to go with what's easiest.
Joe Carrigan: [00:31:27] Yeah.
Dave Bittner: [00:31:28] And right now, it's easiest to just get somebody who's a pretty good mimic, so...
Joe Carrigan: [00:31:31] Yes, I would agree. I did see an interesting article - you can just Google this and find it. There's an article on CNET, though I don't think the original article I read was on CNET. But researchers are actually using, or thinking about using, mice to detect faked voices.
Dave Bittner: [00:31:47] Go on.
Joe Carrigan: [00:31:48] So what happens is when you and I communicate and we listen to a voice, you and I are listening to a bunch of different factors about that voice. We're listening to the message. We're listening to the tone of the voice.
Dave Bittner: [00:31:58] Yeah.
Joe Carrigan: [00:31:58] We're listening to the inflection. And you and I are trying to catch all the nonverbal and verbal things that are coming from that voice into our ears.
Dave Bittner: [00:32:06] Right.
Joe Carrigan: [00:32:07] Mice do not have that problem, right?
Dave Bittner: [00:32:09] OK.
Joe Carrigan: [00:32:10] Mice just hear noises. And there are artifacts in those sounds - artifacts that indicate the voice is fake - that you and I will probably miss, but mice can be trained to find them.
Dave Bittner: [00:32:21] How do you get them to tell you (laughter)?
Joe Carrigan: [00:32:23] You put them in a box and you condition them to push a button for a real voice and push another button for a fake voice.
Dave Bittner: [00:32:30] I see.
Joe Carrigan: [00:32:30] You train them just like an AI model.
Dave Bittner: [00:32:32] Yeah.
Joe Carrigan: [00:32:33] Right?
Dave Bittner: [00:32:33] They get a reward for getting it right.
Joe Carrigan: [00:32:34] They get rewarded when they're right.
Dave Bittner: [00:32:35] OK.
Joe Carrigan: [00:32:35] And quickly, they can identify it. That's the idea. They actually got it from another experiment where pigeons were trained to find breast cancer cells. And I don't know how I would feel if I found out that my diagnosis was given by a pigeon.
Dave Bittner: [00:32:49] Yeah. Also sounds painful, but...
Joe Carrigan: [00:32:51] No, they - looking at images.
Dave Bittner: [00:32:53] Oh, I see. Right, right. Right, right.
Joe Carrigan: [00:32:54] Images of cells.
Dave Bittner: [00:32:55] OK.
Joe Carrigan: [00:32:55] And the pigeons can identify cancer cells.
Dave Bittner: [00:33:00] (Laughter) Madam, we've got bad news.
Joe Carrigan: [00:33:01] Right.
Dave Bittner: [00:33:04] We brought in an expert.
Joe Carrigan: [00:33:05] Right.
Dave Bittner: [00:33:06] This pigeon - Flappy - yeah. Oh, my.
Joe Carrigan: [00:33:11] No, they actually do a good job because they're not - they're looking for things that people aren't looking for.
Dave Bittner: [00:33:15] Right, unencumbered by the thought process (laughter).
Joe Carrigan: [00:33:17] Exactly. That's exactly right. They're unencumbered by thought, and they can do it.
Dave Bittner: [00:33:22] All right. Well, again, thanks to Lisa Forte for joining us and Carole Theriault for bringing us another interesting story. That is our show. We want to thank all of you for listening.
Dave Bittner: [00:33:32] And, of course, we want to thank our sponsors KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.
Dave Bittner: [00:33:54] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:34:08] And I'm Joe Carrigan.
Dave Bittner: [00:34:09] Thanks for listening.