Hacking Humans 5.23.19
Ep 50 | 5.23.19

People aren't perfectly rational.

Transcript

Elissa Redmiles: [00:00:00] There's been a bunch of research showing that making people reset their password every three months makes it more likely that they're going to reuse passwords across websites, and that's actually more detrimental than letting them have the same password.

Dave Bittner: [00:00:14] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:34] Hi, Dave.

Dave Bittner: [00:00:34] We've got some fun stories to share this week. And later in the show, we have Joe's interview with Elissa Redmiles. She's an incoming professor of computer science at Princeton, and she studies behavioral modeling to understand why people behave the way they do online.

Dave Bittner: [00:00:49] But first, a word from our sponsors, KnowBe4. Step right up and take a chance. Yes, you there. Give it a try and win one for your little friend there. Which were the most plausible subject lines in phishing emails? Don't be shy. Were they, A, my late husband wished to share his oil fortune with you, or B, please read - important message from HR, or C, a delivery attempt was made, or D, take me to your leader? Stay with us, and we'll have the answer later. And it will come to you courtesy of our sponsors at KnowBe4, the security awareness experts who enable your employees to make smarter security decisions.

Dave Bittner: [00:01:34] And Joe, we are back. Before we get to our stories today, I have a couple of little bits of business to attend to. First of all, I want to thank the folks at KnowBe4 for hosting our show last week at their KB4-CON.

Joe Carrigan: [00:01:46] That was awesome.

Dave Bittner: [00:01:46] Had a lot of fun. And thanks to everybody who came out there. It was really great to meet so many listeners face to face.

Joe Carrigan: [00:01:52] Yeah. Yeah, it was. We all need shirts that say, I'm not Dave, on 'em, though.

(LAUGHTER)

Dave Bittner: [00:01:57] Right. Yeah. That's the people - question people ask, are you Dave?

Joe Carrigan: [00:02:00] Right.

Dave Bittner: [00:02:01] No. No. I'm - well, I just need a shirt that just says, Dave.

Joe Carrigan: [00:02:04] Right.

(LAUGHTER)

Dave Bittner: [00:02:06] And the rest of you have shirts that say, not Dave.

Joe Carrigan: [00:02:09] Not Dave. Right. Just an exclamation point, Dave.

Dave Bittner: [00:02:11] Yeah. Well, we'll get on that.

Joe Carrigan: [00:02:12] (Laughter).

Dave Bittner: [00:02:13] All right. So beyond that, we got a note from a listener named Yahooda (ph). And he wrote and he said, (reading) Hi, guys. Your podcast is amazing, and I'm working on catching up on the backlog of episodes.

Joe Carrigan: [00:02:23] Awesome.

Dave Bittner: [00:02:23] (Reading) Along the way, I started wondering how much my paranoia over security was having an effect on those closest to me, especially my wife.

Joe Carrigan: [00:02:30] (Laughter).

Dave Bittner: [00:02:31] (Reading) I needed to know if my wife would hold up to a phish from someone impersonating me. I created a new Gmail account with only one letter different from my own and sent her the following. Hey. I don't seem to have your Gmail password in my LastPass anymore. Can you please send it to me so I can add it? I'm glad to say that she responded by only giving me a small hint to the password. I followed up and tried to push my luck by saying, I'm blanking on it. What is it again? I got a WhatsApp message from her letting me know that she'll just add the password into our LastPass account. Thank you, guys, for the podcast, and just know that it makes a difference, not just for those of us who listen, but also to those around us. Keep up the great work. Yahooda.

Joe Carrigan: [00:03:13] His wife is pretty savvy.

Dave Bittner: [00:03:14] His wife is pretty savvy. And I have to say, when you decide that the way to check on something with your spouse is deception, what could possibly go wrong?

(LAUGHTER)

Dave Bittner: [00:03:26] But sounds like all's well that ends well (laughter). And...

Joe Carrigan: [00:03:29] I wonder how he responded to that WhatsApp message. Did he go, what do you mean?

Dave Bittner: [00:03:32] Yeah.

Joe Carrigan: [00:03:32] Did he try to play both sides (laughter) of this?

Dave Bittner: [00:03:34] No. It sounds like she's got his number. So...

Joe Carrigan: [00:03:36] Right.

Dave Bittner: [00:03:37] Good for her. Good for both of them.

Joe Carrigan: [00:03:38] Yeah, but that's some good security practice right there.

Dave Bittner: [00:03:41] Yeah. Yeah. Good for her. She did all the right things. And they're using a password manager so...

Joe Carrigan: [00:03:44] And that's excellent.

Dave Bittner: [00:03:45] (Laughter).

Joe Carrigan: [00:03:46] Excellent.

Dave Bittner: [00:03:46] (Laughter) Yeah. Yeah.

Joe Carrigan: [00:03:47] Congratulations to both of you.

Dave Bittner: [00:03:48] Yeah. All right. Well, let's move on to our stories. Joe, what do you have for us this week?

Joe Carrigan: [00:03:52] Well, today I'm talking about a blog post from F-Secure - and we'll put a link in the show notes. During their routine monitoring over the last three months, F-Secure has noticed some interesting patterns in the attachment types used in spam and phishing emails. So if you go to the blog post, there's a nice pie chart of all the types of malicious attachments. Take a guess. What do you think is No. 1?

Dave Bittner: [00:04:11] Hmm. Let's see. No. 1 type of attachment? I would say some kind of office document.

Joe Carrigan: [00:04:17] That's exactly right. No. 1 is office documents. If you...

Dave Bittner: [00:04:21] OK.

Joe Carrigan: [00:04:21] ...Sum all those different pieces of the pie chart up, they're the biggest one.

Dave Bittner: [00:04:23] Right.

Joe Carrigan: [00:04:24] Followed closely by ZIP files and then PDFs.

Dave Bittner: [00:04:27] Hmm.

Joe Carrigan: [00:04:27] What's interesting in this post is that there's a little tiny sliver of it called ISOs. All right. We're going to come back to that 'cause that's really the main crux here.

Dave Bittner: [00:04:35] OK.

Joe Carrigan: [00:04:36] So they found some ZIP files that were delivering GandCrab ransomware...

Dave Bittner: [00:04:39] Right.

Joe Carrigan: [00:04:39] ...Office files delivering Trickbot, which is a banking trojan...

Dave Bittner: [00:04:42] Mmm hmm.

Joe Carrigan: [00:04:43] ...PDF files being used for phishing American Express, as well as those winner scams. Hey. You're a winner. Send me money, and I'll send you thousands or millions.

Dave Bittner: [00:04:52] Yep.

Joe Carrigan: [00:04:53] And then ISO and IMG files to deliver AgentTesla and the NanoCore RAT. Now, for those who are not familiar, an ISO file is an image file, usually of physical optical media like a CD or a DVD.

Dave Bittner: [00:05:07] So not an image, like, a graphics image.

Joe Carrigan: [00:05:09] Correct.

Dave Bittner: [00:05:09] Like, an imaged storage volume.

Joe Carrigan: [00:05:11] Exactly.

Dave Bittner: [00:05:12] OK.

Joe Carrigan: [00:05:12] We used to download these and then write them to a CD and, huzzah, you have an exact copy of the original CD.

Dave Bittner: [00:05:19] Right. Mmm hmm.

Joe Carrigan: [00:05:19] It works with DVDs, too.

Dave Bittner: [00:05:21] Yup.

Joe Carrigan: [00:05:21] These are great for Linux distributions, right? Because you just go out and get the install media immediately. You don't have to send away for it anymore. You can also buy tons of software in this format.

Dave Bittner: [00:05:31] Right.

Joe Carrigan: [00:05:32] I believe Microsoft sells it like this. VMware sells it like this. It's a very common way to distribute operating systems and other software. But what's interesting is that now, once you acquire this ISO, however you get it, you don't have to burn it to a CD or a DVD anymore.

Dave Bittner: [00:05:46] Mmm. Right.

Joe Carrigan: [00:05:46] Most modern operating systems can just mount it like it was a piece of physical media.

Dave Bittner: [00:05:51] Right. OK.

Joe Carrigan: [00:05:52] Right? This is great because it gets you the software right away, and you don't have to go searching around for a blank DVD. It's really tough to find those things anymore, right?

Dave Bittner: [00:06:00] Well, and it's probably faster, rather than running off the optical media...

Joe Carrigan: [00:06:03] Right.

Dave Bittner: [00:06:04] Right? To do the install, it's probably faster running off your hard drive.

Joe Carrigan: [00:06:07] Yeah. If you put this thing on an SSD, the data's going to come off that a lot faster than it will off an optical drive. That's 100% correct.

Dave Bittner: [00:06:12] Yeah.

Joe Carrigan: [00:06:13] So that's easy, right? Well, authors of malware have realized that it's easy, too.

Dave Bittner: [00:06:18] Mmm.

Joe Carrigan: [00:06:18] Right? And that's the problem. And F-Secure notes that since July of 2018, there has been an increasing trend of attackers using these disk images to deliver malware. It's a small sliver of the pie, but here's my security expert prediction.

Dave Bittner: [00:06:32] OK.

Joe Carrigan: [00:06:32] That piece of the pie is going to grow.

Dave Bittner: [00:06:34] How come?

Joe Carrigan: [00:06:35] Because it's becoming more common for these operating systems to be able to mount these things easily, and people are not viewing this as an attack vector. They're cautious of, you know, .DOC and .DOCX. They're cautious of Microsoft files.

Dave Bittner: [00:06:49] Yeah.

Joe Carrigan: [00:06:49] Right? They're cautious of PDFs. But they may not be as cautious of ISOs.

Dave Bittner: [00:06:54] Mm. Interesting.

Joe Carrigan: [00:06:55] So I think this represents a risk. And I just want to tell everybody that, yes, an ISO can contain a malicious payload, as well. In fact, most recently, they have seen campaigns using this technique to deliver AgentTesla, which is an info stealer, and, of course, NanoCore RAT, which we talked about earlier.

Dave Bittner: [00:07:10] Yeah. With an ISO image file, when that mounts and the OS thinks that it is a physical volume...

Joe Carrigan: [00:07:18] It is a physical volume. It's a valid ISO file.

Dave Bittner: [00:07:20] Is it possible for something, when that is mounted, to auto execute?

Joe Carrigan: [00:07:25] It is possible for that to happen. I don't know how much that happens now. I think Microsoft has disabled a lot of what used to be done with the autorun.inf file on these things. I don't know if that's a real concern. But what F-Secure has observed is that the ISO contains an executable file called recentpayment2019.exe.

Dave Bittner: [00:07:43] Mmm.

Joe Carrigan: [00:07:45] Right?

Dave Bittner: [00:07:45] Mmm hmm.

Joe Carrigan: [00:07:46] And if you execute that file, it installs AgentTesla. So it's the same old tricks; they're just using a new vector, the ISO. So you still have to click on the EXE to get it to run in this campaign. What's interesting is they were also distributing this ISO right alongside a malicious Word document.

Dave Bittner: [00:08:04] Hmm.

Joe Carrigan: [00:08:05] So if you enabled macros, you got the malware. If you opened the ISO file and ran it, you got the malware.

Dave Bittner: [00:08:11] Yeah. All right. Well, something to look out for.

Joe Carrigan: [00:08:14] Be aware that any file you get can be malicious.
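For readers who want a concrete sense of what sets these attachments apart from ordinary documents, here is a minimal Python sketch that checks whether a saved attachment is really an ISO 9660 disc image by looking for the "CD001" volume descriptor signature such images carry. The filename is a hypothetical example; this only illustrates the idea and is not a tool from the F-Secure post.

# Minimal sketch: spot attachments that are really ISO 9660 disc images,
# the kind of file described as delivering AgentTesla and NanoCore.
# The filename below is a hypothetical example.

def looks_like_iso(path: str) -> bool:
    """Return True if the file carries the ISO 9660 volume descriptor magic.

    ISO 9660 volume descriptors begin at byte offset 32768 (sector 16 of a
    2048-byte-sector image); bytes 1-5 of the first descriptor are b"CD001".
    """
    try:
        with open(path, "rb") as f:
            f.seek(32769)  # skip the one-byte descriptor type field
            return f.read(5) == b"CD001"
    except OSError:
        return False

if __name__ == "__main__":
    attachment = "suspicious_attachment.img"  # hypothetical path
    if looks_like_iso(attachment):
        print(f"{attachment}: disc image - treat it as carefully as an executable.")
    else:
        print(f"{attachment}: no ISO 9660 signature found.")

A mail filter or a cautious user could run the same check before double-clicking anything that mounts as a volume.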

Dave Bittner: [00:08:17] Yeah. All right. My story this week is about some text messages that are being sent out impersonating local hospitals.

Joe Carrigan: [00:08:25] Hmm.

Dave Bittner: [00:08:25] This comes from WMTV, which is from Monroe, Wis., and there's a text scam that's been making the rounds out there. It's the bad guys claiming that friends or relatives are in the hospital and seriously ill.

Joe Carrigan: [00:08:41] Huh.

Dave Bittner: [00:08:41] I have one of the text messages here. It says, (reading) this is Platteville hospital messaging. Your relatives is with us now. Condition is worrying. You should call us ASAP.

Dave Bittner: [00:08:52] And then there's a link. What's interesting about the link is that it's a .FUN domain, which I don't believe hospitals generally...

Joe Carrigan: [00:09:00] Right.

Dave Bittner: [00:09:00] ...Use.

Joe Carrigan: [00:09:02] (Laughter).

Dave Bittner: [00:09:02] So that should be a...

Joe Carrigan: [00:09:03] I don't think we use it.

Dave Bittner: [00:09:04] No.

Joe Carrigan: [00:09:04] (Laughter).

Dave Bittner: [00:09:04] I don't even know what that is. But that would be a clue. I guess, as always with these things, by the time you get to that point, I could imagine someone seeing this and saying, oh, my gosh...

Joe Carrigan: [00:09:16] Right.

Dave Bittner: [00:09:16] ...One of my relatives is in the hospital...

Joe Carrigan: [00:09:18] Mmm hmm.

Dave Bittner: [00:09:18] ...And they're in bad shape. I better click on this to see...

Joe Carrigan: [00:09:22] See what's going on.

Dave Bittner: [00:09:22] See what it's about. And when you click on it, of course, bad things happen.

Joe Carrigan: [00:09:27] It's a malicious...

Dave Bittner: [00:09:27] It leads to something malicious. Yeah. And of course, the folks from the Wisconsin Department of Agriculture, Trade and Consumer Protection are saying that you should never click on links sent by someone you don't know...

Joe Carrigan: [00:09:38] Right.

Dave Bittner: [00:09:38] ...And that these things are scams.

Joe Carrigan: [00:09:41] Of course, they are.

Dave Bittner: [00:09:42] Yeah.

Joe Carrigan: [00:09:42] This has a couple of red flags.

Dave Bittner: [00:09:43] Yeah. The short circuiting of...

Joe Carrigan: [00:09:45] Right. Exactly. You know, the idea here is that they're going to get you to be afraid for your relatives, which is a very strong fear that almost all of us have.

Dave Bittner: [00:09:54] Right.

Joe Carrigan: [00:09:54] Right? What's interesting is there's no specific information in here. I wonder if the vaguity (ph) of it helps play on the fear.

Dave Bittner: [00:10:00] I think it does.

Joe Carrigan: [00:10:01] Yeah.

Dave Bittner: [00:10:02] I think it does because you fill in the gaps...

Joe Carrigan: [00:10:04] Right.

Dave Bittner: [00:10:04] ...Yourself. I mean, all of us probably have - we all have plenty of relatives.

Joe Carrigan: [00:10:08] Mmm hmm.

Dave Bittner: [00:10:09] Right? And the first thing it goes to is, who could it be? I need to know who it is. Is it...

Joe Carrigan: [00:10:12] Yeah. Let's find out who this is. Is this somebody I care about? Or maybe it's a cousin. You know.

Dave Bittner: [00:10:18] Yeah, (laughter) who I'd be OK with being...

(LAUGHTER)

Joe Carrigan: [00:10:21] Right (laughter).

Dave Bittner: [00:10:22] Finally, I'll inherit my fortune from that long-lost cousin I never liked, anyway.

Joe Carrigan: [00:10:27] Right (laughter).

Dave Bittner: [00:10:29] Yeah. Yeah. No, it's horrible (laughter). But, yeah. I think you're right. That ambiguity - but also, like you said, a couple of red flags here - there's the slightly broken English.

Joe Carrigan: [00:10:37] Right.

Dave Bittner: [00:10:37] Your relatives is with us now.

Joe Carrigan: [00:10:39] Right. The .FUN domain for a hospital.

Dave Bittner: [00:10:41] Yeah.

Joe Carrigan: [00:10:43] You know what? There is no fun in a hospital.

Dave Bittner: [00:10:46] (Laughter) That's right. That's true. That's true. What are you looking to do this weekend, Joe? I don't know, why don't we go hang out at the hospital? That's a...

Joe Carrigan: [00:10:53] Right. Sounds like good fun.

Dave Bittner: [00:10:54] ...Good time. I suppose - well, maybe newborn babies. That's...

Joe Carrigan: [00:10:58] Yeah. That's pretty cool.

Dave Bittner: [00:10:58] Once they've arrived. The actual - the actual delivery can be an ordeal.

Joe Carrigan: [00:11:01] Yeah. That's not fun.

(LAUGHTER)

Joe Carrigan: [00:11:03] I've been through that twice.

Dave Bittner: [00:11:04] Yeah. Boy.

Joe Carrigan: [00:11:05] Well, I've watched my wife go through it twice.

Dave Bittner: [00:11:06] Yeah. That's bad enough. Right?

Joe Carrigan: [00:11:08] (Laughter) That's right.

Dave Bittner: [00:11:09] Well, that is my story. We'll have a link to it in the show notes, of course. It's a quick one, but important. That one's making the rounds. So make sure you warn your friends and family not to fall for a message claiming to be from a hospital.

Joe Carrigan: [00:11:21] Right.
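The red flags called out above - a hospital texting from a .fun address, for instance - can also be checked mechanically before anyone taps a link. Here is a minimal Python sketch that pulls the hostname out of a message and flags top-level domains you would not expect from a hospital; the sample message, URL, and domain list are invented for illustration.

# Minimal sketch: flag unlikely top-level domains in an SMS-style message.
# The message text, URL, and TLD list below are invented for illustration.
import re
from urllib.parse import urlsplit

SUSPICIOUS_TLDS = {"fun", "xyz", "top", "icu"}  # illustrative, not exhaustive

def link_red_flags(message):
    """Return warnings for any URL in the message with an unexpected TLD."""
    warnings = []
    for url in re.findall(r"https?://\S+", message):
        host = urlsplit(url).hostname or ""
        tld = host.rsplit(".", 1)[-1].lower()
        if tld in SUSPICIOUS_TLDS:
            warnings.append(f"{host}: a .{tld} domain is an unlikely sender for a hospital")
    return warnings

sms = ("This is Platteville hospital messaging. Your relatives is with us now. "
       "Condition is worrying. You should call us ASAP http://hospital-alerts.fun/visit")
for warning in link_red_flags(sms):
    print(warning)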

Dave Bittner: [00:11:22] All right. Time to move on to our Catch of the Day.

(SOUNDBITE OF REELING IN FISHING LINE)

Dave Bittner: [00:11:28] Joe, our Catch of the Day this week comes from a listener who's also named Joe.

Joe Carrigan: [00:11:31] Ah.

Dave Bittner: [00:11:32] He wrote in (laughter) with this. He said, thought you might appreciate this little altercation I had with an obviously fake profile before I blocked and reported it to LinkedIn.

Dave Bittner: [00:11:42] Now, before we go through this, Joe, I will be the person trying to do the scam here, and you can play the part of Joe because you are a Joe.

Joe Carrigan: [00:11:49] Yes. I see this Joe at the meetings.

Dave Bittner: [00:11:53] (Laughter) Right. Yeah. So this is a LinkedIn request. And there's a picture of a woman here, and she looks like quite an accomplished woman. She's standing in front of a fighter jet.

Joe Carrigan: [00:12:02] Right.

Dave Bittner: [00:12:03] ...In her pilot uniform - attractive lady. Says Bree Scott is the name. And it says, I'm a United State Army sergeant. So I will start off by reading the part from Bree. My name is Bree Scott, United State Army sergeant. I like meet you and to be your friend in a sincere and mutual friendship between us. Contact me on my email address. When I am less busy, I will writ you.

Joe Carrigan: [00:12:29] Hey, Bree. That's fascinating. Which regiment do you serve?

Dave Bittner: [00:12:33] Email is better to enable me express myself very well to you.

Joe Carrigan: [00:12:36] Oh, OK. Why is that? Was your regiment close to LinkedIn headquarters territory?

Dave Bittner: [00:12:41] Write to my email. And that's where it ends. That's where Joe reported it. Couple of things here - obviously all kinds of red flags here, right?

Joe Carrigan: [00:12:49] Right. Yeah, yeah.

Dave Bittner: [00:12:50] (Laughter) The United State Army. But I did a little digging with this one - just a quick Google search. And it turns out that Bree Scott is evidently a well-known author of love poetry, so I guess our scammers get a few style points there for running their romance scam under the name of an author of love poetry.

Joe Carrigan: [00:13:07] Right. Well-done.

Dave Bittner: [00:13:08] Yeah. I also did a Google reverse image search of the profile picture. And of course, Google has a function where you can upload an image, and Google will attempt to match it with the things they have in their database.

Joe Carrigan: [00:13:19] Yes.

Dave Bittner: [00:13:20] And sure enough, I found the original source image. And it turns out that the woman in this image is actually - her name is Amanda Weeks (ph), and she was the lead solo pilot of the Air Force Thunderbirds 2008 demonstration team.

Joe Carrigan: [00:13:32] OK.

Dave Bittner: [00:13:33] So she is actually a badass pilot (laughter).

Joe Carrigan: [00:13:36] Right. Right (laughter).

Dave Bittner: [00:13:37] But I would venture to say that it's highly unlikely that she's out there trolling for lonely men on LinkedIn.

Joe Carrigan: [00:13:42] As a sergeant.

Dave Bittner: [00:13:43] Yeah, right. She probably has...

Joe Carrigan: [00:13:44] She's an officer.

Dave Bittner: [00:13:45] She has opportunities. Yeah, I'm sure she...

Joe Carrigan: [00:13:47] (Laughter).

Dave Bittner: [00:13:47] I don't know anything about her, but certainly an accomplished woman...

Joe Carrigan: [00:13:51] Right.

Dave Bittner: [00:13:51] ...Who wouldn't have to be scraping around on LinkedIn for the bottom of the barrel. So - not that you're the bottom of the barrel, Joe.

Joe Carrigan: [00:13:57] Right (laughter).

Dave Bittner: [00:13:58] Or any Joe, for that matter. Yeah (laughter).

Joe Carrigan: [00:14:00] I'm fascinated by the fact that this came through the LinkedIn messaging system. I've never received anything like this on LinkedIn.

Dave Bittner: [00:14:07] Yeah.

Joe Carrigan: [00:14:07] I mean, I get spam all the time on LinkedIn, and I get recruiters contacting me all the time on LinkedIn.

Dave Bittner: [00:14:11] Right. Right.

Joe Carrigan: [00:14:12] Never this. I never get a romance scam starting on LinkedIn.

Dave Bittner: [00:14:14] No, I haven't either. The scammiest thing I get on LinkedIn is people who want to try to help me promote my podcast.

Joe Carrigan: [00:14:20] Yeah, yeah.

Dave Bittner: [00:14:21] (Laughter) I get that all the time.

Joe Carrigan: [00:14:21] That's pretty scammy too.

Dave Bittner: [00:14:22] Yeah, they are. They are. So - all right. Well, that is our catch of the day. Coming up next, we've got Joe's interview with Elissa Redmiles. She's an incoming professor of computer science at Princeton, and she studies behavioral modeling to understand why people behave the way they do online.

Dave Bittner: [00:14:40] But first, a word from our sponsors, KnowBe4. And what about the biggest, tastiest piece of phish bait out there? If you said, A, my late husband wished to share his oil fortune with you, you've just swallowed a Nigerian prince scam. But most people don't. If you chose door B, please read - important message from HR, well, you're getting warmer. But that one was only No. 10 on the list. But pat yourself on the back if you picked C, a delivery attempt was made. That one, according to the experts at KnowBe4, was the No. 1 come-on for spam email in the first quarter of 2018. What's that? You picked, D, take me to your leader? No, sorry. That's what space aliens say. But it's unlikely you'll need that one unless you're doing "The Day the Earth Stood Still" at a local dinner theater. If you want to stay on top of phishing's twists and turns, the new-school security awareness training from our sponsors at KnowBe4 can help. That's knowbe4.com/phishtest.

Dave Bittner: [00:15:49] And we are back. Joe, you recently had the pleasure of speaking with Elissa Redmiles.

Joe Carrigan: [00:15:54] Right.

Dave Bittner: [00:15:54] Tell us a little bit about her.

Joe Carrigan: [00:15:55] She's an incoming professor of computer science at Princeton, and she studies the economic decisions that people make with their security online. It's a pretty good interview. Let's check it out.

Dave Bittner: [00:16:06] All right.

Joe Carrigan: [00:16:07] What have you found as to why people do not behave in a secure manner? What are the factors that influence that?

Elissa Redmiles: [00:16:12] So there are a couple of big factors. One of them is kind of socioeconomic status and resources. So one of the things that we see is there's a pretty big digital divide in which people who don't have certain levels of education or even certain levels of experience and skill online struggle with things like identifying spams and scams or may have difficulty understanding and taking action on security advice.

Elissa Redmiles: [00:16:38] Another large factor that we see is that we often as security professionals don't make particularly economically backed tradeoffs when we're thinking about asking people to do security. So we're sort of asking people to do a never-ending list of things without archiving old ones or measuring exactly how much this new behavior is going to help someone. So eventually users become overwhelmed, and then they just try to pick between behaviors on their own, which they may not be very well-equipped to do.

Joe Carrigan: [00:17:07] Do you have an example of security behavior that we shouldn't be using anymore that people still use?

Elissa Redmiles: [00:17:11] Sure. So one of the big ones is password expiration policies. So there's been a bunch of research showing that making people reset their password every three months makes it more likely that they're going to reuse passwords across websites. And that's actually more detrimental than letting them have the same password because unless that password has been breached, there's really no harm to it. And there's a better chance that someone will create a strong password for each site and be able to remember it if they don't have to keep learning new ones.

Joe Carrigan: [00:17:39] You also found in your research that there were differences in vulnerability to spam.

Elissa Redmiles: [00:17:43] Yes. So in a large-scale study I did with Facebook looking at why people fall for spam, what we saw were a few things. And so one was that we observed that women fall for spam on Facebook far more often than men. And we looked into why that was the case. And it turns out that there's not necessarily a difference in skill or anything else between men and women. But rather, women get shown a lot more shopping spam, and men get shown a lot more media spam, things like videos of gory things like beheadings. And those videos are much easier to detect as spam than the shopping spam because you see lots of shopping posts on Facebook all the time. And so spammers are really smart in knowing that people's interests are going to align them in certain ways. So the news feed is going to route shopping spam, for example, to women, and therefore they're going to be more vulnerable. And they're going to target their calls to action and what they're trying to get out of this spam precisely to who they know is going to be their consumer.

Elissa Redmiles: [00:18:42] The other thing we found was that users who are in particular countries where there was a lot of spam or users who had low internet skill or low resources were much more likely to fall for spam because they didn't necessarily have the ability to detect the heuristics that can help people figure out this doesn't really look like something I should be pursuing. Whereas once you have more experience on the platform and online and more kind of general internet skill, you've gotten enough experience with different kinds of content to be able to sort of pick out the things that look suspicious.

Joe Carrigan: [00:19:15] So what do you recommend for people who may be new to a platform or new to the internet - how do you recommend they defend themselves against these kinds of attacks?

Elissa Redmiles: [00:19:23] Honestly, a lot of burden for that is actually on the platforms themselves. So there's different training programs that various companies are exploring right now for trying to kind of onboard new users who are coming onto the internet or onto a platform to give them sort of general skills. So this isn't necessarily, like, the phishing trainings that you'll see in enterprises, although those are effective as well. But this is even before that, helping people explore the platform in sort of a guided way such that they do have some sense of what usually would be coming online.

Elissa Redmiles: [00:19:55] The other thing we've seen is very helpful for people is to have some sort of security advocate identified in their network. So this is someone who they believe has expertise and hopefully actually does have some expertise, perhaps a background in computer science or in IT - if that person is available in their network. Or, in the case of social networks, some companies have been exploring sort of assigning one of these kind of advocates to people who are new to the platform so that if they see something suspicious, they can ask someone to check on it before they interact with it.

Joe Carrigan: [00:20:27] So in terms of training, like, you're - are you envisioning something like Facebook would have - here's a video that you should watch before you use our platform?

Elissa Redmiles: [00:20:34] So more interactive, probably. So one thing that they currently have at Facebook actually is a digital skills training center. It's focused around small business owners. But it actually sort of removes certain features from the platform and shows people those features individually before giving them sort of the full scope of what's available.

Joe Carrigan: [00:20:55] Now, you said that telling people not to click on links is not a viable solution. Frequently, that's what we say, particularly with emails, on this podcast. Are you referring to emails, or are you referring to spam messages in Facebook Messenger or direct messages from Twitter?

Elissa Redmiles: [00:21:09] So often the advice will be kind of more broad. And it'll say, oh, you should be suspicious of links that you find online, which is a good practice that will keep you safe, or, you shouldn't click on links that you don't know. But in the ways that people actually explore the internet or use email, we're every day kind of using other heuristics that aren't necessarily about the link in order to figure out what to do and not to do. So if I were to tell a regular user, especially one new to the internet, well, you should not click on links you get in email, that would negate them in some ways being able to use their email in the ways that they want to. So we're hoping to find and identify more specific pieces of advice that people can bake into their advice documents that get a little bit more narrow about what people should do without giving these sort of broad things that may not feel tangible enough to the users.

Joe Carrigan: [00:22:00] So you also did some research on the rationality of people in terms of using two-factor authentication. What did you find there?

Elissa Redmiles: [00:22:07] So we often talk about user security behavior as being kind of random or irrational, or perhaps they're just not focusing on security, and therefore this is why they don't make secure choices. And so I wanted to test the degree to which economic models of rationality or bounded rationality fit how people made choices about two-factor authentication. And in order to do this, I had people create accounts in an online system that function like a bank account so they would receive money for participating in the study. And the money they were going to receive lived in this bank account that they were creating, and they had the option to enable two-factor authentication for the account. And in these experiments, they were given an explicit risk that the account was going to be hacked.

Elissa Redmiles: [00:22:52] So say your account has a 20% chance of being hacked over the course of the study. And when they were offered two-factor authentication, they were told that if they enabled two-factor, it would reduce their risk of hacking by some explicit amount. And we varied that explicit amount between 50%, which is a coin flip, and 90% or near-perfect protection. And what we find is that people don't behave perfectly rationally. So if we were to compute how much time they spent on two-factor times their hourly wage - so, like, how much time they lost from not doing other experimental tasks while they were doing our two-factor - and compared that to their protection that they gained of their study incentives, we see that they behaved strictly rationally, where they only do two-factor if doing so is cheaper than just getting hacked, about 50% of the time. But despite the fact that they're not perfectly rational, which isn't maybe so surprising, we do see that they're boundedly rational.

Elissa Redmiles: [00:23:46] So we can explain their behavior very well as a function of five factors, which are the risks and the costs we told them - so typical kind of rationality factors - as well as some cognitive biases. So people are biased by something called the endowment effect. So if we give them a bunch of money at the start of the study and they're protecting this large sum of money to begin with, they are more likely to enable two-factor authentication than if they have to earn that money through the course of the study. And this is something that pops up in behavioral economics experiments in other contexts. People behave differently around money they already have than money they're earning. The other bias that we see is that people are very biased based on the types of risks we present to them. So something that seems very risky and that they've encountered a negative experience with in the past is going to make them more likely to engage in a protective behavior.

Elissa Redmiles: [00:24:42] And finally, they like to stick with behaviors that they already do. So they anchor to what they've done in the past. So if they did our study multiple times over a period of a month, they tend to stick with whatever they did the first time. So if there's someone who tends to do two-factor authentication, they want to stick with that. If they're someone who usually doesn't, they want to stick with that. Although, they will adjust if we put them in an extremely high- or extremely low-risk setting. So it may be worth waiting a couple of weeks to prompt me to enable two-factor until I actually value the account. I may be much more likely to do it then than if you start prompting me right away - I get in the habit of saying no, and then even once the account has gained value, I'm like, well, I usually say no, so let me just keep saying no.
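To make the trade-off Redmiles describes concrete - time spent on two-factor, valued at an hourly wage, versus the expected loss it prevents - here is a small worked example in Python. The numbers are invented for illustration and are not figures from her study.

# Worked example of the cost-benefit comparison described in the interview.
# All numbers are invented for illustration; they are not from the study.

def two_factor_worth_it(balance, hack_risk, risk_reduction,
                        minutes_spent, hourly_wage):
    """Strictly 'rational' choice: enable 2FA only if the time it costs is
    less than the reduction in expected loss it buys."""
    time_cost = (minutes_spent / 60.0) * hourly_wage
    expected_loss = balance * hack_risk
    loss_avoided = expected_loss * risk_reduction
    return time_cost < loss_avoided

# Protecting $20 with a 20% chance of being hacked, offered 2FA that cuts the
# risk by 90% and takes 5 minutes at a $15/hour opportunity cost:
# time cost = 5/60 * 15 = $1.25; loss avoided = 20 * 0.20 * 0.90 = $3.60
print(two_factor_worth_it(balance=20.0, hack_risk=0.20, risk_reduction=0.90,
                          minutes_spent=5, hourly_wage=15.0))  # True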

Dave Bittner: [00:25:25] Wow. Interesting stuff, Joe - nice job there. She really had some good information to share.

Joe Carrigan: [00:25:30] Her research is very interesting, I think, and she has a lot of it online and you can look it up. One of the things that she talked about was, you know, the sunsetting of the old practices. You know, we have all these things that we tell people as security professionals, and it's time to start changing the old ways we do things.

Dave Bittner: [00:25:44] Mmm.

Joe Carrigan: [00:25:44] And her primary example is stop forcing people to change their passwords. And we've seen that recently, over and over again. Again, I'm going to say that that's - yeah. You don't force people to change their password. You make them pick good, strong, long passwords. But you as an individual changing your password with a password manager is good.

Dave Bittner: [00:26:02] Because?

Joe Carrigan: [00:26:03] Well, because that protects you from the breach you don't know about. But it's an individual policy. No one's forcing you to do it. And if you're using a password manager, you're going to get a long, complex password that's generated by the password manager.

Dave Bittner: [00:26:13] Yeah.

Joe Carrigan: [00:26:13] And it's transparent to you, and it's easy to do. But if you're not using a password manager, and you have a very good, long password for something, don't change that 'cause that makes it more difficult. Or don't force people to change their passwords for logging into your system.

Dave Bittner: [00:26:26] Right. Right.

Joe Carrigan: [00:26:26] 'Cause they're going to pick weak passwords.

Dave Bittner: [00:26:28] Right.
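As an aside on the "long, complex password generated by the password manager" point, here is a minimal sketch of that kind of generation using only Python's standard library; the length and character set are arbitrary illustrative choices, not a recommendation from the episode.

# Minimal sketch of how a password manager generates a long, random password.
# The length and alphabet are arbitrary illustrative choices.
import secrets
import string

def generate_password(length=24):
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # a different 24-character string on every call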

Joe Carrigan: [00:26:29] That's just the way it is. The less experience you have online, the more vulnerable you are. That's not really surprising.

Dave Bittner: [00:26:34] No. I guess not. No. You'll be less savvy...

Joe Carrigan: [00:26:36] Right.

Dave Bittner: [00:26:37] ...Less exposure to what people might be out there trying to do.

Joe Carrigan: [00:26:40] That's correct. Training and education are key. And there are lots of ways we could deliver the training. The platform could deliver it. You know, and when she says platform, I think she's talking about, like, Facebook and Twitter and those things.

Dave Bittner: [00:26:50] Right.

Joe Carrigan: [00:26:50] And of course, there's this podcast, which I think everybody on the planet should listen to.

Dave Bittner: [00:26:54] (Laughter) We're preaching to the choir.

Joe Carrigan: [00:26:57] Right. We're preaching to the choir. But spread the word, everybody. You know, we deliberately keep this podcast not too technical so it's approachable to just about everybody.

Dave Bittner: [00:27:04] Yeah.

Joe Carrigan: [00:27:04] The mission of this podcast is to help people not fall victim to this kind of stuff.

Dave Bittner: [00:27:08] Yeah. I think it's a good point, too, about the platforms, that...

Joe Carrigan: [00:27:11] The platforms, right.

Dave Bittner: [00:27:12] ...That an onboarding process could go a long way for teaching people how to better protect themselves.

Joe Carrigan: [00:27:19] Right. And I really like the idea of reaching into the very nature of the social platform and finding security experts within that area and having them mentor and help other people on the platform.

Dave Bittner: [00:27:30] Mmm hmm.

Joe Carrigan: [00:27:30] Although I don't know that I have time to do that.

(LAUGHTER)

Dave Bittner: [00:27:33] Well, you help people in other ways.

Joe Carrigan: [00:27:35] I do (laughter).

Dave Bittner: [00:27:37] And the whole thing about what motivates people to do things - just fascinating. Her notion of not prompting someone to enable two-factor until they have discovered the value in that account...

Joe Carrigan: [00:27:49] Right.

Dave Bittner: [00:27:50] That's fascinating. That's new to me.

Joe Carrigan: [00:27:52] Yeah. That is kind of new information. I thought that was a really good finding that she had.

Dave Bittner: [00:27:56] Mmm hmm.

Joe Carrigan: [00:27:56] Yeah. I think it's easy. Think about yourself setting up, like, a Gmail account. Right? I'm just going to set up a Gmail account, and if this works then great. But over time, if I start using that Gmail account and it becomes important to me, and then Google were to say, do you want to secure this with two-factor authentication, that's a different economic decision for me.

Dave Bittner: [00:28:14] Yeah.

Joe Carrigan: [00:28:15] Right? This is the email account I use most of the time to communicate with people.

Dave Bittner: [00:28:19] Yeah.

Joe Carrigan: [00:28:19] You know, when I set up my main Gmail account, it wasn't that important to me. I just wanted to get a Gmail account.

Dave Bittner: [00:28:24] Right.

Joe Carrigan: [00:28:24] My main account was actually on Yahoo. But now that's where all of my - (laughter) all my spam emails go, and my main email is my Gmail account.

Dave Bittner: [00:28:32] And I even think that the way that, for example, like, Gmail could word that, they could say, you know, we notice you're making good use of your Gmail account...

Joe Carrigan: [00:28:39] Right.

Dave Bittner: [00:28:39] ...It seems to be important to you. Here's a way you can protect it.

Joe Carrigan: [00:28:42] Exactly. That's a good point.

Dave Bittner: [00:28:44] Yeah.

Joe Carrigan: [00:28:44] I think Elissa would agree with you on that.

Dave Bittner: [00:28:45] Yeah. All right. Well, really interesting stuff there. And thanks to Elissa for taking the time to join us. That is our show. We want to thank all of you for listening.

Dave Bittner: [00:28:55] And, of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training.

Dave Bittner: [00:29:11] We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.

Dave Bittner: [00:29:20] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik. Technical editor is Chris Russell. Our staff writer is Tim Nodar. Our executive producer is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:29:39] And I'm Joe Carrigan.

Dave Bittner: [00:29:39] Thanks for listening.