Hacking Humans 7.14.22
Ep 204 | 7.14.22

Behavioral science in the world of InfoSec.


Kelly Shortridge: Honestly, a lot of security programs are trying to fight against just straight-up evolution, and it's kind of like looking into the sun. You're never going to win that fight.

Dave Bittner: Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, the phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: Hi, Dave. 

Dave Bittner: Got some good stories to share this week. And later in the show, we've got Kelly Shortridge from Fastly. I caught up with her recently at the RSA conference and she's teaching us all about behavioral economics in cybersecurity. All right, Joe, we've got some good stories this week. Why don't you kick things off for us? 

Joe Carrigan: Dave, my story comes out of Indiana. I think it's Indiana. 

Dave Bittner: Indiana wants me, Lord, I can't go back there. 

Joe Carrigan: This comes from Paige Barnes at WSBT. And it is a story about a package scam. And police are issuing a warning about this, quote, "new package scam hitting the area." Now, I'm going to go into this. It's not really a new package scam... 

Dave Bittner: OK. 

Joe Carrigan: ...But there is a new and brazen twist to it. 

Dave Bittner: Oh. 

Joe Carrigan: So here's what's happening. The Elkhart Police Department says that a package is delivered to you that you never ordered, right? These packages will look like they're for you. They'll say Dave Bittner on them. 

Dave Bittner: OK. 

Joe Carrigan: Dave Bittner, from - who lives in Indiana. 

Dave Bittner: Right. 

Joe Carrigan: And you pick the package up, you take it into your house, and then, what happens? Somebody knocks on the door and they say, I just got a notification that my package was delivered here for some reason. And they're trying to get this package - you to give them this package. And if you give them the package, they say, hey, thanks, and they walk away, and then you will be billed for whatever it was. Now, in this case, it was a $1,500 iPhone 13 Pro Max from AT&T. So what had happened was, these folks - these bad guys got these fine, upstanding people - their personal information, including probably a credit card... 

Dave Bittner: OK. 

Joe Carrigan: ...And sent them or ordered with AT&T a new iPhone and said, just bill me later and here's my credit card information. 

Dave Bittner: Oh. 

Joe Carrigan: So they had a lot of information for them. And they - I am 100% positive the hope was that they would go there and just grab the package off the porch. 

Dave Bittner: Oh, I see. 

Joe Carrigan: Right. 

Dave Bittner: Right. 

Joe Carrigan: But the porch - the porch pirates got there and saw that it had already been taken inside. They arrived within minutes of the package being delivered because they got the notification that the package had arrived. 

Dave Bittner: So somehow they've looped themselves into the package tracking... 

Joe Carrigan: Exactly. 

Dave Bittner: ...We assume here. 

Joe Carrigan: Probably from when they set it up because they probably watch that tracking number like a hawk and they see when it's out for delivery. You know, FedEx and UPS and even the Postal Service, but not to the same extent that FedEx and UPS will do, will show you where that product is on the route. 

Dave Bittner: Right. 

Joe Carrigan: Well, how many stops - well actually, Amazon will show you how many stops away it is. 

Dave Bittner: (Laughter) Right, right. 

Joe Carrigan: But they will tell you when it's been delivered. 

Dave Bittner: Yeah, and they'll... 

Joe Carrigan: It's almost instantaneous... 

Dave Bittner: ...Take it and show you a picture. 

Joe Carrigan: ...They'll show you a picture. 

Dave Bittner: Yeah. 

Joe Carrigan: It's amazing, the logistics - the technology behind the logistics now. But that technology also enables these criminals to know when the packages arrive so they can show up, grab the package and get out of Dodge. 

Dave Bittner: OK. 

Joe Carrigan: But if you're there and you've already taken the package in, then they will - they will knock on the door and ask you for the package. 

Dave Bittner: That's weird (laughter). 

Joe Carrigan: Yeah. Yeah, that is weird. That's really concerning, actually. I don't know what to tell people to do here. The police say don't give him the package. 

Dave Bittner: Well, I just think - I'm trying to put myself in this situation, and if a package arrived... 

Joe Carrigan: Addressed to you. 

Dave Bittner: ...Addressed to me. 

Joe Carrigan: Right. 

Dave Bittner: And I - presumably I would open that package probably right away. I don't know, maybe not... 

Joe Carrigan: Yep. 

Dave Bittner: It could sit on my kitchen counter for a few hours... 

Joe Carrigan: Sure. 

Dave Bittner: ...I don't know. But if I open it up and there was a brand new - anything of significant value, first of all, I'd be confused (laughter). 

Joe Carrigan: Confused and concerned. 

Dave Bittner: Right. And I'd say, I did not order this. 

Joe Carrigan: Right. 

Dave Bittner: I know there are - now, there's a part of me that would be like, woo-hoo because... 

Joe Carrigan: Free iPhone (laughter). 

Dave Bittner: ...Well, and there is a rule I know with the Postal Service that if someone delivers something to you that you did not order, you are allowed to keep it. 

Joe Carrigan: Really? 

Dave Bittner: Yeah, yeah. I remember when we were kids, there were public service announcements they'd run on TV. And it was kind of funny because they had, like, you know, a couple of people in Alaska getting a new refrigerator delivered to them or something, you know? It was ha ha, we don't need this, but... 

Joe Carrigan: Right (laughter). 

Dave Bittner: ...So - but, yeah, so - but this is more complicated than that because what you're saying is that not only did they have it delivered, but they used my stolen credit card... 

Joe Carrigan: Right. 

Dave Bittner: ...Information to purchase it. 

Joe Carrigan: Yes. So... 

Dave Bittner: I wonder, too, is this taking - because we've heard - there's lots of scams with a lot of retailers now have these buy now, pay later... 

Joe Carrigan: Yes. 

Dave Bittner: ...Things, you know... 

Joe Carrigan: Yes they do. 

Dave Bittner: ...As an option. 

Joe Carrigan: Yep. 

Dave Bittner: I wonder if they're taking advantage of that... 

Joe Carrigan: Probably. 

Dave Bittner: ...Business as well. 

Joe Carrigan: I would imagine that is a big part of this. 

Dave Bittner: Because it delays me getting any kind of notification on my credit card that I just got billed... 

Joe Carrigan: Yeah. 

Dave Bittner: ...Right? 

Joe Carrigan: Now, the people on this - in this story, who wanted to remain anonymous as they are probably - should - smart to do, I would say, said that they had a credit card charge for $1,500 for this phone. They, of course, called the credit card company and said, no, this is not a valid transaction and our information's been stolen. So I imagine they're getting a new credit card. 

Dave Bittner: But the credit card - I could see the credit card company saying, hey, look, you signed off on that, or here's the - we also got sent the photo of this being delivered. 

Joe Carrigan: Right. 

Dave Bittner: So... 

Joe Carrigan: But then you can say, here's a police report... 

Dave Bittner: (Laughter). 

Joe Carrigan: ...Where somebody showed up at my door looking for this package... 

Dave Bittner: Right. 

Joe Carrigan: ...That wasn't me. 

Dave Bittner: Right. 

Joe Carrigan: So... 

Dave Bittner: Right. 

Joe Carrigan: ...We're going to - we're going to - I'm going to send this back and you're going to take this off my credit card... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Right now. 

Dave Bittner: This is another good reason to have something like a Ring doorbell. Like a... 

Joe Carrigan: Yes, I would agree. 

Dave Bittner: ...Front porch camera. 

Joe Carrigan: Yep. It is another reason. Just make sure that you're reading the disclosures on that Ring doorbell pretty well... 

Dave Bittner: Yeah. 

Joe Carrigan: ...'Cause there's a lot of government involvement, if you will, on those things. 

Dave Bittner: Yeah. 

Joe Carrigan: You can opt out of it, I think. 

Dave Bittner: Yeah. 

Joe Carrigan: Or maybe you have to opt in. I don't know how it works. I don't have a Ring doorbell. 

Dave Bittner: Yeah. 

Joe Carrigan: But yes, a Ring doorbell would do well here. You know, Dave, frequently somebody shows up at my house to deliver a package. They put it on the front door. They ring - they either ring the doorbell - actually, they don't have to because the dogs go crazy. 

Dave Bittner: Right. 

Joe Carrigan: Right? So I know somebody's been there. And my office - my home office sits on the back of the house, so I actually can't see if somebody's dropping a package off. But I generally know by the tone of the dog barks, and I just let them sit out there for a little while and frequently go out there - we don't really have a problem with porch pirates in my neighborhood... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Which is interesting... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Because I know it happens a lot, and it would be a great place to be a porch pirate. 

Dave Bittner: Yes. 

Joe Carrigan: But we haven't ever had anything taken off our porch. 

Dave Bittner: I haven't either. 

Joe Carrigan: Yeah. 

Dave Bittner: Yeah, I haven't either. And - but there are plenty of places where it is just rampant... 

Joe Carrigan: Right. 

Dave Bittner: ...And a real problem. 

Joe Carrigan: You know where it happens - one of the things - my next-door neighbor - actually, my next-door neighbor is a new next-door neighbor, but we had neighbors that we were kind of close to living next to us for a while. And we live in a town that has a lot of walking trails. 

Dave Bittner: Yes. 

Joe Carrigan: And it's great to be able to walk around the town to get to where you need to go, or you can take your bike or something. 

Dave Bittner: Sure. 

Joe Carrigan: But you don't have to drive. The town was designed so that you didn't have to drive everywhere. 

Dave Bittner: Yes. 

Joe Carrigan: And the flip side of that is she lived - this family lived on the - and I say she because I was talking to the mom of the family - but they lived in this house right next to one of those paths. And she had been broken into once. And the police told her, being next to this path, you're going to have a higher incidence of break-ins. Houses on the paths have a higher incidence of break-ins, which is interesting, I think. 

Dave Bittner: Yeah. 

Joe Carrigan: So get a security system if you live near a path. 

Dave Bittner: Right. Right. So what do we do here, though? I mean, if these folks are brazen enough to come to your front door... 

Joe Carrigan: Yeah. 

Dave Bittner: ...And, I don't know, if it were you or me, probably - would you hand over the package? 

Joe Carrigan: I don't know, Dave. That's a really good question. I would assess the person at my front door, you know, looking at them. Is this person going to violently harm me? That's the concern I have because you're dealing with a criminal. 

Dave Bittner: Yeah. 

Joe Carrigan: This is - this always goes back - I always go right back to that Penn Jillette story about the three-card monte... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Where he starts taking pictures, and one of the guys in the crowd goes, uh-uh... 

Dave Bittner: Yeah. 

Joe Carrigan: Right? - 'cause that's - he's in on the deal. Does this person have another person with them? Usually porch pirates operate alone, but sometimes they operate in pairs. But there's actually - Mark Rober... 

Dave Bittner: A runner and a driver. 

Joe Carrigan: Right. Mark Rober on YouTube has some great, great videos about how he's penetrated the porch pirate network. But I don't know, Dave, what - your question is a good question. What do you do? I would say you make an assessment at that point in time when that person is knocking on your door. It might also be good to go, I don't know what you're talking about. We didn't receive any package. Have a nice day. 

Dave Bittner: Yeah. It's just that the package is in my name. 

Joe Carrigan: Right. 

Dave Bittner: You know, like, if the package was in their name... 

Joe Carrigan: Yeah, that would make sense. 

Dave Bittner: ...I'd just hand it over. 

Joe Carrigan: Yes. 

Dave Bittner: Oh, 'cause that happens all the time. 

Joe Carrigan: Sure. 

Dave Bittner: Here you are. I wish you a good day, sir. 

Joe Carrigan: Yeah. 

Dave Bittner: But at the same time, no phone, no laptop computer, no whatever is worth a potential physical altercation. 

Joe Carrigan: Right. 

Dave Bittner: If you're standing there with the box in your hand and the person says, you know, give me that box or bad things might happen to you, I'd say, enjoy your box, my friend. 

Joe Carrigan: Right. Exactly. 

Dave Bittner: And then I'd call the police. 

Joe Carrigan: Yep. 

Dave Bittner: You're right. This is brazen. 

Joe Carrigan: It is. It is. It's concerning. I'm wondering - and I don't know what to tell listeners. You know, if this happens to you, I think you have to make that decision at that point in time. 

Dave Bittner: Yeah. 

Joe Carrigan: Decide what's going on. Or just don't answer the door. That's another option as well. 

Dave Bittner: The only thing I can think of that might help with something like this is just to make sure that you have alerts set up on your credit cards because that way, if anybody charges anything... 

Joe Carrigan: Yeah, you get a text message. 

Dave Bittner: ...You get a little notice that says, hey, you know, you just put $10 down on a new iPhone or whatever. 

Joe Carrigan: Right. 

Dave Bittner: You can - what? What's this? 

Joe Carrigan: I have that set up for a couple of my credit cards, and I know when my wife goes to Starbucks. 

Dave Bittner: There you go. Right, right. Why is my wife ordering $100 worth of coffee? Oh, it's one cup of Starbucks. 

Joe Carrigan: Right, right - one cup of over-roasted, burnt coffee. 

Dave Bittner: There you go. All right, well, we will have a link to that story in the show notes. 

Dave Bittner: My story this week comes from CPO magazine. And this is written by Scott Ikeda, and it's titled, "One Million Facebook Credentials Compromised in Four Months by an Ongoing Phishing Campaign." And this is a report. There's a company called Pixm - P-I-X-M - and they are an anti-phishing platform. And they have been tracking a credential harvesting campaign that's been active since late 2021 that has really been successful in getting people's Facebook credentials. 

Joe Carrigan: So this is not a Meta breach. 

Dave Bittner: No, this is... 

Joe Carrigan: Bunch of, like... 

Dave Bittner: This is people... 

Joe Carrigan: ...Giving up their credentials. 

Dave Bittner: ...Giving up their - so this is - what the bad guys are doing is they're setting up a login page that looks just like Facebook. 

Joe Carrigan: Sure. 

Dave Bittner: It's a classic thing. Somebody sends you a link that's - or a - let's just - for argument's sake, someone sends you an SMS message that says - and it's from a friend of yours, right? And it says, hey, Joe, are you aware of this video of you? 

Joe Carrigan: Right. 

Dave Bittner: And you're like, whoa, wait, what? 

Joe Carrigan: I don't know what videos of me might be out there, but... 

Dave Bittner: Right. Can't be good. 

Joe Carrigan: Right. 

Dave Bittner: Might be good. Who knows? So you go to the page, and it says, to view this video, log in to your Facebook account. And the login looks just like Facebook. You log in. Now they have your login credentials, and that's the ballgame. 

Joe Carrigan: Well, Dave, do they have my YubiKey? 

Dave Bittner: Well, we'll get to that. 

Joe Carrigan: OK. 

Dave Bittner: So evidently the group who is up to this are escalating their attacks. So the folks in this report say that they made about 2.5 million attempts on Facebook users in 2021, and that has increased to 8.5 million attempts over a similar period in 2022. 

Joe Carrigan: 8.5 million attempts - so a total of, like, 12 million attempts. 

Dave Bittner: Yes. 

Joe Carrigan: And how many records have they compromised - a million? 

Dave Bittner: Yeah, according to this report, over a million credentials have been stolen. 

Joe Carrigan: That is a remarkably high success rate - 1 in 12. 

Dave Bittner: Yeah. So one of the other things that caught my eye about this was how they're going about it. To quote the article here, it says, "the phishing campaign is able to get around automated Facebook security used to recognize when accounts are sending out malicious URLs to contacts." 

Joe Carrigan: Right. 

Dave Bittner: "It does this with a link chain that begins with a legitimate app deployment service that then takes the target through several" - wait for it, Joe... 

Joe Carrigan: Redirects. 

Dave Bittner: "...Redirects" - (laughter) right - "before landing on the attack site." So they say they use services like glitch.me, famous.co, amaze.co and something called funnelpreview.com. And these are all legitimate services that are whitelisted on Facebook because if they don't whitelist them, they'll - it'll block legitimate stuff happening on Facebook. So they don't want that. 
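The redirect-chain trick Dave describes can be sketched in a few lines of Python. This is purely an illustration, not the campaign's actual code: the URLs, the `resolve_chain` function, and the chain itself are all made up, and the dictionary stands in for the HTTP 301/302 responses a real link scanner would follow via the Location header. The point it shows is that a simple allow-list check only ever sees the first, legitimate-looking hop, while the victim lands somewhere else entirely.

```python
def resolve_chain(start_url, redirects, max_hops=10):
    """Follow a URL -> redirect-target mapping until no redirect remains.

    `redirects` simulates HTTP redirect responses; a real checker would
    issue requests and read each Location header instead.
    """
    url = start_url
    hops = [url]
    for _ in range(max_hops):
        nxt = redirects.get(url)
        if nxt is None:
            return url, hops  # final landing page plus the full hop list
        url = nxt
        hops.append(url)
    raise RuntimeError("too many redirects; possible loop")

# Hypothetical chain modeled on the article's description: the posted link
# points at a legitimate, allow-listed app-hosting service, which bounces
# through an intermediary before the attacker-controlled login page.
chain = {
    "https://legit-app-host.example/page": "https://tracker.example/r1",
    "https://tracker.example/r1": "https://fake-facebook-login.example/",
}

final, hops = resolve_chain("https://legit-app-host.example/page", chain)
print(final)          # the page the victim actually lands on
print(len(hops) - 1)  # number of redirects taken to get there
```

A scanner that only vets the first URL in the chain approves the allow-listed host and never sees the phishing page, which is why resolving the full chain matters.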

Joe Carrigan: Yep. 

Dave Bittner: So they're saying that Facebook can block individual links once people report them. But... 

Joe Carrigan: I have a great idea for Meta. 

Dave Bittner: OK. 

Joe Carrigan: You got some smart people that work at your company. 

Dave Bittner: They do. 

Joe Carrigan: Start a link shortening service. There you go. Start a link shortening service and insist that anybody using your platform use that link shortening service and then validate the links to make sure they're valid. You could go a long way with this. Make it free, make it part of Facebook, and don't redirect to any other link shortening services. 

Dave Bittner: They make the point that - actually, they talk to - who do they talk to here? Erich Kron, who is a security awareness advocate at KnowBe4. Hey, there you go. 

Joe Carrigan: We've had Erich on our show, haven't we? 

Dave Bittner: Yeah, yeah, one of our sponsors. And he makes the point that contacting a human for support at these huge social media organizations is nearly impossible. 

Joe Carrigan: Yes, of course, it is. That's the business model. 

Dave Bittner: Right. But there's another interesting point he makes here I think is really important. He says people often underestimate the value of their social media accounts, failing to enable multifactor authentication. 

Joe Carrigan: Yeah. 

Dave Bittner: And I think that's true. People think, oh, well, I better use MFA on my bank account. 

Joe Carrigan: Right. 

Dave Bittner: Something like that. But who's going to be interested in my Facebook account and what's - there's nothing of value in there. Well, not so, especially - you know, we've seen these social media sites. They have more and more functionality. They have ways for you to spend money, to collect money and all those sorts of things. 

Joe Carrigan: Yeah. As Meta goes into the metaverse - right? - this is - somebody asked me this recently. They said, what do you think is going to be the next big thing in social engineering? And I said this metaverse idea, this idea that we're going to be participating in some large virtual reality environment where there will be real money exchanged for virtual services or things like that, that is going to be the next big thing. And that is why these guys are probably going - well, I don't know if that's why they're going after Facebook accounts. Facebook accounts have value to them right now. 

Dave Bittner: Yeah. 

Joe Carrigan: Right? And if I can overtake - take over, rather, a Facebook account of somebody who's an administrator on a page... 

Dave Bittner: Yeah. 

Joe Carrigan: ...That has a number of followers, I can start pushing ads out to it. And that's how they monetize this. 

Dave Bittner: Yeah. Yeah. 

Joe Carrigan: Or one of the ways. 

Dave Bittner: Yeah. So the lesson here, be mindful. Be careful about these fake login pages. Be careful about these lures. You know, if somebody says, hey, was this you in this video... 

Joe Carrigan: Right. 

Dave Bittner: ...Chances are - the fact that there's so little information about it is a big red flag. 

Joe Carrigan: Yep. 

Dave Bittner: But also enable MFA on your social media accounts. That'll go a long way towards making sure you are not the low-hanging fruit. 

Joe Carrigan: Absolutely. And make sure it's a hardware-based multifactor authentication. 

Dave Bittner: Yeah. 

Joe Carrigan: Because that will stop it right here, right in the tracks, right in its tracks. It's very difficult to get around that. In fact, I don't know of a way to do it. 

Dave Bittner: All right. Well, we will have a link to that in the show notes. Joe, it is time to move on to our Catch of the Day. 


Joe Carrigan: Dave, our Catch of the Day comes from a listener named Will. There's a postscript in this, but it says - instead of PS, it says NB. Do you know what that means? 

Dave Bittner: I don't. No, I don't know what NB means. I'm sure one of our listeners will let us know. It's probably... 

Joe Carrigan: Yeah. I'm not even going to Google it because I know... 

Dave Bittner: It's probably local to, you know, some part of the country in which we are not... 

Joe Carrigan: Right. 

Dave Bittner: ...And has meaning. So, yeah, let us know if you know what NB means. 

Joe Carrigan: So this is a pretty good one. Why don't you go ahead and read this one and comment after you're done? 

Dave Bittner: OK, goes like this. (Reading) Good day. We're looking for a certain person originally from Asia, Europe and the United States of America, USA. Sir, if you fit into this description, then you're one of the ones we've been looking for in the past 10 months. And please carefully read this mail and let me know as soon as possible. We're glad to bring this news to you. IMF, in conjunction with Asian and European Union, met and agreed that a vault office should be used to compensate all scam beneficiaries, and your compensation amount is U.S. $1 million. That's 1 million United States dollars only as you are among the persons who is to be compensated as your payment. You are to reconfirm your current contact details, namely full names, full address and direct telephone number only so your funds can be processed and paid to you. Waiting for your response urgently. Jone Aknown (ph) head IMF European Union Coordinator. NB, this mail is a confidential correspondence meant only for the - actually, Joe, this is in all caps. So I'm going to say (shouting) this mail is a confidential correspondence meant only for the recipient. If you are not the one, please disregard this communication. 

Joe Carrigan: So now Will's in trouble with Jone Aknown. 

Dave Bittner: (Laughter). 

Joe Carrigan: Jone, by the way, is spelled J-O-N-E. That's a spelling of Jone I've never seen. 

Dave Bittner: Yeah. Yeah. And I don't know. I mean, it could be from, you know, a Scandinavian country. Perhaps it's Jo-nay (ph) or... 

Joe Carrigan: Yon (ph). 

Dave Bittner: Who knows? Yon - yeah, yeah, yeah, yeah. 

Joe Carrigan: Right. This is... 

Dave Bittner: It might actually be a real person. 

Joe Carrigan: Yeah. 

Dave Bittner: I don't know. 

Joe Carrigan: Who knows? 

Dave Bittner: Yeah. 

Joe Carrigan: This is a fantastic scam. My favorite part of this is that it says, we're looking for someone from Asia, Europe or the United States of America. 

Dave Bittner: Really narrowed it down there. 

Joe Carrigan: Right. Yeah. 

Dave Bittner: (Laughter) Africans need not apply. 

Joe Carrigan: Right - nor South Americans, nor Australians. 

Dave Bittner: Right. Right. 

Joe Carrigan: And certainly nobody from Antarctica, right? 

Dave Bittner: No. No. 

Joe Carrigan: Also, well, it also kind of rules out Canadians and Mexicans as well, right? 

Dave Bittner: Yes. That's true. 

Joe Carrigan: It doesn't say North America. 

Dave Bittner: No. 

Joe Carrigan: It just says United States of America. 

Dave Bittner: Yep. Sorry, Canada. 

Joe Carrigan: Right. 

Dave Bittner: OK. Yeah. 

Joe Carrigan: So, you know, Mexico and Canada dodged a bullet here. But - oh, this isn't me. I'll just throw it away. They certainly encompass the largest populations on the planet, those being China, India and the United States. So they're going after the big fish here, Dave... 

Dave Bittner: Yeah. 

Joe Carrigan: ...In terms of people. 

Dave Bittner: Yeah. 

Joe Carrigan: It's obviously fake. The English is all broken up. Nobody randomly finds people via email. That's not how these searches happen. You know, I don't - if I'm looking for somebody, I don't just email everybody and go, is this you? 

Dave Bittner: Yeah. Can I send you a million bucks? 

Joe Carrigan: Right. Yeah. Yeah. You can send me a million bucks. 

Dave Bittner: Right. 

Joe Carrigan: That should stick out like a sore thumb here... 

Dave Bittner: Yeah. 

Joe Carrigan: ...To everybody. I mean, this is why our Catches of the Day tend to be just the most ridiculous scams. But like we said before, sometimes they are ridiculous, so they weed out people that wouldn't fall for them. 

Dave Bittner: Right. 

Joe Carrigan: Right? 

Dave Bittner: Right. 

Joe Carrigan: They are actually targeting people who would believe this. 

Dave Bittner: Yeah. 

Joe Carrigan: So... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Keep listening... 

Dave Bittner: Right. 

Joe Carrigan: ...Everybody. 

Dave Bittner: Well, our thanks to Will for sending that in. We would love to hear from you. If you have something you'd like us to consider for the Catch of the Day, you can send it to hackinghumans@cyberwire.com. 

Dave Bittner: All right, Joe. I recently had the pleasure of speaking with Kelly Shortridge. She's from an organization called Fastly. I ran into her at the RSA Conference, where she was presenting on behavioral economics in cybersecurity. Here's my conversation with Kelly Shortridge. 

Kelly Shortridge: So I have been looking at the intersection of behavioral economics and infosec now for at least six years, probably longer. My background is actually in behavioral economics, and when I was first introduced to the information security industry, what struck me was like, it turns out a lot of decision-making is pretty inefficient, and the market as a whole seems very inefficient. We keep seeing all these kind of unwanted outcomes, and then we try similar strategies. And it seems like we're almost stuck in the same kind of, like, thinking loop. 

Kelly Shortridge: So I started to look into, OK, what are some of the biases that are maybe at play? And what's interesting is you see more kind of finger-pointing and blame when, you know, user brains go wrong, shall we say, or the user brains don't operate in what we would call, like, the secure way or what we consider desirable. But we look less at how decision-making by security leaders and professionals maybe has - have certain quirks to them. So the talk was really to kind of introduce the concept of behavioral economics, explain, you know, what we think is a very pithy way of putting it, which - I feel very lucky I got lizard brain to be mentioned in the Wall Street Journal. 

Dave Bittner: (Laughter). 

Kelly Shortridge: But I like - for what it is, you know, you have lizard brain, and then you have philosoraptor (ph). And they're in this battle. The lizard brain is our default mode of thinking. It's very fast and automatic. It reacts to threats, which - you know, being at RSA, everybody's trying to bombard us with, oh, my God, there are threats everywhere. And you have philosoraptor, which is a bit slower. It's - you know, when you have a complex math problem, you have to expend those brain cycles. The thing is the lizard brain is the default because that's what helps us survive. It's what has helped us thrive for years and years. We maybe want a little more philosoraptor on our decision-making. So the talk is really focused on how, as security leaders and practitioners, can we start harnessing our philosoraptor more and how we can recognize that most people's brains default to lizard brain. So if we're designing security policies and procedures and tools, we need to work with that lizard brain. We need to make sure the secure way is easy, fast and simple. 

Dave Bittner: Well, I mean, let's go through some of the highlights of the presentation. What are some of the things that you're pointing out, the actions that people can take to do a better job? 

Kelly Shortridge: Yeah. So the presentation was really diving into how the lizard brain and also philosoraptor manage in information security. So there are a few great examples. One is questioning folk wisdom, which is maybe a provocative thing to say at RSA. But for instance, you hear all the time, you know, stock prices are hurt when a breach happens. Well, if you look at the data, that's not necessarily the case. So be aware that - this is called availability bias, that just because something is familiar and it's repeated often, that doesn't mean it's true. It just means there's very good marketing. 

Kelly Shortridge: But you can also leverage that to your advantage when you're thinking about things like security awareness in your organization or you want to encourage secure behavior. You need to create those pithy messages. You need to make sure they're repeated. You almost need to have the same sort of principles as, like, a political slogan or marketing slogan. But we don't always think that way. Again, we present these kind of, like, very logical, drawn-out arguments for why security matters. But really, what people need - they just need, like, quick advice they can remember. So that's a simple example kind of how you can see, on each side of the equation, this stuff matters. 

Dave Bittner: Is there a fundamental issue here that the lizard brain takes priority over the more rational side of the brain so it screams the loudest and the quickest? 

Kelly Shortridge: It does. Yes. And this is why it's actually useful, again, to kind of harness the lizard brain almost against itself. So there's a paper I'm actually working on with Josiah Dykstra, which is around opportunity cost, which can be very elaborate. You have to think about, here are all of the alternative options. You know, let's say it's spending six hours of your time. What are all the things you can do with it? Turns out it's a lot. That's way too much thinky-thinky (ph), right? The lizard brain's like, I don't want to deal with all that. However, you can create this heuristic of, like, OK, but what if I did nothing? This becomes very powerful in information security. So consider application security testing, one of those tools. Use that heuristic, what we call the null baseline. Like, what happens if we did nothing? Maybe you would be releasing software to production faster. Maybe your developers would be less cranky. Maybe that's good for the organization. So you start to kind of uncover these hidden potential benefits or hidden costs of actually pursuing something securitywise. Again, make sure that you're not introducing unintended consequences in your organization 'cause then lizard brain's like, security is the most important. Like, clearly this is my priority. So, like, everyone else, you know, that doesn't care about security - clearly they're wrong and irrational, and can you believe them? But instead, it's almost like you're harnessing this new lizard brain tactic of, like, OK, but let me just really quickly consider, what if I did none of this instead in order to almost trick yourself into being more of a philosoraptor. 

Dave Bittner: What about the threat actors, the bad folks out there who are intentionally trying to trip that lizard brain side, who are trying to get you into an emotional state and not think rationally? How do we train people to be aware of that and be able to counter it? 

Kelly Shortridge: We don't. As a security industry, we have to start designing, again, tools and workflows and procedures that try to help. We can't expect users to be experts. We can't expect them to have their thinky-thinky hat on all the time 'cause we don't have it on all the time either. And frankly, if you're looking - most people are dealing with external emails constantly, and now we're saying, OK, 95% of the time when you click on this link from an external sender, it's going to be totally fine. But now you have to slow yourself down and maybe read, you know, 20% fewer emails every day just for security. They're going to get fired probably 'cause they're not going to be as productive. You can't ask them to do that. And training only goes so far. And I think if we were exposed to more training outside of security ourselves, we would realize, like, oh yeah, I totally forgot that training message at some point. So I think the answer is we don't. And frankly, these hackers are just using the same tricks you see in advertising and marketing. You know, like, click now, the sale will end soon - like, all of those behavioral tricks to get you to, like, buy more and buy faster. 

Dave Bittner: Right. 

Kelly Shortridge: That's just what attackers are using. So until we get rid of all that, it's almost like whatever training we do is just going to be undone by the general commerce and, you know, even business emails. How many times have you had your boss say, like, you need to finish this by end of day? You need to, like, click and view this thing and review it for me. An attacker can just leverage that. So you're now saying, like, OK, you got to train something that has to completely override, again, commerce, business culture, all that. I don't think it's going to work. 

Dave Bittner: So to what degree, then, are we releasing the users from the responsibility for that oversight, for taking that extra moment to see or consider whether or not that link might be malicious? 

Kelly Shortridge: I think we should look to other industries and ask, how often are we asking our users to have that responsibility in other safety decisions? Look on the airplane. Yes, you're responsible for keeping your seatbelt buckled and all of that, but that's kind of where it ends, right? And I think that's very similar to, like, yeah, you're responsible for making sure, let's say, like, you don't lose your phone. I think that's a very reasonable thing for users. But even as experts, it's so hard for us to, like, detect, like, OK, the site looks pretty legitimate. Is it malicious? Having to go through, like, the headers in emails and all that metadata - like, it's tough even for experts. Again, most users - they're either, like, trying to accomplish something workwise. They're trying to accomplish something in their personal lives. We can't expect them to be security experts as well. And it really does take expertise in order to kind of understand what's a bamboozle or not. 

Kelly Shortridge: So I really think it's - I think we have shoved that responsibility, and I think we've made our jobs easier in some ways. But it's only temporary. And this is the lizard brain at work. It takes that temporary relief of, like, OK, we're going to push this complexity onto the users, not realizing, like, all that burnout we experience all the time for all these incidents - part of that is because we haven't designed our systems with the way real users behave in mind. So it's - yes, in the short term, we've, like, offloaded some responsibility. In the long term, that's part of the reason why we keep seeing, like, incidents happening and even going up. 

Dave Bittner: Can you describe to me what an ideal system would look like that would take all of this into consideration? 

Kelly Shortridge: So one of the examples we mentioned in the talk, and I think there has been a paper talking about this, is - think about, let's say, the - one of the worst cases of, like, an application security testing tool. It's, like, you're a developer. You've just finished your, you know, mind-blowing feature - at least in your mind, it's mind-blowing. And you've submitted a pull request, and you're like, yes, victory. But now you have to leave your command line. You have to now open up your browser. You have to click and log into, like, a security portal for this tool. You have to initiate the scan on your code. You have to wait four, six hours - maybe more - in order to get results of the scan. You have to decipher what those results mean. You have to find exactly where to fix in the code. It's a nightmare. 

Kelly Shortridge: Imagine, though, that whole scanning process was in your IDE. It just looks at the diff of, like, OK, you're just changing this part of the code. Let me just run the analysis in the background to see, like, while you're working on other changes. Now it's in your context where you're working. It hopefully is providing immediate feedback for you, like, exactly here's where you need to fix, and here's why. It takes a lot more upfront effort, instead of just saying, like, OK, developer, go do this later, but think about the quality results. And I believe that paper found that - I can't remember the exact percentage, but there was a - quite a huge leap in the number of fixes that developers actually made. 
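[Editor's note: the diff-scoped approach Shortridge describes can be sketched roughly as follows. This is a minimal illustration, not any real tool's implementation; `run_scanner`, `files_to_scan`, and the extension list are all invented stand-ins.]

```python
# A minimal sketch of diff-scoped scanning: instead of scanning the whole
# codebase hours after the pull request, analyze only the files the
# developer just touched, so feedback arrives while they're still in
# context. `run_scanner` stands in for whatever real analysis tool you use.

SCANNABLE_EXTS = {".py", ".js", ".go"}

def files_to_scan(changed_files):
    """Filter a diff's file list down to source files worth analyzing."""
    return [f for f in changed_files
            if any(f.endswith(ext) for ext in SCANNABLE_EXTS)]

def scan_diff(changed_files, run_scanner):
    """Run the caller-supplied scanner on just the changed files and
    return findings keyed by file, ready to surface in the editor."""
    return {f: run_scanner(f) for f in files_to_scan(changed_files)}
```

In a real setup, `changed_files` would come from something like `git diff --name-only` against the target branch, and the scanner would be invoked by an IDE plugin or pre-commit hook rather than called directly.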

Kelly Shortridge: I think it'd be the same thing in reverse, right? You know, if a security person were building a tool and they had to go completely out of their way in order to, you know, write code, and then when there was, you know, a compilation error, there was, like, no meaningful feedback, we'd hate it. We'd be like, why are the developers doing that to us? 

Dave Bittner: Right. Right. 

Kelly Shortridge: So I think it's no wonder that they kind of push back. So I think that's a good example of, like, we need to be in the workflows of these people. We need to think about how they're - we need to - again, we have to make it easy, fast and simple. That does require more upfront effort from us, but I think we as an industry kind of need to be doing that more. 

Dave Bittner: Are there organizations who are successfully implementing the types of things that you're suggesting here? 

Kelly Shortridge: Well, I mean, I'm going to be biased about Fastly, so I'll use an example that's not Fastly. But I think Snyk, who - I've spoken at their conference, partly because I think they've done something great, which is putting the results of those kind of security scans in line with the pull request in GitHub. So again, the developer doesn't have to leave their local context. They can just see the results of the scan there. They don't have to kind of go into this separate tooling. So I think that's kind of - you know, and they've been pretty wildly successful, I would say, even among developers, I think. Developers are a curmudgeonly bunch. I think there's some kinship there where, you know, if they don't hate it... 

Dave Bittner: (Laughter) Really. 

Kelly Shortridge: Yeah, right. If they don't hate the tool, in some ways that's a victory. And I have certainly heard with Snyk, they don't hate the tools. So I think that's a good sign. 
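[Editor's note: the pattern Shortridge credits Snyk with, surfacing scan results inline on the pull request rather than in a separate portal, boils down to rendering findings into a PR comment. The sketch below is a generic illustration; the findings format and function name are invented, not Snyk's actual schema or API.]

```python
# Render scan findings as a markdown comment that can be posted inline on
# the pull request, so the developer never has to leave GitHub. A real
# integration would then POST this string via the code host's API.

def render_pr_comment(findings):
    """Turn a list of (file, line, message) findings into markdown."""
    if not findings:
        return "Security scan passed - no findings."
    lines = [f"Security scan found {len(findings)} issue(s):", ""]
    for path, lineno, message in findings:
        lines.append(f"- `{path}:{lineno}` - {message}")
    return "\n".join(lines)
```

The design choice matters more than the code: the output lands in the reviewer's existing workflow instead of a separate security portal, which is exactly the "easy, fast and simple" principle from the talk.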

Dave Bittner: Yeah. What about from the user's point of view? I mean, if we look towards the future, what their experience might be like in this sort of scenario, what do you imagine? 

Kelly Shortridge: In some ways, we want to abstract the security problems away for them, which, again, is - can potentially be a daunting task. But instead of things like, you know, kind of having security bottlenecks and approvals, make them - I heard of an email security company recently. I don't know if I'm supposed to mention brand names, but... 

Dave Bittner: (Laughter). 

Kelly Shortridge: So I'll just say there's an email security company that I know introduced a workflow where if even one user in the whole organization sees a phishing email, they can report it. And then it's automatically, kind of, like, protected across the whole population, rather than requiring, you know, opening a ticket and all that kind of tedious stuff. But then, cleverly, you can - I believe you can generate an alert in Slack that basically says, thank you, user, for, like, reporting that and saving the company. And I think those kind of tricks where if we can empower users to kind of feel like the heroes for just taking a little more time, if they decide to be philosoraptors, reward them, right? Because they've done something great. Rather than assuming that's the default, assume they're just going to be doing their work and then, like, incentivize that kind of effort. 
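[Editor's note: the report-once, protect-everyone workflow can be modeled in a few lines. This is a toy sketch, not the unnamed vendor's product; mailboxes are plain Python lists here, where a real system would drive a mail API and a chat webhook.]

```python
# One user reports a phishing message; it is pulled from every mailbox
# that received it, and the reporter gets a thank-you notification,
# rewarding the philosoraptor moment instead of assuming it's the default.

def handle_report(reporter, message_id, mailboxes, notify):
    """Quarantine message_id across all mailboxes and thank the reporter."""
    removed = 0
    for inbox in mailboxes.values():
        if message_id in inbox:
            inbox.remove(message_id)
            removed += 1
    # In a real deployment this would be a Slack webhook call.
    notify(f"Thanks, {reporter} - you just protected {removed} inbox(es).")
    return removed
```

The incentive piece is the last step: the notification makes the reporter the hero, which is the behavioral trick Shortridge is pointing at.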

Dave Bittner: Trigger the happy part of their lizard brain. 

Kelly Shortridge: Exactly. Yes. And the great thing is - and one thing we mention in the talk is what's really powerful - I think most people have had the experience of, like, you're, like, driving to work or something and then you get there and you're like, oh, my God, I don't remember any of my drive here. 

Dave Bittner: Right. 

Kelly Shortridge: It's automatic. The first time that you had to get there, philosoraptor was in charge because it was like step by step, like, OK, I got to make this turn and then this one. So after repetition and practice, eventually those philosoraptor kind of thinky-thinky processes become lizard brain instincts. As much as we can kind of encourage that, we incentivize the first couple of steps where they have to do that thinky-thinky and then it becomes automatic. That's great. But I think, again, we need to assume as the default that we don't want the users to care. We don't want them to think about it. In some sense, they want security to get out of the way. It's not like they don't care about security. It's just no one is really getting their bonuses or getting promoted when, you know, they're a developer or they're a marketing coordinator in procurement. They're not getting rewarded for doing security. 

Kelly Shortridge: And I think it's unlikely we can change bonus schemes to be in favor of security. So instead, we just have to make sure they can do their jobs. We do the hard work of setting everything up and setting the systems up so if they do click on a phishing link, it's not the end of the world. Because they don't feel great when they do that, we don't feel great, and it's worth the upfront effort, again. But that is - philosoraptor cares about later-land, as we called it in the talk. Lizard brain's like, no, I'm just thinking about myself in the present. So it does kind of... 

Dave Bittner: In the moment (laughter). 

Kelly Shortridge: Exactly. It does take a bit of thinky-thinky for us to make the decision of like, OK, we're going to put in that upfront effort to save ourselves all that pain during incidents and reduce those incidents as a result. 

Dave Bittner: In general, would you say that the folks who are developing these tools, the developers in general, are they more lizard brain or philosoraptor dominant? 

Kelly Shortridge: Every human's more lizard brain dominant. 

Dave Bittner: OK. 

Kelly Shortridge: That's just how we're designed as a species. And that's part of the reason why we love, you know, like, sweet and salty snacks and, like, immediate rewards and, you know, all the stuff - the shiny stuff we see at the conference, right? I think the key thing - there's this kind of unfortunate feedback loop in the industry where people designing security tools have to satisfy the requirements of their customers. So that's the security teams. Security teams still have their lizard brain mindset of, like, oh, my gosh, everything's a threat, we're vulnerable, we have to protect it at all costs. And, you know, as I say, like, they don't really care if, like, the money printer stops going like brrrr (ph). Like, they're fine if it shuts down if it means it's secure. It's obviously - the business disagrees, but that means that if you're developing a tool and you want to succeed, for the most part, you have to cater to those requirements. And then, of course, the customers see more of the chatter about, like, eliminate all threats, like, prevent everything, which is not - again, that's lizard brain sort of framing. 

Dave Bittner: Right. 

Kelly Shortridge: So there's kind of, like, symbiosis around, like, OK, stop everything at all costs and don't think about how to make things easy, fast and simple for users. Like, just have those, like, really annoying bolt-ons for everyone else. Save yourself some work upfront, even though maybe down the line during the incident, it's going to be extra messy. It's really unfortunate. Of course, I know we're talking more about the talk today, but my co-author, Aaron Rinehart, and I are trying to change that with security chaos engineering and start to hopefully make more of that philosoraptor and, you know, longer term thinky-thinky more automatic through a set of kind of principles and practices. I think the key thing is, whether you acknowledge it or not, the lizard brain is active in your users, in your colleagues and in yourself. Everyone can be kind of like a fool when you think about it, and I mean that very endearingly. I can as well. My lizard brain is active all the time. 

Dave Bittner: I have a friend who said nothing is foolproof for a talented fool. 

Kelly Shortridge: Exactly. Yes. So I think the key thing is you have to assume the lizard brain is the default in yourself and in everyone else and have compassion for that. Work with the lizard brain rather than trying to restrict it. Like, you're - honestly, a lot of security programs are trying to fight against just straight up evolution. And it's kind of like looking into the sun. You're never going to win that fight, right? 

Dave Bittner: Joe, what do you think? 

Joe Carrigan: You know what's weird, Dave? I just started reading a book called "Misbehaving: The Making Of Behavioral Economics" by Richard Thaler. I just checked it out of the library last week and started reading it. 

Dave Bittner: All right. 

Joe Carrigan: Thaler won the Nobel Prize in Economics in 2017 for basically starting the field of behavioral economics. So let me explain to you the idea of behavioral economics. 

Dave Bittner: OK. 

Joe Carrigan: Now, for the brief time that I was an economics major in high school - or college rather... 

Dave Bittner: (Laughter) I was going to say, having read this book, you are as much of an expert as Kelly (laughter)? 

Joe Carrigan: Right. No, I'm not. I'm sure Kelly is - but I'm always fascinated by books by economists. For some reason, I love them. I tear through them. Like, the "Freakonomics" books, I love those. 

Dave Bittner: OK. 

Joe Carrigan: The key point here is that, traditionally, economics was based on the idea that people were rational beings who would make their optimal decisions for their best outcomes. 

Dave Bittner: What an adorable idea. 

Joe Carrigan: Right, exactly. And behavioral economics says, no, no, we need more realistic expectations of people. So people - I'm not advocating against free markets and open economic systems, but people will make decisions based on incomplete information and emotion and other psychological factors. 

Dave Bittner: Sure. 

Joe Carrigan: Right? So that's the key understanding of behavioral economics. And the book is pretty interesting. I'm - I've just started it. I'm not very far through it yet. 

Dave Bittner: OK. 

Joe Carrigan: So I can't talk about it too much. 

Dave Bittner: Yeah. 

Joe Carrigan: But I want to give the listeners some background on what the idea of behavioral economics is. I like the way she talks about the cognitive processes in such endearing terms, I'll say. 

Dave Bittner: Yeah. 

Joe Carrigan: Lizard brain and velociraptor brain and thinky-thinky, right? 

Dave Bittner: Yeah. Yeah. 

Joe Carrigan: That's one of my favorite things because that means - thinking about things can be exhausting. It takes energy. And when we are working through a process, we're often not thinking about that process. We're doing it, right? Unless you're doing some kind of creative process, which - I actually - I haven't done a lot of coding recently. But I remember when I was doing coding, I'd come home from work, and I'd be, like, mentally exhausted - very, very hard - 'cause I'd spent my entire day thinking about a problem and solving it. 

Dave Bittner: Yeah. 

Joe Carrigan: So I like the way she talks about these different aspects of what goes on inside of our brain. 

Dave Bittner: Yeah. 

Joe Carrigan: And I want to touch on that a little bit more because when I'm giving talks - there's a talk I'm giving right now, or I've given recently, about how social engineering plays on our base instincts, the things that make us human. And you have to be able to recognize that pattern of the social engineering attack in order to be good at understanding what's going on. I like her idea of questioning folk wisdom. And she cites the example of the folk wisdom that stock prices are negatively impacted by breaches. When stock prices are negatively impacted by breaches, that impact is very brief. 

Dave Bittner: Yeah. 

Joe Carrigan: And we don't see a lot of long-term impact on stock prices from breaches, and we don't see a lot of change in consumer behavior. Do you still shop at Target? 

Dave Bittner: Oh, do I still shop at Target. 

Joe Carrigan: Do you even bear any animosity... 

Dave Bittner: Please. 

Joe Carrigan: ...Towards Target for giving up your credit card information? 

Dave Bittner: There's a red, velvet rope for me at Target. Yeah. 

Joe Carrigan: (Laughter) Kelly makes an excellent point here that people don't necessarily need the long explanations, but they rather need soundbites to remember things. The subtext here is that people already want to do the right thing. And she actually says that later in the interview. They just need a way to remember it, to put it into their lizard brain... 

Dave Bittner: Yeah. 

Joe Carrigan: ...If you will. 

Dave Bittner: Yeah. 

Joe Carrigan: She spends a great deal of time talking about one of the fundamental problems of infosec - the emotional triggers are the same triggers that marketing and sales use. I have another thing I talk about in my social engineering talks, my security awareness talks. I call it the social engineering one-two punch - right? - which is, you have a problem; I have the solution. And we see that a lot in the scams. Like, there's an arrest warrant out for you, but if you pay me 500 bucks, it'll go away. 

Dave Bittner: Right. Right. 

Joe Carrigan: It's that kind of thing. But she makes an excellent point. That is the same thing that legitimate companies do when they're marketing their products. Think back to Steve Jobs when he said, you have all this digital music and no way to transport it. 

Dave Bittner: Right. 

Joe Carrigan: Here's an iPod. 

Dave Bittner: Right. 

Joe Carrigan: And people went nuts for it. They - you're right. I have a problem I didn't even know I had, Steve Jobs. 

Dave Bittner: (Laughter) I've been carrying around all these CDs like an animal. 

Joe Carrigan: Right - like a caveman. 

Dave Bittner: Right. 

Joe Carrigan: I do have a disagreement with Kelly, and I would love to discuss it with her at some point in time. And it's not a big disagreement. But she says there is - that we're pushing a lot of this responsibility for human - for the human part of the problem down to the individual contributors... 

Dave Bittner: Yeah. 

Joe Carrigan: ...On the - in the process. And her point is that's not the way to go. And her reasoning behind that is sound, that we're asking too much of busy people. Productivity is reduced. And bonuses - bonuses come from being productive, not from being more secure. 

Dave Bittner: Right. 

Joe Carrigan: And so I can't disagree with that at all. 

Dave Bittner: Yeah. 

Joe Carrigan: But I still think there is an individual responsibility for security awareness. And I think there needs to be a change in organizational psychology that makes that part of the culture, that we have to do things securely, and we have to recognize these patterns. And I am optimistic here about there being some way for us to make people better at recognizing when they're being targeted by a social engineering attack. And I think that once we get people to recognize that - and to Kelly's point - at the lower level of our brain, you know, more ingrained in our neural pathways - once we get people to recognize that, we'll be a lot more successful. 

Dave Bittner: Yeah. 

Joe Carrigan: I think that using her point of giving people short little things to remember is a great step in getting there. 

Dave Bittner: Yeah. 

Joe Carrigan: So I think this is a great interview. I'm sorry I didn't see the talk at RSA. I didn't get to go to RSA. I'll look to see if it's available. I want to see her talk. 

Dave Bittner: Yeah. Yeah, absolutely. They usually have those on the RSA conference website, so definitely worth a look. As you can possibly tell, I really enjoyed this conversation... 

Joe Carrigan: Right. 

Dave Bittner: ...Really compelling stuff. And to me, it was a sort of a fresh approach, new energy to this problem that - she made me think about things in ways I hadn't really considered before. And that's always interesting. 

Joe Carrigan: I think she's a luminary in the field. She's hopefully going to be making positive changes. 

Dave Bittner: Yeah. So again, our thanks to Kelly Shortridge for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. The "Hacking Humans" podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: And I'm Joe Carrigan. 

Dave Bittner: Thanks for listening.