Make systems to mitigate the mistakes.
Margaret Cunningham: You know, you got to think past the mistake and start making systems that are able to mitigate the actual impact of the mistake.
Dave Bittner: Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, the phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: Hi, Dave.
Dave Bittner: We got some good stories to share this week. And later in the show, my conversation with Margaret Cunningham. She's from Forcepoint. We're going to be talking about cognitive biases that lead to reasoning errors in cybersecurity. Be sure to stick around for that.
Dave Bittner: All right, Joe. Before we jump into our stories this week, we've got some follow-up here. What do we have?
Joe Carrigan: We received a note from a listener named Alex (ph), who was replying to my question about receiving that phone call a couple of episodes ago with the Alexa command. He says, (reading) hi, Dave and Joe. I was listening to the "Finding Targets of Opportunity" episode, where Joe got that automated message that tried to enable Alexa to do something. There are a few things that might be happening. One, someone created a malicious Alexa skill, and ring phones is the command to enable the skill, which, of course, would do anything that one could do with a dropper in the network. Two, there's a bug in an existing Alexa skill that the attacker knows how to abuse, which could have the same implications as above. And three - and this is my personal favorite, he says - someone developed an Alexa skill where ring phones is the command to activate that skill. In an effort to legitimize or artificially inflate the number of uses of their skill, they sent the automated message that Joe received to as many people as they could. This could generate revenue from their in-app advertisements, or if Amazon pays a developer for the number of uses. I'm not sure if this is part of the business model or not. Or, as stated before, it is simply to inflate the number of users or downloads to make the app appear legitimate. All the best, Alex.
Joe Carrigan: I think that's good. We also got some other feedback with possibilities that I tested, and it turned out that's not the case. I actually got one of these calls again this week. They didn't issue any commands, but there's a background noise that's very distinctive when I get the call. And there's nobody on the other end, and the call will stay connected indefinitely until I finally hang up.
Dave Bittner: What kind of background noise?
Joe Carrigan: It's like a low rumble, like, almost like white noise.
Dave Bittner: OK.
Joe Carrigan: Maybe it's pink noise. I don't know. But it's a very distinctive tone. I remember it every time I hear it. I'm like...
Dave Bittner: Right.
Joe Carrigan: This is the same thing again.
Dave Bittner: Right (laughter). OK. And suddenly, you found yourself online with an irresistible urge to buy cookies or something.
Joe Carrigan: Right. Exactly. That's what it is. It's subliminal advertising being served to me.
Dave Bittner: Right, right, right. All right. Well, our thanks to our listener Alex for sending that in. One more quick note - we heard from a listener named Brandon (ph), who was reminding us - we were talking about DNS names and trying to find similar DNS names. He pointed out the site dnstwister.report, which we've talked about here, I believe, in a past episode.
Joe Carrigan: Yes.
Dave Bittner: But it's a useful online tool for finding similarly named websites, and it can save you a lot of time in that effort. So thanks to our listener for sending that in. We would love to hear from you, of course. If you have something for us, you can write in to hackinghumans@thecyberwire.com.
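The core idea behind a tool like dnstwister.report is simple enough to sketch. Here is a minimal, hypothetical Python illustration (not dnstwister's actual algorithm): generate common typo variants of a domain name, which a defender could then check for live registrations.

```python
def typo_variants(domain: str) -> set[str]:
    # A minimal sketch of typosquat generation: character omissions,
    # doublings, and adjacent swaps cover many common look-alikes.
    name, dot, tld = domain.partition(".")
    variants = set()
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])                # omit a character
        variants.add(name[:i] + name[i] * 2 + name[i + 1:])  # double a character
        if i < len(name) - 1:                                # swap neighbors
            chars = list(name)
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
            variants.add("".join(chars))
    variants.discard(name)  # drop the original spelling itself
    return {v + dot + tld for v in variants if v}

# Produces look-alikes such as "examle.com" and "exmaple.com".
print(sorted(typo_variants("example.com")))
```

Real tools layer many more transformations on top (homoglyphs, alternate TLDs, keyboard-adjacency typos) and then resolve each candidate to see whether someone has registered it.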
Dave Bittner: So let's dive into some stories here. My story this week comes from the folks over at Vox on their Recode website. The title of the article is "Dark Patterns: The Tricks Websites Use to Make You Say Yes Explained." This is written by Sarah Morrison. And this notion of dark patterns, Joe - is this something you're familiar with?
Joe Carrigan: Yes. The classic example is the fake hair on the screen, right?
Dave Bittner: (Laughter) Right.
Joe Carrigan: Like, I do an overlay of a hair on a screen and then get you to touch the screen to wipe the hair off. But in so doing, I also get you to click on a button on the screen that is represented by the hair.
Dave Bittner: Well, this article digs into many of the different types of dark patterns. And to explain what a dark pattern is - it's a type of social engineering, a way to force your hand, to influence you into making the decision they want you to make, which may not be in your best interest. And they use several examples here in this article. There's one from Instagram, where Instagram wants access to your website activity. In asking for that access, they give you two options. They say, make ads less personalized or make ads more personalized.
Joe Carrigan: Right.
Dave Bittner: Well, there is no don't show me ads button (laughter).
Joe Carrigan: Right. And there's also no...
Dave Bittner: (Laughter) That's sort of a...
Joe Carrigan: There's also no don't share my information button, right? (Laughter).
Dave Bittner: Right, right. And so you think, well, gosh, if I have to see ads, I guess it would be better for them to be personalized. And if you press that button, then you're sharing all your information with Instagram.
Joe Carrigan: Right.
Dave Bittner: So it's in the way they word this question - and, of course, the option they want, the one where you share your information, they have preloaded, ready to be clicked, highlighted.
Joe Carrigan: Right (laughter).
Dave Bittner: That's the one you want to...
Joe Carrigan: It's the default selection, Dave.
Dave Bittner: Right, right. There's a bunch of other ones, you know? I think we've all been on those websites where some ad pops up on the screen, and you have to spend the next 10 seconds searching around the screen for the little, tiny X that...
Joe Carrigan: Yes.
Dave Bittner: ...You have to click to get rid of it.
Joe Carrigan: Yeah.
Dave Bittner: And if you miss the X, you actually go to the advertiser's page (laughter).
Joe Carrigan: Right.
Dave Bittner: Yeah. So that's another example...
Joe Carrigan: How is that not considered clickjacking? I don't know.
Dave Bittner: Well, it's a good question. There are some other ones where they'll create some sort of false sense of time running out. They have an example here with a website called Ultimate Guitar Pro. And they have a spring sale - pro access, 80% off. And there's a countdown timer that says, you know, five hours, 40 minutes, 57 seconds. And it's counting down. And they point out that the sale has been a few hours away from ending for the past several months, if not years.
Joe Carrigan: Right. If you reload the page, it goes back to five hours, 40 minutes, 57 seconds.
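The mechanics behind that trick are almost embarrassingly simple. A hypothetical Python sketch of the evergreen-countdown pattern - no deadline is ever stored, so every page load renders the same remaining time:

```python
from datetime import timedelta

# The "deadline" is just a fixed offset rendered on every request.
SALE_WINDOW = timedelta(hours=5, minutes=40, seconds=57)

def countdown_banner() -> str:
    # An honest site would compute (real_deadline - now). This one
    # "resets" on each page load, exactly as described above.
    total = int(SALE_WINDOW.total_seconds())
    hours, rest = divmod(total, 3600)
    minutes, seconds = divmod(rest, 60)
    return f"Sale ends in {hours}:{minutes:02d}:{seconds:02d}!"

print(countdown_banner())  # always "Sale ends in 5:40:57!"
```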
Dave Bittner: Right, right. Exactly. And also, historically, they make the point that, you know, you and I, Joe - we grew up in the days of the Columbia Record Club (laughter).
Joe Carrigan: Yes.
Dave Bittner: Remember that?
Joe Carrigan: Yes.
Dave Bittner: Columbia House Record Club? And...
Joe Carrigan: I got roped into that once.
Dave Bittner: (Laughter) This was - for our younger listeners, this was a thing. It was usually an advertisement in your Sunday newspaper, a big full-page color ad. And it would say, you know, 12 records, tapes or CDs for a penny.
Joe Carrigan: Right.
Dave Bittner: And this seemed like a good deal. So you'd send in - you'd select your 12 CDs or cassettes or albums.
Joe Carrigan: Yup.
Dave Bittner: And they'd send them to you. But what you were really signing up for was a monthly delivery of an overpriced album. And the point is that a lot of these patterns are still in apps today. There are lots of apps like this. I almost got roped into one recently. I was looking for an app to do some simple little function that I needed to have done. And one of the options was an app that had a free three-day trial, Joe.
Joe Carrigan: Oh, don't...
Dave Bittner: For three days.
Joe Carrigan: No, no, no.
Dave Bittner: You could try the app for three days. What do you think happened after those three days?
Joe Carrigan: You get billed a monthly fee of, like, 150 bucks or something like that.
Dave Bittner: (Laughter) Twenty-nine dollars a week, a week.
Joe Carrigan: A week? Oh, my God.
Dave Bittner: (Laughter) Right. And so the other notion here is that these app companies - they figure in the heat of the moment, you're going to do the thing you want to do. You're going to forget about it. And then in a few days, they're going to bill you the $29 or whatever it is. And they hope, A, you're not going to notice, and you're just going to pay it.
Joe Carrigan: Right.
Dave Bittner: Or if you do notice, you will just delete the app, shut off the payment. But it's not worth your trouble to go back and get the refund for the original.
Joe Carrigan: Right.
Dave Bittner: And so they profit. What's being done about this? Well, there are some steps being taken to try to tamp down on these things. If you live in California, California's Consumer Privacy Act, the CCPA, actually has provisions in it to help fight this sort of thing. So if you find yourself falling victim to this, you may want to report that to the powers that be in California. Congress is taking a look. Senators Mark Warner and Deb Fischer have introduced a bipartisan bill called the DETOUR Act, which, of course, has to stand for something.
Joe Carrigan: Yes.
Dave Bittner: And it stands for Deceptive Experiences To Online Users Reduction. How clever.
Joe Carrigan: Yeah, I think there's an entire office in - up on Capitol Hill that is dedicated to the creation of clever acronyms.
Dave Bittner: (Laughter) Yes, exactly. A national treasure, a person who just has a knack for this.
Joe Carrigan: Right.
Dave Bittner: (Laughter) They pay lots of money to just, you know, yeah, just rattle them off.
Joe Carrigan: It's probably a team of people.
Dave Bittner: And the FTC, the Federal Trade Commission, is also looking to tamp down on this. Evidently, they have some existing ability. This article refers to Section 18 of the FTC Act, which allows them to make rules to try to fight this sort of thing. So, you know, it's a cat-and-mouse game. It's whack-a-mole. But it's good to see that it has caught the attention of regulators, and they're trying to do something about it. In the meantime, I think it's just something to be aware of. And as we always say, warn your friends and loved ones that these things are out there. I think the apps are particularly troubling...
Joe Carrigan: Yeah.
Dave Bittner: ...'Cause if your kids download a game or your folks download some sort of utility or something like that - and the next thing you know, they're paying way more than it's possibly worth, and they're just getting scammed out of money.
Joe Carrigan: I was joking when I said 150 bucks a month. But the app you were talking about actually does cost about 150 bucks a month if you have a five-week month.
Dave Bittner: (Laughter) Right.
Joe Carrigan: There was the Smurf app. Do you remember that, when "The Smurfs" movie came out?
Dave Bittner: No, I don't.
Joe Carrigan: And kids had to buy Smurfberries. And they were, like, billing their parents' credit cards for, like, $500 to play the video game on their phones.
Dave Bittner: (Laughter) Right, right, right.
Joe Carrigan: And people went bananas. And Apple, I think, said, no, this is not going to happen. I may be misremembering this.
Dave Bittner: Yeah.
Joe Carrigan: But it was - I know it was "The Smurfs" app, and there were Smurfberries that kids were buying.
Dave Bittner: Right, right.
Joe Carrigan: And it was costing people a lot of money.
Dave Bittner: Yeah. To their credit, I know at least on the Apple store, Apple is pretty quick to refund people's money when these sorts of things happen and try to shut them down on the App Store. But they can only do that if you report it. And like we said...
Joe Carrigan: Right.
Dave Bittner: ...So many people just don't feel like it's worth their time, or they could be embarrassed that they fell for something like this. So they just, you know...
Joe Carrigan: Yeah.
Dave Bittner: ...They figure it's a cost of a lesson learned, and they move on.
Joe Carrigan: And on the other side of the argument, as an app developer, you're entitled to charge whatever you want for the app. I think it should be easy for you to cancel it and get your money back...
Dave Bittner: Yeah.
Joe Carrigan: ...Easier. And if Apple makes it easy, hopefully Google does, too. I don't know. I've never had this issue. I don't sign up for apps with recurring fees, just as not - I don't see the value in it, but...
Dave Bittner: That you know of.
Joe Carrigan: That I know of. That's right. Maybe I'll find an app one day that actually does everything I need to do.
Dave Bittner: (Laughter) That's right. That's right.
Joe Carrigan: And I'd happily pay $30 a week for it, but I haven't found that app yet.
Dave Bittner: There you go. Yeah. Yeah. All right. Well, that's my story this week. What do you have for us this week, Joe?
Joe Carrigan: Dave, I have a story from ZDNet written by Danny Palmer. We talked about this last week on the CyberWire. It's called "Why do phishing attacks work? Blame the humans, not the technology." The first thing I want to do here is take issue with the phrase blame the humans.
Dave Bittner: Yeah. Yeah.
Joe Carrigan: I don't think you blame the humans who fall for the phishing attack. I think you blame the humans who are sending the phishing emails. Those are the people who are at fault. Those are the criminals. Those are the nefarious actors.
Dave Bittner: Yeah.
Joe Carrigan: The people who click the links are doing what people do. And then the subheading of this article says, cyber criminals know that people want the easiest route to resolving an issue, and phishing emails are designed to take advantage of that, which we've talked about many times. Now, in a social engineering attack, there are a number of elements, and usually you'll find most of these elements, if not all of them. There's a pretext - right? - which is a lie about who the person sending you the email is and why they're contacting you. Then there's an appeal, which is some emotion they're trying to elicit from you, and that's usually fear, greed or a desire to help our fellow humans, right?
Dave Bittner: Right.
Joe Carrigan: Or maybe some combination of those. Usually you see greed and the desire to help paired. That's what those emails are doing - hey, I'm about to die, and I have all this money. Can you help me? And you can take some of the money. That's appealing to both of those, right?
Dave Bittner: Right. Right.
Joe Carrigan: And then there's an artificial time constraint and some call to action. And sometimes we'll see isolation, which is a very powerful tool when they use it, but we don't always see it. All of these elements are almost always in a phishing email, with the exception of isolation, which comes and goes, right? You see isolation in those sextortion emails - don't tell anybody about this. That's them trying to isolate you.
Joe Carrigan: The article says the messages are designed so that clicking the phishing link is the easiest thing to do. Danny quotes Troy Hunt, who says, part of the problem is that phishing signals are often indistinguishable from positive user experience attributes, right? So in other words, what a phishing email looks like is exactly the same as what any other email would look like. It doesn't seem out of place, and that's kind of why they work. Users can choose not to follow the link and instead open a new window and go to the website to verify the message's authenticity, thus avoiding the phishing link. However, phishing attacks are successful for the previous two reasons. In other words, people look for the easiest way to solve the problem, and the phishing email looks indistinguishable from any other email.
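One way to make that concrete: a classic tell that software can catch is a link whose visible text shows one domain while the underlying href points to another. A minimal, hypothetical sketch using only Python's standard library - one naive heuristic, nowhere near a full phishing detector:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkMismatchFinder(HTMLParser):
    """Flag <a> tags whose visible text looks like a URL for one host
    while the href actually points somewhere else."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag != "a" or not self._href:
            return
        shown = "".join(self._text).strip()
        if "." in shown and " " not in shown:  # only compare URL-looking text
            shown_host = urlparse(shown if "://" in shown else "//" + shown).hostname
            real_host = urlparse(self._href).hostname
            if shown_host and real_host and shown_host != real_host:
                self.suspicious.append((shown, self._href))
        self._href = None

finder = LinkMismatchFinder()
finder.feed('<a href="https://evil.example.net/login">www.mybank.com</a>')
print(finder.suspicious)  # [('www.mybank.com', 'https://evil.example.net/login')]
```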
Joe Carrigan: One of the things in the article is that Danny Palmer uses the word coerced in referencing controlling user behavior. And he's right. Coercion is a very common tool, but it's not the only tool they use. They also use, like I said, the appeal to greed or the desire to help. Any time something seems to be appealing to you - almost like pandering - you've got to have this skepticism, watching for pandering or for someone trying to scare you, which I think is a good quality to have. If you're skeptical of those kinds of things all the time, you're better off. In emails especially, you should be learning to spot that kind of thing. You shouldn't be falling for these appeals to your base emotions.
Joe Carrigan: The defenses for this are very similar, whether you're talking about an individual person or an organization. There are three prongs of defending against this - technology, training and policy. The technology prong is for everybody, organizations and individuals alike. Use multifactor authentication wherever you can, and use the best form that's available to you. Use a password manager so that you're not reusing passwords, because you don't want one account getting compromised and then every account that uses that password getting compromised. Next is training. In a company, the best thing to do is have a security awareness program that all of your employees have to attend on a regular basis. Of course, if you're an individual, you don't have that opportunity, but you can always listen to this podcast, which I think is great - but I'm already preaching to the choir there, right?
Dave Bittner: (Laughter) Right.
Joe Carrigan: So tell your friends about the podcast. And finally, have a policy in place. If you're an organization and somebody receives an email that says, hey, we're changing our banking information, send the money here - have a policy that requires a phone call, right? Or that requires some other form of authentication or authorization. For a personal policy: I don't ever click the link, right? That's kind of my personal policy. Like, I get an email from somebody that says, is this you on Instagram? This is a very common phishing tactic, where they have a very short message like, is this you, or can you help me, or are you in, and then a link that's a phishing link. So don't click the link; go to the website yourself. Either use the links you have saved in your browser or manually type the address and go that way.
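Joe's technology prong above includes multifactor authentication. For the curious, the math behind the rotating six-digit codes from authenticator apps (RFC 6238 TOTP) is compact enough to sketch with Python's standard library - an illustration only; real systems should rely on a vetted implementation:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    # RFC 6238: HMAC the current 30-second time step with a shared
    # secret, then truncate the digest to a short decimal code.
    key = base64.b32decode(secret_b32, casefold=True)
    step = int(time.time()) // period
    digest = hmac.new(key, struct.pack(">Q", step), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# "JBSWY3DPEHPK3PXP" is a common demo secret, not a real credential.
print(totp("JBSWY3DPEHPK3PXP"))  # a fresh code every 30 seconds
```

Because the code depends on a secret the phisher doesn't have and expires in seconds, even stolen credentials alone aren't enough - which is why it appears again as the last line of defense in Joe's layered example later in the show.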
Dave Bittner: All right. Well, it's a good story. We will have links to all of our stories in the show notes. Joe, it is time to move on to our Catch of the Day.
(SOUNDBITE OF REELING IN FISHING LINE)
Joe Carrigan: Dave, our Catch of the Day comes from Big Mike (ph). He's a listener to the show, and he says it's a little bit different than most Catches of the Day. He listens to old-time radio podcasts, and one he just listened to had a perfect example of social engineering. It's from Episode 3423 of "The Great Detectives of Old Time Radio" podcast - "Casey, Crime Photographer: Lady Killer."
(SOUNDBITE OF PODCAST, "GREAT DETECTIVES OF OLD TIME RADIO")
Tony Marvin: Good evening, ladies and gentlemen. This is Tony Marvin. Every week at this time, the Anchor Hocking Glass Corporation of Lancaster, Ohio, and its more than 10,000 employees bring you another adventure of Casey, Crime Photographer, ace cameraman who covers the crime news of a great city, written by Alonzo Deen Cole. Our adventure for tonight - Lady Killer. Mid-afternoon, in the cocktail lounge of a luxurious resort hotel in Colorado, a man surveys the place with casual approval and saunters to the bar. He's about 35, well-dressed and rather good-looking. But there is nothing distinctive about him. As he waits for one of the bartenders to serve him, he hums an old tune.
Cecil Gramatan: (Humming).
Frank: What'll it be, sir?
Cecil Gramatan: Martini, please. Extra dry.
Frank: Yes, sir. Say, haven't I served you before, sir?
Cecil Gramatan: No, I just checked into the hotel an hour ago. This is my first visit to the bar.
Frank: I don't mean here - someplace else, maybe LA. I worked there last year.
Cecil Gramatan: I've never been to Los Angeles.
Frank: Denver or Frisco, then?
Cecil Gramatan: No, I'm sure we've never met before. I've spent the last 10 years in Europe.
Frank: Well, I've never been across the water - yet. I guess, you just remind me of somebody.
Cecil Gramatan: Yes, I imagine that's it.
Frank: Yeah. See how this martini strikes you.
Cecil Gramatan: Oh, it's exactly right.
Frank: That's how I try to make everything. Call me when you want another. My name is Frank.
Cecil Gramatan: Frank?
Frank: Yeah?
Cecil Gramatan: I shan't want another for a while, so I'll pay you now. There you are. Keep the change.
Frank: Say, thanks.
Cecil Gramatan: All right. Oh, by the way, that fine-looking woman at the corner table over there - you know her?
Frank: The brunette with the big diamond rings?
Cecil Gramatan: Uh huh.
Frank: Yeah, I know her.
Cecil Gramatan: Her face is very familiar. I was just wondering if...
Frank: (Laughter) You've probably seen her picture in the papers. There was a big story about her a couple of weeks ago when she got a Reno divorce from her husband, plus a million-dollar settlement.
Cecil Gramatan: Oh.
Frank: She's Madeleine Chalmers.
Cecil Gramatan: Oh, yes. Yes, of course.
Frank: I used to wait on her back in Toledo, where she comes from. I worked there two years ago. She and another wealthy lady named Uttley used to say I was the only bartender they'd ever met who can make a planter's punch exactly right.
Cecil Gramatan: This Mrs. Uttley, she's a close friend of Mrs. Chalmers?
Frank: Miss Uttley. She was one of them bachelor girls then. Since then, she's married a banker named Fisher. Yeah, she and Mrs. Chalmers were pals.
Cecil Gramatan: Well, I'm acquainted with a banker named Fisher. I believe he married an Uttley. Let's see now. His first name is...
Frank: This one's first name is Douglas. Either one you know?
Cecil Gramatan: Well, his wife's first name is...
Frank: Irene, Irene Uttley.
Cecil Gramatan: Uh huh, they're the people.
Frank: I'm told they took a trip to Europe last year, where you were. I guess, you met them over there.
Cecil Gramatan: Yes, London or Paris, I think. Mr. and Mrs. Fisher aren't here by any chance?
Frank: No, no. Mrs. Chalmers tells me they're up in Maine this summer - Bar Harbor.
Cecil Gramatan: Well, since Mrs. Chalmers is alone, I shouldn't be intruding, I suppose, if I introduced myself and inquired about my friends, the Fishers?
Frank: No, I don't think so.
Cecil Gramatan: I'll see you later, Frank.
Frank: Yeah, thanks again, Mr...
Cecil Gramatan: Thank you. (Humming). How do you do, Mrs. Chalmers?
Madeleine Chalmers: I beg your pardon.
Cecil Gramatan: I can see you've forgotten me. Irene Uttley introduced us several years ago in Toledo, I think it was, before she married Doug Fisher.
Madeleine Chalmers: Oh, you're a friend of Irene's and Doug's?
Cecil Gramatan: Well, I spent a day with them only a week ago in Bar Harbor.
Madeleine Chalmers: How are they? Irene hasn't written to me in ages.
Cecil Gramatan: They were fine, enjoying themselves. May I sit down and order us - I seem to remember you had a preference for planter's punch.
Madeleine Chalmers: Do sit down.
Cecil Gramatan: Thank you very much.
Madeleine Chalmers: I'm terribly embarrassed. You remember even my favorite drink, and I can't recall...
Cecil Gramatan: Well, unlike you, I have a face that people soon forget. My name is Gramatan, Cecil Gramatan.
Madeleine Chalmers: Cecil Gramatan.
Cecil Gramatan: You plan to stay here for some time, Mrs. Chalmers?
Madeleine Chalmers: At least, several weeks.
Cecil Gramatan: Well, I'm going to remain about the same period, and if you'll permit our acquaintance to ripen, I'll try hard not to be forgotten again.
Dave Bittner: Well, Joe, what do you think about that?
Joe Carrigan: Well, I'll tell you, Dave, that's pretty good. I like...
Dave Bittner: (Laughter).
Joe Carrigan: I want to thank Big Mike for sending this in. Although this is a dramatization, it's a great example of two key components of social engineering. First, intelligence gathering - right? - or OSINT. This guy - back in 1947, when this came out, they didn't have the Internet, right?
Dave Bittner: (Laughter).
Joe Carrigan: The Internet actually did not exist back then.
Dave Bittner: (Laughter) Right.
Joe Carrigan: And this guy goes to the single source of information, the bartender, and he gives the guy a tip, which gets him talking. And people love to talk and show off how much they know. And then he uses that information to build a pretext to approach the victim. It's exactly how social engineering works today. It's just - we don't ask the bartender. We ask LinkedIn or Facebook.
Dave Bittner: He walks into the conversation with total confidence and just starts sort of filtering in little bits of information, little facts. He sets the woman off balance because she's wondering to herself, oh, how could I have forgotten this person? I'm so embarrassed, you know?
Joe Carrigan: Right.
Dave Bittner: This - obviously we must have met before. How else could this person know so much about me?
Joe Carrigan: Absolutely.
Dave Bittner: And away they go with the confidence game - right?
Joe Carrigan: Yep. And this is a reminder - one of the things Mike says is that people think of social engineering as a recent development, and it's not. It's been around for millennia.
Dave Bittner: Yeah, absolutely. Well, our thanks to Big Mike for sending that in. Again, it's "The Great Detectives of Old Time Radio" podcast.
Dave Bittner: Joe, I recently had the pleasure of speaking with Dr. Margaret Cunningham. She's from a company called Forcepoint. And our discussion centered on cognitive biases that can lead to reasoning errors in cybersecurity. Here's my conversation with Dr. Margaret Cunningham.
Margaret Cunningham: Unfortunately, people have limited capabilities of, you know, paying attention or remembering things. And when we have attackers who are focused on those specific types of mistakes, like, you know, slipping up or clicking something that's suspect without thinking, they're very good at manipulating the environment so that you make more of those mistakes. They've really got it down pat. In addition, they're very good at manipulating emotion and trying to get people to feel like they're in a hurry or they're in trouble, which is very, very effective for making those types of mistakes.
Dave Bittner: Yeah. And, you know, it's something we talk about on this show a lot, about how they kind of short circuit your critical thinking and, you know, make you do things that you probably otherwise wouldn't do by just cranking up your emotions.
Margaret Cunningham: Yeah, absolutely. And there's a decision-making framework that looks at the difference between the types of decisions people make when they're in a hot state - so they've got worry, anxiety, time pressure - versus a cold state, where, you know, they're able to sit back and think more logically, a little more analytically, about whatever they need to make a decision about. So that hot versus cold is something that is easily manipulated and also something that really benefits the attackers.
Dave Bittner: Can we dig into that a little bit? How do you define that hot state?
Margaret Cunningham: This is sort of a different type of example, I'll say. So if you think about the choices that you would make about your end-of-life care when you are 25, you think, you know what? It's fine. When I'm 85, if I'm already really sick - you know what? Just let me go. I don't want any of the interventions. I don't want to spend the money. That's a very cold decision-making state. You're not in the heat of the moment. You're not experiencing the emotions of being close to the end of your life.
Margaret Cunningham: But, you know, in contrast, I can speak actually from experience. My grandmother, who is in her late 80s, thought that she would never want to go through cancer treatment. And when she was much older and suffering from cancer, she was determined to do the treatments. And it's a totally different type of decision. It's much more emotional than if you can put that distance between it. And that's very extreme, but we go through this all the time at a kind of smaller level throughout the day.
Dave Bittner: Are there things that people can do to be self-aware when they find themselves in a hot state, or is it the hot state, by its very nature, leads you astray?
Margaret Cunningham: You know, I think there are a lot of things that we can do to prepare people for specific types of hot states that are linked to cybersecurity issues. So right now, in the middle of Texas, a lot of people are without power. They're cold. They're hungry. They're worried about their water, everything else. If you're an organization that has a ton of employees in Texas, it would be a great idea to start communicating with them about what they can expect from the company.
Margaret Cunningham: And what I mean by that is let them know that no one's going to ask for their account information. No one's going to ask about the status of their home network. Give them the correct phone numbers to call if they need someone. And by giving all of that information explicitly and concretely, it takes the burden off of the person. And that's very helpful when you're stressed out.
Dave Bittner: And I suppose, I mean, part of the story is that the scammers are likely going to be trying to take advantage of these people who are in this difficult situation when they're already emotionally and physically vulnerable.
Margaret Cunningham: Oh, yeah, absolutely. I'm parked in a hotel right now, which is very lucky. But if someone called me, knew my name, obviously had my phone number and then told me there was something wrong at my address and they needed information about my electric account or my water, I might give it to them, because I'm very worried about my home, and I can't check on it. We see this sort of predatory behavior during a lot of crises.
Dave Bittner: Let's go through some of the types of vulnerabilities here, some of the biases. I want to get your take on this thing called availability bias. What's that?
Margaret Cunningham: Your memory is a funny creature. Basically, sometimes it works like a big stack of paper. Whatever is closest to the top of the stack is the thing you grab for. So if you keep hearing about something over and over and over again, you're likely to think that that's the thing that's causing problems. For instance, right now we are being bombarded by cybersecurity news about SolarWinds and Russian hackers and all of these very important factors that are impacting all of our organizations. But the reality is that's not the whole picture. And if we focus too much on that because it's the most available, then we might be missing the boat for things like social engineering attacks during a crisis.
Dave Bittner: So we can't let the distraction of these shiny objects take us away from that basic day-to-day hygiene that's so important.
Margaret Cunningham: Yeah, absolutely. And it can really impact where resources are spent at the organizational level. So we have CEOs and important people running companies, and they hear over and over again about Russian hackers. Right? They might say, that's where I'm going to spend this quarter's money trying to protect against this and kind of move things away from the basics that might actually serve them better.
Dave Bittner: What about this notion of - you hear it referred to as the problem exists between keyboard and chair. You know, without blaming the user, but often that's the weak link, right?
Margaret Cunningham: Yeah. You know, I don't really like calling people the weak link, mostly because if you look at people on the grand scale of things, we're doing a great job. Actually, we're doing a phenomenal job. It's just these little tiny pieces that aren't quite there. So the problem exists between keyboard and chair is sort of an old phrase, like somebody calls and they need help with their computer. And the tech support person is looking at the problem and going, oh, my gosh, the problem isn't the technology; it's the person.
Dave Bittner: Right.
Margaret Cunningham: What's funny about that is, you know, there's something in psychology called the fundamental attribution error - a big, long name for when somebody else makes a mistake, I blame it on that person. Something about that person is the reason why they made the mistake. But if I make the same mistake, I have something called the self-serving bias, where I can find all of the environmental factors that got to me. And those environmental factors are why I made the mistake. Simple example - you watch somebody trip on a sidewalk. And you're like, oh, what a clumsy fool.
Dave Bittner: Right, right (laughter).
Margaret Cunningham: And you're like, oh, man, what a klutz. But if you trip on the sidewalk...
Dave Bittner: Right. Every other driver on the road is an idiot, right? But not me.
Margaret Cunningham: Yeah. Yeah, absolutely. Never me, never me.
Dave Bittner: Yeah.
Margaret Cunningham: And it's funny because this happens at a really, really large scale in tech because (laughter) we're always sort of finger pointing. OK, well, if I'm the user, I'm going to say, my company should have had a better email filter. The company's going to say, this person should have been paying attention and never clicked that link. Heck, they took our training.
Dave Bittner: Right.
Margaret Cunningham: The problem with it is it doesn't really get to the root of the problem. And so we're still standing around pointing fingers and suffering from the same issues (laughter).
Dave Bittner: How do we overcome these things? How do we get past some of these biases?
Margaret Cunningham: It's never going to be perfect. And I actually don't think there's a way to design anything that's perfect - technology, people, whatever. But what we can do is we can think about the types of mistakes people make, when they're making those mistakes and how to make our systems a little bit more dynamic in their response.
Margaret Cunningham: So if you know that you have a big issue with people getting phished at your company, you're going to have to think about the things that happen after they click the link. What can you do to prevent being stuck with a crazy screen asking for bitcoin 'cause you've got ransomware? You know, you've got to think past the mistake and start making systems that are able to mitigate the actual impact of the mistake. That's a tall ask, I know.
Dave Bittner: Yeah. But what about the culture itself? I mean, I'm thinking of, you know, making it so that your people aren't afraid of making mistakes - they aren't afraid of retribution if they make a mistake - that, you know, these are more learning opportunities than something that someone's going to get punished for.
Margaret Cunningham: When people have to hide their mistakes, they fester. If I know that I'm being scrutinized in a way that, if I make one wrong move, I may lose my job, I am never going to admit to a mistake. And that actually exposes companies to a huge risk.

Margaret Cunningham: In a lot of other industries where there's a lot at stake - say, aviation - we not only, like, encourage sharing the mistakes. We encourage sharing the near misses. And I don't think we're doing that very much in this industry. It's really important that we start thinking of our organizations and our employees as more than their employee number. It will actually have a positive impact on your safety culture. You know, I think when people don't feel invested in their organization, or they feel that their organization is not invested in them, they're less likely to engage in positive behaviors.
Dave Bittner: All right, Joe, what do you think?
Joe Carrigan: Dave, I'm very happy to see that Dr. Cunningham is a principal research scientist focused on human behavior. One of the things I've been saying for a long time now is that we don't have enough people focusing on the human side of cybersecurity. And I'm glad that she's here, and I think we need more people doing this kind of research. I think we're getting better at this as a community, as a cybersecurity community. There are more people focusing on this, but I don't think we're where we should be. We tend to focus on the technology side because that's - you know, we're technology people for the most part. But there's definitely a place for the psychology and the human side of this in the field.
Joe Carrigan: Dr. Cunningham makes a great observation at the start of the interview: our capabilities are limited. We don't like to think about that. We don't like to think of ourselves as having limited capabilities. But it is a fact. And attackers have the ability to manipulate people and the skills to do so. So they're going to exploit that, and they're going to take advantage of that limitation and try to get you to do things that are counter to your interest. I like the description of hot state and cold state. And her health care example - she calls it extreme, but I think it's very valid.
Joe Carrigan: She goes on to talk about the power outages in Texas. And if you think about these kinds of things, these are very low on Maslow's hierarchy of needs. If you're not familiar with that, it's, like, a pyramid-shaped model - a standard psychological model, I guess. It says that if you don't have your base needs met, then you'll never meet your higher needs, right? And I was looking at it the other day, and I was thinking about where the social engineering attacks hit us. And they hit us fairly low on Maslow's hierarchy of needs. They try to elicit our fear, our greed. They're not going after our higher functions. They're going after our lower functions. I like what she says about PEBCAK - you know, problem exists between chair and keyboard - and the fundamental attribution error and self-serving bias, and your comment - everybody else on the road is an idiot, and I'm great at driving.
Dave Bittner: (Laughter) Right, right.
Joe Carrigan: Driving is not the only place this happens. That's absolutely correct. Everyone likes to blame someone else. And if that's the case, we'll never actually solve the problems. We have to look at what the real problem is. Thinking past the mistake and building a system that can mitigate the mistake - she says this is a big ask, and it is a biggie. I think that it starts with a holistic approach. Realize that the success of a phishing email is the failure of a system, not just a part of a system or a person.
Joe Carrigan: For example, if a credential phishing email comes through and somebody clicks on the link and gives up their credentials, what has failed? The email server accepted the delivery - and there are mitigations you can put in place to stop that from happening, right? - by verifying the sender. The spam filter missed it. The user's spam settings missed it. The user clicked on the link. The web filters - either the network web filters or the browser-based web filters - missed it. The user doesn't double-check the domain. There's some manner of multifactor authentication failure. And the attacker is in, right? That's seven different things, and there might be even more. But that's seven things I can think of off the top of my head that have to fail in order for a user to cough up their credentials. Any one of those things works, and the attack fails, right?
Dave Bittner: Right.
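Joe's layered rundown can be restated as a system in miniature. A hypothetical sketch (the layer names mirror his list, not any real product's API) of the defense-in-depth point - the attacker needs every layer to fail, so any single layer that works stops the theft:

```python
# The layers from Joe's example; the attack succeeds only if ALL fail.
LAYERS = [
    "sender verification (e.g., SPF/DKIM/DMARC)",
    "server-side spam filter",
    "user's spam settings",
    "user declines to click the link",
    "web filter blocks the site",
    "user double-checks the domain",
    "multifactor authentication",
]

def credentials_stolen(layer_failed: dict[str, bool]) -> bool:
    # Defense in depth: one working layer anywhere stops the attack.
    return all(layer_failed.get(layer, False) for layer in LAYERS)

# Everything fails except MFA -- and the attack still fails.
scenario = {layer: True for layer in LAYERS[:-1]}
print("stolen" if credentials_stolen(scenario) else "blocked")  # blocked
```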
Joe Carrigan: So it's a system, and we have to think like that. Building a culture where mistakes are allowed to fester is very bad. Nothing good comes of that. When people don't feel invested, they're less likely to engage in positive behaviors like coming forward with a mistake they've made. If people are motivated to cover up their mistakes, no good can come of that.
Dave Bittner: Yeah. You have to look at it as being a learning opportunity and not an opportunity to punish people or, you know, make them feel bad.
Joe Carrigan: Absolutely not.
Dave Bittner: Everybody can learn from these mistakes. Yeah, absolutely.
Joe Carrigan: Carrots, not sticks, right?
Dave Bittner: (Laughter) There you go. Well, our thanks to Dr. Margaret Cunningham for joining us. We do appreciate her taking the time.
Dave Bittner: That is our show. We'd like to thank all of you for listening. We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: And I'm Joe Carrigan.
Dave Bittner: Thanks for listening.