Awareness, behavior, & beyond.
Jessica Barker: The click in many organizations should not be the defining thing that causes a breach. We should say "what set this person up in a situation where that click happened? And then how can we move in and mitigate, and build resilience there, for the next time, so that they can do better?"
Dave Bittner: Hello everyone, and a warm welcome to the Hacking Humans Podcast, brought to you by N2K CyberWire. Every week, we delve into the world of social engineering scams, phishing plots, and criminal activities that are grabbing headlines, and causing harm to organizations all over the world. I'm Dave Bittner, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute, hey Joe!
Joe Carrigan: Hi, Dave.
Dave Bittner: We've got some good stories to share this week, and later in the show, our guests are Dr. Jessica Barker and Perry Carpenter. [ Music ] Alright Joe, before we jump in, we've got a little bit of follow-up here, do you want to start things off for us?
Joe Carrigan: We do, indeed have some follow up. First is some follow up on Deana's problem with her grandmother, who she referred to as "Nana."
Dave Bittner: Mm, mm-hm.
Joe Carrigan: And Richard wrote in to say: you mentioned setting up alerts and such to slow things down. It might be helpful to separate the suggested security changes from the relationship. That is to say, just say that there are more and more scams every day, and here are some ways to protect yourself from whatever. And then give some concrete examples of something worrisome, like a fake IRS, a fake kidnapping, or a scam of that nature. By separating the security measures from the existing relationship, your grandmother may be able to consider the situation without the emotional load, and start to wonder about some of the inconsistencies in the scammer's story. Richard also adds that it would be helpful if she were listening to Hacking Humans as well, and I can't agree more [laughter].
Dave Bittner: Okay.
Joe Carrigan: Anytime anybody can shamelessly plug this show. But Richard did suggest that.
Dave Bittner: Yeah.
Joe Carrigan: And you know, we target this podcast to be the podcast that your grandmother can listen to.
Dave Bittner: Mm-hm.
Joe Carrigan: We do talk about some inside baseball in cybersecurity from time to time, but generally speaking, we're talking about things that everybody can understand.
Dave Bittner: Sure. Yeah, I mean, that's an interesting point to kind of, I guess, you know, sneak in a little bit of the doubt [laughs]--
Joe Carrigan: Right.
Dave Bittner: About scams and so forth, separating it from the proximate anxiety that is going on. That's an interesting idea. And in a situation as difficult as the one that was described, it couldn't hurt. What else we got?
Joe Carrigan: Michael writes in with some feedback on episode 288. He said, "Enjoying your podcast, but listening to your recent episode where you were talking about social engineering to compromise open source projects, I had some thoughts." This is the open source project that was being compromised by a foreign agent.
Dave Bittner: Right.
Joe Carrigan: They were putting large blobs in there. He says the implication of how you were discussing this was that this is only a problem for open source software. I think this is a bit of a misrepresentation. I can imagine closed source projects are equally vulnerable, particularly to nation state attackers. Getting someone employed by the company directly, or at an outsourcing firm, to try to smuggle in a backdoor seems plausible. Closed source projects are potentially even less scrutinized, and already have lots of blobs to protect the company's software. Blob is short for binary large object. It just means, here is a bunch of stuff we're going to use. Could be anything, really. It's just binary data, which essentially all the data on your computer is.
Dave Bittner: Mm-hm.
Joe Carrigan: I'm now getting too deep in the weeds, I'm going to stop. Maybe I'm misunderstanding something, but I generally feel that open source is a better guarantee of security than closed source. Michael. Alright, so, a couple things. Number one, you're 100 percent correct, and in fact, the recent SolarWinds breach was exactly this. I guess it was like two years ago now, right? SolarWinds?
Dave Bittner: Yeah.
Joe Carrigan: That was a compromise of software supply chain, and cryptographic keys, which was an impressive get, actually, but that was, that stuff was rolled out to production systems all around the world, and let a nation state actor have access, so your point is valid, Michael. I don't think that I was trying to say that, in the show, or trying to intimate that in any way, shape or form. Open source projects, we like to think are more secure because they are open, and open to scrutiny, but just because they're open to scrutiny doesn't necessarily mean they always get that scrutiny.
Dave Bittner: Yeah, it's the age-old, I guess debate is the way to say it? Right? And you're absolutely right. And it's, I think it's tempting to hold up the ideal of open source software versus the reality of open source software.
Joe Carrigan: Right.
Dave Bittner: And I think this particular breach that we're talking about here was an example of some of the realities that people don't talk about, which is the exhausted maintainer, you know? Desperate for someone to come in and try to help. And you know, not-it's not like people were knocking on this person's door to say how can we help you [laughs], right?
Joe Carrigan: Or the people, rather the people that were knocking on the door were malicious actors.
Dave Bittner: Well the one person who is knocking on the door, right. So, you know, I think ideally yes, open source software has a lot of eyes on it.
Joe Carrigan: Yeah.
Dave Bittner: But meanwhile, back in the real world, that doesn't always happen that way.
Joe Carrigan: True.
Dave Bittner: You need, you need that vigilance all around.
Joe Carrigan: Yeah. I will also say that Michael's point is valid about closed source being insecure for its own reasons. With a closed source product, you're usually talking about a profit-driven product, right? So there are going to be budget and schedule constraints being placed on that all the time, and security can be an afterthought in a lot of those situations. So I think it's more of a pick-your-poison kind of situation.
Dave Bittner: Yeah, at the same time, a closed source thing could be under better protection because there are trade secrets in there, and the company is financially motivated to keep it secret.
Joe Carrigan: Yeah.
Dave Bittner: So you're right, there are a lot of different things at play here, but I do think Michael's point is a good one. So yeah. Thank you for sending that in, Michael. We do appreciate it. And we appreciate hearing from all of you, even though we don't have the ability to answer all of your questions on air, we do read all of them. You can email us. It's Hacking Humans at N2K dot com. Alright, let's move on to some stories here. And my story comes from the folks over at Forbes. It's an article titled Security Experts Issue Jenny Green Email Warning for Millions. You've got to be careful of Jenny.
Joe Carrigan: Who is Jenny?
Dave Bittner: Well, 867-5309, that's who Jenny is, everybody in the-
Joe Carrigan: Jenny says--
Dave Bittner: Every Gen Xer knows who Jenny is [laughs].
Joe Carrigan: Right, knows her phone number anyway.
Dave Bittner: That's right. That's right, yeah. Jenny, and in the early days of the internet, there was JenniCam. So, there's lots of different Jennys out there. But this is about an email campaign. A massive email campaign that is distributing ransomware, specifically the LockBit 3.0 ransomware, and this is from some research from the folks at Proofpoint, which is a security company. And they are using a botnet called Phorpiex, like p-h-o-r-p-i-e-x, so Phorpiex.
Joe Carrigan: Sounds like a great spelling of-I guess it means something?
Dave Bittner: Yeah, I don't know. It could be an ancient, you know, god or mystic, or something like that [laughter], I don't know, but if someone out there knows, let us know. It could be the ancient, you know, Greek god of, um, comfortable footwear? I don't know [laughter], but [laughing] who knows. But-
Joe Carrigan: Those awesome Greek sandals they've all been wearing--
Dave Bittner: Right, right, but what they're seeing is, the folks who are sending this out are making use of this botnet, which is a botnet that has been around since 2018, so five years, five, six years or so, and they're sending out millions of emails a day, which is a high volume.
Joe Carrigan: That is high volume for anything, really.
Dave Bittner: Right, and what's interesting about this is that the ransomware is contained in the email, so that's unusual also. Usually-
Joe Carrigan: Yeah, usually you are sent somewhere to go get it.
Dave Bittner: Right. Right, because they don't want it to be detected as part of the email payload.
Joe Carrigan: Right.
Dave Bittner: So you get a compressed zip file with an executable, that downloads and executes the LockBit ransomware. I want to check in with you on this, Joe, because my current knowledge of how things work on a PC is a little outdated and limited here-
Joe Carrigan: Okay.
Dave Bittner: If you uncompress a zip file, and there's an executable in there, it doesn't automatically execute, right?
Joe Carrigan: No, it does not.
Dave Bittner: I wouldn't think so.
Joe Carrigan: I've never had that happen, anyway.
Dave Bittner: Yeah, okay. I was just making sure [laughing].
Joe Carrigan: Right.
Dave Bittner: Because, my point is that it seems as though you have to do something here. You have to launch that executable that is in the zip file.
Joe Carrigan: Yeah.
Dave Bittner: In order to make things happen.
Joe Carrigan: Yeah, you have to be convinced to open it up and run it.
Dave Bittner: Right. So you get an email from someone named Jenny Green--
Joe Carrigan: Although I will say this, there are executable zip files that I think you can script, but that's like a tool, like you remember PKZIP used to have that capability?
Dave Bittner: Okay, no [laughter]. No, okay.
Joe Carrigan: You could create an executable zip file that would do the extraction for you, but I don't remember if it would start up the application afterwards.
Dave Bittner: Yeah.
Joe Carrigan: But it came in as an EXE. Maybe my memory is not what it used to be, but I think I remember that.
Dave Bittner: Yeah, I mean, I think it would surprise me, and you know, again, I'm living over here blissfully ignorant on planet Macintosh-
Joe Carrigan: Right.
Dave Bittner: When I-so I know how it works on this platform, but I'm, you know, I'm guessing at how it works on the PC. And I assume, because it wouldn't make sense for it to do anything other than that. It certainly wouldn't be secure, so let's give that platform the credit for doing the right thing that I think it deserves.
Joe Carrigan: Yep.
Dave Bittner: So you get an email from someone named Jenny Green, and the title of the email is "Your Document."
Joe Carrigan: Hm.
Dave Bittner: And again, you open it up, and you get this ransomware. So what's really unusual about this is just the amount of emails that are going out, and they seem to be scattershot. Doesn't seem to be targeting anybody in particular, they're just going after everybody. So the mitigation here is if you don't know anyone named Jenny Green, or are not expecting a document from someone named Jenny Green, delete email from Jenny Green. I would go so far as to say set up a filter [laughs] that looks for the name Jenny Green, and if it doesn't put it right in the trash, at the very least it flags it and says, you know, this requires some extra scrutiny here.
Joe Carrigan: Yeah, and we had somebody on a couple weeks ago, they said if you're not expecting it, and it's a breach of process, right? It's a change in process, or a difference in process. Those are the two biggest red flags you can notice.
Dave Bittner: Yeah.
Joe Carrigan: You're not expecting anything from Jenny Greg-or Jenny Green, rather, and normally when someone sends you a document, it's not zipped up. You don't need to zip up documents anymore. There's enough bandwidth to send uncompressed documents.
Dave Bittner: Right.
Joe Carrigan: Yeah, these should be two red flags, even though the second one is kind of shoehorned in as a second red flag. But the fact that you're getting an email from someone you don't know, and they have something that is yours-yeah, you should be very suspicious of that.
Dave Bittner: Yeah. Evidently these all come from Jenny at GSD dot com, so fire up your filters, and [laughs], quarantine anything from that.
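For anyone who wants to automate that advice, here is a minimal sketch in Python of the kind of filter rule Dave is describing. It isn't tied to any particular mail client or gateway; the sender name and the jenny at gsd.com address come from the story, while the function name and the zipped-attachment check are illustrative assumptions.

```python
# A minimal sketch of the filter rule Dave describes: quarantine mail from the
# "Jenny Green" campaign, and flag anything else carrying a zipped attachment
# for extra scrutiny. Names and rules here are illustrative only.
from email import message_from_bytes
from email.utils import parseaddr

SUSPECT_NAMES = {"jenny green"}          # display name seen in the campaign
SUSPECT_ADDRESSES = {"jenny@gsd.com"}    # sender address cited in the story

def classify(raw_message: bytes) -> str:
    msg = message_from_bytes(raw_message)
    display_name, address = parseaddr(msg.get("From", ""))
    known_sender = (display_name.strip().lower() in SUSPECT_NAMES
                    or address.strip().lower() in SUSPECT_ADDRESSES)
    # The campaign delivers the ransomware as a zipped executable, so a zip
    # attachment from an unknown sender earns extra scrutiny on its own.
    zipped_attachment = any(
        (part.get_filename() or "").lower().endswith(".zip")
        for part in msg.walk()
    )
    if known_sender:
        return "quarantine"
    if zipped_attachment:
        return "flag for review"
    return "deliver"
```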
Joe Carrigan: Yeah, I don't know how good that is going to be. All they have to do is change the text in the script that generates these things, and--
Dave Bittner: Yeah, true. I mean, you can bet they'll start iterating on it, but that's, and this is very recent, from the past week or so, so you know, that's where it stands now. But you're right. It's sure to change. Alright, well, we will have a link to that story in the show notes. Joe, what do you have for us?
Joe Carrigan: Dave, I connected with a guy on LinkedIn named Paul Raffile. And Paul posted something on LinkedIn that said he just got fired from Meta, before he even started.
Dave Bittner: Okay.
Joe Carrigan: So Paul has a career, a long career, as an intel analyst, an investigator, and Meta had recruited him according to this post, to lead their human exploitation investigations. And he says his job was to prevent real-world harm, from crimes like sextortion and trafficking, and he was supposed to start earlier this month, in May.
Dave Bittner: Okay.
Joe Carrigan: But moments after hosting a webinar to combat the surge in sextortion that targets minors, and we've been talking about this for a while now, on this show, these tragic cases, that usually wind up-they target young men, usually teenaged boys, who do things that teenaged boys do, Dave, you and I were both teenaged boys once...
Dave Bittner: Yes.
Joe Carrigan: Sometimes my brother and I will reminisce, and remark how amazed we are that we survived [laughter], from some of the things we did.
Dave Bittner: Yeah. True.
Joe Carrigan: But, I'm glad I'm not a teenager now.
Dave Bittner: Right.
Joe Carrigan: Because these guys get roped into these sextortion schemes, and then they start getting the pressure cooker environment from the scammers, and this has led to over 30 teenagers killing themselves.
Dave Bittner: Mm-hmm.
Joe Carrigan: Terrible thing.
Dave Bittner: Right.
Joe Carrigan: So he's holding this webinar about this topic, hearing from the parents of these children who, and I like the way Paul puts it, who were killed by this crime. Meta contacted him shortly after the webinar, and rescinded the offer of employment.
Dave Bittner: Okay.
Joe Carrigan: He is speculating that it's because of the webinar.
Dave Bittner: Why?
Joe Carrigan: Probably because he was critical of the platforms.
Dave Bittner: Oh, I see.
Joe Carrigan: So he has an email here, he's posted an email in this LinkedIn posting that says "following up from our phone conversation to confirm your offer of employment at Meta has been rescinded, effective immediately. Thank you for your interest. Cheers! Bobby."
Dave Bittner: Okay.
Joe Carrigan: So Bobby from Meta was the one that did this. So, we have a guy who was hired by Meta to help Meta stop this. He is still involved in trying to stop this crime from happening, by informing people and raising awareness of it.
Dave Bittner: Okay.
Joe Carrigan: And he has other articles that we're about to go into, one of them here as well, that say here are some things that Meta could be doing right now that it isn't. And presumably, Meta sits in on the webinar, and says no, we're not going to hire you after all.
Dave Bittner: Okay.
Joe Carrigan: So here are some things that he suggested in another post. We'll put links to both of these posts of Paul's in the show notes. He has a list of six things Meta could do today to curb the surge of sextortion on Instagram. Number one: make teens' followers and following lists always private by default, and offer this privacy setting to all users. These criminals can come in and just target kids, and they immediately see everybody they're connected to.
Dave Bittner: Mm-hm.
Joe Carrigan: And we had a story a while ago, our most recent story on this, that talked about how these scammers infiltrated the victim's personal network of people that he knew, and were blackmailing the victim, saying here are all the people I'm going to show these pictures I have of you to.
Dave Bittner: Mm-hm.
Joe Carrigan: So he says make teens' followers and following lists always private, and set that as the default. Number two: classify the Yahoo Boys under the dangerous organizations and individuals policy. The Yahoo Boys are the group that is most responsible for this. He cites that they're responsible for an 18,000 percent increase in sextortion targeting minors.
Dave Bittner: Isn't that a group out of Nigeria?
Joe Carrigan: It is. They're out of Nigeria. They call themselves Yahoo Boys because they set up throwaway Yahoo email accounts, and that's what they do.
Dave Bittner: Right.
Joe Carrigan: Number three, victims of sextortion on Instagram are astounded by how slowly Meta responds to their reports. This is part and parcel of your big complaint about all the social media companies. It's like we can't do this at scale.
Dave Bittner: Right.
Joe Carrigan: And, okay, what do you say about that, Dave, if you can't do it at scale?
Dave Bittner: You shouldn't do that.
Joe Carrigan: Right. Number four, stop allowing these scammers to use the same catfish images and profile pics repeatedly across thousands of profiles. This would be remarkably easy for them to do, right? Microsoft has come out with ways of detecting CSAM imagery even if you change the image a little bit; PhotoDNA is the technology. It's a hashing algorithm that hashes to similar values for similar files, or identical values for identical files.
Dave Bittner: Right.
Joe Carrigan: So generally speaking, from a cryptographic standpoint that is not a good feature of a hashing algorithm, but for detecting these kinds of images, it's a very good feature of a hashing algorithm. These are out there. They could say we've seen this picture, and this picture, and this picture and this picture used in these scams, this is a scamming account. We need to do something about this. They can automate this entire process.
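PhotoDNA itself is proprietary, but the property Joe is pointing at, that similar images should produce similar hash values, can be illustrated with a toy "average hash" in Python. The image paths and the distance threshold below are illustrative assumptions, and the sketch assumes the Pillow library is installed.

```python
# A toy perceptual "average hash," sketching the idea Joe describes:
# near-duplicate images (recompressed or lightly edited catfish photos) hash
# to nearly identical values, unlike a cryptographic hash. This is not
# PhotoDNA, just an illustration of the same general idea. Requires Pillow.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then set one bit per pixel:
    1 if the pixel is at or above the mean brightness, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical usage: two profile pictures whose hashes differ by only a few
# bits are very likely the same underlying catfish image.
# if hamming_distance(average_hash("profile_a.jpg"),
#                     average_hash("profile_b.jpg")) <= 5:
#     print("probable reuse of the same image")
```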
Dave Bittner: Right.
Joe Carrigan: It is remarkably simple.
Dave Bittner: And yet they don't.
Joe Carrigan: And yet they don't. Correct. Number five, require all accounts with a high risk geography or IP range to be verified with unique phone numbers and email addresses. This will slow down account creation of the spam accounts, or scam accounts. So they're not verifying unique phone numbers or email addresses.
Dave Bittner: Okay.
Joe Carrigan: I would think they would verify unique email addresses. That should be almost, I mean, not saying this is what it should be, but it should be a primary key to a person. And of course, I'm not advocating using personal data as key material, so keep your letters at home, but the idea is that you have a unique identifier for these people, so you know whether or not they've set up an account before.
Dave Bittner: Right.
Joe Carrigan: Don't let them keep creating new accounts.
Dave Bittner: Right, right. Yeah. And if you have thousands of accounts being created from the same IP address, you know, and the same-
Joe Carrigan: Using the same email address.
Dave Bittner: Right. Right.
Joe Carrigan: You should only get to create one account with an email address, and if you need to get access back to that account, we'll send a password reset to that email address.
Dave Bittner: Right.
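As a rough illustration of the "one account per email address" point, here is a sketch in Python with SQLite: make the address a unique key so a repeat sign-up fails and falls back to a password reset instead of a fresh account. The table layout, function, and messages are hypothetical, not any real platform's schema.

```python
# A minimal sketch of treating the email address as a unique key, per Joe's
# point: a second sign-up with the same address should trigger a reset flow,
# not another account. Schema and wording are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE accounts (
        id INTEGER PRIMARY KEY,
        email TEXT NOT NULL UNIQUE,   -- the unique identifier per person
        verified INTEGER NOT NULL DEFAULT 0
    )
""")

def create_account(email: str) -> str:
    try:
        conn.execute("INSERT INTO accounts (email) VALUES (?)",
                     (email.strip().lower(),))
        conn.commit()
        return "account created; verification email sent"
    except sqlite3.IntegrityError:
        # Address already registered: offer a password reset, not a new account.
        return "address already in use; password reset link sent instead"

print(create_account("user@example.com"))  # account created
print(create_account("USER@example.com"))  # duplicate -> reset flow
```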
Joe Carrigan: And number six, he says he's still waiting on Meta to send that safety notice they promised to all users that have been targeted by a sextortion criminal. That was supposed to provide awareness and resources, he says you could-they could be transparent and tell us, how many teen accounts have been targeted by the sextortion accounts in the past 24 months? That would be interesting to know. Meta knows [laughter] what is going on here.
Dave Bittner: Right.
Joe Carrigan: And they are not doing anything about it.
Dave Bittner: Yeah, I think they don't want the scrutiny.
Joe Carrigan: Right. Well, they're getting the scrutiny now.
Dave Bittner: I'm sure it's an embarrassingly, embarrassingly large number, and they don't want anybody to know, I'm guessing.
Joe Carrigan: Yeah. Yeah, I think this needs to be investigated. I really think the FTC needs to get involved here. FCC, somebody. Somebody needs to be involved here, and be investigating this, and Meta's complicity in it. Maybe the Department of Justice, I don't know. If this happens to you, if you're a young man, and this happens to you, the best thing to do is just to ignore the messages from the scammer, and block their account. Okay? That's the best thing to do. Do not send them any money, because if you send them money, they're just going to ask for more money. And they're going to keep up the pressuring tactics, until you're out of money, and then, if they were going to send the material out to your group, which they may have not been willing to spend their time to do, because they have lots of people they're doing this to, if they're not going to get money out of you, it doesn't really benefit them to spend the time to distribute the images they have of you.
Dave Bittner: Right.
Joe Carrigan: It benefits them to spend time pressuring some other teenager.
Dave Bittner: Yeah.
Joe Carrigan: I mean, it seems like, you know, that it seems kind of like you're trying to just outrun your buddy when the bear is chasing you.
Dave Bittner: Right [laughter].
Joe Carrigan: But don't send them any money, because if they're going to send the information, they're going to do it either for free, or after they've charged you thousands of dollars to do it. It's best to have that happen for free. And do not, do not kill yourself over this. This is not worth it. Like I said to Dave, at the beginning of this, like we were talking, we've done really stupid things in our lives. We are very fortunate the internet was not around to capture them, but these things, there are resources out there to make these things go away, to make these images go away. There are, I can't remember the name of the project right now, but there is a take it down project, that actually has access to these social media platforms, to remove these images from these platforms. So they could be cleared out. But this is done by a third party organization. This isn't done by the social networking platforms.
Dave Bittner: Right.
Joe Carrigan: So, by the way, Paul is looking for work, so he did leave his job to take the job at Meta, which they rescinded. I think that's terrible. So if you're looking to hire somebody who has intel analyst background, investigation and security strategy and digital risk analysis, and open source intelligence gathering skills, click on the links in the show notes, and you'll be taken right to one of Paul's postings, and you can see his profile there on LinkedIn.
Dave Bittner: Help me understand the focus in this, in what Paul's saying here, on young men. Is this-so are we speaking specifically about a scam that is targeting young men, and not young women?
Joe Carrigan: It is disproportionately targeting young men. And I don't know, I haven't heard any stories about it targeting young women, but generally speaking-there are differences here, Dave. And one of the differences here is that if some guy starts flirting with a young woman, you know, a 15, 16-year-old girl, and starts getting amorous with them-with her-that seems to be easier for young women to resist than it is for young men to resist being approached by a young woman. So a young man is more likely to engage with somebody pretending to be a woman, or a female, talking to them, than a female is to engage with a male talking to them. They're also more likely to reciprocate the sending of pictures back.
Dave Bittner: I see.
Joe Carrigan: So what these guys do is they just go out to these adult-in air quotes-websites, they download a bunch of pictures of a certain model, and then they start sending these pictures to this guy and ask for reciprocal pictures of him.
Dave Bittner: I see. So you're saying that they're taking advantage of the fact that I guess the generalization is that young men are greater risk takers than young women, and more impulsive when it comes to these sorts of things.
Joe Carrigan: Yes.
Dave Bittner: Than women would be-
Joe Carrigan: Yeah.
Dave Bittner: Because, I mean, I think why I ask is that I don't think that there is any question that women, you know, women in general, and young women in particular, you know, often have a hard time of it online.
Joe Carrigan: They do.
Dave Bittner: For all sorts of different reasons.
Joe Carrigan: Right.
Dave Bittner: So, okay. Interesting.
Joe Carrigan: They do. This is probably-this is not one of the areas they have a tougher time than men. This is probably one of the areas where men have the tougher time.
Dave Bittner: Hm, okay.
Joe Carrigan: And it's because of the way we are, Dave [laughs].
Dave Bittner: Fair enough [laughing]. I can't deny it. Certainly not when I was a teenager.
Joe Carrigan: Right. Mm.
Dave Bittner: Alright, well, like you said, we will have links to all of this in the show notes. And if there's something you'd like us to consider for the show, you can email us. It's Hacking Humans at N2K dot com. Alright Joe, [music begins] it is time to move on to our Catch of the Day [rod and reel swishes]. [ Music ]
Joe Carrigan: Dave, our Catch of the Day comes from Gordy, who sent this one in. It was an email message with a PDF attached, and it comes from someone claiming to be Anders Eberheim.
Dave Bittner: Okay.
Joe Carrigan: So, do you want to read the text of the email?
Dave Bittner: The text says, "Hello there, your recent purchase has brought us immense joy, and we are deeply grateful for your decision to choose us. Rest assured that we are dedicated to providing top notch quality and ensuring your satisfaction. Thank you."
Joe Carrigan: And then it has a very long, random number designated, it looks like a GUID, but I don't know that it is.
Dave Bittner: Okay, long string of numbers.
Joe Carrigan: Huh?
Dave Bittner: Long string of numbers, and letters, and dashes and stuff.
Joe Carrigan: You want to read, the email begins with Dear Gordy, and just has Gordy's email address.
Dave Bittner: Okay. It says, "Thank you for choosing us as your preferred service provider. Our mission is to offer uninterrupted security support empowering your business growth. Contact us at this phone number to report if this transaction was not authorized by you. Note, within the next day or two, you may be certain that your bank's records will contain the specifics of the transaction. To request a cancellation and refund for your yearly subscription, please call the customer service number indicated. Item described: McAfee Advanced, $599.99, auto debit. Please do not reply to this auto-" yes, so standard McAfee-
Joe Carrigan: Renewal scam.
Dave Bittner: Anti-virus renewal scam, that's right.
Joe Carrigan: Yep, that's what it is. So if you ever get one of these, just delete it. Do not call the number. It is a scam. What happens when you call the number is you get somebody who tries to convince you to put a bunch of software on your computer that gives them control of the computer, and then they will maybe run some kind of bank scam, where they allegedly put too much money into your bank account.
Dave Bittner: Yeah. Actually got an email from my father not too long ago about one of these, that he was, you know, he emailed me, asking me if he should pay it, and I said no [laughter].
Joe Carrigan: No.
Dave Bittner: And I'm glad he asked.
Joe Carrigan: Yes, of course.
Dave Bittner: Right? But they're everywhere, and it's easy for people to fall for it. And I think that, as we've said, that large number that, you know, nearly $600 gets people's attention.
Joe Carrigan: Right.
Dave Bittner: And the way they frame it, that you're about to be charged $600 which, you know, $600 is not something that most people would be willing to just throw away, you know, there's a lesson learned, so. [Background music begins] Alright, so thank you Gordy, for sending that in. We do appreciate you taking the time. [ Music ] Joe, I recently had the pleasure of speaking with Dr. Jessica Barker. She is the CEO and co-founder of a cybersecurity company called Cygenta. And she is also the author of the book "Hacked, The Secrets Behind Cyber Attacks," and also joining us is Perry Carpenter, who is a podcast host here on the N2K network, and also works for KnowBe4, our show sponsor. So, here is my conversation with Jessica Barker and Perry Carpenter.
Jessica Barker: When I talk about the human side of cybersecurity, I usually say the human side. But in reality, people of course are absolutely central to cyber security. If we think about every stage of the information or technology life cycle, from design, development, use, abuse, impact-everything along the way, people, of course, are central at every stage. And what we have seen, unfortunately for many years, is that people are a primary target for cyber criminals looking to compromise organizations, looking to steal information, get access, or steal money. Why is it overlooked? Historically we have focused so much more on the technology side of things when it comes to cybersecurity. Thankfully, that is changing. And it has been changing for a few years now. If I think back to when I started in the industry 13 years ago, it was much more of a niche focus, particularly in the private sector. There have been academics doing great work on this for a while, but there weren't many boardroom conversations. There weren't many security teams talking about the human side of cyber security to the extent that they are now. And I think that's partly because technology is perhaps an easier sale, and it's probably comforting for people to think they could, if only [laughs], just buy a piece of technology to try and address some of these challenges we face in cybersecurity.
Dave Bittner: Perry, what are your insights when it comes to that?
Perry Carpenter: I think what it comes down to is when you look at the past, you know, several decades of technology, what we keep finding is that people are looking for the quick black box fix that they can just plug in, and quote-end quote be secure. And technologists, and vendors, unfortunately, really kind of play into that in a vicious cycle kind of way, where you go to whatever the conference du jour is, and all the vendors are selling their new whizz-bang thing with blinky lights, and all of the technologists are jumping on whatever the fad is, and it's-the promise is always this can eliminate the human issue in things. And what we found over and over and over again, year after year, decade after decade is, it doesn't. Because humans are little chaos animals [laughter], and we live in the space where human behavior and action is both predictable, but extremely unpredictable, and as soon as you design something to account for a certain type of behavior, somebody either intentionally or accidentally Mr. Magoo's their way through all those defenses, and you end up in this cycle of data breach and incident after incident, to the point where regardless of the report that you look at year after year after year, between 70 and 90 percent of data breaches are easily traceable back to some kind of human related issue. And so my passion for this is to try to raise the awareness of that to, like Jess said, the boardroom level, and to really start to help technologists understand where the shifts need to take place in terms of budget, in terms of energy focus, in terms of mindset shift, and so on, so that we can deal with the fact that yeah, humans are involved at every stage of technology development, implementation, administration and then, you know, the person behind the keyboard that is actually doing the work.
Dave Bittner: You know, I found it helpful for me, I often use the analogy of public health, to where, specifically where you can do everything right. You can wash your hands, you can be careful not to rub your eyes, you can, you know, do proper hygiene in the restroom, all those kinds of things. But every now and then, you're still going to get a cold. Maybe you'll get the flu. And certainly we've all been through, you know, COVID, which I think brought, you know, the public health to our attention in a great way. Do you think that's a useful analogy, this notion that despite all of your efforts, you know, you have to be prepared for the fact that every now and then something is going to get through. Jess, what do you make of that?
Jessica Barker: Absolutely. It's about mitigating. So how can we mitigate the risk and the analogy, I think, works very well. There's lots of steps we can take to keep ourselves healthy, but then also knowing what to do if something comes along. There is never, unfortunately, 100 percent security. We do what we can to make sure we are as protected, as safe as possible, and then it's about understanding actually what to do if something does go wrong. So, for example, if you're in an organization, making sure people know what to do if they receive a suspected phishing email, or call, or message, or whatever it might be, is really important. You can try and tell people to look out for phishing emails. You can try and tell them not to click on the link. But you really also need to tell them what to do if it happens, if it gets through, if they do click a link, if they are worried that they gave away information in a phone call. So taking that proportionate response, being prepared as well as trying to be protected is really important.
Dave Bittner: You know, Perry, one of the things that I enjoy very much about your podcast, 8th Layer Insights, is your focus on empathy. And I think that something that Jessica alludes to in her answer, there, is that we have to be sensitive to-we have to create a safe space for people that if they feel like they accidentally click something or maybe did the wrong thing, and I'm putting the wrong thing in air quotes, that they can come to the appropriate folks in an organization and say hey, I think I made a mistake here, and not feel like the hammer is going to come down on them.
Perry Carpenter: Absolutely. So, what I always tell people is in your security program, and when you're doing security awareness, behavior, culture, and you know, whatever people are talking about this discipline becoming, one of the foundational things that you have to realize is that at all times you're either building, maintaining or eroding slash destroying the relationship with your people, and so having your people, the people that work in your organization, feel like they can be vulnerable and open with the Security Department is way, way, way better than people who will try to evade or be afraid of the Security Department, and so people need to understand that whenever they suspect something, or whenever they believe that they may have accidentally done something, or they have a question about something, that a discussion and being open and reporting is the option to choose, because the security team, the IT Team, and the organization is there for them, to enable them to be successful in what they're doing. Not to sit and point at them, and laugh at them, or ridicule them, or call them stupid. That does nobody any good, and so I really, really think it's important when people are doing things like simulated phishing programs to set the stage correctly with their employees, and say we're not out to try and get you. We're out to try and build you up so that you can be more resilient, not only in this organization, but in life in general, to know what's coming at you, and when these things come, hopefully we've prepared you for what may slip past all these technical defenses, and when it lands in your inbox, even if you click on it, we want you to be really, really open. You can still become part of the preventative defense for the organization if the worst were to happen, and you were to click on that. And I will tag one more thing on there. The click in many organizations should not be the defining thing that causes a breach. If it is, there is probably something wrong with the security architecture or the other things. And we have a tendency, we being technologists, not people that focus on the human side, but we technologists have a tendency to, when the human does the thing that we've quote-end quote trained them not to do, we believe the training is ineffective or should be thrown out. We should double-down on technology. That is absolutely not the case. The human acted like a human. There was distraction. There was some other factor involved. Something seemed different, we didn't prepare them for that case in the right way, and if that is the thing that causes the organization to have a breach, well then we also have to think about all the other failures of technology that took place. What happened at the secure email gateway layer? What about endpoint protection platforms? What about application sandboxing? What about network segmentation? If those things were in place, and they failed, nobody says we're going to throw out our email gateway. Nobody says we're going to throw out our endpoint protection, or do other things when those things fail. What they do is they double down, and they get to root cause on why they failed, and we should do the same when it comes to the human side of things. We should say what set this person up in a situation where that click happened? And then how can we move in, and mitigate, and build resilience there for the next time, so that they can do better?
Dave Bittner: Yeah, it's a really interesting point, and it kind of reminds me of, you know, highways have guard rails. So if someone is driving their car, let's say late at night, and they doze off, they may go off the road. Their car may get scratched up, but hopefully the guard rail is going to keep them from careening into a ravine or something [laughs], you know, it's not a perfect solution but it's a mitigating factor, and we regularly put those mitigating factors in all sorts of areas of our lives.
Perry Carpenter: Absolutely, and I just think we're, you know, as a discipline, we-and I include myself in the technologists, we, with that, we really, really just want the quick fix. And whenever the quick fix doesn't work, in many organizations, it becomes an internal politics issue. Why didn't that thing work? And so now it becomes a political football that people play, you know, pass around, rather than really doing the hard work, the grinding work, dealing with what root cause really is, and that spans all aspects of technology, which is why you end up with vendor replacements and everything else so often, rather than dealing with some of the implementation issues, it becomes way easier just to blame the vendor. And so we see the blame game becoming something that happens fairly universally both inside and outside of technology.
Jessica Barker: And to your point, Dave, of the guard rails, someone driving along a highway, they get tired, they drift off, maybe they, you know, move off the highway. There's an accident-we can think about this in terms of security. What makes somebody so tired? What is it that has led to that situation where they're in a position where they're going to be so tired, and that they drive off the road, they're so overworked, they have so much of a burden, they are stressed, they are distracted, the culture in the organization maybe is pushing them to focus on productivity, say, over security. We can think about all of those systemic factors. And if there is an incident, really importantly, looking at what went wrong, rather than who we can blame. So it's this approach that is more of a just culture, and looking more at psychological safety. So, to Perry's earlier point, people feel comfortable coming to the security team with a question, with a concern, when there may have been an incident.
Dave Bittner: Jessica, the title of your book is "Confident Cybersecurity," which I will point out for our readers, last fall came out in paperback, so you should definitely check that out, but the choice of that title, "Confident Cybersecurity," I think is very important and very telling. To our earlier point about technology versus the human side, I don't look at my computer or my mobile device and say oh, my iPhone is sure looking confident today. Now that's a uniquely human term.
Jessica Barker: Absolutely, confidence comes from us feeling capable. And feeling capable partly comes from practice, from being able to practice the thing that we want to feel more confident in. I think of confidence like a muscle. It's something that can be grown, can be developed. It's something we don't often talk about in terms of cybersecurity, or we haven't historically. But self-efficacy, someone feeling confident, someone feeling capable and able to practice the cybersecurity behaviors we recommend, self-efficacy is, and has been shown by all the research to be the most important factor in any cybersecurity interventions that are looking to change behavior in a positive way. So, it's far more important, for example, when we're talking about a threat, to raise people's self-efficacy, rather than to focus very heavily on the danger. So of course we want to communicate the threat, we want to talk about the reality of cyber crime, but we really need to follow that up by focusing on what people can do to protect themselves from that. The actions that they can take. So I have long had a rule if I am delivering an awareness raising session, creating content for clients, I won't talk about a threat, unless we can also communicate what the organization is doing and what we want people to do to protect themselves. And that was very much the approach I took with "Confident Cybersecurity," with my latest book out in April, 2024, "Hacked, The Secrets Behind Cyber Attacks," again, I'm digging into the different threats in each chapter, but I'm ending each chapter, and I'm ending the book itself with a look at what people can do to protect themselves, and really making that as accessible and as actionable as possible.
Dave Bittner: Before we wrap up, I'd love to get each of your takes on this idea of protecting our family members. I think for a lot of folks in our audience here, we are the ones who take responsibility perhaps for elderly parents, or siblings, or friends, you know, those folks where we may have above average knowledge of these things, and so they turn to us for advice or guidance on how to protect themselves online. I would love to just-a little bit of advice from each of you. Maybe some tips and tricks and some ways to approach this, Perry, why don't I start with you?
Perry Carpenter: Yeah, so I think one of the best things is to really have the great attitude about it. One of the things that I realized about myself is that I suffer greatly from the curse of knowledge. And so I've been steeped in several different disciplines for so long that even sometimes the vernacular that I use to explain something can seem over somebody's head to the point that it sometimes feels demeaning to them. And so I really need to understand who I am speaking to, like the point we made before about empathy. Try to approach the world from their point of view for a little bit. Understand the things that the analogies or the points that I can make, so that I'm not being fear-mongering, or I'm not being overly technocentric. And then the thing that I want them to do at the end of that time is to feel empowered, and knowledgeable enough to do the rest of their life that they want to do. I don't want to make them an expert in password managers. I want to make them an expert in how to download, install, and correctly use the thing that is there, and that tool that I'm teaching them how to use, or encouraging them to use, serves a greater purpose. And that is allowing them to do all the other things that they want to do in the easiest, most transparent way possible, so that they can just get on with their life, and so that is the biggest thing that I try to do, is understand where somebody is, and then make sure that I am approaching the things that I'm trying to help them accomplish in a way that is really as consumable as possible by them, or as relatable as possible.
Dave Bittner: Jessica, how about you?
Jessica Barker: Absolutely, I think Perry has summed it up so well, talking about how we can empower people, how we can use empathy, and I really related to what he said about the curse of knowledge. Every year extra that I spend in this industry, I feel like I feel the burden of the curse of knowledge more, and I have to work to overcome that. So I listen to the words that people use if I'm talking to family members, friends, people in the community about cybersecurity, I listen to the words that they use. How they describe their interactions with technology and also asking questions, really actively listening. So if someone tells me they're using terrible passwords, rather than reacting with horror, which obviously would be negative to my relationship and to how they feel about me, and how comfortable they feel, they would feel judged if I reacted with horror. Instead, it's asking why, you know, trying to really understand that, to then be able to solve the root cause of that problem. And beyond that, I'm a big believer in helping people to then help themselves. So when I'm with some of my family members, I will want to check in. And I see it as a privilege being able to check in on some of the cybersecurity practices, their level of hygiene, if you like, and one thing I will always check is are all of their devices up to date? Now, I could go and run the updates, once they've unlocked the device for me. I could run those updates, have it done, but instead, I'm going to want to make sure I'm helping them to do it. I'm maybe guiding them. I'm maybe showing them where they need to go, but they're the ones who are going through and practicing that behavior, so that when I'm not there, they're more likely to remember the steps that they took, and to do that in future, rather than relying on the next time I go, me doing it for them.
Perry Carpenter: Could I throw one more thing in there, just in case it's helpful?
Jessica Barker: Sure.
Perry Carpenter: One of the problems that I've realized with my language-not cursing, it's my language-is that when somebody asks me something, I'll say "well, you just do this," and I never realized how off-putting saying "just," you know, blank, something can be. Because it assumes that well, that person may be stupid or something. That's the way they can take it. Maybe it's the way that I say it? I don't know. But kind of in that same line, when I released my first book and I was obsessed with like the Amazon ratings, and all that, and was looking it up, one of the things that really frustrated me is in these categories around security and so on, often one of the things that turns up at number one in all these security categories is like password journals. And that really, really freaked me out. I was like okay, I've got my book, and I, you know, would encourage people to use things like password managers and all that. And I really, really looked down on password journals. It's like, that is the stupidest thing ever. And I posted on LinkedIn about it. And somebody called me out in the best way possible, and they said, you know, for somebody that is elderly and doesn't necessarily have a lot of tech savvy, that can be the easiest and most secure thing that they do. And when you think about it from a risk perspective, at least if they're doing it right, they're cataloging everything, and hopefully have different passwords for all those different things that they're cataloguing in it, and they're not just out there on the internet, and it's not in the notes on their phone. It's [overlapping speakers] air gapped, yeah. So from a risk management perspective, that's better than a lot of options. And I really, really thank that person every day for calling me out on that. Because that was a mindset shift for me-rather than demeaning somebody that was doing that, understanding the use case that brought them there is important. Would I rather them use a password manager? Yeah. But that also means that we have to figure out how do we move somebody from that state of mind, practice, and their behavior patterns to something that may be more secure? But that becomes the game at that point.
Jessica Barker: That's such a great example from Perry. For me, it touches on the fact that in security, we can sometimes repeat or fall into the same kind of group thinking without going back and looking and really analyzing the issue. So many of us, myself included, I used to say that password books, password journals were a bad idea, you should never write passwords down, because that's what everybody says, never write passwords down. And then I also switched, I really flipped that advice many years ago when I realized the likelihood of someone breaking into your home and stealing your book of passwords is far lower than someone across the internet breaking weak passwords that are being re-used. If someone doesn't want to use a password manager, of course the cognitive load is way too much to be able to use unique complicated passwords for the hundreds of accounts we all have. So now, my advice is, if you really don't want to use a password manager, then if you trust everybody you live with, which is obviously crucial in this, it's better to write them down, and to keep it safe, and if someone breaks into your home and steals your password book, you're probably going to know about it, whereas if someone is breaking into your accounts, compromising your passwords, that can take longer to identify. [ Music ]
Dave Bittner: Hey, Joe, what do you think?
Joe Carrigan: The human side, Dave, is the biggest side of the cybersecurity problem. We had Roger Grimes, also from KnowBe4, on a couple of weeks ago, back in episode 287, and he said if you could fix the human part of the problem, you could fix 70 to 90 percent of cyber attacks [laughter], because-
Dave Bittner: Don't tell the AI.
Joe Carrigan: Right [laughter], 70 to 90 percent of these attacks all involve some kind of successful human attack.
Dave Bittner: Right.
Joe Carrigan: Perry also quotes that statistic, by the way.
Dave Bittner: Yeah.
Joe Carrigan: Why is it overlooked? I think that Perry is absolutely correct, we like the bright, shiny buttons and blinking lights and everything, and there is this attitude that there is a technical solution to this problem.
Dave Bittner: Mm-hm.
Joe Carrigan: Bruce Schneier-what was it Bruce Schneier said? If you think that technology can solve your problem, you don't understand the technology and you don't understand the problem.
Dave Bittner: Okay.
Joe Carrigan: So he had some pretty harsh words. But I agree, there is no technological fix for the problem. I like that Perry calls people, "little chaos animals"-
Dave Bittner: Right.
Joe Carrigan: One of the things that you guys talked about was the need to raise this to the board level.
Dave Bittner: Mm-hm.
Joe Carrigan: And that does need to be raised to the board level. It is something that everybody needs to understand, this is the big part of the problem, and I think that Jessica and Perry are big on these things. I haven't taken a look at Jessica's book yet. I plan on reading it after you're done with it over there.
Dave Bittner: Yeah.
Joe Carrigan: But employees do need to know what to do when something goes wrong. And they really need to know how to report information. And they need to have the safe space for reporting it.
Dave Bittner: Yeah.
Joe Carrigan: I will say this. When I worked for a company, we had information that would go onto a server, and sometimes people would put the wrong information on the server.
Dave Bittner: Okay.
Joe Carrigan: And if they just called us immediately and told us, "I just put the wrong information on the server," it was so much easier for us to say thank you very much, you did the right thing, and that's what we tried to do. We did not issue condemnation. We said you did exactly what you needed to do. This will take us five minutes to clean up, and it would literally take us five minutes to clean up. If somebody went through and put the information on the server, and then went, oh, that's not supposed to be there, and then deleted the information from the server, and then told us about it, now our clean-up job was hours long, because we had to go through and wipe all of the empty space on the disk. Because we didn't know where the file was. And that was very time-consuming. So it's worth it to have that policy, and that kind of culture, where people can come and bring these things forward. It will save you so much time and money in response to these kinds of issues.
Dave Bittner: Sure.
Joe Carrigan: Perry is really big on security culture. You could say he wrote the book on it [laughter].
Dave Bittner: He's written several books on it [laughing].
Joe Carrigan: Right, but he's absolutely right, and at all times, you are either building, maintaining, or eroding the relationship that underpins your security culture.
Dave Bittner: Yeah.
Joe Carrigan: And I really get the point they both touched on. Perry and Jessica both touched on this, about speaking to people without sounding demeaning-you have to think about how your audience is perceiving the message. That is of paramount importance. Absolutely the most important thing about talking is understanding your audience [laughter].
Dave Bittner: That reminds me of the old Bob Newhart joke. He said, you know, I don't enjoy country music, and I don't want to denigrate anybody who does. And for those of you who don't know what denigrate means, it means put down [laughter].
Joe Carrigan: Country music listeners [laughter].
Dave Bittner: Right.
Joe Carrigan: Modern country is a vast wasteland of garbage, I think [laughter]. Anyway, that's my opinion. Yeah, it's important to understand who is listening to you when you talk, so you don't make the assumptions that everybody makes when we're talking to them. And there is a good discussion that both Jessica and Perry have about the password journals. They both say that they used to discourage that, and I would agree that password journals are less than optimal. But I will say they are better than reusing the same password on all the accounts. Now, is a password manager better? Of course it is. But a handwritten journal is way better than password reuse.
Dave Bittner: Right.
Joe Carrigan: This is like-I'd liken this to the SMS multi-factor authentication. It verifies that you are in custody of your phone.
Dave Bittner: Yeah.
Joe Carrigan: Right? It's not the best. There are all kinds of things that can be done to get into the middle of that, but it is way better than nothing.
Dave Bittner: Right [laughter].
Joe Carrigan: Right? And this is the same thing. Using a password journal is way better than using the same password on all the different sites you go to.
Dave Bittner: Yeah.
Joe Carrigan: This is a better answer.
Dave Bittner: Yeah. Alright. Well our thanks to Dr. Jessica Barker and Perry Carpenter for joining us. It is a pleasure to talk to either of those folks, but to have them both together was extra special. [Background music begins].
Joe Carrigan: Yes it was.
Dave Bittner: So we do appreciate them taking the time, and thanks to our producer, Jennifer Eiben, for making that happen. [ Music ] And that's Hacking Humans, brought to you by N2K CyberWire. Our thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. We'd love to know what you think of this podcast. Your feedback ensures we deliver the insights that keep you a step ahead in the rapidly changing world of cybersecurity. If you like our show, please share a rating and review in your podcast app. Please also fill out the survey in the show notes, or send an email to hackinghumans@n2k.com. We are privileged that N2K CyberWire is part of the daily routine of the most influential leaders and operators in the public and private sector, from the Fortune 500 to many of the world's preeminent intelligence and law enforcement agencies. N2K makes it easy for companies to optimize your biggest investment, your people. We make you smarter about your teams, while making your teams smarter. Learn how at N2K.com. This episode is produced by Liz Stokes. Our Executive Producer is Jennifer Eiben. We are mixed by Elliott Peltzman and Tré Hester. Our Executive Editor is Brandon Karpf. Peter Kilpe is our Publisher. I'm Dave Bittner.
Joe Carrigan: And I'm Joe Carrigan.
Dave Bittner: Thanks for listening. [ Music ]