Hacking Humans
Ep 133 | 2.4.21

Understanding human behavior is a key to security.


Nico Popp: There's only two types of companies, the ones that have been hacked and the one that don't know they have been hacked yet.

Dave Bittner: Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week we look behind the social engineering scams, the phishing schemes and the criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: Hi, Dave.

Dave Bittner: Got some good stories to share this week. And later in the show, my conversation with Nico Popp from Forcepoint. We're going to be talking about why understanding human behavior is a major key to security. All right, Joe, before we dig into our stories, we got a kind note from a listener who is looking for a little advice from us. This person writes, they say, Hi, Dave and Joe. Thank you for doing your podcast. I find it very helpful, especially being someone that's not knowledgeable with the internet. I posted an ad on Craigslist to sell a helmet. Someone or scammer sent me the following.

Dave Bittner: Scammer - Hi. New Smith Women's Allure snow helmet, size small - still available?

Dave Bittner: Me - yes, it is. 

Dave Bittner: Scammer - I need to verify before we meet. I will send you a code to verify you are real because I saw most post is fake. I want to send a six-digit number, then you tell me that I will match it with my number. If it is right, then you are real. And after that, I will call you and discuss all about this. Can I send the code? 

Dave Bittner: Me - sure. 

Dave Bittner: Scammer - I just send it. Check your phone message and send the code. 

Dave Bittner: And then our listener says, I actually sent them the code. And luckily, it didn't work. And they wanted to send another code to me to a different phone number. Now, I suspected that something was off because the response to my Craigslist ad was weird, but with the advice of an engineer working in security, I answered. I want to know, what are they trying to scam? Also, this is the first time I heard of such a scam. It would be great if people like me are educated about it, so we could avoid such scammers. Thank you for your time. 

Dave Bittner: Joe, what do you make of this? 

Joe Carrigan: I think this person should change their password on whatever email they used to sign up for this Craigslist account because this, to me, sounds like somebody is trying to execute a change password attack and take over the account. I think Craigslist puts your email out there for people to reach out to you. 

Dave Bittner: Well, Craigslist initially - I think they obfuscate your email when you have an initial contact with someone. You know, it goes through - there's a layer that Craigslist uses. But I don't think it's a big deal to then find someone's actual email after that. 

Joe Carrigan: Right. 

Dave Bittner: But it seems to me, I mean, this whole - how many accounts do we use where either the second factor is someone texting you a six-digit number - right - or something like that? 

Joe Carrigan: A lot of times that's the workflow for a password change, right? Like, I forgot my password. We're going to send you a six-digit code. So this person probably already has our listener's email address, has clicked on the I-forgot-my-password link. And they said, we're going to send a code to your phone number. Then they've reached out to the person who said, I'm going to send you a code to verify you're real. And then this is an attempt to take over an account is what I think this is. 

Dave Bittner: Yes, absolutely. 

Joe Carrigan: So you should change your password on that email immediately, as soon as you can. 

Dave Bittner: Yeah. So it's a good reminder here that if anybody asks you to do something like this, if they say they're using some sort of - they're sending you a code to verify who you are, and it is not your bank or your, you know, it's not the people you're dealing with this account. If it's a stranger, it's almost always a scam. 

Joe Carrigan: If it's an inbound call, I would say it's always a scam. I mean, I've had people from - I think, God, you know, I have Comcast here at home. And I've had people - where I call in to Comcast and they say, we're going to verify it's you by sending your phone a code. That's OK, because I've made the call out to what I know is Comcast. An inbound call, you don't know that's from anybody. 

Dave Bittner: Right, something out of the blue. Yep. Yep. 

Joe Carrigan: Right. 

Dave Bittner: It's a good reminder. So to our listener here, yep, looks like someone's trying to take over one of your accounts. And it seems like - hopefully, you were lucky, and they did not do so. But I think Joe's advice is rock solid to go through and just change those passwords. Don't reuse your passwords. Change your passwords. Use a password manager. 

Joe Carrigan: Use a password manager - easiest thing in the world to do. 

Dave Bittner: Right. Right. 

Joe Carrigan: And you'll be surprised - once you've implemented a password manager, you will wonder how you logged into accounts without it before. 

Dave Bittner: That's right. All right. Well, thank you for sending us that kind note. It's a good reminder for all of us. Joe, why don't you kick things off with some stories for us this week? 

Joe Carrigan: Dave, I saw an article over on Graham Cluley's website. He had a story from the FTC, the Federal Trade Commission. So I went, actually, directly to the FTC website. And Seena Gressin, who is an attorney with the FTC, wrote a blog post warning consumers about a site that is impersonating the FTC. And it is calling itself the U.S. Trading Commission, which is not a real thing. But they're using the FTC logo. The FTC is the Federal Trade Commission. I don't think I've said that yet, right? 

Dave Bittner: Yeah. 

Joe Carrigan: So - and they are the - part of the Department of Commerce. And they regulate a lot of things. And part of what they do is they investigate internet scams. But this site is claiming that it operates a personal data protection fund. This is a new site. And it compensates people whose personal information had been exposed on the web. That's the claim, though. But, of course, they don't do that. This is a - what we call a follow-on scam, right? 

Dave Bittner: OK. 

Joe Carrigan: And a follow-on scam is something you usually see when someone has been bilked out of their money. The scammers, once they've taken a certain amount of money and they can't get any more money out of the person, they'll change tactics, and they'll call the person and say, hey, we're from law enforcement. We know you got scammed. We're going to try to help you get your many thousands of dollars back. But in order for you to do that, we're going to need some more money up front. And it's just the same scammers changing the narrative here. And this is somebody doing the same thing on a broader scale, hoping to just kind of lure people in. The site says you can receive an instant cash payment - right? - just by clicking on some links and giving them some personal information, including your bank account or electronic wallet information, (laughter) which... 

Dave Bittner: Ah, there it is. 

Joe Carrigan: Exactly - which, of course, means that they're just going to empty out your bank account if you give them that information. 

Dave Bittner: Right. 

Joe Carrigan: The website also warns that you may be downloading malware onto your computer. Yeah, they may ask you to install something, and if you do it, then bam. Of course, there's always the possibility of the drive-by download where you just load the web page and the web page exploits some kind of vulnerability in your browser. But that's less probable than them saying, oh, install this software - because that's easier to do. They're going to ask you to do something. That's the main crux of the entire nature of social engineering, is they're going to get you to do something that's not in your best interest, right? 

Dave Bittner: Right. 

Joe Carrigan: They're promising refunds not just to people in the U.S., but all over the world. They're saying, we can get your refunds. So the FTC has some things they would like you to know. They put this in this article, this blog posting. The FTC does shut down scams and return money to people who have lost it to dishonest and unfair business practices. So the FTC may go out and recover money on your behalf and give it back to you, but they will never ask you for money. They will never ask you for your bank account. They will never ask you for a credit card. They will never ask you for a Social Security number. And if they do wind up recovering money and giving it back to you, it will most likely be in the form of a check that they mail to you. 

Dave Bittner: Right, right (laughter). Yes. 

Joe Carrigan: That's how this process works. 

Dave Bittner: Government likes to do it the old-fashioned way, right? (Laughter). 

Joe Carrigan: Right, the old slow way. I'll tell you, Dave - this is how I elect to receive my IRS refunds for my tax returns. This is how - if I owe money at the end of the year, I send them a check. And if they owe me money, they send me a check. And you know what? I'm OK with that. 

Dave Bittner: OK, well, good for you, Joe. 

Joe Carrigan: I'm fine with that. Yeah. If you really need the money right away - and I understand that - there are companies out there, like any of the tax prep companies, who will advance that money to you for a small fee. 

Dave Bittner: Right, right. 

Joe Carrigan: I recommend that over storing your bank account information with the IRS. There have been some remarkable data breaches from the government, so I just don't trust them with my banking information. Not that I think that they're malicious; I just don't think that my information is secure. And there have been some recent events that make me happy that I made that decision. 

Dave Bittner: OK, fair enough. Fair enough. All right. Well, we will have a link to this alert from the FTC in the show notes, of course. Interesting story. My story this week - this comes from The New York Times. And actually, it's The New York Times Magazine. It's an interesting, long article, and it is all about telephone scams and how these scams work. It's a long narrative, starts out talking about an elderly woman who got scammed, but then it goes on to talk about someone who fights the scammers, who is actually listening in on the scammers while they were scamming this elderly woman. And he then followed up with the woman to make sure that she didn't get scammed. So this is a good guy who streams his efforts online on YouTube to show him scamming the scammers. 

Joe Carrigan: Awesome. 

Dave Bittner: He sets himself up so that scammers try to interact with him, but then he turns the tables on them and actually installs software on the scammers' machines... 

Joe Carrigan: Right. 

Dave Bittner: ...So that he can monitor them. 

Joe Carrigan: We should probably note that this is probably not within the bounds of legality. 

Dave Bittner: Yes. 

Joe Carrigan: Right. 

Dave Bittner: Well, he is in the U.K., I believe. 

Joe Carrigan: Oh, OK. 

Dave Bittner: So he is outside of the Computer Fraud and Abuse Act here in the United States... 

Joe Carrigan: OK. 

Dave Bittner: ...Which I suppose is part of how he gets away with what he does. But yes. 

Joe Carrigan: Good. 

Dave Bittner: I would generously say it's a gray area (laughter). 

Joe Carrigan: Right. Absolutely. And while I like to see this kind of thing happen and I kind of get a little oh-yeah-that's-great feeling... 

Dave Bittner: Yeah. 

Joe Carrigan: ...We should warn everybody - do not try this at home (laughter). 

Dave Bittner: That's right, yeah. I mean, this person obviously has some skills. 

Joe Carrigan: Yep. 

Dave Bittner: But, you know, if you start messing around with folks who are already committing crimes, chances are it could escalate. 

Joe Carrigan: Yes. 

Dave Bittner: And the last thing you want to have is that happen. But the person who wrote this article actually flew to India to look for some of these scam call centers. 

Joe Carrigan: Really? 

Dave Bittner: Yeah. And many of them are in India. It seems like there's a real epicenter of this in India. And the article points out that there's a good reason for that in that - remember, I don't know, a decade or maybe a couple of decades ago, when many of the world's call centers shifted to India, right? 

Joe Carrigan: Right. 

Dave Bittner: And the reason for that was twofold. It was - people are cheaper to hire in India. The standards for paying people is much lower than it is, say, here in the United States. 

Joe Carrigan: Right. 

Dave Bittner: But also, many, many folks who live in India speak English... 

Joe Carrigan: Right, absolutely. 

Dave Bittner: ...And speak English well. So it's a good fit when it comes to that. So couple of interesting things in this article that I hadn't really considered - one is that there are many, many existing legitimate call centers in India. 

Joe Carrigan: Absolutely. 

Dave Bittner: And for many of them, it's not a big deal to shift over to maybe doing some business that's not so legitimate. 

Joe Carrigan: They may even have both business models. 

Dave Bittner: Well, exactly. And the illegitimate businesses are - probably have higher margins, right? 

Joe Carrigan: Sure. 

Dave Bittner: They're more profitable. But it also makes it harder on law enforcement because the local law enforcement who's looking to shut down these scams - and they do execute raids from time to time, and they do shut them down - it's difficult for them to do that if a call center is running a legitimate business in addition to the illegitimate stuff because they say these folks are very smart and they're very good at hiding the illegitimate side of their business. And so you go in to raid one of these places, and they're just doing legitimate business for Microsoft or Google or any of the big companies who outsource their call centers over to India. 

Joe Carrigan: Yeah. Al Capone used to own a bunch of laundromats in Chicago, and that's how he put his money through there. And that's why the term is called laundering. 

Dave Bittner: Oh, interesting. 

Joe Carrigan: You go in there, and you're confronted with, hey, we're just doing honest business here. 

Dave Bittner: Right. Another component of this is that for the folks who are doing this - the individuals who are hired to do the scamming, this is good work for them. They make more money than they would doing other things. This article points out that there's an overabundance of folks who have completed education in engineering, but who there are not jobs for - it says there are actually local positions for only about 20% of the people who have been educated to be engineers. So they have all of these college graduates who have technical skills. 

Joe Carrigan: Really? In India, only 20% of engineers are hired. Is that what you're saying? 

Dave Bittner: That's what - according to this article, yeah. It says Indian educational institutions churn out more than 1.5 million engineers every year. But according to one survey, fewer than 20% are equipped to land positions related to their training. So these folks have some technical skills and are well-spoken, speak English. So they're well-equipped to take these jobs. And this is a good opportunity for them. 

Dave Bittner: This article goes and actually finds one of the scammers - the person who wrote this - finds one of the folks who took employment with one of these scamming call centers and talks to him about it. And, you know, it's interesting. I mean, he clearly feels some remorse for what he's doing... 

Joe Carrigan: Right. 

Dave Bittner: ...Not enough to not do it. 

Joe Carrigan: Right. 

Dave Bittner: But (laughter) it's an interesting situation. Everybody wants to provide for themselves and for their families. And I could imagine, you know, you're taking advantage of people who are a world away. You're not hitting up people in your hometown or anything like that. It could be tempting. You can understand the temptation. 

Joe Carrigan: Absolutely. No, I understand the situation, especially if I graduated from some college with an engineering degree thinking I was going to get a good job and I become disenchanted with that, I could absolutely see how people turn to this. The solution is - you know, economic development, I think, is the solution. That's really the solution to a lot of the world's problems. 

Dave Bittner: This article talks about how this particular person was paid a quarter of whatever they defrauded the victims from. 

Joe Carrigan: That is a pretty good commission. 

Dave Bittner: Could be a pretty good commission - and also said that this person had scammed up to $5,000 from folks. So in a part of the world where, as this article mentions, it's not unusual for folks to make $25 a month, say, it's quite an opportunity to do this. 

Joe Carrigan: Yeah. For our listeners in the United States, it's really hard to understand. The international poverty level is below $2 a day, and a lot of people survive on less than that. So if you can get a job or if you can scam somebody out of $25 every day in one of these economies, you're doing very, very well for that economy. 

Dave Bittner: Right. So there is a lot to this article. We'll have a link to it in the show notes. I'm just scratching the surface here. But if you're interested in these sorts of things - and I would hazard to say, if you're listening to the show, you probably are - it is a good, entertaining and educational read. Again, from The New York Times Magazine, it's titled "Who's Making All Those Scam Calls?" It's a good one to check out. 

Dave Bittner: All right, Joe, it is time to move on to our Catch of the Day. 

Joe Carrigan: Dave, our Catch of the Day comes from a listener calling himself Billy Baroo. Dave, unfortunately, we have missed the time window for this one because this one comes from the former first lady, Melania Trump, and it is about an overdue payment release. Dave, would you care to take it away? 

Dave Bittner: Well, I certainly will. (Imitating Melania Trump) Overdue payment released - I am Mrs. Melania Trump. This is to officially inform you, darling, that your overdue payment from Kenya, total sum 18 million United States dollars is currently here in my office, White House, Washington, D.C. And the funds will be delivered to you as soon as you get back to this office and comply with the requirement as needed to deliver your total fund to you as well. Your home address and your cellphone number is highly needed to complete this delivering to you as well, darling. Bear in mind that I have taken my time to be in charge of your fund as instructed by my husband to ensure that you receive your funds successfully from the White House to reduce the economy. And I'm the only one that has your fund in regard to my husband, Mr. Donald Trump II. And you will receive your fund through bank draft check or ATM card will deliver to you. And you are also expecting to be announced as beneficiary of same amount as your fund is delivered to you, darling. So you are currently advised to get back to me with your home address and cellphone number. Reconfirm your information to avoid wrong delivery - your full name, country, city, address, cellphone number. Thanks. Regards, Mrs. Melania Trump, the White House, official residence of the president of the U.S. First lady, United States of America, darling. 

Joe Carrigan: Dave, I just imagined Melania Trump sitting somewhere in the White House with her feet propped up on $18 million cash. 

Dave Bittner: (Laughter) Just a big pile of cash, right. 

Joe Carrigan: Right, like an ottoman (laughter) 

Dave Bittner: And she saying, boy, I need to get this to the person who it's (laughter) - who is entitled to it. 

Joe Carrigan: Who is entitled to it, exactly. 

Dave Bittner: Yeah. Well, you know, I mean, she's - online bullying was her thing. So she's - not only is she helping to stop online bullying, but she's making sure that people get the money back that they're entitled to. So... 

Joe Carrigan: Yes, that's right. 

Dave Bittner: (Laughter). 

Joe Carrigan: Very (inaudible). 

Dave Bittner: All right. Well, pretty obvious what's going on here, Joe. 

Joe Carrigan: Yeah, they're just trying to scam somebody with - first off, they're collecting a bunch of personal information so they can maybe sell that for people to steal your identity. And then they're also probably going to try to execute an advance fee scam where they say, oh, in order for you to get this money, we're going to need some fees, you know? 

Dave Bittner: Mmm hmm. Mmm hmm. 

Joe Carrigan: A small, like, .1% of $18 million - send that to us, and we'll release this doc. Think of the money you'll make, Dave. 

Dave Bittner: Yeah, I know. Well, and - you know, I think also taking advantage of the image of both Mrs. Trump and Mr. Trump as being wealthy people. 

Joe Carrigan: Right. 

Dave Bittner: So the notion that this sort of money would be adjacent to them (laughter) is probably not a stretch. 

Joe Carrigan: Right. What's a billionaire need with $18 million? 

Dave Bittner: Right, exactly. 

Joe Carrigan: Right. 

Dave Bittner: Exactly. The authority of the office of the president of the United States and the first lady. And, of course, I guess the first lady being a do-gooder as first ladies are. 

Joe Carrigan: Yup. 

Dave Bittner: So there's a lot here that you could see how someone could fall for it. All right. Well, thanks to our listener for sending that in. That is a fun Catch of the Day.

Dave Bittner: So, let's return to our sponsor KnowBe4's question - carrots or sticks? Stu Sjouwerman, KnowBe4's CEO, is definitely a carrot man. You train people, he argues, in order to build a healthy security culture, and sticks don't do that. Approach your people like the grown-ups they are, and they'll respond. Learning how to see through social engineering can be as much fun as learning how a conjuring trick works. You can hear more of Stu's perspectives in KnowBe4's weekly CyberheistNews. We read it, and we think you'll find it valuable, too. Sign up for CyberheistNews at knowbe4.com/news. That's knowbe4.com/news.

Dave Bittner: Joe, I recently had the pleasure of speaking with Nico Popp from Forcepoint. And our conversation centers around why understanding human behavior is a major key to security. Here's my conversation with Nico Popp. 

Nico Popp: Every time I talk about security and then behavior - I don't know if you've seen that cartoon, it always comes to mind. There is a ring, you know, a boxing ring. And then on the left, you have a woman that's holding a post that says data security, with, you know, in the corner, we have firewall, encryption, antivirus software, et cetera, et cetera. And in the other corner, you get a big guy, a little bit sloppy with a big red T-shirt that says human error. 

Dave Bittner: (Laughter). 

Nico Popp: And, you know, basically, Dave is going to defeat cybersecurity on the entire site because we've ignored Dave - right? - for 20 years in cybersecurity. So we're really at the beginning. 

Dave Bittner: Yeah. 

Nico Popp: Long answer, but we are at the beginning. 

Dave Bittner: Yeah. As someone named Dave, I am familiar with that cartoon. Absolutely (laughter). 

Nico Popp: (Laughter). 

Dave Bittner: I mean, let's walk through, I mean, some of the evolution here of what got us to where we are today. I mean, is it fair to say that the history of cybersecurity has really focused a lot on the ones and zeros, on the connections, the binary stuff, rather than the human factors? 

Nico Popp: For sure. It's even - it's interesting. Even when we focus on the human - first of all, yes, it's very binary in many ways in the way we secure things. But I think the biggest issue is being that we focus on the outside world, right? And so if you look at the $20 billion we're spending in security, it's basically looking for the outside enemy. And so we'll look at, as you said, binary things, like bad IP addresses - right? - bad URLs, bad domain name, you know, hash of toolkits that will show that the enemy is there. And the reality is it's always - always, right? - the enemy is in, right? We spend all that money and then - you know, I think it was a very famous CEO in the Silicon Valley say, look, there's only two types of company - the one that have been hacked and the one that don't know they have been hacked yet. 

Dave Bittner: Hmm. 

Nico Popp: So it has failed because we've totally ignored kind of the core of what's inside the enterprise, the human and the data. Quite frankly, I put both in the same basket. So we've focused on a lot of things, but the end user, but Dave. 

Dave Bittner: So when we're talking about behavioral analytics, what does that encompass? What does that describe? 

Nico Popp: It's kind of trying to answer the question, if the breaches - right? - the attacks at some point will always turn into, you know, an insider job - right? - can we - instead of looking for the outsider, the bad guys, can we actually try to better understand what our users are actually doing in the normal course of the day? How do they (unintelligible) what is normal versus abnormal? What is, you know, usual versus unusual? So that when a bad guy is actually posing as Dave - right? - we can actually notice that there is something, you know, different, and we can react accordingly. So I think it's really about trying to understand normal and usual behavior, normal of your users, and what's abnormal and usual behavior of data. 

Dave Bittner: Can you give us some examples? I mean, what would a typical person going throughout their day, what sort of things would be tracked and fed into this system to be able to make sure everything was on the up and up? 

Nico Popp: Yeah. So, you know, for example, you know, is it normal for Dave to actually, you know, aggregate - hold - so much data, and among that data, so much maybe customer confidential data, right? Is that something that Dave should be doing in terms of the job that he's performing for us? Is it something that is done in the past, right? That's, for example, that's what we call an indicator of behavior that is suspicious, right? Another example could be, you know, typically, the bad guys will turn into power user - right? - what we call system administrators. And what they'll do, they'll create new Daves in the active directory. They'll create new user, and they will elevate the right of the new users. These are employees that you never had. So, you know, can we understand that behavior? Why is that system administrator doing these things abnormal? So you can imagine time of the day, access to data, access to application, frequency, volume, confidentiality, all these things are indicator of behavior that can be very revealing. That's what we're looking for, for these anomalies, if you want. 

Dave Bittner: So when someone starts using a system like, I suppose there's a period of time when the system is being trained, when it's getting up to speed. 

Nico Popp: So it actually depends on the technique. So we have techniques where we actually have libraries of what we call indicator of behaviors. And each indicator of behavior, very much like, you know, you have these indicators of compromise that are looking at outside-in, we're looking at inside-out security. So we have this library in the case of Forcepoint of pre-constructed behavior, which we know represents a little bit of risk. And so we can actually, without baselining the user, we can actually look whether this user exhibits these known risky behavior. And then we can start scoring the user. In addition, what we like to do is listen and monitor and compare you to peer, (unintelligible), people that have the same job, they - they seem to perform the same tasks. So we'll do both because often baselining, you know, might be too - taking too much time. So we like to use - to mix the techniques. 

Dave Bittner: I mean, do you sometimes find the folks in the course of their regular day and getting their jobs done or are inadvertently using risky techniques, doing things in a risky way when perhaps they don't need to? 

Nico Popp: Well, you remember my cartoon, right? See, I think - there are a lot of Daves out there. And, you know, and it's actually one of the - they are different threats. There is the bad guys masquerading as the good guys. And that one is you know, that's kind of the compromise user. There is the malicious insider, the Dave that turned bad that's like, I'm done with this place. I'm going to sabotage or I'm going to move to a new company, maybe a competitor. And I'm going to steal the crown jewels. That's kind of insider threat. That's the second type of behavior. And then there is like people that just trying to, you know, they're just sloppy. They're just uneducated or they're just trying to go fast because they have a task to complete. They are the ones that are going to send - it's the developer that's going to send the source code to their iCloud. It's this researcher that is working on the COVID-19 vaccine that's going to send the formula to his Gmails so he can basically keep on working because he has more storage. 

Nico Popp: That's the kind of Dave that we need to - we want to spot. I mean, those, by the way, are interesting because I think that's where security could actually improve because we've ignored the human. We can actually - there's a moment - right? - a teachable moment where security could actually intervene to say, you know, Dave, this is not the right behavior. Let me teach you that you should not be doing that. This is risky. And let me turn you on what good behavior is. And I think that's part of cybersecurity ignoring the end user is not just the behavior. It's also - we haven't tried to teach. We haven't tried to interact. You know, I - we haven't tried to leverage all users to help us secure the workplace. And I think that's another dimension of human-centric security for us. 

Dave Bittner: I suppose, too, for the folks running the business, it gives them insights onto whether or not they're providing their teams with the tools that they need. You know, as you mentioned, if someone's transferring data to a Gmail account or a personal account or something because they feel like they don't have enough storage, well, if someone is going to the point of doing that, I want to know that, as a manager, that they don't have the tools they need. And also, there's some reason why they're not telling me. 

Nico Popp: Exactly. It's one of the shortfall of security - right? - is what you said at the beginning. We're very binary. So in security, we decided that the world was black and white. And so because of that, you know, in security, we allow you to do something or we block you from doing something, right? The reality is that there are 50 shades of human grey - right? - in behavior. And so part of the idea here is to basically say, look, maybe we can look at people in - let's simplify in three different ways. There are the people that are doing it right. They are well-educated. They are low risk. We're going to free them. We're going to free the good and let them work seamlessly and do everything they have to do because they have a dream, they're educated, they are respectful. OK. And then, you know, there's the people that are medium risk, right? Because remember, my indicator of a behavior, my monitoring of what they are doing, they probably not threat yet, but - and they're probably still teachable. They haven't moved to the dark side yet. So I can basically use this teachable moment to actually recognize, oh, Dave needs that kind of training. Dave needs these kind of tools. Dave needs this kind of feedback. 

Nico Popp: And then finally, there are the people that actually crossed the line of risk, right? And those I probably want to confine and restrict what they can do. And if security was able to adapt to these three groups of humans - and, again, I oversimplified - I think we'd already improve quite a bit the efficacy of cybersecurity, as well as the user experience of cybersecurity, which often has such a bad rap - we're in the way. No, we want to set people free to do their job, as long as they do it carefully. And we want to teach them. 
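The three-tier, risk-adaptive approach Nico describes - free the low-risk users, coach the medium-risk ones, restrict the high-risk ones - could be sketched roughly as follows. This is a minimal illustration, not Forcepoint's actual product; the score thresholds, function names, and actions are all assumptions for the sake of the example.

```python
# Sketch of risk-adaptive enforcement: map a behavioral risk score
# to one of three policy tiers, then act on an event accordingly.
# Thresholds (0.3, 0.7) are arbitrary placeholders, not vendor values.

def tier_for(risk_score: float) -> str:
    """Map a behavioral risk score in [0, 1] to a policy tier."""
    if risk_score < 0.3:
        return "free"      # low risk: let them work seamlessly
    if risk_score < 0.7:
        return "coach"     # medium risk: allow, but use the teachable moment
    return "restrict"      # high risk: confine what the user can do


def enforce(user: str, action: str, risk_score: float) -> str:
    """Return the decision for one user action under the three-tier model."""
    tier = tier_for(risk_score)
    if tier == "free":
        return f"ALLOW {user}: {action}"
    if tier == "coach":
        return f"ALLOW {user}: {action} (show training nudge)"
    return f"BLOCK {user}: {action} (review required)"


print(enforce("dave", "upload report.pdf", 0.15))
print(enforce("dave", "email file to personal account", 0.50))
print(enforce("dave", "bulk download customer DB", 0.90))
```

The point of the middle tier is exactly the "teachable moment" from the interview: the action is allowed, but the system responds with guidance instead of a silent pass or a hard block.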

Dave Bittner: That's a really important point because I think most people have the feeling, the impulse that, you know, we don't like to feel like there's someone looking over our shoulder all the time, you know, that we might get slapped on the wrist for doing the wrong thing or, you know, accidentally going to the wrong folder or whatever. So one of the points you're making is that it's important when we have this sort of monitoring that it's not about punishing people. It's about, as you say, teachable moments, you know, showing them the right path. 

Nico Popp: I would go even beyond. It is like, you know, Dave, you are now part of our cybersecurity solution. You know, I always use the example of credit card companies, right? They have been brilliant. You know, they have huge fraud issues. And what have they done? They basically involve us in the process of solving it, right? They don't always block your credit card. They may block you, but they may ask you, you know what? We've seen that transaction. It looks suspicious to us. Is that really you trying to complete this thing? And it's working, right? Can you imagine? They are using all these consumers to solve the fraud problem. And, of course, we care, so we participate. So take that concept of putting the human in the middle and saying, look, you're part of the solution. We're going to engage you. It's not just about monitoring you, spying on you. Quite the opposite. We're trying to make you better. But also, we want you to be part of our cybersecurity team, you know, because we want to be able to leverage the fact that we have this smart and caring human being - common folks, right? - behind the keyboard that also cares about the company assets and can help there. That's something that cyber has never done - really, that whole idea of putting the human in the middle of cyber. It's all these different dimensions, these different approaches. 
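The credit-card pattern Nico praises - involve the human rather than silently blocking - could be sketched like this. Everything here is illustrative: the threshold, the function names, and the confirmation callback are assumptions, not how any real issuer implements fraud checks.

```python
# Sketch of human-in-the-loop fraud handling: routine charges pass,
# suspicious ones trigger a confirmation question to the cardholder
# instead of an automatic hard block.

def handle_transaction(amount: float, typical_max: float, confirm) -> str:
    """Approve routine charges; ask the cardholder about outliers.

    `confirm` is a callable that poses a question to the user and
    returns True if they vouch for the charge.
    """
    if amount <= typical_max:
        return "approved"
    # Suspicious: put the human in the middle rather than just blocking.
    if confirm(f"We saw a charge of ${amount:.2f}. Was this really you?"):
        return "approved"  # the cardholder vouched for it
    return "declined"      # the cardholder says it wasn't them


# Simulated cardholders standing in for the real confirmation channel:
print(handle_transaction(25.00, 500.00, confirm=lambda q: True))    # approved
print(handle_transaction(2400.00, 500.00, confirm=lambda q: False))  # declined
```

The design choice mirrors the interview: the suspicious case is a conversation with the user, not a binary allow/deny, which is exactly the "50 shades of grey" middle ground discussed earlier.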

Dave Bittner: All right, Joe, what do you think? 

Joe Carrigan: Dave, I love these kinds of interviews where we talk about the human factors in cybersecurity. First, I want to talk about the cartoon that Nico's referencing - Dave, you know, in this corner, we have all this tech. 

Dave Bittner: (Laughter). 

Joe Carrigan: And in the other corner, we have Dave. 

Dave Bittner: Yeah. 

Joe Carrigan: I love that cartoon. I posted it on Facebook. And one of my friends who was at the first job that I had that was a full-on security job - he says, my money's on Dave. 

Dave Bittner: Thank you, friend. I appreciate that (laughter). On behalf of Daves everywhere (laughter). 

Joe Carrigan: Yes. Nico makes a great point. Binary things are easy. This is why the technology is getting a lot better. We are technologically more secure, but these malicious actors are still getting in, and it's because of human behavior. Behavior is an interesting field to me. People hoarding data is interesting behavior. Do these people need to hoard all this data? Why are they collecting the data? Data hoarding was one of the behaviors Edward Snowden exhibited right before he took off, and spotting it could've stopped that data breach. Creating a new user or copies of yourself or doing privilege escalation - these are behaviors that should stick out like sore thumbs. And I like that - whatever his product is, they have developed these kinds of baseline, already-known bad behaviors that they look for. So they don't have to come in and establish a baseline and then start saying, oh, there's some anomalous behavior - they already know what that behavior is, and they have the profiles for it. When Nico is talking about people getting around security systems, like using iCloud or email - I like what Nico says, that there's an opportunity for training. But I agree with you in this interview that this is an indicator of a bigger issue. It means either you don't have the tools that your employees need to do the job, or you do have the tools and they don't know about it, right? - which... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Is an education issue. And your point is they're not telling you that they're doing this, which indicates some deeper issue that you have in your culture, I think. There's no product in the world that will fix that, ever. This harkens back to what Bruce Schneier says. He says, if you think security can be resolved with a product, then you don't understand the problem or the product. 

Dave Bittner: Hmm, interesting. 

Joe Carrigan: And I think that's right. The security department is often viewed as the no department, right? I like what Nico says here about we need to be more collaborative with people, and we should never be about punishing people. I agree with that 100%. If you punish people, that - again, I think that damages your culture. And Nico says something here that I agree with 100%. And, Dave, if by some magic I was made CISO of some organization tomorrow, one of my first actions would be to hold an all-hands meeting and say to every single employee that they are now part of my cybersecurity team. That would be the very first action I'd take. I'd say, listen. This is not going to work without all of you being part of the team because it isn't. It just isn't. The awareness, the ability to talk to people, to interact interpersonally within the organization and have open discussions about security-related issues and secure behavior - a positive security culture is imperative to a good security program. 

Dave Bittner: Yeah. I love the idea of highlighting when someone has made a mistake... 

Joe Carrigan: Right. 

Dave Bittner: ...Or, you know, someone has clicked on a link, and something bad happened - to take that to the entire organization not as a way of shaming that person but a way of praising that person to say, listen. You know, so this happened to so and so. They got fooled by this. And we are so appreciative that they brought this to our attention because now we know that this is happening. We've been able to, you know, train that person, and we're able to share this with all of you. 

Joe Carrigan: Right. 

Dave Bittner: This is a good outcome for this. We are able to prevent it. So, you know, thank you so much. And so just by framing it as being a positive learning experience for everyone and not putting that shadow of shame over someone... 

Joe Carrigan: Right. 

Dave Bittner: ...I think that could go a long way towards building a positive - as you say, a positive culture in your organization when it comes to security. 

Joe Carrigan: Yeah. People are going to make mistakes. It's just a fact of life. 

Dave Bittner: Yep. 

Joe Carrigan: And you have to embrace that and understand it. And your goal should not be the elimination of mistakes; your goal should be the reduction of those mistakes and the improvement of ways to mitigate them when people do make mistakes. 

Dave Bittner: Right. And you want them to tell you about it when it happens. You don't want... 

Joe Carrigan: Absolutely, you want them to tell you about it. 

Dave Bittner: If you make them feel shame, they're going to try to hide it. And that only leads to worse outcomes. 

Joe Carrigan: Right, right. 

Dave Bittner: (Laughter) Yeah. Well, again, our thanks to Nico Popp for joining us. We do appreciate the conversation there. 

Dave Bittner: We want to thank all of you for listening. And, of course, we want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: And I'm Joe Carrigan. 

Dave Bittner: Thanks for listening.