Security has to be friendly.
Allan Alford: [00:00:00] Security has to be friendly, or it won't get used.
Dave Bittner: [00:00:03] Hello, everyone. And welcome to the CyberWire's "Hacking Humans" podcast, where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:22] Hi, Dave.
Dave Bittner: [00:00:22] We've got some good stories to share this week, and later in the show we're joined by David Spark and Allan Alford. They are co-hosts of the "Defense in Depth" podcast. They've got some good stories to share as well.
Dave Bittner: [00:00:33] But first a word from our sponsors at KnowBe4. Have you ever been to security training? We have. What's it been like for you? If you were like us, ladies and gentlemen, it's the annual compliance drill - a few hours of PowerPoint in the staff break room. Refreshments in the form of sugary donuts and tepid coffee are sometimes provided, but a little bit of your soul seems to die every time the trainer says, next slide. Well, OK, we exaggerate. But you know what we mean. Stay with us, and in a few minutes, we'll hear from our sponsors at KnowBe4, who have a different way of training.
Dave Bittner: [00:01:14] And we are back. Joe, I'm going to start things off for us this week. There's a story that's been making the rounds. I've seen it mentioned on "Good Morning America" and lots of general interest news shows. And it was triggered by - the Los Angeles County district attorney put out a statement warning people against juice jacking. Now, juice jacking...
Joe Carrigan: [00:01:36] I've never heard of juice jacking. What is...
Dave Bittner: [00:01:38] OK. The term was coined by security journalist Brian Krebs.
Joe Carrigan: [00:01:42] OK.
Dave Bittner: [00:01:42] Very well known. Juice jacking is when you take a public charging station, a public USB charging station, and you put some kind of computer behind it.
Joe Carrigan: [00:01:53] Right.
Dave Bittner: [00:01:53] So when someone plugs in, seemingly to get power, in addition to getting power they get their device hacked.
Joe Carrigan: [00:02:01] Right.
Dave Bittner: [00:02:01] Because data can transfer over that USB cable in most cases.
Joe Carrigan: [00:02:06] It's how you would interface your phone directly with your computer.
Dave Bittner: [00:02:09] Correct. And so the LA County district attorney has put out this warning about juice jacking, I think particularly because we're coming up on the holiday season, when a lot of people will be traveling.
Joe Carrigan: [00:02:19] Right.
Dave Bittner: [00:02:19] And it's very common to see these days, in airports and bus stations and other places, these USB charging stations where you can plug in your USB cable; it'll give you power. I was in the airport a few days ago, and you saw people gathered around these stations...
Joe Carrigan: [00:02:34] Right.
Dave Bittner: [00:02:35] ...Desperately getting power for their devices while they were on their way somewhere. I have mixed feelings about this. First of all, I think the odds of you getting infected with something like this are very low. I think it is good general hygiene - rather than plugging into some unknown USB port - to use your AC charger.
Joe Carrigan: [00:02:56] Yeah, to use your AC adapter. I would agree. There is also a device called a USB condom that you can get...
Dave Bittner: [00:03:00] Right.
Joe Carrigan: [00:03:01] ...Which essentially takes out the two data pins on a USB connection.
Dave Bittner: [00:03:04] Right, yep.
Joe Carrigan: [00:03:04] So all you get is the power pins.
Dave Bittner: [00:03:06] You can also get cables that are power-only.
Joe Carrigan: [00:03:09] Right.
Dave Bittner: [00:03:09] Another thing they point out here - and it's something we've talked about before - is that there are folks who have made malicious cables.
Joe Carrigan: [00:03:15] Right.
Dave Bittner: [00:03:16] So if you see a cable lying around or if you have a cable plugged into one of these charging stations, don't use a cable...
Joe Carrigan: [00:03:23] Right.
Dave Bittner: [00:03:23] ...When you don't know where that cable came from. Another point about this - and a reason why I think it's probably not as big a deal as maybe they're making it out to be...
Joe Carrigan: [00:03:34] Right.
Dave Bittner: [00:03:34] ...Is that both iOS and Android have taken steps to prevent this sort of thing...
Joe Carrigan: [00:03:41] They have. They have.
Dave Bittner: [00:03:42] ...In the past few years.
Joe Carrigan: [00:03:42] In order for you to get hacked on Android, you have to turn on USB debugging or change a setting like that. And if you don't have that setting set, then a lot of this is not going to do anything. Maybe they'd be able to read files off your device, but they're not going to be able to install malicious software.
Dave Bittner: [00:03:57] Yeah, yeah. Same on iOS. They're aware of this, and so they've taken mitigations to prevent it. But, you know, just good hygiene, general hygiene. Carry that charging brick with you.
Joe Carrigan: [00:04:07] Yep, I have one of those.
Dave Bittner: [00:04:08] And use that instead of plugging directly into that USB port. And that could be - you know, again, I was traveling recently, and these days in hotels, you know, lamps will have USB ports in them for you to plug things into.
Joe Carrigan: [00:04:19] I still use the AC adapter.
Dave Bittner: [00:04:20] Just use the AC adapter. It's the safest way to go (laughter).
Joe Carrigan: [00:04:24] Right.
Dave Bittner: [00:04:25] All right. Well, that's my story this week. Joe, what do you have for us?
Joe Carrigan: [00:04:27] Dave, this week I want to talk about email again.
Dave Bittner: [00:04:30] OK.
Joe Carrigan: [00:04:30] I don't think we talk about email enough on this show.
[00:04:32] (LAUGHTER)
Dave Bittner: [00:04:32] OK.
Joe Carrigan: [00:04:33] But Agari has a new report out...
Dave Bittner: [00:04:34] Yeah.
Joe Carrigan: [00:04:35] From their Agari Cyber Intelligence Division, which they like to call ACID, which is a cool acronym, right?
Dave Bittner: [00:04:41] (Laughter) Right, OK.
Joe Carrigan: [00:04:42] And this is their "Email Fraud and Identity Deception Trends" quarterly report for 2019, quarter three. And some of the things in here are interesting. ACID uses a term called identity deception techniques, and they define that as emails that impersonate trusted brands or individuals.
Dave Bittner: [00:05:00] OK.
Joe Carrigan: [00:05:00] And these types of email campaigns have accounted for 64% of advanced email attacks. The composition of these deception emails is changing over time. And they've been tracking it. They've been releasing this quarterly report for a while now. They say that impersonating brands has dropped about 6% as a percentage of these impersonation emails. But email attacks impersonating individuals have increased to 22% of the total of these impersonation attacks, and that's up from 12% in the previous quarter.
Dave Bittner: [00:05:30] That's interesting.
Joe Carrigan: [00:05:31] That has almost doubled. Emails spoofing people are part of sophisticated business email compromise attacks.
Dave Bittner: [00:05:37] Right.
Joe Carrigan: [00:05:37] BEC attacks. And they tend to have higher payouts, which is probably why you're seeing an increase in them.
Dave Bittner: [00:05:43] And that's the thing where, you know, I get an email that supposedly comes from my boss...
Joe Carrigan: [00:05:47] Right.
Dave Bittner: [00:05:47] ...Asking me to buy some gift cards or something like that, OK.
Joe Carrigan: [00:05:50] Yep, yep. And there are other types of this as well. Gift cards are actually No. 1 for these campaigns. But payroll diversion is another kind of attack that accounts for about 25%. And then there are wire transfers, which account for a smaller part but usually have a higher payout. Another interesting thing in this report is that employee reporting of phishing attempts rose by 6%.
Dave Bittner: [00:06:10] Well, that's good.
Joe Carrigan: [00:06:12] It seems like it's good, but there's some really distressing information in here to me. So reporting goes up by 6%, but false positive rates go up by 7%, right?
Dave Bittner: [00:06:21] OK (laughter)
Joe Carrigan: [00:06:22] What's shocking about this is that the false positive rate for employee-reported phishing attempts is 75% now - 75% of employee phishing reports are actually not phishing emails. And that's really bad because - another point they make in this report is that the time to investigate, to respond to one of these reports, has increased by 14% as well. So now, if you are doing everything you need to do to respond to one of these employee reports, it takes about eight hours if it's a real incident. But it takes a little over seven hours if it's a false positive, and 75% of them are false positives. I don't know how to address that. I don't know...
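Putting Joe's numbers together (a sketch assuming the figures as cited in the conversation - a 75% false positive rate, roughly 8 hours for a real incident, and about 7 hours for a false positive; the report's exact values may differ):

```python
# Back-of-the-envelope estimate of triage cost per employee phishing report,
# using the approximate figures quoted from the Agari report.
FALSE_POSITIVE_RATE = 0.75   # 75% of employee phishing reports are not phishing
HOURS_REAL = 8.0             # ~8 hours to work a real incident
HOURS_FALSE_POSITIVE = 7.0   # "a little over seven hours" for a false positive

# Expected investigation time per report, weighted by outcome.
expected_hours = ((1 - FALSE_POSITIVE_RATE) * HOURS_REAL
                  + FALSE_POSITIVE_RATE * HOURS_FALSE_POSITIVE)

# Fraction of total triage time that goes to false positives.
wasted_fraction = (FALSE_POSITIVE_RATE * HOURS_FALSE_POSITIVE) / expected_hours

print(f"Expected hours per report: {expected_hours:.2f}")                       # 7.25
print(f"Share of triage time spent on false positives: {wasted_fraction:.0%}")  # 72%
```

In other words, under these assumptions nearly three quarters of all analyst triage time goes to reports that turn out to be nothing, which is why the false positive rate matters as much as the reporting rate.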
Dave Bittner: [00:07:01] Right.
Joe Carrigan: [00:07:01] I mean, that...
Dave Bittner: [00:07:02] You want your employees to send in things they're suspicious of.
Joe Carrigan: [00:07:05] Right.
Dave Bittner: [00:07:05] You want to encourage that behavior.
Joe Carrigan: [00:07:07] Yep.
Dave Bittner: [00:07:07] So do we need to do a better job training the employees to differentiate?
Joe Carrigan: [00:07:13] Maybe. I really don't know. There's a technical solution that I'm going to be talking about in a second, but that doesn't help with false positives.
Dave Bittner: [00:07:19] OK.
Joe Carrigan: [00:07:20] The false positives here are a huge time-waster.
Dave Bittner: [00:07:22] Right.
Joe Carrigan: [00:07:23] I think in the future you're going to see some kind of automation on this, although how are we going to trust that the automation is getting it right - that it isn't producing more false positives? Or maybe it only eliminates half the false positives, right?
Dave Bittner: [00:07:33] Yeah. Just - if you can reduce the number of things that require the eyes of a real person, that's going to be helpful.
Joe Carrigan: [00:07:39] Yeah, I would agree. There's a technology out there called Domain-Based Message Authentication Reporting and Conformance, or DMARC, right?
Dave Bittner: [00:07:47] Right.
Joe Carrigan: [00:07:47] And basically, this is a DNS TXT record that companies publish out on their DNS servers that tells other companies what their policy is for validating their email. Now, this goes all the way back to, I think, 2002, when Yahoo came up with a way of validating emails with digital signatures. But there are other ways it works as well, and it's actually a pretty robust system. What's interesting is that the report said the implementation of DMARC records has increased, but only 13% of Fortune 500 companies have reject policies in their DMARC records - which means, if this email doesn't match this policy, you should reject it.
Dave Bittner: [00:08:29] Right.
Joe Carrigan: [00:08:29] So that means that the other 87% either have nothing, or they tell you to quarantine the message - and that's a very small portion - or they just tell you to monitor the message. But it's shocking to me that the Fortune 500 companies don't have these policies fleshed out. Now, I think that's going to change very quickly as these companies become more comfortable with the technology, and the people at these companies become more comfortable; they're just going to start having these reject policies set. But they should really start looking at this a lot more closely because it's a really great way to avoid business email compromise. The problem with it, though, is that it has to be implemented on both sides of the communication, right?
Dave Bittner: [00:09:06] Right.
Joe Carrigan: [00:09:06] And the report doesn't really touch on - or at least I couldn't find it - how recipients work with DMARC. If I put out a good reject policy for joescompany[.]com, and somebody sends a spoofed email that spoofs my email address - joe@joescompany[.]com - if the recipient company doesn't go, well, let's check the DMARC record, then they just receive the message, and they don't reject it when they should. And there are no statistics that tell you how many companies have that configured, and I don't know how Agari would get that. The DNS records are publicly available, right? So it's easy to measure how many Fortune 500 companies have these. But it's kind of difficult to measure how many companies are implementing the check.
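For reference, a DMARC policy is published as a TXT record at the `_dmarc` subdomain of the sending domain, and the `p=` tag carries the reject/quarantine/monitor choice Joe describes. A sketch using Joe's hypothetical joescompany[.]com (the reporting address in `rua=` is also made up for illustration):

```
_dmarc.joescompany.com.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@joescompany.com"
```

You can inspect any real domain's published policy with `dig txt _dmarc.example.com`. Here `p=reject` tells receiving servers to refuse mail from the domain that fails SPF/DKIM alignment; `p=quarantine` and `p=none` correspond to the quarantine and monitor-only policies Joe mentions, and `rua=` is where receivers send aggregate reports.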
Dave Bittner: [00:09:42] Oh, I see. Yeah, it's surprising to me that, all these years in, that email is still as messy as it is.
Joe Carrigan: [00:09:48] Well, it stems all the way back to the intended purpose of email. And the problem is in the very name - it's the Simple Mail Transfer Protocol. In SMTP, I can put any address I want in the From field of an email, and the server will send it; it will comply.
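Joe's point is easy to demonstrate: SMTP itself never verifies the From header. A minimal sketch using Python's standard library - the addresses are hypothetical, and nothing here is actually sent:

```python
from email.message import EmailMessage

# SMTP does not authenticate the From header: a client can claim any sender.
# Constructing the message locally is enough to show this; no mail is sent.
msg = EmailMessage()
msg["From"] = "joe@joescompany.com"   # hypothetical spoofed sender address
msg["To"] = "victim@example.com"
msg["Subject"] = "Urgent: wire transfer"
msg.set_content("Please process the attached invoice today.")

# A plain SMTP server would relay this message as-is. Only checks on the
# receiving side (SPF, DKIM, DMARC) can flag that the From domain was
# never verified by the sending server.
print(msg["From"])  # joe@joescompany.com
```

Handing a message like this to any relay that accepts it is all "spoofing" amounts to at the protocol level, which is why the receiver-side checks discussed above exist at all.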
Dave Bittner: [00:10:03] Because of backwards compatibility, we end up with all of this messiness and these security issues.
Joe Carrigan: [00:10:08] Yeah, that's a good point. I was thinking about this in the past week - about how we could change SMTP, maybe to Secure Mail Transfer Protocol...
Dave Bittner: [00:10:16] (Laughter) Right.
Joe Carrigan: [00:10:16] ...Which would also be SMTP. So that's terrible, but...
[00:10:21] (LAUGHTER)
Joe Carrigan: [00:10:21] I think something needs to be done with email on an internet level to make it better.
Dave Bittner: [00:10:24] Yeah, yeah. It's just such a huge battleship to turn (laughter).
Joe Carrigan: [00:10:28] It is. It's a massive battleship to turn.
Dave Bittner: [00:10:31] Yeah, yeah. All right. Well, it's time to move on to our Catch of the Day.
[00:10:35] (SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:10:39] Our Catch of the Day was sent in by a listener. This is a bit of a romance scam. The subject line is, nice to meet you. And it goes like this.
Dave Bittner: [00:10:47] Hello. How are you? After telling me all about yourself, you sure sound like the kind of guy I'd love to meet, go on a date and see where that leads. I'm supposed to be having a good day, being the end of the seminar, and am supposed to wait for some kind of call if I'm being chosen by an agency or not while I get my flight book so I can fly tomorrow. But unfortunately, my card came back declined. So contacted my card company since I knew the funds I left there, and when the reports came, I was told I purchased some phones, which I never did. Then they mentioned something like identity theft. So now my account is frozen, and they promised to dispute the charges and issue another card to me when I get back to the States. I really want to get out of here as soon as I can, if possible tomorrow, but I don't have the means because the airline agent is insisting I get the remaining funds before he can give me a ticket back. I don't know how you feel about this. Hopefully, you see from my point of view and help me get out of here. Waiting to hear from you. Hugs. Text me.
Dave Bittner: [00:11:42] Joe...
Joe Carrigan: [00:11:44] (Laughter).
Dave Bittner: [00:11:44] ...Are you going to text her back?
Joe Carrigan: [00:11:45] Yes. Of course, I'm going to text her back and wire the money right away so that she can get back to the United States.
Dave Bittner: [00:11:51] (Laughter) I mean, it's a sad, sad situation here, and you just want to be a helpful person.
Joe Carrigan: [00:11:55] Yes, yes. I'm wondering if this is supposed to be seminar or semester. If someone is going to a seminar, who goes to a seminar without buying a round-trip ticket?
Dave Bittner: [00:12:04] Good point.
Joe Carrigan: [00:12:04] Yep.
Dave Bittner: [00:12:04] Good point. Although, I suppose a seminar could be an opportunity to try to drum up some romance. You know, if you're out of town or something like that...
Joe Carrigan: [00:12:12] Right.
Dave Bittner: [00:12:12] ...You could see that being alluring to some people. So yeah, pretty funny, pretty standard stuff (laughter).
Joe Carrigan: [00:12:19] Yeah. Yeah, good one. Yeah, thanks for sending that in.
Dave Bittner: [00:12:22] All right. Well, that is our Catch of the Day. Coming up next, we're joined by David Spark and Allan Alford. They are the co-hosts of the "Defense in Depth" podcast, and they're going to share some of their experiences with social engineering scams.
Dave Bittner: [00:12:35] But first, a word from our sponsors, KnowBe4. And now back to that question we asked earlier about training. Our sponsors at KnowBe4 want to spring you from that break room with new-school security awareness training. They've got the world's largest security awareness training library, and its content is always fresh. KnowBe4 delivers interactive, engaging training on demand. It's done through the browser and supplemented with frequent simulated social engineering attacks by email, phone and text. Pick your categories to suit your business. Operate internationally? KnowBe4 delivers convincing, real-world, proven templates in 24 languages. And wherever you are, be sure to stay on top of the latest news and information to protect your organization with KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable, too. Sign up for Cyberheist News at knowbe4.com/news. That's knowbe4.com/news.
Dave Bittner: [00:13:41] And we are back. Joe, I recently had the pleasure of speaking with David Spark and Allan Alford. They are the co-hosts of the "Defense in Depth" podcast, really informational and entertaining show. So we hope you'll check that out. Here's my conversation with David and Allan. David and Allan, thank you so much for taking the time for us. I'm really looking forward to chatting with you. As co-hosts of the "Defense in Depth" podcast, you all speak with a lot of CISOs, and I'm curious what stories they're bringing back to the two of you in terms of dealing with things like social engineering, things like phishing - all of these human-facing attacks that they're up against these days. David, why don't I start with you?
David Spark: [00:14:23] You know, it's interesting you go immediately to, like, the outward threat. But one of the things that we keep talking about and is reported often is that, you know, internal threats are the problem, but often the internal threats are not malicious in nature; they're purely someone trying to get their job done, and they're using some type of technology to get it done. So, you know, what they think is the right thing to do, like forwarding email to a personal account or someone telling them, forward your email to my account - which could be done maliciously or kind of unknowingly. So sometimes we are, quote, "acting human" in not a way that's meant to be malicious but the end result is malicious. And that often - I will just say, just from the data from the reports that we've seen - is more often the case of why there is quote, "an internal threat."
Dave Bittner: [00:15:10] Allan, what's your take?
Allan Alford: [00:15:11] I'm with David that insider threat has to be considered, but I think the external ones are certainly, certainly applicable as well.
David Spark: [00:15:17] And I'm just saying that the data shows that the percentage of sort of internal threats that are nonmalicious is actually higher in terms of causing problems, yeah.
Allan Alford: [00:15:23] Oh, agreed. Agreed. Agreed. I would agree with that 100%. And like you said, I think the impetus is usually people trying to do their jobs and figuring out the most effective and most rapid way to get a thing done. I worked in one shop - I'm not going to name names - where we had a very highly ranked and highly placed administrator for one of our operating system environments basically get walked out the door, not because he turned out to actually be a malicious hacker but because he was trying to expedite his task list and made some shortcuts that ended up getting mistaken for hacker activity. A lot of people spent a lot of time running to ground exactly what happened, and a lot of resources were wasted. He kept denying that he had done anything. We ultimately proved he did. And it was an ugly situation. And at the end of the day, his motivation really was just, hey, I was trying to get my job done more quickly because I had a lot on my list to do. Wasn't malicious at all but ended up going south for everybody.
Dave Bittner: [00:16:13] I think it's a really good point. You know, I heard a story from someone who - they were telling me that their organization would not allow them to use services like Dropbox. You know, they couldn't connect to those kinds of file-sharing services. But they all still had to get their work done. And so everybody knew that if someone was trying to send you something via Dropbox, well, the way to do that was to log on to the Wi-Fi of the Starbucks in the lobby of the building...
[00:16:41] (LAUGHTER)
Dave Bittner: [00:16:42] ...And now you can get the files that you needed. And I think what - you know, when we talk about insider threats, that's the kind of thing where people are just trying to get their work done. And you have to be careful not to put up these roadblocks because people will find a way around them.
David Spark: [00:16:56] Don't move the data; move the access to the data. Keep the data in one location; just shift where you're getting access to it so people can do what they need to do, which is often work from home.
Allan Alford: [00:17:07] And there's a greater lesson, I think, in that story of the Starbucks Wi-Fi, and that is that security has to - absolutely must - especially in today's climate of BYOD, work from home, you know, everything is SaaS now - you absolutely have to make security usable and friendly. And if you don't do those two things, security is going to be an obstacle, and security is going to be treated as an obstacle by those who most need it. And when people see something as an obstacle, what do they do? They go around it. They go under it. They go through it. They go over it - you know, whatever it takes to get away from that obstacle. So if you don't make security usable, if you don't make it friendly, if you don't make it a good user experience - right? - you're going to get bypassed every time.
Dave Bittner: [00:17:45] David, I'm curious, you know, from your point of view, the experience that you have in marketing and communications, as a skilled communicator yourself, how do you feel like we're doing in terms of messaging, of getting the word out about these sort of social engineering scams?
David Spark: [00:18:03] I think it's minimal at most. I mean, honestly, most people hear little tidbits here and there. But I think the most common thing that is happening is the spear phishing attacks where people know the people to go after within an organization to be able to get the money that they need and to create these sort of fake emails that - and I know you talk about it all the time on the show - where they're asking for, you know, gift cards - I mean, the most common scam - or essentially, you know, to get somebody's invoice, and it all seems legit.
David Spark: [00:18:35] I think what companies should be doing is just understanding that this is normal. And how do we create internal systems so that, no matter what request comes in, we have, essentially, a multifactor authentication process - before anybody sends money out, a phone is picked up, and you're calling somebody, and you're getting a verbal OK from somebody, or something else, whatever it is. Just assume that this stuff happens all the time, and build in at least a two-factor authentication process to prevent the money actually going out the door.
Dave Bittner: [00:19:05] Allan, from your perspective as someone who is in a leadership position with folks on the cyber side of the house, how do you send this message down to the people who work with you?
Allan Alford: [00:19:17] In a variety of ways. Training and awareness are key, opening people's eyes to what can happen, sharing those stories, telling them and passing them on. Every shop I've been in the bad thing has happened in one way or another. Somebody always manages to pull off some sort of scam or trick, and somebody always falls for it. So you collect those stories and share them and spread them, and you create training programs specifically around those, right? Anti-phishing training is pretty common practice. And one of the things I always do is I warn them. You know, what are the key signs that it's a bad email? It used to be in the olden days, you know, oh, oh, bad grammar, and, you know, it's obviously somebody that's scamming me - you know, give money now please. And they don't even spell please right, and they don't use a period.
Dave Bittner: [00:19:54] (Laughter).
Allan Alford: [00:19:54] Like, oh, that's how it used to go. Now it's so much more sophisticated, and so you have to get into the psychological tricks that the bad guys use and incorporate that and get those lessons on the table in your training, right? A false sense of urgency, appealing to your sense of curiosity, appealing to your sense of greed. There was a famous scam - for years I lived in Austin, and there was a very famous scam all over Austin. You'd be at some shop and coming out to your car, and these guys in a white van would pull up, and they would offer to sell you stereo speakers. And what they would tell you is that they worked for the company...
David Spark: [00:20:22] Not just in Austin - happened here in California as well.
Allan Alford: [00:20:22] Was this an international one?
David Spark: [00:20:25] Oh, yeah. Oh, yeah.
[00:20:26] (LAUGHTER)
Allan Alford: [00:20:26] So they worked for the company. They accidentally ordered too many. You know, wink, wink, nudge, nudge - we'll sell them to you for super cheap. And the reality is there was no company. All of these speakers are actually poorly manufactured, cheap speakers - that this is their actual sales technique. This is their distribution network, right? So it's things like that, right? If it seems too good to be true, it is. If it's appealing to your sense of greed, don't be greedy (laughter). You know, there's some basic lessons there.
David Spark: [00:20:51] I was at this conference in Philadelphia, and I heard something regarding phishing tests on employees. One of the things we had talked about on a previous episode is that you can always create a phish to get people. There's always a way to get them. And certain phishes have a degree of success severity in terms of how well they can go. And what they did with their phishing test is they actually put a score on it before they sent it out - like, oh, this one we expect a much higher open rate on than this one. So when they did do estimates of how successful they were with their employees, it was graded on a curve, if you will. You know, they expected the tougher ones - the ones that really got at people's greed, or the fact that the Super Bowl was coming up - to be opened more than some of the lower-risk ones. And I thought that was very wise to handle it that way.
Allan Alford: [00:21:38] That's a great approach. One of my most triumphant moments as a CISO was getting the CEO himself to fall for one of my fake emails.
[00:21:44] (LAUGHTER)
Allan Alford: [00:21:44] I was like, yes, I got him. And that was obviously a very cunning and very well-crafted email. You know, it's important with those programs, though - Dave's question to us was, you know, what are you doing and what are you seeing in terms of education and awareness, right? These phishing programs are oftentimes used incorrectly, in my mind, because they're oftentimes used as a vehicle for shaming the end users. If the goal is truly to educate and train and teach, then, like David said, let that super crafty one out the door, but don't expect that you're going to get some miraculous resistance to it.
Allan Alford: [00:22:15] You know, make it a learning lesson - hey, you fell for a good one; don't feel so bad. But why was this a good one, and what should you do next time? And just walk them through it, and give them support, and give them guidance, and keep these as examples to help just get it ingrained in people's minds.
Dave Bittner: [00:22:29] What about having in place tools so that people can report these things easily but also that they know that they're being followed up on, that they're not just going into some sort of void in the company and they never hear anything about them again?
David Spark: [00:22:42] You know, that's a good point. I don't think we've ever discussed that on our show. Like, how does the person who reports the phishing scam learn about how it's followed up? I mean, do you do anything like that, Allan?
Allan Alford: [00:22:52] We do, actually. We do. And I was just thinking about the most recent security training I did where I work, my day job. And I threw up a slide that must have had - I'm trying to count it now - at least seven different means by which people could report a security concern - you know, hey, there's a thing here; it needs attention. We gave them seven different means to do it.
Allan Alford: [00:23:13] All of those means converge on the same place, and that same place processes, tracks, tickets and responds. And ultimately, you'll know - whether you picked up the phone, whether you went to this one email address, whether you went to this one website. The data gets housed. The data gets tracked. And the responses are there, and you will be communicated with. So it's important to open those floodgates as often as you can for the inputs. And to your point, let's make sure that there's actually some feedback because if it goes into a vacuum, people are going to quit using it. Again, security has to be friendly, or it won't get used.
Dave Bittner: [00:23:44] All right. Joe, what do you think? Interesting guys, huh?
Joe Carrigan: [00:23:46] Interesting guys, yeah. First thing I want to talk about is they say internal threats are not necessarily a result of malicious intent. And he couldn't go into too much detail, but I'd like to know what Allan was talking about with the guy that got walked out.
Dave Bittner: [00:23:56] Yeah. My guess is that he set up some sort of automation or something to make his job easier. But then when people started asking around and saying, what's this automation running, he was like, oh, what automation? I don't know anything about any automation.
[00:24:09] (LAUGHTER)
Dave Bittner: [00:24:09] That's my guess - it was something like that.
Joe Carrigan: [00:24:12] They're right; people are just trying to do their jobs, and they're trying to do their jobs in the most efficient way possible.
Dave Bittner: [00:24:16] Yeah.
Joe Carrigan: [00:24:16] And the belief on the part of these people is, if I can do this job quickly, then I can spend more time doing other stuff for the company. Everybody understands you're selling your time to the company. It's not necessarily a lazy thing, I don't think; I think it's actually a productivity thing. People want to be productive. That's my opinion.
Dave Bittner: [00:24:31] Yeah.
Joe Carrigan: [00:24:32] But security does have to be usable. I've told this story a number of times, and it's anecdotal. But in health care, in hospitals, there are - or were nurses whose job it was every now and then to go around and wiggle the mice on computers so that the computers didn't lock, so when people walked into the room with the computer or to the treatment center or whatever, wherever it is, that they don't have to do anything other than start using the computer. Now, a lot of people have moved on. I know that at Hopkins the facilities use, like, a CAC card. But it's kind of like a chip and PIN system.
Dave Bittner: [00:25:04] Right.
Joe Carrigan: [00:25:04] So it's fast, right?
Dave Bittner: [00:25:05] Right. Yeah.
Joe Carrigan: [00:25:06] And that's the point - it has to be fast, and it has to be usable. It can't stand in the way, particularly in health care.
Dave Bittner: [00:25:11] I remember, for a while - you may still be able to get this - there were some folks making USB dongles that basically wiggled the mouse every couple of minutes (laughter).
Joe Carrigan: [00:25:22] Right. Yeah, actually, the FBI uses that when they're investigating cybercrimes. They stick it in as part of their forensics to prevent the machine from locking if they catch somebody red-handed.
Dave Bittner: [00:25:30] Oh, interesting.
Joe Carrigan: [00:25:30] Right? Again, we hear that policies and process are key, particularly for business email compromise. People need to follow the process and follow the policy. And your comment about the Starbucks Wi-Fi, getting around that, I kind of blame the employees for going around it, but I also kind of blame management for saying, no, nobody's going to use Dropbox, right?
Dave Bittner: [00:25:49] Right.
Joe Carrigan: [00:25:49] There's got to be some other way to go about this, right?
Dave Bittner: [00:25:52] Yeah. And I think that's the bigger point, is that, you know, people still have to get their work done.
Joe Carrigan: [00:25:57] Right.
Dave Bittner: [00:25:58] And they're measured, they're rewarded and punished by how well they get their work done. So you've incentivized these people to get around your security, right? (Laughter).
Joe Carrigan: [00:26:07] Right. Exactly. And that gets me to my final point - if you're running one of these phishing exercises, don't use them to shame people. I don't know how often this happens, but I get the sense that it does happen - that companies see how many people fell for the email and shame them. And the guys are right about that: if you write a really good phishing email, you can expect a higher click rate on it.
Dave Bittner: [00:26:26] Sure. Yeah.
Joe Carrigan: [00:26:27] You know, you can expect more of your employees to fall for it, and use that as a teaching moment. I agree 100%, yeah. But do not shame people. That is counterproductive.
Dave Bittner: [00:26:34] Yeah. Use a carrot, not a stick, right? (Laughter).
Joe Carrigan: [00:26:36] I - yeah. For this kind of thing, I say use a carrot, not a stick.
Dave Bittner: [00:26:39] Yeah.
Joe Carrigan: [00:26:40] Other times I say use the stick. But not in this case.
Dave Bittner: [00:26:42] (Laughter) Yeah.
Joe Carrigan: [00:26:43] If you're going to be phishing your own employees, then that is a 100% carrot situation.
Dave Bittner: [00:26:47] Yeah. All right. Well, once again, thanks to David Spark and Allan Alford for joining us. They are co-hosts of the "Defense in Depth" podcast. We hope you'll check that out. It is a good show. That is our podcast.
Dave Bittner: [00:26:59] We want to thank our sponsors, KnowBe4. Their new-school security awareness training will help you keep your people on their toes, with security at the top of their mind. Stay current about the state of social engineering by subscribing to their Cyberheist News at knowbe4.com/news. Think of KnowBe4 for your security training. We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.
Dave Bittner: [00:27:25] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik, executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:27:40] And I'm Joe Carrigan.
Dave Bittner: [00:27:41] Thanks for listening.