Hacking Humans | Ep 48 | 5.9.19

A data-driven approach to trust.


Richard Ford: [00:00:00] I would just encourage people to be more mindful in how they think about trust. Think about, is somebody really trustworthy, what trust you're extending. And then make a rational, data-driven, thought-out decision.

Dave Bittner: [00:00:12] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:31] Hello, Dave.

Dave Bittner: [00:00:31] We've got some interesting stories to share this week. And later in the show, we have my interview with Dr. Richard Ford. He's chief scientist at Forcepoint, and we're going to be talking about models of trust. So stay tuned for that. But first, we've got a word from our sponsors, KnowBe4. So who's got the advantage in cybersecurity, the attacker or the defender? Intelligent people differ on this, but the conventional wisdom is that the advantage goes to the attacker. But why is this? Stay with us, and we'll have some insights from our sponsor, KnowBe4, that puts it all into perspective.

Dave Bittner: [00:01:11] And we are back. Joe, why don't you kick things off for us?

Joe Carrigan: [00:01:13] I will do that, Dave. This week, my story comes from a friend of the show and my close, personal friend, Graham Cluley.

Dave Bittner: [00:01:20] (Laughter) OK.

Joe Carrigan: [00:01:21] (Laughter).

Dave Bittner: [00:01:21] Friend, mentor, colleague...

Joe Carrigan: [00:01:24] Right.

Dave Bittner: [00:01:24] ...Stalkee.

Joe Carrigan: [00:01:25] Right. (Laughter).

Dave Bittner: [00:01:25] All right. Go on. Go on.

Joe Carrigan: [00:01:27] I've talked to him a couple times on Twitter...

Dave Bittner: [00:01:28] (Laughter).

Joe Carrigan: [00:01:28] ...And been on his podcast. But...

Dave Bittner: [00:01:31] (Laughter).

Joe Carrigan: [00:01:31] ...He has a story about Saint Ambrose Catholic Church in Brunswick, Ohio.

Dave Bittner: [00:01:35] OK.

Joe Carrigan: [00:01:36] It's a pretty big church, with 5,000 families in the community and about 16,000 members. The pastor is Father Bob Stec, and he had to send out a letter recently to his parishioners. You see, they're renovating the church, and the company that they're working with is Marous Brothers Construction company. This is how this works when you have a church that's old. You need to renovate it, and you hire somebody. And this is actually a pretty expensive endeavor. But on Wednesday, Marous Brothers calls the church inquiring as to why they have not paid their monthly payment for the project for the past two months. The amount they're looking for is approximately $1.75 million.

Dave Bittner: [00:02:11] Wow.

Joe Carrigan: [00:02:11] OK? That's two payments, two monthly payments they haven't received.

Dave Bittner: [00:02:15] Yeah.

Joe Carrigan: [00:02:15] So the church does some investigation, and they find out that they have been a victim of some business email compromise where someone has gotten into the church's email system and has convinced people that Marous Brothers has changed their banking information.

Dave Bittner: [00:02:30] OK. So presumably, the church was paying, electronically...

Joe Carrigan: [00:02:34] Correct.

Dave Bittner: [00:02:35] ...These monthly payments.

Joe Carrigan: [00:02:36] Yes.

Dave Bittner: [00:02:36] Mmm hmm.

Joe Carrigan: [00:02:37] And now the FBI is involved, of course.

Dave Bittner: [00:02:39] OK.

Joe Carrigan: [00:02:39] The FBI has found out that hackers broke into two email accounts belonging to the church. They don't know how they did it - if they got in with phishing or did some keylogging malware distribution. But basically, the staff was tricked.

Dave Bittner: [00:02:50] Yeah.

Joe Carrigan: [00:02:50] Right? And this money is essentially gone because it's been a while since the money was transferred. It didn't happen, like, three days ago, and then they could get in. They're looking for two months of payments. So if nothing else, the first payment has certainly been completely laundered by now...

Dave Bittner: [00:03:05] Wow.

Joe Carrigan: [00:03:05] ...And is in criminal hands. The second payment, maybe they'll get some of that back, but probably not. They're not going to get any of this back. This is a big hit to the church. Five thousand families in the church. That's an average of $350 per family that this church has been bilked out of.

Dave Bittner: [00:03:20] Wow. And I wonder - I mean, it sounds like these bad guys did their homework.

Joe Carrigan: [00:03:24] Sure.

Dave Bittner: [00:03:24] They must've known that this renovation was happening. They knew who was doing it.

Joe Carrigan: [00:03:30] Right.

Dave Bittner: [00:03:30] And I suppose that's probably not that hard to look up. I would think maybe the church bulletin was put online and...

Joe Carrigan: [00:03:38] Yeah. These church bulletins are online all the time, all the news that's going on. Father Stec had to apologize to his parishioners. I can't imagine having to be somebody who has to write that letter, right?

Dave Bittner: [00:03:46] Right.

Joe Carrigan: [00:03:46] You know, I know you all trusted us with a bunch of money, but we've actually lost it. And I like what Graham says here at the end of this article. He says, don't feel too superior about this. More and more, not just churches, but firms are falling victim to these kinds of attacks. Generally, people in the church don't think that they're really a target for this kind of attack.

Dave Bittner: [00:04:05] Right.

Joe Carrigan: [00:04:05] Because who would attack a church? Right?

Dave Bittner: [00:04:08] Yeah.

Joe Carrigan: [00:04:08] What kind of a monster would attack a church? Well, these guys don't have any kind of scruples. They really don't care where their money comes from. They're stealing money. They'll steal it from a church just as well as they'll steal it from somebody else. So it's really not something you can discount. You have to think about this, that you have to always have an adversarial mindset whenever you're talking about money.

Dave Bittner: [00:04:25] It's a shame.

Joe Carrigan: [00:04:26] Yeah, it is a shame. But again, this is - another red flag that you have to be aware of in these kind of scams is, we're changing our banking information. I'm going to go ahead and say that should be as big of a red flag as, I need you to pay me in gift cards.

Dave Bittner: [00:04:39] (Laughter) Right. Right.

Joe Carrigan: [00:04:40] When you get that message, the first thing you should think is, this is a scam, let me make a phone call and see what happens.

Dave Bittner: [00:04:46] Yeah. Never follow through with that without checking.

Joe Carrigan: [00:04:49] Right.

Dave Bittner: [00:04:50] I mean, I think it's a good point, too, that, don't call the phone number they provide in...

Joe Carrigan: [00:04:54] Absolutely not.

Dave Bittner: [00:04:55] ...In that correspondence. Right? They could say, we're changing our bank account information. If you have any questions, please call me. My personal line is this.

Joe Carrigan: [00:05:05] Right. No. Call the number that you've been calling 'cause chances are you've already dealt with these folks before, this vendor before, whatever it is.

Dave Bittner: [00:05:12] Right.

Joe Carrigan: [00:05:12] If you work in accounts payable, you know the people you have to write checks to.

Dave Bittner: [00:05:16] I hope maybe they'll get some of this back with insurance, or something like that.

Joe Carrigan: [00:05:20] If they've been insured for this kind of attack then there's a chance they can get it back that way. But in all probability, this money is just gone.

Dave Bittner: [00:05:26] Yeah. That is a big chunk of change.

Joe Carrigan: [00:05:28] It is.

Dave Bittner: [00:05:30] Well, all right, Joe. It's a heartbreaker, but an interesting story.

Joe Carrigan: [00:05:34] I'm good with the cheery stuff on this podcast.

Dave Bittner: [00:05:37] (Laughter). Well, I do have some cheery stuff...

Joe Carrigan: [00:05:40] Good.

Dave Bittner: [00:05:40] ...This week. (Laughter) Yes. My story, this comes from the U.S. Attorney's Office from the Southern District of New York. And we've got some good news. Nine defendants were arrested in New York, Florida and Texas for a multimillion-dollar wire fraud scheme.

Joe Carrigan: [00:05:54] Ha-ha-ha.

Dave Bittner: [00:05:55] So...

Joe Carrigan: [00:05:56] Did they defraud any churches, Dave?

Dave Bittner: [00:05:57] Not - well, I don't know.

Joe Carrigan: [00:05:57] Not these guys, huh?

Dave Bittner: [00:05:57] (Laughter) I don't know. But these folks had bilked people out of in excess of $3.5 million. And they basically went about it in three different ways. They did business email compromise...

Joe Carrigan: [00:06:11] Uh huh.

Dave Bittner: [00:06:11] ...Where they fooled people - you know, exactly what you just described.

Joe Carrigan: [00:06:15] Right.

Dave Bittner: [00:06:16] The second way, they used what's called the Russian oil scam. I don't think that's one we've covered on our show.

Joe Carrigan: [00:06:22] No. That's a new one to me.

Dave Bittner: [00:06:23] Have to look that one up. Evidently, it involved the opportunity to invest in oil that is stored in Russian oil tank farms. And you have to wire some upfront payments, and if you do that then you'll be an investor in this oil field, and you profit. Of course, it's a scam.

Joe Carrigan: [00:06:41] Yeah, of course.

Dave Bittner: [00:06:42] (Laughter). And then the last category was a romance scam.

Joe Carrigan: [00:06:45] Huh. That's another way to take advantage of people's feelings, emotions.

Dave Bittner: [00:06:48] So we've got nine people arrested here. Most of them came out of New York. A couple of them, from Florida, and one from Texas. They are each charged with a count of conspiring to commit wire fraud. They could serve up to 20 years in prison. So congratulations to the folks who went after them for this. Hats off to the folks in the Southern District of New York. I suspect they probably had some help with their friends from the FBI. It says there were also some folks from U.S. Immigration and Customs Enforcement and Homeland Security. So lots of different...

Joe Carrigan: [00:07:21] Department of Treasury, maybe?

Dave Bittner: [00:07:22] But it's good to see that sometimes the good guys win. And I think it's good to get that message out there for the folks who are trying to do this kind of stuff, that sometimes the long arm of the law will catch you and will take you in.

Joe Carrigan: [00:07:35] Particularly, if you live in the United States.

Dave Bittner: [00:07:37] Right. And I think that's an interesting part of this, too, is that, I think it's easy for us to think that most of this sort of thing is happening overseas 'cause a lot of it does.

Joe Carrigan: [00:07:45] Right.

Dave Bittner: [00:07:45] But not all of it.

Joe Carrigan: [00:07:47] Nope.

Dave Bittner: [00:07:47] And in this case, the good guys were able to nab the bad guys. So let's hope some of those people get some of their money back, even if it's only a portion of it. It's probably unlikely they'll get all of it, but...

Joe Carrigan: [00:07:57] Yeah.

Dave Bittner: [00:07:57] ...I guess something's better than nothing in this case. All right, Joe. Well, it's time to move on to our Catch of the Day.


Dave Bittner: [00:08:06] This week, our Catch of the Day comes from one of our regular listeners. His name is Todd, and Todd has actually sent us a previous Catch of the Day. He sent us an email. He said, gentlemen, last time I emailed, it was about the awful sextortion story that possibly led to a soldier to kill himself. He says, fortunately, this time, the subject is a lot more lighthearted.

Joe Carrigan: [00:08:26] Good.

Dave Bittner: [00:08:26] He says, I was perusing my spam filters for anything legitimate, and I found the first attachment with a subject line like, Compensation. I clearly had to open it at once.

Joe Carrigan: [00:08:36] (Laughter).

Dave Bittner: [00:08:38] And the letter reads like this. The subject is Compensation. (Reading) Dear friend...

Joe Carrigan: [00:08:44] Three exclamation points.

Dave Bittner: [00:08:45] Yes. (Reading) I'm sorry but happy to inform you about my success in getting these funds transferred under the cooperation of a new partner from Vietnam, although I tried my best to involve you in the business, but God decided the whole situations.

Joe Carrigan: [00:08:59] (Laughter). Here we go.

Dave Bittner: [00:09:00] (Reading) Presently, I'm in Vietnam for investment projects with my own share of the total sum. Meanwhile, I didn't forget your past efforts and attempts to assist me in transferring those funds despite that it failed us somehow. Now contact my secretary in Burkina Faso. Her name is Ms. Chantal Davids (ph) on her email address below. Ask her how to send you the total of $1.45 million, which I kept for your compensation for all the past efforts and attempts to assist me in this matter. I appreciated your efforts at that time very much so feel free and get in touched with my secretary Ms. Chantal Davids, and instruct her where to send the amount to you. Please do let me know immediately you receive it so that we can share joy after all the sufferness at that time.

Joe Carrigan: [00:09:43] (Laughter).

Dave Bittner: [00:09:44] (Reading) In the moment, I'm very busy here because of the investment projects which I and the new partner are having at hand. Finally, remember that I had forwarded instructions to the secretary on your behalf to receive that money so feel free to get in touch with Ms. Chantal Davids. She will send the amount to you without any delay. OK. Extend my greetings to your family. My best regards, your brother, Dr. Abu Ahmed (ph). Greetings from Vietnam.

Joe Carrigan: [00:10:08] My brother is not named Abu.

Dave Bittner: [00:10:11] (Laughter). I think that was just a friendly greeting.

Joe Carrigan: [00:10:13] OK.

Dave Bittner: [00:10:14] I think he was your spiritual brother, not...

Joe Carrigan: [00:10:16] Right. Yeah. I love the appeal immediately to a religion. God decided the whole situation.

Dave Bittner: [00:10:21] Right.

Joe Carrigan: [00:10:22] This is a godly man. How could he lie?

Dave Bittner: [00:10:23] No. It could - not possible. Not possible. There's some odd things in this one. The whole thing about how you've assisted me in the past...

Joe Carrigan: [00:10:31] Right.

Dave Bittner: [00:10:31] Once again, I suppose - 'cause if you got this out of the blue, that sort of explains why this person might be doing this. But you may - I imagine a greedy person might think to themselves, well, I'm just going to play along. I'm going to get $1.45 million. At any rate, a lot going on here. I think most people would see through this one but, like we've said before, they're using these as a filtering mechanism to find the people who might be susceptible.

Joe Carrigan: [00:10:56] Right. And that's why they send these emails with these outlandish claims because if you were the kind of person who would believe it, you're also the kind of person who's likely to send money.

Dave Bittner: [00:11:06] Yeah. All right. Well, it's got a lot of different things in here. That's a fun one. And thanks so much to Todd for sending it into us. That is our Catch of the Day. Coming up next, we've got my interview with Dr. Richard Ford. He's the chief scientist at Forcepoint. He's going to be telling us all about models of trust. But first, a message from our sponsors at KnowBe4.

Dave Bittner: [00:11:29] Now let's return to our sponsor's question about the attacker's advantage. Why do the experts think this is so? It's not like a military operation, where the defender is thought to have most of the advantages. In cyberspace, the attacker can just keep trying and probing at low risk and low cost, and the attacker only has to be successful once. And as KnowBe4 points out, email filters designed to keep malicious spam out have a 10.5% failure rate. That sounds pretty good. Who wouldn't want to bat nearly .900? But this isn't baseball. If your technical defenses fail in one out of 10 tries, you're out of luck and out of business. The last line of defense is your human firewall. Test that firewall with KnowBe4's free phishing test, which you can order up at knowbe4.com/phishtest. That's knowbe4.com/phishtest.
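[Editor's note: the sponsor's 10.5% failure rate compounds quickly across repeated attempts. A quick back-of-the-envelope check, assuming for simplicity that each malicious email gets past the filter independently:]

```python
# Illustrative only: how a 10.5% per-email filter failure rate compounds.
# Assumes independent attempts, which is a simplification.

failure_rate = 0.105  # chance one malicious email slips past the filter

def p_at_least_one(n, p=failure_rate):
    """Probability that at least one of n malicious emails gets through."""
    return 1 - (1 - p) ** n

# After 10 attempts, the odds of at least one getting through are
# roughly 2 in 3 -- which is why the attacker only has to win once.
print(p_at_least_one(10))
```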

Dave Bittner: [00:12:29] And we are back. Joe, I recently had the pleasure of speaking with Dr. Richard Ford. He is the chief scientist at Forcepoint. They are a security company. They focus on what they call human-centric cybersecurity. So we're going to be talking about something they refer to as models of trust. Here's my conversation with Dr. Richard Ford.

Richard Ford: [00:12:48] There's an old expression, right? If you want to know about water, don't ask a fish. And I think sometimes when we think about trust, it's the same kind of challenge. So in other words, trust is so woven into everything that we do that we're not really aware of how much time we're spending on trust and how much effort and how important it is. And conversely, as an attacker, a lot of the time what I'm trying to do is get you to trust in something that's not actually very trustworthy.

Dave Bittner: [00:13:14] Can you give us some examples?

Richard Ford: [00:13:16] Yeah, absolutely. So I will give a very simple one, right? Let's take the most hackneyed, sort of overdone piece of spam ever, and that's the piece of spam that shows up and says, hello. I'm a Nigerian prince, and I'm trying to give you an awful lot of money. At the end of the day, what you're trying to do is get users to trust that that might be real. They use some drivers like greed to go, OK, maybe I'll chance my hand at this. But, again, it's sort of trying to build this trust relationship so that I'll send you my bank account so that you can then deposit - what is it, usually? - about $9 million or so.

Richard Ford: [00:13:49] And by the way, as an aside, one of the reasons those emails are so awful isn't because the attackers are stupid. It's because they're smart. Every single time somebody responds to one of those emails, it takes the attacker a little bit of time to deal with it. So they only want people to respond if they're actually going to be gullible and go all the way through. So if you make the email really obvious, you've sort of built in a filter out in the front end to filter out all the people who'll talk to you for a bit and then go away without sending their banking details.

Dave Bittner: [00:14:17] Now, in terms of organizations protecting their networks, I mean, the way that we establish trust there has changed over the years as well, hasn't it?

Richard Ford: [00:14:25] Yeah, radically so. So if you wind the clock back about 20 years, we have this sort of outside-bad-inside-good sort of world view, right? That's where the whole concept of firewalls came from, is the firewall would stop you from the burning sort of mess that the internet is. And, you know, your interior network could be all pristine and beautiful.

Richard Ford: [00:14:46] Of course, over the last few years, we've seen the rise of what would be called zero trust. And zero trust was, you know, created to get us away from this sort of false idea that everything outside is potentially bad, and everything outside is good, and recognized instead that, you know, bad things also happen inside. And so you shouldn't just trust something based on where the packets seem to be coming from, for example. So I think we've gotten much more sophisticated around concepts of trust, and I think that's very much a good thing.

Dave Bittner: [00:15:16] Yeah. It seems to me like, you know, this old model, I - sometimes I imagine in my mind, you know, my castle with my moat and my drawbridge. And it seems like, in the old days, if you thought about the drawbridge as your username and password, once that drawbridge was down and I was allowed through the castle gate, well, I had pretty much free run once I was inside the castle.

Richard Ford: [00:15:34] Yeah. And unfortunately, that mentality actually still does exist today, right? If we step back and think about many of our security mitigations, so many of them are focused on this sort of rather gross, macro-level perimeter around the outside of the business. I think one of the things that I see most encouraging in the industry right now is that we're focusing increasingly on micro-perimeters, moving those perimeters closer to the object you're trying to protect so you protect from somebody inside the castle as well as sort of the hordes that are outside the castle. And I think we're seeing the industry sort of growing up in this respect. I'm really encouraged by it.

Dave Bittner: [00:16:10] How is that taking place? How are these micro-perimeters playing out in the real world?

Richard Ford: [00:16:15] I would argue there's a lot of different forms of micro-perimeter, right? I mean, one of the things that we do is we try and make certain that people are actually more verified in the sense that, you know, we're going towards multi-factor because if you're going to trust me, you better be sure that I'm me. I think there's concepts around recognizing that we need to look at east-west traffic as well as north-south. So in other words, not just inbound and outbound but internal to the company.

Richard Ford: [00:16:40] But also, I think, you know, there's increasing ideas around this sort of behavioral micro-perimeter. So it's not just what access does Richard have, but how is Richard using that access. And that's where the whole field of UEBA, or user and entity behavioral analytics, is starting to help us raise our game. So we're not just looking at access control - that's this is what I can do, this is what I can't - but looking at it in context. So yes, I can access this particular customer's data. But is it reasonable that I just accessed and printed out 10,000 customers' data? No, that's probably not reasonable. That's probably a symptom of something bad happening.

Dave Bittner: [00:17:16] And I suppose, I mean, part of what you're up against here is that you don't want to increase the amount of friction that people encounter when they're just trying to do their work.

Richard Ford: [00:17:25] Security friction's evil. If you look at security friction, it's kind of a hill that security solutions go on to die, right? I mean, I like to talk to CISOs a lot. And one of the conversations we often have is I'll say, look, if I gave you a solution that was as good at detecting bad things as what you have today but caused no friction, would you buy it? And the answer is universally yes. Security friction leads to a lot of very negative consequences in the environment. That's actually why a behavioral approach is so strong. If you make the unit of analysis the human being, and you very carefully analyze how they're leveraging and using data, I think you can sort of have your cake and eat it, right? You can extend trust to the people that are trustworthy. And for the people that are showing significant signs of risk, you can kind of start to deliberately introduce frictional mitigations there. So in other words, you don't have this one-size-fits-all approach to security, which does drive friction. So I think your point is very well-made.

Dave Bittner: [00:18:20] Do you suppose we'll see these sorts of things trickling down to the consumer level? I know enterprises where we see this stuff today. Do you think that's something that's going to - we're going to find in our future?

Richard Ford: [00:18:29] I think we already are seeing it, actually. So if you look at, for example, Gmail - Gmail now supports multi-factor, which is one of the tenets of sort of zero trust, right? It's pretty important to be able to verify who people actually are. There's fraud protection, which is present every single day. If you use your credit card, it's extraordinarily likely, if not a certainty, that your credit card provider is running your transactions through algorithms to detect fraud. So you've probably had that call, when you've made a charge that's a little bit out of the normal for you, where it's your bank going, hey, did you really just buy a washing machine or whatever it is?

Dave Bittner: [00:19:03] Right.

Richard Ford: [00:19:03] That's fraud detection. That's based on your historic purchases. How do you normally buy things? So we're - we are seeing that trickle down already today. And I think, yeah, we'll see it more often. You have fraud detection now even on your cellphone, potentially, trying to make certain that the purchases you're making from, say, the Apple Store correspond to the things that you normally would be buying.

Dave Bittner: [00:19:23] Now, you mentioned zero trust. What exactly are we talking about when we say that?

Richard Ford: [00:19:28] Right. So zero trust is something that's been championed pretty heavily by Forrester. And what it came out of, again, was sort of getting us away from this concept of outside bad, inside good because a lot of vulnerabilities get generated when you think of the world that way.

Richard Ford: [00:19:41] But it's actually evolved dramatically now to say, look at a world where you assume compromise, where you assume that things are going to go wrong. Don't just radically and sort of blindly extend trust in the wrong places. Trust is sort of earned. It's built up. It's confirmed. It's much more than trust, but verify. It's assume compromise and try and make certain that, in the event of compromise, the sort of damage is as minimal as possible.

Dave Bittner: [00:20:06] So what are your recommendations for folks who are going about their lives, you know, using their computers, their mobile devices in an everyday sort of way? How should they approach this? How do they dial in the amount of trust - the relationship they have with the various services that they use?

Richard Ford: [00:20:22] Yeah. So that's a really hard question, but one that I'm happy to tackle. First of all, let's talk about sort of the zero-trust methodologies. Right? One thing that every single user should do is up their authentication game, right? Again, Gmail supports multi-factor now. Right? You can buy a cheap little dongle that will help you authenticate in a much more secure way. And I think people don't understand or - that's a little bit harsh, don't understand - but I don't think they've sort of absorbed into themselves how much of a nuisance it is and how much damage can be done if you actually get your username and password compromised or guessed. So the - there's that whole side of it.

Richard Ford: [00:20:59] In terms of trust and how they extend trust, I think we've made some progress, right? It used to be that you would believe everything that hit your inbox. And in fact, the pendulum has swung quite a long way in the other direction now. And I think that, you know, we've become quite cynical about the things that we see online. One of the places where there's still a lot of room for improvement, though, are in applications. So that free flashlight application that wants access to your contacts - that's a little bit suspicious, right? So we need to think hard about the payments of personal information that we sort of pay for things in because when you get that free app, and it wants access to your contacts, what do you think it's using it for? Right?

Dave Bittner: [00:21:40] Right.

Richard Ford: [00:21:41] What's the small print say? Is this data being shared? How is it being shared? And it's actually very difficult out there in consumer land to sort of avoid data sharing. GDPR and a post-GDPR world does give you a lot more control, assuming you're dealing with people who are trying to do it all right. So you do have more privacy control now. But there are a lot of people who are in that sort of gray space too, about sort of vaguely doing things right. And you have to be a little bit more savvy. Many websites or services now give you a lot more control over your data. But most users never go in and change the defaults.

Dave Bittner: [00:22:14] Yeah. It's an interesting notion that it's worth the time to go in there and to even just review them to make sure they're what you think they are.

Richard Ford: [00:22:21] That's right. A lot of users sort of wave their hands up in the air in frustration and say, my privacy is gone. What can I do? I understand where that sort of feeling comes from. But I think that we are actually trending towards a better place if consumers take back the power. And I think that potential they have, the opportunity to do that - I mean, there are some really interesting things going on in terms of technologies that could help enhance our privacy.

Richard Ford: [00:22:48] There's the Brave browser, for example, which is actually the browser that I tend to use, which has a whole bunch of interesting technologies inside of it to try and allow advertisers to target me - as in, give me type of ads - but still protect the privacy of the person that is Richard Ford. And it uses some interesting tech to try and accomplish that. So, you know, we're seeing things trend in an interesting direction. But it'll only work when the sort of average person in the street gets more involved and takes a little bit more control.

Richard Ford: [00:23:18] I think it's important for users to think hard about where they're extending trust. If I send you a document, and you double-click it and open it, you're extending some trust to me - right? - because there's some risk to you every time you open an attachment. Similarly, when you stick your credit card into anything, really, I mean, even a gas pump - right? - there's some level of risk. You're going to have to think about the trust you're extending. I don't want users to live in a world where it's all about distrust, however. There are two ways that trust goes wrong. There's when you trust something that you shouldn't have. I think we're all very familiar with that. But there's also an example where you should have trusted something and you didn't and you miss out on an opportunity. So I would just encourage people to be more mindful in how they think about trust. Think about - is something really trustworthy? - what trust you're extending. And then make a rational, data-driven, thought-out decision.

Dave Bittner: [00:24:10] Lots of interesting stuff there. Huh, Joe?

Joe Carrigan: [00:24:11] Yeah. That was a great interview, Dave. I like the castle analogy. Everything outside is bad. Everything inside is good. Right? And I like what Dr. Ford says here - that security teams need to think about their domains as compromised. And that needs to be part of your threat model. Right?

Dave Bittner: [00:24:26] Yeah.

Joe Carrigan: [00:24:26] So let's say that I don't consider anybody on the inside of my network malicious, so they have free rein everywhere. Well, that's no good because people are going to be malicious on the inside of your network at some point in time.

Dave Bittner: [00:24:38] Well - and insider threats don't necessarily have to be malicious. They can be laziness or just someone can make a mistake.

Joe Carrigan: [00:24:44] Yeah. Actually, I'm not even talking about insider threats. At some point in time, somebody's going to compromise your firewall and get on the inside.

Dave Bittner: [00:24:49] OK. Sure.

Joe Carrigan: [00:24:50] And then you have to act as if that threat exists on your network. And yes, the insider threat is also a real thing. And you're right; the insider threat is not usually malicious. It's usually either lazy or absent-mindedness or not even realizing what they're doing.

Dave Bittner: [00:25:04] Yeah.

Joe Carrigan: [00:25:05] Right?

Dave Bittner: [00:25:05] Yeah.

Joe Carrigan: [00:25:05] Ignorance. Behavior is key - I like what he says there. It's fine for me to look at one or two customer records. But if I'm pulling out 10,000 customer records, what business reason would I have for doing that? That's not something that happens on a regular basis. Or if it does happen on a regular basis, it's part of a process that runs at a particular time - maybe for a data warehousing application, where some data analysis is going on.

Joe Carrigan: [00:25:28] But you know when those things are happening. Right? And you know...

Dave Bittner: [00:25:32] Right.

Joe Carrigan: [00:25:32] ...Where they're coming from. So if you see that kind of activity coming from somebody's workstation, that should set off a red flag somewhere...

Dave Bittner: [00:25:40] Yeah.

Joe Carrigan: [00:25:41] ...An alert at least.

Dave Bittner: [00:25:42] Yeah. And again, it could be inadvertent, where...

Joe Carrigan: [00:25:44] Right.

Dave Bittner: [00:25:45] ...Someone instead of, you know, grabbing a file or two grabs the whole folder...

Joe Carrigan: [00:25:49] Yep.

Dave Bittner: [00:25:49] ...And forwards that on to someone. That is (laughter) something you want to keep an eye on.

Joe Carrigan: [00:25:53] Exactly.

Dave Bittner: [00:25:55] Yeah.

Joe Carrigan: [00:25:55] Or somebody has a malformed SQL query.

Dave Bittner: [00:25:58] (Laughter) Right.

Joe Carrigan: [00:25:58] Or they...

Dave Bittner: [00:25:59] Or that.
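[Editor's note: the kind of behavioral alert Joe describes - flagging a workstation that pulls far more records than its normal baseline - can be sketched roughly as below. The function names, baseline, and threshold are purely illustrative, not from any real product mentioned in the episode.]

```python
# Hypothetical sketch of a per-user bulk-access alert. Names and
# thresholds are illustrative assumptions, not a real product's API.
from collections import defaultdict

BASELINE_DAILY_RECORDS = 50   # assumed "normal" per-user daily volume
ALERT_MULTIPLIER = 10         # flag anyone at 10x baseline

def check_access_log(events):
    """events: iterable of (user, records_pulled) tuples for one day.
    Returns the users whose total pull volume warrants review."""
    totals = defaultdict(int)
    for user, count in events:
        totals[user] += count
    # Flag for review rather than block outright - the spike may be a
    # legitimate, scheduled data-warehousing job or an honest mistake.
    return [u for u, n in totals.items()
            if n > BASELINE_DAILY_RECORDS * ALERT_MULTIPLIER]

log = [("alice", 2), ("bob", 1), ("mallory", 10000)]
print(check_access_log(log))  # -> ['mallory']
```

The point of the sketch is the design choice Joe and Dave discuss: the alert fires on anomalous behavior relative to a baseline, and a human decides whether it is an attack, a scheduled batch job, or someone grabbing a whole folder by accident.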

Joe Carrigan: [00:25:59] Yeah. I like what he says about security friction being evil. This was something that was kind of alien to me, and I'll share a story about it. When I started working at Hopkins, I came in with all these preconceived security notions, and I was wondering why a hospital is so vulnerable to security problems. It wasn't clear to me until I was steeped in the environment. It's because cybersecurity is a secondary concern in a hospital.

Joe Carrigan: [00:26:26] The primary concern is saving people's lives. You know? When people come into the emergency room, their first complaint is - I can't breathe, my chest hurts, maybe get these bullets out of me.

Dave Bittner: [00:26:39] Right.

Joe Carrigan: [00:26:40] Right? It's never make sure my data's secure.

Dave Bittner: [00:26:41] Right.

Joe Carrigan: [00:26:42] You know? It's always something more important - literally more important. You're much lower on Maslow's hierarchy of needs...

Dave Bittner: [00:26:48] Right. Yeah, absolutely.

Joe Carrigan: [00:26:50] ...Than cybersecurity. So the statement somebody made to me was, if your product interferes with a doctor's ability to deliver health care to a patient, that product is out of here...

Dave Bittner: [00:27:01] Yeah.

Joe Carrigan: [00:27:01] ...The next day.

Dave Bittner: [00:27:02] The doctors won't stand for it.

Joe Carrigan: [00:27:03] They won't stand for it.

Dave Bittner: [00:27:04] Yep.

Joe Carrigan: [00:27:04] So he's 100% correct. And even in environments where you don't have that kind of thing - like maybe in a financial environment, where you're much more tolerant of security friction - if you can lower the amount of friction, you can make a more profitable business. Right? And that's good for everybody.

Dave Bittner: [00:27:19] Yeah. And I think, also, that friction tends to spawn workarounds.

Joe Carrigan: [00:27:22] Yes (laughter).

Dave Bittner: [00:27:23] If something's a pain, then someone will figure out a - they'll think they're being clever. And they probably are being clever.

Joe Carrigan: [00:27:29] Right.

Dave Bittner: [00:27:29] But those workarounds lead to vulnerabilities.

Joe Carrigan: [00:27:31] Yeah, you make hackers out of your entire workforce (laughter).

Dave Bittner: [00:27:33] Right. Exactly, exactly.

Joe Carrigan: [00:27:35] Up your authentication game. Everybody should be using multifactor authentication, strong passwords and a password manager.

Dave Bittner: [00:27:42] Yeah.

Joe Carrigan: [00:27:42] All right? That...

Dave Bittner: [00:27:43] If it's important to you, you need to be doing that.
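[Editor's note: for listeners curious what the "authenticator app" second factor Joe recommends actually computes, here is a minimal sketch of the underlying one-time-password math - RFC 4226 HOTP, with TOTP simply deriving the counter from the clock. This is illustrative only; use a vetted library in practice.]

```python
# Minimal HOTP/TOTP sketch (RFC 4226 / RFC 6238). Educational only --
# real deployments should use a maintained, audited library.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """One-time password from a shared secret and a counter."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, unix_time: int, period: int = 30) -> str:
    """TOTP is just HOTP with the counter taken from the clock."""
    return hotp(secret, unix_time // period)

# RFC 4226 Appendix D test vector: secret "12345678901234567890",
# counter 0 should produce "755224".
print(hotp(b"12345678901234567890", 0))  # -> 755224
```

The reason this counts as a second factor: the six-digit code proves possession of the shared secret at this moment in time, so a phished password alone isn't enough to log in.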

Joe Carrigan: [00:27:45] And finally, towards the end of the interview, he's talking about trust. Trust is something that's innate in us - right? - who we trust. And I like the way he says trust can fail both ways. We can either trust something we shouldn't or not trust something we should. And that comes from our very human nature. We trust things that we're familiar with, and we distrust things that we're not familiar with.

Dave Bittner: [00:28:02] Right.

Joe Carrigan: [00:28:03] So that can cut both ways.

Joe Carrigan: [00:28:05] Now, maybe that's good. Maybe with something I'm not familiar with, it's - hey, you need to prove to me that you're acting in my interest here. Right? You know, the guy knocks on your door and says, hey, I'm here to sell you windows.

Dave Bittner: [00:28:16] Right, right.

Joe Carrigan: [00:28:16] Immediately, I don't trust that guy (laughter). Right?

Dave Bittner: [00:28:20] (Laughter) Yeah. You and I have discussed this before, where everyone has their own risk level...

Joe Carrigan: [00:28:25] Right.

Dave Bittner: [00:28:26] ...Where for me, there's a certain amount of risk that I'm willing to take where I will accept a certain amount of loss...

Joe Carrigan: [00:28:33] Yes.

Dave Bittner: [00:28:34] ...Due to scamming in exchange for not going around distrusting everybody.

Joe Carrigan: [00:28:40] Correct, correct.

Dave Bittner: [00:28:41] And everybody has to figure out what their acceptable level is for themselves.

Joe Carrigan: [00:28:44] Yes. And I had that same thing.

Dave Bittner: [00:28:47] Yeah. And your life informs that 'cause, you know, it's once bitten, twice shy. Right? That's what they say?

Joe Carrigan: [00:28:52] Yep.

Dave Bittner: [00:28:53] Yeah, yeah. All right. Well, again, thanks to Dr. Richard Ford from Forcepoint for joining us. And thank you for listening.

Dave Bittner: [00:29:00] And we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training.

Dave Bittner: [00:29:16] Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.

Dave Bittner: [00:29:24] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik; technical editor is Chris Russell. Our staff writer is Tim Nodar; executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:29:43] And I'm Joe Carrigan.

Dave Bittner: [00:29:44] Thanks for listening.