The fallacy of futility.
Ray [REDACTED]: [00:00:00] We're so numb to these massive breaches that it feels like they're almost inevitable.
Dave Bittner: [00:00:06] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:24] Hi, Dave.
Dave Bittner: [00:00:25] We've got some good stories to share this week, and later in the show, we've got a return visit from Ray [REDACTED]. He's got some follow-up from his previous visit as well as some new information to share.
Dave Bittner: [00:00:34] But first, a word from our sponsors at KnowBe4. So who's got the advantage in cybersecurity – the attacker or the defender? Intelligent people differ on this, but the conventional wisdom is that the advantage goes to the attacker. But why is this? Stay with us, and we'll have some insights from our sponsor KnowBe4 that puts it all into perspective.
Dave Bittner: [00:01:01] And we're back. Joe, I'm going to kick things off for us this week. You're familiar with the notion of a Ponzi scheme.
Joe Carrigan: [00:01:07] Yes, I am.
Dave Bittner: [00:01:08] Similar, I guess, to a pyramid scheme.
Joe Carrigan: [00:01:09] Similar – very similar, yes.
Dave Bittner: [00:01:11] What's the difference?
Joe Carrigan: [00:01:12] The difference is generally that in a pyramid scheme, people kind of know that they're in some kind of scheme where they have to go out and recruit new people. But in a Ponzi scheme, people may not know that, or don't know that. They're just seeing returns on their investments.
Dave Bittner: [00:01:25] Yeah. So the way a Ponzi scheme works is I present you with an investment opportunity.
Joe Carrigan: [00:01:30] Right.
Dave Bittner: [00:01:31] I say you're going to get amazing returns on this investment opportunity.
Joe Carrigan: [00:01:34] That's right.
Dave Bittner: [00:01:35] But there is no investment opportunity. What I'm really doing is I'm going and finding other people...
Joe Carrigan: [00:01:39] Right.
Dave Bittner: [00:01:40] ...To also give me their money, and I'm paying you returns based on the money that they're giving me.
Joe Carrigan: [00:01:45] Right.
Dave Bittner: [00:01:45] But then I'm promising them returns.
Joe Carrigan: [00:01:47] Yep.
Dave Bittner: [00:01:47] So then I have to go get more people to pay the returns that I owe them.
Joe Carrigan: [00:01:51] Yep.
Dave Bittner: [00:01:52] Eventually, if you imagine the shape of a pyramid...
Joe Carrigan: [00:01:54] Yep.
Dave Bittner: [00:01:54] ...It gets bigger and bigger and bigger as you go down and collapses under its own weight.
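To make that shape concrete, here is a minimal, hypothetical sketch of the arithmetic (an editorial illustration; the 20% promised return and $1 million starting pool are made-up assumptions, not figures from this case). When returns are paid only out of fresh deposits, the new money the operator must recruit each round grows geometrically.

```python
# Hypothetical sketch of Ponzi arithmetic: returns are paid only from new deposits,
# so the fresh money the operator must recruit each round grows geometrically.

def ponzi_new_money(initial_deposits=1_000_000, promised_return=0.20, rounds=10):
    """Fresh deposits needed each round just to cover the promised returns."""
    principal = initial_deposits      # everything "invested" so far (already spent or paid out)
    needed = []
    for _ in range(rounds):
        payout_due = principal * promised_return   # returns owed this round
        needed.append(payout_due)                  # must be covered entirely by new victims
        principal += payout_due                    # and those new deposits are owed returns too
    return needed

if __name__ == "__main__":
    for i, amount in enumerate(ponzi_new_money(), start=1):
        print(f"round {i:2d}: need ${amount:,.0f} in new deposits")
```

Each round needs roughly 20% more new money than the last, which is the pyramid getting wider until recruitment can no longer keep up.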
Joe Carrigan: [00:01:58] That's right. If you're an early investor in either one of these schemes, you can actually make out.
Dave Bittner: [00:02:01] You can.
Joe Carrigan: [00:02:02] Right.
Dave Bittner: [00:02:03] But chances are...
Joe Carrigan: [00:02:04] You won't.
Dave Bittner: [00:02:04] ...You're not going to do well. Right. Or you're going to run afoul of the law...
Joe Carrigan: [00:02:09] Oh.
Dave Bittner: [00:02:09] ...Which is what happened to...
Joe Carrigan: [00:02:10] Very quickly, you'll run afoul of the law.
Dave Bittner: [00:02:12] ...This gentleman from Rochester, N.Y., a man named Perry Santillo, who was running a Ponzi scheme. He bilked people out of over $100 million.
Joe Carrigan: [00:02:23] Wow.
Dave Bittner: [00:02:23] Actually, $155 million.
Joe Carrigan: [00:02:25] This is a pretty big Ponzi scheme.
Dave Bittner: [00:02:27] Ran this Ponzi scheme starting back in 2012 and lost over $70 million of the $155 million...
Joe Carrigan: [00:02:36] How did he lose $70 million?
Dave Bittner: [00:02:37] I think he spent it...
Joe Carrigan: [00:02:38] OK.
Dave Bittner: [00:02:40] ...Or paid it out as...
Joe Carrigan: [00:02:42] Yeah, paid it out as returns.
Dave Bittner: [00:02:43] ...As returns, which is money that cannot be recaptured...
Joe Carrigan: [00:02:46] Right.
Dave Bittner: [00:02:47] ...And bilked people out of hundreds of thousands of dollars...
Joe Carrigan: [00:02:49] Sure.
Dave Bittner: [00:02:50] ...By telling them that they were going to be investing in legitimate investment strategies.
Joe Carrigan: [00:02:54] Right.
Dave Bittner: [00:02:55] Now, what I think is particularly interesting about this scheme is one of the ways that they got people to have confidence in the scheme was they would buy up legitimate investment firms where the owners were about to retire.
Joe Carrigan: [00:03:12] Really?
Dave Bittner: [00:03:12] Yeah. So let's say I'm running, you know, the Acme Investment Firm here in Maryland.
Joe Carrigan: [00:03:17] Right.
Dave Bittner: [00:03:18] And I decide I've had a great career, and it's time for me to move on. I'm going to retire down to Florida and cash out of my business.
Joe Carrigan: [00:03:25] Right.
Dave Bittner: [00:03:26] These guys would come along, make me an offer for the business, buy out the business and, as part of that buyout, they're buying my book of business. They're buying...
Joe Carrigan: [00:03:34] Right. They're buying your customers.
Dave Bittner: [00:03:35] They're buying my customers.
Joe Carrigan: [00:03:36] Right.
Dave Bittner: [00:03:36] So then they would convince the customers to convert their investments.
Joe Carrigan: [00:03:41] Oh, this is horrible.
Dave Bittner: [00:03:42] Yeah.
Joe Carrigan: [00:03:42] I don't know if I've discussed this before, but I come – my family is kind of a finance family.
Dave Bittner: [00:03:47] OK.
Joe Carrigan: [00:03:47] My mom worked for one of these investors that you're talking about...
Dave Bittner: [00:03:50] Right.
Joe Carrigan: [00:03:50] ...And my dad actually managed investment funds and was a CPA. But what they've done here is absolutely terrible.
Dave Bittner: [00:03:57] Yeah.
Joe Carrigan: [00:03:57] What this guy has done – he's gone after people who have had sound investment advice, and now he's capitalizing on that trust and destroying these people's nest eggs.
Dave Bittner: [00:04:07] Right. Yeah, and that's precisely what happened. Evidently, this guy, not surprisingly, liked to live high off the hog.
Joe Carrigan: [00:04:14] Really?
Dave Bittner: [00:04:15] He had an extravagant lifestyle with expensive suits and cars and houses and all that sort of stuff.
Joe Carrigan: [00:04:22] Yeah.
Dave Bittner: [00:04:23] But now he's behind bars, facing fines. Of course, the FBI, who took him in – they're trying to get back as much of the funds as they can to return to the victims. But, of course, they say they're not going to be able to.
Joe Carrigan: [00:04:35] They're not going to be able to recover everything, and these people are probably out – it sounds like close to 75%. Was it?
Dave Bittner: [00:04:41] It's - they're going to get back pennies on the dollar...
Joe Carrigan: [00:04:43] Right.
Dave Bittner: [00:04:43] ...Likely. It's good that they caught him, but I think the cautionary tale here is about those investment firms. If someone comes to you and says, hey; we want to convert your investments to something else, do your due diligence.
Joe Carrigan: [00:04:57] Right. Yeah. This is tough because generally, you think that the people who are retiring – these are people you've trusted all your life, and these people have also been scammed, right?
Dave Bittner: [00:05:05] That's true. Right. Because they have relationships with these folks...
Joe Carrigan: [00:05:08] Exactly.
Dave Bittner: [00:05:09] ...That they have with their customers. I know – you know, I have friends who work in investment firms, and it's not merely transactional.
Joe Carrigan: [00:05:16] No, it's not.
Dave Bittner: [00:05:16] They're helping people achieve their dreams and provide security for their retirement and so on and so forth. They know these people. It would surprise me very much if they were, you know, cavalierly turning the business over to someone that they hadn't attempted to check out themselves.
Joe Carrigan: [00:05:30] Right. And they're going to send an email out that says, I'm retiring. I'm selling my book of business.
Dave Bittner: [00:05:34] Right.
Joe Carrigan: [00:05:35] Here's your new investment manager. I think he's a good guy.
Dave Bittner: [00:05:38] Exactly.
Joe Carrigan: [00:05:38] Right.
Dave Bittner: [00:05:38] Turns out...
Joe Carrigan: [00:05:39] He's not.
Dave Bittner: [00:05:40] ...Not such a good guy.
Joe Carrigan: [00:05:41] He's now in prison...
Dave Bittner: [00:05:43] Yeah.
Joe Carrigan: [00:05:43] ...Where he belongs.
Dave Bittner: [00:05:45] Yeah, so – cautionary tale, one to look out for. That wrinkle about buying out other investment companies, that's a new one.
Joe Carrigan: [00:05:51] Yeah, that's...
Dave Bittner: [00:05:52] I hadn't heard about that before.
Joe Carrigan: [00:05:53] ...Smart. But, I mean, this guy has hurt a lot of people.
Dave Bittner: [00:05:56] Yeah. All right, well, that's my story this week. What do you have for us, Joe?
Joe Carrigan: [00:05:59] Well, I'm staying with the same theme of hurting a lot of people.
Dave Bittner: [00:06:04] (Laughter).
Joe Carrigan: [00:06:04] This one comes from Lisa Vaas, who writes over at the Naked Security blog, which is from Sophos. You know, we do a lot of talking about deepfakes...
Dave Bittner: [00:06:12] Yeah.
Joe Carrigan: [00:06:12] ...On this show. You talk about it on the CyberWire. Do you know who gets victimized most by them? Is it politicians?
Dave Bittner: [00:06:18] No. I would say there's a lot of fear around the possibilities of what...
Joe Carrigan: [00:06:23] Right.
Dave Bittner: [00:06:23] ...Deepfakes could do with politicians.
Joe Carrigan: [00:06:25] Right.
Dave Bittner: [00:06:25] But I would say so far, no, not yet.
Joe Carrigan: [00:06:28] No, it's not. It's actually women that get victimized by these things...
Dave Bittner: [00:06:33] Yeah.
Joe Carrigan: [00:06:33] ...The most.
Dave Bittner: [00:06:34] Not surprising, I guess.
Joe Carrigan: [00:06:35] There is a report titled "The State of Deepfakes," which has been released by a company called Deeptrace.
Dave Bittner: [00:06:40] Oh, yes. Actually, we have – a gentleman from Deeptrace is going to be a guest on this show in a few weeks.
Joe Carrigan: [00:06:45] Oh, very good.
Dave Bittner: [00:06:46] Yeah.
Joe Carrigan: [00:06:46] They use deep learning and computer vision for detecting and monitoring deepfakes on the internet. They found that 96% of deepfakes being created in the first half of this year were porn...
Dave Bittner: [00:06:55] Oh.
Joe Carrigan: [00:06:57] ...Most of it nonconsensual.
Dave Bittner: [00:06:59] Right.
Joe Carrigan: [00:06:59] Right. A lot of them are made of celebrities without compensation or even permission from the actors. I mean, when you have a serious actor, I really don't imagine a situation in which they would give their permission for their face to be used in a video like this. Also, the number of deepfakes has doubled in the seven months leading up to July 2019, and this growth is because of the availability of easier-to-use tools. All right, one example of this was an app that came out a couple of months ago called DeepNude.
Dave Bittner: [00:07:31] I remember.
Joe Carrigan: [00:07:32] This was an app that let you create a nude photo of anyone you took a picture of, and I think it only let you create nude photos of women.
Dave Bittner: [00:07:38] That is correct.
Joe Carrigan: [00:07:38] If you took a picture of a guy...
Dave Bittner: [00:07:40] It would turn it into a naked woman.
Joe Carrigan: [00:07:42] ...It would turn it into a naked woman. Exactly.
Dave Bittner: [00:07:44] I would hazard a guess that they knew who their target audience was.
Joe Carrigan: [00:07:47] Exactly.
Dave Bittner: [00:07:47] Yeah.
Joe Carrigan: [00:07:48] Yeah, their target audience is a bunch of gross guys, right?
Dave Bittner: [00:07:51] Right.
Joe Carrigan: [00:07:54] That got banned quickly, right?
Dave Bittner: [00:07:56] Well, it got pulled. Yes, it got banned and pulled.
Joe Carrigan: [00:07:58] Pulled...
Dave Bittner: [00:07:59] The – I think the person who created it was perhaps naively unprepared for the avalanche of criticism that would come his way and thought...
Joe Carrigan: [00:08:07] Was he really naively unprepared for that? I don't think...
Dave Bittner: [00:08:10] I don't know.
Joe Carrigan: [00:08:10] ...He expected his app to be as popular as it was because he put a $50 price tag on it.
Dave Bittner: [00:08:14] Yeah.
Joe Carrigan: [00:08:15] And then people bought it up, and it got pulled from the markets, right? Then there was another one, a face-swapping app called Zao in China that got pulled because people were afraid of privacy violations. But deepfakes have been banned in lots of places. Reddit was one of the first places to ban them, and that's actually where the term deepfake comes from – a Reddit board. Twitter has banned them, and major porn sites have banned them, right? Now...
Dave Bittner: [00:08:36] Yeah, that's interesting.
Joe Carrigan: [00:08:37] If a major porn site bans something – right? I think I've said this before, but, you know, maybe that's something we should be looking a little bit harder at.
Dave Bittner: [00:08:48] Right.
Joe Carrigan: [00:08:48] We should be analyzing this more and doing a little more thinking about it.
Dave Bittner: [00:08:51] When even someone with the moral flexibility of a major online porn site thinks, you know, that's too much for us.
Joe Carrigan: [00:08:56] Right. Yeah, exactly.
Dave Bittner: [00:08:58] But you could see the legal hazard that they'd have there, too.
Joe Carrigan: [00:09:01] Right.
Dave Bittner: [00:09:02] That could be a big problem.
Joe Carrigan: [00:09:03] Oh, absolutely. That could be a huge problem. Most of the software that makes these deepfakes requires some kind of programming ability and a GPU. You need a good GPU or a system with a bunch of GPUs in it.
Dave Bittner: [00:09:14] Right, or a lot of patience...
Joe Carrigan: [00:09:16] Right, or a lot of patience.
Dave Bittner: [00:09:17] ...A lot of free time on your hands.
Joe Carrigan: [00:09:18] Yeah, and when I say a lot of patience – I mean, these GPUs really process this deep learning data a lot faster than you could ever hope to process it on a CPU. It does require a GPU, essentially. It becomes an intractable problem without a GPU.
Dave Bittner: [00:09:32] So some sort of investment there in hardware...
Joe Carrigan: [00:09:35] Yeah, but it's not a big investment. I mean, I think the most advanced GPU on the market right now for gaming is, like, seven or eight hundred bucks. It's not a lot of money.
Dave Bittner: [00:09:44] And it's only getting easier.
Joe Carrigan: [00:09:46] Yeah, exactly. That's the thing. The technology is getting easier to use as well. There are tutorials out there on how to do this step-by-step, and also, the software is starting to implement better GUIs...
Dave Bittner: [00:09:56] Right.
Joe Carrigan: [00:09:56] ...Graphical user interfaces that let people do this. Now, here's my concern with this, and this is something that Lisa Vaas has brought up as well. Think of the reputation damage that can be done to a young woman if a deepfake of her were to be released, even just a picture that starts circulating among her social group, and for no other reason than, say, revenge porn. This can be devastating.
Dave Bittner: [00:10:21] Well, I'm imagining someone out there looking for a job, and the employer does a Google search on that person's name. Up comes the deepfake.
Joe Carrigan: [00:10:31] Yeah, that could have real-world implications there.
Dave Bittner: [00:10:34] Right.
Joe Carrigan: [00:10:34] I mean, financial implications for somebody – it could be devastating. I think we need to do a lot more thinking about this problem. Google and Facebook are investing heavily in this. Google just created a dataset for use in machine learning to detect these deepfakes, and Facebook has dumped something like $10 million into it. I think these big tech companies are starting to take it seriously. I think we need to ask people in government to start taking it a little more seriously, and not for their own selfish reasons, right? Of course, people in government have a real reason because they're the ones that stand to lose some credibility here, but I really would like to see the focus shift to the general public, the general population of the world, really.
Dave Bittner: [00:11:14] No, I would imagine we'd see legislators jump on this when it became a real problem for them. If people started to flood online forums and so forth with deepfakes of the politicians themselves, that would get their attention and make them go, wait a minute. We need – perhaps some regulation is in order here.
Joe Carrigan: [00:11:32] Yeah, but I think you're right. I think that if this started impacting politicians more in the same way that it could impact the rest of us, then maybe, but I'm not saying that we should advocate for that. I don't want to see that either. I would like to see a little bit of forethought from people. This is something that should and kind of does have broad bipartisan support because it is a universal problem. I think it's something that everybody agrees on. I'd just like to see some action on it.
Dave Bittner: [00:11:54] Yeah, I guess the challenge is you don't want to overcorrect.
Joe Carrigan: [00:11:58] No, you don't.
Dave Bittner: [00:11:58] You don't want to stifle legitimate free speech by going too far with it. That's always the trick, right?
Joe Carrigan: [00:12:04] That is a delicate balance we try to achieve here in the United States.
Dave Bittner: [00:12:07] Yeah. All right, well, certainly one to watch, and like I said, we're going to have a gentleman from Deeptrace on the show here in a couple of weeks, so we'll look forward to that. Right now it is time to move on to our Catch of the Day.
[00:12:18] (SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:12:21] Our Catch of the Day was sent to us by a listener this week, and it goes like this. (Reading) Join the Illuminati. Greetings from the Illuminati world elite empire, bringing the poor, the needy and the talented to the limelight of fame, riches, powers and security. Get recognized in your business, political race. Rise to the top in whatever you do. Be protected spiritually and physically. All these you will achieve in a twinkle of an eye when you get initiated to the great Illuminati empire. Once you are initiated to the Illuminati empire, you will get numerous benefits and reward. Note that this email message was created solely for the purpose of our recruitment scheme, which will end next month, and this offer is for unique ones only. If you are not serious on joining the Illuminati empire, then you are advised not to contact us at all. This is because disloyalty is highly not tolerated here in our organization. Do you agree to be a member of the Illuminati new world order? If yes, then kindly reply back to us on our direct recruitment email. Please note, kindly make sure all your responses are sent directly to the email stated above for more instructions on our membership process. Note – some email providers incorrectly placed official Illuminati messages in their spam junk folder...
Joe Carrigan: [00:13:34] What?
Dave Bittner: [00:13:34] ...Or promotion folder. This can divert and exclude our responses to your emails. Thank you, The Illuminati.
Joe Carrigan: [00:13:41] Dave, I have always wanted to join the Illuminati.
Dave Bittner: [00:13:44] (Laughter) I'm surprised they haven't reached out to you before, Joe.
Joe Carrigan: [00:13:48] I know. You know, here – you know what the first red flag on this – is that they say that the Illuminati is bringing the poor, the needy and the talented to the limelight of fame, riches and power. That is not what the Illuminati does (laughter).
Dave Bittner: [00:14:01] Is that right, Joe?
Joe Carrigan: [00:14:02] Yes.
Dave Bittner: [00:14:03] How do you know that, Joe?
Joe Carrigan: [00:14:03] Well, I mean, as a guy who's always wanted to join – I don't know, Dave.
Dave Bittner: [00:14:08] I see. Right. You've done your homework.
Joe Carrigan: [00:14:09] I've done my homework.
Dave Bittner: [00:14:10] Yeah, all right – as an Illuminati hopeful.
Joe Carrigan: [00:14:13] As an Illuminati hopeful – right. Illuminati confirmed.
Dave Bittner: [00:14:17] All right. Well, thanks for sending that in. It's a fun one.
Joe Carrigan: [00:14:19] That is awesome.
Dave Bittner: [00:14:20] Coming up next, we've got a return visit from the gentleman who goes by the name Ray [REDACTED]. He's got some follow-up from his previous visit as well as some new information to share.
Dave Bittner: [00:14:29] But first, a message from our sponsors KnowBe4. Now let's return to our sponsor's question about the attacker's advantage. Why did the experts think this is so? It's not like a military operation where the defender is thought to have most of the advantages. In cyberspace, the attacker can just keep trying and probing at low risk and low cost, and the attacker only has to be successful once. And, as KnowBe4 points out, email filters designed to keep malicious spam out have a 10.5% failure rate. That sounds pretty good. Who wouldn't want to bat nearly 900? But this isn't baseball. If your technical defenses fail in one out of 10 tries, you're out of luck and out of business. The last line of defense is your human firewall. Test that firewall with KnowBe4's free phishing test, which you can order up at knowbe4.com/phishtest. That's knowbe4.com/phishtest.
Dave Bittner: [00:15:34] And we're back. Joe, I recently had a chance to speak once again with the gentleman who goes by the name Ray [REDACTED] online. He prefers to maintain a certain amount of anonymity on his Twitter account.
Joe Carrigan: [00:15:46] So his last name's not really [REDACTED]?
Dave Bittner: [00:15:47] His last name is not really [REDACTED], no. I imagine that would cause problems when he tried to apply for actual things. But no, he's a good guy, well-known online, well-respected. And we had him on a few weeks ago, and we got some follow-up, a couple of things that we wanted to address. So here is my conversation with Ray [REDACTED]. Ray, it's great to have you back, and after your last appearance, we had a couple of messages come in from listeners with some follow-up. So I just wanted to go through some things one at a time with you and maybe clarify some things, maybe a correction or two. Where do you want to begin?
Ray [REDACTED]: [00:16:22] Sure. So yeah, as you mentioned, after the previous podcast about SIM hijacking, we did get quite a bit of listener mail and some messages on Twitter. It was a couple different things, and I guess the easiest place to start with is just the blatant correction that I need to make with regards to – several listeners reached out and said I'd made the statement that SIM hijacking was not very common in Europe, and apparently, that was dead wrong. It is absolutely common.
Dave Bittner: [00:16:49] OK.
Ray [REDACTED]: [00:16:49] And it's also growing as well. So we heard from a listener named Liam (ph) that was in Ireland that said, you know, it's definitely a problem there. So I completely missed on that one and want to make sure that I retract – not redact, retract – that statement.
Dave Bittner: [00:17:02] (Laughter) OK, fair enough.
Ray [REDACTED]: [00:17:03] Can't re-retract it (ph).
Dave Bittner: [00:17:04] Right, fair enough. What else did we hear about?
Ray [REDACTED]: [00:17:06] So a couple people actually reached out and talked about, how do you handle this if someone cannot necessarily afford a computer or possibly even have access to other online resources, especially around the Google Authenticator kind of resets and things like that? And I just wanted to kind of point out that there's not a real easy answer for that. A lot of the modern banking things assume, rightly or wrongly, that people do have access online. But Google does have an option where you can print out actual emergency backup codes for resets – you know, as another possible option for resetting their services. And of course, the Google Voice service that we had mentioned as a potential way to mitigate some hijacking – that's a free service from them, as well.
Dave Bittner: [00:17:52] So we're talking about someone who may be in a situation where they don't have a mobile device; they don't have a computer; maybe they're using a system at a local library or some public-access computers – they're limited in their access.
Ray [REDACTED]: [00:18:06] Correct. Yes. And then the question was, you know, if we can't rely on text messaging on non-smartphones - right? - so just think in terms of, like, a flip phone or something like that - you know, what other ways could they implement multifactor? And the reason that this one caused me a lot of - that I struggled with it a little bit - is because we don't ever want to fall into the pitfall of telling people not to use anything at all, right? That's sort of a dangerous spiral when people say, well, SMS is so insecure, so just don't use multifactor at all. Well, that's certainly not the path we want to go on. It's always better to have multifactor of some kind, but you just need to be aware of the strengths and weaknesses of each type.
Dave Bittner: [00:18:43] So there are a couple of options out there beyond having your own device.
Ray [REDACTED]: [00:18:48] Sure. Absolutely.
Dave Bittner: [00:18:49] Yeah. What else did we hear about?
Ray [REDACTED]: [00:18:51] So one of your listeners actually reached out to me and brought my attention to the fact that PayPal now allows OTP – basically, the one-time password authenticators like Google Authenticator and Authy – which I'd kind of recommended be done. Unfortunately, I dug really deep into that PayPal implementation side, and there are some serious problems with the way that they've brought it about.
Dave Bittner: [00:19:19] Hmm. It's interesting because I saw a lot of positive feedback that PayPal had at long last enabled this.
Ray [REDACTED]: [00:19:26] Yeah. So as most people know kind of in the social engineering space, there's a type of service called knowledge-based authentication, which is kind of the crappiest authentication you could ever do. That's where they ask you what street you grew up on, what your mother's maiden name is, or maybe your birthday, right? Unfortunately, on the PayPal multifactor authentication choices, if you go to log in and you don't have your second factor, you can immediately say, I don't have this, or, I don't want to use that, and it defaults to asking you some very, very simple knowledge-based authentication questions. And there's no way to turn that off.
Ray [REDACTED]: [00:20:03] So not only can you - well, once you get it set up with your Authy or your Google Authenticator, not only can you bypass it by forcing it to SMS unless you remove your phone number, but even worse, anyone can actually bypass it if they know the most basic information about you. Like, we're talking about really accessible OSINT. So if you're doing that on PayPal, I would recommend lying on those questions (laughter) because...
Dave Bittner: [00:20:28] Oh, right.
Ray [REDACTED]: [00:20:31] And think of it just as another password because if you give them the correct answers, then that's going to leave a pretty big, gaping hole that could potentially be abused. And most people do have PayPal linked to a bank account, so it's not like it's a small vulnerability that's there.
Dave Bittner: [00:20:45] Yeah. That's surprising to me. In enabling this stronger factor, they still sort of fall back to the weakest.
Ray [REDACTED]: [00:20:52] Yes. And I know that a lot of people on Twitter have talked about this. I've seen Lesley Carhart, who goes by @hacks4pancakes, talk about it. I think even Krebs had brought attention to it, as well. So it's not like this is a secret. We're not, like, divulging anything that's not publicly out there. But I've never seen a response from PayPal. And you would think that of all of the companies in the world that would want to have airtight multifactor, they have the biggest interest in fixing this and getting away from knowledge-based authentication bypass.
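For context on what those authenticator apps Ray mentions are doing under the hood: Google Authenticator and Authy implement the TOTP standard (RFC 6238), where a shared secret plus the current time yields a short-lived six-digit code. Below is a minimal illustrative sketch, purely editorial and unrelated to PayPal's own implementation; the secret shown is a made-up example.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238), the scheme authenticator apps use."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval              # 30-second time step since the epoch
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()  # HMAC-SHA1 per the RFC
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret; a real one comes from the service's QR code
```

The code changes every 30 seconds and is generated locally, never traveling over SMS, which is why this kind of second factor resists SIM hijacking.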
Dave Bittner: [00:21:20] Yeah. One other topic I want to hit with you - and that is this notion of people becoming resigned about their private information being out there. You listened to a recent interview that Carole Theriault did on our show, and the person she was talking to mentioned this. You had some thoughts.
Ray [REDACTED]: [00:21:38] Yes. So on the episode, Carole was saying that when she kind of evangelizes to her friends about protecting their data – credit card numbers and things like that, credit information – she commonly hears people say, oh, Carole, it's way too late for that; my data's already out there; there's nothing we can do about, you know, protecting that, et cetera. And that's actually something that we talk about a lot in cybersecurity education courses. I actually call that the fallacy of futility. And what it is, is the idea that we take the statement that online privacy doesn't exist anymore – right? – we say, well, there's no such thing as online privacy – and treat it as absolute. The problem is that's not a binary statement, right? It's not that privacy either exists or it doesn't. There are varying degrees of privacy.
Ray [REDACTED]: [00:22:26] So for example, I'm resigned to the fact that because I was involved in the OPM breach, there are Chinese hackers that have access to my information, period, right? But that doesn't necessarily mean that I want the 13-year-old script kiddies that are poring through IRC to have access to it, right? It's very important to keep in mind that just because your data has been breached before – and if we look at things like Troy Hunt's Have I Been Pwned, et cetera, almost everybody listening to this podcast has been involved in at least one breach – that doesn't mean that you'd necessarily want to be involved in others, right? So – and ultimately, some of that data may be different, like, especially if you're using unique email addresses, but it is in everyone's best interest to try to protect themselves, you know, through OPSEC and practicing good security hygiene.
Dave Bittner: [00:23:11] Where do you think this false belief comes from? Why do people head down this path?
Ray [REDACTED]: [00:23:16] Well, I think it really is driven by the fact that, just like in cybersecurity, we have something called alert fatigue, we have something called outrage fatigue, and we have something called breach fatigue - right? - which is when you see a big announcement about DoorDash and, you know, millions and millions of people's information being leaked or even Words With Friends - right? - we're so numb to these massive breaches that it feels like they're almost inevitable, right? And to a certain degree, when humans feel like something is basically inevitable, there is a tendency to just assume that it's going to happen at all times and that there's nothing that can be done to mitigate the impact of it.
Dave Bittner: [00:23:54] That's interesting. It makes me think about – you know, I knew people who got their car stereo stolen so many times that they just started leaving the doors unlocked, so at least that way, the glass wouldn't get broken anymore.
Ray [REDACTED]: [00:24:06] Sure, but I would actually argue that there are better things you could do to prevent your car from being stolen than that. It is an interesting analogy, though, because we're not necessarily talking about your car getting hurt when they take that data. But reusing passwords, which is by far the most common OPSEC mistake that is being made – the reuse of even strong passwords, right? – that is a glaring example: the people that are doing that are going to be victimized by credential stuffing, and the people that have unique passwords or password managers are not.
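As an aside on checking your own exposure: Troy Hunt's Pwned Passwords service (part of the Have I Been Pwned ecosystem mentioned above) exposes a k-anonymity range API, so you can test whether a password already appears in breach corpora without sending the password or even its full hash. A rough sketch:

```python
import hashlib
import urllib.request

def pwned_count(password: str) -> int:
    """How many times a password appears in the Pwned Passwords breach corpus.

    Only the first 5 hex characters of the SHA-1 hash are sent (k-anonymity);
    the password and its full hash never leave this machine.
    """
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        for line in resp.read().decode("utf-8").splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

print(pwned_count("password123"))  # a heavily reused password returns a very large count
```

A count of zero is the goal; anything else is exactly the kind of password that credential stuffing feeds on.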
Dave Bittner: [00:24:37] Joe, what do you think?
Joe Carrigan: [00:24:38] I appreciate Ray coming back on to answer listener questions.
Dave Bittner: [00:24:41] Yeah.
Joe Carrigan: [00:24:41] Thank you, Ray.
Dave Bittner: [00:24:42] Yeah.
Joe Carrigan: [00:24:42] Unfortunately, there is some cost to multi-factor at some level. Yeah, a cell phone is kind of expensive, and maybe you have a limited number of texts that you can receive because you're on a prepaid plan, but you can get inexpensive cell phones and then use Google Authenticator on them. That's free, and getting the Google Authenticator code is free. There are low-cost options for this – not free options, though.
Dave Bittner: [00:25:05] Yeah. I think it's a good point, though, that you shouldn't have to buy your way into this type of security, and it's good that there are printed-out paper options.
Joe Carrigan: [00:25:14] That is available for, like, account recovery but not necessarily for two-factor authentication, I think...
Dave Bittner: [00:25:18] Yeah.
Joe Carrigan: [00:25:19] ...Which is unfortunate. You know, we should be able to have this for free, but this software still needs hardware to run on. And the software is free. It's just that the hardware that it runs on isn't. Knowledge-based authentication is bad. It is a form of multi-factor authentication. It's probably better than nothing, but it's really so much less secure than even SMS.
Dave Bittner: [00:25:38] Yeah.
Joe Carrigan: [00:25:38] And the fact that PayPal just defaults to KBA is really bad, and that you can't turn that off – that's terrible. And I do recommend that you do exactly what Ray says here, and that is lie on those questions. You know, what's the street you grew up on? Copper Cup. I didn't grow up on Copper Cup Way.
Dave Bittner: [00:25:54] And then put that in your password manager.
Joe Carrigan: [00:25:56] And then put that in your password manager in the notes field. Exactly.
Dave Bittner: [00:25:59] So you don't have to remember it.
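A minimal sketch of the "lie and record it" approach Joe describes: generate a random, unguessable answer for each knowledge-based question and stash it in the password manager's notes field. The questions listed here are just examples.

```python
import secrets
import string

def fake_kba_answer(length: int = 20) -> str:
    """Random, unguessable 'answer' for a knowledge-based authentication question."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Treat each answer like another password: store it in the manager's notes field.
for question in ["street you grew up on", "mother's maiden name", "first car"]:
    print(f"{question}: {fake_kba_answer()}")
```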
Joe Carrigan: [00:26:01] I really, really, really appreciate Ray's stance here on what he calls the fallacy of futility. The data about us that's out there is a lot like the overall security picture. I say that security is a spectrum, and you want to be on the more secure end of that spectrum. So, like, when Ray talks about using SMS as opposed to using nothing – yes, that's not the most secure solution you can use, but it is way better than not having any multi-factor authentication. That moves you in the more secure direction on the spectrum, and the same is true for your data. You want to move yourself to the more secure side of the spectrum, and you do that by protecting your information. And don't let that information fatigue lull you into some form of learned helplessness. Vigilance is key, and yeah, I know it's exhausting. Yeah, but you have to keep it up.
Dave Bittner: [00:26:48] Right.
Joe Carrigan: [00:26:48] Every time there's a breach, yes, you may lose another piece of your data, but don't lose hope is what I'm saying.
Dave Bittner: [00:26:55] I find myself falling back on the analogy of public health because I think it's useful. Just because I get a cold doesn't mean I'm going to stop washing my hands.
Joe Carrigan: [00:27:04] Yeah.
Dave Bittner: [00:27:04] You know, oh, I got a cold. I guess all that handwashing was a waste of time.
Joe Carrigan: [00:27:07] Right. No, it wasn't.
Dave Bittner: [00:27:09] Yeah, you got to keep up, but it – yes, it's a little extra effort, but, you know...
Joe Carrigan: [00:27:12] I think that's an excellent analogy actually, Dave.
Dave Bittner: [00:27:15] Yeah, well, thank you very much.
Joe Carrigan: [00:27:16] Yeah.
Dave Bittner: [00:27:17] All right, well, that is our show. We want to thank Ray [REDACTED] for joining us once again. You can find him on Twitter @RayRedacted. We want to thank all of you for listening.
Dave Bittner: [00:27:25] And of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training.
Dave Bittner: [00:27:41] We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:28:02] And I'm Joe Carrigan.
Dave Bittner: [00:28:03] Thanks for listening.