Charles Arthur: [0:00:00] The hacks of the future are actually hacks that exist in the present day, but we just haven't paid enough attention to.
Dave Bittner: [0:00:07] Hello, everyone. And welcome to The CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world.
Dave Bittner: [0:00:20] I'm Dave Bittner from The CyberWire. And joining me, as always, is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hi, Joe.
Joe Carrigan: [0:00:27] Hi, Dave.
Dave Bittner: [0:00:28] As always, we've got some interesting stories to share. And later in the show, we welcome Charles Arthur, who's the author of a book titled "Cyber Wars." But before we get to all that, we've got a quick word from our sponsors, our friends at KnowBe4.
Dave Bittner: [0:00:46] Have you ever been to security training? We have. What's it been like for you? If you're like us, ladies and gentlemen, it's the annual compliance drill - a few hours of PowerPoint in the staff break room. Refreshments in the form of sugary doughnuts and tepid coffee are sometimes provided. But a little bit of your soul seems to die every time the trainer says, next slide. Well, OK, we exaggerate, but you know what we mean. Stay with us, and in a few minutes, we'll hear from our sponsors at KnowBe4, who have a different way of training.
Dave Bittner: [0:01:21] And we are back. Joe, there was a story making the rounds this week. It was pretty much covered everywhere. Had to do...
Joe Carrigan: [0:01:28] Yup. I saw this one.
Dave Bittner: [0:01:29] Had to do with the president of the United States and a comedian named John Melendez - right? - more popularly known as Stuttering John.
Joe Carrigan: [0:01:37] Stuttering John from "The Howard Stern Show."
Dave Bittner: [0:01:39] Right. So Stuttering John, John Melendez, had a previous relationship with Donald Trump. Trump was a regular guest on "The Howard Stern Show."
Joe Carrigan: [0:01:49] Right. Melendez said that he had spoken with Trump on numerous occasions over the phone before he was president.
Dave Bittner: [0:01:54] Right. They were familiar with each other. I don't know if they were buddies, but certainly acquaintances. So the story goes that Melendez was going to record his podcast, had no guests, and thought, well, let's try to call the president - called the White House switchboard as himself, and the White House said, I'm sorry, the president is not available; he's too busy. We're all making sense so far, right?
Joe Carrigan: [0:02:15] Right. Yup.
Dave Bittner: [0:02:16] So he called...
Joe Carrigan: [0:02:16] That's what happens every time I call the White House switchboard.
Dave Bittner: [0:02:18] Yeah, well, absolutely.
Joe Carrigan: [0:02:19] (Laughter).
Dave Bittner: [0:02:19] So Melendez calls back, by his own description, using a ridiculous British accent.
Joe Carrigan: [0:02:25] Right.
Dave Bittner: [0:02:26] And as you know from last week's show, I'm a big fan of using ridiculous accents...
Joe Carrigan: [0:02:30] (Laughter) That's right.
Dave Bittner: [0:02:30] ...To - for social engineering scams - so calls using a ridiculous British accent, pretending to be a staffer for Senator Menendez. Senator Menendez is from New Jersey.
Joe Carrigan: [0:02:43] Right.
Dave Bittner: [0:02:43] So we have Melendez and Menendez. He calls, claiming to be the scheduler for Senator Menendez. And the White House operator asks why they were calling from a number that was not on file. And Melendez said it was because the senator was on holiday, right?
Joe Carrigan: [0:02:59] The first thing that happens is, hey, how come you're not calling from a proper number?
Dave Bittner: [0:03:02] Right.
Joe Carrigan: [0:03:03] And John comes up with an immediate explanation for that.
Dave Bittner: [0:03:07] Right. Plausible...
Joe Carrigan: [0:03:08] Plausible.
Dave Bittner: [0:03:09] ...That he's away. He's on vacation, and he's calling.
Joe Carrigan: [0:03:12] Right.
Dave Bittner: [0:03:12] So according to the reports, the White House checked with the senator's office, and the office confirmed that the senator had not actually made the call. But presidential son-in-law Jared Kushner put the call through from Air Force One anyway. And Melendez had a conversation with the president. They talked about immigration and Supreme Court justices.
Joe Carrigan: [0:03:33] Right.
Dave Bittner: [0:03:33] A pleasant call. Melendez - at no point in the call did he actually say he wasn't Menendez. The president took the call assuming that's who he was talking to.
Joe Carrigan: [0:03:42] But at no point in the time he - did he say that he was Melendez...
Dave Bittner: [0:03:44] Correct.
Joe Carrigan: [0:03:44] ...Right?
Dave Bittner: [0:03:45] Right. He did not say, ha-ha, got you.
Joe Carrigan: [0:03:48] Right.
Dave Bittner: [0:03:48] He never revealed who he actually was.
Joe Carrigan: [0:03:50] And he didn't say Baba Booey a bunch of times either.
Dave Bittner: [0:03:51] He did not say - as far as I know, he did not say that. But - so the recording of the whole call is readily available. It's everywhere online. It's also on John Melendez's podcast. So check it out if you want to hear it there. He did say after the fact that he received a visit, and I'm presuming a stern talking-to, from the Secret Service for doing this. I don't know that, technically, there's anything illegal in what he did. There were no threats.
Joe Carrigan: [0:04:17] Yeah. No threats, but he did misrepresent himself.
Dave Bittner: [0:04:20] Yeah. So I guess you could say, maybe, fraud.
Joe Carrigan: [0:04:22] Maybe, yeah.
Dave Bittner: [0:04:24] You know, I could see how the Secret Service wouldn't be happy about this. On the other hand, you could say that he exposed some weaknesses in the communications chain from the White House to the president, so maybe he did them a little favor.
Joe Carrigan: [0:04:38] It sounds like the switchboard did everything right, and then Kushner did an end run around the policies...
Dave Bittner: [0:04:44] Right.
Joe Carrigan: [0:04:44] ...And the procedures and short-circuited them and put the call through anyway.
Dave Bittner: [0:04:48] Sure. Yeah. And there's been lots of criticism of this president with his use of unsecured cellphones...
Joe Carrigan: [0:04:55] Yup. There has been.
Dave Bittner: [0:04:55] ...And devices and so forth. So yeah. It's an interesting tale. I think, particularly, the social engineering angle of this is just that calling using a British accent, which I think, as Americans, we tend to find, at the very least, charming but often authoritative, right?
Joe Carrigan: [0:05:12] Yeah. But, I mean, as Americans, shouldn't we be more suspicious of British accents?
Dave Bittner: [0:05:16] You'd think.
Joe Carrigan: [0:05:17] I mean, it was just Independence Day, right?
Dave Bittner: [0:05:19] You'd think. And yet, we are not.
Joe Carrigan: [0:05:20] Right. Yeah. That's right.
Dave Bittner: [0:05:21] But then also, the thing about being on holiday - you know, a perfectly plausible...
Joe Carrigan: [0:05:26] Yeah. Perfectly reasonable explanation.
Dave Bittner: [0:05:28] Yeah. So interesting story - I suspect there's someone at the White House, and coordinating with the Secret Service, to do a better job of making sure that these sorts of things don't happen again.
Joe Carrigan: [0:05:39] Let's hope so. But congratulations to John Melendez for an excellent penetration test of the White House communication system.
Dave Bittner: [0:05:46] There you go. What story do you have this week, Joe?
Joe Carrigan: [0:05:48] So my story is not nearly as lighthearted, unfortunately.
Dave Bittner: [0:05:52] Again?
Joe Carrigan: [0:05:52] Just like last week.
Dave Bittner: [0:05:56] OK.
Joe Carrigan: [0:05:56] So this story comes from Naked Security and the folks over at Sophos. The article starts with a story about a woman in the U.K. who's in her 70s. She's got a terminally ill husband. And she gets an email promising her 500,000 pounds, right? And over time - it's obviously a scam email, right? So over time, the scammers bleed her for over 100,000 pounds, completely draining her life savings...
Dave Bittner: [0:06:19] Oh, no.
Joe Carrigan: [0:06:20] ...Because she believes she's going to get this half a million pounds. And it doesn't really come to light until she goes to remortgage her house, which I assume was paid off. And now she's going to go get another mortgage because - I guess it's the sunk-cost fallacy. You know, I put a lot of money into this; I'm still going to get 500,000 pounds out of it, right? The power...
Dave Bittner: [0:06:37] They're stringing her along, presumably.
Joe Carrigan: [0:06:39] Exactly. And then a solicitor, which is what they call a lawyer across the pond...
Dave Bittner: [0:06:44] It's like they've got a different word for everything.
Joe Carrigan: [0:06:45] It is - senses that something's up. And good for the solicitor, he puts a stop to it and lets her know that she's being scammed. And the article brings up a few questions, like, why didn't family or friends notice that she was under duress like this? Why didn't they know that she was sending money overseas, losing money? It just didn't come up. Why did it take a solicitor to find out that something this blatant was going on? And then the article asks, why did it only come out after the damage was done? I actually don't think all the damage was done. I just think some of the damage was done. I mean, it could've been worse. She could've mortgaged her house and sent them that money, too.
Dave Bittner: [0:07:19] Right.
Joe Carrigan: [0:07:20] So it was stopped before it got worse, but not before a lot of money was sent across.
Dave Bittner: [0:07:25] Yeah.
Joe Carrigan: [0:07:26] So there's a company called Reassura in the U.K. who provide scam-avoidance services. And they commissioned a professor named Mark Button at the University of Portsmouth to write up this article. And he said that 22 percent of those who are over the age of 65 are just unwilling to talk about their finances. It's just...
Dave Bittner: [0:07:43] At all?
Joe Carrigan: [0:07:43] At all.
Dave Bittner: [0:07:44] Whether there's a scam or not?
Joe Carrigan: [0:07:45] Whether they - even when things are good.
Dave Bittner: [0:07:47] OK.
Joe Carrigan: [0:07:48] But once they've been scammed, that number jumps up to 36 percent because they're embarrassed to talk about it, and it becomes like this blemish on their psyche. And it's almost like, how could I have been so stupid to have fallen for this?
Dave Bittner: [0:08:02] Yeah. They're embarrassed.
Joe Carrigan: [0:08:03] Yeah. They're embarrassed. Exactly. And I completely understand this, as well.
Dave Bittner: [0:08:07] But I think it's interesting that the scammers, no doubt, know this. That's part of why they're going after folks like this. They're less...
Joe Carrigan: [0:08:15] I think that's 100 percent correct.
Dave Bittner: [0:08:17] …Less likely to be reported.
Joe Carrigan: [0:08:19] Yup.
Dave Bittner: [0:08:19] And they can string them along and, like in this case, slowly bleed them of the money. So you have this sort of tailspin kind of thing.
Joe Carrigan: [0:08:27] For huge amounts of cash. This unwillingness to talk about being scammed is most likely leading to an unwillingness to report it, too, so these kind of crimes are probably going unreported.
Dave Bittner: [0:08:37] Because the people are embarrassed.
Joe Carrigan: [0:08:39] Because people are embarrassed to report it.
Dave Bittner: [0:08:40] Don't want - they feel - after the fact - we hear this all the time - they feel stupid.
Joe Carrigan: [0:08:44] Right. How could I have been so stupid?
Dave Bittner: [0:08:46] Right. How could I have fallen for this?
Joe Carrigan: [0:08:48] Right. And, you know, people shouldn't feel this way. There is something out there that will get all of us.
Dave Bittner: [0:08:53] Yeah. You can understand the feeling, though. I certainly can.
Joe Carrigan: [0:08:57] Sure.
Dave Bittner: [0:08:58] Not something you're going to go out and brag about - that time when you got duped.
Joe Carrigan: [0:09:02] (Laughter) Right.
Dave Bittner: [0:09:02] You know, but the point is that those of us who are helping look out for these sorts of folks - you know, I have elderly parents.
Joe Carrigan: [0:09:10] Right.
Dave Bittner: [0:09:10] You have elderly parents in your family.
Joe Carrigan: [0:09:13] Yes, we do.
Dave Bittner: [0:09:13] It's important to start these dialogues.
Joe Carrigan: [0:09:15] Yeah. We actually had a conversation with my in-laws. And I asked if they've ever seen anything like that. And they had. And they almost got scammed. If it hadn't been for someone that they spoke to - they were in the process of sending money to somebody. And this person, thankfully, said, stop sending that; don't do that. But it hasn't happened to my parents yet, but that's just because nobody's come in with the right trigger for them.
Dave Bittner: [0:09:36] Right. It's a concern. It's something I think about often. And fortunately, I'd say probably every other week or so, I get an email from my father. It says, you know, Dave, what do you think about this? Or, should I reply to this?
Joe Carrigan: [0:09:48] Right.
Dave Bittner: [0:09:48] So he's aware that they're out there. And it's good that he's checking with me first. And most of the time, the response is, no. You know, no. (Laughter) Do not - please do not reply.
Joe Carrigan: [0:09:58] Bad idea.
Dave Bittner: [0:09:59] Yeah. Yeah. But having that - those lines of communication open - boy, that's really important.
Joe Carrigan: [0:10:04] Yup.
Dave Bittner: [0:10:05] All right, Joe. It's time to move on to our Catch of the Day.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [0:10:12] So this week, our Catch of the Day comes from the Yorkshire and Humber cyber unit. I wonder if I'm pronouncing that correctly because it is British, and, as we said earlier in the show, they have a different way of saying everything.
Joe Carrigan: [0:10:23] (Laughter).
Dave Bittner: [0:10:23] So this is from Twitter. And they said they received a fairly convincing phishing email purporting to be from Twitter this morning, and said, if in doubt, go to the Twitter website, check your account and change the password. Don't click on the links in the email. Let's describe what is going on with this.
Joe Carrigan: [0:10:41] OK.
Dave Bittner: [0:10:41] It says - first of all, there's the Twitter logo at the top of the image.
Joe Carrigan: [0:10:46] Yup.
Dave Bittner: [0:10:46] It says, looks like there was a login attempt from a new device or location. All right. That's pretty routine. But here's the - here's where it gets good. It says, if this wasn't you, secure your account by resetting your password now.
Joe Carrigan: [0:10:59] Aha.
Dave Bittner: [0:11:00] And below that, there's a button that says, reset password.
Joe Carrigan: [0:11:03] How convenient.
Dave Bittner: [0:11:05] Now, the thing is, odds are, this wasn't you.
Joe Carrigan: [0:11:09] Right. Right (laughter).
Dave Bittner: [0:11:10] Because it's a scam, the overwhelming odds are this wasn't you.
Joe Carrigan: [0:11:14] Right.
Dave Bittner: [0:11:15] So you get this. You say, wait; that wasn't me.
Joe Carrigan: [0:11:18] I better change my password right away.
Dave Bittner: [0:11:19] I better secure my account.
Joe Carrigan: [0:11:21] Right.
Dave Bittner: [0:11:21] So you click the reset password button, which I'm sure takes you to a site that looks like a Twitter login.
Joe Carrigan: [0:11:27] It says, enter your current password...
Dave Bittner: [0:11:30] Right.
Joe Carrigan: [0:11:30] ...First, right?
Dave Bittner: [0:11:31] Yup. Yup. And Bob's your uncle, they got you.
Joe Carrigan: [0:11:34] They got you. Exactly.
Dave Bittner: [0:11:36] Now, it's interesting, too. It says, if this was you, please confirm your identity by using this temporary code on Twitter or wherever you might enter your Twitter password.
Joe Carrigan: [0:11:46] That's a ruse.
Dave Bittner: [0:11:46] That's just noise, right?
Joe Carrigan: [0:11:47] Right.
Dave Bittner: [0:11:48] There's nothing really going on here.
Joe Carrigan: [0:11:48] That's to make it look - actually, it's not just noise. It's something to make it look more realistic, more convincing.
Dave Bittner: [0:11:55] But then, at the bottom, here's the cherry on top.
Joe Carrigan: [0:11:58] Yes.
Dave Bittner: [0:11:59] It says, how do I know an email is from Twitter? Links in these emails will start with HTTPS and contain - contain - twitter.com. Your browser will also display a padlock icon to let you know a site is secure.
Joe Carrigan: [0:12:15] Mmm hmm.
Dave Bittner: [0:12:16] So as we've talked about, HTTPS doesn't mean...
Joe Carrigan: [0:12:22] Secure.
Dave Bittner: [0:12:23] Right. It means...
Joe Carrigan: [0:12:24] Encrypted.
Dave Bittner: [0:12:25] Encrypted.
Joe Carrigan: [0:12:25] Right.
Dave Bittner: [0:12:26] But it doesn't mean legit (laughter).
Joe Carrigan: [0:12:26] It means - exactly. It means the communication channel is secure...
Dave Bittner: [0:12:31] Right.
Joe Carrigan: [0:12:31] ...So that an eavesdropper probably can't tell what's going on in the communication. And when I say probably, I mean, can't tell.
Dave Bittner: [0:12:38] But they're taking advantage of that misunderstanding...
Joe Carrigan: [0:12:42] Correct.
Dave Bittner: [0:12:43] ...That the padlock icon means everything's legit.
Joe Carrigan: [0:12:46] That's right. And also, it will contain the string twitter.com.
Dave Bittner: [0:12:51] Right.
Joe Carrigan: [0:12:52] I saw something incredibly smart the other day. Somebody registered a domain that started with com, right?
Dave Bittner: [0:13:00] Oh.
Joe Carrigan: [0:13:00] So I could say com-joe.com. I could register that string...
Dave Bittner: [0:13:07] OK.
Joe Carrigan: [0:13:07] ...That domain - com-joe or com-whatever. And then I create a subdomain called Twitter - right? - on my own server that is com-joe. So now if you look at the domain, it'll say twitter.com-joe.com.
Dave Bittner: [0:13:23] Right.
Joe Carrigan: [0:13:24] That would look to the casual observer like twitter.com.
Dave Bittner: [0:13:28] Right. It contains what they said it's going to contain - twitter.com.
Joe Carrigan: [0:13:31] It does. And it looks - it could even go https://twitter.com, and if you don't look beyond to -joe.com, then you might think, there's twitter.com. That's where I'm going.
Dave Bittner: [0:13:43] So several ways to trick you into thinking that it's secure when it's actually not.
Joe Carrigan: [0:13:47] Absolutely.
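The lookalike-domain trick Joe describes can be demonstrated in a few lines. This is a sketch using Joe's own hypothetical `com-joe.com` domain: a naive "does the link contain twitter.com?" check passes, while a proper comparison of the URL's actual host fails it.

```python
from urllib.parse import urlparse

def is_twitter_host(url: str) -> bool:
    """True only if the URL's host is twitter.com itself or a subdomain of it."""
    host = urlparse(url).hostname or ""
    return host == "twitter.com" or host.endswith(".twitter.com")

# The phishing email's advice - "links will contain twitter.com" - is satisfied
# by a substring check, even though the real registered domain is com-joe.com:
print("twitter.com" in "https://twitter.com-joe.com/reset")   # True
print(is_twitter_host("https://twitter.com-joe.com/reset"))   # False
print(is_twitter_host("https://twitter.com/settings"))        # True
```

The point of the sketch: "contains twitter.com" is a string test, but what matters is the host that the browser actually connects to, read right-to-left from the end of the hostname.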
Dave Bittner: [0:13:49] Yeah. All right. Well, it's something to look out for - classic ruse here trying to gather up your password for Twitter. And then once they've got it, they own you.
Joe Carrigan: [0:13:57] Right.
Dave Bittner: [0:13:58] All right. So that is our Catch of the Day. Coming up next, we've got my interview with Charles Arthur. He is the author of "Cyber Wars: Hacks that Shocked the Business World." But first, a message from our sponsors at KnowBe4.
Dave Bittner: [0:14:17] And now back to that question we asked earlier about training. Our sponsors at KnowBe4 want to spring you from that break room with new-school security awareness training. They've got the world's largest security awareness training library, and its content is always fresh. KnowBe4 delivers interactive, engaging training on demand. It's done through the browser and supplemented with frequent simulated social engineering attacks by email, phone and text. Pick your categories to suit your business. Operate internationally? KnowBe4 delivers convincing, real-world, proven templates in 24 languages. And wherever you are, be sure to stay on top of the latest news and information to protect your organization with KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable, too. Sign up for Cyberheist News at knowbe4.com/news. That's knowbe4.com/news.
Dave Bittner: [0:15:18] And we are back. Joe, I recently had the opportunity to speak with Charles Arthur. He is the writer of the book "Cyber Wars: Hacks that Shocked the Business World." Interesting conversation. Here's my talk with Charles Arthur.
Charles Arthur: [0:15:33] I'd been looking around for a book idea, and I had a few ideas in my mind about what I wanted to do. My publishers contacted me and said, would you like to write a book about hacking? And they had a suggestion about looking at big hacks. So I had to think about it and came up with a list of seven or eight things that I thought were important hacks, that were illustrative of the sorts of problems that people and organizations have and which would catch people's attention - or which had already caught people's attention - which I thought would be useful to know about.
Dave Bittner: [0:16:05] Yeah, and the book does go into depth into certainly some of the well-known, high-profile hacks - the Sony hack, the TalkTalk hack, Mirai, ransomware, things like that. What I want to focus on today with you are some of the hacks that involved social engineering. You have a chapter in your book about John Podesta and how his Gmail account was hacked. Can you tell us that story? What happened there?
Charles Arthur: [0:16:28] Sure. So John Podesta was the chairman for Hillary Clinton's campaign. And the campaign had been aware that they might be the target of hackers. They weren't sure what sort of origin those might be, but a lot of people obviously would want to hack into a political campaign. The Hillary Clinton campaign had actually protected their emails - their email accounts - with two-factor authentication, which means that not only do you have to have the username and the password, but you also have to - when you try to log in, you have to type in a code, a six-digit code which is generated automatically by a program or sent to you by text message. John Podesta's personal inbox, though, did not have two-factor authentication turned on.
Charles Arthur: [0:17:07] And so one morning, while he was actually over in California, there were a number of people from the campaign who monitored his inbox because that was, in effect, one of the various different nexuses for information flowing through the campaign. They got an email which appeared to be from Google which said, someone has your password; they've been trying to log in from - they named an Eastern European country - you should go and protect your account now. There was a bit of mild panic within the campaign headquarters. It was early on a Saturday morning. And it ended up with someone going to this phishing page, as it turned out to be, and entering in his credentials, which meant that all of his emails from the past and into the future were now open to the hackers and meant that they could basically strip mine his entire email, and they did this. They kept on monitoring it. It meant, also, that they had access to other elements that he had connected to that email - so his iCloud account had a recovery email there, and they could also get into that. And so they were able, at their leisure, to pass on these emails to people who might be hostile to Hillary Clinton, which includes Julian Assange of WikiLeaks.
Charles Arthur: [0:18:23] And that meant that when, in early October, there was a tumultuous day, really, in terms of the 2016 presidential election - when first, the FBI said that there were efforts by the Russians to hack the election, and then The Washington Post broke a story about Donald Trump and an "Access Hollywood" tape about basically assaulting women - later in the afternoon of that day, Julian Assange released the first tranche of these John Podesta emails. So for a media trying to decide what story to fixate on, there was more than enough to think about. But the John Podesta emails were trickled out over time and had a sort of drip-drip effect on people's thoughts about the Hillary Clinton campaign.
Dave Bittner: [0:19:07] And so just to be clear, I mean, the message that they got - it was not an actual email from Google.
Charles Arthur: [0:19:12] It wasn't actually from Google. It was rather cleverly disguised. It said, someone has your password. And I suspect that Google has methods in place to trap emails which say that, but this email had a Unicode character so that the O in password looks like an O if you - especially if you view it on an iPhone, but if you view it in different fonts, then you can see that it's actually not a normal English O. It's a Unicode character which looks like that. But I think it was chosen after some experimentation by the hackers to get through filters such as Google had set up.
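The homoglyph trick Charles describes - a non-Latin character standing in for an ordinary letter - is easy to detect programmatically. The transcript doesn't specify which Unicode character the attackers used, so the Cyrillic small "о" here is an illustrative assumption:

```python
import unicodedata

def suspicious_chars(text: str):
    """Flag characters that may render like ASCII letters but aren't."""
    return [(ch, unicodedata.name(ch, "UNKNOWN"))
            for ch in text if not ch.isascii()]

# Hypothetical homoglyph: Cyrillic 'о' (U+043E) in place of Latin 'o'
bait = "passw\u043erd"                  # displays as "password" in many fonts
print(suspicious_chars(bait))           # [('о', 'CYRILLIC SMALL LETTER O')]
print(suspicious_chars("password"))     # [] - the genuine word is clean
```

A filter doing a plain string match on "password" would miss the bait entirely, which is exactly the evasion Charles suggests the hackers were experimenting with.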
Dave Bittner: [0:19:45] Was the success of this phishing email dependent on the fact that Podesta did not have two-factor enabled?
Charles Arthur: [0:19:52] Like 90 percent of people, he did not have two-factor enabled. If he had had it enabled at that time, it would've been much more difficult - not impossible, but much more difficult - for the hackers to have captured and broken into his email. There are ways that you can get around people having two-factor authentication. You can set up a page which asks for the code. But you can't automate it, and the thing that people who are doing phishing like to do is they like to automate the process so that they can capture lots and lots of log-in details. Capturing two-factor authentication is much harder because you have to capture a code as it's generated. You then have to log in to the real page with that code, and then you have to go in and generate your own codes to be able to access the account afterwards. And Google will always warn you about extra logins that it gets and where it thinks that they look a bit peculiar. Google has many systems running in the back end that you don't see.
Dave Bittner: [0:20:48] Another one of the chapters in your book covers the story of HBGary, which is a security company that had a run-in with Anonymous, and there were some social engineering factors in that story, as well.
Charles Arthur: [0:20:59] That's true, yes. So HBGary had a guy who reckoned that he had figured out the names of the people at the top of Anonymous, which is a slightly strange idea because Anonymous, especially at this time in 2011, was very much a free-form collective which didn't have anyone leading it. It was more a sort of a large crowd which collected on the - effectively, a street corner of the internet and tried to decide what direction to run in. So the idea that he had found the identities of various people seemed a bit ridiculous. And also, to Anonymous, it was a bit insulting but also a bit threatening because many of them were, you know, hackers in one way or another, and they didn't like the idea that their personal details might possibly, if he was correct, be out there.
Charles Arthur: [0:21:43] So Aaron Barr - that's the guy's name - became the target of a very concerted attack by some people from Anonymous to break into the HBGary account, his personal accounts and so on, and they managed to do part of that. I mean, the initial break-in to the HBGary systems was using a method called SQL injection, which relies on weaknesses in databases to allow access to a server. But when they were trying to break into the systems of the parent company, what they did was get access to the email systems by calling the support line for the email and pretending to be someone who had forgotten their password and asking for a password reset so that they could log in. And because they had looked at some of the previous emails from the subsidiary company, they were able to give a tolerably (ph) good impersonation of the person who would've actually been looking to get the logins.
Charles Arthur: [0:22:41] You have to say that a lot of the failing there was on the part of the support line. They should've been a bit more suspicious. But, of course, the thing about support lines is they're there to give support, and if you don't get support, then you tend to get a bit antsy, so there's always a difficult line to tread for them.
Dave Bittner: [0:22:57] Now, in the research that you did for the book in the process of writing it, what were some of the takeaways for you? For organizations looking to defend themselves against social engineering, are there any lessons to be learned here?
Charles Arthur: [0:23:08] The principal takeaway that I found, and the thing I found most surprising initially but which has changed since, is the low value that companies put on the data of the customers that they have. So look at the British internet service provider called TalkTalk, which was hacked and lost about 160,000 people's details - names, addresses, birthdays, email addresses, bank details. All this was lost, and they were fined by the U.K. Information Commissioner's Office 400,000 pounds, which sounds a lot initially, but actually, it's about 2.50 pounds per person. And, you know, the people who were hacked didn't get any benefit from that, and all their personal details were now out on the net.
Charles Arthur: [0:23:54] By contrast, TalkTalk was fined 5 million pounds by the communications regulator for overbilling about 50,000 people. And the contrast between the giant fine for overbilling and the tiny fine for losing people's data struck me as the key imbalance that exists between the customer in terms of their data and the customer in terms of service. But that's all changing now, especially in Europe where the introduction of the GDPR, the General Data Protection Regulation, means that companies can be fined up to 20 million euros, or 4 percent of their annual global turnover, which can be very large amounts, indeed. In the case of TalkTalk, it would've been around 60 million pounds. So I think that the lessons are that, actually, the big fines are coming. And therefore, companies need to be looking much more about the safeguards they have around their customer data and even whether they're holding too much data.
Charles Arthur: [0:24:50] I think a lot of companies have gotten into the habit of getting data, holding data just because they can. They don't know why they're going to use it. They just think that having more is better. But the lesson is that, in general, you are going to be hacked at some point. It might be big. It might be small, but you are going to get hit. So you need to hold less data than you otherwise might if you weren't going to get hacked. The information commissioner, in giving evidence to the U.K. Parliament, said some people say data is the new oil - you know, that everyone's going to make money off it. But he suggested maybe you should think of data as the new asbestos, potentially toxic to your company.
Dave Bittner: [0:25:25] (Laughter).
Charles Arthur: [0:25:26] And I think that's an important lesson that companies should learn. And if you have the expectation that you're going to be hacked, then you should start to set up rings of trust within the company. Your most valuable data is behind multiple rings of trust. Your less valuable data is behind fewer rings of trust. And you have to expect that people are going to get in, and you have to try to minimize the effect that will be - especially on your most valuable data when they do get in.
Dave Bittner: [0:25:52] Yeah. I mean, it strikes me that we're playing off of that very real human factor that people want to be helpful, so that creates a vulnerability there.
Charles Arthur: [0:26:01] That's absolutely the case. I mean, you know, the social engineering aspect of hacking is one of the oldest, one of the most trusted and most reliable in many ways. If it's not the support line, then it's someone at the telephone company or it's just the person who's operating the system at the time. You can make the computer completely fail-safe. But if, in the end, you have people who are able to access the data, then one way or another, the data is accessible. And there's always a way of persuading people, by fair means or foul, to hand over that data.
Charles Arthur: [0:26:33] The thing that I found most surprising when I was researching this book was how old a lot of these techniques are. I mean, you know, social engineering is basically conning people. That's - you know, it's pretty much as old as humans and languages. If you're trying to figure out what the hacks of the future will look like, then what you want to do is to look at academic papers now in terms of hacking, what they're suggesting might be weaknesses, and just throw that forward and expect to see that in sort of 2038. The hacks of the future are actually hacks that exist in the present day, but we just haven't paid enough attention to.
Joe Carrigan: [0:27:10] That's a great interview, Dave.
Dave Bittner: [0:27:12] Well, thank you very much.
Joe Carrigan: [0:27:12] Two takeaways from this for me right away - one, phishing works (laughter), and it works well. And, two, so does two-factor authentication. Charles talks about how difficult it would be to hack into somebody's account when they have two-factor authentication. It's not impossible, but I've talked before, I think on the CyberWire podcast, about the multiplicity of difficulty. Just getting your password has some kind of difficulty factor. And if I make it more difficult by adding a two-factor authentication method - even with just a text code, which may not be the most secure way of performing two-factor authentication - it makes it much more difficult for an attacker to get into your account. Now, if you're being targeted by a nation-state actor or somebody who's focused entirely on getting your password, yeah, two-factor authentication might not be the best protection. And if you think that's the case, then text messaging is not the best way.
Dave Bittner: [0:28:02] Right.
Joe Carrigan: [0:28:03] It's best to go with a time-based code generated from a secret seed. But even that could be socially engineered around. If I can get you to give me that, then - guess what? - I'm in. But for these automated processes, which is what the vast majority of people need to worry about, something as simple as a text message is probably adequate to get them through the day.
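[Editor's note: The "time-based code generated from a secret seed" Joe describes is the TOTP scheme standardized in RFC 6238, which is what authenticator apps implement. As a rough sketch of the idea - not taken from the show - a minimal version in Python's standard library looks like this: the shared seed is HMAC'd with the current 30-second time window, and a short numeric code is truncated out of the result.]

```python
import hmac
import hashlib
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time code (RFC 6238) from a shared secret seed."""
    now = time.time() if for_time is None else for_time
    counter = int(now // step)  # which 30-second window we are in
    # HOTP core (RFC 4226): HMAC-SHA1 over the big-endian counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte slice
    offset = mac[-1] & 0x0F
    value = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % (10 ** digits)).zfill(digits)

# RFC 6238 test seed; at time T=59 the SHA-1 vector yields 94287082 (8 digits)
print(totp(b"12345678901234567890", for_time=59, digits=8))
```

[Because both sides derive the code independently from the seed and the clock, there is nothing to intercept in transit the way an SMS code can be - but, as Joe notes, an attacker who talks you into reading the current code aloud still gets in.]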
Dave Bittner: [0:28:20] It's interesting when you look at what he was talking about with the Clinton campaign...
Joe Carrigan: [0:28:23] Right.
Dave Bittner: [0:28:24] ...That they had multifactor on everything but this one account.
Joe Carrigan: [0:28:28] Yeah, on Podesta's account.
Dave Bittner: [0:28:29] And it was a high-value account, and they got in. And that was the ballgame.
Joe Carrigan: [0:28:33] I also like his statement that data is the new asbestos.
Dave Bittner: [0:28:35] Right. Yeah, I like that, too.
Joe Carrigan: [0:28:37] That's good.
Dave Bittner: [0:28:38] I like thinking of it as maybe - if you think of your data as being radioactive...
Joe Carrigan: [0:28:42] Yeah.
Dave Bittner: [0:28:42] ...You know, like (laughter)...
Joe Carrigan: [0:28:43] Yeah, handle it as such, you know?
Dave Bittner: [0:28:44] Right, right. You don't want to keep too much of it around. You could hit critical mass, and bad things could happen.
Joe Carrigan: [0:28:49] Right (laughter).
Dave Bittner: [0:28:50] Get rid of it. There's no reason to keep it around. I really enjoyed my conversation with Charles Arthur. Again, the title of the book is "Cyber Wars: Hacks that Shocked the Business World." Our thanks to Charles Arthur for joining us for today's show. And that is our podcast.
Dave Bittner: [0:29:06] Thanks for listening, and thanks to KnowBe4 for sponsoring our show. For help inoculating your organization's employees against social engineering with their new-school security awareness training, talk to KnowBe4 and be sure to sign up for their Cyberheist News at knowbe4.com/news. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about what they're up to at isi.jhu.edu.
Dave Bittner: [0:29:30] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Editor is John Petrik. Technical editor is Chris Russell. The executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [0:29:50] And I'm Joe Carrigan.
Dave Bittner: [0:29:51] Thanks for listening.