Be very aware of your desire to be right.
Ben Yagoda: [00:00:00] Be very aware of your desire to be right and your dislike of being wrong.
Dave Bittner: [00:00:09] Hello, everyone. And welcome to yet another episode of the CyberWire's Hacking Humans podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I am David Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:31] Hi, Dave.
Dave Bittner: [00:00:32] Later in the show, we got my interview with writer Ben Yagoda about cognitive biases. But before we jump into all of that - a quick word from our sponsors at KnowBe4.
Dave Bittner: [00:00:45] Step right up and take a chance. Yes, you there. Give it a try and win one for your little friend there. Which were the most plausible subject lines in phishing emails? Don't be shy. Were they, A, my late husband wished to share his oil fortune with you or, B, please read, important Message From HR or, C, a delivery attempt was made or, D, take me to your leader? Stay with us, and we'll have the answer later. And it will come to you courtesy of our sponsors at KnowBe4, the security awareness experts who enable your employees to make smarter security decisions.
Dave Bittner: [00:01:24] And we are back. Joe, we've got some great stories to share this week. But first, we love our listeners.
Joe Carrigan: [00:01:32] We do.
Dave Bittner: [00:01:32] We got a letter in the mail this week - this past week.
Joe Carrigan: [00:01:35] Yeah, it's a postcard.
Dave Bittner: [00:01:36] Yeah, describe to us what we got here.
Joe Carrigan: [00:01:38] On the picture side of the postcard, I'm looking at a very handsome Amish man who's smirking knowingly. And on the reverse side, I see a message here. It says - it's addressed to us here at the CyberWire. It says greetings, English, please type the following at your command prompt - four, slash, R, quote, C, colon, backslash, end quote, percent sign, percent sign, V, open parentheses, star, .doc, comma, star, .docx, comma, star, .xlsx, close parentheses, Do, R, E, N, quote, percent sign, percent sign, V, close quote, quote, percent sign, percent sign, tilde, N, X, V, .ransom, close quote. Then press enter. Thank you for running Amish ransomware.
Dave Bittner: [00:02:34] (Laughter) It was a long time getting there.
Joe Carrigan: [00:02:38] Right. Maybe I should just say it's a long string that essentially renames all your doc files to something else.
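For anyone who lost the thread of the dictation, it assembles into a Windows batch one-liner roughly like `for /R "C:\" %%V in (*.doc,*.docx,*.xlsx) do REN "%%V" "%%~NXV.ransom"` (the `in` keyword that batch requires isn't spelled out on the card). Here is a harmless Python sketch of the same rename, confined to a throwaway temporary folder:

```python
import pathlib
import tempfile

def amish_ransomware(root):
    """Rename every .doc/.docx/.xlsx under root to <name>.ransom,
    mimicking the postcard's batch one-liner - harmlessly."""
    renamed = []
    for pattern in ("*.doc", "*.docx", "*.xlsx"):
        for f in sorted(root.rglob(pattern)):
            target = f.with_name(f.name + ".ransom")
            f.rename(target)
            renamed.append(target.name)
    return renamed

# Demo in a temporary directory -- never point anything like this at C:\
with tempfile.TemporaryDirectory() as d:
    root = pathlib.Path(d)
    (root / "letter.doc").touch()
    (root / "budget.xlsx").touch()
    renamed = amish_ransomware(root)

print(renamed)  # ['letter.doc.ransom', 'budget.xlsx.ransom']
```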
Dave Bittner: [00:02:43] I see. That's what it does. Welcome to Amish ransomware. Yeah, I guess it's not very sporting of us to make fun of the Amish. But they don't listen to podcasts, so that's OK.
Joe Carrigan: [00:02:52] As far as we know. I did see an Amish guy using email once, though.
Dave Bittner: [00:02:55] All right.
Joe Carrigan: [00:02:56] I wish so bad at that point in time I had a camera phone. But I didn't.
Dave Bittner: [00:02:58] Well, there you go. So whoever sent that into us, they did not put a return address on it. It actually doesn't even have a postmark on it.
Joe Carrigan: [00:03:04] Right. How am I going to get them my bitcoin?
Dave Bittner: [00:03:07] (Laughter) Yeah, that's right.
Joe Carrigan: [00:03:09] That's what I'd like to know.
Dave Bittner: [00:03:09] Well, thanks for sending that in. Joe, why don't you kick things off this week? What do you have for us in terms of stories?
Joe Carrigan: [00:03:14] So last week, I said I would talk about this because I don't know if it's clear for a lot of our listeners - our technical listeners may understand what I'm about to talk about, but to our non-technical listeners, this may be new information. I want to talk about URLs and how they work. Basically, I'm going to stick to web-based URLs - HTTP and HTTPS.
Dave Bittner: [00:03:35] So this is the thing you type in to go to a website.
Joe Carrigan: [00:03:38] Exactly. So they follow a very regular structure. The first thing is that they will start with the characters HTTP or HTTPS, and those indicate whether you're going to use Hypertext Transfer Protocol or Hypertext Transfer Protocol Secure.
Dave Bittner: [00:03:55] Of course they do.
Joe Carrigan: [00:03:56] Right.
Joe Carrigan: [00:03:57] It just basically means that if you have an S after the HTTP, you're going to try to use a secure connection. Then it will be followed by a colon. Sometimes you'll see other URLs that start with, like, FTP - that's File Transfer Protocol. You might actually see telnet, colon. But most of the time, again, you see HTTP and HTTPS. Immediately following that, there are usually two slashes. And those tell your web browser that we're going to be looking for another computer - a host, or the server, you could say - even though technically the server is the software that runs on the machine you're going to be talking to.
Joe Carrigan: [00:04:33] And in there, there will be one of two things. There will either be an IP address, or there will be a named domain. And these are things like www.thecyberwire.com or www.google.com or www.facebook.com. And what that is - it's a way for humans to know how to find a resource because humans are not very good at remembering numbers. So if I tell you, go to my IP address - it's this number, dot, this number, dot, this number, dot, this number, you're never going to remember that. It's like trying to remember a phone number.
Dave Bittner: [00:05:02] We don't even need to do that anymore.
Joe Carrigan: [00:05:03] Right, we don't. So there's a tool, which I'm not going to go into, called the Domain Name System, or DNS, that actually resolves - the process is called resolving - the human-readable name to an IP address so that your computer then knows how to connect to it.
Dave Bittner: [00:05:18] So the IP address is the actual map to where the server lives.
Joe Carrigan: [00:05:22] Correct.
Dave Bittner: [00:05:23] It is assigned an IP address, and that's the real way for other computers to connect to it.
Joe Carrigan: [00:05:27] Correct because there's these things called routes that the network has and is cognizant of - and those routes are all based on IP addresses. They have nothing to do with names. So if I tell a computer, go find cyberwire.com, it doesn't have any cognizance of a route to that computer. It has to have the IP address in order to get the route.
Dave Bittner: [00:05:47] So that goes and gets translated to an IP address.
Joe Carrigan: [00:05:51] The domain gets translated to an IP address.
Dave Bittner: [00:05:52] And off we go.
Joe Carrigan: [00:05:52] And off we go - we're off to the races. That's a very complicated process, but we're not going to go into that.
Dave Bittner: [00:05:56] OK.
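That name-to-address resolving step can be seen with a single call from Python's standard library - here resolving localhost, which is answered locally and so works without any network connection:

```python
import socket

# DNS-style resolution: human-readable name -> IP address.
# "localhost" resolves via the local hosts file, so no network is needed.
ip = socket.gethostbyname("localhost")
print(ip)  # typically 127.0.0.1
```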
Joe Carrigan: [00:05:57] Following the domain, you may see a colon followed by some numbers. OK, now this is what's called a TCP port or a UDP port. It's a port number on the computer. And there are 65,535 available ports on any given computer. And these are where services or servers will actually run, and they will listen on these individual ports. Once the information arrives at this host, the next thing the host asks is, who's listening for this piece of information? And the way it figures that out is by the port number. If you see a port number in a web-based URL, that should raise a red flag - OK? - because there are default port numbers. Everything has to have a port number. If I'm going to send you to an HTTP address, the default port is 80. But I don't need to enter that into the URL. The URL can be simplified by leaving it off.
Dave Bittner: [00:06:49] It's assumed.
Joe Carrigan: [00:06:50] It's assumed, exactly. And if the protocol is HTTPS, then the default port is 443. Why would somebody run on a non-standard port? There are lots of reasons. We used to do it all the time when we were developing web applications because our development server would have, like, three different instances of the software. So we'd connect to different ports, and we didn't have to buy different machines. But if I have compromised a company's site, I may want to hide on that site and just start another web server that listens on, say, port 8080, right? And now you're actually going to that company's site, but you're going to a non-standard port - 8080 - and you're talking to a completely different service on that computer than the standard web server.
Dave Bittner: [00:07:32] So the domain name will be accurate.
Joe Carrigan: [00:07:35] The domain name could be accurate - could be 100 percent accurate.
Dave Bittner: [00:07:36] But it's not taking me to where I think I'm going.
Joe Carrigan: [00:07:39] Exactly, that's a good way to put it.
Dave Bittner: [00:07:41] So if I see those port numbers, should I strip them off as a routine sort of thing to do?
Joe Carrigan: [00:07:46] It depends on the situation, you know. But generally speaking, if you see port numbers and you're not technical, I wouldn't even visit the webpage. If I see port numbers after the domain name - say it reads www.someplace.com:8080 - I would avoid going to that place.
Dave Bittner: [00:08:02] OK, interesting.
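A quick way to see those pieces - scheme, host, explicit port - is Python's standard urllib.parse (www.someplace.com is just Joe's placeholder name):

```python
from urllib.parse import urlsplit

# An explicit port shows up as its own field when the URL is parsed.
u = urlsplit("http://www.someplace.com:8080/index.html")
print(u.scheme)    # http
print(u.hostname)  # www.someplace.com
print(u.port)      # 8080

# With no explicit port, .port is None - the default (80 or 443) is implied.
print(urlsplit("https://www.someplace.com/").port)  # None
```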
Joe Carrigan: [00:08:03] OK. So after the domain, with or without a port number, you'll see a series of slashes and other characters - it may look like a directory structure. And that's exactly what it is. These are essentially folders and files on the server - or on the host - that are going to be used to reference a particular file that you're going to access.
Dave Bittner: [00:08:20] Right.
Joe Carrigan: [00:08:21] And that's the file locator. And then after that, there may be a question mark, OK.
Dave Bittner: [00:08:26] Yes, yeah. I wanted to ask you about this.
Joe Carrigan: [00:08:28] Right. Now, the question mark begins the query string, and it's the last part of a URL. Inside the query string, you'll see a series of what we call key-value pairs. So it will say, like, some term, then an equal sign, then some other set of characters after it. Then it will have an ampersand, and the ampersand denotes, here's another key-value pair.
Joe Carrigan: [00:08:50] OK. So for example, I go to google.com/search. And then it has a question mark, and it says Q equals this plus is plus a plus test, which is the search string that I'm looking for. So Q is obviously a variable for the search. Then there's an ampersand. And there's something that says RLZ equals and then has another value in there. And then there's another ampersand, and it says OQ equals this plus is plus a plus test. I'm going to guess that means original query.
Dave Bittner: [00:09:21] OK.
Joe Carrigan: [00:09:21] This can go on for the entire length of a URL - I believe up to 4K of characters. There's a lot of information that can be crammed into a URL.
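Those key-value pairs can be pulled apart with the same standard library - here against a Google-style search URL like the one Joe just read:

```python
from urllib.parse import urlsplit, parse_qs

# parse_qs splits the query string into a dict of key -> list of values,
# decoding the + signs back into spaces along the way.
url = "https://www.google.com/search?q=this+is+a+test&oq=this+is+a+test"
query = parse_qs(urlsplit(url).query)
print(query)  # {'q': ['this is a test'], 'oq': ['this is a test']}
```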
Dave Bittner: [00:09:28] Yeah, and you see this a lot - a lot - all sorts of stuff after that question mark. I see that all the time.
Joe Carrigan: [00:09:33] Yes.
Dave Bittner: [00:09:33] And that's the tracking stuff, right?
Joe Carrigan: [00:09:35] A lot of the tracking information can reside right up there, exactly.
Dave Bittner: [00:09:38] Because I usually - if I'm sharing a URL, and I see a bunch of stuff after a question mark, I'll usually trim all that stuff out, test the URL to make sure it still works...
Joe Carrigan: [00:09:47] And then...
Dave Bittner: [00:09:48] And then that's what I'll send out.
Joe Carrigan: [00:09:49] Yeah, me too.
Dave Bittner: [00:09:50] To avoid that tracking.
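Dave's trim-before-sharing habit is easy to sketch in Python - this drops everything from the question mark on (the utm_ parameters below are just typical tracking keys, not from the show):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_tracking(url):
    """Drop the query string and fragment, keeping scheme, host and path.
    As Dave notes, test the trimmed link - some pages need their query."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(strip_tracking("https://www.someplace.com/article?utm_source=feed&utm_medium=rss"))
# https://www.someplace.com/article
```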
Joe Carrigan: [00:09:52] Exactly. I do that as well. Now, here's a little bit of social engineering stuff that can happen with a URL. And I've put the URL in our file here that we're looking at. So the URL reads like this. It is http://email@example.com. And if you click on that link, it will take you to thecyberwire.com.
Dave Bittner: [00:10:17] Yes, it does.
Joe Carrigan: [00:10:18] OK. So what this is is an abuse of an email-style URL. An email URL will look like mailto, colon, a username, an at symbol and then a domain.
Dave Bittner: [00:10:29] Right.
Joe Carrigan: [00:10:29] So because I've specified HTTP here, it tries to look it up like a server. And once it sees that there's an at sign, it treats everything in front of the at sign as a username - the same pattern you see in an email address - and disregards all of it. So let's flash back to last week. We were talking about DHL and their shipping information. If I craft a URL that says http://firstname.lastname@example.org and then have all the other query information after that, you might look at that URL and go, well, this actually does go to dhl.com. But there's an at sign there, so everything before the at sign is disregarded, and you just go to joesmalicioussite.com, where you get all kinds of malware and other stuff installed.
Dave Bittner: [00:11:26] Yeah. And I can imagine someone, you know, forming this to say - to have everything after the @ look like something that would be related to the original site. Like, it could be, you know, customerservicehelpers.com, right?
Joe Carrigan: [00:11:38] Right. Right. Absolutely.
Dave Bittner: [00:11:39] So it would say email@example.com. And I'd think, oh, well, this must be customer service for Google. But, no, it has nothing to do with Google.
Joe Carrigan: [00:11:48] Exactly. And customerservicehelpers.com is, of course, a malicious site.
Dave Bittner: [00:11:52] Right. Right.
Joe Carrigan: [00:11:52] So I can go out and buy domain names relatively cheaply and then do terrible things with those domains with exactly this kind of thing.
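For what it's worth, Python's URL parser applies the same rule: everything between the slashes and the at sign is treated as a throwaway username, and the real host is whatever follows the at sign. A quick sketch using the two names from Joe's example (the /track path is made up):

```python
from urllib.parse import urlsplit

# The "www.dhl.com" part is parsed as user info, not as the destination.
u = urlsplit("http://www.dhl.com@joesmalicioussite.com/track")
print(u.username)  # www.dhl.com        <- decoy; it gets disregarded
print(u.hostname)  # joesmalicioussite.com  <- where you actually end up
```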
Joe Carrigan: [00:12:00] So this goes back to what you said last week - what I should've said at the very beginning. If you get one of those phishing emails, just go to the website using your own fingers and type in the address, or use your own bookmark that you have.
Dave Bittner: [00:12:12] Right.
Joe Carrigan: [00:12:12] Don't click on the link in the email.
Dave Bittner: [00:12:14] Yeah. Good advice. All right, well, it's good stuff. It's a lot to take in, but it's worth knowing.
Joe Carrigan: [00:12:20] It is kind of a lot to take in. You can look it up on Wikipedia and get a better understanding of how it works. The Wikipedia article is not really layman-friendly, I don't think...
Dave Bittner: [00:12:28] (Laughter).
Joe Carrigan: [00:12:28] ...So I tried to simplify it here.
Dave Bittner: [00:12:30] No, it's a good overview. All right, well, Joe, this week, my story - you know, we've got - Thanksgiving is behind us, and so we are full-throttle into the holiday season. And that means...
Joe Carrigan: [00:12:42] We are now.
Dave Bittner: [00:12:43] We are now, yeah.
Joe Carrigan: [00:12:44] Now are you OK with it being Christmastime?
Dave Bittner: [00:12:46] I am now. I am on board. I like the Christmas music on the radio. I like the decorations going up. But I am definitely one of those people who resents it all going up before Thanksgiving. I think each holiday - you know, you got...
Joe Carrigan: [00:13:02] Should have its own period.
Dave Bittner: [00:13:03] Yeah, yeah. You got your Halloween. And then Halloween ends, and now we're in Thanksgiving season.
Joe Carrigan: [00:13:07] Yup.
Dave Bittner: [00:13:07] And then Thanksgiving ends, and now it's Santa's turn.
Joe Carrigan: [00:13:11] I actually agree with you a hundred percent.
Dave Bittner: [00:13:13] (Laughter) So with the holidays in full-throttle here, it means that scammers are going to be out looking to take advantage of all that additional shopping traffic.
Joe Carrigan: [00:13:20] Absolutely.
Dave Bittner: [00:13:21] And, of course, one of the ways they're going to do that is through credit card skimming. Now, I was over at Chris Hadnagy's website. This is social-engineer.org. Of course, Chris has been a guest a couple of times on our show.
Joe Carrigan: [00:13:32] Yes, he has.
Dave Bittner: [00:13:32] And they publish a regular newsletter. It's their Social-Engineer newsletter. And they have a really good guide here - a whole newsletter dedicated to protecting yourself against some of these skimming attacks.
Dave Bittner: [00:13:45] So it's a really interesting article here. It's called "Are You Being Skimmed?" And skimming, certainly, is nothing new, but there are some things in here that I had not seen anywhere else, or things I had not considered. And, of course, there are different places that you can get skimmed.
Joe Carrigan: [00:13:59] Yup.
Dave Bittner: [00:13:59] You can get skimmed at the ATM, and you want to look out to see if someone has put a skimming device overtop of the main credit card- or bank card-reading device on the machine.
Joe Carrigan: [00:14:12] Yeah. They have a picture here in this article. It's a skimmer - looks exactly like the surface of the ATM.
Dave Bittner: [00:14:17] Right.
Joe Carrigan: [00:14:18] And it looks like it would be hard to detect.
Dave Bittner: [00:14:20] Yeah. And they say, you know, wiggle it and make sure that it's solid on there. But I think of particular interest is being skimmed at gas pumps because, for whatever reason, the gas stations were able to lobby the credit card companies to sort of drag their feet when it comes to installing chip and PIN...
Joe Carrigan: [00:14:40] Yup.
Dave Bittner: [00:14:40] ...Systems on their gas pumps.
Joe Carrigan: [00:14:41] Indeed.
Dave Bittner: [00:14:42] They made the case that it's going to cost a lot of money to turn those over. So I think they've got till 2020 to turn those over.
Joe Carrigan: [00:14:49] So they have a longer time period with the PCI requirements to go to that chip and PIN system than anybody else does.
Dave Bittner: [00:14:55] Exactly.
Joe Carrigan: [00:14:56] And here we are, about a year and a month away from the deadline - right? - and I still don't see a lot of chip and PIN at gasoline pumps.
Dave Bittner: [00:15:05] No. I have yet to see one at a gas pump. And one of the things that this article points out is that the folks who are hitting up these gas pumps, they're actually installing the skimming equipment inside the gas pump.
Joe Carrigan: [00:15:16] Right.
Dave Bittner: [00:15:16] So they get a key to open up the pump. They install the electronics. The more sophisticated versions of these electronics - they have GSM built in. They just phone the information back to the bad guys.
Joe Carrigan: [00:15:27] Right.
Dave Bittner: [00:15:27] Some of them have Bluetooth built in.
Joe Carrigan: [00:15:29] So the bad guy only has to get into the machine once.
Dave Bittner: [00:15:32] Right.
Joe Carrigan: [00:15:32] He doesn't have to go back again to incriminate himself to collect the information. The information is just available to him, either over the cellular network or over a Bluetooth connection.
Dave Bittner: [00:15:40] Yeah, exactly. I was actually talking to one of the local police officers here. I'm hoping to get him on the show soon. He deals with scams and people trying to take advantage of seniors. And he was saying that he actually will not pay anything but cash at a gas station.
Joe Carrigan: [00:15:56] Really?
Dave Bittner: [00:15:57] Yeah. He just won't do it because gas stations are so susceptible to this, and they've seen it so many times, and there's really no way to tell. He only pays cash at gas stations.
Dave Bittner: [00:16:08] Now, there's a couple tips here - a couple things I hadn't really thought of that I think are interesting. One of the things they suggest is use well-lit pumps closest to the store, as they're more easily monitored by staff.
Joe Carrigan: [00:16:19] Right.
Dave Bittner: [00:16:19] Now, that's a great idea. If you got a line of pumps there, usually there's one that's the one - you know, the person inside the store can look out and see. If I'm a bad guy...
Joe Carrigan: [00:16:29] I don't want to put a skimmer on that one.
Dave Bittner: [00:16:30] ...That's the last one I'm going to put a skimmer on...
Joe Carrigan: [00:16:33] Right.
Dave Bittner: [00:16:33] ...Because it's in plain view of the people inside the store. So that's a good idea.
Joe Carrigan: [00:16:38] I would agree.
Dave Bittner: [00:16:39] Obviously, look for tampering - that sort of stuff. Use cash or pay inside. Cover your PIN because a part of these attacks is they'll also put some sort of video camera either nearby or on the device itself...
Joe Carrigan: [00:16:50] Right.
Dave Bittner: [00:16:51] ...So that they look for your PIN number. Use a credit card instead of a debit card.
Joe Carrigan: [00:16:54] I like that one. That's my methodology.
Dave Bittner: [00:16:56] Yeah, yeah. Because when you use a credit card, you - the odds are much better that you can get your money back if it's taken away.
Joe Carrigan: [00:17:02] Right because, first off, it's not your money going into the fraudster's pocket to begin with.
Dave Bittner: [00:17:08] Right.
Joe Carrigan: [00:17:08] It's the credit card company's money.
Dave Bittner: [00:17:09] And they say, if you're a sophisticated user - and I would say this is, well, someone like you, Joe...
Joe Carrigan: [00:17:14] Right.
Dave Bittner: [00:17:15] ...(Laughter) You can search for suspicious Bluetooth devices at ATMs and gas pumps. I'd say that's a little past the paygrade of most people.
Joe Carrigan: [00:17:23] Right. Yeah.
Dave Bittner: [00:17:25] (Laughter).
Joe Carrigan: [00:17:25] You could probably get an app on your phone that searches for Bluetooth devices.
Dave Bittner: [00:17:30] Yeah, yeah. If you want to be a good Samaritan, I suppose you could do that. But who knows - they're probably not going to name the Bluetooth access point Crooks Stealing Money (laughter).
Joe Carrigan: [00:17:39] Right. Did they mention seals? Look for seals.
Dave Bittner: [00:17:42] They do.
Joe Carrigan: [00:17:42] OK.
Dave Bittner: [00:17:42] Yeah, look for seals. But again, you know, if I'm a crook, how hard would it be for me to get a roll of that seal tape, you know?
Joe Carrigan: [00:17:47] Yeah, that's probably true. But that is what I do - I actually look for the seals. They're numbered, usually, and they have the logo of the company. But you're 100 percent correct. I could go to some cheap web-based place and just print off a roll of things that look like security seals.
Dave Bittner: [00:18:01] Yeah, absolutely. So you know, buyer beware. And if you can, obviously, this season, use places that have chip and PIN. With a chip and PIN device, your card gets tokenized - the data is encrypted and sent off - so it's much safer than swiping. Also, things like Apple Pay and Android Pay - those are tokenized.
Joe Carrigan: [00:18:21] Yup.
Dave Bittner: [00:18:21] And so those are much more secure than swiping that card...
Joe Carrigan: [00:18:25] Yes.
Dave Bittner: [00:18:25] ...At a retail place or in an ATM or at a gas pump.
Joe Carrigan: [00:18:28] Agreed.
Dave Bittner: [00:18:28] So lots of good tips here, so do check it out. It's on the social-engineer.org website. And, of course, thanks to our friend Chris Hadnagy for putting this information out there.
Joe Carrigan: [00:18:39] Doing good work, Chris.
Dave Bittner: [00:18:40] Yup. All right, Joe. It is time to move on to our Catch of the Day.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:18:48] Joe, this week's Catch of the Day was sent in from a listener named Steve (ph). And he says, hi, Dave and Joe.
Joe Carrigan: [00:18:52] Hi, Steve.
Dave Bittner: [00:18:53] I enjoy listening to your podcast as I travel between clients. I thought I would share with you a recent email that I received from a friend who wanted to warn me about a fraud but was also trying to help me to get those often-promised riches. Please keep up the good work. Best regards, Steve from across the pond. Steve is from the U.K.
Joe Carrigan: [00:19:10] We're not going to do a U.K. accent this week.
Dave Bittner: [00:19:12] (Laughter) You know what, I'm going to spare - we'll spare our listeners this time.
Joe Carrigan: [00:19:16] OK.
Dave Bittner: [00:19:17] So this is - the subject of this email is, the truth about your fund here in Africa. This is from barrister Sam A. Mutako (ph). I believe Steve works in the financial services industry, so this is targeted towards him.
Dave Bittner: [00:19:30] It says, greetings - after a serious thought, I decided to contact you because you was deceived about your fund. That is why it is impossible for you to succeed in completing the process after several payments. I will be very open to tell you that all info you received so far was all lies and that it is about extorting money from you. I know that you will be surprised to receive this massage, but this is nothing but the truth. It is a planned work with some of the officials, but the manager is not aware. After all the evil, they went to the bank manager trying to divert the fund. The manager refuses to accept the idea. The manager said he want to talk to you before he can release the fund. That is to make sure that you authorize the transfer to a different account. I will personally direct you to the appointed paying bank, where you will be paid instead of wasting your hard-earned money and time on these hoodlums. But before then, you must not disclose my info to anybody because this is top secret. Kindly send your reply to me via this email. And then he has his email address, and it's from barrister Sam A. Mutako.
Joe Carrigan: [00:20:35] Esquire.
Dave Bittner: [00:20:35] Esquire.
Dave Bittner: [00:20:36] So you know - so there's a little added bit of legitimacy there.
Joe Carrigan: [00:20:40] Right.
Dave Bittner: [00:20:40] A little bit of everything in here.
Joe Carrigan: [00:20:43] Because only an esquire can put a (esq.) after his name.
Dave Bittner: [00:20:47] That's right. It's the law. So thanks for sending this in to us, Steve. This is a good one. My favorite part is that, I know you're going to be surprised to receive this massage.
Joe Carrigan: [00:20:56] Right.
Dave Bittner: [00:20:56] I know I'm always surprised when I receive an unsolicited massage.
Joe Carrigan: [00:20:59] Me too, yeah.
Dave Bittner: [00:21:02] Not necessarily a bad thing, but it does leave me surprised. So all right, well, that is our Catch of the Day.
Dave Bittner: [00:21:08] Coming up next, we've got my interview with Ben Yagoda. He's a freelance writer. He recently penned an article titled "Your Lying Mind: Cognitive Biases Tricking Your Brain." That was published in The Atlantic. So we'll hear from him in just a moment. But first, a word from our sponsors at KnowBe4.
Dave Bittner: [00:21:27] And what about the biggest, tastiest piece of phish bait out there? If you said A, my late husband wished to share his oil fortune with you, you've just swallowed a Nigerian prince scam, but most people don't. If you chose Door B, please read, important message from HR, well, you're getting warmer, but that one was only No. 10 on the list. But pat yourself on the back if you picked C, a delivery attempt was made. That one, according to the experts at KnowBe4, was the No. 1 come-on for spam email in the first quarter of 2018. What's that? You picked D, take me to your leader? No, sorry. That's what space aliens say. But it's unlikely you'll need that one unless you're doing "The Day the Earth Stood Still" at a local dinner theater. If you want to stay on top of phishing's twists and turns, the new-school security awareness training from our sponsors at KnowBe4 can help. That's knowbe4.com/phishtest.
Dave Bittner: [00:22:31] And we are back. Joe, I recently spoke with Ben Yagoda. He's a freelance writer. And his article "Your Lying Mind: Cognitive Biases Tricking Your Brain" was published in The Atlantic - some interesting stuff here. Here's my conversation with Ben Yagoda.
Ben Yagoda: [00:22:45] One thing that had gotten a lot of attention in recent years actually grew out of the Iraq War and the mistaken idea that Iraq had weapons of mass destruction, which, according to everything I've read, of course wasn't true. But it wasn't kind of a cynical ploy by the American government to put that out there. The people in the CIA and intelligence agencies actually strongly believed or were convinced that this was the case. And there was a lot of - after that event, there was a lot of analysis in trying to figure out exactly why and how that had happened.
Ben Yagoda: [00:23:26] And in fact, the conclusion was a lot of it had to do with cognitive biases - specifically, the very important one known as confirmation bias, which says that if you or I or any human has a conviction or a belief, we will thereupon do everything we can to hold on to and justify that belief. So every piece of evidence will kind of read as confirming it - hence, confirmation bias. If something comes up that seems to contradict it, we'll either discount it or figure out some way to prove to ourselves that it actually supports what we already believe.
Ben Yagoda: [00:24:05] And that seems to be a lot of what happened with this idea of weapons of mass destruction - we knew that Saddam Hussein had had them in the past, and he was a bad guy, so we were convinced that he still had them. And our analysts viewed every piece of evidence in that direction. So obviously, that's not a good way to conduct an intelligence operation. And so the government put out some money and requests for proposals and focused in on the idea of so-called serious games - serious video games that could be used in training analysts to counter these biases. And there was a kind of competition. Several groups of scholars and researchers put together these games.
Ben Yagoda: [00:24:49] And the one that seemed to have the best results was one called "Missing." So I actually got hold of and played that game. Another thing I did was work with a psychologist named Richard Nisbett, who's active in the field. And he's kind of an optimist on the question of whether cognitive biases can be overcome - the idea being that, with training and instruction, some of them really can. And he points to examples such as the sunk cost fallacy.
Ben Yagoda: [00:25:21] The sunk cost fallacy is the idea that if we've put some money or time or effort into some endeavor, and it seems to be going really badly, the sunk cost fallacy tells us, well, stick it out because you've already put so much money, time or effort in, and you don't want that to go to waste. Which is really, as I say, a fallacy - you know, if you're going to the movies, and you've spent $10 for your ticket, and you've watched 40 minutes, and the movie is a disaster - it's boring, unpleasant, painful to sit through - the idea of sitting through another hour just because you paid the $10 is ridiculous. I mean, you've already paid the money. So would you choose to spend the next hour of your life in this painful situation or go outside and have a nice walk in the park?
Dave Bittner: [00:26:09] You've paid to have a bad time.
Ben Yagoda: [00:26:11] You've paid to have a bad time. And do you really want to have more - (laughter) more bad time? So Nisbett points to the fact, as he says, that economists who understand this routinely walk out of bad movies. And if they order a terrible meal in a restaurant, they don't necessarily walk out. But once they've eaten a few bites - enough to know that it's bad - they either send it back or just don't eat the rest, whereas the rest of us fall for the sunk cost fallacy and sort of tough it out and say, well, I'm going to finish the rest of it. So he says, if economists can learn this, why can't the rest of us? So I took his course - he has a course on the Coursera platform - and I read his book and tried to see if it would make me a better thinker and less subject to some of these fallacies.
Dave Bittner: [00:26:58] Yeah. You know, on this show, we talk about social engineering and people using some of these techniques for bad things - to take advantage of people, to steal from them. The things you learned in your journey here - how do you suppose they inform what you know about that? Are you more tuned into email scams and things like that?
Ben Yagoda: [00:27:20] Well, I'm not sure about the email scams. But one thing that does come to mind - and it's more benign than a terrible phishing scam - is something known as the anchoring effect, which is our habit of mind that if we hear on the one hand a large number or on the other hand a small number, that will sort of reset our whole expectation for numbers in general. So that's why good negotiators, if they're trying to sell something, start off with a total highball number. The person they're negotiating with might not accept that, but it will at least set the stage for a final number that's higher than it maybe should be. And on a smaller level, restaurant menus - sometimes there'll be, you know, a ridiculously high-priced veal chop for, like, you know, $55. And you'll say, no, I wouldn't spend that. But then there'll be a $45 filet of sole, and you'll say, well, that's reasonable. That's $10 less than the high number, so that'll be fine. I'll pay for that one. If there hadn't been that $55 veal chop, you never would've spent $45 for an entree.
Ben Yagoda: [00:28:32] That's a more benign way that people try to take advantage of us, and it's something that we certainly can be aware of both in restaurants and when we try to sell our house or buy a house.
Dave Bittner: [00:28:44] Do you have any advice for folks, you know, going about their normal day-to-day business? Through the things that you learned, are there any tips or tricks where people can do a better job with these things?
Ben Yagoda: [00:28:53] Richard Nisbett's book "Mindware" and his Coursera course are, if you have some time to spend, good ways to guard yourself against these things.
Ben Yagoda: [00:29:04] You know, it's a little bit tricky because there are so many of them. The Wikipedia page for cognitive biases lists 185 of them. Some of them are sort of dubious or trivial, but there's a good solid hundred or so, I think. And they're different, and it's hard to be aware of all of them at any one time.
Ben Yagoda: [00:29:25] Somebody sent me an email after this article came out that I thought was pretty interesting. I think she's a trainer of some sort, and she said that one thing she tells people is, be very aware of your desire to be right and your dislike of being wrong. And to sort of scrutinize oneself, interrogate oneself - especially on the political spectrum, where confirmation bias is really out there right now.
Ben Yagoda: [00:29:56] We've got, more or less, two opposing sides in politics in the country that are really, you know, at arms against each other metaphorically - I hope it stays metaphorically. And everybody is so committed to their side and really subject to confirmation bias. So everything they see, they interpret in the light of their side.
Ben Yagoda: [00:30:19] And to just interrogate that and scrutinize that and think about, well, is it more important for me to be right and my side to get ahead or to kind of address the issues and problems facing us? A sort of self-consciousness, I think, is always a good thing.
Dave Bittner: [00:30:36] Joe, lots to take in there.
Joe Carrigan: [00:30:37] Yes, yes. I love what he says about confirmation bias. Confirmation bias is the one reason I always say, don't get your political news from social media.
Dave Bittner: [00:30:46] Right, right.
Joe Carrigan: [00:30:48] Nowhere is there a bigger echo chamber formed by confirmation bias - it seems to amplify itself - than social media.
Dave Bittner: [00:30:57] Yeah.
Joe Carrigan: [00:30:57] And I've actually seen posts on social media where I just start typing over and over again, don't get your political news from Facebook, don't get your political - and people have reacted going, well, just because you think we're wrong - I don't think you're wrong. That's not my point. My point is don't get your political news here.
Dave Bittner: [00:31:14] Yeah. A friend of mine pointed out in the - leading up to the midterms, you know, we - the people we like, we give the benefit of the doubt, and the people we don't like, we don't give the benefit of the doubt, right?
Joe Carrigan: [00:31:26] Right. Absolutely.
Dave Bittner: [00:31:27] It's the same thing.
Joe Carrigan: [00:31:28] That's confirmation bias.
Dave Bittner: [00:31:29] Yeah.
Joe Carrigan: [00:31:29] We see a lot of the sunk cost fallacy in scams.
Dave Bittner: [00:31:32] Oh, yeah.
Joe Carrigan: [00:31:32] We had a story about a guy who lost a lot of his retirement savings, and then somebody called him up and said, hey, you can get some of that money back if you come after me. Fortunately, he didn't fall for that scam.
Dave Bittner: [00:31:41] Yeah. For the low, low price of whatever.
Joe Carrigan: [00:31:43] Yeah, exactly. Anchoring effect - it's funny he says that there's always something expensive on the menu, and a lot of times I go, and I see it's lamb. And I like lamb.
Dave Bittner: [00:31:52] Yeah.
Joe Carrigan: [00:31:52] I like lamb so much that I might be willing to pay the price for it. But frequently, when I get lamb, you know, when I see lamb on the menu and I order it, I get the response, oh, we're out of the lamb.
Dave Bittner: [00:32:02] Really?
Joe Carrigan: [00:32:03] Really.
Dave Bittner: [00:32:03] Oh.
Joe Carrigan: [00:32:04] So I'm thinking they don't even have lamb.
Dave Bittner: [00:32:05] Interesting.
Joe Carrigan: [00:32:06] They just put it on the menu for this anchoring effect.
Dave Bittner: [00:32:09] Yeah, that's interesting.
Joe Carrigan: [00:32:10] That's ol' paranoid Joe talking about restaurants now. But like Ben said, this is pretty innocuous. I'm not too concerned about it.
Dave Bittner: [00:32:17] Yeah. You know, there's a local guitar store in town, and back when you looked things up in the phone book, they would always advertise that if you came in and visited and tried out playing a guitar, they had a free amp that you could have. You know, a tiny little free amp.
Joe Carrigan: [00:32:33] Yeah. A practice amp?
Dave Bittner: [00:32:34] Free little practice amp, yup. Just come on in. And they were always out of the amp.
Joe Carrigan: [00:32:38] Right.
Dave Bittner: [00:32:38] Oh, decades - they never had that amp in stock. But they were happy to give you a couple bucks off of a guitar that you'd buy instead, right? Right?
Joe Carrigan: [00:32:46] That's actually called bait-and-switch, and that might be illegal.
Dave Bittner: [00:32:49] Yeah, I think it is, but so it goes. All right. Well, thanks to Ben Yagoda for joining us again. The title of his article is "Your Lying Mind: The Cognitive Biases Tricking Your Brain." That is in The Atlantic. We'll have a link to that in the show notes. And thanks to all of you for listening.
Dave Bittner: [00:33:04] And, of course, thanks to our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can order up at knowbe4.com/phishtest. Think of KnowBe4 for your security training. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more about what they're up to at isi.jhu.edu.
Dave Bittner: [00:33:30] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben, editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe.
Dave Bittner: [00:33:46] I'm Dave Bittner.
Joe Carrigan: [00:33:47] And I'm Joe Carrigan.
Dave Bittner: [00:33:48] Thanks for listening.