
Skepticism is the first step.

Joe shares stories of typo-squatting. Dave warns us against responding to malicious email, even just for fun. The catch of the day is from a listener stringing along a romance scammer. Carole Theriault returns with an interview with Chris Olson from The Media Trust on how targeted advertising can enable election interference.


Transcript

Chris Olson: [00:00:00] So in today's landscape, the digital ecosystem is designed to deliver content that is targeting each of us individually. So each of our experiences on the worldwide web is unique. 

Dave Bittner: [00:00:11]  Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we take a look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: [00:00:30]  Hi, Dave. 

Dave Bittner: [00:00:30]  We've got some good stories to share this week. And later in the show, Carole Theriault returns. She's got an interview with Chris Olson from The Media Trust on how targeted advertising can enable election interference. 

Dave Bittner: [00:00:42]  But first, a word from our sponsors, KnowBe4. Have you ever been to security training? We have. What's it been like for you? If you're like us, ladies and gentlemen, it's the annual compliance drill, a few hours of PowerPoint in the staff break room. Refreshments in the form of sugary doughnuts and tepid coffee are sometimes provided. But a little bit of your soul seems to die every time the trainer says, next slide. Well, OK, we exaggerate, but you know what we mean. Stay with us. And in a few minutes, we'll hear from our sponsors at KnowBe4 who have a different way of training. 

Dave Bittner: [00:01:22]  And we are back. Joe, why don't you start things off for us this week? 

Joe Carrigan: [00:01:25]  Dave, I want to tell you a story. 

Dave Bittner: [00:01:27]  OK. 

Joe Carrigan: [00:01:27]  This is a story from my own personal history. It's a very funny story. 

Dave Bittner: [00:01:31]  I'll be the judge of that. 

Joe Carrigan: [00:01:32]  But back in the day, before Facebook and MySpace, there was - the very beginnings of social media were social networking websites. There was a website called highschoolalumni[.]com (ph). 

Dave Bittner: [00:01:45]  OK. 

Joe Carrigan: [00:01:45]  And I think it's still around. I don't know. I haven't been there in years because everybody I want to be connected with from high school I'm connected with on Facebook. But at this point in time, that was all we had. (Imitating unidentified accent) Back in my day, that was all we had. 

Dave Bittner: [00:01:57]  (Laughter) Right. 

Joe Carrigan: [00:01:57]  Anyway, I was telling my boss about this site. I said, hey, this is pretty cool. It's a way to connect with people you haven't seen in a while. If you're looking for somebody, maybe they're on this site. And he says, well, let's take a look at it. So we go over to my computer, and he's shoulder surfing. And I type in highschoolalimni[.]com (ph) with an I where a U should be. And if you look at your QWERTY keyboard, the I and the U are right next to each other. 

Dave Bittner: [00:02:17]  Oh, yes. So they are. Yes, indeed. 

Joe Carrigan: [00:02:19]  So I hit return, and the webpage goes black, and I'm like, that's odd. And then my screen explodes in porn ads. 

Dave Bittner: [00:02:29]  (Laughter). 

Joe Carrigan: [00:02:29]  Right. 

Dave Bittner: [00:02:29]  Of course it does. 

Joe Carrigan: [00:02:31]  With my boss standing over my shoulder. Remember, that's an important part of this story. 

Dave Bittner: [00:02:34]  Oh, yes (laughter). 

Joe Carrigan: [00:02:35]  That's where the humor comes in, right? Now, fortunately, my boss... 

Dave Bittner: [00:02:39]  Is this a work computer also? 

Joe Carrigan: [00:02:40]  Yes. Oh, yeah. 

Dave Bittner: [00:02:42]  Of course it is. A big red alarm just went off. Somebody at the IT department who is eating their lunch just did a spit-take all over their computer because alarms went off. Yeah. 

Joe Carrigan: [00:02:50]  Where's Carrigan? What's he doing? 

Dave Bittner: [00:02:50]  (Laughter). 

Joe Carrigan: [00:02:53]  My boss got a chuckle, shakes his head and walks back to his desk. And I'm sitting there trying to clean this up because I had to close all these windows. It was terrible, terrible experience, but this practice has a name... 

Dave Bittner: [00:03:05]  Yeah. 

Joe Carrigan: [00:03:05]  ...Right? It's called typosquatting. It's a lot more prevalent than you may think. And think of all the damage you can do when you take advantage of the way people read and the way people type. 

Dave Bittner: [00:03:16]  Yeah. 

Joe Carrigan: [00:03:17]  First, you can capitalize on just people making typos like I did in that story I just told, right? 

Dave Bittner: [00:03:22]  Right, yep. 

Joe Carrigan: [00:03:22]  And then there's email reception, right? You can receive emails that are being sent to domains that are mistyped. We're going to talk more about that in a minute. 

Dave Bittner: [00:03:30]  OK. 

Joe Carrigan: [00:03:31]  And then, of course, there's the thing we've talked about frequently on here, using them as phishing sites. For example, an L and a one look very similar, if not the same, in certain fonts, right? So if I register the website 1inkedin[.]com (ph) - which is a registered domain, by the way. 

Dave Bittner: [00:03:46]  Of course, it is. 

Joe Carrigan: [00:03:47]  There is no website to it right now, but that doesn't mean there won't be in the future. 

Dave Bittner: [00:03:50]  Yeah. 

Joe Carrigan: [00:03:50]  Then I can tell people, go to linkedin.com, but link them to 1inkedin[.]com. Alastair Paterson has a good story about this in SecurityWeek right now. And he talks about an experiment by the Godai Group back in 2011, where researchers registered domain names that looked a lot like those of real companies and then sat back to see what would happen. And one of the things they found during that time period was that they received 120,000 emails that were destined for the actual companies. And these things included all sorts of sensitive information like trade secrets, business invoices, personal information of employees, network diagrams. What could be better than a network diagram to someone trying to penetrate a network? 
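
The look-alike tricks Joe describes - adjacent-key typos like alimni for alumni, homoglyph swaps like 1inkedin for linkedin, and dropped letters - can be sketched in a few lines of Python. This is a hypothetical, illustrative generator, not any tool mentioned in the episode; the keyboard and homoglyph tables are deliberately minimal:

```python
# Minimal sketch of typosquat-variant generation for a domain name.
# Both lookup tables are tiny, illustrative samples, not complete maps.
QWERTY_NEIGHBORS = {
    "u": "yi", "i": "uo", "l": "k", "o": "ip", "e": "wr", "n": "bm",
}
HOMOGLYPHS = {"l": "1", "o": "0", "i": "1"}  # letters that resemble digits

def typo_variants(domain):
    """Return a set of plausible typo/look-alike variants of `domain`."""
    name, dot, tld = domain.partition(".")
    variants = set()
    for idx, ch in enumerate(name):
        # Adjacent-key substitutions, e.g. alumni -> alimni
        for neighbor in QWERTY_NEIGHBORS.get(ch, ""):
            variants.add(name[:idx] + neighbor + name[idx + 1:] + dot + tld)
        # Look-alike character swaps, e.g. linkedin -> 1inkedin
        if ch in HOMOGLYPHS:
            variants.add(name[:idx] + HOMOGLYPHS[ch] + name[idx + 1:] + dot + tld)
        # Single-character omissions, e.g. youtube -> outube
        variants.add(name[:idx] + name[idx + 1:] + dot + tld)
    variants.discard(domain)
    return variants
```

Defenders sometimes run generators along these lines against their own domains to decide which variants are worth registering or monitoring, which is exactly the "buy them up" countermeasure discussed below - and also why it can't cover every combination.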

Dave Bittner: [00:04:30]  Right. 

Joe Carrigan: [00:04:31]  Username and password - so that's something that could be better than a network diagram - and then as well as service requests. But that's pretty impressive, that in just six months they got 120,000 emails with this kind of information in it. Not all those emails contained this kind of information, but some of them did. The growth of typosquatting has actually spawned a lobbying group called the Coalition Against Domain Name Abuse, or CADNA, which - I don't know. They're pushing for government regulation. I don't know how you regulate this. I'm not really sure what that regulation would look like. One of the big problems is that most users, when they're browsing the internet, according to this article, still manually type in the web address they want to visit as opposed to going to a search engine or using a link or a bookmark that they might have. 

Dave Bittner: [00:05:14]  Right. 

Joe Carrigan: [00:05:14]  I use bookmarks, and a lot of times, when you start typing an address, if the web browser knows that you're typing a bookmark that you already have, it will fill in the bookmark for you, which is a useful feature. But one of the things that makes this possible is that domain names can be registered and then dropped without any risk or cost to the person doing it. For five days, you can say, nope, I don't want this domain name; give me my money back. 

Dave Bittner: [00:05:37]  Interesting. 

Joe Carrigan: [00:05:37]  And in five days, you can wreak a lot of havoc and cause a lot of damage. 

Dave Bittner: [00:05:42]  If they do it this way, they can minimize the amount of money that they even have to spend on registering the domain names. 

Joe Carrigan: [00:05:48]  They can minimize it to zero. Right. They can eliminate the cost. Let's say I register a domain day one. Day two, I start sending out phishing campaigns and do that for about four days, which is a really long lifespan for a phishing campaign, right? And then I go to the registrar and I go, nope, I don't want this domain anymore. And I don't pay a dime as a bad guy. So any money I've made is 100% profit. So it's just another way for these guys to reduce cost. Some countermeasures - what can be done about this? Well, from the site's standpoint, there is a practice that is becoming more common of going around and buying up domain names that might be typosquatted. 

Dave Bittner: [00:06:19]  Right. 

Joe Carrigan: [00:06:20]  Like, for example, if you go to acebook.com (ph), Facebook owns that. 

Dave Bittner: [00:06:23]  Oh, OK. 

Joe Carrigan: [00:06:24]  So when you type in acebook.com, you go to facebook.com. It redirects you. But you can't buy them all, right? 

Dave Bittner: [00:06:31]  Right, right. 

Joe Carrigan: [00:06:31]  There's no way you can buy them all. 

Dave Bittner: [00:06:32]  It's not like Pokemon. You can't collect them all. Right. 

Joe Carrigan: [00:06:33]  Right, exactly. I mean, there are so many different combinations, the problem is, I think, intractable. Buying them up is good for catching the common ones. But if I'm really trying to deceive you - you might be able to stop most of the people entering wrong addresses by typo, but you will probably not be able to stop malicious actors with this tactic. So I think, again, it falls back on the humans - right? - the users of these sites to be vigilant. Watch for misspellings in domain names, and watch what you're typing. I say use bookmarks so that you know that you're going to a bookmark that you have that's always been the right one. Or use a search engine. But even using a search engine - we've talked on this show about how that can lead you to another suspicious site. 

Dave Bittner: [00:07:13]  I've noticed that, you know, a lot of times when you're using a search engine, if you mistype something, for example, into Google... 

Joe Carrigan: [00:07:20]  Right. 

Dave Bittner: [00:07:20]  ...It'll come back with the search that it thought you meant to use. 

Joe Carrigan: [00:07:23]  Right. 

Dave Bittner: [00:07:23]  It says, did you mean to search for this rather than that? So on a certain level, they're looking out for you. 

Joe Carrigan: [00:07:28]  Yeah, they are, but they'll sell ads to people that are selling competing products. 

Dave Bittner: [00:07:32]  Right, right. 

Joe Carrigan: [00:07:32]  And you might click on that ad... 

Dave Bittner: [00:07:33]  Right, right. 

Joe Carrigan: [00:07:34]  ...And call that number like I did on my phone one day trying to get in touch with Comcast. And I wound up talking to somebody else. I had some choice words for that person. 

0:07:42:(LAUGHTER) 

Joe Carrigan: [00:07:42]  Look out for strange redirects. This is a real giveaway. So if you actually go to highschoolalimni[.]com now, which I don't advise you do... 

Dave Bittner: [00:07:49]  (Laughter). 

Joe Carrigan: [00:07:49]  ...But you can go there. 

Dave Bittner: [00:07:51]  Right. 

Joe Carrigan: [00:07:51]  It takes you through a number of redirects before you wind up at the final page, which has a big red button that says I'm a human - right? - but that button goes to a JavaScript function that says go to visa. I didn't click on that, and I didn't take the time to look at the source code, but I would bet that's malicious. So don't do that. Don't click on that button. But the redirects are obvious. You see that happen. If you type in or you click on a link to where you think you're going and you see, like, the screen flash and the URL change three or four times, you should know, OK, this is a red flag - a big red flag. I'm not going where I think I'm going. 

Dave Bittner: [00:08:25]  Danger, danger. Pause. 

Joe Carrigan: [00:08:26]  (Laughter) Right. Odd-looking letters or numbers; be skeptical about sharing personal information and financial information, and confirm you're on the right websites. 

Dave Bittner: [00:08:35]  Yeah, especially if the website starts asking you for information that isn't relevant to that website... 

Joe Carrigan: [00:08:42]  Right. 

Dave Bittner: [00:08:42]  ...For example. Because one of the things that they'll do is they will pretend to be the website that you think you're logging on to. 

Joe Carrigan: [00:08:48]  Correct. 

Dave Bittner: [00:08:49]  So they'll pretend to be, say, your bank website. This happened to me once where I was on my way into logging into what I thought was the account for my credit card. 

Joe Carrigan: [00:08:59]  Right. 

Dave Bittner: [00:09:00]  It looks just like my credit card website. And they started asking for information that was not relevant to anything having to do with my credit card. 

Joe Carrigan: [00:09:08]  Right. 

Dave Bittner: [00:09:08]  And so that triggered my response. And I looked at it and, sure enough, it was a typosquatting site. 

Joe Carrigan: [00:09:13]  And we talked about the Stripe scammers who had a redirect to a website that looked just like the Stripe login where they... 

Dave Bittner: [00:09:21]  Oh, yeah. 

Joe Carrigan: [00:09:21]  You enter your username and password, and the next thing they say is, what's your bank account number?

Dave Bittner: [00:09:24]  Right. 

Joe Carrigan: [00:09:25]  Right? Which seems like it would be necessary at Stripe, but it's really not part of the login or authentication process, right? That's kind of a more subtle red flag. I don't know that a lot of users would've caught that. I don't know that I would have caught that. 

Dave Bittner: [00:09:36]  Yeah. 

Joe Carrigan: [00:09:36]  You don't really know until you fall victim to these things. 

Dave Bittner: [00:09:39]  No, not at all. 

Joe Carrigan: [00:09:40]  But if something seems broken or strange, like images don't show up on the webpage, that's another red flag as well. 

Dave Bittner: [00:09:47]  Oh. 

Joe Carrigan: [00:09:47]  So, really, I think this falls back into the human's hands, into the user's hands, because there's not a lot that can be done to stop it from the side of the site that you're trying to visit. And there are always going to be these malicious actors out there. And this lobbying group, CADNA, I don't know how much success they're going to have in lobbying for this. I don't know how big of a priority this is going to be for legislators, and I don't even know what any regulation would look like. 

Dave Bittner: [00:10:09]  Yeah. It's certainly good information, and I think probably along the lines most of us have fallen for something like this. I once found out the hard way, similar to you - with a client looking over my shoulder - that if you don't type the Y in youtube.com, you will go to someplace that is - well, they have videos... 

Joe Carrigan: [00:10:26]  (Laughter). 

Dave Bittner: [00:10:27]  ...But not the kind of stuff that you usually see on YouTube. So, yeah, just be careful. And (laughter) yeah, it's a tough one. All right. Well, that's a good story. My story this week comes from the folks over at PhishLabs. This is a blog post written by Dane Boyd. And it's titled "Don't Respond to Suspicious Emails." I think, Joe, it is natural sometimes for folks who maybe feel as though they figured out that an email is a scam. 

Joe Carrigan: [00:10:57]  Right. 

Dave Bittner: [00:10:58]  And they want to string the scammer along. 

Joe Carrigan: [00:11:00]  They want to scambait. 

Dave Bittner: [00:11:00]  They want - yeah, or they just want to waste their time. And we certainly - in our Catch of the Day segment, we deal with that a lot where people have done exactly that. And the folks here at PhishLabs are warning that perhaps that's not the best thing to do for a number of reasons. First of all, when you respond to the scammers, no matter how you respond, they know that they've got a real email address. 

Joe Carrigan: [00:11:23]  Yes. 

Dave Bittner: [00:11:23]  There's a hot, live email address. 

Joe Carrigan: [00:11:25]  That's true. 

Dave Bittner: [00:11:25]  And if you figure that they are, you know, mostly just going after lists of emails, unless they're really targeting you, for them to know - to have verification that an email address is indeed real, that's valuable information. 

Joe Carrigan: [00:11:39]  It is. And we've talked about that before as well. 

Dave Bittner: [00:11:41]  One of the other things they point out is that when you respond, quite likely you will include your email signature. And your email signature has valuable information in it. 

Joe Carrigan: [00:11:51]  Yeah. 

Dave Bittner: [00:11:51]  It has phone numbers. It has, a lot of times, addresses. And this is information that they can use in future attacks. Another interesting point that I'd never considered is that a lot of times your email headers contain information about your location. 

Joe Carrigan: [00:12:05]  Yes. 

Dave Bittner: [00:12:06]  And so this helps the scammers know where you are and so they can follow up with targeted information. Again, basically, the more information you provide to them, the easier it is for them to come back at you with targeted information that is going to be harder for you to detect. 
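
The location leak Dave mentions comes mainly from the Received headers that each mail server prepends as a reply travels to the scammer. A minimal sketch using Python's standard email module - the raw message, hostnames and IP address below are all invented for illustration:

```python
from email import message_from_string

# Hypothetical raw reply, as a scammer might receive it. The hostnames
# and the 203.0.113.7 address are made up (a documentation IP range).
RAW_REPLY = """\
Received: from mail.example-corp.com (mail.example-corp.com [203.0.113.7])
\tby mx.scammer-side.example; Mon, 4 Nov 2019 10:00:00 -0500
From: victim@example-corp.com
To: scammer@scammer-side.example
Subject: Re: Invoice

Very funny.
"""

def originating_hops(raw_message):
    """Return the Received headers, which trace the message's path.

    The earliest hop (last in the list) typically names the sender's
    own mail server - often enough to infer an employer, an ISP,
    or a rough geographic location.
    """
    msg = message_from_string(raw_message)
    return msg.get_all("Received", [])
```

Even a one-line "ha, nice try" reply carries headers like these, which is part of why responding at all hands the scammer more than just a confirmed-live address.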

Joe Carrigan: [00:12:23]  Yeah. That's a good point. 

Dave Bittner: [00:12:25]  Yeah. So interesting article here - we'll have a link to it in the show notes. I guess the overall message here is that - don't get too cocky, kid, right? (Laughter). 

Joe Carrigan: [00:12:33]  Right, yeah. I had somebody on Instagram send me one of these messages - direct messages on Instagram. I started down the road of toying with this person, right? It's the beautiful young woman in the Instagram picture, and there's no reason for her to be talking to me at all - right? - except to scam me. But the more I thought about it - this was actually my personal Instagram account. I don't want to use my personal Instagram account to do this, because of exactly these reasons. I don't want to give too much information to these people. I don't want these people to know who I am, because there are actually pictures of me on that profile. I'm much more keen on starting up a fake account, but even that - I don't know - maybe there's some information they can glean from that. 

Dave Bittner: [00:13:14]  Yeah. 

Joe Carrigan: [00:13:14]  I live in a fairly populous area, though. 

Dave Bittner: [00:13:16]  Interesting point they make, that when you engage with these criminals, it could be like catching a tiger by the tail... 

Joe Carrigan: [00:13:22]  Yeah. 

Dave Bittner: [00:13:22]  ...In that, OK, they may be coming at you with something this time that's pretty easy for you to detect. 

Joe Carrigan: [00:13:28]  Right. 

Dave Bittner: [00:13:29]  But that doesn't mean they don't have more powerful tools in their toolbox. 

Joe Carrigan: [00:13:33]  Right. 

Dave Bittner: [00:13:33]  And if you annoy them, they may be able to come back at you just out of spite. 

Joe Carrigan: [00:13:39]  Right. 

Dave Bittner: [00:13:39]  So I guess don't underestimate them. Best to just leave them alone, let them move on to someone else. 

Joe Carrigan: [00:13:44]  Right. 

Dave Bittner: [00:13:45]  I suppose on one level - I mean, you and I have talked about the need to sort of waste their time and change the economic proposition. 

Joe Carrigan: [00:13:52]  Right. 

Dave Bittner: [00:13:52]  So there's that. 

Joe Carrigan: [00:13:54]  Right. 

Dave Bittner: [00:13:54]  But I don't know, I guess - there are good points here that, unless you really know what you're doing, you could be in for more than what you counted on. 

Joe Carrigan: [00:14:01]  Agreed. 

Dave Bittner: [00:14:01]  Yeah. All right. Well, those are our stories. It's time to move on to our Catch of the Day. 

0:14:05:(SOUNDBITE OF REELING IN FISHING LINE) 

Dave Bittner: [00:14:09]  Our Catch of the Day was sent in by a listener named Daryl (ph). He is from Australia, and he sent a series of text messages that he received. He's sort of stringing along the person, which... 

Joe Carrigan: [00:14:20]  (Laughter). 

Dave Bittner: [00:14:20]  So there you go. Shows that - do as I say, not as I do, I guess (laughter). 

Joe Carrigan: [00:14:23]  Right. 

Dave Bittner: [00:14:25]  This is a fun exchange. I will play the part of the person reaching out to Daryl, and Joe, you can play the part of Daryl. 

Joe Carrigan: [00:14:32]  OK. 

Dave Bittner: [00:14:32]  And it goes like this. Want to meet? 

Joe Carrigan: [00:14:35]  Yes. Which campsite do I need to go to that doesn't require my credit card number? 

Dave Bittner: [00:14:40]  I need a casual sex partner - first time. I am a escort girl in your area. Copy my link and paste your browser. Just join there and call me live there. I am waiting for you there. It's free - join. Just do the free register and call me. 

Joe Carrigan: [00:14:54]  Does anyone fall for this? 

Dave Bittner: [00:14:56]  Just copy my profile link and paste your browser, then put any mail on there and call me face-to-face. 

Joe Carrigan: [00:15:01]  There must be a few. I checked that link - totally unsecure. You should probably just get a job. 

Dave Bittner: [00:15:07]  It's totally free. No need to CC. You can check it now and call me on there. Babe, I'm waiting for you on there. 

Joe Carrigan: [00:15:13]  Yeah, it's totally a scam, and the URL is listed as malicious. I reported it. You should be done soon. Also, your VPN is outdated and easy to bypass. 

Dave Bittner: [00:15:22]  Just copy my profile link and paste your browser, then put any mail on there. 

Joe Carrigan: [00:15:26]  What is it with your country and scams? Also, really easy to track you now. 

Dave Bittner: [00:15:30]  WTF. Bye. 

Joe Carrigan: [00:15:32]  Haha, quit scamming. I can't believe how many times you sent this out. How many people fall for it? 

Dave Bittner: [00:15:38]  Bye. You just lost real girl. Good luck. 

Joe Carrigan: [00:15:41]  You have real files on this device - dumb move. 

Dave Bittner: [00:15:43]  I am fake. So bye. Please stop massage (ph) me. And it ends there. 

0:15:47:(LAUGHTER) 

Dave Bittner: [00:15:50]  Well, thanks to Daryl for sending that in. He certainly wasted a little bit of that scammer's time. 

Joe Carrigan: [00:15:55]  Yep. 

Dave Bittner: [00:15:55]  So that's... 

Joe Carrigan: [00:15:56]  They're coming for you, Daryl. 

0:15:57:(LAUGHTER) 

Dave Bittner: [00:15:57]  Yeah, that's right. Heads up. Yeah, shields up, shields up. All right, that is our Catch of the Day. Coming up next, Carole Theriault returns. She's got an interview with Chris Olson from The Media Trust on how targeted advertising can enable election interference. 

Dave Bittner: [00:16:11]  But first, a word from our sponsors at KnowBe4. And now back to that question we asked earlier about training. Our sponsors at KnowBe4 want to spring you from that break room with new-school security awareness training. They've got the world's largest security awareness training library, and its content is always fresh. KnowBe4 delivers interactive, engaging training on demand. It's done through the browser and supplemented with frequent simulated social engineering attacks by email, phone and text. Pick your categories to suit your business. Operate internationally? KnowBe4 delivers convincing, real-world proven templates in 24 languages. And wherever you are, be sure to stay on top of the latest news and information to protect your organization with KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable, too. Sign up for Cyberheist News at knowbe4.com/news. That's knowbe4.com/news. 

Dave Bittner: [00:17:16]  And we are back. Joe, it's always great when Carole Theriault returns. She's got a great interview this week. She talks with Chris Olson. He's from The Media Trust. And they're going to be talking about election interference. Here's Carole Theriault. 

Carole Theriault: [00:17:29]  Election time is certainly gaining momentum in the States, and there is a lot of talk about how the US can better protect itself against the nefarious schemes afoot to make the process of voting, well, less honorable and transparent. Now, there have always been miscreants out there who've tried to sway an election, be it from class president to POTUS heights. But the methods they are now employing are light-years away from more traditional approaches. Chris Olson, the CEO of The Media Trust, an information security company based in the US, kindly agreed to give us a bit more insight on what voters need to watch out for. Chris, thank you so much for taking the time to chat to us today. 

Chris Olson: [00:18:09]  Yeah, thanks for having me today. 

Carole Theriault: [00:18:10]  So tell me a bit about you and Media Trust. What do you guys do? 

Chris Olson: [00:18:14]  So The Media Trust's goal is to fix the internet by creating better digital ecosystems, governing assets - assets being our websites and mobile applications - and then enabling digital risk management. Digital risk management covers malvertising, data compliance and then disinformation campaigns in elections. 

Carole Theriault: [00:18:32]  OK. Now, I hear you are warning folks about electoral manipulation, and this is for many obvious reasons a serious topic. Can you frame this up for us? Like, what do you mean by electoral manipulation? 

Chris Olson: [00:18:46]  So in today's landscape, the digital ecosystem is designed to deliver content that is targeting each of us individually. So each of our experiences on the worldwide web is unique. For example, if you were to visit a news website from where you're sitting today and I was going to visit it from Virginia, just based on our geography, we're going to get significantly different information, different content and different advertising. That's all built on the backs of third-party code that has been proliferating over the last roughly fifteen years, as the worldwide web has become a predominant method of communication, commerce, etc. 

Chris Olson: [00:19:21]  For election messaging, what's critical is that the ability to target messaging to each individual consumer on top of what you referenced, which is there are politicians and then other groups that are attempting to get people to think in certain ways, creates a perfect storm where consumers don't know where the information that they're reading is coming from. And from that perspective, it means that users have to be very careful what they take as the truth, and they have to start to understand as much about where information is coming from as what the message itself is saying. 

Carole Theriault: [00:19:53]  You know, when I look at my news feeds, if I see a headline, you know, that might be clickjack-y (ph) or certainly piques my interest, I will check the source before I click on it. And I tend to only click on sources I know, which, you know, is a good thing and a bad thing, I suppose. 

Chris Olson: [00:20:08]  Yeah. And then I would say that you're probably educated versus the typical consumer, and in addition to that, you're wary of people trying to deliver messaging to you and the concept of presenting information that would cause you to click and just move into content based on maybe commercial reasons and then commercial being they're trying to get you to do something to make money. So the typical consumer is not thinking about those things, and if something sensational is in front of them or something that would pique their interest based on prior behavior, they're typically clicking there first before doing any analysis. 

Carole Theriault: [00:20:41]  Of course, I'm based here in the UK; you're based in the States. And we're going to see different types of ads and news feeds come in to our news feeds, right? Different stories are going to show up for you than will for me. So how is that used to manipulate elections? 

Chris Olson: [00:20:55]  OK, so it's a great question. One of the ways is the way that advertising has been designed to meet a consumer or to bring specific information to particular target audiences that actually pays for the free worldwide web that we all get to use. 

Carole Theriault: [00:21:10]  OK. 

Chris Olson: [00:21:10]  So if you've been to websites before - say, a travel site or a retail site - you've looked to purchase something, you've then left the site, and suddenly there's advertisements that display ads or video ads or more subtle things or even articles on websites that you've gone to after leaving that site or that app or even after having visited the store, right? So if you've physically gone into a store and you've left and you go home, you look at a website and there's advertising there - that is built on the back of what we call third-party code. And third-party code is made from, you know, roughly - there are a few hundred companies that really manage the predominance of that. There are thousands of companies that engage in that activity. And so that capability of knowing your predilections and your preferences and then bringing you messaging, be it advertising or content, is built in to and part of a worldwide web. 

Carole Theriault: [00:22:01]  Right. 

Chris Olson: [00:22:01]  Political candidates, political advertising and political messaging literally leverage the exact same infrastructure to target consumers with information. The way that people are manipulated goes back to pre-digital days: particular constituencies with particular goals of moving different people in a direction - whether that's to get them to vote for someone, to sow discord, or to get them to go out on the street to protest. Those are what we would say are more nefarious use cases. And in particular, they're nefarious if the deliverer is obfuscating or not telling the truth about who they are when they deliver those messages, or if the messages are simply false. 

Carole Theriault: [00:22:41]  OK. So say I'm a consumer. I'm in some demographic. And I'm being targeted with false information. So it's all geared up to attract to me, to lure me in, to engage me, right? 

Chris Olson: [00:22:52]  That's right, yep. 

Carole Theriault: [00:22:53]  And so from an election perspective, how are people supposed to fight back? 

Chris Olson: [00:22:57]  First, consumers can see very little. It's not easy for them to notice these things are false, and it's especially true when they read something and their emotions run high; they forget that they're supposed to care who sent them the message. So they can't see much, but they can know a lot of things. First - unfortunately, today, most of what is delivered on the worldwide web can't be trusted. It doesn't mean that most of it isn't true, but it can't be trusted. So skepticism is the first step. 

Chris Olson: [00:23:24]  And once you're skeptical and understand that something is being targeted to you for a purpose - rather than you just stumbled upon the information - and you know that there are people with some form of goal behind it, that they're trying to get you to do something, you can look at it in a different way. One of the things, which we just mentioned, is if you know the author - and of course, a bad guy can fake being the author. But if you know the author, that's a great step. Messages with no attribution to a candidate - those would be things to watch out for. 

Chris Olson: [00:23:53]  And then - this is where it gets really difficult - but outlandish claims, you really need to check yourself. Outlandish claims are likely not true. They can be true occasionally. But for the most part, those things that seem either too good to be true or too terrible to be true, depending on your predilection, probably are. 

Carole Theriault: [00:24:12]  Do you think that if people just stopped sharing content based on just clickbait-y (ph) headlines and actually read the article and thought, OK, actually, that has got some juice to it or that's something useful, that might help ebb the flow of disinformation? 

Chris Olson: [00:24:27]  No. I think that something like that would help, but I think that may be asking too much of human nature. And so I... 

Carole Theriault: [00:24:34]  (Laughter) You're probably right. 

Chris Olson: [00:24:35]  I do think that it falls back on the big platforms, the news websites. Historically, prior to digital, there was an editor that would know what was going to be produced on, say, a newspaper - all right? - something that was actually printed. And so there was at least - whether that person is biased or not really isn't the important part; it's that there was a person or a group of people that looked before anything was delivered. We knew they were biased, but we at least knew their biases, and we had an expectation of that. Today that editorial step is skipped, and it doesn't mean that all of the content on newspaper sites isn't read by an editor and approved; a lot of it is. 

Chris Olson: [00:25:16]  But significant portions of websites are moving much too fast. In addition, because of third-party code and targeting, the editors aren't able to see a lot of the content before it runs because it isn't targeted to them or to the groups of people that they've hired to review the content. So there's too much. It's targeted so they can't see it. And then I think the final piece is, where we've gone over the last year or two is to try to put them into a corner of being a moderator, which gets into a free speech conversation, which in the West is a problem, rather than forcing them to expose and know who is delivering the content. 

Carole Theriault: [00:25:54]  Chris, I could talk to you all day about this. It's fascinating. 

Chris Olson: [00:25:57]  Yeah. Thank you. 

Carole Theriault: [00:25:58]  But for our listeners, I think the overall message is keep your wits about you, folks. Things are heating up out there. This was Carole Theriault for "Hacking Humans." 

Dave Bittner: [00:26:08]  All right. Joe, what do you think? 

Joe Carrigan: [00:26:09]  Good interview. I like what Chris has to say in this interview. The first thing he says that everybody should be aware of is that the content on the web is tailored to you individually. It is remarkably tailored thanks to all this third-party code that Chris was talking about. And a lot of this information that's coming to us, consumers of this information have no idea where it's coming from. They just don't know. 

Dave Bittner: [00:26:31]  Yeah. It struck me - one of the things he said was, when this information is coming at you, someone will send you something that will push your buttons... 

Joe Carrigan: [00:26:40]  Right. 

Dave Bittner: [00:26:40]  ...And your emotions make you forget to check the source. 

Joe Carrigan: [00:26:43]  Yep. 

Dave Bittner: [00:26:43]  You're so wound up about this either terrible thing or great thing that you see come by that you hit the share button before you go and check to see - wait a minute, where did this come from? 

Joe Carrigan: [00:26:53]  I see this all the time on my feed on Facebook. 

Dave Bittner: [00:26:56]  Yeah (laughter). 

Joe Carrigan: [00:26:56]  You know, I have friends who are very pro-Trump and friends that are very anti-Trump, right? And it's funny to watch what they share because they share exactly this kind of stuff, where they haven't thought about it. And these are otherwise smart people. 

Dave Bittner: [00:27:10]  Right. 

Joe Carrigan: [00:27:10]  You know, these are people who I have respect for in other fields. I generally tend to not have respect for anybody else's political opinion but my own. 

[00:27:17]  (LAUGHTER) 

Dave Bittner: [00:27:18]  Right. 

Joe Carrigan: [00:27:19]  And I don't expect anybody to respect my political opinion. So I'm OK with that. 

Dave Bittner: [00:27:23]  Right, OK (laughter). 

Joe Carrigan: [00:27:24]  I guarantee you, dear listener, you don't agree with me on everything. 

Dave Bittner: [00:27:28]  Yeah. 

Joe Carrigan: [00:27:28]  You and I are going to find some point of disagreement. You know what? That's OK. But you can't let your emotions get in the way of your thought process and your validation process here. 

Dave Bittner: [00:27:37]  I also think that some of us - I like to think of myself on social media, on Facebook, places like that - I really work hard at being a trusted source. 

Joe Carrigan: [00:27:46]  Yeah. 

Dave Bittner: [00:27:47]  Like, I - before I share something, I go and check the sources, or I will try to frame it and say, hey, you know, here's an interesting article from an admittedly left-wing source... 

Joe Carrigan: [00:27:56]  Right. 

Dave Bittner: [00:27:56]  ...Or a well-known right-wing source. And so... 

Joe Carrigan: [00:27:59]  I've seen you do that. 

Dave Bittner: [00:28:00]  ...At least you know what the source is. And so my hope is that when people see something coming from me, they'll know that it's been, to some degree, vetted by me. 

Joe Carrigan: [00:28:09]  Right. 

Dave Bittner: [00:28:10]  But more importantly, I have other people on social media who I feel the same way about. If I see something come from them, based on their past history, I expect that I'm going to spend more time on the things they share because of the thoughtfulness they put into researching things before they share them. Be that person (laughter). 

Joe Carrigan: [00:28:27]  I generally don't share political stuff on Facebook - or on Twitter. I don't think it's useful. I just think that because the web is so tailored to our experience, what happens is you wind up in an echo chamber. It's just not helpful. It doesn't let you consider other people's opinions. It invalidates them out of the box with crass and harsh language. And I don't think it's constructive at all. Skepticism is the first step. I like that he says that. Understand that when you see an ad, that ad is being targeted to you and to you alone. Just like Chris was talking about when you go to, like, amazon.com - I go to Amazon and I'll look up something, and then as soon as I leave Amazon, I'll get the ads for what I was just looking at, right? 

Dave Bittner: [00:29:04]  (Laughter) Right, right. Yeah. 

Joe Carrigan: [00:29:05]  And it's irritating because sometimes it's the ads for something I just bought (laughter). 

Dave Bittner: [00:29:09]  Yeah. Already own it, yeah. (Laughter). 

Joe Carrigan: [00:29:11]  Not going to buy it again. Thanks. 

Dave Bittner: [00:29:13]  Yeah, too late. No, I don't need two of those cars. 

Joe Carrigan: [00:29:15]  Another thing that struck me as kind of funny but also kind of sad - asking people to read the article is asking too much. 

Dave Bittner: [00:29:23]  (Laughter) Yes, I know. 

Joe Carrigan: [00:29:24]  No, no, I just want to share this article - the headline that matches with my worldview. 

Dave Bittner: [00:29:28]  (Laughter) No. Right. 

Joe Carrigan: [00:29:29]  And let everybody else deal with their emotions on it. 

Dave Bittner: [00:29:32]  Yeah. 

Joe Carrigan: [00:29:33]  I think that's kind of irresponsible. Like you said, look for people who are the trusted source. But I'm firmly of the opinion that if you're talking political stuff, don't get it on Facebook or on any social media platform. 

Dave Bittner: [00:29:43]  Yeah. Where should you go, then? 

Joe Carrigan: [00:29:45]  You should do your due diligence as a political member of whatever society you live in and go to a news source that you trust - and go to multiple news sources that you trust. And find out - there are websites out there that talk about these biases. All these news sources have their biases. Those biases are open and analyzed. But you should look at something that not only aligns with your opinions, but also read something from somebody who disagrees with you on the position, right? 

Dave Bittner: [00:30:11]  Yeah. 

Joe Carrigan: [00:30:11]  Maybe there's something you're missing that you're not considering in this. 

Dave Bittner: [00:30:14]  Right. Be deliberate about breaking yourself out of your own bubble. 

Joe Carrigan: [00:30:18]  I agree. 

Dave Bittner: [00:30:18]  Yeah. All right. Well, again, thanks to Carole Theriault for bringing this story to us, and thanks to Chris Olson from The Media Trust for sharing his expertise. That is our show. 

Dave Bittner: [00:30:28]  We want to thank our sponsors, KnowBe4, whose new-school security awareness training will help you keep your people on their toes with security at the top of their mind. Stay current about the state of social engineering by subscribing to their Cyberheist News at knowbe4.com. Think of KnowBe4 for your security training. 

Dave Bittner: [00:30:46]  We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. 

Dave Bittner: [00:30:55]  The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik. Executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: [00:31:09]  And I'm Joe Carrigan. 

Dave Bittner: [00:31:10]  Thanks for listening.

Copyright © 2019 CyberWire, Inc. All rights reserved. Transcripts are created by the CyberWire Editorial staff. Accuracy may vary. Transcripts can be updated or revised in the future. The authoritative record of this program is the audio record.

Supported by:
KnowBe4

KnowBe4 is the world’s largest security awareness training and simulated phishing platform that helps you manage the ongoing problem of social engineering. Their new school security awareness training platform is user-friendly and intuitive. It was built to scale for busy IT pros that have 16 other fires to put out. Learn more at KnowBe4.com.
