Hacking Humans 3.19.20
Ep 90 | 3.19.20
Disinformation vs. misinformation.
Transcript

Samuel Woolley: [00:00:00] We are seeing the emergence now of bots that are built with more machine learning, so they can learn from their environment and change the way that they speak so that now, rather than the bots just being clunky and being used to amplify certain kinds of content, the bots are actually engaging in conversation. 

Dave Bittner: [00:00:15]  Hello, everyone. And welcome to another episode of the CyberWire's "Hacking Humans" podcast. This is the show where each week, we look behind the social engineering scams, the phishing schemes and the criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: [00:00:37]  Hi, Dave. 

Dave Bittner: [00:00:37]  We've got some good stories to share this week. Later in the show, Carole Theriault returns, and she's going to be speaking with Samuel C. Woolley from the University of Texas at Austin. They're going to be talking about disinformation campaigns. 

Dave Bittner: [00:00:49]  But first, a word from our sponsors at KnowBe4. So what's a con game? It's a fraud that works by getting the victim to misplace their confidence in the con artist. In the world of security, we call confidence tricks social engineering. And as our sponsors at KnowBe4 can tell you, hacking the human is how organizations get compromised. What are some of the ways organizations are victimized by social engineering? We'll find out later in the show. 

Dave Bittner: [00:01:18]  And we are back. Joe, why don't you start things off for us this week? You got a little... 

Joe Carrigan: [00:01:21]  Well, I have a correction, Dave. 

Dave Bittner: [00:01:23]  Yeah, yeah. 

Joe Carrigan: [00:01:23]  I was listening to our last episode. 

Dave Bittner: [00:01:25]  OK. 

Joe Carrigan: [00:01:25]  I listen to all of our episodes, and not just because I love the sound of my own voice and yours... 

Dave Bittner: [00:01:30]  (Laughter). 

Joe Carrigan: [00:01:30]  ...But because I want to make sure I didn't say anything that makes me go, oh, that was stupid. 

Dave Bittner: [00:01:34]  Yeah. 

Joe Carrigan: [00:01:35]  And last week, I said something that makes me go, oh, that was stupid. 

Dave Bittner: [00:01:38]  OK. 

Joe Carrigan: [00:01:38]  So when I said last week - we were talking about uploading things to VirusTotal. 

Dave Bittner: [00:01:42]  Yeah. 

Joe Carrigan: [00:01:42]  And I said that VirusTotal didn't have signatures. That's not how VirusTotal works. VirusTotal is actually a very good site. You upload a file to VirusTotal, and it processes that through a bunch of different engines. So the question was, what does it mean... 

Dave Bittner: [00:01:56]  When you say engines, what do you mean? 

Joe Carrigan: [00:01:57]  Virus-scanning engines - through, like, 50 of them. 

Dave Bittner: [00:02:00]  OK. 

Joe Carrigan: [00:02:00]  So, I mean, McAfee's on there. Symantec's on there. But a bunch of virus scanners you've never heard of are also on there, as well. 

Dave Bittner: [00:02:06]  OK. 

Joe Carrigan: [00:02:06]  And having a low VirusTotal score means that very few of those virus scanners detected anything as malicious. 

Dave Bittner: [00:02:14]  I see. 

Joe Carrigan: [00:02:14]  So there is actual, legitimate software that will sometimes come up as malicious in one or two of these scanners. 

Dave Bittner: [00:02:21]  OK. 

Joe Carrigan: [00:02:21]  I've had that happen before. So when we were talking about the malicious JavaScript that, if you uploaded it, had a low hit rate - that could easily be interpreted as being OK. 

Dave Bittner: [00:02:32]  I see. All right. So just a little clarification. 

Joe Carrigan: [00:02:35]  A little clarification from what I said last week, yes. 
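Joe's point about low hit rates can be made concrete with a short script. This is a minimal sketch, assuming VirusTotal's public v3 REST API (a GET on /api/v3/files/{hash} with an x-apikey header) and a hypothetical API key; the 5% "low hit rate" cutoff is an arbitrary choice for illustration, not an official threshold.

```python
import json
import urllib.request

def detection_summary(stats: dict) -> tuple:
    """Summarize a VirusTotal last_analysis_stats dict into
    (flagged_engines, total_engines)."""
    flagged = stats.get("malicious", 0) + stats.get("suspicious", 0)
    total = sum(stats.values())
    return flagged, total

def is_low_hit_rate(stats: dict, threshold: float = 0.05) -> bool:
    """A 'low VirusTotal score': only a small fraction of engines flagged
    the file. As Joe notes, this does NOT prove the file is benign."""
    flagged, total = detection_summary(stats)
    return total > 0 and flagged / total < threshold

def lookup_file(sha256: str, api_key: str) -> dict:
    """Fetch a file report from the VirusTotal v3 API (requires a key)."""
    req = urllib.request.Request(
        f"https://www.virustotal.com/api/v3/files/{sha256}",
        headers={"x-apikey": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["data"]["attributes"]["last_analysis_stats"]

# Example: 2 of 50 engines flag the file -- a "low hit rate",
# which is easy to misread as a clean bill of health.
sample = {"malicious": 2, "suspicious": 0, "undetected": 48}
print(detection_summary(sample))   # (2, 50)
print(is_low_hit_rate(sample))     # True
```

The interpretation step is deliberately separate from the network call: a handful of detections on an otherwise clean report can be a false positive on legitimate software, or the leading edge of a new malicious sample that most engines haven't caught yet.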

Dave Bittner: [00:02:37]  Yeah, all right. Very good. Well, my story this week - actually, this one hits a little close to home for you, I suppose. You know, there's - everyone, understandably, is talking about the coronavirus... 

Joe Carrigan: [00:02:48]  Yup. 

Dave Bittner: [00:02:48]  ...The COVID-19. I heard someone comment on Twitter the other day that the great thing about COVID-19 is that it's made every podcast a COVID-19 podcast. 

Joe Carrigan: [00:02:57]  (Laughter). 

Dave Bittner: [00:03:00]  And I suppose, thanks to this story, we're no exception to that. So this was actually sent in by a listener. And this was a message that was being sent around by their workplace IT department. And it reads like this. It says (reading) the Department of Health and Human Services released an alert regarding a malicious website pretending to be a live map for coronavirus COVID-19 global cases by Johns Hopkins University. It's circulating on the internet, waiting for unwitting internet users to visit the website. Visiting the website infects the computer with an information-stealing program which can exfiltrate a variety of sensitive information. Furthermore, anyone searching the internet for a coronavirus map could unwittingly navigate to this malicious website. Cyber actors may be sending emails with malicious attachments or links to fraudulent websites to trick victims into revealing sensitive information or donating to fraudulent charities or causes. Exercise caution in handling any email with a COVID-19-related subject line, attachment or hyperlink, and be wary of social media pleas, texts or calls related to COVID-19. 

Dave Bittner: [00:04:02]  So this is a good warning here. I think worth mentioning that I think the reason that Hopkins, which, of course, is your place of employment... 

Joe Carrigan: [00:04:10]  Right. 

Dave Bittner: [00:04:11]  ...The reason that they would become a target here is that Hopkins has this amazing utility, this sort of global heat map - one of the best there is. 

Joe Carrigan: [00:04:19]  Right. Yeah, Dave, it's coronavirus.jhu.edu. 

Dave Bittner: [00:04:23]  Yeah. 

Joe Carrigan: [00:04:23]  And it is a heat map, and it's actually a pretty interesting tool. I advise everybody to take a look at it. But type in the URL coronavirus.jhu.edu. 

Dave Bittner: [00:04:30]  Yeah. Don't trust a search engine, necessarily. 

Joe Carrigan: [00:04:33]  Right, although it did come up as the first hit on Google. So that's how I found it. 

Dave Bittner: [00:04:37]  OK. 

Joe Carrigan: [00:04:37]  But, you know, make sure you're going to jhu.edu. Just as is the case with just about everything, including this email going around - the URL may even look like it says coronavirus.jhu.edu. But behind the scenes, it may be bobsmaliciouswebsite.com or something. 

Dave Bittner: [00:04:51]  Right, yeah. So I think the bigger message here, though, is, as we're sort of on the leading edge of dealing with this here in the U.S. as we record this - of course, things are changing very quickly. 

Joe Carrigan: [00:05:02]  Yes. 

Dave Bittner: [00:05:02]  But be extra vigilant because there is no question that the bad guys are going to be taking advantage of this, taking advantage of all of our emotions and our - possibly our fears... 

Joe Carrigan: [00:05:13]  Right. 

Dave Bittner: [00:05:13]  ...And our anxieties. 

Joe Carrigan: [00:05:14]  Oh, definitely our fears and our anxieties. 

Dave Bittner: [00:05:15]  Yeah. 

Joe Carrigan: [00:05:15]  That's one of the things they're going to capitalize on. 

Dave Bittner: [00:05:17]  Yeah. Just be extra vigilant. Look twice - all those things we warn you about. Like Joe just said, make sure that you type in that URL. Don't click the links. Think twice. Ask a friend before you respond to... 

Joe Carrigan: [00:05:28]  Right. 

Dave Bittner: [00:05:28]  ...Donate or give some money, something like that. Just slow down. Check it out. Make sure that it's legit or, you know, donate to the legitimate causes that we all know are helping with these sorts of things... 

Joe Carrigan: [00:05:41]  Right. 

Dave Bittner: [00:05:41]  ...Things like the Red Cross... 

Joe Carrigan: [00:05:42]  The legitimate cause of your choice that you normally give to. 

Dave Bittner: [00:05:44]  Right. 

Joe Carrigan: [00:05:44]  Right. 

Dave Bittner: [00:05:44]  Right, absolutely. Absolutely. So thanks to our listener for sending that in. That is my story this week. Joe, what do you have for us? 

Joe Carrigan: [00:05:51]  My story, Dave, comes out of Delaware here in the U.S. It is about an 89-year-old woman who lost $9,500 to scammers. They called her and said, your grandson has been arrested for causing a car accident. 

Dave Bittner: [00:06:05]  Right. 

Joe Carrigan: [00:06:05]  And they had her grandson's correct name. They knew who she was. 

Dave Bittner: [00:06:09]  Yeah. 

Joe Carrigan: [00:06:10]  And they knew that she was related to this person. And they said, you're going to have to mail us $1,500 in bail money. And we've talked about these scams before, right? 

Dave Bittner: [00:06:19]  Yeah. 

Joe Carrigan: [00:06:20]  The woman actually puts $1,500 in an envelope and mails it to an address in Connecticut. And what happens? They call her back, and they say the other driver has passed away. 

Dave Bittner: [00:06:29]  Oh. 

Joe Carrigan: [00:06:30]  Now you have to send us $10,000 in funeral expenses, right? So she rounds up $8,000. And a guy shows up at her house to collect the money. 

Dave Bittner: [00:06:41]  Really? 

Joe Carrigan: [00:06:41]  Yeah, shows up at her house. And she gives him the eight grand. At this point, the woman calls her family and says, what's going on with him? Is he OK? And they're like, yeah, he's fine. He's right here. You know, none of this has happened. This is a scam. Now she's mad. 

Dave Bittner: [00:06:57]  (Laughter). 

Joe Carrigan: [00:06:58]  Right? She's been scammed. Everybody knows she's been scammed. But... 

Dave Bittner: [00:07:03]  I could just picture the movie trailer, you know? 

Joe Carrigan: [00:07:05]  Right. 

Dave Bittner: [00:07:05]  Don't mess with grandma. 

Joe Carrigan: [00:07:08]  (Laughter). 

Dave Bittner: [00:07:08]  She's a grandma on... 

Joe Carrigan: [00:07:08]  With these allergies, you have a great voice for movie trailers. 

Dave Bittner: [00:07:11]  She's on a mission. Yeah. 

Joe Carrigan: [00:07:15]  (Laughter) Awesome. So she calls the cops. 

Dave Bittner: [00:07:17]  OK. 

Joe Carrigan: [00:07:17]  Right? And the cops say, they're going to call you back. And when they call you back, they're going to try to get more money out of you. Let's see if we can get them. 

Dave Bittner: [00:07:24]  Oh, excellent. 

Joe Carrigan: [00:07:25]  Right? Sure enough, they call her back. And they say, we need another $10,000. And the excuse this time is for the sake of your grandson, right? 

Dave Bittner: [00:07:33]  What does that even mean? 

Joe Carrigan: [00:07:34]  I don't know what that means. I don't know what that means. 

Dave Bittner: [00:07:36]  (Laughter) OK. 

Joe Carrigan: [00:07:37]  But this time - this is terrifying to me. This time, the guy says, we're going to send a couple of guys over. They're going to need to come into your house to count the money. 

Dave Bittner: [00:07:45]  Really? 

Joe Carrigan: [00:07:46]  OK? Now, if this woman hadn't called the cops, where would this have gone? 

Dave Bittner: [00:07:50]  Yeah. 

Joe Carrigan: [00:07:51]  Right? This is what Penn Jillette was talking about a couple weeks ago. 

Dave Bittner: [00:07:53]  Yeah. 

Joe Carrigan: [00:07:53]  These guys are thugs. And they're violent, and they're going to do whatever they need to do to get the money. 

Dave Bittner: [00:07:58]  Well, you know, it's funny. I was thinking the same thing. But what I was thinking about was what Penn said - about how these folks have to take that risk. To meet someone in person... 

Joe Carrigan: [00:08:07]  Right. 

Dave Bittner: [00:08:08]  ...Is putting them at risk. And I'm not sure how this story's going to end, but it sounds like that risk plays out. 

Joe Carrigan: [00:08:13]  Right, it does, because she goes, OK, fine. You send a couple guys here, and I'll have the money ready for you. 

Dave Bittner: [00:08:20]  Yeah. 

Joe Carrigan: [00:08:20]  And the guys show up. And when they show up, the cops are there. And the cops arrest two guys and charge them each with two felonies. 

Dave Bittner: [00:08:26]  Nice. 

Joe Carrigan: [00:08:26]  Right? Nice. 

Dave Bittner: [00:08:27]  Right. 

Joe Carrigan: [00:08:27]  Which is great. 

Dave Bittner: [00:08:29]  Yeah. 

Joe Carrigan: [00:08:29]  I'm very happy that this ended this way. Now, she's still out the 15 - or $9,500... 

Dave Bittner: [00:08:34]  OK. 

Joe Carrigan: [00:08:34]  ...Because she did give somebody $1,500 via the mail. And then the first guy that showed up, who may have been one of the guys who was arrested - I don't know. 

Dave Bittner: [00:08:41]  Who knows? 

Joe Carrigan: [00:08:41]  He doesn't have that $8,000 anymore at this point. 

Dave Bittner: [00:08:43]  Yeah. Well, they might've just been money mules or... 

Joe Carrigan: [00:08:45]  Right. 

Dave Bittner: [00:08:45]  ...You know, hire these thugs to come just pick up the money and take a, you know... 

Joe Carrigan: [00:08:49]  They could very well have been money mules. 

Dave Bittner: [00:08:50]  Yeah. 

Joe Carrigan: [00:08:51]  In fact, I'll bet that's exactly what they are. So a little bit of a happy ending. And this story has a lot of the common things that we see. First off, the people who perpetrate these scams have no morals; there is no low to which they will not stoop. 

Dave Bittner: [00:09:06]  Yeah. 

Joe Carrigan: [00:09:06]  You know, they went after this 89-year-old woman. They made her feel like her grandson was in trouble. And the second thing that stands out about this story is something we always say. Once they hook you, they're going to try to get as much money out of you as they can, right? Because they've laid that groundwork and they've got a live one, they're going to exploit that resource as much as they can. And this is particularly true in romance scams and extortion scams as well. And they will exploit that resource until the person either runs out of money or wises up. And fortunately, this lady wised up after what is a relatively small amount of money, considering some of the scams we've seen. 

Dave Bittner: [00:09:42]  Yeah. 

Joe Carrigan: [00:09:42]  But it's a little bit bigger than the average for when elderly people are scammed out of money. I think the average was, like, $6,800. So this is an above-average scam. 

Dave Bittner: [00:09:52]  Right. Well, and as we often say, it was her talking to her family... 

Joe Carrigan: [00:09:57]  Right. 

Dave Bittner: [00:09:57]  ...Telling someone else. 

Joe Carrigan: [00:09:58]  Yeah, and that's... 

Dave Bittner: [00:09:59]  That's what slowed it down. 

Joe Carrigan: [00:10:00]  That's the last thing I wanted to say. You know, when you get these kind of calls, you need to verify this first. You know, if this woman had made the call to her child that had the grandson - to that family before she sent the $1,500, she would've known it was a scam immediately. 

Dave Bittner: [00:10:15]  Right. 

Joe Carrigan: [00:10:15]  I don't know that that would've resulted in them catching the crooks because this looks like the crooks do something that is remote first, right? Like, if I - let's see if I can get this lady to send me $1,500 to an address. And if I can, then I'll take the risk of sending somebody to her house. 

Dave Bittner: [00:10:31]  Right. 

Joe Carrigan: [00:10:31]  But if I can't, then I'll just move on to the next person. 

Dave Bittner: [00:10:34]  Yeah. 

Joe Carrigan: [00:10:35]  They might not have been caught, but she wouldn't have been injured the way she was... 

Dave Bittner: [00:10:38]  Yeah. 

Joe Carrigan: [00:10:39]  ...Financially. 

Dave Bittner: [00:10:40]  Yeah, that's interesting. So a twist on a story that we've talked about before. Because with this sort of funeral-type thing, this accident scam, I guess... 

Joe Carrigan: [00:10:49]  Right. 

Dave Bittner: [00:10:49]  ...We could call it. 

Joe Carrigan: [00:10:49]  Yeah, we could... 

Dave Bittner: [00:10:50]  We've heard of that one before, but this is the first time I think we've talked about someone turning the tables. 

Joe Carrigan: [00:10:55]  Right, yeah. 

Dave Bittner: [00:10:56]  Yeah. 

Joe Carrigan: [00:10:56]  It's a great story, I think. 

Dave Bittner: [00:10:57]  Yeah. 

Joe Carrigan: [00:10:57]  I like it a lot. 

Dave Bittner: [00:10:58]  Yeah - happy ending. 

Joe Carrigan: [00:10:59]  That's why I went with it this week, Dave. 

Dave Bittner: [00:11:00]  Yeah, I like it. I like it (laughter). All right, that's a good story. 

Dave Bittner: [00:11:03]  Well, it is time to move on to our Catch of the Day. 

(SOUNDBITE OF REELING IN FISHING LINE)

Dave Bittner: [00:11:10]  Our Catch of the Day comes from a user on Reddit. It's under the category of "Been Going Back and Forth with These A-Holes for a Few Weeks Now. More Pictures in the Comments." So this is the initial attempt to hook somebody with a sad story here. The woman's name is Mireille Faugere, I suppose. And, Joe, it sounds like she is French. 

Joe Carrigan: [00:11:33]  Yeah. 

Dave Bittner: [00:11:34]  So, as we know, I am a master of dialects. So here we go. 

Joe Carrigan: [00:11:38]  All right. 

Dave Bittner: [00:11:39]  (Imitating French accent) Hello. Please, I know this post sounds strange and probably amazing, but that's the reality. My name is Mireille Faugere, born in August 1956, in Tulle, France. I was a consultant in Switzerland, where I had to serve for 19 years. I took the liberty of contacting you because I wanted to do something very important. It seems a little suspicious, even if you don't know me. In fact, I have terminal esophageal cancer. 

Joe Carrigan: [00:12:05]  No. 

Dave Bittner: [00:12:06]  (Imitating French accent) My doctor just informed me that my days are numbered because of my condition. I have had this disease for over four years. I almost sold my business. And some of the money will go to various associations, orphans and homeless care centers, because my spiritual father advised me to honor the memory of my husband and child. I almost sold my business. And some of the money will go to various associations, orphans and homeless care centers, because my spiritual father advised me to honor the memory of my husband and child. I do not know your area of practice, but I would like to help others in your country. I am currently in my hospital bed. I have a sum of 450,000 euros in my personal account. I do not know your area of practice, but I would like to help others in your country. I would like to ask you if you accept that I give you this money that will help you in your projects and also follow my project to build an orphanage that I have in progress. May God's peace and mercy be with you. Honestly, Mireille Faugere. 

Joe Carrigan: [00:13:08]  You know, I wasn't believing it until she said, honestly... 

Dave Bittner: [00:13:10]  (Laughter). 

Joe Carrigan: [00:13:11]  ...Mireille. 

Dave Bittner: [00:13:12]  Well, who would write that without being sincere? 

Joe Carrigan: [00:13:15]  Right. There's a lot of repeated text in this. 

Dave Bittner: [00:13:17]  There is. That's interesting. I wonder why. 

Joe Carrigan: [00:13:19]  I don't know. It sounds like it might be a copy and paste issue. But this is obviously a scam. It's an advance-fee scam. 

Dave Bittner: [00:13:25]  Yep. 

Joe Carrigan: [00:13:25]  You know, we're going to give you all this money, but first, you got to pay to get it. 

Dave Bittner: [00:13:29]  Pulling on the heartstrings. 

Joe Carrigan: [00:13:30]  Yeah. 

Dave Bittner: [00:13:31]  Someone's got terminal cancer. 

Joe Carrigan: [00:13:32]  Though that is terrible. 

Dave Bittner: [00:13:33]  Yeah. They want to build an orphanage, so... 

Joe Carrigan: [00:13:35]  Right. They want it to go to their various (imitating French accent) associations. 

Dave Bittner: [00:13:38]  Yeah, going to good causes. 

Joe Carrigan: [00:13:39]  Which reminds me of Adam Sandler's Cajun Man. 

Dave Bittner: [00:13:42]  (Laughter) Yeah, (imitating Cajun Man) humiliation. 

Joe Carrigan: [00:13:45]  (Imitating Cajun Man) Association. 

Dave Bittner: [00:13:46]  (Imitating Cajun Man) Desperation - yeah, yeah. Yeah, so pretty standard stuff here. But... 

Joe Carrigan: [00:13:50]  Yeah. But I thought the French accent would be nice, Dave. 

Dave Bittner: [00:13:54]  (Laughter) Well, I aim to please. 

Joe Carrigan: [00:13:56]  Yes. 

Dave Bittner: [00:13:56]  All right. Well, that is our Catch of the Day. Coming up next, Carole Theriault returns. She is speaking with Samuel C. Woolley. He's from the University of Texas at Austin. And they're going to be chatting about disinformation campaigns. But first, a word from our sponsors, KnowBe4. 

Dave Bittner: [00:14:13]  And now we return to our sponsor's question about forms of social engineering. KnowBe4 will tell you that where there's human contact, there can be con games. It's important to build the kind of security culture in which your employees are enabled to make smart security decisions. To do that, they need to recognize phishing emails, of course. But they also need to understand that they can be hooked by voice calls - this is known as vishing - or by SMS texts, which people call smishing. See how your security culture stacks up against KnowBe4's free test. Get it at knowbe4.com/phishtest. That's knowbe4.com/phishtest. 

Dave Bittner: [00:14:59]  And we are back. Joe, always great to have Carole Theriault back on our show. This week, she chats with Samuel C. Woolley. He's from the University of Texas at Austin. And they're going to be talking about disinformation campaigns. Here's Carole. 

Carole Theriault: [00:15:12]  So let me introduce you guys to Samuel Woolley, an assistant professor at the University of Texas and project director at the Center for Media Engagement for propaganda research. Now, this past January, Sam's book, "The Reality Game: How the Next Wave of Technology Will Break the Truth," was published. And in it, he explores the ways in which new tech, like deepfakes, can manipulate public opinion. And I was wondering, should we really be concerned about deepfakes? So I thought I'd ask an expert like Sam. Thanks for coming on the show, Sam. 

Samuel Woolley: [00:15:45]  Thanks for having me. 

Carole Theriault: [00:15:46]  This must have been a fascinating book to research. And I bet you learned so much writing it. How long did it take you? 

Samuel Woolley: [00:15:53]  It took me the better part of a year. I spent time writing each morning. Obviously, I missed lots of mornings for various reasons. 

Carole Theriault: [00:16:00]  Right. 

Samuel Woolley: [00:16:01]  But it was a lot of fun to write. Given the topic area, you know, sometimes there's a bit of a challenge. But it was a culmination of six years' worth of research at the University of Washington and then at Oxford. And I'm happy with how it turned out. It really - you know, like, I'd spent a lot of time looking at what have we seen in the past in terms of computational propaganda, or the use of automation and algorithms to manipulate public opinion online, to think about what might be coming next and how we can work to stop it. 

Carole Theriault: [00:16:25]  Yeah. So maybe first, we could start talking about what a deepfake is. So maybe you can explain that to us. 

Samuel Woolley: [00:16:34]  Sure. So a deepfake is a video that's created artificially to make it look as if someone is saying or doing something that they never said. Most of them have actually been created by arts collectives or comedians or other groups. There's so many images of everyone online now, mashed together so that you have all of the different angles and shapes of the mouth, given that there's so many images. You can then overlay audio and make it look like someone's saying something they didn't do. 

Samuel Woolley: [00:17:00]  I'm actually more concerned at this stage right now in 2020 with cheap fakes, which are just basically edited, like, slowed-down or mashed-together videos that are created on, say, iMovie rather than using sophisticated AI. We've seen a lot of cheap fakes actually go viral. And nearly anyone can create them. So examples of that would be the slowed-down Pelosi video. 

Carole Theriault: [00:17:22]  Yeah. And that is almost equally as disruptive - isn't it? - because you can look at those videos and think, oh, wow, and make an opinion about a person. And you'd be completely misguided. 

Samuel Woolley: [00:17:32]  That's right, yeah. And I think it's really easy on social media to make snap judgments. The way that social media is constructed, it allows you just to share something very, very quickly. And we all know that this has been (laughter) an issue in the past for a variety of different reasons. I'm sure that we've all personally experienced sharing something and then having been like, oh, maybe I shouldn't have shared that. 

Carole Theriault: [00:17:53]  (Laughter) Yeah. 

Samuel Woolley: [00:17:53]  Well, with video, and especially with cheap fakes, it makes it very easy to make a snap judgment and then just to share it. Deepfakes are more sophisticated, and they are equally easy to share once they're out in the wild. They're just much more resource-intensive to create. 

Carole Theriault: [00:18:07]  Right. So we're right now in the middle, in the midst of a U.S. election cycle. So are you thinking we're going to see deepfakes ramp up, we're going to see the use of them more, trying to - you know, to try to change people's opinions or misguide them? 

Samuel Woolley: [00:18:24]  I anticipate that we'll continue to see more and more cheap fakes, so selectively edited videos. Since there's so much video content out of both the Democratic front-runners and also Donald Trump, I expect that we'll see a lot of just basic videos that are edited, or even not edited, just to show the person doing or saying something that they've done but in a selective context. 

Carole Theriault: [00:18:44]  And thank you for correcting me on that. So the term du jour is cheap fakes. Now, the thing is people do create memes or satirical videos, which effectively may manipulate original content to be funny - and it's all about, I guess, the intention or how it's received. How do we tell the difference between these two things? 

Samuel Woolley: [00:19:08]  Right. So there's got to be a space for satire and parody. I think that satire and parody are a key part of democratic conversation and of, you know, in the United States what we consider to be the First Amendment, the right to free speech. I think the thing that matters most is intent and transparency. 

Carole Theriault: [00:19:25]  And that's the problem, I guess, there - is that intent is hard to regulate. 

Samuel Woolley: [00:19:28]  Yeah, intent is hard to regulate. But, you know, when it comes to parody and satire, I don't know if we should be regulating things as much. And so if someone spreads, for instance, a video that is meant to be parody or satire but never says who did it, I think it may feel a little bit more toxic. And it gives it the potential for more harm. 

Samuel Woolley: [00:19:46]  And that's very much the case with a lot of the computational propaganda we see. A lot of it is actually anonymous. And it's meant to spread misinformation, which is accidentally spread false information - which often happens downstream of disinformation, which is purposefully spread false information. 

Carole Theriault: [00:20:03]  And do you feel that this onslaught of cheap fakes is going to have the impact of eroding trust between not just individuals and corporations or organizations but between individuals? 

Samuel Woolley: [00:20:16]  Yeah, I think that we've already been seeing that. I think that cheap fakes, as well as deepfakes, are just sort of the next thing in a line of technologies. And these things have led, in some ways, shapes and forms, to more polarization and to a lack of trust amongst particular communities. You need look no further than, for instance, the Brexit campaign in 2016 or the U.S. election in 2016 to see the ways in which powerful political entities like the Russian government made use of manipulative content. It's not necessarily that they're pitting Republicans and Democrats, in the U.S. context, against each other or Labour and Conservatives against each other in the U.K. context but that they're pitting people within their own groups against each other. 

Carole Theriault: [00:21:01]  And how do you feel that that's changed in the last four years? We're now in 2020. We're ramping up for a U.S. election. 

Samuel Woolley: [00:21:08]  Yeah, a social bot is an account that is used to automate someone's social media presence. When I first started studying this stuff in 2013, the bots being used were very clunky. They were used to just automatically drive up likes or retweets or messages. They just massively amplified some content while trying to outshout or suppress other content. We saw a lot of this in the Syrian civil war. We've seen it in Mexico. 

Carole Theriault: [00:21:32]  Right. 

Samuel Woolley: [00:21:33]  Now bots are becoming - the people who use bots are becoming more sophisticated. They are using them in ways that are unexpected and kind of surprising. So we are seeing the emergence now of bots that are built with more machine learning, so they can learn from their environment and change the way that they speak so that now, rather than the bots just being clunky and being used to amplify certain kinds of content, the bots are actually engaging in conversation. There was a misperception in 2016, and even more recently, that there were these armies of bots out there trying to, like, change people's opinions by having in-depth conversations. Now we're seeing those conversational bots - and, often, cyborg accounts that have some human help - actually emerge. 

Carole Theriault: [00:22:08]  Wow. And so, OK, this is the world that we live in now. What advice do you have for individuals like me who want to be able to detect these things and not be duped by them? 

Samuel Woolley: [00:22:20]  So the first step is - there's a number of different tools out there. There's a tool called Botometer. If you're on Twitter, you can plug in any account and determine whether or not it's more or less automated. And Botometer has become something of an industry standard for a lot of people. You can get access to the Botometer API and plug in, you know, 10,000 or 100,000 accounts and determine, like, whether or not there's fake activity going on. Obviously, that takes a bit more coding knowledge. 

Samuel Woolley: [00:22:44]  And then a few other things. The team at First Draft News is doing fantastic work to educate journalists and people in civil society and researchers about how to spot and write about disinformation. And then also, you know, our own team at Texas, we've worked with nearly a hundred newsrooms to date to help train journalists. And we also have a number of different training modules and ethics-oriented documents on our site. 
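The bulk-checking workflow Sam describes might look something like the sketch below. This is hypothetical: the `cap` (complete automation probability) score field and the third-party `botometer` client shown in the comment are assumptions based on Botometer's published documentation, and the 0.8 cutoff is an arbitrary choice, not a recommended setting.

```python
def likely_automated(cap_score: float, threshold: float = 0.8) -> bool:
    """Botometer-style CAP scores fall in [0, 1]; treat anything at or
    above the threshold as probably automated. The cutoff is arbitrary --
    tune it for your own tolerance of false positives."""
    return cap_score >= threshold

def triage_accounts(scores: dict, threshold: float = 0.8) -> list:
    """Given {handle: CAP score}, return the handles worth a closer look,
    sorted for stable output."""
    return sorted(h for h, s in scores.items() if likely_automated(s, threshold))

# Fetching real scores in bulk would go through the Botometer API --
# sketched here with the third-party `botometer` client (not executed,
# requires RapidAPI and Twitter credentials):
#
#   import botometer
#   bom = botometer.Botometer(rapidapi_key="...", consumer_key="...",
#                             consumer_secret="...")
#   for handle, result in bom.check_accounts_in(["@some_account"]):
#       print(handle, result["cap"]["universal"])

# Handles below are made up for illustration.
sample = {"@newsbot9000": 0.93, "@actual_human": 0.12, "@cyborg_acct": 0.55}
print(triage_accounts(sample))  # ['@newsbot9000']
```

Note how a cyborg account with substantial human help (0.55 in the made-up sample) slips under the cutoff, which is exactly the detection problem Sam raises about more sophisticated bots.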

Carole Theriault: [00:23:10]  Well, at least we have a few things that we can do to try and keep ourselves safe from the more manipulative miscreants' cheap fakes out there. Sam Woolley, author of "The Reality Game," thank you so much for sharing your insight with us today. 

Samuel Woolley: [00:23:24]  Thanks for having me. 

Carole Theriault: [00:23:25]  This was Carole Theriault for "Hacking Humans." 

Dave Bittner: [00:23:28]  All right, lots of interesting stuff from Carole and her guest this week. 

Joe Carrigan: [00:23:32]  And Sam, yeah. 

Dave Bittner: [00:23:32]  Yeah. 

Joe Carrigan: [00:23:33]  I'm going to say this again, Dave, and I say it a lot - don't get your political news from social media, OK? 

Dave Bittner: [00:23:39]  (Laughter) Right. 

Joe Carrigan: [00:23:39]  Social media is by no means a valid forum for political discussion. Don't even discuss politics on social media. It's not a productive conversation. 

Dave Bittner: [00:23:48]  All right. 

Joe Carrigan: [00:23:49]  It is a waste of time and effort. This is all my opinion, but I have never once seen a civil political discourse on social media. 

Dave Bittner: [00:23:55]  It does tend to tailspin quickly, doesn't it? 

Joe Carrigan: [00:23:57]  Almost immediately. 

Dave Bittner: [00:23:58]  (Laughter). 

Joe Carrigan: [00:23:58]  It's terrible. I love the term computational propaganda. 

Dave Bittner: [00:24:02]  Me, too. Yeah. 

Joe Carrigan: [00:24:02]  That's a great term. And it's important to note the difference between misinformation and disinformation, disinformation being a deliberate and willful lie and misinformation being spreading information that's incorrect but thinking you're spreading the truth. 

Dave Bittner: [00:24:16]  Oh, I see. 

Joe Carrigan: [00:24:16]  Right? 

Dave Bittner: [00:24:16]  Sure. 

Joe Carrigan: [00:24:17]  That's kind of an important distinction. 

Dave Bittner: [00:24:18]  Yeah. 

Joe Carrigan: [00:24:19]  So I guess you could say that the impetus could be a disinformation campaign, but then it becomes people spreading misinformation, right? 

Dave Bittner: [00:24:26]  Oh, I see. Yup, yup. 

Joe Carrigan: [00:24:27]  Maybe. Cheap fakes are more viral right now than deepfakes because they're more believable - you're actually taking genuine footage of somebody and manipulating it with readily available tools that everybody has access to. 

Dave Bittner: [00:24:40]  Right. 

Joe Carrigan: [00:24:40]  And he talks about the Pelosi video. And I remember seeing that going viral and people really believing that Nancy Pelosi was having some kind of issue. And it turns out it was just somebody had slowed the video down to make her look like she was having some kind of cognitive issue. 

Dave Bittner: [00:24:53]  Right. 

Joe Carrigan: [00:24:53]  And it wasn't genuine at all. 

Dave Bittner: [00:24:55]  It reminds me of - if you've ever watched any reality show, there's this technique that they all use. And usually, it's right before they go to commercial... 

Joe Carrigan: [00:25:03]  Right. 

Dave Bittner: [00:25:03]  ...Where they'll say, who's going to go on to the next round? And then they'll cut to one person, cut to the next person, cut to one person, cut to the next person. And they're just cutting back-and-forth, building tension... 

Joe Carrigan: [00:25:13]  Right. 

Dave Bittner: [00:25:13]  ...Through these cuts. And it is all built through the edit. It is not what happened in real time. 

Joe Carrigan: [00:25:19]  Yeah. 

Dave Bittner: [00:25:20]  They're just stretching out that time to make the tension unbearable. 

Joe Carrigan: [00:25:24]  Right. 

Dave Bittner: [00:25:24]  And it's all in the edit. 

Joe Carrigan: [00:25:26]  It's amazing how effective this interference has been within the political parties. Like, for example, when Mitt Romney voted with the Democrats to convict Trump in his impeachment trial... 

Dave Bittner: [00:25:37]  Right. 

Joe Carrigan: [00:25:37]  ...I saw people who I know voted for Romney when he was up for president calling him a traitor. And on the Democratic side, I have seen such vitriol from Democrats to other Democrats about either Biden or Sanders or Bloomberg or Warren - just attacking each other. 

Dave Bittner: [00:25:54]  Yeah. 

Joe Carrigan: [00:25:55]  It's remarkable. And this is one of the reasons I say that social media is not a valid forum for political discourse. 

Dave Bittner: [00:26:01]  Well, and a lot of these bad actors - they're out there stirring the pot. They're revving people up. 

Joe Carrigan: [00:26:06]  Right. 

Dave Bittner: [00:26:06]  They're encouraging this sort of discourse and disinformation and this corrosiveness. 

Joe Carrigan: [00:26:11]  Yeah, it'll be interesting to see in four years when the Republicans have their own primary without a clear front-runner, like we do this year, what happens in the Republican Party, because I'm going to bet that a very similar thing happens with the Republican Party that is happening right now with the Democratic Party. 

Dave Bittner: [00:26:26]  Yeah. Well, hopefully, we'll learn some lessons between then and now. But I have to say I don't know (laughter). 

Joe Carrigan: [00:26:32]  I don't know either. Yeah, yeah. You know, we as a species tend to be very slow to learn these kind of lessons. 

Dave Bittner: [00:26:37]  Yes, we are a reactive species, for sure. 

Joe Carrigan: [00:26:39]  And we love our tribes. 

Dave Bittner: [00:26:40]  That's right. That's right. 

Joe Carrigan: [00:26:41]  The Botometer is a pretty cool tool. This is from Indiana University. 

Dave Bittner: [00:26:45]  Yeah, let me just give our listeners a little insight here. When we were listening and Samuel mentioned the Botometer, we took a little break here... 

Joe Carrigan: [00:26:54]  Yep. 

Dave Bittner: [00:26:54]  ...Recording our show because Joe had a new toy. 

Joe Carrigan: [00:26:57]  Right, exactly. 

(LAUGHTER) 

Joe Carrigan: [00:26:59]  I was like, oh, what's my Botometer, you know, Botometer score? 

Dave Bittner: [00:27:02]  Right. 

Joe Carrigan: [00:27:02]  And this is on Twitter - @JTCarrigan. I have a Botometer score of about 0.2. 

Dave Bittner: [00:27:08]  Yeah. 

Joe Carrigan: [00:27:08]  And I - so next, I, of course, checked out Dave. And he has a score of about 0.1. 

Dave Bittner: [00:27:14]  Right. 

Joe Carrigan: [00:27:15]  So Dave is, like, 50% as likely to be a bot as I am. 

Dave Bittner: [00:27:19]  I like to think of it as that I'm twice as good as you (laughter). 

Joe Carrigan: [00:27:21]  OK. I like to think I'm twice as good as you. But there are some interesting things in here. I looked for a bot account that we know is a bot account. There's @femtech_... 

Dave Bittner: [00:27:30]  Yeah. 

Joe Carrigan: [00:27:31]  ...Which is a bot account that retweets female developers. 

Dave Bittner: [00:27:34]  Right. 

Joe Carrigan: [00:27:34]  And that only has a bot score of 1.3. 

Dave Bittner: [00:27:37]  Yeah. 

Joe Carrigan: [00:27:38]  And then I randomly picked a couple of my followers, and one has a bot score of 1 out of 5. Another one has a bot score of 2.8 out of 5. And then I picked this podcast promoter who follows me that I don't follow back, and he has a bot score of 4.7. It's interesting. 

Dave Bittner: [00:27:54]  Interesting, yeah. 

Joe Carrigan: [00:27:55]  I mean, I could read the URL out to you, but just go to Google and Google Botometer, B-O-T-O-M-E-T-E-R, and it will take you to an iu.edu webpage. And you can enter a Twitter handle, and it'll tell you how likely it is that handle is run by a bot. 
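The scores Joe and Dave are comparing can be read programmatically too. Here is a small sketch that interprets a Botometer-style score out of 5 as a bot likelihood; the 0.6 threshold is an arbitrary assumption for illustration, not a value Botometer itself publishes.

```python
def interpret_bot_score(score, scale=5.0, threshold=0.6):
    """Turn a Botometer-style score (0 = human-like, `scale` = bot-like)
    into a 0-1 likelihood and a rough label."""
    if not 0 <= score <= scale:
        raise ValueError(f"score must be between 0 and {scale}")
    likelihood = score / scale
    label = "likely bot" if likelihood >= threshold else "likely human"
    return likelihood, label

# The scores mentioned on the show:
print(interpret_bot_score(0.2))  # Joe's account - likely human
print(interpret_bot_score(0.1))  # Dave's account - likely human
print(interpret_bot_score(4.7))  # the podcast promoter - likely bot
```

This also shows why Joe's "50% as likely" quip works: on a linear scale, 0.1 is half of 0.2, even though both accounts sit comfortably in human territory.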

Dave Bittner: [00:28:10]  Yeah, yeah. It's fun. 

Joe Carrigan: [00:28:12]  The last thing that Sam talked about was the training courses he has for journalists. This is of critical importance. One of the biggest problems in journalism today is this phenomenon where something gets run by one outlet and then everybody picks it up because they need to be talking about it to get the clicks. 

Dave Bittner: [00:28:27]  Yeah, right. It just needs to get out there fast. 

Joe Carrigan: [00:28:30]  Right. 

Dave Bittner: [00:28:30]  There's no time to vet things. 

Joe Carrigan: [00:28:32]  It's better to be fast than it is to be accurate, it seems. I wholeheartedly disagree with that. I think it's much better to be accurate than it is to be fast. And there are some organizations out there that do a very good job with their journalism, and then there are some that don't. You have to, as a responsible consumer of this media, take those things into account. You know, when a source you believed to be reputable has repeatedly put up stuff that is not vetted, not factually accurate, maybe you stop consuming that media. 

Dave Bittner: [00:29:01]  Yeah, yeah. I mean, I would regularly see - back when I was on Facebook, I'd regularly see friends who would innocently post something that was from a parody site. 

Joe Carrigan: [00:29:12]  Right. 

Dave Bittner: [00:29:12]  And it would be some ridiculous thing, usually some ridiculous thing that this site said a politician did or said or something. And these friends of mine who did not know it was satire... 

Joe Carrigan: [00:29:24]  Right. 

Dave Bittner: [00:29:24]  ...Posted it as if it were true. 

Joe Carrigan: [00:29:26]  Yeah. 

Dave Bittner: [00:29:26]  And I got to tell you, I - that's one thing I don't miss about being on Facebook... 

Joe Carrigan: [00:29:30]  (Laughter). 

Dave Bittner: [00:29:31]  ...Is having to take the time to correct these folks or, you know... 

Joe Carrigan: [00:29:34]  Yeah, 'cause you feel like it's your responsibility to correct... 

Dave Bittner: [00:29:36]  Right. 

Joe Carrigan: [00:29:37]  ...These people, right? 

Dave Bittner: [00:29:37]  Right. Yeah, you want to do your part. 

Joe Carrigan: [00:29:40]  Yeah, I just unsubscribe. 

(LAUGHTER) 

Joe Carrigan: [00:29:41]  That's what I do. 

Dave Bittner: [00:29:42]  Yeah, yeah. 

Joe Carrigan: [00:29:43]  I let it run its course. I think parody and satire are very important, especially in a free society. 

Dave Bittner: [00:29:48]  Sure. 

Joe Carrigan: [00:29:49]  And I love sites like The Onion and The Babylon Bee. I think they're funny. 

Dave Bittner: [00:29:52]  All right, well, another really great interview from Carole Theriault. Thanks so much, Carole, for bringing that to us. And thanks to Samuel Woolley from the University of Texas at Austin for being our guest this week. That is our show. We want to thank all of you for listening. 

Dave Bittner: [00:30:05]  And, of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can order up at knowbe4.com/phishtest. Think of KnowBe4 for your security training. Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. 

Dave Bittner: [00:30:28]  The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: [00:30:41]  And I'm Joe Carrigan. 

Dave Bittner: [00:30:42]  Thanks for listening.