Hacking Humans 5.2.19
Ep 47 | 5.2.19

Twitter bots amplifying divisive messages.


Andy Patel: [00:00:00] Typically, a tweet will get more than twice as many likes as it will get retweets. I think people have a tendency to see retweets as endorsements, so they're more likely to press like than they are to press retweet.

Dave Bittner: [00:00:13] Hello everyone and welcome to the CyberWire's "Hacking Humans" podcast. This is the show where each week we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.

Joe Carrigan: [00:00:32] Howdy, Dave.

Dave Bittner: [00:00:33] We've got some fun stories to share this week. And later in the show, we have my interview with Andy Patel from F-Secure. He's looking at ways that bots on Twitter are amplifying divisive messages.

Dave Bittner: [00:00:43] But first, we've got a word from our sponsors at KnowBe4.

Dave Bittner: [00:00:46] But before we get to that, I just want to remind everyone that we've got a special live version of our "Hacking Humans" show that's going to be coming up at the KB4-CON in Orlando, Fla. That's the KB4 - stands for KnowBe4, our sponsors. And that is May 8 through 10. We'll be live on stage doing our show. We'll have some special guests, including Stu Sjouwerman. He's KnowBe4's CEO. And we're also going to have Kevin Mitnick. He is one of the best-known hackers in the world. We'd love to see you all there. And you can find out all about the KB4-CON and our live show on KnowBe4's website.

Dave Bittner: [00:01:20] And speaking of KnowBe4, here's a word from them. So what's a con game? It's fraud that works by getting the victim to misplace their confidence in the con artist. In the world of security, we call confidence tricks social engineering. And as our sponsors at KnowBe4 can tell you, hacking the human is how organizations get compromised. What are some of the ways organizations are victimized by social engineering? We'll find out later in the show.

Dave Bittner: [00:01:53] And we are back. Joe, we've got a couple of bits of follow-up this week.

Joe Carrigan: [00:01:56] Awesome. I love follow-up.

Dave Bittner: [00:01:57] And both of them have to do with our story about Google searches bringing up fake ads...

Joe Carrigan: [00:02:01] Yeah.

Dave Bittner: [00:02:02] ...That redirect people to scammer sites. And the first one came from a listener named Tim (ph). He wrote in and he said (reading) just listened to your "Hacking Humans" podcast. I want to share a small correction regarding the scammers that squat on search results for real company phone numbers. They don't just show up in the ads. The fake numbers show up convincingly for most people even in the organic search results.

Joe Carrigan: [00:02:22] Right. They're using something here called search engine optimization, which is a technique you can pay for that kind of games the system...

Dave Bittner: [00:02:29] Right.

Joe Carrigan: [00:02:29] ...At Google or whatever. Does anybody use Lycos anymore?

Dave Bittner: [00:02:32] I don't know. Bing, I guess, is out there and DuckDuckGo.

Joe Carrigan: [00:02:34] DuckDuckGo - and that helps bubble their webpage to the top of the search results.

Dave Bittner: [00:02:40] Right. And it's a good investment for them to make.

Joe Carrigan: [00:02:42] Right because it drives traffic to their scamming numbers.

Dave Bittner: [00:02:46] Yeah. I - it was a good clarification, so thanks, Tim, for sending that in. It's not just the ads. You need to be...

Joe Carrigan: [00:02:51] Wary of the search results.

Dave Bittner: [00:02:52] Yeah, absolutely, absolutely. We had another listener named Mel (ph) write in, and he wrote in to share a similar story to the one we shared. He was looking for a user manual for an old bit of networking hardware that he was trying to use. He did a search and a phone number came up, and he called and the person on the other end of the phone was perfectly willing to sell him that user manual. They would send it to him on a CD for 40 bucks or they let him download it for 10 bucks. And I'll pick up here where he wrote in. He said (reading) I decided to go with the download option. He proceeded to give me a remote desktop support website's address...

Joe Carrigan: [00:03:26] (Laughter).

Dave Bittner: [00:03:27] ...And asked me to give him the number displayed on my screen. At this point, all my spidey senses came rushing in.

Joe Carrigan: [00:03:33] Right?

Dave Bittner: [00:03:34] I asked him to give me the download URL and I will download it myself. He proceeded to tell me it was not that simple, at which point I saw the light that they were a scam operation.

Joe Carrigan: [00:03:44] Yes.

Dave Bittner: [00:03:45] Good for Mel for his spidey sense going off.

Joe Carrigan: [00:03:47] Yeah. Those websites should be a dead giveaway that you're talking to somebody - what these websites do is they allow somebody to take control of your computer. Let's say our parents call us. We could use these websites to go ahead and help our parents remotely. I have never used these services. Frankly, I have no reason to trust them. I don't use them. But when a scammer wants to take over your computer, this is going to be one of the first things they do. Just let me have remote access to your computer, and then it's just like they're sitting at your desk. And that's essentially physical access. And we have an old saying - physical access is root access, which essentially means I can do whatever I want.

Dave Bittner: [00:04:20] Yeah, well...

Joe Carrigan: [00:04:21] So don't do that.

Dave Bittner: [00:04:21] No, no, absolutely. Well, good for Mel and...

Joe Carrigan: [00:04:24] Yeah. Mel, congratulations. You avoided being scammed here. That's definitely what was going on there.

Dave Bittner: [00:04:28] Yeah, good advice for everybody. Well, let's move on to our stories. I'm going to kick things off this week. I have a story from the CBC out of Canada. And this is from the city of Ottawa. And it's about the city of Ottawa's treasurer being tricked into wiring more than $100,000 to a fraudster. Her name is Marian Simulik. And she got an email from the city manager, whose name is Steve Kanellakos. And the email asked her to pay a city supplier $97,000.

Joe Carrigan: [00:04:58] Right.

Dave Bittner: [00:04:58] She searched the internet for the IT supplier, and she assumed that the payment had something to do with updating the Ottawa website.

Joe Carrigan: [00:05:06] Right.

Dave Bittner: [00:05:06] And she had a few back-and-forth emails with the city manager, who was not actually the city manager.

Joe Carrigan: [00:05:11] Yeah. This was somebody who had compromised the email.

Dave Bittner: [00:05:12] It was the scammer. And so she wired the money. And then time passes, and she receives another email from the city manager, this one asking for $150,000 to go to the same supplier. However, this email arrived while they were in the midst of a city council meeting. So she went over and asked the real city manager about the request.

Joe Carrigan: [00:05:35] Right.

Dave Bittner: [00:05:35] And of course, he had no idea what she was talking about.

Joe Carrigan: [00:05:37] Sure.

Dave Bittner: [00:05:38] So now the jig is up.

Joe Carrigan: [00:05:40] But how much time passed between the two events?

Dave Bittner: [00:05:42] Oh, not very long.

Joe Carrigan: [00:05:43] OK.

Dave Bittner: [00:05:43] A few days. A few days. It's interesting - she went to the local police. The local police really weren't able to do much.

Joe Carrigan: [00:05:49] No, they're not going to be able to do anything. You're going to have to go to the federals here.

Dave Bittner: [00:05:52] Right. And the feds did track it down. The money was sent to the United States, and it was transferred from bank to bank to bank. But they do think they're going to be able to get it back...

Joe Carrigan: [00:06:01] Oh, that's good.

Dave Bittner: [00:06:01] ...In this case. Yeah.

Joe Carrigan: [00:06:02] That's good.

Dave Bittner: [00:06:02] I thought it was interesting - a couple of things to note here. She made a personal statement, the woman who fell victim to this. She said, that I should be the target and victim of this sophisticated attack has affected me deeply both professionally and personally.

Joe Carrigan: [00:06:14] Right.

Dave Bittner: [00:06:15] She's been at this job for 28 years, and she takes her responsibility and professional stewardship of taxpayers' money very seriously.

Joe Carrigan: [00:06:24] Yeah.

Dave Bittner: [00:06:24] So you can see how falling for this, that could be a hard hit to someone.

Joe Carrigan: [00:06:28] Right, absolutely. Nothing like this has probably ever happened in her career, and now she has fallen victim to a scam that has cost the taxpayers of Ottawa - was it?

Dave Bittner: [00:06:36] Yep.

Joe Carrigan: [00:06:36] Ninety-seven thousand dollars.

Dave Bittner: [00:06:37] Right.

Joe Carrigan: [00:06:38] She might view that as a breach of trust; I don't know that I'd be so hard on her. This is some malefactor going after you because you're an easy target, and you're falling for a scam that is not uncommon. And you're falling for it for a large amount, but by no means is this close to the largest amount we've seen.

Dave Bittner: [00:06:52] No. Now, it's interesting - the city has taken some measures to avoid this sort of thing in the future. They have enabled automatic warnings when emails come in from an external source.

Joe Carrigan: [00:07:02] So this was somebody spoofing the email from outside?

Dave Bittner: [00:07:04] Who knows? They may have actually gotten into the city manager's actual email account. But this is a good thing - having warnings when an email comes from an external source can't hurt. And also, they put a system in place where no employee has the ability to both create and approve a wire transfer.

Joe Carrigan: [00:07:20] Right.

Dave Bittner: [00:07:21] So that's great.

Joe Carrigan: [00:07:22] Yes.

Dave Bittner: [00:07:22] Got to get a second pair of eyes...

Joe Carrigan: [00:07:24] Absolutely.

Dave Bittner: [00:07:24] ...On transferring any money. And also, they are having mandatory cyber awareness training for the city staff.

Joe Carrigan: [00:07:31] Yes.

Dave Bittner: [00:07:32] So all good things.

Joe Carrigan: [00:07:32] Money well-spent. All those things are excellent. It's good to hear that. I think it's far less likely that the city of Ottawa will have a similar event. I'm not saying it's unlikely; it's just less likely than it was two weeks ago.

Dave Bittner: [00:07:43] Yeah, it's a hard lesson to learn.

Joe Carrigan: [00:07:44] It is.

Dave Bittner: [00:07:44] And hopefully they'll get most of the money back.

Joe Carrigan: [00:07:47] I hope they get the money back.

Dave Bittner: [00:07:48] All right. Well, that's my story. Joe, what do you have for us this week?

Joe Carrigan: [00:07:50] Well, my story's kind of similar, but it's actually more of a macro look at things. Lindsey O'Donnell over at Threatpost is talking about the FBI releasing the internet crime report from the Internet Crime Complaint Center...

Dave Bittner: [00:08:03] Oh, OK. Yeah.

Joe Carrigan: [00:08:03] ...IC3, for 2018. And business email compromise in 2018, which is what we were talking about in your story...

Dave Bittner: [00:08:09] Yeah.

Joe Carrigan: [00:08:10] ...Has cost victims $1.2 billion in 2018.

Dave Bittner: [00:08:15] Hm, OK.

Joe Carrigan: [00:08:16] And that is up from a paltry $675 million in 2017.

Dave Bittner: [00:08:19] So almost doubled.

Joe Carrigan: [00:08:20] Almost doubled.

Dave Bittner: [00:08:21] Yeah, wow.

Joe Carrigan: [00:08:22] Other scams - like extortion, tech support fraud, payroll diversion, those kind of things - have increased as well. In 2018, the FBI received over 350,000 complaints with total losses exceeding $2.7 billion.

Dave Bittner: [00:08:37] Wow.

Joe Carrigan: [00:08:37] That is an average of $7,700 per incident. Now, what is missing from this report and what I'd like to see is the median.

Dave Bittner: [00:08:45] OK.

Joe Carrigan: [00:08:45] The average - the mean - is just something you can derive from these numbers, right? But the median - I'd like to know where the median falls in this, and the FBI doesn't publish that. So I'd like to see that statistic there.

Dave Bittner: [00:08:56] Yeah.

Joe Carrigan: [00:08:56] Folks over at the FBI, if you're listening.

Dave Bittner: [00:08:58] (Laughter) I'm sure they'll get right on it for us, yeah.

Joe Carrigan: [00:08:59] Right, yeah. But here's what's interesting - in 2017, the total loss was about half of that, like $1.4 billion, but the number of complaints was still around 300,000 - so pretty close - meaning that the average amount lost has gone from $4,600 in 2017 up to $7,700 in 2018, which means these criminals are realizing that these things work.

Dave Bittner: [00:09:26] Yeah.

Joe Carrigan: [00:09:26] And they're going after bigger amounts.

Dave Bittner: [00:09:28] So they're getting bigger hits.

Joe Carrigan: [00:09:29] They're getting bigger hits.

Dave Bittner: [00:09:30] OK.
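The arithmetic behind Joe's figures is easy to sanity-check, and a toy example shows why he keeps asking for the median: a handful of enormous losses can drag the mean well above what a typical victim lost. A minimal sketch - the per-incident averages come from the totals quoted above, while the individual loss figures are made up purely for illustration:

```python
from statistics import mean, median

# Back-of-the-envelope check of the per-incident averages quoted above.
avg_2018 = 2_700_000_000 / 350_000  # roughly $7,700 per complaint
avg_2017 = 1_400_000_000 / 300_000  # roughly $4,600 per complaint

# Why the median matters: hypothetical losses where one big hit
# skews the mean far above the typical complaint.
losses = [500, 800, 1_200, 2_000, 3_500, 250_000]
print(round(mean(losses)))  # 43000 - dominated by the outlier
print(median(losses))       # 1600.0 - closer to the "typical" loss
```

With numbers skewed like these, reporting only the mean would suggest losses in the tens of thousands even though most complaints in the toy list are under $4,000.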

Joe Carrigan: [00:09:30] Yep. Business email compromise is growing, and we heard about that story that you just told.

Dave Bittner: [00:09:34] Yeah.

Joe Carrigan: [00:09:35] Also, personal accounts are being compromised, vendor emails are being compromised, spoofed lawyer emails, requests for W-2s, and - we've talked about this before - the targeting of the real estate sector.

Dave Bittner: [00:09:45] Right, right.

Joe Carrigan: [00:09:46] Right? Because that's a place where a lot of money changes hands.

Dave Bittner: [00:09:49] Big money, yeah.

Joe Carrigan: [00:09:49] Big money. And people there are not so focused on security, and it's a very dispersed environment - there's not a lot of control over the environment in brokerage offices and things of that nature.

Dave Bittner: [00:09:59] Yeah. And that's a disorienting day, the day you buy a house.

Joe Carrigan: [00:10:02] Yes, it is.

Dave Bittner: [00:10:03] There's just a lot going on.

Joe Carrigan: [00:10:04] Yes, it is.

Dave Bittner: [00:10:04] So you can understand - yeah.

Joe Carrigan: [00:10:05] My policy on this is if you're going to buy a house or sell a house, that you tell everybody upfront, I'm not wire transferring anything; I will have cashier's checks for everything that's going to happen. OK?

Dave Bittner: [00:10:16] Mmm hmm. Mmm hmm.

Joe Carrigan: [00:10:17] That way you can't get scammed by somebody saying...

Dave Bittner: [00:10:20] Right (laughter).

Joe Carrigan: [00:10:20] ...OK, here's the new details for the account to wire the money to.

Dave Bittner: [00:10:22] I'm going to wheel in a wheelbarrow full of rolled quarters.

Joe Carrigan: [00:10:25] Right.

Unknown: [00:10:27] (LAUGHTER)

Dave Bittner: [00:10:27] To buy this house, yeah.

Joe Carrigan: [00:10:28] I'll bet you can lift a wheelbarrow full of rolled quarters (laughter).

Dave Bittner: [00:10:30] Probably - I don't know. Probably not. Goals, Joe. Goals.

Joe Carrigan: [00:10:34] Goals, right.

Dave Bittner: [00:10:35] (Laughter).

Joe Carrigan: [00:10:36] Gift cards are a big part of the scam. I was talking at an event with the CASH Campaign of Maryland a couple of days ago.

Dave Bittner: [00:10:42] Hm, what's that?

Joe Carrigan: [00:10:42] That's a group of people that provide services to low- and middle-income people here in Maryland, and they provide, like, financial advising services and tax preparation services.

Dave Bittner: [00:10:51] Oh, I see. Yeah, yeah.

Joe Carrigan: [00:10:52] And they help people start small businesses, as well.

Dave Bittner: [00:10:54] Oh, nice.

Joe Carrigan: [00:10:55] Which is great. But one of the things I said that they should know and that everybody they work with should know is that when somebody is asking for some kind of payment in gift cards, that is a scam. That should be a huge red flag. I mean, there should just be red flags everywhere whenever somebody says, hey, you can pay me with gift cards, or, I need you to pick up gift cards. Nobody accepts gift cards as a form of payment.

Dave Bittner: [00:11:13] Right.

Joe Carrigan: [00:11:13] But Lindsey O'Donnell talked to a researcher from Agari. And he said that the gift card scam usually nets about 70% of the value of the gift card in Bitcoin, which, I think, is an impressive amount. I mean, if I scam somebody out of a hundred dollars, and I get $70 for that, that's pretty good return on my scam, I would think. I would've thought it would've been less.

Dave Bittner: [00:11:35] So the cost of the - I guess the laundering of the money, the conversion to Bitcoin...

Joe Carrigan: [00:11:39] Right.

Dave Bittner: [00:11:40] ...Is about 30%.

Joe Carrigan: [00:11:41] Thirty percent.

Dave Bittner: [00:11:42] Interesting.

Joe Carrigan: [00:11:42] The FBI report talks about ages as well.

Dave Bittner: [00:11:45] OK.

Joe Carrigan: [00:11:46] Let's talk about that. Who do you think was the biggest target and lost the most money, if you were going to pick an age group?

Dave Bittner: [00:11:52] Off the top of my head, I would say the elderly.

Joe Carrigan: [00:11:54] That's right - 60-plus. I thought about, why is that the case? And the answer is they're actually the people with the most money, I think. The other problem is they're also the most vulnerable. If you're a scammer and you're fortunate enough to get a hold of somebody who has, like, the beginning part of dementia and may not even be aware of it - they may not have been diagnosed with it.

Dave Bittner: [00:12:10] Right.

Joe Carrigan: [00:12:11] I'm sure you can clean that person out. Yeah, that makes you successful. But that also makes you a horrible person.

Dave Bittner: [00:12:15] Yeah, yeah.

Joe Carrigan: [00:12:16] I - the FBI report - we'll put a link in the show notes. It's an interesting report - a lot of statistics. I'd like to see the median, I really would.

Dave Bittner: [00:12:22] Yeah. All right. Well, we'll send out the bat signal to the FBI.

Joe Carrigan: [00:12:25] (Laughter).

Dave Bittner: [00:12:25] I'm sure they'll get right on that for you, Joe (laughter).

Joe Carrigan: [00:12:27] I know. They're over there right now...

Dave Bittner: [00:12:29] So many listeners to our show from the FBI. All right. Well, good story - it's time to move on to our Catch of the Day.


Dave Bittner: [00:12:39] Joe, our Catch of the Day this week comes from a listener named Ian. And Ian uses a dating app called Grindr. Are you familiar with Grindr, Joe?

Joe Carrigan: [00:12:47] I know what it is.

Dave Bittner: [00:12:47] All right. Well, Grindr is an app for men who are looking for dates with other men.

Joe Carrigan: [00:12:51] Yep.

Dave Bittner: [00:12:51] And apparently, it has not escaped the attention of scammers. And Ian sent us a transcript of a chat session that he had with someone who reached out to him looking for a date on Grindr. Now, before we dig in here, Joe, are you familiar with the term sugar daddy?

Joe Carrigan: [00:13:06] Yes. I think I'm familiar with it.

Dave Bittner: [00:13:08] What do you suppose it is?

Joe Carrigan: [00:13:09] So if I'm someone's sugar daddy...

Dave Bittner: [00:13:12] Right.

Joe Carrigan: [00:13:12] I buy them a lot of nice things.

Dave Bittner: [00:13:14] Right. You take care of them.

Joe Carrigan: [00:13:15] Yes.

Dave Bittner: [00:13:16] You look out for them.

Joe Carrigan: [00:13:17] Right.

Dave Bittner: [00:13:17] You're someone who's well-off, who - a person of means.

Joe Carrigan: [00:13:19] A person of means, yes.

Dave Bittner: [00:13:20] And you're looking - I suppose in a dating situation, you'd be looking for companionship.

Joe Carrigan: [00:13:25] Right.

Dave Bittner: [00:13:25] In exchange for that companionship, you would be able to provide...

Joe Carrigan: [00:13:29] Diamonds and such.

Dave Bittner: [00:13:30] ...All the comforts of easy living for that person.

Joe Carrigan: [00:13:32] Yes.

Dave Bittner: [00:13:32] So, Joe, you're going to play the part of Ian...

Joe Carrigan: [00:13:35] OK - our listener.

Dave Bittner: [00:13:35] ...Who is the person - he's the one who sent it in.

Joe Carrigan: [00:13:37] Right.

Dave Bittner: [00:13:37] And I will be daddy Kevin, the person who's trying to scam Ian. And it seems like Ian was onto this scam pretty quickly. And it goes like this. Hi. How are you doing today?

Joe Carrigan: [00:13:48] I'm good, thanks. How are you, sir?

Dave Bittner: [00:13:50] Great. What are you looking for?

Joe Carrigan: [00:13:52] Whatever, really. I don't have high expectations here. What about you?

Dave Bittner: [00:13:56] Looking for a sugar baby who's going to be honest and straightforward. And I'm going to take good care of him with my money and care.

Joe Carrigan: [00:14:03] I don't need the money, but I'm all about the care.

Dave Bittner: [00:14:05] Okay. Can you run errands for me?

Joe Carrigan: [00:14:07] Yeah, sure. Like what?

Dave Bittner: [00:14:09] Like getting me Apple store gift cards. And I'm going to be paying you $200 as your daily allowance. Honey, I have an online investment, so I change Apple gift cards to Bitcoin to invest on it, honey. I have so much on the investment, and I can't afford to lose so much dollars at this point.

Joe Carrigan: [00:14:25] Yeah. How can I help you out, daddy?

Dave Bittner: [00:14:28] As a sugar daddy, my main responsibility is making sure you're financially stable. So I'll be giving you an allowance of $200 every four days, which will make about $1,000 every month. I guess that's OK by you. What institution do you bank with?

Joe Carrigan: [00:14:42] Deutsche - that works for me.

Dave Bittner: [00:14:44] German bank.

Joe Carrigan: [00:14:45] Of course - they are good at hiding money. Don't worry. The FBI won't find out.

Dave Bittner: [00:14:50] (Laughter) And that's where it ends.

Joe Carrigan: [00:14:51] (Laughter).

Dave Bittner: [00:14:51] So clearly, Ian was onto this pretty much from the get-go. I would say the part with the Apple store gift cards probably tipped him off (laughter).

Joe Carrigan: [00:14:59] Right, or probably just being approached by somebody who says they want to be a sugar daddy, right?

Dave Bittner: [00:15:03] Yeah.

Joe Carrigan: [00:15:04] I mean...

Dave Bittner: [00:15:04] Well, interesting, too, that - I mean, along with this, as these things happen, came some photos of...

Joe Carrigan: [00:15:08] Yeah.

Dave Bittner: [00:15:08] ...I got to say of a very handsome, dashing-looking man.

Joe Carrigan: [00:15:11] Yeah. This is not the guy sending you the pictures or sending you the messages.

Dave Bittner: [00:15:14] Willing to bet it's not.

Joe Carrigan: [00:15:16] Right.

Dave Bittner: [00:15:16] No (laughter). So all right. Well, that is our Catch of the Day. Thanks to Ian for sending this in. That's a fun one. Coming up next, we've got my interview with Andy Patel from F-Secure. He looked into ways that bots on Twitter are amplifying divisive messages.

Dave Bittner: [00:15:30] But first, a word from our sponsors at KnowBe4.

Dave Bittner: [00:15:36] And now we return to our sponsor's question about forms of social engineering. KnowBe4 will tell you that where there's human contact, there can be con games. It's important to build the kind of security culture in which your employees are enabled to make smart security decisions. To do that, they need to recognize phishing emails, of course. But they also need to understand that they can be hooked by voice calls - this is known as vishing - or by SMS texts, which people call smishing. See how your security culture stacks up against KnowBe4's free test. Get it at knowbe4.com/phishtest. That's knowbe4.com/phishtest.

Dave Bittner: [00:16:23] And we are back. Joe, I recently had the pleasure of speaking with Andy Patel. He is a researcher over at security company F-Secure. And he's done some really interesting work looking into the ways that bots on Twitter are being used to amplify divisive messages. Here's my talk with Andy Patel.

Andy Patel: [00:16:41] I saw a couple of troll accounts while I was browsing through - probably a search of Brexit. And I had a look at the account, and then I started having a look at who was following the account. And I sort of noticed a lot more of these very new - some, like, egg accounts - whose descriptions looked similar in certain ways, with lots of use of similar hashtags. So I had this idea to just sort of crawl that space - seed my crawler with a couple of these nobody accounts that I saw actually trolling. And as I crawled the accounts that were following those, I would just add to the queue anything that matched certain keywords and self-identified within this group - so basically hashtags and things in the description field or in the name field. I let that crawler run for a couple of days, and it identified about 250,000 accounts and all the details of those.
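The crawl-and-filter approach Andy describes can be sketched roughly like this. To be clear, this is a hypothetical reconstruction, not his actual code: the keyword list, record fields, and the two fetch callbacks (`get_account`, `get_follower_ids`) are stand-ins for whatever his crawler used against Twitter's API.

```python
from collections import deque

# Hypothetical keywords; the real crawl keyed on hashtags and phrases
# seen in the bios and names of the seed troll accounts.
KEYWORDS = {"#brexit", "#wto", "patriot"}

def self_identifies(account):
    """True if the bio or display name contains any target keyword."""
    text = (account.get("description", "") + " " + account.get("name", "")).lower()
    return any(k in text for k in KEYWORDS)

def crawl(seed_ids, get_account, get_follower_ids, limit=250_000):
    """Breadth-first walk of the follower graph, keeping (and expanding)
    only accounts that self-identify with the group via keywords."""
    queue = deque(seed_ids)
    seen = set(seed_ids)
    matched = []
    while queue and len(matched) < limit:
        account = get_account(queue.popleft())
        if not self_identifies(account):
            continue  # don't expand accounts outside the group
        matched.append(account)
        for follower_id in get_follower_ids(account["id"]):
            if follower_id not in seen:
                seen.add(follower_id)
                queue.append(follower_id)
    return matched
```

In Andy's run, a couple of seed troll accounts and a few days of crawling like this were enough to surface about 250,000 matching accounts.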

Andy Patel: [00:17:36] So once I had that, obviously, that's a lot of accounts, so I decided to filter it in different ways. And I actually found about 2,000 accounts that were really new. So they were created in March. Actually, I did this research right at the end of March. So they were created in March of 2019 or after. And I decided to just see what these accounts were doing. So Twitter has a streaming API call where you can give it a list of account IDs, and it will follow those accounts. It will return you any tweet that those accounts publish or any tweet where those accounts are mentioned. So I left that running for a couple of days to see what those accounts were doing.

Andy Patel: [00:18:16] And it turned out that they were actually heavily amplifying just a few other accounts, which were also very new. Well, they were actually amplifying a lot of sort of big name pro-Brexit and, like, sort of a U.S. right-wing accounts. But if you strip those away - and, again, I just looked at which accounts they were amplifying that were new. I found a small handful - I think a dozen or so - that included, like, an account that was pretending to be somehow affiliated with UKIP and accounts that had, you know, within the few weeks of them being alive tweeted 10,000 times and stuff like that. So it was, to me, pretty obvious that these are, like, nonauthentic behaviors. And so I just decided to write it up. And it made it sort of an interesting bit of research.

Dave Bittner: [00:18:59] With the filtering that you did here, what did these accounts have in common? Do they come from the same source? Are they retweeting each other? Is it a circular type of thing? What patterns came out of your research?

Andy Patel: [00:19:09] Those accounts that were being amplified - to be honest, a lot of the very new accounts in that list were mostly just retweeting. If you'd looked at the 2,000 accounts themselves, out of the last 200 tweets they published, you know, 190 or more would be retweets. Certainly, many of those 2,000 accounts were publishing original tweets, meaning that to a certain extent they were being controlled by actual human operators.

Dave Bittner: [00:19:34] Was there anything that came out of your research that was particularly surprising?

Andy Patel: [00:19:38] I was actually surprised about the whole thing, to be honest. You know, it's a lot of hit and miss trying to understand how to see underneath the most amplified content on the platform. If I collect a stream of tweets and then look at the data, I'm going to, like, count things. I'm going to count, OK, how many times did I see this hashtag? How many times did I see this user tweeting? How many times was this user retweeted or replied to or quoted or mentioned?

Andy Patel: [00:20:04] And, of course, those lists are very long if you collect data, even for a day or two. It's sort of impossible to look through the whole list. So what I tend to do is just list the top 50 of something. But that's sort of the top of everything. So there you'll see the big celebrities getting tweeted a lot. You'll see the accounts that are very, very active that retweet a lot. You'll see accounts that are very, very active and just publish a lot of tweets - typically quite a similar list to the one that retweets a lot, and very different from the list of accounts that get amplified. But, like, to sort of get underneath that top layer to that second layer - that's why I sort of talked about hidden amplification, because it's, like, amplification underneath the regular amplification that you see.
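The counting Andy describes - tallying hashtags and retweet targets over a captured stream, then peeling back the obvious top layer - might look something like this in outline. The tweet field names here are simplified assumptions for illustration, not Twitter's actual payload schema:

```python
from collections import Counter

def amplification_counts(tweets):
    """Tally hashtag usage and who gets retweeted in a captured stream."""
    retweeted = Counter()
    hashtags = Counter()
    for tweet in tweets:
        for tag in tweet.get("hashtags", []):
            hashtags[tag.lower()] += 1
        target = tweet.get("retweet_of")
        if target:
            retweeted[target] += 1
    return retweeted, hashtags

# A tiny made-up capture: two retweets of a big account, one of a new one.
stream = [
    {"hashtags": ["Brexit"], "retweet_of": "big_celebrity"},
    {"hashtags": ["brexit", "WTO"], "retweet_of": "big_celebrity"},
    {"hashtags": [], "retweet_of": "brand_new_account"},
    {"hashtags": ["Brexit"]},
]
retweeted, hashtags = amplification_counts(stream)
print(retweeted.most_common(50))  # the visible "top of everything"
```

Getting at the hidden amplification then means filtering that tally down - for instance, to accounts created in the last few weeks - before ranking again.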

Dave Bittner: [00:20:45] What's your advice for people who are out there trying to use a platform like Twitter in good faith? Are there any red flags in terms of engagement for folks to know if they're dealing with a real person on the other end or if it's one of these people, systems, bots, whatever, that are trying to just amplify things?

Andy Patel: [00:21:03] Yeah, that requires some diligence. It requires that if you see something clickbait-y (ph) and exciting that you actually check other sources to see if it's true. Look at the account itself. Look at, you know, when it was created. Look out for signs that it looks - you know, that it might be a bit suspicious, like having published tens of thousands of tweets. Or, you know, scroll down its timeline. See if it's all just retweets. There are lots of ways of, you know, just eyeballing an account and sort of getting an idea of how valid it is, how real it is. And the problem is that a vast majority of the accounts you'll find on the platform are a little bit dodgy looking.

Andy Patel: [00:21:42] Obviously, yeah, when you come to how you follow accounts, I tend to keep my following list fairly short because I don't want to be inundated with the wrong kind of content. I want to get, like, a broad cross-section of, like, different topics. And so what I'll do sometimes is unfollow some accounts and follow some others just to get the balance right. But I don't, like, automatically follow back any account that follows me. And I know that that might be a tendency at the very beginning when you first join, but there are plenty of these automated accounts that actually just randomly follow new users in hopes that those users will follow them back. And if you are a new user, you might be like, oh, thanks for following. I'll follow you back. You know, I need to up my follower count.

Dave Bittner: [00:22:25] Right.

Andy Patel: [00:22:25] And those might be, you know, things like porn bots and stuff like that. And they will eventually unfollow you anyway, so there's no point in following them back. But that's one mechanism for those accounts to build up followers and start to look more legitimate. I tend to look for the verified tick next to accounts that are labeling themselves as news sources. And that way, you're going to get the BBCs and The Guardian and actual, real news sources. There are plenty of Twitter accounts that make themselves look like news sources. But then if you look at the actual username, it has numbers at the end of it, or it's odd in some way.

Dave Bittner: [00:23:03] They try to inject this emotional component to get a reaction from you.

Andy Patel: [00:23:08] Yeah. Lots of people share clickbait-y headlines, even if they didn't read the article. I think that's, you know, a given, and most people know that. But also, I mean, there has been a tendency for high-profile Twitter users to share information that wasn't properly fact-checked. I'll put it that way. And, of course, you know, those accounts have plenty of followers, and those followers are probably just going to retweet that. And sooner or later, you'll see that whole thing amplified. And if you see a tweet with hundreds or thousands of retweets and likes, you might think it's real...

Dave Bittner: [00:23:45] Right.

Andy Patel: [00:23:45] ...Without actually noticing that it's an account that somehow engineered those retweets and likes. Typically, a tweet will get more than twice as many likes as it will get retweets. I think people have a tendency to see retweets as endorsements, so they're more likely to press like than they are to press retweet.
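[Editor's note: Andy's like-to-retweet observation suggests a rough sanity check on engagement numbers. The function and the 2:1 threshold below are illustrative assumptions drawn from his remark, not a validated detection rule.]

```python
def looks_amplified(likes: int, retweets: int, ratio: float = 2.0) -> bool:
    """Flag a tweet whose like-to-retweet ratio falls below the norm.

    Per the observation above, organic tweets typically draw more than
    twice as many likes as retweets; a tweet whose retweets rival or
    exceed its likes may have had those retweets engineered.
    """
    if retweets == 0:
        return False  # nothing was amplified
    return likes / retweets < ratio

# A tweet with 300 likes but 900 retweets inverts the usual ratio.
print(looks_amplified(300, 900))
# A tweet with 900 likes and 300 retweets looks organic.
print(looks_amplified(900, 300))
```

In practice the ratio varies by audience and topic, so a check like this would only flag candidates for a closer look, not prove manipulation.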

Dave Bittner: [00:24:04] Joe, what do you think - lots of stuff going on there, huh?

Joe Carrigan: [00:24:06] Yeah. Dave, if you get your news from social media, you are being played, right?

Dave Bittner: [00:24:12] (Laughter) OK.

Joe Carrigan: [00:24:12] This is one of the things I evangelize about.

Dave Bittner: [00:24:15] Yeah.

Joe Carrigan: [00:24:15] Regardless of which side of the aisle you're on, the point of these bots is to sow discord in the target country.

Dave Bittner: [00:24:21] Right.

Joe Carrigan: [00:24:21] There are Russian bots on the left and the right, and the point is just to make us hate each other more.

Dave Bittner: [00:24:27] Yeah.

Joe Carrigan: [00:24:27] That's all they're trying to do.

Dave Bittner: [00:24:28] Taking advantage of the bubbles that we've created...

Joe Carrigan: [00:24:30] Exactly.

Dave Bittner: [00:24:31] ...Within these social media platforms.

Joe Carrigan: [00:24:33] And that's an excellent point because that's my next point. The social media companies are no help at all. The Wall Street Journal has a great tool called Blue Feed, Red Feed. Check that out...

Dave Bittner: [00:24:43] OK.

Joe Carrigan: [00:24:43] ...Because it takes current issues - and I didn't realize this - this tool shows current issues and current posts side by side, from a left-leaning person and a right-leaning person.

Dave Bittner: [00:24:54] Oh, interesting.

Joe Carrigan: [00:24:54] Because you've got to think about what their business model is - their business model is based entirely on views. If they show you something that you disagree with, you're going to get angry and turn it off, and they're not going to make money, OK?

Dave Bittner: [00:25:04] (Laughter) OK.

Joe Carrigan: [00:25:06] That's what the root of this problem is.

Dave Bittner: [00:25:08] Yeah. Yeah.

Joe Carrigan: [00:25:09] When it comes to political discourse, Facebook is little more than a political echo chamber, and Twitter is a spite-filled dumpster fire.

Dave Bittner: [00:25:16] (Laughter) Don't hold back, Joe. Tell us how you really feel (laughter).

Joe Carrigan: [00:25:19] This is how I feel about both these places.

Dave Bittner: [00:25:20] OK. All right.

Joe Carrigan: [00:25:23] The suggestion to follow only real news services on Twitter - I don't even recommend that. Don't follow any political stuff on Twitter. Don't get your political news from Twitter. Don't get it from Facebook. Go directly to the sources - whatever source you trust and you know has journalistic integrity. And that's incumbent upon you, dear listener...

Dave Bittner: [00:25:38] Yeah.

Joe Carrigan: [00:25:38] ...To figure out which site has journalistic integrity. How do you avoid it? Like I say, just consider all of this stuff garbage. Don't even look at it. You know people like this on Facebook and Twitter that share political stuff. And...

Dave Bittner: [00:25:50] Oh, yeah.

Joe Carrigan: [00:25:50] ...You get - oh, there they go again.

Dave Bittner: [00:25:51] Yeah. Yeah.

Joe Carrigan: [00:25:52] I could think of one person right now - when I think of who on the left does this, I have one friend that's always posting anti-Trump stuff.

Dave Bittner: [00:25:58] Yeah.

Joe Carrigan: [00:25:58] When I think of who on the right does this, I have another friend that's always posting pro-Trump stuff.

Dave Bittner: [00:26:02] Right. Right.

Joe Carrigan: [00:26:02] You know, I don't want to hear that on Facebook. Sorry.

Dave Bittner: [00:26:05] Yeah.

Joe Carrigan: [00:26:05] And I've just taken to - when I'm feeling a little bit punchy, I just start responding, don't get your political news from Facebook. Don't get your political news from Facebook. Don't get your political - just - in bigger, bolder letters.

Dave Bittner: [00:26:15] Wow.

Joe Carrigan: [00:26:15] Right?

Dave Bittner: [00:26:16] You're really getting wound up here, Joe (laughter).

Joe Carrigan: [00:26:17] Yeah, I am. I mean, this really frustrates me. It's not productive.

Dave Bittner: [00:26:20] Yeah. All right.

Joe Carrigan: [00:26:22] I like his suggestion - never follow a hot rando 'cause...

Dave Bittner: [00:26:26] (Laughter).

Joe Carrigan: [00:26:26] ...Why are they contacting you?

Dave Bittner: [00:26:28] Yeah.

Joe Carrigan: [00:26:28] That happens to me on Twitter and Instagram from time to time. His research all focuses on Twitter. And I didn't talk much about Twitter in my rant here, but it happens on all these social media platforms. My Twitter presence is not that big, so I don't see a lot of this. But yeah, don't follow people back just because they followed you.

Dave Bittner: [00:26:44] Yeah, follow people 'cause you think they're interesting.

Joe Carrigan: [00:26:46] Right. Exactly.

Dave Bittner: [00:26:47] Yeah. All right. Well, thanks to Andy Patel for joining us, and thanks to you for listening. That is our show.

Dave Bittner: [00:26:53] Of course, we want to thank our sponsors at KnowBe4. They are the social engineering experts and the pioneers of new-school security awareness training. Be sure to take advantage of their free phishing test, which you can find at knowbe4.com/phishtest. Think of KnowBe4 for your security training.

Dave Bittner: [00:27:08] Thanks to the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu.

Dave Bittner: [00:27:16] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik. Technical editor is Chris Russell. Our staff writer is Tim Nodar. Our executive editor is Peter Kilpe. I'm Dave Bittner.

Joe Carrigan: [00:27:34] And I'm Joe Carrigan.

Dave Bittner: [00:27:35] Thanks for listening.