Sam Small: [00:00:00] So similar to how we've seen account hijacking for however long there have been logins on machines, account hijacking is a large issue in social media.
Dave Bittner: [00:00:09] Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week, we look behind the social engineering scams, phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe.
Joe Carrigan: [00:00:28] Hi, Dave.
Dave Bittner: [00:00:29] We've got some great stories to share. And later in the show, we've got Joe's conversation with Sam Small. He's the chief security officer at ZeroFOX. But before that, we've got a quick word from our sponsors at KnowBe4.
Dave Bittner: [00:00:41] So how do you train people to recognize and resist social engineering? There are a few schools of thought. Test them, and if they fall for a test scam, fire them. Or, other people say, if someone flunks the test, shame them. Instead of employee of the month, it's doofus of the day. Or maybe you pass out a gift card to the one who gets the A-plus for skepticism in the face of phishing. So how 'bout it? What do you think? Carrots or sticks? What would you do? Later in the show, we'll hear what the experts at KnowBe4 have to say. They're the sponsors of this show.
Dave Bittner: [00:01:17] And we are back. Joe, you want to kick things off for us this week?
Joe Carrigan: [00:01:21] I do, Dave. I have an article from HubPages, which is a community of writers who produce things of interest to other people. And this one comes from Terry Davis, who worked for the California Department of Corrections for 25 years. And he was talking about inmate mail - because inmates can send and receive mail, but all of that mail is monitored.
Dave Bittner: [00:01:39] OK.
Joe Carrigan: [00:01:40] And it is opened and read by Department of Correction employees. And Terry noticed a pattern from one particular inmate while he was doing this detail.
Dave Bittner: [00:01:47] Yeah.
Joe Carrigan: [00:01:48] And for the first week of every month, the inmate would send 10 handwritten letters a day to various women.
Dave Bittner: [00:01:55] OK.
Joe Carrigan: [00:01:56] The letters were all identical with the exception of the name of the woman and the address they were going to.
Dave Bittner: [00:02:01] OK.
Joe Carrigan: [00:02:02] But they were handwritten, and they were written verbatim, one to the next. And the letters always started with, I don't normally write to people, but - and then he'd say, I read your letter in PenPal Magazine, or something.
Dave Bittner: [00:02:14] Right.
Joe Carrigan: [00:02:14] And it would go on to describe his situation and how he needed to have someone to communicate with because he was isolated, and even inside prison, he was isolated. He didn't have any friends on the outside, no family. And he would tell his story about how he was in the wrong place at the wrong time. The reason he's in prison is because he didn't know they were going to commit a crime that night. He fell in with the wrong group of people, and so on. And it was a long letter that he'd write.
Dave Bittner: [00:02:37] He's being straight up about being in prison.
Joe Carrigan: [00:02:39] Right.
Dave Bittner: [00:02:40] And who knows if he colored the story in any way, shape or form. But...
Joe Carrigan: [00:02:44] I would imagine it's heavily colored.
Dave Bittner: [00:02:46] Yeah. Yeah. But not - it wasn't an outright scammer. By what this person, Terry, who wrote the article, read, he was telling a plausible story.
Joe Carrigan: [00:02:53] A plausible story. Right.
Dave Bittner: [00:02:54] OK.
Joe Carrigan: [00:02:55] In the following weeks, Terry noticed that this inmate would be writing about one or two letters a day. He wouldn't be writing the same letters, but these letters were more tailored. For example, if the woman that he was writing to was a particularly religious woman, he would say that he had been going to chapel and that the services were actually becoming meaningful to him.
Joe Carrigan: [00:03:15] And they'd have discussions about the sermons, and they'd have Bible verse discussions and things like that. If the woman had kids, he would tell a story about how his father was abusive and abandoned his family and that he always wanted to have kids, but he would never treat his family like that. He wanted to be a good father.
Dave Bittner: [00:03:30] I see.
Joe Carrigan: [00:03:31] Right.
Dave Bittner: [00:03:32] So he had a little from column A, a little from column B...
Joe Carrigan: [00:03:35] Right.
Dave Bittner: [00:03:35] ...Depending on how the interaction went with these ladies he was writing letters to.
Joe Carrigan: [00:03:40] Exactly.
Dave Bittner: [00:03:40] OK.
Joe Carrigan: [00:03:41] Eventually, he would have these pen pals sending him cash or stamps or something he could sell for money. And then the next month, the process would start all over again with 10 letters a day.
Dave Bittner: [00:03:52] (Laughter) OK.
Joe Carrigan: [00:03:53] So one day, Terry gets to talk to this inmate one-on-one. And he mentions to the inmate that he saw the pattern - you know, the inmates know their mail is read.
Dave Bittner: [00:04:02] Right.
Joe Carrigan: [00:04:02] And the inmate is actually surprisingly frank with Terry, and he says that he buys a mailing list each month - 100 women's names and addresses - from a pen pal magazine whose target demographic is prison inmates. And the list cost him about $20.
Dave Bittner: [00:04:20] OK.
Joe Carrigan: [00:04:21] He then sends the first week of letters, and in the following weeks, he's responding to the people who have responded to him.
Dave Bittner: [00:04:28] Right.
Joe Carrigan: [00:04:29] And eventually, he convinces these women to send him cash and stuff. And he winds up making about a hundred dollars a month, which doesn't seem like a big take...
Dave Bittner: [00:04:38] Yeah.
Joe Carrigan: [00:04:38] But when you're in prison, it's significant. So each month, after the cost of the list, he's raking in a profit of about $80.
Dave Bittner: [00:04:45] It's interesting because I mean, I see this is as - and also he's sort of filling the time. He's got a routine down. Week one, I do this. Week two, I do this.
Joe Carrigan: [00:04:54] Exactly.
Dave Bittner: [00:04:54] He's got his own little small business here.
Joe Carrigan: [00:04:56] He does.
Dave Bittner: [00:04:57] And he's filling the hours that he's, you know, got to fill there. So why not?
Joe Carrigan: [00:05:02] What else is he going to do?
Dave Bittner: [00:05:03] Yeah.
Joe Carrigan: [00:05:03] He also had a long game, though. He was looking for someone to move in with when he got out. And he actually succeeded in getting someone, a woman, to fall in love with him. Right? It becomes the long romance scam.
Dave Bittner: [00:05:15] Right.
Joe Carrigan: [00:05:16] And on the day he was paroled, she picked him up and took him home. And she was a divorced woman who had a nice house in a good neighborhood. The house was paid for. Right? So it was a stable home. And six months later, she comes home and he's gone, and so is all of her jewelry, all of her credit cards, all of her cash, you know?
Dave Bittner: [00:05:34] Yeah.
Joe Carrigan: [00:05:34] So she gets hurt pretty bad in this story. Terry's point in this whole article is this: these inmates on the inside really have nothing but time.
Dave Bittner: [00:05:43] Right.
Joe Carrigan: [00:05:43] That's what they're there to do, is to serve time. And many of them will use that time to try to run scams like this with people on the outside. They'll even exploit their friends and family, if they can.
Dave Bittner: [00:05:53] Yeah.
Joe Carrigan: [00:05:53] So.
Dave Bittner: [00:05:54] It's fascinating to me that the prison allowed this because anyone who would look at this would see...
Joe Carrigan: [00:06:00] Right.
Dave Bittner: [00:06:00] ...He's writing multiple letters. He's got this little machine going here where he's making money. I guess technically what he's doing is not against the rules.
Joe Carrigan: [00:06:08] Right. And it's probably not even illegal.
Dave Bittner: [00:06:10] No. There's no rule against having pen pals. He's not saying anything untrue. He's just saying, out of the goodness of your heart, can you please send me some money? And people do.
Joe Carrigan: [00:06:20] Right.
Dave Bittner: [00:06:22] Fascinating. The thing to put the word out about is to warn your friends that these scams exist.
Joe Carrigan: [00:06:28] Right. What's interesting to me about this is that this guy does this even though it's a small-scale operation for him. I mean, he's making $80 a month, which is not a huge amount of money.
Dave Bittner: [00:06:39] Yeah. But in prison, maybe it is.
Joe Carrigan: [00:06:41] Right.
Dave Bittner: [00:06:41] It's enough.
Joe Carrigan: [00:06:42] It's still enough for him to continue to do it.
Dave Bittner: [00:06:45] Right.
Joe Carrigan: [00:06:45] We had a story last week where someone lost, like, $40,000 to scammers. They were a little more advanced, but the spectrum of scams is very large in all directions.
Dave Bittner: [00:06:55] Yeah. That's interesting. I also wonder about the people who sign up to be pen pals with prisoners. I guess, you know, they want to be good people and help people in their time of need - people who sincerely want to reform themselves. But...
Joe Carrigan: [00:07:09] Right.
Dave Bittner: [00:07:09] ...They have to be careful, because they could be part of a long con, like this guy ran - fooling a woman into falling in love with him, moving in and then taking off with a lot of what she owned.
Joe Carrigan: [00:07:21] Yeah. Terry makes a point in the article that he only ever saw one of these relationships work out in what people would consider a successful manner, and that 99 percent of the time, it's a scam.
Dave Bittner: [00:07:32] Wow.
Joe Carrigan: [00:07:33] Yeah.
Dave Bittner: [00:07:33] All right. Well, it's interesting stuff. That's, (laughter), a new one to me.
Joe Carrigan: [00:07:37] Yep.
Dave Bittner: [00:07:37] Well, my story this week, I want to talk about some of the ways scammers are trying to target our paychecks. And I know a lot of us get our paychecks by direct deposit.
Joe Carrigan: [00:07:48] Yes.
Dave Bittner: [00:07:49] That is probably the most convenient way to get paid. You know, you get an email however many times - once a week, every couple weeks, whatever it is your interval is. And it says, congratulations, you've been paid. And the money just shows up in your bank account, and it's convenient.
Joe Carrigan: [00:08:02] I don't get an email.
Dave Bittner: [00:08:03] You do not?
Joe Carrigan: [00:08:04] No.
Dave Bittner: [00:08:04] Well, I do.
Joe Carrigan: [00:08:05] OK.
Dave Bittner: [00:08:05] So to that point, there are several different ways that folks are trying to scam people out of their paychecks. And, by the way, they're particularly targeting folks who work for universities.
Joe Carrigan: [00:08:16] Ah.
Dave Bittner: [00:08:17] So...
Joe Carrigan: [00:08:17] I'd better pay attention.
Dave Bittner: [00:08:18] Exactly.
Dave Bittner: [00:08:21] So they will send you a phishing email that looks like it came from HR, and they'll say, we have to do a survey or we need some information from you in order to do X, Y, or Z. Please visit this site. They'll ask you a few questions. And then they'll say, to verify you are who you are, please put in your username and password. And then they've got you.
Joe Carrigan: [00:08:42] Right.
Dave Bittner: [00:08:42] Then they log in to the HR system, and what they do is, they change the routing information for your paycheck. So when you get your paycheck automatically deposited, direct deposited, it goes to another account that is not yours.
Joe Carrigan: [00:08:55] Right.
Dave Bittner: [00:08:55] The problem with this is, for the folks who don't get emails, like you, I guess...
Joe Carrigan: [00:09:00] Right.
Dave Bittner: [00:09:01] ...It could take a couple days to realize that, hey, I didn't get paid.
Joe Carrigan: [00:09:05] Right.
Dave Bittner: [00:09:05] And then you go to HR, and you say, I didn't get paid this week. And they say, well, let me look into that. And so there's this time delay.
Joe Carrigan: [00:09:12] Right.
Dave Bittner: [00:09:13] And in the meantime, the scammers got your money...
Joe Carrigan: [00:09:15] Out of the account.
Dave Bittner: [00:09:17] And they're gone.
Joe Carrigan: [00:09:17] Right.
Dave Bittner: [00:09:17] They're long gone. Now, there's another way that they can do this, and that is targeting HR. There was - someone sent us a story about some scammers who were using information they had gotten from stolen applications for rental units in an apartment complex. So think about all the information you put down on a rental application.
Joe Carrigan: [00:09:38] Right.
Dave Bittner: [00:09:38] Right? Lots of personal information there. Lots of financial information, maybe a canceled check from your employer. Your signature. Basically, everything you'd need to pull off this scam. So these bad guys were using these applications that had been stolen from an apartment complex, and then they would send an email or a fax to HR at the company and say, hey, this is Joe Smith, and I want to have my paycheck sent to my new bank account.
Joe Carrigan: [00:10:05] Right.
Dave Bittner: [00:10:05] And the HR people would look at it, and they had everything they needed there.
Joe Carrigan: [00:10:08] Right. They might even have the old bank account information.
Dave Bittner: [00:10:11] Exactly. Exactly. Yeah.
Joe Carrigan: [00:10:12] Change it from this routing number and this account to this new one - and it checks out, because they have the correct information.
Dave Bittner: [00:10:17] And here's my signature, which checks out. And the same thing happened. So in this case, the HR people would change the routing information. And in this case, the person getting paid has no idea it's been changed.
Joe Carrigan: [00:10:27] Right.
Dave Bittner: [00:10:28] Right? And same sort of thing happens. It takes a little while to track down, and in the meantime, the money is gone.
Joe Carrigan: [00:10:35] Right. So I have a question.
Dave Bittner: [00:10:36] Yeah.
Joe Carrigan: [00:10:37] In these cases, who's liable?
Dave Bittner: [00:10:39] It's an interesting thing. There was a case of this where a university got hit, and the university made the people whole at their own expense. Really, out of the goodness of their heart. I think they said, OK, we're going to make sure everybody gets paid this one time.
Joe Carrigan: [00:10:55] Right.
Dave Bittner: [00:10:55] (Laughter).
Joe Carrigan: [00:10:56] Well, if I am, as a university employee, a victim of a phishing scam where I provide my login credentials...
Dave Bittner: [00:11:03] Right.
Joe Carrigan: [00:11:03] Right? And the scammers then used the information I provided to change my routing information, I would understand if they said, we're going to do this this one time. But in the second case, where you talk about somebody sending a fraudulent fax to HR, and HR just going ahead and changing the information, I think if that happens a hundred times, the university should be liable, or the employer.
Dave Bittner: [00:11:23] Yeah.
Joe Carrigan: [00:11:24] Any employer. Because...
Dave Bittner: [00:11:25] I think that's reasonable.
Joe Carrigan: [00:11:25] ...There should be better checks and balances than, I receive a fax and I go ahead and do it.
Dave Bittner: [00:11:29] Yeah. Yeah. There was a tip here that caught my eye, something that I hadn't heard before in terms of protecting yourself when replying to these possible phishing emails. And that was, if you're suspicious about an email - say you get an email from HR - rather than using your email's reply feature, use your forward feature. So you get an email from HR, hit the forward button. And then you manually enter the email address for HR.
Joe Carrigan: [00:11:55] Right.
Dave Bittner: [00:11:55] Chances are, you're going to have the right email address for HR in your address book.
Joe Carrigan: [00:12:00] Right.
Dave Bittner: [00:12:01] So even if it auto-enters it, it's going to be the correct one for HR. So that way, you're less likely to be replying to a scam email address, a look-alike email address...
Joe Carrigan: [00:12:11] Correct.
Dave Bittner: [00:12:12] ...Something like that. I thought that was a clever tip. I hadn't really thought about that one. It's a small little step that could lower the chances of you replying to a phony email address.
Joe Carrigan: [00:12:22] Yeah. Yeah. It's kind of like entering the URL when you get the Bank of America letter.
Dave Bittner: [00:12:26] Yep.
Joe Carrigan: [00:12:26] Don't click on the link.
Dave Bittner: [00:12:27] Yep.
Joe Carrigan: [00:12:27] Or email, rather. Not a letter. But don't click on the link. Enter Bank of America in your web browser. It's the same kind of thing. You're just doing it in the To field of your email.
Dave Bittner: [00:12:35] Yeah. Yeah. Not going to change the world, but it's a good little tip and probably a good habit to get into. It doesn't take you a whole lot of extra time.
Joe Carrigan: [00:12:42] Right.
Dave Bittner: [00:12:43] All right, Joe. Those were our stories. It's time to move on to our Catch of the Day.
Joe Carrigan: [00:12:47] All right.
(SOUNDBITE OF REELING IN FISHING LINE)
Dave Bittner: [00:12:51] Joe, our Catch of the Day this week was sent in by a listener. This is someone who goes by the name, Floetry.
Joe Carrigan: [00:12:56] Floetry?
Dave Bittner: [00:12:57] Which I think is lovely. It's like poetry, with an F-L. And so this email that they sent us is titled, Stop Contacting the Wrong Office.
Joe Carrigan: [00:13:06] (Laughter).
Dave Bittner: [00:13:07] And it goes like this. (Reading) Sometimes I wonder if you are really, really with your senses. How could you keep trusting people, and at the end, you will lose your hard-earned money? Or are you being deceived by their big names? They impersonate on many offices, claiming to be governors, directors, chairmen or of one office or the other. Their game plan is only just to extort your hard-earned money. Now the question is, how long will you continue to be deceived? Sometimes they will issue you fake check, introduce you to fake diplomatic delivery, un-existing online banking. And they will also fake wire transfer of your fund with payment stop order and even send you fake ATM cards, et cetera.
Dave Bittner: [00:13:44] (Reading) Anyway, by virtue of the position, I have been following this transaction from inception and all your efforts toward realizing the fund. More often than not, I sit down and laugh at your ignorance and that of those who claim they are assisting you. It is very unfortunate that at the end you loose (ph). Although I don't blame you because you are not here in Nigeria to witness the processing of your payment in Nigeria. The problem you are having is that you have been told the whole truth about this transaction, and it is because of this truth they decided to be extorting your money. The most annoying part is even fraudsters have really taken advantage of this opportunity to enrich themselves at your expense. Those you feel are assisting or working for you are your main problems. I know the truth surrounding this payment, and I am the only person who will deliver you from this long suffering if you will abide by my advice.
Joe Carrigan: [00:14:32] Whew. Thank you.
Dave Bittner: [00:14:33] (Reading) They claim they are helping you and forward all the fraudulent emails you receive to them. At the end, they do nothing about the fraudsters. Soon they will ask you to pay money to receive a compensation of millions of dollars. Do not pay any money to them because they are only interested in your hard-earned money, and you will never receive any compensation in return. They will always keep coming back to ask for more money. Please, I beseech you to stop pursuit of shadows and being deceived. Feel free to contact me immediately if you receive this email so that I can explain to you the modus operandi, guiding the release of your payment. Do not panic. Be rest assured that this arrangement will be guided by your embassy here in Nigeria. You're urgently requested to provide me with the following information.
Joe Carrigan: [00:15:16] (Laughter).
Dave Bittner: [00:15:16] (Reading) Full name, address, telephone number, passport or national identity copy. Contact me upon the receipt of this mail if you wish to receive your fund and stop wasting your hard-earned money. I await your urgent response. Yours sincerely, Mr. Ibrahim Mustafa Magu (ph), Chairman Economic and Financial Crime Commission.
Joe Carrigan: [00:15:36] Wow.
Dave Bittner: [00:15:38] (Laughter).
Joe Carrigan: [00:15:39] That is an excellent Catch of the Day.
Dave Bittner: [00:15:41] (Laughter).
Joe Carrigan: [00:15:41] I haven't seen one this good in a while. In terms of just Nigerian scams...
Dave Bittner: [00:15:46] Yeah.
Joe Carrigan: [00:15:46] This is excellent.
Dave Bittner: [00:15:47] Kind of turns it on its head, doesn't it?
Joe Carrigan: [00:15:49] It does. Hey, hey - there are Nigerian scammers out there, but you can trust me. I'm here in Nigeria.
Dave Bittner: [00:15:55] (Laughter) That's right. Brilliant. I am your friend on the ground here in Nigeria. I am the chairman of the Economic and Financial Crime Commission, and I am the one who will make sure you get all of your Nigerian money.
Joe Carrigan: [00:16:07] (Laughter) Right.
Dave Bittner: [00:16:08] Don't trust those other folks. You can trust me.
Joe Carrigan: [00:16:11] Floetry, this is awesome. Thank you so much.
Dave Bittner: [00:16:14] (Laughter). All right. That is our Catch of the Day. Thanks for sending it in. Coming up next, we've got Joe's interview with Sam Small. He's the chief security officer at ZeroFOX. But first, a word from our sponsors, KnowBe4.
Dave Bittner: [00:16:28] Let's return to our sponsor, KnowBe4's, question. Carrots or sticks? Stu Sjouwerman, KnowBe4's CEO, is definitely a carrot man. You train people, he argues, in order to build a healthy security culture. And sticks don't do that. Approach your people like the grown-ups they are, and they'll respond. Learning how to see through social engineering can be as much fun as learning how a conjuring trick works. Hear more of Stu's perspectives in KnowBe4's weekly Cyberheist News. We read it, and we think you'll find it valuable, too. Sign up for Cyberheist News at knowbe4.com/news. That's knowbe4.com/news.
Dave Bittner: [00:17:15] And we are back. Joe, you recently had the opportunity to speak with Sam Small from ZeroFOX.
Joe Carrigan: [00:17:20] I did. Sam is a Hopkins alum, and he's also the chief security officer at ZeroFOX. And he was good enough to give me some of his time for the interview. Let's take a listen.
Dave Bittner: [00:17:28] All right.
Sam Small: [00:17:29] So similar to how we've seen account hijacking for however long there have been logins on machines, account hijacking is a large issue in social media. So for instance, we see numerous examples where either the official social media account for an organization, or an account that belongs to, maybe, a CEO or a CTO of a large corporation will be hijacked by an adversary or attackers and then used to either propagate messages or potentially malicious links, things like that, to the audience of that particular account.
Sam Small: [00:18:01] Likewise, we see a lot of targeting. While you might see, ultimately, a very traditional type of attack at the end of the process, a lot of events start today with targeting of either employees of your organization or the customers that an organization may have.
Sam Small: [00:18:18] So obviously, with social, there's a little bit of implied trust that, OK, you know, this account that's messaging me has a picture or a photo of someone who looks real and a name that sounds real. And when I look at their social history, it seems real, et cetera. But at the end of the day, what we really are seeing is maybe, you know, social engineering having grown up and gone to college (laughter) and being proliferated at scale. And so that's another very common vector that we see.
Sam Small: [00:18:44] And, you know, that leads into kind of the next category that I like to talk about, which is social phishing and malware. And then additionally, there's kind of more compliance-related kind of issues and risks that many organizations face. So for instance, sharing information publicly on Twitter or on Facebook or Instagram, either via text or maybe something that's on a whiteboard in a picture that someone's posting. You know, we see those types of problems very often.
Sam Small: [00:19:08] So for instance, you could imagine if a group of nurses were posing for a picture together inside of a hospital, but maybe behind them on a whiteboard is some confidential patient information, that's a HIPAA violation, potentially. I could go on and on. But there's a wide array of risks that are out there that lead to compromise and material damage for organizations.
Joe Carrigan: [00:19:31] We had a story a couple of weeks ago on our podcast about someone who had taken over a verified Twitter account and then changed all the information to look like it was Elon Musk's account so they could run a bitcoin scam. You're talking about account hijacking. And while this is kind of account hijacking, it's also account hijacking paired with impersonation.
Sam Small: [00:19:49] That's right, yeah. So impersonation is - you've hit the nail on the head. That is the keyword here. And so fraudulent accounts is another category we talk a lot about at ZeroFOX. And I think that's - you know, that's kind of the bucket that we would put that particular story in as well.
Sam Small: [00:20:02] And so organizations face this challenge, and this kind of goes back to what I was saying earlier, which is if you have an inventory of all your social accounts, much in the same way you might have an inventory of all the machines on your network or all the IP addresses, you know, assigned to your organization, you can monitor those assets all day long. But what are you doing to monitor the accounts that are pretending to be part of your organization or that are pretending to be the voice of your brand or your company? And that's really where the brave new world is in this part of our industry. But there are additional concerns as well.
Sam Small: [00:20:34] So for instance, you might even imagine physical risk. You might not expect it, but physical risks are another interesting area that many of our customers have concerns about. And so for instance, you might imagine that if you're an organization that is either putting on or hosting an event, or likewise, maybe you're an organization with high-profile executives that are traveling, it's important to be as vigilant as possible.
Sam Small: [00:20:57] And you may want to gather all of the traditional signals from maybe the State Department or local police coverage, or things like that. But you would be remiss in not also collecting the data that is proliferated on social media around risks and threats that exist in either relation to your exact event or the exact location that one of your executives is traveling or the broader region in general.
Sam Small: [00:21:22] This is one of the areas where we see digital threats transcend into the physical realm. And likewise, even just last week I was working with a customer where we started with a threat that came through the U.S. Postal Service. And we took the indicators and some of the information that was in that document and then transcended into the digital space to try to identify if there were other overlapping indicators or other types of noise related to that particular type of threat. So it's a really interesting space from that perspective.
Sam Small: [00:21:51] You don't often see - you know, with traditional malware, you don't see yourself hopping from the virtual space to the physical space and vice versa. It's not always the case that we're synthesizing information from various sources. But ultimately, that's really where the power lies here, is that you have all of these different sources. And we're trying to synthesize intelligence and data and information across these sources as they all relate to, ultimately, either a single entity, a single brand, a single location, a single person or a single persona.
Joe Carrigan: [00:22:22] I like scary stories, Sam.
Sam Small: [00:22:23] Sure.
Joe Carrigan: [00:22:25] So do you have any - anything that your customers have experienced that you can tell - without, obviously, violating any of your customers' confidentiality - where something has actually happened or malware has been distributed or maybe even physical harm has happened?
Sam Small: [00:22:35] I will give you a few generic examples...
Joe Carrigan: [00:22:37] OK.
Sam Small: [00:22:37] ...That don't involve our customers, but they do involve real people in real organizations. And these are more stories I've collected and we've collected here at ZeroFOX over the years from public-facing events. So one of the types of attacks that's fairly common begins with customers who are trying to either express frustration or seek help from a brand or a retailer that they've interacted with.
Sam Small: [00:23:02] So for instance, if you bought something from a retailer, whether they're online or a traditional brick-and-mortar store, and you're an unhappy customer, or you're seeking customer support, a lot of that stuff happens on social these days. I'm sure everyone has seen that type of occurrence. And so often, what happens is attackers kind of lay in wait for someone to @ mention a retailer or use a hashtag related to a particular organization's campaign. And then what they'll do is they'll have an account set up, lying in wait, that has the same logo as the organization that's mentioned, same display name as the organization that's mentioned and perhaps a slightly different but realistic-looking handle, regardless of the social network that it's on.
Sam Small: [00:23:43] And then what'll happen is that because this communication kind of happens in real time - where you send a tweet, and you get a tweet back immediately, or you send a message, and you get a message back immediately - it can easily fool even a sophisticated user into thinking that they're interacting with the organization that they're trying to reach or that they've mentioned.
Sam Small: [00:24:00] And then normally what happens is someone - the adversary will share a link, saying, hey, I'm sorry you're experiencing this pain with our service or our product or let me put you in touch with the right people. You know, follow this Bitly link. And then what happens? You have, you know, a drive-by download. Or you have a form that looks very much like the login page to one of these services, whether it be, you know, like, a PayPal or Amazon or what have you.
Sam Small: [00:24:25] Now you have either a compromised machine or a compromised account. And guess what? If you don't follow great security hygiene and you reuse your credentials, then not only is it that account that you've given the adversary access to, but probably every other service where you reused those same credentials.
Joe Carrigan: [00:24:42] I can also imagine that being an opportunity just to say, hey, just for validation, can I have your credit card number that you used to purchase this?
Sam Small: [00:24:48] That's exactly right. Yeah. What you - either, what's your credit card number or, you know, give us your Social Security number or, you know, what have you. You know, you need to reset your password. Please follow this link, and reset your password here. Again, you know, giving access to your password that you would typically use for that service.
Joe Carrigan: [00:25:04] What can individuals do to protect themselves against these kinds of attacks, where somebody's jumping into the middle like that?
Sam Small: [00:25:09] It's important to be aware of and know how to recognize the UI and UX of verification. So a lot of these platforms, like Facebook and Twitter, have the concept of verified accounts. And if you know how to recognize that, that can give you a sense of assurance that you're speaking with or engaging with the brand, as you intend. That being said, attackers also use that to their advantage, as well. So we've certainly seen adversaries try to cleverly use emoji or international characters or their cover photo to spoof verification.
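[One way to spot the emoji and international-character tricks Sam describes is simply to flag any non-ASCII characters in a display name. This is a toy illustration of that idea, not a production spoof detector; the display names used are hypothetical.]

```python
import unicodedata

def suspicious_chars(display_name: str) -> list:
    """Return (character, Unicode name) pairs for every character
    outside plain ASCII - e.g. a check-mark emoji pasted into a
    display name to imitate a platform's verified badge."""
    return [(c, unicodedata.name(c, "UNKNOWN"))
            for c in display_name if ord(c) > 127]

# A plain-ASCII name comes back clean, while a name padded with
# a check mark ("Support" plus U+2714) gets flagged:
# suspicious_chars("Support\u2714") -> [('\u2714', 'HEAVY CHECK MARK')]
```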
Sam Small: [00:25:45] So another way to help identify the authentic entities that you're trying to communicate with is by going directly to their web page and looking for an indication of what their official Facebook page is or what their official Twitter page is, et cetera. So you know, it never hurts. Just like when people are phished, we always give them the advice of, hey, if you get a really great-looking link from a retailer in your inbox, and it looks too good to be true or it looks a little funny, then just go to that retailer's website directly yourself. And so it's very much the same advice there.
Sam Small: [00:26:20] Also, I think typically, be suspicious of shortened links. And, you know, depending on the provider, there are tricks that you can use to see what the ultimate destination or what the redirection chain is for a shortened link. So for instance, with Bitly one of the things that you can do is take any Bitly link.
Sam Small: [00:26:36] If you were to copy it and then paste that URL in the URL bar of your browser, if you add just a plus sign at the end of any Bitly link, Bitly will display kind of a dashboard page that explains not only where this Bitly link will direct you to, but also gives you basic statistics about how many times it's been clicked and where those people are coming from, et cetera. So that's also kind of an interesting thing that can help give people more confidence about what exactly they're clicking on.
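[The "+" trick Sam describes is just a URL suffix, so it can be sketched as a tiny helper. This assumes a standard bit.ly-style short link; the example link is hypothetical.]

```python
def bitly_preview(short_url: str) -> str:
    """Turn a Bitly short link into its public preview URL by
    appending '+', which shows the destination and click stats
    instead of redirecting."""
    short_url = short_url.rstrip("/")      # drop any trailing slash first
    if short_url.endswith("+"):            # already a preview link
        return short_url
    return short_url + "+"

# bitly_preview("https://bit.ly/2AbCdEf") -> "https://bit.ly/2AbCdEf+"
# Paste the result into your browser's URL bar to inspect the link
# before deciding whether to click the original.
```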
Dave Bittner: [00:27:07] Wow, Joe, interesting stuff from Sam at ZeroFOX.
Joe Carrigan: [00:27:10] Yeah, yeah. First off, the thing that sticks out in this is how old am I that I can't think of six social networks off the top of my head? (Laughter).
Dave Bittner: [00:27:14] (Laughter) I'm with you there, my friend. I am - I am with you. Yes, yes. So the kids these days...
Joe Carrigan: [00:27:20] Right (laughter).
Dave Bittner: [00:27:21] ...With their Snapchats and their links and then - yeah.
Joe Carrigan: [00:27:25] And their whatsa-bots (ph).
Dave Bittner: [00:27:25] Yeah, exactly. Right.
Joe Carrigan: [00:27:26] Once again, Dave, we hear that security events start with customers or employees being targeted. It's the beginning of the attack chain.
Dave Bittner: [00:27:33] Yep.
Joe Carrigan: [00:27:33] The people are the first thing to be attacked. And social media presents a huge attack surface.
Dave Bittner: [00:27:40] Right.
Joe Carrigan: [00:27:40] An absolutely huge attack surface. We don't even think about these things that we're doing on social media, like the example Sam gives about the nurses taking a picture with patient data being on the board.
Dave Bittner: [00:27:48] Right. Right.
Joe Carrigan: [00:27:49] That's a HIPAA violation, or it could be a HIPAA violation. You just don't even think about it. It's just not present in your front of mind. I would view it as an honest mistake.
Dave Bittner: [00:27:56] No, I saw one recently with some folks in the military who were standing in front of some terminal that had a sticky note on it that had the user name and password for some defense system.
Joe Carrigan: [00:28:05] Right, yeah.
Dave Bittner: [00:28:05] You know, just a nice - a nice photo of everybody, and there it is.
Joe Carrigan: [00:28:09] There it is.
Dave Bittner: [00:28:09] So yeah, interesting.
Joe Carrigan: [00:28:10] But while social media does present the huge attack surface, it also offers an opportunity to collect risk data, which is what Sam was talking about as well. And that is a great way of looking at the social media landscape a little bit differently, I think. And it's useful. I find that very interesting. Again, we hear, to protect yourself as the individual user, go to the company's website. Verify what their social media accounts are. Make sure you're talking to the right person.
Dave Bittner: [00:28:35] Yeah, who you think you're talking to.
Joe Carrigan: [00:28:36] Be familiar with how, like, Twitter, for example, verifies its users - that little blue check. And there are a lot of things that look like the little blue check. I saw an account the other day that had something that looked kind of like a little blue hurricane next to it. And I thought to myself, I wonder if that person is trying to deceive people into believing he's validated?
Dave Bittner: [00:28:52] Yeah. Interesting.
Joe Carrigan: [00:28:52] I love the Bitly feature.
Dave Bittner: [00:28:54] With the plus.
Joe Carrigan: [00:28:55] With the plus sign. I had no idea that was even possible.
Dave Bittner: [00:28:58] Me neither.
Joe Carrigan: [00:28:58] And I tried it with a couple of links, and it's fascinating. You can see the click history of it.
Dave Bittner: [00:29:02] Yeah.
Joe Carrigan: [00:29:03] And all kinds of great stuff. It's a good tool.
Dave Bittner: [00:29:04] No, it's great. Yeah, it really is. Well, thanks to Sam for taking the time for us and - from ZeroFOX. We appreciate that, lots of good information. And that is our podcast.
Dave Bittner: [00:29:14] We want to thank our sponsor KnowBe4. Their new-school security awareness training will help you keep your people on their toes, with security at the top of their mind. Stay current about the state of social engineering by subscribing to their Cyberheist News at knowbe4.com/news. Think of KnowBe4 for your security training. We want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more about them at isi.jhu.edu.
Dave Bittner: [00:29:39] The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of Data Tribe, where they're co-building the next generation of cybersecurity teams and technologies. Our coordinating producer is Jennifer Eiben. Our editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe. I'm Dave Bittner.
Joe Carrigan: [00:29:55] And I'm Joe Carrigan.
Dave Bittner: [00:29:56] Thanks for listening.