Hacking Humans 7.8.21
Ep 155 | 7.8.21

Collaboration, data portability, and employee mobility fuel insider risk.

Transcript

Joe Payne: Two-thirds of all data breaches in the last year were caused, actually, by insiders. And yet only about 10% of security budgets and activity are focused on insiders.

Dave Bittner: Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week we look behind the social engineering scams, the phishing schemes and criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire. And joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: Hi, Dave. 

Dave Bittner: Got some good stories to share this week. And later in the show, my conversation with Joe Payne from Code42. We're going to be talking about insider risks. All right, Joe, let's jump right into some stories here. Why don't you kick things off for us? 

Joe Carrigan: Dave, my story this week comes from Mark Stone over at Security Intelligence. And he has a story about Frank Abagnale. Abagnale is the con artist who is portrayed in "Catch Me If You Can" by Leonardo DiCaprio. 

Dave Bittner: Yep. 

Joe Carrigan: Lately, actually, a lot of his story has been called into question by Alan Logan, who wrote a book in 2020 called "The Greatest Hoax On Earth: Catching The Truth While We Can." He's essentially saying some of Frank's claims are not true. But one thing we can say about Frank: he is a good con man, because either he scammed people - or organizations - out of $1.5 million, or he built a career based on telling people that he did. 

Dave Bittner: (Laughter) So either way... 

Joe Carrigan: Either way. Right. 

Dave Bittner: ...He hasn't built his career on a foundation of being on the up and up. 

Joe Carrigan: Right. 

Dave Bittner: (Laughter) OK. Fair enough. 

Joe Carrigan: So say what you will about him - I still have a little bit of respect for the man. 

Dave Bittner: All right. All right. 

Joe Carrigan: Anyway, the article, it talks about why social engineering works. And one of the sections in the article is, who is susceptible? And, of course, the answer to that question is everyone. Abagnale does have his own security consulting company. So it's nice to hear other people say that that's correct. Everybody is susceptible. 

Dave Bittner: Sure. 

Joe Carrigan: And Abagnale says that people in the enterprise are just as vulnerable as people at home. I have a good explanation for that, Dave. It's... 

Dave Bittner: OK. 

Joe Carrigan: ...That they're still people, right? 

Dave Bittner: (Laughter) OK. 

Joe Carrigan: Their vulnerability doesn't change based on where they're working. 

Dave Bittner: OK. 

Joe Carrigan: The key point that he makes is that everyone can be scammed. It's not a sign of low intelligence. And we need to share when it happens to us. I think that is sage advice. The more we talk about these things, the more we inoculate each other against them. One of the other things he says is that the Internet is making old scams much more accessible to more people. He talks about how, in the old days, you had to pick up the phone and call somebody. Now, you don't need to do that. You can email 1,000 people at once... 

Dave Bittner: Right. 

Joe Carrigan: ...Or a million people at once. He has this concept in his practice of the art, I guess. He calls it the ether. And this is a quote from him: "Scam artists put individuals under what I call the ether. Ether is a condition of trust - or even infatuation - with what is being presented to the victim. Getting a victim under the ether is crucial to all cons no matter where or how they are perpetrated. The heightened emotional state makes it hard for the victim to think clearly or make rational decisions." And, as he puts it, "to get their victims under the ether, fraudsters hit the fear, panic or urgency buttons." This is one of the things we talk about all the time. I would also say the greed button, but I think that might fall under the urgency button. 

Dave Bittner: Yeah. OK (laughter). 

Joe Carrigan: Abagnale goes on to say the effect can be almost hypnotic. A good con man can keep his victim up in the altitude of the ether, because once they drop into the valley of logic - once they understand what's happening - the con man loses them. And we've talked about people who have approached someone they know who's in the middle of a con, right? And the con artist is so good that this person turns against their friends in favor of the con artist. 

Dave Bittner: Right. Well - and it strikes me, too. What do you hear people say time and time again on the other side of a scam when they realize what has happened to them? You hear them say... 

Joe Carrigan: I should have known. 

Dave Bittner: I should have known. I was going to say, what was I thinking? 

Joe Carrigan: What was I thinking? Right. 

Dave Bittner: Right? Yeah. 

Joe Carrigan: This is more from Frank. He says, to introduce the ether, the con artist asks questions to trigger emotional responses. Once the con artist identifies the trigger - whether it's good news, bad news, whatever it is - he uses that as part of the pitch to drive you into the heightened emotional state. The questions he asks help him create a target profile that contains information he can use in follow-up calls - or follow-up interactions, I guess - to keep you under the ether until he seals the deal, all right? So this is very important. When somebody starts asking you questions, that, to me, is a red flag. I don't know - maybe I was raised in a more suspicious environment. But... 

Dave Bittner: (Laughter). 

Joe Carrigan: When somebody starts asking me questions, that kind of puts me off. 

Dave Bittner: Yeah. 

Joe Carrigan: Right? 

Dave Bittner: They kind of - it's like they're doing cold reading on you... 

Joe Carrigan: Yeah. 

Dave Bittner: ...You know? 

Joe Carrigan: Yeah. 

Dave Bittner: And I've also noticed - I don't know if this has happened to you. You know, sometimes you'll be walking through the mall or something. And there'll be those folks who have the little carts in the middle of the mall where they try to sell you things. 

Joe Carrigan: Oh, yeah. 

Dave Bittner: And one of the ways they try to hook you in is they'll say, can I ask you a question? And I always say, no. 

(LAUGHTER) 

Joe Carrigan: Me, too. 

Dave Bittner: I keep walking. But most people would be like, oh, yeah... 

Joe Carrigan: I say, apparently, yes, you can. And then I keep walking. 

Dave Bittner: ...Yeah. OK. What's the question, you know. And then, well, you know, do you - are you frustrated with your pockmarked skin, you know? Like (laughter)... 

Joe Carrigan: My pockmarked skin? 

Dave Bittner: What? Wait a minute. 

Joe Carrigan: They're trying to get inside your head, Dave (laughter). 

Dave Bittner: Right. Exactly. Exactly (laughter). 

Joe Carrigan: And that's right. That's exactly what Abagnale is talking about here. 

Dave Bittner: Yeah. 

Joe Carrigan: They're going to say something that fires off an emotion. Then they're going to sell you some unnecessary product, I guess. I don't know. 

Dave Bittner: Right. Sure. Sure. 

Joe Carrigan: But you look fabulous, Dave. 

Dave Bittner: Thank you very much (laughter). 

Joe Carrigan: Two red flags and how you can defend yourself from them - in every scam, he says, no matter how sophisticated or amateur, there are going to be two things. One, they're going to ask you for money, but you must act immediately, right? They're going to put that artificial time constraint on it. That's the time constraint that Christopher Hadnagy talks about. And, two, the fraudster's going to ask you for information, right? And this might be banking information, Social Security number, date of birth, credit card number, all that stuff - any personally identifiable information that they're going to need. There's probably also going to be an artificially imposed timeline with that as well. It's important to remember, Frank goes on to say, that you did not solicit this call or email. Remember - this is something I say - only give out information or payment on contacts that you initiate, not ones that come to you. 

Dave Bittner: And then they say, lastly, always send information only through secure methods, and make sure to use two-step authentication... 

Joe Carrigan: Yep. 

Dave Bittner: ...For any email or messaging service to prevent hackers from logging in without you noticing. And I would say, you know, particularly for that title company, if they had had multifactor on their email... 

Joe Carrigan: Right. 

Dave Bittner: ...That would greatly reduce the odds of someone... 

Joe Carrigan: Yeah, every single title company in this country should use multifactor authentication on their email... 

Dave Bittner: Yeah. 

Joe Carrigan: ...On all their systems if they can, because they are handling large quantities of other people's money. 

Dave Bittner: Right. So we will have a link to that in the show notes. Again, that's from the folks over at Fox News. That is my story this week. Joe, it is time to move on to our Catch of the Day. 

(SOUNDBITE OF REELING IN FISHING LINE) 

Joe Carrigan: Dave, our Catch of the Day comes from a listener named Michael who writes, I've recently started listening to your Hacking Humans podcast and I'm working my way from the first to the most recent episodes. 

Dave Bittner: Ah, very good. 

Joe Carrigan: This email came through today, and I thought you could use it for your Catch of the Day. In the meantime, I'll wait for the Australian dollar to drop lower so the U.S. $35 million will be worth more to me. 

Dave Bittner: (Laughter). 

Joe Carrigan: That's thinking, Michael. 

Dave Bittner: Right. Right. 

Joe Carrigan: This email comes from - it just says R&D, and then anonymous at some crazy email address. 

Dave Bittner: Yeah. 

Joe Carrigan: And the subject is extremely urgent, attention required. 

Dave Bittner: All right. It goes like this. 

Dave Bittner: (Reading) Good day. I'm head of research and development's special projects team of a pharmaceutical company. I'm based at the Central Research Center of WHO in Washington, D.C., USA. I have a highly lucrative business venture I would like to discuss with you worth 35 million U.S. dollars. Due to the origin of the funds being diverted from money received from the U.S. government strictly for research and development of the COVID-19 vaccines, I will like to remain anonymous until I'm sure that I can trust you to be my salient partner in receiving and investing this funds on our behalf. I'm still in active service and will not want to jeopardize my career. So if you're not interested, please do not hesitate to delete and disregard this email. However, if you are interested and would like to work with me, simply call me on this secure phone number, which I have solely set up for this purpose. I will give you more details and my identity once I am convinced that you are willing to work with me. Please do not reply to this email. Regards, anonymous. 

Joe Carrigan: This is interesting. Number one, do you think he means silent partner instead of salient partner? 

Dave Bittner: Yeah, probably (laughter). Probably. 

Joe Carrigan: I find it interesting that it's not going to be an email scam. They've got a phone number they can get on to start talking to people. 

Dave Bittner: Right. And they say WhatsApp also available. 

Joe Carrigan: Right. 

Dave Bittner: Interesting. Secure back channel communications. 

Joe Carrigan: WhatsApp has end-to-end encryption. But keep in mind that it is owned by Facebook. 

Dave Bittner: Right. Right. 

Joe Carrigan: So, you know, take that with a grain of salt. This is a pretty good email. I like the way it's got all the broken English in it and the telltale red flags. If I was going to try to scam somebody out of $35 million and have somebody else help me, I might be willing to jeopardize my career for that. 

Dave Bittner: (Laughter). 

Joe Carrigan: If I already made the conscious decision that I'm going to be a criminal and try to get this $35 million out, that's walk-away money, Dave. 

Dave Bittner: Yeah. I guess the thing that struck me about this one that's a little unique is... 

Joe Carrigan: Yeah. 

Dave Bittner: ...Where they say, if you're not interested, please do not hesitate to delete and disregard this email. 

Joe Carrigan: Right. 

Dave Bittner: So it's sort of that reassurance of, hey, no hard feelings, you know? 

Joe Carrigan: Right. Yeah, it's... 

Dave Bittner: I'm not putting the heat on you. There's, you know... 

Joe Carrigan: There's plenty of other people I can get to help me. 

Dave Bittner: Right. Right. So really, it seems like this one's really relying on greed... 

Joe Carrigan: Yes. 

Dave Bittner: ...But not so much on the hey, you must do this right now part of it - in fact, going the other way with that of trying to put the person at ease. 

Joe Carrigan: Oh, I'm sure once you get on the phone with them, the hey, you got to do this right now stuff starts. 

Dave Bittner: Yeah, you're probably right. 

Joe Carrigan: (Laughter). 

Dave Bittner: You're probably right. All right. Well, that is our Catch of the Day. We want to thank our listener for sending that in. 

Joe Carrigan: Thank you, Mike. 

Dave Bittner: We would love to hear from you. You can email us at hackinghumans@thecyberwire.com. 

Dave Bittner: All right, Joe, I recently had the pleasure of speaking with another Joe - Joe Payne from Code42. And we were discussing insider risks. Here's my conversation with Joe Payne. 

Dave Bittner: Let's start off with just sort of an overview here about why this matters. I mean, why the increased focus on insider risk? 

Joe Payne: Well, it's a great question. It's actually, I think, the fastest-growing area of risk for organizations today, even faster than external risk - you know, we have some data that shows that two-thirds of all data breaches in the last year were actually caused by insiders. And yet only about 10% of security budgets and activity are focused on insiders. And there are really three reasons why you're seeing this dramatic increase in insider risk. The first is we've all deployed collaboration technology and sharing technology that has made our employees really productive, which is fantastic - things like Slack and Teams and Box and OneDrive. These are great technologies for sharing and collaborating, but they also make it really, really easy to share data outside the organization, either accidentally or on purpose. 

Joe Payne: The second thing is that all data is portable today in a way it didn't used to be. If I wanted to steal some information from my company 20 years ago, I would have had to come into the office at night and, you know, Xerox things, and I would have known that I was doing something wrong - not only would it have been harder to do, because I had to Xerox a bunch of files, but I would also have known for sure that it was not the right thing to do. Whereas today, I can just drag a file folder from one side of my desktop to the other, and all of a sudden I have some of the most important data that the company has. So portability of data is the second reason. 

Joe Payne: And the third reason is employees. Employees are switching jobs today more than they've ever done. And the No. 1 risk indicator that you might have a data breach is a departing employee. Sixty percent of employees say that they took data from their last job specifically to help them in their current job. So they took that data from company to company. That's 60%. That's an incredibly high number of people who admit to taking data. So when you combine collaboration technology, portable data and portable employees - employees who typically stick around between three and four years in a job - you've got sort of the perfect mix of things that are creating this insider risk problem. 

Joe Payne: Insider risk is a much broader approach to - and appreciation of - the problem that employees create by moving data around. Insider threat immediately assumes bad intent on the part of the actor, and it's a way to paint the problem as people doing bad things on purpose. And we really don't believe in that approach, because a lot of the risk is created by accident, or it's people just trying to get their job done, and they have good intent. And so let me give you an example of that. Let's say I want to share with my boss a document that is sort of a plan for the next quarter, and I decide to use my Dropbox account to share that document because I use Dropbox for my kid's soccer team, and it's a really easy way to collaborate on a document and work on it together. And so I send an email to my boss and say, here's a document; let's work together on this. I've now created insider risk for the company by putting the document in Dropbox, which is likely a nonsanctioned piece of technology in our company. But I didn't do it with malicious intent, and so I think to describe that as an insider threat isn't accurate. To describe it as insider risk is absolutely accurate. 

Joe Payne: Contrast that with an employee that's maybe leaving in the next couple of weeks, and they take information like a customer list, and they upload it to their Gmail account and send it to themselves. That is also insider risk. That has a malicious intent - the intent to steal the customer list from the company that I'm leaving to take it to the company that I'm going to. That would be described as insider risk but also could be described as an insider threat. So insider risk is a broader approach to the problem but also one that acknowledges that sometimes risk is created by employees who aren't really trying to be a threat to the organization. 

Dave Bittner: Yeah. And, I mean, that - what it reminds me of is the notion of, you know, shadow IT, where people are kind of creating their own workarounds because maybe using something like a Dropbox folder is easier than the tools that the company provides. 

Joe Payne: Well, honestly, we as a tech and security industry should take a little bit of the blame in this process, because the reason people create shadow IT is that we get in the way of them trying to do their jobs. So if you look at how people have tried to address insider threat and insider risk over the last few years, most of it has been with solutions that block collaboration, that say, oh, we're not going to let you put files on a thumb drive; we're not going to let you email certain documents back and forth to people. And the way users respond to that is they go around it, and they find different ways to do things. 

Joe Payne: And so one of the pieces of data that I think is really interesting - we did a bunch of primary research in this area - is that 51% of employees say they are being disrupted daily or weekly while trying to do legitimate work. So they're being blocked from doing their job. And if you get blocked from sharing a document with one of your colleagues, then what you'll do is just find a different way to share it. Oh, I can't email this document to the person through our email system, or I can't share it via our sharing technology, so then I'll use Gmail, or I'll use Dropbox or whatever. And I know that I can sort of outsmart you. So, you know, one of the things that we really push people to think about in a modern world is you really need to create an environment where sharing is OK and sharing is good and people are sharing data, but where they do it within the frameworks that we've created and through the tools that we want them to use. And by not blocking them, they're less likely to push data around our controls. 

Dave Bittner: Well, I mean, let's dig into the framework that you all have outlined here that - your insider risk management framework. What are the key components? 

Joe Payne: Well, there are really just five components, and I'll walk through them very quickly. The five are identify, define, prioritize, automate and improve. And it's a classic security cycle where we're just going to get better at this process as we go. But identify is really about monitoring all files, all vectors of exfiltration and all users. And it's a really different approach than I think people have taken in the past. In the past, people have said, oh, let's identify our, you know, most valuable data and then watch it - or our most vulnerable users or our most risky users and only watch them. Or let's only watch a few vectors of exfiltration. Like, let's watch thumb drives. 

Joe Payne: But in today's world, what's interesting is all users have really important data - I mean, all users, whether they're in HR and they have payroll data, whether they're salespeople and they have customer data, whether they're developers and they have source code. I mean, everybody has valuable data, so watching all users is important. Watching all files is important. It's impossible to accurately identify which files in an organization are valuable. So saying, hey, we're only going to watch some of them, and we're going to leave it up to the users to flag what counts as an important file - the same users who might want to exfiltrate those files - doesn't make any sense. And then all vectors - that's probably the biggest change. Look, there's Dropbox. There's Gmail. There's Yahoo Mail. There's private GitHub accounts. There's all kinds of sync-and-share products out on the market. So users have dozens of ways to exfiltrate data that they didn't have before, and you have to watch all of those. So identify is step one of the framework. 
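
To make the identify step concrete, here is a minimal sketch, in Python, of the kind of event record that would capture the three dimensions Payne describes - every file, every vector of exfiltration and every user. The field names and structure are illustrative assumptions for this article, not Code42's actual schema or product.

```python
# A minimal sketch of an exfiltration event record. Field names are
# illustrative assumptions, not Code42's schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FileEvent:
    timestamp: datetime   # when the file movement was observed
    user: str             # who moved the file (watch all users)
    file_path: str        # which file moved (watch all files)
    vector: str           # how it left: "usb", "gmail", "dropbox", "github", ...
    destination: str      # where it went: a domain, device ID or repo URL

# "Identify" means collecting these events for every user and every vector,
# rather than pre-selecting "important" files or "risky" people up front.
events: list[FileEvent] = []

def record(event: FileEvent) -> None:
    """Store an observed file movement; later steps classify and score it."""
    events.append(event)
```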

Joe Payne: Define is step two, which is really distinguishing between trusted and untrusted activity. So, for example, if I share a document with you via Slack, and you are one of my colleagues, great - trusted activity. It shouldn't, you know, raise any alarms. That happens 200 times a day across an organization - or more, depending on how big your company is. So we don't want to get in the way of that collaboration. But if I share a document with another friend of mine who also uses Slack at their company, but they're not in my company - wait, that's an untrusted domain, because it's outside of our organization. I need to define that as untrusted so that we will raise some alarms. We won't block that activity, because it might be legitimate, but we will identify it and flag it as risk. 
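
A hedged sketch of the define step might look like the following: classify each destination as trusted or untrusted depending on whether it belongs to the organization. The domain names are made up for illustration and would come from your own policy.

```python
# A sketch of the "define" step: trusted versus untrusted destinations.
# The domains below are made-up examples, not a real policy.
TRUSTED_DOMAINS = {"example.com", "example.slack.com", "example.sharepoint.com"}

def is_trusted(destination: str) -> bool:
    """A destination counts as trusted if it sits under one of our own domains."""
    dest = destination.lower().strip()
    return any(dest == d or dest.endswith("." + d) for d in TRUSTED_DOMAINS)

# Sharing to a colleague inside example.com raises no alarm; sharing to a
# friend's Slack workspace at another company is flagged as untrusted risk,
# but - as Payne notes - flagged, not blocked, because it might be legitimate.
```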

Joe Payne: Part three of the framework is prioritize, which is - how do we triangulate the files, the vectors and the users to prioritize what we in security need to pay attention to? That's best illustrated with an example. If I have a normal employee, what I would call a low-risk employee, who's moving a business document via Gmail - you know what? That's probably a low-risk thing that might happen a lot, and I'm not going to pay a lot of attention to it. But if I have a departing employee, someone who's put in their notice and is leaving, and they're moving source code, and they're putting it in a private GitHub repository... 

Dave Bittner: (Laughter). 

Joe Payne: ...Those three things - and you laugh, but this happens every day. 

Dave Bittner: And it happens to be encrypted, right? 

(LAUGHTER) 

Joe Payne: Yeah. Well, it's funny you mention that one. We call that a risk indicator. And the fact that somebody uses ZIP files is a big-time risk indicator, because people don't really use ZIP much anymore other than to obfuscate and encrypt files that they want to take with them. So the prioritize step brings to the forefront things like that that we really need to pay attention to. 
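
To illustrate the prioritize step, here is a small scoring sketch that triangulates the user, the file and the vector, with the ZIP-archive behavior Payne mentions treated as one more risk indicator. The indicator names and weights are assumptions for illustration, not Code42's actual model.

```python
# A sketch of the "prioritize" step: sum weighted risk indicators per event.
# Indicator names and weights are illustrative assumptions.
RISK_WEIGHTS = {
    "departing_employee": 5,     # user has given notice
    "source_code": 4,            # file looks like source code or a repo export
    "untrusted_destination": 3,  # personal Gmail, private GitHub, Dropbox, ...
    "zip_archive": 3,            # ZIP is rarely used except to obfuscate
}

def risk_score(indicators: set[str]) -> int:
    """Return the combined weight of whichever indicators fired for an event."""
    return sum(RISK_WEIGHTS.get(name, 0) for name in indicators)

# A low-risk employee emailing one business document scores near zero, while a
# departing employee pushing zipped source code to a private GitHub repository
# scores 5 + 4 + 3 + 3 = 15 and rises to the top of the queue.
```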

Joe Payne: And then automate and improve are the last two steps. They're commonsense steps. We need to look at a right-sized response. Hey, for the person that's sending that thing via Gmail, let's have the system automatically send them an email that says, hey, you shouldn't be using Gmail; you should use our standard email system when you're sharing documents. Or, you shouldn't be using Dropbox; we use Microsoft OneDrive here at this company, and here's a training video. And then for the person that obfuscated the source code - OK, we're going to automate a case process and start a case and all that kind of stuff. So automating is really important in security, because there's so much activity happening. In order to handle prioritized events, you really have to automate. 
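
The right-sized responses Payne describes could be automated along these lines. The thresholds and actions below are assumptions for illustration, and tuning them over time is exactly what the improve step covers.

```python
# A sketch of the "automate" step: pick a right-sized response from the score
# produced by the prioritize step. Thresholds and actions are illustrative.
def respond(user: str, score: int) -> str:
    """Choose an automated, right-sized response for one prioritized event."""
    if score <= 3:
        # Low risk: a gentle nudge toward the sanctioned sharing tools,
        # perhaps with a link to a short training video.
        return f"Email {user} a reminder to use the approved sharing tools."
    if score <= 9:
        # Medium risk: a human should look, but it can wait for business hours.
        return f"Queue the event on {user} for analyst review."
    # High risk (e.g., departing employee moving zipped source code):
    # open a case automatically so an investigation starts right away.
    return f"Open an investigation case on {user} immediately."
```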

Joe Payne: And then the last one is improve. And it's just that cycle of constantly looking at - OK, turns out that some of these cases were less important than others, and how do we tune the system? And you'll do that in any system you work at. So it's a fairly straightforward framework. We think it's going to be helpful for people as they address the insider risk problem. 

Dave Bittner: How does an organization go about implementing something like this but then also avoiding, you know, those unnecessary speed bumps or hurdles or roadblocks that, I mean, even just, you know, come with change? 

Joe Payne: I think one of the most important things here - we talk about the three T's of addressing insider risk. And the first one is transparency. So as you point out, organizational change is challenging, and transparency is something security teams sometimes don't think about first, 'cause in security, we often are not transparent. We want to be stealthy. We want to quietly sit in the shadows and watch for activity. But in this case, in the insider risk problem, you want to be transparent. You want the organization to understand exactly what you're monitoring and why. So we monitor all files that leave the organization and go to untrusted sites - that should be something that organizations communicate to their employees. That in itself, just being transparent, will not only earn trust, but it will also slow down the exfiltration that's happening, 'cause many organizations aren't watching the store, and employees know that. It's like, well, everybody else, when they left, they took all the data, so how come I can't take the data? 

Dave Bittner: (Laughter). 

Joe Payne: And so letting them know that you're going to watch the store, I think, is super important. So the second T is training. And it sounds obvious, but a lot of employees today don't know what they own versus what the organization owns. Hey, I built that little widget. It was my source code. I wrote it, so I just want to take a copy of it with me when I go to my next job. Or I worked on those prospects. That's my pipeline, my sales pipeline, so when I leave this company and go work for our competitor, I want to take that pipeline with me. No, that's not allowed. You know, you were paid to do those things, and the company owns that information. So just some basic training on what is intellectual property and what you're allowed to take with you or what you're allowed to use outside of work is really important. 

Joe Payne: In addition, we find all the time that people don't realize - oh, wait, we're supposed to share documents using G Drive or OneDrive or Box? I thought we were supposed to do it with Dropbox. It's like, no. So training's super important. And then the last T is technology. Having the right technology in place to monitor will put teeth in the policy and help people understand that, oh, wow, people are watching the store, and if somebody does take data, there are consequences. 

Dave Bittner: All right, Joe, what do you think? 

Joe Carrigan: Good interview. Some interesting points come out of this. Two-thirds of data breaches were caused by insiders. I'm not surprised by that, actually, especially with the nature of our podcast, right? 

Dave Bittner: Yeah. 

Joe Carrigan: That doesn't mean that two-thirds of these breaches were caused by malicious insiders. I'm going to get to that in a minute. 

Dave Bittner: Right. 

Joe Carrigan: The collaboration technology is a real vector for these things. He talks about Slack. I mean, I have Slack at my office. And I'm actually on a couple of different channels that are outside of my office, that are, you know, outside of my domain where I can collaborate with other people in different environments. Now, I don't send files to them, but I sure could. 

Dave Bittner: Yeah. 

Joe Carrigan: The tool lets me do it. 

Dave Bittner: Yeah. 

Joe Carrigan: It's interesting. And the other thing he says about this is that data is much more portable. And the point he makes is about 20 years ago, you would have to either - well, maybe 25 or 30 years ago - you'd have to make copies of everything, physical copies. You'd have to know that you were doing something malicious in order to do it. And most people actually aren't malicious, right? 

Dave Bittner: Right. 

Joe Carrigan: That's why society works. That's part of human evolution. 

Dave Bittner: (Laughter) Right. 

Joe Carrigan: We generally try not to take advantage of each other. There are a few of us that do. 

Dave Bittner: Yeah. 

Joe Carrigan: But generally, we don't. The risk factor was much lower back then. But now, with the combination of the data being very portable and the technology being so readily available, these kinds of things happen all the time. Insider risk versus insider threat - I think that is a really important distinction. I'm not sure that the subtlety is going to be apparent to your insiders, so be careful with the language you use when you talk to your people. Don't say, we're going to mitigate some insider risk, because immediately, they're going to think insider threat, right? And they're going to say, what? Don't you trust us? 

Dave Bittner: Yeah. 

Joe Carrigan: He talked about the IT department. And sometimes the IT department is called the N-O department, right? 

Dave Bittner: Or the department of no. 

Joe Carrigan: Like, can I do this? No. Right. 

Dave Bittner: Yeah (laughter). 

Joe Carrigan: Well, people will get around your systems. 

Dave Bittner: Yup, yup. 

Joe Carrigan: Right? And... 

Dave Bittner: Shadow IT. 

Joe Carrigan: Exactly. 

Dave Bittner: Yup. 

Joe Carrigan: And one of the things you have to remember is that IT is something you do for people, not to people, right? 

Dave Bittner: (Laughter) Yes. 

Joe Carrigan: And it's - you're trying to provide them with the ability to get their job done. And if they're coming to you and asking you for a way to move files, be receptive to that, all right? Say, let me get something up and running. You probably already actually have that. If you have - if you use a Microsoft solution like Office 365 or Microsoft 365, whatever they're calling it these days, you probably already have OneDrive, which is a great file-sharing solution. 

Dave Bittner: Google has their version. There's... 

Joe Carrigan: Google has... 

Dave Bittner: You know, there's... 

Joe Carrigan: Yeah. 

Dave Bittner: ...Dropbox. I mean, there's no shortage of them... 

Joe Carrigan: Right. 

Dave Bittner: ...Available. Yeah. 

Joe Carrigan: Yeah. When he's talking about his five-point plan - identify, define, prioritize, automate and improve - one of the key points, and one we talk about here, is that in identify, everything is valuable, right? There is no such thing as information that doesn't have value. Then you try to distinguish between benign and malicious behavior - I guess that's what his company does. That's probably the business model. And then prioritize, because you can't watch everything. You know, I think in the future it's going to be a lot easier to watch everything with the development of AI tools and machine learning. One of the most interesting things I thought about this interview was that he said ZIP files - nobody uses ZIP anymore unless they're going to obfuscate and exfiltrate data. 

Dave Bittner: Yeah, I don't know about that. I mean, I guess for me, being an old timer, a lot of times we would zip files up so that the file type would go through. 

Joe Carrigan: Right. 

Dave Bittner: You know, sometimes when you transferred things online, something would get lost in the translation. 

Joe Carrigan: Well, a lot of times, we'd be sending code back and forth. So we couldn't just paste the code into an email and send it, right? Because, first off, it's a big project with a lot of files. 

Dave Bittner: Right. 

Joe Carrigan: But if the code came through or there was an application - you know, a compiled application in the email attachment, the email system would strip it off. 

Dave Bittner: Yeah, right. 

Joe Carrigan: So we would zip it. 

Dave Bittner: Right, right. And so I suppose it's - you know, this is - could very well be an example of me continuing along with old thinking that really doesn't apply anymore. 

Joe Carrigan: Right. 

Dave Bittner: Like, the systems probably just handle anything you throw at them. 

Joe Carrigan: Right. 

Dave Bittner: And you don't need to zip things anymore. But... 

Joe Carrigan: Well now, instead of zipping it, you put it in a code repository like GitHub. He talks about the three Ts - the transparency, technology and training. And I want to focus on the human aspect here, the transparency. You absolutely want to tell your employees what you're watching and why. Nothing will make your employees hate you more than showing up and saying, hey, we noticed you moved this file up to an unapproved sharing system. You're watching that kind of stuff? Tell them. Tell them that. Put that... 

Dave Bittner: Right (laughter). 

Joe Carrigan: ...In the acceptable use policy, you know? 

Dave Bittner: Right, right. 

Joe Carrigan: Make sure that they get training on it. Make sure that they know. That way, when you come and you say, hey, you moved this to an unapproved sharing platform, they go, oh, I shouldn't have done that. Let me pull it back down. 

Dave Bittner: Yeah. 

Joe Carrigan: Then they realize, fairly, that it was their mistake. If you don't tell them, how do they even know it's a mistake? Frankly, then it's not their mistake. 

Dave Bittner: Yeah. I have an acquaintance who I know through a volunteer organization that we both work with. And he's a business owner. And through COVID, when his employees went to work at home, he installed every possible kind of monitoring software on their computers that was available - like... 

Joe Carrigan: Right. 

Dave Bittner: ...You know, webcams, you know, timers, mouse movements, like, everything. And he comes back to talk to some of us, you know, other people who've had business experience, and he says, I don't understand why I have so much turnover compared to my competitors. And we're like, hmm, I don't know. What a mystery. 

Joe Carrigan: Right. 

Dave Bittner: You know? 

(LAUGHTER) 

Dave Bittner: So the point is... 

Joe Carrigan: That is a completely different problem. 

Dave Bittner: There's a fine line here. 

Joe Carrigan: Right. 

Dave Bittner: And I think, as you say, tell people what you're doing, why you're doing it, and get buy-in. Don't just say, this is what is happening, here is why, and, you know, we are watching all the time. Instead, say, these are the things we're trying to protect against, and these are the things we've put in place. 

Joe Carrigan: Right. 

Dave Bittner: And let them be a part of the solution. Maybe they have suggestions, or they can come to you and say, you know, I think that's a little intrusive and here's why. And so at least you're having a conversation and you're being collaborative, it's... 

Joe Carrigan: Right. 

Dave Bittner: ...Not just, you know, something coming down from the boss mountain, right? 

Joe Carrigan: Boss mountain. 

Dave Bittner: All right. Well, our thanks to Joe Payne for joining us. Again, he's from Code42. We thank him for taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. And we want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. 

Dave Bittner: The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: And I'm Joe Carrigan. 

Dave Bittner: Thanks for listening.