CSO Perspectives (Pro) 9.21.20
Ep 23 | 9.21.20

Red team, blue team operations around the Hash Table.

Transcript

Rick Howard: Hey, everybody. Rick here. Last episode, I went through the history and evolution of pen testing, red team blue team operations and, finally, purple team operations. But as I started talking to our CyberWire subject matter experts at the Hash Table about the topic, I discovered that these names do not represent hard and fast boundaries of tasks associated with each. It was more of a spectrum of activities with overlap on the edges. In my head, I've been calling these activities opposing force exercises to distinguish them from the more typical network defender operations like, you know, monitoring, configuring, patching and whatnot. As with many things in cybersecurity, though, there exists a lot of gray area or, shall we say, wiggle room when it comes to defining what opposing force exercises actually are.

Rick Howard: My name is Rick Howard. You are listening to "CSO Perspectives," my podcast about the ideas, strategies and technologies that senior security executives wrestle with on a daily basis. Today, we are talking to two cybersecurity thought leaders about their experiences and views on red team blue team operations. 

Rick Howard: Let's start with the blue team. For all intents and purposes, the blue team is the infosec team. They are the folks that are the bread and butter of your network defender organization. Team members are the folks in your SOC, on your intel team and on your incident response team, among others. They have designed and deployed your network defense architecture, help maintain it and respond to potential bad guys trying to penetrate it. In this role, we don't usually call them the blue team, but if we go with the Prussian military board game I mentioned in the last episode, the blue team members are the good guys trying to defend their turf against the black hat hackers. 

Rick Howard: I invited Tom Quinn to the Hash Table to talk about red team blue team ops. He is the T. Rowe Price CISO and is a regular on this show. He described blue teams this way. 

Tom Quinn: Blue team is often referred to as the defenders of your environment. Often, it's incident response people. It could be a help desk. It is people who are in front of the alert fabric, your cyber incident response fabric and the like, and they're your responders. 

Rick Howard: In the early 1970s, good-guy hackers, white hat hackers, ethical hackers started to use their skills against their own systems. These exercises became known as penetration tests. A separate team would attempt to poke holes in the technology deployed to protect the enterprise - things like boundary obstacles, live operating systems and deployed software. The pen testing teams were not trying to mimic known adversaries. They were trying to find unknown open windows and doors or holes in the fences of the deployed security architecture. When found, the organization could decide how to close them and thus reduce their attack surface. Security experts have different ideas on how to use these teams, everything from, on one end of the spectrum, sitting the team somewhere on the internet and telling them to find a way in, however they want to do it, to, on the other end of the spectrum, giving the team extremely specific parameters about what they are supposed to try and from where they are supposed to try it. For my part, I never thought the former was that valuable. Can a pen test team find their way in? Of course they can. That's what they get paid to do. That's not the right question. 

Rick Howard: I was talking to Rick Doten about this at the Hash Table this week. He is the CISO for Carolina Complete Health, and he has been on the program now several times. Before he was a CISO, though, in a previous life, he ran a commercial pen test team. His clients would ask him to see if his pen test team could get into their networks. Here's what he would tell them. 

Rick Doten: So when I was a consultant, I would often have customers who'd call and say, hey, I'd like a penetration test just to see if you can get in. And I would always tell them, save your money. Yes, we can. There's no question about it. You know, it's like, if you have a specific reason that you want - something to focus on, or you just updated a system or even your monitoring, or you want to test it the way that these controls are acting, that would be something. But if it's just a general can you get in, yes, we can always get in. 

Rick Howard: I think his point is that pen tests should not be free-for-alls. They should be highly tailored to test something specific, like, you know, a newly deployed S3 bucket or a change in firewall settings or maybe even a newly deployed server farm. 
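
To make that concrete, here's a minimal sketch in Python of the kind of narrowly scoped check a tailored pen test might codify - in this case, verifying that a newly deployed S3 bucket blocks public access. The bucket name below is a hypothetical stand-in; a real engagement would take its targets from the agreed rules of engagement.

```python
# A minimal sketch of one narrowly scoped pen test check: does a newly
# deployed S3 bucket block all public access? The bucket name is a
# hypothetical stand-in for a target defined in the rules of engagement.
import boto3
from botocore.exceptions import ClientError

BUCKET = "newly-deployed-example-bucket"  # hypothetical target

s3 = boto3.client("s3")
try:
    config = s3.get_public_access_block(Bucket=BUCKET)["PublicAccessBlockConfiguration"]
    # Any of the four settings left False is a finding worth reporting.
    gaps = [name for name, enabled in config.items() if not enabled]
    if gaps:
        print(f"FINDING: {BUCKET} leaves these protections disabled: {gaps}")
    else:
        print(f"PASS: {BUCKET} blocks all public access")
except ClientError as err:
    if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
        print(f"FINDING: {BUCKET} has no public access block configured at all")
    else:
        raise
```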

Rick Doten: Is there anything - did you just change something? Or is there something you're monitoring, something you want to prove, something you want to test? Then we'll come up with scenarios that you want to kind of see. I had one that was kind of capture the flag. It was a media company who said, we have this database with a lot of sensitive media, you know, whether it was, you know, like, you know, movies or things that were timely, you know, things like that. And they said, we want to give you - you know, how much time do you think we should give you to try to get into it and access it before you think that's good enough from a security standpoint? And so that was kind of an interesting approach to say, all right, here's the target. You know, I will give you five business days to get to it however you want to. And if you can't get to it in five days, then I'll say, all right. That's good. And, you know, we'll kind of move on. 

Rick Howard: That kind of tailoring is especially useful when contractors might want to sell you 90 days of time from their ethical hacker team. And the scoping doesn't have to be that detailed. 

Rick Doten: You know, there was a lot of testing going on, and they streamlined it to, like, OK, if you can find a vulnerability and show that, oh, this particular FTP, you know, instance on Solaris is vulnerable because of this version, and you proved it before, that's all I need. You don't need to prove the point. I'm going to fix it anyway. Just stop there because they want to streamline the process. You know, and sometimes, like I said, there was a time element. I had one - a different media company in New York that asked me about, can I just do a three-day assessment? And because, you know, having done this for many, many years, if it's bad, you will find out in the first hour, you know? You're starting to look at, like, oh, here it is, you know? And so, you know, after a couple of years of being a customer, she realized that, oh, you know, we find all the good findings, like, in the first day or two. And then it kind of diminishes after that. So she goes - can I just have a three-day one? Whatever you find in three days is good enough. I know it's not comprehensive, but I just need to know, does this platform suck or not? And if it sucks, then we may do some additional things. And if you don't find anything in three days, we may say, OK, we'll put this on a lower priority list because I know that if you find something, you're going to find it right away, not after, like, you know, five or six days of grinding on stuff. Then it's much more skewed because you have to find something 'cause it's pretty secure. 

Rick Howard: It wasn't until the late 1990s and early 2000s that we got the idea of using red teams to emulate known adversary activity. These are the red pieces from the Prussian military game, the bad guys. But remember when I said that not everybody agrees with hard boundaries between penetration testers and red teamers? It's more of a spectrum of activity that overlaps. Here's Tom again, the T. Rowe Price CISO. 

Tom Quinn: So red team - right? - are people who are in place - to me, they do a couple of roles. One is to test the defenders, to test the cyber controls and to look to find cracks and areas for improvement. So I view that as almost a quality assurance capability for the blue team itself. But I also find for red team that their goal isn't just to run a tool like a pen tester maybe or to dig into a web application or something maybe or to scan the environment with, you know, pretty well-understood tooling and the like. It really is to subvert a control set by any means necessary. And I - again, I see them as different, right? The red team having to - those two functions. And then pen testers may be a bit more generic and a bit more tech design-oriented. There's lots of terrific tools, as well, to help, you know, do a lot of the heavy lifting. And I think the purpose of pen testing is, to me, much more of a quality assurance function for your system and design practices. 

Rick Howard: For the sake of discussion, though, let's assume that our red team is trying to emulate known attacker behavior. We know from the original Lockheed Martin kill chain paper that adversaries must successfully negotiate every step of the intrusion kill chain to accomplish their mission. Any red team worth their salt will be running attack campaigns using that as a model or some version of it. Rick Doten worked for Lockheed Martin when the research team rolled out the original idea back in 2010. He didn't help invent it, but he was the customer frontman who had to explain it to the masses. 

Rick Doten: When I was at Lockheed Martin, I talked a lot about the, you know, cyber kill chain, you know? I did not come up with it. The gentleman who came up with it I knew. And - but I was the face guy, and I briefed it a lot. And I talked about how this kind of changes the game because remember all of our friends used to have their signature on their email, you know? In cybersecurity, you have to be right every single time, and the bad guy only has to be right once. And so this flips it to being if there was a kill chain, then they have to be something unique every single step of this chain, or else I'm going to see them. And - 'cause I'm going to learn about the stuff they did before. And they messed up, and they're going to change one thing. It's like, all right, well, that exploit didn't work. Let me try another exploit. But the delivery method and the installation method may be the same, or they may have been the same, or the C2 channel may have been the same, but they change the exploit. And so I'm kind of watching them. And I thought that was a really great way of putting it, but that also humanized it, as well. 
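
Rick's point about flipping the asymmetry lends itself to a little back-of-the-envelope arithmetic. Here's a minimal sketch - the 30% per-stage detection rate is a made-up number, and it assumes, for simplicity, that each stage is an independent detection opportunity - of why forcing the adversary to be unique at every link of the chain compounds in the defender's favor.

```python
# Back-of-the-envelope sketch of the kill chain asymmetry: the attacker
# must evade detection at EVERY stage, so modest per-stage detection
# compounds. The 30% detection rate is a made-up number for illustration.
STAGES = ["recon", "weaponize", "deliver", "exploit",
          "install", "command-and-control", "actions-on-objectives"]

detect_per_stage = 0.30  # hypothetical chance of catching any one stage
evade_all = (1 - detect_per_stage) ** len(STAGES)

print(f"Odds the attacker runs all {len(STAGES)} stages undetected: {evade_all:.1%}")
# Prints roughly 8.2% - the 'bad guy only has to be right once' framing
# flips when the defender gets seven chances to spot them.
```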

Rick Howard: So we have blue teams, penetration testers and red teams. But when you run a coordinated exercise between the red team and the blue team, you have something called a purple team. 

Rick Doten: Yeah. And that's kind of like, you know, football practice, right? You know, your offense plays the defense (laughter). And so - or yeah. Or somebody else's offense comes and plays your defense. But, yeah. So that's a thing where there might be - going back to your definition of red team being specific scenarios that they're going to attempt and that the red - that the blue team knows that it's going to happen and are watching to see if it's there and to try to respond and recover and mitigate from that. 

Rick Howard: And then Rick came up with this great visual to represent the red team blue team exercises. Do any of you old-timers out there remember the vintage Warner Brothers cartoons, the "Looney Tunes" collection of Bugs Bunny and Daffy Duck and Elmer Fudd? I grew up on that stuff. Saturday morning cartoons were a religion at the Howard house. Back in the day, from 6 in the morning till about 1 p.m., much to the chagrin of my father, my butt was glued to the couch, shifting between three channels - ABC, CBS and NBC - to watch the best cartoons. Now that I look back on it, that explains a lot about my personality. Do you remember the one about the sheepdog named Sam and the wolf named Ralph? These are two characters who are not necessarily friends but perhaps good neighbors. Each morning, they get up, commute to work, exchange pleasantries, clock in and then beat the living tar out of each other as the wolf tries to steal the sheep, and the dog tries to stop him. At the end of the day, they clock out, exchange pleasantries again and commute home to get ready to do it all over again the next day. 

(SOUNDBITE OF TV SHOW, "LOONEY TUNES") 

Mel Blanc: (As Sam Sheepdog) Morning, Ralph. 

Mel Blanc: (As Ralph Wolf) Morning, Sam. 

Rick Howard: That's how Rick thinks about red team blue team exercises. 

Rick Doten: It's kind of like, you know, the Warner Brothers cartoon with the sheepdog and the coyote where they both punch into the time clock, and then they beat each other up. Morning, Fred. Morning, Ralph. And they beat each other up. That's our life in security operations. They know who we are. We know who they are. Their job is to get in. Our job is to keep them out. And so the point being this group of, you know, humans whose - their job is to get to our data and either disrupt our operations, steal intellectual property, steal our money or whatever - that's what we have to kind of look at it as. And they have opportunity. They have capability. And they have intent. And we need to understand what they are to better defend ourselves. 

Rick Howard: One thing that Rick brought up was an idea we heard about in our two episodes on incident response. Steve Winterfeld, the Akamai advisory CISO, said that you can put in place these things called Kovel arrangements - basically, legal documentation declaring that incident response communications during a crisis can't be released to any future discovery motions from lawyers who might try to sue the company, you know, for bad incident response handling or something like that. With a Kovel arrangement in place, those communications are privileged. Rick says that you could have something similar in place to protect your red team blue team operations. 

Rick Doten: In the last five years or so, it's been very common, you know, especially in forensics but also in penetration testing, that, you know, if they're - you know, more common in the incident response standpoint, where if there was an incident, you have outside counsel who - you engage a firm. And then that output becomes, you know, privileged product that cannot then be disclosed if there's something - something goes on. So we've seen in the last five years, also, with penetration testing that, OK, if you're going to come and find vulnerabilities, I want to make sure this report is not something that gets discoverable if there is some - you know, a litigation or something like that or a lawsuit that there - you know? It's like, oh, well, they knew about it because it was in a report from two years ago, and we were able to obtain the copy of the report. 

Rick Howard: Even with Kovel arrangements in place, senior executives may not quite understand what you are trying to do. I've seen it in my own career. I told my boss that I wanted to turn hackers loose on our production networks, and he tilted his head and raised his eyebrows, looking at me like I might have a horn growing out of my head. Here's Tom again, explaining his experience talking to executives about the exercises. 

Tom Quinn: Yeah. One of the things I've observed - right? - is, you know, audience matters, too. What I find is, you know, for most businesspeople, they're barely going to understand a concept like red team blue team. 

Rick Howard: And even if they authorize the exercise, if anything goes wrong with the production system at the same time, the opposing force exercise will be blamed first. Here's Rick again. 

Rick Doten: From the business side of understanding, like, it's not going to be impacted - and, rarely, it does. And when there is anything that goes on - and this is from both the - being on both sides of the table as a consultant running pen test teams - if anything happens on the network, you're going to get blamed for it. 

Rick Howard: Both Tom and Rick were quick to point out that you have to do some prep work before the exercise, document exactly what you did during the exercise and quickly summarize exactly how the exercise went afterward in order to relieve the executives' anxiety about this thing that they don't understand. 

Rick Doten: We're very diligent about our timing and documenting everything we're doing so that when some server goes down or router goes down, we're like, yeah, we weren't touching that. That's something else because you always get blamed for it. And they're not going to believe you until you prove that it wasn't you. And then, oftentimes, we'll discover what the real problem was, which is something that they did. And so - but yes. It - you know, and it's making them feel comfortable. And so that's what - you know, when I was leading teams, you know, particularly later on, my job as a leader was to make the customer feel comfortable with what was happening. And so I want them to be comfortable with this and be very, very, very transparent because 20 years ago or, actually, 15 years ago, there were a lot of teams that were not transparent, and they'd want to be very secretive about it. And, like, we're in the conference room, and you don't get to see what we're doing. We'll just give you the report. And when I have customers ask me that, I'd say, I hope you kicked them out because you can sit with our guys and watch what they're doing. You can be there every step of the way. I've had probably a dozen customers sitting with me when I'm doing pen testing against their environments, and we're an open book. There is nothing that they shouldn't know that I'm doing against their environment. And so making them feel comfortable and being very transparent is a very active part of it so that if something goes wrong, you can either rule out that it wasn't you or understand what exactly it was, so you can remediate. 

Rick Howard: Tom goes so far as to write rule books for each side that he calls team charters or RACI charts. They lay out the boundaries - the things that each team can and can't do during the exercise. RACI is spelled R as in responsible - who's responsible for participating in completing an action; A as in accountable - who is accountable for making sure an action is complete; C as in consulted - who is consulted during the process of completing an action; and I as in informed - who is informed about the state of an action. Here's Tom. 

Tom Quinn: Those charters are approved. So you know, what are our roles and responsibilities? You know, is there a RACI diagram - right? - that's there? And it's not to be bureaucratic. I think it's really to ensure that people have a very clear understanding of what they're supposed to do. You know, in some cases - right? - people just - you know, if they're not - if they don't have the right mindset or they're not inculcated correctly into the role, you know, you could just - you know, people may think that their job or responsibility is to turn over every rock that they can in any way, shape or form. And that's not the role of this. 

Tom Quinn: And I think, especially for new people coming into a red team and, for that matter, new people coming into a blue team, if they've been trained in another organization or they're maybe right out of college or right out of high school - right? - for something, they may have a different set of expectations, and I think it's crucial to onboard them and give them as many - as much training as you can. But the audience for those charters isn't just the red team and the blue team. Part of the audience for those charters is management and peer managers - right? - who are in operations in the business team so that they can see, really, what we do. I think people do get charters. People do see, like, and, I think, take comfort in the fact that, you know, it's really been reviewed by others and approved and that they may even have an opportunity to opine on it. So there's lots of good reason to have that, and I think, you know, just a little bit of bureaucracy, right? Lower case B bureaucracy - a little bit of bureaucracy is - I think is a good thing. 
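
For anyone who hasn't worked with one, here is a hypothetical slice of what a RACI chart for a red team blue team exercise might look like. The activities and assignments below are invented for illustration; they are not Tom's actual charter.

```
Activity                       Red Team   Blue Team   CISO   Business Ops
-----------------------------  ---------  ----------  -----  ------------
Approve exercise scope         C          C           A/R    C
Launch the attack campaign     R          I           A      I
Detect, respond and recover    I          R           A      C
Halt on production impact      R          R           A      I
Write the after-action report  R          C           A      I
```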

Rick Howard: With all of this activity - blue teams, penetration testers and red teamers - and all the knowledge needed to understand the intricacies of the intrusion kill chain, plus all the hoops you have to jump through to make sure that senior management doesn't freak out, it sounds like the kind of people you need to do these jobs are not the typical infosec people. Once you figure out what that means, how do you find them? Here's Rick again. 

Rick Doten: They're special. And I have hired hundreds of them over the years and actually kind of learned new interview techniques because many of them, you know, have poor communication, poor, you know, verbal skills. Many are on the spectrum. You know, these are very detail-oriented people who do not give up on a problem until they figure it out, and that's one of the important things. And so the interview technique that I had was I would do a lot of talking, and I would see how they react to things that I said. And I knew what they knew by what got them excited because if I just say, describe to me your ability to do this, they would just kind of shut down. 

Rick Howard: I love the way that Rick looks at this. He and I are both sci-fi geeks, and he likens hiring red team and blue team people to the way actor Robert Preston recruits fighter pilots in the 1984 movie "The Last Starfighter." He is looking for aptitude, not years of experience. Preston, by the way, an Academy Award nominee for "Victor/Victoria" but probably best known for his performance in the 1962 musical "The Music Man," plays a character named Centauri. Centauri places video games all around the galaxy - it was the early 1980s, after all - and the players who got the high scores demonstrated the aptitude to be great fighter pilots. 

(SOUNDBITE OF FILM, "THE LAST STARFIGHTER") 

Robert Preston: (As Centauri) I must congratulate you on your virtuoso performance, my boy. Centauri is impressed. I've seen them come, and I've seen them go, but you are the best, my boy - dazzling, light years ahead of the competition - which is why Centauri is here. He's got a little proposition for you. Interested? 

Rick Howard: I totally agree with this. From my own experience, I don't need Ph.D.s in computer science sitting in my SOC. I don't need analysts with an alphabet soup of certifications running my incident response team. I need them to know some basics of how operating systems work, how networks function and how to script their own tools. But mostly what I need is a desire to learn new things, a passion to break things apart and put them together again and the tenacity to not give up on seemingly intractable problems. I need people with aptitudes for this stuff. 

Rick Doten: Good skills in pattern matching and problem-solving and abstract thinking, you know, whether they were literally janitors or, you know, someone who worked in a bar or something like that - and so I think that we're starting to, like, realize we can uncover these people who have the aptitude that don't realize that they can do this. 

Rick Howard: After preparing two entire episodes and talking to many of the CyberWire's experts around the Hash Table on the subject of red team blue team operations, I'm still asking myself this basic question - are red team blue team operations essential to any infosec program? As the last two seasons of "CSO Perspectives" have been about first principles in cybersecurity, I'm struggling here to make the case that spending the required resources on this activity in order to significantly reduce the probability of material impact on my organization is somehow fundamental. From my viewpoint, red team blue team operations and maybe even penetration tests are not the first lever I'm going to pull in building my infosec program. The first priority is to establish the four big strategies - resilience, zero trust, intrusion kill chain prevention and risk assessment. Once I have implemented those strategies and have matured them to a certain level, then I might try to install those opposing force techniques. Penetration tests could definitely improve an existing zero trust program, and red team blue team operations or purple team operations could definitely improve the intrusion kill chain prevention program, but only after those two big strategies have been in place for a while and are mature enough that new elements can be added. Here is Rick Doten with the last word. 

Rick Doten: It depends on how mature you are. If you're less mature, and you're still, like, working on trying - you know, you're understaffed. You don't have the instrumentation - then don't worry about that. You know, try to get the fundamentals done well. But when you're at the point where you have a good program and you're trying to make sure it's always improving, then do it. And then that helps you find the things that you are maybe missing or didn't think about or maybe didn't have coverage on. If you're starting out with the fundamentals - you know, configuration management, I'm watching things, I have controls for this thing, I'm using multifactor, blah, blah, blah - then as you're building up on that, just do vulnerability testing, which is just kind of like making sure that everything you see on the surface area is good enough. But when you get to the point where you say, I think I have a good program, and I want to now try to make it better, that's where they fit in. 

Rick Howard: And that's a wrap. Next week, we will be doing the last episode in season two. We will be summarizing what we have learned this season about security operations centers, incident response, data loss prevention, identity management and now red team blue team operations. In the meantime, if you agreed or disagreed with anything I have said about red team blue team operations, hit me up on LinkedIn, and we can continue the conversation there. I have an open mind here. If you think that these opposing force exercises are essential, convince me. I'm ready to change my mind. Until then, see you next week. 

Rick Howard: The CyberWire's "CSO Perspectives" is edited by John Petrik and executive produced by Peter Kilpe. Our theme song is by Blue Dot Sessions, and the mix of the episode and the remix of the theme song was done by the insanely talented Elliott Peltzman. And I am Rick Howard. Thanks for listening.