Adding some color to incident response.
Dave Bittner: It's February 22, 2023, and you're listening to "Control Loop." In today's OT cybersecurity briefing, Dragos has released its ICS/OT Cybersecurity Year in Review for 2022, finding a rise in ransomware attacks targeting industrial organizations. Forescout discloses two vulnerabilities affecting the Unity line of Schneider Electric's Modicon Programmable Logic Controllers, and dozens of vulnerabilities have been discovered in industrial Internet of Things devices. Today's guest is Tim Starks from The Washington Post's Cybersecurity 202. Tim and I talk about the upcoming White House National Cyber Strategy and its possible effects on critical infrastructure. In the Learning Lab, Dragos' VP of product and industry market strategy, Mark Urban, begins his two-part discussion about the importance of incident response planning with Vern McCandlish, who is a principal industrial incident responder at Dragos.
Dave Bittner: Dragos has published its ICS/OT Cybersecurity Year in Review for 2022. The report found that ransomware attacks against industrial organizations nearly doubled last year, with 70% of these attacks targeting the manufacturing industry. The report states, there were multiple reasons for the increase in ransomware activity impacting industrial organizations, including political tensions, the introduction of LockBit builder and the continued growth of ransomware as a service. Dragos observed ransomware trends tied to political and economic events, such as the conflict between Russia and Ukraine and Iranian and Albanian political tensions.
Dave Bittner: The security firm also discovered two new threat actors in 2022 - CHERNOVITE and BENTONITE. CHERNOVITE is the developer of PIPEDREAM, an ICS attack framework that Dragos says represents a substantial escalation in adversarial capabilities. The framework was likely developed by a state-sponsored actor, but Dragos says it doesn't appear to have been deployed in the wild yet, stating, Dragos assesses with low confidence that no adversary has employed or leveraged components of PIPEDREAM against industrial networks for disruptive or destructive effects. Dragos' discovery of CHERNOVITE constitutes a rare case of accessing and analyzing malicious capabilities developed by an adversary before its deployment, giving defenders a unique opportunity to prepare in advance.
Dave Bittner: Dragos' CEO, Robert M. Lee, said in a briefing that CHERNOVITE targeted multiple electric and liquefied natural gas sites in the U.S. in early 2022. Lee stated, this is the closest we've ever been to having U.S. infrastructure go offline. While PIPEDREAM wasn't deployed, Lee says the threat actors were getting very close to pulling the trigger. Politico notes that Mandiant suspects that a Russian state-sponsored actor is behind PIPEDREAM.
Dave Bittner: BENTONITE is a threat actor that's been opportunistically targeting maritime oil and gas, governments and the manufacturing sectors since 2021. Dragos says BENTONITE conducts offensive operations for both espionage and disruptive purposes. Dragos, as a policy, doesn't attribute threat activity to particular nation-states. But the researchers note that BENTONITE has overlaps with the threat actor tracked by Microsoft as PHOSPHORUS, which Microsoft has tied to the Iranian government.
Dave Bittner: The report also offers a look at data related to security improvements in different industrial verticals in 2022 compared to 2021. Dragos says the oil and gas industry improved its security measures in most areas, probably due to the TSA security directives issued following the ransomware attack that hit Colonial Pipeline in May 2021. The researchers note, identifying IT/OT interdependencies and applying strong network segmentation were major aspects of the security directives.
Dave Bittner: Forescout has disclosed two vulnerabilities affecting the Unity line of Schneider Electric's Modicon Programmable Logic Controllers. The security firm discovered the flaws last year as part of its OT:ICEFALL research but waited to disclose them at the request of the vendor. One vulnerability can enable remote code execution via an undocumented memory write operation, while the other exemplifies a broken authentication scheme. The two flaws can be chained to carry out remote code execution on Modicon Unity PLCs, which can enable deeper access to industrial control systems.
Dave Bittner: The researchers note that while the exploitation of these flaws is complex, organizations should keep these types of vulnerabilities in mind. The Record quotes security researcher Jos Wetzels as saying, this is not your average script kiddie stuff, but it is something you should take into account as a possibility when you are designing new system architectures.
Dave Bittner: The U.S. Cybersecurity and Infrastructure Security Agency, on February 16, released 15 industrial control system advisories. They cover systems by Siemens, Sub-IoT, Delta Electronics and BD Alaris. Operators, check your systems. And as always, apply updates per vendor instructions.
Dave Bittner: Researchers at OTORIO have discovered 38 vulnerabilities affecting industrial Internet of Things devices from four separate vendors. Three of the vulnerabilities affect Etic Telecom's remote access server. Two of the flaws impact Sierra Wireless AirLink routers. And five affect InHand Networks' InRouter302 and InRouter615. The rest of the vulnerabilities are still in the disclosure process. The researchers note that attackers can use publicly available apps, such as WiGLE, to identify these types of vulnerabilities, stating, our scanning uncovered thousands of wireless devices related to industrial and critical infrastructure, with hundreds configured with publicly known weak encryptions.
Dave Bittner: To mitigate IIoT vulnerabilities, OTORIO offers the following recommendations. First, establishing a zero-trust policy between cells and the L3 control center, ensuring that if an attacker compromises a single cell, they won't be able to reach other cells or unnecessary services in the L3. Next, applying a whitelist-based communication template monitored by the firewall or IPS between L3 and the cells. The communication template will guarantee that only allowed traffic is sent from the cells to the L3. And lastly, creating a proxy address for internet-managed devices - industrial cellular gateways, intelligent field devices and so on. Traffic will be sent to the proxy, which will perform man-in-the-middle inspection of the data to detect any malicious behavior.
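Dave Bittner: The whitelist-based communication template described above amounts to a default-deny allowlist enforced between the cells and the L3 control center. The sketch below is a minimal illustration of that idea only - the cell names, server names, ports, and protocol labels are hypothetical and are not taken from OTORIO's report.

```python
# Minimal sketch of a whitelist-based communication template between
# plant-floor cells and the L3 control center. All names, addresses,
# and ports here are illustrative assumptions, not real deployments.

ALLOWED_FLOWS = {
    # (source cell, destination in L3, TCP port): description
    ("cell-01", "l3-historian", 44818): "EtherNet/IP telemetry",
    ("cell-02", "l3-historian", 44818): "EtherNet/IP telemetry",
    ("cell-01", "l3-scada", 502): "Modbus/TCP polling",
}

def is_allowed(src: str, dst: str, port: int) -> bool:
    """Return True only if the flow appears on the explicit allowlist.

    Anything not listed is denied by default, which is what keeps a
    compromised cell from reaching other cells or unneeded L3 services.
    """
    return (src, dst, port) in ALLOWED_FLOWS

# A firewall or IPS sitting between the cells and L3 would evaluate each
# observed flow against the template and drop or alert on everything else.
print(is_allowed("cell-01", "l3-historian", 44818))  # permitted telemetry: True
print(is_allowed("cell-01", "cell-02", 445))         # lateral movement: False
```

In practice this template would be expressed as firewall or IPS rules rather than application code, but the logic is the same: enumerate the few flows operations actually needs, and treat everything else as a detection.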
Dave Bittner: Tim Starks is the author of The Washington Post's Cybersecurity 202. I recently had the pleasure of speaking with Tim about the upcoming White House national cyber strategy and its possible effects on critical infrastructure.
Dave Bittner: Tim, it is great to have you joining us here on Control Loop. I want to focus on the White House's national cyber strategy and the effects that it may have on critical infrastructure. Can we start off with just a little overview for folks who may not be all that familiar with the White House's national cyber strategy? I know this is something you've done quite a bit of reporting on there at The Post.
Tim Starks: Yeah. So there's a relatively short history in terms of the United States of national cybersecurity strategies, as you might expect. There was one in the George W. Bush administration. It was not highly regarded, terribly, but it was also, you know, a first, so that's somewhat to be expected.
Dave Bittner: Yeah.
Tim Starks: The criticism of that was that it just wasn't very connected to the overall national security or homeland security strategies. So that was the first one. Obama did a couple of things that could have been considered a national cyber strategy, but they didn't have that name. They had different names. So that's not, you know, if you're being technical about it, they did not have a national cyber strategy. Trump did have one that is, you know, not poorly regarded. You know, one of the points of emphasis of that was about going on offense a little bit more, even though they didn't use some of that phrasing directly. That was one of the gists of it. It was very focused on - as all the others have been, regardless of whether they were true strategies or not - this need for public-private partnership, which, if you've been in the cyber world, you've been hearing about that phrase for so long...
Dave Bittner: Yeah.
Tim Starks: ...And voluntary measures. What we have happening with this administration, the Biden administration, is that their national cyber strategy is going to buck that convention. It's not going to say we don't need public-private partnership, but it is going to advocate more directly than any strategy ever has - in fact, none have even come close - for more regulation.
Tim Starks: So with the way strategies work, usually they are very high-level documents that don't have a lot of very specific policy prescriptions. They're meant to be a signaling of priorities. What's interesting about this one is we might actually see a little bit more of that than we usually do in a strategy. And one of the other things that's interesting is that there might be a follow-up implementation document. You know, when you look at most strategies, they don't say, this is how we're going to do it. They say, this is what we want to do. The possibility of this implementation element makes this potentially more substantial than a lot of other of these past national cyber strategies we've had.
Dave Bittner: What are some of the regulatory elements that have been included here?
Tim Starks: Well, again, it's very broad in terms of how it is going to approach that. I think what we've seen, you know, more specifically from the administration, it has been coming out of - let me break this down. There are several centers of power in the Biden administration as it pertains to cyber. This national strategy is being written by the national cyber director's office. There is a role for the White House National Security Council in putting some finishing touches on it. So this is the part of the document that is saying, this is the national cybersecurity director; these are the priorities. It talks about the need to use regulation to level the playing field on national security. It talks about critical infrastructure, in particular.
Tim Starks: And then some of the other stuff that we're probably going to see as far as breaking things down further, you know, will come in terms of elaboration from administration officials. And, you know, in fact, Chris Inglis has talked about needing to go a little further in areas with, you know, the kind of areas where we have put some more regulations on industries - and not just cyber - but talking about going a little bit further, specifically, he said, as we have for cars or airplanes or drugs or therapeutics. So the strategy looks at this from a very all-critical-sectors point of view and will talk - you know, it will talk about using executive authority. It'll talk about, we might need to go to Congress - we being the administration - when they lack executive authority.
Tim Starks: And then I think what we've been seeing as coming out of the National Security Council, there's been more specific direction about sector-by-sector need for regulation. And some of that has actually become reality already. Obviously, I think of things like the - what happened after Colonial Pipeline hack where they said to the TSA, we need you to put some more regulations in place for these very critical pipelines. They did that. They put some reporting requirements. They said we need you to develop these certain kinds of plans that we can look at. It was a little bit of a rocky process at first, but they've gotten to a point of some harmony between industry and the actual agency - in the case of this one, the TSA - in doing the regulations that are not so acrimonious as they were at first. So we're seeing some of this on a sector-by-sector basis playing out already, even without the strategy.
Dave Bittner: Do we know what the pecking order is going to be in terms of the various organizations that'll have a hand in this? I'm thinking of, like, CISA. You know, what is their part to play?
Tim Starks: CISA has very little part to play in this. They are going to have a - well, I shouldn't say little, but they're not going to be the agency that is telling people what to do. They are going to play a role in different ways. I'll give you an example of that. CISA, in this case, will likely be - according to a White House official I spoke to - supporting the Environmental Protection Agency on its rules and mandates for improving cybersecurity at water facilities. Right now, there are these sanitary reviews that EPA does, and the talk is of putting CISA officials on those teams so that when they do those reviews, they also will review them for cybersecurity. CISA does not have much in the way of regulatory authority. There's one exception recently where they passed this cybersecurity incident reporting law in Congress, where they - where CISA is going to play the lead role in writing that regulation and has been working on that already.
Tim Starks: Where most of this breaks down, I think, is that it will be the very sector-specific agencies. And this is not just my read - these are the plans. You have an agency like, you know, in the Colonial Pipeline case - TSA is an agency that makes sense, given their authority over some of those issues. That's what they call a sector-specific agency. So anyway, that is one of the examples. Other examples are, of course, you know, the Defense Industrial Base - all those defense contractors and experts and academics whose work is considered critical infrastructure - that'll be going to the Defense Department. There are some departments that share responsibility. Dams is one where there are multiple agencies involved and there are multiple rules. I think Interior and Defense are the ones for those.
Tim Starks: And then there are agencies where - that's the agencies where they know they have some authorities. That's what we're talking about so far. There are other agencies where they - their authorities are not clear. They have responsibility for those sectors, but they don't have the rulemaking power to do what - something like what TSA has already done. So, interestingly enough, a lot of those are at DHS specifically. And it's not clear why that is the case, whether that was an oversight on the part of the people who wrote and created the Department of Homeland Security or whether it was a deliberate decision to keep them hands-off. But that's things like election security or critical communications. Those are areas where their authorities either don't exist or they're very unclear. It's going to - basically it's going to go like that throughout the entire federal government and through every industry that is considered critical.
Tim Starks: This is, of course, something to keep in mind that this - a caveat is that we're talking about not just critical infrastructure, which is the term we use for all these industry sectors that are very important, but the most critical of the critical infrastructure. You know, in the case of the pipelines, I think there are less than a hundred that they - that these rules will apply to. They might expand them and are - in some cases, are working to expand these rules to include other - the entirety of a sector, but the focus at this point is largely on protecting the most critical of the most critical.
Dave Bittner: You know, industry, I think it's fair to say, generally doesn't like having more regulations put upon them. How has the critical infrastructure sector responded to what's coming from the White House?
Tim Starks: In some cases, they've been very blunt about how much they don't like it. I think of - the air carriers have been blunt about the way that this has been approached in their sector. There is maybe a little bit of a cognizance in the broader business community that this is what the administration wants to do. It's happening. They need to work with them and try to get these rules to be something like what they're more comfortable with - rules that aren't overly burdensome. And that is, to be clear, something that the strategy says. The administration does want to work with industry. They want to make it not so burdensome. But, you know, there's no - there are very few industry groups who are calling for regulation for themselves. It's extremely rare in any field, cyber included.
Tim Starks: There have - there are exceptions, perhaps, where you might hear people on the sidelines. I was talking to Senator Warner, who chairs the Intelligence Committee in the Senate. He said that he's heard from individual hospitals, smaller organizations, that they would like to see some regulation. But the industry groups in particular tend to not say, yes, we want this. In the case of, like, the Chamber of Commerce, what they've functionally said is we understand that these regulations are coming. They've been dealing with regulations in some way, shape or form in various sectors. So they're saying, let's make sure that these new regulations don't conflict with old regulations or don't conflict with other regulations that are in the works. They said, let's see about trying to give people incentives to do these things as opposed to punitive measures. So that's the way the industry is approaching it. It's a little bit scattershot in terms of how things are going. But overall, the message is we don't like them and we don't want them, if we can avoid it.
Dave Bittner: I wonder, too, you know, the old, please don't throw me in the briar patch, you know, kind of thing where, you know, I remember - I think it was some folks from Facebook were testifying. And they were saying how much they welcomed, you know, what - this - our sector needs, you know, some more rules. I wonder if that could be at play here, as well. Just a recognition that there's still work to be done. Things aren't as they need to be. So how do we balance acknowledgement of that with our desire to not be overly regulated?
Tim Starks: You know, it does not feel, to me, much like industry has been saying that they're underregulated, certainly, that they don't - you know, I think that they are aware that if the regulations are coming, they need to be working with the administration as opposed to not. You're better off being in the room than not being in the room, I guess, is what I would say.
Dave Bittner: Yeah. What are we looking at here in terms of timeline and prioritizing what different areas get attention?
Tim Starks: Yeah. It's interesting. I had reported at my previous employer back in the spring about what the thrust of this National Cyber Strategy was going to be. It had been in the works for a little while already, and they had targeted September for completion. Now, if you cover government very often, you know they don't always hit the targets. Clearly, they haven't hit the target here. You know, we had been hearing that January was the time frame for when the strategy might come out. Obviously, it's now not January. So when it will happen for sure - you know, you have Chris Inglis leaving office. You know, I think there were people who were of the mind that he might get this done before that all finished. But it doesn't look like that's the case.
Tim Starks: That's the strategy, though. That's - again, that's the overarching document. If you're looking at sector by sector, it is all over the place. In some cases, the regulation's already done. In some cases, they're in the formative stages. In some cases, they're trying to figure out what kind of language they need to propose to Congress to say, hey, these agencies don't have the authority to put forward regulations in these sectors. And that stuff probably won't start to move even in the next couple years, because House Republicans have taken over that chamber. They are reflexively opposed to regulation. And that's, you know, one of the reasons why I think we hear the discussion from the administration.
Tim Starks: There are a couple different reasons I think we hear the discussion from the administration about terminology here. They talk more about safety, less about security. They talk more about mandatory minimum standards or baseline standards or that kind of thing that is - it's a little bit of a marketing speak to try to say to Republicans, hey, we agree on cybersecurity. Let's work together to do these basic mandates. I still think that Republicans are going to come in very skeptically and have already expressed skepticism if you look at some of the leadership saying, we don't want to do that.
Tim Starks: But then you have, you know - again, a lot of this is targeted. You know, when I spoke to the White House recently, they had said, you know, that they were targeting the end of the month of January for the EPA regulations. But, you know, I've been hearing about the deadline for that happening or the target for that happening for a very long time, months and months and months and months. So there's a sort of series of plates spinning that they're trying to bring to a stop and finish. And in some cases, they have. In some cases, things have been pushed back. I think they're working on it all at the same time. It's just a matter of how much they can get out the door under what authorities.
Dave Bittner: You've spoken to several members of Congress about this, as well. I mean, are there particular members that you see taking the reins here, taking a leadership role?
Tim Starks: Yeah. I mean, certainly you could see Senator Warner focusing on health care and wanting to clarify things like who's in charge of pushing forward those rules for industry and the sector. You know, in the past, Senator Peters, who chairs the Homeland Security Committee, had been the main driving force for the cyber incident reporting law. I don't think that there's anybody who, right now, has a strong, you know, I'm-going-to-take-charge-in-this-sector drive to really try to push these things forward, with a couple exceptions. And in part because I think they're - you know, it's the beginning of a new year in Congress, and people are still working out what their priorities are going to be. And they're still working out what they're going to be able to agree on with the Senate being Democrat-controlled and the House being Republican-controlled and the administration being Democratic-controlled. I think there's going to be a feeling-out process of what they think can get done by whom and with whom.
Dave Bittner: Yeah. Is it fair to say that if we had another incident - you know, a Colonial Pipeline-level thing - that that could draw everyone's attention and move up the timeline as well?
Tim Starks: Yeah. I think that - you know, that is one of the ways our country functions, for better or worse. You know, 9/11 was a formative part of my life and career, and that is when a lot of things happened as a result of that major catastrophe. If you look at the Colonial Pipeline - you know, if it was just Colonial Pipeline, maybe we wouldn't have had the momentum for that cyber incident reporting law that we did. But we also had SolarWinds. We also had Kaseya. We also had JBS, the beef processor. We had - we saw things hitting people in their pocketbooks in a way that we hadn't before. And that's the kind of thing that can encourage politicians to respond.
Dave Bittner: Right.
Tim Starks: And I think that, you know, there's always this argument that happens. Let's not wait for that to happen. You know, we know it's going to happen eventually. Let's not wait for it. But that often falls on deaf ears until the thing actually happens. So, yeah, I think - I don't even think. I pretty much know that for something really dramatic to happen on this front, something really dramatic will have to happen on the attacker's front.
Dave Bittner: That's Tim Starks from the Washington Post's Cybersecurity 202.
Dave Bittner: In today's Learning Lab, Dragos' VP of product and industry market strategy, Mark Urban, begins his two-part discussion about the importance of incident response planning. Joining him is Vern McCandlish, who's a principal industrial incident responder at Dragos.
Mark Urban: Hi, I'm Mark Urban, once again, with the Learning Lab here at "Control Loop." And I'm joined by Vern McCandlish, one of our incident responders here at Dragos. In fact, Vern, give me your formal title that we're just talking about.
Vern McCandlish: I am a principal industrial incident responder.
Mark Urban: Principal industrial incident responder, which means you've probably been around the block a little bit with some incidents. Is that fair to say?
Vern McCandlish: Yes, that's fair to say.
Mark Urban: That's fair to say. So we had - in the last couple episodes, we had Lesley Carhart talking through, you know, what are some of the things that companies in the industrial space can do in order to better prepare for an incident response, what's in a plan, et cetera, like this. And because - and I know you have a strong view of that, but you've also been through a number of incidents. And part of this is to give folks some color as to - you know, we talked about the theory with Lesley about what happens in an incident, but why don't you bring us in and talk about a couple incidents and what you found. What are some interesting stories that can highlight the importance of planning? What are, you know, some stories that can highlight things to avoid? So give us a couple.
Vern McCandlish: So I'll start off with what I consider to be the basic one. It happened to me when I did incident response in the IT space. And now, it's happening again when I'm in the OT space. And that is where a well-meaning government agency will contact a company or an organization and tell them, hey, we see evidence that you are being currently attacked and potentially are compromised by a state-sponsored or very large economically motivated activity group. And that's all they give them. There is no other information. It's just a government agency showed up and said, hey, you are compromised. You are the target. You are likely breached. But we can't give you any of the details right now because that would give away too much about how we know things. So we just need you to start looking. And that's a place that you kind of have to have a plan for because it's a very dark and scary space now.
Vern McCandlish: So what we have done with that is try to track the activity groups - know what their methods, their tools, tactics and procedures are - so that we can actually try to figure out where we need to start looking when one of those organizations calls us. That also works if I'm in an organization and I have to anticipate that this could be a call that I get. And it could even be just a call from another company that says, hey, I see your systems attacking mine. Where do you actually start? What do you actually start looking for? Do you have the capability? It's more like threat hunting, because you have no evidence that it's happened yet. You don't know where it's happening. You don't know what it's going to look like. But you have to be capable of going in and doing some forensic analysis on systems and on your network to be able to figure out what is happening. Is this true? Is this really happening?
Vern McCandlish: And the big pitfall that I see with that is there's a lot of denial at first - well, if it was happening, we would have seen it or, well, now that you've told us and we looked, we haven't seen it yet. And there's a lot of hopefulness of, well, we spent three days looking and we haven't found it. And there's always a yet at the end of that statement when I'm working, because I have yet to have a government agency come to an organization and say, we see that you are being targeted and attacked right now and not eventually finding out it's true.
Mark Urban: So there's a pretty high correlation of, you know, if somebody at an agency says, hey, you know, your haystack has a couple needles in it, and then they go searching after it. And if they don't have the right methodology to - you know, to look through the haystack, then it takes them longer to find it. But you're saying there's - there are always needles in that haystack. That's a terrible metaphor, but that's what you're saying. Hey, if you get that call, there's something going on, and don't think your lack of the ability to find it is a good thing.
Vern McCandlish: And that really is the point that I - when I - and I understand why the government agencies do this. It's not an exercise on their part. They're not being, you know, super suspicious or anything. They actually have a lot of other drivers behind why they would do this. But they do notify the organization - hey, you're the target. Go look for it. But we really can't tell you what to look for or what specific thing. They might give them a, hey, go look at this particular part of your network or something. But it's usually - I used to say it's like, hey, there's something going on in that room over there. Well, what's going on? I don't know. You have to go look. It's not specific enough. You need to have the ability to go in and figure it out.
Mark Urban: All right. So then as you're talking about going in and have that ability to find out, what's the fork in the road that makes that a much simpler, more straightforward process versus something that takes far too long and doesn't end up in the right place? What are some of the key things that - you know, that differentiate, you know, a quick and clean process or something that's too long and drawn out and not successful?
Vern McCandlish: Well, the first is to actually have somebody that can do the incident response, have someone that can actually do that investigation for you, whether that's your IT department, whether you actually have an OT-specific team, whether you have a third-party vendor that you actually have that task contracted out to - have a plan for if that type of call is made. Who are you going to engage on your team or externally to actually do this work? And the second part, the thing that makes the work hard or easy, is whether you have planned for visibility. Do you actually have the ability to see into the spaces where we would want to go look? So if I get called into these, I want to know where the data is. Where is your network telemetry? Where are your endpoint logs? And if I'm having to go system by system and switch by switch to get this information, that's a lot harder. And it's going to be a lot slower than if we already have a plan of collecting data so that it's visible to security operators and hunters to be able to find.
Mark Urban: Wait, and - so you brought up - you said whether it's the IT person. But, you know, if you get on to the industrial side, into the operational technology, into the industrial control systems, that have, you know, the specialized protocols that have their own kind of unique thing. It's very different from IT. Can somebody on that side of the - you know, of the equation be effective in sussing out threats on the operational side or the industrial side?
Vern McCandlish: They can because they can get started. Typically - since IT butts right up against OT, the attackers very rarely are getting directly into OT without going through some type of thing, like a VPN server or a jump box or something else that the IT person would at least have visibility and understanding and knowledge of, and possibly, they have the authority - they actually have the credentials that would allow them to go in and do this type of investigation. So it doesn't preclude them from doing it. Does it help to actually have experience doing this on the other side of the DMZ and the industrial space so you have more opportunities for data and understanding and context to look for? Absolutely. But I'm not going to - even though I do industrial incident response, the people in IT actually have the skills, and oftentimes they're the ones that have the authority to go do this first attempt at triage, first attempt at trying to figure out what's going on.
Mark Urban: Even if it's in the OT systems, often there's a trail from the IT system through remote access and things like that. That's a good point. So as you look at how people cross over from that IT role into the OT systems, a lot of times it's because they're not segmenting their access credentials. OK, so a government agency says you've got something going on, and, you know, it's better if you have the information base there to look through. It's better if you have somebody skilled, whether it's on the IT side or the OT side, to be able to do it. So bring us through then - have you been in a situation outside of government agencies - let's pick a different war story, if you will. What else has illustrated some of the challenges and things not to do when it comes to incident response?
Vern McCandlish: Well, I don't think it's so much a what not to do - the second thing I had on my list that I really had a passion to talk about today was the prevalence of third-party vendors that operations relies upon to do things in their environment. These can be 20- to 30-year service contracts with very large organizations that supply the equipment. They can be software licenses with companies that do things like historians, or they provide the VPN services at the perimeter. And these third-party vendors oftentimes have proprietary systems that do a lot of this work. Now, we at Dragos do a lot of the stuff necessary to be able to see or know what those things are doing in context. But actually being able to do forensics on a device requires me to actually have the authority, have the credentials. I need to have the elevated privileges to be able to do the forensics on the system.
Vern McCandlish: And oftentimes, that's going to mean we have to make a ticket with the vendor to come in and do that. You are in a crisis. We have found the bad person coming in through your VPN server. We have found the bad person who's utilizing the software stack from this vendor that you use, and you may not have the permissions necessary to be able to go in and do the investigation to figure out what's going on. So now I have to engage, as an incident responder, a third-party vendor, and I have to get them to prioritize it the way we have it prioritized and come in quickly to get us this information and access.
Vern McCandlish: And identifying who those vendors are, what those appliances are in your environment, can help inform a data-collection strategy where you could get some of those logs forwarded off, or you could have a "break glass in case of emergency" set of credentials that you can use on the device. I've had this happen, where I had the opportunity to basically do the collection on a device. I knew how to do collection on the device. But under the contract, if I did the collection on the device, I could disrupt their service contract. The customer would still have to pay it, but the vendor wouldn't have been on the hook for actually having to provide the service anymore because you'd voided the service contract. So, again, we're not talking about a one- or two-year contract. A lot of these things in the industrial space are 10, 20 - I've even seen 30-year contracts for a service-level agreement. You just can't take that lightly.
Vern McCandlish: So that is a challenge that we as incident responders have to identify and would like to identify ahead of time. If I come into an environment and I know those five things require me to get a hold of the vendor and I see the attacker is touching one of those five, I know how to do it. We have a workflow. We have a plan, and we have already socialized with that vendor what we expect them to do, and hopefully it's in writing.
Mark Urban: So there's two things in there. The one I'll tease out is, like, first of all, the gazinta (ph) - the ingress of that particular attack might be through that third-party vendor. Maybe not, but we've seen that, you know, in a lot of the industrial engagements that our company does. And then secondly, you're saying also that, OK, A, that might be the ingress vector, and then, B, by the way, as you're trying to troubleshoot it, you're locked out of that system because you don't have the proper credentials that are controlled by that external vendor. So it's like a double whammy. And those are the things that you want to prepare for. Like, hey, what are those systems that have escalated privileges that are controlled by an external third party? And let's get that escalation chain established beforehand, not when we're in the middle of a fire.
Vern McCandlish: Absolutely. And the analogy I use is local fire departments will typically do questionnaires to the community to look for people that might have an acetylene tank at a business, what types of things they have. So they know ahead of time what they can anticipate when they respond, and they have that information available. In the digital OT incident response space, we're looking for the same types of things. What are the things in your environment that are going to require extra care? And which ones are going to require us to engage third-party vendors? Because even when I do external incident response, it's not just the victim organization, and it's not just me coming in to help them. We are now having to engage one or more third-party vendors to come in and help get the data so we can answer the questions.
Mark Urban: Thank you. Vern McCandlish, principal industrial incident responder here at Dragos, thank you very much.
Dave Bittner: And that's "Control Loop," brought to you by the CyberWire and powered by Dragos. For links to all of today's stories, check out our show notes at thecyberwire.com. Sound design for this show is done by Elliott Peltzman, with mixing by Tre Hester. Our senior producer is Jennifer Eiben. Our Dragos producers are Joanne Rasch and Mark Urban. Our executive editor is Peter Kilpe. And I'm Dave Bittner. Thanks for listening. We'll see you back here next time.