Executive discussions and how to communicate your cyber risks to the Board.
Tre Hester: You're listening to a CyberWire podcast powered by Dragos. It's Wednesday, August 24, 2022, and you're listening to "Control Loop." In today's OT cybersecurity briefing, DOE invests in securing the U.S. power grid, CISA's recent ICS security advisories, Industroyer2 makes an appearance in Ukraine, DDoS attack against Energoatom's website, ransomware trends and the threat to OT systems. Ransomware gang attempts to extort the wrong water company, and we think Mr. Kipling would understand. Our guest is Jason Christopher from Dragos, talking about intelligent boardroom decisions and how to use threat-informed industrial risk management. In the Learning Lab, David Foose, senior product manager at Dragos, explores some of the basics of OT. That is, of course, operational technology. He grounds the discussion with an explanation of SCADA.
Tre Hester: On August 18, the U.S. Department of Energy announced it's putting $45 million toward cyber technology aimed at safeguarding the nation's power infrastructure against cyber aggression, providing funding for up to 15 research endeavors focused on reducing cyber risks. These research projects are also intended to bolster relationships between energy sector utilities, vendors and universities. Energy Secretary Jennifer Granholm issued a statement explaining, quote, "As DOE builds out America's clean energy infrastructure, this funding will provide the tools for a strong, resilient and secure electricity grid that can withstand modern cyber threats and deliver energy to every pocket of America," end quote.
Tre Hester: The Hill notes that this is just the latest move signaling the DOE's efforts to improve cybersecurity of the energy grid. In April, the agency announced a $12 million investment in six research projects focused on using anomaly detection, artificial intelligence and machine learning to secure critical infrastructure, including the power sector. And in July, House legislators passed a bill establishing a DOE grant program for graduate students and postdoctoral researchers studying cybersecurity and energy infrastructure. CISA has issued 41 new ICS advisories since we last spoke with you. There are far too many to list here, so we recommend that you visit cisa.gov/uscert/ics/advisories. The affected systems include products from some of the largest control system vendors - Siemens, Schneider, Emerson, Mitsubishi, B&R Industrial Automation, Delta Industrial Automation, LS and Baxter. Once again, if you missed that address the first time around, the details may be found on cisa.gov/uscert/ics/advisories.
Tre Hester: One of the mysteries of Russia's war against Ukraine has been the failure of Russian cyber operatives to live up to the high expectations they set back in 2016, when they used a cyberattack to shut down significant portions of the Ukrainian power grid. At Black Hat 2022, earlier this month, ESET researchers Robert Lipovsky and Anton Cherepanov gave a presentation on Industroyer2, a successor to the Industroyer malware used to cause a power blackout in Kyiv in December 2016. Industroyer2 was deployed by the Russian threat actor Sandworm against a Ukrainian energy company in April of 2022. While Industroyer2 was technically more sophisticated than the original malware, it failed to trigger a blackout. Lipovsky states, quote, "The attack was thwarted thanks to a quick response by the defenders at the targeted company, the work of CERT-UA and our assistance. But although no blackout took place, it was still a big deal. Because, had the attack been successful, theoretically, more than 2 million people could have been left in the dark. So in our opinion, this was the most significant cyberattack, even if unsuccessful, during the war thus far," end quote.
Tre Hester: Anton Cherepanov added that the attackers also made a mistake in the timing of the wiper stage of their attack, which they launched just before 6 p.m. on a Friday. Cherepanov stated, quote, "These attackers missed one very important thing - that Friday is a very short working day and most people end their work at 5 p.m., or even 4 p.m. So at 5:58 p.m., 95% of workstations were switched off, so they weren't wiped," end quote. This should serve, by the way, as a useful reminder that offensive cyber operations are harder for the attacker than the defender often imagines. Your well-thought-through technical attack might fizzle because you've forgotten something simple, like when the people you're messing with actually knock off work. After all, you, the attacker, may be pulling all-nighters fueled by whatever the GRU's equivalent of pizza and Mountain Dew might be. And you've forgotten that the factory whistle blows early in Kyiv on Friday afternoon.
Tre Hester: Give some credit to the defenders and the people who've rendered them assistance. Reuters reports remarks delivered at the Black Hat conference in Las Vegas on August 10 by Victor Zhora, deputy head of Ukraine's state special communication service. Zhora, whose appearance was little heralded and was widely reported as a surprise to those in attendance, said that detection of cyberattacks had more than tripled since the war began in February and that they became particularly intense in late March and in early April. Reuters summarizes Zhora as saying, quote, "Ukraine faced a number of huge incidents in cyberspace from the end of March to the beginning of April, including the discovery of Industroyer2 malware which could manipulate equipment and electrical utilities to control the flow of power," end quote. Zhora also acknowledged the pro-bono cloud service provided by Microsoft, Amazon and Google, which have helped the Ukrainian government back up data on physically safe servers abroad.
Tre Hester: Ukraine's state-owned nuclear power company, Energoatom, sustained a distributed denial-of-service attack against its website for about three hours on Monday, August 15. The corporation said the attack had little effect on visitors to the website and no effect on its power plants. According to The Record, the attack was launched by the Russian hacktivist group, People's Cyber Army. It was a nuisance-level attack that had only limited impact, but such DDoS attacks can serve as a misdirection for more serious and damaging campaigns.
Tre Hester: Ransomware continues to present a threat to industrial operations. On Tuesday, August 9, Dragos released its industrial ransomware analysis for the second quarter of 2022. While the threat actors' interests in targeting can shift, the report includes a quick rundown of what the opposition's interests look like now. Some of the threat actors target by sector. Karakurt ransomware has been primarily focused on transportation organizations. Vice Society has only targeted automotive manufacturing companies, while LockBit 2.0 is targeting entities in the pharmaceutical, mining and water treatment sectors. Others show a geographical focus. For example, Moses Staff has exclusively targeted Israel. Black Basta, RansomHouse and Everest have focused on organizations in the U.S. and Europe. The Quantum and Lorenz gangs have only targeted entities in North America.
Tre Hester: And finally, the threat actors shift. Old ones grow quiet, and new ones start making noise. Lapsus$, CLOP leaks and Rook were active in the first quarter but not now. Black Basta, Midas leaks, Pandora and RansomHouse have been busy in the second quarter but were nowhere to be seen in the first. In general, ransomware attacks were fewer in the second quarter than they had been in the first. But on the other hand, the more recent attacks were more consequential. Dragos closes its report with a prediction - quote, "Due to the changes in ransomware groups themselves, Dragos assesses with moderate confidence that new ransomware groups will appear in the next quarter, whether as new or reformed ones. Dragos assesses with moderate confidence that ransomware with destructive capabilities will continue to target OT operations, given the continuous political tension between Russia and Western countries," end quote.
Tre Hester: The key aspect of the rising ransomware threat to OT systems is the destructive capability that's been on display elsewhere in Russia's hybrid war against Ukraine. Wiper attacks began on February 24, shortly before Russian troops crossed the line of departure in their invasion of Ukraine. They enjoyed some success against telecommunications targets, but these attacks seem to have peaked in February and March. By April, they seem largely to have ceased to have much effect and have been largely displaced by denial and nuisance-level attacks by hacktivist front organizations and by familiar cyberespionage campaigns run by the usual intelligence services - the SVR, the FSB and the GRU. But it would be unwise to be complacent. Russia has demonstrated a capability to wage destructive cyberwar, and assuming that Moscow has given up would be folly.
Tre Hester: U.K. water supplier South Staffordshire Water has sustained an apparent ransomware attack that disrupted its IT systems, though the company says the attack hasn't affected its ability to supply safe water to its customers. The company stated, quote, "We are experiencing disruption to our corporate IT network, and our teams are working to resolve this as quickly as possible. It's important to stress that our customer service teams are operating as usual. We're working closely with the relevant government and regulatory authorities and will keep them, as well as our customers, updated as our investigation continues," end quote. After the attack, however, the CLOP ransomware group claimed that it had gained access to SCADA systems at Thames Water, the U.K.'s largest water supplier. Thames Water called these claims a hoax, and it seems CLOP was confused. Once the CLOP gang began publishing stolen information, it became apparent that the data had actually been taken from South Staffordshire Water. CLOP has since corrected their error and is now attempting to extort South Staffordshire Water.
Tre Hester: Why water and why now? There's a drought. Ilia Kolochenko, founder of ImmuniWeb, commented, quote, "While Europe and other regions are suffering from the unprecedented wildfires and catastrophic drought, nefarious cybercriminals may purposely target critical national infrastructure in sophisticated cyberattacks. In the case of financially motivated attacks designed to obtain a ransom, wrongdoers have significantly more chances of getting paid by cruelly exploiting people in extreme need," end quote. The greater the need, the higher the likelihood that people in extremes will be willing to pay up. CLOP struck a high moral tone in their extortion notes. The English may be broken, but the message is clear, if not fully convincing. Quote, "CLOP is not a political organization, and we do not attack critical infrastructure or health organizations. We decide that we do not encrypt this company," end quote - that is, Thames Water or South Staffordshire Water - quote, "but we show them that we have access to more than five terabytes of data, every system, including SCADA, and these systems which control chemicals in water. If you are shocked, it is good," end quote.
Tre Hester: Rudyard Kipling - yes, that Rudyard Kipling - of "Jungle Book" fame, wrote a poem about this. He'd never heard of malware or ransomware, of course, or of any ware except hardware, in the sense of hammers and nails, or cookware, in the sense of pots and pans. But he'd immediately get the point. His poem goes something like this - (reading) It is always a temptation to an armed and agile nation to call upon a neighbor and to say, we invaded you last night. We are quite prepared to fight unless you pay us cash to go away. And that is called asking for Dane-geld, and the people who ask it explain that you've only to pay 'em the Dane-geld and you'll get rid of the Dane. It is always a temptation to a rich and lazy nation to puff and look important and say, though we know we should defeat you, we have not the time to meet you. We will, therefore, pay you cash to go away. And that is called paying the Dane-geld. But we've proved it again and again that if once you have paid him the Dane-geld, you never get rid of the Dane. It's wrong to put temptation in the path of any nation, for fear they should succumb and go astray. So when you are requested to pay up or be molested, you will find it better policy to say, we will never pay anyone Dane-geld, no matter how trifling the cost. For the end of that game is oppression and shame, and the nation that pays it is lost.
Tre Hester: Tell it, Rudyard. And to our Danish listeners, no offense. We anglophones know things have changed since Beowulf's day.
Tre Hester: Our own Dave Bittner sits down with Jason Christopher of Dragos to discuss intelligent boardroom decisions and how to use threat-informed industrial risk management. Here's Dave.
Dave Bittner: Let's start off with just a little quick background on you. I mean, when you are, you know, out and about at a cocktail party or something and you have to tell people what exactly it is you do for a living, how do you explain that?
Jason Christopher: So I - first off, I always say I focus on industrial cyber-risk. So whenever you talk about cybersecurity, they don't tend to - they think, oh, well, the banking information stuff, that - yeah, it's really important. I'm like, yes, it is, but I'm more interested when there's, like, you know, the potential of property damage, when a turbine may go down and we lose services, for example, like power or water. And so I'll describe to folks - I'll say, focus on the industrial side of things, so where it's far more dirty, physically, and also, from a cybersecurity perspective, more difficult to actually go through and secure those devices. And that typically jazzes them 'cause they can understand a power plant. They can understand a water plant. And then you start talking about hackers and what that does to them - they're like, oh, OK, there's something there to focus on.
Dave Bittner: Can we just start with some high-level stuff here? I mean, when we look at the threats facing the nation from an industrial security point of view, I think the public perception tends to center on electricity just because...
Jason Christopher: Yep.
Dave Bittner: ...It captures the imagination to think about the lights going out. Are we properly calibrating ourselves?
Jason Christopher: That's a good question. I think that we - to your point, we focus on electric because we can tangibly say turn on the light, off and on. But even in sort of the post-COVID age, where we deal with things like supply chain, where we're all feeling the supply chain sort of ripple effect, and asking the same question of, well, what if that was cybersecurity predicated, that could probably help calibrate us a little bit more because that's manufacturing. That's food. There's a lot of services that we depend on that use automation systems, that use industrial control systems, that - those threat adversaries focusing on that wider breadth of this is what critical infrastructure means, I think, would help out quite a bit, for sure.
Dave Bittner: When we're talking to the folks at the board level, in - the board of directors about this kind of stuff, what do you think is the proper approach? How do we make sure that we're speaking to them in a language they can understand?
Jason Christopher: That's exactly, I think, the troubling issue is that we don't, oftentimes, speak in their language. We assume that they should know our language really well, but they don't, and they shouldn't. They're executives. And the executives are really smart at running their operations, their business. That's what they do. And what we're there to do is protect that, which means that we have to understand their concerns and make sure that we're putting protections in place that address those concerns. So what does it mean from an operations perspective, production outages? What does it look like to have, potentially, environmental impacts due to a cybersecurity attack? Executives would understand all that 'cause they understand what those impacts mean for their business, but they've never looked at - could we do that from a cybersecurity perspective? Could I have an environmental impact due to cybersecurity? And if we can speak their language in those terms, the ones that do, the CISOs that do, the sort of ICS security leaders that do, they find themselves getting far more budget, far more ability to deal with the threats that they're concerned with than the ones who say, oh, we've had 50,000 attacks on our firewall in the past year. Well, what does that mean?
Dave Bittner: Right.
Jason Christopher: I've had all those attacks, but we're still up and operational. So is this a concern? Is this not a concern? And making sure you speak in their operational impact thing is absolutely the No. 1 advice I give to any security leader in the space.
Dave Bittner: Help me understand - on the ground, when you're talking about high-level people in these sorts of organizations, have they typically come up through the system? Do they have a background in this sort of thing?
Jason Christopher: They typically come up in the operational side.
Dave Bittner: OK.
Jason Christopher: So many CEOs get it because they've been a part maybe of the plant floor at some point in their career, or they've at least managed a team that has. So they understand the operations really well. What they won't understand is that cybersecurity perspective. Unfortunately, a lot of the folks that we get to do the leadership part for security may have never come from the shop floor. They may have come from the IT side. They may have come from a totally different sector, like financial, and they're now leading ICS security, but they're sticking to their guns of what they know, which is - hey, we need to take care of those phishing attempts. We need to make sure we have anti-virus on all of our work computers. Well, what are we doing for the actual devices that run the levers, the valves, that run the conveyor belts? Those are the things that they may not understand as well. So the CISOs, I feel, are somewhat injected into a C-suite that is already really familiar with operations, and it's up to that CISO to really understand that operations, too. And they may not have grown up with it like the others have.
Dave Bittner: But if someone has come up through the shop floor and they understand the operational side of things, is there a mismatch between the velocity of the rate of change between - because we talk about...
Jason Christopher: Yep.
Dave Bittner: ...On the OT side, that machine will be running for 30 years.
Jason Christopher: Yep.
Dave Bittner: And that computer - the IT systems - no (laughter).
Jason Christopher: I just trade (ph) that out every four - right? Yeah.
Dave Bittner: Right, right.
Jason Christopher: Absolutely.
Dave Bittner: So how does that - when the leaders have come up on that side of things, is there a - how do they calibrate themselves to balance those two realities?
Jason Christopher: So they really need to get a leader on the security side that can help them understand threats a lot better. They need to - 'cause they all understand that these things are what we call deterministic in nature, meaning that when a control system is operated, it's operated by a good and smart engineer. They were always designed that way.
Dave Bittner: Right.
Jason Christopher: They were never designed with the idea of a malicious actor or a dumb engineer doing anything. So there weren't the same checks and balances that you see in IT. And if somebody'd come up from, like, actually doing operations, they understand that. They understand - once they're on the system, they can tell it to do whatever they want it to do. Linking that to cyberthreats, where these folks are actually going through and doing malicious things, that is what really jazzes and sort of motivates the conversation with those executives, and it helps with that sort of disconnect of, well, I know this technology really well, but I don't know anything about adversaries going after this technology. And linking those two things together - again, back in their language, back in their safety and comfort zone - really helps them explore cybersecurity a bit better.
Dave Bittner: What about threat intelligence? I mean, what part does that play in an executive's ability to manage that risk?
Jason Christopher: It helps you not to boil the ocean. So one of the things that you'll see is, like, oh, we could just apply a different set of controls, or maybe we want to use one of the technical standards out there for ICS. There's a whole lot of them. But maybe you really only need to focus on your crown jewels. Or maybe you really only need to focus on the things that threat adversaries are actively going for because that could then inform your defenses as a result. If you're about to deal with, you know, a barbarian horde coming after your castle, do you need high walls? Do you need a moat? Do you need a drawbridge? All of those elements could kind of tell you, well, do they have the ability to cross that moat? And we've seen them with these bridges that could allow them to bypass our - can they trebuchet...
Dave Bittner: Right (laughter).
Jason Christopher: ...Over our big walls?
Dave Bittner: We've spotted them boiling oil...
Jason Christopher: Yeah.
Dave Bittner: ...And is that a problem?
Jason Christopher: Exactly.
Dave Bittner: (Laughter).
Jason Christopher: It could be a bad day for us.
Dave Bittner: Right.
Jason Christopher: And understanding the threats then lets you understand what defenses you should put in place, and it allows the executive suite to feel more confident about the investment they're putting forward in there.
Dave Bittner: Can we touch on compliance?
Jason Christopher: Sure.
Dave Bittner: You know, the policy and compliance side of things - I mean, I know this is something that you're deeply involved with NERC and some of those standards. Where do we stand there?
Jason Christopher: So with regards to compliance, I think it's really important that we, as security leaders, embrace compliance because all the other executives embrace compliance. They have to. If you're in the financial suite - and we always say, hey, I would love the CISO to be at the same level as the CFO because the CFO is at the like - sort of the big boys' table of the executive suite...
Dave Bittner: Right.
Jason Christopher: ...And sometimes the CISO could be buried somewhere else in the organization. Well, the CFO doesn't complain about audits. I mean, they do complain about audits, but they are part of their life. They know, to do business, I must go through these audits. We need to embrace the same thing for security. And the reason that I think that it's so important is if we say that it's important enough to create a governance structure - it's important enough that we say we must do these standards - wouldn't you want to know and double check your homework - that you're actually performing to those standards?
Jason Christopher: And so embracing it and sort of internalizing it and then improving upon it is what's going to make us be more mature with that. And the maturity that I then see with financial analysts who report to the CFO - if we could have that same sort of culture, where it's not that we don't like compliance 'cause they try to trick us or they try to get us, but instead of we want compliance to help us improve our processes, that will lead to that same maturity that then we are at the big boys' table. We are mature in the same approach, and it becomes part of our culture. It's not just a weird thing we do because regulators want to know more information.
Dave Bittner: Right.
Jason Christopher: It's part of our DNA. It's why we do security. It's to make sure that we do it right.
Dave Bittner: You know, before you were with Dragos, you were involved with a lot of setting those policies. You were on some of those teams that worked through that. Can you give us some insights as to what that process was like? I mean, there's a lot of different interests there.
Jason Christopher: For sure.
Dave Bittner: And how do you go about it?
Jason Christopher: So it comes down to consensus building. And it's really interesting when you bring technical people together and you say, hey, we need to build consensus on a set of requirements because, in that process, one, the technical people - they know the technology really well. But the difference between technology and getting everybody in the same room to agree on something is that people, unlike technology, have feelings. And you want to make sure that you're paying attention to those feelings and that you can get to the same consensus building in a way that you would have any discussion, just like we're having right now...
Dave Bittner: Right.
Jason Christopher: ...And making sure that you understand where they come from and how it is that you may want to move the needle towards - maybe you do a more stringent practice, and you'd like to see others in the industry do that. Or maybe you've seen it fail somewhere, and you want us to avoid something. And being able to tell your story is a really big part of that consensus-driving process. When - in the NERC standards, which are the ones that I'm most familiar with - I also teach ICS-456 here at SANS with regards to - how can you comply across that? - which is really fun because everyone thinks that it's going to be like, oh, how do I write a policy?
Dave Bittner: Yeah.
Jason Christopher: And then I set you into, like, a whole set of labs or using Kali Linux to be able to bring down a control system asset. And you're like, oh, why did I do that? Because, again, compliance can help me inform how to better do defenses. So within those NERC standards, where we built that consensus, some of the things that were really important were saying, OK, not just the people in the room, but then everybody else in the entire industry gets to vote and weigh in on whether or not they think that's a good practice. And so you saw a lot of people in the industry really promote themselves to do better, and I thought that that was a really interesting thing. We talk about it within regulations - the regulatory floor, not the ceiling, right? So they're very basic requirements across the board. But even in the areas where people could agree, they're trying to push themselves just a little bit better and make sure that the security is there 'cause, again, people want to turn on the lights and have power - not because an attacker has made it incapable of you being able to do so.
Dave Bittner: Is part of this an adjustment of mindset - that, oh, I don't have to do compliance - ooh, I get to do compliance?
Jason Christopher: I wish we could get there. I...
Dave Bittner: I realize I'm being a little...
Jason Christopher: Yeah, yeah, for sure.
Dave Bittner: ...Pie in the sky, but...
Jason Christopher: So it's interesting because I talk a lot about culture whenever I talk to utilities about this.
Dave Bittner: Yeah.
Jason Christopher: We, within the utility business, had a really bad culture of safety up until the 1970s. In the 1970s, we had a lot of folks, unfortunately, get injured or lose their lives because there weren't very good safety standards. You now fast-forward to today. You can't go into any utility and not start a meeting with a safety moment, or they'll have a safety intranet page that talks to you about the different safety incidents that have taken place and how to prevent them. Everything from what you have to do on the plant floor to how to walk up a ladder to don't use your phone and text and walk at the same time. Like, the safety moments can be all over the place.
Dave Bittner: Right.
Jason Christopher: So the number one thing I then engage them on is like, well, when is the last time you did a security moment? When was the last time you started a meeting and said, hey, don't plug in that USB or charge your phone into that human-machine interface that we have at the plant floor?
Dave Bittner: Interesting.
Jason Christopher: Do we have a security intranet page? And I think that we've done a really great job of talking about culture from a safety perspective. And what I'd love to see us mature into - where I think compliance can win out - is to start talking about it the same way. It's baked into our DNA. And it's a part of the culture that we have for security. When we move that way, I think that you're going to start seeing then behaviors change, attitudes change as a result of that. And some utilities have done a really great job at that. Others - I mean, there's a lot of utilities out there. Others are still struggling with it. But if they can liken it back to the safety moments that we used to have and bring that forward to security, I think that would be a recipe for success all across the board.
Dave Bittner: Are there any areas that you feel aren't getting the attention they deserve?
Jason Christopher: Within ICS security...
Dave Bittner: Yeah.
Jason Christopher: ...As a topic? I would say really trying to understand the impacts is a struggling point for us. And part of it is because unlike IT security, where IT may own all the application side, they may own all the different impacts because they have the servers, they understand how people are using them, they understand how the business is structured, especially in banking or financial areas, when I talk about it for ICS, the IT folks who typically own that need to talk to the engineers, need to talk to the operations people...
Dave Bittner: Right.
Jason Christopher: ...And get those sort of perspectives in place. When we recover from an incident in ICS, there's a certain moment where, even in OT security, you're not bringing the plant up and operational. You have to be able to have this handoff to the engineers and operators and say, OK, I think we're in a better space. I think we're ready. And if you've not had that conversation before, if you've not been able to talk about the impacts, talk about how you'd respond and recover across those boundaries, that becomes a really challenging area.
Jason Christopher: So I think the area that we're missing is we get into our silos really easily, and we don't break those down to say, what's the core level expertise across our entire organization that allows us to not just detect an incident but actually recover an entire facility after an incident? And how do those two teams - security and operations - work together to do that?
Dave Bittner: As you look toward the future, what are the types of things that you think need to happen for us to get where we want to be?
Jason Christopher: I think a lot of it is already happening. I've seen more executives get involved than I've ever seen before. Over the past 20 years that I've been doing this, it really started off with conversations at the ground floor. The way I got started in this, I was just installing different devices, and I said, hey, who's taking care of cybersecurity here? I was just an engineer. And they said, don't worry about it. Our internet service provider totally has it covered because we rely on them. Why wouldn't they have our cybersecurity covered? I'm like, I don't know enough to know why that's not right, but I know that's not right. We fast-forward 20 years and like, oh, wow, and a lot of people still think that way.
Dave Bittner: (Laughter) You were more right than you knew.
Jason Christopher: Right. And so, you know, if I were to look at the progress we've had across that, I see a lot more executives, especially CEOs and COOs in the past probably two or three years that get it than I've ever seen before. And I think that what really needs to happen then is break down those silos and get that culture piece in there. Culture eats strategy for breakfast. So if I could see the increased culture and that awareness on things like security moments and see it embedded in our behaviors and attitudes, I think that would be the winning way to get there. Seeing CEOs and COOs and VP of engineering be more in tune with it I think will help out with that for sure.
Tre Hester: In today's Learning Lab, Mark Urban sits down with David Foose, senior product manager at Dragos, to explore some of the basics of operational technology. Here's Mark.
Mark Urban: Thanks. I'm joined today in the Learning Lab by David Foose, an industrial cybersecurity champion. According to his LinkedIn profile, he's got an engineering background, worked in cybersecurity for operational technology for a number of years. He's worked at Emerson Automation, a manufacturer of industrial OEM systems that control electrical and water utilities and other industrial processes, and here at Dragos on the product management team. Welcome, Dave.
David Foose: Thank you. Glad to be here.
Mark Urban: So today we're going to continue to explore some of the basics of OT, operational technology. You know, my background, like a number of security folks that may be listening, is more in information technology, IT. And I remember a bunch of years ago, I was working with a network forensics recording and analysis device. And one of the customers that I was talking to asked if we did SCADA, and I answered yes because it was on the data sheet. Well, I proceeded to get somewhat of an abrupt and brief education where I learned that SCADA was a whole world of systems with unique languages and protocols and a whole different level of expertise. So me saying we knew SCADA was like me saying I know French. You know, I knew a few phrases and could wield, you know, Google Translate with the best of them. But, you know, it's far from really understanding and communicating in the core language of SCADA. But let's take a step back. Dave, you're the expert. What - help us explain, what is SCADA?
David Foose: Well, SCADA is a very common term a lot of people have heard of in the industry, and it stands for supervisory control and data acquisition. That's a big mouthful, which is why we reduce it to SCADA. Thinking back to how control works, when we first started doing control, it was a very physical process. If you were controlling, say, water or some other kind of process, you physically went there and turned a crank. Then, to step back a little bit, you started using pneumatics, and eventually you started adding computer controls. Eventually you would add more than one computer, and you needed to supervise and control those, and all the data from the different computer systems you put out there needed to be brought back so that you could make decisions across it. So that's where it all comes together: a SCADA system is supervising that data acquisition and the control decisions for the human in the loop.
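The pattern David describes - many field devices, with their data brought back to one place where a human supervises it - can be sketched in a few lines of Python. Everything here (device names, readings, the alarm limit) is invented for illustration; a real SCADA system polls controllers over industrial protocols, not a dictionary.

```python
# Minimal sketch of the SCADA idea: a supervisory pass that polls several
# field devices, collects their readings in one place, and flags anything
# the human operator should act on. All names/values are hypothetical.

def read_device(name):
    """Stand-in for querying a real field controller over the network."""
    simulated = {"pump_1": 72.4, "valve_3": 1.0, "tank_level": 88.0}
    return simulated[name]

def supervisory_scan(devices, alarm_limits):
    """One data-acquisition pass: gather readings, note alarm conditions."""
    readings, alarms = {}, []
    for name in devices:
        value = read_device(name)
        readings[name] = value
        limit = alarm_limits.get(name)
        if limit is not None and value > limit:
            alarms.append((name, value, limit))
    return readings, alarms

readings, alarms = supervisory_scan(
    ["pump_1", "valve_3", "tank_level"],
    {"tank_level": 85.0},
)
print(readings)  # all data brought back to one place
print(alarms)    # conditions for the human in the loop to decide on
```

The key point of the sketch is the centralization: the scan does no control of its own here, it only acquires data and surfaces decisions to a person.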
Mark Urban: Gotcha. So it helps automate these control processes, and it's used in things like electrical utilities, water utilities - interesting - so process automation, industrial automation. Is it a distributed control system? You know, DCS is another term thrown around there.
David Foose: Yeah, there are several names for it - again, it being a generic term for what you're doing. There are several names you might run into in the control realm. You'll see SCADA; you'll see DCS, or distributed control system; you'll see IACS, which is industrial automation and control system - you'll hear that a lot in oil and gas - ICS, industrial control system, and discrete controls. I hear talk of skids. Basically, depending on your particular type of industry, it may be very large and distributed - gigantic wind farms stretching through multiple states or even across countries - down to just a single small location where you're controlling only one small robotic arm.
Mark Urban: Gotcha. So different acronyms to talk about system automation - management of those processes, control processes and control loops, the name of the podcast. Well, let's get back to SCADA. I think there are a lot of similarities between these, a lot of common components that we'll talk about. So even though we're saying SCADA, these elements are present in those others - you know, IACS, ICS, skids, et cetera, some of those other pieces. So then, back to SCADA, Dave. Recognizing that there are a lot of different terms with similar components, what is a SCADA system?
David Foose: Well, a SCADA system is generally made up of a number of smaller components that actually perform the control. You may hear of a PLC or an RTU - something that is actually doing the control - and they will be networked together in some way. Those control devices take care of pumps, valves, temperature sensors - the actual process itself. And then all of that is fed back up to an HMI, a human machine interface - a computer of some sort that presents the data so that you, as a human, can make decisions based upon it.
Mark Urban: Gotcha. OK. So you threw in a couple terms. Let's start with the term RTU. What is an RTU?
David Foose: Well, RTU stands for remote terminal unit. Normally it is a small embedded computer found down near the process itself. It may be performing a limited function at the actual process area, taking care of some small, dedicated detail. It may not have the ability to be programmed with a whole bunch of different if-this-happens-do-this logic; it may just have a single function to perform as its part of the control. Whereas a PLC, a programmable logic controller, is a bit beefier, more computer-like device. That one can take more programming, it may have additional functions, it may be able to make more decisions. And therefore it can provide more data back to the user, up to the SCADA system itself.
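One rough way to see the distinction in code: the RTU below does a single fixed thing (report breaker state), while the PLC runs a programmed rule on every scan. Both classes, tag names and the setpoint are hypothetical sketches made up for this example - not any vendor's API.

```python
# Illustrative contrast: fixed-function RTU vs. programmable PLC.
# All names and values here are invented.

class BreakerRTU:
    """Fixed function: report whether a circuit breaker is closed."""
    def __init__(self, closed):
        self.closed = closed

    def read(self):
        return {"breaker_closed": self.closed}

class ChlorinePLC:
    """Programmable: runs user-supplied if-this-then-that logic each scan."""
    def __init__(self, setpoint_ppm):
        self.setpoint_ppm = setpoint_ppm
        self.valve_open = False

    def scan(self, measured_ppm):
        # Programmed rule: open the dosing valve when below setpoint.
        self.valve_open = measured_ppm < self.setpoint_ppm
        return {"chlorine_ppm": measured_ppm, "valve_open": self.valve_open}

rtu = BreakerRTU(closed=True)
plc = ChlorinePLC(setpoint_ppm=2.0)
print(rtu.read())     # single fixed reading
print(plc.scan(1.4))  # below setpoint, so the programmed logic opens the valve
```

The RTU only reports state; the PLC both reports and decides, which is the "programmable logic" in its name.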
Mark Urban: Gotcha. So an RTU is typically more fixed function, more single purpose. One example we were talking about earlier, that it can monitor the state of a circuit breaker in an electrical utility. Or at a water treatment plant, it can monitor the level of organisms in water that's being treated. And so it's kind of, hey, working with sensors, monitoring a single kind of thing and then sending that data back to the systems. And I think you're saying a PLC can be programmed to do things, like it might throw a valve to mix in chemicals to kill some of those microorganisms and manage the information based on what it's told from above. Or it might throw a switch to reroute electricity, you know, down a different part of the grid. So is a PLC then just kind of more capable, more than an RTU, more programmable, kind of a multifunction device? Or how should we think about the differences between those two?
David Foose: Most of the time, a PLC would be a bit larger, more computer-like device - still embedded, and still able to handle the extreme conditions it may have to live in out in a process environment, the extreme temperatures and maybe some exposure to weather. But they are normally generic devices that can be programmed with decisions they make many times a second, handling simple decisions and sending those changes, decisions and information back up through the supervisory system to the HMI for display: here's what I'm seeing. That might be temperature, flow, the amount of pressure in a pipe and various information like that.
Mark Urban: Got you. So you brought up the term HMI. Can you - what's an HMI? What does it do?
David Foose: HMIs originally were just a big panel with a whole bunch of dials that showed you what was going on. They were often referred to as an annunciator - annunciating what was going on. But now they are computer screens themselves, with graphs showing overruns, changes over time, pressure, temperature - what's going on. Some can even show time shifts, so you can look back to see what happened, and they show alarms in the system when something might be going wrong. HMI just stands for human machine interface, but they're very similar to the computers you may find in your home or office.
Mark Urban: Got you. So - but these are interfaces that show you information, you know, show dashboards in real-time from all the RTUs and PLCs that are contributing to the system. HMI is what, you know, a human sees, the interface to that system. And, you know, can you actually control things? Can you make changes from that HMI? Or is it kind of a read only thing?
David Foose: It depends on the environment. But most of the time, your operators sitting there are looking at multiple screens, looking at trending data or alarms coming from several, if not hundreds of different, smaller controller devices - PLCs, RTUs, regular controllers, pumps, valves - thousands of these in your environment, and making decisions to make sure that nothing is going wrong with the safety and reliability and tweaking systems 24 hours a day.
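The operator view David describes - many tags, trends and alarms consolidated on one screen - can be approximated as a toy text dashboard. The tag names and limits below are made up; a real HMI renders graphics, trends and acknowledgeable alarms, not plain strings.

```python
# Toy text "HMI": turn raw readings into the kind of annotated display an
# operator would watch. Tags and alarm limits are invented for illustration.

def render_hmi(readings, high_limits):
    """Render each tag with its value and an ok/ALARM status."""
    lines = []
    for tag, value in sorted(readings.items()):
        limit = high_limits.get(tag, float("inf"))
        state = "ALARM" if value > limit else "ok"
        lines.append(f"{tag:<14} {value:>8.1f}  [{state}]")
    return "\n".join(lines)

display = render_hmi(
    {"pressure_psi": 118.0, "temp_c": 74.5, "flow_lpm": 260.0},
    {"pressure_psi": 110.0},
)
print(display)  # pressure_psi shows ALARM; the other tags show ok
```

In practice the operator is watching hundreds or thousands of such tags, which is why alarm prioritization matters far more than the rendering itself.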
Mark Urban: So one of the other concepts in SCADA that came up when we were going through the prep is supervisory systems. Now, I can imagine what that is. But what kind of role do they play in the system?
David Foose: A supervisory system overall can be a single machine, like we have with HMI. Or it can actually be a collective of machines. Oftentimes, you'll find a control room or a - we even have some situations where it's actually found in the cloud, depending on how the system is structured. But it is where you're going to get all your data put into one single place and visualized for you to make decisions with.
Mark Urban: Got you. And it's one of the things that - again, drawing from my brief interface with SCADA systems back in the day, before I really got into the operational technology area - these all communicate over vast distributed networks. Is that, like, a whole different language unto itself, like I suggested? What sorts of communications happen, and over what kind of distances?
David Foose: There are still limitations to the physical nature of how they can actually talk. There are still going to be physical wires; there are still going to be line-of-sight kinds of communications, stuff done over microwave or the wireless technologies you would see normally. But the actual way they communicate with each other is specialized, because the devices were engineered with a requirement to speak at a certain speed - these decisions are being made very quickly in some cases. At the electrical substation level it can be many times a second, even thousands of times a second. They are required to speak in specialized ways, and quickly. And it can be very difficult for people who don't understand the complexity of these protocols to look at this equipment, take care of it and make sure it has the reliability it needs to communicate.
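As one concrete example of these specialized protocols, Modbus TCP - one of the most widely used in OT - packs a register-read request into just 12 bytes. The sketch below builds such a frame following the public Modbus specification but never talks to real equipment; the transaction ID, unit ID and register addresses are arbitrary values chosen for the example.

```python
import struct

# Build a Modbus TCP "read holding registers" request (function code 0x03).
# This only constructs the 12-byte frame; it does not open any connection.

def read_holding_registers_request(transaction_id, unit_id, start_addr, count):
    # MBAP header: transaction id, protocol id (always 0), length of the
    # bytes that follow (unit id + 5-byte PDU = 6), then the unit id.
    mbap = struct.pack(">HHHB", transaction_id, 0, 6, unit_id)
    # PDU: function code 0x03, starting register address, register count.
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    return mbap + pdu

frame = read_holding_registers_request(1, 0x11, 0x006B, 3)
print(frame.hex())  # -> "0001000000061103006b0003"
```

The terseness is the point: field devices exchange compact binary frames like this at high rates, which is why the protocols look opaque without tooling or expertise to interpret them.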
Mark Urban: Got you. So it's a different, highly focused language that PLCs and RTUs speak, but one we probably don't speak without the help of some expertise to interpret it. OK. So a SCADA system monitors, and it's made up of things like RTUs, HMIs and PLCs. Those are the things that interface with the sensors, valves and switches that mix chemicals, move electricity, move water, manage processes. So - anything that we missed, Dave? I mean, there's a whole big world of SCADA and DCS and IACS and ICS. Thanks very much for taking us on this explanation and helping us to understand SCADA and some of those other systems.
Tre Hester: And that's "Control Loop," brought to you by the CyberWire and powered by Dragos. For links to all of today's stories, check out the show notes at thecyberwire.com. Sound design for this episode is done by Elliott Peltzman, with mixing and voiceover by Tre Hester. Our senior producer is Jennifer Eiben. Our Dragos producers are Joanne Rasch and Mark Urban. Our executive editor is Peter Kilpe. And I'm Tre Hester, filling in for Dave Bittner. Thanks for listening.