A new approach to mission-critical systems.
Dave Bittner: [00:00:03] Hello everyone, and welcome to the CyberWire's Research Saturday, presented by the Hewlett Foundation's Cyber Initiative. I'm Dave Bittner, and this is our weekly conversation with researchers and analysts tracking down threats and vulnerabilities and solving some of the hard problems of protecting ourselves in a rapidly evolving cyberspace. Thanks for joining us.
Dave Bittner: [00:00:26] And now a moment to tell you about our sponsor, the Hewlett Foundation's Cyber Initiative. While government and industry focus on the latest cyber threats, we still need more institutions and individuals who take a longer view. They are the people who are helping to create the norms and policies that will keep us all safe in cyberspace. The Cyber Initiative supports a cyber policy field that offers thoughtful solutions to complex challenges for the benefit of societies around the world. Learn more at hewlett.org/cyber.
Dave Bittner: [00:01:02] And thanks also to our sponsor, Enveil, whose revolutionary ZeroReveal solution closes the last gap in data security, protecting data in use. It's the industry's first and only scalable commercial solution enabling data to remain encrypted throughout the entire processing lifecycle. Imagine being able to analyze, search, and perform calculations on sensitive data, all without ever decrypting anything, all without the risks of theft or inadvertent exposure. What was once only theoretical is now possible with Enveil. Learn more at Enveil.com.
Andy Bochman: [00:01:42] We spent like the past decade or two trying to get awareness that this was an emerging problem.
Dave Bittner: [00:01:48] That's Andy Bochman. He's a senior strategist for Idaho National Lab's National and Homeland Security Directorate. Today we're discussing the research INL has been doing developing new approaches to protecting mission-critical systems.
Andy Bochman: [00:02:02] At first it was a nuisance-level problem, and it was treated accordingly. It didn't require vast amounts of spending or restructuring of organizations in order to be able to keep it at arm's length. And we were trying to, I'd say, wake up, bring more awareness to senior leaders, telling them it's going to get worse, probably, so be aware of that, where we weren't sure if they were aware. I would say fast-forward to now. I spend a lot of time on the Hill and with companies, and senior folks in companies, they're aware, they're awake, they're nervous. The problem is they don't know what to do. It seems like we keep doing more of the same on cyber hygiene, or say cyber defense. We spend more every year on products. We spend more every year on services. We do more training and improve our policies. And yet it's not changing anything. The number of attacks is increasing, and they're getting more powerful, and the damage they're inflicting is getting more severe.
Andy Bochman: [00:03:04] So today the problem is not a nuisance-level problem, it's an overall strategic business risk and a strategic risk to the nation. The Idaho National Lab has come up with a way that I think is practical and relatively easy to understand to begin to mitigate it in a demonstrable way.
Dave Bittner: [00:03:24] Can you take us through some of the history of this? We're talking about industrial control systems and critical infrastructure. How did we go from the previous era of analog feedback and gauges and so forth to where we are now, with everything being digital?
Andy Bochman: [00:03:39] I think the drivers are primarily economic, and they have to do with efficiency and new capabilities. The efficiency is obvious. If you can automate a process that used to require a hundred human beings performing different types of maneuvers hands-on, and you can replace them with an automated system and maybe have just a handful of humans touching the process, guiding the process, well heck, you've just saved yourself a ton of money. You might spend something on the automation and keeping it running, but you've not only reduced your headcount and improved your bottom line, you've probably sped up the process too, and made it more standardized. Those are some of the additional capabilities besides efficiency and money saving that come along. You also may get situational awareness. You may be able to monitor much better using sensors than you ever were using human eyes and ears, and so you can be much more in tune with the processes that are most important to you and can run closer to the edge, thereby again being more efficient, saving more money, making more for less. I think that's been the siren call of digitization and automation. And I think it's only accelerating now with artificial intelligence, the Internet of Things, and all varieties of automation.
Dave Bittner: [00:05:00] One of the things that your paper points out is, in the old days you had these three physical pillars of security: gates, guards, and guns.
Andy Bochman: [00:05:09] That's right, though they, by the way, are still there, Dave. Gates, guards, and guns are present at every nuclear reactor site, and in other important parts of electric utility and chemical and oil and natural gas infrastructure, certainly they're present. They may not be quite as visible, but they're present in financial institutions too. Any place that has something really valuable to protect has to have strong physical security. And I think for decision-makers, physical security is a lot easier to understand. You can see it, you can feel it, in a way that cybersecurity has proven much more ephemeral and much more intangible. I think that's what delayed the understanding of it and the increasing anxiety up until relatively recently, once they started to see the impacts.
Dave Bittner: [00:05:57] So describe to us, what are the impacts now that we've hooked everything up to the Internet? It seems like we've made that perimeter porous in a way.
Andy Bochman: [00:06:07] Yeah, I think the lure of the Internet. You know, we started off just with local area networks, right? You can go back to Clifford Stoll's "The Cuckoo's Egg," which is, I think, considered to be the seminal book on cybersecurity, where the first nefarious actor on a very, very tiny network was doing misdeeds, and Stoll figuring out what was going on, eventually. The Internet was sort of a one-size-fits-all way for everyone to get connected. So first LANs were connected to it, and now, you know, in many cases, everyone and every device is just directly connected to the Internet, as the search engine Shodan reveals.
Dave Bittner: [00:06:46] Now you all make the point that what is referred to today as cyber hygiene is inadequate to protect industrial control systems.
Andy Bochman: [00:06:54] Yeah, and this is a subtle point I want to make sure I get across, Dave. The article and the methodology from the Idaho National Lab, it's not a diatribe against cyber hygiene. We want people to continue to do cyber hygiene to the best of their ability. Again, cyber hygiene, the way I'm using it, and it is a somewhat loose term with different definitions, I mean it as everything that we now do in a typical enterprise, whether it's technology, whether it's services, whether it's training, whether it's governance and policy. All of those things are cyber hygiene, conforming to accepted best practices. We need to keep doing those things, else every WannaCry and NotPetya, and all of their offspring that are constantly being born, will cripple large companies, or companies actually of any size. Ransomware too is something that's really increasing awareness of the link between cybersecurity and dollars and cents.
Andy Bochman: [00:07:56] Our point, though, is that among all the many different systems that you might protect in a medium or large-size enterprise, you have endpoint security and you're just doing due diligence all across your networks and systems with cyber hygiene. What we're saying is, one of the mantras of INL is: if you're targeted, you will be compromised. It's just a plain fact, and it's very easy to demonstrate. It's been happening over and over again. Well, maybe you might say, well, we're the type of company, hopefully we won't be targeted. The problem is you don't get to choose if you're targeted or not. I would say it's fairly clear that if you're anywhere near a critical infrastructure in terms of what your responsibilities are, then you are a target. And back to the mantra: if you're targeted, you will be compromised.
Andy Bochman: [00:08:46] Therefore, that makes the whole narrative a lot more compelling to people that were sort of wishing and hoping that this problem would go away. One more part. The methodology isn't about the entire enterprise and your hundreds or thousands or millions of endpoints. It's very selective. It's about a handful of processes or functions that you perform, or products that you make, that are so important that if you were to lose them for more than a day or two, you'd be out of business. And that introduces a new term beyond strategic business risk. It's not a new term in the world, but it's new related to cybersecurity. I think for most people it would be called "corporate viability," that you are in many cases now in a position where, through cyber means, you could be put out of business. And you might not be put out of business, but as a CEO, and we've seen this multiple times now, you could lose your job, because ultimately the buck stops with you.
Dave Bittner: [00:09:43] So take us through what you're proposing here, what are you suggesting?
Andy Bochman: [00:09:47] What we're trying to do, again, is to say: keep doing cyber hygiene, do it to the best of your ability. It will help keep the ankle biters, as Mike Assante refers to them in the HBR paper, or help hold them at bay to the greatest extent possible. But for the handful of things, the systems and processes that absolutely must not fail: first of all, figure out what they are, because not everybody understands what their most important processes and dependencies are. So that's the first step of the methodology. The second step is, create the map of the different hardware, software, communications, and human processes that support those processes that must not fail. And the third part is flipping the table around and looking at yourself from the outside, as an adversary would. INL can help with this, but many organizations can already go a fair way towards accomplishing this by sort of asking yourself, in the first phase: if I was going to take my company out of business, if I was going to put us out of business, what would be the most damaging thing I could do? What would I target? In the third phase of the methodology, we have cyber-informed people, people with experience being on the offensive side, navigate through the landscape that was defined in the second phase, all the hardware, software, comms, and processes that support the most important things that must not fail, and find the easiest pathways to achieve the effects that they want, the company-ending effects.
Andy Bochman: [00:11:11] The last part is called "Mitigations and Tripwires." That simply means, and this is probably the part that I think people latch onto, for better or for ill, because we're saying, when you see now that some of your most important systems and processes are at extreme risk because of the numerous digital pathways in that are extremely hard to police, we're talking about selectively introducing out-of-band solutions, analog solutions. Humans are analog, so adding a trusted human where you might have removed that person years ago, because he can't be... People will take issue with this, but in theory, he or she can't be hacked. Of course, they can be socially engineered, but you understand what I'm saying. Not simply just layering on more complex software defense solutions and hoping for the best, things that you can't even fully understand, but rather adding engineered solutions that you can fully understand, that will protect a machine, say, from killing itself if it's given the instructions via software to do so.
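The four phases Bochman walks through can be summarized as a pipeline: identify the must-not-fail functions, map their dependencies, walk that map as an adversary would, then pair the easiest attack paths with out-of-band mitigations and tripwires. The sketch below is a hypothetical illustration of that flow only; it is not INL tooling, and every name and threshold in it is invented.

```python
# Hypothetical sketch of the four CCE-style phases described in the interview.
# Not INL code; names, fields, and the two-day threshold are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CriticalFunction:
    name: str
    max_tolerable_downtime_days: int  # beyond this, "you'd be out of business"
    dependencies: list = field(default_factory=list)  # hardware, software, comms, people

def phase1_identify(functions):
    """Phase 1: keep only the handful of processes the business cannot lose."""
    return [f for f in functions if f.max_tolerable_downtime_days <= 2]

def phase2_map(function, hardware, software, comms, people):
    """Phase 2: record everything that supports a must-not-fail function."""
    function.dependencies = hardware + software + comms + people
    return function

def phase3_adversary_view(function, reachable_from_outside):
    """Phase 3: flip the table; which dependencies give the easiest path in?"""
    return [d for d in function.dependencies if d in reachable_from_outside]

def phase4_mitigate(attack_paths):
    """Phase 4: pair each easy path with an out-of-band backstop and a tripwire."""
    return {path: "engineered/analog backstop + tripwire" for path in attack_paths}
```

A usage pass might run `phase1_identify` over the full process inventory, map each survivor, and hand only the `phase3_adversary_view` results to the mitigation step, which mirrors how selective the methodology is meant to be.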
Dave Bittner: [00:12:14] It strikes me, you know, one of the things that stood out to me when I was reading through the report was a phrase here that said, "but if your own employees physically update the software at those plants the effort can be prohibitively expensive." And I wonder, in chasing after cost savings, has that ultimately been a blind alley? If we can't trust the data that's being sent back to us from a remote location, or something like that, were we chasing after something we shouldn't have been chasing? It seems to me that if something goes wrong, we're going to have to send a human out there anyway to see what's really going on.
Andy Bochman: [00:12:48] Yeah, well, I mean, one of the big features that purveyors of industrial systems and industrial equipment tout is that it's remotely accessible, that you can do remote diagnostics and remote updates. It's sort of a blessing and a curse. If you're in a situation where you need to issue a security patch because you found a high-severity vulnerability, the ability to deploy that patch to hundreds or thousands of systems quickly from one central location is fantastic. It's really necessary these days. The downside of that convenience, and I don't think I'd used the word convenience yet, but that's certainly a big player here, and it's related to efficiency. The downside is that if trusted folks inside your company have that capability, then armed with the proper credentials, and adversaries are getting better and better at acquiring credentials so they don't even look like hackers on your systems, they can do the same thing. And if they can do the same thing, now you're in peril.
Andy Bochman: [00:13:48] So there's got to be a balance between the convenience, in order to have efficiency, which in some cases is important for our cybersecurity, to be able to issue patches, but convenience for other reasons too, to save money, to speed up other functions, for competitive reasons many times over. People are taking on risk they don't understand. I guess that's one way to sort of summarize this topic: we don't think that senior leaders, we don't think people in government, fully understand the risk that they're carrying. We're not trying to scare them, we're trying to show them reality. And once they come to understand that, as the CEO of the first pilot that we did with a large utility did, and as is happening in the second pilot that's going on with the Department of Defense right now, once they understand their exposure and how their company or their mission is at risk, then they can go ahead and make informed engineering decisions as to how they want to mitigate that risk. In some cases we think they can reduce it, I shouldn't say to zero, but they can greatly reduce it from where they are now, and be able to continue modernizing while they're still reducing their risk.
Dave Bittner: [00:14:54] Now take us through what's going on with those pilot programs. What are you learning from that? How are your theories standing up to practice in the real world?
Andy Bochman: [00:15:03] So far, so good. There's definitely a cognitive leap that has to be made when we're introducing the concept that you may be doing a great job on cyber hygiene, you may have a very competent chief security officer, you may have good policies, and you may have a budget that's equal to or superior to your peer organizations'. The problem, though, is the pivot is the hard part, because, back to the mantra I cited earlier: if you're targeted, you will be compromised. And that statement stands independent of how robust your cyber hygiene is.
Andy Bochman: [00:15:41] So you can imagine the CEO who's been told for many quarters or years, we are very strong on cybersecurity, we go to conferences and we learn that we are among the best, and that's good. And I'm not pooh-poohing that at all. It's just the pivot is, if they're in critical infrastructure, they're a target, and if they're targeted, they will be compromised. And once they come to accept that, and it doesn't always happen in the first couple minutes, it takes a little while and sometimes some demonstrations. But once they come to understand that, they become very eager to figure out what they can do about it. Like I said, the solutions here are not all that mysterious. They are solutions that I would say CEOs and other seniors, even people that aren't real comfortable in the computer realm, can understand, in industrial companies anyway: engineering solutions, because they make so much sense. And it's actually their own engineers, sometimes, that come up with the best mitigations. In the case of the pilot, even before the INL folks could lead them to a solution, they were already way out ahead, thinking of ways to better protect a very important piece of equipment.
Dave Bittner: [00:16:49] And what sort of feedback are you getting from other folks in the ICS community? Has there been much pushback or are folks embracing it?
Andy Bochman: [00:16:56] I'd say two-thirds positive, one-third either confused or negative. The two-thirds positive are chiming in on Twitter and elsewhere with, this is very similar to what I've been advocating for a long time. And it's true. There are a number of people that have been thinking about leveraging the core acumen of these large critical infrastructure companies, which is engineering, by the way, leveraging it in ways that would reduce their cyber risk. It's just it hasn't been an easy sell, I don't think. And maybe no one's really tried to package it up before in a way that is deliverable. We're at the edge of that now, with several pilots either done or underway, and beginning to get closer to something that's more repeatable and scalable.
Andy Bochman: [00:17:43] So I'd say it's mainly been positive, Dave, but there are going to be people who, for perhaps religious reasons, like I mentioned when I got academic pushback just on the word analog, they can't even listen beyond that, and that's fine. That may be appropriate in their world. But when you're out there in the field, hands-on with these systems, and they're vital to the survival of the company or to the performance of a mission in the military, you don't really have the luxury of intellectual purism. You've got to find things that work. And I think in this case, this consequence-driven, cyber-informed methodology, so far at least, and I think it will continue, is proving itself to be something that works.
Andy Bochman: [00:18:20] The methodology, the CCE methodology, it might look at first blush like it's about a one-time assessment that will lead to improvements, security improvements, in an organization. But that's actually not the ultimate intent of it. It's not why INL created it and why it's being birthed now. What we're really trying to do is use it almost as on-the-job training, in that by going hand-in-hand, arm-in-arm with the end-user customer, both with their senior leaders and with their engineers and with their cyber teams, we're trying to change the way they think about this, up until now, intractable, overwhelming problem. So we're not so much trying to leave them with a one-time set of updated improvements to their security, we're trying to change their minds. So that when we do leave, and have made some of the mitigations together, and left them to continue some of them on their own, that new type of thinking fully informs everything they do from that point on. And they don't need any outsider to hold their hand anymore, and the thinking permeates not just the C-suite and the board, and not just the engineers and operators and the cyber team, but the procurement folks and the H.R. folks, and everybody comes to see that they have a role in substantially reducing the amount of risk that they're carrying now. Again, you've probably captured this already, but if they're a critical infrastructure provider, they're a target. They don't get to choose that.
Andy Bochman: [00:19:52] And if they are targeted, they will be compromised. And if they accept that, we can help them come to see that there are a handful of fairly straightforward things, things that also don't necessarily have to be very expensive at all, that they can do to substantially improve their standing.
Dave Bittner: [00:20:14] Our thanks to Andy Bochman from Idaho National Lab for joining us. If you'd like to learn more about the work they're doing, there's an article he recently published in the Harvard Business Review. That was the May 15th edition of the Review. The title of the article is "Internet Insecurity." Thanks to the Hewlett Foundation's Cyber Initiative for sponsoring our show. You can learn more about them at hewlett.org/cyber. And thanks to Enveil for their sponsorship. You can find out how they're closing the last gap in data security at Enveil.com. The CyberWire's Research Saturday is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. The coordinating producer is Jennifer Eiben, editor is John Petrik, technical editor is Chris Russell, executive editor is Peter Kilpe, and I'm Dave Bittner. Thanks for listening.