The CyberWire Daily Podcast 2.12.21
Ep 1269 | 2.12.21

Alleged hardware backdoors, again. Selling game source code. ICS security, especially with respect to water utility cybersabotage. Don’t be the hacker’s valentine.

Transcript

Dave Bittner: You may have heard mention of SPACs in the news a lot lately. What's a SPAC, you say? Well, tune in on Monday, when we will bring back one of our favorite episodes from this past year about a cybersecurity SPAC. And be sure to listen to the end to hear a new interview with one of the original guests.

Dave Bittner: Bloomberg revives its reporting on hardware backdoors on chipsets. Has someone bought the source code for The Witcher and Cyberpunk? CISA issues ICS alerts. The FBI and CISA offer advice about water system cybersabotage as state and local utilities seek to learn from the Oldsmar attack. Verizon's Chris Novak ponders whether your cybersecurity should be DIY, managed or co-managed. Our guest is David Barzilai from Karamba Security on the growing importance of IoT. And looking for love on Valentine's Day? Look carefully, and don't give that intriguing online stranger money.

Dave Bittner: From the CyberWire studios at DataTribe, I’m Dave Bittner with your CyberWire summary for Friday, February 12, 2021. 

Dave Bittner: Bloomberg has returned to a 2018 story about Chinese-inserted hardware backdoors in Supermicro server motherboards, doubling down on its earlier claim that the IT hardware supply chain has been compromised by Chinese intelligence services. The story is, for the most part, sourced to former U.S. law enforcement and intelligence personnel. In 2018, Supermicro harshly characterized the report as, quote, “a mishmash of disparate and inaccurate allegations that date back many years,” end quote. 

Dave Bittner: Bloomberg's renewed claim is a strong claim, and it will require strong evidence for corroboration. The 2018 round of this particular story petered out in an atmosphere of general disbelief, but Bloomberg never retracted its report and has now returned to the story with a longform article. Initial reaction to yesterday's story seems to have been wait-and-see skepticism. Dragos CEO Rob Lee's tweet is representative. Quote, "The burden of proof is on the journalists," end quote. And so we shall see. 

Dave Bittner: Computing reports that someone - and it's not clear who - has purchased the source code for The Witcher and Cyberpunk 2077 that criminals stole in the course of hacking CD Projekt Red. VX-Underground told Computing that the crooks opened an auction for the source code on an underground forum but then shut the bidding down after, the crooks say, they accepted an offer from someone else, who contacted them elsewhere. 

Dave Bittner: Security firm KELA told The Verge that they’re convinced the now-cancelled auction, at least, was for real, but no one is ready to confirm that the sale actually took place. The crooks, who are thought to have used a version of the HelloKitty ransomware to hit CD Projekt Red, set a million dollars as the starting bid, with an offer to sell immediately to anyone who was willing to put 7 million down - virtual cash - on the digital barrelhead. That seems steep, to say the least, and it’s difficult to see how they could command such a price. 

Dave Bittner: The US Cybersecurity and Infrastructure Security Agency (CISA) has issued new ICS security advisories, one for the Wibu-Systems CodeMeter, a second for Rockwell Automation DriveTools SP and Drives AOP, and a third for TCP/IP stacks embedded in a range of vendors' products.

Dave Bittner: The FBI hasn't had anything to say about the progress of its investigation into the Oldsmar water system cybersabotage incident, and neither have the Secret Service and local law enforcement authorities. But yesterday the Bureau did tweet renewed encouragement to, quote, "remind you how important cybersafety is to protecting the American public and U.S. critical infrastructure," end quote. 

Dave Bittner: The FBI in particular urged its Twitter followers to read the Cybersecurity and Infrastructure Security Agency's latest alert on the incident. CISA offers a good bit of sound and generally applicable advice on digital hygiene and best security practices. One part of the agency's alert, however, is specific to water utilities and how they should secure their cyber-physical systems. CISA writes, quote, "Install independent cyber-physical safety systems. These are systems that physically prevent dangerous conditions from occurring if the control system is compromised by a threat actor. Such safety system controls would include the size of the chemical pump, size of the chemical reservoir, gearing on valves and pressure switches and so on." 

Dave Bittner: Other water systems, in the meantime, have sought to reassure users that they're safe. The Dayton Daily News says that the city of Dayton, Ohio, whose water system supplies more than 400,000 people in the city and surrounding county, thinks it's unlikely to suffer the same kind of attack seen in Florida. Dayton decided eight years ago that its water control systems would not be connected to the internet, and it uses teams of security watchstanders as well as redundant safety systems to protect its utility from sabotage. 

Dave Bittner: Florida's Port Charlotte Sun reports that water systems in the southwestern part of the state tell the newspaper that they're unlikely to go the way of Oldsmar. Quote, "That's because remote access at treatment sites is either nonexistent or limited to select administrators and not to outside vendors, as was suggested in the Oldsmar data breach, local officials say." 

Dave Bittner: And another state's advice to its utilities suggests the scope and seriousness of the challenge. Wisconsin's Department of Natural Resources has joined its Massachusetts counterpart in urging local water systems to upgrade their cybersecurity, Government Technology reports. Wisconsin has 611 local water utilities, and the Department of Natural Resources urges them all to at least install firewalls and use strong passwords. That this advice would seem necessary is not particularly reassuring. That the state of Wisconsin alone has more than 600 local water systems suggests the extent of the security challenge and the very large number of potentially vulnerable attack surfaces. 

Dave Bittner: Other incidents of cybersabotage hold lessons for water utilities. DomainTools' Joe Slowik, blogging about Oldsmar, reviews four other high-profile attacks that successfully hit control systems - the Stuxnet attack on Iranian uranium enrichment centrifuges, the GRU's disruption of local Ukrainian power distribution in late 2015, Russia's repeat performance against the grid around Kyiv in 2016 - that time with the Industroyer/CRASHOVERRIDE malware - and 2017's Triton/Trisis attack on a Saudi petrochemical facility. All of these were at least to some extent successful, which the Oldsmar cybersabotage attempt was not. And all of the earlier attacks were evasive, which Oldsmar also was not. 

Dave Bittner: Slowik writes, quote, "Overall, these four examples of high-profile, technically complex ICS attack scenarios emphasize a critical barrier to adversary success - the ability to evade influence or outright deny operator visibility into and control over ICS environments. In all four examples, the attacks required some mechanism to hide from operators or deny their ability to correct or mitigate changes made to operating parameters," end quote. That wasn't the case in Oldsmar. The attempt there was neither complex nor obscure. Water utilities and others may not be so fortunate the next time around. 

Dave Bittner: The three vulnerabilities most often mentioned in connection with the Oldsmar cybersabotage have been password sharing - a matter of cyber hygiene - use of beyond-end-of-life software - a patching and updating issue - and the use of TeamViewer for remote access to control systems. Jeremy Turner, head of threat intelligence at Coalition, wrote to point out that TeamViewer is far from the only software used for remote access and that, moreover, it's not even one of the less-secure tools employed for that purpose. Chris Hickman, chief security officer at digital identity security vendor Keyfactor, reminds us that with industrial IoT, it's at least as important to authenticate devices as it is to authenticate users. 
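To make Hickman's point about authenticating devices concrete, here is a minimal sketch - an illustration, not anything described in the episode - of mutual TLS, where a control server accepts connections only from devices presenting a certificate signed by a trusted device CA. The certificate file names and port number are hypothetical.

```python
# Minimal sketch of device authentication via mutual TLS (illustrative only).
# The server rejects any client that cannot present a certificate signed by
# the device CA. File paths and the port number are hypothetical.
import socket
import ssl

DEVICE_CA = "device-ca.pem"      # CA that signs legitimate device certificates (hypothetical)
SERVER_CERT = "server-cert.pem"  # the control server's own certificate (hypothetical)
SERVER_KEY = "server-key.pem"    # the control server's private key (hypothetical)

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile=SERVER_CERT, keyfile=SERVER_KEY)
context.load_verify_locations(cafile=DEVICE_CA)
context.verify_mode = ssl.CERT_REQUIRED  # require a valid device certificate, not just a username

with socket.create_server(("0.0.0.0", 8883)) as listener:
    with context.wrap_socket(listener, server_side=True) as tls_listener:
        conn, addr = tls_listener.accept()   # handshake fails if the device's certificate is untrusted
        device_cert = conn.getpeercert()     # the authenticated device identity (subject, issuer, etc.)
        print("Authenticated device from", addr, ":", device_cert.get("subject"))
        conn.close()
```

In a setup like this, a stolen password alone isn't enough to reach the control system; the connecting device also has to prove it holds a credential that was provisioned to it.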

Dave Bittner: Sunday is, of course, Valentine's Day, and the usual romance scams are coming. The folks over at CISA warn, verify your Valentine. They're not trying to rain on your parade, but they do hope you'll approach the annual day of romance with unromantic skepticism. Here's a tip - if you only know someone online, repose your trust elsewhither, like in someone you've actually met. CISA says, quote, "Once your heart is hooked on hope, they finagle funds from you as a fake fiance." Trust us, finagling is not what you want on Valentine's Day. The U.S. Federal Trade Commission says that exploitation of the lonely and the lovelorn took a record monetary toll last year - the emotional toll is, of course, unquantifiable - and that this year will be no better. Whatever you do, don't send money, and don't let the catphish get you. 

Dave Bittner: David Barzilai is chairman and co-founder of IoT security firm Karamba Security. He joins us today with his insights on the state of IoT security, what challenges we're facing in the days to come and how organizations can best ensure they're protected. 

David Barzilai: You see that the longer an industry has been connected with connected devices - meaning IoT devices, including edge devices - the longer such devices have been targets for hackers. And in those industries, you see more advancement, whether in terms of manufacturers trying to protect the device and/or regulators requiring manufacturers to protect their devices. If an industry has only recently become connected, then that industry is more emerging. The manufacturers are aware of the need to protect, but they have just started their journey, primarily with, you know, STMC and not so much with security measures yet. 

Dave Bittner: What about devices that have been installed for a long period of time, perhaps they've been declared to be end-of-life by their manufacturer, but they're still working? They're still doing everything that they were designed to do and serving the company. Should there be a plan in place to naturally cycle through those devices, to retire them and update them with new ones merely from a security point of view? 

David Barzilai: Yes. And actually, the problem is even harsher, because when you know that a device is end of life or no longer supported, then you know that you either need to go to your manufacturer or, by yourself, talk to the customer - right? - the end customer, the users, the BU (ph) which uses those devices - asking them to retire them and renew the inventory, or the versions of those devices, to new ones, in order to be supported and, as such, even better protected. 

David Barzilai: But as I said, the problem is even harsher because within IoT, in most industries if not all - given that most of them run on embedded devices, and even when we talk about routers that are not embedded - you still have quite significant and vast usage of third-party binaries, like a TCP/IP stack that runs on the embedded system, and other types of third-party modules. 

David Barzilai: Some of those modules may be end of life. So the device by itself is still running. It's still being refreshed. But that TCP/IP stack is now discontinued - no updates to it anymore. So within a device that presumably is being supported, you have the connectivity piece, which is the door - right? - which is the attack surface for the hackers, which is not maintained anymore. So such problems indeed are becoming more common. 

David Barzilai: So the remedy for such issues, whether we're talking about devices that are outdated or devices that include components that may be outdated, is to embed into the device - again, as part of the n+1 (ph) version - runtime integrity software that checks for exploits, or checks for deviations from the predefined set of operations. The beauty of IoT, unlike the data center, is that the devices are immutable, meaning they should run according to the manufacturer's specifications. So if you're able to harden those devices according to those specs, any changes to them must be hackers trying to exploit vulnerabilities, whether we're talking about a new version of the software or old, outdated, unsupported modules within such a device. 
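As an illustration of the runtime-integrity idea Barzilai describes - a sketch, not Karamba's actual product - the following compares the hashes of a device's binaries against a factory allowlist and flags any deviation from the manufacturer's specification. The allowlist path and its JSON layout are hypothetical.

```python
# Minimal runtime-integrity sketch (illustrative, not a vendor implementation):
# compare each monitored binary's SHA-256 hash against a factory allowlist and
# report anything that deviates from the manufacturer's specification.
import hashlib
import json
from pathlib import Path

# Hypothetical allowlist format: {"/usr/bin/pumpctl": "<sha256 hex>", ...}
ALLOWLIST_PATH = Path("/etc/factory_allowlist.json")

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_integrity() -> list[str]:
    """Return the binaries whose current hash deviates from the allowlist."""
    allowlist = json.loads(ALLOWLIST_PATH.read_text())
    deviations = []
    for binary, expected_hash in allowlist.items():
        candidate = Path(binary)
        if not candidate.exists() or sha256_of(candidate) != expected_hash:
            deviations.append(binary)
    return deviations

if __name__ == "__main__":
    tampered = check_integrity()
    if tampered:
        # On an immutable embedded device, any deviation is treated as a likely exploit attempt.
        print("Integrity violation:", ", ".join(tampered))
    else:
        print("All monitored binaries match the factory specification.")
```

Because an embedded device is expected to be immutable, any mismatch can be treated as likely tampering rather than routine drift, which is the property Barzilai is pointing to.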

Dave Bittner: Right. So you should be able to know if they're running as expected or not. 

David Barzilai: Exactly. Recently, regulations have come into effect and been ratified within various industries. We have seen it within the automotive industry, where in June of 2020 a regulation written by 53 countries put the responsibility for attacks on the car manufacturer. They are now accountable for making sure that the software of the different controllers goes through certain SDLC requirements and that they also embed runtime integrity measures. 

David Barzilai: Something similar happened within smart factories and smart homes. It's an IEC standard. And coming up, there's going to be a medical standard - the draft has been waiting since 2018, and we believe that in 2021 it will be ratified. So what we are seeing is mounting pressure on manufacturers to protect their devices, which is good for us, given that IoT devices may expose us to risks in terms of privacy and in terms of safety. 

Dave Bittner: That's David Barzilai from Karamba Security. There is a lot more to our interview. Don't forget to go listen to extended versions of this and many other interviews at CyberWire Pro. It's on our website, thecyberwire.com. 

Dave Bittner: And I'm pleased to be joined once again by Chris Novak. He's the global director for Verizon's Threat Research Advisory Center. Chris, it's always great to have you back. What I want to touch on with you today is this notion of, if I'm out there looking to equip my organization with effective cybersecurity, do I do it myself, do I have someone manage it for me, or do I go the co-managed route? How does someone go about researching that, exploring that and making the best decision? 

Chris Novak: Sure. And, I mean, I would tend to say that it probably depends on how sophisticated the organization is. I think there's a case to be made for all three. And I guess I would say DIY maybe refers to security being done entirely internal to the organization - right? And I tend to see that probably in your larger institutions and, typically, if you look at it from an industry perspective, it's probably more common in financial services, where almost everything is done in-house. 

Chris Novak: And then you have kind of managed where typically you're kind of saying, look, my business is not necessarily security. I recognize it's something I need to have, but my core business is something else. Let me hand all of that off to someone else - right? And then the third being co-managed, there's some balance there where I say, hey, you know what? Maybe I'm not ready to give all of this to someone else, or I feel like there's certain nuances to the way I want it done. And so I try to strike a balance between what I do internally and what, you know, maybe someone on the outside might assist with. 

Dave Bittner: And so what goes into that decision process of deciding which is the best route for me? 

Chris Novak: Yeah, I would tend to say that it is most often related to your regulatory environment, your compliance environment and then just your risk tolerance. A lot of organizations will look at that and say, OK, you know what? Based on certain regulations that I must adhere to or certain compliance obligations that are upon me, I'm going to keep more of this internal to me so that I can keep it kind of better under my thumb, better monitor it. And then also other organizations will typically look at it and say, hey, if I'm ultimately going to be held responsible for the outcomes, you know, maybe I want the people who are actually managing that to be, you know, direct reports as opposed to, you know, third-party, you know, contractors or vendors. And then there are others who will look at that and say, you know what? There's a case to be made. And say, you know what? I can potentially offload some of that liability, you know, depending on the way contracts are written. And maybe I feel that I can actually move some of that to those third parties. 

Chris Novak: So, again, there's kind of that give and take. But the calculus I find most often that organizations use is a combination of the regulatory and compliance landscape as well as what they feel they need to be able to manage from an internal risk standpoint. And in some cases, they may even be looking at, you know, like, an outside assessor to come in and say, hey, let's actually do a review of the environment and say, OK, what is it that you feel you can handle? What is it that you feel you need to give off to, say, a third party who may be able to do it better? And where this conversation often comes up, I find, is around the areas of resources and skill sets. There's a giant resource gap in cybersecurity, and I think everybody is seeing that. 

Chris Novak: And so the challenge that I think we all see is we're all kind of pushing, pulling and tugging resources, and we're all basically fighting for the same small set of resources to do cybersecurity. And so I think part of the argument also that you have there is unless I really have the need to have those resources directly working for me and unless I really need these resources 24 by seven, there's also a business case to be made that says, you know what? Going co-managed or fully managed might actually be more efficient for me from an operation standpoint because, I mean, for example, my team, we do incident response and malware analysis all day long. That's all we do for organizations all around the world. 

Dave Bittner: Right. 

Chris Novak: And so we need that staff 24x7. But a lot of our clients don't necessarily need full-time malware analysts. And they don't necessarily need, say, two or three of them to be able to provide round-the-clock coverage. We might need several dozen just because we have many clients that rely on us to do that. But they can look at that and say, great, we don't have to keep all of these resources on staff, and then we don't have to train them and retain them. And if someone leaves, then we need to go find someone to replace that talent. You know, you can kind of move towards that fully managed or co-managed outlook and say, let that be somebody else's challenge to deal with. And I can just say I expect X, Y and Z, and I expect it to happen this fast. 

Dave Bittner: Yeah. All right. Chris Novak, thanks for joining us. 

Chris Novak: Thank you, Dave. 

Dave Bittner: Thanks to all of our sponsors for making the CyberWire possible. 

Dave Bittner: And that's the CyberWire. For links to all of today’s stories, check out our Daily Briefing at thecyberwire.com. And for professionals and cybersecurity leaders who want to stay abreast of this rapidly evolving field, sign up for CyberWire Pro. It will save you time and keep you informed. Listen for us on your Alexa smart speaker, too. 

Dave Bittner: Be sure to join us this weekend for "Research Saturday" and my conversation with Dr. Shreyas Sen from Purdue University on using your fingertip to transfer digital information. It's a unique one. That's "Research Saturday." Check it out. 

Dave Bittner: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Puru Prakash, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe. And I'm Dave Bittner. Thanks for listening. We'll see you back here next week.