Not every incident is necessarily an attack. Not everything that purrs is a kitten (sometimes it’s a bear that would like you to think it’s a kitten). ICS security notes.
Dave Bittner: [00:00:03] Some notes on not jumping to conclusions that incidents are cyberattacks. A false flag operation shows the difficulty of attribution. Not everything that purrs is a kitten because sometimes it's a bear. Notes from the ICS Security Conference in Atlanta, including some reflections on the criminal market's business cycle, the dangers of social engineering and the importance of attending to the fundamentals. And the Vatican fixes a bug.
Dave Bittner: [00:00:34] And now a word from our sponsor, ExtraHop, delivering cloud-native network detection and response for the hybrid enterprise. The cloud helps your organization move fast, but hybrid isn't easy. Most cloud threats fall on customers to resolve, and prevention-based security wasn't designed for the modern attack surface. That's why Gartner predicts that 60% of enterprise security budgets will go towards detection and response in 2020. ExtraHop Reveal(x) Cloud is the only SaaS-based network detection and response solution for AWS with complete visibility, real-time threat detection and automated response powered by cloud-scale machine learning. Request your 30-day free trial of Reveal(x) Cloud at extrahop.com/trial. That's extrahop.com/trial. And we thank ExtraHop for sponsoring our show.
Dave Bittner: [00:01:30] Funding for this CyberWire podcast is made possible in part by McAfee, security built by the power of harnessing one billion threat sensors from device to cloud, intelligence that enables you to respond to your environment and insights that empower you to change it. McAfee - the device-to-cloud cybersecurity company. Go to mcafee.com/insights.
Dave Bittner: [00:01:52] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Monday, October 21, 2019. The CyberWire has some of our folks down in Atlanta this week for the 2019 ICS Security Conference, which opened this morning. Before we talk about some of today's sessions, however, it's worth discussing some news that broke over the weekend that's directly relevant to ICS security. We're all familiar with the difficulties surrounding attribution. It's the familiar fog of war and the related but less often discussed glare of war - the way in which having too much information can blind you to what's really going on. So here's some fog of war that blew in over the weekend. Often, there's uncertainty with respect to whether an incident involves a cyberattack at all. And that was the case with an incident in Iran. A social media report out of Iran yesterday said that a refinery fire in that country was caused by a cyberattack. But these reports remain unconfirmed - and note that the Twitter thread's own assertion that the incident is confirmed doesn't really count as confirmation. Reuters, sourcing Iranian state media, said there was a fire in a canal carrying waste from the Abadan refinery but that the fire was under control. In this respect, ICS security firm Dragos blogged caution in accepting reports of a cyberattack at face value. After all, while cyberattacks can and have caused physical damage, accidents do happen, and it's important not to jump to conclusions. That holds true of attribution as well.
Dave Bittner: [00:03:29] Another example of the difficulty of attribution may be found in a joint report issued this morning by the U.K.'s NCSC and the U.S. NSA. The agencies find that the Russian government group Turla - also known as Venomous Bear, WhiteBear, Snake, Waterbug and Uroburos - hijacked Iranian tools to mount an effective false flag operation in which Turla effectively posed as APT34, or Helix Kitten. The espionage operation not only used APT34 back doors but also prospected known APT34 victims. According to Reuters, the NCSC says it's not aware of any official attributions influenced by the misdirection. But officials point out that the discovery should serve as a caution against hasty attribution. Compare a similar false flag during the last Winter Olympics held in South Korea when Russian services impersonated North Korean operators. WIRED is running a long series on that incident that's worth a look. We note that the joint warning seems consistent with the recently announced determination of NSA's cybersecurity directorate to engage the public more directly.
Dave Bittner: [00:04:39] To return to the Atlanta ICS Security Conference, we heard some interesting presentations during the first morning. If there's one overarching lesson the speakers agree on, it's the importance of paying attention to the fundamentals. Bruce Billedeaux of Rockwell Automation subsidiary MAVERICK Technologies presented an overview of the darknet and what those concerned with ICS security should know about it. The basic problem from an ICS perspective is the way in which sensitive information and hacking tools can be propagated across the black markets that establish themselves in the darknet. He offered a range of lurid true stories designed to make plant managers' flesh creep - the ease with which people trade company information anonymously, the hacking services freely available and the price lists that make such services accessible to many who wish companies ill. One of his more interesting observations noted last week's arrests of some 300 individuals who were engaged in child abuse in the course of running illicit content services online. That, Billedeaux pointed out, is what law enforcement is interested in stopping and quite properly so. Your concerns, he said, addressing an ICS audience, don't have that kind of high priority. And he also noted the fracturing of contraband black markets that followed the Silk Road takedown. That's part of the normal black market business cycle - consolidation followed by an official crackdown followed by the proliferation of small operators followed by another phase of consolidation that continues until the next official crackdown. We're currently in a fragmented phase, Billedeaux observed.
Dave Bittner: [00:06:22] Earlier in the morning, Mark Carrigan, COO of PAS Global, talked about the good, the bad and the ugly of ICS security. The good lay in signs of increased cooperation between OT and IT, with OT beginning to catch up to IT, particularly with respect to access management. He also saw industry focused on the right things - visibility, audits and security awareness programs. And above all, companies now understand that OT security deserves investment. The bad is that attacks on OT are no longer just collateral damage. Threat actors, especially those run by nation states, are now researching OT systems and developing attacks designed specifically for those systems. And then there's the ugly, chiefly the confusing OT security market and the tendency companies have to fixate on shiny objects, the latest buzzwords and trends. We also find, Carrigan observed, that solution results seem to fall short of expectations and too much information overwhelms understanding. Too much focus on detection is also ugly. Basic protection and recovery mechanisms can deliver massive risk reduction.
Dave Bittner: [00:07:36] Turning to the threat of social engineering, a presentation by Chad Lloyd, security architect at Schneider Electric, pointed out that compromising a system very often starts with compromising a human being. Social engineering enables the attacker to leapfrog not only cyber defense in depth but even expensive physical security measures. He agreed with Carrigan - attention to the basics matters. And in defense against social engineering in particular, those basics include security awareness training for employees. We'll have notes and updates throughout the conference.
Dave Bittner: [00:08:11] And finally, we're all familiar with the internet of things and the industrial internet of things. There's also inevitably an internet of sacramentals - that is, the things religious believers use in the course of their devotions. Last Wednesday, the Vatican introduced an eRosary app that's designed to enhance the prayer life of those who use it. Users signed up with an email address, and a four-digit PIN was transmitted. Unfortunately, that PIN was easily intercepted and, once intercepted, could give an attacker access to all the information the Android app requested. The researcher who found the vulnerability informed the Holy See, and the bug was fixed by Thursday.
Dave Bittner: [00:08:58] And now a word from our sponsor, Dragos. The CyberWire is partnering with Dragos for a free ICS webinar entitled "Threat Intelligence: Explained, Examined & Exposed" on October 22, which is tomorrow. We'll share real-world insights from hunting some of the most sophisticated threats and cover vulnerable assets that need protection. Be sure to register for tomorrow's ICS threat intelligence webinar featuring Dragos and the CyberWire. I'll be there. Register at dragos.com/webinar. That's dragos.com/webinar. And we thank Dragos for sponsoring our show.
Dave Bittner: [00:09:44] And joining me once again is Joe Carrigan. He's from the Johns Hopkins University Information Security Institute; also my co-host on the "Hacking Humans" podcast. Joe, it's great to have you back.
Joe Carrigan: [00:09:53] It's good to be back, Dave.
Dave Bittner: [00:09:55] I had an article come by. This was from the MIT Technology Review, and it was about how easy you are to track down even when your data has been anonymized.
Joe Carrigan: [00:10:05] Yeah.
Dave Bittner: [00:10:06] Article by Charlotte Jee.
Joe Carrigan: [00:10:08] It's...
Dave Bittner: [00:10:09] What's going on here?
Joe Carrigan: [00:10:09] ...Fairly trivial to reidentify people from an anonymized data set.
Dave Bittner: [00:10:14] OK. Well, let's start out with some definitions here. When we're talking about an anonymized data set, what's going on?
Joe Carrigan: [00:10:19] OK. First off, let's explain why we have these things called anonymized data sets, particularly in the field of health care.
Dave Bittner: [00:10:27] OK.
Joe Carrigan: [00:10:28] A lot of times, we need these data sets in order to perform research, right? But there are regulations - HIPAA regulations, and there might be some internal IRB regulations - that say if you're going to store this kind of information, you have to store it in an anonymized fashion, right? Which means that all of the personally identifiable information has been stripped from the data set and replaced with tokens.
Dave Bittner: [00:10:53] OK.
Joe Carrigan: [00:10:54] All right. But there is some information that can't be stripped because it's important to the research, and those things happen to be demographic pointers - right? - like your age...
Dave Bittner: [00:11:05] Whether I'm a man or a woman.
Joe Carrigan: [00:11:06] Whether you're a man or a woman, your gender, whether you're white, black, Hispanic, whatever. Your race is usually a very important indicator because there are certain health conditions which affect different races disproportionately...
Dave Bittner: [00:11:18] Right, sure.
Joe Carrigan: [00:11:18] ...Than the other races, so that's important - where you live, what zip code you're in. So those are all very valid reasons to have deidentified data sets. However, this study found that if I know three things about you - your birth date, your gender and your zip code - then I can identify somebody correctly 83% of the time in a data set. And they even have a tool with some data sets that tells you how well you can be identified. So I went into this tool, and I entered my date of birth, my gender, my zip code. And it came back with 99% identifiability. So in other words, if somebody looked me up in any of these data sets, then they would be 99% sure that they had me.
Dave Bittner: [00:12:03] Even though they didn't have - the data set did not have your name in it...
Joe Carrigan: [00:12:06] Right.
Dave Bittner: [00:12:07] ...They could...
Joe Carrigan: [00:12:07] They could look at the health information that was associated with my record. Now, a couple of things - when I join any of these sites that demand to know my birthday, I always tell them the same thing. It's January 1. That's not my birthday, but that's what they get told.
Dave Bittner: [00:12:22] OK.
Joe Carrigan: [00:12:22] Because I don't want them to be able to identify me with other data sets that might be out there where they actually do have my birthday. And I'm actually now reconsidering whether or not it's important for my health care provider to know my actual birthday after reading this because when I enter January 1 and the other information, that identifiability instantly goes down to 63%, which is well below the average, and getting pretty close to 50%, which is essentially anonymous because it's a coin flip on whether or not you have the actual person.
Dave Bittner: [00:12:52] Right. OK.
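[Editor's note: The linkage attack Joe describes can be sketched in a few lines of Python. Everything here - the records, names and field values - is a hypothetical illustration, not real data.]

```python
# Sketch of a linkage attack: re-identifying records in an "anonymized"
# data set by joining on quasi-identifiers (birth date, gender, zip code).

# An anonymized health data set: names stripped and replaced with tokens,
# but quasi-identifiers retained because they matter for research.
anonymized = [
    {"token": "A17", "birth_date": "1972-03-14", "gender": "M",
     "zip": "21218", "diagnosis": "hypertension"},
    {"token": "B42", "birth_date": "1985-07-02", "gender": "F",
     "zip": "21201", "diagnosis": "asthma"},
]

# A public, identified data set (think of a voter roll) that happens to
# share the same three demographic fields.
public = [
    {"name": "John Doe", "birth_date": "1972-03-14", "gender": "M", "zip": "21218"},
    {"name": "Jane Roe", "birth_date": "1990-01-01", "gender": "F", "zip": "21230"},
]

def reidentify(anon_rows, public_rows):
    """Join the two data sets on the three quasi-identifiers."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["birth_date"], a["gender"], a["zip"]) == \
               (p["birth_date"], p["gender"], p["zip"]):
                matches.append((p["name"], a["diagnosis"]))
    return matches

print(reidentify(anonymized, public))  # [('John Doe', 'hypertension')]
```

No field in the anonymized set contains a name, yet the join attaches a diagnosis to one - which is exactly the 83%-of-the-time scenario described above.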
Joe Carrigan: [00:12:53] Now, there is a newer technology out there called differential privacy, which takes one of these anonymized data sets and adds noise to it, but the noise doesn't change the value of the data set. All it does is add more anonymity, to the point where even if I know something about somebody - those three points of data or maybe more - then when I get a record out that may or may not be the person, I can't tell with better than 50-50 odds. In other words, I'm guessing whether or not that is the actual person. So I think that's the solution when you're looking at public health data and public health information. But when you're looking at data that's information about an individual, it may not be useful, right?
Dave Bittner: [00:13:41] Yeah. It's interesting. In this article, they were saying that they're going to be using differential privacy for the U.S. Census database.
Joe Carrigan: [00:13:48] Well, that's a good use of it - right? - because the U.S. Census database is a database that's supposed to have general demographic information about a population, right? So differential privacy is a great application for that.
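[Editor's note: The noise-adding idea behind differential privacy can be sketched with the Laplace mechanism, the standard way such noise is calibrated for counting queries. The data set and the epsilon value below are illustrative assumptions, not drawn from the episode.]

```python
import random

def laplace_noise(scale):
    """Draw Laplace(0, scale) noise.

    The difference of two i.i.d. exponential variables with mean `scale`
    is Laplace-distributed, which avoids any log-of-zero edge cases.
    """
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(values, predicate, epsilon):
    """Differentially private count query.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon suffices. Smaller epsilon = more noise = more privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical ages in a survey; the true count of ages >= 40 is 4,
# but each query returns a noisy value near 4 rather than 4 itself.
ages = [34, 47, 52, 29, 61, 45, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Averaged over many queries the noisy answers center on the true count, so aggregate statistics - the census use case Joe mentions - stay useful, while any single record's presence is masked by the noise.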
Dave Bittner: [00:14:01] All right. Well, it's interesting. Nice to know that there are solutions at hand to make this better, that it's not all hopeless.
Joe Carrigan: [00:14:09] Very real problems.
Dave Bittner: [00:14:10] Yeah, yeah. All right. Well, Joe Carrigan, thanks for joining us.
Joe Carrigan: [00:14:13] It's my pleasure, Dave.
Dave Bittner: [00:14:19] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially our supporting sponsor, ObserveIT, the leading insider threat management platform. Learn more at observeit.com. The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology.
Dave Bittner: [00:14:41] Our amazing CyberWire team is Stefan Vaziri, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Nick Veliky, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Peter Kilpe, and I'm Dave Bittner. Thanks for listening. We'll see you tomorrow.