David Bittner: [00:00:03:24] Does LoJack for Laptops report back to Moscow? World Cup cybersecurity. Schneider Electric patches developer tools. Travel and hospitality reward points are the bait of the black market. Medical device vulnerabilities. Taking the gloves off Cyber Command. It's World Password Day and Microsoft (along with many others) would like to move beyond the password. And a remembrance on Press Freedom Day for working journalists murdered by the Taliban.
David Bittner: [00:00:37:11] And now, a few words about our sponsor, Dragos, the leaders in industrial control system and operational technology security. In their latest white paper, Dragos and OSIsoft present a modern-day challenge of defending industrial environments and share valuable insights on how the Dragos-OSIsoft technology integration helps asset owners respond effectively and efficiently. They'll take you step by step through an investigation, solving the mystery of an inside job using digital forensics with the Dragos platform and the OSIsoft PI System. Download your copy today at thecyberwire.com/dragos. That's thecyberwire.com/dragos, D-R-A-G-O-S. And we thank Dragos for sponsoring our show.
David Bittner: [00:01:35:16] Major funding for the CyberWire podcast is provided by Cylance. From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Thursday, May 3rd, 2018.
David Bittner: [00:01:48:11] Finding, locking and getting files back from a stolen laptop, these things are all good. If you look at them another way, however, you can see some potential for problems. Finding, locking and data exfiltration are, of course, things that attackers are just as interested in doing as admins are. Security firm NetScout's Arbor Networks has reported a possible back door in LoJack for Laptops, a tool that enables administrators to remotely lock, locate and remove files from a stolen computer. Five LoJack agents were found to be communicating with four dodgy command and control domains, three of which have in the past been associated with Fancy Bear, familiar to all of us by now as Russia's GRU. Absolute Software, which makes LoJack for Laptops, says it's been in discussions with Arbor Networks. It takes the matter seriously and is investigating, but doesn't believe its customers are at risk.
David Bittner: [00:02:44:16] Fans of football, what we here in the States refer to as soccer, know that this year the World Cup will be played in eleven Russian cities. Russian security authorities are boosting cyber preparations before the event, looking at hotel Wi-Fi, World Cup networks, media, and so on. They're probably mindful of this past winter's hacking of the Olympic Games, and don't want the same threat actors spitting in the soup this summer. Who was that mucking around with the South Korean games? Fancy Eagle? Fancy Lion? Fancy Kiwi? Fancy Kangaroo? Fancy Loon? Fancy something or other? Well, right, never mind!
David Bittner: [00:03:24:20] Schneider Electric has patched a vulnerability in its InduSoft Web Studio and InTouch Machine Edition. The products aren't themselves control systems, but rather toolsets used to develop SCADA systems, human-machine interfaces and applications that connect automated systems. The bug, discovered and disclosed by security firm Tenable, is a buffer overflow issue that could be exploited to execute arbitrary code.
David Bittner: [00:03:50:17] Travel reward points are relatively easy to monetize and they're being sold in Russian-language dark web markets. Botnet operators often pick up such credentials incidentally in the course of other illicit activities and, for the most part, they sell them to other criminals. The botnets deploy keyloggers to pick up more directly valuable items like network or financial credentials, but they sweep up the travel rewards along the way. It's like watermen going after rockfish and then selling the bycatch for cat food. Analysts at security company Flashpoint, who've been following the matter, say the surprisingly large number of dark web boutiques specializing in travel and hospitality reward points indicates serious criminal demand for them. How long will the trade continue? Well, that's an easy question to answer. As Flashpoint says, it will go on as long as money is to be made from it.
David Bittner: [00:04:48:01] Edna Conway is Chief Security Officer, Global Value Chain at Cisco. She joins us to discuss how organizations can build what she calls a pervasive security architecture that tackles the often undefined and overlooked third party risks.
Edna Conway: [00:05:03:03] As we digitize, we're all aware of the fact that we're expanding the ecosystem of third parties who really will inevitably impact us for better or for worse. When you start to think about that, what it means is, you need to think about security in a pervasive way, not just using the word cyber and thinking information security, but thinking comprehensively about the way we experience the digital economy through devices, engagements with people, engagements via services, and incorporating that holistic thinking into an approach that I call pervasive security. That's the goal.
David Bittner: [00:05:42:04] Let's dig into some of the details. How do you make that a reality from a practical point of view?
Edna Conway: [00:05:47:23] It's not easy. I think the reason why it's not easy is because you have to sit back, take the time to think about your third party ecosystem. So, if you want to pervasively drive security, the first thing you really need to think about is looking at that value chain of the third party community holistically. Who are they? What do they provide to you? Where are they providing it and how are they providing it? These begin to inform you on how to drive an architecture that will allow you to look at security threats and exposures in a way that, while comprehensive, is flexible for the purposes of examining each of those third parties in their own environment, in their own business context and how you'd utilize them.
David Bittner: [00:06:39:02] When dealing with a third party, what are your recommendations in terms of verification, to make sure they're actually living up to their end of the deal?
Edna Conway: [00:06:46:07] Verification is an interesting question. I mean, if you look at it, global governments are clearly ramping up their focus on what they often refer to as cyber supply chain risk management. So it's a way of managing risk, not necessarily a focus that says compliance and compliance only. We see it in the NIST CSF draft 1.1. We see it in the energy sector across North America, in the US, Canada and Mexico, with NERC CIP. How you do that is first sit down and say, what am I worried about? Identify what the threats are, translate those threats into exposures that make sense for your third parties and then drive a flexible architecture with what I like to call domains that are common, but the requirements within those domains are customized, based on the nature of each third-party member's effort on your behalf as part of your ecosystem.
David Bittner: [00:07:42:22] Do you find that people have a hard time breaking this process down into manageable bites?
Edna Conway: [00:07:48:21] It depends on who they are. I think, in the information and communications technology arena, we're seeing it grow more and more as part of doing business. The reporting, the metrics on it, are still a little bit of a burgeoning area, quite frankly. But look, you know, I think everybody realizes-- they don't particularly say it the way I do-- but look, I believe the currency of the digital economy is trust. The same currency humans have always had. If you say that trust is the currency, data's the fuel-- fuel maybe for our own decision making, and for artificial intelligence to help us with decision making. All that does is form deposits in your bank of trust. You have to figure out how you want to go out and address the people part of it. Do I trust in people, whether informed by AI or not? Do I trust in the data? Maybe. I'm going to say I want to drive that because I'm going to drive a digital ledger capacity.
Edna Conway: [00:08:48:10] Do you trust in the security processes that are employed by that third party from whom you acquire a product or service or information? And then do we want the government to validate-- and that's an open question-- what is going on in industry, in an effort both to protect their citizens and to get back to the question you asked me, which is alignment? All of us can do it in a variety of different ways, but we need to look at some international standards and parameters to set the floor and, quite frankly, I think also set a ceiling that says, no matter what we're doing and no matter how high we seek to achieve a level of security based on risk, these fundamental 10, 12, 15 things need to be in the portfolio of what needs to be done and what we're going to measure.
David Bittner: [00:09:35:21] That's Edna Conway from Cisco.
David Bittner: [00:09:39:19] Becton Dickinson has advised that its medical devices using WPA2 encryption are vulnerable to KRACK key reinstallation attacks. This general Wi-Fi problem isn't confined to medical systems, but Becton Dickinson has issued a fix for the problem insofar as it affects their devices.
David Bittner: [00:10:01:03] And the US FDA has ordered the recall of about 465,000 St. Jude (now Abbott Laboratories) implantable cardioverter defibrillators (that is, ICDs) for a firmware update. The problems with the ICDs and their associated Merlin@home monitors essentially come down to an authentication back door. You may remember that this is the vulnerability publicly disclosed by MedSec in 2016, controversial because the disclosure was done in apparent conjunction with short selling of St. Jude Medical stock by Muddy Waters LLC. The vulnerabilities MedSec reported were subsequently independently confirmed by Bishop Fox.
David Bittner: [00:10:43:02] US Senator John McCain, Republican of Arizona, is about to publish a book in which he argues, among other things, that the US ought to punish Russian cyber operations with American cyberattacks. That's one Senator's view, of course, but there are signs that the National Security Council wants the gloves taken off of US Cyber Command, too. CyberScoop reports a movement among the NSC staff to rescind or modify Presidential Policy Directive 20 to streamline the process by which military commanders could receive approval for offensive cyber operations. It's worth noting that PPD-20 is a classified document and so critiquing it involves a lot of looking at what agencies do and reading between the lines. But it's generally been characterized as a document that requires extensive interagency coordination across the Federal Government, in the interest of both proper restraint and due respect for agency equities.
David Bittner: [00:11:40:16] You're familiar with today's big holidays and observances, right? If you can take time away from celebrating Garden Meditation Day, quietly, Public Radio Day, with proper self-satisfaction, National Raspberry Popover Day, formerly known, with unintentional sauciness, as National Raspberry Tart Day, or Paranormal Day, because the truth is out there, consider that it's World Password Day. Do you know where your credentials are? We hope they're not in too many places.
David Bittner: [00:12:12:15] More seriously, today is also World Press Freedom Day. It's an important right with important responsibilities. This year's observance should also be a somber and reflective one. Taliban suicide bombers exacted a high toll in attacks this Monday. Ten journalists were killed covering the news. It's worth hearing their names. They were, Shah Marai, photographer for Agence France-Presse and father of six. Yar Mohammad Tokhi, cameraman for Tolo News, due to be married this month. Ahmad Shah, of the BBC Afghan Service. He alone wasn't killed in the bombings but was shot dead by unknown assailants in Khost province. Maharram Durrani, who had just begun her work as a reporter for Radio Free Europe/Radio Liberty. Abadullah Hananzai, journalist and videographer for Radio Free Europe/Radio Liberty. Sabawoon Kakar, a five-year veteran of Radio Free Europe/Radio Liberty. Ghazi Rasuli, reporter for Afghanistan's 1TV. Nawruz Ali Rajabi, a cameraman also with 1TV. And the final two who lost their lives, Salim Talsh and Ali Salimi, both with Mashal TV. May they all rest in peace. May their families, friends, and colleagues be granted consolation in their grief.
David Bittner: [00:13:53:14] I'd like to give a shout out to our sponsor BluVector. Visit them at bluvector.io. Have you noticed the use of fileless malware is on the rise? The reason for this is simple: most organizations aren't prepared to detect it. Last year BluVector introduced the security market's first analytic specifically designed for fileless malware detection on the network. Selected as a finalist for RSA's 2018 Innovation Sandbox Contest, BluVector Cortex is an AI-driven sense and response network security platform that makes it possible to accurately and efficiently detect, analyze and contain sophisticated threats. If you're concerned about advanced threats like fileless malware, or just want to learn more, visit bluvector.io. That's bluvector.io, and we thank BluVector for sponsoring our show.
David Bittner: [00:14:56:15] And joining me once again is Ben Yelin. He's a Senior Law and Policy Analyst at the University of Maryland Center for Health and Homeland Security. Ben, welcome back. We had a story come by from the MIT Technology Review, and the title of the article was "When an AI Finally Kills Someone, Who Will Be Responsible?" Let's dig in here. What are we talking about?
Ben Yelin: [00:15:16:00] So, this sounds like a horror movie gone completely wrong. Actually, I think it's a relatively realistic scenario in the near future. The proposition this article raises is that, some time over the next several years, we'll have a self-driving car navigating the city streets that accidentally hits somebody. Who is going to be held legally responsible? That's something that this academic, John Kingston at the University of Brighton in the UK, has tried to sort of decipher and do a legal analysis on. So obviously, an AI is not a real person. You can't lock this person behind bars, either real bars or proverbial bars. So we're sort of looking at alternatives. Which actual humans would get punished for either the deliberate actions or the negligence of an AI? And this author proposed a few alternatives, a few sort of legal theories about who we should hold accountable. The first he calls perpetrator-via-another. In the physical world, it would be somebody gets somebody else who has some sort of mental incapacity-- maybe a minor, maybe a dog, maybe someone with a mental illness-- to commit a crime for them. Then the person who actually solicited that crime is the one who should be legally responsible.
Ben Yelin: [00:16:29:19] That seems like something you could reasonably apply to AI. If I hacked into the system and instructed the self-driving vehicle somehow to hit somebody on the street, I should be held legally responsible. The second theory he talks about is natural probable consequence. That's sort of when, in the course of doing what it does, the AI happens to cause some sort of harm, and the example that this author gives is an artificially intelligent robot in a Japanese motorcycle factory killing a human worker. It sort of reminds me, you know, of the Simpsons episode where the Itchy and Scratchy characters falsely identify the Simpsons family as other robots to murder. But that's sort of what he's talking about here. The robot makes the mistake. If the programmer knew that this was potentially a concern, then they could be held legally responsible.
Ben Yelin: [00:17:25:04] Then finally there's the third theory, which is direct liability. That requires both an action and an intent. Action is usually going to be straightforward in this scenario that he's talking about. You know, the AI hits somebody with the car. But the question is intent, and in this hypothetical, it's very unlikely that somebody had the intent to program the self-driving vehicle to hit somebody. You know, in that case, intent is going to be hard to prove. So what he proposes is that perhaps we should consider it a strict liability crime. There are a lot of strict liability crimes in our legal system. One is statutory rape. Frequently we see speed limits as a strict liability offense, where even if you had a good excuse, and you didn't know what you were doing was wrong, and you had no sort of criminal mental state of mind, you could still be held criminally liable. So, these are the three theories he posited. Certainly, I don't think it's something we're going to come to a resolution on any time soon, but there are certainly interesting issues to think about.
David Bittner: [00:18:30:09] It will be interesting to see how it plays out, so I guess we'll have to stay tuned.
Ben Yelin: [00:18:36:08] I'll see you in 2023.
David Bittner: [00:18:46:20] Alright, Ben Yelin, as always, thanks for joining us.
David Bittner: [00:18:55:03] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially to our sustaining sponsor Cylance. To find out how Cylance can help protect you through the use of artificial intelligence, visit cylance.com. And thanks to our supporting sponsor, VMware, creators of Workspace ONE Intelligence. Learn more at vmware.com. The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our show is produced by Pratt Street Media with editor John Petrik, social media editor Jennifer Eiben, technical editor Chris Russell, executive editor Peter Kilpe, and I'm Dave Bittner. Thanks for listening.