Slack closes a vulnerability. Email tracking in a court martial. Restrictions on doing business with Huawei take effect. A case of responsible disclosure.
Dave Bittner: [00:00:03] A Slack vulnerability is discussed and fixed. And this is not as seen on TV. A real NCIS investigation is likely to occupy real JAGs for some time to come, with implications for military and civilian cyber law. The U.S. is moving rapidly on Huawei and its associated companies. It's now much harder for U.S. companies to do business with them, and there's likely to be fallout in other countries as well. And an exposed database affords an instructive case of responsible disclosure.
Dave Bittner: [00:00:39] Now a moment to tell you about our sponsor, ObserveIT. The greatest threat to businesses today isn't the outsider trying to get in. It's the people you trust, the ones who already have the keys - your employees, contractors and privileged users. According to a recent CA Technologies research report, 53% of organizations confirmed insider attacks within the last 12 months. Can you afford to ignore this real and growing threat? With ObserveIT, you don't have to. See, most security tools only analyze computer, network or system data. But to stop insider threats, you need to track a combination of user and data activity. ObserveIT combats insider threats by enabling your security team to detect risky activity, investigate in minutes, effectively respond and stop data loss. Want to see it in action for yourself? Try ObserveIT free - no installation required - at observeit.com/cyberwire. That's observeit.com/cyberwire. And we thank ObserveIT for sponsoring our show.
Dave Bittner: [00:01:46] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Friday, May 17, 2019.
Dave Bittner: [00:01:55] Tenable this morning reported a vulnerability in the business collaboration tool Slack. The flaw, which had earlier been disclosed to Slack, is now fixed. It affected the Slack desktop application for Windows, version 3.3.7, and has been fixed in version 3.4. It had been possible for an attacker to send a malicious hyperlink via a Slack message. Once clicked, the link would change the document download location path to a file share the attacker controlled. This could have enabled theft or manipulation of documents subsequently downloaded within Slack. Since the problem has been fixed, users would be wise to update to the latest version. There's no indication that the vulnerability was ever exploited in the wild.
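One defensive pattern behind fixes like this is refusing any configured download path that points at a remote share or escapes an expected directory. A minimal sketch of that idea follows; it is illustrative only, not Slack's actual fix, and the function name and paths are hypothetical:

```python
from pathlib import PureWindowsPath


def is_safe_download_path(candidate: str, allowed_root: str) -> bool:
    """Reject download paths that point at UNC file shares or escape
    the allowed root directory. Hypothetical check, in the spirit of
    defending against a silently rewritten download location."""
    # UNC paths (\\server\share\...) indicate a remote file share.
    if candidate.replace("/", "\\").startswith("\\\\"):
        return False
    p = PureWindowsPath(candidate)
    # Reject relative paths and any traversal components.
    if not p.is_absolute() or ".." in p.parts:
        return False
    # Require the path to sit under the allowed root.
    try:
        p.relative_to(PureWindowsPath(allowed_root))
    except ValueError:
        return False
    return True
```

The check treats anything outside the user's designated download directory, including attacker-owned SMB shares, as unsafe.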
Dave Bittner: [00:02:40] The U.S. Navy may have put trackers in emails destined for defense counsel and news media covering a military trial involving leaks, Military Times reports. The Naval Criminal Investigative Service is investigating media leaks surrounding a high-profile case in which a Navy special operator is charged with murder, and a Navy officer is charged with conduct unbecoming an officer in an associated case. The Navy judge presiding over the case had imposed a gag order to help ensure fair due process for the defendant, and NCIS was trying to find who was violating that order.
Dave Bittner: [00:03:16] In any event, the Navy judge advocate prosecuting the case sent emails to, among others, defense counsel with a tracking image embedded below the signature block. A Navy Times editor was among those who received the email with the tracker. It was designed to identify the recipient machine's IP address and report it to a server in San Diego. It normally requires a subpoena or court order to acquire IP addresses or other metadata, Military Times says. Military Times is a sister publication to Navy Times, both papers belonging to the Sightline Media Group.
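The tracker described here behaves like a standard web beacon: a tiny remote image embedded in HTML email, whose URL identifies the recipient, so the server that hosts the image logs the requesting machine's IP address when the mail client renders the message. A rough sketch of how such an email might be assembled follows; the function name, beacon URL and recipient identifier are all hypothetical, and this is not the Navy's actual mechanism:

```python
from urllib.parse import urlencode


def make_tracked_html(body_html: str, beacon_base: str, recipient_id: str) -> str:
    """Append a 1x1 tracking image below an email body.

    When a mail client renders the HTML and fetches the image, the beacon
    server sees the requester's IP address plus the recipient_id baked
    into the query string.
    """
    query = urlencode({"r": recipient_id})
    beacon = (
        f'<img src="{beacon_base}?{query}" '
        'width="1" height="1" style="display:none" alt="">'
    )
    return f"{body_html}\n{beacon}"
```

Mail clients that block remote images defeat this technique, which is one reason many clients disable remote image loading by default.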
Dave Bittner: [00:03:52] A Navy spokesman told Military Times that this is about the defendants, Senior Chief Edward Gallagher and Lieutenant Jacob Portier, quote, "receiving a fair trial with due process in the military justice system," end quote. The spokesman, Captain Greg Hicks, declined to comment specifically on the tracking code, but said, quote, "following continuing and ongoing violations of the federal protective order, NCIS initiated a separate investigation into violations of that protective order. That investigation is ongoing," end quote.
Dave Bittner: [00:04:23] Captain Hicks did not say whether the Navy obtained a search warrant or subpoena in connection with the emails. He did say that, quote, "the media was not and is not the focus of the investigation. The focus of the investigation is squarely on identifying unauthorized disclosures that violate the judge's protective order." An NCIS spokesman said that, quote, "during the course of the leak investigation, NCIS used an audit capability that ensures the integrity of protected documents. It is not malware, not a virus and does not reside on computer systems. There is no risk that systems are corrupted or compromised," end quote.
Dave Bittner: [00:05:01] This of course satisfies no one who was troubled by the telltale code. Military Times points out that this may violate existing privacy laws, including the Electronic Communications Privacy Act. And defense counsel have complained about the potential for abuse. The law here may be unsettled, but several state bar associations are on record against the use of such tracking technology. This sort of investigation, by the way, for those of you who watch "NCIS" on TV, is actually a lot more typical of the cases NCIS works on than the harum-scarum stuff you see on the small screen.
Dave Bittner: [00:05:38] Wednesday's U.S. executive order on securing the information and communications technology and services supply chain declared a state of emergency under the International Emergency Economic Powers Act, the National Emergencies Act and Section 301 of Title 3, United States Code. The executive order directs the secretary of commerce to take the lead in minimizing the risk from companies controlled by foreign adversaries - read, China. Its immediate effect is to clamp down on the use of Huawei technology in the U.S. The U.S. Commerce Department immediately banned Huawei and 70 of the company's partners. The measure will also affect U.S. exports. Broadcom, Qualcomm, Intel and Oracle, among others, will henceforth find it difficult to sell to Huawei, The Wall Street Journal points out.
Dave Bittner: [00:06:26] Strictly speaking, Commerce placed the Chinese company and its partners on an Entity List. Doing business with them will require a special license. The Entity List applies to both imports and exports. China's government has called the executive order and its attendant enforcement actions a wrong course and promises to resolutely defend Chinese companies from Washington's depredations. Beijing sees the affair as a move in a trade war.
Dave Bittner: [00:06:55] U.S. allies may be nudged by both prudential policy and the Wassenaar Arrangement to follow suit. Wassenaar is an arms export control regime whose 42 signatories undertake to cooperate on restricting trade in not only conventional weapons but dual-use articles that have both military and civilian uses. Cyber tools are among the dual-use items the arrangement addresses. U.S. allies are also concerned that giving Huawei too large a share in their national infrastructure could inhibit intelligence sharing with the United States. Of the Five Eyes nations, the U.S. and Australia take the hardest line on the risks posed by Huawei products and services. The other three eyes - Canada, New Zealand and the United Kingdom - are uneasy about the Chinese company but more ambivalent than the Australians and Americans.
Dave Bittner: [00:07:45] France's President Emmanuel Macron’s reaction to the U.S. executive order is representative of that in other allied countries. France's position is not to move against Huawei or any other company, but France is determined to take measures to secure itself. That said, President Macron suggested that a trade war was in no one's interest. So why do companies and governments do business with Huawei? The company's gear is good enough. And besides, it's generally the low-cost option. It's so low-cost, in fact, that a number of Huawei skeptics consider the pricing unsustainable - a low-ball campaign for market penetration that will change once the customers are locked in.
Dave Bittner: [00:08:27] Ever taken an online survey, maybe to enter a sweepstakes for prizes? Sure you have. Most of us have. Of course, such surveys and sweepstakes are marketing instruments. And a large Elasticsearch database containing such information as name, physical address, email address, IP address, phone number, date of birth and gender was found exposed online by independent researcher Sanyam Jain. Jain, who BleepingComputer says is affiliated with the GDI Foundation, tracked the data back to marketing firm PathEvolution, a subsidiary of Ifficient. He disclosed the exposure to the company, which promptly secured the data.
Dave Bittner: [00:09:06] Ifficient has pointed out that the data didn't include Social Security numbers, credit card numbers or passport numbers, so it fell short of the fullz so beloved of hacker black marketeers, and short of the kind of personal information covered by various U.S. state laws. But it's still an embarrassing lapse, and Ifficient is both shoring up its security and notifying people affected by the disclosure. Ifficient's response to Jain's disclosure was, Jain said, refreshing: They thanked him and took action. All too often, he suggested to BleepingComputer, researchers who make this kind of disclosure are either ignored or threatened with legal action. So an unfortunate lapse, but a nice tale of responsible disclosure responsibly received.
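Exposures like this one are typically found because an Elasticsearch node left open to the internet answers plain, unauthenticated HTTP requests at its root endpoint. A minimal, stdlib-only sketch of that kind of check follows; the function name is made up and the URL is a placeholder, not the address of any real system:

```python
import json
from urllib.error import URLError
from urllib.request import urlopen


def es_responds_unauthenticated(base_url: str, timeout: float = 3.0):
    """Return the JSON banner if an Elasticsearch node answers its root
    endpoint without credentials, or None if it is unreachable or refuses.

    Getting the cluster banner back with no auth is a strong hint that
    the data behind it is exposed as well.
    """
    try:
        with urlopen(base_url, timeout=timeout) as resp:
            return json.load(resp)
    except (URLError, ValueError, OSError):
        return None
```

Researchers who scan for exposures do something similar at scale; a host that returns the banner warrants a responsible-disclosure report, not further poking.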
Dave Bittner: [00:09:57] Now a moment to tell you about our sponsor, ThreatConnect. Designed by analysts but built for the entire team, ThreatConnect's intelligence-driven security operations platform is the only solution available today with intelligence, automation, analytics and workflows in a single platform. Every day, organizations worldwide use ThreatConnect as the center of their security operations to detect, respond, remediate and automate. With all of your knowledge in one place, enhanced by intelligence, enriched with analytics, driven by workflows, you'll dramatically improve the effectiveness of every member of the team. Want to learn more? Check out their newest e-book, "SOAR Platforms: Everything You Need To Know About Security, Orchestration, Automation And Response." The book talks about intelligence-driven orchestration, decreasing time to response and remediation with SOAR and ends with a checklist for a complete SOAR solution. Download it at threatconnect.com/cyberwire. That's threatconnect.com/cyberwire. And we thank ThreatConnect for sponsoring our show.
Dave Bittner: [00:11:13] And joining me once again is Joe Carrigan. He's from the Johns Hopkins University Information Security Institute and also my co-host over on the Hacking Humans podcast. Joe, it's great to have you back.
Joe Carrigan: [00:11:22] It's good to be back, Dave.
Dave Bittner: [00:11:23] We had a report come by - this is from a company called Apricorn. And they manufacture hardware-encrypted USB data storage devices.
Joe Carrigan: [00:11:33] Correct.
Dave Bittner: [00:11:34] And they came up with a report. They did a survey...
Joe Carrigan: [00:11:38] Right.
Dave Bittner: [00:11:38] ...Of over 300 people across a bunch of different industries, examining ways that they use USB devices.
Joe Carrigan: [00:11:45] Correct.
Dave Bittner: [00:11:45] So trying to sort of take a look at, what are people doing right, and what are people doing wrong?
Joe Carrigan: [00:11:50] Right.
Dave Bittner: [00:11:50] And what did they find here?
Joe Carrigan: [00:11:51] They found that 91% of the respondents claimed that encrypted USB drives were important...
Dave Bittner: [00:11:57] Yeah.
Joe Carrigan: [00:11:57] ...Right? But only 58% said that they regularly use encrypted USB devices. So that's interesting.
Dave Bittner: [00:12:05] Yeah, well let's just walk back a little bit here.
Joe Carrigan: [00:12:08] Sure.
Dave Bittner: [00:12:08] When we're talking about an encrypted USB drive, what are the practical implications of that?
Joe Carrigan: [00:12:12] I'm not exactly sure how the product that Apricorn sells works, but presumably, it's a hardware-level encryption so that everything on the device is encrypted and that if you don't have the keys to get into that device, then you're never going to open it.
Dave Bittner: [00:12:26] Right. So if I lose this drive in a parking lot or something...
Joe Carrigan: [00:12:30] Right.
Dave Bittner: [00:12:30] ...Somebody who picks it up - it's going to be worthless to them.
Joe Carrigan: [00:12:32] It's going to be worthless to them. Right.
Dave Bittner: [00:12:33] Yeah. Yeah.
Joe Carrigan: [00:12:34] Right. There's other ways you can do it. I think SanDisk has a product that's similar that runs software that encrypts the drive. And then, of course, there's the free solution. You could use VeraCrypt, which is an open-source encryption product that lets you create encrypted volumes or encrypt entire volumes, like a USB drive. That's actually the solution I use to keep my important stuff encrypted on my USB drives.
Dave Bittner: [00:12:59] Yeah.
Joe Carrigan: [00:13:00] And I use it by creating a volume - right? - an encrypted volume that takes up a certain portion of the drive. The reason I do that is because I still need to have these drives available for unencrypted usage. I mean, encryption is not always important. For example, if I'm going to give a presentation to somebody - right? - I'm going to give a presentation to a group of people, and I have that presentation on a USB drive, right? I'm going to show everybody in the room - and show everybody in the world - what this presentation is, if they wanted to watch it.
Dave Bittner: [00:13:30] Right.
Joe Carrigan: [00:13:31] So I really don't care if this information is discovered, but I do need a way to quickly and easily put it on somebody else's computer without having to worry about - do they have the VeraCrypt software installed? Or do they have the SanDisk software installed?
Dave Bittner: [00:13:43] Right. Or even just have the key or the password.
Joe Carrigan: [00:13:45] Right. Or do I have the keys or the password?
Dave Bittner: [00:13:47] Might not want to reveal. Yeah.
Joe Carrigan: [00:13:48] I can just plug it in, copy my presentation over and deliver my presentation. So, I mean, again, we have to decide, what's information we want to protect, and what's information we don't want to protect? Now, obviously, there's lots of information we want to protect. And if that information needs to be protected, then when it's on a USB device, that's considered data at rest, right?
Dave Bittner: [00:14:06] Yeah.
Joe Carrigan: [00:14:06] We should definitely be encrypting that information...
Dave Bittner: [00:14:08] Yeah.
Joe Carrigan: [00:14:09] ...Through some means.
Dave Bittner: [00:14:10] Yeah. And this whole notion of kind of the, you know, USB device promiscuity, you know, of going from one computer to the other...
Joe Carrigan: [00:14:18] Yep.
Dave Bittner: [00:14:18] ...I think about public health. And on the one hand, you know, there's washing your hands. In the other hand, there's inoculation.
Joe Carrigan: [00:14:27] Right.
Dave Bittner: [00:14:27] You know, and it seems to me like you should try to be protecting your devices from both ends...
Joe Carrigan: [00:14:32] Correct.
Dave Bittner: [00:14:32] ...You know, informing your users not to just plug these things in and out willy-nilly...
Joe Carrigan: [00:14:37] Right.
Dave Bittner: [00:14:38] ...But also have whatever mechanisms are on the machine that might get plugged into, so that whenever something gets plugged in, before you go off and load something or run something, you have some kind of software on there to take a look at whatever might be on the drive and scan to see if there might be any problems.
Joe Carrigan: [00:14:53] Yeah, absolutely. And additionally, there are other things out there - malicious hardware that is just a bank of capacitors. It stores up all the power that gets sent to the USB drive over time...
Dave Bittner: [00:15:03] Right.
Joe Carrigan: [00:15:03] ...And then feeds it back all at once in an attempt to burn out the motherboard.
Dave Bittner: [00:15:06] It just zaps it.
Joe Carrigan: [00:15:07] Yeah.
Dave Bittner: [00:15:07] Yeah.
Joe Carrigan: [00:15:08] So don't ever plug anything you find in.
Dave Bittner: [00:15:10] (Laughter).
Joe Carrigan: [00:15:11] I'm not saying - those devices are rare. Those motherboard destroyers are rare, and they're kind of costly, but...
Dave Bittner: [00:15:17] Better safe than sorry.
Joe Carrigan: [00:15:17] Better safe than sorry.
Dave Bittner: [00:15:18] Yeah.
Joe Carrigan: [00:15:19] Your bigger risk here is finding malware or getting malware on something. And I don't like the idea of using the free devices at conferences. I've even gone so far as to hand them out at a conference. But...
Dave Bittner: [00:15:32] (Laughter) Because your boss told you you had to (laughter)?
Joe Carrigan: [00:15:35] No, it's just a piece of swag that we had. But before we did that, I actually had one of my students go through on a Raspberry Pi and delete everything off those things, because we - you know, where do you get those things? They come from some marketing supplier.
Dave Bittner: [00:15:49] Right.
Joe Carrigan: [00:15:50] You don't know what the supply chain on that marketing supplier is. They're buying those devices from the lowest bidder.
Dave Bittner: [00:15:54] Yeah.
Joe Carrigan: [00:15:55] Right?
Dave Bittner: [00:15:55] Yeah.
Joe Carrigan: [00:15:55] So we ran a complete wipe on everything before we handed it out. And we don't hand them out anymore...
Dave Bittner: [00:16:01] Yeah.
Joe Carrigan: [00:16:01] ...Just because we just don't think it's good practice to hand them out.
Dave Bittner: [00:16:04] (Laughter) Right. Right. All right. Well, this company Apricorn who makes these - obviously, they have a little bit of interest in making people want to use encrypted USB drives...
Joe Carrigan: [00:16:13] Which is not a bad idea.
Dave Bittner: [00:16:14] Not a bad idea. But you know, just be mindful it's not a completely unbiased look at this.
Joe Carrigan: [00:16:20] Right. Right.
Dave Bittner: [00:16:21] All right. Well, Joe Carrigan, thanks for joining us.
Joe Carrigan: [00:16:22] My pleasure, Dave.
Dave Bittner: [00:16:28] Now it's time for a few words from our sponsor, BlackBerry Cylance. They're the people who protect our own endpoints here at the CyberWire, and you might consider seeing what BlackBerry Cylance can do for you. You probably know all about legacy antivirus protection. It's very good as far as it goes, but you know what? The bad guys know all about it, too. It will stop the skids, but to keep the savvier hoods' hands off your endpoints, BlackBerry Cylance thinks you need something better. Check out the latest version of CylanceOPTICS. It turns every endpoint into its own security operations center. CylanceOPTICS deploys algorithms formed by machine learning to offer not only immediate protection but security that's quick enough to keep up with the threat by watching, learning and acting on systems behavior and resources. Whether you're worried about advanced malware, commodity hacking or malicious insiders, CylanceOPTICS can help. Visit cylance.com to learn more. And we thank BlackBerry Cylance for sponsoring our show.
Dave Bittner: [00:17:34] My guest today is Mike Kijewski. He's CEO and founder of MedCrypt, a company that helps medical device manufacturers improve their cybersecurity. Our conversation centers on the current security state of medical devices, steps that producers of those devices are taking to better secure them and how appropriate regulation may play a part in moving things forward.
Mike Kijewski: [00:17:57] The majority of medical devices that are being used today were designed without really any thought about cybersecurity when the devices were developed. And there could be a couple of reasons for that. It could be that device vendors were assuming that these devices were being used in a hospital network. And the hospital network was inherently secure, and therefore, the devices themselves didn't need to have these security features. Or it could be that the device vendors just didn't think that, you know, security was an issue. Who's ever going to hack a medical device? You know, is this a sort of an apocalyptic scenario that is reserved for, you know, terrorist movies? Or is this, you know, actually a real, practical concern?
Mike Kijewski: [00:18:36] And I think going forward, there are challenges when designing a medical device in prioritizing clinical features over cybersecurity features. So for example, you know, the No. 1 priority of a pacemaker is that it always continues to keep the patient's heart beating. And when you're designing a pacemaker, that's obviously the most important thing that you need to be designing for the device. Well, how many clinical features can an engineering team, you know, put off to the future in return for implementing some security features to ensure that that device is, you know, functioning safely?
Mike Kijewski: [00:19:12] And designing security features into devices, as you can imagine, can be pretty tricky and pretty time-consuming. So there's this constant battle between, you know, clinical functionality, interoperability, ease of use for clinicians and actually building security features into these things so that, you know, bad guys can't do bad things with them.
Dave Bittner: [00:19:29] Yeah, yeah. One of my colleagues here, Joe Carrigan, works at Johns Hopkins. And he was saying that, you know, someone comes into the emergency room there, and they're not going to say, hey, my first priority is that you secure my private information. You know, I need you to, you know, take care of this chest pain that I'm having or get these bullets out of me.
Mike Kijewski: [00:19:49] Yeah. That's exactly right. And I think, you know, maybe even a less dramatic illustration of that issue is the classic problem in health care of - who is the user? Who's the customer? Who's the buyer? When you have a cardiologist who's, you know, choosing a pacemaker for a patient, the cybersecurity features are probably relatively low on their list, right? That pacemaker is not going to function on the hospital's network, so the hospital IT department isn't that concerned about it. The insurance company doesn't really have any incentive to minimize the security risk in the pacemaker. And the medical device vendor, you know, they're just trying to build clinical functions to differentiate their device from the competitors.
Mike Kijewski: [00:20:27] So you end up with a device that is in a patient's chest that goes home with them that perhaps has some security features, and then you have, you know, a nightmare scenario where the FDA has to mandate a recall on a device because they've decided that, you know, a security vulnerability found in that device is not acceptable.
Dave Bittner: [00:20:45] Do you think there's a regulatory solution here? I mean, certainly, when I think of the medical industry, I think of things like HIPAA where, you know, big changes can happen that can come down from on high. Is that one possibility?
Mike Kijewski: [00:20:59] You know, it's a really good question. And I, before starting this company, you know, considered myself to be a, you know, very free market, somewhat libertarian-leaning individual who was very skeptical of the government's ability to have a positive impact, you know, on something in the market. But having worked pretty closely alongside the FDA in looking at this problem, I found that, No. 1, there might not be another organization that has the leverage necessary to fix this problem. It might need to come from the FDA. And the things that the FDA has done in the last four years to attempt to improve medical device cybersecurity I think have been very measured, sort of responsible interventions and not the kind of heavy-handed anti-business legislation that some people would have you believe that the FDA is, you know, in the business of doing.
Mike Kijewski: [00:21:54] One example of this - they've come out with two guidance documents related to what they call post-market cybersecurity and premarket cybersecurity in medical devices. And as a guidance document, the first page of each of these documents says, these are non-binding recommendations from the FDA. So lots of medical device vendors said, oh, well, these are optional because it's guidance. We don't really need to do this. And the FDA has come out and said, no, that is not true. Safety in medical devices is mandatory. Cybersecurity is an aspect of safety. Here is our recommendation on how to build safe medical devices. If you do it some other way, that's fine, but it needs to be better than this.
Mike Kijewski: [00:22:33] And people have asked them, why don't you just pass a law? Why don't you create some regulation that actually mandates that device vendors do this? And the FDA has said, well, the pace of cybersecurity moves so quickly that if we were to make a regulation, the regulation would be, you know, five years - if not 10 years - behind what the current state of cybersecurity is. If we say, hey, you have to use encryption on medical devices - well, what kind of encryption? What's the key length? What algorithms are OK? It kind of becomes a rat's nest of questions that need to be answered that lead to this sort of heavy-handed regulation that ends up being sort of anti-business. So I think the FDA has done a great job of addressing the problem, you know, especially this most recent guidance they put out last October.
Mike Kijewski: [00:23:11] So one thing that comes up regularly in these sorts of stories is the nightmare scenario of - you know, the "Homeland" episode where the vice president's pacemaker gets hacked. And it's pretty easy to point the finger at device vendors that have been in the news recently for having cybersecurity vulnerabilities in their devices. But what I will say is that, No. 1, the clinical functionality of a medical device - basically any medical device - almost always outweighs the cybersecurity risk of that device. So for example, I think Medtronic had an issue with pacemakers last fall where a vulnerability was found. Does that mean that if, you know, your parent or your grandparent has a pacemaker in their chest, they should go get it removed? No. Absolutely not, right?
Mike Kijewski: [00:23:51] The clinical functionality of these devices is orders of magnitude more beneficial than the cybersecurity risks are detrimental. And I do think that, you know, from working with a lot of these bigger medical device vendors pretty closely, they're doing a great job of changing their practices and building features into devices. And some of them have been doing this for, you know, close to a decade and have done a pretty good job at it. So it's - you know, the sky is not falling. This isn't a nightmare scenario. Device vendors, I don't think, are being negligent by putting devices out there that lack security features. I think it's more of an industry problem where, as we discussed, the incentives maybe aren't perfectly aligned to result in really well-secured devices. And the FDA has been the organization that I think has done the best job of changing that dynamic.
Dave Bittner: [00:24:37] That's Mike Kijewski from MedCrypt.
Dave Bittner: [00:24:45] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially our supporting sponsor, ObserveIT, the leading insider threat management platform. Learn more at observeit.com.
Dave Bittner: [00:24:57] The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our CyberWire editor is John Petrik, social media editor Jennifer Eiben, technical editor Chris Russell, executive editor Peter Kilpe. And I'm Dave Bittner. Thanks for listening.