Notes from the CISA Summit. New DDoS vector reported. Medical images exposed online. Huawei and US sanctions. Engaging ISIS in cyberspace.
Dave Bittner: [00:00:03] We've got a quick look at CISA's National Cybersecurity Summit. A big new distributed denial-of-service vector is reported. Medical servers leave patient information exposed to the public internet. Huawei is suspended from the FIRST group as it argues its case in a U.S. federal court. And one of the challenges of engaging ISIS online is that it relies so heavily on commercial infrastructure. It's got to be targeted carefully.
Dave Bittner: [00:00:34] And now a word from our sponsor ExtraHop, the enterprise cyber analytics company delivering security from the inside out. The cloud may help development and application teams move fast, but for security teams already dealing with alert fatigue, tool sprawl and legacy workflows, cloud adoption means a lot more stress. You're building your business cloud first. It's time to build your security the same way. ExtraHop's Reveal(x) provides network detection and response for the hybrid enterprise with complete visibility, real-time detection and guided investigation. Reveal(x) helps security teams unify threat detection and response across on-prem and cloud workloads so you can protect and scale your business. Learn more at extrahop.com/cyber. That's extrahop.com/cyber, and we thank ExtraHop for sponsoring our show.
Dave Bittner: [00:01:32] Funding for this CyberWire podcast is made possible in part by Bugcrowd, connecting organizations with the top security researchers, pen testers and white hat hackers in the world to identify 10 times more vulnerabilities than scanners or traditional pen tests. Learn more about how their award-winning platform provides actionable insights like remediation advice to help fix faster while methodology-driven assessments ensure compliance needs are met at bugcrowd.com.
Dave Bittner: [00:01:59] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Thursday, September 19, 2019. U.S. federal agencies are taking election security seriously, as we heard yesterday at the second annual National Cybersecurity Summit organized by the Cybersecurity and Infrastructure Security Agency, CISA. CISA and its partners are concerned with direct hacking of voting systems but also with countering influence operations mounted by hostile foreign governments. Discussions were particularly aware of the ways in which social media lend themselves to confirmation bias and the ways in which such bias can be used to create or exploit fissures in civil society.
Dave Bittner: [00:02:43] CISA director Christopher Krebs also offered a suggestion to the security industry - please stop selling fear. Sure, it can work for marketing sometimes, although even there, it's subject to diminishing returns as the customer slides into learned helplessness. But it's an impediment to sensible discussions and planning that could actually avert damage. This is especially true, he thought, with election security, where citizens' confidence in their institutions is a principal target. He didn't ask why we should do the opposition's work for them, but we will. If the bad actors want to destroy trust and confidence, let them try to do so without the security industry scoring a lot of own goals. So keep calm and carry on.
Dave Bittner: [00:03:29] Akamai reports that a new distributed denial-of-service vector, WS-Discovery, a UDP amplification technique, is being exploited in the wild. The approach is a good one from the attackers' point of view, since it's enabling them to achieve amplification rates of 15,300 percent. Now, we don't have an intuitive grasp of how big that is either. It's like astronomical distances. You've got no feel for them at all, but you're pretty sure they're pretty big. This, Akamai points out, gives the attack technique the fourth-highest reflected amplification factor on the DDoS leaderboard.
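For the curious, the mechanics are simple to sketch. WS-Discovery services listen on UDP port 3702 and answer small probe datagrams with much larger responses, and because UDP doesn't verify source addresses, an attacker can spoof a victim's address and let the devices do the flooding. Here's a minimal, hypothetical Python sketch of how a defender might measure the amplification factor of a device they own; the probe body is an illustrative SOAP-over-UDP message, not something taken from Akamai's report.

```python
import socket
import uuid

# Illustrative WS-Discovery Probe body (2005/04 namespaces); an assumption
# for demonstration, not reproduced from Akamai's research.
PROBE_TEMPLATE = (
    '<?xml version="1.0" encoding="utf-8"?>'
    '<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" '
    'xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/08/addressing" '
    'xmlns:wsd="http://schemas.xmlsoap.org/ws/2005/04/discovery">'
    '<soap:Header>'
    '<wsa:To>urn:schemas-xmlsoap-org:ws:2005:04:discovery</wsa:To>'
    '<wsa:Action>http://schemas.xmlsoap.org/ws/2005/04/discovery/Probe</wsa:Action>'
    '<wsa:MessageID>urn:uuid:{}</wsa:MessageID>'
    '</soap:Header>'
    '<soap:Body><wsd:Probe/></soap:Body>'
    '</soap:Envelope>'
)

def amplification_factor(host: str, port: int = 3702, timeout: float = 2.0) -> float:
    """Send one small probe and compare its size to the total response bytes."""
    probe = PROBE_TEMPLATE.format(uuid.uuid4()).encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    received = 0
    try:
        sock.sendto(probe, (host, port))
        while True:  # a single probe can elicit several response datagrams
            data, _ = sock.recvfrom(65535)
            received += len(data)
    except socket.timeout:
        pass  # no more responses; fall through and compute the ratio
    finally:
        sock.close()
    return received / len(probe)

# print(amplification_factor("192.0.2.10"))  # placeholder host you control
```

A returned factor of 153 corresponds to the 15,300 percent figure Akamai reports - every byte the attacker spends buys roughly 153 bytes aimed at the victim.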
Dave Bittner: [00:04:06] There's been another case of misconfigured servers exposing private information to public inspection. Researchers at Greenbone Networks have found a very large number of medical images - radiological images, for the most part - sitting out there online. Greenbone looked at 2,300 picture archiving and communication systems - servers based on the DICOM protocol - and found that some 400 million images belonging to 24.5 million patients were easily accessible. Why would someone care about this? Apart from being sensitive about your X-rays, there are several good reasons. The exposed files were commonly associated with patient data that included a full name, date of birth, date of examination, what the researchers call scope of the investigation, the type of imaging, the attending physician, the health care facility where the procedures were performed and the number of images generated during the procedures. One often thinks first of identity theft in such cases, and of course, that's a possibility, but this sort of information is also very useful in social engineering. Consider - you're in for medical imaging, which is often associated with serious and frightening conditions. Your guard will be down if you receive an email or phone call that appears to be from the doctor or the tech who took the X-rays or MRI. That's the bigger problem here.
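A first-pass check for this kind of exposure needs nothing more than the standard library. The sketch below, with a hypothetical hostname, simply tests whether a host answers on port 104, the standard DICOM port (11112 is also common). A completed TCP handshake is only a first signal; a real audit would follow up with a DICOM C-ECHO using a purpose-built library such as pynetdicom.

```python
import socket

def dicom_port_open(host: str, port: int = 104, timeout: float = 3.0) -> bool:
    """Return True if the host accepts TCP connections on the DICOM port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical hostname for illustration; test only systems you're
# authorized to assess.
for host in ["pacs.example-hospital.org"]:
    if dicom_port_open(host):
        print(f"WARNING: {host} answers on the DICOM port from this network")
```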
Dave Bittner: [00:05:30] GDPR created huge incentives for companies to make sure they met data privacy regulations by the implementation deadline. Still, there are some areas where they are lagging behind. David Talaga is from data integrity and integration firm Talend, and he offers his insights.
David Talaga: [00:05:48] GDPR had its one-year anniversary back in May 2019. At that time, the European Data Protection Board told us there had been 90,000 complaints. Most complaints were about telemarketing, promotional email, video surveillance - that kind of thing. Looking across sites, what we found out is that 98% of privacy policies have been updated, which is fine. You can go to a website and see that the terms of agreement have been updated for GDPR. But in reality, we saw that 70% did not comply in practice. They failed to provide data within 30 days, which is the limit GDPR sets. So in reality, data management is still suffering. It's still not there. From a policy point of view, from a process point of view, that's fine - you can see the lawyers have done their jobs. But in reality, data management pipelines will need to be integrated with each other.
Dave Bittner: [00:06:46] And what is causing that gap? Why are they not doing a better job?
David Talaga: [00:06:50] Because I think the point is that they start by going to the lawyers but not to the IT departments, and not talking together about, what do I need to do to make sure my data is covered? And the fact is, sometimes no one is accountable for that inside companies - that's the first point. Second point, sometimes they don't know that such tools exist. They're very keen on integrating things but not keen on trusting the data, and they don't even know where to start. So over the last two years, we've tried to help them make really good progress there - informing your customers, informing the market that things have changed. And now we have data quality tools that help them really protect their data and secure their data pipelines.
Dave Bittner: [00:07:34] Now, describe to me - when you're talking about data masking, what exactly is that, and what does that get me in terms of compliance?
David Talaga: [00:07:44] Data masking - say you go to a store and they register your personal information, or you go to a website and enter your email, your address and so on, and you keep receiving promotional emails. You're upset about that, so you file a deletion request to have your record deleted by the company. The thing is, right now, companies are struggling to do that. They're doing it manually. That's OK for one or two records, but imagine you have millions of records flowing through your website or your retail shops, and several hundred of these deletion requests from customers. So you need encryption. You need data masking. At some point, once a user has requested their data be deleted, you can automate that task. The personal data - a first name, say - will be replaced by a random series of figures. You keep the structure - first name, last name, gender and so on - but no personal information remains in the data. And you can tell the customer, OK, we have deleted all your data. It's done.
Dave Bittner: [00:09:01] That's David Talaga from Talend.
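To make the masking step Talaga describes concrete, here's a minimal, hypothetical sketch: replace the characters of personal fields with random ones of the same shape, so the record's structure survives while the personal information does not. The field names are illustrative and this is not Talend's API.

```python
import secrets
import string

# Fields treated as personal data in this illustration.
PERSONAL_FIELDS = {"first_name", "last_name", "email"}

def mask_value(value: str) -> str:
    """Swap each letter/digit for a random one, preserving length and shape."""
    out = []
    for ch in value:
        if ch.isalpha():
            out.append(secrets.choice(string.ascii_lowercase))
        elif ch.isdigit():
            out.append(secrets.choice(string.digits))
        else:
            out.append(ch)  # keep separators like '@' and '.' so formats hold
    return "".join(out)

def mask_record(record: dict) -> dict:
    """Mask only the personal fields; non-personal fields pass through."""
    return {k: mask_value(v) if k in PERSONAL_FIELDS else v
            for k, v in record.items()}

print(mask_record({"first_name": "Ada", "last_name": "Lovelace",
                   "email": "ada@example.com", "gender": "F"}))
```

The point of preserving shape rather than simply deleting fields is that downstream systems and analytics keep working against the same record structure, while nothing identifying remains.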
Dave Bittner: [00:09:04] The Wall Street Journal reports that Huawei's membership in FIRST, the Forum of Incident Response and Security Teams, has been suspended. FIRST says the suspension is temporary and was undertaken in response to U.S. trade sanctions against Huawei. FIRST is an important cooperative group for the sharing of information among cyber incident response organizations, and Huawei's exclusion from the forum is not a trivial matter. The Washington Post reminds readers that Huawei is defending itself against the sanctions in oral arguments today before the U.S. District Court for the Eastern District of Texas. Their contention is thought likely to be that the U.S. government's strictures against them aren't based on security at all but are just a gambit in a Sino-American trade war. Observers are dubious about how likely this is to fly, but it's not an obviously crazy position, and you can't blame a guy for trying.
Dave Bittner: [00:10:00] U.S. Cyber Command is ramping up operations against ISIS. The sometime-caliphate is not generally reckoned to show a high level of technical sophistication, but it's been able to operate effectively, particularly in its use of the internet for communication and inspiration. Its resilience lies in part in its use of commercial infrastructure, which makes ISIS's online operations difficult to disrupt without doing unacceptably high and sometimes collateral damage. A Marine Corps brigadier general told Fifth Domain, quote, "whether it's cyber or kinetic, we're still under the law of war. So we have to, one, determine where that is, and if we find that out and we can't hand that off to another intelligence agency or local law enforcement, then we're at an end until we can get our higher policymakers to come to some agreement at a higher government level to get after that problem," end quote.
Dave Bittner: [00:10:52] To follow up yesterday's discussion of cyber calls for fire, this illustrates the complexity of the problem. If cyberattack is analogous to fire support, it's like fire support delivered during combat in a densely populated city. And the general's observation about the laws of war isn't idle. That's why they put JAG lawyers on targeting teams. So a tough problem but not necessarily an insoluble one - Task Force Ares, good hunting.
Dave Bittner: [00:11:26] And now a word from our sponsor Dragos, the leaders in industrial cybersecurity technology. Threats to electric infrastructure are progressing in both frequency and sophistication. In their latest whitepaper and webinar, Dragos re-analyzes the 2016 Ukraine cyberattack to reveal previously unknown information about the CRASHOVERRIDE malware, its intentions and why it has far more serious and complex implications for the electric community than originally assessed. Learn more about CRASHOVERRIDE and what defenses to take to combat future sophisticated cyberattacks by reading the whitepaper at dragos.com/white-papers or watching their webinar at dragos.com/webinars. To learn more about Dragos' intelligence-driven approach to industrial cybersecurity, register for a free 30-day trial of their ICS threat intelligence at dragos.com/worldview, and we thank Dragos for sponsoring our show.
Dave Bittner: [00:12:37] And joining me once again is Ben Yelin. He's a senior law and policy analyst at the University of Maryland Center for Health and Homeland Security. Ben, it's always great to have you back. I saw a story come by in the MIT Technology Review by Patrick Howell O'Neill, and this was about smartphones and how this notion of compelled decryption might be headed for the Supreme Court. What's going on here?
Ben Yelin: [00:13:02] So the legal principle involved here is your Fifth Amendment right against self-incrimination. You can't be forced to testify against yourself in court. Prior to the advent of Face ID and even Touch ID, this issue was relatively simple because courts considered entering a passcode to unlock your smart device to be what we call testimonial evidence. That's the equivalent of asking you in court, what's your password to get into your phone? And that would force you to tell a judge or jury what that password is, and they'd have access to your device. What's become very interesting with the advent of Face ID and Touch ID is that the evidence collected is no longer testimonial. Rather, it's something that you wouldn't have to actively tell anybody. It could be simply the device matching up to your face or to your fingerprints. Courts have been very divided as to whether forcing somebody to decrypt their device using Face ID or Touch ID violates their Fifth Amendment right against self-incrimination, and because there's been that divide at the lower court level, I think we're anticipating in the next couple of years that this is going to be an issue that makes its way to the United States Supreme Court.
Dave Bittner: [00:14:22] Now, I know devices have what's sometimes referred to as cop mode, where, if you have one of these biometric unlocking mechanisms enabled, you can press a button on the phone a certain number of times and it'll switch over to require a password. Are we looking at any sort of adjustment in the legal approach to that kind of feature?
Ben Yelin: [00:14:45] One thing we talk about frequently in the battle between privacy and government security is this idea of achieving equilibrium in our right to privacy. So because the technology has evolved to have things like cop mode - where, if there are a certain number of attempts to unlock the phone with Face ID or Touch ID, the user has to type in the password - that's technology that has made it more difficult for law enforcement to gain access to these phones, which means, according to how Fourth Amendment jurisprudence has worked over the years, my guess is that courts are now going to try and come up with an equitable solution to try and put those rights back in equilibrium. To put it more simply, they're going to try and give law enforcement additional capability to decrypt those devices in response to this change in technology. That's usually the way it goes for these types of digital privacy cases, and I think that's something that the Supreme Court would consider. It would be a major burden on law enforcement to lose this backdoor access to electronic devices, and if there is this Fifth Amendment right against self-incrimination as it applies to biometric data, they're going to have a very hard time getting access to smartphones.
Dave Bittner: [00:16:07] Any indications on where the Supreme Court might go with something like this?
Ben Yelin: [00:16:12] So one of the foremost experts on digital privacy, Professor Orin Kerr, has been tracking this and has noted that there have been contradictory decisions in all different judicial circuits across the country, and that's why it's such a favorable case for the Supreme Court. When there are disagreements among circuit courts, it's something the Supreme Court is going to look at closely because they're going to need to settle this issue, especially as we get to a point where almost all smartphones and other electronic devices are going to be enabled with Touch ID or Face ID and are going to require biometric data to decrypt. So I think it's going to motivate the Supreme Court to get involved. We've also seen this come up in the news recently because the attorney general of the United States suggested that Congress should enact a law to give law enforcement the ability to decrypt devices. It would basically be a law mandating access - backdoor access for the government to this encrypted data. So it's something that's prevalent in the news. That's, you know, usually the signal. When you have a split among judicial circuits and something that's being talked about in the coequal branches of government, that's usually a good signal that the Supreme Court is ready to take up the issue.
Dave Bittner: [00:17:33] All right. Well, time will tell. Ben Yelin, thanks for joining us.
Ben Yelin: [00:17:36] Thank you.
Dave Bittner: [00:17:42] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially our supporting sponsor ObserveIT, the leading insider threat management platform. Learn more at observeit.com.
Dave Bittner: [00:17:55] The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our amazing CyberWire team is Stefan Vaziri, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Nick Veliky, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Peter Kilpe, and I'm Dave Bittner. Thanks for listening. We'll see you tomorrow.