Dave Bittner: [00:00:01:02] Many thanks to all of our Patreon contributors. You too can become a contributor, just go to patreon.com/thecyberwire and find out how. We do appreciate it. Thanks.
Dave Bittner: [00:00:14:07] Is the third man in the Shadow Brokers leak soon to be revealed? ISIS enters its diaspora phase. A Monero miner targets Macs. Google Play ejects apps with the Tizi back door. Scarab ransomware is blasted out in a spam campaign. Uber's value takes a hit post-breach disclosure. Barracuda Networks is taken private. Trend Micro buys Immunio. And the Pittsburgh FBI office takes another whack at Chinese industrial espionage.
Dave Bittner: [00:00:47:16] A few words from our sponsors at E8 Security. If you've been to any security conference over the past year you've surely heard a lot about artificial intelligence and machine learning. We know we have. But E8 would like you to know that these aren't just buzz words. They're real technologies and they can help you derive meaning from what an overwhelmed human analyst would see as an impossible flood of data. Go to e8security.com/cyberwire and let their White Paper guide you through the possibilities of these indispensable emerging technological tools. Remember, the buzz around artificial intelligence isn't about replacing humans, it's really about machine learning, a technology that's here today. So, see what E8 has to say about it and they promise you won't get a sales call from a robot. Learn more at e8security.com/cyberwire. And we thank E8 for sponsoring our show.
Dave Bittner: [00:01:45:20] Major funding for the CyberWire podcast is provided by Cylance. I'm Dave Bittner with your CyberWire summary for Tuesday, November 28th, 2017.
Dave Bittner: [00:01:55:13] In a developing story that we'll be watching, Brian Krebs thinks he has a lead on who the unknown leaker was whose device was looted for alleged NSA tools that found their way into the hands of the Shadow Brokers last year. It's too early to name individuals or companies, but the third person may become known before too much more time has passed.
Dave Bittner: [00:02:17:00] ISIS, effectively ejected from territory it once controlled, appears to be entering its long-anticipated diaspora phase, which informed observers expect to be marked by more focus on cyberspace. For the foreseeable future, this is held by most to mean increased attempts at online inspiration. There may be an attempt to reestablish a territorial sanctuary, possibly in the Philippines, and there are signs that the former Caliphate may be trying to attract women to its cause, in part through online matrimonial appeals.
Dave Bittner: [00:02:49:18] Criminals continue their attempts on cryptocurrencies. Security company SentinelOne today announced its discovery of a new cryptocurrency mining Trojan, "OSX.CpuMeaner," that targets Macs. It's after Monero cryptocurrency, and it appears, SentinelOne researchers say, to have borrowed some of the tactics and techniques used in the adware underground.
Dave Bittner: [00:03:13:00] Google's latest sweep through Google Play turns up several apps equipped with the Tizi back door. Tizi has typically been used to install spyware on target devices.
Dave Bittner: [00:03:23:07] There are other concerns about Android security, and especially privacy. A study by Yale University concludes that about three quarters of Android apps come with third-party tools that track users' activity.
Dave Bittner: [00:03:36:15] Forcepoint warns of a massive spam campaign that's distributing Scarab ransomware. The crooks sent out about twelve million infected emails over a six-hour period. Ransomware is enjoying burgeoning demand on the black market. Carbon Black has reported a 2,500% rise in ransomware sales since last year, so the hoods still seem to think this is the coming thing in crime. Ransomware is also growing more targeted and more difficult to detect.
Dave Bittner: [00:04:05:05] No one seems to be buying the whistling in the dark Uber did before its recent shake-up and breach disclosure. It strikes most observers as unlikely in the extreme that the criminals who hacked the ride service actually destroyed the data they stole. There had been speculation last week that the company's value would take a hit, and we now have a better idea of what the breach discount may be in this case. SoftBank's offer is out, and it seems to be about 30% lower than pre-disclosure expectations would have put it.
Dave Bittner: [00:04:35:13] Security analysts face a seemingly ever increasing stream of available data, and separating the signal from the noise can be challenging. Properly dialing in what generates an alert and demands your attention can make all the difference in the world. Bryan Ware is CEO at Haystax Technology and he shares techniques for using machine learning to help cut through the noise.
Bryan Ware: [00:04:57:02] So often we discover that a breach has taken place months after that breach took place. When you think through that then you realize that the data was there at the time that the breach was taking place, maybe even there was some data there before the breach took place, but that data wasn't actionable, you know, you couldn't make a decision from it. And so we're in this era of artificial intelligence and machine learning specifically where it is a great opportunity to build algorithms that look for the kinds of anomalies, or look for the kinds of changes, that could be indications of, you know, some kind of a threat, and something that you'd like to bring as an alert, you know, to an analyst or to a decision maker.
Bryan Ware: [00:05:41:14] So the approach that we've taken at Haystax is what we call our Model First Approach and that is that we build models that represent what experts believe or what analysts would do if they were trying to evaluate is this a real threat or not, or is this a suspicious event or not, and those models are in very human and conceptual terms. It's a form of AI that is called Bayesian Networks, so these are probabilistic networks that represent the belief of experts and the uncertainties in those beliefs, and then once we have that model then we can connect it to all those alerts that come from other machine learning approaches or specific pieces of data. So what it allows us to do is to really operate at scale in the sense that if you're generating potentially hundreds of thousands of events per day, you never want to have that many alerts, but if you could resolve them in the way that your analysts would, and prioritize them according to the mental model they're going to use anyway kind of after the fact, but you do it at the time of the event, at the time of the transaction, well, then you can build a really scalable system and you can just let the analyst see the ones that are of serious concern.
Bryan Ware: [00:06:56:05] The way I kind of describe that is that's building the physics of the problem space. What does a suspicious event look like? Or what does an insider threat look like? And once I've built that out then I know how I would use data as it becomes available to determine the degree to which this person looks like an insider threat or the degree to which this looks like a suspicious transaction. So, yes, we do have to ultimately connect it to all of that data but it's not so much learning from the data as watching the data as it changes and as the data changes then the model updates as well so that the beliefs change. Let me give you a really simple example.
Dave Bittner: [00:07:33:09] Yeah.
Bryan Ware: [00:07:34:04] We might say that an insider threat would be someone who comes into work at an unusual hour and prints documents that they don't usually have access to or wouldn't normally print, to an unusual printer, and maybe there's a bunch of other little bitty things too. Now, it turns out that if you just built an alert on printing to an unusual printer or if you just built an alert on printing a large file, or you just built an alert on came into work after six o'clock, then you'll end up, for any large company, with lots and lots of alerts that are almost always easily explained. The model says, well, I would want to know about someone who's potentially thinking about leaving the firm and printing documents and coming into work at unusual hours. All together I'd love to know that. But now, you know, those are different kinds of data and different kinds of alerts.
Bryan Ware: [00:08:20:10] So we have the model that says this is how I would fuse that together and how I would reason on it if I knew all of those things and maybe even some other things about this employee's performance and then we have machine learned algorithms that basically say, well, what is the normal times that Bryan comes into work and what are the normal places that he comes into work? What doors does he go through? Or what does he normally print? Those are all machine learned from the data. But the way that I combine all those different alerts is a model that, for the most part, is static for a pretty long period of time. It represents, you know, what the experts, what analysts really, really believe and we do learn some new things and we do change our beliefs and we learn some new indicators, for the most part those models stay pretty stable even though the data is changing constantly.
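The fusion Ware describes can be sketched in a few lines. The probabilities below are illustrative assumptions, not Haystax's actual model: a small prior on any employee being an insider threat, and per-indicator likelihoods under the threat and benign hypotheses. Fusing the indicators with a naive-Bayes update shows why a single alert (say, an unusual printer) barely moves the posterior, while several weak indicators together produce something worth an analyst's time.

```python
# Illustrative sketch of fusing weak alerts into one prioritized score,
# in the spirit of the Bayesian approach described above. All numbers
# here are assumptions for demonstration, not a real threat model.

PRIOR = 0.001  # assumed prior probability an employee is an insider threat

# Assumed likelihoods: (P(indicator | threat), P(indicator | benign)).
LIKELIHOODS = {
    "after_hours_entry": (0.60, 0.10),
    "unusual_printer":   (0.50, 0.05),
    "large_print_job":   (0.70, 0.15),
}

def threat_posterior(observed):
    """Naive-Bayes fusion: combine observed indicators into a posterior
    probability of 'insider threat', treating indicators as independent."""
    p_threat, p_benign = PRIOR, 1.0 - PRIOR
    for name, (p_if_threat, p_if_benign) in LIKELIHOODS.items():
        if name in observed:
            p_threat *= p_if_threat
            p_benign *= p_if_benign
        else:  # absence of an indicator is evidence too
            p_threat *= 1.0 - p_if_threat
            p_benign *= 1.0 - p_if_benign
    return p_threat / (p_threat + p_benign)

# One indicator alone stays far below any sensible alert threshold...
print(round(threat_posterior({"unusual_printer"}), 4))
# ...but all three together push the posterior orders of magnitude higher.
print(round(threat_posterior({"after_hours_entry",
                              "unusual_printer",
                              "large_print_job"}), 4))
```

This is the scaling point from the interview: rather than alerting on each machine-learned anomaly individually, the fusion model resolves them the way an analyst would, so only the jointly suspicious cases surface.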
Dave Bittner: [00:09:07:03] That's Bryan Ware from Haystax.
Dave Bittner: [00:09:11:07] In industry news, Akamai has announced that it's completed its acquisition of Nominum.
Dave Bittner: [00:09:16:23] Trend Micro announced that it's bought Immunio, a company that specializes in application security. It's thought that Trend Micro sees the acquisition as a way of moving its hybrid cloud security offerings into the devops market.
Dave Bittner: [00:09:31:03] Thoma Bravo is taking Barracuda Networks private. The private equity firm paid $1.6 billion for the company. Some analysts think this will be good for both buyer and seller. Barracuda may become more focused and agile, and Thoma Bravo may have picked up a business the markets tended to undervalue.
Dave Bittner: [00:09:50:22] And, finally, there's been another international indictment in the US, this one of three Chinese nationals associated with the APT3 cyber-spying operation. The operation is also called "Boyusec," short for the Guangzhou Bo Yu Information Technology Company, a contractor for the Ministry of State Security that's known for domestic surveillance of targets in Hong Kong. The US regards the firm as basically a front for an espionage operation.
Dave Bittner: [00:10:18:09] The indictment charges the three Boyusec workers with theft of intellectual property belonging to Western aerospace and defense firms. The indictment mentions theft from Trimble, Siemens, and Moody's Analytics.
Dave Bittner: [00:10:31:10] And how did they get indicted? Through the hard work of the Pittsburgh FBI office. AlienVault's Chris Doman reached out to us to share his appreciation for the Pittsburgh Field Office's work. He said, quote "It’s not a surprise this indictment comes from the FBI’s Pittsburgh office, they have been very aggressive at going after cyber criminals," end quote. We agree. And to the Steel City G-Men, yinz're doing a great job.
Dave Bittner: [00:11:01:13] Now I'd like to tell you about some research from our sponsor Cylance. Good policy is informed by sound technical understanding, and the crypto wars aren't over. Cylance would like to share some thoughts from ICIT on the surveillance state and censorship, and about the conundrum of censorship legislation. They've concluded that recent efforts by governments to weaken encryption, introduce exploitable vulnerabilities into applications, and develop nation-state dragnet surveillance programs will do little to stymie the rise in terrorist attacks. These efforts will be a detriment to national security and only further exhaust law enforcement resources and obfuscate adversary communiqués with a massive cloud of noise. Back doors for the good guys means back doors for the bad guys, and it's next to impossible to keep the lone wolves from hearing the howling of the pack. Go to cylance.com and take a look at their blog for reflections on surveillance, censorship and security. And we thank Cylance for sponsoring our show.
Dave Bittner: [00:12:07:09] And I'm pleased to be joined once again by Emily Wilson. She's the director of analysis at Terbium Labs. Emily, you were at a conference recently where the subject of the privacy of children came up, and it led to an interesting discussion. Why don't you share with us what happened?
Emily Wilson: [00:12:22:18] I was at Cybersecurity Week in the Netherlands and someone in the audience, in a discussion, raised the point that kids these days don't care about their data, kids don't care, they're just giving all of their data away for free.
Dave Bittner: [00:12:37:03] Right.
Emily Wilson: [00:12:38:08] And one of the panelists pushed back on that, I thought appropriately, saying, you know, we have protections in place for plenty of other aspects of kids' lives. We don't allow them to drive or to drink or to vote or to join the military until they are old enough to understand the implications and the consequences of their actions. So why are we treating data privacy like it's something different? And it is a hard thing to figure out how to solve, right? Partially because plenty of organizations are and have been collecting data on children for years and we've already seen, you know, some companies that specialize in devices for children having data breaches but it's also-- I don't think it's fair to characterize it as kids today don't care about their data when plenty of adults don't understand the implications of all of the data that they're sharing either.
Dave Bittner: [00:13:37:08] And we sort of have this generational divide where the people who are setting policy are not digital natives so this sort of thing isn't reflexive to them.
Emily Wilson: [00:13:46:11] It isn't reflexive, but I also-- and this was another point that one of the panelists made, we can't wait to look at best practices and regulations and data privacy and think about how this is impacting adults and children alike until we have digital natives in office or in positions of authority to help influence policy. Now these are decisions that we need to be making now. And I think the other question, something that I think about a lot, is when and how are we going to start seeing the fallout from some of this data? And not just data that's being shared, not just data that you're putting in to sign up for some app, but, you know, in the work that I do I am lucky enough to be working away from some of the more unpleasant parts of the dark web where children are exploited more directly but I do see plenty of data leaks involving children. And I mean children, not university students, I mean children. Whether this is social security numbers of children being sold or data leaks from elementary schools, it's awful to see.
Dave Bittner: [00:14:56:20] So we can understand beyond the obvious and the horrific things, that child exploitation and that sort of thing, is there a particular value of a, you know, personal identificat-- of a PII of a child?
Emily Wilson: [00:15:10:14] In some cases, right now, yes. The social security numbers for example, they're useful for tax fraud, for specific types of tax fraud. In other cases, and honestly this is the part that makes it worse, is that people don't particularly care who data belongs to or where it comes from as long as they can use it, and so, you may have the information for a 17 year old being mixed in with the information for a 45 year old and the difference is that one of those people is going to be checking their credit score more regularly.
Dave Bittner: [00:15:44:16] And so the ten year old who gets their information breached may not know there was ever a problem until they're 16 and try to get a driver's license or 18 and try to get a credit card, and their information's been out there for a decade.
Emily Wilson: [00:15:57:22] Exactly, and that's the kind of thing that I don't think we are seeing yet at scale but I imagine we will start to see over the next, call it ten years, and I don't know what that's going to look like and I don't think many other people do either. But it's a question that's beginning to be raised and I think that's good.
Dave Bittner: [00:16:19:00] Alright, Emily Wilson, thanks for joining us.
Dave Bittner: [00:16:23:21] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially to our sustaining sponsor, Cylance. To find out how Cylance can help protect you using artificial intelligence, visit cylance.com. The CyberWire podcast is produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology.
Dave Bittner: [00:16:45:15] The podcast is produced by Pratt Street Media. Our editor is John Petrik. Social media editor is Jennifer Eiben. Technical editor is Chris Russell. Executive editor is Peter Kilpe. And I'm Dave Bittner. Thanks for listening.