The CyberWire Daily Podcast 10.12.18
Ep 703 | 10.12.18

Busy Bears, again. Mixing IT and OT is a risky business. New Android Trojan. Supply chain seeding attack updates. Facebook purges more "inauthentic" accounts. Data privacy. Cyber sanctions.

Transcript

Dave Bittner: [00:00:03] Ukraine says it's under cyberattack again. ESET connects TeleBots and BlackEnergy. Port hacks highlight the risks of mixing IT and OT. Talos finds a new Android Trojan. Facebook purges more inauthentic sites; this time, they're American. Data privacy regulation is trending in both Sacramento and Washington. The EU will consider cyber sanctions policy. NATO looks to cyber IOC. We'll learn about emotional intelligence from Compassionate Coding's April Wensel. And alleged SIM swappers have been arrested.

Dave Bittner: [00:00:43] And now a word from our sponsor, Wombat. When it comes to security, it doesn't matter how deep your moat is or how high your castle walls are if an unaware employee lowers the drawbridge for cybercriminals. Ninety percent of attacks now start with phishing and social engineering to gain access to systems and data, so educating your employees to spot and defend against these threats is more crucial than ever. Wombat Security, a division of Proofpoint, is the leading provider of information security awareness and training software, designed to educate your employees to identify social engineering and protect your organization. Through phishing simulation and knowledge assessments, Wombat paints a picture of where your employees are vulnerable and changes their risky behaviors through highly effective interactive training. Born from research conducted at Carnegie Mellon University, Wombat's suite of training covers topics from phishing and social engineering to physical and office security, and even compliance topics like GDPR. And now, through an integration with Proofpoint's world-class threat intelligence, Wombat is leading the way with phishing simulations and content based on the latest emerging threats. Don't let cybercriminals into your castle. Transform your employees from risky to ready with Wombat Security. To learn more, visit wombatsecurity.com/cyberwire. That's wombatsecurity.com/cyberwire. And we thank Wombat for sponsoring our show. Major funding for the CyberWire podcast is provided by Cylance.

Dave Bittner: [00:02:20] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Friday, October 12, 2018.

Dave Bittner: [00:02:27] Ukraine's SBU security service warns that various government agencies in Kiev are under cyberattack again - no attribution so far.

Dave Bittner: [00:02:37] ESET reports that TeleBots and BlackEnergy and, therefore, Industroyer and NotPetya, are linked to the same threat actor. They found that the Exaramel back door deployed in April used the same infrastructure that TeleBots used to deploy NotPetya. And they concluded that Exaramel itself is an evolved version of the Industroyer malware BlackEnergy deployed against sections of the Ukrainian power grid.

Dave Bittner: [00:03:03] ESET doesn't explicitly attribute the operation to anyone, but, as ZDNet points out, they don't have to. Western governments have, by consensus, already attributed the operations to the Russian state intelligence services. ESET's results simply provide more confirmation.

Dave Bittner: [00:03:21] Observers look at cyberattacks against the ports of Barcelona and San Diego and conclude that mixing IT and OT yields unacceptably high risk. The Barcelona and San Diego incidents appear to have been largely confined to business systems, but port operations were affected, too, if only through a commendable abundance of caution. Attacks on industrial infrastructure have often begun by compromising business networks and moving from there to operational technology. That's what seems to have happened in the attacks on the Ukrainian power grid. Sometimes it works the other way around, as it did in the Target breach, when a compromised HVAC contractor enabled hackers to pivot to the point-of-sale systems.

Dave Bittner: [00:04:05] Cisco's Talos research group has found a new Android Trojan, GPlayed. It masquerades as the Play Store, using the name Google Play Marketplace to further the imposture. GPlayed is both spyware and banking Trojan. Talos notes that the growing preference on the part of many developers to bypass established app stores in favor of other distribution channels - and they're looking at you, Fortnite - will tend to give bogus sites, like GPlayed, more plausibility and currency than they might otherwise enjoy. In any case, be sure you know what you're downloading.

Dave Bittner: [00:04:42] Skepticism over Bloomberg's Chinese supply chain attack story continues to rise. Some sources have walked back their statements to Bloomberg. Other observers point to an implausibility. If Chinese intelligence services really had seeded the supply chain as effectively as the story suggests, why would they engage in all the noisy hacking they've continued to conduct?

Dave Bittner: [00:05:06] Facebook has purged more inauthentic sites. In this case, the 559 pages and 251 accounts the social network took down were, for the most part, American. The problem, in Facebook's view, is their coordinated inauthenticity. The company admits that the inauthentic content is often indistinguishable from legitimate political debate and is trying to develop that distinction on the basis of behavior as opposed to content. The inauthenticity specified is moneymaking, clickbaiting people into ad farms. There is some irony in the notion that a social network would find making money from advertising suspicious, but this is more cognitive dissonance than contradiction.

Dave Bittner: [00:05:51] The Google Plus API issues revealed earlier this week - when Google announced that it would be winding down the service as a commercial failure and that app developers had, in fact, had access to Google Plus user data - continue to prompt growing interest in developing national regulations for data privacy in the U.S., especially coming, as they do, so soon after Facebook's recent privacy issues.

Dave Bittner: [00:06:14] Three senators asked Google yesterday why it decided not to disclose the privacy issues back when it discovered them. This seems to foreshadow deliberations over more extensive privacy laws. California has recently passed a sweeping data privacy law, and industry would probably be more comfortable dealing with a single set of federal regulations than it would with 50 state regimes.

Dave Bittner: [00:06:38] New York's financial sector's security and disclosure regulations have had a general effect on the sector and seem to have been relatively well assimilated. But California's law is likely to have much more sweeping and problematic consequences. A study published this week by PwC found that, of companies surveyed, only half thought they'd be able to comply with the California Consumer Privacy Act of 2018 by the time its deadline kicks in during 2020.

Dave Bittner: [00:07:06] The U.K. and the Netherlands intend to push the EU to develop more effective sanctions against cyberattacks. Both countries have taken a hard line against GRU operations against targets on their territory. In the case of the U.K., there's continued and determined outrage over the lethal Novichok nerve agent attack, as well as over what British authorities perceive as a growing threat to critical infrastructure.

Dave Bittner: [00:07:29] The Netherlands expelled GRU officers over what it characterized as an attempt to hack into the Netherlands-based Organisation for the Prohibition of Chemical Weapons, the international body to which the U.K. referred its Novichok complaint. The sanctions the two countries wish to see prepared are seen as being directed principally against Russia and China. Reuters says the Five Eyes and a few friends, notably Germany and Japan, have agreed to closer cooperation against Russian and Chinese cyber operations. And NATO expects to reach full cyber-operational capability by 2023.

Dave Bittner: [00:08:07] And finally, some alleged SIM swappers have been arrested. The Regional Enforcement Allied Computer Team - called REACT, a task force composed of various California police departments - responded to a complaint from a company that had been the victim of SIM swapping and alerted the feds to the suspects and their whereabouts.

Dave Bittner: [00:08:26] The Secret Service collared Joseph Harris and Fletcher Roberts Childers in Oklahoma City. They're both in their early 20s and are, of course, entitled to the presumption of innocence. Childers hasn't yet been charged. But Harris, who goes by the nom de hack Doc in criminal circles, has. Harris is suspected of having stolen some $14 million in a single cryptocurrency hack.

Dave Bittner: [00:09:00] And now a word from our sponsor - who's that sponsor, you say? Well, it's none other than the mysterious team behind the spectacularly successful F.A.K.E security booth at RSA 2018. You remember. It was the one with no vendor name, no badge scanning and the charismatic snake oil salesman pitching his imaginary cybersecurity cures for all that's ailing businesses around the world. So who was behind that booth? Why did they do it? Who's really sponsoring our show today? Get the answers you've been dying to hear and hear the story behind the booth at fakesecurity.com/cyberwire. That's fakesecurity.com/cyberwire. And we thank whomever it is for sponsoring our show.

Dave Bittner: [00:09:53] And joining me once again is Jonathan Katz. He's a professor of computer science at the University of Maryland and also director of the Maryland Cybersecurity Center. Jonathan, welcome back. We had an interesting story come by. This was from Fast Company. And the title of the article was "MIT's Tool For Tracking Police Surveillance: A Cryptographic Ledger." This sounds like something that is right up your alley. What's going on here?

Jonathan Katz: [00:10:15] This work is relevant to the broader discussion about providing law enforcement access to encrypted data. And this specific proposal isn't so much looking at how exactly that access would be provided but about providing accountability - public accountability for that access. So basically what the researchers propose is that you have some kind of system set up between law enforcement and the judicial system that would place certain values on a blockchain whenever law enforcement requested access to encrypted data. And the idea is that the public could look at what kind of requests are being made, how often these requests are being made. And even down the line, after the investigation might be over, they could even potentially look at the data that was requested and get a sense of how often this kind of thing is going on.
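[Editor's note: The tamper-evidence property Katz describes can be illustrated with a toy append-only, hash-chained log. This is not the MIT system itself - the class name, fields and methods below are purely illustrative - but it shows the core idea: each recorded access request commits to the hash of the previous entry, so altering any past request invalidates every later hash.]

```python
import hashlib
import json

class AccessLedger:
    """Toy append-only, hash-chained log of data-access requests.

    Each entry includes the hash of the previous entry, so the
    public can verify that no past request was altered or removed.
    """

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []

    def record_request(self, agency, case_id, scope):
        """Append one access request and return its hash."""
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        body = {
            "agency": agency,    # who is asking
            "case_id": case_id,  # court or case identifier
            "scope": scope,      # what data was requested
            "prev": prev_hash,   # link to the previous entry
        }
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(dict(body, hash=digest))
        return digest

    def verify(self):
        """Recompute the chain; return False if any entry was tampered with."""
        prev = self.GENESIS
        for e in self.entries:
            body = {k: e[k] for k in ("agency", "case_id", "scope", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

As Katz notes later in the conversation, a structure like this only provides accountability if law enforcement actually writes its requests into it; the cryptography proves the log wasn't altered, not that it's complete.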

Dave Bittner: [00:11:05] So really leveraging that transparency that is inherent in the blockchain - I suppose, in this case, people hope for the greater good for law enforcement.

Jonathan Katz: [00:11:14] Yeah, that's right. So I think a lot of people are concerned about providing unfettered access to law enforcement to access encrypted data. And part of their concern I think is not that they mind law enforcement going after real criminals, but they mind the idea of law enforcement being able to target whoever they like for no particular reason. And so providing an accountability like this might actually make people more comfortable with the idea of giving law enforcement access.

Dave Bittner: [00:11:39] And what is your take on this? Does the underlying science seem to make sense? I mean, from a cryptographic point of view, is this a workable solution?

Jonathan Katz: [00:11:47] I think definitely yes. I think, again, if you're comfortable with the idea of providing access at all, then the idea of providing accountability in this way is actually a really interesting one. And I'm all for the idea of providing greater accountability in government in general. So that does seem like a reasonable approach.

Dave Bittner: [00:12:04] And what about from a privacy point of view? What's the flip side here? Is there - are there things that people could have concerns about of making this sort of information available?

Jonathan Katz: [00:12:14] Well, I think people are always concerned about whether or not law enforcement and the judicial system would actually use the technology. So for example, you could imagine that if law enforcement has the ability to go after encrypted data, then they may not contact a judge and request permission. Or they may contact the judge, and the judge may decide that in this particular case they don't have to report it - making that decision on their own, kind of an extralegal decision. And so people who are concerned about government infringement on their privacy might just as well be worried that the government won't use the system as it's been proposed.

Dave Bittner: [00:12:49] Right. A blockchain doesn't do you much good if the folks actually aren't using it.

Jonathan Katz: [00:12:53] Yeah, that's right. And it's not so easy to prove that somebody failed to use the system properly.

Dave Bittner: [00:12:57] Right, right. All right, well it's interesting - certainly worth keeping an eye on. As always, Jonathan Katz, thanks for joining us.

Jonathan Katz: [00:13:04] Thank you.

Dave Bittner: [00:13:09] Now, a few words about our sponsor Invictus - we've all heard that cyberspace is the new battle space. Invictus International Consulting was founded by people who know a battle space when they see it. This leading cybersecurity company, headquartered in Northern Virginia, boasts an expert staff with decades of cybersecurity, technology solutioning and intelligence analysis experience. Its customers in the intelligence, defense and homeland security communities highly value these Invictus cyber warriors and their professional ethos. Invictus is a service-disabled, veteran-owned small business - that's SDVOSB - with over 60 percent of the Invictus workforce comprised of veterans.

Dave Bittner: [00:13:52] The company excels in achieving mission success not only within the government space but has been a game-changer with its commercial clientele as well. An award-winning company - recently named to 2018's Cybersecurity 500 list as one of the world's hottest and most innovative cybersecurity companies - Invictus recently won the most valuable industry partner award at (ISC)2's 15th annual information security leadership awards, as well as several others. Check them out at invictusic.com to learn more and to see if you have what it takes to become a cyber warrior. That's invictusic.com. And we thank them for sponsoring the CyberWire.

Dave Bittner: [00:14:40] My guest today is April Wensel. She's the founder and CEO of Compassionate Coding, an organization that aims to combine the effective practices of agile software development with a focus on empathy and the latest in positive organizational psychology. She's a veteran software engineer and technical leader with more than a decade in the software industry.

April Wensel: [00:15:01] So I was a software engineer, and I led engineering teams in various startups in Silicon Valley for about ten years. And I noticed a lot of problems with the industry - things like lack of diversity, lack of other women around me in my work. I also saw teams failing due to just unproductive conflict happening in code reviews or just on the team in general. And I thought that some of the products we were building were having a negative impact on the world.

April Wensel: [00:15:29] You know, we see this with, like, Facebook using data in potentially unethical ways. And my realization was that all of these are really symptoms of an underlying problem, which is that in tech we really haven't been caring enough about human beings. And so that's what I set out to solve with my company Compassionate Coding.

Dave Bittner: [00:15:48] Now, what you talk about is bringing emotional intelligence and ethics to the tech industry. I think most of us are familiar with the notion of ethics. But can you describe to us - what do you mean by emotional intelligence?

April Wensel: [00:16:00] Emotional intelligence is a term that was popularized by Daniel Goleman. And the idea is that we talk about, you know, IQ, our intelligence quotient. And there's this idea that there's another aspect of the way our mind works and that's the emotional side and that there's something, like, that we could call the emotional quotient. It's our ability to interact in the world while understanding and managing our own emotions and understanding and interacting with the emotions of other people.

April Wensel: [00:16:33] And so the field of emotional intelligence includes a lot of different types of skills - things like having confidence, having motivation, having persistence and resilience. Those are kind of personal aspects of emotional intelligence. And on the other hand, communication skills - so having empathy, being able to persuade people, things like that in the social arena.

Dave Bittner: [00:16:53] So where does the tech industry fall short with this? And do you have any notion for why it is that way?

April Wensel: [00:17:00] Yeah, so that's a funny question because the tech industry is nearly devoid of emotional intelligence almost across the board. And I think there are reasons for this - though I think, you know, relevant is that Linus Torvalds, the creator of Linux, recently came out and said that he doesn't really have a lot of empathy or understanding of emotions of other people and that his behavior has hurt others. And so it was a big deal that he came out because he's, like, sort of a figure that's been representative of the caustic nature of the tech industry - people in the tech industry. And the fact that he came out and admitted that this had horrible effects on people was a big deal. And this just happened recently.

April Wensel: [00:17:45] And so I think what happened was early - in the early days of the tech industry, some of the first people that got involved were these sort of very - they had very low emotional intelligence skills. And they became representative of what made for a good software engineer. And so they started hiring people who were just like them. And so it kind of created this idea that to be a good software engineer, you have to be like these people. You have to not care about human beings. You have to care more - you have to interact with people in the same way that you interact with a machine, in this very direct rational way where there's no room for any emotion.

April Wensel: [00:18:22] We have this sort of monoculture in tech where everybody kind of falls into this category. Now, there are some exceptions, but they are just exceptions. And it's because we've been excluding all these people. I think that is sort of a systemic thing because when these people came to power, it's like now we have this pattern matching that happens in tech interviews - where it's like, huh, she doesn't seem technical because she doesn't remind me of, you know, all these male software engineers I've worked with who, you know, were poor at communication and, you know, communicated in a certain direct way or whatever. And so we've been filtering out a lot of people. And so it's just - you know, the problem just gets worse and worse. And so that's what I'm trying to help remedy.

Dave Bittner: [00:18:59] What about this notion of the rock star? I don't think coddling's the right word but maybe accommodation - where, you know, you can have someone who is an amazing coder. And because of that, they don't have to worry about how they dress when they come to the office or even, you know, basic grooming skills. Because of their skills, we're going to let them be antisocial and unsanitary in the workplace.

April Wensel: [00:19:24] Yeah. So I really, really don't like this idea of the rock star developer and how it's come to be. And I think that it's harmful because no matter what kind of code this person is producing, they are affecting the people around them - meaning that, you know, if they're being abrasive in code reviews and insulting and abusive to people around them - they're like sort of - their behavior is toxic - then it's hurting the productivity of everyone on the team.

April Wensel: [00:19:52] And, you know, it's not just me kind of just claiming that this is the case. I mean, even Google, who for many years has been the sort of standard of only hiring for, quote, "technical ability" and, like, treating people like robots in the interview process, even they came out recently with a study - they did this Project Aristotle where they found that what makes for an effective team at Google - none of the top five were anything about technical ability or performance that would fall under the rock star category. The top thing was psychological safety on the team. And everything was all sort of people stuff. It was other stuff like structure and clarity and things like that. There was nothing, quote, "technical" in what makes for an effective team at Google.

April Wensel: [00:20:32] And so I think that that's really important there to note, which is that I think we've just been assuming that, oh, this person is such a great developer. But if that developer doesn't have good empathy too for the users, then they're probably not actually producing the best product. Maybe they're producing the most, quote, efficient code, but that doesn't necessarily mean the best product.

Dave Bittner: [00:20:54] There's this impulse that I see, particularly in the tech world - and I think it's amplified on Twitter in particular - and it is sort of the dog pile, where someone says something that someone thinks is stupid or technically incorrect or imprecise. And here comes the snark, you know. And here comes - and everyone piles on. And I just - when I see that, I think that is - it's not a helpful impulse. But it also - I don't think it's healthy.

April Wensel: [00:21:23] Yeah, that's a really good point. It's like, you know, sometimes - especially early in my career, I was afraid to post anything, like any code online, because I saw it happen so many times where if you make one little mistake, people just rip you apart. And one will say, oh, well, you're just incompetent. And, yeah, that's really toxic. And I think that - you know, there's talk of imposter syndrome that people experience. And I think that that develops from this really hostile competitive and, like, judgmental - aggressive even - culture that we've created.

Dave Bittner: [00:21:52] So what is your advice to organizations? You know, if I'm trying to build a team, I'm - I've got a startup I'm working on. I'm an entrepreneur - or even just improve the team that I run in a larger organization, what are some of the things that I can do to enhance everyone's emotional intelligence, to make sure this is something that we're paying proper attention to?

April Wensel: [00:22:12] Yeah, I think one thing is just recognizing the importance of that in a very clear way in the company. And what that means - that might mean including it in the hiring process because a lot of times, you know, we'll put people through these rigorous, like, coding tests, which I don't think is a very good way to interview people in the first place because it's not very representative of the actual work that they're going to be doing, which is much more - usually much more collaborative and everything like that.

April Wensel: [00:22:37] So I would say, you know, de-emphasize all of that and emphasize the person's ability to communicate well in the interview. And, again, that doesn't mean that they're not awkward or something like that. It just means that they seem interested in what you're saying and that they're able to convey ideas and that they're able to understand what other people might be thinking.

April Wensel: [00:22:56] And you can get at that by asking about past places they've worked or past projects they've worked on. And so I would say you have to update your hiring processes to factor in empathy and emotional intelligence and also your promoting practices - you know, who gets promotions, who gets rewarded, who gets the bonuses - because there's a lot of work that goes on in software teams that isn't credited.

April Wensel: [00:23:19] Like, if you're the person who talks to designers and talks to other people, and you do that well, that's part of your job. It's not just about how many lines of code you produce or something like that - or how many tickets you close. It's really - there's a lot of stuff that happened that isn't credited well. And so I think that if you're going to value this on the team, that's an important part - and, again, like doing some sort of training, whether through videos or something like that or bringing somebody in, but providing resources to help people grow the skills because that's all they are - are skills that can be grown.

Dave Bittner: [00:23:48] That's April Wensel from Compassionate Coding. You can learn more at the Compassionate Coding website. That's compassionatecoding.com. And that's the CyberWire.

Dave Bittner: [00:24:01] Thanks to all of our sponsors for making the CyberWire possible, especially to our sustaining sponsor Cylance. To find out how Cylance can help protect you using artificial intelligence, visit cylance.com. And Cylance is not just a sponsor. We actually use their products to help protect our systems here at the CyberWire. And thanks to our supporting sponsor VMware, creators of Workspace ONE Intelligence. Learn more at vmware.com.

Dave Bittner: [00:24:27] The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cyber security teams and technology. Our CyberWire editor is John Petrik. Social media editor's Jennifer Eiben. Technical editor's Chris Russell. Executive editor's Peter Kilpe. And I'm Dave Bittner. Thanks for listening.