The CyberWire Daily Podcast 7.24.19
Ep 891 | 7.24.19

Facebook settles with the FTC. The US opens an antitrust probe of Big Tech. A new phase in the crypto wars. NSA’s new directorate. Doxing the FSB. BlueKeep. Financial consequences of a breach.

Transcript

Tamika Smith: [00:00:03] Facebook's settlement with the U.S. Federal Trade Commission is out. The U.S. Justice Department opens an antitrust inquiry into Big Tech. Another salvo in the Crypto Wars is fired in New York. The NSA gets a new directorate. More on the doxing of Russia's FSB. And, please, patch for BlueKeep.

Tamika Smith: [00:00:30] From the CyberWire studios at DataTribe, I'm Tamika Smith in for Dave Bittner with your CyberWire summary for Wednesday, July 24, 2019. 

Tamika Smith: [00:00:40] Lancaster University in the U.K. suffered a large breach of student data following a phishing attack, the BBC reports. The breach affected more than 12,000 students, and the data were used to send fraudulent invoices to undergraduate applicants. Undergraduate fees for the school reach tens of thousands of pounds, and sources told The Register that about six students paid the phony invoices. The National Crime Agency arrested a 25-year-old man in connection with the breach; he has since been released under investigation. 

Tamika Smith: [00:01:13] Ars Technica reported that a security researcher posted a slide deck on GitHub documenting how to perform heap spraying against a vulnerable RDP service. Heap spraying was one of the larger obstacles on the path to achieving remote code execution with BlueKeep, so the knowledge will likely widen the pool of people who have working exploits for the bug. 

Tamika Smith: [00:01:34] Kazakhstan seems to be working out the kinks in its HTTPS traffic interception efforts, according to ZDNet. The government last week began requiring local ISPs to force their customers to install root certificates, which allow government agencies to launch man-in-the-middle attacks against encrypted traffic within the country. Researchers at Censored Planet say that so far only one Kazakh ISP is actually intercepting HTTPS traffic, and the interception turns on and off seemingly at random. Furthermore, only 37 domains are being targeted, including Facebook, Google, Twitter, Instagram and YouTube. The researchers say the activity suggests the system is still in a testing phase and may be rolled out gradually. 
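For listeners curious what this sort of interception looks like from the client side, here is a minimal Python sketch - not Censored Planet's tooling - that fetches the certificate a site presents and checks whether its issuer resembles an injected interception root rather than a normal public certificate authority. It assumes the third-party cryptography package is installed, and the issuer fragment it tests for is an illustrative assumption (reports described the Kazakh root as the "Qaznet Trust Network").

```python
# Minimal sketch: spot possible HTTPS interception by inspecting the issuer
# of the certificate a server presents. Illustrative only; the issuer
# fragment below is an assumption, not a verified fingerprint.
import ssl

from cryptography import x509  # third-party package, assumed installed

SUSPECT_ISSUER_FRAGMENT = "Qaznet"  # assumed marker for the interception root

def issuer_of(hostname: str, port: int = 443) -> str:
    """Return the issuer distinguished name of the site's leaf certificate."""
    pem = ssl.get_server_certificate((hostname, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    return cert.issuer.rfc4514_string()

for site in ["facebook.com", "google.com", "twitter.com"]:
    issuer = issuer_of(site)
    verdict = "POSSIBLE INTERCEPTION" if SUSPECT_ISSUER_FRAGMENT in issuer else "ok"
    print(f"{site}: issued by {issuer} [{verdict}]")
```

Run from inside an affected network, a site on the targeted list would show the injected root's name as issuer instead of a public CA's. 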

Tamika Smith: [00:02:22] The U.K. has postponed its decision on whether Huawei's kit should be excluded from the country's 5G network. The U.K.'s culture secretary, Jeremy Wright, said that the government needs to wait until the U.S. clarifies its own policy regarding the Chinese company. The BBC notes that all four of Britain's mobile networks have already begun building their 5G networks with Huawei's equipment, so a government ban would require them to start over. The decision will fall to Boris Johnson, who takes over for Theresa May as Britain's prime minister today. 

(SOUNDBITE OF YOUTUBE VIDEO, "BABY SHARK DANCE") 

Unidentified Child: [00:03:03] (Singing) Baby shark, doo, doo, doo, doo-doo, doo. Baby shark, doo, doo, doo, doo-doo, doo. 

Tamika Smith: [00:03:03] This is an example of content that children love to watch on YouTube. And these days, the streaming company is replacing television for many children. In the U.S., a survey shows about 90% of children have an online presence before they're 2. During this screen time, while they're watching cartoons like this, their data is being collected. The Washington Post reports that the FTC has reached a settlement with YouTube after investigating complaints that the streaming company collected kids' data improperly, a violation of the Children's Online Privacy Protection Act. The exact fine is still unknown but is expected to be in the millions. 

Tamika Smith: [00:03:41] Here to shed more light on this is Emily Wilson. She's the vice president of research at Terbium Labs, where she specializes in criminal marketplaces for data, the dark web, data privacy and fraud. Thanks for joining the program, Emily. 

Emily Wilson: [00:03:55] Thanks for having me. 

Tamika Smith: [00:03:56] So let's start with the opinion piece in WIRED magazine that talks about this very issue of data privacy. It's entitled "How To Protect Our Kids' Data Privacy." Now, Emily, kids prefer to watch their cartoons, and educational videos and other content on places like YouTube. And also, parents love the convenience of it. But at what cost are we using these services? 

Emily Wilson: [00:04:18] You know, you mentioned convenience there, and I think that's the big one because it used to be that if you wanted to consume media, or follow certain shows or see certain content, you had to be watching TV at the right time. You know, we have our Saturday morning cartoons, or you had shows that would come on at 6 o'clock on a Wednesday, and that was when you watched it. 

Emily Wilson: [00:04:38] But now when we have things like YouTube and other streaming platforms, this content is available not only all the time, but on any device, which gives these platforms, and the advertisers behind them and whoever else is sort of tracking this data so many opportunities to collect information. 

Tamika Smith: [00:04:56] So some tech giants, including Facebook's CEO, say their solution to this is data ownership - essentially, allowing users to control their own data and decide when companies and the government can use it. But is this answer realistic when you're talking about users who are not even aware of what giving consent is? 

Emily Wilson: [00:05:13] Absolutely not. It's frankly ridiculous, the idea that you could put children in a position to own their data. I honestly think that most adult consumers aren't even properly informed about what it would mean to own their data or to give consent about data usage. 

Emily Wilson: [00:05:31] When we think about children, we protect children from things that they are not capable of making informed decisions on - right? - health care decisions, drugs, alcohol, sex, voting, driving. There are these things that we understand as a society that we have a duty of care to protect children from. Why have we decided that data and privacy and information should be any different than that? Why do we think that children would have the ability to provide informed consent on those topics if we understand that, you know, we shouldn't put them behind the wheel of a car yet? I think it's absolutely absurd. 

Emily Wilson: [00:06:06] This information also shows up in criminal marketplaces. I know over the last few years - certainly in the work that I do on the dark web - I've seen an increase in the amount of child data being sold explicitly as child data - infant records, medical records from pediatricians' offices, children's Social Security numbers being marketed for use in child tax credit fraud - to say nothing of all of the records that belong to children that have shown up in breaches where we don't know they're children - things like health care breaches or social media breaches. Cybercriminals aren't going through these breaches and saying, oh, well, I'm going to leak these records, or, I'm going to sell these records, but only for the adults. No. The children are getting caught in the mix as well. 

Tamika Smith: [00:06:49] As we start shaping this conversation, what do you think that lawmakers and tech giants need to be considering? 

Emily Wilson: [00:06:58] That's a loaded question, first, because when I think about what the tech giants should be considering, I have such low expectations of them putting safety and security ahead of profits that I expect that most of this action will need to come from lawmakers. And to that point, I think that we should be considering things like data collection, data privacy, advertisements - we should be considering these issues, like data collection, like surveillance, privacy and security, in the same way that we consider the other things that we know children aren't in a position to consent to, right? 

Emily Wilson: [00:07:36] And even, I think, saying that parents need to consent up to the age of 13 - what were you doing at 13? How did you view the world at 13? On your 14th birthday, did you suddenly have a great deal more maturity about, you know, sharing information, about the way that you interacted with the world? Of course not. Your brain is still developing. Saying that a parent's consent up to age 13 suddenly makes everything OK is just unreasonable. 

Emily Wilson: [00:08:01] So I think that we need to be having a broader conversation about what it means to be collecting data for all of us and then, also, what do we allow sites to collect about children? What do we allow sites to use for children, right? Even if there are basic requirements for children to create profiles, are sites then allowed to use that activity against them? 

Tamika Smith: [00:08:22] Well, thank you very much, Emily, for joining the program today. 

Emily Wilson: [00:08:25] Thank you for having me. 

Tamika Smith: [00:08:26] That's Emily Wilson. She's the vice president of research at Terbium Labs, where she specializes in criminal marketplaces for data, the dark web, data privacy and fraud. 

Tamika Smith: [00:08:37] The FTC this morning formally announced the details of its settlement with Facebook. In addition to paying a $5 billion fine, the company will be required to set up a board-level privacy committee. Facebook will also need to conduct privacy reviews of all new services or products the company rolls out, according to The Verge. These reviews will be submitted to CEO Mark Zuckerberg and a third-party assessor on a quarterly basis. On Wednesday, the Securities and Exchange Commission said that Facebook agreed to pay a $100 million fine for allegedly misleading investors about the improper use of user data, according to Reuters. 

Tamika Smith: [00:09:16] U.S. Attorney General William Barr gave the keynote address at Fordham University's International Conference on Cyber Security Monday. In it, he argued in favor of providing law enforcement with ways to access encrypted data. Barr said the Fourth Amendment strikes a balance between what he calls the individual's right to privacy and the public's right of access. He added, making our virtual world more secure should not come at the expense of making us more vulnerable in the real world. 

Tamika Smith: [00:09:45] Barr cited a specific case in which a Mexican cartel used WhatsApp to coordinate the murders of hundreds of Mexican police officers. He didn't endorse any solution in particular, but listed a number of possible options. He said the Justice Department is confident that there are methods of providing access to law enforcement without materially weakening the security provided by encryption. He said it's high time that technology companies start brainstorming ways to develop and implement these solutions. Barr concluded by saying that the Justice Department is open to a cooperative approach with the private sector. He implied that legislation could ensure it happens either way. 

Tamika Smith: [00:10:26] NSA Director General Paul Nakasone announced a new cybersecurity directorate to improve foreign intelligence sharing with other agencies and the private sector. An NSA spokesperson told CyberScoop that the directorate will update and maintain a section of NSA's website to share research and warn of new vulnerabilities. The directorate is set to begin operations on October 1 and will be headed by Anne Neuberger. According to The Wall Street Journal, Neuberger recently led NSA and Cyber Command's election security task force, and she holds a position on NSA's board of directors. 

Tamika Smith: [00:11:01] And finally, consider election security. Perfect election security is probably impossible, and a serious approach to it would be prohibitively expensive. So CSO is asking, how much security is enough? Their answer is - enough to convince the loser they lost. That's commendably sensible, but some losers will always be convinced they were jobbed. 

Dave Bittner: [00:11:31] And joining me once again is David Dufour. He's the vice president of engineering and cybersecurity at Webroot. David, always great to have you back. We wanted to touch today on some aspects of security awareness training, and you had some stuff you wanted to share with us about that. 

David Dufour: [00:11:46] Yes. Great to be back as always, David. We're starting to spend a lot of time looking at how people learn best in any industry, but of course, we're talking specifically about cybersecurity. And a lot of times, what happens - organizations, you know, once a year, they roll out their learning management system. They put in four hours of training for people to make sure their PCI compliance is up to date, and they learn about phishing. And I think most educators would really get that that's not the best way to do it, and so a lot of folks are starting to look at what we're calling microlearning. 

Dave Bittner: [00:12:17] And so what is that? 

David Dufour: [00:12:18] As it relates specifically to cybersecurity, we're spending a lot of time at Webroot, working with other industry folks, to figure out how to deliver learning at the point of the incident. So for example - and we've not perfected this; we're working out ways of doing it, so I just want to say this is on our roadmap - let's say you receive a phishing email, you open it, and it is in fact a real phishing email. 

David Dufour: [00:12:45] We might want to be able to detect that you've opened that and, in that moment, launch a two- or three-minute training program on phishing: what phishing is and what you should be looking for to determine whether something is a phishing attack. And you want to do it in that moment. So instead of removing someone from their environment, teaching them, and then putting them back, you try to deliver that training quickly in their day-to-day world so it's more top of mind for them. 
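As a rough illustration of the point-of-incident idea - emphatically not Webroot's implementation, which Dufour says is still on their roadmap - here is a hypothetical Python sketch in which a detected event, such as opening a phishing email, immediately triggers a short training module. The event kinds, module names, and deliver_training helper are all assumptions.

```python
# Hypothetical sketch of microlearning delivered at the point of the incident.
from dataclasses import dataclass

@dataclass
class SecurityEvent:
    user: str
    kind: str    # e.g. "phishing_email_opened"
    detail: str

# Map incident types to short, in-the-moment lessons (names are illustrative).
MICRO_MODULES = {
    "phishing_email_opened": ("spotting-phishing", 3),
    "weak_password_set": ("password-hygiene", 2),
}

def deliver_training(user: str, module: str, minutes: int) -> None:
    # A real system might pop a short lesson in the user's mail client or
    # browser; this stub just records the intent.
    print(f"[microlearning] {user}: launching {minutes}-minute module '{module}'")

def on_event(event: SecurityEvent) -> None:
    """Trigger the matching lesson immediately, while the incident is fresh."""
    if event.kind in MICRO_MODULES:
        module, minutes = MICRO_MODULES[event.kind]
        deliver_training(event.user, module, minutes)

on_event(SecurityEvent("alice@example.com", "phishing_email_opened",
                       "opened simulated phish 'Invoice #4411'"))
```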

Dave Bittner: [00:13:13] It strikes me also that it's - rather than being sort of a slap on the wrist, that you're given this opportunity to show them what the right thing to do is. 

David Dufour: [00:13:21] That's exactly right. You're doing that. You know, I think we're all trained to not do the negative feedback. This is actually giving them that positive feedback in the moment, saying, hey, not trying to beat you down. We're just trying to give you a heads-up here. 

Dave Bittner: [00:13:33] Is it easier to get buy-in for this as well? Because, you know, I don't have to schedule everybody for a day or two of training - it's sort of just a natural part of everyone's day-to-day. 

David Dufour: [00:13:45] You would think so. I'm not trying to harp on anybody or talk about how some processes work, but a lot of the time, in large organizations, it's just easier to send out the annual - go take this training. We want to tick some boxes so we're compliant. We're going to make sure those boxes are ticked, and we're going to put this on the shelf for a year. 

David Dufour: [00:14:06] I don't want to say there's a ton of buy-in. People are excited to hear about it, but there is some overhead on the management and implementation side. It is more effective, though, and it's not just compliance - ticking boxes. It's really about delivering that training so you're actually providing security. 

Dave Bittner: [00:14:22] I can also imagine for the user that you have to get some buy-in there because if I'm in the midst of just trying to get my work done, making my way through, you know, my morning mountain of emails, I might not be in the mood to stop and take a couple of minutes for this training to interrupt that process that I'm already in. 

David Dufour: [00:14:39] That is another great point. You know, you have to understand who the user is, because if it's a call center person and they're logged into the phone dialer and receiving calls, you know, they're not going to be able to stop and take that training in the moment. So you've got to figure out when you can deliver it so that it's most applicable. And if you're a salesperson, you might want to queue that up. So you do have to look at the folks you're trying to provide that training to and the best time to deliver it without interrupting their workload. 
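Extending the hypothetical sketch above, the timing concern Dufour raises could be modeled as a simple scheduling rule: deliver training immediately to users who can be interrupted, and queue it for a quieter moment for those, like call center agents, who can't. The role names here are assumptions.

```python
# Hypothetical follow-on: decide when to deliver a micro-lesson by role.
from typing import Callable

INTERRUPTIBLE_ROLES = {"engineering", "sales"}  # assumed role taxonomy

def schedule_training(role: str,
                      deliver_now: Callable[[], None],
                      enqueue_for_later: Callable[[], None]) -> None:
    """Deliver in the moment when we can; otherwise defer to a quiet period."""
    if role in INTERRUPTIBLE_ROLES:
        deliver_now()
    else:
        enqueue_for_later()  # e.g. the end of a call center agent's shift

schedule_training("call_center",
                  deliver_now=lambda: print("launch module now"),
                  enqueue_for_later=lambda: print("queued for end of shift"))
```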

Dave Bittner: [00:15:04] I don't think there's any question that different people learn in different ways, so it seems to me that if you can meet people where they are in terms of their learning style, that's good for everybody. 

David Dufour: [00:15:14] It is good for everybody, and it's just another way of thinking about it. And I think, again, educators listening, from a cybersecurity perspective, we focus so much on the compliance side. We need to really start looking at our users because as you and I have talked in previous podcasts, we're finding that users - once they absorb it, they're really trying to learn and do the right thing. And I think a lot of times, we sit up in our ivory towers, telling them, well, you don't know what you're doing. Let us tell you. They really do want to absorb and learn this and do the right thing. We've just got to meet them where they can learn. 

Dave Bittner: [00:15:44] All right. Well, David Dufour, thanks for joining us. 

David Dufour: [00:15:47] Thanks for having me, David. Always a pleasure. 

Tamika Smith: [00:15:53] And that's the CyberWire. The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our amazing CyberWire team is Stefan Vaziri, Kelsea Bond, Tim Nodar, Joe Carrigan, Carole Theriault, Nick Veliky, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Peter Kilpe. And I'm Tamika Smith. Dave Bittner will be back tomorrow. Thanks for listening.