The CyberWire Daily Podcast 7.6.18
Ep 635 | 7.6.18

When catphishing, it pays to know what bait they'll take. Permission hogs are often misers. Cyber comes to the NTC. Natural intelligence screening for artificial intelligence. The Thermanator.

Transcript

Dave Bittner: [00:00:03] Catphishing in Berlin and Tel Aviv - whether you're offering payment for a white paper or up-to-date football scores, it pays to know the right bait. Android apps may be permission hogs, but it's surprising how often the hogs hoard like misers, never really using them. The U.S. Army pushes cyber into the brigades, how Facebook checks facts, and the Thermanator knows which keys you've typed from the heat your hot hand leaves behind.

Dave Bittner: [00:00:40] And now a word from our sponsor. Who's that sponsor? - you say. Well, it's none other than the mysterious team behind the spectacularly successful fake security booth at RSA 2018. You remember. It was the one with no vendor name, no badge-scanning and the charismatic snake oil salesman pitching his imaginary cybersecurity cures for all that's ailing businesses around the world. So who was behind that booth? Why did they do it? Who's really sponsoring our show today? Get the answers you've been dying to hear, and hear the story behind the booth at fakesecurity.com/cyberwire. That's fakesecurity.com/cyberwire. And we thank whomever it is for sponsoring our show. Major funding for the CyberWire podcast is provided by Cylance.

Dave Bittner: [00:01:38] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Friday, July 6, 2018. Catphishing remains in the news. Not only have Israeli soldiers been prospected by fictitious dating profiles, apparently prepared by Hamas, but members of Germany's Bundestag have received the attentions of Chinese intelligence services. In the case of the Bundestag members, the profiles, while bogus, seemed unusually open in their operation, making little or no attempt to conceal their Chinese nationality. The Chinese affiliation would have come to light sooner or later, and it was evidently better and more disarming to put it out there from the start.

Dave Bittner: [00:02:22] German lawmakers were offered payment for various kinds of inside information and, in some cases, for writing papers providing their analysis of certain issues. They were also invited to visit China, where presumably they would be further entangled. Some officials did visit China, where, unsurprisingly, their mobile devices were compromised as tends to happen on such junkets. The recruitment technique is fresh but ultimately classic - accustom the person being recruited to doing small and more or less innocent favors for you, then escalate to the point where the recruit has gone too far. At that point, you have him or her.

Dave Bittner: [00:03:01] Perhaps it starts at a party where you discover a common interest in stamp collecting or bird-watching. You trade stamps. You help that nice person get a good spot to watch, say, storks migrate. A bit later, they ask for a copy of your office phone directory. They've lost touch with some old colleague who works in an adjacent department, and they'd love to get back in contact to renew acquaintances. That phonebook's not classified. Right? No harm there. It's that sort of thing. And online, it happens over social media. A Chinese ministerial delegation is scheduled to arrive in Berlin for bilateral talks Monday. The spying incident is expected to figure on the agenda. (Speaking German) - give the Bundestag a refresher on social engineering.

Dave Bittner: [00:03:52] Returning to the Israeli incidents for a moment, the soldiers were not only approached for dates but, with probably greater success, were offered apps that kept them up to date on World Cup results. An Israeli officer involved in the investigation said, according to a report in The Arab Weekly, that at least one of the football apps was pretty good, a nice interface and slick coverage of the games. As Golden Cup's self-description had it, the app provided HD livestreaming of games, summaries and live updates.

Dave Bittner: [00:04:23] The Israeli Defense Forces attribute the campaign to Hamas, generally regarded as aligned with Iran to the extent that it's a virtual proxy for Tehran. It's worth noting that the data stealing went on beneath an app that performed pretty much as advertised. As Check Point said, quoted in The Register, quote, "This attack involved the malware bypassing Google Play's protections and serves as a good example of how attackers hide within legitimate apps which relate to major popular events and take advantage of them to attract potential victims," end quote.

Dave Bittner: [00:04:57] So what are all these apps up to anyway? Here's something that can be either a good news or a bad news story depending upon how you choose to spin it. Researchers at Northwestern University and the University of California, Santa Barbara investigated more than 17,000 Android apps from Google Play and three major third-party app stores. They concluded that, while apps tend to be permission hogs, the permissions they hog usually go unused. Only 21 of the apps inspected were extracting and reporting data in a questionable fashion.

Dave Bittner: [00:05:36] So the good news is that your Android phone probably isn't spying on you and reporting back to Shanghai, Pyongyang or Moscow - or, for that matter, to Laurel, Cheltenham, Ottawa, Canberra or Wellington. The bad news seems to be that if you're careless with your permissions, your phone could do all that if it really wanted to.
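To make the finding concrete, here is a minimal Python sketch, using the androguard library, of how one might list the permissions an Android app declares and flag the dangerous ones. The APK file name and the short permission list are illustrative assumptions, and determining whether a declared permission is ever actually exercised would take the kind of deeper static and dynamic analysis the researchers performed.

```python
# Illustrative sketch, not the researchers' tooling: enumerate an APK's
# declared permissions and flag a few "dangerous" ones. Requires androguard.
from androguard.misc import AnalyzeAPK

# Small, illustrative subset of Android's dangerous-permission group.
DANGEROUS = {
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_SMS",
    "android.permission.CAMERA",
}

def audit_apk(path):
    """Print how many permissions an app declares and which dangerous ones it hogs."""
    apk, _dex, _analysis = AnalyzeAPK(path)   # parse the manifest and bytecode
    declared = set(apk.get_permissions())     # permissions requested in the manifest
    print(f"{apk.get_package()}: {len(declared)} permissions declared")
    for perm in sorted(declared & DANGEROUS):
        print(f"  dangerous: {perm}")

if __name__ == "__main__":
    audit_apk("example.apk")  # hypothetical APK file name
```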

Dave Bittner: [00:05:50] The U.S. Army continues to integrate cyber operations into unit training at brigade level and below. It's established a cyber range for rotational units to use as they come through the National Training Center at Fort Irwin for brigade and task force training. Cyber operations have long been a national and not an organizational responsibility. There have been plans to change this for some time. And if cyber operations have come to Fort Irwin, our NTC desk assures us that's the clearest possible sign that this change is now a reality within the Army.

Dave Bittner: [00:06:25] Facebook, like other platforms, continues to struggle with content screening. An interview in WIRED offers some perhaps surprising perspective on how their process works. Most accounts of it have focused on the role played by artificial intelligence - with the dopey, biased or otherwise tendentious results that periodically surface being attributed to the algorithms. But Facebook's relationship with content is more complicated than that. Most descriptions have imagined the AI screening and then the humans intervening as necessary.

Dave Bittner: [00:06:58] That seems not to be correct. In its efforts against the propagation of fake news of the kind spread about by the troll farmers of Russia's Internet Research Agency, the AI just looks for trending stories, with human fact-checkers - and Facebook employs thousands of them - doing what their job title implies - checking facts. Then the humans turn the content they've found to be bogus - or, if you're in a suspicious mood, objectionable on whatever mysterious grounds the House of Zuckerberg may have established (fact-check that, Juggalos) - over, we repeat, to the AI, which then romps out to look for its reappearance. And no, your clown makeup won't help.
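For a sense of what that last step can look like, here is a toy Python sketch of near-duplicate matching: once a human fact-checker has labeled a story false, a matcher looks for close copies of it among other circulating posts. The shingling-and-Jaccard approach is a stand-in assumption; Facebook's actual matching signals are not public.

```python
# Toy sketch of propagating a human fact-check: flag candidate posts that are
# near-duplicates of a story already labeled false. Standard library only.

def shingles(text, k=3):
    """Return the set of k-word shingles in a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 when both are empty)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def find_copies(flagged_story, candidates, threshold=0.6):
    """Return the candidate posts that look like near-duplicates of the flagged story."""
    flagged = shingles(flagged_story)
    return [c for c in candidates if jaccard(flagged, shingles(c)) >= threshold]

# Usage: the second candidate is a lightly edited re-share of the flagged story.
story = "celebrity endorses miracle cure doctors hate this one trick"
posts = [
    "world cup results and live updates for every match",
    "this celebrity endorses miracle cure doctors hate this one trick say experts",
]
print(find_copies(story, posts))
```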

Dave Bittner: [00:07:42] Finally, another team of researchers, those at the University of California, Irvine, reports on the Thermanator proof-of-concept hack. Someone with a decent mid-range thermal camera who gets close enough to an unattended keyboard or keypad within 30 seconds of use can see what keys were pressed. Hunt-and-peck typists, as opposed to those who paid enough attention in school to use all 10 fingers, left particularly clear thermal signatures.
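As a rough illustration of the underlying idea, and not the Irvine team's actual code, the sketch below takes a thermal frame of a keyboard as a NumPy array plus an assumed map of key regions, and reports the keys whose residual heat stands out from the background.

```python
# Back-of-the-envelope sketch of the Thermanator idea: keys touched recently
# stay warmer than the rest of the keyboard, so look for key regions whose
# mean temperature stands well above the frame's baseline.
import numpy as np

# Hypothetical key map: key label -> (row slice, column slice) in the frame.
KEY_REGIONS = {
    "A": (slice(40, 60), slice(10, 30)),
    "S": (slice(40, 60), slice(32, 52)),
    "D": (slice(40, 60), slice(54, 74)),
}

def warm_keys(frame, sigmas=3.0):
    """Return keys whose mean temperature exceeds the frame baseline by `sigmas` standard deviations."""
    baseline, spread = frame.mean(), frame.std()
    return [
        key
        for key, (rows, cols) in KEY_REGIONS.items()
        if frame[rows, cols].mean() > baseline + sigmas * spread
    ]

# Usage with a synthetic frame: ambient around 25 C, lingering body heat on "A".
frame = np.full((100, 100), 25.0) + np.random.normal(0.0, 0.1, (100, 100))
frame[40:60, 10:30] += 4.0   # residual warmth from a recent keypress
print(warm_keys(frame))      # expected: ['A']
```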

Dave Bittner: [00:08:10] It's tough to imagine how this might be useful in the wild. You'd think someone hanging around the office with a decent mid-range thermal camera would be conspicuous and easily recognized, even if they were wearing Juggalo or Juggalette makeup. On the other hand, setting up an inconspicuous camera around a terminal where people enter PINs - an ATM in a high-traffic area, maybe - might work, although how you'd get the rest of the payment card data isn't entirely clear.

Dave Bittner: [00:08:37] Perhaps the proof of concept is useful in drawing attention to the possibilities brought by the increasing commodification of sensors that were once relatively expensive and exotic - or another reminder, as if more were necessary, of the shortcomings of passwords and PINs generally. But here's one lesson that shouldn't be overlooked: your typing class could have made you a more secure user of computers. So stay in school, kids.

Dave Bittner: [00:09:08] I'd like to take a minute to tell you about an exciting CyberWire event, the 5th Annual Women in Cyber Security Reception taking place October 18 at the International Spy Museum's new facility in Washington, D.C. The Women in Cyber Security Reception highlights and celebrates the value and successes of women in the cybersecurity industry. The focus of the event is networking, and it brings together leaders from the private sector, academia and government from across the region and women at varying points on the career spectrum. The reception also provides a forum for women seeking cybersecurity careers to connect with the technical and business professionals who are shaping the future of our industry. It's not a marketing event. It's just about creating connections.

Dave Bittner: [00:09:54] We're grateful to our sponsors Northrop Grumman, CenturyLink, Cylance, Accenture, Cooley, T. Rowe Price, VMware, Delta Risk, SecureStrux and Edwards Performance Systems. If your company is interested in supporting this important event, we still have some great sponsorship opportunities available. We're also partnering with Maryland Art Place to have a special work of art created for the event that attendees can take home with them. As in previous years, this event is invitation only. We do it this way to ensure a mix of women with diverse backgrounds and at different career levels. If you are interested in getting an invitation to this year's event, tell us a little bit about yourself and request one at our website thecyberwire.com/wcs. That's thecyberwire.com/wcs. We look forward to hearing from you. We hope to see you there.

Dave Bittner: [00:10:56] And joining me once again is Emily Wilson. She's the director of analysis at Terbium Labs. Emily, welcome back. You all have a white paper that was recently released. And you're looking at things like fraud and how it relates to supply chain and things like that. Can you give us an overview? What are you getting at here?

Emily Wilson: [00:11:15] Sure, yeah. So very excited to have this research coming out. I'm looking forward to having conversations about it. I think it's a conversation starter. We're looking at a couple of things here. We're thinking about fraud as a supply chain. And so we're looking at really two pieces - one, sort of the goods and services aspect and then also, what does this mean if this is a supply chain? Does this modify the way that we're thinking about fraud?

Emily Wilson: [00:11:37] So on the one hand, the sort of goods and services side, we talk about a lot - and certainly other people are discussing - you know, the dark web trade in information isn't kind of a scramble or a one-off. It's a really well-structured economy. There are vendors, and there are buyers; it's subject to supply and demand; goods command certain prices. And so we're able to evaluate it as an economy.

Emily Wilson: [00:11:58] And one of the things that we're looking at is, how is data valued? We all have a concept in the real world - we think about our risk calculations or our data classifications - of what information is most important. And we think of information as being valued in the same way that we measure import. But it's different on the dark web. The information that is most prevalent or most valuable may not directly tie back to your concepts of data sensitivity or data classification. And if we're going to be thinking about the economy and thinking about how it impacts us, we need to understand how data is actually valued.

Dave Bittner: [00:12:33] So the things that might be valuable to me or I may perceive as being valuable, that might not align to what the folks on the dark web consider to be valuable.

Emily Wilson: [00:12:42] Right. Because when these people on the dark web are thinking about data, they're thinking about the potential for monetization. So something that you might have that's very sensitive may not be easy to monetize or may have such a small audience, like intellectual property, where it's going to be kind of one-offs. Right? It's going to be very targeted, people coming after specific things - as opposed to the information that's being traded constantly.

Emily Wilson: [00:13:04] The other piece of this that we're thinking about is - if this is a supply chain, how do we think about disrupting it? And the analogy I'm trying out here - and I'll try out with you guys who are listening - is we think about agriculture. We have an understanding of what a product recall would look like. Right? If something goes wrong, somebody gets sick - oh, no. I started eating salads, and now romaine is going to kill me. You know, we walk this process back, and we identify a point. And you know, we issue a big recall.

Emily Wilson: [00:13:32] In fraud, we're taking kind of the same approach. Right? Payment card fraud - something goes wrong. We say, OK, we'll figure it out. We scramble. It's a very reactionary approach because right now our only way of understanding fraud is as it's occurring or after it's occurred. But if this is a supply chain, how can we think about getting ahead of it? How can we think about stepping back? What if we could get to it before something happened? What if we could get to it as this information is becoming available? And so that's a hard question. And it's something we're working on. So I'm excited to discuss it with people.

Dave Bittner: [00:14:05] All right. Well, check out the white paper over at Terbium Labs. As always, Emily Wilson, thanks for joining us.

Dave Bittner: [00:14:14] And now a few words from our sponsor Dragos, the leaders in industrial control system and operational technology security. In their latest white paper, Dragos and OSIsoft present a modern-day challenge of defending industrial environments and share valuable insights on how the Dragos-OSIsoft technology integration helps asset owners respond effectively and efficiently. They'll take you step by step through an investigation, solving the mystery of an inside job using digital forensics with the Dragos platform and the OSIsoft PI System. Check it out. You can download your copy today at thecyberwire.com/dragos. That's thecyberwire.com/dragos. And we thank Dragos for sponsoring our show.

Dave Bittner: [00:15:13] My guest today is Brian Wells. He's the chief technology officer at Merlin International. Brian previously served as the associate vice president of health technology and academic computing for the University of Pennsylvania Health System Perelman School of Medicine. He also held a leadership position at Children's Hospital of Philadelphia. Our conversation focused on the lessons that can be learned from high-performance health care organizations, specifically how they approach cybersecurity.

Brian Wells: [00:15:43] They are under attack. They're a very target-rich environment. They have a lot of legacy technology that's been around for years and perhaps isn't secure enough. And they're also attractive in that they're running an organization that provides life-and-death services to patients, and they can't afford to have any systems go down. Be they electronic medical record systems or medical devices, those systems have to work all the time; they really can't go down and force everyone to work with a paper system that they're not used to using any longer. And so there's a lot of power that a hacker might have to extort or to, you know, use ransomware to force a health system to pay them money to get their systems back up and running again.

Dave Bittner: [00:16:24] Yeah. And particularly when it comes to ransomware, I think, you know, the common advice is to not pay the ransom. But I can certainly see in a health care situation, when lives are on the line, that it may be something that organizations consider.

Brian Wells: [00:16:39] They definitely consider it. I think it depends on the maturity of the organization. So if you're a large organization with a robust IT team that's following best practices around security and disaster recovery, you may be able to recover quickly enough and not pay the ransom. But if you're a smaller health system or hospital that doesn't have the depth of team and technologies, you may have to just pay.

Dave Bittner: [00:17:02] Now, you have quite a bit of experience in the health care sector. What sort of advice do you have for organizations to help protect themselves?

Brian Wells: [00:17:10] There's a lot of basics they have to, you know, start with. No. 1 would be educating their staff and their employees on how to be careful with emails and other types of things that they may come in contact with on their computers - to not click on links they don't recognize or open files that were given to them by someone they don't trust. And so a big part of it is just educating the staff as to how to be very careful and secure in their daily dealings with data and working with information.

Brian Wells: [00:17:38] And then, secondly, they have to secure their networks - so a robust firewall system that protects their network from external attack. They need constant monitoring tools that check the endpoints connected to their network and make sure they're patched, current and running the proper antivirus technologies and that sort of thing. And then they really just have to hire a chief information security officer and build a security team that is constantly monitoring the organization, checking log files and looking at data to make sure that they have not been attacked and to prevent future attacks. It's a difficult, never-ending job. It's kind of like weeding your lawn. You're never done killing the weeds on your sidewalk. They always come back. And you just can never give up. You have to have a collection of people and technologies to really be vigilant about protecting everything.
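To put a shape on the monitoring piece of that advice, here is a small Python sketch that flags endpoints which are overdue for patching or not running antivirus. The inventory format, the field names and the 30-day policy are assumptions for illustration, not features of any particular product.

```python
# Minimal endpoint-hygiene check: flag hosts that are overdue for patching
# or not running antivirus, given an asset-inventory export. The field names
# and the 30-day policy are illustrative assumptions.
from datetime import datetime, timedelta

MAX_PATCH_AGE = timedelta(days=30)  # assumed policy: patched within the last 30 days

def flag_unhealthy(endpoints, now):
    """Return hostnames that are unpatched for too long or missing antivirus."""
    flagged = []
    for ep in endpoints:
        stale = now - ep["last_patched"] > MAX_PATCH_AGE
        if stale or not ep["antivirus_running"]:
            flagged.append(ep["hostname"])
    return flagged

# Usage with a toy inventory.
inventory = [
    {"hostname": "nurse-station-3", "last_patched": datetime(2018, 3, 1), "antivirus_running": True},
    {"hostname": "mri-console-1", "last_patched": datetime(2018, 6, 20), "antivirus_running": False},
    {"hostname": "registration-desk", "last_patched": datetime(2018, 7, 1), "antivirus_running": True},
]
print(flag_unhealthy(inventory, now=datetime(2018, 7, 6)))
# -> ['nurse-station-3', 'mri-console-1']
```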

Dave Bittner: [00:18:26] And what about incident response and sort of practicing for the inevitability of these sorts of events? I can imagine in a health care situation, it's hard to carve out the time for those sorts of exercises.

Brian Wells: [00:18:40] It's very hard to carve it out. And you definitely have to have a plan. If you can't carve out time to stage a fake, you know, situation, you definitely should do tabletop exercises that bring the appropriate business, clinical and technology folks to the table and walk through what we would do if we had a ransomware attack and couldn't access our electronic medical records for 24 hours - or maybe even longer.

Brian Wells: [00:19:02] So you do have to have a plan. You have to accept the fact that, it's not a question of if; it's a question of when - and really prepare and practice to the best of your ability. You can't really, you know, stage a real attack because it's going to upset, you know, patient care. And that's the primary job. And so you really just have to model it, maybe set up a test environment where you can simulate it in a nonproduction mode, see how anyone responds. But you really just have to, at a minimum, run these tabletop exercises.

Dave Bittner: [00:19:31] Now, what about the notion of reducing friction? I'm thinking particularly for the doctors and nurses, the people who are actually, you know, doing the health care, applying the medicine. I have heard that, you know, if something gets in the way of them being able to provide care to their patients, well, that's not going to be their priority. Some sort of security procedure that slows them down in the operating room or, you know, in the patient care - they're not going to stand for that. How do you strike that balance between meeting their needs as health care providers but also protecting the organization?

Brian Wells: [00:20:09] You have to involve them in the process. The more mature organizations have a security governance committee that has security people as well as IT people as well as clinicians, nurses, doctors and business folks. And they sit around a table, and they weigh the pros and cons of forcing an automatic screen saver timeout of five or 10 minutes versus 20 minutes or 30 minutes. Those kind of organizational discussions have to happen with all the stakeholders in the room. And they have to be educated as to the trade-offs of what would happen if we weren't secure. Would you - you know, can you tolerate switching back to using a paper system for 48 hours if we allow people to not be secure? And so there is that constant conversation.

Brian Wells: [00:20:49] You can't - IT can't just inflict these things on the organization. They have to understand the pros and cons. I think one thing that is important is the role that third parties play. Many organizations use third-party software vendors for their applications, and they also bring in consultants and other organizations. And I think it's extremely important to make sure that all of your third-party partners, be they vendors of technology or vendors of people - nurses or IT people or other consultants - have a shared accountability, so that they're also involved in ensuring that their software, their technology and their people are following the rules and behaving securely as well.

Dave Bittner: [00:21:28] That's Brian Wells. He's chief technology officer at Merlin International.

Dave Bittner: [00:21:36] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially to our sustaining sponsor Cylance. Find out how Cylance can help protect you using artificial intelligence. Visit cylance.com. And Cylance is not just a sponsor. We actually use their products to help protect our systems here at the CyberWire. And thanks to our supporting sponsor VMware, creators of Workspace ONE Intelligence. Learn more at vmware.com.

Dave Bittner: [00:22:04] The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our show is produced by Pratt Street Media with editor John Petrik, social media editor Jennifer Eiben, technical editor Chris Russell, executive editor Peter Kilpe, and I'm Dave Bittner. Thanks for listening.