The CyberWire Daily Podcast 4.19.19
Ep 826 | 4.19.19

Observations on the Mueller Report. Doxing Iranian intelligence. Insecure messaging. Old Excel macros. Wipro hack and gift cards.

Transcript

Dave Bittner: [00:00:03] Some observations on the Mueller report - in particular, its insights into what two specific GRU units were up to. Someone is doxing Iran's OilRig cyberespionage group. A French government messaging app appears less secure than intended. Old Excel macros can still be exploited. And what were the Wipro hackers after? Gift cards, apparently.

Dave Bittner: [00:00:32] And now, a word from our sponsor ExtraHop, the enterprise cyber analytics company delivering security from the inside out. Prevention-based tools leave you blind to any threats inside your network. By adding behavioral-based network traffic analysis to your SOC, you can find and stop attackers before they make their move. ExtraHop illuminates the dark space with complete visibility at enterprise scale, detects threats up to 95% faster with machine learning and guided investigations that help Tier 1 analysts perform like seasoned threat hunters. Visit extrahop.com/cyber to learn why the SANS Institute calls ExtraHop fast and amazingly thorough, a product with which many SOC teams could hit the ground running. That's extrahop.com/cyber. And we thank ExtraHop for sponsoring our show.

Dave Bittner: [00:01:28] From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Friday, April 19, 2019. The U.S. special counsel's report on Russian interference in the 2016 presidential election was released in redacted form yesterday, finding insufficient evidence of collusion - that is, conspiracy and coordination - between the Trump campaign and Russian intelligence services and offering no recommendation on obstruction.

Dave Bittner: [00:01:56] The Mueller report's conclusions about Russian operations are unambiguous. The GRU's Unit 26165 did the hacking, and the Internet Research Agency managed the influence campaign. The report also concluded that the GRU's Unit 74455 retailed the results of the doxing through its fronts DCLeaks and Guccifer 2.0 and through a sympathetic WikiLeaks.

Dave Bittner: [00:02:22] It's perhaps not unreasonable to note that WikiLeaks, for all its pose of disinterested commitment to transparency, has never shown much disposition to traffic in comparably discreditable Russian material like the Panama Papers - much the opposite, in fact. So it's not a wholly owned front like DCLeaks or Guccifer 2.0, but it's pretty close to being an agent of influence.

Dave Bittner: [00:02:47] The report contains a good bit of information on how the GRU worked. It began by spear phishing personnel in the Democratic National Committee and the Clinton campaign and followed up that phishing expedition with credential theft. Once inside targeted networks, the attackers used Mimikatz to harvest credentials. They used X-Agent for screenshots and keylogging and X-Tunnel for data exfiltration. Middle servers were used to obfuscate the destination of the traffic.

Dave Bittner: [00:03:17] While the Trump campaign thought it would benefit from discreditable material so released, the investigation did not establish that any members of the campaign conspired or coordinated with the Russians. That's true of both the hacking and the subsequent social media campaigns. According to the report, the investigation did not identify evidence that any U.S. persons knowingly or intentionally coordinated with the Russian organizations.

Dave Bittner: [00:03:44] The report noted that collusion is not really a well-defined legal concept. It's more a journalist's term than a lawyer's. And so the report explained that what people call collusion, the investigators treated as a combination of the legal concept of conspiracy and the less formal concept of coordination, which lacks a settled legal definition. The investigators approached their task within the framework of U.S. federal conspiracy law. Since coordination appeared in the document appointing the special counsel, the report explains that the investigation construed coordination as requiring an agreement, tacit or express, and as requiring more than two parties simply taking actions that were informed by or responsive to one another's actions.

Dave Bittner: [00:04:31] The discussion, we note, seems to be all about the GRU, Fancy Bear, with its FSB colleague Cozy Bear not earning a mention, unless it's buried obscurely in the report's 448 pages and we've just overlooked it. Still, one bear is more than enough.

Dave Bittner: [00:04:48] Iran's APT34, the hacking group also known as OilRig, is itself being doxed. A Telegram channel called Read My Lips is dumping the group's tools and some of its identities online. WIRED compares them to the Shadow Brokers. Whoever they are - and neither disgruntled insiders, opposition groups, nor foreign intelligence services can be ruled out - their declared motive is exposing, quote, "this regime's real ugly face."

Dave Bittner: [00:05:16] Alphabet's Chronicle, Google's security corporate sister, has been watching Read My Lips, and they confirm that the tools being dumped do indeed appear to be OilRig kit. The doxing group has so far published not only tools, but also evidence of the intrusion points used against some 66 organizations OilRig has targeted. Also dumped are the IP addresses of servers Iranian intelligence uses and, more troubling for those so targeted, the names and photographs of people Read My Lips says are working for OilRig. The doxing group explained, quote, "We are exposing here the cyber tools - APT34, OilRig - that the ruthless Iranian Ministry of Intelligence has been using against Iran's neighboring countries, including names of the cruel managers and information about the activities and the goals of these cyberattacks."

Dave Bittner: [00:06:08] The French government recently introduced its own in-house messaging service, Tchap. Messages in Tchap are encrypted end to end, and they're stored domestically on French servers, outside the reach of foreign law. Access to Tchap is supposed to be restricted to government officials, but researcher Baptiste Robert, who goes by the hacker name Elliot Alderson, succeeded without much difficulty in getting himself an account he was in no way entitled to. So back to the drawing board.
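
The episode doesn't say how the sign-up restriction was bypassed, but gates of this kind typically hinge on validating an applicant's email domain against an allowlist. The sketch below is purely hypothetical Python, not Tchap's code: it contrasts a loose substring-style check with an exact match on the domain part of the address.

```python
# Hypothetical illustration only -- not Tchap's actual validation logic.
# Shows why a loose "does the address contain an approved domain?" check is
# weaker than parsing the address and comparing the domain exactly.

ALLOWED_DOMAINS = {"gouv.fr", "elysee.fr"}  # example allowlist (assumed)

def loose_check(email: str) -> bool:
    # Weak: matches anywhere in the string, so a crafted address that merely
    # *contains* an approved domain slips through.
    return any(domain in email for domain in ALLOWED_DOMAINS)

def strict_check(email: str) -> bool:
    # Stronger: take everything after the final '@' and require an exact match
    # or a subdomain of an approved domain.
    local, sep, domain = email.rpartition("@")
    if not sep or not local:
        return False
    domain = domain.lower().strip()
    return domain in ALLOWED_DOMAINS or any(
        domain.endswith("." + d) for d in ALLOWED_DOMAINS
    )

if __name__ == "__main__":
    crafted = "attacker.gouv.fr@example.com"
    print(loose_check(crafted))            # True  -- the weak check is fooled
    print(strict_check(crafted))           # False -- the exact-match check is not
    print(strict_check("agent@elysee.fr")) # True  -- legitimate address still passes
```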

Dave Bittner: [00:06:40] Researchers at security firm Avira have found another way in which well-intentioned and useful backwards compatibility can cause problems. Excel 4.0 macros, a feature some 25 years old that's still supported for compatibility's sake, can still be abused in current versions of Excel. Microsoft recommends migrating such macros to Microsoft Visual Basic for Applications.
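
For defenders triaging attachments, a minimal sketch along these lines can at least flag workbooks carrying Excel 4.0 (XLM) macro sheets. It assumes the OOXML packaging convention that such sheets live under xl/macrosheets/ inside the zip container; legacy binary .xls files need a dedicated parser (for example, the oletools suite) and aren't covered here.

```python
# Minimal triage sketch: flag OOXML workbooks (.xlsm and friends) that contain
# Excel 4.0 (XLM) macro sheets. Assumes such sheets are packaged under
# xl/macrosheets/; legacy binary .xls files are out of scope for this check.
import sys
import zipfile

def has_xlm_macrosheets(path: str) -> bool:
    try:
        with zipfile.ZipFile(path) as zf:
            return any(name.startswith("xl/macrosheets/") for name in zf.namelist())
    except zipfile.BadZipFile:
        # Not an OOXML zip package (could be a legacy .xls); can't tell from here.
        return False

if __name__ == "__main__":
    for workbook in sys.argv[1:]:
        verdict = "XLM macro sheet found" if has_xlm_macrosheets(workbook) else "none detected"
        print(f"{workbook}: {verdict}")
```

Run it as, say, python check_xlm.py suspicious.xlsm; anything it flags deserves a closer look in a sandbox rather than a double-click.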

Dave Bittner: [00:07:03] KrebsOnSecurity thinks the hackers behind the Wipro attack may be a criminal gang, not necessarily a nation-state as much earlier speculation maintained. It appears that the hackers may have also targeted a number of other large IT firms, competitors of Wipro, although what success, if any, they had there remains unclear. What were they after, if they were regular crooks and not working for a foreign intelligence service? Well, fullz for one thing - marketable PII on various individuals. And of course, they were after opportunities to work gift card scams; after all, the gift card scam is the gift that keeps on giving.

Dave Bittner: [00:07:47] Time for a message from our sponsor, KnowBe4. It can take a hacker to know a hacker. Many of the world's most reputable organizations rely on Kevin Mitnick, the world's most famous hacker and KnowBe4's chief hacking officer, to uncover their most dangerous security flaws. You might ask, hey, where can I get the skinny on the latest threats? And where could I find out what would Kevin do? Well, at KnowBe4's webinar, that's where. Kevin and Perry Carpenter, KnowBe4's chief evangelist and strategy officer, give you an inside look into Kevin's mind in this on-demand webinar. You'll learn more about the world of social engineering and penetration testing by listening to firsthand experiences and some disconcerting discoveries. You'll see exclusive demos of the latest attack ploys, find out how they could affect you and learn what you can do to stop them. Go to knowbe4.com/hacker to register for the webinar. That's knowbe4.com/hacker. And we thank KnowBe4 for sponsoring our show.

Dave Bittner: [00:09:00] And joining me once again is Malek Ben Salem. She's the senior R&D manager for security at Accenture Labs. Malek, it's great to have you back. You know, we recently had some news that came out about some groups that were making use of Facebook rather than the dark web to sell tools and tips and techniques for folks who are up to no good. What's your take on this?

Malek Ben Salem: [00:09:25] Yeah. In early April, the Cisco Talos Intelligence Group reported on some Facebook groups with some shady, perhaps even illegal, activity. These included almost 400,000 members across about 74 groups, which engaged in things like selling credit card numbers, identity theft, selling forged IDs, wire fraud, tax fraud, DDoS attacks - you name it. What's interesting is that these groups were not hidden. With, you know, a simple keyword search, you'd be able to identify them. And once you join one group, with Facebook's current recommendation algorithm, you'll be presented with similar groups that engage in similar activity for you to join.

Malek Ben Salem: [00:10:17] Talos tried to take down these groups using Facebook's abuse reporting functionality. Some were immediately taken down; others only had some specific posts removed. To me, this raises a question about the efficacy of the abuse reporting function that Facebook is relying on. It doesn't seem that it's working well, especially knowing that back in April of 2018 - so a year ago - the well-known security reporter, Brian Krebs, also alerted Facebook about dozens of Facebook groups where hackers offered similar illegal services.

Dave Bittner: [00:10:56] I'm wondering, do you have any insights on the difficulty of this? It seems as though when it comes to some things, like pornography, for example, you know, Facebook doesn't seem to have any trouble finding that sort of thing and shutting it down quickly. But some of these other things that are more speech-driven, they seem to be slower on the draw.

Malek Ben Salem: [00:11:19] Exactly. And I think that's why Facebook is being criticized for their total reliance on this model of, you report it - you tell us when something is wrong. I think there is a lot to be done there, a lot to be improved, especially if a simple keyword search is enough to identify these groups. I think there is an opportunity for Facebook to do more, to apply artificial intelligence in order to detect such content and to take it down, you know, in a timely manner.

Dave Bittner: [00:12:06] Yeah, I have to admit it leaves me scratching my head. If regular folks can find this stuff with just a keyword search, then, well, why isn't Facebook behind the scenes implementing systems that look for those keywords where it knows there could be problems and having someone take a look at it?
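
As a purely illustrative sketch - not Facebook's actual system - the kind of keyword triage Dave describes can be a few lines of code: match posts or group names against a watch list and queue the hits for human review. Real platforms layer machine-learning classifiers, user reports, and reviewer workflows on top of anything this simple.

```python
# Illustrative only: a trivial keyword triage pass that queues posts or group
# names containing watch-listed terms for human review.
import re

WATCHLIST = ["cvv", "fullz", "carding", "fake id", "ddos for hire"]  # example terms
PATTERN = re.compile("|".join(re.escape(term) for term in WATCHLIST), re.IGNORECASE)

def flag_for_review(items):
    """Yield (item, matched_terms) for anything that hits the watch list."""
    for item in items:
        hits = sorted({m.group(0).lower() for m in PATTERN.finditer(item)})
        if hits:
            yield item, hits

if __name__ == "__main__":
    sample = [
        "Fresh CVV and fullz for sale, DM me",
        "Photos from the weekend hike",
        "Cheap DDoS for hire, results guaranteed",
    ]
    for post, terms in flag_for_review(sample):
        print(f"REVIEW: {post!r} matched {terms}")
```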

Malek Ben Salem: [00:12:23] Exactly. And we know they're reading the content, right? They're using it for profiling users and presenting ads to users. So I think they have an opportunity to build trust with their users and to make sure that whatever they're reading can be used to protect their own users from harm presented by these hacker groups.

Dave Bittner: [00:12:47] Yeah. All right. Well, interesting insights. Malek Ben Salem, thanks for joining us.

Malek Ben Salem: [00:12:53] Thank you, Dave. My pleasure.

Dave Bittner: [00:12:59] And now, a word from our sponsor HackEDU. Are you looking to reduce vulnerabilities in your software? Security teams are turning to HackEDU to help them shift left and be more proactive in reducing vulnerabilities in software. HackEDU offers interactive security development training to help software developers lower the risk of vulnerabilities in code. Developers improve their ability to write secure software, boost their understanding of how software systems are hacked and decrease the time to solve security-related problems. In addition, HackEDU's training helps meet PCI, HIPAA, ISO and NIST compliance requirements. Unlike other offerings, HackEDU uses real applications, real tools and real coding exercises to teach developers both offensive and defensive security, online and on demand. HackEDU's training approach has been shown to be more effective and more engaging than defensive training alone. HackEDU is proven to train developers. Visit hackedu.io/cyberwire and try HackEDU's SQL injection lesson free. Again, that's hackedu.io/cyberwire to try a free lesson. And we thank HackEDU for sponsoring today's show.

Dave Bittner: [00:14:23] My guest today is Barbara Lawler. She's the chief privacy and data ethics officer for Looker Data Sciences, a business analytics and intelligence firm. She's a leader in the data privacy world, having previously served as chief privacy officer at Intuit and Hewlett Packard. She holds leadership positions in a number of influential policy organizations and has testified before several U.S. congressional committees.

Barbara Lawler: [00:14:49] Part of our debate is actually, what do privacy and data protection mean in the context of American values, American business, American innovation? Actually, what do we mean by privacy? What are the outcomes of possible regulation or legislation? How is that balanced - or should it be balanced - against the really unique American innovations that actually rely on pretty extensive use and reuse of data about people or about their activities?

Dave Bittner: [00:15:23] So, I mean, what is the historical foundation of how Americans think about privacy?

Barbara Lawler: [00:15:30] The historical foundations for many go back to some of the earlier waves of new technologies. You'll find discussions about when photography came into being in the late 19th century, what that meant in the public commons and in the private commons if someone was capturing an image of you, and then how that image was commercialized.

Barbara Lawler: [00:15:57] As we moved into the age of computers, it became this topic around massive processing of data and the compilation of lists and identifying people by numbers. As we move into the internet, and now cloud computing, it becomes, what does that look like at scale? And what does that look like when you can have massive scale, massive amounts of data? But also, the reverse of that microscope is the ability to microtarget in an incredibly detailed way, almost down to the individual level. And what does that mean for our autonomy, our ability to make our own choices about who we interact with, the ability to control who we hang out with, who we associate with? And who decides that, or who influences that?

Barbara Lawler: [00:16:49] So there are some pretty fundamental questions that historically have been around those kinds of concepts. I think the challenges we're facing go back to our American values and what it means to have a free and open society - in particular, in the United States, where we place the highest value on free speech. We place a high value on transparency. And what does that mean when information is used or misused for purposes that maybe a company wasn't clear about, either because they didn't want to be, or they didn't know how to be, or they didn't know that they should?

Barbara Lawler: [00:17:29] You know, when we look at what's happening in California around CCPA, which, I think it's important to underscore, is still a work in progress - there are at least 40 proposed amendments to adjust or tweak CCPA. So it's definitely not baked yet. It's just had a first round of baking. But when we look at, for example, what CCPA was trying to solve - CCPA was trying to solve issues around transparency and issues around control. Those actually aren't new issues, but how they manifest in social media and connected devices - and what do we even know about how our data as individuals is used, monetized, reused or shared? And should we know? Should we care? Should we have a say, and when should we have a say?

Dave Bittner: [00:18:19] You initially reached out to me because there are some misperceptions about CCPA that you wanted to clear up. What are some common misperceptions there?

Barbara Lawler: [00:18:30] One of the misperceptions is whether there's a financial incentive for companies to comply. I think there's some thinking that enforcement will be weak and that there isn't a lot of enforcement incentive. If you look at the potential range of fines as they've been proposed, it's important to understand that per issue also means per person. So let's say you have a database of 50,000 people that in some way was sold or shared in violation of CCPA. Whether it's $750 or $7,500 - which is the max - that's a per-line-item fee. So if you do the math on the worst case, $7,500 per incident times 50,000 gets you to about $375 million; that is a significant financial incentive.
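
For readers following along, here's a quick back-of-the-envelope check of the figures Lawler cites, treating the fine as per affected person:

```python
# Back-of-the-envelope check of the per-person fine math cited above.
records = 50_000
low, high = 750, 7_500            # per-violation amounts mentioned in the interview
print(f"${records * low:,}")      # $37,500,000
print(f"${records * high:,}")     # $375,000,000 -- the worst case cited
```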

Barbara Lawler: [00:19:25] I think the bigger incentive is that the AG actively requested that there be a private right of action, which means class action lawsuits. And we've seen class action lawsuits proliferate in other areas of, I would say, consumer protection and privacy protection. And I think the risk for both business and individuals there is that with class action lawsuits, there often isn't a direct benefit to the actual consumers they purport to protect. The benefit may go to education funds and perhaps to the law firms themselves. I think some of the other issues with CCPA are not so much misperceptions as confusion: the definitions aren't clear. We're all hoping the AG's office will follow through on one of his stated areas of focus, which is to add clarity to the definitions. So are employees of organizations covered as consumers in California? The way it's worded now, yes. Will that change? Potentially.

Barbara Lawler: [00:20:27] There's confusion about the definition of sell. You might say, I don't sell data; my company doesn't sell data. But right now, the definition of sell is any exchange of value or consideration. So if you're using a third-party vendor just to produce a podcast, for example, there is consideration, there is value exchanged - that's considered a sale. You probably don't think about it as a sale, but right now, under CCPA, it is. So those are the things that aren't always clear to folks and I think need to be cleared up. The last one I would add is about access requests: when a consumer requests the information held about them - which is potentially a very large set of data, not really a sample - that request covers 12 months.

Barbara Lawler: [00:21:13] So what that means is there is a 12-month look back. Did that look back start in January of 2019 - which means, if you haven't started thinking about that or planning for it, you're already late? Will it start in January of 2020? Will the effective date change? Because the AG doesn't need to provide his final guidance until July of 2020. So there are some interesting gaps that we hope will be closed: clarity on when the look back starts, when the effective date will be, and clarity around those definitions. That, I think, will give companies a much stronger sense of confidence in their ability to actually comply with CCPA.

Dave Bittner: [00:21:52] What's your advice to people out there who are trying to get a better handle on this? I think there's a sense that folks feel like they don't have control over their own data.

Barbara Lawler: [00:22:03] I think the first thing is that there are some great resources in a few different locations online that can show you how to actually control your privacy settings. And these are basic things like, if you're not using a mobile app anymore, delete the app; change your location settings. One of the best organizations is Stay Safe Online, particularly around Data Privacy Day, which happens on January 28 every year. There's a tremendous amount of resources for individuals as consumers and in a business context. And for parents, there are resources for teens. There are additional resources from the CyberAngels organization, which focuses on teens and kids. Girl Scouts has a program.

Barbara Lawler: [00:22:59] You'll also see some pretty good resources from organizations like the Privacy Rights Clearinghouse, based out of San Diego, Calif. So there are a lot of places to go. My advice is, check your privacy settings; you can do that by going into the settings menu of your smartphone or the settings menu of the different web apps you're using. And I think we're at a stage where less is more. What I mean by that is, if you look at the average number of apps on somebody's smartphone, it's kind of stagnated. And I think there's a great opportunity for folks to really take a look at, do I really need all of those apps? If I haven't used it in three months, I should just get rid of it, because that reduces the opportunity for location tracking and data collection that I may not know about or may just not be comfortable with.

Dave Bittner: [00:23:55] That's Barbara Lawler. She's the chief privacy and data ethics officer for Looker Data Sciences.

Dave Bittner: [00:24:06] And that's the CyberWire. Thanks to all of our sponsors for making the CyberWire possible, especially our supporting sponsor, ObserveIT, the leading insider threat management platform. Learn more at observeit.com. The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technology. Our CyberWire editor is John Petrik, social media editor Jennifer Eiben, technical editor Chris Russell. Our staff writer is Tim Nodar, executive editor Peter Kilpe, and I'm Dave Bittner. Thanks for listening.