The CyberWire Daily Podcast 9.14.22
Ep 1662 | 9.14.22

Patch Tuesday notes. Mr. Mudge goes to Washington. Joint warning of IRGC cyber activity. No major developments in the cyber phases of Russia’s hybrid war (but Ukraine is sounding confident).

Transcript

Dave Bittner: Patch Tuesday notes. The U.S. Senate Judiciary Committee hears from the Twitter whistleblower. A joint warning of IRGC cyber activity. Rob Boyce from Accenture on cybercriminals weaponizing leaked ransomware data. Chris Novak from Verizon describes his participation in the CISA Advisory Board. And Ukraine reiterates confidence in its resiliency.

Dave Bittner: From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Wednesday, September 14, 2022. 

Patch Tuesday notes.

Dave Bittner: Some quick notes on Patch Tuesday before we move on to the usual fare of threats and vulnerabilities. This week, Microsoft, Apple, SAP, and Adobe have all rolled out patches and software upgrades. Consult your vendors for details. And the U.S. Cybersecurity and Infrastructure Security Agency yesterday released five industrial control system advisories. Again, users should consult their vendors for information on mitigations. 

The U.S. Senate Judiciary Committee hears from the Twitter whistleblower.

Dave Bittner: Yesterday, the U.S. Senate Judiciary Committee heard testimony from Peiter “Mudge” Zatko, now familiarly known as the whistleblower, on his allegations of privacy and security problems at Twitter. The committee chair, Senator Durbin, Democrat of Illinois, and the committee's ranking member, Senator Grassley, Republican of Iowa, both expressed concern about data security and privacy at Twitter. Senator Durbin wanted Twitter to do more to censor hate speech and misinformation. Senator Grassley warned of Twitter's potential exploitation by foreign intelligence services. Zatko, whom we'll henceforth refer to by his handle, Mudge, described his responsibilities when he was part of Twitter's executive team and explained why he was testifying. 

Peiter Zatko: From November 2020 until January 2022, I was a member of Twitter's executive team. In my role, I was responsible for information security, privacy engineering, physical security, information technology and Twitter global support. I'm here today because Twitter leadership is misleading the public, lawmakers, regulators and even its own board of directors. What I discovered when I joined Twitter was that this enormously influential company was over a decade behind industry security standards. The company's cybersecurity failures make it vulnerable to exploitation, causing real harm to real people. 

Dave Bittner: He complained that the company's executive team chose to disregard warnings of security problems, preferring instead to mislead the board, its employees, its customers, the public and legislators. Perverse incentives operated to drive the executives in that direction and enmeshed the company in two basic problems. Mudge said... 

Peiter Zatko: And when an influential media platform can be compromised by teenagers, thieves and spies, and the company repeatedly creates security problems on their own, this is a big deal for all of us. When I brought concrete evidence of these fundamental problems to the executive team and repeatedly sounded the alarm of the real risks associated with them - and these were problems brought to me by the engineers and employees of the company themselves - the executive team chose instead to mislead its board, shareholders, lawmakers and the public instead of addressing them. This leads to two obvious questions. Why did they do that? And what were the problems and vulnerabilities identified? And that's what I'm here to talk about. So first, why did they do that? To put it bluntly, Twitter leadership ignored its engineers because key parts of leadership lacked the competency to understand the scope of the problem. But more importantly, their executive incentives led them to prioritize profits over security. 

Dave Bittner: Senator Durbin was concerned about the limitations of informed consent and asked whether the wordy end-user license agreements actually amount to realistic consent. Senator Grassley pursued his inquiry about an espionage threat with questions about the range and accessibility of the personal information Twitter collects. In his response, Mudge brought up an interesting point about Twitter's engineering culture and the infrastructure that supports it. Twitter doesn't maintain a distinct development or testing environment. 

Peiter Zatko: So Twitter has engineers and non-engineers. Twitter does not have - at least when I was there, which was up until January of 2022 - does not have a testing environment or a development or staging environment. This is an oddity. This is an exception to the norm. Most companies will have a place where you test your software, where you build it, where you make sure it's working the way you want it to. Think about somebody building an airplane and saying, like, I'm going to put it in a wind tunnel. I'm going to build it in an environment. I'm not going to put passengers on it, put it in the air, and then figure out how to build it or tweak the engines at that point. Twitter just has the production environment, the running systems, the live data. When you become an engineer, which is, half of the company are engineers, you are by default given some access to this live production environment. You are doing your testing. You are doing your work on live systems and live data irrespective of where you are in the world as an engineer. 
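
To make concrete the separation Mudge is describing, here is a minimal, hypothetical sketch of environment-scoped configuration. The environment names, connection strings, and approval flag are illustrative assumptions, not anything drawn from Twitter's actual systems; the point is simply that code can default to a staging database with synthetic data, while touching live production data requires an explicit opt-in.

```python
# Minimal, hypothetical sketch of environment separation (names and endpoints
# are illustrative assumptions, not Twitter's real infrastructure). It shows
# the practice Mudge says was missing: a non-production place to build and
# test, with production data gated behind an explicit opt-in.
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class EnvConfig:
    name: str
    db_url: str
    uses_live_data: bool

ENVIRONMENTS = {
    "dev":     EnvConfig("dev",     "postgres://localhost/dev_db",           uses_live_data=False),
    "staging": EnvConfig("staging", "postgres://staging.internal/synthetic", uses_live_data=False),
    "prod":    EnvConfig("prod",    "postgres://prod.internal/live",         uses_live_data=True),
}

def load_config() -> EnvConfig:
    """Resolve configuration from APP_ENV, defaulting to staging."""
    cfg = ENVIRONMENTS.get(os.environ.get("APP_ENV", "staging"), ENVIRONMENTS["staging"])
    # Touching live data requires an explicit, separate approval flag.
    if cfg.uses_live_data and os.environ.get("PROD_ACCESS_APPROVED") != "yes":
        raise PermissionError("Production data access requires explicit approval.")
    return cfg

if __name__ == "__main__":
    print(load_config())
```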

Dave Bittner: It became clear in the questions and answers that the senators and the whistleblower all regarded Twitter as having been compromised by foreign intelligence services. China, India and Saudi Arabia are all thought to have succeeded in placing agents on Twitter's payroll. Mudge said, in effect, that the company had no way of finding such insider threats, constraining their activity, or remediating the damage they might have done. 

Peiter Zatko: Other than the person who I believed with high confidence to be a foreign agent placed in a position from India and from - it was only going to be from an outside agency or somebody alerting Twitter that somebody already existed that they would find the person. What I did notice when we did know of a person inside acting on behalf of a foreign interest as an unregistered agent, it was extremely difficult to track the people. There was a lack of logging and an ability to see what they were doing, what information was being accessed, or to contain their activities, let alone set steps for remediation and possible reconstitution of any damage. 

Dave Bittner: Senator Klobuchar, Democrat of Minnesota, wanted to know how willing Twitter was to knuckle under to requests from foreign governments to censor content. She made particular mention of Russia. And Mudge saw this particular risk as a function of executive incentives, an inability to manage data, and self-delusion about the governments in question. 

Peiter Zatko: I understand, be it out of a frustration of the inability to perform - and this kind of goes into content moderation, which we talked about before. And while that wasn't my main bailiwick, and I've been informed I shouldn't go into details about conversations I've had with Twitter counsel, there was a, we don't really have the ability and tools to do things correctly. This is a lot of work. It's not, you know, driving our main executive incentive goals. Is there a way that we can simply punt? And since they have elections, doesn't that make them a democracy? 

Dave Bittner: Senator Blumenthal, Democrat of Connecticut, wanted to know how high up in Twitter's management the decisions Mudge alleges - the decisions to mislead government regulators - were made. 

Peiter Zatko: To the CEO - I do not know to what level inside the board. They did not know because of misrepresentation or chose not to push. 

Dave Bittner: There was general comment on Twitter's alleged indifference to U.S. regulatory risks, including those imposed by the consent decree Twitter entered into with the U.S. Federal Trade Commission. It was clear from the senators' questions and comments that they thought the FTC's authorities and resources were unequal to the task of regulating large social media platforms like Twitter. Whether this might be addressed by reforms surrounding the FTC or by the creation of an entirely new agency was unclear; both possibilities were discussed. There may be an international model ready at hand, however. For all of its apparent indifference to the consequences the FTC might bring, Mudge said that Twitter took French regulators much more seriously than it did American agencies. Thus Gallic regulatory teeth may be better adapted to keeping Big Tech on the straight and narrow than the Yankee choppers so far seem to be. 

Joint warning of IRGC cyber activity.

Dave Bittner: CISA and its partners have added their warning to those that have drawn attention to Iranian cyber activity this week. The Islamic Revolutionary Guard Corps has continued to exploit known vulnerabilities for initial access. In addition to the Fortinet and Microsoft Exchange vulnerabilities the group has exploited before, the authoring agencies have observed these APT actors exploiting VMware Horizon Log4j vulnerabilities for initial access. Exploitation of known vulnerabilities is a long-standing practice. The concentration on extortion is somewhat more novel for an Iranian threat actor. CISA says the IRGC-affiliated actors have used their access for ransom operations, including disk encryption and extortion efforts. After gaining access to a network, the IRGC-affiliated actors likely determine a course of action based on their perceived value of the data. Depending on the perceived value, the actors may encrypt data for ransom and/or exfiltrate data. The actors may sell the data or use the exfiltrated data in extortion operations or double extortion ransom operations, where a threat actor uses a combination of encryption and data theft to pressure targeted entities to pay ransom demands. A full set of indicators of compromise, advice on mitigation, and a set of preventative best practices accompany the alert. 
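
Those VMware Horizon intrusions hinge on servers still running Log4Shell-vulnerable versions of Log4j (CVE-2021-44228). As a rough illustration of the kind of basic check the alert's mitigation advice points toward - not CISA's own tooling - the sketch below walks a directory tree for log4j-core jars and flags versions older than an assumed patched cutoff; the scan root and cutoff are assumptions for the example.

```python
# Rough sketch (not CISA's tooling): find log4j-core jars on disk and flag
# versions still in the Log4Shell-affected range. The scan root and the
# "fixed" version cutoff are assumptions for illustration; pre-release names
# like 2.0-beta9 would need extra handling.
import re
from pathlib import Path

JAR_RE = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$", re.IGNORECASE)
FIXED = (2, 17, 1)  # assumption: treat 2.17.1 and later as patched for the 2021 Log4j CVEs

def scan(root: str) -> list[tuple[Path, tuple[int, int, int]]]:
    """Return (path, version) pairs for log4j-core jars older than FIXED."""
    findings = []
    for jar in Path(root).rglob("log4j-core-*.jar"):
        match = JAR_RE.search(jar.name)
        if not match:
            continue
        version = tuple(int(part) for part in match.groups())
        if version < FIXED:
            findings.append((jar, version))
    return findings

if __name__ == "__main__":
    for path, version in scan("/opt"):  # assumption: application install root
        print(f"Potentially vulnerable Log4j {'.'.join(map(str, version))}: {path}")
```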

No major developments in the cyber phases of Russia’s hybrid war.

Dave Bittner: Cyber operations proper have remained relatively quiet, or at least inconsequential, during Ukraine's current counteroffensive, although there are reports from the U.K. and elsewhere of a continued uptick in distributed denial-of-service attacks against financial services. And finally, WIRED this morning published an interview with Yurii Shchyhol, director of Kyiv's equivalent of CISA. He offers a moderately encouraging picture of the war from Ukraine's point of view. Russia, he says, has moved into a phase of cyberwar in which it's largely going after softer civilian targets. The director stated, our attitude remains the same. We treat them as criminals trying to destroy our country, invading it on the land but also trying to disrupt and destroy our lifestyle in cyberspace. And our job is to help defend our country. He's also at pains to stress that Russian cyber operations represent a threat to nations other than Ukraine, stating, the whole civilized world needs to recognize that the threat goes beyond Ukraine. Cyberspace has no boundaries. If there is any attack perpetrated against the cyberspace of one country, by default, it's affecting and attacking other countries as well. 

Dave Bittner: Coming up after the break, Rob Boyce from Accenture on cybercriminals weaponizing leaked ransomware data. And Chris Novak from Verizon describes his participation in the CISA advisory board. Stick around. 

Dave Bittner: Robert Boyce is global lead for cyber crisis and incident response services at Accenture. And I recently spoke with him about some research he and his colleagues published tracking how cybercriminals are weaponizing leaked ransomware data for follow-up attacks. 

Robert Boyce: So we've seen - and honestly, this is personally fascinating to me. So I want to talk to you about this. 

Dave Bittner: OK (laughter). 

Robert Boyce: So we have - you know, we've had this string of ransomware attacks for the last, you know, couple of years now. And it's been, you know, pretty heavy with their affiliate programs and all of that. And so what we've now started to research is we took a look at the 20 - the top 20 leak sites. And those leak sites are - with all the victim data that has been published through there. And we started to see a really interesting trend that after a company has had their data disclosed, the attackers are now going through that data to learn as much as they can about the people, the profile, the inner business workings of an organization, and then leveraging that for very sophisticated business email compromise attacks. So, you know, business email compromise is not new, but they're able to get so much fidelity in the data that's been disclosed from the victims of ransomware that they can now, with much more, you know, certainty, execute a better business email compromise attack because they have so much data - really fascinating. 

Dave Bittner: Is your sense that this is a second round of adversaries here, that, you know, different groups are combing through this? Or might it be the same group, you know, coming back for more? 

Robert Boyce: Yeah. It's a, I mean, great question. It could be either. I think we've just been so focused on the threat of ransomware that we - basically, once we get through that and we do the investigation, we - you know, we go through the forensics, we help maybe recover business operations, that we all congratulate ourselves and job done. Let's go back to business. And then we've really not had the foresight to think about, well, all that data that was just released, what can people do with that? And we're seeing the attackers actually index the data, making - like, doing basically their own big data analytics on what they're being able to get so that they could create this. So it could be - you know, it could be the same adversaries. It could be, you know, a second set of adversaries that have now just thought, what can we do with this data? So it's hard to say. We haven't seen a real correlation there yet. But yeah, it's just still interesting to me. 

Dave Bittner: Yeah. As a defender, is it in your best interest to also gather up this data and catalog it yourself to see where your weak spots may be? 

Robert Boyce: Yeah. So I find most organizations do that as part of the IR process because they need to understand the data that could have been stolen or was stolen. So - or when that does get disclosed, I find most organizations do go and retrieve their own data or have someone do that for them. And they have to go through it. But what I'm not seeing is them take any action based on that. So they may take a look at, saying, OK, these user accounts may have been used as part of the attack or these email addresses may have been disclosed. But they very rarely - well, they will change the passwords typically, but they very rarely will take any additional action than, say, putting additional monitoring on those email addresses that were disclosed, as an example, or - you know, or really looking into what business processes may be impacted by the data that was taken and lost to think through this type of attack scenario, right? So it's - so they usually have it, but they're not typically thinking about what the implications could be outside of standard privacy or other regulations that they need to comply to. 

Dave Bittner: Is that the primary take-home then, or are there other action items here as well? 

Robert Boyce: I mean, there's always, like - there's always the standard follow-up actions from a breach that people need to - like, the basic hygiene. But when we're talking about the data aspects, again, outside of the - you know, the requirement to disclose or the requirement to notify either customers, employees, et cetera - yeah, they're not really thinking about that. And quite honestly, Dave, I'm not even - I mean, it would take a - quite a sophisticated partner to be able to think through, like, what are all the possibilities that people could do with this data if they have it? So yeah, it's - I think the take-home would be, you know, obviously, do the basic hygiene, given the data that was lost and what your obligations are. But now we're starting to see the emphasis really needs to be put on thinking through the additional business implications of that data. What could be used with it? What types of processes may have been disclosed as well? So yeah, it's going to - just definitely going to have to be a lot more thought put into that than there has in the past. 

Dave Bittner: Do organizations need to worry about some of the liability and regulatory consequences of something like this? 

Robert Boyce: Yeah, for sure. I mean, they're usually pretty well in-tune with their regulatory obligations related to data, whether it's privacy or, you know, whatever additional, you know, regulations may exist, depending on their industry and, you know, what those may be. And they're usually very well coached by their internal and outside counsel on what those obligations are and what they have to do. So I think - I find that that's pretty well known. It's the other data that people just aren't putting two and two together to really, you know - like, for example, the procurement data or the - being even able to get enough procurement data to know when a standard payment is made to a specific vendor. Who initially - who makes that payment typically? Who authorized it? Who executes it, where it goes? And so when you know that pattern, it's much easier to say then, well, I'm going to impersonate the employee or the partner and start a new transfer because it's within the bounds of the pattern. 

Dave Bittner: Right. 

Robert Boyce: So it's really super sophisticated. 

Dave Bittner: Yeah. Even just knowing the internal rhythms and cadence that a company uses. 

Robert Boyce: Correct. Yeah. 

Dave Bittner: Yeah. 

Robert Boyce: That's why it becomes... 

Dave Bittner: Fascinating. 

Robert Boyce: ...Fascinating. Yeah, it really is. 

Dave Bittner: Yeah, absolutely. All right. Well, Rob Boyce, thanks for joining us. 
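
One practical thread from that conversation: organizations that retrieve their own leaked data, as Boyce describes, can index it the same way attackers do and act on what they find. The sketch below is a minimal, hypothetical illustration - the dump path, mail domain, and output are assumptions, not any vendor's tooling - that counts how often each corporate email address appears in a leak so those accounts can be queued for password resets and extra monitoring.

```python
# Hypothetical sketch: index a retrieved leak dump for exposed corporate
# identities so defenders can reset credentials and add monitoring.
# The dump path and mail domain are illustrative assumptions.
import re
from collections import Counter
from pathlib import Path

CORP_DOMAIN = "example.com"  # assumption: the victim organization's mail domain
EMAIL_RE = re.compile(rf"[A-Za-z0-9._%+-]+@{re.escape(CORP_DOMAIN)}", re.IGNORECASE)

def index_leak(dump_dir: str) -> Counter:
    """Count how often each corporate address appears across the leaked files."""
    hits: Counter = Counter()
    for path in Path(dump_dir).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        hits.update(addr.lower() for addr in EMAIL_RE.findall(text))
    return hits

if __name__ == "__main__":
    exposed = index_leak("./leak_dump")  # assumption: local copy of the disclosed data
    for address, count in exposed.most_common():
        # Frequently appearing addresses are good candidates for forced resets
        # and enhanced mail-flow monitoring (e.g., BEC lookalike detection).
        print(f"{address}\t{count}")
```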

Dave Bittner: And I'm pleased to be joined, once again, by Chris Novak. He is managing director for security professional services at Verizon. Chris, always great to welcome you back. I want to talk about some stuff that you are active in, in addition to your work there at Verizon. You are actually an advisory board member with the Cybersecurity and Infrastructure Security Agency. Can you give us some insights on what it's like to be part of that group? 

Chris Novak: Sure. Yeah. I would say, honestly, it's - maybe not to be too cliche, but kind of a dream come true. I've always looked at opportunities in which we can take what we learn in cyber and share it with others. You know, even going back to the early days of working at Verizon, you know, we put together things like the data breach investigations report. And the real genesis behind that was, how do we share what we see, what we learn, what we know with the community at large to try to make everybody safer, right? The concern is there's no restrictions or rules on how the threat actors operate, collaborate and share. And we need to do everything we can to, you know, similarly collaborate and share and understand what it is that we see in the threat landscape that we can all use to better defend and protect ourselves. And in that advisory capacity, that's one of the things that I'm deeply passionate about is, how can we take what it is that, you know, we're seeing and bring that to - you know, for example, some of the investigations that the Cyber Safety Review Board within CISA are working on? And how do we take that experience and build on that, right? How do we actually, you know, develop, you know, new recommendations or other ways in which we can help our friends in the government mature what it is that they might be doing, help our partners in critical infrastructure secure that? Because by virtue of securing that, we're essentially making the world a safer place for you and me. And then also, how does that - you know, how does that expand from there out to the broader, you know, public and private sector interests? 

Dave Bittner: You know, when Chris Krebs was part of spinning up the agency and then, of course, now Jen Easterly at the helm, these public-private partnerships have really been a focus. And I think for a lot of folks on the outside, that's a bit of a shift. What's the cultural reality as part of this, of, you know, melding the government and the private sector? 

Chris Novak: Yeah, it's been an absolute pleasure to work with Jen and her entire team there. And I think the public-private partnership, I think, is extraordinarily valuable. And actually, interestingly, as I started doing more work with CISA and the CSRB, one of the things I found was lots of other entities are reaching out, going, hey, might that be something that these other entities might be interested in starting up? You know, the Australian government is interested. The U.K. government is interested. Lots of them are going, maybe a public-private partnership similar to what CISA is doing would be good for them as well, right? And I think it's important because I think if you look at anything through just one lens - and, you know, I think we all kind of deep down understand this - but if you only look at it through one lens, it's that biased - or the bias of that one view that is going to lead the way of how you act and how you operate. You know, if we only look at things through the lens of the way a government agency sees it, that may have a bias to how we react, respond and make recommendations. If we only look at it through private sector, you know, financial services or manufacturing or health care, those are all going to have biases as well - and not necessarily that the bias is bad but that the bias doesn't necessarily take consideration of what the full threat landscape is that we're trying to address. So, you know, bringing all of that together in a holistic fashion allows us to really make sure that we're taking everybody's interest, all the constituents, all the stakeholders and ultimately anybody that could be at risk of a cyberattack, taking those concerns into consideration when we make, you know, recommendations for improvements. 

Dave Bittner: And what do you bring back to your colleagues at Verizon? You know, the time that you spend on this board, what insights are you able to carry back? 

Chris Novak: Well, I think a couple of things. One is, it's always insightful and interesting to work with some of my peers at other private sector organizations. So there's a handful of us also on the board from, you know, Microsoft, Google and others. And so seeing and hearing what they're experiencing and bringing that back and say, hey, you know, these are things that we should look at or consider, or these might be other ways that we can partner better with some of our peers because we're hearing some of the challenges that they have, and there may be solutions or ways in which we've addressed it. And then similarly, on the government side of things, you know, I think, you know, sometimes people are maybe a little bit intimidated just at the sheer size of, quote, "government." There's so much there. And it works in so many different ways, knowing what some of those challenges are and saying, hey, you know what, this may be a way that we can work better as a partner, better as a contributor and better as a collaborator - so knowing where, you know, CISA or DHS or anyone else in government may have challenges that they need help in addressing. You know, we're all on the same defender team, so to speak. So bringing that back and saying, how can we help solve for that, for me, that's been very eye-opening, and same for my team. 

Dave Bittner: Is your sense that this sort of arrangement, this sort of committee, could be a model for, you know, other areas of government when it comes to cybersecurity, those sorts of things? 

Chris Novak: Absolutely, I do. Yeah. I think, honestly, that public-private partnership, I think, ultimately is critical to success because so many of the entities that I see and talk to, whether they be government or private sector, they're rarely ever limited in scope to just that. The government is constantly having to work with the private sector, right? It doesn't just work with government, right? At the end of the day, it really serves the people. And the same thing with the private sector - they may largely be, say, a B2B entity, but at the end of the day, they're going to be subject to government regulations. They're going to at some point touch government data or government systems, or they're going to want to pick up a government contract. And so that collaboration I see as being very, very important, and I expect that it's going to kind of ripple out, I think, as people see some of the successes that CISA has had. I mean, for example, the CSRB investigation of Log4j I think was extraordinarily successful. And I think CISA and the CSRB get a lot of kudos for that. I expect that others are going to look at that and say, how do we take this, and how do we replicate this in other places? 

Dave Bittner: All right. Well, Chris Novak, thanks for joining us. 

Dave Bittner: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com. 

Dave Bittner: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Tre Hester, Brandon Karpf, Eliana White, Puru Prakash, Liz Irvin, Rachel Gelfand, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe, and I'm Dave Bittner. Thanks for listening. We'll see you back here tomorrow.