The CyberWire Daily Podcast 11.23.21
Ep 1465 | 11.23.21

Tardigrade malware infests the US biomanufacturing sector. GoDaddy suffers a significant data breach. Facebook Papers to be reviewed and released. NSO Group’s troubles.

Transcript

Tre Hester: Tardigrade malware infested the U.S. biomanufacturing sector. GoDaddy suffers a significant data breach. A Gizmodo-led consortium will review and release the Facebook papers. Ben Yelin on our privacy rights during emergency situations. Our guest today is Ric Longenecker of Open Systems to discuss how ransomware attacks represent the No. 1 threat for universities. And the NSO Group may not recover from current controversy over its Pegasus intercept tool.

Tre Hester: From the CyberWire studios at DataTribe, I'm Tre Hester with your CyberWire summary for Tuesday, November 23, 2021. 

Tre Hester: BIO-ISAC, the Bioeconomy Information Sharing and Analysis Center, yesterday released a report on malware it calls Tardigrade, named after the moss piglet - or, if you prefer, water bear micro-animal - which it describes as the work of an advanced persistent threat - that is, a nation-state intelligence service. Tardigrade appeared this spring when it hit BioBright's manufacturing facility. It resurfaced in an October attack. There are some similarities with the SmokeLoader malware, familiar since 2011, and those similarities are enough for BIO-ISAC to assess Tardigrade as a member of the SmokeLoader family. 

Tre Hester: SmokeLoader, which MITRE calls a malicious bot application that can be used to load other malware, has been involved with what BIO-ISAC describes as, quote, "multipurpose tools that include keylogging, information theft, botnet support and backdoor access," end quote. 

Tre Hester: But there are some significant differences that show Tardigrade as having evolved beyond its parent malware. Quote, "previous SmokeLoader versions were externally directed, dependent on C&C infrastructure," end quote, BIO-ISAC says, whereas, quote, "this Tardigrade version is far more autonomous, able to decide on lateral movement based on internal logic," end quote. It's also good at immediate privilege escalation to the highest level. 

Tre Hester: And Tardigrade is more than polymorphic malware. It is, BIO-ISAC says, metamorphic, by which they mean it seems to be able to recompile the loader from memory without leaving a consistent signature. 

Tre Hester: Recompiling occurs after a network connection in the wild that could be a call to a command-and-control server to download and execute the compiler. This gives the malware an unusual level of autonomy. The malware is installed either by infected email software, malicious plugins, malvertising, general network infection or contaminated removable media, like USB drives. 

Tre Hester: WIRED says Tardigrade's operators seemed curiously indifferent to whether victims actually paid. Tardigrade proved more advanced than it appeared - evasive, persistent and clearly interested in more than ransom. 

Tre Hester: BIO-ISAC says the malware is spreading through the biomedical sector, which suggests that some intelligence service is actively scouting the U.S. biomedical industry. There's no further attribution available at this time. 

Tre Hester: While it's unclear which nation-state might be responsible for Tardigrade, BIO-ISAC offers some speculation on the motive, which it bases on the malware's behavior. The main role of this malware, the ISAC's report says, is still to download, manipulate files, send main.dll library if possible, deploy other modules and remain hidden. First, Tardigrade's operator seems interested in stealing intellectual property from the biomanufacturing industry. The second objective seems to be staging, battlespace preparation and establishing persistence with a view toward further operations. 

Tre Hester: Finally, the researchers think that at least some of those subsequent operations may have been ransomware attacks. BIO-ISAC offers recommendations for organizations in the biomedical sector that may be at risk. First, review your biomanufacturing network segmentation. Run tests to verify proper segmentation between corporate, guest and operational networks. Most facilities use remote logins with shared passwords to operate key instrumentation. Enforcing segmentation is essential. 
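
That segmentation check can be automated in a small way. The sketch below, a hypothetical example and not from the BIO-ISAC report, probes hosts on the operational network from a corporate or guest segment; any successful connection indicates a segmentation gap. The addresses and ports are invented placeholders.

```python
import socket

def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hosts on the operational (OT) network that should NOT be reachable
# from the corporate or guest segment this script runs on.
# Addresses and ports below are illustrative placeholders.
OT_TARGETS = [
    ("10.20.0.5", 502),    # e.g., Modbus/TCP on a PLC
    ("10.20.0.8", 44818),  # e.g., EtherNet/IP on an instrument controller
]

def audit(targets):
    """Return any OT endpoints reachable from this segment."""
    return [(host, port) for host, port in targets if reachable(host, port)]

if __name__ == "__main__":
    for host, port in audit(OT_TARGETS):
        print(f"SEGMENTATION GAP: {host}:{port} reachable from this segment")
```

Run from each segment in turn; an empty result from the corporate and guest networks is the expected outcome when segmentation is enforced.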

Tre Hester: Second, work with a biologist and automation specialist to create a crown jewels analysis for your company. Ask, if this machine was inoperable overnight, what would be the impact, and how long would it take to recertify this instrument? 

Tre Hester: Third, test and perform offline backups of key biological infrastructure. That should include, the ISAC says, ladder logic for biomanufacturing instrumentation, SCADA and historian configurations and batch record system. 

Tre Hester: Finally, inquire about lead times for key bioinfrastructure components, including chromatography systems, endotoxin and microbial contamination systems. That final point is worth considering when studying the risks associated with any industrial system. Many components may need to be replaced after a successful attack, and they're not always immediately available right off the shelf. 

Tre Hester: Domain registrar and hosting company GoDaddy has disclosed in an SEC filing a major data breach affecting up to 1.2 million active and inactive managed WordPress accounts. The breach began, GoDaddy believes, on September 6. The company discovered it on November 17, and the investigation remains in progress. 

Tre Hester: The essential points of the disclosure are these. Quote, "the original WordPress admin password that was set at the time of provisioning was exposed. If those credentials were still in use, we reset those passwords. For active customers, sFTP and database usernames and passwords were exposed. We reset both passwords. For a subset of active users, the SSL private key was exposed. We are in the process of issuing and installing new certificates for those customers," end quote. 

Tre Hester: GoDaddy's security team believes the attackers used a compromised password to access GoDaddy's provisioning system for its managed WordPress service. 

Tre Hester: Gizmodo has announced its intention to release - to responsibly disclose, as Gizmodo puts it - the Facebook papers first reported by The Wall Street Journal and provided to committees of the U.S. Senate. The Facebook papers record internal discussions of the design and operation of Meta's Facebook and Instagram platforms, recently controversial over allegations that their very design conduces to the spread of hate, misinformation and material that's harmful to minors. Gizmodo and its partners at New York University, the University of Massachusetts Amherst, Columbia University, Marquette University and the American Civil Liberties Union will be sifting through the material and releasing it as they complete their review. 

Tre Hester: The responsibility in the disclosure lies in the group's avowed intention to avoid perpetuating harm. Quote, "we believe there's a strong public need in making as many of the documents public as possible as quickly as possible. To that end, we've partnered with a small group of independent monitors, who are joining us to establish guidelines for an accountable review of the documents prior to publication. The mission is to minimize any costs to individuals' privacy or the furtherance of any harms while ensuring the responsible disclosure of the greatest amount of information in the public interest," end quote. 

Tre Hester: There's also an acknowledgement that simply dumping the material, which was provided by Facebook whistleblower Frances Haugen, could cause harm in other systemic ways. Quote, "beyond privacy reasons, the documents require additional review to ensure that we aren't just handing criminals and spies a roadmap for undermining what controls Facebook does have in place to defend against propaganda that spreads lies, hate and fear. That would undermine any benefit the world stands to reap from this act of whistleblower justice," end quote. 

Tre Hester: It's worth remembering in this context that whatever the company's other faults may or may not be, Facebook's record of exposing coordinated inauthenticity, the deliberate use of bogus accounts, mostly by governments, to spread disinformation, has been seen as a positive one. 

Tre Hester: And finally, the headwinds NSO Group faces appear to be blowing harder. The intercept tool vendor was sanctioned earlier this month by the United States, and reputational damage continues to press the company. Bloomberg reported yesterday afternoon that Moody's Investors Service cut NSO Group's rating to Caa2, which is eight notches below what's considered investment grade. The company, Moody's says, faces a risk of default on approximately $500 million in debt. NSO Group's cash burn is expected to continue for the remainder of the year as it loses customers and as U.S. sanctions begin to bite. 

Tre Hester: Among the big accounts NSO Group has lost as revelations of the controversial use of its tools emerged was, MIT Technology Review reports, the government of France, which was nearing a decision to acquire the company's Pegasus intercept tool before it backed out. News that French politicians were among those on other nations' Pegasus target list did not help the company's sales. 

Dave Bittner: Universities find themselves in the crosshairs of ransomware operators. And given their size and the complexity of their mission, it's not surprising. Ric Longenecker is CISO at Open Systems, a provider of managed detection and response products. He joins us with insights on the challenges universities face. 

Ric Longenecker: You've got anywhere between one and, you know, 20,000 students on a campus, you know, people in an interesting time of their lives. I remember back in the day when I was in school, Napster, other things - right? 

Dave Bittner: Right. 

Ric Longenecker: So you can definitely have a lot of different things - research, you know, you could say hackers among the students, et cetera, activists, et cetera. So it just makes the whole bit with the university, you know, having a large student population quite interesting. 

Ric Longenecker: And at the same time, especially, you know, at many universities now, I mean, they're obviously a great source of IP innovation and other things all over the world. And so that makes things quite interesting and quite - from an IP perspective. 

Ric Longenecker: And if you look at the other end of things, they typically have, you know, reasonable endowments or sponsorship based on the university. So you also know that, you know, there's a bit of funding or a way to get something of value out of them monetarily. 

Ric Longenecker: And on top of it, they hate - they all hate bad publicity. Yeah. I mean, every university wants to maintain the absolute best publicity they can, best reputation. And so they're interested in handling problems, you could say, sometimes internally, right? You can even relate that to, you know, athletics, you know, athletic associations and other things. And cyber is the same, right? 

Dave Bittner: Yeah. 

Ric Longenecker: People want to know that their kids are going to a good university that handles security well. 

Ric Longenecker: And then from the other end, you know, many universities don't necessarily fund or have the IT teams or be able to recruit the right IT teams that you might be able to see at, you know, let's say a Fortune 500 or something. 

Ric Longenecker: Some countries and some places, you know, around the world - I've been a pretty global guy, worked for the U.N. for a while and things - you know, they actually have CERTs - you know, computer emergency response teams - for their universities. But especially - we see in the States we don't necessarily have that because we have so many universities, and it's so widespread across states. 

Ric Longenecker: So it really kind of represents a unique problem where you've got this pot of people on a campus that can present some very interesting challenges for an IT team, the need, you know, to maintain a great reputation, and then on the other end, maybe the IT teams that aren't necessarily capable of handling the problem and not a lot of, you could say, government or centralized guidance or support in order to actually manage and meet the problem. So it's kind of the perfect storm. 

Dave Bittner: It really strikes me that, you know, particularly for a large university, that it is - it's like a little city. I mean, they're providing housing and food and transportation, you know, heating and air conditioning, you know, all of the basics of everyday life. And within that complexity is a long list of potential targets. 

Ric Longenecker: Exactly. And it's - definitely has, like, an immediate potential to impact people, you know, if you talk about safety, other things and - fully operations. I mean, if you look at during COVID, almost every university had to transition, like every business, to completely virtual operations. 

Dave Bittner: Well, let's talk about some recommendations there for how universities can get in front of this. Talking about MDR specifically, is there a benefit for organizations who are taking advantage of managed detection and response that that MDR provider likely has a view into many organizations beyond their own so that sort of shared incoming information benefits everybody? 

Ric Longenecker: So with so many organizations presented, unless they invest millions, they can't actually set up an ops and fusion center, et cetera. You know, they just can't keep up with it. And so many organizations right now, including universities, just have a million security tools. I mean, there's a thousand new cyber startups in Israel this year. I think last year, the year before, somebody said - you know, there was a - different studies that are out - there's more than 9,000 cyber companies. 

Ric Longenecker: And just especially at, you could say, university perspective, there's the possibility to bring a lot of different options in, and that literally makes it difficult to provide focus and actually to dwell on what you actually need. And if you look at actual threat intelligence, you know, from a single university, to digest that, create that, et cetera, even if you take in feeds and make, you know, different integrations, whether it's STIX or MISP or something else - (unintelligible) every day. 

Ric Longenecker: And a lot of the U.N. agencies, just like academia, don't necessarily talk to each other, right? So when an incident happens in one place, it's not necessarily communicated in the other. And, you know, indicators of compromise, IOCs, are shared. But just like a lot of government and the education sector, it's not necessarily shared within a rapid period of time. 
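
The IOC sharing Longenecker describes typically moves over formats like STIX. As a minimal sketch of what consuming such a feed looks like - with an invented, illustrative STIX 2.1 bundle rather than a real feed - an organization can extract file-hash indicators from the bundle and match them against locally observed hashes:

```python
import json
import re

# A tiny, invented STIX 2.1 bundle for illustration. Real feeds would be
# fetched from a sharing platform such as a MISP instance or a TAXII server.
BUNDLE = json.dumps({
    "type": "bundle",
    "id": "bundle--00000000-0000-0000-0000-000000000000",
    "objects": [
        {
            "type": "indicator",
            "id": "indicator--00000000-0000-0000-0000-000000000001",
            "pattern": "[file:hashes.'SHA-256' = 'deadbeef']",
            "pattern_type": "stix",
        }
    ],
})

# Matches SHA-256 file-hash comparisons inside STIX indicator patterns.
HASH_RE = re.compile(r"file:hashes\.'SHA-256'\s*=\s*'([0-9a-f]+)'")

def extract_hashes(bundle_json: str) -> set:
    """Pull SHA-256 file-hash IOCs out of a STIX bundle's indicators."""
    bundle = json.loads(bundle_json)
    hashes = set()
    for obj in bundle.get("objects", []):
        if obj.get("type") == "indicator":
            hashes.update(HASH_RE.findall(obj.get("pattern", "")))
    return hashes

def match_local(observed: set, iocs: set) -> set:
    """Return locally observed hashes that appear in the shared IOC set."""
    return observed & iocs
```

The point of rapid sharing is precisely this step: the sooner one institution's indicators land in another's matching pipeline, the shorter the window an attacker has to reuse the same infrastructure.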

Ric Longenecker: And I even go back to my time there when I was in Geneva, Switzerland, and WannaCry, NotPetya was happening. At the time, we didn't necessarily have a CERT completely organized or a SOC, and so we relied on outsourced services. And actually, that's the continued way of actually working. As I mentioned before, it takes time to do this. 

Ric Longenecker: The education, you know, sector kind of has the same thing. If you have a provider that would work across a number of different sectors and/or partners with other companies - and that really, really can be beneficial to the team, too, as it has a lot of other things to deal with in their digital journey. 

Dave Bittner: That's Ric Longenecker from Open Systems. 

Dave Bittner: And joining me once again is Ben Yelin. He's from the University of Maryland Center for Health and Homeland Security and also my co-host over on the "Caveat" podcast. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Article caught my eye over on The Washington Post. This is written by Drew Harwell, and it's titled "Data Broker Shared Billions of Location Records with District During Pandemic." And the upshot of this is that thanks to a FOIA request, it was uncovered that there was a company who provided the government with location data - millions of pieces - or billions of pieces of location data - for the District of Columbia to use as part of their public health efforts during the pandemic. 

Dave Bittner: Here's my question. To what degree do our civil rights, do our privacy rights go out the window when we are in an emergency situation? I think we can agree, certainly, the onset of the pandemic was an emergency situation. 

Ben Yelin: Absolutely. 

Dave Bittner: So what happens then? 

Ben Yelin: Well, frankly, it's a major concern. At the federal level, there are reasonable constraints on emergency powers, depending on the circumstances. At the state level, just based on our constitutional system, states have the power to protect the health, safety and welfare of their citizens. And that certainly makes its way to emergency powers. 

Ben Yelin: I know, at least in Maryland and in almost all states, the governor's powers as it relates to emergency use are relatively limitless. Once there's a declared emergency, in most circumstances, the governor can control ingress and egress from a particular area. They can issue, you know, mandatory curfews, quarantine and isolation. They can suspend, in most states, any law or statute that is inhibiting the emergency response. 

Ben Yelin: And in some other circumstances - you know, I know this is true in Maryland - they can actually compel people who have experience in health care services - so doctors, nurses, et cetera - to get on the front lines against their will and participate in a public health response. 

Dave Bittner: Wow. 

Ben Yelin: So these powers are extremely broad. And there's been a concern with the COVID-19 pandemic. Yes, it's been a real-world emergency. You know, we've lost 700,000 people, so you can't minimize the impact of it. 

Dave Bittner: Right. 

Ben Yelin: But how long are we going to be under these emergency conditions? Most states are still under some version of a declared disaster, an emergency declaration, which gives their governments pretty broad powers. 

Ben Yelin: And, you know, taking a somewhat pessimistic view on this, we're probably going to be dealing with COVID in one way or another for at least the next several years. And, you know, there are going to be these cycles of uptick in case, and then people get boosters, and there's a downtick. But that means, you know, we might not see a cessation of these emergency declarations, and that will allow states and localities, like the District did here, to do things that might not be kosher in the absence of an emergency. So that's something that people have to be concerned about and be vigilant about. 

Dave Bittner: Well, let's talk about - I mean, this case in particular, we're talking about location data gathered by our mobile devices. 

Ben Yelin: Right. 

Dave Bittner: Could an argument be made that gathering this kind of data is less intrusive than setting up checkpoints? You know, like when you're trying to establish whether or not people are obeying things you put in place for social distancing, for staying at home, those sorts of things, if you can do that in a passive way, could that perhaps be less unsettling to the community than having people out on the streets with guns, you know, enforcing this sort of thing? Am I - is this at all a decent argument in your mind? 

Ben Yelin: It is a decent argument. You know, the flip side to that is it's not as effective of a tool 'cause there really isn't an enforcement mechanism. 

Dave Bittner: Right. 

Ben Yelin: You can get data on people's traveling habits. So, ah, looks like there are a lot of cellphones at this frat house at Georgetown. 

Dave Bittner: Right, right. On Saturday night, yeah. 

Ben Yelin: Looks like they're - yeah, they're not observing, you know, isolation and social distancing policies. 

Dave Bittner: OK. 

Ben Yelin: But, you know, that data is not - there are ways to de-anonymize it, as we know, but it is anonymized, at least in its raw form. And according to the Electronic Frontier Foundation, their researchers - they looked into this data; they were the one that submitted the FOIA request - there hasn't been an abuse - or there hasn't been alleged abuse from law enforcement in how they've handled this data. 

Ben Yelin: So it is less severe. It is less restrictive than having boots on the ground, sending in the National Guard, et cetera. Is this something that we would want to continue indefinitely? I think that's a separate question. But I - yeah, I think it's absolutely less intrusive than many of the other methods that were used or could be used to enforce public health measures. 

Dave Bittner: What would be the methods by which the state's ability to do these sorts of things in an emergency situation could be dialed back? 

Ben Yelin: There have been proposals that have passed in a limited number of states - I think they've probably been proposed in every single state - by the state legislatures to rein in governors' powers during an emergency. We saw this effort in Maryland. It went nowhere, but it was proposed. You know, there are probably a dozen or so bills that sought to curb the governor's emergency powers. So, for example, a governor could declare an emergency for 30 days, but after 30 days, it would have to be ratified by the state legislature in one form or another. Otherwise, that declaration would be discontinued. 

Dave Bittner: I see. 

Ben Yelin: Or just revising emergency powers to take some - you know, certain powers away from the governor. So maybe things like - you keep things like controlling egress and ingress from an affected disaster zone. Keep that, but do away with compulsory service for health workers. You can try and dial down some of the specific powers. 

Dave Bittner: But this would be a third rail that the feds would stay away from - right? - this - state powers. 

Ben Yelin: It is. Yeah. 

Dave Bittner: Yeah. 

Ben Yelin: I mean, it really is not the federal government's role. This is - the states, under our constitutional system, have primary responsibility in responding to emergencies. The federal government's role is quite limited. It's really, through the Stafford Act, you can request money for a disaster declaration. And FEMA, certainly, when we're talking about a multistate emergency, they, you know, play a role in coordination. And we're talking about COVID - you know, things like the CDC... 

Dave Bittner: Right. 

Ben Yelin: That comes into play. But in terms of emergency response and bringing the hammer down in terms of government regulations, that is really something that happens at the state level. Frankly, no matter who your governor is - I would say it's unlikely a governor is going to sign a bill, in most circumstances, limiting their own powers. 

Dave Bittner: Right, right. 

Ben Yelin: So, you know, that's the type of thing you're probably going to need a veto-proof majority for something like that. 

Dave Bittner: Yeah. All right. Well, interesting stuff, for sure. Ben Yelin, thanks for joining us. 

Ben Yelin: Thank you. 

Tre Hester: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com. And for professionals and cybersecurity leaders who want to stay abreast of this rapidly evolving field, sign up for CyberWire Pro. It will save you time and keep you informed. Also, listen for us on your Alexa smart speaker. 

Tre Hester: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Brandon Karpf, Puru Prakash, Justin Sabie, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe. And I'm Tre Hester, filling in for Dave Bittner. Thanks for listening, and we'll see you tomorrow.