The CyberWire Daily Podcast 10.3.23
Ep 1919 | 10.3.23

Where ICS touches the Internet. BunnyLoader traded in C2C markets. Phantom Hacker scams. API risks. Cybersecurity attitudes and behavior. DHS IG reports on two cyber issues. Updates on the hybrid war.


Tre Hester: Nearly 100,000 ICS services exposed to the Internet. BunnyLoader in the C2C market. Phantom Hacker scams. API risks. Cybersecurity attitudes and behaviors. Homeland Security IG finds flaws in TSA pipeline security programs, and privacy issues with CBP, ICE, and USSS use of commercial telemetry. Kyiv prepares for Russian attacks on Ukraine's power grid. Ben Yelin on the Department of Commerce placing guardrails on semiconductor companies. As part of our sponsored Industry Voices segment, Dave Bittner sits down with Nick Ascoli, Founder and CTO at Foretrace, to discuss the last year in data leaks. And Russian disinformation is expected to aim at undermining US support for Ukraine.

Tre Hester: I’m Tré Hester, filling in for Dave Bittner, with your CyberWire intel briefing for Tuesday, October 3, 2023.

Nearly 100,000 ICS services exposed to the Internet.

Tre Hester: BitSight has identified nearly 100,000 industrial control systems exposed to the Internet, particularly in the education, technology, government and politics, and business sectors. The researchers note, however, that overall there’s been a steady decline in Internet-exposed ICS services since 2019. So in some respects this is actually a good-news story.

Tre Hester: BitSight adds, “Exposed systems and devices communicating via the Modbus and S7 protocols are more common in June 2023 than before, with the former increasing in prevalence from 2020 and the latter more recently from mid-2022. However, exposed industrial control systems communicating via Niagara Fox have been trending downward since roughly 2021. Organizations should be aware of these changes in prevalence to inform their OT/ICS security strategies.”
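The exposure BitSight is measuring can be spot-checked from the defender's side. The sketch below is only an illustration of the idea, not BitSight's methodology: it tests whether a host accepts TCP connections on port 502, the conventional Modbus/TCP port, since a listener on a publicly routable address is a strong hint that an ICS service is Internet-exposed. The host address in the usage line is a placeholder.

```python
import socket

def modbus_port_open(host: str, port: int = 502, timeout: float = 3.0) -> bool:
    """Return True if the host accepts TCP connections on the Modbus port.

    Modbus/TCP conventionally listens on port 502; a reachable listener
    on a public address suggests the ICS service is Internet-exposed.
    """
    try:
        # create_connection handles DNS resolution and applies the timeout
        # to the connect attempt; refused or timed-out connects raise OSError.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Only probe hosts you own or are explicitly authorized to test.
    print(modbus_port_open("127.0.0.1", timeout=1.0))
```

The same check generalizes to the other protocols BitSight mentions by swapping the port (for example, 102 for S7, 1911/4911 for Niagara Fox), though a full survey would also need protocol-level banner checks to rule out unrelated services on those ports.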

BunnyLoader in the C2C market.

Tre Hester: Zscaler is tracking a new malware-as-a-service offering called “BunnyLoader” that’s being sold on underground forums for a one-time price of $250. The malware “is designed to steal information related to web browsers, cryptocurrency wallets, VPNs and much more.” BunnyLoader targets cryptocurrency wallets for Bitcoin, Monero, Ethereum, Litecoin, Dogecoin, ZCash, and Tether. The researchers note that the malware “has been under rapid development” since its initial release on September 4th.

Phantom Hacker scams.

Tre Hester: The US Federal Bureau of Investigation (FBI) has warned of an increase in “Phantom Hacker” scams targeting senior citizens. “This Phantom Hacker scam is an evolution of more general tech support scams, layering imposter tech support, financial institution, and government personas to enhance the trust victims place in the scammers and identify the most lucrative accounts to target. Victims often suffer the loss of entire banking, savings, retirement, or investment accounts under the guise of ‘protecting’ their assets.” The Bureau says victims have lost over $542 million to tech support scams in the first half of 2023, with 66% of these losses from victims over 60 years old.

API risks.

Tre Hester: BreachLock has published an article for The Hacker News looking at cybersecurity risks associated with APIs: “2023 reports indicate cyberattacks targeting APIs have jumped 137%, with healthcare and manufacturing seen as prime targets by attackers. Attackers are especially interested in the recent influx of new devices under the Internet of Medical Things and associated apps and API ecosystem that has supported the provision of more accessible patient care and services. Another industry that is also vulnerable is manufacturing, which has experienced an increase in IoT devices and systems, leading to a 76% increase in malware attacks in 2022.”

Cybersecurity attitudes and behaviors.

Tre Hester: The National Cybersecurity Alliance (NCA) and CybSafe have published a report looking at cybersecurity behaviors around the world. In the United States, the researchers found, “A significant majority (79%) now recognize Multi-Factor Authentication (MFA) and, encouragingly, 70% within this group are actively using it to enhance their online security on a regular basis. However, despite these positive trends, there are concerns about access to adequate training; based on the survey only 44% of participants in the United States reported having access to cybersecurity training programs.”

Homeland Security IG finds flaws in TSA pipeline security regulations.

Tre Hester: A redacted version of a report by the Office of the Inspector General at the Department of Homeland Security has been released. The IG was looking into the Transportation Security Administration’s (that’s TSA’s) formulation and enforcement of pipeline security regulations after the May 2021 ransomware attack against Colonial Pipeline.

Tre Hester: TSA responded with two regulations:

  • The first, Security Directive Pipeline–2021–01, “Enhancing Pipeline Cybersecurity (SD-01),” issued on May 26th, 2021, required operators of “critical” pipelines–those that carry hazardous fluids and natural gas–to designate a cybersecurity coordinator, report cyber incidents, and conduct a vulnerability assessment.

  • The second regulation, Security Directive Pipeline–2021–02, “Pipeline Cybersecurity Mitigation Actions, Contingency Planning, and Testing,” issued on July 19th of that year, required owners and operators of pipelines designated as “critical” “to implement additional and immediately needed cybersecurity measures to prevent disruption and degradation to their infrastructure in response to an ongoing threat.”

Tre Hester: The issue is in the oversight. The IG found that TSA, while it properly worked with stakeholders to develop the rules, didn’t effectively follow up to track compliance. The IG made three recommendations, all of them procedural enhancements designed to ensure proper oversight of operator compliance. TSA has concurred with the IG’s report and its recommendations, and so improvements are expected to be on the way.

DHS IG also finds privacy issues with CBP, ICE, and USSS use of commercial telemetry.

Tre Hester: Another Homeland Security Inspector General (IG) report found that three of the Department's agencies--Customs and Border Protection, Immigration and Customs Enforcement, and the Secret Service--"did not adhere to Department privacy policies or develop sufficient policies before procuring and using commercial telemetry data." The data the agencies purchased included mobile device geolocation information, and the IG found that they hadn't prepared to preserve the privacy of the individuals whose data they purchased.

Kyiv prepares for Russian attacks on Ukraine's power grid.

Tre Hester: Ukraine is preparing for winter attacks against its energy infrastructure, the Economist reports, a reprise of last winter's Russian counter-grid program. That program was dominated by kinetic attacks, and Ukraine expects more of the same over the coming months, but it's also working to increase its resilience in the face of cyberattacks against power generation and distribution, as these are also expected. Cyberattacks haven’t had much effect since the invasion in February of 2022, but Kyiv isn’t prepared to relax its guard.

Russian disinformation expected to aim at undermining US support for Ukraine.

Tre Hester: And, finally, as the 2024 elections approach, the US Intelligence Community expects Russia to mount influence operations directed against US support for Ukraine. 

Tre Hester: The New York Times reports that Russian disinformation about NATO in general and the US and UK in particular–”Anglosaxonia,” in the Kremlin demonology–has been common during the war, but the next round of influence operations is thought likely to be directly disruptive in concept. The US elections next year are expected to be targeted, with Russian operators seeking to support candidates unsympathetic to Ukraine, and to denigrate candidates who favor continued US support of Kyiv. Heavy use of influence washing and troll farms, directed by Russian intelligence services, is expected.

Tre Hester: As the signs say around the Aquarium’s libraries, “officers must wash information before returning online.” Or something like that.

Tre Hester: Coming up after the break, Ben Yelin on the Department of Commerce placing guardrails on semiconductor companies. And as part of our sponsored "Industry Voices" segment, Dave Bittner sits down with Nick Ascoli, Founder and CTO at Foretrace, to discuss the last year in data leaks. Stick around. [ Music ]

Dave Bittner: Nick Ascoli is Founder and Chief Technical Officer at Foretrace, whose offerings include an external attack surface management platform. In this sponsored "Industry Voices" segment, I ask Nick Ascoli to explain the differences between a data leak and a data breach and why that difference matters.

Nick Ascoli: There is a fundamental difference between a data leak and a data breach. While the outcome is -- you know, overwhelmingly the outcome is the same, which is that an unauthorized third party has access to your data. The difference between a leak and a breach is a leak is basically when sensitive data is exposed publicly and accessible to the unauthorized third party. A breach is a successful attempt to steal sensitive data from, you know, an organization's digital infrastructure. Now, that's not -- that's not Webster's definition of a data leak or a data breach, but that's the definition I go on generally. So a common data leak scenario, the ones especially that we've seen in the last year are like misconfigured web applications, a file system being made public, an API vulnerability that enables the accessing of data that's not intended for the user. A breach scenario are the ones we're familiar with and see, you know, very, very often in some of the larger, you know, more notorious data breach news stories, which is an internal compromise, you know, lateral movement and exfiltration, using, you know, complex post-exploitation frameworks. So the common root of a leak is an accident, usually, procedural or technical oversight. Occasionally, it's malicious. But in a breach scenario, it's overwhelmingly malicious. So that's the fundamental difference.

Dave Bittner: For the folks that you work with who are having success in preventing this, what are the common elements, the things that people put in place to protect themselves?

Nick Ascoli: Pathway to success for ensuring your data isn't present in a data leak is really just doing exactly what the adversary does yourself, which, you know, as our job, that's exactly what I do, is looking for customers' content in -- in the public, really agnostic of source. A lot of people try and take the approach of sort of grouping in data leak detection with third-party risk in the sense that you might monitor or look at the public footprint of third parties, but the reality is, you know, third parties use third parties who use fourth parties. There's an infinite, you know, list of parties involved in the handling of any one organization's or any one application's data. So what organizations should be doing is looking for their data completely agnostic of source. That is, checking public, you know, open indexed forms of data wherever they lie, which, you know, encompasses a truly wide variety of sources where data gets published online, whether it's by applications or by people, and looking for your data within those, because if you narrow your scope to where you think your data is, you know, odds are, your data is in a lot more places than you think, which is sort of the nature of that fourth-party risk phrase, is that, you know, your third parties have third parties, too, and your data is changing hands a lot. So to really find it in the wild, you have to kind of be agnostic of where you think it is and look for it where it actually lies.

Dave Bittner: What are those conversations like for you? I mean, when you present to someone and say, "Look, this is what we found, these are the things," is there generally surprise, the degree to which things are out there?

Nick Ascoli: Yeah, I mean, there's certainly been meetings where, you know, the meeting has had to be cut short for -- it turned into a fire drill right away, something was out there that wasn't supposed to be, and that's an issue, but overwhelmingly, what we look for, you know, we're monitoring continuously, so these end up -- our sort of notifications of, you know, us finding an email and password on a, you know, on a -- one of the many public Git sites or a token present in a public Google Doc for some reason or, you know, any number of leaks, things that we're looking for, these become sort of triaged like a normal internal incident would and baked into the sort of fabric of security operations, which is something that, you know, we've pushed for, for a long time, is weaving the sort of fabric of external reconnaissance and adversaries' techniques for reconnaissance into traditional security operations such that the response can either be automated with a SOAR in the case that it can, or is triaged by the internal security team and managed the way it should be. These incidents usually involve a little bit more, you know, potential legal or PR consideration due to their public nature, but usually, the remediation still falls in the hands of the security team. But there is, to your point, there's a lot of surprise.
There's really no shortage of findings that we end up coming up with of data that the customer truly could not have predicted ended up there and that's because the handle that an organization tries to get on where their data is going via, you know, subsidiaries, vendors, partners, consultants, their sort of known register of people who have their data, often ends up looking a lot different in reality, and places that their data ends up, while, you know, they would seem innocuous, like a developer using GitHub, even though, you know, the organization is a Bitbucket shop, one misconfiguration of a repo making that proprietary code with hard-coded stuff in it public, which is an example we do see a lot, can have dramatic consequences despite it being, you know, one person engaging in a single shadow-IT instance. So there are a lot of surprises, definitely.

Dave Bittner: What are your recommendations for organizations who want to do a better job with this, who want to start down this path of getting a handle here? How should they begin?

Nick Ascoli: I think, starting from scratch, you should be looking at your external footprint through the lens of an adversary to the extent that you can, and there's a lot you can do without making an investment up front, like rotating, if you're an enterprise, rotating defenders to search for this kind of data by hand, and I'm talking literally running Google Dorks, you know, on some schedule, querying Shodan yourself, querying, you know, looking on the Git sites for your code showing up, maybe perusing or, you know, having an experienced OSINT professional peruse criminal forums and marketplaces for the presence of your data to understand where it exists online. But, you know, do this by hand to understand the scale that you're dealing with, and then to the extent that you can, automate it and look into tooling that can automate it for you to get ahead of these issues, you know, otherwise, it's something that will pop up, you know, you'll get the sort of reconnaissance pages of your pen test report and that will be your picture of the outside, but the issue is, that's a snapshot. So having defenders, rotating defenders, or offensive personnel, if you have it, doing this continuously enables you to be, A, much better prepared for those findings and, B, hopefully getting in front of those findings so that you don't find out six months later that this service was misconfigured and facing the public, but you find out, you know, when it goes online.
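Nick's "run the dorks yourself, on a schedule" advice can be sketched in a few lines. The query templates below are illustrative assumptions only, not a vetted or exhaustive list; the point is that a defender can generate and rotate these searches for their own domain by hand before investing in tooling.

```python
# Minimal sketch of generating rotating "dork"-style queries for one domain.
# The templates are examples, not an official or complete set.
DORK_TEMPLATES = [
    'site:pastebin.com "{domain}"',          # data pasted publicly
    'site:github.com "{domain}" password',   # secrets in public repos
    'filetype:env "{domain}"',               # leaked .env configuration files
    'intitle:"index of" "{domain}"',         # open directory listings
]

def build_dorks(domain: str) -> list[str]:
    """Expand the query templates for one organization's domain."""
    return [template.format(domain=domain) for template in DORK_TEMPLATES]

if __name__ == "__main__":
    # A defender would run these queries manually on a schedule,
    # then automate whichever ones surface real findings.
    for query in build_dorks("example.com"):
        print(query)
```

The same pattern extends to Shodan queries or Git-hosting code searches; the value, as Nick says, is running the set continuously rather than relying on a pen-test snapshot.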

Dave Bittner: That's Nick Ascoli, Founder and Chief Technical Officer at Foretrace. [ Music ] And joining me once again, is Ben Yelin. He is from the University of Maryland Center for Health and Homeland Security and also my co-host on the "Caveat" podcast. Hey, Ben.

Ben Yelin: Hey, how are you, Dave?

Dave Bittner: I'm doing well. Thanks. Interesting article here from The Record, which is Recorded Future's news organization. This is written by Martin Matishak, and it's about guardrails that the folks at the Department of Commerce have put on semiconductor companies in the effort to increase national security here. What's going on here, Ben?

Ben Yelin: So last year, Congress enacted a bill called the "CHIPS and Science Act," and this was a bill, a bipartisan bill to boost domestic semiconductor manufacturing. It was kind of a -- considered a really big legislative accomplishment. This is something that's going to be good for our economic development and to be a leader in the semiconductor field.

Dave Bittner: And take away some of our dependence on other nations, and I suppose specifically China, for the manufacturing of a lot of our semiconductors?

Ben Yelin: Yeah, yeah, that's actually one of the reasons they passed this legislation, is so that the United States can be that counterweight to China in advancing this type of computing technology. So in that spirit, the U.S. Commerce Department has released their national security guardrails for any business that's seeking federal funding under this legislation. Basically, the regulation would prohibit companies that are receiving funding under this bill from, quote, expanding material semiconductor manufacturing capacity in foreign countries of concern, and those foreign countries, namely, are China and Russia, and that would be applicable for a period of 10 years.

Dave Bittner: Okay.

Ben Yelin: I think there are kind of two ways to look at it. One is that this is kind of a protectionist measure that is intended to boost U.S. industries. We don't want any of the funding, even in an indirect way, to go to Chinese and Russian entities. Now, a classic economist might tell you that these types of protectionist measures end up hurting us all in the long run. I'm not somebody who tends to think that way, so I understand why, especially given the goal of the legislation, which was to boost U.S. manufacturing, that you'd need these national security guardrails. And then there's just the general national security concerns. I mean, semiconductors are going to be a part of our critical infrastructure, having these types of chips. These chips are going to fuel things that we need to live and survive and to secure our country.

Dave Bittner: Right.

Ben Yelin: And putting any money in the hands of entities controlled by our foreign adversaries certainly presents some of those long-term risks that we would really like to avoid, so I certainly understand it from that perspective.

Dave Bittner: It also points out that it restricts them from engaging in certain joint research or technology licensing efforts. What does that address here?

Ben Yelin: So I just think it would be like going in on a contract together, so you have like a U.S. company who's bidding on money that's being released under this bill. If they were to go in on a bid with a Chinese or Russian company, that would generally be prohibited under these regulations so that we're fulfilling the goal of the bill, which is to boost domestic manufacturing. You don't want a tiny U.S. company that's -- and granted, this is an absurd example, but --

Dave Bittner: Yeah.

Ben Yelin: You don't want a tiny U.S. company that's just going to do like the grants management and then all of the actual semiconductor production goes to a Chinese company.

Dave Bittner: I see.

Ben Yelin: So I think they're trying to limit those types of partnerships, and there is an enforcement mechanism. Basically, if you are found to be violating these guardrails, then you would have your own federal dollars revoked, and I don't think any company wants to see that happen.

Dave Bittner: Yeah. Do you suspect that this is going to cause a lot of heartache here, or these seem to be reasonable restrictions?

Ben Yelin: I don't think these are necessarily surprising. You might have some type of deleterious effect on the industry just because prior to this point China, in particular, has been such a leader in this field, so you might be relinquishing some of your access to institutional expertise by having something like this, but I just think it's still prudent for a couple of reasons. One, the purpose of the bill was increasing domestic manufacturing of these chips and, two, I think we just have to recognize the major national security implications. We don't want to be beholden to some of these foreign countries, so I think you -- any sort of negative effects that would come from these types of regulations are outweighed by the national security imperative here.

Dave Bittner: All right. Interesting stuff. Ben Yelin, thanks for joining us.

Ben Yelin: Thank you. [ Music ]

Tre Hester: And that's the CyberWire. For links to all of today's stories, check out our daily briefings at We'd love to know what you think of this podcast. You can email us at Your feedback helps us ensure we're delivering the information and insights that keep you a step ahead in the rapidly changing world of cybersecurity. This episode was produced by Liz Irvin and Senior Producer Jennifer Eiben. Our mixer is me, with original music by Elliott Peltzman. The show is written by our editorial staff. Our executive editor is Peter Kilpe, and I'm Tre Hester filling in for Dave Bittner. Thanks for listening. We'll see you back here tomorrow. [ Music ]