The CyberWire Daily Podcast 4.7.23
Ep 1797 | 4.7.23

Stopping Cobalt Strike abuse. Leaks are mingled with disinformation. Google offers advice for board members. Securing cars and their garages. CISA releases ICS advisories.

Transcript

Dave Bittner: Preventing abuse of the Cobalt Strike pentesting tool. The US investigates a leak of sensitive documents related to the war in Ukraine. Hacktivist activity continues. Google's advice for boards. Electronic lockpicks for electronic locks. Nexx Security devices may have security flaws. Tesla employees reportedly shared images and videos from Teslas in the wild. Matt O'Neill from the US Secret Service discusses investment crypto scams. Our guest is James Campbell of Cado Security on the challenges of a cloud transition. And CISA releases seven ICS advisories.

Dave Bittner: From the CyberWire studios at DataTribe, I'm Dave Bittner, with your CyberWire summary for Friday, April 7, 2023.

Preventing abuse of the Cobalt Strike pentesting tool.

Dave Bittner: Cobalt Strike, a legitimate penetration testing tool, has often been abused by cyber criminals. Microsoft's Digital Crimes Unit, in collaboration with the cyber security company Fortra and the Health Information Sharing and Analysis Center (that's the Health-ISAC), is taking legal and technical measures to disrupt illicit versions of Cobalt Strike and abused Microsoft software. Microsoft says the cracked software has been used in more than 68 ransomware attacks targeting health care institutions around the world, which in Microsoft's words, "have cost hospital systems millions of dollars in recovery and repair costs, plus interruptions to critical patient care services, including delayed diagnostic, imaging, and laboratory results, canceled medical procedures, and delays in the delivery of chemotherapy treatments."

Dave Bittner: Microsoft stated, "On March 31, 2023, the US District Court for the Eastern District of New York issued a court order allowing Microsoft, Fortra, and Health-ISAC to disrupt the malicious infrastructure used by criminals to facilitate their attacks. Doing so enables us to notify relevant internet service providers and computer emergency readiness teams who assist in taking the infrastructure offline, effectively severing the connection between criminal operators and infected victim computers." In full disclosure, Microsoft is a CyberWire partner.

US investigates a leak of sensitive documents related to the war in Ukraine.

Dave Bittner: The New York Times reports that US authorities are investigating an apparent leak of sensitive information concerning plans for US support of Ukraine. The files have been circulated on Twitter and Telegram by Russian accounts. A significant fraction of the information seems genuine, although at least some of it could be inferred from publicly known open sources, and it is genuine enough to prompt an investigation. Other data, notably casualty estimates, appear to have been falsified in the Russian interest, with Russian casualties understated and Ukrainian casualties exaggerated. The leaked files thus seem to represent an admixture of genuine material and disinformation, which may be the principal point of their publication.

Hacktivist activity continues.

Dave Bittner: TechRepublic offers a summary of trends in Russian hacktivism. Finland has become a recent target, having become a member of NATO this week. And Anonymous Sudan has stepped up activity against Israel. The nominally Sudanese group appears to be acting in alignment with Russian interests, if not under actual Russian direction.

Google's advice for boards.

Dave Bittner: Google has released a report titled "Perspectives on Security for the Board," highlighting how corporate boards can best navigate cyber security and cyber risk. First, cyber risk should be viewed through the lens of business risk. Google references the National Institute of Standards and Technology's Cybersecurity Framework, which can be useful for boards in reasoning about cyber. The framework comprises five core functions: Identify, Protect, Detect, Respond, and Recover. Google also notes that it is imperative to understand the connection between threat intelligence and risk mitigation. To do this, Google advises boards to ask their CISOs three questions: How good are we at cyber security? How resilient are we? And what is our risk? Google also advises a bold and responsible approach to AI, saying that boards and CISOs should work together to securely scale and evolve their AI approaches. Overall, Google recommends getting up to speed, being engaged, and staying in the loop as sound practices for board members overseeing the management of cyber risk.

Electronic lockpicks for electronic locks.

Dave Bittner: Automobiles have a Controller Area Network bus, called the CAN bus, and that bus can be compromised. The technique requires physical access to the automobile. "Ian Tabor, an automotive security expert with EDAG Group, decided to do a forensic analysis to find out how his car was stolen," SecurityWeek reports. "He discovered that his headlight had been destroyed and the wires had been pulled out." The Register writes that Tabor investigated and found that various systems had seemingly failed or suffered faults. The faults were generated as the thieves broke into a front headlamp, tore out the wiring, and used those exposed connections to electrically access the CAN bus. He concluded that the thieves probably used a hacking device that exploits the car's Controller Area Network bus to inject false messages to open the doors and start the car.

Dave Bittner: You can buy the hacking hardware on online criminal markets. SecurityWeek writes that "such hacking devices can be acquired on dark web sites for up to $5,500," and they are often advertised as emergency start devices that can be used by vehicle owners who have lost their keys, or by automotive locksmiths. These devices seem to be specific to particular car makes, which limits the thief or locksmith who uses them to one brand of cars. For this method, car thieves still have to make physical contact with the car, and so experts recommend taking proper physical security measures.
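The attack described above works because classic CAN frames carry no sender authentication: any node that can put a frame on the wires can impersonate any other, which is why wiring exposed behind a headlamp is enough. As a rough illustration only, here is a Python sketch of how a classic CAN frame is laid out in the Linux SocketCAN format; the arbitration ID and payload below are hypothetical, not those of any real vehicle or theft device.

```python
import struct

# Linux SocketCAN classic frame layout: 32-bit CAN ID, 1-byte data
# length code (DLC), 3 bytes of padding, then up to 8 data bytes.
CAN_FRAME_FMT = "<IB3x8s"

def build_can_frame(arbitration_id: int, data: bytes) -> bytes:
    """Pack a classic CAN frame. Note that nothing in the frame
    identifies or authenticates the sender."""
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    return struct.pack(CAN_FRAME_FMT, arbitration_id, len(data),
                       data.ljust(8, b"\x00"))

# Hypothetical frame a thief's injector might emit to mimic, say, a
# "key validated" message. Real IDs and payloads vary by make and model.
frame = build_can_frame(0x0A5, b"\x01\xff")
print(len(frame))  # a classic SocketCAN frame packs to 16 bytes
```

On a real Linux system such a frame would be written to a raw CAN socket; the point of the sketch is only that the bus trusts any well-formed frame, which is exactly what the headlamp attack exploits.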

Nexx security devices may have security flaws.

Dave Bittner: When purchasing a smart security system, buyers assume that the security of the system itself is a given. There is always, however, an inherent risk in connecting security devices to the larger internet. And since we're talking about cars, here's a risk to the garages we park them in.

Sam Sabetan, an independent cyber security analyst working with CISA (the US Cybersecurity and Infrastructure Security Agency), posted on this issue, writing, "I discovered a series of critical vulnerabilities in Nexx's smart device product line, which encompasses smart garage door openers, alarms, and plugs. These vulnerabilities enabled remote attackers to open and close garage doors, take control of alarms, and switch smart plugs on and off for any customer. This is the last thing users would expect when installing a security device." Sabetan's blog explains the vulnerability, noting that Nexx servers failed to verify whether the bearer token in the authorization header corresponds to the device trying to connect. He further explains that the MAC address of each device is the same as the device's serial number, which means that an attacker can register an already-registered device and effectively take control of it. Nexx has so far not patched the vulnerability. Sabetan recommends that Nexx users deactivate their devices and write to the company requesting a fix.
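To make the reported flaw concrete, here is a hypothetical Python sketch of the authorization gap as described: the server checks that a bearer token is valid, but not that it is bound to the device it is being used against. All names, tokens, and device IDs here are invented for illustration; this is not Nexx's actual code or API.

```python
# Hypothetical model of the reported flaw: tokens are validated, but
# not bound to the device (identified here by a MAC-derived ID, which
# in the reported case doubled as the serial number).

VALID_TOKENS = {
    "token-alice": "device-AA:BB:CC",    # Alice's garage door opener
    "token-mallory": "device-11:22:33",  # Mallory's own device
}

def flawed_authorize(bearer_token: str, device_id: str) -> bool:
    # Flaw: any valid token grants access to *any* device.
    return bearer_token in VALID_TOKENS

def fixed_authorize(bearer_token: str, device_id: str) -> bool:
    # Fix: the token must be bound to the specific device it controls.
    return VALID_TOKENS.get(bearer_token) == device_id

# Mallory holds a valid token for her own device but targets Alice's.
print(flawed_authorize("token-mallory", "device-AA:BB:CC"))  # True: takeover
print(fixed_authorize("token-mallory", "device-AA:BB:CC"))   # False: denied
```

The design lesson is general: a bearer token proves who the caller is, and a separate check must confirm the caller is authorized for the specific resource being addressed.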

CISA releases seven ICS advisories.

Dave Bittner: There are some new industrial control system advisories out. Yesterday, CISA released seven ICS advisories affecting systems from JTEKT, Industrial Control Links, Korenix, mySCADA, Hitachi Energy, and Rockwell Automation. Users should take a look and apply the fixes and mitigations the vendors have on offer.

Tesla employees reportedly shared images and videos from Teslas in the wild.

Dave Bittner: And finally, back to cars, in this case, the privacy associated with using them. Several former Tesla employees admitted that they shared pictures and videos from cameras installed in Tesla electric vehicles from 2019 to 2022. As reported by Reuters on the 6th of April, these cameras are installed to enable driver safety features and automated driving. The media ranged from videos of naked Tesla owners walking to their cars to an image of a user's garage. Why one would approach one's car naked is not explained. Among the higher-profile images captured were shots of a James Bond submersible car, allegedly recorded inside Elon Musk's garage.

Dave Bittner: It's no secret, formally at least, that Teslas collect and report images. Tesla states in its Customer Privacy Notice, "We want to be very clear that in order for fleet learning camera recording to be shared with Tesla, your consent for data sharing is required and can be controlled through the vehicle's touch screen at any time. Even if you choose to opt in, the camera recordings are limited to 30 seconds and remain anonymous, ensuring it's not linked to you or your vehicle." Reuters reports that the computer program Tesla employees used at work could show the location of recordings, which would seem to provide less anonymity than customers might expect. Knowing how a company uses your data is important. And experts recommend that, as onerous as slogging through the documents may be, users should read terms of service and privacy notices. As the great American philosopher Mr. Tom Waits put it, "The large print giveth and the small print taketh away."

Dave Bittner: Coming up after the break, Matt O'Neill from the US Secret Service discusses investment crypto scams. Our guest is James Campbell of Cado Security on the challenges of a cloud transition. Stay with us.

Dave Bittner: James Campbell is CEO and co-founder of Cado Security, and a former GCHQ analyst. I spoke with him about cloud security, the challenges people face, and the things that can be overlooked during a cloud transition.

James Campbell: It's kind of a double-edged sword because the pace in which you can adopt technologies or people can take advantage of those new technologies in cloud tends to outpace the speed of which security considerations come into play. So a lot of organizations are, you know, playing catch up, so to say, when it comes to understanding the cloud, the complexities that it brings when it comes to the security side of things. So there's lots of, kind of, blind spots there. I guess the other thing to consider as well is that, you know, I don't think a lot of people come to the cloud thinking that their traditional approach to monitoring and security kind of transfers from the kind of on-premise, you know, traditional on-premise environments through cloud, but it doesn't necessarily translate in the same way. So, you know, there's a lot to learn there. And this is where, kind of, gaps start appearing when it comes to security and risk.

Dave Bittner: What are some of the things that folks typically overlook as part of this transition?

James Campbell: A good question, say, probably, two parts of this answer really or two examples I can give. So one is the understanding of the kind of shared responsibility model. Say, you know, some people come to the cloud, you know, taking into consideration or thinking at least that, you know, is the cloud provider's responsibility for security. But that's not necessarily the case. And I don't think a lot of people understand, you know, where does the line stop from a responsibility perspective. And what do they need to look after. You know, while a cloud provider has the responsibility to provide resiliency of the underlying services and infrastructure, ultimately, you know, the security of the environment you set up is your responsibility. And so, you know, I think a lot of people kind of get a little bit confused about where that gap lies, or where that line is, I should say. So it's good to really understand that. And I guess the second part is, you know, particularly with new technology. So, you know, cloud is amazing. So you move to the cloud to embrace the cloud, right? So you wouldn't just use it like an expensive data center where all your data is just sitting there on thousands of virtual machines. What you would probably look at, if you're fully embracing the cloud is, you know, the use of serverless technologies like, you know, containers and ephemeral infrastructure. Lambda functions as an example, just for AWS as one example. Auto-scaling groups for your virtual machines. So you're only using the resources that you need, which, ultimately, you know, for especially large enterprise, you're saving millions of, you know, dollars a year. But this comes with an added risk, which, you know, is one of the gaps, you know, that I mentioned where -- okay. So we're in the cloud, we have containers or virtual machines kind of spinning up and down as we need them. 
What happens if I had a detection for something suspicious on a virtual machine that's part of an auto scaling group? You know, what happens if that system is only alive for 15 minutes? You know, how is my team going to investigate that suspicious activity by the time that data gets recycled and deleted? So these are kind of the points or the pain points that people are starting to understand as they're on this -- you know, their understanding of the cloud is maturing.

Dave Bittner: That's a really fascinating insight. I'm curious, you know, from your perspective, you all do cloud incident response. So what are the specific challenges you face doing incident response in that environment?

James Campbell: I think, it's couple of things. So one is, you know, cloud can be a lot more complex than people think of it. A lot of things are available at your fingertips, which is great. But it can also mean, you know, a lot of customer environments, you know, tend to be, you know, a little wild west. They kind of shadow IT store where you have lots of, kind of, technical people with lots of tech knowledge at their fingertips. It's very easy to deploy systems, it's very easy to deploy new databases, etc, etc. And so keeping across all that is very difficult to do. And so if you don't have a tight grip on the IT store -- so a lot of customer environments tend to be, you know, hundreds of root accounts and hundreds of different services they didn't know even existed. And so navigating the complexities of that is quite hard, particularly, if you have, say, like a detection or you need to respond to something in the environment, you know, trying to find out where that asset is, who has access to it, and, you know, how do I get to that data so I can investigate what's going on as soon as possible. It's really, really hard. And that's kind of one of the problems we try and help customers with, is how do you kind of automate that. So how do you embrace the cloud in a way to solve the problem as well, through automation, and, you know, and take away some of those complexities? So that's definitely a big one there. And the other big one there is definitely around containers and ephemeral infrastructure. So, you know, your assets spinning up and down, terminating, recycling, all that data is kind of, you know, churning very quickly. And so how do you get a [inaudible], you know, as part of that automation story? How do you -- how do you make sure you're retaining that data or the useful bits so that, you know, when you need it most, especially if it's only, you know, alive for 15 minutes at a time?

Dave Bittner: What are your recommendations then, you know, for folks to really, to operate most efficiently, most effectively, and best to defend their cloud infrastructure? Any general words of wisdom?

James Campbell: I think it's a big -- it's a big question, really. There's lots --

Dave Bittner: Yeah.

James Campbell: There's lots you can do, obviously. But I think it's really, truly getting to grips with your cloud estate. So truly understanding, you know, the assets or, you know, technologies you're using, your cloud estate, and being aware of, you know, the controls you have in place. Also, you know, you need to ask yourself as well, you know, just having kind of your high-level detections or, you know, kind of visibility from that perspective is one thing. But what do you do next if something happens? Like, how do you gain access to that data? How do you actually investigate? What, you know, mitigating control do I need to put in place? And I think, you know, a lot of people tend to stop at the point of saying, "Right. I have my data in the cloud, my cloud is set up. I have my policies in place. I have asset visibility." You know, and then something happens. And then quite quickly, they're in a world of trouble where, you know, they're trying to contract out a third party who knows about how to deal with cloud incidents, etc. You know, I think a lot of people really have got to exercise and understand the full -- I guess go through the motions of the attack lifecycle, so to say, all the way from kind of that preparation to, you know, the exploitation and what you do to mitigate such things. And that's all.

Dave Bittner: That's James Campbell from Cado Security.

Dave Bittner: There's a lot more to this conversation. If you want to hear more, head on over to the CyberWire Pro and sign up for interview selects where you'll get access to this and many more extended interviews.

Dave Bittner: And I'm pleased to welcome back to the show, Matt O'Neill. He is Deputy Special Agent in Charge for Cyber with the US Secret Service. Matt, welcome back.

Matt O'Neill: Thank you for having me.

Dave Bittner: So I want to touch today on something that I know is a focus for you and your colleagues. These are investment crypto scams, sometimes referred to as pig butchering. I know that's not your favorite term here, but that's what some people refer to it as. What exactly are we talking about here? Can you describe it for us?

Matt O'Neill: Sure. So what will happen is victims will be contacted largely through social media, dating sites, sometimes even just text messages. And they will try to get the victim to invest in this out-of-this-world crypto investment scheme. And they'll provide you with really good sort of graphical interfaces to show you how much money you're making. Now, keep in mind, you don't know this person at all; you've never met them. But the ROI for a lot of these investments appears to be so good that it's tempting to a lot of individuals. And I think there's a lack of knowledge, generally speaking, about cryptocurrency and how it works. But people read the news and see that, oh, this person made a ton of money in crypto, and that encourages folks to invest. And so what will happen is, once you start investing a little bit, you'll get access to a platform. It'll look like a traditional investment account. And then, of course, it's going to look like you're making a lot of money. So that's going to encourage you to spend more money, right, invest more money. And then the idea of pig butchering is you get them to invest all of their money, and then you basically pull the rug out from under them. And that's where the butchering sort of comes in.

Dave Bittner: How does something like this begin? Does this typically start as a romance scam or a friendship? This is more of a long-game kind of thing, right?

Matt O'Neill: Sure. So it will start either through, like you mentioned, romance scams, or just through social media platforms, whether it's Instagram, LinkedIn, Facebook, you name it. It could come through there. A lot of times, it's important to know that we have a high degree of self-disclosure. We tell everybody everything about what we're doing on a daily basis. For them to see that and leverage that against you gives you this sort of feeling of, "Oh, well, this person knows me, I'm comfortable." They're just taking advantage of the information you've already provided to them to develop what you believe to be a relationship, whether it's a professional or personal relationship. And that will sort of lower your guard a little bit, so you invest in whatever they're asking for. It's key to know, if you are involved -- and the statistics are off the charts for the amount of reported fraud, and we actually think it's underreported; it was over $2 billion last year -- the first thing to do is contact law enforcement: federal law enforcement, us, the Secret Service, the FBI, just contact somebody. Report it to the IC3 website. But also, if you've invested money already and you ask to withdraw your money, and they say, well, you have to pay a fee, you're not going to get your money back, and you're just going to give them more of your money. So that's something that is hard to hear. But the reality of these types of cases is they have already spent your money. They've moved it on. These are transnational organized criminal groups. And there's no real meaningful way for you to get your money back. The best scenario is to contact your federal law enforcement agency to try to at least get involved in the process to try to disrupt and dismantle these organizations.

Dave Bittner: I was going to ask, I mean, what -- if someone reaches out to you, they find that they've been a victim of something like this, what sort of resources do you bring to bear to try to help?

Matt O'Neill: So we work with the Department of Justice. We work with -- we're a global organization, so we will work with our foreign offices and our foreign partners through groups like Europol to try to affect arrests, but also asset forfeiture. For a couple years, I ran our asset forfeiture branch, which in my view, is one of the best in the world at recovering funds for victims. And so depending on where the money is currently sitting, if it is at a virtual assets service provider, like a cryptocurrency exchange that honors legal process, there is a slight chance, but there is a chance that the money could be frozen, and then returned through the asset forfeiture process back to the victim.

Dave Bittner: What is the message that you want to get out here for folks? I mean, our audience, I think, probably, considers themselves fairly sophisticated. But then there's always friends, and family, and relatives, and folks who may not be so sophisticated. Is this largely an educational process too, of letting people know these things are out there and to warn them of it?

Matt O'Neill: Yes, I believe so. I think the key components are, first, never invest with a complete stranger. Make sure that you've actually met the person in person before you start investing. Do some research yourself into whatever cryptocurrency it is that they claim to be investing in. All that information is readily available. Do your own research. And if and when you're currently in one of these situations, and you're trying to get your money back, and they're asking for additional money, do not send them additional money. Please reach out to your local law enforcement, federal law enforcement, and get us involved as quickly as possible. We might not be able to get your money back, but we're trying to build out, sort of, the larger ecosystem of where these fraudsters are, and any piece of information is very helpful.

Dave Bittner: All right. Well, Matt O'Neill is Deputy Special Agent in Charge for Cyber with the US Secret Service. Matt, thanks so much for joining us.

Dave Bittner: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com. Be sure to check out this weekend's Research Saturday and my conversation with Sahar Abdelnabi from the CISPA Helmholtz Center for Information Security. We're discussing their work, "A Comprehensive Analysis of Novel Prompt Injection Threats to Application-Integrated Large Language Models." That's Research Saturday. Check it out. The CyberWire Podcast is a production of N2K Networks. Proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cyber security teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Tre Hester, Brandon Karpf, Eliana White, Puru Prakash, Liz Irvin, Rachel Gelfand, Tim Nodar, Joe Carrigan, Carole Theriault, Maria Varmazis, Jason Cole, Ben Yelin, Nick Veliky, Milly Lardy, Gina Johnson, Bennett Moe, Catherine Murphy, Janene Daly, Jim Hoscheit, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe, Simone Petrella, and I'm Dave Bittner. Thanks for listening. We'll see you back here next week.