The CyberWire Daily Podcast 7.25.22
Ep 1626 | 7.25.22

The minor mystery of GPS-jamming. Twitter investigates apparent data breach. Ransomware C2 staging discovered. A C2C offering restricted to potential privateers.


Dave Bittner: The minor mystery of GPS jamming. Twitter investigates an apparent data breach. Ransomware command and control staging is discovered. Andrea Little Limbago from Interos looks at the intersection of social sciences and cyber. Our guest is Nelly Porter from Google Cloud on the emerging idea of confidential computing. And a C2C offering is restricted to potential privateers.

Dave Bittner: From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Monday, July 25, 2022. 

The minor mystery of GPS-jamming.

Dave Bittner: Russian electronic warfare hasn't been particularly aggressive in jamming GPS. C4ISR reviews the potential explanations, and they closely parallel the reasons why Russian offensive cyber operations have been similarly restrained. The first possibility is that just maybe Russian electronic warfare isn't as good as everyone thought it was. Other Russian capabilities have been overestimated, and there may have been a tendency to exaggerate Russian electronic warfare prowess as well, the thinking goes. Maybe, but on the other hand, Russia has shown an ability to jam GPS signals - in Norway, for example - or spoof them - in the Black Sea, for example. It's not like the ability to maneuver armored units against opposition. If you can jam in peacetime, there's no obvious reason you can't jam in wartime. 

Dave Bittner: Other explanations seem likelier. Or there's this. Russian forces themselves use GPS, and they don't want to deny their own access to the system in the theater of operations. Russia does have both Glonass, a domestic alternative to GPS, and Chaika, a terrestrial navigation system roughly equivalent to the American LORAN. But these are not as widely used. GPS receivers are cheap and ubiquitous, and many Russian units use them. Almost every smartphone has GPS. Very few, if any, use Glonass. GPS is everywhere. So this seems possible. 

Dave Bittner: Or maybe Russian EW operators - or, more properly, their commanders - are concerned about the ease with which their jammers could be located, targeted and destroyed. The assets are valuable, and they have to be husbanded for a time when they're really needed. Another possibility - Ukraine's stockpile of Soviet-era weapons isn't dependent upon GPS, and so GPS jamming won't affect it. Of course, Ukrainian forces are just as likely to use GPS receivers as Russian forces are, and systems they've recently received from NATO use GPS. So this possibility seems unlikely. Or, finally, perhaps Russia is pulling its punches, holding its full capabilities in reserve against possible use against the main enemy, which would be NATO. In any case, the question has an interesting symmetry with the question about why Russian offensive cyber operations have been more limited, less destructive in their effects than had been expected. 

Twitter investigates apparent data breach.

Dave Bittner: Twitter is looking into the possibility that data from a breach are now being posted on the dark web. Restore Privacy traces the incident to a report submitted through HackerOne back in January of a vulnerability that had the potential to expose user information even when that information was hidden in privacy settings. Twitter closed the vulnerability and paid the researchers who reported it a bug bounty. But it appears possible that the vulnerability had been exploited to collect a very large tranche of user data. Restore Privacy says that some of the data released as a teaser are authentic and that the criminal who holds them, who goes by the hacker name Devil, is offering the database for sale. Bidding starts at $30,000. 

Dave Bittner: 9to5Mac sees the principal risk in the compromised data as more plausible, more effective phishing campaigns. Twitter told The Record that it's investigating, but their comments focused principally on the January vulnerability disclosure. A Twitter spokesperson said, we received a report of this incident several months ago through our bug bounty program, immediately investigated thoroughly and fixed the vulnerability. As always, we are committed to protecting the privacy and security of the people who use Twitter. The spokesperson went on to say, we are grateful to the security community who engages in our bug bounty program to help us identify potential vulnerabilities such as this. We are reviewing the latest data to verify the authenticity of the claims and ensure the security of the accounts in question. 

Ransomware C2 staging discovered.

Dave Bittner: Censys reports finding a criminal ransomware operation that's being staged. And the discovery comes before actual attacks appear to have been carried out. The gang involved is Russian. Some of the attack infrastructure, the researchers say, has been put in place in the U.S. According to the report, Censys located a host in Ohio, also possessing the Deimos C2 tool discovered on the initial Russian host, and, leveraging historical analysis, discovered that the Ohio host possessed a malware package with software similarities to the Russian ransomware hosts. The Record points out that Censys duly acknowledges the role CISA played in the discovery. The Record reports, part of how Censys was able to tie the hosts to MedusaLocker was from a Cybersecurity and Infrastructure Security Agency report released three weeks ago that spotlighted the ransomware group and provided email addresses, IP addresses and TOR addresses that the group uses. 

A C2C offering restricted to potential privateers. 

Dave Bittner: And finally, there are special offers in the underground markets, too. Sometimes it's like a membership club, a little restrictive maybe, in ways that might not pass legal muster in most jurisdictions. But then the writ runs differently in the C2C underworld. SecurityWeek reports that Luna ransomware is available only to Russian-speaking cybercriminals. Luna is a cross-platform capable attack tool, coded in Rust, that's landed with some eclat recently in the criminal-to-criminal markets. It's being offered only to Russophone affiliates, presumably because of their suitability as privateers. Are you a criminal speaking a different language? Sorry. Go take your trade elsewhere. 

Dave Bittner: Nelly Porter is group product manager for Google Cloud, where she and her colleagues have been contributing to efforts to enable and implement confidential computing. 

Nelly Porter: Confidential computing is one of the tools to protect customers' data in the cloud and everywhere else. It's one of the privacy-preserving techniques, I would say, where hardware assists us in providing cryptographic isolation in addition to the normal isolations that we usually have to protect our tenants among themselves and to protect our tenants against the cloud provider itself. 

Dave Bittner: And how does that work from a practical point of view? What exactly is going on behind the scenes? 

Nelly Porter: So behind the scenes, CPUs like AMD CPUs and Intel CPUs provide specific instructions that allow them, very quickly and very efficiently, to encrypt the memory of your environment - your trusted execution environment. And by protecting memory, we ensure that all sensitive data - which is anything that you don't want anyone else to see - will always be ciphertext when you're looking from the outside in. But when you're running your application, your workload, your (inaudible), you will see everything without any changes. And this magic happens because those CPUs with those specific extensions not only encrypt memory but very efficiently decrypt memory when it's coming into the cache line. So memory controllers are able to handle this situation and encrypt and decrypt very quickly, and the CPU is completely unaware that the data it needs to process was previously encrypted and will be encrypted again right after execution is completed. And that's probably where confidential computing is different from fully homomorphic encryption, where CPUs actually perform their operations on fully encrypted data. 
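The data flow Porter describes - plaintext inside the trusted environment, ciphertext everywhere outside it - can be sketched with a toy Python model. This is not Google's or AMD's implementation: real hardware uses an AES engine inside the memory controller keyed per VM, while the XOR keystream here is a stand-in used only to show where encryption and decryption happen.

```python
import os

# Toy model of transparent memory encryption in a trusted execution
# environment (TEE). Real hardware (e.g. AMD SEV) uses an AES engine
# in the memory controller; the XOR "cipher" here is illustrative only.

class EncryptedMemory:
    def __init__(self, size: int):
        # Per-VM key held by the hardware, never visible to the host.
        # Forcing every key byte odd (nonzero) guarantees ciphertext
        # always differs from plaintext in this toy model.
        self._key = bytes(b | 1 for b in os.urandom(size))
        self._dram = bytearray(size)  # what an outside observer sees

    def write(self, addr: int, data: bytes) -> None:
        # Encrypt on the way from the cache line out to DRAM.
        for i, b in enumerate(data):
            self._dram[addr + i] = b ^ self._key[addr + i]

    def read(self, addr: int, length: int) -> bytes:
        # Decrypt on the way from DRAM back into the cache line, so
        # the CPU only ever operates on plaintext.
        return bytes(self._dram[addr + i] ^ self._key[addr + i]
                     for i in range(length))

mem = EncryptedMemory(64)
secret = b"user record"
mem.write(0, secret)

# Inside the TEE the workload sees plaintext, unchanged.
assert mem.read(0, len(secret)) == secret
# Outside it - a snooping host or DMA device - sees only ciphertext.
assert bytes(mem._dram[:len(secret)]) != secret
```

The two assertions capture the contrast Porter draws with homomorphic encryption: here the CPU computes on decrypted data and only memory at rest is ciphertext, whereas homomorphic schemes compute on the ciphertext itself.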

Dave Bittner: So is the idea here that because we're doing the encryption and decryption in hardware that the users don't suffer any sort of performance hit? 

Nelly Porter: You're absolutely right. And not only the performance hit - we security people also love to separate duties, and if (inaudible) encryption is done by hardware, it means it's done by somebody else, like AMD and Intel. And the cloud provider, even if it wished to, would not be able to extract those keys or modify anything as the workload performs what needs to be done. 

Dave Bittner: Well, how are you and your colleagues there at Google approaching this? What sort of things are you all going to be making available? 

Nelly Porter: We've done some work in this area for many years - we're actually one of the creators of the Confidential Computing Consortium. So we strongly believed from day one that only by working together would we be able to crack this nut and offer confidential computing to our customers. It has to be (inaudible). Different approaches will be possible. But as a product, we offered our customers Confidential VM. It's based on AMD's Secure Encrypted Virtualization extension, and we provide confidentiality for those workloads when they run in GCP. We are also extending support for confidential environments to other GCP services. Our customers love to run pods and containers in our managed Kubernetes service - it's called Google Kubernetes Engine - so we offer confidential environments for those and for Kubernetes as well. And we are bringing secure analytics to the market. We have a set of products that actually helps customers to run managed Hadoop and managed Spark, so we have confidential variants of those services as well. 
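For a concrete sense of the Confidential VM product Porter mentions, this is roughly how such an instance is created from the gcloud CLI. The instance name, zone, and image are hypothetical placeholder values; the `--confidential-compute` flag enables the AMD SEV-backed offering on supported N2D machine types, per Google's public documentation.

```shell
# Sketch: create a Confidential VM on GCP (placeholder names/zone).
# Confidential VMs require an SEV-capable machine type (N2D) and a
# TERMINATE maintenance policy, since live migration isn't supported.
gcloud compute instances create demo-confidential-vm \
    --zone=us-central1-a \
    --machine-type=n2d-standard-2 \
    --confidential-compute \
    --maintenance-policy=TERMINATE \
    --image-family=ubuntu-2004-lts \
    --image-project=ubuntu-os-cloud
```

The notable design point is the one Porter emphasizes: the workload inside the VM is unmodified - encryption of memory in use is handled entirely by the hardware underneath.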

Dave Bittner: Are there any downsides to this? Are there reasons why people might not want to implement it? 

Nelly Porter: Again, you're probably asking the wrong person. I do believe that confidential computing provides stronger data protection controls to our customers without implications for performance and usability. As cloud providers offer these services to their customers and things progress, they will become much more available. The things that might be complicated for our customers - our customers don't run only VMs and Kubernetes. They need analytics, and they need the ability to run data warehouses and huge workloads. So one of our customers told me - not once, by the way, but a few times - that they need to run SAP HANA workloads in a confidential environment. These are huge, huge monster databases. So, again, we don't have support for those services right now, and, again, that would be one of the reasons why customers would not apply this particular protection to their workloads. 

Dave Bittner: Do you envision a time in the future where this just becomes a standard part of cloud computing, where this is something that's enabled by default? 

Nelly Porter: Absolutely. And an analogy that I always bring up is the HTTPS connection. I've actually been part of a similar situation, when somebody was telling us again and again that certificates for web services were never going to catch on. And today, when your site doesn't support HTTPS, it's the exception, not the rule. I do believe that confidential computing will be exactly the same as time progresses and a greater diversity of services and CPUs comes to the market. It will become simply the default, ubiquitous option for public cloud providers to offer additional privacy and protection for customers' data in the cloud. 

Dave Bittner: That's Nelly Porter from Google Cloud. 

Dave Bittner: And joining me once again is Andrea Little Limbago. She is senior vice president of research and analysis at Interos. Andrea, always great to welcome you back to the show. I want to touch today on the social sciences and how they are intersecting with our cybersecurity world. What's the latest there? 

Andrea Little Limbago: Yeah, no, thanks, Dave. As you know, this is something I always like to be a big proponent of. You know, I think, you know, we're at a point where it's almost becoming, you know, much more, you know, normal and accepted to have social sciences in cybersecurity, so it's great to be here. I wouldn't say that we're 100% there yet. I still remember, you know, probably, you know, maybe, eight years ago being at conferences and, you know, being asked what a social scientist is doing in cybersecurity. And that was always the top question I got wherever I was at - you know, at RSA, Black Hat, BSides, you know, sort of the large community events. 

Dave Bittner: Right. 

Andrea Little Limbago: You always got that question, and I never do now. And actually, in contrast, I see more and more of the - you know, the next generation coming in with some aspect of, you know, a multidisciplinary background. You know, they have social science training, they've got - you know, with data science or with various kinds of information security. And it's a really great, you know, multidisciplinary perspective that they're bringing into the industry. So it's - one, it's very refreshing. It's great to see. It's great to, you know, talk with them about what - you know, what they're interested in looking at. And then it's also, you know, great to, you know, connect with others, you know, across industry. And that's been - another core component of RSA now is it has a human element track of it. And we're seeing more and more, you know, acknowledgement that there is room for, you know, a whole range of disciplines in cybersecurity, and that's something that, you know, we increasingly need to go toward not only because of the whole, you know, workforce shortage that the industry has but... 

Dave Bittner: Right. 

Andrea Little Limbago: ...Just because it impacts so many different aspects of society that it really does take so many different perspectives to address the challenges that we have right now in the industry. And so the more social sciences that can come in to complement, not replace - I mean, and I think that's the important thing; you know, it's complement and bring the perspective in - the better off we are. And whether it's looking at the legal frameworks that are going on, looking at, you know, the whole, you know, range of cyber warfare and discussions on that and actually making it, you know, sync with, you know, decades-old theories that have actually been applied in other areas that might be useful in this area - and obviously, you know, social engineering and that whole element of it. I mean, there's just a whole range where the social sciences can contribute, and we're increasingly seeing it. It's - you know, it's really great to see that trend, you know, continue to emerge. 

Dave Bittner: And so are you finding that, more and more, the social scientists have a seat at the table? 

Andrea Little Limbago: Increasingly. That's it. I'm not - so it's not there yet. Very often, you know, it's also - you know, it's kind of pushing our way in. But (laughter) I do think that there - yeah, I think increasingly, there is a seat being made. And, you know, I think there are, you know, in this area and in other areas of the industry, still gatekeepers that try and, you know, keep sort of a narrow focus of what cybersecurity should be. But I really do think that for - the broader part of the community, you know, I think, is excited and willing to work together. 

Andrea Little Limbago: And that's where, you know, I think some of the most, you know, exciting innovations are going to come - is when you have the multidisciplinary collaboration going on. You know, we don't want the social scientists to be in their own silo and, you know, vulnerability experts in their own silo and so forth. We want to, you know, get that cross-fertilization together. And because I think we're fairly new at that as an industry, I think it leads to a whole lot of optimism about some innovations that might be coming down the road - even if you think about things like passwords and so forth, right? Like, you know, if you bring in social scientists, they can understand a bit better why, or what may be a better solution to, passwords. Even just some very basic things, like getting people to do more multifactor authentication. So even the fundamentals can really benefit from that. 

Dave Bittner: Yeah. I mean, what is your pitch? You know, when you're making the case that the social scientists deserve a seat at the table and that you all have, you know, serious things to contribute, what are you telling the folks on the tech side? 

Andrea Little Limbago: Yeah. I mean, so on the one hand, I'd say, I mean, clearly, you know, what we've been doing isn't working. We're still seeing, you know, ransomware off the charts. We're still seeing, you know, people - you know, everyday citizens still are really not pursuing the foundations of proper cyber hygiene. And so we're not necessarily succeeding yet as an industry. And so why not try something new? And, you know - continuing to do the exact same thing we've always done, you know, is the definition of insanity. So we shouldn't be doing that. And instead of just, you know, continuing to look within the same areas, exploring more and starting to think about the human element of it - that's what works because, again, we see over and over again sort of the notion of the human as the weakest link - you know, 90-plus percent of attacks are linked to humans. And we can't take the human out of it, right? 

Dave Bittner: Right, right. 

Andrea Little Limbago: And so instead of blaming it on humans, which is a cop-out because we're part of the system, integrate them into a solution. And that's where the social scientists can come in and really help make that integration of human behavior into solutions, make them, you know, smarter, sustainable, something that humans will actually implement as opposed to trying to work around. 

Dave Bittner: All right, well, interesting stuff, as always. Andrea Little Limbago, thanks for joining us. 

Dave Bittner: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com. Don't forget to check out the "Grumpy Old Geeks" podcast, where I contribute to a regular segment called Security Ha. I join Jason and Brian on their show for a lively discussion of the latest security news every week. You can find "Grumpy Old Geeks" where all the fine podcasts are listed. 

Dave Bittner: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Liz Irvin, Elliott Peltzman, Tre Hester, Brandon Karpf, Eliana White, Puru Prakash, Justin Sabie, Rachel Gelfand, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe and I'm Dave Bittner. Thanks for listening. We'll see you back here tomorrow.