Apparent hacktivism exposes Iranian prison CCTV feeds. Misconfigured Power Apps expose data. FBI warns of the OnePercent Group. Mr. White Hat gives back. Dog bites man.
Dave Bittner: CSAM has been a topic of conversation since Apple recently announced that they will be enabling scanning for child sexual abuse materials on iOS devices. My co-host Ben Yelin and I have an in-depth discussion about this topic with David Derigiotis of Burns & Wilcox on this week's episode of "Caveat," our weekly privacy, surveillance law and policy podcast. Since it's such an important topic, we thought we would share this episode with our daily podcast listeners, and you'll find it in your podcast feed before Wednesday's daily episode. We hope you'll join us and have a listen.
Dave Bittner: More hacktivism appears to have hit Iran. Misconfigured Power Apps portals expose data on millions. The FBI warns of the activities of a ransomware affiliate gang. Mr. White Hat really does seem to have given back all that stolen altcoin. Ben Yelin checks in on Apple's CSAM plans. Our guest is Charles DeBeck from IBM Security on the true cost of a data breach. And finally, dog bites man; criminals cheat other criminals.
Dave Bittner: From the CyberWire studios at DataTribe, I'm Dave Bittner with your CyberWire summary for Tuesday, August 24, 2021.
Dave Bittner: A group calling itself Adalat Ali - Justice of Ali - has posted video it says it obtained by compromising CCTV systems at Iran's Evin Prison, Zero Day reports. Adalat Ali, which may be an Iranian dissident hacktivist group, says it wishes to draw the world's attention to abusive conditions in Evin. Attribution and identification of the group remain unclear. While it looks like a hacktivist operation, that's a preliminary assessment. The hacked video is the second major regional operation against Iranian systems after the attack on the country's railroads.
Dave Bittner: Security firm UpGuard has disclosed that it found Microsoft Power Apps portals configured to allow public access. The researchers notified 47 organizations that their data were vulnerable to exposure. Some of the information at risk included personal information used for COVID-19 contact tracing, COVID-19 vaccination appointments, Social Security numbers for job applicants, employee IDs and millions of names and email addresses.
Dave Bittner: The issue involves misconfiguration as opposed to exploitation of a vulnerability. Users are addressing the misconfiguration. WIRED puts the total of records exposed at around 38 million. That's exposure as opposed to known compromise. But in any case, it's a lot of records.
Dave Bittner: UpGuard notified the organizations whose exposed instances it found, but it also informed Microsoft, which is, we note in disclosure, a CyberWire sponsor. Redmond responded by changing the default table permission. Starting October 2021, Microsoft said, all newly provisioned portals will have strict as the default value instead of none. Microsoft has also made a portal checking tool available so organizations will be able to determine whether their data have inadvertently been exposed.
Dave Bittner: UpGuard thinks the principal lessons to be learned from this experience are these. First, platform vendors might consider changing their product in response to observed user behavior, and platform operators should, quote, "take ownership of misconfiguration issues sooner, rather than leave third-party researchers to identify and notify all instances of such misconfigurations," end quote.
Dave Bittner: Second, software-as-a-service providers should improve their users' visibility into access logs.
Dave Bittner: Third, anyone handling sensitive information should be prepared to handle reports from researchers of a data leak, breach or exposure.
Dave Bittner: And finally, UpGuard would like to see better understanding of the problem of data exposure. If you've left data open to the world, accessible to anyone, the people who find such data haven't hacked you.
Dave Bittner: The U.S. FBI yesterday warned of the activities of a ransomware gang styling itself the OnePercent Group. The Record reports that the OnePercent Group is a criminal customer of ransomware-as-a-service operators. It is or has been a known affiliate of REvil, Egregor and Maze. Coveware pointed out, for example, that victims who didn't pay the OnePercent Group wound up mentioned in dispatches in REvil's Happy Blog.
Dave Bittner: The Bureau says that the extortion demands have proceeded in three escalatory stages. First, a leak warning; after initially gaining access to a victim network, OnePercent Group actors leave a ransom note stating the data has been encrypted and exfiltrated. The note states the victim needs to contact the OnePercent Group actors on Tor or the victim data will be leaked. If the victim does not make prompt communication within a week of infection, the OnePercent Group actors follow up with emails and phone calls to the victim stating the data will be leaked.
Dave Bittner: The second stage they describe as the one percent leak. If the victim does not pay the ransom quickly, the OnePercent Group actors threaten to release a portion of the stolen data to various Clearnet websites.
Dave Bittner: And then finally, the full leak; if the ransom is not paid in full after the one percent leak, OnePercent Group actors threaten to sell the stolen data to the Sodinokibi group to publish at an auction.
Dave Bittner: How do the attackers get access to their victims? Well, phishing, of course.
Dave Bittner: Mr. White Hat, as Poly Network refers to the hacker who looted cryptocurrency held by the DeFi provider, has now returned all of the more than $600 million stolen in the theft. Vice reports that Poly Network is now in the process of returning the holdings to their proper owners. Poly Network reports that it's well on the way to complete recovery. And all things considered, the company seems surprisingly pleased with Mr. White Hat.
Dave Bittner: Mr. White Hat, whoever he is, has also, according to Vice, returned the $500,000 bounty he received from Poly Network. So whether it was a demonstration from the start, a goof or the crime it appeared to be, and whether the return of the funds was didactic, repentant or motivated by the sensation of the hot breath of John Law on the back of the neck, the money is back and flowing into the wallets where it belongs. And good afternoon, Mr. White Hat, wherever you are.
Dave Bittner: And finally, security firm Digital Shadows this morning offered a look at fraud, contention and mutual exploitation in the criminal underworld. The C2C market does function like a market, but a market with some very ugly corners.
Dave Bittner: Digital Shadows says, quote, "there are still some unscrupulous criminals out there," end quote, in what they would concede is an observation worthy of Captain Obvious. But what kinds of unscrupulous criminals are out there? What's their taxonomy? If you're interested in the C2C market - perhaps if you're asking for a friend - here are the two biggest families of faithless crooks.
Dave Bittner: First, exit scams - criminal proprietors of underworld markets close up shop and abscond with their criminal customers' ill-gotten money. And phishing - yep, that carding forum you, Mr. and Mrs. Criminal, were interested in may in fact just be a spoof. And the invitation from Mokele Mbembe's Widow's Carding Shop may be designed to steal from you. That's just two, and we trust that human wit will evolve still others.
Dave Bittner: Charles DeBeck is a senior cyberthreat intelligence strategic analyst with IBM X-Force Incident Response and Intelligence Services. He and his colleagues at IBM recently published the latest version of their Cost of a Data Breach Report. I checked in with Charles DeBeck for some of the highlights.
Charles Debeck: First off, the general cost of a data breach has continued to increase year after year as we've done this report. So there's a bit of natural inflation to the data over time. But this was a pretty significant jump. And I think a big part of that was probably due to the large increase in remote work that impacted operations around the globe. The pandemic was a major factor, I think, in increasing the average cost of a data breach.
Dave Bittner: And how so? What does working from home contribute to the number going up?
Charles Debeck: Well, so the numbers show that, on average, a data breach at an organization that had to stand up significant remote work operations cost about $1 million more. So right off the bat, we know quantitatively there's a significant impact.
Charles Debeck: Qualitatively, though, when we're looking at, you know, what would cause this increase, I think a large part of it is the fact that organizations very quickly had to stand up new network infrastructure, new endpoint infrastructure and new capabilities to enable remote work in a very, very tight time frame. This sort of expansion of capabilities is usually done over the course of months or years with long-term strategic planning. By comparison, last year we saw a lot of organizations suddenly get told, you have to set up a brand new set of networks and capabilities, and you have until Monday, which is a very tight timeline.
Dave Bittner: (Laughter) Right. Was there anything in this year's report that was unusual or stood out as being surprising?
Charles Debeck: I think one thing that really surprised me was some of the consistency that we saw in defensive measures or things that help mitigate cost for data breaches. Last year, we saw that automation and artificial intelligence had a major impact on reducing the average cost of a data breach, leading to a difference in average cost of about $3 million, which was pretty huge. Again, this year, we tended to - we saw the exact same sort of thing - automation and AI coming in and having a huge impact on organizations and reducing their average cost of a data breach.
Charles Debeck: And that, to me, is interesting because it means that not only is this a one-off thing - this isn't just a fluke or a random data point, but it starts to emerge as a trend, to me, for organizations that this is something that's consistently providing value and something that organizations can do to have a reasonable probability of helping protect themselves.
Dave Bittner: Where do you suppose we're headed here? I mean, is there - you all have been at this for quite a long time. You know, this - you've been putting out this report year after year for nearly two decades now. And do you suppose we have any hope of flattening the curve? Is there - are there good days ahead?
Charles Debeck: I think there is hope on the horizon. And I really think it comes down to, how can organizations reduce the time it takes to identify and contain data breaches? And it is a cat-and-mouse game, right? Threat actors are constantly trying to make it tougher for us to do this, and net defenders are constantly trying to do this faster and faster.
Charles Debeck: But I think that we're finding new tools in our arsenal here. And again, going back to that sort of artificial intelligence and automation component, I think that's one of the key ways we can help reduce that timeline for identifying and containing breaches because automation allows you to work at computer speed, you know? It allows you to do things in a matter of moments, whereas an actual person would take a matter of minutes. But minutes in computer time is an eternity. And so I think using analytics and using automation does, to me at least, provide a good sense of hope that organizations can do a lot to help reduce the cost of a data breach. But it is a very conscious investment. It's something that may not return on the investment immediately but in the long term will provide significant benefits for an organization.
Dave Bittner: You know, based on the information you gathered here, are there any specific recommendations you can make for organizations to better defend themselves?
Charles Debeck: I think one specific recommendation I want to make for an organization is if you are engaged in a cloud migration, make sure that you're doing it in a smart and safe manner. Now, one thing we found in this report that I thought was very interesting was that cloud migration was actually a major cost amplifier. So if an organization was breached while migrating to the cloud, that actually significantly increased the average cost of a data breach for them.
Charles Debeck: So to me, the takeaway here is that organizations should still be moving to the cloud. There's a lot of security benefits. So we could talk for a really long time about all the great reasons why organizations should move into cloud environments. But I think what it means to me is we should continue that movement, but we need to make sure we're doing it safely and securely. We don't want to just sort of haphazardly move our stuff into a cloud and say, OK, great. There it is. Hopefully everything's all right. You know, we want to make sure that we're doing this in a way that makes sense so that we don't have a breach that happened during this migration process, which could be very costly for an organization.
Dave Bittner: That's Charles DeBeck from IBM X-Force.
Dave Bittner: And joining me once again is Ben Yelin. He's from the University of Maryland Center for Health and Homeland Security and also my co-host on the podcast "Caveat," which, if you have not checked out yet, what are you waiting for (laughter)? It's worth a listen. Ben, welcome back.
Ben Yelin: Thanks, Dave.
Dave Bittner: We would be remiss if we did not discuss the recent hubbub from Apple's announcement that they are going to be scanning iOS devices for CSAM, which is child sexual abuse materials. Can we just do a quick overview here? From your point of view, what's going on here?
Ben Yelin: So Apple is doing two things. First, they are scanning iMessages. If a parent opts in, they're scanning the messaging application on minors' iOS devices for nude images. So if a minor is between 13 and 18 years old, the minor would be notified, would get an alert that would tell them, you are about to send or receive a nude message. This is a warning. That message would go to the parents if it's a child under 13. I think there are fewer civil liberties objections to that particular announcement from Apple.
Ben Yelin: The announcement that presents more significant civil liberties concerns, in my view, is the announcement that Apple is going to scan photos in the iCloud against a known database of child pornographic images. And if they discover that an image is - matches one that's in that database, they could potentially share that information with the government, and that would lead to a criminal prosecution.
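The matching Ben describes is, at its core, a set-membership check: each photo is reduced to a fingerprint and compared against a database of fingerprints of known images. Here's a minimal sketch of that idea, with the caveat that it uses plain SHA-256 digests and a made-up database, so it only catches exact byte-for-byte copies; Apple's actual system uses a perceptual hash (NeuralHash) so that resized or re-encoded copies of an image still match, and the comparison happens under a private set intersection protocol rather than in the clear.

```python
import hashlib

# Hypothetical database of fingerprints of known images. In a real system,
# these would be perceptual hashes supplied by a child-safety organization,
# not SHA-256 digests of arbitrary bytes as shown here.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"known-image" (stand-in for a real entry)
    hashlib.sha256(b"known-image").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint (here, a SHA-256 digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint appears in the known database."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The design point Ben raises follows directly from this shape: the matching code is indifferent to what the database contains, so whoever controls the list of fingerprints controls what gets flagged.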
Dave Bittner: Right. Now, the sticky wicket here is that there are plenty of tech companies who are scanning their cloud services for these sorts of images. That is routine at this point. Facebook, Google, Dropbox - they all do that. What sets Apple apart is their plan is to do the scanning on-device.
Ben Yelin: Right, so it's not just in the cloud. It's on the device itself. And there's no technological reason they couldn't scan a hard drive, for example. They're making a policy choice to confine this right now to photos that are uploaded to iCloud. But the technology exists to search it on somebody's device, even if they don't upload that photo to iCloud.
Ben Yelin: So this presents many potential civil liberties concerns. It's not, per se, a Fourth Amendment violation because this is a private company. But the government, of course, knowing that Apple has instituted this practice, this policy is going to know that they probably have access to information that would be valuable for criminal prosecutions. And we know the government has tried hard to get Apple to reveal encrypted communications...
Dave Bittner: Right.
Ben Yelin: ...To give the government access to encrypted communications. And it's not just our government. Even though this program is being piloted in the United States, it certainly eventually will be available to overseas governments that are far less concerned with civil rights and civil liberties. And even though it's being used right now for CSAM, it could be used for other purposes - to scan images, to scan messages for disfavored political content or for censorship purposes.
Ben Yelin: So the idea is once you build this technology and once you put it into practice, as Apple plans to do over the next several months, then you have created this backdoor. And even though you are trying - you are claiming to confine the use of this technology in the short term, once the technology is created, Apple is going to be under enormous pressure from governments around the world to use it for more expanded purposes. And so that's the inherent danger here.
Dave Bittner: We should mention that users do have the ability to opt out. If you don't use Apple's iCloud photo service, your photos on your device, according to Apple, won't even be scanned. They won't be looked at unless you're using their cloud services. But that doesn't seem to be putting people at ease.
Ben Yelin: Yeah. So first of all, as I said before, that's a policy choice. That's not a technological choice. Apple, of course, still could scan your device. They do it for a bunch of other purposes - you know...
Dave Bittner: Right.
Ben Yelin: ...To find malware on your MacBook, for example.
Dave Bittner: Right.
Ben Yelin: So that's not necessarily anything new. That's a policy choice that they're making now. And I think the concern is that this is going to be a slippery slope where a government says, if you really care about stopping child exploitation, why confine these searches just to photos that have been uploaded to iCloud? Why can't you also search, you know, photos that have been saved on a hard drive or even, you know, just saved on a single device? So I think that's the concern, that it's more of a slippery slope.
Ben Yelin: I also think the fact that this is Apple carries, you know, an increased weight as opposed to another service provider. Apple presents itself as, you know, being very committed to user privacy, the protection of users' information. That's how they sell themselves. That's how they present themselves publicly.
Dave Bittner: Right.
Ben Yelin: And so I think this cuts against one of their professed corporate values, which is the protection of private information. They're put in a tough place because obviously, to be against this, it's seemingly to be against rooting out sexual exploitation of minors.
Dave Bittner: Right.
Ben Yelin: The intentions here are very noble, and I think we have to acknowledge that. I think we have to acknowledge that the problem that they're trying to solve is, of course, of the utmost importance.
Dave Bittner: Right.
Ben Yelin: But, you know, I think the method in which they're engaging in this type of surveillance of their own users could come back to haunt those users. And so I think we have to be honest about that as well.
Dave Bittner: Yeah, it also strikes me that this is - in some ways, you know, Apple has a corporate culture, I believe, of kind of knowing what's best for our users...
Ben Yelin: Yes.
Dave Bittner: ...Right? And it's that old - you know, like Henry Ford said, you know, if I'd asked my users what they wanted, they would have said they needed, you know, better - faster horses or...
Ben Yelin: Right.
Dave Bittner: ...Better buggy whips or, you know, something along those lines. And so Apple, throughout their history, has said, you know, you don't need that floppy drive anymore. You don't need that headphone jack anymore. And I think that aligns with Apple's surprise at the backlash here. I think Apple thought that they did the hard work of designing what is, I think most people agree, a very clever technological solution to this. And yet people are still having a very strong reaction.
Ben Yelin: Yeah, I think a couple of things go into that. One is we have values in this country about protecting private information. Some of that is inherent in our legal system. The Fourth Amendment protects us against unreasonable searches and seizures. So even though this, you know, as of now, isn't an action the government is taking, it does seem contrary to our values, where we don't want anybody in our protected private spaces. And that certainly includes technological spaces, including, you know, the iCloud where we store our photos. So I think that's a huge part of it.
Ben Yelin: The other part of it, like I said, is the fact that this is supposed to be the company that most stringently protects user privacy. And so if Apple is doing it, then what does that mean for every other company that doesn't present themselves as protecting our private information? And what does it mean for technological companies that are based overseas in more authoritarian countries? Are they going to learn from Apple and deploy this technology in a way that doesn't just target sensitive, exploitative images, that sort of thing?
Dave Bittner: Yeah.
Ben Yelin: But it's used to go into messaging applications, to go into photos and try and crack down on free speech or political dissent. And I think those are - that's kind of the nature of the backlash, as I see it.
Dave Bittner: Yeah. All right. Well, there's much more to this conversation. And in fact, we spend the entire episode of "Caveat" this week discussing this. We're joined by David Derigiotis. He's from Burns & Wilcox. And we take a little unusual route where we take on one topic for this week's "Caveat." So if this is something that interests you, please check it out. That's the "Caveat" podcast. Ben Yelin, thanks for joining us.
Ben Yelin: Thank you.
Dave Bittner: Thanks to all of our sponsors for making the CyberWire possible.
Dave Bittner: And that's the CyberWire. For links to all of today's stories, check out our daily briefing at thecyberwire.com.
Dave Bittner: The CyberWire podcast is proudly produced in Maryland out of the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our amazing CyberWire team is Elliott Peltzman, Tre Hester, Puru Prakash, Justin Sabie, Tim Nodar, Joe Carrigan, Carole Theriault, Ben Yelin, Nick Veliky, Gina Johnson, Bennett Moe, Chris Russell, John Petrik, Jennifer Eiben, Rick Howard, Peter Kilpe. And I'm Dave Bittner. Thanks for listening. We'll see you back here tomorrow.