
Doctors’ Perspective: The Rise of Healthcare Ransomware
Sherrod DeGrippo: Welcome to the Microsoft Threat Intelligence Podcast. I'm Sherrod DeGrippo. Ever wanted to step into the shadowy realm of digital espionage? Cybercrime? Social engineering? Fraud? Well, each week, dive deep with us into the underground. Come hear from Microsoft's elite threat intelligence researchers. Join us as we decode mysteries, expose hidden adversaries, and shape the future of cybersecurity. It might get a little weird, but don't worry. I'm your guide to the back alleys of the threat landscape. Welcome to the Microsoft Threat Intelligence Podcast. I'm Sherrod DeGrippo, Director of Threat Intelligence Strategy at Microsoft, and I'm joined by Christian and Jeff, co-directors from the UCSD Center for Cybersecurity, who I met a couple of months ago and was really just delighted by. These guys are actual doctors, which most of you probably need to go see out there in the listening audience, make a doctor's appointment. And they're lifelong security nerds, which we don't tend to see a lot out there in the industry, and I worked together with them on the Microsoft Threat Intelligence Healthcare Ransomware Report. So go check that out. We'll put that in the show notes. But Christian and Jeff, welcome to the show. How are you guys doing?
Christian Dameff: We're doing good. Thanks for the invitation today, and I just want to share our sentiments that we're glad to be here on your podcast, and talk about something that I think will be a little bit of a bummer, you know, hospital ransomware attacks, but at the end of the day, hopefully we can take this in a positive direction for what we're doing about it.
Jeff Tully: I'll just start it off on a lighter side by saying, Sherrod, I was in your neck of the woods the other day, and you gave me a killer barbecue recommendation, so thank you for that. It was delicious.
Sherrod DeGrippo: Of course, of course. I live in the South, I live in Atlanta, and we have a lot of good barbecue here, but there is one place that has pimento cheese wontons, and they're pretty amazing. Quite good. Yeah. Christian, you've got to come down. Although you guys live in San Diego, so it's not like you're hurting for great food out there.
Christian Dameff: Definitely not the barbecue mecca, though.
Sherrod DeGrippo: So, it's not, I know. But you have all the good Mexican food there, right?
Christian Dameff: Yeah.
Sherrod DeGrippo: Yeah, so let's talk a little bit about your background. Christian, I'll start with you. So, you're an emergency room doctor, I know that. And you're also a pretty avid like, hacker. You're in the hacker scene. You went to Defcon. You've spoken at quite a lot of conferences. Give me an understanding, kind of, of how you came to be at the intersection of both of those things.
Christian Dameff: Yeah, that's a good question. Basically, I just grew up in the hacker scene. I was never a very popular person in high school; all my friends were geeks, nerds. We didn't call it infosec, we didn't talk about having careers in cybersecurity. It was more just, when you couldn't get invited to cool parties and hang out with the cool folks, you hung out online. You had things like LiveJournal, or you know-
Sherrod DeGrippo: Oh! I was on LiveJournal too.
Christian Dameff: IRC, it was a big deal.
Sherrod DeGrippo: I was on LiveJournal and IRC, woo!
Christian Dameff: So that's where you hung out, and yeah, same thing for me. And in a lot of senses, I kind of grew up on the early internet, so I had my friends, and you know, got into hijinks and shenanigans, and a lot of that ended up being kind of the hacker space, that kind of hacker room, and that was something that was a natural outlet for me as a teenager, right? So you want to get into a little bit of trouble, but not a lot of trouble, and that was what drew me to it. Also, just kind of exploring, and kind of the challenge of things. So I grew up a hacker long before I ever wanted to become a doctor. And yeah, like I mentioned, I never thought it would be a career. Went to college, studied philosophy, and then thought really hard about never having an actual job. And then I wanted to drive an ambulance, that was something I really wanted to do. So I went and got my EMT certification. Went to try and drive an ambulance, and they said, you idiot, like, you're young, you can't drive an ambulance until you're this old. And I'd already gotten this EMT certification, so the only job I could get was at an emergency department. And while seeing things like cardiac arrest, and CPR, and strokes, and gunshot wounds and all that, a lot of people would run from that stuff, but for whatever reason, it really drew me in. And so I then decided I wanted to go back to school for med school. And then, and I'll remember this moment in my life until the day I die, I was a third-year medical student, rotating in the neonatal intensive care unit, so this is the part of the hospital where the sickest little babies are, and I mean, it was so serendipitous, you know, ten seconds earlier, ten seconds later, my life would have been completely different, but I saw a monitoring device, this was a thing that was monitoring this little baby's heart rate and their oxygen saturation, and I saw it boot up, as I'm just touring through this unit.
And it was running Windows Vista, or something crazy, like something archaic. And I remember, at that time, thinking how easy it would be to own it. To basically just pop it. It would be easy. And then I thought, what are the consequences of that, right? If that monitor wasn't effective, or it didn't work for whatever reason, then that little baby wouldn't have this critical monitoring, and that's when the light bulb went off for me. With my hacker roots growing up, and my medicine focus, I realized it's not so much about patient privacy anymore, in this kind of crazy future of ours. It's about patient safety. It's about care quality. It's about what happens when you don't have the equipment you need to take care of someone who is super sick, or they'll die. And the consequence of things like malicious software, malicious adversaries hacking this infrastructure, can really translate to loss of human life. And so then I, you know, hooked up with Jeff here. Well, before that we'd been doing some research, and we combined those two things, so we really focused our healthcare cybersecurity research on patient safety and outcomes. So that's the supervillain, or superhero, origin story. It's kind of unimpressive, but that's where I come from.
Sherrod DeGrippo: Did you ever get to hack the baby monitor?
Christian Dameff: I didn't, but I have a colleague who did, and that story I'll let him tell. But you know, there is this really interesting ethical dilemma you'll have where you have this device, it has got a vulnerability, and there is no patch available for it. You know that it's open on the internet, and that it would be trivial for someone to exploit. What do you do? Back in the day, if you applied an out-of-band patch, then it might violate the FDA's approval for it. At least that was the thought. So a lot of folks were caught between a rock and a hard place. They thought that it would void the warranty, or the device wouldn't be supported, or it could have some adverse patient outcome with an out-of-band patch, but at the same time, there is this huge vulnerability, and you know, there's been this traditional fight in that space. But I will say, a lot of that stuff has kind of flushed out. The FDA, of course, is much more supportive now of patching legacy medical devices and things. But it's still kind of hairy territory, because how do you know what the patch is going to do to the function of the device? Will it hurt the patient, et cetera? These are hard things when we talk about that intersection of patient safety and cybersecurity, because the consequence of screwing up the recovery can sometimes be as bad as the exploitation of the vulnerability in the first place.
Sherrod DeGrippo: Jeff, give us an idea of kind of how this started for you. Were you on LiveJournal?
Jeff Tully: I was on LiveJournal. I wasn't quite as much of a script kiddie as Christian was. I'm just kidding. I wasn't in the hacker space formally, and Christian was actually my entry into that community, at Defcon and other places, and I'd seen some of it just because he's my best friend and this is basically how we are all the time. So no, I think we were lucky to have a little bit of an insight, as we were going through medical training, that many of the perspectives that hackers employ on a regular basis, understanding a system, figuring out how it works in normal operating conditions, and then seeing ways in which you can tweak that to improve or decrease performance, like, that's a lot of clinical physiology and pathophysiology. And so I think it was very natural for us to continue to explore this niche. We are not the world's most technical people, but you know, kind of the truism that I like to use is when we're in a room full of hackers, we're the best doctors, and when we're in a room full of doctors, we're the best hackers [laughter].
Sherrod DeGrippo: Yeah!
Jeff Tully: And that has been something that has gotten us into some pretty cool places, and we've been able to do some pretty interesting work. I guess it kind of combined two of our major interests and passions.
Sherrod DeGrippo: That's really cool. It's something I'd never really considered until I met the both of you at UC San Diego for the healthcare report on ransomware. I never really thought about people in these really intense professions already deciding to combine it with something that is very, very intense also. But I think that makes sense, in some ways, after getting to know you guys a little bit. You're pretty into intense things. So let's talk a little bit about the UCSD Center for Cybersecurity, which I got to visit. There's-I shouldn't call the video hilarious, but it is hilarious to me. It was a hilarious experience. Everyone can go watch the video of me at UCSD with you guys, with the weird plastic dummy bodies, and putting stuff into them. It was very strange. And it was pretty gross, even though they are just made of plastic or whatever. There's still an element of disgust to it. So tell me, when you say Center for Cybersecurity for Healthcare, what does that mean? Like, what are you doing there? What do they do there?
Jeff Tully: Yeah, so just to provide a gentle little correction, because we have a number of world class cybersecurity researchers across campus who would probably be very frustrated at us if we claim to be the center for cybersecurity. We are actually the Center for Healthcare Cybersecurity.
Sherrod DeGrippo: Yes.
Jeff Tully: So we try to stay in our lane, and understand and study the ways in which the technological dependencies we have as clinicians can be affected by many of the threat actors and types of attacks we'll talk about. So we are really interested in developing an evidence base, right? Because as physicians, if I want to use a new therapy on a patient, there are going to be clinical trials and studies and data that help to support the choice to use a particular therapy or procedure or something like that. When we're speaking about cybersecurity in general, I think, but specifically within healthcare, we don't have a lot of data as to how to best use limited resources or apply the skills of a small, limited workforce. And so at the Center for Healthcare Cybersecurity, we're trying to bring a little bit of that rigor and evidence from medicine into the domain of healthcare cybersecurity.
Sherrod DeGrippo: That's something that I think we need, because in my opinion, in the information security profession, I don't think we realize how much we truly operate off vibes. We don't do a lot of data-driven decision making. We don't do a lot of evidence-based work. There's a lot of feelings involved, and I love feelings, and I'm very vibey. But I do think that sometimes decision making comes down to not a lot of backing data. It's just sort of like, well, this is what we should do. And I think when it comes to patients and healthcare, we have to apply some more rigorous metrics to that. So Christian, what kinds of stuff are you doing there?
Christian Dameff: Yeah, Sherrod, we completely agree with your assessment. And before I answer that question, I just want to offer a little bit of an analogy. I mean, medicine hasn't always gotten it right like that. We used to do mercury, leeches, and bloodletting. Like, oh, you have a cold? Let's just throw some mercury on that. Yeah. Turns out that stuff doesn't really work. And we did that for, you know, hundreds of years, until this evidence-based medicine revolution that happened not too long ago, where you really have to kind of put up or shut up in medicine. If you don't have data, you don't have efficacy, you don't have these trials, government agencies won't let you try new therapies on patients, and that's good, because a lot of times it not only doesn't work, it does harm. An example of one thing that we're looking at here at the center is something that I think will translate to a lot of your listeners. Phishing is a huge problem. Phishing attacks are cited as one of the main vectors for hospitals getting ransomed. And we don't have a lot of great data to support, you know, what you do about it. And when we talk about vibes, what we think a lot is that we can train this problem away, that we can impact the rates of phishing susceptibility by just training the whole enterprise. So that's why we have our yearly cybersecurity trainings. That's why we have simulated phishing campaigns. The idea that you can teach people to do something better is a huge part of cybersecurity. But what's the evidence for that? I think there is a growing evidence base suggesting that you can't train your way out of it. No matter what you do, it's asymmetric; attackers just need to make a slightly better phishing message to get 10% more of your workforce to click through it.
It's very, very easy to set up these campaigns, and as a consequence, you know, the training will never catch up, or become sophisticated enough, or have your employees attentive enough to actually make a difference. So one of the studies we're working on here is looking at those types of things, where we can see if training, or what types of training, actually works. We're doing that in a health enterprise population, but the hope is that the results are going to influence many other industries: finance, manufacturing. They have the same problems. And if it turns out at the end of the day, which it's looking like it will, that phishing training doesn't actually work, well then what should we do as an industry? Should we keep putting money into phishing training that doesn't work? Or should we be moving more towards taking the human element and the social engineering side of this out of the equation? Are there ways that we can re-engineer systems so that they're becoming more, et cetera, et cetera. So building that evidence base, using the kind of laboratory of the health system to test some of these cybersecurity mitigations and controls, and then being able to say, alright, well, why is this in all of our frameworks? Why is this required to get cybersecurity insurance? If the evidence shows it doesn't work, hopefully we can start changing not just the vibes, but also the way we go about recommending things, because they're not cheap. And I hope one of the things that we do here at the center is allow evidence to help drive decisions that unburden poor hospitals from investing in stuff that doesn't make sense. Right now, I think many people would agree, healthcare is broken. I think Jeff and I agree with that. There are cyber haves and have-nots in healthcare. Hospital systems that are well equipped to deter these types of attacks, and others that are not.
It's the hospitals that are poor and in the middle of nowhere where, if they get popped and hit with ransomware, it's going to result in people dying and huge areas of care being lost. And so if we can build an evidence base to show when things work or don't work, then we can go to those poor hospitals that don't have two nickels to rub together and say, instead of spending your money on something that doesn't work, try this instead. That's how we're really going to help keep patients safe: by helping figure out, with our scarce resources, what we actually want to invest in. The only way we can do that with a straight face is by, again, putting up or shutting up. You know, show the data.
Sherrod DeGrippo: Testing the actual controls is something I think we don't do enough in security. I am a big believer in not depending upon your workforce or your users to do everything they do in their jobs well and also go head to head in battle with cybercriminal gangs. It's just an attitude in security that I think is very misguided. We think, oh, you know, this person is-let's say they're an accountant. They're a senior accountant in your organization and they're really good at being an accountant. And now we are also expecting them to be really, really good at fighting and resisting organized crime. It just doesn't make sense to me. I think the technical controls have to do almost all of the work, and we should be on the path to getting to the point where those phishing emails don't ever get into anyone's inbox at all. That's really the goal. Knowing how to recognize them and being trained on them is great, but as technologists, it's our responsibility to not let them ever get into the inbox to begin with. I came to UCSD and you have a simulation room where somebody is doing a medical procedure and then they are told that they just got ransomware. Can you kind of walk me through what that's like? Have you had people freak out? Has anyone ended up, like, panicking? What is that process like?
Jeff Tully: Yeah, so I think this is a good way to reinforce the topic we just hit upon, which is this tension between, do we need all of our doctors and nurses to become cybersecurity experts? Or do we need to equip them to better deal with the disruptions that arise when the network is shut down in the setting of a ransomware attack? And so with the simulation center, you know, simulation is a concept that is ubiquitous in medical education. We stole it from aviation. It's all about training people to operate under pressure in rare but high-stakes situations in a controlled environment, so that the first time you encounter a heart attack or an emergency surgery in the real world, you have kind of a foundation of training to fall back on. The work that we'd done was building cybersecurity-based scenarios out of real security research. We started with medical devices that have been shown by folks like Jay Radcliffe and Billy Rios to have different classes of vulnerabilities. We understood how exploiting those vulnerabilities might translate into a physical consequence, whether it's an infusion pump delivering medication or a pacemaker inappropriately shocking. Like, what that translates to in a clinical context. And then we take students and resident doctors and even new attending doctors, and kind of drill them through this. So they had to deal with medical decision-making challenges within the context of this security research. And that was something that was very eye opening, and I think it helped people begin to internalize that yes, these issues of cybersecurity and healthcare are not just about data privacy, but really about patient safety. We've done that and continue to do that, and the challenge I think is scaling it.
And using it as a way to, again, not train doctors to be information security professionals, but to appreciate the ways in which these types of incidents may affect their ability to deliver patient care.
Sherrod DeGrippo: One of the things that you guys told me when I was down there is that one of those scenarios is that under a ransomware attack, there is no way to send prescriptions via the computer, so they have to write them on paper, and the younger medical students have never done that. That blew my mind. And since then, I've asked people when is the last time you had a paper prescription that you took to a pharmacy, and they're like 20 years. So what do you do? Are you teaching people how to write paper prescriptions, or how does that work?
Christian Dameff: Yeah, that's a great question. So actually, one of the big efforts under the center right now is to systematically go through a large hospital and interview the folks taking care of patients, the doctors and nurses. Understand their clinical workflows. Understand what types of patients they care for. Map those to the required technologies. And then develop a model of what would happen in various ransomware scenarios. And the reason I'm kind of slow rolling the answer to your question is, that's very much the case, for instance, for a family practice doc who is seeing patients in a clinic. People come to the clinic. They'll check in. Oh, they can't check in because the electronic health record is down. What are they going to do to intake patients? Oh, hey, they need to look at your past medical history. It's all contained within the electronic health record. It's no longer there. Well, they won't know what surgeries you might have had. Or what medicines. Or what happened in the last 10 years of your medical history. Oh, they need to send a prescription to the pharmacy. And it uses some intermediary third-party vendor that got hit with ransomware, so now they have to hand write a prescription. Well, do they have prescription paper? Do they have a process to write prescriptions for, for instance, narcotics? Right now, in many states you actually cannot hand write a prescription for certain types of medications. You have to do those electronically. So what's your plan going to be if, like what happened earlier this year when Change Healthcare got hit, you can't do e-prescribing? How are you going to take care of your patients with chronic pain who can't get electronic prescriptions sent to their pharmacy, and now have to get them written on paper, when the pharmacy won't fill paper prescriptions for narcotics?
So what we're doing at the center is this Herculean task of understanding those workflows, mapping the required technologies, and then saying, hey, of the ransomware attacks we've seen in the last five years in healthcare, what systems are likely to go down? How is that going to impact these patients? Which patients are going to be at highest risk of dying when these systems are down? And how can we prepare ahead of time, as well as respond in the moment, to reduce the risk to those patients? And you know, I don't mean to put Jeff on the spot, but Jeff, if you could talk about a patient population like stroke, a patient having a stroke or a STEMI, and just how dependent we are on that technology, and kind of what would happen as a consequence if we didn't have it?
Sherrod DeGrippo: You guys love to talk about strokes, and have terrified me about them. So I didn't even know what a stroke was before I met you guys [laughs].
Jeff Tully: Well, let's try to demystify it a little bit, and not be quite so fear mongering. I think we have the ability to do incredible things in medicine, and stroke is just one example of that. You know, a stroke really refers to any time the brain is not getting enough blood, and there are kind of two main situations in which that happens. One, there is a blockage in the blood vessels, so if the blood vessels are pipes, there's something clogging up the pipe and blood can't get through. And the other is sort of a burst pipe, or a hemorrhagic stroke: one of those vessels bursts and bleeds into the brain. And even just in the last 10 to 15 years, this has gone from oftentimes being life-threatening or severely disabling to now we have interventional procedures where, if we can quickly diagnose somebody with a CT scan, and use sophisticated image processing to analyze where the blockage is and what type of blockage it may be, then we can have a neurosurgeon put a catheter in through your groin, go all the way up to your cerebral blood vessels, and suck out this clot, right? So in certain situations, this can be a problem that, if presented to the medical system promptly and if all the technology is functioning, can be very, very treatable. You can walk out the same day. And when you go and do that workflow analysis that Christian mentioned, and break down what are each of the individual component pieces of technology required for this workflow, and what dependence they have on the network, and what types of underlying platforms-I mean, a CT scanner is basically just a fancy peripheral that is connected to a computer. A lot of times, depending on your vendor or your ability to invest lots of money into upgrading this equipment, you could be on a box running Vista or Windows 2000 or something like that.
And so, understanding this very complicated meshwork of all of these individually important and, in totality, critical pieces of infrastructure: you can very easily see how it's hard to take people at face value when they say, oh yeah, our system is under a ransomware attack, but patient care is proceeding normally and there is no real disruption because we've trained on these types of things. And so I think these plans that Christian and I are working on, and some of the other studies that we're doing, kind of reflect our philosophy that there may be a little bit of an over-focus on prevention and a little bit less on, like, what are the failure mode effects? How do you become more resilient and able to operate in the setting of a failure? And from a clinical standpoint, that means, what do you do to keep patients safe?
Sherrod DeGrippo: Christian, do you want to add anything to that?
Christian Dameff: No, just that when you were drilling in the center on a stroke patient, you did a great job of putting in a breathing tube. I want to say Jeff taught you. He's a great teacher, but I watched from afar, and I just want to say you did a great job. You have a second career in medicine, I can tell.
Sherrod DeGrippo: Thank you very much. I got to do that five times. Four of them were correct. And you know, I hope I never have to do that for real, stick tubes down anyone's throat, ever. That would be great. So, let's talk a little bit about the ransomware ecosystem. Microsoft tracks it very much as an ecosystem. We track the threat actors that not just do ransomware, but that create the infrastructure, that create the sending capabilities if it's going via phish or starting that way, the initial access brokers. Threat actors like Storm-1101 that create phishing landing pages, and that's all they do. But those credentials are then packaged up and sold, initial access brokers get them, they do the initial access, and they sell that to ransomware actors, so it's this big ecosystem that we see. And so I'm kind of wondering, from your perspective, are you seeing a change over the past couple of years? In the vectors, the types of ransomware, even in terms of dwell time? Certainly in the data that Microsoft has, we're seeing dwell time go way down. We would see dwell times of days a couple of years ago, and now it's at hours. Tell me, from your perspective as healthcare providers, what are you seeing in the ransomware ecosystem that sticks out to you?
Christian Dameff: Yeah, I just have two points on this. Number one, the amount of data and visibility into these attacks that you're discussing is so meaningful. For folks like Microsoft to be able to share that information, whether it's in the report that you mentioned or through other means, is tremendously important. Because the first big point I want to make is that the quality and amount of data regarding healthcare ransomware attacks is really lacking. Whether it be government agencies responding to this or the individual organizations themselves, if you asked me, is there a definitive healthcare ransomware registry where I am confident it has actually collected all of the ransomware attacks that have happened on hospitals in the last five years, I would say no such repository exists. If you asked me, is there somewhere that is a definitive source on, for instance, dwell time or vectors in healthcare ransomware that we could point to, I would say not enough exists publicly. And so the point you bring up, and an earlier one, about data supporting this, we are very early in this space. So for any listeners out there that are interested in the research side of this, and again, thank you to Microsoft for their work in this, which is basically unparalleled, we have a long way to go to actually understanding and sharing this type of threat intelligence, to then make some meaningful decisions. The second point I'll bring up is about the meta trends I've seen, less on the threat actors themselves and more on just the types of attacks and how they impact us. Ransomware is continuing to increase. One of our colleagues, Dr. Hannah Neprash, has been publishing data on the number of attacks, with the best methodology she can, and they're going up, they're not coming down. 2024 has been a banner year for healthcare ransomware.
It's probably going to be even more so the year after, but the attacks are also lasting longer. That's one of the things that is really terrifying. You know, when Hollywood Presbyterian got hit with SamSam in 2016, it was down for a few days. But nowadays, there is no such thing as a few days of downtime in a hospital ransomware attack anymore; it's weeks to months. And when we talked earlier about what those patient care impacts can be, they're dynamic. You know, if you have a couple days of downtime, you can probably avoid a lot of really bad outcomes. At a month, that's nearly impossible. I'll give an example from the clinical context. Say you're taking care of pregnant mothers and you are hit with ransomware. You're an obstetrician, you're working at a hospital, your hospital gets hit with ransomware. One of the first things you're probably going to do is cancel any scheduled inductions. These are mothers that want to come in and start their labor. They're scheduled on Thursday to come into the hospital and start their labors, and you're going to cancel a lot of those planned inductions because, hey, you're under a ransomware attack, you can't care for these patients very well. What happens a week later for all those planned inductions that you canceled? Well, some percentage of them are going to turn into emergency C-sections. And anytime you have an emergency situation, the chance of harm to the mother or to the baby goes up. And so this is the dynamic nature of healthcare ransomware attacks, and why, when they're not a week but four weeks, you can see different patient populations get harmed at different intervals, and that is a terrifying thought. And one thing we really need to focus on is how we reduce the time it takes for us to respond to these attacks, so that these month-long ransomware attacks are maybe 10 days. Something more manageable. And to the point earlier, there's a lot of attention on prevention, but we're still failing.
And there is not a lot of attention on response. And we hope that dynamic changes, because the ability for us as a country and as the world to make every hospital ransomware-proof is just impossible. So now we've really got to figure out, what do we do when we get hit, and how do we reduce that from a month of downtime to, you know, ideally hours? It's a really, really hard problem.
Jeff Tully: Yeah, and the only thing that I would add on that, from the standpoint of meta-trends, as we saw earlier in the year, is this potential impact to pieces of the entire national health infrastructure, as opposed to just individual hospitals, or even hospital systems. With Change Healthcare, that was something where even folks like Christian and myself, who are practicing medicine on a regular basis, weren't really familiar with their role as an intermediary in some of the financial transactions and prescriptions that are required to serve, I think, up to 100 million patients in the country. And I don't know the motivations or strategy of that particular threat actor, but this idea that we can start to see targeting of larger infrastructure, you saw in the UK an incident that impacted the ability for blood donation and transfusions to be extended across the country. The larger-scale impacts, and the fact that we in healthcare don't have a very good handle on the maps of interdependencies throughout the entire network, not just an individual hospital, is something I think also to keep an eye on.
Sherrod DeGrippo: Yeah, that's interesting, and that's something that I think, for information security professionals, it's really important that we understand all of the different industries and systems and how they work. But it's very hard for anybody to really understand all those dependencies and then have them mapped and put in the right place, and be able to access them very easily and quickly. That's just a maturity level that I think a lot of industries aspire to, but I don't really know any that have gotten there. Not even financial services, which tends to be one of the most mature. I don't think that in general they have a really great handle on their dependencies either. I want to ask you a spicy question, which is: have you thought about, and do you have an opinion on, some of the discourse that is pushing for policy bans against paying ransoms?
Christian Dameff: Sherrod, you didn't tell us this was like a politics and religion podcast, alright?
Sherrod DeGrippo: It is.
Christian Dameff: I heard, you know, you don't talk about those things at the dinner table.
Sherrod DeGrippo: This is basically the FM Morning Zoo Crew.
Christian Dameff: Oh, I love it. Alright. So, again, I think the standard response to that, the diplomatic, political one, is like, oh, it depends. It's a nuanced question, and people respond to this by saying, "In this situation this, in that situation that." I would say I'm going to dodge it in a different way and just say I'm for whatever actually works, right? This comes back to what we talked about earlier, which is using evidence to make better decisions. Now it's hard. You know, you can't do a trial. We can't say half the hospitals getting hit with ransomware pay the ransom, and half don't, and then we measure some outcomes, and at the end of that we say, well, based on what we thought was important, it turns out you should or shouldn't, because here is the data. But what I would say is that it seems pretty convincing to me that this will continue as long as there is a significant financial incentive for threat actors. Right? That is for the large part why we're hearing they're hitting healthcare at higher rates: because healthcare is paying, they're paying large amounts of money, and they're pretty easy to hit. It seems to me unlikely that the market, or I guess the incentives to hit hospitals, will change anytime soon. We're not going to see some one- or two-year effort, and then all of a sudden all the hospitals are far more resilient, and it turns out they're not going to pay ransoms because they're far more capable of handling these attacks. I don't think it's happening anytime soon. I also think, if you banned it, would the threat actors find some new industry to go after that would pay? I don't know. I mean, we've seen some evolution of that over time; maybe that's the case. But I don't think any of this is going to happen anytime soon.
What I would encourage us to do is focus less on trying to fix this problem by sacrificing institutions and potentially patient care with the thought that we're going to be able to fix it quickly. I would much rather focus on our response and resiliency efforts to take the sting and the patient safety impacts out of ransomware attacks, and have that be our focus right now, because that will help us in many other ways as well. Maybe it will disincentivize organizations from paying ransoms, and that changes the economics. But at the same time, it also makes hospitals more resilient to other types of disasters, like hurricanes and earthquakes and things like that, things that are also very important threats outside of cyberspace. So I'm going to answer this question again with a complete dodge, and say we don't know. We don't have good data. I doubt things are going to change quickly enough. There is no magic bullet here. And regardless of what we end up doing, we really need to focus on response. Because that seems to me to be the best and most certain course of action moving forward to do the most important work of just keeping patients safe. Jeff? I don't know. Do you want to do a spicier take on that?
Jeff Tully: I'd like to avoid getting in trouble, but I will say, I think this question comes up a lot, along with questions like, should we regulate minimum cybersecurity standards for hospitals? I think people just have to understand that we have a very fragile overall national health ecosystem, and some of the most vulnerable institutions are the rural, critical access hospitals that care for relatively underserved patient populations. And when we think about the tools and the resources that we want to make available for them to increase their resiliency, I agree with Christian that we want to do it in a way that is additive, and that is not sort of an unfunded mandate or penalizing them for being in a very challenging and difficult situation. And so we have to take a little bit more of a holistic view when it comes to some of these things, because we've been on both sides, and we work at some of these hospitals, and we understand that occasionally there are folks who don't have a single full-time IT security professional on staff, and it gets really hard to start mandating complex technical controls and introducing regulatory penalties. This is something that hopefully we can attack from a little bit of a different angle, by focusing on some of the resiliency aspects.
Sherrod DeGrippo: I love that. I would really like to see technologists and security professionals solve the problem at the technology level, instead of getting to the point where we're talking about banning ransomware payments. And my question always is, okay, so if you ban ransomware payments, what happens if an organization does pay? Do you then fine them? Punish them? How does that work? Are they going to start incorporating the calculus of the fine plus the ransom? It just kind of changes the stakes and makes it a little more expensive, and punishes the victim, which I'm not a big fan of personally. So I know that you're working on some things that are visible at the national level. Tell me a little bit about the visibility focus that you're working on.
Christian Dameff: I think so many of your listeners out there are hearkening back to one of their very first cybersecurity education lessons, which is: it's all about visibility, right? If you don't know what's on your network, how are you going to really be able to defend it? And I think a lot of people would be shocked at just how little visibility the folks you think are in charge of securing these things and responding to them really have into the national healthcare infrastructure, kind of like the earlier conversation we had about interdependencies. I think Change Healthcare caught a lot of people off guard, just realizing how interdependent we were and how little visibility we had into this. And so one of the efforts that we have out of this center is really trying to understand how you could identify, measure, and monitor these types of signals so that you can understand what happens, not only at the individual hospital level when something goes down, kind of an early warning system, but also, and this is something we've learned through the center, what happens in the periphery, the neighboring hospitals. If you watch the Microsoft video that you mentioned earlier, you'll hear a story about what happened in a community around hospitals that were hit with ransomware. And I think that's a fascinating and horrible story: it's not just what happens at one hospital when it gets hit. Truly, the effects expand outward. The hospitals that are ransomed can't see patients, but those patients don't stop getting sick. You know, patients don't stop having heart attacks because there's a ransomware attack in their city. They go other places. And so those other hospitals in the periphery get overwhelmed. All the ambulances come to them, and that can have consequences as well.
With what we're trying to build at the center, this kind of visibility into national healthcare infrastructure, we hope that it serves as an early warning of infection, but also as an ability for us to warn regions: hey, you're about to get flooded with patients, you'd better staff up. Or, how can we do better load balancing, you know, not packet load balancing, but patient load balancing? Because healthcare is a finite resource. There are only so many hospitals, there are only so many doctors, and very quickly, like we saw during Covid, those resources can get exhausted, and the collateral damage is to patients. We hope that visibility can really help us respond in real time, understand trends, and warn neighbors about what's about to come.
Sherrod DeGrippo: Yeah, that's great. I think that's something that a lot of people in IT and in security feel: we always have these blind spots, and getting a better idea of what's out there, and what we need to actually be securing, I think, is really important. Well, Christian, Jeff, I could talk to you guys forever. I hope that you will come back. Thank you so much for joining me. I have the co-directors of the U.C. San Diego Center for Healthcare Cybersecurity, Christian and Jeff, with me. Quick question for the audience, because I know you guys have a lot of fans. Will we be seeing you on-site in Las Vegas for DEF CON next year?
Jeff Tully: Wouldn't miss it.
Sherrod DeGrippo: Christian? Commit!
Christian Dameff: Every year.
Sherrod DeGrippo: Alright, every year.
Christian Dameff: Every year. We've been there for like 20 years, we won't break that streak.
Sherrod DeGrippo: Nice, love it. Well, everyone, keep your eyes open for Christian and Jeff walking around wearing, I don't know, lab coats and stethoscopes and who knows what else?
Jeff Tully: Bloody scrubs.
Sherrod DeGrippo: And bloody scrubs. Nothing better. But it's fake blood. Because it's from the weird dummy guy. So it's totally fine. [Music begins] Don't be alarmed. It's fake blood.
Jeff Tully: Or is it?
Sherrod DeGrippo: [Laughs] Or is it? Thanks for joining me guys. It was great talking with you. This is the Microsoft Threat Intelligence Podcast.
Jeff Tully: Thanks, Sher.
Christian Dameff: Thank you. [Music begins]
Sherrod DeGrippo: Thanks for listening to the Microsoft Threat Intelligence Podcast. We'd love to hear from you. Email us with your ideas at tipodcast@microsoft.com. Every episode we'll decode the threat landscape, and arm you with the intelligence you need to take on threat actors. Check us out, msthreatintelpodcast.com, for more, and subscribe on your favorite podcast app. [ Music ]