Caveat 12.1.22
Ep 151 | 12.1.22

Patching healthcare cybersecurity risks.


Sam Heiney: This is just step one in a long battle that we will all realistically be fighting for the foreseeable future.

Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben. 

Ben Yelin: Hello, Dave. 

Dave Bittner: Today, Ben discusses a lawsuit that takes aim at artificial intelligence technology that generates its own computer code. I've got the story of some human rights activists opposing the Kids Online Safety Act. And later in the show, Sam Heiney, who is VP of products at Impero, on patching health care cybersecurity risks in the Internet of Medical Things. 

Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney. 

Dave Bittner: All right, Ben, we got some good stuff to share this week. Why don't you start things off for us? 

Ben Yelin: So my article comes from The New York Times by Cade Metz in their technology section. And this is about a new type of artificial intelligence released by Microsoft earlier this year that has now led to what I think could be a very groundbreaking lawsuit. So the way this technology works is Microsoft has released and sold to a bunch of other organizations artificial intelligence technology that helps programmers code. And the way it works is by generating its own computer code based on a predictive algorithm. 

Dave Bittner: OK. 

Ben Yelin: So it's called Copilot. The tool is designed to speed the work of professional programmers. Just as if you're typing an email in Outlook, it might suggest the next two or three words in the sentence based on artificial intelligence, what you've previously written, what other users have previously written. That's how this would work, except with code. This seems to be something that would be convenient for people who write code for a living. 
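To make the idea concrete - and this is only a toy illustration, not how Copilot's underlying model actually works - the "suggest what comes next based on what's been written before" approach can be sketched with a simple bigram counter (the function names here are made up for the example):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count, for each word, which words tend to follow it."""
    words = corpus.split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def suggest_next(model: dict, word: str, k: int = 3) -> list[str]:
    """Return up to k of the most frequent continuations of `word`."""
    return [w for w, _ in model[word].most_common(k)]

# Toy "training data" standing in for a corpus of code
model = train_bigrams("for i in range for i in range for j in range")
print(suggest_next(model, "in"))  # → ['range']
```

Real systems like Copilot use large neural language models over tokens rather than word-frequency tables, but the interaction pattern - rank likely continuations of what you've typed so far - is the same.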

Dave Bittner: Yeah - to say the least (laughter). 

Ben Yelin: Yeah. I mean, when you're writing thousands and thousands of lines of code, it might help if there's some sort of predictive technology that can make your job a little bit easier. 

Dave Bittner: Right. 

Ben Yelin: But we've run into a potential intellectual property problem. So there's this guy named Matthew Butterick. He is a programmer, designer, writer and lawyer in Los Angeles. Man, that's a lot of things to be. I can only be one of those things. 

Dave Bittner: (Laughter). 

Ben Yelin: But he's not happy with this technology. And he, with a combination of other lawyers in this field, filed a lawsuit seeking to establish a class action against Microsoft and the other companies that helped to design and deploy Copilot. 

Ben Yelin: So there are a couple of issues here. First, we're at the very preliminary stages of this lawsuit. They haven't even been able to establish a class yet for a class-action lawsuit, and I'm wondering if they will be able to. Usually, you can find enough people who can allege a legal wrong to make a class if there's some common issue of law and fact. 

Dave Bittner: Yeah. 

Ben Yelin: But if Mr. Butterick is part of kind of an insular community that has problems with this type of predictive technology, and it's not a view that's widely shared in the broader community, they might have a problem establishing a class. So that's one potential issue here. 

Dave Bittner: OK. 

Ben Yelin: The second is that the actual cause of action does not relate to a direct copyright claim, and that's something that makes me question the utility of this lawsuit. He's not alleging a direct copyright infringement. Instead, he's arguing that GitHub, which is one of the companies that bought the software... 

Dave Bittner: Right. 

Ben Yelin: Their terms of service and privacy policies run afoul of federal law that requires companies to display copyright information when they make use of material. So it's kind of an indirect attack, alleging that some of the intellectual property of programmers is being used without attribution, but it's not a direct attack alleging a copyright infringement. So in that sense, it's kind of more of a test lawsuit. 

Dave Bittner: Yeah. 

Ben Yelin: If you're not ready to allege a copyright infringement, you can allege kind of this more procedural issue, which is that they are not divulging when they're using copyrighted information. 

Dave Bittner: So, OK. So help me understand what the beef here is. So as AI technologies work, I'm guessing that this system has ingested a lot of sample code... 

Ben Yelin: Yep. 

Dave Bittner: ...And is using that to generate the new code. 

Ben Yelin: That's right. And so the... 

Dave Bittner: And is that the problem? 

Ben Yelin: Yes, that's the problem. 

Dave Bittner: OK. 

Ben Yelin: So the question is, where do you find that sample code? And what Mr. Butterick and his co-plaintiffs are arguing is that there's a global community of programmers who have spent days, months, years building this type of code... 

Dave Bittner: Right. 

Ben Yelin: ...At the heart of this type of technology. And now, this code is being used without any attribution and without any financial benefit to those who built it. 

Dave Bittner: OK. It's a - mmm - go on (laughter). 

Ben Yelin: Yeah, so another kind of weird irony about this is Mr. Butterick is a proponent of open-source technology. He's used open-source platforms. He's wanted to share his code among the programming community. But those were other human beings. There was a bit of a community there. When you share something on an open-source platform, you are taking the direct action of sharing it, meaning you're kind of forfeiting your copyright claim. 

Dave Bittner: Right. 

Ben Yelin: And so I think the difference here is the predictive technology is not getting the consent of the people who actually built the code. If they were to get consent of the people who built the code, that would take a long time and would probably cut against the utility of the software, which is to cut through the process of needing to talk to programmers - it just does the programming for you... 

Dave Bittner: Yeah. 

Ben Yelin: ...In an economical way. So I really think this could - once this case develops, it could be a groundbreaking case in the general field of artificial intelligence as it relates to intellectual property because you are taking, in the aggregate, the work of people who've toiled away building these lines of code... 

Dave Bittner: Huh. 

Ben Yelin: ...And it's now going to be publicly available without the consent of the people who built that code, but in a way that I think would be of real benefit to the broader industry. So it's... 

Dave Bittner: Yeah. 

Ben Yelin: ...Just a really difficult policy problem. 

Dave Bittner: I don't buy it (laughter). 

Ben Yelin: All right. Give me your skepticism. 

Dave Bittner: Well, so, I mean, to me, this seems like - I am - all right, let me spin up an analogy, here. I'm a hopeful filmmaker, right? And I go out and I watch all of the films of Steven Spielberg twice. And then I go out and I make my own movie in the style of Steven Spielberg. Have I violated copyright? 

Ben Yelin: No. I don't think that analogy perfectly works. As far as I know, and this - I might be wrong about this. You are a human being. 

Dave Bittner: (Laughter) So far. Yeah, evidence would support that, although... 

Ben Yelin: Yeah, evidence... 

Dave Bittner: ...My wife might disagree from time to - my children, certainly. But go on (laughter). 

Ben Yelin: Yeah, so there's nothing automated about it. 

Dave Bittner: Right. 

Ben Yelin: You know. There's an old meme. I used to see it on Twitter all the time. Like, I forced a bot to watch 48 hours of Steven Spielberg films and... 

Dave Bittner: Right. 

Ben Yelin: ...This is what he spit out. 

Dave Bittner: Yeah. 

Ben Yelin: I feel like that would be more of a close analogy because it is an artificial system that is... 

Dave Bittner: Well, OK. So we have these - we have systems that do exactly that. So there are - we have these AI art-generating systems, where I can say, you know, draw me a teddy bear in the style of Matisse. And, boom, out it comes. Is - would that be a copyright violation? 

Ben Yelin: So I'm just trying to play - I understand your argument. I'm trying... 

Dave Bittner: Yeah. 

Ben Yelin: ...To play devil's advocate from... 

Dave Bittner: Yeah. 

Ben Yelin: ...The perspective of this community of programmers. 

Dave Bittner: Right. 

Ben Yelin: I think there is a little bit more that goes into building lines of code than there is to co-opting the style of an artist. 

Dave Bittner: Yeah. 

Ben Yelin: The code is pretty exact. I mean, you've developed some type of functionality. The code does something. 

Dave Bittner: Right. 

Ben Yelin: With Matisse - or with any artist for that matter - you're kind of using a general creative style, but it's not like Matisse had a distinct brushstroke that you're using or a distinct way of drawing a particular object that had never yet been discovered. So I do think your analogy is pretty good, but... 

Dave Bittner: Yeah. 

Ben Yelin: ...I can understand why these circumstances might be a little bit different. 

Dave Bittner: Yeah. I mean, I don't know. It seems like even in its - by its very nature, the AI would be derivative, right? If I - so let's - you know, 10 PRINT "HELLO WORLD!"; 20 GOTO 10, right? There's my code. Who - no one's going to lay claim to that. And I understand there are - there are certainly routines and chunks of code that someone could say, I created this. 
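The throwaway program Dave quotes is the classic two-line BASIC infinite loop. A bounded sketch of the same idea in Python (capped here so it actually terminates) might look like:

```python
def hello_world(times: int = 3) -> list[str]:
    # Roughly: 10 PRINT "HELLO WORLD!" / 20 GOTO 10,
    # except bounded so the loop eventually ends.
    return ["HELLO WORLD!"] * times

for line in hello_world():
    print(line)
```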

Ben Yelin: That was me, yeah. 

Dave Bittner: This innovation was mine. I don't know. I guess I'm having trouble shedding my skepticism here. Just - you know, it's been a long time since I've done coding, so I'm sure my thoughts on it are quite out of date. But I - maybe I'm just missing the point that this is an exploratory lawsuit? Right? 

Ben Yelin: It is an exploratory lawsuit. So the law, I think, in this area is pretty ill-defined... 

Dave Bittner: Yeah. 

Ben Yelin: ...And this is, I think, an effort in trying to define it. A court just might say there is no cause of action here - that even in this weird procedural way in which this is being challenged, there's no valid copyright claim. I still think it's possible that there is, just because of the uniqueness of building a line of code and developing some type of functionality. I've been racking my own brain - I like your Matisse metaphor - trying to think of this, because I think better when I think of things in a more analog and less digital way. 

Dave Bittner: Right. 

Ben Yelin: But think about some type of predictive technology that could write a Shakespeare play. 

Dave Bittner: Yeah. 

Ben Yelin: So you take, you know, every single word that Shakespeare has ever written. You put it into an artificial intelligence whatever... 

Dave Bittner: Yeah. 

Ben Yelin: ...And it spits out a new Shakespeare novel. 

Dave Bittner: Iambic pentameter (laughter). 

Ben Yelin: Yeah. 

Dave Bittner: Right? So what's the problem? 

Ben Yelin: I'm not sure that that - so, yeah, maybe you're not copying the exact words that Shakespeare wrote, but you're capturing that style. I feel like when you're doing predictive coding, you might actually be copying the exact words. Does that make sense? 

Dave Bittner: Yeah. 

Ben Yelin: 'Cause there is a specific line of code. It has certain zeros and ones in it. 

Dave Bittner: Right. 

Ben Yelin: Which is different than... 

Dave Bittner: There's a limited number of ways to accomplish a particular action or a limited number of best ways, I suppose, to generate a result in computer code. 

Ben Yelin: Right. I mean, it would sort of be like, with the Shakespeare play, the system spitting out "to be or not to be" because it only read "Hamlet," right? And that's what happened in that book, 'cause those were the only words that could achieve the intended purpose in that circumstance. 

Dave Bittner: Sure. 

Ben Yelin: I am obviously no Shakespeare scholar, as you can tell. 

Dave Bittner: (Laughter) It's too bad we're not using sports analogies (laughter). 

Ben Yelin: I know, I know. I'd much rather be doing that, but that's just not the way my mind was... 

Dave Bittner: Right. 

Ben Yelin: ...Working this morning. 

Dave Bittner: Right. 

Ben Yelin: So I guess to play - again, I'm playing devil's advocate here. I think most people who program for a living probably think this is just a nice, little convenience. 

Dave Bittner: Yeah. 

Ben Yelin: You know, we all use predictive technology when we're sending emails, when we're texting. It predicts the words that we're going to say after we type something in. We have artificial intelligence in Gmail that can automatically respond to emails by saying, yep, sounds good, because that's how I usually respond to emails. And I don't think anybody sees some type of copyright claim in that type of technology. I think the uniqueness about coding is that somebody has invented a way to achieve some level of functionality, and if that becomes the predictive line of coding in Copilot, then that really is stealing somebody's intellectual work and potentially turning it into profit. 

Dave Bittner: Yeah. 

Ben Yelin: And that would violate the spirit of our intellectual property laws. So I'm curious to see where this goes. I think it's a reasonable claim. It sounds like you're more skeptical than I am, but I think we'll both have to kind of see how this develops, see if they can even establish a class to... 

Dave Bittner: Yeah. 

Ben Yelin: ...Let the lawsuit go forward. 

Dave Bittner: I guess I just - I wonder, how can you come up with a copyright claim against something diffuse? You know, they're not saying this line of code is - they're not saying this system ripped off this particular line of code that was copyrighted by this organization. They're just coming after the possibility that it could, right? 

Ben Yelin: Right. It is very diffuse. 

Dave Bittner: (Laughter) Right. 

Ben Yelin: And my guess is, for somebody to eventually have a copyright infringement cause of action, you're going to have to have somebody who developed a piece of code... 

Dave Bittner: Yeah. 

Ben Yelin: ...That made its way into this predictive technology and have that person say, I want the fruits of my own labor. 

Dave Bittner: Yeah. 

Ben Yelin: And we're not - we're certainly not at that point yet. I don't even think that the person, Mr. Butterick, who's alleging piracy here - I don't think he's the person who's done that type of coding. I think he's the one who's organizing the lawsuit. He's a lawyer. 

Dave Bittner: Yeah. That's what - he can't help himself (laughter). 

Ben Yelin: Exactly. Exactly. 

Dave Bittner: What is your take on this kind of thing in general? I mean, this kind of - you know, someone decides we're going to test - we're going to do some testing here. We're going to, you know, nip around the edges and see what comes out of this. Is this a good thing? Is this a nuisance? You know, is this productive? 

Ben Yelin: I hope you could hear my audible sigh. 

Dave Bittner: (Laughter). 

Ben Yelin: I think, in some circumstances, it can be productive because you are testing novel legal theories. 

Dave Bittner: Right. 

Ben Yelin: Sometimes the only way to test them is to get your day in court. Our court system has ways of throwing these cases out before they would take up the time, energy and resources of an actual federal or state court. That's what a motion to dismiss is. You're going to make some law clerk do the work of writing something saying this is not a valid cause of action. 

Dave Bittner: Yeah. 

Ben Yelin: But at least the judge is not going to have to take his or her time considering the case. So it'll only actually make it into court if there is some type of legitimate cause of action. And so in that sense, I think it's OK to allow people to kind of test these theories. 

Dave Bittner: Yeah. 

Ben Yelin: You know, we do have these guardrails in our court system. You do have to have standing. So you have to be able to allege, with some level of particularity, that you've been hurt by whatever action you're fighting against here. And I think that's going to be a big obstacle in this case. Are you going to be able to find one programmer who is able to allege, with some level of certitude, that his or her own code was adopted into this predictive technology and used in a way that doesn't qualify as fair use? That's going to be a really steep hill for that person to climb. And we're even far away from that step of the lawsuit. So I don't think anytime soon Copilot is going to be - you know, there's going to be an injunction against... 

Dave Bittner: Yeah. 

Ben Yelin: ...Copilot. 

Dave Bittner: Yeah. 

Ben Yelin: But I do think sometimes there can be some use to allowing people to test legal theories in our court system. 

Dave Bittner: All right. Well, keep an eye on this one (laughter). It's fascinating. We'll have a link to the story in the show notes. Again, this comes from The New York Times. 

Dave Bittner: Moving on to my story this week - comes from the folks over at Axios. This is written by Ashley Gold. And the title of the article is "Human Rights, LGBTQ+ Organizations Oppose Kids Online Safety Act." Now, I guess we should start off here by describing what the Kids Online Safety Act is. It is a bipartisan bill from Senators Richard Blumenthal, who's a Democrat from Connecticut, and Marsha Blackburn, who's a Republican from Tennessee. After a series of hearings they had with some tech company leaders with concerns about the platforms' negative effects on children, they are proposing the Kids Online Safety Act, which makes social media platforms do a number of things. They're supposed to enable the strongest settings by default for minors and provide them with options to protect their information, and it would also require that the companies provide parents and children a channel to report harms to the platform, along with a lot of parental monitoring kinds of tools. So that sounds great, right? (Laughter) I want to... 

Ben Yelin: And if you name something the Kids Online Safety Act, you have... 

Dave Bittner: Yeah. 

Ben Yelin: ...A pretty high bar to find opposition. That's why they give it that title, but... 

Dave Bittner: Yeah. And, you know, it's bipartisan so - check. 

Ben Yelin: Right. 

Dave Bittner: I mean, and they're - this is one of those things they're trying to get through during the lame-duck session here, you know, before Congress turns over. 

Ben Yelin: Right. So you got about a month left. 

Dave Bittner: Yep. But there are a bunch of human rights and LGBTQ+ organizations who joined in and submitted a letter to Congress opposing this bill, saying that they have a number of issues with it. They say - I'm going to quote them here - "we believe the privacy, online safety and digital well-being of children should be protected. However, the bill would undermine those goals by effectively forcing providers to use invasive filtering and monitoring tools, jeopardizing private secure communications, incentivizing increased data collection on children and adults and undermining the delivery of critical services to minors by public agencies like schools." 

Dave Bittner: So it seems like what they're concerned about here is - particularly for older kids, so think about your 15-, 16-, 17-year-olds - a kid who may be in one of these groups, an LGBTQ+ kid who's looking for information, looking for resources, looking for, you know, sex ed kind of stuff. And what they're saying is that this bill could make it so that their parents would have to know about that, and that could lead to things like domestic violence, things like that. So the flip side is I'm sure there are lots of parents out there who would say, well, yeah, I want to know what my 15-, 16-, 17-year-old is looking for online. That's exactly the point of this. 

Ben Yelin: Right. I mean, it is a tough dilemma. I'll note that there's an interesting coalition involved in writing this letter. It's both LGBTQ+ advocacy organizations, like GLAAD, but also online privacy advocates, like the Electronic Frontier Foundation. And I will say that those two different types of groups are not always on the same side of these types of disputes. 

Dave Bittner: Interesting. 

Ben Yelin: This is one of those where I really understand the complaint that's reflected in this letter. 

Dave Bittner: Yeah. 

Ben Yelin: Obviously, there is a positive intention to protect kids and their online safety. 

Dave Bittner: Right. 

Ben Yelin: But you do have this problem of people who are already part of marginalized groups who might just want to do some research into, as you say, basic things like sexual education - you know, is being who I am OK? Are there resources that can support me? And allowing, potentially at least under this legislation, parents of those older kids to view that information could put kids in those marginalized groups in significant danger. 

Dave Bittner: Yeah. 

Ben Yelin: Now, it's really up to Congress to determine what weighs more heavily here - the interests of the parents to always have the right at least to monitor the online content of even their older children. 

Dave Bittner: Right. 

Ben Yelin: And I think many people, as you say, would assert that that should be - that right should be absolute. And then there are the interests of people who are part of these more marginalized groups who might legitimately be threatened by legislation like this. 

Dave Bittner: Yeah. 

Ben Yelin: I - in terms of the prospects of a piece of legislation like this passing, in spite of this letter, I still think there's a decent chance that this bill could get added to some sort of broader piece of legislation or might pass on its own... 

Dave Bittner: Right. 

Ben Yelin: ...By the time Congress adjourns at the end of the month. It does have bipartisan support. But because of this letter, I could see somebody like Senator Ron Wyden or maybe some other progressive Democratic senators stand up and say, we're not going to agree to this bill unless there are some changes. And maybe those changes would be via an amendment that would loosen some of the age requirements. 

Dave Bittner: Right, right. 

Ben Yelin: So parental notification would be required for kids up to 13 instead of up to 17. That might be one type of acceptable change. But I'm not sure that this type of letter is going to derail the bill. There are a lot of senators in both parties who are fine crossing groups like the ACLU and the Electronic Frontier Foundation. 

Dave Bittner: Right. 

Ben Yelin: And there are at least 48 or so senators who would not mind crossing the interests of GLAAD and other LGBTQ+ rights organizations. 

Dave Bittner: Right, right. 

Ben Yelin: So it's hard to know what the prospects are. I do think the complaint alleged in the letter is certainly valid and something for the senators to take into consideration as they consider this bill. 

Dave Bittner: Yeah. I think another element of this that has folks like the Electronic Frontier Foundation on notice is that they're concerned about things like end-to-end encryption - that this bill could disable that. In order for the parents to be able to read things, have access to things, those communications can't be end-to-end encrypted. They would have to be accessible by the parent. And there are folks who, you know, strongly believe that end-to-end encryption is practically a fundamental right (laughter), right? 

Ben Yelin: Yeah, I mean, folks like the Electronic Frontier Foundation live and die by this idea that we can't cut against end-to-end encrypted technology... 

Dave Bittner: Right. 

Ben Yelin: ...Because that's a slippery slope. 

Dave Bittner: Yeah. 

Ben Yelin: If the government ever has access to it or if you're giving a certain class of people access to end-to-end encrypted information, that could cut against people's privacy rights but also enable bad actors, people who would exploit those security flaws to commit cybercrimes or do other nefarious things. 

Dave Bittner: Yeah. 

Ben Yelin: So that's certainly a valid concern as well. I do think there's a legitimate gripe here that this piece of legislation is overbroad. I will say that it probably would have been helpful if they had written this letter prior to the lame-duck session. There might be a timing issue, but I guess that remains to be seen. Maybe they wanted to get ahead of this before this bill actually comes to the floor or gets passed with unanimous consent. And this was the time to do it, in their view. 

Dave Bittner: What about this kind of in-between age for kids - you know? - this 15-, 16-, 17-year-old, when you're standing on the front porch of adulthood, right? 

Ben Yelin: Right. 

Dave Bittner: You know? In general, how does our legal system treat folks like that? 

Ben Yelin: That's a really interesting question. So generally - and I'm just looking at criminal law... 

Dave Bittner: Yeah. 

Ben Yelin: ...Principles from common law. Before the age of 7, you're not responsible for anything. 

Dave Bittner: OK. 

Ben Yelin: Between 7 and 14, there's kind of a rebuttable presumption that you don't bear responsibility. So you could have an infancy defense if you commit a crime when you're 13. Above 14, you could certainly be charged for a crime, either in juvenile court or, in many cases for violent crimes, in an adult court. And you could be charged as an adult, convicted as an adult, and they could certainly take away your rights when you are 15, 16 or 17. So you're kind of treated as an adult in that context. You're treated as an adult in other contexts as well. But in some contexts, you're treated like a kid. You can't vote. 

Dave Bittner: Right. 

Ben Yelin: You're not eligible for Selective Service. You're treated like a kid in the way we see sexual violence protective measures, so... 

Dave Bittner: Right, right. 

Ben Yelin: ...Statutory rape applies in most states if committed against somebody 17 years of age or younger. So it is kind of this weird in-between area where we haven't really decided if people falling into this age bracket are kids or adults. It's kind of ad hoc based on how we want to see them, based on the individual circumstances involved... 

Dave Bittner: Yeah. 

Ben Yelin: ...Which, I think, makes this even more complicated. 'Cause in many ways, they are kids. Their brains haven't fully matured. But in many ways, they are adults in that they've developed an identity, you know? 

Dave Bittner: Right. They have physical maturity in many cases. 

Ben Yelin: They're starting to develop physical maturity. 

Dave Bittner: Yeah. 

Ben Yelin: For me, maybe I wasn't quite there at age 15... 

Dave Bittner: (Laughter). 

Ben Yelin: ...Or 16. But for many kids, they have already developed physical... 

Dave Bittner: Right. 

Ben Yelin: ...Maturity. 

Dave Bittner: There's a broad - well, but that's part of it, too, right? I mean, at that age, there's a broad spectrum of development among kids, you know? You've got kids who are still, you know, 4 foot 6 and kids who are 6 foot 2, you know (laughter)? Right? 

Ben Yelin: Right. 

Dave Bittner: And everything in between. 

Ben Yelin: Right. It is sort of this bizarre in-between area. And it's just hard to know how the law should treat people. 

Dave Bittner: Yeah. 

Ben Yelin: You know, I will note, in other areas of online law and regulation, the cutoff is more like 13 years of age. We've seen that with regulations, for example, against Meta, the parent company of Facebook. 

Dave Bittner: Right. 

Ben Yelin: They don't allow unsupervised accounts under the age of 13, I think. 

Dave Bittner: Yeah, yeah. 

Ben Yelin: So, you know, it's - there are no easy answers. 

Dave Bittner: Right. 

Ben Yelin: You want to protect kids as long as you possibly can. There are certain areas of the law that see these people as kids and believe that the government should protect them at all costs. There are areas of the law where that's not the case. I happen to think that, in these circumstances, when we're talking about access to potentially life-saving information and resources, this is one of those instances where it might make sense to treat them more like adults than like kids. But, again, my perspective might vary from the median perspective in this country, and I think that's something that's up to legislators to decide. 

Dave Bittner: Yeah. All right. Well, we will have a link to that in the show notes. Again, that's from the folks over at Axios. 

Dave Bittner: We would love to hear from you. If there's something you'd like us to discuss here on the show, you can email us. 

Dave Bittner: Ben, I recently had the pleasure of speaking with Sam Heiney. He is the VP of products at a company called Impero. And our conversation focuses on health care cybersecurity risks and, in particular, the Internet of Medical Things. Here's my conversation with Sam Heiney. 

Sam Heiney: Over the last, I would say, decade - probably a little bit longer than that - we've all been aware of an increasing number of cybersecurity threats, cyberattacks, disclosures of our personal information from a variety of institutions. There has been an ongoing threat against our devices as we use more and more of those devices. And I think it really started to come to a head in the last couple of years, as the proliferation of those devices just exploded. You know, there are some estimates that there are more devices - kind of that Internet of Things - your Internet of Medical Things, your Internet of Industrial Things, all of these sensors and small computers - there are now more of those than there are computers and cell phones. 

Sam Heiney: So we've got these little, embedded computers everywhere, and hackers, activists, cybercriminals, malcontents are targeting those devices more and more. And because we're seeing more and more of them in the health care industry, in health care facilities, they have become a favored target for those bad actors. Even if you go back to 2017, the FDA said that 1 in 10 of these medical devices had vulnerabilities. And there have been other estimates, as recent as 2021, stating that health care as an industry is the largest target for these cybercriminals compared to all industries. So when you look at that context - lots of devices, lots of criminals, people really coming after it. There were some senators - forgive me, I can't remember who sponsored the bill - but it was really prudent that they started trying to elevate and escalate some of these protections that the PATCH Act is recommending. 

Dave Bittner: And what are some of the specifics about, you know, the medical world that make them a particular target for these bad actors? 

Sam Heiney: I think the easy answer is that medical facilities - health care as an industry - is a critical industry. It's super important to our daily lives. You know, there's that old joke where you ask the bank robber, why did you rob the bank? And he said, that's where the money is. Well, when you think about your life and what is valuable to you, money's pretty high up there, but so is your health. So if I really want to disrupt something, if I want to get leverage over you, if I want to use ransomware and install it on your computer and make you pay me, I'm going to attack something that's important to you. And with health care devices managing our very lives, that's a pretty juicy target. So if I'm a criminal and I can lock out a hospital and say, you can't treat patients because you can't access any of your systems unless you give me a million dollars, well, unfortunately, sometimes the only thing those hospitals can do is pay the million dollars. So it's a rich target because of how important and critical the information involved is. 

Dave Bittner: You know, I could imagine if I'm someone, you know, maintaining the equipment in a hospital, you know, thinking of something that you see all over a - you know, an infusion pump, something like that, and I have, you know, many of those at my facility, and they're all working fine. I can imagine the impulse to say, well, all of these are working fine. I'm not going to take them offline and risk upgrading them, updating them, you know, doing software updates, if you will, because that may take them offline, and they won't be available for a certain amount of time. I can understand that mindset, but I suppose you can't really come at it that way anymore, can you? 

Sam Heiney: No. And I saw a quote, and I can't remember where I saw it, so don't hold me to this, but I think it's accurate, that a large health care organization or kind of larger health care organizations can have up to 10,000 medical devices within their facilities. And if you just think of that scale, I mean, it's just this massive number of devices, and that's just with one health care organization, right? That's - think of a hospital that could have 10,000 of these devices. Now, if you have to shut them all down at once and just stop servicing and stop helping your patients, obviously, you know, you're not going to do that. But if you can create a routine of upgrading, updating, patching and maintaining your equipment, then - you're already doing that today, right? I mean, you go into surgery, and you use sterile equipment. You come out; you then sterilize it. It's just part of the process. 

Sam Heiney: What the PATCH Act is advocating and what cybersecurity professionals are advocating is that you look at all of your medical equipment kind of in that same fashion, that as a routine part of using the equipment, you need to patch, update and prevent cyberthreats. It's the same as scrubbing the floor, sterilizing the equipment. Upgrading and patching the software, that just has to be a part of your everyday, normal operating procedure. 

Dave Bittner: And what are some of the key elements of the PATCH Act here? 

Sam Heiney: You know, I've reviewed the text, and I've seen some descriptions. And I think the - you know, the way it's been described in a couple of articles I've seen that are a little more eloquent than me is that there are four big planks. Device manufacturers need to first plan for addressing cybersecurity, right? They need to have a plan in place that says, here is how I address, with this piece of equipment, cybersecurity. There's a lot of discretion, but being able to show the FDA and the regulators that, hey, we have thought about cybersecurity, and here's how we are addressing it. 

Sam Heiney: The second is that you need a - you need to be able to provide to regulators a software bill of materials, an SBOM. And that software bill of materials is really a listing. It's a detailed list of all the components that go into that equipment. So if you think of a computer, a computer's running software, right? I'm looking at a Windows device, and within Windows, there's lots of different libraries and binaries and all sorts of bits and pieces. The software bill of materials just lists it all out. It says, OK, we're running Windows, and it uses this free software library. You have this application installed. Part of that application is this other kind of hidden application from a different vendor. You want to surface all of those bits and pieces in the software bill of materials so that you have full visibility, and you don't have any surprises, right? So first, have a plan. Second, make sure you've got visibility on the device in the software bill of materials. 
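[Editor's note: As an illustration of the SBOM concept described above - not something discussed in the interview - here is a minimal software bill of materials in the CycloneDX JSON format, one of the common SBOM standards. The component names and versions are hypothetical:]

```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "components": [
    {
      "type": "operating-system",
      "name": "embedded-linux",
      "version": "4.9.1"
    },
    {
      "type": "library",
      "name": "tls-library",
      "version": "2.3.0"
    }
  ]
}
```

Each entry surfaces one of the "bits and pieces" Heiney describes - the operating system, third-party libraries, bundled applications - so regulators and operators can see exactly what software a device is running.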

Sam Heiney: The third part is to really monitor for vulnerabilities. So if you put a piece of hardware - if you put a medical device out onto the market, you need to be monitoring - what are the threats to that device? - and have a plan for addressing any vulnerabilities that you find. So if I need to upgrade my operating system - if it's using, you know, some version of Linux - a proprietary version of Linux and a vulnerability is discovered, I need to be able to document and show - how will I upgrade that operating system and get that vulnerability fixed? And then finally, another big piece of this is the idea of a coordinated vulnerability disclosure. I actually wrote that here in my notes so I wouldn't say it incorrectly. 
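[Editor's note: The monitoring step described above can be sketched in a few lines of Python - this is a toy illustration, not anything from the interview or the bill. A real manufacturer would query a live advisory source such as the NVD; here the SBOM components and the advisory feed are both made-up, hard-coded data:]

```python
# SBOM components, as a manufacturer might extract them from a device image.
# These names and versions are hypothetical.
sbom_components = [
    {"name": "embedded-linux", "version": "4.9.1"},
    {"name": "tls-library", "version": "2.3.0"},
]

# A hypothetical advisory feed: component name -> set of affected versions.
advisories = {
    "tls-library": {"2.2.0", "2.3.0"},
}

def find_vulnerable(components, advisories):
    """Return the components whose exact version appears in an advisory."""
    return [
        c for c in components
        if c["version"] in advisories.get(c["name"], set())
    ]

flagged = find_vulnerable(sbom_components, advisories)
print(flagged)
```

Here the TLS library would be flagged, which is the trigger Heiney describes: once a component in the bill of materials matches a known vulnerability, the manufacturer's documented remediation plan - how the patch gets built, tested and pushed to devices - kicks in.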

Dave Bittner: (Laughter). 

Sam Heiney: The coordinated vulnerability disclosure - I mean, this is something in the software world. It's if you find that there is a flaw. If you find that there is a vulnerability, you need to document it and you need to disclose it. And so having the regulatory and kind of the oomph coming from Congress saying this is something you have to do gives a little bit more motivation to vendors, to hardware suppliers, to software suppliers to say, I'm looking for these problems. And when I find them, I'm going to tell people so they can fix them. So that vulnerability disclosure is an important part of making sure that everyone gets out there and fixes it and patches their equipment. 

Dave Bittner: You know, it's certainly noteworthy that this is a bipartisan effort. You know, with the Senate and the Congress being the way that it is right now, that is just short of being miraculous. What is industry's response to this? Are they on board? Is there any significant pushback that you've heard of? 

Sam Heiney: You know, there are always going to be grumblings from device manufacturers who have new requirements placed on them. And this will, for some device manufacturers, be a little burdensome. And you will have, I would imagine - and I talk with a variety of different device manufacturers who - again, they really want clarity, right? They really want it to be clear - what is the requirement? When do I have to do it? - so that they can plan accordingly. This is a bill that is in - you know, kind of hasn't been approved. It hasn't been voted on. Is it going to happen? Is it not going to happen? So any of that - you know, when things aren't for sure, when you can't make a plan, there's always going to be grumbling from the industry providers. 

Sam Heiney: But I'll tell you that even the most jaded and the most - whoever you are, whatever product you're supplying - if it's in the health care world, these vendors want to keep people safe. That's - you know, it's not that they're corporate profiteers who just want to make money. The people I've been working with and most of the manufacturers - they're putting out insulin pumps and, you know, blood gas analyzers and x-ray machines because they're honestly trying to make the world a better place and help people. And they want cybersecurity to be included. Having more guidance, having more regulation - there might be some grumbling, but I think people will get on board pretty quickly, especially since large industry, you know, like the health care providers, the American Hospital Association, the, you know, American Medical Association - they're all throwing their weight behind this. So the device manufacturers will get in line. 

Dave Bittner: Hmm. Do you have any sense for what kind of timeline we might be on with this? 

Sam Heiney: You know, that, unfortunately, is outside of my area of expertise. I think the last I saw, it was in committee. And so I just - I haven't been tracking that well enough. And I just - like I said, I don't have the expertise to read those tea leaves. I am hopeful. I mean, I support the PATCH Act. I think it's a necessary step. It's not a silver bullet. You know, industry is going to do more - need to do more. Device manufacturers are going to need to do more. Health care providers can't just rest on the - if this passes, say, oh, well, we got the PATCH Act. Everything's fine. This is just step one in a long battle that will - we will all realistically be fighting for the foreseeable future. 

Dave Bittner: Ben, what do you think? 

Ben Yelin: That was really interesting. I actually went and looked at the piece of legislation that you guys discussed, the PATCH Act. It hasn't gotten past the committee process, and we're almost at the end of this congressional session. So it's one of those things that, more likely than not, the legislation is going to be DOA. I think it could be reintroduced in the next session. And one thing that was interesting about the interview is it seems like there's some level of acceptance, even among industry folks, that, because we're dealing with things like medical devices, which contain personal - extremely personal information, some information that has monetary value, it's worthy of these extra layers of protection. So yeah, I thought it was an interesting interview. 

Dave Bittner: Yeah. All right. Well, our thanks to Sam Heiney for joining us. We do appreciate him taking the time. 

Dave Bittner: That is our show. We want to thank all of you for listening. The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Ben Yelin: And I'm Ben Yelin. 

Dave Bittner: Thanks for listening.